[Binary artifact, not recoverable as text: POSIX tar archive of Zuul CI output containing `var/home/core/zuul-output/logs/kubelet.log.gz` (gzip-compressed kubelet log, owner `core`). The compressed payload cannot be rendered here; decompress the `.gz` member to view the log.]
I?f%bn H>].')z{?mO!i{`|*V@ +0s 0E]IA)TUj L,u凿~ ul#-y4Ұti>KnϺ:[]'pڠl OOw_*OtKM{IJz ? lӍϴtRJ]_:6w61YL.[memq:[RaFԓ&f8y*l?@} ˶?Ef y4H)ʝϭo ;߷Q̇2 .8,,&qXsthL|KkrΨ;wdʟ˚^OɁL}hFRN*jK"Q 1(%U:0g/uPy.^T/l8z!4s/hp\E5j)j[$JaJ4¦(&l̝yάy@L DB)Fdf/84%"Q]vFƩ 熓qf('IgJFe1cFUɑ'%ҔG{a|;%(L7xl0񌨱v^gѐg{6W!϶=G`umwѤXbcrEَ(YEI(KvhE83w漪"Nq8`1(uEĖP=CJf/hl#@{%Nũ=Dž,h3W4GZ4L*0>g(ĥeB5Ad\{t(|O(r,^z 诃4WGٯa}A+~9/`{&ύzMV!35Dsc̱|4GyQ$@M+Dܬ@e|= ULa=+iR%V' 7UE& n6)oFâ3 ;C6뻮նA';XP^2B`޽=V0Aӡڜ-r ON&zG۽YE77ĐӗYG@Zr;AfSD{Ok9[k1,wy2^E$4vft9S3X!qpQ92U[F\Kԓv\kK@RpH4Q0T+HCaR[ERvTXNxP'jm g!jﹻ`߸O;Yg d:XTEd"Z¹@08m :Й[G|AS7)!Do7;c +0Y0 iEH( ĎQ Eőp.uXE%V5S*%dG(o"By C9'աFX}3YȈGf) E! E˝ X-@ 3C~{o:FnNKiaPgri"7<ҔazX38}Aڳ4ЦsM1/>V/z9h=͑csAu.H9V{q& ߎ>]h=딛6!ΌLza5 8{nN0m82&ttnzǑ3,JG#Fw[,!#kfWoGMV/XC7 ӌZAKOABFZ8/$gc[/ze6rN W^R Jk/Bg(&Zř[o쌔80]{+oZdv Hsa#ՠ*W:܂ژAla=U  `(/-"ÙF-E$btIvh~Yx{4~Urhӻ,-m[皫2Y:Pg,Ce;,Ce uԺ u2Y:^,Ce`,Crw!hޅo~6&NE:ٔQ't[SdG#; p [P]x ?Jy]ī"~ɼ-:7bo ځ%`vcCvnZ(%>eY11T)]*o4a#A29\;^sXe)\NHL岬S`m~3>a7m"%Ax ۿU,6,rwzӫߋ/h@ƾ,gVI%1j߶\;M j_TjZ"FiQA=A*a{TOQMpQK i ڟW:e&t-.G?|akVvHk?H stgH eUƗꈖ\qXt-.S<`s-:2u*^m [VrۂMw`(;ݧGyp~6Ċָj8zWكKPÞKptzׄA;yUһE>ˆ^#2% be}0@-I Xy$RDۨȾ}<;l}婾SѤj/4'5%+n|J_·\^+et|s|33cldA; @=7%0w58&ւS[Fw1Nv||Y3pfǘc14Cdό133<0OT F{ [e2TIƗaXDzzhF &nÓFa 1A&a3jtnԹ"jKf?i|;u\,#ͱ,;XVC̓m(b 9 ?Xâ-698DBmJ) HXG$.  ]zu W >`l+yjhzvzW@dz~z~>tg?N>H K&zm7t/wi[]Cbt-ڜuۜrCه̷Rh{n*[+s Wq7Vd"Ju3?J6TrOը8*MG.}(OM16f&gu:6N-|cQz;Cf8N`0]d~V;GJQ&7My`pm+u*-!ဂ  v\F,&4-Std?y'$@OSF&O,HH \ mQ$^;jA;JLñՎ꠼Z'zіն1E|U&¤w y;LPjM⅜+[urG,*2smCMN>(\m8"rV+]Ͷlæhv`YȲ,i`L7ڠqu>Oz J~V3Qbl_fi@ *ijH3 c} Q4 NC#[3y"H>` Ʉ@N3!c 6H!%%BSu]QQ{՟qy4GM0!40>a_okB;xE9ˉ^xC2$Ri"E6T9BLPHbLhTB[eg%HSrh)#W]4`A'ΫBA>BEuSqigA6u u=T0ȝC[Ԑ쑹+@6w)}!ם ]jucKkn{鲧6=5~l6bR|@ٿ]-1I=zj{=\6ww8=<>k~#Gvqlu%)̞ierT Y-g%6jٓ' h$VK d@]OL].ʪ `U/zW]G-y10ձD l6pM)ڸ"b%W񫺪dNPRHb j"Ē%{̀>#ºxVG-IAJS8˒L$d Xʔ-rpVWw՜=xsS9fO?N6cމZ =z)_Jn /e s! 
^pG\Oʁ]Od8vBq2,Zz}eBЉEmu|W%M_V*E[uѼu3^%ZsզqS];,f~p<~Z|_9 ̖.f#Bn?SMRqXރhvU&,-GTk8 ǐd9e=Hh\}>` Bqdxp 2l-?ޘ:Wml}zՖZ6}7Ln[Cn=o;'->hQ>mq5Atl"gty111K\B}lq>n]OY̶၇Uoؕ;&h/:=;6q?m{,Q)2iI8qw{Rfoxک 62d@_oE_ۙ 넆ϻ7aPr8q8VrTrypt 1' JJ JԒD\$͸ZU2& 3SĞs񈢇D,xۼa ͩ_6::O,k޿EWǦ O JeZG ۸PYєe2g3UfB*~;򉇳G A5M#|6`unQq=zA[=^erѹ9D(h<_$n&SY#1j XeG=sB3~*PC: >$džDhǵLdM!Q.h`5s˼WNh&CZ |$|z((D(Mp>'>r#Co.\.? 'ۂb R1/A?prI0P-M%gHA Gm)UG' hr$\kI%ѺLECσEx8 >23pz^nY]9CՇz d{ ")6҃!s76n*/m"-px}I =Y;_wߏ5sr:CCJfnٔ&| {huf"G2Y\Ul9'rS^<9Ռ"Y"t(F)LdZ!mVu߀j˩U)lCYCXHQn48LyԄo/j,.ʾ5;Lj䭙 p& n>wmGָ&WM2O;Qnzj,ݑ3ŜRA(,l fF!Q0jCRRj:H9=5&`bh%H#u%ԥhrSTL ǘQI֌ՆӯwduZoՅ..<8=xr;R7膺2`/Oqd\}tm &8#W4=%G:9;$G%\p\6AW88nhQc3R.vbkl?-lvV*km5hPUY*7DҨbA񄒣ӡ ,# h&\ 3gEd챌Y(UJYi&" >H*)E2!A ԐAI$퐒vvaS\3D6*ßjY1+C։J7ʢh~޿z3F$4eFc4BfZuOǵ>qn=]]3*shJ/etd 'yjp?b./ɺu gzv]5lzj{=6w+>~5y#l;͏zH=p.vq%,}=/'C~Ijr4BeCb_geg2q /g773C^44x-zt7weA*TPͧ@qH]mo[+BcЛZ⦋9ڲ+9Mر%Yi[IN& >3y&SY._\DoEpW:YMɺP]F嫳*h|Pܻ)E^JAUtNR<7;l{M/WXwEK BPBUqJg4_]c >lzk=:v$>>Q#u6ATk#8O7_7+j%woFihzu3_{w?O~|h`WsO|m5\Go/"{u'=K#Ħؓ{:ԍԍnfy͂ir(^$?_,zvd|yrNͽ*$7irIg"2R;}q:XQ9:tcy9 :=ol*j#7#MFZC[}.T6Y18]72ʡ"da8$E=RR(wIQ~^$EQZX>0R$b Q!zTN;,F2)XU_Fe&I,@\Ii# EpDVJPv4͆C^ާ{6̎>!57g::ؔ,i'!-EZZ?0=h x*.0E>o1 _LdegtƮ:Ats]M.R^) ghUf魌9l@X>c:+:JIuB-4) h@W!D}C2 1KQ2QJL"g M<뀳1^i#L#@3 =P xQD޻e7 q\8oUr쓤/\RJ>Gm1C) ͼ- bCfJ1"xN䃯{ W $! 
%bCM*L?*cE9 l{ߧj_H%W삦( "%Y,jdldlJ=}~.נխzgl ;Á0A e Xf55,q_3?@ˮ-?}⛝j_^|7#dDBZ_ǰ 3@=3P 3@=33Cv@=3P gꙁzfgꙁzfgꙁzfgꙁzf0ꙁzfgꙁzs Ax嗲<ِsy*3l5tCKb&XZfM)W+vg]Qev0:(}D a`+NPYR\Y!'L1@ eщQ&D:\`Rb 1Nj6XW)ϡUB˲\>:ջ 5n|:YLJSLҌe#zR=:`nYy]=j 8^;ERLOE20@@"T?5$OB F\(d fE1`6%z/*2%Vh!RTU1#]1 ǒ$zb_%= Zy4h}N")PC%ytr:܇CԼ!Y:tF1ѝJ.:HTzϩb,vCVJ 5QZ .-Z IE XX3"thaݑ3JʉBEK Ţm+RSչ&`k+D 4F͆4EΦ^|&>3>Ϧ?9~;u+ sߟYyO @+C59i]*l* АetJli' `yJݖ-)@Je$mb CHPBQ>Q*rpVù]@eSt'+__6M`-v`Y>ez;=Ԡ܅^g0(6dRIHfMѵwM Ȟdf8e'ɳIxElQA$dNcJ !|d m&r]m㭫ttk.+$/޻Wk㻆SO*TRJ \@)[18).!,Ò|A` C)&zN:&VR5cP f+Jy]lgWƺQujDunp{ó.~/.~8?}=ʧ E dEcOBiI5P+Q% +St1 ܕNIZj"kљxp /lE  _Z;Nrovq_km7hn~m~S}XSҰPQg]IS8R%1b@6͆s>JZ3 hf7ֈ~ЈFl H!7l5ZBq^V]* #'F2^G/ ~e'$mOZA(jKfs:UNn#kyDC )H-i03ltɀM:f"Ɔ=MA*Ahcs&I Z{pLykD/057%E+IYoR Y o4~쓤/\RJ>Gm1C) ͼ- bCfJ/Ub (EAE𖏝 eZP@ V8$ e(kRPP#(/1PcY6\k=/4@Z\`j삦( "%Y,j5P^ 2G26y=}JA[RA;y܁1o_H׿ݼzn0v9L?g|~Ί:}b.g{;SH1;~cvy6y6^PZ  UC |l"foS[H8͘*cFDSb1$I2ei"6G&ࢉ.]#͆C^ާx{u-Z+#\CGKMw2Oyk}̑n[Z0c3/v xaxeQ8q 8c |zv"E fxk: AEKL}H)[HwY;3AMrZz/vu(-LS|2@ڦDr̥Wo#Vz]koǒ+iŮs\|Y i )d;!R∲4zaX9TwW:k;@e49ik3?nh٫ ocgG{_toٷmNmN\7-8{>guUl#nQ1(Q*GR'Tl%{5b@ Y_XkX CI1xNGZU÷,k/˺ JL&tm&zEJh0bZɤX$XHTXBqTǔG~\jTXź;wOiBJbJjm[c0U$U2Uy! k50\4?'m}:Oq/_ =f[?zMjw zOD KFW?0R4$y=ZBYxwN}[s(a9k4ZF-2['EP[*8ke=[Z0'r&[Ȳ)/}V+Z*TZ%'j xE`_QL ?P='L/ƫoypŴDZ3^Ov^h6W;ڶcK &;1>I<"B[-$cIl"&!x %{bٚN`ڼd8@zhb[3;A d(~bBPm;cO)AY41hAfʲPM@65%-._C''7@bkCN|ڋCqj3;GӘhZMvi#C'?E*̢]"eۡXjb.jHMTQ6t. 
?Dŵ]]][8<9;=[]wӏǯAi^/vܿvw^(~(|e9ͳ{- j3kQ(is}]bZba~Zݕ !.#.;}yC`>]`^Qև퉫OXx䠸pLtz\kR>)M~W oo@vk1Weu='2~@Upzm[7˟7@smZ 1sx6a>6q)}U+8_,Clw\^&-,jEtЭ1ߢpM].J/dYúIzˊϫf*m$fʾ6w6,649\z8L0,j~xp;s|㳷w\ O\ZRDSVV.CKVX8[-.v:k)}19Ȏ恩B͗{]8꺍-^*h׎Z=Z^;ZͅϜϦ~-d2P.Zt=嵩`lL~_ͭTȂ}FaGrjoS䥔.3-Q0g9km`*'ʴт66C>ΥgU3Yn^Ak7 䭮A 8 ?+1^d^G0 __r?sX˂[ >?ʧmᦋq庝w7E+!4ig|umQCߢ:|eƐ]CN)fa'f5G,%֮4g;FUǒB#,+ez6ս8ZY|r/t[B{u%tK73_HxN^wFݙvH^dkh_5/G'b m|{+@ :i%WwIwwV q~@fa26gb؋ۚvYDJ 6ǗS8 Ew97vm nez߫n2kѢ5pW\Ԯ %`ȪwEϨEqʛ;f;{ߡ[zy:n^Ƿ؉?O5?f3cMlO^1b%(rեƮi΁eVQhdI2h[uO,u@H=^zkWE_A+}ſ?S+ l|<\ F ^f-&+UN& 5IJ1\wzC$G+х2$EMbb*e&%U+ ?IْF#YݝtOwZKIm֖J*'\tҗ1J:Hfdo5Pc:xFezmVGN'>_|V攣ƶsڥU%&Œ")!I4jB6`0V7cVNƣotLHR$5ٚ=L Ji}TBȘE `նM MJÛTwb0&1b*hDku&_mh=*2_ Wjsx!Ä=.#셧;0XN'!& g]HR%hiMFZ1q1OhNOMdC17{M-JFBYDA*IAf;QU/cPB&g&u>+ݐx /e+ C\ qu$~J!u$BҒ<TCJ0@Dd/ E\L.lAzE vy>`̄ J `\!S soe0C;;EG̠nB ?%(c)ࠤ`gL*Q*9lLWi_@ibdM%Ttgd!(s$eǽ!A{x{x)$h~E_!Hܩ+Xkx5=()IE,)Ԉ yEU@!ѿ)%RL8PZ il-@x@IttH+WE7 XJΘ D:~Y[ZY6cV#9ip */VFNRB6s,`sBlcVHh%RK+|NG= tG S4婷6<cR#BSuIeLAkV2 |sP*+):I o :DOAnˠV:hY nAQN{"s0t˜yp-l/GJu!RZ)B@[}|hک^-RBvf71ltIDP0hY%t I["H,!̴w]b8 h 45w^h-;,GZ*Iۢ``MW ~U2׎Fӕ$kC" &AdHZ~2,CΪ`rHSYB`'Hj`n/H*6;e_pw{3p])@ tH 0rdH 8}$O" I Ř@I $BI $BI $BI $BI $BI $BI $BI $BI $BI $BI $BIgL) S"@`͟ ɐ@ [C$H*@O;@}|UH! $@H! $@H! $@H! $@H! $@H! $@H! $@H! $@H! $@H! $@H! $@H! 
t`Hߒ;RI[ ,=0 t;q)=, t;i t;)9y\$  $_nun.}zR>ioYqP&z#G6M3%3!K| ,xтɉG[Ҍ0x*,wJ-\Kܖ*&Frhgᨊ A%ej^Um "57JhtZ?^«Whi\̆O2k>jVI֟\:>t Z^+|o3S]ʊ|TReE;0Qe{I=\yrT2ʐ*C 2ʐ*C 2ʐ*C 2ʐ*C 2ʐ*C 2ʐ*C 2ʐ*C 2ʐ*C 2ʐ*C 2ʐ*CRe`P"ͬ>2# U^'VUV Xʞ#USDW묙M-@ a?W Мj%d4 'и*SrYٹ'w.lY_ "" ^30tdUD[:^|~I_d{MW%<m8,ae ~6lyٙ saZj&i7++miZ۶e~Va_iϳM-K}M+|C<,qG"H& dUXgsi`V{tTpׄJp*XY1IlFA Q#>YQ;g&Np+̭͆TlU fM:-{[i ׻z7Tr6чrbE^/ZR&qZͱI0VjkO36oUh{pb~e#pv~IŮċt൙9yoU5v{=U뷟gc2x~[=0o [7XΠD7fhbŮidbbä _J_N7'a2;w;bcN3Cqt2Lsf9z1 3we~ǮOwo:dm%<LXJZF 6!{VR:KFI.$:ZKթE~6bgl#B]顿kd=TҔC[^JwB+LCu~L\S(JҼ-se[&RJ:˭N]| ͺ65hbMLpU}#}cqOSf"3LFA#3)׈Fc1i )ү iJmo~tY~s>|Ľ]ij*q-f]Z]Z]Λp`bNί۞jLfYqSӮ'n'ݮ| җޠ^V7jDD&fr  A,<ʅ/.b|JTHLvBL"xa:g$|kiiW'v1Ze}xnߝwh6=l{b^cBJtjuÀ|K v733 i+tyڵ3~'{H~pPF'3d룝CFZ`;0moh`YC1dQ3|%Q4Өrx`ژ `[6 vT2)7`;G[5Qw[}4P:7竬4hA`NiP:Nv?f>\3~Y_6 c?H|*vQ$>rA չFj/S9cVgc7 %R"uHw^̅}s6 ]?^qɏhC_O)lpޚ3N@1=8irPFKr|v1$\:0Ysn$-+: #$X`ϑBB.B-R+ 4O|G32|fNޅOKSu$M-A; !a|4Ë+\ݫ/Oӌk5][}ķi9r8. Ŋ0㠌y[u2QMe:eߔ'$AZKbsImmHDk1 (꣠S-|T?+z,sp1$Ԝl/um+4|ٿ_ٷ߿9"^ٛQP``8n~o~\+Ǘ^4e] ͋fR ߥ\+{]}V-J{e7iYEYor)}oA/ kQIúT.rBw3慿OqYŪGjM.|#|hfKT6GI:CSg8_y/@ vƾ`?+# ypοnI b|NaFsZJ'G9up:h/!1/Щ{ّyU1ΐ<80~2Em"`ɚ\_4B{XDjS:pgEnUWgm_/v,E=솄k/6؀oux+Ew'2l5k q>`"mYcRDd6(YSPY(1}=mG)^XJUiQxʣa:x LFIB>1J>wN@1L)J]UP@P^噇9*鵰+Xe;:gGk<*Pt'[]{,< ?.m-hv h: ٿu43,b>,E}! wZBm)qClf3᧕Xeemf,BUETn+гT?mμ开]Vl\N Ug5KEFާlę5}Os'fC8}R_733Sٿ`BL3ʔEfM Iq nJ4(2Ӈ~ݷ`TwKoA]9[um}:9E&)bYrhEOkm#WtB,2`;8^mm#9,o%E-SdwW]*_w2nG]${!ЩQxl *P ) $0sV>HXh5$s]1mSQQS4lŬYS^E3RX&SQ[DW<^BRli'q)<~w֮':o6珰N^̢ ˽Snmԗ-U{MÀǏ&1%*oVd8ZEaݹW0YT!9*b Ƨ%%i-k0JJ%aEeF)s4 Tkdl&vdR^}+ uc,U/5`^`}o|@o#BEHʈ@P gޔUHj'_gLOMڵPoC< BP*lh0ND=a]Q2:Sb._Zg;bIZq.[iǮmP{`mdB`MlN} [ 2S8cɚ((-m`DJd/:B&ƚE4*y&>Tgm{~ن b31"Ҁ"nx,B(4$SV@"LW+&Hl ;vonJ! 
!Y#HWX U8RFd#{ {g'_Ez+#7C݊~[̯!:d湘|ċ{Uop_}hG]̼L.fa'6/k-}޾u ?{7NTiH\вCH=MdK <~}-a{<ۗfl 2BQ"-} R ҸM!+#$CFږ%RDA@ldLRvZFAX4Z%}NJ.9J$q/z?]G6]7ކ!z5*AD["|"2%j+ 8'LvY{SMk(!Y qKɹ$OɃɕ"y*j1z^Tt!YUޑeCqɽaY-Lk!J[| Vshm?R8/| =;ռ $!٘E,)oHberX['c!{Y`q)bb :14ϛ lM{Pg{U_ _u I^񷛷@~?dhּ~N:L;Nك~SxΎ97t@o6 rvr gѡ)Qv􅧠2F*&Ba&-lb U_;ZmY$ut3E:y&Q5qZ FZ󀆍s\>|7^~XƁWVÛb#yP|]~٫9֟n6iM3(JbB|Z뭝Ft·xgW_OڠAĈb"Z +*3:Gh:;^d)$M7|]ztι.ISKw*ٮzBr F{I[W t+,\ g1N=|2>N% 7mONn~ors G?6O>MοM쳭}z͉ns.Lo}mR6P[zP,7qo\JJ2 _R]Td@@;9JsiQGݫHW:^Z5,k ##]^iITQ6XQ*.cVлU).m(- 8J8҂8tR" pJ+%F \c*C*S\A2 ;&g(-, \RM*WU \QͼGWU`X\'UԡUR-•YYf66}r3P宏 MN?Z +)<NEǓ|h9懳ꘙ(#V"rN_zX%  LWq5$~;0 ZTr9,~2 {G>M*e󍚇{9N< |fy ZhYKDɜ>Bsa]9 $_}轖ʨfc~<}]\o|=_JըsSoNr%p]uE?Evwr U`{<*;ՖIV)WxjvV[GxǜW?o'[xwx*Wnӻ/Ǘg//FՃZV/ u7YTRoSϿ櫤w'+;y%0AiSxl| >s H1F+{ sݶ3f*"1:Qqyomy7:ygg1?v祜H1;YtVAـS.}X)Yi6#$xIX&PvPGvb?N.xq߇>^ZKIR=p;,;=zB}7y_QÍ?_PuοNV7|_{_ϖخ`ڀ*C2ag|19~ujuSCr>|${?PqAm`j<9FJ1*a,\[oi6(?e :T$W*{m=;ZL(a1.ĠTcLZ2 @yLB$ ,T$2dw%mT$,$r lh7{(njyˀt/U}-zpbCaP8d(2li C!s;i~vGR* C5 ,k(2 LkL kG# 6VIMkX,ЃUJ` 6JdeB//$@d!i\ܻumK:gͱD°\”٦K $NH9KPضqlj+is~M'9L\οfxwҋ۶G?`+~`kzC>4exf];#");~~ ΌI|eT d%HB H 1e^%GbX>YMJbgbJH$S Wb.@iB h|AR`SU2E*gG,m& }1n&vk4]L/TĹq7ν(l~9c}\ۜriY^8gT/OM{3i [E_<ε[Ca[M1$,@E+[F%&WD "T]^ 00HJ*,`@ Kb^7Q9YK%3%q6V75Q`>guƤ>minMlyExD|[yBG<|jl)'MBIYҫQV£ m0SZ`0rkDd1ɪPBln$D>ULvVf}05zZهġmB=Fa>̟"-t_&ٞeX,b\. b\}v6m]R. 
S,Y#!&]M7ܠŨ}Uid<&=x3ք\M쬟DK±8a?ţu-\#UkMed6/<Ά|y" #p)x]nSQhy-gZHx!E-GvAPf8;y}FoD(/h2yOYT2{IKv2XkB 9)$N` UzMPj`,,_~x;ε+,,qzwm%IUGKU#;Wľ.˩|5W$@hV{z %J) 靝݇-]6ՈLk )(FaXahgcwց PKUV]!AXO(55y;q!CϿX7YrpB-fEqҼB&IR2eMܚ@2G7}mME ܉v6d|B`Tp>,^Q~# 9.C)}-6N='Pjs7 |vA?kqwҖ&x/\E\qG5wf"i|Tn*at#uzy2R2T,3#&kT[9g\2qtɎ>-%ȏFxoFcRGxV3@3+GG*2#s:=njjoM74 g?Ysa>:{}=uǟwkE2%7M۫5UԌ:Ђ<磸]e#uhCSBpxrkJ3̇޽gݛ^܌]ʻ7WogvgO'c+qTp59Yޞ|pϱkIƞ@-颩ԍfR5vsw?u`Ų7yvq6n~8͠WV:{Ȧ^U޶yr5R:V4}6ՈW1?l+:QMs⟓˽xvyJwo}]}˻o{B]ۓᄀ]4cA`g~M?l5muMͺ6Zߤ_#7pYC߾oFDICӟrtk*>+FWQKش/&\g^6QKE47*^ZX!3ߕӸǜ邹1Vbbb6[1A=۩-pv83F߬~ဟ41Cd?["Rd%ZqO ഖF\.xTUTdnIʁ%/Ûv \4?Dzqt5Uz&O+,y7)*FӺ%Ld^ ۃRYg~wOkcmSc>C郲ѾC'ҵe+tz}{ X |Xvl&^YEͮ.հ?(A&Wp^ѽNyj8c#aW+~z0wv9.ݨ y_A`e* *gБ舒vBpf_!8[r'Fg"'J3"`9%!R_1ZEp_/:$eTk#|iy/-3`Ԟ .E>.0tk-|<~62SAP"GV"2s!ufPّ*99;.1j#-BrچUTZk,j).LqL[1cF2}!.B\n}X6B1Uj|:ۤZ`_?S Rך+h2cPqZ!mf‚՞kclvLW9FǹvsEM?:OFX^(~4NU:dIʺlCEwJ;g@՝Ov@ _Y9ȢN{ەZ_d=&6v !,,rpR!8\d3˜dTKT&h?]+iovIW"[>Sj^!KNɆ,($JR |3/VT.vCQ !(E]D/%"h!I \1ƶCQgŏzGrys(i<͓m<Áy7ufmCN[} Ksr\MIh6n.@=M:_IoZyX46IG3T}P22$6DC!O%πKrR1׮O[S!0JU!eȁPthH$dQ pRJ,F>nM3E#6?Ոeh:iU3ރY\qmx2H!iKh![jbYΘ6@H2 DEτԨhc$$-Ho8DbZgF\8/,|͒mmY/N/vzUE]>XCS?ֺaՇQ^]xIfuO6*Ŕ&1b' O{fq%$kmtͮiz ot{V+E 'VB.~#x͵]M^< g=kH8Ѭ;+sڼUۛ7>-Aw\=JnWs{s6oy9Q09m|MԤd't,< Ԝ=M!7;+y{>/G]j޳@uy!X2cz}U>&5wNdx W S I!$f9vJd+ᰑ}w l3c6Ѫ3(S0 SΘCI .*!Ќ`Vvk8"3Ҕb7`c,ܷ)i8HVyL]á-q6r?]́WNC-!z=*EadA(M# s 9ѡŽVW bN$/UL.gN{A%ME% g@+3/;s-Pyb:">¼ 96G,`bVXN`T $ru5c*\V8ƈ-O {V92".:栢HK5nLa} X[sΠL@ick[M(Ї7Jvxk ~?`Je+Œ _Ox;<.fbv́ll%SD Fg$ϼ !(b2PF1LKIKbnH.06p!qvtAH<AvJ0LZFgƉ^>x-ps;ϳ}j 1БP~FSНZqjW ]#W^c{AE`#^ {d׺I:=HyGcj#={$=hpO+PU GP>-oJ 9 "sUPUY(Aہ]%(Fk]b71Mj=a6 r軋O?~C"Tk^҅.u jX]{y>Q1 ,׫{[NO;Xv2UPcNm>7;dLT+kL{/xᅥ,< $P>a+@/ 4goTǕ'E.M.FR3*r>ZA̋ "1o7d9,RҁdʀB+/|f@(E}l25q֟!NX/"sCZs5+8aٻn$җ5yCU-]{Rq2aƫ)qM IV;(%(|CncPϔBTjaB>.q)FhWBh%\d>& RL3W R{76+ڌ20VoA&{hs Λ;mg<}#s7~0Òg&R]jԑJN5K&v_?^y8< 0!{VB8M(ZRL-DSQWJqTQ(,4u9lVJH?B"scllƈk+AɧiMʓIny}qkC[uAGZH?7 TsfXFYD\$ڒa+K=+E8R׺]vh*۫s\f280)Hd}<&x䉡)*  ؀kˆ^W"CXYL)#"b1)DрG!eLD:3FΖck\9\V>;FѴmޡcwvqMվ'iǯMJ͊?&x%C"6TRVq(l ϹJX2"514½h wS 
HGX`Pi[g.(CIŨ=hxfcXoVNKiHޝmzx^[4h9jR*妃}~Jňa*BV1с;A0$5 AZlj$*) ` 9U`Sj>˼{#^+dv[8$0[ܖ]n&-?=R%it𠩉qσ=ӖJ* sһ "M6uj<_#ݵtV] mUmNu}}]mA_^ z=Z[11`Ƴ{& 6K:su{iqS#jQAIj9WLVU9{ o7٤ @\nYI}ŭy'/#×wŗcu*;[ K:`pd B2:G00)b$k=_߽fý ߀ޠ7}yGO>0=8*"XOy4xBĄ{ƊzJavJ/Ki9 ӋюoeeI7ezCmtɾE}>Ƚ=:,#.{]K/J)f|O8\ SNԚHӖ`LkJ0*5` Vt65ډ.v΂X {<ȧ';Dmm(b 9 0E9f3$.* QC) g<}*ȿV,ު60ᨸS ZJOH d^J.`zGDBW1RA,*͸JItm6c]dxN Ե 6):(a! 4AT '=%4K#<JZ? _m؆pd&.s^Vr^L~'sK,%) 4^;%o >bhg_'. ͅ?\؋^G-D`ͥÿt"rsU)0Icsf%a ώ̤WK:{ " C`LI*c N!bdmp2%x~lߕ@0Tل5ia!rrWK;Nm'`X|C/L1J43y_JY60jR۪tپ4B^gC?.`8-$OT9L~(i]1#ZZ?y]=xq1=k>x9n#| bwzV-Ws;wP6x{1ꗼ<39ChUO=ꆬF Wv3,A} +3f1AUtb{%h} WjxjJ'!2:>]zW,҉>ŰKZR<{ޠ~\ ~~O__oO^z]/`f`\*$znt/n5ڮbt-rm9ߛ[=7[+Cˋ7qKⲽnҥծY : |?AʈIWTTu*ͨrR! 0݀sW66wLkmhvxb[-rcQZI7-}~5 ?Yu1}d~V;镢IM<Go1~FFyj!(c8b8KG!G=>isYxB^b4ed|BdU֖EⵣcQKt`Τ.\BwX6wFU[ Y`q@;}o%( h+[ڕNc,*2Km*CA Y,G1jR=T J5(كՠ$Q0L$8#M?5c 0S {c6"=q8)`k U c|F: lp6HF'g)O=lQ PfI"CقBJ*\]?Sb:6*l#?XnvVn:z$t0Qo4M3)M1x#^rÜey%3ȖXR_pȲNH>$ A"yHUU$@RqGK)eaiN֊W#ʳBojۢ'HtG7CZ;&z rFH(h&AG>XUa&I̓zIGcV |J6ҧ?8>?Rf"Tࣷ|8܍`wC򏿀\nj˛vVGmjN\T DKa XY]#g=5:]Am uSóe4V)ԓ<Z/x6CSJ74j\ŻꯟWw:E/ޠ]i1L/ƃI|7E2袎(R.ޯfb<#u4~%ovXznw$H2>נǃ9"S3QjIYɐ"%Xk$@#j*Cve 9 1 NP4?z-;:`a p) Xzm".}EJSٲI6"pۭz ]?w4ӳ6[*%''u!0q5r,Jqށ_ɼ*-7N+z ,jK1awa[VJ֒iTj CmEڛnr _-}iՙeCj]{58i>jFnwt2I1JN8騧?4+T y{Y?  .&75? 
e&W5n=-OσVvk]dنc_k8.n>0ԋK/߿)\?;覴}Ԧg@ |Vzy.m>RFZnl71"ߎ)6*)66a},SU#rEjw:, U)xMܜڦJgr=s:ewځĦΫ\eCYnZƟG<e,!]Znu '|+*7hK)nq]tn`"6]mN{b3RB秊[3*ֳ^m˃P:SEC[gكſZ*Xp).} &x#32x5sf"(ܚ195z8Egsq]u!Ϭ y U(Jc?{Hn]`Chd 6;l׶e#vJdfJ6m(ar}T]l7'O}f4v8.\&L{K\`xi<Ҫ%`$\+ D`i6&< | x+,&Wp c&@Bsh0Q|L_J(%vvQX0.{ڥfǾ ] 25SU(h{AP۩f@ $"#y9ǒzvoK>bOFԜtګ{lÚ(&{kU8yUqۼկ'0Nk>of^kIP$mUO3ն!-[.mv%e{Xl^6Fk$7@Ra4T/Ut.2E(J*f&+(8jƢA C}$5tDYK67᥿_H,ɥǑ| įc˻qշ󥇮w9x5FDGNH"6䘔Sr!*(/+D9ϥf9H# &r9Sv˨BdH]CS0$>-&ΎS)H4pM1 rIZA"Ct whƢfրedMA7_A3u uu}>c$Gb%hNS 9gf`:G@4ӐM`3rKʽ~j'NҚ&QXH#.=Ai-M1!ʤ|rQr :#2cڇе}V۰TH%ہ-6j z~Hn$ lcC"?!ZN>;فlBv6>;YNE,;oGO,8|A Dq䤹#ٓemI>ՎF F%"IV< d$SioRl._gϐ雫9xa/of!Lf3{S+'.D=.qj6eӲ%S*'ܐ|u{iRr‹fr=,nwI,"$4hhdvgγtCB]ҋRֵt[Y>9ܰȽZSpAb"&"ygLPDI`"irî [ˣVE=O_/Io8~|:y!bi0X6>Ou~nom4>Ze窃y,\܇nw& 5i2:N,RR@pGTwVNo.*?ʲ*ʲ6OJV֐>-ҵ\ں20-twn 3&noIz6wYmpr6 MNrQƓW \_ƨ~n'UjI?JZaӁH)Yp7dp,)xUS˷[w~GK8g짎--:LjKC< Ю2'[v=|k$W㙡>j]ϻ+ h7,?M.x;Mg{7˟ {lg1gQ';ɼsWUp72~, ˦yO*oەx(F"/?]ػ&;YMC~Z|Pk*(/=uϋ~9Y=pjphkNʑwNfrt+khW ,_=ԭݥuk|ԨYZɘ" +Tvx7װ-Dv.U~m%!O( 8!' cdq8GeO>œdZ<E4Q'WY`!NR \eiZVp)Hx46T,n*hv|6iGYĀs $? &YӀOs/l(m|" $9]$Mv7`z˪x;fM&`|G(6+Ƶ>5%c 3KJ[GMQ` 5y;BAAd /ٟyɋ]}yˎR7+Hqb0 ,>**bN ~ k% @*l'WY`u:{NPZM{Zdջ+㫧:>\H`%e9.\HZ GbW/~}\ WzJ8k2,1R"t3vn0?ȳtz H#d?׃_~in6Ք+?P\F6%шuMy'%eGEqnu6;+7H%`^x&Sʘ$Z&ă`'>׺ 4=ZPk^.3fkFfyAkFfkfkFk5#|ߛ$I ^($afkFfkFf{WqfdfkҚf+z  w: ̥9,V;L;=K)zDtˆ,W X(,jZ CF{Q@v¶H]] W^*jKH $´MȦ9;>ѯ Bz ;0PDKżW6ZH$ r5WE J2iQ&!W^0Ȏ1q]6wLIlv^mvv\6O[굽ϖM_EHR4z{𨙋Zd KEVF Ѷ"qHDh2(Aj˄RXЀxb ?ʇ $o_ 4 !MawCb%+00gY\E7FwSdb_sw{k[l&kzj`àFwk3kr$2;6h#I PQ Z(}#@Œ FHTK xxrs*< yl-)$Є\2/ YV`-H^H%<'uy}rW!p|>2rK`u9Â$Od '4)ZF HHyUK%EZ_L{f7J*2X!mB.va~/_*YFϻ 4Ǚ޼s2c48pH1dnI`BZsCQ ‚)&c0^ ƟQXnD:ٮΠ;ī&}֦]L,<}̓d8HAn\hwl6nsF:ד˻NVEihkZm[:6a7z̾4ӶL~Í愣aŋNY2@G%W͗tg֨nmȂgeRPtXL%Vω`zY'N{v߹fͽxrty~BFiBhIiDL )ih<#P&Xf zE@/@6xs{J[?*MRY%%u &rkS{. {U}N7M{b(帮=5H\$5G(qnxۛTktB$/iok@/&RitIL2su zWQ'З)ۨHϵ^4Q!;D)!K~[xd%lx" ),@#E$hVXW!&Nu0*5?߃yC#zSmʾ]a52&y: QFRgDD>VG2۱Ѵƚ&owNo[ %l͑eOFY\Pr'?~bpK!5s Ar#Պ XbE:y:j% V!'D{/Z)o$B$CU2ΥHAcwKzF J@lM> 685 BZ3.8!(i=Z )-FY#. 6i7L1"uŻZ#W1g(*5Zgq e/7G!IJp! 
+pI)66pc Bʩ@F+\.kP!F+@v$jws&4mv?X~Zյj3evEUxgD.%.疩+/mVVHSO 1ZټS>g7lz<{B^\~I{~ _1x_p߻`΍:Xs2ɣYLVqjެS?^YnmL9Zm|eYD+DzRƠZ OI Li*&-Ǵc9}jm?*}}ɖX3ŹJk!*($ꨕ>i b,)r!l1<"B=e 'gՎseL˺  M5FΚ[ S>ߡ].654 S%.U֙_YiAYN gw2hPC Oy)*.B. RÂScb}%Y7}7U~dǥޡ4BjR?pWnZbК[yG,*VLqIxY9?axi}8 ?w\&w*>!Ouyh|*nykSl!h )(Krx$C; ׽>/cה{$4<Ɂ1z@}('W*9#7\ti;;\{L>tv~Ca^Z}ťd>]\_*q;CSfAu:A ׊28RT~rS֓n7Rl[{-?mg  hrI׋tcl7_3vjGϤI*곟Վ:>vq^P#nO7%,x#q4/W:XIvUQSDU"d*"zN,H"Kgۤs RꜴ\QAhT%zRo.D-ҁC5b;XC)JVy&G)4JOc9 t8X~6vdif%?dΦͥZ5M&$:DO5G/%F\0R`^ke"(P*AU06lAJgTLHHJe80(3=y:EjLbcyh>p֗ȞGΗ1w] vdCiF3s 5RKC2OYh dOY !@p|KiRN&Pݬ{CErY#TcFxI.Rg:Ec" U7Fj W8IæV.RoS9]eLff8iZb!%; }rPgpɏWǫSK?;\oGN}Ͽ"WZ>vԚGIdd.-Q\T!^ =.U r&ɿ+c6^*P1 y]6$tM[nzj:\a cT},Əw~\O:ݢN] z|"!mY_ ^Ư4P1{R JxKL̮WY|kHEs7fnTV %ۻ%$نP WA%W3W3vW3՗B+CU&XU&W}LfWJC[zp%  {WH0+{W\2UR^ \R \!R+uB*U"JSy3.ܘqlqf :&/_+D ~)~ͫbՏ.wDIY*c<%9ΙWNNlV1{1Nϼfeߋ 8ĬqK㪄 y< $ݝq*CNsciPaǥ)33%~M8m|x5oTx|o?=9)B*S:LegSQo`?:FE#Sɮ1!!p ){ 燑˟yaԂ|zsg;WpZS"' >.=\e*jE•R0cnʀ˪q-fEA;?Gˉb|_zs58Ocd]/_O=*$7dɕ@*- P$h6`{Gۂ)jf3V{7gW\%B;%a UQa"~Ґb2 8 f Huweu焰IhQ`w}t]O۫j Fܿ:xkG/o=[ ̧8uwUAEKR4lN5S4 ),ԃ{[=G@P,RJʖ6$iCzۆ>R_sjNFg9*#RA4*p& ʱP d\DrSwedY走H.Ex:E DK VH #gy5.R,\pZ ǹJ@_E)'?3'qG 7T`ޠq/sUX͵b<)oHwJ7WX'U5N% NMQh^T/dt.2I•FI! k6iI&XsNcQ80JRQk"tDhgj{ȑ_ǻ:KEٽ{AXEHrd0ݒ-jI)[;@*ോVgݹb9?#mWNd+ >8g5*M-\D E) qk%hkN641AiW>SRatRhsVpU$-**&[Fdhq2[`v m&s d )2# f)LO[U&1:VZ׾R/#"fãBIeW@SM`lcG4f#,'kzo/n ͠a;2mÔxWg#r*A@f1; =eO"xdDņLEh}N%Zζѿs6=3]i-d-vwܖVeYx_]ϦhA%n"Pڤ6:Mklb[@i49 mų4-]Y/ cjUߌF79|߫ӏӳiYk}jy/ٺs/'u͵+Z3κdmAu)L'2P@ߤy!-/ъk@Y# 8bnv˰6*_=_s V`e`t2¯]`i=eAH}0KKyۯIMK͙@)Y5=ؒEK+V>-9$*|eĀ4 EAb gNDYb\^xliU#Z8B:2<FD)B+'\fdx Ev]MK[MݧrƦ7t)8#ݓh!y4bwc'#oKXJhH$J/?4bI ʠǤ 2%[e.|f_op~♾apo6ط4`JK(/a3C}RA+o0܀\YyC3`FIF%PӴdHnuBKβ1EͲ:I+U9Hi],+1r1M`{n1! dm#VgH@b-hy6)5ِ^#'|Gl(u"Mc01VgTSQ0d?Eܧ&>iEJ%djk{oOy7~[b{7j<[袱* :Z-TТdg0h &A q^!ˆ,~'eiIɖ"(2htNRY` f@ZSu&E? 
p}bx^CԎG/E͎$2x@A‘ D6 z(+ͤo'MӢgk01K2j{vc[@jX;d/2vAyanYp(]`F`FTQ4f M0.$}<j鍸Ƃf\h[fBbV¸M6NhgH **u}_-1F?FEw?|o}VnH6^a00)Qs<.^+[Q:*Ӷڈ+2:'O.O>_җ.f maxFg,ݨo=;o;i5wBiGdNi~˻Ŵ?bщZ~V-z d'N,rYB8oś`3oo,`":awb<9b>;p*\-2=~?|~Az$,eכ5ߖG@ҪK5u>> .F&I j׳L25R;MwɄ:H g!Gv6cbx|:آ*tԳis7Z)b"$[ &guvzz=\k|m/=HO K )K5ǴA2?0Ň{6tX7bɶ:=Ko=.vDKoީtϛm+8?Ϻ;NE"mx*:3@V*>{Gg&7y=wjbmş[n7lnݩ]!jl' %.YХݾ\Wvά:ԓmC6w*.&F僰Au9G>,s1(d H2a:u=G_^N x<?3TR6$5DykRJb<& PEjldu*W40<-.~vyƍ]^. ;IAZ?6CEk(5n9JtաUSQ?&ix[}tI܍Q P?UN`M6^yd <&3-( I@W)KM\on\?尭ccL'2 {8hۼq4k`2U5DT4ZF6oaz̓:|, |l |4 #@ "GprLH ,p*!,^dŤ V)HzBoϫ>07o·@jl͋͡`<#9Z)hs#Y.Y"]4:0 H:i^(nC OZJB)rQ.WBiMe\4.<YJ 2n!2$D҅#7@]!=ǂմxh+=vzkgq 0.qwDя==+95^3р`E^{A8 ysG7-s)^: 4s, %Zm羜A˜S92(DCc0Ǘ3W~gqM׶b.]/Wg_ UHלb&hoH r3mJk]h?ѼCsl׻6wzn\kZPR[]M=t~s[׹.űإ_5ݝ%|ٴvNM!sI\s|'5_ͯ2 kUmzpkXmTRSGX,S?+{wP?V>rm:Mc-~ _َGF{V^ֽ <OOvC; ٸ4^Tui2sυ`QA sqKI(hY}4WM=vewEqhhimY/fx^] |-&aWy-7zt^uWQޕ5qkҕܖ/,UɽS^&JUj*lҶDJjiʖH6⸰A8 ª.PJFyИ)Z@ n4+ :qutMEǤ" ZgQiƝVc +"XAPVўwNJqOCu~.,u?tۯt,XCϧZ$8)h)_+0V҂Yؿ,\O5` Fh圗)Bjl$4 josp1cT:^H1~l?ݣB6WKP(pfgÑZ$Ǡͥt"rc90 Iocsf{ P^32Iuaojt~|f"|N߇T%wM}$ k RhV)hr^]]x$9,yLNT9LgK}jJguto/gӳ惣Epv1ӰY=T_AZlZ;:3B!G-1~wKm͐fEfy чb\(4i'@mχ [%Vlk) "#Hj0Tt"~p6o:nɱTVT*'Օ qgx˛WG*~䷟t՛h_G'o~z ?`\}-$`6&f G4]5 ͛Ӵ̀Ӯ-^>lB;tW{k(=~}zYKs;']:]JȆe~^Ck&U]Q}%B`sW6!l^iM<,% M̖T>1(NUi9}qk~v1md~V;GJQ&ߵOLyx|/?1~JF\j!(c8b8KG!G=>Z ɎG t)#Ơ(Pn$T_(BXRs&p xu [0J~WQ;oa܀ZXtVco%j٠o~se}]iRTle,(3xn23ŃR$fO=ubhdI S GSB(ECu TDBH`Oܭ7/.*(t@);A2 S蕛cV{0:$Nr/ BzDe}'FsLJk7 U틸JR*QcW`+ k7 B틸j%yt\ՋF\mNB qȥ|_U]WJIzq+5zmzrq-67ǵhᬮ:'e8 7ۣf.a4G099g*#o"{$퍘N /b:QD4Ht8iJSx;tuj+i1.CW^Yd&Í,bc̱|:o6闔^kЙW:ѵ*NRaUe%$B?5-sinnBDw1|ŇBBU@6 fB*+ie1d6 R0f"+T^Ox2 N5YHs&Lwϛf MNJI/b!H#!l )fӇhfIc)yr;TMsFr4nN2\)7**~IC("TIy`C g:ܱE˺L;Ί4g ڌ-QVM.Gn%,a߻Ijb8-LS=S:2 h{EPzuP(WԹ\f{=3daqRr[& hnE{)\WTez1fUVU$}vd.ih?7 VUAAEdrZax~~M$R}zC˴,{U:'D # gxLDxkS@BN(%9fc /}'(bw7l{#?;FH]U?иZ?.75'5I49?By1L/d̴CK=%\.m!nBC=c293?%J0c8P1NܞNgNTgZBgV (`C<JJ%[?Ë7X"^#277cmH棄#׼Zys6~TarЎSS)l] E&k+.q {CϽ;}5ųY$n':-wMteu= ~e~dGO{~^]w+RP|%d? 
PժF&]*r@-éjR@~;uY'-}1?t0E"~VD9 <0Z3ᕡ_ntOhgR:`s!t/\Qx!T"j2 APxDg=wU?FɑsXS;cX 9ipD!JN88;ܼ>[won?3^={X/]17OUzыi8zء툎)7MDEG,Oie8.`(J JXҬ\嬣$ǻYA"\H=7$Rt&#荖:LDHqA#(59$`($|Bm֎[}'@A^; >DR\Y<YNk5j]$QT[7Bw\>Tzv9>KPw͓yYR8KAd(Ш1Lkt>F$H5N&^!"9"FƭFxI.:EKPP. .&+x-d4MųWl)jr]Y/dY2&rmYC=L`m\F? wz=+WB<7Iim&%gKoHẨ%CC%˭\!H06N4&Ws ?yQ/hk݈֭yѻ Ű|(nhc/CQ5[TfE6оio\?̮'i|`w?|kTusGPۋg=d!*_qx.ʆfrAԹiu9/uʈJ+IrϢ$3)GE_/{*%gN@ O rRېOhr*j=-(411 s[ < a)дe'\L_[U˅YDRzLCn8Km=e}et&kI5Z0H4 dYvgNs"q$HBE(CI4'҉BRYJjC3|IofLyod~$qm}Yu&AN99yZU>򥮩_JrKvaX11FkjmYXkNkwv;n 0%t m'pM " ZzP$Ɲ*zh!#2$hϢB]"xIsd1z`]H"G:*[RLamԷyb<Xl~Ua:im3rR"'HEXP|U1K !Mp3p8߹Bs7ÍE5GRq,g\p^($E=ZҔ+f:_:O1q׈uzP/Nی,6KՋ^ԝ^4 &N(F-@t2@d>"9p?'9;zhCSX+l}],^{N郎5kEC~ב_-ُ\-·DeP9 .qC+)s1h5^MhrR!ZKB(=cHq[S4 1ZQ399~ W|X~^Zי|}.{0RȭMGc"Wcs}os,]atՔ^=pH2(YW%+,vv>yYWvyon}ϻ!wG*_4]^Bwٽ^oIdmۢ&X_mOtY9/d+N()nkϯWxm{nE,*|iEYjHSml?n˖β}f^6pBOk3*(Alp*p( ,HBF"@n0J lDH&XNb*&62h;nZ ӁRРyJ!k<ynFwZc4Q^d$A)"80z%Kc@FE[InJaJ1eP'N MEBMi%p۱,rp5~A-@V(gm3DO'#=0Xiܣ\"LB*3p'H9`q"J L?@O' ! 4~Ci ʃ.  GK[ yn8=##g/\)DlSƼ@ (H}R7<KPHȑ&;fbgeբCf7?P2[)E}[v#6͉I_ē77_xƻ"V͓A{iK%Q[ I`x7bژ˃~ai7]qw1EYO ~U(WfDPW"YN!+yQΧ)ZIQGh_&>R8teG *ٳ[V+ay6뾎AmNb1ȍSrZGYRHHv9vA6Of'Ncu4:;X>\SNS\CGU/8yeΪF*e` Q~vA {z*~?̞>jR/o '!$LҏUMdGgOd'8Ν8gR\Kr@ i7lr":J_w?l|:I)uLj ˺<+30U0?`TZ4ό!0?Nl?}lTD[ A2 !lgfSԡul|<;j.*ߎPM GvgRTkq⇫TELB p!y_$zz1 ӼuE*2ΆН)#TfA/7+ ax{MK ӛLPݩ5yL%7?;'uy1[ͬ10%f'[&1jWPԘ ׷ՊrMu-MG⚍T2ImȎjjg\Zk:~3mZL[lʪdZjּӯkd8OD-w4#.qtz+ [^kq=(e[# +uD;0Fw9ât1h|q~;#(zخ gYFh쎧 \`c@ttG:+C)r~u8H1>Vqh oQ^'EǸΏ;qñgS<`s-:2S:c}dy! 
IUp3å O=cF@A+j$~$O&$>v׽&=^v]I^ b VA{$UTIٿi~ן /Dh6uB,[JL.0">G[0;̈́i HrV(t&ʀT#EAQi%6vL&.m`գk\deӇv_HK Ȥ˪0Lb ɸV!Xڤ nF{d4m*uXj08\k,iKy:oo*?QgG$qV vijp.9Tځ)#uSͨe$/D;q!"^yz1*(8L [:!{VB8M(ZRLA9DŽRn&x92J#KŤ:X+\$JOƾuY#BKh)hEyZPqduW1 *#qؿ V|<~ 0i9//RnߏJJ.͢0(e_Yf.s?Ҩ9o(_ ,i)`*$u׭ _ ?}x(?%†{d?׏~ROguM/ծeA;QmiVojӢ:Ȟ"&qtNZ]mˀ|zAjU.OFo>}TnP_ȍÌQL`$`F3OUtu6i#+42dxT#lH3#<\"ƴ'zͰx (m&hQ:02JRVa S띱Vc&ye4zl5ͭ0JRwְTu;+ )EyZ1FFXTy\Gѫ; -, QCvYI(P{+%Hς95O+ƫrѤfwRo|r#HlG yuP:X3'gq`GEgk'T-Uǣʖ;q.wMcArRyQ0.%+SIrF$`SL st~~d>ZSNqS:oW12#XPE@od썜؟q̾8X{B>`$]$͓fTV8mRfer3O>GKGl3YOr4Kp#b"%AѦ6$Gpo-U42dZK(IItY%^Fґ0хپYvRsgP/+jQ[ =h8^шb+$4UcRPҎHcp(8oI#eV iU3A3 hG9ART#`<0G:HczYQ߆&`D?gD"2%c 0BbBKcvV[j(H!A).$gWDiCHqb pz "u,i$53RZ􌈽XŲMK޸䮸zE5​3nrr–KJ  B $tn]B-f.nw{㎻p &uŻZ#z?m8]FQgo7k?Rm;ϓ0G#> Drf}-$R+,Ōgֳt8 g#R1Tc*QRr۴\ WKR {oêUIwm5M7M_rD]~f"w2࿋I[}P{M׵{h{u:G9kvU{}\նPs5?L'/=j~(0dI<4M8W3l/ ['bkV] M7ggx8L(&{e #YԻmWE복cZsQ@KK]\[d$Z-mڱ{'ji]'C+}n-ҟxt;02_֝\ P,&(r0E9f[ls qP!R۔}H{QH x%,K59.| =BzD尴VYgSץ<ś!7wBOEmv[:e9ߋ&7 4ᨸUTHo-H'$pR2ȃHN:_uxLgj}dzlcR YTqXDI@Zk'+g{a &-NP:RTfrs;|QPËy91>dEPy߅T@Ҍ ggrxkgrmT lۿ4B^49$|]R0y-SS;'7#Z^%?UO糓E:1bNQǣzlGZj>.{vb C':{bN!]Dyg7чbRa`ŴOy;eѬbݽ%zzW`٤鸆cQV4*'~uqFg'xg<}o:ztq_ώV̂K7u7 G׺٫ͻƴk&]>ߺM}rGŇ-oeЎZ ٫ǧ~Tt𼍃vҦ V()] Ol:k\DEW*w+QCM ?OqfKgVHnΖԩ|cQc>Y6?62K@V;GJQ&u:맦 v牨G`'ɱg&]9ބм K8@4+9"O4"hPDKItFk(^rK$`v /D*-QZT?&`l"B; ; &NQhp`2Qf$ˋ{VHu hmiҘbkq|/OƣO53OF]}Re-tyˇҞNkjl!MSJ*QJKxõf\1J. DAKKR:@u igRi6r:᭶&H]Ҟ-EpnY6sc3E?, V ߰W|Wgpɗ{Ga'&l>|y=~;Bk.z:+ןެ'4+,zR6)E%"(Y~K ҳj"$~2zɶ@iqE[a#[ͥmpxu#ϛ9\a ]|Wx[~Uy {͙_<׳:jv`=pRo_P~SQCi_$}e<:EE^}qf zI!B ,.UWYZζs+GUXJ\+p5b :zp%crA1 \eia*K5ճ+%7d c0+ŮUVm,1\=CH]?Ņ<(.ox5ClB6z녭j[a0VIDO,xb<֗lvNs()-(y R 4w?OA1}o&y|uꑗ8@}Ua\ŏ}> ,ıp5J-@ L!|?\0/#QF)~a ȩ35/ Z+h7>-[~޼)B*S:L\SQ?ЊY⪝!n?@)®n20JKKY`;FQwFZp*" \:\^F} zZw%-eO|?)ˢm\= :zhS`TeU]8Ixx/H<3.8Ά -{gggGRJN(e4qd0fvE}_*Ů|,Yc9 L[-T 7?~_\徫$z]`XxtphdV30H@5$%x!b*7Iq|ZMp}N!/2ysZy#Ȼ*r__.Iwv{y\ۛ>7vj-uclcSU5_\+ h?q\qc  p ^% _RLZXg3 aRwkRE凮^8w9 os:੪4˽?R%J|Crz'|T3Tq`eF}y.oˠDł. 
liC" mSoG_W[..~]mQڊ,'R<`D*F΄!V9qD\oFG")l3V_MZXh!Ou42QQd5 \Xd%pJPz8YѹSQk:TfaxujrkCKqNF}^Vג=h)>]0fS~\9jY"|w\t*@w2()^sZ/WU]_|8z벅l i(-5ס?%ڸc^ٕޮ?˄A,tm>YQTZF/#q*(P*F$ (8-?\N2sW546cl G}`PϽᖖ*T4MeW:+摲`-ςEm"hM4@ku" @IT[rk)aS A%I2T&րBPƹIB81 RO5Zks gJ*S5qn爈%FbN^1)sx즙Pm7Eފ䣣OD1% Žۢw.ټKl>Sɷlۍz^u]6h\#gV$%ѶH yDhU&IH`s$,*"E!wJ0ǃDn=0bTm90(3ٜNHu mm"erݚ8w;˓jR&~ਖ਼~ E Ǔ캛CN>L<| w}k ?lpKG\`U.!m}ZF2sF H\%&9!H9RCGAhVW;D*jyP*&2qF_4L|R:78\ʤk:ev kVA6ߓ^.~+~Su?]%~9C^2ݧUo}q=R6G~ ރIuկNkQj'ƽi/;8oSQuwk_=HImɥx+=MԼ7CLt0=l/vUHU1+ GSh4\)oōJ*![LjT ?fgq]Hͅԋ^5.t̹'*ם:l h ,767dޠwϼqy5yO,]Ι[mSa58d67 dO#TTS!z]18ṆTCԐ8t OU!Z Vͥ\V=CkmH /~ 0Y9e PW=3ǐ5HkDNUؠ{ zߝg?Kv:9dRځTqwImٻ܈vu_ txh}m]JkmvU{o؄sUkb.WL[Zo=l[8/, eĖ "zT jN x=z/AJLpMfyfc6*<^o!)QA<ܷ{X*=Ҏ+:{x8{e(LHK҈[W3 9Py GE5i e(י%3޸a%Ybo c=Y?ʅ{UFNqKQ5&Xj\yUWuW5*~QE[!!>o6Mo&ho[o"_/wN|T yTƠx~1<_z3|qۿԖyc]eguIuȍԜqy:hā|ۿzmPY }m9Ź{QQRH)NgyAb "hai)<1 <0}S2,fJPF J=>XD; D)x`O/WM T*jOL~&=A|ޏqף,7KfٖӇophhmvWHL_/fŽp]ⴀXt1(ayY 𫓞8>>JSOU#I=u$d54LKR0"Kg8w΁x @11C9ɏ \);I}Ulgoʉ7w{lӕT8t!؋>$d<\9 Oh!t1HYšY/)rT%"}-Bʣ# g.] 2JKtV(ŔDPMwg}m~#%VrR֕u[l@նB7;&w&݋zhHh L,A.%%C6( $(i<^]LmPd?X~}8'gs[\O *z\v-j@ҁl wMknr` 1JӨ_x)A֢&J!$Hv#ZQZtJ`U JniƔFY*hvb E;Otfv%㖨aeNiﶼ*}skr ;M[Gl C5$4#hy) Kr\^Dńvi!+{12lrb&&10L>/]uEzvqXR1mQ[vڲG^]}'LXC8uR::ZNXMpS1">ؠ'n0 wZP@ %C E{bMKuKBxT] wFz#*Tg}HLH ' +T80q'ws=K)g҈>\(=Hu[3T 1ZU#Qع3׹peiEת,6욮Sx󮻓D$+.l#$)x+ EݺòFUi];u֮rNϋ]r5?зOחAj~1l:_qqYgbkΧ?o>JxQu^E9#Ee4^7&\)I%^"}&WCID*J=}> LD !Dsq qEN` օJu@8UL,9 4qoV6wEPݦoߑ`ףyj1oe5p94rZH :j%xEOz¸&9=qN-D:υf9>H#…Ym\D"F24CBמE(ĖLxʗ(6]x3CPG粝-F.7 h PD d9ܕ(Z$U5{E|.| PɁg3FpAip'qtHڀF&pGD!J.@k} b(d8 od47ͺ '4"=D'LGH3~y_IVggaj{niR|h^3ΝX٠S\V㍴fg&bso(&ZsKwG+cw9\8LT4" sn7Cya}q^5̟T/T}C&>P4Lfs} YpD5*ٖN%vtANrd2Q9C'o,\J$qGpgzCǛ5l2r͂o2r->4J복@|qaX}5ۨ.Dx 0?AqwCr.*~M硂2v@]]zH Q^#ݔ&V>m-S$%Tf0;J̅7'P]]𓶋?'p gmEx #њ K~j˳&"V&ת%Hh$GOz"DP*oA$$Grσ蕺񹨶h!4m3.₡+tDV$cՄ*MoKn.8؛ ֮xb Vλ b|l^EChszk1w}P'}BhWBκ VTu@Dᢶ @8e}dc CS. 
OV%ɢ\PdA$IL"0tHS֩kMϾ I]T' I` ng||Y/')m[SCq8CCK\TM\_py7—LUtLґ&  DL&LL`߃g2B3/0xFhnv͠•@er \ejwTJ +sLWH~GU2pJ–2J \er9Bj WJ{zpj n{\Oӆ\{]bJÅҎ0.GU\qXt 3tny$<$VՆSvMmVikk EٸG/栤&LLryå O=cF@NU`OLj9+Sk%iBOM6\<^ 3npxJUgc>PD ga ]ʭ{wsp߯ ?N>li㞖BäNqAgwW9s})']Cfjڛ); L0w*⍪e 7o'=)!^Jg럶"g4`9r -%& 豂鱃т$hR:I S B%E(ZhP6>]3x))qJ7DdwDðq 2q88 ؿ3An<9LBKK=" lkR:FFx|(vXU@ fB.+ie1Πv6y+$RP0:~J0b+~< A In`r9B/;QѩpB;Ѹ9GH{)~Ҁ;458ǜQs ͥTr5Nf#[Ԋ6fXz.Z`^yj1*(8L ضThENJxZRpN5R1G EmJN#KŤI;X+\$JKƦU٬҂$7ܦ,kt[Qag̕~֥f >Z4bNH51p^lnւlSOZvC=@ɯ2p *%eaq`St+L02:C+r7RSp0bcD2Y+냉P=eDDL )SQ4`pHn\7FΖxָf8PJnz?z]Nfx3[vJtv& ]҉_+'៣# Ӡ h ŌF`#\xΕwֳHeDLGk H9ɣ: !rƭ`h3,=Q7Tb%p;d f0g؃ڌ%X-;5)Gzw#@jVM0\s{c0J EКWY +Wjx.k!$N^I9"Ghv[0H8DZZӌ W=<ߚ1Q7F@L!<`L(dGJHc2.sXJ$O_."hIptYr]ϥRM2u4epG`',@#Q7gRg~f5{v[pH`hQOEMkv]@ D6iW|4p[WL1ԊGyOy0VeXd *}s3uCbˎ%wN0u—Yv(8tqH,] =DŽQYX}tr}>~;|^8՗U.Ôi/2bxVzle; i~"DeVՍ6 mF/ 0؊vjtԧ̓k_qݐBnZufې;!]ɷr= V`X?0b9ڎ/nlw\Q7|vF‰T>xPd ߹n ֹ jdįk5   Xߴ"85u{.a4׾O~G)=Ļ:0 9랻J ytmst7]% VUv,Pk3{O#7 }v1|MԨ"=f_MX di2'"jV]6I$@1D{uVw ր&A[GL *Ue2RFG[ofߞPCWf \~wvh >*غnqOhp/YQO"V5"xfOW?o̙vf@ؤ"=cą'/N96^PR=~(i&&GIOvD=aؙo^7JZ059w[rmÁD]}!c);;h 1ݧf"wE|C pxX쭚<;tm(b 9 0E9f$.* QC)n*u+@0!S&aJ;f2Rʃp PX8z?!廃ڙˋDIp)eRˇIbzWjOMj^|Z[8|Cnc &2w\ a@?Cc*$[ tB)EAcf$hÞH:| IЌ kH#R"aTDR,*͸Jb,aG!еYE> ➺ ƽ P3;3|&'/YD.8IC $!V6ѕ:01.U埳eˤ=E~ޡnJBry|ɬrR9V $"a :RLݹµ= COU(! H xIV+x$?tEa_^$. 
qYOX|/L90y g!ax aS6LǓ\{hxs;>x; Ζ\smt׽*זǵ1E ^tq{!T7G31ziH4Q^;,|CP(XA0Yb~CZxqIG!LH4eoG b,|Ay7$ҠZTO/3׽v|xo./޼pS\߷޿S/R0b{~=Xo~jLLMfjSW ͼGwy-ڮS7}cw䒗>)B&hOQ@e(sU?-R%4y]B(f8&kכ_(b, *0w(Qr'597!jt2 .8,,&qXsthL|KkrV;ud"b=E ɉ{9Ӕ 5r/WX[j׎ZЎyF!,ҁ9X{ %Iy5;zцa\EPbᡓ [=`p,~ses]qlJ\rmP&qRL2g*Dώ]=a hJh^0(P s4P0vWAiGjd^ 婇9*5XJ1VنqRclsA }#u#@kG;zxw$.C>CWzvk=:Rb7>`]^7QKnVKќdsk,QV8dY[k/H$!<$RPD=7;"]JXL)CHsB4Vj Q($~k.JSD;>O>P3Շ` } hAk\w@q-|Msq?7睲\KrМX [0={A碥yQp8ipz-;:`a p)1Xzm"ͦhUY6k]YpQjb[Bf.ss; n}yU9K>#RmȱAT)Bg.Ab(BJ qh5Z`(3*h2:ڀmT yBx< ~9wJ `0uaӶRFh5,)fY(HyQB̤^J@=E^&d 6H_8+~>+3.Rf6RZ;>Eko))0G.!G\~q?j:,!Q5$a_"ulpp 8HșدP^*+ >(@.` mvhu^[zA?E.'6plsʪŷ_ނY9I.e?`&M R 9+EmE{]ܞ&dgv8%p]7řUBh޼!͝Wn|W$u.g_M/,R\rʕ2ΦG;%G'eJJ_yFsGK>Q0=kk(!H9tTn$՞etD!e!x0k5f,`ZFL&jKVHKDscliZ+OE~;97|8ľUx(K 2PĜG,Pm+GE܄E04a d"-' }W-;LWm DUUdUW D4&R]pF$bfwVxU;HP/%3H#-D 0D[K$lZDw@wk;}gZ,Jř"96[:;"(T=og~Ցqqb4Yg3-9]c\t#.&بʏ@b t+!PXM}UYQEͮQˆtEko8<<emke~?{AA?>b? nL~dmΊg?sp).Z:tꬉ:ڨ%h2߱weB0$Hjhg(%"oESL9PH496OsJ|,?Y}6g.||{!w6o]ힺ_Jw_Gwѻ9nҋiȎGvj'_j0P+uϷvwCr=/ MKOx\h;,TsC 36O'z?9gWXh"!{ F;hߝ{>n9-wjۧވ2G#;ЉJ%vtY$MmOjvd>CdlQ)3$gR^(&T֓9Ƭ ZcjۑGHA5>?(;'$~Q8F%gBNJJXW 2PNzY)Q>GAxEhKTƤ |Aj }6y$jCnRr.)` PH+&ZQMŤSO)  o;A/L}C}L3茰E0"IWU>S=8W#sN9,!r W!zӀ9<ڮ,/3w!٘=>s+@Rq2yL:,$XE|ʦX*PĦVKThcZ= à8Ӓq`݂!˛P蒻cxrIu0t!sr˜sxΎ&dcvB <ڝ=&́UBRUpYX>iEq{ oNe1IC}mQi;HzDNVֹ9rC9xm=H׾!:rvgrz[xkvc:0 u'fW__u9ff/h~>}1,t7W ޡQ]QY-R@GΩ`W>nsi Z|tYB95`i2CwG;^{T؞ QS]7>_S׳ 6lQC©.;CɺHc! 
#SOuL!Fd9x3YG䭋)2,m.` 0Eh״4Q' kw\dQr(:נl[jtIX.4qׁf:xː<tR/auP§1kّN6k1rX &'sJV@V "ҷh~KS쥴7\%f#JD 'SOn=2Pҷ]|F\,J.(q|ށ.ј@Jtq$e^_lSC ^t(-[@MeU"I" @hRDdilouxOϭ曊EOtlJhþ*x7 R?pvN{B́5.iϚ-r *#OLR"!9(d 6SKd퉚uXR*1bdKEs!S톋xА+*w RA526BfJ3,lbif“bVO3~Yŗ)Ͽiyk/ta:^<WFl0RĤ H2Fl)&j *Qxa*[!R;]C *ҕ/(id!r', BT|iُrBv+8mcԶ#j;a xe&]Nc [t ZdvD$d-c1PtFGJ(tFƚY<-;QA$ꌡp3qaWR'v1Dl""6FDqD]Ee"N4$lZDw@w :"pl h IJ*gb l%8V:EuZog~Ցqq-kHgʹX\tqэ8.d>S4'A$!]JVSf_@eЊ.^ovF\<.iDZx?OawF[پo~{x٧Q,'ُ/ 48W[Q*ʦ,h E&haLsƺ eYRjْ譣(cvC)`JP4iC\ۃu]9fL܅~/N-M =uf1mVxtǣbq{&;Ș^OGvZ\TG꧌c5 ;=JƱ)<9پOK~t'-,#(U ?iC⥑PdT\ :(QZAH8U9JAMs'6٦GcB/K"k&΁G wz>Sm̳JrAocX~}XWpW]vM/߽Z>{7A,ΐ1[RZ$X\fȪc%U*$[L-$t?hi^kmt6zȿS_o<DզZ;]jBo^XqY^tx\XQm_$Az8YIyWڲktGGmCCWT(a h5eгdǥHhH fA24O.?^?ٻ6r%Wrg6ߏ"d6/70uW$ىg}-a$[-[;@GMdxHV UًNחG-DϰKGD,L90 `Ibuf 0~|QU9y2}R=Dxo\%y>gKt5N+ )D0!:>RLݱCv0yl(D[ C)7$sHZ\!L\qUӤE_dX|__e.H6Kʆ0$U/AϾD/^LzÓrxk ۪49իitxQzky|ΠToB~ީ*1#8|xbr^}zs9'L|csv^-ճ;(Uv y_wn7_2Z⨶%Wԫk5#+'>LVA{5N39X}ouuֽk!u8]dI sP_"W҉ >٠aRZTNYV闏3:_?}ӫyǻWo?B~}+Xu=0 0b;nm M4 MmӴhߦ]+״{s-6wo_إ«tq.f&; t%s~Ći޻1* ~YJh2|aB`s"Om|B="XO9(뿙da1f)*Mpz8/׿@q vO.~y4H)ʝR?yo5BPp@;n q.#N ;->TOH%>MI4Fd> RHvGl’*3Kf.ar1$xTN.̽ҕ#>J::)0Oֳ융|ME<1-^}(x9MyZg J`$l}WwQQ <3OzNk,./SȋQM\?W)+nR=V($ds)ͳb^Ͽ(O v @La٪Z]eKxe'_wOSVxv 2X PY{>~Y ee/' iӁ>Jr]b4>+Y.,ˠ?8m̮kQe'ETp?oOѲCk@I eMag$ ﰀO S=y5ES -qRZ.fȻw^[/P HYg'>h9젥i}һLoJ Y̭rmKbhpއ嗯oQD.CԚ6XRBo,C2&7lc~c0r?Dp`ռ4G^bKsm9.O :_-5\#Z-mm%\Nȩ^rliĵYO=;@/@sda䐤2hS_j*e;Q!(8vН-+n's"3{5t;@ 3X;Z9D/p-($N[a0eBk$<[ ބ 4S3)!Do7;cİV`b`XTnL+B"FKˎQzW Zőp.uXER%VV3"|lH%1QMj@}Jyyuh}5ŒG"" { C"3T#2zαc%ڳ**yںO^}^?>E9Cw:Tҩ&, Ƙcd46ә6$Qs,w`3U%6jMƀmz)y\`.~Cre߆N.|wyrlkhV쫈vbb;0սkx0-y'f>k8A ]+8+jC<2ؼv)m!A*=yy&wI &uD9hkcN&zG-z8j"]ª߫EY-ڈ֫] @_l[@HKQ-?G^2۸yhOXl>ޒDw]:Uv=ɪj?OcF,`WO1:DDꥦ$<EV 1[C׺KCks2,ĜfTN1-ǒE*Las'>[iǬ`^ %vϷ޶@A~@A?.GQ7q蟹 lNSs}:\H%Qݞ3hX[^yEp,I&ab,Z@n%DP%v+RP-j#d` DZ-?}Co}q>j鯚6 j=Z-|1FB1Fw=csShs ,HM^lC>/_ڦwc=$oTJ6( 8'L02:CST $}q N^{nN0m82&rizǑ3,JGahl9DxzB ]5p c;)\`c`*"sOhp/PXQO"5bxbݺ1X8Fs/:݈M ֭Pvn:hzgryR%@QDQvl<FoQ>^u▱"10 f1grH8+jçMzͼc 5A0ɵ .%5\ 3f4j|9#HbaJ[foj߫_!/>yNkLB_u<+| 
HC(*yGbNw8w2nY*u˖dcX1H8!8(! PǸOLDoDGLC R)V_-/Ct{_ y;tAwb3dy1!ǜeLsC=۟LߞVs\ f|O RjX)@A"~P4H䚒-OF.PoQȡފdd;Na9k="hABIL$>&ZN@M.F DKpփpB(JF"(N98 6[do]Rt`~k?yooVA΢3iJ_mkxC_WT2fR0~e3\J]9etۏe*yw !RN2R~WUmWWJSWO]=YAwNi/&n~u~g<Ĭ(x\'CI *p/؍+Xݤ3Bl3vJ ="r:>kzuvo?@.OZrB+U"GQd%mU{Q/=JQ~ëqc)9Ϊ:H%C+%L}5v3ޙmR)ItL,RpRdɛL&KkLRΧ2@o+I%?> SkYy2Œ r͆'xXDAdOb<ΚNg~F-1ӣ as#њ^<}~Kʹj}{D%n_}Kou僽rJ/!T&yue IL#qc( Dx0H߂1!+=_9cS@j] ϊvZ-2+Ff08U9l} zRO Wz=740 UjⳫ?qN 7 3ѕܔ.fv @U['=/a$w8~3i݅UCYʺev:—`|@b! !-MKqP`}wҁgr+F3B>*SIIX ?E˹ʃR Z)!Bi>-؇EC,E_nl ,X>l?\mӹv?%L$+tZCgJٵ}ݲk 10. !oSh\ vzQ# HRӪS-W@PȾba>4g33 eSg kY[#b!^:%>0gX”6R jʜ g{rE,<>UeRvKjېTȮMMh8{K/B -waO`T"ޖIq=:΀W?3֝lKwa[E%&\!R( 4roAy*LJJ3$hlg[g;it i6dp@f_ߖzؖwQ/wPo3P/6_[g^--  Sx9si-Z » [x[Xz㞵h+qB\G PXF I"əׅ:q]O۔ [t2 |k4޿m'ބ-ou.+cr R6Pͬ42*B)șBu-)Nx[E_20Mn 4z 7ȥѶNf)sR_`厖 s[ڀkɭ^tV̚=-O̶.3ߢD|+ub2FFI V!&D{/"F`$Պ$KU .e.#]E IJ6i`$PO1&$V6Fl1"jrEC&'e|mu6*BLŒ/tDl>Pɷl^9T`on3kͅ"'$7ON`b@9΃ኰN[#N1C(msN t;}$bower[2<.9#.3*6T A$6yᚠe `S֢2/l)N1l@1P &2Xฦ:6Nz5z`{d((yhHpwwP*P]&KH1_q Pn2OzaH$keI{3}T+Yu<NX8e~Esx؇ٳ{"g%,BPD)H')" W<Wc~^sOרn}e ޏOMܣԂ{ls2 I?yzc\2_ᯟ_"n4~m;=u}t,Iš1ߧُv{fW'ӽO|޳O]L~[Ψ<>HYMe<.޽yjZ<2Oy8YTɗ]U;.^Y |BX,|_k_+ļrůS}Ǐ'oX%G+p<*U7lsQGK4ly`1P?~^SOUDz!Y+J4q͆9C ^ {(%ca硱5DW?G`4Ekr_cnht)#-r٠-RZoA/OX_7}MRݦrWn+m7 %.'õs;Uػy0Wܯ?>; xz q=.sO]X7^4pޓű^3fڡ0GO y,N|_eRfMO-rPBGP وAVmc |!|$JGeyy{o}/Oe=s%>н?SGp!He<9Ą 1B# }LICZI41zkbMsMu&ɲ M2a΄9dukxcPdp< 7.>ͫ{?pi*JOnw# qLJ$D.gۍZKmwRSIh%,&2 $$EY< ζ{-}~?tt}@+Iхtn.}ۗ_-Xt?I0ݷKe }^~~Ӯgg~jT/?;zXgo&i~aOwq|蔜͞ ^OjiMJzyI{\e,irϦ]-F)bC5à BMYdc-$aڊ:#9}@[Ѳ5V띉xj'nI^2${kJ}1oZn0;LHǤBieu3 (ZSB(S28@.HIhCsUIqM(H7-~!6-JQ%ޏfC.Soݧ-y-ТswMϊgQ Bj#s,#b$Q-l6Y'[-u:Q-]YgyDl~2*%*;PY-ID$H]$ Ke(bY@nY1M/U `VQR(@$+36 Gގϵ4m+gBfh&go {e-2ൊt6e: ReAڪ B3A 9pztB`u! œZapB9Ui7,o9Zo(G/`{ϝƵ;p3AXqp3* mݱ$4IIgR\Taz1{W/{jϴ.L#1v{4iqSOGpG@!3L\OH%4]ۓ5)˓t\"d@!c)s̺\c=&dgme-7+ȹx7;Xj)FsP?[\~hfjׯO/N5Ԍ0^E'U4|ǏA3mmVN񜖐4yTO(qfWK4lT?wg޷'\gOs1nn%휃Xmi}zwGw{FocFlHJ/r0b0Ityf@hꨒ0.nޏ9^/8Q[=jF]x~Jڛ2&x7S_yPW]+7⦆x? 
GΉ?}8w/?=yg4ٻ{w[Zu~G3p%$pv&x2CsYkhZ:ЦS&|q#/m !ޭIY(~7b|r8h%6|jU<,Q#D6eN8EKSeUx[Ro):hh$ה$nq;dۘ@Hxb rCRI6s ~́EIpU^Jϯt8bJHb%u&bm ,(:sAR/q:q*T',$+Q[Q%sQҽ9\; +4nݑ'10/4څv]lt]j^'r޹F)oQZ˒cԂJնmi'OV4Fqɣ AYX[^BF dRQM(K8w 1y!Y%7 0e9;An BeTS#ztHUwe yE{̩>ՉH>w$~r@'l7tq<BG?:^]6 (ZrQ& {0Ȃ#$D!}6H^qYIDQ{QX,=ǥT 9z@ <I]IR{SdYPT1dMVE9"K5\w5rCZ і;>%8>L{@Ak//G A! 3xKB6'5"iҘԉlD9̵Q m[1-f ye1&E`\^F,$nթ `ΘX6RõfĥL UdԀtNHdw$rsc+\sORy@wrHlz/Pr21"|=rmAD`3QH0}(RN%ԐMRY `7$0҇Kdo1y $$VH)nZR K"gO@ռlۯWHm5Q_WV^QS KP%$tZr%,n|,y&֌A]q6ǘC`N{Z\̒젤:t $>E* 7sϤjkj׌J8c[]+B}ԅ{Յ]d{wra)΢&'o]ccH0 IѨh'OG{lt_vx1ͬЏwﺱɬhi(Mi,Bd1Y(H LƔ~ϭ95v✉y0ZwlMemZ`WDŽyTXiRd%s',.Y"FAfg6cp+ rEV4dIЈDi8R ɨN׎ W#g>lRBe2(Cшc[h+kD{ԈG㽂 !)Ud'$L-JgB T2p+C$KZ(+ EE4^#~>OiY|ոd[*EwԋGLԢƠe֎m*22,3aQ*A "2puԋЋSfwl>T]NeFn࿩66w?GEPW}]fv%+uOMf8ydĦilN:+diU:qξiLq0[t:k 6Z} 6*`"qM"Q-;hD4!zYҖz"I-H AY)i:sh) ^;9glC=LLB -,2Z:b+\* :+V4o\RR9sZ+:[(DO)Ķ'Drƒմx\|B}\ifȈںd{,ENXBՙ"s=)LOa}]psR\jo}5%1?;'@RӊJǦhLg,!Ĥ4$je ;$u=GdNިuWv}3|!wRYqNg~܀RkI6{֨yɲRZ X u8[ME&;)RN\j6Ddt.r ";D8xѵQϵ]ioH+?u `:NC:0j,Kj-vAER-ٲER$HN[ujN[)w* 5ze@(0.#&pϴzAFz%U -Q1䄥 ^*m4U aU8R+NzUk3};;+nı)Dn PME$$Mgyβs cx2Q ,s#(cpR 6pd!(8 B 'sB.L{ ܶ6O!^Sٶ ϐni&hbNhQs˃nɽ\ULCKUΑ'9G`(v%| ]BBbBʭ% P 1HZx 2ҒtyɐÑ HPbRv'.ѐB|)pl,z aʽ3ø)ChNPƺeCMTS'k+se7UNӸUkB&#=J3NX#]*q3}]$0$OWOjTGxH$ H>@8OƞAZ-;]W=+${8BQo7`w=ݧzx63WG,o^mdF\zlf% ~#eQ,=%00A_J̙o3kpEEӤ H A8VIoYbyW$R ƽ*嫕+WSAYyVu}oIEMf>f4t%wore`/WJ=3heo<YEWU\%JJ+px Ts+`h*U2W]+Vbp$gWTWy;W`w\<'p-d%ms+WW` WGeUmd \=CJ3wrwړzg x箒sAWD"n lw*U_ lWJռ|p%5g;5w ~Ȁ*vf Hl;\%+h•YwڝT՞ Mi($]Uvd zpm$4>uAډ/ĝs-c1c*Ц__˘a(=jט" 9|]Aïj5ä̫PͨzKMLY̙C:<Am| /Qza得ͅe$)Ua捌p+ A̲^8'ą^jj\ h_jȋq+{9YvZk$',Ci >}|"FeRmER]s *2Z\SQ.T5͓'sjr۩QRц0ԈYl2pu'\fܔΣݬ[6WV=FPCp Vbg*ծXKvJV3+D|*U2W]dzJV*3+LD!`Jr+pz*Y8csA=qvRH~UwY Y$urX:1ӸwY^g!cR]#<\hs ,HM^u3Y۠f<5]g?ן[UO\%rуo{"64׏eL g^38Sg$h=eR+Tz(Uޅ$xS,q)XZ"' {= 3.x:wXoW VoPf׺ƌ.4Qvdn1U͡}^ٰ r2R/4wz,_=Cg$*RzƌƁZЧcĜE Nuz!JUo$un1nWIkqq`)g' ] C!eWC!^U[Ue|v{ںV%x3k xt}A$ *[/$K!EVI l&w^ 3||-w3i+ +{H$(w'(uSJ9Xz4? 
0NCsKFD·o& oN^yF~1_c ɭ6ZaN*sʀT'EQqZ kޥ K`NydcV1E4^JJ_nbij(aHbs/` Mjq{${l!9En)M^A;B;,* 1LZ&JZ`38e|yk{.0*7q'Ņh&VGgλ;m<#~l:PH|ZuKc(չvHT3jYSIl9zFN<2\g280)Hd}zy"L02:CST $HMmkˆ^Wxd!R&R`1т+d01um欹+ŠMݏ_OGmoB{&n蟶H+A1iQ">6s,RRN輎 "<"\5`x`#eLdyC%QBk:2̙13F4^ = yW[˻ڮA׬x gTC`VD&7ߦz"8Wx; ` N9U`MPj}~6tuZ:C"Cu'RTg׶ SȚR :p-D>0a|S|8np?eqae!(o;j٬'glRq(U Сь;VKu~Ǿ,Rmw~ٝr2U`C:;u\'B|>/`Y)3̎C7}>3*>c?3>w:3o>Q32-ZB-2 NI|.fMs:*! 5!<ܝ=laBr] [d/-6.7(~|,R9F:Iunmfߜ@kX3)]c9CS4&|ڝVj<(|^RϊoWͫVmUGi4Jq%[ji_Pm-/YP8Z/UO\Aoԃʚgg_-޼-t8R~|;U}y{vO_5yveW.3{$Ѱ5W^?7*YtIiX>d FU>$ ݹV`NH]eΚtz 3Ӎy?e1>l[s=&~.ph@@)JnrGue:OyczܡU3|K`ZP߿/tU?=+BHbX"I=6ϨaCWaû)Ndx mWMc)e;y`*ws>,*s.wd-\Lp94}^u>vMNv7 TyOU#ouꎉ^ f-;173ѣרT*i\I7Y@cs8OVui&\986AR Vx\"ved-۾^fg#L \?J070Y洀h 5(_+ȃyp"yH[>HׁG=7;"]ZG)C3$ҜI# Z$D6yp~,ݘug=`=F5کϙq'0 T;3t7usq1^@wr&r-iZ HĚh\fFZyFcipZ( w)%u R)9Xzm"Fp4w2şDn .%nCR,+Yys3TځOV, <sq5r,ppj4UP APX )aREqZ-NcF1N b(3*?{ƍ俊=Às6AgeY$=h_[-C-#mEV*Jp#3):ɬxPң#'Kp1rvKF;d2OGtk|}="vCuQŮ-g,eԔ,#|\h 7CYu9W J~VV'4(7IW|hP"rReAy(-G0(&j#1Q aC 4`CIĀ ICB KtQ8ct|C7p[*D,N@]Bȝ^iTa8Z= ʭ*TZ?M?U;#q" }5'ZJFQx2F#ђZ.mRe;scFT$ Q&wIo!:($q ^kT ;K[PixuUqqq}{*y<<q"yA w!C "D"FQ5 8V8Ѐ: h_Mn8I@l[]Ԡ,<⦓ >|6vʰrzG69=Sd D+aU&)WY!u!^M>ܣITl~.{3?w4)YQ OfB>I[/Fh^C!I'\ySPi2nUX QmfHP":̐dڒ1lr92_"u&JT2>Qglu^]0̸gY*4riq]\~2Mu-L܋]_dɎlݕsrȭ53uյf'"!em{ Sj.kVSε;}4ڑܹr"kx}|uO<x|?rs;OX}9JՁilVI?٨C~\=miެI>sv>q:dY?,mo< #iNI-n´|eO.+ 5P m+XBm`>X?{N` ZmRSag) xCs"Y*HSh qCr9͵CKr湋2FN|sg<27`I(#[=m6=Қ\w)A4 b ]~~'ϗ:=;^˽7Ր{e0$h2QQ|ce*BͧѨ{º' "AD/ḿ*'=g<0+,G^VF{=? nsښGH"3 $*%tϭsF dx˼"X/=zA*,Hp"`8j hqցOKQ3Q!1LXi(&~~9GiwT ?tQh\ Lw8'}&7uȘ4&[Z~_MYf_flϋǮJsI(M㷏3Do"mwMbǡce,7gԠ12QB9agVSgn6W~t{6:#[-?" ,P@rEk."#*[\!\,5_d.y&Rn: Po׮V$U•}n/WZBW'Pg]8e;G x]=42^T9B{݃^n+o _-^gR!6eܮYmb^{__jqGqJ5$3 yiX4p9]fyxR ÝGBsSDtYۇ욵Ys?βc=2'>L!J >gt0iGSi]bjԪS/|ymG;[CXqnⷯ r6!>ȇ-]a]Y] _AQ̯CT!Uʲ5UpjbC =1GI1T}vk!F4BH"DSFA()u,C$hdo'}jCZz'^1N3K{s4ϹhiD֖$,pLB576 pNsY/OI)8 e+p~xkCOwN>`ysXՇZM~(NvYղ:@X6}\/7~Uo݃f+? 
^4;t*E (M8dQ[;g1uB&QD &tɔ.U> 4 #P,QF$uڡb #17+;jp 8ae21`( BsG.NNq*s4GGqc'& M"F#ZYC) U]z2NB rjӃ͐Ogsg:G}]⎹ZUM~7`7 "fTV]B"8~P.D)|ё /Nxɫw- G8j DGB, hAˡ&Rr)& 7I@MnݍF D <x(+ATN)r(J;xyDܹݬ Uj4O]ʋK,QlSmLS.3U79H^;:P_5^JUn4i<(9<.'xq^&(4B$9"N- ^aZ U>2EƝF++;$y֡k˵R1DR%.F bƝmZd4S/S FO$I".hg d4Wr%*(cP019%I_Ęq Dqˀl!n*2DT a c0gmd6 ԇ)ZF8j7L[+I1ÌN%bp B;oeOIru_nDsd L$f!ȅ32<: eREH/c'$c'!߫oc hkԻ7݁K C~KO1BP&=!{uk`/y/wfXC&~?~xW6'|CcD a-nyl|xgqyq-m~YGpv2[#9ĻDךEMLneSH. nȺ6}o9b1ïͬuf=v_4( 7/twR2#\~G6ZV ~ V/$lmV ywS lmV4͞mf8!`{,߰NnV]݋Z%7G2W2z3FsiU]I Ψ+&W#bjն+ t+!u3ꪒ`WUmWWJ4zJk%O봫YZx4ƕ-ེ;i2ɭU;&iAjXUb2b$AcEb+?v?nobWr~&X]Oo~~O.&~6z z_QHPeK:EAVe4H#ztH)gU&4NO'f[^5^ 9~8=99x S{$s=ѿ{nש¼Àw{?~( oUL4'y+0:fL Xd0BղA@\@{HJ1%FCx`v a}kSTK}hXr?{f{-Lbnz3 UJՑ?6'@{1ۿzۉ\W 7wo{\#`wɇяoW** gjmJ-;=N묋^75mP1rMA4zEc 2E k}ZLT5{-/eg5^+P6@FIx098E IƒTq)>@]]Sߢn w&BdJƠCd=f6B|$dCL,MbZ{c.dNA:vBX]l[r\2 )g[ȹlk.,mn' pR #VH3Z*UBQD,y}1IAnfLY] B7gZJ7R2tz|뎹c=v8ܟM їO-x@\cC?@9hBWq|hVɫgBA $#i)"`T&AX8F,Ԗ UdKP2,#1ImQiCGb28E&KȮ5rnGv3Ey`y9ʅ, vfu]v+v ?u|uLk.nN6 $_mC6 3|Eu㣑 hJf.躀rkyz@+e eNZq"xa<"HT'csYY!W *Lx EQE|`GFy/IQ(g焑_gaׄLJ;[#v9osz2m iSL!xxJs@Deaa3G뽰gS}o/*&I/\Rr.6|) !f H)b(GB) kH M~~i*_= g'9:#l"t1IWBsT)\C"fN9,!2£Uն&oHN9a%**NKYs% fOJ4)Ģdldls>x~;A[x3sn\hUt^2hx-#]|kw;,?bpֆ7]r4vb! 
!- THkEA{(C&k(oH%VeE#km mۭ,p{v+ $!Г5!g B1b2QΥԼ:ntJXCiU.2Ez Uf0:ZȅPo9:9wbH?ŋW>܋wqP]WuZ^Z8  s_m۝4`ԸƈNˍ\⿕F2v']7dhl~:uLy)ɀWݍ0.$Ѕn |g] +8Vec1Ki6P]?\ ֻS jWWjHrEW`ܣY<0ϭ?ffg"@I`|sHa &) JI&"4*&PLg3vv>]!Y4T :a^f=eRɁC]3 q@\\&=A??_UXzY:--wX;tCi횄&eڣOelxp 䠛Uh S:[(l'2TJ.m( *1KB'EP䑍aXSM;}M^nQM>ئbn- b<2/3E5 X{0́F e L6ye%*"hEfkcj{&!(D,}RiR"(s.#.mcȹ#n43A5|F^_QMF[*0[_^ 4J/jdf2Ĵd&H#Œ|`So3ؼwW{]c @ֿ5cfD VXdm4HOlK*i+TFaM-?>ɢM%)1ěcӳu7'towdyMGO٩0f6ClmlcW@XCmiROwQ4BwS P IQRD]PP^kr&#2Xqi5gEJcF:<9[-Zu_qmf!f]Ċ'b Rd BjKZ&YU#`X3d*]ug4s<8FABEXNhGeJP$ )gYTJu'㝨\yZ, 0|.N]Q;/ B _?YIb  H$2 SIKJmO귲ADX +h8T0tQEm[vx;|a} %Ԇ<-D3 Ϟ$ʖt^1xuD#QݬG ݌;/+_5j˭Y`5 կl䛈[< hO66 +1B3i/hYX~ݶkmJ\`x֛1{oݬw|t:i(Ӥi,5>n6~5o&W~s<~x1F+ SQ`,; Ұ˩Y0کwԗiW;s^!uڟZpR;޿Oo?7?ԓ╒5#ɫOZg8dPj԰ߥJHo?}%x0Q⛟2mgޕu4o{o|$(SC^JhFwuGЛd5+/^PZLU>5>NCo7A٧l*|>~MJϟޅ7hr/STMVX*k4?^l(4էϧeT9ЩOΚ7&?Ě-,lc`nF/%]`ޗw)󼅅k^=*(I]5j򻿍« 0ϟ6^d:,dn-Px`M/=;6<k?OG7)`Z]zwIJ `Fc8WY>BYz}>HWh43X47WP#3gLLӛܹIf vzge3J6 t8hH 44{7)44\mwfpQY{t̞ޏMvW'')N¸Nm֬v{1c-*Z mE;i`2݇,tp|Kp~yѯ"-({Z,9Nzۘ|/5#`vz]+ӻO8^" LҺTo?%d9BCYء$DiV ںْm~B"e(2ků> <1kM6aj^<(yH<Ԕ Y}o `YQQ:˴|q9*a>+dA1RpN9 0}]2⇭C!|Tǚf)|`+99MBvZf;qD;;7%m|ՑH۽5u߫AS ŖaJÿcb Ǘ[v.n; 7ֶŚ6lo ƽjFoe@modozr]ۼu$]eI7wth&w/](^|CyY# ,/)J[GL *7RFGJnn3R<#Վ,5:6]dC'>>*غ§qOhp/PYQO"=% t"/¤T?u 7~Z\bF2C}t<t` ORM[7ţ/r,s)9tŒJc1F1sɭ!QL:=Vk٣&Q0LqPuF8<5c Fᐧ" "(lDx锻Zr'O7$ٲFEأT]KV*hun}lm?^u,dmp-m=qGL -F>En0/awKnfKќdsk,QV8d>كX"X,|lH>K!CEX$A& 9 LD"}HNBޤBX!9Z[}".h!g}+9se% ޛp5x6Hx,g"ג&Er`hNFۧ>q' ٤(848A QbzwJI0HaDJ8^hW(u?,ӊ ٸV)VвL'u,XuGϒKO8bB``)6jX h3@S`!c8y-8%&PfTetۨQ1 Dx`S[:YWY}<'E=/j׷cALQ w{$KL]fdW,f/%a ^+= %Վ;|NYr |eiK~5_?`I!lot.c8o_0h.àT 9l$9'@R/ .}VKDJaq%=;*xQ.`Jevh]uYk[A߫IMA` e3.㰤 58{S*~^6fUu7h:h)Fe1|3EE|:J㻙uePA0l̮әbrW]g5g7_zɁ=^6CQ+7B/)ˉu+5υ,g\(sl^p4}/@qvJ ;-8+l8aL{ ;9Q2H9tDn$;etD!e!x0k5f,`ZFL&hKVHKDsgSu'QzX­XmX╍Iq".B%s3Nzt'Lףan I)aV66ARcT41z*`IFFȼ8eMΈ*-{â!P2zE)PG rP`)Q5 ,)R"5e ;^;Ʈ0cSZ;|O kh%f2Mq#hUXj4ׅ/yWO~U!7cWL[+D@DE`p#b"%A&Wl RGAU42$ZK(Id  !K !#a:0- 1їrg=ŶÁ_1wjwRm1{`6a(B–*x 2Z!X4( +t8Rfe0r!c2ph@k|@NS4 ^#ȁPڄ3pRo$¸+3ؔ"ʎ)S=E31dm`c!wJ!ꥱ"j[j0H!aSdߙqIBQ!Ɓ3uFb3[XA9 
H\J^1E Lna,ŢKΰdS:jOtࠃF<*jD "%U"tKlOAwvC1=ԛ-5aPVU\VۥEenoDJclo]s>ʭ@$g2Mr-R(ۛ9|w̑1s5# bb$TH51.QH#y T,ihofpw}&J`L% Q*PbKs%$aK>87j :P~X$[eӒ)[xzp$_t6b*׉M=J-h:9&3]2gڙ\Ear̈́A:PUk Wߖ|e3cE8i鲹D(9xJͭhZג;F鷐\=O4JCTyz3@;L,ɸ@Qt$Pn&݈d| *^{jWħ+7,u; s?rʺefE0ުu2ûDr (\`Ftw;A'Ib) $Sp5Ra^j[bwMﭖ畻m״Pn|H~5x_p6>Ա~ Nju >D]!Ilֽ_Q߇Mx\kڞF&En5m4kFlBtyΔ9 *gxr- J-hޒ_{d2{˝0ޣ>&tP Kv8~]گؚ޶yL%A; 7~ȅwXP, 457 7O Gek1I铏?vk{L,c2"k"QK$E룚+&9NxQ <^MIrtq ھNd?Ybh"0X!B:aO{Ǽ iρ¡-:V8"@m^SUݹXvY5#Th]YL,38P,5ElI,\T!BJyb8 (W$y0!S&aX7weZfk件 { 3jC)" Qص齛 m}*i޲,K-):&h"QqDž , )` ڔyJ("3#9EkËE sn#IT F,x zgQiƝVc ;"APv0.p_KĽPLPo֥z4+a! (N{J'i F(L8ޣq=p{l^, 8gfZh#o6F1@CVk8(R Fx#mb ,`MVyi"Zɭ6FBt|F,/]mI8imlhg?h4 wBsQe'8^W*"Y6$V\I6X1PϱX>Ji3#_Y{v[Д:2{|} <*@90[>Eua )eSDGB J1u(•_ W &NPRETa5B9&| `Xf秡g&CF$$,>o?MFK)s5A)%ϡW-i4BΆ6YqBPT9LeKckgU^RM*yUN.'m`0 \{]\VsK|n$jR|risP[OꞮں!mDyk7r чQ6D>{WIbtw zmkJ_4")CGRإЯ%M>ٰҼߢc*FKpz8s%tzW/x'积^'Ipp0{0~ӮbZt9zӯr->la@s@e?O1? B\7Gl ӕgm~U(&T?](!6c:3gn`/Vm$wmHLݮF\w7Wڭ܇u\)JZ59yk'=jq߁飌$H΢T[j׎Z08yF!,ҁ9>8`"҉#P7 uw4Kw5:9,UT0GLMR$.ǒ1I+'eLR*&e|IS$k.qWIz*IIe W/B߽q[oVS16ԫQ5ŬhvMzB~7L|6OOMHRqXt[O~KlSg8Ұ2'sg0oL̖ N({KUo_XJUbҥV3Xȥ83E3:? ql/ofxC)?+~uQoK % ϑ8%Xݽ{%䘌$ z,,#I顳$l2$#$dGWI\ŏ@Z~ 'I9{d#+HJv pvj

JR|,p*I)q W/Jw@`UW JҪgW -zpE'n> 83%Rwnue#s>\o7oy XD[2leigGjZM;-vFOqX-̢S)U1Y7EY xON6Tz(Uޅ$x C_Qdm!O3^ߝܯ5~zp :fhF?1CtX0TXe\h'տ~Xm@/пm0{|՜=|V&x %grڠ}AǴ1^D[}TN 63" māQNœ0t#O /"A"$?8[GE&:ngI&۱Kr5F#ϝUl6}ac b(oU2Z3H@yDQ8ͅ,RRNr:DH|?x -0<0`鑎2&(A1$=(C#0cD+!Y,Ǟ J7=apR4XZ+xGI4 G2C6q6krRR+ջYҞM" ;=1Xiܣ\"LBH0̼  , CR3C0H8DZZ<ҷгzZސDP}D1e0q)!!ʸp$1$%E\ȓqݤ$ZZgόG?4R I dy ?DQ0nxK $pɑ62 jS%cvS8$2NHQnˮc 8I$RO) da^(-o^%| ށ<:-}6gUɻaw7NRs=`a\}SI~B'? ;{ɟ{xn_T߻T 3c$V%f9FS~J;p;k'?}Ù}`?pŽh&%澿B]I}UwL});|Bδݵ;}jHD{SVN/[}n>M$߾_~vl&'^y#uni} 0/^'τ[S5}}\W.R@gϽoIYL}mKu 8tP]6N-Tq | ǏC}F%VSNËϝ:dx5ϫfO#GTv9N.M $*8ʝ7J!<ڧ4Ll.RFXp4s[_y7*Qel+w}21KS|)zZwC'Wޟꗽ-a`IE[UK5v5 \)!|'8ӗk4.3]/aS8UXFwov(3|Kq-̣J ̏&f^ce3/Iu3%B̄m!g -on3!rF%ZU`hi)1)h[1̪NAQ䲖}4$UF+LIBSD'Ge^* Z:0VBkcm,a[u>Ndx w>q _=|?a%I5q=d7/ ԺGK1bBN1p4f=[FX\2N,RRJ*&v4ܲ:A:(֙$:weS0ψcHr cړHfaon0Js)g|8@2+!H eZ30{-#cWs+%"3&ΆQWBлW^8 b_Ʊ#Iq"ZSNq飆jW12#XPSr q|1siƮX3c!opan׉/Y/V;ۓ:˫ݎܸ;Zut/h#V'9E`y1rKhGI#IF !{E%$ؤB:,XȒh/#vHHLBL{#8{~<ΥȌڢEx!gF#0 cF;VD") @Ppx%Gʬ̋ iU3bVA ?#ȁQ1082'n Ǯ(3#lEĕ 1 k S Q/Ym!@!aSd]sIBVDiCHqbA pz "r:4aK!-2#b6q6p -h:iɮ2jq79Ag9\a%%yU!u "%@KŬ}a6uf<Ի lM:w5G] DJf},-ӤR+,Ōk^zhnNws`:,P!eThƸD #S5RPetja Gn0;>L"JTj6)(<dJbXr\W!5LuE"L:lͨs=n2A%Z9 ]hY׳'H\邔%Δ* Sj& ilk]()cl*>QT4M;#i (@F# s>qU$.~NJ}tR| f\Od+QOu%Vd83?Ha(w0 7x XU{hP=v\ǧ઱yr=[洼,mR\\dmXu> Nȕ鏓t$f}RĀ?.&muIRTKgLIY ھvU>p|gJևa)@7rMk*'Rvi`jN?)|k>l-"n{YuvVژ$3o3MMxhv/D(LɒCJ.׬2 d]y[:'Hb> cR5ˡ5 _ogb__/aR/1",DDꥦ0+Cʘ7Yw]] th'Wmti 9om"KJ|%KJ=ZYK0XED?{WFJ/< ڳ6vhwcv !OkTbDVDJ4YVAȬ/"<$O1\s)"M.:4ǫ5Ɂs9MLL=0hj7af|eu] )15)Q eJW1m\`}+3ٞhYok6v<ƴr"eNX9V$CDRxEOz¸&2e\iJO4˙6F \(\8ͽpV[8(Cd`noQ=Mtlclh3̵ihrؾRLI a)^:f'x[omuϯy-\\3F6 4x.JB=AAJe`,Rkdh 8yDp+x/=ExlQ>x'.g*GP ь*)oTDD%,6/e U+B=XʇPѰn{M_с9IDf~yqT+cw\[FriBd$P#SHpXMg q|u6!xSy?/ŕtpe_>D''Ǔ˫K%ΚI|79{BΧ6\^qKPP>4tQ۪Wm\ԟYW-yΞ!}KIr) "B f\c\ZZ*`o($52J1ȅрښ"uI{S xPҽ#Ý%1r8;\7 f;X>ThMQd]oWt8€ħ)nHD+<^srGHPMI〖_2HYZ1]FJtv׋/wD'5$F0RCcxѹ$܀IX@ :OW9X}qP(I!Fh! 6FM8%\ =T Ox5F啉x4Idr 8dżqhqqA; .`!H$pI1I^Al!@.TI7hƢf(md +d> DH,(rɳ+\͹6ڴ+-ۢI$(`Jyji)x."Wȃ4*Qd xFxQ,*5Nk|VG!{K!5\"Ӊ6蠵KZ6, an$&Z  pVYTB[!$j1K5. 
yNXpJNXBXZcǰhT+9ﭵ4g 75L=Y8AU18oau9OnѠYD8f,"\9C-E=RniUT{T*YB<`X#yN e)9ؐJF)x&pV2[ W:(z:ut?сͶCZ{BZZq#A.gW};&(xۗKD(\K(E \20dr:Mͧ>h[Q 85% Q&H`]0t?(#̛>1^^Q'>;Иa.waSɠcӝt4g$U "J)m00q'Kn}i[ḻҬͧUCglIl L녇rmA~,nLkY O%wWuX ‹ Sh8?ߡ% gG 45[TlLfcer: Uw%e*9twc _:"D`E@Y]}JW]}I]}KM+ԕu:uHH]!JQW\IE]ejjB*5zQBQW\E]!趫L%3zS H]eᨫL9uJztdSW/P] FRW`eå*SHUwIB%9W{& BURw%+Ɍy]U&؈QWBPU񶫫L%睺zJq |WH0\FɡL-՞4zJ/Oōsq>/daǥ]"CK`3pbKm3diCCO KdknhSt]*COb ++a)td¾ӕܷGG.te)-I]R-m{u(u 3p/7NW2#] ]7a pi1tpb殀6wR8wut%%ѕw< n\R %!ҕ~GW ; Z ])Z{RBF4d3wpb *Z{R>*:C-/@Y7&ޛ;^[ 1#Sm1KΡFc-p'A}v`aAʎN tFTw8!AN]C'ng1L.g]_( қ(,i[Ή:f)theWm*tġO&Z+/`'ξ4]j7fwCD]ځґ;͝ЕR;])tutecDW,CW ǥzc=C+PNJ:])\BW6}+E811-pZ\KRn͠䣺:D4ծ9,RJц721 _~.u+ER vCHWHWp2aAtr̠M93wNvR|:Hҭd1I'^unU'Oj-$'ƜYfi6iz;$iIS2nZc咊iXsҋoNpg^vuhejG~L#]=g艼 `rˡ+˲R;])tute)Y^RICW )RG:@rY +v&,ЕuqJQJ8[ +ݻbJJ\ ])ڴf(!ҕ8$R-nZZO{;N"]yꗿ掀tfp]ܕ{?w(6uG>t\ /p\ܕ}uW;])JG:Db. dr Њ+G:wCRe{w zOv~q-mM[_.7w^~}? ~ovZ~6P~3u 2C~?\.Fwo Ƽwoи|Fl %Y=iCe5z.ۏn7/^kEJo?kɺ/iKwQ{r%MkN⧑Z۷WUlrTԐ (薉<!.~|~~zۛ|7Ye~ۑhQps0%eq<_$&)Y[ ɖ\bw]H:gr3mV\J9;l#s}a(Nu6cBР>LC@1R l0B!*ڜb'rdld?'{j%Ra\iE3Z hG$Ňb$Ztmt)95۷/!bi ڊu> 1D&pC: %%=pN1#%$BK79ͭ?!м 1ZOfh!'+Վ W4jЂ#Mv5M4Z@@T1T݃6dtD:DmcR cOi#ԎFVmFM#dmR ߚ؇Gn8|Gz݀F$$~"M61\Vqh騞:DyKR8Uh'!T{ΛHR0 "3fjm3\fxJj uXJqN>dws=!v28Ǟ昑((E?E滌) RlH2'XKՓ@i%#BR8'ݷnNr֗ Z2'"YbuȦٚa|7!'뢧 -U2ܜ<&Ht=KE ў;:K+4P3YeP4BZ%76 `9r(-aGPQCQ 4Bs8?ԀO+(qa0P&PyU7J|Yɠ-]^-#[økD\KY@ ]QcП^v=V+Uo1 H@Q7f=ak7A5h[5( T{zk2@p: RL1 v[iθpTb0VBll`_a: \d#dJsYb$8(.-wC+PMlA%یb2 \n^EzjPB] !w|ƨ͐jPo]P?A6M[@C (!2iL~RTPcRlANE΂G nB:?ĥs`5t9ԚI!0΄#72\+(9oL. /իY {s6Y(J8E5n,\ B!=& 2ZA!Nd|A)Ih 1kY7!J%`;CBpJI(X(g! Pe 1P #oe&{µ8b!z_4yZgF$=Z/n{ .M3Ig''ͧ1*LUit"'$_ }XymgUxsۯW:Yx nUpUd=2?u#D1I|tx݃K<`KJ#JIWo#fBM]e êc,O(v5!Lų. 
!(d5FYbwV4 h^3}fHHٹuv< o 1t<X]f9PtmOU䝘6ne2 v"}MƏM_}wuv%R)\5->b=ѢeDEK6 xC*j]J;2k ) 1-_ki*!ⶦ3-ChW1u y@'^A7HXVvdV(@0be@RODጰJ":)يGcwa,EJ3-3jt2$C@zX8ga1&ǂpNf9hU"B`IgHgYtgWMMkdg gV ok*x詷f:x$$XQ%jw;$a3.0;/m.:* ֛hihɾL g@כU,|/VӴ7W}˹^Ij .]< duAgjOa#M5g{hwG(ljY:Z-ݚs9&fL(Y#h4vfc$3@9ѐYeF٦FQpݐ0𐗨=T9P.ϰ ڛ(&:YU.HwWQza2mMUZ@z|B27oS3,ƾ XA|G6XQW,'nrJm'w cwY'yca5‘s3֪- f$U#1dT,k~xk*mL11]ǀtA6v;WR35kՃ*HLJF]ϙ@-d)Z{'е+9tzGAц?joD@R8*CNk (B5Z3Fa2k!yP/FWfiq0Mz|dNYE"8 NR #tk!7h NäiAtYc6\WҘ" kЬۍWGR!.OS,(Z@Bׄ ;/]]~Fzֳ+{-F(Go{ ՠl[ r9>-WWuX +ӪA\%gaNuw8)j&X c71oa{k]8FRl؝\nvodӤO/oo?ny~̧8;[|қw7g^ ?t _g.+zdnn/n܇~}>>޿ɯbK77ǛжM>ۓm\dkK =yUY,17\UK)?{ƭd ˵vj$ʇɭuDu쇛R<$(R!)N*}!9HI)cvRqy6)Tլ5J V.6JBHK),*P J T@B%*P J T@B%*P J T@B%*P J T@B%*P J T@B%*P J T@26e\+%Ek@`K'\ۂ0A*,ʀJ T@B%*P J T@B%*P J T@B%*P J T@B%*P J T@B%*P J T@B%*,@&%qLG +J%P tJ CԘ@B%*P J T@B%*P J T@B%*P J T@B%*P J T@B%*P J T@B%*P tJ LR@`Z-QekV6 @wK*M3rsR5{g||znj>8c fWvt|7otH ]wއG'c"pFfAcXvKMs^PFu?7P F^Z]2ġʶG,5j弪ZYV3sm q+]T;s2]ȓ,೧ѤI&%Qa` ƃOt~3'S 0YPZP!0S&ZD.~vPҷ(l6=e^6[e"OlzzL$!2 cZ\Iw WkYˈ \g%Wt Wmz*Ep+AZW\WZ+JMԜU6XU6אU֪},D& +B`CԵ =`6U}l@vup%M h{U6&־t9\e+%!•TJJ"[[W\!W`&{ f+9A:@$M߮` fs33z4Jk.m\U6W]eWJpup4濁8Mh>=TL0"y!mq.>ýIj16n`T-'̲:4Z)s9j(Qy6gP[RcSd[n|n Ooe3əbEW=/IeszK[El~p  <7t1\9'B"PPa'ms?o605A]t~|N+RPMz*Љoţ:L290oA*O~m37= O윌F2:YTWxܯjHQ z S7fZ/f86O?rfm>lѡCFCՉL0 ͓B^4*</253E9.W(%'s,0YBk*7O ?~q#j/~]"|]>vH6M{7ll0w0-|Lkp/ig尹 Jop(R)pid!R( Í-F()'%%l8(E:Yx+Sm'ae-#ylvA[ fIxB &4)`bx4P"`:e铌SbLx"ڐ2JJ!SSm`*QfsEgx&q.؜;b>=n5XEٵwp#'r?Uz3חOy^N8X-g#uzn/c[NI ^ʅ!߲SBs)J͵@K ,Ñ HRNLH\a>4Y%g3 p%"Yr:϶܉fxƺz7#`E꾂Μ{GS~ܸn'7b;/ #*sL:tU Ug4B*w>C^2ٞV1K>-nu7%J ݢ`[?k[5tGJp1.2_lR99 6dXN0ZZ!9"WxYyitJw&*ɭ5Ӛּۙq_iǤW>L V e&,FxLw2-۶l-4ң-0oʀ os^l% )Ȉ)T*qż\Gı`5)' Po֕8㰕aR/S s'\p3pOG_|s#v|BY#&?Tz& sL!.uߥS.`JwץK^oj㣁]X4b|F0T 83A%~깷B -PFĵVR`BwֺЇ?b]6#]^LY3(ZAU !ދ`Q[8%d!hI$vLs<.,u&ߖ25xZRk񉕥Vm5s爀B%l1/v-Cթ"6WC\ү9į$y| T9!k 62W[ ^*PLRYZ $M%Eyd3\{ŝ:>ά52(@ IrmR,>ʢ&J:my)^BtU Jį9Gu/XOU Vp] .#gT,R)MeDM3"L>؄"}vcՁؤS& "8)O;ⴏ1)*4z#x7vOs}-o';;;Q},G\ٹ>O9;˅FA۔oA?cwzUy0}^U_.s6uӣ7x7nAMgɿ 
!fRPTR?oZz_kk]l[8mo@|SF՛gB.[,WՇƠʛ2nɔͪ_6qPՓ^er,ogkmُ_ۿˮ8֌[)!Ӄǖc'/:$?NysiF0'~?.QgٝOomߋ-O ZcY=Ԥ9,2>v& ӍR`\ԓ<:yВŭ;?vty!9h&%jtqo\p3,6tt\׈:K-Y_YVY%t\}&' 1޸sNJ Iըa6Us,Eru BuJ@bRGQ~SnR}W<) cZ]4߁?GHí^]ܽ6'}*_nR}MqmbF0] u,C>.y5/Wҩ\5, _$e_ zܽg=0&nY4jHQIst0ԴV3Pb-:YK;pB ~zU^O N(jLl3GyJWoc+½U'-nqgO:dLPƴSts:}h~vk#0 Y  )7Ah2 <=iԨ4j܊4jDL,!IJiy $Y<1`s8ͩgvrthjMDk7.߀2&XmCt>ϱ~!0.'a0r4Sd~ Ʉ\*؇ACm޽B~L3?.ݾp7,sdܓ1ht?C(w|8{TۚGlu^\`3kTpAa]nnX瓴T.J,<\]Z3W7띒8e߽pCl=[߉ o=NZ"Wň+_ԚCdƶ{MwuEfĥcvnojz \Ϯ~'-ܸU]S3:t;6u\5jr~O vt(y;B6 U veiM_d&D8?5,o0jT!1( *8+@\|n\ >NBxn3zdB%ȔA# }LI&d4)@.Œ3g=ݫJTge=05U #̭ 33n_~ma?u|;-a1S}h},qx,Cn>cx@G/wW>ƟӼ"V#}V%`hlg >eLdA=D!}6HJ|θ,rtҞ(l,=T,rxK5*$Zs!QIJ֡bȊ)"h s1,X&s:YJ}D[F^-}>kK}vo9 Z  A@Aԯ `wY8}啢 ZTbN4d#(3F :Գރ3YS*Լ095|63ڹW7wlvQ-f==٨I]}S1RõfĥL ,*ԤԀtNHTD40)g<7ֺ +89 pn<(w:z!!(\FI]-\+x!q)EݞlסbFu rYfipNH>u ?o xwT&Ԝ]UP(hA7&dR(tlCEʯP,UiSe+ZvP* wh$A:(y >e3aXf >A*KD`ӻ+c#-VDr! f9s> TLJ$pevN)I 5dS7\J1K2LV@Y[B/"-+$Q7A`- RYPt2R? d~jOVoTѧ75zbѺ$8$ UKJN']d7><[IkƠE8S!0=m.eIz*.:mSZY9gR%c5rKzX^Z,ԕe>{/iޝܰAfg^œ]ˏK 5$ $hT ѓ䬃d"[2]/aS˜̊({!)ؔFiV@t; 䈹ȗqnY/(ͩ{#kqǶRT vO'I%-&N:ʨve%Cp cəlY`ì18iȄ "-:J$kh4b>T}Y/[P/l+meh WqW3Pq͍):d>!iKߕ%ulĪXE1j%9d"hQ"9RNAíD#iBY-WQXF5rKďgH:\YK\tx+ $-J ^%]eFF$, J%(2A:^0 ֦L ( kan ː⌴KiK=-H AI9YkrZ@ cE`<0CCEڊC}Z)+_)9y1AΜ .N*FONR[W`5m;^fOW> 43{m]XʹLXĜp6{O?&)+.N*^W;kߍxI T"eұ)'$ti0vXc{9 Xk_u2F۲닍As˘9\ݘ'3xىJf@L$k=kTּ J̀iP+-_Poh %$;6,lIŎ~Ygf?lm$Q!i;`ۤ%"RW`$1S>UΫRxpV?7j\Yb67i::DeQY kTZBgBj9f DnbLi`N*~70_Z g/G笂SK {QV 'lF#O^+)V?k(+򏽀ѺZ|{ ݲ׻Gv/ZḴvj?JsSgWMd+ټ. kﳶg7l3wv^OnnRu:޻~?i.n2y-ߜls۵}][7lmwYjRmӚm.>۾C;! l3_~\m;^.l(nYpg?{?;ͨ6fPvz\'//m'ڣ?ǵQjY۔E^UQT*yQ5_ -"mzmTmrfHۀfH QXHJxZmmJ[ uOPZmʆPpuyMڳJZ+|)]M#JcUy`5/3'Jԕ͡ a/7Y_\]/5gݷ?? 
T-_]a] .Xg>m܁ٜĤzZIJD^+iф׌ KxMҨk ך(p銀htŸ`i ]WL 2jňtE(G+URĢ+E꤫ "RME|4+*O.v6j2ZSL aZCQ:a&+HFW &]1 ~&S4>E]9n-߭ιܵh=[|aoI?3*:<m9?dּ뇷~h+~i[jpД>Sj /o%vƸmZHJ4/ێ>MowۇDˍ/w>?>0%49sQa׺c?8hkDKk˴*);O7Sk; țaqio۪.̲yG휣M dGgBgO?^|-G+s*\lX T8G؛o?83p#zBƴ:GL44 =af1avhHAZBX$F+&]n \WL1t]1egAttPR`19JW>]1)M4j(U,bZ|(mh󏒮EWNHW F+EBSNXTDb`E+aBS*t5A]+XiFWkEWDǚ=R)ڧ+k21Md `2]1 т~"SjLFY9{(YrZgoל.^W^'{$Ơң'f{ %yWlgqu]R.W/1RV <ڴ4KzE@znFZy7/&WJ`)Y+;3uR>dzl(͑ "yQyȱj@7Yeޫ8JybWȾ(]1i]WL2jW4銀6qEWD B+4.jP0`wŸƢ+u*t]1IW E+^xtŸVj2ԕXԻ"`bZQZ)JcG+RD+U&]1O_WL-IW+tEƨhtŸ. Dkß:Md4NlyRs-J0T֢xԀ^tz 3gNhQw1Jt{&>㎽0ڱ6DiE N/,zs*Tk;5pa!anh=7teZR+""]V:]1.XtŴCS)Jicɾw+d3EY=ϖkj}Um^^oL5ƸyQF9}bK_[mKSyݷ:p'U3{^Q^՗]ws\?&,^nn*?A.>\nwn+mJx]e}lVjŏ$sn%gY<{ONFKnmNbdj-o%_|szSs؝zSݩ$7ȋ(KjLFuIzgk߸[wZmA4ɳT\$S=32/}Rh=qv g qɿ @wfokhZU\ խ.2J<^tuWr/!ͽc“mk G4 N+!:ޠPwWbx*I|}>D~R@ݣ;wy,r0t8A[2'?؍IyQkn!Kʏ[GhwTzƽ'oC^?O{w=C=kYm̴^%t8MAt{4qw;f@p;]Xzc ~ww:6Xlu{dKwxU^ԉ٫bg'TMRM?~x]7{=Uܧdm*Jϭ+ƈ r(Kfmea>C rM) 3agjg^{n#i'(j:ƍg51%RSd4>I]Yp16+&]9> );ӓ+Q-%︋A)P!1zεoHs9bfP!ZB?cPq:LBEs΀ØbE0&FaZ(L|Q <)7CONoKm~ň4M f\fM 'NӠ}ϢQgmѧA Å u#MDE>]IW-z F+G+{ ZdbЎKz])]$ "]"Miu+ mˤGѕ6Np `eu4"\-D,bZuŔ^&]MPW`UwunWB,b{WLt5A]5bj'`b4+u*t]i}G銀4qEWLkSZt5A]Y 0\bѕu dbJt5E]9mhh=p¬P"-@HߙQeΔ:*= Z;>hpb t&2&$i2OaĢ+=q=)u.Dt%gQSqgCWCu`G]BӺ1t5ҋablC^zc;G+uy(6t]1IWԕZFW2]1 ^WLMu w1`\/bтuŔw5I]|+~P"\#ӎr`ʠNΈcZ4CK;Rw4T61wU{[ҶЋuYUE MYզ\TFLUi$뭳rZkt֪iQr>uU5>@l *|}#.Oچ%rq~޾l6{,VoksQѧi2+vm.߾l[s_t_M]7U D}nTC.s4(]4'Ǘyv'o^أK d.;[|6^]wuq,E+GU0>yH^i 9ZҒFq1 DrvtwS=}f>S1cCߣ⎀YYew9>e, iW~L;87 .?nhb exdJ ]wQ| 8Z ] ?v(9]=E H/{CWkb*>2S~RܚO9qEM5qj̳/gg`b/Ҷ>zz㷓@AP%PkwԈߞ>xgut @#mJڦ|֮Q.?uPmw|x]vs`9蝍7Svtķ`M(/|2d=~ݧ>wpǃQzru$վ5avzčxǯ GGy"F}GL8 E >x<XGeI%(ҕ#(scfMNthV՘FTU*6Æ}ʹRVMݫR`tlƢ;~=uJ[WƍLԱz`vL*k5SmN1h)L 7'ZZj j 9Z(\ #j QS)9;D 6FKw y9$T,\j6Jk婚 5 %S)5zK}E,:d0vcF4Ccv..ol.HSZ䔔ջ7@%IG2wV'G@ۘQ\Gh~QE4P&sPaf4ޅK@cVM{'mv\ߨ>:D![z]F8H$7綽8K*6-Zڋd3<%cNBk?}0 Ԝ_ϛUE{r'gj*uϬs!+F[u΁z9K nu}6:9DkQkNҩYJE.)$qu$~`ۄE)%yNVCJ 6ZЗ2l-7WqzUIOi;k^,HU.C2+הOtOV)Y0f;DrNU.Y{V+ RAvT4JkCvYwiJ2P4@ZE;60H`9- Oѫ N7h NAkah:vζiPTJV*.:_@2yܘstp Օ);:+ Ņ%̆duc]=^џ^-FU-x 
$XIiޘuȆx0P6PB kGU((JU&ԓ)e , R [tVنp5TM`Xddb!]AА$2.qN +QIw4 ~+*өPm5$$IN;-V(!ȮhYc@+!7CA 2nP(SP|P ؤg""*NuꎈQEgJ7ukƃ A'b΂Gܬ ݄Bla< o+ ` 5đPI!0ΘUD5 q*9oL6v_ J 7PP;3+E4GRFi֞%AQ 2#}@PS(H\F*^Ibkh 1p f1r 94DR4ԠˤDdJ(k eq1!+aMh#ǻ =sA <[ \c/fĥ*NB1 UIkB9}C|cgu9{{m]tq:Z u`c{.0AACZI|txḰK</Xżcx¶VA;> Ddڰ>T>wy4؇\t詌 MKе 2"1wPڧy zȋY /ѡB-10=u $$hLEQcY3m>%TmIJ Xut ˈ ) _3X-DU3rB`X-;Lo3G`X!Xc~,AJ35VEJL T "Q8TDTp:jUàef@ 2q]HfhJ7a#U2'YkO^ (TcJ ]\1d@b~pQi8 sfYujłrri"&b9\Y 59I]p* ixf}.v{_8o[mJy{ rrE:;;ifjߤkim1_opl8 ms$B4pvVWL۳{.O8Y*|kusw|m;ܔ}ۮޥSqjs[ gRGoߴ?1]?\{>Dȟ4fq?qo4릩Μ㺛L-|s9RJƍ?E3W2vx=8h/7{6R|D?oylV3zie hznƾിL RQKbTƀK?:UPj-EџGez\i_! =0] KWwC|7ѕ]&d C\ ] 1.-t5P/tʠjAtbgh1 hɄNW@76 ]=&G +z9t5Zh18PzcZxȑ_~r%l031 >-jɎg!ɶ^Z<̼CypYEH(򙖜eN)bxB&Yw'y/|O 3MKQF)3ZdKTM0&wE}ޤ 촷0A3o3bbdב7M&hʐ1?gJf&"cx2eK/Z1wTڨ@>ÆuS6*&y-6 PKmwXmFa`M}1Y?NsWr<պ,>|qeTx=26KR<4c׮ @fzd2efRcg6(}σkH\f.j/r锧tE | .L̕i2'Ķ ShpH٢f 0\q7dS{A@ GSyECuqTN {c6"=q}uF\o_4i^g)) i.jjգdkX37̤~gN}XgޏW0#9_Tx3Nzsj{z3nOyL;tm2(X\vwuMFy&<:(+ܘ)4:흥s⋩?ˋ\03F Jm/63 _}]@2:X/yU^oċSsɏ)xE8⌫ 0E'8xJkG8M.Oe0p3fX?p,# 炂򹡠|6(!섊$z@CAL,a0f$ҜuTb(o 6qq{P1rCZ萠  .ʎg9ppz%[0g ?G//GBO!wk*zڱm0Hx,g"Ӓ E2?4#@## Z6LqB}2[MKѷx^EPhi!rFci(^ e;X0\J"%Ozm"Q} "Y 0ƥI|rG.:W}󙙌ej{nbQ7ޮ`!< 9GۨcQR*\ QL *w7*NcFWbkm 4wFm6*GT >!<)hag ng x4_ݫrL/2X, &iSe W|S=SdM̦|;. =݂'6 K%6EaT+ Y)1o޲,c?2Kt3T =a `?JͳtMyu.r[znzvxt7tZL=Nq _-z 5k/lWyڤV^QcIF¸82'NY'A0Z%=f=jB}kvR>R N9ť$bdF/cL%մfl9Ҙ.l3Յa]ȏppJQxbʗ;-jU&,nwp? 
0#D yi8<܈DIPBH8?\H☁ 3 t @<GQXG M{^F&Fb<leQ5Q#xtYXHRziHg>-5<0?EI0w,4um8XG ha$&=cu4(&LR#1sV݆5bc׈SmA/Y%\lKՋazWqࠃF<*jDX@  C>܃ &Y\W~CX#z_6k,(ʨApǗ~sU齟7Mdr>ʬ@$c2M2-R(;9F##-͇9j0 GH2*j4c\HF)LI)2X:k5*8 0mlq=@SpJ-ئ8E\I,K_r1ś!Įs1UUR4dJ&u1)@Yiy0:E^I>:G^CW'zGt28&e\2cڙLEa2̈́A:P"kB{{tD~,of jX)ä(OALHEr[MHY5 '͸LC4@Vk.raI΀t ɨ&SSQp$_!YFOM?3RT @^"IT544rf0KZtvV$>nY,`dʼntKYriݩڰ,+I D]64ɬY]uwoŶ< ΠtT:+KtZfˠ۱GJfPo^}Tʫw>&D~-!)]r{{lHJ|쵲2a)(0GM,jnDh:vF&x&y|Si2[61YVڣ B& MqpYy)w:"C5Y&up2DA-7`NaooE^=-mͦlSKL Ebb"gư1,b`Nb2PxA[SFrEs&y$ KFe)A@Y@8z|10զZ&:Mt x>ey7כ ,nb>,;>]2-j4XtUntD # )\omJ("3#9Ekdbh|A溓<%yli/a L$TtL*0uZ ZgQiƝVc  B'  X00u 5 ݣM?vt,XC Z8)h)_'0V҂>?ϵ4m# 3 8"F#<7yK10XdXÙזGD8 RÂM '@‘898E([m08t>#f .$o~,mNkOC݉\pN:]_$I?':[(&ik|?ָS$ иogU[e^n7'v|=&s9GL|cs.ǖ9H'|6vCChYK-m-5C5#Kg DBG* w:-oխ֝\겾@fC'ze$5́}~\z >'5(=XT -+-_/>Cy~x4qu_Wêqp0,!#3 ?FށGM͛ƴi&M&{] &&[7][ y}]JsWH.v=fl@WR}'@6yozS(!URzP! 0v@~&WsMxS|IE޽ |4$Ptv".L߿DF.A8_>tσ[GyDsi䢠L $VZ68Ъ=rq[Yĺ9+޼bAmҒ1-ЪPeO9ay[tmmr,tC}}گOw*+?|)* UmdAO܃.̨ٱʓpcy0&u8tkmH 4R?! 8$]qAָ8ˮSҙ&R߯dR8Alr^몲ZOݟz9n4:t!!+2a* #lҬ Q`)89qCw*4 FXPKiəȭfsل[wisù5-ZdQкM& !IK2*c%?)ʑl136 荡gkL`Ayc<JØRdγDh1>p5&ζ ұک,!h1@jA iGcĠ3R:n"H&JE).ea(iwh•ډsv;!}`x Hie) 0F_o[ ^M]GWTi[D>Cܶ<GВ0NY\8'3'72^P_ MEUQU9_-N@:ws Եչ"VXVȜEdJHcF9G( \3#DDlg|,Y&6yPl elmP[){xNJrbEΔDX0VɬAHc9;|ؘ8|~ĽmAݝӵ' rӸ51tɤo} TȢ jW=?*tNຜ;֤[^Iͅ|PY{,WkJIk(E=Q i/De?IT hiv’hAg7cYM (lP|N9tpnǰf6ttxuOxZ.nػU?d^V#@P=X#2k [s-ȓ-yl)dM*$ĩuj2ĕ I(54Zj8NXgiW:R}m,S2!ԜPH~/ow6&ݸsrؿm[ (JnYkh &ςV\Z9 4`ȍ^9.AYT SAM"_I ica!8T0&'!jilsuA7bd*$P.^x#`LÇȌ0:;Hez v<֫2_lK4#"fã0*#90"6&*d@&tӱXˉ߽kls}b`Z-2ƴ&1 J&cbr 2!cR΅Wp_mY+/_ 1'eë2NVk9w?,G!94J%H$#A"x-#G "'X4\C*:ey,/`J騵H$A:B))hXn%:в#hg'x;w1>h}1贮,|"[:uO+p(ACT@TW~n:\*$*,VV#T0Y;y nTi:A/҅!˚XOlJ;H'H+UUe+S{../ID-Ȯû^#'Fo;T9,Wg tJrʣt.};.+[֤P 9Jp'm,, F;ط#8#'P~@yq,竕mqLf[vJ1wnA+t)me2fQZ:/fGɺe˓uwy"#<_ ,Y &d"U K$Q*X$gًՊeU,K_"肴.E rA=ƘZi&s1qƈd LbᏏ7W%z@:6}ȸCz^{|ts?s?^I "Dp+#T*#e `skō=G]y6h\fZbg@$|,9RNE F*&f Kbnx}'K5?-))gѿ/ri<^^ͮб~Лߟ~?aHF&#|2>#BtF%FMSYf $f\Owa'zw3 {0 \Gf,@tRs5v}p鐌8s#Yh0(.rc}g[ds}ҥcBe҇gqآv?nVZ'6<^h,F3w5REԦL(i22L[@R}@] 
ptu;ӳtY !~'e)I pƅx!Y YqvcNg1'{26 @ّD)0 Щ IБRViOuZl&CCw (jڳw HBȆRF(p}ѷոzf<9Ĥ E%ހKF$@G Mى50eyѭW=o$|ݶ66GLX/!'x|wIN'̝ћjӷN 4`ztF[}˿?}T&3F?fBݯӐ|M}]ӕ=8+B/ӣ_`y5E/S`A*Jp0vSKz}z[HWZi*+~oxw Ύ>ԯ=z[wӴKܮzӜSiEd8M?o5]o:zqhh4I.E+zRf iWϙ'QҖf={~wU3DB~zhB8ˎ7r-;K^[^Q[Žuܣ67B>55O~SvW+ ć6+T 9+M7;";qs2dɪa=`^KfTBO24>DuO$?rn a:=fwu%'C]/R}Me*KÌi;߱ lovItl0㙫Mgrtq][~w}E _ fz74P}{`5~[?L|iƓGC3#;.,9E&V8༠bObT3K撺Ĉ>X̹dJau2dAց\dVbYP9sY`xv,1z 6$D<A$S2i.0jk5]ldwg1ze|VGZKލ%ov%٧rwGnVy|pձzRczO;$*A(r}(0^HT+raMbH`%"VGz*Ңj*R* 4WR1U1W$ePUVȶ"\@s:j#pU]i5o*Rޙh`5dKu8N2!+ JW6+Rtq p@\ܾRS߻"y}ԠD3 Hn`wjT˸"fC|/WЁZ˦ZaVk]bWxƵje`m Fe˅5{}]MtZ̠B+Q]HωIbn/ߴ0+ :L/?ڽ@za3}{TSKEDzl&լC'^3kgfCLLs^p20 v HW\pjPTvh\Qp`E KcZW6@B tq dAE vO0Hr`Z:He0WĕBW(X3ڻ"}7BpEr+T>(%Q\V]DppqO1X9?e5V5wE|^IaAjK}E*nhy=jkZE3Ϥ9Qpjg1Jqᴇ%#L\&]:Iej({`vz{",X Y`88'2NeH,Ff\=w%U8[FB \\WqE*}qNxAqqr=\Z 8ϸ: FB^ I.\Zu\J2+jB}Llf0uU.@IAD:˦P^AX>w9;1f@K?KA61Y A:DX/fwMLĸ}PS& '6#75gmJkrl6,0QeTCr4(RTZqjp:N;A7͋;WqrC;QjR'W"p2; PQ Hf+Re"Ad\ WJh%#\`W(WW8^e\ Wԑ#xF -}J`ZSgIpQpeDP t,\\&$}RS֙qu\Y Npڻ"wEr+TMWRI' *6BV$ Ԏʸ: <e#\`$ .BF&fTڐq5D\)O`Ou:i W'a6a5hԠE3'J*v;Q;ҁe\f2H[ίޟp_%پkCOGq*ub/}|s^*ia+d+w8V+RZdQp(ԁ Hw\pjm>jR*Hɻ"޲ƻ" R4q5@\i#8FB^86"Ԧ  WYIpA6"M0HjHWtrjGWG5ΰ Q H.."}5hSi2+gςH0D]ZS y}j2AX/ ;eW&+RlqEG0+NBJٻ"::P9"\U" B6 5'2J2+c4-cJ30 2plaNVXv7R@kBjgvwq>pzl l^Z }RrPj )M'B8iz/F =Hߠ(RmO{8*CZRtWϙz)PpMv#\6q*WH-H+++R 2u\J!JIG-W(XaP"=ЏT u\ 3+ `c+6 M?Ԡ Wĕke+C\ܣ߽pjK+Rs08D\YIFBQ0Hr.Ҫ) V #\9]\`@jHW`pnSn)7:2,#`/'r\.0ZͪHr]N'\yG7!; 3z5}hyN6]bpwi5`ơHC%|Ɲhk/bAa8|`il%f+ $M*ͮիqdǩ'B)S@^Rί3 #pY~.oWɟx6۸_}9軿^_z[!uw촋Q<;ʇ񷜛WE$;zE4t{?nGy?o/Z0:l%쑠Fj5oK@]3$zr]*A73JƊXQc-2z]~[f|(hnNkՋ=rK3i;&V՛NFRf]kx3^H^'wkx7Ai;öyߎ#OZ01-14­3~9N9*Nm_YQ*L,OAFxk2{kϝz p]z0\pEj-+T;438#LpWlpr\pEjOWe\ WhЊP8qR!j f+KZ/S2{WCĕB*N"Hg]Z|Rٻ",ʻBF6"RpM?$g\ W@mU.oFxvsݙ/ SkYio{x/\D19PzvY=G5))h~s{9KO<ML{4&ɧXIX%[j^Vן5Ie}^?뮿_u־ӄ6M^ Nc U=^j^^ϫR?_s28̣y>oo~!_V }Y(hiJ GAӣ]jtM. 
Ѳ8_W޸Ј4Z8wwJ>l.."wJy3~™iquX,Wא%ś1TͿ#!ΚۿCK]\_,ϻ|s~ϓ6ScN 0-1XxoUySKKz,QrSgŋ>FOe)>},Do8[6)q-]kTM'l*æRxlZe3wy=6@¾5 6}ɠќ~ |\7y%CԠ;egeB/xSȓbRw'f1ةMՇ4Zym܄K /h5L_tD "m|t~Qq?ͻ'#;hӖz7ZtґWڭ(ފÂ҂az#/i,T4ؒ;DTg N Z zq%Q ǴG~|.o?|ɤ~3Xn%.>OW1ӋkUhD:ۖAIY ?A@]wSJ*k"'&}1P5¢?c6n]he< E b]WrJE9)*@r%2F!vo]l]Lkif t}|[tVkixLKH}Wl֏ 7jFݻlRIT:4i6O\^M)^Ay=y;L#͠/8y[K1aԜVQ^YK\[]uIH~p2#<Sv9CZg(PSdμ9ee"ԋ-)]]?ؽ3]ok}6=*1hkj;3]uEtܣ!G?]kqs9 Ctvo^Kف~ZHc+!ڲn%i6JǦM NhK\ĸ>2/_]/or~I)O?^4Szg:>>Ji]jT߿?۔do?ur{:uwid[&`ei Lt\-jjꉭg7] $k̛r3vZPVAqԭ5_ϭuऩkc4Im)su5Y5SSar6R}n-F7{If7?jH/k+1ڎ3"_u'"JH5p`ZVɴKk`V݌Vͦlzsߴ{mFMέSYڌYeB;7dYfzkΣ=i#eդԳmʩ,RZ"\\pM+dQ`C+1q4ju2M o'ͦƍx?fa0~Cv跋dyNQi6=)b>X5fA_fu?]}:ewhk3fSaF3.șl]~W'Qu4=KpLuf`Z Ncje͋׋p:`1S 89:q&j:C>:Ogqf0PUgX0MڱhJƷAI/QђMѱd=.criLt6qgf?:6,HcewmHifZ$8`fv@9vE eJ:9#X~UdU<3aaLqAbV\-_?Ezxs8TG %@Gl7lWGL%O|VY+JPKϲƪ i{fQU``3LWyp}-e-Ww݇:Eo=̒/w7twxH8' l(aM1pƋ/&BPT.NWF޽fbs{ò+YBv.Yy=:YAR@THs.$ND2\kp LEVV%2g&_q,. 0֝R^+_V еwЊ#Ta ypHtG XƤe 1LbEUr@5I"y~ u`pd1{uQLdUhhޗȖUġD/PF)y_nDINmOSΧX̕qv!4pޓtJSi2g& wiB&т^B:$h^|~fȠoT=~y6;Պ 4Un*zU$-/NE)8;]%HvI愊Tbw)rE1N%}v$I@9ŮvJ9n'9G͘J:_Q2$"YդYDRhl*߿T 'vgl%Q@Bjwi^mӳJ U_Pte9=KޱYb:1g.J A9Y !@yyPX",dF'NF-vC̳ u g% VcUz«{Pkj 1ae`)nyA(h 2B6a@h@1k=vʱk01g FJ%4;=/OdZ)NrgsFe떳0Jׯq1oO)@L;KEK՘3D]Gx^W'Fo00ڟc/h~̝Gw.ǫ,7R`Ziֿ9̳4 c5a:\vXj'o_ӳEED^A/1LA!/<,7`V>aJqA"z>صճ+M6#袬< {tCĬƃ`V,&긇~= pn{xJkLv FFN?5'Jok ¼x$L+m#ϭ6B1̝cFY6ȃ; T9|QsIy~Ηqۯ/ ((.د]*9d ˹xŗbv-&=Y}L8v80ܧߑg%U*E53`r  1bݯfh Y^1Xg1ALIՠ$V7c"Jq d *d=haKxꛭi#h`v&3Δֺ)SoU4[L _V}9==by)gcy0@@gOCjZ/CBnVhí2>,e>G'rv d[q]X'G. !ۋ/9<!6 z>22лB?%TS~ U}@H0JڢCav.jՈ珷y¡T i6SNk!:#wMD;1mG5b}=PZNo:½6) H\\$#s9%cX"8QqLy ޞlqڂSAT֍p @]$ WLf(Ob@gW|5Qϱ,ΦAC JT7yrv/$ un4jGC'Si>^:Óbm~GZN#6k!t)Ow2Yq3wzq]#v׼û hn٩kTm#E اGU`M9rO>ik#PH]!ۢty1T$9BĜ[tҕ Lv=DeP<-PM$?ךrʏZeˮw;'"~q:-|@KUIBcsR͞rbBmcC},#h΋Uh'd[˻! 
[binary data: gzip-compressed contents of var/home/core/zuul-output/logs/kubelet.log.gz — not human-readable; decompress with `gunzip` to view the kubelet log]
9t;#:OZi#8FѰc^"k^Y7^޾}!A^ە^H!],T~Gmluۭ)btxkVwc:uxׅ3Nn\㫎K|m[$w;o=Gd"ץ>LmLm'LfO?>H;b `4W\ɐgڶmH;qm ٱ< 7f}o9'R\*@3Y3Tcz2cz2ӿ"z'{Wܶr9#x޹>z$̙6V,K$;qϗ.&:4!5neZ,0>PF݁o+^W5SO>J$)B= b8~+B.)hɒ-/Jr}nd}> .Կpe7t\U}$sA# %F[K J*S@eMz[^491LV21-; S+ kNسʿpewu|8a88v#±Hb&M=sc2f]q9Xu*"ֿĹP`eIn}[4-EcBW/M 3pU/=\-s*k#tx%װ=w͖-y ֺsgf0QG7ϤƲP #T)a9e+):>*fRcc*;*$"pB(.k9ʖ~R@'\jtDK#i]$xT2Dp@6zz{fm`A$!QnZF̾=-p`s)I*Bfz-ui~ &t hs46Fj_an}-ӄ cZ![Ti~cRQ)X4Z4Q`bV+ܩ/`b*0QFe.|[zk@r?LjU9p:XÊq'!tB#râ) a+c$9w=2;=s;mdU_AM:3u_\L藒]c"gZ`fa?:zy3̠Y¦_qBGǎW+re+ Yk2ĕ}e@fU̕WelG˥D Wk5)vѥ!C. ,JL%EKאAp!OaATeaLƎ>w TɹW^3s4TM/ӱU5=I)7 `E5Hٱ!5|/+I^ ( d\bbQ@(G2ndܻ?2f]fԛ 樷Rw(XsQgV:33FZ(D(]QFd-&D8zy)<dt:3+t5_Ozq.>PV3gMwƧ#=J֎kBVBh ٖM!&J(C?|:E$R2XGmM14BZ^i ic440 HVß޸q^ | 5!Ph۱0φ h! N8n3b|[)D i0ӏbƤ&QKlJ'!bM0PƊǚJS,f!Hsj&􍤓34B2^c3>5upF5|Zte6>Sq _k+%56ø?NH"NB ɏL Ry AqNv|H&biB<ۈYe"%8 ]괌7q]y9^фn))'OuY2c rOk/NU:%X Ttq<;r(T8 Û3\ZTV*?ϭgPwY,^]~_ f[7|#&3^D sfQ 5MsI!\j:MUȠk Zћ5۾ O 4>BdX2wCپ][YFa#%ΣGhQPΆ[S)T~ 9 td&%޵uBTq*IT[XQN~; %&K rH`/B# b̨WXbT9€/ZPI/14dһ."glT_n&)}I~FqE2/zszcowg*#R,<ˍEEEEYV~3&h~2c7^")z ^YH(DE!- nQ6-O +PL3Ed=|8mrtBoN2}MW8EvaZ"y=WLdL=gBo5zg}B۳$j/:8VzSBXY?gĹ]uwb}C+: ZҰg&% k~$U &9#3ZE~L<+lC8e)! JYRo󶶔[sD_FDDjJKQgƏژq'DȞ*59Vk_CڶI%];!g旝SJ5 I,d"#JHDy%^YN'2DŽN-\oQ~9qL@ԏջ̒ 4W#x%F.2޶L̆\!*%7lk$:Q9DTÎq񜶬'7{~+" [7!ȉ9Uב")V(H>].+!R`ХtEw\^7.ŷ̆ 'ybb oFF6omkcd䍬<`qIἌTptZI!,a40%JtnSBhZT>1M Z!Ak;$hmgf[3Xp<8vIX5 I(J M#P[Gт /M.LD~ٖB$ޱZJq-ͫ˴L9?g" DʂcCXgęsQ&%%I"eТ" @zc8|^OӷZEQכCF{G bŠ)Bj^ƏP7*?<{xU/vfx9v{Ý-4˰Lpuͭݝw{ȼ{y5?zn?l?ޓ?z[(hAw}o}f0ݟbtxng4V;ó{άmtv7Oix^I($Bח7E2P!B1k:m̊IO\&d5eJd0}{|3|ό[?Oпlvm;<h] sRTc_{ |_巴>JAH8N uwzqt|\0o¨?_iyPnv ],1 ky=Sy|&o>f ;+!|3#"jrtn'W㰁Vz|rsoO'l;sfl^j8\=2 V9zkٹfN|EWj"y93tm'? 7% mjXÇSOYMI~mk#{ņ_VI*|fc^YPX:7N2SQcTȉ!XKʅy i*D)f$߭ 1*:Eܻyx+]n)DtN{ǤL{2(=#a+V !(A)1*<*hDFBH\`Ub, cײT<9FpPW߈~OO? KXD^$@1]. 8 P#N:( P(e~A"уE3=dVB! o6)\HoSCiBh ByRSF0$qH. 
P9f)E(s`09&9(=-%W *e!RIBZ*@UAZ,D27:"`!P2G<\Q\hXF@/`B!/%P!r\" SZ`n*fB +@# sTB d( % ;4R31 @TQ&4bRNREA%*"`HJ|@JJ9W5UT#"HL0 5` ; z5b) mBF˿ITM UMwD8N$2G&E2ʵXI Np2e /pVB OQpiւ-=@J 2 #B |+w[>"#Wri]xb!-x~fڦ%LWSJ7" E>!c<1|M@Q\̩F j8C)T} A#cN`jb{hLj'YmC2;Eƞ!]9ADAW BVel]S06BPDm2 'd ƥu CB<&2I7 Nt@+Ȗc4]hb$$7ΤIpH M51& S7~ R9B8 u𗠮uL y* tU #/]S7~U l=PĢ*#frqtYe GYWD$_F^-H~~3ʗtQwJ Q^0  Kch)jc~.Q˗ixZ~Ifsb݄}*e!X}Ntty9y gRGexNW\d2ێ23ˠNJJHS łJNJ oMe`o.c>w L_,0[z@YX}`| ^>쳒teA-h+Quj*amMw̏K"DKK!"j'"&Lax\v/+?r!$4b^bBiL/G $?Y0>9}y&5Dw`tYRRS_>at{&}r띥yМٞY,U2`;qyaMTDf%H&BKA@!јĎsE9Q1BkH%gEUC$/K^C %#J@^X\@јfʎ/ك(w0Ғs˚kĢXG5LPaʗִ1my0|Ozr٫Xa@HH=cX%Kku%Wį lB)͗d5P9Bʥ0P94RX/1⓿ۏzU6Yz$`tx7ש9ZpЌB@쟜 ѹ+-5C#TBR.N\{(AEҶUS D|ʕAW52)֠8qas(5(#\ӺJJ^ TaUS; |.RRK1Ẉ^q7v,>;8vm\2:xZFŕ~F+Sbi%q+*,)ㆧH!4pWjMW!lr^>2HRxPnf$'!'nnN'TEYӽCze?mo}8ho !O]s~ {.>59%c׽.[_qig 6v/۷wv\&s ύ;"gӻˑHc@7o_m<3>Eo4'j(xf3W9ƨ싻S>f=SyvxWu{6Wܞb#C>ع>v'˝dZ8pٹJ`ĹhǞq~6#D9.vb{RjxR;jnݕώ WRǓU6Ѻۃ|ontm_A!x)0|>iAOz/|iK͟m]^N'?CU븁Vߙkov{nv:<;C[k$Y(p^h% {l&;cȒ~!?eLkre⁏>,wqUoݡ0x#üw}e a]ȕU͘gXF vϸ{\R" IDѵ @[X \b:lu%kß~S%U`aVcУTHX JB[1N$R ߦɦq5ոEwqqwy5T~2s]&U)ɵ'gVA /OD-m\;Hpz#՚+_*8RĴ0Ik"_`ć$]p\Ij eN8}V~Bo_-yyK n! 7-@Am!Z ߶Fugd}!Npӏy1T܉UHAcbgg\QfFݭSMN I"sB k2戻[3<|X7WA3[P͔.Dɘ#vB\_Q$XV^`r;_í ;&XqbBYÌQm(I2ljv+N]k>Ͽ~럺럺?29)\'aَ&D3(br` 4O(8hPwhuW{\*l=߽V>R+Q;K]ؾnZH+n3:9G{%Y IN ༪Y0Uh&)S_ZO3T{&n&)mINZ$9yrEz/R}5XFs(pR " UdAX\u?t_N[[<݂!{q݀lݨ}Xըfʇɮ(,qab)(0'_㪘oиk&jdikx!ab!{C-/q㋸Eq d I.P$G>xu sT2bRbJM,8gJƇ؛|jy~GjCYb7ylY/4vZ ֡8:F{c^Ľ1/ޘqőޘEpL0 8Wkg8I ?ci M- Im uTmuTm|q \m:F4 f",\D"B:Re+) bzlgH (X&0Xp" jCj 'B˙׃=GnzLiꅠ:lu8E .UN!<aX*u`bVQ2-VݎYrFm` ;y9!@'jjH׹jd43뛐It޻- o m|sl ;>`NnĊ1M  -c;ߎ;O522Mimd:*҅ ]<<'Qo$$FJ nl]^ŦGZ{)͋(~a8?wNjlq7R^(uȻ޸W~J~Uu""^d:XJhsjŹC&P- OWvCxwvg SJ֭ ~sEH|`|=((8⟴0ϧ N]ŝ#8OsW?~.\߶tb ]W_e6eo_=sW#Zz! 
~nj6tO1ӏS绋vη IxWH:MLdN*wrq`RkCYcq\U>%&?\/F!tCkWUr('m^/=.G4¼WR5Tx+Eops>Oz;@&-U[W$װ-ʾy⦝am1:9C9łKͶ]T7oIײN_4i׿^Co݈$>ﯨ{o~鳇}_b~7qgIߙ}xZFkuFXyOvT9v1A/ü?pL)0g;-+gH7 7o;,)ow`)X"xTQx0ah]>jwlP1?_Imm8Xb e\)~p$WFw訲,<8yB28JLc=T ᶇ-k֕lظHͦ&kb/IŮiWf L;f%{z,QrIhr~xPC{f}M%kL5_C׶b䦀DJt.<'ܞ/VWUg}wRK-X5V9?lE^!U۸Ej̩k#YD G:#@ĝܩǠ8FzcuؙhHP=pŠwOޙ8éih^6v4(NV\mEq a* P 5>md< /(Ŷn?&Ayܱ1z[I0MR-/ˡ{z-y3Z4TC2 |"D])A7 >VTuQ¦l;X[rEA[^,UT)Psq|%D#p+q 4,IVJi`TV) XkSc~Y3W LcLU6 ?P2(zh^Ӄf ewڄ/a3~IfeI4{ oM\ ”{7LahZe;^\+wۊ0(FQF ]Ƀ`^qYPQ^o*-Q 4xI}>^_e9|B_n; 4 y_-4wLjvE_ǤY-)q ,xf]2p|`'w<*ЀG v~h{T:K %Okе3S ӂx oA[l %L(569ת4(*Tl$k 4 lM/m{.˵Ir@xd ["l%%ގdI&yYRg\x HHH(‹)6=7!AvFdIKv xy\"dV,#t{`\81.?Q1\a.5eGq)d*.eMѳc ,\gݱA^7e/'x,dLʟPۀCT2;#"v*fB{E%3Tz<а)8٨WR[JxQҢ~ ~'ZŀGA)%EG%=?w%Г7h' V|XdwpNN,5cB <5erю6ZGGw<KH|rdsǘnje7XFΎ! gAF1a 9VH62bR'qZ)by%T:$c,4zӇf RBB c̈MlRۑ1Ip߈,$Zŏ}j@{ C PtũAzpA5Cp !8%m(7C)ؒIU-MJ fBF-nBREWvRg) g2u I27չ/:S\s}FwL0\@StN.;2>iչd]2$秚s\N4-9)PaB-WFwO^.ñ1+aA" W%:jqp\a*`-]X, bAMT6{bK6ՠ{Y*Rnzɠ}ci f|g:E^.7 0y8#M"u&&PrpDYUԓY4yCx8upUC/H j"zHQm^?fcHN`dPOH?G䰀nä c*6QPsll'5fnFͶaݡk1wXV.}ra]7AlrJ!!2|KuqUӤb@9y])fUٱv%Ho:ڲ."( `P-IVa6_E_{E\Oenek oJJ<ԕe CL,+e*JE!j%N0 ;]CS?r`m~ htd Pko(? 
\o厸A 7px 8r,Nr(Zr2O2^Tm,o?gryY]5~&)ڲ jwlPV`2^/1Oßr#6( %$3 R6z1SI)ҌLiǬG5Y/gH:JM`nAAdgAcFk8n#4;c "`/!m+6چm)2Ahsh%`i، 2O sɎ1#RC8fG[&gXV\6rKtQx?L7?3 d Lc.h lYcBy11pK_{n\7o!l-:\tU*Js%*m IN.l}씒6Co鉓|/9"$d!W]PsjB~ZD_[)(CobMbY3GwChhH#=\q TO㌳XblϘ>`LcZ!xWXfW4/{,R& wiyƥҝU_`!oy9pZZy\w [Dhn{e@% ^6c4^ HdPz#$ljN2F!gW[A\D)4FwCC7&xB[s!3 XȮ1{7le9IQ!!)ιF+6-~2&/O3.b1Hl7`m8[^qۜUo mYFb)-Qڔdp4\yF`ʰ`;w߬ͺ~5 6 >G_A.QuJD^""v r?ppnX}*C>(=9/GroDmX+k 5 Re8no@Qvی }杴 WgbΌ̸sEo& Kq0X +h]f=ކ?_JuFScL`1p&C]v`7Z@_gYmz a y`* ~ୋzeY0מm4}5!KX*o%dTKG{ Gsx?S9~`_қnq9>On9.\[#o,'p}~痈%[+ى7Q{JҾ}7ȃAR |KuZJ J$d![dAd2ȁp8hp׫?//Vz+xG,w./b/ξQw0ŷY>y21>EeMhuVtTPr}ĞW/4M  N*,ɰJUvj?"9AMQW=NT^O?ˏMy=;Ǽ}5t)ޞXs¥莈M1Vr >>1̪E2× s6 ֭~L\扚M~wa݅w]~-`'q"MY0Ju1NAYT+ASHA~RAF_8=";>qez?X6-[7{< %~4?oe6s0siϞƌ&eLH)%Q)j"eLTY3'UPx;jxfT$V{(`gX4b8sZ[!T !R0YJwЋ !hm9+>|NJ|1B Ց0ƹL$^`dLf[E#)jC#7mϬ_ګ`ѐ%*\OB;^ }mrmOT8gPrHUƻ QDq޳(fv ᵬ žϊܒ(uE[NiHq_.&9 ﰑlH ;+). 9ATR!>D*fkp_JF ď'%DWz#" W[?>C̈e=(EibpJ)I!Mp9>}dN uRV7Q8EQӬKdD[*HY_jԖCsW][.](uU6 8O\p DA/2|[.DB.D|`&uw"Pf0ƎiēxM N1N,;PD=9iE"+NYM+fb]" ar=>~O [t` a8ESq|~w`@sl7IOg=Ia.a|ܮGa V\0mӺ,ûDhX}~QJuPNlH]"S}z<:7ٳ}JhQ JZ} 6]ϱdע9P{%>T_UWoET~odeA IK', vp"@(^^B{\1EE p6FK(e8Ì`835Z,V SBq؎- lE߀kIڎQ\Ԏr@aDJ~8HtLI W@\Gs4j3-8fJ\)$)07"@h_h+ '8UD6xh:4_#%ሀ3VQ^>,'GK^Y{$.!0Vmp-z"2"eM9d;0)x ꪹg>%5@vIvh]5ٽW}wkz.{ PCGs|N9{O 2 W^y#ypNΊ⢷!(mJvuť>Zpp?9wX!O%>\\R%]\U"`fb| q; -oԜH}B"KLUiO/"-礖D7葼3DEj=yF9Y(YsE=_RA0OYe,@5MʵLGA/@t-"ϓbeD:tl!ATp"7[ g$2'SJvUf ko|5+aCg2DԫZg(iP)|*F^Ic@ w!;vLd#@9۱Ob90Fw "9"l 8&J8PqKL:KmK##Vj1)ܪWeWEh; Rz(l)Ix-tc(>с HC)#0PA[a3h2YWj83!dpTY~Xt+aCGw"  ^rRVW1Io-[W2ue6'jg ޲b6M'}c٪k#ɠh<:_s6wW ՝ʧِjΐ„54%lÍ_[lJVT_Q=*z1{ֶ WQDolW]J۟lL PѦs9)хB"mpmǂKyAP^s+ާZ*Z^"tZqdSw-iD!Q]ocτH&k"N+6ivGתLs"Y ,@5tbbH|eċ X@@#%7l4*c3}dh{JȦbԽ2/2D3Kw?IMr=A[0&[,dB:xm:Vh,c)}`{`JIQ]ʦJlH BrQ5٣},ȵ6/2cBTkiߑyIRDHE9Ÿ/;^3VkØKQ RrL%ө\zl-q7^%׏WQB|o,ωHCٴCR+L?{WIb2{6_-jif͔ZR֥bṾd0>F0n~_5Kn'5T@_7 6 Ʉ CZ@UU.nO\wYcH[\ D;A-2'\(Dkڠ.9sg(s@[l|X+>W瀈Q[wЁ5fюՀ#k)¶%S(2͎ d$C@O_8 5".X %'artD(Zsh廣..Gd3e iUGS<1?D]aВQb鈇”-u^h̴zy`)yu!Q &3{PnX~V֍r#S~-W\VT5o?"~Wu[o|0Ua=h7qvڋvv 
VIHGu^h¼ߚ`;֭võY, J Oz^a0)h+c5':m!{l"y_7VEFEUgyV2Xr)+eeA,ʢ:WQ7tSZ[U;Rih+Iگi?<;Whk ^{9@-cby^U cyOlRkTBCgOݙ5 vW1w]t-anQ{lHB):)!w$j\(CJv&f& q?_MX5iAdyr58 xob_Sऔ1MŴf:ԃ0I8{k$)-NX$Ms X9dP?!C])gpZ4kR`pHc`!aLoq\019pe$)E?Ahgd&79nO)Oi)ڸiä=H(E78̵VCs,Y4!Fnv8 6 r7aA-ﳜ{8 |M cyJu$p?zDD >A[|RɵKTFU哥my ]Z b+@Kn(BWW:?݅XN72^ s9gӋzitș?fzѨQq'Ot!ic2N8ztVQw7+6HMڟ?~?|Q=UQP,owyr֊5%@k5aKQYBQvϗuSdYd?27dĺZ,gLY,eRLT(SP Ay)hGq4eEu{1NQ%́0`"Gb2YI RYsuU[%LŤۯehU)ɸTYYU-~KVѼҔ5$!ֲ--{LS_|r3`Z\Gw kL 2TKy,o#ypt/qug`(,w 5I̷W,L %4fe1H&2NPjbEYJiobSK3;I̞R13P5͓J _F5Oj8ۿ}M3G*O䚼M^+.7saf!xWi䜺}:y"c8e ҕc6W!nM1=H-zH:-tL vل\{fpVW3$ДW写,3,לQAZUT\K3-QSbǸ.9zoUIRD5{UE43V>=e¨l\d=&N@s7rQΫCΝՌvaX$}T !()D4P2j) HN!9h\9< G/~G?YOSdzWYg'"J-J0F/3 Enh̛Qd⚏HIl6F~>ru6pD,-횇r4Fr 8L*s^c H &gx-0BES:?saͻOsEУ,;ײ_L~nO`t"Y2\p=ރ ,%lRd`Bc{ 9O}y3Ru:wSq՚0p{ LpLיiMQuMH Vb=o9tJ *^ R&Y jdTPbOEކ-=rRU<~žzkn^VgHY֥s"ީ@ CﺝCRc[^zi{imν}|(z9jG{>:5Kj@yO1*.D,uR0D@U. hZpim|3:r5{mh׆FEu3Sk[דq9_;`BWtwL ~֘.Nͥ..YIC5 l2BkMLO|4 I;]QHXɲEh7mTHj%Tr -]hq(F5['&>E3y[rDj3yATc*­0~]ܮR4h/U߉*suA>Mp4C 9{H+] 67Eh!m^ܺNف5+pl> 1.tt'&\4u0g8dl"E+zvۛjYBM{zR&:2Y1VyೡcA`hdqzWicDURF7 R52;<T1O`ʌ?#ӁR^Z]2!b0+^D[Sh,Y؋\D7?~{ǻE "R:f/RĬy#``*$PeVM#cqYmT2vyK*@KLxQ'!v] Z yUHE1aE@ ԘW+$vEn!`/ze b+N a&T[j 'lqP*ίٷOVFҦxl"܋u Hًgcg4ONEg0*e'4yu7$d{sB[MLRu!?V_L* ''\;+wBі{zbSlz'("5FDTݜ *SGkp_ z'<-3AMV,=ɡ-(:wMv҈{S0|@8tJ[+3O({$7f*ۉG#1kZxD GSMh*!pɵ Ugz!O]ě߀t}\Z Gco"],ׯ}ΠaèZc1S#߮1)jel(oW^Zтo˿"rKÄΪu\^ԙ=S{{˛f$Dgƞ[glm_K* ߿_¨_2h_춫 ~!([ů]m׭qXH /)D/ݽ@i-kuۻ΋Y(@.(3u^WY{UoXY۾CNw4 J f숦'&d!K.|1Di}lEz5 PK43#fl(L4` ZujvCRI6r$Sdz:Xljo y Ln_$$sޯ.ʰNS//N&BQ8m$p!)FCz0}hzj -cʱYwC19Lm@TJׅ`rܷY4DE/EgJ^ޮ6mR*䋼73cIhSB l:Qh E<>/=\ 9V}#J͔%w?,̶aьa^b?Y' bwvEӈֻue4EvKVUE%tC'_i7g -a[atP~? ߼:]\PBDa;lNXcp܄[&3[m{4 [5/^n6_(cB`gRb7%[q ٙݸgnvB 4'|\9)H~2d\HcIDݸF8u$){VR=Iٽf]Pxxb.cH{daB^8EP3g^ic JKCBe =p$°Z7%NJq 4 mӷx+7tM~DKF:ҷ]HգۛJK CF$yDKe/SlۦHͰ'goK(z* tM 5A&\iFªf5"nрVJt|t}R+ a ¯bw k05t fɛ&].ID=s pC?S u8xꤐbn;iԂQ;)([Ozcnu2gBlNꕠJ\&P{bt 0lZpTbeUR*'!ֻVЦeUEUw2x سW/0A1j6VYy.] .7)Uӻ)/۽`ulx%'a]{oƲ*q{Zx}? 
NIhan$;u|;KJGBD]r3Y.u~x' hkrWj=M9Z_a69X?r-UAo8N؞/cogRγ)WAꭽu^mI垮_l=bUO/HvJENG;Ѵ uQnվi.Cl 1S嵍{j7vnyC(BmyzdҷM]kl/:_/٠8ݷd%J9Rnؾފ;l~ zP~Ծ|ܘ7Ӓz4nMm_'Rj=T_nїbofniَvkmO0{}#XbՍ)[NUlڀ1TvߛmB ~WGڛ݃E)fs,f#;1C ʉ§*1I0  G@ܺqPx,g0ɬ;;OD ùL/N s"U;(o3{QIA B!T3^L榢q<[&cy?͠ Z]zpp^S `{Kd! _}9"Xd0,yo/ O v7ćO'#`0tGoNߑ|dttE% e0=~ѲQd߬yTCp3"βB0!;;c"|//cdC}_:y[N&&p`8X\bwR߉/ " p\CJ%w}9'8V}R{(:K<7p"\nR?)?JCÂ+6K.9lp40bSu,wu|r8è='E݉]`XPp*nfE'N/Ve&ѫ`G``e:YthUyQpAz\Q@ņEw1;E}!k}l}3WedjO S lpU;+BXnCs_eRVk>kuʰR-/WYTuy/4]Wdޣw?t +2upIn"KHa#הm7c+{v6ؠ z袸z 轇l1P;gagn0x-e1ʛBCWj,5o l'5%u,B".]ND8 aZ8 p7+^h(]fdP]?3<꾇tZ.Ue_ u(m~ [(BCiDλшr*XZ$cy& DFdpƌ *4'+ʏ`& V9ӥP$lтL7\ 0ڥb;ugtHXC%{QLK2_^VD;.9Jޮ:_v>D:S(tp͛7 9PڜODK]wϜbT 9n]& fyA0砍xƾueuHN뼷 _zS`S)""Vu4NM`k1*gZ)=Q=Vk#~6 0Q^ǠVKJl%xL $]dP^[~vn9+IЗe9%ՔPp;֝ADJ/ E?VNay Ōܲy \`e9$lY\eI(cI4=P"FS ܬYIDR2Ð) QS2!t3F8zEKrwLTDܡF\,g 5a^ mK!~YΞMޏoߺt"Jї*Yjj)z3KkQx|ݣt0. xЃNyP$v(=SGߩUմ"EdޝjvZS +-j3 #"E6c~2s!*.MOo׽iUof'\+}&;]wngSJ*uhYFN.jklsp;h9Fs%/:rOv/(y4ޜp~[LPG)ac⌕=T"3ZHOC2l-%,:B)UKFN)0*!c% OK \b꧄5$,%!w &σ+ݒS5@jkώ_O l&ctQFl:_ͼ8Fl9M.`xV?mpV9|pO{߄=B`'^l/܇KC7by3łl桲52C(ݎC7sηQSQx .)!(٧!ysz 1 8b_K$- ΄Ӝl82q!)$?&`.5%.5%p}=`ǽN=R1fNPeC>@pS(qZcS'AnEu*HbmhJ}4%pP}%\3\T)"b>; t>Hc!@ AY0 +z>p4umhJ}4%>8rcy^QRVSBƗ؇z F p H=% ;ijO#Ry^,bFMzBH;Yd1&#+:/)FR L /8P)єP#hJ!R+΂:֡H5X_w;ݦ\E: 5cyj0{1k5vM8r둠|;iC&@7ne9XEjcCCAnMͳ22b5L//> BȂgN^z߶>q"~XV{Ӌr p //:(ᄑ}?C/|g?_^*W^^ =hE?8_~8~\>Gz')[b6G7=mxnO%xBX]5 Ϙ0J;%m05Cp@ڶ29Rq~Q\H ) QXo6G*fU3kJp_UruLAL{hqVT>csJZJ*xl-@J+%gβFQhx*3>⎹JMA!-|0hd3%ϊD-SWfBҜ6|~1 TFs1ڦGAR(&lH9HS%h$0v: k{R3 A0CVl3j"Po7IUKAUcPVrSIe.94Ƨ0~XJOfC@%a5HX8)@ghwL `Nua * j3Z;ErtCrlci`Z;#LaS=k7_ډF.4GT$e-i26)gRTKlq[1O6miC:SUY/kEkrBZd4uBլS!xFOSLwg&??jl" %4(Zjr_M\ɉ}XQ`m=s9_Xbi׀ (,d$㊭+ԞRV &MKlڜ}84RF"%? 
3 \z~4 KXˍܒ$-84HA?ĵ<{I =bpG*st-N\UrFBM'2MfDMKW"d ?r 9I/LK>!YGbؕV 2Lia %,0;&g`P=@IЦlP3o(O{?'L#o^T@  Yo-o$ޠaOQNɦPBmgFT pgp^X= %) n w!9wјuZL0?ЭaRn6(idwQXJ|J4v:{ Gq|ADPN?`wғ`V:3[k8a3N;=CҀhvklFOZmt#3Ag'OuC K!YK9 ٙ[iJP:V,<1N?ٱ\TԳ'"g% hY Z5AF 9Yi2ȩ7@ћ pk*Y srIKyp=+pg `=C ^- lP K`1o<*9lYǜQXl N~2<ApX b3[s0{3'Bǚ[A@վQBNk ByoRQ-ǃLRG:qΔ1ɶ@\lJHr q68r֭Юc sAa=ϲ0BU;$C"J9DRʏ+Iд Phyže?RJ \*:uZ2Sc#yƼgXS~ kűfxՓ_Lf "a4xF/aLƒ;DsF|Ut=!%Ð >2}Dh88>> $ZrmrZ<ji# 0ע}_8\@g[A< p@OB9|ȯ$(ءSvH$t;wQme¨ۡ"tT8L!7a|k 'd0m M:fکCh̴ƚKC'UÀ[+=loR#M]9[ oN9GP6PkkB(d Q)S 2s }< ܄%`7ƹV\OAMPIpeXrƣGRA&}! _x\pǣg=%+m B_B(Oqxq37%~j $` 8P7Gt mGLAqfU)_6csJYSVؼzO %5Rȼ#si-MHD\Tg Ճ(RCZy lS+4M$CΛS,q_#tr҅4#81]uДpD֡Id*GOhB&ks9,""ڏjLCfv?y"kͿλ٧5f8A6J?9UpHGؾ@? 吆 uJ?~H's#wt:QFA M-݁Fy8AJi=wUl8ZG)V>C/YÝCjm ZAkq^ȒE/|*Wl"&E\iͤ.s{gG,g  QHl\m9'4?msO6bG9OJXC DJAz;bO#; "v܈ځK2很BHsn?5R\DjTƒW=۾vǴO=U"Io=*\{1z2Ϡn R[ý Mm%-1AH,5q A@5jԲ-̰;zLY?oIE \;x1}``$]x08-)p\1E8D-\q[/KȂ|NmdY <]k] l<].3 7ͱQetZ9 _o$ v]z0 <\ p t.5GwL<{ৠ_^l4n./zv i5ۦ92U8߄tf> {W4<1͡NڍcS=pύJp7/Sg^J =Y)Y3`]r}!yg GzgOOb'7Gſr}?u}_ooOh1z_2GbǿŹh#YА}~ȟ_.Ɓ dz^F/T-<r}22v= .5|׋;ZyT~{_oVw5fheᯖﯯŏo~~*^Ze5'g̨gY`|9[&a`FY/c[YɧO?OXK~bρÞZJ}߇Oe^\C#6v_mp !_]H}.G\:geNo]C> ~2U57A}UgJo5jdӲa%,f-|i;jr bڑ;_!<$,Եh%WUQ)e嬼h\$mUM8WFd FKrA8:o3P#ׇ>2g$i! 
|%,{R>; %e**V4 9o$6`V-9U ~mɵ,֔Mz_ I~Ke_ 3b 3YYkG Ɛ&d!U%JV$| ʱMv{<`^ٻƍ$W|A}(<-ױyQhS$ RRkPx`~pKXU1Ń Œr_TXSآHJ?K7M\xqE L\&XᙉE sTddGf4Y OWx75ʍiU5L7)k-@xZ[s bTdƑ"ldqJ M fYR0P,K`pY`1T4:$&4-:nlf)XbQjn-gҚE㡃2^[`CM~|Bct0l,8E5}<3,i Wx75֦cM~x+芜3}([6smNy 0F0 `}p9xw'//cΛ5AҨI )qs&>rKqyqJĝ(,zkv|dwuU,ϕҡ_ٚYC5x& &K2/o]:d)dT Y)u.Cr{!zZU2ȃ^yC4stp7zFf^i;9018J|Aqg lv=ʵ ִ+(ޢ0U^a~ǥ\$q4),΢K Ͳc".CSZ )1ǵCC,v0CInyk.sar6J䚙@@9WX1xTL@+.bsO *E"f2HƏE;me U&V:մ w?ŽC; `>w>M6vC\ ]1nي7CLtK5xw+lHB8ɽ< 3\s|K̟|\b#Ϥ N]NުH9AXMx3;S%Tl f?p}T&NoT&dnw~?pBW犪is1u qYH\|P؋uDV]XtMuu=3 Aɔ&$Og9#Sy9VlyE1oR=n> ai$V~ҋU #3׏88XOiYA2&[I Fه[:ɾ >“HҬ !1/iy{_ J6bֹZV.گJRW5ΉR.ۖpӖY+IY+5nU(֥FewN[Y{{tCMeg[MO7nxnc@Z3pP Fvk5 U賳惸tϤ&3?~vo`ji4;ybܷ~N7}YyLsfO'3,ً'/y$Ӌ}T @0 y4 IPJIh.c]j##ܖ*>_1Pwpj0!0fw56gg ~pj )(w }XV2g ZzJjK>K8 d6-%v9.d{do2vnGef:6e+%C0 nU45Z7 жŋYa v0 pu+7D f_16mX^qi,ǻؔ<.[z"0QVwɄjpJu{ E4Yq֛|j"]]Rע^6~: z~wQ6/FzIOe}!l`Ől$Fi`J5z$%`~y& JEYu*V%NRՖY׫Bjv$1.o3 rQ(t]5|B:LlF2";]eb!]P]~8P1+14X׹H_њ FHe.-}ʥGGmFX(Mk YS˿rW1eE,֖McD\/BeTyGu*"H`*wCB8X/bP]X vXB9nѝ 2o~lܽcB;}Xu$N vJ}MMV4N$fZ5ĩ! 8(Gzjϰ"UIsBR(-q d@n"؀3 k-N`WOU"kvK*`h;N$VWɼ0تlo>kJum]0k?кj:sha3P(9vpŎaC SSފ6Ηn)PkѓIR )gڮ.hx!+W4xXEßǒ*qс҆ 5a XaB*Gg/*rNRk55%# .TiVєbݧ绩>\t91g:.s3t 벘K9_%&*c7}3n]/n+RG<h#Ԃ7h#ڐn JXI}6j%¢>xUէrW{,yyAP8&QKֱ Fol@ >“eň-kmesƙ\a $fR\7.[[}ko֢%)]3r .3$QekymԽq^_|ަ0Ӵxynz>qIlOȭd<8c+o6k HͷSa4/m~jٌ90yS~I.CcWy,rUR a&`]_zoFyp5VFbVWBx!+}NM(p(Yf)1滞ir\pg|L@gG?~y]ɔovĥ7df^%ߘQ}qY콬u"+Nsyi+iۅLoڊWJ`@w~e.1"&31`G|v+#v)@9XrYpDOhqm1V%rRS;ߟAͤa̶Ņ_/ݚKzޭw7o!HcS!Dž@(/8`iR#N=&= OCڙaC%rAL(.0}Lo}!,>ʽ4a>ؖ[1Qa5 0\W'6,*M.'_?2ζ9]o~7Y6_@\̾WKo{f_G߇̸\2); {N\T̏7{!q̳AO0bL5ǡX"E E"cX҈0$z4z0B@H=0X!b0Q'8󕏠m O  K'Kblv&kg0. "c #ۀ܇X9E!u&;ӜV7@X}=N;o).84weqI|Y`)@˞]`6l,<%<$4&Y]UCAꬨ/̐ȐNcB#b"X&JH!6,5#\ChpCU6X+58D>Yyw95`jFcD*R ͇tiPۄ$S5}blPjM#b-tvN~nuO{S'oF}ʰ]Emܵ] <Ǩշwշz-C.oAHpTK~~e{)GڞҏOnw/{6?~p.-;dG&#QOԣ?ԯN'84|̤8;B9doaSsu eCuMP~(gǗ"2#u1/Ͼcd^!ISgo':nYy>_ޏ3j휻bU}O?Tu4 ҃H:4!Fl f8D#>S%;myJ!eP?_!bƠDF*ldgh${G@ٻ]. 
'T[b')E&pI %'iK`b CBzz w'}-_L؅WBMP&z[9[^8턉źt&CYH-Vi KwCme$՘ZM"V|qYwARhm6̹`>Tm,~D,d&'S_E}adG@,6㈧HuqLIl D 3-< vC6г)x;Ϯ>=|?]0,u1\O˪f=p?^T"ϿW߿u;?Gfׇksq9[2⧫*CT6!^9A/쌯n#ke}W+[?v cS*'%>K-:{zͿwȯ"XV>)BH%=$lE{xdrpDR"ը=:ؤ։pٕj%[I'V=ղrq~z98o⢞f)>5peױV/{kmjYth{ܖ%|9þtMj eoSUzS_ JagW4/_bWhP Op-T:/c,)Z(H2eйV!"d 4&x1]*R"UPEQJFgG"VBS8QS/,_{3  c3 /ւN8 FrQD* "=AߘL8LmhɌF*22GTolBbs`dv6J|qDY:wFMԡ%rU|z}֥~_؛NH<6:)>_[[Oȏƙg9Q\2ejRqQP/sy3-qxC| P4(mKا} J3;Ҏm%餤2>,J ٺ#ٚZM A"F;X> 2[1?zԼ,HMu͑Ld֝]rO,bGdb᪝>8O>K:{6iBP6>/B4͖{pł86sl< yLЬNYς-&;=e1vۛ_/x\]FӪ7@=9zg'm}*R$d8Y8RmSۼO2A0ŪsyY kVO)$$5=߾|ې*c 5tnaJ&lǵoul2]ї`!G߉;##B5"j>mE;ϓK~kLG\6ܗlP4Ƌ܄\㋴& 4[΢]Eg]nghDى]a 1g1Bs /#% &rλLP`,IKF{kn&H惓s:/F9fs}_&5>i!쓙_=-ґx|#H=ϱ2]@ O3Z%ߖt܁:>HV%XͶ%G+kIQcV xsc[RziKڒ 5>| ̦yGu۝f4hݢ- kAB?*Hr{wg$g#5mL7cTMv\Kmؓpfq'av )>?A4E}-ޛ{sq_{8@BI NZESpRu>>D,=R V䬢Es ВیE Șq'eeqހOȮEB)+`suI"X}a>`lrKv rh$ETS ɕj`57|]ΆT$P\NP*M Taqyf#f(3˳G(`ٿŕ(zVE' U"hZR{ܟfm);37ll9 /ԕjհh ;36G`DW{ӑ i4m^&`ks;c 1عsu2ɵ'#%ln4(5#&3FSGqtu|ĝJQ6R"q+Fj~ApDXk⤀n2sTCZаNa0B8'Ȝe* ҁ8v-rnHmˇ9f C Is&ŐfN4(FT7E9̢ؠP Zr./ l?Z=mbUhj?ᦹ2H5|p:<]>%Y8曗Ml' No5߶yS>tqS@LKu푌^ !>/s6ZnI=m<zW~Fn5o˾ѳyq|aD76HWj>-r'r܋NB9N'_K-Oűr{ш&sϡTC3moꛨ|2Iܷ :b;m\GN M@ ܗ4QډW[u0RU}JT%ZQg!% nRV(/Bh)&kHn q@@bP+K1d"CхPXQ+%,[6<.~R~=dX מ*c`+"OwRI+aoK9 dѰ N-Rw6-9ڱ޼V޺RT>t`I/ד'HTʗ^iP Zoި;hi"6F:b]zGz4$pje*[za8h=vK =6nڵ>&9u,J z~ڢĮo`ٞ87 "Th(96fNhw}bH(a[+NN^lkٴK͆b[SO|AAԹ` Ŷ8HػFn$vcp>`[fI. ؒ'!RjI-lO ⯊b&` 6Jy&]V6=T:񵴗2 Fu$N,(.G=8'k$Oi"ӭqlx^%POڊ(#7ѧq7>-0-g3 c)>["SΛ7gS"6] =$6rĩmxB #iBщ:~tޣ4;^v.i:IP,G(6="I'ЮCPY T2'Ffӻe/eBf*hqP, "/oryk;yH3;H3#GOQ{p=2Ä G*},̥">x!c8<Oo('r]mڼcvun^_ myDN7E9(Iљ^qOi  S9@6_"v.<{3_e+c 2hr 50:jN +OY1YsUZtg`q*%i5XF$4Pp=8σsz 9, ytl&G_Q QM#'ixˋe|g_xfg5{8CopNFD[3 ] 7.7t=lC-W켄s"=?b>Ma164+R 54QiN9'$U yT&R)`D39J1A )2+qDd. "K "\E 8J1%ň >rzmR(7UMI&æ0ul$25kQnݩ!GX: dS,C]Lx=2}(hxB S-mxPv -_ca>́tb"@}mZzw!ٙ/mV{Ve;_D+uEE4TW"X(L@kw93勰)/}R}xv"Z YoҶr,4&j՘sAm? 
YR'َB ^Hތ0ø>kάɴz$r0 rӈ&<6ϨAn7rY}z(Zh+t%t} ?#]"Pu#*3 [g%T,k2Z~s2,Gw\v5G;_dtQ1EM!ujIb欳sA&9$6ϨAn7rAxe G%/t*$6d;zNoQUzjcdұqf]fBf$#;Gb20z[lTH"DwĎ 8v7GDB1 :=F3afQâWf9sSxerɚѝd82i(3dQgiZfY%LsJ9I(H"ݪf:{v^CL#} }l%n1eNpx\(S4i!RL$" @%MaTF>nOlm+؎mo O |; dpU)iY iypƨ`L)%Sʥe[8Ao,+P)PQX"$" 7@S*Jx-բ+x=$6=;ɉ)Dh_p=$켋"$KQӈƄ hIBR<5%`Fd4HQXdP%fB`q EٵomN(KL696&;˼_C:X_8xwZq/ٵ_tsX'Gp%8s W6MW5ia~YVmt ]mhfd1N͋CpFYHS0H⒗ Ң -U>j0Ϟ糛bA2|bg .7F#T7>A7@g @1_I<0Z3#guñB:M$Y&3|, G;{=_.mw[<_/3h7kbeѸgֱ2r>PiрbkHAqT<Bf~d\=.D-7[glNմ(L<u4ecǥ+2j`W-,,*v8dԉ8OE_QҗQ'.$[Qz3'S I, ֹ)iҚIψPU]`ecY!8~YQpn=ъ6\btq?̷"cUjL$㬜jLmo/дݭZ'`Wd.~MkK{REA=%[r+;NhcVdX+e-Ƈ _$y'zN|,yVwd<Ê&Ii""OBXo`VdWyYm!]33Qݣ췻D`"?b)7FҚl57?>s }#aUmķwçB KO@|ݷWr8*uy8\Oaofz~J \`aQR"Бe]gzrZ)0Ai?ãGD=N Qᤤ1>cO#{WFIcusg͝ݛ;14fM_2{?ȯO3Hƻ_C \( j和PWܒ,Yq`/ ;YP.LX1riylZ|G*_wj%9^C78홸Ho"[Ѧ/2L6)KU,TOș(L+T`TA^*V"SC 2V\ "DQ1S 럌()`zwn p8o발%>Z6_b;m&,8~̈́ 8*dLPSD$S\#Ý#zY`ZNaី9|-cf<=iV^f(^ŠETAv{/a26]%^2=t{D1.ХY )u"'x?yv_'iDQTys~^~ ! Dw^5 휃 4r;1ޣ<ƽ|2# yLFa{/:uC9^fC789A7'rD dDuDPwlDC}e{O|L.Iwha:fl46E>0ZcˆuQY7ᆣ̋ROtAwa {g"zS OخO'97Ԍvׄ;`5DN{gLx-0NF˃d0nӦKQQXY☾^TgrTd+{Hwe1_z<4adNR4#'SKjmiBd9,e2Օp+`W4 Bc~jbd*Iz,qeXPv,izC"5C2Ǒݠ+$ 'ΘH!w-. 
;},^`J Oۉ}x˜|j"lyĶHs\sGGOk'{SvGV\vĪt.ҹvIE"})'+yGXܩeK[ӳ>KÉ̵Zajm۫#:AW=3{$5Q;x(7$6`S xݻINވ  Ў{XZ(EE@ոD@P}tA'T [1 y-y !uyC?mhe*.?&=,X0D_O5& (5Sg?Ix;ovgL]=ެz8QyVrq$K&d,gYET4@BRNSx$l/m{@`6ԚbL-9==[4mɹ6P"PMQ>ʌ K sZPP)M0wmmJ:EUz}MU\Tc1)l'߷A28"HI*!uoh4JZ'j>Jղ.K*vtOF%6i\ Z,)ekb&љ`UhA(ty >QLӼ$-P).*Z!l*l|͗ij\Z޶(5\7&^`!iL~v7 5]jYw݀o"=ͱbJ=>ߏwv1.bIηhW,eaGEɹ++#)Wٱﵪ/o"$c Wz0W s IVѶ<_oǍitAdgY+e;^h(sG'I ^ Kj-O}Tχj5W[@u#)!Q.W$e \z{SagjkUЄ ==/5w@8^hc;q B(+܉xʬkx5f A$:8y()%AdZD fs]G%+K㰽v ⬐B0I +/$ܬ]VRGyp%ח}JK&ޥG!y#g;EŸ~BCFd!=ǥ Tgm$><؞F biG{`w} YDK+Y)н|_O:NtQ#1s14G46Gdwd4J;)Gk Sד#jC$q#ۓ͹&rW1Q+8pF܅$]G#=ݞo jqMHpqfQ/tJ1wx ]?ۓsyOp@Ӟ* Fg{QdN4 {*遹T.da2X'$[k(&)CO+.ػYb\(ЦtTp-pjL{))3eJE+7g6!7 s, cTt%9P >N0m'Pgeg/m`ޯ!K!V .\BT^%qp,}U -{jя%e?&8yΓדt+ğ9InJ{0Y1jW8&˞7w]7]<xЬ>Or@s&o@Q:\@ig[}R5]L3l*3fWx Z=؊:[Q ""APUN8\)^mz!0;{G[zp7y-ȇ笕ԟiqz2.588QأY 07 xXmKͭ 5!UZIuωc"5{)3M2[= Auj޻Ӟgsr%)H%X!r6@A&w[&9)<:3(QȈ ml50VF2QWi#4m>1*eMK62R|y5Z(e0ǁ- a 6m.qžO1R -l5m{Tw\dܑ.YKD?f0M$9nT 9Ƚt!:N CwCM{uŋQR2GhcqHֆ^GK*QR03ؓN*{ |5W2TD A rZEdUlYiK4=P4XM4+ TȲy;_o%6;aNt:x;g@1$^泡__dpmϣ`'ZagQ%D]M'1]ZnU%?j+8=< N E-+5C9!^0Ÿ*U⚒%~]Z]:*Tɪ!w,Ր;B(+']f]r<&cr<&Ʌ%sY_OAP-`9/.jPRIFX zRZ7"n2'3wo/2VJDup}uwl=l=z)%(7+.P fQzu"w$8R u2r'ZwOj~B@y4ه[DSH%*({$HV:pPDcm6bHZr+(m&$ <ڕe#RVـ4`B&xJ\5~+v )J=`݆RC&$R,V :wԻ=H0fg\# 89`{f)I-) JZMB>P8*nM@D Aq*hc*S3vAIJc+M ӊH+B(ˌ$~ZZ_9[ƬR9ȡ)׺;˸t5L K4T`BPrs9%`.6L9dyGxA 5Nh =C`aewA[H)Rk'HCRxg." gLB0bLx(j@B{&r_֮0QR^ V8,Ec=% @I/yjId8/0dp My@rbD,Gkp0QMV5tj-)w jJD2H ǷYݞZ.HR(?K!:)RM/ь q#uE3IJ!cUUHBǶ\clf@8.a)y}(Nº搼Q3T5{ǡWFlb"M[fߟ 񷴘7 t_;v:h<^_nc|>(ȀdzP*)4F y]. n-@8% @Mzxd?rNlDC[yq&\R HSlҳ)D zA=B~ B1xNb ւtA QdRL>iBB1lcq| F|btA Ւ,3 F&Q54^bb '^H!1zu GL̛5NcX^Ztq|<(_G#EGDj>Fn~G1a}P*Ǡb60BReIʮGF P*Ӗ%i-tY,KN)S s1tWU5TRVPKnʊTZvY"eQF4C 6dY K:%CAC+Y+tv-KPv}a7Dè4\3!48AX7mRXɍ @Af+2E_Q"=VWx=wew ŎR!YOJxS! 
t y~M޽{xB9Dt+O kZ `\Kvՠ}} x{!6 ޭe4`vÇyc٥\m] ͆ɇyN~dmh׊Оz=g @2>߸AKth{IA-+)mo\O%S?9ʔ1jfl@jv o=qV@'= rg'iv{ yze;ڿd4p>BNO37i,z.ɫ٬nu,LSnG|9irߜ ~WIXCx0g/o"XҚi9nYP풅h,L*?caB9o3 0|\Uz2=*]sTs;n'stΗz2%Sst3wn3ד{Jк֓x℔3O )_ i ޵5q#-R_\凔K|aUa5%rIʗs )sD"f07%tbgh4FwZ掫]Pw D}:z(jyTTYb>[kBHk_ZO:Qh}ɿlVbar?/ߵG&M /Uvp;u 5gF [(unt¨Ԥ=FخZFJ[# ]n}g7gT pCrLRc a6jO(z9<*;$$G_KQ.E3S{;BBx FvpZ (8t8*Zie(F%B"PT RʫG@ڍ?yW0s;NP 3sI J)OfYnsoT* ƺK AB{W\Ї};˵ m_ u~).3u3j6P5eeflܰ2lΜ3 "9>$g)z(*ը$K"C9tKBVpVT"rcy8JdI2O%RfY-U$VL’T ܬUL.I8z<<2&WR##02-sDVE R>t,(H)ʨAd҄}I_RԈ MvPz` ɰr,"N#VO@#L) a֒^e(ORhxC ,]V(է%q~5:*ਉv$@SΰSk(,LX > 53kRDs1,w\F@fTf* Gamkt8s,,c)N, @^j5h 9XhI-,2́i:gW7U))<)`r jnz1U2/\Ōۏk&Xr#9#&iX!^% (g'cvvwucnٕ0惟4&> @}gP4*ܘ؀MLf~#e 0l5CՍWwZf\QX ] i"^^DCQfX  8 ^SFK%SUJ[]q)ڣKD:h]9*RA{ti:Su)cҵèU Ah h].5wGBjDu$xU6(*W#QPT(VEq =JTR9 tlc[VӅuUl=M .%3GP֗]sEz؛X5g~5a_8* 3֫m'.:zYo't_ßŶ|昕 ap-C/#_蝇,|z3fpȯo&Y %U*LN#*LIj4BYQd?s'Y8xG!W >lHB}v1^EϗzX``??x#i>tYps` 'nTnb1sct$̟ߔh;w@ 9%9fymn"JЮ h4JD.K7|h9eX%_]|B=ZPٌ !ը'nW)+T-)\vX2ۆrG_^oe+$}+V F aKb,f?߸ 7npxf;OSG׹NYe<"*2lFJ-r(WZbx&`9B2&0.*˸ Qí@'3DZ#XBҳPbE*$ ߮s@>Y|&[MVM끻[Ѧxӛ$#rKrgxcT߅›38@x-)Gg3冃w ̏CF0on^< 4aj;|6k9RԴ}# "ŠgޟnoYA} Hp%'F0HPyt?tއ"rOF)-Խ-e4tAegjsr!0\/,`afJ^sx}9}:LU}8fʚ .rkͮZ#mNdj&z=`ܗ;s?B iRHIɡtNTr1vޚ4Zhrv%CѝR@wH"BWzNᮥw,MM"3rdžKK-s$8$3[)v&TqC[I#2BZS"GbM$<݈ym7lk[E=RQ\/V6*ZRwU]M w9I;þ'CI1ϗ].lSz(Mp4Pg_t^y ژG&%) SW39.iȜX1sj09kD2SoY9ͥQB=3{-2ՙxkW膟SCb謠1HetrLb1̬CT26%w>^A!DL&lR6y-;2 B+{M*'3%Sڵ]٭'WfDW;|5G\23`/I4gOcئ {}#j!e{4U;ϕep?{9]8ϟ ")ʼn~4Ʒ fw޽'^̹m~?ѓ qZ*urc9rj':c+rL⦊BF:is zԄ򇓈Q '1J7nNIS$$ꦊ7B""תUkxhdpu{ vWrvl™(S{Wwi\UPWyW!q0&UGRoNJAqTx½>OPf널|27o?bZM2^2[3ح`Lf&9Xg S>F D\ )6>3V8s{5#"H(RyRsƏs@ ؏ʢА.1?7Wx_f Cya(3,Mȸ+Rd9;|qT KL ̵1y@V9 >S_$OqzDQ+ d1{4S 0B֫oU#&(!Ndo HK/VTbLaݺ \ uQReU(h/{U(Fgb:@$ ,yrİ\U-J%^e&$ VM6j;W UpuVLX*1VZPcF{`)8V$# ;cX.Hzck(X+"DD\67 _A 筓Z:V]I]/&Kxnrv藰 /1,Gh1}FCڽ_Lf]׋Q( \x_bgj{7_1.KowW|8d do_mز#y<39e˴I2q$7]]]U]UΠX#}jRJ'TلrN9{iz/H,4TBKWIfhkLBب'JA!҄֝k\wxDG]ugN+ʘC, l)%3$ 3"<!**ge4 &YdYѬ&Jih7LL^m=;sV(MM T8_|u!X'ghh׿;i7|K7b 
s^fhEv7/:?vdMkVC@~|Gzc#/qtGQwg'L^B_'~^}o6,75m(<\-&F(p$iK.14[@MHh&$WJ m`kWR`Iʅæ.`\N_ٳb*횼xڹKQwΊRH#N :F$ZoCkgӘaRZ)qnEU8I{]~k2Eem4$cx&e僧:(%i5pM>RLiX 0nS:Jq*斃}p*&DSt 6˩" d My29#ɓ%R% G3HƔ3IyO҆I:8<њ0gO '!q9/ ͈"EdFCL#T_Ft!Jzh6h(5EyRm/s"4ŵ&#[:f9xfcjqfb}PMtONsv|z5uvj y#2AgRP:$%\2UK&O&ɻ K8B8j,I(G5&2kg/G5pgсh |g.g m,3@qe}-'"pM]+[M~~,l{o?$ J ,{U阖Ev_fyx͟kyxrdg᏿׈|rý R*%:[ M ]ŗY91TCBabcτ=@УNϨVuKW f'? \^oqO~}0 Nƕ{g,ߞxB-4)Ȍfw׻𵨦:4ב% &h\1HA|Wi|Mfyr>Z\0&xV!h͚*rI p%!dBXCfN) ilDJٔj&&kk- ,D,dS!`lp9l2; N.+%Ց_;MdZ!"J &;}d0+ uwF"@b oj'o>Ǚl$ @6fJ 0ؐ߉?S(<7IaS"SE̳%a{a9C4:1jΉ(X:l*$e{\70/g=2#:1 Oe.*Y$(9t~&\?*uFaDtJJeDQޱBE-k;é櫟 JBTPp 69ƖK5=>i3+S<1+xIA"ǽKcHQ'?ke8mk16qzcǛ;dZ%3fX8lxF>vkŰ]Rpd~z=8C]ګ]Uѹnx7;fMlݦv ɏro"Ŋܐ QCRIVr6@-dޜU"GG&ARZrA_OJDMHIK2 +W: ?;L)PIFࣜO.i$0_WZ|n-_LGʈ䝇a@lJra QN^viOgD(l OdxV'B _ۓwӛb)O6Q7^K-PNXIRvj/&lُ)ib,~k"?a[v^.\̅tDdƀw)rcPeX~qsy:Q|WyB!6QhtI`X1^(T,aWZ!L=eldlzGw Zkec x}[ڇ<+pA;= {aߧ99,koJ :c&^. [oJ˜vnϽT6+o] 6i4 ay'6|eޫ#H_37־" <ۥ"մN!Vm3l<Z{Wlj{VF6؍P=ki]w~aZ,:r S$lQsӷ^@;È ɸ.Ć`PHsK+2* 3=J JPfe Z8;h6:Wh !9^"bk̘ jKO31.k~xhZh H3fTPNϬ9lhe+b`lƕIf` R!\*Y㞭p՜~Ewe}Z3z|q¹Y\nEfvWڻ-:.oP[Cֱx3v:֖! 
@ۥFmY:z%qSm'nꃍෲFу^3ĘޤEQ5L&B.~/`LcX=i= v`8c6骻˜~EDo"Uo=)[,aK9ĿU򘇶 wM؞l1j#LBCS wwu e;Kvmv`X읣I@ 19DKn/; ˵x N҃4;7F}mw[aK(<48hNY,b An;uktFlЛ9j"FӃ8$0PDQgneA$EІ ѭIc0q-jATSuK:[lŹx_T{ߋj~MMU]Jd*-W_~z?qr$T/@֧<`xw)hBT+`}*a6<&X8on#~iqG:'ߟ*Pg% p 6Xl21AQ P)bրON`F 0LZ*A5bI/D R10 Z;;u^j6YYkYh?(ZzגS(ө.0?BS+U+uz3LXܶ':H-o ꦭP=FO*N~ 5=Y}we/ [[h3(qwA+wG _ub?zYq!\ On X~,Nvܞ/k{7?xAr4@ezs&T9n mZ?܌~Έ$iwOJ x=4>|u']pZ] B43yOOtvx)WW߁9S%VϿ}r_( /regwu9]l3D P\ϰ3&wӓG\dc/j2@wg(d䋽?wŠ 1d юԳSn}%5F:l[Y>#T{Sx9객aAfRїb0G#-<*9u grE{۾{=4k,a'K1uqDfڪ:.4Fwpy絺ǨN>v]l 0l`R&joךܕdm({#lhV[  ͉ad-ZSVN;&(`H||m/rM]}w%jլmaɌahO{˃'UhgI!U F%ΧTHh!Uъ b,:nBy=_Mj4" dP]54z#5a$qʇPĩEǷl=l5.928hHy͈4w7ZNM!FQjۃCߜ>!i4InǾ\VNF .&,#AyG/#R~kw}<'4]-\68VIRZbŦZ[ST^v*߃.;O EaTwe=nI4e׃Vޙ!@$Ǝy5MJdū:Hi]dE~EfDddLDh]K Qz TQ1XKvR4)d}Z%Sȩ$e+#HyPZyEJK#Qu\2| Q:c!muHwFba=3.QnE' iw)Ś~ܐz\LI!F)#p7tDj"QFʯ$2|4^0Vx Fٟ!A4Elؐ3Ӊ܂8^$maD[9I >@B+}?ʁq!?>dK|!L޾C˹͜S1axCBpA EA!8{De+cg)9gw+j!y}yoM<ǥ kW eYq̜[hLwꐶukXn DBK_:%"*PL"Q) px (6: ;J0zgaċVD ï'5Yƃ;9W>>Na|#bTb[`>{Y"apbH* ͤVdQùQeU_F/6,bp9CajF'< 3KxF=oz2M hМ08`ZxP + ?\]6-\y3./(dּ;jeewrF{|۪cK_@y!zKoZ]-~ϟ'v7Դ; f(=ɏWW"k`8~_g| fL.Z=~zpe)>ҵFWmڭ׌ )[VHl4.p7rm<7\?_eE.A#F꽳VwLKo]ƻ#T }a>^z@nu>Z?Ȅ>ڮh`Iˈ,w;؎!|:L'm?o◑蟧 dù6j=']=,=Kfb 'tNۇa=}Ѳd;=\љkn [>_<gzlZ@3Z[xN'%Ny[DEʻr;ʅ@yL3SA7${xMۇn !}WxhOp8!NN>$۳U!n!cҘzz*d[:5vC=5z̛=Viw|ݽwJZOv?fX#% 7v$`@%?s~`E"GxH7e.oҟN%'bug")\w.~:]av9<~uڦX@Z[Jg9Hi7 !$h9H?YJR; ݛAniA' v=h6eC!̖Uq GfpjE 8eRD^3@&'VS 5N ym8mgVhqhMo 9SK{awEso/Aꮛ `դ>;99m?R-8蟦d[d(e jO#2$\_*{7g@$xZ;p\|4qhjN`j-ɺz'FKpZoIDjq}Zrjni"A_1D"0+m-80$ϲ Kry7qĈ|QkJVCPr A#M.۸UqhųweM_^DBcg`6Т;|ÕlҋKHfObtUB@٧ÿȄXztpEaIi`v>i!1m(<0R :,qFO2A&k}']i7+]l wax9V;ȫI*mm5W Qm(j>6{F6.; X:ThK0C<\67]DL R2y߼i"62onvIFs㥽糃jA9柌T[s$n-:Ud$1S޽nCeŔDM)B<5ގ׼#z{}~2 ziy˔Y9&CN41^-®Ӄρ,HqqCCLk~c@YCW vFeŊw=XePЙ-nr :Af;^Q`7U7 éqJJ{^*Ro2EJUh!JMWlq^+5KD=h!zq=x wؘӅ z0>cT#jz]8p:㨉m5Yך8& [FO㯉ew艎:)Scď,t=uo ͥC:!5ӖO'g61fr"+ͧX< Ɖ@7v/Ƚ{|Zݻ[ 8xe O t<<_   ^rtˎf9Ѵc'|ͨq+ǹRЅr1 ENvT]|uSM~RTIJd/ YJh $#Tztp:QϹld 'ȟ˓kWQ Dr_&`>W?t/M$TOHl^8uyN6O7o $DuUNCTN"I$iGӵvz sWP\e"+㸷@(V޾p |C&?%q:d8|LnBiFk\L)T&[[ΖݱWC3 
/n܇!.W~C.m ~:go>if;]U\NԎ 8ף`YtID/:wncۗpQ5nRk 9ƲE+8!Y_(u=N0:RصhSDh}Jn*G2"IXŹ_<<3Ŭu_oo]&_<mtsk>;;k3M7^.}ES~ =3f`L;MS89ӞrG[9W2㗼 D<,HI@L|&^.T8 &I=9'y{6U{iyrq#o{}LlX t$$#:JaZvIl̹;ɜ1[\g!SU4`bs:Jxg+Ki7!  ~;;v5yQC`_`Բ)9 ]p^')dv!=lr z  !TKYL+Cge߂hsL{H:a|="yǷ*i5AH3O"4^ً45(lCɬ"3yō<d2[s"zL=FLI}~&jtgE Wltniri3Biry{InDDXo\֐ddRȨ(i!:rʋRR) N-qYbwDk;z7l_N\Nw>-dts]`>eB\z{yD$wY-2N> v򧳏ߜ<+dr I=Mħ_ww|?]߼ɹn/OK&&&&1f0 ƀ 9"Fa\8r ?VI)jSSs䧺bf?xqf]Ɵ -_援mM^_Kѝ<-t΀ [67!M?-+YqtGO]io۸+F>Ep_ Co 3)L1*Qخd'mKJxg ->sy0yPi  B7Qztz=6e^@9%(G2Ϡᅎtt0W7({Ius}] <ϓ*(7 @2$DuI12"]! [16@:Ѝ'r~ uΗGn'KD|Wc>n@'x!Y1MTty"i1a pjE=%Swlҭۺqɺ4AD6M!|!\ٺc.RP;лT{~|}YRxyF1AUq\ \[zY=T'w6^Jh#=7&{sc%@< sɀVxCYXn l@ΏmWn[l֠)hT.FL7P~Fb%Gr?ۮ[zr,z{N,@䴕=(t%Xfe-?(ll.b<@)qx^JLQI׳#cz;!meŗ8Yy; sDE헫6wz UcgjdG>Ga7EppPU0c!p̚(x;Šߍ+n~\Oiف2mQ% V(u@&d{怀+I3zK) \(W!2T2P0$n@!AA=(eAY:Vܗ@ pC~!J1=])(&b.e6M&˙ܞ}4hu`+T N[Mgyݳ]IZ6^k nd,,=A@39±/ {v6} Uc"V΅%ՆiqV%Es7XN o>Cڻ+FByl{I 9h>rOscM[NK2m*R|$?>[B,\XYW&z@b놔 rC!B膮?Żl Zϳn]!&in |t\~^DĊP}:DdBY]k 6|V> WaVԻ5 Of~R(cu[Y/omt˾2͇wJ$zz:݉ (d u21"aIFo<1p-Hԋ QZnZBZ~?dSӖT )ݒݏn05t+!6GWweRhC'}ne *EoTW%efZѱe,#sP@8zL9.f$KB^VLkOKx0r[Ѳ%qPn7\#4W3ۊكc@J򱥗rYzhslbE bAQTr@لtIȜwNe>}ٿRδrU[f ib7PO?Mѫ(C?O{wS_nI/.sq/ޜ:~8^{PuOgş_ߗ/^|g<}6_U_GQL b:^GQ/ntk;<ԓBVd >_TW0Ӵ9D9SClLEyiq^*[oM[@ݩIMDs볫Dg*6Pgn2}J[2x [0^౷Fo7WYS Ÿ dJ?5EBٻ*ziMbzPfl8s6UYM;*QFhnj:f>xx\/FHgw/=_|>}[xYMIJ篣ȓd^$4D>{οSB ~яKqOj9O|;<}ԯq4?\CzA>)[;@ ?ynIn7Q ߿;'tI'<]:7iai#yM\A$я sntJ\ڐSkvqd?g,jmwܐz Q@Z7٤uZU,t̖ 5#s&^/b*:e02& |է ڐ4᧓YLzO(BNńN)>7cdE 2=mܛiu\~87tOacݑBp7F4ik/MK6G}Ju=OA䂄A1JO W"~H@"s!~4~dG6Ze+L(i@ZzƆNuתPi t3X]1LVgt5_6I_ﲭ V9XNcA, ѹ;&s4ah2ak?7ufߧO,sP rƑ[vv{ eʆh]6|.&FI|kE9^f$gϖ1/okX^ZDj P>3X.0 GPVܝL)a~/ȐP=DL;o霒wi${6Zк|eG6-SF+Y5n[IWU&OezE9_RTґrBҡ^HÐ C/&Đ5a=FQ"2 Տz^a(,%brBV]Z[&`"m0'a"OuvNťZ=&W`&L,tJJI% X`4|~6?\ٙ@;KqvIDn5%ҁV䅞w`y%up~yIrtd @Ǭl5_x0ݡ0q乡!sw,|r.^=Fa653'MP3Åe>ۘQ`#-EZrCe<=ޱq* O'>A40@C`eH1HAua1a&f֙ ,ݺR YtX9,mx!?pjBH])!{ؿK.G e^Ѷ{I 9Ape0:FV:.x7ߪm.Lral(E 7Ha~M "s&H ' R4I[-J6B9q: QwozݴCr eVoټhS[XOnt.h <ssUJ WtDEHHa{xEM)7cBvT Uxc[ (ho]6.JH͐uE5`lI\H($SIW\pXBd%Wp"oNw cZR\( !~ȩ#uՏn 
{5U6qP{J2C 1"r=tڲ½ӘBM21z51ȅ8@NuQ&xz0/4{!xE 5s"&=@@l\0-+`(C D3X;4y22k-|ޠã7,Z U֦@$2da[~E $oV=Pq`  r@)u3߲Bg}=Eu o# @)z0H!DMy3AC%[_grG0-+a^69![ TޥZjrOP%4Z)]:}@|vn7]B!yBHm$yv6eN^H>߲'dJn+] ~t9{BƬVydDX] C4A`j80$srG +3jd=$)(⣢ځxyQcjhRg; K_a*T[bui:IXȊW1uYm3xY"{\\\&Rȥq|mK i> % 4P\O;Qx8|KyLzZ p\{vG6vQ'LkJZꎪ}fp\p܉۬M*V9i̡[yf3(:!"o>]=B=bmRUXvOk;ݲF؛xE nV.{\:k 7]gM:3B ~M]V6n2i즋D5 oF[A0?ϰ(R|vAcmסaw|99] сj %?je}pMn Gפ{.Mh@(q/P:%G|*i8ĸO;AqF7e7.S]8Aa`1^ #]~C'e4X)['ܙP%KKn7:f QRƧAFHr|ǜ%\ܾo-| , X=T,U\ޛ(̾:3[ɛ} {?M iB*Q&d9!ȹ:}b,k_c,.8&"q2B>oVMxHzӲHN9r6z%uRݳW hemVr辳cHp7zWوE-Zk ݘ(fhte4m o?>SWׅ>h@Jp9Gf:AU~ ڲžۘ]^1wo!_ߡڹB3H@qX#0rgzj&&38g^U3 e'@I$ͧ3gkr+P@RP9F(]H"NSe~"'@I%f wv`\)Q|b*YRFŢeGiқ ,@vD~>Z =*'QM2VsN-k1^bvj e12 &F/'Q0k3AiKa`f 2aBT+ >\]r q8ۯj.Bk!n*__]B!onVCC!P-V O !SQ"ÜKl(;L8S }J 4`.*&˩7<ֲ}Kh~nlY>B]F ^jgWfYn>w"+=կwFE/Stw|leJsV4 3RhC1.[.3&$~Ay*Z+t{x6CN4x0Af Gs {³nM=,pu\m魿wo(j $Yo,iTֻCH܀4D{1?ث|脡cE<^f}j1h)ͩmRn&}uX|np {4_vVhoB)(?'LjRОxఢWz]4`;S旽1Q3NX?]Nb'p/Q Oi9$X K| h}yDF1^1xywN)b+sS+z1xonN>C^m~ _s?]]|گ fvfyipNI~6/]h, ֦`}g"ЉN6|oe2LO7AzTdlӞtvѴ*eqߜ^^]Tջ/^~ũϟ7lvXИ:hlJ@E lz;KDKj=':J1;s0h'9>ku {Ao>q?>B#SՁ|3dRQNi+~1rs.ߑ$:YVB8x@4t=h2M8xpDԳcTh}hD~hЀx"ݺ0Q8sb/8)HDfqطxx?" n(Ys-nMȵtSϤtڣRwSSRͅƱKq7է-H=ϫzeiXN|2..#=~t8_at[^Q࿿a~z~~߱2C]<}ȗq:̣&ڞl"O7yէ:,`<8 Av7jl' "!>NnH'nKeb:Kcw{܋u no[ ""} F$»-A,) hjmFB&:ۦx.RqMWnXa0hyX*> :?EZ/1AO^"{zl (jHDB#; ck^^vlmԱ5ELzWW?-Hcx7CZHVǑMl&%RZ?aZU{^xMm& G5 Gms+Ͷ_tw.&3G4CӛsW'tyss T{`i4bq4z~gSswz&T v倳ď~yQ#_[/J> W< t0P3>~g0oo/<w2jjbRJ=*1J&bnė)n<=OqG=>Վ}Hq ,!$5X+lnD֒@>n؈݈3hZ_duS]n2:U ÿ}o|2Nw]PMOn>o-Ue6_K! 46ʷYS5b?z)~i<2<ҁyv0p[s=Z{ ;)Ðq єLw|2ṋypcvW`q7l3ɷM?yn :nx61HxKڮ}NkZ{ Z{اd7E.u.-~KoI2H(SЃu︩Q%Ms4Rxب y$ %jM]4WEeϾZj%g#GH"  #$mNllDضx0l"ES#Q*}'W BeռB1ͱ-kΙmsu٨:n u4=u7ه. 
=Qo/ pkCϫĺrΌ ޻D©|j$􏮆:Sf1Ag بxhf/ES1=u9r%R򵚬%F5phܒzbܕ$[q&8/ BZy*1B!FFKixfHA?sԉR4 18L1&fRFTPp ٔb&R`+gH38)Z\| PXUN=1kZgH+mz?- BhUxT{&'/hpbRPFZ ;p3aS<ܚ23a B7Y}.Z?&6]^ pRv&4mm7Կ{{vo>04~Ym+ 1M?n?lˣ+@*.oo%a~"98ۀVΗ,i58~iU+S_tZ6Z,,Խd{e; ;\U=[9&zY&h4I_?|^Dz;$;j׸m0^|zg6M (Ӟ (Sn/6*NS^v+B);{J{K o F(ٛ4h1o:1kR<Srmc/nvaMGŸ> 597~,.VY=v~R_rʍ=p0Оw=舘ύd:{\{hVfQ%FoȥC|d]%ӫǖ BǩI'g]RI̘!*c2OI"y[#4Z&9D.jm,CFՏj@ VYWTM1U|ủXUkV<Ȋ K I[AVH zXְw61(>Jj?VșFfz𫰝)7]uX2V:@V,uώ^AQ7NE3d\_ ?>燗_!OPlS0/CfEeO}vJ=4y?9"NФ5 k au@wK!5&OK~#vV<tnB "ԸkOʭ rL&|jf U_db~1KDj>z.yy[1} peNj,y@-ft 2~˝[`\Xyd+87M[nAsN_PeEnihRQALkpY|^Pޝ!™|D~+)J%R^&|8ʷbV>fMJPLMX+p!-#,|KO @\5FBk y[FM[NDY<'c٠$o&ƫh y3Y>K,Y;2d<-Bt%tϾ}@k=n=+.}Ѥag* oŜv3&׾j{_c]!H"f& #-l{Fۑ\?I|~ V <24AH|z<Źv۹mY? ;C[J$>$>po_1tZeqL' -Qڬf%rƽh;MJ*}lf)w'~?}?{~+(<|=~N?H<{ws,gTבl\C]gʒ259C) QɎt=} " HA%~9HF<V 7!eCpÈhZF1\?EAiga8|KLLMm:l1lɧgn~ n#L/ww5˲=AI",c=^Ҡ2=&l|+ Wzc1CsO+z0 [8uƶ$hv^?0 OStz>_az `W./ڡW̋C*jF P=X+)JyF26wE|1{+%T A(:qL\mmy?H=Lm:c&p,po{6&#N>'>Ix=rg\zQ93pbNti8@@jmCjPq'.$ uUӠ9"uU Tq=K*;F <R I07YzJmO_&0~ l1[+:0}r/(g{k諮 U~i+s~J_ "U}6bx#ZN_u;-I_'$%?D_u;_K_]1Ak7 zz- ni콪$_7-;x^"2˾6*ªVA jp>}"b;c~rxg ,7 0Vu=PhwuU1Qfں:8C @A+}UNy6^ٵVRC-uD\Fs0i({잇+&Nr'2cj;lg˗ϢWCzbڷuMw: RYr=l\36%tOP's[4^l4O@~{TAu(([T4eeF{D?f<-Ӧ0+)@# &}z2[X3d%ดZbc>,2V' YbEB2>3N̈^ʘ(|#E(Tc2UF` Hs&J6cF"W%i49[N9l^~e[Il!8)sB¼LwIHlX !';&Hl<#'M7Atukz{nPk݈Pr p,5;wgyۨvV3]G{S8Ⱈc/]La}gN,9wg.W0j6xm30nlѶA†,˝mrH^r}ph<}[|GV(˃ AYz<D7 fu_-߅-yk%@6ûN^l+s 8i%9nZjt`V-0;xQag?j4 fvstrΆ`ϭH߾ɽZ#zeW:/L~U6R0 vTߚvdv:t69[,n%,@ul+#(5ޑ@ơsى5 z?0ftׄ-Jg_DBtSHBPLݼME8WX*^@9->ǽ3лơ2(£G͜0H 'G3d$g̀6QN6/3ԐWPi0$wedNS 1!Z%٘Ģ${o<ݹ ʯ]uioUlM7oCނ,B: k,%% iM2= o,g:H&wQ69Et qbm$ w;QNPL~m@لQLQ"'H{t!! 
HJ5g؎hN41yN,3$BM 1Ze\/Nk*RռnP )k2@[Rg-OiQmF,0lo3f $Ȩ7\^jVJͯ8>3382/8LIK#q(yq"  .Ѳ=|Ih$!J`J ]+g3xKeD)}#D8yMsY&MԐ*4=̥-IfRt2%hF8+ZBaZRpӐJ.Plj?SD1O썠sw6Vd:{la>xSw\7ŋrg Rt%|_[!.;4Mtlv}华+ pw"cϯG>Vfܲ7-Ϡx.cLoM1tl:\ oƺo_劷 nR˟/UU]gaʻ]-84r/y4^:[P:%[X8_}* Io|\S]ɮnt'H| ˘Eͺr=jRyݠ@Eܨy!R\*Y$ZW FF҄(L*+-pnz^b(pHKizt8=T+p9h>pY%pE=V%{{8 #'\|k3uM"!?+an9y9׼ y4ڮ2 PjDxأ JN N"Sڮ3l4\_?̥dž j{ nNJZ(7vM :pk]s:Nè A"s8t0 hpD%Z30u$սGWb"8G9-s>`i5S2mDG:6c>|wdVKQk5YΦ{غΪ'cB| +_ m!]LoLQt"6V+Uƺ\i*rŌ(|N2ڤ.:Tw4]zCF>rQpFn5T߭]ߦ4kp˾~( ==Ae1AZj`"A )VK G)MφR'fy֓T;Ǔ4_peC5e:** m: a,f.L'8b#*{DZx=3,WHd. +nHGf)iπ@ +V Bً%)U¬-mTJĞɉFj 4/dq=Wdխf N ]'k}[Wپ%٪ab3 x[[⸛կl'͘gJUzL't\hqaSUaB 9_.X51SWOfOѧ^gϋH $GMpDW1]7jj/w V=Vf<:bq6-$V8Z~>}'D>@6( T֮Pg)Pje5TM ڭR4O9wΉ`Ib3RM23!",Te:C=1 epm5Zi@|>ɷk} 4vʔf*YA(EtҿG`ś1gFp+i,J1Շ̽' DS2̍!eǯw.iJAXH喘T)Xdd'Mϭ qdC0 /ׅ:MaQ8l|\V9A` )_l8Ts1~tRF[59cit@/]k T4Kwjld 3Lz>Of^"> zx˓FcHH)==E#-TtKK3"r<>D?soO9Yz #y]8!nVЄkg,~6/9СWhxr\\afٵ CG&~d|_MFfW t/{ʟߓ=>H3=Ќ(z'G(MS+XeUյjk*QI_ÞL1^-}({P&9=][՜?=rtn9OtX:'8]sd'BTD=cIO<8d;syH弸,.7>d{&M~^: zp}JVOlqר~^S樽 Qd4 @KQ͵)lzދQѲ1:%ԗ˄1qG+{)3F6Gņ5>H+Rh]&h;{=Tfϧ~_%?n!O#ūMt_9r/J5\w6ϑ)!2/;$}-$N pJ}pMZ(.rBJoׯ"Ԅ&AԹ=rn6|١-ӫUu.]`[+2/a5,DToL6)cZ|^3O(ݽ/T&8ӫ%w|q{3޽?$=y nH^&Y,ySs_h$*7'AyV靯Y2wBGm6Brov ˤV?4 UT|93ϸ[LtՆWلXW ~uwu-5$R 5N+T5VTR&l!0uuDg2| e7BNY~湛.x\~ YAt,9UY|%+_ XH&&mFwTg_J7g%&sFDbnL:!w*ҔrEdL4e;qLr2qkHnn ˯HtIIzò5(Ezxe*#v\|qO 0Fk P+]LS~3\M4Ѻ'Ƚ!6/~\"UB%XY2MM4)@4k Ț dB iٿ 8eX%]dYbT)$:יqQ?0_,#8Ty8@!pI62abֲ4qL 3:#R!K%Rji!PfveeH5C?hn񀓴OKTI5>cS&<#/\a?yz;7wz"c╚?58W/-1g4R@on'8mp֝%՟ԄKT{ўP%0bt+ooR/$-Y?7EMÙڼj GL$ e&L*j3:\z pاaBqFD oP%qqUG L4aT=[ڨ ?{'2!' jaofgY/&f>q,g_+_ގ7}PYႱ?}ۏ?;/PYh/cV%s8Row?}pM srءs8,+j! = |:Y]|H~2U(x Ԕt [Z{bk.W/.m^M^dEǑg. 
y`Mp'T<4R-!>d:[/߼'#,tCro'PQC{_c80VhHw^]ar;A,k|N 'x[]S)sLEהiDTR;IIk U  ?3kmKԪ<'$(sDbj ЎqԒ1g"kgg,U 6b!-)y4Ba[''Mmm-E4K%[_q +Iȡ(AOK;#/޶?niN.~a[Q25g'u;nk/ww>6{z2 'b@tбV+MVߺeY2D/Z<% yp8^e0x%-MWSlB=lYP5\>C^4VS TʧCSz|,jM:1uAQEKmysU.5\P/ԾGSw#)h3 U*s-l a ܜ ¦yw/:TՅ@1T{s6vVJ)O9wΉ`Ib3RM23!,im7潉Ժѫ"h/2.5(#9QS4A,SfFTQQN`ś1gFp+iFwm͍X23CqRғ&"H,{$lj$%Hmvwum9 'sbLaA|9X2Y9UmϚ=偪e)?殊Syv\z>?'?Osk[vy=T/u2M2ݍ-7~پBY@ 16:o'ȿelŲ7ۂŔGsˠ)A:˗2@ic. q{SHOI/+P@EKD]&(v偋D]2 ҙvϔqn!$䝋LJ娾;M@|y":c4n7& <-i؜Ƽ*XQWބtUO!p'/oFT*(^BrѼ9nGS(ZxGPܮ @LD/P, v&>t2,+#>t`n 'Yw ?@11TmDا/FkU`uba=α=6 J>6bys6Cw@/y#,{muyvo7_tqnKv.yM`tzrm9IreؔVt{SEwRSDa Ԗ{_a0=Hx͒cPP͓У;H) Dd:pwL.$R3R϶r ( 9] $v/P$(J~YX 7? -if2|EL7 CX*›b`C;#< H@7jljű3C{2]>ϯp1?wAO: 1WWi<3b:xO^kvk^澏-XdJzM[.H![2Xk#?WyQ57^L'^)XHO"*>{x_},ߐ'Oo^N 6&2TXżjͦA'Ƀz$jGz:k"rL!:_:?m J1L2z],u ~C~ǝf݀Qxr*r., *U*L 0BB`8S 5TP1(8E,=sdne8Po;ͨ,XK*gjOea.9Iu,I`L!gil{4?M׉Rci+v`+N5 >kO Hā4 Nl[2o}ևJ[6XYM!WMNEeQRAm^3߭ ǒ B;t+dilb\?h<_;G?ÕQ4ŲDSdOo fnNWiX~]A=> @6m}faW^f3.KmR:1&PR."WŎ;WRc %ڲ9D90 lT]4JeBl% 0@ؤ"Re`Y=Id')罊V$TGqاI"4 :( Z1Kac:q9WW;GjH5OA&_9r<ܪA҉2 G2*v)Z)HG1x$,;gcqYZ?ʵ)uF7gN|Umc%2˥}$6r/4AHf 1먏SsM~/Waө)fT +87@,X\oߤNS7s:ݧetg ovcyj6Ky1kܧ^_}{ׂ[Cp_glǭр.&dpO"9%b\gNfԇmi\(T:$4:2 4J8 cF^]D]C_[\{K !u#ұ wBݰMUōuMf.ԋE33~4Q4"|Fwj43٨ [Odf(k'o*]e>,_WfzX.o!c`~~t`MSٻLG'噑un_j9WS[&#TBٰz}N^9^?[Պ; ˎAM&BQ"1FB GHř;Xz25H\uM 1K`uxr۩wACCHvXlJ0X6&`; (ۼy>UrѼJs!^b!d{XW r.Pة,gg}+Cu%T6 e.PAOgKɦ8=|ȯ2NFc&HwN g]tF ]DJ`wn]Ha81?>aTrыf@~ʵ-֣Vyu^NoZeaiUPL5.JDP@:VÖRmX PnWex#y| `ҳ Ndtc۞A|+$/uԏxjZ=6ߩD4 )"o- Ʒic~D+}e3?j#lNtѐ6?.ގj;wpTCL\6TiVI4J5v:m4 Ϧ-9;8nsZ)S?Uca3@/bGvFN,EW2;F\X֡7^L'^)Xey'z%V[! Ad>y@X)if H˔"!8NRi 4a| 36"AE|ΜwgV. 
ܨQ))9{"`D-8h$E!JPӹf#&#PBQRiD"1+ t8 s%$hDA4J"A1(bO!JLSu*T2b̾SQ5s,zbW|6d2ev6ł'%JP̉۫R^fȃ$lIs zso [ { bFce~d,ܞV3LKJ $yb^lU Q~#(VTIƸ6G nIm%q*hZS@5YmX>J*f?ܾqG-GfɞVB>so[X0G,C*^>%r~ğKX7e[%z'OX+ 2 Gw''Fy)`2Y dJՊ'XP 81dǩa9rR,7Lgܓgc,_cPcW "β4Sb/$KV9mlxJQ30MmLFDnƖƍ\*pl, 0- ^4197^|Z&[&O՛݌ܲzî-iiQIB2UO~b0g9xwDʤ;B2iM}©V&an+FJ1岻>7)|ݘL.AULX.ynGMHo\"@vާ1!#mnf4'yWk;U@B;ҸL:}u4w|w;Xxp䅐5LB%{MKЃH?k7w_}fRKJShgsb/#Q&˅M}s瘶ɧfe{͔ ڒrK$JM7(`1aQ9}T<?_f2PFJIf!6(;mٌ{+t D\X/ʜ<0ƻStٺ|!4 PeϽmا7&>YWWm:uzӨVNY;v~/ҽc< 8<J8oS+'Si}ea7ЫIIG]Ue`$m V!NA m!gm<&xnc'>x$ܲuA! m5\j -e+h vcɊ6 hQ5(3mt4_=H{WvitJ; C`~р1Jδ1]Xa_p8fTVzgW7x=Ef]ʿHW\ u *Z3,+krF/R8"+1^9,bG f&ٲ Tb쮞fC"qd& I%V)Pژbtu)}Sq!g_)B2)b[L$cF0*$T'&Jq¹(!*`aZ10%+f&r}B<*ĵ %*QN1>>KU \#KWBN:|K7lIqݤ V3Ϫ$k?$N_N %/JvE=DVf%U"+Y. DV*b v~[8k7:i-3y寳{ە/Tl UlXְ~_?ͷ -9'~yAGy֍m4f?=#}},ƒo~|f0pӏvUy: )]z(GW$r*R^I)++;\iQj  6܊j")\Dm,=&tGjYJ)s=KOH=-JRXJxjzRXJ-D_Xz,*4;E.{,*@ ,hrRgG)>-\/RH)ӗydbM,W̧+ķ1 ,3e 3؀d}r… {ē!7|4򍕉{|Z{_ʰ㫳)(RXñK;E Kʒ2B [[Ȱ8+˒]pYjl;e9Q@;Wput@v@Nɑ&U[JVõ鿦MK`фE 7|j콦7[}덤疱={'ΥZy7&w鼩:)> |7F&k,?9ohM!ɮ?u!?MxA'}ڟ}SHڰn"gqǑsS4g+sރb}T[t`/C3U7:Xs7ڋ-Yh`uRh3}}~!Rg~vZs/^^s`pD|.|áA5ABT1xD A@=:9jZ%^wi]ET=!DwZG0 te ݌}ȟTTTP۷}u~8t8aFWxQW}Ԍ%.ógxOmӤ|2su;ۆ,[N1TWL?6 ^`ᕄFk20_hTKkRWjŔXkG՞գF#ECƴQ>Na!$S(AB0VJh,/*$TE;$ʇ1:FU!^H3`|08;(aQDrX4'09M)hs氞 ({-H)gf v)܉=]_JHD#v\b%~vr%jm$)nh&D3alX#d (rDI RİT@T:S)Y뵏5sǹҀ6Q95ٽ#҇|2A3|<%w_3;[Lvaҏ~xsAߐ}r }&Q.ӷG7oF8gMW^]g:>9ALn>P _ֽ֚wlRj}PtN3`z~ҍzf0Y`:F&>Q8I=-JaY0r}%Bh/XzLiQj硜%FA.-zKK=-J-W^"K c)o.KI=-J-36YJIK4f)a,e_^ 2Tj/l@y_J-e)}TKs[Dx&p-f]+=汴`sBk sS$9YoWUqn;`'š й0 ,6dMFczuG =P^{ `J@Ppl0WDn|+`Z#;V1+.XaX eF;+p{$z(}3e.`G!Ó%C$ʝ]vϘu2o]2!CeY/ ; m<\Ao^/;lE$ByE0WÄ 'с샘P]_]8"￷,{/o7O=ᲀu\M?|pk4J7{1YgN_AGj^{j^{j^{j^穹~AݍaZP|:đVjU\'V˜Y iđ$X C kQvmnJ zCy@g̵UhMy͒ʍfDǏ*Hͩ-ũ>6Tm!#YH-882V&(*L\9M`z͵ۻļAUT@s]URLl;DNչ?P u:IQvTQ3C D$&D E d84!=#zp-ń%B2,ajRݔ)#E2T:EL}x&MR"%z+ů&!FD'&}!Cck}#>N})%eDzH(}X)U}0K(p7eF^ʕ4EQ-v*35/"(盫5+qH_E?fGag?dq FWDάIJ4:X6`KTUuuuwFEx{<(z|-(W3ȪR{4ĴJm4 :wOadF9/68 Mjx♧t(Jrd,."=6?apwη$L|7W]sqN:8nU2Y2~B]jfyg?T`6|11 
OyC߇et>bkܪۣh6|؝onJ/>%QhS/T<;>GOn2nQ]Xز2jk1|-\"*kD-YIq=W!}g~7<[J$^/"wQLjOΏ`n;^a8$꺓_!hU vryIw.ɒsvx#pL ä| \A("AguVqM\d]˜.DM}{qg$ gθ+m5X!y4,_ *"-uomʏ~{kC9PqŚ?+U=Q̈l{ktVi!g,m@۽:so-$}0st˧[fԥΪqcQ͹,Iƹ[spRqh+kro?Gpoo\tXջ6j ?{kQ%' L㌜,5ru͓|hEgpDNV C!bsb(xҬ/NFBt)0Քl\TΥ^j>cA/BrWI?7f ؜?}GiUSs(o9XHT!!߸)s:S?n\6h؈Nwnduk)-[h):vd+b`#:eߑGnfxXcڭ|i/KV|"ZXЊw[ {Z0TD>vvQ~!]76y݃YM<)7aD.쵪)?f(x3rJWkL6.G3/\(1\EXC6ċy۲%]'N.Wo׊*/Wo։ni|CN8ƕG5uZ8K9PSQn4؄0E>;Ҥ$cM@(>ۑ99GBY+Hgp~n+]5="גRLg< {2qbyJVT=D6Մ#t{pYQ)5zFݑ6Ha﷩8S"(I)HVH!﷩H\(;)%P BJQ}MT\]b;)~oMFD'r%rӃ~̧AOadp"J+.@.X#~<]wٮ7<8oj!L6Ȑ~ز:2V9l;sbўw7$YH,'j'~yJ躸rY 4q =X:ȿ Q}0WɁl\L]eHJ+DTRF3d+_%=b[+Z;>e =c-xqQ * W$/8~m[wzGY(wy~MU1cH% 2qh8AkߡSsD{gZR sb 1 JЀLKWR*F*2{bB>҇BDd\e;_@kurҖ\ Ñ%pk0WZiDžK\%hl`:.H,p )WJ—j_C 2_ҳ%!`M~Ǹ.fF,kݧ#3LjIWں-bRsnI2$" Ȫ=[6RR#p?܏y8622^w+?gvr8Tw<,F#]6oL$B=T$:#'~H'=a dKLdWi d2<,ĎB #aDkg.P%hM%rB'9pF6|:<6^w&WZL}g=>_WN9:`Ba Pnba&RPIn1Tn18Boon/60-w1$D(J܇ᲈf/,pwFbW}<5tp5l-"3U0IʋZ)7Hh ȷ)`zjmfqE4=vk&h-[M*@{ a9K !=(K~6,Ii‹-DvTЪDWh񱤆e2VJ塚aisCTIeo7Q 1)$ gg`ktN(-Wu5>η jac=D 6nf<Nv0[~`6'JlՇ(خ|ݠFgVP{O_pQ8RFOX؍@oz˩5^䡴_a1+m%lQHzY l*P}G"駛 OjHzyDzHzh)Rss&iv+}Gv:ƴ[yw[KV|"ZJn ZϵX?v+}GvUT\4FYj*$2kI:Uyj) L&NuVA,TR9>:p}8> i`%ar*P7rdܱiM RMk3?voG2;JĸҼ\ Ĵl;-L-rf `J.`b)f>8 ,# U?,8 Lo$wU@K.!K'{oZp:0n>5W7W8Q& >v?:|*C/c+'gڝvEMI"]o㭟t rXn1C: +s7VtINl>djL' L<]Ʊql W+eDv%f7+K"qжyv:an0{fZܲ'V(t/?ΐe\X6sX9!Mv2)%ytɶwqbrŀ74JVo;GvQq VcgVl?O-AT̋'ᚲ`_q9 \j!\?ApZJ_661”&ecj]Wq˖EἮI?"\zٌ(&Z m?'{I#{3D7kg[#8θoVS2=TIW/_.J˖'\yr!:7jP7J,֗UAh)+CwY g f+ke[y? ]uYˎ +0CH\X,zHr*uo3HFl|aM@2O+WSiףg`b1K\t0Ardk8aڝZTq ,q-ò_:7SRю]%m<*w uh80^l|ҝѨ.sh]wUY ႉ4!.&0XMGhuw+n]ǐt d{OX830\~ ¿8sOFs_~z}~y5oa:=|~K d g KseƸW98п[?ro ;o|:s>[2-}p_=4 s׃{[FLlվ@ ݀}@ з*\S$l=o {Ox~IHƯ1&54/7I^B0go% DЬv 3IʊTHwcD yDT\Zӝf{!6DQ*TWpєkW,+8/䎒o?p'hSwBlWƪzSS }C0}1Am]ҙ94]ۗ^%@F> /__Fϫ7݇0K>dͧ'É7f. C7-v^2i-oJxb6{1OWFt,ğ_~M?c>/߀o*[}iØq"4:wxB܌$9e1bHHLqBA0*vm&߿6gYسδ]LJN 㚫=fKMXAKO`%r^$hv%в&j+XfZRX- .be ϔ/eYR6 x}e v#*'iSTiNEF? 
eQ1c3afRڪT}:խi8{A߶dER < #,=%p6YmQ.v|,2\޵8n#F6ߏMvǗAIT7nې홝=~$ے-)S='@ݦ%Vb % )A"Tr*m}xDEQh6>:]oqm0&RHΈsׅ ۹30XY=%ec@1tbjg JJY&[y[P$Nper ;nd.hal}mObzܮO˅-`s`n͔7?>V">_x'붭5=XEAuݦ#VǮ֒p`22ټTN1>D.bA  , |sod% |G/(4<&M!(ʥJJP]ApEzS|H6lAjW{*7ԣjŻ "vz5U8Ud:8.y|0ÀGl 5䮗jd{z`^3itZژJ‘>sĂAJˆ\ zeSRoyg`țrT/װ̆;ΚWY^'16[#HRٚA)*breHUfyBƚ(`pSq#̿}].:(6qdD`_|.$I$Ӹ`ٔ#YU_<c\' (nT 3'a14wLwl A0(R &ssT{DRܺ[/ lH_`8!׎P^ {.GڑYmq9ZmI5%*p7"l vU0fJQV7e]t۠oQ[p`^sIb&JK:>\H'tkTVw`TSHe[oc+ꌿW`^!9/u-8r20{ Gh՟Z+yF*8hQuh7V}Gdv)?LtmFz m5Ӯ;[:wت |dۘ Ƭ uѽX?rUD1Ƭ<8r/vXWlԹ6z}@̛ArE-C%+y#_'\X +[cl` w߳*?ucs9-O')x=*yFv3RrdlO!)2&I,<>hf7TGXTaVL:WG˩Y_pS}[`ly.`[>h[ Ig!dbVtB׼:qN息EǵAFP" KF}C~u!hxJSyWo@ddWxoI^}w [ܛV3֋<]|~ 8'. %X}) Pb!*13̊WR$˱Ż)5gS'Q)v::٘Br;J4< *GDrZyhBqHBB`p$$(pCPhis3DYCLl\VâTpLiC&#mvLnX 8>'3:ULKg-tƾ) `Kil]WKXlrq֞%D.Ү[C/|``_Ҍ**q}mr\i=F &髽5W¤CUz.*YEOW^߲+P2zU{ uӶ}v4T77B64X)Bܩ3!c%noCьi lj.嗷3B/|9/;h%`9L' ZiCU7ߗ*.V!{91=6hjvD]wx{ym^;BLpzMm̀͛ s𕙒*Y)^^fY6J[BBiKăNcVJ72B/䎠^]2GJFRux;G?/|݉ ܓZs^fNcvKB$D`l㱖1RD@*o:&TEBf\I# 6J*H΃8K3g-@O5UEh7n{WsiP[Wf fRBٕ8-.LZj8?j1) ?ʟ_!<ׇrX*-Umj+5YpRJG]2sy~9*Eٜwo-F:߼gmr4{6h8s&( 3 rN7.σ⥚ym;LI v;2>\Zror/?ʥU9m{ٛ.\pF2I JƉI c  9J2J  cvh7 Vv%I{:Q&LO5Z*j-CSz '!Uiq[n޺fpvjm2i&RqFj@!:-{#mu 6w3;t&O$Qib8:QԲK$|*1]9qH+KȿB 0әʺCs~@S{c9YPV0|w<[wx->qxa!EzH*[Fb|aw ߵOC3aqe'Ju@~|I #>䑋R1n_aۯ^MloضY6r5ECd$aaDۜTCndŒ3f L%8Ilౡ~{#rR[W^4< bIbL @8J!7KKQhY0XbíȨ<*Xhs N%Nx3K8" ֢aRJq!h0 B:$[;f38g!%RJc2z#!x\ !µNDmV P v"Xbj06LG"ᩐDDNdU( "2h00X00g1!:'Y'WPiٌţE_W}n7ؽ]J1zcCK|6C26͓nB=|>96y%ߤ:ɳyz^[/37CUtjdYxۃ`hkp0U_0=xQ2H!y<V< ,! 
bIshڇ2%t|؆"-҄ C 7 B62aUlpƎGDPÐ IeDHc8<@|U"N(PpQHQJlwch@@Co4>7k#[m@`D%5\btvkAB"zL ,ɔ|bҼlSH127~V'C讁Q}Ig_v\5g\X@cPI8m\3N`^-֫F>?V">_x'<Ot V&fNK2y,gȆpfhSԸkY45G"c P(4;)N e$>Y}r1znsB6[4\fAFEz@ I~MzT X'1CQah>  1@ B*3HUX3fv6;݂Rz5UlYӊ(^,Ym``瑬W%j[8V1YMAt  `(b~cX߆vaбl}4CMo.IGhن>Fm9{_/V"^|qx1到6R͂~_4bx3ЮW8qTxQZM`dcPD&JS,)LJBW" .QBy(9(d65@3/>Y#QjqSد\a B=~̃r̙UZoZYz1O68L6ܱ*O'x^{;wOTdj_r4Y Lm~K3m\/WYy: ΢퇐4M9mvm<+ P浯J>̷5i읈X8,^pp P'fl (Ez1u3ă.>F^'iQ?8Ns#Ij^$M|iqMAbv $Cyu{HuP(0VGaPMJ.oh2.`4:Бw\H\S3"Nt2\Dn*)R$A# BTa7[,Q!d]о5# EV/]ХfF$QQ7zGD(a$!vL((l>,GBXC4̑L#m5bLL̩Xe qbd?{HJc^A8 lyZ&N;,/eNT%=ĥGR)RZmS`4 }J#-@% LXUw`b\p6;cJ$rd EU(վdHiB?&j~b쮬-lܔj kyƦ/ɯ[]{WVve`pSƉ\iaںxH㑿-񌝐k;+ںx012ja{te}Ў%{7 ''T E^ƍK"2X?]O^/B/~bkn>H=7f۲9<7 [VDL 7 \A!-醯 ㄷ脏]$}ny_ݾZ&Րo&[v /Kّ4Nk׹kf]XqCȎwNV)hs9V;jgI}ȴ%RpGँ5@ʎXL _+8/_UV+)rw~Ond^v8\yTxBC^y|/MJ3AMA= -MKTm-Zj':$o99F"mVi;R0 qIU6J468tѫr Vˑ>֩{oq٦Fa'uWGQ Ωa-UyTGV<֩hOZT` 3-QhsTXj<=q]͊P 5Iϰ~ #|-7nL9)A#U1zK"*Akx9s" 'A* Lb= ^s)N02XjCWw[< HYQZz,յ$jy멖v-uRp7ٕםs=s/JCɎ4qz2SonP:~daR-~r+R[dRA Ĉq^Phl䔜V.CɖbN1YX'bXHgAkg9U xe >INDS/flc-j `e S7KҌ6?bK_k4/uڨod˟w_tMgZ:f Ep6ǵRy-HztxM%nݜFE")hx5JuSӴZ|Zԓ홣Ms?91suRl[ǮnژCi0@GRj͇ra(r4rӹlMjTԶ.W4}yM}ă}W72%=zA'4nznnZ}{ﺻzHPR8zu^F\ԦcU7IFjBWW^68<B\BwTA; p>K.'jK. ssꦞa%5.Ld몛K?\:(ip~51ЯTԃx:`қtP6{d0mgApPpI;^轓ZV)~=&ylz/6as2Ѱ AF~+@aɪCTNKz5@#m6ޒҒ>"^-CW`vxh*eޡdA8-ij>UKxPCTXNS +饥4MlDϙV>ɤ\eWg%h/>g<vKx33\d 2medޘQsaJ^dM)L^U7BP%9N{,PC 8E@ ,H-[߄˭^p\bbaq'̰xkEV fFA&gR`wR*Ӵ;PF @qY}PnUVˋE1AAWȮ|^\\Qs9G* E(7 F `1$g؟gIwƅ9 h5@߱ tܫQEǹx@ǐ*Hˑ^]-n)4f9]f)__^;\\;\/nf9 CO g8;:jk࡜1JNQ8̵fwf%78f2*LJc 1IKF|1Bw}s;_,`@iH\EZ<ƑH;m5ɃjdyrZ.y+TZ0Ɏ8i0. %N;O`LQ kf<(vV].)nLR(Bߒ3ґs 1NןM8kՑ`#3P@̅Ѯ}?~/幔1BqHO%1܃M_ϳ\72#.}X+ m0<d& M~"d-K6kkOml}-Qs_w'FڝẼtrI&׾vUBNBB>)Z|F \DG=5Hp&}sj56Q3sAPI+4C:2<*y Y?`.:⛛I Q(̚b3LbSeTB ?=-Jr9aޚ?\̊ 6? 
e"qI 萝:PKgwrO5 7Fz$YHFeN, ILxf±BܺP*i)#E*jD *f UJ ;vMxGbd%͢3 OrOY| 6y&hC9 4P%~GkYuw yaRy(op.F &meVzCp*P(.Jb((R-^o1{eoПMhF cn_0چ?F9в{h=X#IJm$5Ǐ9/Ə^Zzh%@)JiY3(2Xw?v\VA# kl3XHcYx-rxm.kejy"*JobnE 뀁 @G/z 7 mh P3=%owe}GgP\Ã%j۞;;hɚ$x2FQ0%Ćd<}C Za' S$+B&c9*o(#Ē6,:,$G{ue= rMr y'oT-xXm}5>@Xnemw{o~;-ZUxwi?JKp6 ˮ!" c) uvXx1F<&)F>,͒ps*ԆRQ$+ @>F-c8ѓu?y=y ]C{޷9uFGnƫp:MkR.EpL8fc $C#L%X1jxˮ(`j^ :Gvy`X SW.cYiSʔ4e`U.Мs댧xn-yq|+!hߙxp>e%t)s$B8.\88}oz"{Wu%0 w;bfXVޠ{ w޼Ϸ4n5^ VIΒŒQ4"L52*yq=]R6-% 3~zsӯT˾E(?o\=,dhԹ5ο'Kq г0tVzmc404, F &g\^4os(h81}c4-4'])k[]lI㘂A͙s}%Y!YY/FP xuqtgJ!:JQ9ITD@Miu='YS ӫW@֌=oQ[.3rsh9l0Ѿ)YN][ Z+VNAk0it'~5(5\;]xUBﲍF1mx_8ͦgY\?zqo8w&I@^4T8Oγ-а3T2꾞VKHiMR~d2OY]FjNnIYmQ;S=}VW=^uq23} V?8q,άO5׶z;NWflK;鼘]mQc݄ uI05VV<*&ˀz1oSߥnve%ޜv @YZ)ru2uZ߮S6sFV6,M|g{\1Le3u=d!?vW|,~ ޭ Ngn/]mUһugnDpMn]et>w{* u*ϻuԻa!?>٦8~O]KnAK>,)[ )fbc$DB -G{Re}7Yvi_EV}$AvA1iaƝ%#5]ڼ-U%6Iĭl1\\_<+o[n毋wGGo;]hkptorv p ϥ;"*PLy6j~ѰO0L}d>#)h/j^Yʤp[8Lq]e#pFM} +np"h<Ҏ2DZ2BߵL@jw|>Mlyu8eT3@RУ 2$aȭT+9{|1 2%0(G2Ka1r.LK+t~`|4tQeK ^T 9D^1﶐iC06D+!&Li"<[K)nCn UL9B^: ;q Nް4 C;#l2Ѧ)ɐ/'R,8r!]CY[B'k7Z@RUe&ҰRʄe2T9SڨՍJFY!T劬RY~TD5%.Dx2{/@P7:xz:>c W}^v[K%oԆH M RĹd8)hf=:pofZHr,{čcmEx?YTFs0Cp)?, <XW<*]YrKD(tN"k$IX#U1[1]pMSt\fyznzloml[BêZFQݟV**?BIe^w'*CvY9'OP2qѣk 2;O/Yͫ|%dS*ML;è;RZAb0tX ޞ%_5bX>'lHzN y.2ݮ.JB{^cI»74/W#.xAIKe甗eflB~@ Ksz-.+p{UbsUVЉp4"n/HbS98mqGߐZsWz8􋱋6{cKVRrLqGKÔHJV;y̌Y9P>MX|h<_* w4f)*b;'</P2LGcţ@PE-pޣ| , d݌×B4je+1‚; '$3it6a X+Mq`lW"lb+9Ni /IXƄK5-43?nJV4a[hs% 8 R OBA ֥8?{F%CлG[dp]f hB ZMWI,ڼx W0U*<0B"oS*Ffg‚7.1ɴM;"N y uMD=2 *t€yJ/QXMNQ޲J4~0^꫑;L8M-a~4TR(ē,){FF磤5G2L11ŊK;X.k!GtP2r١0KXV]+Ǹn՚Bs,9/Zjѳ~ƅǤnmG?=mȆڐ1V|*AUPkA`9fk&ִWy.LzZ+R+D)xTsZh39H^4q$j_< z=O)qlRh}$Ul6)YB~Hd15>,M}v0]D mj^?$!Z@19%̑lrS&<6bhα0TXHKBYzUL1{4} ]O>hi꼌m 9![o˗ Q_%mH5&`RCNLxPJҜ{e `T.&}IxFDEpH$ RV5SֽJ)̷#7OE ,T+sPny&ziƗw+xC7$Ry?.^9 jӿ_C%u|}rr|nwӦQ)J}pJNw -J/7W9"o5UZiѨJ =1mS<<[tW6jv)J4{RG:sxcr7iogrC/\!i"G0}a7LxCz\{uL;_ {د?ozEg'W-K՛o_zopu{pF䗵XlYҳب+N,kۼ=2Ͳ},k#8s*숷/7gg/9]mkXCsa_. 
Z1d7yik鎾s=Q@[4k4 ;+LU6@dV(XYNAbN.i,xsK8:%_Je,+M뮾YVnf=fYtF%ȝeoeFzQogY՘q(8+2O;zt_H:oR85ɘ/nM0h4)~4zR= J*VJ#\{uzLYRq#9ZG%FtTKktY[bVKXHP.ilV^?3h: L|hu &B:yu <&UM /0Q?u솪Jkg5ĘM.>WiK+\"xIGGׇR<\qKIـ\l]6{ACHtVhڭ<鎞.l; <-R; qkvCVRgCx/ؽoz)J쬤>້HS^\B~vpF.p^c{??f0[e׃OR?N ˽VR>{x\ܬmnW./bF[do|~͓"ϯOU :>d1C1ˋyniw,z^6lrN}MSp|toO$kDձ.Tbo{mE'~X豤hR)j +}o{mW+o+V*'`;Pi)ygPCDmbpl],e1H$eL:06YSIJ5 ֋8eiξ}Y'aLQnuVȟǓaVU_oۆJ/a@ٶJVSVT-(e!eO3 nuI'K?><_ Te[i^ACQe/`gx) pu. M*5v&#8/E_bH`ԍ0$孫LۦaDSȜkccȥ%DBS6@ d-'C䘍%Jyc)=i'Ҁ|[Nm;^H_S݌CH $2DB?}'frfD42re@E%aSD2ljdL#wO}n5hP#%;av{m{O@KSwri1# E%sɴT7ULOm՛]ygiY/2|&:Ȧw3w ŠtwbC:]*UN6KwBs}M!xxwa LS2--1: R1L+rU>vHg0Ոإxr: ZhcM, (S]AtAA+.`P4s-n<%3}lҳkVearQJQx+fCi@@P;r]s! ^XN3N1QzzXX_ Oؗ.P܋Ӗ b\(KSt)KziRV#H]*PR |y];+`GVHFHbI#qS.UJ $Z5ˡ9!pWñ:e l*h"ԀߓILyʱR6V7Rvm3kק4E1 Y*j*id&{ðH$X99|sb̵do$_V+t޿d+V$'bN Ť ZB>:W ޳g׳iTIq IV4rLHY4_h1׳6 n2T$BIܙ `8 s8Сq0|Rϸ}m56ZD)1./)P#iI&Q7R%s /~noab5Z[ɰso*-Mq"iqgАZHeH1袤=(?OT m*8+-١zv bh @խY!#ھhxe~qC?6 Jo`e&.A&$9!BICqZ–wSI2+,E4C+Ͻst70*qs 4ИP/Ym"#KJA@c#h<k+!gPrhHYGI,ɝu.!o^+KqO"H)//TJZF& JLT 5 =J+,s @(J9zbX^P[!'Cv,2ʹ1׬st8ɝ2q{4+1QL$hH8/O͇83ES>ndɕ詥,etE,^=b˼X 3peW &2/G8nρx2hǂmPkt ?ôTWb<2TڲRj%&!;)Fm$+fw)wW A[ ng'L f(-y USEHJr$H"W]]]rm󻨅W[zAWB@tRӗuR/]qa&]$zAB R%ZHl1Ybэ K 0's VV2AOR5̨-%/g '/E^堫mŕ~^+ӽHқ,5MDJ XgA.=.$.~vR"@|أ,*3&&MzQL'0)Vꊲ8V))G0WoCv~r+u괐.5)%N'C,a61޲h"2\L-O>Z4K=;nIx.H\qs7՟zXyirp4I<ՙ<#2ٵEā6USDkKBz.Nٳ2KTFD>E{" Cݦ/waҐlxr _M֩.$>P3M4qsdlj܄10%e|Mx|2+YyKzƃec)[Y2.CB[,')7~?ZCj)Su}syfQ'!\_P 9ZR$yJbw h7% bCB&F7cyŎ ԟ7TrxDfv^ @t }}6zFMݨtjBJ&zΗqU'՝QR0W4*%ՊR#&}!QEmd'0v]CA[Pm})Px^/;Ubdh y} b`:`\5,T$kcuZG:b% B,Cb%qBC)w4."w:t3E4UaH!GP¯;udUyXr,4-zM.WgQ_ܓ#?k$h{6*KT뮑Ɇ6Š}9ƖI\cdWI*ɾg j@\4Yv})?hZ(:.w5}PѢͧTo:dϕPQ!{Ts )Bp %JyvV*Rm5/hE! 6BC-"s 3@ +<Wʇ9r8P;Fq؍{!%BYu-i\oZ8j|hD]~<&l7SΗ`` -j`ZZ>:+rxIBְXȮKEԗ8"CۙոF{T(|G|; xS.E $Mi 9N޾^L7w8ZMJ%@4SDPBMR2KToaTR80 [ 萬eBՃts@Hga $npZ8h} #UM!'c2>k!YF&8[?+#߂c^*Hs=xiı' %K]gq?'ZO}>n*tOl`àv_Kh4LCFHp _~t| Vky7F3[z\s`dCɏ]~[-j/;ér7n[V Z*=q?! 
g!yN܉=E\?Qc/Y$ˤ0."tj-#lRBIr,:&w,Np#NgL9|gfWUEH!L@ĈO-Enτ&eK4\|!2zL FoE<ԑR*a~!gW= (Dju\|:s(*H{]c.>[#Gɻ`r$&tiöU!? ^g`o91bTB0A]jtRlƽ: Pj`u6FU)f'W=$Թ@\,DO%zSKM&%PzqUJ^]"2Sb6?M6c;'@᷹?sFi4j+leJeYe*c~{,g rYOQjfd:np 1n7<=.\nҔsL .g^e9h ןZ 7M;\*L*p@S=pe!㥤F2Ň+\r04ԇ 8HCn0#J7ćq~¡qB86úLk)"Zl(q!gWoǴ.?tZұԵVe,#8HiA灏j:>^&B[x*R*4~ }Lt2 f6͍Z RdKfb /sAHDucPNo`DA>GOz0 YF b̦@Ha û|R(H."Y&RTQKn}5fО<"4tاo)_-đ~5ڍо1qedAPnH3,=0! OѦ m]G\1+%kSn E#{ۋ-mw\{R33r"DN]eeDi1vqtjWZ<`@ITۡi)b -Nv; &Ew5[uUђJhGۜ}pD͊"OlXwp 8|ԃ:WIVp*/;, DJ>u?Bj\>ohZJ mҪkZPbe,$TPG,f)ph;l5fV|-eQ7-+X@OU S?ESR%0"[ѭh$L ȃj~.%@OwiR\n=|&ed}IRnKx) \HBXMco6-/[hֻwf)y78\YI!翆dC:,9@)Y؀H^MTܽ߄:x@~~x~(rBʇoe<[XQp|:ʱ;9.0fKI}W$t;^Mjqf\F9ģ pi:A5Ɏwl+-dWyzzbTSǮApI~T~ 2Lٸ?orI3 Ocp8p'G4?5ٻ޶v$W)Q$M7n,zsndK)J} s$+dI!ůXV#Q})uE[NL%udixD<:!蜄I].'EK>򴺢ힾ܈Ng&j쯆X,&si΄׺!“'jRr>r5Qdn^/9&E:)TT#Hr (^=pPPd^0upR 4Ӭ~MX4ܜJbxxUSQڙ)j88mtɠkKpcuw8iBXa-7F@w,G[T,TZzxnAqjaL;{յ N"zZ~'evxI4_/׮SBH%MaΎ\/=_oh X[z?z?ZpďR%9^DJ߁ݍVgJTFva=S? .p)|_OO,th!7|8\ٵ"nKMlcHlMc~B^)/g&O*'ɓIɻr:g$#S9B).+nEJz ET3GnJٺ_l_sj6N~};ѩ^KO؝|CL~zZ47Z CX?8"[r/PK. =pn|\n\B@XEyVoq=af2SRk^z-u ribWCʄ ^L 0uVݡi=WֳY5UnzLqKB { ?y2fĹ|ۼܮ+^nÊtO<=i_6UmRF>nn}?~u]5}|ߥT8[?卾rW W '+_; > /-!'dJ΁wR; Q(*q=5"@1a.G)F0e~MtV_oJه~R~_;nu{UI3,t`{^s jeEs/;.Z(r4%zH9I9) L̿(?덭GS˜Xa›znnMQMS 9oЙi צ:ToZÌI\^ޔ *B͊b  ȣ2-N'}F_#)@0?n"Fa`29֘0NDR vkΩgEҟ>'+  TBdNb`Bv!P5Z9ZUYer,xphYDWT%\0^шW,xUW%}3^4W#̅!J?C6Rq9drsdb:&Su+³z#%%K֞Hvj0K‚1?ɉ+Lv8(NL7>0FX ('|*<"9Z"3y,x6ZMbȺ,eRb<N<94BaJFJIhОRU@ޛ2.jbҖ\6'eQ=Tq%}Ư1zeiXp)m7[ޗFrѹa]m4,ߓ{r[p!g<ȨEb,$Y2DY$ƻ. 
5U#@58|~IҠSpRӔ!"DU$kM[fTa&+ ^#cQءӰA@LAJᴾf$xu~H䑳9yQ)LLdzAGwޗBZ(4jtޥՇ>x4%S65ZȒI,QP-Eq=M V_`1)LbCq*P(.J^@]xT;/DGxzEdsfhU .Z,T( EpH.˖!2$8 EB撼4m",CΕ?ivYcEGN%ܚF;򳰈MYyOQH|g)^ luR wwLI o sf0[YRȦ88몋6."8[q5I.ʤ)P pU;Ɖuh`ǟ[9rVK7DWwҜiK7m\LH*4G!IgœBό!ˤ$K?L'H'="+2>у@Il ̂pdx?냶mp蕀bx-Nw8 uv$Ҋ71/=m/ZcÈ E f%'IM{i%}0>v8(:41#.4 ]~%ճ\벴XD %hÍLpm3EO(.0~ $ bMR3a #AI2A0`wB * Z QT+Bԑ%T_\V}=VL/gb26+۶Y~;j#}oW.O<+WCX EP2b bbt-$VdB 1[{INMnuLvۚD!G *ƏMдDO&.uu~'(&Dxљx( Ehǂ},At z1\*'G@~/d"fc+z}/uZdf^y]Ҧ={*2wCH,YLM:ARH` jz&%AM1F|]oGW~mC8XEpňK@CM 9 _ E )Rr2$g]U]Mq0"];_*9uq "ѩ0+t*AH*`=q#t{S\؂@آ 8r)"~jyf8luT`8CGW΃Y*Tp4UJkBܙ6lq8OMv$͌O=ءL2S[+}vP,'DS:Y(} l mb'E\Cڈ%UM ֤}$<-}SF 05 6M!ߍzbJPN"Gqx:5c7԰!JH591.56LU`k뀗'>=|Ǔ/,C#j˄sL5P n02xxkg[5\3Wߎ$K2#O]ZaGWH|BiRF9[x:XN0i'<;&Lx4U l,RF!a>JaR - lEN%sBNdhXgJ剠/0 \Œ0D;ۥ|XxG*\AnyH,bx7Vl&-$?,99O\“v{^-hڝݶ_7Ѽb]f?o7Qdy/[/sprԽy /Qo޾糖Lz׃2~`<(g+ !W7tsuɱ3 *:Ȫ<߿ %Jًg^};o#|W?/o e sI,z~xXPXbhocfA' R"&׎kٕweJ۱Ńl J3{ x|ְpRDT r]D9P]b?3IP0UԚ>^$ej~ӸKo1bcTE\0)7Wr?HymK\>YI!OCh-*ݤsc&?úAթ6xT6n*Z&4+WZ:U;Uu$[]TQ}lcݎ3av0VukBCr-?Gs%⨉J[#JJ]ǟ~=xF6 ~}`~Ć.h4Ռ'!6\=Q$*N].fg:2a%ȎݠN Hq2bJ)Bٞ")]u,,@ G"iYq6ݱ!\Ȯ"S0dD)~,֛x<0Ъ8(O[> -#j0Vnp|w`l+ M^5CQQRfDqy@̻Ъx֌΄W3iy0 kq W bxǓ+%Te%1B,pƤ䊇m/6 $.I@7|͉lDuULF1iz \W ؜+Az5jCʬϫkF6),jGSK*< bt *@HD<1}&NGxt;R:{S z٬J gePST'B*Y.Abm ^* ʩFb-SZԥЪk9*Kr2#K EJy+|HH.W`2X_CR8Gco^nS&YBJpV۫0Ċ1s+=PC,7֛0W:+'l;Ufbœ˰]Dw$+X˾Fo[뤢jR6u7*Mj ƶ,P˵*2u|4G}<W3rw(e4}b4T'%J o9`0-GS 5)-؍S>zz,H(:[_TԬtVR^% ?UP>~SAv.;020E%zSl%Qb{Nym^+ŧ.[d2Ä]K62nL+̌m5 G8pỏSR0b*Wu>zyod)\SZĕVᾋSRh\]4 sDpغv); )Dǿͥ `s7#]Dӊ b[t#FİOU*Yv+=ӹ =xnlEF;} f%LH?rRW!è,{?$ݼ[FvQl]Š~i?+v}ӱ2F⿪3O>o= aous`+vA O濗eE;G|K '52IܑV7.ryen^8]l0[L]<6Hg0iRd:bRt-T6ZZ$F)djb*-R)d{cgiHyZ٠)DYabm6@P6TD! cԎIQ I6.X A_{=kvJvȠDPR|Ň. ND & D)"1}J+s۞(|z&crt2e.d)=f A/US9 žw>o$RƝUbow8/qǡ%)!4Q̃0jMObo_>M<޹@]vd H$ \Ĥ\%PhX(`vT a̞Վd$pR&0%,8(kxJ(F*gs? ǽ7=)XD~])ҲDxg'ِ DRCaW< |x8 /} F?e,Ű\& 'ւS##CSԓxW7c$4渒*nwem$IT^aeǻb1PSu6IjϟHJiDUU2) -X_QqiE[Eme笅`.o5㟕?˫gL>F~դ96ie;N#N"O3&?9"\0EWd}忑 K TYSh`,ޒ7>c0j(K sYr ıd ք*)>jCc4Y#5(@s um#ޠkusMdtC)#-3m_$gL+EU& XWO˫g=g[!OvdS =6*|vl$nqW.2? 
5Jם,&!5k1`Q{2'9%j`zJ *@:tб8(hif`OLyvr*F./~b'LϮ?,;}RP$Vu }!FϨzFk7/>Є1gQD9Sm,bLq{͖_ۖߖyͿm@C"pl+| T{~2_Fd{FɤHF+>^d!$o8[5賸xI]IZQa4rˁ8_ :ɪ׏E>f'Z7VMP׺ur"7)YF:2~!']9F8䰄.AY|$mL"*zsW^Yd^ ,=yexe_WvzhoӪi~آXֶ!^E(0@O)caD3\v`]oY cEMŗy]$Z_ 9Py1KP$h:~Ə1HCtYwb@?E簊/3Y&OJ%Q]Z YE9i{ON!J/iyOzp澁O,PXfV=_tT4^mO\ϹV k\4J8Pk Q3($}6*%V܎,A˃UV3HJ}o~Խ9Dɜizď9Þ㆖?7k;>@\$nh/j~rAX ]e5$I Evdd~;HZ:O4g WϟOSi7ύ-#%Ix!p?o{Ź2F7Dh"&lȄDfY/LF+$cA׽?\lZ;α45Zo9jå'="pQz5~qmLܠUobWmY>/}9`w,VqPXi<]+<}r*Ƹ}j˂CU}p,Xy7qFԚՠ - ζb~] xV6u۾qO6qJ֚jg߉mKr[>cP*)ݶ7vAB޸L!OoC05q|>_Xzm ,Z/>+ sg?z_M1k5zw?\u|:^#RQl|g@@RrKQr^ XjFQk6$Y.?o#[44N'öݎVš7Ǯ8G)vZ%Ol[NןKpuu& >{"~r:$s]ȊYXA  2=[=ZDY4NajըZ->~ݢu3M5ʡˋoUngI +"Ed+, EY [_}T4x=1JfM8]՗YEƸt91REBl 5[|ItͶodZ3Ek>(n΢GUJ)Q2"ٮ870e-`ĞoZ,:<`BDqEW Q.|dm0T+sm {ר"e7.n;$01G Z9CQy&#W;PlZRw@azfȌآzKEίN7wBdxҥK" AʒA Y{I"[] ci7,˰Z5pXmg m`{l7vW>O cBp)XI'"))ih'3*ǏQ%]7|qX2X3Bo9p^,/o;IGIX @ #YRQ׽&9%N.x)R. Gռ.J~7q6uҼ ZpjwmW!_WR`Rjl0"Bۑ.@B_R=siK² N]4RDzN;bǢ6Om/|N]UhcaFECpǻ<ٷ= R۟=n0)ڼW%)'JμwTjbsGc,ZqP3 s}s~EP S Qɋ 6G7CQR3Rn嵫Lj̬8ÅyAX4T|M+Xbkv7NU[짐UWoxf=p[#*$# O[{&)=o> 1~۪CcİjWsV};r{ϱgv]^c("Əa6U z_g6URkI[󫟮O!9Wj5 Wf;e8rl\n)% gej2}EqmbI\Krm=$^5Wz$"3?g8yn}zhh6|%X&M,LaKCY}[f[ !NR0NZJT} vvWj7(x2,MSqʘVFXj'eNiA:Eg2qՀC Zܖ@p \^`:e6G6ݷ>{B,V<73%70Z0 iEg?)-m@㡔u&D0T+s_`>@ͽal.4@9~$f<àǚšHH%'#EP. ]&.g{,9k !P(MNAGCY* RyJ>ә HUl>Wdl.> ä~]'y 67Aoɠ71sc/}z%FFkv̀‹1Na{?f/L&`y~ЛB _GZ))Lmqh6_> |lԇ7t! י4ȦmD)j3][1`mq"q[xCk+s2 njz<4$ƨTE|`=)A8Ab{bgoX8~?pPB Vl1gWH'<]Q%Yy|fG/od!A.zp`+H =":%6)Me:C9aKeB,hJ٤.kss$r'?SX9DzĔʰהIRfKF5ب|szgdRCx8xʣ5E{TѸ7>>yӛ=43HMB0JA圾c:>K[QtQJyd Y+zǰW ]σd8\37Içϓ0 NZ9BI`~r[iRATUgZ,'H陉ۇy°'ߕ :Z'nZx5 _ϻ3+M xsd\]g('bw4Q[ v?F6#ۗ-0$FL _O{=06}&7\O'_%XJzf2L3 -i0hR VELhDx!qڢߝjp0EV,joh8kZ ("qQR[ 1a`Y` NBB˒ٜ\♹IGp{mp_F!&pm&wCppj 6Czf?(JwFVZEJ L%l4!>KFT`,h C3ZݩX#r՚՝!`PO㢄x/P ? 
Fd\$;L{`3 E` .>WL%*u((:KON 2Uunݸ`Nn/.d h2] A.s4?dS4fKn$"(BpBp$[MDA SN4Z5/0UJ;Us]/8v˃v;@L՞hvk!_8Dw"̑*+[RA7W?~tz{SUEX/{ބ9&.9&bjTt7= ]w6v\լuTX.3LeR9eZ#GI L;}&" c =0R"fVE5ᎏ V) WxD~<]W >ͭ>c< $O_x|ՇJEyY1y-ַh>&&>WR^Xkz-U?S +#Î/ kpQ3EWԛ3wr(7D}fY,R,<-Uh||41Y<YHCfX<>LH2grK9B1Ex Ɣ('dcS KH~FO8Ȋ8s~c/>jjz!3 d5#ODKlL^ 5Њ,D,DGaDKA:qۊ\+||؜\FQ8 8V_bEWHIcG:| W!y.3oG2MRRtyMbc)bpΤ!F{R(6%I Ӕd 3kHaor读Bvar/home/core/zuul-output/logs/kubelet.log0000644000000000000000005211126115136502607017701 0ustar rootrootJan 28 20:39:30 crc systemd[1]: Starting Kubernetes Kubelet... Jan 28 20:39:30 crc restorecon[4681]: Relabeled /var/lib/kubelet/config.json from system_u:object_r:unlabeled_t:s0 to system_u:object_r:container_var_lib_t:s0 Jan 28 20:39:30 crc restorecon[4681]: /var/lib/kubelet/device-plugins not reset as customized by admin to system_u:object_r:container_file_t:s0 Jan 28 20:39:30 crc restorecon[4681]: /var/lib/kubelet/device-plugins/kubelet.sock not reset as customized by admin to system_u:object_r:container_file_t:s0 Jan 28 20:39:30 crc restorecon[4681]: /var/lib/kubelet/pods/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8/volumes/kubernetes.io~configmap/nginx-conf/..2025_02_23_05_40_35.4114275528/nginx.conf not reset as customized by admin to system_u:object_r:container_file_t:s0:c15,c25 Jan 28 20:39:30 crc restorecon[4681]: /var/lib/kubelet/pods/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c15,c25 Jan 28 20:39:30 crc restorecon[4681]: /var/lib/kubelet/pods/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8/containers/networking-console-plugin/22e96971 not reset as customized by admin to system_u:object_r:container_file_t:s0:c15,c25 Jan 28 20:39:30 crc restorecon[4681]: /var/lib/kubelet/pods/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8/containers/networking-console-plugin/21c98286 not reset as customized by admin to system_u:object_r:container_file_t:s0:c15,c25 Jan 28 20:39:30 crc restorecon[4681]: 
/var/lib/kubelet/pods/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8/containers/networking-console-plugin/0f1869e1 not reset as customized by admin to system_u:object_r:container_file_t:s0:c15,c25
Jan 28 20:39:30 crc restorecon[4681]: /var/lib/kubelet/pods/d1b160f5dda77d281dd8e69ec8d817f9/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c215,c682
Jan 28 20:39:30 crc restorecon[4681]: /var/lib/kubelet/pods/d1b160f5dda77d281dd8e69ec8d817f9/containers/setup/46889d52 not reset as customized by admin to system_u:object_r:container_file_t:s0:c225,c458
Jan 28 20:39:30 crc restorecon[4681]: /var/lib/kubelet/pods/d1b160f5dda77d281dd8e69ec8d817f9/containers/setup/5b6a5969 not reset as customized by admin to system_u:object_r:container_file_t:s0:c666,c963
Jan 28 20:39:30 crc restorecon[4681]: /var/lib/kubelet/pods/d1b160f5dda77d281dd8e69ec8d817f9/containers/setup/6c7921f5 not reset as customized by admin to system_u:object_r:container_file_t:s0:c215,c682
Jan 28 20:39:30 crc restorecon[4681]: /var/lib/kubelet/pods/d1b160f5dda77d281dd8e69ec8d817f9/containers/kube-rbac-proxy-crio/4804f443 not reset as customized by admin to system_u:object_r:container_file_t:s0:c225,c458
Jan 28 20:39:30 crc restorecon[4681]: /var/lib/kubelet/pods/d1b160f5dda77d281dd8e69ec8d817f9/containers/kube-rbac-proxy-crio/2a46b283 not reset as customized by admin to system_u:object_r:container_file_t:s0:c225,c458
Jan 28 20:39:30 crc restorecon[4681]: /var/lib/kubelet/pods/d1b160f5dda77d281dd8e69ec8d817f9/containers/kube-rbac-proxy-crio/a6b5573e not reset as customized by admin to system_u:object_r:container_file_t:s0:c225,c458
Jan 28 20:39:30 crc restorecon[4681]: /var/lib/kubelet/pods/d1b160f5dda77d281dd8e69ec8d817f9/containers/kube-rbac-proxy-crio/4f88ee5b not reset as customized by admin to system_u:object_r:container_file_t:s0:c225,c458
Jan 28 20:39:30 crc restorecon[4681]: /var/lib/kubelet/pods/d1b160f5dda77d281dd8e69ec8d817f9/containers/kube-rbac-proxy-crio/5a4eee4b not reset
as customized by admin to system_u:object_r:container_file_t:s0:c666,c963
Jan 28 20:39:30 crc restorecon[4681]: /var/lib/kubelet/pods/d1b160f5dda77d281dd8e69ec8d817f9/containers/kube-rbac-proxy-crio/cd87c521 not reset as customized by admin to system_u:object_r:container_file_t:s0:c215,c682
Jan 28 20:39:30 crc restorecon[4681]: /var/lib/kubelet/pods/c03ee662-fb2f-4fc4-a2c1-af487c19d254/volumes/kubernetes.io~configmap/service-ca-bundle not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c24
Jan 28 20:39:30 crc restorecon[4681]: /var/lib/kubelet/pods/c03ee662-fb2f-4fc4-a2c1-af487c19d254/volumes/kubernetes.io~configmap/service-ca-bundle/..2025_02_23_05_33_42.2574241751 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c24
Jan 28 20:39:30 crc restorecon[4681]: /var/lib/kubelet/pods/c03ee662-fb2f-4fc4-a2c1-af487c19d254/volumes/kubernetes.io~configmap/service-ca-bundle/..2025_02_23_05_33_42.2574241751/service-ca.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c24
Jan 28 20:39:30 crc restorecon[4681]: /var/lib/kubelet/pods/c03ee662-fb2f-4fc4-a2c1-af487c19d254/volumes/kubernetes.io~configmap/service-ca-bundle/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c24
Jan 28 20:39:30 crc restorecon[4681]: /var/lib/kubelet/pods/c03ee662-fb2f-4fc4-a2c1-af487c19d254/volumes/kubernetes.io~configmap/service-ca-bundle/service-ca.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c24
Jan 28 20:39:31 crc restorecon[4681]: /var/lib/kubelet/pods/c03ee662-fb2f-4fc4-a2c1-af487c19d254/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c24
Jan 28 20:39:31 crc restorecon[4681]: /var/lib/kubelet/pods/c03ee662-fb2f-4fc4-a2c1-af487c19d254/containers/router/38602af4 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c24
Jan 28 20:39:31 crc restorecon[4681]:
/var/lib/kubelet/pods/c03ee662-fb2f-4fc4-a2c1-af487c19d254/containers/router/1483b002 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c24
Jan 28 20:39:31 crc restorecon[4681]: /var/lib/kubelet/pods/c03ee662-fb2f-4fc4-a2c1-af487c19d254/containers/router/0346718b not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c24
Jan 28 20:39:31 crc restorecon[4681]: /var/lib/kubelet/pods/c03ee662-fb2f-4fc4-a2c1-af487c19d254/containers/router/d3ed4ada not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c24
Jan 28 20:39:31 crc restorecon[4681]: /var/lib/kubelet/pods/c03ee662-fb2f-4fc4-a2c1-af487c19d254/containers/router/3bb473a5 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c24
Jan 28 20:39:31 crc restorecon[4681]: /var/lib/kubelet/pods/c03ee662-fb2f-4fc4-a2c1-af487c19d254/containers/router/8cd075a9 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c24
Jan 28 20:39:31 crc restorecon[4681]: /var/lib/kubelet/pods/c03ee662-fb2f-4fc4-a2c1-af487c19d254/containers/router/00ab4760 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c24
Jan 28 20:39:31 crc restorecon[4681]: /var/lib/kubelet/pods/c03ee662-fb2f-4fc4-a2c1-af487c19d254/containers/router/54a21c09 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c24
Jan 28 20:39:31 crc restorecon[4681]: /var/lib/kubelet/pods/37a5e44f-9a88-4405-be8a-b645485e7312/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c589,c726
Jan 28 20:39:31 crc restorecon[4681]: /var/lib/kubelet/pods/37a5e44f-9a88-4405-be8a-b645485e7312/containers/network-operator/70478888 not reset as customized by admin to system_u:object_r:container_file_t:s0:c176,c499
Jan 28 20:39:31 crc restorecon[4681]: /var/lib/kubelet/pods/37a5e44f-9a88-4405-be8a-b645485e7312/containers/network-operator/43802770 not reset as customized by admin to
system_u:object_r:container_file_t:s0:c176,c499 Jan 28 20:39:31 crc restorecon[4681]: /var/lib/kubelet/pods/37a5e44f-9a88-4405-be8a-b645485e7312/containers/network-operator/955a0edc not reset as customized by admin to system_u:object_r:container_file_t:s0:c176,c499 Jan 28 20:39:31 crc restorecon[4681]: /var/lib/kubelet/pods/37a5e44f-9a88-4405-be8a-b645485e7312/containers/network-operator/bca2d009 not reset as customized by admin to system_u:object_r:container_file_t:s0:c140,c1009 Jan 28 20:39:31 crc restorecon[4681]: /var/lib/kubelet/pods/37a5e44f-9a88-4405-be8a-b645485e7312/containers/network-operator/b295f9bd not reset as customized by admin to system_u:object_r:container_file_t:s0:c589,c726 Jan 28 20:39:31 crc restorecon[4681]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/volumes/kubernetes.io~configmap/cni-binary-copy not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582 Jan 28 20:39:31 crc restorecon[4681]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/volumes/kubernetes.io~configmap/cni-binary-copy/..2025_02_23_05_21_22.3617465230 not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582 Jan 28 20:39:31 crc restorecon[4681]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/volumes/kubernetes.io~configmap/cni-binary-copy/..2025_02_23_05_21_22.3617465230/cnibincopy.sh not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582 Jan 28 20:39:31 crc restorecon[4681]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/volumes/kubernetes.io~configmap/cni-binary-copy/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582 Jan 28 20:39:31 crc restorecon[4681]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/volumes/kubernetes.io~configmap/cni-binary-copy/cnibincopy.sh not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582 Jan 28 20:39:31 crc restorecon[4681]: 
/var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/volumes/kubernetes.io~configmap/cni-sysctl-allowlist not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582 Jan 28 20:39:31 crc restorecon[4681]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/volumes/kubernetes.io~configmap/cni-sysctl-allowlist/..2025_02_23_05_21_22.2050650026 not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582 Jan 28 20:39:31 crc restorecon[4681]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/volumes/kubernetes.io~configmap/cni-sysctl-allowlist/..2025_02_23_05_21_22.2050650026/allowlist.conf not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582 Jan 28 20:39:31 crc restorecon[4681]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/volumes/kubernetes.io~configmap/cni-sysctl-allowlist/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582 Jan 28 20:39:31 crc restorecon[4681]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/volumes/kubernetes.io~configmap/cni-sysctl-allowlist/allowlist.conf not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582 Jan 28 20:39:31 crc restorecon[4681]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582 Jan 28 20:39:31 crc restorecon[4681]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/egress-router-binary-copy/bc46ea27 not reset as customized by admin to system_u:object_r:container_file_t:s0:c203,c924 Jan 28 20:39:31 crc restorecon[4681]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/egress-router-binary-copy/5731fc1b not reset as customized by admin to system_u:object_r:container_file_t:s0:c138,c778 Jan 28 20:39:31 crc restorecon[4681]: 
/var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/egress-router-binary-copy/5e1b2a3c not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582 Jan 28 20:39:31 crc restorecon[4681]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/cni-plugins/943f0936 not reset as customized by admin to system_u:object_r:container_file_t:s0:c203,c924 Jan 28 20:39:31 crc restorecon[4681]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/cni-plugins/3f764ee4 not reset as customized by admin to system_u:object_r:container_file_t:s0:c138,c778 Jan 28 20:39:31 crc restorecon[4681]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/cni-plugins/8695e3f9 not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582 Jan 28 20:39:31 crc restorecon[4681]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/bond-cni-plugin/aed7aa86 not reset as customized by admin to system_u:object_r:container_file_t:s0:c203,c924 Jan 28 20:39:31 crc restorecon[4681]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/bond-cni-plugin/c64d7448 not reset as customized by admin to system_u:object_r:container_file_t:s0:c138,c778 Jan 28 20:39:31 crc restorecon[4681]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/bond-cni-plugin/0ba16bd2 not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582 Jan 28 20:39:31 crc restorecon[4681]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/routeoverride-cni/207a939f not reset as customized by admin to system_u:object_r:container_file_t:s0:c203,c924 Jan 28 20:39:31 crc restorecon[4681]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/routeoverride-cni/54aa8cdb not reset as customized by admin to system_u:object_r:container_file_t:s0:c138,c778 Jan 28 20:39:31 crc restorecon[4681]: 
/var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/routeoverride-cni/1f5fa595 not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582 Jan 28 20:39:31 crc restorecon[4681]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/whereabouts-cni-bincopy/bf9c8153 not reset as customized by admin to system_u:object_r:container_file_t:s0:c203,c924 Jan 28 20:39:31 crc restorecon[4681]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/whereabouts-cni-bincopy/47fba4ea not reset as customized by admin to system_u:object_r:container_file_t:s0:c138,c778 Jan 28 20:39:31 crc restorecon[4681]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/whereabouts-cni-bincopy/7ae55ce9 not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582 Jan 28 20:39:31 crc restorecon[4681]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/whereabouts-cni/7906a268 not reset as customized by admin to system_u:object_r:container_file_t:s0:c203,c924 Jan 28 20:39:31 crc restorecon[4681]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/whereabouts-cni/ce43fa69 not reset as customized by admin to system_u:object_r:container_file_t:s0:c138,c778 Jan 28 20:39:31 crc restorecon[4681]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/whereabouts-cni/7fc7ea3a not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582 Jan 28 20:39:31 crc restorecon[4681]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/kube-multus-additional-cni-plugins/d8c38b7d not reset as customized by admin to system_u:object_r:container_file_t:s0:c203,c924 Jan 28 20:39:31 crc restorecon[4681]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/kube-multus-additional-cni-plugins/9ef015fb not reset as customized by admin to system_u:object_r:container_file_t:s0:c138,c778 Jan 28 20:39:31 crc 
restorecon[4681]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/kube-multus-additional-cni-plugins/b9db6a41 not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582 Jan 28 20:39:31 crc restorecon[4681]: /var/lib/kubelet/pods/5b88f790-22fa-440e-b583-365168c0b23d/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c432,c991 Jan 28 20:39:31 crc restorecon[4681]: /var/lib/kubelet/pods/5b88f790-22fa-440e-b583-365168c0b23d/containers/network-metrics-daemon/b1733d79 not reset as customized by admin to system_u:object_r:container_file_t:s0:c476,c820 Jan 28 20:39:31 crc restorecon[4681]: /var/lib/kubelet/pods/5b88f790-22fa-440e-b583-365168c0b23d/containers/network-metrics-daemon/afccd338 not reset as customized by admin to system_u:object_r:container_file_t:s0:c272,c818 Jan 28 20:39:31 crc restorecon[4681]: /var/lib/kubelet/pods/5b88f790-22fa-440e-b583-365168c0b23d/containers/network-metrics-daemon/9df0a185 not reset as customized by admin to system_u:object_r:container_file_t:s0:c432,c991 Jan 28 20:39:31 crc restorecon[4681]: /var/lib/kubelet/pods/5b88f790-22fa-440e-b583-365168c0b23d/containers/kube-rbac-proxy/18938cf8 not reset as customized by admin to system_u:object_r:container_file_t:s0:c476,c820 Jan 28 20:39:31 crc restorecon[4681]: /var/lib/kubelet/pods/5b88f790-22fa-440e-b583-365168c0b23d/containers/kube-rbac-proxy/7ab4eb23 not reset as customized by admin to system_u:object_r:container_file_t:s0:c272,c818 Jan 28 20:39:31 crc restorecon[4681]: /var/lib/kubelet/pods/5b88f790-22fa-440e-b583-365168c0b23d/containers/kube-rbac-proxy/56930be6 not reset as customized by admin to system_u:object_r:container_file_t:s0:c432,c991 Jan 28 20:39:31 crc restorecon[4681]: /var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/volumes/kubernetes.io~configmap/env-overrides not reset as customized by admin to system_u:object_r:container_file_t:s0:c440,c975 Jan 28 20:39:31 crc restorecon[4681]: 
/var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/volumes/kubernetes.io~configmap/env-overrides/..2025_02_23_05_21_35.630010865 not reset as customized by admin to system_u:object_r:container_file_t:s0:c440,c975 Jan 28 20:39:31 crc restorecon[4681]: /var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/volumes/kubernetes.io~configmap/env-overrides/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c440,c975 Jan 28 20:39:31 crc restorecon[4681]: /var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/volumes/kubernetes.io~configmap/ovnkube-config not reset as customized by admin to system_u:object_r:container_file_t:s0:c440,c975 Jan 28 20:39:31 crc restorecon[4681]: /var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/volumes/kubernetes.io~configmap/ovnkube-config/..2025_02_23_05_21_35.1088506337 not reset as customized by admin to system_u:object_r:container_file_t:s0:c440,c975 Jan 28 20:39:31 crc restorecon[4681]: /var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/volumes/kubernetes.io~configmap/ovnkube-config/..2025_02_23_05_21_35.1088506337/ovnkube.conf not reset as customized by admin to system_u:object_r:container_file_t:s0:c440,c975 Jan 28 20:39:31 crc restorecon[4681]: /var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/volumes/kubernetes.io~configmap/ovnkube-config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c440,c975 Jan 28 20:39:31 crc restorecon[4681]: /var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/volumes/kubernetes.io~configmap/ovnkube-config/ovnkube.conf not reset as customized by admin to system_u:object_r:container_file_t:s0:c440,c975 Jan 28 20:39:31 crc restorecon[4681]: /var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c440,c975 Jan 28 20:39:31 crc restorecon[4681]: 
/var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/containers/kube-rbac-proxy/0d8e3722 not reset as customized by admin to system_u:object_r:container_file_t:s0:c89,c211 Jan 28 20:39:31 crc restorecon[4681]: /var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/containers/kube-rbac-proxy/d22b2e76 not reset as customized by admin to system_u:object_r:container_file_t:s0:c382,c850 Jan 28 20:39:31 crc restorecon[4681]: /var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/containers/kube-rbac-proxy/e036759f not reset as customized by admin to system_u:object_r:container_file_t:s0:c440,c975 Jan 28 20:39:31 crc restorecon[4681]: /var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/containers/ovnkube-cluster-manager/2734c483 not reset as customized by admin to system_u:object_r:container_file_t:s0:c89,c211 Jan 28 20:39:31 crc restorecon[4681]: /var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/containers/ovnkube-cluster-manager/57878fe7 not reset as customized by admin to system_u:object_r:container_file_t:s0:c89,c211 Jan 28 20:39:31 crc restorecon[4681]: /var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/containers/ovnkube-cluster-manager/3f3c2e58 not reset as customized by admin to system_u:object_r:container_file_t:s0:c89,c211 Jan 28 20:39:31 crc restorecon[4681]: /var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/containers/ovnkube-cluster-manager/375bec3e not reset as customized by admin to system_u:object_r:container_file_t:s0:c382,c850 Jan 28 20:39:31 crc restorecon[4681]: /var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/containers/ovnkube-cluster-manager/7bc41e08 not reset as customized by admin to system_u:object_r:container_file_t:s0:c440,c975 Jan 28 20:39:31 crc restorecon[4681]: /var/lib/kubelet/pods/cd70aa09-68dd-4d64-bd6f-156fe6d1dc6d/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Jan 28 20:39:31 crc restorecon[4681]: 
/var/lib/kubelet/pods/cd70aa09-68dd-4d64-bd6f-156fe6d1dc6d/containers/download-server/48c7a72d not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Jan 28 20:39:31 crc restorecon[4681]: /var/lib/kubelet/pods/cd70aa09-68dd-4d64-bd6f-156fe6d1dc6d/containers/download-server/4b66701f not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Jan 28 20:39:31 crc restorecon[4681]: /var/lib/kubelet/pods/cd70aa09-68dd-4d64-bd6f-156fe6d1dc6d/containers/download-server/a5a1c202 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Jan 28 20:39:31 crc restorecon[4681]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/volumes/kubernetes.io~configmap/ovnkube-identity-cm not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22 Jan 28 20:39:31 crc restorecon[4681]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/volumes/kubernetes.io~configmap/ovnkube-identity-cm/..2025_02_23_05_21_40.3350632666 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22 Jan 28 20:39:31 crc restorecon[4681]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/volumes/kubernetes.io~configmap/ovnkube-identity-cm/..2025_02_23_05_21_40.3350632666/additional-cert-acceptance-cond.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22 Jan 28 20:39:31 crc restorecon[4681]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/volumes/kubernetes.io~configmap/ovnkube-identity-cm/..2025_02_23_05_21_40.3350632666/additional-pod-admission-cond.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22 Jan 28 20:39:31 crc restorecon[4681]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/volumes/kubernetes.io~configmap/ovnkube-identity-cm/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22 Jan 28 20:39:31 crc restorecon[4681]: 
/var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/volumes/kubernetes.io~configmap/ovnkube-identity-cm/additional-cert-acceptance-cond.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22 Jan 28 20:39:31 crc restorecon[4681]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/volumes/kubernetes.io~configmap/ovnkube-identity-cm/additional-pod-admission-cond.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22 Jan 28 20:39:31 crc restorecon[4681]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/volumes/kubernetes.io~configmap/env-overrides not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22 Jan 28 20:39:31 crc restorecon[4681]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/volumes/kubernetes.io~configmap/env-overrides/..2025_02_23_05_21_40.1388695756 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22 Jan 28 20:39:31 crc restorecon[4681]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/volumes/kubernetes.io~configmap/env-overrides/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22 Jan 28 20:39:31 crc restorecon[4681]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22 Jan 28 20:39:31 crc restorecon[4681]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/containers/webhook/26f3df5b not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22 Jan 28 20:39:31 crc restorecon[4681]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/containers/webhook/6d8fb21d not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22 Jan 28 20:39:31 crc restorecon[4681]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/containers/webhook/50e94777 not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c4,c22 Jan 28 20:39:31 crc restorecon[4681]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/containers/approver/208473b3 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22 Jan 28 20:39:31 crc restorecon[4681]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/containers/approver/ec9e08ba not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22 Jan 28 20:39:31 crc restorecon[4681]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/containers/approver/3b787c39 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22 Jan 28 20:39:31 crc restorecon[4681]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/containers/approver/208eaed5 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22 Jan 28 20:39:31 crc restorecon[4681]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/containers/approver/93aa3a2b not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22 Jan 28 20:39:31 crc restorecon[4681]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/containers/approver/3c697968 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22 Jan 28 20:39:31 crc restorecon[4681]: /var/lib/kubelet/pods/3b6479f0-333b-4a96-9adf-2099afdc2447/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c20,c21 Jan 28 20:39:31 crc restorecon[4681]: /var/lib/kubelet/pods/3b6479f0-333b-4a96-9adf-2099afdc2447/containers/network-check-target-container/ba950ec9 not reset as customized by admin to system_u:object_r:container_file_t:s0:c20,c21 Jan 28 20:39:31 crc restorecon[4681]: /var/lib/kubelet/pods/3b6479f0-333b-4a96-9adf-2099afdc2447/containers/network-check-target-container/cb5cdb37 not reset as customized by admin to system_u:object_r:container_file_t:s0:c20,c21 Jan 28 20:39:31 crc restorecon[4681]: 
/var/lib/kubelet/pods/3b6479f0-333b-4a96-9adf-2099afdc2447/containers/network-check-target-container/f2df9827 not reset as customized by admin to system_u:object_r:container_file_t:s0:c20,c21 Jan 28 20:39:31 crc restorecon[4681]: /var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/volumes/kubernetes.io~configmap/images not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17 Jan 28 20:39:31 crc restorecon[4681]: /var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/volumes/kubernetes.io~configmap/images/..2025_02_23_05_22_30.473230615 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17 Jan 28 20:39:31 crc restorecon[4681]: /var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/volumes/kubernetes.io~configmap/images/..2025_02_23_05_22_30.473230615/images.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17 Jan 28 20:39:31 crc restorecon[4681]: /var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/volumes/kubernetes.io~configmap/images/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17 Jan 28 20:39:31 crc restorecon[4681]: /var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/volumes/kubernetes.io~configmap/images/images.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17 Jan 28 20:39:31 crc restorecon[4681]: /var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/volumes/kubernetes.io~configmap/auth-proxy-config not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17 Jan 28 20:39:31 crc restorecon[4681]: /var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/volumes/kubernetes.io~configmap/auth-proxy-config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17 Jan 28 20:39:31 crc restorecon[4681]: /var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/volumes/kubernetes.io~configmap/auth-proxy-config/config-file.yaml 
not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17 Jan 28 20:39:31 crc restorecon[4681]: /var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/volumes/kubernetes.io~configmap/auth-proxy-config/..2025_02_24_06_22_02.1904938450 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17 Jan 28 20:39:31 crc restorecon[4681]: /var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/volumes/kubernetes.io~configmap/auth-proxy-config/..2025_02_24_06_22_02.1904938450/config-file.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17 Jan 28 20:39:31 crc restorecon[4681]: /var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17 Jan 28 20:39:31 crc restorecon[4681]: /var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/containers/machine-config-operator/fedaa673 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17 Jan 28 20:39:31 crc restorecon[4681]: /var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/containers/machine-config-operator/9ca2df95 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17 Jan 28 20:39:31 crc restorecon[4681]: /var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/containers/machine-config-operator/b2d7460e not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17 Jan 28 20:39:31 crc restorecon[4681]: /var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/containers/kube-rbac-proxy/2207853c not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17 Jan 28 20:39:31 crc restorecon[4681]: /var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/containers/kube-rbac-proxy/241c1c29 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17 Jan 28 20:39:31 crc restorecon[4681]: 
/var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/containers/kube-rbac-proxy/2d910eaf not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17 Jan 28 20:39:31 crc restorecon[4681]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/volumes/kubernetes.io~configmap/etcd-ca not reset as customized by admin to system_u:object_r:container_file_t:s0:c84,c419 Jan 28 20:39:31 crc restorecon[4681]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/volumes/kubernetes.io~configmap/etcd-ca/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c84,c419 Jan 28 20:39:31 crc restorecon[4681]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/volumes/kubernetes.io~configmap/etcd-ca/..2025_02_23_05_23_49.3726007728 not reset as customized by admin to system_u:object_r:container_file_t:s0:c84,c419 Jan 28 20:39:31 crc restorecon[4681]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/volumes/kubernetes.io~configmap/etcd-ca/..2025_02_23_05_23_49.3726007728/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c84,c419 Jan 28 20:39:31 crc restorecon[4681]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/volumes/kubernetes.io~configmap/etcd-ca/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c84,c419 Jan 28 20:39:31 crc restorecon[4681]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/volumes/kubernetes.io~configmap/etcd-service-ca not reset as customized by admin to system_u:object_r:container_file_t:s0:c84,c419 Jan 28 20:39:31 crc restorecon[4681]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/volumes/kubernetes.io~configmap/etcd-service-ca/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c84,c419 Jan 28 20:39:31 crc restorecon[4681]: 
/var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/volumes/kubernetes.io~configmap/etcd-service-ca/..2025_02_23_05_23_49.841175008 not reset as customized by admin to system_u:object_r:container_file_t:s0:c84,c419 Jan 28 20:39:31 crc restorecon[4681]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/volumes/kubernetes.io~configmap/etcd-service-ca/..2025_02_23_05_23_49.841175008/service-ca.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c84,c419 Jan 28 20:39:31 crc restorecon[4681]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/volumes/kubernetes.io~configmap/etcd-service-ca/service-ca.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c84,c419 Jan 28 20:39:31 crc restorecon[4681]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/volumes/kubernetes.io~configmap/config not reset as customized by admin to system_u:object_r:container_file_t:s0:c84,c419 Jan 28 20:39:31 crc restorecon[4681]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/volumes/kubernetes.io~configmap/config/..2025_02_23_05_22_30.843437178 not reset as customized by admin to system_u:object_r:container_file_t:s0:c84,c419 Jan 28 20:39:31 crc restorecon[4681]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/volumes/kubernetes.io~configmap/config/..2025_02_23_05_22_30.843437178/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c84,c419 Jan 28 20:39:31 crc restorecon[4681]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/volumes/kubernetes.io~configmap/config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c84,c419 Jan 28 20:39:31 crc restorecon[4681]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/volumes/kubernetes.io~configmap/config/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c84,c419 Jan 28 20:39:31 crc restorecon[4681]: 
/var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c84,c419 Jan 28 20:39:31 crc restorecon[4681]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/containers/etcd-operator/c6c0f2e7 not reset as customized by admin to system_u:object_r:container_file_t:s0:c263,c871 Jan 28 20:39:31 crc restorecon[4681]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/containers/etcd-operator/399edc97 not reset as customized by admin to system_u:object_r:container_file_t:s0:c263,c871 Jan 28 20:39:31 crc restorecon[4681]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/containers/etcd-operator/8049f7cc not reset as customized by admin to system_u:object_r:container_file_t:s0:c263,c871 Jan 28 20:39:31 crc restorecon[4681]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/containers/etcd-operator/0cec5484 not reset as customized by admin to system_u:object_r:container_file_t:s0:c263,c871 Jan 28 20:39:31 crc restorecon[4681]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/containers/etcd-operator/312446d0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c406,c828 Jan 28 20:39:31 crc restorecon[4681]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/containers/etcd-operator/8e56a35d not reset as customized by admin to system_u:object_r:container_file_t:s0:c84,c419 Jan 28 20:39:31 crc restorecon[4681]: /var/lib/kubelet/pods/1386a44e-36a2-460c-96d0-0359d2b6f0f5/volumes/kubernetes.io~configmap/config not reset as customized by admin to system_u:object_r:container_file_t:s0:c108,c511 Jan 28 20:39:31 crc restorecon[4681]: /var/lib/kubelet/pods/1386a44e-36a2-460c-96d0-0359d2b6f0f5/volumes/kubernetes.io~configmap/config/..2025_02_23_05_22_30.133159589 not reset as customized by admin to system_u:object_r:container_file_t:s0:c108,c511 Jan 28 20:39:31 crc restorecon[4681]: 
/var/lib/kubelet/pods/1386a44e-36a2-460c-96d0-0359d2b6f0f5/volumes/kubernetes.io~configmap/config/..2025_02_23_05_22_30.133159589/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c108,c511 Jan 28 20:39:31 crc restorecon[4681]: /var/lib/kubelet/pods/1386a44e-36a2-460c-96d0-0359d2b6f0f5/volumes/kubernetes.io~configmap/config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c108,c511 Jan 28 20:39:31 crc restorecon[4681]: /var/lib/kubelet/pods/1386a44e-36a2-460c-96d0-0359d2b6f0f5/volumes/kubernetes.io~configmap/config/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c108,c511 Jan 28 20:39:31 crc restorecon[4681]: /var/lib/kubelet/pods/1386a44e-36a2-460c-96d0-0359d2b6f0f5/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c108,c511 Jan 28 20:39:31 crc restorecon[4681]: /var/lib/kubelet/pods/1386a44e-36a2-460c-96d0-0359d2b6f0f5/containers/kube-controller-manager-operator/2d30ddb9 not reset as customized by admin to system_u:object_r:container_file_t:s0:c380,c909 Jan 28 20:39:31 crc restorecon[4681]: /var/lib/kubelet/pods/1386a44e-36a2-460c-96d0-0359d2b6f0f5/containers/kube-controller-manager-operator/eca8053d not reset as customized by admin to system_u:object_r:container_file_t:s0:c380,c909 Jan 28 20:39:31 crc restorecon[4681]: /var/lib/kubelet/pods/1386a44e-36a2-460c-96d0-0359d2b6f0f5/containers/kube-controller-manager-operator/c3a25c9a not reset as customized by admin to system_u:object_r:container_file_t:s0:c168,c522 Jan 28 20:39:31 crc restorecon[4681]: /var/lib/kubelet/pods/1386a44e-36a2-460c-96d0-0359d2b6f0f5/containers/kube-controller-manager-operator/b9609c22 not reset as customized by admin to system_u:object_r:container_file_t:s0:c108,c511 Jan 28 20:39:31 crc restorecon[4681]: /var/lib/kubelet/pods/96b93a3a-6083-4aea-8eab-fe1aa8245ad9/etc-hosts not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c968,c969 Jan 28 20:39:31 crc restorecon[4681]: /var/lib/kubelet/pods/96b93a3a-6083-4aea-8eab-fe1aa8245ad9/containers/dns-operator/e8b0eca9 not reset as customized by admin to system_u:object_r:container_file_t:s0:c106,c418 Jan 28 20:39:31 crc restorecon[4681]: /var/lib/kubelet/pods/96b93a3a-6083-4aea-8eab-fe1aa8245ad9/containers/dns-operator/b36a9c3f not reset as customized by admin to system_u:object_r:container_file_t:s0:c529,c711 Jan 28 20:39:31 crc restorecon[4681]: /var/lib/kubelet/pods/96b93a3a-6083-4aea-8eab-fe1aa8245ad9/containers/dns-operator/38af7b07 not reset as customized by admin to system_u:object_r:container_file_t:s0:c968,c969 Jan 28 20:39:31 crc restorecon[4681]: /var/lib/kubelet/pods/96b93a3a-6083-4aea-8eab-fe1aa8245ad9/containers/kube-rbac-proxy/ae821620 not reset as customized by admin to system_u:object_r:container_file_t:s0:c106,c418 Jan 28 20:39:31 crc restorecon[4681]: /var/lib/kubelet/pods/96b93a3a-6083-4aea-8eab-fe1aa8245ad9/containers/kube-rbac-proxy/baa23338 not reset as customized by admin to system_u:object_r:container_file_t:s0:c529,c711 Jan 28 20:39:31 crc restorecon[4681]: /var/lib/kubelet/pods/96b93a3a-6083-4aea-8eab-fe1aa8245ad9/containers/kube-rbac-proxy/2c534809 not reset as customized by admin to system_u:object_r:container_file_t:s0:c968,c969 Jan 28 20:39:31 crc restorecon[4681]: /var/lib/kubelet/pods/496e6271-fb68-4057-954e-a0d97a4afa3f/volumes/kubernetes.io~configmap/config not reset as customized by admin to system_u:object_r:container_file_t:s0:c661,c999 Jan 28 20:39:31 crc restorecon[4681]: /var/lib/kubelet/pods/496e6271-fb68-4057-954e-a0d97a4afa3f/volumes/kubernetes.io~configmap/config/..2025_02_23_05_22_30.3532625537 not reset as customized by admin to system_u:object_r:container_file_t:s0:c661,c999 Jan 28 20:39:31 crc restorecon[4681]: /var/lib/kubelet/pods/496e6271-fb68-4057-954e-a0d97a4afa3f/volumes/kubernetes.io~configmap/config/..2025_02_23_05_22_30.3532625537/config.yaml not 
reset as customized by admin to system_u:object_r:container_file_t:s0:c661,c999 Jan 28 20:39:31 crc restorecon[4681]: /var/lib/kubelet/pods/496e6271-fb68-4057-954e-a0d97a4afa3f/volumes/kubernetes.io~configmap/config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c661,c999 Jan 28 20:39:31 crc restorecon[4681]: /var/lib/kubelet/pods/496e6271-fb68-4057-954e-a0d97a4afa3f/volumes/kubernetes.io~configmap/config/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c661,c999 Jan 28 20:39:31 crc restorecon[4681]: /var/lib/kubelet/pods/496e6271-fb68-4057-954e-a0d97a4afa3f/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c661,c999 Jan 28 20:39:31 crc restorecon[4681]: /var/lib/kubelet/pods/496e6271-fb68-4057-954e-a0d97a4afa3f/containers/kube-scheduler-operator-container/59b29eae not reset as customized by admin to system_u:object_r:container_file_t:s0:c338,c381 Jan 28 20:39:31 crc restorecon[4681]: /var/lib/kubelet/pods/496e6271-fb68-4057-954e-a0d97a4afa3f/containers/kube-scheduler-operator-container/c91a8e4f not reset as customized by admin to system_u:object_r:container_file_t:s0:c338,c381 Jan 28 20:39:31 crc restorecon[4681]: /var/lib/kubelet/pods/496e6271-fb68-4057-954e-a0d97a4afa3f/containers/kube-scheduler-operator-container/4d87494a not reset as customized by admin to system_u:object_r:container_file_t:s0:c442,c857 Jan 28 20:39:31 crc restorecon[4681]: /var/lib/kubelet/pods/496e6271-fb68-4057-954e-a0d97a4afa3f/containers/kube-scheduler-operator-container/1e33ca63 not reset as customized by admin to system_u:object_r:container_file_t:s0:c661,c999 Jan 28 20:39:31 crc restorecon[4681]: /var/lib/kubelet/pods/3ab1a177-2de0-46d9-b765-d0d0649bb42e/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18 Jan 28 20:39:31 crc restorecon[4681]: /var/lib/kubelet/pods/3ab1a177-2de0-46d9-b765-d0d0649bb42e/containers/kube-rbac-proxy/8dea7be2 not 
reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18 Jan 28 20:39:31 crc restorecon[4681]: /var/lib/kubelet/pods/3ab1a177-2de0-46d9-b765-d0d0649bb42e/containers/kube-rbac-proxy/d0b04a99 not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18 Jan 28 20:39:31 crc restorecon[4681]: /var/lib/kubelet/pods/3ab1a177-2de0-46d9-b765-d0d0649bb42e/containers/kube-rbac-proxy/d84f01e7 not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18 Jan 28 20:39:31 crc restorecon[4681]: /var/lib/kubelet/pods/3ab1a177-2de0-46d9-b765-d0d0649bb42e/containers/package-server-manager/4109059b not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18 Jan 28 20:39:31 crc restorecon[4681]: /var/lib/kubelet/pods/3ab1a177-2de0-46d9-b765-d0d0649bb42e/containers/package-server-manager/a7258a3e not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18 Jan 28 20:39:31 crc restorecon[4681]: /var/lib/kubelet/pods/3ab1a177-2de0-46d9-b765-d0d0649bb42e/containers/package-server-manager/05bdf2b6 not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18 Jan 28 20:39:31 crc restorecon[4681]: /var/lib/kubelet/pods/6731426b-95fe-49ff-bb5f-40441049fde2/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Jan 28 20:39:31 crc restorecon[4681]: /var/lib/kubelet/pods/6731426b-95fe-49ff-bb5f-40441049fde2/containers/control-plane-machine-set-operator/f3261b51 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Jan 28 20:39:31 crc restorecon[4681]: /var/lib/kubelet/pods/6731426b-95fe-49ff-bb5f-40441049fde2/containers/control-plane-machine-set-operator/315d045e not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Jan 28 20:39:31 crc restorecon[4681]: /var/lib/kubelet/pods/6731426b-95fe-49ff-bb5f-40441049fde2/containers/control-plane-machine-set-operator/5fdcf278 not reset as 
customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Jan 28 20:39:31 crc restorecon[4681]: /var/lib/kubelet/pods/6731426b-95fe-49ff-bb5f-40441049fde2/containers/control-plane-machine-set-operator/d053f757 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Jan 28 20:39:31 crc restorecon[4681]: /var/lib/kubelet/pods/6731426b-95fe-49ff-bb5f-40441049fde2/containers/control-plane-machine-set-operator/c2850dc7 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Jan 28 20:39:31 crc restorecon[4681]: /var/lib/kubelet/pods/b6cd30de-2eeb-49a2-ab40-9167f4560ff5/volumes/kubernetes.io~configmap/marketplace-trusted-ca not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 28 20:39:31 crc restorecon[4681]: /var/lib/kubelet/pods/b6cd30de-2eeb-49a2-ab40-9167f4560ff5/volumes/kubernetes.io~configmap/marketplace-trusted-ca/..2025_02_23_05_22_30.2390596521 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 28 20:39:31 crc restorecon[4681]: /var/lib/kubelet/pods/b6cd30de-2eeb-49a2-ab40-9167f4560ff5/volumes/kubernetes.io~configmap/marketplace-trusted-ca/..2025_02_23_05_22_30.2390596521/tls-ca-bundle.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 28 20:39:31 crc restorecon[4681]: /var/lib/kubelet/pods/b6cd30de-2eeb-49a2-ab40-9167f4560ff5/volumes/kubernetes.io~configmap/marketplace-trusted-ca/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 28 20:39:31 crc restorecon[4681]: /var/lib/kubelet/pods/b6cd30de-2eeb-49a2-ab40-9167f4560ff5/volumes/kubernetes.io~configmap/marketplace-trusted-ca/tls-ca-bundle.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 28 20:39:31 crc restorecon[4681]: /var/lib/kubelet/pods/b6cd30de-2eeb-49a2-ab40-9167f4560ff5/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 
28 20:39:31 crc restorecon[4681]: /var/lib/kubelet/pods/b6cd30de-2eeb-49a2-ab40-9167f4560ff5/containers/marketplace-operator/fcfb0b2b not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 28 20:39:31 crc restorecon[4681]: /var/lib/kubelet/pods/b6cd30de-2eeb-49a2-ab40-9167f4560ff5/containers/marketplace-operator/c7ac9b7d not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 28 20:39:31 crc restorecon[4681]: /var/lib/kubelet/pods/b6cd30de-2eeb-49a2-ab40-9167f4560ff5/containers/marketplace-operator/fa0c0d52 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 28 20:39:31 crc restorecon[4681]: /var/lib/kubelet/pods/b6cd30de-2eeb-49a2-ab40-9167f4560ff5/containers/marketplace-operator/c609b6ba not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 28 20:39:31 crc restorecon[4681]: /var/lib/kubelet/pods/b6cd30de-2eeb-49a2-ab40-9167f4560ff5/containers/marketplace-operator/2be6c296 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 28 20:39:31 crc restorecon[4681]: /var/lib/kubelet/pods/b6cd30de-2eeb-49a2-ab40-9167f4560ff5/containers/marketplace-operator/89a32653 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 28 20:39:31 crc restorecon[4681]: /var/lib/kubelet/pods/b6cd30de-2eeb-49a2-ab40-9167f4560ff5/containers/marketplace-operator/4eb9afeb not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 28 20:39:31 crc restorecon[4681]: /var/lib/kubelet/pods/b6cd30de-2eeb-49a2-ab40-9167f4560ff5/containers/marketplace-operator/13af6efa not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 28 20:39:31 crc restorecon[4681]: /var/lib/kubelet/pods/b6312bbd-5731-4ea0-a20f-81d5a57df44a/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18 Jan 28 20:39:31 crc restorecon[4681]: 
/var/lib/kubelet/pods/b6312bbd-5731-4ea0-a20f-81d5a57df44a/containers/olm-operator/b03f9724 not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18 Jan 28 20:39:31 crc restorecon[4681]: /var/lib/kubelet/pods/b6312bbd-5731-4ea0-a20f-81d5a57df44a/containers/olm-operator/e3d105cc not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18 Jan 28 20:39:31 crc restorecon[4681]: /var/lib/kubelet/pods/b6312bbd-5731-4ea0-a20f-81d5a57df44a/containers/olm-operator/3aed4d83 not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18 Jan 28 20:39:31 crc restorecon[4681]: /var/lib/kubelet/pods/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c/volumes/kubernetes.io~configmap/config not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c18 Jan 28 20:39:31 crc restorecon[4681]: /var/lib/kubelet/pods/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c/volumes/kubernetes.io~configmap/config/..2025_02_23_05_22_30.1906041176 not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c18 Jan 28 20:39:31 crc restorecon[4681]: /var/lib/kubelet/pods/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c/volumes/kubernetes.io~configmap/config/..2025_02_23_05_22_30.1906041176/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c18 Jan 28 20:39:31 crc restorecon[4681]: /var/lib/kubelet/pods/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c/volumes/kubernetes.io~configmap/config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c18 Jan 28 20:39:31 crc restorecon[4681]: /var/lib/kubelet/pods/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c/volumes/kubernetes.io~configmap/config/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c18 Jan 28 20:39:31 crc restorecon[4681]: /var/lib/kubelet/pods/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c18 Jan 28 20:39:31 crc 
restorecon[4681]: /var/lib/kubelet/pods/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c/containers/kube-storage-version-migrator-operator/0765fa6e not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c18 Jan 28 20:39:31 crc restorecon[4681]: /var/lib/kubelet/pods/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c/containers/kube-storage-version-migrator-operator/2cefc627 not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c18 Jan 28 20:39:31 crc restorecon[4681]: /var/lib/kubelet/pods/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c/containers/kube-storage-version-migrator-operator/3dcc6345 not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c18 Jan 28 20:39:31 crc restorecon[4681]: /var/lib/kubelet/pods/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c/containers/kube-storage-version-migrator-operator/365af391 not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c18 Jan 28 20:39:31 crc restorecon[4681]: /var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/volumes/kubernetes.io~empty-dir/available-featuregates not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c12 Jan 28 20:39:31 crc restorecon[4681]: /var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/volumes/kubernetes.io~empty-dir/available-featuregates/featureGate-SelfManagedHA-Default.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c12 Jan 28 20:39:31 crc restorecon[4681]: /var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/volumes/kubernetes.io~empty-dir/available-featuregates/featureGate-SelfManagedHA-TechPreviewNoUpgrade.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c12 Jan 28 20:39:31 crc restorecon[4681]: /var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/volumes/kubernetes.io~empty-dir/available-featuregates/featureGate-SelfManagedHA-DevPreviewNoUpgrade.yaml not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c9,c12 Jan 28 20:39:31 crc restorecon[4681]: /var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/volumes/kubernetes.io~empty-dir/available-featuregates/featureGate-Hypershift-TechPreviewNoUpgrade.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c12 Jan 28 20:39:31 crc restorecon[4681]: /var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/volumes/kubernetes.io~empty-dir/available-featuregates/featureGate-Hypershift-DevPreviewNoUpgrade.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c12 Jan 28 20:39:31 crc restorecon[4681]: /var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/volumes/kubernetes.io~empty-dir/available-featuregates/featureGate-Hypershift-Default.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c12 Jan 28 20:39:31 crc restorecon[4681]: /var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c12 Jan 28 20:39:31 crc restorecon[4681]: /var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/containers/openshift-api/b1130c0f not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c12 Jan 28 20:39:31 crc restorecon[4681]: /var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/containers/openshift-api/236a5913 not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c12 Jan 28 20:39:31 crc restorecon[4681]: /var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/containers/openshift-api/b9432e26 not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c12 Jan 28 20:39:31 crc restorecon[4681]: /var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/containers/openshift-config-operator/5ddb0e3f not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c12 Jan 28 20:39:31 crc restorecon[4681]: 
/var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/containers/openshift-config-operator/986dc4fd not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c12 Jan 28 20:39:31 crc restorecon[4681]: /var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/containers/openshift-config-operator/8a23ff9a not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c12 Jan 28 20:39:31 crc restorecon[4681]: /var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/containers/openshift-config-operator/9728ae68 not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c12 Jan 28 20:39:31 crc restorecon[4681]: /var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/containers/openshift-config-operator/665f31d0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c12 Jan 28 20:39:31 crc restorecon[4681]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/volumes/kubernetes.io~configmap/config not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16 Jan 28 20:39:31 crc restorecon[4681]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/volumes/kubernetes.io~configmap/config/..2025_02_23_05_22_30.1255385357 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16 Jan 28 20:39:31 crc restorecon[4681]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/volumes/kubernetes.io~configmap/config/..2025_02_23_05_22_30.1255385357/operator-config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16 Jan 28 20:39:31 crc restorecon[4681]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/volumes/kubernetes.io~configmap/config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16 Jan 28 20:39:31 crc restorecon[4681]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/volumes/kubernetes.io~configmap/config/operator-config.yaml not reset as 
customized by admin to system_u:object_r:container_file_t:s0:c0,c16 Jan 28 20:39:31 crc restorecon[4681]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/volumes/kubernetes.io~configmap/service-ca-bundle not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16 Jan 28 20:39:31 crc restorecon[4681]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/volumes/kubernetes.io~configmap/service-ca-bundle/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16 Jan 28 20:39:31 crc restorecon[4681]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/volumes/kubernetes.io~configmap/service-ca-bundle/..2025_02_23_05_23_57.573792656 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16 Jan 28 20:39:31 crc restorecon[4681]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/volumes/kubernetes.io~configmap/service-ca-bundle/..2025_02_23_05_23_57.573792656/service-ca.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16 Jan 28 20:39:31 crc restorecon[4681]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/volumes/kubernetes.io~configmap/service-ca-bundle/service-ca.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16 Jan 28 20:39:31 crc restorecon[4681]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/volumes/kubernetes.io~configmap/trusted-ca-bundle not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16 Jan 28 20:39:31 crc restorecon[4681]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/volumes/kubernetes.io~configmap/trusted-ca-bundle/..2025_02_23_05_22_30.3254245399 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16 Jan 28 20:39:31 crc restorecon[4681]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/volumes/kubernetes.io~configmap/trusted-ca-bundle/..2025_02_23_05_22_30.3254245399/ca-bundle.crt not 
reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16 Jan 28 20:39:31 crc restorecon[4681]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/volumes/kubernetes.io~configmap/trusted-ca-bundle/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16 Jan 28 20:39:31 crc restorecon[4681]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/volumes/kubernetes.io~configmap/trusted-ca-bundle/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16 Jan 28 20:39:31 crc restorecon[4681]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16 Jan 28 20:39:31 crc restorecon[4681]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/containers/authentication-operator/136c9b42 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16 Jan 28 20:39:31 crc restorecon[4681]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/containers/authentication-operator/98a1575b not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16 Jan 28 20:39:31 crc restorecon[4681]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/containers/authentication-operator/cac69136 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16 Jan 28 20:39:31 crc restorecon[4681]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/containers/authentication-operator/5deb77a7 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16 Jan 28 20:39:31 crc restorecon[4681]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/containers/authentication-operator/2ae53400 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16 Jan 28 20:39:31 crc restorecon[4681]: /var/lib/kubelet/pods/01ab3dd5-8196-46d0-ad33-122e2ca51def/volumes/kubernetes.io~configmap/config not reset 
as customized by admin to system_u:object_r:container_file_t:s0:c5,c16 Jan 28 20:39:31 crc restorecon[4681]: /var/lib/kubelet/pods/01ab3dd5-8196-46d0-ad33-122e2ca51def/volumes/kubernetes.io~configmap/config/..2025_02_23_05_22_30.3608339744 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c16 Jan 28 20:39:31 crc restorecon[4681]: /var/lib/kubelet/pods/01ab3dd5-8196-46d0-ad33-122e2ca51def/volumes/kubernetes.io~configmap/config/..2025_02_23_05_22_30.3608339744/operator-config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c16 Jan 28 20:39:31 crc restorecon[4681]: /var/lib/kubelet/pods/01ab3dd5-8196-46d0-ad33-122e2ca51def/volumes/kubernetes.io~configmap/config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c16 Jan 28 20:39:31 crc restorecon[4681]: /var/lib/kubelet/pods/01ab3dd5-8196-46d0-ad33-122e2ca51def/volumes/kubernetes.io~configmap/config/operator-config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c16 Jan 28 20:39:31 crc restorecon[4681]: /var/lib/kubelet/pods/01ab3dd5-8196-46d0-ad33-122e2ca51def/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c16 Jan 28 20:39:31 crc restorecon[4681]: /var/lib/kubelet/pods/01ab3dd5-8196-46d0-ad33-122e2ca51def/containers/service-ca-operator/e46f2326 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c16 Jan 28 20:39:31 crc restorecon[4681]: /var/lib/kubelet/pods/01ab3dd5-8196-46d0-ad33-122e2ca51def/containers/service-ca-operator/dc688d3c not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c16 Jan 28 20:39:31 crc restorecon[4681]: /var/lib/kubelet/pods/01ab3dd5-8196-46d0-ad33-122e2ca51def/containers/service-ca-operator/3497c3cd not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c16 Jan 28 20:39:31 crc restorecon[4681]: 
/var/lib/kubelet/pods/01ab3dd5-8196-46d0-ad33-122e2ca51def/containers/service-ca-operator/177eb008 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c16 Jan 28 20:39:31 crc restorecon[4681]: /var/lib/kubelet/pods/7539238d-5fe0-46ed-884e-1c3b566537ec/volumes/kubernetes.io~configmap/config not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c13 Jan 28 20:39:31 crc restorecon[4681]: /var/lib/kubelet/pods/7539238d-5fe0-46ed-884e-1c3b566537ec/volumes/kubernetes.io~configmap/config/..2025_02_23_05_22_30.3819292994 not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c13 Jan 28 20:39:31 crc restorecon[4681]: /var/lib/kubelet/pods/7539238d-5fe0-46ed-884e-1c3b566537ec/volumes/kubernetes.io~configmap/config/..2025_02_23_05_22_30.3819292994/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c13 Jan 28 20:39:31 crc restorecon[4681]: /var/lib/kubelet/pods/7539238d-5fe0-46ed-884e-1c3b566537ec/volumes/kubernetes.io~configmap/config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c13 Jan 28 20:39:31 crc restorecon[4681]: /var/lib/kubelet/pods/7539238d-5fe0-46ed-884e-1c3b566537ec/volumes/kubernetes.io~configmap/config/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c13 Jan 28 20:39:31 crc restorecon[4681]: /var/lib/kubelet/pods/7539238d-5fe0-46ed-884e-1c3b566537ec/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c13 Jan 28 20:39:31 crc restorecon[4681]: /var/lib/kubelet/pods/7539238d-5fe0-46ed-884e-1c3b566537ec/containers/openshift-apiserver-operator/af5a2afa not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c13 Jan 28 20:39:31 crc restorecon[4681]: /var/lib/kubelet/pods/7539238d-5fe0-46ed-884e-1c3b566537ec/containers/openshift-apiserver-operator/d780cb1f not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c2,c13 Jan 28 20:39:31 crc restorecon[4681]: /var/lib/kubelet/pods/7539238d-5fe0-46ed-884e-1c3b566537ec/containers/openshift-apiserver-operator/49b0f374 not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c13 Jan 28 20:39:31 crc restorecon[4681]: /var/lib/kubelet/pods/7539238d-5fe0-46ed-884e-1c3b566537ec/containers/openshift-apiserver-operator/26fbb125 not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c13 Jan 28 20:39:31 crc restorecon[4681]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/volumes/kubernetes.io~configmap/trusted-ca not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11 Jan 28 20:39:31 crc restorecon[4681]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/volumes/kubernetes.io~configmap/trusted-ca/..2025_02_23_05_22_30.3244779536 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11 Jan 28 20:39:31 crc restorecon[4681]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/volumes/kubernetes.io~configmap/trusted-ca/..2025_02_23_05_22_30.3244779536/tls-ca-bundle.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11 Jan 28 20:39:31 crc restorecon[4681]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/volumes/kubernetes.io~configmap/trusted-ca/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11 Jan 28 20:39:31 crc restorecon[4681]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/volumes/kubernetes.io~configmap/trusted-ca/tls-ca-bundle.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11 Jan 28 20:39:31 crc restorecon[4681]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11 Jan 28 20:39:31 crc restorecon[4681]: 
/var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/containers/ingress-operator/cf14125a not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11 Jan 28 20:39:31 crc restorecon[4681]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/containers/ingress-operator/b7f86972 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11 Jan 28 20:39:31 crc restorecon[4681]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/containers/ingress-operator/e51d739c not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11 Jan 28 20:39:31 crc restorecon[4681]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/containers/ingress-operator/88ba6a69 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11 Jan 28 20:39:31 crc restorecon[4681]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/containers/ingress-operator/669a9acf not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11 Jan 28 20:39:31 crc restorecon[4681]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/containers/ingress-operator/5cd51231 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11 Jan 28 20:39:31 crc restorecon[4681]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/containers/ingress-operator/75349ec7 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11 Jan 28 20:39:31 crc restorecon[4681]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/containers/ingress-operator/15c26839 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11 Jan 28 20:39:31 crc restorecon[4681]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/containers/ingress-operator/45023dcd not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11 Jan 28 20:39:31 crc restorecon[4681]: 
/var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/containers/ingress-operator/2bb66a50 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11 Jan 28 20:39:31 crc restorecon[4681]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/containers/kube-rbac-proxy/64d03bdd not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11 Jan 28 20:39:31 crc restorecon[4681]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/containers/kube-rbac-proxy/ab8e7ca0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11 Jan 28 20:39:31 crc restorecon[4681]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/containers/kube-rbac-proxy/bb9be25f not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11 Jan 28 20:39:31 crc restorecon[4681]: /var/lib/kubelet/pods/bf126b07-da06-4140-9a57-dfd54fc6b486/volumes/kubernetes.io~configmap/trusted-ca not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 28 20:39:31 crc restorecon[4681]: /var/lib/kubelet/pods/bf126b07-da06-4140-9a57-dfd54fc6b486/volumes/kubernetes.io~configmap/trusted-ca/..2025_02_23_05_22_30.2034221258 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 28 20:39:31 crc restorecon[4681]: /var/lib/kubelet/pods/bf126b07-da06-4140-9a57-dfd54fc6b486/volumes/kubernetes.io~configmap/trusted-ca/..2025_02_23_05_22_30.2034221258/tls-ca-bundle.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 28 20:39:31 crc restorecon[4681]: /var/lib/kubelet/pods/bf126b07-da06-4140-9a57-dfd54fc6b486/volumes/kubernetes.io~configmap/trusted-ca/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 28 20:39:31 crc restorecon[4681]: /var/lib/kubelet/pods/bf126b07-da06-4140-9a57-dfd54fc6b486/volumes/kubernetes.io~configmap/trusted-ca/tls-ca-bundle.pem not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c10,c16 Jan 28 20:39:31 crc restorecon[4681]: /var/lib/kubelet/pods/bf126b07-da06-4140-9a57-dfd54fc6b486/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 28 20:39:31 crc restorecon[4681]: /var/lib/kubelet/pods/bf126b07-da06-4140-9a57-dfd54fc6b486/containers/cluster-image-registry-operator/9a0b61d3 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 28 20:39:31 crc restorecon[4681]: /var/lib/kubelet/pods/bf126b07-da06-4140-9a57-dfd54fc6b486/containers/cluster-image-registry-operator/d471b9d2 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 28 20:39:31 crc restorecon[4681]: /var/lib/kubelet/pods/bf126b07-da06-4140-9a57-dfd54fc6b486/containers/cluster-image-registry-operator/8cb76b8e not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 28 20:39:31 crc restorecon[4681]: /var/lib/kubelet/pods/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18 Jan 28 20:39:31 crc restorecon[4681]: /var/lib/kubelet/pods/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9/containers/catalog-operator/11a00840 not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18 Jan 28 20:39:31 crc restorecon[4681]: /var/lib/kubelet/pods/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9/containers/catalog-operator/ec355a92 not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18 Jan 28 20:39:31 crc restorecon[4681]: /var/lib/kubelet/pods/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9/containers/catalog-operator/992f735e not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18 Jan 28 20:39:31 crc restorecon[4681]: /var/lib/kubelet/pods/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b/volumes/kubernetes.io~configmap/config not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c14 Jan 28 20:39:31 
crc restorecon[4681]: /var/lib/kubelet/pods/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b/volumes/kubernetes.io~configmap/config/..2025_02_23_05_22_30.1782968797 not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c14 Jan 28 20:39:31 crc restorecon[4681]: /var/lib/kubelet/pods/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b/volumes/kubernetes.io~configmap/config/..2025_02_23_05_22_30.1782968797/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c14 Jan 28 20:39:31 crc restorecon[4681]: /var/lib/kubelet/pods/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b/volumes/kubernetes.io~configmap/config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c14 Jan 28 20:39:31 crc restorecon[4681]: /var/lib/kubelet/pods/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b/volumes/kubernetes.io~configmap/config/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c14 Jan 28 20:39:31 crc restorecon[4681]: /var/lib/kubelet/pods/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c14 Jan 28 20:39:31 crc restorecon[4681]: /var/lib/kubelet/pods/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b/containers/openshift-controller-manager-operator/d59cdbbc not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c14 Jan 28 20:39:31 crc restorecon[4681]: /var/lib/kubelet/pods/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b/containers/openshift-controller-manager-operator/72133ff0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c14 Jan 28 20:39:31 crc restorecon[4681]: /var/lib/kubelet/pods/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b/containers/openshift-controller-manager-operator/c56c834c not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c14 Jan 28 20:39:31 crc restorecon[4681]: /var/lib/kubelet/pods/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b/containers/openshift-controller-manager-operator/d13724c7 
not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c14 Jan 28 20:39:31 crc restorecon[4681]: /var/lib/kubelet/pods/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b/containers/openshift-controller-manager-operator/0a498258 not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c14 Jan 28 20:39:31 crc restorecon[4681]: /var/lib/kubelet/pods/5fe579f8-e8a6-4643-bce5-a661393c4dde/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17 Jan 28 20:39:31 crc restorecon[4681]: /var/lib/kubelet/pods/5fe579f8-e8a6-4643-bce5-a661393c4dde/containers/machine-config-server/fa471982 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17 Jan 28 20:39:31 crc restorecon[4681]: /var/lib/kubelet/pods/5fe579f8-e8a6-4643-bce5-a661393c4dde/containers/machine-config-server/fc900d92 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17 Jan 28 20:39:31 crc restorecon[4681]: /var/lib/kubelet/pods/5fe579f8-e8a6-4643-bce5-a661393c4dde/containers/machine-config-server/fa7d68da not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17 Jan 28 20:39:31 crc restorecon[4681]: /var/lib/kubelet/pods/49ef4625-1d3a-4a9f-b595-c2433d32326d/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c22 Jan 28 20:39:31 crc restorecon[4681]: /var/lib/kubelet/pods/49ef4625-1d3a-4a9f-b595-c2433d32326d/containers/migrator/4bacf9b4 not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c22 Jan 28 20:39:31 crc restorecon[4681]: /var/lib/kubelet/pods/49ef4625-1d3a-4a9f-b595-c2433d32326d/containers/migrator/424021b1 not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c22 Jan 28 20:39:31 crc restorecon[4681]: /var/lib/kubelet/pods/49ef4625-1d3a-4a9f-b595-c2433d32326d/containers/migrator/fc2e31a3 not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c22 Jan 28 20:39:31 crc 
restorecon[4681]: /var/lib/kubelet/pods/49ef4625-1d3a-4a9f-b595-c2433d32326d/containers/graceful-termination/f51eefac not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c22 Jan 28 20:39:31 crc restorecon[4681]: /var/lib/kubelet/pods/49ef4625-1d3a-4a9f-b595-c2433d32326d/containers/graceful-termination/c8997f2f not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c22 Jan 28 20:39:31 crc restorecon[4681]: /var/lib/kubelet/pods/49ef4625-1d3a-4a9f-b595-c2433d32326d/containers/graceful-termination/7481f599 not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c22 Jan 28 20:39:31 crc restorecon[4681]: /var/lib/kubelet/pods/25e176fe-21b4-4974-b1ed-c8b94f112a7f/volumes/kubernetes.io~configmap/signing-cabundle not reset as customized by admin to system_u:object_r:container_file_t:s0:c19,c22 Jan 28 20:39:31 crc restorecon[4681]: /var/lib/kubelet/pods/25e176fe-21b4-4974-b1ed-c8b94f112a7f/volumes/kubernetes.io~configmap/signing-cabundle/..2025_02_23_05_22_49.2255460704 not reset as customized by admin to system_u:object_r:container_file_t:s0:c19,c22 Jan 28 20:39:31 crc restorecon[4681]: /var/lib/kubelet/pods/25e176fe-21b4-4974-b1ed-c8b94f112a7f/volumes/kubernetes.io~configmap/signing-cabundle/..2025_02_23_05_22_49.2255460704/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c19,c22 Jan 28 20:39:31 crc restorecon[4681]: /var/lib/kubelet/pods/25e176fe-21b4-4974-b1ed-c8b94f112a7f/volumes/kubernetes.io~configmap/signing-cabundle/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c19,c22 Jan 28 20:39:31 crc restorecon[4681]: /var/lib/kubelet/pods/25e176fe-21b4-4974-b1ed-c8b94f112a7f/volumes/kubernetes.io~configmap/signing-cabundle/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c19,c22 Jan 28 20:39:31 crc restorecon[4681]: /var/lib/kubelet/pods/25e176fe-21b4-4974-b1ed-c8b94f112a7f/etc-hosts not reset as 
customized by admin to system_u:object_r:container_file_t:s0:c19,c22 Jan 28 20:39:31 crc restorecon[4681]: /var/lib/kubelet/pods/25e176fe-21b4-4974-b1ed-c8b94f112a7f/containers/service-ca-controller/fdafea19 not reset as customized by admin to system_u:object_r:container_file_t:s0:c19,c22 Jan 28 20:39:31 crc restorecon[4681]: /var/lib/kubelet/pods/25e176fe-21b4-4974-b1ed-c8b94f112a7f/containers/service-ca-controller/d0e1c571 not reset as customized by admin to system_u:object_r:container_file_t:s0:c19,c22 Jan 28 20:39:31 crc restorecon[4681]: /var/lib/kubelet/pods/25e176fe-21b4-4974-b1ed-c8b94f112a7f/containers/service-ca-controller/ee398915 not reset as customized by admin to system_u:object_r:container_file_t:s0:c19,c22 Jan 28 20:39:31 crc restorecon[4681]: /var/lib/kubelet/pods/25e176fe-21b4-4974-b1ed-c8b94f112a7f/containers/service-ca-controller/682bb6b8 not reset as customized by admin to system_u:object_r:container_file_t:s0:c19,c22 Jan 28 20:39:31 crc restorecon[4681]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c666,c920 Jan 28 20:39:31 crc restorecon[4681]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/setup/a3e67855 not reset as customized by admin to system_u:object_r:container_file_t:s0:c294,c884 Jan 28 20:39:31 crc restorecon[4681]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/setup/a989f289 not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c1016 Jan 28 20:39:31 crc restorecon[4681]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/setup/915431bd not reset as customized by admin to system_u:object_r:container_file_t:s0:c666,c920 Jan 28 20:39:31 crc restorecon[4681]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd-ensure-env-vars/7796fdab not reset as customized by admin to system_u:object_r:container_file_t:s0:c294,c884 Jan 28 20:39:31 crc 
restorecon[4681]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd-ensure-env-vars/dcdb5f19 not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c1016 Jan 28 20:39:31 crc restorecon[4681]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd-ensure-env-vars/a3aaa88c not reset as customized by admin to system_u:object_r:container_file_t:s0:c666,c920 Jan 28 20:39:31 crc restorecon[4681]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd-resources-copy/5508e3e6 not reset as customized by admin to system_u:object_r:container_file_t:s0:c294,c884 Jan 28 20:39:31 crc restorecon[4681]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd-resources-copy/160585de not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c1016 Jan 28 20:39:31 crc restorecon[4681]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd-resources-copy/e99f8da3 not reset as customized by admin to system_u:object_r:container_file_t:s0:c666,c920 Jan 28 20:39:31 crc restorecon[4681]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcdctl/8bc85570 not reset as customized by admin to system_u:object_r:container_file_t:s0:c294,c884 Jan 28 20:39:31 crc restorecon[4681]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcdctl/a5861c91 not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c1016 Jan 28 20:39:31 crc restorecon[4681]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcdctl/84db1135 not reset as customized by admin to system_u:object_r:container_file_t:s0:c666,c920 Jan 28 20:39:31 crc restorecon[4681]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd/9e1a6043 not reset as customized by admin to system_u:object_r:container_file_t:s0:c294,c884 Jan 28 20:39:31 crc restorecon[4681]: 
/var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd/c1aba1c2 not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c1016 Jan 28 20:39:31 crc restorecon[4681]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd/d55ccd6d not reset as customized by admin to system_u:object_r:container_file_t:s0:c666,c920 Jan 28 20:39:31 crc restorecon[4681]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd-metrics/971cc9f6 not reset as customized by admin to system_u:object_r:container_file_t:s0:c294,c884 Jan 28 20:39:31 crc restorecon[4681]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd-metrics/8f2e3dcf not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c1016 Jan 28 20:39:31 crc restorecon[4681]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd-metrics/ceb35e9c not reset as customized by admin to system_u:object_r:container_file_t:s0:c666,c920 Jan 28 20:39:31 crc restorecon[4681]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd-readyz/1c192745 not reset as customized by admin to system_u:object_r:container_file_t:s0:c294,c884 Jan 28 20:39:31 crc restorecon[4681]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd-readyz/5209e501 not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c1016 Jan 28 20:39:31 crc restorecon[4681]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd-readyz/f83de4df not reset as customized by admin to system_u:object_r:container_file_t:s0:c666,c920 Jan 28 20:39:31 crc restorecon[4681]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd-rev/e7b978ac not reset as customized by admin to system_u:object_r:container_file_t:s0:c294,c884 Jan 28 20:39:31 crc restorecon[4681]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd-rev/c64304a1 not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c336,c1016 Jan 28 20:39:31 crc restorecon[4681]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd-rev/5384386b not reset as customized by admin to system_u:object_r:container_file_t:s0:c666,c920 Jan 28 20:39:31 crc restorecon[4681]: /var/lib/kubelet/pods/efdd0498-1daa-4136-9a4a-3b948c2293fc/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c268,c620 Jan 28 20:39:31 crc restorecon[4681]: /var/lib/kubelet/pods/efdd0498-1daa-4136-9a4a-3b948c2293fc/containers/multus-admission-controller/cce3e3ff not reset as customized by admin to system_u:object_r:container_file_t:s0:c435,c756 Jan 28 20:39:31 crc restorecon[4681]: /var/lib/kubelet/pods/efdd0498-1daa-4136-9a4a-3b948c2293fc/containers/multus-admission-controller/8fb75465 not reset as customized by admin to system_u:object_r:container_file_t:s0:c268,c620 Jan 28 20:39:31 crc restorecon[4681]: /var/lib/kubelet/pods/efdd0498-1daa-4136-9a4a-3b948c2293fc/containers/kube-rbac-proxy/740f573e not reset as customized by admin to system_u:object_r:container_file_t:s0:c435,c756 Jan 28 20:39:31 crc restorecon[4681]: /var/lib/kubelet/pods/efdd0498-1daa-4136-9a4a-3b948c2293fc/containers/kube-rbac-proxy/32fd1134 not reset as customized by admin to system_u:object_r:container_file_t:s0:c268,c620 Jan 28 20:39:31 crc restorecon[4681]: /var/lib/kubelet/pods/20b0d48f-5fd6-431c-a545-e3c800c7b866/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c19,c24 Jan 28 20:39:31 crc restorecon[4681]: /var/lib/kubelet/pods/20b0d48f-5fd6-431c-a545-e3c800c7b866/containers/serve-healthcheck-canary/0a861bd3 not reset as customized by admin to system_u:object_r:container_file_t:s0:c19,c24 Jan 28 20:39:31 crc restorecon[4681]: /var/lib/kubelet/pods/20b0d48f-5fd6-431c-a545-e3c800c7b866/containers/serve-healthcheck-canary/80363026 not reset as customized by admin to system_u:object_r:container_file_t:s0:c19,c24 Jan 28 20:39:31 crc 
restorecon[4681]: /var/lib/kubelet/pods/20b0d48f-5fd6-431c-a545-e3c800c7b866/containers/serve-healthcheck-canary/bfa952a8 not reset as customized by admin to system_u:object_r:container_file_t:s0:c19,c24 Jan 28 20:39:31 crc restorecon[4681]: /var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/volumes/kubernetes.io~configmap/auth-proxy-config not reset as customized by admin to system_u:object_r:container_file_t:s0:c129,c158 Jan 28 20:39:31 crc restorecon[4681]: /var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/volumes/kubernetes.io~configmap/auth-proxy-config/..2025_02_23_05_33_31.2122464563 not reset as customized by admin to system_u:object_r:container_file_t:s0:c129,c158 Jan 28 20:39:31 crc restorecon[4681]: /var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/volumes/kubernetes.io~configmap/auth-proxy-config/..2025_02_23_05_33_31.2122464563/config-file.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c129,c158 Jan 28 20:39:31 crc restorecon[4681]: /var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/volumes/kubernetes.io~configmap/auth-proxy-config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c129,c158 Jan 28 20:39:31 crc restorecon[4681]: /var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/volumes/kubernetes.io~configmap/auth-proxy-config/config-file.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c129,c158 Jan 28 20:39:31 crc restorecon[4681]: /var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/volumes/kubernetes.io~configmap/config not reset as customized by admin to system_u:object_r:container_file_t:s0:c129,c158 Jan 28 20:39:31 crc restorecon[4681]: /var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/volumes/kubernetes.io~configmap/config/..2025_02_23_05_33_31.333075221 not reset as customized by admin to system_u:object_r:container_file_t:s0:c129,c158 Jan 28 20:39:31 crc restorecon[4681]: 
/var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/volumes/kubernetes.io~configmap/config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c129,c158 Jan 28 20:39:31 crc restorecon[4681]: /var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c129,c158 Jan 28 20:39:31 crc restorecon[4681]: /var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/containers/kube-rbac-proxy/793bf43d not reset as customized by admin to system_u:object_r:container_file_t:s0:c381,c387 Jan 28 20:39:31 crc restorecon[4681]: /var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/containers/kube-rbac-proxy/7db1bb6e not reset as customized by admin to system_u:object_r:container_file_t:s0:c142,c438 Jan 28 20:39:31 crc restorecon[4681]: /var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/containers/kube-rbac-proxy/4f6a0368 not reset as customized by admin to system_u:object_r:container_file_t:s0:c129,c158 Jan 28 20:39:31 crc restorecon[4681]: /var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/containers/machine-approver-controller/c12c7d86 not reset as customized by admin to system_u:object_r:container_file_t:s0:c381,c387 Jan 28 20:39:31 crc restorecon[4681]: /var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/containers/machine-approver-controller/36c4a773 not reset as customized by admin to system_u:object_r:container_file_t:s0:c142,c438 Jan 28 20:39:31 crc restorecon[4681]: /var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/containers/machine-approver-controller/4c1e98ae not reset as customized by admin to system_u:object_r:container_file_t:s0:c142,c438 Jan 28 20:39:31 crc restorecon[4681]: /var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/containers/machine-approver-controller/a4c8115c not reset as customized by admin to system_u:object_r:container_file_t:s0:c129,c158 Jan 28 20:39:31 crc restorecon[4681]: 
/var/lib/kubelet/pods/f4b27818a5e8e43d0dc095d08835c792/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c97,c980 Jan 28 20:39:31 crc restorecon[4681]: /var/lib/kubelet/pods/f4b27818a5e8e43d0dc095d08835c792/containers/setup/7db1802e not reset as customized by admin to system_u:object_r:container_file_t:s0:c97,c980 Jan 28 20:39:31 crc restorecon[4681]: /var/lib/kubelet/pods/f4b27818a5e8e43d0dc095d08835c792/containers/kube-apiserver/a008a7ab not reset as customized by admin to system_u:object_r:container_file_t:s0:c97,c980 Jan 28 20:39:31 crc restorecon[4681]: /var/lib/kubelet/pods/f4b27818a5e8e43d0dc095d08835c792/containers/kube-apiserver-cert-syncer/2c836bac not reset as customized by admin to system_u:object_r:container_file_t:s0:c97,c980 Jan 28 20:39:31 crc restorecon[4681]: /var/lib/kubelet/pods/f4b27818a5e8e43d0dc095d08835c792/containers/kube-apiserver-cert-regeneration-controller/0ce62299 not reset as customized by admin to system_u:object_r:container_file_t:s0:c97,c980 Jan 28 20:39:31 crc restorecon[4681]: /var/lib/kubelet/pods/f4b27818a5e8e43d0dc095d08835c792/containers/kube-apiserver-insecure-readyz/945d2457 not reset as customized by admin to system_u:object_r:container_file_t:s0:c97,c980 Jan 28 20:39:31 crc restorecon[4681]: /var/lib/kubelet/pods/f4b27818a5e8e43d0dc095d08835c792/containers/kube-apiserver-check-endpoints/7d5c1dd8 not reset as customized by admin to system_u:object_r:container_file_t:s0:c97,c980 Jan 28 20:39:31 crc restorecon[4681]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/utilities not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 28 20:39:31 crc restorecon[4681]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/utilities/copy-content not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 28 20:39:31 crc restorecon[4681]: 
/var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 28 20:39:31 crc restorecon[4681]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 28 20:39:31 crc restorecon[4681]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/3scale-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 28 20:39:31 crc restorecon[4681]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/3scale-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 28 20:39:31 crc restorecon[4681]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/advanced-cluster-management not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 28 20:39:31 crc restorecon[4681]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/advanced-cluster-management/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 28 20:39:31 crc restorecon[4681]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/amq-broker-rhel8 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 28 20:39:31 crc restorecon[4681]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/amq-broker-rhel8/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 28 
20:39:31 crc restorecon[4681]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/amq-online not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 28 20:39:31 crc restorecon[4681]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/amq-online/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 28 20:39:31 crc restorecon[4681]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/amq-streams not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 28 20:39:31 crc restorecon[4681]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/amq-streams/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 28 20:39:31 crc restorecon[4681]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/amq-streams-console not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 28 20:39:31 crc restorecon[4681]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/amq-streams-console/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 28 20:39:31 crc restorecon[4681]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/amq7-interconnect-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 28 20:39:31 crc restorecon[4681]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/amq7-interconnect-operator/catalog.json not reset as customized 
by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 28 20:39:31 crc restorecon[4681]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ansible-automation-platform-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 28 20:39:31 crc restorecon[4681]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ansible-automation-platform-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 28 20:39:31 crc restorecon[4681]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ansible-cloud-addons-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 28 20:39:31 crc restorecon[4681]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ansible-cloud-addons-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 28 20:39:31 crc restorecon[4681]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/apicast-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 28 20:39:31 crc restorecon[4681]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/apicast-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 28 20:39:31 crc restorecon[4681]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/apicurio-registry-3 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 28 20:39:31 crc restorecon[4681]: 
/var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/apicurio-registry-3/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 28 20:39:31 crc restorecon[4681]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/authorino-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 28 20:39:31 crc restorecon[4681]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/authorino-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 28 20:39:31 crc restorecon[4681]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aws-load-balancer-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 28 20:39:31 crc restorecon[4681]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aws-load-balancer-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 28 20:39:31 crc restorecon[4681]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/bamoe-businessautomation-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 28 20:39:31 crc restorecon[4681]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/bamoe-businessautomation-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 28 20:39:31 crc restorecon[4681]: 
/var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/bamoe-kogito-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 28 20:39:31 crc restorecon[4681]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/bamoe-kogito-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 28 20:39:31 crc restorecon[4681]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/bpfman-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 28 20:39:31 crc restorecon[4681]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/bpfman-operator/index.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 28 20:39:31 crc restorecon[4681]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/businessautomation-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 28 20:39:31 crc restorecon[4681]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/businessautomation-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 28 20:39:31 crc restorecon[4681]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cephcsi-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 28 20:39:31 crc restorecon[4681]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cephcsi-operator/catalog.json not reset as customized by 
admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 28 20:39:31 crc restorecon[4681]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cincinnati-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 28 20:39:31 crc restorecon[4681]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cincinnati-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 28 20:39:31 crc restorecon[4681]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cluster-kube-descheduler-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 28 20:39:31 crc restorecon[4681]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cluster-kube-descheduler-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 28 20:39:31 crc restorecon[4681]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cluster-logging not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 28 20:39:31 crc restorecon[4681]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cluster-logging/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 28 20:39:31 crc restorecon[4681]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cluster-observability-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 28 20:39:31 crc restorecon[4681]: 
/var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cluster-observability-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 28 20:39:31 crc restorecon[4681]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/compliance-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 28 20:39:31 crc restorecon[4681]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/compliance-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 28 20:39:31 crc restorecon[4681]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/container-security-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 28 20:39:31 crc restorecon[4681]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/container-security-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 28 20:39:31 crc restorecon[4681]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/costmanagement-metrics-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 28 20:39:31 crc restorecon[4681]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/costmanagement-metrics-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 28 20:39:31 crc restorecon[4681]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cryostat-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 28 20:39:31 crc restorecon[4681]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cryostat-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 28 20:39:31 crc restorecon[4681]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/datagrid not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 28 20:39:31 crc restorecon[4681]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/datagrid/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 28 20:39:31 crc restorecon[4681]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/devspaces not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 28 20:39:31 crc restorecon[4681]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/devspaces/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 28 20:39:31 crc restorecon[4681]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/devworkspace-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 28 20:39:31 crc restorecon[4681]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/devworkspace-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 28 20:39:31 crc restorecon[4681]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dpu-network-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 28 20:39:31 crc restorecon[4681]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dpu-network-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 28 20:39:31 crc restorecon[4681]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/eap not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 28 20:39:31 crc restorecon[4681]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/eap/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 28 20:39:31 crc restorecon[4681]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/elasticsearch-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 28 20:39:31 crc restorecon[4681]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/elasticsearch-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 28 20:39:31 crc restorecon[4681]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/external-dns-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 28 20:39:31 crc restorecon[4681]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/external-dns-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 28 20:39:31 crc restorecon[4681]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/fence-agents-remediation not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 28 20:39:31 crc restorecon[4681]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/fence-agents-remediation/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 28 20:39:31 crc restorecon[4681]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/file-integrity-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 28 20:39:31 crc restorecon[4681]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/file-integrity-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 28 20:39:31 crc restorecon[4681]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/fuse-apicurito not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 28 20:39:31 crc restorecon[4681]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/fuse-apicurito/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 28 20:39:31 crc restorecon[4681]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/fuse-console not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 28 20:39:31 crc restorecon[4681]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/fuse-console/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 28 20:39:31 crc restorecon[4681]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/fuse-online not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 28 20:39:31 crc restorecon[4681]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/fuse-online/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 28 20:39:31 crc restorecon[4681]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/gatekeeper-operator-product not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 28 20:39:31 crc restorecon[4681]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/gatekeeper-operator-product/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 28 20:39:31 crc restorecon[4681]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/jaeger-product not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 28 20:39:31 crc restorecon[4681]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/jaeger-product/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 28 20:39:31 crc restorecon[4681]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/jws-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 28 20:39:31 crc restorecon[4681]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/jws-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 28 20:39:31 crc restorecon[4681]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kernel-module-management not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 28 20:39:31 crc restorecon[4681]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kernel-module-management/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 28 20:39:31 crc restorecon[4681]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kernel-module-management-hub not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 28 20:39:31 crc restorecon[4681]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kernel-module-management-hub/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 28 20:39:31 crc restorecon[4681]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kiali-ossm not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 28 20:39:31 crc restorecon[4681]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kiali-ossm/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 28 20:39:31 crc restorecon[4681]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubevirt-hyperconverged not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 28 20:39:31 crc restorecon[4681]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubevirt-hyperconverged/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 28 20:39:31 crc restorecon[4681]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/logic-operator-rhel8 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 28 20:39:31 crc restorecon[4681]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/logic-operator-rhel8/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 28 20:39:31 crc restorecon[4681]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/loki-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 28 20:39:31 crc restorecon[4681]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/loki-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 28 20:39:31 crc restorecon[4681]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/lvms-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 28 20:39:31 crc restorecon[4681]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/lvms-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 28 20:39:31 crc restorecon[4681]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/machine-deletion-remediation not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 28 20:39:31 crc restorecon[4681]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/machine-deletion-remediation/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 28 20:39:31 crc restorecon[4681]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mcg-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 28 20:39:31 crc restorecon[4681]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mcg-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 28 20:39:31 crc restorecon[4681]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mta-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 28 20:39:31 crc restorecon[4681]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mta-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 28 20:39:31 crc restorecon[4681]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mtc-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 28 20:39:31 crc restorecon[4681]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mtc-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 28 20:39:31 crc restorecon[4681]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mtr-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 28 20:39:31 crc restorecon[4681]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mtr-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 28 20:39:31 crc restorecon[4681]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mtv-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 28 20:39:31 crc restorecon[4681]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mtv-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 28 20:39:31 crc restorecon[4681]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/multicluster-engine not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 28 20:39:31 crc restorecon[4681]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/multicluster-engine/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 28 20:39:31 crc restorecon[4681]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/netobserv-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 28 20:39:31 crc restorecon[4681]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/netobserv-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 28 20:39:31 crc restorecon[4681]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/node-healthcheck-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 28 20:39:31 crc restorecon[4681]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/node-healthcheck-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 28 20:39:31 crc restorecon[4681]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/node-maintenance-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 28 20:39:31 crc restorecon[4681]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/node-maintenance-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 28 20:39:31 crc restorecon[4681]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/node-observability-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 28 20:39:31 crc restorecon[4681]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/node-observability-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 28 20:39:31 crc restorecon[4681]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ocs-client-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 28 20:39:31 crc restorecon[4681]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ocs-client-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 28 20:39:31 crc restorecon[4681]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ocs-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 28 20:39:31 crc restorecon[4681]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ocs-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 28 20:39:31 crc restorecon[4681]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/odf-csi-addons-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 28 20:39:31 crc restorecon[4681]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/odf-csi-addons-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 28 20:39:31 crc restorecon[4681]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/odf-multicluster-orchestrator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 28 20:39:31 crc restorecon[4681]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/odf-multicluster-orchestrator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 28 20:39:31 crc restorecon[4681]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/odf-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 28 20:39:31 crc restorecon[4681]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/odf-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 28 20:39:31 crc restorecon[4681]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/odf-prometheus-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 28 20:39:31 crc restorecon[4681]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/odf-prometheus-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 28 20:39:31 crc restorecon[4681]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/odr-cluster-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 28 20:39:31 crc restorecon[4681]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/odr-cluster-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 28 20:39:31 crc restorecon[4681]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/odr-hub-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 28 20:39:31 crc restorecon[4681]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/odr-hub-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 28 20:39:31 crc restorecon[4681]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-cert-manager-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 28 20:39:31 crc restorecon[4681]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-cert-manager-operator/bundle-v1.15.0.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 28 20:39:31 crc restorecon[4681]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-cert-manager-operator/channel.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 28 20:39:31 crc restorecon[4681]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-cert-manager-operator/package.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 28 20:39:31 crc restorecon[4681]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-custom-metrics-autoscaler-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 28 20:39:31 crc restorecon[4681]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-custom-metrics-autoscaler-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 28 20:39:31 crc restorecon[4681]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-gitops-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 28 20:39:31 crc restorecon[4681]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-gitops-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 28 20:39:31 crc restorecon[4681]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-pipelines-operator-rh not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 28 20:39:31 crc restorecon[4681]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-pipelines-operator-rh/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 28 20:39:31 crc restorecon[4681]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-secondary-scheduler-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 28 20:39:31 crc restorecon[4681]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-secondary-scheduler-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 28 20:39:31 crc restorecon[4681]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/opentelemetry-product not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 28 20:39:31 crc restorecon[4681]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/opentelemetry-product/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 28 20:39:31 crc restorecon[4681]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/quay-bridge-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 28 20:39:31 crc restorecon[4681]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/quay-bridge-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 28 20:39:31 crc restorecon[4681]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/quay-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 28 20:39:31 crc restorecon[4681]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/quay-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 28 20:39:31 crc restorecon[4681]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/recipe not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 28 20:39:31 crc restorecon[4681]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/recipe/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 28 20:39:31 crc restorecon[4681]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/red-hat-camel-k not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 28 20:39:31 crc restorecon[4681]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/red-hat-camel-k/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 28 20:39:31 crc restorecon[4681]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/red-hat-hawtio-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 28 20:39:31 crc restorecon[4681]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/red-hat-hawtio-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 28 20:39:31 crc restorecon[4681]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/redhat-oadp-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 28 20:39:31 crc restorecon[4681]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/redhat-oadp-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 28 20:39:31 crc restorecon[4681]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rh-service-binding-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 28 20:39:31 crc restorecon[4681]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rh-service-binding-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 28 20:39:31 crc restorecon[4681]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rhacs-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 28 20:39:31 crc restorecon[4681]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rhacs-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 28 20:39:31 crc restorecon[4681]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rhbk-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 28 20:39:31 crc restorecon[4681]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rhbk-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 28 20:39:31 crc restorecon[4681]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rhdh not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 28 20:39:31 crc restorecon[4681]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rhdh/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 28 20:39:31 crc restorecon[4681]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rhods-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 28 20:39:31 crc restorecon[4681]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rhods-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 28 20:39:31 crc restorecon[4681]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rhods-prometheus-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 28 20:39:31 crc restorecon[4681]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rhods-prometheus-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 28 20:39:31 crc restorecon[4681]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rhpam-kogito-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 28 20:39:31 crc restorecon[4681]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rhpam-kogito-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 28 20:39:31 crc restorecon[4681]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rhsso-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 28 20:39:31 crc restorecon[4681]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rhsso-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 28 20:39:31 crc restorecon[4681]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rook-ceph-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 28 20:39:31 crc restorecon[4681]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rook-ceph-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 28 20:39:31 crc restorecon[4681]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/run-once-duration-override-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 28 20:39:31 crc restorecon[4681]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/run-once-duration-override-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 28 20:39:31 crc restorecon[4681]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sandboxed-containers-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 28 20:39:31 crc restorecon[4681]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sandboxed-containers-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 28 20:39:31 crc restorecon[4681]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/security-profiles-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 28 20:39:31 crc restorecon[4681]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/security-profiles-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 28 20:39:31 crc restorecon[4681]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/self-node-remediation not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 28 20:39:31 crc restorecon[4681]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/self-node-remediation/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 28 20:39:31 crc restorecon[4681]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/serverless-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 28 20:39:31 crc restorecon[4681]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/serverless-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 28 20:39:31 crc restorecon[4681]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/service-registry-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 28 20:39:31 crc restorecon[4681]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/service-registry-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 28 20:39:31 crc restorecon[4681]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/servicemeshoperator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 28 20:39:31 crc restorecon[4681]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/servicemeshoperator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 28 20:39:31 crc restorecon[4681]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/servicemeshoperator3 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 28 20:39:31 crc restorecon[4681]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/servicemeshoperator3/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 28 20:39:31 crc restorecon[4681]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/skupper-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 28 20:39:31 crc restorecon[4681]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/skupper-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 28 20:39:31 crc restorecon[4681]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/submariner not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 28 20:39:31 crc restorecon[4681]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/submariner/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 28 20:39:31 crc restorecon[4681]:
/var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/tang-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 28 20:39:31 crc restorecon[4681]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/tang-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 28 20:39:31 crc restorecon[4681]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/tempo-product not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 28 20:39:31 crc restorecon[4681]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/tempo-product/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 28 20:39:31 crc restorecon[4681]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/trustee-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 28 20:39:31 crc restorecon[4681]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/trustee-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 28 20:39:31 crc restorecon[4681]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/volsync-product not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 28 20:39:31 crc restorecon[4681]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/volsync-product/catalog.json not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13 Jan 28 20:39:31 crc restorecon[4681]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/web-terminal not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 28 20:39:31 crc restorecon[4681]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/web-terminal/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 28 20:39:31 crc restorecon[4681]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/cache not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 28 20:39:31 crc restorecon[4681]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 28 20:39:31 crc restorecon[4681]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 28 20:39:31 crc restorecon[4681]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/00000-1.psg not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 28 20:39:31 crc restorecon[4681]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/00000-1.psg.pmt not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 28 20:39:31 crc restorecon[4681]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/db.pmt not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13 Jan 28 20:39:31 crc restorecon[4681]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/index.pmt not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 28 20:39:31 crc restorecon[4681]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/main.pix not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 28 20:39:31 crc restorecon[4681]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/overflow.pix not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 28 20:39:31 crc restorecon[4681]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/digest not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 28 20:39:31 crc restorecon[4681]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 28 20:39:31 crc restorecon[4681]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/containers/extract-utilities/bc8d0691 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 28 20:39:31 crc restorecon[4681]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/containers/extract-utilities/6b76097a not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 28 20:39:31 crc restorecon[4681]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/containers/extract-utilities/34d1af30 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 28 20:39:31 crc restorecon[4681]: 
/var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/containers/extract-content/312ba61c not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 28 20:39:31 crc restorecon[4681]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/containers/extract-content/645d5dd1 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 28 20:39:31 crc restorecon[4681]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/containers/extract-content/16e825f0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 28 20:39:31 crc restorecon[4681]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/containers/registry-server/4cf51fc9 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 28 20:39:31 crc restorecon[4681]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/containers/registry-server/2a23d348 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 28 20:39:31 crc restorecon[4681]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/containers/registry-server/075dbd49 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 28 20:39:31 crc restorecon[4681]: /var/lib/kubelet/pods/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59/volumes/kubernetes.io~configmap/serviceca not reset as customized by admin to system_u:object_r:container_file_t:s0:c842,c986 Jan 28 20:39:31 crc restorecon[4681]: /var/lib/kubelet/pods/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59/volumes/kubernetes.io~configmap/serviceca/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c842,c986 Jan 28 20:39:31 crc restorecon[4681]: /var/lib/kubelet/pods/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59/volumes/kubernetes.io~configmap/serviceca/..2025_02_24_06_09_13.3521195566 not reset as customized by admin to system_u:object_r:container_file_t:s0:c842,c986 Jan 28 20:39:31 crc restorecon[4681]: 
/var/lib/kubelet/pods/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59/volumes/kubernetes.io~configmap/serviceca/..2025_02_24_06_09_13.3521195566/image-registry.openshift-image-registry.svc..5000 not reset as customized by admin to system_u:object_r:container_file_t:s0:c842,c986 Jan 28 20:39:31 crc restorecon[4681]: /var/lib/kubelet/pods/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59/volumes/kubernetes.io~configmap/serviceca/..2025_02_24_06_09_13.3521195566/image-registry.openshift-image-registry.svc.cluster.local..5000 not reset as customized by admin to system_u:object_r:container_file_t:s0:c842,c986 Jan 28 20:39:31 crc restorecon[4681]: /var/lib/kubelet/pods/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59/volumes/kubernetes.io~configmap/serviceca/..2025_02_24_06_09_13.3521195566/default-route-openshift-image-registry.apps-crc.testing not reset as customized by admin to system_u:object_r:container_file_t:s0:c842,c986 Jan 28 20:39:31 crc restorecon[4681]: /var/lib/kubelet/pods/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59/volumes/kubernetes.io~configmap/serviceca/default-route-openshift-image-registry.apps-crc.testing not reset as customized by admin to system_u:object_r:container_file_t:s0:c842,c986 Jan 28 20:39:31 crc restorecon[4681]: /var/lib/kubelet/pods/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59/volumes/kubernetes.io~configmap/serviceca/image-registry.openshift-image-registry.svc..5000 not reset as customized by admin to system_u:object_r:container_file_t:s0:c842,c986 Jan 28 20:39:31 crc restorecon[4681]: /var/lib/kubelet/pods/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59/volumes/kubernetes.io~configmap/serviceca/image-registry.openshift-image-registry.svc.cluster.local..5000 not reset as customized by admin to system_u:object_r:container_file_t:s0:c842,c986 Jan 28 20:39:31 crc restorecon[4681]: /var/lib/kubelet/pods/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c842,c986 Jan 28 20:39:31 crc restorecon[4681]: 
/var/lib/kubelet/pods/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59/containers/node-ca/dd585ddd not reset as customized by admin to system_u:object_r:container_file_t:s0:c377,c642 Jan 28 20:39:31 crc restorecon[4681]: /var/lib/kubelet/pods/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59/containers/node-ca/17ebd0ab not reset as customized by admin to system_u:object_r:container_file_t:s0:c338,c343 Jan 28 20:39:31 crc restorecon[4681]: /var/lib/kubelet/pods/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59/containers/node-ca/005579f4 not reset as customized by admin to system_u:object_r:container_file_t:s0:c842,c986 Jan 28 20:39:31 crc restorecon[4681]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/volumes/kubernetes.io~configmap/etcd-serving-ca not reset as customized by admin to system_u:object_r:container_file_t:s0:c764,c897 Jan 28 20:39:31 crc restorecon[4681]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/volumes/kubernetes.io~configmap/etcd-serving-ca/..2025_02_23_05_23_11.449897510 not reset as customized by admin to system_u:object_r:container_file_t:s0:c764,c897 Jan 28 20:39:31 crc restorecon[4681]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/volumes/kubernetes.io~configmap/etcd-serving-ca/..2025_02_23_05_23_11.449897510/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c764,c897 Jan 28 20:39:31 crc restorecon[4681]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/volumes/kubernetes.io~configmap/etcd-serving-ca/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c764,c897 Jan 28 20:39:31 crc restorecon[4681]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/volumes/kubernetes.io~configmap/etcd-serving-ca/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c764,c897 Jan 28 20:39:31 crc restorecon[4681]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/volumes/kubernetes.io~configmap/trusted-ca-bundle not reset as 
customized by admin to system_u:object_r:container_file_t:s0:c764,c897 Jan 28 20:39:31 crc restorecon[4681]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/volumes/kubernetes.io~configmap/trusted-ca-bundle/..2025_02_23_05_23_11.1287037894 not reset as customized by admin to system_u:object_r:container_file_t:s0:c764,c897 Jan 28 20:39:31 crc restorecon[4681]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/volumes/kubernetes.io~configmap/trusted-ca-bundle/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c764,c897 Jan 28 20:39:31 crc restorecon[4681]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/volumes/kubernetes.io~configmap/audit-policies not reset as customized by admin to system_u:object_r:container_file_t:s0:c764,c897 Jan 28 20:39:31 crc restorecon[4681]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/volumes/kubernetes.io~configmap/audit-policies/..2025_02_23_05_23_11.1301053334 not reset as customized by admin to system_u:object_r:container_file_t:s0:c764,c897 Jan 28 20:39:31 crc restorecon[4681]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/volumes/kubernetes.io~configmap/audit-policies/..2025_02_23_05_23_11.1301053334/policy.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c764,c897 Jan 28 20:39:31 crc restorecon[4681]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/volumes/kubernetes.io~configmap/audit-policies/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c764,c897 Jan 28 20:39:31 crc restorecon[4681]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/volumes/kubernetes.io~configmap/audit-policies/policy.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c764,c897 Jan 28 20:39:31 crc restorecon[4681]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/etc-hosts not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c764,c897 Jan 28 20:39:31 crc restorecon[4681]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/containers/fix-audit-permissions/bf5f3b9c not reset as customized by admin to system_u:object_r:container_file_t:s0:c49,c263 Jan 28 20:39:31 crc restorecon[4681]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/containers/fix-audit-permissions/af276eb7 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c701 Jan 28 20:39:31 crc restorecon[4681]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/containers/fix-audit-permissions/ea28e322 not reset as customized by admin to system_u:object_r:container_file_t:s0:c764,c897 Jan 28 20:39:31 crc restorecon[4681]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/containers/oauth-apiserver/692e6683 not reset as customized by admin to system_u:object_r:container_file_t:s0:c49,c263 Jan 28 20:39:31 crc restorecon[4681]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/containers/oauth-apiserver/871746a7 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c701 Jan 28 20:39:31 crc restorecon[4681]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/containers/oauth-apiserver/4eb2e958 not reset as customized by admin to system_u:object_r:container_file_t:s0:c764,c897 Jan 28 20:39:31 crc restorecon[4681]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/console-config not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Jan 28 20:39:31 crc restorecon[4681]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/console-config/..2025_02_24_06_09_06.2875086261 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Jan 28 20:39:31 crc restorecon[4681]: 
/var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/console-config/..2025_02_24_06_09_06.2875086261/console-config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Jan 28 20:39:31 crc restorecon[4681]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/console-config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Jan 28 20:39:31 crc restorecon[4681]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/console-config/console-config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Jan 28 20:39:31 crc restorecon[4681]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/trusted-ca-bundle not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Jan 28 20:39:31 crc restorecon[4681]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/trusted-ca-bundle/..2025_02_24_06_09_06.286118152 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Jan 28 20:39:31 crc restorecon[4681]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/trusted-ca-bundle/..2025_02_24_06_09_06.286118152/tls-ca-bundle.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Jan 28 20:39:31 crc restorecon[4681]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/trusted-ca-bundle/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Jan 28 20:39:31 crc restorecon[4681]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/trusted-ca-bundle/tls-ca-bundle.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Jan 28 20:39:31 crc 
restorecon[4681]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/oauth-serving-cert not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Jan 28 20:39:31 crc restorecon[4681]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/oauth-serving-cert/..2025_02_24_06_09_06.3865795478 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Jan 28 20:39:31 crc restorecon[4681]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/oauth-serving-cert/..2025_02_24_06_09_06.3865795478/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Jan 28 20:39:31 crc restorecon[4681]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/oauth-serving-cert/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Jan 28 20:39:31 crc restorecon[4681]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/oauth-serving-cert/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Jan 28 20:39:31 crc restorecon[4681]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/service-ca not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Jan 28 20:39:31 crc restorecon[4681]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/service-ca/..2025_02_24_06_09_06.584414814 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Jan 28 20:39:31 crc restorecon[4681]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/service-ca/..2025_02_24_06_09_06.584414814/service-ca.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Jan 28 20:39:31 crc 
restorecon[4681]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/service-ca/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Jan 28 20:39:31 crc restorecon[4681]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/service-ca/service-ca.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Jan 28 20:39:31 crc restorecon[4681]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Jan 28 20:39:31 crc restorecon[4681]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/containers/console/ca9b62da not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Jan 28 20:39:31 crc restorecon[4681]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/containers/console/0edd6fce not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Jan 28 20:39:31 crc restorecon[4681]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/config not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22 Jan 28 20:39:31 crc restorecon[4681]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/config/..2025_02_24_06_20_07.2406383837 not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22 Jan 28 20:39:31 crc restorecon[4681]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/config/..2025_02_24_06_20_07.2406383837/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22 Jan 28 20:39:31 crc restorecon[4681]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/config/..2025_02_24_06_20_07.2406383837/openshift-controller-manager.client-ca.configmap not 
reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22 Jan 28 20:39:31 crc restorecon[4681]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/config/..2025_02_24_06_20_07.2406383837/openshift-controller-manager.openshift-global-ca.configmap not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22 Jan 28 20:39:31 crc restorecon[4681]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/config/..2025_02_24_06_20_07.2406383837/openshift-controller-manager.serving-cert.secret not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22 Jan 28 20:39:31 crc restorecon[4681]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22 Jan 28 20:39:31 crc restorecon[4681]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/config/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22 Jan 28 20:39:31 crc restorecon[4681]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/config/openshift-controller-manager.client-ca.configmap not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22 Jan 28 20:39:31 crc restorecon[4681]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/config/openshift-controller-manager.openshift-global-ca.configmap not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22 Jan 28 20:39:31 crc restorecon[4681]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/config/openshift-controller-manager.serving-cert.secret not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22 Jan 28 20:39:31 crc restorecon[4681]: 
/var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/client-ca not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22 Jan 28 20:39:31 crc restorecon[4681]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/client-ca/..2025_02_24_06_20_07.1071801880 not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22 Jan 28 20:39:31 crc restorecon[4681]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/client-ca/..2025_02_24_06_20_07.1071801880/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22 Jan 28 20:39:31 crc restorecon[4681]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/client-ca/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22 Jan 28 20:39:31 crc restorecon[4681]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/client-ca/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22 Jan 28 20:39:31 crc restorecon[4681]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/proxy-ca-bundles not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22 Jan 28 20:39:31 crc restorecon[4681]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/proxy-ca-bundles/..2025_02_24_06_20_07.2494444877 not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22 Jan 28 20:39:31 crc restorecon[4681]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/proxy-ca-bundles/..2025_02_24_06_20_07.2494444877/tls-ca-bundle.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22 Jan 28 20:39:31 crc restorecon[4681]: 
/var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/proxy-ca-bundles/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22 Jan 28 20:39:31 crc restorecon[4681]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/proxy-ca-bundles/tls-ca-bundle.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22 Jan 28 20:39:31 crc restorecon[4681]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22 Jan 28 20:39:31 crc restorecon[4681]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/containers/controller-manager/89b4555f not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22 Jan 28 20:39:31 crc restorecon[4681]: /var/lib/kubelet/pods/87cf06ed-a83f-41a7-828d-70653580a8cb/volumes/kubernetes.io~configmap/config-volume not reset as customized by admin to system_u:object_r:container_file_t:s0:c466,c972 Jan 28 20:39:31 crc restorecon[4681]: /var/lib/kubelet/pods/87cf06ed-a83f-41a7-828d-70653580a8cb/volumes/kubernetes.io~configmap/config-volume/..2025_02_23_05_23_22.4071100442 not reset as customized by admin to system_u:object_r:container_file_t:s0:c466,c972 Jan 28 20:39:31 crc restorecon[4681]: /var/lib/kubelet/pods/87cf06ed-a83f-41a7-828d-70653580a8cb/volumes/kubernetes.io~configmap/config-volume/..2025_02_23_05_23_22.4071100442/Corefile not reset as customized by admin to system_u:object_r:container_file_t:s0:c466,c972 Jan 28 20:39:31 crc restorecon[4681]: /var/lib/kubelet/pods/87cf06ed-a83f-41a7-828d-70653580a8cb/volumes/kubernetes.io~configmap/config-volume/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c466,c972 Jan 28 20:39:31 crc restorecon[4681]: /var/lib/kubelet/pods/87cf06ed-a83f-41a7-828d-70653580a8cb/volumes/kubernetes.io~configmap/config-volume/Corefile not 
reset as customized by admin to system_u:object_r:container_file_t:s0:c466,c972 Jan 28 20:39:31 crc restorecon[4681]: /var/lib/kubelet/pods/87cf06ed-a83f-41a7-828d-70653580a8cb/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c466,c972 Jan 28 20:39:31 crc restorecon[4681]: /var/lib/kubelet/pods/87cf06ed-a83f-41a7-828d-70653580a8cb/containers/dns/655fcd71 not reset as customized by admin to system_u:object_r:container_file_t:s0:c457,c841 Jan 28 20:39:31 crc restorecon[4681]: /var/lib/kubelet/pods/87cf06ed-a83f-41a7-828d-70653580a8cb/containers/dns/0d43c002 not reset as customized by admin to system_u:object_r:container_file_t:s0:c55,c1022 Jan 28 20:39:31 crc restorecon[4681]: /var/lib/kubelet/pods/87cf06ed-a83f-41a7-828d-70653580a8cb/containers/dns/e68efd17 not reset as customized by admin to system_u:object_r:container_file_t:s0:c466,c972 Jan 28 20:39:31 crc restorecon[4681]: /var/lib/kubelet/pods/87cf06ed-a83f-41a7-828d-70653580a8cb/containers/kube-rbac-proxy/9acf9b65 not reset as customized by admin to system_u:object_r:container_file_t:s0:c457,c841 Jan 28 20:39:31 crc restorecon[4681]: /var/lib/kubelet/pods/87cf06ed-a83f-41a7-828d-70653580a8cb/containers/kube-rbac-proxy/5ae3ff11 not reset as customized by admin to system_u:object_r:container_file_t:s0:c55,c1022 Jan 28 20:39:31 crc restorecon[4681]: /var/lib/kubelet/pods/87cf06ed-a83f-41a7-828d-70653580a8cb/containers/kube-rbac-proxy/1e59206a not reset as customized by admin to system_u:object_r:container_file_t:s0:c466,c972 Jan 28 20:39:31 crc restorecon[4681]: /var/lib/kubelet/pods/44663579-783b-4372-86d6-acf235a62d72/containers/dns-node-resolver/27af16d1 not reset as customized by admin to system_u:object_r:container_file_t:s0:c304,c1017 Jan 28 20:39:31 crc restorecon[4681]: /var/lib/kubelet/pods/44663579-783b-4372-86d6-acf235a62d72/containers/dns-node-resolver/7918e729 not reset as customized by admin to system_u:object_r:container_file_t:s0:c853,c893 Jan 28 20:39:31 crc 
restorecon[4681]: /var/lib/kubelet/pods/44663579-783b-4372-86d6-acf235a62d72/containers/dns-node-resolver/5d976d0e not reset as customized by admin to system_u:object_r:container_file_t:s0:c585,c981 Jan 28 20:39:31 crc restorecon[4681]: /var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/volumes/kubernetes.io~configmap/config not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c25 Jan 28 20:39:31 crc restorecon[4681]: /var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/volumes/kubernetes.io~configmap/config/..2025_02_23_05_38_56.1112187283 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c25 Jan 28 20:39:31 crc restorecon[4681]: /var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/volumes/kubernetes.io~configmap/config/..2025_02_23_05_38_56.1112187283/controller-config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c25 Jan 28 20:39:31 crc restorecon[4681]: /var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/volumes/kubernetes.io~configmap/config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c25 Jan 28 20:39:31 crc restorecon[4681]: /var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/volumes/kubernetes.io~configmap/config/controller-config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c25 Jan 28 20:39:31 crc restorecon[4681]: /var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/volumes/kubernetes.io~configmap/trusted-ca not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c25 Jan 28 20:39:31 crc restorecon[4681]: /var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/volumes/kubernetes.io~configmap/trusted-ca/..2025_02_23_05_38_56.2839772658 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c25 Jan 28 20:39:31 crc restorecon[4681]: 
/var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/volumes/kubernetes.io~configmap/trusted-ca/..2025_02_23_05_38_56.2839772658/tls-ca-bundle.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c25 Jan 28 20:39:31 crc restorecon[4681]: /var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/volumes/kubernetes.io~configmap/trusted-ca/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c25 Jan 28 20:39:31 crc restorecon[4681]: /var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/volumes/kubernetes.io~configmap/trusted-ca/tls-ca-bundle.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c25 Jan 28 20:39:31 crc restorecon[4681]: /var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c25 Jan 28 20:39:31 crc restorecon[4681]: /var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/containers/console-operator/d7f55cbb not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c25 Jan 28 20:39:31 crc restorecon[4681]: /var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/containers/console-operator/f0812073 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c25 Jan 28 20:39:31 crc restorecon[4681]: /var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/containers/console-operator/1a56cbeb not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c25 Jan 28 20:39:31 crc restorecon[4681]: /var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/containers/console-operator/7fdd437e not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c25 Jan 28 20:39:31 crc restorecon[4681]: /var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/containers/console-operator/cdfb5652 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c25 Jan 28 20:39:31 crc 
restorecon[4681]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/etcd-serving-ca not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Jan 28 20:39:31 crc restorecon[4681]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/etcd-serving-ca/..2025_02_24_06_17_29.3844392896 not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Jan 28 20:39:31 crc restorecon[4681]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/etcd-serving-ca/..2025_02_24_06_17_29.3844392896/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Jan 28 20:39:31 crc restorecon[4681]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/etcd-serving-ca/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Jan 28 20:39:31 crc restorecon[4681]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/etcd-serving-ca/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Jan 28 20:39:31 crc restorecon[4681]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/config not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Jan 28 20:39:31 crc restorecon[4681]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/config/..2025_02_24_06_17_29.848549803 not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Jan 28 20:39:31 crc restorecon[4681]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/config/..2025_02_24_06_17_29.848549803/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Jan 28 20:39:31 crc restorecon[4681]: 
/var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Jan 28 20:39:31 crc restorecon[4681]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/config/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Jan 28 20:39:31 crc restorecon[4681]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/audit not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Jan 28 20:39:31 crc restorecon[4681]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/audit/..2025_02_24_06_17_29.780046231 not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Jan 28 20:39:31 crc restorecon[4681]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/audit/..2025_02_24_06_17_29.780046231/policy.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Jan 28 20:39:31 crc restorecon[4681]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/audit/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Jan 28 20:39:31 crc restorecon[4681]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/audit/policy.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Jan 28 20:39:31 crc restorecon[4681]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/image-import-ca not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Jan 28 20:39:31 crc restorecon[4681]: 
/var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/image-import-ca/..2025_02_24_06_17_29.2926008347 not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Jan 28 20:39:31 crc restorecon[4681]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/image-import-ca/..2025_02_24_06_17_29.2926008347/image-registry.openshift-image-registry.svc..5000 not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Jan 28 20:39:31 crc restorecon[4681]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/image-import-ca/..2025_02_24_06_17_29.2926008347/image-registry.openshift-image-registry.svc.cluster.local..5000 not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Jan 28 20:39:31 crc restorecon[4681]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/image-import-ca/..2025_02_24_06_17_29.2926008347/default-route-openshift-image-registry.apps-crc.testing not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Jan 28 20:39:31 crc restorecon[4681]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/image-import-ca/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Jan 28 20:39:31 crc restorecon[4681]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/image-import-ca/image-registry.openshift-image-registry.svc..5000 not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Jan 28 20:39:31 crc restorecon[4681]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/image-import-ca/image-registry.openshift-image-registry.svc.cluster.local..5000 not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Jan 28 
20:39:31 crc restorecon[4681]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/image-import-ca/default-route-openshift-image-registry.apps-crc.testing not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Jan 28 20:39:31 crc restorecon[4681]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/trusted-ca-bundle not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Jan 28 20:39:31 crc restorecon[4681]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/trusted-ca-bundle/..2025_02_24_06_17_29.2729721485 not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Jan 28 20:39:31 crc restorecon[4681]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/trusted-ca-bundle/..2025_02_24_06_17_29.2729721485/tls-ca-bundle.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Jan 28 20:39:31 crc restorecon[4681]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/trusted-ca-bundle/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Jan 28 20:39:31 crc restorecon[4681]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/trusted-ca-bundle/tls-ca-bundle.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Jan 28 20:39:31 crc restorecon[4681]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Jan 28 20:39:31 crc restorecon[4681]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/containers/fix-audit-permissions/fb93119e not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Jan 28 20:39:31 crc restorecon[4681]: 
/var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/containers/openshift-apiserver/f1e8fc0e not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Jan 28 20:39:31 crc restorecon[4681]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/containers/openshift-apiserver-check-endpoints/218511f3 not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Jan 28 20:39:31 crc restorecon[4681]: /var/lib/kubelet/pods/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b/volumes/kubernetes.io~empty-dir/tmpfs not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18 Jan 28 20:39:31 crc restorecon[4681]: /var/lib/kubelet/pods/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b/volumes/kubernetes.io~empty-dir/tmpfs/k8s-webhook-server not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18 Jan 28 20:39:31 crc restorecon[4681]: /var/lib/kubelet/pods/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b/volumes/kubernetes.io~empty-dir/tmpfs/k8s-webhook-server/serving-certs not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18 Jan 28 20:39:31 crc restorecon[4681]: /var/lib/kubelet/pods/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18 Jan 28 20:39:31 crc restorecon[4681]: /var/lib/kubelet/pods/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b/containers/packageserver/ca8af7b3 not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18 Jan 28 20:39:31 crc restorecon[4681]: /var/lib/kubelet/pods/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b/containers/packageserver/72cc8a75 not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18 Jan 28 20:39:31 crc restorecon[4681]: /var/lib/kubelet/pods/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b/containers/packageserver/6e8a3760 not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18 Jan 28 20:39:31 crc restorecon[4681]: 
/var/lib/kubelet/pods/0b78653f-4ff9-4508-8672-245ed9b561e3/volumes/kubernetes.io~configmap/service-ca not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c6 Jan 28 20:39:31 crc restorecon[4681]: /var/lib/kubelet/pods/0b78653f-4ff9-4508-8672-245ed9b561e3/volumes/kubernetes.io~configmap/service-ca/..2025_02_23_05_27_30.557428972 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c6 Jan 28 20:39:31 crc restorecon[4681]: /var/lib/kubelet/pods/0b78653f-4ff9-4508-8672-245ed9b561e3/volumes/kubernetes.io~configmap/service-ca/..2025_02_23_05_27_30.557428972/service-ca.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c6 Jan 28 20:39:31 crc restorecon[4681]: /var/lib/kubelet/pods/0b78653f-4ff9-4508-8672-245ed9b561e3/volumes/kubernetes.io~configmap/service-ca/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c6 Jan 28 20:39:31 crc restorecon[4681]: /var/lib/kubelet/pods/0b78653f-4ff9-4508-8672-245ed9b561e3/volumes/kubernetes.io~configmap/service-ca/service-ca.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c6 Jan 28 20:39:31 crc restorecon[4681]: /var/lib/kubelet/pods/0b78653f-4ff9-4508-8672-245ed9b561e3/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c6 Jan 28 20:39:31 crc restorecon[4681]: /var/lib/kubelet/pods/0b78653f-4ff9-4508-8672-245ed9b561e3/containers/cluster-version-operator/4c3455c0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c6 Jan 28 20:39:31 crc restorecon[4681]: /var/lib/kubelet/pods/0b78653f-4ff9-4508-8672-245ed9b561e3/containers/cluster-version-operator/2278acb0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c6 Jan 28 20:39:31 crc restorecon[4681]: /var/lib/kubelet/pods/0b78653f-4ff9-4508-8672-245ed9b561e3/containers/cluster-version-operator/4b453e4f not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c5,c6 Jan 28 20:39:31 crc restorecon[4681]: /var/lib/kubelet/pods/0b78653f-4ff9-4508-8672-245ed9b561e3/containers/cluster-version-operator/3ec09bda not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c6 Jan 28 20:39:31 crc restorecon[4681]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~configmap/trusted-ca not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 28 20:39:31 crc restorecon[4681]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~configmap/trusted-ca/..2025_02_24_06_25_03.422633132 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 28 20:39:31 crc restorecon[4681]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~configmap/trusted-ca/..2025_02_24_06_25_03.422633132/anchors not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 28 20:39:31 crc restorecon[4681]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~configmap/trusted-ca/..2025_02_24_06_25_03.422633132/anchors/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 28 20:39:31 crc restorecon[4681]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~configmap/trusted-ca/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 28 20:39:31 crc restorecon[4681]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~configmap/trusted-ca/anchors not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 28 20:39:31 crc restorecon[4681]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~configmap/registry-certificates not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 28 20:39:31 crc 
restorecon[4681]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~configmap/registry-certificates/..2025_02_24_06_25_03.3594477318 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 28 20:39:31 crc restorecon[4681]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~configmap/registry-certificates/..2025_02_24_06_25_03.3594477318/image-registry.openshift-image-registry.svc..5000 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 28 20:39:31 crc restorecon[4681]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~configmap/registry-certificates/..2025_02_24_06_25_03.3594477318/image-registry.openshift-image-registry.svc.cluster.local..5000 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 28 20:39:31 crc restorecon[4681]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~configmap/registry-certificates/..2025_02_24_06_25_03.3594477318/default-route-openshift-image-registry.apps-crc.testing not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 28 20:39:31 crc restorecon[4681]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~configmap/registry-certificates/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 28 20:39:31 crc restorecon[4681]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~configmap/registry-certificates/image-registry.openshift-image-registry.svc..5000 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 28 20:39:31 crc restorecon[4681]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~configmap/registry-certificates/image-registry.openshift-image-registry.svc.cluster.local..5000 not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c10,c16 Jan 28 20:39:31 crc restorecon[4681]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~configmap/registry-certificates/default-route-openshift-image-registry.apps-crc.testing not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 28 20:39:31 crc restorecon[4681]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 28 20:39:31 crc restorecon[4681]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/edk2 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 28 20:39:31 crc restorecon[4681]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/edk2/cacerts.bin not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 28 20:39:31 crc restorecon[4681]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/java not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 28 20:39:31 crc restorecon[4681]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/java/cacerts not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 28 20:39:31 crc restorecon[4681]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/openssl not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 28 20:39:31 crc restorecon[4681]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/openssl/ca-bundle.trust.crt not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c10,c16 Jan 28 20:39:31 crc restorecon[4681]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 28 20:39:31 crc restorecon[4681]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/tls-ca-bundle.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 28 20:39:31 crc restorecon[4681]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/email-ca-bundle.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 28 20:39:31 crc restorecon[4681]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/objsign-ca-bundle.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 28 20:39:31 crc restorecon[4681]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 28 20:39:31 crc restorecon[4681]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/2ae6433e.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 28 20:39:31 crc restorecon[4681]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/fde84897.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 28 20:39:31 crc restorecon[4681]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/75680d2e.0 not reset 
as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 28 20:39:31 crc restorecon[4681]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/openshift-service-serving-signer_1740288168.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 28 20:39:31 crc restorecon[4681]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/facfc4fa.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 28 20:39:31 crc restorecon[4681]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/8f5a969c.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 28 20:39:31 crc restorecon[4681]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/CFCA_EV_ROOT.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 28 20:39:31 crc restorecon[4681]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/9ef4a08a.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 28 20:39:31 crc restorecon[4681]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ingress-operator_1740288202.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 28 20:39:31 crc restorecon[4681]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/2f332aed.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 28 20:39:31 crc restorecon[4681]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/248c8271.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 28 20:39:31 crc restorecon[4681]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/8d10a21f.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 28 20:39:31 crc restorecon[4681]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ACCVRAIZ1.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 28 20:39:31 crc restorecon[4681]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/a94d09e5.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 28 20:39:31 crc restorecon[4681]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/3c9a4d3b.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 28 20:39:31 crc restorecon[4681]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/40193066.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 28 20:39:31 crc restorecon[4681]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/AC_RAIZ_FNMT-RCM.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 28 20:39:31 crc restorecon[4681]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/cd8c0d63.0 not reset as customized by 
admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 28 20:39:31 crc restorecon[4681]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/b936d1c6.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 28 20:39:31 crc restorecon[4681]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/CA_Disig_Root_R2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 28 20:39:31 crc restorecon[4681]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/4fd49c6c.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 28 20:39:31 crc restorecon[4681]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/AC_RAIZ_FNMT-RCM_SERVIDORES_SEGUROS.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 28 20:39:31 crc restorecon[4681]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/b81b93f0.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 28 20:39:31 crc restorecon[4681]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/5f9a69fa.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 28 20:39:31 crc restorecon[4681]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Certigna.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 28 20:39:31 crc restorecon[4681]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/b30d5fda.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 28 20:39:31 crc restorecon[4681]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ANF_Secure_Server_Root_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 28 20:39:31 crc restorecon[4681]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/b433981b.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 28 20:39:31 crc restorecon[4681]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/93851c9e.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 28 20:39:31 crc restorecon[4681]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/9282e51c.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 28 20:39:31 crc restorecon[4681]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/e7dd1bc4.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 28 20:39:31 crc restorecon[4681]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Actalis_Authentication_Root_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 28 20:39:31 crc restorecon[4681]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/930ac5d2.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 28 20:39:31 crc restorecon[4681]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/5f47b495.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 28 20:39:31 crc restorecon[4681]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/e113c810.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 28 20:39:31 crc restorecon[4681]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/5931b5bc.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 28 20:39:31 crc restorecon[4681]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/AffirmTrust_Commercial.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 28 20:39:31 crc restorecon[4681]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/2b349938.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 28 20:39:31 crc restorecon[4681]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/e48193cf.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 28 20:39:31 crc restorecon[4681]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/302904dd.0 not reset as customized 
by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 28 20:39:31 crc restorecon[4681]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/a716d4ed.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 28 20:39:31 crc restorecon[4681]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/AffirmTrust_Networking.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 28 20:39:31 crc restorecon[4681]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/93bc0acc.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 28 20:39:31 crc restorecon[4681]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/86212b19.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 28 20:39:31 crc restorecon[4681]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Certigna_Root_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 28 20:39:31 crc restorecon[4681]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/AffirmTrust_Premium.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 28 20:39:31 crc restorecon[4681]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/b727005e.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 28 20:39:31 crc restorecon[4681]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/dbc54cab.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 28 20:39:31 crc restorecon[4681]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/f51bb24c.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 28 20:39:31 crc restorecon[4681]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/c28a8a30.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 28 20:39:31 crc restorecon[4681]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/AffirmTrust_Premium_ECC.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 28 20:39:31 crc restorecon[4681]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/9c8dfbd4.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 28 20:39:31 crc restorecon[4681]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ccc52f49.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 28 20:39:31 crc restorecon[4681]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/cb1c3204.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 28 20:39:31 crc restorecon[4681]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Amazon_Root_CA_1.pem not reset as 
customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 28 20:39:31 crc restorecon[4681]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ce5e74ef.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 28 20:39:31 crc restorecon[4681]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/fd08c599.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 28 20:39:31 crc restorecon[4681]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Certum_Trusted_Root_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 28 20:39:31 crc restorecon[4681]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Amazon_Root_CA_2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 28 20:39:31 crc restorecon[4681]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/6d41d539.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 28 20:39:31 crc restorecon[4681]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/fb5fa911.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 28 20:39:31 crc restorecon[4681]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/e35234b1.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 28 20:39:31 crc restorecon[4681]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Amazon_Root_CA_3.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 28 20:39:31 crc restorecon[4681]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/8cb5ee0f.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 28 20:39:31 crc restorecon[4681]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/7a7c655d.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 28 20:39:31 crc restorecon[4681]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/f8fc53da.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 28 20:39:31 crc restorecon[4681]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Amazon_Root_CA_4.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 28 20:39:31 crc restorecon[4681]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/de6d66f3.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 28 20:39:31 crc restorecon[4681]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/d41b5e2a.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 28 20:39:31 crc restorecon[4681]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/41a3f684.0 not reset as 
customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 28 20:39:31 crc restorecon[4681]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/1df5a75f.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 28 20:39:31 crc restorecon[4681]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Atos_TrustedRoot_2011.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 28 20:39:31 crc restorecon[4681]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/e36a6752.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 28 20:39:31 crc restorecon[4681]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/b872f2b4.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 28 20:39:31 crc restorecon[4681]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/9576d26b.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 28 20:39:31 crc restorecon[4681]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/228f89db.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 28 20:39:31 crc restorecon[4681]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Atos_TrustedRoot_Root_CA_ECC_TLS_2021.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 28 20:39:31 crc restorecon[4681]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/fb717492.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 28 20:39:31 crc restorecon[4681]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/2d21b73c.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 28 20:39:31 crc restorecon[4681]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/0b1b94ef.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 28 20:39:31 crc restorecon[4681]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/595e996b.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 28 20:39:31 crc restorecon[4681]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Atos_TrustedRoot_Root_CA_RSA_TLS_2021.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 28 20:39:31 crc restorecon[4681]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/9b46e03d.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 28 20:39:31 crc restorecon[4681]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/128f4b91.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 28 20:39:31 crc restorecon[4681]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Buypass_Class_3_Root_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 28 20:39:31 crc restorecon[4681]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/81f2d2b1.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 28 20:39:31 crc restorecon[4681]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Autoridad_de_Certificacion_Firmaprofesional_CIF_A62634068.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 28 20:39:31 crc restorecon[4681]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/3bde41ac.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 28 20:39:31 crc restorecon[4681]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/d16a5865.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 28 20:39:31 crc restorecon[4681]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Certum_EC-384_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 28 20:39:31 crc restorecon[4681]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/BJCA_Global_Root_CA1.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 28 20:39:31 crc restorecon[4681]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/0179095f.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 28 20:39:31 crc restorecon[4681]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ffa7f1eb.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 28 20:39:31 crc restorecon[4681]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/9482e63a.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 28 20:39:31 crc restorecon[4681]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/d4dae3dd.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 28 20:39:31 crc restorecon[4681]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/BJCA_Global_Root_CA2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 28 20:39:31 crc restorecon[4681]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/3e359ba6.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 28 20:39:31 crc restorecon[4681]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/7e067d03.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 28 20:39:31 crc restorecon[4681]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/95aff9e3.0 not reset as customized 
by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 28 20:39:31 crc restorecon[4681]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/d7746a63.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 28 20:39:31 crc restorecon[4681]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Baltimore_CyberTrust_Root.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 28 20:39:31 crc restorecon[4681]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/653b494a.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 28 20:39:31 crc restorecon[4681]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/3ad48a91.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 28 20:39:31 crc restorecon[4681]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Certum_Trusted_Network_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 28 20:39:31 crc restorecon[4681]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Buypass_Class_2_Root_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 28 20:39:31 crc restorecon[4681]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/54657681.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 28 20:39:31 crc restorecon[4681]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/82223c44.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 28 20:39:31 crc restorecon[4681]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/e8de2f56.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 28 20:39:31 crc restorecon[4681]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/2d9dafe4.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 28 20:39:31 crc restorecon[4681]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/d96b65e2.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 28 20:39:31 crc restorecon[4681]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ee64a828.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 28 20:39:31 crc restorecon[4681]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/COMODO_Certification_Authority.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 28 20:39:31 crc restorecon[4681]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/40547a79.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 28 20:39:31 crc restorecon[4681]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/5a3f0ff8.0 not reset as 
customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 28 20:39:31 crc restorecon[4681]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/7a780d93.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 28 20:39:31 crc restorecon[4681]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/34d996fb.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 28 20:39:31 crc restorecon[4681]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/COMODO_ECC_Certification_Authority.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 28 20:39:31 crc restorecon[4681]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/eed8c118.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 28 20:39:31 crc restorecon[4681]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/89c02a45.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 28 20:39:31 crc restorecon[4681]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Certainly_Root_R1.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 28 20:39:31 crc restorecon[4681]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/b1159c4c.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 28 20:39:31 crc restorecon[4681]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/COMODO_RSA_Certification_Authority.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 28 20:39:31 crc restorecon[4681]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/d6325660.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 28 20:39:31 crc restorecon[4681]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/d4c339cb.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 28 20:39:31 crc restorecon[4681]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/8312c4c1.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 28 20:39:31 crc restorecon[4681]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Certainly_Root_E1.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 28 20:39:31 crc restorecon[4681]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/8508e720.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 28 20:39:31 crc restorecon[4681]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/5fdd185d.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 28 20:39:31 crc restorecon[4681]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/48bec511.0 
not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 28 20:39:31 crc restorecon[4681]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/69105f4f.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 28 20:39:31 crc restorecon[4681]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/GlobalSign.1.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 28 20:39:31 crc restorecon[4681]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/0b9bc432.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 28 20:39:31 crc restorecon[4681]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Certum_Trusted_Network_CA_2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 28 20:39:31 crc restorecon[4681]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/GTS_Root_R3.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 28 20:39:31 crc restorecon[4681]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/32888f65.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 28 20:39:31 crc restorecon[4681]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/CommScope_Public_Trust_ECC_Root-01.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 28 20:39:31 crc restorecon[4681]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/6b03dec0.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 28 20:39:31 crc restorecon[4681]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/219d9499.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 28 20:39:31 crc restorecon[4681]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/CommScope_Public_Trust_ECC_Root-02.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 28 20:39:31 crc restorecon[4681]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/5acf816d.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 28 20:39:31 crc restorecon[4681]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/cbf06781.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 28 20:39:31 crc restorecon[4681]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/CommScope_Public_Trust_RSA_Root-01.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 28 20:39:31 crc restorecon[4681]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/GTS_Root_R4.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 28 20:39:31 crc restorecon[4681]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/dc99f41e.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 28 20:39:31 crc restorecon[4681]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/CommScope_Public_Trust_RSA_Root-02.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 28 20:39:31 crc restorecon[4681]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/GlobalSign.3.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 28 20:39:31 crc restorecon[4681]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/AAA_Certificate_Services.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 28 20:39:31 crc restorecon[4681]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/985c1f52.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 28 20:39:31 crc restorecon[4681]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/8794b4e3.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 28 20:39:31 crc restorecon[4681]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/D-TRUST_BR_Root_CA_1_2020.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 28 20:39:31 crc restorecon[4681]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/e7c037b4.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 28 20:39:31 crc restorecon[4681]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ef954a4e.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 28 20:39:31 crc restorecon[4681]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/D-TRUST_EV_Root_CA_1_2020.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 28 20:39:31 crc restorecon[4681]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/2add47b6.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 28 20:39:31 crc restorecon[4681]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/90c5a3c8.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 28 20:39:31 crc restorecon[4681]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/D-TRUST_Root_Class_3_CA_2_2009.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 28 20:39:31 crc restorecon[4681]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/b0f3e76e.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 28 20:39:31 crc restorecon[4681]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/53a1b57a.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 28 20:39:31 crc restorecon[4681]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/D-TRUST_Root_Class_3_CA_2_EV_2009.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 28 20:39:31 crc restorecon[4681]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/GlobalSign_Root_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 28 20:39:31 crc restorecon[4681]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/DigiCert_Assured_ID_Root_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 28 20:39:31 crc restorecon[4681]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/5ad8a5d6.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 28 20:39:31 crc restorecon[4681]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/68dd7389.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 28 20:39:31 crc restorecon[4681]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/DigiCert_Assured_ID_Root_G2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 28 20:39:31 crc restorecon[4681]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/9d04f354.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 28 20:39:31 crc restorecon[4681]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/8d6437c3.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 28 20:39:31 crc restorecon[4681]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/062cdee6.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 28 20:39:31 crc restorecon[4681]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/bd43e1dd.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 28 20:39:31 crc restorecon[4681]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/DigiCert_Assured_ID_Root_G3.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 28 20:39:31 crc restorecon[4681]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/7f3d5d1d.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 28 20:39:31 crc restorecon[4681]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/c491639e.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 28 20:39:31 crc restorecon[4681]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/GlobalSign_Root_E46.pem not 
reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 28 20:39:31 crc restorecon[4681]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/DigiCert_Global_Root_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 28 20:39:31 crc restorecon[4681]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/3513523f.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 28 20:39:31 crc restorecon[4681]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/399e7759.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 28 20:39:31 crc restorecon[4681]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/feffd413.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 28 20:39:31 crc restorecon[4681]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/d18e9066.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 28 20:39:31 crc restorecon[4681]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/DigiCert_Global_Root_G2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 28 20:39:31 crc restorecon[4681]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/607986c7.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 28 20:39:31 crc restorecon[4681]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/c90bc37d.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 28 20:39:31 crc restorecon[4681]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/1b0f7e5c.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 28 20:39:31 crc restorecon[4681]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/1e08bfd1.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 28 20:39:31 crc restorecon[4681]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/DigiCert_Global_Root_G3.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 28 20:39:31 crc restorecon[4681]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/dd8e9d41.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 28 20:39:31 crc restorecon[4681]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ed39abd0.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 28 20:39:31 crc restorecon[4681]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/a3418fda.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 28 20:39:31 crc restorecon[4681]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/bc3f2570.0 not reset as 
customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 28 20:39:31 crc restorecon[4681]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/DigiCert_High_Assurance_EV_Root_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 28 20:39:31 crc restorecon[4681]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/244b5494.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 28 20:39:31 crc restorecon[4681]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/81b9768f.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 28 20:39:31 crc restorecon[4681]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/GlobalSign.2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 28 20:39:31 crc restorecon[4681]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/4be590e0.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 28 20:39:31 crc restorecon[4681]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/DigiCert_TLS_ECC_P384_Root_G5.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 28 20:39:31 crc restorecon[4681]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/9846683b.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 28 20:39:31 crc restorecon[4681]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/252252d2.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 28 20:39:31 crc restorecon[4681]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/1e8e7201.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 28 20:39:31 crc restorecon[4681]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ISRG_Root_X1.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 28 20:39:31 crc restorecon[4681]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/DigiCert_TLS_RSA4096_Root_G5.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 28 20:39:31 crc restorecon[4681]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/d52c538d.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 28 20:39:31 crc restorecon[4681]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/c44cc0c0.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 28 20:39:31 crc restorecon[4681]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/GlobalSign_Root_R46.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 28 20:39:31 crc restorecon[4681]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/DigiCert_Trusted_Root_G4.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 28 20:39:31 crc restorecon[4681]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/75d1b2ed.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 28 20:39:31 crc restorecon[4681]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/a2c66da8.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 28 20:39:31 crc restorecon[4681]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/GTS_Root_R2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 28 20:39:31 crc restorecon[4681]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ecccd8db.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 28 20:39:31 crc restorecon[4681]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Entrust.net_Certification_Authority__2048_.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 28 20:39:31 crc restorecon[4681]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/aee5f10d.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 28 20:39:31 crc restorecon[4681]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/3e7271e8.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 28 20:39:31 crc restorecon[4681]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/b0e59380.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 28 20:39:31 crc restorecon[4681]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/4c3982f2.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 28 20:39:31 crc restorecon[4681]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Entrust_Root_Certification_Authority.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 28 20:39:31 crc restorecon[4681]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/6b99d060.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 28 20:39:31 crc restorecon[4681]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/bf64f35b.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 28 20:39:31 crc restorecon[4681]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/0a775a30.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 28 20:39:31 crc restorecon[4681]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/002c0b4f.0 not reset 
as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 28 20:39:31 crc restorecon[4681]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/cc450945.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 28 20:39:31 crc restorecon[4681]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Entrust_Root_Certification_Authority_-_EC1.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 28 20:39:31 crc restorecon[4681]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/106f3e4d.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 28 20:39:31 crc restorecon[4681]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/b3fb433b.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 28 20:39:31 crc restorecon[4681]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/GlobalSign.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 28 20:39:31 crc restorecon[4681]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/4042bcee.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 28 20:39:31 crc restorecon[4681]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Entrust_Root_Certification_Authority_-_G2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 28 20:39:31 crc 
restorecon[4681]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/02265526.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 28 20:39:31 crc restorecon[4681]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/455f1b52.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 28 20:39:31 crc restorecon[4681]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/0d69c7e1.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 28 20:39:31 crc restorecon[4681]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/9f727ac7.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 28 20:39:31 crc restorecon[4681]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Entrust_Root_Certification_Authority_-_G4.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 28 20:39:31 crc restorecon[4681]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/5e98733a.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 28 20:39:31 crc restorecon[4681]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/f0cd152c.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 28 20:39:31 crc restorecon[4681]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/dc4d6a89.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 28 20:39:31 crc restorecon[4681]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/6187b673.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 28 20:39:31 crc restorecon[4681]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/FIRMAPROFESIONAL_CA_ROOT-A_WEB.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 28 20:39:31 crc restorecon[4681]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ba8887ce.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 28 20:39:31 crc restorecon[4681]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/068570d1.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 28 20:39:31 crc restorecon[4681]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/f081611a.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 28 20:39:31 crc restorecon[4681]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/48a195d8.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 28 20:39:31 crc restorecon[4681]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/GDCA_TrustAUTH_R5_ROOT.pem 
not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 28 20:39:31 crc restorecon[4681]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/0f6fa695.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 28 20:39:31 crc restorecon[4681]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ab59055e.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 28 20:39:31 crc restorecon[4681]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/b92fd57f.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 28 20:39:31 crc restorecon[4681]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/GLOBALTRUST_2020.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 28 20:39:31 crc restorecon[4681]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/fa5da96b.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 28 20:39:31 crc restorecon[4681]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/1ec40989.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 28 20:39:31 crc restorecon[4681]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/7719f463.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 28 20:39:31 crc restorecon[4681]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/GTS_Root_R1.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 28 20:39:31 crc restorecon[4681]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/1001acf7.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 28 20:39:31 crc restorecon[4681]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/f013ecaf.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 28 20:39:31 crc restorecon[4681]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/626dceaf.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 28 20:39:31 crc restorecon[4681]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/c559d742.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 28 20:39:31 crc restorecon[4681]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/1d3472b9.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 28 20:39:31 crc restorecon[4681]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/9479c8c3.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 28 20:39:31 crc restorecon[4681]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/a81e292b.0 not reset as customized by admin 
to system_u:object_r:container_file_t:s0:c10,c16 Jan 28 20:39:31 crc restorecon[4681]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/4bfab552.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 28 20:39:31 crc restorecon[4681]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Go_Daddy_Class_2_Certification_Authority.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 28 20:39:31 crc restorecon[4681]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Sectigo_Public_Server_Authentication_Root_E46.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 28 20:39:31 crc restorecon[4681]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Go_Daddy_Root_Certificate_Authority_-_G2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 28 20:39:31 crc restorecon[4681]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/e071171e.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 28 20:39:31 crc restorecon[4681]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/57bcb2da.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 28 20:39:31 crc restorecon[4681]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/HARICA_TLS_ECC_Root_CA_2021.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 
Jan 28 20:39:31 crc restorecon[4681]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ab5346f4.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 28 20:39:31 crc restorecon[4681]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/5046c355.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 28 20:39:31 crc restorecon[4681]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/HARICA_TLS_RSA_Root_CA_2021.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 28 20:39:31 crc restorecon[4681]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/865fbdf9.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 28 20:39:31 crc restorecon[4681]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/da0cfd1d.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 28 20:39:31 crc restorecon[4681]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/85cde254.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 28 20:39:31 crc restorecon[4681]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Hellenic_Academic_and_Research_Institutions_ECC_RootCA_2015.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 28 20:39:31 crc restorecon[4681]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/cbb3f32b.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 28 20:39:31 crc restorecon[4681]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/SecureSign_RootCA11.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 28 20:39:31 crc restorecon[4681]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Hellenic_Academic_and_Research_Institutions_RootCA_2015.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 28 20:39:31 crc restorecon[4681]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/5860aaa6.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 28 20:39:31 crc restorecon[4681]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/31188b5e.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 28 20:39:31 crc restorecon[4681]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/HiPKI_Root_CA_-_G1.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 28 20:39:31 crc restorecon[4681]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/c7f1359b.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 28 20:39:31 crc restorecon[4681]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/5f15c80c.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 28 20:39:31 crc restorecon[4681]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Hongkong_Post_Root_CA_3.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 28 20:39:31 crc restorecon[4681]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/09789157.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 28 20:39:31 crc restorecon[4681]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ISRG_Root_X2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 28 20:39:31 crc restorecon[4681]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/18856ac4.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 28 20:39:31 crc restorecon[4681]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/1e09d511.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 28 20:39:31 crc restorecon[4681]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/IdenTrust_Commercial_Root_CA_1.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 28 20:39:31 crc restorecon[4681]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/cf701eeb.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 28 20:39:31 crc restorecon[4681]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/d06393bb.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 28 20:39:31 crc restorecon[4681]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/IdenTrust_Public_Sector_Root_CA_1.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 28 20:39:31 crc restorecon[4681]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/10531352.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 28 20:39:31 crc restorecon[4681]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Izenpe.com.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 28 20:39:31 crc restorecon[4681]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/SecureTrust_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 28 20:39:31 crc restorecon[4681]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/b0ed035a.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 28 20:39:31 crc restorecon[4681]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Microsec_e-Szigno_Root_CA_2009.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 28 20:39:31 crc restorecon[4681]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/8160b96c.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 28 20:39:31 crc restorecon[4681]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/e8651083.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 28 20:39:31 crc restorecon[4681]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/2c63f966.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 28 20:39:31 crc restorecon[4681]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Security_Communication_RootCA2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 28 20:39:31 crc restorecon[4681]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Microsoft_ECC_Root_Certificate_Authority_2017.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 28 20:39:31 crc restorecon[4681]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/8d89cda1.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 28 20:39:31 crc restorecon[4681]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/01419da9.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 28 20:39:31 crc restorecon[4681]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/SSL.com_TLS_RSA_Root_CA_2022.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 28 20:39:31 crc restorecon[4681]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/b7a5b843.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 28 20:39:31 crc restorecon[4681]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Microsoft_RSA_Root_Certificate_Authority_2017.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 28 20:39:31 crc restorecon[4681]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/bf53fb88.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 28 20:39:31 crc restorecon[4681]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/9591a472.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 28 20:39:31 crc restorecon[4681]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/3afde786.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 28 20:39:31 crc restorecon[4681]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/SwissSign_Gold_CA_-_G2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 28 20:39:31 crc restorecon[4681]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/NAVER_Global_Root_Certification_Authority.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 28 20:39:31 crc restorecon[4681]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/3fb36b73.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 28 20:39:31 crc restorecon[4681]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/d39b0a2c.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 28 20:39:31 crc restorecon[4681]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/a89d74c2.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 28 20:39:31 crc restorecon[4681]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/cd58d51e.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 28 20:39:31 crc restorecon[4681]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/b7db1890.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 28 20:39:31 crc restorecon[4681]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/NetLock_Arany__Class_Gold__F__tan__s__tv__ny.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 28 20:39:31 crc restorecon[4681]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/988a38cb.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 28 20:39:31 crc restorecon[4681]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/60afe812.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 28 20:39:31 crc restorecon[4681]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/f39fc864.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 28 20:39:31 crc restorecon[4681]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/5443e9e3.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 28 20:39:31 crc restorecon[4681]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/OISTE_WISeKey_Global_Root_GB_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 28 20:39:31 crc restorecon[4681]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/e73d606e.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 28 20:39:31 crc restorecon[4681]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/dfc0fe80.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 28 20:39:31 crc restorecon[4681]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/b66938e9.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 28 20:39:31 crc restorecon[4681]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/1e1eab7c.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 28 20:39:31 crc restorecon[4681]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/OISTE_WISeKey_Global_Root_GC_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 28 20:39:31 crc restorecon[4681]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/773e07ad.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 28 20:39:31 crc restorecon[4681]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/3c899c73.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 28 20:39:31 crc restorecon[4681]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/d59297b8.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 28 20:39:31 crc restorecon[4681]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ddcda989.0 not reset as 
customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 28 20:39:31 crc restorecon[4681]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/QuoVadis_Root_CA_1_G3.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 28 20:39:31 crc restorecon[4681]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/749e9e03.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 28 20:39:31 crc restorecon[4681]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/52b525c7.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 28 20:39:31 crc restorecon[4681]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Security_Communication_RootCA3.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 28 20:39:31 crc restorecon[4681]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/QuoVadis_Root_CA_2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 28 20:39:31 crc restorecon[4681]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/d7e8dc79.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 28 20:39:31 crc restorecon[4681]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/7a819ef2.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 28 20:39:31 crc restorecon[4681]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/08063a00.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 28 20:39:31 crc restorecon[4681]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/6b483515.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 28 20:39:31 crc restorecon[4681]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/QuoVadis_Root_CA_2_G3.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 28 20:39:31 crc restorecon[4681]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/064e0aa9.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 28 20:39:31 crc restorecon[4681]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/1f58a078.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 28 20:39:31 crc restorecon[4681]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/6f7454b3.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 28 20:39:31 crc restorecon[4681]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/7fa05551.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 28 20:39:31 crc restorecon[4681]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/QuoVadis_Root_CA_3.pem not reset as 
customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 28 20:39:31 crc restorecon[4681]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/76faf6c0.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 28 20:39:31 crc restorecon[4681]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/9339512a.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 28 20:39:31 crc restorecon[4681]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/f387163d.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 28 20:39:31 crc restorecon[4681]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ee37c333.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 28 20:39:31 crc restorecon[4681]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/QuoVadis_Root_CA_3_G3.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 28 20:39:31 crc restorecon[4681]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/e18bfb83.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 28 20:39:31 crc restorecon[4681]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/e442e424.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 28 20:39:31 crc restorecon[4681]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/fe8a2cd8.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 28 20:39:31 crc restorecon[4681]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/23f4c490.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 28 20:39:31 crc restorecon[4681]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/5cd81ad7.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 28 20:39:31 crc restorecon[4681]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/SSL.com_EV_Root_Certification_Authority_ECC.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 28 20:39:31 crc restorecon[4681]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/f0c70a8d.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 28 20:39:31 crc restorecon[4681]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/7892ad52.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 28 20:39:31 crc restorecon[4681]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/SZAFIR_ROOT_CA2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 28 20:39:31 crc restorecon[4681]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/4f316efb.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 28 20:39:31 crc restorecon[4681]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/SSL.com_EV_Root_Certification_Authority_RSA_R2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 28 20:39:31 crc restorecon[4681]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/06dc52d5.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 28 20:39:31 crc restorecon[4681]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/583d0756.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 28 20:39:31 crc restorecon[4681]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Sectigo_Public_Server_Authentication_Root_R46.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 28 20:39:31 crc restorecon[4681]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/SSL.com_Root_Certification_Authority_ECC.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 28 20:39:31 crc restorecon[4681]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/0bf05006.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 28 20:39:31 crc restorecon[4681]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/88950faa.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 28 20:39:31 crc restorecon[4681]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/9046744a.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 28 20:39:31 crc restorecon[4681]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/3c860d51.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 28 20:39:31 crc restorecon[4681]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/SSL.com_Root_Certification_Authority_RSA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 28 20:39:31 crc restorecon[4681]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/6fa5da56.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 28 20:39:31 crc restorecon[4681]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/33ee480d.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 28 20:39:31 crc restorecon[4681]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Secure_Global_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 28 20:39:31 crc restorecon[4681]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/63a2c897.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 28 20:39:31 crc restorecon[4681]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/SSL.com_TLS_ECC_Root_CA_2022.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 28 20:39:31 crc restorecon[4681]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/bdacca6f.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 28 20:39:31 crc restorecon[4681]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ff34af3f.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 28 20:39:31 crc restorecon[4681]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/dbff3a01.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 28 20:39:31 crc restorecon[4681]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Security_Communication_ECC_RootCA1.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 28 20:39:31 crc restorecon[4681]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/emSign_Root_CA_-_C1.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 28 20:39:31 crc restorecon[4681]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Starfield_Class_2_Certification_Authority.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 28 20:39:31 crc restorecon[4681]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/406c9bb1.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 28 20:39:31 crc restorecon[4681]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Starfield_Root_Certificate_Authority_-_G2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 28 20:39:31 crc restorecon[4681]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/emSign_ECC_Root_CA_-_C3.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 28 20:39:31 crc restorecon[4681]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Starfield_Services_Root_Certificate_Authority_-_G2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 28 20:39:31 crc restorecon[4681]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/SwissSign_Silver_CA_-_G2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 28 20:39:31 crc restorecon[4681]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/99e1b953.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 28 20:39:31 crc restorecon[4681]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/T-TeleSec_GlobalRoot_Class_2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 28 20:39:31 crc restorecon[4681]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/vTrus_Root_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 28 20:39:31 crc restorecon[4681]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/T-TeleSec_GlobalRoot_Class_3.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 28 20:39:31 crc restorecon[4681]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/14bc7599.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 28 20:39:31 crc restorecon[4681]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/TUBITAK_Kamu_SM_SSL_Kok_Sertifikasi_-_Surum_1.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 28 20:39:31 crc restorecon[4681]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/TWCA_Global_Root_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 28 20:39:31 crc restorecon[4681]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/7a3adc42.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 28 20:39:31 crc restorecon[4681]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/TWCA_Root_Certification_Authority.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 28 20:39:31 crc restorecon[4681]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/f459871d.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 28 20:39:31 crc restorecon[4681]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Telekom_Security_TLS_ECC_Root_2020.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 28 20:39:31 crc restorecon[4681]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/emSign_Root_CA_-_G1.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 28 20:39:31 crc restorecon[4681]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Telekom_Security_TLS_RSA_Root_2023.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 28 20:39:31 crc restorecon[4681]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/TeliaSonera_Root_CA_v1.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 28 20:39:31 crc restorecon[4681]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Telia_Root_CA_v2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 28 20:39:31 crc restorecon[4681]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/8f103249.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 28 20:39:31 crc restorecon[4681]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/f058632f.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 28 20:39:31 crc restorecon[4681]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ca-certificates.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 28 20:39:31 crc restorecon[4681]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/TrustAsia_Global_Root_CA_G3.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 28 20:39:31 crc restorecon[4681]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/9bf03295.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 28 20:39:31 crc restorecon[4681]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/98aaf404.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 28 20:39:31 crc restorecon[4681]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 28 20:39:31 crc restorecon[4681]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/TrustAsia_Global_Root_CA_G4.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 28 20:39:31 crc restorecon[4681]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/1cef98f5.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 28 20:39:31 crc restorecon[4681]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/073bfcc5.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 28 20:39:31 crc restorecon[4681]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/2923b3f9.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 28 20:39:31 crc restorecon[4681]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Trustwave_Global_Certification_Authority.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 28 20:39:31 crc restorecon[4681]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/f249de83.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 28 20:39:31 crc restorecon[4681]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/edcbddb5.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 28 20:39:31 crc restorecon[4681]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/emSign_ECC_Root_CA_-_G3.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 28 20:39:31 crc restorecon[4681]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Trustwave_Global_ECC_P256_Certification_Authority.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 28 20:39:31 crc restorecon[4681]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/9b5697b0.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 28 20:39:31 crc restorecon[4681]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/1ae85e5e.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 28 20:39:31 crc restorecon[4681]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/b74d2bd5.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 28 20:39:31 crc restorecon[4681]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Trustwave_Global_ECC_P384_Certification_Authority.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 28 20:39:31 crc restorecon[4681]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/d887a5bb.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 28 20:39:31 crc restorecon[4681]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/9aef356c.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 28 20:39:31 crc restorecon[4681]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/TunTrust_Root_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 28 20:39:31 crc restorecon[4681]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/fd64f3fc.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 28 20:39:31 crc restorecon[4681]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/e13665f9.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 28 20:39:31 crc restorecon[4681]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/UCA_Extended_Validation_Root.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 28 20:39:31 crc restorecon[4681]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/0f5dc4f3.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 28 20:39:31 crc restorecon[4681]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/da7377f6.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 28 20:39:31 crc restorecon[4681]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/UCA_Global_G2_Root.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 28 20:39:31 crc restorecon[4681]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/c01eb047.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 28 20:39:31 crc restorecon[4681]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/304d27c3.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 28 20:39:31 crc restorecon[4681]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ed858448.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 28 20:39:31 crc restorecon[4681]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/USERTrust_ECC_Certification_Authority.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 28 20:39:31 crc restorecon[4681]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/f30dd6ad.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 28 20:39:31 crc restorecon[4681]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/04f60c28.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 28 20:39:31 crc restorecon[4681]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/vTrus_ECC_Root_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 28 20:39:31 crc restorecon[4681]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/USERTrust_RSA_Certification_Authority.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 28 20:39:31 crc restorecon[4681]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/fc5a8f99.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 28 20:39:31 crc restorecon[4681]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/35105088.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 28 20:39:31 crc restorecon[4681]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ee532fd5.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 28 20:39:31 crc restorecon[4681]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/XRamp_Global_Certification_Authority.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 28 20:39:31 crc restorecon[4681]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/706f604c.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 28 20:39:31 crc restorecon[4681]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/76579174.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 28 20:39:31 crc restorecon[4681]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/certSIGN_ROOT_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 28 20:39:31 crc restorecon[4681]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/8d86cdd1.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 28 20:39:31 crc restorecon[4681]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/882de061.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 28 20:39:31 crc restorecon[4681]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/certSIGN_ROOT_CA_G2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 28 20:39:31 crc restorecon[4681]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/5f618aec.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 28 20:39:31 crc restorecon[4681]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/a9d40e02.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 28 20:39:31 crc restorecon[4681]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/e-Szigno_Root_CA_2017.pem 
not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 28 20:39:31 crc restorecon[4681]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/e868b802.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 28 20:39:31 crc restorecon[4681]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/83e9984f.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 28 20:39:31 crc restorecon[4681]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ePKI_Root_Certification_Authority.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 28 20:39:31 crc restorecon[4681]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ca6e4ad9.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 28 20:39:31 crc restorecon[4681]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/9d6523ce.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 28 20:39:31 crc restorecon[4681]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/4b718d9b.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 28 20:39:31 crc restorecon[4681]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/869fbf79.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 28 20:39:31 crc restorecon[4681]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 28 20:39:31 crc restorecon[4681]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/containers/registry/f8d22bdb not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 28 20:39:31 crc restorecon[4681]: /var/lib/kubelet/pods/a0128f3a-b052-44ed-a84e-c4c8aaf17c13/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c17 Jan 28 20:39:31 crc restorecon[4681]: /var/lib/kubelet/pods/a0128f3a-b052-44ed-a84e-c4c8aaf17c13/containers/cluster-samples-operator/6e8bbfac not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c17 Jan 28 20:39:31 crc restorecon[4681]: /var/lib/kubelet/pods/a0128f3a-b052-44ed-a84e-c4c8aaf17c13/containers/cluster-samples-operator/54dd7996 not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c17 Jan 28 20:39:31 crc restorecon[4681]: /var/lib/kubelet/pods/a0128f3a-b052-44ed-a84e-c4c8aaf17c13/containers/cluster-samples-operator/a4f1bb05 not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c17 Jan 28 20:39:31 crc restorecon[4681]: /var/lib/kubelet/pods/a0128f3a-b052-44ed-a84e-c4c8aaf17c13/containers/cluster-samples-operator-watch/207129da not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c17 Jan 28 20:39:31 crc restorecon[4681]: /var/lib/kubelet/pods/a0128f3a-b052-44ed-a84e-c4c8aaf17c13/containers/cluster-samples-operator-watch/c1df39e1 not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c17 Jan 28 20:39:31 crc restorecon[4681]: /var/lib/kubelet/pods/a0128f3a-b052-44ed-a84e-c4c8aaf17c13/containers/cluster-samples-operator-watch/15b8f1cd not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c17 Jan 28 20:39:31 crc restorecon[4681]: 
/var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/volumes/kubernetes.io~configmap/config not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Jan 28 20:39:31 crc restorecon[4681]: /var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/volumes/kubernetes.io~configmap/config/..2025_02_23_05_27_49.3523263858 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Jan 28 20:39:31 crc restorecon[4681]: /var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/volumes/kubernetes.io~configmap/config/..2025_02_23_05_27_49.3523263858/config-file.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Jan 28 20:39:31 crc restorecon[4681]: /var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/volumes/kubernetes.io~configmap/config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Jan 28 20:39:31 crc restorecon[4681]: /var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/volumes/kubernetes.io~configmap/config/config-file.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Jan 28 20:39:31 crc restorecon[4681]: /var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/volumes/kubernetes.io~configmap/images not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Jan 28 20:39:31 crc restorecon[4681]: /var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/volumes/kubernetes.io~configmap/images/..2025_02_23_05_27_49.3256605594 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Jan 28 20:39:31 crc restorecon[4681]: /var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/volumes/kubernetes.io~configmap/images/..2025_02_23_05_27_49.3256605594/images.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Jan 28 20:39:31 crc restorecon[4681]: 
/var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/volumes/kubernetes.io~configmap/images/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Jan 28 20:39:31 crc restorecon[4681]: /var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/volumes/kubernetes.io~configmap/images/images.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Jan 28 20:39:31 crc restorecon[4681]: /var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Jan 28 20:39:31 crc restorecon[4681]: /var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/containers/kube-rbac-proxy/77bd6913 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Jan 28 20:39:31 crc restorecon[4681]: /var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/containers/kube-rbac-proxy/2382c1b1 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Jan 28 20:39:31 crc restorecon[4681]: /var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/containers/kube-rbac-proxy/704ce128 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Jan 28 20:39:31 crc restorecon[4681]: /var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/containers/machine-api-operator/70d16fe0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Jan 28 20:39:31 crc restorecon[4681]: /var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/containers/machine-api-operator/bfb95535 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Jan 28 20:39:31 crc restorecon[4681]: /var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/containers/machine-api-operator/57a8e8e2 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Jan 28 20:39:31 crc restorecon[4681]: 
/var/lib/kubelet/pods/e7e6199b-1264-4501-8953-767f51328d08/volumes/kubernetes.io~configmap/config not reset as customized by admin to system_u:object_r:container_file_t:s0:c219,c404 Jan 28 20:39:31 crc restorecon[4681]: /var/lib/kubelet/pods/e7e6199b-1264-4501-8953-767f51328d08/volumes/kubernetes.io~configmap/config/..2025_02_23_05_27_49.3413793711 not reset as customized by admin to system_u:object_r:container_file_t:s0:c219,c404 Jan 28 20:39:31 crc restorecon[4681]: /var/lib/kubelet/pods/e7e6199b-1264-4501-8953-767f51328d08/volumes/kubernetes.io~configmap/config/..2025_02_23_05_27_49.3413793711/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c219,c404 Jan 28 20:39:31 crc restorecon[4681]: /var/lib/kubelet/pods/e7e6199b-1264-4501-8953-767f51328d08/volumes/kubernetes.io~configmap/config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c219,c404 Jan 28 20:39:31 crc restorecon[4681]: /var/lib/kubelet/pods/e7e6199b-1264-4501-8953-767f51328d08/volumes/kubernetes.io~configmap/config/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c219,c404 Jan 28 20:39:31 crc restorecon[4681]: /var/lib/kubelet/pods/e7e6199b-1264-4501-8953-767f51328d08/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c219,c404 Jan 28 20:39:31 crc restorecon[4681]: /var/lib/kubelet/pods/e7e6199b-1264-4501-8953-767f51328d08/containers/kube-apiserver-operator/1b9d3e5e not reset as customized by admin to system_u:object_r:container_file_t:s0:c107,c917 Jan 28 20:39:31 crc restorecon[4681]: /var/lib/kubelet/pods/e7e6199b-1264-4501-8953-767f51328d08/containers/kube-apiserver-operator/fddb173c not reset as customized by admin to system_u:object_r:container_file_t:s0:c202,c983 Jan 28 20:39:31 crc restorecon[4681]: /var/lib/kubelet/pods/e7e6199b-1264-4501-8953-767f51328d08/containers/kube-apiserver-operator/95d3c6c4 not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c219,c404 Jan 28 20:39:31 crc restorecon[4681]: /var/lib/kubelet/pods/9d751cbb-f2e2-430d-9754-c882a5e924a5/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c20,c21 Jan 28 20:39:31 crc restorecon[4681]: /var/lib/kubelet/pods/9d751cbb-f2e2-430d-9754-c882a5e924a5/containers/check-endpoints/bfb5fff5 not reset as customized by admin to system_u:object_r:container_file_t:s0:c20,c21 Jan 28 20:39:31 crc restorecon[4681]: /var/lib/kubelet/pods/9d751cbb-f2e2-430d-9754-c882a5e924a5/containers/check-endpoints/2aef40aa not reset as customized by admin to system_u:object_r:container_file_t:s0:c20,c21 Jan 28 20:39:31 crc restorecon[4681]: /var/lib/kubelet/pods/9d751cbb-f2e2-430d-9754-c882a5e924a5/containers/check-endpoints/c0391cad not reset as customized by admin to system_u:object_r:container_file_t:s0:c20,c21 Jan 28 20:39:31 crc restorecon[4681]: /var/lib/kubelet/pods/f614b9022728cf315e60c057852e563e/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c214,c928 Jan 28 20:39:31 crc restorecon[4681]: /var/lib/kubelet/pods/f614b9022728cf315e60c057852e563e/containers/kube-controller-manager/1119e69d not reset as customized by admin to system_u:object_r:container_file_t:s0:c776,c1007 Jan 28 20:39:31 crc restorecon[4681]: /var/lib/kubelet/pods/f614b9022728cf315e60c057852e563e/containers/kube-controller-manager/660608b4 not reset as customized by admin to system_u:object_r:container_file_t:s0:c214,c928 Jan 28 20:39:31 crc restorecon[4681]: /var/lib/kubelet/pods/f614b9022728cf315e60c057852e563e/containers/kube-controller-manager/8220bd53 not reset as customized by admin to system_u:object_r:container_file_t:s0:c214,c928 Jan 28 20:39:31 crc restorecon[4681]: /var/lib/kubelet/pods/f614b9022728cf315e60c057852e563e/containers/cluster-policy-controller/85f99d5c not reset as customized by admin to system_u:object_r:container_file_t:s0:c776,c1007 Jan 28 20:39:31 crc restorecon[4681]: 
/var/lib/kubelet/pods/f614b9022728cf315e60c057852e563e/containers/cluster-policy-controller/4b0225f6 not reset as customized by admin to system_u:object_r:container_file_t:s0:c214,c928 Jan 28 20:39:31 crc restorecon[4681]: /var/lib/kubelet/pods/f614b9022728cf315e60c057852e563e/containers/kube-controller-manager-cert-syncer/9c2a3394 not reset as customized by admin to system_u:object_r:container_file_t:s0:c776,c1007 Jan 28 20:39:31 crc restorecon[4681]: /var/lib/kubelet/pods/f614b9022728cf315e60c057852e563e/containers/kube-controller-manager-cert-syncer/e820b243 not reset as customized by admin to system_u:object_r:container_file_t:s0:c214,c928 Jan 28 20:39:31 crc restorecon[4681]: /var/lib/kubelet/pods/f614b9022728cf315e60c057852e563e/containers/kube-controller-manager-recovery-controller/1ca52ea0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c776,c1007 Jan 28 20:39:31 crc restorecon[4681]: /var/lib/kubelet/pods/f614b9022728cf315e60c057852e563e/containers/kube-controller-manager-recovery-controller/e6988e45 not reset as customized by admin to system_u:object_r:container_file_t:s0:c214,c928 Jan 28 20:39:31 crc restorecon[4681]: /var/lib/kubelet/pods/0b574797-001e-440a-8f4e-c0be86edad0f/volumes/kubernetes.io~configmap/mcc-auth-proxy-config not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17 Jan 28 20:39:31 crc restorecon[4681]: /var/lib/kubelet/pods/0b574797-001e-440a-8f4e-c0be86edad0f/volumes/kubernetes.io~configmap/mcc-auth-proxy-config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17 Jan 28 20:39:31 crc restorecon[4681]: /var/lib/kubelet/pods/0b574797-001e-440a-8f4e-c0be86edad0f/volumes/kubernetes.io~configmap/mcc-auth-proxy-config/config-file.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17 Jan 28 20:39:31 crc restorecon[4681]: 
/var/lib/kubelet/pods/0b574797-001e-440a-8f4e-c0be86edad0f/volumes/kubernetes.io~configmap/mcc-auth-proxy-config/..2025_02_24_06_09_21.2517297950 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17 Jan 28 20:39:31 crc restorecon[4681]: /var/lib/kubelet/pods/0b574797-001e-440a-8f4e-c0be86edad0f/volumes/kubernetes.io~configmap/mcc-auth-proxy-config/..2025_02_24_06_09_21.2517297950/config-file.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17 Jan 28 20:39:31 crc restorecon[4681]: /var/lib/kubelet/pods/0b574797-001e-440a-8f4e-c0be86edad0f/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17 Jan 28 20:39:31 crc restorecon[4681]: /var/lib/kubelet/pods/0b574797-001e-440a-8f4e-c0be86edad0f/containers/machine-config-controller/6655f00b not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17 Jan 28 20:39:31 crc restorecon[4681]: /var/lib/kubelet/pods/0b574797-001e-440a-8f4e-c0be86edad0f/containers/machine-config-controller/98bc3986 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17 Jan 28 20:39:31 crc restorecon[4681]: /var/lib/kubelet/pods/0b574797-001e-440a-8f4e-c0be86edad0f/containers/machine-config-controller/08e3458a not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17 Jan 28 20:39:31 crc restorecon[4681]: /var/lib/kubelet/pods/0b574797-001e-440a-8f4e-c0be86edad0f/containers/kube-rbac-proxy/2a191cb0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17 Jan 28 20:39:31 crc restorecon[4681]: /var/lib/kubelet/pods/0b574797-001e-440a-8f4e-c0be86edad0f/containers/kube-rbac-proxy/6c4eeefb not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17 Jan 28 20:39:31 crc restorecon[4681]: /var/lib/kubelet/pods/0b574797-001e-440a-8f4e-c0be86edad0f/containers/kube-rbac-proxy/f61a549c not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c4,c17 Jan 28 20:39:31 crc restorecon[4681]: /var/lib/kubelet/pods/bd23aa5c-e532-4e53-bccf-e79f130c5ae8/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c318,c553 Jan 28 20:39:31 crc restorecon[4681]: /var/lib/kubelet/pods/bd23aa5c-e532-4e53-bccf-e79f130c5ae8/containers/hostpath-provisioner/24891863 not reset as customized by admin to system_u:object_r:container_file_t:s0:c37,c572 Jan 28 20:39:31 crc restorecon[4681]: /var/lib/kubelet/pods/bd23aa5c-e532-4e53-bccf-e79f130c5ae8/containers/hostpath-provisioner/fbdfd89c not reset as customized by admin to system_u:object_r:container_file_t:s0:c318,c553 Jan 28 20:39:31 crc restorecon[4681]: /var/lib/kubelet/pods/bd23aa5c-e532-4e53-bccf-e79f130c5ae8/containers/liveness-probe/9b63b3bc not reset as customized by admin to system_u:object_r:container_file_t:s0:c37,c572 Jan 28 20:39:31 crc restorecon[4681]: /var/lib/kubelet/pods/bd23aa5c-e532-4e53-bccf-e79f130c5ae8/containers/liveness-probe/8acde6d6 not reset as customized by admin to system_u:object_r:container_file_t:s0:c318,c553 Jan 28 20:39:31 crc restorecon[4681]: /var/lib/kubelet/pods/bd23aa5c-e532-4e53-bccf-e79f130c5ae8/containers/node-driver-registrar/59ecbba3 not reset as customized by admin to system_u:object_r:container_file_t:s0:c318,c553 Jan 28 20:39:31 crc restorecon[4681]: /var/lib/kubelet/pods/bd23aa5c-e532-4e53-bccf-e79f130c5ae8/containers/csi-provisioner/685d4be3 not reset as customized by admin to system_u:object_r:container_file_t:s0:c318,c553 Jan 28 20:39:31 crc restorecon[4681]: /var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/volumes/kubernetes.io~configmap/config not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c23 Jan 28 20:39:31 crc restorecon[4681]: /var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/volumes/kubernetes.io~configmap/config/..2025_02_24_06_20_07.341639300 not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c2,c23 Jan 28 20:39:31 crc restorecon[4681]: /var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/volumes/kubernetes.io~configmap/config/..2025_02_24_06_20_07.341639300/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c23 Jan 28 20:39:31 crc restorecon[4681]: /var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/volumes/kubernetes.io~configmap/config/..2025_02_24_06_20_07.341639300/openshift-route-controller-manager.client-ca.configmap not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c23 Jan 28 20:39:31 crc restorecon[4681]: /var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/volumes/kubernetes.io~configmap/config/..2025_02_24_06_20_07.341639300/openshift-route-controller-manager.serving-cert.secret not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c23 Jan 28 20:39:31 crc restorecon[4681]: /var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/volumes/kubernetes.io~configmap/config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c23 Jan 28 20:39:31 crc restorecon[4681]: /var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/volumes/kubernetes.io~configmap/config/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c23 Jan 28 20:39:31 crc restorecon[4681]: /var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/volumes/kubernetes.io~configmap/config/openshift-route-controller-manager.client-ca.configmap not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c23 Jan 28 20:39:31 crc restorecon[4681]: /var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/volumes/kubernetes.io~configmap/config/openshift-route-controller-manager.serving-cert.secret not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c23 Jan 28 20:39:31 crc restorecon[4681]: 
/var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/volumes/kubernetes.io~configmap/client-ca not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c23 Jan 28 20:39:31 crc restorecon[4681]: /var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/volumes/kubernetes.io~configmap/client-ca/..2025_02_24_06_20_07.2950937851 not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c23 Jan 28 20:39:31 crc restorecon[4681]: /var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/volumes/kubernetes.io~configmap/client-ca/..2025_02_24_06_20_07.2950937851/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c23 Jan 28 20:39:31 crc restorecon[4681]: /var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/volumes/kubernetes.io~configmap/client-ca/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c23 Jan 28 20:39:31 crc restorecon[4681]: /var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/volumes/kubernetes.io~configmap/client-ca/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c23 Jan 28 20:39:31 crc restorecon[4681]: /var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c23 Jan 28 20:39:31 crc restorecon[4681]: /var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/containers/route-controller-manager/feaea55e not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c23 Jan 28 20:39:31 crc restorecon[4681]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 28 20:39:31 crc restorecon[4681]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13 Jan 28 20:39:31 crc restorecon[4681]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/abinitio-runtime-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 28 20:39:31 crc restorecon[4681]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/abinitio-runtime-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 28 20:39:31 crc restorecon[4681]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/accuknox-operator-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 28 20:39:31 crc restorecon[4681]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/accuknox-operator-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 28 20:39:31 crc restorecon[4681]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aci-containers-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 28 20:39:31 crc restorecon[4681]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aci-containers-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 28 20:39:31 crc restorecon[4681]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aikit-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 28 20:39:31 crc restorecon[4681]: 
/var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aikit-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 28 20:39:31 crc restorecon[4681]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/airlock-microgateway not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 28 20:39:31 crc restorecon[4681]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/airlock-microgateway/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 28 20:39:31 crc restorecon[4681]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ako-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 28 20:39:31 crc restorecon[4681]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ako-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 28 20:39:31 crc restorecon[4681]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/alloy not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 28 20:39:31 crc restorecon[4681]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/alloy/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 28 20:39:31 crc restorecon[4681]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/anchore-engine not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 
28 20:39:31 crc restorecon[4681]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/anchore-engine/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 28 20:39:31 crc restorecon[4681]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/anzo-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 28 20:39:31 crc restorecon[4681]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/anzo-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 28 20:39:31 crc restorecon[4681]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/anzograph-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 28 20:39:31 crc restorecon[4681]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/anzograph-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 28 20:39:31 crc restorecon[4681]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/anzounstructured-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 28 20:39:31 crc restorecon[4681]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/anzounstructured-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 28 20:39:31 crc restorecon[4681]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/appdynamics-cloud-operator 
not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 28 20:39:31 crc restorecon[4681]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/appdynamics-cloud-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 28 20:39:31 crc restorecon[4681]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/appdynamics-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 28 20:39:31 crc restorecon[4681]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/appdynamics-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 28 20:39:31 crc restorecon[4681]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aqua-operator-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 28 20:39:31 crc restorecon[4681]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aqua-operator-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 28 20:39:31 crc restorecon[4681]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cass-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 28 20:39:31 crc restorecon[4681]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cass-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 28 20:39:31 crc restorecon[4681]: 
/var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ccm-node-agent-dcap-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 28 20:39:31 crc restorecon[4681]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ccm-node-agent-dcap-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 28 20:39:31 crc restorecon[4681]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ccm-node-agent-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 28 20:39:31 crc restorecon[4681]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ccm-node-agent-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 28 20:39:31 crc restorecon[4681]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cfm-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 28 20:39:31 crc restorecon[4681]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cfm-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 28 20:39:31 crc restorecon[4681]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cilium not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 28 20:39:31 crc restorecon[4681]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cilium/catalog.json not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13 Jan 28 20:39:31 crc restorecon[4681]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cilium-enterprise not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 28 20:39:31 crc restorecon[4681]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cilium-enterprise/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 28 20:39:31 crc restorecon[4681]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cloud-native-postgresql not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 28 20:39:31 crc restorecon[4681]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cloud-native-postgresql/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 28 20:39:31 crc restorecon[4681]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cloudbees-ci not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 28 20:39:31 crc restorecon[4681]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cloudbees-ci/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 28 20:39:31 crc restorecon[4681]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cloudera-streams-messaging-kubernetes-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 28 20:39:31 crc restorecon[4681]: 
/var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cloudera-streams-messaging-kubernetes-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 28 20:39:31 crc restorecon[4681]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cloudnative-pg not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 28 20:39:31 crc restorecon[4681]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cloudnative-pg/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 28 20:39:31 crc restorecon[4681]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cnfv-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 28 20:39:31 crc restorecon[4681]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cnfv-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 28 20:39:31 crc restorecon[4681]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cockroachdb-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 28 20:39:31 crc restorecon[4681]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cockroachdb-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 28 20:39:31 crc restorecon[4681]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/conjur-follower-operator not reset as 
customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 28 20:39:31 crc restorecon[4681]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/conjur-follower-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 28 20:39:31 crc restorecon[4681]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/coroot-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 28 20:39:31 crc restorecon[4681]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/coroot-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 28 20:39:31 crc restorecon[4681]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/crunchy-postgres-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 28 20:39:31 crc restorecon[4681]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/crunchy-postgres-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 28 20:39:31 crc restorecon[4681]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cte-k8s-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 28 20:39:31 crc restorecon[4681]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cte-k8s-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 28 20:39:31 crc restorecon[4681]: 
/var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/datadog-operator-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 28 20:39:31 crc restorecon[4681]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/datadog-operator-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 28 20:39:31 crc restorecon[4681]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dell-csm-operator-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 28 20:39:31 crc restorecon[4681]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dell-csm-operator-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 28 20:39:31 crc restorecon[4681]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/digitalai-deploy-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 28 20:39:31 crc restorecon[4681]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/digitalai-deploy-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 28 20:39:31 crc restorecon[4681]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/digitalai-release-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 28 20:39:31 crc restorecon[4681]: 
/var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/digitalai-release-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 28 20:39:31 crc restorecon[4681]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dynatrace-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 28 20:39:31 crc restorecon[4681]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dynatrace-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 28 20:39:31 crc restorecon[4681]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/edb-hcp-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 28 20:39:31 crc restorecon[4681]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/edb-hcp-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 28 20:39:31 crc restorecon[4681]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/eginnovations-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 28 20:39:31 crc restorecon[4681]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/eginnovations-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 28 20:39:31 crc restorecon[4681]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/elasticsearch-eck-operator-certified not reset as 
customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 28 20:39:31 crc restorecon[4681]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/elasticsearch-eck-operator-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 28 20:39:31 crc restorecon[4681]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/falcon-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 28 20:39:31 crc restorecon[4681]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/falcon-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 28 20:39:31 crc restorecon[4681]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/federatorai-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 28 20:39:31 crc restorecon[4681]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/federatorai-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 28 20:39:31 crc restorecon[4681]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/fujitsu-enterprise-postgres-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 28 20:39:31 crc restorecon[4681]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/fujitsu-enterprise-postgres-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 28 20:39:31 crc restorecon[4681]: 
/var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/function-mesh not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 28 20:39:31 crc restorecon[4681]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/function-mesh/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 28 20:39:31 crc restorecon[4681]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/harness-gitops-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 28 20:39:31 crc restorecon[4681]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/harness-gitops-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 28 20:39:31 crc restorecon[4681]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/hazelcast-platform-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 28 20:39:31 crc restorecon[4681]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/hazelcast-platform-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 28 20:39:31 crc restorecon[4681]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/hcp-terraform-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 28 20:39:31 crc restorecon[4681]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/hcp-terraform-operator/catalog.json not reset as 
customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 28 20:39:31 crc restorecon[4681]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/hpe-ezmeral-csi-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 28 20:39:31 crc restorecon[4681]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/hpe-ezmeral-csi-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 28 20:39:31 crc restorecon[4681]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibm-application-gateway-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 28 20:39:31 crc restorecon[4681]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibm-application-gateway-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 28 20:39:31 crc restorecon[4681]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibm-block-csi-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 28 20:39:31 crc restorecon[4681]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibm-block-csi-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 28 20:39:31 crc restorecon[4681]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibm-security-verify-access-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 28 20:39:31 crc restorecon[4681]: 
/var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibm-security-verify-access-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 28 20:39:31 crc restorecon[4681]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibm-security-verify-directory-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 28 20:39:31 crc restorecon[4681]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibm-security-verify-directory-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 28 20:39:31 crc restorecon[4681]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibm-security-verify-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 28 20:39:31 crc restorecon[4681]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibm-security-verify-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 28 20:39:31 crc restorecon[4681]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/infoscale-dr-manager not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 28 20:39:31 crc restorecon[4681]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/infoscale-dr-manager/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 28 20:39:31 crc restorecon[4681]: 
/var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/infoscale-licensing-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 28 20:39:31 crc restorecon[4681]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/infoscale-licensing-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 28 20:39:31 crc restorecon[4681]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/infoscale-sds-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 28 20:39:31 crc restorecon[4681]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/infoscale-sds-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 28 20:39:31 crc restorecon[4681]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/infrastructure-asset-orchestrator-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 28 20:39:31 crc restorecon[4681]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/infrastructure-asset-orchestrator-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 28 20:39:31 crc restorecon[4681]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/instana-agent-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 28 20:39:31 crc restorecon[4681]: 
/var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/instana-agent-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 28 20:39:31 crc restorecon[4681]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/intel-device-plugins-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 28 20:39:31 crc restorecon[4681]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/intel-device-plugins-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 28 20:39:31 crc restorecon[4681]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/intel-kubernetes-power-manager not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 28 20:39:31 crc restorecon[4681]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/intel-kubernetes-power-manager/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 28 20:39:31 crc restorecon[4681]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/iomesh-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 28 20:39:31 crc restorecon[4681]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/iomesh-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 28 20:39:31 crc restorecon[4681]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/joget-dx-operator 
not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 28 20:39:31 crc restorecon[4681]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/joget-dx-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 28 20:39:31 crc restorecon[4681]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/joget-dx8-openshift-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 28 20:39:31 crc restorecon[4681]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/joget-dx8-openshift-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 28 20:39:31 crc restorecon[4681]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/joget-dx8-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 28 20:39:31 crc restorecon[4681]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/joget-dx8-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 28 20:39:31 crc restorecon[4681]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/k8s-triliovault not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 28 20:39:31 crc restorecon[4681]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/k8s-triliovault/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 28 20:39:31 crc restorecon[4681]: 
/var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-ati-updates not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 28 20:39:31 crc restorecon[4681]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-ati-updates/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 28 20:39:31 crc restorecon[4681]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-kcos-framework not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 28 20:39:31 crc restorecon[4681]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-kcos-framework/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 28 20:39:31 crc restorecon[4681]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-kcos-ingress not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 28 20:39:31 crc restorecon[4681]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-kcos-ingress/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 28 20:39:31 crc restorecon[4681]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-kcos-licensing not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 28 20:39:31 crc restorecon[4681]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-kcos-licensing/catalog.json not reset as 
customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 28 20:39:31 crc restorecon[4681]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-kcos-sso not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 28 20:39:31 crc restorecon[4681]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-kcos-sso/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 28 20:39:31 crc restorecon[4681]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-keycloak-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 28 20:39:31 crc restorecon[4681]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-keycloak-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 28 20:39:31 crc restorecon[4681]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-load-core not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 28 20:39:31 crc restorecon[4681]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-load-core/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 28 20:39:31 crc restorecon[4681]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-loadcore-agents not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 28 20:39:31 crc restorecon[4681]: 
/var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-loadcore-agents/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 28 20:39:31 crc restorecon[4681]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-nats-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 28 20:39:31 crc restorecon[4681]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-nats-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 28 20:39:31 crc restorecon[4681]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-nimbusmosaic-dusim not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 28 20:39:31 crc restorecon[4681]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-nimbusmosaic-dusim/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 28 20:39:31 crc restorecon[4681]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-rest-api-browser-v1 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 28 20:39:31 crc restorecon[4681]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-rest-api-browser-v1/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 28 20:39:31 crc restorecon[4681]: 
/var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-appsec not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 28 20:39:31 crc restorecon[4681]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-appsec/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 28 20:39:31 crc restorecon[4681]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-core not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 28 20:39:31 crc restorecon[4681]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-core/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 28 20:39:31 crc restorecon[4681]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-db not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 28 20:39:31 crc restorecon[4681]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-db/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 28 20:39:31 crc restorecon[4681]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-diagnostics not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 28 20:39:31 crc restorecon[4681]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-diagnostics/catalog.json not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13 Jan 28 20:39:31 crc restorecon[4681]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-logging not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 28 20:39:31 crc restorecon[4681]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-logging/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 28 20:39:31 crc restorecon[4681]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-migration not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 28 20:39:31 crc restorecon[4681]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-migration/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 28 20:39:31 crc restorecon[4681]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-msg-broker not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 28 20:39:31 crc restorecon[4681]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-msg-broker/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 28 20:39:31 crc restorecon[4681]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-notifications not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 28 20:39:31 crc restorecon[4681]: 
/var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-notifications/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 28 20:39:31 crc restorecon[4681]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-stats-dashboards not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 28 20:39:31 crc restorecon[4681]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-stats-dashboards/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 28 20:39:31 crc restorecon[4681]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-storage not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 28 20:39:31 crc restorecon[4681]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-storage/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 28 20:39:31 crc restorecon[4681]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-test-core not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 28 20:39:31 crc restorecon[4681]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-test-core/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 28 20:39:31 crc restorecon[4681]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-ui not 
reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 28 20:39:31 crc restorecon[4681]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-ui/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 28 20:39:31 crc restorecon[4681]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-websocket-service not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 28 20:39:31 crc restorecon[4681]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-websocket-service/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 28 20:39:31 crc restorecon[4681]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kong-gateway-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 28 20:39:31 crc restorecon[4681]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kong-gateway-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 28 20:39:31 crc restorecon[4681]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubearmor-operator-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 28 20:39:31 crc restorecon[4681]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubearmor-operator-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 28 20:39:31 crc restorecon[4681]: 
/var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubecost-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 28 20:39:31 crc restorecon[4681]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubecost-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 28 20:39:31 crc restorecon[4681]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubemq-operator-marketplace not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 28 20:39:31 crc restorecon[4681]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubemq-operator-marketplace/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 28 20:39:31 crc restorecon[4681]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubeturbo-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 28 20:39:31 crc restorecon[4681]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubeturbo-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 28 20:39:31 crc restorecon[4681]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/lenovo-locd-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 28 20:39:31 crc restorecon[4681]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/lenovo-locd-operator/catalog.json not reset as 
customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 28 20:39:31 crc restorecon[4681]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/marketplace-games-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 28 20:39:31 crc restorecon[4681]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/marketplace-games-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 28 20:39:31 crc restorecon[4681]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/memcached-operator-ogaye not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 28 20:39:31 crc restorecon[4681]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/memcached-operator-ogaye/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 28 20:39:31 crc restorecon[4681]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/memory-machine-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 28 20:39:31 crc restorecon[4681]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/memory-machine-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 28 20:39:31 crc restorecon[4681]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/model-builder-for-vision-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 28 20:39:31 crc restorecon[4681]: 
/var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/model-builder-for-vision-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 28 20:39:31 crc restorecon[4681]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mongodb-atlas-kubernetes not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 28 20:39:31 crc restorecon[4681]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mongodb-atlas-kubernetes/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 28 20:39:31 crc restorecon[4681]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mongodb-enterprise not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 28 20:39:31 crc restorecon[4681]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mongodb-enterprise/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 28 20:39:31 crc restorecon[4681]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/netapp-spark-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 28 20:39:31 crc restorecon[4681]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/netapp-spark-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 28 20:39:31 crc restorecon[4681]: 
/var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/netscaler-adm-agent-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 28 20:39:31 crc restorecon[4681]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/netscaler-adm-agent-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 28 20:39:31 crc restorecon[4681]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/netscaler-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 28 20:39:31 crc restorecon[4681]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/netscaler-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 28 20:39:31 crc restorecon[4681]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/neuvector-certified-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 28 20:39:31 crc restorecon[4681]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/neuvector-certified-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 28 20:39:31 crc restorecon[4681]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nexus-repository-ha-operator-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 28 20:39:31 crc restorecon[4681]: 
/var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nexus-repository-ha-operator-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 28 20:39:31 crc restorecon[4681]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nginx-ingress-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 28 20:39:31 crc restorecon[4681]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nginx-ingress-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 28 20:39:31 crc restorecon[4681]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pcc-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 28 20:39:31 crc restorecon[4681]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pcc-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 28 20:39:31 crc restorecon[4681]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nim-operator-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 28 20:39:31 crc restorecon[4681]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nim-operator-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 28 20:39:31 crc restorecon[4681]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nxiq-operator-certified not reset as 
customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 28 20:39:31 crc restorecon[4681]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nxiq-operator-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 28 20:39:31 crc restorecon[4681]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nxrm-operator-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 28 20:39:31 crc restorecon[4681]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nxrm-operator-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 28 20:39:31 crc restorecon[4681]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/odigos-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 28 20:39:31 crc restorecon[4681]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/odigos-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 28 20:39:31 crc restorecon[4681]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/open-liberty-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 28 20:39:31 crc restorecon[4681]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/open-liberty-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 28 20:39:31 crc restorecon[4681]: 
/var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshiftartifactoryha-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 28 20:39:31 crc restorecon[4681]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshiftartifactoryha-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 28 20:39:31 crc restorecon[4681]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshiftxray-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 28 20:39:31 crc restorecon[4681]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshiftxray-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 28 20:39:31 crc restorecon[4681]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/operator-certification-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 28 20:39:31 crc restorecon[4681]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/operator-certification-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 28 20:39:31 crc restorecon[4681]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ovms-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 28 20:39:31 crc restorecon[4681]: 
/var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ovms-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 28 20:39:31 crc restorecon[4681]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pachyderm-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 28 20:39:31 crc restorecon[4681]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pachyderm-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 28 20:39:31 crc restorecon[4681]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pmem-csi-operator-os not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 28 20:39:31 crc restorecon[4681]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pmem-csi-operator-os/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 28 20:39:31 crc restorecon[4681]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/portworx-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 28 20:39:31 crc restorecon[4681]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/portworx-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 28 20:39:31 crc restorecon[4681]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/prometurbo-certified not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13 Jan 28 20:39:31 crc restorecon[4681]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/prometurbo-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 28 20:39:31 crc restorecon[4681]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pubsubplus-eventbroker-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 28 20:39:31 crc restorecon[4681]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pubsubplus-eventbroker-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 28 20:39:31 crc restorecon[4681]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/redis-enterprise-operator-cert not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 28 20:39:31 crc restorecon[4681]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/redis-enterprise-operator-cert/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 28 20:39:31 crc restorecon[4681]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/runtime-component-operator-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 28 20:39:31 crc restorecon[4681]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/runtime-component-operator-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 28 20:39:31 crc restorecon[4681]: 
/var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/runtime-fabric-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 28 20:39:31 crc restorecon[4681]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/runtime-fabric-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 28 20:39:31 crc restorecon[4681]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sanstoragecsi-operator-bundle not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 28 20:39:31 crc restorecon[4681]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sanstoragecsi-operator-bundle/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 28 20:39:31 crc restorecon[4681]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/silicom-sts-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 28 20:39:31 crc restorecon[4681]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/silicom-sts-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 28 20:39:31 crc restorecon[4681]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/smilecdr-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 28 20:39:31 crc restorecon[4681]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/smilecdr-operator/catalog.json not 
reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 28 20:39:31 crc restorecon[4681]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sriov-fec not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 28 20:39:31 crc restorecon[4681]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sriov-fec/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 28 20:39:31 crc restorecon[4681]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/stackable-commons-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 28 20:39:31 crc restorecon[4681]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/stackable-commons-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 28 20:39:31 crc restorecon[4681]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/stackable-zookeeper-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 28 20:39:31 crc restorecon[4681]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/stackable-zookeeper-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 28 20:39:31 crc restorecon[4681]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/t8c-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 28 20:39:31 crc restorecon[4681]: 
/var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/t8c-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 28 20:39:31 crc restorecon[4681]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/t8c-tsc-client-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 28 20:39:31 crc restorecon[4681]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/t8c-tsc-client-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 28 20:39:31 crc restorecon[4681]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/tawon-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 28 20:39:31 crc restorecon[4681]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/tawon-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 28 20:39:31 crc restorecon[4681]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/tigera-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 28 20:39:31 crc restorecon[4681]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/tigera-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 28 20:39:31 crc restorecon[4681]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/timemachine-operator not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13 Jan 28 20:39:31 crc restorecon[4681]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/timemachine-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 28 20:39:31 crc restorecon[4681]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/vault-secrets-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 28 20:39:31 crc restorecon[4681]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/vault-secrets-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 28 20:39:31 crc restorecon[4681]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/vcp-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 28 20:39:31 crc restorecon[4681]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/vcp-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 28 20:39:31 crc restorecon[4681]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/webotx-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 28 20:39:31 crc restorecon[4681]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/webotx-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 28 20:39:31 crc restorecon[4681]: 
/var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/xcrypt-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 28 20:39:31 crc restorecon[4681]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/xcrypt-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 28 20:39:31 crc restorecon[4681]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/zabbix-operator-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 28 20:39:31 crc restorecon[4681]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/zabbix-operator-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 28 20:39:31 crc restorecon[4681]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/cache not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 28 20:39:31 crc restorecon[4681]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 28 20:39:31 crc restorecon[4681]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 28 20:39:31 crc restorecon[4681]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/00000-1.psg not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 28 20:39:31 
crc restorecon[4681]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/00000-1.psg.pmt not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 28 20:39:31 crc restorecon[4681]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/db.pmt not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 28 20:39:31 crc restorecon[4681]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/index.pmt not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 28 20:39:31 crc restorecon[4681]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/main.pix not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 28 20:39:31 crc restorecon[4681]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/overflow.pix not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 28 20:39:31 crc restorecon[4681]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/digest not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 28 20:39:31 crc restorecon[4681]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/utilities not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 28 20:39:31 crc restorecon[4681]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/utilities/copy-content not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 28 20:39:31 crc 
restorecon[4681]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 28 20:39:31 crc restorecon[4681]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/containers/extract-utilities/63709497 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 28 20:39:31 crc restorecon[4681]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/containers/extract-utilities/d966b7fd not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 28 20:39:31 crc restorecon[4681]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/containers/extract-utilities/f5773757 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 28 20:39:31 crc restorecon[4681]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/containers/extract-content/81c9edb9 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 28 20:39:31 crc restorecon[4681]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/containers/extract-content/57bf57ee not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 28 20:39:31 crc restorecon[4681]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/containers/extract-content/86f5e6aa not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 28 20:39:31 crc restorecon[4681]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/containers/registry-server/0aabe31d not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 28 20:39:31 crc restorecon[4681]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/containers/registry-server/d2af85c2 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 28 20:39:31 crc restorecon[4681]: 
/var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/containers/registry-server/09d157d9 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 28 20:39:31 crc restorecon[4681]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 28 20:39:31 crc restorecon[4681]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/cache not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 28 20:39:31 crc restorecon[4681]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 28 20:39:31 crc restorecon[4681]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 28 20:39:31 crc restorecon[4681]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/00000-1.psg not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 28 20:39:31 crc restorecon[4681]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/00000-1.psg.pmt not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 28 20:39:31 crc restorecon[4681]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/db.pmt not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 28 20:39:31 crc restorecon[4681]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/index.pmt not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 28 20:39:31 crc restorecon[4681]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/main.pix not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 28 20:39:31 crc restorecon[4681]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/overflow.pix not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 28 20:39:31 crc restorecon[4681]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/digest not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 28 20:39:31 crc restorecon[4681]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 28 20:39:31 crc restorecon[4681]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/3scale-community-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 28 20:39:31 crc restorecon[4681]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/3scale-community-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 28 20:39:31 crc restorecon[4681]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-acm-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 
Jan 28 20:39:31 crc restorecon[4681]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-acm-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 28 20:39:31 crc restorecon[4681]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-acmpca-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 28 20:39:31 crc restorecon[4681]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-acmpca-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 28 20:39:31 crc restorecon[4681]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-apigateway-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 28 20:39:31 crc restorecon[4681]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-apigateway-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 28 20:39:31 crc restorecon[4681]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-apigatewayv2-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 28 20:39:31 crc restorecon[4681]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-apigatewayv2-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 28 20:39:31 crc restorecon[4681]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-applicationautoscaling-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 28 20:39:31 crc restorecon[4681]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-applicationautoscaling-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 28 20:39:31 crc restorecon[4681]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-athena-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 28 20:39:31 crc restorecon[4681]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-athena-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 28 20:39:31 crc restorecon[4681]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-cloudfront-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 28 20:39:31 crc restorecon[4681]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-cloudfront-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 28 20:39:31 crc restorecon[4681]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-cloudtrail-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 28 20:39:31 crc restorecon[4681]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-cloudtrail-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 28 20:39:31 crc restorecon[4681]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-cloudwatch-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 28 20:39:31 crc restorecon[4681]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-cloudwatch-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 28 20:39:31 crc restorecon[4681]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-cloudwatchlogs-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 28 20:39:31 crc restorecon[4681]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-cloudwatchlogs-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 28 20:39:31 crc restorecon[4681]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-documentdb-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 28 20:39:31 crc restorecon[4681]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-documentdb-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 28 20:39:31 crc restorecon[4681]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-dynamodb-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 28 20:39:31 crc restorecon[4681]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-dynamodb-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 28 20:39:31 crc restorecon[4681]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-ec2-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 28 20:39:31 crc restorecon[4681]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-ec2-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 28 20:39:31 crc restorecon[4681]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-ecr-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 28 20:39:31 crc restorecon[4681]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-ecr-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 28 20:39:31 crc restorecon[4681]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-ecs-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 28 20:39:31 crc restorecon[4681]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-ecs-controller/catalog.json not reset as customized by admin 
to system_u:object_r:container_file_t:s0:c7,c13 Jan 28 20:39:31 crc restorecon[4681]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-efs-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 28 20:39:31 crc restorecon[4681]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-efs-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 28 20:39:31 crc restorecon[4681]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-eks-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 28 20:39:31 crc restorecon[4681]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-eks-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 28 20:39:31 crc restorecon[4681]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-elasticache-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 28 20:39:31 crc restorecon[4681]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-elasticache-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 28 20:39:31 crc restorecon[4681]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-elbv2-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 28 20:39:31 crc restorecon[4681]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-elbv2-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 28 20:39:31 crc restorecon[4681]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-emrcontainers-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 28 20:39:31 crc restorecon[4681]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-emrcontainers-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 28 20:39:31 crc restorecon[4681]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-eventbridge-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 28 20:39:31 crc restorecon[4681]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-eventbridge-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 28 20:39:31 crc restorecon[4681]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-iam-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 28 20:39:31 crc restorecon[4681]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-iam-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 28 20:39:31 crc restorecon[4681]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-kafka-controller 
not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 28 20:39:31 crc restorecon[4681]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-kafka-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 28 20:39:31 crc restorecon[4681]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-keyspaces-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 28 20:39:31 crc restorecon[4681]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-keyspaces-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 28 20:39:31 crc restorecon[4681]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-kinesis-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 28 20:39:31 crc restorecon[4681]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-kinesis-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 28 20:39:31 crc restorecon[4681]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-kms-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 28 20:39:31 crc restorecon[4681]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-kms-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 28 20:39:31 crc restorecon[4681]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-lambda-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 28 20:39:31 crc restorecon[4681]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-lambda-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 28 20:39:31 crc restorecon[4681]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-memorydb-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 28 20:39:31 crc restorecon[4681]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-memorydb-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 28 20:39:31 crc restorecon[4681]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-mq-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 28 20:39:31 crc restorecon[4681]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-mq-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 28 20:39:31 crc restorecon[4681]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-networkfirewall-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 28 20:39:31 crc restorecon[4681]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-networkfirewall-controller/catalog.json not 
reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 28 20:39:31 crc restorecon[4681]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-opensearchservice-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 28 20:39:31 crc restorecon[4681]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-opensearchservice-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 28 20:39:31 crc restorecon[4681]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-organizations-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 28 20:39:31 crc restorecon[4681]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-organizations-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 28 20:39:31 crc restorecon[4681]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-pipes-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 28 20:39:31 crc restorecon[4681]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-pipes-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 28 20:39:31 crc restorecon[4681]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-prometheusservice-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 28 20:39:31 crc restorecon[4681]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-prometheusservice-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 28 20:39:31 crc restorecon[4681]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-rds-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 28 20:39:31 crc restorecon[4681]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-rds-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 28 20:39:31 crc restorecon[4681]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-recyclebin-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 28 20:39:31 crc restorecon[4681]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-recyclebin-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 28 20:39:31 crc restorecon[4681]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-route53-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 28 20:39:31 crc restorecon[4681]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-route53-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 28 20:39:31 crc restorecon[4681]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-route53resolver-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 28 20:39:31 crc restorecon[4681]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-route53resolver-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 28 20:39:31 crc restorecon[4681]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-s3-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 28 20:39:31 crc restorecon[4681]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-s3-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 28 20:39:31 crc restorecon[4681]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-sagemaker-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 28 20:39:31 crc restorecon[4681]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-sagemaker-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 28 20:39:31 crc restorecon[4681]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-secretsmanager-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 28 20:39:31 crc restorecon[4681]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-secretsmanager-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 28 20:39:31 crc restorecon[4681]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-ses-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 28 20:39:31 crc restorecon[4681]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-ses-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 28 20:39:31 crc restorecon[4681]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-sfn-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 28 20:39:31 crc restorecon[4681]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-sfn-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 28 20:39:31 crc restorecon[4681]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-sns-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 28 20:39:31 crc restorecon[4681]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-sns-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 28 20:39:31 crc restorecon[4681]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-sqs-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 28 20:39:31 crc restorecon[4681]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-sqs-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 28 20:39:31 crc restorecon[4681]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-ssm-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 28 20:39:31 crc restorecon[4681]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-ssm-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 28 20:39:31 crc restorecon[4681]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-wafv2-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 28 20:39:31 crc restorecon[4681]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-wafv2-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 28 20:39:31 crc restorecon[4681]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aerospike-kubernetes-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 28 20:39:31 crc restorecon[4681]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aerospike-kubernetes-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 28 20:39:31 crc restorecon[4681]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/airflow-helm-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 28 20:39:31 crc restorecon[4681]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/airflow-helm-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 28 20:39:31 crc restorecon[4681]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/alloydb-omni-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 28 20:39:31 crc restorecon[4681]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/alloydb-omni-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 28 20:39:31 crc restorecon[4681]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/alvearie-imaging-ingestion not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 28 20:39:31 crc restorecon[4681]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/alvearie-imaging-ingestion/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 28 20:39:31 crc restorecon[4681]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/amd-gpu-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 28 20:39:31 crc restorecon[4681]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/amd-gpu-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 28 20:39:31 crc restorecon[4681]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/analytics-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 28 20:39:31 crc restorecon[4681]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/analytics-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 28 20:39:31 crc restorecon[4681]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/annotationlab not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 28 20:39:31 crc restorecon[4681]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/annotationlab/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 28 20:39:31 crc restorecon[4681]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/apicast-community-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 28 20:39:31 crc restorecon[4681]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/apicast-community-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 28 20:39:31 crc restorecon[4681]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/apicurio-api-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 28 20:39:31 crc restorecon[4681]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/apicurio-api-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 28 20:39:31 crc restorecon[4681]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/apicurio-registry not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 28 20:39:31 crc restorecon[4681]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/apicurio-registry/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 28 20:39:31 crc restorecon[4681]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/apicurito not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 28 20:39:31 crc restorecon[4681]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/apicurito/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 28 20:39:31 crc restorecon[4681]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/apimatic-kubernetes-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 28 20:39:31 crc restorecon[4681]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/apimatic-kubernetes-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 28 20:39:31 crc restorecon[4681]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/application-services-metering-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 28 20:39:31 crc restorecon[4681]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/application-services-metering-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 28 20:39:31 crc restorecon[4681]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aqua not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 28 20:39:31 crc restorecon[4681]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aqua/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 28 20:39:31 crc restorecon[4681]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/argocd-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 28 20:39:31 crc restorecon[4681]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/argocd-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 28 20:39:31 crc restorecon[4681]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/assisted-service-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 28 20:39:31 crc restorecon[4681]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/assisted-service-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 28 20:39:31 crc restorecon[4681]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/authorino-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 28 20:39:31 crc restorecon[4681]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/authorino-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 28 20:39:31 crc restorecon[4681]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/automotive-infra not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 28 20:39:31 crc restorecon[4681]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/automotive-infra/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 28 20:39:31 crc restorecon[4681]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aws-efs-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 28 20:39:31 crc restorecon[4681]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aws-efs-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 28 20:39:31 crc restorecon[4681]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/awss3-operator-registry not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 28 20:39:31 crc restorecon[4681]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/awss3-operator-registry/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 28 20:39:31 crc restorecon[4681]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/azure-service-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 28 20:39:31 crc restorecon[4681]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/azure-service-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 28 20:39:31 crc restorecon[4681]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/beegfs-csi-driver-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 28 20:39:31 crc restorecon[4681]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/beegfs-csi-driver-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 28 20:39:31 crc restorecon[4681]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/bpfman-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 28 20:39:31 crc restorecon[4681]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/bpfman-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 28 20:39:31 crc restorecon[4681]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/camel-k not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 28 20:39:31 crc restorecon[4681]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/camel-k/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 28 20:39:31 crc restorecon[4681]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/camel-karavan-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 28 20:39:31 crc restorecon[4681]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/camel-karavan-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 28 20:39:31 crc restorecon[4681]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cass-operator-community not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 28 20:39:31 crc restorecon[4681]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cass-operator-community/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 28 20:39:31 crc restorecon[4681]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cert-manager not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 28 20:39:31 crc restorecon[4681]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cert-manager/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 28 20:39:31 crc restorecon[4681]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cert-utils-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 28 20:39:31 crc restorecon[4681]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cert-utils-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 28 20:39:31 crc restorecon[4681]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cluster-aas-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 28 20:39:31 crc restorecon[4681]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cluster-aas-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 28 20:39:31 crc restorecon[4681]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cluster-impairment-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 28 20:39:31 crc restorecon[4681]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cluster-impairment-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 28 20:39:31 crc restorecon[4681]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cluster-manager not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 28 20:39:31 crc restorecon[4681]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cluster-manager/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 28 20:39:31 crc restorecon[4681]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cockroachdb not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 28 20:39:31 crc restorecon[4681]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cockroachdb/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 28 20:39:31 crc restorecon[4681]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/codeflare-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 28 20:39:31 crc restorecon[4681]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/codeflare-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 28 20:39:31 crc restorecon[4681]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/community-kubevirt-hyperconverged not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 28 20:39:31 crc restorecon[4681]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/community-kubevirt-hyperconverged/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 28 20:39:31 crc restorecon[4681]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/community-trivy-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 28 20:39:31 crc restorecon[4681]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/community-trivy-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 28 20:39:31 crc restorecon[4681]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/community-windows-machine-config-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 28 20:39:31 crc restorecon[4681]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/community-windows-machine-config-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 28 20:39:31 crc restorecon[4681]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/customized-user-remediation not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 28 20:39:31 crc restorecon[4681]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/customized-user-remediation/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 28 20:39:31 crc restorecon[4681]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cxl-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 28 20:39:31 crc restorecon[4681]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cxl-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 28 20:39:31 crc restorecon[4681]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dapr-kubernetes-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 28 20:39:31 crc restorecon[4681]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dapr-kubernetes-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 28 20:39:31 crc restorecon[4681]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/datadog-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 28 20:39:31 crc restorecon[4681]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/datadog-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 28 20:39:31 crc restorecon[4681]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/datatrucker-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 28 20:39:31 crc restorecon[4681]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/datatrucker-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 28 20:39:31 crc restorecon[4681]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dbaas-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 28 20:39:31 crc restorecon[4681]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dbaas-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 28 20:39:31 crc restorecon[4681]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/debezium-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 28 20:39:31 crc restorecon[4681]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/debezium-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 28 20:39:31 crc restorecon[4681]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dell-csm-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 28 20:39:31 crc restorecon[4681]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dell-csm-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 28 20:39:31 crc restorecon[4681]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/deployment-validation-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 28 20:39:31 crc restorecon[4681]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/deployment-validation-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 28 20:39:31 crc restorecon[4681]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/devopsinabox not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 28 20:39:31 crc restorecon[4681]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/devopsinabox/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 28 20:39:31 crc restorecon[4681]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dns-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 28 20:39:31 crc restorecon[4681]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dns-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 28 20:39:31 crc restorecon[4681]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dynatrace-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 28 20:39:31 crc restorecon[4681]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dynatrace-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 28 20:39:31 crc restorecon[4681]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/eclipse-amlen-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 28 20:39:31 crc restorecon[4681]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/eclipse-amlen-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 28 20:39:31 crc restorecon[4681]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/eclipse-che not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 28 20:39:31 crc restorecon[4681]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/eclipse-che/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 28 20:39:31 crc restorecon[4681]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ecr-secret-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 28 20:39:31 crc restorecon[4681]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ecr-secret-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 28 20:39:31 crc restorecon[4681]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/edp-keycloak-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 28 20:39:31 crc restorecon[4681]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/edp-keycloak-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 28 20:39:31 crc restorecon[4681]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/eginnovations-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 28 20:39:31 crc restorecon[4681]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/eginnovations-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 28 20:39:31 crc restorecon[4681]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/egressip-ipam-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 28 20:39:31 crc restorecon[4681]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/egressip-ipam-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 28 20:39:31 crc restorecon[4681]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ember-csi-community-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 28 20:39:31 crc restorecon[4681]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ember-csi-community-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 28 20:39:31 crc restorecon[4681]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/etcd not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 28 20:39:31 crc restorecon[4681]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/etcd/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 28 20:39:31 crc restorecon[4681]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/eventing-kogito not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 28 20:39:31 crc restorecon[4681]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/eventing-kogito/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 28 20:39:31 crc restorecon[4681]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/external-secrets-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 28 20:39:31 crc restorecon[4681]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/external-secrets-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 28 20:39:31 crc restorecon[4681]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/falcon-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 28 20:39:31 crc restorecon[4681]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/falcon-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 28 20:39:31 crc restorecon[4681]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/fence-agents-remediation not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 28 20:39:31 crc restorecon[4681]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/fence-agents-remediation/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 28 20:39:31 crc restorecon[4681]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/flink-kubernetes-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 28 20:39:31 crc restorecon[4681]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/flink-kubernetes-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 28 20:39:31 crc restorecon[4681]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/flux not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 28 20:39:31 crc restorecon[4681]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/flux/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 28 20:39:31 crc restorecon[4681]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/k8gb not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 28 20:39:31 crc restorecon[4681]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/k8gb/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 28 20:39:31 crc restorecon[4681]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/fossul-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 28 20:39:31 crc restorecon[4681]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/fossul-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 28 20:39:31 crc restorecon[4681]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/github-arc-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 28 20:39:31 crc restorecon[4681]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/github-arc-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 28 20:39:31 crc 
restorecon[4681]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/gitops-primer not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 28 20:39:31 crc restorecon[4681]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/gitops-primer/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 28 20:39:31 crc restorecon[4681]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/gitwebhook-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 28 20:39:31 crc restorecon[4681]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/gitwebhook-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 28 20:39:31 crc restorecon[4681]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/global-load-balancer-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 28 20:39:31 crc restorecon[4681]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/global-load-balancer-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 28 20:39:31 crc restorecon[4681]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/grafana-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 28 20:39:31 crc restorecon[4681]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/grafana-operator/catalog.json not reset as 
customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 28 20:39:31 crc restorecon[4681]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/group-sync-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 28 20:39:31 crc restorecon[4681]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/group-sync-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 28 20:39:31 crc restorecon[4681]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/hawtio-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 28 20:39:31 crc restorecon[4681]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/hawtio-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 28 20:39:31 crc restorecon[4681]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/hazelcast-platform-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 28 20:39:31 crc restorecon[4681]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/hazelcast-platform-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 28 20:39:31 crc restorecon[4681]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/hedvig-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 28 20:39:31 crc restorecon[4681]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/hedvig-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 28 20:39:31 crc restorecon[4681]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/hive-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 28 20:39:31 crc restorecon[4681]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/hive-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 28 20:39:31 crc restorecon[4681]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/horreum-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 28 20:39:31 crc restorecon[4681]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/horreum-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 28 20:39:31 crc restorecon[4681]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/hyperfoil-bundle not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 28 20:39:31 crc restorecon[4681]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/hyperfoil-bundle/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 28 20:39:31 crc restorecon[4681]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibm-block-csi-operator-community not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13 Jan 28 20:39:31 crc restorecon[4681]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibm-block-csi-operator-community/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 28 20:39:31 crc restorecon[4681]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibm-security-verify-access-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 28 20:39:31 crc restorecon[4681]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibm-security-verify-access-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 28 20:39:31 crc restorecon[4681]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibm-spectrum-scale-csi-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 28 20:39:31 crc restorecon[4681]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibm-spectrum-scale-csi-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 28 20:39:31 crc restorecon[4681]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibmcloud-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 28 20:39:31 crc restorecon[4681]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibmcloud-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 28 20:39:31 crc restorecon[4681]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/infinispan not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 28 20:39:31 crc restorecon[4681]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/infinispan/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 28 20:39:31 crc restorecon[4681]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/integrity-shield-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 28 20:39:31 crc restorecon[4681]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/integrity-shield-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 28 20:39:31 crc restorecon[4681]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ipfs-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 28 20:39:31 crc restorecon[4681]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ipfs-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 28 20:39:31 crc restorecon[4681]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/istio-workspace-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 28 20:39:31 crc restorecon[4681]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/istio-workspace-operator/catalog.json not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13 Jan 28 20:39:31 crc restorecon[4681]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/jaeger not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 28 20:39:31 crc restorecon[4681]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/jaeger/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 28 20:39:31 crc restorecon[4681]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kaoto-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 28 20:39:31 crc restorecon[4681]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kaoto-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 28 20:39:31 crc restorecon[4681]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keda not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 28 20:39:31 crc restorecon[4681]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keda/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 28 20:39:31 crc restorecon[4681]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keepalived-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 28 20:39:31 crc restorecon[4681]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keepalived-operator/catalog.json not reset as 
customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 28 20:39:31 crc restorecon[4681]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keycloak-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 28 20:39:31 crc restorecon[4681]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keycloak-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 28 20:39:31 crc restorecon[4681]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keycloak-permissions-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 28 20:39:31 crc restorecon[4681]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keycloak-permissions-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 28 20:39:31 crc restorecon[4681]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/klusterlet not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 28 20:39:31 crc restorecon[4681]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/klusterlet/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 28 20:39:31 crc restorecon[4681]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kogito-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 28 20:39:31 crc restorecon[4681]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kogito-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 28 20:39:31 crc restorecon[4681]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/koku-metrics-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 28 20:39:31 crc restorecon[4681]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/koku-metrics-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 28 20:39:31 crc restorecon[4681]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/konveyor-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 28 20:39:31 crc restorecon[4681]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/konveyor-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 28 20:39:31 crc restorecon[4681]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/korrel8r not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 28 20:39:31 crc restorecon[4681]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/korrel8r/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 28 20:39:31 crc restorecon[4681]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kuadrant-operator not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13 Jan 28 20:39:31 crc restorecon[4681]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kuadrant-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 28 20:39:31 crc restorecon[4681]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kube-green not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 28 20:39:31 crc restorecon[4681]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kube-green/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 28 20:39:31 crc restorecon[4681]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubecost not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 28 20:39:31 crc restorecon[4681]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubecost/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 28 20:39:31 crc restorecon[4681]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubernetes-imagepuller-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 28 20:39:31 crc restorecon[4681]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubernetes-imagepuller-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 28 20:39:31 crc restorecon[4681]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubeturbo not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 28 20:39:31 crc restorecon[4681]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubeturbo/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 28 20:39:31 crc restorecon[4681]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/l5-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 28 20:39:31 crc restorecon[4681]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/l5-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 28 20:39:31 crc restorecon[4681]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/layer7-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 28 20:39:31 crc restorecon[4681]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/layer7-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 28 20:39:31 crc restorecon[4681]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/lbconfig-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 28 20:39:31 crc restorecon[4681]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/lbconfig-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 
Jan 28 20:39:31 crc restorecon[4681]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/lib-bucket-provisioner not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 28 20:39:31 crc restorecon[4681]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/lib-bucket-provisioner/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 28 20:39:31 crc restorecon[4681]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/limitador-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 28 20:39:31 crc restorecon[4681]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/limitador-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 28 20:39:31 crc restorecon[4681]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/logging-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 28 20:39:31 crc restorecon[4681]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/logging-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 28 20:39:31 crc restorecon[4681]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/loki-helm-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 28 20:39:31 crc restorecon[4681]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/loki-helm-operator/catalog.json 
not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 28 20:39:31 crc restorecon[4681]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/loki-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 28 20:39:31 crc restorecon[4681]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/loki-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 28 20:39:31 crc restorecon[4681]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/machine-deletion-remediation not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 28 20:39:31 crc restorecon[4681]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/machine-deletion-remediation/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 28 20:39:31 crc restorecon[4681]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mariadb-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 28 20:39:31 crc restorecon[4681]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mariadb-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 28 20:39:31 crc restorecon[4681]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/marin3r not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 28 20:39:31 crc restorecon[4681]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/marin3r/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 28 20:39:31 crc restorecon[4681]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mercury-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 28 20:39:31 crc restorecon[4681]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mercury-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 28 20:39:31 crc restorecon[4681]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/microcks not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 28 20:39:31 crc restorecon[4681]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/microcks/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 28 20:39:31 crc restorecon[4681]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mongodb-atlas-kubernetes not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 28 20:39:31 crc restorecon[4681]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mongodb-atlas-kubernetes/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 28 20:39:31 crc restorecon[4681]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mongodb-operator not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13 Jan 28 20:39:31 crc restorecon[4681]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mongodb-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 28 20:39:31 crc restorecon[4681]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/move2kube-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 28 20:39:31 crc restorecon[4681]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/move2kube-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 28 20:39:31 crc restorecon[4681]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/multi-nic-cni-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 28 20:39:31 crc restorecon[4681]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/multi-nic-cni-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 28 20:39:31 crc restorecon[4681]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/multicluster-global-hub-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 28 20:39:31 crc restorecon[4681]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/multicluster-global-hub-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 28 20:39:31 crc restorecon[4681]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/multicluster-operators-subscription not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 28 20:39:31 crc restorecon[4681]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/multicluster-operators-subscription/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 28 20:39:31 crc restorecon[4681]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/must-gather-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 28 20:39:31 crc restorecon[4681]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/must-gather-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 28 20:39:31 crc restorecon[4681]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/namespace-configuration-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 28 20:39:31 crc restorecon[4681]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/namespace-configuration-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 28 20:39:31 crc restorecon[4681]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ncn-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 28 20:39:31 crc restorecon[4681]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ncn-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 28 20:39:31 crc restorecon[4681]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ndmspc-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 28 20:39:31 crc restorecon[4681]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ndmspc-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 28 20:39:31 crc restorecon[4681]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/netobserv-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 28 20:39:31 crc restorecon[4681]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/netobserv-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 28 20:39:31 crc restorecon[4681]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/neuvector-community-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 28 20:39:31 crc restorecon[4681]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/neuvector-community-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 28 20:39:31 crc restorecon[4681]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nexus-operator not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13 Jan 28 20:39:31 crc restorecon[4681]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nexus-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 28 20:39:31 crc restorecon[4681]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nexus-operator-m88i not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 28 20:39:31 crc restorecon[4681]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nexus-operator-m88i/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 28 20:39:31 crc restorecon[4681]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nfs-provisioner-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 28 20:39:31 crc restorecon[4681]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nfs-provisioner-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 28 20:39:31 crc restorecon[4681]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nlp-server not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 28 20:39:31 crc restorecon[4681]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nlp-server/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 28 20:39:31 crc restorecon[4681]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/node-discovery-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 28 20:39:31 crc restorecon[4681]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/node-discovery-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 28 20:39:31 crc restorecon[4681]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/node-healthcheck-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 28 20:39:31 crc restorecon[4681]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/node-healthcheck-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 28 20:39:31 crc restorecon[4681]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/node-maintenance-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 28 20:39:31 crc restorecon[4681]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/node-maintenance-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 28 20:39:31 crc restorecon[4681]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nsm-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 28 20:39:31 crc restorecon[4681]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nsm-operator/catalog.json not reset as 
customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 28 20:39:31 crc restorecon[4681]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/oadp-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 28 20:39:31 crc restorecon[4681]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/oadp-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 28 20:39:31 crc restorecon[4681]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/observability-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 28 20:39:31 crc restorecon[4681]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/observability-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 28 20:39:31 crc restorecon[4681]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/oci-ccm-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 28 20:39:31 crc restorecon[4681]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/oci-ccm-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 28 20:39:31 crc restorecon[4681]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ocm-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 28 20:39:31 crc restorecon[4681]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ocm-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 28 20:39:31 crc restorecon[4681]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/odoo-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 28 20:39:31 crc restorecon[4681]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/odoo-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 28 20:39:31 crc restorecon[4681]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/opendatahub-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 28 20:39:31 crc restorecon[4681]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/opendatahub-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 28 20:39:31 crc restorecon[4681]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openebs not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 28 20:39:31 crc restorecon[4681]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openebs/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 28 20:39:31 crc restorecon[4681]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-nfd-operator not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13 Jan 28 20:39:31 crc restorecon[4681]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-nfd-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 28 20:39:31 crc restorecon[4681]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-node-upgrade-mutex-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 28 20:39:31 crc restorecon[4681]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-node-upgrade-mutex-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 28 20:39:31 crc restorecon[4681]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-qiskit-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 28 20:39:31 crc restorecon[4681]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-qiskit-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 28 20:39:31 crc restorecon[4681]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/opentelemetry-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 28 20:39:31 crc restorecon[4681]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/opentelemetry-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 28 20:39:31 crc restorecon[4681]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/patch-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 28 20:39:31 crc restorecon[4681]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/patch-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 28 20:39:31 crc restorecon[4681]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/patterns-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 28 20:39:31 crc restorecon[4681]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/patterns-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 28 20:39:31 crc restorecon[4681]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pcc-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 28 20:39:31 crc restorecon[4681]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pcc-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 28 20:39:31 crc restorecon[4681]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pelorus-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 28 20:39:31 crc restorecon[4681]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pelorus-operator/catalog.json not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13 Jan 28 20:39:31 crc restorecon[4681]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/percona-xtradb-cluster-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 28 20:39:31 crc restorecon[4681]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/percona-xtradb-cluster-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 28 20:39:31 crc restorecon[4681]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/portworx-essentials not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 28 20:39:31 crc restorecon[4681]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/portworx-essentials/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 28 20:39:31 crc restorecon[4681]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/postgresql not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 28 20:39:31 crc restorecon[4681]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/postgresql/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 28 20:39:31 crc restorecon[4681]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/proactive-node-scaling-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 28 20:39:31 crc restorecon[4681]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/proactive-node-scaling-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 28 20:39:31 crc restorecon[4681]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/project-quay not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 28 20:39:31 crc restorecon[4681]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/project-quay/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 28 20:39:31 crc restorecon[4681]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/prometheus not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 28 20:39:31 crc restorecon[4681]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/prometheus/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 28 20:39:31 crc restorecon[4681]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/prometheus-exporter-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 28 20:39:31 crc restorecon[4681]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/prometheus-exporter-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 28 20:39:31 crc restorecon[4681]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/prometurbo not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13 Jan 28 20:39:31 crc restorecon[4681]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/prometurbo/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 28 20:39:31 crc restorecon[4681]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pubsubplus-eventbroker-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 28 20:39:31 crc restorecon[4681]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pubsubplus-eventbroker-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 28 20:39:31 crc restorecon[4681]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pulp-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 28 20:39:31 crc restorecon[4681]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pulp-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 28 20:39:31 crc restorecon[4681]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rabbitmq-cluster-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 28 20:39:31 crc restorecon[4681]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rabbitmq-cluster-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 28 20:39:31 crc restorecon[4681]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rabbitmq-messaging-topology-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 28 20:39:31 crc restorecon[4681]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rabbitmq-messaging-topology-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 28 20:39:31 crc restorecon[4681]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/redis-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 28 20:39:31 crc restorecon[4681]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/redis-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 28 20:39:31 crc restorecon[4681]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/reportportal-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 28 20:39:31 crc restorecon[4681]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/reportportal-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 28 20:39:31 crc restorecon[4681]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/resource-locker-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 28 20:39:31 crc restorecon[4681]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/resource-locker-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 28 20:39:31 crc restorecon[4681]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rhoas-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 28 20:39:31 crc restorecon[4681]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rhoas-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 28 20:39:31 crc restorecon[4681]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ripsaw not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 28 20:39:31 crc restorecon[4681]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ripsaw/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 28 20:39:31 crc restorecon[4681]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sailoperator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 28 20:39:31 crc restorecon[4681]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sailoperator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 28 20:39:31 crc restorecon[4681]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sap-commerce-operator not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13 Jan 28 20:39:31 crc restorecon[4681]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sap-commerce-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 28 20:39:31 crc restorecon[4681]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sap-data-intelligence-observer-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 28 20:39:31 crc restorecon[4681]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sap-data-intelligence-observer-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 28 20:39:31 crc restorecon[4681]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sap-hana-express-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 28 20:39:31 crc restorecon[4681]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sap-hana-express-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 28 20:39:31 crc restorecon[4681]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/seldon-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 28 20:39:31 crc restorecon[4681]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/seldon-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 28 20:39:31 crc restorecon[4681]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/self-node-remediation not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 28 20:39:31 crc restorecon[4681]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/self-node-remediation/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 28 20:39:31 crc restorecon[4681]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/service-binding-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 28 20:39:31 crc restorecon[4681]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/service-binding-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 28 20:39:31 crc restorecon[4681]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/shipwright-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 28 20:39:31 crc restorecon[4681]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/shipwright-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 28 20:39:31 crc restorecon[4681]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sigstore-helm-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 28 20:39:31 crc restorecon[4681]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sigstore-helm-operator/catalog.json not reset as 
customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 28 20:39:31 crc restorecon[4681]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/silicom-sts-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 28 20:39:31 crc restorecon[4681]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/silicom-sts-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 28 20:39:31 crc restorecon[4681]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/skupper-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 28 20:39:31 crc restorecon[4681]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/skupper-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 28 20:39:31 crc restorecon[4681]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/snapscheduler not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 28 20:39:31 crc restorecon[4681]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/snapscheduler/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 28 20:39:31 crc restorecon[4681]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/snyk-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 28 20:39:31 crc restorecon[4681]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/snyk-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 28 20:39:31 crc restorecon[4681]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/socmmd not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 28 20:39:31 crc restorecon[4681]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/socmmd/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 28 20:39:31 crc restorecon[4681]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sonar-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 28 20:39:31 crc restorecon[4681]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sonar-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 28 20:39:31 crc restorecon[4681]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sosivio not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 28 20:39:31 crc restorecon[4681]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sosivio/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 28 20:39:31 crc restorecon[4681]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sonataflow-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 28 20:39:31 crc 
restorecon[4681]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sonataflow-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 28 20:39:31 crc restorecon[4681]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sosreport-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 28 20:39:31 crc restorecon[4681]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sosreport-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 28 20:39:31 crc restorecon[4681]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/spark-helm-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 28 20:39:31 crc restorecon[4681]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/spark-helm-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 28 20:39:31 crc restorecon[4681]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/special-resource-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 28 20:39:31 crc restorecon[4681]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/special-resource-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 28 20:39:31 crc restorecon[4681]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/stolostron not reset as 
customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 28 20:39:31 crc restorecon[4681]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/stolostron/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 28 20:39:31 crc restorecon[4681]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/stolostron-engine not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 28 20:39:31 crc restorecon[4681]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/stolostron-engine/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 28 20:39:31 crc restorecon[4681]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/strimzi-kafka-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 28 20:39:31 crc restorecon[4681]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/strimzi-kafka-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 28 20:39:31 crc restorecon[4681]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/syndesis not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 28 20:39:31 crc restorecon[4681]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/syndesis/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 28 20:39:31 crc restorecon[4681]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/t8c not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 28 20:39:31 crc restorecon[4681]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/t8c/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 28 20:39:31 crc restorecon[4681]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/tagger not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 28 20:39:31 crc restorecon[4681]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/tagger/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 28 20:39:31 crc restorecon[4681]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/tempo-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 28 20:39:31 crc restorecon[4681]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/tempo-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 28 20:39:31 crc restorecon[4681]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/tf-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 28 20:39:31 crc restorecon[4681]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/tf-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 28 20:39:31 crc 
restorecon[4681]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/tidb-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 28 20:39:31 crc restorecon[4681]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/tidb-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 28 20:39:31 crc restorecon[4681]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/trident-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 28 20:39:31 crc restorecon[4681]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/trident-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 28 20:39:31 crc restorecon[4681]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/trustify-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 28 20:39:31 crc restorecon[4681]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/trustify-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 28 20:39:31 crc restorecon[4681]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ucs-ci-solutions-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 28 20:39:31 crc restorecon[4681]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ucs-ci-solutions-operator/catalog.json not reset as customized by 
admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 28 20:39:31 crc restorecon[4681]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/universal-crossplane not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 28 20:39:31 crc restorecon[4681]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/universal-crossplane/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 28 20:39:31 crc restorecon[4681]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/varnish-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 28 20:39:31 crc restorecon[4681]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/varnish-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 28 20:39:31 crc restorecon[4681]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/vault-config-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 28 20:39:31 crc restorecon[4681]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/vault-config-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 28 20:39:31 crc restorecon[4681]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/verticadb-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 28 20:39:31 crc restorecon[4681]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/verticadb-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 28 20:39:31 crc restorecon[4681]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/volume-expander-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 28 20:39:31 crc restorecon[4681]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/volume-expander-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 28 20:39:31 crc restorecon[4681]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/wandb-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 28 20:39:31 crc restorecon[4681]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/wandb-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 28 20:39:31 crc restorecon[4681]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/windup-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 28 20:39:31 crc restorecon[4681]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/windup-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 28 20:39:31 crc restorecon[4681]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/yaks not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13 Jan 28 20:39:31 crc restorecon[4681]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/yaks/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 28 20:39:31 crc restorecon[4681]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/utilities not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 28 20:39:31 crc restorecon[4681]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/utilities/copy-content not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 28 20:39:31 crc restorecon[4681]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 28 20:39:31 crc restorecon[4681]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/containers/extract-utilities/c0fe7256 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 28 20:39:31 crc restorecon[4681]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/containers/extract-utilities/c30319e4 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 28 20:39:31 crc restorecon[4681]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/containers/extract-utilities/e6b1dd45 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 28 20:39:31 crc restorecon[4681]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/containers/extract-content/2bb643f0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 28 20:39:31 crc restorecon[4681]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/containers/extract-content/920de426 not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13 Jan 28 20:39:31 crc restorecon[4681]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/containers/extract-content/70fa1e87 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 28 20:39:31 crc restorecon[4681]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/containers/registry-server/a1c12a2f not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 28 20:39:31 crc restorecon[4681]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/containers/registry-server/9442e6c7 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 28 20:39:31 crc restorecon[4681]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/containers/registry-server/5b45ec72 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 28 20:39:31 crc restorecon[4681]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 28 20:39:31 crc restorecon[4681]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 28 20:39:31 crc restorecon[4681]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/abot-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 28 20:39:31 crc restorecon[4681]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/abot-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 28 20:39:31 crc restorecon[4681]: 
/var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aerospike-kubernetes-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 28 20:39:31 crc restorecon[4681]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aerospike-kubernetes-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 28 20:39:31 crc restorecon[4681]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aikit-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 28 20:39:31 crc restorecon[4681]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aikit-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 28 20:39:31 crc restorecon[4681]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/anzo-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 28 20:39:31 crc restorecon[4681]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/anzo-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 28 20:39:31 crc restorecon[4681]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/anzograph-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 28 20:39:31 crc restorecon[4681]: 
/var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/anzograph-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 28 20:39:31 crc restorecon[4681]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/anzounstructured-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 28 20:39:31 crc restorecon[4681]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/anzounstructured-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 28 20:39:31 crc restorecon[4681]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cloudbees-ci-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 28 20:39:31 crc restorecon[4681]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cloudbees-ci-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 28 20:39:31 crc restorecon[4681]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cockroachdb-certified-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 28 20:39:31 crc restorecon[4681]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cockroachdb-certified-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 28 20:39:31 crc restorecon[4681]: 
/var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/crunchy-postgres-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 28 20:39:31 crc restorecon[4681]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/crunchy-postgres-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 28 20:39:31 crc restorecon[4681]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/datadog-operator-certified-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 28 20:39:31 crc restorecon[4681]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/datadog-operator-certified-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 28 20:39:31 crc restorecon[4681]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dynatrace-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 28 20:39:31 crc restorecon[4681]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dynatrace-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 28 20:39:31 crc restorecon[4681]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/entando-k8s-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 28 20:39:31 crc restorecon[4681]: 
/var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/entando-k8s-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 28 20:39:31 crc restorecon[4681]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/flux not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 28 20:39:31 crc restorecon[4681]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/flux/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 28 20:39:31 crc restorecon[4681]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/instana-agent-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 28 20:39:31 crc restorecon[4681]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/instana-agent-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 28 20:39:31 crc restorecon[4681]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/iomesh-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 28 20:39:31 crc restorecon[4681]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/iomesh-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 28 20:39:31 crc restorecon[4681]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/joget-dx-operator-rhmp not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13 Jan 28 20:39:31 crc restorecon[4681]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/joget-dx-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 28 20:39:31 crc restorecon[4681]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/joget-dx8-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 28 20:39:31 crc restorecon[4681]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/joget-dx8-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 28 20:39:31 crc restorecon[4681]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/k10-kasten-operator-paygo-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 28 20:39:31 crc restorecon[4681]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/k10-kasten-operator-paygo-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 28 20:39:31 crc restorecon[4681]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/k10-kasten-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 28 20:39:31 crc restorecon[4681]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/k10-kasten-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 28 20:39:31 crc restorecon[4681]: 
/var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/k10-kasten-operator-term-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 28 20:39:31 crc restorecon[4681]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/k10-kasten-operator-term-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 28 20:39:31 crc restorecon[4681]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubemq-operator-marketplace-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 28 20:39:31 crc restorecon[4681]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubemq-operator-marketplace-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 28 20:39:31 crc restorecon[4681]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubeturbo-certified-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 28 20:39:31 crc restorecon[4681]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubeturbo-certified-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 28 20:39:31 crc restorecon[4681]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/linstor-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 28 20:39:31 crc restorecon[4681]: 
/var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/linstor-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 28 20:39:31 crc restorecon[4681]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/marketplace-games-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 28 20:39:31 crc restorecon[4681]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/marketplace-games-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 28 20:39:31 crc restorecon[4681]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/model-builder-for-vision-certified-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 28 20:39:31 crc restorecon[4681]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/model-builder-for-vision-certified-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 28 20:39:31 crc restorecon[4681]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/neuvector-certified-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 28 20:39:31 crc restorecon[4681]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/neuvector-certified-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 28 20:39:31 crc restorecon[4681]: 
/var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ovms-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 28 20:39:31 crc restorecon[4681]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ovms-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 28 20:39:31 crc restorecon[4681]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pachyderm-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 28 20:39:31 crc restorecon[4681]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pachyderm-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 28 20:39:31 crc restorecon[4681]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/redis-enterprise-operator-cert-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 28 20:39:31 crc restorecon[4681]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/redis-enterprise-operator-cert-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 28 20:39:31 crc restorecon[4681]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/seldon-deploy-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 28 20:39:31 crc restorecon[4681]: 
/var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/seldon-deploy-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 28 20:39:31 crc restorecon[4681]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/starburst-enterprise-helm-operator-paygo-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 28 20:39:31 crc restorecon[4681]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/starburst-enterprise-helm-operator-paygo-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 28 20:39:31 crc restorecon[4681]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/starburst-enterprise-helm-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 28 20:39:31 crc restorecon[4681]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/starburst-enterprise-helm-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 28 20:39:31 crc restorecon[4681]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/t8c-certified-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 28 20:39:31 crc restorecon[4681]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/t8c-certified-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 28 20:39:31 crc restorecon[4681]: 
/var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/timemachine-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 28 20:39:31 crc restorecon[4681]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/timemachine-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 28 20:39:31 crc restorecon[4681]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/vfunction-server-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 28 20:39:31 crc restorecon[4681]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/vfunction-server-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 28 20:39:31 crc restorecon[4681]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/xcrypt-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 28 20:39:31 crc restorecon[4681]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/xcrypt-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 28 20:39:31 crc restorecon[4681]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/yugabyte-platform-operator-bundle-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 28 20:39:31 crc restorecon[4681]: 
/var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/yugabyte-platform-operator-bundle-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 28 20:39:31 crc restorecon[4681]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/zabbix-operator-certified-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 28 20:39:31 crc restorecon[4681]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/zabbix-operator-certified-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 28 20:39:31 crc restorecon[4681]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/cache not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 28 20:39:31 crc restorecon[4681]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 28 20:39:31 crc restorecon[4681]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 28 20:39:31 crc restorecon[4681]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/00000-1.psg not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 28 20:39:31 crc restorecon[4681]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/00000-1.psg.pmt not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13 Jan 28 20:39:31 crc restorecon[4681]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/db.pmt not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 28 20:39:31 crc restorecon[4681]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/index.pmt not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 28 20:39:31 crc restorecon[4681]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/main.pix not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 28 20:39:31 crc restorecon[4681]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/overflow.pix not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 28 20:39:31 crc restorecon[4681]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/digest not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 28 20:39:31 crc restorecon[4681]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/utilities not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 28 20:39:31 crc restorecon[4681]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/utilities/copy-content not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 28 20:39:31 crc restorecon[4681]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 28 20:39:31 crc restorecon[4681]: 
/var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/containers/extract-utilities/3c9f3a59 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 28 20:39:31 crc restorecon[4681]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/containers/extract-utilities/1091c11b not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 28 20:39:31 crc restorecon[4681]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/containers/extract-utilities/9a6821c6 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 28 20:39:31 crc restorecon[4681]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/containers/extract-content/ec0c35e2 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 28 20:39:31 crc restorecon[4681]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/containers/extract-content/517f37e7 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 28 20:39:31 crc restorecon[4681]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/containers/extract-content/6214fe78 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 28 20:39:31 crc restorecon[4681]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/containers/registry-server/ba189c8b not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 28 20:39:31 crc restorecon[4681]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/containers/registry-server/351e4f31 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 28 20:39:31 crc restorecon[4681]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/containers/registry-server/c0f219ff not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 28 20:39:31 crc restorecon[4681]: /var/lib/kubelet/pods/3dcd261975c3d6b9a6ad6367fd4facd3/etc-hosts 
not reset as customized by admin to system_u:object_r:container_file_t:s0:c247,c522 Jan 28 20:39:31 crc restorecon[4681]: /var/lib/kubelet/pods/3dcd261975c3d6b9a6ad6367fd4facd3/containers/wait-for-host-port/8069f607 not reset as customized by admin to system_u:object_r:container_file_t:s0:c378,c723 Jan 28 20:39:31 crc restorecon[4681]: /var/lib/kubelet/pods/3dcd261975c3d6b9a6ad6367fd4facd3/containers/wait-for-host-port/559c3d82 not reset as customized by admin to system_u:object_r:container_file_t:s0:c133,c223 Jan 28 20:39:31 crc restorecon[4681]: /var/lib/kubelet/pods/3dcd261975c3d6b9a6ad6367fd4facd3/containers/wait-for-host-port/605ad488 not reset as customized by admin to system_u:object_r:container_file_t:s0:c247,c522 Jan 28 20:39:31 crc restorecon[4681]: /var/lib/kubelet/pods/3dcd261975c3d6b9a6ad6367fd4facd3/containers/kube-scheduler/148df488 not reset as customized by admin to system_u:object_r:container_file_t:s0:c378,c723 Jan 28 20:39:31 crc restorecon[4681]: /var/lib/kubelet/pods/3dcd261975c3d6b9a6ad6367fd4facd3/containers/kube-scheduler/3bf6dcb4 not reset as customized by admin to system_u:object_r:container_file_t:s0:c133,c223 Jan 28 20:39:31 crc restorecon[4681]: /var/lib/kubelet/pods/3dcd261975c3d6b9a6ad6367fd4facd3/containers/kube-scheduler/022a2feb not reset as customized by admin to system_u:object_r:container_file_t:s0:c247,c522 Jan 28 20:39:31 crc restorecon[4681]: /var/lib/kubelet/pods/3dcd261975c3d6b9a6ad6367fd4facd3/containers/kube-scheduler-cert-syncer/938c3924 not reset as customized by admin to system_u:object_r:container_file_t:s0:c378,c723 Jan 28 20:39:31 crc restorecon[4681]: /var/lib/kubelet/pods/3dcd261975c3d6b9a6ad6367fd4facd3/containers/kube-scheduler-cert-syncer/729fe23e not reset as customized by admin to system_u:object_r:container_file_t:s0:c133,c223 Jan 28 20:39:31 crc restorecon[4681]: /var/lib/kubelet/pods/3dcd261975c3d6b9a6ad6367fd4facd3/containers/kube-scheduler-cert-syncer/1fd5cbd4 not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c247,c522 Jan 28 20:39:31 crc restorecon[4681]: /var/lib/kubelet/pods/3dcd261975c3d6b9a6ad6367fd4facd3/containers/kube-scheduler-recovery-controller/a96697e1 not reset as customized by admin to system_u:object_r:container_file_t:s0:c378,c723 Jan 28 20:39:31 crc restorecon[4681]: /var/lib/kubelet/pods/3dcd261975c3d6b9a6ad6367fd4facd3/containers/kube-scheduler-recovery-controller/e155ddca not reset as customized by admin to system_u:object_r:container_file_t:s0:c133,c223 Jan 28 20:39:31 crc restorecon[4681]: /var/lib/kubelet/pods/3dcd261975c3d6b9a6ad6367fd4facd3/containers/kube-scheduler-recovery-controller/10dd0e0f not reset as customized by admin to system_u:object_r:container_file_t:s0:c247,c522 Jan 28 20:39:31 crc restorecon[4681]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/v4-0-config-system-trusted-ca-bundle not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Jan 28 20:39:31 crc restorecon[4681]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/v4-0-config-system-trusted-ca-bundle/..2025_02_24_06_09_35.3018472960 not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Jan 28 20:39:31 crc restorecon[4681]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/v4-0-config-system-trusted-ca-bundle/..2025_02_24_06_09_35.3018472960/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Jan 28 20:39:31 crc restorecon[4681]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/v4-0-config-system-trusted-ca-bundle/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Jan 28 20:39:31 crc restorecon[4681]: 
/var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/v4-0-config-system-trusted-ca-bundle/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Jan 28 20:39:31 crc restorecon[4681]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/audit-policies not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Jan 28 20:39:31 crc restorecon[4681]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/audit-policies/..2025_02_24_06_09_35.4262376737 not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Jan 28 20:39:31 crc restorecon[4681]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/audit-policies/..2025_02_24_06_09_35.4262376737/audit.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Jan 28 20:39:31 crc restorecon[4681]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/audit-policies/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Jan 28 20:39:31 crc restorecon[4681]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/audit-policies/audit.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Jan 28 20:39:31 crc restorecon[4681]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/v4-0-config-system-cliconfig not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Jan 28 20:39:31 crc restorecon[4681]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/v4-0-config-system-cliconfig/..2025_02_24_06_09_35.2630275752 not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Jan 28 20:39:31 crc 
restorecon[4681]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/v4-0-config-system-cliconfig/..2025_02_24_06_09_35.2630275752/v4-0-config-system-cliconfig not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Jan 28 20:39:31 crc restorecon[4681]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/v4-0-config-system-cliconfig/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Jan 28 20:39:31 crc restorecon[4681]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/v4-0-config-system-cliconfig/v4-0-config-system-cliconfig not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Jan 28 20:39:31 crc restorecon[4681]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/v4-0-config-system-service-ca not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Jan 28 20:39:31 crc restorecon[4681]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/v4-0-config-system-service-ca/..2025_02_24_06_09_35.2376963788 not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Jan 28 20:39:31 crc restorecon[4681]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/v4-0-config-system-service-ca/..2025_02_24_06_09_35.2376963788/service-ca.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Jan 28 20:39:31 crc restorecon[4681]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/v4-0-config-system-service-ca/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Jan 28 20:39:31 crc restorecon[4681]: 
/var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/v4-0-config-system-service-ca/service-ca.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Jan 28 20:39:31 crc restorecon[4681]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Jan 28 20:39:31 crc restorecon[4681]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/containers/oauth-openshift/6f2c8392 not reset as customized by admin to system_u:object_r:container_file_t:s0:c267,c588 Jan 28 20:39:31 crc restorecon[4681]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/containers/oauth-openshift/bd241ad9 not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Jan 28 20:39:31 crc restorecon[4681]: /var/lib/kubelet/plugins not reset as customized by admin to system_u:object_r:container_file_t:s0 Jan 28 20:39:31 crc restorecon[4681]: /var/lib/kubelet/plugins/csi-hostpath not reset as customized by admin to system_u:object_r:container_file_t:s0 Jan 28 20:39:31 crc restorecon[4681]: /var/lib/kubelet/plugins/csi-hostpath/csi.sock not reset as customized by admin to system_u:object_r:container_file_t:s0 Jan 28 20:39:31 crc restorecon[4681]: /var/lib/kubelet/plugins/kubernetes.io not reset as customized by admin to system_u:object_r:container_file_t:s0 Jan 28 20:39:31 crc restorecon[4681]: /var/lib/kubelet/plugins/kubernetes.io/csi not reset as customized by admin to system_u:object_r:container_file_t:s0 Jan 28 20:39:31 crc restorecon[4681]: /var/lib/kubelet/plugins/kubernetes.io/csi/kubevirt.io.hostpath-provisioner not reset as customized by admin to system_u:object_r:container_file_t:s0 Jan 28 20:39:31 crc restorecon[4681]: /var/lib/kubelet/plugins/kubernetes.io/csi/kubevirt.io.hostpath-provisioner/1f4776af88835e41c12b831b4c9fed40233456d14189815a54dbe7f892fc1983 not reset as customized by admin to 
system_u:object_r:container_file_t:s0 Jan 28 20:39:31 crc restorecon[4681]: /var/lib/kubelet/plugins/kubernetes.io/csi/kubevirt.io.hostpath-provisioner/1f4776af88835e41c12b831b4c9fed40233456d14189815a54dbe7f892fc1983/globalmount not reset as customized by admin to system_u:object_r:container_file_t:s0 Jan 28 20:39:31 crc restorecon[4681]: /var/lib/kubelet/plugins/kubernetes.io/csi/kubevirt.io.hostpath-provisioner/1f4776af88835e41c12b831b4c9fed40233456d14189815a54dbe7f892fc1983/vol_data.json not reset as customized by admin to system_u:object_r:container_file_t:s0 Jan 28 20:39:31 crc restorecon[4681]: /var/lib/kubelet/plugins_registry not reset as customized by admin to system_u:object_r:container_file_t:s0 Jan 28 20:39:31 crc restorecon[4681]: Relabeled /var/usrlocal/bin/kubenswrapper from system_u:object_r:bin_t:s0 to system_u:object_r:kubelet_exec_t:s0 Jan 28 20:39:32 crc kubenswrapper[4746]: Flag --container-runtime-endpoint has been deprecated, This parameter should be set via the config file specified by the Kubelet's --config flag. See https://kubernetes.io/docs/tasks/administer-cluster/kubelet-config-file/ for more information. Jan 28 20:39:32 crc kubenswrapper[4746]: Flag --minimum-container-ttl-duration has been deprecated, Use --eviction-hard or --eviction-soft instead. Will be removed in a future version. Jan 28 20:39:32 crc kubenswrapper[4746]: Flag --volume-plugin-dir has been deprecated, This parameter should be set via the config file specified by the Kubelet's --config flag. See https://kubernetes.io/docs/tasks/administer-cluster/kubelet-config-file/ for more information. Jan 28 20:39:32 crc kubenswrapper[4746]: Flag --register-with-taints has been deprecated, This parameter should be set via the config file specified by the Kubelet's --config flag. See https://kubernetes.io/docs/tasks/administer-cluster/kubelet-config-file/ for more information. 
Jan 28 20:39:32 crc kubenswrapper[4746]: Flag --pod-infra-container-image has been deprecated, will be removed in a future release. Image garbage collector will get sandbox image information from CRI. Jan 28 20:39:32 crc kubenswrapper[4746]: Flag --system-reserved has been deprecated, This parameter should be set via the config file specified by the Kubelet's --config flag. See https://kubernetes.io/docs/tasks/administer-cluster/kubelet-config-file/ for more information. Jan 28 20:39:32 crc kubenswrapper[4746]: I0128 20:39:32.586717 4746 server.go:211] "--pod-infra-container-image will not be pruned by the image garbage collector in kubelet and should also be set in the remote runtime" Jan 28 20:39:32 crc kubenswrapper[4746]: W0128 20:39:32.593955 4746 feature_gate.go:330] unrecognized feature gate: ExternalOIDC Jan 28 20:39:32 crc kubenswrapper[4746]: W0128 20:39:32.593990 4746 feature_gate.go:330] unrecognized feature gate: MultiArchInstallAzure Jan 28 20:39:32 crc kubenswrapper[4746]: W0128 20:39:32.594001 4746 feature_gate.go:330] unrecognized feature gate: ClusterMonitoringConfig Jan 28 20:39:32 crc kubenswrapper[4746]: W0128 20:39:32.594011 4746 feature_gate.go:330] unrecognized feature gate: InsightsOnDemandDataGather Jan 28 20:39:32 crc kubenswrapper[4746]: W0128 20:39:32.594020 4746 feature_gate.go:330] unrecognized feature gate: NewOLM Jan 28 20:39:32 crc kubenswrapper[4746]: W0128 20:39:32.594029 4746 feature_gate.go:330] unrecognized feature gate: MachineConfigNodes Jan 28 20:39:32 crc kubenswrapper[4746]: W0128 20:39:32.594039 4746 feature_gate.go:330] unrecognized feature gate: VSphereDriverConfiguration Jan 28 20:39:32 crc kubenswrapper[4746]: W0128 20:39:32.594051 4746 feature_gate.go:330] unrecognized feature gate: MetricsCollectionProfiles Jan 28 20:39:32 crc kubenswrapper[4746]: W0128 20:39:32.594063 4746 feature_gate.go:330] unrecognized feature gate: UpgradeStatus Jan 28 20:39:32 crc kubenswrapper[4746]: W0128 20:39:32.594073 4746 
feature_gate.go:330] unrecognized feature gate: DNSNameResolver Jan 28 20:39:32 crc kubenswrapper[4746]: W0128 20:39:32.594113 4746 feature_gate.go:330] unrecognized feature gate: AutomatedEtcdBackup Jan 28 20:39:32 crc kubenswrapper[4746]: W0128 20:39:32.594159 4746 feature_gate.go:330] unrecognized feature gate: NetworkLiveMigration Jan 28 20:39:32 crc kubenswrapper[4746]: W0128 20:39:32.594169 4746 feature_gate.go:330] unrecognized feature gate: ManagedBootImagesAWS Jan 28 20:39:32 crc kubenswrapper[4746]: W0128 20:39:32.594181 4746 feature_gate.go:330] unrecognized feature gate: MachineAPIMigration Jan 28 20:39:32 crc kubenswrapper[4746]: W0128 20:39:32.594192 4746 feature_gate.go:330] unrecognized feature gate: PrivateHostedZoneAWS Jan 28 20:39:32 crc kubenswrapper[4746]: W0128 20:39:32.594200 4746 feature_gate.go:330] unrecognized feature gate: Example Jan 28 20:39:32 crc kubenswrapper[4746]: W0128 20:39:32.594209 4746 feature_gate.go:330] unrecognized feature gate: PersistentIPsForVirtualization Jan 28 20:39:32 crc kubenswrapper[4746]: W0128 20:39:32.594217 4746 feature_gate.go:330] unrecognized feature gate: AWSClusterHostedDNS Jan 28 20:39:32 crc kubenswrapper[4746]: W0128 20:39:32.594227 4746 feature_gate.go:330] unrecognized feature gate: SignatureStores Jan 28 20:39:32 crc kubenswrapper[4746]: W0128 20:39:32.594235 4746 feature_gate.go:330] unrecognized feature gate: PlatformOperators Jan 28 20:39:32 crc kubenswrapper[4746]: W0128 20:39:32.594243 4746 feature_gate.go:330] unrecognized feature gate: SetEIPForNLBIngressController Jan 28 20:39:32 crc kubenswrapper[4746]: W0128 20:39:32.594251 4746 feature_gate.go:330] unrecognized feature gate: VSphereMultiNetworks Jan 28 20:39:32 crc kubenswrapper[4746]: W0128 20:39:32.594260 4746 feature_gate.go:330] unrecognized feature gate: InsightsConfigAPI Jan 28 20:39:32 crc kubenswrapper[4746]: W0128 20:39:32.594268 4746 feature_gate.go:330] unrecognized feature gate: EtcdBackendQuota Jan 28 20:39:32 crc 
kubenswrapper[4746]: W0128 20:39:32.594277 4746 feature_gate.go:330] unrecognized feature gate: OnClusterBuild Jan 28 20:39:32 crc kubenswrapper[4746]: W0128 20:39:32.594285 4746 feature_gate.go:330] unrecognized feature gate: VolumeGroupSnapshot Jan 28 20:39:32 crc kubenswrapper[4746]: W0128 20:39:32.594293 4746 feature_gate.go:330] unrecognized feature gate: PinnedImages Jan 28 20:39:32 crc kubenswrapper[4746]: W0128 20:39:32.594301 4746 feature_gate.go:330] unrecognized feature gate: OVNObservability Jan 28 20:39:32 crc kubenswrapper[4746]: W0128 20:39:32.594310 4746 feature_gate.go:330] unrecognized feature gate: AlibabaPlatform Jan 28 20:39:32 crc kubenswrapper[4746]: W0128 20:39:32.594318 4746 feature_gate.go:330] unrecognized feature gate: NetworkSegmentation Jan 28 20:39:32 crc kubenswrapper[4746]: W0128 20:39:32.594327 4746 feature_gate.go:330] unrecognized feature gate: NetworkDiagnosticsConfig Jan 28 20:39:32 crc kubenswrapper[4746]: W0128 20:39:32.594335 4746 feature_gate.go:330] unrecognized feature gate: BareMetalLoadBalancer Jan 28 20:39:32 crc kubenswrapper[4746]: W0128 20:39:32.594344 4746 feature_gate.go:330] unrecognized feature gate: MixedCPUsAllocation Jan 28 20:39:32 crc kubenswrapper[4746]: W0128 20:39:32.594354 4746 feature_gate.go:330] unrecognized feature gate: HardwareSpeed Jan 28 20:39:32 crc kubenswrapper[4746]: W0128 20:39:32.594363 4746 feature_gate.go:330] unrecognized feature gate: IngressControllerDynamicConfigurationManager Jan 28 20:39:32 crc kubenswrapper[4746]: W0128 20:39:32.594373 4746 feature_gate.go:330] unrecognized feature gate: GCPClusterHostedDNS Jan 28 20:39:32 crc kubenswrapper[4746]: W0128 20:39:32.594381 4746 feature_gate.go:330] unrecognized feature gate: ChunkSizeMiB Jan 28 20:39:32 crc kubenswrapper[4746]: W0128 20:39:32.594389 4746 feature_gate.go:330] unrecognized feature gate: NodeDisruptionPolicy Jan 28 20:39:32 crc kubenswrapper[4746]: W0128 20:39:32.594398 4746 feature_gate.go:330] unrecognized feature 
gate: BuildCSIVolumes Jan 28 20:39:32 crc kubenswrapper[4746]: W0128 20:39:32.594408 4746 feature_gate.go:330] unrecognized feature gate: MultiArchInstallAWS Jan 28 20:39:32 crc kubenswrapper[4746]: W0128 20:39:32.594416 4746 feature_gate.go:330] unrecognized feature gate: BootcNodeManagement Jan 28 20:39:32 crc kubenswrapper[4746]: W0128 20:39:32.594424 4746 feature_gate.go:330] unrecognized feature gate: InsightsConfig Jan 28 20:39:32 crc kubenswrapper[4746]: W0128 20:39:32.594432 4746 feature_gate.go:330] unrecognized feature gate: MultiArchInstallGCP Jan 28 20:39:32 crc kubenswrapper[4746]: W0128 20:39:32.594441 4746 feature_gate.go:330] unrecognized feature gate: VSphereControlPlaneMachineSet Jan 28 20:39:32 crc kubenswrapper[4746]: W0128 20:39:32.594450 4746 feature_gate.go:330] unrecognized feature gate: MinimumKubeletVersion Jan 28 20:39:32 crc kubenswrapper[4746]: W0128 20:39:32.594458 4746 feature_gate.go:330] unrecognized feature gate: ManagedBootImages Jan 28 20:39:32 crc kubenswrapper[4746]: W0128 20:39:32.594466 4746 feature_gate.go:330] unrecognized feature gate: VSphereStaticIPs Jan 28 20:39:32 crc kubenswrapper[4746]: W0128 20:39:32.594474 4746 feature_gate.go:330] unrecognized feature gate: ClusterAPIInstallIBMCloud Jan 28 20:39:32 crc kubenswrapper[4746]: W0128 20:39:32.594486 4746 feature_gate.go:330] unrecognized feature gate: CSIDriverSharedResource Jan 28 20:39:32 crc kubenswrapper[4746]: W0128 20:39:32.594496 4746 feature_gate.go:330] unrecognized feature gate: OpenShiftPodSecurityAdmission Jan 28 20:39:32 crc kubenswrapper[4746]: W0128 20:39:32.594505 4746 feature_gate.go:330] unrecognized feature gate: MachineAPIOperatorDisableMachineHealthCheckController Jan 28 20:39:32 crc kubenswrapper[4746]: W0128 20:39:32.594515 4746 feature_gate.go:330] unrecognized feature gate: ImageStreamImportMode Jan 28 20:39:32 crc kubenswrapper[4746]: W0128 20:39:32.594523 4746 feature_gate.go:330] unrecognized feature gate: AWSEFSDriverVolumeMetrics Jan 28 
20:39:32 crc kubenswrapper[4746]: W0128 20:39:32.594533 4746 feature_gate.go:330] unrecognized feature gate: ConsolePluginContentSecurityPolicy Jan 28 20:39:32 crc kubenswrapper[4746]: W0128 20:39:32.594545 4746 feature_gate.go:353] Setting GA feature gate CloudDualStackNodeIPs=true. It will be removed in a future release. Jan 28 20:39:32 crc kubenswrapper[4746]: W0128 20:39:32.594557 4746 feature_gate.go:330] unrecognized feature gate: VSphereMultiVCenters Jan 28 20:39:32 crc kubenswrapper[4746]: W0128 20:39:32.594568 4746 feature_gate.go:330] unrecognized feature gate: AzureWorkloadIdentity Jan 28 20:39:32 crc kubenswrapper[4746]: W0128 20:39:32.594578 4746 feature_gate.go:330] unrecognized feature gate: GCPLabelsTags Jan 28 20:39:32 crc kubenswrapper[4746]: W0128 20:39:32.594588 4746 feature_gate.go:330] unrecognized feature gate: ClusterAPIInstall Jan 28 20:39:32 crc kubenswrapper[4746]: W0128 20:39:32.594596 4746 feature_gate.go:330] unrecognized feature gate: AdditionalRoutingCapabilities Jan 28 20:39:32 crc kubenswrapper[4746]: W0128 20:39:32.594605 4746 feature_gate.go:330] unrecognized feature gate: InsightsRuntimeExtractor Jan 28 20:39:32 crc kubenswrapper[4746]: W0128 20:39:32.594616 4746 feature_gate.go:351] Setting deprecated feature gate KMSv1=true. It will be removed in a future release. 
Jan 28 20:39:32 crc kubenswrapper[4746]: W0128 20:39:32.594626 4746 feature_gate.go:330] unrecognized feature gate: SigstoreImageVerification Jan 28 20:39:32 crc kubenswrapper[4746]: W0128 20:39:32.594634 4746 feature_gate.go:330] unrecognized feature gate: IngressControllerLBSubnetsAWS Jan 28 20:39:32 crc kubenswrapper[4746]: W0128 20:39:32.594642 4746 feature_gate.go:330] unrecognized feature gate: GatewayAPI Jan 28 20:39:32 crc kubenswrapper[4746]: W0128 20:39:32.594651 4746 feature_gate.go:330] unrecognized feature gate: NutanixMultiSubnets Jan 28 20:39:32 crc kubenswrapper[4746]: W0128 20:39:32.594659 4746 feature_gate.go:330] unrecognized feature gate: AdminNetworkPolicy Jan 28 20:39:32 crc kubenswrapper[4746]: W0128 20:39:32.594672 4746 feature_gate.go:353] Setting GA feature gate DisableKubeletCloudCredentialProviders=true. It will be removed in a future release. Jan 28 20:39:32 crc kubenswrapper[4746]: W0128 20:39:32.594684 4746 feature_gate.go:330] unrecognized feature gate: MachineAPIProviderOpenStack Jan 28 20:39:32 crc kubenswrapper[4746]: W0128 20:39:32.594693 4746 feature_gate.go:330] unrecognized feature gate: RouteAdvertisements Jan 28 20:39:32 crc kubenswrapper[4746]: W0128 20:39:32.594705 4746 feature_gate.go:353] Setting GA feature gate ValidatingAdmissionPolicy=true. It will be removed in a future release. 
Jan 28 20:39:32 crc kubenswrapper[4746]: I0128 20:39:32.594877 4746 flags.go:64] FLAG: --address="0.0.0.0"
Jan 28 20:39:32 crc kubenswrapper[4746]: I0128 20:39:32.594896 4746 flags.go:64] FLAG: --allowed-unsafe-sysctls="[]"
Jan 28 20:39:32 crc kubenswrapper[4746]: I0128 20:39:32.594911 4746 flags.go:64] FLAG: --anonymous-auth="true"
Jan 28 20:39:32 crc kubenswrapper[4746]: I0128 20:39:32.594923 4746 flags.go:64] FLAG: --application-metrics-count-limit="100"
Jan 28 20:39:32 crc kubenswrapper[4746]: I0128 20:39:32.594935 4746 flags.go:64] FLAG: --authentication-token-webhook="false"
Jan 28 20:39:32 crc kubenswrapper[4746]: I0128 20:39:32.594946 4746 flags.go:64] FLAG: --authentication-token-webhook-cache-ttl="2m0s"
Jan 28 20:39:32 crc kubenswrapper[4746]: I0128 20:39:32.594958 4746 flags.go:64] FLAG: --authorization-mode="AlwaysAllow"
Jan 28 20:39:32 crc kubenswrapper[4746]: I0128 20:39:32.594970 4746 flags.go:64] FLAG: --authorization-webhook-cache-authorized-ttl="5m0s"
Jan 28 20:39:32 crc kubenswrapper[4746]: I0128 20:39:32.594980 4746 flags.go:64] FLAG: --authorization-webhook-cache-unauthorized-ttl="30s"
Jan 28 20:39:32 crc kubenswrapper[4746]: I0128 20:39:32.594990 4746 flags.go:64] FLAG: --boot-id-file="/proc/sys/kernel/random/boot_id"
Jan 28 20:39:32 crc kubenswrapper[4746]: I0128 20:39:32.595001 4746 flags.go:64] FLAG: --bootstrap-kubeconfig="/etc/kubernetes/kubeconfig"
Jan 28 20:39:32 crc kubenswrapper[4746]: I0128 20:39:32.595011 4746 flags.go:64] FLAG: --cert-dir="/var/lib/kubelet/pki"
Jan 28 20:39:32 crc kubenswrapper[4746]: I0128 20:39:32.595021 4746 flags.go:64] FLAG: --cgroup-driver="cgroupfs"
Jan 28 20:39:32 crc kubenswrapper[4746]: I0128 20:39:32.595031 4746 flags.go:64] FLAG: --cgroup-root=""
Jan 28 20:39:32 crc kubenswrapper[4746]: I0128 20:39:32.595042 4746 flags.go:64] FLAG: --cgroups-per-qos="true"
Jan 28 20:39:32 crc kubenswrapper[4746]: I0128 20:39:32.595052 4746 flags.go:64] FLAG: --client-ca-file=""
Jan 28 20:39:32 crc kubenswrapper[4746]: I0128 20:39:32.595061 4746 flags.go:64] FLAG: --cloud-config=""
Jan 28 20:39:32 crc kubenswrapper[4746]: I0128 20:39:32.595071 4746 flags.go:64] FLAG: --cloud-provider=""
Jan 28 20:39:32 crc kubenswrapper[4746]: I0128 20:39:32.595131 4746 flags.go:64] FLAG: --cluster-dns="[]"
Jan 28 20:39:32 crc kubenswrapper[4746]: I0128 20:39:32.595154 4746 flags.go:64] FLAG: --cluster-domain=""
Jan 28 20:39:32 crc kubenswrapper[4746]: I0128 20:39:32.595163 4746 flags.go:64] FLAG: --config="/etc/kubernetes/kubelet.conf"
Jan 28 20:39:32 crc kubenswrapper[4746]: I0128 20:39:32.595174 4746 flags.go:64] FLAG: --config-dir=""
Jan 28 20:39:32 crc kubenswrapper[4746]: I0128 20:39:32.595183 4746 flags.go:64] FLAG: --container-hints="/etc/cadvisor/container_hints.json"
Jan 28 20:39:32 crc kubenswrapper[4746]: I0128 20:39:32.595194 4746 flags.go:64] FLAG: --container-log-max-files="5"
Jan 28 20:39:32 crc kubenswrapper[4746]: I0128 20:39:32.595208 4746 flags.go:64] FLAG: --container-log-max-size="10Mi"
Jan 28 20:39:32 crc kubenswrapper[4746]: I0128 20:39:32.595221 4746 flags.go:64] FLAG: --container-runtime-endpoint="/var/run/crio/crio.sock"
Jan 28 20:39:32 crc kubenswrapper[4746]: I0128 20:39:32.595246 4746 flags.go:64] FLAG: --containerd="/run/containerd/containerd.sock"
Jan 28 20:39:32 crc kubenswrapper[4746]: I0128 20:39:32.595266 4746 flags.go:64] FLAG: --containerd-namespace="k8s.io"
Jan 28 20:39:32 crc kubenswrapper[4746]: I0128 20:39:32.595283 4746 flags.go:64] FLAG: --contention-profiling="false"
Jan 28 20:39:32 crc kubenswrapper[4746]: I0128 20:39:32.595295 4746 flags.go:64] FLAG: --cpu-cfs-quota="true"
Jan 28 20:39:32 crc kubenswrapper[4746]: I0128 20:39:32.595309 4746 flags.go:64] FLAG: --cpu-cfs-quota-period="100ms"
Jan 28 20:39:32 crc kubenswrapper[4746]: I0128 20:39:32.595322 4746 flags.go:64] FLAG: --cpu-manager-policy="none"
Jan 28 20:39:32 crc kubenswrapper[4746]: I0128 20:39:32.595335 4746 flags.go:64] FLAG: --cpu-manager-policy-options=""
Jan 28 20:39:32 crc kubenswrapper[4746]: I0128 20:39:32.595351 4746 flags.go:64] FLAG: --cpu-manager-reconcile-period="10s"
Jan 28 20:39:32 crc kubenswrapper[4746]: I0128 20:39:32.595362 4746 flags.go:64] FLAG: --enable-controller-attach-detach="true"
Jan 28 20:39:32 crc kubenswrapper[4746]: I0128 20:39:32.595371 4746 flags.go:64] FLAG: --enable-debugging-handlers="true"
Jan 28 20:39:32 crc kubenswrapper[4746]: I0128 20:39:32.595382 4746 flags.go:64] FLAG: --enable-load-reader="false"
Jan 28 20:39:32 crc kubenswrapper[4746]: I0128 20:39:32.595392 4746 flags.go:64] FLAG: --enable-server="true"
Jan 28 20:39:32 crc kubenswrapper[4746]: I0128 20:39:32.595403 4746 flags.go:64] FLAG: --enforce-node-allocatable="[pods]"
Jan 28 20:39:32 crc kubenswrapper[4746]: I0128 20:39:32.595428 4746 flags.go:64] FLAG: --event-burst="100"
Jan 28 20:39:32 crc kubenswrapper[4746]: I0128 20:39:32.595438 4746 flags.go:64] FLAG: --event-qps="50"
Jan 28 20:39:32 crc kubenswrapper[4746]: I0128 20:39:32.595448 4746 flags.go:64] FLAG: --event-storage-age-limit="default=0"
Jan 28 20:39:32 crc kubenswrapper[4746]: I0128 20:39:32.595459 4746 flags.go:64] FLAG: --event-storage-event-limit="default=0"
Jan 28 20:39:32 crc kubenswrapper[4746]: I0128 20:39:32.595469 4746 flags.go:64] FLAG: --eviction-hard=""
Jan 28 20:39:32 crc kubenswrapper[4746]: I0128 20:39:32.595481 4746 flags.go:64] FLAG: --eviction-max-pod-grace-period="0"
Jan 28 20:39:32 crc kubenswrapper[4746]: I0128 20:39:32.595491 4746 flags.go:64] FLAG: --eviction-minimum-reclaim=""
Jan 28 20:39:32 crc kubenswrapper[4746]: I0128 20:39:32.595501 4746 flags.go:64] FLAG: --eviction-pressure-transition-period="5m0s"
Jan 28 20:39:32 crc kubenswrapper[4746]: I0128 20:39:32.595511 4746 flags.go:64] FLAG: --eviction-soft=""
Jan 28 20:39:32 crc kubenswrapper[4746]: I0128 20:39:32.595521 4746 flags.go:64] FLAG: --eviction-soft-grace-period=""
Jan 28 20:39:32 crc kubenswrapper[4746]: I0128 20:39:32.595531 4746 flags.go:64] FLAG: --exit-on-lock-contention="false"
Jan 28 20:39:32 crc kubenswrapper[4746]: I0128 20:39:32.595555 4746 flags.go:64] FLAG: --experimental-allocatable-ignore-eviction="false"
Jan 28 20:39:32 crc kubenswrapper[4746]: I0128 20:39:32.595565 4746 flags.go:64] FLAG: --experimental-mounter-path=""
Jan 28 20:39:32 crc kubenswrapper[4746]: I0128 20:39:32.595575 4746 flags.go:64] FLAG: --fail-cgroupv1="false"
Jan 28 20:39:32 crc kubenswrapper[4746]: I0128 20:39:32.595585 4746 flags.go:64] FLAG: --fail-swap-on="true"
Jan 28 20:39:32 crc kubenswrapper[4746]: I0128 20:39:32.595594 4746 flags.go:64] FLAG: --feature-gates=""
Jan 28 20:39:32 crc kubenswrapper[4746]: I0128 20:39:32.595606 4746 flags.go:64] FLAG: --file-check-frequency="20s"
Jan 28 20:39:32 crc kubenswrapper[4746]: I0128 20:39:32.595616 4746 flags.go:64] FLAG: --global-housekeeping-interval="1m0s"
Jan 28 20:39:32 crc kubenswrapper[4746]: I0128 20:39:32.595625 4746 flags.go:64] FLAG: --hairpin-mode="promiscuous-bridge"
Jan 28 20:39:32 crc kubenswrapper[4746]: I0128 20:39:32.595636 4746 flags.go:64] FLAG: --healthz-bind-address="127.0.0.1"
Jan 28 20:39:32 crc kubenswrapper[4746]: I0128 20:39:32.595647 4746 flags.go:64] FLAG: --healthz-port="10248"
Jan 28 20:39:32 crc kubenswrapper[4746]: I0128 20:39:32.595658 4746 flags.go:64] FLAG: --help="false"
Jan 28 20:39:32 crc kubenswrapper[4746]: I0128 20:39:32.595667 4746 flags.go:64] FLAG: --hostname-override=""
Jan 28 20:39:32 crc kubenswrapper[4746]: I0128 20:39:32.595677 4746 flags.go:64] FLAG: --housekeeping-interval="10s"
Jan 28 20:39:32 crc kubenswrapper[4746]: I0128 20:39:32.595687 4746 flags.go:64] FLAG: --http-check-frequency="20s"
Jan 28 20:39:32 crc kubenswrapper[4746]: I0128 20:39:32.595697 4746 flags.go:64] FLAG: --image-credential-provider-bin-dir=""
Jan 28 20:39:32 crc kubenswrapper[4746]: I0128 20:39:32.595706 4746 flags.go:64] FLAG: --image-credential-provider-config=""
Jan 28 20:39:32 crc kubenswrapper[4746]: I0128 20:39:32.595715 4746 flags.go:64] FLAG: --image-gc-high-threshold="85"
Jan 28 20:39:32 crc kubenswrapper[4746]: I0128 20:39:32.595726 4746 flags.go:64] FLAG: --image-gc-low-threshold="80"
Jan 28 20:39:32 crc kubenswrapper[4746]: I0128 20:39:32.595736 4746 flags.go:64] FLAG: --image-service-endpoint=""
Jan 28 20:39:32 crc kubenswrapper[4746]: I0128 20:39:32.595746 4746 flags.go:64] FLAG: --kernel-memcg-notification="false"
Jan 28 20:39:32 crc kubenswrapper[4746]: I0128 20:39:32.595757 4746 flags.go:64] FLAG: --kube-api-burst="100"
Jan 28 20:39:32 crc kubenswrapper[4746]: I0128 20:39:32.595767 4746 flags.go:64] FLAG: --kube-api-content-type="application/vnd.kubernetes.protobuf"
Jan 28 20:39:32 crc kubenswrapper[4746]: I0128 20:39:32.595777 4746 flags.go:64] FLAG: --kube-api-qps="50"
Jan 28 20:39:32 crc kubenswrapper[4746]: I0128 20:39:32.595787 4746 flags.go:64] FLAG: --kube-reserved=""
Jan 28 20:39:32 crc kubenswrapper[4746]: I0128 20:39:32.595797 4746 flags.go:64] FLAG: --kube-reserved-cgroup=""
Jan 28 20:39:32 crc kubenswrapper[4746]: I0128 20:39:32.595806 4746 flags.go:64] FLAG: --kubeconfig="/var/lib/kubelet/kubeconfig"
Jan 28 20:39:32 crc kubenswrapper[4746]: I0128 20:39:32.595816 4746 flags.go:64] FLAG: --kubelet-cgroups=""
Jan 28 20:39:32 crc kubenswrapper[4746]: I0128 20:39:32.595825 4746 flags.go:64] FLAG: --local-storage-capacity-isolation="true"
Jan 28 20:39:32 crc kubenswrapper[4746]: I0128 20:39:32.595834 4746 flags.go:64] FLAG: --lock-file=""
Jan 28 20:39:32 crc kubenswrapper[4746]: I0128 20:39:32.595844 4746 flags.go:64] FLAG: --log-cadvisor-usage="false"
Jan 28 20:39:32 crc kubenswrapper[4746]: I0128 20:39:32.595854 4746 flags.go:64] FLAG: --log-flush-frequency="5s"
Jan 28 20:39:32 crc kubenswrapper[4746]: I0128 20:39:32.595864 4746 flags.go:64] FLAG: --log-json-info-buffer-size="0"
Jan 28 20:39:32 crc kubenswrapper[4746]: I0128 20:39:32.595878 4746 flags.go:64] FLAG: --log-json-split-stream="false"
Jan 28 20:39:32 crc kubenswrapper[4746]: I0128 20:39:32.595888 4746 flags.go:64] FLAG: --log-text-info-buffer-size="0"
Jan 28 20:39:32 crc kubenswrapper[4746]: I0128 20:39:32.595898 4746 flags.go:64] FLAG: --log-text-split-stream="false"
Jan 28 20:39:32 crc kubenswrapper[4746]: I0128 20:39:32.595907 4746 flags.go:64] FLAG: --logging-format="text"
Jan 28 20:39:32 crc kubenswrapper[4746]: I0128 20:39:32.595929 4746 flags.go:64] FLAG: --machine-id-file="/etc/machine-id,/var/lib/dbus/machine-id"
Jan 28 20:39:32 crc kubenswrapper[4746]: I0128 20:39:32.595940 4746 flags.go:64] FLAG: --make-iptables-util-chains="true"
Jan 28 20:39:32 crc kubenswrapper[4746]: I0128 20:39:32.595950 4746 flags.go:64] FLAG: --manifest-url=""
Jan 28 20:39:32 crc kubenswrapper[4746]: I0128 20:39:32.595960 4746 flags.go:64] FLAG: --manifest-url-header=""
Jan 28 20:39:32 crc kubenswrapper[4746]: I0128 20:39:32.595972 4746 flags.go:64] FLAG: --max-housekeeping-interval="15s"
Jan 28 20:39:32 crc kubenswrapper[4746]: I0128 20:39:32.595983 4746 flags.go:64] FLAG: --max-open-files="1000000"
Jan 28 20:39:32 crc kubenswrapper[4746]: I0128 20:39:32.596000 4746 flags.go:64] FLAG: --max-pods="110"
Jan 28 20:39:32 crc kubenswrapper[4746]: I0128 20:39:32.596010 4746 flags.go:64] FLAG: --maximum-dead-containers="-1"
Jan 28 20:39:32 crc kubenswrapper[4746]: I0128 20:39:32.596020 4746 flags.go:64] FLAG: --maximum-dead-containers-per-container="1"
Jan 28 20:39:32 crc kubenswrapper[4746]: I0128 20:39:32.596029 4746 flags.go:64] FLAG: --memory-manager-policy="None"
Jan 28 20:39:32 crc kubenswrapper[4746]: I0128 20:39:32.596039 4746 flags.go:64] FLAG: --minimum-container-ttl-duration="6m0s"
Jan 28 20:39:32 crc kubenswrapper[4746]: I0128 20:39:32.596049 4746 flags.go:64] FLAG: --minimum-image-ttl-duration="2m0s"
Jan 28 20:39:32 crc kubenswrapper[4746]: I0128 20:39:32.596058 4746 flags.go:64] FLAG: --node-ip="192.168.126.11"
Jan 28 20:39:32 crc kubenswrapper[4746]: I0128 20:39:32.596069 4746 flags.go:64] FLAG: --node-labels="node-role.kubernetes.io/control-plane=,node-role.kubernetes.io/master=,node.openshift.io/os_id=rhcos"
Jan 28 20:39:32 crc kubenswrapper[4746]: I0128 20:39:32.596465 4746 flags.go:64] FLAG: --node-status-max-images="50"
Jan 28 20:39:32 crc kubenswrapper[4746]: I0128 20:39:32.596483 4746 flags.go:64] FLAG: --node-status-update-frequency="10s"
Jan 28 20:39:32 crc kubenswrapper[4746]: I0128 20:39:32.596493 4746 flags.go:64] FLAG: --oom-score-adj="-999"
Jan 28 20:39:32 crc kubenswrapper[4746]: I0128 20:39:32.596503 4746 flags.go:64] FLAG: --pod-cidr=""
Jan 28 20:39:32 crc kubenswrapper[4746]: I0128 20:39:32.596512 4746 flags.go:64] FLAG: --pod-infra-container-image="quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:33549946e22a9ffa738fd94b1345f90921bc8f92fa6137784cb33c77ad806f9d"
Jan 28 20:39:32 crc kubenswrapper[4746]: I0128 20:39:32.596528 4746 flags.go:64] FLAG: --pod-manifest-path=""
Jan 28 20:39:32 crc kubenswrapper[4746]: I0128 20:39:32.596538 4746 flags.go:64] FLAG: --pod-max-pids="-1"
Jan 28 20:39:32 crc kubenswrapper[4746]: I0128 20:39:32.596548 4746 flags.go:64] FLAG: --pods-per-core="0"
Jan 28 20:39:32 crc kubenswrapper[4746]: I0128 20:39:32.596557 4746 flags.go:64] FLAG: --port="10250"
Jan 28 20:39:32 crc kubenswrapper[4746]: I0128 20:39:32.596568 4746 flags.go:64] FLAG: --protect-kernel-defaults="false"
Jan 28 20:39:32 crc kubenswrapper[4746]: I0128 20:39:32.596578 4746 flags.go:64] FLAG: --provider-id=""
Jan 28 20:39:32 crc kubenswrapper[4746]: I0128 20:39:32.596588 4746 flags.go:64] FLAG: --qos-reserved=""
Jan 28 20:39:32 crc kubenswrapper[4746]: I0128 20:39:32.596599 4746 flags.go:64] FLAG: --read-only-port="10255"
Jan 28 20:39:32 crc kubenswrapper[4746]: I0128 20:39:32.596609 4746 flags.go:64] FLAG: --register-node="true"
Jan 28 20:39:32 crc kubenswrapper[4746]: I0128 20:39:32.596618 4746 flags.go:64] FLAG: --register-schedulable="true"
Jan 28 20:39:32 crc kubenswrapper[4746]: I0128 20:39:32.596629 4746 flags.go:64] FLAG: --register-with-taints="node-role.kubernetes.io/master=:NoSchedule"
Jan 28 20:39:32 crc kubenswrapper[4746]: I0128 20:39:32.596646 4746 flags.go:64] FLAG: --registry-burst="10"
Jan 28 20:39:32 crc kubenswrapper[4746]: I0128 20:39:32.596656 4746 flags.go:64] FLAG: --registry-qps="5"
Jan 28 20:39:32 crc kubenswrapper[4746]: I0128 20:39:32.596665 4746 flags.go:64] FLAG: --reserved-cpus=""
Jan 28 20:39:32 crc kubenswrapper[4746]: I0128 20:39:32.596674 4746 flags.go:64] FLAG: --reserved-memory=""
Jan 28 20:39:32 crc kubenswrapper[4746]: I0128 20:39:32.596687 4746 flags.go:64] FLAG: --resolv-conf="/etc/resolv.conf"
Jan 28 20:39:32 crc kubenswrapper[4746]: I0128 20:39:32.596697 4746 flags.go:64] FLAG: --root-dir="/var/lib/kubelet"
Jan 28 20:39:32 crc kubenswrapper[4746]: I0128 20:39:32.596710 4746 flags.go:64] FLAG: --rotate-certificates="false"
Jan 28 20:39:32 crc kubenswrapper[4746]: I0128 20:39:32.596720 4746 flags.go:64] FLAG: --rotate-server-certificates="false"
Jan 28 20:39:32 crc kubenswrapper[4746]: I0128 20:39:32.596730 4746 flags.go:64] FLAG: --runonce="false"
Jan 28 20:39:32 crc kubenswrapper[4746]: I0128 20:39:32.596741 4746 flags.go:64] FLAG: --runtime-cgroups="/system.slice/crio.service"
Jan 28 20:39:32 crc kubenswrapper[4746]: I0128 20:39:32.596751 4746 flags.go:64] FLAG: --runtime-request-timeout="2m0s"
Jan 28 20:39:32 crc kubenswrapper[4746]: I0128 20:39:32.596760 4746 flags.go:64] FLAG: --seccomp-default="false"
Jan 28 20:39:32 crc kubenswrapper[4746]: I0128 20:39:32.596770 4746 flags.go:64] FLAG: --serialize-image-pulls="true"
Jan 28 20:39:32 crc kubenswrapper[4746]: I0128 20:39:32.596779 4746 flags.go:64] FLAG: --storage-driver-buffer-duration="1m0s"
Jan 28 20:39:32 crc kubenswrapper[4746]: I0128 20:39:32.596790 4746 flags.go:64] FLAG: --storage-driver-db="cadvisor"
Jan 28 20:39:32 crc kubenswrapper[4746]: I0128 20:39:32.596800 4746 flags.go:64] FLAG: --storage-driver-host="localhost:8086"
Jan 28 20:39:32 crc kubenswrapper[4746]: I0128 20:39:32.596810 4746 flags.go:64] FLAG: --storage-driver-password="root"
Jan 28 20:39:32 crc kubenswrapper[4746]: I0128 20:39:32.596819 4746 flags.go:64] FLAG: --storage-driver-secure="false"
Jan 28 20:39:32 crc kubenswrapper[4746]: I0128 20:39:32.596829 4746 flags.go:64] FLAG: --storage-driver-table="stats"
Jan 28 20:39:32 crc kubenswrapper[4746]: I0128 20:39:32.596839 4746 flags.go:64] FLAG: --storage-driver-user="root"
Jan 28 20:39:32 crc kubenswrapper[4746]: I0128 20:39:32.596849 4746 flags.go:64] FLAG: --streaming-connection-idle-timeout="4h0m0s"
Jan 28 20:39:32 crc kubenswrapper[4746]: I0128 20:39:32.596858 4746 flags.go:64] FLAG: --sync-frequency="1m0s"
Jan 28 20:39:32 crc kubenswrapper[4746]: I0128 20:39:32.596868 4746 flags.go:64] FLAG: --system-cgroups=""
Jan 28 20:39:32 crc kubenswrapper[4746]: I0128 20:39:32.596878 4746 flags.go:64] FLAG: --system-reserved="cpu=200m,ephemeral-storage=350Mi,memory=350Mi"
Jan 28 20:39:32 crc kubenswrapper[4746]: I0128 20:39:32.596893 4746 flags.go:64] FLAG: --system-reserved-cgroup=""
Jan 28 20:39:32 crc kubenswrapper[4746]: I0128 20:39:32.596903 4746 flags.go:64] FLAG: --tls-cert-file=""
Jan 28 20:39:32 crc kubenswrapper[4746]: I0128 20:39:32.596913 4746 flags.go:64] FLAG: --tls-cipher-suites="[]"
Jan 28 20:39:32 crc kubenswrapper[4746]: I0128 20:39:32.596926 4746 flags.go:64] FLAG: --tls-min-version=""
Jan 28 20:39:32 crc kubenswrapper[4746]: I0128 20:39:32.596935 4746 flags.go:64] FLAG: --tls-private-key-file=""
Jan 28 20:39:32 crc kubenswrapper[4746]: I0128 20:39:32.596945 4746 flags.go:64] FLAG: --topology-manager-policy="none"
Jan 28 20:39:32 crc kubenswrapper[4746]: I0128 20:39:32.596955 4746 flags.go:64] FLAG: --topology-manager-policy-options=""
Jan 28 20:39:32 crc kubenswrapper[4746]: I0128 20:39:32.596964 4746 flags.go:64] FLAG: --topology-manager-scope="container"
Jan 28 20:39:32 crc kubenswrapper[4746]: I0128 20:39:32.596974 4746 flags.go:64] FLAG: --v="2"
Jan 28 20:39:32 crc kubenswrapper[4746]: I0128 20:39:32.596987 4746 flags.go:64] FLAG: --version="false"
Jan 28 20:39:32 crc kubenswrapper[4746]: I0128 20:39:32.596999 4746 flags.go:64] FLAG: --vmodule=""
Jan 28 20:39:32 crc kubenswrapper[4746]: I0128 20:39:32.597011 4746 flags.go:64] FLAG: --volume-plugin-dir="/etc/kubernetes/kubelet-plugins/volume/exec"
Jan 28 20:39:32 crc kubenswrapper[4746]: I0128 20:39:32.597022 4746 flags.go:64] FLAG: --volume-stats-agg-period="1m0s"
Jan 28 20:39:32 crc kubenswrapper[4746]: W0128 20:39:32.597284 4746 feature_gate.go:330] unrecognized feature gate: HardwareSpeed
Jan 28 20:39:32 crc kubenswrapper[4746]: W0128 20:39:32.597297 4746 feature_gate.go:330] unrecognized feature gate: MultiArchInstallGCP
Jan 28 20:39:32 crc kubenswrapper[4746]: W0128 20:39:32.597308 4746 feature_gate.go:330] unrecognized feature gate: NewOLM
Jan 28 20:39:32 crc kubenswrapper[4746]: W0128 20:39:32.597318 4746 feature_gate.go:330] unrecognized feature gate: PersistentIPsForVirtualization
Jan 28 20:39:32 crc kubenswrapper[4746]: W0128 20:39:32.597326 4746 feature_gate.go:330] unrecognized feature gate: MachineAPIProviderOpenStack
Jan 28 20:39:32 crc kubenswrapper[4746]: W0128 20:39:32.597343 4746 feature_gate.go:330] unrecognized feature gate: ManagedBootImagesAWS
Jan 28 20:39:32 crc kubenswrapper[4746]: W0128 20:39:32.597352 4746 feature_gate.go:330] unrecognized feature gate: ClusterAPIInstallIBMCloud
Jan 28 20:39:32 crc kubenswrapper[4746]: W0128 20:39:32.597361 4746 feature_gate.go:330] unrecognized feature gate: NodeDisruptionPolicy
Jan 28 20:39:32 crc kubenswrapper[4746]: W0128 20:39:32.597369 4746 feature_gate.go:330] unrecognized feature gate: UpgradeStatus
Jan 28 20:39:32 crc kubenswrapper[4746]: W0128 20:39:32.597378 4746 feature_gate.go:330] unrecognized feature gate: MachineAPIOperatorDisableMachineHealthCheckController
Jan 28 20:39:32 crc kubenswrapper[4746]: W0128 20:39:32.597386 4746 feature_gate.go:330] unrecognized feature gate: MultiArchInstallAzure
Jan 28 20:39:32 crc kubenswrapper[4746]: W0128 20:39:32.597394 4746 feature_gate.go:330] unrecognized feature gate: NutanixMultiSubnets
Jan 28 20:39:32 crc kubenswrapper[4746]: W0128 20:39:32.597402 4746 feature_gate.go:330] unrecognized feature gate: ManagedBootImages
Jan 28 20:39:32 crc kubenswrapper[4746]: W0128 20:39:32.597411 4746 feature_gate.go:330] unrecognized feature gate: AdminNetworkPolicy
Jan 28 20:39:32 crc kubenswrapper[4746]: W0128 20:39:32.597419 4746 feature_gate.go:330] unrecognized feature gate: BootcNodeManagement
Jan 28 20:39:32 crc kubenswrapper[4746]: W0128 20:39:32.597427 4746 feature_gate.go:330] unrecognized feature gate: Example
Jan 28 20:39:32 crc kubenswrapper[4746]: W0128 20:39:32.597436 4746 feature_gate.go:330] unrecognized feature gate: MetricsCollectionProfiles
Jan 28 20:39:32 crc kubenswrapper[4746]: W0128 20:39:32.597444 4746 feature_gate.go:330] unrecognized feature gate: PinnedImages
Jan 28 20:39:32 crc kubenswrapper[4746]: W0128 20:39:32.597455 4746 feature_gate.go:353] Setting GA feature gate ValidatingAdmissionPolicy=true. It will be removed in a future release.
Jan 28 20:39:32 crc kubenswrapper[4746]: W0128 20:39:32.597467 4746 feature_gate.go:330] unrecognized feature gate: AzureWorkloadIdentity
Jan 28 20:39:32 crc kubenswrapper[4746]: W0128 20:39:32.597478 4746 feature_gate.go:330] unrecognized feature gate: VolumeGroupSnapshot
Jan 28 20:39:32 crc kubenswrapper[4746]: W0128 20:39:32.597487 4746 feature_gate.go:330] unrecognized feature gate: AlibabaPlatform
Jan 28 20:39:32 crc kubenswrapper[4746]: W0128 20:39:32.597495 4746 feature_gate.go:330] unrecognized feature gate: MinimumKubeletVersion
Jan 28 20:39:32 crc kubenswrapper[4746]: W0128 20:39:32.597504 4746 feature_gate.go:330] unrecognized feature gate: SigstoreImageVerification
Jan 28 20:39:32 crc kubenswrapper[4746]: W0128 20:39:32.597516 4746 feature_gate.go:353] Setting GA feature gate CloudDualStackNodeIPs=true. It will be removed in a future release.
Jan 28 20:39:32 crc kubenswrapper[4746]: W0128 20:39:32.597527 4746 feature_gate.go:353] Setting GA feature gate DisableKubeletCloudCredentialProviders=true. It will be removed in a future release.
Jan 28 20:39:32 crc kubenswrapper[4746]: W0128 20:39:32.597538 4746 feature_gate.go:330] unrecognized feature gate: MachineAPIMigration
Jan 28 20:39:32 crc kubenswrapper[4746]: W0128 20:39:32.597548 4746 feature_gate.go:330] unrecognized feature gate: AutomatedEtcdBackup
Jan 28 20:39:32 crc kubenswrapper[4746]: W0128 20:39:32.597557 4746 feature_gate.go:330] unrecognized feature gate: MixedCPUsAllocation
Jan 28 20:39:32 crc kubenswrapper[4746]: W0128 20:39:32.597567 4746 feature_gate.go:330] unrecognized feature gate: AWSEFSDriverVolumeMetrics
Jan 28 20:39:32 crc kubenswrapper[4746]: W0128 20:39:32.597576 4746 feature_gate.go:330] unrecognized feature gate: InsightsConfigAPI
Jan 28 20:39:32 crc kubenswrapper[4746]: W0128 20:39:32.597585 4746 feature_gate.go:330] unrecognized feature gate: NetworkLiveMigration
Jan 28 20:39:32 crc kubenswrapper[4746]: W0128 20:39:32.597596 4746 feature_gate.go:351] Setting deprecated feature gate KMSv1=true. It will be removed in a future release.
Jan 28 20:39:32 crc kubenswrapper[4746]: W0128 20:39:32.597606 4746 feature_gate.go:330] unrecognized feature gate: ConsolePluginContentSecurityPolicy
Jan 28 20:39:32 crc kubenswrapper[4746]: W0128 20:39:32.597614 4746 feature_gate.go:330] unrecognized feature gate: EtcdBackendQuota
Jan 28 20:39:32 crc kubenswrapper[4746]: W0128 20:39:32.597631 4746 feature_gate.go:330] unrecognized feature gate: SignatureStores
Jan 28 20:39:32 crc kubenswrapper[4746]: W0128 20:39:32.597639 4746 feature_gate.go:330] unrecognized feature gate: VSphereControlPlaneMachineSet
Jan 28 20:39:32 crc kubenswrapper[4746]: W0128 20:39:32.597648 4746 feature_gate.go:330] unrecognized feature gate: NetworkDiagnosticsConfig
Jan 28 20:39:32 crc kubenswrapper[4746]: W0128 20:39:32.597656 4746 feature_gate.go:330] unrecognized feature gate: PrivateHostedZoneAWS
Jan 28 20:39:32 crc kubenswrapper[4746]: W0128 20:39:32.597664 4746 feature_gate.go:330] unrecognized feature gate: VSphereMultiVCenters
Jan 28 20:39:32 crc kubenswrapper[4746]: W0128 20:39:32.597677 4746 feature_gate.go:330] unrecognized feature gate: GCPLabelsTags
Jan 28 20:39:32 crc kubenswrapper[4746]: W0128 20:39:32.597686 4746 feature_gate.go:330] unrecognized feature gate: ChunkSizeMiB
Jan 28 20:39:32 crc kubenswrapper[4746]: W0128 20:39:32.597695 4746 feature_gate.go:330] unrecognized feature gate: CSIDriverSharedResource
Jan 28 20:39:32 crc kubenswrapper[4746]: W0128 20:39:32.597703 4746 feature_gate.go:330] unrecognized feature gate: InsightsConfig
Jan 28 20:39:32 crc kubenswrapper[4746]: W0128 20:39:32.597711 4746 feature_gate.go:330] unrecognized feature gate: BuildCSIVolumes
Jan 28 20:39:32 crc kubenswrapper[4746]: W0128 20:39:32.597719 4746 feature_gate.go:330] unrecognized feature gate: OnClusterBuild
Jan 28 20:39:32 crc kubenswrapper[4746]: W0128 20:39:32.597727 4746 feature_gate.go:330] unrecognized feature gate: NetworkSegmentation
Jan 28 20:39:32 crc kubenswrapper[4746]: W0128 20:39:32.597736 4746 feature_gate.go:330] unrecognized feature gate: MachineConfigNodes
Jan 28 20:39:32 crc kubenswrapper[4746]: W0128 20:39:32.597744 4746 feature_gate.go:330] unrecognized feature gate: OVNObservability
Jan 28 20:39:32 crc kubenswrapper[4746]: W0128 20:39:32.597752 4746 feature_gate.go:330] unrecognized feature gate: ImageStreamImportMode
Jan 28 20:39:32 crc kubenswrapper[4746]: W0128 20:39:32.597761 4746 feature_gate.go:330] unrecognized feature gate: OpenShiftPodSecurityAdmission
Jan 28 20:39:32 crc kubenswrapper[4746]: W0128 20:39:32.597769 4746 feature_gate.go:330] unrecognized feature gate: AWSClusterHostedDNS
Jan 28 20:39:32 crc kubenswrapper[4746]: W0128 20:39:32.597777 4746 feature_gate.go:330] unrecognized feature gate: SetEIPForNLBIngressController
Jan 28 20:39:32 crc kubenswrapper[4746]: W0128 20:39:32.597785 4746 feature_gate.go:330] unrecognized feature gate: InsightsRuntimeExtractor
Jan 28 20:39:32 crc kubenswrapper[4746]: W0128 20:39:32.597793 4746 feature_gate.go:330] unrecognized feature gate: GatewayAPI
Jan 28 20:39:32 crc kubenswrapper[4746]: W0128 20:39:32.597802 4746 feature_gate.go:330] unrecognized feature gate: GCPClusterHostedDNS
Jan 28 20:39:32 crc kubenswrapper[4746]: W0128 20:39:32.597810 4746 feature_gate.go:330] unrecognized feature gate: ClusterAPIInstall
Jan 28 20:39:32 crc kubenswrapper[4746]: W0128 20:39:32.597818 4746 feature_gate.go:330] unrecognized feature gate: IngressControllerLBSubnetsAWS
Jan 28 20:39:32 crc kubenswrapper[4746]: W0128 20:39:32.597826 4746 feature_gate.go:330] unrecognized feature gate: VSphereMultiNetworks
Jan 28 20:39:32 crc kubenswrapper[4746]: W0128 20:39:32.597834 4746 feature_gate.go:330] unrecognized feature gate: ClusterMonitoringConfig
Jan 28 20:39:32 crc kubenswrapper[4746]: W0128 20:39:32.597842 4746 feature_gate.go:330] unrecognized feature gate: IngressControllerDynamicConfigurationManager
Jan 28 20:39:32 crc kubenswrapper[4746]: W0128 20:39:32.597851 4746 feature_gate.go:330] unrecognized feature gate: DNSNameResolver
Jan 28 20:39:32 crc kubenswrapper[4746]: W0128 20:39:32.597860 4746 feature_gate.go:330] unrecognized feature gate: PlatformOperators
Jan 28 20:39:32 crc kubenswrapper[4746]: W0128 20:39:32.597868 4746 feature_gate.go:330] unrecognized feature gate: ExternalOIDC
Jan 28 20:39:32 crc kubenswrapper[4746]: W0128 20:39:32.597877 4746 feature_gate.go:330] unrecognized feature gate: BareMetalLoadBalancer
Jan 28 20:39:32 crc kubenswrapper[4746]: W0128 20:39:32.597885 4746 feature_gate.go:330] unrecognized feature gate: RouteAdvertisements
Jan 28 20:39:32 crc kubenswrapper[4746]: W0128 20:39:32.597893 4746 feature_gate.go:330] unrecognized feature gate: MultiArchInstallAWS
Jan 28 20:39:32 crc kubenswrapper[4746]: W0128 20:39:32.597905 4746 feature_gate.go:330] unrecognized feature gate: InsightsOnDemandDataGather
Jan 28 20:39:32 crc kubenswrapper[4746]: W0128 20:39:32.597913 4746 feature_gate.go:330] unrecognized feature gate: VSphereStaticIPs
Jan 28 20:39:32 crc kubenswrapper[4746]: W0128 20:39:32.597922 4746 feature_gate.go:330] unrecognized feature gate: VSphereDriverConfiguration
Jan 28 20:39:32 crc kubenswrapper[4746]: W0128 20:39:32.597930 4746 feature_gate.go:330] unrecognized feature gate: AdditionalRoutingCapabilities
Jan 28 20:39:32 crc kubenswrapper[4746]: I0128 20:39:32.597957 4746 feature_gate.go:386] feature gates: {map[CloudDualStackNodeIPs:true DisableKubeletCloudCredentialProviders:true DynamicResourceAllocation:false EventedPLEG:false KMSv1:true MaxUnavailableStatefulSet:false NodeSwap:false ProcMountType:false RouteExternalCertificate:false ServiceAccountTokenNodeBinding:false TranslateStreamCloseWebsocketRequests:false UserNamespacesPodSecurityStandards:false UserNamespacesSupport:false ValidatingAdmissionPolicy:true VolumeAttributesClass:false]}
Jan 28 20:39:32 crc kubenswrapper[4746]: I0128 20:39:32.609555 4746 server.go:491] "Kubelet version" kubeletVersion="v1.31.5"
Jan 28 20:39:32 crc kubenswrapper[4746]: I0128 20:39:32.609593 4746 server.go:493] "Golang settings" GOGC="" GOMAXPROCS="" GOTRACEBACK=""
Jan 28 20:39:32 crc kubenswrapper[4746]: W0128 20:39:32.609742 4746 feature_gate.go:330] unrecognized feature gate: CSIDriverSharedResource
Jan 28 20:39:32 crc kubenswrapper[4746]: W0128 20:39:32.609757 4746 feature_gate.go:330] unrecognized feature gate: PersistentIPsForVirtualization
Jan 28 20:39:32 crc kubenswrapper[4746]: W0128 20:39:32.609768 4746 feature_gate.go:330] unrecognized feature gate: Example
Jan 28 20:39:32 crc kubenswrapper[4746]: W0128 20:39:32.609779 4746 feature_gate.go:330] unrecognized feature gate: ManagedBootImages
Jan 28 20:39:32 crc kubenswrapper[4746]: W0128 20:39:32.609788 4746 feature_gate.go:330] unrecognized feature gate: PlatformOperators
Jan 28 20:39:32 crc kubenswrapper[4746]: W0128 20:39:32.609798 4746 feature_gate.go:330] unrecognized feature gate: OnClusterBuild
Jan 28 20:39:32 crc kubenswrapper[4746]: W0128 20:39:32.609806 4746 feature_gate.go:330] unrecognized feature gate: NodeDisruptionPolicy
Jan 28 20:39:32 crc kubenswrapper[4746]: W0128 20:39:32.609815 4746 feature_gate.go:330] unrecognized feature gate: BuildCSIVolumes
Jan 28 20:39:32 crc kubenswrapper[4746]: W0128 20:39:32.609823 4746 feature_gate.go:330] unrecognized feature gate: VSphereControlPlaneMachineSet
Jan 28 20:39:32 crc kubenswrapper[4746]: W0128 20:39:32.609832 4746 feature_gate.go:330] unrecognized feature gate: OpenShiftPodSecurityAdmission
Jan 28 20:39:32 crc kubenswrapper[4746]: W0128 20:39:32.609840 4746 feature_gate.go:330] unrecognized feature gate: GCPLabelsTags
Jan 28 20:39:32 crc kubenswrapper[4746]: W0128 20:39:32.609848 4746 feature_gate.go:330] unrecognized feature gate: GatewayAPI
Jan 28 20:39:32 crc kubenswrapper[4746]: W0128 20:39:32.609857 4746 feature_gate.go:330] unrecognized feature gate: AlibabaPlatform
Jan 28 20:39:32 crc kubenswrapper[4746]: W0128 20:39:32.609865 4746 feature_gate.go:330] unrecognized feature gate: BareMetalLoadBalancer
Jan 28 20:39:32 crc kubenswrapper[4746]: W0128 20:39:32.609873 4746 feature_gate.go:330] unrecognized feature gate: MultiArchInstallAWS
Jan 28 20:39:32 crc kubenswrapper[4746]: W0128 20:39:32.609881 4746 feature_gate.go:330] unrecognized feature gate: InsightsConfig
Jan 28 20:39:32 crc kubenswrapper[4746]: W0128 20:39:32.609890 4746 feature_gate.go:330] unrecognized feature gate: MinimumKubeletVersion
Jan 28 20:39:32 crc kubenswrapper[4746]: W0128 20:39:32.609901 4746 feature_gate.go:353] Setting GA feature gate CloudDualStackNodeIPs=true. It will be removed in a future release.
Jan 28 20:39:32 crc kubenswrapper[4746]: W0128 20:39:32.609912 4746 feature_gate.go:353] Setting GA feature gate ValidatingAdmissionPolicy=true. It will be removed in a future release.
Jan 28 20:39:32 crc kubenswrapper[4746]: W0128 20:39:32.609924 4746 feature_gate.go:330] unrecognized feature gate: NetworkLiveMigration
Jan 28 20:39:32 crc kubenswrapper[4746]: W0128 20:39:32.609933 4746 feature_gate.go:330] unrecognized feature gate: OVNObservability
Jan 28 20:39:32 crc kubenswrapper[4746]: W0128 20:39:32.609942 4746 feature_gate.go:330] unrecognized feature gate: HardwareSpeed
Jan 28 20:39:32 crc kubenswrapper[4746]: W0128 20:39:32.609952 4746 feature_gate.go:330] unrecognized feature gate: VolumeGroupSnapshot
Jan 28 20:39:32 crc kubenswrapper[4746]: W0128 20:39:32.610001 4746 feature_gate.go:330] unrecognized feature gate: MultiArchInstallAzure
Jan 28 20:39:32 crc kubenswrapper[4746]: W0128 20:39:32.610010 4746 feature_gate.go:330] unrecognized feature gate: InsightsOnDemandDataGather
Jan 28 20:39:32 crc kubenswrapper[4746]: W0128 20:39:32.610019 4746 feature_gate.go:330] unrecognized feature gate: ExternalOIDC
Jan 28 20:39:32 crc kubenswrapper[4746]: W0128 20:39:32.610027 4746 feature_gate.go:330] unrecognized feature gate: DNSNameResolver
Jan 28 20:39:32 crc kubenswrapper[4746]: W0128 20:39:32.610036 4746 feature_gate.go:330] unrecognized feature gate: AutomatedEtcdBackup
Jan 28 20:39:32 crc kubenswrapper[4746]: W0128 20:39:32.610045 4746 feature_gate.go:330] unrecognized feature gate: PrivateHostedZoneAWS
Jan 28 20:39:32 crc kubenswrapper[4746]: W0128 20:39:32.610053 4746 feature_gate.go:330] unrecognized feature gate: IngressControllerDynamicConfigurationManager
Jan 28 20:39:32 crc kubenswrapper[4746]: W0128 20:39:32.610062 4746 feature_gate.go:330] unrecognized feature gate: NetworkSegmentation
Jan 28 20:39:32 crc kubenswrapper[4746]: W0128 20:39:32.610100 4746 feature_gate.go:330] unrecognized feature gate: MachineAPIOperatorDisableMachineHealthCheckController
Jan 28 20:39:32 crc kubenswrapper[4746]: W0128 20:39:32.610110 4746 feature_gate.go:330] unrecognized feature gate: AdminNetworkPolicy
Jan 28 20:39:32 crc kubenswrapper[4746]: W0128 20:39:32.610119 4746 feature_gate.go:330] unrecognized feature gate: VSphereDriverConfiguration
Jan 28 20:39:32 crc kubenswrapper[4746]: W0128 20:39:32.610127 4746 feature_gate.go:330] unrecognized feature gate: ImageStreamImportMode
Jan 28 20:39:32 crc kubenswrapper[4746]: W0128 20:39:32.610137 4746 feature_gate.go:330] unrecognized feature gate: InsightsConfigAPI
Jan 28 20:39:32 crc kubenswrapper[4746]: W0128 20:39:32.610146 4746 feature_gate.go:330] unrecognized feature gate: MixedCPUsAllocation
Jan 28 20:39:32 crc kubenswrapper[4746]: W0128 20:39:32.610154 4746 feature_gate.go:330] unrecognized feature gate: ClusterAPIInstallIBMCloud
Jan 28 20:39:32 crc kubenswrapper[4746]: W0128 20:39:32.610163 4746 feature_gate.go:330] unrecognized feature gate: IngressControllerLBSubnetsAWS
Jan 28 20:39:32 crc kubenswrapper[4746]: W0128 20:39:32.610172 4746 feature_gate.go:330] unrecognized feature gate: MachineAPIProviderOpenStack
Jan 28 20:39:32 crc kubenswrapper[4746]: W0128 20:39:32.610180 4746 feature_gate.go:330] unrecognized feature gate: NutanixMultiSubnets
Jan 28 20:39:32 crc kubenswrapper[4746]: W0128 20:39:32.610191 4746 feature_gate.go:351] Setting deprecated feature gate KMSv1=true. It will be removed in a future release.
Jan 28 20:39:32 crc kubenswrapper[4746]: W0128 20:39:32.610201 4746 feature_gate.go:330] unrecognized feature gate: ClusterMonitoringConfig
Jan 28 20:39:32 crc kubenswrapper[4746]: W0128 20:39:32.610211 4746 feature_gate.go:330] unrecognized feature gate: MachineAPIMigration
Jan 28 20:39:32 crc kubenswrapper[4746]: W0128 20:39:32.610219 4746 feature_gate.go:330] unrecognized feature gate: VSphereMultiVCenters
Jan 28 20:39:32 crc kubenswrapper[4746]: W0128 20:39:32.610227 4746 feature_gate.go:330] unrecognized feature gate: AWSEFSDriverVolumeMetrics
Jan 28 20:39:32 crc kubenswrapper[4746]: W0128 20:39:32.610237 4746 feature_gate.go:330] unrecognized feature gate: SetEIPForNLBIngressController
Jan 28 20:39:32 crc kubenswrapper[4746]: W0128 20:39:32.610248 4746 feature_gate.go:330] unrecognized feature gate: VSphereMultiNetworks
Jan 28 20:39:32 crc kubenswrapper[4746]: W0128 20:39:32.610260 4746 feature_gate.go:330] unrecognized feature gate: MultiArchInstallGCP
Jan 28 20:39:32 crc kubenswrapper[4746]: W0128 20:39:32.610270 4746 feature_gate.go:330] unrecognized feature gate: BootcNodeManagement
Jan 28 20:39:32 crc kubenswrapper[4746]: W0128 20:39:32.610280 4746 feature_gate.go:330] unrecognized feature gate: RouteAdvertisements
Jan 28 20:39:32 crc kubenswrapper[4746]: W0128 20:39:32.610290 4746 feature_gate.go:330] unrecognized feature gate: MetricsCollectionProfiles
Jan 28 20:39:32 crc kubenswrapper[4746]: W0128 20:39:32.610301 4746 feature_gate.go:330] unrecognized feature gate: UpgradeStatus
Jan 28 20:39:32 crc kubenswrapper[4746]: W0128 20:39:32.610311 4746 feature_gate.go:330] unrecognized feature gate: MachineConfigNodes
Jan 28 20:39:32 crc kubenswrapper[4746]: W0128 20:39:32.610320 4746 feature_gate.go:330] unrecognized feature gate: EtcdBackendQuota
Jan 28 20:39:32 crc kubenswrapper[4746]: W0128 20:39:32.610328 4746 feature_gate.go:330] unrecognized feature gate: AzureWorkloadIdentity
Jan 28 20:39:32 crc kubenswrapper[4746]: W0128 20:39:32.610337 4746 feature_gate.go:330] unrecognized feature gate: AdditionalRoutingCapabilities
Jan 28 20:39:32 crc kubenswrapper[4746]: W0128 20:39:32.610345 4746 feature_gate.go:330] unrecognized feature gate: SigstoreImageVerification
Jan 28 20:39:32 crc kubenswrapper[4746]: W0128 20:39:32.610354 4746 feature_gate.go:330] unrecognized feature gate: VSphereStaticIPs
Jan 28 20:39:32 crc kubenswrapper[4746]: W0128 20:39:32.610365 4746 feature_gate.go:330] unrecognized feature gate: ConsolePluginContentSecurityPolicy
Jan 28 20:39:32 crc kubenswrapper[4746]: W0128 20:39:32.610373 4746 feature_gate.go:330] unrecognized feature gate: ClusterAPIInstall
Jan 28 20:39:32 crc kubenswrapper[4746]: W0128 20:39:32.610381 4746 feature_gate.go:330] unrecognized feature gate: InsightsRuntimeExtractor
Jan 28 20:39:32 crc kubenswrapper[4746]: W0128 20:39:32.610392 4746 feature_gate.go:353] Setting GA feature gate DisableKubeletCloudCredentialProviders=true. It will be removed in a future release.
Jan 28 20:39:32 crc kubenswrapper[4746]: W0128 20:39:32.610404 4746 feature_gate.go:330] unrecognized feature gate: ManagedBootImagesAWS
Jan 28 20:39:32 crc kubenswrapper[4746]: W0128 20:39:32.610414 4746 feature_gate.go:330] unrecognized feature gate: AWSClusterHostedDNS
Jan 28 20:39:32 crc kubenswrapper[4746]: W0128 20:39:32.610423 4746 feature_gate.go:330] unrecognized feature gate: NetworkDiagnosticsConfig
Jan 28 20:39:32 crc kubenswrapper[4746]: W0128 20:39:32.610433 4746 feature_gate.go:330] unrecognized feature gate: SignatureStores
Jan 28 20:39:32 crc kubenswrapper[4746]: W0128 20:39:32.610444 4746 feature_gate.go:330] unrecognized feature gate: GCPClusterHostedDNS
Jan 28 20:39:32 crc kubenswrapper[4746]: W0128 20:39:32.610457 4746 feature_gate.go:330] unrecognized feature gate: NewOLM
Jan 28 20:39:32 crc kubenswrapper[4746]: W0128 20:39:32.610468 4746 feature_gate.go:330] unrecognized feature gate: PinnedImages
Jan 28 20:39:32 crc kubenswrapper[4746]: W0128 20:39:32.610478 4746 feature_gate.go:330]
unrecognized feature gate: ChunkSizeMiB Jan 28 20:39:32 crc kubenswrapper[4746]: I0128 20:39:32.610492 4746 feature_gate.go:386] feature gates: {map[CloudDualStackNodeIPs:true DisableKubeletCloudCredentialProviders:true DynamicResourceAllocation:false EventedPLEG:false KMSv1:true MaxUnavailableStatefulSet:false NodeSwap:false ProcMountType:false RouteExternalCertificate:false ServiceAccountTokenNodeBinding:false TranslateStreamCloseWebsocketRequests:false UserNamespacesPodSecurityStandards:false UserNamespacesSupport:false ValidatingAdmissionPolicy:true VolumeAttributesClass:false]} Jan 28 20:39:32 crc kubenswrapper[4746]: W0128 20:39:32.610725 4746 feature_gate.go:330] unrecognized feature gate: NodeDisruptionPolicy Jan 28 20:39:32 crc kubenswrapper[4746]: W0128 20:39:32.610738 4746 feature_gate.go:330] unrecognized feature gate: ConsolePluginContentSecurityPolicy Jan 28 20:39:32 crc kubenswrapper[4746]: W0128 20:39:32.610748 4746 feature_gate.go:330] unrecognized feature gate: ChunkSizeMiB Jan 28 20:39:32 crc kubenswrapper[4746]: W0128 20:39:32.610756 4746 feature_gate.go:330] unrecognized feature gate: NetworkDiagnosticsConfig Jan 28 20:39:32 crc kubenswrapper[4746]: W0128 20:39:32.610765 4746 feature_gate.go:330] unrecognized feature gate: MachineAPIMigration Jan 28 20:39:32 crc kubenswrapper[4746]: W0128 20:39:32.610773 4746 feature_gate.go:330] unrecognized feature gate: VSphereMultiNetworks Jan 28 20:39:32 crc kubenswrapper[4746]: W0128 20:39:32.610781 4746 feature_gate.go:330] unrecognized feature gate: VolumeGroupSnapshot Jan 28 20:39:32 crc kubenswrapper[4746]: W0128 20:39:32.610789 4746 feature_gate.go:330] unrecognized feature gate: InsightsOnDemandDataGather Jan 28 20:39:32 crc kubenswrapper[4746]: W0128 20:39:32.610801 4746 feature_gate.go:353] Setting GA feature gate DisableKubeletCloudCredentialProviders=true. It will be removed in a future release. 
Jan 28 20:39:32 crc kubenswrapper[4746]: W0128 20:39:32.610812 4746 feature_gate.go:330] unrecognized feature gate: IngressControllerDynamicConfigurationManager Jan 28 20:39:32 crc kubenswrapper[4746]: W0128 20:39:32.610822 4746 feature_gate.go:330] unrecognized feature gate: InsightsRuntimeExtractor Jan 28 20:39:32 crc kubenswrapper[4746]: W0128 20:39:32.610831 4746 feature_gate.go:330] unrecognized feature gate: ClusterAPIInstall Jan 28 20:39:32 crc kubenswrapper[4746]: W0128 20:39:32.610841 4746 feature_gate.go:330] unrecognized feature gate: MixedCPUsAllocation Jan 28 20:39:32 crc kubenswrapper[4746]: W0128 20:39:32.610849 4746 feature_gate.go:330] unrecognized feature gate: NutanixMultiSubnets Jan 28 20:39:32 crc kubenswrapper[4746]: W0128 20:39:32.610858 4746 feature_gate.go:330] unrecognized feature gate: ClusterAPIInstallIBMCloud Jan 28 20:39:32 crc kubenswrapper[4746]: W0128 20:39:32.610866 4746 feature_gate.go:330] unrecognized feature gate: GCPLabelsTags Jan 28 20:39:32 crc kubenswrapper[4746]: W0128 20:39:32.610875 4746 feature_gate.go:330] unrecognized feature gate: MinimumKubeletVersion Jan 28 20:39:32 crc kubenswrapper[4746]: W0128 20:39:32.610884 4746 feature_gate.go:330] unrecognized feature gate: CSIDriverSharedResource Jan 28 20:39:32 crc kubenswrapper[4746]: W0128 20:39:32.610892 4746 feature_gate.go:330] unrecognized feature gate: SigstoreImageVerification Jan 28 20:39:32 crc kubenswrapper[4746]: W0128 20:39:32.610903 4746 feature_gate.go:353] Setting GA feature gate ValidatingAdmissionPolicy=true. It will be removed in a future release. 
Jan 28 20:39:32 crc kubenswrapper[4746]: W0128 20:39:32.610913 4746 feature_gate.go:330] unrecognized feature gate: AdditionalRoutingCapabilities Jan 28 20:39:32 crc kubenswrapper[4746]: W0128 20:39:32.610922 4746 feature_gate.go:330] unrecognized feature gate: RouteAdvertisements Jan 28 20:39:32 crc kubenswrapper[4746]: W0128 20:39:32.610931 4746 feature_gate.go:330] unrecognized feature gate: ImageStreamImportMode Jan 28 20:39:32 crc kubenswrapper[4746]: W0128 20:39:32.610940 4746 feature_gate.go:330] unrecognized feature gate: MultiArchInstallAzure Jan 28 20:39:32 crc kubenswrapper[4746]: W0128 20:39:32.610950 4746 feature_gate.go:330] unrecognized feature gate: VSphereStaticIPs Jan 28 20:39:32 crc kubenswrapper[4746]: W0128 20:39:32.610958 4746 feature_gate.go:330] unrecognized feature gate: OpenShiftPodSecurityAdmission Jan 28 20:39:32 crc kubenswrapper[4746]: W0128 20:39:32.610967 4746 feature_gate.go:330] unrecognized feature gate: ClusterMonitoringConfig Jan 28 20:39:32 crc kubenswrapper[4746]: W0128 20:39:32.610975 4746 feature_gate.go:330] unrecognized feature gate: AzureWorkloadIdentity Jan 28 20:39:32 crc kubenswrapper[4746]: W0128 20:39:32.610984 4746 feature_gate.go:330] unrecognized feature gate: Example Jan 28 20:39:32 crc kubenswrapper[4746]: W0128 20:39:32.610992 4746 feature_gate.go:330] unrecognized feature gate: OVNObservability Jan 28 20:39:32 crc kubenswrapper[4746]: W0128 20:39:32.611000 4746 feature_gate.go:330] unrecognized feature gate: InsightsConfig Jan 28 20:39:32 crc kubenswrapper[4746]: W0128 20:39:32.611008 4746 feature_gate.go:330] unrecognized feature gate: PrivateHostedZoneAWS Jan 28 20:39:32 crc kubenswrapper[4746]: W0128 20:39:32.611018 4746 feature_gate.go:330] unrecognized feature gate: InsightsConfigAPI Jan 28 20:39:32 crc kubenswrapper[4746]: W0128 20:39:32.611026 4746 feature_gate.go:330] unrecognized feature gate: GatewayAPI Jan 28 20:39:32 crc kubenswrapper[4746]: W0128 20:39:32.611035 4746 feature_gate.go:330] 
unrecognized feature gate: BareMetalLoadBalancer Jan 28 20:39:32 crc kubenswrapper[4746]: W0128 20:39:32.611043 4746 feature_gate.go:330] unrecognized feature gate: HardwareSpeed Jan 28 20:39:32 crc kubenswrapper[4746]: W0128 20:39:32.611054 4746 feature_gate.go:330] unrecognized feature gate: NewOLM Jan 28 20:39:32 crc kubenswrapper[4746]: W0128 20:39:32.611064 4746 feature_gate.go:330] unrecognized feature gate: VSphereMultiVCenters Jan 28 20:39:32 crc kubenswrapper[4746]: W0128 20:39:32.611075 4746 feature_gate.go:330] unrecognized feature gate: AlibabaPlatform Jan 28 20:39:32 crc kubenswrapper[4746]: W0128 20:39:32.611193 4746 feature_gate.go:330] unrecognized feature gate: SetEIPForNLBIngressController Jan 28 20:39:32 crc kubenswrapper[4746]: W0128 20:39:32.611208 4746 feature_gate.go:330] unrecognized feature gate: NetworkLiveMigration Jan 28 20:39:32 crc kubenswrapper[4746]: W0128 20:39:32.611221 4746 feature_gate.go:330] unrecognized feature gate: UpgradeStatus Jan 28 20:39:32 crc kubenswrapper[4746]: W0128 20:39:32.611231 4746 feature_gate.go:330] unrecognized feature gate: ManagedBootImagesAWS Jan 28 20:39:32 crc kubenswrapper[4746]: W0128 20:39:32.611242 4746 feature_gate.go:330] unrecognized feature gate: BootcNodeManagement Jan 28 20:39:32 crc kubenswrapper[4746]: W0128 20:39:32.611254 4746 feature_gate.go:330] unrecognized feature gate: IngressControllerLBSubnetsAWS Jan 28 20:39:32 crc kubenswrapper[4746]: W0128 20:39:32.611264 4746 feature_gate.go:330] unrecognized feature gate: ExternalOIDC Jan 28 20:39:32 crc kubenswrapper[4746]: W0128 20:39:32.611275 4746 feature_gate.go:330] unrecognized feature gate: PinnedImages Jan 28 20:39:32 crc kubenswrapper[4746]: W0128 20:39:32.611284 4746 feature_gate.go:330] unrecognized feature gate: NetworkSegmentation Jan 28 20:39:32 crc kubenswrapper[4746]: W0128 20:39:32.611292 4746 feature_gate.go:330] unrecognized feature gate: AWSEFSDriverVolumeMetrics Jan 28 20:39:32 crc kubenswrapper[4746]: W0128 
20:39:32.611301 4746 feature_gate.go:330] unrecognized feature gate: MetricsCollectionProfiles Jan 28 20:39:32 crc kubenswrapper[4746]: W0128 20:39:32.611310 4746 feature_gate.go:330] unrecognized feature gate: VSphereControlPlaneMachineSet Jan 28 20:39:32 crc kubenswrapper[4746]: W0128 20:39:32.611319 4746 feature_gate.go:330] unrecognized feature gate: VSphereDriverConfiguration Jan 28 20:39:32 crc kubenswrapper[4746]: W0128 20:39:32.611327 4746 feature_gate.go:330] unrecognized feature gate: EtcdBackendQuota Jan 28 20:39:32 crc kubenswrapper[4746]: W0128 20:39:32.611336 4746 feature_gate.go:330] unrecognized feature gate: PlatformOperators Jan 28 20:39:32 crc kubenswrapper[4746]: W0128 20:39:32.611345 4746 feature_gate.go:330] unrecognized feature gate: SignatureStores Jan 28 20:39:32 crc kubenswrapper[4746]: W0128 20:39:32.611353 4746 feature_gate.go:330] unrecognized feature gate: AWSClusterHostedDNS Jan 28 20:39:32 crc kubenswrapper[4746]: W0128 20:39:32.611362 4746 feature_gate.go:330] unrecognized feature gate: MachineConfigNodes Jan 28 20:39:32 crc kubenswrapper[4746]: W0128 20:39:32.611374 4746 feature_gate.go:353] Setting GA feature gate CloudDualStackNodeIPs=true. It will be removed in a future release. 
Jan 28 20:39:32 crc kubenswrapper[4746]: W0128 20:39:32.611384 4746 feature_gate.go:330] unrecognized feature gate: OnClusterBuild Jan 28 20:39:32 crc kubenswrapper[4746]: W0128 20:39:32.611394 4746 feature_gate.go:330] unrecognized feature gate: DNSNameResolver Jan 28 20:39:32 crc kubenswrapper[4746]: W0128 20:39:32.611404 4746 feature_gate.go:330] unrecognized feature gate: MultiArchInstallAWS Jan 28 20:39:32 crc kubenswrapper[4746]: W0128 20:39:32.611414 4746 feature_gate.go:330] unrecognized feature gate: ManagedBootImages Jan 28 20:39:32 crc kubenswrapper[4746]: W0128 20:39:32.611423 4746 feature_gate.go:330] unrecognized feature gate: GCPClusterHostedDNS Jan 28 20:39:32 crc kubenswrapper[4746]: W0128 20:39:32.611432 4746 feature_gate.go:330] unrecognized feature gate: AdminNetworkPolicy Jan 28 20:39:32 crc kubenswrapper[4746]: W0128 20:39:32.611442 4746 feature_gate.go:330] unrecognized feature gate: MultiArchInstallGCP Jan 28 20:39:32 crc kubenswrapper[4746]: W0128 20:39:32.611451 4746 feature_gate.go:330] unrecognized feature gate: PersistentIPsForVirtualization Jan 28 20:39:32 crc kubenswrapper[4746]: W0128 20:39:32.611459 4746 feature_gate.go:330] unrecognized feature gate: BuildCSIVolumes Jan 28 20:39:32 crc kubenswrapper[4746]: W0128 20:39:32.611469 4746 feature_gate.go:351] Setting deprecated feature gate KMSv1=true. It will be removed in a future release. 
Jan 28 20:39:32 crc kubenswrapper[4746]: W0128 20:39:32.611482 4746 feature_gate.go:330] unrecognized feature gate: MachineAPIOperatorDisableMachineHealthCheckController Jan 28 20:39:32 crc kubenswrapper[4746]: W0128 20:39:32.611491 4746 feature_gate.go:330] unrecognized feature gate: AutomatedEtcdBackup Jan 28 20:39:32 crc kubenswrapper[4746]: W0128 20:39:32.611500 4746 feature_gate.go:330] unrecognized feature gate: MachineAPIProviderOpenStack Jan 28 20:39:32 crc kubenswrapper[4746]: I0128 20:39:32.611514 4746 feature_gate.go:386] feature gates: {map[CloudDualStackNodeIPs:true DisableKubeletCloudCredentialProviders:true DynamicResourceAllocation:false EventedPLEG:false KMSv1:true MaxUnavailableStatefulSet:false NodeSwap:false ProcMountType:false RouteExternalCertificate:false ServiceAccountTokenNodeBinding:false TranslateStreamCloseWebsocketRequests:false UserNamespacesPodSecurityStandards:false UserNamespacesSupport:false ValidatingAdmissionPolicy:true VolumeAttributesClass:false]} Jan 28 20:39:32 crc kubenswrapper[4746]: I0128 20:39:32.611755 4746 server.go:940] "Client rotation is on, will bootstrap in background" Jan 28 20:39:32 crc kubenswrapper[4746]: I0128 20:39:32.618482 4746 bootstrap.go:85] "Current kubeconfig file contents are still valid, no bootstrap necessary" Jan 28 20:39:32 crc kubenswrapper[4746]: I0128 20:39:32.618612 4746 certificate_store.go:130] Loading cert/key pair from "/var/lib/kubelet/pki/kubelet-client-current.pem". 
Jan 28 20:39:32 crc kubenswrapper[4746]: I0128 20:39:32.621329 4746 server.go:997] "Starting client certificate rotation" Jan 28 20:39:32 crc kubenswrapper[4746]: I0128 20:39:32.621385 4746 certificate_manager.go:356] kubernetes.io/kube-apiserver-client-kubelet: Certificate rotation is enabled Jan 28 20:39:32 crc kubenswrapper[4746]: I0128 20:39:32.621643 4746 certificate_manager.go:356] kubernetes.io/kube-apiserver-client-kubelet: Certificate expiration is 2026-02-24 05:52:08 +0000 UTC, rotation deadline is 2026-01-09 06:49:11.922063025 +0000 UTC Jan 28 20:39:32 crc kubenswrapper[4746]: I0128 20:39:32.621781 4746 certificate_manager.go:356] kubernetes.io/kube-apiserver-client-kubelet: Rotating certificates Jan 28 20:39:32 crc kubenswrapper[4746]: I0128 20:39:32.656771 4746 dynamic_cafile_content.go:123] "Loaded a new CA Bundle and Verifier" name="client-ca-bundle::/etc/kubernetes/kubelet-ca.crt" Jan 28 20:39:32 crc kubenswrapper[4746]: E0128 20:39:32.662136 4746 certificate_manager.go:562] "Unhandled Error" err="kubernetes.io/kube-apiserver-client-kubelet: Failed while requesting a signed certificate from the control plane: cannot create certificate signing request: Post \"https://api-int.crc.testing:6443/apis/certificates.k8s.io/v1/certificatesigningrequests\": dial tcp 38.102.83.201:6443: connect: connection refused" logger="UnhandledError" Jan 28 20:39:32 crc kubenswrapper[4746]: I0128 20:39:32.663565 4746 dynamic_cafile_content.go:161] "Starting controller" name="client-ca-bundle::/etc/kubernetes/kubelet-ca.crt" Jan 28 20:39:32 crc kubenswrapper[4746]: I0128 20:39:32.684249 4746 log.go:25] "Validated CRI v1 runtime API" Jan 28 20:39:32 crc kubenswrapper[4746]: I0128 20:39:32.723436 4746 log.go:25] "Validated CRI v1 image API" Jan 28 20:39:32 crc kubenswrapper[4746]: I0128 20:39:32.725227 4746 server.go:1437] "Using cgroup driver setting received from the CRI runtime" cgroupDriver="systemd" Jan 28 20:39:32 crc kubenswrapper[4746]: I0128 20:39:32.730583 4746 
fs.go:133] Filesystem UUIDs: map[0b076daa-c26a-46d2-b3a6-72a8dbc6e257:/dev/vda4 2026-01-28-20-35-19-00:/dev/sr0 7B77-95E7:/dev/vda2 de0497b0-db1b-465a-b278-03db02455c71:/dev/vda3] Jan 28 20:39:32 crc kubenswrapper[4746]: I0128 20:39:32.730625 4746 fs.go:134] Filesystem partitions: map[/dev/shm:{mountpoint:/dev/shm major:0 minor:22 fsType:tmpfs blockSize:0} /dev/vda3:{mountpoint:/boot major:252 minor:3 fsType:ext4 blockSize:0} /dev/vda4:{mountpoint:/var major:252 minor:4 fsType:xfs blockSize:0} /run:{mountpoint:/run major:0 minor:24 fsType:tmpfs blockSize:0} /run/user/1000:{mountpoint:/run/user/1000 major:0 minor:42 fsType:tmpfs blockSize:0} /tmp:{mountpoint:/tmp major:0 minor:30 fsType:tmpfs blockSize:0} /var/lib/etcd:{mountpoint:/var/lib/etcd major:0 minor:43 fsType:tmpfs blockSize:0}] Jan 28 20:39:32 crc kubenswrapper[4746]: I0128 20:39:32.744237 4746 manager.go:217] Machine: {Timestamp:2026-01-28 20:39:32.742699507 +0000 UTC m=+0.698885881 CPUVendorID:AuthenticAMD NumCores:12 NumPhysicalCores:1 NumSockets:12 CpuFrequency:2799998 MemoryCapacity:33654128640 SwapCapacity:0 MemoryByType:map[] NVMInfo:{MemoryModeCapacity:0 AppDirectModeCapacity:0 AvgPowerBudget:0} HugePages:[{PageSize:1048576 NumPages:0} {PageSize:2048 NumPages:0}] MachineID:21801e6708c44f15b81395eb736a7cec SystemUUID:e89ecf32-8beb-4b41-b6df-0f1293ce0213 BootID:349dd4ef-f3ea-4c41-bfa2-75ea02498ab0 Filesystems:[{Device:/run DeviceMajor:0 DeviceMinor:24 Capacity:6730825728 Type:vfs Inodes:819200 HasInodes:true} {Device:/dev/vda4 DeviceMajor:252 DeviceMinor:4 Capacity:85292941312 Type:vfs Inodes:41679680 HasInodes:true} {Device:/tmp DeviceMajor:0 DeviceMinor:30 Capacity:16827064320 Type:vfs Inodes:1048576 HasInodes:true} {Device:/dev/vda3 DeviceMajor:252 DeviceMinor:3 Capacity:366869504 Type:vfs Inodes:98304 HasInodes:true} {Device:/run/user/1000 DeviceMajor:0 DeviceMinor:42 Capacity:3365412864 Type:vfs Inodes:821634 HasInodes:true} {Device:/var/lib/etcd DeviceMajor:0 DeviceMinor:43 Capacity:1073741824 
Type:vfs Inodes:4108170 HasInodes:true} {Device:/dev/shm DeviceMajor:0 DeviceMinor:22 Capacity:16827064320 Type:vfs Inodes:4108170 HasInodes:true}] DiskMap:map[252:0:{Name:vda Major:252 Minor:0 Size:214748364800 Scheduler:none}] NetworkDevices:[{Name:br-ex MacAddress:fa:16:3e:27:5f:68 Speed:0 Mtu:1500} {Name:br-int MacAddress:d6:39:55:2e:22:71 Speed:0 Mtu:1400} {Name:ens3 MacAddress:fa:16:3e:27:5f:68 Speed:-1 Mtu:1500} {Name:ens7 MacAddress:fa:16:3e:c2:c3:93 Speed:-1 Mtu:1500} {Name:ens7.20 MacAddress:52:54:00:62:17:41 Speed:-1 Mtu:1496} {Name:ens7.21 MacAddress:52:54:00:9d:81:f4 Speed:-1 Mtu:1496} {Name:ens7.22 MacAddress:52:54:00:c0:25:e9 Speed:-1 Mtu:1496} {Name:eth10 MacAddress:02:5a:47:d2:48:a6 Speed:0 Mtu:1500} {Name:ovn-k8s-mp0 MacAddress:0a:58:0a:d9:00:02 Speed:0 Mtu:1400} {Name:ovs-system MacAddress:fa:36:5f:b9:c1:7b Speed:0 Mtu:1500}] Topology:[{Id:0 Memory:33654128640 HugePages:[{PageSize:1048576 NumPages:0} {PageSize:2048 NumPages:0}] Cores:[{Id:0 Threads:[0] Caches:[{Id:0 Size:32768 Type:Data Level:1} {Id:0 Size:32768 Type:Instruction Level:1} {Id:0 Size:524288 Type:Unified Level:2}] UncoreCaches:[{Id:0 Size:16777216 Type:Unified Level:3}] SocketID:0 BookID: DrawerID:} {Id:0 Threads:[1] Caches:[{Id:1 Size:32768 Type:Data Level:1} {Id:1 Size:32768 Type:Instruction Level:1} {Id:1 Size:524288 Type:Unified Level:2}] UncoreCaches:[{Id:1 Size:16777216 Type:Unified Level:3}] SocketID:1 BookID: DrawerID:} {Id:0 Threads:[10] Caches:[{Id:10 Size:32768 Type:Data Level:1} {Id:10 Size:32768 Type:Instruction Level:1} {Id:10 Size:524288 Type:Unified Level:2}] UncoreCaches:[{Id:10 Size:16777216 Type:Unified Level:3}] SocketID:10 BookID: DrawerID:} {Id:0 Threads:[11] Caches:[{Id:11 Size:32768 Type:Data Level:1} {Id:11 Size:32768 Type:Instruction Level:1} {Id:11 Size:524288 Type:Unified Level:2}] UncoreCaches:[{Id:11 Size:16777216 Type:Unified Level:3}] SocketID:11 BookID: DrawerID:} {Id:0 Threads:[2] Caches:[{Id:2 Size:32768 Type:Data Level:1} {Id:2 Size:32768 
Type:Instruction Level:1} {Id:2 Size:524288 Type:Unified Level:2}] UncoreCaches:[{Id:2 Size:16777216 Type:Unified Level:3}] SocketID:2 BookID: DrawerID:} {Id:0 Threads:[3] Caches:[{Id:3 Size:32768 Type:Data Level:1} {Id:3 Size:32768 Type:Instruction Level:1} {Id:3 Size:524288 Type:Unified Level:2}] UncoreCaches:[{Id:3 Size:16777216 Type:Unified Level:3}] SocketID:3 BookID: DrawerID:} {Id:0 Threads:[4] Caches:[{Id:4 Size:32768 Type:Data Level:1} {Id:4 Size:32768 Type:Instruction Level:1} {Id:4 Size:524288 Type:Unified Level:2}] UncoreCaches:[{Id:4 Size:16777216 Type:Unified Level:3}] SocketID:4 BookID: DrawerID:} {Id:0 Threads:[5] Caches:[{Id:5 Size:32768 Type:Data Level:1} {Id:5 Size:32768 Type:Instruction Level:1} {Id:5 Size:524288 Type:Unified Level:2}] UncoreCaches:[{Id:5 Size:16777216 Type:Unified Level:3}] SocketID:5 BookID: DrawerID:} {Id:0 Threads:[6] Caches:[{Id:6 Size:32768 Type:Data Level:1} {Id:6 Size:32768 Type:Instruction Level:1} {Id:6 Size:524288 Type:Unified Level:2}] UncoreCaches:[{Id:6 Size:16777216 Type:Unified Level:3}] SocketID:6 BookID: DrawerID:} {Id:0 Threads:[7] Caches:[{Id:7 Size:32768 Type:Data Level:1} {Id:7 Size:32768 Type:Instruction Level:1} {Id:7 Size:524288 Type:Unified Level:2}] UncoreCaches:[{Id:7 Size:16777216 Type:Unified Level:3}] SocketID:7 BookID: DrawerID:} {Id:0 Threads:[8] Caches:[{Id:8 Size:32768 Type:Data Level:1} {Id:8 Size:32768 Type:Instruction Level:1} {Id:8 Size:524288 Type:Unified Level:2}] UncoreCaches:[{Id:8 Size:16777216 Type:Unified Level:3}] SocketID:8 BookID: DrawerID:} {Id:0 Threads:[9] Caches:[{Id:9 Size:32768 Type:Data Level:1} {Id:9 Size:32768 Type:Instruction Level:1} {Id:9 Size:524288 Type:Unified Level:2}] UncoreCaches:[{Id:9 Size:16777216 Type:Unified Level:3}] SocketID:9 BookID: DrawerID:}] Caches:[] Distances:[10]}] CloudProvider:Unknown InstanceType:Unknown InstanceID:None} Jan 28 20:39:32 crc kubenswrapper[4746]: I0128 20:39:32.744503 4746 manager_no_libpfm.go:29] cAdvisor is build without cgo 
and/or libpfm support. Perf event counters are not available. Jan 28 20:39:32 crc kubenswrapper[4746]: I0128 20:39:32.744774 4746 manager.go:233] Version: {KernelVersion:5.14.0-427.50.2.el9_4.x86_64 ContainerOsVersion:Red Hat Enterprise Linux CoreOS 418.94.202502100215-0 DockerVersion: DockerAPIVersion: CadvisorVersion: CadvisorRevision:} Jan 28 20:39:32 crc kubenswrapper[4746]: I0128 20:39:32.747057 4746 swap_util.go:113] "Swap is on" /proc/swaps contents="Filename\t\t\t\tType\t\tSize\t\tUsed\t\tPriority" Jan 28 20:39:32 crc kubenswrapper[4746]: I0128 20:39:32.747450 4746 container_manager_linux.go:267] "Container manager verified user specified cgroup-root exists" cgroupRoot=[] Jan 28 20:39:32 crc kubenswrapper[4746]: I0128 20:39:32.747506 4746 container_manager_linux.go:272] "Creating Container Manager object based on Node Config" nodeConfig={"NodeName":"crc","RuntimeCgroupsName":"/system.slice/crio.service","SystemCgroupsName":"/system.slice","KubeletCgroupsName":"","KubeletOOMScoreAdj":-999,"ContainerRuntime":"","CgroupsPerQOS":true,"CgroupRoot":"/","CgroupDriver":"systemd","KubeletRootDir":"/var/lib/kubelet","ProtectKernelDefaults":true,"KubeReservedCgroupName":"","SystemReservedCgroupName":"","ReservedSystemCPUs":{},"EnforceNodeAllocatable":{"pods":{}},"KubeReserved":null,"SystemReserved":{"cpu":"200m","ephemeral-storage":"350Mi","memory":"350Mi"},"HardEvictionThresholds":[{"Signal":"nodefs.inodesFree","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.05},"GracePeriod":0,"MinReclaim":null},{"Signal":"imagefs.available","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.15},"GracePeriod":0,"MinReclaim":null},{"Signal":"imagefs.inodesFree","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.05},"GracePeriod":0,"MinReclaim":null},{"Signal":"memory.available","Operator":"LessThan","Value":{"Quantity":"100Mi","Percentage":0},"GracePeriod":0,"MinReclaim":null},{"Signal":"nodefs.available","Operator":"LessThan","Value":{"Quantity":nu
ll,"Percentage":0.1},"GracePeriod":0,"MinReclaim":null}],"QOSReserved":{},"CPUManagerPolicy":"none","CPUManagerPolicyOptions":null,"TopologyManagerScope":"container","CPUManagerReconcilePeriod":10000000000,"ExperimentalMemoryManagerPolicy":"None","ExperimentalMemoryManagerReservedMemory":null,"PodPidsLimit":4096,"EnforceCPULimits":true,"CPUCFSQuotaPeriod":100000000,"TopologyManagerPolicy":"none","TopologyManagerPolicyOptions":null,"CgroupVersion":2} Jan 28 20:39:32 crc kubenswrapper[4746]: I0128 20:39:32.747804 4746 topology_manager.go:138] "Creating topology manager with none policy" Jan 28 20:39:32 crc kubenswrapper[4746]: I0128 20:39:32.747820 4746 container_manager_linux.go:303] "Creating device plugin manager" Jan 28 20:39:32 crc kubenswrapper[4746]: I0128 20:39:32.748330 4746 manager.go:142] "Creating Device Plugin manager" path="/var/lib/kubelet/device-plugins/kubelet.sock" Jan 28 20:39:32 crc kubenswrapper[4746]: I0128 20:39:32.748377 4746 server.go:66] "Creating device plugin registration server" version="v1beta1" socket="/var/lib/kubelet/device-plugins/kubelet.sock" Jan 28 20:39:32 crc kubenswrapper[4746]: I0128 20:39:32.748743 4746 state_mem.go:36] "Initialized new in-memory state store" Jan 28 20:39:32 crc kubenswrapper[4746]: I0128 20:39:32.749312 4746 server.go:1245] "Using root directory" path="/var/lib/kubelet" Jan 28 20:39:32 crc kubenswrapper[4746]: I0128 20:39:32.754725 4746 kubelet.go:418] "Attempting to sync node with API server" Jan 28 20:39:32 crc kubenswrapper[4746]: I0128 20:39:32.754756 4746 kubelet.go:313] "Adding static pod path" path="/etc/kubernetes/manifests" Jan 28 20:39:32 crc kubenswrapper[4746]: I0128 20:39:32.754813 4746 file.go:69] "Watching path" path="/etc/kubernetes/manifests" Jan 28 20:39:32 crc kubenswrapper[4746]: I0128 20:39:32.754832 4746 kubelet.go:324] "Adding apiserver pod source" Jan 28 20:39:32 crc kubenswrapper[4746]: I0128 20:39:32.754850 4746 apiserver.go:42] "Waiting for node sync before watching apiserver pods" 
Jan 28 20:39:32 crc kubenswrapper[4746]: I0128 20:39:32.759306 4746 kuberuntime_manager.go:262] "Container runtime initialized" containerRuntime="cri-o" version="1.31.5-4.rhaos4.18.gitdad78d5.el9" apiVersion="v1" Jan 28 20:39:32 crc kubenswrapper[4746]: I0128 20:39:32.760337 4746 certificate_store.go:130] Loading cert/key pair from "/var/lib/kubelet/pki/kubelet-server-current.pem". Jan 28 20:39:32 crc kubenswrapper[4746]: W0128 20:39:32.762039 4746 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.Service: Get "https://api-int.crc.testing:6443/api/v1/services?fieldSelector=spec.clusterIP%21%3DNone&limit=500&resourceVersion=0": dial tcp 38.102.83.201:6443: connect: connection refused Jan 28 20:39:32 crc kubenswrapper[4746]: E0128 20:39:32.762178 4746 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.Service: failed to list *v1.Service: Get \"https://api-int.crc.testing:6443/api/v1/services?fieldSelector=spec.clusterIP%21%3DNone&limit=500&resourceVersion=0\": dial tcp 38.102.83.201:6443: connect: connection refused" logger="UnhandledError" Jan 28 20:39:32 crc kubenswrapper[4746]: W0128 20:39:32.762041 4746 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.Node: Get "https://api-int.crc.testing:6443/api/v1/nodes?fieldSelector=metadata.name%3Dcrc&limit=500&resourceVersion=0": dial tcp 38.102.83.201:6443: connect: connection refused Jan 28 20:39:32 crc kubenswrapper[4746]: E0128 20:39:32.762228 4746 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.Node: failed to list *v1.Node: Get \"https://api-int.crc.testing:6443/api/v1/nodes?fieldSelector=metadata.name%3Dcrc&limit=500&resourceVersion=0\": dial tcp 38.102.83.201:6443: connect: connection refused" logger="UnhandledError" Jan 28 20:39:32 crc kubenswrapper[4746]: I0128 20:39:32.763477 4746 kubelet.go:854] "Not starting ClusterTrustBundle informer because 
we are in static kubelet mode"
Jan 28 20:39:32 crc kubenswrapper[4746]: I0128 20:39:32.765158 4746 plugins.go:603] "Loaded volume plugin" pluginName="kubernetes.io/portworx-volume"
Jan 28 20:39:32 crc kubenswrapper[4746]: I0128 20:39:32.765196 4746 plugins.go:603] "Loaded volume plugin" pluginName="kubernetes.io/empty-dir"
Jan 28 20:39:32 crc kubenswrapper[4746]: I0128 20:39:32.765209 4746 plugins.go:603] "Loaded volume plugin" pluginName="kubernetes.io/git-repo"
Jan 28 20:39:32 crc kubenswrapper[4746]: I0128 20:39:32.765219 4746 plugins.go:603] "Loaded volume plugin" pluginName="kubernetes.io/host-path"
Jan 28 20:39:32 crc kubenswrapper[4746]: I0128 20:39:32.765235 4746 plugins.go:603] "Loaded volume plugin" pluginName="kubernetes.io/nfs"
Jan 28 20:39:32 crc kubenswrapper[4746]: I0128 20:39:32.765246 4746 plugins.go:603] "Loaded volume plugin" pluginName="kubernetes.io/secret"
Jan 28 20:39:32 crc kubenswrapper[4746]: I0128 20:39:32.765256 4746 plugins.go:603] "Loaded volume plugin" pluginName="kubernetes.io/iscsi"
Jan 28 20:39:32 crc kubenswrapper[4746]: I0128 20:39:32.765271 4746 plugins.go:603] "Loaded volume plugin" pluginName="kubernetes.io/downward-api"
Jan 28 20:39:32 crc kubenswrapper[4746]: I0128 20:39:32.765280 4746 plugins.go:603] "Loaded volume plugin" pluginName="kubernetes.io/fc"
Jan 28 20:39:32 crc kubenswrapper[4746]: I0128 20:39:32.765290 4746 plugins.go:603] "Loaded volume plugin" pluginName="kubernetes.io/configmap"
Jan 28 20:39:32 crc kubenswrapper[4746]: I0128 20:39:32.765327 4746 plugins.go:603] "Loaded volume plugin" pluginName="kubernetes.io/projected"
Jan 28 20:39:32 crc kubenswrapper[4746]: I0128 20:39:32.765338 4746 plugins.go:603] "Loaded volume plugin" pluginName="kubernetes.io/local-volume"
Jan 28 20:39:32 crc kubenswrapper[4746]: I0128 20:39:32.766180 4746 plugins.go:603] "Loaded volume plugin" pluginName="kubernetes.io/csi"
Jan 28 20:39:32 crc kubenswrapper[4746]: I0128 20:39:32.766798 4746 server.go:1280] "Started kubelet"
Jan 28 20:39:32 crc kubenswrapper[4746]: I0128 20:39:32.767690 4746 server.go:163] "Starting to listen" address="0.0.0.0" port=10250
Jan 28 20:39:32 crc kubenswrapper[4746]: I0128 20:39:32.767137 4746 ratelimit.go:55] "Setting rate limiting for endpoint" service="podresources" qps=100 burstTokens=10
Jan 28 20:39:32 crc kubenswrapper[4746]: I0128 20:39:32.767952 4746 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: Get "https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csinodes/crc?resourceVersion=0": dial tcp 38.102.83.201:6443: connect: connection refused
Jan 28 20:39:32 crc systemd[1]: Started Kubernetes Kubelet.
Jan 28 20:39:32 crc kubenswrapper[4746]: I0128 20:39:32.774744 4746 server.go:236] "Starting to serve the podresources API" endpoint="unix:/var/lib/kubelet/pod-resources/kubelet.sock"
Jan 28 20:39:32 crc kubenswrapper[4746]: I0128 20:39:32.776036 4746 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate rotation is enabled
Jan 28 20:39:32 crc kubenswrapper[4746]: I0128 20:39:32.776113 4746 fs_resource_analyzer.go:67] "Starting FS ResourceAnalyzer"
Jan 28 20:39:32 crc kubenswrapper[4746]: I0128 20:39:32.776600 4746 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2025-11-09 16:32:28.922791581 +0000 UTC
Jan 28 20:39:32 crc kubenswrapper[4746]: I0128 20:39:32.776795 4746 server.go:460] "Adding debug handlers to kubelet server"
Jan 28 20:39:32 crc kubenswrapper[4746]: I0128 20:39:32.776846 4746 volume_manager.go:287] "The desired_state_of_world populator starts"
Jan 28 20:39:32 crc kubenswrapper[4746]: I0128 20:39:32.776864 4746 volume_manager.go:289] "Starting Kubelet Volume Manager"
Jan 28 20:39:32 crc kubenswrapper[4746]: E0128 20:39:32.776867 4746 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found"
Jan 28 20:39:32 crc kubenswrapper[4746]: E0128 20:39:32.778448 4746 controller.go:145] "Failed to ensure lease exists, will retry" err="Get \"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/crc?timeout=10s\": dial tcp 38.102.83.201:6443: connect: connection refused" interval="200ms"
Jan 28 20:39:32 crc kubenswrapper[4746]: I0128 20:39:32.778578 4746 desired_state_of_world_populator.go:146] "Desired state populator starts to run"
Jan 28 20:39:32 crc kubenswrapper[4746]: W0128 20:39:32.778645 4746 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.CSIDriver: Get "https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csidrivers?limit=500&resourceVersion=0": dial tcp 38.102.83.201:6443: connect: connection refused
Jan 28 20:39:32 crc kubenswrapper[4746]: E0128 20:39:32.778716 4746 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.CSIDriver: failed to list *v1.CSIDriver: Get \"https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csidrivers?limit=500&resourceVersion=0\": dial tcp 38.102.83.201:6443: connect: connection refused" logger="UnhandledError"
Jan 28 20:39:32 crc kubenswrapper[4746]: I0128 20:39:32.780480 4746 factory.go:219] Registration of the containerd container factory failed: unable to create containerd client: containerd: cannot unix dial containerd api service: dial unix /run/containerd/containerd.sock: connect: no such file or directory
Jan 28 20:39:32 crc kubenswrapper[4746]: I0128 20:39:32.780623 4746 factory.go:55] Registering systemd factory
Jan 28 20:39:32 crc kubenswrapper[4746]: I0128 20:39:32.780709 4746 factory.go:221] Registration of the systemd container factory successfully
Jan 28 20:39:32 crc kubenswrapper[4746]: I0128 20:39:32.782282 4746 factory.go:153] Registering CRI-O factory
Jan 28 20:39:32 crc kubenswrapper[4746]: I0128 20:39:32.782312 4746 factory.go:221] Registration of the crio container factory successfully
Jan 28 20:39:32 crc kubenswrapper[4746]: I0128 
20:39:32.782334 4746 factory.go:103] Registering Raw factory Jan 28 20:39:32 crc kubenswrapper[4746]: I0128 20:39:32.782349 4746 manager.go:1196] Started watching for new ooms in manager Jan 28 20:39:32 crc kubenswrapper[4746]: I0128 20:39:32.783207 4746 manager.go:319] Starting recovery of all containers Jan 28 20:39:32 crc kubenswrapper[4746]: E0128 20:39:32.784722 4746 event.go:368] "Unable to write event (may retry after sleeping)" err="Post \"https://api-int.crc.testing:6443/api/v1/namespaces/default/events\": dial tcp 38.102.83.201:6443: connect: connection refused" event="&Event{ObjectMeta:{crc.188effa183a72472 default 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Node,Namespace:,Name:crc,UID:crc,APIVersion:,ResourceVersion:,FieldPath:,},Reason:Starting,Message:Starting kubelet.,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-01-28 20:39:32.76677029 +0000 UTC m=+0.722956644,LastTimestamp:2026-01-28 20:39:32.76677029 +0000 UTC m=+0.722956644,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Jan 28 20:39:32 crc kubenswrapper[4746]: I0128 20:39:32.792969 4746 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="09ae3b1a-e8e7-4524-b54b-61eab6f9239a" volumeName="kubernetes.io/configmap/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-trusted-ca-bundle" seLinuxMountContext="" Jan 28 20:39:32 crc kubenswrapper[4746]: I0128 20:39:32.793033 4746 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="09ae3b1a-e8e7-4524-b54b-61eab6f9239a" volumeName="kubernetes.io/secret/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-serving-cert" seLinuxMountContext="" Jan 28 20:39:32 crc kubenswrapper[4746]: I0128 20:39:32.793054 4746 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" 
podName="1386a44e-36a2-460c-96d0-0359d2b6f0f5" volumeName="kubernetes.io/secret/1386a44e-36a2-460c-96d0-0359d2b6f0f5-serving-cert" seLinuxMountContext="" Jan 28 20:39:32 crc kubenswrapper[4746]: I0128 20:39:32.793069 4746 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="1bf7eb37-55a3-4c65-b768-a94c82151e69" volumeName="kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-etcd-serving-ca" seLinuxMountContext="" Jan 28 20:39:32 crc kubenswrapper[4746]: I0128 20:39:32.793105 4746 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c" volumeName="kubernetes.io/configmap/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c-config" seLinuxMountContext="" Jan 28 20:39:32 crc kubenswrapper[4746]: I0128 20:39:32.793120 4746 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="25e176fe-21b4-4974-b1ed-c8b94f112a7f" volumeName="kubernetes.io/secret/25e176fe-21b4-4974-b1ed-c8b94f112a7f-signing-key" seLinuxMountContext="" Jan 28 20:39:32 crc kubenswrapper[4746]: I0128 20:39:32.793135 4746 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="49c341d1-5089-4bc2-86a0-a5e165cfcc6b" volumeName="kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-template-error" seLinuxMountContext="" Jan 28 20:39:32 crc kubenswrapper[4746]: I0128 20:39:32.793150 4746 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="4bb40260-dbaa-4fb0-84df-5e680505d512" volumeName="kubernetes.io/projected/4bb40260-dbaa-4fb0-84df-5e680505d512-kube-api-access-2w9zh" seLinuxMountContext="" Jan 28 20:39:32 crc kubenswrapper[4746]: I0128 20:39:32.793168 4746 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="6402fda4-df10-493c-b4e5-d0569419652d" 
volumeName="kubernetes.io/configmap/6402fda4-df10-493c-b4e5-d0569419652d-config" seLinuxMountContext="" Jan 28 20:39:32 crc kubenswrapper[4746]: I0128 20:39:32.793212 4746 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="6509e943-70c6-444c-bc41-48a544e36fbd" volumeName="kubernetes.io/projected/6509e943-70c6-444c-bc41-48a544e36fbd-kube-api-access-6g6sz" seLinuxMountContext="" Jan 28 20:39:32 crc kubenswrapper[4746]: I0128 20:39:32.793235 4746 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="7583ce53-e0fe-4a16-9e4d-50516596a136" volumeName="kubernetes.io/configmap/7583ce53-e0fe-4a16-9e4d-50516596a136-proxy-ca-bundles" seLinuxMountContext="" Jan 28 20:39:32 crc kubenswrapper[4746]: I0128 20:39:32.793300 4746 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="8cea82b4-6893-4ddc-af9f-1bb5ae425c5b" volumeName="kubernetes.io/secret/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b-serving-cert" seLinuxMountContext="" Jan 28 20:39:32 crc kubenswrapper[4746]: I0128 20:39:32.793314 4746 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="d75a4c96-2883-4a0b-bab2-0fab2b6c0b49" volumeName="kubernetes.io/configmap/d75a4c96-2883-4a0b-bab2-0fab2b6c0b49-iptables-alerter-script" seLinuxMountContext="" Jan 28 20:39:32 crc kubenswrapper[4746]: I0128 20:39:32.793334 4746 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c" volumeName="kubernetes.io/secret/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c-serving-cert" seLinuxMountContext="" Jan 28 20:39:32 crc kubenswrapper[4746]: I0128 20:39:32.793353 4746 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="3ab1a177-2de0-46d9-b765-d0d0649bb42e" 
volumeName="kubernetes.io/projected/3ab1a177-2de0-46d9-b765-d0d0649bb42e-kube-api-access-4d4hj" seLinuxMountContext="" Jan 28 20:39:32 crc kubenswrapper[4746]: I0128 20:39:32.793376 4746 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="49c341d1-5089-4bc2-86a0-a5e165cfcc6b" volumeName="kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-idp-0-file-data" seLinuxMountContext="" Jan 28 20:39:32 crc kubenswrapper[4746]: I0128 20:39:32.793394 4746 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="a31745f5-9847-4afe-82a5-3161cc66ca93" volumeName="kubernetes.io/projected/a31745f5-9847-4afe-82a5-3161cc66ca93-kube-api-access-lz9wn" seLinuxMountContext="" Jan 28 20:39:32 crc kubenswrapper[4746]: I0128 20:39:32.793411 4746 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="b6cd30de-2eeb-49a2-ab40-9167f4560ff5" volumeName="kubernetes.io/configmap/b6cd30de-2eeb-49a2-ab40-9167f4560ff5-marketplace-trusted-ca" seLinuxMountContext="" Jan 28 20:39:32 crc kubenswrapper[4746]: I0128 20:39:32.793427 4746 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="8f668bae-612b-4b75-9490-919e737c6a3b" volumeName="kubernetes.io/projected/8f668bae-612b-4b75-9490-919e737c6a3b-kube-api-access-kfwg7" seLinuxMountContext="" Jan 28 20:39:32 crc kubenswrapper[4746]: I0128 20:39:32.793444 4746 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="9d4552c7-cd75-42dd-8880-30dd377c49a4" volumeName="kubernetes.io/secret/9d4552c7-cd75-42dd-8880-30dd377c49a4-serving-cert" seLinuxMountContext="" Jan 28 20:39:32 crc kubenswrapper[4746]: I0128 20:39:32.793467 4746 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="01ab3dd5-8196-46d0-ad33-122e2ca51def" 
volumeName="kubernetes.io/configmap/01ab3dd5-8196-46d0-ad33-122e2ca51def-config" seLinuxMountContext="" Jan 28 20:39:32 crc kubenswrapper[4746]: I0128 20:39:32.793489 4746 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="43509403-f426-496e-be36-56cef71462f5" volumeName="kubernetes.io/projected/43509403-f426-496e-be36-56cef71462f5-kube-api-access-qg5z5" seLinuxMountContext="" Jan 28 20:39:32 crc kubenswrapper[4746]: I0128 20:39:32.793509 4746 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="57a731c4-ef35-47a8-b875-bfb08a7f8011" volumeName="kubernetes.io/projected/57a731c4-ef35-47a8-b875-bfb08a7f8011-kube-api-access-cfbct" seLinuxMountContext="" Jan 28 20:39:32 crc kubenswrapper[4746]: I0128 20:39:32.793583 4746 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="5b88f790-22fa-440e-b583-365168c0b23d" volumeName="kubernetes.io/projected/5b88f790-22fa-440e-b583-365168c0b23d-kube-api-access-jkwtn" seLinuxMountContext="" Jan 28 20:39:32 crc kubenswrapper[4746]: I0128 20:39:32.793604 4746 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="6ea678ab-3438-413e-bfe3-290ae7725660" volumeName="kubernetes.io/configmap/6ea678ab-3438-413e-bfe3-290ae7725660-env-overrides" seLinuxMountContext="" Jan 28 20:39:32 crc kubenswrapper[4746]: I0128 20:39:32.793627 4746 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="6ea678ab-3438-413e-bfe3-290ae7725660" volumeName="kubernetes.io/configmap/6ea678ab-3438-413e-bfe3-290ae7725660-ovnkube-config" seLinuxMountContext="" Jan 28 20:39:32 crc kubenswrapper[4746]: I0128 20:39:32.793646 4746 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="7539238d-5fe0-46ed-884e-1c3b566537ec" 
volumeName="kubernetes.io/configmap/7539238d-5fe0-46ed-884e-1c3b566537ec-config" seLinuxMountContext="" Jan 28 20:39:32 crc kubenswrapper[4746]: I0128 20:39:32.793661 4746 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="7583ce53-e0fe-4a16-9e4d-50516596a136" volumeName="kubernetes.io/configmap/7583ce53-e0fe-4a16-9e4d-50516596a136-config" seLinuxMountContext="" Jan 28 20:39:32 crc kubenswrapper[4746]: I0128 20:39:32.793677 4746 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="8f668bae-612b-4b75-9490-919e737c6a3b" volumeName="kubernetes.io/projected/8f668bae-612b-4b75-9490-919e737c6a3b-bound-sa-token" seLinuxMountContext="" Jan 28 20:39:32 crc kubenswrapper[4746]: I0128 20:39:32.793695 4746 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="9d4552c7-cd75-42dd-8880-30dd377c49a4" volumeName="kubernetes.io/configmap/9d4552c7-cd75-42dd-8880-30dd377c49a4-trusted-ca" seLinuxMountContext="" Jan 28 20:39:32 crc kubenswrapper[4746]: I0128 20:39:32.793713 4746 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="bf126b07-da06-4140-9a57-dfd54fc6b486" volumeName="kubernetes.io/projected/bf126b07-da06-4140-9a57-dfd54fc6b486-kube-api-access-rnphk" seLinuxMountContext="" Jan 28 20:39:32 crc kubenswrapper[4746]: I0128 20:39:32.793733 4746 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="f88749ec-7931-4ee7-b3fc-1ec5e11f92e9" volumeName="kubernetes.io/secret/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9-profile-collector-cert" seLinuxMountContext="" Jan 28 20:39:32 crc kubenswrapper[4746]: I0128 20:39:32.793748 4746 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="37a5e44f-9a88-4405-be8a-b645485e7312" volumeName="kubernetes.io/projected/37a5e44f-9a88-4405-be8a-b645485e7312-kube-api-access-rdwmf" 
seLinuxMountContext="" Jan 28 20:39:32 crc kubenswrapper[4746]: I0128 20:39:32.793789 4746 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="4bb40260-dbaa-4fb0-84df-5e680505d512" volumeName="kubernetes.io/configmap/4bb40260-dbaa-4fb0-84df-5e680505d512-multus-daemon-config" seLinuxMountContext="" Jan 28 20:39:32 crc kubenswrapper[4746]: I0128 20:39:32.793812 4746 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="5441d097-087c-4d9a-baa8-b210afa90fc9" volumeName="kubernetes.io/configmap/5441d097-087c-4d9a-baa8-b210afa90fc9-config" seLinuxMountContext="" Jan 28 20:39:32 crc kubenswrapper[4746]: I0128 20:39:32.793833 4746 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="7583ce53-e0fe-4a16-9e4d-50516596a136" volumeName="kubernetes.io/projected/7583ce53-e0fe-4a16-9e4d-50516596a136-kube-api-access-xcphl" seLinuxMountContext="" Jan 28 20:39:32 crc kubenswrapper[4746]: I0128 20:39:32.793855 4746 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="a31745f5-9847-4afe-82a5-3161cc66ca93" volumeName="kubernetes.io/secret/a31745f5-9847-4afe-82a5-3161cc66ca93-metrics-tls" seLinuxMountContext="" Jan 28 20:39:32 crc kubenswrapper[4746]: I0128 20:39:32.793875 4746 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="6ea678ab-3438-413e-bfe3-290ae7725660" volumeName="kubernetes.io/secret/6ea678ab-3438-413e-bfe3-290ae7725660-ovn-node-metrics-cert" seLinuxMountContext="" Jan 28 20:39:32 crc kubenswrapper[4746]: I0128 20:39:32.793892 4746 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="925f1c65-6136-48ba-85aa-3a3b50560753" volumeName="kubernetes.io/configmap/925f1c65-6136-48ba-85aa-3a3b50560753-ovnkube-config" seLinuxMountContext="" Jan 28 20:39:32 crc kubenswrapper[4746]: I0128 20:39:32.793911 
4746 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="ef543e1b-8068-4ea3-b32a-61027b32e95d" volumeName="kubernetes.io/secret/ef543e1b-8068-4ea3-b32a-61027b32e95d-webhook-cert" seLinuxMountContext="" Jan 28 20:39:32 crc kubenswrapper[4746]: I0128 20:39:32.793931 4746 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="09ae3b1a-e8e7-4524-b54b-61eab6f9239a" volumeName="kubernetes.io/secret/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-etcd-client" seLinuxMountContext="" Jan 28 20:39:32 crc kubenswrapper[4746]: I0128 20:39:32.793949 4746 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="22c825df-677d-4ca6-82db-3454ed06e783" volumeName="kubernetes.io/configmap/22c825df-677d-4ca6-82db-3454ed06e783-auth-proxy-config" seLinuxMountContext="" Jan 28 20:39:32 crc kubenswrapper[4746]: I0128 20:39:32.793968 4746 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="496e6271-fb68-4057-954e-a0d97a4afa3f" volumeName="kubernetes.io/secret/496e6271-fb68-4057-954e-a0d97a4afa3f-serving-cert" seLinuxMountContext="" Jan 28 20:39:32 crc kubenswrapper[4746]: I0128 20:39:32.793990 4746 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="49ef4625-1d3a-4a9f-b595-c2433d32326d" volumeName="kubernetes.io/projected/49ef4625-1d3a-4a9f-b595-c2433d32326d-kube-api-access-pjr6v" seLinuxMountContext="" Jan 28 20:39:32 crc kubenswrapper[4746]: I0128 20:39:32.794010 4746 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="5225d0e4-402f-4861-b410-819f433b1803" volumeName="kubernetes.io/projected/5225d0e4-402f-4861-b410-819f433b1803-kube-api-access-9xfj7" seLinuxMountContext="" Jan 28 20:39:32 crc kubenswrapper[4746]: I0128 20:39:32.794028 4746 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" 
pod="" podName="5441d097-087c-4d9a-baa8-b210afa90fc9" volumeName="kubernetes.io/projected/5441d097-087c-4d9a-baa8-b210afa90fc9-kube-api-access-2d4wz" seLinuxMountContext="" Jan 28 20:39:32 crc kubenswrapper[4746]: I0128 20:39:32.794046 4746 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="a31745f5-9847-4afe-82a5-3161cc66ca93" volumeName="kubernetes.io/configmap/a31745f5-9847-4afe-82a5-3161cc66ca93-trusted-ca" seLinuxMountContext="" Jan 28 20:39:32 crc kubenswrapper[4746]: I0128 20:39:32.794065 4746 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="3b6479f0-333b-4a96-9adf-2099afdc2447" volumeName="kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr" seLinuxMountContext="" Jan 28 20:39:32 crc kubenswrapper[4746]: I0128 20:39:32.794135 4746 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="3cb93b32-e0ae-4377-b9c8-fdb9842c6d59" volumeName="kubernetes.io/projected/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59-kube-api-access-wxkg8" seLinuxMountContext="" Jan 28 20:39:32 crc kubenswrapper[4746]: I0128 20:39:32.794157 4746 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="b6312bbd-5731-4ea0-a20f-81d5a57df44a" volumeName="kubernetes.io/secret/b6312bbd-5731-4ea0-a20f-81d5a57df44a-srv-cert" seLinuxMountContext="" Jan 28 20:39:32 crc kubenswrapper[4746]: I0128 20:39:32.794184 4746 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="0b78653f-4ff9-4508-8672-245ed9b561e3" volumeName="kubernetes.io/secret/0b78653f-4ff9-4508-8672-245ed9b561e3-serving-cert" seLinuxMountContext="" Jan 28 20:39:32 crc kubenswrapper[4746]: I0128 20:39:32.794204 4746 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="1d611f23-29be-4491-8495-bee1670e935f" 
volumeName="kubernetes.io/projected/1d611f23-29be-4491-8495-bee1670e935f-kube-api-access-bf2bz" seLinuxMountContext="" Jan 28 20:39:32 crc kubenswrapper[4746]: I0128 20:39:32.794232 4746 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="20b0d48f-5fd6-431c-a545-e3c800c7b866" volumeName="kubernetes.io/projected/20b0d48f-5fd6-431c-a545-e3c800c7b866-kube-api-access-w9rds" seLinuxMountContext="" Jan 28 20:39:32 crc kubenswrapper[4746]: I0128 20:39:32.794265 4746 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="31d8b7a1-420e-4252-a5b7-eebe8a111292" volumeName="kubernetes.io/secret/31d8b7a1-420e-4252-a5b7-eebe8a111292-proxy-tls" seLinuxMountContext="" Jan 28 20:39:32 crc kubenswrapper[4746]: I0128 20:39:32.794287 4746 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="43509403-f426-496e-be36-56cef71462f5" volumeName="kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-console-config" seLinuxMountContext="" Jan 28 20:39:32 crc kubenswrapper[4746]: I0128 20:39:32.794311 4746 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="6509e943-70c6-444c-bc41-48a544e36fbd" volumeName="kubernetes.io/secret/6509e943-70c6-444c-bc41-48a544e36fbd-serving-cert" seLinuxMountContext="" Jan 28 20:39:32 crc kubenswrapper[4746]: I0128 20:39:32.794334 4746 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="496e6271-fb68-4057-954e-a0d97a4afa3f" volumeName="kubernetes.io/projected/496e6271-fb68-4057-954e-a0d97a4afa3f-kube-api-access" seLinuxMountContext="" Jan 28 20:39:32 crc kubenswrapper[4746]: I0128 20:39:32.794352 4746 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="a31745f5-9847-4afe-82a5-3161cc66ca93" volumeName="kubernetes.io/projected/a31745f5-9847-4afe-82a5-3161cc66ca93-bound-sa-token" 
seLinuxMountContext="" Jan 28 20:39:32 crc kubenswrapper[4746]: I0128 20:39:32.794372 4746 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="fda69060-fa79-4696-b1a6-7980f124bf7c" volumeName="kubernetes.io/secret/fda69060-fa79-4696-b1a6-7980f124bf7c-proxy-tls" seLinuxMountContext="" Jan 28 20:39:32 crc kubenswrapper[4746]: I0128 20:39:32.794393 4746 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="09ae3b1a-e8e7-4524-b54b-61eab6f9239a" volumeName="kubernetes.io/projected/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-kube-api-access-zkvpv" seLinuxMountContext="" Jan 28 20:39:32 crc kubenswrapper[4746]: I0128 20:39:32.794411 4746 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="25e176fe-21b4-4974-b1ed-c8b94f112a7f" volumeName="kubernetes.io/projected/25e176fe-21b4-4974-b1ed-c8b94f112a7f-kube-api-access-d4lsv" seLinuxMountContext="" Jan 28 20:39:32 crc kubenswrapper[4746]: I0128 20:39:32.794429 4746 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="5225d0e4-402f-4861-b410-819f433b1803" volumeName="kubernetes.io/empty-dir/5225d0e4-402f-4861-b410-819f433b1803-catalog-content" seLinuxMountContext="" Jan 28 20:39:32 crc kubenswrapper[4746]: I0128 20:39:32.794449 4746 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="87cf06ed-a83f-41a7-828d-70653580a8cb" volumeName="kubernetes.io/secret/87cf06ed-a83f-41a7-828d-70653580a8cb-metrics-tls" seLinuxMountContext="" Jan 28 20:39:32 crc kubenswrapper[4746]: I0128 20:39:32.794469 4746 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="bf126b07-da06-4140-9a57-dfd54fc6b486" volumeName="kubernetes.io/secret/bf126b07-da06-4140-9a57-dfd54fc6b486-image-registry-operator-tls" seLinuxMountContext="" Jan 28 20:39:32 crc kubenswrapper[4746]: I0128 
20:39:32.794490 4746 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="fda69060-fa79-4696-b1a6-7980f124bf7c" volumeName="kubernetes.io/configmap/fda69060-fa79-4696-b1a6-7980f124bf7c-mcd-auth-proxy-config" seLinuxMountContext="" Jan 28 20:39:32 crc kubenswrapper[4746]: I0128 20:39:32.794510 4746 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="09efc573-dbb6-4249-bd59-9b87aba8dd28" volumeName="kubernetes.io/configmap/09efc573-dbb6-4249-bd59-9b87aba8dd28-etcd-service-ca" seLinuxMountContext="" Jan 28 20:39:32 crc kubenswrapper[4746]: I0128 20:39:32.794562 4746 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="1bf7eb37-55a3-4c65-b768-a94c82151e69" volumeName="kubernetes.io/secret/1bf7eb37-55a3-4c65-b768-a94c82151e69-encryption-config" seLinuxMountContext="" Jan 28 20:39:32 crc kubenswrapper[4746]: I0128 20:39:32.794586 4746 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="22c825df-677d-4ca6-82db-3454ed06e783" volumeName="kubernetes.io/projected/22c825df-677d-4ca6-82db-3454ed06e783-kube-api-access-7c4vf" seLinuxMountContext="" Jan 28 20:39:32 crc kubenswrapper[4746]: I0128 20:39:32.794605 4746 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="25e176fe-21b4-4974-b1ed-c8b94f112a7f" volumeName="kubernetes.io/configmap/25e176fe-21b4-4974-b1ed-c8b94f112a7f-signing-cabundle" seLinuxMountContext="" Jan 28 20:39:32 crc kubenswrapper[4746]: I0128 20:39:32.794623 4746 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="49c341d1-5089-4bc2-86a0-a5e165cfcc6b" volumeName="kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-router-certs" seLinuxMountContext="" Jan 28 20:39:32 crc kubenswrapper[4746]: I0128 20:39:32.794642 4746 reconstruct.go:130] "Volume is marked 
as uncertain and added into the actual state" pod="" podName="7539238d-5fe0-46ed-884e-1c3b566537ec" volumeName="kubernetes.io/secret/7539238d-5fe0-46ed-884e-1c3b566537ec-serving-cert" seLinuxMountContext="" Jan 28 20:39:32 crc kubenswrapper[4746]: I0128 20:39:32.794657 4746 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="87cf06ed-a83f-41a7-828d-70653580a8cb" volumeName="kubernetes.io/configmap/87cf06ed-a83f-41a7-828d-70653580a8cb-config-volume" seLinuxMountContext="" Jan 28 20:39:32 crc kubenswrapper[4746]: I0128 20:39:32.794670 4746 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="a0128f3a-b052-44ed-a84e-c4c8aaf17c13" volumeName="kubernetes.io/secret/a0128f3a-b052-44ed-a84e-c4c8aaf17c13-samples-operator-tls" seLinuxMountContext="" Jan 28 20:39:32 crc kubenswrapper[4746]: I0128 20:39:32.794686 4746 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="49c341d1-5089-4bc2-86a0-a5e165cfcc6b" volumeName="kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-audit-policies" seLinuxMountContext="" Jan 28 20:39:32 crc kubenswrapper[4746]: I0128 20:39:32.794702 4746 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" volumeName="kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert" seLinuxMountContext="" Jan 28 20:39:32 crc kubenswrapper[4746]: I0128 20:39:32.794718 4746 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="6509e943-70c6-444c-bc41-48a544e36fbd" volumeName="kubernetes.io/configmap/6509e943-70c6-444c-bc41-48a544e36fbd-trusted-ca-bundle" seLinuxMountContext="" Jan 28 20:39:32 crc kubenswrapper[4746]: I0128 20:39:32.794732 4746 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" 
podName="7bb08738-c794-4ee8-9972-3a62ca171029" volumeName="kubernetes.io/projected/7bb08738-c794-4ee8-9972-3a62ca171029-kube-api-access-279lb" seLinuxMountContext="" Jan 28 20:39:32 crc kubenswrapper[4746]: I0128 20:39:32.794748 4746 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="b6312bbd-5731-4ea0-a20f-81d5a57df44a" volumeName="kubernetes.io/projected/b6312bbd-5731-4ea0-a20f-81d5a57df44a-kube-api-access-249nr" seLinuxMountContext="" Jan 28 20:39:32 crc kubenswrapper[4746]: I0128 20:39:32.794762 4746 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="1bf7eb37-55a3-4c65-b768-a94c82151e69" volumeName="kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-trusted-ca-bundle" seLinuxMountContext="" Jan 28 20:39:32 crc kubenswrapper[4746]: I0128 20:39:32.794787 4746 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="43509403-f426-496e-be36-56cef71462f5" volumeName="kubernetes.io/secret/43509403-f426-496e-be36-56cef71462f5-console-serving-cert" seLinuxMountContext="" Jan 28 20:39:32 crc kubenswrapper[4746]: I0128 20:39:32.794802 4746 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="5225d0e4-402f-4861-b410-819f433b1803" volumeName="kubernetes.io/empty-dir/5225d0e4-402f-4861-b410-819f433b1803-utilities" seLinuxMountContext="" Jan 28 20:39:32 crc kubenswrapper[4746]: I0128 20:39:32.794818 4746 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="6731426b-95fe-49ff-bb5f-40441049fde2" volumeName="kubernetes.io/projected/6731426b-95fe-49ff-bb5f-40441049fde2-kube-api-access-x7zkh" seLinuxMountContext="" Jan 28 20:39:32 crc kubenswrapper[4746]: I0128 20:39:32.794831 4746 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="b11524ee-3fca-4b1b-9cdf-6da289fdbc7d" 
volumeName="kubernetes.io/projected/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d-kube-api-access-x4zgh" seLinuxMountContext=""
Jan 28 20:39:32 crc kubenswrapper[4746]: I0128 20:39:32.794845 4746 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="c03ee662-fb2f-4fc4-a2c1-af487c19d254" volumeName="kubernetes.io/secret/c03ee662-fb2f-4fc4-a2c1-af487c19d254-default-certificate" seLinuxMountContext=""
Jan 28 20:39:32 crc kubenswrapper[4746]: I0128 20:39:32.794882 4746 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="0b574797-001e-440a-8f4e-c0be86edad0f" volumeName="kubernetes.io/configmap/0b574797-001e-440a-8f4e-c0be86edad0f-mcc-auth-proxy-config" seLinuxMountContext=""
Jan 28 20:39:32 crc kubenswrapper[4746]: I0128 20:39:32.794909 4746 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="1bf7eb37-55a3-4c65-b768-a94c82151e69" volumeName="kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-image-import-ca" seLinuxMountContext=""
Jan 28 20:39:32 crc kubenswrapper[4746]: I0128 20:39:32.794930 4746 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="31d8b7a1-420e-4252-a5b7-eebe8a111292" volumeName="kubernetes.io/configmap/31d8b7a1-420e-4252-a5b7-eebe8a111292-auth-proxy-config" seLinuxMountContext=""
Jan 28 20:39:32 crc kubenswrapper[4746]: I0128 20:39:32.795030 4746 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="5b88f790-22fa-440e-b583-365168c0b23d" volumeName="kubernetes.io/secret/5b88f790-22fa-440e-b583-365168c0b23d-metrics-certs" seLinuxMountContext=""
Jan 28 20:39:32 crc kubenswrapper[4746]: I0128 20:39:32.795046 4746 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" volumeName="kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf" seLinuxMountContext=""
Jan 28 20:39:32 crc kubenswrapper[4746]: I0128 20:39:32.795110 4746 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="8f668bae-612b-4b75-9490-919e737c6a3b" volumeName="kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" seLinuxMountContext=""
Jan 28 20:39:32 crc kubenswrapper[4746]: I0128 20:39:32.802978 4746 reconstruct.go:144] "Volume is marked device as uncertain and added into the actual state" volumeName="kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" deviceMountPath="/var/lib/kubelet/plugins/kubernetes.io/csi/kubevirt.io.hostpath-provisioner/1f4776af88835e41c12b831b4c9fed40233456d14189815a54dbe7f892fc1983/globalmount"
Jan 28 20:39:32 crc kubenswrapper[4746]: I0128 20:39:32.803070 4746 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="9d751cbb-f2e2-430d-9754-c882a5e924a5" volumeName="kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl" seLinuxMountContext=""
Jan 28 20:39:32 crc kubenswrapper[4746]: I0128 20:39:32.803208 4746 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="b6cd30de-2eeb-49a2-ab40-9167f4560ff5" volumeName="kubernetes.io/projected/b6cd30de-2eeb-49a2-ab40-9167f4560ff5-kube-api-access-pj782" seLinuxMountContext=""
Jan 28 20:39:32 crc kubenswrapper[4746]: I0128 20:39:32.803239 4746 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="b6cd30de-2eeb-49a2-ab40-9167f4560ff5" volumeName="kubernetes.io/secret/b6cd30de-2eeb-49a2-ab40-9167f4560ff5-marketplace-operator-metrics" seLinuxMountContext=""
Jan 28 20:39:32 crc kubenswrapper[4746]: I0128 20:39:32.803297 4746 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="cd70aa09-68dd-4d64-bd6f-156fe6d1dc6d" volumeName="kubernetes.io/projected/cd70aa09-68dd-4d64-bd6f-156fe6d1dc6d-kube-api-access-x2m85" seLinuxMountContext=""
Jan 28 20:39:32 crc kubenswrapper[4746]: I0128 20:39:32.803324 4746 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="0b78653f-4ff9-4508-8672-245ed9b561e3" volumeName="kubernetes.io/configmap/0b78653f-4ff9-4508-8672-245ed9b561e3-service-ca" seLinuxMountContext=""
Jan 28 20:39:32 crc kubenswrapper[4746]: I0128 20:39:32.803356 4746 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="37a5e44f-9a88-4405-be8a-b645485e7312" volumeName="kubernetes.io/secret/37a5e44f-9a88-4405-be8a-b645485e7312-metrics-tls" seLinuxMountContext=""
Jan 28 20:39:32 crc kubenswrapper[4746]: I0128 20:39:32.803432 4746 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="5fe579f8-e8a6-4643-bce5-a661393c4dde" volumeName="kubernetes.io/secret/5fe579f8-e8a6-4643-bce5-a661393c4dde-certs" seLinuxMountContext=""
Jan 28 20:39:32 crc kubenswrapper[4746]: I0128 20:39:32.803509 4746 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="8f668bae-612b-4b75-9490-919e737c6a3b" volumeName="kubernetes.io/secret/8f668bae-612b-4b75-9490-919e737c6a3b-installation-pull-secrets" seLinuxMountContext=""
Jan 28 20:39:32 crc kubenswrapper[4746]: I0128 20:39:32.803532 4746 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="bc5039c0-ea34-426b-a2b7-fbbc87b49a6d" volumeName="kubernetes.io/projected/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d-kube-api-access-mnrrd" seLinuxMountContext=""
Jan 28 20:39:32 crc kubenswrapper[4746]: I0128 20:39:32.803556 4746 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="09ae3b1a-e8e7-4524-b54b-61eab6f9239a" volumeName="kubernetes.io/configmap/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-audit-policies" seLinuxMountContext=""
Jan 28 20:39:32 crc kubenswrapper[4746]: I0128 20:39:32.803576 4746 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="44663579-783b-4372-86d6-acf235a62d72" volumeName="kubernetes.io/projected/44663579-783b-4372-86d6-acf235a62d72-kube-api-access-vt5rc" seLinuxMountContext=""
Jan 28 20:39:32 crc kubenswrapper[4746]: I0128 20:39:32.803593 4746 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="49c341d1-5089-4bc2-86a0-a5e165cfcc6b" volumeName="kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-session" seLinuxMountContext=""
Jan 28 20:39:32 crc kubenswrapper[4746]: I0128 20:39:32.803613 4746 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="57a731c4-ef35-47a8-b875-bfb08a7f8011" volumeName="kubernetes.io/empty-dir/57a731c4-ef35-47a8-b875-bfb08a7f8011-utilities" seLinuxMountContext=""
Jan 28 20:39:32 crc kubenswrapper[4746]: I0128 20:39:32.803630 4746 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="6402fda4-df10-493c-b4e5-d0569419652d" volumeName="kubernetes.io/projected/6402fda4-df10-493c-b4e5-d0569419652d-kube-api-access-mg5zb" seLinuxMountContext=""
Jan 28 20:39:32 crc kubenswrapper[4746]: I0128 20:39:32.803662 4746 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="7bb08738-c794-4ee8-9972-3a62ca171029" volumeName="kubernetes.io/configmap/7bb08738-c794-4ee8-9972-3a62ca171029-cni-binary-copy" seLinuxMountContext=""
Jan 28 20:39:32 crc kubenswrapper[4746]: I0128 20:39:32.803687 4746 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="8f668bae-612b-4b75-9490-919e737c6a3b" volumeName="kubernetes.io/empty-dir/8f668bae-612b-4b75-9490-919e737c6a3b-ca-trust-extracted" seLinuxMountContext=""
Jan 28 20:39:32 crc kubenswrapper[4746]: I0128 20:39:32.803707 4746 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="925f1c65-6136-48ba-85aa-3a3b50560753" volumeName="kubernetes.io/secret/925f1c65-6136-48ba-85aa-3a3b50560753-ovn-control-plane-metrics-cert" seLinuxMountContext=""
Jan 28 20:39:32 crc kubenswrapper[4746]: I0128 20:39:32.803732 4746 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="bf126b07-da06-4140-9a57-dfd54fc6b486" volumeName="kubernetes.io/projected/bf126b07-da06-4140-9a57-dfd54fc6b486-bound-sa-token" seLinuxMountContext=""
Jan 28 20:39:32 crc kubenswrapper[4746]: I0128 20:39:32.803754 4746 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="e7e6199b-1264-4501-8953-767f51328d08" volumeName="kubernetes.io/configmap/e7e6199b-1264-4501-8953-767f51328d08-config" seLinuxMountContext=""
Jan 28 20:39:32 crc kubenswrapper[4746]: I0128 20:39:32.803770 4746 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="43509403-f426-496e-be36-56cef71462f5" volumeName="kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-trusted-ca-bundle" seLinuxMountContext=""
Jan 28 20:39:32 crc kubenswrapper[4746]: I0128 20:39:32.803790 4746 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="b6312bbd-5731-4ea0-a20f-81d5a57df44a" volumeName="kubernetes.io/secret/b6312bbd-5731-4ea0-a20f-81d5a57df44a-profile-collector-cert" seLinuxMountContext=""
Jan 28 20:39:32 crc kubenswrapper[4746]: I0128 20:39:32.803820 4746 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="bd23aa5c-e532-4e53-bccf-e79f130c5ae8" volumeName="kubernetes.io/projected/bd23aa5c-e532-4e53-bccf-e79f130c5ae8-kube-api-access-jhbk2" seLinuxMountContext=""
Jan 28 20:39:32 crc kubenswrapper[4746]: I0128 20:39:32.803833 4746 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="c03ee662-fb2f-4fc4-a2c1-af487c19d254" volumeName="kubernetes.io/configmap/c03ee662-fb2f-4fc4-a2c1-af487c19d254-service-ca-bundle" seLinuxMountContext=""
Jan 28 20:39:32 crc kubenswrapper[4746]: I0128 20:39:32.803850 4746 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="09ae3b1a-e8e7-4524-b54b-61eab6f9239a" volumeName="kubernetes.io/secret/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-encryption-config" seLinuxMountContext=""
Jan 28 20:39:32 crc kubenswrapper[4746]: I0128 20:39:32.803864 4746 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="1386a44e-36a2-460c-96d0-0359d2b6f0f5" volumeName="kubernetes.io/configmap/1386a44e-36a2-460c-96d0-0359d2b6f0f5-config" seLinuxMountContext=""
Jan 28 20:39:32 crc kubenswrapper[4746]: I0128 20:39:32.803887 4746 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="43509403-f426-496e-be36-56cef71462f5" volumeName="kubernetes.io/secret/43509403-f426-496e-be36-56cef71462f5-console-oauth-config" seLinuxMountContext=""
Jan 28 20:39:32 crc kubenswrapper[4746]: I0128 20:39:32.803899 4746 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="49c341d1-5089-4bc2-86a0-a5e165cfcc6b" volumeName="kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-trusted-ca-bundle" seLinuxMountContext=""
Jan 28 20:39:32 crc kubenswrapper[4746]: I0128 20:39:32.803911 4746 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="6ea678ab-3438-413e-bfe3-290ae7725660" volumeName="kubernetes.io/configmap/6ea678ab-3438-413e-bfe3-290ae7725660-ovnkube-script-lib" seLinuxMountContext=""
Jan 28 20:39:32 crc kubenswrapper[4746]: I0128 20:39:32.803924 4746 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="7583ce53-e0fe-4a16-9e4d-50516596a136" volumeName="kubernetes.io/configmap/7583ce53-e0fe-4a16-9e4d-50516596a136-client-ca" seLinuxMountContext=""
Jan 28 20:39:32 crc kubenswrapper[4746]: I0128 20:39:32.803935 4746 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="fda69060-fa79-4696-b1a6-7980f124bf7c" volumeName="kubernetes.io/projected/fda69060-fa79-4696-b1a6-7980f124bf7c-kube-api-access-xcgwh" seLinuxMountContext=""
Jan 28 20:39:32 crc kubenswrapper[4746]: I0128 20:39:32.803948 4746 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="09efc573-dbb6-4249-bd59-9b87aba8dd28" volumeName="kubernetes.io/secret/09efc573-dbb6-4249-bd59-9b87aba8dd28-etcd-client" seLinuxMountContext=""
Jan 28 20:39:32 crc kubenswrapper[4746]: I0128 20:39:32.803959 4746 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c" volumeName="kubernetes.io/projected/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c-kube-api-access-qs4fp" seLinuxMountContext=""
Jan 28 20:39:32 crc kubenswrapper[4746]: I0128 20:39:32.804000 4746 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="308be0ea-9f5f-4b29-aeb1-5abd31a0b17b" volumeName="kubernetes.io/secret/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-webhook-cert" seLinuxMountContext=""
Jan 28 20:39:32 crc kubenswrapper[4746]: I0128 20:39:32.804015 4746 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="308be0ea-9f5f-4b29-aeb1-5abd31a0b17b" volumeName="kubernetes.io/secret/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-apiservice-cert" seLinuxMountContext=""
Jan 28 20:39:32 crc kubenswrapper[4746]: I0128 20:39:32.804027 4746 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="6509e943-70c6-444c-bc41-48a544e36fbd" volumeName="kubernetes.io/configmap/6509e943-70c6-444c-bc41-48a544e36fbd-config" seLinuxMountContext=""
Jan 28 20:39:32 crc kubenswrapper[4746]: I0128 20:39:32.804041 4746 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="6509e943-70c6-444c-bc41-48a544e36fbd" volumeName="kubernetes.io/configmap/6509e943-70c6-444c-bc41-48a544e36fbd-service-ca-bundle" seLinuxMountContext=""
Jan 28 20:39:32 crc kubenswrapper[4746]: I0128 20:39:32.804057 4746 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="8f668bae-612b-4b75-9490-919e737c6a3b" volumeName="kubernetes.io/configmap/8f668bae-612b-4b75-9490-919e737c6a3b-registry-certificates" seLinuxMountContext=""
Jan 28 20:39:32 crc kubenswrapper[4746]: I0128 20:39:32.804096 4746 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="96b93a3a-6083-4aea-8eab-fe1aa8245ad9" volumeName="kubernetes.io/secret/96b93a3a-6083-4aea-8eab-fe1aa8245ad9-metrics-tls" seLinuxMountContext=""
Jan 28 20:39:32 crc kubenswrapper[4746]: I0128 20:39:32.804113 4746 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="b11524ee-3fca-4b1b-9cdf-6da289fdbc7d" volumeName="kubernetes.io/empty-dir/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d-catalog-content" seLinuxMountContext=""
Jan 28 20:39:32 crc kubenswrapper[4746]: I0128 20:39:32.804126 4746 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="efdd0498-1daa-4136-9a4a-3b948c2293fc" volumeName="kubernetes.io/secret/efdd0498-1daa-4136-9a4a-3b948c2293fc-webhook-certs" seLinuxMountContext=""
Jan 28 20:39:32 crc kubenswrapper[4746]: I0128 20:39:32.804139 4746 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="0b574797-001e-440a-8f4e-c0be86edad0f" volumeName="kubernetes.io/projected/0b574797-001e-440a-8f4e-c0be86edad0f-kube-api-access-lzf88" seLinuxMountContext=""
Jan 28 20:39:32 crc kubenswrapper[4746]: I0128 20:39:32.804151 4746 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="43509403-f426-496e-be36-56cef71462f5" volumeName="kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-oauth-serving-cert" seLinuxMountContext=""
Jan 28 20:39:32 crc kubenswrapper[4746]: I0128 20:39:32.804162 4746 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="5441d097-087c-4d9a-baa8-b210afa90fc9" volumeName="kubernetes.io/configmap/5441d097-087c-4d9a-baa8-b210afa90fc9-client-ca" seLinuxMountContext=""
Jan 28 20:39:32 crc kubenswrapper[4746]: I0128 20:39:32.804175 4746 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="6ea678ab-3438-413e-bfe3-290ae7725660" volumeName="kubernetes.io/projected/6ea678ab-3438-413e-bfe3-290ae7725660-kube-api-access-htfz6" seLinuxMountContext=""
Jan 28 20:39:32 crc kubenswrapper[4746]: I0128 20:39:32.804185 4746 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="7539238d-5fe0-46ed-884e-1c3b566537ec" volumeName="kubernetes.io/projected/7539238d-5fe0-46ed-884e-1c3b566537ec-kube-api-access-tk88c" seLinuxMountContext=""
Jan 28 20:39:32 crc kubenswrapper[4746]: I0128 20:39:32.804199 4746 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="8f668bae-612b-4b75-9490-919e737c6a3b" volumeName="kubernetes.io/configmap/8f668bae-612b-4b75-9490-919e737c6a3b-trusted-ca" seLinuxMountContext=""
Jan 28 20:39:32 crc kubenswrapper[4746]: I0128 20:39:32.804210 4746 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="d75a4c96-2883-4a0b-bab2-0fab2b6c0b49" volumeName="kubernetes.io/projected/d75a4c96-2883-4a0b-bab2-0fab2b6c0b49-kube-api-access-rczfb" seLinuxMountContext=""
Jan 28 20:39:32 crc kubenswrapper[4746]: I0128 20:39:32.804222 4746 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="1bf7eb37-55a3-4c65-b768-a94c82151e69" volumeName="kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-audit" seLinuxMountContext=""
Jan 28 20:39:32 crc kubenswrapper[4746]: I0128 20:39:32.804235 4746 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="1d611f23-29be-4491-8495-bee1670e935f" volumeName="kubernetes.io/empty-dir/1d611f23-29be-4491-8495-bee1670e935f-catalog-content" seLinuxMountContext=""
Jan 28 20:39:32 crc kubenswrapper[4746]: I0128 20:39:32.804245 4746 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="3cb93b32-e0ae-4377-b9c8-fdb9842c6d59" volumeName="kubernetes.io/configmap/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59-serviceca" seLinuxMountContext=""
Jan 28 20:39:32 crc kubenswrapper[4746]: I0128 20:39:32.804260 4746 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="43509403-f426-496e-be36-56cef71462f5" volumeName="kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-service-ca" seLinuxMountContext=""
Jan 28 20:39:32 crc kubenswrapper[4746]: I0128 20:39:32.804271 4746 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="49c341d1-5089-4bc2-86a0-a5e165cfcc6b" volumeName="kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-cliconfig" seLinuxMountContext=""
Jan 28 20:39:32 crc kubenswrapper[4746]: I0128 20:39:32.804282 4746 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="49c341d1-5089-4bc2-86a0-a5e165cfcc6b" volumeName="kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-service-ca" seLinuxMountContext=""
Jan 28 20:39:32 crc kubenswrapper[4746]: I0128 20:39:32.804300 4746 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="49c341d1-5089-4bc2-86a0-a5e165cfcc6b" volumeName="kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-ocp-branding-template" seLinuxMountContext=""
Jan 28 20:39:32 crc kubenswrapper[4746]: I0128 20:39:32.804314 4746 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="57a731c4-ef35-47a8-b875-bfb08a7f8011" volumeName="kubernetes.io/empty-dir/57a731c4-ef35-47a8-b875-bfb08a7f8011-catalog-content" seLinuxMountContext=""
Jan 28 20:39:32 crc kubenswrapper[4746]: I0128 20:39:32.804329 4746 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="6402fda4-df10-493c-b4e5-d0569419652d" volumeName="kubernetes.io/configmap/6402fda4-df10-493c-b4e5-d0569419652d-images" seLinuxMountContext=""
Jan 28 20:39:32 crc kubenswrapper[4746]: I0128 20:39:32.804344 4746 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="c03ee662-fb2f-4fc4-a2c1-af487c19d254" volumeName="kubernetes.io/secret/c03ee662-fb2f-4fc4-a2c1-af487c19d254-metrics-certs" seLinuxMountContext=""
Jan 28 20:39:32 crc kubenswrapper[4746]: I0128 20:39:32.804355 4746 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="e7e6199b-1264-4501-8953-767f51328d08" volumeName="kubernetes.io/projected/e7e6199b-1264-4501-8953-767f51328d08-kube-api-access" seLinuxMountContext=""
Jan 28 20:39:32 crc kubenswrapper[4746]: I0128 20:39:32.804369 4746 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="ef543e1b-8068-4ea3-b32a-61027b32e95d" volumeName="kubernetes.io/projected/ef543e1b-8068-4ea3-b32a-61027b32e95d-kube-api-access-s2kz5" seLinuxMountContext=""
Jan 28 20:39:32 crc kubenswrapper[4746]: I0128 20:39:32.804380 4746 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="09efc573-dbb6-4249-bd59-9b87aba8dd28" volumeName="kubernetes.io/configmap/09efc573-dbb6-4249-bd59-9b87aba8dd28-config" seLinuxMountContext=""
Jan 28 20:39:32 crc kubenswrapper[4746]: I0128 20:39:32.804390 4746 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="09efc573-dbb6-4249-bd59-9b87aba8dd28" volumeName="kubernetes.io/projected/09efc573-dbb6-4249-bd59-9b87aba8dd28-kube-api-access-8tdtz" seLinuxMountContext=""
Jan 28 20:39:32 crc kubenswrapper[4746]: I0128 20:39:32.804403 4746 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="49c341d1-5089-4bc2-86a0-a5e165cfcc6b" volumeName="kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-template-provider-selection" seLinuxMountContext=""
Jan 28 20:39:32 crc kubenswrapper[4746]: I0128 20:39:32.804413 4746 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="5441d097-087c-4d9a-baa8-b210afa90fc9" volumeName="kubernetes.io/secret/5441d097-087c-4d9a-baa8-b210afa90fc9-serving-cert" seLinuxMountContext=""
Jan 28 20:39:32 crc kubenswrapper[4746]: I0128 20:39:32.804428 4746 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="a0128f3a-b052-44ed-a84e-c4c8aaf17c13" volumeName="kubernetes.io/projected/a0128f3a-b052-44ed-a84e-c4c8aaf17c13-kube-api-access-gf66m" seLinuxMountContext=""
Jan 28 20:39:32 crc kubenswrapper[4746]: I0128 20:39:32.804439 4746 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="f88749ec-7931-4ee7-b3fc-1ec5e11f92e9" volumeName="kubernetes.io/secret/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9-srv-cert" seLinuxMountContext=""
Jan 28 20:39:32 crc kubenswrapper[4746]: I0128 20:39:32.804453 4746 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="1bf7eb37-55a3-4c65-b768-a94c82151e69" volumeName="kubernetes.io/projected/1bf7eb37-55a3-4c65-b768-a94c82151e69-kube-api-access-sb6h7" seLinuxMountContext=""
Jan 28 20:39:32 crc kubenswrapper[4746]: I0128 20:39:32.804473 4746 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="1bf7eb37-55a3-4c65-b768-a94c82151e69" volumeName="kubernetes.io/secret/1bf7eb37-55a3-4c65-b768-a94c82151e69-serving-cert" seLinuxMountContext=""
Jan 28 20:39:32 crc kubenswrapper[4746]: I0128 20:39:32.804488 4746 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="7583ce53-e0fe-4a16-9e4d-50516596a136" volumeName="kubernetes.io/secret/7583ce53-e0fe-4a16-9e4d-50516596a136-serving-cert" seLinuxMountContext=""
Jan 28 20:39:32 crc kubenswrapper[4746]: I0128 20:39:32.804504 4746 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="9d4552c7-cd75-42dd-8880-30dd377c49a4" volumeName="kubernetes.io/configmap/9d4552c7-cd75-42dd-8880-30dd377c49a4-config" seLinuxMountContext=""
Jan 28 20:39:32 crc kubenswrapper[4746]: I0128 20:39:32.804519 4746 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="b11524ee-3fca-4b1b-9cdf-6da289fdbc7d" volumeName="kubernetes.io/empty-dir/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d-utilities" seLinuxMountContext=""
Jan 28 20:39:32 crc kubenswrapper[4746]: I0128 20:39:32.804534 4746 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="e7e6199b-1264-4501-8953-767f51328d08" volumeName="kubernetes.io/secret/e7e6199b-1264-4501-8953-767f51328d08-serving-cert" seLinuxMountContext=""
Jan 28 20:39:32 crc kubenswrapper[4746]: I0128 20:39:32.804550 4746 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="5fe579f8-e8a6-4643-bce5-a661393c4dde" volumeName="kubernetes.io/projected/5fe579f8-e8a6-4643-bce5-a661393c4dde-kube-api-access-fcqwp" seLinuxMountContext=""
Jan 28 20:39:32 crc kubenswrapper[4746]: I0128 20:39:32.804564 4746 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="6731426b-95fe-49ff-bb5f-40441049fde2" volumeName="kubernetes.io/secret/6731426b-95fe-49ff-bb5f-40441049fde2-control-plane-machine-set-operator-tls" seLinuxMountContext=""
Jan 28 20:39:32 crc kubenswrapper[4746]: I0128 20:39:32.804582 4746 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="925f1c65-6136-48ba-85aa-3a3b50560753" volumeName="kubernetes.io/configmap/925f1c65-6136-48ba-85aa-3a3b50560753-env-overrides" seLinuxMountContext=""
Jan 28 20:39:32 crc kubenswrapper[4746]: I0128 20:39:32.804593 4746 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="f88749ec-7931-4ee7-b3fc-1ec5e11f92e9" volumeName="kubernetes.io/projected/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9-kube-api-access-dbsvg" seLinuxMountContext=""
Jan 28 20:39:32 crc kubenswrapper[4746]: I0128 20:39:32.804605 4746 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="01ab3dd5-8196-46d0-ad33-122e2ca51def" volumeName="kubernetes.io/projected/01ab3dd5-8196-46d0-ad33-122e2ca51def-kube-api-access-w7l8j" seLinuxMountContext=""
Jan 28 20:39:32 crc kubenswrapper[4746]: I0128 20:39:32.804623 4746 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="09ae3b1a-e8e7-4524-b54b-61eab6f9239a" volumeName="kubernetes.io/configmap/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-etcd-serving-ca" seLinuxMountContext=""
Jan 28 20:39:32 crc kubenswrapper[4746]: I0128 20:39:32.804636 4746 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="0b574797-001e-440a-8f4e-c0be86edad0f" volumeName="kubernetes.io/secret/0b574797-001e-440a-8f4e-c0be86edad0f-proxy-tls" seLinuxMountContext=""
Jan 28 20:39:32 crc kubenswrapper[4746]: I0128 20:39:32.804653 4746 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="1bf7eb37-55a3-4c65-b768-a94c82151e69" volumeName="kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-config" seLinuxMountContext=""
Jan 28 20:39:32 crc kubenswrapper[4746]: I0128 20:39:32.804669 4746 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="4bb40260-dbaa-4fb0-84df-5e680505d512" volumeName="kubernetes.io/configmap/4bb40260-dbaa-4fb0-84df-5e680505d512-cni-binary-copy" seLinuxMountContext=""
Jan 28 20:39:32 crc kubenswrapper[4746]: I0128 20:39:32.804683 4746 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="8f668bae-612b-4b75-9490-919e737c6a3b" volumeName="kubernetes.io/projected/8f668bae-612b-4b75-9490-919e737c6a3b-registry-tls" seLinuxMountContext=""
Jan 28 20:39:32 crc kubenswrapper[4746]: I0128 20:39:32.804700 4746 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="bc5039c0-ea34-426b-a2b7-fbbc87b49a6d" volumeName="kubernetes.io/secret/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d-serving-cert" seLinuxMountContext=""
Jan 28 20:39:32 crc kubenswrapper[4746]: I0128 20:39:32.805153 4746 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="c03ee662-fb2f-4fc4-a2c1-af487c19d254" volumeName="kubernetes.io/secret/c03ee662-fb2f-4fc4-a2c1-af487c19d254-stats-auth" seLinuxMountContext=""
Jan 28 20:39:32 crc kubenswrapper[4746]: I0128 20:39:32.805201 4746 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="09efc573-dbb6-4249-bd59-9b87aba8dd28" volumeName="kubernetes.io/secret/09efc573-dbb6-4249-bd59-9b87aba8dd28-serving-cert" seLinuxMountContext=""
Jan 28 20:39:32 crc kubenswrapper[4746]: I0128 20:39:32.805242 4746 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="1d611f23-29be-4491-8495-bee1670e935f" volumeName="kubernetes.io/empty-dir/1d611f23-29be-4491-8495-bee1670e935f-utilities" seLinuxMountContext=""
Jan 28 20:39:32 crc kubenswrapper[4746]: I0128 20:39:32.805269 4746 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="308be0ea-9f5f-4b29-aeb1-5abd31a0b17b" volumeName="kubernetes.io/projected/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-kube-api-access-6ccd8" seLinuxMountContext=""
Jan 28 20:39:32 crc kubenswrapper[4746]: I0128 20:39:32.805294 4746 manager.go:324] Recovery completed
Jan 28 20:39:32 crc kubenswrapper[4746]: I0128 20:39:32.805311 4746 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="3ab1a177-2de0-46d9-b765-d0d0649bb42e" volumeName="kubernetes.io/secret/3ab1a177-2de0-46d9-b765-d0d0649bb42e-package-server-manager-serving-cert" seLinuxMountContext=""
Jan 28 20:39:32 crc kubenswrapper[4746]: I0128 20:39:32.805343 4746 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="8cea82b4-6893-4ddc-af9f-1bb5ae425c5b" volumeName="kubernetes.io/projected/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b-kube-api-access-w4xd4" seLinuxMountContext=""
Jan 28 20:39:32 crc kubenswrapper[4746]: I0128 20:39:32.805369 4746 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="efdd0498-1daa-4136-9a4a-3b948c2293fc" volumeName="kubernetes.io/projected/efdd0498-1daa-4136-9a4a-3b948c2293fc-kube-api-access-fqsjt" seLinuxMountContext=""
Jan 28 20:39:32 crc kubenswrapper[4746]: I0128 20:39:32.805405 4746 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="1386a44e-36a2-460c-96d0-0359d2b6f0f5" volumeName="kubernetes.io/projected/1386a44e-36a2-460c-96d0-0359d2b6f0f5-kube-api-access" seLinuxMountContext=""
Jan 28 20:39:32 crc kubenswrapper[4746]: I0128 20:39:32.805438 4746 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="1bf7eb37-55a3-4c65-b768-a94c82151e69" volumeName="kubernetes.io/secret/1bf7eb37-55a3-4c65-b768-a94c82151e69-etcd-client" seLinuxMountContext=""
Jan 28 20:39:32 crc kubenswrapper[4746]: I0128 20:39:32.805469 4746 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="20b0d48f-5fd6-431c-a545-e3c800c7b866" volumeName="kubernetes.io/secret/20b0d48f-5fd6-431c-a545-e3c800c7b866-cert" seLinuxMountContext=""
Jan 28 20:39:32 crc kubenswrapper[4746]: I0128 20:39:32.805493 4746 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="31d8b7a1-420e-4252-a5b7-eebe8a111292" volumeName="kubernetes.io/configmap/31d8b7a1-420e-4252-a5b7-eebe8a111292-images" seLinuxMountContext=""
Jan 28 20:39:32 crc kubenswrapper[4746]: I0128 20:39:32.805517 4746 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="49c341d1-5089-4bc2-86a0-a5e165cfcc6b" volumeName="kubernetes.io/projected/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-kube-api-access-ngvvp" seLinuxMountContext=""
Jan 28 20:39:32 crc kubenswrapper[4746]: I0128 20:39:32.805549 4746 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="6402fda4-df10-493c-b4e5-d0569419652d" volumeName="kubernetes.io/secret/6402fda4-df10-493c-b4e5-d0569419652d-machine-api-operator-tls" seLinuxMountContext=""
Jan 28 20:39:32 crc kubenswrapper[4746]: I0128 20:39:32.805573 4746 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="87cf06ed-a83f-41a7-828d-70653580a8cb" volumeName="kubernetes.io/projected/87cf06ed-a83f-41a7-828d-70653580a8cb-kube-api-access-d6qdx" seLinuxMountContext=""
Jan 28 20:39:32 crc kubenswrapper[4746]: I0128 20:39:32.805603 4746 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="925f1c65-6136-48ba-85aa-3a3b50560753" volumeName="kubernetes.io/projected/925f1c65-6136-48ba-85aa-3a3b50560753-kube-api-access-s4n52" seLinuxMountContext=""
Jan 28 20:39:32 crc kubenswrapper[4746]: I0128 20:39:32.805644 4746 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="22c825df-677d-4ca6-82db-3454ed06e783" volumeName="kubernetes.io/configmap/22c825df-677d-4ca6-82db-3454ed06e783-config" seLinuxMountContext=""
Jan 28 20:39:32 crc kubenswrapper[4746]: I0128 20:39:32.805668 4746 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="49c341d1-5089-4bc2-86a0-a5e165cfcc6b" volumeName="kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-serving-cert" seLinuxMountContext=""
Jan 28 20:39:32 crc kubenswrapper[4746]: I0128 20:39:32.805699 4746 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="5fe579f8-e8a6-4643-bce5-a661393c4dde" volumeName="kubernetes.io/secret/5fe579f8-e8a6-4643-bce5-a661393c4dde-node-bootstrap-token" seLinuxMountContext=""
Jan 28 20:39:32 crc kubenswrapper[4746]: I0128 20:39:32.805723 4746 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="8cea82b4-6893-4ddc-af9f-1bb5ae425c5b" volumeName="kubernetes.io/configmap/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b-config" seLinuxMountContext=""
Jan 28 20:39:32 crc kubenswrapper[4746]: I0128 20:39:32.805754 4746 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="bc5039c0-ea34-426b-a2b7-fbbc87b49a6d" volumeName="kubernetes.io/empty-dir/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d-available-featuregates" seLinuxMountContext=""
Jan 28 20:39:32 crc kubenswrapper[4746]: I0128 20:39:32.805781 4746 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="bf126b07-da06-4140-9a57-dfd54fc6b486" volumeName="kubernetes.io/configmap/bf126b07-da06-4140-9a57-dfd54fc6b486-trusted-ca" seLinuxMountContext=""
Jan 28 20:39:32 crc kubenswrapper[4746]: I0128 20:39:32.805807 4746 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="9d4552c7-cd75-42dd-8880-30dd377c49a4" volumeName="kubernetes.io/projected/9d4552c7-cd75-42dd-8880-30dd377c49a4-kube-api-access-pcxfs" seLinuxMountContext=""
Jan 28 20:39:32 crc kubenswrapper[4746]: I0128 20:39:32.805837 4746 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="ef543e1b-8068-4ea3-b32a-61027b32e95d" volumeName="kubernetes.io/configmap/ef543e1b-8068-4ea3-b32a-61027b32e95d-ovnkube-identity-cm" seLinuxMountContext=""
Jan 28 20:39:32 crc kubenswrapper[4746]: I0128 20:39:32.805862 4746 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="01ab3dd5-8196-46d0-ad33-122e2ca51def" volumeName="kubernetes.io/secret/01ab3dd5-8196-46d0-ad33-122e2ca51def-serving-cert" seLinuxMountContext=""
Jan 28 20:39:32 crc kubenswrapper[4746]: I0128 20:39:32.805894 4746 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="31d8b7a1-420e-4252-a5b7-eebe8a111292" volumeName="kubernetes.io/projected/31d8b7a1-420e-4252-a5b7-eebe8a111292-kube-api-access-zgdk5" seLinuxMountContext=""
Jan 28 20:39:32 crc kubenswrapper[4746]: I0128 20:39:32.805917 4746 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="496e6271-fb68-4057-954e-a0d97a4afa3f" volumeName="kubernetes.io/configmap/496e6271-fb68-4057-954e-a0d97a4afa3f-config" seLinuxMountContext=""
Jan 28 20:39:32 crc kubenswrapper[4746]: I0128 20:39:32.805940 4746 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="49c341d1-5089-4bc2-86a0-a5e165cfcc6b" volumeName="kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-template-login" seLinuxMountContext=""
Jan 28 20:39:32 crc kubenswrapper[4746]: I0128 20:39:32.805969 4746 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="7bb08738-c794-4ee8-9972-3a62ca171029" volumeName="kubernetes.io/configmap/7bb08738-c794-4ee8-9972-3a62ca171029-cni-sysctl-allowlist" seLinuxMountContext=""
Jan 28 20:39:32 crc kubenswrapper[4746]: I0128 20:39:32.805993 4746 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="c03ee662-fb2f-4fc4-a2c1-af487c19d254" volumeName="kubernetes.io/projected/c03ee662-fb2f-4fc4-a2c1-af487c19d254-kube-api-access-v47cf" seLinuxMountContext=""
Jan 28 20:39:32 crc kubenswrapper[4746]: I0128 20:39:32.806025 4746 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="ef543e1b-8068-4ea3-b32a-61027b32e95d" volumeName="kubernetes.io/configmap/ef543e1b-8068-4ea3-b32a-61027b32e95d-env-overrides" seLinuxMountContext=""
Jan 28 20:39:32 crc kubenswrapper[4746]: I0128 20:39:32.806049 4746 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="09efc573-dbb6-4249-bd59-9b87aba8dd28" volumeName="kubernetes.io/configmap/09efc573-dbb6-4249-bd59-9b87aba8dd28-etcd-ca" seLinuxMountContext=""
Jan 28 20:39:32 crc kubenswrapper[4746]: I0128 20:39:32.806112 4746 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="0b78653f-4ff9-4508-8672-245ed9b561e3" volumeName="kubernetes.io/projected/0b78653f-4ff9-4508-8672-245ed9b561e3-kube-api-access" seLinuxMountContext=""
Jan 28 20:39:32 crc kubenswrapper[4746]: I0128 20:39:32.806146 4746 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="22c825df-677d-4ca6-82db-3454ed06e783" volumeName="kubernetes.io/secret/22c825df-677d-4ca6-82db-3454ed06e783-machine-approver-tls" seLinuxMountContext=""
Jan 28 20:39:32 crc kubenswrapper[4746]: I0128 20:39:32.806188 4746 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="308be0ea-9f5f-4b29-aeb1-5abd31a0b17b" volumeName="kubernetes.io/empty-dir/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-tmpfs" seLinuxMountContext=""
Jan 28 20:39:32 crc kubenswrapper[4746]: I0128 20:39:32.806213 4746 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="96b93a3a-6083-4aea-8eab-fe1aa8245ad9" volumeName="kubernetes.io/projected/96b93a3a-6083-4aea-8eab-fe1aa8245ad9-kube-api-access-nzwt7" seLinuxMountContext=""
Jan 28 20:39:32 crc kubenswrapper[4746]: I0128 20:39:32.806236 4746 reconstruct.go:97] "Volume reconstruction finished"
Jan 28 20:39:32 crc kubenswrapper[4746]: I0128 20:39:32.806251 4746 reconciler.go:26] "Reconciler: start to sync state"
Jan 28 20:39:32 crc kubenswrapper[4746]: I0128 20:39:32.824646 4746 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach"
Jan 28 20:39:32 crc kubenswrapper[4746]: I0128 20:39:32.827657 4746 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Jan 28 20:39:32 crc kubenswrapper[4746]: I0128 20:39:32.827726 4746 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Jan 28 20:39:32 crc kubenswrapper[4746]: I0128 20:39:32.827776 4746 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Jan 28
20:39:32 crc kubenswrapper[4746]: I0128 20:39:32.828660 4746 cpu_manager.go:225] "Starting CPU manager" policy="none" Jan 28 20:39:32 crc kubenswrapper[4746]: I0128 20:39:32.828676 4746 cpu_manager.go:226] "Reconciling" reconcilePeriod="10s" Jan 28 20:39:32 crc kubenswrapper[4746]: I0128 20:39:32.828698 4746 state_mem.go:36] "Initialized new in-memory state store" Jan 28 20:39:32 crc kubenswrapper[4746]: I0128 20:39:32.832531 4746 kubelet_network_linux.go:50] "Initialized iptables rules." protocol="IPv4" Jan 28 20:39:32 crc kubenswrapper[4746]: I0128 20:39:32.834472 4746 kubelet_network_linux.go:50] "Initialized iptables rules." protocol="IPv6" Jan 28 20:39:32 crc kubenswrapper[4746]: I0128 20:39:32.834522 4746 status_manager.go:217] "Starting to sync pod status with apiserver" Jan 28 20:39:32 crc kubenswrapper[4746]: I0128 20:39:32.834554 4746 kubelet.go:2335] "Starting kubelet main sync loop" Jan 28 20:39:32 crc kubenswrapper[4746]: E0128 20:39:32.834622 4746 kubelet.go:2359] "Skipping pod synchronization" err="[container runtime status check may not have completed yet, PLEG is not healthy: pleg has yet to be successful]" Jan 28 20:39:32 crc kubenswrapper[4746]: W0128 20:39:32.836515 4746 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.RuntimeClass: Get "https://api-int.crc.testing:6443/apis/node.k8s.io/v1/runtimeclasses?limit=500&resourceVersion=0": dial tcp 38.102.83.201:6443: connect: connection refused Jan 28 20:39:32 crc kubenswrapper[4746]: E0128 20:39:32.836628 4746 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.RuntimeClass: failed to list *v1.RuntimeClass: Get \"https://api-int.crc.testing:6443/apis/node.k8s.io/v1/runtimeclasses?limit=500&resourceVersion=0\": dial tcp 38.102.83.201:6443: connect: connection refused" logger="UnhandledError" Jan 28 20:39:32 crc kubenswrapper[4746]: I0128 20:39:32.851751 4746 policy_none.go:49] "None policy: Start" Jan 28 20:39:32 crc 
kubenswrapper[4746]: I0128 20:39:32.852793 4746 memory_manager.go:170] "Starting memorymanager" policy="None" Jan 28 20:39:32 crc kubenswrapper[4746]: I0128 20:39:32.852816 4746 state_mem.go:35] "Initializing new in-memory state store" Jan 28 20:39:32 crc kubenswrapper[4746]: E0128 20:39:32.878027 4746 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Jan 28 20:39:32 crc kubenswrapper[4746]: I0128 20:39:32.904938 4746 manager.go:334] "Starting Device Plugin manager" Jan 28 20:39:32 crc kubenswrapper[4746]: I0128 20:39:32.905031 4746 manager.go:513] "Failed to read data from checkpoint" checkpoint="kubelet_internal_checkpoint" err="checkpoint is not found" Jan 28 20:39:32 crc kubenswrapper[4746]: I0128 20:39:32.905049 4746 server.go:79] "Starting device plugin registration server" Jan 28 20:39:32 crc kubenswrapper[4746]: I0128 20:39:32.905644 4746 eviction_manager.go:189] "Eviction manager: starting control loop" Jan 28 20:39:32 crc kubenswrapper[4746]: I0128 20:39:32.905668 4746 container_log_manager.go:189] "Initializing container log rotate workers" workers=1 monitorPeriod="10s" Jan 28 20:39:32 crc kubenswrapper[4746]: I0128 20:39:32.905906 4746 plugin_watcher.go:51] "Plugin Watcher Start" path="/var/lib/kubelet/plugins_registry" Jan 28 20:39:32 crc kubenswrapper[4746]: I0128 20:39:32.906026 4746 plugin_manager.go:116] "The desired_state_of_world populator (plugin watcher) starts" Jan 28 20:39:32 crc kubenswrapper[4746]: I0128 20:39:32.906042 4746 plugin_manager.go:118] "Starting Kubelet Plugin Manager" Jan 28 20:39:32 crc kubenswrapper[4746]: E0128 20:39:32.916197 4746 eviction_manager.go:285] "Eviction manager: failed to get summary stats" err="failed to get node info: node \"crc\" not found" Jan 28 20:39:32 crc kubenswrapper[4746]: I0128 20:39:32.935440 4746 kubelet.go:2421] "SyncLoop ADD" source="file" 
pods=["openshift-etcd/etcd-crc","openshift-kube-apiserver/kube-apiserver-crc","openshift-kube-controller-manager/kube-controller-manager-crc","openshift-kube-scheduler/openshift-kube-scheduler-crc","openshift-machine-config-operator/kube-rbac-proxy-crio-crc"] Jan 28 20:39:32 crc kubenswrapper[4746]: I0128 20:39:32.935622 4746 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Jan 28 20:39:32 crc kubenswrapper[4746]: I0128 20:39:32.939360 4746 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 28 20:39:32 crc kubenswrapper[4746]: I0128 20:39:32.939414 4746 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 28 20:39:32 crc kubenswrapper[4746]: I0128 20:39:32.939426 4746 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 28 20:39:32 crc kubenswrapper[4746]: I0128 20:39:32.939685 4746 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Jan 28 20:39:32 crc kubenswrapper[4746]: I0128 20:39:32.939927 4746 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-etcd/etcd-crc" Jan 28 20:39:32 crc kubenswrapper[4746]: I0128 20:39:32.939974 4746 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Jan 28 20:39:32 crc kubenswrapper[4746]: I0128 20:39:32.940859 4746 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 28 20:39:32 crc kubenswrapper[4746]: I0128 20:39:32.940897 4746 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 28 20:39:32 crc kubenswrapper[4746]: I0128 20:39:32.940907 4746 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 28 20:39:32 crc kubenswrapper[4746]: I0128 20:39:32.941283 4746 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 28 20:39:32 crc kubenswrapper[4746]: I0128 20:39:32.941313 4746 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 28 20:39:32 crc kubenswrapper[4746]: I0128 20:39:32.941328 4746 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 28 20:39:32 crc kubenswrapper[4746]: I0128 20:39:32.941448 4746 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Jan 28 20:39:32 crc kubenswrapper[4746]: I0128 20:39:32.941930 4746 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-apiserver/kube-apiserver-crc" Jan 28 20:39:32 crc kubenswrapper[4746]: I0128 20:39:32.941975 4746 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Jan 28 20:39:32 crc kubenswrapper[4746]: I0128 20:39:32.942117 4746 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 28 20:39:32 crc kubenswrapper[4746]: I0128 20:39:32.942147 4746 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 28 20:39:32 crc kubenswrapper[4746]: I0128 20:39:32.942159 4746 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 28 20:39:32 crc kubenswrapper[4746]: I0128 20:39:32.942276 4746 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Jan 28 20:39:32 crc kubenswrapper[4746]: I0128 20:39:32.942371 4746 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-controller-manager/kube-controller-manager-crc" Jan 28 20:39:32 crc kubenswrapper[4746]: I0128 20:39:32.942402 4746 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Jan 28 20:39:32 crc kubenswrapper[4746]: I0128 20:39:32.942745 4746 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 28 20:39:32 crc kubenswrapper[4746]: I0128 20:39:32.942776 4746 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 28 20:39:32 crc kubenswrapper[4746]: I0128 20:39:32.942793 4746 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 28 20:39:32 crc kubenswrapper[4746]: I0128 20:39:32.943180 4746 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 28 20:39:32 crc kubenswrapper[4746]: I0128 20:39:32.943231 4746 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 28 20:39:32 crc kubenswrapper[4746]: I0128 20:39:32.943244 4746 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 28 20:39:32 crc kubenswrapper[4746]: I0128 20:39:32.943190 4746 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 28 20:39:32 crc kubenswrapper[4746]: I0128 20:39:32.943312 4746 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 28 20:39:32 crc kubenswrapper[4746]: I0128 20:39:32.943324 4746 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 28 20:39:32 crc kubenswrapper[4746]: I0128 20:39:32.943448 4746 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Jan 28 20:39:32 crc 
kubenswrapper[4746]: I0128 20:39:32.943661 4746 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" Jan 28 20:39:32 crc kubenswrapper[4746]: I0128 20:39:32.943690 4746 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Jan 28 20:39:32 crc kubenswrapper[4746]: I0128 20:39:32.944581 4746 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 28 20:39:32 crc kubenswrapper[4746]: I0128 20:39:32.944618 4746 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 28 20:39:32 crc kubenswrapper[4746]: I0128 20:39:32.944634 4746 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 28 20:39:32 crc kubenswrapper[4746]: I0128 20:39:32.944873 4746 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" Jan 28 20:39:32 crc kubenswrapper[4746]: I0128 20:39:32.944909 4746 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Jan 28 20:39:32 crc kubenswrapper[4746]: I0128 20:39:32.945203 4746 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 28 20:39:32 crc kubenswrapper[4746]: I0128 20:39:32.945237 4746 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 28 20:39:32 crc kubenswrapper[4746]: I0128 20:39:32.945250 4746 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 28 20:39:32 crc kubenswrapper[4746]: I0128 20:39:32.945656 4746 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 28 20:39:32 crc kubenswrapper[4746]: I0128 20:39:32.945695 4746 
kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 28 20:39:32 crc kubenswrapper[4746]: I0128 20:39:32.945713 4746 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 28 20:39:32 crc kubenswrapper[4746]: E0128 20:39:32.979783 4746 controller.go:145] "Failed to ensure lease exists, will retry" err="Get \"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/crc?timeout=10s\": dial tcp 38.102.83.201:6443: connect: connection refused" interval="400ms" Jan 28 20:39:33 crc kubenswrapper[4746]: I0128 20:39:33.005917 4746 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Jan 28 20:39:33 crc kubenswrapper[4746]: I0128 20:39:33.008294 4746 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 28 20:39:33 crc kubenswrapper[4746]: I0128 20:39:33.008347 4746 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 28 20:39:33 crc kubenswrapper[4746]: I0128 20:39:33.008359 4746 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 28 20:39:33 crc kubenswrapper[4746]: I0128 20:39:33.008390 4746 kubelet_node_status.go:76] "Attempting to register node" node="crc" Jan 28 20:39:33 crc kubenswrapper[4746]: E0128 20:39:33.009177 4746 kubelet_node_status.go:99] "Unable to register node with API server" err="Post \"https://api-int.crc.testing:6443/api/v1/nodes\": dial tcp 38.102.83.201:6443: connect: connection refused" node="crc" Jan 28 20:39:33 crc kubenswrapper[4746]: I0128 20:39:33.009663 4746 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"usr-local-bin\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-usr-local-bin\") pod \"etcd-crc\" (UID: 
\"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc" Jan 28 20:39:33 crc kubenswrapper[4746]: I0128 20:39:33.009824 4746 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/f4b27818a5e8e43d0dc095d08835c792-cert-dir\") pod \"kube-apiserver-crc\" (UID: \"f4b27818a5e8e43d0dc095d08835c792\") " pod="openshift-kube-apiserver/kube-apiserver-crc" Jan 28 20:39:33 crc kubenswrapper[4746]: I0128 20:39:33.010004 4746 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-kube\" (UniqueName: \"kubernetes.io/host-path/d1b160f5dda77d281dd8e69ec8d817f9-etc-kube\") pod \"kube-rbac-proxy-crio-crc\" (UID: \"d1b160f5dda77d281dd8e69ec8d817f9\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" Jan 28 20:39:33 crc kubenswrapper[4746]: I0128 20:39:33.010774 4746 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/d1b160f5dda77d281dd8e69ec8d817f9-var-lib-kubelet\") pod \"kube-rbac-proxy-crio-crc\" (UID: \"d1b160f5dda77d281dd8e69ec8d817f9\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" Jan 28 20:39:33 crc kubenswrapper[4746]: I0128 20:39:33.010812 4746 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"data-dir\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-data-dir\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc" Jan 28 20:39:33 crc kubenswrapper[4746]: I0128 20:39:33.010839 4746 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/f4b27818a5e8e43d0dc095d08835c792-resource-dir\") pod \"kube-apiserver-crc\" (UID: \"f4b27818a5e8e43d0dc095d08835c792\") " 
pod="openshift-kube-apiserver/kube-apiserver-crc" Jan 28 20:39:33 crc kubenswrapper[4746]: I0128 20:39:33.010865 4746 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/3dcd261975c3d6b9a6ad6367fd4facd3-cert-dir\") pod \"openshift-kube-scheduler-crc\" (UID: \"3dcd261975c3d6b9a6ad6367fd4facd3\") " pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" Jan 28 20:39:33 crc kubenswrapper[4746]: I0128 20:39:33.010887 4746 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-cert-dir\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc" Jan 28 20:39:33 crc kubenswrapper[4746]: I0128 20:39:33.010905 4746 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-resource-dir\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc" Jan 28 20:39:33 crc kubenswrapper[4746]: I0128 20:39:33.010988 4746 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"log-dir\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-log-dir\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc" Jan 28 20:39:33 crc kubenswrapper[4746]: I0128 20:39:33.011050 4746 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/f4b27818a5e8e43d0dc095d08835c792-audit-dir\") pod \"kube-apiserver-crc\" (UID: \"f4b27818a5e8e43d0dc095d08835c792\") " pod="openshift-kube-apiserver/kube-apiserver-crc" Jan 28 20:39:33 crc kubenswrapper[4746]: I0128 20:39:33.011123 4746 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/f614b9022728cf315e60c057852e563e-resource-dir\") pod \"kube-controller-manager-crc\" (UID: \"f614b9022728cf315e60c057852e563e\") " pod="openshift-kube-controller-manager/kube-controller-manager-crc" Jan 28 20:39:33 crc kubenswrapper[4746]: I0128 20:39:33.011170 4746 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/f614b9022728cf315e60c057852e563e-cert-dir\") pod \"kube-controller-manager-crc\" (UID: \"f614b9022728cf315e60c057852e563e\") " pod="openshift-kube-controller-manager/kube-controller-manager-crc" Jan 28 20:39:33 crc kubenswrapper[4746]: I0128 20:39:33.011208 4746 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"static-pod-dir\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-static-pod-dir\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc" Jan 28 20:39:33 crc kubenswrapper[4746]: I0128 20:39:33.011258 4746 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/3dcd261975c3d6b9a6ad6367fd4facd3-resource-dir\") pod \"openshift-kube-scheduler-crc\" (UID: \"3dcd261975c3d6b9a6ad6367fd4facd3\") " pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" Jan 28 20:39:33 crc kubenswrapper[4746]: I0128 20:39:33.112476 4746 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/3dcd261975c3d6b9a6ad6367fd4facd3-cert-dir\") pod \"openshift-kube-scheduler-crc\" (UID: \"3dcd261975c3d6b9a6ad6367fd4facd3\") " pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" Jan 28 20:39:33 crc kubenswrapper[4746]: I0128 20:39:33.112590 4746 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-cert-dir\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc" Jan 28 20:39:33 crc kubenswrapper[4746]: I0128 20:39:33.112655 4746 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/f4b27818a5e8e43d0dc095d08835c792-resource-dir\") pod \"kube-apiserver-crc\" (UID: \"f4b27818a5e8e43d0dc095d08835c792\") " pod="openshift-kube-apiserver/kube-apiserver-crc" Jan 28 20:39:33 crc kubenswrapper[4746]: I0128 20:39:33.112685 4746 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"log-dir\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-log-dir\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc" Jan 28 20:39:33 crc kubenswrapper[4746]: I0128 20:39:33.112736 4746 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/f4b27818a5e8e43d0dc095d08835c792-audit-dir\") pod \"kube-apiserver-crc\" (UID: \"f4b27818a5e8e43d0dc095d08835c792\") " pod="openshift-kube-apiserver/kube-apiserver-crc" Jan 28 20:39:33 crc kubenswrapper[4746]: I0128 20:39:33.112764 4746 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/f614b9022728cf315e60c057852e563e-resource-dir\") pod \"kube-controller-manager-crc\" (UID: \"f614b9022728cf315e60c057852e563e\") " pod="openshift-kube-controller-manager/kube-controller-manager-crc" Jan 28 20:39:33 crc kubenswrapper[4746]: I0128 20:39:33.112810 4746 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/f614b9022728cf315e60c057852e563e-cert-dir\") pod \"kube-controller-manager-crc\" (UID: 
\"f614b9022728cf315e60c057852e563e\") " pod="openshift-kube-controller-manager/kube-controller-manager-crc" Jan 28 20:39:33 crc kubenswrapper[4746]: I0128 20:39:33.112808 4746 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-cert-dir\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc" Jan 28 20:39:33 crc kubenswrapper[4746]: I0128 20:39:33.112897 4746 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"static-pod-dir\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-static-pod-dir\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc" Jan 28 20:39:33 crc kubenswrapper[4746]: I0128 20:39:33.112897 4746 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"log-dir\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-log-dir\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc" Jan 28 20:39:33 crc kubenswrapper[4746]: I0128 20:39:33.112833 4746 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"static-pod-dir\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-static-pod-dir\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc" Jan 28 20:39:33 crc kubenswrapper[4746]: I0128 20:39:33.113049 4746 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/f614b9022728cf315e60c057852e563e-cert-dir\") pod \"kube-controller-manager-crc\" (UID: \"f614b9022728cf315e60c057852e563e\") " pod="openshift-kube-controller-manager/kube-controller-manager-crc" Jan 28 20:39:33 crc kubenswrapper[4746]: I0128 20:39:33.112984 4746 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"resource-dir\" (UniqueName: 
\"kubernetes.io/host-path/f4b27818a5e8e43d0dc095d08835c792-resource-dir\") pod \"kube-apiserver-crc\" (UID: \"f4b27818a5e8e43d0dc095d08835c792\") " pod="openshift-kube-apiserver/kube-apiserver-crc" Jan 28 20:39:33 crc kubenswrapper[4746]: I0128 20:39:33.112852 4746 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/3dcd261975c3d6b9a6ad6367fd4facd3-cert-dir\") pod \"openshift-kube-scheduler-crc\" (UID: \"3dcd261975c3d6b9a6ad6367fd4facd3\") " pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" Jan 28 20:39:33 crc kubenswrapper[4746]: I0128 20:39:33.112924 4746 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/f4b27818a5e8e43d0dc095d08835c792-audit-dir\") pod \"kube-apiserver-crc\" (UID: \"f4b27818a5e8e43d0dc095d08835c792\") " pod="openshift-kube-apiserver/kube-apiserver-crc" Jan 28 20:39:33 crc kubenswrapper[4746]: I0128 20:39:33.113074 4746 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-resource-dir\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc" Jan 28 20:39:33 crc kubenswrapper[4746]: I0128 20:39:33.113170 4746 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-resource-dir\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc" Jan 28 20:39:33 crc kubenswrapper[4746]: I0128 20:39:33.112994 4746 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/f614b9022728cf315e60c057852e563e-resource-dir\") pod \"kube-controller-manager-crc\" (UID: \"f614b9022728cf315e60c057852e563e\") " pod="openshift-kube-controller-manager/kube-controller-manager-crc" 
Jan 28 20:39:33 crc kubenswrapper[4746]: I0128 20:39:33.113220 4746 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/3dcd261975c3d6b9a6ad6367fd4facd3-resource-dir\") pod \"openshift-kube-scheduler-crc\" (UID: \"3dcd261975c3d6b9a6ad6367fd4facd3\") " pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" Jan 28 20:39:33 crc kubenswrapper[4746]: I0128 20:39:33.113274 4746 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/3dcd261975c3d6b9a6ad6367fd4facd3-resource-dir\") pod \"openshift-kube-scheduler-crc\" (UID: \"3dcd261975c3d6b9a6ad6367fd4facd3\") " pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" Jan 28 20:39:33 crc kubenswrapper[4746]: I0128 20:39:33.113285 4746 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/f4b27818a5e8e43d0dc095d08835c792-cert-dir\") pod \"kube-apiserver-crc\" (UID: \"f4b27818a5e8e43d0dc095d08835c792\") " pod="openshift-kube-apiserver/kube-apiserver-crc" Jan 28 20:39:33 crc kubenswrapper[4746]: I0128 20:39:33.113341 4746 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-kube\" (UniqueName: \"kubernetes.io/host-path/d1b160f5dda77d281dd8e69ec8d817f9-etc-kube\") pod \"kube-rbac-proxy-crio-crc\" (UID: \"d1b160f5dda77d281dd8e69ec8d817f9\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" Jan 28 20:39:33 crc kubenswrapper[4746]: I0128 20:39:33.113391 4746 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/d1b160f5dda77d281dd8e69ec8d817f9-var-lib-kubelet\") pod \"kube-rbac-proxy-crio-crc\" (UID: \"d1b160f5dda77d281dd8e69ec8d817f9\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" Jan 28 20:39:33 crc kubenswrapper[4746]: I0128 20:39:33.113397 4746 
operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-kube\" (UniqueName: \"kubernetes.io/host-path/d1b160f5dda77d281dd8e69ec8d817f9-etc-kube\") pod \"kube-rbac-proxy-crio-crc\" (UID: \"d1b160f5dda77d281dd8e69ec8d817f9\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" Jan 28 20:39:33 crc kubenswrapper[4746]: I0128 20:39:33.113388 4746 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/f4b27818a5e8e43d0dc095d08835c792-cert-dir\") pod \"kube-apiserver-crc\" (UID: \"f4b27818a5e8e43d0dc095d08835c792\") " pod="openshift-kube-apiserver/kube-apiserver-crc" Jan 28 20:39:33 crc kubenswrapper[4746]: I0128 20:39:33.113439 4746 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"data-dir\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-data-dir\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc" Jan 28 20:39:33 crc kubenswrapper[4746]: I0128 20:39:33.113451 4746 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/d1b160f5dda77d281dd8e69ec8d817f9-var-lib-kubelet\") pod \"kube-rbac-proxy-crio-crc\" (UID: \"d1b160f5dda77d281dd8e69ec8d817f9\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" Jan 28 20:39:33 crc kubenswrapper[4746]: I0128 20:39:33.113514 4746 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"usr-local-bin\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-usr-local-bin\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc" Jan 28 20:39:33 crc kubenswrapper[4746]: I0128 20:39:33.113570 4746 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"usr-local-bin\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-usr-local-bin\") pod \"etcd-crc\" (UID: 
\"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc" Jan 28 20:39:33 crc kubenswrapper[4746]: I0128 20:39:33.113602 4746 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"data-dir\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-data-dir\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc" Jan 28 20:39:33 crc kubenswrapper[4746]: I0128 20:39:33.210040 4746 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Jan 28 20:39:33 crc kubenswrapper[4746]: I0128 20:39:33.212110 4746 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 28 20:39:33 crc kubenswrapper[4746]: I0128 20:39:33.212180 4746 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 28 20:39:33 crc kubenswrapper[4746]: I0128 20:39:33.212208 4746 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 28 20:39:33 crc kubenswrapper[4746]: I0128 20:39:33.212258 4746 kubelet_node_status.go:76] "Attempting to register node" node="crc" Jan 28 20:39:33 crc kubenswrapper[4746]: E0128 20:39:33.213177 4746 kubelet_node_status.go:99] "Unable to register node with API server" err="Post \"https://api-int.crc.testing:6443/api/v1/nodes\": dial tcp 38.102.83.201:6443: connect: connection refused" node="crc" Jan 28 20:39:33 crc kubenswrapper[4746]: I0128 20:39:33.265478 4746 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-etcd/etcd-crc" Jan 28 20:39:33 crc kubenswrapper[4746]: I0128 20:39:33.277636 4746 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-apiserver/kube-apiserver-crc" Jan 28 20:39:33 crc kubenswrapper[4746]: I0128 20:39:33.298134 4746 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-controller-manager/kube-controller-manager-crc" Jan 28 20:39:33 crc kubenswrapper[4746]: I0128 20:39:33.304588 4746 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" Jan 28 20:39:33 crc kubenswrapper[4746]: I0128 20:39:33.310737 4746 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" Jan 28 20:39:33 crc kubenswrapper[4746]: W0128 20:39:33.321801 4746 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podf4b27818a5e8e43d0dc095d08835c792.slice/crio-6e043ba0aff2dc20a679fc948da0291afbce041faddeed773542b405bdf15739 WatchSource:0}: Error finding container 6e043ba0aff2dc20a679fc948da0291afbce041faddeed773542b405bdf15739: Status 404 returned error can't find the container with id 6e043ba0aff2dc20a679fc948da0291afbce041faddeed773542b405bdf15739 Jan 28 20:39:33 crc kubenswrapper[4746]: W0128 20:39:33.322219 4746 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod2139d3e2895fc6797b9c76a1b4c9886d.slice/crio-af2daf3b61098cfef3670ef804ed6d24aaf1c8de7a667d6c34b679ed08635ad5 WatchSource:0}: Error finding container af2daf3b61098cfef3670ef804ed6d24aaf1c8de7a667d6c34b679ed08635ad5: Status 404 returned error can't find the container with id af2daf3b61098cfef3670ef804ed6d24aaf1c8de7a667d6c34b679ed08635ad5 Jan 28 20:39:33 crc kubenswrapper[4746]: W0128 20:39:33.330635 4746 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podf614b9022728cf315e60c057852e563e.slice/crio-bd8bcc8256d3028555944d99b31567e03fb2332d3aeb926b491e71bb3b6144b6 WatchSource:0}: Error finding container bd8bcc8256d3028555944d99b31567e03fb2332d3aeb926b491e71bb3b6144b6: Status 404 returned 
error can't find the container with id bd8bcc8256d3028555944d99b31567e03fb2332d3aeb926b491e71bb3b6144b6 Jan 28 20:39:33 crc kubenswrapper[4746]: W0128 20:39:33.345000 4746 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podd1b160f5dda77d281dd8e69ec8d817f9.slice/crio-8016521c9b8abe9faf5e12962aceecfb8f91c5417fd927f28131d0eace15ccde WatchSource:0}: Error finding container 8016521c9b8abe9faf5e12962aceecfb8f91c5417fd927f28131d0eace15ccde: Status 404 returned error can't find the container with id 8016521c9b8abe9faf5e12962aceecfb8f91c5417fd927f28131d0eace15ccde Jan 28 20:39:33 crc kubenswrapper[4746]: E0128 20:39:33.381027 4746 controller.go:145] "Failed to ensure lease exists, will retry" err="Get \"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/crc?timeout=10s\": dial tcp 38.102.83.201:6443: connect: connection refused" interval="800ms" Jan 28 20:39:33 crc kubenswrapper[4746]: I0128 20:39:33.614319 4746 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Jan 28 20:39:33 crc kubenswrapper[4746]: I0128 20:39:33.615923 4746 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 28 20:39:33 crc kubenswrapper[4746]: I0128 20:39:33.615969 4746 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 28 20:39:33 crc kubenswrapper[4746]: I0128 20:39:33.615985 4746 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 28 20:39:33 crc kubenswrapper[4746]: I0128 20:39:33.616015 4746 kubelet_node_status.go:76] "Attempting to register node" node="crc" Jan 28 20:39:33 crc kubenswrapper[4746]: E0128 20:39:33.616499 4746 kubelet_node_status.go:99] "Unable to register node with API server" err="Post \"https://api-int.crc.testing:6443/api/v1/nodes\": dial tcp 
38.102.83.201:6443: connect: connection refused" node="crc" Jan 28 20:39:33 crc kubenswrapper[4746]: W0128 20:39:33.687702 4746 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.Service: Get "https://api-int.crc.testing:6443/api/v1/services?fieldSelector=spec.clusterIP%21%3DNone&limit=500&resourceVersion=0": dial tcp 38.102.83.201:6443: connect: connection refused Jan 28 20:39:33 crc kubenswrapper[4746]: E0128 20:39:33.687855 4746 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.Service: failed to list *v1.Service: Get \"https://api-int.crc.testing:6443/api/v1/services?fieldSelector=spec.clusterIP%21%3DNone&limit=500&resourceVersion=0\": dial tcp 38.102.83.201:6443: connect: connection refused" logger="UnhandledError" Jan 28 20:39:33 crc kubenswrapper[4746]: I0128 20:39:33.770167 4746 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: Get "https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csinodes/crc?resourceVersion=0": dial tcp 38.102.83.201:6443: connect: connection refused Jan 28 20:39:33 crc kubenswrapper[4746]: I0128 20:39:33.777173 4746 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2026-01-04 18:39:55.252054488 +0000 UTC Jan 28 20:39:33 crc kubenswrapper[4746]: I0128 20:39:33.839578 4746 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" event={"ID":"3dcd261975c3d6b9a6ad6367fd4facd3","Type":"ContainerStarted","Data":"15397dd236ec9961d8e1c5ff60aa6a87a97ffad12075340a730067fe7977b765"} Jan 28 20:39:33 crc kubenswrapper[4746]: I0128 20:39:33.840816 4746 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" 
event={"ID":"f614b9022728cf315e60c057852e563e","Type":"ContainerStarted","Data":"bd8bcc8256d3028555944d99b31567e03fb2332d3aeb926b491e71bb3b6144b6"} Jan 28 20:39:33 crc kubenswrapper[4746]: I0128 20:39:33.842043 4746 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" event={"ID":"f4b27818a5e8e43d0dc095d08835c792","Type":"ContainerStarted","Data":"6e043ba0aff2dc20a679fc948da0291afbce041faddeed773542b405bdf15739"} Jan 28 20:39:33 crc kubenswrapper[4746]: I0128 20:39:33.843176 4746 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-etcd/etcd-crc" event={"ID":"2139d3e2895fc6797b9c76a1b4c9886d","Type":"ContainerStarted","Data":"af2daf3b61098cfef3670ef804ed6d24aaf1c8de7a667d6c34b679ed08635ad5"} Jan 28 20:39:33 crc kubenswrapper[4746]: I0128 20:39:33.845242 4746 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" event={"ID":"d1b160f5dda77d281dd8e69ec8d817f9","Type":"ContainerStarted","Data":"8016521c9b8abe9faf5e12962aceecfb8f91c5417fd927f28131d0eace15ccde"} Jan 28 20:39:33 crc kubenswrapper[4746]: W0128 20:39:33.936462 4746 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.CSIDriver: Get "https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csidrivers?limit=500&resourceVersion=0": dial tcp 38.102.83.201:6443: connect: connection refused Jan 28 20:39:33 crc kubenswrapper[4746]: E0128 20:39:33.936722 4746 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.CSIDriver: failed to list *v1.CSIDriver: Get \"https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csidrivers?limit=500&resourceVersion=0\": dial tcp 38.102.83.201:6443: connect: connection refused" logger="UnhandledError" Jan 28 20:39:34 crc kubenswrapper[4746]: W0128 20:39:34.104449 4746 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.Node: Get 
"https://api-int.crc.testing:6443/api/v1/nodes?fieldSelector=metadata.name%3Dcrc&limit=500&resourceVersion=0": dial tcp 38.102.83.201:6443: connect: connection refused Jan 28 20:39:34 crc kubenswrapper[4746]: E0128 20:39:34.104561 4746 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.Node: failed to list *v1.Node: Get \"https://api-int.crc.testing:6443/api/v1/nodes?fieldSelector=metadata.name%3Dcrc&limit=500&resourceVersion=0\": dial tcp 38.102.83.201:6443: connect: connection refused" logger="UnhandledError" Jan 28 20:39:34 crc kubenswrapper[4746]: E0128 20:39:34.182000 4746 controller.go:145] "Failed to ensure lease exists, will retry" err="Get \"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/crc?timeout=10s\": dial tcp 38.102.83.201:6443: connect: connection refused" interval="1.6s" Jan 28 20:39:34 crc kubenswrapper[4746]: W0128 20:39:34.315032 4746 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.RuntimeClass: Get "https://api-int.crc.testing:6443/apis/node.k8s.io/v1/runtimeclasses?limit=500&resourceVersion=0": dial tcp 38.102.83.201:6443: connect: connection refused Jan 28 20:39:34 crc kubenswrapper[4746]: E0128 20:39:34.315228 4746 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.RuntimeClass: failed to list *v1.RuntimeClass: Get \"https://api-int.crc.testing:6443/apis/node.k8s.io/v1/runtimeclasses?limit=500&resourceVersion=0\": dial tcp 38.102.83.201:6443: connect: connection refused" logger="UnhandledError" Jan 28 20:39:34 crc kubenswrapper[4746]: I0128 20:39:34.416838 4746 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Jan 28 20:39:34 crc kubenswrapper[4746]: I0128 20:39:34.418757 4746 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 28 20:39:34 crc 
kubenswrapper[4746]: I0128 20:39:34.418810 4746 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 28 20:39:34 crc kubenswrapper[4746]: I0128 20:39:34.418822 4746 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 28 20:39:34 crc kubenswrapper[4746]: I0128 20:39:34.418849 4746 kubelet_node_status.go:76] "Attempting to register node" node="crc" Jan 28 20:39:34 crc kubenswrapper[4746]: E0128 20:39:34.420719 4746 kubelet_node_status.go:99] "Unable to register node with API server" err="Post \"https://api-int.crc.testing:6443/api/v1/nodes\": dial tcp 38.102.83.201:6443: connect: connection refused" node="crc" Jan 28 20:39:34 crc kubenswrapper[4746]: I0128 20:39:34.770120 4746 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: Get "https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csinodes/crc?resourceVersion=0": dial tcp 38.102.83.201:6443: connect: connection refused Jan 28 20:39:34 crc kubenswrapper[4746]: I0128 20:39:34.778058 4746 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2025-11-14 19:01:52.780013382 +0000 UTC Jan 28 20:39:34 crc kubenswrapper[4746]: I0128 20:39:34.781500 4746 certificate_manager.go:356] kubernetes.io/kube-apiserver-client-kubelet: Rotating certificates Jan 28 20:39:34 crc kubenswrapper[4746]: E0128 20:39:34.782561 4746 certificate_manager.go:562] "Unhandled Error" err="kubernetes.io/kube-apiserver-client-kubelet: Failed while requesting a signed certificate from the control plane: cannot create certificate signing request: Post \"https://api-int.crc.testing:6443/apis/certificates.k8s.io/v1/certificatesigningrequests\": dial tcp 38.102.83.201:6443: connect: connection refused" logger="UnhandledError" Jan 28 20:39:34 crc kubenswrapper[4746]: I0128 20:39:34.854224 4746 kubelet.go:2453] "SyncLoop (PLEG): 
event for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" event={"ID":"f614b9022728cf315e60c057852e563e","Type":"ContainerStarted","Data":"145f4f882924f06172df81d94fc9bd683ea1fc04889de5f6cfb5caece8227fd0"} Jan 28 20:39:34 crc kubenswrapper[4746]: I0128 20:39:34.854281 4746 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" event={"ID":"f614b9022728cf315e60c057852e563e","Type":"ContainerStarted","Data":"eefda7da35cd9ff176512b5d1224ccadcb5debd0035a49822e82a9757d4a3f91"} Jan 28 20:39:34 crc kubenswrapper[4746]: I0128 20:39:34.854302 4746 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" event={"ID":"f614b9022728cf315e60c057852e563e","Type":"ContainerStarted","Data":"0411ecf7c85f5397d751adb7d4905977dcbd3d2e169c56ebbe26017192765602"} Jan 28 20:39:34 crc kubenswrapper[4746]: I0128 20:39:34.854344 4746 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" event={"ID":"f614b9022728cf315e60c057852e563e","Type":"ContainerStarted","Data":"319cbcaa9981b3c79eb0c8f970f0ad776fb255db10c72c9b6a13469906e9e5ec"} Jan 28 20:39:34 crc kubenswrapper[4746]: I0128 20:39:34.854380 4746 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Jan 28 20:39:34 crc kubenswrapper[4746]: I0128 20:39:34.855940 4746 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 28 20:39:34 crc kubenswrapper[4746]: I0128 20:39:34.856004 4746 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 28 20:39:34 crc kubenswrapper[4746]: I0128 20:39:34.856024 4746 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 28 20:39:34 crc kubenswrapper[4746]: I0128 20:39:34.856433 4746 generic.go:334] 
"Generic (PLEG): container finished" podID="f4b27818a5e8e43d0dc095d08835c792" containerID="c5808a729190035a73529ac58efb0790f2d9c3b98c5f1a8017389e4ca1738c4c" exitCode=0 Jan 28 20:39:34 crc kubenswrapper[4746]: I0128 20:39:34.856551 4746 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" event={"ID":"f4b27818a5e8e43d0dc095d08835c792","Type":"ContainerDied","Data":"c5808a729190035a73529ac58efb0790f2d9c3b98c5f1a8017389e4ca1738c4c"} Jan 28 20:39:34 crc kubenswrapper[4746]: I0128 20:39:34.856607 4746 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Jan 28 20:39:34 crc kubenswrapper[4746]: I0128 20:39:34.858267 4746 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 28 20:39:34 crc kubenswrapper[4746]: I0128 20:39:34.858304 4746 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 28 20:39:34 crc kubenswrapper[4746]: I0128 20:39:34.858314 4746 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 28 20:39:34 crc kubenswrapper[4746]: I0128 20:39:34.859262 4746 generic.go:334] "Generic (PLEG): container finished" podID="2139d3e2895fc6797b9c76a1b4c9886d" containerID="b2a1482245dbcc87fb46916cade2708c81888aec325fdd981b47846d9998f6d2" exitCode=0 Jan 28 20:39:34 crc kubenswrapper[4746]: I0128 20:39:34.859311 4746 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-etcd/etcd-crc" event={"ID":"2139d3e2895fc6797b9c76a1b4c9886d","Type":"ContainerDied","Data":"b2a1482245dbcc87fb46916cade2708c81888aec325fdd981b47846d9998f6d2"} Jan 28 20:39:34 crc kubenswrapper[4746]: I0128 20:39:34.859358 4746 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Jan 28 20:39:34 crc kubenswrapper[4746]: I0128 20:39:34.860474 4746 kubelet_node_status.go:724] "Recording event message for 
node" node="crc" event="NodeHasSufficientMemory" Jan 28 20:39:34 crc kubenswrapper[4746]: I0128 20:39:34.860505 4746 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 28 20:39:34 crc kubenswrapper[4746]: I0128 20:39:34.860518 4746 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 28 20:39:34 crc kubenswrapper[4746]: I0128 20:39:34.860776 4746 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Jan 28 20:39:34 crc kubenswrapper[4746]: I0128 20:39:34.861451 4746 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 28 20:39:34 crc kubenswrapper[4746]: I0128 20:39:34.861479 4746 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 28 20:39:34 crc kubenswrapper[4746]: I0128 20:39:34.861490 4746 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 28 20:39:34 crc kubenswrapper[4746]: I0128 20:39:34.862024 4746 generic.go:334] "Generic (PLEG): container finished" podID="d1b160f5dda77d281dd8e69ec8d817f9" containerID="57975efbbe1fef93d88a55df5a0951063515ba898b928963f616fc23a05b5c09" exitCode=0 Jan 28 20:39:34 crc kubenswrapper[4746]: I0128 20:39:34.862074 4746 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" event={"ID":"d1b160f5dda77d281dd8e69ec8d817f9","Type":"ContainerDied","Data":"57975efbbe1fef93d88a55df5a0951063515ba898b928963f616fc23a05b5c09"} Jan 28 20:39:34 crc kubenswrapper[4746]: I0128 20:39:34.862101 4746 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Jan 28 20:39:34 crc kubenswrapper[4746]: I0128 20:39:34.862881 4746 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 28 
20:39:34 crc kubenswrapper[4746]: I0128 20:39:34.862917 4746 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 28 20:39:34 crc kubenswrapper[4746]: I0128 20:39:34.862934 4746 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 28 20:39:34 crc kubenswrapper[4746]: I0128 20:39:34.888269 4746 generic.go:334] "Generic (PLEG): container finished" podID="3dcd261975c3d6b9a6ad6367fd4facd3" containerID="a4a7e5a6315ccaaa100eb33d1a23bf2481f4ee8aa1ea076927dec54027d4c4cd" exitCode=0 Jan 28 20:39:34 crc kubenswrapper[4746]: I0128 20:39:34.888323 4746 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" event={"ID":"3dcd261975c3d6b9a6ad6367fd4facd3","Type":"ContainerDied","Data":"a4a7e5a6315ccaaa100eb33d1a23bf2481f4ee8aa1ea076927dec54027d4c4cd"} Jan 28 20:39:34 crc kubenswrapper[4746]: I0128 20:39:34.888470 4746 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Jan 28 20:39:34 crc kubenswrapper[4746]: I0128 20:39:34.890454 4746 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 28 20:39:34 crc kubenswrapper[4746]: I0128 20:39:34.890500 4746 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 28 20:39:34 crc kubenswrapper[4746]: I0128 20:39:34.890516 4746 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 28 20:39:35 crc kubenswrapper[4746]: I0128 20:39:35.770262 4746 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: Get "https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csinodes/crc?resourceVersion=0": dial tcp 38.102.83.201:6443: connect: connection refused Jan 28 20:39:35 crc kubenswrapper[4746]: I0128 20:39:35.778467 4746 certificate_manager.go:356] 
kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2025-11-29 16:21:35.644046091 +0000 UTC Jan 28 20:39:35 crc kubenswrapper[4746]: E0128 20:39:35.783606 4746 controller.go:145] "Failed to ensure lease exists, will retry" err="Get \"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/crc?timeout=10s\": dial tcp 38.102.83.201:6443: connect: connection refused" interval="3.2s" Jan 28 20:39:35 crc kubenswrapper[4746]: I0128 20:39:35.894003 4746 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" event={"ID":"d1b160f5dda77d281dd8e69ec8d817f9","Type":"ContainerStarted","Data":"0c996cd0a95ad18373212e715f575ffd233fb252c0c1701b6fd7d1c0fcade202"} Jan 28 20:39:35 crc kubenswrapper[4746]: I0128 20:39:35.894136 4746 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Jan 28 20:39:35 crc kubenswrapper[4746]: I0128 20:39:35.895163 4746 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 28 20:39:35 crc kubenswrapper[4746]: I0128 20:39:35.895200 4746 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 28 20:39:35 crc kubenswrapper[4746]: I0128 20:39:35.895211 4746 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 28 20:39:35 crc kubenswrapper[4746]: I0128 20:39:35.896539 4746 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" event={"ID":"3dcd261975c3d6b9a6ad6367fd4facd3","Type":"ContainerStarted","Data":"cf6d80884e00b1a12051a7a97148a46fba0e8514a1233a180262392c302db77f"} Jan 28 20:39:35 crc kubenswrapper[4746]: I0128 20:39:35.896608 4746 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Jan 28 20:39:35 
crc kubenswrapper[4746]: I0128 20:39:35.896617 4746 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" event={"ID":"3dcd261975c3d6b9a6ad6367fd4facd3","Type":"ContainerStarted","Data":"00d75277c1a1e8fdc34e37a3ae3d697e002007456fde3dae5d49c1c932a0a7f5"} Jan 28 20:39:35 crc kubenswrapper[4746]: I0128 20:39:35.896637 4746 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" event={"ID":"3dcd261975c3d6b9a6ad6367fd4facd3","Type":"ContainerStarted","Data":"2ba6f8586593282902571354d53a7bdc0945ea8d1970cac1bfc2f8cc4019a4ab"} Jan 28 20:39:35 crc kubenswrapper[4746]: I0128 20:39:35.897260 4746 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 28 20:39:35 crc kubenswrapper[4746]: I0128 20:39:35.897296 4746 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 28 20:39:35 crc kubenswrapper[4746]: I0128 20:39:35.897307 4746 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 28 20:39:35 crc kubenswrapper[4746]: I0128 20:39:35.904500 4746 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" event={"ID":"f4b27818a5e8e43d0dc095d08835c792","Type":"ContainerStarted","Data":"03a94dee0b9663441e0ecec16b121c5be4e5c557f17c929b8edfb0d7ba00c3d5"} Jan 28 20:39:35 crc kubenswrapper[4746]: I0128 20:39:35.904600 4746 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" event={"ID":"f4b27818a5e8e43d0dc095d08835c792","Type":"ContainerStarted","Data":"6af41f85c56be6c24127bec0aee8e02562dbb04bd22e57c6887f6f5e914a7530"} Jan 28 20:39:35 crc kubenswrapper[4746]: I0128 20:39:35.904616 4746 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" 
event={"ID":"f4b27818a5e8e43d0dc095d08835c792","Type":"ContainerStarted","Data":"f31e9e80576ec4eb59ec92039601fd94bd323d29493c353f9ec6b9c61187b0d3"} Jan 28 20:39:35 crc kubenswrapper[4746]: I0128 20:39:35.904628 4746 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" event={"ID":"f4b27818a5e8e43d0dc095d08835c792","Type":"ContainerStarted","Data":"9a304b12b4a1e30cbbbb16aa4f91514f1b1a64c716cf8ac023803a279b310f9c"} Jan 28 20:39:35 crc kubenswrapper[4746]: I0128 20:39:35.907175 4746 generic.go:334] "Generic (PLEG): container finished" podID="2139d3e2895fc6797b9c76a1b4c9886d" containerID="778ac26701d97ec6712ad019620908fe8a0b861d6f780192d4674d3e04bd12ab" exitCode=0 Jan 28 20:39:35 crc kubenswrapper[4746]: I0128 20:39:35.907220 4746 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-etcd/etcd-crc" event={"ID":"2139d3e2895fc6797b9c76a1b4c9886d","Type":"ContainerDied","Data":"778ac26701d97ec6712ad019620908fe8a0b861d6f780192d4674d3e04bd12ab"} Jan 28 20:39:35 crc kubenswrapper[4746]: I0128 20:39:35.907313 4746 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Jan 28 20:39:35 crc kubenswrapper[4746]: I0128 20:39:35.907313 4746 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Jan 28 20:39:35 crc kubenswrapper[4746]: I0128 20:39:35.908207 4746 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 28 20:39:35 crc kubenswrapper[4746]: I0128 20:39:35.908251 4746 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 28 20:39:35 crc kubenswrapper[4746]: I0128 20:39:35.908266 4746 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 28 20:39:35 crc kubenswrapper[4746]: I0128 20:39:35.909113 4746 kubelet_node_status.go:724] "Recording event message for node" node="crc" 
event="NodeHasSufficientMemory" Jan 28 20:39:35 crc kubenswrapper[4746]: I0128 20:39:35.909147 4746 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 28 20:39:35 crc kubenswrapper[4746]: I0128 20:39:35.909160 4746 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 28 20:39:36 crc kubenswrapper[4746]: I0128 20:39:36.021605 4746 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Jan 28 20:39:36 crc kubenswrapper[4746]: I0128 20:39:36.023347 4746 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 28 20:39:36 crc kubenswrapper[4746]: I0128 20:39:36.023403 4746 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 28 20:39:36 crc kubenswrapper[4746]: I0128 20:39:36.023417 4746 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 28 20:39:36 crc kubenswrapper[4746]: I0128 20:39:36.023451 4746 kubelet_node_status.go:76] "Attempting to register node" node="crc" Jan 28 20:39:36 crc kubenswrapper[4746]: E0128 20:39:36.024040 4746 kubelet_node_status.go:99] "Unable to register node with API server" err="Post \"https://api-int.crc.testing:6443/api/v1/nodes\": dial tcp 38.102.83.201:6443: connect: connection refused" node="crc" Jan 28 20:39:36 crc kubenswrapper[4746]: I0128 20:39:36.779314 4746 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2026-01-13 14:12:09.773363605 +0000 UTC Jan 28 20:39:36 crc kubenswrapper[4746]: I0128 20:39:36.912208 4746 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" 
event={"ID":"f4b27818a5e8e43d0dc095d08835c792","Type":"ContainerStarted","Data":"2bd133218655e67cdc29b14c35357368452991c316986b1fa90069554eba28c0"}
Jan 28 20:39:36 crc kubenswrapper[4746]: I0128 20:39:36.912268 4746 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach"
Jan 28 20:39:36 crc kubenswrapper[4746]: I0128 20:39:36.913058 4746 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Jan 28 20:39:36 crc kubenswrapper[4746]: I0128 20:39:36.913103 4746 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Jan 28 20:39:36 crc kubenswrapper[4746]: I0128 20:39:36.913113 4746 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Jan 28 20:39:36 crc kubenswrapper[4746]: I0128 20:39:36.913809 4746 generic.go:334] "Generic (PLEG): container finished" podID="2139d3e2895fc6797b9c76a1b4c9886d" containerID="d068c6c0f2664a5fe70ef836e158cb2f9ca7d7c64ee2b2cb7ab8406f584c21de" exitCode=0
Jan 28 20:39:36 crc kubenswrapper[4746]: I0128 20:39:36.913875 4746 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-etcd/etcd-crc" event={"ID":"2139d3e2895fc6797b9c76a1b4c9886d","Type":"ContainerDied","Data":"d068c6c0f2664a5fe70ef836e158cb2f9ca7d7c64ee2b2cb7ab8406f584c21de"}
Jan 28 20:39:36 crc kubenswrapper[4746]: I0128 20:39:36.913878 4746 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach"
Jan 28 20:39:36 crc kubenswrapper[4746]: I0128 20:39:36.913927 4746 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach"
Jan 28 20:39:36 crc kubenswrapper[4746]: I0128 20:39:36.913946 4746 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach"
Jan 28 20:39:36 crc kubenswrapper[4746]: I0128 20:39:36.914020 4746 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc"
Jan 28 20:39:36 crc kubenswrapper[4746]: I0128 20:39:36.914507 4746 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Jan 28 20:39:36 crc kubenswrapper[4746]: I0128 20:39:36.914530 4746 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Jan 28 20:39:36 crc kubenswrapper[4746]: I0128 20:39:36.914539 4746 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Jan 28 20:39:36 crc kubenswrapper[4746]: I0128 20:39:36.914606 4746 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Jan 28 20:39:36 crc kubenswrapper[4746]: I0128 20:39:36.914621 4746 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Jan 28 20:39:36 crc kubenswrapper[4746]: I0128 20:39:36.914628 4746 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Jan 28 20:39:36 crc kubenswrapper[4746]: I0128 20:39:36.914729 4746 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Jan 28 20:39:36 crc kubenswrapper[4746]: I0128 20:39:36.914743 4746 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Jan 28 20:39:36 crc kubenswrapper[4746]: I0128 20:39:36.914750 4746 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Jan 28 20:39:37 crc kubenswrapper[4746]: I0128 20:39:37.779841 4746 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2025-12-10 04:31:43.277469313 +0000 UTC
Jan 28 20:39:37 crc kubenswrapper[4746]: I0128 20:39:37.922264 4746 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-etcd/etcd-crc" event={"ID":"2139d3e2895fc6797b9c76a1b4c9886d","Type":"ContainerStarted","Data":"56371247ab5de90c39d65341269582f77c95f06691e70b30012756a2b91c2478"}
Jan 28 20:39:37 crc kubenswrapper[4746]: I0128 20:39:37.922331 4746 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-etcd/etcd-crc" event={"ID":"2139d3e2895fc6797b9c76a1b4c9886d","Type":"ContainerStarted","Data":"959945b9576f9c3f90550119b9cce0c28a363de3f41aef0b31cd43f134edd1d3"}
Jan 28 20:39:37 crc kubenswrapper[4746]: I0128 20:39:37.922348 4746 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-etcd/etcd-crc" event={"ID":"2139d3e2895fc6797b9c76a1b4c9886d","Type":"ContainerStarted","Data":"c5dd5146e539117f8dd83780beb11865b672e3866cc05b2728c92c3d2657975b"}
Jan 28 20:39:37 crc kubenswrapper[4746]: I0128 20:39:37.922362 4746 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-etcd/etcd-crc" event={"ID":"2139d3e2895fc6797b9c76a1b4c9886d","Type":"ContainerStarted","Data":"a0840ed7999d45932a41fcdfdd5aff3f6ecf46aa297d83a9b2ddb7b21830f985"}
Jan 28 20:39:37 crc kubenswrapper[4746]: I0128 20:39:37.922387 4746 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-kube-apiserver/kube-apiserver-crc"
Jan 28 20:39:37 crc kubenswrapper[4746]: I0128 20:39:37.922362 4746 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach"
Jan 28 20:39:37 crc kubenswrapper[4746]: I0128 20:39:37.922362 4746 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach"
Jan 28 20:39:37 crc kubenswrapper[4746]: I0128 20:39:37.923546 4746 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Jan 28 20:39:37 crc kubenswrapper[4746]: I0128 20:39:37.923576 4746 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Jan 28 20:39:37 crc kubenswrapper[4746]: I0128 20:39:37.923587 4746 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Jan 28 20:39:37 crc kubenswrapper[4746]: I0128 20:39:37.924335 4746 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Jan 28 20:39:37 crc kubenswrapper[4746]: I0128 20:39:37.924349 4746 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Jan 28 20:39:37 crc kubenswrapper[4746]: I0128 20:39:37.924359 4746 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Jan 28 20:39:38 crc kubenswrapper[4746]: I0128 20:39:38.455195 4746 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-kube-controller-manager/kube-controller-manager-crc"
Jan 28 20:39:38 crc kubenswrapper[4746]: I0128 20:39:38.455406 4746 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach"
Jan 28 20:39:38 crc kubenswrapper[4746]: I0128 20:39:38.456856 4746 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Jan 28 20:39:38 crc kubenswrapper[4746]: I0128 20:39:38.457164 4746 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Jan 28 20:39:38 crc kubenswrapper[4746]: I0128 20:39:38.457353 4746 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Jan 28 20:39:38 crc kubenswrapper[4746]: I0128 20:39:38.780695 4746 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2025-12-07 09:46:30.223239721 +0000 UTC
Jan 28 20:39:38 crc kubenswrapper[4746]: I0128 20:39:38.847409 4746 certificate_manager.go:356] kubernetes.io/kube-apiserver-client-kubelet: Rotating certificates
Jan 28 20:39:38 crc kubenswrapper[4746]: I0128 20:39:38.865663 4746 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-kube-controller-manager/kube-controller-manager-crc"
Jan 28 20:39:38 crc kubenswrapper[4746]: I0128 20:39:38.930189 4746 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-etcd/etcd-crc" event={"ID":"2139d3e2895fc6797b9c76a1b4c9886d","Type":"ContainerStarted","Data":"26d1a6cd0639d4035c3de44e34402b2f9c4d8bbe2feb9e6c9c32de270de5a92f"}
Jan 28 20:39:38 crc kubenswrapper[4746]: I0128 20:39:38.930279 4746 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach"
Jan 28 20:39:38 crc kubenswrapper[4746]: I0128 20:39:38.930394 4746 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach"
Jan 28 20:39:38 crc kubenswrapper[4746]: I0128 20:39:38.930922 4746 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach"
Jan 28 20:39:38 crc kubenswrapper[4746]: I0128 20:39:38.931695 4746 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Jan 28 20:39:38 crc kubenswrapper[4746]: I0128 20:39:38.931745 4746 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Jan 28 20:39:38 crc kubenswrapper[4746]: I0128 20:39:38.931763 4746 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Jan 28 20:39:38 crc kubenswrapper[4746]: I0128 20:39:38.932030 4746 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Jan 28 20:39:38 crc kubenswrapper[4746]: I0128 20:39:38.932066 4746 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Jan 28 20:39:38 crc kubenswrapper[4746]: I0128 20:39:38.932091 4746 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Jan 28 20:39:38 crc kubenswrapper[4746]: I0128 20:39:38.932818 4746 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Jan 28 20:39:38 crc kubenswrapper[4746]: I0128 20:39:38.932854 4746 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Jan 28 20:39:38 crc kubenswrapper[4746]: I0128 20:39:38.932867 4746 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Jan 28 20:39:39 crc kubenswrapper[4746]: I0128 20:39:39.069122 4746 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-kube-apiserver/kube-apiserver-crc"
Jan 28 20:39:39 crc kubenswrapper[4746]: I0128 20:39:39.225208 4746 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach"
Jan 28 20:39:39 crc kubenswrapper[4746]: I0128 20:39:39.226792 4746 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Jan 28 20:39:39 crc kubenswrapper[4746]: I0128 20:39:39.226852 4746 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Jan 28 20:39:39 crc kubenswrapper[4746]: I0128 20:39:39.226869 4746 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Jan 28 20:39:39 crc kubenswrapper[4746]: I0128 20:39:39.226905 4746 kubelet_node_status.go:76] "Attempting to register node" node="crc"
Jan 28 20:39:39 crc kubenswrapper[4746]: I0128 20:39:39.781207 4746 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2025-11-20 23:06:51.821706632 +0000 UTC
Jan 28 20:39:39 crc kubenswrapper[4746]: I0128 20:39:39.933266 4746 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach"
Jan 28 20:39:39 crc kubenswrapper[4746]: I0128 20:39:39.933355 4746 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach"
Jan 28 20:39:39 crc kubenswrapper[4746]: I0128 20:39:39.934631 4746 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Jan 28 20:39:39 crc kubenswrapper[4746]: I0128 20:39:39.934700 4746 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Jan 28 20:39:39 crc kubenswrapper[4746]: I0128 20:39:39.934722 4746 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Jan 28 20:39:39 crc kubenswrapper[4746]: I0128 20:39:39.934963 4746 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Jan 28 20:39:39 crc kubenswrapper[4746]: I0128 20:39:39.935020 4746 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Jan 28 20:39:39 crc kubenswrapper[4746]: I0128 20:39:39.935034 4746 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Jan 28 20:39:40 crc kubenswrapper[4746]: I0128 20:39:40.480214 4746 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-kube-apiserver/kube-apiserver-crc"
Jan 28 20:39:40 crc kubenswrapper[4746]: I0128 20:39:40.573304 4746 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-etcd/etcd-crc"
Jan 28 20:39:40 crc kubenswrapper[4746]: I0128 20:39:40.782185 4746 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2025-12-20 04:18:01.854146329 +0000 UTC
Jan 28 20:39:40 crc kubenswrapper[4746]: I0128 20:39:40.935632 4746 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach"
Jan 28 20:39:40 crc kubenswrapper[4746]: I0128 20:39:40.935902 4746 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach"
Jan 28 20:39:40 crc kubenswrapper[4746]: I0128 20:39:40.937376 4746 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Jan 28 20:39:40 crc kubenswrapper[4746]: I0128 20:39:40.937453 4746 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Jan 28 20:39:40 crc kubenswrapper[4746]: I0128 20:39:40.937474 4746 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Jan 28 20:39:40 crc kubenswrapper[4746]: I0128 20:39:40.937550 4746 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Jan 28 20:39:40 crc kubenswrapper[4746]: I0128 20:39:40.937593 4746 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Jan 28 20:39:40 crc kubenswrapper[4746]: I0128 20:39:40.937603 4746 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Jan 28 20:39:41 crc kubenswrapper[4746]: I0128 20:39:41.545805 4746 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-kube-controller-manager/kube-controller-manager-crc"
Jan 28 20:39:41 crc kubenswrapper[4746]: I0128 20:39:41.545998 4746 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach"
Jan 28 20:39:41 crc kubenswrapper[4746]: I0128 20:39:41.547290 4746 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Jan 28 20:39:41 crc kubenswrapper[4746]: I0128 20:39:41.547379 4746 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Jan 28 20:39:41 crc kubenswrapper[4746]: I0128 20:39:41.547425 4746 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Jan 28 20:39:41 crc kubenswrapper[4746]: I0128 20:39:41.782437 4746 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2025-12-11 05:26:30.381346923 +0000 UTC
Jan 28 20:39:42 crc kubenswrapper[4746]: I0128 20:39:42.783170 4746 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2025-11-20 00:22:00.090511421 +0000 UTC
Jan 28 20:39:42 crc kubenswrapper[4746]: E0128 20:39:42.916402 4746 eviction_manager.go:285] "Eviction manager: failed to get summary stats" err="failed to get node info: node \"crc\" not found"
Jan 28 20:39:43 crc kubenswrapper[4746]: I0128 20:39:43.333404 4746 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-kube-controller-manager/kube-controller-manager-crc"
Jan 28 20:39:43 crc kubenswrapper[4746]: I0128 20:39:43.333679 4746 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach"
Jan 28 20:39:43 crc kubenswrapper[4746]: I0128 20:39:43.335563 4746 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Jan 28 20:39:43 crc kubenswrapper[4746]: I0128 20:39:43.335603 4746 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Jan 28 20:39:43 crc kubenswrapper[4746]: I0128 20:39:43.335614 4746 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Jan 28 20:39:43 crc kubenswrapper[4746]: I0128 20:39:43.339993 4746 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-kube-controller-manager/kube-controller-manager-crc"
Jan 28 20:39:43 crc kubenswrapper[4746]: I0128 20:39:43.371753 4746 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-etcd/etcd-crc"
Jan 28 20:39:43 crc kubenswrapper[4746]: I0128 20:39:43.372143 4746 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach"
Jan 28 20:39:43 crc kubenswrapper[4746]: I0128 20:39:43.373771 4746 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Jan 28 20:39:43 crc kubenswrapper[4746]: I0128 20:39:43.373803 4746 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Jan 28 20:39:43 crc kubenswrapper[4746]: I0128 20:39:43.373814 4746 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Jan 28 20:39:43 crc kubenswrapper[4746]: I0128 20:39:43.783831 4746 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2025-11-06 23:33:22.432034761 +0000 UTC
Jan 28 20:39:43 crc kubenswrapper[4746]: I0128 20:39:43.942932 4746 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach"
Jan 28 20:39:43 crc kubenswrapper[4746]: I0128 20:39:43.943935 4746 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Jan 28 20:39:43 crc kubenswrapper[4746]: I0128 20:39:43.943977 4746 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Jan 28 20:39:43 crc kubenswrapper[4746]: I0128 20:39:43.943994 4746 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Jan 28 20:39:43 crc kubenswrapper[4746]: I0128 20:39:43.947511 4746 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-kube-controller-manager/kube-controller-manager-crc"
Jan 28 20:39:44 crc kubenswrapper[4746]: I0128 20:39:44.546586 4746 patch_prober.go:28] interesting pod/kube-controller-manager-crc container/cluster-policy-controller namespace/openshift-kube-controller-manager: Startup probe status=failure output="Get \"https://192.168.126.11:10357/healthz\": context deadline exceeded (Client.Timeout exceeded while awaiting headers)" start-of-body=
Jan 28 20:39:44 crc kubenswrapper[4746]: I0128 20:39:44.546739 4746 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-kube-controller-manager/kube-controller-manager-crc" podUID="f614b9022728cf315e60c057852e563e" containerName="cluster-policy-controller" probeResult="failure" output="Get \"https://192.168.126.11:10357/healthz\": context deadline exceeded (Client.Timeout exceeded while awaiting headers)"
Jan 28 20:39:44 crc kubenswrapper[4746]: I0128 20:39:44.784141 4746 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2025-12-30 08:13:17.278467481 +0000 UTC
Jan 28 20:39:44 crc kubenswrapper[4746]: I0128 20:39:44.946354 4746 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach"
Jan 28 20:39:44 crc kubenswrapper[4746]: I0128 20:39:44.949032 4746 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Jan 28 20:39:44 crc kubenswrapper[4746]: I0128 20:39:44.949143 4746 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Jan 28 20:39:44 crc kubenswrapper[4746]: I0128 20:39:44.949164 4746 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Jan 28 20:39:45 crc kubenswrapper[4746]: I0128 20:39:45.784937 4746 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2025-12-21 11:59:54.018057938 +0000 UTC
Jan 28 20:39:46 crc kubenswrapper[4746]: W0128 20:39:46.553135 4746 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.Service: Get "https://api-int.crc.testing:6443/api/v1/services?fieldSelector=spec.clusterIP%21%3DNone&limit=500&resourceVersion=0": net/http: TLS handshake timeout
Jan 28 20:39:46 crc kubenswrapper[4746]: I0128 20:39:46.553843 4746 trace.go:236] Trace[281555365]: "Reflector ListAndWatch" name:k8s.io/client-go/informers/factory.go:160 (28-Jan-2026 20:39:36.550) (total time: 10002ms):
Jan 28 20:39:46 crc kubenswrapper[4746]: Trace[281555365]: ---"Objects listed" error:Get "https://api-int.crc.testing:6443/api/v1/services?fieldSelector=spec.clusterIP%21%3DNone&limit=500&resourceVersion=0": net/http: TLS handshake timeout 10002ms (20:39:46.553)
Jan 28 20:39:46 crc kubenswrapper[4746]: Trace[281555365]: [10.002445316s] [10.002445316s] END
Jan 28 20:39:46 crc kubenswrapper[4746]: E0128 20:39:46.553898 4746 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.Service: failed to list *v1.Service: Get \"https://api-int.crc.testing:6443/api/v1/services?fieldSelector=spec.clusterIP%21%3DNone&limit=500&resourceVersion=0\": net/http: TLS handshake timeout" logger="UnhandledError"
Jan 28 20:39:46 crc kubenswrapper[4746]: I0128 20:39:46.555152 4746 patch_prober.go:28] interesting pod/kube-apiserver-crc container/kube-apiserver-check-endpoints namespace/openshift-kube-apiserver: Readiness probe status=failure output="Get \"https://192.168.126.11:17697/healthz\": read tcp 192.168.126.11:40170->192.168.126.11:17697: read: connection reset by peer" start-of-body=
Jan 28 20:39:46 crc kubenswrapper[4746]: I0128 20:39:46.555211 4746 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-check-endpoints" probeResult="failure" output="Get \"https://192.168.126.11:17697/healthz\": read tcp 192.168.126.11:40170->192.168.126.11:17697: read: connection reset by peer"
Jan 28 20:39:46 crc kubenswrapper[4746]: W0128 20:39:46.754774 4746 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.CSIDriver: Get "https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csidrivers?limit=500&resourceVersion=0": net/http: TLS handshake timeout
Jan 28 20:39:46 crc kubenswrapper[4746]: I0128 20:39:46.754907 4746 trace.go:236] Trace[925705444]: "Reflector ListAndWatch" name:k8s.io/client-go/informers/factory.go:160 (28-Jan-2026 20:39:36.753) (total time: 10001ms):
Jan 28 20:39:46 crc kubenswrapper[4746]: Trace[925705444]: ---"Objects listed" error:Get "https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csidrivers?limit=500&resourceVersion=0": net/http: TLS handshake timeout 10001ms (20:39:46.754)
Jan 28 20:39:46 crc kubenswrapper[4746]: Trace[925705444]: [10.001724636s] [10.001724636s] END
Jan 28 20:39:46 crc kubenswrapper[4746]: E0128 20:39:46.754932 4746 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.CSIDriver: failed to list *v1.CSIDriver: Get \"https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csidrivers?limit=500&resourceVersion=0\": net/http: TLS handshake timeout" logger="UnhandledError"
Jan 28 20:39:46 crc kubenswrapper[4746]: I0128 20:39:46.770867 4746 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: Get "https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csinodes/crc?resourceVersion=0": net/http: TLS handshake timeout
Jan 28 20:39:46 crc kubenswrapper[4746]: I0128 20:39:46.785162 4746 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2025-12-31 23:07:38.429500414 +0000 UTC
Jan 28 20:39:46 crc kubenswrapper[4746]: I0128 20:39:46.954042 4746 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-apiserver_kube-apiserver-crc_f4b27818a5e8e43d0dc095d08835c792/kube-apiserver-check-endpoints/0.log"
Jan 28 20:39:46 crc kubenswrapper[4746]: I0128 20:39:46.955864 4746 generic.go:334] "Generic (PLEG): container finished" podID="f4b27818a5e8e43d0dc095d08835c792" containerID="2bd133218655e67cdc29b14c35357368452991c316986b1fa90069554eba28c0" exitCode=255
Jan 28 20:39:46 crc kubenswrapper[4746]: I0128 20:39:46.955909 4746 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" event={"ID":"f4b27818a5e8e43d0dc095d08835c792","Type":"ContainerDied","Data":"2bd133218655e67cdc29b14c35357368452991c316986b1fa90069554eba28c0"}
Jan 28 20:39:46 crc kubenswrapper[4746]: I0128 20:39:46.956101 4746 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach"
Jan 28 20:39:46 crc kubenswrapper[4746]: I0128 20:39:46.957021 4746 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Jan 28 20:39:46 crc kubenswrapper[4746]: I0128 20:39:46.957236 4746 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Jan 28 20:39:46 crc kubenswrapper[4746]: I0128 20:39:46.957253 4746 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Jan 28 20:39:46 crc kubenswrapper[4746]: I0128 20:39:46.957859 4746 scope.go:117] "RemoveContainer" containerID="2bd133218655e67cdc29b14c35357368452991c316986b1fa90069554eba28c0"
Jan 28 20:39:47 crc kubenswrapper[4746]: W0128 20:39:47.015501 4746 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.Node: Get "https://api-int.crc.testing:6443/api/v1/nodes?fieldSelector=metadata.name%3Dcrc&limit=500&resourceVersion=0": net/http: TLS handshake timeout
Jan 28 20:39:47 crc kubenswrapper[4746]: I0128 20:39:47.015630 4746 trace.go:236] Trace[1167408024]: "Reflector ListAndWatch" name:k8s.io/client-go/informers/factory.go:160 (28-Jan-2026 20:39:37.012) (total time: 10003ms):
Jan 28 20:39:47 crc kubenswrapper[4746]: Trace[1167408024]: ---"Objects listed" error:Get "https://api-int.crc.testing:6443/api/v1/nodes?fieldSelector=metadata.name%3Dcrc&limit=500&resourceVersion=0": net/http: TLS handshake timeout 10003ms (20:39:47.015)
Jan 28 20:39:47 crc kubenswrapper[4746]: Trace[1167408024]: [10.003177997s] [10.003177997s] END
Jan 28 20:39:47 crc kubenswrapper[4746]: E0128 20:39:47.015663 4746 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.Node: failed to list *v1.Node: Get \"https://api-int.crc.testing:6443/api/v1/nodes?fieldSelector=metadata.name%3Dcrc&limit=500&resourceVersion=0\": net/http: TLS handshake timeout" logger="UnhandledError"
Jan 28 20:39:47 crc kubenswrapper[4746]: I0128 20:39:47.493043 4746 patch_prober.go:28] interesting pod/kube-apiserver-crc container/kube-apiserver namespace/openshift-kube-apiserver: Startup probe status=failure output="HTTP probe failed with statuscode: 403" start-of-body={"kind":"Status","apiVersion":"v1","metadata":{},"status":"Failure","message":"forbidden: User \"system:anonymous\" cannot get path \"/livez\"","reason":"Forbidden","details":{},"code":403}
Jan 28 20:39:47 crc kubenswrapper[4746]: I0128 20:39:47.493115 4746 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver" probeResult="failure" output="HTTP probe failed with statuscode: 403"
Jan 28 20:39:47 crc kubenswrapper[4746]: I0128 20:39:47.497067 4746 patch_prober.go:28] interesting pod/kube-apiserver-crc container/kube-apiserver namespace/openshift-kube-apiserver: Startup probe status=failure output="HTTP probe failed with statuscode: 403" start-of-body={"kind":"Status","apiVersion":"v1","metadata":{},"status":"Failure","message":"forbidden: User \"system:anonymous\" cannot get path \"/livez\"","reason":"Forbidden","details":{},"code":403}
Jan 28 20:39:47 crc kubenswrapper[4746]: I0128 20:39:47.497122 4746 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver" probeResult="failure" output="HTTP probe failed with statuscode: 403"
Jan 28 20:39:47 crc kubenswrapper[4746]: I0128 20:39:47.785538 4746 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2025-12-15 09:10:31.954390727 +0000 UTC
Jan 28 20:39:47 crc kubenswrapper[4746]: I0128 20:39:47.963858 4746 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-apiserver_kube-apiserver-crc_f4b27818a5e8e43d0dc095d08835c792/kube-apiserver-check-endpoints/0.log"
Jan 28 20:39:47 crc kubenswrapper[4746]: I0128 20:39:47.966267 4746 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" event={"ID":"f4b27818a5e8e43d0dc095d08835c792","Type":"ContainerStarted","Data":"56b250a192cfa698ccd7ac8233b75b9acbf2347549c712b47ccc1b89f144aa83"}
Jan 28 20:39:47 crc kubenswrapper[4746]: I0128 20:39:47.966479 4746 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach"
Jan 28 20:39:47 crc kubenswrapper[4746]: I0128 20:39:47.967701 4746 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Jan 28 20:39:47 crc kubenswrapper[4746]: I0128 20:39:47.967734 4746 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Jan 28 20:39:47 crc kubenswrapper[4746]: I0128 20:39:47.967746 4746 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Jan 28 20:39:48 crc kubenswrapper[4746]: I0128 20:39:48.786610 4746 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2025-12-18 19:06:58.803199181 +0000 UTC
Jan 28 20:39:49 crc kubenswrapper[4746]: I0128 20:39:49.787609 4746 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2026-01-03 21:09:40.420223701 +0000 UTC
Jan 28 20:39:50 crc kubenswrapper[4746]: I0128 20:39:50.484423 4746 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-kube-apiserver/kube-apiserver-crc"
Jan 28 20:39:50 crc kubenswrapper[4746]: I0128 20:39:50.484577 4746 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach"
Jan 28 20:39:50 crc kubenswrapper[4746]: I0128 20:39:50.484660 4746 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-kube-apiserver/kube-apiserver-crc"
Jan 28 20:39:50 crc kubenswrapper[4746]: I0128 20:39:50.485716 4746 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Jan 28 20:39:50 crc kubenswrapper[4746]: I0128 20:39:50.485746 4746 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Jan 28 20:39:50 crc kubenswrapper[4746]: I0128 20:39:50.485756 4746 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Jan 28 20:39:50 crc kubenswrapper[4746]: I0128 20:39:50.490875 4746 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-kube-apiserver/kube-apiserver-crc"
Jan 28 20:39:50 crc kubenswrapper[4746]: I0128 20:39:50.618440 4746 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-etcd/etcd-crc"
Jan 28 20:39:50 crc kubenswrapper[4746]: I0128 20:39:50.618640 4746 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach"
Jan 28 20:39:50 crc kubenswrapper[4746]: I0128 20:39:50.619786 4746 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Jan 28 20:39:50 crc kubenswrapper[4746]: I0128 20:39:50.619819 4746 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Jan 28 20:39:50 crc kubenswrapper[4746]: I0128 20:39:50.619828 4746 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Jan 28 20:39:50 crc kubenswrapper[4746]: I0128 20:39:50.637734 4746 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-etcd/etcd-crc"
Jan 28 20:39:50 crc kubenswrapper[4746]: I0128 20:39:50.788555 4746 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2025-11-18 00:56:23.237979038 +0000 UTC
Jan 28 20:39:50 crc kubenswrapper[4746]: I0128 20:39:50.974797 4746 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach"
Jan 28 20:39:50 crc kubenswrapper[4746]: I0128 20:39:50.974859 4746 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach"
Jan 28 20:39:50 crc kubenswrapper[4746]: I0128 20:39:50.975974 4746 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Jan 28 20:39:50 crc kubenswrapper[4746]: I0128 20:39:50.976019 4746 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Jan 28 20:39:50 crc kubenswrapper[4746]: I0128 20:39:50.976017 4746 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Jan 28 20:39:50 crc kubenswrapper[4746]: I0128 20:39:50.976066 4746 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Jan 28 20:39:50 crc kubenswrapper[4746]: I0128 20:39:50.976034 4746 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Jan 28 20:39:50 crc kubenswrapper[4746]: I0128 20:39:50.976133 4746 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Jan 28 20:39:51 crc kubenswrapper[4746]: I0128 20:39:51.675843 4746 reflector.go:368] Caches populated for *v1.Service from k8s.io/client-go/informers/factory.go:160
Jan 28 20:39:51 crc kubenswrapper[4746]: I0128 20:39:51.789248 4746 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2025-12-28 13:17:35.286097455 +0000 UTC
Jan 28 20:39:51 crc kubenswrapper[4746]: I0128 20:39:51.977957 4746 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach"
Jan 28 20:39:51 crc kubenswrapper[4746]: I0128 20:39:51.978954 4746 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Jan 28 20:39:51 crc kubenswrapper[4746]: I0128 20:39:51.979010 4746 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Jan 28 20:39:51 crc kubenswrapper[4746]: I0128 20:39:51.979024 4746 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Jan 28 20:39:52 crc kubenswrapper[4746]: E0128 20:39:52.497997 4746 controller.go:145] "Failed to ensure lease exists, will retry" err="Get \"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/crc?timeout=10s\": context deadline exceeded" interval="6.4s"
Jan 28 20:39:52 crc kubenswrapper[4746]: I0128 20:39:52.499957 4746 trace.go:236] Trace[142488718]: "Reflector ListAndWatch" name:k8s.io/client-go/informers/factory.go:160 (28-Jan-2026 20:39:37.499) (total time: 14999ms):
Jan 28 20:39:52 crc kubenswrapper[4746]: Trace[142488718]: ---"Objects listed" error: 14999ms (20:39:52.499)
Jan 28 20:39:52 crc kubenswrapper[4746]: Trace[142488718]: [14.999880133s] [14.999880133s] END
Jan 28 20:39:52 crc kubenswrapper[4746]: I0128 20:39:52.500066 4746 reflector.go:368] Caches populated for *v1.RuntimeClass from k8s.io/client-go/informers/factory.go:160
Jan 28 20:39:52 crc kubenswrapper[4746]: I0128 20:39:52.502543 4746 reconstruct.go:205] "DevicePaths of reconstructed volumes updated"
Jan 28 20:39:52 crc kubenswrapper[4746]: E0128 20:39:52.510128 4746 kubelet_node_status.go:99] "Unable to register node with API server" err="nodes \"crc\" is forbidden: autoscaling.openshift.io/ManagedNode infra config cache not synchronized" node="crc"
Jan 28 20:39:52 crc kubenswrapper[4746]: I0128 20:39:52.520145 4746 reflector.go:368] Caches populated for *v1.CertificateSigningRequest from k8s.io/client-go/tools/watch/informerwatcher.go:146
Jan 28 20:39:52 crc kubenswrapper[4746]: I0128 20:39:52.535965 4746 csr.go:261] certificate signing request csr-c9s2v is approved, waiting to be issued
Jan 28 20:39:52 crc kubenswrapper[4746]: I0128 20:39:52.548746 4746 csr.go:257] certificate signing request csr-c9s2v is issued
Jan 28 20:39:52 crc kubenswrapper[4746]: I0128 20:39:52.620669 4746 transport.go:147] "Certificate rotation detected, shutting down client connections to start using new credentials"
Jan 28 20:39:52 crc kubenswrapper[4746]: W0128 20:39:52.621009 4746 reflector.go:484] k8s.io/client-go/informers/factory.go:160: watch of *v1.Service ended with: very short watch: k8s.io/client-go/informers/factory.go:160: Unexpected watch close - watch lasted less than a second and no items received
Jan 28 20:39:52 crc kubenswrapper[4746]: W0128 20:39:52.621114 4746 reflector.go:484] k8s.io/client-go/informers/factory.go:160: watch of *v1.RuntimeClass ended with: very short watch: k8s.io/client-go/informers/factory.go:160: Unexpected watch close - watch lasted less than a second and no items received
Jan 28 20:39:52 crc kubenswrapper[4746]: E0128 20:39:52.621115 4746 event.go:368] "Unable to write event (may retry after sleeping)" err="Patch \"https://api-int.crc.testing:6443/api/v1/namespaces/default/events/crc.188effa1874a218f\": read tcp 38.102.83.201:43968->38.102.83.201:6443: use of closed network connection" event="&Event{ObjectMeta:{crc.188effa1874a218f default 26207 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Node,Namespace:,Name:crc,UID:crc,APIVersion:,ResourceVersion:,FieldPath:,},Reason:NodeHasSufficientPID,Message:Node crc status is now: NodeHasSufficientPID,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-01-28 20:39:32 +0000 UTC,LastTimestamp:2026-01-28 20:39:32.943251288 +0000 UTC m=+0.899437652,Count:7,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}"
Jan 28 20:39:52 crc kubenswrapper[4746]: I0128 20:39:52.691334 4746 reflector.go:368] Caches populated for *v1.Node from k8s.io/client-go/informers/factory.go:160
Jan 28 20:39:52 crc kubenswrapper[4746]: I0128 20:39:52.733408 4746 reflector.go:368] Caches populated for *v1.CSIDriver from k8s.io/client-go/informers/factory.go:160
Jan 28 20:39:52 crc kubenswrapper[4746]: I0128 20:39:52.769121 4746 apiserver.go:52] "Watching apiserver"
Jan 28 20:39:52 crc kubenswrapper[4746]: I0128 20:39:52.771959 4746 reflector.go:368] Caches populated for *v1.Pod from pkg/kubelet/config/apiserver.go:66
Jan 28 20:39:52 crc kubenswrapper[4746]: I0128 20:39:52.772293 4746 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-network-node-identity/network-node-identity-vrzqb","openshift-network-operator/iptables-alerter-4ln5h","openshift-network-operator/network-operator-58b4c7f79c-55gtf","openshift-network-console/networking-console-plugin-85b44fc459-gdk6g","openshift-network-diagnostics/network-check-source-55646444c4-trplf","openshift-network-diagnostics/network-check-target-xd92c"]
Jan 28 20:39:52 crc kubenswrapper[4746]: I0128 20:39:52.772926 4746 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf"
Jan 28 20:39:52 crc kubenswrapper[4746]: I0128 20:39:52.772961 4746 util.go:30] "No sandbox for pod can be found.
Need to start a new one" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" Jan 28 20:39:52 crc kubenswrapper[4746]: I0128 20:39:52.772975 4746 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Jan 28 20:39:52 crc kubenswrapper[4746]: I0128 20:39:52.773014 4746 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-operator/iptables-alerter-4ln5h" Jan 28 20:39:52 crc kubenswrapper[4746]: E0128 20:39:52.773020 4746 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Jan 28 20:39:52 crc kubenswrapper[4746]: I0128 20:39:52.772944 4746 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Jan 28 20:39:52 crc kubenswrapper[4746]: I0128 20:39:52.773296 4746 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-node-identity/network-node-identity-vrzqb" Jan 28 20:39:52 crc kubenswrapper[4746]: E0128 20:39:52.773356 4746 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Jan 28 20:39:52 crc kubenswrapper[4746]: E0128 20:39:52.773414 4746 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Jan 28 20:39:52 crc kubenswrapper[4746]: I0128 20:39:52.775780 4746 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-operator"/"kube-root-ca.crt" Jan 28 20:39:52 crc kubenswrapper[4746]: I0128 20:39:52.777334 4746 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-node-identity"/"ovnkube-identity-cm" Jan 28 20:39:52 crc kubenswrapper[4746]: I0128 20:39:52.777592 4746 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-node-identity"/"kube-root-ca.crt" Jan 28 20:39:52 crc kubenswrapper[4746]: I0128 20:39:52.777642 4746 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-node-identity"/"openshift-service-ca.crt" Jan 28 20:39:52 crc kubenswrapper[4746]: I0128 20:39:52.777816 4746 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-operator"/"iptables-alerter-script" Jan 28 20:39:52 crc kubenswrapper[4746]: I0128 20:39:52.777916 4746 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-network-node-identity"/"network-node-identity-cert" Jan 28 20:39:52 crc kubenswrapper[4746]: I0128 20:39:52.779320 4746 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-operator"/"openshift-service-ca.crt" Jan 28 20:39:52 crc kubenswrapper[4746]: I0128 20:39:52.780140 4746 
desired_state_of_world_populator.go:154] "Finished populating initial desired state of world" Jan 28 20:39:52 crc kubenswrapper[4746]: I0128 20:39:52.783500 4746 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-node-identity"/"env-overrides" Jan 28 20:39:52 crc kubenswrapper[4746]: I0128 20:39:52.785117 4746 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-network-operator"/"metrics-tls" Jan 28 20:39:52 crc kubenswrapper[4746]: I0128 20:39:52.789458 4746 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2025-11-29 19:02:47.194126893 +0000 UTC Jan 28 20:39:52 crc kubenswrapper[4746]: I0128 20:39:52.791198 4746 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-dns/node-resolver-gcrxx"] Jan 28 20:39:52 crc kubenswrapper[4746]: I0128 20:39:52.791673 4746 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-dns/node-resolver-gcrxx" Jan 28 20:39:52 crc kubenswrapper[4746]: I0128 20:39:52.794169 4746 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-dns"/"node-resolver-dockercfg-kz9s7" Jan 28 20:39:52 crc kubenswrapper[4746]: I0128 20:39:52.794288 4746 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-dns"/"kube-root-ca.crt" Jan 28 20:39:52 crc kubenswrapper[4746]: I0128 20:39:52.794384 4746 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-dns"/"openshift-service-ca.crt" Jan 28 20:39:52 crc kubenswrapper[4746]: I0128 20:39:52.803535 4746 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/8f668bae-612b-4b75-9490-919e737c6a3b-registry-tls\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Jan 28 20:39:52 crc kubenswrapper[4746]: I0128 20:39:52.803601 4746 reconciler_common.go:159] 
"operationExecutor.UnmountVolume started for volume \"package-server-manager-serving-cert\" (UniqueName: \"kubernetes.io/secret/3ab1a177-2de0-46d9-b765-d0d0649bb42e-package-server-manager-serving-cert\") pod \"3ab1a177-2de0-46d9-b765-d0d0649bb42e\" (UID: \"3ab1a177-2de0-46d9-b765-d0d0649bb42e\") " Jan 28 20:39:52 crc kubenswrapper[4746]: I0128 20:39:52.803646 4746 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-system-serving-cert\" (UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-serving-cert\") pod \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\" (UID: \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\") " Jan 28 20:39:52 crc kubenswrapper[4746]: I0128 20:39:52.803675 4746 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b-serving-cert\") pod \"8cea82b4-6893-4ddc-af9f-1bb5ae425c5b\" (UID: \"8cea82b4-6893-4ddc-af9f-1bb5ae425c5b\") " Jan 28 20:39:52 crc kubenswrapper[4746]: I0128 20:39:52.803700 4746 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-wxkg8\" (UniqueName: \"kubernetes.io/projected/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59-kube-api-access-wxkg8\") pod \"3cb93b32-e0ae-4377-b9c8-fdb9842c6d59\" (UID: \"3cb93b32-e0ae-4377-b9c8-fdb9842c6d59\") " Jan 28 20:39:52 crc kubenswrapper[4746]: I0128 20:39:52.803887 4746 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Jan 28 20:39:52 crc kubenswrapper[4746]: I0128 20:39:52.803918 4746 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-sb6h7\" (UniqueName: 
\"kubernetes.io/projected/1bf7eb37-55a3-4c65-b768-a94c82151e69-kube-api-access-sb6h7\") pod \"1bf7eb37-55a3-4c65-b768-a94c82151e69\" (UID: \"1bf7eb37-55a3-4c65-b768-a94c82151e69\") " Jan 28 20:39:52 crc kubenswrapper[4746]: I0128 20:39:52.803983 4746 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-zgdk5\" (UniqueName: \"kubernetes.io/projected/31d8b7a1-420e-4252-a5b7-eebe8a111292-kube-api-access-zgdk5\") pod \"31d8b7a1-420e-4252-a5b7-eebe8a111292\" (UID: \"31d8b7a1-420e-4252-a5b7-eebe8a111292\") " Jan 28 20:39:52 crc kubenswrapper[4746]: I0128 20:39:52.804026 4746 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-279lb\" (UniqueName: \"kubernetes.io/projected/7bb08738-c794-4ee8-9972-3a62ca171029-kube-api-access-279lb\") pod \"7bb08738-c794-4ee8-9972-3a62ca171029\" (UID: \"7bb08738-c794-4ee8-9972-3a62ca171029\") " Jan 28 20:39:52 crc kubenswrapper[4746]: I0128 20:39:52.804048 4746 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/c03ee662-fb2f-4fc4-a2c1-af487c19d254-service-ca-bundle\") pod \"c03ee662-fb2f-4fc4-a2c1-af487c19d254\" (UID: \"c03ee662-fb2f-4fc4-a2c1-af487c19d254\") " Jan 28 20:39:52 crc kubenswrapper[4746]: I0128 20:39:52.804048 4746 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/3ab1a177-2de0-46d9-b765-d0d0649bb42e-package-server-manager-serving-cert" (OuterVolumeSpecName: "package-server-manager-serving-cert") pod "3ab1a177-2de0-46d9-b765-d0d0649bb42e" (UID: "3ab1a177-2de0-46d9-b765-d0d0649bb42e"). InnerVolumeSpecName "package-server-manager-serving-cert". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 28 20:39:52 crc kubenswrapper[4746]: I0128 20:39:52.804089 4746 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-mg5zb\" (UniqueName: \"kubernetes.io/projected/6402fda4-df10-493c-b4e5-d0569419652d-kube-api-access-mg5zb\") pod \"6402fda4-df10-493c-b4e5-d0569419652d\" (UID: \"6402fda4-df10-493c-b4e5-d0569419652d\") " Jan 28 20:39:52 crc kubenswrapper[4746]: I0128 20:39:52.804161 4746 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/8f668bae-612b-4b75-9490-919e737c6a3b-registry-tls" (OuterVolumeSpecName: "registry-tls") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b"). InnerVolumeSpecName "registry-tls". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 28 20:39:52 crc kubenswrapper[4746]: I0128 20:39:52.804222 4746 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-console-config\") pod \"43509403-f426-496e-be36-56cef71462f5\" (UID: \"43509403-f426-496e-be36-56cef71462f5\") " Jan 28 20:39:52 crc kubenswrapper[4746]: I0128 20:39:52.804257 4746 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovnkube-script-lib\" (UniqueName: \"kubernetes.io/configmap/6ea678ab-3438-413e-bfe3-290ae7725660-ovnkube-script-lib\") pod \"6ea678ab-3438-413e-bfe3-290ae7725660\" (UID: \"6ea678ab-3438-413e-bfe3-290ae7725660\") " Jan 28 20:39:52 crc kubenswrapper[4746]: I0128 20:39:52.804286 4746 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-service-ca\") pod \"43509403-f426-496e-be36-56cef71462f5\" (UID: \"43509403-f426-496e-be36-56cef71462f5\") " Jan 28 20:39:52 crc kubenswrapper[4746]: I0128 20:39:52.804320 4746 
reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-4d4hj\" (UniqueName: \"kubernetes.io/projected/3ab1a177-2de0-46d9-b765-d0d0649bb42e-kube-api-access-4d4hj\") pod \"3ab1a177-2de0-46d9-b765-d0d0649bb42e\" (UID: \"3ab1a177-2de0-46d9-b765-d0d0649bb42e\") " Jan 28 20:39:52 crc kubenswrapper[4746]: I0128 20:39:52.804347 4746 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-user-template-provider-selection\" (UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-template-provider-selection\") pod \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\" (UID: \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\") " Jan 28 20:39:52 crc kubenswrapper[4746]: I0128 20:39:52.804374 4746 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-system-session\" (UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-session\") pod \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\" (UID: \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\") " Jan 28 20:39:52 crc kubenswrapper[4746]: I0128 20:39:52.804399 4746 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/8f668bae-612b-4b75-9490-919e737c6a3b-registry-certificates\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Jan 28 20:39:52 crc kubenswrapper[4746]: I0128 20:39:52.804450 4746 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-vt5rc\" (UniqueName: \"kubernetes.io/projected/44663579-783b-4372-86d6-acf235a62d72-kube-api-access-vt5rc\") pod \"44663579-783b-4372-86d6-acf235a62d72\" (UID: \"44663579-783b-4372-86d6-acf235a62d72\") " Jan 28 20:39:52 crc kubenswrapper[4746]: I0128 20:39:52.804468 4746 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume 
"kubernetes.io/projected/6402fda4-df10-493c-b4e5-d0569419652d-kube-api-access-mg5zb" (OuterVolumeSpecName: "kube-api-access-mg5zb") pod "6402fda4-df10-493c-b4e5-d0569419652d" (UID: "6402fda4-df10-493c-b4e5-d0569419652d"). InnerVolumeSpecName "kube-api-access-mg5zb". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 28 20:39:52 crc kubenswrapper[4746]: I0128 20:39:52.804485 4746 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/1bf7eb37-55a3-4c65-b768-a94c82151e69-etcd-client\") pod \"1bf7eb37-55a3-4c65-b768-a94c82151e69\" (UID: \"1bf7eb37-55a3-4c65-b768-a94c82151e69\") " Jan 28 20:39:52 crc kubenswrapper[4746]: I0128 20:39:52.804516 4746 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/496e6271-fb68-4057-954e-a0d97a4afa3f-kube-api-access\") pod \"496e6271-fb68-4057-954e-a0d97a4afa3f\" (UID: \"496e6271-fb68-4057-954e-a0d97a4afa3f\") " Jan 28 20:39:52 crc kubenswrapper[4746]: I0128 20:39:52.804543 4746 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/87cf06ed-a83f-41a7-828d-70653580a8cb-metrics-tls\") pod \"87cf06ed-a83f-41a7-828d-70653580a8cb\" (UID: \"87cf06ed-a83f-41a7-828d-70653580a8cb\") " Jan 28 20:39:52 crc kubenswrapper[4746]: I0128 20:39:52.804573 4746 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/bf126b07-da06-4140-9a57-dfd54fc6b486-trusted-ca\") pod \"bf126b07-da06-4140-9a57-dfd54fc6b486\" (UID: \"bf126b07-da06-4140-9a57-dfd54fc6b486\") " Jan 28 20:39:52 crc kubenswrapper[4746]: I0128 20:39:52.804598 4746 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etcd-ca\" (UniqueName: \"kubernetes.io/configmap/09efc573-dbb6-4249-bd59-9b87aba8dd28-etcd-ca\") pod 
\"09efc573-dbb6-4249-bd59-9b87aba8dd28\" (UID: \"09efc573-dbb6-4249-bd59-9b87aba8dd28\") " Jan 28 20:39:52 crc kubenswrapper[4746]: I0128 20:39:52.804623 4746 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-pj782\" (UniqueName: \"kubernetes.io/projected/b6cd30de-2eeb-49a2-ab40-9167f4560ff5-kube-api-access-pj782\") pod \"b6cd30de-2eeb-49a2-ab40-9167f4560ff5\" (UID: \"b6cd30de-2eeb-49a2-ab40-9167f4560ff5\") " Jan 28 20:39:52 crc kubenswrapper[4746]: I0128 20:39:52.804648 4746 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-tk88c\" (UniqueName: \"kubernetes.io/projected/7539238d-5fe0-46ed-884e-1c3b566537ec-kube-api-access-tk88c\") pod \"7539238d-5fe0-46ed-884e-1c3b566537ec\" (UID: \"7539238d-5fe0-46ed-884e-1c3b566537ec\") " Jan 28 20:39:52 crc kubenswrapper[4746]: I0128 20:39:52.804674 4746 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"marketplace-operator-metrics\" (UniqueName: \"kubernetes.io/secret/b6cd30de-2eeb-49a2-ab40-9167f4560ff5-marketplace-operator-metrics\") pod \"b6cd30de-2eeb-49a2-ab40-9167f4560ff5\" (UID: \"b6cd30de-2eeb-49a2-ab40-9167f4560ff5\") " Jan 28 20:39:52 crc kubenswrapper[4746]: I0128 20:39:52.804703 4746 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovn-control-plane-metrics-cert\" (UniqueName: \"kubernetes.io/secret/925f1c65-6136-48ba-85aa-3a3b50560753-ovn-control-plane-metrics-cert\") pod \"925f1c65-6136-48ba-85aa-3a3b50560753\" (UID: \"925f1c65-6136-48ba-85aa-3a3b50560753\") " Jan 28 20:39:52 crc kubenswrapper[4746]: I0128 20:39:52.804730 4746 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-xcgwh\" (UniqueName: \"kubernetes.io/projected/fda69060-fa79-4696-b1a6-7980f124bf7c-kube-api-access-xcgwh\") pod \"fda69060-fa79-4696-b1a6-7980f124bf7c\" (UID: \"fda69060-fa79-4696-b1a6-7980f124bf7c\") " Jan 28 20:39:52 crc kubenswrapper[4746]: 
I0128 20:39:52.804755 4746 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"available-featuregates\" (UniqueName: \"kubernetes.io/empty-dir/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d-available-featuregates\") pod \"bc5039c0-ea34-426b-a2b7-fbbc87b49a6d\" (UID: \"bc5039c0-ea34-426b-a2b7-fbbc87b49a6d\") " Jan 28 20:39:52 crc kubenswrapper[4746]: I0128 20:39:52.804779 4746 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"multus-daemon-config\" (UniqueName: \"kubernetes.io/configmap/4bb40260-dbaa-4fb0-84df-5e680505d512-multus-daemon-config\") pod \"4bb40260-dbaa-4fb0-84df-5e680505d512\" (UID: \"4bb40260-dbaa-4fb0-84df-5e680505d512\") " Jan 28 20:39:52 crc kubenswrapper[4746]: I0128 20:39:52.804803 4746 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-x4zgh\" (UniqueName: \"kubernetes.io/projected/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d-kube-api-access-x4zgh\") pod \"b11524ee-3fca-4b1b-9cdf-6da289fdbc7d\" (UID: \"b11524ee-3fca-4b1b-9cdf-6da289fdbc7d\") " Jan 28 20:39:52 crc kubenswrapper[4746]: I0128 20:39:52.804824 4746 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c-serving-cert\") pod \"210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c\" (UID: \"210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c\") " Jan 28 20:39:52 crc kubenswrapper[4746]: I0128 20:39:52.804847 4746 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/22c825df-677d-4ca6-82db-3454ed06e783-config\") pod \"22c825df-677d-4ca6-82db-3454ed06e783\" (UID: \"22c825df-677d-4ca6-82db-3454ed06e783\") " Jan 28 20:39:52 crc kubenswrapper[4746]: I0128 20:39:52.804869 4746 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-lz9wn\" (UniqueName: 
\"kubernetes.io/projected/a31745f5-9847-4afe-82a5-3161cc66ca93-kube-api-access-lz9wn\") pod \"a31745f5-9847-4afe-82a5-3161cc66ca93\" (UID: \"a31745f5-9847-4afe-82a5-3161cc66ca93\") " Jan 28 20:39:52 crc kubenswrapper[4746]: I0128 20:39:52.804892 4746 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/31d8b7a1-420e-4252-a5b7-eebe8a111292-proxy-tls\") pod \"31d8b7a1-420e-4252-a5b7-eebe8a111292\" (UID: \"31d8b7a1-420e-4252-a5b7-eebe8a111292\") " Jan 28 20:39:52 crc kubenswrapper[4746]: I0128 20:39:52.804920 4746 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-2w9zh\" (UniqueName: \"kubernetes.io/projected/4bb40260-dbaa-4fb0-84df-5e680505d512-kube-api-access-2w9zh\") pod \"4bb40260-dbaa-4fb0-84df-5e680505d512\" (UID: \"4bb40260-dbaa-4fb0-84df-5e680505d512\") " Jan 28 20:39:52 crc kubenswrapper[4746]: I0128 20:39:52.804933 4746 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/1bf7eb37-55a3-4c65-b768-a94c82151e69-etcd-client" (OuterVolumeSpecName: "etcd-client") pod "1bf7eb37-55a3-4c65-b768-a94c82151e69" (UID: "1bf7eb37-55a3-4c65-b768-a94c82151e69"). InnerVolumeSpecName "etcd-client". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 28 20:39:52 crc kubenswrapper[4746]: I0128 20:39:52.804953 4746 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-6g6sz\" (UniqueName: \"kubernetes.io/projected/6509e943-70c6-444c-bc41-48a544e36fbd-kube-api-access-6g6sz\") pod \"6509e943-70c6-444c-bc41-48a544e36fbd\" (UID: \"6509e943-70c6-444c-bc41-48a544e36fbd\") " Jan 28 20:39:52 crc kubenswrapper[4746]: I0128 20:39:52.804944 4746 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/7bb08738-c794-4ee8-9972-3a62ca171029-kube-api-access-279lb" (OuterVolumeSpecName: "kube-api-access-279lb") pod "7bb08738-c794-4ee8-9972-3a62ca171029" (UID: "7bb08738-c794-4ee8-9972-3a62ca171029"). InnerVolumeSpecName "kube-api-access-279lb". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 28 20:39:52 crc kubenswrapper[4746]: I0128 20:39:52.804942 4746 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/1bf7eb37-55a3-4c65-b768-a94c82151e69-kube-api-access-sb6h7" (OuterVolumeSpecName: "kube-api-access-sb6h7") pod "1bf7eb37-55a3-4c65-b768-a94c82151e69" (UID: "1bf7eb37-55a3-4c65-b768-a94c82151e69"). InnerVolumeSpecName "kube-api-access-sb6h7". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 28 20:39:52 crc kubenswrapper[4746]: I0128 20:39:52.805001 4746 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d-serving-cert\") pod \"bc5039c0-ea34-426b-a2b7-fbbc87b49a6d\" (UID: \"bc5039c0-ea34-426b-a2b7-fbbc87b49a6d\") " Jan 28 20:39:52 crc kubenswrapper[4746]: I0128 20:39:52.805026 4746 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-w4xd4\" (UniqueName: \"kubernetes.io/projected/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b-kube-api-access-w4xd4\") pod \"8cea82b4-6893-4ddc-af9f-1bb5ae425c5b\" (UID: \"8cea82b4-6893-4ddc-af9f-1bb5ae425c5b\") " Jan 28 20:39:52 crc kubenswrapper[4746]: I0128 20:39:52.805052 4746 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-fqsjt\" (UniqueName: \"kubernetes.io/projected/efdd0498-1daa-4136-9a4a-3b948c2293fc-kube-api-access-fqsjt\") pod \"efdd0498-1daa-4136-9a4a-3b948c2293fc\" (UID: \"efdd0498-1daa-4136-9a4a-3b948c2293fc\") " Jan 28 20:39:52 crc kubenswrapper[4746]: I0128 20:39:52.805096 4746 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"profile-collector-cert\" (UniqueName: \"kubernetes.io/secret/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9-profile-collector-cert\") pod \"f88749ec-7931-4ee7-b3fc-1ec5e11f92e9\" (UID: \"f88749ec-7931-4ee7-b3fc-1ec5e11f92e9\") " Jan 28 20:39:52 crc kubenswrapper[4746]: I0128 20:39:52.805122 4746 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-system-cliconfig\" (UniqueName: \"kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-cliconfig\") pod \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\" (UID: \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\") " Jan 28 20:39:52 crc kubenswrapper[4746]: I0128 20:39:52.805147 4746 reconciler_common.go:159] 
"operationExecutor.UnmountVolume started for volume \"srv-cert\" (UniqueName: \"kubernetes.io/secret/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9-srv-cert\") pod \"f88749ec-7931-4ee7-b3fc-1ec5e11f92e9\" (UID: \"f88749ec-7931-4ee7-b3fc-1ec5e11f92e9\") " Jan 28 20:39:52 crc kubenswrapper[4746]: I0128 20:39:52.805174 4746 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/01ab3dd5-8196-46d0-ad33-122e2ca51def-serving-cert\") pod \"01ab3dd5-8196-46d0-ad33-122e2ca51def\" (UID: \"01ab3dd5-8196-46d0-ad33-122e2ca51def\") " Jan 28 20:39:52 crc kubenswrapper[4746]: I0128 20:39:52.805202 4746 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-d4lsv\" (UniqueName: \"kubernetes.io/projected/25e176fe-21b4-4974-b1ed-c8b94f112a7f-kube-api-access-d4lsv\") pod \"25e176fe-21b4-4974-b1ed-c8b94f112a7f\" (UID: \"25e176fe-21b4-4974-b1ed-c8b94f112a7f\") " Jan 28 20:39:52 crc kubenswrapper[4746]: I0128 20:39:52.805212 4746 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/fda69060-fa79-4696-b1a6-7980f124bf7c-kube-api-access-xcgwh" (OuterVolumeSpecName: "kube-api-access-xcgwh") pod "fda69060-fa79-4696-b1a6-7980f124bf7c" (UID: "fda69060-fa79-4696-b1a6-7980f124bf7c"). InnerVolumeSpecName "kube-api-access-xcgwh". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 28 20:39:52 crc kubenswrapper[4746]: I0128 20:39:52.805248 4746 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/44663579-783b-4372-86d6-acf235a62d72-kube-api-access-vt5rc" (OuterVolumeSpecName: "kube-api-access-vt5rc") pod "44663579-783b-4372-86d6-acf235a62d72" (UID: "44663579-783b-4372-86d6-acf235a62d72"). InnerVolumeSpecName "kube-api-access-vt5rc". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 28 20:39:52 crc kubenswrapper[4746]: I0128 20:39:52.805227 4746 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/09efc573-dbb6-4249-bd59-9b87aba8dd28-etcd-client\") pod \"09efc573-dbb6-4249-bd59-9b87aba8dd28\" (UID: \"09efc573-dbb6-4249-bd59-9b87aba8dd28\") " Jan 28 20:39:52 crc kubenswrapper[4746]: I0128 20:39:52.805304 4746 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/31d8b7a1-420e-4252-a5b7-eebe8a111292-kube-api-access-zgdk5" (OuterVolumeSpecName: "kube-api-access-zgdk5") pod "31d8b7a1-420e-4252-a5b7-eebe8a111292" (UID: "31d8b7a1-420e-4252-a5b7-eebe8a111292"). InnerVolumeSpecName "kube-api-access-zgdk5". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 28 20:39:52 crc kubenswrapper[4746]: I0128 20:39:52.805320 4746 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/3ab1a177-2de0-46d9-b765-d0d0649bb42e-kube-api-access-4d4hj" (OuterVolumeSpecName: "kube-api-access-4d4hj") pod "3ab1a177-2de0-46d9-b765-d0d0649bb42e" (UID: "3ab1a177-2de0-46d9-b765-d0d0649bb42e"). InnerVolumeSpecName "kube-api-access-4d4hj". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 28 20:39:52 crc kubenswrapper[4746]: I0128 20:39:52.805325 4746 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"node-bootstrap-token\" (UniqueName: \"kubernetes.io/secret/5fe579f8-e8a6-4643-bce5-a661393c4dde-node-bootstrap-token\") pod \"5fe579f8-e8a6-4643-bce5-a661393c4dde\" (UID: \"5fe579f8-e8a6-4643-bce5-a661393c4dde\") " Jan 28 20:39:52 crc kubenswrapper[4746]: I0128 20:39:52.805395 4746 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"encryption-config\" (UniqueName: \"kubernetes.io/secret/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-encryption-config\") pod \"09ae3b1a-e8e7-4524-b54b-61eab6f9239a\" (UID: \"09ae3b1a-e8e7-4524-b54b-61eab6f9239a\") " Jan 28 20:39:52 crc kubenswrapper[4746]: I0128 20:39:52.805426 4746 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d-utilities\") pod \"b11524ee-3fca-4b1b-9cdf-6da289fdbc7d\" (UID: \"b11524ee-3fca-4b1b-9cdf-6da289fdbc7d\") " Jan 28 20:39:52 crc kubenswrapper[4746]: I0128 20:39:52.805430 4746 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/496e6271-fb68-4057-954e-a0d97a4afa3f-kube-api-access" (OuterVolumeSpecName: "kube-api-access") pod "496e6271-fb68-4057-954e-a0d97a4afa3f" (UID: "496e6271-fb68-4057-954e-a0d97a4afa3f"). InnerVolumeSpecName "kube-api-access". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 28 20:39:52 crc kubenswrapper[4746]: I0128 20:39:52.805452 4746 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-7c4vf\" (UniqueName: \"kubernetes.io/projected/22c825df-677d-4ca6-82db-3454ed06e783-kube-api-access-7c4vf\") pod \"22c825df-677d-4ca6-82db-3454ed06e783\" (UID: \"22c825df-677d-4ca6-82db-3454ed06e783\") " Jan 28 20:39:52 crc kubenswrapper[4746]: I0128 20:39:52.805480 4746 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/e7e6199b-1264-4501-8953-767f51328d08-config\") pod \"e7e6199b-1264-4501-8953-767f51328d08\" (UID: \"e7e6199b-1264-4501-8953-767f51328d08\") " Jan 28 20:39:52 crc kubenswrapper[4746]: I0128 20:39:52.805504 4746 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/6509e943-70c6-444c-bc41-48a544e36fbd-trusted-ca-bundle\") pod \"6509e943-70c6-444c-bc41-48a544e36fbd\" (UID: \"6509e943-70c6-444c-bc41-48a544e36fbd\") " Jan 28 20:39:52 crc kubenswrapper[4746]: I0128 20:39:52.805511 4746 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/09efc573-dbb6-4249-bd59-9b87aba8dd28-etcd-client" (OuterVolumeSpecName: "etcd-client") pod "09efc573-dbb6-4249-bd59-9b87aba8dd28" (UID: "09efc573-dbb6-4249-bd59-9b87aba8dd28"). InnerVolumeSpecName "etcd-client". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 28 20:39:52 crc kubenswrapper[4746]: I0128 20:39:52.805525 4746 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-d6qdx\" (UniqueName: \"kubernetes.io/projected/87cf06ed-a83f-41a7-828d-70653580a8cb-kube-api-access-d6qdx\") pod \"87cf06ed-a83f-41a7-828d-70653580a8cb\" (UID: \"87cf06ed-a83f-41a7-828d-70653580a8cb\") " Jan 28 20:39:52 crc kubenswrapper[4746]: I0128 20:39:52.805553 4746 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"machine-approver-tls\" (UniqueName: \"kubernetes.io/secret/22c825df-677d-4ca6-82db-3454ed06e783-machine-approver-tls\") pod \"22c825df-677d-4ca6-82db-3454ed06e783\" (UID: \"22c825df-677d-4ca6-82db-3454ed06e783\") " Jan 28 20:39:52 crc kubenswrapper[4746]: I0128 20:39:52.805552 4746 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/5fe579f8-e8a6-4643-bce5-a661393c4dde-node-bootstrap-token" (OuterVolumeSpecName: "node-bootstrap-token") pod "5fe579f8-e8a6-4643-bce5-a661393c4dde" (UID: "5fe579f8-e8a6-4643-bce5-a661393c4dde"). InnerVolumeSpecName "node-bootstrap-token". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 28 20:39:52 crc kubenswrapper[4746]: I0128 20:39:52.805580 4746 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-serving-cert\") pod \"09ae3b1a-e8e7-4524-b54b-61eab6f9239a\" (UID: \"09ae3b1a-e8e7-4524-b54b-61eab6f9239a\") " Jan 28 20:39:52 crc kubenswrapper[4746]: I0128 20:39:52.805606 4746 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/fda69060-fa79-4696-b1a6-7980f124bf7c-proxy-tls\") pod \"fda69060-fa79-4696-b1a6-7980f124bf7c\" (UID: \"fda69060-fa79-4696-b1a6-7980f124bf7c\") " Jan 28 20:39:52 crc kubenswrapper[4746]: I0128 20:39:52.805629 4746 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d-catalog-content\") pod \"b11524ee-3fca-4b1b-9cdf-6da289fdbc7d\" (UID: \"b11524ee-3fca-4b1b-9cdf-6da289fdbc7d\") " Jan 28 20:39:52 crc kubenswrapper[4746]: I0128 20:39:52.805655 4746 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"control-plane-machine-set-operator-tls\" (UniqueName: \"kubernetes.io/secret/6731426b-95fe-49ff-bb5f-40441049fde2-control-plane-machine-set-operator-tls\") pod \"6731426b-95fe-49ff-bb5f-40441049fde2\" (UID: \"6731426b-95fe-49ff-bb5f-40441049fde2\") " Jan 28 20:39:52 crc kubenswrapper[4746]: I0128 20:39:52.805660 4746 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-service-ca" (OuterVolumeSpecName: "service-ca") pod "43509403-f426-496e-be36-56cef71462f5" (UID: "43509403-f426-496e-be36-56cef71462f5"). InnerVolumeSpecName "service-ca". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 28 20:39:52 crc kubenswrapper[4746]: I0128 20:39:52.805676 4746 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-system-router-certs\" (UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-router-certs\") pod \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\" (UID: \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\") " Jan 28 20:39:52 crc kubenswrapper[4746]: I0128 20:39:52.805698 4746 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/6509e943-70c6-444c-bc41-48a544e36fbd-service-ca-bundle\") pod \"6509e943-70c6-444c-bc41-48a544e36fbd\" (UID: \"6509e943-70c6-444c-bc41-48a544e36fbd\") " Jan 28 20:39:52 crc kubenswrapper[4746]: I0128 20:39:52.805718 4746 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"certs\" (UniqueName: \"kubernetes.io/secret/5fe579f8-e8a6-4643-bce5-a661393c4dde-certs\") pod \"5fe579f8-e8a6-4643-bce5-a661393c4dde\" (UID: \"5fe579f8-e8a6-4643-bce5-a661393c4dde\") " Jan 28 20:39:52 crc kubenswrapper[4746]: I0128 20:39:52.805735 4746 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/6ea678ab-3438-413e-bfe3-290ae7725660-ovnkube-script-lib" (OuterVolumeSpecName: "ovnkube-script-lib") pod "6ea678ab-3438-413e-bfe3-290ae7725660" (UID: "6ea678ab-3438-413e-bfe3-290ae7725660"). InnerVolumeSpecName "ovnkube-script-lib". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 28 20:39:52 crc kubenswrapper[4746]: I0128 20:39:52.805743 4746 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-cfbct\" (UniqueName: \"kubernetes.io/projected/57a731c4-ef35-47a8-b875-bfb08a7f8011-kube-api-access-cfbct\") pod \"57a731c4-ef35-47a8-b875-bfb08a7f8011\" (UID: \"57a731c4-ef35-47a8-b875-bfb08a7f8011\") " Jan 28 20:39:52 crc kubenswrapper[4746]: I0128 20:39:52.805803 4746 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/a31745f5-9847-4afe-82a5-3161cc66ca93-kube-api-access-lz9wn" (OuterVolumeSpecName: "kube-api-access-lz9wn") pod "a31745f5-9847-4afe-82a5-3161cc66ca93" (UID: "a31745f5-9847-4afe-82a5-3161cc66ca93"). InnerVolumeSpecName "kube-api-access-lz9wn". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 28 20:39:52 crc kubenswrapper[4746]: I0128 20:39:52.805804 4746 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/09efc573-dbb6-4249-bd59-9b87aba8dd28-config\") pod \"09efc573-dbb6-4249-bd59-9b87aba8dd28\" (UID: \"09efc573-dbb6-4249-bd59-9b87aba8dd28\") " Jan 28 20:39:52 crc kubenswrapper[4746]: I0128 20:39:52.805854 4746 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/8f668bae-612b-4b75-9490-919e737c6a3b-trusted-ca\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Jan 28 20:39:52 crc kubenswrapper[4746]: I0128 20:39:52.805878 4746 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/1d611f23-29be-4491-8495-bee1670e935f-utilities\") pod \"1d611f23-29be-4491-8495-bee1670e935f\" (UID: \"1d611f23-29be-4491-8495-bee1670e935f\") " Jan 28 20:39:52 crc kubenswrapper[4746]: I0128 20:39:52.805902 4746 
reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-system-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-trusted-ca-bundle\") pod \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\" (UID: \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\") " Jan 28 20:39:52 crc kubenswrapper[4746]: I0128 20:39:52.805925 4746 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-system-service-ca\" (UniqueName: \"kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-service-ca\") pod \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\" (UID: \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\") " Jan 28 20:39:52 crc kubenswrapper[4746]: I0128 20:39:52.805949 4746 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/4bb40260-dbaa-4fb0-84df-5e680505d512-cni-binary-copy\") pod \"4bb40260-dbaa-4fb0-84df-5e680505d512\" (UID: \"4bb40260-dbaa-4fb0-84df-5e680505d512\") " Jan 28 20:39:52 crc kubenswrapper[4746]: I0128 20:39:52.805975 4746 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/496e6271-fb68-4057-954e-a0d97a4afa3f-config\") pod \"496e6271-fb68-4057-954e-a0d97a4afa3f\" (UID: \"496e6271-fb68-4057-954e-a0d97a4afa3f\") " Jan 28 20:39:52 crc kubenswrapper[4746]: I0128 20:39:52.805980 4746 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/87cf06ed-a83f-41a7-828d-70653580a8cb-metrics-tls" (OuterVolumeSpecName: "metrics-tls") pod "87cf06ed-a83f-41a7-828d-70653580a8cb" (UID: "87cf06ed-a83f-41a7-828d-70653580a8cb"). InnerVolumeSpecName "metrics-tls". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 28 20:39:52 crc kubenswrapper[4746]: I0128 20:39:52.806000 4746 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-v47cf\" (UniqueName: \"kubernetes.io/projected/c03ee662-fb2f-4fc4-a2c1-af487c19d254-kube-api-access-v47cf\") pod \"c03ee662-fb2f-4fc4-a2c1-af487c19d254\" (UID: \"c03ee662-fb2f-4fc4-a2c1-af487c19d254\") " Jan 28 20:39:52 crc kubenswrapper[4746]: I0128 20:39:52.806011 4746 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/57a731c4-ef35-47a8-b875-bfb08a7f8011-kube-api-access-cfbct" (OuterVolumeSpecName: "kube-api-access-cfbct") pod "57a731c4-ef35-47a8-b875-bfb08a7f8011" (UID: "57a731c4-ef35-47a8-b875-bfb08a7f8011"). InnerVolumeSpecName "kube-api-access-cfbct". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 28 20:39:52 crc kubenswrapper[4746]: I0128 20:39:52.806024 4746 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/01ab3dd5-8196-46d0-ad33-122e2ca51def-config\") pod \"01ab3dd5-8196-46d0-ad33-122e2ca51def\" (UID: \"01ab3dd5-8196-46d0-ad33-122e2ca51def\") " Jan 28 20:39:52 crc kubenswrapper[4746]: I0128 20:39:52.806052 4746 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-249nr\" (UniqueName: \"kubernetes.io/projected/b6312bbd-5731-4ea0-a20f-81d5a57df44a-kube-api-access-249nr\") pod \"b6312bbd-5731-4ea0-a20f-81d5a57df44a\" (UID: \"b6312bbd-5731-4ea0-a20f-81d5a57df44a\") " Jan 28 20:39:52 crc kubenswrapper[4746]: I0128 20:39:52.806117 4746 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/4bb40260-dbaa-4fb0-84df-5e680505d512-multus-daemon-config" (OuterVolumeSpecName: "multus-daemon-config") pod "4bb40260-dbaa-4fb0-84df-5e680505d512" (UID: "4bb40260-dbaa-4fb0-84df-5e680505d512"). InnerVolumeSpecName "multus-daemon-config". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 28 20:39:52 crc kubenswrapper[4746]: I0128 20:39:52.806130 4746 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-bf2bz\" (UniqueName: \"kubernetes.io/projected/1d611f23-29be-4491-8495-bee1670e935f-kube-api-access-bf2bz\") pod \"1d611f23-29be-4491-8495-bee1670e935f\" (UID: \"1d611f23-29be-4491-8495-bee1670e935f\") " Jan 28 20:39:52 crc kubenswrapper[4746]: I0128 20:39:52.806158 4746 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"samples-operator-tls\" (UniqueName: \"kubernetes.io/secret/a0128f3a-b052-44ed-a84e-c4c8aaf17c13-samples-operator-tls\") pod \"a0128f3a-b052-44ed-a84e-c4c8aaf17c13\" (UID: \"a0128f3a-b052-44ed-a84e-c4c8aaf17c13\") " Jan 28 20:39:52 crc kubenswrapper[4746]: I0128 20:39:52.806184 4746 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/8f668bae-612b-4b75-9490-919e737c6a3b-ca-trust-extracted\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Jan 28 20:39:52 crc kubenswrapper[4746]: I0128 20:39:52.806210 4746 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/925f1c65-6136-48ba-85aa-3a3b50560753-env-overrides\") pod \"925f1c65-6136-48ba-85aa-3a3b50560753\" (UID: \"925f1c65-6136-48ba-85aa-3a3b50560753\") " Jan 28 20:39:52 crc kubenswrapper[4746]: I0128 20:39:52.806237 4746 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"signing-key\" (UniqueName: \"kubernetes.io/secret/25e176fe-21b4-4974-b1ed-c8b94f112a7f-signing-key\") pod \"25e176fe-21b4-4974-b1ed-c8b94f112a7f\" (UID: \"25e176fe-21b4-4974-b1ed-c8b94f112a7f\") " Jan 28 20:39:52 crc kubenswrapper[4746]: I0128 20:39:52.806273 4746 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume 
\"kube-api-access-x2m85\" (UniqueName: \"kubernetes.io/projected/cd70aa09-68dd-4d64-bd6f-156fe6d1dc6d-kube-api-access-x2m85\") pod \"cd70aa09-68dd-4d64-bd6f-156fe6d1dc6d\" (UID: \"cd70aa09-68dd-4d64-bd6f-156fe6d1dc6d\") " Jan 28 20:39:52 crc kubenswrapper[4746]: I0128 20:39:52.806298 4746 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"encryption-config\" (UniqueName: \"kubernetes.io/secret/1bf7eb37-55a3-4c65-b768-a94c82151e69-encryption-config\") pod \"1bf7eb37-55a3-4c65-b768-a94c82151e69\" (UID: \"1bf7eb37-55a3-4c65-b768-a94c82151e69\") " Jan 28 20:39:52 crc kubenswrapper[4746]: I0128 20:39:52.806322 4746 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/1d611f23-29be-4491-8495-bee1670e935f-catalog-content\") pod \"1d611f23-29be-4491-8495-bee1670e935f\" (UID: \"1d611f23-29be-4491-8495-bee1670e935f\") " Jan 28 20:39:52 crc kubenswrapper[4746]: I0128 20:39:52.806390 4746 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-user-idp-0-file-data\" (UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-idp-0-file-data\") pod \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\" (UID: \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\") " Jan 28 20:39:52 crc kubenswrapper[4746]: I0128 20:39:52.806414 4746 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/925f1c65-6136-48ba-85aa-3a3b50560753-ovnkube-config\") pod \"925f1c65-6136-48ba-85aa-3a3b50560753\" (UID: \"925f1c65-6136-48ba-85aa-3a3b50560753\") " Jan 28 20:39:52 crc kubenswrapper[4746]: I0128 20:39:52.806438 4746 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/5225d0e4-402f-4861-b410-819f433b1803-catalog-content\") pod \"5225d0e4-402f-4861-b410-819f433b1803\" (UID: 
\"5225d0e4-402f-4861-b410-819f433b1803\") " Jan 28 20:39:52 crc kubenswrapper[4746]: I0128 20:39:52.806464 4746 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/1386a44e-36a2-460c-96d0-0359d2b6f0f5-config\") pod \"1386a44e-36a2-460c-96d0-0359d2b6f0f5\" (UID: \"1386a44e-36a2-460c-96d0-0359d2b6f0f5\") " Jan 28 20:39:52 crc kubenswrapper[4746]: I0128 20:39:52.806487 4746 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/6509e943-70c6-444c-bc41-48a544e36fbd-config\") pod \"6509e943-70c6-444c-bc41-48a544e36fbd\" (UID: \"6509e943-70c6-444c-bc41-48a544e36fbd\") " Jan 28 20:39:52 crc kubenswrapper[4746]: I0128 20:39:52.806510 4746 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"audit\" (UniqueName: \"kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-audit\") pod \"1bf7eb37-55a3-4c65-b768-a94c82151e69\" (UID: \"1bf7eb37-55a3-4c65-b768-a94c82151e69\") " Jan 28 20:39:52 crc kubenswrapper[4746]: I0128 20:39:52.806539 4746 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/1386a44e-36a2-460c-96d0-0359d2b6f0f5-kube-api-access\") pod \"1386a44e-36a2-460c-96d0-0359d2b6f0f5\" (UID: \"1386a44e-36a2-460c-96d0-0359d2b6f0f5\") " Jan 28 20:39:52 crc kubenswrapper[4746]: I0128 20:39:52.806562 4746 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/7583ce53-e0fe-4a16-9e4d-50516596a136-proxy-ca-bundles\") pod \"7583ce53-e0fe-4a16-9e4d-50516596a136\" (UID: \"7583ce53-e0fe-4a16-9e4d-50516596a136\") " Jan 28 20:39:52 crc kubenswrapper[4746]: I0128 20:39:52.806585 4746 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"bound-sa-token\" (UniqueName: 
\"kubernetes.io/projected/a31745f5-9847-4afe-82a5-3161cc66ca93-bound-sa-token\") pod \"a31745f5-9847-4afe-82a5-3161cc66ca93\" (UID: \"a31745f5-9847-4afe-82a5-3161cc66ca93\") " Jan 28 20:39:52 crc kubenswrapper[4746]: I0128 20:39:52.806609 4746 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/7bb08738-c794-4ee8-9972-3a62ca171029-cni-binary-copy\") pod \"7bb08738-c794-4ee8-9972-3a62ca171029\" (UID: \"7bb08738-c794-4ee8-9972-3a62ca171029\") " Jan 28 20:39:52 crc kubenswrapper[4746]: I0128 20:39:52.806635 4746 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"machine-api-operator-tls\" (UniqueName: \"kubernetes.io/secret/6402fda4-df10-493c-b4e5-d0569419652d-machine-api-operator-tls\") pod \"6402fda4-df10-493c-b4e5-d0569419652d\" (UID: \"6402fda4-df10-493c-b4e5-d0569419652d\") " Jan 28 20:39:52 crc kubenswrapper[4746]: I0128 20:39:52.806658 4746 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"srv-cert\" (UniqueName: \"kubernetes.io/secret/b6312bbd-5731-4ea0-a20f-81d5a57df44a-srv-cert\") pod \"b6312bbd-5731-4ea0-a20f-81d5a57df44a\" (UID: \"b6312bbd-5731-4ea0-a20f-81d5a57df44a\") " Jan 28 20:39:52 crc kubenswrapper[4746]: I0128 20:39:52.806681 4746 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-zkvpv\" (UniqueName: \"kubernetes.io/projected/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-kube-api-access-zkvpv\") pod \"09ae3b1a-e8e7-4524-b54b-61eab6f9239a\" (UID: \"09ae3b1a-e8e7-4524-b54b-61eab6f9239a\") " Jan 28 20:39:52 crc kubenswrapper[4746]: I0128 20:39:52.806705 4746 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b-config\") pod \"8cea82b4-6893-4ddc-af9f-1bb5ae425c5b\" (UID: \"8cea82b4-6893-4ddc-af9f-1bb5ae425c5b\") " Jan 28 20:39:52 crc kubenswrapper[4746]: I0128 
20:39:52.806729 4746 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"mcd-auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/fda69060-fa79-4696-b1a6-7980f124bf7c-mcd-auth-proxy-config\") pod \"fda69060-fa79-4696-b1a6-7980f124bf7c\" (UID: \"fda69060-fa79-4696-b1a6-7980f124bf7c\") " Jan 28 20:39:52 crc kubenswrapper[4746]: I0128 20:39:52.806757 4746 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-x7zkh\" (UniqueName: \"kubernetes.io/projected/6731426b-95fe-49ff-bb5f-40441049fde2-kube-api-access-x7zkh\") pod \"6731426b-95fe-49ff-bb5f-40441049fde2\" (UID: \"6731426b-95fe-49ff-bb5f-40441049fde2\") " Jan 28 20:39:52 crc kubenswrapper[4746]: I0128 20:39:52.806790 4746 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/5441d097-087c-4d9a-baa8-b210afa90fc9-client-ca\") pod \"5441d097-087c-4d9a-baa8-b210afa90fc9\" (UID: \"5441d097-087c-4d9a-baa8-b210afa90fc9\") " Jan 28 20:39:52 crc kubenswrapper[4746]: I0128 20:39:52.806815 4746 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/31d8b7a1-420e-4252-a5b7-eebe8a111292-auth-proxy-config\") pod \"31d8b7a1-420e-4252-a5b7-eebe8a111292\" (UID: \"31d8b7a1-420e-4252-a5b7-eebe8a111292\") " Jan 28 20:39:52 crc kubenswrapper[4746]: I0128 20:39:52.806838 4746 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/96b93a3a-6083-4aea-8eab-fe1aa8245ad9-metrics-tls\") pod \"96b93a3a-6083-4aea-8eab-fe1aa8245ad9\" (UID: \"96b93a3a-6083-4aea-8eab-fe1aa8245ad9\") " Jan 28 20:39:52 crc kubenswrapper[4746]: I0128 20:39:52.806863 4746 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/1bf7eb37-55a3-4c65-b768-a94c82151e69-serving-cert\") pod 
\"1bf7eb37-55a3-4c65-b768-a94c82151e69\" (UID: \"1bf7eb37-55a3-4c65-b768-a94c82151e69\") " Jan 28 20:39:52 crc kubenswrapper[4746]: I0128 20:39:52.806893 4746 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-xcphl\" (UniqueName: \"kubernetes.io/projected/7583ce53-e0fe-4a16-9e4d-50516596a136-kube-api-access-xcphl\") pod \"7583ce53-e0fe-4a16-9e4d-50516596a136\" (UID: \"7583ce53-e0fe-4a16-9e4d-50516596a136\") " Jan 28 20:39:52 crc kubenswrapper[4746]: I0128 20:39:52.806920 4746 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/5225d0e4-402f-4861-b410-819f433b1803-utilities\") pod \"5225d0e4-402f-4861-b410-819f433b1803\" (UID: \"5225d0e4-402f-4861-b410-819f433b1803\") " Jan 28 20:39:52 crc kubenswrapper[4746]: I0128 20:39:52.806946 4746 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/e7e6199b-1264-4501-8953-767f51328d08-serving-cert\") pod \"e7e6199b-1264-4501-8953-767f51328d08\" (UID: \"e7e6199b-1264-4501-8953-767f51328d08\") " Jan 28 20:39:52 crc kubenswrapper[4746]: I0128 20:39:52.807238 4746 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-config\") pod \"1bf7eb37-55a3-4c65-b768-a94c82151e69\" (UID: \"1bf7eb37-55a3-4c65-b768-a94c82151e69\") " Jan 28 20:39:52 crc kubenswrapper[4746]: I0128 20:39:52.807277 4746 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-jkwtn\" (UniqueName: \"kubernetes.io/projected/5b88f790-22fa-440e-b583-365168c0b23d-kube-api-access-jkwtn\") pod \"5b88f790-22fa-440e-b583-365168c0b23d\" (UID: \"5b88f790-22fa-440e-b583-365168c0b23d\") " Jan 28 20:39:52 crc kubenswrapper[4746]: I0128 20:39:52.807301 4746 reconciler_common.go:159] "operationExecutor.UnmountVolume started for 
volume \"v4-0-config-system-ocp-branding-template\" (UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-ocp-branding-template\") pod \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\" (UID: \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\") " Jan 28 20:39:52 crc kubenswrapper[4746]: I0128 20:39:52.807324 4746 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/bf126b07-da06-4140-9a57-dfd54fc6b486-bound-sa-token\") pod \"bf126b07-da06-4140-9a57-dfd54fc6b486\" (UID: \"bf126b07-da06-4140-9a57-dfd54fc6b486\") " Jan 28 20:39:52 crc kubenswrapper[4746]: I0128 20:39:52.807347 4746 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/7583ce53-e0fe-4a16-9e4d-50516596a136-client-ca\") pod \"7583ce53-e0fe-4a16-9e4d-50516596a136\" (UID: \"7583ce53-e0fe-4a16-9e4d-50516596a136\") " Jan 28 20:39:52 crc kubenswrapper[4746]: I0128 20:39:52.807372 4746 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-w9rds\" (UniqueName: \"kubernetes.io/projected/20b0d48f-5fd6-431c-a545-e3c800c7b866-kube-api-access-w9rds\") pod \"20b0d48f-5fd6-431c-a545-e3c800c7b866\" (UID: \"20b0d48f-5fd6-431c-a545-e3c800c7b866\") " Jan 28 20:39:52 crc kubenswrapper[4746]: I0128 20:39:52.807395 4746 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/8f668bae-612b-4b75-9490-919e737c6a3b-installation-pull-secrets\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Jan 28 20:39:52 crc kubenswrapper[4746]: I0128 20:39:52.807422 4746 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-mnrrd\" (UniqueName: \"kubernetes.io/projected/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d-kube-api-access-mnrrd\") pod 
\"bc5039c0-ea34-426b-a2b7-fbbc87b49a6d\" (UID: \"bc5039c0-ea34-426b-a2b7-fbbc87b49a6d\") " Jan 28 20:39:52 crc kubenswrapper[4746]: I0128 20:39:52.807446 4746 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"images\" (UniqueName: \"kubernetes.io/configmap/6402fda4-df10-493c-b4e5-d0569419652d-images\") pod \"6402fda4-df10-493c-b4e5-d0569419652d\" (UID: \"6402fda4-df10-493c-b4e5-d0569419652d\") " Jan 28 20:39:52 crc kubenswrapper[4746]: I0128 20:39:52.807662 4746 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-fcqwp\" (UniqueName: \"kubernetes.io/projected/5fe579f8-e8a6-4643-bce5-a661393c4dde-kube-api-access-fcqwp\") pod \"5fe579f8-e8a6-4643-bce5-a661393c4dde\" (UID: \"5fe579f8-e8a6-4643-bce5-a661393c4dde\") " Jan 28 20:39:52 crc kubenswrapper[4746]: I0128 20:39:52.807690 4746 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/7539238d-5fe0-46ed-884e-1c3b566537ec-config\") pod \"7539238d-5fe0-46ed-884e-1c3b566537ec\" (UID: \"7539238d-5fe0-46ed-884e-1c3b566537ec\") " Jan 28 20:39:52 crc kubenswrapper[4746]: I0128 20:39:52.807714 4746 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/8f668bae-612b-4b75-9490-919e737c6a3b-bound-sa-token\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Jan 28 20:39:52 crc kubenswrapper[4746]: I0128 20:39:52.807740 4746 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"image-import-ca\" (UniqueName: \"kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-image-import-ca\") pod \"1bf7eb37-55a3-4c65-b768-a94c82151e69\" (UID: \"1bf7eb37-55a3-4c65-b768-a94c82151e69\") " Jan 28 20:39:52 crc kubenswrapper[4746]: I0128 20:39:52.807765 4746 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume 
\"audit-policies\" (UniqueName: \"kubernetes.io/configmap/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-audit-policies\") pod \"09ae3b1a-e8e7-4524-b54b-61eab6f9239a\" (UID: \"09ae3b1a-e8e7-4524-b54b-61eab6f9239a\") " Jan 28 20:39:52 crc kubenswrapper[4746]: I0128 20:39:52.807792 4746 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-htfz6\" (UniqueName: \"kubernetes.io/projected/6ea678ab-3438-413e-bfe3-290ae7725660-kube-api-access-htfz6\") pod \"6ea678ab-3438-413e-bfe3-290ae7725660\" (UID: \"6ea678ab-3438-413e-bfe3-290ae7725660\") " Jan 28 20:39:52 crc kubenswrapper[4746]: I0128 20:39:52.807820 4746 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-dbsvg\" (UniqueName: \"kubernetes.io/projected/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9-kube-api-access-dbsvg\") pod \"f88749ec-7931-4ee7-b3fc-1ec5e11f92e9\" (UID: \"f88749ec-7931-4ee7-b3fc-1ec5e11f92e9\") " Jan 28 20:39:52 crc kubenswrapper[4746]: I0128 20:39:52.807845 4746 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/1386a44e-36a2-460c-96d0-0359d2b6f0f5-serving-cert\") pod \"1386a44e-36a2-460c-96d0-0359d2b6f0f5\" (UID: \"1386a44e-36a2-460c-96d0-0359d2b6f0f5\") " Jan 28 20:39:52 crc kubenswrapper[4746]: I0128 20:39:52.807876 4746 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-jhbk2\" (UniqueName: \"kubernetes.io/projected/bd23aa5c-e532-4e53-bccf-e79f130c5ae8-kube-api-access-jhbk2\") pod \"bd23aa5c-e532-4e53-bccf-e79f130c5ae8\" (UID: \"bd23aa5c-e532-4e53-bccf-e79f130c5ae8\") " Jan 28 20:39:52 crc kubenswrapper[4746]: I0128 20:39:52.807903 4746 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/57a731c4-ef35-47a8-b875-bfb08a7f8011-catalog-content\") pod \"57a731c4-ef35-47a8-b875-bfb08a7f8011\" (UID: 
\"57a731c4-ef35-47a8-b875-bfb08a7f8011\") " Jan 28 20:39:52 crc kubenswrapper[4746]: I0128 20:39:52.815879 4746 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"tmpfs\" (UniqueName: \"kubernetes.io/empty-dir/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-tmpfs\") pod \"308be0ea-9f5f-4b29-aeb1-5abd31a0b17b\" (UID: \"308be0ea-9f5f-4b29-aeb1-5abd31a0b17b\") " Jan 28 20:39:52 crc kubenswrapper[4746]: I0128 20:39:52.815975 4746 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/496e6271-fb68-4057-954e-a0d97a4afa3f-serving-cert\") pod \"496e6271-fb68-4057-954e-a0d97a4afa3f\" (UID: \"496e6271-fb68-4057-954e-a0d97a4afa3f\") " Jan 28 20:39:52 crc kubenswrapper[4746]: I0128 20:39:52.816007 4746 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/57a731c4-ef35-47a8-b875-bfb08a7f8011-utilities\") pod \"57a731c4-ef35-47a8-b875-bfb08a7f8011\" (UID: \"57a731c4-ef35-47a8-b875-bfb08a7f8011\") " Jan 28 20:39:52 crc kubenswrapper[4746]: I0128 20:39:52.816035 4746 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"image-registry-operator-tls\" (UniqueName: \"kubernetes.io/secret/bf126b07-da06-4140-9a57-dfd54fc6b486-image-registry-operator-tls\") pod \"bf126b07-da06-4140-9a57-dfd54fc6b486\" (UID: \"bf126b07-da06-4140-9a57-dfd54fc6b486\") " Jan 28 20:39:52 crc kubenswrapper[4746]: I0128 20:39:52.816076 4746 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/87cf06ed-a83f-41a7-828d-70653580a8cb-config-volume\") pod \"87cf06ed-a83f-41a7-828d-70653580a8cb\" (UID: \"87cf06ed-a83f-41a7-828d-70653580a8cb\") " Jan 28 20:39:52 crc kubenswrapper[4746]: I0128 20:39:52.816125 4746 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etcd-serving-ca\" (UniqueName: 
\"kubernetes.io/configmap/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-etcd-serving-ca\") pod \"09ae3b1a-e8e7-4524-b54b-61eab6f9239a\" (UID: \"09ae3b1a-e8e7-4524-b54b-61eab6f9239a\") " Jan 28 20:39:52 crc kubenswrapper[4746]: I0128 20:39:52.816155 4746 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-6ccd8\" (UniqueName: \"kubernetes.io/projected/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-kube-api-access-6ccd8\") pod \"308be0ea-9f5f-4b29-aeb1-5abd31a0b17b\" (UID: \"308be0ea-9f5f-4b29-aeb1-5abd31a0b17b\") " Jan 28 20:39:52 crc kubenswrapper[4746]: I0128 20:39:52.816180 4746 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"cni-sysctl-allowlist\" (UniqueName: \"kubernetes.io/configmap/7bb08738-c794-4ee8-9972-3a62ca171029-cni-sysctl-allowlist\") pod \"7bb08738-c794-4ee8-9972-3a62ca171029\" (UID: \"7bb08738-c794-4ee8-9972-3a62ca171029\") " Jan 28 20:39:52 crc kubenswrapper[4746]: I0128 20:39:52.816178 4746 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-28T20:39:52Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-28T20:39:52Z\\\",\\\"message\\\":\\\"containers with unready status: 
[check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Jan 28 20:39:52 crc kubenswrapper[4746]: I0128 20:39:52.816208 4746 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/6ea678ab-3438-413e-bfe3-290ae7725660-env-overrides\") pod \"6ea678ab-3438-413e-bfe3-290ae7725660\" (UID: \"6ea678ab-3438-413e-bfe3-290ae7725660\") " Jan 28 20:39:52 crc kubenswrapper[4746]: I0128 20:39:52.816235 4746 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/22c825df-677d-4ca6-82db-3454ed06e783-auth-proxy-config\") pod \"22c825df-677d-4ca6-82db-3454ed06e783\" (UID: 
\"22c825df-677d-4ca6-82db-3454ed06e783\") " Jan 28 20:39:52 crc kubenswrapper[4746]: I0128 20:39:52.816258 4746 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/43509403-f426-496e-be36-56cef71462f5-console-oauth-config\") pod \"43509403-f426-496e-be36-56cef71462f5\" (UID: \"43509403-f426-496e-be36-56cef71462f5\") " Jan 28 20:39:52 crc kubenswrapper[4746]: I0128 20:39:52.816283 4746 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-lzf88\" (UniqueName: \"kubernetes.io/projected/0b574797-001e-440a-8f4e-c0be86edad0f-kube-api-access-lzf88\") pod \"0b574797-001e-440a-8f4e-c0be86edad0f\" (UID: \"0b574797-001e-440a-8f4e-c0be86edad0f\") " Jan 28 20:39:52 crc kubenswrapper[4746]: I0128 20:39:52.816308 4746 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/c03ee662-fb2f-4fc4-a2c1-af487c19d254-metrics-certs\") pod \"c03ee662-fb2f-4fc4-a2c1-af487c19d254\" (UID: \"c03ee662-fb2f-4fc4-a2c1-af487c19d254\") " Jan 28 20:39:52 crc kubenswrapper[4746]: I0128 20:39:52.816334 4746 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c-config\") pod \"210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c\" (UID: \"210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c\") " Jan 28 20:39:52 crc kubenswrapper[4746]: I0128 20:39:52.816358 4746 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etcd-service-ca\" (UniqueName: \"kubernetes.io/configmap/09efc573-dbb6-4249-bd59-9b87aba8dd28-etcd-service-ca\") pod \"09efc573-dbb6-4249-bd59-9b87aba8dd28\" (UID: \"09efc573-dbb6-4249-bd59-9b87aba8dd28\") " Jan 28 20:39:52 crc kubenswrapper[4746]: I0128 20:39:52.816384 4746 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: 
\"kubernetes.io/secret/7539238d-5fe0-46ed-884e-1c3b566537ec-serving-cert\") pod \"7539238d-5fe0-46ed-884e-1c3b566537ec\" (UID: \"7539238d-5fe0-46ed-884e-1c3b566537ec\") " Jan 28 20:39:52 crc kubenswrapper[4746]: I0128 20:39:52.816441 4746 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-gf66m\" (UniqueName: \"kubernetes.io/projected/a0128f3a-b052-44ed-a84e-c4c8aaf17c13-kube-api-access-gf66m\") pod \"a0128f3a-b052-44ed-a84e-c4c8aaf17c13\" (UID: \"a0128f3a-b052-44ed-a84e-c4c8aaf17c13\") " Jan 28 20:39:52 crc kubenswrapper[4746]: I0128 20:39:52.816468 4746 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/a31745f5-9847-4afe-82a5-3161cc66ca93-trusted-ca\") pod \"a31745f5-9847-4afe-82a5-3161cc66ca93\" (UID: \"a31745f5-9847-4afe-82a5-3161cc66ca93\") " Jan 28 20:39:52 crc kubenswrapper[4746]: I0128 20:39:52.816494 4746 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/0b78653f-4ff9-4508-8672-245ed9b561e3-serving-cert\") pod \"0b78653f-4ff9-4508-8672-245ed9b561e3\" (UID: \"0b78653f-4ff9-4508-8672-245ed9b561e3\") " Jan 28 20:39:52 crc kubenswrapper[4746]: I0128 20:39:52.816514 4746 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/0b78653f-4ff9-4508-8672-245ed9b561e3-service-ca\") pod \"0b78653f-4ff9-4508-8672-245ed9b561e3\" (UID: \"0b78653f-4ff9-4508-8672-245ed9b561e3\") " Jan 28 20:39:52 crc kubenswrapper[4746]: I0128 20:39:52.806198 4746 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/22c825df-677d-4ca6-82db-3454ed06e783-config" (OuterVolumeSpecName: "config") pod "22c825df-677d-4ca6-82db-3454ed06e783" (UID: "22c825df-677d-4ca6-82db-3454ed06e783"). InnerVolumeSpecName "config". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 28 20:39:52 crc kubenswrapper[4746]: I0128 20:39:52.816537 4746 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-oauth-serving-cert\") pod \"43509403-f426-496e-be36-56cef71462f5\" (UID: \"43509403-f426-496e-be36-56cef71462f5\") " Jan 28 20:39:52 crc kubenswrapper[4746]: I0128 20:39:52.816561 4746 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/7583ce53-e0fe-4a16-9e4d-50516596a136-serving-cert\") pod \"7583ce53-e0fe-4a16-9e4d-50516596a136\" (UID: \"7583ce53-e0fe-4a16-9e4d-50516596a136\") " Jan 28 20:39:52 crc kubenswrapper[4746]: I0128 20:39:52.816587 4746 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-trusted-ca-bundle\") pod \"09ae3b1a-e8e7-4524-b54b-61eab6f9239a\" (UID: \"09ae3b1a-e8e7-4524-b54b-61eab6f9239a\") " Jan 28 20:39:52 crc kubenswrapper[4746]: I0128 20:39:52.816692 4746 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/6ea678ab-3438-413e-bfe3-290ae7725660-ovnkube-config\") pod \"6ea678ab-3438-413e-bfe3-290ae7725660\" (UID: \"6ea678ab-3438-413e-bfe3-290ae7725660\") " Jan 28 20:39:52 crc kubenswrapper[4746]: I0128 20:39:52.816802 4746 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"signing-cabundle\" (UniqueName: \"kubernetes.io/configmap/25e176fe-21b4-4974-b1ed-c8b94f112a7f-signing-cabundle\") pod \"25e176fe-21b4-4974-b1ed-c8b94f112a7f\" (UID: \"25e176fe-21b4-4974-b1ed-c8b94f112a7f\") " Jan 28 20:39:52 crc kubenswrapper[4746]: I0128 20:39:52.816834 4746 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"cert\" 
(UniqueName: \"kubernetes.io/secret/20b0d48f-5fd6-431c-a545-e3c800c7b866-cert\") pod \"20b0d48f-5fd6-431c-a545-e3c800c7b866\" (UID: \"20b0d48f-5fd6-431c-a545-e3c800c7b866\") " Jan 28 20:39:52 crc kubenswrapper[4746]: I0128 20:39:52.816858 4746 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-apiservice-cert\") pod \"308be0ea-9f5f-4b29-aeb1-5abd31a0b17b\" (UID: \"308be0ea-9f5f-4b29-aeb1-5abd31a0b17b\") " Jan 28 20:39:52 crc kubenswrapper[4746]: I0128 20:39:52.816881 4746 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-user-template-login\" (UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-template-login\") pod \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\" (UID: \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\") " Jan 28 20:39:52 crc kubenswrapper[4746]: I0128 20:39:52.816922 4746 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/6402fda4-df10-493c-b4e5-d0569419652d-config\") pod \"6402fda4-df10-493c-b4e5-d0569419652d\" (UID: \"6402fda4-df10-493c-b4e5-d0569419652d\") " Jan 28 20:39:52 crc kubenswrapper[4746]: I0128 20:39:52.816946 4746 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-kfwg7\" (UniqueName: \"kubernetes.io/projected/8f668bae-612b-4b75-9490-919e737c6a3b-kube-api-access-kfwg7\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Jan 28 20:39:52 crc kubenswrapper[4746]: I0128 20:39:52.816978 4746 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/09efc573-dbb6-4249-bd59-9b87aba8dd28-serving-cert\") pod \"09efc573-dbb6-4249-bd59-9b87aba8dd28\" (UID: \"09efc573-dbb6-4249-bd59-9b87aba8dd28\") " Jan 28 20:39:52 crc 
kubenswrapper[4746]: I0128 20:39:52.817009 4746 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/0b574797-001e-440a-8f4e-c0be86edad0f-proxy-tls\") pod \"0b574797-001e-440a-8f4e-c0be86edad0f\" (UID: \"0b574797-001e-440a-8f4e-c0be86edad0f\") " Jan 28 20:39:52 crc kubenswrapper[4746]: I0128 20:39:52.817028 4746 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-pjr6v\" (UniqueName: \"kubernetes.io/projected/49ef4625-1d3a-4a9f-b595-c2433d32326d-kube-api-access-pjr6v\") pod \"49ef4625-1d3a-4a9f-b595-c2433d32326d\" (UID: \"49ef4625-1d3a-4a9f-b595-c2433d32326d\") " Jan 28 20:39:52 crc kubenswrapper[4746]: I0128 20:39:52.817564 4746 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/43509403-f426-496e-be36-56cef71462f5-console-serving-cert\") pod \"43509403-f426-496e-be36-56cef71462f5\" (UID: \"43509403-f426-496e-be36-56cef71462f5\") " Jan 28 20:39:52 crc kubenswrapper[4746]: I0128 20:39:52.817592 4746 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/4bb40260-dbaa-4fb0-84df-5e680505d512-cni-binary-copy" (OuterVolumeSpecName: "cni-binary-copy") pod "4bb40260-dbaa-4fb0-84df-5e680505d512" (UID: "4bb40260-dbaa-4fb0-84df-5e680505d512"). InnerVolumeSpecName "cni-binary-copy". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 28 20:39:52 crc kubenswrapper[4746]: I0128 20:39:52.817626 4746 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-trusted-ca-bundle" (OuterVolumeSpecName: "trusted-ca-bundle") pod "09ae3b1a-e8e7-4524-b54b-61eab6f9239a" (UID: "09ae3b1a-e8e7-4524-b54b-61eab6f9239a"). InnerVolumeSpecName "trusted-ca-bundle". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 28 20:39:52 crc kubenswrapper[4746]: E0128 20:39:52.817752 4746 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-01-28 20:39:53.317665509 +0000 UTC m=+21.273851863 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Jan 28 20:39:52 crc kubenswrapper[4746]: I0128 20:39:52.817810 4746 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-audit-policies\") pod \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\" (UID: \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\") " Jan 28 20:39:52 crc kubenswrapper[4746]: I0128 20:39:52.817846 4746 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"images\" (UniqueName: \"kubernetes.io/configmap/31d8b7a1-420e-4252-a5b7-eebe8a111292-images\") pod \"31d8b7a1-420e-4252-a5b7-eebe8a111292\" (UID: \"31d8b7a1-420e-4252-a5b7-eebe8a111292\") " Jan 28 20:39:52 crc kubenswrapper[4746]: I0128 20:39:52.818003 4746 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"profile-collector-cert\" (UniqueName: \"kubernetes.io/secret/b6312bbd-5731-4ea0-a20f-81d5a57df44a-profile-collector-cert\") pod \"b6312bbd-5731-4ea0-a20f-81d5a57df44a\" (UID: \"b6312bbd-5731-4ea0-a20f-81d5a57df44a\") " Jan 28 20:39:52 crc 
kubenswrapper[4746]: I0128 20:39:52.818040 4746 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/5441d097-087c-4d9a-baa8-b210afa90fc9-config\") pod \"5441d097-087c-4d9a-baa8-b210afa90fc9\" (UID: \"5441d097-087c-4d9a-baa8-b210afa90fc9\") " Jan 28 20:39:52 crc kubenswrapper[4746]: I0128 20:39:52.818068 4746 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/a31745f5-9847-4afe-82a5-3161cc66ca93-metrics-tls\") pod \"a31745f5-9847-4afe-82a5-3161cc66ca93\" (UID: \"a31745f5-9847-4afe-82a5-3161cc66ca93\") " Jan 28 20:39:52 crc kubenswrapper[4746]: I0128 20:39:52.818066 4746 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/43509403-f426-496e-be36-56cef71462f5-console-serving-cert" (OuterVolumeSpecName: "console-serving-cert") pod "43509403-f426-496e-be36-56cef71462f5" (UID: "43509403-f426-496e-be36-56cef71462f5"). InnerVolumeSpecName "console-serving-cert". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 28 20:39:52 crc kubenswrapper[4746]: I0128 20:39:52.818121 4746 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"stats-auth\" (UniqueName: \"kubernetes.io/secret/c03ee662-fb2f-4fc4-a2c1-af487c19d254-stats-auth\") pod \"c03ee662-fb2f-4fc4-a2c1-af487c19d254\" (UID: \"c03ee662-fb2f-4fc4-a2c1-af487c19d254\") " Jan 28 20:39:52 crc kubenswrapper[4746]: I0128 20:39:52.818148 4746 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/9d4552c7-cd75-42dd-8880-30dd377c49a4-trusted-ca\") pod \"9d4552c7-cd75-42dd-8880-30dd377c49a4\" (UID: \"9d4552c7-cd75-42dd-8880-30dd377c49a4\") " Jan 28 20:39:52 crc kubenswrapper[4746]: I0128 20:39:52.818174 4746 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/1d611f23-29be-4491-8495-bee1670e935f-utilities" (OuterVolumeSpecName: "utilities") pod "1d611f23-29be-4491-8495-bee1670e935f" (UID: "1d611f23-29be-4491-8495-bee1670e935f"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 28 20:39:52 crc kubenswrapper[4746]: I0128 20:39:52.818381 4746 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-trusted-ca-bundle" (OuterVolumeSpecName: "v4-0-config-system-trusted-ca-bundle") pod "49c341d1-5089-4bc2-86a0-a5e165cfcc6b" (UID: "49c341d1-5089-4bc2-86a0-a5e165cfcc6b"). InnerVolumeSpecName "v4-0-config-system-trusted-ca-bundle". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 28 20:39:52 crc kubenswrapper[4746]: I0128 20:39:52.818838 4746 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/496e6271-fb68-4057-954e-a0d97a4afa3f-config" (OuterVolumeSpecName: "config") pod "496e6271-fb68-4057-954e-a0d97a4afa3f" (UID: "496e6271-fb68-4057-954e-a0d97a4afa3f"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 28 20:39:52 crc kubenswrapper[4746]: I0128 20:39:52.818878 4746 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/efdd0498-1daa-4136-9a4a-3b948c2293fc-kube-api-access-fqsjt" (OuterVolumeSpecName: "kube-api-access-fqsjt") pod "efdd0498-1daa-4136-9a4a-3b948c2293fc" (UID: "efdd0498-1daa-4136-9a4a-3b948c2293fc"). InnerVolumeSpecName "kube-api-access-fqsjt". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 28 20:39:52 crc kubenswrapper[4746]: I0128 20:39:52.818905 4746 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/25e176fe-21b4-4974-b1ed-c8b94f112a7f-signing-key" (OuterVolumeSpecName: "signing-key") pod "25e176fe-21b4-4974-b1ed-c8b94f112a7f" (UID: "25e176fe-21b4-4974-b1ed-c8b94f112a7f"). InnerVolumeSpecName "signing-key". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 28 20:39:52 crc kubenswrapper[4746]: I0128 20:39:52.818961 4746 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/01ab3dd5-8196-46d0-ad33-122e2ca51def-config" (OuterVolumeSpecName: "config") pod "01ab3dd5-8196-46d0-ad33-122e2ca51def" (UID: "01ab3dd5-8196-46d0-ad33-122e2ca51def"). InnerVolumeSpecName "config". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 28 20:39:52 crc kubenswrapper[4746]: I0128 20:39:52.818200 4746 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-trusted-ca-bundle\") pod \"43509403-f426-496e-be36-56cef71462f5\" (UID: \"43509403-f426-496e-be36-56cef71462f5\") " Jan 28 20:39:52 crc kubenswrapper[4746]: I0128 20:39:52.819392 4746 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-8tdtz\" (UniqueName: \"kubernetes.io/projected/09efc573-dbb6-4249-bd59-9b87aba8dd28-kube-api-access-8tdtz\") pod \"09efc573-dbb6-4249-bd59-9b87aba8dd28\" (UID: \"09efc573-dbb6-4249-bd59-9b87aba8dd28\") " Jan 28 20:39:52 crc kubenswrapper[4746]: I0128 20:39:52.819429 4746 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-user-template-error\" (UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-template-error\") pod \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\" (UID: \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\") " Jan 28 20:39:52 crc kubenswrapper[4746]: I0128 20:39:52.819465 4746 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/7583ce53-e0fe-4a16-9e4d-50516596a136-config\") pod \"7583ce53-e0fe-4a16-9e4d-50516596a136\" (UID: \"7583ce53-e0fe-4a16-9e4d-50516596a136\") " Jan 28 20:39:52 crc kubenswrapper[4746]: I0128 20:39:52.819525 4746 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-etcd-client\") pod \"09ae3b1a-e8e7-4524-b54b-61eab6f9239a\" (UID: \"09ae3b1a-e8e7-4524-b54b-61eab6f9239a\") " Jan 28 20:39:52 crc kubenswrapper[4746]: I0128 20:39:52.819562 4746 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume 
\"kube-api-access-ngvvp\" (UniqueName: \"kubernetes.io/projected/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-kube-api-access-ngvvp\") pod \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\" (UID: \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\") " Jan 28 20:39:52 crc kubenswrapper[4746]: I0128 20:39:52.819595 4746 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"webhook-certs\" (UniqueName: \"kubernetes.io/secret/efdd0498-1daa-4136-9a4a-3b948c2293fc-webhook-certs\") pod \"efdd0498-1daa-4136-9a4a-3b948c2293fc\" (UID: \"efdd0498-1daa-4136-9a4a-3b948c2293fc\") " Jan 28 20:39:52 crc kubenswrapper[4746]: I0128 20:39:52.819622 4746 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/9d4552c7-cd75-42dd-8880-30dd377c49a4-config\") pod \"9d4552c7-cd75-42dd-8880-30dd377c49a4\" (UID: \"9d4552c7-cd75-42dd-8880-30dd377c49a4\") " Jan 28 20:39:52 crc kubenswrapper[4746]: I0128 20:39:52.819646 4746 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-w7l8j\" (UniqueName: \"kubernetes.io/projected/01ab3dd5-8196-46d0-ad33-122e2ca51def-kube-api-access-w7l8j\") pod \"01ab3dd5-8196-46d0-ad33-122e2ca51def\" (UID: \"01ab3dd5-8196-46d0-ad33-122e2ca51def\") " Jan 28 20:39:52 crc kubenswrapper[4746]: I0128 20:39:52.819672 4746 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-9xfj7\" (UniqueName: \"kubernetes.io/projected/5225d0e4-402f-4861-b410-819f433b1803-kube-api-access-9xfj7\") pod \"5225d0e4-402f-4861-b410-819f433b1803\" (UID: \"5225d0e4-402f-4861-b410-819f433b1803\") " Jan 28 20:39:52 crc kubenswrapper[4746]: I0128 20:39:52.819762 4746 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/5b88f790-22fa-440e-b583-365168c0b23d-metrics-certs\") pod \"5b88f790-22fa-440e-b583-365168c0b23d\" (UID: \"5b88f790-22fa-440e-b583-365168c0b23d\") " Jan 
28 20:39:52 crc kubenswrapper[4746]: I0128 20:39:52.819788 4746 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/5441d097-087c-4d9a-baa8-b210afa90fc9-serving-cert\") pod \"5441d097-087c-4d9a-baa8-b210afa90fc9\" (UID: \"5441d097-087c-4d9a-baa8-b210afa90fc9\") " Jan 28 20:39:52 crc kubenswrapper[4746]: I0128 20:39:52.820382 4746 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/b6cd30de-2eeb-49a2-ab40-9167f4560ff5-kube-api-access-pj782" (OuterVolumeSpecName: "kube-api-access-pj782") pod "b6cd30de-2eeb-49a2-ab40-9167f4560ff5" (UID: "b6cd30de-2eeb-49a2-ab40-9167f4560ff5"). InnerVolumeSpecName "kube-api-access-pj782". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 28 20:39:52 crc kubenswrapper[4746]: I0128 20:39:52.820544 4746 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-nzwt7\" (UniqueName: \"kubernetes.io/projected/96b93a3a-6083-4aea-8eab-fe1aa8245ad9-kube-api-access-nzwt7\") pod \"96b93a3a-6083-4aea-8eab-fe1aa8245ad9\" (UID: \"96b93a3a-6083-4aea-8eab-fe1aa8245ad9\") " Jan 28 20:39:52 crc kubenswrapper[4746]: I0128 20:39:52.820595 4746 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"mcc-auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/0b574797-001e-440a-8f4e-c0be86edad0f-mcc-auth-proxy-config\") pod \"0b574797-001e-440a-8f4e-c0be86edad0f\" (UID: \"0b574797-001e-440a-8f4e-c0be86edad0f\") " Jan 28 20:39:52 crc kubenswrapper[4746]: I0128 20:39:52.820621 4746 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-qs4fp\" (UniqueName: \"kubernetes.io/projected/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c-kube-api-access-qs4fp\") pod \"210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c\" (UID: \"210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c\") " Jan 28 20:39:52 crc kubenswrapper[4746]: I0128 20:39:52.820648 4746 
reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-qg5z5\" (UniqueName: \"kubernetes.io/projected/43509403-f426-496e-be36-56cef71462f5-kube-api-access-qg5z5\") pod \"43509403-f426-496e-be36-56cef71462f5\" (UID: \"43509403-f426-496e-be36-56cef71462f5\") " Jan 28 20:39:52 crc kubenswrapper[4746]: I0128 20:39:52.820675 4746 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-rnphk\" (UniqueName: \"kubernetes.io/projected/bf126b07-da06-4140-9a57-dfd54fc6b486-kube-api-access-rnphk\") pod \"bf126b07-da06-4140-9a57-dfd54fc6b486\" (UID: \"bf126b07-da06-4140-9a57-dfd54fc6b486\") " Jan 28 20:39:52 crc kubenswrapper[4746]: I0128 20:39:52.820704 4746 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"default-certificate\" (UniqueName: \"kubernetes.io/secret/c03ee662-fb2f-4fc4-a2c1-af487c19d254-default-certificate\") pod \"c03ee662-fb2f-4fc4-a2c1-af487c19d254\" (UID: \"c03ee662-fb2f-4fc4-a2c1-af487c19d254\") " Jan 28 20:39:52 crc kubenswrapper[4746]: I0128 20:39:52.820726 4746 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/e7e6199b-1264-4501-8953-767f51328d08-kube-api-access\") pod \"e7e6199b-1264-4501-8953-767f51328d08\" (UID: \"e7e6199b-1264-4501-8953-767f51328d08\") " Jan 28 20:39:52 crc kubenswrapper[4746]: I0128 20:39:52.820752 4746 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-pcxfs\" (UniqueName: \"kubernetes.io/projected/9d4552c7-cd75-42dd-8880-30dd377c49a4-kube-api-access-pcxfs\") pod \"9d4552c7-cd75-42dd-8880-30dd377c49a4\" (UID: \"9d4552c7-cd75-42dd-8880-30dd377c49a4\") " Jan 28 20:39:52 crc kubenswrapper[4746]: I0128 20:39:52.820777 4746 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etcd-serving-ca\" (UniqueName: 
\"kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-etcd-serving-ca\") pod \"1bf7eb37-55a3-4c65-b768-a94c82151e69\" (UID: \"1bf7eb37-55a3-4c65-b768-a94c82151e69\") " Jan 28 20:39:52 crc kubenswrapper[4746]: I0128 20:39:52.820799 4746 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-trusted-ca-bundle\") pod \"1bf7eb37-55a3-4c65-b768-a94c82151e69\" (UID: \"1bf7eb37-55a3-4c65-b768-a94c82151e69\") " Jan 28 20:39:52 crc kubenswrapper[4746]: I0128 20:39:52.820779 4746 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/4bb40260-dbaa-4fb0-84df-5e680505d512-kube-api-access-2w9zh" (OuterVolumeSpecName: "kube-api-access-2w9zh") pod "4bb40260-dbaa-4fb0-84df-5e680505d512" (UID: "4bb40260-dbaa-4fb0-84df-5e680505d512"). InnerVolumeSpecName "kube-api-access-2w9zh". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 28 20:39:52 crc kubenswrapper[4746]: I0128 20:39:52.820824 4746 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-webhook-cert\") pod \"308be0ea-9f5f-4b29-aeb1-5abd31a0b17b\" (UID: \"308be0ea-9f5f-4b29-aeb1-5abd31a0b17b\") " Jan 28 20:39:52 crc kubenswrapper[4746]: I0128 20:39:52.820949 4746 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-2d4wz\" (UniqueName: \"kubernetes.io/projected/5441d097-087c-4d9a-baa8-b210afa90fc9-kube-api-access-2d4wz\") pod \"5441d097-087c-4d9a-baa8-b210afa90fc9\" (UID: \"5441d097-087c-4d9a-baa8-b210afa90fc9\") " Jan 28 20:39:52 crc kubenswrapper[4746]: I0128 20:39:52.820993 4746 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serviceca\" (UniqueName: \"kubernetes.io/configmap/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59-serviceca\") pod 
\"3cb93b32-e0ae-4377-b9c8-fdb9842c6d59\" (UID: \"3cb93b32-e0ae-4377-b9c8-fdb9842c6d59\") " Jan 28 20:39:52 crc kubenswrapper[4746]: I0128 20:39:52.821044 4746 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/9d4552c7-cd75-42dd-8880-30dd377c49a4-serving-cert\") pod \"9d4552c7-cd75-42dd-8880-30dd377c49a4\" (UID: \"9d4552c7-cd75-42dd-8880-30dd377c49a4\") " Jan 28 20:39:52 crc kubenswrapper[4746]: I0128 20:39:52.821101 4746 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovn-node-metrics-cert\" (UniqueName: \"kubernetes.io/secret/6ea678ab-3438-413e-bfe3-290ae7725660-ovn-node-metrics-cert\") pod \"6ea678ab-3438-413e-bfe3-290ae7725660\" (UID: \"6ea678ab-3438-413e-bfe3-290ae7725660\") " Jan 28 20:39:52 crc kubenswrapper[4746]: I0128 20:39:52.821136 4746 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-webhook-cert" (OuterVolumeSpecName: "webhook-cert") pod "308be0ea-9f5f-4b29-aeb1-5abd31a0b17b" (UID: "308be0ea-9f5f-4b29-aeb1-5abd31a0b17b"). InnerVolumeSpecName "webhook-cert". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 28 20:39:52 crc kubenswrapper[4746]: I0128 20:39:52.821139 4746 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-s4n52\" (UniqueName: \"kubernetes.io/projected/925f1c65-6136-48ba-85aa-3a3b50560753-kube-api-access-s4n52\") pod \"925f1c65-6136-48ba-85aa-3a3b50560753\" (UID: \"925f1c65-6136-48ba-85aa-3a3b50560753\") " Jan 28 20:39:52 crc kubenswrapper[4746]: I0128 20:39:52.821193 4746 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/0b78653f-4ff9-4508-8672-245ed9b561e3-kube-api-access\") pod \"0b78653f-4ff9-4508-8672-245ed9b561e3\" (UID: \"0b78653f-4ff9-4508-8672-245ed9b561e3\") " Jan 28 20:39:52 crc kubenswrapper[4746]: I0128 20:39:52.821222 4746 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"marketplace-trusted-ca\" (UniqueName: \"kubernetes.io/configmap/b6cd30de-2eeb-49a2-ab40-9167f4560ff5-marketplace-trusted-ca\") pod \"b6cd30de-2eeb-49a2-ab40-9167f4560ff5\" (UID: \"b6cd30de-2eeb-49a2-ab40-9167f4560ff5\") " Jan 28 20:39:52 crc kubenswrapper[4746]: I0128 20:39:52.821249 4746 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/6509e943-70c6-444c-bc41-48a544e36fbd-serving-cert\") pod \"6509e943-70c6-444c-bc41-48a544e36fbd\" (UID: \"6509e943-70c6-444c-bc41-48a544e36fbd\") " Jan 28 20:39:52 crc kubenswrapper[4746]: I0128 20:39:52.821330 4746 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/d75a4c96-2883-4a0b-bab2-0fab2b6c0b49-host-slash\") pod \"iptables-alerter-4ln5h\" (UID: \"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\") " pod="openshift-network-operator/iptables-alerter-4ln5h" Jan 28 20:39:52 crc kubenswrapper[4746]: I0128 20:39:52.821437 4746 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"kube-api-access-cqllr\" (UniqueName: \"kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr\") pod \"network-check-target-xd92c\" (UID: \"3b6479f0-333b-4a96-9adf-2099afdc2447\") " pod="openshift-network-diagnostics/network-check-target-xd92c" Jan 28 20:39:52 crc kubenswrapper[4746]: I0128 20:39:52.821360 4746 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/8f668bae-612b-4b75-9490-919e737c6a3b-registry-certificates" (OuterVolumeSpecName: "registry-certificates") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b"). InnerVolumeSpecName "registry-certificates". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 28 20:39:52 crc kubenswrapper[4746]: I0128 20:39:52.821706 4746 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-rczfb\" (UniqueName: \"kubernetes.io/projected/d75a4c96-2883-4a0b-bab2-0fab2b6c0b49-kube-api-access-rczfb\") pod \"iptables-alerter-4ln5h\" (UID: \"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\") " pod="openshift-network-operator/iptables-alerter-4ln5h" Jan 28 20:39:52 crc kubenswrapper[4746]: I0128 20:39:52.821756 4746 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/ef543e1b-8068-4ea3-b32a-61027b32e95d-env-overrides\") pod \"network-node-identity-vrzqb\" (UID: \"ef543e1b-8068-4ea3-b32a-61027b32e95d\") " pod="openshift-network-node-identity/network-node-identity-vrzqb" Jan 28 20:39:52 crc kubenswrapper[4746]: I0128 20:39:52.821780 4746 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-etc-kube\" (UniqueName: \"kubernetes.io/host-path/37a5e44f-9a88-4405-be8a-b645485e7312-host-etc-kube\") pod \"network-operator-58b4c7f79c-55gtf\" (UID: \"37a5e44f-9a88-4405-be8a-b645485e7312\") " 
pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" Jan 28 20:39:52 crc kubenswrapper[4746]: I0128 20:39:52.821813 4746 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"networking-console-plugin-cert\" (UniqueName: \"kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Jan 28 20:39:52 crc kubenswrapper[4746]: I0128 20:39:52.821844 4746 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovnkube-identity-cm\" (UniqueName: \"kubernetes.io/configmap/ef543e1b-8068-4ea3-b32a-61027b32e95d-ovnkube-identity-cm\") pod \"network-node-identity-vrzqb\" (UID: \"ef543e1b-8068-4ea3-b32a-61027b32e95d\") " pod="openshift-network-node-identity/network-node-identity-vrzqb" Jan 28 20:39:52 crc kubenswrapper[4746]: I0128 20:39:52.821870 4746 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"iptables-alerter-script\" (UniqueName: \"kubernetes.io/configmap/d75a4c96-2883-4a0b-bab2-0fab2b6c0b49-iptables-alerter-script\") pod \"iptables-alerter-4ln5h\" (UID: \"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\") " pod="openshift-network-operator/iptables-alerter-4ln5h" Jan 28 20:39:52 crc kubenswrapper[4746]: I0128 20:39:52.821898 4746 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-s2kz5\" (UniqueName: \"kubernetes.io/projected/ef543e1b-8068-4ea3-b32a-61027b32e95d-kube-api-access-s2kz5\") pod \"network-node-identity-vrzqb\" (UID: \"ef543e1b-8068-4ea3-b32a-61027b32e95d\") " pod="openshift-network-node-identity/network-node-identity-vrzqb" Jan 28 20:39:52 crc kubenswrapper[4746]: I0128 20:39:52.821925 4746 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-s2dwl\" (UniqueName: 
\"kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl\") pod \"network-check-source-55646444c4-trplf\" (UID: \"9d751cbb-f2e2-430d-9754-c882a5e924a5\") " pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Jan 28 20:39:52 crc kubenswrapper[4746]: I0128 20:39:52.821951 4746 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/37a5e44f-9a88-4405-be8a-b645485e7312-metrics-tls\") pod \"network-operator-58b4c7f79c-55gtf\" (UID: \"37a5e44f-9a88-4405-be8a-b645485e7312\") " pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" Jan 28 20:39:52 crc kubenswrapper[4746]: I0128 20:39:52.821977 4746 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-rdwmf\" (UniqueName: \"kubernetes.io/projected/37a5e44f-9a88-4405-be8a-b645485e7312-kube-api-access-rdwmf\") pod \"network-operator-58b4c7f79c-55gtf\" (UID: \"37a5e44f-9a88-4405-be8a-b645485e7312\") " pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" Jan 28 20:39:52 crc kubenswrapper[4746]: I0128 20:39:52.822000 4746 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nginx-conf\" (UniqueName: \"kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Jan 28 20:39:52 crc kubenswrapper[4746]: I0128 20:39:52.822029 4746 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/ef543e1b-8068-4ea3-b32a-61027b32e95d-webhook-cert\") pod \"network-node-identity-vrzqb\" (UID: \"ef543e1b-8068-4ea3-b32a-61027b32e95d\") " pod="openshift-network-node-identity/network-node-identity-vrzqb" Jan 28 20:39:52 crc kubenswrapper[4746]: I0128 20:39:52.822149 4746 
reconciler_common.go:293] "Volume detached for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-webhook-cert\") on node \"crc\" DevicePath \"\"" Jan 28 20:39:52 crc kubenswrapper[4746]: I0128 20:39:52.822166 4746 reconciler_common.go:293] "Volume detached for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/8f668bae-612b-4b75-9490-919e737c6a3b-registry-tls\") on node \"crc\" DevicePath \"\"" Jan 28 20:39:52 crc kubenswrapper[4746]: I0128 20:39:52.822179 4746 reconciler_common.go:293] "Volume detached for volume \"package-server-manager-serving-cert\" (UniqueName: \"kubernetes.io/secret/3ab1a177-2de0-46d9-b765-d0d0649bb42e-package-server-manager-serving-cert\") on node \"crc\" DevicePath \"\"" Jan 28 20:39:52 crc kubenswrapper[4746]: I0128 20:39:52.822197 4746 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-279lb\" (UniqueName: \"kubernetes.io/projected/7bb08738-c794-4ee8-9972-3a62ca171029-kube-api-access-279lb\") on node \"crc\" DevicePath \"\"" Jan 28 20:39:52 crc kubenswrapper[4746]: I0128 20:39:52.822209 4746 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-sb6h7\" (UniqueName: \"kubernetes.io/projected/1bf7eb37-55a3-4c65-b768-a94c82151e69-kube-api-access-sb6h7\") on node \"crc\" DevicePath \"\"" Jan 28 20:39:52 crc kubenswrapper[4746]: I0128 20:39:52.822222 4746 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-zgdk5\" (UniqueName: \"kubernetes.io/projected/31d8b7a1-420e-4252-a5b7-eebe8a111292-kube-api-access-zgdk5\") on node \"crc\" DevicePath \"\"" Jan 28 20:39:52 crc kubenswrapper[4746]: I0128 20:39:52.822305 4746 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-mg5zb\" (UniqueName: \"kubernetes.io/projected/6402fda4-df10-493c-b4e5-d0569419652d-kube-api-access-mg5zb\") on node \"crc\" DevicePath \"\"" Jan 28 20:39:52 crc kubenswrapper[4746]: I0128 20:39:52.822318 4746 reconciler_common.go:293] "Volume 
detached for volume \"ovnkube-script-lib\" (UniqueName: \"kubernetes.io/configmap/6ea678ab-3438-413e-bfe3-290ae7725660-ovnkube-script-lib\") on node \"crc\" DevicePath \"\"" Jan 28 20:39:52 crc kubenswrapper[4746]: I0128 20:39:52.822354 4746 reconciler_common.go:293] "Volume detached for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-service-ca\") on node \"crc\" DevicePath \"\"" Jan 28 20:39:52 crc kubenswrapper[4746]: I0128 20:39:52.822365 4746 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-4d4hj\" (UniqueName: \"kubernetes.io/projected/3ab1a177-2de0-46d9-b765-d0d0649bb42e-kube-api-access-4d4hj\") on node \"crc\" DevicePath \"\"" Jan 28 20:39:52 crc kubenswrapper[4746]: I0128 20:39:52.822382 4746 reconciler_common.go:293] "Volume detached for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/8f668bae-612b-4b75-9490-919e737c6a3b-registry-certificates\") on node \"crc\" DevicePath \"\"" Jan 28 20:39:52 crc kubenswrapper[4746]: I0128 20:39:52.822393 4746 reconciler_common.go:293] "Volume detached for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/496e6271-fb68-4057-954e-a0d97a4afa3f-kube-api-access\") on node \"crc\" DevicePath \"\"" Jan 28 20:39:52 crc kubenswrapper[4746]: I0128 20:39:52.822404 4746 reconciler_common.go:293] "Volume detached for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/87cf06ed-a83f-41a7-828d-70653580a8cb-metrics-tls\") on node \"crc\" DevicePath \"\"" Jan 28 20:39:52 crc kubenswrapper[4746]: I0128 20:39:52.822419 4746 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-vt5rc\" (UniqueName: \"kubernetes.io/projected/44663579-783b-4372-86d6-acf235a62d72-kube-api-access-vt5rc\") on node \"crc\" DevicePath \"\"" Jan 28 20:39:52 crc kubenswrapper[4746]: I0128 20:39:52.822434 4746 reconciler_common.go:293] "Volume detached for volume \"etcd-client\" (UniqueName: 
\"kubernetes.io/secret/1bf7eb37-55a3-4c65-b768-a94c82151e69-etcd-client\") on node \"crc\" DevicePath \"\"" Jan 28 20:39:52 crc kubenswrapper[4746]: I0128 20:39:52.822445 4746 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-pj782\" (UniqueName: \"kubernetes.io/projected/b6cd30de-2eeb-49a2-ab40-9167f4560ff5-kube-api-access-pj782\") on node \"crc\" DevicePath \"\"" Jan 28 20:39:52 crc kubenswrapper[4746]: I0128 20:39:52.822457 4746 reconciler_common.go:293] "Volume detached for volume \"multus-daemon-config\" (UniqueName: \"kubernetes.io/configmap/4bb40260-dbaa-4fb0-84df-5e680505d512-multus-daemon-config\") on node \"crc\" DevicePath \"\"" Jan 28 20:39:52 crc kubenswrapper[4746]: I0128 20:39:52.822820 4746 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/87cf06ed-a83f-41a7-828d-70653580a8cb-kube-api-access-d6qdx" (OuterVolumeSpecName: "kube-api-access-d6qdx") pod "87cf06ed-a83f-41a7-828d-70653580a8cb" (UID: "87cf06ed-a83f-41a7-828d-70653580a8cb"). InnerVolumeSpecName "kube-api-access-d6qdx". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 28 20:39:52 crc kubenswrapper[4746]: I0128 20:39:52.822831 4746 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/c03ee662-fb2f-4fc4-a2c1-af487c19d254-kube-api-access-v47cf" (OuterVolumeSpecName: "kube-api-access-v47cf") pod "c03ee662-fb2f-4fc4-a2c1-af487c19d254" (UID: "c03ee662-fb2f-4fc4-a2c1-af487c19d254"). InnerVolumeSpecName "kube-api-access-v47cf". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 28 20:39:52 crc kubenswrapper[4746]: I0128 20:39:52.822894 4746 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/b6312bbd-5731-4ea0-a20f-81d5a57df44a-kube-api-access-249nr" (OuterVolumeSpecName: "kube-api-access-249nr") pod "b6312bbd-5731-4ea0-a20f-81d5a57df44a" (UID: "b6312bbd-5731-4ea0-a20f-81d5a57df44a"). 
InnerVolumeSpecName "kube-api-access-249nr". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 28 20:39:52 crc kubenswrapper[4746]: I0128 20:39:52.822970 4746 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/6731426b-95fe-49ff-bb5f-40441049fde2-control-plane-machine-set-operator-tls" (OuterVolumeSpecName: "control-plane-machine-set-operator-tls") pod "6731426b-95fe-49ff-bb5f-40441049fde2" (UID: "6731426b-95fe-49ff-bb5f-40441049fde2"). InnerVolumeSpecName "control-plane-machine-set-operator-tls". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 28 20:39:52 crc kubenswrapper[4746]: I0128 20:39:52.823169 4746 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/5fe579f8-e8a6-4643-bce5-a661393c4dde-certs" (OuterVolumeSpecName: "certs") pod "5fe579f8-e8a6-4643-bce5-a661393c4dde" (UID: "5fe579f8-e8a6-4643-bce5-a661393c4dde"). InnerVolumeSpecName "certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 28 20:39:52 crc kubenswrapper[4746]: I0128 20:39:52.823211 4746 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-xcgwh\" (UniqueName: \"kubernetes.io/projected/fda69060-fa79-4696-b1a6-7980f124bf7c-kube-api-access-xcgwh\") on node \"crc\" DevicePath \"\"" Jan 28 20:39:52 crc kubenswrapper[4746]: I0128 20:39:52.823289 4746 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/22c825df-677d-4ca6-82db-3454ed06e783-config\") on node \"crc\" DevicePath \"\"" Jan 28 20:39:52 crc kubenswrapper[4746]: I0128 20:39:52.823310 4746 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-2w9zh\" (UniqueName: \"kubernetes.io/projected/4bb40260-dbaa-4fb0-84df-5e680505d512-kube-api-access-2w9zh\") on node \"crc\" DevicePath \"\"" Jan 28 20:39:52 crc kubenswrapper[4746]: I0128 20:39:52.823325 4746 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-lz9wn\" (UniqueName: 
\"kubernetes.io/projected/a31745f5-9847-4afe-82a5-3161cc66ca93-kube-api-access-lz9wn\") on node \"crc\" DevicePath \"\"" Jan 28 20:39:52 crc kubenswrapper[4746]: I0128 20:39:52.823347 4746 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-fqsjt\" (UniqueName: \"kubernetes.io/projected/efdd0498-1daa-4136-9a4a-3b948c2293fc-kube-api-access-fqsjt\") on node \"crc\" DevicePath \"\"" Jan 28 20:39:52 crc kubenswrapper[4746]: I0128 20:39:52.823361 4746 reconciler_common.go:293] "Volume detached for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/09efc573-dbb6-4249-bd59-9b87aba8dd28-etcd-client\") on node \"crc\" DevicePath \"\"" Jan 28 20:39:52 crc kubenswrapper[4746]: I0128 20:39:52.823375 4746 reconciler_common.go:293] "Volume detached for volume \"node-bootstrap-token\" (UniqueName: \"kubernetes.io/secret/5fe579f8-e8a6-4643-bce5-a661393c4dde-node-bootstrap-token\") on node \"crc\" DevicePath \"\"" Jan 28 20:39:52 crc kubenswrapper[4746]: I0128 20:39:52.823388 4746 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-cfbct\" (UniqueName: \"kubernetes.io/projected/57a731c4-ef35-47a8-b875-bfb08a7f8011-kube-api-access-cfbct\") on node \"crc\" DevicePath \"\"" Jan 28 20:39:52 crc kubenswrapper[4746]: I0128 20:39:52.823407 4746 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/1d611f23-29be-4491-8495-bee1670e935f-utilities\") on node \"crc\" DevicePath \"\"" Jan 28 20:39:52 crc kubenswrapper[4746]: I0128 20:39:52.823421 4746 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/01ab3dd5-8196-46d0-ad33-122e2ca51def-config\") on node \"crc\" DevicePath \"\"" Jan 28 20:39:52 crc kubenswrapper[4746]: I0128 20:39:52.823485 4746 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/925f1c65-6136-48ba-85aa-3a3b50560753-env-overrides" (OuterVolumeSpecName: "env-overrides") pod 
"925f1c65-6136-48ba-85aa-3a3b50560753" (UID: "925f1c65-6136-48ba-85aa-3a3b50560753"). InnerVolumeSpecName "env-overrides". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 28 20:39:52 crc kubenswrapper[4746]: I0128 20:39:52.823642 4746 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/e7e6199b-1264-4501-8953-767f51328d08-config" (OuterVolumeSpecName: "config") pod "e7e6199b-1264-4501-8953-767f51328d08" (UID: "e7e6199b-1264-4501-8953-767f51328d08"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 28 20:39:52 crc kubenswrapper[4746]: I0128 20:39:52.806234 4746 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/22c825df-677d-4ca6-82db-3454ed06e783-machine-approver-tls" (OuterVolumeSpecName: "machine-approver-tls") pod "22c825df-677d-4ca6-82db-3454ed06e783" (UID: "22c825df-677d-4ca6-82db-3454ed06e783"). InnerVolumeSpecName "machine-approver-tls". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 28 20:39:52 crc kubenswrapper[4746]: I0128 20:39:52.806478 4746 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "09ae3b1a-e8e7-4524-b54b-61eab6f9239a" (UID: "09ae3b1a-e8e7-4524-b54b-61eab6f9239a"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 28 20:39:52 crc kubenswrapper[4746]: I0128 20:39:52.806495 4746 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d-kube-api-access-x4zgh" (OuterVolumeSpecName: "kube-api-access-x4zgh") pod "b11524ee-3fca-4b1b-9cdf-6da289fdbc7d" (UID: "b11524ee-3fca-4b1b-9cdf-6da289fdbc7d"). InnerVolumeSpecName "kube-api-access-x4zgh". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 28 20:39:52 crc kubenswrapper[4746]: I0128 20:39:52.806680 4746 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/fda69060-fa79-4696-b1a6-7980f124bf7c-proxy-tls" (OuterVolumeSpecName: "proxy-tls") pod "fda69060-fa79-4696-b1a6-7980f124bf7c" (UID: "fda69060-fa79-4696-b1a6-7980f124bf7c"). InnerVolumeSpecName "proxy-tls". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 28 20:39:52 crc kubenswrapper[4746]: I0128 20:39:52.806735 4746 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/bf126b07-da06-4140-9a57-dfd54fc6b486-trusted-ca" (OuterVolumeSpecName: "trusted-ca") pod "bf126b07-da06-4140-9a57-dfd54fc6b486" (UID: "bf126b07-da06-4140-9a57-dfd54fc6b486"). InnerVolumeSpecName "trusted-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 28 20:39:52 crc kubenswrapper[4746]: I0128 20:39:52.806742 4746 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/09efc573-dbb6-4249-bd59-9b87aba8dd28-config" (OuterVolumeSpecName: "config") pod "09efc573-dbb6-4249-bd59-9b87aba8dd28" (UID: "09efc573-dbb6-4249-bd59-9b87aba8dd28"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 28 20:39:52 crc kubenswrapper[4746]: I0128 20:39:52.806791 4746 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c" (UID: "210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c"). InnerVolumeSpecName "serving-cert". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 28 20:39:52 crc kubenswrapper[4746]: I0128 20:39:52.806879 4746 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d-available-featuregates" (OuterVolumeSpecName: "available-featuregates") pod "bc5039c0-ea34-426b-a2b7-fbbc87b49a6d" (UID: "bc5039c0-ea34-426b-a2b7-fbbc87b49a6d"). InnerVolumeSpecName "available-featuregates". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 28 20:39:52 crc kubenswrapper[4746]: I0128 20:39:52.806949 4746 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/a0128f3a-b052-44ed-a84e-c4c8aaf17c13-samples-operator-tls" (OuterVolumeSpecName: "samples-operator-tls") pod "a0128f3a-b052-44ed-a84e-c4c8aaf17c13" (UID: "a0128f3a-b052-44ed-a84e-c4c8aaf17c13"). InnerVolumeSpecName "samples-operator-tls". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 28 20:39:52 crc kubenswrapper[4746]: I0128 20:39:52.809310 4746 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-console-config" (OuterVolumeSpecName: "console-config") pod "43509403-f426-496e-be36-56cef71462f5" (UID: "43509403-f426-496e-be36-56cef71462f5"). InnerVolumeSpecName "console-config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 28 20:39:52 crc kubenswrapper[4746]: I0128 20:39:52.810326 4746 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/09efc573-dbb6-4249-bd59-9b87aba8dd28-etcd-ca" (OuterVolumeSpecName: "etcd-ca") pod "09efc573-dbb6-4249-bd59-9b87aba8dd28" (UID: "09efc573-dbb6-4249-bd59-9b87aba8dd28"). InnerVolumeSpecName "etcd-ca". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 28 20:39:52 crc kubenswrapper[4746]: I0128 20:39:52.810440 4746 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "8cea82b4-6893-4ddc-af9f-1bb5ae425c5b" (UID: "8cea82b4-6893-4ddc-af9f-1bb5ae425c5b"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 28 20:39:52 crc kubenswrapper[4746]: I0128 20:39:52.810634 4746 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/b6cd30de-2eeb-49a2-ab40-9167f4560ff5-marketplace-operator-metrics" (OuterVolumeSpecName: "marketplace-operator-metrics") pod "b6cd30de-2eeb-49a2-ab40-9167f4560ff5" (UID: "b6cd30de-2eeb-49a2-ab40-9167f4560ff5"). InnerVolumeSpecName "marketplace-operator-metrics". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 28 20:39:52 crc kubenswrapper[4746]: I0128 20:39:52.810660 4746 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/22c825df-677d-4ca6-82db-3454ed06e783-kube-api-access-7c4vf" (OuterVolumeSpecName: "kube-api-access-7c4vf") pod "22c825df-677d-4ca6-82db-3454ed06e783" (UID: "22c825df-677d-4ca6-82db-3454ed06e783"). InnerVolumeSpecName "kube-api-access-7c4vf". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 28 20:39:52 crc kubenswrapper[4746]: I0128 20:39:52.810669 4746 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "bc5039c0-ea34-426b-a2b7-fbbc87b49a6d" (UID: "bc5039c0-ea34-426b-a2b7-fbbc87b49a6d"). InnerVolumeSpecName "serving-cert". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 28 20:39:52 crc kubenswrapper[4746]: I0128 20:39:52.810799 4746 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/31d8b7a1-420e-4252-a5b7-eebe8a111292-proxy-tls" (OuterVolumeSpecName: "proxy-tls") pod "31d8b7a1-420e-4252-a5b7-eebe8a111292" (UID: "31d8b7a1-420e-4252-a5b7-eebe8a111292"). InnerVolumeSpecName "proxy-tls". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 28 20:39:52 crc kubenswrapper[4746]: I0128 20:39:52.810893 4746 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-session" (OuterVolumeSpecName: "v4-0-config-system-session") pod "49c341d1-5089-4bc2-86a0-a5e165cfcc6b" (UID: "49c341d1-5089-4bc2-86a0-a5e165cfcc6b"). InnerVolumeSpecName "v4-0-config-system-session". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 28 20:39:52 crc kubenswrapper[4746]: I0128 20:39:52.810980 4746 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/925f1c65-6136-48ba-85aa-3a3b50560753-ovn-control-plane-metrics-cert" (OuterVolumeSpecName: "ovn-control-plane-metrics-cert") pod "925f1c65-6136-48ba-85aa-3a3b50560753" (UID: "925f1c65-6136-48ba-85aa-3a3b50560753"). InnerVolumeSpecName "ovn-control-plane-metrics-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 28 20:39:52 crc kubenswrapper[4746]: I0128 20:39:52.815338 4746 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-encryption-config" (OuterVolumeSpecName: "encryption-config") pod "09ae3b1a-e8e7-4524-b54b-61eab6f9239a" (UID: "09ae3b1a-e8e7-4524-b54b-61eab6f9239a"). InnerVolumeSpecName "encryption-config". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 28 20:39:52 crc kubenswrapper[4746]: I0128 20:39:52.815442 4746 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b-kube-api-access-w4xd4" (OuterVolumeSpecName: "kube-api-access-w4xd4") pod "8cea82b4-6893-4ddc-af9f-1bb5ae425c5b" (UID: "8cea82b4-6893-4ddc-af9f-1bb5ae425c5b"). InnerVolumeSpecName "kube-api-access-w4xd4". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 28 20:39:52 crc kubenswrapper[4746]: I0128 20:39:52.815513 4746 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/6509e943-70c6-444c-bc41-48a544e36fbd-config" (OuterVolumeSpecName: "config") pod "6509e943-70c6-444c-bc41-48a544e36fbd" (UID: "6509e943-70c6-444c-bc41-48a544e36fbd"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 28 20:39:52 crc kubenswrapper[4746]: I0128 20:39:52.815771 4746 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59-kube-api-access-wxkg8" (OuterVolumeSpecName: "kube-api-access-wxkg8") pod "3cb93b32-e0ae-4377-b9c8-fdb9842c6d59" (UID: "3cb93b32-e0ae-4377-b9c8-fdb9842c6d59"). InnerVolumeSpecName "kube-api-access-wxkg8". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 28 20:39:52 crc kubenswrapper[4746]: I0128 20:39:52.815832 4746 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/6509e943-70c6-444c-bc41-48a544e36fbd-trusted-ca-bundle" (OuterVolumeSpecName: "trusted-ca-bundle") pod "6509e943-70c6-444c-bc41-48a544e36fbd" (UID: "6509e943-70c6-444c-bc41-48a544e36fbd"). InnerVolumeSpecName "trusted-ca-bundle". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 28 20:39:52 crc kubenswrapper[4746]: I0128 20:39:52.815915 4746 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/c03ee662-fb2f-4fc4-a2c1-af487c19d254-service-ca-bundle" (OuterVolumeSpecName: "service-ca-bundle") pod "c03ee662-fb2f-4fc4-a2c1-af487c19d254" (UID: "c03ee662-fb2f-4fc4-a2c1-af487c19d254"). InnerVolumeSpecName "service-ca-bundle". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 28 20:39:52 crc kubenswrapper[4746]: I0128 20:39:52.824066 4746 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-etcd-client" (OuterVolumeSpecName: "etcd-client") pod "09ae3b1a-e8e7-4524-b54b-61eab6f9239a" (UID: "09ae3b1a-e8e7-4524-b54b-61eab6f9239a"). InnerVolumeSpecName "etcd-client". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 28 20:39:52 crc kubenswrapper[4746]: I0128 20:39:52.824286 4746 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/8f668bae-612b-4b75-9490-919e737c6a3b-trusted-ca" (OuterVolumeSpecName: "trusted-ca") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b"). InnerVolumeSpecName "trusted-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 28 20:39:52 crc kubenswrapper[4746]: I0128 20:39:52.824516 4746 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/5b88f790-22fa-440e-b583-365168c0b23d-kube-api-access-jkwtn" (OuterVolumeSpecName: "kube-api-access-jkwtn") pod "5b88f790-22fa-440e-b583-365168c0b23d" (UID: "5b88f790-22fa-440e-b583-365168c0b23d"). InnerVolumeSpecName "kube-api-access-jkwtn". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 28 20:39:52 crc kubenswrapper[4746]: I0128 20:39:52.824590 4746 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/0b78653f-4ff9-4508-8672-245ed9b561e3-kube-api-access" (OuterVolumeSpecName: "kube-api-access") pod "0b78653f-4ff9-4508-8672-245ed9b561e3" (UID: "0b78653f-4ff9-4508-8672-245ed9b561e3"). InnerVolumeSpecName "kube-api-access". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 28 20:39:52 crc kubenswrapper[4746]: I0128 20:39:52.824656 4746 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-tmpfs" (OuterVolumeSpecName: "tmpfs") pod "308be0ea-9f5f-4b29-aeb1-5abd31a0b17b" (UID: "308be0ea-9f5f-4b29-aeb1-5abd31a0b17b"). InnerVolumeSpecName "tmpfs". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 28 20:39:52 crc kubenswrapper[4746]: I0128 20:39:52.824839 4746 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/e7e6199b-1264-4501-8953-767f51328d08-kube-api-access" (OuterVolumeSpecName: "kube-api-access") pod "e7e6199b-1264-4501-8953-767f51328d08" (UID: "e7e6199b-1264-4501-8953-767f51328d08"). InnerVolumeSpecName "kube-api-access". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 28 20:39:52 crc kubenswrapper[4746]: I0128 20:39:52.825048 4746 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/496e6271-fb68-4057-954e-a0d97a4afa3f-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "496e6271-fb68-4057-954e-a0d97a4afa3f" (UID: "496e6271-fb68-4057-954e-a0d97a4afa3f"). InnerVolumeSpecName "serving-cert". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 28 20:39:52 crc kubenswrapper[4746]: I0128 20:39:52.825093 4746 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/7539238d-5fe0-46ed-884e-1c3b566537ec-kube-api-access-tk88c" (OuterVolumeSpecName: "kube-api-access-tk88c") pod "7539238d-5fe0-46ed-884e-1c3b566537ec" (UID: "7539238d-5fe0-46ed-884e-1c3b566537ec"). InnerVolumeSpecName "kube-api-access-tk88c". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 28 20:39:52 crc kubenswrapper[4746]: I0128 20:39:52.825982 4746 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/b6cd30de-2eeb-49a2-ab40-9167f4560ff5-marketplace-trusted-ca" (OuterVolumeSpecName: "marketplace-trusted-ca") pod "b6cd30de-2eeb-49a2-ab40-9167f4560ff5" (UID: "b6cd30de-2eeb-49a2-ab40-9167f4560ff5"). InnerVolumeSpecName "marketplace-trusted-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 28 20:39:52 crc kubenswrapper[4746]: I0128 20:39:52.826300 4746 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/20b0d48f-5fd6-431c-a545-e3c800c7b866-cert" (OuterVolumeSpecName: "cert") pod "20b0d48f-5fd6-431c-a545-e3c800c7b866" (UID: "20b0d48f-5fd6-431c-a545-e3c800c7b866"). InnerVolumeSpecName "cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 28 20:39:52 crc kubenswrapper[4746]: I0128 20:39:52.827324 4746 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9-srv-cert" (OuterVolumeSpecName: "srv-cert") pod "f88749ec-7931-4ee7-b3fc-1ec5e11f92e9" (UID: "f88749ec-7931-4ee7-b3fc-1ec5e11f92e9"). InnerVolumeSpecName "srv-cert". 
PluginName "kubernetes.io/secret", VolumeGidValue ""
Jan 28 20:39:52 crc kubenswrapper[4746]: I0128 20:39:52.827780 4746 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/cd70aa09-68dd-4d64-bd6f-156fe6d1dc6d-kube-api-access-x2m85" (OuterVolumeSpecName: "kube-api-access-x2m85") pod "cd70aa09-68dd-4d64-bd6f-156fe6d1dc6d" (UID: "cd70aa09-68dd-4d64-bd6f-156fe6d1dc6d"). InnerVolumeSpecName "kube-api-access-x2m85". PluginName "kubernetes.io/projected", VolumeGidValue ""
Jan 28 20:39:52 crc kubenswrapper[4746]: I0128 20:39:52.827879 4746 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/c03ee662-fb2f-4fc4-a2c1-af487c19d254-stats-auth" (OuterVolumeSpecName: "stats-auth") pod "c03ee662-fb2f-4fc4-a2c1-af487c19d254" (UID: "c03ee662-fb2f-4fc4-a2c1-af487c19d254"). InnerVolumeSpecName "stats-auth". PluginName "kubernetes.io/secret", VolumeGidValue ""
Jan 28 20:39:52 crc kubenswrapper[4746]: I0128 20:39:52.827278 4746 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9-profile-collector-cert" (OuterVolumeSpecName: "profile-collector-cert") pod "f88749ec-7931-4ee7-b3fc-1ec5e11f92e9" (UID: "f88749ec-7931-4ee7-b3fc-1ec5e11f92e9"). InnerVolumeSpecName "profile-collector-cert". PluginName "kubernetes.io/secret", VolumeGidValue ""
Jan 28 20:39:52 crc kubenswrapper[4746]: I0128 20:39:52.827451 4746 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/6509e943-70c6-444c-bc41-48a544e36fbd-kube-api-access-6g6sz" (OuterVolumeSpecName: "kube-api-access-6g6sz") pod "6509e943-70c6-444c-bc41-48a544e36fbd" (UID: "6509e943-70c6-444c-bc41-48a544e36fbd"). InnerVolumeSpecName "kube-api-access-6g6sz". PluginName "kubernetes.io/projected", VolumeGidValue ""
Jan 28 20:39:52 crc kubenswrapper[4746]: I0128 20:39:52.828026 4746 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/bf126b07-da06-4140-9a57-dfd54fc6b486-kube-api-access-rnphk" (OuterVolumeSpecName: "kube-api-access-rnphk") pod "bf126b07-da06-4140-9a57-dfd54fc6b486" (UID: "bf126b07-da06-4140-9a57-dfd54fc6b486"). InnerVolumeSpecName "kube-api-access-rnphk". PluginName "kubernetes.io/projected", VolumeGidValue ""
Jan 28 20:39:52 crc kubenswrapper[4746]: I0128 20:39:52.828062 4746 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d-utilities" (OuterVolumeSpecName: "utilities") pod "b11524ee-3fca-4b1b-9cdf-6da289fdbc7d" (UID: "b11524ee-3fca-4b1b-9cdf-6da289fdbc7d"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Jan 28 20:39:52 crc kubenswrapper[4746]: I0128 20:39:52.828208 4746 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/1bf7eb37-55a3-4c65-b768-a94c82151e69-encryption-config" (OuterVolumeSpecName: "encryption-config") pod "1bf7eb37-55a3-4c65-b768-a94c82151e69" (UID: "1bf7eb37-55a3-4c65-b768-a94c82151e69"). InnerVolumeSpecName "encryption-config". PluginName "kubernetes.io/secret", VolumeGidValue ""
Jan 28 20:39:52 crc kubenswrapper[4746]: I0128 20:39:52.828493 4746 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-serving-cert" (OuterVolumeSpecName: "v4-0-config-system-serving-cert") pod "49c341d1-5089-4bc2-86a0-a5e165cfcc6b" (UID: "49c341d1-5089-4bc2-86a0-a5e165cfcc6b"). InnerVolumeSpecName "v4-0-config-system-serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue ""
Jan 28 20:39:52 crc kubenswrapper[4746]: I0128 20:39:52.828536 4746 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-template-provider-selection" (OuterVolumeSpecName: "v4-0-config-user-template-provider-selection") pod "49c341d1-5089-4bc2-86a0-a5e165cfcc6b" (UID: "49c341d1-5089-4bc2-86a0-a5e165cfcc6b"). InnerVolumeSpecName "v4-0-config-user-template-provider-selection". PluginName "kubernetes.io/secret", VolumeGidValue ""
Jan 28 20:39:52 crc kubenswrapper[4746]: I0128 20:39:52.828593 4746 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-cliconfig" (OuterVolumeSpecName: "v4-0-config-system-cliconfig") pod "49c341d1-5089-4bc2-86a0-a5e165cfcc6b" (UID: "49c341d1-5089-4bc2-86a0-a5e165cfcc6b"). InnerVolumeSpecName "v4-0-config-system-cliconfig". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Jan 28 20:39:52 crc kubenswrapper[4746]: I0128 20:39:52.828617 4746 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-router-certs" (OuterVolumeSpecName: "v4-0-config-system-router-certs") pod "49c341d1-5089-4bc2-86a0-a5e165cfcc6b" (UID: "49c341d1-5089-4bc2-86a0-a5e165cfcc6b"). InnerVolumeSpecName "v4-0-config-system-router-certs". PluginName "kubernetes.io/secret", VolumeGidValue ""
Jan 28 20:39:52 crc kubenswrapper[4746]: I0128 20:39:52.828868 4746 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/01ab3dd5-8196-46d0-ad33-122e2ca51def-kube-api-access-w7l8j" (OuterVolumeSpecName: "kube-api-access-w7l8j") pod "01ab3dd5-8196-46d0-ad33-122e2ca51def" (UID: "01ab3dd5-8196-46d0-ad33-122e2ca51def"). InnerVolumeSpecName "kube-api-access-w7l8j". PluginName "kubernetes.io/projected", VolumeGidValue ""
Jan 28 20:39:52 crc kubenswrapper[4746]: I0128 20:39:52.829043 4746 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/925f1c65-6136-48ba-85aa-3a3b50560753-kube-api-access-s4n52" (OuterVolumeSpecName: "kube-api-access-s4n52") pod "925f1c65-6136-48ba-85aa-3a3b50560753" (UID: "925f1c65-6136-48ba-85aa-3a3b50560753"). InnerVolumeSpecName "kube-api-access-s4n52". PluginName "kubernetes.io/projected", VolumeGidValue ""
Jan 28 20:39:52 crc kubenswrapper[4746]: I0128 20:39:52.829188 4746 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/1d611f23-29be-4491-8495-bee1670e935f-kube-api-access-bf2bz" (OuterVolumeSpecName: "kube-api-access-bf2bz") pod "1d611f23-29be-4491-8495-bee1670e935f" (UID: "1d611f23-29be-4491-8495-bee1670e935f"). InnerVolumeSpecName "kube-api-access-bf2bz". PluginName "kubernetes.io/projected", VolumeGidValue ""
Jan 28 20:39:52 crc kubenswrapper[4746]: I0128 20:39:52.830221 4746 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/0b574797-001e-440a-8f4e-c0be86edad0f-mcc-auth-proxy-config" (OuterVolumeSpecName: "mcc-auth-proxy-config") pod "0b574797-001e-440a-8f4e-c0be86edad0f" (UID: "0b574797-001e-440a-8f4e-c0be86edad0f"). InnerVolumeSpecName "mcc-auth-proxy-config". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Jan 28 20:39:52 crc kubenswrapper[4746]: I0128 20:39:52.830417 4746 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-apiservice-cert" (OuterVolumeSpecName: "apiservice-cert") pod "308be0ea-9f5f-4b29-aeb1-5abd31a0b17b" (UID: "308be0ea-9f5f-4b29-aeb1-5abd31a0b17b"). InnerVolumeSpecName "apiservice-cert". PluginName "kubernetes.io/secret", VolumeGidValue ""
Jan 28 20:39:52 crc kubenswrapper[4746]: I0128 20:39:52.830620 4746 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/c03ee662-fb2f-4fc4-a2c1-af487c19d254-metrics-certs" (OuterVolumeSpecName: "metrics-certs") pod "c03ee662-fb2f-4fc4-a2c1-af487c19d254" (UID: "c03ee662-fb2f-4fc4-a2c1-af487c19d254"). InnerVolumeSpecName "metrics-certs". PluginName "kubernetes.io/secret", VolumeGidValue ""
Jan 28 20:39:52 crc kubenswrapper[4746]: I0128 20:39:52.831112 4746 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-audit-policies" (OuterVolumeSpecName: "audit-policies") pod "09ae3b1a-e8e7-4524-b54b-61eab6f9239a" (UID: "09ae3b1a-e8e7-4524-b54b-61eab6f9239a"). InnerVolumeSpecName "audit-policies". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Jan 28 20:39:52 crc kubenswrapper[4746]: I0128 20:39:52.831167 4746 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/20b0d48f-5fd6-431c-a545-e3c800c7b866-kube-api-access-w9rds" (OuterVolumeSpecName: "kube-api-access-w9rds") pod "20b0d48f-5fd6-431c-a545-e3c800c7b866" (UID: "20b0d48f-5fd6-431c-a545-e3c800c7b866"). InnerVolumeSpecName "kube-api-access-w9rds". PluginName "kubernetes.io/projected", VolumeGidValue ""
Jan 28 20:39:52 crc kubenswrapper[4746]: I0128 20:39:52.831145 4746 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/1386a44e-36a2-460c-96d0-0359d2b6f0f5-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "1386a44e-36a2-460c-96d0-0359d2b6f0f5" (UID: "1386a44e-36a2-460c-96d0-0359d2b6f0f5"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue ""
Jan 28 20:39:52 crc kubenswrapper[4746]: I0128 20:39:52.831282 4746 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/01ab3dd5-8196-46d0-ad33-122e2ca51def-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "01ab3dd5-8196-46d0-ad33-122e2ca51def" (UID: "01ab3dd5-8196-46d0-ad33-122e2ca51def"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue ""
Jan 28 20:39:52 crc kubenswrapper[4746]: I0128 20:39:52.831419 4746 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/6402fda4-df10-493c-b4e5-d0569419652d-config" (OuterVolumeSpecName: "config") pod "6402fda4-df10-493c-b4e5-d0569419652d" (UID: "6402fda4-df10-493c-b4e5-d0569419652d"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Jan 28 20:39:52 crc kubenswrapper[4746]: I0128 20:39:52.831455 4746 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-etcd-serving-ca" (OuterVolumeSpecName: "etcd-serving-ca") pod "09ae3b1a-e8e7-4524-b54b-61eab6f9239a" (UID: "09ae3b1a-e8e7-4524-b54b-61eab6f9239a"). InnerVolumeSpecName "etcd-serving-ca". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Jan 28 20:39:52 crc kubenswrapper[4746]: I0128 20:39:52.831756 4746 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/bf126b07-da06-4140-9a57-dfd54fc6b486-bound-sa-token" (OuterVolumeSpecName: "bound-sa-token") pod "bf126b07-da06-4140-9a57-dfd54fc6b486" (UID: "bf126b07-da06-4140-9a57-dfd54fc6b486"). InnerVolumeSpecName "bound-sa-token". PluginName "kubernetes.io/projected", VolumeGidValue ""
Jan 28 20:39:52 crc kubenswrapper[4746]: I0128 20:39:52.831948 4746 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c-kube-api-access-qs4fp" (OuterVolumeSpecName: "kube-api-access-qs4fp") pod "210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c" (UID: "210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c"). InnerVolumeSpecName "kube-api-access-qs4fp". PluginName "kubernetes.io/projected", VolumeGidValue ""
Jan 28 20:39:52 crc kubenswrapper[4746]: I0128 20:39:52.832064 4746 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/bf126b07-da06-4140-9a57-dfd54fc6b486-image-registry-operator-tls" (OuterVolumeSpecName: "image-registry-operator-tls") pod "bf126b07-da06-4140-9a57-dfd54fc6b486" (UID: "bf126b07-da06-4140-9a57-dfd54fc6b486"). InnerVolumeSpecName "image-registry-operator-tls". PluginName "kubernetes.io/secret", VolumeGidValue ""
Jan 28 20:39:52 crc kubenswrapper[4746]: I0128 20:39:52.832308 4746 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-service-ca" (OuterVolumeSpecName: "v4-0-config-system-service-ca") pod "49c341d1-5089-4bc2-86a0-a5e165cfcc6b" (UID: "49c341d1-5089-4bc2-86a0-a5e165cfcc6b"). InnerVolumeSpecName "v4-0-config-system-service-ca". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Jan 28 20:39:52 crc kubenswrapper[4746]: I0128 20:39:52.832330 4746 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d-kube-api-access-mnrrd" (OuterVolumeSpecName: "kube-api-access-mnrrd") pod "bc5039c0-ea34-426b-a2b7-fbbc87b49a6d" (UID: "bc5039c0-ea34-426b-a2b7-fbbc87b49a6d"). InnerVolumeSpecName "kube-api-access-mnrrd". PluginName "kubernetes.io/projected", VolumeGidValue ""
Jan 28 20:39:52 crc kubenswrapper[4746]: I0128 20:39:52.832519 4746 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/1386a44e-36a2-460c-96d0-0359d2b6f0f5-config" (OuterVolumeSpecName: "config") pod "1386a44e-36a2-460c-96d0-0359d2b6f0f5" (UID: "1386a44e-36a2-460c-96d0-0359d2b6f0f5"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Jan 28 20:39:52 crc kubenswrapper[4746]: I0128 20:39:52.832922 4746 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/31d8b7a1-420e-4252-a5b7-eebe8a111292-auth-proxy-config" (OuterVolumeSpecName: "auth-proxy-config") pod "31d8b7a1-420e-4252-a5b7-eebe8a111292" (UID: "31d8b7a1-420e-4252-a5b7-eebe8a111292"). InnerVolumeSpecName "auth-proxy-config". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Jan 28 20:39:52 crc kubenswrapper[4746]: I0128 20:39:52.833007 4746 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/e7e6199b-1264-4501-8953-767f51328d08-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "e7e6199b-1264-4501-8953-767f51328d08" (UID: "e7e6199b-1264-4501-8953-767f51328d08"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue ""
Jan 28 20:39:52 crc kubenswrapper[4746]: I0128 20:39:52.833303 4746 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/25e176fe-21b4-4974-b1ed-c8b94f112a7f-kube-api-access-d4lsv" (OuterVolumeSpecName: "kube-api-access-d4lsv") pod "25e176fe-21b4-4974-b1ed-c8b94f112a7f" (UID: "25e176fe-21b4-4974-b1ed-c8b94f112a7f"). InnerVolumeSpecName "kube-api-access-d4lsv". PluginName "kubernetes.io/projected", VolumeGidValue ""
Jan 28 20:39:52 crc kubenswrapper[4746]: I0128 20:39:52.833988 4746 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/6ea678ab-3438-413e-bfe3-290ae7725660-ovn-node-metrics-cert" (OuterVolumeSpecName: "ovn-node-metrics-cert") pod "6ea678ab-3438-413e-bfe3-290ae7725660" (UID: "6ea678ab-3438-413e-bfe3-290ae7725660"). InnerVolumeSpecName "ovn-node-metrics-cert". PluginName "kubernetes.io/secret", VolumeGidValue ""
Jan 28 20:39:52 crc kubenswrapper[4746]: I0128 20:39:52.834139 4746 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/57a731c4-ef35-47a8-b875-bfb08a7f8011-utilities" (OuterVolumeSpecName: "utilities") pod "57a731c4-ef35-47a8-b875-bfb08a7f8011" (UID: "57a731c4-ef35-47a8-b875-bfb08a7f8011"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Jan 28 20:39:52 crc kubenswrapper[4746]: I0128 20:39:52.834345 4746 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/22c825df-677d-4ca6-82db-3454ed06e783-auth-proxy-config" (OuterVolumeSpecName: "auth-proxy-config") pod "22c825df-677d-4ca6-82db-3454ed06e783" (UID: "22c825df-677d-4ca6-82db-3454ed06e783"). InnerVolumeSpecName "auth-proxy-config". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Jan 28 20:39:52 crc kubenswrapper[4746]: I0128 20:39:52.834408 4746 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/5441d097-087c-4d9a-baa8-b210afa90fc9-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "5441d097-087c-4d9a-baa8-b210afa90fc9" (UID: "5441d097-087c-4d9a-baa8-b210afa90fc9"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue ""
Jan 28 20:39:52 crc kubenswrapper[4746]: E0128 20:39:52.834499 4746 configmap.go:193] Couldn't get configMap openshift-network-console/networking-console-plugin: object "openshift-network-console"/"networking-console-plugin" not registered
Jan 28 20:39:52 crc kubenswrapper[4746]: I0128 20:39:52.834600 4746 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/09efc573-dbb6-4249-bd59-9b87aba8dd28-kube-api-access-8tdtz" (OuterVolumeSpecName: "kube-api-access-8tdtz") pod "09efc573-dbb6-4249-bd59-9b87aba8dd28" (UID: "09efc573-dbb6-4249-bd59-9b87aba8dd28"). InnerVolumeSpecName "kube-api-access-8tdtz". PluginName "kubernetes.io/projected", VolumeGidValue ""
Jan 28 20:39:52 crc kubenswrapper[4746]: E0128 20:39:52.834695 4746 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. No retries permitted until 2026-01-28 20:39:53.334677209 +0000 UTC m=+21.290863563 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "nginx-conf" (UniqueName: "kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin" not registered
Jan 28 20:39:52 crc kubenswrapper[4746]: I0128 20:39:52.834742 4746 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/49ef4625-1d3a-4a9f-b595-c2433d32326d-kube-api-access-pjr6v" (OuterVolumeSpecName: "kube-api-access-pjr6v") pod "49ef4625-1d3a-4a9f-b595-c2433d32326d" (UID: "49ef4625-1d3a-4a9f-b595-c2433d32326d"). InnerVolumeSpecName "kube-api-access-pjr6v". PluginName "kubernetes.io/projected", VolumeGidValue ""
Jan 28 20:39:52 crc kubenswrapper[4746]: I0128 20:39:52.837163 4746 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovnkube-identity-cm\" (UniqueName: \"kubernetes.io/configmap/ef543e1b-8068-4ea3-b32a-61027b32e95d-ovnkube-identity-cm\") pod \"network-node-identity-vrzqb\" (UID: \"ef543e1b-8068-4ea3-b32a-61027b32e95d\") " pod="openshift-network-node-identity/network-node-identity-vrzqb"
Jan 28 20:39:52 crc kubenswrapper[4746]: I0128 20:39:52.837331 4746 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/7583ce53-e0fe-4a16-9e4d-50516596a136-kube-api-access-xcphl" (OuterVolumeSpecName: "kube-api-access-xcphl") pod "7583ce53-e0fe-4a16-9e4d-50516596a136" (UID: "7583ce53-e0fe-4a16-9e4d-50516596a136"). InnerVolumeSpecName "kube-api-access-xcphl". PluginName "kubernetes.io/projected", VolumeGidValue ""
Jan 28 20:39:52 crc kubenswrapper[4746]: I0128 20:39:52.837647 4746 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/96b93a3a-6083-4aea-8eab-fe1aa8245ad9-kube-api-access-nzwt7" (OuterVolumeSpecName: "kube-api-access-nzwt7") pod "96b93a3a-6083-4aea-8eab-fe1aa8245ad9" (UID: "96b93a3a-6083-4aea-8eab-fe1aa8245ad9"). InnerVolumeSpecName "kube-api-access-nzwt7". PluginName "kubernetes.io/projected", VolumeGidValue ""
Jan 28 20:39:52 crc kubenswrapper[4746]: I0128 20:39:52.837853 4746 swap_util.go:74] "error creating dir to test if tmpfs noswap is enabled. Assuming not supported" mount path="" error="stat /var/lib/kubelet/plugins/kubernetes.io/empty-dir: no such file or directory"
Jan 28 20:39:52 crc kubenswrapper[4746]: I0128 20:39:52.838431 4746 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/925f1c65-6136-48ba-85aa-3a3b50560753-ovnkube-config" (OuterVolumeSpecName: "ovnkube-config") pod "925f1c65-6136-48ba-85aa-3a3b50560753" (UID: "925f1c65-6136-48ba-85aa-3a3b50560753"). InnerVolumeSpecName "ovnkube-config". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Jan 28 20:39:52 crc kubenswrapper[4746]: I0128 20:39:52.838901 4746 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/c03ee662-fb2f-4fc4-a2c1-af487c19d254-default-certificate" (OuterVolumeSpecName: "default-certificate") pod "c03ee662-fb2f-4fc4-a2c1-af487c19d254" (UID: "c03ee662-fb2f-4fc4-a2c1-af487c19d254"). InnerVolumeSpecName "default-certificate". PluginName "kubernetes.io/secret", VolumeGidValue ""
Jan 28 20:39:52 crc kubenswrapper[4746]: I0128 20:39:52.839438 4746 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/6ea678ab-3438-413e-bfe3-290ae7725660-ovnkube-config" (OuterVolumeSpecName: "ovnkube-config") pod "6ea678ab-3438-413e-bfe3-290ae7725660" (UID: "6ea678ab-3438-413e-bfe3-290ae7725660"). InnerVolumeSpecName "ovnkube-config". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Jan 28 20:39:52 crc kubenswrapper[4746]: I0128 20:39:52.839886 4746 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/6509e943-70c6-444c-bc41-48a544e36fbd-service-ca-bundle" (OuterVolumeSpecName: "service-ca-bundle") pod "6509e943-70c6-444c-bc41-48a544e36fbd" (UID: "6509e943-70c6-444c-bc41-48a544e36fbd"). InnerVolumeSpecName "service-ca-bundle". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Jan 28 20:39:52 crc kubenswrapper[4746]: I0128 20:39:52.840788 4746 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/fda69060-fa79-4696-b1a6-7980f124bf7c-mcd-auth-proxy-config" (OuterVolumeSpecName: "mcd-auth-proxy-config") pod "fda69060-fa79-4696-b1a6-7980f124bf7c" (UID: "fda69060-fa79-4696-b1a6-7980f124bf7c"). InnerVolumeSpecName "mcd-auth-proxy-config". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Jan 28 20:39:52 crc kubenswrapper[4746]: I0128 20:39:52.840808 4746 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="01ab3dd5-8196-46d0-ad33-122e2ca51def" path="/var/lib/kubelet/pods/01ab3dd5-8196-46d0-ad33-122e2ca51def/volumes"
Jan 28 20:39:52 crc kubenswrapper[4746]: I0128 20:39:52.841318 4746 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/a31745f5-9847-4afe-82a5-3161cc66ca93-bound-sa-token" (OuterVolumeSpecName: "bound-sa-token") pod "a31745f5-9847-4afe-82a5-3161cc66ca93" (UID: "a31745f5-9847-4afe-82a5-3161cc66ca93"). InnerVolumeSpecName "bound-sa-token". PluginName "kubernetes.io/projected", VolumeGidValue ""
Jan 28 20:39:52 crc kubenswrapper[4746]: I0128 20:39:52.841796 4746 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/b6312bbd-5731-4ea0-a20f-81d5a57df44a-srv-cert" (OuterVolumeSpecName: "srv-cert") pod "b6312bbd-5731-4ea0-a20f-81d5a57df44a" (UID: "b6312bbd-5731-4ea0-a20f-81d5a57df44a"). InnerVolumeSpecName "srv-cert". PluginName "kubernetes.io/secret", VolumeGidValue ""
Jan 28 20:39:52 crc kubenswrapper[4746]: I0128 20:39:52.841897 4746 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/5441d097-087c-4d9a-baa8-b210afa90fc9-kube-api-access-2d4wz" (OuterVolumeSpecName: "kube-api-access-2d4wz") pod "5441d097-087c-4d9a-baa8-b210afa90fc9" (UID: "5441d097-087c-4d9a-baa8-b210afa90fc9"). InnerVolumeSpecName "kube-api-access-2d4wz". PluginName "kubernetes.io/projected", VolumeGidValue ""
Jan 28 20:39:52 crc kubenswrapper[4746]: I0128 20:39:52.841899 4746 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/6402fda4-df10-493c-b4e5-d0569419652d-machine-api-operator-tls" (OuterVolumeSpecName: "machine-api-operator-tls") pod "6402fda4-df10-493c-b4e5-d0569419652d" (UID: "6402fda4-df10-493c-b4e5-d0569419652d"). InnerVolumeSpecName "machine-api-operator-tls". PluginName "kubernetes.io/secret", VolumeGidValue ""
Jan 28 20:39:52 crc kubenswrapper[4746]: I0128 20:39:52.842133 4746 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/7bb08738-c794-4ee8-9972-3a62ca171029-cni-binary-copy" (OuterVolumeSpecName: "cni-binary-copy") pod "7bb08738-c794-4ee8-9972-3a62ca171029" (UID: "7bb08738-c794-4ee8-9972-3a62ca171029"). InnerVolumeSpecName "cni-binary-copy". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Jan 28 20:39:52 crc kubenswrapper[4746]: I0128 20:39:52.842194 4746 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/43509403-f426-496e-be36-56cef71462f5-console-oauth-config" (OuterVolumeSpecName: "console-oauth-config") pod "43509403-f426-496e-be36-56cef71462f5" (UID: "43509403-f426-496e-be36-56cef71462f5"). InnerVolumeSpecName "console-oauth-config". PluginName "kubernetes.io/secret", VolumeGidValue ""
Jan 28 20:39:52 crc kubenswrapper[4746]: I0128 20:39:52.842390 4746 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/6402fda4-df10-493c-b4e5-d0569419652d-images" (OuterVolumeSpecName: "images") pod "6402fda4-df10-493c-b4e5-d0569419652d" (UID: "6402fda4-df10-493c-b4e5-d0569419652d"). InnerVolumeSpecName "images". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Jan 28 20:39:52 crc kubenswrapper[4746]: I0128 20:39:52.842896 4746 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-idp-0-file-data" (OuterVolumeSpecName: "v4-0-config-user-idp-0-file-data") pod "49c341d1-5089-4bc2-86a0-a5e165cfcc6b" (UID: "49c341d1-5089-4bc2-86a0-a5e165cfcc6b"). InnerVolumeSpecName "v4-0-config-user-idp-0-file-data". PluginName "kubernetes.io/secret", VolumeGidValue ""
Jan 28 20:39:52 crc kubenswrapper[4746]: I0128 20:39:52.842954 4746 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/1386a44e-36a2-460c-96d0-0359d2b6f0f5-kube-api-access" (OuterVolumeSpecName: "kube-api-access") pod "1386a44e-36a2-460c-96d0-0359d2b6f0f5" (UID: "1386a44e-36a2-460c-96d0-0359d2b6f0f5"). InnerVolumeSpecName "kube-api-access". PluginName "kubernetes.io/projected", VolumeGidValue ""
Jan 28 20:39:52 crc kubenswrapper[4746]: I0128 20:39:52.842981 4746 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-trusted-ca-bundle" (OuterVolumeSpecName: "trusted-ca-bundle") pod "43509403-f426-496e-be36-56cef71462f5" (UID: "43509403-f426-496e-be36-56cef71462f5"). InnerVolumeSpecName "trusted-ca-bundle". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Jan 28 20:39:52 crc kubenswrapper[4746]: I0128 20:39:52.843068 4746 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/a31745f5-9847-4afe-82a5-3161cc66ca93-metrics-tls" (OuterVolumeSpecName: "metrics-tls") pod "a31745f5-9847-4afe-82a5-3161cc66ca93" (UID: "a31745f5-9847-4afe-82a5-3161cc66ca93"). InnerVolumeSpecName "metrics-tls". PluginName "kubernetes.io/secret", VolumeGidValue ""
Jan 28 20:39:52 crc kubenswrapper[4746]: I0128 20:39:52.843396 4746 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/5fe579f8-e8a6-4643-bce5-a661393c4dde-kube-api-access-fcqwp" (OuterVolumeSpecName: "kube-api-access-fcqwp") pod "5fe579f8-e8a6-4643-bce5-a661393c4dde" (UID: "5fe579f8-e8a6-4643-bce5-a661393c4dde"). InnerVolumeSpecName "kube-api-access-fcqwp". PluginName "kubernetes.io/projected", VolumeGidValue ""
Jan 28 20:39:52 crc kubenswrapper[4746]: I0128 20:39:52.843615 4746 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/7583ce53-e0fe-4a16-9e4d-50516596a136-client-ca" (OuterVolumeSpecName: "client-ca") pod "7583ce53-e0fe-4a16-9e4d-50516596a136" (UID: "7583ce53-e0fe-4a16-9e4d-50516596a136"). InnerVolumeSpecName "client-ca". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Jan 28 20:39:52 crc kubenswrapper[4746]: I0128 20:39:52.843634 4746 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/7539238d-5fe0-46ed-884e-1c3b566537ec-config" (OuterVolumeSpecName: "config") pod "7539238d-5fe0-46ed-884e-1c3b566537ec" (UID: "7539238d-5fe0-46ed-884e-1c3b566537ec"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Jan 28 20:39:52 crc kubenswrapper[4746]: I0128 20:39:52.843659 4746 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/43509403-f426-496e-be36-56cef71462f5-kube-api-access-qg5z5" (OuterVolumeSpecName: "kube-api-access-qg5z5") pod "43509403-f426-496e-be36-56cef71462f5" (UID: "43509403-f426-496e-be36-56cef71462f5"). InnerVolumeSpecName "kube-api-access-qg5z5". PluginName "kubernetes.io/projected", VolumeGidValue ""
Jan 28 20:39:52 crc kubenswrapper[4746]: I0128 20:39:52.843906 4746 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-template-login" (OuterVolumeSpecName: "v4-0-config-user-template-login") pod "49c341d1-5089-4bc2-86a0-a5e165cfcc6b" (UID: "49c341d1-5089-4bc2-86a0-a5e165cfcc6b"). InnerVolumeSpecName "v4-0-config-user-template-login". PluginName "kubernetes.io/secret", VolumeGidValue ""
Jan 28 20:39:52 crc kubenswrapper[4746]: I0128 20:39:52.844010 4746 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/5225d0e4-402f-4861-b410-819f433b1803-kube-api-access-9xfj7" (OuterVolumeSpecName: "kube-api-access-9xfj7") pod "5225d0e4-402f-4861-b410-819f433b1803" (UID: "5225d0e4-402f-4861-b410-819f433b1803"). InnerVolumeSpecName "kube-api-access-9xfj7". PluginName "kubernetes.io/projected", VolumeGidValue ""
Jan 28 20:39:52 crc kubenswrapper[4746]: I0128 20:39:52.844040 4746 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-template-error" (OuterVolumeSpecName: "v4-0-config-user-template-error") pod "49c341d1-5089-4bc2-86a0-a5e165cfcc6b" (UID: "49c341d1-5089-4bc2-86a0-a5e165cfcc6b"). InnerVolumeSpecName "v4-0-config-user-template-error". PluginName "kubernetes.io/secret", VolumeGidValue ""
Jan 28 20:39:52 crc kubenswrapper[4746]: I0128 20:39:52.844136 4746 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/96b93a3a-6083-4aea-8eab-fe1aa8245ad9-metrics-tls" (OuterVolumeSpecName: "metrics-tls") pod "96b93a3a-6083-4aea-8eab-fe1aa8245ad9" (UID: "96b93a3a-6083-4aea-8eab-fe1aa8245ad9"). InnerVolumeSpecName "metrics-tls". PluginName "kubernetes.io/secret", VolumeGidValue ""
Jan 28 20:39:52 crc kubenswrapper[4746]: I0128 20:39:52.844446 4746 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/8f668bae-612b-4b75-9490-919e737c6a3b-bound-sa-token" (OuterVolumeSpecName: "bound-sa-token") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b"). InnerVolumeSpecName "bound-sa-token". PluginName "kubernetes.io/projected", VolumeGidValue ""
Jan 28 20:39:52 crc kubenswrapper[4746]: I0128 20:39:52.844811 4746 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/6731426b-95fe-49ff-bb5f-40441049fde2-kube-api-access-x7zkh" (OuterVolumeSpecName: "kube-api-access-x7zkh") pod "6731426b-95fe-49ff-bb5f-40441049fde2" (UID: "6731426b-95fe-49ff-bb5f-40441049fde2"). InnerVolumeSpecName "kube-api-access-x7zkh". PluginName "kubernetes.io/projected", VolumeGidValue ""
Jan 28 20:39:52 crc kubenswrapper[4746]: I0128 20:39:52.844863 4746 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/5225d0e4-402f-4861-b410-819f433b1803-utilities" (OuterVolumeSpecName: "utilities") pod "5225d0e4-402f-4861-b410-819f433b1803" (UID: "5225d0e4-402f-4861-b410-819f433b1803"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Jan 28 20:39:52 crc kubenswrapper[4746]: I0128 20:39:52.845182 4746 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/bd23aa5c-e532-4e53-bccf-e79f130c5ae8-kube-api-access-jhbk2" (OuterVolumeSpecName: "kube-api-access-jhbk2") pod "bd23aa5c-e532-4e53-bccf-e79f130c5ae8" (UID: "bd23aa5c-e532-4e53-bccf-e79f130c5ae8"). InnerVolumeSpecName "kube-api-access-jhbk2". PluginName "kubernetes.io/projected", VolumeGidValue ""
Jan 28 20:39:52 crc kubenswrapper[4746]: I0128 20:39:52.845297 4746 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9-kube-api-access-dbsvg" (OuterVolumeSpecName: "kube-api-access-dbsvg") pod "f88749ec-7931-4ee7-b3fc-1ec5e11f92e9" (UID: "f88749ec-7931-4ee7-b3fc-1ec5e11f92e9"). InnerVolumeSpecName "kube-api-access-dbsvg". PluginName "kubernetes.io/projected", VolumeGidValue ""
Jan 28 20:39:52 crc kubenswrapper[4746]: I0128 20:39:52.845332 4746 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/6ea678ab-3438-413e-bfe3-290ae7725660-kube-api-access-htfz6" (OuterVolumeSpecName: "kube-api-access-htfz6") pod "6ea678ab-3438-413e-bfe3-290ae7725660" (UID: "6ea678ab-3438-413e-bfe3-290ae7725660"). InnerVolumeSpecName "kube-api-access-htfz6". PluginName "kubernetes.io/projected", VolumeGidValue ""
Jan 28 20:39:52 crc kubenswrapper[4746]: I0128 20:39:52.845384 4746 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-system-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-trusted-ca-bundle\") on node \"crc\" DevicePath \"\""
Jan 28 20:39:52 crc kubenswrapper[4746]: I0128 20:39:52.845455 4746 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-etcd-serving-ca" (OuterVolumeSpecName: "etcd-serving-ca") pod "1bf7eb37-55a3-4c65-b768-a94c82151e69" (UID: "1bf7eb37-55a3-4c65-b768-a94c82151e69"). InnerVolumeSpecName "etcd-serving-ca". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Jan 28 20:39:52 crc kubenswrapper[4746]: I0128 20:39:52.845471 4746 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-config" (OuterVolumeSpecName: "config") pod "1bf7eb37-55a3-4c65-b768-a94c82151e69" (UID: "1bf7eb37-55a3-4c65-b768-a94c82151e69"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Jan 28 20:39:52 crc kubenswrapper[4746]: I0128 20:39:52.845853 4746 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/8f668bae-612b-4b75-9490-919e737c6a3b-kube-api-access-kfwg7" (OuterVolumeSpecName: "kube-api-access-kfwg7") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b"). InnerVolumeSpecName "kube-api-access-kfwg7". PluginName "kubernetes.io/projected", VolumeGidValue ""
Jan 28 20:39:52 crc kubenswrapper[4746]: I0128 20:39:52.846003 4746 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/0b574797-001e-440a-8f4e-c0be86edad0f-kube-api-access-lzf88" (OuterVolumeSpecName: "kube-api-access-lzf88") pod "0b574797-001e-440a-8f4e-c0be86edad0f" (UID: "0b574797-001e-440a-8f4e-c0be86edad0f"). InnerVolumeSpecName "kube-api-access-lzf88". PluginName "kubernetes.io/projected", VolumeGidValue ""
Jan 28 20:39:52 crc kubenswrapper[4746]: I0128 20:39:52.846285 4746 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-trusted-ca-bundle" (OuterVolumeSpecName: "trusted-ca-bundle") pod "1bf7eb37-55a3-4c65-b768-a94c82151e69" (UID: "1bf7eb37-55a3-4c65-b768-a94c82151e69"). InnerVolumeSpecName "trusted-ca-bundle". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Jan 28 20:39:52 crc kubenswrapper[4746]: I0128 20:39:52.846678 4746 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-audit" (OuterVolumeSpecName: "audit") pod "1bf7eb37-55a3-4c65-b768-a94c82151e69" (UID: "1bf7eb37-55a3-4c65-b768-a94c82151e69"). InnerVolumeSpecName "audit". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Jan 28 20:39:52 crc kubenswrapper[4746]: I0128 20:39:52.846998 4746 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c-config" (OuterVolumeSpecName: "config") pod "210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c" (UID: "210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Jan 28 20:39:52 crc kubenswrapper[4746]: I0128 20:39:52.847193 4746 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/7583ce53-e0fe-4a16-9e4d-50516596a136-proxy-ca-bundles" (OuterVolumeSpecName: "proxy-ca-bundles") pod "7583ce53-e0fe-4a16-9e4d-50516596a136" (UID: "7583ce53-e0fe-4a16-9e4d-50516596a136"). InnerVolumeSpecName "proxy-ca-bundles". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Jan 28 20:39:52 crc kubenswrapper[4746]: I0128 20:39:52.847404 4746 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/7583ce53-e0fe-4a16-9e4d-50516596a136-config" (OuterVolumeSpecName: "config") pod "7583ce53-e0fe-4a16-9e4d-50516596a136" (UID: "7583ce53-e0fe-4a16-9e4d-50516596a136"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Jan 28 20:39:52 crc kubenswrapper[4746]: I0128 20:39:52.847520 4746 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="20b0d48f-5fd6-431c-a545-e3c800c7b866" path="/var/lib/kubelet/pods/20b0d48f-5fd6-431c-a545-e3c800c7b866/volumes"
Jan 28 20:39:52 crc kubenswrapper[4746]: I0128 20:39:52.848049 4746 reconciler_common.go:293] "Volume detached for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/4bb40260-dbaa-4fb0-84df-5e680505d512-cni-binary-copy\") on node \"crc\" DevicePath \"\""
Jan 28 20:39:52 crc kubenswrapper[4746]: I0128 20:39:52.848068 4746 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/ef543e1b-8068-4ea3-b32a-61027b32e95d-env-overrides\") pod \"network-node-identity-vrzqb\" (UID: \"ef543e1b-8068-4ea3-b32a-61027b32e95d\") " pod="openshift-network-node-identity/network-node-identity-vrzqb"
Jan 28 20:39:52 crc kubenswrapper[4746]: I0128 20:39:52.848322 4746 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/496e6271-fb68-4057-954e-a0d97a4afa3f-config\") on node \"crc\" DevicePath \"\""
Jan 28 20:39:52 crc kubenswrapper[4746]: I0128 20:39:52.848338 4746 reconciler_common.go:293] "Volume detached for volume \"signing-key\" (UniqueName: \"kubernetes.io/secret/25e176fe-21b4-4974-b1ed-c8b94f112a7f-signing-key\") on node \"crc\" DevicePath \"\""
Jan 28 20:39:52 crc kubenswrapper[4746]: I0128 20:39:52.848351 4746 reconciler_common.go:293] "Volume detached for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-trusted-ca-bundle\") on node \"crc\" DevicePath \"\""
Jan 28 20:39:52 crc kubenswrapper[4746]: I0128 20:39:52.848974 4746 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"webhook-cert\" (UniqueName:
\"kubernetes.io/secret/ef543e1b-8068-4ea3-b32a-61027b32e95d-webhook-cert\") pod \"network-node-identity-vrzqb\" (UID: \"ef543e1b-8068-4ea3-b32a-61027b32e95d\") " pod="openshift-network-node-identity/network-node-identity-vrzqb" Jan 28 20:39:52 crc kubenswrapper[4746]: I0128 20:39:52.848983 4746 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/09efc573-dbb6-4249-bd59-9b87aba8dd28-etcd-service-ca" (OuterVolumeSpecName: "etcd-service-ca") pod "09efc573-dbb6-4249-bd59-9b87aba8dd28" (UID: "09efc573-dbb6-4249-bd59-9b87aba8dd28"). InnerVolumeSpecName "etcd-service-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 28 20:39:52 crc kubenswrapper[4746]: I0128 20:39:52.850465 4746 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/a0128f3a-b052-44ed-a84e-c4c8aaf17c13-kube-api-access-gf66m" (OuterVolumeSpecName: "kube-api-access-gf66m") pod "a0128f3a-b052-44ed-a84e-c4c8aaf17c13" (UID: "a0128f3a-b052-44ed-a84e-c4c8aaf17c13"). InnerVolumeSpecName "kube-api-access-gf66m". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 28 20:39:52 crc kubenswrapper[4746]: I0128 20:39:52.851005 4746 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/25e176fe-21b4-4974-b1ed-c8b94f112a7f-signing-cabundle" (OuterVolumeSpecName: "signing-cabundle") pod "25e176fe-21b4-4974-b1ed-c8b94f112a7f" (UID: "25e176fe-21b4-4974-b1ed-c8b94f112a7f"). InnerVolumeSpecName "signing-cabundle". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 28 20:39:52 crc kubenswrapper[4746]: I0128 20:39:52.851156 4746 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/7539238d-5fe0-46ed-884e-1c3b566537ec-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "7539238d-5fe0-46ed-884e-1c3b566537ec" (UID: "7539238d-5fe0-46ed-884e-1c3b566537ec"). InnerVolumeSpecName "serving-cert". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 28 20:39:52 crc kubenswrapper[4746]: I0128 20:39:52.851215 4746 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/7bb08738-c794-4ee8-9972-3a62ca171029-cni-sysctl-allowlist" (OuterVolumeSpecName: "cni-sysctl-allowlist") pod "7bb08738-c794-4ee8-9972-3a62ca171029" (UID: "7bb08738-c794-4ee8-9972-3a62ca171029"). InnerVolumeSpecName "cni-sysctl-allowlist". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 28 20:39:52 crc kubenswrapper[4746]: I0128 20:39:52.851396 4746 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-kube-api-access-6ccd8" (OuterVolumeSpecName: "kube-api-access-6ccd8") pod "308be0ea-9f5f-4b29-aeb1-5abd31a0b17b" (UID: "308be0ea-9f5f-4b29-aeb1-5abd31a0b17b"). InnerVolumeSpecName "kube-api-access-6ccd8". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 28 20:39:52 crc kubenswrapper[4746]: I0128 20:39:52.852196 4746 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/a31745f5-9847-4afe-82a5-3161cc66ca93-trusted-ca" (OuterVolumeSpecName: "trusted-ca") pod "a31745f5-9847-4afe-82a5-3161cc66ca93" (UID: "a31745f5-9847-4afe-82a5-3161cc66ca93"). InnerVolumeSpecName "trusted-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 28 20:39:52 crc kubenswrapper[4746]: I0128 20:39:52.852697 4746 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/31d8b7a1-420e-4252-a5b7-eebe8a111292-images" (OuterVolumeSpecName: "images") pod "31d8b7a1-420e-4252-a5b7-eebe8a111292" (UID: "31d8b7a1-420e-4252-a5b7-eebe8a111292"). InnerVolumeSpecName "images". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 28 20:39:52 crc kubenswrapper[4746]: I0128 20:39:52.852743 4746 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/b6312bbd-5731-4ea0-a20f-81d5a57df44a-profile-collector-cert" (OuterVolumeSpecName: "profile-collector-cert") pod "b6312bbd-5731-4ea0-a20f-81d5a57df44a" (UID: "b6312bbd-5731-4ea0-a20f-81d5a57df44a"). InnerVolumeSpecName "profile-collector-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 28 20:39:52 crc kubenswrapper[4746]: I0128 20:39:52.852975 4746 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-kube-api-access-ngvvp" (OuterVolumeSpecName: "kube-api-access-ngvvp") pod "49c341d1-5089-4bc2-86a0-a5e165cfcc6b" (UID: "49c341d1-5089-4bc2-86a0-a5e165cfcc6b"). InnerVolumeSpecName "kube-api-access-ngvvp". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 28 20:39:52 crc kubenswrapper[4746]: I0128 20:39:52.853006 4746 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/7583ce53-e0fe-4a16-9e4d-50516596a136-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "7583ce53-e0fe-4a16-9e4d-50516596a136" (UID: "7583ce53-e0fe-4a16-9e4d-50516596a136"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 28 20:39:52 crc kubenswrapper[4746]: I0128 20:39:52.853116 4746 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/0b574797-001e-440a-8f4e-c0be86edad0f-proxy-tls" (OuterVolumeSpecName: "proxy-tls") pod "0b574797-001e-440a-8f4e-c0be86edad0f" (UID: "0b574797-001e-440a-8f4e-c0be86edad0f"). InnerVolumeSpecName "proxy-tls". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 28 20:39:52 crc kubenswrapper[4746]: I0128 20:39:52.853383 4746 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/09efc573-dbb6-4249-bd59-9b87aba8dd28-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "09efc573-dbb6-4249-bd59-9b87aba8dd28" (UID: "09efc573-dbb6-4249-bd59-9b87aba8dd28"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 28 20:39:52 crc kubenswrapper[4746]: I0128 20:39:52.853493 4746 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/efdd0498-1daa-4136-9a4a-3b948c2293fc-webhook-certs" (OuterVolumeSpecName: "webhook-certs") pod "efdd0498-1daa-4136-9a4a-3b948c2293fc" (UID: "efdd0498-1daa-4136-9a4a-3b948c2293fc"). InnerVolumeSpecName "webhook-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 28 20:39:52 crc kubenswrapper[4746]: I0128 20:39:52.853558 4746 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/0b78653f-4ff9-4508-8672-245ed9b561e3-service-ca" (OuterVolumeSpecName: "service-ca") pod "0b78653f-4ff9-4508-8672-245ed9b561e3" (UID: "0b78653f-4ff9-4508-8672-245ed9b561e3"). InnerVolumeSpecName "service-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 28 20:39:52 crc kubenswrapper[4746]: I0128 20:39:52.853684 4746 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/5b88f790-22fa-440e-b583-365168c0b23d-metrics-certs" (OuterVolumeSpecName: "metrics-certs") pod "5b88f790-22fa-440e-b583-365168c0b23d" (UID: "5b88f790-22fa-440e-b583-365168c0b23d"). InnerVolumeSpecName "metrics-certs". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 28 20:39:52 crc kubenswrapper[4746]: E0128 20:39:52.853742 4746 secret.go:188] Couldn't get secret openshift-network-console/networking-console-plugin-cert: object "openshift-network-console"/"networking-console-plugin-cert" not registered Jan 28 20:39:52 crc kubenswrapper[4746]: E0128 20:39:52.853849 4746 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. No retries permitted until 2026-01-28 20:39:53.35382299 +0000 UTC m=+21.310009334 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "networking-console-plugin-cert" (UniqueName: "kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin-cert" not registered Jan 28 20:39:52 crc kubenswrapper[4746]: I0128 20:39:52.854014 4746 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/6509e943-70c6-444c-bc41-48a544e36fbd-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "6509e943-70c6-444c-bc41-48a544e36fbd" (UID: "6509e943-70c6-444c-bc41-48a544e36fbd"). InnerVolumeSpecName "serving-cert". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 28 20:39:52 crc kubenswrapper[4746]: I0128 20:39:52.854482 4746 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"iptables-alerter-script\" (UniqueName: \"kubernetes.io/configmap/d75a4c96-2883-4a0b-bab2-0fab2b6c0b49-iptables-alerter-script\") pod \"iptables-alerter-4ln5h\" (UID: \"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\") " pod="openshift-network-operator/iptables-alerter-4ln5h" Jan 28 20:39:52 crc kubenswrapper[4746]: I0128 20:39:52.855100 4746 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/6ea678ab-3438-413e-bfe3-290ae7725660-env-overrides" (OuterVolumeSpecName: "env-overrides") pod "6ea678ab-3438-413e-bfe3-290ae7725660" (UID: "6ea678ab-3438-413e-bfe3-290ae7725660"). InnerVolumeSpecName "env-overrides". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 28 20:39:52 crc kubenswrapper[4746]: I0128 20:39:52.855392 4746 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/9d4552c7-cd75-42dd-8880-30dd377c49a4-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "9d4552c7-cd75-42dd-8880-30dd377c49a4" (UID: "9d4552c7-cd75-42dd-8880-30dd377c49a4"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 28 20:39:52 crc kubenswrapper[4746]: I0128 20:39:52.855428 4746 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-ocp-branding-template" (OuterVolumeSpecName: "v4-0-config-system-ocp-branding-template") pod "49c341d1-5089-4bc2-86a0-a5e165cfcc6b" (UID: "49c341d1-5089-4bc2-86a0-a5e165cfcc6b"). InnerVolumeSpecName "v4-0-config-system-ocp-branding-template". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 28 20:39:52 crc kubenswrapper[4746]: E0128 20:39:52.856033 4746 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Jan 28 20:39:52 crc kubenswrapper[4746]: E0128 20:39:52.856061 4746 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Jan 28 20:39:52 crc kubenswrapper[4746]: E0128 20:39:52.856090 4746 projected.go:194] Error preparing data for projected volume kube-api-access-cqllr for pod openshift-network-diagnostics/network-check-target-xd92c: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Jan 28 20:39:52 crc kubenswrapper[4746]: E0128 20:39:52.856140 4746 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr podName:3b6479f0-333b-4a96-9adf-2099afdc2447 nodeName:}" failed. No retries permitted until 2026-01-28 20:39:53.356125514 +0000 UTC m=+21.312311868 (durationBeforeRetry 500ms). 
Error: MountVolume.SetUp failed for volume "kube-api-access-cqllr" (UniqueName: "kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr") pod "network-check-target-xd92c" (UID: "3b6479f0-333b-4a96-9adf-2099afdc2447") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Jan 28 20:39:52 crc kubenswrapper[4746]: I0128 20:39:52.856497 4746 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-audit-policies" (OuterVolumeSpecName: "audit-policies") pod "49c341d1-5089-4bc2-86a0-a5e165cfcc6b" (UID: "49c341d1-5089-4bc2-86a0-a5e165cfcc6b"). InnerVolumeSpecName "audit-policies". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 28 20:39:52 crc kubenswrapper[4746]: I0128 20:39:52.859919 4746 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/9d4552c7-cd75-42dd-8880-30dd377c49a4-trusted-ca" (OuterVolumeSpecName: "trusted-ca") pod "9d4552c7-cd75-42dd-8880-30dd377c49a4" (UID: "9d4552c7-cd75-42dd-8880-30dd377c49a4"). InnerVolumeSpecName "trusted-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 28 20:39:52 crc kubenswrapper[4746]: I0128 20:39:52.860136 4746 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/1bf7eb37-55a3-4c65-b768-a94c82151e69-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "1bf7eb37-55a3-4c65-b768-a94c82151e69" (UID: "1bf7eb37-55a3-4c65-b768-a94c82151e69"). InnerVolumeSpecName "serving-cert". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 28 20:39:52 crc kubenswrapper[4746]: I0128 20:39:52.861051 4746 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/87cf06ed-a83f-41a7-828d-70653580a8cb-config-volume" (OuterVolumeSpecName: "config-volume") pod "87cf06ed-a83f-41a7-828d-70653580a8cb" (UID: "87cf06ed-a83f-41a7-828d-70653580a8cb"). InnerVolumeSpecName "config-volume". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 28 20:39:52 crc kubenswrapper[4746]: E0128 20:39:52.862602 4746 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Jan 28 20:39:52 crc kubenswrapper[4746]: E0128 20:39:52.862632 4746 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Jan 28 20:39:52 crc kubenswrapper[4746]: E0128 20:39:52.862673 4746 projected.go:194] Error preparing data for projected volume kube-api-access-s2dwl for pod openshift-network-diagnostics/network-check-source-55646444c4-trplf: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Jan 28 20:39:52 crc kubenswrapper[4746]: E0128 20:39:52.862775 4746 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl podName:9d751cbb-f2e2-430d-9754-c882a5e924a5 nodeName:}" failed. No retries permitted until 2026-01-28 20:39:53.362747532 +0000 UTC m=+21.318933886 (durationBeforeRetry 500ms). 
Error: MountVolume.SetUp failed for volume "kube-api-access-s2dwl" (UniqueName: "kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl") pod "network-check-source-55646444c4-trplf" (UID: "9d751cbb-f2e2-430d-9754-c882a5e924a5") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Jan 28 20:39:52 crc kubenswrapper[4746]: I0128 20:39:52.863857 4746 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-28T20:39:52Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-operator]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-28T20:39:52Z\\\",\\\"message\\\":\\\"containers with unready status: [network-operator]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":false,\\\"restartCount\\\":5,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Jan 28 20:39:52 crc kubenswrapper[4746]: I0128 20:39:52.866531 4746 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/0b78653f-4ff9-4508-8672-245ed9b561e3-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "0b78653f-4ff9-4508-8672-245ed9b561e3" (UID: "0b78653f-4ff9-4508-8672-245ed9b561e3"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 28 20:39:52 crc kubenswrapper[4746]: I0128 20:39:52.866933 4746 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/5441d097-087c-4d9a-baa8-b210afa90fc9-config" (OuterVolumeSpecName: "config") pod "5441d097-087c-4d9a-baa8-b210afa90fc9" (UID: "5441d097-087c-4d9a-baa8-b210afa90fc9"). InnerVolumeSpecName "config". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 28 20:39:52 crc kubenswrapper[4746]: I0128 20:39:52.867095 4746 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/5441d097-087c-4d9a-baa8-b210afa90fc9-client-ca" (OuterVolumeSpecName: "client-ca") pod "5441d097-087c-4d9a-baa8-b210afa90fc9" (UID: "5441d097-087c-4d9a-baa8-b210afa90fc9"). InnerVolumeSpecName "client-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 28 20:39:52 crc kubenswrapper[4746]: I0128 20:39:52.867544 4746 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/37a5e44f-9a88-4405-be8a-b645485e7312-metrics-tls\") pod \"network-operator-58b4c7f79c-55gtf\" (UID: \"37a5e44f-9a88-4405-be8a-b645485e7312\") " pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" Jan 28 20:39:52 crc kubenswrapper[4746]: I0128 20:39:52.868645 4746 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-oauth-serving-cert" (OuterVolumeSpecName: "oauth-serving-cert") pod "43509403-f426-496e-be36-56cef71462f5" (UID: "43509403-f426-496e-be36-56cef71462f5"). InnerVolumeSpecName "oauth-serving-cert". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 28 20:39:52 crc kubenswrapper[4746]: I0128 20:39:52.869324 4746 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c" path="/var/lib/kubelet/pods/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c/volumes" Jan 28 20:39:52 crc kubenswrapper[4746]: I0128 20:39:52.870391 4746 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="22c825df-677d-4ca6-82db-3454ed06e783" path="/var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/volumes" Jan 28 20:39:52 crc kubenswrapper[4746]: I0128 20:39:52.872306 4746 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="25e176fe-21b4-4974-b1ed-c8b94f112a7f" path="/var/lib/kubelet/pods/25e176fe-21b4-4974-b1ed-c8b94f112a7f/volumes" Jan 28 20:39:52 crc kubenswrapper[4746]: I0128 20:39:52.874160 4746 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/9d4552c7-cd75-42dd-8880-30dd377c49a4-kube-api-access-pcxfs" (OuterVolumeSpecName: "kube-api-access-pcxfs") pod "9d4552c7-cd75-42dd-8880-30dd377c49a4" (UID: "9d4552c7-cd75-42dd-8880-30dd377c49a4"). InnerVolumeSpecName "kube-api-access-pcxfs". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 28 20:39:52 crc kubenswrapper[4746]: I0128 20:39:52.874357 4746 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="308be0ea-9f5f-4b29-aeb1-5abd31a0b17b" path="/var/lib/kubelet/pods/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b/volumes" Jan 28 20:39:52 crc kubenswrapper[4746]: I0128 20:39:52.875217 4746 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="31d8b7a1-420e-4252-a5b7-eebe8a111292" path="/var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/volumes" Jan 28 20:39:52 crc kubenswrapper[4746]: I0128 20:39:52.875926 4746 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-s2kz5\" (UniqueName: \"kubernetes.io/projected/ef543e1b-8068-4ea3-b32a-61027b32e95d-kube-api-access-s2kz5\") pod \"network-node-identity-vrzqb\" (UID: \"ef543e1b-8068-4ea3-b32a-61027b32e95d\") " pod="openshift-network-node-identity/network-node-identity-vrzqb" Jan 28 20:39:52 crc kubenswrapper[4746]: I0128 20:39:52.876196 4746 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b-config" (OuterVolumeSpecName: "config") pod "8cea82b4-6893-4ddc-af9f-1bb5ae425c5b" (UID: "8cea82b4-6893-4ddc-af9f-1bb5ae425c5b"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 28 20:39:52 crc kubenswrapper[4746]: I0128 20:39:52.877873 4746 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59-serviceca" (OuterVolumeSpecName: "serviceca") pod "3cb93b32-e0ae-4377-b9c8-fdb9842c6d59" (UID: "3cb93b32-e0ae-4377-b9c8-fdb9842c6d59"). InnerVolumeSpecName "serviceca". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 28 20:39:52 crc kubenswrapper[4746]: I0128 20:39:52.879113 4746 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/8f668bae-612b-4b75-9490-919e737c6a3b-installation-pull-secrets" (OuterVolumeSpecName: "installation-pull-secrets") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b"). InnerVolumeSpecName "installation-pull-secrets". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 28 20:39:52 crc kubenswrapper[4746]: I0128 20:39:52.882126 4746 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-kube-api-access-zkvpv" (OuterVolumeSpecName: "kube-api-access-zkvpv") pod "09ae3b1a-e8e7-4524-b54b-61eab6f9239a" (UID: "09ae3b1a-e8e7-4524-b54b-61eab6f9239a"). InnerVolumeSpecName "kube-api-access-zkvpv". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 28 20:39:52 crc kubenswrapper[4746]: I0128 20:39:52.882379 4746 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-rdwmf\" (UniqueName: \"kubernetes.io/projected/37a5e44f-9a88-4405-be8a-b645485e7312-kube-api-access-rdwmf\") pod \"network-operator-58b4c7f79c-55gtf\" (UID: \"37a5e44f-9a88-4405-be8a-b645485e7312\") " pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" Jan 28 20:39:52 crc kubenswrapper[4746]: I0128 20:39:52.882680 4746 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-28T20:39:52Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-28T20:39:52Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Jan 28 20:39:52 crc kubenswrapper[4746]: I0128 20:39:52.884360 4746 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "b11524ee-3fca-4b1b-9cdf-6da289fdbc7d" (UID: "b11524ee-3fca-4b1b-9cdf-6da289fdbc7d"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 28 20:39:52 crc kubenswrapper[4746]: I0128 20:39:52.885589 4746 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-rczfb\" (UniqueName: \"kubernetes.io/projected/d75a4c96-2883-4a0b-bab2-0fab2b6c0b49-kube-api-access-rczfb\") pod \"iptables-alerter-4ln5h\" (UID: \"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\") " pod="openshift-network-operator/iptables-alerter-4ln5h" Jan 28 20:39:52 crc kubenswrapper[4746]: I0128 20:39:52.881877 4746 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/9d4552c7-cd75-42dd-8880-30dd377c49a4-config" (OuterVolumeSpecName: "config") pod "9d4552c7-cd75-42dd-8880-30dd377c49a4" (UID: "9d4552c7-cd75-42dd-8880-30dd377c49a4"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 28 20:39:52 crc kubenswrapper[4746]: I0128 20:39:52.885843 4746 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="3ab1a177-2de0-46d9-b765-d0d0649bb42e" path="/var/lib/kubelet/pods/3ab1a177-2de0-46d9-b765-d0d0649bb42e/volumes" Jan 28 20:39:52 crc kubenswrapper[4746]: I0128 20:39:52.886619 4746 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="3cb93b32-e0ae-4377-b9c8-fdb9842c6d59" path="/var/lib/kubelet/pods/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59/volumes" Jan 28 20:39:52 crc kubenswrapper[4746]: I0128 20:39:52.887855 4746 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="43509403-f426-496e-be36-56cef71462f5" path="/var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes" Jan 28 20:39:52 crc kubenswrapper[4746]: I0128 20:39:52.894480 4746 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-image-import-ca" (OuterVolumeSpecName: "image-import-ca") pod "1bf7eb37-55a3-4c65-b768-a94c82151e69" (UID: "1bf7eb37-55a3-4c65-b768-a94c82151e69"). 
InnerVolumeSpecName "image-import-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 28 20:39:52 crc kubenswrapper[4746]: I0128 20:39:52.894970 4746 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="44663579-783b-4372-86d6-acf235a62d72" path="/var/lib/kubelet/pods/44663579-783b-4372-86d6-acf235a62d72/volumes" Jan 28 20:39:52 crc kubenswrapper[4746]: I0128 20:39:52.895841 4746 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="496e6271-fb68-4057-954e-a0d97a4afa3f" path="/var/lib/kubelet/pods/496e6271-fb68-4057-954e-a0d97a4afa3f/volumes" Jan 28 20:39:52 crc kubenswrapper[4746]: I0128 20:39:52.908556 4746 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-28T20:39:52Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [webhook approver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-28T20:39:52Z\\\",\\\"message\\\":\\\"containers with unready status: [webhook approver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not 
be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":false,\\\"restartCount\\\":6,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Jan 28 20:39:52 crc kubenswrapper[4746]: I0128 20:39:52.909398 4746 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="49c341d1-5089-4bc2-86a0-a5e165cfcc6b" path="/var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes" Jan 28 20:39:52 crc kubenswrapper[4746]: I0128 20:39:52.910782 4746 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="49ef4625-1d3a-4a9f-b595-c2433d32326d" path="/var/lib/kubelet/pods/49ef4625-1d3a-4a9f-b595-c2433d32326d/volumes" Jan 28 20:39:52 crc kubenswrapper[4746]: I0128 20:39:52.911849 4746 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="4bb40260-dbaa-4fb0-84df-5e680505d512" path="/var/lib/kubelet/pods/4bb40260-dbaa-4fb0-84df-5e680505d512/volumes" Jan 28 20:39:52 crc kubenswrapper[4746]: I0128 20:39:52.912964 4746 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="5441d097-087c-4d9a-baa8-b210afa90fc9" 
path="/var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/volumes" Jan 28 20:39:52 crc kubenswrapper[4746]: I0128 20:39:52.913472 4746 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="5b88f790-22fa-440e-b583-365168c0b23d" path="/var/lib/kubelet/pods/5b88f790-22fa-440e-b583-365168c0b23d/volumes" Jan 28 20:39:52 crc kubenswrapper[4746]: I0128 20:39:52.920207 4746 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="5fe579f8-e8a6-4643-bce5-a661393c4dde" path="/var/lib/kubelet/pods/5fe579f8-e8a6-4643-bce5-a661393c4dde/volumes" Jan 28 20:39:52 crc kubenswrapper[4746]: I0128 20:39:52.924397 4746 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-gcrxx" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"1c3c93a1-7ddf-4339-9ca3-79f3753943b4\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-28T20:39:52Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-28T20:39:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-28T20:39:52Z\\\",\\\"message\\\":\\\"containers with unready status: [dns-node-resolver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-28T20:39:52Z\\\",\\\"message\\\":\\\"containers with unready status: 
[dns-node-resolver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wn6hr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-28T20:39:52Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-gcrxx\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Jan 28 20:39:52 crc kubenswrapper[4746]: I0128 20:39:52.925611 4746 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="6402fda4-df10-493c-b4e5-d0569419652d" path="/var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/volumes" Jan 28 20:39:52 crc kubenswrapper[4746]: I0128 20:39:52.925833 4746 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/57a731c4-ef35-47a8-b875-bfb08a7f8011-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "57a731c4-ef35-47a8-b875-bfb08a7f8011" (UID: "57a731c4-ef35-47a8-b875-bfb08a7f8011"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 28 20:39:52 crc kubenswrapper[4746]: I0128 20:39:52.925916 4746 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/1d611f23-29be-4491-8495-bee1670e935f-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "1d611f23-29be-4491-8495-bee1670e935f" (UID: "1d611f23-29be-4491-8495-bee1670e935f"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 28 20:39:52 crc kubenswrapper[4746]: I0128 20:39:52.927112 4746 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="6509e943-70c6-444c-bc41-48a544e36fbd" path="/var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/volumes" Jan 28 20:39:52 crc kubenswrapper[4746]: I0128 20:39:52.930261 4746 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/8f668bae-612b-4b75-9490-919e737c6a3b-ca-trust-extracted" (OuterVolumeSpecName: "ca-trust-extracted") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b"). InnerVolumeSpecName "ca-trust-extracted". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 28 20:39:52 crc kubenswrapper[4746]: I0128 20:39:52.934445 4746 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="6731426b-95fe-49ff-bb5f-40441049fde2" path="/var/lib/kubelet/pods/6731426b-95fe-49ff-bb5f-40441049fde2/volumes" Jan 28 20:39:52 crc kubenswrapper[4746]: I0128 20:39:52.935027 4746 kubelet_volumes.go:152] "Cleaned up orphaned volume subpath from pod" podUID="6ea678ab-3438-413e-bfe3-290ae7725660" path="/var/lib/kubelet/pods/6ea678ab-3438-413e-bfe3-290ae7725660/volume-subpaths/run-systemd/ovnkube-controller/6" Jan 28 20:39:52 crc kubenswrapper[4746]: I0128 20:39:52.935182 4746 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="6ea678ab-3438-413e-bfe3-290ae7725660" path="/var/lib/kubelet/pods/6ea678ab-3438-413e-bfe3-290ae7725660/volumes" Jan 28 20:39:52 crc kubenswrapper[4746]: I0128 20:39:52.939594 4746 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-28T20:39:52Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-28T20:39:52Z\\\",\\\"message\\\":\\\"containers with unready status: 
[check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Jan 28 20:39:52 crc kubenswrapper[4746]: I0128 20:39:52.940733 4746 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="7539238d-5fe0-46ed-884e-1c3b566537ec" path="/var/lib/kubelet/pods/7539238d-5fe0-46ed-884e-1c3b566537ec/volumes" Jan 28 20:39:52 crc kubenswrapper[4746]: I0128 20:39:52.942316 4746 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="7583ce53-e0fe-4a16-9e4d-50516596a136" path="/var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes" Jan 28 20:39:52 crc kubenswrapper[4746]: I0128 20:39:52.943176 4746 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" 
podUID="7bb08738-c794-4ee8-9972-3a62ca171029" path="/var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/volumes" Jan 28 20:39:52 crc kubenswrapper[4746]: I0128 20:39:52.944922 4746 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/5225d0e4-402f-4861-b410-819f433b1803-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "5225d0e4-402f-4861-b410-819f433b1803" (UID: "5225d0e4-402f-4861-b410-819f433b1803"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 28 20:39:52 crc kubenswrapper[4746]: I0128 20:39:52.945439 4746 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="87cf06ed-a83f-41a7-828d-70653580a8cb" path="/var/lib/kubelet/pods/87cf06ed-a83f-41a7-828d-70653580a8cb/volumes" Jan 28 20:39:52 crc kubenswrapper[4746]: I0128 20:39:52.946276 4746 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="8cea82b4-6893-4ddc-af9f-1bb5ae425c5b" path="/var/lib/kubelet/pods/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b/volumes" Jan 28 20:39:52 crc kubenswrapper[4746]: I0128 20:39:52.947494 4746 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="925f1c65-6136-48ba-85aa-3a3b50560753" path="/var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/volumes" Jan 28 20:39:52 crc kubenswrapper[4746]: I0128 20:39:52.948547 4746 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="96b93a3a-6083-4aea-8eab-fe1aa8245ad9" path="/var/lib/kubelet/pods/96b93a3a-6083-4aea-8eab-fe1aa8245ad9/volumes" Jan 28 20:39:52 crc kubenswrapper[4746]: I0128 20:39:52.948838 4746 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"hosts-file\" (UniqueName: \"kubernetes.io/host-path/1c3c93a1-7ddf-4339-9ca3-79f3753943b4-hosts-file\") pod \"node-resolver-gcrxx\" (UID: \"1c3c93a1-7ddf-4339-9ca3-79f3753943b4\") " pod="openshift-dns/node-resolver-gcrxx" Jan 28 20:39:52 crc kubenswrapper[4746]: I0128 
20:39:52.948884 4746 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/d75a4c96-2883-4a0b-bab2-0fab2b6c0b49-host-slash\") pod \"iptables-alerter-4ln5h\" (UID: \"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\") " pod="openshift-network-operator/iptables-alerter-4ln5h" Jan 28 20:39:52 crc kubenswrapper[4746]: I0128 20:39:52.948934 4746 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-wn6hr\" (UniqueName: \"kubernetes.io/projected/1c3c93a1-7ddf-4339-9ca3-79f3753943b4-kube-api-access-wn6hr\") pod \"node-resolver-gcrxx\" (UID: \"1c3c93a1-7ddf-4339-9ca3-79f3753943b4\") " pod="openshift-dns/node-resolver-gcrxx" Jan 28 20:39:52 crc kubenswrapper[4746]: I0128 20:39:52.948989 4746 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-etc-kube\" (UniqueName: \"kubernetes.io/host-path/37a5e44f-9a88-4405-be8a-b645485e7312-host-etc-kube\") pod \"network-operator-58b4c7f79c-55gtf\" (UID: \"37a5e44f-9a88-4405-be8a-b645485e7312\") " pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" Jan 28 20:39:52 crc kubenswrapper[4746]: I0128 20:39:52.949018 4746 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/d75a4c96-2883-4a0b-bab2-0fab2b6c0b49-host-slash\") pod \"iptables-alerter-4ln5h\" (UID: \"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\") " pod="openshift-network-operator/iptables-alerter-4ln5h" Jan 28 20:39:52 crc kubenswrapper[4746]: I0128 20:39:52.949098 4746 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-etc-kube\" (UniqueName: \"kubernetes.io/host-path/37a5e44f-9a88-4405-be8a-b645485e7312-host-etc-kube\") pod \"network-operator-58b4c7f79c-55gtf\" (UID: \"37a5e44f-9a88-4405-be8a-b645485e7312\") " pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" Jan 28 20:39:52 crc kubenswrapper[4746]: I0128 
20:39:52.949674 4746 reconciler_common.go:293] "Volume detached for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/c03ee662-fb2f-4fc4-a2c1-af487c19d254-service-ca-bundle\") on node \"crc\" DevicePath \"\"" Jan 28 20:39:52 crc kubenswrapper[4746]: I0128 20:39:52.949829 4746 reconciler_common.go:293] "Volume detached for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-console-config\") on node \"crc\" DevicePath \"\"" Jan 28 20:39:52 crc kubenswrapper[4746]: I0128 20:39:52.949847 4746 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-user-template-provider-selection\" (UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-template-provider-selection\") on node \"crc\" DevicePath \"\"" Jan 28 20:39:52 crc kubenswrapper[4746]: I0128 20:39:52.949859 4746 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-system-session\" (UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-session\") on node \"crc\" DevicePath \"\"" Jan 28 20:39:52 crc kubenswrapper[4746]: I0128 20:39:52.949873 4746 reconciler_common.go:293] "Volume detached for volume \"etcd-ca\" (UniqueName: \"kubernetes.io/configmap/09efc573-dbb6-4249-bd59-9b87aba8dd28-etcd-ca\") on node \"crc\" DevicePath \"\"" Jan 28 20:39:52 crc kubenswrapper[4746]: I0128 20:39:52.949885 4746 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-tk88c\" (UniqueName: \"kubernetes.io/projected/7539238d-5fe0-46ed-884e-1c3b566537ec-kube-api-access-tk88c\") on node \"crc\" DevicePath \"\"" Jan 28 20:39:52 crc kubenswrapper[4746]: I0128 20:39:52.949895 4746 reconciler_common.go:293] "Volume detached for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/bf126b07-da06-4140-9a57-dfd54fc6b486-trusted-ca\") on node \"crc\" DevicePath \"\"" Jan 28 20:39:52 crc kubenswrapper[4746]: I0128 20:39:52.949907 4746 reconciler_common.go:293] 
"Volume detached for volume \"ovn-control-plane-metrics-cert\" (UniqueName: \"kubernetes.io/secret/925f1c65-6136-48ba-85aa-3a3b50560753-ovn-control-plane-metrics-cert\") on node \"crc\" DevicePath \"\"" Jan 28 20:39:52 crc kubenswrapper[4746]: I0128 20:39:52.949919 4746 reconciler_common.go:293] "Volume detached for volume \"available-featuregates\" (UniqueName: \"kubernetes.io/empty-dir/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d-available-featuregates\") on node \"crc\" DevicePath \"\"" Jan 28 20:39:52 crc kubenswrapper[4746]: I0128 20:39:52.949933 4746 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-x4zgh\" (UniqueName: \"kubernetes.io/projected/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d-kube-api-access-x4zgh\") on node \"crc\" DevicePath \"\"" Jan 28 20:39:52 crc kubenswrapper[4746]: I0128 20:39:52.949944 4746 reconciler_common.go:293] "Volume detached for volume \"marketplace-operator-metrics\" (UniqueName: \"kubernetes.io/secret/b6cd30de-2eeb-49a2-ab40-9167f4560ff5-marketplace-operator-metrics\") on node \"crc\" DevicePath \"\"" Jan 28 20:39:52 crc kubenswrapper[4746]: I0128 20:39:52.949956 4746 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c-serving-cert\") on node \"crc\" DevicePath \"\"" Jan 28 20:39:52 crc kubenswrapper[4746]: I0128 20:39:52.949966 4746 reconciler_common.go:293] "Volume detached for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/31d8b7a1-420e-4252-a5b7-eebe8a111292-proxy-tls\") on node \"crc\" DevicePath \"\"" Jan 28 20:39:52 crc kubenswrapper[4746]: I0128 20:39:52.950064 4746 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-6g6sz\" (UniqueName: \"kubernetes.io/projected/6509e943-70c6-444c-bc41-48a544e36fbd-kube-api-access-6g6sz\") on node \"crc\" DevicePath \"\"" Jan 28 20:39:52 crc kubenswrapper[4746]: I0128 20:39:52.950121 4746 reconciler_common.go:293] "Volume detached for volume 
\"kube-api-access-w4xd4\" (UniqueName: \"kubernetes.io/projected/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b-kube-api-access-w4xd4\") on node \"crc\" DevicePath \"\"" Jan 28 20:39:52 crc kubenswrapper[4746]: I0128 20:39:52.950138 4746 reconciler_common.go:293] "Volume detached for volume \"profile-collector-cert\" (UniqueName: \"kubernetes.io/secret/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9-profile-collector-cert\") on node \"crc\" DevicePath \"\"" Jan 28 20:39:52 crc kubenswrapper[4746]: I0128 20:39:52.950152 4746 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-system-cliconfig\" (UniqueName: \"kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-cliconfig\") on node \"crc\" DevicePath \"\"" Jan 28 20:39:52 crc kubenswrapper[4746]: I0128 20:39:52.950229 4746 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d-serving-cert\") on node \"crc\" DevicePath \"\"" Jan 28 20:39:52 crc kubenswrapper[4746]: I0128 20:39:52.950264 4746 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/01ab3dd5-8196-46d0-ad33-122e2ca51def-serving-cert\") on node \"crc\" DevicePath \"\"" Jan 28 20:39:52 crc kubenswrapper[4746]: I0128 20:39:52.950279 4746 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-d4lsv\" (UniqueName: \"kubernetes.io/projected/25e176fe-21b4-4974-b1ed-c8b94f112a7f-kube-api-access-d4lsv\") on node \"crc\" DevicePath \"\"" Jan 28 20:39:52 crc kubenswrapper[4746]: I0128 20:39:52.950295 4746 reconciler_common.go:293] "Volume detached for volume \"srv-cert\" (UniqueName: \"kubernetes.io/secret/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9-srv-cert\") on node \"crc\" DevicePath \"\"" Jan 28 20:39:52 crc kubenswrapper[4746]: I0128 20:39:52.950308 4746 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: 
\"kubernetes.io/empty-dir/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d-utilities\") on node \"crc\" DevicePath \"\"" Jan 28 20:39:52 crc kubenswrapper[4746]: I0128 20:39:52.950295 4746 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-28T20:39:52Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-operator]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-28T20:39:52Z\\\",\\\"message\\\":\\\"containers with unready status: [network-operator]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":false,\\\"restartCount\\\":5,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Jan 28 20:39:52 crc kubenswrapper[4746]: I0128 20:39:52.950321 4746 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-7c4vf\" (UniqueName: \"kubernetes.io/projected/22c825df-677d-4ca6-82db-3454ed06e783-kube-api-access-7c4vf\") on node \"crc\" DevicePath \"\"" Jan 28 20:39:52 crc kubenswrapper[4746]: I0128 20:39:52.950416 4746 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/e7e6199b-1264-4501-8953-767f51328d08-config\") on node \"crc\" DevicePath \"\"" Jan 28 20:39:52 crc kubenswrapper[4746]: I0128 20:39:52.950430 4746 reconciler_common.go:293] "Volume detached for volume \"encryption-config\" (UniqueName: \"kubernetes.io/secret/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-encryption-config\") on node \"crc\" DevicePath \"\"" Jan 28 20:39:52 crc kubenswrapper[4746]: I0128 20:39:52.950450 4746 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-d6qdx\" (UniqueName: 
\"kubernetes.io/projected/87cf06ed-a83f-41a7-828d-70653580a8cb-kube-api-access-d6qdx\") on node \"crc\" DevicePath \"\"" Jan 28 20:39:52 crc kubenswrapper[4746]: I0128 20:39:52.950452 4746 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="9d4552c7-cd75-42dd-8880-30dd377c49a4" path="/var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/volumes" Jan 28 20:39:52 crc kubenswrapper[4746]: I0128 20:39:52.950465 4746 reconciler_common.go:293] "Volume detached for volume \"machine-approver-tls\" (UniqueName: \"kubernetes.io/secret/22c825df-677d-4ca6-82db-3454ed06e783-machine-approver-tls\") on node \"crc\" DevicePath \"\"" Jan 28 20:39:52 crc kubenswrapper[4746]: I0128 20:39:52.950482 4746 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-serving-cert\") on node \"crc\" DevicePath \"\"" Jan 28 20:39:52 crc kubenswrapper[4746]: I0128 20:39:52.950496 4746 reconciler_common.go:293] "Volume detached for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/fda69060-fa79-4696-b1a6-7980f124bf7c-proxy-tls\") on node \"crc\" DevicePath \"\"" Jan 28 20:39:52 crc kubenswrapper[4746]: I0128 20:39:52.950509 4746 reconciler_common.go:293] "Volume detached for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/6509e943-70c6-444c-bc41-48a544e36fbd-trusted-ca-bundle\") on node \"crc\" DevicePath \"\"" Jan 28 20:39:52 crc kubenswrapper[4746]: I0128 20:39:52.950524 4746 reconciler_common.go:293] "Volume detached for volume \"control-plane-machine-set-operator-tls\" (UniqueName: \"kubernetes.io/secret/6731426b-95fe-49ff-bb5f-40441049fde2-control-plane-machine-set-operator-tls\") on node \"crc\" DevicePath \"\"" Jan 28 20:39:52 crc kubenswrapper[4746]: I0128 20:39:52.950540 4746 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-system-router-certs\" (UniqueName: 
\"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-router-certs\") on node \"crc\" DevicePath \"\"" Jan 28 20:39:52 crc kubenswrapper[4746]: I0128 20:39:52.950556 4746 reconciler_common.go:293] "Volume detached for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/6509e943-70c6-444c-bc41-48a544e36fbd-service-ca-bundle\") on node \"crc\" DevicePath \"\"" Jan 28 20:39:52 crc kubenswrapper[4746]: I0128 20:39:52.950570 4746 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d-catalog-content\") on node \"crc\" DevicePath \"\"" Jan 28 20:39:52 crc kubenswrapper[4746]: I0128 20:39:52.950583 4746 reconciler_common.go:293] "Volume detached for volume \"certs\" (UniqueName: \"kubernetes.io/secret/5fe579f8-e8a6-4643-bce5-a661393c4dde-certs\") on node \"crc\" DevicePath \"\"" Jan 28 20:39:52 crc kubenswrapper[4746]: I0128 20:39:52.950596 4746 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/09efc573-dbb6-4249-bd59-9b87aba8dd28-config\") on node \"crc\" DevicePath \"\"" Jan 28 20:39:52 crc kubenswrapper[4746]: I0128 20:39:52.950610 4746 reconciler_common.go:293] "Volume detached for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/8f668bae-612b-4b75-9490-919e737c6a3b-trusted-ca\") on node \"crc\" DevicePath \"\"" Jan 28 20:39:52 crc kubenswrapper[4746]: I0128 20:39:52.950623 4746 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-system-service-ca\" (UniqueName: \"kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-service-ca\") on node \"crc\" DevicePath \"\"" Jan 28 20:39:52 crc kubenswrapper[4746]: I0128 20:39:52.950638 4746 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-v47cf\" (UniqueName: \"kubernetes.io/projected/c03ee662-fb2f-4fc4-a2c1-af487c19d254-kube-api-access-v47cf\") on node \"crc\" 
DevicePath \"\"" Jan 28 20:39:52 crc kubenswrapper[4746]: I0128 20:39:52.950655 4746 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-249nr\" (UniqueName: \"kubernetes.io/projected/b6312bbd-5731-4ea0-a20f-81d5a57df44a-kube-api-access-249nr\") on node \"crc\" DevicePath \"\"" Jan 28 20:39:52 crc kubenswrapper[4746]: I0128 20:39:52.950670 4746 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-bf2bz\" (UniqueName: \"kubernetes.io/projected/1d611f23-29be-4491-8495-bee1670e935f-kube-api-access-bf2bz\") on node \"crc\" DevicePath \"\"" Jan 28 20:39:52 crc kubenswrapper[4746]: I0128 20:39:52.950682 4746 reconciler_common.go:293] "Volume detached for volume \"samples-operator-tls\" (UniqueName: \"kubernetes.io/secret/a0128f3a-b052-44ed-a84e-c4c8aaf17c13-samples-operator-tls\") on node \"crc\" DevicePath \"\"" Jan 28 20:39:52 crc kubenswrapper[4746]: I0128 20:39:52.950698 4746 reconciler_common.go:293] "Volume detached for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/925f1c65-6136-48ba-85aa-3a3b50560753-env-overrides\") on node \"crc\" DevicePath \"\"" Jan 28 20:39:52 crc kubenswrapper[4746]: I0128 20:39:52.950710 4746 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-x2m85\" (UniqueName: \"kubernetes.io/projected/cd70aa09-68dd-4d64-bd6f-156fe6d1dc6d-kube-api-access-x2m85\") on node \"crc\" DevicePath \"\"" Jan 28 20:39:52 crc kubenswrapper[4746]: I0128 20:39:52.950723 4746 reconciler_common.go:293] "Volume detached for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/8f668bae-612b-4b75-9490-919e737c6a3b-ca-trust-extracted\") on node \"crc\" DevicePath \"\"" Jan 28 20:39:52 crc kubenswrapper[4746]: I0128 20:39:52.950737 4746 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/1d611f23-29be-4491-8495-bee1670e935f-catalog-content\") on node \"crc\" DevicePath \"\"" Jan 28 20:39:52 crc kubenswrapper[4746]: I0128 
20:39:52.950750 4746 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-user-idp-0-file-data\" (UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-idp-0-file-data\") on node \"crc\" DevicePath \"\"" Jan 28 20:39:52 crc kubenswrapper[4746]: I0128 20:39:52.950763 4746 reconciler_common.go:293] "Volume detached for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/925f1c65-6136-48ba-85aa-3a3b50560753-ovnkube-config\") on node \"crc\" DevicePath \"\"" Jan 28 20:39:52 crc kubenswrapper[4746]: I0128 20:39:52.950796 4746 reconciler_common.go:293] "Volume detached for volume \"encryption-config\" (UniqueName: \"kubernetes.io/secret/1bf7eb37-55a3-4c65-b768-a94c82151e69-encryption-config\") on node \"crc\" DevicePath \"\"" Jan 28 20:39:52 crc kubenswrapper[4746]: I0128 20:39:52.950817 4746 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/1386a44e-36a2-460c-96d0-0359d2b6f0f5-config\") on node \"crc\" DevicePath \"\"" Jan 28 20:39:52 crc kubenswrapper[4746]: I0128 20:39:52.950827 4746 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/6509e943-70c6-444c-bc41-48a544e36fbd-config\") on node \"crc\" DevicePath \"\"" Jan 28 20:39:52 crc kubenswrapper[4746]: I0128 20:39:52.950837 4746 reconciler_common.go:293] "Volume detached for volume \"audit\" (UniqueName: \"kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-audit\") on node \"crc\" DevicePath \"\"" Jan 28 20:39:52 crc kubenswrapper[4746]: I0128 20:39:52.950848 4746 reconciler_common.go:293] "Volume detached for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/1386a44e-36a2-460c-96d0-0359d2b6f0f5-kube-api-access\") on node \"crc\" DevicePath \"\"" Jan 28 20:39:52 crc kubenswrapper[4746]: I0128 20:39:52.950857 4746 reconciler_common.go:293] "Volume detached for volume \"proxy-ca-bundles\" (UniqueName: 
\"kubernetes.io/configmap/7583ce53-e0fe-4a16-9e4d-50516596a136-proxy-ca-bundles\") on node \"crc\" DevicePath \"\"" Jan 28 20:39:52 crc kubenswrapper[4746]: I0128 20:39:52.950869 4746 reconciler_common.go:293] "Volume detached for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/a31745f5-9847-4afe-82a5-3161cc66ca93-bound-sa-token\") on node \"crc\" DevicePath \"\"" Jan 28 20:39:52 crc kubenswrapper[4746]: I0128 20:39:52.950879 4746 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/5225d0e4-402f-4861-b410-819f433b1803-catalog-content\") on node \"crc\" DevicePath \"\"" Jan 28 20:39:52 crc kubenswrapper[4746]: I0128 20:39:52.950890 4746 reconciler_common.go:293] "Volume detached for volume \"machine-api-operator-tls\" (UniqueName: \"kubernetes.io/secret/6402fda4-df10-493c-b4e5-d0569419652d-machine-api-operator-tls\") on node \"crc\" DevicePath \"\"" Jan 28 20:39:52 crc kubenswrapper[4746]: I0128 20:39:52.950900 4746 reconciler_common.go:293] "Volume detached for volume \"srv-cert\" (UniqueName: \"kubernetes.io/secret/b6312bbd-5731-4ea0-a20f-81d5a57df44a-srv-cert\") on node \"crc\" DevicePath \"\"" Jan 28 20:39:52 crc kubenswrapper[4746]: I0128 20:39:52.950912 4746 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-zkvpv\" (UniqueName: \"kubernetes.io/projected/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-kube-api-access-zkvpv\") on node \"crc\" DevicePath \"\"" Jan 28 20:39:52 crc kubenswrapper[4746]: I0128 20:39:52.950921 4746 reconciler_common.go:293] "Volume detached for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/7bb08738-c794-4ee8-9972-3a62ca171029-cni-binary-copy\") on node \"crc\" DevicePath \"\"" Jan 28 20:39:52 crc kubenswrapper[4746]: I0128 20:39:52.950931 4746 reconciler_common.go:293] "Volume detached for volume \"mcd-auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/fda69060-fa79-4696-b1a6-7980f124bf7c-mcd-auth-proxy-config\") on 
node \"crc\" DevicePath \"\"" Jan 28 20:39:52 crc kubenswrapper[4746]: I0128 20:39:52.950943 4746 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-x7zkh\" (UniqueName: \"kubernetes.io/projected/6731426b-95fe-49ff-bb5f-40441049fde2-kube-api-access-x7zkh\") on node \"crc\" DevicePath \"\"" Jan 28 20:39:52 crc kubenswrapper[4746]: I0128 20:39:52.950955 4746 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b-config\") on node \"crc\" DevicePath \"\"" Jan 28 20:39:52 crc kubenswrapper[4746]: I0128 20:39:52.950964 4746 reconciler_common.go:293] "Volume detached for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/5441d097-087c-4d9a-baa8-b210afa90fc9-client-ca\") on node \"crc\" DevicePath \"\"" Jan 28 20:39:52 crc kubenswrapper[4746]: I0128 20:39:52.950986 4746 reconciler_common.go:293] "Volume detached for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/96b93a3a-6083-4aea-8eab-fe1aa8245ad9-metrics-tls\") on node \"crc\" DevicePath \"\"" Jan 28 20:39:52 crc kubenswrapper[4746]: I0128 20:39:52.950996 4746 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/1bf7eb37-55a3-4c65-b768-a94c82151e69-serving-cert\") on node \"crc\" DevicePath \"\"" Jan 28 20:39:52 crc kubenswrapper[4746]: I0128 20:39:52.951008 4746 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-xcphl\" (UniqueName: \"kubernetes.io/projected/7583ce53-e0fe-4a16-9e4d-50516596a136-kube-api-access-xcphl\") on node \"crc\" DevicePath \"\"" Jan 28 20:39:52 crc kubenswrapper[4746]: I0128 20:39:52.951017 4746 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/5225d0e4-402f-4861-b410-819f433b1803-utilities\") on node \"crc\" DevicePath \"\"" Jan 28 20:39:52 crc kubenswrapper[4746]: I0128 20:39:52.951027 4746 reconciler_common.go:293] "Volume detached for 
volume \"auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/31d8b7a1-420e-4252-a5b7-eebe8a111292-auth-proxy-config\") on node \"crc\" DevicePath \"\"" Jan 28 20:39:52 crc kubenswrapper[4746]: I0128 20:39:52.951036 4746 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-config\") on node \"crc\" DevicePath \"\"" Jan 28 20:39:52 crc kubenswrapper[4746]: I0128 20:39:52.951047 4746 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-jkwtn\" (UniqueName: \"kubernetes.io/projected/5b88f790-22fa-440e-b583-365168c0b23d-kube-api-access-jkwtn\") on node \"crc\" DevicePath \"\"" Jan 28 20:39:52 crc kubenswrapper[4746]: I0128 20:39:52.950963 4746 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="a0128f3a-b052-44ed-a84e-c4c8aaf17c13" path="/var/lib/kubelet/pods/a0128f3a-b052-44ed-a84e-c4c8aaf17c13/volumes" Jan 28 20:39:52 crc kubenswrapper[4746]: E0128 20:39:52.951034 4746 cadvisor_stats_provider.go:516] "Partial failure issuing cadvisor.ContainerInfoV2" err="partial failures: [\"/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podf4b27818a5e8e43d0dc095d08835c792.slice/crio-56b250a192cfa698ccd7ac8233b75b9acbf2347549c712b47ccc1b89f144aa83.scope\": RecentStats: unable to find data in memory cache]" Jan 28 20:39:52 crc kubenswrapper[4746]: I0128 20:39:52.951065 4746 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-system-ocp-branding-template\" (UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-ocp-branding-template\") on node \"crc\" DevicePath \"\"" Jan 28 20:39:52 crc kubenswrapper[4746]: I0128 20:39:52.952162 4746 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/e7e6199b-1264-4501-8953-767f51328d08-serving-cert\") on node \"crc\" DevicePath \"\"" Jan 28 20:39:52 crc kubenswrapper[4746]: I0128 20:39:52.952180 4746 
reconciler_common.go:293] "Volume detached for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/7583ce53-e0fe-4a16-9e4d-50516596a136-client-ca\") on node \"crc\" DevicePath \"\"" Jan 28 20:39:52 crc kubenswrapper[4746]: I0128 20:39:52.952193 4746 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-w9rds\" (UniqueName: \"kubernetes.io/projected/20b0d48f-5fd6-431c-a545-e3c800c7b866-kube-api-access-w9rds\") on node \"crc\" DevicePath \"\"" Jan 28 20:39:52 crc kubenswrapper[4746]: I0128 20:39:52.952206 4746 reconciler_common.go:293] "Volume detached for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/8f668bae-612b-4b75-9490-919e737c6a3b-installation-pull-secrets\") on node \"crc\" DevicePath \"\"" Jan 28 20:39:52 crc kubenswrapper[4746]: I0128 20:39:52.952215 4746 reconciler_common.go:293] "Volume detached for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/bf126b07-da06-4140-9a57-dfd54fc6b486-bound-sa-token\") on node \"crc\" DevicePath \"\"" Jan 28 20:39:52 crc kubenswrapper[4746]: I0128 20:39:52.952225 4746 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-mnrrd\" (UniqueName: \"kubernetes.io/projected/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d-kube-api-access-mnrrd\") on node \"crc\" DevicePath \"\"" Jan 28 20:39:52 crc kubenswrapper[4746]: I0128 20:39:52.952235 4746 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-fcqwp\" (UniqueName: \"kubernetes.io/projected/5fe579f8-e8a6-4643-bce5-a661393c4dde-kube-api-access-fcqwp\") on node \"crc\" DevicePath \"\"" Jan 28 20:39:52 crc kubenswrapper[4746]: I0128 20:39:52.952245 4746 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/7539238d-5fe0-46ed-884e-1c3b566537ec-config\") on node \"crc\" DevicePath \"\"" Jan 28 20:39:52 crc kubenswrapper[4746]: I0128 20:39:52.952275 4746 reconciler_common.go:293] "Volume detached for volume \"bound-sa-token\" (UniqueName: 
\"kubernetes.io/projected/8f668bae-612b-4b75-9490-919e737c6a3b-bound-sa-token\") on node \"crc\" DevicePath \"\"" Jan 28 20:39:52 crc kubenswrapper[4746]: I0128 20:39:52.952287 4746 reconciler_common.go:293] "Volume detached for volume \"images\" (UniqueName: \"kubernetes.io/configmap/6402fda4-df10-493c-b4e5-d0569419652d-images\") on node \"crc\" DevicePath \"\"" Jan 28 20:39:52 crc kubenswrapper[4746]: I0128 20:39:52.952297 4746 reconciler_common.go:293] "Volume detached for volume \"image-import-ca\" (UniqueName: \"kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-image-import-ca\") on node \"crc\" DevicePath \"\"" Jan 28 20:39:52 crc kubenswrapper[4746]: I0128 20:39:52.952308 4746 reconciler_common.go:293] "Volume detached for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-audit-policies\") on node \"crc\" DevicePath \"\"" Jan 28 20:39:52 crc kubenswrapper[4746]: I0128 20:39:52.952323 4746 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-dbsvg\" (UniqueName: \"kubernetes.io/projected/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9-kube-api-access-dbsvg\") on node \"crc\" DevicePath \"\"" Jan 28 20:39:52 crc kubenswrapper[4746]: I0128 20:39:52.952338 4746 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/1386a44e-36a2-460c-96d0-0359d2b6f0f5-serving-cert\") on node \"crc\" DevicePath \"\"" Jan 28 20:39:52 crc kubenswrapper[4746]: I0128 20:39:52.952357 4746 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-jhbk2\" (UniqueName: \"kubernetes.io/projected/bd23aa5c-e532-4e53-bccf-e79f130c5ae8-kube-api-access-jhbk2\") on node \"crc\" DevicePath \"\"" Jan 28 20:39:52 crc kubenswrapper[4746]: I0128 20:39:52.952368 4746 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-htfz6\" (UniqueName: \"kubernetes.io/projected/6ea678ab-3438-413e-bfe3-290ae7725660-kube-api-access-htfz6\") on node \"crc\" 
DevicePath \"\"" Jan 28 20:39:52 crc kubenswrapper[4746]: I0128 20:39:52.952379 4746 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/57a731c4-ef35-47a8-b875-bfb08a7f8011-catalog-content\") on node \"crc\" DevicePath \"\"" Jan 28 20:39:52 crc kubenswrapper[4746]: I0128 20:39:52.952389 4746 reconciler_common.go:293] "Volume detached for volume \"tmpfs\" (UniqueName: \"kubernetes.io/empty-dir/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-tmpfs\") on node \"crc\" DevicePath \"\"" Jan 28 20:39:52 crc kubenswrapper[4746]: I0128 20:39:52.952400 4746 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/496e6271-fb68-4057-954e-a0d97a4afa3f-serving-cert\") on node \"crc\" DevicePath \"\"" Jan 28 20:39:52 crc kubenswrapper[4746]: I0128 20:39:52.952410 4746 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/57a731c4-ef35-47a8-b875-bfb08a7f8011-utilities\") on node \"crc\" DevicePath \"\"" Jan 28 20:39:52 crc kubenswrapper[4746]: I0128 20:39:52.952422 4746 reconciler_common.go:293] "Volume detached for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/87cf06ed-a83f-41a7-828d-70653580a8cb-config-volume\") on node \"crc\" DevicePath \"\"" Jan 28 20:39:52 crc kubenswrapper[4746]: I0128 20:39:52.952432 4746 reconciler_common.go:293] "Volume detached for volume \"etcd-serving-ca\" (UniqueName: \"kubernetes.io/configmap/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-etcd-serving-ca\") on node \"crc\" DevicePath \"\"" Jan 28 20:39:52 crc kubenswrapper[4746]: I0128 20:39:52.952443 4746 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-6ccd8\" (UniqueName: \"kubernetes.io/projected/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-kube-api-access-6ccd8\") on node \"crc\" DevicePath \"\"" Jan 28 20:39:52 crc kubenswrapper[4746]: I0128 20:39:52.952454 4746 reconciler_common.go:293] "Volume detached for volume 
\"cni-sysctl-allowlist\" (UniqueName: \"kubernetes.io/configmap/7bb08738-c794-4ee8-9972-3a62ca171029-cni-sysctl-allowlist\") on node \"crc\" DevicePath \"\"" Jan 28 20:39:52 crc kubenswrapper[4746]: I0128 20:39:52.952465 4746 reconciler_common.go:293] "Volume detached for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/6ea678ab-3438-413e-bfe3-290ae7725660-env-overrides\") on node \"crc\" DevicePath \"\"" Jan 28 20:39:52 crc kubenswrapper[4746]: I0128 20:39:52.952474 4746 reconciler_common.go:293] "Volume detached for volume \"auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/22c825df-677d-4ca6-82db-3454ed06e783-auth-proxy-config\") on node \"crc\" DevicePath \"\"" Jan 28 20:39:52 crc kubenswrapper[4746]: I0128 20:39:52.952485 4746 reconciler_common.go:293] "Volume detached for volume \"image-registry-operator-tls\" (UniqueName: \"kubernetes.io/secret/bf126b07-da06-4140-9a57-dfd54fc6b486-image-registry-operator-tls\") on node \"crc\" DevicePath \"\"" Jan 28 20:39:52 crc kubenswrapper[4746]: I0128 20:39:52.952497 4746 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-lzf88\" (UniqueName: \"kubernetes.io/projected/0b574797-001e-440a-8f4e-c0be86edad0f-kube-api-access-lzf88\") on node \"crc\" DevicePath \"\"" Jan 28 20:39:52 crc kubenswrapper[4746]: I0128 20:39:52.952507 4746 reconciler_common.go:293] "Volume detached for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/c03ee662-fb2f-4fc4-a2c1-af487c19d254-metrics-certs\") on node \"crc\" DevicePath \"\"" Jan 28 20:39:52 crc kubenswrapper[4746]: I0128 20:39:52.952518 4746 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c-config\") on node \"crc\" DevicePath \"\"" Jan 28 20:39:52 crc kubenswrapper[4746]: I0128 20:39:52.952529 4746 reconciler_common.go:293] "Volume detached for volume \"etcd-service-ca\" (UniqueName: 
\"kubernetes.io/configmap/09efc573-dbb6-4249-bd59-9b87aba8dd28-etcd-service-ca\") on node \"crc\" DevicePath \"\"" Jan 28 20:39:52 crc kubenswrapper[4746]: I0128 20:39:52.952539 4746 reconciler_common.go:293] "Volume detached for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/43509403-f426-496e-be36-56cef71462f5-console-oauth-config\") on node \"crc\" DevicePath \"\"" Jan 28 20:39:52 crc kubenswrapper[4746]: I0128 20:39:52.952549 4746 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/7539238d-5fe0-46ed-884e-1c3b566537ec-serving-cert\") on node \"crc\" DevicePath \"\"" Jan 28 20:39:52 crc kubenswrapper[4746]: I0128 20:39:52.952562 4746 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-gf66m\" (UniqueName: \"kubernetes.io/projected/a0128f3a-b052-44ed-a84e-c4c8aaf17c13-kube-api-access-gf66m\") on node \"crc\" DevicePath \"\"" Jan 28 20:39:52 crc kubenswrapper[4746]: I0128 20:39:52.952573 4746 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/0b78653f-4ff9-4508-8672-245ed9b561e3-serving-cert\") on node \"crc\" DevicePath \"\"" Jan 28 20:39:52 crc kubenswrapper[4746]: I0128 20:39:52.952583 4746 reconciler_common.go:293] "Volume detached for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/0b78653f-4ff9-4508-8672-245ed9b561e3-service-ca\") on node \"crc\" DevicePath \"\"" Jan 28 20:39:52 crc kubenswrapper[4746]: I0128 20:39:52.952594 4746 reconciler_common.go:293] "Volume detached for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-oauth-serving-cert\") on node \"crc\" DevicePath \"\"" Jan 28 20:39:52 crc kubenswrapper[4746]: I0128 20:39:52.952604 4746 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/7583ce53-e0fe-4a16-9e4d-50516596a136-serving-cert\") on node \"crc\" DevicePath \"\"" Jan 28 
20:39:52 crc kubenswrapper[4746]: I0128 20:39:52.952614 4746 reconciler_common.go:293] "Volume detached for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/6ea678ab-3438-413e-bfe3-290ae7725660-ovnkube-config\") on node \"crc\" DevicePath \"\"" Jan 28 20:39:52 crc kubenswrapper[4746]: I0128 20:39:52.952623 4746 reconciler_common.go:293] "Volume detached for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/a31745f5-9847-4afe-82a5-3161cc66ca93-trusted-ca\") on node \"crc\" DevicePath \"\"" Jan 28 20:39:52 crc kubenswrapper[4746]: I0128 20:39:52.952634 4746 reconciler_common.go:293] "Volume detached for volume \"signing-cabundle\" (UniqueName: \"kubernetes.io/configmap/25e176fe-21b4-4974-b1ed-c8b94f112a7f-signing-cabundle\") on node \"crc\" DevicePath \"\"" Jan 28 20:39:52 crc kubenswrapper[4746]: I0128 20:39:52.952643 4746 reconciler_common.go:293] "Volume detached for volume \"cert\" (UniqueName: \"kubernetes.io/secret/20b0d48f-5fd6-431c-a545-e3c800c7b866-cert\") on node \"crc\" DevicePath \"\"" Jan 28 20:39:52 crc kubenswrapper[4746]: I0128 20:39:52.952653 4746 reconciler_common.go:293] "Volume detached for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-apiservice-cert\") on node \"crc\" DevicePath \"\"" Jan 28 20:39:52 crc kubenswrapper[4746]: I0128 20:39:52.952662 4746 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/6402fda4-df10-493c-b4e5-d0569419652d-config\") on node \"crc\" DevicePath \"\"" Jan 28 20:39:52 crc kubenswrapper[4746]: I0128 20:39:52.952672 4746 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-kfwg7\" (UniqueName: \"kubernetes.io/projected/8f668bae-612b-4b75-9490-919e737c6a3b-kube-api-access-kfwg7\") on node \"crc\" DevicePath \"\"" Jan 28 20:39:52 crc kubenswrapper[4746]: I0128 20:39:52.952682 4746 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-user-template-login\" 
(UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-template-login\") on node \"crc\" DevicePath \"\"" Jan 28 20:39:52 crc kubenswrapper[4746]: I0128 20:39:52.952851 4746 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/09efc573-dbb6-4249-bd59-9b87aba8dd28-serving-cert\") on node \"crc\" DevicePath \"\"" Jan 28 20:39:52 crc kubenswrapper[4746]: I0128 20:39:52.952866 4746 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-pjr6v\" (UniqueName: \"kubernetes.io/projected/49ef4625-1d3a-4a9f-b595-c2433d32326d-kube-api-access-pjr6v\") on node \"crc\" DevicePath \"\"" Jan 28 20:39:52 crc kubenswrapper[4746]: I0128 20:39:52.952878 4746 reconciler_common.go:293] "Volume detached for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/43509403-f426-496e-be36-56cef71462f5-console-serving-cert\") on node \"crc\" DevicePath \"\"" Jan 28 20:39:52 crc kubenswrapper[4746]: I0128 20:39:52.952889 4746 reconciler_common.go:293] "Volume detached for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/0b574797-001e-440a-8f4e-c0be86edad0f-proxy-tls\") on node \"crc\" DevicePath \"\"" Jan 28 20:39:52 crc kubenswrapper[4746]: I0128 20:39:52.952902 4746 reconciler_common.go:293] "Volume detached for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-audit-policies\") on node \"crc\" DevicePath \"\"" Jan 28 20:39:52 crc kubenswrapper[4746]: I0128 20:39:52.952913 4746 reconciler_common.go:293] "Volume detached for volume \"images\" (UniqueName: \"kubernetes.io/configmap/31d8b7a1-420e-4252-a5b7-eebe8a111292-images\") on node \"crc\" DevicePath \"\"" Jan 28 20:39:52 crc kubenswrapper[4746]: I0128 20:39:52.952923 4746 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/5441d097-087c-4d9a-baa8-b210afa90fc9-config\") on node \"crc\" DevicePath \"\"" Jan 28 
20:39:52 crc kubenswrapper[4746]: I0128 20:39:52.952934 4746 reconciler_common.go:293] "Volume detached for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/a31745f5-9847-4afe-82a5-3161cc66ca93-metrics-tls\") on node \"crc\" DevicePath \"\"" Jan 28 20:39:52 crc kubenswrapper[4746]: I0128 20:39:52.952944 4746 reconciler_common.go:293] "Volume detached for volume \"profile-collector-cert\" (UniqueName: \"kubernetes.io/secret/b6312bbd-5731-4ea0-a20f-81d5a57df44a-profile-collector-cert\") on node \"crc\" DevicePath \"\"" Jan 28 20:39:52 crc kubenswrapper[4746]: I0128 20:39:52.952954 4746 reconciler_common.go:293] "Volume detached for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/9d4552c7-cd75-42dd-8880-30dd377c49a4-trusted-ca\") on node \"crc\" DevicePath \"\"" Jan 28 20:39:52 crc kubenswrapper[4746]: I0128 20:39:52.952964 4746 reconciler_common.go:293] "Volume detached for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-trusted-ca-bundle\") on node \"crc\" DevicePath \"\"" Jan 28 20:39:52 crc kubenswrapper[4746]: I0128 20:39:52.952975 4746 reconciler_common.go:293] "Volume detached for volume \"stats-auth\" (UniqueName: \"kubernetes.io/secret/c03ee662-fb2f-4fc4-a2c1-af487c19d254-stats-auth\") on node \"crc\" DevicePath \"\"" Jan 28 20:39:52 crc kubenswrapper[4746]: I0128 20:39:52.952986 4746 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-user-template-error\" (UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-template-error\") on node \"crc\" DevicePath \"\"" Jan 28 20:39:52 crc kubenswrapper[4746]: I0128 20:39:52.952998 4746 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/7583ce53-e0fe-4a16-9e4d-50516596a136-config\") on node \"crc\" DevicePath \"\"" Jan 28 20:39:52 crc kubenswrapper[4746]: I0128 20:39:52.952995 4746 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes 
dir" podUID="a31745f5-9847-4afe-82a5-3161cc66ca93" path="/var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/volumes" Jan 28 20:39:52 crc kubenswrapper[4746]: I0128 20:39:52.953011 4746 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-8tdtz\" (UniqueName: \"kubernetes.io/projected/09efc573-dbb6-4249-bd59-9b87aba8dd28-kube-api-access-8tdtz\") on node \"crc\" DevicePath \"\"" Jan 28 20:39:52 crc kubenswrapper[4746]: I0128 20:39:52.953034 4746 reconciler_common.go:293] "Volume detached for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-etcd-client\") on node \"crc\" DevicePath \"\"" Jan 28 20:39:52 crc kubenswrapper[4746]: I0128 20:39:52.953045 4746 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-ngvvp\" (UniqueName: \"kubernetes.io/projected/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-kube-api-access-ngvvp\") on node \"crc\" DevicePath \"\"" Jan 28 20:39:52 crc kubenswrapper[4746]: I0128 20:39:52.953055 4746 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/9d4552c7-cd75-42dd-8880-30dd377c49a4-config\") on node \"crc\" DevicePath \"\"" Jan 28 20:39:52 crc kubenswrapper[4746]: I0128 20:39:52.953069 4746 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-w7l8j\" (UniqueName: \"kubernetes.io/projected/01ab3dd5-8196-46d0-ad33-122e2ca51def-kube-api-access-w7l8j\") on node \"crc\" DevicePath \"\"" Jan 28 20:39:52 crc kubenswrapper[4746]: I0128 20:39:52.953094 4746 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-9xfj7\" (UniqueName: \"kubernetes.io/projected/5225d0e4-402f-4861-b410-819f433b1803-kube-api-access-9xfj7\") on node \"crc\" DevicePath \"\"" Jan 28 20:39:52 crc kubenswrapper[4746]: I0128 20:39:52.953104 4746 reconciler_common.go:293] "Volume detached for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/5b88f790-22fa-440e-b583-365168c0b23d-metrics-certs\") on 
node \"crc\" DevicePath \"\"" Jan 28 20:39:52 crc kubenswrapper[4746]: I0128 20:39:52.953113 4746 reconciler_common.go:293] "Volume detached for volume \"webhook-certs\" (UniqueName: \"kubernetes.io/secret/efdd0498-1daa-4136-9a4a-3b948c2293fc-webhook-certs\") on node \"crc\" DevicePath \"\"" Jan 28 20:39:52 crc kubenswrapper[4746]: I0128 20:39:52.953123 4746 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/5441d097-087c-4d9a-baa8-b210afa90fc9-serving-cert\") on node \"crc\" DevicePath \"\"" Jan 28 20:39:52 crc kubenswrapper[4746]: I0128 20:39:52.953132 4746 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-nzwt7\" (UniqueName: \"kubernetes.io/projected/96b93a3a-6083-4aea-8eab-fe1aa8245ad9-kube-api-access-nzwt7\") on node \"crc\" DevicePath \"\"" Jan 28 20:39:52 crc kubenswrapper[4746]: I0128 20:39:52.953141 4746 reconciler_common.go:293] "Volume detached for volume \"mcc-auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/0b574797-001e-440a-8f4e-c0be86edad0f-mcc-auth-proxy-config\") on node \"crc\" DevicePath \"\"" Jan 28 20:39:52 crc kubenswrapper[4746]: I0128 20:39:52.953154 4746 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-qg5z5\" (UniqueName: \"kubernetes.io/projected/43509403-f426-496e-be36-56cef71462f5-kube-api-access-qg5z5\") on node \"crc\" DevicePath \"\"" Jan 28 20:39:52 crc kubenswrapper[4746]: I0128 20:39:52.953166 4746 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-rnphk\" (UniqueName: \"kubernetes.io/projected/bf126b07-da06-4140-9a57-dfd54fc6b486-kube-api-access-rnphk\") on node \"crc\" DevicePath \"\"" Jan 28 20:39:52 crc kubenswrapper[4746]: I0128 20:39:52.953176 4746 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-qs4fp\" (UniqueName: \"kubernetes.io/projected/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c-kube-api-access-qs4fp\") on node \"crc\" DevicePath \"\"" Jan 28 20:39:52 crc 
kubenswrapper[4746]: I0128 20:39:52.953185 4746 reconciler_common.go:293] "Volume detached for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/e7e6199b-1264-4501-8953-767f51328d08-kube-api-access\") on node \"crc\" DevicePath \"\"" Jan 28 20:39:52 crc kubenswrapper[4746]: I0128 20:39:52.953196 4746 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-pcxfs\" (UniqueName: \"kubernetes.io/projected/9d4552c7-cd75-42dd-8880-30dd377c49a4-kube-api-access-pcxfs\") on node \"crc\" DevicePath \"\"" Jan 28 20:39:52 crc kubenswrapper[4746]: I0128 20:39:52.953206 4746 reconciler_common.go:293] "Volume detached for volume \"etcd-serving-ca\" (UniqueName: \"kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-etcd-serving-ca\") on node \"crc\" DevicePath \"\"" Jan 28 20:39:52 crc kubenswrapper[4746]: I0128 20:39:52.953217 4746 reconciler_common.go:293] "Volume detached for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-trusted-ca-bundle\") on node \"crc\" DevicePath \"\"" Jan 28 20:39:52 crc kubenswrapper[4746]: I0128 20:39:52.953227 4746 reconciler_common.go:293] "Volume detached for volume \"default-certificate\" (UniqueName: \"kubernetes.io/secret/c03ee662-fb2f-4fc4-a2c1-af487c19d254-default-certificate\") on node \"crc\" DevicePath \"\"" Jan 28 20:39:52 crc kubenswrapper[4746]: I0128 20:39:52.953236 4746 reconciler_common.go:293] "Volume detached for volume \"serviceca\" (UniqueName: \"kubernetes.io/configmap/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59-serviceca\") on node \"crc\" DevicePath \"\"" Jan 28 20:39:52 crc kubenswrapper[4746]: I0128 20:39:52.953246 4746 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/9d4552c7-cd75-42dd-8880-30dd377c49a4-serving-cert\") on node \"crc\" DevicePath \"\"" Jan 28 20:39:52 crc kubenswrapper[4746]: I0128 20:39:52.953255 4746 reconciler_common.go:293] "Volume detached for volume 
\"ovn-node-metrics-cert\" (UniqueName: \"kubernetes.io/secret/6ea678ab-3438-413e-bfe3-290ae7725660-ovn-node-metrics-cert\") on node \"crc\" DevicePath \"\"" Jan 28 20:39:52 crc kubenswrapper[4746]: I0128 20:39:52.953265 4746 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-2d4wz\" (UniqueName: \"kubernetes.io/projected/5441d097-087c-4d9a-baa8-b210afa90fc9-kube-api-access-2d4wz\") on node \"crc\" DevicePath \"\"" Jan 28 20:39:52 crc kubenswrapper[4746]: I0128 20:39:52.953274 4746 reconciler_common.go:293] "Volume detached for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/0b78653f-4ff9-4508-8672-245ed9b561e3-kube-api-access\") on node \"crc\" DevicePath \"\"" Jan 28 20:39:52 crc kubenswrapper[4746]: I0128 20:39:52.953288 4746 reconciler_common.go:293] "Volume detached for volume \"marketplace-trusted-ca\" (UniqueName: \"kubernetes.io/configmap/b6cd30de-2eeb-49a2-ab40-9167f4560ff5-marketplace-trusted-ca\") on node \"crc\" DevicePath \"\"" Jan 28 20:39:52 crc kubenswrapper[4746]: I0128 20:39:52.953297 4746 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/6509e943-70c6-444c-bc41-48a544e36fbd-serving-cert\") on node \"crc\" DevicePath \"\"" Jan 28 20:39:52 crc kubenswrapper[4746]: I0128 20:39:52.953309 4746 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-s4n52\" (UniqueName: \"kubernetes.io/projected/925f1c65-6136-48ba-85aa-3a3b50560753-kube-api-access-s4n52\") on node \"crc\" DevicePath \"\"" Jan 28 20:39:52 crc kubenswrapper[4746]: I0128 20:39:52.953321 4746 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-system-serving-cert\" (UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-serving-cert\") on node \"crc\" DevicePath \"\"" Jan 28 20:39:52 crc kubenswrapper[4746]: I0128 20:39:52.953334 4746 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: 
\"kubernetes.io/secret/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b-serving-cert\") on node \"crc\" DevicePath \"\"" Jan 28 20:39:52 crc kubenswrapper[4746]: I0128 20:39:52.953349 4746 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-wxkg8\" (UniqueName: \"kubernetes.io/projected/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59-kube-api-access-wxkg8\") on node \"crc\" DevicePath \"\"" Jan 28 20:39:52 crc kubenswrapper[4746]: I0128 20:39:52.954208 4746 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="b11524ee-3fca-4b1b-9cdf-6da289fdbc7d" path="/var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes" Jan 28 20:39:52 crc kubenswrapper[4746]: I0128 20:39:52.954797 4746 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="b6312bbd-5731-4ea0-a20f-81d5a57df44a" path="/var/lib/kubelet/pods/b6312bbd-5731-4ea0-a20f-81d5a57df44a/volumes" Jan 28 20:39:52 crc kubenswrapper[4746]: I0128 20:39:52.955840 4746 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="b6cd30de-2eeb-49a2-ab40-9167f4560ff5" path="/var/lib/kubelet/pods/b6cd30de-2eeb-49a2-ab40-9167f4560ff5/volumes" Jan 28 20:39:52 crc kubenswrapper[4746]: I0128 20:39:52.956775 4746 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="bc5039c0-ea34-426b-a2b7-fbbc87b49a6d" path="/var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/volumes" Jan 28 20:39:52 crc kubenswrapper[4746]: I0128 20:39:52.957673 4746 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="bd23aa5c-e532-4e53-bccf-e79f130c5ae8" path="/var/lib/kubelet/pods/bd23aa5c-e532-4e53-bccf-e79f130c5ae8/volumes" Jan 28 20:39:52 crc kubenswrapper[4746]: I0128 20:39:52.958580 4746 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="bf126b07-da06-4140-9a57-dfd54fc6b486" path="/var/lib/kubelet/pods/bf126b07-da06-4140-9a57-dfd54fc6b486/volumes" Jan 28 20:39:52 crc kubenswrapper[4746]: I0128 20:39:52.959836 4746 kubelet_volumes.go:163] "Cleaned up 
orphaned pod volumes dir" podUID="c03ee662-fb2f-4fc4-a2c1-af487c19d254" path="/var/lib/kubelet/pods/c03ee662-fb2f-4fc4-a2c1-af487c19d254/volumes" Jan 28 20:39:52 crc kubenswrapper[4746]: I0128 20:39:52.960342 4746 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="cd70aa09-68dd-4d64-bd6f-156fe6d1dc6d" path="/var/lib/kubelet/pods/cd70aa09-68dd-4d64-bd6f-156fe6d1dc6d/volumes" Jan 28 20:39:52 crc kubenswrapper[4746]: I0128 20:39:52.960823 4746 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="e7e6199b-1264-4501-8953-767f51328d08" path="/var/lib/kubelet/pods/e7e6199b-1264-4501-8953-767f51328d08/volumes" Jan 28 20:39:52 crc kubenswrapper[4746]: I0128 20:39:52.961917 4746 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="efdd0498-1daa-4136-9a4a-3b948c2293fc" path="/var/lib/kubelet/pods/efdd0498-1daa-4136-9a4a-3b948c2293fc/volumes" Jan 28 20:39:52 crc kubenswrapper[4746]: I0128 20:39:52.962654 4746 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="f88749ec-7931-4ee7-b3fc-1ec5e11f92e9" path="/var/lib/kubelet/pods/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9/volumes" Jan 28 20:39:52 crc kubenswrapper[4746]: I0128 20:39:52.962732 4746 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-28T20:39:52Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: 
[iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-28T20:39:52Z\\\",\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Jan 28 20:39:52 crc kubenswrapper[4746]: I0128 20:39:52.963569 4746 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="fda69060-fa79-4696-b1a6-7980f124bf7c" path="/var/lib/kubelet/pods/fda69060-fa79-4696-b1a6-7980f124bf7c/volumes" Jan 28 20:39:52 crc kubenswrapper[4746]: I0128 
20:39:52.974419 4746 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-28T20:39:52Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-28T20:39:52Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Jan 28 20:39:52 crc kubenswrapper[4746]: I0128 20:39:52.992054 4746 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-28T20:39:52Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-28T20:39:52Z\\\",\\\"message\\\":\\\"containers with unready status: 
[network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Jan 28 20:39:53 crc kubenswrapper[4746]: I0128 20:39:53.005975 4746 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-28T20:39:52Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-28T20:39:52Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Jan 28 20:39:53 crc kubenswrapper[4746]: I0128 20:39:53.016588 4746 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-28T20:39:52Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-operator]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-28T20:39:52Z\\\",\\\"message\\\":\\\"containers with unready status: 
[network-operator]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":false,\\\"restartCount\\\":5,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Jan 28 20:39:53 crc kubenswrapper[4746]: I0128 20:39:53.026547 4746 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-28T20:39:52Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-28T20:39:52Z\\\",\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Jan 28 20:39:53 crc kubenswrapper[4746]: I0128 20:39:53.036701 4746 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-28T20:39:52Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-28T20:39:52Z\\\",\\\"message\\\":\\\"containers with unready status: 
[networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Jan 28 20:39:53 crc kubenswrapper[4746]: I0128 20:39:53.051046 4746 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-28T20:39:52Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [webhook approver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-28T20:39:52Z\\\",\\\"message\\\":\\\"containers with unready status: [webhook approver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":false,\\\"restartCount\\\":6,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: 
connect: connection refused" Jan 28 20:39:53 crc kubenswrapper[4746]: I0128 20:39:53.053801 4746 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"hosts-file\" (UniqueName: \"kubernetes.io/host-path/1c3c93a1-7ddf-4339-9ca3-79f3753943b4-hosts-file\") pod \"node-resolver-gcrxx\" (UID: \"1c3c93a1-7ddf-4339-9ca3-79f3753943b4\") " pod="openshift-dns/node-resolver-gcrxx" Jan 28 20:39:53 crc kubenswrapper[4746]: I0128 20:39:53.053845 4746 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-wn6hr\" (UniqueName: \"kubernetes.io/projected/1c3c93a1-7ddf-4339-9ca3-79f3753943b4-kube-api-access-wn6hr\") pod \"node-resolver-gcrxx\" (UID: \"1c3c93a1-7ddf-4339-9ca3-79f3753943b4\") " pod="openshift-dns/node-resolver-gcrxx" Jan 28 20:39:53 crc kubenswrapper[4746]: I0128 20:39:53.054242 4746 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"hosts-file\" (UniqueName: \"kubernetes.io/host-path/1c3c93a1-7ddf-4339-9ca3-79f3753943b4-hosts-file\") pod \"node-resolver-gcrxx\" (UID: \"1c3c93a1-7ddf-4339-9ca3-79f3753943b4\") " pod="openshift-dns/node-resolver-gcrxx" Jan 28 20:39:53 crc kubenswrapper[4746]: I0128 20:39:53.062856 4746 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-gcrxx" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"1c3c93a1-7ddf-4339-9ca3-79f3753943b4\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-28T20:39:52Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-28T20:39:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-28T20:39:52Z\\\",\\\"message\\\":\\\"containers with unready status: [dns-node-resolver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-28T20:39:52Z\\\",\\\"message\\\":\\\"containers with unready status: 
[dns-node-resolver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wn6hr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-28T20:39:52Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-gcrxx\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Jan 28 20:39:53 crc kubenswrapper[4746]: I0128 20:39:53.075671 4746 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-wn6hr\" (UniqueName: \"kubernetes.io/projected/1c3c93a1-7ddf-4339-9ca3-79f3753943b4-kube-api-access-wn6hr\") pod \"node-resolver-gcrxx\" (UID: \"1c3c93a1-7ddf-4339-9ca3-79f3753943b4\") " pod="openshift-dns/node-resolver-gcrxx" Jan 28 20:39:53 crc kubenswrapper[4746]: I0128 20:39:53.088560 4746 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" Jan 28 20:39:53 crc kubenswrapper[4746]: I0128 20:39:53.102315 4746 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-operator/iptables-alerter-4ln5h" Jan 28 20:39:53 crc kubenswrapper[4746]: W0128 20:39:53.117258 4746 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod37a5e44f_9a88_4405_be8a_b645485e7312.slice/crio-a06ff5fcc6e0d3350ea7ceab34e74677274182f56c86a58b740e6de6cc67bff9 WatchSource:0}: Error finding container a06ff5fcc6e0d3350ea7ceab34e74677274182f56c86a58b740e6de6cc67bff9: Status 404 returned error can't find the container with id a06ff5fcc6e0d3350ea7ceab34e74677274182f56c86a58b740e6de6cc67bff9 Jan 28 20:39:53 crc kubenswrapper[4746]: I0128 20:39:53.126472 4746 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-node-identity/network-node-identity-vrzqb" Jan 28 20:39:53 crc kubenswrapper[4746]: I0128 20:39:53.135276 4746 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-dns/node-resolver-gcrxx" Jan 28 20:39:53 crc kubenswrapper[4746]: I0128 20:39:53.188616 4746 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-machine-config-operator/machine-config-daemon-6wrnw"] Jan 28 20:39:53 crc kubenswrapper[4746]: I0128 20:39:53.188891 4746 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-multus/multus-qhpvf"] Jan 28 20:39:53 crc kubenswrapper[4746]: I0128 20:39:53.189007 4746 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-ovn-kubernetes/ovnkube-node-8vmvh"] Jan 28 20:39:53 crc kubenswrapper[4746]: I0128 20:39:53.189549 4746 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-ovn-kubernetes/ovnkube-node-8vmvh" Jan 28 20:39:53 crc kubenswrapper[4746]: I0128 20:39:53.189974 4746 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-multus/multus-additional-cni-plugins-ht6hp"] Jan 28 20:39:53 crc kubenswrapper[4746]: I0128 20:39:53.190295 4746 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-machine-config-operator/machine-config-daemon-6wrnw" Jan 28 20:39:53 crc kubenswrapper[4746]: I0128 20:39:53.190788 4746 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/multus-additional-cni-plugins-ht6hp" Jan 28 20:39:53 crc kubenswrapper[4746]: I0128 20:39:53.190807 4746 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/multus-qhpvf" Jan 28 20:39:53 crc kubenswrapper[4746]: I0128 20:39:53.193727 4746 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-config-operator"/"openshift-service-ca.crt" Jan 28 20:39:53 crc kubenswrapper[4746]: I0128 20:39:53.193904 4746 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ovn-kubernetes"/"ovnkube-config" Jan 28 20:39:53 crc kubenswrapper[4746]: I0128 20:39:53.194016 4746 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ovn-kubernetes"/"ovn-node-metrics-cert" Jan 28 20:39:53 crc kubenswrapper[4746]: I0128 20:39:53.194134 4746 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"machine-config-daemon-dockercfg-r5tcq" Jan 28 20:39:53 crc kubenswrapper[4746]: I0128 20:39:53.194232 4746 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ovn-kubernetes"/"ovn-kubernetes-node-dockercfg-pwtwl" Jan 28 20:39:53 crc kubenswrapper[4746]: I0128 20:39:53.194323 4746 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ovn-kubernetes"/"env-overrides" Jan 28 20:39:53 crc 
kubenswrapper[4746]: I0128 20:39:53.194413 4746 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"proxy-tls" Jan 28 20:39:53 crc kubenswrapper[4746]: I0128 20:39:53.194501 4746 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ovn-kubernetes"/"openshift-service-ca.crt" Jan 28 20:39:53 crc kubenswrapper[4746]: I0128 20:39:53.194705 4746 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ovn-kubernetes"/"ovnkube-script-lib" Jan 28 20:39:53 crc kubenswrapper[4746]: I0128 20:39:53.194799 4746 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ovn-kubernetes"/"kube-root-ca.crt" Jan 28 20:39:53 crc kubenswrapper[4746]: I0128 20:39:53.196653 4746 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-multus"/"default-cni-sysctl-allowlist" Jan 28 20:39:53 crc kubenswrapper[4746]: I0128 20:39:53.196762 4746 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-multus"/"default-dockercfg-2q5b6" Jan 28 20:39:53 crc kubenswrapper[4746]: I0128 20:39:53.196814 4746 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-multus"/"kube-root-ca.crt" Jan 28 20:39:53 crc kubenswrapper[4746]: I0128 20:39:53.196823 4746 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-config-operator"/"kube-rbac-proxy" Jan 28 20:39:53 crc kubenswrapper[4746]: I0128 20:39:53.196893 4746 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-multus"/"cni-copy-resources" Jan 28 20:39:53 crc kubenswrapper[4746]: I0128 20:39:53.196964 4746 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-config-operator"/"kube-root-ca.crt" Jan 28 20:39:53 crc kubenswrapper[4746]: I0128 20:39:53.196979 4746 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-multus"/"openshift-service-ca.crt" Jan 28 20:39:53 crc 
kubenswrapper[4746]: I0128 20:39:53.197016 4746 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-multus"/"multus-daemon-config" Jan 28 20:39:53 crc kubenswrapper[4746]: I0128 20:39:53.198675 4746 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-multus"/"multus-ancillary-tools-dockercfg-vnmsz" Jan 28 20:39:53 crc kubenswrapper[4746]: I0128 20:39:53.202552 4746 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-gcrxx" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"1c3c93a1-7ddf-4339-9ca3-79f3753943b4\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-28T20:39:52Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-28T20:39:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-28T20:39:52Z\\\",\\\"message\\\":\\\"containers with unready status: [dns-node-resolver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-28T20:39:52Z\\\",\\\"message\\\":\\\"containers with unready status: 
[dns-node-resolver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wn6hr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-28T20:39:52Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-gcrxx\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Jan 28 20:39:53 crc kubenswrapper[4746]: I0128 20:39:53.222989 4746 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-8vmvh" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"c4d15639-62fb-41b7-a1d4-6f51f3af6d99\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-28T20:39:53Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-28T20:39:53Z\\\",\\\"message\\\":\\\"containers with incomplete status: [kubecfg-setup]\\\",\\\"reason\\\":\\\"ContainersNotInitialized\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-28T20:39:53Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-28T20:39:53Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb 
ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-kgrqs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-kgrqs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\
\":\\\"nbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-kgrqs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-kgrqs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\
\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-kgrqs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-kgrqs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"wa
iting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-kgrqs\\\",\\\"read
Only\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-kgrqs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-kgrqs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-28T20:39:53Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-8vmvh\": Internal error occurred: failed calling webhook 
\"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Jan 28 20:39:53 crc kubenswrapper[4746]: I0128 20:39:53.236388 4746 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-28T20:39:52Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-28T20:39:52Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Jan 28 20:39:53 crc kubenswrapper[4746]: I0128 20:39:53.247891 4746 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-28T20:39:52Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [webhook approver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-28T20:39:52Z\\\",\\\"message\\\":\\\"containers with unready status: [webhook 
approver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":false,\\\"restartCount\\\":6,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Jan 28 20:39:53 crc kubenswrapper[4746]: I0128 20:39:53.255929 4746 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-var-lib-cni-multus\" (UniqueName: \"kubernetes.io/host-path/cdf26de0-b602-4bdf-b492-65b3b6b31434-host-var-lib-cni-multus\") pod \"multus-qhpvf\" (UID: \"cdf26de0-b602-4bdf-b492-65b3b6b31434\") " pod="openshift-multus/multus-qhpvf" Jan 28 20:39:53 crc kubenswrapper[4746]: I0128 20:39:53.256509 4746 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"log-socket\" (UniqueName: \"kubernetes.io/host-path/c4d15639-62fb-41b7-a1d4-6f51f3af6d99-log-socket\") pod \"ovnkube-node-8vmvh\" (UID: \"c4d15639-62fb-41b7-a1d4-6f51f3af6d99\") " pod="openshift-ovn-kubernetes/ovnkube-node-8vmvh" Jan 28 20:39:53 crc kubenswrapper[4746]: I0128 20:39:53.256551 4746 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume 
\"host-cni-bin\" (UniqueName: \"kubernetes.io/host-path/c4d15639-62fb-41b7-a1d4-6f51f3af6d99-host-cni-bin\") pod \"ovnkube-node-8vmvh\" (UID: \"c4d15639-62fb-41b7-a1d4-6f51f3af6d99\") " pod="openshift-ovn-kubernetes/ovnkube-node-8vmvh" Jan 28 20:39:53 crc kubenswrapper[4746]: I0128 20:39:53.256595 4746 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-kgrqs\" (UniqueName: \"kubernetes.io/projected/c4d15639-62fb-41b7-a1d4-6f51f3af6d99-kube-api-access-kgrqs\") pod \"ovnkube-node-8vmvh\" (UID: \"c4d15639-62fb-41b7-a1d4-6f51f3af6d99\") " pod="openshift-ovn-kubernetes/ovnkube-node-8vmvh" Jan 28 20:39:53 crc kubenswrapper[4746]: I0128 20:39:53.256630 4746 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-var-lib-cni-bin\" (UniqueName: \"kubernetes.io/host-path/cdf26de0-b602-4bdf-b492-65b3b6b31434-host-var-lib-cni-bin\") pod \"multus-qhpvf\" (UID: \"cdf26de0-b602-4bdf-b492-65b3b6b31434\") " pod="openshift-multus/multus-qhpvf" Jan 28 20:39:53 crc kubenswrapper[4746]: I0128 20:39:53.256648 4746 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-openvswitch\" (UniqueName: \"kubernetes.io/host-path/c4d15639-62fb-41b7-a1d4-6f51f3af6d99-run-openvswitch\") pod \"ovnkube-node-8vmvh\" (UID: \"c4d15639-62fb-41b7-a1d4-6f51f3af6d99\") " pod="openshift-ovn-kubernetes/ovnkube-node-8vmvh" Jan 28 20:39:53 crc kubenswrapper[4746]: I0128 20:39:53.256667 4746 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/c4d15639-62fb-41b7-a1d4-6f51f3af6d99-host-run-ovn-kubernetes\") pod \"ovnkube-node-8vmvh\" (UID: \"c4d15639-62fb-41b7-a1d4-6f51f3af6d99\") " pod="openshift-ovn-kubernetes/ovnkube-node-8vmvh" Jan 28 20:39:53 crc kubenswrapper[4746]: I0128 20:39:53.256688 4746 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"host-cni-netd\" (UniqueName: \"kubernetes.io/host-path/c4d15639-62fb-41b7-a1d4-6f51f3af6d99-host-cni-netd\") pod \"ovnkube-node-8vmvh\" (UID: \"c4d15639-62fb-41b7-a1d4-6f51f3af6d99\") " pod="openshift-ovn-kubernetes/ovnkube-node-8vmvh" Jan 28 20:39:53 crc kubenswrapper[4746]: I0128 20:39:53.256706 4746 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-multus-certs\" (UniqueName: \"kubernetes.io/host-path/cdf26de0-b602-4bdf-b492-65b3b6b31434-host-run-multus-certs\") pod \"multus-qhpvf\" (UID: \"cdf26de0-b602-4bdf-b492-65b3b6b31434\") " pod="openshift-multus/multus-qhpvf" Jan 28 20:39:53 crc kubenswrapper[4746]: I0128 20:39:53.256731 4746 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"systemd-units\" (UniqueName: \"kubernetes.io/host-path/c4d15639-62fb-41b7-a1d4-6f51f3af6d99-systemd-units\") pod \"ovnkube-node-8vmvh\" (UID: \"c4d15639-62fb-41b7-a1d4-6f51f3af6d99\") " pod="openshift-ovn-kubernetes/ovnkube-node-8vmvh" Jan 28 20:39:53 crc kubenswrapper[4746]: I0128 20:39:53.256747 4746 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/c4d15639-62fb-41b7-a1d4-6f51f3af6d99-ovnkube-config\") pod \"ovnkube-node-8vmvh\" (UID: \"c4d15639-62fb-41b7-a1d4-6f51f3af6d99\") " pod="openshift-ovn-kubernetes/ovnkube-node-8vmvh" Jan 28 20:39:53 crc kubenswrapper[4746]: I0128 20:39:53.256762 4746 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/cdf26de0-b602-4bdf-b492-65b3b6b31434-os-release\") pod \"multus-qhpvf\" (UID: \"cdf26de0-b602-4bdf-b492-65b3b6b31434\") " pod="openshift-multus/multus-qhpvf" Jan 28 20:39:53 crc kubenswrapper[4746]: I0128 20:39:53.256780 4746 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"multus-socket-dir-parent\" (UniqueName: \"kubernetes.io/host-path/cdf26de0-b602-4bdf-b492-65b3b6b31434-multus-socket-dir-parent\") pod \"multus-qhpvf\" (UID: \"cdf26de0-b602-4bdf-b492-65b3b6b31434\") " pod="openshift-multus/multus-qhpvf" Jan 28 20:39:53 crc kubenswrapper[4746]: I0128 20:39:53.256796 4746 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/cdf26de0-b602-4bdf-b492-65b3b6b31434-host-run-netns\") pod \"multus-qhpvf\" (UID: \"cdf26de0-b602-4bdf-b492-65b3b6b31434\") " pod="openshift-multus/multus-qhpvf" Jan 28 20:39:53 crc kubenswrapper[4746]: I0128 20:39:53.256825 4746 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"multus-cni-dir\" (UniqueName: \"kubernetes.io/host-path/cdf26de0-b602-4bdf-b492-65b3b6b31434-multus-cni-dir\") pod \"multus-qhpvf\" (UID: \"cdf26de0-b602-4bdf-b492-65b3b6b31434\") " pod="openshift-multus/multus-qhpvf" Jan 28 20:39:53 crc kubenswrapper[4746]: I0128 20:39:53.256845 4746 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/c4d15639-62fb-41b7-a1d4-6f51f3af6d99-host-slash\") pod \"ovnkube-node-8vmvh\" (UID: \"c4d15639-62fb-41b7-a1d4-6f51f3af6d99\") " pod="openshift-ovn-kubernetes/ovnkube-node-8vmvh" Jan 28 20:39:53 crc kubenswrapper[4746]: I0128 20:39:53.256888 4746 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-systemd\" (UniqueName: \"kubernetes.io/host-path/c4d15639-62fb-41b7-a1d4-6f51f3af6d99-run-systemd\") pod \"ovnkube-node-8vmvh\" (UID: \"c4d15639-62fb-41b7-a1d4-6f51f3af6d99\") " pod="openshift-ovn-kubernetes/ovnkube-node-8vmvh" Jan 28 20:39:53 crc kubenswrapper[4746]: I0128 20:39:53.256904 4746 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"host-var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/cdf26de0-b602-4bdf-b492-65b3b6b31434-host-var-lib-kubelet\") pod \"multus-qhpvf\" (UID: \"cdf26de0-b602-4bdf-b492-65b3b6b31434\") " pod="openshift-multus/multus-qhpvf" Jan 28 20:39:53 crc kubenswrapper[4746]: I0128 20:39:53.256923 4746 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"multus-conf-dir\" (UniqueName: \"kubernetes.io/host-path/cdf26de0-b602-4bdf-b492-65b3b6b31434-multus-conf-dir\") pod \"multus-qhpvf\" (UID: \"cdf26de0-b602-4bdf-b492-65b3b6b31434\") " pod="openshift-multus/multus-qhpvf" Jan 28 20:39:53 crc kubenswrapper[4746]: I0128 20:39:53.256940 4746 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-jnpsl\" (UniqueName: \"kubernetes.io/projected/cdf26de0-b602-4bdf-b492-65b3b6b31434-kube-api-access-jnpsl\") pod \"multus-qhpvf\" (UID: \"cdf26de0-b602-4bdf-b492-65b3b6b31434\") " pod="openshift-multus/multus-qhpvf" Jan 28 20:39:53 crc kubenswrapper[4746]: I0128 20:39:53.256973 4746 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/beb2b795-6bf4-4d38-89f7-bcb5512c3e61-system-cni-dir\") pod \"multus-additional-cni-plugins-ht6hp\" (UID: \"beb2b795-6bf4-4d38-89f7-bcb5512c3e61\") " pod="openshift-multus/multus-additional-cni-plugins-ht6hp" Jan 28 20:39:53 crc kubenswrapper[4746]: I0128 20:39:53.256997 4746 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/cdf26de0-b602-4bdf-b492-65b3b6b31434-system-cni-dir\") pod \"multus-qhpvf\" (UID: \"cdf26de0-b602-4bdf-b492-65b3b6b31434\") " pod="openshift-multus/multus-qhpvf" Jan 28 20:39:53 crc kubenswrapper[4746]: I0128 20:39:53.257060 4746 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/c4d15639-62fb-41b7-a1d4-6f51f3af6d99-host-run-netns\") pod \"ovnkube-node-8vmvh\" (UID: \"c4d15639-62fb-41b7-a1d4-6f51f3af6d99\") " pod="openshift-ovn-kubernetes/ovnkube-node-8vmvh" Jan 28 20:39:53 crc kubenswrapper[4746]: I0128 20:39:53.257076 4746 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-openvswitch\" (UniqueName: \"kubernetes.io/host-path/c4d15639-62fb-41b7-a1d4-6f51f3af6d99-etc-openvswitch\") pod \"ovnkube-node-8vmvh\" (UID: \"c4d15639-62fb-41b7-a1d4-6f51f3af6d99\") " pod="openshift-ovn-kubernetes/ovnkube-node-8vmvh" Jan 28 20:39:53 crc kubenswrapper[4746]: I0128 20:39:53.257110 4746 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cni-sysctl-allowlist\" (UniqueName: \"kubernetes.io/configmap/beb2b795-6bf4-4d38-89f7-bcb5512c3e61-cni-sysctl-allowlist\") pod \"multus-additional-cni-plugins-ht6hp\" (UID: \"beb2b795-6bf4-4d38-89f7-bcb5512c3e61\") " pod="openshift-multus/multus-additional-cni-plugins-ht6hp" Jan 28 20:39:53 crc kubenswrapper[4746]: I0128 20:39:53.257139 4746 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-rr62b\" (UniqueName: \"kubernetes.io/projected/beb2b795-6bf4-4d38-89f7-bcb5512c3e61-kube-api-access-rr62b\") pod \"multus-additional-cni-plugins-ht6hp\" (UID: \"beb2b795-6bf4-4d38-89f7-bcb5512c3e61\") " pod="openshift-multus/multus-additional-cni-plugins-ht6hp" Jan 28 20:39:53 crc kubenswrapper[4746]: I0128 20:39:53.257204 4746 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/6dc8b546-9734-4082-b2b3-2bafe3f1564d-proxy-tls\") pod \"machine-config-daemon-6wrnw\" (UID: \"6dc8b546-9734-4082-b2b3-2bafe3f1564d\") " 
pod="openshift-machine-config-operator/machine-config-daemon-6wrnw" Jan 28 20:39:53 crc kubenswrapper[4746]: I0128 20:39:53.257243 4746 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovn-node-metrics-cert\" (UniqueName: \"kubernetes.io/secret/c4d15639-62fb-41b7-a1d4-6f51f3af6d99-ovn-node-metrics-cert\") pod \"ovnkube-node-8vmvh\" (UID: \"c4d15639-62fb-41b7-a1d4-6f51f3af6d99\") " pod="openshift-ovn-kubernetes/ovnkube-node-8vmvh" Jan 28 20:39:53 crc kubenswrapper[4746]: I0128 20:39:53.257268 4746 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/cdf26de0-b602-4bdf-b492-65b3b6b31434-cnibin\") pod \"multus-qhpvf\" (UID: \"cdf26de0-b602-4bdf-b492-65b3b6b31434\") " pod="openshift-multus/multus-qhpvf" Jan 28 20:39:53 crc kubenswrapper[4746]: I0128 20:39:53.257290 4746 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tuning-conf-dir\" (UniqueName: \"kubernetes.io/host-path/beb2b795-6bf4-4d38-89f7-bcb5512c3e61-tuning-conf-dir\") pod \"multus-additional-cni-plugins-ht6hp\" (UID: \"beb2b795-6bf4-4d38-89f7-bcb5512c3e61\") " pod="openshift-multus/multus-additional-cni-plugins-ht6hp" Jan 28 20:39:53 crc kubenswrapper[4746]: I0128 20:39:53.257326 4746 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-kubelet\" (UniqueName: \"kubernetes.io/host-path/c4d15639-62fb-41b7-a1d4-6f51f3af6d99-host-kubelet\") pod \"ovnkube-node-8vmvh\" (UID: \"c4d15639-62fb-41b7-a1d4-6f51f3af6d99\") " pod="openshift-ovn-kubernetes/ovnkube-node-8vmvh" Jan 28 20:39:53 crc kubenswrapper[4746]: I0128 20:39:53.257346 4746 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-var-lib-cni-networks-ovn-kubernetes\" (UniqueName: 
\"kubernetes.io/host-path/c4d15639-62fb-41b7-a1d4-6f51f3af6d99-host-var-lib-cni-networks-ovn-kubernetes\") pod \"ovnkube-node-8vmvh\" (UID: \"c4d15639-62fb-41b7-a1d4-6f51f3af6d99\") " pod="openshift-ovn-kubernetes/ovnkube-node-8vmvh" Jan 28 20:39:53 crc kubenswrapper[4746]: I0128 20:39:53.257366 4746 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/cdf26de0-b602-4bdf-b492-65b3b6b31434-cni-binary-copy\") pod \"multus-qhpvf\" (UID: \"cdf26de0-b602-4bdf-b492-65b3b6b31434\") " pod="openshift-multus/multus-qhpvf" Jan 28 20:39:53 crc kubenswrapper[4746]: I0128 20:39:53.257386 4746 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"hostroot\" (UniqueName: \"kubernetes.io/host-path/cdf26de0-b602-4bdf-b492-65b3b6b31434-hostroot\") pod \"multus-qhpvf\" (UID: \"cdf26de0-b602-4bdf-b492-65b3b6b31434\") " pod="openshift-multus/multus-qhpvf" Jan 28 20:39:53 crc kubenswrapper[4746]: I0128 20:39:53.257408 4746 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-log\" (UniqueName: \"kubernetes.io/host-path/c4d15639-62fb-41b7-a1d4-6f51f3af6d99-node-log\") pod \"ovnkube-node-8vmvh\" (UID: \"c4d15639-62fb-41b7-a1d4-6f51f3af6d99\") " pod="openshift-ovn-kubernetes/ovnkube-node-8vmvh" Jan 28 20:39:53 crc kubenswrapper[4746]: I0128 20:39:53.257431 4746 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/beb2b795-6bf4-4d38-89f7-bcb5512c3e61-os-release\") pod \"multus-additional-cni-plugins-ht6hp\" (UID: \"beb2b795-6bf4-4d38-89f7-bcb5512c3e61\") " pod="openshift-multus/multus-additional-cni-plugins-ht6hp" Jan 28 20:39:53 crc kubenswrapper[4746]: I0128 20:39:53.257460 4746 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cnibin\" (UniqueName: 
\"kubernetes.io/host-path/beb2b795-6bf4-4d38-89f7-bcb5512c3e61-cnibin\") pod \"multus-additional-cni-plugins-ht6hp\" (UID: \"beb2b795-6bf4-4d38-89f7-bcb5512c3e61\") " pod="openshift-multus/multus-additional-cni-plugins-ht6hp" Jan 28 20:39:53 crc kubenswrapper[4746]: I0128 20:39:53.257483 4746 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/beb2b795-6bf4-4d38-89f7-bcb5512c3e61-cni-binary-copy\") pod \"multus-additional-cni-plugins-ht6hp\" (UID: \"beb2b795-6bf4-4d38-89f7-bcb5512c3e61\") " pod="openshift-multus/multus-additional-cni-plugins-ht6hp" Jan 28 20:39:53 crc kubenswrapper[4746]: I0128 20:39:53.257504 4746 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-k8s-cni-cncf-io\" (UniqueName: \"kubernetes.io/host-path/cdf26de0-b602-4bdf-b492-65b3b6b31434-host-run-k8s-cni-cncf-io\") pod \"multus-qhpvf\" (UID: \"cdf26de0-b602-4bdf-b492-65b3b6b31434\") " pod="openshift-multus/multus-qhpvf" Jan 28 20:39:53 crc kubenswrapper[4746]: I0128 20:39:53.257528 4746 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"multus-daemon-config\" (UniqueName: \"kubernetes.io/configmap/cdf26de0-b602-4bdf-b492-65b3b6b31434-multus-daemon-config\") pod \"multus-qhpvf\" (UID: \"cdf26de0-b602-4bdf-b492-65b3b6b31434\") " pod="openshift-multus/multus-qhpvf" Jan 28 20:39:53 crc kubenswrapper[4746]: I0128 20:39:53.257598 4746 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-lib-openvswitch\" (UniqueName: \"kubernetes.io/host-path/c4d15639-62fb-41b7-a1d4-6f51f3af6d99-var-lib-openvswitch\") pod \"ovnkube-node-8vmvh\" (UID: \"c4d15639-62fb-41b7-a1d4-6f51f3af6d99\") " pod="openshift-ovn-kubernetes/ovnkube-node-8vmvh" Jan 28 20:39:53 crc kubenswrapper[4746]: I0128 20:39:53.257641 4746 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-kubernetes\" (UniqueName: \"kubernetes.io/host-path/cdf26de0-b602-4bdf-b492-65b3b6b31434-etc-kubernetes\") pod \"multus-qhpvf\" (UID: \"cdf26de0-b602-4bdf-b492-65b3b6b31434\") " pod="openshift-multus/multus-qhpvf" Jan 28 20:39:53 crc kubenswrapper[4746]: I0128 20:39:53.257674 4746 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-ovn\" (UniqueName: \"kubernetes.io/host-path/c4d15639-62fb-41b7-a1d4-6f51f3af6d99-run-ovn\") pod \"ovnkube-node-8vmvh\" (UID: \"c4d15639-62fb-41b7-a1d4-6f51f3af6d99\") " pod="openshift-ovn-kubernetes/ovnkube-node-8vmvh" Jan 28 20:39:53 crc kubenswrapper[4746]: I0128 20:39:53.257695 4746 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovnkube-script-lib\" (UniqueName: \"kubernetes.io/configmap/c4d15639-62fb-41b7-a1d4-6f51f3af6d99-ovnkube-script-lib\") pod \"ovnkube-node-8vmvh\" (UID: \"c4d15639-62fb-41b7-a1d4-6f51f3af6d99\") " pod="openshift-ovn-kubernetes/ovnkube-node-8vmvh" Jan 28 20:39:53 crc kubenswrapper[4746]: I0128 20:39:53.257713 4746 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/c4d15639-62fb-41b7-a1d4-6f51f3af6d99-env-overrides\") pod \"ovnkube-node-8vmvh\" (UID: \"c4d15639-62fb-41b7-a1d4-6f51f3af6d99\") " pod="openshift-ovn-kubernetes/ovnkube-node-8vmvh" Jan 28 20:39:53 crc kubenswrapper[4746]: I0128 20:39:53.257731 4746 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rootfs\" (UniqueName: \"kubernetes.io/host-path/6dc8b546-9734-4082-b2b3-2bafe3f1564d-rootfs\") pod \"machine-config-daemon-6wrnw\" (UID: \"6dc8b546-9734-4082-b2b3-2bafe3f1564d\") " pod="openshift-machine-config-operator/machine-config-daemon-6wrnw" Jan 28 20:39:53 crc kubenswrapper[4746]: I0128 20:39:53.257748 4746 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"mcd-auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/6dc8b546-9734-4082-b2b3-2bafe3f1564d-mcd-auth-proxy-config\") pod \"machine-config-daemon-6wrnw\" (UID: \"6dc8b546-9734-4082-b2b3-2bafe3f1564d\") " pod="openshift-machine-config-operator/machine-config-daemon-6wrnw" Jan 28 20:39:53 crc kubenswrapper[4746]: I0128 20:39:53.257764 4746 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-z84f9\" (UniqueName: \"kubernetes.io/projected/6dc8b546-9734-4082-b2b3-2bafe3f1564d-kube-api-access-z84f9\") pod \"machine-config-daemon-6wrnw\" (UID: \"6dc8b546-9734-4082-b2b3-2bafe3f1564d\") " pod="openshift-machine-config-operator/machine-config-daemon-6wrnw" Jan 28 20:39:53 crc kubenswrapper[4746]: I0128 20:39:53.260198 4746 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-6wrnw" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"6dc8b546-9734-4082-b2b3-2bafe3f1564d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-28T20:39:53Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-28T20:39:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-28T20:39:53Z\\\",\\\"message\\\":\\\"containers with unready status: [machine-config-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-28T20:39:53Z\\\",\\\"message\\\":\\\"containers with unready status: [machine-config-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-z84f9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-z84f9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],
\\\"startTime\\\":\\\"2026-01-28T20:39:53Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-6wrnw\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Jan 28 20:39:53 crc kubenswrapper[4746]: I0128 20:39:53.276413 4746 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-28T20:39:52Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-operator]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-28T20:39:52Z\\\",\\\"message\\\":\\\"containers with unready status: [network-operator]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":false,\\\"restartCount\\\":5,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Jan 28 20:39:53 crc kubenswrapper[4746]: I0128 20:39:53.298410 4746 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-28T20:39:52Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-28T20:39:52Z\\\",\\\"message\\\":\\\"containers with unready status: 
[iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Jan 28 20:39:53 crc kubenswrapper[4746]: I0128 20:39:53.316040 4746 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-28T20:39:52Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-28T20:39:52Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Jan 28 20:39:53 crc kubenswrapper[4746]: I0128 20:39:53.333187 4746 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-28T20:39:52Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-28T20:39:52Z\\\",\\\"message\\\":\\\"containers with unready status: 
[network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Jan 28 20:39:53 crc kubenswrapper[4746]: I0128 20:39:53.346507 4746 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-28T20:39:52Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-28T20:39:52Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Jan 28 20:39:53 crc kubenswrapper[4746]: I0128 20:39:53.359087 4746 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Jan 28 20:39:53 crc kubenswrapper[4746]: I0128 20:39:53.359171 4746 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"systemd-units\" (UniqueName: \"kubernetes.io/host-path/c4d15639-62fb-41b7-a1d4-6f51f3af6d99-systemd-units\") pod \"ovnkube-node-8vmvh\" (UID: \"c4d15639-62fb-41b7-a1d4-6f51f3af6d99\") " pod="openshift-ovn-kubernetes/ovnkube-node-8vmvh" Jan 28 20:39:53 crc kubenswrapper[4746]: I0128 20:39:53.359216 4746 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/c4d15639-62fb-41b7-a1d4-6f51f3af6d99-ovnkube-config\") pod \"ovnkube-node-8vmvh\" (UID: \"c4d15639-62fb-41b7-a1d4-6f51f3af6d99\") " 
pod="openshift-ovn-kubernetes/ovnkube-node-8vmvh" Jan 28 20:39:53 crc kubenswrapper[4746]: I0128 20:39:53.359232 4746 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/cdf26de0-b602-4bdf-b492-65b3b6b31434-os-release\") pod \"multus-qhpvf\" (UID: \"cdf26de0-b602-4bdf-b492-65b3b6b31434\") " pod="openshift-multus/multus-qhpvf" Jan 28 20:39:53 crc kubenswrapper[4746]: I0128 20:39:53.359249 4746 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"multus-socket-dir-parent\" (UniqueName: \"kubernetes.io/host-path/cdf26de0-b602-4bdf-b492-65b3b6b31434-multus-socket-dir-parent\") pod \"multus-qhpvf\" (UID: \"cdf26de0-b602-4bdf-b492-65b3b6b31434\") " pod="openshift-multus/multus-qhpvf" Jan 28 20:39:53 crc kubenswrapper[4746]: I0128 20:39:53.359265 4746 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/cdf26de0-b602-4bdf-b492-65b3b6b31434-host-run-netns\") pod \"multus-qhpvf\" (UID: \"cdf26de0-b602-4bdf-b492-65b3b6b31434\") " pod="openshift-multus/multus-qhpvf" Jan 28 20:39:53 crc kubenswrapper[4746]: I0128 20:39:53.359286 4746 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"networking-console-plugin-cert\" (UniqueName: \"kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Jan 28 20:39:53 crc kubenswrapper[4746]: I0128 20:39:53.359302 4746 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"multus-cni-dir\" (UniqueName: \"kubernetes.io/host-path/cdf26de0-b602-4bdf-b492-65b3b6b31434-multus-cni-dir\") pod \"multus-qhpvf\" (UID: \"cdf26de0-b602-4bdf-b492-65b3b6b31434\") " pod="openshift-multus/multus-qhpvf" Jan 28 20:39:53 crc 
kubenswrapper[4746]: I0128 20:39:53.359318 4746 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/c4d15639-62fb-41b7-a1d4-6f51f3af6d99-host-slash\") pod \"ovnkube-node-8vmvh\" (UID: \"c4d15639-62fb-41b7-a1d4-6f51f3af6d99\") " pod="openshift-ovn-kubernetes/ovnkube-node-8vmvh" Jan 28 20:39:53 crc kubenswrapper[4746]: I0128 20:39:53.359331 4746 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run-systemd\" (UniqueName: \"kubernetes.io/host-path/c4d15639-62fb-41b7-a1d4-6f51f3af6d99-run-systemd\") pod \"ovnkube-node-8vmvh\" (UID: \"c4d15639-62fb-41b7-a1d4-6f51f3af6d99\") " pod="openshift-ovn-kubernetes/ovnkube-node-8vmvh" Jan 28 20:39:53 crc kubenswrapper[4746]: I0128 20:39:53.359344 4746 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/cdf26de0-b602-4bdf-b492-65b3b6b31434-host-var-lib-kubelet\") pod \"multus-qhpvf\" (UID: \"cdf26de0-b602-4bdf-b492-65b3b6b31434\") " pod="openshift-multus/multus-qhpvf" Jan 28 20:39:53 crc kubenswrapper[4746]: I0128 20:39:53.359358 4746 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"multus-conf-dir\" (UniqueName: \"kubernetes.io/host-path/cdf26de0-b602-4bdf-b492-65b3b6b31434-multus-conf-dir\") pod \"multus-qhpvf\" (UID: \"cdf26de0-b602-4bdf-b492-65b3b6b31434\") " pod="openshift-multus/multus-qhpvf" Jan 28 20:39:53 crc kubenswrapper[4746]: I0128 20:39:53.359372 4746 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-jnpsl\" (UniqueName: \"kubernetes.io/projected/cdf26de0-b602-4bdf-b492-65b3b6b31434-kube-api-access-jnpsl\") pod \"multus-qhpvf\" (UID: \"cdf26de0-b602-4bdf-b492-65b3b6b31434\") " pod="openshift-multus/multus-qhpvf" Jan 28 20:39:53 crc kubenswrapper[4746]: I0128 20:39:53.359388 4746 reconciler_common.go:218] "operationExecutor.MountVolume started for volume 
\"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/beb2b795-6bf4-4d38-89f7-bcb5512c3e61-system-cni-dir\") pod \"multus-additional-cni-plugins-ht6hp\" (UID: \"beb2b795-6bf4-4d38-89f7-bcb5512c3e61\") " pod="openshift-multus/multus-additional-cni-plugins-ht6hp" Jan 28 20:39:53 crc kubenswrapper[4746]: I0128 20:39:53.359405 4746 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/cdf26de0-b602-4bdf-b492-65b3b6b31434-system-cni-dir\") pod \"multus-qhpvf\" (UID: \"cdf26de0-b602-4bdf-b492-65b3b6b31434\") " pod="openshift-multus/multus-qhpvf" Jan 28 20:39:53 crc kubenswrapper[4746]: I0128 20:39:53.359420 4746 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nginx-conf\" (UniqueName: \"kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Jan 28 20:39:53 crc kubenswrapper[4746]: I0128 20:39:53.359434 4746 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/c4d15639-62fb-41b7-a1d4-6f51f3af6d99-host-run-netns\") pod \"ovnkube-node-8vmvh\" (UID: \"c4d15639-62fb-41b7-a1d4-6f51f3af6d99\") " pod="openshift-ovn-kubernetes/ovnkube-node-8vmvh" Jan 28 20:39:53 crc kubenswrapper[4746]: I0128 20:39:53.359456 4746 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-openvswitch\" (UniqueName: \"kubernetes.io/host-path/c4d15639-62fb-41b7-a1d4-6f51f3af6d99-etc-openvswitch\") pod \"ovnkube-node-8vmvh\" (UID: \"c4d15639-62fb-41b7-a1d4-6f51f3af6d99\") " pod="openshift-ovn-kubernetes/ovnkube-node-8vmvh" Jan 28 20:39:53 crc kubenswrapper[4746]: I0128 20:39:53.359472 4746 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cni-sysctl-allowlist\" 
(UniqueName: \"kubernetes.io/configmap/beb2b795-6bf4-4d38-89f7-bcb5512c3e61-cni-sysctl-allowlist\") pod \"multus-additional-cni-plugins-ht6hp\" (UID: \"beb2b795-6bf4-4d38-89f7-bcb5512c3e61\") " pod="openshift-multus/multus-additional-cni-plugins-ht6hp" Jan 28 20:39:53 crc kubenswrapper[4746]: I0128 20:39:53.359487 4746 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-rr62b\" (UniqueName: \"kubernetes.io/projected/beb2b795-6bf4-4d38-89f7-bcb5512c3e61-kube-api-access-rr62b\") pod \"multus-additional-cni-plugins-ht6hp\" (UID: \"beb2b795-6bf4-4d38-89f7-bcb5512c3e61\") " pod="openshift-multus/multus-additional-cni-plugins-ht6hp" Jan 28 20:39:53 crc kubenswrapper[4746]: I0128 20:39:53.359501 4746 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/6dc8b546-9734-4082-b2b3-2bafe3f1564d-proxy-tls\") pod \"machine-config-daemon-6wrnw\" (UID: \"6dc8b546-9734-4082-b2b3-2bafe3f1564d\") " pod="openshift-machine-config-operator/machine-config-daemon-6wrnw" Jan 28 20:39:53 crc kubenswrapper[4746]: I0128 20:39:53.359553 4746 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/cdf26de0-b602-4bdf-b492-65b3b6b31434-cnibin\") pod \"multus-qhpvf\" (UID: \"cdf26de0-b602-4bdf-b492-65b3b6b31434\") " pod="openshift-multus/multus-qhpvf" Jan 28 20:39:53 crc kubenswrapper[4746]: I0128 20:39:53.359567 4746 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovn-node-metrics-cert\" (UniqueName: \"kubernetes.io/secret/c4d15639-62fb-41b7-a1d4-6f51f3af6d99-ovn-node-metrics-cert\") pod \"ovnkube-node-8vmvh\" (UID: \"c4d15639-62fb-41b7-a1d4-6f51f3af6d99\") " pod="openshift-ovn-kubernetes/ovnkube-node-8vmvh" Jan 28 20:39:53 crc kubenswrapper[4746]: I0128 20:39:53.359582 4746 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-kubelet\" (UniqueName: 
\"kubernetes.io/host-path/c4d15639-62fb-41b7-a1d4-6f51f3af6d99-host-kubelet\") pod \"ovnkube-node-8vmvh\" (UID: \"c4d15639-62fb-41b7-a1d4-6f51f3af6d99\") " pod="openshift-ovn-kubernetes/ovnkube-node-8vmvh" Jan 28 20:39:53 crc kubenswrapper[4746]: I0128 20:39:53.359596 4746 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"tuning-conf-dir\" (UniqueName: \"kubernetes.io/host-path/beb2b795-6bf4-4d38-89f7-bcb5512c3e61-tuning-conf-dir\") pod \"multus-additional-cni-plugins-ht6hp\" (UID: \"beb2b795-6bf4-4d38-89f7-bcb5512c3e61\") " pod="openshift-multus/multus-additional-cni-plugins-ht6hp" Jan 28 20:39:53 crc kubenswrapper[4746]: I0128 20:39:53.359611 4746 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/cdf26de0-b602-4bdf-b492-65b3b6b31434-cni-binary-copy\") pod \"multus-qhpvf\" (UID: \"cdf26de0-b602-4bdf-b492-65b3b6b31434\") " pod="openshift-multus/multus-qhpvf" Jan 28 20:39:53 crc kubenswrapper[4746]: I0128 20:39:53.359642 4746 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"hostroot\" (UniqueName: \"kubernetes.io/host-path/cdf26de0-b602-4bdf-b492-65b3b6b31434-hostroot\") pod \"multus-qhpvf\" (UID: \"cdf26de0-b602-4bdf-b492-65b3b6b31434\") " pod="openshift-multus/multus-qhpvf" Jan 28 20:39:53 crc kubenswrapper[4746]: I0128 20:39:53.359660 4746 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-var-lib-cni-networks-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/c4d15639-62fb-41b7-a1d4-6f51f3af6d99-host-var-lib-cni-networks-ovn-kubernetes\") pod \"ovnkube-node-8vmvh\" (UID: \"c4d15639-62fb-41b7-a1d4-6f51f3af6d99\") " pod="openshift-ovn-kubernetes/ovnkube-node-8vmvh" Jan 28 20:39:53 crc kubenswrapper[4746]: I0128 20:39:53.359675 4746 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"node-log\" (UniqueName: 
\"kubernetes.io/host-path/c4d15639-62fb-41b7-a1d4-6f51f3af6d99-node-log\") pod \"ovnkube-node-8vmvh\" (UID: \"c4d15639-62fb-41b7-a1d4-6f51f3af6d99\") " pod="openshift-ovn-kubernetes/ovnkube-node-8vmvh" Jan 28 20:39:53 crc kubenswrapper[4746]: I0128 20:39:53.359690 4746 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/beb2b795-6bf4-4d38-89f7-bcb5512c3e61-os-release\") pod \"multus-additional-cni-plugins-ht6hp\" (UID: \"beb2b795-6bf4-4d38-89f7-bcb5512c3e61\") " pod="openshift-multus/multus-additional-cni-plugins-ht6hp" Jan 28 20:39:53 crc kubenswrapper[4746]: I0128 20:39:53.359707 4746 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/beb2b795-6bf4-4d38-89f7-bcb5512c3e61-cnibin\") pod \"multus-additional-cni-plugins-ht6hp\" (UID: \"beb2b795-6bf4-4d38-89f7-bcb5512c3e61\") " pod="openshift-multus/multus-additional-cni-plugins-ht6hp" Jan 28 20:39:53 crc kubenswrapper[4746]: I0128 20:39:53.359722 4746 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/beb2b795-6bf4-4d38-89f7-bcb5512c3e61-cni-binary-copy\") pod \"multus-additional-cni-plugins-ht6hp\" (UID: \"beb2b795-6bf4-4d38-89f7-bcb5512c3e61\") " pod="openshift-multus/multus-additional-cni-plugins-ht6hp" Jan 28 20:39:53 crc kubenswrapper[4746]: I0128 20:39:53.359736 4746 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-run-k8s-cni-cncf-io\" (UniqueName: \"kubernetes.io/host-path/cdf26de0-b602-4bdf-b492-65b3b6b31434-host-run-k8s-cni-cncf-io\") pod \"multus-qhpvf\" (UID: \"cdf26de0-b602-4bdf-b492-65b3b6b31434\") " pod="openshift-multus/multus-qhpvf" Jan 28 20:39:53 crc kubenswrapper[4746]: I0128 20:39:53.359750 4746 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"multus-daemon-config\" (UniqueName: 
\"kubernetes.io/configmap/cdf26de0-b602-4bdf-b492-65b3b6b31434-multus-daemon-config\") pod \"multus-qhpvf\" (UID: \"cdf26de0-b602-4bdf-b492-65b3b6b31434\") " pod="openshift-multus/multus-qhpvf" Jan 28 20:39:53 crc kubenswrapper[4746]: I0128 20:39:53.359771 4746 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-lib-openvswitch\" (UniqueName: \"kubernetes.io/host-path/c4d15639-62fb-41b7-a1d4-6f51f3af6d99-var-lib-openvswitch\") pod \"ovnkube-node-8vmvh\" (UID: \"c4d15639-62fb-41b7-a1d4-6f51f3af6d99\") " pod="openshift-ovn-kubernetes/ovnkube-node-8vmvh" Jan 28 20:39:53 crc kubenswrapper[4746]: I0128 20:39:53.359789 4746 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-kubernetes\" (UniqueName: \"kubernetes.io/host-path/cdf26de0-b602-4bdf-b492-65b3b6b31434-etc-kubernetes\") pod \"multus-qhpvf\" (UID: \"cdf26de0-b602-4bdf-b492-65b3b6b31434\") " pod="openshift-multus/multus-qhpvf" Jan 28 20:39:53 crc kubenswrapper[4746]: I0128 20:39:53.359804 4746 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run-ovn\" (UniqueName: \"kubernetes.io/host-path/c4d15639-62fb-41b7-a1d4-6f51f3af6d99-run-ovn\") pod \"ovnkube-node-8vmvh\" (UID: \"c4d15639-62fb-41b7-a1d4-6f51f3af6d99\") " pod="openshift-ovn-kubernetes/ovnkube-node-8vmvh" Jan 28 20:39:53 crc kubenswrapper[4746]: I0128 20:39:53.359819 4746 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovnkube-script-lib\" (UniqueName: \"kubernetes.io/configmap/c4d15639-62fb-41b7-a1d4-6f51f3af6d99-ovnkube-script-lib\") pod \"ovnkube-node-8vmvh\" (UID: \"c4d15639-62fb-41b7-a1d4-6f51f3af6d99\") " pod="openshift-ovn-kubernetes/ovnkube-node-8vmvh" Jan 28 20:39:53 crc kubenswrapper[4746]: I0128 20:39:53.359870 4746 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/c4d15639-62fb-41b7-a1d4-6f51f3af6d99-env-overrides\") pod \"ovnkube-node-8vmvh\" 
(UID: \"c4d15639-62fb-41b7-a1d4-6f51f3af6d99\") " pod="openshift-ovn-kubernetes/ovnkube-node-8vmvh" Jan 28 20:39:53 crc kubenswrapper[4746]: I0128 20:39:53.359890 4746 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rootfs\" (UniqueName: \"kubernetes.io/host-path/6dc8b546-9734-4082-b2b3-2bafe3f1564d-rootfs\") pod \"machine-config-daemon-6wrnw\" (UID: \"6dc8b546-9734-4082-b2b3-2bafe3f1564d\") " pod="openshift-machine-config-operator/machine-config-daemon-6wrnw" Jan 28 20:39:53 crc kubenswrapper[4746]: I0128 20:39:53.359906 4746 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"mcd-auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/6dc8b546-9734-4082-b2b3-2bafe3f1564d-mcd-auth-proxy-config\") pod \"machine-config-daemon-6wrnw\" (UID: \"6dc8b546-9734-4082-b2b3-2bafe3f1564d\") " pod="openshift-machine-config-operator/machine-config-daemon-6wrnw" Jan 28 20:39:53 crc kubenswrapper[4746]: I0128 20:39:53.359921 4746 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-z84f9\" (UniqueName: \"kubernetes.io/projected/6dc8b546-9734-4082-b2b3-2bafe3f1564d-kube-api-access-z84f9\") pod \"machine-config-daemon-6wrnw\" (UID: \"6dc8b546-9734-4082-b2b3-2bafe3f1564d\") " pod="openshift-machine-config-operator/machine-config-daemon-6wrnw" Jan 28 20:39:53 crc kubenswrapper[4746]: I0128 20:39:53.359939 4746 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-cqllr\" (UniqueName: \"kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr\") pod \"network-check-target-xd92c\" (UID: \"3b6479f0-333b-4a96-9adf-2099afdc2447\") " pod="openshift-network-diagnostics/network-check-target-xd92c" Jan 28 20:39:53 crc kubenswrapper[4746]: I0128 20:39:53.359953 4746 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"log-socket\" (UniqueName: 
\"kubernetes.io/host-path/c4d15639-62fb-41b7-a1d4-6f51f3af6d99-log-socket\") pod \"ovnkube-node-8vmvh\" (UID: \"c4d15639-62fb-41b7-a1d4-6f51f3af6d99\") " pod="openshift-ovn-kubernetes/ovnkube-node-8vmvh" Jan 28 20:39:53 crc kubenswrapper[4746]: I0128 20:39:53.359984 4746 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-cni-bin\" (UniqueName: \"kubernetes.io/host-path/c4d15639-62fb-41b7-a1d4-6f51f3af6d99-host-cni-bin\") pod \"ovnkube-node-8vmvh\" (UID: \"c4d15639-62fb-41b7-a1d4-6f51f3af6d99\") " pod="openshift-ovn-kubernetes/ovnkube-node-8vmvh" Jan 28 20:39:53 crc kubenswrapper[4746]: I0128 20:39:53.359999 4746 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-kgrqs\" (UniqueName: \"kubernetes.io/projected/c4d15639-62fb-41b7-a1d4-6f51f3af6d99-kube-api-access-kgrqs\") pod \"ovnkube-node-8vmvh\" (UID: \"c4d15639-62fb-41b7-a1d4-6f51f3af6d99\") " pod="openshift-ovn-kubernetes/ovnkube-node-8vmvh" Jan 28 20:39:53 crc kubenswrapper[4746]: I0128 20:39:53.360013 4746 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-var-lib-cni-bin\" (UniqueName: \"kubernetes.io/host-path/cdf26de0-b602-4bdf-b492-65b3b6b31434-host-var-lib-cni-bin\") pod \"multus-qhpvf\" (UID: \"cdf26de0-b602-4bdf-b492-65b3b6b31434\") " pod="openshift-multus/multus-qhpvf" Jan 28 20:39:53 crc kubenswrapper[4746]: I0128 20:39:53.360035 4746 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-var-lib-cni-multus\" (UniqueName: \"kubernetes.io/host-path/cdf26de0-b602-4bdf-b492-65b3b6b31434-host-var-lib-cni-multus\") pod \"multus-qhpvf\" (UID: \"cdf26de0-b602-4bdf-b492-65b3b6b31434\") " pod="openshift-multus/multus-qhpvf" Jan 28 20:39:53 crc kubenswrapper[4746]: I0128 20:39:53.360050 4746 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-run-ovn-kubernetes\" (UniqueName: 
\"kubernetes.io/host-path/c4d15639-62fb-41b7-a1d4-6f51f3af6d99-host-run-ovn-kubernetes\") pod \"ovnkube-node-8vmvh\" (UID: \"c4d15639-62fb-41b7-a1d4-6f51f3af6d99\") " pod="openshift-ovn-kubernetes/ovnkube-node-8vmvh" Jan 28 20:39:53 crc kubenswrapper[4746]: I0128 20:39:53.360064 4746 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-cni-netd\" (UniqueName: \"kubernetes.io/host-path/c4d15639-62fb-41b7-a1d4-6f51f3af6d99-host-cni-netd\") pod \"ovnkube-node-8vmvh\" (UID: \"c4d15639-62fb-41b7-a1d4-6f51f3af6d99\") " pod="openshift-ovn-kubernetes/ovnkube-node-8vmvh" Jan 28 20:39:53 crc kubenswrapper[4746]: I0128 20:39:53.360090 4746 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-run-multus-certs\" (UniqueName: \"kubernetes.io/host-path/cdf26de0-b602-4bdf-b492-65b3b6b31434-host-run-multus-certs\") pod \"multus-qhpvf\" (UID: \"cdf26de0-b602-4bdf-b492-65b3b6b31434\") " pod="openshift-multus/multus-qhpvf" Jan 28 20:39:53 crc kubenswrapper[4746]: I0128 20:39:53.360105 4746 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run-openvswitch\" (UniqueName: \"kubernetes.io/host-path/c4d15639-62fb-41b7-a1d4-6f51f3af6d99-run-openvswitch\") pod \"ovnkube-node-8vmvh\" (UID: \"c4d15639-62fb-41b7-a1d4-6f51f3af6d99\") " pod="openshift-ovn-kubernetes/ovnkube-node-8vmvh" Jan 28 20:39:53 crc kubenswrapper[4746]: I0128 20:39:53.360163 4746 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run-openvswitch\" (UniqueName: \"kubernetes.io/host-path/c4d15639-62fb-41b7-a1d4-6f51f3af6d99-run-openvswitch\") pod \"ovnkube-node-8vmvh\" (UID: \"c4d15639-62fb-41b7-a1d4-6f51f3af6d99\") " pod="openshift-ovn-kubernetes/ovnkube-node-8vmvh" Jan 28 20:39:53 crc kubenswrapper[4746]: E0128 20:39:53.360235 4746 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 
podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-01-28 20:39:54.360219311 +0000 UTC m=+22.316405665 (durationBeforeRetry 1s). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Jan 28 20:39:53 crc kubenswrapper[4746]: I0128 20:39:53.360258 4746 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"systemd-units\" (UniqueName: \"kubernetes.io/host-path/c4d15639-62fb-41b7-a1d4-6f51f3af6d99-systemd-units\") pod \"ovnkube-node-8vmvh\" (UID: \"c4d15639-62fb-41b7-a1d4-6f51f3af6d99\") " pod="openshift-ovn-kubernetes/ovnkube-node-8vmvh" Jan 28 20:39:53 crc kubenswrapper[4746]: I0128 20:39:53.360807 4746 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/c4d15639-62fb-41b7-a1d4-6f51f3af6d99-ovnkube-config\") pod \"ovnkube-node-8vmvh\" (UID: \"c4d15639-62fb-41b7-a1d4-6f51f3af6d99\") " pod="openshift-ovn-kubernetes/ovnkube-node-8vmvh" Jan 28 20:39:53 crc kubenswrapper[4746]: I0128 20:39:53.360865 4746 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/cdf26de0-b602-4bdf-b492-65b3b6b31434-os-release\") pod \"multus-qhpvf\" (UID: \"cdf26de0-b602-4bdf-b492-65b3b6b31434\") " pod="openshift-multus/multus-qhpvf" Jan 28 20:39:53 crc kubenswrapper[4746]: I0128 20:39:53.360900 4746 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"multus-socket-dir-parent\" (UniqueName: \"kubernetes.io/host-path/cdf26de0-b602-4bdf-b492-65b3b6b31434-multus-socket-dir-parent\") pod 
\"multus-qhpvf\" (UID: \"cdf26de0-b602-4bdf-b492-65b3b6b31434\") " pod="openshift-multus/multus-qhpvf" Jan 28 20:39:53 crc kubenswrapper[4746]: I0128 20:39:53.360923 4746 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/cdf26de0-b602-4bdf-b492-65b3b6b31434-host-run-netns\") pod \"multus-qhpvf\" (UID: \"cdf26de0-b602-4bdf-b492-65b3b6b31434\") " pod="openshift-multus/multus-qhpvf" Jan 28 20:39:53 crc kubenswrapper[4746]: E0128 20:39:53.360985 4746 secret.go:188] Couldn't get secret openshift-network-console/networking-console-plugin-cert: object "openshift-network-console"/"networking-console-plugin-cert" not registered Jan 28 20:39:53 crc kubenswrapper[4746]: E0128 20:39:53.361020 4746 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. No retries permitted until 2026-01-28 20:39:54.361012974 +0000 UTC m=+22.317199328 (durationBeforeRetry 1s). 
Error: MountVolume.SetUp failed for volume "networking-console-plugin-cert" (UniqueName: "kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin-cert" not registered Jan 28 20:39:53 crc kubenswrapper[4746]: I0128 20:39:53.361063 4746 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"multus-cni-dir\" (UniqueName: \"kubernetes.io/host-path/cdf26de0-b602-4bdf-b492-65b3b6b31434-multus-cni-dir\") pod \"multus-qhpvf\" (UID: \"cdf26de0-b602-4bdf-b492-65b3b6b31434\") " pod="openshift-multus/multus-qhpvf" Jan 28 20:39:53 crc kubenswrapper[4746]: I0128 20:39:53.361106 4746 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/c4d15639-62fb-41b7-a1d4-6f51f3af6d99-host-slash\") pod \"ovnkube-node-8vmvh\" (UID: \"c4d15639-62fb-41b7-a1d4-6f51f3af6d99\") " pod="openshift-ovn-kubernetes/ovnkube-node-8vmvh" Jan 28 20:39:53 crc kubenswrapper[4746]: I0128 20:39:53.361128 4746 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run-systemd\" (UniqueName: \"kubernetes.io/host-path/c4d15639-62fb-41b7-a1d4-6f51f3af6d99-run-systemd\") pod \"ovnkube-node-8vmvh\" (UID: \"c4d15639-62fb-41b7-a1d4-6f51f3af6d99\") " pod="openshift-ovn-kubernetes/ovnkube-node-8vmvh" Jan 28 20:39:53 crc kubenswrapper[4746]: I0128 20:39:53.361147 4746 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/cdf26de0-b602-4bdf-b492-65b3b6b31434-host-var-lib-kubelet\") pod \"multus-qhpvf\" (UID: \"cdf26de0-b602-4bdf-b492-65b3b6b31434\") " pod="openshift-multus/multus-qhpvf" Jan 28 20:39:53 crc kubenswrapper[4746]: I0128 20:39:53.361170 4746 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"multus-conf-dir\" (UniqueName: 
\"kubernetes.io/host-path/cdf26de0-b602-4bdf-b492-65b3b6b31434-multus-conf-dir\") pod \"multus-qhpvf\" (UID: \"cdf26de0-b602-4bdf-b492-65b3b6b31434\") " pod="openshift-multus/multus-qhpvf" Jan 28 20:39:53 crc kubenswrapper[4746]: I0128 20:39:53.361398 4746 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/beb2b795-6bf4-4d38-89f7-bcb5512c3e61-system-cni-dir\") pod \"multus-additional-cni-plugins-ht6hp\" (UID: \"beb2b795-6bf4-4d38-89f7-bcb5512c3e61\") " pod="openshift-multus/multus-additional-cni-plugins-ht6hp" Jan 28 20:39:53 crc kubenswrapper[4746]: I0128 20:39:53.361438 4746 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/cdf26de0-b602-4bdf-b492-65b3b6b31434-system-cni-dir\") pod \"multus-qhpvf\" (UID: \"cdf26de0-b602-4bdf-b492-65b3b6b31434\") " pod="openshift-multus/multus-qhpvf" Jan 28 20:39:53 crc kubenswrapper[4746]: E0128 20:39:53.361479 4746 configmap.go:193] Couldn't get configMap openshift-network-console/networking-console-plugin: object "openshift-network-console"/"networking-console-plugin" not registered Jan 28 20:39:53 crc kubenswrapper[4746]: E0128 20:39:53.361507 4746 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. No retries permitted until 2026-01-28 20:39:54.361498247 +0000 UTC m=+22.317684601 (durationBeforeRetry 1s). 
Error: MountVolume.SetUp failed for volume "nginx-conf" (UniqueName: "kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin" not registered Jan 28 20:39:53 crc kubenswrapper[4746]: I0128 20:39:53.361638 4746 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/c4d15639-62fb-41b7-a1d4-6f51f3af6d99-host-run-netns\") pod \"ovnkube-node-8vmvh\" (UID: \"c4d15639-62fb-41b7-a1d4-6f51f3af6d99\") " pod="openshift-ovn-kubernetes/ovnkube-node-8vmvh" Jan 28 20:39:53 crc kubenswrapper[4746]: I0128 20:39:53.361765 4746 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-openvswitch\" (UniqueName: \"kubernetes.io/host-path/c4d15639-62fb-41b7-a1d4-6f51f3af6d99-etc-openvswitch\") pod \"ovnkube-node-8vmvh\" (UID: \"c4d15639-62fb-41b7-a1d4-6f51f3af6d99\") " pod="openshift-ovn-kubernetes/ovnkube-node-8vmvh" Jan 28 20:39:53 crc kubenswrapper[4746]: I0128 20:39:53.362400 4746 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cni-sysctl-allowlist\" (UniqueName: \"kubernetes.io/configmap/beb2b795-6bf4-4d38-89f7-bcb5512c3e61-cni-sysctl-allowlist\") pod \"multus-additional-cni-plugins-ht6hp\" (UID: \"beb2b795-6bf4-4d38-89f7-bcb5512c3e61\") " pod="openshift-multus/multus-additional-cni-plugins-ht6hp" Jan 28 20:39:53 crc kubenswrapper[4746]: I0128 20:39:53.364603 4746 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-28T20:39:52Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-28T20:39:52Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Jan 28 20:39:53 crc kubenswrapper[4746]: I0128 20:39:53.364771 4746 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-kubernetes\" (UniqueName: \"kubernetes.io/host-path/cdf26de0-b602-4bdf-b492-65b3b6b31434-etc-kubernetes\") pod \"multus-qhpvf\" (UID: \"cdf26de0-b602-4bdf-b492-65b3b6b31434\") " pod="openshift-multus/multus-qhpvf" Jan 28 20:39:53 crc kubenswrapper[4746]: I0128 20:39:53.364894 4746 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"node-log\" (UniqueName: \"kubernetes.io/host-path/c4d15639-62fb-41b7-a1d4-6f51f3af6d99-node-log\") pod \"ovnkube-node-8vmvh\" (UID: \"c4d15639-62fb-41b7-a1d4-6f51f3af6d99\") " pod="openshift-ovn-kubernetes/ovnkube-node-8vmvh" Jan 28 20:39:53 crc kubenswrapper[4746]: I0128 20:39:53.365238 4746 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/cdf26de0-b602-4bdf-b492-65b3b6b31434-cnibin\") pod \"multus-qhpvf\" (UID: \"cdf26de0-b602-4bdf-b492-65b3b6b31434\") " pod="openshift-multus/multus-qhpvf" Jan 28 20:39:53 crc kubenswrapper[4746]: I0128 20:39:53.365284 4746 operation_generator.go:637] 
"MountVolume.SetUp succeeded for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/beb2b795-6bf4-4d38-89f7-bcb5512c3e61-os-release\") pod \"multus-additional-cni-plugins-ht6hp\" (UID: \"beb2b795-6bf4-4d38-89f7-bcb5512c3e61\") " pod="openshift-multus/multus-additional-cni-plugins-ht6hp" Jan 28 20:39:53 crc kubenswrapper[4746]: I0128 20:39:53.365312 4746 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/beb2b795-6bf4-4d38-89f7-bcb5512c3e61-cnibin\") pod \"multus-additional-cni-plugins-ht6hp\" (UID: \"beb2b795-6bf4-4d38-89f7-bcb5512c3e61\") " pod="openshift-multus/multus-additional-cni-plugins-ht6hp" Jan 28 20:39:53 crc kubenswrapper[4746]: I0128 20:39:53.365746 4746 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/beb2b795-6bf4-4d38-89f7-bcb5512c3e61-cni-binary-copy\") pod \"multus-additional-cni-plugins-ht6hp\" (UID: \"beb2b795-6bf4-4d38-89f7-bcb5512c3e61\") " pod="openshift-multus/multus-additional-cni-plugins-ht6hp" Jan 28 20:39:53 crc kubenswrapper[4746]: I0128 20:39:53.365784 4746 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-lib-openvswitch\" (UniqueName: \"kubernetes.io/host-path/c4d15639-62fb-41b7-a1d4-6f51f3af6d99-var-lib-openvswitch\") pod \"ovnkube-node-8vmvh\" (UID: \"c4d15639-62fb-41b7-a1d4-6f51f3af6d99\") " pod="openshift-ovn-kubernetes/ovnkube-node-8vmvh" Jan 28 20:39:53 crc kubenswrapper[4746]: I0128 20:39:53.365808 4746 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"log-socket\" (UniqueName: \"kubernetes.io/host-path/c4d15639-62fb-41b7-a1d4-6f51f3af6d99-log-socket\") pod \"ovnkube-node-8vmvh\" (UID: \"c4d15639-62fb-41b7-a1d4-6f51f3af6d99\") " pod="openshift-ovn-kubernetes/ovnkube-node-8vmvh" Jan 28 20:39:53 crc kubenswrapper[4746]: I0128 20:39:53.365841 4746 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run-ovn\" 
(UniqueName: \"kubernetes.io/host-path/c4d15639-62fb-41b7-a1d4-6f51f3af6d99-run-ovn\") pod \"ovnkube-node-8vmvh\" (UID: \"c4d15639-62fb-41b7-a1d4-6f51f3af6d99\") " pod="openshift-ovn-kubernetes/ovnkube-node-8vmvh" Jan 28 20:39:53 crc kubenswrapper[4746]: I0128 20:39:53.366823 4746 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-var-lib-cni-multus\" (UniqueName: \"kubernetes.io/host-path/cdf26de0-b602-4bdf-b492-65b3b6b31434-host-var-lib-cni-multus\") pod \"multus-qhpvf\" (UID: \"cdf26de0-b602-4bdf-b492-65b3b6b31434\") " pod="openshift-multus/multus-qhpvf" Jan 28 20:39:53 crc kubenswrapper[4746]: I0128 20:39:53.366904 4746 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-cni-bin\" (UniqueName: \"kubernetes.io/host-path/c4d15639-62fb-41b7-a1d4-6f51f3af6d99-host-cni-bin\") pod \"ovnkube-node-8vmvh\" (UID: \"c4d15639-62fb-41b7-a1d4-6f51f3af6d99\") " pod="openshift-ovn-kubernetes/ovnkube-node-8vmvh" Jan 28 20:39:53 crc kubenswrapper[4746]: I0128 20:39:53.367005 4746 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovnkube-script-lib\" (UniqueName: \"kubernetes.io/configmap/c4d15639-62fb-41b7-a1d4-6f51f3af6d99-ovnkube-script-lib\") pod \"ovnkube-node-8vmvh\" (UID: \"c4d15639-62fb-41b7-a1d4-6f51f3af6d99\") " pod="openshift-ovn-kubernetes/ovnkube-node-8vmvh" Jan 28 20:39:53 crc kubenswrapper[4746]: I0128 20:39:53.367183 4746 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-var-lib-cni-bin\" (UniqueName: \"kubernetes.io/host-path/cdf26de0-b602-4bdf-b492-65b3b6b31434-host-var-lib-cni-bin\") pod \"multus-qhpvf\" (UID: \"cdf26de0-b602-4bdf-b492-65b3b6b31434\") " pod="openshift-multus/multus-qhpvf" Jan 28 20:39:53 crc kubenswrapper[4746]: I0128 20:39:53.367232 4746 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-cni-netd\" (UniqueName: \"kubernetes.io/host-path/c4d15639-62fb-41b7-a1d4-6f51f3af6d99-host-cni-netd\") pod \"ovnkube-node-8vmvh\" (UID: 
\"c4d15639-62fb-41b7-a1d4-6f51f3af6d99\") " pod="openshift-ovn-kubernetes/ovnkube-node-8vmvh" Jan 28 20:39:53 crc kubenswrapper[4746]: I0128 20:39:53.367268 4746 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-run-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/c4d15639-62fb-41b7-a1d4-6f51f3af6d99-host-run-ovn-kubernetes\") pod \"ovnkube-node-8vmvh\" (UID: \"c4d15639-62fb-41b7-a1d4-6f51f3af6d99\") " pod="openshift-ovn-kubernetes/ovnkube-node-8vmvh" Jan 28 20:39:53 crc kubenswrapper[4746]: I0128 20:39:53.367471 4746 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovn-node-metrics-cert\" (UniqueName: \"kubernetes.io/secret/c4d15639-62fb-41b7-a1d4-6f51f3af6d99-ovn-node-metrics-cert\") pod \"ovnkube-node-8vmvh\" (UID: \"c4d15639-62fb-41b7-a1d4-6f51f3af6d99\") " pod="openshift-ovn-kubernetes/ovnkube-node-8vmvh" Jan 28 20:39:53 crc kubenswrapper[4746]: I0128 20:39:53.367514 4746 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"hostroot\" (UniqueName: \"kubernetes.io/host-path/cdf26de0-b602-4bdf-b492-65b3b6b31434-hostroot\") pod \"multus-qhpvf\" (UID: \"cdf26de0-b602-4bdf-b492-65b3b6b31434\") " pod="openshift-multus/multus-qhpvf" Jan 28 20:39:53 crc kubenswrapper[4746]: I0128 20:39:53.367539 4746 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-var-lib-cni-networks-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/c4d15639-62fb-41b7-a1d4-6f51f3af6d99-host-var-lib-cni-networks-ovn-kubernetes\") pod \"ovnkube-node-8vmvh\" (UID: \"c4d15639-62fb-41b7-a1d4-6f51f3af6d99\") " pod="openshift-ovn-kubernetes/ovnkube-node-8vmvh" Jan 28 20:39:53 crc kubenswrapper[4746]: I0128 20:39:53.367667 4746 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rootfs\" (UniqueName: \"kubernetes.io/host-path/6dc8b546-9734-4082-b2b3-2bafe3f1564d-rootfs\") pod \"machine-config-daemon-6wrnw\" (UID: \"6dc8b546-9734-4082-b2b3-2bafe3f1564d\") " 
pod="openshift-machine-config-operator/machine-config-daemon-6wrnw" Jan 28 20:39:53 crc kubenswrapper[4746]: I0128 20:39:53.367694 4746 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-run-multus-certs\" (UniqueName: \"kubernetes.io/host-path/cdf26de0-b602-4bdf-b492-65b3b6b31434-host-run-multus-certs\") pod \"multus-qhpvf\" (UID: \"cdf26de0-b602-4bdf-b492-65b3b6b31434\") " pod="openshift-multus/multus-qhpvf" Jan 28 20:39:53 crc kubenswrapper[4746]: I0128 20:39:53.367770 4746 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"mcd-auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/6dc8b546-9734-4082-b2b3-2bafe3f1564d-mcd-auth-proxy-config\") pod \"machine-config-daemon-6wrnw\" (UID: \"6dc8b546-9734-4082-b2b3-2bafe3f1564d\") " pod="openshift-machine-config-operator/machine-config-daemon-6wrnw" Jan 28 20:39:53 crc kubenswrapper[4746]: I0128 20:39:53.367870 4746 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-kubelet\" (UniqueName: \"kubernetes.io/host-path/c4d15639-62fb-41b7-a1d4-6f51f3af6d99-host-kubelet\") pod \"ovnkube-node-8vmvh\" (UID: \"c4d15639-62fb-41b7-a1d4-6f51f3af6d99\") " pod="openshift-ovn-kubernetes/ovnkube-node-8vmvh" Jan 28 20:39:53 crc kubenswrapper[4746]: E0128 20:39:53.367876 4746 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Jan 28 20:39:53 crc kubenswrapper[4746]: E0128 20:39:53.367895 4746 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Jan 28 20:39:53 crc kubenswrapper[4746]: E0128 20:39:53.367905 4746 projected.go:194] Error preparing data for projected volume kube-api-access-cqllr for pod openshift-network-diagnostics/network-check-target-xd92c: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, 
object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Jan 28 20:39:53 crc kubenswrapper[4746]: E0128 20:39:53.367947 4746 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr podName:3b6479f0-333b-4a96-9adf-2099afdc2447 nodeName:}" failed. No retries permitted until 2026-01-28 20:39:54.367933799 +0000 UTC m=+22.324120153 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "kube-api-access-cqllr" (UniqueName: "kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr") pod "network-check-target-xd92c" (UID: "3b6479f0-333b-4a96-9adf-2099afdc2447") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Jan 28 20:39:53 crc kubenswrapper[4746]: I0128 20:39:53.369048 4746 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"tuning-conf-dir\" (UniqueName: \"kubernetes.io/host-path/beb2b795-6bf4-4d38-89f7-bcb5512c3e61-tuning-conf-dir\") pod \"multus-additional-cni-plugins-ht6hp\" (UID: \"beb2b795-6bf4-4d38-89f7-bcb5512c3e61\") " pod="openshift-multus/multus-additional-cni-plugins-ht6hp" Jan 28 20:39:53 crc kubenswrapper[4746]: I0128 20:39:53.369264 4746 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/cdf26de0-b602-4bdf-b492-65b3b6b31434-cni-binary-copy\") pod \"multus-qhpvf\" (UID: \"cdf26de0-b602-4bdf-b492-65b3b6b31434\") " pod="openshift-multus/multus-qhpvf" Jan 28 20:39:53 crc kubenswrapper[4746]: I0128 20:39:53.369349 4746 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-run-k8s-cni-cncf-io\" (UniqueName: \"kubernetes.io/host-path/cdf26de0-b602-4bdf-b492-65b3b6b31434-host-run-k8s-cni-cncf-io\") pod \"multus-qhpvf\" (UID: \"cdf26de0-b602-4bdf-b492-65b3b6b31434\") " 
pod="openshift-multus/multus-qhpvf" Jan 28 20:39:53 crc kubenswrapper[4746]: I0128 20:39:53.369478 4746 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"multus-daemon-config\" (UniqueName: \"kubernetes.io/configmap/cdf26de0-b602-4bdf-b492-65b3b6b31434-multus-daemon-config\") pod \"multus-qhpvf\" (UID: \"cdf26de0-b602-4bdf-b492-65b3b6b31434\") " pod="openshift-multus/multus-qhpvf" Jan 28 20:39:53 crc kubenswrapper[4746]: I0128 20:39:53.370156 4746 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/c4d15639-62fb-41b7-a1d4-6f51f3af6d99-env-overrides\") pod \"ovnkube-node-8vmvh\" (UID: \"c4d15639-62fb-41b7-a1d4-6f51f3af6d99\") " pod="openshift-ovn-kubernetes/ovnkube-node-8vmvh" Jan 28 20:39:53 crc kubenswrapper[4746]: I0128 20:39:53.370516 4746 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/6dc8b546-9734-4082-b2b3-2bafe3f1564d-proxy-tls\") pod \"machine-config-daemon-6wrnw\" (UID: \"6dc8b546-9734-4082-b2b3-2bafe3f1564d\") " pod="openshift-machine-config-operator/machine-config-daemon-6wrnw" Jan 28 20:39:53 crc kubenswrapper[4746]: I0128 20:39:53.383295 4746 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-28T20:39:52Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: 
[networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-28T20:39:52Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Jan 28 20:39:53 crc kubenswrapper[4746]: I0128 20:39:53.392714 4746 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-rr62b\" (UniqueName: \"kubernetes.io/projected/beb2b795-6bf4-4d38-89f7-bcb5512c3e61-kube-api-access-rr62b\") pod \"multus-additional-cni-plugins-ht6hp\" (UID: 
\"beb2b795-6bf4-4d38-89f7-bcb5512c3e61\") " pod="openshift-multus/multus-additional-cni-plugins-ht6hp" Jan 28 20:39:53 crc kubenswrapper[4746]: I0128 20:39:53.396357 4746 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-kgrqs\" (UniqueName: \"kubernetes.io/projected/c4d15639-62fb-41b7-a1d4-6f51f3af6d99-kube-api-access-kgrqs\") pod \"ovnkube-node-8vmvh\" (UID: \"c4d15639-62fb-41b7-a1d4-6f51f3af6d99\") " pod="openshift-ovn-kubernetes/ovnkube-node-8vmvh" Jan 28 20:39:53 crc kubenswrapper[4746]: I0128 20:39:53.396751 4746 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-z84f9\" (UniqueName: \"kubernetes.io/projected/6dc8b546-9734-4082-b2b3-2bafe3f1564d-kube-api-access-z84f9\") pod \"machine-config-daemon-6wrnw\" (UID: \"6dc8b546-9734-4082-b2b3-2bafe3f1564d\") " pod="openshift-machine-config-operator/machine-config-daemon-6wrnw" Jan 28 20:39:53 crc kubenswrapper[4746]: I0128 20:39:53.397720 4746 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-jnpsl\" (UniqueName: \"kubernetes.io/projected/cdf26de0-b602-4bdf-b492-65b3b6b31434-kube-api-access-jnpsl\") pod \"multus-qhpvf\" (UID: \"cdf26de0-b602-4bdf-b492-65b3b6b31434\") " pod="openshift-multus/multus-qhpvf" Jan 28 20:39:53 crc kubenswrapper[4746]: I0128 20:39:53.401351 4746 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-28T20:39:52Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [webhook approver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-28T20:39:52Z\\\",\\\"message\\\":\\\"containers with unready status: [webhook approver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":false,\\\"restartCount\\\":6,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: 
connect: connection refused" Jan 28 20:39:53 crc kubenswrapper[4746]: I0128 20:39:53.422120 4746 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-gcrxx" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"1c3c93a1-7ddf-4339-9ca3-79f3753943b4\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-28T20:39:52Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-28T20:39:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-28T20:39:52Z\\\",\\\"message\\\":\\\"containers with unready status: [dns-node-resolver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-28T20:39:52Z\\\",\\\"message\\\":\\\"containers with unready status: 
[dns-node-resolver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wn6hr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-28T20:39:52Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-gcrxx\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Jan 28 20:39:53 crc kubenswrapper[4746]: I0128 20:39:53.449170 4746 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-8vmvh" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"c4d15639-62fb-41b7-a1d4-6f51f3af6d99\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-28T20:39:53Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-28T20:39:53Z\\\",\\\"message\\\":\\\"containers with incomplete status: [kubecfg-setup]\\\",\\\"reason\\\":\\\"ContainersNotInitialized\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-28T20:39:53Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-28T20:39:53Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb 
ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-kgrqs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-kgrqs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\
\":\\\"nbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-kgrqs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-kgrqs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\
\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-kgrqs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-kgrqs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"wa
iting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-kgrqs\\\",\\\"read
Only\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-kgrqs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-kgrqs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-28T20:39:53Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-8vmvh\": Internal error occurred: failed calling webhook 
\"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Jan 28 20:39:53 crc kubenswrapper[4746]: I0128 20:39:53.461326 4746 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-s2dwl\" (UniqueName: \"kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl\") pod \"network-check-source-55646444c4-trplf\" (UID: \"9d751cbb-f2e2-430d-9754-c882a5e924a5\") " pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Jan 28 20:39:53 crc kubenswrapper[4746]: E0128 20:39:53.461515 4746 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Jan 28 20:39:53 crc kubenswrapper[4746]: E0128 20:39:53.461627 4746 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Jan 28 20:39:53 crc kubenswrapper[4746]: E0128 20:39:53.461642 4746 projected.go:194] Error preparing data for projected volume kube-api-access-s2dwl for pod openshift-network-diagnostics/network-check-source-55646444c4-trplf: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Jan 28 20:39:53 crc kubenswrapper[4746]: E0128 20:39:53.461697 4746 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl podName:9d751cbb-f2e2-430d-9754-c882a5e924a5 nodeName:}" failed. No retries permitted until 2026-01-28 20:39:54.461679347 +0000 UTC m=+22.417865691 (durationBeforeRetry 1s). 
Error: MountVolume.SetUp failed for volume "kube-api-access-s2dwl" (UniqueName: "kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl") pod "network-check-source-55646444c4-trplf" (UID: "9d751cbb-f2e2-430d-9754-c882a5e924a5") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Jan 28 20:39:53 crc kubenswrapper[4746]: I0128 20:39:53.472193 4746 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-28T20:39:52Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-operator]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-28T20:39:52Z\\\",\\\"message\\\":\\\"containers with unready status: [network-operator]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":false,\\\"restartCount\\\":5,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Jan 28 20:39:53 crc kubenswrapper[4746]: I0128 20:39:53.477374 4746 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-kube-controller-manager/kube-controller-manager-crc" Jan 28 20:39:53 crc kubenswrapper[4746]: I0128 20:39:53.483070 4746 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-kube-controller-manager/kube-controller-manager-crc" Jan 28 20:39:53 crc kubenswrapper[4746]: I0128 20:39:53.486773 4746 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-28T20:39:52Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-28T20:39:52Z\\\",\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Jan 28 20:39:53 crc kubenswrapper[4746]: I0128 20:39:53.489028 4746 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-kube-controller-manager/kube-controller-manager-crc"] Jan 28 20:39:53 crc kubenswrapper[4746]: I0128 20:39:53.498357 4746 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-6wrnw" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"6dc8b546-9734-4082-b2b3-2bafe3f1564d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-28T20:39:53Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-28T20:39:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-28T20:39:53Z\\\",\\\"message\\\":\\\"containers with unready status: [machine-config-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-28T20:39:53Z\\\",\\\"message\\\":\\\"containers with unready status: [machine-config-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-z84f9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-z84f9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-28T20:39:53Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-6wrnw\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post 
\"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Jan 28 20:39:53 crc kubenswrapper[4746]: I0128 20:39:53.510511 4746 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-ht6hp" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"beb2b795-6bf4-4d38-89f7-bcb5512c3e61\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-28T20:39:53Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-28T20:39:53Z\\\",\\\"message\\\":\\\"containers with incomplete status: [egress-router-binary-copy cni-plugins bond-cni-plugin routeoverride-cni whereabouts-cni-bincopy whereabouts-cni]\\\",\\\"reason\\\":\\\"ContainersNotInitialized\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-28T20:39:53Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-28T20:39:53Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rr62b\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rr62b\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\"
:false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rr62b\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rr62b\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\
\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rr62b\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rr62b\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/s
erviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rr62b\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-28T20:39:53Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-ht6hp\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Jan 28 20:39:53 crc kubenswrapper[4746]: I0128 20:39:53.513931 4746 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ovn-kubernetes/ovnkube-node-8vmvh" Jan 28 20:39:53 crc kubenswrapper[4746]: I0128 20:39:53.522095 4746 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-machine-config-operator/machine-config-daemon-6wrnw" Jan 28 20:39:53 crc kubenswrapper[4746]: I0128 20:39:53.522051 4746 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-qhpvf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"cdf26de0-b602-4bdf-b492-65b3b6b31434\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-28T20:39:53Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-28T20:39:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-28T20:39:53Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-multus]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-28T20:39:53Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"
recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jnpsl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-28T20:39:53Z\\\"}}\" for pod \"openshift-multus\"/\"multus-qhpvf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Jan 28 20:39:53 crc kubenswrapper[4746]: W0128 20:39:53.526114 4746 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podc4d15639_62fb_41b7_a1d4_6f51f3af6d99.slice/crio-898c5dcede9f2cde269997afb517c22e410533983244db06d196bc12f2c22793 WatchSource:0}: Error finding container 898c5dcede9f2cde269997afb517c22e410533983244db06d196bc12f2c22793: Status 404 returned error can't find the container with id 898c5dcede9f2cde269997afb517c22e410533983244db06d196bc12f2c22793 Jan 28 20:39:53 crc kubenswrapper[4746]: I0128 20:39:53.532481 4746 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-28T20:39:52Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-28T20:39:52Z\\\",\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Jan 28 20:39:53 crc kubenswrapper[4746]: I0128 20:39:53.535828 4746 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-multus/multus-additional-cni-plugins-ht6hp" Jan 28 20:39:53 crc kubenswrapper[4746]: W0128 20:39:53.538570 4746 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod6dc8b546_9734_4082_b2b3_2bafe3f1564d.slice/crio-f24f7b9a6a114372a23d74f6ea2a656eb30cc225c1a6239bae2528a33fc2b3d2 WatchSource:0}: Error finding container f24f7b9a6a114372a23d74f6ea2a656eb30cc225c1a6239bae2528a33fc2b3d2: Status 404 returned error can't find the container with id f24f7b9a6a114372a23d74f6ea2a656eb30cc225c1a6239bae2528a33fc2b3d2 Jan 28 20:39:53 crc kubenswrapper[4746]: I0128 20:39:53.545916 4746 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-6wrnw" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"6dc8b546-9734-4082-b2b3-2bafe3f1564d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-28T20:39:53Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-28T20:39:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-28T20:39:53Z\\\",\\\"message\\\":\\\"containers with unready status: [machine-config-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-28T20:39:53Z\\\",\\\"message\\\":\\\"containers with unready status: [machine-config-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-z84f9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-z84f9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-28T20:39:53Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-6wrnw\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post 
\"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Jan 28 20:39:53 crc kubenswrapper[4746]: I0128 20:39:53.549877 4746 certificate_manager.go:356] kubernetes.io/kube-apiserver-client-kubelet: Certificate expiration is 2027-01-28 20:34:52 +0000 UTC, rotation deadline is 2026-11-26 19:10:05.026273359 +0000 UTC Jan 28 20:39:53 crc kubenswrapper[4746]: I0128 20:39:53.549949 4746 certificate_manager.go:356] kubernetes.io/kube-apiserver-client-kubelet: Waiting 7246h30m11.476327459s for next certificate rotation Jan 28 20:39:53 crc kubenswrapper[4746]: W0128 20:39:53.551142 4746 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podbeb2b795_6bf4_4d38_89f7_bcb5512c3e61.slice/crio-95691ec05d99caa03223f621722b4eccf9d6d2ca54e85bc2f5a35fd3e298a8c5 WatchSource:0}: Error finding container 95691ec05d99caa03223f621722b4eccf9d6d2ca54e85bc2f5a35fd3e298a8c5: Status 404 returned error can't find the container with id 95691ec05d99caa03223f621722b4eccf9d6d2ca54e85bc2f5a35fd3e298a8c5 Jan 28 20:39:53 crc kubenswrapper[4746]: I0128 20:39:53.559065 4746 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-ht6hp" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"beb2b795-6bf4-4d38-89f7-bcb5512c3e61\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-28T20:39:53Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-28T20:39:53Z\\\",\\\"message\\\":\\\"containers with incomplete status: [egress-router-binary-copy cni-plugins 
bond-cni-plugin routeoverride-cni whereabouts-cni-bincopy whereabouts-cni]\\\",\\\"reason\\\":\\\"ContainersNotInitialized\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-28T20:39:53Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-28T20:39:53Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rr62b\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni
/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rr62b\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rr62b\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-
release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rr62b\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rr62b\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rr62b\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"
}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rr62b\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-28T20:39:53Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-ht6hp\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Jan 28 20:39:53 crc kubenswrapper[4746]: I0128 20:39:53.562113 4746 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-multus/multus-qhpvf" Jan 28 20:39:53 crc kubenswrapper[4746]: I0128 20:39:53.578401 4746 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-qhpvf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"cdf26de0-b602-4bdf-b492-65b3b6b31434\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-28T20:39:53Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-28T20:39:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-28T20:39:53Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-28T20:39:53Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-multus]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/
kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jnpsl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-28T20:39:53Z\\\"}}\" for pod \"openshift-multus\"/\"multus-qhpvf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Jan 28 20:39:53 crc kubenswrapper[4746]: I0128 20:39:53.589927 4746 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-28T20:39:52Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-operator]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-28T20:39:52Z\\\",\\\"message\\\":\\\"containers with unready status: 
[network-operator]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":false,\\\"restartCount\\\":5,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Jan 28 20:39:53 crc kubenswrapper[4746]: W0128 20:39:53.592182 4746 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podcdf26de0_b602_4bdf_b492_65b3b6b31434.slice/crio-82b432ed3d1be70f79fa21b108de8a1e1772c0cbb6414d9394d1525366d096e1 WatchSource:0}: Error finding container 82b432ed3d1be70f79fa21b108de8a1e1772c0cbb6414d9394d1525366d096e1: Status 404 returned error can't find the container with id 
82b432ed3d1be70f79fa21b108de8a1e1772c0cbb6414d9394d1525366d096e1 Jan 28 20:39:53 crc kubenswrapper[4746]: I0128 20:39:53.603889 4746 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"950d71b3-9b51-43dc-814a-d6a838723f78\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-28T20:39:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-28T20:39:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-28T20:39:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-28T20:39:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-28T20:39:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0411ecf7c85f5397d751adb7d4905977dcbd3d2e169c56ebbe26017192765602\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-28T20:39:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID
\\\":\\\"cri-o://319cbcaa9981b3c79eb0c8f970f0ad776fb255db10c72c9b6a13469906e9e5ec\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-28T20:39:33Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://eefda7da35cd9ff176512b5d1224ccadcb5debd0035a49822e82a9757d4a3f91\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-28T20:39:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://145f4f882924f06172df81d94fc9bd683ea1fc04889de5f6cfb5caece8227fd0\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller
-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-28T20:39:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-28T20:39:32Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Jan 28 20:39:53 crc kubenswrapper[4746]: I0128 20:39:53.618859 4746 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-28T20:39:52Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: 
[check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-28T20:39:52Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Jan 28 20:39:53 crc kubenswrapper[4746]: I0128 20:39:53.631148 4746 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-28T20:39:52Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-28T20:39:52Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Jan 28 20:39:53 crc kubenswrapper[4746]: I0128 20:39:53.642698 4746 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-28T20:39:52Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [webhook approver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-28T20:39:52Z\\\",\\\"message\\\":\\\"containers with unready status: [webhook 
approver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":false,\\\"restartCount\\\":6,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Jan 28 20:39:53 crc kubenswrapper[4746]: I0128 20:39:53.663258 4746 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-gcrxx" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"1c3c93a1-7ddf-4339-9ca3-79f3753943b4\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-28T20:39:52Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-28T20:39:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-28T20:39:52Z\\\",\\\"message\\\":\\\"containers with 
unready status: [dns-node-resolver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-28T20:39:52Z\\\",\\\"message\\\":\\\"containers with unready status: [dns-node-resolver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wn6hr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-28T20:39:52Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-gcrxx\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Jan 28 20:39:53 crc kubenswrapper[4746]: I0128 20:39:53.678941 4746 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-8vmvh" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"c4d15639-62fb-41b7-a1d4-6f51f3af6d99\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-28T20:39:53Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-28T20:39:53Z\\\",\\\"message\\\":\\\"containers with incomplete status: [kubecfg-setup]\\\",\\\"reason\\\":\\\"ContainersNotInitialized\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-28T20:39:53Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-28T20:39:53Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb 
ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-kgrqs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-kgrqs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\
\":\\\"nbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-kgrqs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-kgrqs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\
\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-kgrqs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-kgrqs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"wa
iting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-kgrqs\\\",\\\"read
Only\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-kgrqs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-kgrqs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-28T20:39:53Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-8vmvh\": Internal error occurred: failed calling webhook 
\"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Jan 28 20:39:53 crc kubenswrapper[4746]: I0128 20:39:53.688317 4746 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-28T20:39:52Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-28T20:39:52Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Jan 28 20:39:53 crc kubenswrapper[4746]: I0128 20:39:53.790153 4746 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2025-12-09 12:15:49.420713394 +0000 UTC Jan 28 20:39:53 crc kubenswrapper[4746]: I0128 20:39:53.835734 4746 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Jan 28 20:39:53 crc kubenswrapper[4746]: E0128 20:39:53.835874 4746 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Jan 28 20:39:53 crc kubenswrapper[4746]: I0128 20:39:53.986061 4746 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-dns/node-resolver-gcrxx" event={"ID":"1c3c93a1-7ddf-4339-9ca3-79f3753943b4","Type":"ContainerStarted","Data":"5bd8d8214aa8e85e50747643b6354fb6e026866c926310fb2dd79760f916f468"} Jan 28 20:39:53 crc kubenswrapper[4746]: I0128 20:39:53.986147 4746 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-dns/node-resolver-gcrxx" event={"ID":"1c3c93a1-7ddf-4339-9ca3-79f3753943b4","Type":"ContainerStarted","Data":"4f5886871715cde4096e74fe0567fb04221d86b4d7f1f5babc93cfa0682448df"} Jan 28 20:39:53 crc kubenswrapper[4746]: I0128 20:39:53.987597 4746 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" event={"ID":"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49","Type":"ContainerStarted","Data":"ef34a9b9c01078fa799a0029b6794bc82c11f8b695f8d7c3706f5c5aa9eebe40"} Jan 28 20:39:53 crc kubenswrapper[4746]: I0128 20:39:53.989486 4746 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-apiserver_kube-apiserver-crc_f4b27818a5e8e43d0dc095d08835c792/kube-apiserver-check-endpoints/1.log" Jan 28 20:39:53 crc kubenswrapper[4746]: I0128 20:39:53.989991 4746 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-apiserver_kube-apiserver-crc_f4b27818a5e8e43d0dc095d08835c792/kube-apiserver-check-endpoints/0.log" Jan 28 20:39:53 crc kubenswrapper[4746]: I0128 20:39:53.991737 4746 generic.go:334] "Generic (PLEG): container finished" podID="f4b27818a5e8e43d0dc095d08835c792" containerID="56b250a192cfa698ccd7ac8233b75b9acbf2347549c712b47ccc1b89f144aa83" exitCode=255 Jan 28 20:39:53 crc kubenswrapper[4746]: I0128 20:39:53.991792 4746 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" 
event={"ID":"f4b27818a5e8e43d0dc095d08835c792","Type":"ContainerDied","Data":"56b250a192cfa698ccd7ac8233b75b9acbf2347549c712b47ccc1b89f144aa83"} Jan 28 20:39:53 crc kubenswrapper[4746]: I0128 20:39:53.991844 4746 scope.go:117] "RemoveContainer" containerID="2bd133218655e67cdc29b14c35357368452991c316986b1fa90069554eba28c0" Jan 28 20:39:53 crc kubenswrapper[4746]: I0128 20:39:53.993730 4746 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-6wrnw" event={"ID":"6dc8b546-9734-4082-b2b3-2bafe3f1564d","Type":"ContainerStarted","Data":"50ec1ea913ec63a096032bd968bda5c5c8879e16a08e6a57727750517d9af038"} Jan 28 20:39:53 crc kubenswrapper[4746]: I0128 20:39:53.993779 4746 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-6wrnw" event={"ID":"6dc8b546-9734-4082-b2b3-2bafe3f1564d","Type":"ContainerStarted","Data":"66d250466155027405bf90c4e5ed09388238d4cee63604e28486a16778f9d188"} Jan 28 20:39:53 crc kubenswrapper[4746]: I0128 20:39:53.993794 4746 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-6wrnw" event={"ID":"6dc8b546-9734-4082-b2b3-2bafe3f1564d","Type":"ContainerStarted","Data":"f24f7b9a6a114372a23d74f6ea2a656eb30cc225c1a6239bae2528a33fc2b3d2"} Jan 28 20:39:53 crc kubenswrapper[4746]: I0128 20:39:53.994847 4746 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-qhpvf" event={"ID":"cdf26de0-b602-4bdf-b492-65b3b6b31434","Type":"ContainerStarted","Data":"f80345dfd4de46e278611370e6c337968cb1ba7ed8275457deea5871704f8367"} Jan 28 20:39:53 crc kubenswrapper[4746]: I0128 20:39:53.994879 4746 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-qhpvf" event={"ID":"cdf26de0-b602-4bdf-b492-65b3b6b31434","Type":"ContainerStarted","Data":"82b432ed3d1be70f79fa21b108de8a1e1772c0cbb6414d9394d1525366d096e1"} Jan 28 20:39:53 crc kubenswrapper[4746]: I0128 
20:39:53.995867 4746 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-ht6hp" event={"ID":"beb2b795-6bf4-4d38-89f7-bcb5512c3e61","Type":"ContainerStarted","Data":"b2d8bf207108ac2304cd04e34cd9843fcb755ac8610a3f88448538d1e99359f6"} Jan 28 20:39:53 crc kubenswrapper[4746]: I0128 20:39:53.995905 4746 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-ht6hp" event={"ID":"beb2b795-6bf4-4d38-89f7-bcb5512c3e61","Type":"ContainerStarted","Data":"95691ec05d99caa03223f621722b4eccf9d6d2ca54e85bc2f5a35fd3e298a8c5"} Jan 28 20:39:53 crc kubenswrapper[4746]: I0128 20:39:53.996831 4746 generic.go:334] "Generic (PLEG): container finished" podID="c4d15639-62fb-41b7-a1d4-6f51f3af6d99" containerID="45e59dfe364d511e17b8264a31157501de3ed23e7125d79979df83d328242328" exitCode=0 Jan 28 20:39:53 crc kubenswrapper[4746]: I0128 20:39:53.996891 4746 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-8vmvh" event={"ID":"c4d15639-62fb-41b7-a1d4-6f51f3af6d99","Type":"ContainerDied","Data":"45e59dfe364d511e17b8264a31157501de3ed23e7125d79979df83d328242328"} Jan 28 20:39:53 crc kubenswrapper[4746]: I0128 20:39:53.996913 4746 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-8vmvh" event={"ID":"c4d15639-62fb-41b7-a1d4-6f51f3af6d99","Type":"ContainerStarted","Data":"898c5dcede9f2cde269997afb517c22e410533983244db06d196bc12f2c22793"} Jan 28 20:39:53 crc kubenswrapper[4746]: I0128 20:39:53.998493 4746 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" event={"ID":"ef543e1b-8068-4ea3-b32a-61027b32e95d","Type":"ContainerStarted","Data":"5dff3e2075b8bbfb6af77c72362ccc37e1d7810a2d97c096cf501681c98699ab"} Jan 28 20:39:53 crc kubenswrapper[4746]: I0128 20:39:53.998565 4746 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openshift-network-node-identity/network-node-identity-vrzqb" event={"ID":"ef543e1b-8068-4ea3-b32a-61027b32e95d","Type":"ContainerStarted","Data":"772e165efa6d865a669c172958e0aea3112146637e354e4587b7e664454112ad"} Jan 28 20:39:53 crc kubenswrapper[4746]: I0128 20:39:53.998580 4746 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" event={"ID":"ef543e1b-8068-4ea3-b32a-61027b32e95d","Type":"ContainerStarted","Data":"d313aa4b7bcc95255b93cc9a81c6a93f7f56d7d537d693df3b36724451a91689"} Jan 28 20:39:54 crc kubenswrapper[4746]: I0128 20:39:54.001031 4746 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" event={"ID":"37a5e44f-9a88-4405-be8a-b645485e7312","Type":"ContainerStarted","Data":"2f64131d62eca6c3a42e9177d14fa39ed6435e536f60ef5d25ff6bf061d2cc0a"} Jan 28 20:39:54 crc kubenswrapper[4746]: I0128 20:39:54.001075 4746 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" event={"ID":"37a5e44f-9a88-4405-be8a-b645485e7312","Type":"ContainerStarted","Data":"a06ff5fcc6e0d3350ea7ceab34e74677274182f56c86a58b740e6de6cc67bff9"} Jan 28 20:39:54 crc kubenswrapper[4746]: I0128 20:39:54.005413 4746 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-28T20:39:52Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: 
[network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-28T20:39:52Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Jan 28 20:39:54 crc kubenswrapper[4746]: I0128 20:39:54.016038 4746 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-28T20:39:52Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-28T20:39:52Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Jan 28 20:39:54 crc kubenswrapper[4746]: I0128 20:39:54.026638 4746 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-28T20:39:52Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-28T20:39:52Z\\\",\\\"message\\\":\\\"containers with unready status: 
[networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Jan 28 20:39:54 crc kubenswrapper[4746]: I0128 20:39:54.039003 4746 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-28T20:39:52Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [webhook approver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-28T20:39:52Z\\\",\\\"message\\\":\\\"containers with unready status: [webhook approver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":false,\\\"restartCount\\\":6,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: 
connect: connection refused" Jan 28 20:39:54 crc kubenswrapper[4746]: I0128 20:39:54.060314 4746 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-gcrxx" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"1c3c93a1-7ddf-4339-9ca3-79f3753943b4\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-28T20:39:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-28T20:39:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-28T20:39:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-28T20:39:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5bd8d8214aa8e85e50747643b6354fb6e026866c926310fb2dd79760f916f468\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-28T20:39:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wn6hr\\\",\\\"readOnly\\
\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-28T20:39:52Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-gcrxx\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Jan 28 20:39:54 crc kubenswrapper[4746]: I0128 20:39:54.061371 4746 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-kube-apiserver/kube-apiserver-crc"] Jan 28 20:39:54 crc kubenswrapper[4746]: I0128 20:39:54.066834 4746 scope.go:117] "RemoveContainer" containerID="56b250a192cfa698ccd7ac8233b75b9acbf2347549c712b47ccc1b89f144aa83" Jan 28 20:39:54 crc kubenswrapper[4746]: E0128 20:39:54.067070 4746 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"kube-apiserver-check-endpoints\" with CrashLoopBackOff: \"back-off 10s restarting failed container=kube-apiserver-check-endpoints pod=kube-apiserver-crc_openshift-kube-apiserver(f4b27818a5e8e43d0dc095d08835c792)\"" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="f4b27818a5e8e43d0dc095d08835c792" Jan 28 20:39:54 crc kubenswrapper[4746]: I0128 20:39:54.146626 4746 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-8vmvh" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"c4d15639-62fb-41b7-a1d4-6f51f3af6d99\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-28T20:39:53Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-28T20:39:53Z\\\",\\\"message\\\":\\\"containers with incomplete status: [kubecfg-setup]\\\",\\\"reason\\\":\\\"ContainersNotInitialized\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-28T20:39:53Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-28T20:39:53Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb 
ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-kgrqs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-kgrqs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\
\":\\\"nbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-kgrqs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-kgrqs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\
\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-kgrqs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-kgrqs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"wa
iting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-kgrqs\\\",\\\"read
Only\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-kgrqs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-kgrqs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-28T20:39:53Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-8vmvh\": Internal error occurred: failed calling webhook 
\"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-28T20:39:54Z is after 2025-08-24T17:21:41Z" Jan 28 20:39:54 crc kubenswrapper[4746]: I0128 20:39:54.163151 4746 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-28T20:39:52Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-operator]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-28T20:39:52Z\\\",\\\"message\\\":\\\"containers with unready status: [network-operator]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":false,\\\"restartCount\\\":5,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-28T20:39:54Z is after 2025-08-24T17:21:41Z" Jan 28 20:39:54 crc kubenswrapper[4746]: I0128 20:39:54.175414 4746 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-28T20:39:52Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: 
[iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-28T20:39:52Z\\\",\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-28T20:39:54Z is after 2025-08-24T17:21:41Z" Jan 28 20:39:54 crc kubenswrapper[4746]: I0128 20:39:54.187184 4746 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-6wrnw" 
err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"6dc8b546-9734-4082-b2b3-2bafe3f1564d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-28T20:39:53Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-28T20:39:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-28T20:39:53Z\\\",\\\"message\\\":\\\"containers with unready status: [machine-config-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-28T20:39:53Z\\\",\\\"message\\\":\\\"containers with unready status: [machine-config-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-z84f9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-z84f9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-28T20:39:53Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-6wrnw\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post 
\"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-28T20:39:54Z is after 2025-08-24T17:21:41Z" Jan 28 20:39:54 crc kubenswrapper[4746]: I0128 20:39:54.201482 4746 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-ht6hp" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"beb2b795-6bf4-4d38-89f7-bcb5512c3e61\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-28T20:39:53Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-28T20:39:53Z\\\",\\\"message\\\":\\\"containers with incomplete status: [egress-router-binary-copy cni-plugins bond-cni-plugin routeoverride-cni whereabouts-cni-bincopy whereabouts-cni]\\\",\\\"reason\\\":\\\"ContainersNotInitialized\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-28T20:39:53Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-28T20:39:53Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rr62b\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rr62b\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\"
:false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rr62b\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rr62b\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\
\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rr62b\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rr62b\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/s
erviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rr62b\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-28T20:39:53Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-ht6hp\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-28T20:39:54Z is after 2025-08-24T17:21:41Z" Jan 28 20:39:54 crc kubenswrapper[4746]: I0128 20:39:54.215050 4746 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-qhpvf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"cdf26de0-b602-4bdf-b492-65b3b6b31434\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-28T20:39:53Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-28T20:39:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-28T20:39:53Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-28T20:39:53Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-multus]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/
kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jnpsl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-28T20:39:53Z\\\"}}\" for pod \"openshift-multus\"/\"multus-qhpvf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-28T20:39:54Z is after 2025-08-24T17:21:41Z" Jan 28 20:39:54 crc kubenswrapper[4746]: I0128 20:39:54.229023 4746 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"950d71b3-9b51-43dc-814a-d6a838723f78\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-28T20:39:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-28T20:39:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-28T20:39:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-28T20:39:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-28T20:39:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0411ecf7c85f5397d751adb7d4905977dcbd3d2e169c56ebbe26017192765602\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/oc
p-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-28T20:39:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://319cbcaa9981b3c79eb0c8f970f0ad776fb255db10c72c9b6a13469906e9e5ec\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-28T20:39:33Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://eefda7da35cd9ff176512b5d1224ccadcb5debd0035a49822e82a9757d4a3f91\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\
\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-28T20:39:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://145f4f882924f06172df81d94fc9bd683ea1fc04889de5f6cfb5caece8227fd0\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-28T20:39:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-28T20:39:32Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-28T20:39:54Z is after 2025-08-24T17:21:41Z" Jan 28 20:39:54 crc kubenswrapper[4746]: I0128 20:39:54.243374 4746 status_manager.go:875] "Failed to update status for pod" 
pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-28T20:39:52Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-28T20:39:52Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-28T20:39:54Z is after 2025-08-24T17:21:41Z" Jan 28 20:39:54 crc kubenswrapper[4746]: I0128 20:39:54.260220 4746 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-28T20:39:52Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-28T20:39:52Z\\\",\\\"message\\\":\\\"containers with unready status: 
[network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-28T20:39:54Z is after 2025-08-24T17:21:41Z" Jan 28 20:39:54 crc kubenswrapper[4746]: I0128 20:39:54.277642 4746 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-28T20:39:52Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-28T20:39:52Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-28T20:39:54Z is after 2025-08-24T17:21:41Z" Jan 28 20:39:54 crc kubenswrapper[4746]: I0128 20:39:54.300980 4746 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-28T20:39:53Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-28T20:39:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-28T20:39:53Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5dff3e2075b8bbfb6af77c72362ccc37e1d7810a2d97c096cf501681c98699ab\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-28T20:39:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://772e165efa6d865a669c172958e0aea3112146637e354e4587b7e664454112ad\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af
0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-28T20:39:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-28T20:39:54Z is after 2025-08-24T17:21:41Z" Jan 28 20:39:54 crc kubenswrapper[4746]: I0128 20:39:54.323697 4746 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-gcrxx" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"1c3c93a1-7ddf-4339-9ca3-79f3753943b4\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-28T20:39:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-28T20:39:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-28T20:39:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-28T20:39:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5bd8d8214aa8e85e50747643b6354fb6e026866c926310fb2dd79760f916f468\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-28T20:39:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wn6hr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podI
Ps\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-28T20:39:52Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-gcrxx\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-28T20:39:54Z is after 2025-08-24T17:21:41Z" Jan 28 20:39:54 crc kubenswrapper[4746]: I0128 20:39:54.346204 4746 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-8vmvh" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"c4d15639-62fb-41b7-a1d4-6f51f3af6d99\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-28T20:39:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-28T20:39:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-28T20:39:53Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-28T20:39:53Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb 
ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-kgrqs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-kgrqs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\
\":\\\"nbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-kgrqs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-kgrqs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\
\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-kgrqs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-kgrqs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"wa
iting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-kgrqs\\\",\\\"read
Only\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-kgrqs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://45e59dfe364d511e17b8264a31157501de3ed23e7125d79979df83d328242328\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://45e59dfe364d511e17b8264a31157501de3ed23e7125d79979df83d328242328\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-28T20:39:53Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-28T20:39:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mou
ntPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-kgrqs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-28T20:39:53Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-8vmvh\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-28T20:39:54Z is after 2025-08-24T17:21:41Z" Jan 28 20:39:54 crc kubenswrapper[4746]: I0128 20:39:54.370779 4746 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Jan 28 20:39:54 crc kubenswrapper[4746]: I0128 20:39:54.370928 4746 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nginx-conf\" (UniqueName: \"kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Jan 28 20:39:54 crc kubenswrapper[4746]: E0128 20:39:54.370987 4746 configmap.go:193] Couldn't get configMap openshift-network-console/networking-console-plugin: object "openshift-network-console"/"networking-console-plugin" not registered Jan 28 20:39:54 crc kubenswrapper[4746]: E0128 20:39:54.370997 4746 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 
podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-01-28 20:39:56.370967946 +0000 UTC m=+24.327154290 (durationBeforeRetry 2s). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Jan 28 20:39:54 crc kubenswrapper[4746]: E0128 20:39:54.371046 4746 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. No retries permitted until 2026-01-28 20:39:56.371034708 +0000 UTC m=+24.327221062 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "nginx-conf" (UniqueName: "kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin" not registered Jan 28 20:39:54 crc kubenswrapper[4746]: I0128 20:39:54.371146 4746 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-cqllr\" (UniqueName: \"kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr\") pod \"network-check-target-xd92c\" (UID: \"3b6479f0-333b-4a96-9adf-2099afdc2447\") " pod="openshift-network-diagnostics/network-check-target-xd92c" Jan 28 20:39:54 crc kubenswrapper[4746]: I0128 20:39:54.371189 4746 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"networking-console-plugin-cert\" (UniqueName: 
\"kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Jan 28 20:39:54 crc kubenswrapper[4746]: E0128 20:39:54.371299 4746 secret.go:188] Couldn't get secret openshift-network-console/networking-console-plugin-cert: object "openshift-network-console"/"networking-console-plugin-cert" not registered Jan 28 20:39:54 crc kubenswrapper[4746]: E0128 20:39:54.371334 4746 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. No retries permitted until 2026-01-28 20:39:56.371327416 +0000 UTC m=+24.327513770 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "networking-console-plugin-cert" (UniqueName: "kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin-cert" not registered Jan 28 20:39:54 crc kubenswrapper[4746]: E0128 20:39:54.371396 4746 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Jan 28 20:39:54 crc kubenswrapper[4746]: E0128 20:39:54.371407 4746 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Jan 28 20:39:54 crc kubenswrapper[4746]: E0128 20:39:54.371418 4746 projected.go:194] Error preparing data for projected volume kube-api-access-cqllr for pod openshift-network-diagnostics/network-check-target-xd92c: [object 
"openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Jan 28 20:39:54 crc kubenswrapper[4746]: E0128 20:39:54.371440 4746 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr podName:3b6479f0-333b-4a96-9adf-2099afdc2447 nodeName:}" failed. No retries permitted until 2026-01-28 20:39:56.371433859 +0000 UTC m=+24.327620213 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "kube-api-access-cqllr" (UniqueName: "kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr") pod "network-check-target-xd92c" (UID: "3b6479f0-333b-4a96-9adf-2099afdc2447") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Jan 28 20:39:54 crc kubenswrapper[4746]: I0128 20:39:54.391507 4746 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"64499f0d-5c10-43a9-b479-9b6c7ef2fb9c\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-28T20:39:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-28T20:39:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-28T20:39:32Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-28T20:39:32Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-28T20:39:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://9a304b12b4a1e30cbbbb16aa4f91514f1b1a64c716cf8ac023803a279b310f9c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-28T20:39:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://6af41f85c56be6c24127bec0aee8e02562dbb04bd22e57c6887f6f5e914a7530\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-28T20:39:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri
-o://f31e9e80576ec4eb59ec92039601fd94bd323d29493c353f9ec6b9c61187b0d3\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-28T20:39:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://56b250a192cfa698ccd7ac8233b75b9acbf2347549c712b47ccc1b89f144aa83\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://2bd133218655e67cdc29b14c35357368452991c316986b1fa90069554eba28c0\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-01-28T20:39:46Z\\\",\\\"message\\\":\\\"W0128 20:39:36.056270 1 cmd.go:257] Using insecure, self-signed certificates\\\\nI0128 20:39:36.056819 1 crypto.go:601] Generating new CA for check-endpoints-signer@1769632776 cert, and key in /tmp/serving-cert-875558266/serving-signer.crt, /tmp/serving-cert-875558266/serving-signer.key\\\\nI0128 20:39:36.239389 1 observer_polling.go:159] Starting file observer\\\\nW0128 20:39:36.242369 1 builder.go:272] unable to get owner reference (falling back to namespace): Get 
\\\\\\\"https://localhost:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\\\\\\\": dial tcp [::1]:6443: connect: connection refused\\\\nI0128 20:39:36.242616 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0128 20:39:36.243645 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-875558266/tls.crt::/tmp/serving-cert-875558266/tls.key\\\\\\\"\\\\nF0128 20:39:46.545113 1 cmd.go:182] error initializing delegating authentication: unable to load configmap based request-header-client-ca-file: Get \\\\\\\"https://localhost:6443/api/v1/namespaces/kube-system/configmaps/extension-apiserver-authentication\\\\\\\": net/http: TLS handshake timeout\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-01-28T20:39:35Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://56b250a192cfa698ccd7ac8233b75b9acbf2347549c712b47ccc1b89f144aa83\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-01-28T20:39:52Z\\\",\\\"message\\\":\\\"le observer\\\\nW0128 20:39:52.695680 1 builder.go:272] unable to get owner reference (falling back to namespace): pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI0128 20:39:52.695807 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0128 20:39:52.696767 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-2500116620/tls.crt::/tmp/serving-cert-2500116620/tls.key\\\\\\\"\\\\nI0128 20:39:52.900914 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI0128 20:39:52.915714 1 maxinflight.go:139] 
\\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI0128 20:39:52.915750 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI0128 20:39:52.916035 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI0128 20:39:52.916045 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI0128 20:39:52.925040 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW0128 20:39:52.925098 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0128 20:39:52.925104 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0128 20:39:52.925109 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW0128 20:39:52.925113 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW0128 20:39:52.925116 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW0128 20:39:52.925119 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nI0128 20:39:52.925416 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nF0128 20:39:52.926926 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not 
found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-01-28T20:39:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://03a94dee0b9663441e0ecec16b121c5be4e5c557f17c929b8edfb0d7ba00c3d5\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-28T20:39:35Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c5808a729190035a73529ac58efb0790f2d9c3b98c5f1a8017389e4ca1738c4c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://c5808a729190035a73529ac58efb0790f2d9c3b98c5f1a8017389e4ca1738c4c\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-28T20:39:33Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-28T20:39:33Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\
\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-28T20:39:32Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-28T20:39:54Z is after 2025-08-24T17:21:41Z" Jan 28 20:39:54 crc kubenswrapper[4746]: I0128 20:39:54.422348 4746 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-28T20:39:54Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-28T20:39:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-28T20:39:54Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://2f64131d62eca6c3a42e9177d14fa39ed6435e536f60ef5d25ff6bf061d2cc0a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-28T20:39:53Z\\\"}},\\\"vol
umeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-28T20:39:54Z is after 2025-08-24T17:21:41Z" Jan 28 20:39:54 crc kubenswrapper[4746]: I0128 20:39:54.462455 4746 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-28T20:39:52Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-28T20:39:52Z\\\",\\\"message\\\":\\\"containers with unready status: 
[iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-28T20:39:54Z is after 2025-08-24T17:21:41Z" Jan 28 20:39:54 crc kubenswrapper[4746]: I0128 20:39:54.472584 4746 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-s2dwl\" (UniqueName: \"kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl\") pod \"network-check-source-55646444c4-trplf\" (UID: \"9d751cbb-f2e2-430d-9754-c882a5e924a5\") " 
pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Jan 28 20:39:54 crc kubenswrapper[4746]: E0128 20:39:54.472731 4746 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Jan 28 20:39:54 crc kubenswrapper[4746]: E0128 20:39:54.472747 4746 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Jan 28 20:39:54 crc kubenswrapper[4746]: E0128 20:39:54.472757 4746 projected.go:194] Error preparing data for projected volume kube-api-access-s2dwl for pod openshift-network-diagnostics/network-check-source-55646444c4-trplf: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Jan 28 20:39:54 crc kubenswrapper[4746]: E0128 20:39:54.472797 4746 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl podName:9d751cbb-f2e2-430d-9754-c882a5e924a5 nodeName:}" failed. No retries permitted until 2026-01-28 20:39:56.472786152 +0000 UTC m=+24.428972496 (durationBeforeRetry 2s). 
Error: MountVolume.SetUp failed for volume "kube-api-access-s2dwl" (UniqueName: "kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl") pod "network-check-source-55646444c4-trplf" (UID: "9d751cbb-f2e2-430d-9754-c882a5e924a5") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Jan 28 20:39:54 crc kubenswrapper[4746]: I0128 20:39:54.503109 4746 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-6wrnw" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"6dc8b546-9734-4082-b2b3-2bafe3f1564d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-28T20:39:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-28T20:39:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-28T20:39:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-28T20:39:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://50ec1ea913ec63a096032bd968bda5c5c8879e16a08e6a57727750517d9af038\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState
\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-28T20:39:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-z84f9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://66d250466155027405bf90c4e5ed09388238d4cee63604e28486a16778f9d188\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-28T20:39:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-z84f9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-28T20:39:53Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-6wrnw\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired 
or is not yet valid: current time 2026-01-28T20:39:54Z is after 2025-08-24T17:21:41Z" Jan 28 20:39:54 crc kubenswrapper[4746]: I0128 20:39:54.545190 4746 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-ht6hp" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"beb2b795-6bf4-4d38-89f7-bcb5512c3e61\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-28T20:39:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-28T20:39:53Z\\\",\\\"message\\\":\\\"containers with incomplete status: [egress-router-binary-copy cni-plugins bond-cni-plugin routeoverride-cni whereabouts-cni-bincopy whereabouts-cni]\\\",\\\"reason\\\":\\\"ContainersNotInitialized\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-28T20:39:53Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-28T20:39:53Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rr62b\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b2d8bf207108ac2304cd04e34cd9843fcb755ac8610a3f88448538d1e99359f6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-28T20:39:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rr62b\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"ima
ge\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rr62b\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rr62b\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69
b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rr62b\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rr62b\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volum
eMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rr62b\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-28T20:39:53Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-ht6hp\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-28T20:39:54Z is after 2025-08-24T17:21:41Z" Jan 28 20:39:54 crc kubenswrapper[4746]: I0128 20:39:54.581147 4746 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-qhpvf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"cdf26de0-b602-4bdf-b492-65b3b6b31434\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-28T20:39:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-28T20:39:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-28T20:39:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-28T20:39:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f80345dfd4de46e278611370e6c337968cb1ba7ed8275457deea5871704f8367\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-28T20:39:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\
"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jnpsl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-28T20:39:53Z\\\"}}\" for pod \"openshift-multus\"/\"multus-qhpvf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-28T20:39:54Z is after 2025-08-24T17:21:41Z" Jan 28 20:39:54 crc kubenswrapper[4746]: I0128 20:39:54.622411 4746 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" 
err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"950d71b3-9b51-43dc-814a-d6a838723f78\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-28T20:39:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-28T20:39:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-28T20:39:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-28T20:39:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-28T20:39:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0411ecf7c85f5397d751adb7d4905977dcbd3d2e169c56ebbe26017192765602\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-28T20:39:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://319cbcaa9981b3c79eb0c8f970f0ad776fb255db10c72c9b6a13469906e9e5ec\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/ope
nshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-28T20:39:33Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://eefda7da35cd9ff176512b5d1224ccadcb5debd0035a49822e82a9757d4a3f91\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-28T20:39:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://145f4f882924f06172df81d94fc9bd683ea1fc04889de5f6cfb5caece8227fd0\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{
\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-28T20:39:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-28T20:39:32Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-28T20:39:54Z is after 2025-08-24T17:21:41Z" Jan 28 20:39:54 crc kubenswrapper[4746]: I0128 20:39:54.790682 4746 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2025-11-27 15:46:41.363091114 +0000 UTC Jan 28 20:39:54 crc kubenswrapper[4746]: I0128 20:39:54.835290 4746 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Jan 28 20:39:54 crc kubenswrapper[4746]: I0128 20:39:54.835362 4746 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Jan 28 20:39:54 crc kubenswrapper[4746]: E0128 20:39:54.835935 4746 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Jan 28 20:39:54 crc kubenswrapper[4746]: E0128 20:39:54.836018 4746 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Jan 28 20:39:54 crc kubenswrapper[4746]: I0128 20:39:54.841058 4746 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="09ae3b1a-e8e7-4524-b54b-61eab6f9239a" path="/var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/volumes" Jan 28 20:39:54 crc kubenswrapper[4746]: I0128 20:39:54.842217 4746 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="09efc573-dbb6-4249-bd59-9b87aba8dd28" path="/var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/volumes" Jan 28 20:39:54 crc kubenswrapper[4746]: I0128 20:39:54.843924 4746 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="0b574797-001e-440a-8f4e-c0be86edad0f" path="/var/lib/kubelet/pods/0b574797-001e-440a-8f4e-c0be86edad0f/volumes" Jan 28 20:39:54 crc kubenswrapper[4746]: I0128 20:39:54.844785 4746 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="0b78653f-4ff9-4508-8672-245ed9b561e3" path="/var/lib/kubelet/pods/0b78653f-4ff9-4508-8672-245ed9b561e3/volumes" Jan 28 20:39:54 crc kubenswrapper[4746]: I0128 20:39:54.846099 4746 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="1386a44e-36a2-460c-96d0-0359d2b6f0f5" path="/var/lib/kubelet/pods/1386a44e-36a2-460c-96d0-0359d2b6f0f5/volumes" Jan 28 20:39:54 crc kubenswrapper[4746]: I0128 20:39:54.846941 4746 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" 
podUID="1bf7eb37-55a3-4c65-b768-a94c82151e69" path="/var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes" Jan 28 20:39:54 crc kubenswrapper[4746]: I0128 20:39:54.847778 4746 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="1d611f23-29be-4491-8495-bee1670e935f" path="/var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes" Jan 28 20:39:54 crc kubenswrapper[4746]: I0128 20:39:54.849459 4746 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="5225d0e4-402f-4861-b410-819f433b1803" path="/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes" Jan 28 20:39:54 crc kubenswrapper[4746]: I0128 20:39:54.851031 4746 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="57a731c4-ef35-47a8-b875-bfb08a7f8011" path="/var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes" Jan 28 20:39:55 crc kubenswrapper[4746]: I0128 20:39:55.005364 4746 generic.go:334] "Generic (PLEG): container finished" podID="beb2b795-6bf4-4d38-89f7-bcb5512c3e61" containerID="b2d8bf207108ac2304cd04e34cd9843fcb755ac8610a3f88448538d1e99359f6" exitCode=0 Jan 28 20:39:55 crc kubenswrapper[4746]: I0128 20:39:55.005441 4746 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-ht6hp" event={"ID":"beb2b795-6bf4-4d38-89f7-bcb5512c3e61","Type":"ContainerDied","Data":"b2d8bf207108ac2304cd04e34cd9843fcb755ac8610a3f88448538d1e99359f6"} Jan 28 20:39:55 crc kubenswrapper[4746]: I0128 20:39:55.011100 4746 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-8vmvh" event={"ID":"c4d15639-62fb-41b7-a1d4-6f51f3af6d99","Type":"ContainerStarted","Data":"17d09f7f46014da3713f76f111bca0a7ce8a24a154aa6144291d760d85e7d3ec"} Jan 28 20:39:55 crc kubenswrapper[4746]: I0128 20:39:55.011154 4746 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-8vmvh" 
event={"ID":"c4d15639-62fb-41b7-a1d4-6f51f3af6d99","Type":"ContainerStarted","Data":"b7ad895d0ff1ad570136d96671e1bde5c62882af50a407945ef0e7bc52727b88"} Jan 28 20:39:55 crc kubenswrapper[4746]: I0128 20:39:55.011171 4746 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-8vmvh" event={"ID":"c4d15639-62fb-41b7-a1d4-6f51f3af6d99","Type":"ContainerStarted","Data":"73e82be84afa7be760acda86ebf6c956d558dcbd4f37f049fba84e08c75fcd9f"} Jan 28 20:39:55 crc kubenswrapper[4746]: I0128 20:39:55.011183 4746 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-8vmvh" event={"ID":"c4d15639-62fb-41b7-a1d4-6f51f3af6d99","Type":"ContainerStarted","Data":"2637734601bf1e43535b70bd62cc6aea7fc9f62d0d2a33c83602e7065a793be1"} Jan 28 20:39:55 crc kubenswrapper[4746]: I0128 20:39:55.011192 4746 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-8vmvh" event={"ID":"c4d15639-62fb-41b7-a1d4-6f51f3af6d99","Type":"ContainerStarted","Data":"af842ba33fdf06048038dd3e12c59f7a9d0867aa28acee1f2cc4571e7db28b88"} Jan 28 20:39:55 crc kubenswrapper[4746]: I0128 20:39:55.011203 4746 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-8vmvh" event={"ID":"c4d15639-62fb-41b7-a1d4-6f51f3af6d99","Type":"ContainerStarted","Data":"57a776a1daf1b17eb44d9fa09fe2f7ec01f647aa0a40d0d571736d2f7c2f8e4d"} Jan 28 20:39:55 crc kubenswrapper[4746]: I0128 20:39:55.014184 4746 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-apiserver_kube-apiserver-crc_f4b27818a5e8e43d0dc095d08835c792/kube-apiserver-check-endpoints/1.log" Jan 28 20:39:55 crc kubenswrapper[4746]: I0128 20:39:55.017097 4746 scope.go:117] "RemoveContainer" containerID="56b250a192cfa698ccd7ac8233b75b9acbf2347549c712b47ccc1b89f144aa83" Jan 28 20:39:55 crc kubenswrapper[4746]: E0128 20:39:55.017340 4746 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to 
\"StartContainer\" for \"kube-apiserver-check-endpoints\" with CrashLoopBackOff: \"back-off 10s restarting failed container=kube-apiserver-check-endpoints pod=kube-apiserver-crc_openshift-kube-apiserver(f4b27818a5e8e43d0dc095d08835c792)\"" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="f4b27818a5e8e43d0dc095d08835c792" Jan 28 20:39:55 crc kubenswrapper[4746]: I0128 20:39:55.023968 4746 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"950d71b3-9b51-43dc-814a-d6a838723f78\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-28T20:39:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-28T20:39:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-28T20:39:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-28T20:39:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-28T20:39:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0411ecf7c85f5397d751adb7d4905977dcbd3d2e169c56ebbe26017192765602\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedA
t\\\":\\\"2026-01-28T20:39:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://319cbcaa9981b3c79eb0c8f970f0ad776fb255db10c72c9b6a13469906e9e5ec\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-28T20:39:33Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://eefda7da35cd9ff176512b5d1224ccadcb5debd0035a49822e82a9757d4a3f91\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-28T20:39:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://145f4f882924f06172df81d94fc9bd683ea1fc
04889de5f6cfb5caece8227fd0\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-28T20:39:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-28T20:39:32Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-28T20:39:55Z is after 2025-08-24T17:21:41Z" Jan 28 20:39:55 crc kubenswrapper[4746]: I0128 20:39:55.046795 4746 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-28T20:39:52Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-28T20:39:52Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-28T20:39:55Z is after 2025-08-24T17:21:41Z" Jan 28 20:39:55 crc kubenswrapper[4746]: I0128 20:39:55.064573 4746 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-28T20:39:52Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-28T20:39:52Z\\\",\\\"message\\\":\\\"containers with unready status: 
[network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-28T20:39:55Z is after 2025-08-24T17:21:41Z" Jan 28 20:39:55 crc kubenswrapper[4746]: I0128 20:39:55.076760 4746 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-28T20:39:52Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-28T20:39:52Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-28T20:39:55Z is after 2025-08-24T17:21:41Z" Jan 28 20:39:55 crc kubenswrapper[4746]: I0128 20:39:55.096727 4746 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-28T20:39:53Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-28T20:39:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-28T20:39:53Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5dff3e2075b8bbfb6af77c72362ccc37e1d7810a2d97c096cf501681c98699ab\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-28T20:39:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://772e165efa6d865a669c172958e0aea3112146637e354e4587b7e664454112ad\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af
0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-28T20:39:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-28T20:39:55Z is after 2025-08-24T17:21:41Z" Jan 28 20:39:55 crc kubenswrapper[4746]: I0128 20:39:55.109869 4746 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-gcrxx" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"1c3c93a1-7ddf-4339-9ca3-79f3753943b4\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-28T20:39:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-28T20:39:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-28T20:39:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-28T20:39:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5bd8d8214aa8e85e50747643b6354fb6e026866c926310fb2dd79760f916f468\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-28T20:39:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wn6hr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podI
Ps\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-28T20:39:52Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-gcrxx\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-28T20:39:55Z is after 2025-08-24T17:21:41Z" Jan 28 20:39:55 crc kubenswrapper[4746]: I0128 20:39:55.137816 4746 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-8vmvh" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"c4d15639-62fb-41b7-a1d4-6f51f3af6d99\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-28T20:39:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-28T20:39:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-28T20:39:53Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-28T20:39:53Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb 
ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-kgrqs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-kgrqs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\
\":\\\"nbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-kgrqs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-kgrqs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\
\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-kgrqs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-kgrqs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"wa
iting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-kgrqs\\\",\\\"read
Only\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-kgrqs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://45e59dfe364d511e17b8264a31157501de3ed23e7125d79979df83d328242328\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://45e59dfe364d511e17b8264a31157501de3ed23e7125d79979df83d328242328\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-28T20:39:53Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-28T20:39:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mou
ntPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-kgrqs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-28T20:39:53Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-8vmvh\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-28T20:39:55Z is after 2025-08-24T17:21:41Z" Jan 28 20:39:55 crc kubenswrapper[4746]: I0128 20:39:55.155442 4746 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"64499f0d-5c10-43a9-b479-9b6c7ef2fb9c\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-28T20:39:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-28T20:39:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-28T20:39:32Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-28T20:39:32Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-28T20:39:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://9a304b12b4a1e30cbbbb16aa4f91514f1b1a64c716cf8ac023803a279b310f9c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-28T20:39:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://6af41f85c56be6c24127bec0aee8e02562dbb04bd22e57c6887f6f5e914a7530\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-28T20:39:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri
-o://f31e9e80576ec4eb59ec92039601fd94bd323d29493c353f9ec6b9c61187b0d3\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-28T20:39:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://56b250a192cfa698ccd7ac8233b75b9acbf2347549c712b47ccc1b89f144aa83\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://2bd133218655e67cdc29b14c35357368452991c316986b1fa90069554eba28c0\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-01-28T20:39:46Z\\\",\\\"message\\\":\\\"W0128 20:39:36.056270 1 cmd.go:257] Using insecure, self-signed certificates\\\\nI0128 20:39:36.056819 1 crypto.go:601] Generating new CA for check-endpoints-signer@1769632776 cert, and key in /tmp/serving-cert-875558266/serving-signer.crt, /tmp/serving-cert-875558266/serving-signer.key\\\\nI0128 20:39:36.239389 1 observer_polling.go:159] Starting file observer\\\\nW0128 20:39:36.242369 1 builder.go:272] unable to get owner reference (falling back to namespace): Get 
\\\\\\\"https://localhost:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\\\\\\\": dial tcp [::1]:6443: connect: connection refused\\\\nI0128 20:39:36.242616 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0128 20:39:36.243645 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-875558266/tls.crt::/tmp/serving-cert-875558266/tls.key\\\\\\\"\\\\nF0128 20:39:46.545113 1 cmd.go:182] error initializing delegating authentication: unable to load configmap based request-header-client-ca-file: Get \\\\\\\"https://localhost:6443/api/v1/namespaces/kube-system/configmaps/extension-apiserver-authentication\\\\\\\": net/http: TLS handshake timeout\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-01-28T20:39:35Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://56b250a192cfa698ccd7ac8233b75b9acbf2347549c712b47ccc1b89f144aa83\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-01-28T20:39:52Z\\\",\\\"message\\\":\\\"le observer\\\\nW0128 20:39:52.695680 1 builder.go:272] unable to get owner reference (falling back to namespace): pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI0128 20:39:52.695807 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0128 20:39:52.696767 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-2500116620/tls.crt::/tmp/serving-cert-2500116620/tls.key\\\\\\\"\\\\nI0128 20:39:52.900914 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI0128 20:39:52.915714 1 maxinflight.go:139] 
\\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI0128 20:39:52.915750 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI0128 20:39:52.916035 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI0128 20:39:52.916045 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI0128 20:39:52.925040 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW0128 20:39:52.925098 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0128 20:39:52.925104 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0128 20:39:52.925109 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW0128 20:39:52.925113 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW0128 20:39:52.925116 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW0128 20:39:52.925119 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nI0128 20:39:52.925416 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nF0128 20:39:52.926926 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not 
found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-01-28T20:39:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://03a94dee0b9663441e0ecec16b121c5be4e5c557f17c929b8edfb0d7ba00c3d5\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-28T20:39:35Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c5808a729190035a73529ac58efb0790f2d9c3b98c5f1a8017389e4ca1738c4c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://c5808a729190035a73529ac58efb0790f2d9c3b98c5f1a8017389e4ca1738c4c\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-28T20:39:33Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-28T20:39:33Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\
\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-28T20:39:32Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-28T20:39:55Z is after 2025-08-24T17:21:41Z" Jan 28 20:39:55 crc kubenswrapper[4746]: I0128 20:39:55.170200 4746 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-28T20:39:54Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-28T20:39:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-28T20:39:54Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://2f64131d62eca6c3a42e9177d14fa39ed6435e536f60ef5d25ff6bf061d2cc0a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-28T20:39:53Z\\\"}},\\\"vol
umeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-28T20:39:55Z is after 2025-08-24T17:21:41Z" Jan 28 20:39:55 crc kubenswrapper[4746]: I0128 20:39:55.185125 4746 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-28T20:39:52Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-28T20:39:52Z\\\",\\\"message\\\":\\\"containers with unready status: 
[iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-28T20:39:55Z is after 2025-08-24T17:21:41Z" Jan 28 20:39:55 crc kubenswrapper[4746]: I0128 20:39:55.200248 4746 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-6wrnw" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"6dc8b546-9734-4082-b2b3-2bafe3f1564d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-28T20:39:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-28T20:39:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-28T20:39:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-28T20:39:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://50ec1ea913ec63a096032bd968bda5c5c8879e16a08e6a57727750517d9af038\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-28T20:39:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-z84f9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://66d250466155027405bf90c4e5ed09388238d4ce
e63604e28486a16778f9d188\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-28T20:39:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-z84f9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-28T20:39:53Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-6wrnw\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-28T20:39:55Z is after 2025-08-24T17:21:41Z" Jan 28 20:39:55 crc kubenswrapper[4746]: I0128 20:39:55.218986 4746 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-ht6hp" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"beb2b795-6bf4-4d38-89f7-bcb5512c3e61\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-28T20:39:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-28T20:39:53Z\\\",\\\"message\\\":\\\"containers with incomplete status: [cni-plugins bond-cni-plugin routeoverride-cni whereabouts-cni-bincopy whereabouts-cni]\\\",\\\"reason\\\":\\\"ContainersNotInitialized\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-28T20:39:53Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-28T20:39:53Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rr62b\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b2d8bf207108ac2304cd04e34cd9843fcb755ac8610a3f88448538d1e99359f6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b2d8bf207108ac2304cd04e34cd9843fcb755ac8610a3f88448538d1e99359f6\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-28T20:39:54Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-28T20:39:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Di
sabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rr62b\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rr62b\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube
-api-access-rr62b\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rr62b\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rr62b\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\
\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rr62b\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-28T20:39:53Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-ht6hp\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-28T20:39:55Z is after 2025-08-24T17:21:41Z" Jan 28 20:39:55 crc kubenswrapper[4746]: I0128 20:39:55.231845 4746 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-qhpvf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"cdf26de0-b602-4bdf-b492-65b3b6b31434\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-28T20:39:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-28T20:39:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-28T20:39:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-28T20:39:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f80345dfd4de46e278611370e6c337968cb1ba7ed8275457deea5871704f8367\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-28T20:39:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\
"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jnpsl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-28T20:39:53Z\\\"}}\" for pod \"openshift-multus\"/\"multus-qhpvf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-28T20:39:55Z is after 2025-08-24T17:21:41Z" Jan 28 20:39:55 crc kubenswrapper[4746]: I0128 20:39:55.244795 4746 status_manager.go:875] "Failed to update status for pod" 
pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-28T20:39:52Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-28T20:39:52Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-28T20:39:55Z is after 2025-08-24T17:21:41Z" Jan 28 20:39:55 crc kubenswrapper[4746]: I0128 20:39:55.260852 4746 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-28T20:39:52Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-28T20:39:52Z\\\",\\\"message\\\":\\\"containers with unready status: 
[network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-28T20:39:55Z is after 2025-08-24T17:21:41Z" Jan 28 20:39:55 crc kubenswrapper[4746]: I0128 20:39:55.277744 4746 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-gcrxx" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"1c3c93a1-7ddf-4339-9ca3-79f3753943b4\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-28T20:39:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-28T20:39:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-28T20:39:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-28T20:39:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5bd8d8214aa8e85e50747643b6354fb6e026866c926310fb2dd79760f916f468\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-28T20:39:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wn6hr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podI
Ps\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-28T20:39:52Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-gcrxx\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-28T20:39:55Z is after 2025-08-24T17:21:41Z" Jan 28 20:39:55 crc kubenswrapper[4746]: I0128 20:39:55.323346 4746 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-8vmvh" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"c4d15639-62fb-41b7-a1d4-6f51f3af6d99\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-28T20:39:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-28T20:39:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-28T20:39:53Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-28T20:39:53Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb 
ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-kgrqs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-kgrqs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\
\":\\\"nbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-kgrqs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-kgrqs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\
\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-kgrqs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-kgrqs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"wa
iting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-kgrqs\\\",\\\"read
Only\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-kgrqs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://45e59dfe364d511e17b8264a31157501de3ed23e7125d79979df83d328242328\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://45e59dfe364d511e17b8264a31157501de3ed23e7125d79979df83d328242328\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-28T20:39:53Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-28T20:39:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mou
ntPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-kgrqs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-28T20:39:53Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-8vmvh\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-28T20:39:55Z is after 2025-08-24T17:21:41Z" Jan 28 20:39:55 crc kubenswrapper[4746]: I0128 20:39:55.354760 4746 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-28T20:39:52Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-28T20:39:52Z\\\",\\\"message\\\":\\\"containers with unready status: 
[networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-28T20:39:55Z is after 2025-08-24T17:21:41Z" Jan 28 20:39:55 crc kubenswrapper[4746]: I0128 20:39:55.381670 4746 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-28T20:39:53Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-28T20:39:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-28T20:39:53Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5dff3e2075b8bbfb6af77c72362ccc37e1d7810a2d97c096cf501681c98699ab\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-28T20:39:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://772e165efa6d865a669c172958e0aea3112146637e354e4587b7e664454112ad\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af
0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-28T20:39:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-28T20:39:55Z is after 2025-08-24T17:21:41Z" Jan 28 20:39:55 crc kubenswrapper[4746]: I0128 20:39:55.421561 4746 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-6wrnw" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"6dc8b546-9734-4082-b2b3-2bafe3f1564d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-28T20:39:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-28T20:39:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-28T20:39:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-28T20:39:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://50ec1ea913ec63a096032bd968bda5c5c8879e16a08e6a57727750517d9af038\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-28T20:39:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-z84f9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://66d250466155027405bf90c4e5ed09388238d4ce
e63604e28486a16778f9d188\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-28T20:39:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-z84f9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-28T20:39:53Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-6wrnw\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-28T20:39:55Z is after 2025-08-24T17:21:41Z" Jan 28 20:39:55 crc kubenswrapper[4746]: I0128 20:39:55.465247 4746 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-ht6hp" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"beb2b795-6bf4-4d38-89f7-bcb5512c3e61\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-28T20:39:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-28T20:39:53Z\\\",\\\"message\\\":\\\"containers with incomplete status: [cni-plugins bond-cni-plugin routeoverride-cni whereabouts-cni-bincopy whereabouts-cni]\\\",\\\"reason\\\":\\\"ContainersNotInitialized\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-28T20:39:53Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-28T20:39:53Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rr62b\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b2d8bf207108ac2304cd04e34cd9843fcb755ac8610a3f88448538d1e99359f6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b2d8bf207108ac2304cd04e34cd9843fcb755ac8610a3f88448538d1e99359f6\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-28T20:39:54Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-28T20:39:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Di
sabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rr62b\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rr62b\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube
-api-access-rr62b\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rr62b\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rr62b\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\
\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rr62b\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-28T20:39:53Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-ht6hp\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-28T20:39:55Z is after 2025-08-24T17:21:41Z" Jan 28 20:39:55 crc kubenswrapper[4746]: I0128 20:39:55.503464 4746 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-qhpvf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"cdf26de0-b602-4bdf-b492-65b3b6b31434\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-28T20:39:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-28T20:39:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-28T20:39:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-28T20:39:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f80345dfd4de46e278611370e6c337968cb1ba7ed8275457deea5871704f8367\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-28T20:39:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\
"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jnpsl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-28T20:39:53Z\\\"}}\" for pod \"openshift-multus\"/\"multus-qhpvf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-28T20:39:55Z is after 2025-08-24T17:21:41Z" Jan 28 20:39:55 crc kubenswrapper[4746]: I0128 20:39:55.543816 4746 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch 
status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"64499f0d-5c10-43a9-b479-9b6c7ef2fb9c\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-28T20:39:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-28T20:39:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-28T20:39:32Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-28T20:39:32Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-28T20:39:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://9a304b12b4a1e30cbbbb16aa4f91514f1b1a64c716cf8ac023803a279b310f9c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-28T20:39:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserv
er\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://6af41f85c56be6c24127bec0aee8e02562dbb04bd22e57c6887f6f5e914a7530\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-28T20:39:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://f31e9e80576ec4eb59ec92039601fd94bd323d29493c353f9ec6b9c61187b0d3\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-28T20:39:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://56b250a192cfa698ccd7ac8233b75b9acbf2347549c712b47ccc1b89f144aa83\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc
276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://56b250a192cfa698ccd7ac8233b75b9acbf2347549c712b47ccc1b89f144aa83\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-01-28T20:39:52Z\\\",\\\"message\\\":\\\"le observer\\\\nW0128 20:39:52.695680 1 builder.go:272] unable to get owner reference (falling back to namespace): pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI0128 20:39:52.695807 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0128 20:39:52.696767 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-2500116620/tls.crt::/tmp/serving-cert-2500116620/tls.key\\\\\\\"\\\\nI0128 20:39:52.900914 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI0128 20:39:52.915714 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI0128 20:39:52.915750 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI0128 20:39:52.916035 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI0128 20:39:52.916045 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI0128 20:39:52.925040 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW0128 20:39:52.925098 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0128 20:39:52.925104 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0128 20:39:52.925109 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW0128 20:39:52.925113 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW0128 20:39:52.925116 1 
secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW0128 20:39:52.925119 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nI0128 20:39:52.925416 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nF0128 20:39:52.926926 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-01-28T20:39:47Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 10s restarting failed container=kube-apiserver-check-endpoints pod=kube-apiserver-crc_openshift-kube-apiserver(f4b27818a5e8e43d0dc095d08835c792)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://03a94dee0b9663441e0ecec16b121c5be4e5c557f17c929b8edfb0d7ba00c3d5\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-28T20:39:35Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c5808a729190035a73529ac58efb0790f2d9c3b98c5f1a8017389e4ca1738c4c\\\",\\\"image\\\":\\\"quay.io/openshift-re
lease-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://c5808a729190035a73529ac58efb0790f2d9c3b98c5f1a8017389e4ca1738c4c\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-28T20:39:33Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-28T20:39:33Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-28T20:39:32Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-28T20:39:55Z is after 2025-08-24T17:21:41Z" Jan 28 20:39:55 crc kubenswrapper[4746]: I0128 20:39:55.580945 4746 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-28T20:39:54Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-28T20:39:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-28T20:39:54Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://2f64131d62eca6c3a42e9177d14fa39ed6435e536f60ef5d25ff6bf061d2cc0a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-28T20:39:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: 
x509: certificate has expired or is not yet valid: current time 2026-01-28T20:39:55Z is after 2025-08-24T17:21:41Z" Jan 28 20:39:55 crc kubenswrapper[4746]: I0128 20:39:55.618628 4746 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-28T20:39:52Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-28T20:39:52Z\\\",\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-28T20:39:55Z is after 2025-08-24T17:21:41Z" Jan 28 20:39:55 crc kubenswrapper[4746]: I0128 20:39:55.660936 4746 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"950d71b3-9b51-43dc-814a-d6a838723f78\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-28T20:39:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-28T20:39:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-28T20:39:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-28T20:39:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-28T20:39:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0411ecf7c85f5397d751adb7d4905977dcbd3d2e169c56ebbe26017192765602\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-28T20:39:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://319cbcaa9981b3c79eb0c8f970f0ad776fb255db10c72c9b6a13469906e9e5ec\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-
art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-28T20:39:33Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://eefda7da35cd9ff176512b5d1224ccadcb5debd0035a49822e82a9757d4a3f91\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-28T20:39:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://145f4f882924f06172df81d94fc9bd683ea1fc04889de5f6cfb5caece8227fd0\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"started
At\\\":\\\"2026-01-28T20:39:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-28T20:39:32Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-28T20:39:55Z is after 2025-08-24T17:21:41Z" Jan 28 20:39:55 crc kubenswrapper[4746]: I0128 20:39:55.790855 4746 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2025-12-02 13:56:11.306920815 +0000 UTC Jan 28 20:39:55 crc kubenswrapper[4746]: I0128 20:39:55.835376 4746 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Jan 28 20:39:55 crc kubenswrapper[4746]: E0128 20:39:55.835938 4746 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Jan 28 20:39:56 crc kubenswrapper[4746]: I0128 20:39:56.022042 4746 generic.go:334] "Generic (PLEG): container finished" podID="beb2b795-6bf4-4d38-89f7-bcb5512c3e61" containerID="40c78d39cb70eba245771f4d85d2c8bc7b77427ea216ad3b89749d24fd550098" exitCode=0 Jan 28 20:39:56 crc kubenswrapper[4746]: I0128 20:39:56.022141 4746 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-ht6hp" event={"ID":"beb2b795-6bf4-4d38-89f7-bcb5512c3e61","Type":"ContainerDied","Data":"40c78d39cb70eba245771f4d85d2c8bc7b77427ea216ad3b89749d24fd550098"} Jan 28 20:39:56 crc kubenswrapper[4746]: I0128 20:39:56.023814 4746 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" event={"ID":"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49","Type":"ContainerStarted","Data":"83a275ec9868f2fd9e24135d5a05c27caf843340c144070952291bb6a3715035"} Jan 28 20:39:56 crc kubenswrapper[4746]: I0128 20:39:56.041318 4746 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-28T20:39:52Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-28T20:39:52Z\\\",\\\"message\\\":\\\"containers with unready status: 
[check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-28T20:39:56Z is after 2025-08-24T17:21:41Z" Jan 28 20:39:56 crc kubenswrapper[4746]: I0128 20:39:56.054412 4746 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-28T20:39:52Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-28T20:39:52Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-28T20:39:56Z is after 2025-08-24T17:21:41Z" Jan 28 20:39:56 crc kubenswrapper[4746]: I0128 20:39:56.071539 4746 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-8vmvh" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"c4d15639-62fb-41b7-a1d4-6f51f3af6d99\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-28T20:39:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-28T20:39:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-28T20:39:53Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node 
kube-rbac-proxy-ovn-metrics northd nbdb sbdb ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-28T20:39:53Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-kgrqs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\
":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-kgrqs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-kgrqs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-kgrqs\\\",\\\"readOnly\\\":true,\
\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-kgrqs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-a
pi-access-kgrqs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\
"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-kgrqs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-kgrqs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://45e59dfe364d511e17b8264a31157501de3ed23e7125d79979df83d328242328\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"starte
d\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://45e59dfe364d511e17b8264a31157501de3ed23e7125d79979df83d328242328\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-28T20:39:53Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-28T20:39:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-kgrqs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-28T20:39:53Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-8vmvh\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-28T20:39:56Z is after 2025-08-24T17:21:41Z" Jan 28 20:39:56 crc kubenswrapper[4746]: I0128 20:39:56.084802 4746 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-28T20:39:52Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: 
[networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-28T20:39:52Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-28T20:39:56Z is after 2025-08-24T17:21:41Z" Jan 28 20:39:56 crc kubenswrapper[4746]: I0128 20:39:56.097787 4746 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-28T20:39:53Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-28T20:39:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-28T20:39:53Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5dff3e2075b8bbfb6af77c72362ccc37e1d7810a2d97c096cf501681c98699ab\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-28T20:39:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://772e165efa6d865a669c172958e0aea3112146637e354e4587b7e664454112ad\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af
0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-28T20:39:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-28T20:39:56Z is after 2025-08-24T17:21:41Z" Jan 28 20:39:56 crc kubenswrapper[4746]: I0128 20:39:56.110402 4746 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-gcrxx" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"1c3c93a1-7ddf-4339-9ca3-79f3753943b4\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-28T20:39:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-28T20:39:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-28T20:39:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-28T20:39:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5bd8d8214aa8e85e50747643b6354fb6e026866c926310fb2dd79760f916f468\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-28T20:39:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wn6hr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podI
Ps\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-28T20:39:52Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-gcrxx\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-28T20:39:56Z is after 2025-08-24T17:21:41Z" Jan 28 20:39:56 crc kubenswrapper[4746]: I0128 20:39:56.130465 4746 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-ht6hp" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"beb2b795-6bf4-4d38-89f7-bcb5512c3e61\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-28T20:39:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-28T20:39:53Z\\\",\\\"message\\\":\\\"containers with incomplete status: [bond-cni-plugin routeoverride-cni whereabouts-cni-bincopy whereabouts-cni]\\\",\\\"reason\\\":\\\"ContainersNotInitialized\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-28T20:39:53Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-28T20:39:53Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rr62b\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b2d8bf207108ac2304cd04e34cd9843fcb755ac8610a3f88448538d1e99359f6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b2d8bf207108ac2304cd04e34cd9843fcb755ac8610a3f88448538d1e99359f6\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-28T20:39:54Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-28T20:39:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Di
sabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rr62b\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://40c78d39cb70eba245771f4d85d2c8bc7b77427ea216ad3b89749d24fd550098\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://40c78d39cb70eba245771f4d85d2c8bc7b77427ea216ad3b89749d24fd550098\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-28T20:39:55Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-28T20:39:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rr62b\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodIn
itializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rr62b\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rr62b\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-
release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rr62b\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rr62b\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-28T20:39:53Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-ht6hp\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-28T20:39:56Z is after 2025-08-24T17:21:41Z" Jan 28 20:39:56 crc kubenswrapper[4746]: I0128 20:39:56.142953 4746 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-qhpvf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"cdf26de0-b602-4bdf-b492-65b3b6b31434\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-28T20:39:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-28T20:39:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-28T20:39:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-28T20:39:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f80345dfd4de46e278611370e6c337968cb1ba7ed8275457deea5871704f8367\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-28T20:39:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\
"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jnpsl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-28T20:39:53Z\\\"}}\" for pod \"openshift-multus\"/\"multus-qhpvf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-28T20:39:56Z is after 2025-08-24T17:21:41Z" Jan 28 20:39:56 crc kubenswrapper[4746]: I0128 20:39:56.155052 4746 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch 
status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"64499f0d-5c10-43a9-b479-9b6c7ef2fb9c\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-28T20:39:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-28T20:39:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-28T20:39:32Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-28T20:39:32Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-28T20:39:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://9a304b12b4a1e30cbbbb16aa4f91514f1b1a64c716cf8ac023803a279b310f9c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-28T20:39:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserv
er\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://6af41f85c56be6c24127bec0aee8e02562dbb04bd22e57c6887f6f5e914a7530\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-28T20:39:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://f31e9e80576ec4eb59ec92039601fd94bd323d29493c353f9ec6b9c61187b0d3\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-28T20:39:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://56b250a192cfa698ccd7ac8233b75b9acbf2347549c712b47ccc1b89f144aa83\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc
276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://56b250a192cfa698ccd7ac8233b75b9acbf2347549c712b47ccc1b89f144aa83\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-01-28T20:39:52Z\\\",\\\"message\\\":\\\"le observer\\\\nW0128 20:39:52.695680 1 builder.go:272] unable to get owner reference (falling back to namespace): pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI0128 20:39:52.695807 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0128 20:39:52.696767 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-2500116620/tls.crt::/tmp/serving-cert-2500116620/tls.key\\\\\\\"\\\\nI0128 20:39:52.900914 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI0128 20:39:52.915714 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI0128 20:39:52.915750 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI0128 20:39:52.916035 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI0128 20:39:52.916045 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI0128 20:39:52.925040 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW0128 20:39:52.925098 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0128 20:39:52.925104 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0128 20:39:52.925109 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW0128 20:39:52.925113 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW0128 20:39:52.925116 1 
secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW0128 20:39:52.925119 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nI0128 20:39:52.925416 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nF0128 20:39:52.926926 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-01-28T20:39:47Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 10s restarting failed container=kube-apiserver-check-endpoints pod=kube-apiserver-crc_openshift-kube-apiserver(f4b27818a5e8e43d0dc095d08835c792)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://03a94dee0b9663441e0ecec16b121c5be4e5c557f17c929b8edfb0d7ba00c3d5\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-28T20:39:35Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c5808a729190035a73529ac58efb0790f2d9c3b98c5f1a8017389e4ca1738c4c\\\",\\\"image\\\":\\\"quay.io/openshift-re
lease-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://c5808a729190035a73529ac58efb0790f2d9c3b98c5f1a8017389e4ca1738c4c\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-28T20:39:33Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-28T20:39:33Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-28T20:39:32Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-28T20:39:56Z is after 2025-08-24T17:21:41Z" Jan 28 20:39:56 crc kubenswrapper[4746]: I0128 20:39:56.171732 4746 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-28T20:39:54Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-28T20:39:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-28T20:39:54Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://2f64131d62eca6c3a42e9177d14fa39ed6435e536f60ef5d25ff6bf061d2cc0a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-28T20:39:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: 
x509: certificate has expired or is not yet valid: current time 2026-01-28T20:39:56Z is after 2025-08-24T17:21:41Z" Jan 28 20:39:56 crc kubenswrapper[4746]: I0128 20:39:56.177315 4746 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-image-registry/node-ca-d8rwq"] Jan 28 20:39:56 crc kubenswrapper[4746]: I0128 20:39:56.177717 4746 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-image-registry/node-ca-d8rwq" Jan 28 20:39:56 crc kubenswrapper[4746]: I0128 20:39:56.179723 4746 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-image-registry"/"image-registry-certificates" Jan 28 20:39:56 crc kubenswrapper[4746]: I0128 20:39:56.179772 4746 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-image-registry"/"openshift-service-ca.crt" Jan 28 20:39:56 crc kubenswrapper[4746]: I0128 20:39:56.179723 4746 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-image-registry"/"node-ca-dockercfg-4777p" Jan 28 20:39:56 crc kubenswrapper[4746]: I0128 20:39:56.184568 4746 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-image-registry"/"kube-root-ca.crt" Jan 28 20:39:56 crc kubenswrapper[4746]: I0128 20:39:56.189284 4746 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-28T20:39:52Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: 
[iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-28T20:39:52Z\\\",\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-28T20:39:56Z is after 2025-08-24T17:21:41Z" Jan 28 20:39:56 crc kubenswrapper[4746]: I0128 20:39:56.220010 4746 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-6wrnw" 
err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"6dc8b546-9734-4082-b2b3-2bafe3f1564d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-28T20:39:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-28T20:39:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-28T20:39:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-28T20:39:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://50ec1ea913ec63a096032bd968bda5c5c8879e16a08e6a57727750517d9af038\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-28T20:39:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-z84f9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://66d250466155
027405bf90c4e5ed09388238d4cee63604e28486a16778f9d188\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-28T20:39:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-z84f9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-28T20:39:53Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-6wrnw\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-28T20:39:56Z is after 2025-08-24T17:21:41Z" Jan 28 20:39:56 crc kubenswrapper[4746]: I0128 20:39:56.263861 4746 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"950d71b3-9b51-43dc-814a-d6a838723f78\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-28T20:39:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-28T20:39:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-28T20:39:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-28T20:39:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-28T20:39:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0411ecf7c85f5397d751adb7d4905977dcbd3d2e169c56ebbe26017192765602\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-28T20:39:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://319cbcaa9981b3c79eb0c8f970f0ad776fb255db10c72c9b6a13469906e9e5ec\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-
art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-28T20:39:33Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://eefda7da35cd9ff176512b5d1224ccadcb5debd0035a49822e82a9757d4a3f91\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-28T20:39:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://145f4f882924f06172df81d94fc9bd683ea1fc04889de5f6cfb5caece8227fd0\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"started
At\\\":\\\"2026-01-28T20:39:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-28T20:39:32Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-28T20:39:56Z is after 2025-08-24T17:21:41Z" Jan 28 20:39:56 crc kubenswrapper[4746]: I0128 20:39:56.289545 4746 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serviceca\" (UniqueName: \"kubernetes.io/configmap/405572d8-50d2-4d7a-adf0-d8d6adea31ca-serviceca\") pod \"node-ca-d8rwq\" (UID: \"405572d8-50d2-4d7a-adf0-d8d6adea31ca\") " pod="openshift-image-registry/node-ca-d8rwq" Jan 28 20:39:56 crc kubenswrapper[4746]: I0128 20:39:56.289594 4746 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/405572d8-50d2-4d7a-adf0-d8d6adea31ca-host\") pod \"node-ca-d8rwq\" (UID: \"405572d8-50d2-4d7a-adf0-d8d6adea31ca\") " pod="openshift-image-registry/node-ca-d8rwq" Jan 28 20:39:56 crc kubenswrapper[4746]: I0128 20:39:56.289629 4746 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-mb4h4\" (UniqueName: \"kubernetes.io/projected/405572d8-50d2-4d7a-adf0-d8d6adea31ca-kube-api-access-mb4h4\") pod \"node-ca-d8rwq\" (UID: 
\"405572d8-50d2-4d7a-adf0-d8d6adea31ca\") " pod="openshift-image-registry/node-ca-d8rwq" Jan 28 20:39:56 crc kubenswrapper[4746]: I0128 20:39:56.303625 4746 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-28T20:39:52Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-28T20:39:52Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-28T20:39:56Z is after 2025-08-24T17:21:41Z" Jan 28 20:39:56 crc kubenswrapper[4746]: I0128 20:39:56.339455 4746 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-28T20:39:52Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-28T20:39:52Z\\\",\\\"message\\\":\\\"containers with unready status: 
[check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-28T20:39:56Z is after 2025-08-24T17:21:41Z" Jan 28 20:39:56 crc kubenswrapper[4746]: I0128 20:39:56.379052 4746 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-28T20:39:52Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-28T20:39:52Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-28T20:39:56Z is after 2025-08-24T17:21:41Z" Jan 28 20:39:56 crc kubenswrapper[4746]: I0128 20:39:56.390669 4746 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Jan 28 20:39:56 crc kubenswrapper[4746]: I0128 20:39:56.390803 4746 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-mb4h4\" (UniqueName: \"kubernetes.io/projected/405572d8-50d2-4d7a-adf0-d8d6adea31ca-kube-api-access-mb4h4\") pod \"node-ca-d8rwq\" (UID: \"405572d8-50d2-4d7a-adf0-d8d6adea31ca\") " pod="openshift-image-registry/node-ca-d8rwq" Jan 28 20:39:56 crc kubenswrapper[4746]: I0128 20:39:56.390856 4746 reconciler_common.go:218] "operationExecutor.MountVolume started for volume 
\"kube-api-access-cqllr\" (UniqueName: \"kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr\") pod \"network-check-target-xd92c\" (UID: \"3b6479f0-333b-4a96-9adf-2099afdc2447\") " pod="openshift-network-diagnostics/network-check-target-xd92c" Jan 28 20:39:56 crc kubenswrapper[4746]: I0128 20:39:56.390883 4746 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serviceca\" (UniqueName: \"kubernetes.io/configmap/405572d8-50d2-4d7a-adf0-d8d6adea31ca-serviceca\") pod \"node-ca-d8rwq\" (UID: \"405572d8-50d2-4d7a-adf0-d8d6adea31ca\") " pod="openshift-image-registry/node-ca-d8rwq" Jan 28 20:39:56 crc kubenswrapper[4746]: I0128 20:39:56.390913 4746 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"networking-console-plugin-cert\" (UniqueName: \"kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Jan 28 20:39:56 crc kubenswrapper[4746]: I0128 20:39:56.390937 4746 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/405572d8-50d2-4d7a-adf0-d8d6adea31ca-host\") pod \"node-ca-d8rwq\" (UID: \"405572d8-50d2-4d7a-adf0-d8d6adea31ca\") " pod="openshift-image-registry/node-ca-d8rwq" Jan 28 20:39:56 crc kubenswrapper[4746]: I0128 20:39:56.390962 4746 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nginx-conf\" (UniqueName: \"kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Jan 28 20:39:56 crc kubenswrapper[4746]: E0128 20:39:56.391055 4746 configmap.go:193] Couldn't get configMap 
openshift-network-console/networking-console-plugin: object "openshift-network-console"/"networking-console-plugin" not registered Jan 28 20:39:56 crc kubenswrapper[4746]: E0128 20:39:56.391133 4746 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. No retries permitted until 2026-01-28 20:40:00.391114739 +0000 UTC m=+28.347301103 (durationBeforeRetry 4s). Error: MountVolume.SetUp failed for volume "nginx-conf" (UniqueName: "kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin" not registered Jan 28 20:39:56 crc kubenswrapper[4746]: E0128 20:39:56.391199 4746 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-01-28 20:40:00.391189121 +0000 UTC m=+28.347375475 (durationBeforeRetry 4s). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Jan 28 20:39:56 crc kubenswrapper[4746]: E0128 20:39:56.391387 4746 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Jan 28 20:39:56 crc kubenswrapper[4746]: E0128 20:39:56.391404 4746 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Jan 28 20:39:56 crc kubenswrapper[4746]: E0128 20:39:56.391417 4746 projected.go:194] Error preparing data for projected volume kube-api-access-cqllr for pod openshift-network-diagnostics/network-check-target-xd92c: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Jan 28 20:39:56 crc kubenswrapper[4746]: E0128 20:39:56.391450 4746 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr podName:3b6479f0-333b-4a96-9adf-2099afdc2447 nodeName:}" failed. No retries permitted until 2026-01-28 20:40:00.391440018 +0000 UTC m=+28.347626372 (durationBeforeRetry 4s). 
Error: MountVolume.SetUp failed for volume "kube-api-access-cqllr" (UniqueName: "kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr") pod "network-check-target-xd92c" (UID: "3b6479f0-333b-4a96-9adf-2099afdc2447") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Jan 28 20:39:56 crc kubenswrapper[4746]: I0128 20:39:56.391489 4746 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host\" (UniqueName: \"kubernetes.io/host-path/405572d8-50d2-4d7a-adf0-d8d6adea31ca-host\") pod \"node-ca-d8rwq\" (UID: \"405572d8-50d2-4d7a-adf0-d8d6adea31ca\") " pod="openshift-image-registry/node-ca-d8rwq" Jan 28 20:39:56 crc kubenswrapper[4746]: E0128 20:39:56.391506 4746 secret.go:188] Couldn't get secret openshift-network-console/networking-console-plugin-cert: object "openshift-network-console"/"networking-console-plugin-cert" not registered Jan 28 20:39:56 crc kubenswrapper[4746]: E0128 20:39:56.391602 4746 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. No retries permitted until 2026-01-28 20:40:00.391580122 +0000 UTC m=+28.347766536 (durationBeforeRetry 4s). 
Error: MountVolume.SetUp failed for volume "networking-console-plugin-cert" (UniqueName: "kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin-cert" not registered Jan 28 20:39:56 crc kubenswrapper[4746]: I0128 20:39:56.393204 4746 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serviceca\" (UniqueName: \"kubernetes.io/configmap/405572d8-50d2-4d7a-adf0-d8d6adea31ca-serviceca\") pod \"node-ca-d8rwq\" (UID: \"405572d8-50d2-4d7a-adf0-d8d6adea31ca\") " pod="openshift-image-registry/node-ca-d8rwq" Jan 28 20:39:56 crc kubenswrapper[4746]: I0128 20:39:56.432231 4746 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-mb4h4\" (UniqueName: \"kubernetes.io/projected/405572d8-50d2-4d7a-adf0-d8d6adea31ca-kube-api-access-mb4h4\") pod \"node-ca-d8rwq\" (UID: \"405572d8-50d2-4d7a-adf0-d8d6adea31ca\") " pod="openshift-image-registry/node-ca-d8rwq" Jan 28 20:39:56 crc kubenswrapper[4746]: I0128 20:39:56.441564 4746 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-28T20:39:53Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-28T20:39:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-28T20:39:53Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5dff3e2075b8bbfb6af77c72362ccc37e1d7810a2d97c096cf501681c98699ab\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-28T20:39:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://772e165efa6d865a669c172958e0aea3112146637e354e4587b7e664454112ad\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af
0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-28T20:39:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-28T20:39:56Z is after 2025-08-24T17:21:41Z" Jan 28 20:39:56 crc kubenswrapper[4746]: I0128 20:39:56.479677 4746 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-gcrxx" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"1c3c93a1-7ddf-4339-9ca3-79f3753943b4\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-28T20:39:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-28T20:39:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-28T20:39:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-28T20:39:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5bd8d8214aa8e85e50747643b6354fb6e026866c926310fb2dd79760f916f468\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-28T20:39:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wn6hr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podI
Ps\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-28T20:39:52Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-gcrxx\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-28T20:39:56Z is after 2025-08-24T17:21:41Z" Jan 28 20:39:56 crc kubenswrapper[4746]: I0128 20:39:56.491203 4746 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-image-registry/node-ca-d8rwq" Jan 28 20:39:56 crc kubenswrapper[4746]: I0128 20:39:56.491559 4746 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-s2dwl\" (UniqueName: \"kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl\") pod \"network-check-source-55646444c4-trplf\" (UID: \"9d751cbb-f2e2-430d-9754-c882a5e924a5\") " pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Jan 28 20:39:56 crc kubenswrapper[4746]: E0128 20:39:56.491879 4746 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Jan 28 20:39:56 crc kubenswrapper[4746]: E0128 20:39:56.491932 4746 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Jan 28 20:39:56 crc kubenswrapper[4746]: E0128 20:39:56.491951 4746 projected.go:194] Error preparing data for projected volume kube-api-access-s2dwl for pod openshift-network-diagnostics/network-check-source-55646444c4-trplf: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Jan 28 20:39:56 crc kubenswrapper[4746]: E0128 
20:39:56.492044 4746 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl podName:9d751cbb-f2e2-430d-9754-c882a5e924a5 nodeName:}" failed. No retries permitted until 2026-01-28 20:40:00.492012768 +0000 UTC m=+28.448199122 (durationBeforeRetry 4s). Error: MountVolume.SetUp failed for volume "kube-api-access-s2dwl" (UniqueName: "kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl") pod "network-check-source-55646444c4-trplf" (UID: "9d751cbb-f2e2-430d-9754-c882a5e924a5") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Jan 28 20:39:56 crc kubenswrapper[4746]: W0128 20:39:56.506427 4746 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod405572d8_50d2_4d7a_adf0_d8d6adea31ca.slice/crio-634bf553a58ba3d9bed120f3a3bea199d6eb56a622c54e026a8ccc2bafb854cf WatchSource:0}: Error finding container 634bf553a58ba3d9bed120f3a3bea199d6eb56a622c54e026a8ccc2bafb854cf: Status 404 returned error can't find the container with id 634bf553a58ba3d9bed120f3a3bea199d6eb56a622c54e026a8ccc2bafb854cf Jan 28 20:39:56 crc kubenswrapper[4746]: I0128 20:39:56.525813 4746 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-8vmvh" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"c4d15639-62fb-41b7-a1d4-6f51f3af6d99\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-28T20:39:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-28T20:39:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-28T20:39:53Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-28T20:39:53Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb 
ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-kgrqs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-kgrqs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\
\":\\\"nbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-kgrqs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-kgrqs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\
\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-kgrqs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-kgrqs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"wa
iting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-kgrqs\\\",\\\"read
Only\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-kgrqs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://45e59dfe364d511e17b8264a31157501de3ed23e7125d79979df83d328242328\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://45e59dfe364d511e17b8264a31157501de3ed23e7125d79979df83d328242328\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-28T20:39:53Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-28T20:39:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mou
ntPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-kgrqs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-28T20:39:53Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-8vmvh\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-28T20:39:56Z is after 2025-08-24T17:21:41Z" Jan 28 20:39:56 crc kubenswrapper[4746]: I0128 20:39:56.561099 4746 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-28T20:39:54Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-28T20:39:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-28T20:39:54Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://2f64131d62eca6c3a42e9177d14fa39ed6435e536f60ef5d25ff6bf061d2cc0a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\"
:{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-28T20:39:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-28T20:39:56Z is after 2025-08-24T17:21:41Z" Jan 28 20:39:56 crc kubenswrapper[4746]: I0128 20:39:56.599539 4746 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-28T20:39:52Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-28T20:39:52Z\\\",\\\"message\\\":\\\"containers with unready status: 
[iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-28T20:39:56Z is after 2025-08-24T17:21:41Z" Jan 28 20:39:56 crc kubenswrapper[4746]: I0128 20:39:56.639141 4746 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-6wrnw" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"6dc8b546-9734-4082-b2b3-2bafe3f1564d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-28T20:39:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-28T20:39:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-28T20:39:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-28T20:39:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://50ec1ea913ec63a096032bd968bda5c5c8879e16a08e6a57727750517d9af038\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-28T20:39:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-z84f9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://66d250466155027405bf90c4e5ed09388238d4ce
e63604e28486a16778f9d188\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-28T20:39:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-z84f9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-28T20:39:53Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-6wrnw\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-28T20:39:56Z is after 2025-08-24T17:21:41Z" Jan 28 20:39:56 crc kubenswrapper[4746]: I0128 20:39:56.682261 4746 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-ht6hp" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"beb2b795-6bf4-4d38-89f7-bcb5512c3e61\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-28T20:39:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-28T20:39:53Z\\\",\\\"message\\\":\\\"containers with incomplete status: [bond-cni-plugin routeoverride-cni whereabouts-cni-bincopy whereabouts-cni]\\\",\\\"reason\\\":\\\"ContainersNotInitialized\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-28T20:39:53Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-28T20:39:53Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rr62b\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b2d8bf207108ac2304cd04e34cd9843fcb755ac8610a3f88448538d1e99359f6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b2d8bf207108ac2304cd04e34cd9843fcb755ac8610a3f88448538d1e99359f6\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-28T20:39:54Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-28T20:39:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Di
sabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rr62b\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://40c78d39cb70eba245771f4d85d2c8bc7b77427ea216ad3b89749d24fd550098\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://40c78d39cb70eba245771f4d85d2c8bc7b77427ea216ad3b89749d24fd550098\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-28T20:39:55Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-28T20:39:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rr62b\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodIn
itializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rr62b\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rr62b\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-
release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rr62b\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rr62b\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-28T20:39:53Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-ht6hp\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-28T20:39:56Z is after 2025-08-24T17:21:41Z" Jan 28 20:39:56 crc kubenswrapper[4746]: I0128 20:39:56.723018 4746 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-qhpvf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"cdf26de0-b602-4bdf-b492-65b3b6b31434\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-28T20:39:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-28T20:39:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-28T20:39:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-28T20:39:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f80345dfd4de46e278611370e6c337968cb1ba7ed8275457deea5871704f8367\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-28T20:39:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\
"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jnpsl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-28T20:39:53Z\\\"}}\" for pod \"openshift-multus\"/\"multus-qhpvf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-28T20:39:56Z is after 2025-08-24T17:21:41Z" Jan 28 20:39:56 crc kubenswrapper[4746]: I0128 20:39:56.763719 4746 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch 
status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"64499f0d-5c10-43a9-b479-9b6c7ef2fb9c\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-28T20:39:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-28T20:39:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-28T20:39:32Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-28T20:39:32Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-28T20:39:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://9a304b12b4a1e30cbbbb16aa4f91514f1b1a64c716cf8ac023803a279b310f9c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-28T20:39:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserv
er\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://6af41f85c56be6c24127bec0aee8e02562dbb04bd22e57c6887f6f5e914a7530\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-28T20:39:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://f31e9e80576ec4eb59ec92039601fd94bd323d29493c353f9ec6b9c61187b0d3\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-28T20:39:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://56b250a192cfa698ccd7ac8233b75b9acbf2347549c712b47ccc1b89f144aa83\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc
276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://56b250a192cfa698ccd7ac8233b75b9acbf2347549c712b47ccc1b89f144aa83\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-01-28T20:39:52Z\\\",\\\"message\\\":\\\"le observer\\\\nW0128 20:39:52.695680 1 builder.go:272] unable to get owner reference (falling back to namespace): pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI0128 20:39:52.695807 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0128 20:39:52.696767 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-2500116620/tls.crt::/tmp/serving-cert-2500116620/tls.key\\\\\\\"\\\\nI0128 20:39:52.900914 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI0128 20:39:52.915714 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI0128 20:39:52.915750 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI0128 20:39:52.916035 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI0128 20:39:52.916045 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI0128 20:39:52.925040 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW0128 20:39:52.925098 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0128 20:39:52.925104 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0128 20:39:52.925109 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW0128 20:39:52.925113 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW0128 20:39:52.925116 1 
secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW0128 20:39:52.925119 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nI0128 20:39:52.925416 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nF0128 20:39:52.926926 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-01-28T20:39:47Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 10s restarting failed container=kube-apiserver-check-endpoints pod=kube-apiserver-crc_openshift-kube-apiserver(f4b27818a5e8e43d0dc095d08835c792)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://03a94dee0b9663441e0ecec16b121c5be4e5c557f17c929b8edfb0d7ba00c3d5\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-28T20:39:35Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c5808a729190035a73529ac58efb0790f2d9c3b98c5f1a8017389e4ca1738c4c\\\",\\\"image\\\":\\\"quay.io/openshift-re
lease-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://c5808a729190035a73529ac58efb0790f2d9c3b98c5f1a8017389e4ca1738c4c\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-28T20:39:33Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-28T20:39:33Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-28T20:39:32Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-28T20:39:56Z is after 2025-08-24T17:21:41Z" Jan 28 20:39:56 crc kubenswrapper[4746]: I0128 20:39:56.813231 4746 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2025-12-29 21:48:24.399171089 +0000 UTC Jan 28 20:39:56 crc kubenswrapper[4746]: I0128 20:39:56.818133 4746 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-d8rwq" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"405572d8-50d2-4d7a-adf0-d8d6adea31ca\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-28T20:39:56Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-28T20:39:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-28T20:39:56Z\\\",\\\"message\\\":\\\"containers with unready status: [node-ca]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-28T20:39:56Z\\\",\\\"message\\\":\\\"containers with unready status: 
[node-ca]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mb4h4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-28T20:39:56Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-d8rwq\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-28T20:39:56Z is after 2025-08-24T17:21:41Z" Jan 28 20:39:56 crc kubenswrapper[4746]: I0128 20:39:56.835177 4746 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Jan 28 20:39:56 crc kubenswrapper[4746]: I0128 20:39:56.835239 4746 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Jan 28 20:39:56 crc kubenswrapper[4746]: E0128 20:39:56.835306 4746 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Jan 28 20:39:56 crc kubenswrapper[4746]: E0128 20:39:56.835382 4746 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Jan 28 20:39:56 crc kubenswrapper[4746]: I0128 20:39:56.848515 4746 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"950d71b3-9b51-43dc-814a-d6a838723f78\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-28T20:39:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-28T20:39:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-28T20:39:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-28T20:39:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-28T20:39:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0411ecf7c85f5397d751adb7d4905977dcbd3d2e169c56ebbe26017192765602\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-28T20:39:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://319cbcaa9981b3c79eb0c8f970f0ad776fb255db10c72c9b6a13469906e9e5ec\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-
art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-28T20:39:33Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://eefda7da35cd9ff176512b5d1224ccadcb5debd0035a49822e82a9757d4a3f91\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-28T20:39:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://145f4f882924f06172df81d94fc9bd683ea1fc04889de5f6cfb5caece8227fd0\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"started
At\\\":\\\"2026-01-28T20:39:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-28T20:39:32Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-28T20:39:56Z is after 2025-08-24T17:21:41Z" Jan 28 20:39:57 crc kubenswrapper[4746]: I0128 20:39:57.032412 4746 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-8vmvh" event={"ID":"c4d15639-62fb-41b7-a1d4-6f51f3af6d99","Type":"ContainerStarted","Data":"cdd60109dba5f45418dd5bc29c649a50aae8479fa77fc3dc50ca4f7c3042e3fc"} Jan 28 20:39:57 crc kubenswrapper[4746]: I0128 20:39:57.034230 4746 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/node-ca-d8rwq" event={"ID":"405572d8-50d2-4d7a-adf0-d8d6adea31ca","Type":"ContainerStarted","Data":"eba305a172c49420674e97d32590a7b2e4c7c2cd7406257508e0636cf42ea244"} Jan 28 20:39:57 crc kubenswrapper[4746]: I0128 20:39:57.034297 4746 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/node-ca-d8rwq" event={"ID":"405572d8-50d2-4d7a-adf0-d8d6adea31ca","Type":"ContainerStarted","Data":"634bf553a58ba3d9bed120f3a3bea199d6eb56a622c54e026a8ccc2bafb854cf"} Jan 28 20:39:57 crc kubenswrapper[4746]: I0128 20:39:57.037151 4746 generic.go:334] "Generic (PLEG): container finished" 
podID="beb2b795-6bf4-4d38-89f7-bcb5512c3e61" containerID="498028ca0bd260ca2c4d7b0231ea54a5a6ff5b302f720fbd209a4f640507258b" exitCode=0 Jan 28 20:39:57 crc kubenswrapper[4746]: I0128 20:39:57.037205 4746 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-ht6hp" event={"ID":"beb2b795-6bf4-4d38-89f7-bcb5512c3e61","Type":"ContainerDied","Data":"498028ca0bd260ca2c4d7b0231ea54a5a6ff5b302f720fbd209a4f640507258b"} Jan 28 20:39:57 crc kubenswrapper[4746]: I0128 20:39:57.055958 4746 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"950d71b3-9b51-43dc-814a-d6a838723f78\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-28T20:39:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-28T20:39:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-28T20:39:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-28T20:39:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-28T20:39:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0411ecf7c85f5397d751adb7d4905977dcbd3d2e169c56ebbe26017192765602\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-
policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-28T20:39:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://319cbcaa9981b3c79eb0c8f970f0ad776fb255db10c72c9b6a13469906e9e5ec\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-28T20:39:33Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://eefda7da35cd9ff176512b5d1224ccadcb5debd0035a49822e82a9757d4a3f91\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-28T20:39:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kuberne
tes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://145f4f882924f06172df81d94fc9bd683ea1fc04889de5f6cfb5caece8227fd0\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-28T20:39:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-28T20:39:32Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-28T20:39:57Z is after 2025-08-24T17:21:41Z" Jan 28 20:39:57 crc kubenswrapper[4746]: I0128 20:39:57.068803 4746 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-d8rwq" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"405572d8-50d2-4d7a-adf0-d8d6adea31ca\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-28T20:39:57Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-28T20:39:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-28T20:39:57Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-28T20:39:57Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://eba305a172c49420674e97d32590a7b2e4c7c2cd7406257508e0636cf42ea244\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-28T20:39:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mb4h4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phas
e\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-28T20:39:56Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-d8rwq\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-28T20:39:57Z is after 2025-08-24T17:21:41Z" Jan 28 20:39:57 crc kubenswrapper[4746]: I0128 20:39:57.083049 4746 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-28T20:39:52Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-28T20:39:52Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-28T20:39:57Z is after 2025-08-24T17:21:41Z" Jan 28 20:39:57 crc kubenswrapper[4746]: I0128 20:39:57.099024 4746 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-28T20:39:52Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-28T20:39:52Z\\\",\\\"message\\\":\\\"containers with unready status: 
[network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-28T20:39:57Z is after 2025-08-24T17:21:41Z" Jan 28 20:39:57 crc kubenswrapper[4746]: I0128 20:39:57.111663 4746 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-28T20:39:52Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-28T20:39:52Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-28T20:39:57Z is after 2025-08-24T17:21:41Z" Jan 28 20:39:57 crc kubenswrapper[4746]: I0128 20:39:57.139230 4746 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-28T20:39:53Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-28T20:39:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-28T20:39:53Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5dff3e2075b8bbfb6af77c72362ccc37e1d7810a2d97c096cf501681c98699ab\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-28T20:39:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://772e165efa6d865a669c172958e0aea3112146637e354e4587b7e664454112ad\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af
0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-28T20:39:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-28T20:39:57Z is after 2025-08-24T17:21:41Z" Jan 28 20:39:57 crc kubenswrapper[4746]: I0128 20:39:57.201282 4746 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-gcrxx" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"1c3c93a1-7ddf-4339-9ca3-79f3753943b4\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-28T20:39:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-28T20:39:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-28T20:39:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-28T20:39:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5bd8d8214aa8e85e50747643b6354fb6e026866c926310fb2dd79760f916f468\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-28T20:39:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wn6hr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podI
Ps\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-28T20:39:52Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-gcrxx\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-28T20:39:57Z is after 2025-08-24T17:21:41Z" Jan 28 20:39:57 crc kubenswrapper[4746]: I0128 20:39:57.226578 4746 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-8vmvh" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"c4d15639-62fb-41b7-a1d4-6f51f3af6d99\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-28T20:39:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-28T20:39:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-28T20:39:53Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-28T20:39:53Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb 
ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-kgrqs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-kgrqs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\
\":\\\"nbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-kgrqs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-kgrqs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\
\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-kgrqs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-kgrqs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"wa
iting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-kgrqs\\\",\\\"read
Only\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-kgrqs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://45e59dfe364d511e17b8264a31157501de3ed23e7125d79979df83d328242328\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://45e59dfe364d511e17b8264a31157501de3ed23e7125d79979df83d328242328\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-28T20:39:53Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-28T20:39:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mou
ntPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-kgrqs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-28T20:39:53Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-8vmvh\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-28T20:39:57Z is after 2025-08-24T17:21:41Z" Jan 28 20:39:57 crc kubenswrapper[4746]: I0128 20:39:57.236328 4746 reflector.go:368] Caches populated for *v1.RuntimeClass from k8s.io/client-go/informers/factory.go:160 Jan 28 20:39:57 crc kubenswrapper[4746]: I0128 20:39:57.241072 4746 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"64499f0d-5c10-43a9-b479-9b6c7ef2fb9c\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-28T20:39:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-28T20:39:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-28T20:39:32Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-28T20:39:32Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-28T20:39:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://9a304b12b4a1e30cbbbb16aa4f91514f1b1a64c716cf8ac023803a279b310f9c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-28T20:39:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://6af41f85c56be6c24127bec0aee8e02562dbb04bd22e57c6887f6f5e914a7530\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-28T20:39:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri
-o://f31e9e80576ec4eb59ec92039601fd94bd323d29493c353f9ec6b9c61187b0d3\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-28T20:39:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://56b250a192cfa698ccd7ac8233b75b9acbf2347549c712b47ccc1b89f144aa83\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://56b250a192cfa698ccd7ac8233b75b9acbf2347549c712b47ccc1b89f144aa83\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-01-28T20:39:52Z\\\",\\\"message\\\":\\\"le observer\\\\nW0128 20:39:52.695680 1 builder.go:272] unable to get owner reference (falling back to namespace): pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI0128 20:39:52.695807 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0128 20:39:52.696767 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" 
name=\\\\\\\"serving-cert::/tmp/serving-cert-2500116620/tls.crt::/tmp/serving-cert-2500116620/tls.key\\\\\\\"\\\\nI0128 20:39:52.900914 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI0128 20:39:52.915714 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI0128 20:39:52.915750 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI0128 20:39:52.916035 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI0128 20:39:52.916045 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI0128 20:39:52.925040 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW0128 20:39:52.925098 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0128 20:39:52.925104 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0128 20:39:52.925109 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW0128 20:39:52.925113 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW0128 20:39:52.925116 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW0128 20:39:52.925119 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nI0128 20:39:52.925416 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nF0128 20:39:52.926926 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-01-28T20:39:47Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 10s restarting failed 
container=kube-apiserver-check-endpoints pod=kube-apiserver-crc_openshift-kube-apiserver(f4b27818a5e8e43d0dc095d08835c792)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://03a94dee0b9663441e0ecec16b121c5be4e5c557f17c929b8edfb0d7ba00c3d5\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-28T20:39:35Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c5808a729190035a73529ac58efb0790f2d9c3b98c5f1a8017389e4ca1738c4c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://c5808a729190035a73529ac58efb0790f2d9c3b98c5f1a8017389e4ca1738c4c\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-28T20:39:33Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-28T20:39:33Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"n
ame\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-28T20:39:32Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-28T20:39:57Z is after 2025-08-24T17:21:41Z" Jan 28 20:39:57 crc kubenswrapper[4746]: I0128 20:39:57.263301 4746 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-28T20:39:54Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-28T20:39:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-28T20:39:54Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://2f64131d62eca6c3a42e9177d14fa39ed6435e536f60ef5d25ff6bf061d2cc0a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\
":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-28T20:39:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-28T20:39:57Z is after 2025-08-24T17:21:41Z" Jan 28 20:39:57 crc kubenswrapper[4746]: I0128 20:39:57.300939 4746 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-28T20:39:52Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-28T20:39:52Z\\\",\\\"message\\\":\\\"containers with unready status: 
[iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-28T20:39:57Z is after 2025-08-24T17:21:41Z" Jan 28 20:39:57 crc kubenswrapper[4746]: I0128 20:39:57.342377 4746 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-6wrnw" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"6dc8b546-9734-4082-b2b3-2bafe3f1564d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-28T20:39:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-28T20:39:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-28T20:39:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-28T20:39:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://50ec1ea913ec63a096032bd968bda5c5c8879e16a08e6a57727750517d9af038\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-28T20:39:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-z84f9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://66d250466155027405bf90c4e5ed09388238d4ce
e63604e28486a16778f9d188\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-28T20:39:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-z84f9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-28T20:39:53Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-6wrnw\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-28T20:39:57Z is after 2025-08-24T17:21:41Z" Jan 28 20:39:57 crc kubenswrapper[4746]: I0128 20:39:57.382339 4746 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-ht6hp" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"beb2b795-6bf4-4d38-89f7-bcb5512c3e61\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-28T20:39:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-28T20:39:53Z\\\",\\\"message\\\":\\\"containers with incomplete status: [bond-cni-plugin routeoverride-cni whereabouts-cni-bincopy whereabouts-cni]\\\",\\\"reason\\\":\\\"ContainersNotInitialized\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-28T20:39:53Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-28T20:39:53Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rr62b\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b2d8bf207108ac2304cd04e34cd9843fcb755ac8610a3f88448538d1e99359f6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b2d8bf207108ac2304cd04e34cd9843fcb755ac8610a3f88448538d1e99359f6\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-28T20:39:54Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-28T20:39:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Di
sabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rr62b\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://40c78d39cb70eba245771f4d85d2c8bc7b77427ea216ad3b89749d24fd550098\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://40c78d39cb70eba245771f4d85d2c8bc7b77427ea216ad3b89749d24fd550098\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-28T20:39:55Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-28T20:39:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rr62b\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodIn
itializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rr62b\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rr62b\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-
release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rr62b\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rr62b\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-28T20:39:53Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-ht6hp\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-28T20:39:57Z is after 2025-08-24T17:21:41Z" Jan 28 20:39:57 crc kubenswrapper[4746]: I0128 20:39:57.423974 4746 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-qhpvf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"cdf26de0-b602-4bdf-b492-65b3b6b31434\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-28T20:39:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-28T20:39:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-28T20:39:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-28T20:39:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f80345dfd4de46e278611370e6c337968cb1ba7ed8275457deea5871704f8367\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-28T20:39:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\
"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jnpsl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-28T20:39:53Z\\\"}}\" for pod \"openshift-multus\"/\"multus-qhpvf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-28T20:39:57Z is after 2025-08-24T17:21:41Z" Jan 28 20:39:57 crc kubenswrapper[4746]: I0128 20:39:57.460328 4746 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" 
err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-28T20:39:52Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-28T20:39:52Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-28T20:39:57Z is after 2025-08-24T17:21:41Z" Jan 28 20:39:57 crc kubenswrapper[4746]: I0128 20:39:57.501919 4746 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-28T20:39:52Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-28T20:39:52Z\\\",\\\"message\\\":\\\"containers with unready status: 
[check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-28T20:39:57Z is after 2025-08-24T17:21:41Z" Jan 28 20:39:57 crc kubenswrapper[4746]: I0128 20:39:57.541275 4746 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-28T20:39:52Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-28T20:39:52Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-28T20:39:57Z is after 2025-08-24T17:21:41Z" Jan 28 20:39:57 crc kubenswrapper[4746]: I0128 20:39:57.581176 4746 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-28T20:39:53Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-28T20:39:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-28T20:39:53Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5dff3e2075b8bbfb6af77c72362ccc37e1d7810a2d97c096cf501681c98699ab\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-28T20:39:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://772e165efa6d865a669c172958e0aea3112146637e354e4587b7e664454112ad\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af
0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-28T20:39:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-28T20:39:57Z is after 2025-08-24T17:21:41Z" Jan 28 20:39:57 crc kubenswrapper[4746]: I0128 20:39:57.622762 4746 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-gcrxx" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"1c3c93a1-7ddf-4339-9ca3-79f3753943b4\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-28T20:39:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-28T20:39:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-28T20:39:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-28T20:39:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5bd8d8214aa8e85e50747643b6354fb6e026866c926310fb2dd79760f916f468\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-28T20:39:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wn6hr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podI
Ps\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-28T20:39:52Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-gcrxx\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-28T20:39:57Z is after 2025-08-24T17:21:41Z" Jan 28 20:39:57 crc kubenswrapper[4746]: I0128 20:39:57.665364 4746 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-8vmvh" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"c4d15639-62fb-41b7-a1d4-6f51f3af6d99\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-28T20:39:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-28T20:39:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-28T20:39:53Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-28T20:39:53Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb 
ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-kgrqs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-kgrqs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\
\":\\\"nbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-kgrqs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-kgrqs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\
\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-kgrqs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-kgrqs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"wa
iting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-kgrqs\\\",\\\"read
Only\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-kgrqs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://45e59dfe364d511e17b8264a31157501de3ed23e7125d79979df83d328242328\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://45e59dfe364d511e17b8264a31157501de3ed23e7125d79979df83d328242328\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-28T20:39:53Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-28T20:39:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mou
ntPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-kgrqs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-28T20:39:53Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-8vmvh\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-28T20:39:57Z is after 2025-08-24T17:21:41Z" Jan 28 20:39:57 crc kubenswrapper[4746]: I0128 20:39:57.703091 4746 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-28T20:39:54Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-28T20:39:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-28T20:39:54Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://2f64131d62eca6c3a42e9177d14fa39ed6435e536f60ef5d25ff6bf061d2cc0a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\"
:{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-28T20:39:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-28T20:39:57Z is after 2025-08-24T17:21:41Z" Jan 28 20:39:57 crc kubenswrapper[4746]: I0128 20:39:57.740298 4746 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-28T20:39:57Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-28T20:39:57Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-28T20:39:57Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://83a275ec9868f2fd9e24135d5a05c27caf843340c144070952291bb6a3715035\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-28T20:39:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has 
expired or is not yet valid: current time 2026-01-28T20:39:57Z is after 2025-08-24T17:21:41Z" Jan 28 20:39:57 crc kubenswrapper[4746]: I0128 20:39:57.779283 4746 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-6wrnw" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"6dc8b546-9734-4082-b2b3-2bafe3f1564d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-28T20:39:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-28T20:39:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-28T20:39:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-28T20:39:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://50ec1ea913ec63a096032bd968bda5c5c8879e16a08e6a57727750517d9af038\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-28T20:39:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-r
bac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-z84f9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://66d250466155027405bf90c4e5ed09388238d4cee63604e28486a16778f9d188\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-28T20:39:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-z84f9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-28T20:39:53Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-6wrnw\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-28T20:39:57Z is after 2025-08-24T17:21:41Z" Jan 28 20:39:57 crc kubenswrapper[4746]: I0128 20:39:57.813962 4746 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2025-11-14 18:22:51.986791649 
+0000 UTC Jan 28 20:39:57 crc kubenswrapper[4746]: I0128 20:39:57.823869 4746 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-ht6hp" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"beb2b795-6bf4-4d38-89f7-bcb5512c3e61\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-28T20:39:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-28T20:39:53Z\\\",\\\"message\\\":\\\"containers with incomplete status: [routeoverride-cni whereabouts-cni-bincopy whereabouts-cni]\\\",\\\"reason\\\":\\\"ContainersNotInitialized\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-28T20:39:53Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-28T20:39:53Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rr62b\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b2d8bf207108ac2304cd04e34cd9843fcb755ac8610a3f88448538d1e99359f6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b2d8bf207108ac2304cd04e34cd9843fcb755ac8610a3f88448538d1e99359f6\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-28T20:39:54Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-28T20:39:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Di
sabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rr62b\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://40c78d39cb70eba245771f4d85d2c8bc7b77427ea216ad3b89749d24fd550098\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://40c78d39cb70eba245771f4d85d2c8bc7b77427ea216ad3b89749d24fd550098\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-28T20:39:55Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-28T20:39:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rr62b\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://498028ca0bd260ca2c4d7b0231ea54a5a6ff5b302f720fbd209a4f640507258b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367
c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://498028ca0bd260ca2c4d7b0231ea54a5a6ff5b302f720fbd209a4f640507258b\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-28T20:39:56Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-28T20:39:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rr62b\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rr62b\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\
\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rr62b\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rr62b\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-28T20:39:53Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-ht6hp\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-28T20:39:57Z is after 2025-08-24T17:21:41Z" Jan 28 20:39:57 crc kubenswrapper[4746]: I0128 
20:39:57.835555 4746 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Jan 28 20:39:57 crc kubenswrapper[4746]: E0128 20:39:57.835702 4746 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Jan 28 20:39:57 crc kubenswrapper[4746]: I0128 20:39:57.860325 4746 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-qhpvf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"cdf26de0-b602-4bdf-b492-65b3b6b31434\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-28T20:39:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-28T20:39:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-28T20:39:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-28T20:39:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f80345dfd4de46e278611370e6c337968cb1ba7ed8275457deea5871704f8367\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb4
13bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-28T20:39:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jnpsl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"
Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-28T20:39:53Z\\\"}}\" for pod \"openshift-multus\"/\"multus-qhpvf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-28T20:39:57Z is after 2025-08-24T17:21:41Z" Jan 28 20:39:57 crc kubenswrapper[4746]: I0128 20:39:57.901349 4746 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"64499f0d-5c10-43a9-b479-9b6c7ef2fb9c\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-28T20:39:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-28T20:39:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-28T20:39:32Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-28T20:39:32Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-28T20:39:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://9a304b12b4a1e30cbbbb16aa4f91514f1b1a64c716cf8ac023803a279b310f9c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-28T20:39:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://6af41f85c56be6c24127bec0aee8e02562dbb04bd22e57c6887f6f5e914a7530\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-28T20:39:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri
-o://f31e9e80576ec4eb59ec92039601fd94bd323d29493c353f9ec6b9c61187b0d3\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-28T20:39:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://56b250a192cfa698ccd7ac8233b75b9acbf2347549c712b47ccc1b89f144aa83\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://56b250a192cfa698ccd7ac8233b75b9acbf2347549c712b47ccc1b89f144aa83\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-01-28T20:39:52Z\\\",\\\"message\\\":\\\"le observer\\\\nW0128 20:39:52.695680 1 builder.go:272] unable to get owner reference (falling back to namespace): pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI0128 20:39:52.695807 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0128 20:39:52.696767 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" 
name=\\\\\\\"serving-cert::/tmp/serving-cert-2500116620/tls.crt::/tmp/serving-cert-2500116620/tls.key\\\\\\\"\\\\nI0128 20:39:52.900914 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI0128 20:39:52.915714 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI0128 20:39:52.915750 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI0128 20:39:52.916035 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI0128 20:39:52.916045 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI0128 20:39:52.925040 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW0128 20:39:52.925098 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0128 20:39:52.925104 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0128 20:39:52.925109 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW0128 20:39:52.925113 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW0128 20:39:52.925116 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW0128 20:39:52.925119 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nI0128 20:39:52.925416 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nF0128 20:39:52.926926 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-01-28T20:39:47Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 10s restarting failed 
container=kube-apiserver-check-endpoints pod=kube-apiserver-crc_openshift-kube-apiserver(f4b27818a5e8e43d0dc095d08835c792)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://03a94dee0b9663441e0ecec16b121c5be4e5c557f17c929b8edfb0d7ba00c3d5\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-28T20:39:35Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c5808a729190035a73529ac58efb0790f2d9c3b98c5f1a8017389e4ca1738c4c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://c5808a729190035a73529ac58efb0790f2d9c3b98c5f1a8017389e4ca1738c4c\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-28T20:39:33Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-28T20:39:33Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"n
ame\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-28T20:39:32Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-28T20:39:57Z is after 2025-08-24T17:21:41Z" Jan 28 20:39:57 crc kubenswrapper[4746]: I0128 20:39:57.940695 4746 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-d8rwq" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"405572d8-50d2-4d7a-adf0-d8d6adea31ca\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-28T20:39:57Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-28T20:39:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-28T20:39:57Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-28T20:39:57Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://eba305a172c49420674e97d32590a7b2e4c7c2cd7406257508e0636cf42ea244\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":
\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-28T20:39:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mb4h4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-28T20:39:56Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-d8rwq\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-28T20:39:57Z is after 2025-08-24T17:21:41Z" Jan 28 20:39:57 crc kubenswrapper[4746]: I0128 20:39:57.979179 4746 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"950d71b3-9b51-43dc-814a-d6a838723f78\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-28T20:39:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-28T20:39:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-28T20:39:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-28T20:39:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-28T20:39:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0411ecf7c85f5397d751adb7d4905977dcbd3d2e169c56ebbe26017192765602\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-28T20:39:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://319cbcaa9981b3c79eb0c8f970f0ad776fb255db10c72c9b6a13469906e9e5ec\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-
art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-28T20:39:33Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://eefda7da35cd9ff176512b5d1224ccadcb5debd0035a49822e82a9757d4a3f91\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-28T20:39:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://145f4f882924f06172df81d94fc9bd683ea1fc04889de5f6cfb5caece8227fd0\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"started
At\\\":\\\"2026-01-28T20:39:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-28T20:39:32Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-28T20:39:57Z is after 2025-08-24T17:21:41Z" Jan 28 20:39:58 crc kubenswrapper[4746]: I0128 20:39:58.042720 4746 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-ht6hp" event={"ID":"beb2b795-6bf4-4d38-89f7-bcb5512c3e61","Type":"ContainerDied","Data":"661d4a9031b6789f38ed1ac8e3c40d72182709130e1680ecdad86c07c1f34b7b"} Jan 28 20:39:58 crc kubenswrapper[4746]: I0128 20:39:58.042734 4746 generic.go:334] "Generic (PLEG): container finished" podID="beb2b795-6bf4-4d38-89f7-bcb5512c3e61" containerID="661d4a9031b6789f38ed1ac8e3c40d72182709130e1680ecdad86c07c1f34b7b" exitCode=0 Jan 28 20:39:58 crc kubenswrapper[4746]: I0128 20:39:58.060614 4746 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"64499f0d-5c10-43a9-b479-9b6c7ef2fb9c\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-28T20:39:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-28T20:39:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-28T20:39:32Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-28T20:39:32Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-28T20:39:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://9a304b12b4a1e30cbbbb16aa4f91514f1b1a64c716cf8ac023803a279b310f9c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-28T20:39:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",
\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://6af41f85c56be6c24127bec0aee8e02562dbb04bd22e57c6887f6f5e914a7530\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-28T20:39:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://f31e9e80576ec4eb59ec92039601fd94bd323d29493c353f9ec6b9c61187b0d3\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-28T20:39:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://56b250a192cfa698ccd7ac8233b75b9acbf2347549c712b47ccc1b89f144aa83\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e277
53fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://56b250a192cfa698ccd7ac8233b75b9acbf2347549c712b47ccc1b89f144aa83\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-01-28T20:39:52Z\\\",\\\"message\\\":\\\"le observer\\\\nW0128 20:39:52.695680 1 builder.go:272] unable to get owner reference (falling back to namespace): pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI0128 20:39:52.695807 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0128 20:39:52.696767 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-2500116620/tls.crt::/tmp/serving-cert-2500116620/tls.key\\\\\\\"\\\\nI0128 20:39:52.900914 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI0128 20:39:52.915714 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI0128 20:39:52.915750 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI0128 20:39:52.916035 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI0128 20:39:52.916045 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI0128 20:39:52.925040 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW0128 20:39:52.925098 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0128 20:39:52.925104 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0128 20:39:52.925109 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW0128 20:39:52.925113 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW0128 20:39:52.925116 1 secure_serving.go:69] 
Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW0128 20:39:52.925119 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nI0128 20:39:52.925416 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nF0128 20:39:52.926926 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-01-28T20:39:47Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 10s restarting failed container=kube-apiserver-check-endpoints pod=kube-apiserver-crc_openshift-kube-apiserver(f4b27818a5e8e43d0dc095d08835c792)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://03a94dee0b9663441e0ecec16b121c5be4e5c557f17c929b8edfb0d7ba00c3d5\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-28T20:39:35Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c5808a729190035a73529ac58efb0790f2d9c3b98c5f1a8017389e4ca1738c4c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art
-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://c5808a729190035a73529ac58efb0790f2d9c3b98c5f1a8017389e4ca1738c4c\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-28T20:39:33Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-28T20:39:33Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-28T20:39:32Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-28T20:39:58Z is after 2025-08-24T17:21:41Z" Jan 28 20:39:58 crc kubenswrapper[4746]: I0128 20:39:58.079694 4746 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-28T20:39:54Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-28T20:39:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-28T20:39:54Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://2f64131d62eca6c3a42e9177d14fa39ed6435e536f60ef5d25ff6bf061d2cc0a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-28T20:39:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: 
x509: certificate has expired or is not yet valid: current time 2026-01-28T20:39:58Z is after 2025-08-24T17:21:41Z" Jan 28 20:39:58 crc kubenswrapper[4746]: I0128 20:39:58.099801 4746 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-28T20:39:57Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-28T20:39:57Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-28T20:39:57Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://83a275ec9868f2fd9e24135d5a05c27caf843340c144070952291bb6a3715035\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-28T20:39:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\
\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-28T20:39:58Z is after 2025-08-24T17:21:41Z" Jan 28 20:39:58 crc kubenswrapper[4746]: I0128 20:39:58.141940 4746 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-6wrnw" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"6dc8b546-9734-4082-b2b3-2bafe3f1564d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-28T20:39:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-28T20:39:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-28T20:39:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-28T20:39:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://50ec1ea913ec63a096032bd968bda5c5c8879e16a08e6a57727750517d9af038\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\
":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-28T20:39:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-z84f9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://66d250466155027405bf90c4e5ed09388238d4cee63604e28486a16778f9d188\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-28T20:39:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-z84f9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-28T20:39:53Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-6wrnw\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or 
is not yet valid: current time 2026-01-28T20:39:58Z is after 2025-08-24T17:21:41Z" Jan 28 20:39:58 crc kubenswrapper[4746]: I0128 20:39:58.186279 4746 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-ht6hp" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"beb2b795-6bf4-4d38-89f7-bcb5512c3e61\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-28T20:39:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-28T20:39:53Z\\\",\\\"message\\\":\\\"containers with incomplete status: [whereabouts-cni-bincopy whereabouts-cni]\\\",\\\"reason\\\":\\\"ContainersNotInitialized\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-28T20:39:53Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-28T20:39:53Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rr62b\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b2d8bf207108ac2304cd04e34cd9843fcb755ac8610a3f88448538d1e99359f6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b2d8bf207108ac2304cd04e34cd9843fcb755ac8610a3f88448538d1e99359f6\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-28T20:39:54Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-28T20:39:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Di
sabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rr62b\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://40c78d39cb70eba245771f4d85d2c8bc7b77427ea216ad3b89749d24fd550098\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://40c78d39cb70eba245771f4d85d2c8bc7b77427ea216ad3b89749d24fd550098\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-28T20:39:55Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-28T20:39:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rr62b\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://498028ca0bd260ca2c4d7b0231ea54a5a6ff5b302f720fbd209a4f640507258b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367
c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://498028ca0bd260ca2c4d7b0231ea54a5a6ff5b302f720fbd209a4f640507258b\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-28T20:39:56Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-28T20:39:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rr62b\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://661d4a9031b6789f38ed1ac8e3c40d72182709130e1680ecdad86c07c1f34b7b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://661d4a9031b6789f38ed1ac8e3c40d72182709130e1680ecdad86c07c1f34b7b\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-28T20:39:57Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-28T20:39:57Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\
\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rr62b\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rr62b\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rr62b\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-28T20:39:53Z\\\"}}\" for pod 
\"openshift-multus\"/\"multus-additional-cni-plugins-ht6hp\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-28T20:39:58Z is after 2025-08-24T17:21:41Z" Jan 28 20:39:58 crc kubenswrapper[4746]: I0128 20:39:58.222455 4746 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-qhpvf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"cdf26de0-b602-4bdf-b492-65b3b6b31434\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-28T20:39:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-28T20:39:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-28T20:39:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-28T20:39:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f80345dfd4de46e278611370e6c337968cb1ba7ed8275457deea5871704f8367\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"resta
rtCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-28T20:39:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jnpsl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-28T20:39:53Z\
\\"}}\" for pod \"openshift-multus\"/\"multus-qhpvf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-28T20:39:58Z is after 2025-08-24T17:21:41Z" Jan 28 20:39:58 crc kubenswrapper[4746]: I0128 20:39:58.262568 4746 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"950d71b3-9b51-43dc-814a-d6a838723f78\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-28T20:39:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-28T20:39:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-28T20:39:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-28T20:39:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-28T20:39:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0411ecf7c85f5397d751adb7d4905977dcbd3d2e169c56ebbe26017192765602\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"s
tate\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-28T20:39:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://319cbcaa9981b3c79eb0c8f970f0ad776fb255db10c72c9b6a13469906e9e5ec\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-28T20:39:33Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://eefda7da35cd9ff176512b5d1224ccadcb5debd0035a49822e82a9757d4a3f91\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-28T20:39:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o:/
/145f4f882924f06172df81d94fc9bd683ea1fc04889de5f6cfb5caece8227fd0\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-28T20:39:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-28T20:39:32Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-28T20:39:58Z is after 2025-08-24T17:21:41Z" Jan 28 20:39:58 crc kubenswrapper[4746]: I0128 20:39:58.300371 4746 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-d8rwq" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"405572d8-50d2-4d7a-adf0-d8d6adea31ca\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-28T20:39:57Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-28T20:39:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-28T20:39:57Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-28T20:39:57Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://eba305a172c49420674e97d32590a7b2e4c7c2cd7406257508e0636cf42ea244\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-28T20:39:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mb4h4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phas
e\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-28T20:39:56Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-d8rwq\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-28T20:39:58Z is after 2025-08-24T17:21:41Z" Jan 28 20:39:58 crc kubenswrapper[4746]: I0128 20:39:58.342354 4746 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-28T20:39:52Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-28T20:39:52Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-28T20:39:58Z is after 2025-08-24T17:21:41Z" Jan 28 20:39:58 crc kubenswrapper[4746]: I0128 20:39:58.384118 4746 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-28T20:39:52Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-28T20:39:52Z\\\",\\\"message\\\":\\\"containers with unready status: 
[network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-28T20:39:58Z is after 2025-08-24T17:21:41Z" Jan 28 20:39:58 crc kubenswrapper[4746]: I0128 20:39:58.420346 4746 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-28T20:39:52Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-28T20:39:52Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-28T20:39:58Z is after 2025-08-24T17:21:41Z" Jan 28 20:39:58 crc kubenswrapper[4746]: I0128 20:39:58.465195 4746 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-28T20:39:53Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-28T20:39:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-28T20:39:53Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5dff3e2075b8bbfb6af77c72362ccc37e1d7810a2d97c096cf501681c98699ab\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-28T20:39:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://772e165efa6d865a669c172958e0aea3112146637e354e4587b7e664454112ad\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af
0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-28T20:39:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-28T20:39:58Z is after 2025-08-24T17:21:41Z" Jan 28 20:39:58 crc kubenswrapper[4746]: I0128 20:39:58.497830 4746 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-gcrxx" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"1c3c93a1-7ddf-4339-9ca3-79f3753943b4\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-28T20:39:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-28T20:39:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-28T20:39:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-28T20:39:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5bd8d8214aa8e85e50747643b6354fb6e026866c926310fb2dd79760f916f468\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-28T20:39:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wn6hr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podI
Ps\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-28T20:39:52Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-gcrxx\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-28T20:39:58Z is after 2025-08-24T17:21:41Z" Jan 28 20:39:58 crc kubenswrapper[4746]: I0128 20:39:58.546640 4746 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-8vmvh" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"c4d15639-62fb-41b7-a1d4-6f51f3af6d99\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-28T20:39:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-28T20:39:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-28T20:39:53Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-28T20:39:53Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb 
ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-kgrqs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-kgrqs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\
\":\\\"nbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-kgrqs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-kgrqs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\
\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-kgrqs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-kgrqs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"wa
iting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-kgrqs\\\",\\\"read
Only\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-kgrqs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://45e59dfe364d511e17b8264a31157501de3ed23e7125d79979df83d328242328\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://45e59dfe364d511e17b8264a31157501de3ed23e7125d79979df83d328242328\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-28T20:39:53Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-28T20:39:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mou
ntPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-kgrqs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-28T20:39:53Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-8vmvh\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-28T20:39:58Z is after 2025-08-24T17:21:41Z" Jan 28 20:39:58 crc kubenswrapper[4746]: I0128 20:39:58.814752 4746 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2025-12-04 02:22:06.782184384 +0000 UTC Jan 28 20:39:58 crc kubenswrapper[4746]: I0128 20:39:58.835479 4746 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Jan 28 20:39:58 crc kubenswrapper[4746]: I0128 20:39:58.835590 4746 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Jan 28 20:39:58 crc kubenswrapper[4746]: E0128 20:39:58.835744 4746 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Jan 28 20:39:58 crc kubenswrapper[4746]: E0128 20:39:58.835997 4746 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Jan 28 20:39:58 crc kubenswrapper[4746]: I0128 20:39:58.911146 4746 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Jan 28 20:39:58 crc kubenswrapper[4746]: I0128 20:39:58.914832 4746 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 28 20:39:58 crc kubenswrapper[4746]: I0128 20:39:58.914932 4746 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 28 20:39:58 crc kubenswrapper[4746]: I0128 20:39:58.914945 4746 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 28 20:39:58 crc kubenswrapper[4746]: I0128 20:39:58.915055 4746 kubelet_node_status.go:76] "Attempting to register node" node="crc" Jan 28 20:39:58 crc kubenswrapper[4746]: I0128 20:39:58.932332 4746 kubelet_node_status.go:115] "Node was previously registered" node="crc" Jan 28 20:39:58 crc kubenswrapper[4746]: I0128 20:39:58.932849 4746 kubelet_node_status.go:79] "Successfully registered node" node="crc" Jan 28 20:39:58 crc kubenswrapper[4746]: I0128 20:39:58.934816 4746 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 28 20:39:58 crc kubenswrapper[4746]: I0128 20:39:58.934895 4746 kubelet_node_status.go:724] "Recording event message for node" 
node="crc" event="NodeHasNoDiskPressure" Jan 28 20:39:58 crc kubenswrapper[4746]: I0128 20:39:58.934916 4746 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 28 20:39:58 crc kubenswrapper[4746]: I0128 20:39:58.934948 4746 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 28 20:39:58 crc kubenswrapper[4746]: I0128 20:39:58.934967 4746 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-28T20:39:58Z","lastTransitionTime":"2026-01-28T20:39:58Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 28 20:39:58 crc kubenswrapper[4746]: E0128 20:39:58.951239 4746 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-01-28T20:39:58Z\\\",\\\"lastTransitionTime\\\":\\\"2026-01-28T20:39:58Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-01-28T20:39:58Z\\\",\\\"lastTransitionTime\\\":\\\"2026-01-28T20:39:58Z\\\",\\\"message\\\":\\\"kubelet has no disk 
pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-01-28T20:39:58Z\\\",\\\"lastTransitionTime\\\":\\\"2026-01-28T20:39:58Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-01-28T20:39:58Z\\\",\\\"lastTransitionTime\\\":\\\"2026-01-28T20:39:58Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2
ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc15c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9810067
4616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.
io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a07
2c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa73
83b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"349dd4ef-f3ea-4c41-bfa2-75ea02498ab0\\\",\\\"systemUUID\\\":\\\"e89ecf32-8beb-4b41-b6df-0f1293ce0213\\\"}}}\" for node \"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-28T20:39:58Z is after 2025-08-24T17:21:41Z" Jan 28 20:39:58 crc kubenswrapper[4746]: I0128 20:39:58.957509 4746 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 28 20:39:58 crc kubenswrapper[4746]: I0128 20:39:58.957956 4746 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 28 20:39:58 crc kubenswrapper[4746]: I0128 20:39:58.958223 4746 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 28 20:39:58 crc kubenswrapper[4746]: I0128 20:39:58.958418 4746 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 28 20:39:58 crc kubenswrapper[4746]: I0128 20:39:58.958606 4746 setters.go:603] "Node became not ready" node="crc" 
condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-28T20:39:58Z","lastTransitionTime":"2026-01-28T20:39:58Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 28 20:39:58 crc kubenswrapper[4746]: E0128 20:39:58.983359 4746 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-01-28T20:39:58Z\\\",\\\"lastTransitionTime\\\":\\\"2026-01-28T20:39:58Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-01-28T20:39:58Z\\\",\\\"lastTransitionTime\\\":\\\"2026-01-28T20:39:58Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-01-28T20:39:58Z\\\",\\\"lastTransitionTime\\\":\\\"2026-01-28T20:39:58Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID 
available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-01-28T20:39:58Z\\\",\\\"lastTransitionTime\\\":\\\"2026-01-28T20:39:58Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"
registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc15c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\
"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb617
3ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"reg
istry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@s
ha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"349dd4ef-f3ea-4c41-bfa2-75ea02498ab0\\\",\\\"systemUUID\\\":\\\"e89ecf32-8beb-4b41-b6df-0f1293ce0213\\\"}}}\" for node \"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-28T20:39:58Z is after 2025-08-24T17:21:41Z" Jan 28 20:39:58 crc kubenswrapper[4746]: I0128 20:39:58.989935 4746 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 28 20:39:58 crc kubenswrapper[4746]: I0128 20:39:58.989994 4746 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 28 20:39:58 crc kubenswrapper[4746]: I0128 20:39:58.990010 4746 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 28 20:39:58 crc kubenswrapper[4746]: I0128 20:39:58.990035 4746 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 28 20:39:58 crc kubenswrapper[4746]: I0128 20:39:58.990052 4746 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-28T20:39:58Z","lastTransitionTime":"2026-01-28T20:39:58Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 28 20:39:59 crc kubenswrapper[4746]: E0128 20:39:59.039048 4746 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-01-28T20:39:58Z\\\",\\\"lastTransitionTime\\\":\\\"2026-01-28T20:39:58Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-01-28T20:39:58Z\\\",\\\"lastTransitionTime\\\":\\\"2026-01-28T20:39:58Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-01-28T20:39:58Z\\\",\\\"lastTransitionTime\\\":\\\"2026-01-28T20:39:58Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-01-28T20:39:58Z\\\",\\\"lastTransitionTime\\\":\\\"2026-01-28T20:39:58Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"349dd4ef-f3ea-4c41-bfa2-75ea02498ab0\\\",\\\"systemUUID\\\":\\\"e89ecf32-8beb-4b41-b6df-0f1293ce0213\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-28T20:39:59Z is after 2025-08-24T17:21:41Z" Jan 28 20:39:59 crc kubenswrapper[4746]: I0128 20:39:59.058461 4746 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 28 20:39:59 crc kubenswrapper[4746]: I0128 20:39:59.058501 4746 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 28 20:39:59 crc kubenswrapper[4746]: I0128 20:39:59.058511 4746 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 28 20:39:59 crc kubenswrapper[4746]: I0128 20:39:59.058528 4746 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 28 20:39:59 crc kubenswrapper[4746]: I0128 20:39:59.058540 4746 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-28T20:39:59Z","lastTransitionTime":"2026-01-28T20:39:59Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 28 20:39:59 crc kubenswrapper[4746]: I0128 20:39:59.064326 4746 generic.go:334] "Generic (PLEG): container finished" podID="beb2b795-6bf4-4d38-89f7-bcb5512c3e61" containerID="a49ffe0bbe4151671dc8803d7ab61cb9b3a2eae064b81e2830bc074787f4754e" exitCode=0 Jan 28 20:39:59 crc kubenswrapper[4746]: I0128 20:39:59.064396 4746 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-ht6hp" event={"ID":"beb2b795-6bf4-4d38-89f7-bcb5512c3e61","Type":"ContainerDied","Data":"a49ffe0bbe4151671dc8803d7ab61cb9b3a2eae064b81e2830bc074787f4754e"} Jan 28 20:39:59 crc kubenswrapper[4746]: E0128 20:39:59.072099 4746 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-01-28T20:39:59Z\\\",\\\"lastTransitionTime\\\":\\\"2026-01-28T20:39:59Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-01-28T20:39:59Z\\\",\\\"lastTransitionTime\\\":\\\"2026-01-28T20:39:59Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-01-28T20:39:59Z\\\",\\\"lastTransitionTime\\\":\\\"2026-01-28T20:39:59Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID 
available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-01-28T20:39:59Z\\\",\\\"lastTransitionTime\\\":\\\"2026-01-28T20:39:59Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"
registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc15c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\
"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb617
3ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"reg
istry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@s
ha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"349dd4ef-f3ea-4c41-bfa2-75ea02498ab0\\\",\\\"systemUUID\\\":\\\"e89ecf32-8beb-4b41-b6df-0f1293ce0213\\\"}}}\" for node \"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-28T20:39:59Z is after 2025-08-24T17:21:41Z" Jan 28 20:39:59 crc kubenswrapper[4746]: I0128 20:39:59.076719 4746 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 28 20:39:59 crc kubenswrapper[4746]: I0128 20:39:59.076759 4746 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 28 20:39:59 crc kubenswrapper[4746]: I0128 20:39:59.076773 4746 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 28 20:39:59 crc kubenswrapper[4746]: I0128 20:39:59.076796 4746 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 28 20:39:59 crc kubenswrapper[4746]: I0128 20:39:59.076811 4746 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-28T20:39:59Z","lastTransitionTime":"2026-01-28T20:39:59Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 28 20:39:59 crc kubenswrapper[4746]: I0128 20:39:59.095825 4746 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-28T20:39:52Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-28T20:39:52Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-28T20:39:59Z is after 2025-08-24T17:21:41Z" Jan 28 20:39:59 crc kubenswrapper[4746]: E0128 20:39:59.100151 4746 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-01-28T20:39:59Z\\\",\\\"lastTransitionTime\\\":\\\"2026-01-28T20:39:59Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory 
available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-01-28T20:39:59Z\\\",\\\"lastTransitionTime\\\":\\\"2026-01-28T20:39:59Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-01-28T20:39:59Z\\\",\\\"lastTransitionTime\\\":\\\"2026-01-28T20:39:59Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-01-28T20:39:59Z\\\",\\\"lastTransitionTime\\\":\\\"2026-01-28T20:39:59Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"349dd4ef-f3ea-4c41-bfa2-75ea02498ab0\\\",\\\"systemUUID\\\":\\\"e89ecf32-8beb-4b41-b6df-0f1293ce0213\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-28T20:39:59Z is after 2025-08-24T17:21:41Z" Jan 28 20:39:59 crc kubenswrapper[4746]: E0128 20:39:59.100322 4746 kubelet_node_status.go:572] "Unable to update node status" err="update node status exceeds retry count" Jan 28 20:39:59 crc kubenswrapper[4746]: I0128 20:39:59.106841 4746 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 28 20:39:59 crc kubenswrapper[4746]: I0128 20:39:59.107180 4746 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 28 20:39:59 crc kubenswrapper[4746]: I0128 20:39:59.107571 4746 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 28 20:39:59 crc kubenswrapper[4746]: I0128 20:39:59.107632 4746 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 28 20:39:59 crc kubenswrapper[4746]: I0128 20:39:59.107645 4746 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-28T20:39:59Z","lastTransitionTime":"2026-01-28T20:39:59Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 28 20:39:59 crc kubenswrapper[4746]: I0128 20:39:59.114634 4746 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-28T20:39:53Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-28T20:39:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-28T20:39:53Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5dff3e2075b8bbfb6af77c72362ccc37e1d7810a2d97c096cf501681c98699ab\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-28T20:39:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://772e165efa6d865a669c172958e0aea3112146637e354e4587b7e664454112ad\\\",\\\
"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-28T20:39:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-28T20:39:59Z is after 2025-08-24T17:21:41Z" Jan 28 20:39:59 crc kubenswrapper[4746]: I0128 20:39:59.128919 4746 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-gcrxx" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"1c3c93a1-7ddf-4339-9ca3-79f3753943b4\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-28T20:39:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-28T20:39:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-28T20:39:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-28T20:39:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5bd8d8214aa8e85e50747643b6354fb6e026866c926310fb2dd79760f916f468\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-28T20:39:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wn6hr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podI
Ps\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-28T20:39:52Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-gcrxx\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-28T20:39:59Z is after 2025-08-24T17:21:41Z" Jan 28 20:39:59 crc kubenswrapper[4746]: I0128 20:39:59.150146 4746 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-8vmvh" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"c4d15639-62fb-41b7-a1d4-6f51f3af6d99\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-28T20:39:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-28T20:39:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-28T20:39:53Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-28T20:39:53Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb 
ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-kgrqs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-kgrqs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\
\":\\\"nbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-kgrqs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-kgrqs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\
\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-kgrqs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-kgrqs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"wa
iting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-kgrqs\\\",\\\"read
Only\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-kgrqs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://45e59dfe364d511e17b8264a31157501de3ed23e7125d79979df83d328242328\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://45e59dfe364d511e17b8264a31157501de3ed23e7125d79979df83d328242328\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-28T20:39:53Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-28T20:39:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mou
ntPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-kgrqs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-28T20:39:53Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-8vmvh\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-28T20:39:59Z is after 2025-08-24T17:21:41Z" Jan 28 20:39:59 crc kubenswrapper[4746]: I0128 20:39:59.165485 4746 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"64499f0d-5c10-43a9-b479-9b6c7ef2fb9c\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-28T20:39:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-28T20:39:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-28T20:39:32Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-28T20:39:32Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-28T20:39:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://9a304b12b4a1e30cbbbb16aa4f91514f1b1a64c716cf8ac023803a279b310f9c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-28T20:39:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://6af41f85c56be6c24127bec0aee8e02562dbb04bd22e57c6887f6f5e914a7530\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-28T20:39:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri
-o://f31e9e80576ec4eb59ec92039601fd94bd323d29493c353f9ec6b9c61187b0d3\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-28T20:39:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://56b250a192cfa698ccd7ac8233b75b9acbf2347549c712b47ccc1b89f144aa83\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://56b250a192cfa698ccd7ac8233b75b9acbf2347549c712b47ccc1b89f144aa83\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-01-28T20:39:52Z\\\",\\\"message\\\":\\\"le observer\\\\nW0128 20:39:52.695680 1 builder.go:272] unable to get owner reference (falling back to namespace): pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI0128 20:39:52.695807 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0128 20:39:52.696767 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" 
name=\\\\\\\"serving-cert::/tmp/serving-cert-2500116620/tls.crt::/tmp/serving-cert-2500116620/tls.key\\\\\\\"\\\\nI0128 20:39:52.900914 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI0128 20:39:52.915714 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI0128 20:39:52.915750 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI0128 20:39:52.916035 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI0128 20:39:52.916045 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI0128 20:39:52.925040 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW0128 20:39:52.925098 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0128 20:39:52.925104 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0128 20:39:52.925109 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW0128 20:39:52.925113 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW0128 20:39:52.925116 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW0128 20:39:52.925119 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nI0128 20:39:52.925416 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nF0128 20:39:52.926926 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-01-28T20:39:47Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 10s restarting failed 
container=kube-apiserver-check-endpoints pod=kube-apiserver-crc_openshift-kube-apiserver(f4b27818a5e8e43d0dc095d08835c792)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://03a94dee0b9663441e0ecec16b121c5be4e5c557f17c929b8edfb0d7ba00c3d5\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-28T20:39:35Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c5808a729190035a73529ac58efb0790f2d9c3b98c5f1a8017389e4ca1738c4c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://c5808a729190035a73529ac58efb0790f2d9c3b98c5f1a8017389e4ca1738c4c\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-28T20:39:33Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-28T20:39:33Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"n
ame\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-28T20:39:32Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-28T20:39:59Z is after 2025-08-24T17:21:41Z" Jan 28 20:39:59 crc kubenswrapper[4746]: I0128 20:39:59.181934 4746 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-28T20:39:54Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-28T20:39:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-28T20:39:54Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://2f64131d62eca6c3a42e9177d14fa39ed6435e536f60ef5d25ff6bf061d2cc0a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\
":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-28T20:39:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-28T20:39:59Z is after 2025-08-24T17:21:41Z" Jan 28 20:39:59 crc kubenswrapper[4746]: I0128 20:39:59.195326 4746 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-28T20:39:57Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-28T20:39:57Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-28T20:39:57Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://83a275ec9868f2fd9e24135d5a05c27caf843340c144070952291bb6a3715035\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-28T20:39:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has 
expired or is not yet valid: current time 2026-01-28T20:39:59Z is after 2025-08-24T17:21:41Z" Jan 28 20:39:59 crc kubenswrapper[4746]: I0128 20:39:59.209314 4746 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-6wrnw" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"6dc8b546-9734-4082-b2b3-2bafe3f1564d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-28T20:39:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-28T20:39:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-28T20:39:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-28T20:39:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://50ec1ea913ec63a096032bd968bda5c5c8879e16a08e6a57727750517d9af038\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-28T20:39:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-r
bac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-z84f9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://66d250466155027405bf90c4e5ed09388238d4cee63604e28486a16778f9d188\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-28T20:39:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-z84f9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-28T20:39:53Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-6wrnw\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-28T20:39:59Z is after 2025-08-24T17:21:41Z" Jan 28 20:39:59 crc kubenswrapper[4746]: I0128 20:39:59.211020 4746 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 28 20:39:59 crc kubenswrapper[4746]: I0128 
20:39:59.211065 4746 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 28 20:39:59 crc kubenswrapper[4746]: I0128 20:39:59.211116 4746 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 28 20:39:59 crc kubenswrapper[4746]: I0128 20:39:59.211137 4746 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 28 20:39:59 crc kubenswrapper[4746]: I0128 20:39:59.211147 4746 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-28T20:39:59Z","lastTransitionTime":"2026-01-28T20:39:59Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 28 20:39:59 crc kubenswrapper[4746]: I0128 20:39:59.224028 4746 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-ht6hp" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"beb2b795-6bf4-4d38-89f7-bcb5512c3e61\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-28T20:39:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-28T20:39:53Z\\\",\\\"message\\\":\\\"containers with incomplete status: 
[whereabouts-cni]\\\",\\\"reason\\\":\\\"ContainersNotInitialized\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-28T20:39:53Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-28T20:39:53Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rr62b\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b2d8bf207108ac2304cd04e34cd9843fcb755ac8610a3f88448538d1e99359f6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containe
rID\\\":\\\"cri-o://b2d8bf207108ac2304cd04e34cd9843fcb755ac8610a3f88448538d1e99359f6\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-28T20:39:54Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-28T20:39:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rr62b\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://40c78d39cb70eba245771f4d85d2c8bc7b77427ea216ad3b89749d24fd550098\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://40c78d39cb70eba245771f4d85d2c8bc7b77427ea216ad3b89749d24fd550098\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-28T20:39:55Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-28T20:39:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-
allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rr62b\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://498028ca0bd260ca2c4d7b0231ea54a5a6ff5b302f720fbd209a4f640507258b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://498028ca0bd260ca2c4d7b0231ea54a5a6ff5b302f720fbd209a4f640507258b\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-28T20:39:56Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-28T20:39:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rr62b\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://661d4a9031b6789f38ed1ac8e3c40d72182709130e1680ecdad86c07c1f34b7b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":f
alse,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://661d4a9031b6789f38ed1ac8e3c40d72182709130e1680ecdad86c07c1f34b7b\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-28T20:39:57Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-28T20:39:57Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rr62b\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://a49ffe0bbe4151671dc8803d7ab61cb9b3a2eae064b81e2830bc074787f4754e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://a49ffe0bbe4151671dc8803d7ab61cb9b3a2eae064b81e2830bc074787f4754e\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-28T20:39:58Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-28T20:39:58Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\
\\"kube-api-access-rr62b\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rr62b\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-28T20:39:53Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-ht6hp\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-28T20:39:59Z is after 2025-08-24T17:21:41Z" Jan 28 20:39:59 crc kubenswrapper[4746]: I0128 20:39:59.240589 4746 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-qhpvf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"cdf26de0-b602-4bdf-b492-65b3b6b31434\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-28T20:39:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-28T20:39:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-28T20:39:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-28T20:39:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f80345dfd4de46e278611370e6c337968cb1ba7ed8275457deea5871704f8367\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-28T20:39:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\
"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jnpsl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-28T20:39:53Z\\\"}}\" for pod \"openshift-multus\"/\"multus-qhpvf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-28T20:39:59Z is after 2025-08-24T17:21:41Z" Jan 28 20:39:59 crc kubenswrapper[4746]: I0128 20:39:59.255136 4746 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" 
err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"950d71b3-9b51-43dc-814a-d6a838723f78\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-28T20:39:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-28T20:39:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-28T20:39:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-28T20:39:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-28T20:39:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0411ecf7c85f5397d751adb7d4905977dcbd3d2e169c56ebbe26017192765602\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-28T20:39:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://319cbcaa9981b3c79eb0c8f970f0ad776fb255db10c72c9b6a13469906e9e5ec\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/ope
nshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-28T20:39:33Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://eefda7da35cd9ff176512b5d1224ccadcb5debd0035a49822e82a9757d4a3f91\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-28T20:39:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://145f4f882924f06172df81d94fc9bd683ea1fc04889de5f6cfb5caece8227fd0\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{
\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-28T20:39:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-28T20:39:32Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-28T20:39:59Z is after 2025-08-24T17:21:41Z" Jan 28 20:39:59 crc kubenswrapper[4746]: I0128 20:39:59.271145 4746 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-d8rwq" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"405572d8-50d2-4d7a-adf0-d8d6adea31ca\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-28T20:39:57Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-28T20:39:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-28T20:39:57Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-28T20:39:57Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://eba305a172c49420674e97d32590a7b2e4c7c2cd7406257508e0636cf42ea244\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-28T20:39:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mb4h4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phas
e\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-28T20:39:56Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-d8rwq\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-28T20:39:59Z is after 2025-08-24T17:21:41Z" Jan 28 20:39:59 crc kubenswrapper[4746]: I0128 20:39:59.288144 4746 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-28T20:39:52Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-28T20:39:52Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-28T20:39:59Z is after 2025-08-24T17:21:41Z" Jan 28 20:39:59 crc kubenswrapper[4746]: I0128 20:39:59.304615 4746 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-28T20:39:52Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-28T20:39:52Z\\\",\\\"message\\\":\\\"containers with unready status: 
[network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-28T20:39:59Z is after 2025-08-24T17:21:41Z" Jan 28 20:39:59 crc kubenswrapper[4746]: I0128 20:39:59.314258 4746 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 28 20:39:59 crc kubenswrapper[4746]: I0128 20:39:59.314565 4746 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 28 20:39:59 crc kubenswrapper[4746]: I0128 20:39:59.314572 4746 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 28 20:39:59 crc 
kubenswrapper[4746]: I0128 20:39:59.314586 4746 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 28 20:39:59 crc kubenswrapper[4746]: I0128 20:39:59.314596 4746 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-28T20:39:59Z","lastTransitionTime":"2026-01-28T20:39:59Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 28 20:39:59 crc kubenswrapper[4746]: I0128 20:39:59.418457 4746 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 28 20:39:59 crc kubenswrapper[4746]: I0128 20:39:59.418500 4746 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 28 20:39:59 crc kubenswrapper[4746]: I0128 20:39:59.418513 4746 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 28 20:39:59 crc kubenswrapper[4746]: I0128 20:39:59.418530 4746 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 28 20:39:59 crc kubenswrapper[4746]: I0128 20:39:59.418543 4746 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-28T20:39:59Z","lastTransitionTime":"2026-01-28T20:39:59Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 28 20:39:59 crc kubenswrapper[4746]: I0128 20:39:59.522133 4746 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 28 20:39:59 crc kubenswrapper[4746]: I0128 20:39:59.522161 4746 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 28 20:39:59 crc kubenswrapper[4746]: I0128 20:39:59.522169 4746 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 28 20:39:59 crc kubenswrapper[4746]: I0128 20:39:59.522183 4746 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 28 20:39:59 crc kubenswrapper[4746]: I0128 20:39:59.522192 4746 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-28T20:39:59Z","lastTransitionTime":"2026-01-28T20:39:59Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 28 20:39:59 crc kubenswrapper[4746]: I0128 20:39:59.624998 4746 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 28 20:39:59 crc kubenswrapper[4746]: I0128 20:39:59.625087 4746 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 28 20:39:59 crc kubenswrapper[4746]: I0128 20:39:59.625106 4746 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 28 20:39:59 crc kubenswrapper[4746]: I0128 20:39:59.625128 4746 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 28 20:39:59 crc kubenswrapper[4746]: I0128 20:39:59.625144 4746 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-28T20:39:59Z","lastTransitionTime":"2026-01-28T20:39:59Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 28 20:39:59 crc kubenswrapper[4746]: I0128 20:39:59.728810 4746 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 28 20:39:59 crc kubenswrapper[4746]: I0128 20:39:59.728882 4746 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 28 20:39:59 crc kubenswrapper[4746]: I0128 20:39:59.728905 4746 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 28 20:39:59 crc kubenswrapper[4746]: I0128 20:39:59.728932 4746 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 28 20:39:59 crc kubenswrapper[4746]: I0128 20:39:59.728952 4746 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-28T20:39:59Z","lastTransitionTime":"2026-01-28T20:39:59Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 28 20:39:59 crc kubenswrapper[4746]: I0128 20:39:59.816379 4746 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2025-12-06 21:33:41.807277995 +0000 UTC Jan 28 20:39:59 crc kubenswrapper[4746]: I0128 20:39:59.831488 4746 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 28 20:39:59 crc kubenswrapper[4746]: I0128 20:39:59.831546 4746 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 28 20:39:59 crc kubenswrapper[4746]: I0128 20:39:59.831558 4746 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 28 20:39:59 crc kubenswrapper[4746]: I0128 20:39:59.831576 4746 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 28 20:39:59 crc kubenswrapper[4746]: I0128 20:39:59.831589 4746 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-28T20:39:59Z","lastTransitionTime":"2026-01-28T20:39:59Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 28 20:39:59 crc kubenswrapper[4746]: I0128 20:39:59.835036 4746 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Jan 28 20:39:59 crc kubenswrapper[4746]: E0128 20:39:59.835295 4746 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Jan 28 20:39:59 crc kubenswrapper[4746]: I0128 20:39:59.934235 4746 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 28 20:39:59 crc kubenswrapper[4746]: I0128 20:39:59.934285 4746 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 28 20:39:59 crc kubenswrapper[4746]: I0128 20:39:59.934298 4746 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 28 20:39:59 crc kubenswrapper[4746]: I0128 20:39:59.934317 4746 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 28 20:39:59 crc kubenswrapper[4746]: I0128 20:39:59.934330 4746 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-28T20:39:59Z","lastTransitionTime":"2026-01-28T20:39:59Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 28 20:39:59 crc kubenswrapper[4746]: I0128 20:39:59.973348 4746 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-kube-apiserver/kube-apiserver-crc" Jan 28 20:39:59 crc kubenswrapper[4746]: I0128 20:39:59.974151 4746 scope.go:117] "RemoveContainer" containerID="56b250a192cfa698ccd7ac8233b75b9acbf2347549c712b47ccc1b89f144aa83" Jan 28 20:39:59 crc kubenswrapper[4746]: E0128 20:39:59.974358 4746 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"kube-apiserver-check-endpoints\" with CrashLoopBackOff: \"back-off 10s restarting failed container=kube-apiserver-check-endpoints pod=kube-apiserver-crc_openshift-kube-apiserver(f4b27818a5e8e43d0dc095d08835c792)\"" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="f4b27818a5e8e43d0dc095d08835c792" Jan 28 20:40:00 crc kubenswrapper[4746]: I0128 20:40:00.038277 4746 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 28 20:40:00 crc kubenswrapper[4746]: I0128 20:40:00.038324 4746 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 28 20:40:00 crc kubenswrapper[4746]: I0128 20:40:00.038335 4746 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 28 20:40:00 crc kubenswrapper[4746]: I0128 20:40:00.038358 4746 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 28 20:40:00 crc kubenswrapper[4746]: I0128 20:40:00.038370 4746 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-28T20:40:00Z","lastTransitionTime":"2026-01-28T20:40:00Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in 
/etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 28 20:40:00 crc kubenswrapper[4746]: I0128 20:40:00.074569 4746 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-8vmvh" event={"ID":"c4d15639-62fb-41b7-a1d4-6f51f3af6d99","Type":"ContainerStarted","Data":"d5e9d34221d49b3737f9b22c58b53644245a0384c7dce6402d4e84c84fef9aa4"} Jan 28 20:40:00 crc kubenswrapper[4746]: I0128 20:40:00.075550 4746 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-ovn-kubernetes/ovnkube-node-8vmvh" Jan 28 20:40:00 crc kubenswrapper[4746]: I0128 20:40:00.075656 4746 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-ovn-kubernetes/ovnkube-node-8vmvh" Jan 28 20:40:00 crc kubenswrapper[4746]: I0128 20:40:00.082830 4746 generic.go:334] "Generic (PLEG): container finished" podID="beb2b795-6bf4-4d38-89f7-bcb5512c3e61" containerID="cd521e9ecb53f581b5732a4d018f92007063ce0912e6d760fe3c920218c23b14" exitCode=0 Jan 28 20:40:00 crc kubenswrapper[4746]: I0128 20:40:00.082919 4746 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-ht6hp" event={"ID":"beb2b795-6bf4-4d38-89f7-bcb5512c3e61","Type":"ContainerDied","Data":"cd521e9ecb53f581b5732a4d018f92007063ce0912e6d760fe3c920218c23b14"} Jan 28 20:40:00 crc kubenswrapper[4746]: I0128 20:40:00.109350 4746 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-28T20:39:52Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-28T20:39:52Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-28T20:40:00Z is after 2025-08-24T17:21:41Z" Jan 28 20:40:00 crc kubenswrapper[4746]: I0128 20:40:00.130195 4746 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-28T20:39:52Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-28T20:39:52Z\\\",\\\"message\\\":\\\"containers with unready status: 
[network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-28T20:40:00Z is after 2025-08-24T17:21:41Z" Jan 28 20:40:00 crc kubenswrapper[4746]: I0128 20:40:00.140737 4746 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 28 20:40:00 crc kubenswrapper[4746]: I0128 20:40:00.140772 4746 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 28 20:40:00 crc kubenswrapper[4746]: I0128 20:40:00.140783 4746 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 28 20:40:00 crc 
kubenswrapper[4746]: I0128 20:40:00.140801 4746 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 28 20:40:00 crc kubenswrapper[4746]: I0128 20:40:00.140812 4746 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-28T20:40:00Z","lastTransitionTime":"2026-01-28T20:40:00Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 28 20:40:00 crc kubenswrapper[4746]: I0128 20:40:00.148781 4746 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-ovn-kubernetes/ovnkube-node-8vmvh" Jan 28 20:40:00 crc kubenswrapper[4746]: I0128 20:40:00.150409 4746 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-28T20:39:52Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-28T20:39:52Z\\\",\\\"message\\\":\\\"containers with unready status: 
[networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-28T20:40:00Z is after 2025-08-24T17:21:41Z" Jan 28 20:40:00 crc kubenswrapper[4746]: I0128 20:40:00.156448 4746 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-ovn-kubernetes/ovnkube-node-8vmvh" Jan 28 20:40:00 crc kubenswrapper[4746]: I0128 20:40:00.170501 4746 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-28T20:39:53Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-28T20:39:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-28T20:39:53Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5dff3e2075b8bbfb6af77c72362ccc37e1d7810a2d97c096cf501681c98699ab\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-28T20:39:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://772e165efa6d865a669c172958e0aea3112146637e354e4587b7e664454112ad\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af
0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-28T20:39:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-28T20:40:00Z is after 2025-08-24T17:21:41Z" Jan 28 20:40:00 crc kubenswrapper[4746]: I0128 20:40:00.187574 4746 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-gcrxx" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"1c3c93a1-7ddf-4339-9ca3-79f3753943b4\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-28T20:39:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-28T20:39:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-28T20:39:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-28T20:39:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5bd8d8214aa8e85e50747643b6354fb6e026866c926310fb2dd79760f916f468\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-28T20:39:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wn6hr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podI
Ps\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-28T20:39:52Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-gcrxx\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-28T20:40:00Z is after 2025-08-24T17:21:41Z" Jan 28 20:40:00 crc kubenswrapper[4746]: I0128 20:40:00.210781 4746 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-8vmvh" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"c4d15639-62fb-41b7-a1d4-6f51f3af6d99\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-28T20:39:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-28T20:39:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-28T20:39:53Z\\\",\\\"message\\\":\\\"containers with unready status: [nbdb sbdb ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-28T20:39:53Z\\\",\\\"message\\\":\\\"containers with unready status: [nbdb sbdb 
ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://2637734601bf1e43535b70bd62cc6aea7fc9f62d0d2a33c83602e7065a793be1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-28T20:39:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-kgrqs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://73e82be84afa7be760acda86ebf6c956d558dcbd4f37f049fba84e08c75fcd9f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-28T20:39:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\
",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-kgrqs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://17d09f7f46014da3713f76f111bca0a7ce8a24a154aa6144291d760d85e7d3ec\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-28T20:39:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-kgrqs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://b7ad895d0ff1ad570136d96671e1bde5c62882af50a407945ef0e7bc52727b88\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-28T20:39:54Z\\\"}},\\\"volumeMounts\\\":[{
\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-kgrqs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://af842ba33fdf06048038dd3e12c59f7a9d0867aa28acee1f2cc4571e7db28b88\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-28T20:39:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-kgrqs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://57a776a1daf1b17eb44d9fa09fe2f7ec01f647aa0a40d0d571736d2f7c2f8e4d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\"
,\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-28T20:39:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-kgrqs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://d5e9d34221d49b3737f9b22c58b53644245a0384c7dce6402d4e84c84fef9aa4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-28T20:39:59Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnl
y\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-kgrqs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://cdd60109dba5f45418dd5bc29c649a50aae8479fa77fc3dc50ca4f7c3042e3fc\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b
17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-28T20:39:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-kgrqs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://45e59dfe364d511e17b8264a31157501de3ed23e7125d79979df83d328242328\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://45e59dfe364d511e17b8264a31157501de3ed23e7125d79979df83d328242328\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-28T20:39:53Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-28T20:39:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-kgrqs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Ru
nning\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-28T20:39:53Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-8vmvh\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-28T20:40:00Z is after 2025-08-24T17:21:41Z" Jan 28 20:40:00 crc kubenswrapper[4746]: I0128 20:40:00.228973 4746 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"64499f0d-5c10-43a9-b479-9b6c7ef2fb9c\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-28T20:39:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-28T20:39:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-28T20:39:32Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-28T20:39:32Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-28T20:39:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://9a304b12b4a1e30cbbbb16aa4f91514f1b1a64c716cf8ac023803a279b310f9c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-28T20:39:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://6af41f85c56be6c24127bec0aee8e02562dbb04bd22e57c6887f6f5e914a7530\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-28T20:39:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri
-o://f31e9e80576ec4eb59ec92039601fd94bd323d29493c353f9ec6b9c61187b0d3\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-28T20:39:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://56b250a192cfa698ccd7ac8233b75b9acbf2347549c712b47ccc1b89f144aa83\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://56b250a192cfa698ccd7ac8233b75b9acbf2347549c712b47ccc1b89f144aa83\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-01-28T20:39:52Z\\\",\\\"message\\\":\\\"le observer\\\\nW0128 20:39:52.695680 1 builder.go:272] unable to get owner reference (falling back to namespace): pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI0128 20:39:52.695807 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0128 20:39:52.696767 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" 
name=\\\\\\\"serving-cert::/tmp/serving-cert-2500116620/tls.crt::/tmp/serving-cert-2500116620/tls.key\\\\\\\"\\\\nI0128 20:39:52.900914 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI0128 20:39:52.915714 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI0128 20:39:52.915750 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI0128 20:39:52.916035 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI0128 20:39:52.916045 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI0128 20:39:52.925040 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW0128 20:39:52.925098 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0128 20:39:52.925104 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0128 20:39:52.925109 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW0128 20:39:52.925113 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW0128 20:39:52.925116 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW0128 20:39:52.925119 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nI0128 20:39:52.925416 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nF0128 20:39:52.926926 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-01-28T20:39:47Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 10s restarting failed 
container=kube-apiserver-check-endpoints pod=kube-apiserver-crc_openshift-kube-apiserver(f4b27818a5e8e43d0dc095d08835c792)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://03a94dee0b9663441e0ecec16b121c5be4e5c557f17c929b8edfb0d7ba00c3d5\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-28T20:39:35Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c5808a729190035a73529ac58efb0790f2d9c3b98c5f1a8017389e4ca1738c4c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://c5808a729190035a73529ac58efb0790f2d9c3b98c5f1a8017389e4ca1738c4c\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-28T20:39:33Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-28T20:39:33Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"n
ame\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-28T20:39:32Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-28T20:40:00Z is after 2025-08-24T17:21:41Z" Jan 28 20:40:00 crc kubenswrapper[4746]: I0128 20:40:00.242358 4746 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-28T20:39:54Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-28T20:39:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-28T20:39:54Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://2f64131d62eca6c3a42e9177d14fa39ed6435e536f60ef5d25ff6bf061d2cc0a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\
":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-28T20:39:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-28T20:40:00Z is after 2025-08-24T17:21:41Z" Jan 28 20:40:00 crc kubenswrapper[4746]: I0128 20:40:00.244895 4746 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 28 20:40:00 crc kubenswrapper[4746]: I0128 20:40:00.244943 4746 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 28 20:40:00 crc kubenswrapper[4746]: I0128 20:40:00.244956 4746 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 28 20:40:00 crc kubenswrapper[4746]: I0128 20:40:00.244978 4746 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 28 20:40:00 crc kubenswrapper[4746]: I0128 20:40:00.244994 4746 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-28T20:40:00Z","lastTransitionTime":"2026-01-28T20:40:00Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: 
no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 28 20:40:00 crc kubenswrapper[4746]: I0128 20:40:00.256576 4746 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-28T20:39:57Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-28T20:39:57Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-28T20:39:57Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://83a275ec9868f2fd9e24135d5a05c27caf843340c144070952291bb6a3715035\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-28T20:39:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]
}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-28T20:40:00Z is after 2025-08-24T17:21:41Z" Jan 28 20:40:00 crc kubenswrapper[4746]: I0128 20:40:00.271321 4746 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-6wrnw" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"6dc8b546-9734-4082-b2b3-2bafe3f1564d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-28T20:39:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-28T20:39:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-28T20:39:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-28T20:39:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://50ec1ea913ec63a096032bd968bda5c5c8879e16a08e6a57727750517d9af038\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"
kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-28T20:39:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-z84f9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://66d250466155027405bf90c4e5ed09388238d4cee63604e28486a16778f9d188\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-28T20:39:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-z84f9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-28T20:39:53Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-6wrnw\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: 
current time 2026-01-28T20:40:00Z is after 2025-08-24T17:21:41Z" Jan 28 20:40:00 crc kubenswrapper[4746]: I0128 20:40:00.289702 4746 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-ht6hp" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"beb2b795-6bf4-4d38-89f7-bcb5512c3e61\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-28T20:39:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-28T20:39:53Z\\\",\\\"message\\\":\\\"containers with incomplete status: [whereabouts-cni]\\\",\\\"reason\\\":\\\"ContainersNotInitialized\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-28T20:39:53Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-28T20:39:53Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rr62b\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b2d8bf207108ac2304cd04e34cd9843fcb755ac8610a3f88448538d1e99359f6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b2d8bf207108ac2304cd04e34cd9843fcb755ac8610a3f88448538d1e99359f6\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-28T20:39:54Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-28T20:39:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Di
sabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rr62b\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://40c78d39cb70eba245771f4d85d2c8bc7b77427ea216ad3b89749d24fd550098\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://40c78d39cb70eba245771f4d85d2c8bc7b77427ea216ad3b89749d24fd550098\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-28T20:39:55Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-28T20:39:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rr62b\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://498028ca0bd260ca2c4d7b0231ea54a5a6ff5b302f720fbd209a4f640507258b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367
c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://498028ca0bd260ca2c4d7b0231ea54a5a6ff5b302f720fbd209a4f640507258b\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-28T20:39:56Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-28T20:39:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rr62b\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://661d4a9031b6789f38ed1ac8e3c40d72182709130e1680ecdad86c07c1f34b7b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://661d4a9031b6789f38ed1ac8e3c40d72182709130e1680ecdad86c07c1f34b7b\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-28T20:39:57Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-28T20:39:57Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\
\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rr62b\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://a49ffe0bbe4151671dc8803d7ab61cb9b3a2eae064b81e2830bc074787f4754e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://a49ffe0bbe4151671dc8803d7ab61cb9b3a2eae064b81e2830bc074787f4754e\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-28T20:39:58Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-28T20:39:58Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rr62b\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"c
nibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rr62b\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-28T20:39:53Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-ht6hp\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-28T20:40:00Z is after 2025-08-24T17:21:41Z" Jan 28 20:40:00 crc kubenswrapper[4746]: I0128 20:40:00.308389 4746 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-qhpvf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"cdf26de0-b602-4bdf-b492-65b3b6b31434\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-28T20:39:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-28T20:39:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-28T20:39:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-28T20:39:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f80345
dfd4de46e278611370e6c337968cb1ba7ed8275457deea5871704f8367\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-28T20:39:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountP
ath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jnpsl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-28T20:39:53Z\\\"}}\" for pod \"openshift-multus\"/\"multus-qhpvf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-28T20:40:00Z is after 2025-08-24T17:21:41Z" Jan 28 20:40:00 crc kubenswrapper[4746]: I0128 20:40:00.325117 4746 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"950d71b3-9b51-43dc-814a-d6a838723f78\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-28T20:39:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-28T20:39:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-28T20:39:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-28T20:39:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-28T20:39:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0411ecf7c85f5397d751adb7d4905977dcbd3d2e169c56ebbe26017192765602\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-28T20:39:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://319cbcaa9981b3c79eb0c8f970f0ad776fb255db10c72c9b6a13469906e9e5ec\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-
art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-28T20:39:33Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://eefda7da35cd9ff176512b5d1224ccadcb5debd0035a49822e82a9757d4a3f91\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-28T20:39:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://145f4f882924f06172df81d94fc9bd683ea1fc04889de5f6cfb5caece8227fd0\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"started
At\\\":\\\"2026-01-28T20:39:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-28T20:39:32Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-28T20:40:00Z is after 2025-08-24T17:21:41Z" Jan 28 20:40:00 crc kubenswrapper[4746]: I0128 20:40:00.338790 4746 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-d8rwq" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"405572d8-50d2-4d7a-adf0-d8d6adea31ca\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-28T20:39:57Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-28T20:39:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-28T20:39:57Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-28T20:39:57Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://eba305a172c49420674e97d32590a7b2e4c7c2cd7406257508e0636cf42ea244\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-28T20:39:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mb4h4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phas
e\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-28T20:39:56Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-d8rwq\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-28T20:40:00Z is after 2025-08-24T17:21:41Z" Jan 28 20:40:00 crc kubenswrapper[4746]: I0128 20:40:00.347819 4746 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 28 20:40:00 crc kubenswrapper[4746]: I0128 20:40:00.347872 4746 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 28 20:40:00 crc kubenswrapper[4746]: I0128 20:40:00.347889 4746 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 28 20:40:00 crc kubenswrapper[4746]: I0128 20:40:00.347909 4746 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 28 20:40:00 crc kubenswrapper[4746]: I0128 20:40:00.347921 4746 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-28T20:40:00Z","lastTransitionTime":"2026-01-28T20:40:00Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 28 20:40:00 crc kubenswrapper[4746]: I0128 20:40:00.352196 4746 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-28T20:39:52Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-28T20:39:52Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-28T20:40:00Z is after 2025-08-24T17:21:41Z" Jan 28 20:40:00 crc kubenswrapper[4746]: I0128 20:40:00.365884 4746 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-28T20:39:53Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-28T20:39:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-28T20:39:53Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5dff3e2075b8bbfb6af77c72362ccc37e1d7810a2d97c096cf501681c98699ab\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-28T20:39:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://772e165efa6d865a669c172958e0aea3112146637e354e4587b7e664454112ad\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af
0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-28T20:39:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-28T20:40:00Z is after 2025-08-24T17:21:41Z" Jan 28 20:40:00 crc kubenswrapper[4746]: I0128 20:40:00.378328 4746 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-gcrxx" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"1c3c93a1-7ddf-4339-9ca3-79f3753943b4\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-28T20:39:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-28T20:39:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-28T20:39:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-28T20:39:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5bd8d8214aa8e85e50747643b6354fb6e026866c926310fb2dd79760f916f468\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-28T20:39:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wn6hr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podI
Ps\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-28T20:39:52Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-gcrxx\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-28T20:40:00Z is after 2025-08-24T17:21:41Z" Jan 28 20:40:00 crc kubenswrapper[4746]: I0128 20:40:00.396737 4746 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-8vmvh" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"c4d15639-62fb-41b7-a1d4-6f51f3af6d99\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-28T20:39:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-28T20:39:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-28T20:39:53Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-28T20:39:53Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://2637734601bf1e43535b70bd62cc6aea7fc9f62d0d2a33c83602e7065a793be1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-28T20:39:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-kgrqs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://73e82be84afa7be760acda86ebf6c956d558dcbd4f37f049fba84e08c75fcd9f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-28T20:39:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\
\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-kgrqs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://17d09f7f46014da3713f76f111bca0a7ce8a24a154aa6144291d760d85e7d3ec\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-28T20:39:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-kgrqs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://b7ad895d0ff1ad570136d96671e1bde5c62882af50a407945ef0e7bc52727b88\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-28T20:39:54Z\\\"}},\\\"volumeMounts\\\":[{
\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-kgrqs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://af842ba33fdf06048038dd3e12c59f7a9d0867aa28acee1f2cc4571e7db28b88\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-28T20:39:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-kgrqs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://57a776a1daf1b17eb44d9fa09fe2f7ec01f647aa0a40d0d571736d2f7c2f8e4d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\"
,\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-28T20:39:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-kgrqs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://d5e9d34221d49b3737f9b22c58b53644245a0384c7dce6402d4e84c84fef9aa4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-28T20:39:59Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnl
y\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-kgrqs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://cdd60109dba5f45418dd5bc29c649a50aae8479fa77fc3dc50ca4f7c3042e3fc\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b
17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-28T20:39:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-kgrqs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://45e59dfe364d511e17b8264a31157501de3ed23e7125d79979df83d328242328\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://45e59dfe364d511e17b8264a31157501de3ed23e7125d79979df83d328242328\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-28T20:39:53Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-28T20:39:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-kgrqs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Run
ning\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-28T20:39:53Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-8vmvh\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-28T20:40:00Z is after 2025-08-24T17:21:41Z" Jan 28 20:40:00 crc kubenswrapper[4746]: I0128 20:40:00.413523 4746 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"64499f0d-5c10-43a9-b479-9b6c7ef2fb9c\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-28T20:39:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-28T20:39:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-28T20:39:32Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-28T20:39:32Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-28T20:39:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://9a304b12b4a1e30cbbbb16aa4f91514f1b1a64c716cf8ac023803a279b310f9c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-28T20:39:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://6af41f85c56be6c24127bec0aee8e02562dbb04bd22e57c6887f6f5e914a7530\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-28T20:39:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri
-o://f31e9e80576ec4eb59ec92039601fd94bd323d29493c353f9ec6b9c61187b0d3\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-28T20:39:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://56b250a192cfa698ccd7ac8233b75b9acbf2347549c712b47ccc1b89f144aa83\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://56b250a192cfa698ccd7ac8233b75b9acbf2347549c712b47ccc1b89f144aa83\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-01-28T20:39:52Z\\\",\\\"message\\\":\\\"le observer\\\\nW0128 20:39:52.695680 1 builder.go:272] unable to get owner reference (falling back to namespace): pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI0128 20:39:52.695807 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0128 20:39:52.696767 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" 
name=\\\\\\\"serving-cert::/tmp/serving-cert-2500116620/tls.crt::/tmp/serving-cert-2500116620/tls.key\\\\\\\"\\\\nI0128 20:39:52.900914 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI0128 20:39:52.915714 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI0128 20:39:52.915750 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI0128 20:39:52.916035 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI0128 20:39:52.916045 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI0128 20:39:52.925040 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW0128 20:39:52.925098 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0128 20:39:52.925104 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0128 20:39:52.925109 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW0128 20:39:52.925113 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW0128 20:39:52.925116 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW0128 20:39:52.925119 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nI0128 20:39:52.925416 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nF0128 20:39:52.926926 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-01-28T20:39:47Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 10s restarting failed 
container=kube-apiserver-check-endpoints pod=kube-apiserver-crc_openshift-kube-apiserver(f4b27818a5e8e43d0dc095d08835c792)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://03a94dee0b9663441e0ecec16b121c5be4e5c557f17c929b8edfb0d7ba00c3d5\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-28T20:39:35Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c5808a729190035a73529ac58efb0790f2d9c3b98c5f1a8017389e4ca1738c4c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://c5808a729190035a73529ac58efb0790f2d9c3b98c5f1a8017389e4ca1738c4c\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-28T20:39:33Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-28T20:39:33Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"n
ame\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-28T20:39:32Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-28T20:40:00Z is after 2025-08-24T17:21:41Z" Jan 28 20:40:00 crc kubenswrapper[4746]: I0128 20:40:00.428839 4746 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-28T20:39:54Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-28T20:39:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-28T20:39:54Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://2f64131d62eca6c3a42e9177d14fa39ed6435e536f60ef5d25ff6bf061d2cc0a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\
":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-28T20:39:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-28T20:40:00Z is after 2025-08-24T17:21:41Z" Jan 28 20:40:00 crc kubenswrapper[4746]: I0128 20:40:00.441133 4746 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-28T20:39:57Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-28T20:39:57Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-28T20:39:57Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://83a275ec9868f2fd9e24135d5a05c27caf843340c144070952291bb6a3715035\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-28T20:39:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has 
expired or is not yet valid: current time 2026-01-28T20:40:00Z is after 2025-08-24T17:21:41Z" Jan 28 20:40:00 crc kubenswrapper[4746]: I0128 20:40:00.450347 4746 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 28 20:40:00 crc kubenswrapper[4746]: I0128 20:40:00.450385 4746 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 28 20:40:00 crc kubenswrapper[4746]: I0128 20:40:00.450395 4746 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 28 20:40:00 crc kubenswrapper[4746]: I0128 20:40:00.450414 4746 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 28 20:40:00 crc kubenswrapper[4746]: I0128 20:40:00.450425 4746 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-28T20:40:00Z","lastTransitionTime":"2026-01-28T20:40:00Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 28 20:40:00 crc kubenswrapper[4746]: I0128 20:40:00.456005 4746 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-6wrnw" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"6dc8b546-9734-4082-b2b3-2bafe3f1564d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-28T20:39:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-28T20:39:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-28T20:39:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-28T20:39:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://50ec1ea913ec63a096032bd968bda5c5c8879e16a08e6a57727750517d9af038\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-28T20:39:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"}
,{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-z84f9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://66d250466155027405bf90c4e5ed09388238d4cee63604e28486a16778f9d188\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-28T20:39:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-z84f9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-28T20:39:53Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-6wrnw\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-28T20:40:00Z is after 2025-08-24T17:21:41Z" Jan 28 20:40:00 crc kubenswrapper[4746]: I0128 20:40:00.464222 4746 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: 
\"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Jan 28 20:40:00 crc kubenswrapper[4746]: E0128 20:40:00.464412 4746 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-01-28 20:40:08.464386555 +0000 UTC m=+36.420572929 (durationBeforeRetry 8s). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Jan 28 20:40:00 crc kubenswrapper[4746]: I0128 20:40:00.464523 4746 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-cqllr\" (UniqueName: \"kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr\") pod \"network-check-target-xd92c\" (UID: \"3b6479f0-333b-4a96-9adf-2099afdc2447\") " pod="openshift-network-diagnostics/network-check-target-xd92c" Jan 28 20:40:00 crc kubenswrapper[4746]: I0128 20:40:00.464622 4746 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"networking-console-plugin-cert\" (UniqueName: \"kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Jan 28 20:40:00 crc kubenswrapper[4746]: E0128 20:40:00.464676 4746 
projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Jan 28 20:40:00 crc kubenswrapper[4746]: E0128 20:40:00.464701 4746 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Jan 28 20:40:00 crc kubenswrapper[4746]: E0128 20:40:00.464718 4746 projected.go:194] Error preparing data for projected volume kube-api-access-cqllr for pod openshift-network-diagnostics/network-check-target-xd92c: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Jan 28 20:40:00 crc kubenswrapper[4746]: E0128 20:40:00.464763 4746 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr podName:3b6479f0-333b-4a96-9adf-2099afdc2447 nodeName:}" failed. No retries permitted until 2026-01-28 20:40:08.464752065 +0000 UTC m=+36.420938429 (durationBeforeRetry 8s). 
Error: MountVolume.SetUp failed for volume "kube-api-access-cqllr" (UniqueName: "kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr") pod "network-check-target-xd92c" (UID: "3b6479f0-333b-4a96-9adf-2099afdc2447") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Jan 28 20:40:00 crc kubenswrapper[4746]: I0128 20:40:00.464841 4746 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nginx-conf\" (UniqueName: \"kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Jan 28 20:40:00 crc kubenswrapper[4746]: E0128 20:40:00.464838 4746 secret.go:188] Couldn't get secret openshift-network-console/networking-console-plugin-cert: object "openshift-network-console"/"networking-console-plugin-cert" not registered Jan 28 20:40:00 crc kubenswrapper[4746]: E0128 20:40:00.465062 4746 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. No retries permitted until 2026-01-28 20:40:08.465051874 +0000 UTC m=+36.421238228 (durationBeforeRetry 8s). 
Error: MountVolume.SetUp failed for volume "networking-console-plugin-cert" (UniqueName: "kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin-cert" not registered Jan 28 20:40:00 crc kubenswrapper[4746]: E0128 20:40:00.464887 4746 configmap.go:193] Couldn't get configMap openshift-network-console/networking-console-plugin: object "openshift-network-console"/"networking-console-plugin" not registered Jan 28 20:40:00 crc kubenswrapper[4746]: E0128 20:40:00.465287 4746 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. No retries permitted until 2026-01-28 20:40:08.46527671 +0000 UTC m=+36.421463064 (durationBeforeRetry 8s). Error: MountVolume.SetUp failed for volume "nginx-conf" (UniqueName: "kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin" not registered Jan 28 20:40:00 crc kubenswrapper[4746]: I0128 20:40:00.471257 4746 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-ht6hp" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"beb2b795-6bf4-4d38-89f7-bcb5512c3e61\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-28T20:39:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-28T20:40:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-28T20:39:53Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-28T20:39:53Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rr62b\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b2d8bf207108ac2304cd04e34cd9843fcb755ac8610a3f88448538d1e99359f6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b2d8bf207108ac2304cd04e34cd9843fcb755ac8610a3f88448538d1e99359f6\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-28T20:39:54Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-28T20:39:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Di
sabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rr62b\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://40c78d39cb70eba245771f4d85d2c8bc7b77427ea216ad3b89749d24fd550098\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://40c78d39cb70eba245771f4d85d2c8bc7b77427ea216ad3b89749d24fd550098\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-28T20:39:55Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-28T20:39:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rr62b\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://498028ca0bd260ca2c4d7b0231ea54a5a6ff5b302f720fbd209a4f640507258b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367
c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://498028ca0bd260ca2c4d7b0231ea54a5a6ff5b302f720fbd209a4f640507258b\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-28T20:39:56Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-28T20:39:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rr62b\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://661d4a9031b6789f38ed1ac8e3c40d72182709130e1680ecdad86c07c1f34b7b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://661d4a9031b6789f38ed1ac8e3c40d72182709130e1680ecdad86c07c1f34b7b\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-28T20:39:57Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-28T20:39:57Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\
\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rr62b\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://a49ffe0bbe4151671dc8803d7ab61cb9b3a2eae064b81e2830bc074787f4754e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://a49ffe0bbe4151671dc8803d7ab61cb9b3a2eae064b81e2830bc074787f4754e\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-28T20:39:58Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-28T20:39:58Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rr62b\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://cd521e9ecb53f581b5732a4d018f92007063ce0912e6d760fe3c920218c23b14\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"
ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://cd521e9ecb53f581b5732a4d018f92007063ce0912e6d760fe3c920218c23b14\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-28T20:39:59Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-28T20:39:59Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rr62b\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-28T20:39:53Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-ht6hp\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-28T20:40:00Z is after 2025-08-24T17:21:41Z" Jan 28 20:40:00 crc kubenswrapper[4746]: I0128 20:40:00.487072 4746 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-qhpvf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"cdf26de0-b602-4bdf-b492-65b3b6b31434\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-28T20:39:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-28T20:39:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-28T20:39:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-28T20:39:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f80345dfd4de46e278611370e6c337968cb1ba7ed8275457deea5871704f8367\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-28T20:39:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\
"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jnpsl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-28T20:39:53Z\\\"}}\" for pod \"openshift-multus\"/\"multus-qhpvf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-28T20:40:00Z is after 2025-08-24T17:21:41Z" Jan 28 20:40:00 crc kubenswrapper[4746]: I0128 20:40:00.503536 4746 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" 
err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"950d71b3-9b51-43dc-814a-d6a838723f78\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-28T20:39:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-28T20:39:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-28T20:39:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-28T20:39:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-28T20:39:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0411ecf7c85f5397d751adb7d4905977dcbd3d2e169c56ebbe26017192765602\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-28T20:39:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://319cbcaa9981b3c79eb0c8f970f0ad776fb255db10c72c9b6a13469906e9e5ec\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/ope
nshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-28T20:39:33Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://eefda7da35cd9ff176512b5d1224ccadcb5debd0035a49822e82a9757d4a3f91\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-28T20:39:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://145f4f882924f06172df81d94fc9bd683ea1fc04889de5f6cfb5caece8227fd0\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{
\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-28T20:39:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-28T20:39:32Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-28T20:40:00Z is after 2025-08-24T17:21:41Z" Jan 28 20:40:00 crc kubenswrapper[4746]: I0128 20:40:00.515982 4746 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-d8rwq" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"405572d8-50d2-4d7a-adf0-d8d6adea31ca\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-28T20:39:57Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-28T20:39:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-28T20:39:57Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-28T20:39:57Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://eba305a172c49420674e97d32590a7b2e4c7c2cd7406257508e0636cf42ea244\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-28T20:39:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mb4h4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phas
e\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-28T20:39:56Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-d8rwq\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-28T20:40:00Z is after 2025-08-24T17:21:41Z" Jan 28 20:40:00 crc kubenswrapper[4746]: I0128 20:40:00.530390 4746 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-28T20:39:52Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-28T20:39:52Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-28T20:40:00Z is after 2025-08-24T17:21:41Z" Jan 28 20:40:00 crc kubenswrapper[4746]: I0128 20:40:00.544644 4746 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-28T20:39:52Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-28T20:39:52Z\\\",\\\"message\\\":\\\"containers with unready status: 
[network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-28T20:40:00Z is after 2025-08-24T17:21:41Z" Jan 28 20:40:00 crc kubenswrapper[4746]: I0128 20:40:00.553622 4746 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 28 20:40:00 crc kubenswrapper[4746]: I0128 20:40:00.553781 4746 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 28 20:40:00 crc kubenswrapper[4746]: I0128 20:40:00.553851 4746 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 28 20:40:00 crc 
kubenswrapper[4746]: I0128 20:40:00.553920 4746 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 28 20:40:00 crc kubenswrapper[4746]: I0128 20:40:00.554002 4746 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-28T20:40:00Z","lastTransitionTime":"2026-01-28T20:40:00Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 28 20:40:00 crc kubenswrapper[4746]: I0128 20:40:00.566167 4746 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-s2dwl\" (UniqueName: \"kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl\") pod \"network-check-source-55646444c4-trplf\" (UID: \"9d751cbb-f2e2-430d-9754-c882a5e924a5\") " pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Jan 28 20:40:00 crc kubenswrapper[4746]: E0128 20:40:00.566365 4746 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Jan 28 20:40:00 crc kubenswrapper[4746]: E0128 20:40:00.566413 4746 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Jan 28 20:40:00 crc kubenswrapper[4746]: E0128 20:40:00.566430 4746 projected.go:194] Error preparing data for projected volume kube-api-access-s2dwl for pod openshift-network-diagnostics/network-check-source-55646444c4-trplf: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Jan 28 20:40:00 crc kubenswrapper[4746]: E0128 
20:40:00.566505 4746 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl podName:9d751cbb-f2e2-430d-9754-c882a5e924a5 nodeName:}" failed. No retries permitted until 2026-01-28 20:40:08.566484138 +0000 UTC m=+36.522670492 (durationBeforeRetry 8s). Error: MountVolume.SetUp failed for volume "kube-api-access-s2dwl" (UniqueName: "kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl") pod "network-check-source-55646444c4-trplf" (UID: "9d751cbb-f2e2-430d-9754-c882a5e924a5") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Jan 28 20:40:00 crc kubenswrapper[4746]: I0128 20:40:00.657414 4746 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 28 20:40:00 crc kubenswrapper[4746]: I0128 20:40:00.657471 4746 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 28 20:40:00 crc kubenswrapper[4746]: I0128 20:40:00.657484 4746 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 28 20:40:00 crc kubenswrapper[4746]: I0128 20:40:00.657507 4746 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 28 20:40:00 crc kubenswrapper[4746]: I0128 20:40:00.657526 4746 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-28T20:40:00Z","lastTransitionTime":"2026-01-28T20:40:00Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 28 20:40:00 crc kubenswrapper[4746]: I0128 20:40:00.760806 4746 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 28 20:40:00 crc kubenswrapper[4746]: I0128 20:40:00.760844 4746 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 28 20:40:00 crc kubenswrapper[4746]: I0128 20:40:00.760856 4746 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 28 20:40:00 crc kubenswrapper[4746]: I0128 20:40:00.760873 4746 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 28 20:40:00 crc kubenswrapper[4746]: I0128 20:40:00.760885 4746 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-28T20:40:00Z","lastTransitionTime":"2026-01-28T20:40:00Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 28 20:40:00 crc kubenswrapper[4746]: I0128 20:40:00.817570 4746 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2025-11-18 03:01:03.093468279 +0000 UTC Jan 28 20:40:00 crc kubenswrapper[4746]: I0128 20:40:00.835051 4746 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Jan 28 20:40:00 crc kubenswrapper[4746]: I0128 20:40:00.835070 4746 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Jan 28 20:40:00 crc kubenswrapper[4746]: E0128 20:40:00.835294 4746 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Jan 28 20:40:00 crc kubenswrapper[4746]: E0128 20:40:00.835414 4746 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Jan 28 20:40:00 crc kubenswrapper[4746]: I0128 20:40:00.863097 4746 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 28 20:40:00 crc kubenswrapper[4746]: I0128 20:40:00.863488 4746 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 28 20:40:00 crc kubenswrapper[4746]: I0128 20:40:00.863609 4746 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 28 20:40:00 crc kubenswrapper[4746]: I0128 20:40:00.863934 4746 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 28 20:40:00 crc kubenswrapper[4746]: I0128 20:40:00.864233 4746 setters.go:603] "Node became not ready" node="crc" 
condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-28T20:40:00Z","lastTransitionTime":"2026-01-28T20:40:00Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 28 20:40:00 crc kubenswrapper[4746]: I0128 20:40:00.967715 4746 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 28 20:40:00 crc kubenswrapper[4746]: I0128 20:40:00.967764 4746 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 28 20:40:00 crc kubenswrapper[4746]: I0128 20:40:00.967779 4746 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 28 20:40:00 crc kubenswrapper[4746]: I0128 20:40:00.967810 4746 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 28 20:40:00 crc kubenswrapper[4746]: I0128 20:40:00.967829 4746 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-28T20:40:00Z","lastTransitionTime":"2026-01-28T20:40:00Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 28 20:40:01 crc kubenswrapper[4746]: I0128 20:40:01.071565 4746 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 28 20:40:01 crc kubenswrapper[4746]: I0128 20:40:01.071643 4746 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 28 20:40:01 crc kubenswrapper[4746]: I0128 20:40:01.071670 4746 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 28 20:40:01 crc kubenswrapper[4746]: I0128 20:40:01.071704 4746 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 28 20:40:01 crc kubenswrapper[4746]: I0128 20:40:01.071724 4746 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-28T20:40:01Z","lastTransitionTime":"2026-01-28T20:40:01Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 28 20:40:01 crc kubenswrapper[4746]: I0128 20:40:01.103799 4746 prober_manager.go:312] "Failed to trigger a manual run" probe="Readiness" Jan 28 20:40:01 crc kubenswrapper[4746]: I0128 20:40:01.105439 4746 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-ht6hp" event={"ID":"beb2b795-6bf4-4d38-89f7-bcb5512c3e61","Type":"ContainerStarted","Data":"564d42c4c25f218c1a1f52449d2f3d941c1dad920c8100d1d908cadf7cf46607"} Jan 28 20:40:01 crc kubenswrapper[4746]: I0128 20:40:01.129614 4746 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"950d71b3-9b51-43dc-814a-d6a838723f78\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-28T20:39:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-28T20:39:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-28T20:39:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-28T20:39:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-28T20:39:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0411ecf7c85f5397d751adb7d4905977dcbd3d2e169c56ebbe26017192765602\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"
,\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-28T20:39:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://319cbcaa9981b3c79eb0c8f970f0ad776fb255db10c72c9b6a13469906e9e5ec\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-28T20:39:33Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://eefda7da35cd9ff176512b5d1224ccadcb5debd0035a49822e82a9757d4a3f91\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-28T20:39:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resour
ce-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://145f4f882924f06172df81d94fc9bd683ea1fc04889de5f6cfb5caece8227fd0\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-28T20:39:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-28T20:39:32Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-28T20:40:01Z is after 2025-08-24T17:21:41Z" Jan 28 20:40:01 crc kubenswrapper[4746]: I0128 20:40:01.145020 4746 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-d8rwq" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"405572d8-50d2-4d7a-adf0-d8d6adea31ca\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-28T20:39:57Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-28T20:39:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-28T20:39:57Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-28T20:39:57Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://eba305a172c49420674e97d32590a7b2e4c7c2cd7406257508e0636cf42ea244\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-28T20:39:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mb4h4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phas
e\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-28T20:39:56Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-d8rwq\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-28T20:40:01Z is after 2025-08-24T17:21:41Z" Jan 28 20:40:01 crc kubenswrapper[4746]: I0128 20:40:01.167416 4746 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-28T20:39:52Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-28T20:39:52Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-28T20:40:01Z is after 2025-08-24T17:21:41Z" Jan 28 20:40:01 crc kubenswrapper[4746]: I0128 20:40:01.176288 4746 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 28 20:40:01 crc kubenswrapper[4746]: I0128 20:40:01.176359 4746 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 28 20:40:01 crc kubenswrapper[4746]: I0128 20:40:01.176397 4746 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 28 20:40:01 crc kubenswrapper[4746]: I0128 20:40:01.176426 4746 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 28 20:40:01 crc kubenswrapper[4746]: I0128 20:40:01.176447 4746 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-28T20:40:01Z","lastTransitionTime":"2026-01-28T20:40:01Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady 
message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 28 20:40:01 crc kubenswrapper[4746]: I0128 20:40:01.186465 4746 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-28T20:39:52Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-28T20:39:52Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-28T20:40:01Z is after 2025-08-24T17:21:41Z" Jan 28 20:40:01 crc kubenswrapper[4746]: I0128 20:40:01.217366 4746 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-8vmvh" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"c4d15639-62fb-41b7-a1d4-6f51f3af6d99\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-28T20:39:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-28T20:39:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-28T20:39:53Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-28T20:39:53Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://2637734601bf1e43535b70bd62cc6aea7fc9f62d0d2a33c83602e7065a793be1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-28T20:39:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-kgrqs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://73e82be84afa7be760acda86ebf6c956d558dcbd4f37f049fba84e08c75fcd9f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\
\"running\\\":{\\\"startedAt\\\":\\\"2026-01-28T20:39:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-kgrqs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://17d09f7f46014da3713f76f111bca0a7ce8a24a154aa6144291d760d85e7d3ec\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-28T20:39:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-kgrqs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://b7ad895d0ff1ad570136d96671e1bde5c62882af50a407945ef0e7bc52727b88\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d20994829
19d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-28T20:39:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-kgrqs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://af842ba33fdf06048038dd3e12c59f7a9d0867aa28acee1f2cc4571e7db28b88\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-28T20:39:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-kgrqs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://57a776a1daf1b17eb44d9fa09fe2f7ec01f647aa0a40d0d571736d2f7c2f8e4d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cd
d47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-28T20:39:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-kgrqs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://d5e9d34221d49b3737f9b22c58b53644245a0384c7dce6402d4e84c84fef9aa4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-28T20:39:59Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mou
ntPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-kgrqs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://cdd60109dba5f45418dd5bc29c649a50aae8479fa77fc3dc50ca4f7c3042e3fc\\\",\
\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-28T20:39:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-kgrqs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://45e59dfe364d511e17b8264a31157501de3ed23e7125d79979df83d328242328\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://45e59dfe364d511e17b8264a31157501de3ed23e7125d79979df83d328242328\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-28T20:39:53Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-28T20:39:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\
\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-kgrqs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-28T20:39:53Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-8vmvh\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-28T20:40:01Z is after 2025-08-24T17:21:41Z" Jan 28 20:40:01 crc kubenswrapper[4746]: I0128 20:40:01.237519 4746 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-28T20:39:52Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-28T20:39:52Z\\\",\\\"message\\\":\\\"containers with unready status: 
[networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-28T20:40:01Z is after 2025-08-24T17:21:41Z" Jan 28 20:40:01 crc kubenswrapper[4746]: I0128 20:40:01.256791 4746 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-28T20:39:53Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-28T20:39:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-28T20:39:53Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5dff3e2075b8bbfb6af77c72362ccc37e1d7810a2d97c096cf501681c98699ab\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-28T20:39:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://772e165efa6d865a669c172958e0aea3112146637e354e4587b7e664454112ad\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af
0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-28T20:39:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-28T20:40:01Z is after 2025-08-24T17:21:41Z" Jan 28 20:40:01 crc kubenswrapper[4746]: I0128 20:40:01.269275 4746 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-gcrxx" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"1c3c93a1-7ddf-4339-9ca3-79f3753943b4\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-28T20:39:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-28T20:39:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-28T20:39:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-28T20:39:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5bd8d8214aa8e85e50747643b6354fb6e026866c926310fb2dd79760f916f468\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-28T20:39:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wn6hr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podI
Ps\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-28T20:39:52Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-gcrxx\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-28T20:40:01Z is after 2025-08-24T17:21:41Z" Jan 28 20:40:01 crc kubenswrapper[4746]: I0128 20:40:01.279609 4746 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 28 20:40:01 crc kubenswrapper[4746]: I0128 20:40:01.279645 4746 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 28 20:40:01 crc kubenswrapper[4746]: I0128 20:40:01.279656 4746 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 28 20:40:01 crc kubenswrapper[4746]: I0128 20:40:01.279676 4746 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 28 20:40:01 crc kubenswrapper[4746]: I0128 20:40:01.279691 4746 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-28T20:40:01Z","lastTransitionTime":"2026-01-28T20:40:01Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 28 20:40:01 crc kubenswrapper[4746]: I0128 20:40:01.286811 4746 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-ht6hp" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"beb2b795-6bf4-4d38-89f7-bcb5512c3e61\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-28T20:39:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-28T20:40:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-28T20:40:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-28T20:40:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://564d42c4c25f218c1a1f52449d2f3d941c1dad920c8100d1d908cadf7cf46607\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-28T20:40:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rr62b\\\",\\\"readOnly\\\":true,\\\"recursiveReadOn
ly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b2d8bf207108ac2304cd04e34cd9843fcb755ac8610a3f88448538d1e99359f6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b2d8bf207108ac2304cd04e34cd9843fcb755ac8610a3f88448538d1e99359f6\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-28T20:39:54Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-28T20:39:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rr62b\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://40c78d39cb70eba245771f4d85d2c8bc7b77427ea216ad3b89749d24fd550098\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"
containerID\\\":\\\"cri-o://40c78d39cb70eba245771f4d85d2c8bc7b77427ea216ad3b89749d24fd550098\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-28T20:39:55Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-28T20:39:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rr62b\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://498028ca0bd260ca2c4d7b0231ea54a5a6ff5b302f720fbd209a4f640507258b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://498028ca0bd260ca2c4d7b0231ea54a5a6ff5b302f720fbd209a4f640507258b\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-28T20:39:56Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-28T20:39:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveRead
Only\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rr62b\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://661d4a9031b6789f38ed1ac8e3c40d72182709130e1680ecdad86c07c1f34b7b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://661d4a9031b6789f38ed1ac8e3c40d72182709130e1680ecdad86c07c1f34b7b\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-28T20:39:57Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-28T20:39:57Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rr62b\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://a49ffe0bbe4151671dc8803d7ab61cb9b3a2eae064b81e2830bc074787f4754e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\"
:0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://a49ffe0bbe4151671dc8803d7ab61cb9b3a2eae064b81e2830bc074787f4754e\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-28T20:39:58Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-28T20:39:58Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rr62b\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://cd521e9ecb53f581b5732a4d018f92007063ce0912e6d760fe3c920218c23b14\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://cd521e9ecb53f581b5732a4d018f92007063ce0912e6d760fe3c920218c23b14\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-28T20:39:59Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-28T20:39:59Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rr62b\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\"
,\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-28T20:39:53Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-ht6hp\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-28T20:40:01Z is after 2025-08-24T17:21:41Z" Jan 28 20:40:01 crc kubenswrapper[4746]: I0128 20:40:01.301579 4746 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-qhpvf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"cdf26de0-b602-4bdf-b492-65b3b6b31434\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-28T20:39:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-28T20:39:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-28T20:39:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-28T20:39:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f80345dfd4de46e278611370e6c337968cb1ba7ed8275457deea5871704f8367\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7
eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-28T20:39:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jnpsl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\"
:\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-28T20:39:53Z\\\"}}\" for pod \"openshift-multus\"/\"multus-qhpvf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-28T20:40:01Z is after 2025-08-24T17:21:41Z" Jan 28 20:40:01 crc kubenswrapper[4746]: I0128 20:40:01.323886 4746 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"64499f0d-5c10-43a9-b479-9b6c7ef2fb9c\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-28T20:39:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-28T20:39:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-28T20:39:32Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-28T20:39:32Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-28T20:39:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://9a304b12b4a1e30cbbbb16aa4f91514f1b1a64c716cf8ac023803a279b310f9c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-28T20:39:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://6af41f85c56be6c24127bec0aee8e02562dbb04bd22e57c6887f6f5e914a7530\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-28T20:39:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri
-o://f31e9e80576ec4eb59ec92039601fd94bd323d29493c353f9ec6b9c61187b0d3\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-28T20:39:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://56b250a192cfa698ccd7ac8233b75b9acbf2347549c712b47ccc1b89f144aa83\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://56b250a192cfa698ccd7ac8233b75b9acbf2347549c712b47ccc1b89f144aa83\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-01-28T20:39:52Z\\\",\\\"message\\\":\\\"le observer\\\\nW0128 20:39:52.695680 1 builder.go:272] unable to get owner reference (falling back to namespace): pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI0128 20:39:52.695807 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0128 20:39:52.696767 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" 
name=\\\\\\\"serving-cert::/tmp/serving-cert-2500116620/tls.crt::/tmp/serving-cert-2500116620/tls.key\\\\\\\"\\\\nI0128 20:39:52.900914 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI0128 20:39:52.915714 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI0128 20:39:52.915750 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI0128 20:39:52.916035 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI0128 20:39:52.916045 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI0128 20:39:52.925040 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW0128 20:39:52.925098 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0128 20:39:52.925104 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0128 20:39:52.925109 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW0128 20:39:52.925113 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW0128 20:39:52.925116 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW0128 20:39:52.925119 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nI0128 20:39:52.925416 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nF0128 20:39:52.926926 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-01-28T20:39:47Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 10s restarting failed 
container=kube-apiserver-check-endpoints pod=kube-apiserver-crc_openshift-kube-apiserver(f4b27818a5e8e43d0dc095d08835c792)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://03a94dee0b9663441e0ecec16b121c5be4e5c557f17c929b8edfb0d7ba00c3d5\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-28T20:39:35Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c5808a729190035a73529ac58efb0790f2d9c3b98c5f1a8017389e4ca1738c4c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://c5808a729190035a73529ac58efb0790f2d9c3b98c5f1a8017389e4ca1738c4c\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-28T20:39:33Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-28T20:39:33Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"n
ame\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-28T20:39:32Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-28T20:40:01Z is after 2025-08-24T17:21:41Z" Jan 28 20:40:01 crc kubenswrapper[4746]: I0128 20:40:01.347123 4746 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-28T20:39:54Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-28T20:39:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-28T20:39:54Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://2f64131d62eca6c3a42e9177d14fa39ed6435e536f60ef5d25ff6bf061d2cc0a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\
":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-28T20:39:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-28T20:40:01Z is after 2025-08-24T17:21:41Z" Jan 28 20:40:01 crc kubenswrapper[4746]: I0128 20:40:01.361550 4746 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-28T20:39:57Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-28T20:39:57Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-28T20:39:57Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://83a275ec9868f2fd9e24135d5a05c27caf843340c144070952291bb6a3715035\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-28T20:39:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has 
expired or is not yet valid: current time 2026-01-28T20:40:01Z is after 2025-08-24T17:21:41Z" Jan 28 20:40:01 crc kubenswrapper[4746]: I0128 20:40:01.377177 4746 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-6wrnw" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"6dc8b546-9734-4082-b2b3-2bafe3f1564d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-28T20:39:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-28T20:39:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-28T20:39:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-28T20:39:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://50ec1ea913ec63a096032bd968bda5c5c8879e16a08e6a57727750517d9af038\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-28T20:39:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-r
bac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-z84f9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://66d250466155027405bf90c4e5ed09388238d4cee63604e28486a16778f9d188\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-28T20:39:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-z84f9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-28T20:39:53Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-6wrnw\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-28T20:40:01Z is after 2025-08-24T17:21:41Z" Jan 28 20:40:01 crc kubenswrapper[4746]: I0128 20:40:01.382147 4746 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 28 20:40:01 crc kubenswrapper[4746]: I0128 
20:40:01.382176 4746 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 28 20:40:01 crc kubenswrapper[4746]: I0128 20:40:01.382186 4746 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 28 20:40:01 crc kubenswrapper[4746]: I0128 20:40:01.382197 4746 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 28 20:40:01 crc kubenswrapper[4746]: I0128 20:40:01.382206 4746 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-28T20:40:01Z","lastTransitionTime":"2026-01-28T20:40:01Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 28 20:40:01 crc kubenswrapper[4746]: I0128 20:40:01.484156 4746 reflector.go:368] Caches populated for *v1.Service from k8s.io/client-go/informers/factory.go:160 Jan 28 20:40:01 crc kubenswrapper[4746]: I0128 20:40:01.484987 4746 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 28 20:40:01 crc kubenswrapper[4746]: I0128 20:40:01.485023 4746 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 28 20:40:01 crc kubenswrapper[4746]: I0128 20:40:01.485034 4746 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 28 20:40:01 crc kubenswrapper[4746]: I0128 20:40:01.485054 4746 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 28 20:40:01 crc kubenswrapper[4746]: I0128 20:40:01.485066 4746 setters.go:603] "Node became not ready" node="crc" 
condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-28T20:40:01Z","lastTransitionTime":"2026-01-28T20:40:01Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 28 20:40:01 crc kubenswrapper[4746]: I0128 20:40:01.587919 4746 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 28 20:40:01 crc kubenswrapper[4746]: I0128 20:40:01.587954 4746 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 28 20:40:01 crc kubenswrapper[4746]: I0128 20:40:01.587963 4746 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 28 20:40:01 crc kubenswrapper[4746]: I0128 20:40:01.587982 4746 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 28 20:40:01 crc kubenswrapper[4746]: I0128 20:40:01.587991 4746 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-28T20:40:01Z","lastTransitionTime":"2026-01-28T20:40:01Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 28 20:40:01 crc kubenswrapper[4746]: I0128 20:40:01.691464 4746 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 28 20:40:01 crc kubenswrapper[4746]: I0128 20:40:01.691503 4746 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 28 20:40:01 crc kubenswrapper[4746]: I0128 20:40:01.691522 4746 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 28 20:40:01 crc kubenswrapper[4746]: I0128 20:40:01.691544 4746 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 28 20:40:01 crc kubenswrapper[4746]: I0128 20:40:01.691557 4746 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-28T20:40:01Z","lastTransitionTime":"2026-01-28T20:40:01Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 28 20:40:01 crc kubenswrapper[4746]: I0128 20:40:01.795796 4746 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 28 20:40:01 crc kubenswrapper[4746]: I0128 20:40:01.796430 4746 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 28 20:40:01 crc kubenswrapper[4746]: I0128 20:40:01.796541 4746 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 28 20:40:01 crc kubenswrapper[4746]: I0128 20:40:01.796629 4746 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 28 20:40:01 crc kubenswrapper[4746]: I0128 20:40:01.796706 4746 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-28T20:40:01Z","lastTransitionTime":"2026-01-28T20:40:01Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 28 20:40:01 crc kubenswrapper[4746]: I0128 20:40:01.817778 4746 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2025-12-19 22:29:46.8440006 +0000 UTC Jan 28 20:40:01 crc kubenswrapper[4746]: I0128 20:40:01.835191 4746 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Jan 28 20:40:01 crc kubenswrapper[4746]: E0128 20:40:01.835315 4746 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Jan 28 20:40:01 crc kubenswrapper[4746]: I0128 20:40:01.899232 4746 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 28 20:40:01 crc kubenswrapper[4746]: I0128 20:40:01.899260 4746 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 28 20:40:01 crc kubenswrapper[4746]: I0128 20:40:01.899268 4746 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 28 20:40:01 crc kubenswrapper[4746]: I0128 20:40:01.899280 4746 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 28 20:40:01 crc kubenswrapper[4746]: I0128 20:40:01.899289 4746 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-28T20:40:01Z","lastTransitionTime":"2026-01-28T20:40:01Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 28 20:40:02 crc kubenswrapper[4746]: I0128 20:40:02.002093 4746 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 28 20:40:02 crc kubenswrapper[4746]: I0128 20:40:02.002124 4746 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 28 20:40:02 crc kubenswrapper[4746]: I0128 20:40:02.002133 4746 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 28 20:40:02 crc kubenswrapper[4746]: I0128 20:40:02.002146 4746 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 28 20:40:02 crc kubenswrapper[4746]: I0128 20:40:02.002154 4746 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-28T20:40:02Z","lastTransitionTime":"2026-01-28T20:40:02Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 28 20:40:02 crc kubenswrapper[4746]: I0128 20:40:02.105213 4746 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 28 20:40:02 crc kubenswrapper[4746]: I0128 20:40:02.105246 4746 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 28 20:40:02 crc kubenswrapper[4746]: I0128 20:40:02.105254 4746 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 28 20:40:02 crc kubenswrapper[4746]: I0128 20:40:02.105268 4746 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 28 20:40:02 crc kubenswrapper[4746]: I0128 20:40:02.105277 4746 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-28T20:40:02Z","lastTransitionTime":"2026-01-28T20:40:02Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 28 20:40:02 crc kubenswrapper[4746]: I0128 20:40:02.107748 4746 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-8vmvh_c4d15639-62fb-41b7-a1d4-6f51f3af6d99/ovnkube-controller/0.log" Jan 28 20:40:02 crc kubenswrapper[4746]: I0128 20:40:02.110170 4746 generic.go:334] "Generic (PLEG): container finished" podID="c4d15639-62fb-41b7-a1d4-6f51f3af6d99" containerID="d5e9d34221d49b3737f9b22c58b53644245a0384c7dce6402d4e84c84fef9aa4" exitCode=1 Jan 28 20:40:02 crc kubenswrapper[4746]: I0128 20:40:02.110228 4746 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-8vmvh" event={"ID":"c4d15639-62fb-41b7-a1d4-6f51f3af6d99","Type":"ContainerDied","Data":"d5e9d34221d49b3737f9b22c58b53644245a0384c7dce6402d4e84c84fef9aa4"} Jan 28 20:40:02 crc kubenswrapper[4746]: I0128 20:40:02.111150 4746 scope.go:117] "RemoveContainer" containerID="d5e9d34221d49b3737f9b22c58b53644245a0384c7dce6402d4e84c84fef9aa4" Jan 28 20:40:02 crc kubenswrapper[4746]: I0128 20:40:02.127300 4746 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-d8rwq" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"405572d8-50d2-4d7a-adf0-d8d6adea31ca\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-28T20:39:57Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-28T20:39:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-28T20:39:57Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-28T20:39:57Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://eba305a172c49420674e97d32590a7b2e4c7c2cd7406257508e0636cf42ea244\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-28T20:39:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mb4h4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phas
e\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-28T20:39:56Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-d8rwq\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-28T20:40:02Z is after 2025-08-24T17:21:41Z" Jan 28 20:40:02 crc kubenswrapper[4746]: I0128 20:40:02.140298 4746 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"950d71b3-9b51-43dc-814a-d6a838723f78\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-28T20:39:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-28T20:39:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-28T20:39:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-28T20:39:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-28T20:39:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0411ecf7c85f5397d751adb7d4905977dcbd3d2e169c56ebbe26017192765602\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee8
8051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-28T20:39:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://319cbcaa9981b3c79eb0c8f970f0ad776fb255db10c72c9b6a13469906e9e5ec\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-28T20:39:33Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://eefda7da35cd9ff176512b5d1224ccadcb5debd0035a49822e82a9757d4a3f91\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-28T20:39:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\
\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://145f4f882924f06172df81d94fc9bd683ea1fc04889de5f6cfb5caece8227fd0\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-28T20:39:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-28T20:39:32Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-28T20:40:02Z is after 2025-08-24T17:21:41Z" Jan 28 20:40:02 crc kubenswrapper[4746]: I0128 20:40:02.157068 4746 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-28T20:39:52Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-28T20:39:52Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-28T20:40:02Z is after 2025-08-24T17:21:41Z" Jan 28 20:40:02 crc kubenswrapper[4746]: I0128 20:40:02.175123 4746 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-28T20:39:52Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-28T20:39:52Z\\\",\\\"message\\\":\\\"containers with unready status: 
[network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-28T20:40:02Z is after 2025-08-24T17:21:41Z" Jan 28 20:40:02 crc kubenswrapper[4746]: I0128 20:40:02.194236 4746 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-28T20:39:53Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-28T20:39:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-28T20:39:53Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5dff3e2075b8bbfb6af77c72362ccc37e1d7810a2d97c096cf501681c98699ab\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-28T20:39:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://772e165efa6d865a669c172958e0aea3112146637e354e4587b7e664454112ad\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af
0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-28T20:39:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-28T20:40:02Z is after 2025-08-24T17:21:41Z" Jan 28 20:40:02 crc kubenswrapper[4746]: I0128 20:40:02.205524 4746 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-gcrxx" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"1c3c93a1-7ddf-4339-9ca3-79f3753943b4\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-28T20:39:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-28T20:39:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-28T20:39:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-28T20:39:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5bd8d8214aa8e85e50747643b6354fb6e026866c926310fb2dd79760f916f468\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-28T20:39:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wn6hr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podI
Ps\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-28T20:39:52Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-gcrxx\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-28T20:40:02Z is after 2025-08-24T17:21:41Z" Jan 28 20:40:02 crc kubenswrapper[4746]: I0128 20:40:02.206969 4746 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 28 20:40:02 crc kubenswrapper[4746]: I0128 20:40:02.207011 4746 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 28 20:40:02 crc kubenswrapper[4746]: I0128 20:40:02.207022 4746 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 28 20:40:02 crc kubenswrapper[4746]: I0128 20:40:02.207041 4746 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 28 20:40:02 crc kubenswrapper[4746]: I0128 20:40:02.207053 4746 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-28T20:40:02Z","lastTransitionTime":"2026-01-28T20:40:02Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 28 20:40:02 crc kubenswrapper[4746]: I0128 20:40:02.241530 4746 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-8vmvh" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"c4d15639-62fb-41b7-a1d4-6f51f3af6d99\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-28T20:39:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-28T20:39:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-28T20:39:53Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-28T20:39:53Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://2637734601bf1e43535b70bd62cc6aea7fc9f62d0d2a33c83602e7065a793be1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-28T20:39:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-kgrqs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://73e82be84afa7be760acda86ebf6c956d558dcbd4f37f049fba84e08c75fcd9f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-28T20:39:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\
\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-kgrqs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://17d09f7f46014da3713f76f111bca0a7ce8a24a154aa6144291d760d85e7d3ec\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-28T20:39:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-kgrqs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://b7ad895d0ff1ad570136d96671e1bde5c62882af50a407945ef0e7bc52727b88\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-28T20:39:54Z\\\"}},\\\"volumeMounts\\\":[{
\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-kgrqs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://af842ba33fdf06048038dd3e12c59f7a9d0867aa28acee1f2cc4571e7db28b88\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-28T20:39:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-kgrqs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://57a776a1daf1b17eb44d9fa09fe2f7ec01f647aa0a40d0d571736d2f7c2f8e4d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\"
,\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-28T20:39:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-kgrqs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://d5e9d34221d49b3737f9b22c58b53644245a0384c7dce6402d4e84c84fef9aa4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://d5e9d34221d49b3737f9b22c58b53644245a0384c7dce6402d4e84c84fef9aa4\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2026-01-28T20:40:01Z\\\",\\\"message\\\":\\\"enshift-dns/node-resolver-gcrxx\\\\nI0128 20:40:01.942633 6016 ovn.go:134] Ensuring zone local for Pod openshift-dns/node-resolver-gcrxx in node crc\\\\nI0128 20:40:01.942655 6016 obj_retry.go:386] Retry successful for *v1.Pod 
openshift-dns/node-resolver-gcrxx after 0 failed attempt(s)\\\\nI0128 20:40:01.942651 6016 model_client.go:382] Update operations generated as: [{Op:update Table:NAT Row:map[external_ip:192.168.126.11 logical_ip:10.217.0.59 options:{GoMap:map[stateless:false]} type:snat] Rows:[] Columns:[] Mutations:[] Timeout:\\\\u003cnil\\\\u003e Where:[where column _uuid == {dce28c51-c9f1-478b-97c8-7e209d6e7cbe}] Until: Durable:\\\\u003cnil\\\\u003e Comment:\\\\u003cnil\\\\u003e Lock:\\\\u003cnil\\\\u003e UUID: UUIDName:}]\\\\nI0128 20:40:01.942779 6016 model_client.go:398] Mutate operations generated as: [{Op:mutate Table:Logical_Router Row:map[] Rows:[] Columns:[] Mutations:[{Column:nat Mutator:insert Value:{GoSet:[{GoUUID:dce28c51-c9f1-478b-97c8-7e209d6e7cbe}]}}] Timeout:\\\\u003cnil\\\\u003e Where:[where column _uuid == {e3c4661a-36a6-47f0-a6c0-a4ee741f2224}] Until: Durable:\\\\u003cnil\\\\u003e Comment:\\\\u003cnil\\\\u003e Lock:\\\\u003cnil\\\\u003e UUID: UUIDName:}]\\\\nI0128 20:40:01.942677 6016 default_network_controller.go:776] Recording success event on pod openshift-dns/node-resolver-gcrxx\\\\nF0128 20:40:01.941342 6016 ovnkube.go:137] failed to run ovnkube: [failed to start network controller: failed to start default network controller: unable to 
create\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-01-28T20:39:59Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\
\\"kube-api-access-kgrqs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://cdd60109dba5f45418dd5bc29c649a50aae8479fa77fc3dc50ca4f7c3042e3fc\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-28T20:39:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-kgrqs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://45e59dfe364d511e17b8264a31157501de3ed23e7125d79979df83d328242328\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://45e59dfe364d511e17b8264a31157501de3ed23e7125d79979df83d328242
328\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-28T20:39:53Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-28T20:39:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-kgrqs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-28T20:39:53Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-8vmvh\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-28T20:40:02Z is after 2025-08-24T17:21:41Z" Jan 28 20:40:02 crc kubenswrapper[4746]: I0128 20:40:02.260863 4746 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-28T20:39:52Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-28T20:39:52Z\\\",\\\"message\\\":\\\"containers with unready status: 
[networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-28T20:40:02Z is after 2025-08-24T17:21:41Z" Jan 28 20:40:02 crc kubenswrapper[4746]: I0128 20:40:02.284896 4746 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-28T20:39:57Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-28T20:39:57Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-28T20:39:57Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://83a275ec9868f2fd9e24135d5a05c27caf843340c144070952291bb6a3715035\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-28T20:39:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has 
expired or is not yet valid: current time 2026-01-28T20:40:02Z is after 2025-08-24T17:21:41Z" Jan 28 20:40:02 crc kubenswrapper[4746]: I0128 20:40:02.310208 4746 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 28 20:40:02 crc kubenswrapper[4746]: I0128 20:40:02.310245 4746 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 28 20:40:02 crc kubenswrapper[4746]: I0128 20:40:02.310255 4746 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 28 20:40:02 crc kubenswrapper[4746]: I0128 20:40:02.310272 4746 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 28 20:40:02 crc kubenswrapper[4746]: I0128 20:40:02.310285 4746 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-28T20:40:02Z","lastTransitionTime":"2026-01-28T20:40:02Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 28 20:40:02 crc kubenswrapper[4746]: I0128 20:40:02.310483 4746 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-6wrnw" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"6dc8b546-9734-4082-b2b3-2bafe3f1564d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-28T20:39:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-28T20:39:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-28T20:39:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-28T20:39:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://50ec1ea913ec63a096032bd968bda5c5c8879e16a08e6a57727750517d9af038\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-28T20:39:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"}
,{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-z84f9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://66d250466155027405bf90c4e5ed09388238d4cee63604e28486a16778f9d188\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-28T20:39:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-z84f9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-28T20:39:53Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-6wrnw\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-28T20:40:02Z is after 2025-08-24T17:21:41Z" Jan 28 20:40:02 crc kubenswrapper[4746]: I0128 20:40:02.327992 4746 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-ht6hp" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"beb2b795-6bf4-4d38-89f7-bcb5512c3e61\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-28T20:39:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-28T20:40:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-28T20:40:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-28T20:40:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://564d42c4c25f218c1a1f52449d2f3d941c1dad920c8100d1d908cadf7cf46607\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-28T20:40:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rr62b\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b2d8bf207108ac2304cd04e34cd9843fcb755ac8610a3f88448538d1e99359f6\
\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b2d8bf207108ac2304cd04e34cd9843fcb755ac8610a3f88448538d1e99359f6\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-28T20:39:54Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-28T20:39:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rr62b\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://40c78d39cb70eba245771f4d85d2c8bc7b77427ea216ad3b89749d24fd550098\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://40c78d39cb70eba245771f4d85d2c8bc7b77427ea216ad3b89749d24fd550098\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-28T20:39:55Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-28T20:39:55Z\\\"}},\\\
"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rr62b\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://498028ca0bd260ca2c4d7b0231ea54a5a6ff5b302f720fbd209a4f640507258b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://498028ca0bd260ca2c4d7b0231ea54a5a6ff5b302f720fbd209a4f640507258b\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-28T20:39:56Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-28T20:39:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rr62b\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://661d4
a9031b6789f38ed1ac8e3c40d72182709130e1680ecdad86c07c1f34b7b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://661d4a9031b6789f38ed1ac8e3c40d72182709130e1680ecdad86c07c1f34b7b\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-28T20:39:57Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-28T20:39:57Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rr62b\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://a49ffe0bbe4151671dc8803d7ab61cb9b3a2eae064b81e2830bc074787f4754e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://a49ffe0bbe4151671dc8803d7ab61cb9b3a2eae064b81e2830bc074787f4754e\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-28T20:39:58Z\\\",\\\"reason\\\":\\\"Co
mpleted\\\",\\\"startedAt\\\":\\\"2026-01-28T20:39:58Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rr62b\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://cd521e9ecb53f581b5732a4d018f92007063ce0912e6d760fe3c920218c23b14\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://cd521e9ecb53f581b5732a4d018f92007063ce0912e6d760fe3c920218c23b14\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-28T20:39:59Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-28T20:39:59Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rr62b\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-28T20:39:53Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-ht6hp\": Internal error occurred: failed 
calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-28T20:40:02Z is after 2025-08-24T17:21:41Z" Jan 28 20:40:02 crc kubenswrapper[4746]: I0128 20:40:02.344394 4746 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-qhpvf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"cdf26de0-b602-4bdf-b492-65b3b6b31434\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-28T20:39:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-28T20:39:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-28T20:39:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-28T20:39:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f80345dfd4de46e278611370e6c337968cb1ba7ed8275457deea5871704f8367\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-
01-28T20:39:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jnpsl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-28T20:39:53Z\\\"}}\" for pod \"openshift-multus\"/\"multus-qhpvf\": Internal error occurred: failed 
calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-28T20:40:02Z is after 2025-08-24T17:21:41Z" Jan 28 20:40:02 crc kubenswrapper[4746]: I0128 20:40:02.360463 4746 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"64499f0d-5c10-43a9-b479-9b6c7ef2fb9c\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-28T20:39:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-28T20:39:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-28T20:39:32Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-28T20:39:32Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-28T20:39:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://9a304b12b4a1e30cbbbb16aa4f91514f1b1a64c716cf8ac023803a279b310f9c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-28T20:39:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://6af41f85c56be6c24127bec0aee8e02562dbb04bd22e57c6887f6f5e914a7530\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-28T20:39:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri
-o://f31e9e80576ec4eb59ec92039601fd94bd323d29493c353f9ec6b9c61187b0d3\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-28T20:39:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://56b250a192cfa698ccd7ac8233b75b9acbf2347549c712b47ccc1b89f144aa83\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://56b250a192cfa698ccd7ac8233b75b9acbf2347549c712b47ccc1b89f144aa83\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-01-28T20:39:52Z\\\",\\\"message\\\":\\\"le observer\\\\nW0128 20:39:52.695680 1 builder.go:272] unable to get owner reference (falling back to namespace): pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI0128 20:39:52.695807 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0128 20:39:52.696767 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" 
name=\\\\\\\"serving-cert::/tmp/serving-cert-2500116620/tls.crt::/tmp/serving-cert-2500116620/tls.key\\\\\\\"\\\\nI0128 20:39:52.900914 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI0128 20:39:52.915714 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI0128 20:39:52.915750 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI0128 20:39:52.916035 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI0128 20:39:52.916045 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI0128 20:39:52.925040 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW0128 20:39:52.925098 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0128 20:39:52.925104 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0128 20:39:52.925109 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW0128 20:39:52.925113 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW0128 20:39:52.925116 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW0128 20:39:52.925119 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nI0128 20:39:52.925416 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nF0128 20:39:52.926926 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-01-28T20:39:47Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 10s restarting failed 
container=kube-apiserver-check-endpoints pod=kube-apiserver-crc_openshift-kube-apiserver(f4b27818a5e8e43d0dc095d08835c792)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://03a94dee0b9663441e0ecec16b121c5be4e5c557f17c929b8edfb0d7ba00c3d5\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-28T20:39:35Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c5808a729190035a73529ac58efb0790f2d9c3b98c5f1a8017389e4ca1738c4c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://c5808a729190035a73529ac58efb0790f2d9c3b98c5f1a8017389e4ca1738c4c\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-28T20:39:33Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-28T20:39:33Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"n
ame\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-28T20:39:32Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-28T20:40:02Z is after 2025-08-24T17:21:41Z" Jan 28 20:40:02 crc kubenswrapper[4746]: I0128 20:40:02.376598 4746 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-28T20:39:54Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-28T20:39:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-28T20:39:54Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://2f64131d62eca6c3a42e9177d14fa39ed6435e536f60ef5d25ff6bf061d2cc0a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\
":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-28T20:39:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-28T20:40:02Z is after 2025-08-24T17:21:41Z" Jan 28 20:40:02 crc kubenswrapper[4746]: I0128 20:40:02.413808 4746 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 28 20:40:02 crc kubenswrapper[4746]: I0128 20:40:02.413843 4746 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 28 20:40:02 crc kubenswrapper[4746]: I0128 20:40:02.413853 4746 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 28 20:40:02 crc kubenswrapper[4746]: I0128 20:40:02.413869 4746 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 28 20:40:02 crc kubenswrapper[4746]: I0128 20:40:02.413880 4746 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-28T20:40:02Z","lastTransitionTime":"2026-01-28T20:40:02Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: 
no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 28 20:40:02 crc kubenswrapper[4746]: I0128 20:40:02.517747 4746 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 28 20:40:02 crc kubenswrapper[4746]: I0128 20:40:02.517798 4746 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 28 20:40:02 crc kubenswrapper[4746]: I0128 20:40:02.517810 4746 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 28 20:40:02 crc kubenswrapper[4746]: I0128 20:40:02.517830 4746 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 28 20:40:02 crc kubenswrapper[4746]: I0128 20:40:02.517843 4746 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-28T20:40:02Z","lastTransitionTime":"2026-01-28T20:40:02Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 28 20:40:02 crc kubenswrapper[4746]: I0128 20:40:02.621300 4746 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 28 20:40:02 crc kubenswrapper[4746]: I0128 20:40:02.621595 4746 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 28 20:40:02 crc kubenswrapper[4746]: I0128 20:40:02.621668 4746 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 28 20:40:02 crc kubenswrapper[4746]: I0128 20:40:02.621729 4746 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 28 20:40:02 crc kubenswrapper[4746]: I0128 20:40:02.621794 4746 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-28T20:40:02Z","lastTransitionTime":"2026-01-28T20:40:02Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 28 20:40:02 crc kubenswrapper[4746]: I0128 20:40:02.724269 4746 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 28 20:40:02 crc kubenswrapper[4746]: I0128 20:40:02.724321 4746 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 28 20:40:02 crc kubenswrapper[4746]: I0128 20:40:02.724335 4746 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 28 20:40:02 crc kubenswrapper[4746]: I0128 20:40:02.724357 4746 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 28 20:40:02 crc kubenswrapper[4746]: I0128 20:40:02.724372 4746 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-28T20:40:02Z","lastTransitionTime":"2026-01-28T20:40:02Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 28 20:40:02 crc kubenswrapper[4746]: I0128 20:40:02.819033 4746 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2026-01-04 08:54:37.831510849 +0000 UTC Jan 28 20:40:02 crc kubenswrapper[4746]: I0128 20:40:02.827606 4746 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 28 20:40:02 crc kubenswrapper[4746]: I0128 20:40:02.827654 4746 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 28 20:40:02 crc kubenswrapper[4746]: I0128 20:40:02.827666 4746 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 28 20:40:02 crc kubenswrapper[4746]: I0128 20:40:02.827685 4746 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 28 20:40:02 crc kubenswrapper[4746]: I0128 20:40:02.827697 4746 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-28T20:40:02Z","lastTransitionTime":"2026-01-28T20:40:02Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 28 20:40:02 crc kubenswrapper[4746]: I0128 20:40:02.835643 4746 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Jan 28 20:40:02 crc kubenswrapper[4746]: I0128 20:40:02.835778 4746 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Jan 28 20:40:02 crc kubenswrapper[4746]: E0128 20:40:02.835904 4746 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Jan 28 20:40:02 crc kubenswrapper[4746]: E0128 20:40:02.835959 4746 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Jan 28 20:40:02 crc kubenswrapper[4746]: I0128 20:40:02.859561 4746 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-28T20:39:52Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-28T20:39:52Z\\\",\\\"message\\\":\\\"containers with unready status: 
[check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-28T20:40:02Z is after 2025-08-24T17:21:41Z" Jan 28 20:40:02 crc kubenswrapper[4746]: I0128 20:40:02.886730 4746 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-28T20:39:52Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-28T20:39:52Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-28T20:40:02Z is after 2025-08-24T17:21:41Z" Jan 28 20:40:02 crc kubenswrapper[4746]: I0128 20:40:02.902575 4746 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-28T20:39:52Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-28T20:39:52Z\\\",\\\"message\\\":\\\"containers with unready status: 
[networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-28T20:40:02Z is after 2025-08-24T17:21:41Z" Jan 28 20:40:02 crc kubenswrapper[4746]: I0128 20:40:02.923758 4746 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-28T20:39:53Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-28T20:39:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-28T20:39:53Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5dff3e2075b8bbfb6af77c72362ccc37e1d7810a2d97c096cf501681c98699ab\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-28T20:39:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://772e165efa6d865a669c172958e0aea3112146637e354e4587b7e664454112ad\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af
0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-28T20:39:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-28T20:40:02Z is after 2025-08-24T17:21:41Z" Jan 28 20:40:02 crc kubenswrapper[4746]: I0128 20:40:02.930241 4746 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 28 20:40:02 crc kubenswrapper[4746]: I0128 20:40:02.930293 4746 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 28 20:40:02 crc kubenswrapper[4746]: I0128 20:40:02.930305 4746 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 28 20:40:02 crc kubenswrapper[4746]: I0128 20:40:02.930324 4746 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 28 20:40:02 crc kubenswrapper[4746]: I0128 20:40:02.930336 4746 setters.go:603] "Node became not ready" node="crc" 
condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-28T20:40:02Z","lastTransitionTime":"2026-01-28T20:40:02Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 28 20:40:02 crc kubenswrapper[4746]: I0128 20:40:02.936373 4746 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-gcrxx" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"1c3c93a1-7ddf-4339-9ca3-79f3753943b4\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-28T20:39:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-28T20:39:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-28T20:39:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-28T20:39:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5bd8d8214aa8e85e50747643b6354fb6e026866c926310fb2dd79760f916f468\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\
\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-28T20:39:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wn6hr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-28T20:39:52Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-gcrxx\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-28T20:40:02Z is after 2025-08-24T17:21:41Z" Jan 28 20:40:02 crc kubenswrapper[4746]: I0128 20:40:02.955887 4746 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-8vmvh" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"c4d15639-62fb-41b7-a1d4-6f51f3af6d99\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-28T20:39:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-28T20:39:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-28T20:39:53Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-28T20:39:53Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://2637734601bf1e43535b70bd62cc6aea7fc9f62d0d2a33c83602e7065a793be1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-28T20:39:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-kgrqs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://73e82be84afa7be760acda86ebf6c956d558dcbd4f37f049fba84e08c75fcd9f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-28T20:39:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\
\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-kgrqs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://17d09f7f46014da3713f76f111bca0a7ce8a24a154aa6144291d760d85e7d3ec\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-28T20:39:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-kgrqs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://b7ad895d0ff1ad570136d96671e1bde5c62882af50a407945ef0e7bc52727b88\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-28T20:39:54Z\\\"}},\\\"volumeMounts\\\":[{
\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-kgrqs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://af842ba33fdf06048038dd3e12c59f7a9d0867aa28acee1f2cc4571e7db28b88\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-28T20:39:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-kgrqs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://57a776a1daf1b17eb44d9fa09fe2f7ec01f647aa0a40d0d571736d2f7c2f8e4d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\"
,\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-28T20:39:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-kgrqs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://d5e9d34221d49b3737f9b22c58b53644245a0384c7dce6402d4e84c84fef9aa4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://d5e9d34221d49b3737f9b22c58b53644245a0384c7dce6402d4e84c84fef9aa4\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2026-01-28T20:40:01Z\\\",\\\"message\\\":\\\"enshift-dns/node-resolver-gcrxx\\\\nI0128 20:40:01.942633 6016 ovn.go:134] Ensuring zone local for Pod openshift-dns/node-resolver-gcrxx in node crc\\\\nI0128 20:40:01.942655 6016 obj_retry.go:386] Retry successful for *v1.Pod 
openshift-dns/node-resolver-gcrxx after 0 failed attempt(s)\\\\nI0128 20:40:01.942651 6016 model_client.go:382] Update operations generated as: [{Op:update Table:NAT Row:map[external_ip:192.168.126.11 logical_ip:10.217.0.59 options:{GoMap:map[stateless:false]} type:snat] Rows:[] Columns:[] Mutations:[] Timeout:\\\\u003cnil\\\\u003e Where:[where column _uuid == {dce28c51-c9f1-478b-97c8-7e209d6e7cbe}] Until: Durable:\\\\u003cnil\\\\u003e Comment:\\\\u003cnil\\\\u003e Lock:\\\\u003cnil\\\\u003e UUID: UUIDName:}]\\\\nI0128 20:40:01.942779 6016 model_client.go:398] Mutate operations generated as: [{Op:mutate Table:Logical_Router Row:map[] Rows:[] Columns:[] Mutations:[{Column:nat Mutator:insert Value:{GoSet:[{GoUUID:dce28c51-c9f1-478b-97c8-7e209d6e7cbe}]}}] Timeout:\\\\u003cnil\\\\u003e Where:[where column _uuid == {e3c4661a-36a6-47f0-a6c0-a4ee741f2224}] Until: Durable:\\\\u003cnil\\\\u003e Comment:\\\\u003cnil\\\\u003e Lock:\\\\u003cnil\\\\u003e UUID: UUIDName:}]\\\\nI0128 20:40:01.942677 6016 default_network_controller.go:776] Recording success event on pod openshift-dns/node-resolver-gcrxx\\\\nF0128 20:40:01.941342 6016 ovnkube.go:137] failed to run ovnkube: [failed to start network controller: failed to start default network controller: unable to 
create\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-01-28T20:39:59Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\
\\"kube-api-access-kgrqs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://cdd60109dba5f45418dd5bc29c649a50aae8479fa77fc3dc50ca4f7c3042e3fc\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-28T20:39:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-kgrqs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://45e59dfe364d511e17b8264a31157501de3ed23e7125d79979df83d328242328\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://45e59dfe364d511e17b8264a31157501de3ed23e7125d79979df83d328242
328\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-28T20:39:53Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-28T20:39:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-kgrqs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-28T20:39:53Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-8vmvh\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-28T20:40:02Z is after 2025-08-24T17:21:41Z" Jan 28 20:40:02 crc kubenswrapper[4746]: I0128 20:40:02.975307 4746 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"64499f0d-5c10-43a9-b479-9b6c7ef2fb9c\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-28T20:39:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-28T20:39:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-28T20:39:32Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-28T20:39:32Z\\\",\\\"message\\\":\\\"containers 
with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-28T20:39:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://9a304b12b4a1e30cbbbb16aa4f91514f1b1a64c716cf8ac023803a279b310f9c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-28T20:39:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://6af41f85c56be6c24127bec0aee8e02562dbb04bd22e57c6887f6f5e914a7530\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-28T20:39:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"co
ntainerID\\\":\\\"cri-o://f31e9e80576ec4eb59ec92039601fd94bd323d29493c353f9ec6b9c61187b0d3\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-28T20:39:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://56b250a192cfa698ccd7ac8233b75b9acbf2347549c712b47ccc1b89f144aa83\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://56b250a192cfa698ccd7ac8233b75b9acbf2347549c712b47ccc1b89f144aa83\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-01-28T20:39:52Z\\\",\\\"message\\\":\\\"le observer\\\\nW0128 20:39:52.695680 1 builder.go:272] unable to get owner reference (falling back to namespace): pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI0128 20:39:52.695807 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0128 20:39:52.696767 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" 
name=\\\\\\\"serving-cert::/tmp/serving-cert-2500116620/tls.crt::/tmp/serving-cert-2500116620/tls.key\\\\\\\"\\\\nI0128 20:39:52.900914 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI0128 20:39:52.915714 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI0128 20:39:52.915750 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI0128 20:39:52.916035 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI0128 20:39:52.916045 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI0128 20:39:52.925040 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW0128 20:39:52.925098 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0128 20:39:52.925104 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0128 20:39:52.925109 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW0128 20:39:52.925113 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW0128 20:39:52.925116 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW0128 20:39:52.925119 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nI0128 20:39:52.925416 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nF0128 20:39:52.926926 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-01-28T20:39:47Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 10s restarting failed 
container=kube-apiserver-check-endpoints pod=kube-apiserver-crc_openshift-kube-apiserver(f4b27818a5e8e43d0dc095d08835c792)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://03a94dee0b9663441e0ecec16b121c5be4e5c557f17c929b8edfb0d7ba00c3d5\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-28T20:39:35Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c5808a729190035a73529ac58efb0790f2d9c3b98c5f1a8017389e4ca1738c4c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://c5808a729190035a73529ac58efb0790f2d9c3b98c5f1a8017389e4ca1738c4c\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-28T20:39:33Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-28T20:39:33Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"n
ame\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-28T20:39:32Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-28T20:40:02Z is after 2025-08-24T17:21:41Z" Jan 28 20:40:02 crc kubenswrapper[4746]: I0128 20:40:02.991337 4746 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-28T20:39:54Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-28T20:39:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-28T20:39:54Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://2f64131d62eca6c3a42e9177d14fa39ed6435e536f60ef5d25ff6bf061d2cc0a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\
":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-28T20:39:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-28T20:40:02Z is after 2025-08-24T17:21:41Z" Jan 28 20:40:03 crc kubenswrapper[4746]: I0128 20:40:03.004942 4746 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-28T20:39:57Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-28T20:39:57Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-28T20:39:57Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://83a275ec9868f2fd9e24135d5a05c27caf843340c144070952291bb6a3715035\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-28T20:39:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has 
expired or is not yet valid: current time 2026-01-28T20:40:03Z is after 2025-08-24T17:21:41Z" Jan 28 20:40:03 crc kubenswrapper[4746]: I0128 20:40:03.018845 4746 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-6wrnw" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"6dc8b546-9734-4082-b2b3-2bafe3f1564d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-28T20:39:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-28T20:39:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-28T20:39:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-28T20:39:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://50ec1ea913ec63a096032bd968bda5c5c8879e16a08e6a57727750517d9af038\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-28T20:39:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-r
bac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-z84f9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://66d250466155027405bf90c4e5ed09388238d4cee63604e28486a16778f9d188\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-28T20:39:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-z84f9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-28T20:39:53Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-6wrnw\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-28T20:40:03Z is after 2025-08-24T17:21:41Z" Jan 28 20:40:03 crc kubenswrapper[4746]: I0128 20:40:03.033516 4746 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 28 20:40:03 crc kubenswrapper[4746]: I0128 
20:40:03.033589 4746 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 28 20:40:03 crc kubenswrapper[4746]: I0128 20:40:03.033607 4746 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 28 20:40:03 crc kubenswrapper[4746]: I0128 20:40:03.033630 4746 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 28 20:40:03 crc kubenswrapper[4746]: I0128 20:40:03.033643 4746 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-28T20:40:03Z","lastTransitionTime":"2026-01-28T20:40:03Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 28 20:40:03 crc kubenswrapper[4746]: I0128 20:40:03.038058 4746 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-ht6hp" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"beb2b795-6bf4-4d38-89f7-bcb5512c3e61\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-28T20:39:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-28T20:40:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-28T20:40:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-28T20:40:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://564d42c4c25f218c1a1f52449d2f3d941c1dad920c8100d1d908cadf7cf46607\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-28T20:40:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rr62b\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b2d8bf207108ac2304cd04e34cd9843fcb755ac8610a3f88448538d1e99359f6\
\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b2d8bf207108ac2304cd04e34cd9843fcb755ac8610a3f88448538d1e99359f6\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-28T20:39:54Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-28T20:39:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rr62b\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://40c78d39cb70eba245771f4d85d2c8bc7b77427ea216ad3b89749d24fd550098\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://40c78d39cb70eba245771f4d85d2c8bc7b77427ea216ad3b89749d24fd550098\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-28T20:39:55Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-28T20:39:55Z\\\"}},\\\
"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rr62b\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://498028ca0bd260ca2c4d7b0231ea54a5a6ff5b302f720fbd209a4f640507258b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://498028ca0bd260ca2c4d7b0231ea54a5a6ff5b302f720fbd209a4f640507258b\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-28T20:39:56Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-28T20:39:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rr62b\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://661d4
a9031b6789f38ed1ac8e3c40d72182709130e1680ecdad86c07c1f34b7b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://661d4a9031b6789f38ed1ac8e3c40d72182709130e1680ecdad86c07c1f34b7b\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-28T20:39:57Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-28T20:39:57Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rr62b\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://a49ffe0bbe4151671dc8803d7ab61cb9b3a2eae064b81e2830bc074787f4754e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://a49ffe0bbe4151671dc8803d7ab61cb9b3a2eae064b81e2830bc074787f4754e\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-28T20:39:58Z\\\",\\\"reason\\\":\\\"Co
mpleted\\\",\\\"startedAt\\\":\\\"2026-01-28T20:39:58Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rr62b\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://cd521e9ecb53f581b5732a4d018f92007063ce0912e6d760fe3c920218c23b14\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://cd521e9ecb53f581b5732a4d018f92007063ce0912e6d760fe3c920218c23b14\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-28T20:39:59Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-28T20:39:59Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rr62b\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-28T20:39:53Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-ht6hp\": Internal error occurred: failed 
calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-28T20:40:03Z is after 2025-08-24T17:21:41Z" Jan 28 20:40:03 crc kubenswrapper[4746]: I0128 20:40:03.054549 4746 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-qhpvf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"cdf26de0-b602-4bdf-b492-65b3b6b31434\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-28T20:39:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-28T20:39:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-28T20:39:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-28T20:39:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f80345dfd4de46e278611370e6c337968cb1ba7ed8275457deea5871704f8367\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-
01-28T20:39:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jnpsl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-28T20:39:53Z\\\"}}\" for pod \"openshift-multus\"/\"multus-qhpvf\": Internal error occurred: failed 
calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-28T20:40:03Z is after 2025-08-24T17:21:41Z" Jan 28 20:40:03 crc kubenswrapper[4746]: I0128 20:40:03.074121 4746 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"950d71b3-9b51-43dc-814a-d6a838723f78\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-28T20:39:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-28T20:39:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-28T20:39:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-28T20:39:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-28T20:39:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0411ecf7c85f5397d751adb7d4905977dcbd3d2e169c56ebbe26017192765602\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-28T20:39:34Z\\\"}},\\\"volumeM
ounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://319cbcaa9981b3c79eb0c8f970f0ad776fb255db10c72c9b6a13469906e9e5ec\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-28T20:39:33Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://eefda7da35cd9ff176512b5d1224ccadcb5debd0035a49822e82a9757d4a3f91\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-28T20:39:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://145f4f882924f06172df81d94fc9bd683ea1fc04889de5f6cfb5caece8227fd0\\\",\\\"image\\\":\\\
"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-28T20:39:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-28T20:39:32Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-28T20:40:03Z is after 2025-08-24T17:21:41Z" Jan 28 20:40:03 crc kubenswrapper[4746]: I0128 20:40:03.085356 4746 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-d8rwq" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"405572d8-50d2-4d7a-adf0-d8d6adea31ca\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-28T20:39:57Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-28T20:39:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-28T20:39:57Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-28T20:39:57Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://eba305a172c49420674e97d32590a7b2e4c7c2cd7406257508e0636cf42ea244\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-28T20:39:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mb4h4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phas
e\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-28T20:39:56Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-d8rwq\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-28T20:40:03Z is after 2025-08-24T17:21:41Z" Jan 28 20:40:03 crc kubenswrapper[4746]: I0128 20:40:03.115702 4746 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-8vmvh_c4d15639-62fb-41b7-a1d4-6f51f3af6d99/ovnkube-controller/0.log" Jan 28 20:40:03 crc kubenswrapper[4746]: I0128 20:40:03.119187 4746 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-8vmvh" event={"ID":"c4d15639-62fb-41b7-a1d4-6f51f3af6d99","Type":"ContainerStarted","Data":"b8dbe081b6eaa82770425f55f41842f6e94b364cccd11dfd0a2c67d00e0863e9"} Jan 28 20:40:03 crc kubenswrapper[4746]: I0128 20:40:03.119299 4746 prober_manager.go:312] "Failed to trigger a manual run" probe="Readiness" Jan 28 20:40:03 crc kubenswrapper[4746]: I0128 20:40:03.135231 4746 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"950d71b3-9b51-43dc-814a-d6a838723f78\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-28T20:39:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-28T20:39:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-28T20:39:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-28T20:39:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-28T20:39:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0411ecf7c85f5397d751adb7d4905977dcbd3d2e169c56ebbe26017192765602\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-28T20:39:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://319cbcaa9981b3c79eb0c8f970f0ad776fb255db10c72c9b6a13469906e9e5ec\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-
art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-28T20:39:33Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://eefda7da35cd9ff176512b5d1224ccadcb5debd0035a49822e82a9757d4a3f91\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-28T20:39:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://145f4f882924f06172df81d94fc9bd683ea1fc04889de5f6cfb5caece8227fd0\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"started
At\\\":\\\"2026-01-28T20:39:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-28T20:39:32Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-28T20:40:03Z is after 2025-08-24T17:21:41Z" Jan 28 20:40:03 crc kubenswrapper[4746]: I0128 20:40:03.136624 4746 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 28 20:40:03 crc kubenswrapper[4746]: I0128 20:40:03.136701 4746 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 28 20:40:03 crc kubenswrapper[4746]: I0128 20:40:03.136752 4746 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 28 20:40:03 crc kubenswrapper[4746]: I0128 20:40:03.136797 4746 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 28 20:40:03 crc kubenswrapper[4746]: I0128 20:40:03.136856 4746 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-28T20:40:03Z","lastTransitionTime":"2026-01-28T20:40:03Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns 
error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 28 20:40:03 crc kubenswrapper[4746]: I0128 20:40:03.150794 4746 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-d8rwq" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"405572d8-50d2-4d7a-adf0-d8d6adea31ca\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-28T20:39:57Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-28T20:39:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-28T20:39:57Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-28T20:39:57Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://eba305a172c49420674e97d32590a7b2e4c7c2cd7406257508e0636cf42ea244\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-28T20:39:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":
\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mb4h4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-28T20:39:56Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-d8rwq\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-28T20:40:03Z is after 2025-08-24T17:21:41Z" Jan 28 20:40:03 crc kubenswrapper[4746]: I0128 20:40:03.166391 4746 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-28T20:39:52Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-28T20:39:52Z\\\",\\\"message\\\":\\\"containers with unready status: 
[check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-28T20:40:03Z is after 2025-08-24T17:21:41Z" Jan 28 20:40:03 crc kubenswrapper[4746]: I0128 20:40:03.181461 4746 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-28T20:39:52Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-28T20:39:52Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-28T20:40:03Z is after 2025-08-24T17:21:41Z" Jan 28 20:40:03 crc kubenswrapper[4746]: I0128 20:40:03.200977 4746 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-8vmvh" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"c4d15639-62fb-41b7-a1d4-6f51f3af6d99\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-28T20:39:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-28T20:39:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-28T20:39:53Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-28T20:39:53Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://2637734601bf1e43535b70bd62cc6aea7fc9f62d0d2a33c83602e7065a793be1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-28T20:39:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-kgrqs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://73e82be84afa7be760acda86ebf6c956d558dcbd4f37f049fba84e08c75fcd9f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\
\"running\\\":{\\\"startedAt\\\":\\\"2026-01-28T20:39:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-kgrqs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://17d09f7f46014da3713f76f111bca0a7ce8a24a154aa6144291d760d85e7d3ec\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-28T20:39:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-kgrqs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://b7ad895d0ff1ad570136d96671e1bde5c62882af50a407945ef0e7bc52727b88\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d20994829
19d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-28T20:39:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-kgrqs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://af842ba33fdf06048038dd3e12c59f7a9d0867aa28acee1f2cc4571e7db28b88\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-28T20:39:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-kgrqs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://57a776a1daf1b17eb44d9fa09fe2f7ec01f647aa0a40d0d571736d2f7c2f8e4d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cd
d47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-28T20:39:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-kgrqs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://b8dbe081b6eaa82770425f55f41842f6e94b364cccd11dfd0a2c67d00e0863e9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://d5e9d34221d49b3737f9b22c58b53644245a0384c7dce6402d4e84c84fef9aa4\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2026-01-28T20:40:01Z\\\",\\\"message\\\":\\\"enshift-dns/node-resolver-gcrxx\\\\nI0128 20:40:01.942633 6016 ovn.go:134] Ensuring zone 
local for Pod openshift-dns/node-resolver-gcrxx in node crc\\\\nI0128 20:40:01.942655 6016 obj_retry.go:386] Retry successful for *v1.Pod openshift-dns/node-resolver-gcrxx after 0 failed attempt(s)\\\\nI0128 20:40:01.942651 6016 model_client.go:382] Update operations generated as: [{Op:update Table:NAT Row:map[external_ip:192.168.126.11 logical_ip:10.217.0.59 options:{GoMap:map[stateless:false]} type:snat] Rows:[] Columns:[] Mutations:[] Timeout:\\\\u003cnil\\\\u003e Where:[where column _uuid == {dce28c51-c9f1-478b-97c8-7e209d6e7cbe}] Until: Durable:\\\\u003cnil\\\\u003e Comment:\\\\u003cnil\\\\u003e Lock:\\\\u003cnil\\\\u003e UUID: UUIDName:}]\\\\nI0128 20:40:01.942779 6016 model_client.go:398] Mutate operations generated as: [{Op:mutate Table:Logical_Router Row:map[] Rows:[] Columns:[] Mutations:[{Column:nat Mutator:insert Value:{GoSet:[{GoUUID:dce28c51-c9f1-478b-97c8-7e209d6e7cbe}]}}] Timeout:\\\\u003cnil\\\\u003e Where:[where column _uuid == {e3c4661a-36a6-47f0-a6c0-a4ee741f2224}] Until: Durable:\\\\u003cnil\\\\u003e Comment:\\\\u003cnil\\\\u003e Lock:\\\\u003cnil\\\\u003e UUID: UUIDName:}]\\\\nI0128 20:40:01.942677 6016 default_network_controller.go:776] Recording success event on pod openshift-dns/node-resolver-gcrxx\\\\nF0128 20:40:01.941342 6016 ovnkube.go:137] failed to run ovnkube: [failed to start network controller: failed to start default network controller: unable to 
create\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-01-28T20:39:59Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-28T20:40:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"nam
e\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-kgrqs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://cdd60109dba5f45418dd5bc29c649a50aae8479fa77fc3dc50ca4f7c3042e3fc\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-28T20:39:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-kgrqs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://45e59dfe364d511e17b8264a31157501de3ed23e7125d79979df83d328242328\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"rea
dy\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://45e59dfe364d511e17b8264a31157501de3ed23e7125d79979df83d328242328\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-28T20:39:53Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-28T20:39:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-kgrqs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-28T20:39:53Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-8vmvh\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-28T20:40:03Z is after 2025-08-24T17:21:41Z" Jan 28 20:40:03 crc kubenswrapper[4746]: I0128 20:40:03.213955 4746 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-28T20:39:52Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: 
[networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-28T20:39:52Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-28T20:40:03Z is after 2025-08-24T17:21:41Z" Jan 28 20:40:03 crc kubenswrapper[4746]: I0128 20:40:03.226284 4746 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-28T20:39:53Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-28T20:39:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-28T20:39:53Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5dff3e2075b8bbfb6af77c72362ccc37e1d7810a2d97c096cf501681c98699ab\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-28T20:39:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://772e165efa6d865a669c172958e0aea3112146637e354e4587b7e664454112ad\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af
0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-28T20:39:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-28T20:40:03Z is after 2025-08-24T17:21:41Z" Jan 28 20:40:03 crc kubenswrapper[4746]: I0128 20:40:03.236687 4746 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-gcrxx" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"1c3c93a1-7ddf-4339-9ca3-79f3753943b4\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-28T20:39:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-28T20:39:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-28T20:39:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-28T20:39:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5bd8d8214aa8e85e50747643b6354fb6e026866c926310fb2dd79760f916f468\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-28T20:39:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wn6hr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podI
Ps\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-28T20:39:52Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-gcrxx\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-28T20:40:03Z is after 2025-08-24T17:21:41Z" Jan 28 20:40:03 crc kubenswrapper[4746]: I0128 20:40:03.239457 4746 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 28 20:40:03 crc kubenswrapper[4746]: I0128 20:40:03.239485 4746 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 28 20:40:03 crc kubenswrapper[4746]: I0128 20:40:03.239496 4746 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 28 20:40:03 crc kubenswrapper[4746]: I0128 20:40:03.239519 4746 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 28 20:40:03 crc kubenswrapper[4746]: I0128 20:40:03.239532 4746 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-28T20:40:03Z","lastTransitionTime":"2026-01-28T20:40:03Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 28 20:40:03 crc kubenswrapper[4746]: I0128 20:40:03.258136 4746 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-ht6hp" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"beb2b795-6bf4-4d38-89f7-bcb5512c3e61\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-28T20:39:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-28T20:40:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-28T20:40:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-28T20:40:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://564d42c4c25f218c1a1f52449d2f3d941c1dad920c8100d1d908cadf7cf46607\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-28T20:40:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rr62b\\\",\\\"readOnly\\\":true,\\\"recursiveReadOn
ly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b2d8bf207108ac2304cd04e34cd9843fcb755ac8610a3f88448538d1e99359f6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b2d8bf207108ac2304cd04e34cd9843fcb755ac8610a3f88448538d1e99359f6\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-28T20:39:54Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-28T20:39:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rr62b\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://40c78d39cb70eba245771f4d85d2c8bc7b77427ea216ad3b89749d24fd550098\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"
containerID\\\":\\\"cri-o://40c78d39cb70eba245771f4d85d2c8bc7b77427ea216ad3b89749d24fd550098\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-28T20:39:55Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-28T20:39:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rr62b\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://498028ca0bd260ca2c4d7b0231ea54a5a6ff5b302f720fbd209a4f640507258b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://498028ca0bd260ca2c4d7b0231ea54a5a6ff5b302f720fbd209a4f640507258b\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-28T20:39:56Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-28T20:39:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveRead
Only\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rr62b\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://661d4a9031b6789f38ed1ac8e3c40d72182709130e1680ecdad86c07c1f34b7b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://661d4a9031b6789f38ed1ac8e3c40d72182709130e1680ecdad86c07c1f34b7b\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-28T20:39:57Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-28T20:39:57Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rr62b\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://a49ffe0bbe4151671dc8803d7ab61cb9b3a2eae064b81e2830bc074787f4754e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\"
:0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://a49ffe0bbe4151671dc8803d7ab61cb9b3a2eae064b81e2830bc074787f4754e\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-28T20:39:58Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-28T20:39:58Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rr62b\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://cd521e9ecb53f581b5732a4d018f92007063ce0912e6d760fe3c920218c23b14\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://cd521e9ecb53f581b5732a4d018f92007063ce0912e6d760fe3c920218c23b14\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-28T20:39:59Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-28T20:39:59Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rr62b\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\"
,\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-28T20:39:53Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-ht6hp\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-28T20:40:03Z is after 2025-08-24T17:21:41Z" Jan 28 20:40:03 crc kubenswrapper[4746]: I0128 20:40:03.273506 4746 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-qhpvf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"cdf26de0-b602-4bdf-b492-65b3b6b31434\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-28T20:39:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-28T20:39:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-28T20:39:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-28T20:39:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f80345dfd4de46e278611370e6c337968cb1ba7ed8275457deea5871704f8367\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7
eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-28T20:39:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jnpsl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\"
:\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-28T20:39:53Z\\\"}}\" for pod \"openshift-multus\"/\"multus-qhpvf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-28T20:40:03Z is after 2025-08-24T17:21:41Z" Jan 28 20:40:03 crc kubenswrapper[4746]: I0128 20:40:03.294340 4746 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"64499f0d-5c10-43a9-b479-9b6c7ef2fb9c\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-28T20:39:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-28T20:39:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-28T20:39:32Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-28T20:39:32Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-28T20:39:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://9a304b12b4a1e30cbbbb16aa4f91514f1b1a64c716cf8ac023803a279b310f9c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-28T20:39:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://6af41f85c56be6c24127bec0aee8e02562dbb04bd22e57c6887f6f5e914a7530\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-28T20:39:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri
-o://f31e9e80576ec4eb59ec92039601fd94bd323d29493c353f9ec6b9c61187b0d3\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-28T20:39:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://56b250a192cfa698ccd7ac8233b75b9acbf2347549c712b47ccc1b89f144aa83\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://56b250a192cfa698ccd7ac8233b75b9acbf2347549c712b47ccc1b89f144aa83\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-01-28T20:39:52Z\\\",\\\"message\\\":\\\"le observer\\\\nW0128 20:39:52.695680 1 builder.go:272] unable to get owner reference (falling back to namespace): pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI0128 20:39:52.695807 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0128 20:39:52.696767 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" 
name=\\\\\\\"serving-cert::/tmp/serving-cert-2500116620/tls.crt::/tmp/serving-cert-2500116620/tls.key\\\\\\\"\\\\nI0128 20:39:52.900914 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI0128 20:39:52.915714 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI0128 20:39:52.915750 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI0128 20:39:52.916035 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI0128 20:39:52.916045 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI0128 20:39:52.925040 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW0128 20:39:52.925098 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0128 20:39:52.925104 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0128 20:39:52.925109 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW0128 20:39:52.925113 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW0128 20:39:52.925116 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW0128 20:39:52.925119 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nI0128 20:39:52.925416 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nF0128 20:39:52.926926 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-01-28T20:39:47Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 10s restarting failed 
container=kube-apiserver-check-endpoints pod=kube-apiserver-crc_openshift-kube-apiserver(f4b27818a5e8e43d0dc095d08835c792)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://03a94dee0b9663441e0ecec16b121c5be4e5c557f17c929b8edfb0d7ba00c3d5\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-28T20:39:35Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c5808a729190035a73529ac58efb0790f2d9c3b98c5f1a8017389e4ca1738c4c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://c5808a729190035a73529ac58efb0790f2d9c3b98c5f1a8017389e4ca1738c4c\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-28T20:39:33Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-28T20:39:33Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"n
ame\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-28T20:39:32Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-28T20:40:03Z is after 2025-08-24T17:21:41Z" Jan 28 20:40:03 crc kubenswrapper[4746]: I0128 20:40:03.309492 4746 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-28T20:39:54Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-28T20:39:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-28T20:39:54Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://2f64131d62eca6c3a42e9177d14fa39ed6435e536f60ef5d25ff6bf061d2cc0a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\
":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-28T20:39:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-28T20:40:03Z is after 2025-08-24T17:21:41Z" Jan 28 20:40:03 crc kubenswrapper[4746]: I0128 20:40:03.321510 4746 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-28T20:39:57Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-28T20:39:57Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-28T20:39:57Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://83a275ec9868f2fd9e24135d5a05c27caf843340c144070952291bb6a3715035\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-28T20:39:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has 
expired or is not yet valid: current time 2026-01-28T20:40:03Z is after 2025-08-24T17:21:41Z" Jan 28 20:40:03 crc kubenswrapper[4746]: I0128 20:40:03.334455 4746 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-6wrnw" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"6dc8b546-9734-4082-b2b3-2bafe3f1564d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-28T20:39:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-28T20:39:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-28T20:39:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-28T20:39:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://50ec1ea913ec63a096032bd968bda5c5c8879e16a08e6a57727750517d9af038\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-28T20:39:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-r
bac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-z84f9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://66d250466155027405bf90c4e5ed09388238d4cee63604e28486a16778f9d188\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-28T20:39:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-z84f9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-28T20:39:53Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-6wrnw\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-28T20:40:03Z is after 2025-08-24T17:21:41Z" Jan 28 20:40:03 crc kubenswrapper[4746]: I0128 20:40:03.343202 4746 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 28 20:40:03 crc kubenswrapper[4746]: I0128 
20:40:03.343242 4746 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 28 20:40:03 crc kubenswrapper[4746]: I0128 20:40:03.343251 4746 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 28 20:40:03 crc kubenswrapper[4746]: I0128 20:40:03.343269 4746 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 28 20:40:03 crc kubenswrapper[4746]: I0128 20:40:03.343281 4746 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-28T20:40:03Z","lastTransitionTime":"2026-01-28T20:40:03Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 28 20:40:03 crc kubenswrapper[4746]: I0128 20:40:03.446056 4746 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 28 20:40:03 crc kubenswrapper[4746]: I0128 20:40:03.446109 4746 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 28 20:40:03 crc kubenswrapper[4746]: I0128 20:40:03.446117 4746 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 28 20:40:03 crc kubenswrapper[4746]: I0128 20:40:03.446132 4746 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 28 20:40:03 crc kubenswrapper[4746]: I0128 20:40:03.446141 4746 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-28T20:40:03Z","lastTransitionTime":"2026-01-28T20:40:03Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false 
reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 28 20:40:03 crc kubenswrapper[4746]: I0128 20:40:03.548724 4746 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 28 20:40:03 crc kubenswrapper[4746]: I0128 20:40:03.548770 4746 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 28 20:40:03 crc kubenswrapper[4746]: I0128 20:40:03.548779 4746 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 28 20:40:03 crc kubenswrapper[4746]: I0128 20:40:03.548795 4746 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 28 20:40:03 crc kubenswrapper[4746]: I0128 20:40:03.548804 4746 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-28T20:40:03Z","lastTransitionTime":"2026-01-28T20:40:03Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 28 20:40:03 crc kubenswrapper[4746]: I0128 20:40:03.652262 4746 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 28 20:40:03 crc kubenswrapper[4746]: I0128 20:40:03.652312 4746 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 28 20:40:03 crc kubenswrapper[4746]: I0128 20:40:03.652323 4746 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 28 20:40:03 crc kubenswrapper[4746]: I0128 20:40:03.652345 4746 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 28 20:40:03 crc kubenswrapper[4746]: I0128 20:40:03.652355 4746 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-28T20:40:03Z","lastTransitionTime":"2026-01-28T20:40:03Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 28 20:40:03 crc kubenswrapper[4746]: I0128 20:40:03.755390 4746 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 28 20:40:03 crc kubenswrapper[4746]: I0128 20:40:03.755449 4746 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 28 20:40:03 crc kubenswrapper[4746]: I0128 20:40:03.755465 4746 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 28 20:40:03 crc kubenswrapper[4746]: I0128 20:40:03.755487 4746 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 28 20:40:03 crc kubenswrapper[4746]: I0128 20:40:03.755499 4746 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-28T20:40:03Z","lastTransitionTime":"2026-01-28T20:40:03Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 28 20:40:03 crc kubenswrapper[4746]: I0128 20:40:03.819628 4746 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2025-11-15 06:41:49.467572818 +0000 UTC Jan 28 20:40:03 crc kubenswrapper[4746]: I0128 20:40:03.835130 4746 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Jan 28 20:40:03 crc kubenswrapper[4746]: E0128 20:40:03.835386 4746 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Jan 28 20:40:03 crc kubenswrapper[4746]: I0128 20:40:03.858720 4746 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 28 20:40:03 crc kubenswrapper[4746]: I0128 20:40:03.858760 4746 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 28 20:40:03 crc kubenswrapper[4746]: I0128 20:40:03.858771 4746 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 28 20:40:03 crc kubenswrapper[4746]: I0128 20:40:03.858787 4746 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 28 20:40:03 crc kubenswrapper[4746]: I0128 20:40:03.858802 4746 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-28T20:40:03Z","lastTransitionTime":"2026-01-28T20:40:03Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 28 20:40:03 crc kubenswrapper[4746]: I0128 20:40:03.961614 4746 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 28 20:40:03 crc kubenswrapper[4746]: I0128 20:40:03.961679 4746 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 28 20:40:03 crc kubenswrapper[4746]: I0128 20:40:03.961690 4746 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 28 20:40:03 crc kubenswrapper[4746]: I0128 20:40:03.961706 4746 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 28 20:40:03 crc kubenswrapper[4746]: I0128 20:40:03.961717 4746 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-28T20:40:03Z","lastTransitionTime":"2026-01-28T20:40:03Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 28 20:40:04 crc kubenswrapper[4746]: I0128 20:40:04.064874 4746 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 28 20:40:04 crc kubenswrapper[4746]: I0128 20:40:04.066000 4746 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 28 20:40:04 crc kubenswrapper[4746]: I0128 20:40:04.066099 4746 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 28 20:40:04 crc kubenswrapper[4746]: I0128 20:40:04.066125 4746 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 28 20:40:04 crc kubenswrapper[4746]: I0128 20:40:04.066140 4746 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-28T20:40:04Z","lastTransitionTime":"2026-01-28T20:40:04Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 28 20:40:04 crc kubenswrapper[4746]: I0128 20:40:04.126163 4746 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-8vmvh_c4d15639-62fb-41b7-a1d4-6f51f3af6d99/ovnkube-controller/1.log" Jan 28 20:40:04 crc kubenswrapper[4746]: I0128 20:40:04.127235 4746 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-8vmvh_c4d15639-62fb-41b7-a1d4-6f51f3af6d99/ovnkube-controller/0.log" Jan 28 20:40:04 crc kubenswrapper[4746]: I0128 20:40:04.131106 4746 generic.go:334] "Generic (PLEG): container finished" podID="c4d15639-62fb-41b7-a1d4-6f51f3af6d99" containerID="b8dbe081b6eaa82770425f55f41842f6e94b364cccd11dfd0a2c67d00e0863e9" exitCode=1 Jan 28 20:40:04 crc kubenswrapper[4746]: I0128 20:40:04.131162 4746 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-8vmvh" event={"ID":"c4d15639-62fb-41b7-a1d4-6f51f3af6d99","Type":"ContainerDied","Data":"b8dbe081b6eaa82770425f55f41842f6e94b364cccd11dfd0a2c67d00e0863e9"} Jan 28 20:40:04 crc kubenswrapper[4746]: I0128 20:40:04.131221 4746 scope.go:117] "RemoveContainer" containerID="d5e9d34221d49b3737f9b22c58b53644245a0384c7dce6402d4e84c84fef9aa4" Jan 28 20:40:04 crc kubenswrapper[4746]: I0128 20:40:04.132407 4746 scope.go:117] "RemoveContainer" containerID="b8dbe081b6eaa82770425f55f41842f6e94b364cccd11dfd0a2c67d00e0863e9" Jan 28 20:40:04 crc kubenswrapper[4746]: E0128 20:40:04.132730 4746 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"ovnkube-controller\" with CrashLoopBackOff: \"back-off 10s restarting failed container=ovnkube-controller pod=ovnkube-node-8vmvh_openshift-ovn-kubernetes(c4d15639-62fb-41b7-a1d4-6f51f3af6d99)\"" pod="openshift-ovn-kubernetes/ovnkube-node-8vmvh" podUID="c4d15639-62fb-41b7-a1d4-6f51f3af6d99" Jan 28 20:40:04 crc kubenswrapper[4746]: I0128 20:40:04.151650 4746 status_manager.go:875] "Failed to update 
status for pod" pod="openshift-image-registry/node-ca-d8rwq" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"405572d8-50d2-4d7a-adf0-d8d6adea31ca\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-28T20:39:57Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-28T20:39:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-28T20:39:57Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-28T20:39:57Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://eba305a172c49420674e97d32590a7b2e4c7c2cd7406257508e0636cf42ea244\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-28T20:39:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mb4h4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\
\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-28T20:39:56Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-d8rwq\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-28T20:40:04Z is after 2025-08-24T17:21:41Z" Jan 28 20:40:04 crc kubenswrapper[4746]: I0128 20:40:04.169384 4746 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 28 20:40:04 crc kubenswrapper[4746]: I0128 20:40:04.169457 4746 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 28 20:40:04 crc kubenswrapper[4746]: I0128 20:40:04.169479 4746 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 28 20:40:04 crc kubenswrapper[4746]: I0128 20:40:04.169511 4746 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 28 20:40:04 crc kubenswrapper[4746]: I0128 20:40:04.169568 4746 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-28T20:40:04Z","lastTransitionTime":"2026-01-28T20:40:04Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 28 20:40:04 crc kubenswrapper[4746]: I0128 20:40:04.176240 4746 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"950d71b3-9b51-43dc-814a-d6a838723f78\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-28T20:39:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-28T20:39:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-28T20:39:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-28T20:39:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-28T20:39:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0411ecf7c85f5397d751adb7d4905977dcbd3d2e169c56ebbe26017192765602\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-28T20:39:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://319cbcaa998
1b3c79eb0c8f970f0ad776fb255db10c72c9b6a13469906e9e5ec\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-28T20:39:33Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://eefda7da35cd9ff176512b5d1224ccadcb5debd0035a49822e82a9757d4a3f91\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-28T20:39:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://145f4f882924f06172df81d94fc9bd683ea1fc04889de5f6cfb5caece8227fd0\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:850
6ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-28T20:39:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-28T20:39:32Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-28T20:40:04Z is after 2025-08-24T17:21:41Z" Jan 28 20:40:04 crc kubenswrapper[4746]: I0128 20:40:04.196610 4746 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-28T20:39:52Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: 
[check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-28T20:39:52Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-28T20:40:04Z is after 2025-08-24T17:21:41Z" Jan 28 20:40:04 crc kubenswrapper[4746]: I0128 20:40:04.216605 4746 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-28T20:39:52Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-28T20:39:52Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-28T20:40:04Z is after 2025-08-24T17:21:41Z" Jan 28 20:40:04 crc kubenswrapper[4746]: I0128 20:40:04.231122 4746 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-28T20:39:53Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-28T20:39:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-28T20:39:53Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5dff3e2075b8bbfb6af77c72362ccc37e1d7810a2d97c096cf501681c98699ab\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-28T20:39:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://772e165efa6d865a669c172958e0aea3112146637e354e4587b7e664454112ad\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af
0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-28T20:39:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-28T20:40:04Z is after 2025-08-24T17:21:41Z" Jan 28 20:40:04 crc kubenswrapper[4746]: I0128 20:40:04.243500 4746 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-gcrxx" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"1c3c93a1-7ddf-4339-9ca3-79f3753943b4\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-28T20:39:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-28T20:39:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-28T20:39:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-28T20:39:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5bd8d8214aa8e85e50747643b6354fb6e026866c926310fb2dd79760f916f468\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-28T20:39:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wn6hr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podI
Ps\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-28T20:39:52Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-gcrxx\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-28T20:40:04Z is after 2025-08-24T17:21:41Z" Jan 28 20:40:04 crc kubenswrapper[4746]: I0128 20:40:04.271069 4746 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-8vmvh" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"c4d15639-62fb-41b7-a1d4-6f51f3af6d99\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-28T20:39:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-28T20:39:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-28T20:39:53Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-28T20:39:53Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://2637734601bf1e43535b70bd62cc6aea7fc9f62d0d2a33c83602e7065a793be1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-28T20:39:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-kgrqs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://73e82be84afa7be760acda86ebf6c956d558dcbd4f37f049fba84e08c75fcd9f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-28T20:39:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\
\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-kgrqs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://17d09f7f46014da3713f76f111bca0a7ce8a24a154aa6144291d760d85e7d3ec\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-28T20:39:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-kgrqs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://b7ad895d0ff1ad570136d96671e1bde5c62882af50a407945ef0e7bc52727b88\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-28T20:39:54Z\\\"}},\\\"volumeMounts\\\":[{
\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-kgrqs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://af842ba33fdf06048038dd3e12c59f7a9d0867aa28acee1f2cc4571e7db28b88\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-28T20:39:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-kgrqs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://57a776a1daf1b17eb44d9fa09fe2f7ec01f647aa0a40d0d571736d2f7c2f8e4d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\"
,\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-28T20:39:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-kgrqs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://b8dbe081b6eaa82770425f55f41842f6e94b364cccd11dfd0a2c67d00e0863e9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://d5e9d34221d49b3737f9b22c58b53644245a0384c7dce6402d4e84c84fef9aa4\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2026-01-28T20:40:01Z\\\",\\\"message\\\":\\\"enshift-dns/node-resolver-gcrxx\\\\nI0128 20:40:01.942633 6016 ovn.go:134] Ensuring zone local for Pod openshift-dns/node-resolver-gcrxx in node crc\\\\nI0128 20:40:01.942655 6016 obj_retry.go:386] Retry successful for *v1.Pod openshift-dns/node-resolver-gcrxx after 0 failed attempt(s)\\\\nI0128 20:40:01.942651 6016 model_client.go:382] Update 
operations generated as: [{Op:update Table:NAT Row:map[external_ip:192.168.126.11 logical_ip:10.217.0.59 options:{GoMap:map[stateless:false]} type:snat] Rows:[] Columns:[] Mutations:[] Timeout:\\\\u003cnil\\\\u003e Where:[where column _uuid == {dce28c51-c9f1-478b-97c8-7e209d6e7cbe}] Until: Durable:\\\\u003cnil\\\\u003e Comment:\\\\u003cnil\\\\u003e Lock:\\\\u003cnil\\\\u003e UUID: UUIDName:}]\\\\nI0128 20:40:01.942779 6016 model_client.go:398] Mutate operations generated as: [{Op:mutate Table:Logical_Router Row:map[] Rows:[] Columns:[] Mutations:[{Column:nat Mutator:insert Value:{GoSet:[{GoUUID:dce28c51-c9f1-478b-97c8-7e209d6e7cbe}]}}] Timeout:\\\\u003cnil\\\\u003e Where:[where column _uuid == {e3c4661a-36a6-47f0-a6c0-a4ee741f2224}] Until: Durable:\\\\u003cnil\\\\u003e Comment:\\\\u003cnil\\\\u003e Lock:\\\\u003cnil\\\\u003e UUID: UUIDName:}]\\\\nI0128 20:40:01.942677 6016 default_network_controller.go:776] Recording success event on pod openshift-dns/node-resolver-gcrxx\\\\nF0128 20:40:01.941342 6016 ovnkube.go:137] failed to run ovnkube: [failed to start network controller: failed to start default network controller: unable to create\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-01-28T20:39:59Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b8dbe081b6eaa82770425f55f41842f6e94b364cccd11dfd0a2c67d00e0863e9\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2026-01-28T20:40:03Z\\\",\\\"message\\\":\\\"8s.io/network-policy-api/pkg/client/informers/externalversions/factory.go:141\\\\nI0128 20:40:03.124373 6172 handler.go:190] Sending *v1.Node event handler 2 for removal\\\\nI0128 20:40:03.124394 6172 handler.go:190] Sending *v1.Node event handler 7 for removal\\\\nI0128 20:40:03.124408 6172 handler.go:190] Sending *v1.EgressFirewall event handler 9 for removal\\\\nI0128 20:40:03.124440 6172 handler.go:190] Sending *v1.Pod event 
handler 3 for removal\\\\nI0128 20:40:03.124452 6172 handler.go:190] Sending *v1.Pod event handler 6 for removal\\\\nI0128 20:40:03.124447 6172 handler.go:208] Removed *v1.Node event handler 2\\\\nI0128 20:40:03.124467 6172 handler.go:190] Sending *v1.NetworkPolicy event handler 4 for removal\\\\nI0128 20:40:03.124473 6172 handler.go:208] Removed *v1.Node event handler 7\\\\nI0128 20:40:03.124475 6172 handler.go:208] Removed *v1.EgressFirewall event handler 9\\\\nI0128 20:40:03.124488 6172 handler.go:208] Removed *v1.Pod event handler 3\\\\nI0128 20:40:03.124488 6172 handler.go:190] Sending *v1.EgressIP event handler 8 for removal\\\\nI0128 20:40:03.124508 6172 handler.go:208] Removed *v1.NetworkPolicy event handler 4\\\\nI0128 20:40:03.124525 6172 handler.go:208] Removed *v1.Pod event handler 6\\\\nI0128 20:40:03.124578 6172 factory.go:656] Stopping watch factory\\\\nI0128 20:40:03.124590 6172 handler.go:208] Removed *v1.EgressIP ev\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-01-28T20:40:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-c
ni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-kgrqs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://cdd60109dba5f45418dd5bc29c649a50aae8479fa77fc3dc50ca4f7c3042e3fc\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-28T20:39:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{
\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-kgrqs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://45e59dfe364d511e17b8264a31157501de3ed23e7125d79979df83d328242328\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://45e59dfe364d511e17b8264a31157501de3ed23e7125d79979df83d328242328\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-28T20:39:53Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-28T20:39:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-kgrqs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-28T20:39:53Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-8vmvh\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-28T20:40:04Z is after 2025-08-24T17:21:41Z" Jan 28 20:40:04 crc kubenswrapper[4746]: I0128 20:40:04.274455 4746 
kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 28 20:40:04 crc kubenswrapper[4746]: I0128 20:40:04.274503 4746 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 28 20:40:04 crc kubenswrapper[4746]: I0128 20:40:04.274513 4746 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 28 20:40:04 crc kubenswrapper[4746]: I0128 20:40:04.274534 4746 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 28 20:40:04 crc kubenswrapper[4746]: I0128 20:40:04.274552 4746 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-28T20:40:04Z","lastTransitionTime":"2026-01-28T20:40:04Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 28 20:40:04 crc kubenswrapper[4746]: I0128 20:40:04.289829 4746 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-28T20:39:52Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-28T20:39:52Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-28T20:40:04Z is after 2025-08-24T17:21:41Z" Jan 28 20:40:04 crc kubenswrapper[4746]: I0128 20:40:04.308895 4746 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-28T20:39:57Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-28T20:39:57Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-28T20:39:57Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://83a275ec9868f2fd9e24135d5a05c27caf843340c144070952291bb6a3715035\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-28T20:39:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has 
expired or is not yet valid: current time 2026-01-28T20:40:04Z is after 2025-08-24T17:21:41Z" Jan 28 20:40:04 crc kubenswrapper[4746]: I0128 20:40:04.327441 4746 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-6wrnw" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"6dc8b546-9734-4082-b2b3-2bafe3f1564d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-28T20:39:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-28T20:39:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-28T20:39:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-28T20:39:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://50ec1ea913ec63a096032bd968bda5c5c8879e16a08e6a57727750517d9af038\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-28T20:39:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-r
bac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-z84f9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://66d250466155027405bf90c4e5ed09388238d4cee63604e28486a16778f9d188\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-28T20:39:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-z84f9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-28T20:39:53Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-6wrnw\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-28T20:40:04Z is after 2025-08-24T17:21:41Z" Jan 28 20:40:04 crc kubenswrapper[4746]: I0128 20:40:04.346720 4746 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-ht6hp" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"beb2b795-6bf4-4d38-89f7-bcb5512c3e61\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-28T20:39:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-28T20:40:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-28T20:40:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-28T20:40:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://564d42c4c25f218c1a1f52449d2f3d941c1dad920c8100d1d908cadf7cf46607\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-28T20:40:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rr62b\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b2d8bf207108ac2304cd04e34cd9843fcb755ac8610a3f88448538d1e99359f6\
\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b2d8bf207108ac2304cd04e34cd9843fcb755ac8610a3f88448538d1e99359f6\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-28T20:39:54Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-28T20:39:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rr62b\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://40c78d39cb70eba245771f4d85d2c8bc7b77427ea216ad3b89749d24fd550098\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://40c78d39cb70eba245771f4d85d2c8bc7b77427ea216ad3b89749d24fd550098\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-28T20:39:55Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-28T20:39:55Z\\\"}},\\\
"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rr62b\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://498028ca0bd260ca2c4d7b0231ea54a5a6ff5b302f720fbd209a4f640507258b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://498028ca0bd260ca2c4d7b0231ea54a5a6ff5b302f720fbd209a4f640507258b\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-28T20:39:56Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-28T20:39:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rr62b\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://661d4
a9031b6789f38ed1ac8e3c40d72182709130e1680ecdad86c07c1f34b7b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://661d4a9031b6789f38ed1ac8e3c40d72182709130e1680ecdad86c07c1f34b7b\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-28T20:39:57Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-28T20:39:57Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rr62b\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://a49ffe0bbe4151671dc8803d7ab61cb9b3a2eae064b81e2830bc074787f4754e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://a49ffe0bbe4151671dc8803d7ab61cb9b3a2eae064b81e2830bc074787f4754e\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-28T20:39:58Z\\\",\\\"reason\\\":\\\"Co
mpleted\\\",\\\"startedAt\\\":\\\"2026-01-28T20:39:58Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rr62b\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://cd521e9ecb53f581b5732a4d018f92007063ce0912e6d760fe3c920218c23b14\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://cd521e9ecb53f581b5732a4d018f92007063ce0912e6d760fe3c920218c23b14\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-28T20:39:59Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-28T20:39:59Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rr62b\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-28T20:39:53Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-ht6hp\": Internal error occurred: failed 
calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-28T20:40:04Z is after 2025-08-24T17:21:41Z" Jan 28 20:40:04 crc kubenswrapper[4746]: I0128 20:40:04.366116 4746 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-qhpvf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"cdf26de0-b602-4bdf-b492-65b3b6b31434\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-28T20:39:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-28T20:39:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-28T20:39:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-28T20:39:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f80345dfd4de46e278611370e6c337968cb1ba7ed8275457deea5871704f8367\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-
01-28T20:39:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jnpsl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-28T20:39:53Z\\\"}}\" for pod \"openshift-multus\"/\"multus-qhpvf\": Internal error occurred: failed 
calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-28T20:40:04Z is after 2025-08-24T17:21:41Z" Jan 28 20:40:04 crc kubenswrapper[4746]: I0128 20:40:04.377612 4746 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 28 20:40:04 crc kubenswrapper[4746]: I0128 20:40:04.377667 4746 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 28 20:40:04 crc kubenswrapper[4746]: I0128 20:40:04.377685 4746 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 28 20:40:04 crc kubenswrapper[4746]: I0128 20:40:04.377711 4746 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 28 20:40:04 crc kubenswrapper[4746]: I0128 20:40:04.377730 4746 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-28T20:40:04Z","lastTransitionTime":"2026-01-28T20:40:04Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 28 20:40:04 crc kubenswrapper[4746]: I0128 20:40:04.390059 4746 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"64499f0d-5c10-43a9-b479-9b6c7ef2fb9c\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-28T20:39:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-28T20:39:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-28T20:39:32Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-28T20:39:32Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-28T20:39:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://9a304b12b4a1e30cbbbb16aa4f91514f1b1a64c716cf8ac023803a279b310f9c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-28T20:39:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://6af41f85c56be6c24127bec0aee8e02562dbb04bd22e57c6887f6f5e914a7530\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-28T20:39:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri
-o://f31e9e80576ec4eb59ec92039601fd94bd323d29493c353f9ec6b9c61187b0d3\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-28T20:39:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://56b250a192cfa698ccd7ac8233b75b9acbf2347549c712b47ccc1b89f144aa83\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://56b250a192cfa698ccd7ac8233b75b9acbf2347549c712b47ccc1b89f144aa83\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-01-28T20:39:52Z\\\",\\\"message\\\":\\\"le observer\\\\nW0128 20:39:52.695680 1 builder.go:272] unable to get owner reference (falling back to namespace): pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI0128 20:39:52.695807 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0128 20:39:52.696767 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" 
name=\\\\\\\"serving-cert::/tmp/serving-cert-2500116620/tls.crt::/tmp/serving-cert-2500116620/tls.key\\\\\\\"\\\\nI0128 20:39:52.900914 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI0128 20:39:52.915714 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI0128 20:39:52.915750 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI0128 20:39:52.916035 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI0128 20:39:52.916045 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI0128 20:39:52.925040 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW0128 20:39:52.925098 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0128 20:39:52.925104 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0128 20:39:52.925109 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW0128 20:39:52.925113 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW0128 20:39:52.925116 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW0128 20:39:52.925119 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nI0128 20:39:52.925416 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nF0128 20:39:52.926926 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-01-28T20:39:47Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 10s restarting failed 
container=kube-apiserver-check-endpoints pod=kube-apiserver-crc_openshift-kube-apiserver(f4b27818a5e8e43d0dc095d08835c792)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://03a94dee0b9663441e0ecec16b121c5be4e5c557f17c929b8edfb0d7ba00c3d5\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-28T20:39:35Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c5808a729190035a73529ac58efb0790f2d9c3b98c5f1a8017389e4ca1738c4c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://c5808a729190035a73529ac58efb0790f2d9c3b98c5f1a8017389e4ca1738c4c\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-28T20:39:33Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-28T20:39:33Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"n
ame\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-28T20:39:32Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-28T20:40:04Z is after 2025-08-24T17:21:41Z" Jan 28 20:40:04 crc kubenswrapper[4746]: I0128 20:40:04.413285 4746 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-28T20:39:54Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-28T20:39:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-28T20:39:54Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://2f64131d62eca6c3a42e9177d14fa39ed6435e536f60ef5d25ff6bf061d2cc0a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\
":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-28T20:39:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-28T20:40:04Z is after 2025-08-24T17:21:41Z" Jan 28 20:40:04 crc kubenswrapper[4746]: I0128 20:40:04.480457 4746 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 28 20:40:04 crc kubenswrapper[4746]: I0128 20:40:04.480513 4746 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 28 20:40:04 crc kubenswrapper[4746]: I0128 20:40:04.480525 4746 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 28 20:40:04 crc kubenswrapper[4746]: I0128 20:40:04.480544 4746 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 28 20:40:04 crc kubenswrapper[4746]: I0128 20:40:04.480558 4746 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-28T20:40:04Z","lastTransitionTime":"2026-01-28T20:40:04Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: 
no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 28 20:40:04 crc kubenswrapper[4746]: I0128 20:40:04.583944 4746 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 28 20:40:04 crc kubenswrapper[4746]: I0128 20:40:04.584007 4746 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 28 20:40:04 crc kubenswrapper[4746]: I0128 20:40:04.584024 4746 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 28 20:40:04 crc kubenswrapper[4746]: I0128 20:40:04.584053 4746 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 28 20:40:04 crc kubenswrapper[4746]: I0128 20:40:04.584069 4746 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-28T20:40:04Z","lastTransitionTime":"2026-01-28T20:40:04Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 28 20:40:04 crc kubenswrapper[4746]: I0128 20:40:04.687722 4746 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 28 20:40:04 crc kubenswrapper[4746]: I0128 20:40:04.687795 4746 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 28 20:40:04 crc kubenswrapper[4746]: I0128 20:40:04.687812 4746 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 28 20:40:04 crc kubenswrapper[4746]: I0128 20:40:04.687843 4746 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 28 20:40:04 crc kubenswrapper[4746]: I0128 20:40:04.687861 4746 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-28T20:40:04Z","lastTransitionTime":"2026-01-28T20:40:04Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 28 20:40:04 crc kubenswrapper[4746]: I0128 20:40:04.790230 4746 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 28 20:40:04 crc kubenswrapper[4746]: I0128 20:40:04.790278 4746 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 28 20:40:04 crc kubenswrapper[4746]: I0128 20:40:04.790289 4746 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 28 20:40:04 crc kubenswrapper[4746]: I0128 20:40:04.790350 4746 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 28 20:40:04 crc kubenswrapper[4746]: I0128 20:40:04.790363 4746 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-28T20:40:04Z","lastTransitionTime":"2026-01-28T20:40:04Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 28 20:40:04 crc kubenswrapper[4746]: I0128 20:40:04.820179 4746 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2025-11-10 04:02:12.308003019 +0000 UTC Jan 28 20:40:04 crc kubenswrapper[4746]: I0128 20:40:04.835830 4746 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Jan 28 20:40:04 crc kubenswrapper[4746]: I0128 20:40:04.835845 4746 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Jan 28 20:40:04 crc kubenswrapper[4746]: E0128 20:40:04.836020 4746 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Jan 28 20:40:04 crc kubenswrapper[4746]: E0128 20:40:04.836233 4746 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Jan 28 20:40:04 crc kubenswrapper[4746]: I0128 20:40:04.893251 4746 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 28 20:40:04 crc kubenswrapper[4746]: I0128 20:40:04.893642 4746 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 28 20:40:04 crc kubenswrapper[4746]: I0128 20:40:04.893779 4746 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 28 20:40:04 crc kubenswrapper[4746]: I0128 20:40:04.893901 4746 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 28 20:40:04 crc kubenswrapper[4746]: I0128 20:40:04.894028 4746 setters.go:603] "Node became not ready" node="crc" 
condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-28T20:40:04Z","lastTransitionTime":"2026-01-28T20:40:04Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 28 20:40:04 crc kubenswrapper[4746]: I0128 20:40:04.996861 4746 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 28 20:40:04 crc kubenswrapper[4746]: I0128 20:40:04.997198 4746 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 28 20:40:04 crc kubenswrapper[4746]: I0128 20:40:04.997340 4746 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 28 20:40:04 crc kubenswrapper[4746]: I0128 20:40:04.997449 4746 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 28 20:40:04 crc kubenswrapper[4746]: I0128 20:40:04.997543 4746 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-28T20:40:04Z","lastTransitionTime":"2026-01-28T20:40:04Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 28 20:40:05 crc kubenswrapper[4746]: I0128 20:40:05.100696 4746 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 28 20:40:05 crc kubenswrapper[4746]: I0128 20:40:05.100772 4746 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 28 20:40:05 crc kubenswrapper[4746]: I0128 20:40:05.100794 4746 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 28 20:40:05 crc kubenswrapper[4746]: I0128 20:40:05.100852 4746 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 28 20:40:05 crc kubenswrapper[4746]: I0128 20:40:05.100880 4746 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-28T20:40:05Z","lastTransitionTime":"2026-01-28T20:40:05Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 28 20:40:05 crc kubenswrapper[4746]: I0128 20:40:05.138563 4746 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-8vmvh_c4d15639-62fb-41b7-a1d4-6f51f3af6d99/ovnkube-controller/1.log" Jan 28 20:40:05 crc kubenswrapper[4746]: I0128 20:40:05.144653 4746 scope.go:117] "RemoveContainer" containerID="b8dbe081b6eaa82770425f55f41842f6e94b364cccd11dfd0a2c67d00e0863e9" Jan 28 20:40:05 crc kubenswrapper[4746]: E0128 20:40:05.145024 4746 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"ovnkube-controller\" with CrashLoopBackOff: \"back-off 10s restarting failed container=ovnkube-controller pod=ovnkube-node-8vmvh_openshift-ovn-kubernetes(c4d15639-62fb-41b7-a1d4-6f51f3af6d99)\"" pod="openshift-ovn-kubernetes/ovnkube-node-8vmvh" podUID="c4d15639-62fb-41b7-a1d4-6f51f3af6d99" Jan 28 20:40:05 crc kubenswrapper[4746]: I0128 20:40:05.158995 4746 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-28T20:39:52Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-28T20:39:52Z\\\",\\\"message\\\":\\\"containers with unready status: 
[networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-28T20:40:05Z is after 2025-08-24T17:21:41Z" Jan 28 20:40:05 crc kubenswrapper[4746]: I0128 20:40:05.186795 4746 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-28T20:39:53Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-28T20:39:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-28T20:39:53Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5dff3e2075b8bbfb6af77c72362ccc37e1d7810a2d97c096cf501681c98699ab\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-28T20:39:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://772e165efa6d865a669c172958e0aea3112146637e354e4587b7e664454112ad\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af
0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-28T20:39:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-28T20:40:05Z is after 2025-08-24T17:21:41Z" Jan 28 20:40:05 crc kubenswrapper[4746]: I0128 20:40:05.204072 4746 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 28 20:40:05 crc kubenswrapper[4746]: I0128 20:40:05.204391 4746 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 28 20:40:05 crc kubenswrapper[4746]: I0128 20:40:05.204500 4746 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 28 20:40:05 crc kubenswrapper[4746]: I0128 20:40:05.204590 4746 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 28 20:40:05 crc kubenswrapper[4746]: I0128 20:40:05.204686 4746 setters.go:603] "Node became not ready" node="crc" 
condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-28T20:40:05Z","lastTransitionTime":"2026-01-28T20:40:05Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 28 20:40:05 crc kubenswrapper[4746]: I0128 20:40:05.207026 4746 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-gcrxx" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"1c3c93a1-7ddf-4339-9ca3-79f3753943b4\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-28T20:39:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-28T20:39:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-28T20:39:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-28T20:39:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5bd8d8214aa8e85e50747643b6354fb6e026866c926310fb2dd79760f916f468\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\
\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-28T20:39:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wn6hr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-28T20:39:52Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-gcrxx\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-28T20:40:05Z is after 2025-08-24T17:21:41Z" Jan 28 20:40:05 crc kubenswrapper[4746]: I0128 20:40:05.227608 4746 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-8vmvh" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"c4d15639-62fb-41b7-a1d4-6f51f3af6d99\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-28T20:39:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-28T20:39:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-28T20:39:53Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-28T20:39:53Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://2637734601bf1e43535b70bd62cc6aea7fc9f62d0d2a33c83602e7065a793be1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-28T20:39:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-kgrqs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://73e82be84afa7be760acda86ebf6c956d558dcbd4f37f049fba84e08c75fcd9f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-28T20:39:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\
\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-kgrqs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://17d09f7f46014da3713f76f111bca0a7ce8a24a154aa6144291d760d85e7d3ec\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-28T20:39:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-kgrqs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://b7ad895d0ff1ad570136d96671e1bde5c62882af50a407945ef0e7bc52727b88\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-28T20:39:54Z\\\"}},\\\"volumeMounts\\\":[{
\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-kgrqs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://af842ba33fdf06048038dd3e12c59f7a9d0867aa28acee1f2cc4571e7db28b88\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-28T20:39:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-kgrqs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://57a776a1daf1b17eb44d9fa09fe2f7ec01f647aa0a40d0d571736d2f7c2f8e4d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\"
,\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-28T20:39:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-kgrqs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://b8dbe081b6eaa82770425f55f41842f6e94b364cccd11dfd0a2c67d00e0863e9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b8dbe081b6eaa82770425f55f41842f6e94b364cccd11dfd0a2c67d00e0863e9\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2026-01-28T20:40:03Z\\\",\\\"message\\\":\\\"8s.io/network-policy-api/pkg/client/informers/externalversions/factory.go:141\\\\nI0128 20:40:03.124373 6172 handler.go:190] Sending *v1.Node event handler 2 for removal\\\\nI0128 20:40:03.124394 6172 handler.go:190] Sending *v1.Node event handler 7 for removal\\\\nI0128 20:40:03.124408 6172 handler.go:190] Sending *v1.EgressFirewall event handler 
9 for removal\\\\nI0128 20:40:03.124440 6172 handler.go:190] Sending *v1.Pod event handler 3 for removal\\\\nI0128 20:40:03.124452 6172 handler.go:190] Sending *v1.Pod event handler 6 for removal\\\\nI0128 20:40:03.124447 6172 handler.go:208] Removed *v1.Node event handler 2\\\\nI0128 20:40:03.124467 6172 handler.go:190] Sending *v1.NetworkPolicy event handler 4 for removal\\\\nI0128 20:40:03.124473 6172 handler.go:208] Removed *v1.Node event handler 7\\\\nI0128 20:40:03.124475 6172 handler.go:208] Removed *v1.EgressFirewall event handler 9\\\\nI0128 20:40:03.124488 6172 handler.go:208] Removed *v1.Pod event handler 3\\\\nI0128 20:40:03.124488 6172 handler.go:190] Sending *v1.EgressIP event handler 8 for removal\\\\nI0128 20:40:03.124508 6172 handler.go:208] Removed *v1.NetworkPolicy event handler 4\\\\nI0128 20:40:03.124525 6172 handler.go:208] Removed *v1.Pod event handler 6\\\\nI0128 20:40:03.124578 6172 factory.go:656] Stopping watch factory\\\\nI0128 20:40:03.124590 6172 handler.go:208] Removed *v1.EgressIP ev\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-01-28T20:40:02Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 10s restarting failed container=ovnkube-controller 
pod=ovnkube-node-8vmvh_openshift-ovn-kubernetes(c4d15639-62fb-41b7-a1d4-6f51f3af6d99)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kube
rnetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-kgrqs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://cdd60109dba5f45418dd5bc29c649a50aae8479fa77fc3dc50ca4f7c3042e3fc\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-28T20:39:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-kgrqs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://45e59dfe364d511e17b8264a31157501de3ed23e7125d79979df83d328242328\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://45e59dfe364d511e17
b8264a31157501de3ed23e7125d79979df83d328242328\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-28T20:39:53Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-28T20:39:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-kgrqs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-28T20:39:53Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-8vmvh\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-28T20:40:05Z is after 2025-08-24T17:21:41Z" Jan 28 20:40:05 crc kubenswrapper[4746]: I0128 20:40:05.246292 4746 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"64499f0d-5c10-43a9-b479-9b6c7ef2fb9c\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-28T20:39:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-28T20:39:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-28T20:39:32Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-28T20:39:32Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-28T20:39:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://9a304b12b4a1e30cbbbb16aa4f91514f1b1a64c716cf8ac023803a279b310f9c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-28T20:39:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://6af41f85c56be6c24127bec0aee8e02562dbb04bd22e57c6887f6f5e914a7530\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"
restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-28T20:39:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://f31e9e80576ec4eb59ec92039601fd94bd323d29493c353f9ec6b9c61187b0d3\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-28T20:39:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://56b250a192cfa698ccd7ac8233b75b9acbf2347549c712b47ccc1b89f144aa83\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://56b250a192cfa698ccd7ac8233b75b9acbf2347549c712b47ccc1b89f144aa83\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-01-28T20:39:52Z\\\",\\\"message\\\":\\\"le observer\\\\nW0128 20:39:52.695680 1 builder.go:272] unable to get owner reference (falling back to namespace): pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI0128 20:39:52.695807 1 builder.go:304] check-endpoints version 
4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0128 20:39:52.696767 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-2500116620/tls.crt::/tmp/serving-cert-2500116620/tls.key\\\\\\\"\\\\nI0128 20:39:52.900914 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI0128 20:39:52.915714 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI0128 20:39:52.915750 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI0128 20:39:52.916035 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI0128 20:39:52.916045 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI0128 20:39:52.925040 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW0128 20:39:52.925098 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0128 20:39:52.925104 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0128 20:39:52.925109 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW0128 20:39:52.925113 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW0128 20:39:52.925116 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW0128 20:39:52.925119 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nI0128 20:39:52.925416 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nF0128 20:39:52.926926 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not 
found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-01-28T20:39:47Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 10s restarting failed container=kube-apiserver-check-endpoints pod=kube-apiserver-crc_openshift-kube-apiserver(f4b27818a5e8e43d0dc095d08835c792)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://03a94dee0b9663441e0ecec16b121c5be4e5c557f17c929b8edfb0d7ba00c3d5\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-28T20:39:35Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c5808a729190035a73529ac58efb0790f2d9c3b98c5f1a8017389e4ca1738c4c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://c58
08a729190035a73529ac58efb0790f2d9c3b98c5f1a8017389e4ca1738c4c\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-28T20:39:33Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-28T20:39:33Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-28T20:39:32Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-28T20:40:05Z is after 2025-08-24T17:21:41Z" Jan 28 20:40:05 crc kubenswrapper[4746]: I0128 20:40:05.264987 4746 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-28T20:39:54Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-28T20:39:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-28T20:39:54Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://2f64131d62eca6c3a42e9177d14fa39ed6435e536f60ef5d25ff6bf061d2cc0a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-28T20:39:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: 
x509: certificate has expired or is not yet valid: current time 2026-01-28T20:40:05Z is after 2025-08-24T17:21:41Z" Jan 28 20:40:05 crc kubenswrapper[4746]: I0128 20:40:05.281573 4746 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-28T20:39:57Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-28T20:39:57Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-28T20:39:57Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://83a275ec9868f2fd9e24135d5a05c27caf843340c144070952291bb6a3715035\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-28T20:39:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\
\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-28T20:40:05Z is after 2025-08-24T17:21:41Z" Jan 28 20:40:05 crc kubenswrapper[4746]: I0128 20:40:05.296224 4746 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-6wrnw" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"6dc8b546-9734-4082-b2b3-2bafe3f1564d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-28T20:39:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-28T20:39:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-28T20:39:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-28T20:39:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://50ec1ea913ec63a096032bd968bda5c5c8879e16a08e6a57727750517d9af038\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\
":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-28T20:39:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-z84f9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://66d250466155027405bf90c4e5ed09388238d4cee63604e28486a16778f9d188\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-28T20:39:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-z84f9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-28T20:39:53Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-6wrnw\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or 
is not yet valid: current time 2026-01-28T20:40:05Z is after 2025-08-24T17:21:41Z" Jan 28 20:40:05 crc kubenswrapper[4746]: I0128 20:40:05.307659 4746 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 28 20:40:05 crc kubenswrapper[4746]: I0128 20:40:05.307713 4746 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 28 20:40:05 crc kubenswrapper[4746]: I0128 20:40:05.307722 4746 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 28 20:40:05 crc kubenswrapper[4746]: I0128 20:40:05.307738 4746 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 28 20:40:05 crc kubenswrapper[4746]: I0128 20:40:05.307751 4746 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-28T20:40:05Z","lastTransitionTime":"2026-01-28T20:40:05Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 28 20:40:05 crc kubenswrapper[4746]: I0128 20:40:05.316695 4746 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-ht6hp" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"beb2b795-6bf4-4d38-89f7-bcb5512c3e61\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-28T20:39:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-28T20:40:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-28T20:40:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-28T20:40:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://564d42c4c25f218c1a1f52449d2f3d941c1dad920c8100d1d908cadf7cf46607\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-28T20:40:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rr62b\\\",\\\"readOnly\\\":true,\\\"recursiveReadOn
ly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b2d8bf207108ac2304cd04e34cd9843fcb755ac8610a3f88448538d1e99359f6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b2d8bf207108ac2304cd04e34cd9843fcb755ac8610a3f88448538d1e99359f6\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-28T20:39:54Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-28T20:39:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rr62b\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://40c78d39cb70eba245771f4d85d2c8bc7b77427ea216ad3b89749d24fd550098\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"
containerID\\\":\\\"cri-o://40c78d39cb70eba245771f4d85d2c8bc7b77427ea216ad3b89749d24fd550098\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-28T20:39:55Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-28T20:39:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rr62b\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://498028ca0bd260ca2c4d7b0231ea54a5a6ff5b302f720fbd209a4f640507258b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://498028ca0bd260ca2c4d7b0231ea54a5a6ff5b302f720fbd209a4f640507258b\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-28T20:39:56Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-28T20:39:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveRead
Only\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rr62b\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://661d4a9031b6789f38ed1ac8e3c40d72182709130e1680ecdad86c07c1f34b7b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://661d4a9031b6789f38ed1ac8e3c40d72182709130e1680ecdad86c07c1f34b7b\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-28T20:39:57Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-28T20:39:57Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rr62b\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://a49ffe0bbe4151671dc8803d7ab61cb9b3a2eae064b81e2830bc074787f4754e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\"
:0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://a49ffe0bbe4151671dc8803d7ab61cb9b3a2eae064b81e2830bc074787f4754e\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-28T20:39:58Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-28T20:39:58Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rr62b\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://cd521e9ecb53f581b5732a4d018f92007063ce0912e6d760fe3c920218c23b14\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://cd521e9ecb53f581b5732a4d018f92007063ce0912e6d760fe3c920218c23b14\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-28T20:39:59Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-28T20:39:59Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rr62b\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\"
,\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-28T20:39:53Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-ht6hp\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-28T20:40:05Z is after 2025-08-24T17:21:41Z" Jan 28 20:40:05 crc kubenswrapper[4746]: I0128 20:40:05.337860 4746 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-qhpvf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"cdf26de0-b602-4bdf-b492-65b3b6b31434\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-28T20:39:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-28T20:39:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-28T20:39:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-28T20:39:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f80345dfd4de46e278611370e6c337968cb1ba7ed8275457deea5871704f8367\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7
eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-28T20:39:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jnpsl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\"
:\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-28T20:39:53Z\\\"}}\" for pod \"openshift-multus\"/\"multus-qhpvf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-28T20:40:05Z is after 2025-08-24T17:21:41Z" Jan 28 20:40:05 crc kubenswrapper[4746]: I0128 20:40:05.353843 4746 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"950d71b3-9b51-43dc-814a-d6a838723f78\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-28T20:39:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-28T20:39:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-28T20:39:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-28T20:39:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-28T20:39:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0411ecf7c85f5397d751adb7d4905977dcbd3d2e169c56ebbe26017192765602\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b08465
2d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-28T20:39:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://319cbcaa9981b3c79eb0c8f970f0ad776fb255db10c72c9b6a13469906e9e5ec\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-28T20:39:33Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://eefda7da35cd9ff176512b5d1224ccadcb5debd0035a49822e82a9757d4a3f91\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-28T20:39:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\
\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://145f4f882924f06172df81d94fc9bd683ea1fc04889de5f6cfb5caece8227fd0\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-28T20:39:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-28T20:39:32Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-28T20:40:05Z is after 2025-08-24T17:21:41Z" Jan 28 20:40:05 crc kubenswrapper[4746]: I0128 20:40:05.365111 4746 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-d8rwq" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"405572d8-50d2-4d7a-adf0-d8d6adea31ca\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-28T20:39:57Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-28T20:39:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-28T20:39:57Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-28T20:39:57Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://eba305a172c49420674e97d32590a7b2e4c7c2cd7406257508e0636cf42ea244\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-28T20:39:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mb4h4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phas
e\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-28T20:39:56Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-d8rwq\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-28T20:40:05Z is after 2025-08-24T17:21:41Z" Jan 28 20:40:05 crc kubenswrapper[4746]: I0128 20:40:05.381785 4746 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-28T20:39:52Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-28T20:39:52Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-28T20:40:05Z is after 2025-08-24T17:21:41Z" Jan 28 20:40:05 crc kubenswrapper[4746]: I0128 20:40:05.395266 4746 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-28T20:39:52Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-28T20:39:52Z\\\",\\\"message\\\":\\\"containers with unready status: 
[network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-28T20:40:05Z is after 2025-08-24T17:21:41Z" Jan 28 20:40:05 crc kubenswrapper[4746]: I0128 20:40:05.410186 4746 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 28 20:40:05 crc kubenswrapper[4746]: I0128 20:40:05.410236 4746 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 28 20:40:05 crc kubenswrapper[4746]: I0128 20:40:05.410251 4746 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 28 20:40:05 crc 
kubenswrapper[4746]: I0128 20:40:05.410297 4746 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 28 20:40:05 crc kubenswrapper[4746]: I0128 20:40:05.410315 4746 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-28T20:40:05Z","lastTransitionTime":"2026-01-28T20:40:05Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 28 20:40:05 crc kubenswrapper[4746]: I0128 20:40:05.513437 4746 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 28 20:40:05 crc kubenswrapper[4746]: I0128 20:40:05.513484 4746 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 28 20:40:05 crc kubenswrapper[4746]: I0128 20:40:05.513495 4746 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 28 20:40:05 crc kubenswrapper[4746]: I0128 20:40:05.513513 4746 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 28 20:40:05 crc kubenswrapper[4746]: I0128 20:40:05.513524 4746 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-28T20:40:05Z","lastTransitionTime":"2026-01-28T20:40:05Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 28 20:40:05 crc kubenswrapper[4746]: I0128 20:40:05.616330 4746 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 28 20:40:05 crc kubenswrapper[4746]: I0128 20:40:05.616364 4746 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 28 20:40:05 crc kubenswrapper[4746]: I0128 20:40:05.616372 4746 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 28 20:40:05 crc kubenswrapper[4746]: I0128 20:40:05.616386 4746 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 28 20:40:05 crc kubenswrapper[4746]: I0128 20:40:05.616396 4746 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-28T20:40:05Z","lastTransitionTime":"2026-01-28T20:40:05Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 28 20:40:05 crc kubenswrapper[4746]: I0128 20:40:05.719121 4746 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 28 20:40:05 crc kubenswrapper[4746]: I0128 20:40:05.719173 4746 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 28 20:40:05 crc kubenswrapper[4746]: I0128 20:40:05.719187 4746 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 28 20:40:05 crc kubenswrapper[4746]: I0128 20:40:05.719208 4746 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 28 20:40:05 crc kubenswrapper[4746]: I0128 20:40:05.719225 4746 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-28T20:40:05Z","lastTransitionTime":"2026-01-28T20:40:05Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 28 20:40:05 crc kubenswrapper[4746]: I0128 20:40:05.820545 4746 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2025-12-19 11:16:36.278423445 +0000 UTC Jan 28 20:40:05 crc kubenswrapper[4746]: I0128 20:40:05.822626 4746 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 28 20:40:05 crc kubenswrapper[4746]: I0128 20:40:05.822682 4746 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 28 20:40:05 crc kubenswrapper[4746]: I0128 20:40:05.822700 4746 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 28 20:40:05 crc kubenswrapper[4746]: I0128 20:40:05.822723 4746 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 28 20:40:05 crc kubenswrapper[4746]: I0128 20:40:05.822740 4746 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-28T20:40:05Z","lastTransitionTime":"2026-01-28T20:40:05Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 28 20:40:05 crc kubenswrapper[4746]: I0128 20:40:05.835297 4746 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Jan 28 20:40:05 crc kubenswrapper[4746]: E0128 20:40:05.835448 4746 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Jan 28 20:40:05 crc kubenswrapper[4746]: I0128 20:40:05.920922 4746 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-9zvm7"] Jan 28 20:40:05 crc kubenswrapper[4746]: I0128 20:40:05.921614 4746 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-9zvm7" Jan 28 20:40:05 crc kubenswrapper[4746]: I0128 20:40:05.925893 4746 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ovn-kubernetes"/"ovn-kubernetes-control-plane-dockercfg-gs7dd" Jan 28 20:40:05 crc kubenswrapper[4746]: I0128 20:40:05.925991 4746 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ovn-kubernetes"/"ovn-control-plane-metrics-cert" Jan 28 20:40:05 crc kubenswrapper[4746]: I0128 20:40:05.926500 4746 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 28 20:40:05 crc kubenswrapper[4746]: I0128 20:40:05.926576 4746 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 28 20:40:05 crc kubenswrapper[4746]: I0128 20:40:05.926597 4746 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 28 20:40:05 crc kubenswrapper[4746]: I0128 20:40:05.926625 4746 kubelet_node_status.go:724] "Recording 
event message for node" node="crc" event="NodeNotReady" Jan 28 20:40:05 crc kubenswrapper[4746]: I0128 20:40:05.926647 4746 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-28T20:40:05Z","lastTransitionTime":"2026-01-28T20:40:05Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 28 20:40:05 crc kubenswrapper[4746]: I0128 20:40:05.939727 4746 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-6wrnw" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"6dc8b546-9734-4082-b2b3-2bafe3f1564d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-28T20:39:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-28T20:39:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-28T20:39:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-28T20:39:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://50ec1ea913ec63a096032bd968bda5c5c8879e16a08e6a57727750517d9af038\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID
\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-28T20:39:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-z84f9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://66d250466155027405bf90c4e5ed09388238d4cee63604e28486a16778f9d188\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-28T20:39:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-z84f9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-28T20:39:53Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-6wrnw\": Internal error occurred: failed calling webhook 
\"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-28T20:40:05Z is after 2025-08-24T17:21:41Z" Jan 28 20:40:05 crc kubenswrapper[4746]: I0128 20:40:05.960531 4746 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-ht6hp" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"beb2b795-6bf4-4d38-89f7-bcb5512c3e61\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-28T20:39:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-28T20:40:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-28T20:40:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-28T20:40:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://564d42c4c25f218c1a1f52449d2f3d941c1dad920c8100d1d908cadf7cf46607\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\
":{\\\"startedAt\\\":\\\"2026-01-28T20:40:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rr62b\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b2d8bf207108ac2304cd04e34cd9843fcb755ac8610a3f88448538d1e99359f6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b2d8bf207108ac2304cd04e34cd9843fcb755ac8610a3f88448538d1e99359f6\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-28T20:39:54Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-28T20:39:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rr62b\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://40c78d39cb70eba245771f4d85d2c8bc7b77427ea216ad3b89749d24fd550098\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sh
a256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://40c78d39cb70eba245771f4d85d2c8bc7b77427ea216ad3b89749d24fd550098\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-28T20:39:55Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-28T20:39:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rr62b\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://498028ca0bd260ca2c4d7b0231ea54a5a6ff5b302f720fbd209a4f640507258b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://498028ca0bd260ca2c4d7b0231ea54a5a6ff5b302f720fbd209a4f640507258b\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-28T20:39:56Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-28T20:39:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",
\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rr62b\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://661d4a9031b6789f38ed1ac8e3c40d72182709130e1680ecdad86c07c1f34b7b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://661d4a9031b6789f38ed1ac8e3c40d72182709130e1680ecdad86c07c1f34b7b\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-28T20:39:57Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-28T20:39:57Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rr62b\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://a49ffe0bbe4151671dc8803d7ab61cb9b3a2eae064b81e2830bc074787f4754e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\
\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://a49ffe0bbe4151671dc8803d7ab61cb9b3a2eae064b81e2830bc074787f4754e\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-28T20:39:58Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-28T20:39:58Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rr62b\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://cd521e9ecb53f581b5732a4d018f92007063ce0912e6d760fe3c920218c23b14\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://cd521e9ecb53f581b5732a4d018f92007063ce0912e6d760fe3c920218c23b14\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-28T20:39:59Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-28T20:39:59Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"sys
tem-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rr62b\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-28T20:39:53Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-ht6hp\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-28T20:40:05Z is after 2025-08-24T17:21:41Z" Jan 28 20:40:05 crc kubenswrapper[4746]: I0128 20:40:05.974034 4746 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-qhpvf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"cdf26de0-b602-4bdf-b492-65b3b6b31434\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-28T20:39:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-28T20:39:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-28T20:39:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-28T20:39:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f80345dfd4de46e278611370e6c337968cb1ba7ed8275457deea58
71704f8367\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-28T20:39:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/servi
ceaccount\\\",\\\"name\\\":\\\"kube-api-access-jnpsl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-28T20:39:53Z\\\"}}\" for pod \"openshift-multus\"/\"multus-qhpvf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-28T20:40:05Z is after 2025-08-24T17:21:41Z" Jan 28 20:40:05 crc kubenswrapper[4746]: I0128 20:40:05.988183 4746 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"64499f0d-5c10-43a9-b479-9b6c7ef2fb9c\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-28T20:39:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-28T20:39:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-28T20:39:32Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-28T20:39:32Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-28T20:39:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://9a304b12b4a1e30cbbbb16aa4f91514f1b1a64c716cf8ac023803a279b310f9c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-28T20:39:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://6af41f85c56be6c24127bec0aee8e02562dbb04bd22e57c6887f6f5e914a7530\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-28T20:39:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri
-o://f31e9e80576ec4eb59ec92039601fd94bd323d29493c353f9ec6b9c61187b0d3\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-28T20:39:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://56b250a192cfa698ccd7ac8233b75b9acbf2347549c712b47ccc1b89f144aa83\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://56b250a192cfa698ccd7ac8233b75b9acbf2347549c712b47ccc1b89f144aa83\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-01-28T20:39:52Z\\\",\\\"message\\\":\\\"le observer\\\\nW0128 20:39:52.695680 1 builder.go:272] unable to get owner reference (falling back to namespace): pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI0128 20:39:52.695807 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0128 20:39:52.696767 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" 
name=\\\\\\\"serving-cert::/tmp/serving-cert-2500116620/tls.crt::/tmp/serving-cert-2500116620/tls.key\\\\\\\"\\\\nI0128 20:39:52.900914 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI0128 20:39:52.915714 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI0128 20:39:52.915750 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI0128 20:39:52.916035 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI0128 20:39:52.916045 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI0128 20:39:52.925040 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW0128 20:39:52.925098 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0128 20:39:52.925104 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0128 20:39:52.925109 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW0128 20:39:52.925113 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW0128 20:39:52.925116 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW0128 20:39:52.925119 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nI0128 20:39:52.925416 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nF0128 20:39:52.926926 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-01-28T20:39:47Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 10s restarting failed 
container=kube-apiserver-check-endpoints pod=kube-apiserver-crc_openshift-kube-apiserver(f4b27818a5e8e43d0dc095d08835c792)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://03a94dee0b9663441e0ecec16b121c5be4e5c557f17c929b8edfb0d7ba00c3d5\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-28T20:39:35Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c5808a729190035a73529ac58efb0790f2d9c3b98c5f1a8017389e4ca1738c4c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://c5808a729190035a73529ac58efb0790f2d9c3b98c5f1a8017389e4ca1738c4c\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-28T20:39:33Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-28T20:39:33Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"n
ame\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-28T20:39:32Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-28T20:40:05Z is after 2025-08-24T17:21:41Z" Jan 28 20:40:06 crc kubenswrapper[4746]: I0128 20:40:06.002027 4746 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-28T20:39:54Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-28T20:39:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-28T20:39:54Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://2f64131d62eca6c3a42e9177d14fa39ed6435e536f60ef5d25ff6bf061d2cc0a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\
":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-28T20:39:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-28T20:40:05Z is after 2025-08-24T17:21:41Z" Jan 28 20:40:06 crc kubenswrapper[4746]: I0128 20:40:06.013697 4746 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-28T20:39:57Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-28T20:39:57Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-28T20:39:57Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://83a275ec9868f2fd9e24135d5a05c27caf843340c144070952291bb6a3715035\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-28T20:39:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has 
expired or is not yet valid: current time 2026-01-28T20:40:06Z is after 2025-08-24T17:21:41Z" Jan 28 20:40:06 crc kubenswrapper[4746]: I0128 20:40:06.024602 4746 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-d8rwq" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"405572d8-50d2-4d7a-adf0-d8d6adea31ca\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-28T20:39:57Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-28T20:39:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-28T20:39:57Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-28T20:39:57Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://eba305a172c49420674e97d32590a7b2e4c7c2cd7406257508e0636cf42ea244\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-28T20:39:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"hos
t\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mb4h4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-28T20:39:56Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-d8rwq\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-28T20:40:06Z is after 2025-08-24T17:21:41Z" Jan 28 20:40:06 crc kubenswrapper[4746]: I0128 20:40:06.028879 4746 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 28 20:40:06 crc kubenswrapper[4746]: I0128 20:40:06.028917 4746 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 28 20:40:06 crc kubenswrapper[4746]: I0128 20:40:06.028930 4746 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 28 20:40:06 crc kubenswrapper[4746]: I0128 20:40:06.028943 4746 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 28 20:40:06 crc kubenswrapper[4746]: I0128 20:40:06.028952 4746 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-28T20:40:06Z","lastTransitionTime":"2026-01-28T20:40:06Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 28 20:40:06 crc kubenswrapper[4746]: I0128 20:40:06.029203 4746 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/de4fb5f0-7c65-4fdb-8389-d2c8462e130b-ovnkube-config\") pod \"ovnkube-control-plane-749d76644c-9zvm7\" (UID: \"de4fb5f0-7c65-4fdb-8389-d2c8462e130b\") " pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-9zvm7" Jan 28 20:40:06 crc kubenswrapper[4746]: I0128 20:40:06.029230 4746 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovn-control-plane-metrics-cert\" (UniqueName: \"kubernetes.io/secret/de4fb5f0-7c65-4fdb-8389-d2c8462e130b-ovn-control-plane-metrics-cert\") pod \"ovnkube-control-plane-749d76644c-9zvm7\" (UID: \"de4fb5f0-7c65-4fdb-8389-d2c8462e130b\") " pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-9zvm7" Jan 28 20:40:06 crc kubenswrapper[4746]: I0128 20:40:06.029279 4746 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-jvkqz\" (UniqueName: \"kubernetes.io/projected/de4fb5f0-7c65-4fdb-8389-d2c8462e130b-kube-api-access-jvkqz\") pod \"ovnkube-control-plane-749d76644c-9zvm7\" (UID: \"de4fb5f0-7c65-4fdb-8389-d2c8462e130b\") " pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-9zvm7" Jan 28 20:40:06 crc kubenswrapper[4746]: I0128 20:40:06.029322 4746 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/de4fb5f0-7c65-4fdb-8389-d2c8462e130b-env-overrides\") pod \"ovnkube-control-plane-749d76644c-9zvm7\" (UID: \"de4fb5f0-7c65-4fdb-8389-d2c8462e130b\") " pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-9zvm7" Jan 28 20:40:06 crc kubenswrapper[4746]: I0128 20:40:06.041367 4746 status_manager.go:875] "Failed to update status for pod" 
pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"950d71b3-9b51-43dc-814a-d6a838723f78\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-28T20:39:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-28T20:39:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-28T20:39:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-28T20:39:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-28T20:39:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0411ecf7c85f5397d751adb7d4905977dcbd3d2e169c56ebbe26017192765602\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-28T20:39:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://319cbcaa9981b3c79eb0c8f970f0ad776fb255db10c72c9b6a13469906e9e5ec\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c
4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-28T20:39:33Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://eefda7da35cd9ff176512b5d1224ccadcb5debd0035a49822e82a9757d4a3f91\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-28T20:39:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://145f4f882924f06172df81d94fc9bd683ea1fc04889de5f6cfb5caece8227fd0\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\
\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-28T20:39:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-28T20:39:32Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-28T20:40:06Z is after 2025-08-24T17:21:41Z" Jan 28 20:40:06 crc kubenswrapper[4746]: I0128 20:40:06.057440 4746 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-28T20:39:52Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-28T20:39:52Z\\\",\\\"message\\\":\\\"containers with unready status: 
[check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-28T20:40:06Z is after 2025-08-24T17:21:41Z" Jan 28 20:40:06 crc kubenswrapper[4746]: I0128 20:40:06.071811 4746 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-28T20:39:52Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-28T20:39:52Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-28T20:40:06Z is after 2025-08-24T17:21:41Z" Jan 28 20:40:06 crc kubenswrapper[4746]: I0128 20:40:06.083970 4746 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-9zvm7" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"de4fb5f0-7c65-4fdb-8389-d2c8462e130b\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-28T20:40:05Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-28T20:40:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-28T20:40:05Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-rbac-proxy 
ovnkube-cluster-manager]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-28T20:40:05Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-rbac-proxy ovnkube-cluster-manager]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jvkqz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jvkqz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168
.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-28T20:40:05Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-9zvm7\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-28T20:40:06Z is after 2025-08-24T17:21:41Z" Jan 28 20:40:06 crc kubenswrapper[4746]: I0128 20:40:06.093181 4746 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-gcrxx" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"1c3c93a1-7ddf-4339-9ca3-79f3753943b4\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-28T20:39:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-28T20:39:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-28T20:39:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-28T20:39:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5bd8d8214aa8e85e50747643b6354fb6e026866c926310fb2dd79760f916f468\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-d
ev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-28T20:39:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wn6hr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-28T20:39:52Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-gcrxx\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-28T20:40:06Z is after 2025-08-24T17:21:41Z" Jan 28 20:40:06 crc kubenswrapper[4746]: I0128 20:40:06.111054 4746 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-8vmvh" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"c4d15639-62fb-41b7-a1d4-6f51f3af6d99\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-28T20:39:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-28T20:39:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-28T20:39:53Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-28T20:39:53Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://2637734601bf1e43535b70bd62cc6aea7fc9f62d0d2a33c83602e7065a793be1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-28T20:39:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-kgrqs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://73e82be84afa7be760acda86ebf6c956d558dcbd4f37f049fba84e08c75fcd9f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-28T20:39:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\
\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-kgrqs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://17d09f7f46014da3713f76f111bca0a7ce8a24a154aa6144291d760d85e7d3ec\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-28T20:39:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-kgrqs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://b7ad895d0ff1ad570136d96671e1bde5c62882af50a407945ef0e7bc52727b88\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-28T20:39:54Z\\\"}},\\\"volumeMounts\\\":[{
\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-kgrqs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://af842ba33fdf06048038dd3e12c59f7a9d0867aa28acee1f2cc4571e7db28b88\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-28T20:39:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-kgrqs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://57a776a1daf1b17eb44d9fa09fe2f7ec01f647aa0a40d0d571736d2f7c2f8e4d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\"
,\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-28T20:39:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-kgrqs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://b8dbe081b6eaa82770425f55f41842f6e94b364cccd11dfd0a2c67d00e0863e9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b8dbe081b6eaa82770425f55f41842f6e94b364cccd11dfd0a2c67d00e0863e9\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2026-01-28T20:40:03Z\\\",\\\"message\\\":\\\"8s.io/network-policy-api/pkg/client/informers/externalversions/factory.go:141\\\\nI0128 20:40:03.124373 6172 handler.go:190] Sending *v1.Node event handler 2 for removal\\\\nI0128 20:40:03.124394 6172 handler.go:190] Sending *v1.Node event handler 7 for removal\\\\nI0128 20:40:03.124408 6172 handler.go:190] Sending *v1.EgressFirewall event handler 
9 for removal\\\\nI0128 20:40:03.124440 6172 handler.go:190] Sending *v1.Pod event handler 3 for removal\\\\nI0128 20:40:03.124452 6172 handler.go:190] Sending *v1.Pod event handler 6 for removal\\\\nI0128 20:40:03.124447 6172 handler.go:208] Removed *v1.Node event handler 2\\\\nI0128 20:40:03.124467 6172 handler.go:190] Sending *v1.NetworkPolicy event handler 4 for removal\\\\nI0128 20:40:03.124473 6172 handler.go:208] Removed *v1.Node event handler 7\\\\nI0128 20:40:03.124475 6172 handler.go:208] Removed *v1.EgressFirewall event handler 9\\\\nI0128 20:40:03.124488 6172 handler.go:208] Removed *v1.Pod event handler 3\\\\nI0128 20:40:03.124488 6172 handler.go:190] Sending *v1.EgressIP event handler 8 for removal\\\\nI0128 20:40:03.124508 6172 handler.go:208] Removed *v1.NetworkPolicy event handler 4\\\\nI0128 20:40:03.124525 6172 handler.go:208] Removed *v1.Pod event handler 6\\\\nI0128 20:40:03.124578 6172 factory.go:656] Stopping watch factory\\\\nI0128 20:40:03.124590 6172 handler.go:208] Removed *v1.EgressIP ev\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-01-28T20:40:02Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 10s restarting failed container=ovnkube-controller 
pod=ovnkube-node-8vmvh_openshift-ovn-kubernetes(c4d15639-62fb-41b7-a1d4-6f51f3af6d99)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kube
rnetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-kgrqs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://cdd60109dba5f45418dd5bc29c649a50aae8479fa77fc3dc50ca4f7c3042e3fc\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-28T20:39:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-kgrqs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://45e59dfe364d511e17b8264a31157501de3ed23e7125d79979df83d328242328\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://45e59dfe364d511e17
b8264a31157501de3ed23e7125d79979df83d328242328\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-28T20:39:53Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-28T20:39:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-kgrqs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-28T20:39:53Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-8vmvh\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-28T20:40:06Z is after 2025-08-24T17:21:41Z" Jan 28 20:40:06 crc kubenswrapper[4746]: I0128 20:40:06.128398 4746 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-28T20:39:52Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-28T20:39:52Z\\\",\\\"message\\\":\\\"containers with unready status: 
[networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-28T20:40:06Z is after 2025-08-24T17:21:41Z" Jan 28 20:40:06 crc kubenswrapper[4746]: I0128 20:40:06.130539 4746 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-jvkqz\" (UniqueName: \"kubernetes.io/projected/de4fb5f0-7c65-4fdb-8389-d2c8462e130b-kube-api-access-jvkqz\") pod \"ovnkube-control-plane-749d76644c-9zvm7\" (UID: \"de4fb5f0-7c65-4fdb-8389-d2c8462e130b\") " 
pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-9zvm7" Jan 28 20:40:06 crc kubenswrapper[4746]: I0128 20:40:06.130653 4746 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/de4fb5f0-7c65-4fdb-8389-d2c8462e130b-env-overrides\") pod \"ovnkube-control-plane-749d76644c-9zvm7\" (UID: \"de4fb5f0-7c65-4fdb-8389-d2c8462e130b\") " pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-9zvm7" Jan 28 20:40:06 crc kubenswrapper[4746]: I0128 20:40:06.130724 4746 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/de4fb5f0-7c65-4fdb-8389-d2c8462e130b-ovnkube-config\") pod \"ovnkube-control-plane-749d76644c-9zvm7\" (UID: \"de4fb5f0-7c65-4fdb-8389-d2c8462e130b\") " pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-9zvm7" Jan 28 20:40:06 crc kubenswrapper[4746]: I0128 20:40:06.130766 4746 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovn-control-plane-metrics-cert\" (UniqueName: \"kubernetes.io/secret/de4fb5f0-7c65-4fdb-8389-d2c8462e130b-ovn-control-plane-metrics-cert\") pod \"ovnkube-control-plane-749d76644c-9zvm7\" (UID: \"de4fb5f0-7c65-4fdb-8389-d2c8462e130b\") " pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-9zvm7" Jan 28 20:40:06 crc kubenswrapper[4746]: I0128 20:40:06.131468 4746 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 28 20:40:06 crc kubenswrapper[4746]: I0128 20:40:06.131500 4746 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/de4fb5f0-7c65-4fdb-8389-d2c8462e130b-env-overrides\") pod \"ovnkube-control-plane-749d76644c-9zvm7\" (UID: \"de4fb5f0-7c65-4fdb-8389-d2c8462e130b\") " pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-9zvm7" Jan 28 20:40:06 crc kubenswrapper[4746]: 
I0128 20:40:06.131540 4746 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 28 20:40:06 crc kubenswrapper[4746]: I0128 20:40:06.131565 4746 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 28 20:40:06 crc kubenswrapper[4746]: I0128 20:40:06.131612 4746 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/de4fb5f0-7c65-4fdb-8389-d2c8462e130b-ovnkube-config\") pod \"ovnkube-control-plane-749d76644c-9zvm7\" (UID: \"de4fb5f0-7c65-4fdb-8389-d2c8462e130b\") " pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-9zvm7" Jan 28 20:40:06 crc kubenswrapper[4746]: I0128 20:40:06.131640 4746 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 28 20:40:06 crc kubenswrapper[4746]: I0128 20:40:06.131663 4746 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-28T20:40:06Z","lastTransitionTime":"2026-01-28T20:40:06Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 28 20:40:06 crc kubenswrapper[4746]: I0128 20:40:06.136473 4746 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovn-control-plane-metrics-cert\" (UniqueName: \"kubernetes.io/secret/de4fb5f0-7c65-4fdb-8389-d2c8462e130b-ovn-control-plane-metrics-cert\") pod \"ovnkube-control-plane-749d76644c-9zvm7\" (UID: \"de4fb5f0-7c65-4fdb-8389-d2c8462e130b\") " pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-9zvm7" Jan 28 20:40:06 crc kubenswrapper[4746]: I0128 20:40:06.148196 4746 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-28T20:39:53Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-28T20:39:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-28T20:39:53Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5dff3e2075b8bbfb6af77c72362ccc37e1d7810a2d97c096cf501681c98699ab\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-28T20:39:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath
\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://772e165efa6d865a669c172958e0aea3112146637e354e4587b7e664454112ad\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-28T20:39:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-28T20:40:06Z is after 2025-08-24T17:21:41Z" Jan 28 20:40:06 crc kubenswrapper[4746]: I0128 20:40:06.160672 4746 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-jvkqz\" (UniqueName: 
\"kubernetes.io/projected/de4fb5f0-7c65-4fdb-8389-d2c8462e130b-kube-api-access-jvkqz\") pod \"ovnkube-control-plane-749d76644c-9zvm7\" (UID: \"de4fb5f0-7c65-4fdb-8389-d2c8462e130b\") " pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-9zvm7" Jan 28 20:40:06 crc kubenswrapper[4746]: I0128 20:40:06.234264 4746 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 28 20:40:06 crc kubenswrapper[4746]: I0128 20:40:06.234561 4746 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 28 20:40:06 crc kubenswrapper[4746]: I0128 20:40:06.234628 4746 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 28 20:40:06 crc kubenswrapper[4746]: I0128 20:40:06.234713 4746 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 28 20:40:06 crc kubenswrapper[4746]: I0128 20:40:06.234799 4746 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-28T20:40:06Z","lastTransitionTime":"2026-01-28T20:40:06Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 28 20:40:06 crc kubenswrapper[4746]: I0128 20:40:06.242443 4746 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-9zvm7" Jan 28 20:40:06 crc kubenswrapper[4746]: W0128 20:40:06.269916 4746 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podde4fb5f0_7c65_4fdb_8389_d2c8462e130b.slice/crio-3b2f35a4e42fcef6ff209900a3c673a3f927b56f874b435e7bf0f13d14b85d09 WatchSource:0}: Error finding container 3b2f35a4e42fcef6ff209900a3c673a3f927b56f874b435e7bf0f13d14b85d09: Status 404 returned error can't find the container with id 3b2f35a4e42fcef6ff209900a3c673a3f927b56f874b435e7bf0f13d14b85d09 Jan 28 20:40:06 crc kubenswrapper[4746]: I0128 20:40:06.337010 4746 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 28 20:40:06 crc kubenswrapper[4746]: I0128 20:40:06.337051 4746 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 28 20:40:06 crc kubenswrapper[4746]: I0128 20:40:06.337063 4746 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 28 20:40:06 crc kubenswrapper[4746]: I0128 20:40:06.337101 4746 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 28 20:40:06 crc kubenswrapper[4746]: I0128 20:40:06.337119 4746 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-28T20:40:06Z","lastTransitionTime":"2026-01-28T20:40:06Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 28 20:40:06 crc kubenswrapper[4746]: I0128 20:40:06.440763 4746 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 28 20:40:06 crc kubenswrapper[4746]: I0128 20:40:06.440800 4746 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 28 20:40:06 crc kubenswrapper[4746]: I0128 20:40:06.440808 4746 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 28 20:40:06 crc kubenswrapper[4746]: I0128 20:40:06.440822 4746 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 28 20:40:06 crc kubenswrapper[4746]: I0128 20:40:06.440830 4746 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-28T20:40:06Z","lastTransitionTime":"2026-01-28T20:40:06Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 28 20:40:06 crc kubenswrapper[4746]: I0128 20:40:06.542989 4746 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 28 20:40:06 crc kubenswrapper[4746]: I0128 20:40:06.543014 4746 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 28 20:40:06 crc kubenswrapper[4746]: I0128 20:40:06.543021 4746 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 28 20:40:06 crc kubenswrapper[4746]: I0128 20:40:06.543035 4746 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 28 20:40:06 crc kubenswrapper[4746]: I0128 20:40:06.543044 4746 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-28T20:40:06Z","lastTransitionTime":"2026-01-28T20:40:06Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 28 20:40:06 crc kubenswrapper[4746]: I0128 20:40:06.645825 4746 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 28 20:40:06 crc kubenswrapper[4746]: I0128 20:40:06.645878 4746 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 28 20:40:06 crc kubenswrapper[4746]: I0128 20:40:06.645887 4746 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 28 20:40:06 crc kubenswrapper[4746]: I0128 20:40:06.645904 4746 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 28 20:40:06 crc kubenswrapper[4746]: I0128 20:40:06.645914 4746 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-28T20:40:06Z","lastTransitionTime":"2026-01-28T20:40:06Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 28 20:40:06 crc kubenswrapper[4746]: I0128 20:40:06.748571 4746 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 28 20:40:06 crc kubenswrapper[4746]: I0128 20:40:06.748621 4746 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 28 20:40:06 crc kubenswrapper[4746]: I0128 20:40:06.748629 4746 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 28 20:40:06 crc kubenswrapper[4746]: I0128 20:40:06.748649 4746 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 28 20:40:06 crc kubenswrapper[4746]: I0128 20:40:06.748660 4746 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-28T20:40:06Z","lastTransitionTime":"2026-01-28T20:40:06Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 28 20:40:06 crc kubenswrapper[4746]: I0128 20:40:06.821427 4746 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2025-11-25 23:37:02.773345039 +0000 UTC Jan 28 20:40:06 crc kubenswrapper[4746]: I0128 20:40:06.835775 4746 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Jan 28 20:40:06 crc kubenswrapper[4746]: I0128 20:40:06.835842 4746 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Jan 28 20:40:06 crc kubenswrapper[4746]: E0128 20:40:06.836008 4746 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Jan 28 20:40:06 crc kubenswrapper[4746]: E0128 20:40:06.836224 4746 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Jan 28 20:40:06 crc kubenswrapper[4746]: I0128 20:40:06.850834 4746 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 28 20:40:06 crc kubenswrapper[4746]: I0128 20:40:06.850889 4746 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 28 20:40:06 crc kubenswrapper[4746]: I0128 20:40:06.850909 4746 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 28 20:40:06 crc kubenswrapper[4746]: I0128 20:40:06.850935 4746 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 28 20:40:06 crc kubenswrapper[4746]: I0128 20:40:06.850950 4746 setters.go:603] "Node became not ready" node="crc" 
condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-28T20:40:06Z","lastTransitionTime":"2026-01-28T20:40:06Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 28 20:40:06 crc kubenswrapper[4746]: I0128 20:40:06.954409 4746 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 28 20:40:06 crc kubenswrapper[4746]: I0128 20:40:06.954525 4746 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 28 20:40:06 crc kubenswrapper[4746]: I0128 20:40:06.954554 4746 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 28 20:40:06 crc kubenswrapper[4746]: I0128 20:40:06.954592 4746 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 28 20:40:06 crc kubenswrapper[4746]: I0128 20:40:06.954615 4746 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-28T20:40:06Z","lastTransitionTime":"2026-01-28T20:40:06Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 28 20:40:07 crc kubenswrapper[4746]: I0128 20:40:07.038853 4746 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-multus/network-metrics-daemon-2blg6"] Jan 28 20:40:07 crc kubenswrapper[4746]: I0128 20:40:07.039848 4746 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-multus/network-metrics-daemon-2blg6" Jan 28 20:40:07 crc kubenswrapper[4746]: E0128 20:40:07.039971 4746 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-2blg6" podUID="f60a5487-5012-4cc9-ad94-5dfb4957d74e" Jan 28 20:40:07 crc kubenswrapper[4746]: I0128 20:40:07.058708 4746 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 28 20:40:07 crc kubenswrapper[4746]: I0128 20:40:07.058786 4746 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 28 20:40:07 crc kubenswrapper[4746]: I0128 20:40:07.058817 4746 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 28 20:40:07 crc kubenswrapper[4746]: I0128 20:40:07.058851 4746 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 28 20:40:07 crc kubenswrapper[4746]: I0128 20:40:07.058875 4746 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-28T20:40:07Z","lastTransitionTime":"2026-01-28T20:40:07Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 28 20:40:07 crc kubenswrapper[4746]: I0128 20:40:07.063425 4746 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-28T20:39:52Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-28T20:39:52Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-28T20:40:07Z is after 2025-08-24T17:21:41Z" Jan 28 20:40:07 crc kubenswrapper[4746]: I0128 20:40:07.087221 4746 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-28T20:39:53Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-28T20:39:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-28T20:39:53Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5dff3e2075b8bbfb6af77c72362ccc37e1d7810a2d97c096cf501681c98699ab\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-28T20:39:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://772e165efa6d865a669c172958e0aea3112146637e354e4587b7e664454112ad\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af
0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-28T20:39:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-28T20:40:07Z is after 2025-08-24T17:21:41Z" Jan 28 20:40:07 crc kubenswrapper[4746]: I0128 20:40:07.102938 4746 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-gcrxx" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"1c3c93a1-7ddf-4339-9ca3-79f3753943b4\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-28T20:39:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-28T20:39:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-28T20:39:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-28T20:39:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5bd8d8214aa8e85e50747643b6354fb6e026866c926310fb2dd79760f916f468\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-28T20:39:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wn6hr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podI
Ps\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-28T20:39:52Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-gcrxx\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-28T20:40:07Z is after 2025-08-24T17:21:41Z" Jan 28 20:40:07 crc kubenswrapper[4746]: I0128 20:40:07.136014 4746 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-8vmvh" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"c4d15639-62fb-41b7-a1d4-6f51f3af6d99\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-28T20:39:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-28T20:39:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-28T20:39:53Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-28T20:39:53Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://2637734601bf1e43535b70bd62cc6aea7fc9f62d0d2a33c83602e7065a793be1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-28T20:39:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-kgrqs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://73e82be84afa7be760acda86ebf6c956d558dcbd4f37f049fba84e08c75fcd9f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-28T20:39:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\
\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-kgrqs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://17d09f7f46014da3713f76f111bca0a7ce8a24a154aa6144291d760d85e7d3ec\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-28T20:39:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-kgrqs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://b7ad895d0ff1ad570136d96671e1bde5c62882af50a407945ef0e7bc52727b88\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-28T20:39:54Z\\\"}},\\\"volumeMounts\\\":[{
\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-kgrqs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://af842ba33fdf06048038dd3e12c59f7a9d0867aa28acee1f2cc4571e7db28b88\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-28T20:39:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-kgrqs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://57a776a1daf1b17eb44d9fa09fe2f7ec01f647aa0a40d0d571736d2f7c2f8e4d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\"
,\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-28T20:39:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-kgrqs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://b8dbe081b6eaa82770425f55f41842f6e94b364cccd11dfd0a2c67d00e0863e9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b8dbe081b6eaa82770425f55f41842f6e94b364cccd11dfd0a2c67d00e0863e9\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2026-01-28T20:40:03Z\\\",\\\"message\\\":\\\"8s.io/network-policy-api/pkg/client/informers/externalversions/factory.go:141\\\\nI0128 20:40:03.124373 6172 handler.go:190] Sending *v1.Node event handler 2 for removal\\\\nI0128 20:40:03.124394 6172 handler.go:190] Sending *v1.Node event handler 7 for removal\\\\nI0128 20:40:03.124408 6172 handler.go:190] Sending *v1.EgressFirewall event handler 
9 for removal\\\\nI0128 20:40:03.124440 6172 handler.go:190] Sending *v1.Pod event handler 3 for removal\\\\nI0128 20:40:03.124452 6172 handler.go:190] Sending *v1.Pod event handler 6 for removal\\\\nI0128 20:40:03.124447 6172 handler.go:208] Removed *v1.Node event handler 2\\\\nI0128 20:40:03.124467 6172 handler.go:190] Sending *v1.NetworkPolicy event handler 4 for removal\\\\nI0128 20:40:03.124473 6172 handler.go:208] Removed *v1.Node event handler 7\\\\nI0128 20:40:03.124475 6172 handler.go:208] Removed *v1.EgressFirewall event handler 9\\\\nI0128 20:40:03.124488 6172 handler.go:208] Removed *v1.Pod event handler 3\\\\nI0128 20:40:03.124488 6172 handler.go:190] Sending *v1.EgressIP event handler 8 for removal\\\\nI0128 20:40:03.124508 6172 handler.go:208] Removed *v1.NetworkPolicy event handler 4\\\\nI0128 20:40:03.124525 6172 handler.go:208] Removed *v1.Pod event handler 6\\\\nI0128 20:40:03.124578 6172 factory.go:656] Stopping watch factory\\\\nI0128 20:40:03.124590 6172 handler.go:208] Removed *v1.EgressIP ev\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-01-28T20:40:02Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 10s restarting failed container=ovnkube-controller 
pod=ovnkube-node-8vmvh_openshift-ovn-kubernetes(c4d15639-62fb-41b7-a1d4-6f51f3af6d99)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kube
rnetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-kgrqs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://cdd60109dba5f45418dd5bc29c649a50aae8479fa77fc3dc50ca4f7c3042e3fc\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-28T20:39:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-kgrqs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://45e59dfe364d511e17b8264a31157501de3ed23e7125d79979df83d328242328\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://45e59dfe364d511e17
b8264a31157501de3ed23e7125d79979df83d328242328\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-28T20:39:53Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-28T20:39:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-kgrqs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-28T20:39:53Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-8vmvh\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-28T20:40:07Z is after 2025-08-24T17:21:41Z" Jan 28 20:40:07 crc kubenswrapper[4746]: I0128 20:40:07.141408 4746 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-l2www\" (UniqueName: \"kubernetes.io/projected/f60a5487-5012-4cc9-ad94-5dfb4957d74e-kube-api-access-l2www\") pod \"network-metrics-daemon-2blg6\" (UID: \"f60a5487-5012-4cc9-ad94-5dfb4957d74e\") " pod="openshift-multus/network-metrics-daemon-2blg6" Jan 28 20:40:07 crc kubenswrapper[4746]: I0128 20:40:07.141485 4746 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/f60a5487-5012-4cc9-ad94-5dfb4957d74e-metrics-certs\") pod \"network-metrics-daemon-2blg6\" (UID: \"f60a5487-5012-4cc9-ad94-5dfb4957d74e\") " pod="openshift-multus/network-metrics-daemon-2blg6" Jan 28 20:40:07 crc kubenswrapper[4746]: I0128 20:40:07.154149 4746 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-9zvm7" event={"ID":"de4fb5f0-7c65-4fdb-8389-d2c8462e130b","Type":"ContainerStarted","Data":"d2340a59c423445dd2b32da57d43146dd1cc354b737ae1b4d1875a0d40f62188"} Jan 28 20:40:07 crc kubenswrapper[4746]: I0128 20:40:07.154200 4746 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-9zvm7" event={"ID":"de4fb5f0-7c65-4fdb-8389-d2c8462e130b","Type":"ContainerStarted","Data":"972f4c0079d8e04365abd93dc8d63e88dea04f427dbbbcf5e331cdeeff8c855c"} Jan 28 20:40:07 crc kubenswrapper[4746]: I0128 20:40:07.154213 4746 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-9zvm7" event={"ID":"de4fb5f0-7c65-4fdb-8389-d2c8462e130b","Type":"ContainerStarted","Data":"3b2f35a4e42fcef6ff209900a3c673a3f927b56f874b435e7bf0f13d14b85d09"} Jan 28 20:40:07 crc kubenswrapper[4746]: I0128 20:40:07.159240 4746 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"64499f0d-5c10-43a9-b479-9b6c7ef2fb9c\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-28T20:39:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-28T20:39:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-28T20:39:32Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-28T20:39:32Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-28T20:39:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://9a304b12b4a1e30cbbbb16aa4f91514f1b1a64c716cf8ac023803a279b310f9c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-28T20:39:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://6af41f85c56be6c24127bec0aee8e02562dbb04bd22e57c6887f6f5e914a7530\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-28T20:39:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri
-o://f31e9e80576ec4eb59ec92039601fd94bd323d29493c353f9ec6b9c61187b0d3\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-28T20:39:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://56b250a192cfa698ccd7ac8233b75b9acbf2347549c712b47ccc1b89f144aa83\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://56b250a192cfa698ccd7ac8233b75b9acbf2347549c712b47ccc1b89f144aa83\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-01-28T20:39:52Z\\\",\\\"message\\\":\\\"le observer\\\\nW0128 20:39:52.695680 1 builder.go:272] unable to get owner reference (falling back to namespace): pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI0128 20:39:52.695807 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0128 20:39:52.696767 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" 
name=\\\\\\\"serving-cert::/tmp/serving-cert-2500116620/tls.crt::/tmp/serving-cert-2500116620/tls.key\\\\\\\"\\\\nI0128 20:39:52.900914 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI0128 20:39:52.915714 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI0128 20:39:52.915750 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI0128 20:39:52.916035 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI0128 20:39:52.916045 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI0128 20:39:52.925040 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW0128 20:39:52.925098 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0128 20:39:52.925104 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0128 20:39:52.925109 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW0128 20:39:52.925113 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW0128 20:39:52.925116 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW0128 20:39:52.925119 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nI0128 20:39:52.925416 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nF0128 20:39:52.926926 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-01-28T20:39:47Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 10s restarting failed 
container=kube-apiserver-check-endpoints pod=kube-apiserver-crc_openshift-kube-apiserver(f4b27818a5e8e43d0dc095d08835c792)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://03a94dee0b9663441e0ecec16b121c5be4e5c557f17c929b8edfb0d7ba00c3d5\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-28T20:39:35Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c5808a729190035a73529ac58efb0790f2d9c3b98c5f1a8017389e4ca1738c4c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://c5808a729190035a73529ac58efb0790f2d9c3b98c5f1a8017389e4ca1738c4c\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-28T20:39:33Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-28T20:39:33Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"n
ame\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-28T20:39:32Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-28T20:40:07Z is after 2025-08-24T17:21:41Z" Jan 28 20:40:07 crc kubenswrapper[4746]: I0128 20:40:07.161380 4746 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 28 20:40:07 crc kubenswrapper[4746]: I0128 20:40:07.161425 4746 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 28 20:40:07 crc kubenswrapper[4746]: I0128 20:40:07.161437 4746 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 28 20:40:07 crc kubenswrapper[4746]: I0128 20:40:07.161460 4746 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 28 20:40:07 crc kubenswrapper[4746]: I0128 20:40:07.161473 4746 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-28T20:40:07Z","lastTransitionTime":"2026-01-28T20:40:07Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 28 20:40:07 crc kubenswrapper[4746]: I0128 20:40:07.177426 4746 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-28T20:39:54Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-28T20:39:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-28T20:39:54Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://2f64131d62eca6c3a42e9177d14fa39ed6435e536f60ef5d25ff6bf061d2cc0a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-28T20:39:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod 
\"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-28T20:40:07Z is after 2025-08-24T17:21:41Z" Jan 28 20:40:07 crc kubenswrapper[4746]: I0128 20:40:07.194822 4746 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-28T20:39:57Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-28T20:39:57Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-28T20:39:57Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://83a275ec9868f2fd9e24135d5a05c27caf843340c144070952291bb6a3715035\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-28T20:39:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\
\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-28T20:40:07Z is after 2025-08-24T17:21:41Z" Jan 28 20:40:07 crc kubenswrapper[4746]: I0128 20:40:07.211558 4746 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-6wrnw" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"6dc8b546-9734-4082-b2b3-2bafe3f1564d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-28T20:39:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-28T20:39:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-28T20:39:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-28T20:39:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://50ec1ea913ec63a096032bd968bda5c5c8879e16a08e6a57727750517d9af038\\\",\\\"image\\\":\\\"quay.io/open
shift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-28T20:39:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-z84f9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://66d250466155027405bf90c4e5ed09388238d4cee63604e28486a16778f9d188\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-28T20:39:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-z84f9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-28T20:39:53Z\\\"}}\" for pod 
\"openshift-machine-config-operator\"/\"machine-config-daemon-6wrnw\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-28T20:40:07Z is after 2025-08-24T17:21:41Z" Jan 28 20:40:07 crc kubenswrapper[4746]: I0128 20:40:07.230176 4746 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-ht6hp" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"beb2b795-6bf4-4d38-89f7-bcb5512c3e61\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-28T20:39:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-28T20:40:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-28T20:40:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-28T20:40:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://564d42c4c25f218c1a1f52449d2f3d941c1dad920c8100d1d908cadf7cf46607\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-
additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-28T20:40:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rr62b\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b2d8bf207108ac2304cd04e34cd9843fcb755ac8610a3f88448538d1e99359f6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b2d8bf207108ac2304cd04e34cd9843fcb755ac8610a3f88448538d1e99359f6\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-28T20:39:54Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-28T20:39:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rr62b\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://40c78d39cb70eba245771f4d85d2c8bc7b77427ea216ad3b89749d24fd550098\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df
312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://40c78d39cb70eba245771f4d85d2c8bc7b77427ea216ad3b89749d24fd550098\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-28T20:39:55Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-28T20:39:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rr62b\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://498028ca0bd260ca2c4d7b0231ea54a5a6ff5b302f720fbd209a4f640507258b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://498028ca0bd260ca2c4d7b0231ea54a5a6ff5b302f720fbd209a4f640507258b\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-28T20:39:56Z\\\",\\\"reason\\\":\\\"Compl
eted\\\",\\\"startedAt\\\":\\\"2026-01-28T20:39:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rr62b\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://661d4a9031b6789f38ed1ac8e3c40d72182709130e1680ecdad86c07c1f34b7b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://661d4a9031b6789f38ed1ac8e3c40d72182709130e1680ecdad86c07c1f34b7b\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-28T20:39:57Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-28T20:39:57Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rr62b\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://a49ffe0bbe4151671dc8803d7ab61cb9b3a2eae064b81e2830bc074787f4754e\\\",\\\"image\\\":\\\"quay.io/openshift-rel
ease-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://a49ffe0bbe4151671dc8803d7ab61cb9b3a2eae064b81e2830bc074787f4754e\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-28T20:39:58Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-28T20:39:58Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rr62b\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://cd521e9ecb53f581b5732a4d018f92007063ce0912e6d760fe3c920218c23b14\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://cd521e9ecb53f581b5732a4d018f92007063ce0912e6d760fe3c920218c23b14\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-28T20:39:59Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-28T20:39:59Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\
"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rr62b\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-28T20:39:53Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-ht6hp\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-28T20:40:07Z is after 2025-08-24T17:21:41Z" Jan 28 20:40:07 crc kubenswrapper[4746]: I0128 20:40:07.243495 4746 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/f60a5487-5012-4cc9-ad94-5dfb4957d74e-metrics-certs\") pod \"network-metrics-daemon-2blg6\" (UID: \"f60a5487-5012-4cc9-ad94-5dfb4957d74e\") " pod="openshift-multus/network-metrics-daemon-2blg6" Jan 28 20:40:07 crc kubenswrapper[4746]: I0128 20:40:07.243622 4746 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-l2www\" (UniqueName: \"kubernetes.io/projected/f60a5487-5012-4cc9-ad94-5dfb4957d74e-kube-api-access-l2www\") pod \"network-metrics-daemon-2blg6\" (UID: \"f60a5487-5012-4cc9-ad94-5dfb4957d74e\") " pod="openshift-multus/network-metrics-daemon-2blg6" Jan 28 20:40:07 crc kubenswrapper[4746]: E0128 20:40:07.244122 4746 secret.go:188] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered Jan 28 20:40:07 crc kubenswrapper[4746]: E0128 20:40:07.244254 4746 nestedpendingoperations.go:348] Operation 
for "{volumeName:kubernetes.io/secret/f60a5487-5012-4cc9-ad94-5dfb4957d74e-metrics-certs podName:f60a5487-5012-4cc9-ad94-5dfb4957d74e nodeName:}" failed. No retries permitted until 2026-01-28 20:40:07.744218559 +0000 UTC m=+35.700404953 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/f60a5487-5012-4cc9-ad94-5dfb4957d74e-metrics-certs") pod "network-metrics-daemon-2blg6" (UID: "f60a5487-5012-4cc9-ad94-5dfb4957d74e") : object "openshift-multus"/"metrics-daemon-secret" not registered Jan 28 20:40:07 crc kubenswrapper[4746]: I0128 20:40:07.245863 4746 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-qhpvf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"cdf26de0-b602-4bdf-b492-65b3b6b31434\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-28T20:39:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-28T20:39:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-28T20:39:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-28T20:39:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f80345dfd4de46e278611370e6c337968cb1ba7ed8275457deea5871704f8367\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-rele
ase-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-28T20:39:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jnpsl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"1
92.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-28T20:39:53Z\\\"}}\" for pod \"openshift-multus\"/\"multus-qhpvf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-28T20:40:07Z is after 2025-08-24T17:21:41Z" Jan 28 20:40:07 crc kubenswrapper[4746]: I0128 20:40:07.260178 4746 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-2blg6" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"f60a5487-5012-4cc9-ad94-5dfb4957d74e\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-28T20:40:07Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-28T20:40:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-28T20:40:07Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-28T20:40:07Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-l2www\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-l2www\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-28T20:40:07Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-2blg6\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-28T20:40:07Z is after 2025-08-24T17:21:41Z" Jan 28 20:40:07 crc 
kubenswrapper[4746]: I0128 20:40:07.266749 4746 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 28 20:40:07 crc kubenswrapper[4746]: I0128 20:40:07.266830 4746 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 28 20:40:07 crc kubenswrapper[4746]: I0128 20:40:07.266850 4746 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 28 20:40:07 crc kubenswrapper[4746]: I0128 20:40:07.266898 4746 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 28 20:40:07 crc kubenswrapper[4746]: I0128 20:40:07.266929 4746 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-28T20:40:07Z","lastTransitionTime":"2026-01-28T20:40:07Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 28 20:40:07 crc kubenswrapper[4746]: I0128 20:40:07.269551 4746 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-l2www\" (UniqueName: \"kubernetes.io/projected/f60a5487-5012-4cc9-ad94-5dfb4957d74e-kube-api-access-l2www\") pod \"network-metrics-daemon-2blg6\" (UID: \"f60a5487-5012-4cc9-ad94-5dfb4957d74e\") " pod="openshift-multus/network-metrics-daemon-2blg6" Jan 28 20:40:07 crc kubenswrapper[4746]: I0128 20:40:07.282698 4746 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"950d71b3-9b51-43dc-814a-d6a838723f78\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-28T20:39:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-28T20:39:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-28T20:39:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-28T20:39:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-28T20:39:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0411ecf7c85f5397d751adb7d4905977dcbd3d2e169c56ebbe26017192765602\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",
\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-28T20:39:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://319cbcaa9981b3c79eb0c8f970f0ad776fb255db10c72c9b6a13469906e9e5ec\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-28T20:39:33Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://eefda7da35cd9ff176512b5d1224ccadcb5debd0035a49822e82a9757d4a3f91\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-28T20:39:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\
\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://145f4f882924f06172df81d94fc9bd683ea1fc04889de5f6cfb5caece8227fd0\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-28T20:39:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-28T20:39:32Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-28T20:40:07Z is after 2025-08-24T17:21:41Z" Jan 28 20:40:07 crc kubenswrapper[4746]: I0128 20:40:07.298697 4746 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-d8rwq" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"405572d8-50d2-4d7a-adf0-d8d6adea31ca\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-28T20:39:57Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-28T20:39:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-28T20:39:57Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-28T20:39:57Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://eba305a172c49420674e97d32590a7b2e4c7c2cd7406257508e0636cf42ea244\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-28T20:39:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mb4h4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phas
e\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-28T20:39:56Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-d8rwq\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-28T20:40:07Z is after 2025-08-24T17:21:41Z" Jan 28 20:40:07 crc kubenswrapper[4746]: I0128 20:40:07.313900 4746 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-28T20:39:52Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-28T20:39:52Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-28T20:40:07Z is after 2025-08-24T17:21:41Z" Jan 28 20:40:07 crc kubenswrapper[4746]: I0128 20:40:07.329589 4746 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-28T20:39:52Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-28T20:39:52Z\\\",\\\"message\\\":\\\"containers with unready status: 
[network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-28T20:40:07Z is after 2025-08-24T17:21:41Z" Jan 28 20:40:07 crc kubenswrapper[4746]: I0128 20:40:07.348674 4746 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-9zvm7" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"de4fb5f0-7c65-4fdb-8389-d2c8462e130b\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-28T20:40:05Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-28T20:40:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-28T20:40:05Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-rbac-proxy ovnkube-cluster-manager]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-28T20:40:05Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-rbac-proxy 
ovnkube-cluster-manager]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jvkqz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jvkqz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-28T20:40:05Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-9zvm7\": Internal error occurred: failed calling 
webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-28T20:40:07Z is after 2025-08-24T17:21:41Z" Jan 28 20:40:07 crc kubenswrapper[4746]: I0128 20:40:07.363912 4746 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-d8rwq" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"405572d8-50d2-4d7a-adf0-d8d6adea31ca\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-28T20:39:57Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-28T20:39:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-28T20:39:57Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-28T20:39:57Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://eba305a172c49420674e97d32590a7b2e4c7c2cd7406257508e0636cf42ea244\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-
28T20:39:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mb4h4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-28T20:39:56Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-d8rwq\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-28T20:40:07Z is after 2025-08-24T17:21:41Z" Jan 28 20:40:07 crc kubenswrapper[4746]: I0128 20:40:07.371926 4746 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 28 20:40:07 crc kubenswrapper[4746]: I0128 20:40:07.371995 4746 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 28 20:40:07 crc kubenswrapper[4746]: I0128 20:40:07.372014 4746 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 28 20:40:07 crc kubenswrapper[4746]: I0128 20:40:07.372044 4746 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 28 20:40:07 crc kubenswrapper[4746]: I0128 20:40:07.372062 4746 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-28T20:40:07Z","lastTransitionTime":"2026-01-28T20:40:07Z","reason":"KubeletNotReady","message":"container runtime 
network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 28 20:40:07 crc kubenswrapper[4746]: I0128 20:40:07.380055 4746 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"950d71b3-9b51-43dc-814a-d6a838723f78\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-28T20:39:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-28T20:39:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-28T20:39:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-28T20:39:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-28T20:39:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0411ecf7c85f5397d751adb7d4905977dcbd3d2e169c56ebbe26017192765602\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-28T20:39:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\
\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://319cbcaa9981b3c79eb0c8f970f0ad776fb255db10c72c9b6a13469906e9e5ec\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-28T20:39:33Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://eefda7da35cd9ff176512b5d1224ccadcb5debd0035a49822e82a9757d4a3f91\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-28T20:39:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://145f4f882924f06172df81d94fc9bd683ea1fc04889de5f6cfb5caece8227fd0\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0
a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-28T20:39:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-28T20:39:32Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-28T20:40:07Z is after 2025-08-24T17:21:41Z" Jan 28 20:40:07 crc kubenswrapper[4746]: I0128 20:40:07.396574 4746 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-9zvm7" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"de4fb5f0-7c65-4fdb-8389-d2c8462e130b\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-28T20:40:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-28T20:40:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-28T20:40:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-28T20:40:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://972f4c0079d8e04365abd93dc8d63e88dea04f427dbbbcf5e331cdeeff8c855c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-28T20:40:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jvkqz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://d2340a59c423445dd2b32da57d43146dd1cc3
54b737ae1b4d1875a0d40f62188\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-28T20:40:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jvkqz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-28T20:40:05Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-9zvm7\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-28T20:40:07Z is after 2025-08-24T17:21:41Z" Jan 28 20:40:07 crc kubenswrapper[4746]: I0128 20:40:07.418705 4746 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-28T20:39:52Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-28T20:39:52Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-28T20:40:07Z is after 2025-08-24T17:21:41Z" Jan 28 20:40:07 crc kubenswrapper[4746]: I0128 20:40:07.439691 4746 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-28T20:39:52Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-28T20:39:52Z\\\",\\\"message\\\":\\\"containers with unready status: 
[network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-28T20:40:07Z is after 2025-08-24T17:21:41Z" Jan 28 20:40:07 crc kubenswrapper[4746]: I0128 20:40:07.458048 4746 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-28T20:39:53Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-28T20:39:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-28T20:39:53Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5dff3e2075b8bbfb6af77c72362ccc37e1d7810a2d97c096cf501681c98699ab\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-28T20:39:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://772e165efa6d865a669c172958e0aea3112146637e354e4587b7e664454112ad\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af
0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-28T20:39:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-28T20:40:07Z is after 2025-08-24T17:21:41Z" Jan 28 20:40:07 crc kubenswrapper[4746]: I0128 20:40:07.472121 4746 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-gcrxx" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"1c3c93a1-7ddf-4339-9ca3-79f3753943b4\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-28T20:39:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-28T20:39:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-28T20:39:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-28T20:39:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5bd8d8214aa8e85e50747643b6354fb6e026866c926310fb2dd79760f916f468\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-28T20:39:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wn6hr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podI
Ps\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-28T20:39:52Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-gcrxx\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-28T20:40:07Z is after 2025-08-24T17:21:41Z" Jan 28 20:40:07 crc kubenswrapper[4746]: I0128 20:40:07.474943 4746 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 28 20:40:07 crc kubenswrapper[4746]: I0128 20:40:07.475024 4746 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 28 20:40:07 crc kubenswrapper[4746]: I0128 20:40:07.475047 4746 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 28 20:40:07 crc kubenswrapper[4746]: I0128 20:40:07.475103 4746 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 28 20:40:07 crc kubenswrapper[4746]: I0128 20:40:07.475126 4746 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-28T20:40:07Z","lastTransitionTime":"2026-01-28T20:40:07Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 28 20:40:07 crc kubenswrapper[4746]: I0128 20:40:07.498051 4746 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-8vmvh" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"c4d15639-62fb-41b7-a1d4-6f51f3af6d99\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-28T20:39:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-28T20:39:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-28T20:39:53Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-28T20:39:53Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://2637734601bf1e43535b70bd62cc6aea7fc9f62d0d2a33c83602e7065a793be1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-28T20:39:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-kgrqs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://73e82be84afa7be760acda86ebf6c956d558dcbd4f37f049fba84e08c75fcd9f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-28T20:39:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\
\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-kgrqs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://17d09f7f46014da3713f76f111bca0a7ce8a24a154aa6144291d760d85e7d3ec\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-28T20:39:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-kgrqs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://b7ad895d0ff1ad570136d96671e1bde5c62882af50a407945ef0e7bc52727b88\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-28T20:39:54Z\\\"}},\\\"volumeMounts\\\":[{
\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-kgrqs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://af842ba33fdf06048038dd3e12c59f7a9d0867aa28acee1f2cc4571e7db28b88\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-28T20:39:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-kgrqs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://57a776a1daf1b17eb44d9fa09fe2f7ec01f647aa0a40d0d571736d2f7c2f8e4d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\"
,\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-28T20:39:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-kgrqs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://b8dbe081b6eaa82770425f55f41842f6e94b364cccd11dfd0a2c67d00e0863e9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b8dbe081b6eaa82770425f55f41842f6e94b364cccd11dfd0a2c67d00e0863e9\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2026-01-28T20:40:03Z\\\",\\\"message\\\":\\\"8s.io/network-policy-api/pkg/client/informers/externalversions/factory.go:141\\\\nI0128 20:40:03.124373 6172 handler.go:190] Sending *v1.Node event handler 2 for removal\\\\nI0128 20:40:03.124394 6172 handler.go:190] Sending *v1.Node event handler 7 for removal\\\\nI0128 20:40:03.124408 6172 handler.go:190] Sending *v1.EgressFirewall event handler 
9 for removal\\\\nI0128 20:40:03.124440 6172 handler.go:190] Sending *v1.Pod event handler 3 for removal\\\\nI0128 20:40:03.124452 6172 handler.go:190] Sending *v1.Pod event handler 6 for removal\\\\nI0128 20:40:03.124447 6172 handler.go:208] Removed *v1.Node event handler 2\\\\nI0128 20:40:03.124467 6172 handler.go:190] Sending *v1.NetworkPolicy event handler 4 for removal\\\\nI0128 20:40:03.124473 6172 handler.go:208] Removed *v1.Node event handler 7\\\\nI0128 20:40:03.124475 6172 handler.go:208] Removed *v1.EgressFirewall event handler 9\\\\nI0128 20:40:03.124488 6172 handler.go:208] Removed *v1.Pod event handler 3\\\\nI0128 20:40:03.124488 6172 handler.go:190] Sending *v1.EgressIP event handler 8 for removal\\\\nI0128 20:40:03.124508 6172 handler.go:208] Removed *v1.NetworkPolicy event handler 4\\\\nI0128 20:40:03.124525 6172 handler.go:208] Removed *v1.Pod event handler 6\\\\nI0128 20:40:03.124578 6172 factory.go:656] Stopping watch factory\\\\nI0128 20:40:03.124590 6172 handler.go:208] Removed *v1.EgressIP ev\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-01-28T20:40:02Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 10s restarting failed container=ovnkube-controller 
pod=ovnkube-node-8vmvh_openshift-ovn-kubernetes(c4d15639-62fb-41b7-a1d4-6f51f3af6d99)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kube
rnetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-kgrqs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://cdd60109dba5f45418dd5bc29c649a50aae8479fa77fc3dc50ca4f7c3042e3fc\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-28T20:39:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-kgrqs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://45e59dfe364d511e17b8264a31157501de3ed23e7125d79979df83d328242328\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://45e59dfe364d511e17
b8264a31157501de3ed23e7125d79979df83d328242328\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-28T20:39:53Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-28T20:39:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-kgrqs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-28T20:39:53Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-8vmvh\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-28T20:40:07Z is after 2025-08-24T17:21:41Z" Jan 28 20:40:07 crc kubenswrapper[4746]: I0128 20:40:07.513715 4746 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-28T20:39:52Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-28T20:39:52Z\\\",\\\"message\\\":\\\"containers with unready status: 
[networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-28T20:40:07Z is after 2025-08-24T17:21:41Z" Jan 28 20:40:07 crc kubenswrapper[4746]: I0128 20:40:07.527886 4746 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-28T20:39:57Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-28T20:39:57Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-28T20:39:57Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://83a275ec9868f2fd9e24135d5a05c27caf843340c144070952291bb6a3715035\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-28T20:39:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has 
expired or is not yet valid: current time 2026-01-28T20:40:07Z is after 2025-08-24T17:21:41Z" Jan 28 20:40:07 crc kubenswrapper[4746]: I0128 20:40:07.545473 4746 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-6wrnw" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"6dc8b546-9734-4082-b2b3-2bafe3f1564d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-28T20:39:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-28T20:39:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-28T20:39:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-28T20:39:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://50ec1ea913ec63a096032bd968bda5c5c8879e16a08e6a57727750517d9af038\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-28T20:39:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-r
bac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-z84f9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://66d250466155027405bf90c4e5ed09388238d4cee63604e28486a16778f9d188\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-28T20:39:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-z84f9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-28T20:39:53Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-6wrnw\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-28T20:40:07Z is after 2025-08-24T17:21:41Z" Jan 28 20:40:07 crc kubenswrapper[4746]: I0128 20:40:07.562267 4746 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-ht6hp" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"beb2b795-6bf4-4d38-89f7-bcb5512c3e61\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-28T20:39:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-28T20:40:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-28T20:40:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-28T20:40:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://564d42c4c25f218c1a1f52449d2f3d941c1dad920c8100d1d908cadf7cf46607\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-28T20:40:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rr62b\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b2d8bf207108ac2304cd04e34cd9843fcb755ac8610a3f88448538d1e99359f6\
\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b2d8bf207108ac2304cd04e34cd9843fcb755ac8610a3f88448538d1e99359f6\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-28T20:39:54Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-28T20:39:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rr62b\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://40c78d39cb70eba245771f4d85d2c8bc7b77427ea216ad3b89749d24fd550098\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://40c78d39cb70eba245771f4d85d2c8bc7b77427ea216ad3b89749d24fd550098\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-28T20:39:55Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-28T20:39:55Z\\\"}},\\\
"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rr62b\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://498028ca0bd260ca2c4d7b0231ea54a5a6ff5b302f720fbd209a4f640507258b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://498028ca0bd260ca2c4d7b0231ea54a5a6ff5b302f720fbd209a4f640507258b\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-28T20:39:56Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-28T20:39:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rr62b\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://661d4
a9031b6789f38ed1ac8e3c40d72182709130e1680ecdad86c07c1f34b7b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://661d4a9031b6789f38ed1ac8e3c40d72182709130e1680ecdad86c07c1f34b7b\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-28T20:39:57Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-28T20:39:57Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rr62b\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://a49ffe0bbe4151671dc8803d7ab61cb9b3a2eae064b81e2830bc074787f4754e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://a49ffe0bbe4151671dc8803d7ab61cb9b3a2eae064b81e2830bc074787f4754e\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-28T20:39:58Z\\\",\\\"reason\\\":\\\"Co
mpleted\\\",\\\"startedAt\\\":\\\"2026-01-28T20:39:58Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rr62b\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://cd521e9ecb53f581b5732a4d018f92007063ce0912e6d760fe3c920218c23b14\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://cd521e9ecb53f581b5732a4d018f92007063ce0912e6d760fe3c920218c23b14\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-28T20:39:59Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-28T20:39:59Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rr62b\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-28T20:39:53Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-ht6hp\": Internal error occurred: failed 
calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-28T20:40:07Z is after 2025-08-24T17:21:41Z" Jan 28 20:40:07 crc kubenswrapper[4746]: I0128 20:40:07.578438 4746 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 28 20:40:07 crc kubenswrapper[4746]: I0128 20:40:07.578505 4746 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 28 20:40:07 crc kubenswrapper[4746]: I0128 20:40:07.578522 4746 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 28 20:40:07 crc kubenswrapper[4746]: I0128 20:40:07.578551 4746 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 28 20:40:07 crc kubenswrapper[4746]: I0128 20:40:07.578572 4746 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-28T20:40:07Z","lastTransitionTime":"2026-01-28T20:40:07Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 28 20:40:07 crc kubenswrapper[4746]: I0128 20:40:07.582875 4746 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-qhpvf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"cdf26de0-b602-4bdf-b492-65b3b6b31434\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-28T20:39:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-28T20:39:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-28T20:39:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-28T20:39:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f80345dfd4de46e278611370e6c337968cb1ba7ed8275457deea5871704f8367\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-28T20:39:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\
",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jnpsl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-28T20:39:53Z\\\"}}\" for pod \"openshift-multus\"/\"multus-qhpvf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-28T20:40:07Z 
is after 2025-08-24T17:21:41Z" Jan 28 20:40:07 crc kubenswrapper[4746]: I0128 20:40:07.596000 4746 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-2blg6" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"f60a5487-5012-4cc9-ad94-5dfb4957d74e\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-28T20:40:07Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-28T20:40:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-28T20:40:07Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-28T20:40:07Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-l2www\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-l2www\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-28T20:40:07Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-2blg6\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-28T20:40:07Z is after 2025-08-24T17:21:41Z" Jan 28 20:40:07 crc 
kubenswrapper[4746]: I0128 20:40:07.610736 4746 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"64499f0d-5c10-43a9-b479-9b6c7ef2fb9c\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-28T20:39:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-28T20:39:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-28T20:39:32Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-28T20:39:32Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-28T20:39:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://9a304b12b4a1e30cbbbb16aa4f91514f1b1a64c716cf8ac023803a279b310f9c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-28T20:39:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://6af41f85c56be6c24127bec0aee8e02562dbb04bd22e57c6887f6f5e914a7530\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-28T20:39:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri
-o://f31e9e80576ec4eb59ec92039601fd94bd323d29493c353f9ec6b9c61187b0d3\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-28T20:39:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://56b250a192cfa698ccd7ac8233b75b9acbf2347549c712b47ccc1b89f144aa83\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://56b250a192cfa698ccd7ac8233b75b9acbf2347549c712b47ccc1b89f144aa83\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-01-28T20:39:52Z\\\",\\\"message\\\":\\\"le observer\\\\nW0128 20:39:52.695680 1 builder.go:272] unable to get owner reference (falling back to namespace): pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI0128 20:39:52.695807 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0128 20:39:52.696767 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" 
name=\\\\\\\"serving-cert::/tmp/serving-cert-2500116620/tls.crt::/tmp/serving-cert-2500116620/tls.key\\\\\\\"\\\\nI0128 20:39:52.900914 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI0128 20:39:52.915714 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI0128 20:39:52.915750 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI0128 20:39:52.916035 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI0128 20:39:52.916045 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI0128 20:39:52.925040 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW0128 20:39:52.925098 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0128 20:39:52.925104 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0128 20:39:52.925109 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW0128 20:39:52.925113 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW0128 20:39:52.925116 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW0128 20:39:52.925119 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nI0128 20:39:52.925416 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nF0128 20:39:52.926926 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-01-28T20:39:47Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 10s restarting failed 
container=kube-apiserver-check-endpoints pod=kube-apiserver-crc_openshift-kube-apiserver(f4b27818a5e8e43d0dc095d08835c792)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://03a94dee0b9663441e0ecec16b121c5be4e5c557f17c929b8edfb0d7ba00c3d5\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-28T20:39:35Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c5808a729190035a73529ac58efb0790f2d9c3b98c5f1a8017389e4ca1738c4c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://c5808a729190035a73529ac58efb0790f2d9c3b98c5f1a8017389e4ca1738c4c\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-28T20:39:33Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-28T20:39:33Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"n
ame\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-28T20:39:32Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-28T20:40:07Z is after 2025-08-24T17:21:41Z" Jan 28 20:40:07 crc kubenswrapper[4746]: I0128 20:40:07.626193 4746 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-28T20:39:54Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-28T20:39:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-28T20:39:54Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://2f64131d62eca6c3a42e9177d14fa39ed6435e536f60ef5d25ff6bf061d2cc0a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\
":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-28T20:39:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-28T20:40:07Z is after 2025-08-24T17:21:41Z" Jan 28 20:40:07 crc kubenswrapper[4746]: I0128 20:40:07.682069 4746 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 28 20:40:07 crc kubenswrapper[4746]: I0128 20:40:07.682488 4746 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 28 20:40:07 crc kubenswrapper[4746]: I0128 20:40:07.682561 4746 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 28 20:40:07 crc kubenswrapper[4746]: I0128 20:40:07.682631 4746 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 28 20:40:07 crc kubenswrapper[4746]: I0128 20:40:07.682712 4746 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-28T20:40:07Z","lastTransitionTime":"2026-01-28T20:40:07Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: 
no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 28 20:40:07 crc kubenswrapper[4746]: I0128 20:40:07.750213 4746 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/f60a5487-5012-4cc9-ad94-5dfb4957d74e-metrics-certs\") pod \"network-metrics-daemon-2blg6\" (UID: \"f60a5487-5012-4cc9-ad94-5dfb4957d74e\") " pod="openshift-multus/network-metrics-daemon-2blg6" Jan 28 20:40:07 crc kubenswrapper[4746]: E0128 20:40:07.750364 4746 secret.go:188] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered Jan 28 20:40:07 crc kubenswrapper[4746]: E0128 20:40:07.750849 4746 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/f60a5487-5012-4cc9-ad94-5dfb4957d74e-metrics-certs podName:f60a5487-5012-4cc9-ad94-5dfb4957d74e nodeName:}" failed. No retries permitted until 2026-01-28 20:40:08.750827236 +0000 UTC m=+36.707013590 (durationBeforeRetry 1s). 
Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/f60a5487-5012-4cc9-ad94-5dfb4957d74e-metrics-certs") pod "network-metrics-daemon-2blg6" (UID: "f60a5487-5012-4cc9-ad94-5dfb4957d74e") : object "openshift-multus"/"metrics-daemon-secret" not registered Jan 28 20:40:07 crc kubenswrapper[4746]: I0128 20:40:07.787800 4746 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 28 20:40:07 crc kubenswrapper[4746]: I0128 20:40:07.787891 4746 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 28 20:40:07 crc kubenswrapper[4746]: I0128 20:40:07.787917 4746 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 28 20:40:07 crc kubenswrapper[4746]: I0128 20:40:07.787958 4746 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 28 20:40:07 crc kubenswrapper[4746]: I0128 20:40:07.788001 4746 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-28T20:40:07Z","lastTransitionTime":"2026-01-28T20:40:07Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 28 20:40:07 crc kubenswrapper[4746]: I0128 20:40:07.822249 4746 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2025-12-14 12:13:51.212252092 +0000 UTC Jan 28 20:40:07 crc kubenswrapper[4746]: I0128 20:40:07.835673 4746 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Jan 28 20:40:07 crc kubenswrapper[4746]: E0128 20:40:07.835837 4746 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Jan 28 20:40:07 crc kubenswrapper[4746]: I0128 20:40:07.891778 4746 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 28 20:40:07 crc kubenswrapper[4746]: I0128 20:40:07.891835 4746 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 28 20:40:07 crc kubenswrapper[4746]: I0128 20:40:07.891853 4746 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 28 20:40:07 crc kubenswrapper[4746]: I0128 20:40:07.891884 4746 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 28 20:40:07 crc kubenswrapper[4746]: I0128 20:40:07.891899 4746 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-28T20:40:07Z","lastTransitionTime":"2026-01-28T20:40:07Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 28 20:40:07 crc kubenswrapper[4746]: I0128 20:40:07.995218 4746 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 28 20:40:07 crc kubenswrapper[4746]: I0128 20:40:07.995299 4746 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 28 20:40:07 crc kubenswrapper[4746]: I0128 20:40:07.995316 4746 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 28 20:40:07 crc kubenswrapper[4746]: I0128 20:40:07.995330 4746 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 28 20:40:07 crc kubenswrapper[4746]: I0128 20:40:07.995341 4746 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-28T20:40:07Z","lastTransitionTime":"2026-01-28T20:40:07Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 28 20:40:08 crc kubenswrapper[4746]: I0128 20:40:08.099695 4746 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 28 20:40:08 crc kubenswrapper[4746]: I0128 20:40:08.099743 4746 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 28 20:40:08 crc kubenswrapper[4746]: I0128 20:40:08.099753 4746 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 28 20:40:08 crc kubenswrapper[4746]: I0128 20:40:08.099772 4746 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 28 20:40:08 crc kubenswrapper[4746]: I0128 20:40:08.099784 4746 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-28T20:40:08Z","lastTransitionTime":"2026-01-28T20:40:08Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 28 20:40:08 crc kubenswrapper[4746]: I0128 20:40:08.203468 4746 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 28 20:40:08 crc kubenswrapper[4746]: I0128 20:40:08.204102 4746 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 28 20:40:08 crc kubenswrapper[4746]: I0128 20:40:08.204132 4746 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 28 20:40:08 crc kubenswrapper[4746]: I0128 20:40:08.204168 4746 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 28 20:40:08 crc kubenswrapper[4746]: I0128 20:40:08.204188 4746 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-28T20:40:08Z","lastTransitionTime":"2026-01-28T20:40:08Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 28 20:40:08 crc kubenswrapper[4746]: I0128 20:40:08.307678 4746 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 28 20:40:08 crc kubenswrapper[4746]: I0128 20:40:08.307740 4746 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 28 20:40:08 crc kubenswrapper[4746]: I0128 20:40:08.307752 4746 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 28 20:40:08 crc kubenswrapper[4746]: I0128 20:40:08.307781 4746 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 28 20:40:08 crc kubenswrapper[4746]: I0128 20:40:08.307796 4746 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-28T20:40:08Z","lastTransitionTime":"2026-01-28T20:40:08Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 28 20:40:08 crc kubenswrapper[4746]: I0128 20:40:08.411240 4746 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 28 20:40:08 crc kubenswrapper[4746]: I0128 20:40:08.411313 4746 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 28 20:40:08 crc kubenswrapper[4746]: I0128 20:40:08.411323 4746 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 28 20:40:08 crc kubenswrapper[4746]: I0128 20:40:08.411337 4746 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 28 20:40:08 crc kubenswrapper[4746]: I0128 20:40:08.411347 4746 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-28T20:40:08Z","lastTransitionTime":"2026-01-28T20:40:08Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 28 20:40:08 crc kubenswrapper[4746]: I0128 20:40:08.515142 4746 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 28 20:40:08 crc kubenswrapper[4746]: I0128 20:40:08.515211 4746 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 28 20:40:08 crc kubenswrapper[4746]: I0128 20:40:08.515223 4746 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 28 20:40:08 crc kubenswrapper[4746]: I0128 20:40:08.515243 4746 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 28 20:40:08 crc kubenswrapper[4746]: I0128 20:40:08.515294 4746 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-28T20:40:08Z","lastTransitionTime":"2026-01-28T20:40:08Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 28 20:40:08 crc kubenswrapper[4746]: I0128 20:40:08.558762 4746 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Jan 28 20:40:08 crc kubenswrapper[4746]: I0128 20:40:08.558919 4746 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nginx-conf\" (UniqueName: \"kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Jan 28 20:40:08 crc kubenswrapper[4746]: I0128 20:40:08.558977 4746 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-cqllr\" (UniqueName: \"kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr\") pod \"network-check-target-xd92c\" (UID: \"3b6479f0-333b-4a96-9adf-2099afdc2447\") " pod="openshift-network-diagnostics/network-check-target-xd92c" Jan 28 20:40:08 crc kubenswrapper[4746]: I0128 20:40:08.559008 4746 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"networking-console-plugin-cert\" (UniqueName: \"kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Jan 28 20:40:08 crc kubenswrapper[4746]: E0128 20:40:08.559109 4746 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 
podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-01-28 20:40:24.559058112 +0000 UTC m=+52.515244466 (durationBeforeRetry 16s). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Jan 28 20:40:08 crc kubenswrapper[4746]: E0128 20:40:08.559193 4746 configmap.go:193] Couldn't get configMap openshift-network-console/networking-console-plugin: object "openshift-network-console"/"networking-console-plugin" not registered Jan 28 20:40:08 crc kubenswrapper[4746]: E0128 20:40:08.559212 4746 secret.go:188] Couldn't get secret openshift-network-console/networking-console-plugin-cert: object "openshift-network-console"/"networking-console-plugin-cert" not registered Jan 28 20:40:08 crc kubenswrapper[4746]: E0128 20:40:08.559264 4746 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Jan 28 20:40:08 crc kubenswrapper[4746]: E0128 20:40:08.559560 4746 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Jan 28 20:40:08 crc kubenswrapper[4746]: E0128 20:40:08.559582 4746 projected.go:194] Error preparing data for projected volume kube-api-access-cqllr for pod openshift-network-diagnostics/network-check-target-xd92c: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Jan 28 20:40:08 crc kubenswrapper[4746]: 
E0128 20:40:08.559442 4746 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. No retries permitted until 2026-01-28 20:40:24.559394982 +0000 UTC m=+52.515581376 (durationBeforeRetry 16s). Error: MountVolume.SetUp failed for volume "nginx-conf" (UniqueName: "kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin" not registered Jan 28 20:40:08 crc kubenswrapper[4746]: E0128 20:40:08.559672 4746 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. No retries permitted until 2026-01-28 20:40:24.559644909 +0000 UTC m=+52.515831303 (durationBeforeRetry 16s). Error: MountVolume.SetUp failed for volume "networking-console-plugin-cert" (UniqueName: "kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin-cert" not registered Jan 28 20:40:08 crc kubenswrapper[4746]: E0128 20:40:08.559705 4746 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr podName:3b6479f0-333b-4a96-9adf-2099afdc2447 nodeName:}" failed. No retries permitted until 2026-01-28 20:40:24.55969329 +0000 UTC m=+52.515879674 (durationBeforeRetry 16s). 
Error: MountVolume.SetUp failed for volume "kube-api-access-cqllr" (UniqueName: "kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr") pod "network-check-target-xd92c" (UID: "3b6479f0-333b-4a96-9adf-2099afdc2447") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Jan 28 20:40:08 crc kubenswrapper[4746]: I0128 20:40:08.617748 4746 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 28 20:40:08 crc kubenswrapper[4746]: I0128 20:40:08.617843 4746 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 28 20:40:08 crc kubenswrapper[4746]: I0128 20:40:08.617861 4746 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 28 20:40:08 crc kubenswrapper[4746]: I0128 20:40:08.617892 4746 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 28 20:40:08 crc kubenswrapper[4746]: I0128 20:40:08.617912 4746 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-28T20:40:08Z","lastTransitionTime":"2026-01-28T20:40:08Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 28 20:40:08 crc kubenswrapper[4746]: I0128 20:40:08.659600 4746 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-s2dwl\" (UniqueName: \"kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl\") pod \"network-check-source-55646444c4-trplf\" (UID: \"9d751cbb-f2e2-430d-9754-c882a5e924a5\") " pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Jan 28 20:40:08 crc kubenswrapper[4746]: E0128 20:40:08.659852 4746 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Jan 28 20:40:08 crc kubenswrapper[4746]: E0128 20:40:08.659900 4746 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Jan 28 20:40:08 crc kubenswrapper[4746]: E0128 20:40:08.659921 4746 projected.go:194] Error preparing data for projected volume kube-api-access-s2dwl for pod openshift-network-diagnostics/network-check-source-55646444c4-trplf: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Jan 28 20:40:08 crc kubenswrapper[4746]: E0128 20:40:08.660020 4746 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl podName:9d751cbb-f2e2-430d-9754-c882a5e924a5 nodeName:}" failed. No retries permitted until 2026-01-28 20:40:24.659994032 +0000 UTC m=+52.616180416 (durationBeforeRetry 16s). 
Error: MountVolume.SetUp failed for volume "kube-api-access-s2dwl" (UniqueName: "kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl") pod "network-check-source-55646444c4-trplf" (UID: "9d751cbb-f2e2-430d-9754-c882a5e924a5") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Jan 28 20:40:08 crc kubenswrapper[4746]: I0128 20:40:08.720366 4746 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 28 20:40:08 crc kubenswrapper[4746]: I0128 20:40:08.720414 4746 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 28 20:40:08 crc kubenswrapper[4746]: I0128 20:40:08.720432 4746 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 28 20:40:08 crc kubenswrapper[4746]: I0128 20:40:08.720454 4746 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 28 20:40:08 crc kubenswrapper[4746]: I0128 20:40:08.720472 4746 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-28T20:40:08Z","lastTransitionTime":"2026-01-28T20:40:08Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 28 20:40:08 crc kubenswrapper[4746]: I0128 20:40:08.760610 4746 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/f60a5487-5012-4cc9-ad94-5dfb4957d74e-metrics-certs\") pod \"network-metrics-daemon-2blg6\" (UID: \"f60a5487-5012-4cc9-ad94-5dfb4957d74e\") " pod="openshift-multus/network-metrics-daemon-2blg6" Jan 28 20:40:08 crc kubenswrapper[4746]: E0128 20:40:08.760760 4746 secret.go:188] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered Jan 28 20:40:08 crc kubenswrapper[4746]: E0128 20:40:08.760823 4746 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/f60a5487-5012-4cc9-ad94-5dfb4957d74e-metrics-certs podName:f60a5487-5012-4cc9-ad94-5dfb4957d74e nodeName:}" failed. No retries permitted until 2026-01-28 20:40:10.76080806 +0000 UTC m=+38.716994414 (durationBeforeRetry 2s). 
Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/f60a5487-5012-4cc9-ad94-5dfb4957d74e-metrics-certs") pod "network-metrics-daemon-2blg6" (UID: "f60a5487-5012-4cc9-ad94-5dfb4957d74e") : object "openshift-multus"/"metrics-daemon-secret" not registered Jan 28 20:40:08 crc kubenswrapper[4746]: I0128 20:40:08.822422 4746 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2025-11-20 08:22:17.64665375 +0000 UTC Jan 28 20:40:08 crc kubenswrapper[4746]: I0128 20:40:08.824748 4746 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 28 20:40:08 crc kubenswrapper[4746]: I0128 20:40:08.824782 4746 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 28 20:40:08 crc kubenswrapper[4746]: I0128 20:40:08.824793 4746 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 28 20:40:08 crc kubenswrapper[4746]: I0128 20:40:08.824810 4746 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 28 20:40:08 crc kubenswrapper[4746]: I0128 20:40:08.824820 4746 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-28T20:40:08Z","lastTransitionTime":"2026-01-28T20:40:08Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 28 20:40:08 crc kubenswrapper[4746]: I0128 20:40:08.835732 4746 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Jan 28 20:40:08 crc kubenswrapper[4746]: I0128 20:40:08.836106 4746 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-2blg6" Jan 28 20:40:08 crc kubenswrapper[4746]: E0128 20:40:08.836229 4746 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Jan 28 20:40:08 crc kubenswrapper[4746]: I0128 20:40:08.836384 4746 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Jan 28 20:40:08 crc kubenswrapper[4746]: E0128 20:40:08.837487 4746 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-2blg6" podUID="f60a5487-5012-4cc9-ad94-5dfb4957d74e" Jan 28 20:40:08 crc kubenswrapper[4746]: E0128 20:40:08.837715 4746 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Jan 28 20:40:08 crc kubenswrapper[4746]: I0128 20:40:08.926966 4746 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 28 20:40:08 crc kubenswrapper[4746]: I0128 20:40:08.927007 4746 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 28 20:40:08 crc kubenswrapper[4746]: I0128 20:40:08.927017 4746 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 28 20:40:08 crc kubenswrapper[4746]: I0128 20:40:08.927032 4746 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 28 20:40:08 crc kubenswrapper[4746]: I0128 20:40:08.927042 4746 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-28T20:40:08Z","lastTransitionTime":"2026-01-28T20:40:08Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 28 20:40:09 crc kubenswrapper[4746]: I0128 20:40:09.029570 4746 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 28 20:40:09 crc kubenswrapper[4746]: I0128 20:40:09.029617 4746 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 28 20:40:09 crc kubenswrapper[4746]: I0128 20:40:09.029629 4746 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 28 20:40:09 crc kubenswrapper[4746]: I0128 20:40:09.029647 4746 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 28 20:40:09 crc kubenswrapper[4746]: I0128 20:40:09.029658 4746 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-28T20:40:09Z","lastTransitionTime":"2026-01-28T20:40:09Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 28 20:40:09 crc kubenswrapper[4746]: I0128 20:40:09.134514 4746 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 28 20:40:09 crc kubenswrapper[4746]: I0128 20:40:09.134554 4746 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 28 20:40:09 crc kubenswrapper[4746]: I0128 20:40:09.134562 4746 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 28 20:40:09 crc kubenswrapper[4746]: I0128 20:40:09.134579 4746 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 28 20:40:09 crc kubenswrapper[4746]: I0128 20:40:09.134588 4746 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-28T20:40:09Z","lastTransitionTime":"2026-01-28T20:40:09Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 28 20:40:09 crc kubenswrapper[4746]: I0128 20:40:09.237799 4746 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 28 20:40:09 crc kubenswrapper[4746]: I0128 20:40:09.237851 4746 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 28 20:40:09 crc kubenswrapper[4746]: I0128 20:40:09.237861 4746 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 28 20:40:09 crc kubenswrapper[4746]: I0128 20:40:09.237877 4746 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 28 20:40:09 crc kubenswrapper[4746]: I0128 20:40:09.237886 4746 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-28T20:40:09Z","lastTransitionTime":"2026-01-28T20:40:09Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 28 20:40:09 crc kubenswrapper[4746]: I0128 20:40:09.342013 4746 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 28 20:40:09 crc kubenswrapper[4746]: I0128 20:40:09.342124 4746 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 28 20:40:09 crc kubenswrapper[4746]: I0128 20:40:09.342154 4746 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 28 20:40:09 crc kubenswrapper[4746]: I0128 20:40:09.342193 4746 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 28 20:40:09 crc kubenswrapper[4746]: I0128 20:40:09.342215 4746 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-28T20:40:09Z","lastTransitionTime":"2026-01-28T20:40:09Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 28 20:40:09 crc kubenswrapper[4746]: I0128 20:40:09.445155 4746 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 28 20:40:09 crc kubenswrapper[4746]: I0128 20:40:09.445207 4746 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 28 20:40:09 crc kubenswrapper[4746]: I0128 20:40:09.445224 4746 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 28 20:40:09 crc kubenswrapper[4746]: I0128 20:40:09.445245 4746 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 28 20:40:09 crc kubenswrapper[4746]: I0128 20:40:09.445263 4746 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-28T20:40:09Z","lastTransitionTime":"2026-01-28T20:40:09Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 28 20:40:09 crc kubenswrapper[4746]: I0128 20:40:09.456486 4746 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 28 20:40:09 crc kubenswrapper[4746]: I0128 20:40:09.456575 4746 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 28 20:40:09 crc kubenswrapper[4746]: I0128 20:40:09.456594 4746 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 28 20:40:09 crc kubenswrapper[4746]: I0128 20:40:09.456628 4746 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 28 20:40:09 crc kubenswrapper[4746]: I0128 20:40:09.456651 4746 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-28T20:40:09Z","lastTransitionTime":"2026-01-28T20:40:09Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 28 20:40:09 crc kubenswrapper[4746]: E0128 20:40:09.473436 4746 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-01-28T20:40:09Z\\\",\\\"lastTransitionTime\\\":\\\"2026-01-28T20:40:09Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-01-28T20:40:09Z\\\",\\\"lastTransitionTime\\\":\\\"2026-01-28T20:40:09Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-01-28T20:40:09Z\\\",\\\"lastTransitionTime\\\":\\\"2026-01-28T20:40:09Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-01-28T20:40:09Z\\\",\\\"lastTransitionTime\\\":\\\"2026-01-28T20:40:09Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"349dd4ef-f3ea-4c41-bfa2-75ea02498ab0\\\",\\\"systemUUID\\\":\\\"e89ecf32-8beb-4b41-b6df-0f1293ce0213\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-28T20:40:09Z is after 2025-08-24T17:21:41Z" Jan 28 20:40:09 crc kubenswrapper[4746]: I0128 20:40:09.479464 4746 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 28 20:40:09 crc kubenswrapper[4746]: I0128 20:40:09.479543 4746 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 28 20:40:09 crc kubenswrapper[4746]: I0128 20:40:09.479565 4746 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 28 20:40:09 crc kubenswrapper[4746]: I0128 20:40:09.479592 4746 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 28 20:40:09 crc kubenswrapper[4746]: I0128 20:40:09.479610 4746 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-28T20:40:09Z","lastTransitionTime":"2026-01-28T20:40:09Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 28 20:40:09 crc kubenswrapper[4746]: E0128 20:40:09.498857 4746 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-01-28T20:40:09Z\\\",\\\"lastTransitionTime\\\":\\\"2026-01-28T20:40:09Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-01-28T20:40:09Z\\\",\\\"lastTransitionTime\\\":\\\"2026-01-28T20:40:09Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-01-28T20:40:09Z\\\",\\\"lastTransitionTime\\\":\\\"2026-01-28T20:40:09Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-01-28T20:40:09Z\\\",\\\"lastTransitionTime\\\":\\\"2026-01-28T20:40:09Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"349dd4ef-f3ea-4c41-bfa2-75ea02498ab0\\\",\\\"systemUUID\\\":\\\"e89ecf32-8beb-4b41-b6df-0f1293ce0213\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-28T20:40:09Z is after 2025-08-24T17:21:41Z" Jan 28 20:40:09 crc kubenswrapper[4746]: I0128 20:40:09.505730 4746 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 28 20:40:09 crc kubenswrapper[4746]: I0128 20:40:09.505808 4746 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 28 20:40:09 crc kubenswrapper[4746]: I0128 20:40:09.505829 4746 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 28 20:40:09 crc kubenswrapper[4746]: I0128 20:40:09.505863 4746 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 28 20:40:09 crc kubenswrapper[4746]: I0128 20:40:09.505886 4746 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-28T20:40:09Z","lastTransitionTime":"2026-01-28T20:40:09Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 28 20:40:09 crc kubenswrapper[4746]: E0128 20:40:09.524839 4746 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-01-28T20:40:09Z\\\",\\\"lastTransitionTime\\\":\\\"2026-01-28T20:40:09Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-01-28T20:40:09Z\\\",\\\"lastTransitionTime\\\":\\\"2026-01-28T20:40:09Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-01-28T20:40:09Z\\\",\\\"lastTransitionTime\\\":\\\"2026-01-28T20:40:09Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-01-28T20:40:09Z\\\",\\\"lastTransitionTime\\\":\\\"2026-01-28T20:40:09Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"349dd4ef-f3ea-4c41-bfa2-75ea02498ab0\\\",\\\"systemUUID\\\":\\\"e89ecf32-8beb-4b41-b6df-0f1293ce0213\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-28T20:40:09Z is after 2025-08-24T17:21:41Z" Jan 28 20:40:09 crc kubenswrapper[4746]: I0128 20:40:09.531397 4746 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 28 20:40:09 crc kubenswrapper[4746]: I0128 20:40:09.531452 4746 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 28 20:40:09 crc kubenswrapper[4746]: I0128 20:40:09.531465 4746 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 28 20:40:09 crc kubenswrapper[4746]: I0128 20:40:09.531485 4746 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 28 20:40:09 crc kubenswrapper[4746]: I0128 20:40:09.531500 4746 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-28T20:40:09Z","lastTransitionTime":"2026-01-28T20:40:09Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 28 20:40:09 crc kubenswrapper[4746]: E0128 20:40:09.548981 4746 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-01-28T20:40:09Z\\\",\\\"lastTransitionTime\\\":\\\"2026-01-28T20:40:09Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-01-28T20:40:09Z\\\",\\\"lastTransitionTime\\\":\\\"2026-01-28T20:40:09Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-01-28T20:40:09Z\\\",\\\"lastTransitionTime\\\":\\\"2026-01-28T20:40:09Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-01-28T20:40:09Z\\\",\\\"lastTransitionTime\\\":\\\"2026-01-28T20:40:09Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"349dd4ef-f3ea-4c41-bfa2-75ea02498ab0\\\",\\\"systemUUID\\\":\\\"e89ecf32-8beb-4b41-b6df-0f1293ce0213\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-28T20:40:09Z is after 2025-08-24T17:21:41Z" Jan 28 20:40:09 crc kubenswrapper[4746]: I0128 20:40:09.553899 4746 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 28 20:40:09 crc kubenswrapper[4746]: I0128 20:40:09.553948 4746 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 28 20:40:09 crc kubenswrapper[4746]: I0128 20:40:09.553961 4746 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 28 20:40:09 crc kubenswrapper[4746]: I0128 20:40:09.553978 4746 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 28 20:40:09 crc kubenswrapper[4746]: I0128 20:40:09.553992 4746 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-28T20:40:09Z","lastTransitionTime":"2026-01-28T20:40:09Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 28 20:40:09 crc kubenswrapper[4746]: E0128 20:40:09.576293 4746 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-01-28T20:40:09Z\\\",\\\"lastTransitionTime\\\":\\\"2026-01-28T20:40:09Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-01-28T20:40:09Z\\\",\\\"lastTransitionTime\\\":\\\"2026-01-28T20:40:09Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-01-28T20:40:09Z\\\",\\\"lastTransitionTime\\\":\\\"2026-01-28T20:40:09Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-01-28T20:40:09Z\\\",\\\"lastTransitionTime\\\":\\\"2026-01-28T20:40:09Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"349dd4ef-f3ea-4c41-bfa2-75ea02498ab0\\\",\\\"systemUUID\\\":\\\"e89ecf32-8beb-4b41-b6df-0f1293ce0213\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-28T20:40:09Z is after 2025-08-24T17:21:41Z" Jan 28 20:40:09 crc kubenswrapper[4746]: E0128 20:40:09.576554 4746 kubelet_node_status.go:572] "Unable to update node status" err="update node status exceeds retry count" Jan 28 20:40:09 crc kubenswrapper[4746]: I0128 20:40:09.579152 4746 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 28 20:40:09 crc kubenswrapper[4746]: I0128 20:40:09.579226 4746 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 28 20:40:09 crc kubenswrapper[4746]: I0128 20:40:09.579246 4746 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 28 20:40:09 crc kubenswrapper[4746]: I0128 20:40:09.579273 4746 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 28 20:40:09 crc kubenswrapper[4746]: I0128 20:40:09.579289 4746 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-28T20:40:09Z","lastTransitionTime":"2026-01-28T20:40:09Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 28 20:40:09 crc kubenswrapper[4746]: I0128 20:40:09.682789 4746 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 28 20:40:09 crc kubenswrapper[4746]: I0128 20:40:09.682844 4746 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 28 20:40:09 crc kubenswrapper[4746]: I0128 20:40:09.682861 4746 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 28 20:40:09 crc kubenswrapper[4746]: I0128 20:40:09.682893 4746 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 28 20:40:09 crc kubenswrapper[4746]: I0128 20:40:09.682913 4746 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-28T20:40:09Z","lastTransitionTime":"2026-01-28T20:40:09Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 28 20:40:09 crc kubenswrapper[4746]: I0128 20:40:09.785833 4746 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 28 20:40:09 crc kubenswrapper[4746]: I0128 20:40:09.785891 4746 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 28 20:40:09 crc kubenswrapper[4746]: I0128 20:40:09.785909 4746 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 28 20:40:09 crc kubenswrapper[4746]: I0128 20:40:09.785930 4746 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 28 20:40:09 crc kubenswrapper[4746]: I0128 20:40:09.785943 4746 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-28T20:40:09Z","lastTransitionTime":"2026-01-28T20:40:09Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 28 20:40:09 crc kubenswrapper[4746]: I0128 20:40:09.823177 4746 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2026-01-07 04:13:14.265050685 +0000 UTC Jan 28 20:40:09 crc kubenswrapper[4746]: I0128 20:40:09.834804 4746 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Jan 28 20:40:09 crc kubenswrapper[4746]: E0128 20:40:09.835033 4746 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Jan 28 20:40:09 crc kubenswrapper[4746]: I0128 20:40:09.889504 4746 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 28 20:40:09 crc kubenswrapper[4746]: I0128 20:40:09.889590 4746 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 28 20:40:09 crc kubenswrapper[4746]: I0128 20:40:09.889618 4746 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 28 20:40:09 crc kubenswrapper[4746]: I0128 20:40:09.889650 4746 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 28 20:40:09 crc kubenswrapper[4746]: I0128 20:40:09.889728 4746 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-28T20:40:09Z","lastTransitionTime":"2026-01-28T20:40:09Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 28 20:40:09 crc kubenswrapper[4746]: I0128 20:40:09.993299 4746 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 28 20:40:09 crc kubenswrapper[4746]: I0128 20:40:09.993381 4746 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 28 20:40:09 crc kubenswrapper[4746]: I0128 20:40:09.993400 4746 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 28 20:40:09 crc kubenswrapper[4746]: I0128 20:40:09.993426 4746 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 28 20:40:09 crc kubenswrapper[4746]: I0128 20:40:09.993442 4746 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-28T20:40:09Z","lastTransitionTime":"2026-01-28T20:40:09Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 28 20:40:10 crc kubenswrapper[4746]: I0128 20:40:10.097652 4746 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 28 20:40:10 crc kubenswrapper[4746]: I0128 20:40:10.097718 4746 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 28 20:40:10 crc kubenswrapper[4746]: I0128 20:40:10.097740 4746 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 28 20:40:10 crc kubenswrapper[4746]: I0128 20:40:10.097770 4746 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 28 20:40:10 crc kubenswrapper[4746]: I0128 20:40:10.097791 4746 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-28T20:40:10Z","lastTransitionTime":"2026-01-28T20:40:10Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 28 20:40:10 crc kubenswrapper[4746]: I0128 20:40:10.201355 4746 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 28 20:40:10 crc kubenswrapper[4746]: I0128 20:40:10.201458 4746 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 28 20:40:10 crc kubenswrapper[4746]: I0128 20:40:10.201482 4746 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 28 20:40:10 crc kubenswrapper[4746]: I0128 20:40:10.201516 4746 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 28 20:40:10 crc kubenswrapper[4746]: I0128 20:40:10.201538 4746 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-28T20:40:10Z","lastTransitionTime":"2026-01-28T20:40:10Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 28 20:40:10 crc kubenswrapper[4746]: I0128 20:40:10.305381 4746 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 28 20:40:10 crc kubenswrapper[4746]: I0128 20:40:10.305466 4746 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 28 20:40:10 crc kubenswrapper[4746]: I0128 20:40:10.305484 4746 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 28 20:40:10 crc kubenswrapper[4746]: I0128 20:40:10.305513 4746 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 28 20:40:10 crc kubenswrapper[4746]: I0128 20:40:10.305539 4746 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-28T20:40:10Z","lastTransitionTime":"2026-01-28T20:40:10Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 28 20:40:10 crc kubenswrapper[4746]: I0128 20:40:10.409574 4746 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 28 20:40:10 crc kubenswrapper[4746]: I0128 20:40:10.410230 4746 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 28 20:40:10 crc kubenswrapper[4746]: I0128 20:40:10.410332 4746 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 28 20:40:10 crc kubenswrapper[4746]: I0128 20:40:10.410370 4746 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 28 20:40:10 crc kubenswrapper[4746]: I0128 20:40:10.410395 4746 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-28T20:40:10Z","lastTransitionTime":"2026-01-28T20:40:10Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 28 20:40:10 crc kubenswrapper[4746]: I0128 20:40:10.515055 4746 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 28 20:40:10 crc kubenswrapper[4746]: I0128 20:40:10.515112 4746 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 28 20:40:10 crc kubenswrapper[4746]: I0128 20:40:10.515120 4746 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 28 20:40:10 crc kubenswrapper[4746]: I0128 20:40:10.515133 4746 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 28 20:40:10 crc kubenswrapper[4746]: I0128 20:40:10.515141 4746 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-28T20:40:10Z","lastTransitionTime":"2026-01-28T20:40:10Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 28 20:40:10 crc kubenswrapper[4746]: I0128 20:40:10.619061 4746 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 28 20:40:10 crc kubenswrapper[4746]: I0128 20:40:10.619249 4746 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 28 20:40:10 crc kubenswrapper[4746]: I0128 20:40:10.619358 4746 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 28 20:40:10 crc kubenswrapper[4746]: I0128 20:40:10.619450 4746 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 28 20:40:10 crc kubenswrapper[4746]: I0128 20:40:10.619474 4746 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-28T20:40:10Z","lastTransitionTime":"2026-01-28T20:40:10Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 28 20:40:10 crc kubenswrapper[4746]: I0128 20:40:10.722620 4746 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 28 20:40:10 crc kubenswrapper[4746]: I0128 20:40:10.722672 4746 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 28 20:40:10 crc kubenswrapper[4746]: I0128 20:40:10.722687 4746 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 28 20:40:10 crc kubenswrapper[4746]: I0128 20:40:10.722706 4746 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 28 20:40:10 crc kubenswrapper[4746]: I0128 20:40:10.722717 4746 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-28T20:40:10Z","lastTransitionTime":"2026-01-28T20:40:10Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 28 20:40:10 crc kubenswrapper[4746]: I0128 20:40:10.783424 4746 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/f60a5487-5012-4cc9-ad94-5dfb4957d74e-metrics-certs\") pod \"network-metrics-daemon-2blg6\" (UID: \"f60a5487-5012-4cc9-ad94-5dfb4957d74e\") " pod="openshift-multus/network-metrics-daemon-2blg6" Jan 28 20:40:10 crc kubenswrapper[4746]: E0128 20:40:10.783763 4746 secret.go:188] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered Jan 28 20:40:10 crc kubenswrapper[4746]: E0128 20:40:10.783978 4746 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/f60a5487-5012-4cc9-ad94-5dfb4957d74e-metrics-certs podName:f60a5487-5012-4cc9-ad94-5dfb4957d74e nodeName:}" failed. No retries permitted until 2026-01-28 20:40:14.783927226 +0000 UTC m=+42.740113890 (durationBeforeRetry 4s). 
Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/f60a5487-5012-4cc9-ad94-5dfb4957d74e-metrics-certs") pod "network-metrics-daemon-2blg6" (UID: "f60a5487-5012-4cc9-ad94-5dfb4957d74e") : object "openshift-multus"/"metrics-daemon-secret" not registered Jan 28 20:40:10 crc kubenswrapper[4746]: I0128 20:40:10.823463 4746 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2026-01-02 17:28:53.934216017 +0000 UTC Jan 28 20:40:10 crc kubenswrapper[4746]: I0128 20:40:10.826453 4746 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 28 20:40:10 crc kubenswrapper[4746]: I0128 20:40:10.826554 4746 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 28 20:40:10 crc kubenswrapper[4746]: I0128 20:40:10.826574 4746 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 28 20:40:10 crc kubenswrapper[4746]: I0128 20:40:10.826605 4746 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 28 20:40:10 crc kubenswrapper[4746]: I0128 20:40:10.826624 4746 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-28T20:40:10Z","lastTransitionTime":"2026-01-28T20:40:10Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 28 20:40:10 crc kubenswrapper[4746]: I0128 20:40:10.835738 4746 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Jan 28 20:40:10 crc kubenswrapper[4746]: I0128 20:40:10.835790 4746 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-2blg6" Jan 28 20:40:10 crc kubenswrapper[4746]: I0128 20:40:10.835891 4746 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Jan 28 20:40:10 crc kubenswrapper[4746]: E0128 20:40:10.835937 4746 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Jan 28 20:40:10 crc kubenswrapper[4746]: E0128 20:40:10.836158 4746 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-2blg6" podUID="f60a5487-5012-4cc9-ad94-5dfb4957d74e" Jan 28 20:40:10 crc kubenswrapper[4746]: E0128 20:40:10.836459 4746 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Jan 28 20:40:10 crc kubenswrapper[4746]: I0128 20:40:10.929905 4746 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 28 20:40:10 crc kubenswrapper[4746]: I0128 20:40:10.929968 4746 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 28 20:40:10 crc kubenswrapper[4746]: I0128 20:40:10.929994 4746 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 28 20:40:10 crc kubenswrapper[4746]: I0128 20:40:10.930022 4746 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 28 20:40:10 crc kubenswrapper[4746]: I0128 20:40:10.930117 4746 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-28T20:40:10Z","lastTransitionTime":"2026-01-28T20:40:10Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 28 20:40:11 crc kubenswrapper[4746]: I0128 20:40:11.033883 4746 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 28 20:40:11 crc kubenswrapper[4746]: I0128 20:40:11.033940 4746 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 28 20:40:11 crc kubenswrapper[4746]: I0128 20:40:11.033952 4746 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 28 20:40:11 crc kubenswrapper[4746]: I0128 20:40:11.033974 4746 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 28 20:40:11 crc kubenswrapper[4746]: I0128 20:40:11.033989 4746 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-28T20:40:11Z","lastTransitionTime":"2026-01-28T20:40:11Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 28 20:40:11 crc kubenswrapper[4746]: I0128 20:40:11.137408 4746 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 28 20:40:11 crc kubenswrapper[4746]: I0128 20:40:11.137464 4746 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 28 20:40:11 crc kubenswrapper[4746]: I0128 20:40:11.137478 4746 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 28 20:40:11 crc kubenswrapper[4746]: I0128 20:40:11.137496 4746 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 28 20:40:11 crc kubenswrapper[4746]: I0128 20:40:11.137506 4746 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-28T20:40:11Z","lastTransitionTime":"2026-01-28T20:40:11Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 28 20:40:11 crc kubenswrapper[4746]: I0128 20:40:11.241543 4746 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 28 20:40:11 crc kubenswrapper[4746]: I0128 20:40:11.241623 4746 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 28 20:40:11 crc kubenswrapper[4746]: I0128 20:40:11.241651 4746 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 28 20:40:11 crc kubenswrapper[4746]: I0128 20:40:11.241685 4746 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 28 20:40:11 crc kubenswrapper[4746]: I0128 20:40:11.241709 4746 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-28T20:40:11Z","lastTransitionTime":"2026-01-28T20:40:11Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 28 20:40:11 crc kubenswrapper[4746]: I0128 20:40:11.345966 4746 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 28 20:40:11 crc kubenswrapper[4746]: I0128 20:40:11.346035 4746 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 28 20:40:11 crc kubenswrapper[4746]: I0128 20:40:11.346055 4746 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 28 20:40:11 crc kubenswrapper[4746]: I0128 20:40:11.346117 4746 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 28 20:40:11 crc kubenswrapper[4746]: I0128 20:40:11.346138 4746 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-28T20:40:11Z","lastTransitionTime":"2026-01-28T20:40:11Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 28 20:40:11 crc kubenswrapper[4746]: I0128 20:40:11.468883 4746 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 28 20:40:11 crc kubenswrapper[4746]: I0128 20:40:11.468925 4746 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 28 20:40:11 crc kubenswrapper[4746]: I0128 20:40:11.468936 4746 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 28 20:40:11 crc kubenswrapper[4746]: I0128 20:40:11.468951 4746 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 28 20:40:11 crc kubenswrapper[4746]: I0128 20:40:11.468962 4746 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-28T20:40:11Z","lastTransitionTime":"2026-01-28T20:40:11Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 28 20:40:11 crc kubenswrapper[4746]: I0128 20:40:11.571557 4746 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 28 20:40:11 crc kubenswrapper[4746]: I0128 20:40:11.571591 4746 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 28 20:40:11 crc kubenswrapper[4746]: I0128 20:40:11.571599 4746 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 28 20:40:11 crc kubenswrapper[4746]: I0128 20:40:11.571613 4746 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 28 20:40:11 crc kubenswrapper[4746]: I0128 20:40:11.571623 4746 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-28T20:40:11Z","lastTransitionTime":"2026-01-28T20:40:11Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"}
Jan 28 20:40:11 crc kubenswrapper[4746]: I0128 20:40:11.674499 4746 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Jan 28 20:40:11 crc kubenswrapper[4746]: I0128 20:40:11.674595 4746 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Jan 28 20:40:11 crc kubenswrapper[4746]: I0128 20:40:11.674622 4746 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Jan 28 20:40:11 crc kubenswrapper[4746]: I0128 20:40:11.674663 4746 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Jan 28 20:40:11 crc kubenswrapper[4746]: I0128 20:40:11.674695 4746 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-28T20:40:11Z","lastTransitionTime":"2026-01-28T20:40:11Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"}
Jan 28 20:40:11 crc kubenswrapper[4746]: I0128 20:40:11.783284 4746 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Jan 28 20:40:11 crc kubenswrapper[4746]: I0128 20:40:11.784195 4746 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Jan 28 20:40:11 crc kubenswrapper[4746]: I0128 20:40:11.784262 4746 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Jan 28 20:40:11 crc kubenswrapper[4746]: I0128 20:40:11.784286 4746 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Jan 28 20:40:11 crc kubenswrapper[4746]: I0128 20:40:11.784298 4746 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-28T20:40:11Z","lastTransitionTime":"2026-01-28T20:40:11Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Jan 28 20:40:11 crc kubenswrapper[4746]: I0128 20:40:11.823927 4746 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2025-12-24 09:57:10.244254404 +0000 UTC
Jan 28 20:40:11 crc kubenswrapper[4746]: I0128 20:40:11.835357 4746 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g"
Jan 28 20:40:11 crc kubenswrapper[4746]: E0128 20:40:11.835608 4746 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8"
Jan 28 20:40:11 crc kubenswrapper[4746]: I0128 20:40:11.888330 4746 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Jan 28 20:40:11 crc kubenswrapper[4746]: I0128 20:40:11.888392 4746 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Jan 28 20:40:11 crc kubenswrapper[4746]: I0128 20:40:11.888410 4746 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Jan 28 20:40:11 crc kubenswrapper[4746]: I0128 20:40:11.888441 4746 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Jan 28 20:40:11 crc kubenswrapper[4746]: I0128 20:40:11.888459 4746 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-28T20:40:11Z","lastTransitionTime":"2026-01-28T20:40:11Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"}
Jan 28 20:40:11 crc kubenswrapper[4746]: I0128 20:40:11.990979 4746 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Jan 28 20:40:11 crc kubenswrapper[4746]: I0128 20:40:11.991064 4746 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Jan 28 20:40:11 crc kubenswrapper[4746]: I0128 20:40:11.991130 4746 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Jan 28 20:40:11 crc kubenswrapper[4746]: I0128 20:40:11.991160 4746 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Jan 28 20:40:11 crc kubenswrapper[4746]: I0128 20:40:11.991179 4746 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-28T20:40:11Z","lastTransitionTime":"2026-01-28T20:40:11Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"}
Jan 28 20:40:12 crc kubenswrapper[4746]: I0128 20:40:12.094304 4746 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Jan 28 20:40:12 crc kubenswrapper[4746]: I0128 20:40:12.094387 4746 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Jan 28 20:40:12 crc kubenswrapper[4746]: I0128 20:40:12.094409 4746 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Jan 28 20:40:12 crc kubenswrapper[4746]: I0128 20:40:12.094436 4746 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Jan 28 20:40:12 crc kubenswrapper[4746]: I0128 20:40:12.094454 4746 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-28T20:40:12Z","lastTransitionTime":"2026-01-28T20:40:12Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"}
Jan 28 20:40:12 crc kubenswrapper[4746]: I0128 20:40:12.196771 4746 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Jan 28 20:40:12 crc kubenswrapper[4746]: I0128 20:40:12.196808 4746 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Jan 28 20:40:12 crc kubenswrapper[4746]: I0128 20:40:12.196817 4746 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Jan 28 20:40:12 crc kubenswrapper[4746]: I0128 20:40:12.196831 4746 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Jan 28 20:40:12 crc kubenswrapper[4746]: I0128 20:40:12.196839 4746 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-28T20:40:12Z","lastTransitionTime":"2026-01-28T20:40:12Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"}
Jan 28 20:40:12 crc kubenswrapper[4746]: I0128 20:40:12.299938 4746 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Jan 28 20:40:12 crc kubenswrapper[4746]: I0128 20:40:12.300031 4746 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Jan 28 20:40:12 crc kubenswrapper[4746]: I0128 20:40:12.300050 4746 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Jan 28 20:40:12 crc kubenswrapper[4746]: I0128 20:40:12.300110 4746 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Jan 28 20:40:12 crc kubenswrapper[4746]: I0128 20:40:12.300131 4746 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-28T20:40:12Z","lastTransitionTime":"2026-01-28T20:40:12Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"}
Jan 28 20:40:12 crc kubenswrapper[4746]: I0128 20:40:12.403357 4746 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Jan 28 20:40:12 crc kubenswrapper[4746]: I0128 20:40:12.403438 4746 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Jan 28 20:40:12 crc kubenswrapper[4746]: I0128 20:40:12.403470 4746 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Jan 28 20:40:12 crc kubenswrapper[4746]: I0128 20:40:12.403506 4746 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Jan 28 20:40:12 crc kubenswrapper[4746]: I0128 20:40:12.403534 4746 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-28T20:40:12Z","lastTransitionTime":"2026-01-28T20:40:12Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"}
Jan 28 20:40:12 crc kubenswrapper[4746]: I0128 20:40:12.507046 4746 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Jan 28 20:40:12 crc kubenswrapper[4746]: I0128 20:40:12.507177 4746 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Jan 28 20:40:12 crc kubenswrapper[4746]: I0128 20:40:12.507208 4746 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Jan 28 20:40:12 crc kubenswrapper[4746]: I0128 20:40:12.507236 4746 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Jan 28 20:40:12 crc kubenswrapper[4746]: I0128 20:40:12.507255 4746 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-28T20:40:12Z","lastTransitionTime":"2026-01-28T20:40:12Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"}
Jan 28 20:40:12 crc kubenswrapper[4746]: I0128 20:40:12.610677 4746 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Jan 28 20:40:12 crc kubenswrapper[4746]: I0128 20:40:12.610724 4746 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Jan 28 20:40:12 crc kubenswrapper[4746]: I0128 20:40:12.610734 4746 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Jan 28 20:40:12 crc kubenswrapper[4746]: I0128 20:40:12.610750 4746 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Jan 28 20:40:12 crc kubenswrapper[4746]: I0128 20:40:12.610761 4746 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-28T20:40:12Z","lastTransitionTime":"2026-01-28T20:40:12Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"}
Jan 28 20:40:12 crc kubenswrapper[4746]: I0128 20:40:12.715734 4746 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Jan 28 20:40:12 crc kubenswrapper[4746]: I0128 20:40:12.715861 4746 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Jan 28 20:40:12 crc kubenswrapper[4746]: I0128 20:40:12.715887 4746 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Jan 28 20:40:12 crc kubenswrapper[4746]: I0128 20:40:12.715922 4746 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Jan 28 20:40:12 crc kubenswrapper[4746]: I0128 20:40:12.715945 4746 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-28T20:40:12Z","lastTransitionTime":"2026-01-28T20:40:12Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"}
Jan 28 20:40:12 crc kubenswrapper[4746]: I0128 20:40:12.821336 4746 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Jan 28 20:40:12 crc kubenswrapper[4746]: I0128 20:40:12.821405 4746 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Jan 28 20:40:12 crc kubenswrapper[4746]: I0128 20:40:12.821427 4746 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Jan 28 20:40:12 crc kubenswrapper[4746]: I0128 20:40:12.821459 4746 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Jan 28 20:40:12 crc kubenswrapper[4746]: I0128 20:40:12.821528 4746 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-28T20:40:12Z","lastTransitionTime":"2026-01-28T20:40:12Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Jan 28 20:40:12 crc kubenswrapper[4746]: I0128 20:40:12.824612 4746 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2025-11-28 20:14:45.436675728 +0000 UTC
Jan 28 20:40:12 crc kubenswrapper[4746]: I0128 20:40:12.835260 4746 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf"
Jan 28 20:40:12 crc kubenswrapper[4746]: I0128 20:40:12.835342 4746 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c"
Jan 28 20:40:12 crc kubenswrapper[4746]: I0128 20:40:12.835268 4746 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-multus/network-metrics-daemon-2blg6"
Jan 28 20:40:12 crc kubenswrapper[4746]: E0128 20:40:12.835492 4746 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5"
Jan 28 20:40:12 crc kubenswrapper[4746]: E0128 20:40:12.835599 4746 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447"
Jan 28 20:40:12 crc kubenswrapper[4746]: E0128 20:40:12.835748 4746 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-2blg6" podUID="f60a5487-5012-4cc9-ad94-5dfb4957d74e" Jan 28 20:40:12 crc kubenswrapper[4746]: I0128 20:40:12.855355 4746 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-28T20:39:52Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-28T20:39:52Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-28T20:40:12Z is after 2025-08-24T17:21:41Z" Jan 28 20:40:12 crc kubenswrapper[4746]: I0128 20:40:12.875826 4746 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-28T20:39:52Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-28T20:39:52Z\\\",\\\"message\\\":\\\"containers with unready status: 
[network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-28T20:40:12Z is after 2025-08-24T17:21:41Z" Jan 28 20:40:12 crc kubenswrapper[4746]: I0128 20:40:12.902437 4746 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-9zvm7" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"de4fb5f0-7c65-4fdb-8389-d2c8462e130b\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-28T20:40:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-28T20:40:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-28T20:40:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-28T20:40:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://972f4c0079d8e04365abd93dc8d63e88dea04f427dbbbcf5e331cdeeff8c855c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-28T20:40:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jvkqz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://d2340a59c423445dd2b32da57d43146dd1cc3
54b737ae1b4d1875a0d40f62188\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-28T20:40:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jvkqz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-28T20:40:05Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-9zvm7\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-28T20:40:12Z is after 2025-08-24T17:21:41Z" Jan 28 20:40:12 crc kubenswrapper[4746]: I0128 20:40:12.919354 4746 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-28T20:39:52Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-28T20:39:52Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-28T20:40:12Z is after 2025-08-24T17:21:41Z" Jan 28 20:40:12 crc kubenswrapper[4746]: I0128 20:40:12.925848 4746 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 28 20:40:12 crc kubenswrapper[4746]: I0128 20:40:12.926399 4746 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 28 20:40:12 crc kubenswrapper[4746]: I0128 20:40:12.926786 4746 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 28 20:40:12 crc kubenswrapper[4746]: I0128 20:40:12.927170 4746 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 28 20:40:12 crc kubenswrapper[4746]: I0128 20:40:12.927541 4746 setters.go:603] "Node became not ready" node="crc" 
condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-28T20:40:12Z","lastTransitionTime":"2026-01-28T20:40:12Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 28 20:40:12 crc kubenswrapper[4746]: I0128 20:40:12.935884 4746 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-28T20:39:53Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-28T20:39:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-28T20:39:53Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5dff3e2075b8bbfb6af77c72362ccc37e1d7810a2d97c096cf501681c98699ab\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-28T20:39:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-ide
ntity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://772e165efa6d865a669c172958e0aea3112146637e354e4587b7e664454112ad\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-28T20:39:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-28T20:40:12Z is after 2025-08-24T17:21:41Z" Jan 28 20:40:12 crc kubenswrapper[4746]: I0128 20:40:12.947661 4746 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-gcrxx" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"1c3c93a1-7ddf-4339-9ca3-79f3753943b4\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-28T20:39:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-28T20:39:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-28T20:39:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-28T20:39:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5bd8d8214aa8e85e50747643b6354fb6e026866c926310fb2dd79760f916f468\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-28T20:39:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wn6hr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podI
Ps\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-28T20:39:52Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-gcrxx\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-28T20:40:12Z is after 2025-08-24T17:21:41Z" Jan 28 20:40:12 crc kubenswrapper[4746]: I0128 20:40:12.974670 4746 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-8vmvh" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"c4d15639-62fb-41b7-a1d4-6f51f3af6d99\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-28T20:39:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-28T20:39:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-28T20:39:53Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-28T20:39:53Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://2637734601bf1e43535b70bd62cc6aea7fc9f62d0d2a33c83602e7065a793be1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-28T20:39:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-kgrqs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://73e82be84afa7be760acda86ebf6c956d558dcbd4f37f049fba84e08c75fcd9f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-28T20:39:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\
\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-kgrqs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://17d09f7f46014da3713f76f111bca0a7ce8a24a154aa6144291d760d85e7d3ec\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-28T20:39:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-kgrqs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://b7ad895d0ff1ad570136d96671e1bde5c62882af50a407945ef0e7bc52727b88\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-28T20:39:54Z\\\"}},\\\"volumeMounts\\\":[{
\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-kgrqs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://af842ba33fdf06048038dd3e12c59f7a9d0867aa28acee1f2cc4571e7db28b88\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-28T20:39:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-kgrqs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://57a776a1daf1b17eb44d9fa09fe2f7ec01f647aa0a40d0d571736d2f7c2f8e4d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\"
,\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-28T20:39:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-kgrqs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://b8dbe081b6eaa82770425f55f41842f6e94b364cccd11dfd0a2c67d00e0863e9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b8dbe081b6eaa82770425f55f41842f6e94b364cccd11dfd0a2c67d00e0863e9\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2026-01-28T20:40:03Z\\\",\\\"message\\\":\\\"8s.io/network-policy-api/pkg/client/informers/externalversions/factory.go:141\\\\nI0128 20:40:03.124373 6172 handler.go:190] Sending *v1.Node event handler 2 for removal\\\\nI0128 20:40:03.124394 6172 handler.go:190] Sending *v1.Node event handler 7 for removal\\\\nI0128 20:40:03.124408 6172 handler.go:190] Sending *v1.EgressFirewall event handler 
9 for removal\\\\nI0128 20:40:03.124440 6172 handler.go:190] Sending *v1.Pod event handler 3 for removal\\\\nI0128 20:40:03.124452 6172 handler.go:190] Sending *v1.Pod event handler 6 for removal\\\\nI0128 20:40:03.124447 6172 handler.go:208] Removed *v1.Node event handler 2\\\\nI0128 20:40:03.124467 6172 handler.go:190] Sending *v1.NetworkPolicy event handler 4 for removal\\\\nI0128 20:40:03.124473 6172 handler.go:208] Removed *v1.Node event handler 7\\\\nI0128 20:40:03.124475 6172 handler.go:208] Removed *v1.EgressFirewall event handler 9\\\\nI0128 20:40:03.124488 6172 handler.go:208] Removed *v1.Pod event handler 3\\\\nI0128 20:40:03.124488 6172 handler.go:190] Sending *v1.EgressIP event handler 8 for removal\\\\nI0128 20:40:03.124508 6172 handler.go:208] Removed *v1.NetworkPolicy event handler 4\\\\nI0128 20:40:03.124525 6172 handler.go:208] Removed *v1.Pod event handler 6\\\\nI0128 20:40:03.124578 6172 factory.go:656] Stopping watch factory\\\\nI0128 20:40:03.124590 6172 handler.go:208] Removed *v1.EgressIP ev\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-01-28T20:40:02Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 10s restarting failed container=ovnkube-controller 
pod=ovnkube-node-8vmvh_openshift-ovn-kubernetes(c4d15639-62fb-41b7-a1d4-6f51f3af6d99)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kube
rnetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-kgrqs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://cdd60109dba5f45418dd5bc29c649a50aae8479fa77fc3dc50ca4f7c3042e3fc\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-28T20:39:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-kgrqs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://45e59dfe364d511e17b8264a31157501de3ed23e7125d79979df83d328242328\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://45e59dfe364d511e17
b8264a31157501de3ed23e7125d79979df83d328242328\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-28T20:39:53Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-28T20:39:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-kgrqs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-28T20:39:53Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-8vmvh\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-28T20:40:12Z is after 2025-08-24T17:21:41Z" Jan 28 20:40:12 crc kubenswrapper[4746]: I0128 20:40:12.992021 4746 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"64499f0d-5c10-43a9-b479-9b6c7ef2fb9c\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-28T20:39:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-28T20:39:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-28T20:39:32Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-28T20:39:32Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-28T20:39:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://9a304b12b4a1e30cbbbb16aa4f91514f1b1a64c716cf8ac023803a279b310f9c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-28T20:39:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://6af41f85c56be6c24127bec0aee8e02562dbb04bd22e57c6887f6f5e914a7530\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"
restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-28T20:39:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://f31e9e80576ec4eb59ec92039601fd94bd323d29493c353f9ec6b9c61187b0d3\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-28T20:39:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://56b250a192cfa698ccd7ac8233b75b9acbf2347549c712b47ccc1b89f144aa83\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://56b250a192cfa698ccd7ac8233b75b9acbf2347549c712b47ccc1b89f144aa83\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-01-28T20:39:52Z\\\",\\\"message\\\":\\\"le observer\\\\nW0128 20:39:52.695680 1 builder.go:272] unable to get owner reference (falling back to namespace): pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI0128 20:39:52.695807 1 builder.go:304] check-endpoints version 
4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0128 20:39:52.696767 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-2500116620/tls.crt::/tmp/serving-cert-2500116620/tls.key\\\\\\\"\\\\nI0128 20:39:52.900914 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI0128 20:39:52.915714 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI0128 20:39:52.915750 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI0128 20:39:52.916035 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI0128 20:39:52.916045 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI0128 20:39:52.925040 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW0128 20:39:52.925098 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0128 20:39:52.925104 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0128 20:39:52.925109 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW0128 20:39:52.925113 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW0128 20:39:52.925116 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW0128 20:39:52.925119 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nI0128 20:39:52.925416 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nF0128 20:39:52.926926 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not 
found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-01-28T20:39:47Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 10s restarting failed container=kube-apiserver-check-endpoints pod=kube-apiserver-crc_openshift-kube-apiserver(f4b27818a5e8e43d0dc095d08835c792)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://03a94dee0b9663441e0ecec16b121c5be4e5c557f17c929b8edfb0d7ba00c3d5\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-28T20:39:35Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c5808a729190035a73529ac58efb0790f2d9c3b98c5f1a8017389e4ca1738c4c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://c58
08a729190035a73529ac58efb0790f2d9c3b98c5f1a8017389e4ca1738c4c\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-28T20:39:33Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-28T20:39:33Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-28T20:39:32Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-28T20:40:12Z is after 2025-08-24T17:21:41Z" Jan 28 20:40:13 crc kubenswrapper[4746]: I0128 20:40:13.013320 4746 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-28T20:39:54Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-28T20:39:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-28T20:39:54Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://2f64131d62eca6c3a42e9177d14fa39ed6435e536f60ef5d25ff6bf061d2cc0a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-28T20:39:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: 
x509: certificate has expired or is not yet valid: current time 2026-01-28T20:40:13Z is after 2025-08-24T17:21:41Z" Jan 28 20:40:13 crc kubenswrapper[4746]: I0128 20:40:13.030163 4746 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 28 20:40:13 crc kubenswrapper[4746]: I0128 20:40:13.030214 4746 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 28 20:40:13 crc kubenswrapper[4746]: I0128 20:40:13.030224 4746 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 28 20:40:13 crc kubenswrapper[4746]: I0128 20:40:13.030248 4746 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 28 20:40:13 crc kubenswrapper[4746]: I0128 20:40:13.030271 4746 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-28T20:40:13Z","lastTransitionTime":"2026-01-28T20:40:13Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 28 20:40:13 crc kubenswrapper[4746]: I0128 20:40:13.032650 4746 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-28T20:39:57Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-28T20:39:57Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-28T20:39:57Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://83a275ec9868f2fd9e24135d5a05c27caf843340c144070952291bb6a3715035\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-28T20:39:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod 
\"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-28T20:40:13Z is after 2025-08-24T17:21:41Z" Jan 28 20:40:13 crc kubenswrapper[4746]: I0128 20:40:13.047220 4746 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-6wrnw" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"6dc8b546-9734-4082-b2b3-2bafe3f1564d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-28T20:39:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-28T20:39:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-28T20:39:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-28T20:39:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://50ec1ea913ec63a096032bd968bda5c5c8879e16a08e6a57727750517d9af038\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy
\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-28T20:39:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-z84f9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://66d250466155027405bf90c4e5ed09388238d4cee63604e28486a16778f9d188\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-28T20:39:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-z84f9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-28T20:39:53Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-6wrnw\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 
2026-01-28T20:40:13Z is after 2025-08-24T17:21:41Z" Jan 28 20:40:13 crc kubenswrapper[4746]: I0128 20:40:13.064397 4746 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-ht6hp" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"beb2b795-6bf4-4d38-89f7-bcb5512c3e61\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-28T20:39:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-28T20:40:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-28T20:40:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-28T20:40:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://564d42c4c25f218c1a1f52449d2f3d941c1dad920c8100d1d908cadf7cf46607\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-28T20:40:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rr62b\\\",\\\"readOnly\\\":true,\\\"
recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b2d8bf207108ac2304cd04e34cd9843fcb755ac8610a3f88448538d1e99359f6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b2d8bf207108ac2304cd04e34cd9843fcb755ac8610a3f88448538d1e99359f6\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-28T20:39:54Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-28T20:39:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rr62b\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://40c78d39cb70eba245771f4d85d2c8bc7b77427ea216ad3b89749d24fd550098\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"termi
nated\\\":{\\\"containerID\\\":\\\"cri-o://40c78d39cb70eba245771f4d85d2c8bc7b77427ea216ad3b89749d24fd550098\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-28T20:39:55Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-28T20:39:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rr62b\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://498028ca0bd260ca2c4d7b0231ea54a5a6ff5b302f720fbd209a4f640507258b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://498028ca0bd260ca2c4d7b0231ea54a5a6ff5b302f720fbd209a4f640507258b\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-28T20:39:56Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-28T20:39:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\
\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rr62b\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://661d4a9031b6789f38ed1ac8e3c40d72182709130e1680ecdad86c07c1f34b7b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://661d4a9031b6789f38ed1ac8e3c40d72182709130e1680ecdad86c07c1f34b7b\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-28T20:39:57Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-28T20:39:57Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rr62b\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://a49ffe0bbe4151671dc8803d7ab61cb9b3a2eae064b81e2830bc074787f4754e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"r
estartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://a49ffe0bbe4151671dc8803d7ab61cb9b3a2eae064b81e2830bc074787f4754e\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-28T20:39:58Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-28T20:39:58Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rr62b\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://cd521e9ecb53f581b5732a4d018f92007063ce0912e6d760fe3c920218c23b14\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://cd521e9ecb53f581b5732a4d018f92007063ce0912e6d760fe3c920218c23b14\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-28T20:39:59Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-28T20:39:59Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rr62b\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":
\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-28T20:39:53Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-ht6hp\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-28T20:40:13Z is after 2025-08-24T17:21:41Z" Jan 28 20:40:13 crc kubenswrapper[4746]: I0128 20:40:13.083227 4746 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-qhpvf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"cdf26de0-b602-4bdf-b492-65b3b6b31434\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-28T20:39:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-28T20:39:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-28T20:39:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-28T20:39:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f80345dfd4de46e278611370e6c337968cb1ba7ed8275457deea5871704f8367\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-a
rt-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-28T20:39:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jnpsl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}
],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-28T20:39:53Z\\\"}}\" for pod \"openshift-multus\"/\"multus-qhpvf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-28T20:40:13Z is after 2025-08-24T17:21:41Z" Jan 28 20:40:13 crc kubenswrapper[4746]: I0128 20:40:13.097166 4746 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-2blg6" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"f60a5487-5012-4cc9-ad94-5dfb4957d74e\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-28T20:40:07Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-28T20:40:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-28T20:40:07Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-28T20:40:07Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-l2www\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-l2www\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-28T20:40:07Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-2blg6\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-28T20:40:13Z is after 2025-08-24T17:21:41Z" Jan 28 20:40:13 crc 
kubenswrapper[4746]: I0128 20:40:13.114393 4746 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"950d71b3-9b51-43dc-814a-d6a838723f78\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-28T20:39:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-28T20:39:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-28T20:39:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-28T20:39:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-28T20:39:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0411ecf7c85f5397d751adb7d4905977dcbd3d2e169c56ebbe26017192765602\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-28T20:39:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://319cbcaa9981b3c79eb0c8f970f0ad776fb255db10c72c9b6a13469906e9e5ec\\\"
,\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-28T20:39:33Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://eefda7da35cd9ff176512b5d1224ccadcb5debd0035a49822e82a9757d4a3f91\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-28T20:39:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://145f4f882924f06172df81d94fc9bd683ea1fc04889de5f6cfb5caece8227fd0\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17
ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-28T20:39:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-28T20:39:32Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-28T20:40:13Z is after 2025-08-24T17:21:41Z" Jan 28 20:40:13 crc kubenswrapper[4746]: I0128 20:40:13.130733 4746 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-d8rwq" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"405572d8-50d2-4d7a-adf0-d8d6adea31ca\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-28T20:39:57Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-28T20:39:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-28T20:39:57Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-28T20:39:57Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://eba305a172c49420674e97d32590a7b2e4c7c2cd7406257508e0636cf42ea244\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-28T20:39:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mb4h4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phas
e\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-28T20:39:56Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-d8rwq\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-28T20:40:13Z is after 2025-08-24T17:21:41Z" Jan 28 20:40:13 crc kubenswrapper[4746]: I0128 20:40:13.132276 4746 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 28 20:40:13 crc kubenswrapper[4746]: I0128 20:40:13.132307 4746 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 28 20:40:13 crc kubenswrapper[4746]: I0128 20:40:13.132316 4746 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 28 20:40:13 crc kubenswrapper[4746]: I0128 20:40:13.132331 4746 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 28 20:40:13 crc kubenswrapper[4746]: I0128 20:40:13.132344 4746 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-28T20:40:13Z","lastTransitionTime":"2026-01-28T20:40:13Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 28 20:40:13 crc kubenswrapper[4746]: I0128 20:40:13.235948 4746 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 28 20:40:13 crc kubenswrapper[4746]: I0128 20:40:13.236023 4746 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 28 20:40:13 crc kubenswrapper[4746]: I0128 20:40:13.236044 4746 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 28 20:40:13 crc kubenswrapper[4746]: I0128 20:40:13.236119 4746 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 28 20:40:13 crc kubenswrapper[4746]: I0128 20:40:13.236139 4746 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-28T20:40:13Z","lastTransitionTime":"2026-01-28T20:40:13Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 28 20:40:13 crc kubenswrapper[4746]: I0128 20:40:13.340303 4746 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 28 20:40:13 crc kubenswrapper[4746]: I0128 20:40:13.340852 4746 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 28 20:40:13 crc kubenswrapper[4746]: I0128 20:40:13.341002 4746 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 28 20:40:13 crc kubenswrapper[4746]: I0128 20:40:13.341177 4746 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 28 20:40:13 crc kubenswrapper[4746]: I0128 20:40:13.341381 4746 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-28T20:40:13Z","lastTransitionTime":"2026-01-28T20:40:13Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 28 20:40:13 crc kubenswrapper[4746]: I0128 20:40:13.444262 4746 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 28 20:40:13 crc kubenswrapper[4746]: I0128 20:40:13.445316 4746 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 28 20:40:13 crc kubenswrapper[4746]: I0128 20:40:13.445363 4746 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 28 20:40:13 crc kubenswrapper[4746]: I0128 20:40:13.445413 4746 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 28 20:40:13 crc kubenswrapper[4746]: I0128 20:40:13.445432 4746 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-28T20:40:13Z","lastTransitionTime":"2026-01-28T20:40:13Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 28 20:40:13 crc kubenswrapper[4746]: I0128 20:40:13.548514 4746 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 28 20:40:13 crc kubenswrapper[4746]: I0128 20:40:13.548548 4746 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 28 20:40:13 crc kubenswrapper[4746]: I0128 20:40:13.548557 4746 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 28 20:40:13 crc kubenswrapper[4746]: I0128 20:40:13.548572 4746 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 28 20:40:13 crc kubenswrapper[4746]: I0128 20:40:13.548582 4746 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-28T20:40:13Z","lastTransitionTime":"2026-01-28T20:40:13Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 28 20:40:13 crc kubenswrapper[4746]: I0128 20:40:13.653668 4746 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 28 20:40:13 crc kubenswrapper[4746]: I0128 20:40:13.653714 4746 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 28 20:40:13 crc kubenswrapper[4746]: I0128 20:40:13.653724 4746 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 28 20:40:13 crc kubenswrapper[4746]: I0128 20:40:13.653741 4746 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 28 20:40:13 crc kubenswrapper[4746]: I0128 20:40:13.653752 4746 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-28T20:40:13Z","lastTransitionTime":"2026-01-28T20:40:13Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 28 20:40:13 crc kubenswrapper[4746]: I0128 20:40:13.756972 4746 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 28 20:40:13 crc kubenswrapper[4746]: I0128 20:40:13.757031 4746 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 28 20:40:13 crc kubenswrapper[4746]: I0128 20:40:13.757049 4746 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 28 20:40:13 crc kubenswrapper[4746]: I0128 20:40:13.757103 4746 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 28 20:40:13 crc kubenswrapper[4746]: I0128 20:40:13.757122 4746 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-28T20:40:13Z","lastTransitionTime":"2026-01-28T20:40:13Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 28 20:40:13 crc kubenswrapper[4746]: I0128 20:40:13.825206 4746 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2026-01-09 04:31:35.528602783 +0000 UTC Jan 28 20:40:13 crc kubenswrapper[4746]: I0128 20:40:13.835624 4746 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Jan 28 20:40:13 crc kubenswrapper[4746]: E0128 20:40:13.835800 4746 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Jan 28 20:40:13 crc kubenswrapper[4746]: I0128 20:40:13.860995 4746 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 28 20:40:13 crc kubenswrapper[4746]: I0128 20:40:13.861120 4746 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 28 20:40:13 crc kubenswrapper[4746]: I0128 20:40:13.861141 4746 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 28 20:40:13 crc kubenswrapper[4746]: I0128 20:40:13.861168 4746 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 28 20:40:13 crc kubenswrapper[4746]: I0128 20:40:13.861187 4746 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-28T20:40:13Z","lastTransitionTime":"2026-01-28T20:40:13Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 28 20:40:13 crc kubenswrapper[4746]: I0128 20:40:13.964662 4746 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 28 20:40:13 crc kubenswrapper[4746]: I0128 20:40:13.964724 4746 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 28 20:40:13 crc kubenswrapper[4746]: I0128 20:40:13.964749 4746 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 28 20:40:13 crc kubenswrapper[4746]: I0128 20:40:13.964780 4746 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 28 20:40:13 crc kubenswrapper[4746]: I0128 20:40:13.964802 4746 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-28T20:40:13Z","lastTransitionTime":"2026-01-28T20:40:13Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 28 20:40:14 crc kubenswrapper[4746]: I0128 20:40:14.067888 4746 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 28 20:40:14 crc kubenswrapper[4746]: I0128 20:40:14.068471 4746 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 28 20:40:14 crc kubenswrapper[4746]: I0128 20:40:14.068687 4746 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 28 20:40:14 crc kubenswrapper[4746]: I0128 20:40:14.068845 4746 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 28 20:40:14 crc kubenswrapper[4746]: I0128 20:40:14.068976 4746 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-28T20:40:14Z","lastTransitionTime":"2026-01-28T20:40:14Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 28 20:40:14 crc kubenswrapper[4746]: I0128 20:40:14.172312 4746 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 28 20:40:14 crc kubenswrapper[4746]: I0128 20:40:14.172374 4746 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 28 20:40:14 crc kubenswrapper[4746]: I0128 20:40:14.172392 4746 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 28 20:40:14 crc kubenswrapper[4746]: I0128 20:40:14.172419 4746 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 28 20:40:14 crc kubenswrapper[4746]: I0128 20:40:14.172439 4746 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-28T20:40:14Z","lastTransitionTime":"2026-01-28T20:40:14Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 28 20:40:14 crc kubenswrapper[4746]: I0128 20:40:14.274990 4746 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 28 20:40:14 crc kubenswrapper[4746]: I0128 20:40:14.275033 4746 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 28 20:40:14 crc kubenswrapper[4746]: I0128 20:40:14.275042 4746 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 28 20:40:14 crc kubenswrapper[4746]: I0128 20:40:14.275057 4746 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 28 20:40:14 crc kubenswrapper[4746]: I0128 20:40:14.275067 4746 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-28T20:40:14Z","lastTransitionTime":"2026-01-28T20:40:14Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 28 20:40:14 crc kubenswrapper[4746]: I0128 20:40:14.377379 4746 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 28 20:40:14 crc kubenswrapper[4746]: I0128 20:40:14.377424 4746 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 28 20:40:14 crc kubenswrapper[4746]: I0128 20:40:14.377434 4746 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 28 20:40:14 crc kubenswrapper[4746]: I0128 20:40:14.377447 4746 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 28 20:40:14 crc kubenswrapper[4746]: I0128 20:40:14.377457 4746 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-28T20:40:14Z","lastTransitionTime":"2026-01-28T20:40:14Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 28 20:40:14 crc kubenswrapper[4746]: I0128 20:40:14.480476 4746 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 28 20:40:14 crc kubenswrapper[4746]: I0128 20:40:14.480525 4746 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 28 20:40:14 crc kubenswrapper[4746]: I0128 20:40:14.480533 4746 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 28 20:40:14 crc kubenswrapper[4746]: I0128 20:40:14.480549 4746 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 28 20:40:14 crc kubenswrapper[4746]: I0128 20:40:14.480559 4746 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-28T20:40:14Z","lastTransitionTime":"2026-01-28T20:40:14Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 28 20:40:14 crc kubenswrapper[4746]: I0128 20:40:14.584104 4746 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 28 20:40:14 crc kubenswrapper[4746]: I0128 20:40:14.584150 4746 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 28 20:40:14 crc kubenswrapper[4746]: I0128 20:40:14.584160 4746 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 28 20:40:14 crc kubenswrapper[4746]: I0128 20:40:14.584186 4746 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 28 20:40:14 crc kubenswrapper[4746]: I0128 20:40:14.584196 4746 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-28T20:40:14Z","lastTransitionTime":"2026-01-28T20:40:14Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 28 20:40:14 crc kubenswrapper[4746]: I0128 20:40:14.686572 4746 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 28 20:40:14 crc kubenswrapper[4746]: I0128 20:40:14.686617 4746 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 28 20:40:14 crc kubenswrapper[4746]: I0128 20:40:14.686630 4746 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 28 20:40:14 crc kubenswrapper[4746]: I0128 20:40:14.686647 4746 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 28 20:40:14 crc kubenswrapper[4746]: I0128 20:40:14.686658 4746 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-28T20:40:14Z","lastTransitionTime":"2026-01-28T20:40:14Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 28 20:40:14 crc kubenswrapper[4746]: I0128 20:40:14.789549 4746 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 28 20:40:14 crc kubenswrapper[4746]: I0128 20:40:14.789838 4746 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 28 20:40:14 crc kubenswrapper[4746]: I0128 20:40:14.789932 4746 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 28 20:40:14 crc kubenswrapper[4746]: I0128 20:40:14.790032 4746 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 28 20:40:14 crc kubenswrapper[4746]: I0128 20:40:14.790158 4746 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-28T20:40:14Z","lastTransitionTime":"2026-01-28T20:40:14Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 28 20:40:14 crc kubenswrapper[4746]: I0128 20:40:14.826163 4746 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2025-11-22 09:55:42.939232996 +0000 UTC Jan 28 20:40:14 crc kubenswrapper[4746]: I0128 20:40:14.833834 4746 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/f60a5487-5012-4cc9-ad94-5dfb4957d74e-metrics-certs\") pod \"network-metrics-daemon-2blg6\" (UID: \"f60a5487-5012-4cc9-ad94-5dfb4957d74e\") " pod="openshift-multus/network-metrics-daemon-2blg6" Jan 28 20:40:14 crc kubenswrapper[4746]: E0128 20:40:14.834068 4746 secret.go:188] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered Jan 28 20:40:14 crc kubenswrapper[4746]: E0128 20:40:14.834265 4746 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/f60a5487-5012-4cc9-ad94-5dfb4957d74e-metrics-certs podName:f60a5487-5012-4cc9-ad94-5dfb4957d74e nodeName:}" failed. No retries permitted until 2026-01-28 20:40:22.834228444 +0000 UTC m=+50.790414838 (durationBeforeRetry 8s). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/f60a5487-5012-4cc9-ad94-5dfb4957d74e-metrics-certs") pod "network-metrics-daemon-2blg6" (UID: "f60a5487-5012-4cc9-ad94-5dfb4957d74e") : object "openshift-multus"/"metrics-daemon-secret" not registered Jan 28 20:40:14 crc kubenswrapper[4746]: I0128 20:40:14.834880 4746 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Jan 28 20:40:14 crc kubenswrapper[4746]: I0128 20:40:14.834944 4746 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Jan 28 20:40:14 crc kubenswrapper[4746]: I0128 20:40:14.834981 4746 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-2blg6" Jan 28 20:40:14 crc kubenswrapper[4746]: E0128 20:40:14.835102 4746 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Jan 28 20:40:14 crc kubenswrapper[4746]: E0128 20:40:14.835231 4746 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Jan 28 20:40:14 crc kubenswrapper[4746]: E0128 20:40:14.835360 4746 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-2blg6" podUID="f60a5487-5012-4cc9-ad94-5dfb4957d74e" Jan 28 20:40:14 crc kubenswrapper[4746]: I0128 20:40:14.893758 4746 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 28 20:40:14 crc kubenswrapper[4746]: I0128 20:40:14.893847 4746 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 28 20:40:14 crc kubenswrapper[4746]: I0128 20:40:14.893862 4746 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 28 20:40:14 crc kubenswrapper[4746]: I0128 20:40:14.893883 4746 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 28 20:40:14 crc kubenswrapper[4746]: I0128 20:40:14.893901 4746 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-28T20:40:14Z","lastTransitionTime":"2026-01-28T20:40:14Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 28 20:40:14 crc kubenswrapper[4746]: I0128 20:40:14.996368 4746 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 28 20:40:14 crc kubenswrapper[4746]: I0128 20:40:14.996412 4746 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 28 20:40:14 crc kubenswrapper[4746]: I0128 20:40:14.996420 4746 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 28 20:40:14 crc kubenswrapper[4746]: I0128 20:40:14.996435 4746 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 28 20:40:14 crc kubenswrapper[4746]: I0128 20:40:14.996444 4746 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-28T20:40:14Z","lastTransitionTime":"2026-01-28T20:40:14Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 28 20:40:15 crc kubenswrapper[4746]: I0128 20:40:15.099591 4746 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 28 20:40:15 crc kubenswrapper[4746]: I0128 20:40:15.099665 4746 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 28 20:40:15 crc kubenswrapper[4746]: I0128 20:40:15.099687 4746 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 28 20:40:15 crc kubenswrapper[4746]: I0128 20:40:15.099721 4746 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 28 20:40:15 crc kubenswrapper[4746]: I0128 20:40:15.099744 4746 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-28T20:40:15Z","lastTransitionTime":"2026-01-28T20:40:15Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 28 20:40:15 crc kubenswrapper[4746]: I0128 20:40:15.202127 4746 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 28 20:40:15 crc kubenswrapper[4746]: I0128 20:40:15.202193 4746 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 28 20:40:15 crc kubenswrapper[4746]: I0128 20:40:15.202215 4746 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 28 20:40:15 crc kubenswrapper[4746]: I0128 20:40:15.202243 4746 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 28 20:40:15 crc kubenswrapper[4746]: I0128 20:40:15.202266 4746 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-28T20:40:15Z","lastTransitionTime":"2026-01-28T20:40:15Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 28 20:40:15 crc kubenswrapper[4746]: I0128 20:40:15.305826 4746 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 28 20:40:15 crc kubenswrapper[4746]: I0128 20:40:15.305893 4746 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 28 20:40:15 crc kubenswrapper[4746]: I0128 20:40:15.305906 4746 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 28 20:40:15 crc kubenswrapper[4746]: I0128 20:40:15.305937 4746 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 28 20:40:15 crc kubenswrapper[4746]: I0128 20:40:15.305954 4746 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-28T20:40:15Z","lastTransitionTime":"2026-01-28T20:40:15Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 28 20:40:15 crc kubenswrapper[4746]: I0128 20:40:15.409196 4746 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 28 20:40:15 crc kubenswrapper[4746]: I0128 20:40:15.409256 4746 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 28 20:40:15 crc kubenswrapper[4746]: I0128 20:40:15.409271 4746 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 28 20:40:15 crc kubenswrapper[4746]: I0128 20:40:15.409288 4746 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 28 20:40:15 crc kubenswrapper[4746]: I0128 20:40:15.409304 4746 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-28T20:40:15Z","lastTransitionTime":"2026-01-28T20:40:15Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 28 20:40:15 crc kubenswrapper[4746]: I0128 20:40:15.513112 4746 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 28 20:40:15 crc kubenswrapper[4746]: I0128 20:40:15.513181 4746 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 28 20:40:15 crc kubenswrapper[4746]: I0128 20:40:15.513198 4746 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 28 20:40:15 crc kubenswrapper[4746]: I0128 20:40:15.513226 4746 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 28 20:40:15 crc kubenswrapper[4746]: I0128 20:40:15.513255 4746 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-28T20:40:15Z","lastTransitionTime":"2026-01-28T20:40:15Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 28 20:40:15 crc kubenswrapper[4746]: I0128 20:40:15.616686 4746 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 28 20:40:15 crc kubenswrapper[4746]: I0128 20:40:15.616780 4746 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 28 20:40:15 crc kubenswrapper[4746]: I0128 20:40:15.616802 4746 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 28 20:40:15 crc kubenswrapper[4746]: I0128 20:40:15.616831 4746 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 28 20:40:15 crc kubenswrapper[4746]: I0128 20:40:15.616850 4746 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-28T20:40:15Z","lastTransitionTime":"2026-01-28T20:40:15Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 28 20:40:15 crc kubenswrapper[4746]: I0128 20:40:15.719840 4746 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 28 20:40:15 crc kubenswrapper[4746]: I0128 20:40:15.719898 4746 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 28 20:40:15 crc kubenswrapper[4746]: I0128 20:40:15.719915 4746 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 28 20:40:15 crc kubenswrapper[4746]: I0128 20:40:15.719939 4746 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 28 20:40:15 crc kubenswrapper[4746]: I0128 20:40:15.719963 4746 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-28T20:40:15Z","lastTransitionTime":"2026-01-28T20:40:15Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 28 20:40:15 crc kubenswrapper[4746]: I0128 20:40:15.823480 4746 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 28 20:40:15 crc kubenswrapper[4746]: I0128 20:40:15.823540 4746 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 28 20:40:15 crc kubenswrapper[4746]: I0128 20:40:15.823551 4746 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 28 20:40:15 crc kubenswrapper[4746]: I0128 20:40:15.823567 4746 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 28 20:40:15 crc kubenswrapper[4746]: I0128 20:40:15.823580 4746 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-28T20:40:15Z","lastTransitionTime":"2026-01-28T20:40:15Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 28 20:40:15 crc kubenswrapper[4746]: I0128 20:40:15.826892 4746 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2025-12-28 22:45:10.239756678 +0000 UTC Jan 28 20:40:15 crc kubenswrapper[4746]: I0128 20:40:15.835410 4746 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Jan 28 20:40:15 crc kubenswrapper[4746]: E0128 20:40:15.835752 4746 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Jan 28 20:40:15 crc kubenswrapper[4746]: I0128 20:40:15.836033 4746 scope.go:117] "RemoveContainer" containerID="56b250a192cfa698ccd7ac8233b75b9acbf2347549c712b47ccc1b89f144aa83" Jan 28 20:40:15 crc kubenswrapper[4746]: I0128 20:40:15.928728 4746 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 28 20:40:15 crc kubenswrapper[4746]: I0128 20:40:15.928777 4746 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 28 20:40:15 crc kubenswrapper[4746]: I0128 20:40:15.928787 4746 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 28 20:40:15 crc kubenswrapper[4746]: I0128 20:40:15.928808 4746 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 28 20:40:15 crc kubenswrapper[4746]: I0128 20:40:15.928821 4746 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-28T20:40:15Z","lastTransitionTime":"2026-01-28T20:40:15Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 28 20:40:16 crc kubenswrapper[4746]: I0128 20:40:16.031678 4746 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 28 20:40:16 crc kubenswrapper[4746]: I0128 20:40:16.031715 4746 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 28 20:40:16 crc kubenswrapper[4746]: I0128 20:40:16.031724 4746 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 28 20:40:16 crc kubenswrapper[4746]: I0128 20:40:16.031741 4746 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 28 20:40:16 crc kubenswrapper[4746]: I0128 20:40:16.031751 4746 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-28T20:40:16Z","lastTransitionTime":"2026-01-28T20:40:16Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 28 20:40:16 crc kubenswrapper[4746]: I0128 20:40:16.134616 4746 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 28 20:40:16 crc kubenswrapper[4746]: I0128 20:40:16.134661 4746 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 28 20:40:16 crc kubenswrapper[4746]: I0128 20:40:16.134674 4746 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 28 20:40:16 crc kubenswrapper[4746]: I0128 20:40:16.134691 4746 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 28 20:40:16 crc kubenswrapper[4746]: I0128 20:40:16.134703 4746 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-28T20:40:16Z","lastTransitionTime":"2026-01-28T20:40:16Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 28 20:40:16 crc kubenswrapper[4746]: I0128 20:40:16.200719 4746 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-apiserver_kube-apiserver-crc_f4b27818a5e8e43d0dc095d08835c792/kube-apiserver-check-endpoints/1.log" Jan 28 20:40:16 crc kubenswrapper[4746]: I0128 20:40:16.202444 4746 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" event={"ID":"f4b27818a5e8e43d0dc095d08835c792","Type":"ContainerStarted","Data":"64f8ae1fcf8b14a65196ffa6d5a50db39aa846dc4e6cc0b92ba54d9b5f6913e3"} Jan 28 20:40:16 crc kubenswrapper[4746]: I0128 20:40:16.202816 4746 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-kube-apiserver/kube-apiserver-crc" Jan 28 20:40:16 crc kubenswrapper[4746]: I0128 20:40:16.218342 4746 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"950d71b3-9b51-43dc-814a-d6a838723f78\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-28T20:39:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-28T20:39:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-28T20:39:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-28T20:39:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-28T20:39:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0411ecf7c85f5397d751adb7d4905977dcbd3d2e169c56ebbe26017192765602\\\",\\\"image\\\":
\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-28T20:39:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://319cbcaa9981b3c79eb0c8f970f0ad776fb255db10c72c9b6a13469906e9e5ec\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-28T20:39:33Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://eefda7da35cd9ff176512b5d1224ccadcb5debd0035a49822e82a9757d4a3f91\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-
cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-28T20:39:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://145f4f882924f06172df81d94fc9bd683ea1fc04889de5f6cfb5caece8227fd0\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-28T20:39:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-28T20:39:32Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-28T20:40:16Z is after 2025-08-24T17:21:41Z" Jan 28 20:40:16 crc kubenswrapper[4746]: I0128 20:40:16.231485 4746 status_manager.go:875] 
"Failed to update status for pod" pod="openshift-image-registry/node-ca-d8rwq" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"405572d8-50d2-4d7a-adf0-d8d6adea31ca\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-28T20:39:57Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-28T20:39:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-28T20:39:57Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-28T20:39:57Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://eba305a172c49420674e97d32590a7b2e4c7c2cd7406257508e0636cf42ea244\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-28T20:39:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mb4h4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\
\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-28T20:39:56Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-d8rwq\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-28T20:40:16Z is after 2025-08-24T17:21:41Z" Jan 28 20:40:16 crc kubenswrapper[4746]: I0128 20:40:16.236611 4746 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 28 20:40:16 crc kubenswrapper[4746]: I0128 20:40:16.236642 4746 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 28 20:40:16 crc kubenswrapper[4746]: I0128 20:40:16.236653 4746 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 28 20:40:16 crc kubenswrapper[4746]: I0128 20:40:16.236668 4746 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 28 20:40:16 crc kubenswrapper[4746]: I0128 20:40:16.236677 4746 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-28T20:40:16Z","lastTransitionTime":"2026-01-28T20:40:16Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 28 20:40:16 crc kubenswrapper[4746]: I0128 20:40:16.245650 4746 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-28T20:39:52Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-28T20:39:52Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-28T20:40:16Z is after 2025-08-24T17:21:41Z" Jan 28 20:40:16 crc kubenswrapper[4746]: I0128 20:40:16.259424 4746 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-28T20:39:52Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-28T20:39:52Z\\\",\\\"message\\\":\\\"containers with unready status: 
[network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-28T20:40:16Z is after 2025-08-24T17:21:41Z" Jan 28 20:40:16 crc kubenswrapper[4746]: I0128 20:40:16.270385 4746 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-9zvm7" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"de4fb5f0-7c65-4fdb-8389-d2c8462e130b\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-28T20:40:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-28T20:40:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-28T20:40:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-28T20:40:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://972f4c0079d8e04365abd93dc8d63e88dea04f427dbbbcf5e331cdeeff8c855c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-28T20:40:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jvkqz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://d2340a59c423445dd2b32da57d43146dd1cc3
54b737ae1b4d1875a0d40f62188\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-28T20:40:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jvkqz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-28T20:40:05Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-9zvm7\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-28T20:40:16Z is after 2025-08-24T17:21:41Z" Jan 28 20:40:16 crc kubenswrapper[4746]: I0128 20:40:16.285542 4746 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-28T20:39:52Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-28T20:39:52Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-28T20:40:16Z is after 2025-08-24T17:21:41Z" Jan 28 20:40:16 crc kubenswrapper[4746]: I0128 20:40:16.298363 4746 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-28T20:39:53Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-28T20:39:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-28T20:39:53Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5dff3e2075b8bbfb6af77c72362ccc37e1d7810a2d97c096cf501681c98699ab\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-28T20:39:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://772e165efa6d865a669c172958e0aea3112146637e354e4587b7e664454112ad\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af
0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-28T20:39:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-28T20:40:16Z is after 2025-08-24T17:21:41Z" Jan 28 20:40:16 crc kubenswrapper[4746]: I0128 20:40:16.309075 4746 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-gcrxx" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"1c3c93a1-7ddf-4339-9ca3-79f3753943b4\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-28T20:39:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-28T20:39:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-28T20:39:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-28T20:39:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5bd8d8214aa8e85e50747643b6354fb6e026866c926310fb2dd79760f916f468\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-28T20:39:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wn6hr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podI
Ps\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-28T20:39:52Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-gcrxx\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-28T20:40:16Z is after 2025-08-24T17:21:41Z" Jan 28 20:40:16 crc kubenswrapper[4746]: I0128 20:40:16.328641 4746 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-8vmvh" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"c4d15639-62fb-41b7-a1d4-6f51f3af6d99\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-28T20:39:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-28T20:39:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-28T20:39:53Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-28T20:39:53Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://2637734601bf1e43535b70bd62cc6aea7fc9f62d0d2a33c83602e7065a793be1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-28T20:39:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-kgrqs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://73e82be84afa7be760acda86ebf6c956d558dcbd4f37f049fba84e08c75fcd9f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-28T20:39:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\
\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-kgrqs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://17d09f7f46014da3713f76f111bca0a7ce8a24a154aa6144291d760d85e7d3ec\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-28T20:39:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-kgrqs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://b7ad895d0ff1ad570136d96671e1bde5c62882af50a407945ef0e7bc52727b88\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-28T20:39:54Z\\\"}},\\\"volumeMounts\\\":[{
\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-kgrqs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://af842ba33fdf06048038dd3e12c59f7a9d0867aa28acee1f2cc4571e7db28b88\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-28T20:39:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-kgrqs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://57a776a1daf1b17eb44d9fa09fe2f7ec01f647aa0a40d0d571736d2f7c2f8e4d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\"
,\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-28T20:39:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-kgrqs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://b8dbe081b6eaa82770425f55f41842f6e94b364cccd11dfd0a2c67d00e0863e9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b8dbe081b6eaa82770425f55f41842f6e94b364cccd11dfd0a2c67d00e0863e9\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2026-01-28T20:40:03Z\\\",\\\"message\\\":\\\"8s.io/network-policy-api/pkg/client/informers/externalversions/factory.go:141\\\\nI0128 20:40:03.124373 6172 handler.go:190] Sending *v1.Node event handler 2 for removal\\\\nI0128 20:40:03.124394 6172 handler.go:190] Sending *v1.Node event handler 7 for removal\\\\nI0128 20:40:03.124408 6172 handler.go:190] Sending *v1.EgressFirewall event handler 
9 for removal\\\\nI0128 20:40:03.124440 6172 handler.go:190] Sending *v1.Pod event handler 3 for removal\\\\nI0128 20:40:03.124452 6172 handler.go:190] Sending *v1.Pod event handler 6 for removal\\\\nI0128 20:40:03.124447 6172 handler.go:208] Removed *v1.Node event handler 2\\\\nI0128 20:40:03.124467 6172 handler.go:190] Sending *v1.NetworkPolicy event handler 4 for removal\\\\nI0128 20:40:03.124473 6172 handler.go:208] Removed *v1.Node event handler 7\\\\nI0128 20:40:03.124475 6172 handler.go:208] Removed *v1.EgressFirewall event handler 9\\\\nI0128 20:40:03.124488 6172 handler.go:208] Removed *v1.Pod event handler 3\\\\nI0128 20:40:03.124488 6172 handler.go:190] Sending *v1.EgressIP event handler 8 for removal\\\\nI0128 20:40:03.124508 6172 handler.go:208] Removed *v1.NetworkPolicy event handler 4\\\\nI0128 20:40:03.124525 6172 handler.go:208] Removed *v1.Pod event handler 6\\\\nI0128 20:40:03.124578 6172 factory.go:656] Stopping watch factory\\\\nI0128 20:40:03.124590 6172 handler.go:208] Removed *v1.EgressIP ev\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-01-28T20:40:02Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 10s restarting failed container=ovnkube-controller 
pod=ovnkube-node-8vmvh_openshift-ovn-kubernetes(c4d15639-62fb-41b7-a1d4-6f51f3af6d99)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kube
rnetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-kgrqs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://cdd60109dba5f45418dd5bc29c649a50aae8479fa77fc3dc50ca4f7c3042e3fc\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-28T20:39:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-kgrqs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://45e59dfe364d511e17b8264a31157501de3ed23e7125d79979df83d328242328\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://45e59dfe364d511e17
b8264a31157501de3ed23e7125d79979df83d328242328\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-28T20:39:53Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-28T20:39:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-kgrqs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-28T20:39:53Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-8vmvh\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-28T20:40:16Z is after 2025-08-24T17:21:41Z" Jan 28 20:40:16 crc kubenswrapper[4746]: I0128 20:40:16.339350 4746 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 28 20:40:16 crc kubenswrapper[4746]: I0128 20:40:16.339371 4746 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 28 20:40:16 crc kubenswrapper[4746]: I0128 20:40:16.339379 4746 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 28 20:40:16 crc kubenswrapper[4746]: I0128 20:40:16.339390 4746 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 28 20:40:16 crc kubenswrapper[4746]: I0128 20:40:16.339399 4746 setters.go:603] "Node became not ready" node="crc" 
condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-28T20:40:16Z","lastTransitionTime":"2026-01-28T20:40:16Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 28 20:40:16 crc kubenswrapper[4746]: I0128 20:40:16.345341 4746 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-qhpvf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"cdf26de0-b602-4bdf-b492-65b3b6b31434\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-28T20:39:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-28T20:39:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-28T20:39:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-28T20:39:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f80345dfd4de46e278611370e6c337968cb1ba7ed8275457deea5871704f8367\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\
\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-28T20:39:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jnpsl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-28T20:
39:53Z\\\"}}\" for pod \"openshift-multus\"/\"multus-qhpvf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-28T20:40:16Z is after 2025-08-24T17:21:41Z" Jan 28 20:40:16 crc kubenswrapper[4746]: I0128 20:40:16.357328 4746 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-2blg6" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"f60a5487-5012-4cc9-ad94-5dfb4957d74e\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-28T20:40:07Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-28T20:40:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-28T20:40:07Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-28T20:40:07Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-l2www\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-l2www\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-28T20:40:07Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-2blg6\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-28T20:40:16Z is after 2025-08-24T17:21:41Z" Jan 28 20:40:16 crc 
kubenswrapper[4746]: I0128 20:40:16.374312 4746 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"64499f0d-5c10-43a9-b479-9b6c7ef2fb9c\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-28T20:39:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-28T20:39:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-28T20:39:32Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-28T20:39:32Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-28T20:39:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://9a304b12b4a1e30cbbbb16aa4f91514f1b1a64c716cf8ac023803a279b310f9c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-28T20:39:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://6af41f85c56be6c24127bec0aee8e02562dbb04bd22e57c6887f6f5e914a7530\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-28T20:39:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri
-o://f31e9e80576ec4eb59ec92039601fd94bd323d29493c353f9ec6b9c61187b0d3\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-28T20:39:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://64f8ae1fcf8b14a65196ffa6d5a50db39aa846dc4e6cc0b92ba54d9b5f6913e3\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://56b250a192cfa698ccd7ac8233b75b9acbf2347549c712b47ccc1b89f144aa83\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-01-28T20:39:52Z\\\",\\\"message\\\":\\\"le observer\\\\nW0128 20:39:52.695680 1 builder.go:272] unable to get owner reference (falling back to namespace): pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI0128 20:39:52.695807 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0128 20:39:52.696767 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" 
name=\\\\\\\"serving-cert::/tmp/serving-cert-2500116620/tls.crt::/tmp/serving-cert-2500116620/tls.key\\\\\\\"\\\\nI0128 20:39:52.900914 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI0128 20:39:52.915714 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI0128 20:39:52.915750 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI0128 20:39:52.916035 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI0128 20:39:52.916045 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI0128 20:39:52.925040 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW0128 20:39:52.925098 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0128 20:39:52.925104 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0128 20:39:52.925109 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW0128 20:39:52.925113 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW0128 20:39:52.925116 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW0128 20:39:52.925119 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nI0128 20:39:52.925416 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nF0128 20:39:52.926926 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not 
found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-01-28T20:39:47Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":2,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-28T20:40:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://03a94dee0b9663441e0ecec16b121c5be4e5c557f17c929b8edfb0d7ba00c3d5\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-28T20:39:35Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c5808a729190035a73529ac58efb0790f2d9c3b98c5f1a8017389e4ca1738c4c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://c5808a729190035a73529ac58efb0790f2d9c3b98c5f1a8017389e4ca1738c4c\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-28T20:39:33Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"st
artedAt\\\":\\\"2026-01-28T20:39:33Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-28T20:39:32Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-28T20:40:16Z is after 2025-08-24T17:21:41Z" Jan 28 20:40:16 crc kubenswrapper[4746]: I0128 20:40:16.387994 4746 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-28T20:39:54Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-28T20:39:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-28T20:39:54Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://2f64131d62eca6c3a42e9177d14fa39ed6435e536f60ef5d25ff6bf061d2cc0a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState
\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-28T20:39:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-28T20:40:16Z is after 2025-08-24T17:21:41Z" Jan 28 20:40:16 crc kubenswrapper[4746]: I0128 20:40:16.398839 4746 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-28T20:39:57Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-28T20:39:57Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-28T20:39:57Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://83a275ec9868f2fd9e24135d5a05c27caf843340c144070952291bb6a3715035\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-28T20:39:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has 
expired or is not yet valid: current time 2026-01-28T20:40:16Z is after 2025-08-24T17:21:41Z" Jan 28 20:40:16 crc kubenswrapper[4746]: I0128 20:40:16.410683 4746 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-6wrnw" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"6dc8b546-9734-4082-b2b3-2bafe3f1564d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-28T20:39:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-28T20:39:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-28T20:39:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-28T20:39:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://50ec1ea913ec63a096032bd968bda5c5c8879e16a08e6a57727750517d9af038\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-28T20:39:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-r
bac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-z84f9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://66d250466155027405bf90c4e5ed09388238d4cee63604e28486a16778f9d188\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-28T20:39:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-z84f9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-28T20:39:53Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-6wrnw\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-28T20:40:16Z is after 2025-08-24T17:21:41Z" Jan 28 20:40:16 crc kubenswrapper[4746]: I0128 20:40:16.426805 4746 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-ht6hp" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"beb2b795-6bf4-4d38-89f7-bcb5512c3e61\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-28T20:39:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-28T20:40:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-28T20:40:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-28T20:40:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://564d42c4c25f218c1a1f52449d2f3d941c1dad920c8100d1d908cadf7cf46607\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-28T20:40:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rr62b\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b2d8bf207108ac2304cd04e34cd9843fcb755ac8610a3f88448538d1e99359f6\
\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b2d8bf207108ac2304cd04e34cd9843fcb755ac8610a3f88448538d1e99359f6\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-28T20:39:54Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-28T20:39:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rr62b\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://40c78d39cb70eba245771f4d85d2c8bc7b77427ea216ad3b89749d24fd550098\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://40c78d39cb70eba245771f4d85d2c8bc7b77427ea216ad3b89749d24fd550098\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-28T20:39:55Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-28T20:39:55Z\\\"}},\\\
"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rr62b\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://498028ca0bd260ca2c4d7b0231ea54a5a6ff5b302f720fbd209a4f640507258b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://498028ca0bd260ca2c4d7b0231ea54a5a6ff5b302f720fbd209a4f640507258b\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-28T20:39:56Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-28T20:39:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rr62b\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://661d4
a9031b6789f38ed1ac8e3c40d72182709130e1680ecdad86c07c1f34b7b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://661d4a9031b6789f38ed1ac8e3c40d72182709130e1680ecdad86c07c1f34b7b\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-28T20:39:57Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-28T20:39:57Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rr62b\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://a49ffe0bbe4151671dc8803d7ab61cb9b3a2eae064b81e2830bc074787f4754e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://a49ffe0bbe4151671dc8803d7ab61cb9b3a2eae064b81e2830bc074787f4754e\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-28T20:39:58Z\\\",\\\"reason\\\":\\\"Co
mpleted\\\",\\\"startedAt\\\":\\\"2026-01-28T20:39:58Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rr62b\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://cd521e9ecb53f581b5732a4d018f92007063ce0912e6d760fe3c920218c23b14\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://cd521e9ecb53f581b5732a4d018f92007063ce0912e6d760fe3c920218c23b14\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-28T20:39:59Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-28T20:39:59Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rr62b\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-28T20:39:53Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-ht6hp\": Internal error occurred: failed 
calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-28T20:40:16Z is after 2025-08-24T17:21:41Z" Jan 28 20:40:16 crc kubenswrapper[4746]: I0128 20:40:16.432187 4746 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-ovn-kubernetes/ovnkube-node-8vmvh" Jan 28 20:40:16 crc kubenswrapper[4746]: I0128 20:40:16.433045 4746 scope.go:117] "RemoveContainer" containerID="b8dbe081b6eaa82770425f55f41842f6e94b364cccd11dfd0a2c67d00e0863e9" Jan 28 20:40:16 crc kubenswrapper[4746]: I0128 20:40:16.441326 4746 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 28 20:40:16 crc kubenswrapper[4746]: I0128 20:40:16.441359 4746 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 28 20:40:16 crc kubenswrapper[4746]: I0128 20:40:16.441372 4746 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 28 20:40:16 crc kubenswrapper[4746]: I0128 20:40:16.441389 4746 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 28 20:40:16 crc kubenswrapper[4746]: I0128 20:40:16.441401 4746 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-28T20:40:16Z","lastTransitionTime":"2026-01-28T20:40:16Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 28 20:40:16 crc kubenswrapper[4746]: I0128 20:40:16.545098 4746 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 28 20:40:16 crc kubenswrapper[4746]: I0128 20:40:16.545710 4746 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 28 20:40:16 crc kubenswrapper[4746]: I0128 20:40:16.545722 4746 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 28 20:40:16 crc kubenswrapper[4746]: I0128 20:40:16.545743 4746 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 28 20:40:16 crc kubenswrapper[4746]: I0128 20:40:16.545757 4746 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-28T20:40:16Z","lastTransitionTime":"2026-01-28T20:40:16Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 28 20:40:16 crc kubenswrapper[4746]: I0128 20:40:16.648015 4746 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 28 20:40:16 crc kubenswrapper[4746]: I0128 20:40:16.648052 4746 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 28 20:40:16 crc kubenswrapper[4746]: I0128 20:40:16.648061 4746 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 28 20:40:16 crc kubenswrapper[4746]: I0128 20:40:16.648111 4746 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 28 20:40:16 crc kubenswrapper[4746]: I0128 20:40:16.648130 4746 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-28T20:40:16Z","lastTransitionTime":"2026-01-28T20:40:16Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 28 20:40:16 crc kubenswrapper[4746]: I0128 20:40:16.751398 4746 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 28 20:40:16 crc kubenswrapper[4746]: I0128 20:40:16.751463 4746 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 28 20:40:16 crc kubenswrapper[4746]: I0128 20:40:16.751482 4746 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 28 20:40:16 crc kubenswrapper[4746]: I0128 20:40:16.751513 4746 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 28 20:40:16 crc kubenswrapper[4746]: I0128 20:40:16.751534 4746 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-28T20:40:16Z","lastTransitionTime":"2026-01-28T20:40:16Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 28 20:40:16 crc kubenswrapper[4746]: I0128 20:40:16.827411 4746 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2025-11-22 03:48:14.6901677 +0000 UTC Jan 28 20:40:16 crc kubenswrapper[4746]: I0128 20:40:16.838322 4746 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-2blg6" Jan 28 20:40:16 crc kubenswrapper[4746]: I0128 20:40:16.838426 4746 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Jan 28 20:40:16 crc kubenswrapper[4746]: I0128 20:40:16.838435 4746 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Jan 28 20:40:16 crc kubenswrapper[4746]: E0128 20:40:16.838590 4746 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-2blg6" podUID="f60a5487-5012-4cc9-ad94-5dfb4957d74e" Jan 28 20:40:16 crc kubenswrapper[4746]: E0128 20:40:16.838698 4746 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Jan 28 20:40:16 crc kubenswrapper[4746]: E0128 20:40:16.838861 4746 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Jan 28 20:40:16 crc kubenswrapper[4746]: I0128 20:40:16.853847 4746 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 28 20:40:16 crc kubenswrapper[4746]: I0128 20:40:16.853886 4746 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 28 20:40:16 crc kubenswrapper[4746]: I0128 20:40:16.853897 4746 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 28 20:40:16 crc kubenswrapper[4746]: I0128 20:40:16.853913 4746 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 28 20:40:16 crc kubenswrapper[4746]: I0128 20:40:16.853923 4746 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-28T20:40:16Z","lastTransitionTime":"2026-01-28T20:40:16Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 28 20:40:16 crc kubenswrapper[4746]: I0128 20:40:16.957942 4746 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 28 20:40:16 crc kubenswrapper[4746]: I0128 20:40:16.958004 4746 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 28 20:40:16 crc kubenswrapper[4746]: I0128 20:40:16.958026 4746 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 28 20:40:16 crc kubenswrapper[4746]: I0128 20:40:16.958047 4746 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 28 20:40:16 crc kubenswrapper[4746]: I0128 20:40:16.958058 4746 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-28T20:40:16Z","lastTransitionTime":"2026-01-28T20:40:16Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 28 20:40:17 crc kubenswrapper[4746]: I0128 20:40:17.060691 4746 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 28 20:40:17 crc kubenswrapper[4746]: I0128 20:40:17.060739 4746 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 28 20:40:17 crc kubenswrapper[4746]: I0128 20:40:17.060750 4746 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 28 20:40:17 crc kubenswrapper[4746]: I0128 20:40:17.060767 4746 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 28 20:40:17 crc kubenswrapper[4746]: I0128 20:40:17.060776 4746 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-28T20:40:17Z","lastTransitionTime":"2026-01-28T20:40:17Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 28 20:40:17 crc kubenswrapper[4746]: I0128 20:40:17.170843 4746 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 28 20:40:17 crc kubenswrapper[4746]: I0128 20:40:17.170893 4746 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 28 20:40:17 crc kubenswrapper[4746]: I0128 20:40:17.170905 4746 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 28 20:40:17 crc kubenswrapper[4746]: I0128 20:40:17.170926 4746 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 28 20:40:17 crc kubenswrapper[4746]: I0128 20:40:17.170939 4746 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-28T20:40:17Z","lastTransitionTime":"2026-01-28T20:40:17Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 28 20:40:17 crc kubenswrapper[4746]: I0128 20:40:17.207449 4746 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-8vmvh_c4d15639-62fb-41b7-a1d4-6f51f3af6d99/ovnkube-controller/1.log" Jan 28 20:40:17 crc kubenswrapper[4746]: I0128 20:40:17.209763 4746 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-8vmvh" event={"ID":"c4d15639-62fb-41b7-a1d4-6f51f3af6d99","Type":"ContainerStarted","Data":"53be073387414da7f1a5b73960206c0a04b4ef439d6c51bd22ceefbb5d723b76"} Jan 28 20:40:17 crc kubenswrapper[4746]: I0128 20:40:17.210511 4746 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-ovn-kubernetes/ovnkube-node-8vmvh" Jan 28 20:40:17 crc kubenswrapper[4746]: I0128 20:40:17.224051 4746 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-6wrnw" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"6dc8b546-9734-4082-b2b3-2bafe3f1564d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-28T20:39:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-28T20:39:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-28T20:39:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-28T20:39:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://50ec1ea9
13ec63a096032bd968bda5c5c8879e16a08e6a57727750517d9af038\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-28T20:39:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-z84f9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://66d250466155027405bf90c4e5ed09388238d4cee63604e28486a16778f9d188\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-28T20:39:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-z84f9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\
\"startTime\\\":\\\"2026-01-28T20:39:53Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-6wrnw\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-28T20:40:17Z is after 2025-08-24T17:21:41Z" Jan 28 20:40:17 crc kubenswrapper[4746]: I0128 20:40:17.239813 4746 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-ht6hp" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"beb2b795-6bf4-4d38-89f7-bcb5512c3e61\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-28T20:39:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-28T20:40:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-28T20:40:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-28T20:40:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://564d42c4c25f218c1a1f52449d2f3d941c1dad920c8100d1d908cadf7cf46607\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b
26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-28T20:40:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rr62b\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b2d8bf207108ac2304cd04e34cd9843fcb755ac8610a3f88448538d1e99359f6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b2d8bf207108ac2304cd04e34cd9843fcb755ac8610a3f88448538d1e99359f6\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-28T20:39:54Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-28T20:39:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rr62b\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://40c78d39cb70eba245771f4d85d2c8bc7b77427ea216ad3b89749d24fd550098\\\",\\\"image\\\":\\\"quay.io/openshif
t-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://40c78d39cb70eba245771f4d85d2c8bc7b77427ea216ad3b89749d24fd550098\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-28T20:39:55Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-28T20:39:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rr62b\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://498028ca0bd260ca2c4d7b0231ea54a5a6ff5b302f720fbd209a4f640507258b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://498028ca0bd260ca2c4d7b0231ea54a5a6ff5b302f720fbd209a4f640507258b\\\",\\\"exitCode\\\":0,\\\"finishedAt\
\\":\\\"2026-01-28T20:39:56Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-28T20:39:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rr62b\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://661d4a9031b6789f38ed1ac8e3c40d72182709130e1680ecdad86c07c1f34b7b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://661d4a9031b6789f38ed1ac8e3c40d72182709130e1680ecdad86c07c1f34b7b\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-28T20:39:57Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-28T20:39:57Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rr62b\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://a49ffe0bbe4151671dc8803d7ab61cb9b3a2eae064b81e2830b
c074787f4754e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://a49ffe0bbe4151671dc8803d7ab61cb9b3a2eae064b81e2830bc074787f4754e\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-28T20:39:58Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-28T20:39:58Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rr62b\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://cd521e9ecb53f581b5732a4d018f92007063ce0912e6d760fe3c920218c23b14\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://cd521e9ecb53f581b5732a4d018f92007063ce0912e6d760fe3c920218c23b14\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-28T20:39:59Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-28T20:
39:59Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rr62b\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-28T20:39:53Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-ht6hp\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-28T20:40:17Z is after 2025-08-24T17:21:41Z" Jan 28 20:40:17 crc kubenswrapper[4746]: I0128 20:40:17.253495 4746 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-qhpvf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"cdf26de0-b602-4bdf-b492-65b3b6b31434\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-28T20:39:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-28T20:39:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-28T20:39:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-28T20:39:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f80345dfd4de46e278611370e6c337968cb1ba7ed8275457deea5871704f8367\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-28T20:39:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\
"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jnpsl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-28T20:39:53Z\\\"}}\" for pod \"openshift-multus\"/\"multus-qhpvf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-28T20:40:17Z is after 2025-08-24T17:21:41Z" Jan 28 20:40:17 crc kubenswrapper[4746]: I0128 20:40:17.267490 4746 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-2blg6" err="failed to 
patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"f60a5487-5012-4cc9-ad94-5dfb4957d74e\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-28T20:40:07Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-28T20:40:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-28T20:40:07Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-28T20:40:07Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-l2www\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-l2www\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-28T20:40:07Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-2blg6\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-28T20:40:17Z is after 2025-08-24T17:21:41Z" Jan 28 20:40:17 crc 
kubenswrapper[4746]: I0128 20:40:17.274023 4746 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 28 20:40:17 crc kubenswrapper[4746]: I0128 20:40:17.274099 4746 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 28 20:40:17 crc kubenswrapper[4746]: I0128 20:40:17.274110 4746 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 28 20:40:17 crc kubenswrapper[4746]: I0128 20:40:17.274131 4746 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 28 20:40:17 crc kubenswrapper[4746]: I0128 20:40:17.274168 4746 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-28T20:40:17Z","lastTransitionTime":"2026-01-28T20:40:17Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 28 20:40:17 crc kubenswrapper[4746]: I0128 20:40:17.282335 4746 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"64499f0d-5c10-43a9-b479-9b6c7ef2fb9c\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-28T20:39:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-28T20:39:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-28T20:39:32Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-28T20:39:32Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-28T20:39:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://9a304b12b4a1e30cbbbb16aa4f91514f1b1a64c716cf8ac023803a279b310f9c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-28T20:39:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://6af41f85c56be6c24127bec0aee8e02562dbb04bd22e57c6887f6f5e914a7530\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-28T20:39:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri
-o://f31e9e80576ec4eb59ec92039601fd94bd323d29493c353f9ec6b9c61187b0d3\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-28T20:39:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://64f8ae1fcf8b14a65196ffa6d5a50db39aa846dc4e6cc0b92ba54d9b5f6913e3\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://56b250a192cfa698ccd7ac8233b75b9acbf2347549c712b47ccc1b89f144aa83\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-01-28T20:39:52Z\\\",\\\"message\\\":\\\"le observer\\\\nW0128 20:39:52.695680 1 builder.go:272] unable to get owner reference (falling back to namespace): pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI0128 20:39:52.695807 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0128 20:39:52.696767 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" 
name=\\\\\\\"serving-cert::/tmp/serving-cert-2500116620/tls.crt::/tmp/serving-cert-2500116620/tls.key\\\\\\\"\\\\nI0128 20:39:52.900914 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI0128 20:39:52.915714 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI0128 20:39:52.915750 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI0128 20:39:52.916035 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI0128 20:39:52.916045 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI0128 20:39:52.925040 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW0128 20:39:52.925098 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0128 20:39:52.925104 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0128 20:39:52.925109 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW0128 20:39:52.925113 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW0128 20:39:52.925116 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW0128 20:39:52.925119 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nI0128 20:39:52.925416 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nF0128 20:39:52.926926 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not 
found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-01-28T20:39:47Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":2,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-28T20:40:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://03a94dee0b9663441e0ecec16b121c5be4e5c557f17c929b8edfb0d7ba00c3d5\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-28T20:39:35Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c5808a729190035a73529ac58efb0790f2d9c3b98c5f1a8017389e4ca1738c4c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://c5808a729190035a73529ac58efb0790f2d9c3b98c5f1a8017389e4ca1738c4c\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-28T20:39:33Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"st
artedAt\\\":\\\"2026-01-28T20:39:33Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-28T20:39:32Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-28T20:40:17Z is after 2025-08-24T17:21:41Z" Jan 28 20:40:17 crc kubenswrapper[4746]: I0128 20:40:17.298003 4746 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-28T20:39:54Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-28T20:39:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-28T20:39:54Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://2f64131d62eca6c3a42e9177d14fa39ed6435e536f60ef5d25ff6bf061d2cc0a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState
\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-28T20:39:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-28T20:40:17Z is after 2025-08-24T17:21:41Z" Jan 28 20:40:17 crc kubenswrapper[4746]: I0128 20:40:17.313375 4746 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-28T20:39:57Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-28T20:39:57Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-28T20:39:57Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://83a275ec9868f2fd9e24135d5a05c27caf843340c144070952291bb6a3715035\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-28T20:39:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has 
expired or is not yet valid: current time 2026-01-28T20:40:17Z is after 2025-08-24T17:21:41Z" Jan 28 20:40:17 crc kubenswrapper[4746]: I0128 20:40:17.324501 4746 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-d8rwq" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"405572d8-50d2-4d7a-adf0-d8d6adea31ca\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-28T20:39:57Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-28T20:39:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-28T20:39:57Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-28T20:39:57Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://eba305a172c49420674e97d32590a7b2e4c7c2cd7406257508e0636cf42ea244\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-28T20:39:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"hos
t\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mb4h4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-28T20:39:56Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-d8rwq\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-28T20:40:17Z is after 2025-08-24T17:21:41Z" Jan 28 20:40:17 crc kubenswrapper[4746]: I0128 20:40:17.337979 4746 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"950d71b3-9b51-43dc-814a-d6a838723f78\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-28T20:39:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-28T20:39:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-28T20:39:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-28T20:39:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-28T20:39:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0411ecf7c85f5397d751adb7d4905977dcbd3d2e169c56ebbe26017192765602\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-28T20:39:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://319cbcaa9981b3c79eb0c8f970f0ad776fb255db10c72c9b6a13469906e9e5ec\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-
art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-28T20:39:33Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://eefda7da35cd9ff176512b5d1224ccadcb5debd0035a49822e82a9757d4a3f91\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-28T20:39:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://145f4f882924f06172df81d94fc9bd683ea1fc04889de5f6cfb5caece8227fd0\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"started
At\\\":\\\"2026-01-28T20:39:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-28T20:39:32Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-28T20:40:17Z is after 2025-08-24T17:21:41Z" Jan 28 20:40:17 crc kubenswrapper[4746]: I0128 20:40:17.357453 4746 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-28T20:39:52Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-28T20:39:52Z\\\",\\\"message\\\":\\\"containers with unready status: 
[check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-28T20:40:17Z is after 2025-08-24T17:21:41Z" Jan 28 20:40:17 crc kubenswrapper[4746]: I0128 20:40:17.372951 4746 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-28T20:39:52Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-28T20:39:52Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-28T20:40:17Z is after 2025-08-24T17:21:41Z" Jan 28 20:40:17 crc kubenswrapper[4746]: I0128 20:40:17.376876 4746 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 28 20:40:17 crc kubenswrapper[4746]: I0128 20:40:17.376919 4746 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 28 20:40:17 crc kubenswrapper[4746]: I0128 20:40:17.376931 4746 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 28 20:40:17 crc kubenswrapper[4746]: I0128 20:40:17.376951 4746 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 28 20:40:17 crc kubenswrapper[4746]: I0128 20:40:17.376964 4746 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-28T20:40:17Z","lastTransitionTime":"2026-01-28T20:40:17Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady 
message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 28 20:40:17 crc kubenswrapper[4746]: I0128 20:40:17.386904 4746 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-9zvm7" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"de4fb5f0-7c65-4fdb-8389-d2c8462e130b\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-28T20:40:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-28T20:40:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-28T20:40:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-28T20:40:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://972f4c0079d8e04365abd93dc8d63e88dea04f427dbbbcf5e331cdeeff8c855c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-28T20:40:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"
ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jvkqz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://d2340a59c423445dd2b32da57d43146dd1cc354b737ae1b4d1875a0d40f62188\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-28T20:40:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jvkqz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-28T20:40:05Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-9zvm7\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-28T20:40:17Z is after 2025-08-24T17:21:41Z" Jan 28 20:40:17 crc kubenswrapper[4746]: I0128 20:40:17.401267 4746 status_manager.go:875] "Failed 
to update status for pod" pod="openshift-dns/node-resolver-gcrxx" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"1c3c93a1-7ddf-4339-9ca3-79f3753943b4\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-28T20:39:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-28T20:39:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-28T20:39:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-28T20:39:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5bd8d8214aa8e85e50747643b6354fb6e026866c926310fb2dd79760f916f468\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-28T20:39:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wn6hr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"19
2.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-28T20:39:52Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-gcrxx\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-28T20:40:17Z is after 2025-08-24T17:21:41Z" Jan 28 20:40:17 crc kubenswrapper[4746]: I0128 20:40:17.421817 4746 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-8vmvh" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"c4d15639-62fb-41b7-a1d4-6f51f3af6d99\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-28T20:39:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-28T20:39:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-28T20:39:53Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-28T20:39:53Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://2637734601bf1e43535b70bd62cc6aea7fc9f62d0d2a33c83602e7065a793be1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-28T20:39:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-kgrqs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://73e82be84afa7be760acda86ebf6c956d558dcbd4f37f049fba84e08c75fcd9f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-28T20:39:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\
\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-kgrqs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://17d09f7f46014da3713f76f111bca0a7ce8a24a154aa6144291d760d85e7d3ec\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-28T20:39:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-kgrqs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://b7ad895d0ff1ad570136d96671e1bde5c62882af50a407945ef0e7bc52727b88\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-28T20:39:54Z\\\"}},\\\"volumeMounts\\\":[{
\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-kgrqs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://af842ba33fdf06048038dd3e12c59f7a9d0867aa28acee1f2cc4571e7db28b88\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-28T20:39:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-kgrqs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://57a776a1daf1b17eb44d9fa09fe2f7ec01f647aa0a40d0d571736d2f7c2f8e4d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\"
,\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-28T20:39:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-kgrqs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://53be073387414da7f1a5b73960206c0a04b4ef439d6c51bd22ceefbb5d723b76\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b8dbe081b6eaa82770425f55f41842f6e94b364cccd11dfd0a2c67d00e0863e9\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2026-01-28T20:40:03Z\\\",\\\"message\\\":\\\"8s.io/network-policy-api/pkg/client/informers/externalversions/factory.go:141\\\\nI0128 20:40:03.124373 6172 handler.go:190] Sending *v1.Node event handler 2 for removal\\\\nI0128 20:40:03.124394 6172 handler.go:190] Sending *v1.Node event handler 7 for removal\\\\nI0128 20:40:03.124408 6172 handler.go:190] Sending *v1.EgressFirewall event handler 
9 for removal\\\\nI0128 20:40:03.124440 6172 handler.go:190] Sending *v1.Pod event handler 3 for removal\\\\nI0128 20:40:03.124452 6172 handler.go:190] Sending *v1.Pod event handler 6 for removal\\\\nI0128 20:40:03.124447 6172 handler.go:208] Removed *v1.Node event handler 2\\\\nI0128 20:40:03.124467 6172 handler.go:190] Sending *v1.NetworkPolicy event handler 4 for removal\\\\nI0128 20:40:03.124473 6172 handler.go:208] Removed *v1.Node event handler 7\\\\nI0128 20:40:03.124475 6172 handler.go:208] Removed *v1.EgressFirewall event handler 9\\\\nI0128 20:40:03.124488 6172 handler.go:208] Removed *v1.Pod event handler 3\\\\nI0128 20:40:03.124488 6172 handler.go:190] Sending *v1.EgressIP event handler 8 for removal\\\\nI0128 20:40:03.124508 6172 handler.go:208] Removed *v1.NetworkPolicy event handler 4\\\\nI0128 20:40:03.124525 6172 handler.go:208] Removed *v1.Pod event handler 6\\\\nI0128 20:40:03.124578 6172 factory.go:656] Stopping watch factory\\\\nI0128 20:40:03.124590 6172 handler.go:208] Removed *v1.EgressIP 
ev\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-01-28T20:40:02Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":2,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-28T20:40:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\
":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-kgrqs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://cdd60109dba5f45418dd5bc29c649a50aae8479fa77fc3dc50ca4f7c3042e3fc\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-28T20:39:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-kgrqs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://45e59dfe364d511e17b8264a31157501de3ed23e7125d79979df83d328242328\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\
\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://45e59dfe364d511e17b8264a31157501de3ed23e7125d79979df83d328242328\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-28T20:39:53Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-28T20:39:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-kgrqs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-28T20:39:53Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-8vmvh\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-28T20:40:17Z is after 2025-08-24T17:21:41Z" Jan 28 20:40:17 crc kubenswrapper[4746]: I0128 20:40:17.441943 4746 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-28T20:39:52Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: 
[networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-28T20:39:52Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-28T20:40:17Z is after 2025-08-24T17:21:41Z" Jan 28 20:40:17 crc kubenswrapper[4746]: I0128 20:40:17.458476 4746 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-28T20:39:53Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-28T20:39:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-28T20:39:53Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5dff3e2075b8bbfb6af77c72362ccc37e1d7810a2d97c096cf501681c98699ab\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-28T20:39:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://772e165efa6d865a669c172958e0aea3112146637e354e4587b7e664454112ad\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af
0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-28T20:39:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-28T20:40:17Z is after 2025-08-24T17:21:41Z" Jan 28 20:40:17 crc kubenswrapper[4746]: I0128 20:40:17.479681 4746 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 28 20:40:17 crc kubenswrapper[4746]: I0128 20:40:17.479722 4746 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 28 20:40:17 crc kubenswrapper[4746]: I0128 20:40:17.479730 4746 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 28 20:40:17 crc kubenswrapper[4746]: I0128 20:40:17.479747 4746 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 28 20:40:17 crc kubenswrapper[4746]: I0128 20:40:17.479757 4746 setters.go:603] "Node became not ready" node="crc" 
condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-28T20:40:17Z","lastTransitionTime":"2026-01-28T20:40:17Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 28 20:40:17 crc kubenswrapper[4746]: I0128 20:40:17.582695 4746 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 28 20:40:17 crc kubenswrapper[4746]: I0128 20:40:17.582781 4746 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 28 20:40:17 crc kubenswrapper[4746]: I0128 20:40:17.582802 4746 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 28 20:40:17 crc kubenswrapper[4746]: I0128 20:40:17.582829 4746 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 28 20:40:17 crc kubenswrapper[4746]: I0128 20:40:17.582850 4746 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-28T20:40:17Z","lastTransitionTime":"2026-01-28T20:40:17Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 28 20:40:17 crc kubenswrapper[4746]: I0128 20:40:17.686257 4746 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 28 20:40:17 crc kubenswrapper[4746]: I0128 20:40:17.686333 4746 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 28 20:40:17 crc kubenswrapper[4746]: I0128 20:40:17.686349 4746 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 28 20:40:17 crc kubenswrapper[4746]: I0128 20:40:17.686377 4746 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 28 20:40:17 crc kubenswrapper[4746]: I0128 20:40:17.686392 4746 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-28T20:40:17Z","lastTransitionTime":"2026-01-28T20:40:17Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 28 20:40:17 crc kubenswrapper[4746]: I0128 20:40:17.788925 4746 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 28 20:40:17 crc kubenswrapper[4746]: I0128 20:40:17.788971 4746 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 28 20:40:17 crc kubenswrapper[4746]: I0128 20:40:17.788981 4746 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 28 20:40:17 crc kubenswrapper[4746]: I0128 20:40:17.788997 4746 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 28 20:40:17 crc kubenswrapper[4746]: I0128 20:40:17.789007 4746 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-28T20:40:17Z","lastTransitionTime":"2026-01-28T20:40:17Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 28 20:40:17 crc kubenswrapper[4746]: I0128 20:40:17.828155 4746 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2025-12-29 06:06:38.813741475 +0000 UTC Jan 28 20:40:17 crc kubenswrapper[4746]: I0128 20:40:17.835192 4746 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Jan 28 20:40:17 crc kubenswrapper[4746]: E0128 20:40:17.835455 4746 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Jan 28 20:40:17 crc kubenswrapper[4746]: I0128 20:40:17.892610 4746 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 28 20:40:17 crc kubenswrapper[4746]: I0128 20:40:17.892727 4746 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 28 20:40:17 crc kubenswrapper[4746]: I0128 20:40:17.892774 4746 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 28 20:40:17 crc kubenswrapper[4746]: I0128 20:40:17.892798 4746 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 28 20:40:17 crc kubenswrapper[4746]: I0128 20:40:17.892812 4746 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-28T20:40:17Z","lastTransitionTime":"2026-01-28T20:40:17Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 28 20:40:17 crc kubenswrapper[4746]: I0128 20:40:17.996122 4746 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 28 20:40:17 crc kubenswrapper[4746]: I0128 20:40:17.996191 4746 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 28 20:40:17 crc kubenswrapper[4746]: I0128 20:40:17.996205 4746 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 28 20:40:17 crc kubenswrapper[4746]: I0128 20:40:17.996229 4746 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 28 20:40:17 crc kubenswrapper[4746]: I0128 20:40:17.996243 4746 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-28T20:40:17Z","lastTransitionTime":"2026-01-28T20:40:17Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 28 20:40:18 crc kubenswrapper[4746]: I0128 20:40:18.099566 4746 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 28 20:40:18 crc kubenswrapper[4746]: I0128 20:40:18.099615 4746 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 28 20:40:18 crc kubenswrapper[4746]: I0128 20:40:18.099626 4746 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 28 20:40:18 crc kubenswrapper[4746]: I0128 20:40:18.099644 4746 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 28 20:40:18 crc kubenswrapper[4746]: I0128 20:40:18.099657 4746 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-28T20:40:18Z","lastTransitionTime":"2026-01-28T20:40:18Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 28 20:40:18 crc kubenswrapper[4746]: I0128 20:40:18.202643 4746 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 28 20:40:18 crc kubenswrapper[4746]: I0128 20:40:18.202715 4746 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 28 20:40:18 crc kubenswrapper[4746]: I0128 20:40:18.202736 4746 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 28 20:40:18 crc kubenswrapper[4746]: I0128 20:40:18.202762 4746 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 28 20:40:18 crc kubenswrapper[4746]: I0128 20:40:18.202785 4746 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-28T20:40:18Z","lastTransitionTime":"2026-01-28T20:40:18Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 28 20:40:18 crc kubenswrapper[4746]: I0128 20:40:18.215583 4746 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-8vmvh_c4d15639-62fb-41b7-a1d4-6f51f3af6d99/ovnkube-controller/2.log" Jan 28 20:40:18 crc kubenswrapper[4746]: I0128 20:40:18.216237 4746 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-8vmvh_c4d15639-62fb-41b7-a1d4-6f51f3af6d99/ovnkube-controller/1.log" Jan 28 20:40:18 crc kubenswrapper[4746]: I0128 20:40:18.219927 4746 generic.go:334] "Generic (PLEG): container finished" podID="c4d15639-62fb-41b7-a1d4-6f51f3af6d99" containerID="53be073387414da7f1a5b73960206c0a04b4ef439d6c51bd22ceefbb5d723b76" exitCode=1 Jan 28 20:40:18 crc kubenswrapper[4746]: I0128 20:40:18.219968 4746 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-8vmvh" event={"ID":"c4d15639-62fb-41b7-a1d4-6f51f3af6d99","Type":"ContainerDied","Data":"53be073387414da7f1a5b73960206c0a04b4ef439d6c51bd22ceefbb5d723b76"} Jan 28 20:40:18 crc kubenswrapper[4746]: I0128 20:40:18.220004 4746 scope.go:117] "RemoveContainer" containerID="b8dbe081b6eaa82770425f55f41842f6e94b364cccd11dfd0a2c67d00e0863e9" Jan 28 20:40:18 crc kubenswrapper[4746]: I0128 20:40:18.220679 4746 scope.go:117] "RemoveContainer" containerID="53be073387414da7f1a5b73960206c0a04b4ef439d6c51bd22ceefbb5d723b76" Jan 28 20:40:18 crc kubenswrapper[4746]: E0128 20:40:18.220826 4746 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"ovnkube-controller\" with CrashLoopBackOff: \"back-off 20s restarting failed container=ovnkube-controller pod=ovnkube-node-8vmvh_openshift-ovn-kubernetes(c4d15639-62fb-41b7-a1d4-6f51f3af6d99)\"" pod="openshift-ovn-kubernetes/ovnkube-node-8vmvh" podUID="c4d15639-62fb-41b7-a1d4-6f51f3af6d99" Jan 28 20:40:18 crc kubenswrapper[4746]: I0128 20:40:18.242584 4746 status_manager.go:875] "Failed to update 
status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-28T20:39:52Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-28T20:39:52Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-28T20:40:18Z is after 2025-08-24T17:21:41Z" Jan 28 20:40:18 crc kubenswrapper[4746]: I0128 20:40:18.258446 4746 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-28T20:39:52Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-28T20:39:52Z\\\",\\\"message\\\":\\\"containers with unready status: 
[network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-28T20:40:18Z is after 2025-08-24T17:21:41Z" Jan 28 20:40:18 crc kubenswrapper[4746]: I0128 20:40:18.273387 4746 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-9zvm7" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"de4fb5f0-7c65-4fdb-8389-d2c8462e130b\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-28T20:40:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-28T20:40:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-28T20:40:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-28T20:40:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://972f4c0079d8e04365abd93dc8d63e88dea04f427dbbbcf5e331cdeeff8c855c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-28T20:40:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jvkqz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://d2340a59c423445dd2b32da57d43146dd1cc3
54b737ae1b4d1875a0d40f62188\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-28T20:40:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jvkqz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-28T20:40:05Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-9zvm7\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-28T20:40:18Z is after 2025-08-24T17:21:41Z" Jan 28 20:40:18 crc kubenswrapper[4746]: I0128 20:40:18.287939 4746 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-gcrxx" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"1c3c93a1-7ddf-4339-9ca3-79f3753943b4\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-28T20:39:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-28T20:39:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-28T20:39:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-28T20:39:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5bd8d8214aa8e85e50747643b6354fb6e026866c926310fb2dd79760f916f468\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-28T20:39:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wn6hr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podI
Ps\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-28T20:39:52Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-gcrxx\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-28T20:40:18Z is after 2025-08-24T17:21:41Z" Jan 28 20:40:18 crc kubenswrapper[4746]: I0128 20:40:18.305565 4746 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 28 20:40:18 crc kubenswrapper[4746]: I0128 20:40:18.305616 4746 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 28 20:40:18 crc kubenswrapper[4746]: I0128 20:40:18.305625 4746 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 28 20:40:18 crc kubenswrapper[4746]: I0128 20:40:18.305642 4746 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 28 20:40:18 crc kubenswrapper[4746]: I0128 20:40:18.305652 4746 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-28T20:40:18Z","lastTransitionTime":"2026-01-28T20:40:18Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 28 20:40:18 crc kubenswrapper[4746]: I0128 20:40:18.318000 4746 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-8vmvh" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"c4d15639-62fb-41b7-a1d4-6f51f3af6d99\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-28T20:39:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-28T20:39:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-28T20:39:53Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-28T20:39:53Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://2637734601bf1e43535b70bd62cc6aea7fc9f62d0d2a33c83602e7065a793be1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-28T20:39:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-kgrqs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://73e82be84afa7be760acda86ebf6c956d558dcbd4f37f049fba84e08c75fcd9f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-28T20:39:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\
\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-kgrqs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://17d09f7f46014da3713f76f111bca0a7ce8a24a154aa6144291d760d85e7d3ec\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-28T20:39:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-kgrqs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://b7ad895d0ff1ad570136d96671e1bde5c62882af50a407945ef0e7bc52727b88\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-28T20:39:54Z\\\"}},\\\"volumeMounts\\\":[{
\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-kgrqs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://af842ba33fdf06048038dd3e12c59f7a9d0867aa28acee1f2cc4571e7db28b88\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-28T20:39:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-kgrqs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://57a776a1daf1b17eb44d9fa09fe2f7ec01f647aa0a40d0d571736d2f7c2f8e4d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\"
,\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-28T20:39:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-kgrqs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://53be073387414da7f1a5b73960206c0a04b4ef439d6c51bd22ceefbb5d723b76\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b8dbe081b6eaa82770425f55f41842f6e94b364cccd11dfd0a2c67d00e0863e9\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2026-01-28T20:40:03Z\\\",\\\"message\\\":\\\"8s.io/network-policy-api/pkg/client/informers/externalversions/factory.go:141\\\\nI0128 20:40:03.124373 6172 handler.go:190] Sending *v1.Node event handler 2 for removal\\\\nI0128 20:40:03.124394 6172 handler.go:190] Sending *v1.Node event handler 7 for removal\\\\nI0128 20:40:03.124408 6172 handler.go:190] Sending *v1.EgressFirewall event handler 
9 for removal\\\\nI0128 20:40:03.124440 6172 handler.go:190] Sending *v1.Pod event handler 3 for removal\\\\nI0128 20:40:03.124452 6172 handler.go:190] Sending *v1.Pod event handler 6 for removal\\\\nI0128 20:40:03.124447 6172 handler.go:208] Removed *v1.Node event handler 2\\\\nI0128 20:40:03.124467 6172 handler.go:190] Sending *v1.NetworkPolicy event handler 4 for removal\\\\nI0128 20:40:03.124473 6172 handler.go:208] Removed *v1.Node event handler 7\\\\nI0128 20:40:03.124475 6172 handler.go:208] Removed *v1.EgressFirewall event handler 9\\\\nI0128 20:40:03.124488 6172 handler.go:208] Removed *v1.Pod event handler 3\\\\nI0128 20:40:03.124488 6172 handler.go:190] Sending *v1.EgressIP event handler 8 for removal\\\\nI0128 20:40:03.124508 6172 handler.go:208] Removed *v1.NetworkPolicy event handler 4\\\\nI0128 20:40:03.124525 6172 handler.go:208] Removed *v1.Pod event handler 6\\\\nI0128 20:40:03.124578 6172 factory.go:656] Stopping watch factory\\\\nI0128 20:40:03.124590 6172 handler.go:208] Removed *v1.EgressIP ev\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-01-28T20:40:02Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":2,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://53be073387414da7f1a5b73960206c0a04b4ef439d6c51bd22ceefbb5d723b76\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2026-01-28T20:40:17Z\\\",\\\"message\\\":\\\"Built service openshift-console/console LB template configs for network=default: []services.lbConfig(nil)\\\\nI0128 20:40:17.344732 6409 model_client.go:382] Update operations generated as: [{Op:update Table:Logical_Switch_Port Row:map[addresses:{GoSet:[0a:58:0a:d9:00:5c 10.217.0.92]} options:{GoMap:map[iface-id-ver:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 requested-chassis:crc]} port_security:{GoSet:[0a:58:0a:d9:00:5c 10.217.0.92]}] Rows:[] Columns:[] Mutations:[] Timeout:\\\\u003cnil\\\\u003e Where:[where column _uuid == 
{c94130be-172c-477c-88c4-40cc7eba30fe}] Until: Durable:\\\\u003cnil\\\\u003e Comment:\\\\u003cnil\\\\u003e Lock:\\\\u003cnil\\\\u003e UUID: UUIDName:}]\\\\nI0128 20:40:17.344766 6409 services_controller.go:451] Built service openshift-console/console cluster-wide LB for network=default: []services.LB{services.LB{Name:\\\\\\\"Service_openshift-console/console_TCP_cluster\\\\\\\", UUID:\\\\\\\"\\\\\\\", Protocol:\\\\\\\"TCP\\\\\\\", ExternalIDs:map[string]string{\\\\\\\"k8s.ovn.org/kind\\\\\\\":\\\\\\\"Service\\\\\\\", \\\\\\\"k8s.ovn.org/owner\\\\\\\":\\\\\\\"openshift-console/console\\\\\\\"}, Opts:services.LBOpts{Reject:true, EmptyLBEvents:false, AffinityTimeOut:0, SkipSNAT:false, Template:false, AddressFamily:\\\\\\\"\\\\\\\"}, Rules:[]services.LBRule{services.LBRule{Source:services.Addr{IP:\\\\\\\"10.217.5.194\\\\\\\", Port:443, Template:(*services.Template)(nil)}, Targets:[]services.Addr{}}}, Templates:services.TemplateMap(nil), Switches:[]string{}, Routers:[]string{}, Groups:[]string{\\\\\\\"clusterLBGroup\\\\\\\"}}}\\\\nI0128 
20\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-01-28T20:40:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"k
ube-api-access-kgrqs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://cdd60109dba5f45418dd5bc29c649a50aae8479fa77fc3dc50ca4f7c3042e3fc\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-28T20:39:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-kgrqs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://45e59dfe364d511e17b8264a31157501de3ed23e7125d79979df83d328242328\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://45e59dfe364d511e17b8264a31157501de3ed23e7125d79979df83d328242328\
\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-28T20:39:53Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-28T20:39:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-kgrqs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-28T20:39:53Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-8vmvh\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-28T20:40:18Z is after 2025-08-24T17:21:41Z" Jan 28 20:40:18 crc kubenswrapper[4746]: I0128 20:40:18.337453 4746 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-28T20:39:52Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-28T20:39:52Z\\\",\\\"message\\\":\\\"containers with unready status: 
[networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-28T20:40:18Z is after 2025-08-24T17:21:41Z" Jan 28 20:40:18 crc kubenswrapper[4746]: I0128 20:40:18.353823 4746 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-28T20:39:53Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-28T20:39:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-28T20:39:53Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5dff3e2075b8bbfb6af77c72362ccc37e1d7810a2d97c096cf501681c98699ab\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-28T20:39:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://772e165efa6d865a669c172958e0aea3112146637e354e4587b7e664454112ad\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af
0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-28T20:39:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-28T20:40:18Z is after 2025-08-24T17:21:41Z" Jan 28 20:40:18 crc kubenswrapper[4746]: I0128 20:40:18.366438 4746 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-6wrnw" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"6dc8b546-9734-4082-b2b3-2bafe3f1564d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-28T20:39:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-28T20:39:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-28T20:39:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-28T20:39:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://50ec1ea913ec63a096032bd968bda5c5c8879e16a08e6a57727750517d9af038\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-28T20:39:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-z84f9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://66d250466155027405bf90c4e5ed09388238d4ce
e63604e28486a16778f9d188\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-28T20:39:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-z84f9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-28T20:39:53Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-6wrnw\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-28T20:40:18Z is after 2025-08-24T17:21:41Z" Jan 28 20:40:18 crc kubenswrapper[4746]: I0128 20:40:18.383114 4746 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-ht6hp" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"beb2b795-6bf4-4d38-89f7-bcb5512c3e61\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-28T20:39:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-28T20:40:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-28T20:40:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-28T20:40:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://564d42c4c25f218c1a1f52449d2f3d941c1dad920c8100d1d908cadf7cf46607\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-28T20:40:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rr62b\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b2d8bf207108ac2304cd04e34cd9843fcb755ac8610a3f88448538d1e99359f6\
\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b2d8bf207108ac2304cd04e34cd9843fcb755ac8610a3f88448538d1e99359f6\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-28T20:39:54Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-28T20:39:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rr62b\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://40c78d39cb70eba245771f4d85d2c8bc7b77427ea216ad3b89749d24fd550098\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://40c78d39cb70eba245771f4d85d2c8bc7b77427ea216ad3b89749d24fd550098\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-28T20:39:55Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-28T20:39:55Z\\\"}},\\\
"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rr62b\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://498028ca0bd260ca2c4d7b0231ea54a5a6ff5b302f720fbd209a4f640507258b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://498028ca0bd260ca2c4d7b0231ea54a5a6ff5b302f720fbd209a4f640507258b\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-28T20:39:56Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-28T20:39:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rr62b\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://661d4
a9031b6789f38ed1ac8e3c40d72182709130e1680ecdad86c07c1f34b7b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://661d4a9031b6789f38ed1ac8e3c40d72182709130e1680ecdad86c07c1f34b7b\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-28T20:39:57Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-28T20:39:57Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rr62b\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://a49ffe0bbe4151671dc8803d7ab61cb9b3a2eae064b81e2830bc074787f4754e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://a49ffe0bbe4151671dc8803d7ab61cb9b3a2eae064b81e2830bc074787f4754e\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-28T20:39:58Z\\\",\\\"reason\\\":\\\"Co
mpleted\\\",\\\"startedAt\\\":\\\"2026-01-28T20:39:58Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rr62b\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://cd521e9ecb53f581b5732a4d018f92007063ce0912e6d760fe3c920218c23b14\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://cd521e9ecb53f581b5732a4d018f92007063ce0912e6d760fe3c920218c23b14\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-28T20:39:59Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-28T20:39:59Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rr62b\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-28T20:39:53Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-ht6hp\": Internal error occurred: failed 
calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-28T20:40:18Z is after 2025-08-24T17:21:41Z" Jan 28 20:40:18 crc kubenswrapper[4746]: I0128 20:40:18.399977 4746 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-qhpvf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"cdf26de0-b602-4bdf-b492-65b3b6b31434\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-28T20:39:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-28T20:39:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-28T20:39:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-28T20:39:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f80345dfd4de46e278611370e6c337968cb1ba7ed8275457deea5871704f8367\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-
01-28T20:39:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jnpsl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-28T20:39:53Z\\\"}}\" for pod \"openshift-multus\"/\"multus-qhpvf\": Internal error occurred: failed 
calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-28T20:40:18Z is after 2025-08-24T17:21:41Z" Jan 28 20:40:18 crc kubenswrapper[4746]: I0128 20:40:18.408613 4746 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 28 20:40:18 crc kubenswrapper[4746]: I0128 20:40:18.408668 4746 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 28 20:40:18 crc kubenswrapper[4746]: I0128 20:40:18.408678 4746 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 28 20:40:18 crc kubenswrapper[4746]: I0128 20:40:18.408693 4746 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 28 20:40:18 crc kubenswrapper[4746]: I0128 20:40:18.408703 4746 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-28T20:40:18Z","lastTransitionTime":"2026-01-28T20:40:18Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 28 20:40:18 crc kubenswrapper[4746]: I0128 20:40:18.413612 4746 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-2blg6" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"f60a5487-5012-4cc9-ad94-5dfb4957d74e\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-28T20:40:07Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-28T20:40:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-28T20:40:07Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-28T20:40:07Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-l2www\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-l2www\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-28T20:40:07Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-2blg6\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-28T20:40:18Z is after 2025-08-24T17:21:41Z" Jan 28 20:40:18 crc 
kubenswrapper[4746]: I0128 20:40:18.430049 4746 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"64499f0d-5c10-43a9-b479-9b6c7ef2fb9c\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-28T20:39:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-28T20:39:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-28T20:39:32Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-28T20:39:32Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-28T20:39:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://9a304b12b4a1e30cbbbb16aa4f91514f1b1a64c716cf8ac023803a279b310f9c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-28T20:39:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://6af41f85c56be6c24127bec0aee8e02562dbb04bd22e57c6887f6f5e914a7530\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-28T20:39:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri
-o://f31e9e80576ec4eb59ec92039601fd94bd323d29493c353f9ec6b9c61187b0d3\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-28T20:39:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://64f8ae1fcf8b14a65196ffa6d5a50db39aa846dc4e6cc0b92ba54d9b5f6913e3\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://56b250a192cfa698ccd7ac8233b75b9acbf2347549c712b47ccc1b89f144aa83\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-01-28T20:39:52Z\\\",\\\"message\\\":\\\"le observer\\\\nW0128 20:39:52.695680 1 builder.go:272] unable to get owner reference (falling back to namespace): pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI0128 20:39:52.695807 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0128 20:39:52.696767 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" 
name=\\\\\\\"serving-cert::/tmp/serving-cert-2500116620/tls.crt::/tmp/serving-cert-2500116620/tls.key\\\\\\\"\\\\nI0128 20:39:52.900914 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI0128 20:39:52.915714 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI0128 20:39:52.915750 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI0128 20:39:52.916035 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI0128 20:39:52.916045 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI0128 20:39:52.925040 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW0128 20:39:52.925098 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0128 20:39:52.925104 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0128 20:39:52.925109 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW0128 20:39:52.925113 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW0128 20:39:52.925116 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW0128 20:39:52.925119 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nI0128 20:39:52.925416 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nF0128 20:39:52.926926 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not 
found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-01-28T20:39:47Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":2,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-28T20:40:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://03a94dee0b9663441e0ecec16b121c5be4e5c557f17c929b8edfb0d7ba00c3d5\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-28T20:39:35Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c5808a729190035a73529ac58efb0790f2d9c3b98c5f1a8017389e4ca1738c4c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://c5808a729190035a73529ac58efb0790f2d9c3b98c5f1a8017389e4ca1738c4c\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-28T20:39:33Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"st
artedAt\\\":\\\"2026-01-28T20:39:33Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-28T20:39:32Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-28T20:40:18Z is after 2025-08-24T17:21:41Z" Jan 28 20:40:18 crc kubenswrapper[4746]: I0128 20:40:18.453404 4746 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-28T20:39:54Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-28T20:39:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-28T20:39:54Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://2f64131d62eca6c3a42e9177d14fa39ed6435e536f60ef5d25ff6bf061d2cc0a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState
\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-28T20:39:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-28T20:40:18Z is after 2025-08-24T17:21:41Z" Jan 28 20:40:18 crc kubenswrapper[4746]: I0128 20:40:18.502825 4746 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-28T20:39:57Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-28T20:39:57Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-28T20:39:57Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://83a275ec9868f2fd9e24135d5a05c27caf843340c144070952291bb6a3715035\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-28T20:39:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has 
expired or is not yet valid: current time 2026-01-28T20:40:18Z is after 2025-08-24T17:21:41Z" Jan 28 20:40:18 crc kubenswrapper[4746]: I0128 20:40:18.512657 4746 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 28 20:40:18 crc kubenswrapper[4746]: I0128 20:40:18.512709 4746 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 28 20:40:18 crc kubenswrapper[4746]: I0128 20:40:18.512718 4746 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 28 20:40:18 crc kubenswrapper[4746]: I0128 20:40:18.512736 4746 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 28 20:40:18 crc kubenswrapper[4746]: I0128 20:40:18.512747 4746 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-28T20:40:18Z","lastTransitionTime":"2026-01-28T20:40:18Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 28 20:40:18 crc kubenswrapper[4746]: I0128 20:40:18.518223 4746 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-d8rwq" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"405572d8-50d2-4d7a-adf0-d8d6adea31ca\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-28T20:39:57Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-28T20:39:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-28T20:39:57Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-28T20:39:57Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://eba305a172c49420674e97d32590a7b2e4c7c2cd7406257508e0636cf42ea244\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-28T20:39:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes
.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mb4h4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-28T20:39:56Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-d8rwq\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-28T20:40:18Z is after 2025-08-24T17:21:41Z" Jan 28 20:40:18 crc kubenswrapper[4746]: I0128 20:40:18.535353 4746 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"950d71b3-9b51-43dc-814a-d6a838723f78\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-28T20:39:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-28T20:39:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-28T20:39:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-28T20:39:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-28T20:39:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0411ecf7c85f5397d751adb7d4905977dcbd3d2e169c56ebbe26017192765602\\\",\\\"image\\\":\\\"qua
y.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-28T20:39:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://319cbcaa9981b3c79eb0c8f970f0ad776fb255db10c72c9b6a13469906e9e5ec\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-28T20:39:33Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://eefda7da35cd9ff176512b5d1224ccadcb5debd0035a49822e82a9757d4a3f91\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-sy
ncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-28T20:39:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://145f4f882924f06172df81d94fc9bd683ea1fc04889de5f6cfb5caece8227fd0\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-28T20:39:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-28T20:39:32Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-28T20:40:18Z is after 2025-08-24T17:21:41Z" Jan 28 20:40:18 crc kubenswrapper[4746]: I0128 20:40:18.616348 4746 kubelet_node_status.go:724] 
"Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 28 20:40:18 crc kubenswrapper[4746]: I0128 20:40:18.616441 4746 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 28 20:40:18 crc kubenswrapper[4746]: I0128 20:40:18.616466 4746 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 28 20:40:18 crc kubenswrapper[4746]: I0128 20:40:18.616502 4746 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 28 20:40:18 crc kubenswrapper[4746]: I0128 20:40:18.616524 4746 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-28T20:40:18Z","lastTransitionTime":"2026-01-28T20:40:18Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 28 20:40:18 crc kubenswrapper[4746]: I0128 20:40:18.720519 4746 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 28 20:40:18 crc kubenswrapper[4746]: I0128 20:40:18.720629 4746 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 28 20:40:18 crc kubenswrapper[4746]: I0128 20:40:18.720656 4746 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 28 20:40:18 crc kubenswrapper[4746]: I0128 20:40:18.720704 4746 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 28 20:40:18 crc kubenswrapper[4746]: I0128 20:40:18.720735 4746 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-28T20:40:18Z","lastTransitionTime":"2026-01-28T20:40:18Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 28 20:40:18 crc kubenswrapper[4746]: I0128 20:40:18.824777 4746 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 28 20:40:18 crc kubenswrapper[4746]: I0128 20:40:18.824850 4746 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 28 20:40:18 crc kubenswrapper[4746]: I0128 20:40:18.824868 4746 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 28 20:40:18 crc kubenswrapper[4746]: I0128 20:40:18.824897 4746 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 28 20:40:18 crc kubenswrapper[4746]: I0128 20:40:18.824935 4746 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-28T20:40:18Z","lastTransitionTime":"2026-01-28T20:40:18Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 28 20:40:18 crc kubenswrapper[4746]: I0128 20:40:18.828521 4746 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2026-01-06 10:22:22.310799008 +0000 UTC Jan 28 20:40:18 crc kubenswrapper[4746]: I0128 20:40:18.834925 4746 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Jan 28 20:40:18 crc kubenswrapper[4746]: I0128 20:40:18.834960 4746 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Jan 28 20:40:18 crc kubenswrapper[4746]: E0128 20:40:18.835257 4746 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Jan 28 20:40:18 crc kubenswrapper[4746]: I0128 20:40:18.834966 4746 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-2blg6" Jan 28 20:40:18 crc kubenswrapper[4746]: E0128 20:40:18.835527 4746 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Jan 28 20:40:18 crc kubenswrapper[4746]: E0128 20:40:18.835664 4746 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-2blg6" podUID="f60a5487-5012-4cc9-ad94-5dfb4957d74e" Jan 28 20:40:18 crc kubenswrapper[4746]: I0128 20:40:18.927896 4746 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 28 20:40:18 crc kubenswrapper[4746]: I0128 20:40:18.927968 4746 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 28 20:40:18 crc kubenswrapper[4746]: I0128 20:40:18.927986 4746 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 28 20:40:18 crc kubenswrapper[4746]: I0128 20:40:18.928013 4746 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 28 20:40:18 crc kubenswrapper[4746]: I0128 20:40:18.928033 4746 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-28T20:40:18Z","lastTransitionTime":"2026-01-28T20:40:18Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 28 20:40:19 crc kubenswrapper[4746]: I0128 20:40:19.031512 4746 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 28 20:40:19 crc kubenswrapper[4746]: I0128 20:40:19.031581 4746 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 28 20:40:19 crc kubenswrapper[4746]: I0128 20:40:19.031599 4746 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 28 20:40:19 crc kubenswrapper[4746]: I0128 20:40:19.031626 4746 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 28 20:40:19 crc kubenswrapper[4746]: I0128 20:40:19.031645 4746 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-28T20:40:19Z","lastTransitionTime":"2026-01-28T20:40:19Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 28 20:40:19 crc kubenswrapper[4746]: I0128 20:40:19.135278 4746 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 28 20:40:19 crc kubenswrapper[4746]: I0128 20:40:19.135316 4746 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 28 20:40:19 crc kubenswrapper[4746]: I0128 20:40:19.135332 4746 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 28 20:40:19 crc kubenswrapper[4746]: I0128 20:40:19.135356 4746 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 28 20:40:19 crc kubenswrapper[4746]: I0128 20:40:19.135373 4746 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-28T20:40:19Z","lastTransitionTime":"2026-01-28T20:40:19Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 28 20:40:19 crc kubenswrapper[4746]: I0128 20:40:19.225143 4746 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-8vmvh_c4d15639-62fb-41b7-a1d4-6f51f3af6d99/ovnkube-controller/2.log" Jan 28 20:40:19 crc kubenswrapper[4746]: I0128 20:40:19.229700 4746 scope.go:117] "RemoveContainer" containerID="53be073387414da7f1a5b73960206c0a04b4ef439d6c51bd22ceefbb5d723b76" Jan 28 20:40:19 crc kubenswrapper[4746]: E0128 20:40:19.229964 4746 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"ovnkube-controller\" with CrashLoopBackOff: \"back-off 20s restarting failed container=ovnkube-controller pod=ovnkube-node-8vmvh_openshift-ovn-kubernetes(c4d15639-62fb-41b7-a1d4-6f51f3af6d99)\"" pod="openshift-ovn-kubernetes/ovnkube-node-8vmvh" podUID="c4d15639-62fb-41b7-a1d4-6f51f3af6d99" Jan 28 20:40:19 crc kubenswrapper[4746]: I0128 20:40:19.238302 4746 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 28 20:40:19 crc kubenswrapper[4746]: I0128 20:40:19.238357 4746 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 28 20:40:19 crc kubenswrapper[4746]: I0128 20:40:19.238379 4746 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 28 20:40:19 crc kubenswrapper[4746]: I0128 20:40:19.238405 4746 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 28 20:40:19 crc kubenswrapper[4746]: I0128 20:40:19.238424 4746 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-28T20:40:19Z","lastTransitionTime":"2026-01-28T20:40:19Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI 
configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 28 20:40:19 crc kubenswrapper[4746]: I0128 20:40:19.247462 4746 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-28T20:39:52Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-28T20:39:52Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-28T20:40:19Z is after 2025-08-24T17:21:41Z" Jan 28 20:40:19 crc kubenswrapper[4746]: I0128 20:40:19.266548 4746 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-28T20:39:52Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-28T20:39:52Z\\\",\\\"message\\\":\\\"containers with unready status: 
[network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-28T20:40:19Z is after 2025-08-24T17:21:41Z" Jan 28 20:40:19 crc kubenswrapper[4746]: I0128 20:40:19.283455 4746 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-9zvm7" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"de4fb5f0-7c65-4fdb-8389-d2c8462e130b\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-28T20:40:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-28T20:40:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-28T20:40:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-28T20:40:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://972f4c0079d8e04365abd93dc8d63e88dea04f427dbbbcf5e331cdeeff8c855c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-28T20:40:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jvkqz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://d2340a59c423445dd2b32da57d43146dd1cc3
54b737ae1b4d1875a0d40f62188\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-28T20:40:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jvkqz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-28T20:40:05Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-9zvm7\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-28T20:40:19Z is after 2025-08-24T17:21:41Z" Jan 28 20:40:19 crc kubenswrapper[4746]: I0128 20:40:19.302175 4746 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-28T20:39:52Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-28T20:39:52Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-28T20:40:19Z is after 2025-08-24T17:21:41Z" Jan 28 20:40:19 crc kubenswrapper[4746]: I0128 20:40:19.324805 4746 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-28T20:39:53Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-28T20:39:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-28T20:39:53Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5dff3e2075b8bbfb6af77c72362ccc37e1d7810a2d97c096cf501681c98699ab\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-28T20:39:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://772e165efa6d865a669c172958e0aea3112146637e354e4587b7e664454112ad\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af
0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-28T20:39:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-28T20:40:19Z is after 2025-08-24T17:21:41Z" Jan 28 20:40:19 crc kubenswrapper[4746]: I0128 20:40:19.341255 4746 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 28 20:40:19 crc kubenswrapper[4746]: I0128 20:40:19.341331 4746 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 28 20:40:19 crc kubenswrapper[4746]: I0128 20:40:19.341354 4746 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 28 20:40:19 crc kubenswrapper[4746]: I0128 20:40:19.341431 4746 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 28 20:40:19 crc kubenswrapper[4746]: I0128 20:40:19.341454 4746 setters.go:603] "Node became not ready" node="crc" 
condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-28T20:40:19Z","lastTransitionTime":"2026-01-28T20:40:19Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 28 20:40:19 crc kubenswrapper[4746]: I0128 20:40:19.343787 4746 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-gcrxx" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"1c3c93a1-7ddf-4339-9ca3-79f3753943b4\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-28T20:39:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-28T20:39:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-28T20:39:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-28T20:39:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5bd8d8214aa8e85e50747643b6354fb6e026866c926310fb2dd79760f916f468\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\
\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-28T20:39:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wn6hr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-28T20:39:52Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-gcrxx\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-28T20:40:19Z is after 2025-08-24T17:21:41Z" Jan 28 20:40:19 crc kubenswrapper[4746]: I0128 20:40:19.375951 4746 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-8vmvh" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"c4d15639-62fb-41b7-a1d4-6f51f3af6d99\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-28T20:39:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-28T20:39:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-28T20:39:53Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-28T20:39:53Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://2637734601bf1e43535b70bd62cc6aea7fc9f62d0d2a33c83602e7065a793be1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-28T20:39:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-kgrqs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://73e82be84afa7be760acda86ebf6c956d558dcbd4f37f049fba84e08c75fcd9f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-28T20:39:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\
\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-kgrqs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://17d09f7f46014da3713f76f111bca0a7ce8a24a154aa6144291d760d85e7d3ec\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-28T20:39:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-kgrqs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://b7ad895d0ff1ad570136d96671e1bde5c62882af50a407945ef0e7bc52727b88\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-28T20:39:54Z\\\"}},\\\"volumeMounts\\\":[{
\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-kgrqs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://af842ba33fdf06048038dd3e12c59f7a9d0867aa28acee1f2cc4571e7db28b88\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-28T20:39:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-kgrqs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://57a776a1daf1b17eb44d9fa09fe2f7ec01f647aa0a40d0d571736d2f7c2f8e4d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\"
,\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-28T20:39:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-kgrqs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://53be073387414da7f1a5b73960206c0a04b4ef439d6c51bd22ceefbb5d723b76\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://53be073387414da7f1a5b73960206c0a04b4ef439d6c51bd22ceefbb5d723b76\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2026-01-28T20:40:17Z\\\",\\\"message\\\":\\\"Built service openshift-console/console LB template configs for network=default: []services.lbConfig(nil)\\\\nI0128 20:40:17.344732 6409 model_client.go:382] Update operations generated as: [{Op:update Table:Logical_Switch_Port Row:map[addresses:{GoSet:[0a:58:0a:d9:00:5c 10.217.0.92]} 
options:{GoMap:map[iface-id-ver:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 requested-chassis:crc]} port_security:{GoSet:[0a:58:0a:d9:00:5c 10.217.0.92]}] Rows:[] Columns:[] Mutations:[] Timeout:\\\\u003cnil\\\\u003e Where:[where column _uuid == {c94130be-172c-477c-88c4-40cc7eba30fe}] Until: Durable:\\\\u003cnil\\\\u003e Comment:\\\\u003cnil\\\\u003e Lock:\\\\u003cnil\\\\u003e UUID: UUIDName:}]\\\\nI0128 20:40:17.344766 6409 services_controller.go:451] Built service openshift-console/console cluster-wide LB for network=default: []services.LB{services.LB{Name:\\\\\\\"Service_openshift-console/console_TCP_cluster\\\\\\\", UUID:\\\\\\\"\\\\\\\", Protocol:\\\\\\\"TCP\\\\\\\", ExternalIDs:map[string]string{\\\\\\\"k8s.ovn.org/kind\\\\\\\":\\\\\\\"Service\\\\\\\", \\\\\\\"k8s.ovn.org/owner\\\\\\\":\\\\\\\"openshift-console/console\\\\\\\"}, Opts:services.LBOpts{Reject:true, EmptyLBEvents:false, AffinityTimeOut:0, SkipSNAT:false, Template:false, AddressFamily:\\\\\\\"\\\\\\\"}, Rules:[]services.LBRule{services.LBRule{Source:services.Addr{IP:\\\\\\\"10.217.5.194\\\\\\\", Port:443, Template:(*services.Template)(nil)}, Targets:[]services.Addr{}}}, Templates:services.TemplateMap(nil), Switches:[]string{}, Routers:[]string{}, Groups:[]string{\\\\\\\"clusterLBGroup\\\\\\\"}}}\\\\nI0128 20\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-01-28T20:40:16Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":2,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 20s restarting failed container=ovnkube-controller 
pod=ovnkube-node-8vmvh_openshift-ovn-kubernetes(c4d15639-62fb-41b7-a1d4-6f51f3af6d99)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kube
rnetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-kgrqs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://cdd60109dba5f45418dd5bc29c649a50aae8479fa77fc3dc50ca4f7c3042e3fc\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-28T20:39:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-kgrqs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://45e59dfe364d511e17b8264a31157501de3ed23e7125d79979df83d328242328\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://45e59dfe364d511e17
b8264a31157501de3ed23e7125d79979df83d328242328\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-28T20:39:53Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-28T20:39:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-kgrqs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-28T20:39:53Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-8vmvh\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-28T20:40:19Z is after 2025-08-24T17:21:41Z" Jan 28 20:40:19 crc kubenswrapper[4746]: I0128 20:40:19.398432 4746 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"64499f0d-5c10-43a9-b479-9b6c7ef2fb9c\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-28T20:39:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-28T20:39:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-28T20:39:32Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-28T20:39:32Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-28T20:39:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://9a304b12b4a1e30cbbbb16aa4f91514f1b1a64c716cf8ac023803a279b310f9c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-28T20:39:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://6af41f85c56be6c24127bec0aee8e02562dbb04bd22e57c6887f6f5e914a7530\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"
restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-28T20:39:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://f31e9e80576ec4eb59ec92039601fd94bd323d29493c353f9ec6b9c61187b0d3\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-28T20:39:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://64f8ae1fcf8b14a65196ffa6d5a50db39aa846dc4e6cc0b92ba54d9b5f6913e3\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://56b250a192cfa698ccd7ac8233b75b9acbf2347549c712b47ccc1b89f144aa83\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-01-28T20:39:52Z\\\",\\\"message\\\":\\\"le observer\\\\nW0128 20:39:52.695680 1 builder.go:272] unable to get owner reference (falling back to namespace): pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI0128 20:39:52.695807 1 builder.go:304] check-endpoints version 
4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0128 20:39:52.696767 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-2500116620/tls.crt::/tmp/serving-cert-2500116620/tls.key\\\\\\\"\\\\nI0128 20:39:52.900914 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI0128 20:39:52.915714 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI0128 20:39:52.915750 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI0128 20:39:52.916035 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI0128 20:39:52.916045 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI0128 20:39:52.925040 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW0128 20:39:52.925098 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0128 20:39:52.925104 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0128 20:39:52.925109 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW0128 20:39:52.925113 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW0128 20:39:52.925116 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW0128 20:39:52.925119 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nI0128 20:39:52.925416 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nF0128 20:39:52.926926 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not 
found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-01-28T20:39:47Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":2,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-28T20:40:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://03a94dee0b9663441e0ecec16b121c5be4e5c557f17c929b8edfb0d7ba00c3d5\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-28T20:39:35Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c5808a729190035a73529ac58efb0790f2d9c3b98c5f1a8017389e4ca1738c4c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://c5808a729190035a73529ac58efb0790f2d9c3b98c5f1a8017389e4ca1738c4c\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-28T20:39:33Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"st
artedAt\\\":\\\"2026-01-28T20:39:33Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-28T20:39:32Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-28T20:40:19Z is after 2025-08-24T17:21:41Z" Jan 28 20:40:19 crc kubenswrapper[4746]: I0128 20:40:19.419438 4746 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-28T20:39:54Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-28T20:39:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-28T20:39:54Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://2f64131d62eca6c3a42e9177d14fa39ed6435e536f60ef5d25ff6bf061d2cc0a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState
\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-28T20:39:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-28T20:40:19Z is after 2025-08-24T17:21:41Z" Jan 28 20:40:19 crc kubenswrapper[4746]: I0128 20:40:19.437868 4746 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-28T20:39:57Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-28T20:39:57Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-28T20:39:57Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://83a275ec9868f2fd9e24135d5a05c27caf843340c144070952291bb6a3715035\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-28T20:39:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has 
expired or is not yet valid: current time 2026-01-28T20:40:19Z is after 2025-08-24T17:21:41Z" Jan 28 20:40:19 crc kubenswrapper[4746]: I0128 20:40:19.445271 4746 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 28 20:40:19 crc kubenswrapper[4746]: I0128 20:40:19.445317 4746 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 28 20:40:19 crc kubenswrapper[4746]: I0128 20:40:19.445347 4746 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 28 20:40:19 crc kubenswrapper[4746]: I0128 20:40:19.445373 4746 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 28 20:40:19 crc kubenswrapper[4746]: I0128 20:40:19.445390 4746 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-28T20:40:19Z","lastTransitionTime":"2026-01-28T20:40:19Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 28 20:40:19 crc kubenswrapper[4746]: I0128 20:40:19.453642 4746 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-6wrnw" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"6dc8b546-9734-4082-b2b3-2bafe3f1564d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-28T20:39:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-28T20:39:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-28T20:39:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-28T20:39:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://50ec1ea913ec63a096032bd968bda5c5c8879e16a08e6a57727750517d9af038\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-28T20:39:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"}
,{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-z84f9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://66d250466155027405bf90c4e5ed09388238d4cee63604e28486a16778f9d188\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-28T20:39:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-z84f9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-28T20:39:53Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-6wrnw\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-28T20:40:19Z is after 2025-08-24T17:21:41Z" Jan 28 20:40:19 crc kubenswrapper[4746]: I0128 20:40:19.471097 4746 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-ht6hp" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"beb2b795-6bf4-4d38-89f7-bcb5512c3e61\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-28T20:39:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-28T20:40:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-28T20:40:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-28T20:40:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://564d42c4c25f218c1a1f52449d2f3d941c1dad920c8100d1d908cadf7cf46607\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-28T20:40:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rr62b\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b2d8bf207108ac2304cd04e34cd9843fcb755ac8610a3f88448538d1e99359f6\
\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b2d8bf207108ac2304cd04e34cd9843fcb755ac8610a3f88448538d1e99359f6\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-28T20:39:54Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-28T20:39:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rr62b\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://40c78d39cb70eba245771f4d85d2c8bc7b77427ea216ad3b89749d24fd550098\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://40c78d39cb70eba245771f4d85d2c8bc7b77427ea216ad3b89749d24fd550098\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-28T20:39:55Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-28T20:39:55Z\\\"}},\\\
"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rr62b\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://498028ca0bd260ca2c4d7b0231ea54a5a6ff5b302f720fbd209a4f640507258b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://498028ca0bd260ca2c4d7b0231ea54a5a6ff5b302f720fbd209a4f640507258b\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-28T20:39:56Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-28T20:39:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rr62b\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://661d4
a9031b6789f38ed1ac8e3c40d72182709130e1680ecdad86c07c1f34b7b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://661d4a9031b6789f38ed1ac8e3c40d72182709130e1680ecdad86c07c1f34b7b\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-28T20:39:57Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-28T20:39:57Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rr62b\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://a49ffe0bbe4151671dc8803d7ab61cb9b3a2eae064b81e2830bc074787f4754e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://a49ffe0bbe4151671dc8803d7ab61cb9b3a2eae064b81e2830bc074787f4754e\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-28T20:39:58Z\\\",\\\"reason\\\":\\\"Co
mpleted\\\",\\\"startedAt\\\":\\\"2026-01-28T20:39:58Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rr62b\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://cd521e9ecb53f581b5732a4d018f92007063ce0912e6d760fe3c920218c23b14\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://cd521e9ecb53f581b5732a4d018f92007063ce0912e6d760fe3c920218c23b14\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-28T20:39:59Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-28T20:39:59Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rr62b\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-28T20:39:53Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-ht6hp\": Internal error occurred: failed 
calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-28T20:40:19Z is after 2025-08-24T17:21:41Z" Jan 28 20:40:19 crc kubenswrapper[4746]: I0128 20:40:19.484133 4746 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-qhpvf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"cdf26de0-b602-4bdf-b492-65b3b6b31434\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-28T20:39:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-28T20:39:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-28T20:39:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-28T20:39:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f80345dfd4de46e278611370e6c337968cb1ba7ed8275457deea5871704f8367\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-
01-28T20:39:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jnpsl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-28T20:39:53Z\\\"}}\" for pod \"openshift-multus\"/\"multus-qhpvf\": Internal error occurred: failed 
calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-28T20:40:19Z is after 2025-08-24T17:21:41Z" Jan 28 20:40:19 crc kubenswrapper[4746]: I0128 20:40:19.497770 4746 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-2blg6" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"f60a5487-5012-4cc9-ad94-5dfb4957d74e\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-28T20:40:07Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-28T20:40:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-28T20:40:07Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-28T20:40:07Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-l2www\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-l2www\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-28T20:40:07Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-2blg6\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-28T20:40:19Z is after 2025-08-24T17:21:41Z" Jan 28 20:40:19 crc 
kubenswrapper[4746]: I0128 20:40:19.517226 4746 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"950d71b3-9b51-43dc-814a-d6a838723f78\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-28T20:39:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-28T20:39:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-28T20:39:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-28T20:39:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-28T20:39:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0411ecf7c85f5397d751adb7d4905977dcbd3d2e169c56ebbe26017192765602\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-28T20:39:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://319cbcaa9981b3c79eb0c8f970f0ad776fb255db10c72c9b6a13469906e9e5ec\\\"
,\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-28T20:39:33Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://eefda7da35cd9ff176512b5d1224ccadcb5debd0035a49822e82a9757d4a3f91\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-28T20:39:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://145f4f882924f06172df81d94fc9bd683ea1fc04889de5f6cfb5caece8227fd0\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17
ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-28T20:39:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-28T20:39:32Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-28T20:40:19Z is after 2025-08-24T17:21:41Z" Jan 28 20:40:19 crc kubenswrapper[4746]: I0128 20:40:19.529368 4746 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-d8rwq" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"405572d8-50d2-4d7a-adf0-d8d6adea31ca\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-28T20:39:57Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-28T20:39:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-28T20:39:57Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-28T20:39:57Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://eba305a172c49420674e97d32590a7b2e4c7c2cd7406257508e0636cf42ea244\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-28T20:39:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mb4h4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phas
e\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-28T20:39:56Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-d8rwq\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-28T20:40:19Z is after 2025-08-24T17:21:41Z" Jan 28 20:40:19 crc kubenswrapper[4746]: I0128 20:40:19.548771 4746 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 28 20:40:19 crc kubenswrapper[4746]: I0128 20:40:19.548810 4746 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 28 20:40:19 crc kubenswrapper[4746]: I0128 20:40:19.548820 4746 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 28 20:40:19 crc kubenswrapper[4746]: I0128 20:40:19.548839 4746 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 28 20:40:19 crc kubenswrapper[4746]: I0128 20:40:19.548855 4746 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-28T20:40:19Z","lastTransitionTime":"2026-01-28T20:40:19Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 28 20:40:19 crc kubenswrapper[4746]: I0128 20:40:19.638467 4746 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 28 20:40:19 crc kubenswrapper[4746]: I0128 20:40:19.638521 4746 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 28 20:40:19 crc kubenswrapper[4746]: I0128 20:40:19.638533 4746 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 28 20:40:19 crc kubenswrapper[4746]: I0128 20:40:19.638555 4746 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 28 20:40:19 crc kubenswrapper[4746]: I0128 20:40:19.638571 4746 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-28T20:40:19Z","lastTransitionTime":"2026-01-28T20:40:19Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 28 20:40:19 crc kubenswrapper[4746]: E0128 20:40:19.661634 4746 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-01-28T20:40:19Z\\\",\\\"lastTransitionTime\\\":\\\"2026-01-28T20:40:19Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-01-28T20:40:19Z\\\",\\\"lastTransitionTime\\\":\\\"2026-01-28T20:40:19Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-01-28T20:40:19Z\\\",\\\"lastTransitionTime\\\":\\\"2026-01-28T20:40:19Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-01-28T20:40:19Z\\\",\\\"lastTransitionTime\\\":\\\"2026-01-28T20:40:19Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"349dd4ef-f3ea-4c41-bfa2-75ea02498ab0\\\",\\\"systemUUID\\\":\\\"e89ecf32-8beb-4b41-b6df-0f1293ce0213\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-28T20:40:19Z is after 2025-08-24T17:21:41Z" Jan 28 20:40:19 crc kubenswrapper[4746]: I0128 20:40:19.666906 4746 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 28 20:40:19 crc kubenswrapper[4746]: I0128 20:40:19.666961 4746 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 28 20:40:19 crc kubenswrapper[4746]: I0128 20:40:19.666976 4746 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 28 20:40:19 crc kubenswrapper[4746]: I0128 20:40:19.667003 4746 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 28 20:40:19 crc kubenswrapper[4746]: I0128 20:40:19.667025 4746 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-28T20:40:19Z","lastTransitionTime":"2026-01-28T20:40:19Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 28 20:40:19 crc kubenswrapper[4746]: E0128 20:40:19.684990 4746 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-01-28T20:40:19Z\\\",\\\"lastTransitionTime\\\":\\\"2026-01-28T20:40:19Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-01-28T20:40:19Z\\\",\\\"lastTransitionTime\\\":\\\"2026-01-28T20:40:19Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-01-28T20:40:19Z\\\",\\\"lastTransitionTime\\\":\\\"2026-01-28T20:40:19Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-01-28T20:40:19Z\\\",\\\"lastTransitionTime\\\":\\\"2026-01-28T20:40:19Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"349dd4ef-f3ea-4c41-bfa2-75ea02498ab0\\\",\\\"systemUUID\\\":\\\"e89ecf32-8beb-4b41-b6df-0f1293ce0213\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-28T20:40:19Z is after 2025-08-24T17:21:41Z" Jan 28 20:40:19 crc kubenswrapper[4746]: I0128 20:40:19.689014 4746 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 28 20:40:19 crc kubenswrapper[4746]: I0128 20:40:19.689094 4746 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 28 20:40:19 crc kubenswrapper[4746]: I0128 20:40:19.689106 4746 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 28 20:40:19 crc kubenswrapper[4746]: I0128 20:40:19.689136 4746 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 28 20:40:19 crc kubenswrapper[4746]: I0128 20:40:19.689157 4746 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-28T20:40:19Z","lastTransitionTime":"2026-01-28T20:40:19Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 28 20:40:19 crc kubenswrapper[4746]: E0128 20:40:19.710655 4746 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-01-28T20:40:19Z\\\",\\\"lastTransitionTime\\\":\\\"2026-01-28T20:40:19Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-01-28T20:40:19Z\\\",\\\"lastTransitionTime\\\":\\\"2026-01-28T20:40:19Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-01-28T20:40:19Z\\\",\\\"lastTransitionTime\\\":\\\"2026-01-28T20:40:19Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-01-28T20:40:19Z\\\",\\\"lastTransitionTime\\\":\\\"2026-01-28T20:40:19Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"349dd4ef-f3ea-4c41-bfa2-75ea02498ab0\\\",\\\"systemUUID\\\":\\\"e89ecf32-8beb-4b41-b6df-0f1293ce0213\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-28T20:40:19Z is after 2025-08-24T17:21:41Z" Jan 28 20:40:19 crc kubenswrapper[4746]: I0128 20:40:19.716325 4746 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 28 20:40:19 crc kubenswrapper[4746]: I0128 20:40:19.716474 4746 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 28 20:40:19 crc kubenswrapper[4746]: I0128 20:40:19.716496 4746 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 28 20:40:19 crc kubenswrapper[4746]: I0128 20:40:19.716527 4746 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 28 20:40:19 crc kubenswrapper[4746]: I0128 20:40:19.716545 4746 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-28T20:40:19Z","lastTransitionTime":"2026-01-28T20:40:19Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 28 20:40:19 crc kubenswrapper[4746]: E0128 20:40:19.732911 4746 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-01-28T20:40:19Z\\\",\\\"lastTransitionTime\\\":\\\"2026-01-28T20:40:19Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-01-28T20:40:19Z\\\",\\\"lastTransitionTime\\\":\\\"2026-01-28T20:40:19Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-01-28T20:40:19Z\\\",\\\"lastTransitionTime\\\":\\\"2026-01-28T20:40:19Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-01-28T20:40:19Z\\\",\\\"lastTransitionTime\\\":\\\"2026-01-28T20:40:19Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[...],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"349dd4ef-f3ea-4c41-bfa2-75ea02498ab0\\\",\\\"systemUUID\\\":\\\"e89ecf32-8beb-4b41-b6df-0f1293ce0213\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-28T20:40:19Z is after 2025-08-24T17:21:41Z" Jan 28 20:40:19 crc kubenswrapper[4746]: I0128 20:40:19.738070 4746 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 28 20:40:19 crc kubenswrapper[4746]: I0128 20:40:19.738137 4746 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 28 20:40:19 crc kubenswrapper[4746]: I0128 20:40:19.738153 4746 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 28 20:40:19 crc kubenswrapper[4746]: I0128 20:40:19.738182 4746 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 28 20:40:19 crc kubenswrapper[4746]: I0128 20:40:19.738198 4746 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-28T20:40:19Z","lastTransitionTime":"2026-01-28T20:40:19Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 28 20:40:19 crc kubenswrapper[4746]: E0128 20:40:19.760697 4746 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-01-28T20:40:19Z\\\",\\\"lastTransitionTime\\\":\\\"2026-01-28T20:40:19Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-01-28T20:40:19Z\\\",\\\"lastTransitionTime\\\":\\\"2026-01-28T20:40:19Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-01-28T20:40:19Z\\\",\\\"lastTransitionTime\\\":\\\"2026-01-28T20:40:19Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-01-28T20:40:19Z\\\",\\\"lastTransitionTime\\\":\\\"2026-01-28T20:40:19Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[...],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"349dd4ef-f3ea-4c41-bfa2-75ea02498ab0\\\",\\\"systemUUID\\\":\\\"e89ecf32-8beb-4b41-b6df-0f1293ce0213\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-28T20:40:19Z is after 2025-08-24T17:21:41Z" Jan 28 20:40:19 crc kubenswrapper[4746]: E0128 20:40:19.760861 4746 kubelet_node_status.go:572] "Unable to update node status" err="update node status exceeds retry count" Jan 28 20:40:19 crc kubenswrapper[4746]: I0128 20:40:19.762840 4746 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 28 20:40:19 crc kubenswrapper[4746]: I0128 20:40:19.762886 4746 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 28 20:40:19 crc kubenswrapper[4746]: I0128 20:40:19.762899 4746 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 28 20:40:19 crc kubenswrapper[4746]: I0128 20:40:19.762923 4746 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 28 20:40:19 crc kubenswrapper[4746]: I0128 20:40:19.762938 4746 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-28T20:40:19Z","lastTransitionTime":"2026-01-28T20:40:19Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 28 20:40:19 crc kubenswrapper[4746]: I0128 20:40:19.828782 4746 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2025-11-13 02:54:23.249140917 +0000 UTC Jan 28 20:40:19 crc kubenswrapper[4746]: I0128 20:40:19.835180 4746 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Jan 28 20:40:19 crc kubenswrapper[4746]: E0128 20:40:19.835351 4746 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Jan 28 20:40:19 crc kubenswrapper[4746]: I0128 20:40:19.865477 4746 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 28 20:40:19 crc kubenswrapper[4746]: I0128 20:40:19.865545 4746 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 28 20:40:19 crc kubenswrapper[4746]: I0128 20:40:19.865558 4746 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 28 20:40:19 crc kubenswrapper[4746]: I0128 20:40:19.865578 4746 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 28 20:40:19 crc kubenswrapper[4746]: I0128 20:40:19.865591 4746 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-28T20:40:19Z","lastTransitionTime":"2026-01-28T20:40:19Z","reason":"KubeletNotReady","message":"container runtime network not ready: 
NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 28 20:40:19 crc kubenswrapper[4746]: I0128 20:40:19.968949 4746 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 28 20:40:19 crc kubenswrapper[4746]: I0128 20:40:19.969021 4746 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 28 20:40:19 crc kubenswrapper[4746]: I0128 20:40:19.969039 4746 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 28 20:40:19 crc kubenswrapper[4746]: I0128 20:40:19.969067 4746 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 28 20:40:19 crc kubenswrapper[4746]: I0128 20:40:19.969132 4746 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-28T20:40:19Z","lastTransitionTime":"2026-01-28T20:40:19Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 28 20:40:20 crc kubenswrapper[4746]: I0128 20:40:20.072366 4746 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 28 20:40:20 crc kubenswrapper[4746]: I0128 20:40:20.072431 4746 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 28 20:40:20 crc kubenswrapper[4746]: I0128 20:40:20.072443 4746 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 28 20:40:20 crc kubenswrapper[4746]: I0128 20:40:20.072469 4746 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 28 20:40:20 crc kubenswrapper[4746]: I0128 20:40:20.072497 4746 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-28T20:40:20Z","lastTransitionTime":"2026-01-28T20:40:20Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 28 20:40:20 crc kubenswrapper[4746]: I0128 20:40:20.176926 4746 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 28 20:40:20 crc kubenswrapper[4746]: I0128 20:40:20.177013 4746 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 28 20:40:20 crc kubenswrapper[4746]: I0128 20:40:20.177038 4746 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 28 20:40:20 crc kubenswrapper[4746]: I0128 20:40:20.177119 4746 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 28 20:40:20 crc kubenswrapper[4746]: I0128 20:40:20.177145 4746 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-28T20:40:20Z","lastTransitionTime":"2026-01-28T20:40:20Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 28 20:40:20 crc kubenswrapper[4746]: I0128 20:40:20.280983 4746 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 28 20:40:20 crc kubenswrapper[4746]: I0128 20:40:20.281120 4746 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 28 20:40:20 crc kubenswrapper[4746]: I0128 20:40:20.281143 4746 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 28 20:40:20 crc kubenswrapper[4746]: I0128 20:40:20.281168 4746 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 28 20:40:20 crc kubenswrapper[4746]: I0128 20:40:20.281190 4746 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-28T20:40:20Z","lastTransitionTime":"2026-01-28T20:40:20Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 28 20:40:20 crc kubenswrapper[4746]: I0128 20:40:20.384700 4746 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 28 20:40:20 crc kubenswrapper[4746]: I0128 20:40:20.384815 4746 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 28 20:40:20 crc kubenswrapper[4746]: I0128 20:40:20.384837 4746 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 28 20:40:20 crc kubenswrapper[4746]: I0128 20:40:20.384864 4746 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 28 20:40:20 crc kubenswrapper[4746]: I0128 20:40:20.384884 4746 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-28T20:40:20Z","lastTransitionTime":"2026-01-28T20:40:20Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 28 20:40:20 crc kubenswrapper[4746]: I0128 20:40:20.488213 4746 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 28 20:40:20 crc kubenswrapper[4746]: I0128 20:40:20.488281 4746 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 28 20:40:20 crc kubenswrapper[4746]: I0128 20:40:20.488304 4746 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 28 20:40:20 crc kubenswrapper[4746]: I0128 20:40:20.488336 4746 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 28 20:40:20 crc kubenswrapper[4746]: I0128 20:40:20.488360 4746 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-28T20:40:20Z","lastTransitionTime":"2026-01-28T20:40:20Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 28 20:40:20 crc kubenswrapper[4746]: I0128 20:40:20.591529 4746 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 28 20:40:20 crc kubenswrapper[4746]: I0128 20:40:20.591595 4746 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 28 20:40:20 crc kubenswrapper[4746]: I0128 20:40:20.591620 4746 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 28 20:40:20 crc kubenswrapper[4746]: I0128 20:40:20.591649 4746 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 28 20:40:20 crc kubenswrapper[4746]: I0128 20:40:20.591677 4746 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-28T20:40:20Z","lastTransitionTime":"2026-01-28T20:40:20Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 28 20:40:20 crc kubenswrapper[4746]: I0128 20:40:20.695412 4746 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 28 20:40:20 crc kubenswrapper[4746]: I0128 20:40:20.695505 4746 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 28 20:40:20 crc kubenswrapper[4746]: I0128 20:40:20.695531 4746 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 28 20:40:20 crc kubenswrapper[4746]: I0128 20:40:20.695566 4746 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 28 20:40:20 crc kubenswrapper[4746]: I0128 20:40:20.695591 4746 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-28T20:40:20Z","lastTransitionTime":"2026-01-28T20:40:20Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 28 20:40:20 crc kubenswrapper[4746]: I0128 20:40:20.799526 4746 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 28 20:40:20 crc kubenswrapper[4746]: I0128 20:40:20.799646 4746 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 28 20:40:20 crc kubenswrapper[4746]: I0128 20:40:20.799671 4746 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 28 20:40:20 crc kubenswrapper[4746]: I0128 20:40:20.799713 4746 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 28 20:40:20 crc kubenswrapper[4746]: I0128 20:40:20.799738 4746 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-28T20:40:20Z","lastTransitionTime":"2026-01-28T20:40:20Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 28 20:40:20 crc kubenswrapper[4746]: I0128 20:40:20.829154 4746 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2025-11-29 20:33:23.099899098 +0000 UTC Jan 28 20:40:20 crc kubenswrapper[4746]: I0128 20:40:20.835606 4746 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-2blg6" Jan 28 20:40:20 crc kubenswrapper[4746]: I0128 20:40:20.835721 4746 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Jan 28 20:40:20 crc kubenswrapper[4746]: I0128 20:40:20.835797 4746 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Jan 28 20:40:20 crc kubenswrapper[4746]: E0128 20:40:20.835994 4746 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-2blg6" podUID="f60a5487-5012-4cc9-ad94-5dfb4957d74e" Jan 28 20:40:20 crc kubenswrapper[4746]: E0128 20:40:20.836265 4746 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Jan 28 20:40:20 crc kubenswrapper[4746]: E0128 20:40:20.836480 4746 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Jan 28 20:40:20 crc kubenswrapper[4746]: I0128 20:40:20.903665 4746 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 28 20:40:20 crc kubenswrapper[4746]: I0128 20:40:20.903779 4746 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 28 20:40:20 crc kubenswrapper[4746]: I0128 20:40:20.903804 4746 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 28 20:40:20 crc kubenswrapper[4746]: I0128 20:40:20.903840 4746 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 28 20:40:20 crc kubenswrapper[4746]: I0128 20:40:20.903865 4746 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-28T20:40:20Z","lastTransitionTime":"2026-01-28T20:40:20Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 28 20:40:21 crc kubenswrapper[4746]: I0128 20:40:21.007572 4746 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 28 20:40:21 crc kubenswrapper[4746]: I0128 20:40:21.007618 4746 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 28 20:40:21 crc kubenswrapper[4746]: I0128 20:40:21.007632 4746 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 28 20:40:21 crc kubenswrapper[4746]: I0128 20:40:21.007653 4746 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 28 20:40:21 crc kubenswrapper[4746]: I0128 20:40:21.007666 4746 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-28T20:40:21Z","lastTransitionTime":"2026-01-28T20:40:21Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 28 20:40:21 crc kubenswrapper[4746]: I0128 20:40:21.111133 4746 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 28 20:40:21 crc kubenswrapper[4746]: I0128 20:40:21.111167 4746 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 28 20:40:21 crc kubenswrapper[4746]: I0128 20:40:21.111175 4746 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 28 20:40:21 crc kubenswrapper[4746]: I0128 20:40:21.111187 4746 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 28 20:40:21 crc kubenswrapper[4746]: I0128 20:40:21.111196 4746 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-28T20:40:21Z","lastTransitionTime":"2026-01-28T20:40:21Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 28 20:40:21 crc kubenswrapper[4746]: I0128 20:40:21.213216 4746 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 28 20:40:21 crc kubenswrapper[4746]: I0128 20:40:21.213246 4746 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 28 20:40:21 crc kubenswrapper[4746]: I0128 20:40:21.213254 4746 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 28 20:40:21 crc kubenswrapper[4746]: I0128 20:40:21.213267 4746 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 28 20:40:21 crc kubenswrapper[4746]: I0128 20:40:21.213276 4746 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-28T20:40:21Z","lastTransitionTime":"2026-01-28T20:40:21Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 28 20:40:21 crc kubenswrapper[4746]: I0128 20:40:21.316031 4746 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 28 20:40:21 crc kubenswrapper[4746]: I0128 20:40:21.316122 4746 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 28 20:40:21 crc kubenswrapper[4746]: I0128 20:40:21.316138 4746 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 28 20:40:21 crc kubenswrapper[4746]: I0128 20:40:21.316161 4746 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 28 20:40:21 crc kubenswrapper[4746]: I0128 20:40:21.316178 4746 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-28T20:40:21Z","lastTransitionTime":"2026-01-28T20:40:21Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 28 20:40:21 crc kubenswrapper[4746]: I0128 20:40:21.418621 4746 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 28 20:40:21 crc kubenswrapper[4746]: I0128 20:40:21.418652 4746 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 28 20:40:21 crc kubenswrapper[4746]: I0128 20:40:21.418660 4746 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 28 20:40:21 crc kubenswrapper[4746]: I0128 20:40:21.418673 4746 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 28 20:40:21 crc kubenswrapper[4746]: I0128 20:40:21.418684 4746 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-28T20:40:21Z","lastTransitionTime":"2026-01-28T20:40:21Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 28 20:40:21 crc kubenswrapper[4746]: I0128 20:40:21.521016 4746 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 28 20:40:21 crc kubenswrapper[4746]: I0128 20:40:21.521047 4746 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 28 20:40:21 crc kubenswrapper[4746]: I0128 20:40:21.521055 4746 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 28 20:40:21 crc kubenswrapper[4746]: I0128 20:40:21.521067 4746 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 28 20:40:21 crc kubenswrapper[4746]: I0128 20:40:21.521091 4746 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-28T20:40:21Z","lastTransitionTime":"2026-01-28T20:40:21Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 28 20:40:21 crc kubenswrapper[4746]: I0128 20:40:21.624773 4746 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 28 20:40:21 crc kubenswrapper[4746]: I0128 20:40:21.624863 4746 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 28 20:40:21 crc kubenswrapper[4746]: I0128 20:40:21.624891 4746 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 28 20:40:21 crc kubenswrapper[4746]: I0128 20:40:21.624931 4746 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 28 20:40:21 crc kubenswrapper[4746]: I0128 20:40:21.624955 4746 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-28T20:40:21Z","lastTransitionTime":"2026-01-28T20:40:21Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 28 20:40:21 crc kubenswrapper[4746]: I0128 20:40:21.728687 4746 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 28 20:40:21 crc kubenswrapper[4746]: I0128 20:40:21.728749 4746 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 28 20:40:21 crc kubenswrapper[4746]: I0128 20:40:21.728770 4746 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 28 20:40:21 crc kubenswrapper[4746]: I0128 20:40:21.728800 4746 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 28 20:40:21 crc kubenswrapper[4746]: I0128 20:40:21.728821 4746 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-28T20:40:21Z","lastTransitionTime":"2026-01-28T20:40:21Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 28 20:40:21 crc kubenswrapper[4746]: I0128 20:40:21.830198 4746 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2026-01-01 14:15:45.864412473 +0000 UTC Jan 28 20:40:21 crc kubenswrapper[4746]: I0128 20:40:21.833152 4746 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 28 20:40:21 crc kubenswrapper[4746]: I0128 20:40:21.833230 4746 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 28 20:40:21 crc kubenswrapper[4746]: I0128 20:40:21.833248 4746 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 28 20:40:21 crc kubenswrapper[4746]: I0128 20:40:21.833276 4746 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 28 20:40:21 crc kubenswrapper[4746]: I0128 20:40:21.833295 4746 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-28T20:40:21Z","lastTransitionTime":"2026-01-28T20:40:21Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 28 20:40:21 crc kubenswrapper[4746]: I0128 20:40:21.835590 4746 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Jan 28 20:40:21 crc kubenswrapper[4746]: E0128 20:40:21.835766 4746 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Jan 28 20:40:21 crc kubenswrapper[4746]: I0128 20:40:21.935733 4746 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 28 20:40:21 crc kubenswrapper[4746]: I0128 20:40:21.935776 4746 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 28 20:40:21 crc kubenswrapper[4746]: I0128 20:40:21.935785 4746 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 28 20:40:21 crc kubenswrapper[4746]: I0128 20:40:21.935800 4746 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 28 20:40:21 crc kubenswrapper[4746]: I0128 20:40:21.935809 4746 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-28T20:40:21Z","lastTransitionTime":"2026-01-28T20:40:21Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 28 20:40:22 crc kubenswrapper[4746]: I0128 20:40:22.038187 4746 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 28 20:40:22 crc kubenswrapper[4746]: I0128 20:40:22.038233 4746 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 28 20:40:22 crc kubenswrapper[4746]: I0128 20:40:22.038242 4746 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 28 20:40:22 crc kubenswrapper[4746]: I0128 20:40:22.038256 4746 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 28 20:40:22 crc kubenswrapper[4746]: I0128 20:40:22.038266 4746 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-28T20:40:22Z","lastTransitionTime":"2026-01-28T20:40:22Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 28 20:40:22 crc kubenswrapper[4746]: I0128 20:40:22.141807 4746 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 28 20:40:22 crc kubenswrapper[4746]: I0128 20:40:22.141884 4746 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 28 20:40:22 crc kubenswrapper[4746]: I0128 20:40:22.141903 4746 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 28 20:40:22 crc kubenswrapper[4746]: I0128 20:40:22.141933 4746 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 28 20:40:22 crc kubenswrapper[4746]: I0128 20:40:22.141948 4746 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-28T20:40:22Z","lastTransitionTime":"2026-01-28T20:40:22Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 28 20:40:22 crc kubenswrapper[4746]: I0128 20:40:22.245600 4746 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 28 20:40:22 crc kubenswrapper[4746]: I0128 20:40:22.245669 4746 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 28 20:40:22 crc kubenswrapper[4746]: I0128 20:40:22.245687 4746 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 28 20:40:22 crc kubenswrapper[4746]: I0128 20:40:22.245715 4746 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 28 20:40:22 crc kubenswrapper[4746]: I0128 20:40:22.245733 4746 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-28T20:40:22Z","lastTransitionTime":"2026-01-28T20:40:22Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 28 20:40:22 crc kubenswrapper[4746]: I0128 20:40:22.349405 4746 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 28 20:40:22 crc kubenswrapper[4746]: I0128 20:40:22.349473 4746 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 28 20:40:22 crc kubenswrapper[4746]: I0128 20:40:22.349491 4746 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 28 20:40:22 crc kubenswrapper[4746]: I0128 20:40:22.349522 4746 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 28 20:40:22 crc kubenswrapper[4746]: I0128 20:40:22.349546 4746 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-28T20:40:22Z","lastTransitionTime":"2026-01-28T20:40:22Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 28 20:40:22 crc kubenswrapper[4746]: I0128 20:40:22.454073 4746 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 28 20:40:22 crc kubenswrapper[4746]: I0128 20:40:22.454178 4746 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 28 20:40:22 crc kubenswrapper[4746]: I0128 20:40:22.454201 4746 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 28 20:40:22 crc kubenswrapper[4746]: I0128 20:40:22.454228 4746 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 28 20:40:22 crc kubenswrapper[4746]: I0128 20:40:22.454298 4746 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-28T20:40:22Z","lastTransitionTime":"2026-01-28T20:40:22Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 28 20:40:22 crc kubenswrapper[4746]: I0128 20:40:22.558143 4746 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 28 20:40:22 crc kubenswrapper[4746]: I0128 20:40:22.558259 4746 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 28 20:40:22 crc kubenswrapper[4746]: I0128 20:40:22.558294 4746 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 28 20:40:22 crc kubenswrapper[4746]: I0128 20:40:22.558338 4746 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 28 20:40:22 crc kubenswrapper[4746]: I0128 20:40:22.558360 4746 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-28T20:40:22Z","lastTransitionTime":"2026-01-28T20:40:22Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 28 20:40:22 crc kubenswrapper[4746]: I0128 20:40:22.662172 4746 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 28 20:40:22 crc kubenswrapper[4746]: I0128 20:40:22.662250 4746 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 28 20:40:22 crc kubenswrapper[4746]: I0128 20:40:22.662270 4746 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 28 20:40:22 crc kubenswrapper[4746]: I0128 20:40:22.662302 4746 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 28 20:40:22 crc kubenswrapper[4746]: I0128 20:40:22.662321 4746 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-28T20:40:22Z","lastTransitionTime":"2026-01-28T20:40:22Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 28 20:40:22 crc kubenswrapper[4746]: I0128 20:40:22.765776 4746 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 28 20:40:22 crc kubenswrapper[4746]: I0128 20:40:22.765877 4746 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 28 20:40:22 crc kubenswrapper[4746]: I0128 20:40:22.765905 4746 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 28 20:40:22 crc kubenswrapper[4746]: I0128 20:40:22.765939 4746 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 28 20:40:22 crc kubenswrapper[4746]: I0128 20:40:22.765964 4746 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-28T20:40:22Z","lastTransitionTime":"2026-01-28T20:40:22Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 28 20:40:22 crc kubenswrapper[4746]: I0128 20:40:22.831322 4746 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2025-11-19 23:05:05.848761848 +0000 UTC Jan 28 20:40:22 crc kubenswrapper[4746]: I0128 20:40:22.835860 4746 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Jan 28 20:40:22 crc kubenswrapper[4746]: I0128 20:40:22.835953 4746 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-2blg6" Jan 28 20:40:22 crc kubenswrapper[4746]: I0128 20:40:22.836159 4746 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Jan 28 20:40:22 crc kubenswrapper[4746]: E0128 20:40:22.836160 4746 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Jan 28 20:40:22 crc kubenswrapper[4746]: E0128 20:40:22.836317 4746 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-2blg6" podUID="f60a5487-5012-4cc9-ad94-5dfb4957d74e" Jan 28 20:40:22 crc kubenswrapper[4746]: E0128 20:40:22.836498 4746 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Jan 28 20:40:22 crc kubenswrapper[4746]: I0128 20:40:22.849910 4746 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/f60a5487-5012-4cc9-ad94-5dfb4957d74e-metrics-certs\") pod \"network-metrics-daemon-2blg6\" (UID: \"f60a5487-5012-4cc9-ad94-5dfb4957d74e\") " pod="openshift-multus/network-metrics-daemon-2blg6" Jan 28 20:40:22 crc kubenswrapper[4746]: E0128 20:40:22.850172 4746 secret.go:188] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered Jan 28 20:40:22 crc kubenswrapper[4746]: E0128 20:40:22.850269 4746 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/f60a5487-5012-4cc9-ad94-5dfb4957d74e-metrics-certs podName:f60a5487-5012-4cc9-ad94-5dfb4957d74e nodeName:}" failed. No retries permitted until 2026-01-28 20:40:38.850235921 +0000 UTC m=+66.806422315 (durationBeforeRetry 16s). 
Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/f60a5487-5012-4cc9-ad94-5dfb4957d74e-metrics-certs") pod "network-metrics-daemon-2blg6" (UID: "f60a5487-5012-4cc9-ad94-5dfb4957d74e") : object "openshift-multus"/"metrics-daemon-secret" not registered Jan 28 20:40:22 crc kubenswrapper[4746]: I0128 20:40:22.855859 4746 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-2blg6" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"f60a5487-5012-4cc9-ad94-5dfb4957d74e\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-28T20:40:07Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-28T20:40:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-28T20:40:07Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-28T20:40:07Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-l2www\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-l2www\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-28T20:40:07Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-2blg6\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-28T20:40:22Z is after 2025-08-24T17:21:41Z" Jan 28 20:40:22 crc 
kubenswrapper[4746]: I0128 20:40:22.869362 4746 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 28 20:40:22 crc kubenswrapper[4746]: I0128 20:40:22.869409 4746 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 28 20:40:22 crc kubenswrapper[4746]: I0128 20:40:22.869425 4746 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 28 20:40:22 crc kubenswrapper[4746]: I0128 20:40:22.869451 4746 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 28 20:40:22 crc kubenswrapper[4746]: I0128 20:40:22.869470 4746 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-28T20:40:22Z","lastTransitionTime":"2026-01-28T20:40:22Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 28 20:40:22 crc kubenswrapper[4746]: I0128 20:40:22.873175 4746 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"64499f0d-5c10-43a9-b479-9b6c7ef2fb9c\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-28T20:39:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-28T20:39:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-28T20:39:32Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-28T20:39:32Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-28T20:39:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://9a304b12b4a1e30cbbbb16aa4f91514f1b1a64c716cf8ac023803a279b310f9c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-28T20:39:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://6af41f85c56be6c24127bec0aee8e02562dbb04bd22e57c6887f6f5e914a7530\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-28T20:39:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri
-o://f31e9e80576ec4eb59ec92039601fd94bd323d29493c353f9ec6b9c61187b0d3\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-28T20:39:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://64f8ae1fcf8b14a65196ffa6d5a50db39aa846dc4e6cc0b92ba54d9b5f6913e3\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://56b250a192cfa698ccd7ac8233b75b9acbf2347549c712b47ccc1b89f144aa83\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-01-28T20:39:52Z\\\",\\\"message\\\":\\\"le observer\\\\nW0128 20:39:52.695680 1 builder.go:272] unable to get owner reference (falling back to namespace): pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI0128 20:39:52.695807 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0128 20:39:52.696767 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" 
name=\\\\\\\"serving-cert::/tmp/serving-cert-2500116620/tls.crt::/tmp/serving-cert-2500116620/tls.key\\\\\\\"\\\\nI0128 20:39:52.900914 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI0128 20:39:52.915714 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI0128 20:39:52.915750 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI0128 20:39:52.916035 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI0128 20:39:52.916045 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI0128 20:39:52.925040 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW0128 20:39:52.925098 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0128 20:39:52.925104 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0128 20:39:52.925109 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW0128 20:39:52.925113 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW0128 20:39:52.925116 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW0128 20:39:52.925119 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nI0128 20:39:52.925416 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nF0128 20:39:52.926926 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not 
found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-01-28T20:39:47Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":2,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-28T20:40:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://03a94dee0b9663441e0ecec16b121c5be4e5c557f17c929b8edfb0d7ba00c3d5\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-28T20:39:35Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c5808a729190035a73529ac58efb0790f2d9c3b98c5f1a8017389e4ca1738c4c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://c5808a729190035a73529ac58efb0790f2d9c3b98c5f1a8017389e4ca1738c4c\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-28T20:39:33Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"st
artedAt\\\":\\\"2026-01-28T20:39:33Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-28T20:39:32Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-28T20:40:22Z is after 2025-08-24T17:21:41Z" Jan 28 20:40:22 crc kubenswrapper[4746]: I0128 20:40:22.893697 4746 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-28T20:39:54Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-28T20:39:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-28T20:39:54Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://2f64131d62eca6c3a42e9177d14fa39ed6435e536f60ef5d25ff6bf061d2cc0a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState
\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-28T20:39:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-28T20:40:22Z is after 2025-08-24T17:21:41Z" Jan 28 20:40:22 crc kubenswrapper[4746]: I0128 20:40:22.911947 4746 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-28T20:39:57Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-28T20:39:57Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-28T20:39:57Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://83a275ec9868f2fd9e24135d5a05c27caf843340c144070952291bb6a3715035\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-28T20:39:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has 
expired or is not yet valid: current time 2026-01-28T20:40:22Z is after 2025-08-24T17:21:41Z" Jan 28 20:40:22 crc kubenswrapper[4746]: I0128 20:40:22.931879 4746 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-6wrnw" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"6dc8b546-9734-4082-b2b3-2bafe3f1564d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-28T20:39:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-28T20:39:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-28T20:39:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-28T20:39:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://50ec1ea913ec63a096032bd968bda5c5c8879e16a08e6a57727750517d9af038\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-28T20:39:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-r
bac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-z84f9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://66d250466155027405bf90c4e5ed09388238d4cee63604e28486a16778f9d188\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-28T20:39:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-z84f9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-28T20:39:53Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-6wrnw\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-28T20:40:22Z is after 2025-08-24T17:21:41Z" Jan 28 20:40:22 crc kubenswrapper[4746]: I0128 20:40:22.950148 4746 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-ht6hp" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"beb2b795-6bf4-4d38-89f7-bcb5512c3e61\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-28T20:39:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-28T20:40:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-28T20:40:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-28T20:40:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://564d42c4c25f218c1a1f52449d2f3d941c1dad920c8100d1d908cadf7cf46607\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-28T20:40:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rr62b\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b2d8bf207108ac2304cd04e34cd9843fcb755ac8610a3f88448538d1e99359f6\
\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b2d8bf207108ac2304cd04e34cd9843fcb755ac8610a3f88448538d1e99359f6\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-28T20:39:54Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-28T20:39:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rr62b\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://40c78d39cb70eba245771f4d85d2c8bc7b77427ea216ad3b89749d24fd550098\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://40c78d39cb70eba245771f4d85d2c8bc7b77427ea216ad3b89749d24fd550098\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-28T20:39:55Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-28T20:39:55Z\\\"}},\\\
"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rr62b\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://498028ca0bd260ca2c4d7b0231ea54a5a6ff5b302f720fbd209a4f640507258b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://498028ca0bd260ca2c4d7b0231ea54a5a6ff5b302f720fbd209a4f640507258b\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-28T20:39:56Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-28T20:39:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rr62b\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://661d4
a9031b6789f38ed1ac8e3c40d72182709130e1680ecdad86c07c1f34b7b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://661d4a9031b6789f38ed1ac8e3c40d72182709130e1680ecdad86c07c1f34b7b\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-28T20:39:57Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-28T20:39:57Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rr62b\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://a49ffe0bbe4151671dc8803d7ab61cb9b3a2eae064b81e2830bc074787f4754e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://a49ffe0bbe4151671dc8803d7ab61cb9b3a2eae064b81e2830bc074787f4754e\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-28T20:39:58Z\\\",\\\"reason\\\":\\\"Co
mpleted\\\",\\\"startedAt\\\":\\\"2026-01-28T20:39:58Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rr62b\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://cd521e9ecb53f581b5732a4d018f92007063ce0912e6d760fe3c920218c23b14\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://cd521e9ecb53f581b5732a4d018f92007063ce0912e6d760fe3c920218c23b14\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-28T20:39:59Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-28T20:39:59Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rr62b\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-28T20:39:53Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-ht6hp\": Internal error occurred: failed 
calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-28T20:40:22Z is after 2025-08-24T17:21:41Z" Jan 28 20:40:22 crc kubenswrapper[4746]: I0128 20:40:22.971771 4746 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-qhpvf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"cdf26de0-b602-4bdf-b492-65b3b6b31434\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-28T20:39:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-28T20:39:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-28T20:39:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-28T20:39:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f80345dfd4de46e278611370e6c337968cb1ba7ed8275457deea5871704f8367\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-
01-28T20:39:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jnpsl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-28T20:39:53Z\\\"}}\" for pod \"openshift-multus\"/\"multus-qhpvf\": Internal error occurred: failed 
calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-28T20:40:22Z is after 2025-08-24T17:21:41Z" Jan 28 20:40:22 crc kubenswrapper[4746]: I0128 20:40:22.973946 4746 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 28 20:40:22 crc kubenswrapper[4746]: I0128 20:40:22.974019 4746 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 28 20:40:22 crc kubenswrapper[4746]: I0128 20:40:22.974045 4746 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 28 20:40:22 crc kubenswrapper[4746]: I0128 20:40:22.974115 4746 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 28 20:40:22 crc kubenswrapper[4746]: I0128 20:40:22.974142 4746 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-28T20:40:22Z","lastTransitionTime":"2026-01-28T20:40:22Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 28 20:40:22 crc kubenswrapper[4746]: I0128 20:40:22.995591 4746 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"950d71b3-9b51-43dc-814a-d6a838723f78\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-28T20:39:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-28T20:39:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-28T20:39:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-28T20:39:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-28T20:39:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0411ecf7c85f5397d751adb7d4905977dcbd3d2e169c56ebbe26017192765602\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-28T20:39:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://319cbcaa998
1b3c79eb0c8f970f0ad776fb255db10c72c9b6a13469906e9e5ec\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-28T20:39:33Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://eefda7da35cd9ff176512b5d1224ccadcb5debd0035a49822e82a9757d4a3f91\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-28T20:39:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://145f4f882924f06172df81d94fc9bd683ea1fc04889de5f6cfb5caece8227fd0\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:850
6ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-28T20:39:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-28T20:39:32Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-28T20:40:22Z is after 2025-08-24T17:21:41Z" Jan 28 20:40:23 crc kubenswrapper[4746]: I0128 20:40:23.012419 4746 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-d8rwq" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"405572d8-50d2-4d7a-adf0-d8d6adea31ca\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-28T20:39:57Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-28T20:39:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-28T20:39:57Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-28T20:39:57Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://eba305a172c49420674e97d32590a7b2e4c7c2cd7406257508e0636cf42ea244\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-28T20:39:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mb4h4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phas
e\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-28T20:39:56Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-d8rwq\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-28T20:40:23Z is after 2025-08-24T17:21:41Z" Jan 28 20:40:23 crc kubenswrapper[4746]: I0128 20:40:23.028792 4746 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-28T20:39:52Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-28T20:39:52Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-28T20:40:23Z is after 2025-08-24T17:21:41Z" Jan 28 20:40:23 crc kubenswrapper[4746]: I0128 20:40:23.046047 4746 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-28T20:39:52Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-28T20:39:52Z\\\",\\\"message\\\":\\\"containers with unready status: 
[network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-28T20:40:23Z is after 2025-08-24T17:21:41Z" Jan 28 20:40:23 crc kubenswrapper[4746]: I0128 20:40:23.063268 4746 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-9zvm7" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"de4fb5f0-7c65-4fdb-8389-d2c8462e130b\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-28T20:40:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-28T20:40:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-28T20:40:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-28T20:40:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://972f4c0079d8e04365abd93dc8d63e88dea04f427dbbbcf5e331cdeeff8c855c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-28T20:40:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jvkqz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://d2340a59c423445dd2b32da57d43146dd1cc3
54b737ae1b4d1875a0d40f62188\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-28T20:40:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jvkqz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-28T20:40:05Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-9zvm7\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-28T20:40:23Z is after 2025-08-24T17:21:41Z" Jan 28 20:40:23 crc kubenswrapper[4746]: I0128 20:40:23.077422 4746 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 28 20:40:23 crc kubenswrapper[4746]: I0128 20:40:23.077487 4746 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 28 20:40:23 crc kubenswrapper[4746]: I0128 20:40:23.077502 4746 kubelet_node_status.go:724] "Recording 
event message for node" node="crc" event="NodeHasSufficientPID" Jan 28 20:40:23 crc kubenswrapper[4746]: I0128 20:40:23.077537 4746 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 28 20:40:23 crc kubenswrapper[4746]: I0128 20:40:23.077553 4746 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-28T20:40:23Z","lastTransitionTime":"2026-01-28T20:40:23Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 28 20:40:23 crc kubenswrapper[4746]: I0128 20:40:23.078640 4746 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-28T20:39:52Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-28T20:39:52Z\\\",\\\"message\\\":\\\"containers with unready status: 
[networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-28T20:40:23Z is after 2025-08-24T17:21:41Z" Jan 28 20:40:23 crc kubenswrapper[4746]: I0128 20:40:23.094865 4746 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-28T20:39:53Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-28T20:39:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-28T20:39:53Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5dff3e2075b8bbfb6af77c72362ccc37e1d7810a2d97c096cf501681c98699ab\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-28T20:39:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://772e165efa6d865a669c172958e0aea3112146637e354e4587b7e664454112ad\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af
0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-28T20:39:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-28T20:40:23Z is after 2025-08-24T17:21:41Z" Jan 28 20:40:23 crc kubenswrapper[4746]: I0128 20:40:23.108058 4746 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-gcrxx" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"1c3c93a1-7ddf-4339-9ca3-79f3753943b4\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-28T20:39:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-28T20:39:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-28T20:39:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-28T20:39:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5bd8d8214aa8e85e50747643b6354fb6e026866c926310fb2dd79760f916f468\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-28T20:39:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wn6hr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podI
Ps\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-28T20:39:52Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-gcrxx\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-28T20:40:23Z is after 2025-08-24T17:21:41Z" Jan 28 20:40:23 crc kubenswrapper[4746]: I0128 20:40:23.135879 4746 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-8vmvh" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"c4d15639-62fb-41b7-a1d4-6f51f3af6d99\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-28T20:39:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-28T20:39:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-28T20:39:53Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-28T20:39:53Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://2637734601bf1e43535b70bd62cc6aea7fc9f62d0d2a33c83602e7065a793be1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-28T20:39:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-kgrqs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://73e82be84afa7be760acda86ebf6c956d558dcbd4f37f049fba84e08c75fcd9f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-28T20:39:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\
\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-kgrqs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://17d09f7f46014da3713f76f111bca0a7ce8a24a154aa6144291d760d85e7d3ec\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-28T20:39:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-kgrqs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://b7ad895d0ff1ad570136d96671e1bde5c62882af50a407945ef0e7bc52727b88\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-28T20:39:54Z\\\"}},\\\"volumeMounts\\\":[{
\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-kgrqs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://af842ba33fdf06048038dd3e12c59f7a9d0867aa28acee1f2cc4571e7db28b88\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-28T20:39:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-kgrqs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://57a776a1daf1b17eb44d9fa09fe2f7ec01f647aa0a40d0d571736d2f7c2f8e4d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\"
,\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-28T20:39:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-kgrqs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://53be073387414da7f1a5b73960206c0a04b4ef439d6c51bd22ceefbb5d723b76\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://53be073387414da7f1a5b73960206c0a04b4ef439d6c51bd22ceefbb5d723b76\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2026-01-28T20:40:17Z\\\",\\\"message\\\":\\\"Built service openshift-console/console LB template configs for network=default: []services.lbConfig(nil)\\\\nI0128 20:40:17.344732 6409 model_client.go:382] Update operations generated as: [{Op:update Table:Logical_Switch_Port Row:map[addresses:{GoSet:[0a:58:0a:d9:00:5c 10.217.0.92]} 
options:{GoMap:map[iface-id-ver:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 requested-chassis:crc]} port_security:{GoSet:[0a:58:0a:d9:00:5c 10.217.0.92]}] Rows:[] Columns:[] Mutations:[] Timeout:\\\\u003cnil\\\\u003e Where:[where column _uuid == {c94130be-172c-477c-88c4-40cc7eba30fe}] Until: Durable:\\\\u003cnil\\\\u003e Comment:\\\\u003cnil\\\\u003e Lock:\\\\u003cnil\\\\u003e UUID: UUIDName:}]\\\\nI0128 20:40:17.344766 6409 services_controller.go:451] Built service openshift-console/console cluster-wide LB for network=default: []services.LB{services.LB{Name:\\\\\\\"Service_openshift-console/console_TCP_cluster\\\\\\\", UUID:\\\\\\\"\\\\\\\", Protocol:\\\\\\\"TCP\\\\\\\", ExternalIDs:map[string]string{\\\\\\\"k8s.ovn.org/kind\\\\\\\":\\\\\\\"Service\\\\\\\", \\\\\\\"k8s.ovn.org/owner\\\\\\\":\\\\\\\"openshift-console/console\\\\\\\"}, Opts:services.LBOpts{Reject:true, EmptyLBEvents:false, AffinityTimeOut:0, SkipSNAT:false, Template:false, AddressFamily:\\\\\\\"\\\\\\\"}, Rules:[]services.LBRule{services.LBRule{Source:services.Addr{IP:\\\\\\\"10.217.5.194\\\\\\\", Port:443, Template:(*services.Template)(nil)}, Targets:[]services.Addr{}}}, Templates:services.TemplateMap(nil), Switches:[]string{}, Routers:[]string{}, Groups:[]string{\\\\\\\"clusterLBGroup\\\\\\\"}}}\\\\nI0128 20\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-01-28T20:40:16Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":2,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 20s restarting failed container=ovnkube-controller 
pod=ovnkube-node-8vmvh_openshift-ovn-kubernetes(c4d15639-62fb-41b7-a1d4-6f51f3af6d99)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kube
rnetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-kgrqs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://cdd60109dba5f45418dd5bc29c649a50aae8479fa77fc3dc50ca4f7c3042e3fc\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-28T20:39:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-kgrqs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://45e59dfe364d511e17b8264a31157501de3ed23e7125d79979df83d328242328\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://45e59dfe364d511e17
b8264a31157501de3ed23e7125d79979df83d328242328\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-28T20:39:53Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-28T20:39:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-kgrqs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-28T20:39:53Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-8vmvh\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-28T20:40:23Z is after 2025-08-24T17:21:41Z" Jan 28 20:40:23 crc kubenswrapper[4746]: I0128 20:40:23.180544 4746 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 28 20:40:23 crc kubenswrapper[4746]: I0128 20:40:23.180937 4746 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 28 20:40:23 crc kubenswrapper[4746]: I0128 20:40:23.181071 4746 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 28 20:40:23 crc kubenswrapper[4746]: I0128 20:40:23.181276 4746 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 28 20:40:23 crc kubenswrapper[4746]: I0128 20:40:23.181425 4746 setters.go:603] "Node became not ready" node="crc" 
condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-28T20:40:23Z","lastTransitionTime":"2026-01-28T20:40:23Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 28 20:40:23 crc kubenswrapper[4746]: I0128 20:40:23.285160 4746 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 28 20:40:23 crc kubenswrapper[4746]: I0128 20:40:23.285209 4746 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 28 20:40:23 crc kubenswrapper[4746]: I0128 20:40:23.285220 4746 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 28 20:40:23 crc kubenswrapper[4746]: I0128 20:40:23.285239 4746 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 28 20:40:23 crc kubenswrapper[4746]: I0128 20:40:23.285254 4746 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-28T20:40:23Z","lastTransitionTime":"2026-01-28T20:40:23Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 28 20:40:23 crc kubenswrapper[4746]: I0128 20:40:23.389227 4746 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 28 20:40:23 crc kubenswrapper[4746]: I0128 20:40:23.389279 4746 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 28 20:40:23 crc kubenswrapper[4746]: I0128 20:40:23.389295 4746 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 28 20:40:23 crc kubenswrapper[4746]: I0128 20:40:23.389322 4746 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 28 20:40:23 crc kubenswrapper[4746]: I0128 20:40:23.389360 4746 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-28T20:40:23Z","lastTransitionTime":"2026-01-28T20:40:23Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 28 20:40:23 crc kubenswrapper[4746]: I0128 20:40:23.492997 4746 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 28 20:40:23 crc kubenswrapper[4746]: I0128 20:40:23.493151 4746 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 28 20:40:23 crc kubenswrapper[4746]: I0128 20:40:23.493184 4746 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 28 20:40:23 crc kubenswrapper[4746]: I0128 20:40:23.493215 4746 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 28 20:40:23 crc kubenswrapper[4746]: I0128 20:40:23.493235 4746 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-28T20:40:23Z","lastTransitionTime":"2026-01-28T20:40:23Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 28 20:40:23 crc kubenswrapper[4746]: I0128 20:40:23.596489 4746 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 28 20:40:23 crc kubenswrapper[4746]: I0128 20:40:23.596566 4746 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 28 20:40:23 crc kubenswrapper[4746]: I0128 20:40:23.596584 4746 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 28 20:40:23 crc kubenswrapper[4746]: I0128 20:40:23.596615 4746 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 28 20:40:23 crc kubenswrapper[4746]: I0128 20:40:23.596636 4746 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-28T20:40:23Z","lastTransitionTime":"2026-01-28T20:40:23Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 28 20:40:23 crc kubenswrapper[4746]: I0128 20:40:23.699890 4746 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 28 20:40:23 crc kubenswrapper[4746]: I0128 20:40:23.699939 4746 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 28 20:40:23 crc kubenswrapper[4746]: I0128 20:40:23.699951 4746 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 28 20:40:23 crc kubenswrapper[4746]: I0128 20:40:23.699969 4746 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 28 20:40:23 crc kubenswrapper[4746]: I0128 20:40:23.699985 4746 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-28T20:40:23Z","lastTransitionTime":"2026-01-28T20:40:23Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 28 20:40:23 crc kubenswrapper[4746]: I0128 20:40:23.802552 4746 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 28 20:40:23 crc kubenswrapper[4746]: I0128 20:40:23.802662 4746 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 28 20:40:23 crc kubenswrapper[4746]: I0128 20:40:23.802687 4746 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 28 20:40:23 crc kubenswrapper[4746]: I0128 20:40:23.802727 4746 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 28 20:40:23 crc kubenswrapper[4746]: I0128 20:40:23.802755 4746 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-28T20:40:23Z","lastTransitionTime":"2026-01-28T20:40:23Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 28 20:40:23 crc kubenswrapper[4746]: I0128 20:40:23.831727 4746 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2025-12-30 14:47:01.140050791 +0000 UTC Jan 28 20:40:23 crc kubenswrapper[4746]: I0128 20:40:23.835124 4746 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Jan 28 20:40:23 crc kubenswrapper[4746]: E0128 20:40:23.835249 4746 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Jan 28 20:40:23 crc kubenswrapper[4746]: I0128 20:40:23.906413 4746 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 28 20:40:23 crc kubenswrapper[4746]: I0128 20:40:23.906491 4746 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 28 20:40:23 crc kubenswrapper[4746]: I0128 20:40:23.906510 4746 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 28 20:40:23 crc kubenswrapper[4746]: I0128 20:40:23.906542 4746 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 28 20:40:23 crc kubenswrapper[4746]: I0128 20:40:23.906561 4746 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-28T20:40:23Z","lastTransitionTime":"2026-01-28T20:40:23Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 28 20:40:24 crc kubenswrapper[4746]: I0128 20:40:24.010072 4746 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 28 20:40:24 crc kubenswrapper[4746]: I0128 20:40:24.010151 4746 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 28 20:40:24 crc kubenswrapper[4746]: I0128 20:40:24.010165 4746 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 28 20:40:24 crc kubenswrapper[4746]: I0128 20:40:24.010185 4746 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 28 20:40:24 crc kubenswrapper[4746]: I0128 20:40:24.010200 4746 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-28T20:40:24Z","lastTransitionTime":"2026-01-28T20:40:24Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 28 20:40:24 crc kubenswrapper[4746]: I0128 20:40:24.114241 4746 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 28 20:40:24 crc kubenswrapper[4746]: I0128 20:40:24.114316 4746 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 28 20:40:24 crc kubenswrapper[4746]: I0128 20:40:24.114339 4746 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 28 20:40:24 crc kubenswrapper[4746]: I0128 20:40:24.114371 4746 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 28 20:40:24 crc kubenswrapper[4746]: I0128 20:40:24.114393 4746 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-28T20:40:24Z","lastTransitionTime":"2026-01-28T20:40:24Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 28 20:40:24 crc kubenswrapper[4746]: I0128 20:40:24.217992 4746 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 28 20:40:24 crc kubenswrapper[4746]: I0128 20:40:24.218060 4746 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 28 20:40:24 crc kubenswrapper[4746]: I0128 20:40:24.218121 4746 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 28 20:40:24 crc kubenswrapper[4746]: I0128 20:40:24.218152 4746 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 28 20:40:24 crc kubenswrapper[4746]: I0128 20:40:24.218173 4746 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-28T20:40:24Z","lastTransitionTime":"2026-01-28T20:40:24Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 28 20:40:24 crc kubenswrapper[4746]: I0128 20:40:24.321648 4746 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 28 20:40:24 crc kubenswrapper[4746]: I0128 20:40:24.321732 4746 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 28 20:40:24 crc kubenswrapper[4746]: I0128 20:40:24.321752 4746 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 28 20:40:24 crc kubenswrapper[4746]: I0128 20:40:24.321784 4746 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 28 20:40:24 crc kubenswrapper[4746]: I0128 20:40:24.321805 4746 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-28T20:40:24Z","lastTransitionTime":"2026-01-28T20:40:24Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 28 20:40:24 crc kubenswrapper[4746]: I0128 20:40:24.424348 4746 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 28 20:40:24 crc kubenswrapper[4746]: I0128 20:40:24.424399 4746 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 28 20:40:24 crc kubenswrapper[4746]: I0128 20:40:24.424410 4746 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 28 20:40:24 crc kubenswrapper[4746]: I0128 20:40:24.424427 4746 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 28 20:40:24 crc kubenswrapper[4746]: I0128 20:40:24.424440 4746 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-28T20:40:24Z","lastTransitionTime":"2026-01-28T20:40:24Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 28 20:40:24 crc kubenswrapper[4746]: I0128 20:40:24.528283 4746 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 28 20:40:24 crc kubenswrapper[4746]: I0128 20:40:24.528341 4746 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 28 20:40:24 crc kubenswrapper[4746]: I0128 20:40:24.528352 4746 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 28 20:40:24 crc kubenswrapper[4746]: I0128 20:40:24.528375 4746 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 28 20:40:24 crc kubenswrapper[4746]: I0128 20:40:24.528388 4746 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-28T20:40:24Z","lastTransitionTime":"2026-01-28T20:40:24Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 28 20:40:24 crc kubenswrapper[4746]: I0128 20:40:24.573044 4746 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Jan 28 20:40:24 crc kubenswrapper[4746]: E0128 20:40:24.573330 4746 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. 
No retries permitted until 2026-01-28 20:40:56.573294077 +0000 UTC m=+84.529480441 (durationBeforeRetry 32s). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Jan 28 20:40:24 crc kubenswrapper[4746]: I0128 20:40:24.573432 4746 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-cqllr\" (UniqueName: \"kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr\") pod \"network-check-target-xd92c\" (UID: \"3b6479f0-333b-4a96-9adf-2099afdc2447\") " pod="openshift-network-diagnostics/network-check-target-xd92c" Jan 28 20:40:24 crc kubenswrapper[4746]: I0128 20:40:24.573503 4746 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"networking-console-plugin-cert\" (UniqueName: \"kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Jan 28 20:40:24 crc kubenswrapper[4746]: I0128 20:40:24.573554 4746 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nginx-conf\" (UniqueName: \"kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Jan 28 20:40:24 crc kubenswrapper[4746]: E0128 20:40:24.573708 4746 projected.go:288] Couldn't get configMap 
openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Jan 28 20:40:24 crc kubenswrapper[4746]: E0128 20:40:24.573718 4746 configmap.go:193] Couldn't get configMap openshift-network-console/networking-console-plugin: object "openshift-network-console"/"networking-console-plugin" not registered Jan 28 20:40:24 crc kubenswrapper[4746]: E0128 20:40:24.573749 4746 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Jan 28 20:40:24 crc kubenswrapper[4746]: E0128 20:40:24.573775 4746 projected.go:194] Error preparing data for projected volume kube-api-access-cqllr for pod openshift-network-diagnostics/network-check-target-xd92c: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Jan 28 20:40:24 crc kubenswrapper[4746]: E0128 20:40:24.573791 4746 secret.go:188] Couldn't get secret openshift-network-console/networking-console-plugin-cert: object "openshift-network-console"/"networking-console-plugin-cert" not registered Jan 28 20:40:24 crc kubenswrapper[4746]: E0128 20:40:24.573816 4746 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. No retries permitted until 2026-01-28 20:40:56.573798771 +0000 UTC m=+84.529985135 (durationBeforeRetry 32s). 
Error: MountVolume.SetUp failed for volume "nginx-conf" (UniqueName: "kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin" not registered Jan 28 20:40:24 crc kubenswrapper[4746]: E0128 20:40:24.573894 4746 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr podName:3b6479f0-333b-4a96-9adf-2099afdc2447 nodeName:}" failed. No retries permitted until 2026-01-28 20:40:56.573869113 +0000 UTC m=+84.530055507 (durationBeforeRetry 32s). Error: MountVolume.SetUp failed for volume "kube-api-access-cqllr" (UniqueName: "kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr") pod "network-check-target-xd92c" (UID: "3b6479f0-333b-4a96-9adf-2099afdc2447") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Jan 28 20:40:24 crc kubenswrapper[4746]: E0128 20:40:24.573931 4746 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. No retries permitted until 2026-01-28 20:40:56.573917444 +0000 UTC m=+84.530103838 (durationBeforeRetry 32s). 
Error: MountVolume.SetUp failed for volume "networking-console-plugin-cert" (UniqueName: "kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin-cert" not registered Jan 28 20:40:24 crc kubenswrapper[4746]: I0128 20:40:24.634841 4746 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 28 20:40:24 crc kubenswrapper[4746]: I0128 20:40:24.634951 4746 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 28 20:40:24 crc kubenswrapper[4746]: I0128 20:40:24.634971 4746 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 28 20:40:24 crc kubenswrapper[4746]: I0128 20:40:24.635003 4746 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 28 20:40:24 crc kubenswrapper[4746]: I0128 20:40:24.635024 4746 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-28T20:40:24Z","lastTransitionTime":"2026-01-28T20:40:24Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 28 20:40:24 crc kubenswrapper[4746]: I0128 20:40:24.674326 4746 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-s2dwl\" (UniqueName: \"kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl\") pod \"network-check-source-55646444c4-trplf\" (UID: \"9d751cbb-f2e2-430d-9754-c882a5e924a5\") " pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Jan 28 20:40:24 crc kubenswrapper[4746]: E0128 20:40:24.674555 4746 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Jan 28 20:40:24 crc kubenswrapper[4746]: E0128 20:40:24.674624 4746 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Jan 28 20:40:24 crc kubenswrapper[4746]: E0128 20:40:24.674647 4746 projected.go:194] Error preparing data for projected volume kube-api-access-s2dwl for pod openshift-network-diagnostics/network-check-source-55646444c4-trplf: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Jan 28 20:40:24 crc kubenswrapper[4746]: E0128 20:40:24.674739 4746 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl podName:9d751cbb-f2e2-430d-9754-c882a5e924a5 nodeName:}" failed. No retries permitted until 2026-01-28 20:40:56.674717203 +0000 UTC m=+84.630903567 (durationBeforeRetry 32s). 
Error: MountVolume.SetUp failed for volume "kube-api-access-s2dwl" (UniqueName: "kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl") pod "network-check-source-55646444c4-trplf" (UID: "9d751cbb-f2e2-430d-9754-c882a5e924a5") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Jan 28 20:40:24 crc kubenswrapper[4746]: I0128 20:40:24.737991 4746 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 28 20:40:24 crc kubenswrapper[4746]: I0128 20:40:24.738064 4746 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 28 20:40:24 crc kubenswrapper[4746]: I0128 20:40:24.738121 4746 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 28 20:40:24 crc kubenswrapper[4746]: I0128 20:40:24.738153 4746 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 28 20:40:24 crc kubenswrapper[4746]: I0128 20:40:24.738173 4746 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-28T20:40:24Z","lastTransitionTime":"2026-01-28T20:40:24Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 28 20:40:24 crc kubenswrapper[4746]: I0128 20:40:24.832499 4746 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2025-12-15 13:06:44.408717197 +0000 UTC Jan 28 20:40:24 crc kubenswrapper[4746]: I0128 20:40:24.834838 4746 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-multus/network-metrics-daemon-2blg6" Jan 28 20:40:24 crc kubenswrapper[4746]: I0128 20:40:24.834867 4746 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Jan 28 20:40:24 crc kubenswrapper[4746]: I0128 20:40:24.835000 4746 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Jan 28 20:40:24 crc kubenswrapper[4746]: E0128 20:40:24.835249 4746 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-2blg6" podUID="f60a5487-5012-4cc9-ad94-5dfb4957d74e" Jan 28 20:40:24 crc kubenswrapper[4746]: E0128 20:40:24.835405 4746 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Jan 28 20:40:24 crc kubenswrapper[4746]: E0128 20:40:24.835562 4746 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Jan 28 20:40:24 crc kubenswrapper[4746]: I0128 20:40:24.839857 4746 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 28 20:40:24 crc kubenswrapper[4746]: I0128 20:40:24.839889 4746 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 28 20:40:24 crc kubenswrapper[4746]: I0128 20:40:24.839920 4746 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 28 20:40:24 crc kubenswrapper[4746]: I0128 20:40:24.839934 4746 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 28 20:40:24 crc kubenswrapper[4746]: I0128 20:40:24.839945 4746 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-28T20:40:24Z","lastTransitionTime":"2026-01-28T20:40:24Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 28 20:40:24 crc kubenswrapper[4746]: I0128 20:40:24.933724 4746 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" Jan 28 20:40:24 crc kubenswrapper[4746]: I0128 20:40:24.942936 4746 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 28 20:40:24 crc kubenswrapper[4746]: I0128 20:40:24.942980 4746 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 28 20:40:24 crc kubenswrapper[4746]: I0128 20:40:24.942989 4746 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 28 20:40:24 crc kubenswrapper[4746]: I0128 20:40:24.943006 4746 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 28 20:40:24 crc kubenswrapper[4746]: I0128 20:40:24.943017 4746 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-28T20:40:24Z","lastTransitionTime":"2026-01-28T20:40:24Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 28 20:40:24 crc kubenswrapper[4746]: I0128 20:40:24.948642 4746 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-kube-scheduler/openshift-kube-scheduler-crc"] Jan 28 20:40:24 crc kubenswrapper[4746]: I0128 20:40:24.961039 4746 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"950d71b3-9b51-43dc-814a-d6a838723f78\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-28T20:39:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-28T20:39:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-28T20:39:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-28T20:39:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-28T20:39:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0411ecf7c85f5397d751adb7d4905977dcbd3d2e169c56ebbe26017192765602\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-28T20:39:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\
\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://319cbcaa9981b3c79eb0c8f970f0ad776fb255db10c72c9b6a13469906e9e5ec\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-28T20:39:33Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://eefda7da35cd9ff176512b5d1224ccadcb5debd0035a49822e82a9757d4a3f91\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-28T20:39:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://145f4f882924f06172df81d94fc9bd683ea1fc04889de5f6cfb5caece8227fd0\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@
sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-28T20:39:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-28T20:39:32Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-28T20:40:24Z is after 2025-08-24T17:21:41Z" Jan 28 20:40:24 crc kubenswrapper[4746]: I0128 20:40:24.975669 4746 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-d8rwq" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"405572d8-50d2-4d7a-adf0-d8d6adea31ca\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-28T20:39:57Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-28T20:39:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-28T20:39:57Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-28T20:39:57Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://eba305a172c49420674e97d32590a7b2e4c7c2cd7406257508e0636cf42ea244\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-28T20:39:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mb4h4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phas
e\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-28T20:39:56Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-d8rwq\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-28T20:40:24Z is after 2025-08-24T17:21:41Z" Jan 28 20:40:24 crc kubenswrapper[4746]: I0128 20:40:24.995160 4746 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-28T20:39:52Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-28T20:39:52Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-28T20:40:24Z is after 2025-08-24T17:21:41Z" Jan 28 20:40:25 crc kubenswrapper[4746]: I0128 20:40:25.011363 4746 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-28T20:39:52Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-28T20:39:52Z\\\",\\\"message\\\":\\\"containers with unready status: 
[network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-28T20:40:25Z is after 2025-08-24T17:21:41Z" Jan 28 20:40:25 crc kubenswrapper[4746]: I0128 20:40:25.028674 4746 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-9zvm7" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"de4fb5f0-7c65-4fdb-8389-d2c8462e130b\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-28T20:40:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-28T20:40:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-28T20:40:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-28T20:40:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://972f4c0079d8e04365abd93dc8d63e88dea04f427dbbbcf5e331cdeeff8c855c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-28T20:40:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jvkqz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://d2340a59c423445dd2b32da57d43146dd1cc3
54b737ae1b4d1875a0d40f62188\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-28T20:40:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jvkqz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-28T20:40:05Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-9zvm7\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-28T20:40:25Z is after 2025-08-24T17:21:41Z" Jan 28 20:40:25 crc kubenswrapper[4746]: I0128 20:40:25.046203 4746 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 28 20:40:25 crc kubenswrapper[4746]: I0128 20:40:25.046266 4746 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 28 20:40:25 crc kubenswrapper[4746]: I0128 20:40:25.046284 4746 kubelet_node_status.go:724] "Recording 
event message for node" node="crc" event="NodeHasSufficientPID" Jan 28 20:40:25 crc kubenswrapper[4746]: I0128 20:40:25.046310 4746 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 28 20:40:25 crc kubenswrapper[4746]: I0128 20:40:25.046328 4746 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-28T20:40:25Z","lastTransitionTime":"2026-01-28T20:40:25Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 28 20:40:25 crc kubenswrapper[4746]: I0128 20:40:25.060715 4746 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-8vmvh" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"c4d15639-62fb-41b7-a1d4-6f51f3af6d99\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-28T20:39:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-28T20:39:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-28T20:39:53Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-28T20:39:53Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://2637734601bf1e43535b70bd62cc6aea7fc9f62d0d2a33c83602e7065a793be1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-28T20:39:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-kgrqs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://73e82be84afa7be760acda86ebf6c956d558dcbd4f37f049fba84e08c75fcd9f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-28T20:39:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\
\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-kgrqs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://17d09f7f46014da3713f76f111bca0a7ce8a24a154aa6144291d760d85e7d3ec\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-28T20:39:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-kgrqs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://b7ad895d0ff1ad570136d96671e1bde5c62882af50a407945ef0e7bc52727b88\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-28T20:39:54Z\\\"}},\\\"volumeMounts\\\":[{
\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-kgrqs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://af842ba33fdf06048038dd3e12c59f7a9d0867aa28acee1f2cc4571e7db28b88\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-28T20:39:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-kgrqs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://57a776a1daf1b17eb44d9fa09fe2f7ec01f647aa0a40d0d571736d2f7c2f8e4d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\"
,\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-28T20:39:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-kgrqs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://53be073387414da7f1a5b73960206c0a04b4ef439d6c51bd22ceefbb5d723b76\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://53be073387414da7f1a5b73960206c0a04b4ef439d6c51bd22ceefbb5d723b76\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2026-01-28T20:40:17Z\\\",\\\"message\\\":\\\"Built service openshift-console/console LB template configs for network=default: []services.lbConfig(nil)\\\\nI0128 20:40:17.344732 6409 model_client.go:382] Update operations generated as: [{Op:update Table:Logical_Switch_Port Row:map[addresses:{GoSet:[0a:58:0a:d9:00:5c 10.217.0.92]} 
options:{GoMap:map[iface-id-ver:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 requested-chassis:crc]} port_security:{GoSet:[0a:58:0a:d9:00:5c 10.217.0.92]}] Rows:[] Columns:[] Mutations:[] Timeout:\\\\u003cnil\\\\u003e Where:[where column _uuid == {c94130be-172c-477c-88c4-40cc7eba30fe}] Until: Durable:\\\\u003cnil\\\\u003e Comment:\\\\u003cnil\\\\u003e Lock:\\\\u003cnil\\\\u003e UUID: UUIDName:}]\\\\nI0128 20:40:17.344766 6409 services_controller.go:451] Built service openshift-console/console cluster-wide LB for network=default: []services.LB{services.LB{Name:\\\\\\\"Service_openshift-console/console_TCP_cluster\\\\\\\", UUID:\\\\\\\"\\\\\\\", Protocol:\\\\\\\"TCP\\\\\\\", ExternalIDs:map[string]string{\\\\\\\"k8s.ovn.org/kind\\\\\\\":\\\\\\\"Service\\\\\\\", \\\\\\\"k8s.ovn.org/owner\\\\\\\":\\\\\\\"openshift-console/console\\\\\\\"}, Opts:services.LBOpts{Reject:true, EmptyLBEvents:false, AffinityTimeOut:0, SkipSNAT:false, Template:false, AddressFamily:\\\\\\\"\\\\\\\"}, Rules:[]services.LBRule{services.LBRule{Source:services.Addr{IP:\\\\\\\"10.217.5.194\\\\\\\", Port:443, Template:(*services.Template)(nil)}, Targets:[]services.Addr{}}}, Templates:services.TemplateMap(nil), Switches:[]string{}, Routers:[]string{}, Groups:[]string{\\\\\\\"clusterLBGroup\\\\\\\"}}}\\\\nI0128 20\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-01-28T20:40:16Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":2,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 20s restarting failed container=ovnkube-controller 
pod=ovnkube-node-8vmvh_openshift-ovn-kubernetes(c4d15639-62fb-41b7-a1d4-6f51f3af6d99)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kube
rnetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-kgrqs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://cdd60109dba5f45418dd5bc29c649a50aae8479fa77fc3dc50ca4f7c3042e3fc\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-28T20:39:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-kgrqs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://45e59dfe364d511e17b8264a31157501de3ed23e7125d79979df83d328242328\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://45e59dfe364d511e17
b8264a31157501de3ed23e7125d79979df83d328242328\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-28T20:39:53Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-28T20:39:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-kgrqs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-28T20:39:53Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-8vmvh\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-28T20:40:25Z is after 2025-08-24T17:21:41Z" Jan 28 20:40:25 crc kubenswrapper[4746]: I0128 20:40:25.083720 4746 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-28T20:39:52Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-28T20:39:52Z\\\",\\\"message\\\":\\\"containers with unready status: 
[networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-28T20:40:25Z is after 2025-08-24T17:21:41Z" Jan 28 20:40:25 crc kubenswrapper[4746]: I0128 20:40:25.103051 4746 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-28T20:39:53Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-28T20:39:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-28T20:39:53Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5dff3e2075b8bbfb6af77c72362ccc37e1d7810a2d97c096cf501681c98699ab\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-28T20:39:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://772e165efa6d865a669c172958e0aea3112146637e354e4587b7e664454112ad\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af
0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-28T20:39:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-28T20:40:25Z is after 2025-08-24T17:21:41Z" Jan 28 20:40:25 crc kubenswrapper[4746]: I0128 20:40:25.121378 4746 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-gcrxx" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"1c3c93a1-7ddf-4339-9ca3-79f3753943b4\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-28T20:39:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-28T20:39:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-28T20:39:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-28T20:39:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5bd8d8214aa8e85e50747643b6354fb6e026866c926310fb2dd79760f916f468\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-28T20:39:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wn6hr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podI
Ps\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-28T20:39:52Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-gcrxx\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-28T20:40:25Z is after 2025-08-24T17:21:41Z" Jan 28 20:40:25 crc kubenswrapper[4746]: I0128 20:40:25.144751 4746 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-ht6hp" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"beb2b795-6bf4-4d38-89f7-bcb5512c3e61\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-28T20:39:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-28T20:40:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-28T20:40:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-28T20:40:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://564d42c4c25f218c1a1f52449d2f3d941c1dad920c8100d1d908cadf7cf46607\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97
f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-28T20:40:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rr62b\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b2d8bf207108ac2304cd04e34cd9843fcb755ac8610a3f88448538d1e99359f6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b2d8bf207108ac2304cd04e34cd9843fcb755ac8610a3f88448538d1e99359f6\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-28T20:39:54Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-28T20:39:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rr62b\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://40c78d39cb70eba245771f4d85d2c8bc7b77427ea216ad3b89749d24fd550098\\\",\\\"image\\\":\
\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://40c78d39cb70eba245771f4d85d2c8bc7b77427ea216ad3b89749d24fd550098\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-28T20:39:55Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-28T20:39:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rr62b\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://498028ca0bd260ca2c4d7b0231ea54a5a6ff5b302f720fbd209a4f640507258b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://498028ca0bd260ca2c4d7b0231ea54a5a6ff5b302f720fbd209a4f640507258b\\\",\\\"exitCode\\\
":0,\\\"finishedAt\\\":\\\"2026-01-28T20:39:56Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-28T20:39:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rr62b\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://661d4a9031b6789f38ed1ac8e3c40d72182709130e1680ecdad86c07c1f34b7b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://661d4a9031b6789f38ed1ac8e3c40d72182709130e1680ecdad86c07c1f34b7b\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-28T20:39:57Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-28T20:39:57Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rr62b\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://a49ffe0bbe4151671dc8803d7ab61cb9
b3a2eae064b81e2830bc074787f4754e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://a49ffe0bbe4151671dc8803d7ab61cb9b3a2eae064b81e2830bc074787f4754e\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-28T20:39:58Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-28T20:39:58Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rr62b\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://cd521e9ecb53f581b5732a4d018f92007063ce0912e6d760fe3c920218c23b14\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://cd521e9ecb53f581b5732a4d018f92007063ce0912e6d760fe3c920218c23b14\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-28T20:39:59Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\"
:\\\"2026-01-28T20:39:59Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rr62b\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-28T20:39:53Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-ht6hp\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-28T20:40:25Z is after 2025-08-24T17:21:41Z" Jan 28 20:40:25 crc kubenswrapper[4746]: I0128 20:40:25.148950 4746 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 28 20:40:25 crc kubenswrapper[4746]: I0128 20:40:25.149026 4746 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 28 20:40:25 crc kubenswrapper[4746]: I0128 20:40:25.149056 4746 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 28 20:40:25 crc kubenswrapper[4746]: I0128 20:40:25.149127 4746 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 28 20:40:25 crc kubenswrapper[4746]: I0128 20:40:25.149160 4746 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-28T20:40:25Z","lastTransitionTime":"2026-01-28T20:40:25Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false 
reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 28 20:40:25 crc kubenswrapper[4746]: I0128 20:40:25.165363 4746 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-qhpvf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"cdf26de0-b602-4bdf-b492-65b3b6b31434\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-28T20:39:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-28T20:39:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-28T20:39:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-28T20:39:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f80345dfd4de46e278611370e6c337968cb1ba7ed8275457deea5871704f8367\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-28T20:39:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\
\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jnpsl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-28T20:39:53Z\\\"}}\" for pod \"openshift-multus\"/\"multus-qhpvf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post 
\"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-28T20:40:25Z is after 2025-08-24T17:21:41Z" Jan 28 20:40:25 crc kubenswrapper[4746]: I0128 20:40:25.180043 4746 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-2blg6" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"f60a5487-5012-4cc9-ad94-5dfb4957d74e\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-28T20:40:07Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-28T20:40:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-28T20:40:07Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-28T20:40:07Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-l2www\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-l2www\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-28T20:40:07Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-2blg6\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-28T20:40:25Z is after 2025-08-24T17:21:41Z" Jan 28 20:40:25 crc 
kubenswrapper[4746]: I0128 20:40:25.200899 4746 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"64499f0d-5c10-43a9-b479-9b6c7ef2fb9c\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-28T20:39:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-28T20:39:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-28T20:39:32Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-28T20:39:32Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-28T20:39:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://9a304b12b4a1e30cbbbb16aa4f91514f1b1a64c716cf8ac023803a279b310f9c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-28T20:39:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://6af41f85c56be6c24127bec0aee8e02562dbb04bd22e57c6887f6f5e914a7530\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-28T20:39:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri
-o://f31e9e80576ec4eb59ec92039601fd94bd323d29493c353f9ec6b9c61187b0d3\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-28T20:39:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://64f8ae1fcf8b14a65196ffa6d5a50db39aa846dc4e6cc0b92ba54d9b5f6913e3\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://56b250a192cfa698ccd7ac8233b75b9acbf2347549c712b47ccc1b89f144aa83\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-01-28T20:39:52Z\\\",\\\"message\\\":\\\"le observer\\\\nW0128 20:39:52.695680 1 builder.go:272] unable to get owner reference (falling back to namespace): pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI0128 20:39:52.695807 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0128 20:39:52.696767 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" 
name=\\\\\\\"serving-cert::/tmp/serving-cert-2500116620/tls.crt::/tmp/serving-cert-2500116620/tls.key\\\\\\\"\\\\nI0128 20:39:52.900914 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI0128 20:39:52.915714 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI0128 20:39:52.915750 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI0128 20:39:52.916035 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI0128 20:39:52.916045 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI0128 20:39:52.925040 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW0128 20:39:52.925098 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0128 20:39:52.925104 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0128 20:39:52.925109 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW0128 20:39:52.925113 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW0128 20:39:52.925116 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW0128 20:39:52.925119 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nI0128 20:39:52.925416 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nF0128 20:39:52.926926 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not 
found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-01-28T20:39:47Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":2,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-28T20:40:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://03a94dee0b9663441e0ecec16b121c5be4e5c557f17c929b8edfb0d7ba00c3d5\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-28T20:39:35Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c5808a729190035a73529ac58efb0790f2d9c3b98c5f1a8017389e4ca1738c4c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://c5808a729190035a73529ac58efb0790f2d9c3b98c5f1a8017389e4ca1738c4c\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-28T20:39:33Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"st
artedAt\\\":\\\"2026-01-28T20:39:33Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-28T20:39:32Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-28T20:40:25Z is after 2025-08-24T17:21:41Z" Jan 28 20:40:25 crc kubenswrapper[4746]: I0128 20:40:25.216804 4746 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-28T20:39:54Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-28T20:39:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-28T20:39:54Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://2f64131d62eca6c3a42e9177d14fa39ed6435e536f60ef5d25ff6bf061d2cc0a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState
\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-28T20:39:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-28T20:40:25Z is after 2025-08-24T17:21:41Z" Jan 28 20:40:25 crc kubenswrapper[4746]: I0128 20:40:25.229481 4746 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-28T20:39:57Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-28T20:39:57Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-28T20:39:57Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://83a275ec9868f2fd9e24135d5a05c27caf843340c144070952291bb6a3715035\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-28T20:39:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has 
expired or is not yet valid: current time 2026-01-28T20:40:25Z is after 2025-08-24T17:21:41Z" Jan 28 20:40:25 crc kubenswrapper[4746]: I0128 20:40:25.243236 4746 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-6wrnw" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"6dc8b546-9734-4082-b2b3-2bafe3f1564d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-28T20:39:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-28T20:39:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-28T20:39:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-28T20:39:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://50ec1ea913ec63a096032bd968bda5c5c8879e16a08e6a57727750517d9af038\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-28T20:39:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-r
bac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-z84f9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://66d250466155027405bf90c4e5ed09388238d4cee63604e28486a16778f9d188\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-28T20:39:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-z84f9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-28T20:39:53Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-6wrnw\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-28T20:40:25Z is after 2025-08-24T17:21:41Z" Jan 28 20:40:25 crc kubenswrapper[4746]: I0128 20:40:25.256448 4746 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 28 20:40:25 crc kubenswrapper[4746]: I0128 
20:40:25.256528 4746 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 28 20:40:25 crc kubenswrapper[4746]: I0128 20:40:25.256543 4746 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 28 20:40:25 crc kubenswrapper[4746]: I0128 20:40:25.256587 4746 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 28 20:40:25 crc kubenswrapper[4746]: I0128 20:40:25.256603 4746 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-28T20:40:25Z","lastTransitionTime":"2026-01-28T20:40:25Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 28 20:40:25 crc kubenswrapper[4746]: I0128 20:40:25.359795 4746 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 28 20:40:25 crc kubenswrapper[4746]: I0128 20:40:25.359862 4746 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 28 20:40:25 crc kubenswrapper[4746]: I0128 20:40:25.359880 4746 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 28 20:40:25 crc kubenswrapper[4746]: I0128 20:40:25.359908 4746 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 28 20:40:25 crc kubenswrapper[4746]: I0128 20:40:25.359928 4746 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-28T20:40:25Z","lastTransitionTime":"2026-01-28T20:40:25Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false 
reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 28 20:40:25 crc kubenswrapper[4746]: I0128 20:40:25.463511 4746 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 28 20:40:25 crc kubenswrapper[4746]: I0128 20:40:25.463595 4746 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 28 20:40:25 crc kubenswrapper[4746]: I0128 20:40:25.463619 4746 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 28 20:40:25 crc kubenswrapper[4746]: I0128 20:40:25.463653 4746 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 28 20:40:25 crc kubenswrapper[4746]: I0128 20:40:25.463684 4746 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-28T20:40:25Z","lastTransitionTime":"2026-01-28T20:40:25Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 28 20:40:25 crc kubenswrapper[4746]: I0128 20:40:25.567723 4746 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 28 20:40:25 crc kubenswrapper[4746]: I0128 20:40:25.567780 4746 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 28 20:40:25 crc kubenswrapper[4746]: I0128 20:40:25.567792 4746 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 28 20:40:25 crc kubenswrapper[4746]: I0128 20:40:25.567812 4746 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 28 20:40:25 crc kubenswrapper[4746]: I0128 20:40:25.567828 4746 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-28T20:40:25Z","lastTransitionTime":"2026-01-28T20:40:25Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 28 20:40:25 crc kubenswrapper[4746]: I0128 20:40:25.672649 4746 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 28 20:40:25 crc kubenswrapper[4746]: I0128 20:40:25.673304 4746 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 28 20:40:25 crc kubenswrapper[4746]: I0128 20:40:25.673323 4746 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 28 20:40:25 crc kubenswrapper[4746]: I0128 20:40:25.673358 4746 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 28 20:40:25 crc kubenswrapper[4746]: I0128 20:40:25.673375 4746 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-28T20:40:25Z","lastTransitionTime":"2026-01-28T20:40:25Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 28 20:40:25 crc kubenswrapper[4746]: I0128 20:40:25.777025 4746 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 28 20:40:25 crc kubenswrapper[4746]: I0128 20:40:25.777120 4746 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 28 20:40:25 crc kubenswrapper[4746]: I0128 20:40:25.777139 4746 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 28 20:40:25 crc kubenswrapper[4746]: I0128 20:40:25.777166 4746 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 28 20:40:25 crc kubenswrapper[4746]: I0128 20:40:25.777239 4746 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-28T20:40:25Z","lastTransitionTime":"2026-01-28T20:40:25Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 28 20:40:25 crc kubenswrapper[4746]: I0128 20:40:25.832961 4746 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2026-01-03 19:23:46.860405196 +0000 UTC Jan 28 20:40:25 crc kubenswrapper[4746]: I0128 20:40:25.835389 4746 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Jan 28 20:40:25 crc kubenswrapper[4746]: E0128 20:40:25.835602 4746 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Jan 28 20:40:25 crc kubenswrapper[4746]: I0128 20:40:25.880945 4746 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 28 20:40:25 crc kubenswrapper[4746]: I0128 20:40:25.881021 4746 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 28 20:40:25 crc kubenswrapper[4746]: I0128 20:40:25.881043 4746 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 28 20:40:25 crc kubenswrapper[4746]: I0128 20:40:25.881121 4746 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 28 20:40:25 crc kubenswrapper[4746]: I0128 20:40:25.881212 4746 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-28T20:40:25Z","lastTransitionTime":"2026-01-28T20:40:25Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 28 20:40:25 crc kubenswrapper[4746]: I0128 20:40:25.984674 4746 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 28 20:40:25 crc kubenswrapper[4746]: I0128 20:40:25.984758 4746 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 28 20:40:25 crc kubenswrapper[4746]: I0128 20:40:25.984780 4746 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 28 20:40:25 crc kubenswrapper[4746]: I0128 20:40:25.984811 4746 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 28 20:40:25 crc kubenswrapper[4746]: I0128 20:40:25.984832 4746 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-28T20:40:25Z","lastTransitionTime":"2026-01-28T20:40:25Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 28 20:40:26 crc kubenswrapper[4746]: I0128 20:40:26.087292 4746 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 28 20:40:26 crc kubenswrapper[4746]: I0128 20:40:26.087343 4746 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 28 20:40:26 crc kubenswrapper[4746]: I0128 20:40:26.087357 4746 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 28 20:40:26 crc kubenswrapper[4746]: I0128 20:40:26.087378 4746 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 28 20:40:26 crc kubenswrapper[4746]: I0128 20:40:26.087429 4746 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-28T20:40:26Z","lastTransitionTime":"2026-01-28T20:40:26Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 28 20:40:26 crc kubenswrapper[4746]: I0128 20:40:26.190884 4746 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 28 20:40:26 crc kubenswrapper[4746]: I0128 20:40:26.190925 4746 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 28 20:40:26 crc kubenswrapper[4746]: I0128 20:40:26.190935 4746 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 28 20:40:26 crc kubenswrapper[4746]: I0128 20:40:26.190951 4746 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 28 20:40:26 crc kubenswrapper[4746]: I0128 20:40:26.190964 4746 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-28T20:40:26Z","lastTransitionTime":"2026-01-28T20:40:26Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 28 20:40:26 crc kubenswrapper[4746]: I0128 20:40:26.294715 4746 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 28 20:40:26 crc kubenswrapper[4746]: I0128 20:40:26.294826 4746 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 28 20:40:26 crc kubenswrapper[4746]: I0128 20:40:26.294840 4746 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 28 20:40:26 crc kubenswrapper[4746]: I0128 20:40:26.294861 4746 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 28 20:40:26 crc kubenswrapper[4746]: I0128 20:40:26.294874 4746 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-28T20:40:26Z","lastTransitionTime":"2026-01-28T20:40:26Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 28 20:40:26 crc kubenswrapper[4746]: I0128 20:40:26.398409 4746 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 28 20:40:26 crc kubenswrapper[4746]: I0128 20:40:26.398440 4746 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 28 20:40:26 crc kubenswrapper[4746]: I0128 20:40:26.398451 4746 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 28 20:40:26 crc kubenswrapper[4746]: I0128 20:40:26.398466 4746 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 28 20:40:26 crc kubenswrapper[4746]: I0128 20:40:26.398475 4746 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-28T20:40:26Z","lastTransitionTime":"2026-01-28T20:40:26Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 28 20:40:26 crc kubenswrapper[4746]: I0128 20:40:26.437276 4746 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-kube-apiserver/kube-apiserver-crc" Jan 28 20:40:26 crc kubenswrapper[4746]: I0128 20:40:26.450554 4746 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-d8rwq" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"405572d8-50d2-4d7a-adf0-d8d6adea31ca\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-28T20:39:57Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-28T20:39:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-28T20:39:57Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-28T20:39:57Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://eba305a172c49420674e97d32590a7b2e4c7c2cd7406257508e0636cf42ea244\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-28T20:39:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountP
ath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mb4h4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-28T20:39:56Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-d8rwq\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-28T20:40:26Z is after 2025-08-24T17:21:41Z" Jan 28 20:40:26 crc kubenswrapper[4746]: I0128 20:40:26.471624 4746 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"950d71b3-9b51-43dc-814a-d6a838723f78\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-28T20:39:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-28T20:39:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-28T20:39:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-28T20:39:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-28T20:39:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0411ecf7c85f5397d751adb7d4905977dcbd3d2e169c56ebbe26017192765602\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-28T20:39:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://319cbcaa9981b3c79eb0c8f970f0ad776fb255db10c72c9b6a13469906e9e5ec\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-
art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-28T20:39:33Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://eefda7da35cd9ff176512b5d1224ccadcb5debd0035a49822e82a9757d4a3f91\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-28T20:39:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://145f4f882924f06172df81d94fc9bd683ea1fc04889de5f6cfb5caece8227fd0\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"started
At\\\":\\\"2026-01-28T20:39:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-28T20:39:32Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-28T20:40:26Z is after 2025-08-24T17:21:41Z" Jan 28 20:40:26 crc kubenswrapper[4746]: I0128 20:40:26.491871 4746 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"3553d828-b1a0-4f51-9e70-5f4d25f3ee42\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-28T20:39:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-28T20:39:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-28T20:40:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-28T20:40:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-28T20:39:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://2ba6f8586593282902571354d53a7bdc0945ea8d1970cac1bfc2f8cc4019a4ab\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-28T20:39:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://00d75277c1a1e8fdc34e37a3ae3d697e002007456fde3dae5d49c1c932a0a7f5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha
256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-28T20:39:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://cf6d80884e00b1a12051a7a97148a46fba0e8514a1233a180262392c302db77f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-28T20:39:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a4a7e5a6315ccaaa100eb33d1a23bf2481f4ee8aa1ea076927dec54027d4c4cd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"wait-for-host-port\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"conta
inerID\\\":\\\"cri-o://a4a7e5a6315ccaaa100eb33d1a23bf2481f4ee8aa1ea076927dec54027d4c4cd\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-28T20:39:33Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-28T20:39:33Z\\\"}}}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-28T20:39:32Z\\\"}}\" for pod \"openshift-kube-scheduler\"/\"openshift-kube-scheduler-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-28T20:40:26Z is after 2025-08-24T17:21:41Z" Jan 28 20:40:26 crc kubenswrapper[4746]: I0128 20:40:26.501318 4746 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 28 20:40:26 crc kubenswrapper[4746]: I0128 20:40:26.501378 4746 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 28 20:40:26 crc kubenswrapper[4746]: I0128 20:40:26.501400 4746 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 28 20:40:26 crc kubenswrapper[4746]: I0128 20:40:26.501431 4746 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 28 20:40:26 crc kubenswrapper[4746]: I0128 20:40:26.501454 4746 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-28T20:40:26Z","lastTransitionTime":"2026-01-28T20:40:26Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 28 20:40:26 crc kubenswrapper[4746]: I0128 20:40:26.511496 4746 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-28T20:39:52Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-28T20:39:52Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-28T20:40:26Z is after 2025-08-24T17:21:41Z" Jan 28 20:40:26 crc kubenswrapper[4746]: I0128 20:40:26.535464 4746 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-28T20:39:52Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-28T20:39:52Z\\\",\\\"message\\\":\\\"containers with unready status: 
[network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-28T20:40:26Z is after 2025-08-24T17:21:41Z" Jan 28 20:40:26 crc kubenswrapper[4746]: I0128 20:40:26.555516 4746 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-9zvm7" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"de4fb5f0-7c65-4fdb-8389-d2c8462e130b\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-28T20:40:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-28T20:40:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-28T20:40:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-28T20:40:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://972f4c0079d8e04365abd93dc8d63e88dea04f427dbbbcf5e331cdeeff8c855c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-28T20:40:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jvkqz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://d2340a59c423445dd2b32da57d43146dd1cc3
54b737ae1b4d1875a0d40f62188\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-28T20:40:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jvkqz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-28T20:40:05Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-9zvm7\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-28T20:40:26Z is after 2025-08-24T17:21:41Z" Jan 28 20:40:26 crc kubenswrapper[4746]: I0128 20:40:26.570241 4746 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-gcrxx" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"1c3c93a1-7ddf-4339-9ca3-79f3753943b4\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-28T20:39:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-28T20:39:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-28T20:39:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-28T20:39:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5bd8d8214aa8e85e50747643b6354fb6e026866c926310fb2dd79760f916f468\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-28T20:39:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wn6hr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podI
Ps\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-28T20:39:52Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-gcrxx\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-28T20:40:26Z is after 2025-08-24T17:21:41Z" Jan 28 20:40:26 crc kubenswrapper[4746]: I0128 20:40:26.594364 4746 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-8vmvh" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"c4d15639-62fb-41b7-a1d4-6f51f3af6d99\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-28T20:39:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-28T20:39:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-28T20:39:53Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-28T20:39:53Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://2637734601bf1e43535b70bd62cc6aea7fc9f62d0d2a33c83602e7065a793be1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-28T20:39:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-kgrqs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://73e82be84afa7be760acda86ebf6c956d558dcbd4f37f049fba84e08c75fcd9f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-28T20:39:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\
\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-kgrqs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://17d09f7f46014da3713f76f111bca0a7ce8a24a154aa6144291d760d85e7d3ec\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-28T20:39:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-kgrqs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://b7ad895d0ff1ad570136d96671e1bde5c62882af50a407945ef0e7bc52727b88\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-28T20:39:54Z\\\"}},\\\"volumeMounts\\\":[{
\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-kgrqs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://af842ba33fdf06048038dd3e12c59f7a9d0867aa28acee1f2cc4571e7db28b88\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-28T20:39:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-kgrqs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://57a776a1daf1b17eb44d9fa09fe2f7ec01f647aa0a40d0d571736d2f7c2f8e4d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\"
,\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-28T20:39:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-kgrqs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://53be073387414da7f1a5b73960206c0a04b4ef439d6c51bd22ceefbb5d723b76\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://53be073387414da7f1a5b73960206c0a04b4ef439d6c51bd22ceefbb5d723b76\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2026-01-28T20:40:17Z\\\",\\\"message\\\":\\\"Built service openshift-console/console LB template configs for network=default: []services.lbConfig(nil)\\\\nI0128 20:40:17.344732 6409 model_client.go:382] Update operations generated as: [{Op:update Table:Logical_Switch_Port Row:map[addresses:{GoSet:[0a:58:0a:d9:00:5c 10.217.0.92]} 
options:{GoMap:map[iface-id-ver:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 requested-chassis:crc]} port_security:{GoSet:[0a:58:0a:d9:00:5c 10.217.0.92]}] Rows:[] Columns:[] Mutations:[] Timeout:\\\\u003cnil\\\\u003e Where:[where column _uuid == {c94130be-172c-477c-88c4-40cc7eba30fe}] Until: Durable:\\\\u003cnil\\\\u003e Comment:\\\\u003cnil\\\\u003e Lock:\\\\u003cnil\\\\u003e UUID: UUIDName:}]\\\\nI0128 20:40:17.344766 6409 services_controller.go:451] Built service openshift-console/console cluster-wide LB for network=default: []services.LB{services.LB{Name:\\\\\\\"Service_openshift-console/console_TCP_cluster\\\\\\\", UUID:\\\\\\\"\\\\\\\", Protocol:\\\\\\\"TCP\\\\\\\", ExternalIDs:map[string]string{\\\\\\\"k8s.ovn.org/kind\\\\\\\":\\\\\\\"Service\\\\\\\", \\\\\\\"k8s.ovn.org/owner\\\\\\\":\\\\\\\"openshift-console/console\\\\\\\"}, Opts:services.LBOpts{Reject:true, EmptyLBEvents:false, AffinityTimeOut:0, SkipSNAT:false, Template:false, AddressFamily:\\\\\\\"\\\\\\\"}, Rules:[]services.LBRule{services.LBRule{Source:services.Addr{IP:\\\\\\\"10.217.5.194\\\\\\\", Port:443, Template:(*services.Template)(nil)}, Targets:[]services.Addr{}}}, Templates:services.TemplateMap(nil), Switches:[]string{}, Routers:[]string{}, Groups:[]string{\\\\\\\"clusterLBGroup\\\\\\\"}}}\\\\nI0128 20\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-01-28T20:40:16Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":2,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 20s restarting failed container=ovnkube-controller 
pod=ovnkube-node-8vmvh_openshift-ovn-kubernetes(c4d15639-62fb-41b7-a1d4-6f51f3af6d99)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kube
rnetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-kgrqs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://cdd60109dba5f45418dd5bc29c649a50aae8479fa77fc3dc50ca4f7c3042e3fc\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-28T20:39:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-kgrqs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://45e59dfe364d511e17b8264a31157501de3ed23e7125d79979df83d328242328\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://45e59dfe364d511e17
b8264a31157501de3ed23e7125d79979df83d328242328\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-28T20:39:53Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-28T20:39:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-kgrqs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-28T20:39:53Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-8vmvh\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-28T20:40:26Z is after 2025-08-24T17:21:41Z" Jan 28 20:40:26 crc kubenswrapper[4746]: I0128 20:40:26.604928 4746 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 28 20:40:26 crc kubenswrapper[4746]: I0128 20:40:26.604988 4746 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 28 20:40:26 crc kubenswrapper[4746]: I0128 20:40:26.605002 4746 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 28 20:40:26 crc kubenswrapper[4746]: I0128 20:40:26.605025 4746 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 28 20:40:26 crc kubenswrapper[4746]: I0128 20:40:26.605043 4746 setters.go:603] "Node became not ready" node="crc" 
condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-28T20:40:26Z","lastTransitionTime":"2026-01-28T20:40:26Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 28 20:40:26 crc kubenswrapper[4746]: I0128 20:40:26.609605 4746 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-28T20:39:52Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-28T20:39:52Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-28T20:40:26Z is after 2025-08-24T17:21:41Z" Jan 28 20:40:26 crc kubenswrapper[4746]: I0128 20:40:26.624815 4746 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-28T20:39:53Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-28T20:39:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-28T20:39:53Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5dff3e2075b8bbfb6af77c72362ccc37e1d7810a2d97c096cf501681c98699ab\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-28T20:39:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://772e165efa6d865a669c172958e0aea3112146637e354e4587b7e664454112ad\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af
0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-28T20:39:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-28T20:40:26Z is after 2025-08-24T17:21:41Z" Jan 28 20:40:26 crc kubenswrapper[4746]: I0128 20:40:26.637932 4746 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-6wrnw" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"6dc8b546-9734-4082-b2b3-2bafe3f1564d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-28T20:39:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-28T20:39:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-28T20:39:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-28T20:39:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://50ec1ea913ec63a096032bd968bda5c5c8879e16a08e6a57727750517d9af038\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-28T20:39:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-z84f9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://66d250466155027405bf90c4e5ed09388238d4ce
e63604e28486a16778f9d188\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-28T20:39:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-z84f9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-28T20:39:53Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-6wrnw\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-28T20:40:26Z is after 2025-08-24T17:21:41Z" Jan 28 20:40:26 crc kubenswrapper[4746]: I0128 20:40:26.653014 4746 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-ht6hp" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"beb2b795-6bf4-4d38-89f7-bcb5512c3e61\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-28T20:39:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-28T20:40:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-28T20:40:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-28T20:40:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://564d42c4c25f218c1a1f52449d2f3d941c1dad920c8100d1d908cadf7cf46607\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-28T20:40:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rr62b\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b2d8bf207108ac2304cd04e34cd9843fcb755ac8610a3f88448538d1e99359f6\
\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b2d8bf207108ac2304cd04e34cd9843fcb755ac8610a3f88448538d1e99359f6\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-28T20:39:54Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-28T20:39:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rr62b\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://40c78d39cb70eba245771f4d85d2c8bc7b77427ea216ad3b89749d24fd550098\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://40c78d39cb70eba245771f4d85d2c8bc7b77427ea216ad3b89749d24fd550098\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-28T20:39:55Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-28T20:39:55Z\\\"}},\\\
"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rr62b\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://498028ca0bd260ca2c4d7b0231ea54a5a6ff5b302f720fbd209a4f640507258b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://498028ca0bd260ca2c4d7b0231ea54a5a6ff5b302f720fbd209a4f640507258b\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-28T20:39:56Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-28T20:39:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rr62b\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://661d4
a9031b6789f38ed1ac8e3c40d72182709130e1680ecdad86c07c1f34b7b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://661d4a9031b6789f38ed1ac8e3c40d72182709130e1680ecdad86c07c1f34b7b\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-28T20:39:57Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-28T20:39:57Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rr62b\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://a49ffe0bbe4151671dc8803d7ab61cb9b3a2eae064b81e2830bc074787f4754e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://a49ffe0bbe4151671dc8803d7ab61cb9b3a2eae064b81e2830bc074787f4754e\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-28T20:39:58Z\\\",\\\"reason\\\":\\\"Co
mpleted\\\",\\\"startedAt\\\":\\\"2026-01-28T20:39:58Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rr62b\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://cd521e9ecb53f581b5732a4d018f92007063ce0912e6d760fe3c920218c23b14\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://cd521e9ecb53f581b5732a4d018f92007063ce0912e6d760fe3c920218c23b14\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-28T20:39:59Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-28T20:39:59Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rr62b\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-28T20:39:53Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-ht6hp\": Internal error occurred: failed 
calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-28T20:40:26Z is after 2025-08-24T17:21:41Z" Jan 28 20:40:26 crc kubenswrapper[4746]: I0128 20:40:26.672302 4746 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-qhpvf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"cdf26de0-b602-4bdf-b492-65b3b6b31434\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-28T20:39:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-28T20:39:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-28T20:39:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-28T20:39:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f80345dfd4de46e278611370e6c337968cb1ba7ed8275457deea5871704f8367\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-
01-28T20:39:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jnpsl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-28T20:39:53Z\\\"}}\" for pod \"openshift-multus\"/\"multus-qhpvf\": Internal error occurred: failed 
calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-28T20:40:26Z is after 2025-08-24T17:21:41Z" Jan 28 20:40:26 crc kubenswrapper[4746]: I0128 20:40:26.689374 4746 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-2blg6" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"f60a5487-5012-4cc9-ad94-5dfb4957d74e\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-28T20:40:07Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-28T20:40:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-28T20:40:07Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-28T20:40:07Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-l2www\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-l2www\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-28T20:40:07Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-2blg6\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-28T20:40:26Z is after 2025-08-24T17:21:41Z" Jan 28 20:40:26 crc 
kubenswrapper[4746]: I0128 20:40:26.707563 4746 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"64499f0d-5c10-43a9-b479-9b6c7ef2fb9c\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-28T20:39:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-28T20:39:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-28T20:40:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-28T20:40:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-28T20:39:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://9a304b12b4a1e30cbbbb16aa4f91514f1b1a64c716cf8ac023803a279b310f9c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-28T20:39:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://6af41f85c56be6
c24127bec0aee8e02562dbb04bd22e57c6887f6f5e914a7530\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-28T20:39:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://f31e9e80576ec4eb59ec92039601fd94bd323d29493c353f9ec6b9c61187b0d3\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-28T20:39:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://64f8ae1fcf8b14a65196ffa6d5a50db39aa846dc4e6cc0b92ba54d9b5f6913e3\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"te
rminated\\\":{\\\"containerID\\\":\\\"cri-o://56b250a192cfa698ccd7ac8233b75b9acbf2347549c712b47ccc1b89f144aa83\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-01-28T20:39:52Z\\\",\\\"message\\\":\\\"le observer\\\\nW0128 20:39:52.695680 1 builder.go:272] unable to get owner reference (falling back to namespace): pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI0128 20:39:52.695807 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0128 20:39:52.696767 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-2500116620/tls.crt::/tmp/serving-cert-2500116620/tls.key\\\\\\\"\\\\nI0128 20:39:52.900914 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI0128 20:39:52.915714 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI0128 20:39:52.915750 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI0128 20:39:52.916035 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI0128 20:39:52.916045 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI0128 20:39:52.925040 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW0128 20:39:52.925098 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0128 20:39:52.925104 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0128 20:39:52.925109 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW0128 20:39:52.925113 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW0128 20:39:52.925116 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW0128 
20:39:52.925119 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nI0128 20:39:52.925416 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nF0128 20:39:52.926926 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-01-28T20:39:47Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":2,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-28T20:40:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://03a94dee0b9663441e0ecec16b121c5be4e5c557f17c929b8edfb0d7ba00c3d5\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-28T20:39:35Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c5808a729190035a73529ac58efb0790f2d9c3b98c5f1a8017389e4ca1738c4c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\
\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://c5808a729190035a73529ac58efb0790f2d9c3b98c5f1a8017389e4ca1738c4c\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-28T20:39:33Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-28T20:39:33Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-28T20:39:32Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-28T20:40:26Z is after 2025-08-24T17:21:41Z" Jan 28 20:40:26 crc kubenswrapper[4746]: I0128 20:40:26.707902 4746 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 28 20:40:26 crc kubenswrapper[4746]: I0128 20:40:26.707986 4746 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 28 20:40:26 crc kubenswrapper[4746]: I0128 20:40:26.708001 4746 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 28 20:40:26 crc kubenswrapper[4746]: I0128 20:40:26.708046 4746 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 28 20:40:26 crc kubenswrapper[4746]: I0128 20:40:26.708061 4746 setters.go:603] "Node became not ready" node="crc" 
condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-28T20:40:26Z","lastTransitionTime":"2026-01-28T20:40:26Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 28 20:40:26 crc kubenswrapper[4746]: I0128 20:40:26.726312 4746 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-28T20:39:54Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-28T20:39:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-28T20:39:54Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://2f64131d62eca6c3a42e9177d14fa39ed6435e536f60ef5d25ff6bf061d2cc0a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-28T20:39:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\
\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-28T20:40:26Z is after 2025-08-24T17:21:41Z" Jan 28 20:40:26 crc kubenswrapper[4746]: I0128 20:40:26.743060 4746 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-28T20:39:57Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-28T20:39:57Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-28T20:39:57Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://83a275ec9868f2fd9e24135d5a05c27caf843340c144070952291bb6a3715035\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"las
tState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-28T20:39:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-28T20:40:26Z is after 2025-08-24T17:21:41Z" Jan 28 20:40:26 crc kubenswrapper[4746]: I0128 20:40:26.811778 4746 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 28 20:40:26 crc kubenswrapper[4746]: I0128 20:40:26.811887 4746 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 28 20:40:26 crc kubenswrapper[4746]: I0128 20:40:26.811910 4746 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 28 20:40:26 crc kubenswrapper[4746]: I0128 20:40:26.811938 4746 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 28 20:40:26 crc kubenswrapper[4746]: I0128 20:40:26.811954 4746 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-28T20:40:26Z","lastTransitionTime":"2026-01-28T20:40:26Z","reason":"KubeletNotReady","message":"container runtime 
network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 28 20:40:26 crc kubenswrapper[4746]: I0128 20:40:26.833554 4746 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2026-01-12 21:19:58.481801641 +0000 UTC Jan 28 20:40:26 crc kubenswrapper[4746]: I0128 20:40:26.834985 4746 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Jan 28 20:40:26 crc kubenswrapper[4746]: I0128 20:40:26.835040 4746 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-2blg6" Jan 28 20:40:26 crc kubenswrapper[4746]: I0128 20:40:26.835332 4746 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Jan 28 20:40:26 crc kubenswrapper[4746]: E0128 20:40:26.835334 4746 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Jan 28 20:40:26 crc kubenswrapper[4746]: E0128 20:40:26.835493 4746 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-2blg6" podUID="f60a5487-5012-4cc9-ad94-5dfb4957d74e" Jan 28 20:40:26 crc kubenswrapper[4746]: E0128 20:40:26.835665 4746 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Jan 28 20:40:26 crc kubenswrapper[4746]: I0128 20:40:26.915896 4746 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 28 20:40:26 crc kubenswrapper[4746]: I0128 20:40:26.915962 4746 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 28 20:40:26 crc kubenswrapper[4746]: I0128 20:40:26.915972 4746 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 28 20:40:26 crc kubenswrapper[4746]: I0128 20:40:26.915989 4746 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 28 20:40:26 crc kubenswrapper[4746]: I0128 20:40:26.915998 4746 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-28T20:40:26Z","lastTransitionTime":"2026-01-28T20:40:26Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 28 20:40:27 crc kubenswrapper[4746]: I0128 20:40:27.018753 4746 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 28 20:40:27 crc kubenswrapper[4746]: I0128 20:40:27.018831 4746 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 28 20:40:27 crc kubenswrapper[4746]: I0128 20:40:27.018851 4746 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 28 20:40:27 crc kubenswrapper[4746]: I0128 20:40:27.018884 4746 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 28 20:40:27 crc kubenswrapper[4746]: I0128 20:40:27.018904 4746 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-28T20:40:27Z","lastTransitionTime":"2026-01-28T20:40:27Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 28 20:40:27 crc kubenswrapper[4746]: I0128 20:40:27.127699 4746 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 28 20:40:27 crc kubenswrapper[4746]: I0128 20:40:27.127764 4746 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 28 20:40:27 crc kubenswrapper[4746]: I0128 20:40:27.127776 4746 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 28 20:40:27 crc kubenswrapper[4746]: I0128 20:40:27.127797 4746 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 28 20:40:27 crc kubenswrapper[4746]: I0128 20:40:27.127815 4746 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-28T20:40:27Z","lastTransitionTime":"2026-01-28T20:40:27Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 28 20:40:27 crc kubenswrapper[4746]: I0128 20:40:27.232356 4746 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 28 20:40:27 crc kubenswrapper[4746]: I0128 20:40:27.232434 4746 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 28 20:40:27 crc kubenswrapper[4746]: I0128 20:40:27.232452 4746 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 28 20:40:27 crc kubenswrapper[4746]: I0128 20:40:27.232484 4746 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 28 20:40:27 crc kubenswrapper[4746]: I0128 20:40:27.232509 4746 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-28T20:40:27Z","lastTransitionTime":"2026-01-28T20:40:27Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 28 20:40:27 crc kubenswrapper[4746]: I0128 20:40:27.336976 4746 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 28 20:40:27 crc kubenswrapper[4746]: I0128 20:40:27.337063 4746 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 28 20:40:27 crc kubenswrapper[4746]: I0128 20:40:27.337115 4746 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 28 20:40:27 crc kubenswrapper[4746]: I0128 20:40:27.337146 4746 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 28 20:40:27 crc kubenswrapper[4746]: I0128 20:40:27.337166 4746 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-28T20:40:27Z","lastTransitionTime":"2026-01-28T20:40:27Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 28 20:40:27 crc kubenswrapper[4746]: I0128 20:40:27.441146 4746 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 28 20:40:27 crc kubenswrapper[4746]: I0128 20:40:27.441217 4746 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 28 20:40:27 crc kubenswrapper[4746]: I0128 20:40:27.441239 4746 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 28 20:40:27 crc kubenswrapper[4746]: I0128 20:40:27.441271 4746 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 28 20:40:27 crc kubenswrapper[4746]: I0128 20:40:27.441294 4746 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-28T20:40:27Z","lastTransitionTime":"2026-01-28T20:40:27Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 28 20:40:27 crc kubenswrapper[4746]: I0128 20:40:27.544598 4746 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 28 20:40:27 crc kubenswrapper[4746]: I0128 20:40:27.544667 4746 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 28 20:40:27 crc kubenswrapper[4746]: I0128 20:40:27.544686 4746 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 28 20:40:27 crc kubenswrapper[4746]: I0128 20:40:27.544717 4746 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 28 20:40:27 crc kubenswrapper[4746]: I0128 20:40:27.544738 4746 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-28T20:40:27Z","lastTransitionTime":"2026-01-28T20:40:27Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 28 20:40:27 crc kubenswrapper[4746]: I0128 20:40:27.648189 4746 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 28 20:40:27 crc kubenswrapper[4746]: I0128 20:40:27.648249 4746 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 28 20:40:27 crc kubenswrapper[4746]: I0128 20:40:27.648267 4746 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 28 20:40:27 crc kubenswrapper[4746]: I0128 20:40:27.648290 4746 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 28 20:40:27 crc kubenswrapper[4746]: I0128 20:40:27.648310 4746 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-28T20:40:27Z","lastTransitionTime":"2026-01-28T20:40:27Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 28 20:40:27 crc kubenswrapper[4746]: I0128 20:40:27.751573 4746 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 28 20:40:27 crc kubenswrapper[4746]: I0128 20:40:27.751645 4746 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 28 20:40:27 crc kubenswrapper[4746]: I0128 20:40:27.751659 4746 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 28 20:40:27 crc kubenswrapper[4746]: I0128 20:40:27.751683 4746 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 28 20:40:27 crc kubenswrapper[4746]: I0128 20:40:27.751696 4746 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-28T20:40:27Z","lastTransitionTime":"2026-01-28T20:40:27Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 28 20:40:27 crc kubenswrapper[4746]: I0128 20:40:27.834129 4746 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2025-11-18 03:40:56.948447192 +0000 UTC Jan 28 20:40:27 crc kubenswrapper[4746]: I0128 20:40:27.835488 4746 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Jan 28 20:40:27 crc kubenswrapper[4746]: E0128 20:40:27.835659 4746 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Jan 28 20:40:27 crc kubenswrapper[4746]: I0128 20:40:27.855179 4746 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 28 20:40:27 crc kubenswrapper[4746]: I0128 20:40:27.855222 4746 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 28 20:40:27 crc kubenswrapper[4746]: I0128 20:40:27.855234 4746 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 28 20:40:27 crc kubenswrapper[4746]: I0128 20:40:27.855252 4746 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 28 20:40:27 crc kubenswrapper[4746]: I0128 20:40:27.855265 4746 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-28T20:40:27Z","lastTransitionTime":"2026-01-28T20:40:27Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 28 20:40:27 crc kubenswrapper[4746]: I0128 20:40:27.958554 4746 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 28 20:40:27 crc kubenswrapper[4746]: I0128 20:40:27.958628 4746 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 28 20:40:27 crc kubenswrapper[4746]: I0128 20:40:27.958644 4746 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 28 20:40:27 crc kubenswrapper[4746]: I0128 20:40:27.958667 4746 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 28 20:40:27 crc kubenswrapper[4746]: I0128 20:40:27.958683 4746 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-28T20:40:27Z","lastTransitionTime":"2026-01-28T20:40:27Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 28 20:40:28 crc kubenswrapper[4746]: I0128 20:40:28.063242 4746 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 28 20:40:28 crc kubenswrapper[4746]: I0128 20:40:28.063301 4746 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 28 20:40:28 crc kubenswrapper[4746]: I0128 20:40:28.063317 4746 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 28 20:40:28 crc kubenswrapper[4746]: I0128 20:40:28.063343 4746 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 28 20:40:28 crc kubenswrapper[4746]: I0128 20:40:28.063365 4746 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-28T20:40:28Z","lastTransitionTime":"2026-01-28T20:40:28Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 28 20:40:28 crc kubenswrapper[4746]: I0128 20:40:28.166417 4746 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 28 20:40:28 crc kubenswrapper[4746]: I0128 20:40:28.166470 4746 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 28 20:40:28 crc kubenswrapper[4746]: I0128 20:40:28.166480 4746 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 28 20:40:28 crc kubenswrapper[4746]: I0128 20:40:28.166498 4746 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 28 20:40:28 crc kubenswrapper[4746]: I0128 20:40:28.166508 4746 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-28T20:40:28Z","lastTransitionTime":"2026-01-28T20:40:28Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 28 20:40:28 crc kubenswrapper[4746]: I0128 20:40:28.270142 4746 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 28 20:40:28 crc kubenswrapper[4746]: I0128 20:40:28.270179 4746 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 28 20:40:28 crc kubenswrapper[4746]: I0128 20:40:28.270191 4746 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 28 20:40:28 crc kubenswrapper[4746]: I0128 20:40:28.270212 4746 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 28 20:40:28 crc kubenswrapper[4746]: I0128 20:40:28.270226 4746 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-28T20:40:28Z","lastTransitionTime":"2026-01-28T20:40:28Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 28 20:40:28 crc kubenswrapper[4746]: I0128 20:40:28.375056 4746 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 28 20:40:28 crc kubenswrapper[4746]: I0128 20:40:28.375269 4746 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 28 20:40:28 crc kubenswrapper[4746]: I0128 20:40:28.375344 4746 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 28 20:40:28 crc kubenswrapper[4746]: I0128 20:40:28.375470 4746 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 28 20:40:28 crc kubenswrapper[4746]: I0128 20:40:28.375501 4746 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-28T20:40:28Z","lastTransitionTime":"2026-01-28T20:40:28Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 28 20:40:28 crc kubenswrapper[4746]: I0128 20:40:28.480040 4746 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 28 20:40:28 crc kubenswrapper[4746]: I0128 20:40:28.480162 4746 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 28 20:40:28 crc kubenswrapper[4746]: I0128 20:40:28.480188 4746 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 28 20:40:28 crc kubenswrapper[4746]: I0128 20:40:28.480225 4746 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 28 20:40:28 crc kubenswrapper[4746]: I0128 20:40:28.480253 4746 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-28T20:40:28Z","lastTransitionTime":"2026-01-28T20:40:28Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 28 20:40:28 crc kubenswrapper[4746]: I0128 20:40:28.583796 4746 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 28 20:40:28 crc kubenswrapper[4746]: I0128 20:40:28.584237 4746 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 28 20:40:28 crc kubenswrapper[4746]: I0128 20:40:28.584411 4746 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 28 20:40:28 crc kubenswrapper[4746]: I0128 20:40:28.584627 4746 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 28 20:40:28 crc kubenswrapper[4746]: I0128 20:40:28.584872 4746 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-28T20:40:28Z","lastTransitionTime":"2026-01-28T20:40:28Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 28 20:40:28 crc kubenswrapper[4746]: I0128 20:40:28.688550 4746 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 28 20:40:28 crc kubenswrapper[4746]: I0128 20:40:28.688620 4746 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 28 20:40:28 crc kubenswrapper[4746]: I0128 20:40:28.688646 4746 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 28 20:40:28 crc kubenswrapper[4746]: I0128 20:40:28.688676 4746 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 28 20:40:28 crc kubenswrapper[4746]: I0128 20:40:28.688732 4746 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-28T20:40:28Z","lastTransitionTime":"2026-01-28T20:40:28Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 28 20:40:28 crc kubenswrapper[4746]: I0128 20:40:28.792262 4746 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 28 20:40:28 crc kubenswrapper[4746]: I0128 20:40:28.792338 4746 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 28 20:40:28 crc kubenswrapper[4746]: I0128 20:40:28.792361 4746 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 28 20:40:28 crc kubenswrapper[4746]: I0128 20:40:28.792409 4746 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 28 20:40:28 crc kubenswrapper[4746]: I0128 20:40:28.792435 4746 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-28T20:40:28Z","lastTransitionTime":"2026-01-28T20:40:28Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 28 20:40:28 crc kubenswrapper[4746]: I0128 20:40:28.834395 4746 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2026-01-08 00:20:36.031067363 +0000 UTC Jan 28 20:40:28 crc kubenswrapper[4746]: I0128 20:40:28.835769 4746 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Jan 28 20:40:28 crc kubenswrapper[4746]: I0128 20:40:28.835851 4746 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-2blg6" Jan 28 20:40:28 crc kubenswrapper[4746]: I0128 20:40:28.835768 4746 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Jan 28 20:40:28 crc kubenswrapper[4746]: E0128 20:40:28.836000 4746 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Jan 28 20:40:28 crc kubenswrapper[4746]: E0128 20:40:28.836186 4746 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-2blg6" podUID="f60a5487-5012-4cc9-ad94-5dfb4957d74e" Jan 28 20:40:28 crc kubenswrapper[4746]: E0128 20:40:28.836347 4746 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Jan 28 20:40:28 crc kubenswrapper[4746]: I0128 20:40:28.896050 4746 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 28 20:40:28 crc kubenswrapper[4746]: I0128 20:40:28.896228 4746 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 28 20:40:28 crc kubenswrapper[4746]: I0128 20:40:28.896240 4746 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 28 20:40:28 crc kubenswrapper[4746]: I0128 20:40:28.896262 4746 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 28 20:40:28 crc kubenswrapper[4746]: I0128 20:40:28.896275 4746 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-28T20:40:28Z","lastTransitionTime":"2026-01-28T20:40:28Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 28 20:40:29 crc kubenswrapper[4746]: I0128 20:40:29.001022 4746 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 28 20:40:29 crc kubenswrapper[4746]: I0128 20:40:29.001144 4746 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 28 20:40:29 crc kubenswrapper[4746]: I0128 20:40:29.001181 4746 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 28 20:40:29 crc kubenswrapper[4746]: I0128 20:40:29.001237 4746 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 28 20:40:29 crc kubenswrapper[4746]: I0128 20:40:29.001262 4746 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-28T20:40:29Z","lastTransitionTime":"2026-01-28T20:40:29Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 28 20:40:29 crc kubenswrapper[4746]: I0128 20:40:29.105565 4746 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 28 20:40:29 crc kubenswrapper[4746]: I0128 20:40:29.105634 4746 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 28 20:40:29 crc kubenswrapper[4746]: I0128 20:40:29.105651 4746 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 28 20:40:29 crc kubenswrapper[4746]: I0128 20:40:29.105674 4746 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 28 20:40:29 crc kubenswrapper[4746]: I0128 20:40:29.105689 4746 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-28T20:40:29Z","lastTransitionTime":"2026-01-28T20:40:29Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 28 20:40:29 crc kubenswrapper[4746]: I0128 20:40:29.208075 4746 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 28 20:40:29 crc kubenswrapper[4746]: I0128 20:40:29.208129 4746 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 28 20:40:29 crc kubenswrapper[4746]: I0128 20:40:29.208140 4746 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 28 20:40:29 crc kubenswrapper[4746]: I0128 20:40:29.208159 4746 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 28 20:40:29 crc kubenswrapper[4746]: I0128 20:40:29.208171 4746 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-28T20:40:29Z","lastTransitionTime":"2026-01-28T20:40:29Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 28 20:40:29 crc kubenswrapper[4746]: I0128 20:40:29.311657 4746 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 28 20:40:29 crc kubenswrapper[4746]: I0128 20:40:29.311729 4746 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 28 20:40:29 crc kubenswrapper[4746]: I0128 20:40:29.311745 4746 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 28 20:40:29 crc kubenswrapper[4746]: I0128 20:40:29.311774 4746 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 28 20:40:29 crc kubenswrapper[4746]: I0128 20:40:29.311797 4746 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-28T20:40:29Z","lastTransitionTime":"2026-01-28T20:40:29Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 28 20:40:29 crc kubenswrapper[4746]: I0128 20:40:29.415623 4746 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 28 20:40:29 crc kubenswrapper[4746]: I0128 20:40:29.415696 4746 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 28 20:40:29 crc kubenswrapper[4746]: I0128 20:40:29.415721 4746 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 28 20:40:29 crc kubenswrapper[4746]: I0128 20:40:29.415758 4746 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 28 20:40:29 crc kubenswrapper[4746]: I0128 20:40:29.415784 4746 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-28T20:40:29Z","lastTransitionTime":"2026-01-28T20:40:29Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 28 20:40:29 crc kubenswrapper[4746]: I0128 20:40:29.519298 4746 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 28 20:40:29 crc kubenswrapper[4746]: I0128 20:40:29.519368 4746 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 28 20:40:29 crc kubenswrapper[4746]: I0128 20:40:29.519382 4746 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 28 20:40:29 crc kubenswrapper[4746]: I0128 20:40:29.519406 4746 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 28 20:40:29 crc kubenswrapper[4746]: I0128 20:40:29.519421 4746 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-28T20:40:29Z","lastTransitionTime":"2026-01-28T20:40:29Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 28 20:40:29 crc kubenswrapper[4746]: I0128 20:40:29.622551 4746 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 28 20:40:29 crc kubenswrapper[4746]: I0128 20:40:29.622631 4746 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 28 20:40:29 crc kubenswrapper[4746]: I0128 20:40:29.622655 4746 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 28 20:40:29 crc kubenswrapper[4746]: I0128 20:40:29.622686 4746 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 28 20:40:29 crc kubenswrapper[4746]: I0128 20:40:29.622708 4746 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-28T20:40:29Z","lastTransitionTime":"2026-01-28T20:40:29Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 28 20:40:29 crc kubenswrapper[4746]: I0128 20:40:29.725458 4746 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 28 20:40:29 crc kubenswrapper[4746]: I0128 20:40:29.725502 4746 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 28 20:40:29 crc kubenswrapper[4746]: I0128 20:40:29.725512 4746 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 28 20:40:29 crc kubenswrapper[4746]: I0128 20:40:29.725532 4746 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 28 20:40:29 crc kubenswrapper[4746]: I0128 20:40:29.725543 4746 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-28T20:40:29Z","lastTransitionTime":"2026-01-28T20:40:29Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 28 20:40:29 crc kubenswrapper[4746]: I0128 20:40:29.834868 4746 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 28 20:40:29 crc kubenswrapper[4746]: I0128 20:40:29.834916 4746 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 28 20:40:29 crc kubenswrapper[4746]: I0128 20:40:29.834936 4746 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 28 20:40:29 crc kubenswrapper[4746]: I0128 20:40:29.834957 4746 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 28 20:40:29 crc kubenswrapper[4746]: I0128 20:40:29.834975 4746 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-28T20:40:29Z","lastTransitionTime":"2026-01-28T20:40:29Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 28 20:40:29 crc kubenswrapper[4746]: I0128 20:40:29.835455 4746 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Jan 28 20:40:29 crc kubenswrapper[4746]: E0128 20:40:29.835800 4746 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Jan 28 20:40:29 crc kubenswrapper[4746]: I0128 20:40:29.836423 4746 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2026-01-10 21:49:34.959992864 +0000 UTC Jan 28 20:40:29 crc kubenswrapper[4746]: I0128 20:40:29.937906 4746 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 28 20:40:29 crc kubenswrapper[4746]: I0128 20:40:29.937943 4746 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 28 20:40:29 crc kubenswrapper[4746]: I0128 20:40:29.937952 4746 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 28 20:40:29 crc kubenswrapper[4746]: I0128 20:40:29.937966 4746 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 28 20:40:29 crc kubenswrapper[4746]: I0128 20:40:29.937979 4746 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-28T20:40:29Z","lastTransitionTime":"2026-01-28T20:40:29Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 28 20:40:30 crc kubenswrapper[4746]: I0128 20:40:30.040946 4746 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 28 20:40:30 crc kubenswrapper[4746]: I0128 20:40:30.040977 4746 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 28 20:40:30 crc kubenswrapper[4746]: I0128 20:40:30.040987 4746 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 28 20:40:30 crc kubenswrapper[4746]: I0128 20:40:30.041000 4746 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 28 20:40:30 crc kubenswrapper[4746]: I0128 20:40:30.041010 4746 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-28T20:40:30Z","lastTransitionTime":"2026-01-28T20:40:30Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 28 20:40:30 crc kubenswrapper[4746]: I0128 20:40:30.076192 4746 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 28 20:40:30 crc kubenswrapper[4746]: I0128 20:40:30.076221 4746 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 28 20:40:30 crc kubenswrapper[4746]: I0128 20:40:30.076229 4746 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 28 20:40:30 crc kubenswrapper[4746]: I0128 20:40:30.076243 4746 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 28 20:40:30 crc kubenswrapper[4746]: I0128 20:40:30.076252 4746 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-28T20:40:30Z","lastTransitionTime":"2026-01-28T20:40:30Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 28 20:40:30 crc kubenswrapper[4746]: E0128 20:40:30.090340 4746 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-01-28T20:40:30Z\\\",\\\"lastTransitionTime\\\":\\\"2026-01-28T20:40:30Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-01-28T20:40:30Z\\\",\\\"lastTransitionTime\\\":\\\"2026-01-28T20:40:30Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-01-28T20:40:30Z\\\",\\\"lastTransitionTime\\\":\\\"2026-01-28T20:40:30Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-01-28T20:40:30Z\\\",\\\"lastTransitionTime\\\":\\\"2026-01-28T20:40:30Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"349dd4ef-f3ea-4c41-bfa2-75ea02498ab0\\\",\\\"systemUUID\\\":\\\"e89ecf32-8beb-4b41-b6df-0f1293ce0213\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-28T20:40:30Z is after 2025-08-24T17:21:41Z" Jan 28 20:40:30 crc kubenswrapper[4746]: I0128 20:40:30.094446 4746 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 28 20:40:30 crc kubenswrapper[4746]: I0128 20:40:30.094490 4746 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 28 20:40:30 crc kubenswrapper[4746]: I0128 20:40:30.094502 4746 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 28 20:40:30 crc kubenswrapper[4746]: I0128 20:40:30.094519 4746 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 28 20:40:30 crc kubenswrapper[4746]: I0128 20:40:30.094530 4746 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-28T20:40:30Z","lastTransitionTime":"2026-01-28T20:40:30Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 28 20:40:30 crc kubenswrapper[4746]: E0128 20:40:30.109213 4746 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-01-28T20:40:30Z\\\",\\\"lastTransitionTime\\\":\\\"2026-01-28T20:40:30Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-01-28T20:40:30Z\\\",\\\"lastTransitionTime\\\":\\\"2026-01-28T20:40:30Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-01-28T20:40:30Z\\\",\\\"lastTransitionTime\\\":\\\"2026-01-28T20:40:30Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-01-28T20:40:30Z\\\",\\\"lastTransitionTime\\\":\\\"2026-01-28T20:40:30Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"349dd4ef-f3ea-4c41-bfa2-75ea02498ab0\\\",\\\"systemUUID\\\":\\\"e89ecf32-8beb-4b41-b6df-0f1293ce0213\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-28T20:40:30Z is after 2025-08-24T17:21:41Z" Jan 28 20:40:30 crc kubenswrapper[4746]: I0128 20:40:30.113115 4746 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 28 20:40:30 crc kubenswrapper[4746]: I0128 20:40:30.113157 4746 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 28 20:40:30 crc kubenswrapper[4746]: I0128 20:40:30.113169 4746 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 28 20:40:30 crc kubenswrapper[4746]: I0128 20:40:30.113185 4746 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 28 20:40:30 crc kubenswrapper[4746]: I0128 20:40:30.113195 4746 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-28T20:40:30Z","lastTransitionTime":"2026-01-28T20:40:30Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 28 20:40:30 crc kubenswrapper[4746]: E0128 20:40:30.125786 4746 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-01-28T20:40:30Z\\\",\\\"lastTransitionTime\\\":\\\"2026-01-28T20:40:30Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-01-28T20:40:30Z\\\",\\\"lastTransitionTime\\\":\\\"2026-01-28T20:40:30Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-01-28T20:40:30Z\\\",\\\"lastTransitionTime\\\":\\\"2026-01-28T20:40:30Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-01-28T20:40:30Z\\\",\\\"lastTransitionTime\\\":\\\"2026-01-28T20:40:30Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"349dd4ef-f3ea-4c41-bfa2-75ea02498ab0\\\",\\\"systemUUID\\\":\\\"e89ecf32-8beb-4b41-b6df-0f1293ce0213\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-28T20:40:30Z is after 2025-08-24T17:21:41Z" Jan 28 20:40:30 crc kubenswrapper[4746]: I0128 20:40:30.130012 4746 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 28 20:40:30 crc kubenswrapper[4746]: I0128 20:40:30.130049 4746 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 28 20:40:30 crc kubenswrapper[4746]: I0128 20:40:30.130060 4746 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 28 20:40:30 crc kubenswrapper[4746]: I0128 20:40:30.130110 4746 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 28 20:40:30 crc kubenswrapper[4746]: I0128 20:40:30.130126 4746 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-28T20:40:30Z","lastTransitionTime":"2026-01-28T20:40:30Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 28 20:40:30 crc kubenswrapper[4746]: E0128 20:40:30.145001 4746 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-01-28T20:40:30Z\\\",\\\"lastTransitionTime\\\":\\\"2026-01-28T20:40:30Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-01-28T20:40:30Z\\\",\\\"lastTransitionTime\\\":\\\"2026-01-28T20:40:30Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-01-28T20:40:30Z\\\",\\\"lastTransitionTime\\\":\\\"2026-01-28T20:40:30Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-01-28T20:40:30Z\\\",\\\"lastTransitionTime\\\":\\\"2026-01-28T20:40:30Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"349dd4ef-f3ea-4c41-bfa2-75ea02498ab0\\\",\\\"systemUUID\\\":\\\"e89ecf32-8beb-4b41-b6df-0f1293ce0213\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-28T20:40:30Z is after 2025-08-24T17:21:41Z" Jan 28 20:40:30 crc kubenswrapper[4746]: I0128 20:40:30.148275 4746 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 28 20:40:30 crc kubenswrapper[4746]: I0128 20:40:30.148307 4746 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 28 20:40:30 crc kubenswrapper[4746]: I0128 20:40:30.148318 4746 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 28 20:40:30 crc kubenswrapper[4746]: I0128 20:40:30.148333 4746 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 28 20:40:30 crc kubenswrapper[4746]: I0128 20:40:30.148346 4746 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-28T20:40:30Z","lastTransitionTime":"2026-01-28T20:40:30Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 28 20:40:30 crc kubenswrapper[4746]: E0128 20:40:30.159645 4746 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-01-28T20:40:30Z\\\",\\\"lastTransitionTime\\\":\\\"2026-01-28T20:40:30Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-01-28T20:40:30Z\\\",\\\"lastTransitionTime\\\":\\\"2026-01-28T20:40:30Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-01-28T20:40:30Z\\\",\\\"lastTransitionTime\\\":\\\"2026-01-28T20:40:30Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-01-28T20:40:30Z\\\",\\\"lastTransitionTime\\\":\\\"2026-01-28T20:40:30Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"349dd4ef-f3ea-4c41-bfa2-75ea02498ab0\\\",\\\"systemUUID\\\":\\\"e89ecf32-8beb-4b41-b6df-0f1293ce0213\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-28T20:40:30Z is after 2025-08-24T17:21:41Z" Jan 28 20:40:30 crc kubenswrapper[4746]: E0128 20:40:30.159749 4746 kubelet_node_status.go:572] "Unable to update node status" err="update node status exceeds retry count" Jan 28 20:40:30 crc kubenswrapper[4746]: I0128 20:40:30.161203 4746 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 28 20:40:30 crc kubenswrapper[4746]: I0128 20:40:30.161244 4746 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 28 20:40:30 crc kubenswrapper[4746]: I0128 20:40:30.161256 4746 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 28 20:40:30 crc kubenswrapper[4746]: I0128 20:40:30.161274 4746 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 28 20:40:30 crc kubenswrapper[4746]: I0128 20:40:30.161287 4746 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-28T20:40:30Z","lastTransitionTime":"2026-01-28T20:40:30Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 28 20:40:30 crc kubenswrapper[4746]: I0128 20:40:30.263917 4746 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 28 20:40:30 crc kubenswrapper[4746]: I0128 20:40:30.263971 4746 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 28 20:40:30 crc kubenswrapper[4746]: I0128 20:40:30.263987 4746 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 28 20:40:30 crc kubenswrapper[4746]: I0128 20:40:30.264011 4746 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 28 20:40:30 crc kubenswrapper[4746]: I0128 20:40:30.264027 4746 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-28T20:40:30Z","lastTransitionTime":"2026-01-28T20:40:30Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 28 20:40:30 crc kubenswrapper[4746]: I0128 20:40:30.366278 4746 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 28 20:40:30 crc kubenswrapper[4746]: I0128 20:40:30.366333 4746 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 28 20:40:30 crc kubenswrapper[4746]: I0128 20:40:30.366341 4746 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 28 20:40:30 crc kubenswrapper[4746]: I0128 20:40:30.366357 4746 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 28 20:40:30 crc kubenswrapper[4746]: I0128 20:40:30.366368 4746 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-28T20:40:30Z","lastTransitionTime":"2026-01-28T20:40:30Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 28 20:40:30 crc kubenswrapper[4746]: I0128 20:40:30.468676 4746 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 28 20:40:30 crc kubenswrapper[4746]: I0128 20:40:30.468714 4746 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 28 20:40:30 crc kubenswrapper[4746]: I0128 20:40:30.468722 4746 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 28 20:40:30 crc kubenswrapper[4746]: I0128 20:40:30.468735 4746 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 28 20:40:30 crc kubenswrapper[4746]: I0128 20:40:30.468744 4746 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-28T20:40:30Z","lastTransitionTime":"2026-01-28T20:40:30Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 28 20:40:30 crc kubenswrapper[4746]: I0128 20:40:30.570975 4746 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 28 20:40:30 crc kubenswrapper[4746]: I0128 20:40:30.571018 4746 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 28 20:40:30 crc kubenswrapper[4746]: I0128 20:40:30.571029 4746 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 28 20:40:30 crc kubenswrapper[4746]: I0128 20:40:30.571045 4746 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 28 20:40:30 crc kubenswrapper[4746]: I0128 20:40:30.571056 4746 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-28T20:40:30Z","lastTransitionTime":"2026-01-28T20:40:30Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 28 20:40:30 crc kubenswrapper[4746]: I0128 20:40:30.673230 4746 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 28 20:40:30 crc kubenswrapper[4746]: I0128 20:40:30.673289 4746 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 28 20:40:30 crc kubenswrapper[4746]: I0128 20:40:30.673306 4746 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 28 20:40:30 crc kubenswrapper[4746]: I0128 20:40:30.673329 4746 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 28 20:40:30 crc kubenswrapper[4746]: I0128 20:40:30.673348 4746 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-28T20:40:30Z","lastTransitionTime":"2026-01-28T20:40:30Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 28 20:40:30 crc kubenswrapper[4746]: I0128 20:40:30.775853 4746 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 28 20:40:30 crc kubenswrapper[4746]: I0128 20:40:30.775902 4746 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 28 20:40:30 crc kubenswrapper[4746]: I0128 20:40:30.775914 4746 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 28 20:40:30 crc kubenswrapper[4746]: I0128 20:40:30.775930 4746 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 28 20:40:30 crc kubenswrapper[4746]: I0128 20:40:30.775942 4746 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-28T20:40:30Z","lastTransitionTime":"2026-01-28T20:40:30Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 28 20:40:30 crc kubenswrapper[4746]: I0128 20:40:30.835451 4746 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Jan 28 20:40:30 crc kubenswrapper[4746]: I0128 20:40:30.835541 4746 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Jan 28 20:40:30 crc kubenswrapper[4746]: I0128 20:40:30.835455 4746 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-multus/network-metrics-daemon-2blg6" Jan 28 20:40:30 crc kubenswrapper[4746]: E0128 20:40:30.835615 4746 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Jan 28 20:40:30 crc kubenswrapper[4746]: E0128 20:40:30.835946 4746 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-2blg6" podUID="f60a5487-5012-4cc9-ad94-5dfb4957d74e" Jan 28 20:40:30 crc kubenswrapper[4746]: E0128 20:40:30.836113 4746 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Jan 28 20:40:30 crc kubenswrapper[4746]: I0128 20:40:30.836297 4746 scope.go:117] "RemoveContainer" containerID="53be073387414da7f1a5b73960206c0a04b4ef439d6c51bd22ceefbb5d723b76" Jan 28 20:40:30 crc kubenswrapper[4746]: E0128 20:40:30.836453 4746 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"ovnkube-controller\" with CrashLoopBackOff: \"back-off 20s restarting failed container=ovnkube-controller pod=ovnkube-node-8vmvh_openshift-ovn-kubernetes(c4d15639-62fb-41b7-a1d4-6f51f3af6d99)\"" pod="openshift-ovn-kubernetes/ovnkube-node-8vmvh" podUID="c4d15639-62fb-41b7-a1d4-6f51f3af6d99" Jan 28 20:40:30 crc kubenswrapper[4746]: I0128 20:40:30.837903 4746 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2026-01-01 13:45:34.492034889 +0000 UTC Jan 28 20:40:30 crc kubenswrapper[4746]: I0128 20:40:30.877723 4746 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 28 20:40:30 crc kubenswrapper[4746]: I0128 20:40:30.877760 4746 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 28 20:40:30 crc kubenswrapper[4746]: I0128 20:40:30.877768 4746 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 28 20:40:30 crc kubenswrapper[4746]: I0128 20:40:30.877781 4746 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 28 20:40:30 crc kubenswrapper[4746]: I0128 20:40:30.877791 4746 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-28T20:40:30Z","lastTransitionTime":"2026-01-28T20:40:30Z","reason":"KubeletNotReady","message":"container runtime network not ready: 
NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 28 20:40:30 crc kubenswrapper[4746]: I0128 20:40:30.979763 4746 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 28 20:40:30 crc kubenswrapper[4746]: I0128 20:40:30.979806 4746 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 28 20:40:30 crc kubenswrapper[4746]: I0128 20:40:30.979814 4746 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 28 20:40:30 crc kubenswrapper[4746]: I0128 20:40:30.979828 4746 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 28 20:40:30 crc kubenswrapper[4746]: I0128 20:40:30.979837 4746 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-28T20:40:30Z","lastTransitionTime":"2026-01-28T20:40:30Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 28 20:40:31 crc kubenswrapper[4746]: I0128 20:40:31.081847 4746 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 28 20:40:31 crc kubenswrapper[4746]: I0128 20:40:31.081878 4746 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 28 20:40:31 crc kubenswrapper[4746]: I0128 20:40:31.081887 4746 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 28 20:40:31 crc kubenswrapper[4746]: I0128 20:40:31.081901 4746 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 28 20:40:31 crc kubenswrapper[4746]: I0128 20:40:31.081910 4746 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-28T20:40:31Z","lastTransitionTime":"2026-01-28T20:40:31Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 28 20:40:31 crc kubenswrapper[4746]: I0128 20:40:31.184137 4746 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 28 20:40:31 crc kubenswrapper[4746]: I0128 20:40:31.184487 4746 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 28 20:40:31 crc kubenswrapper[4746]: I0128 20:40:31.184569 4746 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 28 20:40:31 crc kubenswrapper[4746]: I0128 20:40:31.184650 4746 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 28 20:40:31 crc kubenswrapper[4746]: I0128 20:40:31.184725 4746 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-28T20:40:31Z","lastTransitionTime":"2026-01-28T20:40:31Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 28 20:40:31 crc kubenswrapper[4746]: I0128 20:40:31.288754 4746 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 28 20:40:31 crc kubenswrapper[4746]: I0128 20:40:31.288987 4746 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 28 20:40:31 crc kubenswrapper[4746]: I0128 20:40:31.289123 4746 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 28 20:40:31 crc kubenswrapper[4746]: I0128 20:40:31.289235 4746 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 28 20:40:31 crc kubenswrapper[4746]: I0128 20:40:31.289323 4746 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-28T20:40:31Z","lastTransitionTime":"2026-01-28T20:40:31Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 28 20:40:31 crc kubenswrapper[4746]: I0128 20:40:31.392808 4746 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 28 20:40:31 crc kubenswrapper[4746]: I0128 20:40:31.393048 4746 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 28 20:40:31 crc kubenswrapper[4746]: I0128 20:40:31.393163 4746 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 28 20:40:31 crc kubenswrapper[4746]: I0128 20:40:31.393278 4746 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 28 20:40:31 crc kubenswrapper[4746]: I0128 20:40:31.393357 4746 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-28T20:40:31Z","lastTransitionTime":"2026-01-28T20:40:31Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 28 20:40:31 crc kubenswrapper[4746]: I0128 20:40:31.496468 4746 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 28 20:40:31 crc kubenswrapper[4746]: I0128 20:40:31.496546 4746 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 28 20:40:31 crc kubenswrapper[4746]: I0128 20:40:31.496569 4746 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 28 20:40:31 crc kubenswrapper[4746]: I0128 20:40:31.496617 4746 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 28 20:40:31 crc kubenswrapper[4746]: I0128 20:40:31.496638 4746 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-28T20:40:31Z","lastTransitionTime":"2026-01-28T20:40:31Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 28 20:40:31 crc kubenswrapper[4746]: I0128 20:40:31.599486 4746 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 28 20:40:31 crc kubenswrapper[4746]: I0128 20:40:31.599538 4746 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 28 20:40:31 crc kubenswrapper[4746]: I0128 20:40:31.599549 4746 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 28 20:40:31 crc kubenswrapper[4746]: I0128 20:40:31.599564 4746 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 28 20:40:31 crc kubenswrapper[4746]: I0128 20:40:31.599573 4746 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-28T20:40:31Z","lastTransitionTime":"2026-01-28T20:40:31Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 28 20:40:31 crc kubenswrapper[4746]: I0128 20:40:31.703452 4746 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 28 20:40:31 crc kubenswrapper[4746]: I0128 20:40:31.703758 4746 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 28 20:40:31 crc kubenswrapper[4746]: I0128 20:40:31.703859 4746 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 28 20:40:31 crc kubenswrapper[4746]: I0128 20:40:31.703969 4746 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 28 20:40:31 crc kubenswrapper[4746]: I0128 20:40:31.704109 4746 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-28T20:40:31Z","lastTransitionTime":"2026-01-28T20:40:31Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 28 20:40:31 crc kubenswrapper[4746]: I0128 20:40:31.806583 4746 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 28 20:40:31 crc kubenswrapper[4746]: I0128 20:40:31.806635 4746 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 28 20:40:31 crc kubenswrapper[4746]: I0128 20:40:31.806648 4746 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 28 20:40:31 crc kubenswrapper[4746]: I0128 20:40:31.806666 4746 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 28 20:40:31 crc kubenswrapper[4746]: I0128 20:40:31.806679 4746 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-28T20:40:31Z","lastTransitionTime":"2026-01-28T20:40:31Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 28 20:40:31 crc kubenswrapper[4746]: I0128 20:40:31.835194 4746 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Jan 28 20:40:31 crc kubenswrapper[4746]: E0128 20:40:31.835345 4746 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Jan 28 20:40:31 crc kubenswrapper[4746]: I0128 20:40:31.838427 4746 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2025-12-09 11:08:08.677491922 +0000 UTC Jan 28 20:40:31 crc kubenswrapper[4746]: I0128 20:40:31.908989 4746 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 28 20:40:31 crc kubenswrapper[4746]: I0128 20:40:31.909350 4746 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 28 20:40:31 crc kubenswrapper[4746]: I0128 20:40:31.909474 4746 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 28 20:40:31 crc kubenswrapper[4746]: I0128 20:40:31.909575 4746 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 28 20:40:31 crc kubenswrapper[4746]: I0128 20:40:31.909658 4746 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-28T20:40:31Z","lastTransitionTime":"2026-01-28T20:40:31Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 28 20:40:32 crc kubenswrapper[4746]: I0128 20:40:32.012581 4746 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 28 20:40:32 crc kubenswrapper[4746]: I0128 20:40:32.012852 4746 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 28 20:40:32 crc kubenswrapper[4746]: I0128 20:40:32.012943 4746 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 28 20:40:32 crc kubenswrapper[4746]: I0128 20:40:32.013068 4746 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 28 20:40:32 crc kubenswrapper[4746]: I0128 20:40:32.013190 4746 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-28T20:40:32Z","lastTransitionTime":"2026-01-28T20:40:32Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 28 20:40:32 crc kubenswrapper[4746]: I0128 20:40:32.116273 4746 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 28 20:40:32 crc kubenswrapper[4746]: I0128 20:40:32.116320 4746 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 28 20:40:32 crc kubenswrapper[4746]: I0128 20:40:32.116329 4746 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 28 20:40:32 crc kubenswrapper[4746]: I0128 20:40:32.116344 4746 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 28 20:40:32 crc kubenswrapper[4746]: I0128 20:40:32.116358 4746 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-28T20:40:32Z","lastTransitionTime":"2026-01-28T20:40:32Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 28 20:40:32 crc kubenswrapper[4746]: I0128 20:40:32.218503 4746 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 28 20:40:32 crc kubenswrapper[4746]: I0128 20:40:32.218797 4746 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 28 20:40:32 crc kubenswrapper[4746]: I0128 20:40:32.218859 4746 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 28 20:40:32 crc kubenswrapper[4746]: I0128 20:40:32.218921 4746 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 28 20:40:32 crc kubenswrapper[4746]: I0128 20:40:32.218982 4746 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-28T20:40:32Z","lastTransitionTime":"2026-01-28T20:40:32Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 28 20:40:32 crc kubenswrapper[4746]: I0128 20:40:32.321521 4746 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 28 20:40:32 crc kubenswrapper[4746]: I0128 20:40:32.321556 4746 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 28 20:40:32 crc kubenswrapper[4746]: I0128 20:40:32.321568 4746 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 28 20:40:32 crc kubenswrapper[4746]: I0128 20:40:32.321583 4746 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 28 20:40:32 crc kubenswrapper[4746]: I0128 20:40:32.321591 4746 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-28T20:40:32Z","lastTransitionTime":"2026-01-28T20:40:32Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 28 20:40:32 crc kubenswrapper[4746]: I0128 20:40:32.424053 4746 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 28 20:40:32 crc kubenswrapper[4746]: I0128 20:40:32.424117 4746 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 28 20:40:32 crc kubenswrapper[4746]: I0128 20:40:32.424126 4746 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 28 20:40:32 crc kubenswrapper[4746]: I0128 20:40:32.424140 4746 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 28 20:40:32 crc kubenswrapper[4746]: I0128 20:40:32.424151 4746 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-28T20:40:32Z","lastTransitionTime":"2026-01-28T20:40:32Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 28 20:40:32 crc kubenswrapper[4746]: I0128 20:40:32.526446 4746 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 28 20:40:32 crc kubenswrapper[4746]: I0128 20:40:32.526499 4746 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 28 20:40:32 crc kubenswrapper[4746]: I0128 20:40:32.526511 4746 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 28 20:40:32 crc kubenswrapper[4746]: I0128 20:40:32.526530 4746 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 28 20:40:32 crc kubenswrapper[4746]: I0128 20:40:32.526547 4746 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-28T20:40:32Z","lastTransitionTime":"2026-01-28T20:40:32Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 28 20:40:32 crc kubenswrapper[4746]: I0128 20:40:32.629205 4746 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 28 20:40:32 crc kubenswrapper[4746]: I0128 20:40:32.629522 4746 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 28 20:40:32 crc kubenswrapper[4746]: I0128 20:40:32.629606 4746 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 28 20:40:32 crc kubenswrapper[4746]: I0128 20:40:32.629676 4746 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 28 20:40:32 crc kubenswrapper[4746]: I0128 20:40:32.629740 4746 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-28T20:40:32Z","lastTransitionTime":"2026-01-28T20:40:32Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 28 20:40:32 crc kubenswrapper[4746]: I0128 20:40:32.732018 4746 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 28 20:40:32 crc kubenswrapper[4746]: I0128 20:40:32.732056 4746 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 28 20:40:32 crc kubenswrapper[4746]: I0128 20:40:32.732070 4746 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 28 20:40:32 crc kubenswrapper[4746]: I0128 20:40:32.732102 4746 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 28 20:40:32 crc kubenswrapper[4746]: I0128 20:40:32.732115 4746 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-28T20:40:32Z","lastTransitionTime":"2026-01-28T20:40:32Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 28 20:40:32 crc kubenswrapper[4746]: I0128 20:40:32.833849 4746 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 28 20:40:32 crc kubenswrapper[4746]: I0128 20:40:32.833896 4746 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 28 20:40:32 crc kubenswrapper[4746]: I0128 20:40:32.833909 4746 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 28 20:40:32 crc kubenswrapper[4746]: I0128 20:40:32.833927 4746 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 28 20:40:32 crc kubenswrapper[4746]: I0128 20:40:32.833940 4746 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-28T20:40:32Z","lastTransitionTime":"2026-01-28T20:40:32Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 28 20:40:32 crc kubenswrapper[4746]: I0128 20:40:32.834726 4746 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Jan 28 20:40:32 crc kubenswrapper[4746]: I0128 20:40:32.834796 4746 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-2blg6" Jan 28 20:40:32 crc kubenswrapper[4746]: I0128 20:40:32.834796 4746 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Jan 28 20:40:32 crc kubenswrapper[4746]: E0128 20:40:32.834954 4746 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Jan 28 20:40:32 crc kubenswrapper[4746]: E0128 20:40:32.835284 4746 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-2blg6" podUID="f60a5487-5012-4cc9-ad94-5dfb4957d74e" Jan 28 20:40:32 crc kubenswrapper[4746]: E0128 20:40:32.835384 4746 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Jan 28 20:40:32 crc kubenswrapper[4746]: I0128 20:40:32.842021 4746 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2026-01-07 16:18:41.33540739 +0000 UTC Jan 28 20:40:32 crc kubenswrapper[4746]: I0128 20:40:32.852855 4746 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-9zvm7" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"de4fb5f0-7c65-4fdb-8389-d2c8462e130b\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-28T20:40:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-28T20:40:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-28T20:40:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-28T20:40:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://972f4c0079d8e04365abd93dc8d63e88dea04f427dbbbcf5e331cdeeff8c855c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"
ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-28T20:40:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jvkqz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://d2340a59c423445dd2b32da57d43146dd1cc354b737ae1b4d1875a0d40f62188\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-28T20:40:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jvkqz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-28T20:40:05Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-9zvm7\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to 
verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-28T20:40:32Z is after 2025-08-24T17:21:41Z" Jan 28 20:40:32 crc kubenswrapper[4746]: I0128 20:40:32.872123 4746 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-28T20:39:52Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-28T20:39:52Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-28T20:40:32Z is after 2025-08-24T17:21:41Z" Jan 28 20:40:32 crc kubenswrapper[4746]: I0128 20:40:32.888551 4746 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-28T20:39:52Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-28T20:39:52Z\\\",\\\"message\\\":\\\"containers with unready status: 
[network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-28T20:40:32Z is after 2025-08-24T17:21:41Z" Jan 28 20:40:32 crc kubenswrapper[4746]: I0128 20:40:32.904890 4746 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-28T20:39:53Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-28T20:39:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-28T20:39:53Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5dff3e2075b8bbfb6af77c72362ccc37e1d7810a2d97c096cf501681c98699ab\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-28T20:39:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://772e165efa6d865a669c172958e0aea3112146637e354e4587b7e664454112ad\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af
0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-28T20:39:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-28T20:40:32Z is after 2025-08-24T17:21:41Z" Jan 28 20:40:32 crc kubenswrapper[4746]: I0128 20:40:32.917594 4746 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-gcrxx" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"1c3c93a1-7ddf-4339-9ca3-79f3753943b4\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-28T20:39:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-28T20:39:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-28T20:39:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-28T20:39:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5bd8d8214aa8e85e50747643b6354fb6e026866c926310fb2dd79760f916f468\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-28T20:39:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wn6hr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podI
Ps\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-28T20:39:52Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-gcrxx\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-28T20:40:32Z is after 2025-08-24T17:21:41Z" Jan 28 20:40:32 crc kubenswrapper[4746]: I0128 20:40:32.935962 4746 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 28 20:40:32 crc kubenswrapper[4746]: I0128 20:40:32.936031 4746 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 28 20:40:32 crc kubenswrapper[4746]: I0128 20:40:32.936049 4746 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 28 20:40:32 crc kubenswrapper[4746]: I0128 20:40:32.936110 4746 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 28 20:40:32 crc kubenswrapper[4746]: I0128 20:40:32.936131 4746 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-28T20:40:32Z","lastTransitionTime":"2026-01-28T20:40:32Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 28 20:40:32 crc kubenswrapper[4746]: I0128 20:40:32.943218 4746 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-8vmvh" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"c4d15639-62fb-41b7-a1d4-6f51f3af6d99\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-28T20:39:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-28T20:39:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-28T20:39:53Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-28T20:39:53Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://2637734601bf1e43535b70bd62cc6aea7fc9f62d0d2a33c83602e7065a793be1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-28T20:39:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-kgrqs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://73e82be84afa7be760acda86ebf6c956d558dcbd4f37f049fba84e08c75fcd9f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-28T20:39:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\
\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-kgrqs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://17d09f7f46014da3713f76f111bca0a7ce8a24a154aa6144291d760d85e7d3ec\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-28T20:39:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-kgrqs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://b7ad895d0ff1ad570136d96671e1bde5c62882af50a407945ef0e7bc52727b88\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-28T20:39:54Z\\\"}},\\\"volumeMounts\\\":[{
\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-kgrqs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://af842ba33fdf06048038dd3e12c59f7a9d0867aa28acee1f2cc4571e7db28b88\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-28T20:39:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-kgrqs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://57a776a1daf1b17eb44d9fa09fe2f7ec01f647aa0a40d0d571736d2f7c2f8e4d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\"
,\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-28T20:39:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-kgrqs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://53be073387414da7f1a5b73960206c0a04b4ef439d6c51bd22ceefbb5d723b76\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://53be073387414da7f1a5b73960206c0a04b4ef439d6c51bd22ceefbb5d723b76\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2026-01-28T20:40:17Z\\\",\\\"message\\\":\\\"Built service openshift-console/console LB template configs for network=default: []services.lbConfig(nil)\\\\nI0128 20:40:17.344732 6409 model_client.go:382] Update operations generated as: [{Op:update Table:Logical_Switch_Port Row:map[addresses:{GoSet:[0a:58:0a:d9:00:5c 10.217.0.92]} 
options:{GoMap:map[iface-id-ver:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 requested-chassis:crc]} port_security:{GoSet:[0a:58:0a:d9:00:5c 10.217.0.92]}] Rows:[] Columns:[] Mutations:[] Timeout:\\\\u003cnil\\\\u003e Where:[where column _uuid == {c94130be-172c-477c-88c4-40cc7eba30fe}] Until: Durable:\\\\u003cnil\\\\u003e Comment:\\\\u003cnil\\\\u003e Lock:\\\\u003cnil\\\\u003e UUID: UUIDName:}]\\\\nI0128 20:40:17.344766 6409 services_controller.go:451] Built service openshift-console/console cluster-wide LB for network=default: []services.LB{services.LB{Name:\\\\\\\"Service_openshift-console/console_TCP_cluster\\\\\\\", UUID:\\\\\\\"\\\\\\\", Protocol:\\\\\\\"TCP\\\\\\\", ExternalIDs:map[string]string{\\\\\\\"k8s.ovn.org/kind\\\\\\\":\\\\\\\"Service\\\\\\\", \\\\\\\"k8s.ovn.org/owner\\\\\\\":\\\\\\\"openshift-console/console\\\\\\\"}, Opts:services.LBOpts{Reject:true, EmptyLBEvents:false, AffinityTimeOut:0, SkipSNAT:false, Template:false, AddressFamily:\\\\\\\"\\\\\\\"}, Rules:[]services.LBRule{services.LBRule{Source:services.Addr{IP:\\\\\\\"10.217.5.194\\\\\\\", Port:443, Template:(*services.Template)(nil)}, Targets:[]services.Addr{}}}, Templates:services.TemplateMap(nil), Switches:[]string{}, Routers:[]string{}, Groups:[]string{\\\\\\\"clusterLBGroup\\\\\\\"}}}\\\\nI0128 20\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-01-28T20:40:16Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":2,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 20s restarting failed container=ovnkube-controller 
pod=ovnkube-node-8vmvh_openshift-ovn-kubernetes(c4d15639-62fb-41b7-a1d4-6f51f3af6d99)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kube
rnetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-kgrqs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://cdd60109dba5f45418dd5bc29c649a50aae8479fa77fc3dc50ca4f7c3042e3fc\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-28T20:39:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-kgrqs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://45e59dfe364d511e17b8264a31157501de3ed23e7125d79979df83d328242328\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://45e59dfe364d511e17
b8264a31157501de3ed23e7125d79979df83d328242328\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-28T20:39:53Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-28T20:39:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-kgrqs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-28T20:39:53Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-8vmvh\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-28T20:40:32Z is after 2025-08-24T17:21:41Z" Jan 28 20:40:32 crc kubenswrapper[4746]: I0128 20:40:32.963608 4746 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-28T20:39:52Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-28T20:39:52Z\\\",\\\"message\\\":\\\"containers with unready status: 
[networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-28T20:40:32Z is after 2025-08-24T17:21:41Z" Jan 28 20:40:32 crc kubenswrapper[4746]: I0128 20:40:32.979757 4746 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-28T20:39:57Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-28T20:39:57Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-28T20:39:57Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://83a275ec9868f2fd9e24135d5a05c27caf843340c144070952291bb6a3715035\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-28T20:39:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has 
expired or is not yet valid: current time 2026-01-28T20:40:32Z is after 2025-08-24T17:21:41Z" Jan 28 20:40:32 crc kubenswrapper[4746]: I0128 20:40:32.997118 4746 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-6wrnw" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"6dc8b546-9734-4082-b2b3-2bafe3f1564d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-28T20:39:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-28T20:39:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-28T20:39:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-28T20:39:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://50ec1ea913ec63a096032bd968bda5c5c8879e16a08e6a57727750517d9af038\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-28T20:39:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-r
bac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-z84f9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://66d250466155027405bf90c4e5ed09388238d4cee63604e28486a16778f9d188\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-28T20:39:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-z84f9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-28T20:39:53Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-6wrnw\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-28T20:40:32Z is after 2025-08-24T17:21:41Z" Jan 28 20:40:33 crc kubenswrapper[4746]: I0128 20:40:33.016371 4746 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-ht6hp" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"beb2b795-6bf4-4d38-89f7-bcb5512c3e61\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-28T20:39:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-28T20:40:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-28T20:40:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-28T20:40:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://564d42c4c25f218c1a1f52449d2f3d941c1dad920c8100d1d908cadf7cf46607\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-28T20:40:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rr62b\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b2d8bf207108ac2304cd04e34cd9843fcb755ac8610a3f88448538d1e99359f6\
\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b2d8bf207108ac2304cd04e34cd9843fcb755ac8610a3f88448538d1e99359f6\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-28T20:39:54Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-28T20:39:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rr62b\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://40c78d39cb70eba245771f4d85d2c8bc7b77427ea216ad3b89749d24fd550098\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://40c78d39cb70eba245771f4d85d2c8bc7b77427ea216ad3b89749d24fd550098\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-28T20:39:55Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-28T20:39:55Z\\\"}},\\\
"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rr62b\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://498028ca0bd260ca2c4d7b0231ea54a5a6ff5b302f720fbd209a4f640507258b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://498028ca0bd260ca2c4d7b0231ea54a5a6ff5b302f720fbd209a4f640507258b\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-28T20:39:56Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-28T20:39:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rr62b\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://661d4
a9031b6789f38ed1ac8e3c40d72182709130e1680ecdad86c07c1f34b7b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://661d4a9031b6789f38ed1ac8e3c40d72182709130e1680ecdad86c07c1f34b7b\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-28T20:39:57Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-28T20:39:57Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rr62b\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://a49ffe0bbe4151671dc8803d7ab61cb9b3a2eae064b81e2830bc074787f4754e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://a49ffe0bbe4151671dc8803d7ab61cb9b3a2eae064b81e2830bc074787f4754e\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-28T20:39:58Z\\\",\\\"reason\\\":\\\"Co
mpleted\\\",\\\"startedAt\\\":\\\"2026-01-28T20:39:58Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rr62b\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://cd521e9ecb53f581b5732a4d018f92007063ce0912e6d760fe3c920218c23b14\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://cd521e9ecb53f581b5732a4d018f92007063ce0912e6d760fe3c920218c23b14\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-28T20:39:59Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-28T20:39:59Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rr62b\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-28T20:39:53Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-ht6hp\": Internal error occurred: failed 
calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-28T20:40:33Z is after 2025-08-24T17:21:41Z" Jan 28 20:40:33 crc kubenswrapper[4746]: I0128 20:40:33.035305 4746 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-qhpvf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"cdf26de0-b602-4bdf-b492-65b3b6b31434\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-28T20:39:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-28T20:39:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-28T20:39:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-28T20:39:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f80345dfd4de46e278611370e6c337968cb1ba7ed8275457deea5871704f8367\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-
01-28T20:39:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jnpsl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-28T20:39:53Z\\\"}}\" for pod \"openshift-multus\"/\"multus-qhpvf\": Internal error occurred: failed 
calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-28T20:40:33Z is after 2025-08-24T17:21:41Z" Jan 28 20:40:33 crc kubenswrapper[4746]: I0128 20:40:33.039351 4746 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 28 20:40:33 crc kubenswrapper[4746]: I0128 20:40:33.039400 4746 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 28 20:40:33 crc kubenswrapper[4746]: I0128 20:40:33.039416 4746 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 28 20:40:33 crc kubenswrapper[4746]: I0128 20:40:33.039432 4746 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 28 20:40:33 crc kubenswrapper[4746]: I0128 20:40:33.039484 4746 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-28T20:40:33Z","lastTransitionTime":"2026-01-28T20:40:33Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 28 20:40:33 crc kubenswrapper[4746]: I0128 20:40:33.049389 4746 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-2blg6" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"f60a5487-5012-4cc9-ad94-5dfb4957d74e\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-28T20:40:07Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-28T20:40:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-28T20:40:07Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-28T20:40:07Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-l2www\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-l2www\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-28T20:40:07Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-2blg6\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-28T20:40:33Z is after 2025-08-24T17:21:41Z" Jan 28 20:40:33 crc 
kubenswrapper[4746]: I0128 20:40:33.068537 4746 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"64499f0d-5c10-43a9-b479-9b6c7ef2fb9c\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-28T20:39:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-28T20:39:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-28T20:40:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-28T20:40:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-28T20:39:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://9a304b12b4a1e30cbbbb16aa4f91514f1b1a64c716cf8ac023803a279b310f9c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-28T20:39:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://6af41f85c56be6
c24127bec0aee8e02562dbb04bd22e57c6887f6f5e914a7530\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-28T20:39:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://f31e9e80576ec4eb59ec92039601fd94bd323d29493c353f9ec6b9c61187b0d3\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-28T20:39:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://64f8ae1fcf8b14a65196ffa6d5a50db39aa846dc4e6cc0b92ba54d9b5f6913e3\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"te
rminated\\\":{\\\"containerID\\\":\\\"cri-o://56b250a192cfa698ccd7ac8233b75b9acbf2347549c712b47ccc1b89f144aa83\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-01-28T20:39:52Z\\\",\\\"message\\\":\\\"le observer\\\\nW0128 20:39:52.695680 1 builder.go:272] unable to get owner reference (falling back to namespace): pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI0128 20:39:52.695807 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0128 20:39:52.696767 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-2500116620/tls.crt::/tmp/serving-cert-2500116620/tls.key\\\\\\\"\\\\nI0128 20:39:52.900914 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI0128 20:39:52.915714 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI0128 20:39:52.915750 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI0128 20:39:52.916035 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI0128 20:39:52.916045 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI0128 20:39:52.925040 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW0128 20:39:52.925098 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0128 20:39:52.925104 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0128 20:39:52.925109 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW0128 20:39:52.925113 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW0128 20:39:52.925116 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW0128 
20:39:52.925119 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nI0128 20:39:52.925416 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nF0128 20:39:52.926926 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-01-28T20:39:47Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":2,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-28T20:40:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://03a94dee0b9663441e0ecec16b121c5be4e5c557f17c929b8edfb0d7ba00c3d5\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-28T20:39:35Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c5808a729190035a73529ac58efb0790f2d9c3b98c5f1a8017389e4ca1738c4c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\
\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://c5808a729190035a73529ac58efb0790f2d9c3b98c5f1a8017389e4ca1738c4c\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-28T20:39:33Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-28T20:39:33Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-28T20:39:32Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-28T20:40:33Z is after 2025-08-24T17:21:41Z" Jan 28 20:40:33 crc kubenswrapper[4746]: I0128 20:40:33.086284 4746 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-28T20:39:54Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-28T20:39:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-28T20:39:54Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://2f64131d62eca6c3a42e9177d14fa39ed6435e536f60ef5d25ff6bf061d2cc0a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-28T20:39:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: 
x509: certificate has expired or is not yet valid: current time 2026-01-28T20:40:33Z is after 2025-08-24T17:21:41Z" Jan 28 20:40:33 crc kubenswrapper[4746]: I0128 20:40:33.103823 4746 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-d8rwq" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"405572d8-50d2-4d7a-adf0-d8d6adea31ca\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-28T20:39:57Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-28T20:39:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-28T20:39:57Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-28T20:39:57Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://eba305a172c49420674e97d32590a7b2e4c7c2cd7406257508e0636cf42ea244\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-28T20:39:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\
",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mb4h4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-28T20:39:56Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-d8rwq\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-28T20:40:33Z is after 2025-08-24T17:21:41Z" Jan 28 20:40:33 crc kubenswrapper[4746]: I0128 20:40:33.122293 4746 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"950d71b3-9b51-43dc-814a-d6a838723f78\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-28T20:39:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-28T20:39:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-28T20:39:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-28T20:39:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-28T20:39:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0411ecf7c85f5397d751adb7d4905977dcbd3d2e169c56ebbe26017192765602\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-28T20:39:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://319cbcaa9981b3c79eb0c8f970f0ad776fb255db10c72c9b6a13469906e9e5ec\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-
art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-28T20:39:33Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://eefda7da35cd9ff176512b5d1224ccadcb5debd0035a49822e82a9757d4a3f91\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-28T20:39:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://145f4f882924f06172df81d94fc9bd683ea1fc04889de5f6cfb5caece8227fd0\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"started
At\\\":\\\"2026-01-28T20:39:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-28T20:39:32Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-28T20:40:33Z is after 2025-08-24T17:21:41Z" Jan 28 20:40:33 crc kubenswrapper[4746]: I0128 20:40:33.136821 4746 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"3553d828-b1a0-4f51-9e70-5f4d25f3ee42\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-28T20:39:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-28T20:39:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-28T20:40:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-28T20:40:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-28T20:39:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://2ba6f8586593282902571354d53a7bdc0945ea8d1970cac1bfc2f8cc4019a4ab\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-28T20:39:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://00d75277c1a1e8fdc34e37a3ae3d697e002007456fde3dae5d49c1c932a0a7f5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha
256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-28T20:39:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://cf6d80884e00b1a12051a7a97148a46fba0e8514a1233a180262392c302db77f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-28T20:39:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a4a7e5a6315ccaaa100eb33d1a23bf2481f4ee8aa1ea076927dec54027d4c4cd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"wait-for-host-port\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"conta
inerID\\\":\\\"cri-o://a4a7e5a6315ccaaa100eb33d1a23bf2481f4ee8aa1ea076927dec54027d4c4cd\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-28T20:39:33Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-28T20:39:33Z\\\"}}}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-28T20:39:32Z\\\"}}\" for pod \"openshift-kube-scheduler\"/\"openshift-kube-scheduler-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-28T20:40:33Z is after 2025-08-24T17:21:41Z" Jan 28 20:40:33 crc kubenswrapper[4746]: I0128 20:40:33.141805 4746 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 28 20:40:33 crc kubenswrapper[4746]: I0128 20:40:33.141837 4746 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 28 20:40:33 crc kubenswrapper[4746]: I0128 20:40:33.141845 4746 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 28 20:40:33 crc kubenswrapper[4746]: I0128 20:40:33.141859 4746 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 28 20:40:33 crc kubenswrapper[4746]: I0128 20:40:33.141869 4746 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-28T20:40:33Z","lastTransitionTime":"2026-01-28T20:40:33Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 28 20:40:33 crc kubenswrapper[4746]: I0128 20:40:33.244269 4746 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 28 20:40:33 crc kubenswrapper[4746]: I0128 20:40:33.244315 4746 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 28 20:40:33 crc kubenswrapper[4746]: I0128 20:40:33.244326 4746 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 28 20:40:33 crc kubenswrapper[4746]: I0128 20:40:33.244339 4746 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 28 20:40:33 crc kubenswrapper[4746]: I0128 20:40:33.244350 4746 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-28T20:40:33Z","lastTransitionTime":"2026-01-28T20:40:33Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 28 20:40:33 crc kubenswrapper[4746]: I0128 20:40:33.348052 4746 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 28 20:40:33 crc kubenswrapper[4746]: I0128 20:40:33.348117 4746 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 28 20:40:33 crc kubenswrapper[4746]: I0128 20:40:33.348131 4746 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 28 20:40:33 crc kubenswrapper[4746]: I0128 20:40:33.348150 4746 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 28 20:40:33 crc kubenswrapper[4746]: I0128 20:40:33.348164 4746 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-28T20:40:33Z","lastTransitionTime":"2026-01-28T20:40:33Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 28 20:40:33 crc kubenswrapper[4746]: I0128 20:40:33.454031 4746 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 28 20:40:33 crc kubenswrapper[4746]: I0128 20:40:33.454191 4746 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 28 20:40:33 crc kubenswrapper[4746]: I0128 20:40:33.454676 4746 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 28 20:40:33 crc kubenswrapper[4746]: I0128 20:40:33.454757 4746 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 28 20:40:33 crc kubenswrapper[4746]: I0128 20:40:33.455375 4746 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-28T20:40:33Z","lastTransitionTime":"2026-01-28T20:40:33Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 28 20:40:33 crc kubenswrapper[4746]: I0128 20:40:33.558186 4746 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 28 20:40:33 crc kubenswrapper[4746]: I0128 20:40:33.558227 4746 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 28 20:40:33 crc kubenswrapper[4746]: I0128 20:40:33.558238 4746 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 28 20:40:33 crc kubenswrapper[4746]: I0128 20:40:33.558255 4746 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 28 20:40:33 crc kubenswrapper[4746]: I0128 20:40:33.558266 4746 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-28T20:40:33Z","lastTransitionTime":"2026-01-28T20:40:33Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 28 20:40:33 crc kubenswrapper[4746]: I0128 20:40:33.662361 4746 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 28 20:40:33 crc kubenswrapper[4746]: I0128 20:40:33.662435 4746 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 28 20:40:33 crc kubenswrapper[4746]: I0128 20:40:33.662461 4746 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 28 20:40:33 crc kubenswrapper[4746]: I0128 20:40:33.662500 4746 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 28 20:40:33 crc kubenswrapper[4746]: I0128 20:40:33.662526 4746 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-28T20:40:33Z","lastTransitionTime":"2026-01-28T20:40:33Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 28 20:40:33 crc kubenswrapper[4746]: I0128 20:40:33.765784 4746 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 28 20:40:33 crc kubenswrapper[4746]: I0128 20:40:33.766295 4746 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 28 20:40:33 crc kubenswrapper[4746]: I0128 20:40:33.766313 4746 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 28 20:40:33 crc kubenswrapper[4746]: I0128 20:40:33.766340 4746 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 28 20:40:33 crc kubenswrapper[4746]: I0128 20:40:33.766358 4746 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-28T20:40:33Z","lastTransitionTime":"2026-01-28T20:40:33Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 28 20:40:33 crc kubenswrapper[4746]: I0128 20:40:33.835519 4746 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Jan 28 20:40:33 crc kubenswrapper[4746]: E0128 20:40:33.835719 4746 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Jan 28 20:40:33 crc kubenswrapper[4746]: I0128 20:40:33.843148 4746 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2025-12-01 21:48:37.819789714 +0000 UTC Jan 28 20:40:33 crc kubenswrapper[4746]: I0128 20:40:33.871393 4746 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 28 20:40:33 crc kubenswrapper[4746]: I0128 20:40:33.871434 4746 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 28 20:40:33 crc kubenswrapper[4746]: I0128 20:40:33.871443 4746 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 28 20:40:33 crc kubenswrapper[4746]: I0128 20:40:33.871461 4746 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 28 20:40:33 crc kubenswrapper[4746]: I0128 20:40:33.871473 4746 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-28T20:40:33Z","lastTransitionTime":"2026-01-28T20:40:33Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 28 20:40:33 crc kubenswrapper[4746]: I0128 20:40:33.974708 4746 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 28 20:40:33 crc kubenswrapper[4746]: I0128 20:40:33.975075 4746 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 28 20:40:33 crc kubenswrapper[4746]: I0128 20:40:33.975252 4746 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 28 20:40:33 crc kubenswrapper[4746]: I0128 20:40:33.975400 4746 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 28 20:40:33 crc kubenswrapper[4746]: I0128 20:40:33.975530 4746 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-28T20:40:33Z","lastTransitionTime":"2026-01-28T20:40:33Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 28 20:40:34 crc kubenswrapper[4746]: I0128 20:40:34.079416 4746 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 28 20:40:34 crc kubenswrapper[4746]: I0128 20:40:34.079459 4746 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 28 20:40:34 crc kubenswrapper[4746]: I0128 20:40:34.079472 4746 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 28 20:40:34 crc kubenswrapper[4746]: I0128 20:40:34.079490 4746 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 28 20:40:34 crc kubenswrapper[4746]: I0128 20:40:34.079502 4746 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-28T20:40:34Z","lastTransitionTime":"2026-01-28T20:40:34Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 28 20:40:34 crc kubenswrapper[4746]: I0128 20:40:34.182811 4746 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 28 20:40:34 crc kubenswrapper[4746]: I0128 20:40:34.182856 4746 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 28 20:40:34 crc kubenswrapper[4746]: I0128 20:40:34.182867 4746 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 28 20:40:34 crc kubenswrapper[4746]: I0128 20:40:34.182887 4746 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 28 20:40:34 crc kubenswrapper[4746]: I0128 20:40:34.182899 4746 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-28T20:40:34Z","lastTransitionTime":"2026-01-28T20:40:34Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 28 20:40:34 crc kubenswrapper[4746]: I0128 20:40:34.285302 4746 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 28 20:40:34 crc kubenswrapper[4746]: I0128 20:40:34.285332 4746 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 28 20:40:34 crc kubenswrapper[4746]: I0128 20:40:34.285340 4746 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 28 20:40:34 crc kubenswrapper[4746]: I0128 20:40:34.285352 4746 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 28 20:40:34 crc kubenswrapper[4746]: I0128 20:40:34.285363 4746 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-28T20:40:34Z","lastTransitionTime":"2026-01-28T20:40:34Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 28 20:40:34 crc kubenswrapper[4746]: I0128 20:40:34.387595 4746 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 28 20:40:34 crc kubenswrapper[4746]: I0128 20:40:34.387624 4746 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 28 20:40:34 crc kubenswrapper[4746]: I0128 20:40:34.387633 4746 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 28 20:40:34 crc kubenswrapper[4746]: I0128 20:40:34.387655 4746 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 28 20:40:34 crc kubenswrapper[4746]: I0128 20:40:34.387664 4746 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-28T20:40:34Z","lastTransitionTime":"2026-01-28T20:40:34Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 28 20:40:34 crc kubenswrapper[4746]: I0128 20:40:34.490913 4746 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 28 20:40:34 crc kubenswrapper[4746]: I0128 20:40:34.491194 4746 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 28 20:40:34 crc kubenswrapper[4746]: I0128 20:40:34.491273 4746 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 28 20:40:34 crc kubenswrapper[4746]: I0128 20:40:34.491356 4746 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 28 20:40:34 crc kubenswrapper[4746]: I0128 20:40:34.491415 4746 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-28T20:40:34Z","lastTransitionTime":"2026-01-28T20:40:34Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 28 20:40:34 crc kubenswrapper[4746]: I0128 20:40:34.594908 4746 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 28 20:40:34 crc kubenswrapper[4746]: I0128 20:40:34.594975 4746 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 28 20:40:34 crc kubenswrapper[4746]: I0128 20:40:34.594993 4746 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 28 20:40:34 crc kubenswrapper[4746]: I0128 20:40:34.595022 4746 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 28 20:40:34 crc kubenswrapper[4746]: I0128 20:40:34.595039 4746 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-28T20:40:34Z","lastTransitionTime":"2026-01-28T20:40:34Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 28 20:40:34 crc kubenswrapper[4746]: I0128 20:40:34.698165 4746 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 28 20:40:34 crc kubenswrapper[4746]: I0128 20:40:34.698223 4746 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 28 20:40:34 crc kubenswrapper[4746]: I0128 20:40:34.698235 4746 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 28 20:40:34 crc kubenswrapper[4746]: I0128 20:40:34.698258 4746 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 28 20:40:34 crc kubenswrapper[4746]: I0128 20:40:34.698273 4746 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-28T20:40:34Z","lastTransitionTime":"2026-01-28T20:40:34Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 28 20:40:34 crc kubenswrapper[4746]: I0128 20:40:34.799848 4746 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 28 20:40:34 crc kubenswrapper[4746]: I0128 20:40:34.800244 4746 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 28 20:40:34 crc kubenswrapper[4746]: I0128 20:40:34.800460 4746 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 28 20:40:34 crc kubenswrapper[4746]: I0128 20:40:34.800625 4746 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 28 20:40:34 crc kubenswrapper[4746]: I0128 20:40:34.800779 4746 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-28T20:40:34Z","lastTransitionTime":"2026-01-28T20:40:34Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 28 20:40:34 crc kubenswrapper[4746]: I0128 20:40:34.835354 4746 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Jan 28 20:40:34 crc kubenswrapper[4746]: I0128 20:40:34.835405 4746 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Jan 28 20:40:34 crc kubenswrapper[4746]: I0128 20:40:34.835424 4746 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-multus/network-metrics-daemon-2blg6" Jan 28 20:40:34 crc kubenswrapper[4746]: E0128 20:40:34.836026 4746 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Jan 28 20:40:34 crc kubenswrapper[4746]: E0128 20:40:34.836213 4746 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Jan 28 20:40:34 crc kubenswrapper[4746]: E0128 20:40:34.836307 4746 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-2blg6" podUID="f60a5487-5012-4cc9-ad94-5dfb4957d74e" Jan 28 20:40:34 crc kubenswrapper[4746]: I0128 20:40:34.843646 4746 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2025-11-20 00:36:15.195010925 +0000 UTC Jan 28 20:40:34 crc kubenswrapper[4746]: I0128 20:40:34.904658 4746 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 28 20:40:34 crc kubenswrapper[4746]: I0128 20:40:34.905012 4746 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 28 20:40:34 crc kubenswrapper[4746]: I0128 20:40:34.905265 4746 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 28 20:40:34 crc kubenswrapper[4746]: I0128 20:40:34.905470 4746 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 28 20:40:34 crc kubenswrapper[4746]: I0128 20:40:34.905659 4746 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-28T20:40:34Z","lastTransitionTime":"2026-01-28T20:40:34Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 28 20:40:35 crc kubenswrapper[4746]: I0128 20:40:35.008352 4746 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 28 20:40:35 crc kubenswrapper[4746]: I0128 20:40:35.008678 4746 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 28 20:40:35 crc kubenswrapper[4746]: I0128 20:40:35.008748 4746 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 28 20:40:35 crc kubenswrapper[4746]: I0128 20:40:35.008819 4746 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 28 20:40:35 crc kubenswrapper[4746]: I0128 20:40:35.008879 4746 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-28T20:40:35Z","lastTransitionTime":"2026-01-28T20:40:35Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 28 20:40:35 crc kubenswrapper[4746]: I0128 20:40:35.111360 4746 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 28 20:40:35 crc kubenswrapper[4746]: I0128 20:40:35.111624 4746 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 28 20:40:35 crc kubenswrapper[4746]: I0128 20:40:35.111748 4746 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 28 20:40:35 crc kubenswrapper[4746]: I0128 20:40:35.111846 4746 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 28 20:40:35 crc kubenswrapper[4746]: I0128 20:40:35.111915 4746 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-28T20:40:35Z","lastTransitionTime":"2026-01-28T20:40:35Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 28 20:40:35 crc kubenswrapper[4746]: I0128 20:40:35.215046 4746 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 28 20:40:35 crc kubenswrapper[4746]: I0128 20:40:35.215631 4746 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 28 20:40:35 crc kubenswrapper[4746]: I0128 20:40:35.215784 4746 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 28 20:40:35 crc kubenswrapper[4746]: I0128 20:40:35.215951 4746 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 28 20:40:35 crc kubenswrapper[4746]: I0128 20:40:35.216071 4746 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-28T20:40:35Z","lastTransitionTime":"2026-01-28T20:40:35Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 28 20:40:35 crc kubenswrapper[4746]: I0128 20:40:35.319231 4746 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 28 20:40:35 crc kubenswrapper[4746]: I0128 20:40:35.319551 4746 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 28 20:40:35 crc kubenswrapper[4746]: I0128 20:40:35.319609 4746 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 28 20:40:35 crc kubenswrapper[4746]: I0128 20:40:35.319681 4746 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 28 20:40:35 crc kubenswrapper[4746]: I0128 20:40:35.319744 4746 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-28T20:40:35Z","lastTransitionTime":"2026-01-28T20:40:35Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 28 20:40:35 crc kubenswrapper[4746]: I0128 20:40:35.421806 4746 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 28 20:40:35 crc kubenswrapper[4746]: I0128 20:40:35.421869 4746 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 28 20:40:35 crc kubenswrapper[4746]: I0128 20:40:35.421884 4746 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 28 20:40:35 crc kubenswrapper[4746]: I0128 20:40:35.421904 4746 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 28 20:40:35 crc kubenswrapper[4746]: I0128 20:40:35.421917 4746 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-28T20:40:35Z","lastTransitionTime":"2026-01-28T20:40:35Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 28 20:40:35 crc kubenswrapper[4746]: I0128 20:40:35.525573 4746 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 28 20:40:35 crc kubenswrapper[4746]: I0128 20:40:35.525633 4746 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 28 20:40:35 crc kubenswrapper[4746]: I0128 20:40:35.525647 4746 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 28 20:40:35 crc kubenswrapper[4746]: I0128 20:40:35.525668 4746 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 28 20:40:35 crc kubenswrapper[4746]: I0128 20:40:35.525682 4746 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-28T20:40:35Z","lastTransitionTime":"2026-01-28T20:40:35Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 28 20:40:35 crc kubenswrapper[4746]: I0128 20:40:35.629029 4746 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 28 20:40:35 crc kubenswrapper[4746]: I0128 20:40:35.629198 4746 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 28 20:40:35 crc kubenswrapper[4746]: I0128 20:40:35.629217 4746 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 28 20:40:35 crc kubenswrapper[4746]: I0128 20:40:35.629235 4746 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 28 20:40:35 crc kubenswrapper[4746]: I0128 20:40:35.629248 4746 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-28T20:40:35Z","lastTransitionTime":"2026-01-28T20:40:35Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 28 20:40:35 crc kubenswrapper[4746]: I0128 20:40:35.732154 4746 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 28 20:40:35 crc kubenswrapper[4746]: I0128 20:40:35.732191 4746 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 28 20:40:35 crc kubenswrapper[4746]: I0128 20:40:35.732201 4746 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 28 20:40:35 crc kubenswrapper[4746]: I0128 20:40:35.732215 4746 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 28 20:40:35 crc kubenswrapper[4746]: I0128 20:40:35.732225 4746 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-28T20:40:35Z","lastTransitionTime":"2026-01-28T20:40:35Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 28 20:40:35 crc kubenswrapper[4746]: I0128 20:40:35.834392 4746 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 28 20:40:35 crc kubenswrapper[4746]: I0128 20:40:35.834446 4746 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 28 20:40:35 crc kubenswrapper[4746]: I0128 20:40:35.834456 4746 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 28 20:40:35 crc kubenswrapper[4746]: I0128 20:40:35.834470 4746 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 28 20:40:35 crc kubenswrapper[4746]: I0128 20:40:35.834482 4746 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-28T20:40:35Z","lastTransitionTime":"2026-01-28T20:40:35Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 28 20:40:35 crc kubenswrapper[4746]: I0128 20:40:35.834696 4746 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Jan 28 20:40:35 crc kubenswrapper[4746]: E0128 20:40:35.834786 4746 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Jan 28 20:40:35 crc kubenswrapper[4746]: I0128 20:40:35.844263 4746 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2025-11-06 19:12:45.972569531 +0000 UTC Jan 28 20:40:35 crc kubenswrapper[4746]: I0128 20:40:35.938215 4746 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 28 20:40:35 crc kubenswrapper[4746]: I0128 20:40:35.938289 4746 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 28 20:40:35 crc kubenswrapper[4746]: I0128 20:40:35.938312 4746 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 28 20:40:35 crc kubenswrapper[4746]: I0128 20:40:35.938345 4746 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 28 20:40:35 crc kubenswrapper[4746]: I0128 20:40:35.938373 4746 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-28T20:40:35Z","lastTransitionTime":"2026-01-28T20:40:35Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 28 20:40:36 crc kubenswrapper[4746]: I0128 20:40:36.041980 4746 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 28 20:40:36 crc kubenswrapper[4746]: I0128 20:40:36.042054 4746 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 28 20:40:36 crc kubenswrapper[4746]: I0128 20:40:36.042109 4746 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 28 20:40:36 crc kubenswrapper[4746]: I0128 20:40:36.042148 4746 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 28 20:40:36 crc kubenswrapper[4746]: I0128 20:40:36.042174 4746 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-28T20:40:36Z","lastTransitionTime":"2026-01-28T20:40:36Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 28 20:40:36 crc kubenswrapper[4746]: I0128 20:40:36.145515 4746 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 28 20:40:36 crc kubenswrapper[4746]: I0128 20:40:36.145577 4746 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 28 20:40:36 crc kubenswrapper[4746]: I0128 20:40:36.145594 4746 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 28 20:40:36 crc kubenswrapper[4746]: I0128 20:40:36.145623 4746 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 28 20:40:36 crc kubenswrapper[4746]: I0128 20:40:36.145642 4746 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-28T20:40:36Z","lastTransitionTime":"2026-01-28T20:40:36Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 28 20:40:36 crc kubenswrapper[4746]: I0128 20:40:36.248820 4746 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 28 20:40:36 crc kubenswrapper[4746]: I0128 20:40:36.248903 4746 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 28 20:40:36 crc kubenswrapper[4746]: I0128 20:40:36.248923 4746 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 28 20:40:36 crc kubenswrapper[4746]: I0128 20:40:36.248958 4746 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 28 20:40:36 crc kubenswrapper[4746]: I0128 20:40:36.248979 4746 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-28T20:40:36Z","lastTransitionTime":"2026-01-28T20:40:36Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 28 20:40:36 crc kubenswrapper[4746]: I0128 20:40:36.352744 4746 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 28 20:40:36 crc kubenswrapper[4746]: I0128 20:40:36.352795 4746 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 28 20:40:36 crc kubenswrapper[4746]: I0128 20:40:36.352805 4746 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 28 20:40:36 crc kubenswrapper[4746]: I0128 20:40:36.352823 4746 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 28 20:40:36 crc kubenswrapper[4746]: I0128 20:40:36.352832 4746 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-28T20:40:36Z","lastTransitionTime":"2026-01-28T20:40:36Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 28 20:40:36 crc kubenswrapper[4746]: I0128 20:40:36.456120 4746 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 28 20:40:36 crc kubenswrapper[4746]: I0128 20:40:36.456160 4746 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 28 20:40:36 crc kubenswrapper[4746]: I0128 20:40:36.456172 4746 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 28 20:40:36 crc kubenswrapper[4746]: I0128 20:40:36.456187 4746 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 28 20:40:36 crc kubenswrapper[4746]: I0128 20:40:36.456197 4746 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-28T20:40:36Z","lastTransitionTime":"2026-01-28T20:40:36Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 28 20:40:36 crc kubenswrapper[4746]: I0128 20:40:36.558347 4746 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 28 20:40:36 crc kubenswrapper[4746]: I0128 20:40:36.558386 4746 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 28 20:40:36 crc kubenswrapper[4746]: I0128 20:40:36.558397 4746 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 28 20:40:36 crc kubenswrapper[4746]: I0128 20:40:36.558409 4746 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 28 20:40:36 crc kubenswrapper[4746]: I0128 20:40:36.558418 4746 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-28T20:40:36Z","lastTransitionTime":"2026-01-28T20:40:36Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 28 20:40:36 crc kubenswrapper[4746]: I0128 20:40:36.661824 4746 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 28 20:40:36 crc kubenswrapper[4746]: I0128 20:40:36.662557 4746 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 28 20:40:36 crc kubenswrapper[4746]: I0128 20:40:36.662774 4746 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 28 20:40:36 crc kubenswrapper[4746]: I0128 20:40:36.663004 4746 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 28 20:40:36 crc kubenswrapper[4746]: I0128 20:40:36.663239 4746 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-28T20:40:36Z","lastTransitionTime":"2026-01-28T20:40:36Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 28 20:40:36 crc kubenswrapper[4746]: I0128 20:40:36.767267 4746 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 28 20:40:36 crc kubenswrapper[4746]: I0128 20:40:36.767332 4746 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 28 20:40:36 crc kubenswrapper[4746]: I0128 20:40:36.767347 4746 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 28 20:40:36 crc kubenswrapper[4746]: I0128 20:40:36.767370 4746 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 28 20:40:36 crc kubenswrapper[4746]: I0128 20:40:36.767384 4746 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-28T20:40:36Z","lastTransitionTime":"2026-01-28T20:40:36Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 28 20:40:36 crc kubenswrapper[4746]: I0128 20:40:36.835316 4746 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Jan 28 20:40:36 crc kubenswrapper[4746]: I0128 20:40:36.835429 4746 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-2blg6" Jan 28 20:40:36 crc kubenswrapper[4746]: I0128 20:40:36.836196 4746 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Jan 28 20:40:36 crc kubenswrapper[4746]: E0128 20:40:36.836502 4746 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-2blg6" podUID="f60a5487-5012-4cc9-ad94-5dfb4957d74e" Jan 28 20:40:36 crc kubenswrapper[4746]: E0128 20:40:36.836300 4746 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Jan 28 20:40:36 crc kubenswrapper[4746]: E0128 20:40:36.836937 4746 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Jan 28 20:40:36 crc kubenswrapper[4746]: I0128 20:40:36.844752 4746 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2026-01-18 04:17:21.987975516 +0000 UTC Jan 28 20:40:36 crc kubenswrapper[4746]: I0128 20:40:36.870274 4746 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 28 20:40:36 crc kubenswrapper[4746]: I0128 20:40:36.870343 4746 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 28 20:40:36 crc kubenswrapper[4746]: I0128 20:40:36.870360 4746 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 28 20:40:36 crc kubenswrapper[4746]: I0128 20:40:36.870385 4746 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 28 20:40:36 crc kubenswrapper[4746]: I0128 20:40:36.870399 4746 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-28T20:40:36Z","lastTransitionTime":"2026-01-28T20:40:36Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 28 20:40:36 crc kubenswrapper[4746]: I0128 20:40:36.973928 4746 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 28 20:40:36 crc kubenswrapper[4746]: I0128 20:40:36.974341 4746 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 28 20:40:36 crc kubenswrapper[4746]: I0128 20:40:36.974611 4746 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 28 20:40:36 crc kubenswrapper[4746]: I0128 20:40:36.974765 4746 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 28 20:40:36 crc kubenswrapper[4746]: I0128 20:40:36.975293 4746 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-28T20:40:36Z","lastTransitionTime":"2026-01-28T20:40:36Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 28 20:40:37 crc kubenswrapper[4746]: I0128 20:40:37.079013 4746 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 28 20:40:37 crc kubenswrapper[4746]: I0128 20:40:37.079065 4746 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 28 20:40:37 crc kubenswrapper[4746]: I0128 20:40:37.079089 4746 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 28 20:40:37 crc kubenswrapper[4746]: I0128 20:40:37.079109 4746 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 28 20:40:37 crc kubenswrapper[4746]: I0128 20:40:37.079122 4746 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-28T20:40:37Z","lastTransitionTime":"2026-01-28T20:40:37Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 28 20:40:37 crc kubenswrapper[4746]: I0128 20:40:37.182246 4746 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 28 20:40:37 crc kubenswrapper[4746]: I0128 20:40:37.182283 4746 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 28 20:40:37 crc kubenswrapper[4746]: I0128 20:40:37.182291 4746 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 28 20:40:37 crc kubenswrapper[4746]: I0128 20:40:37.182305 4746 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 28 20:40:37 crc kubenswrapper[4746]: I0128 20:40:37.182313 4746 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-28T20:40:37Z","lastTransitionTime":"2026-01-28T20:40:37Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 28 20:40:37 crc kubenswrapper[4746]: I0128 20:40:37.286987 4746 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 28 20:40:37 crc kubenswrapper[4746]: I0128 20:40:37.287292 4746 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 28 20:40:37 crc kubenswrapper[4746]: I0128 20:40:37.287323 4746 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 28 20:40:37 crc kubenswrapper[4746]: I0128 20:40:37.287358 4746 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 28 20:40:37 crc kubenswrapper[4746]: I0128 20:40:37.287398 4746 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-28T20:40:37Z","lastTransitionTime":"2026-01-28T20:40:37Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 28 20:40:37 crc kubenswrapper[4746]: I0128 20:40:37.391136 4746 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 28 20:40:37 crc kubenswrapper[4746]: I0128 20:40:37.391207 4746 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 28 20:40:37 crc kubenswrapper[4746]: I0128 20:40:37.391220 4746 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 28 20:40:37 crc kubenswrapper[4746]: I0128 20:40:37.391241 4746 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 28 20:40:37 crc kubenswrapper[4746]: I0128 20:40:37.391256 4746 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-28T20:40:37Z","lastTransitionTime":"2026-01-28T20:40:37Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 28 20:40:37 crc kubenswrapper[4746]: I0128 20:40:37.494689 4746 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 28 20:40:37 crc kubenswrapper[4746]: I0128 20:40:37.494763 4746 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 28 20:40:37 crc kubenswrapper[4746]: I0128 20:40:37.494780 4746 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 28 20:40:37 crc kubenswrapper[4746]: I0128 20:40:37.494808 4746 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 28 20:40:37 crc kubenswrapper[4746]: I0128 20:40:37.494828 4746 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-28T20:40:37Z","lastTransitionTime":"2026-01-28T20:40:37Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 28 20:40:37 crc kubenswrapper[4746]: I0128 20:40:37.597686 4746 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 28 20:40:37 crc kubenswrapper[4746]: I0128 20:40:37.597750 4746 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 28 20:40:37 crc kubenswrapper[4746]: I0128 20:40:37.597762 4746 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 28 20:40:37 crc kubenswrapper[4746]: I0128 20:40:37.597788 4746 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 28 20:40:37 crc kubenswrapper[4746]: I0128 20:40:37.597803 4746 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-28T20:40:37Z","lastTransitionTime":"2026-01-28T20:40:37Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 28 20:40:37 crc kubenswrapper[4746]: I0128 20:40:37.700714 4746 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 28 20:40:37 crc kubenswrapper[4746]: I0128 20:40:37.700792 4746 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 28 20:40:37 crc kubenswrapper[4746]: I0128 20:40:37.700806 4746 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 28 20:40:37 crc kubenswrapper[4746]: I0128 20:40:37.700830 4746 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 28 20:40:37 crc kubenswrapper[4746]: I0128 20:40:37.700849 4746 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-28T20:40:37Z","lastTransitionTime":"2026-01-28T20:40:37Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 28 20:40:37 crc kubenswrapper[4746]: I0128 20:40:37.804697 4746 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 28 20:40:37 crc kubenswrapper[4746]: I0128 20:40:37.804756 4746 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 28 20:40:37 crc kubenswrapper[4746]: I0128 20:40:37.804768 4746 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 28 20:40:37 crc kubenswrapper[4746]: I0128 20:40:37.804784 4746 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 28 20:40:37 crc kubenswrapper[4746]: I0128 20:40:37.804796 4746 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-28T20:40:37Z","lastTransitionTime":"2026-01-28T20:40:37Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 28 20:40:37 crc kubenswrapper[4746]: I0128 20:40:37.835871 4746 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Jan 28 20:40:37 crc kubenswrapper[4746]: E0128 20:40:37.836197 4746 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Jan 28 20:40:37 crc kubenswrapper[4746]: I0128 20:40:37.846399 4746 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2025-12-06 04:07:14.032061533 +0000 UTC Jan 28 20:40:37 crc kubenswrapper[4746]: I0128 20:40:37.907936 4746 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 28 20:40:37 crc kubenswrapper[4746]: I0128 20:40:37.907989 4746 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 28 20:40:37 crc kubenswrapper[4746]: I0128 20:40:37.908004 4746 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 28 20:40:37 crc kubenswrapper[4746]: I0128 20:40:37.908027 4746 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 28 20:40:37 crc kubenswrapper[4746]: I0128 20:40:37.908048 4746 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-28T20:40:37Z","lastTransitionTime":"2026-01-28T20:40:37Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 28 20:40:38 crc kubenswrapper[4746]: I0128 20:40:38.010743 4746 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 28 20:40:38 crc kubenswrapper[4746]: I0128 20:40:38.010825 4746 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 28 20:40:38 crc kubenswrapper[4746]: I0128 20:40:38.010851 4746 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 28 20:40:38 crc kubenswrapper[4746]: I0128 20:40:38.010892 4746 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 28 20:40:38 crc kubenswrapper[4746]: I0128 20:40:38.010917 4746 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-28T20:40:38Z","lastTransitionTime":"2026-01-28T20:40:38Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 28 20:40:38 crc kubenswrapper[4746]: I0128 20:40:38.113435 4746 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 28 20:40:38 crc kubenswrapper[4746]: I0128 20:40:38.113479 4746 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 28 20:40:38 crc kubenswrapper[4746]: I0128 20:40:38.113489 4746 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 28 20:40:38 crc kubenswrapper[4746]: I0128 20:40:38.113549 4746 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 28 20:40:38 crc kubenswrapper[4746]: I0128 20:40:38.113581 4746 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-28T20:40:38Z","lastTransitionTime":"2026-01-28T20:40:38Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 28 20:40:38 crc kubenswrapper[4746]: I0128 20:40:38.216492 4746 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 28 20:40:38 crc kubenswrapper[4746]: I0128 20:40:38.216539 4746 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 28 20:40:38 crc kubenswrapper[4746]: I0128 20:40:38.216548 4746 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 28 20:40:38 crc kubenswrapper[4746]: I0128 20:40:38.216562 4746 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 28 20:40:38 crc kubenswrapper[4746]: I0128 20:40:38.216572 4746 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-28T20:40:38Z","lastTransitionTime":"2026-01-28T20:40:38Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 28 20:40:38 crc kubenswrapper[4746]: I0128 20:40:38.318339 4746 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 28 20:40:38 crc kubenswrapper[4746]: I0128 20:40:38.318389 4746 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 28 20:40:38 crc kubenswrapper[4746]: I0128 20:40:38.318406 4746 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 28 20:40:38 crc kubenswrapper[4746]: I0128 20:40:38.318431 4746 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 28 20:40:38 crc kubenswrapper[4746]: I0128 20:40:38.318447 4746 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-28T20:40:38Z","lastTransitionTime":"2026-01-28T20:40:38Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 28 20:40:38 crc kubenswrapper[4746]: I0128 20:40:38.420444 4746 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 28 20:40:38 crc kubenswrapper[4746]: I0128 20:40:38.420506 4746 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 28 20:40:38 crc kubenswrapper[4746]: I0128 20:40:38.420522 4746 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 28 20:40:38 crc kubenswrapper[4746]: I0128 20:40:38.420537 4746 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 28 20:40:38 crc kubenswrapper[4746]: I0128 20:40:38.420547 4746 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-28T20:40:38Z","lastTransitionTime":"2026-01-28T20:40:38Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 28 20:40:38 crc kubenswrapper[4746]: I0128 20:40:38.522664 4746 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 28 20:40:38 crc kubenswrapper[4746]: I0128 20:40:38.522702 4746 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 28 20:40:38 crc kubenswrapper[4746]: I0128 20:40:38.522713 4746 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 28 20:40:38 crc kubenswrapper[4746]: I0128 20:40:38.522726 4746 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 28 20:40:38 crc kubenswrapper[4746]: I0128 20:40:38.522736 4746 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-28T20:40:38Z","lastTransitionTime":"2026-01-28T20:40:38Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 28 20:40:38 crc kubenswrapper[4746]: I0128 20:40:38.625830 4746 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 28 20:40:38 crc kubenswrapper[4746]: I0128 20:40:38.625910 4746 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 28 20:40:38 crc kubenswrapper[4746]: I0128 20:40:38.625933 4746 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 28 20:40:38 crc kubenswrapper[4746]: I0128 20:40:38.625970 4746 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 28 20:40:38 crc kubenswrapper[4746]: I0128 20:40:38.625993 4746 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-28T20:40:38Z","lastTransitionTime":"2026-01-28T20:40:38Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 28 20:40:38 crc kubenswrapper[4746]: I0128 20:40:38.728358 4746 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 28 20:40:38 crc kubenswrapper[4746]: I0128 20:40:38.728577 4746 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 28 20:40:38 crc kubenswrapper[4746]: I0128 20:40:38.728587 4746 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 28 20:40:38 crc kubenswrapper[4746]: I0128 20:40:38.728601 4746 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 28 20:40:38 crc kubenswrapper[4746]: I0128 20:40:38.728612 4746 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-28T20:40:38Z","lastTransitionTime":"2026-01-28T20:40:38Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 28 20:40:38 crc kubenswrapper[4746]: I0128 20:40:38.831168 4746 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 28 20:40:38 crc kubenswrapper[4746]: I0128 20:40:38.831200 4746 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 28 20:40:38 crc kubenswrapper[4746]: I0128 20:40:38.831211 4746 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 28 20:40:38 crc kubenswrapper[4746]: I0128 20:40:38.831225 4746 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 28 20:40:38 crc kubenswrapper[4746]: I0128 20:40:38.831235 4746 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-28T20:40:38Z","lastTransitionTime":"2026-01-28T20:40:38Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 28 20:40:38 crc kubenswrapper[4746]: I0128 20:40:38.836018 4746 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Jan 28 20:40:38 crc kubenswrapper[4746]: I0128 20:40:38.836066 4746 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-2blg6" Jan 28 20:40:38 crc kubenswrapper[4746]: E0128 20:40:38.836183 4746 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Jan 28 20:40:38 crc kubenswrapper[4746]: I0128 20:40:38.836028 4746 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Jan 28 20:40:38 crc kubenswrapper[4746]: E0128 20:40:38.836441 4746 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-2blg6" podUID="f60a5487-5012-4cc9-ad94-5dfb4957d74e" Jan 28 20:40:38 crc kubenswrapper[4746]: E0128 20:40:38.836506 4746 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Jan 28 20:40:38 crc kubenswrapper[4746]: I0128 20:40:38.846987 4746 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2025-12-10 05:35:09.747723257 +0000 UTC Jan 28 20:40:38 crc kubenswrapper[4746]: I0128 20:40:38.928223 4746 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/f60a5487-5012-4cc9-ad94-5dfb4957d74e-metrics-certs\") pod \"network-metrics-daemon-2blg6\" (UID: \"f60a5487-5012-4cc9-ad94-5dfb4957d74e\") " pod="openshift-multus/network-metrics-daemon-2blg6" Jan 28 20:40:38 crc kubenswrapper[4746]: E0128 20:40:38.928536 4746 secret.go:188] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered Jan 28 20:40:38 crc kubenswrapper[4746]: E0128 20:40:38.928716 4746 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/f60a5487-5012-4cc9-ad94-5dfb4957d74e-metrics-certs podName:f60a5487-5012-4cc9-ad94-5dfb4957d74e nodeName:}" failed. No retries permitted until 2026-01-28 20:41:10.928666331 +0000 UTC m=+98.884852725 (durationBeforeRetry 32s). 
Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/f60a5487-5012-4cc9-ad94-5dfb4957d74e-metrics-certs") pod "network-metrics-daemon-2blg6" (UID: "f60a5487-5012-4cc9-ad94-5dfb4957d74e") : object "openshift-multus"/"metrics-daemon-secret" not registered Jan 28 20:40:38 crc kubenswrapper[4746]: I0128 20:40:38.934641 4746 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 28 20:40:38 crc kubenswrapper[4746]: I0128 20:40:38.934709 4746 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 28 20:40:38 crc kubenswrapper[4746]: I0128 20:40:38.934730 4746 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 28 20:40:38 crc kubenswrapper[4746]: I0128 20:40:38.934759 4746 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 28 20:40:38 crc kubenswrapper[4746]: I0128 20:40:38.934778 4746 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-28T20:40:38Z","lastTransitionTime":"2026-01-28T20:40:38Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 28 20:40:39 crc kubenswrapper[4746]: I0128 20:40:39.038061 4746 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 28 20:40:39 crc kubenswrapper[4746]: I0128 20:40:39.038155 4746 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 28 20:40:39 crc kubenswrapper[4746]: I0128 20:40:39.038173 4746 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 28 20:40:39 crc kubenswrapper[4746]: I0128 20:40:39.038199 4746 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 28 20:40:39 crc kubenswrapper[4746]: I0128 20:40:39.038219 4746 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-28T20:40:39Z","lastTransitionTime":"2026-01-28T20:40:39Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 28 20:40:39 crc kubenswrapper[4746]: I0128 20:40:39.142091 4746 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 28 20:40:39 crc kubenswrapper[4746]: I0128 20:40:39.142144 4746 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 28 20:40:39 crc kubenswrapper[4746]: I0128 20:40:39.142154 4746 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 28 20:40:39 crc kubenswrapper[4746]: I0128 20:40:39.142172 4746 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 28 20:40:39 crc kubenswrapper[4746]: I0128 20:40:39.142183 4746 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-28T20:40:39Z","lastTransitionTime":"2026-01-28T20:40:39Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 28 20:40:39 crc kubenswrapper[4746]: I0128 20:40:39.244725 4746 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 28 20:40:39 crc kubenswrapper[4746]: I0128 20:40:39.244776 4746 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 28 20:40:39 crc kubenswrapper[4746]: I0128 20:40:39.244788 4746 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 28 20:40:39 crc kubenswrapper[4746]: I0128 20:40:39.244810 4746 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 28 20:40:39 crc kubenswrapper[4746]: I0128 20:40:39.244824 4746 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-28T20:40:39Z","lastTransitionTime":"2026-01-28T20:40:39Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 28 20:40:39 crc kubenswrapper[4746]: I0128 20:40:39.347880 4746 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 28 20:40:39 crc kubenswrapper[4746]: I0128 20:40:39.347973 4746 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 28 20:40:39 crc kubenswrapper[4746]: I0128 20:40:39.347997 4746 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 28 20:40:39 crc kubenswrapper[4746]: I0128 20:40:39.348038 4746 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 28 20:40:39 crc kubenswrapper[4746]: I0128 20:40:39.348063 4746 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-28T20:40:39Z","lastTransitionTime":"2026-01-28T20:40:39Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 28 20:40:39 crc kubenswrapper[4746]: I0128 20:40:39.450180 4746 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 28 20:40:39 crc kubenswrapper[4746]: I0128 20:40:39.450222 4746 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 28 20:40:39 crc kubenswrapper[4746]: I0128 20:40:39.450271 4746 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 28 20:40:39 crc kubenswrapper[4746]: I0128 20:40:39.450293 4746 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 28 20:40:39 crc kubenswrapper[4746]: I0128 20:40:39.450304 4746 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-28T20:40:39Z","lastTransitionTime":"2026-01-28T20:40:39Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 28 20:40:39 crc kubenswrapper[4746]: I0128 20:40:39.553471 4746 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 28 20:40:39 crc kubenswrapper[4746]: I0128 20:40:39.553563 4746 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 28 20:40:39 crc kubenswrapper[4746]: I0128 20:40:39.553593 4746 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 28 20:40:39 crc kubenswrapper[4746]: I0128 20:40:39.553637 4746 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 28 20:40:39 crc kubenswrapper[4746]: I0128 20:40:39.553661 4746 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-28T20:40:39Z","lastTransitionTime":"2026-01-28T20:40:39Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 28 20:40:39 crc kubenswrapper[4746]: I0128 20:40:39.655679 4746 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 28 20:40:39 crc kubenswrapper[4746]: I0128 20:40:39.655711 4746 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 28 20:40:39 crc kubenswrapper[4746]: I0128 20:40:39.655722 4746 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 28 20:40:39 crc kubenswrapper[4746]: I0128 20:40:39.655736 4746 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 28 20:40:39 crc kubenswrapper[4746]: I0128 20:40:39.655748 4746 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-28T20:40:39Z","lastTransitionTime":"2026-01-28T20:40:39Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 28 20:40:39 crc kubenswrapper[4746]: I0128 20:40:39.758755 4746 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 28 20:40:39 crc kubenswrapper[4746]: I0128 20:40:39.758798 4746 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 28 20:40:39 crc kubenswrapper[4746]: I0128 20:40:39.758809 4746 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 28 20:40:39 crc kubenswrapper[4746]: I0128 20:40:39.758824 4746 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 28 20:40:39 crc kubenswrapper[4746]: I0128 20:40:39.758835 4746 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-28T20:40:39Z","lastTransitionTime":"2026-01-28T20:40:39Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 28 20:40:39 crc kubenswrapper[4746]: I0128 20:40:39.835838 4746 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Jan 28 20:40:39 crc kubenswrapper[4746]: E0128 20:40:39.836105 4746 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Jan 28 20:40:39 crc kubenswrapper[4746]: I0128 20:40:39.847451 4746 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2025-12-05 08:24:30.391597142 +0000 UTC Jan 28 20:40:39 crc kubenswrapper[4746]: I0128 20:40:39.861163 4746 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 28 20:40:39 crc kubenswrapper[4746]: I0128 20:40:39.861207 4746 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 28 20:40:39 crc kubenswrapper[4746]: I0128 20:40:39.861219 4746 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 28 20:40:39 crc kubenswrapper[4746]: I0128 20:40:39.861235 4746 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 28 20:40:39 crc kubenswrapper[4746]: I0128 20:40:39.861247 4746 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-28T20:40:39Z","lastTransitionTime":"2026-01-28T20:40:39Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 28 20:40:39 crc kubenswrapper[4746]: I0128 20:40:39.963611 4746 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 28 20:40:39 crc kubenswrapper[4746]: I0128 20:40:39.963647 4746 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 28 20:40:39 crc kubenswrapper[4746]: I0128 20:40:39.963655 4746 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 28 20:40:39 crc kubenswrapper[4746]: I0128 20:40:39.963669 4746 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 28 20:40:39 crc kubenswrapper[4746]: I0128 20:40:39.963679 4746 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-28T20:40:39Z","lastTransitionTime":"2026-01-28T20:40:39Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 28 20:40:40 crc kubenswrapper[4746]: I0128 20:40:40.065991 4746 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 28 20:40:40 crc kubenswrapper[4746]: I0128 20:40:40.066031 4746 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 28 20:40:40 crc kubenswrapper[4746]: I0128 20:40:40.066042 4746 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 28 20:40:40 crc kubenswrapper[4746]: I0128 20:40:40.066059 4746 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 28 20:40:40 crc kubenswrapper[4746]: I0128 20:40:40.066069 4746 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-28T20:40:40Z","lastTransitionTime":"2026-01-28T20:40:40Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 28 20:40:40 crc kubenswrapper[4746]: I0128 20:40:40.168378 4746 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 28 20:40:40 crc kubenswrapper[4746]: I0128 20:40:40.168425 4746 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 28 20:40:40 crc kubenswrapper[4746]: I0128 20:40:40.168437 4746 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 28 20:40:40 crc kubenswrapper[4746]: I0128 20:40:40.168454 4746 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 28 20:40:40 crc kubenswrapper[4746]: I0128 20:40:40.168466 4746 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-28T20:40:40Z","lastTransitionTime":"2026-01-28T20:40:40Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 28 20:40:40 crc kubenswrapper[4746]: I0128 20:40:40.270766 4746 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 28 20:40:40 crc kubenswrapper[4746]: I0128 20:40:40.270818 4746 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 28 20:40:40 crc kubenswrapper[4746]: I0128 20:40:40.270830 4746 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 28 20:40:40 crc kubenswrapper[4746]: I0128 20:40:40.270846 4746 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 28 20:40:40 crc kubenswrapper[4746]: I0128 20:40:40.270859 4746 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-28T20:40:40Z","lastTransitionTime":"2026-01-28T20:40:40Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 28 20:40:40 crc kubenswrapper[4746]: I0128 20:40:40.372667 4746 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 28 20:40:40 crc kubenswrapper[4746]: I0128 20:40:40.372705 4746 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 28 20:40:40 crc kubenswrapper[4746]: I0128 20:40:40.372713 4746 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 28 20:40:40 crc kubenswrapper[4746]: I0128 20:40:40.372728 4746 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 28 20:40:40 crc kubenswrapper[4746]: I0128 20:40:40.372742 4746 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-28T20:40:40Z","lastTransitionTime":"2026-01-28T20:40:40Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 28 20:40:40 crc kubenswrapper[4746]: I0128 20:40:40.417387 4746 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 28 20:40:40 crc kubenswrapper[4746]: I0128 20:40:40.417431 4746 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 28 20:40:40 crc kubenswrapper[4746]: I0128 20:40:40.417441 4746 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 28 20:40:40 crc kubenswrapper[4746]: I0128 20:40:40.417457 4746 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 28 20:40:40 crc kubenswrapper[4746]: I0128 20:40:40.417469 4746 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-28T20:40:40Z","lastTransitionTime":"2026-01-28T20:40:40Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 28 20:40:40 crc kubenswrapper[4746]: E0128 20:40:40.429580 4746 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-01-28T20:40:40Z\\\",\\\"lastTransitionTime\\\":\\\"2026-01-28T20:40:40Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-01-28T20:40:40Z\\\",\\\"lastTransitionTime\\\":\\\"2026-01-28T20:40:40Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-01-28T20:40:40Z\\\",\\\"lastTransitionTime\\\":\\\"2026-01-28T20:40:40Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-01-28T20:40:40Z\\\",\\\"lastTransitionTime\\\":\\\"2026-01-28T20:40:40Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"349dd4ef-f3ea-4c41-bfa2-75ea02498ab0\\\",\\\"systemUUID\\\":\\\"e89ecf32-8beb-4b41-b6df-0f1293ce0213\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-28T20:40:40Z is after 2025-08-24T17:21:41Z" Jan 28 20:40:40 crc kubenswrapper[4746]: I0128 20:40:40.436252 4746 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 28 20:40:40 crc kubenswrapper[4746]: I0128 20:40:40.436295 4746 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 28 20:40:40 crc kubenswrapper[4746]: I0128 20:40:40.436306 4746 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 28 20:40:40 crc kubenswrapper[4746]: I0128 20:40:40.436326 4746 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 28 20:40:40 crc kubenswrapper[4746]: I0128 20:40:40.436338 4746 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-28T20:40:40Z","lastTransitionTime":"2026-01-28T20:40:40Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 28 20:40:40 crc kubenswrapper[4746]: E0128 20:40:40.449982 4746 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-01-28T20:40:40Z\\\",\\\"lastTransitionTime\\\":\\\"2026-01-28T20:40:40Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-01-28T20:40:40Z\\\",\\\"lastTransitionTime\\\":\\\"2026-01-28T20:40:40Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-01-28T20:40:40Z\\\",\\\"lastTransitionTime\\\":\\\"2026-01-28T20:40:40Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-01-28T20:40:40Z\\\",\\\"lastTransitionTime\\\":\\\"2026-01-28T20:40:40Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"349dd4ef-f3ea-4c41-bfa2-75ea02498ab0\\\",\\\"systemUUID\\\":\\\"e89ecf32-8beb-4b41-b6df-0f1293ce0213\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-28T20:40:40Z is after 2025-08-24T17:21:41Z" Jan 28 20:40:40 crc kubenswrapper[4746]: I0128 20:40:40.454274 4746 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 28 20:40:40 crc kubenswrapper[4746]: I0128 20:40:40.454317 4746 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 28 20:40:40 crc kubenswrapper[4746]: I0128 20:40:40.454328 4746 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 28 20:40:40 crc kubenswrapper[4746]: I0128 20:40:40.454343 4746 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 28 20:40:40 crc kubenswrapper[4746]: I0128 20:40:40.454351 4746 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-28T20:40:40Z","lastTransitionTime":"2026-01-28T20:40:40Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 28 20:40:40 crc kubenswrapper[4746]: E0128 20:40:40.466499 4746 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-01-28T20:40:40Z\\\",\\\"lastTransitionTime\\\":\\\"2026-01-28T20:40:40Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-01-28T20:40:40Z\\\",\\\"lastTransitionTime\\\":\\\"2026-01-28T20:40:40Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-01-28T20:40:40Z\\\",\\\"lastTransitionTime\\\":\\\"2026-01-28T20:40:40Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-01-28T20:40:40Z\\\",\\\"lastTransitionTime\\\":\\\"2026-01-28T20:40:40Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"349dd4ef-f3ea-4c41-bfa2-75ea02498ab0\\\",\\\"systemUUID\\\":\\\"e89ecf32-8beb-4b41-b6df-0f1293ce0213\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-28T20:40:40Z is after 2025-08-24T17:21:41Z" Jan 28 20:40:40 crc kubenswrapper[4746]: I0128 20:40:40.470536 4746 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 28 20:40:40 crc kubenswrapper[4746]: I0128 20:40:40.470613 4746 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 28 20:40:40 crc kubenswrapper[4746]: I0128 20:40:40.470627 4746 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 28 20:40:40 crc kubenswrapper[4746]: I0128 20:40:40.470652 4746 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 28 20:40:40 crc kubenswrapper[4746]: I0128 20:40:40.470664 4746 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-28T20:40:40Z","lastTransitionTime":"2026-01-28T20:40:40Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 28 20:40:40 crc kubenswrapper[4746]: E0128 20:40:40.483024 4746 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-01-28T20:40:40Z\\\",\\\"lastTransitionTime\\\":\\\"2026-01-28T20:40:40Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-01-28T20:40:40Z\\\",\\\"lastTransitionTime\\\":\\\"2026-01-28T20:40:40Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-01-28T20:40:40Z\\\",\\\"lastTransitionTime\\\":\\\"2026-01-28T20:40:40Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-01-28T20:40:40Z\\\",\\\"lastTransitionTime\\\":\\\"2026-01-28T20:40:40Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"349dd4ef-f3ea-4c41-bfa2-75ea02498ab0\\\",\\\"systemUUID\\\":\\\"e89ecf32-8beb-4b41-b6df-0f1293ce0213\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-28T20:40:40Z is after 2025-08-24T17:21:41Z" Jan 28 20:40:40 crc kubenswrapper[4746]: I0128 20:40:40.486575 4746 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 28 20:40:40 crc kubenswrapper[4746]: I0128 20:40:40.486608 4746 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 28 20:40:40 crc kubenswrapper[4746]: I0128 20:40:40.486622 4746 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 28 20:40:40 crc kubenswrapper[4746]: I0128 20:40:40.486644 4746 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 28 20:40:40 crc kubenswrapper[4746]: I0128 20:40:40.486660 4746 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-28T20:40:40Z","lastTransitionTime":"2026-01-28T20:40:40Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 28 20:40:40 crc kubenswrapper[4746]: E0128 20:40:40.499286 4746 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-01-28T20:40:40Z\\\",\\\"lastTransitionTime\\\":\\\"2026-01-28T20:40:40Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-01-28T20:40:40Z\\\",\\\"lastTransitionTime\\\":\\\"2026-01-28T20:40:40Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-01-28T20:40:40Z\\\",\\\"lastTransitionTime\\\":\\\"2026-01-28T20:40:40Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-01-28T20:40:40Z\\\",\\\"lastTransitionTime\\\":\\\"2026-01-28T20:40:40Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"349dd4ef-f3ea-4c41-bfa2-75ea02498ab0\\\",\\\"systemUUID\\\":\\\"e89ecf32-8beb-4b41-b6df-0f1293ce0213\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-28T20:40:40Z is after 2025-08-24T17:21:41Z" Jan 28 20:40:40 crc kubenswrapper[4746]: E0128 20:40:40.499411 4746 kubelet_node_status.go:572] "Unable to update node status" err="update node status exceeds retry count" Jan 28 20:40:40 crc kubenswrapper[4746]: I0128 20:40:40.501499 4746 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 28 20:40:40 crc kubenswrapper[4746]: I0128 20:40:40.501538 4746 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 28 20:40:40 crc kubenswrapper[4746]: I0128 20:40:40.501549 4746 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 28 20:40:40 crc kubenswrapper[4746]: I0128 20:40:40.501565 4746 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 28 20:40:40 crc kubenswrapper[4746]: I0128 20:40:40.501576 4746 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-28T20:40:40Z","lastTransitionTime":"2026-01-28T20:40:40Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 28 20:40:40 crc kubenswrapper[4746]: I0128 20:40:40.604486 4746 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 28 20:40:40 crc kubenswrapper[4746]: I0128 20:40:40.604793 4746 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 28 20:40:40 crc kubenswrapper[4746]: I0128 20:40:40.604832 4746 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 28 20:40:40 crc kubenswrapper[4746]: I0128 20:40:40.604845 4746 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 28 20:40:40 crc kubenswrapper[4746]: I0128 20:40:40.604853 4746 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-28T20:40:40Z","lastTransitionTime":"2026-01-28T20:40:40Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 28 20:40:40 crc kubenswrapper[4746]: I0128 20:40:40.707628 4746 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 28 20:40:40 crc kubenswrapper[4746]: I0128 20:40:40.707687 4746 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 28 20:40:40 crc kubenswrapper[4746]: I0128 20:40:40.707697 4746 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 28 20:40:40 crc kubenswrapper[4746]: I0128 20:40:40.707718 4746 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 28 20:40:40 crc kubenswrapper[4746]: I0128 20:40:40.707731 4746 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-28T20:40:40Z","lastTransitionTime":"2026-01-28T20:40:40Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 28 20:40:40 crc kubenswrapper[4746]: I0128 20:40:40.811542 4746 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 28 20:40:40 crc kubenswrapper[4746]: I0128 20:40:40.811599 4746 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 28 20:40:40 crc kubenswrapper[4746]: I0128 20:40:40.811617 4746 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 28 20:40:40 crc kubenswrapper[4746]: I0128 20:40:40.811647 4746 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 28 20:40:40 crc kubenswrapper[4746]: I0128 20:40:40.811658 4746 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-28T20:40:40Z","lastTransitionTime":"2026-01-28T20:40:40Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 28 20:40:40 crc kubenswrapper[4746]: I0128 20:40:40.835846 4746 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-2blg6" Jan 28 20:40:40 crc kubenswrapper[4746]: I0128 20:40:40.835882 4746 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Jan 28 20:40:40 crc kubenswrapper[4746]: I0128 20:40:40.835889 4746 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Jan 28 20:40:40 crc kubenswrapper[4746]: E0128 20:40:40.836171 4746 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-2blg6" podUID="f60a5487-5012-4cc9-ad94-5dfb4957d74e" Jan 28 20:40:40 crc kubenswrapper[4746]: E0128 20:40:40.836271 4746 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Jan 28 20:40:40 crc kubenswrapper[4746]: E0128 20:40:40.836179 4746 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Jan 28 20:40:40 crc kubenswrapper[4746]: I0128 20:40:40.848643 4746 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2026-01-09 11:52:55.067153719 +0000 UTC Jan 28 20:40:40 crc kubenswrapper[4746]: I0128 20:40:40.914502 4746 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 28 20:40:40 crc kubenswrapper[4746]: I0128 20:40:40.914561 4746 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 28 20:40:40 crc kubenswrapper[4746]: I0128 20:40:40.914573 4746 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 28 20:40:40 crc kubenswrapper[4746]: I0128 20:40:40.914595 4746 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 28 20:40:40 crc kubenswrapper[4746]: I0128 20:40:40.914611 4746 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-28T20:40:40Z","lastTransitionTime":"2026-01-28T20:40:40Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 28 20:40:41 crc kubenswrapper[4746]: I0128 20:40:41.017910 4746 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 28 20:40:41 crc kubenswrapper[4746]: I0128 20:40:41.017959 4746 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 28 20:40:41 crc kubenswrapper[4746]: I0128 20:40:41.017977 4746 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 28 20:40:41 crc kubenswrapper[4746]: I0128 20:40:41.018001 4746 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 28 20:40:41 crc kubenswrapper[4746]: I0128 20:40:41.018018 4746 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-28T20:40:41Z","lastTransitionTime":"2026-01-28T20:40:41Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 28 20:40:41 crc kubenswrapper[4746]: I0128 20:40:41.120841 4746 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 28 20:40:41 crc kubenswrapper[4746]: I0128 20:40:41.120902 4746 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 28 20:40:41 crc kubenswrapper[4746]: I0128 20:40:41.120912 4746 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 28 20:40:41 crc kubenswrapper[4746]: I0128 20:40:41.120927 4746 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 28 20:40:41 crc kubenswrapper[4746]: I0128 20:40:41.120939 4746 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-28T20:40:41Z","lastTransitionTime":"2026-01-28T20:40:41Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 28 20:40:41 crc kubenswrapper[4746]: I0128 20:40:41.224226 4746 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 28 20:40:41 crc kubenswrapper[4746]: I0128 20:40:41.224282 4746 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 28 20:40:41 crc kubenswrapper[4746]: I0128 20:40:41.224296 4746 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 28 20:40:41 crc kubenswrapper[4746]: I0128 20:40:41.224323 4746 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 28 20:40:41 crc kubenswrapper[4746]: I0128 20:40:41.224341 4746 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-28T20:40:41Z","lastTransitionTime":"2026-01-28T20:40:41Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 28 20:40:41 crc kubenswrapper[4746]: I0128 20:40:41.310189 4746 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-qhpvf_cdf26de0-b602-4bdf-b492-65b3b6b31434/kube-multus/0.log" Jan 28 20:40:41 crc kubenswrapper[4746]: I0128 20:40:41.310257 4746 generic.go:334] "Generic (PLEG): container finished" podID="cdf26de0-b602-4bdf-b492-65b3b6b31434" containerID="f80345dfd4de46e278611370e6c337968cb1ba7ed8275457deea5871704f8367" exitCode=1 Jan 28 20:40:41 crc kubenswrapper[4746]: I0128 20:40:41.310300 4746 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-qhpvf" event={"ID":"cdf26de0-b602-4bdf-b492-65b3b6b31434","Type":"ContainerDied","Data":"f80345dfd4de46e278611370e6c337968cb1ba7ed8275457deea5871704f8367"} Jan 28 20:40:41 crc kubenswrapper[4746]: I0128 20:40:41.310803 4746 scope.go:117] "RemoveContainer" containerID="f80345dfd4de46e278611370e6c337968cb1ba7ed8275457deea5871704f8367" Jan 28 20:40:41 crc kubenswrapper[4746]: I0128 20:40:41.327207 4746 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-28T20:39:54Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-28T20:39:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-28T20:39:54Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://2f64131d62eca6c3a42e9177d14fa39ed6435e536f60ef5d25ff6bf061d2cc0a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-28T20:39:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: 
x509: certificate has expired or is not yet valid: current time 2026-01-28T20:40:41Z is after 2025-08-24T17:21:41Z" Jan 28 20:40:41 crc kubenswrapper[4746]: I0128 20:40:41.327508 4746 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 28 20:40:41 crc kubenswrapper[4746]: I0128 20:40:41.327556 4746 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 28 20:40:41 crc kubenswrapper[4746]: I0128 20:40:41.327574 4746 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 28 20:40:41 crc kubenswrapper[4746]: I0128 20:40:41.327596 4746 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 28 20:40:41 crc kubenswrapper[4746]: I0128 20:40:41.327609 4746 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-28T20:40:41Z","lastTransitionTime":"2026-01-28T20:40:41Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 28 20:40:41 crc kubenswrapper[4746]: I0128 20:40:41.342185 4746 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-28T20:39:57Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-28T20:39:57Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-28T20:39:57Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://83a275ec9868f2fd9e24135d5a05c27caf843340c144070952291bb6a3715035\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-28T20:39:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod 
\"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-28T20:40:41Z is after 2025-08-24T17:21:41Z" Jan 28 20:40:41 crc kubenswrapper[4746]: I0128 20:40:41.357612 4746 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-6wrnw" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"6dc8b546-9734-4082-b2b3-2bafe3f1564d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-28T20:39:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-28T20:39:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-28T20:39:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-28T20:39:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://50ec1ea913ec63a096032bd968bda5c5c8879e16a08e6a57727750517d9af038\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy
\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-28T20:39:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-z84f9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://66d250466155027405bf90c4e5ed09388238d4cee63604e28486a16778f9d188\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-28T20:39:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-z84f9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-28T20:39:53Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-6wrnw\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 
2026-01-28T20:40:41Z is after 2025-08-24T17:21:41Z" Jan 28 20:40:41 crc kubenswrapper[4746]: I0128 20:40:41.374722 4746 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-ht6hp" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"beb2b795-6bf4-4d38-89f7-bcb5512c3e61\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-28T20:39:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-28T20:40:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-28T20:40:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-28T20:40:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://564d42c4c25f218c1a1f52449d2f3d941c1dad920c8100d1d908cadf7cf46607\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-28T20:40:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rr62b\\\",\\\"readOnly\\\":true,\\\"
recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b2d8bf207108ac2304cd04e34cd9843fcb755ac8610a3f88448538d1e99359f6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b2d8bf207108ac2304cd04e34cd9843fcb755ac8610a3f88448538d1e99359f6\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-28T20:39:54Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-28T20:39:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rr62b\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://40c78d39cb70eba245771f4d85d2c8bc7b77427ea216ad3b89749d24fd550098\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"termi
nated\\\":{\\\"containerID\\\":\\\"cri-o://40c78d39cb70eba245771f4d85d2c8bc7b77427ea216ad3b89749d24fd550098\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-28T20:39:55Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-28T20:39:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rr62b\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://498028ca0bd260ca2c4d7b0231ea54a5a6ff5b302f720fbd209a4f640507258b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://498028ca0bd260ca2c4d7b0231ea54a5a6ff5b302f720fbd209a4f640507258b\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-28T20:39:56Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-28T20:39:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\
\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rr62b\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://661d4a9031b6789f38ed1ac8e3c40d72182709130e1680ecdad86c07c1f34b7b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://661d4a9031b6789f38ed1ac8e3c40d72182709130e1680ecdad86c07c1f34b7b\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-28T20:39:57Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-28T20:39:57Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rr62b\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://a49ffe0bbe4151671dc8803d7ab61cb9b3a2eae064b81e2830bc074787f4754e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"r
estartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://a49ffe0bbe4151671dc8803d7ab61cb9b3a2eae064b81e2830bc074787f4754e\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-28T20:39:58Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-28T20:39:58Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rr62b\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://cd521e9ecb53f581b5732a4d018f92007063ce0912e6d760fe3c920218c23b14\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://cd521e9ecb53f581b5732a4d018f92007063ce0912e6d760fe3c920218c23b14\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-28T20:39:59Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-28T20:39:59Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rr62b\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":
\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-28T20:39:53Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-ht6hp\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-28T20:40:41Z is after 2025-08-24T17:21:41Z" Jan 28 20:40:41 crc kubenswrapper[4746]: I0128 20:40:41.390298 4746 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-qhpvf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"cdf26de0-b602-4bdf-b492-65b3b6b31434\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-28T20:39:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-28T20:39:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-28T20:40:41Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-28T20:40:41Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-multus]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f80345dfd4de46e278611370e6c337968cb1ba7ed8275457deea5871704f8367\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://f80345dfd4de46e278611370e6c337968cb1ba7ed8275457deea5871704f8367\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2026-01-28T20:40:40Z\\\",\\\"message\\\":\\\"2026-01-28T20:39:55+00:00 [cnibincopy] Successfully copied files in /usr/src/multus-cni/rhel9/bin/ to /host/opt/cni/bin/upgrade_529a75ec-ce77-49ca-8f6a-9466be7c980e\\\\n2026-01-28T20:39:55+00:00 [cnibincopy] Successfully moved files in /host/opt/cni/bin/upgrade_529a75ec-ce77-49ca-8f6a-9466be7c980e to /host/opt/cni/bin/\\\\n2026-01-28T20:39:55Z [verbose] multus-daemon started\\\\n2026-01-28T20:39:55Z [verbose] Readiness Indicator file check\\\\n2026-01-28T20:40:40Z [error] have you checked that your default network is ready? still waiting for readinessindicatorfile @ /host/run/multus/cni/net.d/10-ovn-kubernetes.conf. 
pollimmediate error: timed out waiting for the condition\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-01-28T20:39:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jnpsl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\
\":\\\"2026-01-28T20:39:53Z\\\"}}\" for pod \"openshift-multus\"/\"multus-qhpvf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-28T20:40:41Z is after 2025-08-24T17:21:41Z" Jan 28 20:40:41 crc kubenswrapper[4746]: I0128 20:40:41.402926 4746 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-2blg6" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"f60a5487-5012-4cc9-ad94-5dfb4957d74e\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-28T20:40:07Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-28T20:40:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-28T20:40:07Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-28T20:40:07Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-l2www\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-l2www\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-28T20:40:07Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-2blg6\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-28T20:40:41Z is after 2025-08-24T17:21:41Z" Jan 28 20:40:41 crc 
kubenswrapper[4746]: I0128 20:40:41.417889 4746 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"64499f0d-5c10-43a9-b479-9b6c7ef2fb9c\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-28T20:39:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-28T20:39:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-28T20:40:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-28T20:40:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-28T20:39:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://9a304b12b4a1e30cbbbb16aa4f91514f1b1a64c716cf8ac023803a279b310f9c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-28T20:39:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://6af41f85c56be6
c24127bec0aee8e02562dbb04bd22e57c6887f6f5e914a7530\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-28T20:39:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://f31e9e80576ec4eb59ec92039601fd94bd323d29493c353f9ec6b9c61187b0d3\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-28T20:39:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://64f8ae1fcf8b14a65196ffa6d5a50db39aa846dc4e6cc0b92ba54d9b5f6913e3\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"te
rminated\\\":{\\\"containerID\\\":\\\"cri-o://56b250a192cfa698ccd7ac8233b75b9acbf2347549c712b47ccc1b89f144aa83\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-01-28T20:39:52Z\\\",\\\"message\\\":\\\"le observer\\\\nW0128 20:39:52.695680 1 builder.go:272] unable to get owner reference (falling back to namespace): pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI0128 20:39:52.695807 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0128 20:39:52.696767 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-2500116620/tls.crt::/tmp/serving-cert-2500116620/tls.key\\\\\\\"\\\\nI0128 20:39:52.900914 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI0128 20:39:52.915714 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI0128 20:39:52.915750 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI0128 20:39:52.916035 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI0128 20:39:52.916045 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI0128 20:39:52.925040 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW0128 20:39:52.925098 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0128 20:39:52.925104 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0128 20:39:52.925109 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW0128 20:39:52.925113 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW0128 20:39:52.925116 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW0128 
20:39:52.925119 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nI0128 20:39:52.925416 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nF0128 20:39:52.926926 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-01-28T20:39:47Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":2,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-28T20:40:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://03a94dee0b9663441e0ecec16b121c5be4e5c557f17c929b8edfb0d7ba00c3d5\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-28T20:39:35Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c5808a729190035a73529ac58efb0790f2d9c3b98c5f1a8017389e4ca1738c4c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\
\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://c5808a729190035a73529ac58efb0790f2d9c3b98c5f1a8017389e4ca1738c4c\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-28T20:39:33Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-28T20:39:33Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-28T20:39:32Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-28T20:40:41Z is after 2025-08-24T17:21:41Z" Jan 28 20:40:41 crc kubenswrapper[4746]: I0128 20:40:41.430181 4746 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 28 20:40:41 crc kubenswrapper[4746]: I0128 20:40:41.430210 4746 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 28 20:40:41 crc kubenswrapper[4746]: I0128 20:40:41.430220 4746 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 28 20:40:41 crc kubenswrapper[4746]: I0128 20:40:41.430238 4746 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 28 20:40:41 crc kubenswrapper[4746]: I0128 20:40:41.430249 4746 setters.go:603] "Node became not ready" node="crc" 
condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-28T20:40:41Z","lastTransitionTime":"2026-01-28T20:40:41Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 28 20:40:41 crc kubenswrapper[4746]: I0128 20:40:41.432439 4746 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3553d828-b1a0-4f51-9e70-5f4d25f3ee42\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-28T20:39:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-28T20:39:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-28T20:40:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-28T20:40:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-28T20:39:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://2ba6f8586593282902571354d53a7bdc0945ea8d1970cac1bfc2f8cc4019a4ab\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\
\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-28T20:39:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://00d75277c1a1e8fdc34e37a3ae3d697e002007456fde3dae5d49c1c932a0a7f5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-28T20:39:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://cf6d80884e00b1a12051a7a97148a46fba0e8514a1233a180262392c302db77f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-28T20:39:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.1
26.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a4a7e5a6315ccaaa100eb33d1a23bf2481f4ee8aa1ea076927dec54027d4c4cd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"wait-for-host-port\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://a4a7e5a6315ccaaa100eb33d1a23bf2481f4ee8aa1ea076927dec54027d4c4cd\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-28T20:39:33Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-28T20:39:33Z\\\"}}}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-28T20:39:32Z\\\"}}\" for pod \"openshift-kube-scheduler\"/\"openshift-kube-scheduler-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-28T20:40:41Z is after 2025-08-24T17:21:41Z" Jan 28 20:40:41 crc kubenswrapper[4746]: I0128 20:40:41.443730 4746 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-d8rwq" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"405572d8-50d2-4d7a-adf0-d8d6adea31ca\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-28T20:39:57Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-28T20:39:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-28T20:39:57Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-28T20:39:57Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://eba305a172c49420674e97d32590a7b2e4c7c2cd7406257508e0636cf42ea244\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-28T20:39:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mb4h4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phas
e\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-28T20:39:56Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-d8rwq\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-28T20:40:41Z is after 2025-08-24T17:21:41Z" Jan 28 20:40:41 crc kubenswrapper[4746]: I0128 20:40:41.458166 4746 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"950d71b3-9b51-43dc-814a-d6a838723f78\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-28T20:39:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-28T20:39:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-28T20:39:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-28T20:39:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-28T20:39:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0411ecf7c85f5397d751adb7d4905977dcbd3d2e169c56ebbe26017192765602\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee8
8051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-28T20:39:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://319cbcaa9981b3c79eb0c8f970f0ad776fb255db10c72c9b6a13469906e9e5ec\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-28T20:39:33Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://eefda7da35cd9ff176512b5d1224ccadcb5debd0035a49822e82a9757d4a3f91\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-28T20:39:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\
\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://145f4f882924f06172df81d94fc9bd683ea1fc04889de5f6cfb5caece8227fd0\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-28T20:39:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-28T20:39:32Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-28T20:40:41Z is after 2025-08-24T17:21:41Z" Jan 28 20:40:41 crc kubenswrapper[4746]: I0128 20:40:41.471235 4746 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-28T20:39:52Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-28T20:39:52Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-28T20:40:41Z is after 2025-08-24T17:21:41Z" Jan 28 20:40:41 crc kubenswrapper[4746]: I0128 20:40:41.489668 4746 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-9zvm7" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"de4fb5f0-7c65-4fdb-8389-d2c8462e130b\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-28T20:40:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-28T20:40:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-28T20:40:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-28T20:40:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://972f4c0079d8e04365abd93dc8d63e88dea04f427dbbbcf5e331cdeeff8c855c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-28T20:40:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jvkqz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://d2340a59c423445dd2b32da57d43146dd1cc3
54b737ae1b4d1875a0d40f62188\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-28T20:40:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jvkqz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-28T20:40:05Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-9zvm7\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-28T20:40:41Z is after 2025-08-24T17:21:41Z" Jan 28 20:40:41 crc kubenswrapper[4746]: I0128 20:40:41.503271 4746 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-28T20:39:52Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-28T20:39:52Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-28T20:40:41Z is after 2025-08-24T17:21:41Z" Jan 28 20:40:41 crc kubenswrapper[4746]: I0128 20:40:41.516499 4746 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-28T20:39:52Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-28T20:39:52Z\\\",\\\"message\\\":\\\"containers with unready status: 
[networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-28T20:40:41Z is after 2025-08-24T17:21:41Z" Jan 28 20:40:41 crc kubenswrapper[4746]: I0128 20:40:41.530012 4746 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-28T20:39:53Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-28T20:39:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-28T20:39:53Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5dff3e2075b8bbfb6af77c72362ccc37e1d7810a2d97c096cf501681c98699ab\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-28T20:39:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://772e165efa6d865a669c172958e0aea3112146637e354e4587b7e664454112ad\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af
0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-28T20:39:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-28T20:40:41Z is after 2025-08-24T17:21:41Z" Jan 28 20:40:41 crc kubenswrapper[4746]: I0128 20:40:41.533616 4746 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 28 20:40:41 crc kubenswrapper[4746]: I0128 20:40:41.533661 4746 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 28 20:40:41 crc kubenswrapper[4746]: I0128 20:40:41.533673 4746 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 28 20:40:41 crc kubenswrapper[4746]: I0128 20:40:41.533692 4746 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 28 20:40:41 crc kubenswrapper[4746]: I0128 20:40:41.533707 4746 setters.go:603] "Node became not ready" node="crc" 
condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-28T20:40:41Z","lastTransitionTime":"2026-01-28T20:40:41Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 28 20:40:41 crc kubenswrapper[4746]: I0128 20:40:41.541042 4746 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-gcrxx" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"1c3c93a1-7ddf-4339-9ca3-79f3753943b4\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-28T20:39:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-28T20:39:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-28T20:39:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-28T20:39:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5bd8d8214aa8e85e50747643b6354fb6e026866c926310fb2dd79760f916f468\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\
\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-28T20:39:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wn6hr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-28T20:39:52Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-gcrxx\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-28T20:40:41Z is after 2025-08-24T17:21:41Z" Jan 28 20:40:41 crc kubenswrapper[4746]: I0128 20:40:41.564617 4746 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-8vmvh" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"c4d15639-62fb-41b7-a1d4-6f51f3af6d99\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-28T20:39:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-28T20:39:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-28T20:39:53Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-28T20:39:53Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://2637734601bf1e43535b70bd62cc6aea7fc9f62d0d2a33c83602e7065a793be1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-28T20:39:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-kgrqs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://73e82be84afa7be760acda86ebf6c956d558dcbd4f37f049fba84e08c75fcd9f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-28T20:39:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\
\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-kgrqs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://17d09f7f46014da3713f76f111bca0a7ce8a24a154aa6144291d760d85e7d3ec\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-28T20:39:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-kgrqs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://b7ad895d0ff1ad570136d96671e1bde5c62882af50a407945ef0e7bc52727b88\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-28T20:39:54Z\\\"}},\\\"volumeMounts\\\":[{
\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-kgrqs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://af842ba33fdf06048038dd3e12c59f7a9d0867aa28acee1f2cc4571e7db28b88\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-28T20:39:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-kgrqs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://57a776a1daf1b17eb44d9fa09fe2f7ec01f647aa0a40d0d571736d2f7c2f8e4d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\"
,\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-28T20:39:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-kgrqs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://53be073387414da7f1a5b73960206c0a04b4ef439d6c51bd22ceefbb5d723b76\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://53be073387414da7f1a5b73960206c0a04b4ef439d6c51bd22ceefbb5d723b76\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2026-01-28T20:40:17Z\\\",\\\"message\\\":\\\"Built service openshift-console/console LB template configs for network=default: []services.lbConfig(nil)\\\\nI0128 20:40:17.344732 6409 model_client.go:382] Update operations generated as: [{Op:update Table:Logical_Switch_Port Row:map[addresses:{GoSet:[0a:58:0a:d9:00:5c 10.217.0.92]} 
options:{GoMap:map[iface-id-ver:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 requested-chassis:crc]} port_security:{GoSet:[0a:58:0a:d9:00:5c 10.217.0.92]}] Rows:[] Columns:[] Mutations:[] Timeout:\\\\u003cnil\\\\u003e Where:[where column _uuid == {c94130be-172c-477c-88c4-40cc7eba30fe}] Until: Durable:\\\\u003cnil\\\\u003e Comment:\\\\u003cnil\\\\u003e Lock:\\\\u003cnil\\\\u003e UUID: UUIDName:}]\\\\nI0128 20:40:17.344766 6409 services_controller.go:451] Built service openshift-console/console cluster-wide LB for network=default: []services.LB{services.LB{Name:\\\\\\\"Service_openshift-console/console_TCP_cluster\\\\\\\", UUID:\\\\\\\"\\\\\\\", Protocol:\\\\\\\"TCP\\\\\\\", ExternalIDs:map[string]string{\\\\\\\"k8s.ovn.org/kind\\\\\\\":\\\\\\\"Service\\\\\\\", \\\\\\\"k8s.ovn.org/owner\\\\\\\":\\\\\\\"openshift-console/console\\\\\\\"}, Opts:services.LBOpts{Reject:true, EmptyLBEvents:false, AffinityTimeOut:0, SkipSNAT:false, Template:false, AddressFamily:\\\\\\\"\\\\\\\"}, Rules:[]services.LBRule{services.LBRule{Source:services.Addr{IP:\\\\\\\"10.217.5.194\\\\\\\", Port:443, Template:(*services.Template)(nil)}, Targets:[]services.Addr{}}}, Templates:services.TemplateMap(nil), Switches:[]string{}, Routers:[]string{}, Groups:[]string{\\\\\\\"clusterLBGroup\\\\\\\"}}}\\\\nI0128 20\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-01-28T20:40:16Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":2,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 20s restarting failed container=ovnkube-controller 
pod=ovnkube-node-8vmvh_openshift-ovn-kubernetes(c4d15639-62fb-41b7-a1d4-6f51f3af6d99)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kube
rnetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-kgrqs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://cdd60109dba5f45418dd5bc29c649a50aae8479fa77fc3dc50ca4f7c3042e3fc\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-28T20:39:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-kgrqs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://45e59dfe364d511e17b8264a31157501de3ed23e7125d79979df83d328242328\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://45e59dfe364d511e17
b8264a31157501de3ed23e7125d79979df83d328242328\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-28T20:39:53Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-28T20:39:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-kgrqs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-28T20:39:53Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-8vmvh\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-28T20:40:41Z is after 2025-08-24T17:21:41Z" Jan 28 20:40:41 crc kubenswrapper[4746]: I0128 20:40:41.636977 4746 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 28 20:40:41 crc kubenswrapper[4746]: I0128 20:40:41.637022 4746 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 28 20:40:41 crc kubenswrapper[4746]: I0128 20:40:41.637033 4746 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 28 20:40:41 crc kubenswrapper[4746]: I0128 20:40:41.637054 4746 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 28 20:40:41 crc kubenswrapper[4746]: I0128 20:40:41.637100 4746 setters.go:603] "Node became not ready" node="crc" 
condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-28T20:40:41Z","lastTransitionTime":"2026-01-28T20:40:41Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 28 20:40:41 crc kubenswrapper[4746]: I0128 20:40:41.741629 4746 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 28 20:40:41 crc kubenswrapper[4746]: I0128 20:40:41.741676 4746 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 28 20:40:41 crc kubenswrapper[4746]: I0128 20:40:41.741686 4746 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 28 20:40:41 crc kubenswrapper[4746]: I0128 20:40:41.741707 4746 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 28 20:40:41 crc kubenswrapper[4746]: I0128 20:40:41.741720 4746 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-28T20:40:41Z","lastTransitionTime":"2026-01-28T20:40:41Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 28 20:40:41 crc kubenswrapper[4746]: I0128 20:40:41.835692 4746 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Jan 28 20:40:41 crc kubenswrapper[4746]: E0128 20:40:41.835904 4746 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Jan 28 20:40:41 crc kubenswrapper[4746]: I0128 20:40:41.845654 4746 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 28 20:40:41 crc kubenswrapper[4746]: I0128 20:40:41.845700 4746 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 28 20:40:41 crc kubenswrapper[4746]: I0128 20:40:41.845716 4746 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 28 20:40:41 crc kubenswrapper[4746]: I0128 20:40:41.845735 4746 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 28 20:40:41 crc kubenswrapper[4746]: I0128 20:40:41.845749 4746 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-28T20:40:41Z","lastTransitionTime":"2026-01-28T20:40:41Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 28 20:40:41 crc kubenswrapper[4746]: I0128 20:40:41.849100 4746 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2025-11-27 17:28:06.530771092 +0000 UTC Jan 28 20:40:41 crc kubenswrapper[4746]: I0128 20:40:41.948301 4746 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 28 20:40:41 crc kubenswrapper[4746]: I0128 20:40:41.948345 4746 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 28 20:40:41 crc kubenswrapper[4746]: I0128 20:40:41.948353 4746 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 28 20:40:41 crc kubenswrapper[4746]: I0128 20:40:41.948371 4746 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 28 20:40:41 crc kubenswrapper[4746]: I0128 20:40:41.948381 4746 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-28T20:40:41Z","lastTransitionTime":"2026-01-28T20:40:41Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 28 20:40:42 crc kubenswrapper[4746]: I0128 20:40:42.050977 4746 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 28 20:40:42 crc kubenswrapper[4746]: I0128 20:40:42.051036 4746 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 28 20:40:42 crc kubenswrapper[4746]: I0128 20:40:42.051054 4746 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 28 20:40:42 crc kubenswrapper[4746]: I0128 20:40:42.051144 4746 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 28 20:40:42 crc kubenswrapper[4746]: I0128 20:40:42.051174 4746 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-28T20:40:42Z","lastTransitionTime":"2026-01-28T20:40:42Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 28 20:40:42 crc kubenswrapper[4746]: I0128 20:40:42.154268 4746 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 28 20:40:42 crc kubenswrapper[4746]: I0128 20:40:42.154330 4746 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 28 20:40:42 crc kubenswrapper[4746]: I0128 20:40:42.154346 4746 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 28 20:40:42 crc kubenswrapper[4746]: I0128 20:40:42.154371 4746 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 28 20:40:42 crc kubenswrapper[4746]: I0128 20:40:42.154384 4746 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-28T20:40:42Z","lastTransitionTime":"2026-01-28T20:40:42Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 28 20:40:42 crc kubenswrapper[4746]: I0128 20:40:42.257270 4746 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 28 20:40:42 crc kubenswrapper[4746]: I0128 20:40:42.257324 4746 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 28 20:40:42 crc kubenswrapper[4746]: I0128 20:40:42.257338 4746 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 28 20:40:42 crc kubenswrapper[4746]: I0128 20:40:42.257358 4746 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 28 20:40:42 crc kubenswrapper[4746]: I0128 20:40:42.257372 4746 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-28T20:40:42Z","lastTransitionTime":"2026-01-28T20:40:42Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 28 20:40:42 crc kubenswrapper[4746]: I0128 20:40:42.316589 4746 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-qhpvf_cdf26de0-b602-4bdf-b492-65b3b6b31434/kube-multus/0.log" Jan 28 20:40:42 crc kubenswrapper[4746]: I0128 20:40:42.316664 4746 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-qhpvf" event={"ID":"cdf26de0-b602-4bdf-b492-65b3b6b31434","Type":"ContainerStarted","Data":"9c88cd0b151f23677f83a944570f36a7a6edd30d9be47638a17cbb9098ec1c23"} Jan 28 20:40:42 crc kubenswrapper[4746]: I0128 20:40:42.336143 4746 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-28T20:39:52Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-28T20:39:52Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was 
deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-28T20:40:42Z is after 2025-08-24T17:21:41Z" Jan 28 20:40:42 crc kubenswrapper[4746]: I0128 20:40:42.352797 4746 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-28T20:39:52Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-28T20:39:52Z\\\",\\\"message\\\":\\\"containers with unready status: 
[network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-28T20:40:42Z is after 2025-08-24T17:21:41Z" Jan 28 20:40:42 crc kubenswrapper[4746]: I0128 20:40:42.359962 4746 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 28 20:40:42 crc kubenswrapper[4746]: I0128 20:40:42.360004 4746 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 28 20:40:42 crc kubenswrapper[4746]: I0128 20:40:42.360017 4746 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 28 20:40:42 crc 
kubenswrapper[4746]: I0128 20:40:42.360037 4746 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 28 20:40:42 crc kubenswrapper[4746]: I0128 20:40:42.360049 4746 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-28T20:40:42Z","lastTransitionTime":"2026-01-28T20:40:42Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 28 20:40:42 crc kubenswrapper[4746]: I0128 20:40:42.371885 4746 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-9zvm7" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"de4fb5f0-7c65-4fdb-8389-d2c8462e130b\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-28T20:40:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-28T20:40:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-28T20:40:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-28T20:40:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://972f4c0079d8e04365abd93dc8d63e88dea04f427dbbbcf5e331cdeeff8c855c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-de
v@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-28T20:40:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jvkqz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://d2340a59c423445dd2b32da57d43146dd1cc354b737ae1b4d1875a0d40f62188\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-28T20:40:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jvkqz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-28
T20:40:05Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-9zvm7\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-28T20:40:42Z is after 2025-08-24T17:21:41Z" Jan 28 20:40:42 crc kubenswrapper[4746]: I0128 20:40:42.392475 4746 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-28T20:39:52Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-28T20:39:52Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-28T20:40:42Z is after 2025-08-24T17:21:41Z" Jan 28 20:40:42 crc kubenswrapper[4746]: I0128 20:40:42.411980 4746 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-28T20:39:53Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-28T20:39:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-28T20:39:53Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5dff3e2075b8bbfb6af77c72362ccc37e1d7810a2d97c096cf501681c98699ab\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-28T20:39:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://772e165efa6d865a669c172958e0aea3112146637e354e4587b7e664454112ad\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af
0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-28T20:39:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-28T20:40:42Z is after 2025-08-24T17:21:41Z" Jan 28 20:40:42 crc kubenswrapper[4746]: I0128 20:40:42.425347 4746 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-gcrxx" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"1c3c93a1-7ddf-4339-9ca3-79f3753943b4\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-28T20:39:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-28T20:39:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-28T20:39:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-28T20:39:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5bd8d8214aa8e85e50747643b6354fb6e026866c926310fb2dd79760f916f468\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-28T20:39:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wn6hr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podI
Ps\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-28T20:39:52Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-gcrxx\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-28T20:40:42Z is after 2025-08-24T17:21:41Z" Jan 28 20:40:42 crc kubenswrapper[4746]: I0128 20:40:42.451209 4746 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-8vmvh" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"c4d15639-62fb-41b7-a1d4-6f51f3af6d99\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-28T20:39:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-28T20:39:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-28T20:39:53Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-28T20:39:53Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://2637734601bf1e43535b70bd62cc6aea7fc9f62d0d2a33c83602e7065a793be1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-28T20:39:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-kgrqs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://73e82be84afa7be760acda86ebf6c956d558dcbd4f37f049fba84e08c75fcd9f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-28T20:39:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\
\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-kgrqs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://17d09f7f46014da3713f76f111bca0a7ce8a24a154aa6144291d760d85e7d3ec\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-28T20:39:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-kgrqs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://b7ad895d0ff1ad570136d96671e1bde5c62882af50a407945ef0e7bc52727b88\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-28T20:39:54Z\\\"}},\\\"volumeMounts\\\":[{
\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-kgrqs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://af842ba33fdf06048038dd3e12c59f7a9d0867aa28acee1f2cc4571e7db28b88\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-28T20:39:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-kgrqs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://57a776a1daf1b17eb44d9fa09fe2f7ec01f647aa0a40d0d571736d2f7c2f8e4d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\"
,\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-28T20:39:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-kgrqs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://53be073387414da7f1a5b73960206c0a04b4ef439d6c51bd22ceefbb5d723b76\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://53be073387414da7f1a5b73960206c0a04b4ef439d6c51bd22ceefbb5d723b76\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2026-01-28T20:40:17Z\\\",\\\"message\\\":\\\"Built service openshift-console/console LB template configs for network=default: []services.lbConfig(nil)\\\\nI0128 20:40:17.344732 6409 model_client.go:382] Update operations generated as: [{Op:update Table:Logical_Switch_Port Row:map[addresses:{GoSet:[0a:58:0a:d9:00:5c 10.217.0.92]} 
options:{GoMap:map[iface-id-ver:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 requested-chassis:crc]} port_security:{GoSet:[0a:58:0a:d9:00:5c 10.217.0.92]}] Rows:[] Columns:[] Mutations:[] Timeout:\\\\u003cnil\\\\u003e Where:[where column _uuid == {c94130be-172c-477c-88c4-40cc7eba30fe}] Until: Durable:\\\\u003cnil\\\\u003e Comment:\\\\u003cnil\\\\u003e Lock:\\\\u003cnil\\\\u003e UUID: UUIDName:}]\\\\nI0128 20:40:17.344766 6409 services_controller.go:451] Built service openshift-console/console cluster-wide LB for network=default: []services.LB{services.LB{Name:\\\\\\\"Service_openshift-console/console_TCP_cluster\\\\\\\", UUID:\\\\\\\"\\\\\\\", Protocol:\\\\\\\"TCP\\\\\\\", ExternalIDs:map[string]string{\\\\\\\"k8s.ovn.org/kind\\\\\\\":\\\\\\\"Service\\\\\\\", \\\\\\\"k8s.ovn.org/owner\\\\\\\":\\\\\\\"openshift-console/console\\\\\\\"}, Opts:services.LBOpts{Reject:true, EmptyLBEvents:false, AffinityTimeOut:0, SkipSNAT:false, Template:false, AddressFamily:\\\\\\\"\\\\\\\"}, Rules:[]services.LBRule{services.LBRule{Source:services.Addr{IP:\\\\\\\"10.217.5.194\\\\\\\", Port:443, Template:(*services.Template)(nil)}, Targets:[]services.Addr{}}}, Templates:services.TemplateMap(nil), Switches:[]string{}, Routers:[]string{}, Groups:[]string{\\\\\\\"clusterLBGroup\\\\\\\"}}}\\\\nI0128 20\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-01-28T20:40:16Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":2,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 20s restarting failed container=ovnkube-controller 
pod=ovnkube-node-8vmvh_openshift-ovn-kubernetes(c4d15639-62fb-41b7-a1d4-6f51f3af6d99)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kube
rnetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-kgrqs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://cdd60109dba5f45418dd5bc29c649a50aae8479fa77fc3dc50ca4f7c3042e3fc\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-28T20:39:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-kgrqs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://45e59dfe364d511e17b8264a31157501de3ed23e7125d79979df83d328242328\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://45e59dfe364d511e17
b8264a31157501de3ed23e7125d79979df83d328242328\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-28T20:39:53Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-28T20:39:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-kgrqs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-28T20:39:53Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-8vmvh\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-28T20:40:42Z is after 2025-08-24T17:21:41Z" Jan 28 20:40:42 crc kubenswrapper[4746]: I0128 20:40:42.462467 4746 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 28 20:40:42 crc kubenswrapper[4746]: I0128 20:40:42.462538 4746 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 28 20:40:42 crc kubenswrapper[4746]: I0128 20:40:42.462552 4746 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 28 20:40:42 crc kubenswrapper[4746]: I0128 20:40:42.462574 4746 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 28 20:40:42 crc kubenswrapper[4746]: I0128 20:40:42.462590 4746 setters.go:603] "Node became not ready" node="crc" 
condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-28T20:40:42Z","lastTransitionTime":"2026-01-28T20:40:42Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 28 20:40:42 crc kubenswrapper[4746]: I0128 20:40:42.470381 4746 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"64499f0d-5c10-43a9-b479-9b6c7ef2fb9c\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-28T20:39:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-28T20:39:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-28T20:40:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-28T20:40:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-28T20:39:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://9a304b12b4a1e30cbbbb16aa4f91514f1b1a64c716cf8ac023803a279b310f9c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"r
unning\\\":{\\\"startedAt\\\":\\\"2026-01-28T20:39:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://6af41f85c56be6c24127bec0aee8e02562dbb04bd22e57c6887f6f5e914a7530\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-28T20:39:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://f31e9e80576ec4eb59ec92039601fd94bd323d29493c353f9ec6b9c61187b0d3\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-28T20:39:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://6
4f8ae1fcf8b14a65196ffa6d5a50db39aa846dc4e6cc0b92ba54d9b5f6913e3\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://56b250a192cfa698ccd7ac8233b75b9acbf2347549c712b47ccc1b89f144aa83\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-01-28T20:39:52Z\\\",\\\"message\\\":\\\"le observer\\\\nW0128 20:39:52.695680 1 builder.go:272] unable to get owner reference (falling back to namespace): pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI0128 20:39:52.695807 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0128 20:39:52.696767 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-2500116620/tls.crt::/tmp/serving-cert-2500116620/tls.key\\\\\\\"\\\\nI0128 20:39:52.900914 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI0128 20:39:52.915714 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI0128 20:39:52.915750 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI0128 20:39:52.916035 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI0128 20:39:52.916045 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI0128 20:39:52.925040 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW0128 20:39:52.925098 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0128 20:39:52.925104 1 secure_serving.go:69] Use of insecure cipher 
'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0128 20:39:52.925109 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW0128 20:39:52.925113 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW0128 20:39:52.925116 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW0128 20:39:52.925119 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nI0128 20:39:52.925416 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nF0128 20:39:52.926926 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-01-28T20:39:47Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":2,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-28T20:40:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://03a94dee0b9663441e0ecec16b121c5be4e5c557f17c929b8edfb0d7ba00c3d5\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-28T20:39:35Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{
\\\"containerID\\\":\\\"cri-o://c5808a729190035a73529ac58efb0790f2d9c3b98c5f1a8017389e4ca1738c4c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://c5808a729190035a73529ac58efb0790f2d9c3b98c5f1a8017389e4ca1738c4c\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-28T20:39:33Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-28T20:39:33Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-28T20:39:32Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-28T20:40:42Z is after 2025-08-24T17:21:41Z" Jan 28 20:40:42 crc kubenswrapper[4746]: I0128 20:40:42.487220 4746 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-28T20:39:54Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-28T20:39:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-28T20:39:54Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://2f64131d62eca6c3a42e9177d14fa39ed6435e536f60ef5d25ff6bf061d2cc0a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-28T20:39:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: 
x509: certificate has expired or is not yet valid: current time 2026-01-28T20:40:42Z is after 2025-08-24T17:21:41Z" Jan 28 20:40:42 crc kubenswrapper[4746]: I0128 20:40:42.503190 4746 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-28T20:39:57Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-28T20:39:57Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-28T20:39:57Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://83a275ec9868f2fd9e24135d5a05c27caf843340c144070952291bb6a3715035\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-28T20:39:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\
\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-28T20:40:42Z is after 2025-08-24T17:21:41Z" Jan 28 20:40:42 crc kubenswrapper[4746]: I0128 20:40:42.517248 4746 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-6wrnw" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"6dc8b546-9734-4082-b2b3-2bafe3f1564d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-28T20:39:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-28T20:39:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-28T20:39:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-28T20:39:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://50ec1ea913ec63a096032bd968bda5c5c8879e16a08e6a57727750517d9af038\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\
":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-28T20:39:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-z84f9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://66d250466155027405bf90c4e5ed09388238d4cee63604e28486a16778f9d188\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-28T20:39:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-z84f9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-28T20:39:53Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-6wrnw\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or 
is not yet valid: current time 2026-01-28T20:40:42Z is after 2025-08-24T17:21:41Z" Jan 28 20:40:42 crc kubenswrapper[4746]: I0128 20:40:42.531456 4746 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-ht6hp" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"beb2b795-6bf4-4d38-89f7-bcb5512c3e61\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-28T20:39:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-28T20:40:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-28T20:40:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-28T20:40:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://564d42c4c25f218c1a1f52449d2f3d941c1dad920c8100d1d908cadf7cf46607\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-28T20:40:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rr62b
\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b2d8bf207108ac2304cd04e34cd9843fcb755ac8610a3f88448538d1e99359f6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b2d8bf207108ac2304cd04e34cd9843fcb755ac8610a3f88448538d1e99359f6\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-28T20:39:54Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-28T20:39:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rr62b\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://40c78d39cb70eba245771f4d85d2c8bc7b77427ea216ad3b89749d24fd550098\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\"
:false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://40c78d39cb70eba245771f4d85d2c8bc7b77427ea216ad3b89749d24fd550098\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-28T20:39:55Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-28T20:39:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rr62b\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://498028ca0bd260ca2c4d7b0231ea54a5a6ff5b302f720fbd209a4f640507258b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://498028ca0bd260ca2c4d7b0231ea54a5a6ff5b302f720fbd209a4f640507258b\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-28T20:39:56Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-28T20:39:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-relea
se\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rr62b\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://661d4a9031b6789f38ed1ac8e3c40d72182709130e1680ecdad86c07c1f34b7b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://661d4a9031b6789f38ed1ac8e3c40d72182709130e1680ecdad86c07c1f34b7b\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-28T20:39:57Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-28T20:39:57Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rr62b\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://a49ffe0bbe4151671dc8803d7ab61cb9b3a2eae064b81e2830bc074787f4754e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-binco
py\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://a49ffe0bbe4151671dc8803d7ab61cb9b3a2eae064b81e2830bc074787f4754e\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-28T20:39:58Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-28T20:39:58Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rr62b\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://cd521e9ecb53f581b5732a4d018f92007063ce0912e6d760fe3c920218c23b14\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://cd521e9ecb53f581b5732a4d018f92007063ce0912e6d760fe3c920218c23b14\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-28T20:39:59Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-28T20:39:59Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rr62b\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"
Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-28T20:39:53Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-ht6hp\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-28T20:40:42Z is after 2025-08-24T17:21:41Z" Jan 28 20:40:42 crc kubenswrapper[4746]: I0128 20:40:42.544350 4746 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-qhpvf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"cdf26de0-b602-4bdf-b492-65b3b6b31434\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-28T20:39:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-28T20:39:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-28T20:40:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-28T20:40:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://9c88cd0b151f23677f83a944570f36a7a6edd30d9be47638a17cbb9098ec1c23\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/o
penshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://f80345dfd4de46e278611370e6c337968cb1ba7ed8275457deea5871704f8367\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2026-01-28T20:40:40Z\\\",\\\"message\\\":\\\"2026-01-28T20:39:55+00:00 [cnibincopy] Successfully copied files in /usr/src/multus-cni/rhel9/bin/ to /host/opt/cni/bin/upgrade_529a75ec-ce77-49ca-8f6a-9466be7c980e\\\\n2026-01-28T20:39:55+00:00 [cnibincopy] Successfully moved files in /host/opt/cni/bin/upgrade_529a75ec-ce77-49ca-8f6a-9466be7c980e to /host/opt/cni/bin/\\\\n2026-01-28T20:39:55Z [verbose] multus-daemon started\\\\n2026-01-28T20:39:55Z [verbose] Readiness Indicator file check\\\\n2026-01-28T20:40:40Z [error] have you checked that your default network is ready? still waiting for readinessindicatorfile @ /host/run/multus/cni/net.d/10-ovn-kubernetes.conf. pollimmediate error: timed out waiting for the 
condition\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-01-28T20:39:53Z\\\"}},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-28T20:40:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jnpsl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"p
hase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-28T20:39:53Z\\\"}}\" for pod \"openshift-multus\"/\"multus-qhpvf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-28T20:40:42Z is after 2025-08-24T17:21:41Z" Jan 28 20:40:42 crc kubenswrapper[4746]: I0128 20:40:42.555663 4746 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-2blg6" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"f60a5487-5012-4cc9-ad94-5dfb4957d74e\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-28T20:40:07Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-28T20:40:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-28T20:40:07Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-28T20:40:07Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-l2www\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-l2www\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-28T20:40:07Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-2blg6\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-28T20:40:42Z is after 2025-08-24T17:21:41Z" Jan 28 20:40:42 crc 
kubenswrapper[4746]: I0128 20:40:42.565119 4746 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 28 20:40:42 crc kubenswrapper[4746]: I0128 20:40:42.565148 4746 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 28 20:40:42 crc kubenswrapper[4746]: I0128 20:40:42.565157 4746 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 28 20:40:42 crc kubenswrapper[4746]: I0128 20:40:42.565172 4746 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 28 20:40:42 crc kubenswrapper[4746]: I0128 20:40:42.565181 4746 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-28T20:40:42Z","lastTransitionTime":"2026-01-28T20:40:42Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 28 20:40:42 crc kubenswrapper[4746]: I0128 20:40:42.569183 4746 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"950d71b3-9b51-43dc-814a-d6a838723f78\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-28T20:39:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-28T20:39:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-28T20:39:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-28T20:39:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-28T20:39:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0411ecf7c85f5397d751adb7d4905977dcbd3d2e169c56ebbe26017192765602\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-28T20:39:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://319cbcaa998
1b3c79eb0c8f970f0ad776fb255db10c72c9b6a13469906e9e5ec\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-28T20:39:33Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://eefda7da35cd9ff176512b5d1224ccadcb5debd0035a49822e82a9757d4a3f91\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-28T20:39:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://145f4f882924f06172df81d94fc9bd683ea1fc04889de5f6cfb5caece8227fd0\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:850
6ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-28T20:39:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-28T20:39:32Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-28T20:40:42Z is after 2025-08-24T17:21:41Z" Jan 28 20:40:42 crc kubenswrapper[4746]: I0128 20:40:42.582295 4746 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"3553d828-b1a0-4f51-9e70-5f4d25f3ee42\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-28T20:39:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-28T20:39:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-28T20:40:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-28T20:40:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-28T20:39:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://2ba6f8586593282902571354d53a7bdc0945ea8d1970cac1bfc2f8cc4019a4ab\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-28T20:39:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://00d75277c1a1e8fdc34e37a3ae3d697e002007456fde3dae5d49c1c932a0a7f5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha
256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-28T20:39:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://cf6d80884e00b1a12051a7a97148a46fba0e8514a1233a180262392c302db77f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-28T20:39:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a4a7e5a6315ccaaa100eb33d1a23bf2481f4ee8aa1ea076927dec54027d4c4cd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"wait-for-host-port\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"conta
inerID\\\":\\\"cri-o://a4a7e5a6315ccaaa100eb33d1a23bf2481f4ee8aa1ea076927dec54027d4c4cd\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-28T20:39:33Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-28T20:39:33Z\\\"}}}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-28T20:39:32Z\\\"}}\" for pod \"openshift-kube-scheduler\"/\"openshift-kube-scheduler-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-28T20:40:42Z is after 2025-08-24T17:21:41Z" Jan 28 20:40:42 crc kubenswrapper[4746]: I0128 20:40:42.592252 4746 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-d8rwq" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"405572d8-50d2-4d7a-adf0-d8d6adea31ca\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-28T20:39:57Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-28T20:39:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-28T20:39:57Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-28T20:39:57Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://eba
305a172c49420674e97d32590a7b2e4c7c2cd7406257508e0636cf42ea244\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-28T20:39:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mb4h4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-28T20:39:56Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-d8rwq\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-28T20:40:42Z is after 2025-08-24T17:21:41Z" Jan 28 20:40:42 crc kubenswrapper[4746]: I0128 20:40:42.667023 4746 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 28 20:40:42 crc kubenswrapper[4746]: I0128 20:40:42.667099 4746 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 28 20:40:42 crc kubenswrapper[4746]: I0128 20:40:42.667109 4746 kubelet_node_status.go:724] "Recording event 
message for node" node="crc" event="NodeHasSufficientPID" Jan 28 20:40:42 crc kubenswrapper[4746]: I0128 20:40:42.667121 4746 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 28 20:40:42 crc kubenswrapper[4746]: I0128 20:40:42.667131 4746 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-28T20:40:42Z","lastTransitionTime":"2026-01-28T20:40:42Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 28 20:40:42 crc kubenswrapper[4746]: I0128 20:40:42.769152 4746 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 28 20:40:42 crc kubenswrapper[4746]: I0128 20:40:42.769202 4746 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 28 20:40:42 crc kubenswrapper[4746]: I0128 20:40:42.769213 4746 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 28 20:40:42 crc kubenswrapper[4746]: I0128 20:40:42.769233 4746 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 28 20:40:42 crc kubenswrapper[4746]: I0128 20:40:42.769246 4746 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-28T20:40:42Z","lastTransitionTime":"2026-01-28T20:40:42Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 28 20:40:42 crc kubenswrapper[4746]: I0128 20:40:42.836391 4746 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Jan 28 20:40:42 crc kubenswrapper[4746]: I0128 20:40:42.836441 4746 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Jan 28 20:40:42 crc kubenswrapper[4746]: I0128 20:40:42.836472 4746 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-2blg6" Jan 28 20:40:42 crc kubenswrapper[4746]: E0128 20:40:42.836581 4746 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Jan 28 20:40:42 crc kubenswrapper[4746]: E0128 20:40:42.836839 4746 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Jan 28 20:40:42 crc kubenswrapper[4746]: E0128 20:40:42.837013 4746 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-2blg6" podUID="f60a5487-5012-4cc9-ad94-5dfb4957d74e" Jan 28 20:40:42 crc kubenswrapper[4746]: I0128 20:40:42.849438 4746 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2025-11-29 18:28:18.027338531 +0000 UTC Jan 28 20:40:42 crc kubenswrapper[4746]: I0128 20:40:42.851336 4746 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-28T20:39:52Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-28T20:39:52Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-28T20:40:42Z is after 2025-08-24T17:21:41Z" Jan 28 20:40:42 crc kubenswrapper[4746]: I0128 20:40:42.864801 4746 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-28T20:39:52Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-28T20:39:52Z\\\",\\\"message\\\":\\\"containers with unready status: 
[network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-28T20:40:42Z is after 2025-08-24T17:21:41Z" Jan 28 20:40:42 crc kubenswrapper[4746]: I0128 20:40:42.871943 4746 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 28 20:40:42 crc kubenswrapper[4746]: I0128 20:40:42.871978 4746 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 28 20:40:42 crc kubenswrapper[4746]: I0128 20:40:42.871986 4746 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 28 20:40:42 crc 
kubenswrapper[4746]: I0128 20:40:42.872001 4746 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 28 20:40:42 crc kubenswrapper[4746]: I0128 20:40:42.872010 4746 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-28T20:40:42Z","lastTransitionTime":"2026-01-28T20:40:42Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 28 20:40:42 crc kubenswrapper[4746]: I0128 20:40:42.879425 4746 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-9zvm7" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"de4fb5f0-7c65-4fdb-8389-d2c8462e130b\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-28T20:40:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-28T20:40:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-28T20:40:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-28T20:40:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://972f4c0079d8e04365abd93dc8d63e88dea04f427dbbbcf5e331cdeeff8c855c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-de
v@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-28T20:40:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jvkqz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://d2340a59c423445dd2b32da57d43146dd1cc354b737ae1b4d1875a0d40f62188\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-28T20:40:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jvkqz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-28
T20:40:05Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-9zvm7\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-28T20:40:42Z is after 2025-08-24T17:21:41Z" Jan 28 20:40:42 crc kubenswrapper[4746]: I0128 20:40:42.898948 4746 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-28T20:39:52Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-28T20:39:52Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-28T20:40:42Z is after 2025-08-24T17:21:41Z" Jan 28 20:40:42 crc kubenswrapper[4746]: I0128 20:40:42.915281 4746 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-28T20:39:53Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-28T20:39:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-28T20:39:53Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5dff3e2075b8bbfb6af77c72362ccc37e1d7810a2d97c096cf501681c98699ab\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-28T20:39:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://772e165efa6d865a669c172958e0aea3112146637e354e4587b7e664454112ad\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af
0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-28T20:39:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-28T20:40:42Z is after 2025-08-24T17:21:41Z" Jan 28 20:40:42 crc kubenswrapper[4746]: I0128 20:40:42.928543 4746 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-gcrxx" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"1c3c93a1-7ddf-4339-9ca3-79f3753943b4\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-28T20:39:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-28T20:39:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-28T20:39:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-28T20:39:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5bd8d8214aa8e85e50747643b6354fb6e026866c926310fb2dd79760f916f468\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-28T20:39:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wn6hr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podI
Ps\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-28T20:39:52Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-gcrxx\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-28T20:40:42Z is after 2025-08-24T17:21:41Z" Jan 28 20:40:42 crc kubenswrapper[4746]: I0128 20:40:42.948863 4746 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-8vmvh" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"c4d15639-62fb-41b7-a1d4-6f51f3af6d99\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-28T20:39:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-28T20:39:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-28T20:39:53Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-28T20:39:53Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://2637734601bf1e43535b70bd62cc6aea7fc9f62d0d2a33c83602e7065a793be1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-28T20:39:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-kgrqs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://73e82be84afa7be760acda86ebf6c956d558dcbd4f37f049fba84e08c75fcd9f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-28T20:39:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\
\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-kgrqs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://17d09f7f46014da3713f76f111bca0a7ce8a24a154aa6144291d760d85e7d3ec\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-28T20:39:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-kgrqs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://b7ad895d0ff1ad570136d96671e1bde5c62882af50a407945ef0e7bc52727b88\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-28T20:39:54Z\\\"}},\\\"volumeMounts\\\":[{
\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-kgrqs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://af842ba33fdf06048038dd3e12c59f7a9d0867aa28acee1f2cc4571e7db28b88\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-28T20:39:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-kgrqs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://57a776a1daf1b17eb44d9fa09fe2f7ec01f647aa0a40d0d571736d2f7c2f8e4d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\"
,\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-28T20:39:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-kgrqs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://53be073387414da7f1a5b73960206c0a04b4ef439d6c51bd22ceefbb5d723b76\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://53be073387414da7f1a5b73960206c0a04b4ef439d6c51bd22ceefbb5d723b76\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2026-01-28T20:40:17Z\\\",\\\"message\\\":\\\"Built service openshift-console/console LB template configs for network=default: []services.lbConfig(nil)\\\\nI0128 20:40:17.344732 6409 model_client.go:382] Update operations generated as: [{Op:update Table:Logical_Switch_Port Row:map[addresses:{GoSet:[0a:58:0a:d9:00:5c 10.217.0.92]} 
options:{GoMap:map[iface-id-ver:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 requested-chassis:crc]} port_security:{GoSet:[0a:58:0a:d9:00:5c 10.217.0.92]}] Rows:[] Columns:[] Mutations:[] Timeout:\\\\u003cnil\\\\u003e Where:[where column _uuid == {c94130be-172c-477c-88c4-40cc7eba30fe}] Until: Durable:\\\\u003cnil\\\\u003e Comment:\\\\u003cnil\\\\u003e Lock:\\\\u003cnil\\\\u003e UUID: UUIDName:}]\\\\nI0128 20:40:17.344766 6409 services_controller.go:451] Built service openshift-console/console cluster-wide LB for network=default: []services.LB{services.LB{Name:\\\\\\\"Service_openshift-console/console_TCP_cluster\\\\\\\", UUID:\\\\\\\"\\\\\\\", Protocol:\\\\\\\"TCP\\\\\\\", ExternalIDs:map[string]string{\\\\\\\"k8s.ovn.org/kind\\\\\\\":\\\\\\\"Service\\\\\\\", \\\\\\\"k8s.ovn.org/owner\\\\\\\":\\\\\\\"openshift-console/console\\\\\\\"}, Opts:services.LBOpts{Reject:true, EmptyLBEvents:false, AffinityTimeOut:0, SkipSNAT:false, Template:false, AddressFamily:\\\\\\\"\\\\\\\"}, Rules:[]services.LBRule{services.LBRule{Source:services.Addr{IP:\\\\\\\"10.217.5.194\\\\\\\", Port:443, Template:(*services.Template)(nil)}, Targets:[]services.Addr{}}}, Templates:services.TemplateMap(nil), Switches:[]string{}, Routers:[]string{}, Groups:[]string{\\\\\\\"clusterLBGroup\\\\\\\"}}}\\\\nI0128 20\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-01-28T20:40:16Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":2,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 20s restarting failed container=ovnkube-controller 
pod=ovnkube-node-8vmvh_openshift-ovn-kubernetes(c4d15639-62fb-41b7-a1d4-6f51f3af6d99)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kube
rnetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-kgrqs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://cdd60109dba5f45418dd5bc29c649a50aae8479fa77fc3dc50ca4f7c3042e3fc\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-28T20:39:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-kgrqs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://45e59dfe364d511e17b8264a31157501de3ed23e7125d79979df83d328242328\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://45e59dfe364d511e17
b8264a31157501de3ed23e7125d79979df83d328242328\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-28T20:39:53Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-28T20:39:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-kgrqs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-28T20:39:53Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-8vmvh\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-28T20:40:42Z is after 2025-08-24T17:21:41Z" Jan 28 20:40:42 crc kubenswrapper[4746]: I0128 20:40:42.962142 4746 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-qhpvf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"cdf26de0-b602-4bdf-b492-65b3b6b31434\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-28T20:39:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-28T20:39:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-28T20:40:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-28T20:40:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://9c88cd0b151f23677f83a944570f36a7a6edd30d9be47638a17cbb9098ec1c23\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://f80345dfd4de46e278611370e6c337968cb1ba7ed8275457deea5871704f8367\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2026-01-28T20:40:40Z\\\",\\\"message\\\":\\\"2026-01-28T20:39:55+00:00 [cnibincopy] Successfully copied files in /usr/src/multus-cni/rhel9/bin/ to /host/opt/cni/bin/upgrade_529a75ec-ce77-49ca-8f6a-9466be7c980e\\\\n2026-01-28T20:39:55+00:00 [cnibincopy] Successfully moved files in /host/opt/cni/bin/upgrade_529a75ec-ce77-49ca-8f6a-9466be7c980e to /host/opt/cni/bin/\\\\n2026-01-28T20:39:55Z [verbose] multus-daemon started\\\\n2026-01-28T20:39:55Z [verbose] 
Readiness Indicator file check\\\\n2026-01-28T20:40:40Z [error] have you checked that your default network is ready? still waiting for readinessindicatorfile @ /host/run/multus/cni/net.d/10-ovn-kubernetes.conf. pollimmediate error: timed out waiting for the condition\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-01-28T20:39:53Z\\\"}},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-28T20:40:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/
var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jnpsl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-28T20:39:53Z\\\"}}\" for pod \"openshift-multus\"/\"multus-qhpvf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-28T20:40:42Z is after 2025-08-24T17:21:41Z" Jan 28 20:40:42 crc kubenswrapper[4746]: I0128 20:40:42.973371 4746 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-2blg6" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"f60a5487-5012-4cc9-ad94-5dfb4957d74e\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-28T20:40:07Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-28T20:40:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-28T20:40:07Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-28T20:40:07Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-l2www\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-l2www\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-28T20:40:07Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-2blg6\": Internal error occurred: failed calling webhook 
\"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-28T20:40:42Z is after 2025-08-24T17:21:41Z" Jan 28 20:40:42 crc kubenswrapper[4746]: I0128 20:40:42.974037 4746 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 28 20:40:42 crc kubenswrapper[4746]: I0128 20:40:42.974091 4746 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 28 20:40:42 crc kubenswrapper[4746]: I0128 20:40:42.974100 4746 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 28 20:40:42 crc kubenswrapper[4746]: I0128 20:40:42.974114 4746 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 28 20:40:42 crc kubenswrapper[4746]: I0128 20:40:42.974124 4746 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-28T20:40:42Z","lastTransitionTime":"2026-01-28T20:40:42Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 28 20:40:42 crc kubenswrapper[4746]: I0128 20:40:42.988364 4746 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"64499f0d-5c10-43a9-b479-9b6c7ef2fb9c\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-28T20:39:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-28T20:39:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-28T20:40:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-28T20:40:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-28T20:39:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://9a304b12b4a1e30cbbbb16aa4f91514f1b1a64c716cf8ac023803a279b310f9c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-28T20:39:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-d
ir\\\"}]},{\\\"containerID\\\":\\\"cri-o://6af41f85c56be6c24127bec0aee8e02562dbb04bd22e57c6887f6f5e914a7530\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-28T20:39:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://f31e9e80576ec4eb59ec92039601fd94bd323d29493c353f9ec6b9c61187b0d3\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-28T20:39:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://64f8ae1fcf8b14a65196ffa6d5a50db39aa846dc4e6cc0b92ba54d9b5f6913e3\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945
c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://56b250a192cfa698ccd7ac8233b75b9acbf2347549c712b47ccc1b89f144aa83\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-01-28T20:39:52Z\\\",\\\"message\\\":\\\"le observer\\\\nW0128 20:39:52.695680 1 builder.go:272] unable to get owner reference (falling back to namespace): pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI0128 20:39:52.695807 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0128 20:39:52.696767 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-2500116620/tls.crt::/tmp/serving-cert-2500116620/tls.key\\\\\\\"\\\\nI0128 20:39:52.900914 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI0128 20:39:52.915714 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI0128 20:39:52.915750 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI0128 20:39:52.916035 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI0128 20:39:52.916045 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI0128 20:39:52.925040 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW0128 20:39:52.925098 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0128 20:39:52.925104 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0128 20:39:52.925109 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW0128 20:39:52.925113 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW0128 20:39:52.925116 1 secure_serving.go:69] Use of insecure cipher 
'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW0128 20:39:52.925119 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nI0128 20:39:52.925416 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nF0128 20:39:52.926926 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-01-28T20:39:47Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":2,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-28T20:40:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://03a94dee0b9663441e0ecec16b121c5be4e5c557f17c929b8edfb0d7ba00c3d5\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-28T20:39:35Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c5808a729190035a73529ac58efb0790f2d9c3b98c5f1a8017389e4ca1738c4c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b33
5e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://c5808a729190035a73529ac58efb0790f2d9c3b98c5f1a8017389e4ca1738c4c\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-28T20:39:33Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-28T20:39:33Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-28T20:39:32Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-28T20:40:42Z is after 2025-08-24T17:21:41Z" Jan 28 20:40:43 crc kubenswrapper[4746]: I0128 20:40:43.000961 4746 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-28T20:39:54Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-28T20:39:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-28T20:39:54Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://2f64131d62eca6c3a42e9177d14fa39ed6435e536f60ef5d25ff6bf061d2cc0a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-28T20:39:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: 
x509: certificate has expired or is not yet valid: current time 2026-01-28T20:40:42Z is after 2025-08-24T17:21:41Z" Jan 28 20:40:43 crc kubenswrapper[4746]: I0128 20:40:43.013883 4746 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-28T20:39:57Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-28T20:39:57Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-28T20:39:57Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://83a275ec9868f2fd9e24135d5a05c27caf843340c144070952291bb6a3715035\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-28T20:39:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\
\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-28T20:40:43Z is after 2025-08-24T17:21:41Z" Jan 28 20:40:43 crc kubenswrapper[4746]: I0128 20:40:43.023606 4746 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-6wrnw" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"6dc8b546-9734-4082-b2b3-2bafe3f1564d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-28T20:39:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-28T20:39:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-28T20:39:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-28T20:39:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://50ec1ea913ec63a096032bd968bda5c5c8879e16a08e6a57727750517d9af038\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\
":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-28T20:39:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-z84f9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://66d250466155027405bf90c4e5ed09388238d4cee63604e28486a16778f9d188\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-28T20:39:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-z84f9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-28T20:39:53Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-6wrnw\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or 
is not yet valid: current time 2026-01-28T20:40:43Z is after 2025-08-24T17:21:41Z" Jan 28 20:40:43 crc kubenswrapper[4746]: I0128 20:40:43.037064 4746 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-ht6hp" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"beb2b795-6bf4-4d38-89f7-bcb5512c3e61\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-28T20:39:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-28T20:40:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-28T20:40:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-28T20:40:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://564d42c4c25f218c1a1f52449d2f3d941c1dad920c8100d1d908cadf7cf46607\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-28T20:40:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rr62b
\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b2d8bf207108ac2304cd04e34cd9843fcb755ac8610a3f88448538d1e99359f6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b2d8bf207108ac2304cd04e34cd9843fcb755ac8610a3f88448538d1e99359f6\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-28T20:39:54Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-28T20:39:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rr62b\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://40c78d39cb70eba245771f4d85d2c8bc7b77427ea216ad3b89749d24fd550098\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\"
:false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://40c78d39cb70eba245771f4d85d2c8bc7b77427ea216ad3b89749d24fd550098\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-28T20:39:55Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-28T20:39:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rr62b\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://498028ca0bd260ca2c4d7b0231ea54a5a6ff5b302f720fbd209a4f640507258b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://498028ca0bd260ca2c4d7b0231ea54a5a6ff5b302f720fbd209a4f640507258b\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-28T20:39:56Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-28T20:39:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-relea
se\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rr62b\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://661d4a9031b6789f38ed1ac8e3c40d72182709130e1680ecdad86c07c1f34b7b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://661d4a9031b6789f38ed1ac8e3c40d72182709130e1680ecdad86c07c1f34b7b\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-28T20:39:57Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-28T20:39:57Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rr62b\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://a49ffe0bbe4151671dc8803d7ab61cb9b3a2eae064b81e2830bc074787f4754e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-binco
py\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://a49ffe0bbe4151671dc8803d7ab61cb9b3a2eae064b81e2830bc074787f4754e\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-28T20:39:58Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-28T20:39:58Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rr62b\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://cd521e9ecb53f581b5732a4d018f92007063ce0912e6d760fe3c920218c23b14\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://cd521e9ecb53f581b5732a4d018f92007063ce0912e6d760fe3c920218c23b14\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-28T20:39:59Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-28T20:39:59Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rr62b\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"
Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-28T20:39:53Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-ht6hp\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-28T20:40:43Z is after 2025-08-24T17:21:41Z" Jan 28 20:40:43 crc kubenswrapper[4746]: I0128 20:40:43.048808 4746 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"950d71b3-9b51-43dc-814a-d6a838723f78\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-28T20:39:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-28T20:39:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-28T20:39:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-28T20:39:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-28T20:39:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0411ecf7c85f5397d751adb7d4905977dcbd3d2e169c56ebbe26017192765602\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0
f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-28T20:39:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://319cbcaa9981b3c79eb0c8f970f0ad776fb255db10c72c9b6a13469906e9e5ec\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-28T20:39:33Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://eefda7da35cd9ff176512b5d1224ccadcb5debd0035a49822e82a9757d4a3f91\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-28T20:39:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\
\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://145f4f882924f06172df81d94fc9bd683ea1fc04889de5f6cfb5caece8227fd0\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-28T20:39:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-28T20:39:32Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-28T20:40:43Z is after 2025-08-24T17:21:41Z" Jan 28 20:40:43 crc kubenswrapper[4746]: I0128 20:40:43.059994 4746 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"3553d828-b1a0-4f51-9e70-5f4d25f3ee42\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-28T20:39:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-28T20:39:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-28T20:40:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-28T20:40:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-28T20:39:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://2ba6f8586593282902571354d53a7bdc0945ea8d1970cac1bfc2f8cc4019a4ab\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-28T20:39:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://00d75277c1a1e8fdc34e37a3ae3d697e002007456fde3dae5d49c1c932a0a7f5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha
256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-28T20:39:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://cf6d80884e00b1a12051a7a97148a46fba0e8514a1233a180262392c302db77f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-28T20:39:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a4a7e5a6315ccaaa100eb33d1a23bf2481f4ee8aa1ea076927dec54027d4c4cd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"wait-for-host-port\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"conta
inerID\\\":\\\"cri-o://a4a7e5a6315ccaaa100eb33d1a23bf2481f4ee8aa1ea076927dec54027d4c4cd\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-28T20:39:33Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-28T20:39:33Z\\\"}}}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-28T20:39:32Z\\\"}}\" for pod \"openshift-kube-scheduler\"/\"openshift-kube-scheduler-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-28T20:40:43Z is after 2025-08-24T17:21:41Z" Jan 28 20:40:43 crc kubenswrapper[4746]: I0128 20:40:43.069690 4746 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-d8rwq" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"405572d8-50d2-4d7a-adf0-d8d6adea31ca\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-28T20:39:57Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-28T20:39:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-28T20:39:57Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-28T20:39:57Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://eba
305a172c49420674e97d32590a7b2e4c7c2cd7406257508e0636cf42ea244\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-28T20:39:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mb4h4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-28T20:39:56Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-d8rwq\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-28T20:40:43Z is after 2025-08-24T17:21:41Z" Jan 28 20:40:43 crc kubenswrapper[4746]: I0128 20:40:43.076775 4746 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 28 20:40:43 crc kubenswrapper[4746]: I0128 20:40:43.076855 4746 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 28 20:40:43 crc kubenswrapper[4746]: I0128 20:40:43.076876 4746 kubelet_node_status.go:724] "Recording event 
message for node" node="crc" event="NodeHasSufficientPID" Jan 28 20:40:43 crc kubenswrapper[4746]: I0128 20:40:43.076905 4746 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 28 20:40:43 crc kubenswrapper[4746]: I0128 20:40:43.076934 4746 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-28T20:40:43Z","lastTransitionTime":"2026-01-28T20:40:43Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 28 20:40:43 crc kubenswrapper[4746]: I0128 20:40:43.178711 4746 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 28 20:40:43 crc kubenswrapper[4746]: I0128 20:40:43.178760 4746 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 28 20:40:43 crc kubenswrapper[4746]: I0128 20:40:43.178773 4746 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 28 20:40:43 crc kubenswrapper[4746]: I0128 20:40:43.178791 4746 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 28 20:40:43 crc kubenswrapper[4746]: I0128 20:40:43.178805 4746 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-28T20:40:43Z","lastTransitionTime":"2026-01-28T20:40:43Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 28 20:40:43 crc kubenswrapper[4746]: I0128 20:40:43.281640 4746 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 28 20:40:43 crc kubenswrapper[4746]: I0128 20:40:43.281681 4746 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 28 20:40:43 crc kubenswrapper[4746]: I0128 20:40:43.281691 4746 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 28 20:40:43 crc kubenswrapper[4746]: I0128 20:40:43.281710 4746 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 28 20:40:43 crc kubenswrapper[4746]: I0128 20:40:43.281722 4746 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-28T20:40:43Z","lastTransitionTime":"2026-01-28T20:40:43Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 28 20:40:43 crc kubenswrapper[4746]: I0128 20:40:43.384027 4746 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 28 20:40:43 crc kubenswrapper[4746]: I0128 20:40:43.384099 4746 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 28 20:40:43 crc kubenswrapper[4746]: I0128 20:40:43.384108 4746 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 28 20:40:43 crc kubenswrapper[4746]: I0128 20:40:43.384122 4746 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 28 20:40:43 crc kubenswrapper[4746]: I0128 20:40:43.384135 4746 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-28T20:40:43Z","lastTransitionTime":"2026-01-28T20:40:43Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 28 20:40:43 crc kubenswrapper[4746]: I0128 20:40:43.486587 4746 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 28 20:40:43 crc kubenswrapper[4746]: I0128 20:40:43.486629 4746 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 28 20:40:43 crc kubenswrapper[4746]: I0128 20:40:43.486640 4746 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 28 20:40:43 crc kubenswrapper[4746]: I0128 20:40:43.486658 4746 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 28 20:40:43 crc kubenswrapper[4746]: I0128 20:40:43.486671 4746 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-28T20:40:43Z","lastTransitionTime":"2026-01-28T20:40:43Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 28 20:40:43 crc kubenswrapper[4746]: I0128 20:40:43.589419 4746 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 28 20:40:43 crc kubenswrapper[4746]: I0128 20:40:43.589475 4746 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 28 20:40:43 crc kubenswrapper[4746]: I0128 20:40:43.589490 4746 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 28 20:40:43 crc kubenswrapper[4746]: I0128 20:40:43.589511 4746 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 28 20:40:43 crc kubenswrapper[4746]: I0128 20:40:43.589527 4746 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-28T20:40:43Z","lastTransitionTime":"2026-01-28T20:40:43Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 28 20:40:43 crc kubenswrapper[4746]: I0128 20:40:43.692099 4746 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 28 20:40:43 crc kubenswrapper[4746]: I0128 20:40:43.692145 4746 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 28 20:40:43 crc kubenswrapper[4746]: I0128 20:40:43.692157 4746 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 28 20:40:43 crc kubenswrapper[4746]: I0128 20:40:43.692173 4746 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 28 20:40:43 crc kubenswrapper[4746]: I0128 20:40:43.692184 4746 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-28T20:40:43Z","lastTransitionTime":"2026-01-28T20:40:43Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 28 20:40:43 crc kubenswrapper[4746]: I0128 20:40:43.795427 4746 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 28 20:40:43 crc kubenswrapper[4746]: I0128 20:40:43.795476 4746 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 28 20:40:43 crc kubenswrapper[4746]: I0128 20:40:43.795488 4746 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 28 20:40:43 crc kubenswrapper[4746]: I0128 20:40:43.795506 4746 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 28 20:40:43 crc kubenswrapper[4746]: I0128 20:40:43.795516 4746 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-28T20:40:43Z","lastTransitionTime":"2026-01-28T20:40:43Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 28 20:40:43 crc kubenswrapper[4746]: I0128 20:40:43.835348 4746 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Jan 28 20:40:43 crc kubenswrapper[4746]: E0128 20:40:43.835627 4746 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Jan 28 20:40:43 crc kubenswrapper[4746]: I0128 20:40:43.836588 4746 scope.go:117] "RemoveContainer" containerID="53be073387414da7f1a5b73960206c0a04b4ef439d6c51bd22ceefbb5d723b76" Jan 28 20:40:43 crc kubenswrapper[4746]: I0128 20:40:43.850071 4746 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2025-11-26 01:30:57.31182851 +0000 UTC Jan 28 20:40:43 crc kubenswrapper[4746]: I0128 20:40:43.898738 4746 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 28 20:40:43 crc kubenswrapper[4746]: I0128 20:40:43.898829 4746 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 28 20:40:43 crc kubenswrapper[4746]: I0128 20:40:43.898842 4746 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 28 20:40:43 crc kubenswrapper[4746]: I0128 20:40:43.898867 4746 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 28 20:40:43 crc kubenswrapper[4746]: I0128 20:40:43.898883 4746 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-28T20:40:43Z","lastTransitionTime":"2026-01-28T20:40:43Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 28 20:40:44 crc kubenswrapper[4746]: I0128 20:40:44.002546 4746 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 28 20:40:44 crc kubenswrapper[4746]: I0128 20:40:44.002955 4746 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 28 20:40:44 crc kubenswrapper[4746]: I0128 20:40:44.002965 4746 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 28 20:40:44 crc kubenswrapper[4746]: I0128 20:40:44.002984 4746 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 28 20:40:44 crc kubenswrapper[4746]: I0128 20:40:44.002994 4746 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-28T20:40:44Z","lastTransitionTime":"2026-01-28T20:40:44Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 28 20:40:44 crc kubenswrapper[4746]: I0128 20:40:44.105237 4746 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 28 20:40:44 crc kubenswrapper[4746]: I0128 20:40:44.105286 4746 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 28 20:40:44 crc kubenswrapper[4746]: I0128 20:40:44.105298 4746 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 28 20:40:44 crc kubenswrapper[4746]: I0128 20:40:44.105317 4746 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 28 20:40:44 crc kubenswrapper[4746]: I0128 20:40:44.105328 4746 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-28T20:40:44Z","lastTransitionTime":"2026-01-28T20:40:44Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 28 20:40:44 crc kubenswrapper[4746]: I0128 20:40:44.214705 4746 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 28 20:40:44 crc kubenswrapper[4746]: I0128 20:40:44.214750 4746 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 28 20:40:44 crc kubenswrapper[4746]: I0128 20:40:44.214761 4746 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 28 20:40:44 crc kubenswrapper[4746]: I0128 20:40:44.214777 4746 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 28 20:40:44 crc kubenswrapper[4746]: I0128 20:40:44.214789 4746 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-28T20:40:44Z","lastTransitionTime":"2026-01-28T20:40:44Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 28 20:40:44 crc kubenswrapper[4746]: I0128 20:40:44.317999 4746 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 28 20:40:44 crc kubenswrapper[4746]: I0128 20:40:44.318044 4746 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 28 20:40:44 crc kubenswrapper[4746]: I0128 20:40:44.318053 4746 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 28 20:40:44 crc kubenswrapper[4746]: I0128 20:40:44.318069 4746 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 28 20:40:44 crc kubenswrapper[4746]: I0128 20:40:44.318094 4746 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-28T20:40:44Z","lastTransitionTime":"2026-01-28T20:40:44Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 28 20:40:44 crc kubenswrapper[4746]: I0128 20:40:44.324457 4746 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-8vmvh_c4d15639-62fb-41b7-a1d4-6f51f3af6d99/ovnkube-controller/2.log" Jan 28 20:40:44 crc kubenswrapper[4746]: I0128 20:40:44.327119 4746 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-8vmvh" event={"ID":"c4d15639-62fb-41b7-a1d4-6f51f3af6d99","Type":"ContainerStarted","Data":"dac4fcde6d3cb0d62e92dd34133b79787bdd8338dcee58772ed7584830a7cb2c"} Jan 28 20:40:44 crc kubenswrapper[4746]: I0128 20:40:44.327636 4746 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-ovn-kubernetes/ovnkube-node-8vmvh" Jan 28 20:40:44 crc kubenswrapper[4746]: I0128 20:40:44.345650 4746 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-28T20:39:52Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-28T20:39:52Z\\\",\\\"message\\\":\\\"containers with unready status: 
[networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-28T20:40:44Z is after 2025-08-24T17:21:41Z" Jan 28 20:40:44 crc kubenswrapper[4746]: I0128 20:40:44.363019 4746 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-28T20:39:53Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-28T20:39:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-28T20:39:53Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5dff3e2075b8bbfb6af77c72362ccc37e1d7810a2d97c096cf501681c98699ab\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-28T20:39:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://772e165efa6d865a669c172958e0aea3112146637e354e4587b7e664454112ad\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af
0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-28T20:39:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-28T20:40:44Z is after 2025-08-24T17:21:41Z" Jan 28 20:40:44 crc kubenswrapper[4746]: I0128 20:40:44.374950 4746 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-gcrxx" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"1c3c93a1-7ddf-4339-9ca3-79f3753943b4\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-28T20:39:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-28T20:39:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-28T20:39:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-28T20:39:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5bd8d8214aa8e85e50747643b6354fb6e026866c926310fb2dd79760f916f468\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-28T20:39:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wn6hr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podI
Ps\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-28T20:39:52Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-gcrxx\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-28T20:40:44Z is after 2025-08-24T17:21:41Z" Jan 28 20:40:44 crc kubenswrapper[4746]: I0128 20:40:44.396842 4746 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-8vmvh" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"c4d15639-62fb-41b7-a1d4-6f51f3af6d99\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-28T20:39:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-28T20:39:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-28T20:39:53Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-28T20:39:53Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://2637734601bf1e43535b70bd62cc6aea7fc9f62d0d2a33c83602e7065a793be1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-28T20:39:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-kgrqs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://73e82be84afa7be760acda86ebf6c956d558dcbd4f37f049fba84e08c75fcd9f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-28T20:39:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\
\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-kgrqs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://17d09f7f46014da3713f76f111bca0a7ce8a24a154aa6144291d760d85e7d3ec\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-28T20:39:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-kgrqs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://b7ad895d0ff1ad570136d96671e1bde5c62882af50a407945ef0e7bc52727b88\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-28T20:39:54Z\\\"}},\\\"volumeMounts\\\":[{
\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-kgrqs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://af842ba33fdf06048038dd3e12c59f7a9d0867aa28acee1f2cc4571e7db28b88\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-28T20:39:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-kgrqs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://57a776a1daf1b17eb44d9fa09fe2f7ec01f647aa0a40d0d571736d2f7c2f8e4d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\"
,\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-28T20:39:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-kgrqs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://dac4fcde6d3cb0d62e92dd34133b79787bdd8338dcee58772ed7584830a7cb2c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://53be073387414da7f1a5b73960206c0a04b4ef439d6c51bd22ceefbb5d723b76\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2026-01-28T20:40:17Z\\\",\\\"message\\\":\\\"Built service openshift-console/console LB template configs for network=default: []services.lbConfig(nil)\\\\nI0128 20:40:17.344732 6409 model_client.go:382] Update operations generated as: [{Op:update Table:Logical_Switch_Port Row:map[addresses:{GoSet:[0a:58:0a:d9:00:5c 10.217.0.92]} 
options:{GoMap:map[iface-id-ver:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 requested-chassis:crc]} port_security:{GoSet:[0a:58:0a:d9:00:5c 10.217.0.92]}] Rows:[] Columns:[] Mutations:[] Timeout:\\\\u003cnil\\\\u003e Where:[where column _uuid == {c94130be-172c-477c-88c4-40cc7eba30fe}] Until: Durable:\\\\u003cnil\\\\u003e Comment:\\\\u003cnil\\\\u003e Lock:\\\\u003cnil\\\\u003e UUID: UUIDName:}]\\\\nI0128 20:40:17.344766 6409 services_controller.go:451] Built service openshift-console/console cluster-wide LB for network=default: []services.LB{services.LB{Name:\\\\\\\"Service_openshift-console/console_TCP_cluster\\\\\\\", UUID:\\\\\\\"\\\\\\\", Protocol:\\\\\\\"TCP\\\\\\\", ExternalIDs:map[string]string{\\\\\\\"k8s.ovn.org/kind\\\\\\\":\\\\\\\"Service\\\\\\\", \\\\\\\"k8s.ovn.org/owner\\\\\\\":\\\\\\\"openshift-console/console\\\\\\\"}, Opts:services.LBOpts{Reject:true, EmptyLBEvents:false, AffinityTimeOut:0, SkipSNAT:false, Template:false, AddressFamily:\\\\\\\"\\\\\\\"}, Rules:[]services.LBRule{services.LBRule{Source:services.Addr{IP:\\\\\\\"10.217.5.194\\\\\\\", Port:443, Template:(*services.Template)(nil)}, Targets:[]services.Addr{}}}, Templates:services.TemplateMap(nil), Switches:[]string{}, Routers:[]string{}, Groups:[]string{\\\\\\\"clusterLBGroup\\\\\\\"}}}\\\\nI0128 
20\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-01-28T20:40:16Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-28T20:40:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\
":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-kgrqs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://cdd60109dba5f45418dd5bc29c649a50aae8479fa77fc3dc50ca4f7c3042e3fc\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-28T20:39:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-kgrqs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://45e59dfe364d511e17b8264a31157501de3ed23e7125d79979df83d328242328\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\
\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://45e59dfe364d511e17b8264a31157501de3ed23e7125d79979df83d328242328\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-28T20:39:53Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-28T20:39:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-kgrqs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-28T20:39:53Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-8vmvh\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-28T20:40:44Z is after 2025-08-24T17:21:41Z" Jan 28 20:40:44 crc kubenswrapper[4746]: I0128 20:40:44.410663 4746 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-28T20:39:54Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-28T20:39:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-28T20:39:54Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://2f64131d62eca6c3a42e9177d14fa39ed6435e536f60ef5d25ff6bf061d2cc0a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-28T20:39:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: 
x509: certificate has expired or is not yet valid: current time 2026-01-28T20:40:44Z is after 2025-08-24T17:21:41Z" Jan 28 20:40:44 crc kubenswrapper[4746]: I0128 20:40:44.420782 4746 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 28 20:40:44 crc kubenswrapper[4746]: I0128 20:40:44.420813 4746 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 28 20:40:44 crc kubenswrapper[4746]: I0128 20:40:44.420826 4746 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 28 20:40:44 crc kubenswrapper[4746]: I0128 20:40:44.420841 4746 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 28 20:40:44 crc kubenswrapper[4746]: I0128 20:40:44.420852 4746 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-28T20:40:44Z","lastTransitionTime":"2026-01-28T20:40:44Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 28 20:40:44 crc kubenswrapper[4746]: I0128 20:40:44.424862 4746 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-28T20:39:57Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-28T20:39:57Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-28T20:39:57Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://83a275ec9868f2fd9e24135d5a05c27caf843340c144070952291bb6a3715035\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-28T20:39:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod 
\"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-28T20:40:44Z is after 2025-08-24T17:21:41Z" Jan 28 20:40:44 crc kubenswrapper[4746]: I0128 20:40:44.439424 4746 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-6wrnw" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"6dc8b546-9734-4082-b2b3-2bafe3f1564d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-28T20:39:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-28T20:39:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-28T20:39:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-28T20:39:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://50ec1ea913ec63a096032bd968bda5c5c8879e16a08e6a57727750517d9af038\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy
\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-28T20:39:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-z84f9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://66d250466155027405bf90c4e5ed09388238d4cee63604e28486a16778f9d188\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-28T20:39:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-z84f9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-28T20:39:53Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-6wrnw\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 
2026-01-28T20:40:44Z is after 2025-08-24T17:21:41Z" Jan 28 20:40:44 crc kubenswrapper[4746]: I0128 20:40:44.459218 4746 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-ht6hp" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"beb2b795-6bf4-4d38-89f7-bcb5512c3e61\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-28T20:39:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-28T20:40:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-28T20:40:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-28T20:40:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://564d42c4c25f218c1a1f52449d2f3d941c1dad920c8100d1d908cadf7cf46607\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-28T20:40:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rr62b\\\",\\\"readOnly\\\":true,\\\"
recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b2d8bf207108ac2304cd04e34cd9843fcb755ac8610a3f88448538d1e99359f6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b2d8bf207108ac2304cd04e34cd9843fcb755ac8610a3f88448538d1e99359f6\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-28T20:39:54Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-28T20:39:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rr62b\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://40c78d39cb70eba245771f4d85d2c8bc7b77427ea216ad3b89749d24fd550098\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"termi
nated\\\":{\\\"containerID\\\":\\\"cri-o://40c78d39cb70eba245771f4d85d2c8bc7b77427ea216ad3b89749d24fd550098\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-28T20:39:55Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-28T20:39:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rr62b\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://498028ca0bd260ca2c4d7b0231ea54a5a6ff5b302f720fbd209a4f640507258b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://498028ca0bd260ca2c4d7b0231ea54a5a6ff5b302f720fbd209a4f640507258b\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-28T20:39:56Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-28T20:39:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\
\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rr62b\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://661d4a9031b6789f38ed1ac8e3c40d72182709130e1680ecdad86c07c1f34b7b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://661d4a9031b6789f38ed1ac8e3c40d72182709130e1680ecdad86c07c1f34b7b\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-28T20:39:57Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-28T20:39:57Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rr62b\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://a49ffe0bbe4151671dc8803d7ab61cb9b3a2eae064b81e2830bc074787f4754e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"r
estartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://a49ffe0bbe4151671dc8803d7ab61cb9b3a2eae064b81e2830bc074787f4754e\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-28T20:39:58Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-28T20:39:58Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rr62b\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://cd521e9ecb53f581b5732a4d018f92007063ce0912e6d760fe3c920218c23b14\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://cd521e9ecb53f581b5732a4d018f92007063ce0912e6d760fe3c920218c23b14\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-28T20:39:59Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-28T20:39:59Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rr62b\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":
\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-28T20:39:53Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-ht6hp\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-28T20:40:44Z is after 2025-08-24T17:21:41Z" Jan 28 20:40:44 crc kubenswrapper[4746]: I0128 20:40:44.477489 4746 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-qhpvf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"cdf26de0-b602-4bdf-b492-65b3b6b31434\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-28T20:39:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-28T20:39:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-28T20:40:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-28T20:40:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://9c88cd0b151f23677f83a944570f36a7a6edd30d9be47638a17cbb9098ec1c23\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-a
rt-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://f80345dfd4de46e278611370e6c337968cb1ba7ed8275457deea5871704f8367\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2026-01-28T20:40:40Z\\\",\\\"message\\\":\\\"2026-01-28T20:39:55+00:00 [cnibincopy] Successfully copied files in /usr/src/multus-cni/rhel9/bin/ to /host/opt/cni/bin/upgrade_529a75ec-ce77-49ca-8f6a-9466be7c980e\\\\n2026-01-28T20:39:55+00:00 [cnibincopy] Successfully moved files in /host/opt/cni/bin/upgrade_529a75ec-ce77-49ca-8f6a-9466be7c980e to /host/opt/cni/bin/\\\\n2026-01-28T20:39:55Z [verbose] multus-daemon started\\\\n2026-01-28T20:39:55Z [verbose] Readiness Indicator file check\\\\n2026-01-28T20:40:40Z [error] have you checked that your default network is ready? still waiting for readinessindicatorfile @ /host/run/multus/cni/net.d/10-ovn-kubernetes.conf. pollimmediate error: timed out waiting for the condition\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-01-28T20:39:53Z\\\"}},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-28T20:40:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\
":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jnpsl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-28T20:39:53Z\\\"}}\" for pod \"openshift-multus\"/\"multus-qhpvf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-28T20:40:44Z is after 2025-08-24T17:21:41Z" Jan 28 20:40:44 crc kubenswrapper[4746]: I0128 20:40:44.493192 4746 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-2blg6" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"f60a5487-5012-4cc9-ad94-5dfb4957d74e\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-28T20:40:07Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-28T20:40:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-28T20:40:07Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-28T20:40:07Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-l2www\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-l2www\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-28T20:40:07Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-2blg6\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-28T20:40:44Z is after 2025-08-24T17:21:41Z" Jan 28 20:40:44 crc 
kubenswrapper[4746]: I0128 20:40:44.511154 4746 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"64499f0d-5c10-43a9-b479-9b6c7ef2fb9c\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-28T20:39:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-28T20:39:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-28T20:40:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-28T20:40:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-28T20:39:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://9a304b12b4a1e30cbbbb16aa4f91514f1b1a64c716cf8ac023803a279b310f9c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-28T20:39:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://6af41f85c56be6
c24127bec0aee8e02562dbb04bd22e57c6887f6f5e914a7530\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-28T20:39:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://f31e9e80576ec4eb59ec92039601fd94bd323d29493c353f9ec6b9c61187b0d3\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-28T20:39:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://64f8ae1fcf8b14a65196ffa6d5a50db39aa846dc4e6cc0b92ba54d9b5f6913e3\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"te
rminated\\\":{\\\"containerID\\\":\\\"cri-o://56b250a192cfa698ccd7ac8233b75b9acbf2347549c712b47ccc1b89f144aa83\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-01-28T20:39:52Z\\\",\\\"message\\\":\\\"le observer\\\\nW0128 20:39:52.695680 1 builder.go:272] unable to get owner reference (falling back to namespace): pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI0128 20:39:52.695807 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0128 20:39:52.696767 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-2500116620/tls.crt::/tmp/serving-cert-2500116620/tls.key\\\\\\\"\\\\nI0128 20:39:52.900914 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI0128 20:39:52.915714 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI0128 20:39:52.915750 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI0128 20:39:52.916035 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI0128 20:39:52.916045 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI0128 20:39:52.925040 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW0128 20:39:52.925098 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0128 20:39:52.925104 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0128 20:39:52.925109 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW0128 20:39:52.925113 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW0128 20:39:52.925116 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW0128 
20:39:52.925119 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nI0128 20:39:52.925416 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nF0128 20:39:52.926926 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-01-28T20:39:47Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":2,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-28T20:40:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://03a94dee0b9663441e0ecec16b121c5be4e5c557f17c929b8edfb0d7ba00c3d5\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-28T20:39:35Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c5808a729190035a73529ac58efb0790f2d9c3b98c5f1a8017389e4ca1738c4c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\
\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://c5808a729190035a73529ac58efb0790f2d9c3b98c5f1a8017389e4ca1738c4c\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-28T20:39:33Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-28T20:39:33Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-28T20:39:32Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-28T20:40:44Z is after 2025-08-24T17:21:41Z" Jan 28 20:40:44 crc kubenswrapper[4746]: I0128 20:40:44.523489 4746 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 28 20:40:44 crc kubenswrapper[4746]: I0128 20:40:44.523543 4746 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 28 20:40:44 crc kubenswrapper[4746]: I0128 20:40:44.523560 4746 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 28 20:40:44 crc kubenswrapper[4746]: I0128 20:40:44.523588 4746 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 28 20:40:44 crc kubenswrapper[4746]: I0128 20:40:44.523605 4746 setters.go:603] "Node became not ready" node="crc" 
condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-28T20:40:44Z","lastTransitionTime":"2026-01-28T20:40:44Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 28 20:40:44 crc kubenswrapper[4746]: I0128 20:40:44.525300 4746 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3553d828-b1a0-4f51-9e70-5f4d25f3ee42\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-28T20:39:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-28T20:39:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-28T20:40:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-28T20:40:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-28T20:39:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://2ba6f8586593282902571354d53a7bdc0945ea8d1970cac1bfc2f8cc4019a4ab\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\
\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-28T20:39:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://00d75277c1a1e8fdc34e37a3ae3d697e002007456fde3dae5d49c1c932a0a7f5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-28T20:39:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://cf6d80884e00b1a12051a7a97148a46fba0e8514a1233a180262392c302db77f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-28T20:39:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.1
26.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a4a7e5a6315ccaaa100eb33d1a23bf2481f4ee8aa1ea076927dec54027d4c4cd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"wait-for-host-port\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://a4a7e5a6315ccaaa100eb33d1a23bf2481f4ee8aa1ea076927dec54027d4c4cd\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-28T20:39:33Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-28T20:39:33Z\\\"}}}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-28T20:39:32Z\\\"}}\" for pod \"openshift-kube-scheduler\"/\"openshift-kube-scheduler-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-28T20:40:44Z is after 2025-08-24T17:21:41Z" Jan 28 20:40:44 crc kubenswrapper[4746]: I0128 20:40:44.538063 4746 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-d8rwq" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"405572d8-50d2-4d7a-adf0-d8d6adea31ca\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-28T20:39:57Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-28T20:39:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-28T20:39:57Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-28T20:39:57Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://eba305a172c49420674e97d32590a7b2e4c7c2cd7406257508e0636cf42ea244\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-28T20:39:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mb4h4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phas
e\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-28T20:39:56Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-d8rwq\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-28T20:40:44Z is after 2025-08-24T17:21:41Z" Jan 28 20:40:44 crc kubenswrapper[4746]: I0128 20:40:44.552238 4746 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"950d71b3-9b51-43dc-814a-d6a838723f78\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-28T20:39:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-28T20:39:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-28T20:39:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-28T20:39:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-28T20:39:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0411ecf7c85f5397d751adb7d4905977dcbd3d2e169c56ebbe26017192765602\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee8
8051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-28T20:39:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://319cbcaa9981b3c79eb0c8f970f0ad776fb255db10c72c9b6a13469906e9e5ec\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-28T20:39:33Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://eefda7da35cd9ff176512b5d1224ccadcb5debd0035a49822e82a9757d4a3f91\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-28T20:39:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\
\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://145f4f882924f06172df81d94fc9bd683ea1fc04889de5f6cfb5caece8227fd0\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-28T20:39:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-28T20:39:32Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-28T20:40:44Z is after 2025-08-24T17:21:41Z" Jan 28 20:40:44 crc kubenswrapper[4746]: I0128 20:40:44.566388 4746 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-28T20:39:52Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-28T20:39:52Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-28T20:40:44Z is after 2025-08-24T17:21:41Z" Jan 28 20:40:44 crc kubenswrapper[4746]: I0128 20:40:44.579334 4746 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-9zvm7" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"de4fb5f0-7c65-4fdb-8389-d2c8462e130b\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-28T20:40:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-28T20:40:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-28T20:40:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-28T20:40:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://972f4c0079d8e04365abd93dc8d63e88dea04f427dbbbcf5e331cdeeff8c855c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-28T20:40:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jvkqz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://d2340a59c423445dd2b32da57d43146dd1cc3
54b737ae1b4d1875a0d40f62188\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-28T20:40:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jvkqz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-28T20:40:05Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-9zvm7\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-28T20:40:44Z is after 2025-08-24T17:21:41Z" Jan 28 20:40:44 crc kubenswrapper[4746]: I0128 20:40:44.595901 4746 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-28T20:39:52Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-28T20:39:52Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-28T20:40:44Z is after 2025-08-24T17:21:41Z" Jan 28 20:40:44 crc kubenswrapper[4746]: I0128 20:40:44.627186 4746 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 28 20:40:44 crc kubenswrapper[4746]: I0128 20:40:44.627235 4746 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 28 20:40:44 crc kubenswrapper[4746]: I0128 20:40:44.627249 4746 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 28 20:40:44 crc kubenswrapper[4746]: I0128 20:40:44.627268 4746 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 28 20:40:44 crc kubenswrapper[4746]: I0128 20:40:44.627284 4746 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-28T20:40:44Z","lastTransitionTime":"2026-01-28T20:40:44Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady 
message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 28 20:40:44 crc kubenswrapper[4746]: I0128 20:40:44.730584 4746 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 28 20:40:44 crc kubenswrapper[4746]: I0128 20:40:44.730640 4746 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 28 20:40:44 crc kubenswrapper[4746]: I0128 20:40:44.730654 4746 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 28 20:40:44 crc kubenswrapper[4746]: I0128 20:40:44.730682 4746 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 28 20:40:44 crc kubenswrapper[4746]: I0128 20:40:44.730697 4746 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-28T20:40:44Z","lastTransitionTime":"2026-01-28T20:40:44Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 28 20:40:44 crc kubenswrapper[4746]: I0128 20:40:44.832920 4746 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 28 20:40:44 crc kubenswrapper[4746]: I0128 20:40:44.832964 4746 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 28 20:40:44 crc kubenswrapper[4746]: I0128 20:40:44.832974 4746 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 28 20:40:44 crc kubenswrapper[4746]: I0128 20:40:44.832989 4746 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 28 20:40:44 crc kubenswrapper[4746]: I0128 20:40:44.832999 4746 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-28T20:40:44Z","lastTransitionTime":"2026-01-28T20:40:44Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 28 20:40:44 crc kubenswrapper[4746]: I0128 20:40:44.835383 4746 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-2blg6" Jan 28 20:40:44 crc kubenswrapper[4746]: E0128 20:40:44.835475 4746 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-2blg6" podUID="f60a5487-5012-4cc9-ad94-5dfb4957d74e" Jan 28 20:40:44 crc kubenswrapper[4746]: I0128 20:40:44.835553 4746 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Jan 28 20:40:44 crc kubenswrapper[4746]: I0128 20:40:44.835598 4746 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Jan 28 20:40:44 crc kubenswrapper[4746]: E0128 20:40:44.835797 4746 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Jan 28 20:40:44 crc kubenswrapper[4746]: E0128 20:40:44.835989 4746 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Jan 28 20:40:44 crc kubenswrapper[4746]: I0128 20:40:44.851181 4746 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2026-01-15 03:36:06.322682142 +0000 UTC Jan 28 20:40:44 crc kubenswrapper[4746]: I0128 20:40:44.936301 4746 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 28 20:40:44 crc kubenswrapper[4746]: I0128 20:40:44.936363 4746 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 28 20:40:44 crc kubenswrapper[4746]: I0128 20:40:44.936390 4746 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 28 20:40:44 crc kubenswrapper[4746]: I0128 20:40:44.936425 4746 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 28 20:40:44 crc kubenswrapper[4746]: I0128 20:40:44.936452 4746 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-28T20:40:44Z","lastTransitionTime":"2026-01-28T20:40:44Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 28 20:40:45 crc kubenswrapper[4746]: I0128 20:40:45.040044 4746 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 28 20:40:45 crc kubenswrapper[4746]: I0128 20:40:45.040159 4746 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 28 20:40:45 crc kubenswrapper[4746]: I0128 20:40:45.040181 4746 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 28 20:40:45 crc kubenswrapper[4746]: I0128 20:40:45.040215 4746 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 28 20:40:45 crc kubenswrapper[4746]: I0128 20:40:45.040239 4746 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-28T20:40:45Z","lastTransitionTime":"2026-01-28T20:40:45Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 28 20:40:45 crc kubenswrapper[4746]: I0128 20:40:45.142882 4746 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 28 20:40:45 crc kubenswrapper[4746]: I0128 20:40:45.143273 4746 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 28 20:40:45 crc kubenswrapper[4746]: I0128 20:40:45.143377 4746 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 28 20:40:45 crc kubenswrapper[4746]: I0128 20:40:45.143472 4746 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 28 20:40:45 crc kubenswrapper[4746]: I0128 20:40:45.143574 4746 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-28T20:40:45Z","lastTransitionTime":"2026-01-28T20:40:45Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 28 20:40:45 crc kubenswrapper[4746]: I0128 20:40:45.247319 4746 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 28 20:40:45 crc kubenswrapper[4746]: I0128 20:40:45.247402 4746 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 28 20:40:45 crc kubenswrapper[4746]: I0128 20:40:45.247424 4746 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 28 20:40:45 crc kubenswrapper[4746]: I0128 20:40:45.247456 4746 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 28 20:40:45 crc kubenswrapper[4746]: I0128 20:40:45.247475 4746 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-28T20:40:45Z","lastTransitionTime":"2026-01-28T20:40:45Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 28 20:40:45 crc kubenswrapper[4746]: I0128 20:40:45.333866 4746 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-8vmvh_c4d15639-62fb-41b7-a1d4-6f51f3af6d99/ovnkube-controller/3.log" Jan 28 20:40:45 crc kubenswrapper[4746]: I0128 20:40:45.334830 4746 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-8vmvh_c4d15639-62fb-41b7-a1d4-6f51f3af6d99/ovnkube-controller/2.log" Jan 28 20:40:45 crc kubenswrapper[4746]: I0128 20:40:45.338950 4746 generic.go:334] "Generic (PLEG): container finished" podID="c4d15639-62fb-41b7-a1d4-6f51f3af6d99" containerID="dac4fcde6d3cb0d62e92dd34133b79787bdd8338dcee58772ed7584830a7cb2c" exitCode=1 Jan 28 20:40:45 crc kubenswrapper[4746]: I0128 20:40:45.339013 4746 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-8vmvh" event={"ID":"c4d15639-62fb-41b7-a1d4-6f51f3af6d99","Type":"ContainerDied","Data":"dac4fcde6d3cb0d62e92dd34133b79787bdd8338dcee58772ed7584830a7cb2c"} Jan 28 20:40:45 crc kubenswrapper[4746]: I0128 20:40:45.339413 4746 scope.go:117] "RemoveContainer" containerID="53be073387414da7f1a5b73960206c0a04b4ef439d6c51bd22ceefbb5d723b76" Jan 28 20:40:45 crc kubenswrapper[4746]: I0128 20:40:45.340516 4746 scope.go:117] "RemoveContainer" containerID="dac4fcde6d3cb0d62e92dd34133b79787bdd8338dcee58772ed7584830a7cb2c" Jan 28 20:40:45 crc kubenswrapper[4746]: E0128 20:40:45.341198 4746 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"ovnkube-controller\" with CrashLoopBackOff: \"back-off 40s restarting failed container=ovnkube-controller pod=ovnkube-node-8vmvh_openshift-ovn-kubernetes(c4d15639-62fb-41b7-a1d4-6f51f3af6d99)\"" pod="openshift-ovn-kubernetes/ovnkube-node-8vmvh" podUID="c4d15639-62fb-41b7-a1d4-6f51f3af6d99" Jan 28 20:40:45 crc kubenswrapper[4746]: I0128 20:40:45.353425 4746 kubelet_node_status.go:724] "Recording event 
message for node" node="crc" event="NodeHasSufficientMemory" Jan 28 20:40:45 crc kubenswrapper[4746]: I0128 20:40:45.353483 4746 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 28 20:40:45 crc kubenswrapper[4746]: I0128 20:40:45.353495 4746 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 28 20:40:45 crc kubenswrapper[4746]: I0128 20:40:45.353517 4746 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 28 20:40:45 crc kubenswrapper[4746]: I0128 20:40:45.353531 4746 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-28T20:40:45Z","lastTransitionTime":"2026-01-28T20:40:45Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 28 20:40:45 crc kubenswrapper[4746]: I0128 20:40:45.360978 4746 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3553d828-b1a0-4f51-9e70-5f4d25f3ee42\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-28T20:39:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-28T20:39:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-28T20:40:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-28T20:40:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-28T20:39:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://2ba6f8586593282902571354d53a7bdc0945ea8d1970cac1bfc2f8cc4019a4ab\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-28T20:39:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://00d75277c1a1e8fdc34e37a3ae3d69
7e002007456fde3dae5d49c1c932a0a7f5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-28T20:39:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://cf6d80884e00b1a12051a7a97148a46fba0e8514a1233a180262392c302db77f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-28T20:39:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a4a7e5a6315ccaaa100eb33d1a23bf2481f4ee8aa1ea076927dec54027d4c4cd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art
-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"wait-for-host-port\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://a4a7e5a6315ccaaa100eb33d1a23bf2481f4ee8aa1ea076927dec54027d4c4cd\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-28T20:39:33Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-28T20:39:33Z\\\"}}}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-28T20:39:32Z\\\"}}\" for pod \"openshift-kube-scheduler\"/\"openshift-kube-scheduler-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-28T20:40:45Z is after 2025-08-24T17:21:41Z" Jan 28 20:40:45 crc kubenswrapper[4746]: I0128 20:40:45.379529 4746 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-d8rwq" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"405572d8-50d2-4d7a-adf0-d8d6adea31ca\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-28T20:39:57Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-28T20:39:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-28T20:39:57Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-28T20:39:57Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://eba305a172c49420674e97d32590a7b2e4c7c2cd7406257508e0636cf42ea244\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-28T20:39:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mb4h4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phas
e\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-28T20:39:56Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-d8rwq\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-28T20:40:45Z is after 2025-08-24T17:21:41Z" Jan 28 20:40:45 crc kubenswrapper[4746]: I0128 20:40:45.401596 4746 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"950d71b3-9b51-43dc-814a-d6a838723f78\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-28T20:39:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-28T20:39:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-28T20:39:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-28T20:39:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-28T20:39:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0411ecf7c85f5397d751adb7d4905977dcbd3d2e169c56ebbe26017192765602\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee8
8051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-28T20:39:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://319cbcaa9981b3c79eb0c8f970f0ad776fb255db10c72c9b6a13469906e9e5ec\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-28T20:39:33Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://eefda7da35cd9ff176512b5d1224ccadcb5debd0035a49822e82a9757d4a3f91\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-28T20:39:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\
\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://145f4f882924f06172df81d94fc9bd683ea1fc04889de5f6cfb5caece8227fd0\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-28T20:39:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-28T20:39:32Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-28T20:40:45Z is after 2025-08-24T17:21:41Z" Jan 28 20:40:45 crc kubenswrapper[4746]: I0128 20:40:45.421140 4746 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-28T20:39:52Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-28T20:39:52Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-28T20:40:45Z is after 2025-08-24T17:21:41Z" Jan 28 20:40:45 crc kubenswrapper[4746]: I0128 20:40:45.434422 4746 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-9zvm7" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"de4fb5f0-7c65-4fdb-8389-d2c8462e130b\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-28T20:40:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-28T20:40:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-28T20:40:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-28T20:40:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://972f4c0079d8e04365abd93dc8d63e88dea04f427dbbbcf5e331cdeeff8c855c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-28T20:40:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jvkqz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://d2340a59c423445dd2b32da57d43146dd1cc3
54b737ae1b4d1875a0d40f62188\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-28T20:40:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jvkqz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-28T20:40:05Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-9zvm7\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-28T20:40:45Z is after 2025-08-24T17:21:41Z" Jan 28 20:40:45 crc kubenswrapper[4746]: I0128 20:40:45.453510 4746 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-28T20:39:52Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-28T20:39:52Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-28T20:40:45Z is after 2025-08-24T17:21:41Z" Jan 28 20:40:45 crc kubenswrapper[4746]: I0128 20:40:45.457105 4746 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 28 20:40:45 crc kubenswrapper[4746]: I0128 20:40:45.457133 4746 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 28 20:40:45 crc kubenswrapper[4746]: I0128 20:40:45.457142 4746 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 28 20:40:45 crc kubenswrapper[4746]: I0128 20:40:45.457156 4746 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 28 20:40:45 crc kubenswrapper[4746]: I0128 20:40:45.457166 4746 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-28T20:40:45Z","lastTransitionTime":"2026-01-28T20:40:45Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady 
message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 28 20:40:45 crc kubenswrapper[4746]: I0128 20:40:45.467458 4746 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-28T20:39:52Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-28T20:39:52Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-28T20:40:45Z is after 2025-08-24T17:21:41Z" Jan 28 20:40:45 crc kubenswrapper[4746]: I0128 20:40:45.483387 4746 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-28T20:39:53Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-28T20:39:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-28T20:39:53Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5dff3e2075b8bbfb6af77c72362ccc37e1d7810a2d97c096cf501681c98699ab\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-28T20:39:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://772e165efa6d865a669c172958e0aea3112146637e354e4587b7e664454112ad\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af
0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-28T20:39:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-28T20:40:45Z is after 2025-08-24T17:21:41Z" Jan 28 20:40:45 crc kubenswrapper[4746]: I0128 20:40:45.495966 4746 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-gcrxx" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"1c3c93a1-7ddf-4339-9ca3-79f3753943b4\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-28T20:39:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-28T20:39:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-28T20:39:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-28T20:39:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5bd8d8214aa8e85e50747643b6354fb6e026866c926310fb2dd79760f916f468\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-28T20:39:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wn6hr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podI
Ps\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-28T20:39:52Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-gcrxx\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-28T20:40:45Z is after 2025-08-24T17:21:41Z" Jan 28 20:40:45 crc kubenswrapper[4746]: I0128 20:40:45.520091 4746 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-8vmvh" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"c4d15639-62fb-41b7-a1d4-6f51f3af6d99\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-28T20:39:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-28T20:39:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-28T20:39:53Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-28T20:39:53Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://2637734601bf1e43535b70bd62cc6aea7fc9f62d0d2a33c83602e7065a793be1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-28T20:39:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-kgrqs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://73e82be84afa7be760acda86ebf6c956d558dcbd4f37f049fba84e08c75fcd9f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-28T20:39:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\
\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-kgrqs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://17d09f7f46014da3713f76f111bca0a7ce8a24a154aa6144291d760d85e7d3ec\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-28T20:39:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-kgrqs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://b7ad895d0ff1ad570136d96671e1bde5c62882af50a407945ef0e7bc52727b88\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-28T20:39:54Z\\\"}},\\\"volumeMounts\\\":[{
\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-kgrqs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://af842ba33fdf06048038dd3e12c59f7a9d0867aa28acee1f2cc4571e7db28b88\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-28T20:39:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-kgrqs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://57a776a1daf1b17eb44d9fa09fe2f7ec01f647aa0a40d0d571736d2f7c2f8e4d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\"
,\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-28T20:39:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-kgrqs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://dac4fcde6d3cb0d62e92dd34133b79787bdd8338dcee58772ed7584830a7cb2c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://53be073387414da7f1a5b73960206c0a04b4ef439d6c51bd22ceefbb5d723b76\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2026-01-28T20:40:17Z\\\",\\\"message\\\":\\\"Built service openshift-console/console LB template configs for network=default: []services.lbConfig(nil)\\\\nI0128 20:40:17.344732 6409 model_client.go:382] Update operations generated as: [{Op:update Table:Logical_Switch_Port Row:map[addresses:{GoSet:[0a:58:0a:d9:00:5c 10.217.0.92]} 
options:{GoMap:map[iface-id-ver:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 requested-chassis:crc]} port_security:{GoSet:[0a:58:0a:d9:00:5c 10.217.0.92]}] Rows:[] Columns:[] Mutations:[] Timeout:\\\\u003cnil\\\\u003e Where:[where column _uuid == {c94130be-172c-477c-88c4-40cc7eba30fe}] Until: Durable:\\\\u003cnil\\\\u003e Comment:\\\\u003cnil\\\\u003e Lock:\\\\u003cnil\\\\u003e UUID: UUIDName:}]\\\\nI0128 20:40:17.344766 6409 services_controller.go:451] Built service openshift-console/console cluster-wide LB for network=default: []services.LB{services.LB{Name:\\\\\\\"Service_openshift-console/console_TCP_cluster\\\\\\\", UUID:\\\\\\\"\\\\\\\", Protocol:\\\\\\\"TCP\\\\\\\", ExternalIDs:map[string]string{\\\\\\\"k8s.ovn.org/kind\\\\\\\":\\\\\\\"Service\\\\\\\", \\\\\\\"k8s.ovn.org/owner\\\\\\\":\\\\\\\"openshift-console/console\\\\\\\"}, Opts:services.LBOpts{Reject:true, EmptyLBEvents:false, AffinityTimeOut:0, SkipSNAT:false, Template:false, AddressFamily:\\\\\\\"\\\\\\\"}, Rules:[]services.LBRule{services.LBRule{Source:services.Addr{IP:\\\\\\\"10.217.5.194\\\\\\\", Port:443, Template:(*services.Template)(nil)}, Targets:[]services.Addr{}}}, Templates:services.TemplateMap(nil), Switches:[]string{}, Routers:[]string{}, Groups:[]string{\\\\\\\"clusterLBGroup\\\\\\\"}}}\\\\nI0128 20\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-01-28T20:40:16Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://dac4fcde6d3cb0d62e92dd34133b79787bdd8338dcee58772ed7584830a7cb2c\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2026-01-28T20:40:44Z\\\",\\\"message\\\":\\\"Spec:ServiceSpec{Ports:[]ServicePort{ServicePort{Name:https-metrics,Protocol:TCP,Port:8443,TargetPort:{0 8443 },NodePort:0,AppProtocol:nil,},},Selector:map[string]string{app: 
catalog-operator,},ClusterIP:10.217.5.204,Type:ClusterIP,ExternalIPs:[],SessionAffinity:None,LoadBalancerIP:,LoadBalancerSourceRanges:[],ExternalName:,ExternalTrafficPolicy:,HealthCheckNodePort:0,PublishNotReadyAddresses:false,SessionAffinityConfig:nil,IPFamilyPolicy:*SingleStack,ClusterIPs:[10.217.5.204],IPFamilies:[IPv4],AllocateLoadBalancerNodePorts:nil,LoadBalancerClass:nil,InternalTrafficPolicy:*Cluster,TrafficDistribution:nil,},Status:ServiceStatus{LoadBalancer:LoadBalancerStatus{Ingress:[]LoadBalancerIngress{},},Conditions:[]Condition{},},}\\\\nI0128 20:40:44.722894 6800 loadbalancer.go:304] Deleted 0 stale LBs for map[string]string{\\\\\\\"k8s.ovn.org/kind\\\\\\\":\\\\\\\"Service\\\\\\\", \\\\\\\"k8s.ovn.org/owner\\\\\\\":\\\\\\\"openshift-operator-lifecycle-manager/olm-operator-metrics\\\\\\\"}\\\\nI0128 20:40:44.724571 6800 ovnkube_controller.go:1292] Config duration recorder: kind/namespace/name service/openshift-ovn-kubernetes/ovn-kubernetes-control-plane. OVN-Kubernetes controller took 0.108725535 seconds. 
No OVN measurement.\\\\nI0128 20:40:44.724576 6800 services_controller.go:360] Finished syncing service olm-operator-metrics on namespace openshift-operator-lifecycle-manager for network=default : 4.634502ms\\\\nI0128 \\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-01-28T20:40:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/
ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-kgrqs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://cdd60109dba5f45418dd5bc29c649a50aae8479fa77fc3dc50ca4f7c3042e3fc\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-28T20:39:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-kgrqs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://45e59dfe364d511e17b8264a31157501de3ed23e7125d79979df83d328242328\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\
\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://45e59dfe364d511e17b8264a31157501de3ed23e7125d79979df83d328242328\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-28T20:39:53Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-28T20:39:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-kgrqs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-28T20:39:53Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-8vmvh\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-28T20:40:45Z is after 2025-08-24T17:21:41Z" Jan 28 20:40:45 crc kubenswrapper[4746]: I0128 20:40:45.533929 4746 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-28T20:39:54Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-28T20:39:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-28T20:39:54Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://2f64131d62eca6c3a42e9177d14fa39ed6435e536f60ef5d25ff6bf061d2cc0a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-28T20:39:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: 
x509: certificate has expired or is not yet valid: current time 2026-01-28T20:40:45Z is after 2025-08-24T17:21:41Z" Jan 28 20:40:45 crc kubenswrapper[4746]: I0128 20:40:45.548697 4746 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-28T20:39:57Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-28T20:39:57Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-28T20:39:57Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://83a275ec9868f2fd9e24135d5a05c27caf843340c144070952291bb6a3715035\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-28T20:39:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\
\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-28T20:40:45Z is after 2025-08-24T17:21:41Z" Jan 28 20:40:45 crc kubenswrapper[4746]: I0128 20:40:45.561180 4746 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-6wrnw" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"6dc8b546-9734-4082-b2b3-2bafe3f1564d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-28T20:39:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-28T20:39:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-28T20:39:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-28T20:39:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://50ec1ea913ec63a096032bd968bda5c5c8879e16a08e6a57727750517d9af038\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\
":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-28T20:39:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-z84f9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://66d250466155027405bf90c4e5ed09388238d4cee63604e28486a16778f9d188\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-28T20:39:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-z84f9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-28T20:39:53Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-6wrnw\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or 
is not yet valid: current time 2026-01-28T20:40:45Z is after 2025-08-24T17:21:41Z" Jan 28 20:40:45 crc kubenswrapper[4746]: I0128 20:40:45.561698 4746 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 28 20:40:45 crc kubenswrapper[4746]: I0128 20:40:45.561728 4746 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 28 20:40:45 crc kubenswrapper[4746]: I0128 20:40:45.561744 4746 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 28 20:40:45 crc kubenswrapper[4746]: I0128 20:40:45.561769 4746 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 28 20:40:45 crc kubenswrapper[4746]: I0128 20:40:45.561781 4746 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-28T20:40:45Z","lastTransitionTime":"2026-01-28T20:40:45Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 28 20:40:45 crc kubenswrapper[4746]: I0128 20:40:45.575915 4746 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-ht6hp" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"beb2b795-6bf4-4d38-89f7-bcb5512c3e61\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-28T20:39:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-28T20:40:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-28T20:40:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-28T20:40:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://564d42c4c25f218c1a1f52449d2f3d941c1dad920c8100d1d908cadf7cf46607\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-28T20:40:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rr62b\\\",\\\"readOnly\\\":true,\\\"recursiveReadOn
ly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b2d8bf207108ac2304cd04e34cd9843fcb755ac8610a3f88448538d1e99359f6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b2d8bf207108ac2304cd04e34cd9843fcb755ac8610a3f88448538d1e99359f6\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-28T20:39:54Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-28T20:39:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rr62b\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://40c78d39cb70eba245771f4d85d2c8bc7b77427ea216ad3b89749d24fd550098\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"
containerID\\\":\\\"cri-o://40c78d39cb70eba245771f4d85d2c8bc7b77427ea216ad3b89749d24fd550098\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-28T20:39:55Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-28T20:39:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rr62b\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://498028ca0bd260ca2c4d7b0231ea54a5a6ff5b302f720fbd209a4f640507258b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://498028ca0bd260ca2c4d7b0231ea54a5a6ff5b302f720fbd209a4f640507258b\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-28T20:39:56Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-28T20:39:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveRead
Only\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rr62b\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://661d4a9031b6789f38ed1ac8e3c40d72182709130e1680ecdad86c07c1f34b7b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://661d4a9031b6789f38ed1ac8e3c40d72182709130e1680ecdad86c07c1f34b7b\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-28T20:39:57Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-28T20:39:57Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rr62b\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://a49ffe0bbe4151671dc8803d7ab61cb9b3a2eae064b81e2830bc074787f4754e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\"
:0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://a49ffe0bbe4151671dc8803d7ab61cb9b3a2eae064b81e2830bc074787f4754e\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-28T20:39:58Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-28T20:39:58Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rr62b\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://cd521e9ecb53f581b5732a4d018f92007063ce0912e6d760fe3c920218c23b14\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://cd521e9ecb53f581b5732a4d018f92007063ce0912e6d760fe3c920218c23b14\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-28T20:39:59Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-28T20:39:59Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rr62b\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\"
,\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-28T20:39:53Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-ht6hp\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-28T20:40:45Z is after 2025-08-24T17:21:41Z" Jan 28 20:40:45 crc kubenswrapper[4746]: I0128 20:40:45.589170 4746 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-qhpvf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"cdf26de0-b602-4bdf-b492-65b3b6b31434\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-28T20:39:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-28T20:39:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-28T20:40:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-28T20:40:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://9c88cd0b151f23677f83a944570f36a7a6edd30d9be47638a17cbb9098ec1c23\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7
eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://f80345dfd4de46e278611370e6c337968cb1ba7ed8275457deea5871704f8367\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2026-01-28T20:40:40Z\\\",\\\"message\\\":\\\"2026-01-28T20:39:55+00:00 [cnibincopy] Successfully copied files in /usr/src/multus-cni/rhel9/bin/ to /host/opt/cni/bin/upgrade_529a75ec-ce77-49ca-8f6a-9466be7c980e\\\\n2026-01-28T20:39:55+00:00 [cnibincopy] Successfully moved files in /host/opt/cni/bin/upgrade_529a75ec-ce77-49ca-8f6a-9466be7c980e to /host/opt/cni/bin/\\\\n2026-01-28T20:39:55Z [verbose] multus-daemon started\\\\n2026-01-28T20:39:55Z [verbose] Readiness Indicator file check\\\\n2026-01-28T20:40:40Z [error] have you checked that your default network is ready? still waiting for readinessindicatorfile @ /host/run/multus/cni/net.d/10-ovn-kubernetes.conf. pollimmediate error: timed out waiting for the condition\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-01-28T20:39:53Z\\\"}},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-28T20:40:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-
lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jnpsl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-28T20:39:53Z\\\"}}\" for pod \"openshift-multus\"/\"multus-qhpvf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-28T20:40:45Z is after 2025-08-24T17:21:41Z" Jan 28 20:40:45 crc kubenswrapper[4746]: I0128 20:40:45.602581 4746 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-2blg6" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"f60a5487-5012-4cc9-ad94-5dfb4957d74e\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-28T20:40:07Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-28T20:40:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-28T20:40:07Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-28T20:40:07Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-l2www\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-l2www\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-28T20:40:07Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-2blg6\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-28T20:40:45Z is after 2025-08-24T17:21:41Z" Jan 28 20:40:45 crc 
kubenswrapper[4746]: I0128 20:40:45.618022 4746 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"64499f0d-5c10-43a9-b479-9b6c7ef2fb9c\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-28T20:39:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-28T20:39:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-28T20:40:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-28T20:40:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-28T20:39:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://9a304b12b4a1e30cbbbb16aa4f91514f1b1a64c716cf8ac023803a279b310f9c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-28T20:39:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://6af41f85c56be6
c24127bec0aee8e02562dbb04bd22e57c6887f6f5e914a7530\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-28T20:39:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://f31e9e80576ec4eb59ec92039601fd94bd323d29493c353f9ec6b9c61187b0d3\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-28T20:39:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://64f8ae1fcf8b14a65196ffa6d5a50db39aa846dc4e6cc0b92ba54d9b5f6913e3\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"te
rminated\\\":{\\\"containerID\\\":\\\"cri-o://56b250a192cfa698ccd7ac8233b75b9acbf2347549c712b47ccc1b89f144aa83\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-01-28T20:39:52Z\\\",\\\"message\\\":\\\"le observer\\\\nW0128 20:39:52.695680 1 builder.go:272] unable to get owner reference (falling back to namespace): pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI0128 20:39:52.695807 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0128 20:39:52.696767 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-2500116620/tls.crt::/tmp/serving-cert-2500116620/tls.key\\\\\\\"\\\\nI0128 20:39:52.900914 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI0128 20:39:52.915714 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI0128 20:39:52.915750 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI0128 20:39:52.916035 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI0128 20:39:52.916045 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI0128 20:39:52.925040 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW0128 20:39:52.925098 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0128 20:39:52.925104 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0128 20:39:52.925109 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW0128 20:39:52.925113 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW0128 20:39:52.925116 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW0128 
20:39:52.925119 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nI0128 20:39:52.925416 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nF0128 20:39:52.926926 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-01-28T20:39:47Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":2,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-28T20:40:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://03a94dee0b9663441e0ecec16b121c5be4e5c557f17c929b8edfb0d7ba00c3d5\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-28T20:39:35Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c5808a729190035a73529ac58efb0790f2d9c3b98c5f1a8017389e4ca1738c4c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\
\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://c5808a729190035a73529ac58efb0790f2d9c3b98c5f1a8017389e4ca1738c4c\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-28T20:39:33Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-28T20:39:33Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-28T20:39:32Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-28T20:40:45Z is after 2025-08-24T17:21:41Z" Jan 28 20:40:45 crc kubenswrapper[4746]: I0128 20:40:45.664499 4746 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 28 20:40:45 crc kubenswrapper[4746]: I0128 20:40:45.664547 4746 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 28 20:40:45 crc kubenswrapper[4746]: I0128 20:40:45.664557 4746 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 28 20:40:45 crc kubenswrapper[4746]: I0128 20:40:45.664575 4746 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 28 20:40:45 crc kubenswrapper[4746]: I0128 20:40:45.664588 4746 setters.go:603] "Node became not ready" node="crc" 
condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-28T20:40:45Z","lastTransitionTime":"2026-01-28T20:40:45Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 28 20:40:45 crc kubenswrapper[4746]: I0128 20:40:45.766758 4746 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 28 20:40:45 crc kubenswrapper[4746]: I0128 20:40:45.766799 4746 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 28 20:40:45 crc kubenswrapper[4746]: I0128 20:40:45.766809 4746 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 28 20:40:45 crc kubenswrapper[4746]: I0128 20:40:45.766826 4746 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 28 20:40:45 crc kubenswrapper[4746]: I0128 20:40:45.766837 4746 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-28T20:40:45Z","lastTransitionTime":"2026-01-28T20:40:45Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 28 20:40:45 crc kubenswrapper[4746]: I0128 20:40:45.835697 4746 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Jan 28 20:40:45 crc kubenswrapper[4746]: E0128 20:40:45.835836 4746 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Jan 28 20:40:45 crc kubenswrapper[4746]: I0128 20:40:45.852109 4746 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2025-11-27 19:06:56.325963216 +0000 UTC Jan 28 20:40:45 crc kubenswrapper[4746]: I0128 20:40:45.870329 4746 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 28 20:40:45 crc kubenswrapper[4746]: I0128 20:40:45.870399 4746 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 28 20:40:45 crc kubenswrapper[4746]: I0128 20:40:45.870423 4746 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 28 20:40:45 crc kubenswrapper[4746]: I0128 20:40:45.870453 4746 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 28 20:40:45 crc kubenswrapper[4746]: I0128 20:40:45.870478 4746 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-28T20:40:45Z","lastTransitionTime":"2026-01-28T20:40:45Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 28 20:40:45 crc kubenswrapper[4746]: I0128 20:40:45.972713 4746 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 28 20:40:45 crc kubenswrapper[4746]: I0128 20:40:45.972774 4746 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 28 20:40:45 crc kubenswrapper[4746]: I0128 20:40:45.972785 4746 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 28 20:40:45 crc kubenswrapper[4746]: I0128 20:40:45.972805 4746 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 28 20:40:45 crc kubenswrapper[4746]: I0128 20:40:45.972822 4746 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-28T20:40:45Z","lastTransitionTime":"2026-01-28T20:40:45Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 28 20:40:46 crc kubenswrapper[4746]: I0128 20:40:46.077305 4746 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 28 20:40:46 crc kubenswrapper[4746]: I0128 20:40:46.077362 4746 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 28 20:40:46 crc kubenswrapper[4746]: I0128 20:40:46.077373 4746 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 28 20:40:46 crc kubenswrapper[4746]: I0128 20:40:46.077394 4746 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 28 20:40:46 crc kubenswrapper[4746]: I0128 20:40:46.077406 4746 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-28T20:40:46Z","lastTransitionTime":"2026-01-28T20:40:46Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 28 20:40:46 crc kubenswrapper[4746]: I0128 20:40:46.181182 4746 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 28 20:40:46 crc kubenswrapper[4746]: I0128 20:40:46.181221 4746 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 28 20:40:46 crc kubenswrapper[4746]: I0128 20:40:46.181231 4746 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 28 20:40:46 crc kubenswrapper[4746]: I0128 20:40:46.181258 4746 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 28 20:40:46 crc kubenswrapper[4746]: I0128 20:40:46.181271 4746 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-28T20:40:46Z","lastTransitionTime":"2026-01-28T20:40:46Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 28 20:40:46 crc kubenswrapper[4746]: I0128 20:40:46.283475 4746 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 28 20:40:46 crc kubenswrapper[4746]: I0128 20:40:46.283546 4746 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 28 20:40:46 crc kubenswrapper[4746]: I0128 20:40:46.283563 4746 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 28 20:40:46 crc kubenswrapper[4746]: I0128 20:40:46.283584 4746 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 28 20:40:46 crc kubenswrapper[4746]: I0128 20:40:46.283636 4746 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-28T20:40:46Z","lastTransitionTime":"2026-01-28T20:40:46Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 28 20:40:46 crc kubenswrapper[4746]: I0128 20:40:46.345298 4746 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-8vmvh_c4d15639-62fb-41b7-a1d4-6f51f3af6d99/ovnkube-controller/3.log" Jan 28 20:40:46 crc kubenswrapper[4746]: I0128 20:40:46.349732 4746 scope.go:117] "RemoveContainer" containerID="dac4fcde6d3cb0d62e92dd34133b79787bdd8338dcee58772ed7584830a7cb2c" Jan 28 20:40:46 crc kubenswrapper[4746]: E0128 20:40:46.349984 4746 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"ovnkube-controller\" with CrashLoopBackOff: \"back-off 40s restarting failed container=ovnkube-controller pod=ovnkube-node-8vmvh_openshift-ovn-kubernetes(c4d15639-62fb-41b7-a1d4-6f51f3af6d99)\"" pod="openshift-ovn-kubernetes/ovnkube-node-8vmvh" podUID="c4d15639-62fb-41b7-a1d4-6f51f3af6d99" Jan 28 20:40:46 crc kubenswrapper[4746]: I0128 20:40:46.368709 4746 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-8vmvh" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"c4d15639-62fb-41b7-a1d4-6f51f3af6d99\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-28T20:39:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-28T20:39:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-28T20:39:53Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-28T20:39:53Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://2637734601bf1e43535b70bd62cc6aea7fc9f62d0d2a33c83602e7065a793be1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-28T20:39:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-kgrqs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://73e82be84afa7be760acda86ebf6c956d558dcbd4f37f049fba84e08c75fcd9f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\
\"running\\\":{\\\"startedAt\\\":\\\"2026-01-28T20:39:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-kgrqs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://17d09f7f46014da3713f76f111bca0a7ce8a24a154aa6144291d760d85e7d3ec\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-28T20:39:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-kgrqs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://b7ad895d0ff1ad570136d96671e1bde5c62882af50a407945ef0e7bc52727b88\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d20994829
19d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-28T20:39:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-kgrqs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://af842ba33fdf06048038dd3e12c59f7a9d0867aa28acee1f2cc4571e7db28b88\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-28T20:39:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-kgrqs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://57a776a1daf1b17eb44d9fa09fe2f7ec01f647aa0a40d0d571736d2f7c2f8e4d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cd
d47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-28T20:39:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-kgrqs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://dac4fcde6d3cb0d62e92dd34133b79787bdd8338dcee58772ed7584830a7cb2c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://dac4fcde6d3cb0d62e92dd34133b79787bdd8338dcee58772ed7584830a7cb2c\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2026-01-28T20:40:44Z\\\",\\\"message\\\":\\\"Spec:ServiceSpec{Ports:[]ServicePort{ServicePort{Name:https-metrics,Protocol:TCP,Port:8443
,TargetPort:{0 8443 },NodePort:0,AppProtocol:nil,},},Selector:map[string]string{app: catalog-operator,},ClusterIP:10.217.5.204,Type:ClusterIP,ExternalIPs:[],SessionAffinity:None,LoadBalancerIP:,LoadBalancerSourceRanges:[],ExternalName:,ExternalTrafficPolicy:,HealthCheckNodePort:0,PublishNotReadyAddresses:false,SessionAffinityConfig:nil,IPFamilyPolicy:*SingleStack,ClusterIPs:[10.217.5.204],IPFamilies:[IPv4],AllocateLoadBalancerNodePorts:nil,LoadBalancerClass:nil,InternalTrafficPolicy:*Cluster,TrafficDistribution:nil,},Status:ServiceStatus{LoadBalancer:LoadBalancerStatus{Ingress:[]LoadBalancerIngress{},},Conditions:[]Condition{},},}\\\\nI0128 20:40:44.722894 6800 loadbalancer.go:304] Deleted 0 stale LBs for map[string]string{\\\\\\\"k8s.ovn.org/kind\\\\\\\":\\\\\\\"Service\\\\\\\", \\\\\\\"k8s.ovn.org/owner\\\\\\\":\\\\\\\"openshift-operator-lifecycle-manager/olm-operator-metrics\\\\\\\"}\\\\nI0128 20:40:44.724571 6800 ovnkube_controller.go:1292] Config duration recorder: kind/namespace/name service/openshift-ovn-kubernetes/ovn-kubernetes-control-plane. OVN-Kubernetes controller took 0.108725535 seconds. 
No OVN measurement.\\\\nI0128 20:40:44.724576 6800 services_controller.go:360] Finished syncing service olm-operator-metrics on namespace openshift-operator-lifecycle-manager for network=default : 4.634502ms\\\\nI0128 \\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-01-28T20:40:43Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 40s restarting failed container=ovnkube-controller pod=ovnkube-node-8vmvh_openshift-ovn-kubernetes(c4d15639-62fb-41b7-a1d4-6f51f3af6d99)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitc
h\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-kgrqs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://cdd60109dba5f45418dd5bc29c649a50aae8479fa77fc3dc50ca4f7c3042e3fc\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-28T20:39:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-kgrqs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://45e59dfe364d511e17b8264a31157501de3ed23e
7125d79979df83d328242328\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://45e59dfe364d511e17b8264a31157501de3ed23e7125d79979df83d328242328\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-28T20:39:53Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-28T20:39:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-kgrqs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-28T20:39:53Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-8vmvh\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-28T20:40:46Z is after 2025-08-24T17:21:41Z" Jan 28 20:40:46 crc kubenswrapper[4746]: I0128 20:40:46.382976 4746 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-28T20:39:52Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-28T20:39:52Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-28T20:40:46Z is after 2025-08-24T17:21:41Z" Jan 28 20:40:46 crc kubenswrapper[4746]: I0128 20:40:46.385978 4746 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 28 20:40:46 crc kubenswrapper[4746]: I0128 20:40:46.386015 4746 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 28 20:40:46 crc kubenswrapper[4746]: I0128 20:40:46.386027 4746 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 28 20:40:46 crc kubenswrapper[4746]: I0128 20:40:46.386044 4746 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 28 20:40:46 crc kubenswrapper[4746]: I0128 20:40:46.386059 4746 setters.go:603] "Node became not ready" node="crc" 
condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-28T20:40:46Z","lastTransitionTime":"2026-01-28T20:40:46Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 28 20:40:46 crc kubenswrapper[4746]: I0128 20:40:46.396694 4746 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-28T20:39:53Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-28T20:39:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-28T20:39:53Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5dff3e2075b8bbfb6af77c72362ccc37e1d7810a2d97c096cf501681c98699ab\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-28T20:39:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-ide
ntity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://772e165efa6d865a669c172958e0aea3112146637e354e4587b7e664454112ad\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-28T20:39:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-28T20:40:46Z is after 2025-08-24T17:21:41Z" Jan 28 20:40:46 crc kubenswrapper[4746]: I0128 20:40:46.407826 4746 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-gcrxx" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"1c3c93a1-7ddf-4339-9ca3-79f3753943b4\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-28T20:39:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-28T20:39:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-28T20:39:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-28T20:39:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5bd8d8214aa8e85e50747643b6354fb6e026866c926310fb2dd79760f916f468\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-28T20:39:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wn6hr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podI
Ps\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-28T20:39:52Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-gcrxx\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-28T20:40:46Z is after 2025-08-24T17:21:41Z" Jan 28 20:40:46 crc kubenswrapper[4746]: I0128 20:40:46.422368 4746 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-ht6hp" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"beb2b795-6bf4-4d38-89f7-bcb5512c3e61\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-28T20:39:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-28T20:40:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-28T20:40:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-28T20:40:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://564d42c4c25f218c1a1f52449d2f3d941c1dad920c8100d1d908cadf7cf46607\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97
f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-28T20:40:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rr62b\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b2d8bf207108ac2304cd04e34cd9843fcb755ac8610a3f88448538d1e99359f6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b2d8bf207108ac2304cd04e34cd9843fcb755ac8610a3f88448538d1e99359f6\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-28T20:39:54Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-28T20:39:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rr62b\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://40c78d39cb70eba245771f4d85d2c8bc7b77427ea216ad3b89749d24fd550098\\\",\\\"image\\\":\
\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://40c78d39cb70eba245771f4d85d2c8bc7b77427ea216ad3b89749d24fd550098\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-28T20:39:55Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-28T20:39:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rr62b\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://498028ca0bd260ca2c4d7b0231ea54a5a6ff5b302f720fbd209a4f640507258b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://498028ca0bd260ca2c4d7b0231ea54a5a6ff5b302f720fbd209a4f640507258b\\\",\\\"exitCode\\\
":0,\\\"finishedAt\\\":\\\"2026-01-28T20:39:56Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-28T20:39:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rr62b\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://661d4a9031b6789f38ed1ac8e3c40d72182709130e1680ecdad86c07c1f34b7b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://661d4a9031b6789f38ed1ac8e3c40d72182709130e1680ecdad86c07c1f34b7b\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-28T20:39:57Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-28T20:39:57Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rr62b\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://a49ffe0bbe4151671dc8803d7ab61cb9
b3a2eae064b81e2830bc074787f4754e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://a49ffe0bbe4151671dc8803d7ab61cb9b3a2eae064b81e2830bc074787f4754e\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-28T20:39:58Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-28T20:39:58Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rr62b\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://cd521e9ecb53f581b5732a4d018f92007063ce0912e6d760fe3c920218c23b14\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://cd521e9ecb53f581b5732a4d018f92007063ce0912e6d760fe3c920218c23b14\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-28T20:39:59Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\"
:\\\"2026-01-28T20:39:59Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rr62b\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-28T20:39:53Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-ht6hp\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-28T20:40:46Z is after 2025-08-24T17:21:41Z" Jan 28 20:40:46 crc kubenswrapper[4746]: I0128 20:40:46.436601 4746 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-qhpvf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"cdf26de0-b602-4bdf-b492-65b3b6b31434\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-28T20:39:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-28T20:39:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-28T20:40:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-28T20:40:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://9c88cd0b151f23677f83a944570f36a7a6edd30d9be47638a17cbb9098ec1c23\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://f80345dfd4de46e278611370e6c337968cb1ba7ed8275457deea5871704f8367\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2026-01-28T20:40:40Z\\\",\\\"message\\\":\\\"2026-01-28T20:39:55+00:00 [cnibincopy] Successfully copied files in /usr/src/multus-cni/rhel9/bin/ to /host/opt/cni/bin/upgrade_529a75ec-ce77-49ca-8f6a-9466be7c980e\\\\n2026-01-28T20:39:55+00:00 [cnibincopy] Successfully moved files in /host/opt/cni/bin/upgrade_529a75ec-ce77-49ca-8f6a-9466be7c980e to /host/opt/cni/bin/\\\\n2026-01-28T20:39:55Z [verbose] multus-daemon started\\\\n2026-01-28T20:39:55Z [verbose] 
Readiness Indicator file check\\\\n2026-01-28T20:40:40Z [error] have you checked that your default network is ready? still waiting for readinessindicatorfile @ /host/run/multus/cni/net.d/10-ovn-kubernetes.conf. pollimmediate error: timed out waiting for the condition\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-01-28T20:39:53Z\\\"}},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-28T20:40:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/
var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jnpsl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-28T20:39:53Z\\\"}}\" for pod \"openshift-multus\"/\"multus-qhpvf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-28T20:40:46Z is after 2025-08-24T17:21:41Z" Jan 28 20:40:46 crc kubenswrapper[4746]: I0128 20:40:46.447227 4746 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-2blg6" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"f60a5487-5012-4cc9-ad94-5dfb4957d74e\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-28T20:40:07Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-28T20:40:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-28T20:40:07Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-28T20:40:07Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-l2www\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-l2www\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-28T20:40:07Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-2blg6\": Internal error occurred: failed calling webhook 
\"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-28T20:40:46Z is after 2025-08-24T17:21:41Z" Jan 28 20:40:46 crc kubenswrapper[4746]: I0128 20:40:46.460819 4746 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"64499f0d-5c10-43a9-b479-9b6c7ef2fb9c\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-28T20:39:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-28T20:39:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-28T20:40:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-28T20:40:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-28T20:39:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://9a304b12b4a1e30cbbbb16aa4f91514f1b1a64c716cf8ac023803a279b310f9c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-28T20:39:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubern
etes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://6af41f85c56be6c24127bec0aee8e02562dbb04bd22e57c6887f6f5e914a7530\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-28T20:39:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://f31e9e80576ec4eb59ec92039601fd94bd323d29493c353f9ec6b9c61187b0d3\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-28T20:39:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://64f8ae1fcf8b14a65196ffa6d5a50db39aa846dc4e6cc0b92ba54d9b5f6913e3\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cl
uster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://56b250a192cfa698ccd7ac8233b75b9acbf2347549c712b47ccc1b89f144aa83\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-01-28T20:39:52Z\\\",\\\"message\\\":\\\"le observer\\\\nW0128 20:39:52.695680 1 builder.go:272] unable to get owner reference (falling back to namespace): pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI0128 20:39:52.695807 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0128 20:39:52.696767 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-2500116620/tls.crt::/tmp/serving-cert-2500116620/tls.key\\\\\\\"\\\\nI0128 20:39:52.900914 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI0128 20:39:52.915714 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI0128 20:39:52.915750 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI0128 20:39:52.916035 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI0128 20:39:52.916045 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI0128 20:39:52.925040 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW0128 20:39:52.925098 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0128 20:39:52.925104 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0128 20:39:52.925109 1 secure_serving.go:69] Use of insecure cipher 
'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW0128 20:39:52.925113 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW0128 20:39:52.925116 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW0128 20:39:52.925119 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nI0128 20:39:52.925416 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nF0128 20:39:52.926926 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-01-28T20:39:47Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":2,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-28T20:40:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://03a94dee0b9663441e0ecec16b121c5be4e5c557f17c929b8edfb0d7ba00c3d5\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-28T20:39:35Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c5808a729190035a73529ac58efb0790f2d9c3b98c5f1a8017389e4ca1738c4c\\\",\\\"image\\\":\\\"quay
.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://c5808a729190035a73529ac58efb0790f2d9c3b98c5f1a8017389e4ca1738c4c\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-28T20:39:33Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-28T20:39:33Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-28T20:39:32Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-28T20:40:46Z is after 2025-08-24T17:21:41Z" Jan 28 20:40:46 crc kubenswrapper[4746]: I0128 20:40:46.474361 4746 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-28T20:39:54Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-28T20:39:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-28T20:39:54Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://2f64131d62eca6c3a42e9177d14fa39ed6435e536f60ef5d25ff6bf061d2cc0a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-28T20:39:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: 
x509: certificate has expired or is not yet valid: current time 2026-01-28T20:40:46Z is after 2025-08-24T17:21:41Z" Jan 28 20:40:46 crc kubenswrapper[4746]: I0128 20:40:46.488142 4746 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-28T20:39:57Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-28T20:39:57Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-28T20:39:57Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://83a275ec9868f2fd9e24135d5a05c27caf843340c144070952291bb6a3715035\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-28T20:39:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\
\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-28T20:40:46Z is after 2025-08-24T17:21:41Z" Jan 28 20:40:46 crc kubenswrapper[4746]: I0128 20:40:46.489016 4746 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 28 20:40:46 crc kubenswrapper[4746]: I0128 20:40:46.489051 4746 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 28 20:40:46 crc kubenswrapper[4746]: I0128 20:40:46.489060 4746 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 28 20:40:46 crc kubenswrapper[4746]: I0128 20:40:46.489074 4746 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 28 20:40:46 crc kubenswrapper[4746]: I0128 20:40:46.489106 4746 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-28T20:40:46Z","lastTransitionTime":"2026-01-28T20:40:46Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 28 20:40:46 crc kubenswrapper[4746]: I0128 20:40:46.501649 4746 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-6wrnw" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"6dc8b546-9734-4082-b2b3-2bafe3f1564d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-28T20:39:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-28T20:39:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-28T20:39:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-28T20:39:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://50ec1ea913ec63a096032bd968bda5c5c8879e16a08e6a57727750517d9af038\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-28T20:39:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"}
,{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-z84f9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://66d250466155027405bf90c4e5ed09388238d4cee63604e28486a16778f9d188\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-28T20:39:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-z84f9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-28T20:39:53Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-6wrnw\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-28T20:40:46Z is after 2025-08-24T17:21:41Z" Jan 28 20:40:46 crc kubenswrapper[4746]: I0128 20:40:46.518017 4746 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"950d71b3-9b51-43dc-814a-d6a838723f78\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-28T20:39:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-28T20:39:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-28T20:39:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-28T20:39:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-28T20:39:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0411ecf7c85f5397d751adb7d4905977dcbd3d2e169c56ebbe26017192765602\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-28T20:39:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://319cbcaa9981b3c79eb0c8f970f0ad776fb255db10c72c9b6a13469906e9e5ec\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-
art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-28T20:39:33Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://eefda7da35cd9ff176512b5d1224ccadcb5debd0035a49822e82a9757d4a3f91\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-28T20:39:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://145f4f882924f06172df81d94fc9bd683ea1fc04889de5f6cfb5caece8227fd0\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"started
At\\\":\\\"2026-01-28T20:39:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-28T20:39:32Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-28T20:40:46Z is after 2025-08-24T17:21:41Z" Jan 28 20:40:46 crc kubenswrapper[4746]: I0128 20:40:46.532696 4746 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"3553d828-b1a0-4f51-9e70-5f4d25f3ee42\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-28T20:39:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-28T20:39:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-28T20:40:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-28T20:40:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-28T20:39:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://2ba6f8586593282902571354d53a7bdc0945ea8d1970cac1bfc2f8cc4019a4ab\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-28T20:39:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://00d75277c1a1e8fdc34e37a3ae3d697e002007456fde3dae5d49c1c932a0a7f5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha
256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-28T20:39:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://cf6d80884e00b1a12051a7a97148a46fba0e8514a1233a180262392c302db77f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-28T20:39:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a4a7e5a6315ccaaa100eb33d1a23bf2481f4ee8aa1ea076927dec54027d4c4cd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"wait-for-host-port\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"conta
inerID\\\":\\\"cri-o://a4a7e5a6315ccaaa100eb33d1a23bf2481f4ee8aa1ea076927dec54027d4c4cd\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-28T20:39:33Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-28T20:39:33Z\\\"}}}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-28T20:39:32Z\\\"}}\" for pod \"openshift-kube-scheduler\"/\"openshift-kube-scheduler-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-28T20:40:46Z is after 2025-08-24T17:21:41Z" Jan 28 20:40:46 crc kubenswrapper[4746]: I0128 20:40:46.545748 4746 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-d8rwq" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"405572d8-50d2-4d7a-adf0-d8d6adea31ca\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-28T20:39:57Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-28T20:39:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-28T20:39:57Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-28T20:39:57Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://eba
305a172c49420674e97d32590a7b2e4c7c2cd7406257508e0636cf42ea244\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-28T20:39:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mb4h4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-28T20:39:56Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-d8rwq\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-28T20:40:46Z is after 2025-08-24T17:21:41Z" Jan 28 20:40:46 crc kubenswrapper[4746]: I0128 20:40:46.559262 4746 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-28T20:39:52Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-28T20:39:52Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-28T20:40:46Z is after 2025-08-24T17:21:41Z" Jan 28 20:40:46 crc kubenswrapper[4746]: I0128 20:40:46.576032 4746 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-28T20:39:52Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-28T20:39:52Z\\\",\\\"message\\\":\\\"containers with unready status: 
[network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-28T20:40:46Z is after 2025-08-24T17:21:41Z" Jan 28 20:40:46 crc kubenswrapper[4746]: I0128 20:40:46.587353 4746 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-9zvm7" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"de4fb5f0-7c65-4fdb-8389-d2c8462e130b\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-28T20:40:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-28T20:40:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-28T20:40:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-28T20:40:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://972f4c0079d8e04365abd93dc8d63e88dea04f427dbbbcf5e331cdeeff8c855c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-28T20:40:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jvkqz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://d2340a59c423445dd2b32da57d43146dd1cc3
54b737ae1b4d1875a0d40f62188\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-28T20:40:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jvkqz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-28T20:40:05Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-9zvm7\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-28T20:40:46Z is after 2025-08-24T17:21:41Z" Jan 28 20:40:46 crc kubenswrapper[4746]: I0128 20:40:46.591216 4746 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 28 20:40:46 crc kubenswrapper[4746]: I0128 20:40:46.591239 4746 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 28 20:40:46 crc kubenswrapper[4746]: I0128 20:40:46.591249 4746 kubelet_node_status.go:724] "Recording 
event message for node" node="crc" event="NodeHasSufficientPID" Jan 28 20:40:46 crc kubenswrapper[4746]: I0128 20:40:46.591263 4746 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 28 20:40:46 crc kubenswrapper[4746]: I0128 20:40:46.591274 4746 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-28T20:40:46Z","lastTransitionTime":"2026-01-28T20:40:46Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 28 20:40:46 crc kubenswrapper[4746]: I0128 20:40:46.694284 4746 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 28 20:40:46 crc kubenswrapper[4746]: I0128 20:40:46.694360 4746 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 28 20:40:46 crc kubenswrapper[4746]: I0128 20:40:46.694385 4746 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 28 20:40:46 crc kubenswrapper[4746]: I0128 20:40:46.694421 4746 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 28 20:40:46 crc kubenswrapper[4746]: I0128 20:40:46.694446 4746 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-28T20:40:46Z","lastTransitionTime":"2026-01-28T20:40:46Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 28 20:40:46 crc kubenswrapper[4746]: I0128 20:40:46.798821 4746 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 28 20:40:46 crc kubenswrapper[4746]: I0128 20:40:46.798886 4746 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 28 20:40:46 crc kubenswrapper[4746]: I0128 20:40:46.798902 4746 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 28 20:40:46 crc kubenswrapper[4746]: I0128 20:40:46.798924 4746 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 28 20:40:46 crc kubenswrapper[4746]: I0128 20:40:46.798938 4746 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-28T20:40:46Z","lastTransitionTime":"2026-01-28T20:40:46Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 28 20:40:46 crc kubenswrapper[4746]: I0128 20:40:46.835721 4746 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Jan 28 20:40:46 crc kubenswrapper[4746]: E0128 20:40:46.835870 4746 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5"
Jan 28 20:40:46 crc kubenswrapper[4746]: I0128 20:40:46.836172 4746 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-2blg6"
Jan 28 20:40:46 crc kubenswrapper[4746]: E0128 20:40:46.836250 4746 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-2blg6" podUID="f60a5487-5012-4cc9-ad94-5dfb4957d74e"
Jan 28 20:40:46 crc kubenswrapper[4746]: I0128 20:40:46.836474 4746 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c"
Jan 28 20:40:46 crc kubenswrapper[4746]: E0128 20:40:46.836782 4746 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447"
Jan 28 20:40:46 crc kubenswrapper[4746]: I0128 20:40:46.852889 4746 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2026-01-11 15:55:32.114944855 +0000 UTC
Jan 28 20:40:46 crc kubenswrapper[4746]: I0128 20:40:46.903622 4746 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Jan 28 20:40:46 crc kubenswrapper[4746]: I0128 20:40:46.903682 4746 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Jan 28 20:40:46 crc kubenswrapper[4746]: I0128 20:40:46.903697 4746 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Jan 28 20:40:46 crc kubenswrapper[4746]: I0128 20:40:46.903721 4746 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Jan 28 20:40:46 crc kubenswrapper[4746]: I0128 20:40:46.903735 4746 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-28T20:40:46Z","lastTransitionTime":"2026-01-28T20:40:46Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Jan 28 20:40:47 crc kubenswrapper[4746]: I0128 20:40:47.006993 4746 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Jan 28 20:40:47 crc kubenswrapper[4746]: I0128 20:40:47.007058 4746 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Jan 28 20:40:47 crc kubenswrapper[4746]: I0128 20:40:47.007074 4746 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Jan 28 20:40:47 crc kubenswrapper[4746]: I0128 20:40:47.007131 4746 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Jan 28 20:40:47 crc kubenswrapper[4746]: I0128 20:40:47.007150 4746 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-28T20:40:47Z","lastTransitionTime":"2026-01-28T20:40:47Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Jan 28 20:40:47 crc kubenswrapper[4746]: I0128 20:40:47.110677 4746 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Jan 28 20:40:47 crc kubenswrapper[4746]: I0128 20:40:47.110742 4746 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Jan 28 20:40:47 crc kubenswrapper[4746]: I0128 20:40:47.110761 4746 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Jan 28 20:40:47 crc kubenswrapper[4746]: I0128 20:40:47.110787 4746 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Jan 28 20:40:47 crc kubenswrapper[4746]: I0128 20:40:47.110804 4746 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-28T20:40:47Z","lastTransitionTime":"2026-01-28T20:40:47Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Jan 28 20:40:47 crc kubenswrapper[4746]: I0128 20:40:47.215182 4746 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Jan 28 20:40:47 crc kubenswrapper[4746]: I0128 20:40:47.215257 4746 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Jan 28 20:40:47 crc kubenswrapper[4746]: I0128 20:40:47.215277 4746 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Jan 28 20:40:47 crc kubenswrapper[4746]: I0128 20:40:47.215308 4746 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Jan 28 20:40:47 crc kubenswrapper[4746]: I0128 20:40:47.215329 4746 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-28T20:40:47Z","lastTransitionTime":"2026-01-28T20:40:47Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Jan 28 20:40:47 crc kubenswrapper[4746]: I0128 20:40:47.318545 4746 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Jan 28 20:40:47 crc kubenswrapper[4746]: I0128 20:40:47.318587 4746 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Jan 28 20:40:47 crc kubenswrapper[4746]: I0128 20:40:47.318597 4746 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Jan 28 20:40:47 crc kubenswrapper[4746]: I0128 20:40:47.318617 4746 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Jan 28 20:40:47 crc kubenswrapper[4746]: I0128 20:40:47.318627 4746 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-28T20:40:47Z","lastTransitionTime":"2026-01-28T20:40:47Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Jan 28 20:40:47 crc kubenswrapper[4746]: I0128 20:40:47.421754 4746 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Jan 28 20:40:47 crc kubenswrapper[4746]: I0128 20:40:47.421795 4746 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Jan 28 20:40:47 crc kubenswrapper[4746]: I0128 20:40:47.421804 4746 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Jan 28 20:40:47 crc kubenswrapper[4746]: I0128 20:40:47.421820 4746 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Jan 28 20:40:47 crc kubenswrapper[4746]: I0128 20:40:47.421830 4746 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-28T20:40:47Z","lastTransitionTime":"2026-01-28T20:40:47Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Jan 28 20:40:47 crc kubenswrapper[4746]: I0128 20:40:47.526537 4746 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Jan 28 20:40:47 crc kubenswrapper[4746]: I0128 20:40:47.526608 4746 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Jan 28 20:40:47 crc kubenswrapper[4746]: I0128 20:40:47.526627 4746 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Jan 28 20:40:47 crc kubenswrapper[4746]: I0128 20:40:47.526657 4746 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Jan 28 20:40:47 crc kubenswrapper[4746]: I0128 20:40:47.526680 4746 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-28T20:40:47Z","lastTransitionTime":"2026-01-28T20:40:47Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Jan 28 20:40:47 crc kubenswrapper[4746]: I0128 20:40:47.629949 4746 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Jan 28 20:40:47 crc kubenswrapper[4746]: I0128 20:40:47.629994 4746 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Jan 28 20:40:47 crc kubenswrapper[4746]: I0128 20:40:47.630005 4746 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Jan 28 20:40:47 crc kubenswrapper[4746]: I0128 20:40:47.630023 4746 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Jan 28 20:40:47 crc kubenswrapper[4746]: I0128 20:40:47.630033 4746 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-28T20:40:47Z","lastTransitionTime":"2026-01-28T20:40:47Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Jan 28 20:40:47 crc kubenswrapper[4746]: I0128 20:40:47.735065 4746 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Jan 28 20:40:47 crc kubenswrapper[4746]: I0128 20:40:47.735183 4746 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Jan 28 20:40:47 crc kubenswrapper[4746]: I0128 20:40:47.735210 4746 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Jan 28 20:40:47 crc kubenswrapper[4746]: I0128 20:40:47.735258 4746 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Jan 28 20:40:47 crc kubenswrapper[4746]: I0128 20:40:47.735290 4746 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-28T20:40:47Z","lastTransitionTime":"2026-01-28T20:40:47Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Jan 28 20:40:47 crc kubenswrapper[4746]: I0128 20:40:47.835137 4746 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g"
Jan 28 20:40:47 crc kubenswrapper[4746]: E0128 20:40:47.835377 4746 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8"
Jan 28 20:40:47 crc kubenswrapper[4746]: I0128 20:40:47.838323 4746 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Jan 28 20:40:47 crc kubenswrapper[4746]: I0128 20:40:47.838380 4746 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Jan 28 20:40:47 crc kubenswrapper[4746]: I0128 20:40:47.838400 4746 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Jan 28 20:40:47 crc kubenswrapper[4746]: I0128 20:40:47.838426 4746 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Jan 28 20:40:47 crc kubenswrapper[4746]: I0128 20:40:47.838448 4746 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-28T20:40:47Z","lastTransitionTime":"2026-01-28T20:40:47Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Jan 28 20:40:47 crc kubenswrapper[4746]: I0128 20:40:47.853683 4746 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2026-01-03 23:51:14.387548662 +0000 UTC
Jan 28 20:40:47 crc kubenswrapper[4746]: I0128 20:40:47.942587 4746 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Jan 28 20:40:47 crc kubenswrapper[4746]: I0128 20:40:47.942674 4746 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Jan 28 20:40:47 crc kubenswrapper[4746]: I0128 20:40:47.942691 4746 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Jan 28 20:40:47 crc kubenswrapper[4746]: I0128 20:40:47.942720 4746 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Jan 28 20:40:47 crc kubenswrapper[4746]: I0128 20:40:47.942738 4746 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-28T20:40:47Z","lastTransitionTime":"2026-01-28T20:40:47Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Jan 28 20:40:48 crc kubenswrapper[4746]: I0128 20:40:48.045567 4746 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Jan 28 20:40:48 crc kubenswrapper[4746]: I0128 20:40:48.045617 4746 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Jan 28 20:40:48 crc kubenswrapper[4746]: I0128 20:40:48.045629 4746 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Jan 28 20:40:48 crc kubenswrapper[4746]: I0128 20:40:48.045649 4746 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Jan 28 20:40:48 crc kubenswrapper[4746]: I0128 20:40:48.045662 4746 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-28T20:40:48Z","lastTransitionTime":"2026-01-28T20:40:48Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Jan 28 20:40:48 crc kubenswrapper[4746]: I0128 20:40:48.150934 4746 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Jan 28 20:40:48 crc kubenswrapper[4746]: I0128 20:40:48.151025 4746 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Jan 28 20:40:48 crc kubenswrapper[4746]: I0128 20:40:48.151047 4746 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Jan 28 20:40:48 crc kubenswrapper[4746]: I0128 20:40:48.151107 4746 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Jan 28 20:40:48 crc kubenswrapper[4746]: I0128 20:40:48.151130 4746 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-28T20:40:48Z","lastTransitionTime":"2026-01-28T20:40:48Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Jan 28 20:40:48 crc kubenswrapper[4746]: I0128 20:40:48.254197 4746 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Jan 28 20:40:48 crc kubenswrapper[4746]: I0128 20:40:48.254253 4746 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Jan 28 20:40:48 crc kubenswrapper[4746]: I0128 20:40:48.254273 4746 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Jan 28 20:40:48 crc kubenswrapper[4746]: I0128 20:40:48.254299 4746 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Jan 28 20:40:48 crc kubenswrapper[4746]: I0128 20:40:48.254316 4746 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-28T20:40:48Z","lastTransitionTime":"2026-01-28T20:40:48Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Jan 28 20:40:48 crc kubenswrapper[4746]: I0128 20:40:48.357034 4746 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Jan 28 20:40:48 crc kubenswrapper[4746]: I0128 20:40:48.357117 4746 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Jan 28 20:40:48 crc kubenswrapper[4746]: I0128 20:40:48.357134 4746 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Jan 28 20:40:48 crc kubenswrapper[4746]: I0128 20:40:48.357160 4746 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Jan 28 20:40:48 crc kubenswrapper[4746]: I0128 20:40:48.357177 4746 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-28T20:40:48Z","lastTransitionTime":"2026-01-28T20:40:48Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Jan 28 20:40:48 crc kubenswrapper[4746]: I0128 20:40:48.460745 4746 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Jan 28 20:40:48 crc kubenswrapper[4746]: I0128 20:40:48.460869 4746 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Jan 28 20:40:48 crc kubenswrapper[4746]: I0128 20:40:48.460899 4746 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Jan 28 20:40:48 crc kubenswrapper[4746]: I0128 20:40:48.460946 4746 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Jan 28 20:40:48 crc kubenswrapper[4746]: I0128 20:40:48.460972 4746 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-28T20:40:48Z","lastTransitionTime":"2026-01-28T20:40:48Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Jan 28 20:40:48 crc kubenswrapper[4746]: I0128 20:40:48.564596 4746 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Jan 28 20:40:48 crc kubenswrapper[4746]: I0128 20:40:48.564660 4746 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Jan 28 20:40:48 crc kubenswrapper[4746]: I0128 20:40:48.564683 4746 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Jan 28 20:40:48 crc kubenswrapper[4746]: I0128 20:40:48.564724 4746 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Jan 28 20:40:48 crc kubenswrapper[4746]: I0128 20:40:48.564747 4746 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-28T20:40:48Z","lastTransitionTime":"2026-01-28T20:40:48Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Jan 28 20:40:48 crc kubenswrapper[4746]: I0128 20:40:48.668258 4746 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Jan 28 20:40:48 crc kubenswrapper[4746]: I0128 20:40:48.668291 4746 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Jan 28 20:40:48 crc kubenswrapper[4746]: I0128 20:40:48.668301 4746 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Jan 28 20:40:48 crc kubenswrapper[4746]: I0128 20:40:48.668314 4746 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Jan 28 20:40:48 crc kubenswrapper[4746]: I0128 20:40:48.668322 4746 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-28T20:40:48Z","lastTransitionTime":"2026-01-28T20:40:48Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Jan 28 20:40:48 crc kubenswrapper[4746]: I0128 20:40:48.772057 4746 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Jan 28 20:40:48 crc kubenswrapper[4746]: I0128 20:40:48.772127 4746 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Jan 28 20:40:48 crc kubenswrapper[4746]: I0128 20:40:48.772137 4746 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Jan 28 20:40:48 crc kubenswrapper[4746]: I0128 20:40:48.772154 4746 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Jan 28 20:40:48 crc kubenswrapper[4746]: I0128 20:40:48.772169 4746 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-28T20:40:48Z","lastTransitionTime":"2026-01-28T20:40:48Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Jan 28 20:40:48 crc kubenswrapper[4746]: I0128 20:40:48.835690 4746 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c"
Jan 28 20:40:48 crc kubenswrapper[4746]: E0128 20:40:48.835845 4746 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447"
Jan 28 20:40:48 crc kubenswrapper[4746]: I0128 20:40:48.835879 4746 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf"
Jan 28 20:40:48 crc kubenswrapper[4746]: I0128 20:40:48.835941 4746 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-2blg6"
Jan 28 20:40:48 crc kubenswrapper[4746]: E0128 20:40:48.836036 4746 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5"
Jan 28 20:40:48 crc kubenswrapper[4746]: E0128 20:40:48.836239 4746 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-2blg6" podUID="f60a5487-5012-4cc9-ad94-5dfb4957d74e"
Jan 28 20:40:48 crc kubenswrapper[4746]: I0128 20:40:48.854129 4746 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2025-11-21 01:29:19.989267969 +0000 UTC
Jan 28 20:40:48 crc kubenswrapper[4746]: I0128 20:40:48.876139 4746 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Jan 28 20:40:48 crc kubenswrapper[4746]: I0128 20:40:48.876213 4746 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Jan 28 20:40:48 crc kubenswrapper[4746]: I0128 20:40:48.876234 4746 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Jan 28 20:40:48 crc kubenswrapper[4746]: I0128 20:40:48.876257 4746 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Jan 28 20:40:48 crc kubenswrapper[4746]: I0128 20:40:48.876277 4746 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-28T20:40:48Z","lastTransitionTime":"2026-01-28T20:40:48Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Jan 28 20:40:48 crc kubenswrapper[4746]: I0128 20:40:48.979178 4746 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Jan 28 20:40:48 crc kubenswrapper[4746]: I0128 20:40:48.979230 4746 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Jan 28 20:40:48 crc kubenswrapper[4746]: I0128 20:40:48.979240 4746 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Jan 28 20:40:48 crc kubenswrapper[4746]: I0128 20:40:48.979259 4746 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Jan 28 20:40:48 crc kubenswrapper[4746]: I0128 20:40:48.979273 4746 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-28T20:40:48Z","lastTransitionTime":"2026-01-28T20:40:48Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Jan 28 20:40:49 crc kubenswrapper[4746]: I0128 20:40:49.082256 4746 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Jan 28 20:40:49 crc kubenswrapper[4746]: I0128 20:40:49.082320 4746 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Jan 28 20:40:49 crc kubenswrapper[4746]: I0128 20:40:49.082340 4746 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Jan 28 20:40:49 crc kubenswrapper[4746]: I0128 20:40:49.082364 4746 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Jan 28 20:40:49 crc kubenswrapper[4746]: I0128 20:40:49.082382 4746 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-28T20:40:49Z","lastTransitionTime":"2026-01-28T20:40:49Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Jan 28 20:40:49 crc kubenswrapper[4746]: I0128 20:40:49.185045 4746 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Jan 28 20:40:49 crc kubenswrapper[4746]: I0128 20:40:49.185135 4746 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Jan 28 20:40:49 crc kubenswrapper[4746]: I0128 20:40:49.185149 4746 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Jan 28 20:40:49 crc kubenswrapper[4746]: I0128 20:40:49.185170 4746 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Jan 28 20:40:49 crc kubenswrapper[4746]: I0128 20:40:49.185183 4746 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-28T20:40:49Z","lastTransitionTime":"2026-01-28T20:40:49Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Jan 28 20:40:49 crc kubenswrapper[4746]: I0128 20:40:49.288479 4746 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Jan 28 20:40:49 crc kubenswrapper[4746]: I0128 20:40:49.288549 4746 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Jan 28 20:40:49 crc kubenswrapper[4746]: I0128 20:40:49.288568 4746 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Jan 28 20:40:49 crc kubenswrapper[4746]: I0128 20:40:49.288602 4746 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Jan 28 20:40:49 crc kubenswrapper[4746]: I0128 20:40:49.288622 4746 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-28T20:40:49Z","lastTransitionTime":"2026-01-28T20:40:49Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Jan 28 20:40:49 crc kubenswrapper[4746]: I0128 20:40:49.391702 4746 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Jan 28 20:40:49 crc kubenswrapper[4746]: I0128 20:40:49.391756 4746 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Jan 28 20:40:49 crc kubenswrapper[4746]: I0128 20:40:49.391767 4746 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Jan 28 20:40:49 crc kubenswrapper[4746]: I0128 20:40:49.391783 4746 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Jan 28 20:40:49 crc kubenswrapper[4746]: I0128 20:40:49.391796 4746 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-28T20:40:49Z","lastTransitionTime":"2026-01-28T20:40:49Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Jan 28 20:40:49 crc kubenswrapper[4746]: I0128 20:40:49.494141 4746 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Jan 28 20:40:49 crc kubenswrapper[4746]: I0128 20:40:49.494199 4746 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Jan 28 20:40:49 crc kubenswrapper[4746]: I0128 20:40:49.494211 4746 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Jan 28 20:40:49 crc kubenswrapper[4746]: I0128 20:40:49.494230 4746 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Jan 28 20:40:49 crc kubenswrapper[4746]: I0128 20:40:49.494242 4746 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-28T20:40:49Z","lastTransitionTime":"2026-01-28T20:40:49Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Jan 28 20:40:49 crc kubenswrapper[4746]: I0128 20:40:49.597116 4746 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Jan 28 20:40:49 crc kubenswrapper[4746]: I0128 20:40:49.597147 4746 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Jan 28 20:40:49 crc kubenswrapper[4746]: I0128 20:40:49.597155 4746 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Jan 28 20:40:49 crc kubenswrapper[4746]: I0128 20:40:49.597172 4746 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Jan 28 20:40:49 crc kubenswrapper[4746]: I0128 20:40:49.597184 4746 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-28T20:40:49Z","lastTransitionTime":"2026-01-28T20:40:49Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Jan 28 20:40:49 crc kubenswrapper[4746]: I0128 20:40:49.700024 4746 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Jan 28 20:40:49 crc kubenswrapper[4746]: I0128 20:40:49.700117 4746 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Jan 28 20:40:49 crc kubenswrapper[4746]: I0128 20:40:49.700135 4746 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Jan 28 20:40:49 crc kubenswrapper[4746]: I0128 20:40:49.700159 4746 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Jan 28 20:40:49 crc kubenswrapper[4746]: I0128 20:40:49.700175 4746 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-28T20:40:49Z","lastTransitionTime":"2026-01-28T20:40:49Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/.
Has your network provider started?"} Jan 28 20:40:49 crc kubenswrapper[4746]: I0128 20:40:49.804907 4746 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 28 20:40:49 crc kubenswrapper[4746]: I0128 20:40:49.804979 4746 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 28 20:40:49 crc kubenswrapper[4746]: I0128 20:40:49.804999 4746 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 28 20:40:49 crc kubenswrapper[4746]: I0128 20:40:49.805025 4746 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 28 20:40:49 crc kubenswrapper[4746]: I0128 20:40:49.805046 4746 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-28T20:40:49Z","lastTransitionTime":"2026-01-28T20:40:49Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 28 20:40:49 crc kubenswrapper[4746]: I0128 20:40:49.835295 4746 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Jan 28 20:40:49 crc kubenswrapper[4746]: E0128 20:40:49.835509 4746 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Jan 28 20:40:49 crc kubenswrapper[4746]: I0128 20:40:49.854886 4746 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2026-01-15 00:26:08.882888492 +0000 UTC Jan 28 20:40:49 crc kubenswrapper[4746]: I0128 20:40:49.909591 4746 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 28 20:40:49 crc kubenswrapper[4746]: I0128 20:40:49.909679 4746 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 28 20:40:49 crc kubenswrapper[4746]: I0128 20:40:49.909697 4746 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 28 20:40:49 crc kubenswrapper[4746]: I0128 20:40:49.909724 4746 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 28 20:40:49 crc kubenswrapper[4746]: I0128 20:40:49.909741 4746 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-28T20:40:49Z","lastTransitionTime":"2026-01-28T20:40:49Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 28 20:40:50 crc kubenswrapper[4746]: I0128 20:40:50.012893 4746 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 28 20:40:50 crc kubenswrapper[4746]: I0128 20:40:50.012992 4746 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 28 20:40:50 crc kubenswrapper[4746]: I0128 20:40:50.013010 4746 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 28 20:40:50 crc kubenswrapper[4746]: I0128 20:40:50.013044 4746 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 28 20:40:50 crc kubenswrapper[4746]: I0128 20:40:50.013066 4746 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-28T20:40:50Z","lastTransitionTime":"2026-01-28T20:40:50Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 28 20:40:50 crc kubenswrapper[4746]: I0128 20:40:50.116048 4746 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 28 20:40:50 crc kubenswrapper[4746]: I0128 20:40:50.116132 4746 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 28 20:40:50 crc kubenswrapper[4746]: I0128 20:40:50.116145 4746 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 28 20:40:50 crc kubenswrapper[4746]: I0128 20:40:50.116163 4746 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 28 20:40:50 crc kubenswrapper[4746]: I0128 20:40:50.116178 4746 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-28T20:40:50Z","lastTransitionTime":"2026-01-28T20:40:50Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 28 20:40:50 crc kubenswrapper[4746]: I0128 20:40:50.219243 4746 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 28 20:40:50 crc kubenswrapper[4746]: I0128 20:40:50.219295 4746 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 28 20:40:50 crc kubenswrapper[4746]: I0128 20:40:50.219306 4746 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 28 20:40:50 crc kubenswrapper[4746]: I0128 20:40:50.219321 4746 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 28 20:40:50 crc kubenswrapper[4746]: I0128 20:40:50.219332 4746 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-28T20:40:50Z","lastTransitionTime":"2026-01-28T20:40:50Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 28 20:40:50 crc kubenswrapper[4746]: I0128 20:40:50.323252 4746 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 28 20:40:50 crc kubenswrapper[4746]: I0128 20:40:50.323307 4746 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 28 20:40:50 crc kubenswrapper[4746]: I0128 20:40:50.323321 4746 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 28 20:40:50 crc kubenswrapper[4746]: I0128 20:40:50.323338 4746 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 28 20:40:50 crc kubenswrapper[4746]: I0128 20:40:50.323357 4746 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-28T20:40:50Z","lastTransitionTime":"2026-01-28T20:40:50Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 28 20:40:50 crc kubenswrapper[4746]: I0128 20:40:50.425744 4746 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 28 20:40:50 crc kubenswrapper[4746]: I0128 20:40:50.425834 4746 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 28 20:40:50 crc kubenswrapper[4746]: I0128 20:40:50.425858 4746 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 28 20:40:50 crc kubenswrapper[4746]: I0128 20:40:50.425893 4746 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 28 20:40:50 crc kubenswrapper[4746]: I0128 20:40:50.425918 4746 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-28T20:40:50Z","lastTransitionTime":"2026-01-28T20:40:50Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 28 20:40:50 crc kubenswrapper[4746]: I0128 20:40:50.531504 4746 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 28 20:40:50 crc kubenswrapper[4746]: I0128 20:40:50.531573 4746 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 28 20:40:50 crc kubenswrapper[4746]: I0128 20:40:50.531591 4746 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 28 20:40:50 crc kubenswrapper[4746]: I0128 20:40:50.531618 4746 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 28 20:40:50 crc kubenswrapper[4746]: I0128 20:40:50.531636 4746 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-28T20:40:50Z","lastTransitionTime":"2026-01-28T20:40:50Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 28 20:40:50 crc kubenswrapper[4746]: I0128 20:40:50.635739 4746 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 28 20:40:50 crc kubenswrapper[4746]: I0128 20:40:50.635792 4746 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 28 20:40:50 crc kubenswrapper[4746]: I0128 20:40:50.635826 4746 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 28 20:40:50 crc kubenswrapper[4746]: I0128 20:40:50.635861 4746 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 28 20:40:50 crc kubenswrapper[4746]: I0128 20:40:50.635873 4746 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-28T20:40:50Z","lastTransitionTime":"2026-01-28T20:40:50Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 28 20:40:50 crc kubenswrapper[4746]: I0128 20:40:50.738726 4746 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 28 20:40:50 crc kubenswrapper[4746]: I0128 20:40:50.738787 4746 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 28 20:40:50 crc kubenswrapper[4746]: I0128 20:40:50.738797 4746 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 28 20:40:50 crc kubenswrapper[4746]: I0128 20:40:50.738822 4746 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 28 20:40:50 crc kubenswrapper[4746]: I0128 20:40:50.738834 4746 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-28T20:40:50Z","lastTransitionTime":"2026-01-28T20:40:50Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 28 20:40:50 crc kubenswrapper[4746]: I0128 20:40:50.744410 4746 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 28 20:40:50 crc kubenswrapper[4746]: I0128 20:40:50.744473 4746 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 28 20:40:50 crc kubenswrapper[4746]: I0128 20:40:50.744491 4746 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 28 20:40:50 crc kubenswrapper[4746]: I0128 20:40:50.744550 4746 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 28 20:40:50 crc kubenswrapper[4746]: I0128 20:40:50.744564 4746 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-28T20:40:50Z","lastTransitionTime":"2026-01-28T20:40:50Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 28 20:40:50 crc kubenswrapper[4746]: E0128 20:40:50.758625 4746 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-01-28T20:40:50Z\\\",\\\"lastTransitionTime\\\":\\\"2026-01-28T20:40:50Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-01-28T20:40:50Z\\\",\\\"lastTransitionTime\\\":\\\"2026-01-28T20:40:50Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-01-28T20:40:50Z\\\",\\\"lastTransitionTime\\\":\\\"2026-01-28T20:40:50Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-01-28T20:40:50Z\\\",\\\"lastTransitionTime\\\":\\\"2026-01-28T20:40:50Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"349dd4ef-f3ea-4c41-bfa2-75ea02498ab0\\\",\\\"systemUUID\\\":\\\"e89ecf32-8beb-4b41-b6df-0f1293ce0213\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-28T20:40:50Z is after 2025-08-24T17:21:41Z"
Jan 28 20:40:50 crc kubenswrapper[4746]: I0128 20:40:50.762268 4746 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Jan 28 20:40:50 crc kubenswrapper[4746]: I0128 20:40:50.762313 4746 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Jan 28 20:40:50 crc kubenswrapper[4746]: I0128 20:40:50.762325 4746 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Jan 28 20:40:50 crc kubenswrapper[4746]: I0128 20:40:50.762343 4746 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Jan 28 20:40:50 crc kubenswrapper[4746]: I0128 20:40:50.762356 4746 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-28T20:40:50Z","lastTransitionTime":"2026-01-28T20:40:50Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 28 20:40:50 crc kubenswrapper[4746]: E0128 20:40:50.775549 4746 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-01-28T20:40:50Z\\\",\\\"lastTransitionTime\\\":\\\"2026-01-28T20:40:50Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-01-28T20:40:50Z\\\",\\\"lastTransitionTime\\\":\\\"2026-01-28T20:40:50Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-01-28T20:40:50Z\\\",\\\"lastTransitionTime\\\":\\\"2026-01-28T20:40:50Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-01-28T20:40:50Z\\\",\\\"lastTransitionTime\\\":\\\"2026-01-28T20:40:50Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"349dd4ef-f3ea-4c41-bfa2-75ea02498ab0\\\",\\\"systemUUID\\\":\\\"e89ecf32-8beb-4b41-b6df-0f1293ce0213\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-28T20:40:50Z is after 2025-08-24T17:21:41Z" Jan 28 20:40:50 crc kubenswrapper[4746]: I0128 20:40:50.780309 4746 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 28 20:40:50 crc kubenswrapper[4746]: I0128 20:40:50.780361 4746 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 28 20:40:50 crc kubenswrapper[4746]: I0128 20:40:50.780374 4746 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 28 20:40:50 crc kubenswrapper[4746]: I0128 20:40:50.780397 4746 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 28 20:40:50 crc kubenswrapper[4746]: I0128 20:40:50.780428 4746 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-28T20:40:50Z","lastTransitionTime":"2026-01-28T20:40:50Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 28 20:40:50 crc kubenswrapper[4746]: E0128 20:40:50.792457 4746 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-01-28T20:40:50Z\\\",\\\"lastTransitionTime\\\":\\\"2026-01-28T20:40:50Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-01-28T20:40:50Z\\\",\\\"lastTransitionTime\\\":\\\"2026-01-28T20:40:50Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-01-28T20:40:50Z\\\",\\\"lastTransitionTime\\\":\\\"2026-01-28T20:40:50Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-01-28T20:40:50Z\\\",\\\"lastTransitionTime\\\":\\\"2026-01-28T20:40:50Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"349dd4ef-f3ea-4c41-bfa2-75ea02498ab0\\\",\\\"systemUUID\\\":\\\"e89ecf32-8beb-4b41-b6df-0f1293ce0213\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-28T20:40:50Z is after 2025-08-24T17:21:41Z" Jan 28 20:40:50 crc kubenswrapper[4746]: I0128 20:40:50.795755 4746 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 28 20:40:50 crc kubenswrapper[4746]: I0128 20:40:50.795824 4746 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 28 20:40:50 crc kubenswrapper[4746]: I0128 20:40:50.795835 4746 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 28 20:40:50 crc kubenswrapper[4746]: I0128 20:40:50.795852 4746 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 28 20:40:50 crc kubenswrapper[4746]: I0128 20:40:50.795864 4746 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-28T20:40:50Z","lastTransitionTime":"2026-01-28T20:40:50Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 28 20:40:50 crc kubenswrapper[4746]: E0128 20:40:50.816579 4746 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-01-28T20:40:50Z\\\",\\\"lastTransitionTime\\\":\\\"2026-01-28T20:40:50Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-01-28T20:40:50Z\\\",\\\"lastTransitionTime\\\":\\\"2026-01-28T20:40:50Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-01-28T20:40:50Z\\\",\\\"lastTransitionTime\\\":\\\"2026-01-28T20:40:50Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-01-28T20:40:50Z\\\",\\\"lastTransitionTime\\\":\\\"2026-01-28T20:40:50Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"349dd4ef-f3ea-4c41-bfa2-75ea02498ab0\\\",\\\"systemUUID\\\":\\\"e89ecf32-8beb-4b41-b6df-0f1293ce0213\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-28T20:40:50Z is after 2025-08-24T17:21:41Z" Jan 28 20:40:50 crc kubenswrapper[4746]: I0128 20:40:50.821404 4746 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 28 20:40:50 crc kubenswrapper[4746]: I0128 20:40:50.821442 4746 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 28 20:40:50 crc kubenswrapper[4746]: I0128 20:40:50.821450 4746 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 28 20:40:50 crc kubenswrapper[4746]: I0128 20:40:50.821467 4746 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 28 20:40:50 crc kubenswrapper[4746]: I0128 20:40:50.821479 4746 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-28T20:40:50Z","lastTransitionTime":"2026-01-28T20:40:50Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 28 20:40:50 crc kubenswrapper[4746]: I0128 20:40:50.835159 4746 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Jan 28 20:40:50 crc kubenswrapper[4746]: I0128 20:40:50.835198 4746 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Jan 28 20:40:50 crc kubenswrapper[4746]: I0128 20:40:50.835322 4746 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-2blg6" Jan 28 20:40:50 crc kubenswrapper[4746]: E0128 20:40:50.835522 4746 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Jan 28 20:40:50 crc kubenswrapper[4746]: E0128 20:40:50.835664 4746 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Jan 28 20:40:50 crc kubenswrapper[4746]: E0128 20:40:50.835764 4746 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-2blg6" podUID="f60a5487-5012-4cc9-ad94-5dfb4957d74e" Jan 28 20:40:50 crc kubenswrapper[4746]: E0128 20:40:50.836125 4746 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-01-28T20:40:50Z\\\",\\\"lastTransitionTime\\\":\\\"2026-01-28T20:40:50Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-01-28T20:40:50Z\\\",\\\"lastTransitionTime\\\":\\\"2026-01-28T20:40:50Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-01-28T20:40:50Z\\\",\\\"lastTransitionTime\\\":\\\"2026-01-28T20:40:50Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-01-28T20:40:50Z\\\",\\\"lastTransitionTime\\\":\\\"2026-01-28T20:40:50Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"349dd4ef-f3ea-4c41-bfa2-75ea02498ab0\\\",\\\"systemUUID\\\":\\\"e89ecf32-8beb-4b41-b6df-0f1293ce0213\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-28T20:40:50Z is after 2025-08-24T17:21:41Z" Jan 28 20:40:50 crc kubenswrapper[4746]: E0128 20:40:50.836342 4746 kubelet_node_status.go:572] "Unable to update node status" err="update node status exceeds retry count" Jan 28 20:40:50 crc kubenswrapper[4746]: I0128 20:40:50.842067 4746 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 28 20:40:50 crc kubenswrapper[4746]: I0128 20:40:50.842269 4746 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 28 20:40:50 crc kubenswrapper[4746]: I0128 20:40:50.842339 4746 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 28 20:40:50 crc kubenswrapper[4746]: I0128 20:40:50.842371 4746 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 28 20:40:50 crc kubenswrapper[4746]: I0128 20:40:50.842404 4746 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-28T20:40:50Z","lastTransitionTime":"2026-01-28T20:40:50Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 28 20:40:50 crc kubenswrapper[4746]: I0128 20:40:50.855121 4746 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2025-11-15 07:28:25.470034774 +0000 UTC Jan 28 20:40:50 crc kubenswrapper[4746]: I0128 20:40:50.946910 4746 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 28 20:40:50 crc kubenswrapper[4746]: I0128 20:40:50.947024 4746 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 28 20:40:50 crc kubenswrapper[4746]: I0128 20:40:50.947035 4746 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 28 20:40:50 crc kubenswrapper[4746]: I0128 20:40:50.947052 4746 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 28 20:40:50 crc kubenswrapper[4746]: I0128 20:40:50.947064 4746 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-28T20:40:50Z","lastTransitionTime":"2026-01-28T20:40:50Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 28 20:40:51 crc kubenswrapper[4746]: I0128 20:40:51.050072 4746 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 28 20:40:51 crc kubenswrapper[4746]: I0128 20:40:51.050141 4746 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 28 20:40:51 crc kubenswrapper[4746]: I0128 20:40:51.050151 4746 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 28 20:40:51 crc kubenswrapper[4746]: I0128 20:40:51.050208 4746 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 28 20:40:51 crc kubenswrapper[4746]: I0128 20:40:51.050223 4746 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-28T20:40:51Z","lastTransitionTime":"2026-01-28T20:40:51Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 28 20:40:51 crc kubenswrapper[4746]: I0128 20:40:51.153559 4746 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 28 20:40:51 crc kubenswrapper[4746]: I0128 20:40:51.153640 4746 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 28 20:40:51 crc kubenswrapper[4746]: I0128 20:40:51.153663 4746 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 28 20:40:51 crc kubenswrapper[4746]: I0128 20:40:51.153695 4746 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 28 20:40:51 crc kubenswrapper[4746]: I0128 20:40:51.153713 4746 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-28T20:40:51Z","lastTransitionTime":"2026-01-28T20:40:51Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 28 20:40:51 crc kubenswrapper[4746]: I0128 20:40:51.257336 4746 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 28 20:40:51 crc kubenswrapper[4746]: I0128 20:40:51.257372 4746 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 28 20:40:51 crc kubenswrapper[4746]: I0128 20:40:51.257380 4746 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 28 20:40:51 crc kubenswrapper[4746]: I0128 20:40:51.257396 4746 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 28 20:40:51 crc kubenswrapper[4746]: I0128 20:40:51.257406 4746 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-28T20:40:51Z","lastTransitionTime":"2026-01-28T20:40:51Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 28 20:40:51 crc kubenswrapper[4746]: I0128 20:40:51.360004 4746 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 28 20:40:51 crc kubenswrapper[4746]: I0128 20:40:51.360101 4746 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 28 20:40:51 crc kubenswrapper[4746]: I0128 20:40:51.360116 4746 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 28 20:40:51 crc kubenswrapper[4746]: I0128 20:40:51.360133 4746 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 28 20:40:51 crc kubenswrapper[4746]: I0128 20:40:51.360143 4746 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-28T20:40:51Z","lastTransitionTime":"2026-01-28T20:40:51Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 28 20:40:51 crc kubenswrapper[4746]: I0128 20:40:51.462569 4746 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 28 20:40:51 crc kubenswrapper[4746]: I0128 20:40:51.462612 4746 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 28 20:40:51 crc kubenswrapper[4746]: I0128 20:40:51.462623 4746 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 28 20:40:51 crc kubenswrapper[4746]: I0128 20:40:51.462640 4746 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 28 20:40:51 crc kubenswrapper[4746]: I0128 20:40:51.462653 4746 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-28T20:40:51Z","lastTransitionTime":"2026-01-28T20:40:51Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 28 20:40:51 crc kubenswrapper[4746]: I0128 20:40:51.566555 4746 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 28 20:40:51 crc kubenswrapper[4746]: I0128 20:40:51.566626 4746 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 28 20:40:51 crc kubenswrapper[4746]: I0128 20:40:51.566645 4746 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 28 20:40:51 crc kubenswrapper[4746]: I0128 20:40:51.566706 4746 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 28 20:40:51 crc kubenswrapper[4746]: I0128 20:40:51.566730 4746 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-28T20:40:51Z","lastTransitionTime":"2026-01-28T20:40:51Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 28 20:40:51 crc kubenswrapper[4746]: I0128 20:40:51.670056 4746 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 28 20:40:51 crc kubenswrapper[4746]: I0128 20:40:51.670131 4746 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 28 20:40:51 crc kubenswrapper[4746]: I0128 20:40:51.670151 4746 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 28 20:40:51 crc kubenswrapper[4746]: I0128 20:40:51.670171 4746 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 28 20:40:51 crc kubenswrapper[4746]: I0128 20:40:51.670187 4746 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-28T20:40:51Z","lastTransitionTime":"2026-01-28T20:40:51Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 28 20:40:51 crc kubenswrapper[4746]: I0128 20:40:51.773521 4746 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 28 20:40:51 crc kubenswrapper[4746]: I0128 20:40:51.773617 4746 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 28 20:40:51 crc kubenswrapper[4746]: I0128 20:40:51.773665 4746 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 28 20:40:51 crc kubenswrapper[4746]: I0128 20:40:51.773707 4746 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 28 20:40:51 crc kubenswrapper[4746]: I0128 20:40:51.773737 4746 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-28T20:40:51Z","lastTransitionTime":"2026-01-28T20:40:51Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 28 20:40:51 crc kubenswrapper[4746]: I0128 20:40:51.835330 4746 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Jan 28 20:40:51 crc kubenswrapper[4746]: E0128 20:40:51.835519 4746 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Jan 28 20:40:51 crc kubenswrapper[4746]: I0128 20:40:51.855645 4746 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2025-12-11 20:41:16.605977846 +0000 UTC Jan 28 20:40:51 crc kubenswrapper[4746]: I0128 20:40:51.876868 4746 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 28 20:40:51 crc kubenswrapper[4746]: I0128 20:40:51.876905 4746 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 28 20:40:51 crc kubenswrapper[4746]: I0128 20:40:51.876914 4746 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 28 20:40:51 crc kubenswrapper[4746]: I0128 20:40:51.876926 4746 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 28 20:40:51 crc kubenswrapper[4746]: I0128 20:40:51.876935 4746 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-28T20:40:51Z","lastTransitionTime":"2026-01-28T20:40:51Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 28 20:40:51 crc kubenswrapper[4746]: I0128 20:40:51.979506 4746 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 28 20:40:51 crc kubenswrapper[4746]: I0128 20:40:51.979545 4746 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 28 20:40:51 crc kubenswrapper[4746]: I0128 20:40:51.979556 4746 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 28 20:40:51 crc kubenswrapper[4746]: I0128 20:40:51.979573 4746 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 28 20:40:51 crc kubenswrapper[4746]: I0128 20:40:51.979583 4746 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-28T20:40:51Z","lastTransitionTime":"2026-01-28T20:40:51Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 28 20:40:52 crc kubenswrapper[4746]: I0128 20:40:52.082101 4746 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 28 20:40:52 crc kubenswrapper[4746]: I0128 20:40:52.082134 4746 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 28 20:40:52 crc kubenswrapper[4746]: I0128 20:40:52.082142 4746 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 28 20:40:52 crc kubenswrapper[4746]: I0128 20:40:52.082155 4746 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 28 20:40:52 crc kubenswrapper[4746]: I0128 20:40:52.082165 4746 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-28T20:40:52Z","lastTransitionTime":"2026-01-28T20:40:52Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 28 20:40:52 crc kubenswrapper[4746]: I0128 20:40:52.184656 4746 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 28 20:40:52 crc kubenswrapper[4746]: I0128 20:40:52.184739 4746 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 28 20:40:52 crc kubenswrapper[4746]: I0128 20:40:52.184761 4746 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 28 20:40:52 crc kubenswrapper[4746]: I0128 20:40:52.184790 4746 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 28 20:40:52 crc kubenswrapper[4746]: I0128 20:40:52.184809 4746 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-28T20:40:52Z","lastTransitionTime":"2026-01-28T20:40:52Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 28 20:40:52 crc kubenswrapper[4746]: I0128 20:40:52.287991 4746 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 28 20:40:52 crc kubenswrapper[4746]: I0128 20:40:52.288048 4746 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 28 20:40:52 crc kubenswrapper[4746]: I0128 20:40:52.288057 4746 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 28 20:40:52 crc kubenswrapper[4746]: I0128 20:40:52.288115 4746 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 28 20:40:52 crc kubenswrapper[4746]: I0128 20:40:52.288128 4746 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-28T20:40:52Z","lastTransitionTime":"2026-01-28T20:40:52Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 28 20:40:52 crc kubenswrapper[4746]: I0128 20:40:52.391035 4746 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 28 20:40:52 crc kubenswrapper[4746]: I0128 20:40:52.391102 4746 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 28 20:40:52 crc kubenswrapper[4746]: I0128 20:40:52.391112 4746 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 28 20:40:52 crc kubenswrapper[4746]: I0128 20:40:52.391134 4746 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 28 20:40:52 crc kubenswrapper[4746]: I0128 20:40:52.391151 4746 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-28T20:40:52Z","lastTransitionTime":"2026-01-28T20:40:52Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 28 20:40:52 crc kubenswrapper[4746]: I0128 20:40:52.494861 4746 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 28 20:40:52 crc kubenswrapper[4746]: I0128 20:40:52.494912 4746 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 28 20:40:52 crc kubenswrapper[4746]: I0128 20:40:52.494924 4746 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 28 20:40:52 crc kubenswrapper[4746]: I0128 20:40:52.494946 4746 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 28 20:40:52 crc kubenswrapper[4746]: I0128 20:40:52.494976 4746 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-28T20:40:52Z","lastTransitionTime":"2026-01-28T20:40:52Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 28 20:40:52 crc kubenswrapper[4746]: I0128 20:40:52.598339 4746 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 28 20:40:52 crc kubenswrapper[4746]: I0128 20:40:52.598387 4746 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 28 20:40:52 crc kubenswrapper[4746]: I0128 20:40:52.598404 4746 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 28 20:40:52 crc kubenswrapper[4746]: I0128 20:40:52.598424 4746 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 28 20:40:52 crc kubenswrapper[4746]: I0128 20:40:52.598436 4746 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-28T20:40:52Z","lastTransitionTime":"2026-01-28T20:40:52Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 28 20:40:52 crc kubenswrapper[4746]: I0128 20:40:52.701166 4746 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 28 20:40:52 crc kubenswrapper[4746]: I0128 20:40:52.701197 4746 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 28 20:40:52 crc kubenswrapper[4746]: I0128 20:40:52.701208 4746 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 28 20:40:52 crc kubenswrapper[4746]: I0128 20:40:52.701221 4746 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 28 20:40:52 crc kubenswrapper[4746]: I0128 20:40:52.701231 4746 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-28T20:40:52Z","lastTransitionTime":"2026-01-28T20:40:52Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 28 20:40:52 crc kubenswrapper[4746]: I0128 20:40:52.803906 4746 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 28 20:40:52 crc kubenswrapper[4746]: I0128 20:40:52.803945 4746 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 28 20:40:52 crc kubenswrapper[4746]: I0128 20:40:52.803953 4746 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 28 20:40:52 crc kubenswrapper[4746]: I0128 20:40:52.803967 4746 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 28 20:40:52 crc kubenswrapper[4746]: I0128 20:40:52.803977 4746 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-28T20:40:52Z","lastTransitionTime":"2026-01-28T20:40:52Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 28 20:40:52 crc kubenswrapper[4746]: I0128 20:40:52.835836 4746 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-2blg6" Jan 28 20:40:52 crc kubenswrapper[4746]: I0128 20:40:52.836034 4746 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Jan 28 20:40:52 crc kubenswrapper[4746]: E0128 20:40:52.836158 4746 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Jan 28 20:40:52 crc kubenswrapper[4746]: I0128 20:40:52.836374 4746 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Jan 28 20:40:52 crc kubenswrapper[4746]: E0128 20:40:52.836485 4746 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-2blg6" podUID="f60a5487-5012-4cc9-ad94-5dfb4957d74e" Jan 28 20:40:52 crc kubenswrapper[4746]: E0128 20:40:52.836726 4746 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Jan 28 20:40:52 crc kubenswrapper[4746]: I0128 20:40:52.850037 4746 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-machine-config-operator/kube-rbac-proxy-crio-crc"] Jan 28 20:40:52 crc kubenswrapper[4746]: I0128 20:40:52.855659 4746 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-28T20:39:52Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-28T20:39:52Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-28T20:40:52Z is after 2025-08-24T17:21:41Z" Jan 28 20:40:52 crc kubenswrapper[4746]: I0128 20:40:52.855910 4746 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2025-11-27 17:30:19.376989006 +0000 UTC Jan 28 20:40:52 crc kubenswrapper[4746]: I0128 20:40:52.874770 4746 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-28T20:39:52Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: 
[network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-28T20:39:52Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-28T20:40:52Z is after 2025-08-24T17:21:41Z" Jan 28 20:40:52 crc kubenswrapper[4746]: I0128 20:40:52.890430 4746 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-9zvm7" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"de4fb5f0-7c65-4fdb-8389-d2c8462e130b\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-28T20:40:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-28T20:40:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-28T20:40:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-28T20:40:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://972f4c0079d8e04365abd93dc8d63e88dea04f427dbbbcf5e331cdeeff8c855c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-28T20:40:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jvkqz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://d2340a59c423445dd2b32da57d43146dd1cc3
54b737ae1b4d1875a0d40f62188\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-28T20:40:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jvkqz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-28T20:40:05Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-9zvm7\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-28T20:40:52Z is after 2025-08-24T17:21:41Z" Jan 28 20:40:52 crc kubenswrapper[4746]: I0128 20:40:52.906747 4746 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 28 20:40:52 crc kubenswrapper[4746]: I0128 20:40:52.906783 4746 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 28 20:40:52 crc kubenswrapper[4746]: I0128 20:40:52.906793 4746 kubelet_node_status.go:724] "Recording 
event message for node" node="crc" event="NodeHasSufficientPID" Jan 28 20:40:52 crc kubenswrapper[4746]: I0128 20:40:52.906811 4746 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 28 20:40:52 crc kubenswrapper[4746]: I0128 20:40:52.906822 4746 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-28T20:40:52Z","lastTransitionTime":"2026-01-28T20:40:52Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 28 20:40:52 crc kubenswrapper[4746]: I0128 20:40:52.909203 4746 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-28T20:39:52Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-28T20:39:52Z\\\",\\\"message\\\":\\\"containers with unready status: 
[networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-28T20:40:52Z is after 2025-08-24T17:21:41Z" Jan 28 20:40:52 crc kubenswrapper[4746]: I0128 20:40:52.927524 4746 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-28T20:39:53Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-28T20:39:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-28T20:39:53Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5dff3e2075b8bbfb6af77c72362ccc37e1d7810a2d97c096cf501681c98699ab\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-28T20:39:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://772e165efa6d865a669c172958e0aea3112146637e354e4587b7e664454112ad\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af
0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-28T20:39:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-28T20:40:52Z is after 2025-08-24T17:21:41Z" Jan 28 20:40:52 crc kubenswrapper[4746]: I0128 20:40:52.942427 4746 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-gcrxx" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"1c3c93a1-7ddf-4339-9ca3-79f3753943b4\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-28T20:39:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-28T20:39:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-28T20:39:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-28T20:39:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5bd8d8214aa8e85e50747643b6354fb6e026866c926310fb2dd79760f916f468\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-28T20:39:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wn6hr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podI
Ps\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-28T20:39:52Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-gcrxx\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-28T20:40:52Z is after 2025-08-24T17:21:41Z" Jan 28 20:40:52 crc kubenswrapper[4746]: I0128 20:40:52.968480 4746 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-8vmvh" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"c4d15639-62fb-41b7-a1d4-6f51f3af6d99\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-28T20:39:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-28T20:39:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-28T20:39:53Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-28T20:39:53Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://2637734601bf1e43535b70bd62cc6aea7fc9f62d0d2a33c83602e7065a793be1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-28T20:39:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-kgrqs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://73e82be84afa7be760acda86ebf6c956d558dcbd4f37f049fba84e08c75fcd9f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-28T20:39:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\
\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-kgrqs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://17d09f7f46014da3713f76f111bca0a7ce8a24a154aa6144291d760d85e7d3ec\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-28T20:39:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-kgrqs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://b7ad895d0ff1ad570136d96671e1bde5c62882af50a407945ef0e7bc52727b88\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-28T20:39:54Z\\\"}},\\\"volumeMounts\\\":[{
\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-kgrqs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://af842ba33fdf06048038dd3e12c59f7a9d0867aa28acee1f2cc4571e7db28b88\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-28T20:39:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-kgrqs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://57a776a1daf1b17eb44d9fa09fe2f7ec01f647aa0a40d0d571736d2f7c2f8e4d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\"
,\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-28T20:39:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-kgrqs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://dac4fcde6d3cb0d62e92dd34133b79787bdd8338dcee58772ed7584830a7cb2c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://dac4fcde6d3cb0d62e92dd34133b79787bdd8338dcee58772ed7584830a7cb2c\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2026-01-28T20:40:44Z\\\",\\\"message\\\":\\\"Spec:ServiceSpec{Ports:[]ServicePort{ServicePort{Name:https-metrics,Protocol:TCP,Port:8443,TargetPort:{0 8443 },NodePort:0,AppProtocol:nil,},},Selector:map[string]string{app: 
catalog-operator,},ClusterIP:10.217.5.204,Type:ClusterIP,ExternalIPs:[],SessionAffinity:None,LoadBalancerIP:,LoadBalancerSourceRanges:[],ExternalName:,ExternalTrafficPolicy:,HealthCheckNodePort:0,PublishNotReadyAddresses:false,SessionAffinityConfig:nil,IPFamilyPolicy:*SingleStack,ClusterIPs:[10.217.5.204],IPFamilies:[IPv4],AllocateLoadBalancerNodePorts:nil,LoadBalancerClass:nil,InternalTrafficPolicy:*Cluster,TrafficDistribution:nil,},Status:ServiceStatus{LoadBalancer:LoadBalancerStatus{Ingress:[]LoadBalancerIngress{},},Conditions:[]Condition{},},}\\\\nI0128 20:40:44.722894 6800 loadbalancer.go:304] Deleted 0 stale LBs for map[string]string{\\\\\\\"k8s.ovn.org/kind\\\\\\\":\\\\\\\"Service\\\\\\\", \\\\\\\"k8s.ovn.org/owner\\\\\\\":\\\\\\\"openshift-operator-lifecycle-manager/olm-operator-metrics\\\\\\\"}\\\\nI0128 20:40:44.724571 6800 ovnkube_controller.go:1292] Config duration recorder: kind/namespace/name service/openshift-ovn-kubernetes/ovn-kubernetes-control-plane. OVN-Kubernetes controller took 0.108725535 seconds. 
No OVN measurement.\\\\nI0128 20:40:44.724576 6800 services_controller.go:360] Finished syncing service olm-operator-metrics on namespace openshift-operator-lifecycle-manager for network=default : 4.634502ms\\\\nI0128 \\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-01-28T20:40:43Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 40s restarting failed container=ovnkube-controller pod=ovnkube-node-8vmvh_openshift-ovn-kubernetes(c4d15639-62fb-41b7-a1d4-6f51f3af6d99)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitc
h\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-kgrqs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://cdd60109dba5f45418dd5bc29c649a50aae8479fa77fc3dc50ca4f7c3042e3fc\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-28T20:39:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-kgrqs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://45e59dfe364d511e17b8264a31157501de3ed23e
7125d79979df83d328242328\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://45e59dfe364d511e17b8264a31157501de3ed23e7125d79979df83d328242328\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-28T20:39:53Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-28T20:39:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-kgrqs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-28T20:39:53Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-8vmvh\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-28T20:40:52Z is after 2025-08-24T17:21:41Z" Jan 28 20:40:52 crc kubenswrapper[4746]: I0128 20:40:52.984695 4746 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-qhpvf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"cdf26de0-b602-4bdf-b492-65b3b6b31434\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-28T20:39:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-28T20:39:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-28T20:40:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-28T20:40:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://9c88cd0b151f23677f83a944570f36a7a6edd30d9be47638a17cbb9098ec1c23\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://f80345dfd4de46e278611370e6c337968cb1ba7ed8275457deea5871704f8367\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2026-01-28T20:40:40Z\\\",\\\"message\\\":\\\"2026-01-28T20:39:55+00:00 [cnibincopy] Successfully copied files in /usr/src/multus-cni/rhel9/bin/ to /host/opt/cni/bin/upgrade_529a75ec-ce77-49ca-8f6a-9466be7c980e\\\\n2026-01-28T20:39:55+00:00 [cnibincopy] Successfully moved files in /host/opt/cni/bin/upgrade_529a75ec-ce77-49ca-8f6a-9466be7c980e to /host/opt/cni/bin/\\\\n2026-01-28T20:39:55Z [verbose] multus-daemon started\\\\n2026-01-28T20:39:55Z [verbose] 
Readiness Indicator file check\\\\n2026-01-28T20:40:40Z [error] have you checked that your default network is ready? still waiting for readinessindicatorfile @ /host/run/multus/cni/net.d/10-ovn-kubernetes.conf. pollimmediate error: timed out waiting for the condition\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-01-28T20:39:53Z\\\"}},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-28T20:40:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/
var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jnpsl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-28T20:39:53Z\\\"}}\" for pod \"openshift-multus\"/\"multus-qhpvf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-28T20:40:52Z is after 2025-08-24T17:21:41Z" Jan 28 20:40:52 crc kubenswrapper[4746]: I0128 20:40:52.997549 4746 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-2blg6" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"f60a5487-5012-4cc9-ad94-5dfb4957d74e\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-28T20:40:07Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-28T20:40:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-28T20:40:07Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-28T20:40:07Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-l2www\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-l2www\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-28T20:40:07Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-2blg6\": Internal error occurred: failed calling webhook 
\"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-28T20:40:52Z is after 2025-08-24T17:21:41Z" Jan 28 20:40:53 crc kubenswrapper[4746]: I0128 20:40:53.009706 4746 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 28 20:40:53 crc kubenswrapper[4746]: I0128 20:40:53.009741 4746 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 28 20:40:53 crc kubenswrapper[4746]: I0128 20:40:53.009752 4746 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 28 20:40:53 crc kubenswrapper[4746]: I0128 20:40:53.009768 4746 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 28 20:40:53 crc kubenswrapper[4746]: I0128 20:40:53.009779 4746 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-28T20:40:53Z","lastTransitionTime":"2026-01-28T20:40:53Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 28 20:40:53 crc kubenswrapper[4746]: I0128 20:40:53.014788 4746 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"64499f0d-5c10-43a9-b479-9b6c7ef2fb9c\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-28T20:39:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-28T20:39:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-28T20:40:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-28T20:40:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-28T20:39:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://9a304b12b4a1e30cbbbb16aa4f91514f1b1a64c716cf8ac023803a279b310f9c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-28T20:39:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-d
ir\\\"}]},{\\\"containerID\\\":\\\"cri-o://6af41f85c56be6c24127bec0aee8e02562dbb04bd22e57c6887f6f5e914a7530\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-28T20:39:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://f31e9e80576ec4eb59ec92039601fd94bd323d29493c353f9ec6b9c61187b0d3\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-28T20:39:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://64f8ae1fcf8b14a65196ffa6d5a50db39aa846dc4e6cc0b92ba54d9b5f6913e3\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945
c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://56b250a192cfa698ccd7ac8233b75b9acbf2347549c712b47ccc1b89f144aa83\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-01-28T20:39:52Z\\\",\\\"message\\\":\\\"le observer\\\\nW0128 20:39:52.695680 1 builder.go:272] unable to get owner reference (falling back to namespace): pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI0128 20:39:52.695807 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0128 20:39:52.696767 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-2500116620/tls.crt::/tmp/serving-cert-2500116620/tls.key\\\\\\\"\\\\nI0128 20:39:52.900914 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI0128 20:39:52.915714 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI0128 20:39:52.915750 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI0128 20:39:52.916035 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI0128 20:39:52.916045 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI0128 20:39:52.925040 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW0128 20:39:52.925098 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0128 20:39:52.925104 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0128 20:39:52.925109 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW0128 20:39:52.925113 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW0128 20:39:52.925116 1 secure_serving.go:69] Use of insecure cipher 
'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW0128 20:39:52.925119 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nI0128 20:39:52.925416 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nF0128 20:39:52.926926 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-01-28T20:39:47Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":2,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-28T20:40:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://03a94dee0b9663441e0ecec16b121c5be4e5c557f17c929b8edfb0d7ba00c3d5\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-28T20:39:35Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c5808a729190035a73529ac58efb0790f2d9c3b98c5f1a8017389e4ca1738c4c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b33
5e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://c5808a729190035a73529ac58efb0790f2d9c3b98c5f1a8017389e4ca1738c4c\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-28T20:39:33Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-28T20:39:33Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-28T20:39:32Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-28T20:40:53Z is after 2025-08-24T17:21:41Z" Jan 28 20:40:53 crc kubenswrapper[4746]: I0128 20:40:53.028511 4746 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-28T20:39:54Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-28T20:39:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-28T20:39:54Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://2f64131d62eca6c3a42e9177d14fa39ed6435e536f60ef5d25ff6bf061d2cc0a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-28T20:39:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: 
x509: certificate has expired or is not yet valid: current time 2026-01-28T20:40:53Z is after 2025-08-24T17:21:41Z" Jan 28 20:40:53 crc kubenswrapper[4746]: I0128 20:40:53.041975 4746 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-28T20:39:57Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-28T20:39:57Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-28T20:39:57Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://83a275ec9868f2fd9e24135d5a05c27caf843340c144070952291bb6a3715035\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-28T20:39:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\
\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-28T20:40:53Z is after 2025-08-24T17:21:41Z" Jan 28 20:40:53 crc kubenswrapper[4746]: I0128 20:40:53.053692 4746 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-6wrnw" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"6dc8b546-9734-4082-b2b3-2bafe3f1564d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-28T20:39:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-28T20:39:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-28T20:39:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-28T20:39:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://50ec1ea913ec63a096032bd968bda5c5c8879e16a08e6a57727750517d9af038\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\
":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-28T20:39:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-z84f9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://66d250466155027405bf90c4e5ed09388238d4cee63604e28486a16778f9d188\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-28T20:39:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-z84f9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-28T20:39:53Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-6wrnw\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or 
is not yet valid: current time 2026-01-28T20:40:53Z is after 2025-08-24T17:21:41Z" Jan 28 20:40:53 crc kubenswrapper[4746]: I0128 20:40:53.068918 4746 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-ht6hp" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"beb2b795-6bf4-4d38-89f7-bcb5512c3e61\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-28T20:39:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-28T20:40:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-28T20:40:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-28T20:40:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://564d42c4c25f218c1a1f52449d2f3d941c1dad920c8100d1d908cadf7cf46607\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-28T20:40:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rr62b
\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b2d8bf207108ac2304cd04e34cd9843fcb755ac8610a3f88448538d1e99359f6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b2d8bf207108ac2304cd04e34cd9843fcb755ac8610a3f88448538d1e99359f6\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-28T20:39:54Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-28T20:39:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rr62b\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://40c78d39cb70eba245771f4d85d2c8bc7b77427ea216ad3b89749d24fd550098\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\"
:false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://40c78d39cb70eba245771f4d85d2c8bc7b77427ea216ad3b89749d24fd550098\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-28T20:39:55Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-28T20:39:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rr62b\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://498028ca0bd260ca2c4d7b0231ea54a5a6ff5b302f720fbd209a4f640507258b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://498028ca0bd260ca2c4d7b0231ea54a5a6ff5b302f720fbd209a4f640507258b\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-28T20:39:56Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-28T20:39:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-relea
se\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rr62b\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://661d4a9031b6789f38ed1ac8e3c40d72182709130e1680ecdad86c07c1f34b7b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://661d4a9031b6789f38ed1ac8e3c40d72182709130e1680ecdad86c07c1f34b7b\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-28T20:39:57Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-28T20:39:57Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rr62b\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://a49ffe0bbe4151671dc8803d7ab61cb9b3a2eae064b81e2830bc074787f4754e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-binco
py\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://a49ffe0bbe4151671dc8803d7ab61cb9b3a2eae064b81e2830bc074787f4754e\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-28T20:39:58Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-28T20:39:58Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rr62b\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://cd521e9ecb53f581b5732a4d018f92007063ce0912e6d760fe3c920218c23b14\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://cd521e9ecb53f581b5732a4d018f92007063ce0912e6d760fe3c920218c23b14\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-28T20:39:59Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-28T20:39:59Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rr62b\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"
Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-28T20:39:53Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-ht6hp\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-28T20:40:53Z is after 2025-08-24T17:21:41Z" Jan 28 20:40:53 crc kubenswrapper[4746]: I0128 20:40:53.083174 4746 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"950d71b3-9b51-43dc-814a-d6a838723f78\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-28T20:39:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-28T20:39:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-28T20:39:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-28T20:39:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-28T20:39:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0411ecf7c85f5397d751adb7d4905977dcbd3d2e169c56ebbe26017192765602\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0
f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-28T20:39:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://319cbcaa9981b3c79eb0c8f970f0ad776fb255db10c72c9b6a13469906e9e5ec\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-28T20:39:33Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://eefda7da35cd9ff176512b5d1224ccadcb5debd0035a49822e82a9757d4a3f91\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-28T20:39:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\
\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://145f4f882924f06172df81d94fc9bd683ea1fc04889de5f6cfb5caece8227fd0\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-28T20:39:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-28T20:39:32Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-28T20:40:53Z is after 2025-08-24T17:21:41Z" Jan 28 20:40:53 crc kubenswrapper[4746]: I0128 20:40:53.101133 4746 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"3553d828-b1a0-4f51-9e70-5f4d25f3ee42\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-28T20:39:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-28T20:39:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-28T20:40:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-28T20:40:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-28T20:39:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://2ba6f8586593282902571354d53a7bdc0945ea8d1970cac1bfc2f8cc4019a4ab\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-28T20:39:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://00d75277c1a1e8fdc34e37a3ae3d697e002007456fde3dae5d49c1c932a0a7f5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha
256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-28T20:39:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://cf6d80884e00b1a12051a7a97148a46fba0e8514a1233a180262392c302db77f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-28T20:39:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a4a7e5a6315ccaaa100eb33d1a23bf2481f4ee8aa1ea076927dec54027d4c4cd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"wait-for-host-port\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"conta
inerID\\\":\\\"cri-o://a4a7e5a6315ccaaa100eb33d1a23bf2481f4ee8aa1ea076927dec54027d4c4cd\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-28T20:39:33Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-28T20:39:33Z\\\"}}}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-28T20:39:32Z\\\"}}\" for pod \"openshift-kube-scheduler\"/\"openshift-kube-scheduler-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-28T20:40:53Z is after 2025-08-24T17:21:41Z" Jan 28 20:40:53 crc kubenswrapper[4746]: I0128 20:40:53.112499 4746 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 28 20:40:53 crc kubenswrapper[4746]: I0128 20:40:53.112578 4746 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 28 20:40:53 crc kubenswrapper[4746]: I0128 20:40:53.112598 4746 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 28 20:40:53 crc kubenswrapper[4746]: I0128 20:40:53.112634 4746 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 28 20:40:53 crc kubenswrapper[4746]: I0128 20:40:53.112669 4746 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-28T20:40:53Z","lastTransitionTime":"2026-01-28T20:40:53Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 28 20:40:53 crc kubenswrapper[4746]: I0128 20:40:53.116701 4746 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-d8rwq" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"405572d8-50d2-4d7a-adf0-d8d6adea31ca\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-28T20:39:57Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-28T20:39:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-28T20:39:57Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-28T20:39:57Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://eba305a172c49420674e97d32590a7b2e4c7c2cd7406257508e0636cf42ea244\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-28T20:39:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes
.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mb4h4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-28T20:39:56Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-d8rwq\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-28T20:40:53Z is after 2025-08-24T17:21:41Z" Jan 28 20:40:53 crc kubenswrapper[4746]: I0128 20:40:53.215920 4746 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 28 20:40:53 crc kubenswrapper[4746]: I0128 20:40:53.216013 4746 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 28 20:40:53 crc kubenswrapper[4746]: I0128 20:40:53.216045 4746 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 28 20:40:53 crc kubenswrapper[4746]: I0128 20:40:53.216162 4746 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 28 20:40:53 crc kubenswrapper[4746]: I0128 20:40:53.216195 4746 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-28T20:40:53Z","lastTransitionTime":"2026-01-28T20:40:53Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 28 20:40:53 crc kubenswrapper[4746]: I0128 20:40:53.319248 4746 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 28 20:40:53 crc kubenswrapper[4746]: I0128 20:40:53.319304 4746 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 28 20:40:53 crc kubenswrapper[4746]: I0128 20:40:53.319313 4746 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 28 20:40:53 crc kubenswrapper[4746]: I0128 20:40:53.319331 4746 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 28 20:40:53 crc kubenswrapper[4746]: I0128 20:40:53.319341 4746 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-28T20:40:53Z","lastTransitionTime":"2026-01-28T20:40:53Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 28 20:40:53 crc kubenswrapper[4746]: I0128 20:40:53.423075 4746 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 28 20:40:53 crc kubenswrapper[4746]: I0128 20:40:53.423188 4746 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 28 20:40:53 crc kubenswrapper[4746]: I0128 20:40:53.423210 4746 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 28 20:40:53 crc kubenswrapper[4746]: I0128 20:40:53.423250 4746 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 28 20:40:53 crc kubenswrapper[4746]: I0128 20:40:53.423273 4746 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-28T20:40:53Z","lastTransitionTime":"2026-01-28T20:40:53Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 28 20:40:53 crc kubenswrapper[4746]: I0128 20:40:53.526794 4746 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 28 20:40:53 crc kubenswrapper[4746]: I0128 20:40:53.527352 4746 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 28 20:40:53 crc kubenswrapper[4746]: I0128 20:40:53.527493 4746 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 28 20:40:53 crc kubenswrapper[4746]: I0128 20:40:53.527650 4746 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 28 20:40:53 crc kubenswrapper[4746]: I0128 20:40:53.527787 4746 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-28T20:40:53Z","lastTransitionTime":"2026-01-28T20:40:53Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 28 20:40:53 crc kubenswrapper[4746]: I0128 20:40:53.631653 4746 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 28 20:40:53 crc kubenswrapper[4746]: I0128 20:40:53.631744 4746 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 28 20:40:53 crc kubenswrapper[4746]: I0128 20:40:53.631758 4746 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 28 20:40:53 crc kubenswrapper[4746]: I0128 20:40:53.631779 4746 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 28 20:40:53 crc kubenswrapper[4746]: I0128 20:40:53.631806 4746 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-28T20:40:53Z","lastTransitionTime":"2026-01-28T20:40:53Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 28 20:40:53 crc kubenswrapper[4746]: I0128 20:40:53.735168 4746 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 28 20:40:53 crc kubenswrapper[4746]: I0128 20:40:53.735230 4746 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 28 20:40:53 crc kubenswrapper[4746]: I0128 20:40:53.735246 4746 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 28 20:40:53 crc kubenswrapper[4746]: I0128 20:40:53.735264 4746 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 28 20:40:53 crc kubenswrapper[4746]: I0128 20:40:53.735275 4746 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-28T20:40:53Z","lastTransitionTime":"2026-01-28T20:40:53Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 28 20:40:53 crc kubenswrapper[4746]: I0128 20:40:53.835481 4746 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Jan 28 20:40:53 crc kubenswrapper[4746]: E0128 20:40:53.835713 4746 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Jan 28 20:40:53 crc kubenswrapper[4746]: I0128 20:40:53.837831 4746 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 28 20:40:53 crc kubenswrapper[4746]: I0128 20:40:53.837886 4746 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 28 20:40:53 crc kubenswrapper[4746]: I0128 20:40:53.837904 4746 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 28 20:40:53 crc kubenswrapper[4746]: I0128 20:40:53.837929 4746 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 28 20:40:53 crc kubenswrapper[4746]: I0128 20:40:53.837948 4746 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-28T20:40:53Z","lastTransitionTime":"2026-01-28T20:40:53Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 28 20:40:53 crc kubenswrapper[4746]: I0128 20:40:53.856417 4746 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2026-01-08 20:49:44.084184107 +0000 UTC Jan 28 20:40:53 crc kubenswrapper[4746]: I0128 20:40:53.941142 4746 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 28 20:40:53 crc kubenswrapper[4746]: I0128 20:40:53.941193 4746 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 28 20:40:53 crc kubenswrapper[4746]: I0128 20:40:53.941210 4746 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 28 20:40:53 crc kubenswrapper[4746]: I0128 20:40:53.941232 4746 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 28 20:40:53 crc kubenswrapper[4746]: I0128 20:40:53.941248 4746 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-28T20:40:53Z","lastTransitionTime":"2026-01-28T20:40:53Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 28 20:40:54 crc kubenswrapper[4746]: I0128 20:40:54.044201 4746 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 28 20:40:54 crc kubenswrapper[4746]: I0128 20:40:54.044260 4746 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 28 20:40:54 crc kubenswrapper[4746]: I0128 20:40:54.044275 4746 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 28 20:40:54 crc kubenswrapper[4746]: I0128 20:40:54.044299 4746 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 28 20:40:54 crc kubenswrapper[4746]: I0128 20:40:54.044314 4746 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-28T20:40:54Z","lastTransitionTime":"2026-01-28T20:40:54Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 28 20:40:54 crc kubenswrapper[4746]: I0128 20:40:54.147215 4746 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 28 20:40:54 crc kubenswrapper[4746]: I0128 20:40:54.147296 4746 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 28 20:40:54 crc kubenswrapper[4746]: I0128 20:40:54.147314 4746 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 28 20:40:54 crc kubenswrapper[4746]: I0128 20:40:54.147335 4746 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 28 20:40:54 crc kubenswrapper[4746]: I0128 20:40:54.147350 4746 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-28T20:40:54Z","lastTransitionTime":"2026-01-28T20:40:54Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 28 20:40:54 crc kubenswrapper[4746]: I0128 20:40:54.251184 4746 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 28 20:40:54 crc kubenswrapper[4746]: I0128 20:40:54.251246 4746 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 28 20:40:54 crc kubenswrapper[4746]: I0128 20:40:54.251259 4746 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 28 20:40:54 crc kubenswrapper[4746]: I0128 20:40:54.251280 4746 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 28 20:40:54 crc kubenswrapper[4746]: I0128 20:40:54.251297 4746 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-28T20:40:54Z","lastTransitionTime":"2026-01-28T20:40:54Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 28 20:40:54 crc kubenswrapper[4746]: I0128 20:40:54.354538 4746 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 28 20:40:54 crc kubenswrapper[4746]: I0128 20:40:54.354629 4746 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 28 20:40:54 crc kubenswrapper[4746]: I0128 20:40:54.354652 4746 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 28 20:40:54 crc kubenswrapper[4746]: I0128 20:40:54.354685 4746 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 28 20:40:54 crc kubenswrapper[4746]: I0128 20:40:54.354706 4746 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-28T20:40:54Z","lastTransitionTime":"2026-01-28T20:40:54Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 28 20:40:54 crc kubenswrapper[4746]: I0128 20:40:54.458350 4746 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 28 20:40:54 crc kubenswrapper[4746]: I0128 20:40:54.458449 4746 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 28 20:40:54 crc kubenswrapper[4746]: I0128 20:40:54.458472 4746 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 28 20:40:54 crc kubenswrapper[4746]: I0128 20:40:54.458508 4746 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 28 20:40:54 crc kubenswrapper[4746]: I0128 20:40:54.458543 4746 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-28T20:40:54Z","lastTransitionTime":"2026-01-28T20:40:54Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 28 20:40:54 crc kubenswrapper[4746]: I0128 20:40:54.567322 4746 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 28 20:40:54 crc kubenswrapper[4746]: I0128 20:40:54.567369 4746 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 28 20:40:54 crc kubenswrapper[4746]: I0128 20:40:54.567382 4746 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 28 20:40:54 crc kubenswrapper[4746]: I0128 20:40:54.567406 4746 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 28 20:40:54 crc kubenswrapper[4746]: I0128 20:40:54.567421 4746 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-28T20:40:54Z","lastTransitionTime":"2026-01-28T20:40:54Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 28 20:40:54 crc kubenswrapper[4746]: I0128 20:40:54.670927 4746 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 28 20:40:54 crc kubenswrapper[4746]: I0128 20:40:54.671602 4746 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 28 20:40:54 crc kubenswrapper[4746]: I0128 20:40:54.671771 4746 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 28 20:40:54 crc kubenswrapper[4746]: I0128 20:40:54.671964 4746 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 28 20:40:54 crc kubenswrapper[4746]: I0128 20:40:54.672200 4746 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-28T20:40:54Z","lastTransitionTime":"2026-01-28T20:40:54Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 28 20:40:54 crc kubenswrapper[4746]: I0128 20:40:54.776261 4746 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 28 20:40:54 crc kubenswrapper[4746]: I0128 20:40:54.776351 4746 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 28 20:40:54 crc kubenswrapper[4746]: I0128 20:40:54.776374 4746 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 28 20:40:54 crc kubenswrapper[4746]: I0128 20:40:54.776409 4746 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 28 20:40:54 crc kubenswrapper[4746]: I0128 20:40:54.776435 4746 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-28T20:40:54Z","lastTransitionTime":"2026-01-28T20:40:54Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 28 20:40:54 crc kubenswrapper[4746]: I0128 20:40:54.835944 4746 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Jan 28 20:40:54 crc kubenswrapper[4746]: I0128 20:40:54.836523 4746 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Jan 28 20:40:54 crc kubenswrapper[4746]: I0128 20:40:54.836786 4746 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-multus/network-metrics-daemon-2blg6" Jan 28 20:40:54 crc kubenswrapper[4746]: E0128 20:40:54.836957 4746 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Jan 28 20:40:54 crc kubenswrapper[4746]: E0128 20:40:54.837114 4746 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-2blg6" podUID="f60a5487-5012-4cc9-ad94-5dfb4957d74e" Jan 28 20:40:54 crc kubenswrapper[4746]: E0128 20:40:54.837224 4746 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Jan 28 20:40:54 crc kubenswrapper[4746]: I0128 20:40:54.858101 4746 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2025-11-09 21:55:21.625427026 +0000 UTC Jan 28 20:40:54 crc kubenswrapper[4746]: I0128 20:40:54.888722 4746 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 28 20:40:54 crc kubenswrapper[4746]: I0128 20:40:54.888772 4746 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 28 20:40:54 crc kubenswrapper[4746]: I0128 20:40:54.888788 4746 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 28 20:40:54 crc kubenswrapper[4746]: I0128 20:40:54.888814 4746 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 28 20:40:54 crc kubenswrapper[4746]: I0128 20:40:54.888829 4746 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-28T20:40:54Z","lastTransitionTime":"2026-01-28T20:40:54Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 28 20:40:54 crc kubenswrapper[4746]: I0128 20:40:54.992043 4746 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 28 20:40:54 crc kubenswrapper[4746]: I0128 20:40:54.992103 4746 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 28 20:40:54 crc kubenswrapper[4746]: I0128 20:40:54.992117 4746 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 28 20:40:54 crc kubenswrapper[4746]: I0128 20:40:54.992136 4746 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 28 20:40:54 crc kubenswrapper[4746]: I0128 20:40:54.992148 4746 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-28T20:40:54Z","lastTransitionTime":"2026-01-28T20:40:54Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 28 20:40:55 crc kubenswrapper[4746]: I0128 20:40:55.094950 4746 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 28 20:40:55 crc kubenswrapper[4746]: I0128 20:40:55.094993 4746 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 28 20:40:55 crc kubenswrapper[4746]: I0128 20:40:55.095003 4746 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 28 20:40:55 crc kubenswrapper[4746]: I0128 20:40:55.095016 4746 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 28 20:40:55 crc kubenswrapper[4746]: I0128 20:40:55.095027 4746 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-28T20:40:55Z","lastTransitionTime":"2026-01-28T20:40:55Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 28 20:40:55 crc kubenswrapper[4746]: I0128 20:40:55.198145 4746 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 28 20:40:55 crc kubenswrapper[4746]: I0128 20:40:55.198209 4746 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 28 20:40:55 crc kubenswrapper[4746]: I0128 20:40:55.198219 4746 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 28 20:40:55 crc kubenswrapper[4746]: I0128 20:40:55.198239 4746 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 28 20:40:55 crc kubenswrapper[4746]: I0128 20:40:55.198251 4746 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-28T20:40:55Z","lastTransitionTime":"2026-01-28T20:40:55Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 28 20:40:55 crc kubenswrapper[4746]: I0128 20:40:55.301256 4746 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 28 20:40:55 crc kubenswrapper[4746]: I0128 20:40:55.301336 4746 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 28 20:40:55 crc kubenswrapper[4746]: I0128 20:40:55.301361 4746 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 28 20:40:55 crc kubenswrapper[4746]: I0128 20:40:55.301401 4746 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 28 20:40:55 crc kubenswrapper[4746]: I0128 20:40:55.301468 4746 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-28T20:40:55Z","lastTransitionTime":"2026-01-28T20:40:55Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 28 20:40:55 crc kubenswrapper[4746]: I0128 20:40:55.405422 4746 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 28 20:40:55 crc kubenswrapper[4746]: I0128 20:40:55.405459 4746 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 28 20:40:55 crc kubenswrapper[4746]: I0128 20:40:55.405467 4746 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 28 20:40:55 crc kubenswrapper[4746]: I0128 20:40:55.405482 4746 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 28 20:40:55 crc kubenswrapper[4746]: I0128 20:40:55.405493 4746 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-28T20:40:55Z","lastTransitionTime":"2026-01-28T20:40:55Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 28 20:40:55 crc kubenswrapper[4746]: I0128 20:40:55.508958 4746 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 28 20:40:55 crc kubenswrapper[4746]: I0128 20:40:55.509039 4746 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 28 20:40:55 crc kubenswrapper[4746]: I0128 20:40:55.509057 4746 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 28 20:40:55 crc kubenswrapper[4746]: I0128 20:40:55.509130 4746 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 28 20:40:55 crc kubenswrapper[4746]: I0128 20:40:55.509251 4746 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-28T20:40:55Z","lastTransitionTime":"2026-01-28T20:40:55Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 28 20:40:55 crc kubenswrapper[4746]: I0128 20:40:55.612572 4746 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 28 20:40:55 crc kubenswrapper[4746]: I0128 20:40:55.612639 4746 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 28 20:40:55 crc kubenswrapper[4746]: I0128 20:40:55.612659 4746 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 28 20:40:55 crc kubenswrapper[4746]: I0128 20:40:55.612689 4746 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 28 20:40:55 crc kubenswrapper[4746]: I0128 20:40:55.612708 4746 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-28T20:40:55Z","lastTransitionTime":"2026-01-28T20:40:55Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 28 20:40:55 crc kubenswrapper[4746]: I0128 20:40:55.716649 4746 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 28 20:40:55 crc kubenswrapper[4746]: I0128 20:40:55.716710 4746 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 28 20:40:55 crc kubenswrapper[4746]: I0128 20:40:55.716726 4746 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 28 20:40:55 crc kubenswrapper[4746]: I0128 20:40:55.716753 4746 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 28 20:40:55 crc kubenswrapper[4746]: I0128 20:40:55.716775 4746 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-28T20:40:55Z","lastTransitionTime":"2026-01-28T20:40:55Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 28 20:40:55 crc kubenswrapper[4746]: I0128 20:40:55.820599 4746 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 28 20:40:55 crc kubenswrapper[4746]: I0128 20:40:55.820662 4746 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 28 20:40:55 crc kubenswrapper[4746]: I0128 20:40:55.820683 4746 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 28 20:40:55 crc kubenswrapper[4746]: I0128 20:40:55.820716 4746 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 28 20:40:55 crc kubenswrapper[4746]: I0128 20:40:55.820737 4746 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-28T20:40:55Z","lastTransitionTime":"2026-01-28T20:40:55Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 28 20:40:55 crc kubenswrapper[4746]: I0128 20:40:55.835339 4746 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Jan 28 20:40:55 crc kubenswrapper[4746]: E0128 20:40:55.835525 4746 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Jan 28 20:40:55 crc kubenswrapper[4746]: I0128 20:40:55.859046 4746 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2026-01-15 19:33:58.780575058 +0000 UTC Jan 28 20:40:55 crc kubenswrapper[4746]: I0128 20:40:55.924538 4746 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 28 20:40:55 crc kubenswrapper[4746]: I0128 20:40:55.924626 4746 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 28 20:40:55 crc kubenswrapper[4746]: I0128 20:40:55.924652 4746 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 28 20:40:55 crc kubenswrapper[4746]: I0128 20:40:55.924685 4746 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 28 20:40:55 crc kubenswrapper[4746]: I0128 20:40:55.924705 4746 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-28T20:40:55Z","lastTransitionTime":"2026-01-28T20:40:55Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 28 20:40:56 crc kubenswrapper[4746]: I0128 20:40:56.028722 4746 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 28 20:40:56 crc kubenswrapper[4746]: I0128 20:40:56.028824 4746 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 28 20:40:56 crc kubenswrapper[4746]: I0128 20:40:56.028849 4746 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 28 20:40:56 crc kubenswrapper[4746]: I0128 20:40:56.028886 4746 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 28 20:40:56 crc kubenswrapper[4746]: I0128 20:40:56.028913 4746 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-28T20:40:56Z","lastTransitionTime":"2026-01-28T20:40:56Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 28 20:40:56 crc kubenswrapper[4746]: I0128 20:40:56.132240 4746 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 28 20:40:56 crc kubenswrapper[4746]: I0128 20:40:56.132323 4746 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 28 20:40:56 crc kubenswrapper[4746]: I0128 20:40:56.132342 4746 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 28 20:40:56 crc kubenswrapper[4746]: I0128 20:40:56.132375 4746 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 28 20:40:56 crc kubenswrapper[4746]: I0128 20:40:56.132393 4746 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-28T20:40:56Z","lastTransitionTime":"2026-01-28T20:40:56Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 28 20:40:56 crc kubenswrapper[4746]: I0128 20:40:56.236694 4746 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 28 20:40:56 crc kubenswrapper[4746]: I0128 20:40:56.236837 4746 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 28 20:40:56 crc kubenswrapper[4746]: I0128 20:40:56.236858 4746 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 28 20:40:56 crc kubenswrapper[4746]: I0128 20:40:56.236889 4746 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 28 20:40:56 crc kubenswrapper[4746]: I0128 20:40:56.236961 4746 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-28T20:40:56Z","lastTransitionTime":"2026-01-28T20:40:56Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 28 20:40:56 crc kubenswrapper[4746]: I0128 20:40:56.341169 4746 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 28 20:40:56 crc kubenswrapper[4746]: I0128 20:40:56.341233 4746 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 28 20:40:56 crc kubenswrapper[4746]: I0128 20:40:56.341246 4746 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 28 20:40:56 crc kubenswrapper[4746]: I0128 20:40:56.341275 4746 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 28 20:40:56 crc kubenswrapper[4746]: I0128 20:40:56.341295 4746 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-28T20:40:56Z","lastTransitionTime":"2026-01-28T20:40:56Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 28 20:40:56 crc kubenswrapper[4746]: I0128 20:40:56.445294 4746 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 28 20:40:56 crc kubenswrapper[4746]: I0128 20:40:56.445340 4746 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 28 20:40:56 crc kubenswrapper[4746]: I0128 20:40:56.445351 4746 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 28 20:40:56 crc kubenswrapper[4746]: I0128 20:40:56.445369 4746 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 28 20:40:56 crc kubenswrapper[4746]: I0128 20:40:56.445378 4746 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-28T20:40:56Z","lastTransitionTime":"2026-01-28T20:40:56Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 28 20:40:56 crc kubenswrapper[4746]: I0128 20:40:56.549609 4746 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 28 20:40:56 crc kubenswrapper[4746]: I0128 20:40:56.549709 4746 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 28 20:40:56 crc kubenswrapper[4746]: I0128 20:40:56.549729 4746 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 28 20:40:56 crc kubenswrapper[4746]: I0128 20:40:56.549761 4746 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 28 20:40:56 crc kubenswrapper[4746]: I0128 20:40:56.549782 4746 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-28T20:40:56Z","lastTransitionTime":"2026-01-28T20:40:56Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 28 20:40:56 crc kubenswrapper[4746]: I0128 20:40:56.631468 4746 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Jan 28 20:40:56 crc kubenswrapper[4746]: E0128 20:40:56.631999 4746 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. 
No retries permitted until 2026-01-28 20:42:00.631929566 +0000 UTC m=+148.588115950 (durationBeforeRetry 1m4s). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Jan 28 20:40:56 crc kubenswrapper[4746]: I0128 20:40:56.632321 4746 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-cqllr\" (UniqueName: \"kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr\") pod \"network-check-target-xd92c\" (UID: \"3b6479f0-333b-4a96-9adf-2099afdc2447\") " pod="openshift-network-diagnostics/network-check-target-xd92c" Jan 28 20:40:56 crc kubenswrapper[4746]: I0128 20:40:56.632408 4746 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"networking-console-plugin-cert\" (UniqueName: \"kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Jan 28 20:40:56 crc kubenswrapper[4746]: I0128 20:40:56.632469 4746 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nginx-conf\" (UniqueName: \"kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Jan 28 20:40:56 crc kubenswrapper[4746]: E0128 20:40:56.632638 4746 configmap.go:193] Couldn't get configMap 
openshift-network-console/networking-console-plugin: object "openshift-network-console"/"networking-console-plugin" not registered Jan 28 20:40:56 crc kubenswrapper[4746]: E0128 20:40:56.632678 4746 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Jan 28 20:40:56 crc kubenswrapper[4746]: E0128 20:40:56.632719 4746 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Jan 28 20:40:56 crc kubenswrapper[4746]: E0128 20:40:56.632744 4746 projected.go:194] Error preparing data for projected volume kube-api-access-cqllr for pod openshift-network-diagnostics/network-check-target-xd92c: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Jan 28 20:40:56 crc kubenswrapper[4746]: E0128 20:40:56.632766 4746 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. No retries permitted until 2026-01-28 20:42:00.63273004 +0000 UTC m=+148.588916434 (durationBeforeRetry 1m4s). Error: MountVolume.SetUp failed for volume "nginx-conf" (UniqueName: "kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin" not registered Jan 28 20:40:56 crc kubenswrapper[4746]: E0128 20:40:56.632815 4746 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr podName:3b6479f0-333b-4a96-9adf-2099afdc2447 nodeName:}" failed. 
No retries permitted until 2026-01-28 20:42:00.632796502 +0000 UTC m=+148.588982886 (durationBeforeRetry 1m4s). Error: MountVolume.SetUp failed for volume "kube-api-access-cqllr" (UniqueName: "kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr") pod "network-check-target-xd92c" (UID: "3b6479f0-333b-4a96-9adf-2099afdc2447") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Jan 28 20:40:56 crc kubenswrapper[4746]: E0128 20:40:56.633071 4746 secret.go:188] Couldn't get secret openshift-network-console/networking-console-plugin-cert: object "openshift-network-console"/"networking-console-plugin-cert" not registered Jan 28 20:40:56 crc kubenswrapper[4746]: E0128 20:40:56.633229 4746 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. No retries permitted until 2026-01-28 20:42:00.633195914 +0000 UTC m=+148.589382458 (durationBeforeRetry 1m4s). 
Error: MountVolume.SetUp failed for volume "networking-console-plugin-cert" (UniqueName: "kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin-cert" not registered Jan 28 20:40:56 crc kubenswrapper[4746]: I0128 20:40:56.655151 4746 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 28 20:40:56 crc kubenswrapper[4746]: I0128 20:40:56.655204 4746 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 28 20:40:56 crc kubenswrapper[4746]: I0128 20:40:56.655215 4746 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 28 20:40:56 crc kubenswrapper[4746]: I0128 20:40:56.655233 4746 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 28 20:40:56 crc kubenswrapper[4746]: I0128 20:40:56.655246 4746 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-28T20:40:56Z","lastTransitionTime":"2026-01-28T20:40:56Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 28 20:40:56 crc kubenswrapper[4746]: I0128 20:40:56.734258 4746 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-s2dwl\" (UniqueName: \"kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl\") pod \"network-check-source-55646444c4-trplf\" (UID: \"9d751cbb-f2e2-430d-9754-c882a5e924a5\") " pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Jan 28 20:40:56 crc kubenswrapper[4746]: E0128 20:40:56.734636 4746 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Jan 28 20:40:56 crc kubenswrapper[4746]: E0128 20:40:56.734693 4746 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Jan 28 20:40:56 crc kubenswrapper[4746]: E0128 20:40:56.734713 4746 projected.go:194] Error preparing data for projected volume kube-api-access-s2dwl for pod openshift-network-diagnostics/network-check-source-55646444c4-trplf: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Jan 28 20:40:56 crc kubenswrapper[4746]: E0128 20:40:56.734811 4746 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl podName:9d751cbb-f2e2-430d-9754-c882a5e924a5 nodeName:}" failed. No retries permitted until 2026-01-28 20:42:00.734779111 +0000 UTC m=+148.690965475 (durationBeforeRetry 1m4s). 
Error: MountVolume.SetUp failed for volume "kube-api-access-s2dwl" (UniqueName: "kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl") pod "network-check-source-55646444c4-trplf" (UID: "9d751cbb-f2e2-430d-9754-c882a5e924a5") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Jan 28 20:40:56 crc kubenswrapper[4746]: I0128 20:40:56.758298 4746 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 28 20:40:56 crc kubenswrapper[4746]: I0128 20:40:56.758345 4746 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 28 20:40:56 crc kubenswrapper[4746]: I0128 20:40:56.758362 4746 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 28 20:40:56 crc kubenswrapper[4746]: I0128 20:40:56.758392 4746 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 28 20:40:56 crc kubenswrapper[4746]: I0128 20:40:56.758410 4746 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-28T20:40:56Z","lastTransitionTime":"2026-01-28T20:40:56Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 28 20:40:56 crc kubenswrapper[4746]: I0128 20:40:56.835219 4746 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Jan 28 20:40:56 crc kubenswrapper[4746]: I0128 20:40:56.835324 4746 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Jan 28 20:40:56 crc kubenswrapper[4746]: E0128 20:40:56.835364 4746 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Jan 28 20:40:56 crc kubenswrapper[4746]: I0128 20:40:56.835505 4746 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-2blg6" Jan 28 20:40:56 crc kubenswrapper[4746]: E0128 20:40:56.835586 4746 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Jan 28 20:40:56 crc kubenswrapper[4746]: E0128 20:40:56.835786 4746 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-2blg6" podUID="f60a5487-5012-4cc9-ad94-5dfb4957d74e" Jan 28 20:40:56 crc kubenswrapper[4746]: I0128 20:40:56.859912 4746 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2025-12-25 07:54:42.687010922 +0000 UTC Jan 28 20:40:56 crc kubenswrapper[4746]: I0128 20:40:56.861380 4746 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 28 20:40:56 crc kubenswrapper[4746]: I0128 20:40:56.861419 4746 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 28 20:40:56 crc kubenswrapper[4746]: I0128 20:40:56.861429 4746 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 28 20:40:56 crc kubenswrapper[4746]: I0128 20:40:56.861444 4746 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 28 20:40:56 crc kubenswrapper[4746]: I0128 20:40:56.861456 4746 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-28T20:40:56Z","lastTransitionTime":"2026-01-28T20:40:56Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 28 20:40:56 crc kubenswrapper[4746]: I0128 20:40:56.964468 4746 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 28 20:40:56 crc kubenswrapper[4746]: I0128 20:40:56.964564 4746 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 28 20:40:56 crc kubenswrapper[4746]: I0128 20:40:56.964585 4746 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 28 20:40:56 crc kubenswrapper[4746]: I0128 20:40:56.964622 4746 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 28 20:40:56 crc kubenswrapper[4746]: I0128 20:40:56.964650 4746 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-28T20:40:56Z","lastTransitionTime":"2026-01-28T20:40:56Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 28 20:40:57 crc kubenswrapper[4746]: I0128 20:40:57.077886 4746 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 28 20:40:57 crc kubenswrapper[4746]: I0128 20:40:57.077980 4746 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 28 20:40:57 crc kubenswrapper[4746]: I0128 20:40:57.078006 4746 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 28 20:40:57 crc kubenswrapper[4746]: I0128 20:40:57.078044 4746 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 28 20:40:57 crc kubenswrapper[4746]: I0128 20:40:57.078069 4746 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-28T20:40:57Z","lastTransitionTime":"2026-01-28T20:40:57Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 28 20:40:57 crc kubenswrapper[4746]: I0128 20:40:57.180957 4746 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 28 20:40:57 crc kubenswrapper[4746]: I0128 20:40:57.181043 4746 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 28 20:40:57 crc kubenswrapper[4746]: I0128 20:40:57.181068 4746 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 28 20:40:57 crc kubenswrapper[4746]: I0128 20:40:57.181141 4746 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 28 20:40:57 crc kubenswrapper[4746]: I0128 20:40:57.181164 4746 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-28T20:40:57Z","lastTransitionTime":"2026-01-28T20:40:57Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 28 20:40:57 crc kubenswrapper[4746]: I0128 20:40:57.285405 4746 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 28 20:40:57 crc kubenswrapper[4746]: I0128 20:40:57.285512 4746 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 28 20:40:57 crc kubenswrapper[4746]: I0128 20:40:57.285541 4746 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 28 20:40:57 crc kubenswrapper[4746]: I0128 20:40:57.285577 4746 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 28 20:40:57 crc kubenswrapper[4746]: I0128 20:40:57.285602 4746 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-28T20:40:57Z","lastTransitionTime":"2026-01-28T20:40:57Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 28 20:40:57 crc kubenswrapper[4746]: I0128 20:40:57.389642 4746 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 28 20:40:57 crc kubenswrapper[4746]: I0128 20:40:57.389708 4746 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 28 20:40:57 crc kubenswrapper[4746]: I0128 20:40:57.389726 4746 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 28 20:40:57 crc kubenswrapper[4746]: I0128 20:40:57.389758 4746 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 28 20:40:57 crc kubenswrapper[4746]: I0128 20:40:57.389777 4746 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-28T20:40:57Z","lastTransitionTime":"2026-01-28T20:40:57Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 28 20:40:57 crc kubenswrapper[4746]: I0128 20:40:57.493145 4746 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 28 20:40:57 crc kubenswrapper[4746]: I0128 20:40:57.493243 4746 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 28 20:40:57 crc kubenswrapper[4746]: I0128 20:40:57.493267 4746 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 28 20:40:57 crc kubenswrapper[4746]: I0128 20:40:57.493305 4746 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 28 20:40:57 crc kubenswrapper[4746]: I0128 20:40:57.493336 4746 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-28T20:40:57Z","lastTransitionTime":"2026-01-28T20:40:57Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"}
Jan 28 20:40:57 crc kubenswrapper[4746]: I0128 20:40:57.597124 4746 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Jan 28 20:40:57 crc kubenswrapper[4746]: I0128 20:40:57.597953 4746 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Jan 28 20:40:57 crc kubenswrapper[4746]: I0128 20:40:57.598323 4746 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Jan 28 20:40:57 crc kubenswrapper[4746]: I0128 20:40:57.598500 4746 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Jan 28 20:40:57 crc kubenswrapper[4746]: I0128 20:40:57.598738 4746 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-28T20:40:57Z","lastTransitionTime":"2026-01-28T20:40:57Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Jan 28 20:40:57 crc kubenswrapper[4746]: I0128 20:40:57.702699 4746 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Jan 28 20:40:57 crc kubenswrapper[4746]: I0128 20:40:57.702778 4746 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Jan 28 20:40:57 crc kubenswrapper[4746]: I0128 20:40:57.702801 4746 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Jan 28 20:40:57 crc kubenswrapper[4746]: I0128 20:40:57.702835 4746 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Jan 28 20:40:57 crc kubenswrapper[4746]: I0128 20:40:57.702857 4746 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-28T20:40:57Z","lastTransitionTime":"2026-01-28T20:40:57Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Jan 28 20:40:57 crc kubenswrapper[4746]: I0128 20:40:57.805829 4746 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Jan 28 20:40:57 crc kubenswrapper[4746]: I0128 20:40:57.805894 4746 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Jan 28 20:40:57 crc kubenswrapper[4746]: I0128 20:40:57.805911 4746 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Jan 28 20:40:57 crc kubenswrapper[4746]: I0128 20:40:57.805932 4746 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Jan 28 20:40:57 crc kubenswrapper[4746]: I0128 20:40:57.805947 4746 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-28T20:40:57Z","lastTransitionTime":"2026-01-28T20:40:57Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Jan 28 20:40:57 crc kubenswrapper[4746]: I0128 20:40:57.835452 4746 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g"
Jan 28 20:40:57 crc kubenswrapper[4746]: E0128 20:40:57.835655 4746 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8"
Jan 28 20:40:57 crc kubenswrapper[4746]: I0128 20:40:57.861150 4746 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2025-11-25 11:20:17.306041024 +0000 UTC
Jan 28 20:40:57 crc kubenswrapper[4746]: I0128 20:40:57.909646 4746 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Jan 28 20:40:57 crc kubenswrapper[4746]: I0128 20:40:57.909741 4746 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Jan 28 20:40:57 crc kubenswrapper[4746]: I0128 20:40:57.909762 4746 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Jan 28 20:40:57 crc kubenswrapper[4746]: I0128 20:40:57.909793 4746 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Jan 28 20:40:57 crc kubenswrapper[4746]: I0128 20:40:57.909814 4746 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-28T20:40:57Z","lastTransitionTime":"2026-01-28T20:40:57Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Jan 28 20:40:58 crc kubenswrapper[4746]: I0128 20:40:58.013406 4746 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Jan 28 20:40:58 crc kubenswrapper[4746]: I0128 20:40:58.014030 4746 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Jan 28 20:40:58 crc kubenswrapper[4746]: I0128 20:40:58.014328 4746 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Jan 28 20:40:58 crc kubenswrapper[4746]: I0128 20:40:58.014608 4746 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Jan 28 20:40:58 crc kubenswrapper[4746]: I0128 20:40:58.014884 4746 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-28T20:40:58Z","lastTransitionTime":"2026-01-28T20:40:58Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Jan 28 20:40:58 crc kubenswrapper[4746]: I0128 20:40:58.119543 4746 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Jan 28 20:40:58 crc kubenswrapper[4746]: I0128 20:40:58.119599 4746 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Jan 28 20:40:58 crc kubenswrapper[4746]: I0128 20:40:58.119611 4746 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Jan 28 20:40:58 crc kubenswrapper[4746]: I0128 20:40:58.119633 4746 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Jan 28 20:40:58 crc kubenswrapper[4746]: I0128 20:40:58.119648 4746 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-28T20:40:58Z","lastTransitionTime":"2026-01-28T20:40:58Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Jan 28 20:40:58 crc kubenswrapper[4746]: I0128 20:40:58.222544 4746 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Jan 28 20:40:58 crc kubenswrapper[4746]: I0128 20:40:58.222646 4746 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Jan 28 20:40:58 crc kubenswrapper[4746]: I0128 20:40:58.222675 4746 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Jan 28 20:40:58 crc kubenswrapper[4746]: I0128 20:40:58.222708 4746 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Jan 28 20:40:58 crc kubenswrapper[4746]: I0128 20:40:58.222730 4746 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-28T20:40:58Z","lastTransitionTime":"2026-01-28T20:40:58Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Jan 28 20:40:58 crc kubenswrapper[4746]: I0128 20:40:58.326279 4746 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Jan 28 20:40:58 crc kubenswrapper[4746]: I0128 20:40:58.326355 4746 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Jan 28 20:40:58 crc kubenswrapper[4746]: I0128 20:40:58.326371 4746 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Jan 28 20:40:58 crc kubenswrapper[4746]: I0128 20:40:58.326399 4746 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Jan 28 20:40:58 crc kubenswrapper[4746]: I0128 20:40:58.326417 4746 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-28T20:40:58Z","lastTransitionTime":"2026-01-28T20:40:58Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Jan 28 20:40:58 crc kubenswrapper[4746]: I0128 20:40:58.431137 4746 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Jan 28 20:40:58 crc kubenswrapper[4746]: I0128 20:40:58.431322 4746 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Jan 28 20:40:58 crc kubenswrapper[4746]: I0128 20:40:58.431344 4746 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Jan 28 20:40:58 crc kubenswrapper[4746]: I0128 20:40:58.431418 4746 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Jan 28 20:40:58 crc kubenswrapper[4746]: I0128 20:40:58.431446 4746 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-28T20:40:58Z","lastTransitionTime":"2026-01-28T20:40:58Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Jan 28 20:40:58 crc kubenswrapper[4746]: I0128 20:40:58.535684 4746 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Jan 28 20:40:58 crc kubenswrapper[4746]: I0128 20:40:58.535761 4746 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Jan 28 20:40:58 crc kubenswrapper[4746]: I0128 20:40:58.535782 4746 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Jan 28 20:40:58 crc kubenswrapper[4746]: I0128 20:40:58.535811 4746 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Jan 28 20:40:58 crc kubenswrapper[4746]: I0128 20:40:58.535832 4746 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-28T20:40:58Z","lastTransitionTime":"2026-01-28T20:40:58Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Jan 28 20:40:58 crc kubenswrapper[4746]: I0128 20:40:58.640398 4746 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Jan 28 20:40:58 crc kubenswrapper[4746]: I0128 20:40:58.640480 4746 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Jan 28 20:40:58 crc kubenswrapper[4746]: I0128 20:40:58.640499 4746 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Jan 28 20:40:58 crc kubenswrapper[4746]: I0128 20:40:58.640529 4746 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Jan 28 20:40:58 crc kubenswrapper[4746]: I0128 20:40:58.640548 4746 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-28T20:40:58Z","lastTransitionTime":"2026-01-28T20:40:58Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Jan 28 20:40:58 crc kubenswrapper[4746]: I0128 20:40:58.744487 4746 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Jan 28 20:40:58 crc kubenswrapper[4746]: I0128 20:40:58.744575 4746 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Jan 28 20:40:58 crc kubenswrapper[4746]: I0128 20:40:58.744596 4746 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Jan 28 20:40:58 crc kubenswrapper[4746]: I0128 20:40:58.744626 4746 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Jan 28 20:40:58 crc kubenswrapper[4746]: I0128 20:40:58.744645 4746 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-28T20:40:58Z","lastTransitionTime":"2026-01-28T20:40:58Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Jan 28 20:40:58 crc kubenswrapper[4746]: I0128 20:40:58.835002 4746 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf"
Jan 28 20:40:58 crc kubenswrapper[4746]: I0128 20:40:58.835173 4746 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-2blg6"
Jan 28 20:40:58 crc kubenswrapper[4746]: I0128 20:40:58.835238 4746 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c"
Jan 28 20:40:58 crc kubenswrapper[4746]: E0128 20:40:58.835473 4746 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5"
Jan 28 20:40:58 crc kubenswrapper[4746]: E0128 20:40:58.835586 4746 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-2blg6" podUID="f60a5487-5012-4cc9-ad94-5dfb4957d74e"
Jan 28 20:40:58 crc kubenswrapper[4746]: E0128 20:40:58.835854 4746 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447"
Jan 28 20:40:58 crc kubenswrapper[4746]: I0128 20:40:58.847857 4746 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Jan 28 20:40:58 crc kubenswrapper[4746]: I0128 20:40:58.847918 4746 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Jan 28 20:40:58 crc kubenswrapper[4746]: I0128 20:40:58.847936 4746 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Jan 28 20:40:58 crc kubenswrapper[4746]: I0128 20:40:58.847965 4746 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Jan 28 20:40:58 crc kubenswrapper[4746]: I0128 20:40:58.847989 4746 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-28T20:40:58Z","lastTransitionTime":"2026-01-28T20:40:58Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Jan 28 20:40:58 crc kubenswrapper[4746]: I0128 20:40:58.861316 4746 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2025-12-25 08:59:59.281066523 +0000 UTC
Jan 28 20:40:58 crc kubenswrapper[4746]: I0128 20:40:58.952175 4746 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Jan 28 20:40:58 crc kubenswrapper[4746]: I0128 20:40:58.952240 4746 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Jan 28 20:40:58 crc kubenswrapper[4746]: I0128 20:40:58.952258 4746 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Jan 28 20:40:58 crc kubenswrapper[4746]: I0128 20:40:58.952293 4746 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Jan 28 20:40:58 crc kubenswrapper[4746]: I0128 20:40:58.952317 4746 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-28T20:40:58Z","lastTransitionTime":"2026-01-28T20:40:58Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Jan 28 20:40:59 crc kubenswrapper[4746]: I0128 20:40:59.055339 4746 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Jan 28 20:40:59 crc kubenswrapper[4746]: I0128 20:40:59.055422 4746 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Jan 28 20:40:59 crc kubenswrapper[4746]: I0128 20:40:59.055449 4746 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Jan 28 20:40:59 crc kubenswrapper[4746]: I0128 20:40:59.055488 4746 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Jan 28 20:40:59 crc kubenswrapper[4746]: I0128 20:40:59.055512 4746 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-28T20:40:59Z","lastTransitionTime":"2026-01-28T20:40:59Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Jan 28 20:40:59 crc kubenswrapper[4746]: I0128 20:40:59.159465 4746 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Jan 28 20:40:59 crc kubenswrapper[4746]: I0128 20:40:59.159557 4746 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Jan 28 20:40:59 crc kubenswrapper[4746]: I0128 20:40:59.159582 4746 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Jan 28 20:40:59 crc kubenswrapper[4746]: I0128 20:40:59.159621 4746 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Jan 28 20:40:59 crc kubenswrapper[4746]: I0128 20:40:59.159650 4746 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-28T20:40:59Z","lastTransitionTime":"2026-01-28T20:40:59Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Jan 28 20:40:59 crc kubenswrapper[4746]: I0128 20:40:59.263806 4746 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Jan 28 20:40:59 crc kubenswrapper[4746]: I0128 20:40:59.263890 4746 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Jan 28 20:40:59 crc kubenswrapper[4746]: I0128 20:40:59.263909 4746 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Jan 28 20:40:59 crc kubenswrapper[4746]: I0128 20:40:59.263938 4746 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Jan 28 20:40:59 crc kubenswrapper[4746]: I0128 20:40:59.263957 4746 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-28T20:40:59Z","lastTransitionTime":"2026-01-28T20:40:59Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Jan 28 20:40:59 crc kubenswrapper[4746]: I0128 20:40:59.368168 4746 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Jan 28 20:40:59 crc kubenswrapper[4746]: I0128 20:40:59.368273 4746 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Jan 28 20:40:59 crc kubenswrapper[4746]: I0128 20:40:59.368300 4746 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Jan 28 20:40:59 crc kubenswrapper[4746]: I0128 20:40:59.368354 4746 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Jan 28 20:40:59 crc kubenswrapper[4746]: I0128 20:40:59.368391 4746 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-28T20:40:59Z","lastTransitionTime":"2026-01-28T20:40:59Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Jan 28 20:40:59 crc kubenswrapper[4746]: I0128 20:40:59.472947 4746 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Jan 28 20:40:59 crc kubenswrapper[4746]: I0128 20:40:59.473001 4746 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Jan 28 20:40:59 crc kubenswrapper[4746]: I0128 20:40:59.473014 4746 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Jan 28 20:40:59 crc kubenswrapper[4746]: I0128 20:40:59.473038 4746 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Jan 28 20:40:59 crc kubenswrapper[4746]: I0128 20:40:59.473054 4746 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-28T20:40:59Z","lastTransitionTime":"2026-01-28T20:40:59Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Jan 28 20:40:59 crc kubenswrapper[4746]: I0128 20:40:59.576649 4746 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Jan 28 20:40:59 crc kubenswrapper[4746]: I0128 20:40:59.576731 4746 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Jan 28 20:40:59 crc kubenswrapper[4746]: I0128 20:40:59.576750 4746 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Jan 28 20:40:59 crc kubenswrapper[4746]: I0128 20:40:59.576782 4746 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Jan 28 20:40:59 crc kubenswrapper[4746]: I0128 20:40:59.576802 4746 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-28T20:40:59Z","lastTransitionTime":"2026-01-28T20:40:59Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Jan 28 20:40:59 crc kubenswrapper[4746]: I0128 20:40:59.680551 4746 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Jan 28 20:40:59 crc kubenswrapper[4746]: I0128 20:40:59.680616 4746 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Jan 28 20:40:59 crc kubenswrapper[4746]: I0128 20:40:59.680633 4746 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Jan 28 20:40:59 crc kubenswrapper[4746]: I0128 20:40:59.680662 4746 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Jan 28 20:40:59 crc kubenswrapper[4746]: I0128 20:40:59.680679 4746 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-28T20:40:59Z","lastTransitionTime":"2026-01-28T20:40:59Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Jan 28 20:40:59 crc kubenswrapper[4746]: I0128 20:40:59.783875 4746 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Jan 28 20:40:59 crc kubenswrapper[4746]: I0128 20:40:59.783953 4746 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Jan 28 20:40:59 crc kubenswrapper[4746]: I0128 20:40:59.783978 4746 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Jan 28 20:40:59 crc kubenswrapper[4746]: I0128 20:40:59.784012 4746 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Jan 28 20:40:59 crc kubenswrapper[4746]: I0128 20:40:59.784039 4746 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-28T20:40:59Z","lastTransitionTime":"2026-01-28T20:40:59Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Jan 28 20:40:59 crc kubenswrapper[4746]: I0128 20:40:59.836695 4746 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g"
Jan 28 20:40:59 crc kubenswrapper[4746]: E0128 20:40:59.837020 4746 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8"
Jan 28 20:40:59 crc kubenswrapper[4746]: I0128 20:40:59.861922 4746 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2026-01-08 19:47:22.923013829 +0000 UTC
Jan 28 20:40:59 crc kubenswrapper[4746]: I0128 20:40:59.887799 4746 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Jan 28 20:40:59 crc kubenswrapper[4746]: I0128 20:40:59.887850 4746 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Jan 28 20:40:59 crc kubenswrapper[4746]: I0128 20:40:59.887860 4746 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Jan 28 20:40:59 crc kubenswrapper[4746]: I0128 20:40:59.887875 4746 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Jan 28 20:40:59 crc kubenswrapper[4746]: I0128 20:40:59.887886 4746 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-28T20:40:59Z","lastTransitionTime":"2026-01-28T20:40:59Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Jan 28 20:40:59 crc kubenswrapper[4746]: I0128 20:40:59.992277 4746 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Jan 28 20:40:59 crc kubenswrapper[4746]: I0128 20:40:59.992355 4746 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Jan 28 20:40:59 crc kubenswrapper[4746]: I0128 20:40:59.992376 4746 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Jan 28 20:40:59 crc kubenswrapper[4746]: I0128 20:40:59.992406 4746 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Jan 28 20:40:59 crc kubenswrapper[4746]: I0128 20:40:59.992430 4746 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-28T20:40:59Z","lastTransitionTime":"2026-01-28T20:40:59Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Jan 28 20:41:00 crc kubenswrapper[4746]: I0128 20:41:00.095762 4746 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Jan 28 20:41:00 crc kubenswrapper[4746]: I0128 20:41:00.095838 4746 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Jan 28 20:41:00 crc kubenswrapper[4746]: I0128 20:41:00.095857 4746 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Jan 28 20:41:00 crc kubenswrapper[4746]: I0128 20:41:00.095886 4746 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Jan 28 20:41:00 crc kubenswrapper[4746]: I0128 20:41:00.095905 4746 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-28T20:41:00Z","lastTransitionTime":"2026-01-28T20:41:00Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Jan 28 20:41:00 crc kubenswrapper[4746]: I0128 20:41:00.199484 4746 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Jan 28 20:41:00 crc kubenswrapper[4746]: I0128 20:41:00.199559 4746 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Jan 28 20:41:00 crc kubenswrapper[4746]: I0128 20:41:00.199578 4746 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Jan 28 20:41:00 crc kubenswrapper[4746]: I0128 20:41:00.199607 4746 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Jan 28 20:41:00 crc kubenswrapper[4746]: I0128 20:41:00.199629 4746 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-28T20:41:00Z","lastTransitionTime":"2026-01-28T20:41:00Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Jan 28 20:41:00 crc kubenswrapper[4746]: I0128 20:41:00.303363 4746 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Jan 28 20:41:00 crc kubenswrapper[4746]: I0128 20:41:00.303432 4746 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Jan 28 20:41:00 crc kubenswrapper[4746]: I0128 20:41:00.303451 4746 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Jan 28 20:41:00 crc kubenswrapper[4746]: I0128 20:41:00.303477 4746 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Jan 28 20:41:00 crc kubenswrapper[4746]: I0128 20:41:00.303496 4746 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-28T20:41:00Z","lastTransitionTime":"2026-01-28T20:41:00Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Jan 28 20:41:00 crc kubenswrapper[4746]: I0128 20:41:00.406306 4746 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Jan 28 20:41:00 crc kubenswrapper[4746]: I0128 20:41:00.406373 4746 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Jan 28 20:41:00 crc kubenswrapper[4746]: I0128 20:41:00.406390 4746 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Jan 28 20:41:00 crc kubenswrapper[4746]: I0128 20:41:00.406415 4746 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Jan 28 20:41:00 crc kubenswrapper[4746]: I0128 20:41:00.406429 4746 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-28T20:41:00Z","lastTransitionTime":"2026-01-28T20:41:00Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 28 20:41:00 crc kubenswrapper[4746]: I0128 20:41:00.510480 4746 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 28 20:41:00 crc kubenswrapper[4746]: I0128 20:41:00.510578 4746 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 28 20:41:00 crc kubenswrapper[4746]: I0128 20:41:00.510600 4746 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 28 20:41:00 crc kubenswrapper[4746]: I0128 20:41:00.510629 4746 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 28 20:41:00 crc kubenswrapper[4746]: I0128 20:41:00.510648 4746 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-28T20:41:00Z","lastTransitionTime":"2026-01-28T20:41:00Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 28 20:41:00 crc kubenswrapper[4746]: I0128 20:41:00.613481 4746 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 28 20:41:00 crc kubenswrapper[4746]: I0128 20:41:00.613560 4746 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 28 20:41:00 crc kubenswrapper[4746]: I0128 20:41:00.613584 4746 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 28 20:41:00 crc kubenswrapper[4746]: I0128 20:41:00.613621 4746 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 28 20:41:00 crc kubenswrapper[4746]: I0128 20:41:00.613647 4746 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-28T20:41:00Z","lastTransitionTime":"2026-01-28T20:41:00Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 28 20:41:00 crc kubenswrapper[4746]: I0128 20:41:00.717419 4746 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 28 20:41:00 crc kubenswrapper[4746]: I0128 20:41:00.717498 4746 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 28 20:41:00 crc kubenswrapper[4746]: I0128 20:41:00.717520 4746 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 28 20:41:00 crc kubenswrapper[4746]: I0128 20:41:00.717550 4746 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 28 20:41:00 crc kubenswrapper[4746]: I0128 20:41:00.717569 4746 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-28T20:41:00Z","lastTransitionTime":"2026-01-28T20:41:00Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"}
Jan 28 20:41:00 crc kubenswrapper[4746]: I0128 20:41:00.821064 4746 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Jan 28 20:41:00 crc kubenswrapper[4746]: I0128 20:41:00.821203 4746 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Jan 28 20:41:00 crc kubenswrapper[4746]: I0128 20:41:00.821233 4746 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Jan 28 20:41:00 crc kubenswrapper[4746]: I0128 20:41:00.821265 4746 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Jan 28 20:41:00 crc kubenswrapper[4746]: I0128 20:41:00.821288 4746 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-28T20:41:00Z","lastTransitionTime":"2026-01-28T20:41:00Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Jan 28 20:41:00 crc kubenswrapper[4746]: I0128 20:41:00.835455 4746 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c"
Jan 28 20:41:00 crc kubenswrapper[4746]: I0128 20:41:00.835598 4746 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-2blg6"
Jan 28 20:41:00 crc kubenswrapper[4746]: I0128 20:41:00.835642 4746 util.go:30] "No sandbox for pod can be found.
Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf"
Jan 28 20:41:00 crc kubenswrapper[4746]: E0128 20:41:00.836378 4746 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447"
Jan 28 20:41:00 crc kubenswrapper[4746]: E0128 20:41:00.836505 4746 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-2blg6" podUID="f60a5487-5012-4cc9-ad94-5dfb4957d74e"
Jan 28 20:41:00 crc kubenswrapper[4746]: E0128 20:41:00.836681 4746 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5"
Jan 28 20:41:00 crc kubenswrapper[4746]: I0128 20:41:00.836990 4746 scope.go:117] "RemoveContainer" containerID="dac4fcde6d3cb0d62e92dd34133b79787bdd8338dcee58772ed7584830a7cb2c"
Jan 28 20:41:00 crc kubenswrapper[4746]: E0128 20:41:00.837381 4746 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"ovnkube-controller\" with CrashLoopBackOff: \"back-off 40s restarting failed container=ovnkube-controller pod=ovnkube-node-8vmvh_openshift-ovn-kubernetes(c4d15639-62fb-41b7-a1d4-6f51f3af6d99)\"" pod="openshift-ovn-kubernetes/ovnkube-node-8vmvh" podUID="c4d15639-62fb-41b7-a1d4-6f51f3af6d99"
Jan 28 20:41:00 crc kubenswrapper[4746]: I0128 20:41:00.862978 4746 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2025-11-14 03:30:58.019097592 +0000 UTC
Jan 28 20:41:00 crc kubenswrapper[4746]: I0128 20:41:00.924975 4746 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Jan 28 20:41:00 crc kubenswrapper[4746]: I0128 20:41:00.925036 4746 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Jan 28 20:41:00 crc kubenswrapper[4746]: I0128 20:41:00.925057 4746 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Jan 28 20:41:00 crc kubenswrapper[4746]: I0128 20:41:00.925123 4746 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Jan 28 20:41:00 crc kubenswrapper[4746]: I0128 20:41:00.925143 4746 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-28T20:41:00Z","lastTransitionTime":"2026-01-28T20:41:00Z","reason":"KubeletNotReady","message":"container runtime network not ready:
NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Jan 28 20:41:01 crc kubenswrapper[4746]: I0128 20:41:01.028756 4746 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Jan 28 20:41:01 crc kubenswrapper[4746]: I0128 20:41:01.028829 4746 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Jan 28 20:41:01 crc kubenswrapper[4746]: I0128 20:41:01.028858 4746 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Jan 28 20:41:01 crc kubenswrapper[4746]: I0128 20:41:01.028875 4746 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Jan 28 20:41:01 crc kubenswrapper[4746]: I0128 20:41:01.028886 4746 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-28T20:41:01Z","lastTransitionTime":"2026-01-28T20:41:01Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Jan 28 20:41:01 crc kubenswrapper[4746]: I0128 20:41:01.133417 4746 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Jan 28 20:41:01 crc kubenswrapper[4746]: I0128 20:41:01.133490 4746 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Jan 28 20:41:01 crc kubenswrapper[4746]: I0128 20:41:01.133508 4746 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Jan 28 20:41:01 crc kubenswrapper[4746]: I0128 20:41:01.133537 4746 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Jan 28 20:41:01 crc kubenswrapper[4746]: I0128 20:41:01.133558 4746 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-28T20:41:01Z","lastTransitionTime":"2026-01-28T20:41:01Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Jan 28 20:41:01 crc kubenswrapper[4746]: I0128 20:41:01.185641 4746 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Jan 28 20:41:01 crc kubenswrapper[4746]: I0128 20:41:01.185711 4746 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Jan 28 20:41:01 crc kubenswrapper[4746]: I0128 20:41:01.185730 4746 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Jan 28 20:41:01 crc kubenswrapper[4746]: I0128 20:41:01.185762 4746 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Jan 28 20:41:01 crc kubenswrapper[4746]: I0128 20:41:01.185785 4746 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-28T20:41:01Z","lastTransitionTime":"2026-01-28T20:41:01Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/.
Has your network provider started?"} Jan 28 20:41:01 crc kubenswrapper[4746]: E0128 20:41:01.200627 4746 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-01-28T20:41:01Z\\\",\\\"lastTransitionTime\\\":\\\"2026-01-28T20:41:01Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-01-28T20:41:01Z\\\",\\\"lastTransitionTime\\\":\\\"2026-01-28T20:41:01Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-01-28T20:41:01Z\\\",\\\"lastTransitionTime\\\":\\\"2026-01-28T20:41:01Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-01-28T20:41:01Z\\\",\\\"lastTransitionTime\\\":\\\"2026-01-28T20:41:01Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"349dd4ef-f3ea-4c41-bfa2-75ea02498ab0\\\",\\\"systemUUID\\\":\\\"e89ecf32-8beb-4b41-b6df-0f1293ce0213\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-28T20:41:01Z is after 2025-08-24T17:21:41Z"
Jan 28 20:41:01 crc kubenswrapper[4746]: I0128 20:41:01.205826 4746 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Jan 28 20:41:01 crc kubenswrapper[4746]: I0128 20:41:01.205889 4746 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Jan 28 20:41:01 crc kubenswrapper[4746]: I0128 20:41:01.205906 4746 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Jan 28 20:41:01 crc kubenswrapper[4746]: I0128 20:41:01.205938 4746 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Jan 28 20:41:01 crc kubenswrapper[4746]: I0128 20:41:01.205958 4746 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-28T20:41:01Z","lastTransitionTime":"2026-01-28T20:41:01Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/.
Has your network provider started?"} Jan 28 20:41:01 crc kubenswrapper[4746]: E0128 20:41:01.218547 4746 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-01-28T20:41:01Z\\\",\\\"lastTransitionTime\\\":\\\"2026-01-28T20:41:01Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-01-28T20:41:01Z\\\",\\\"lastTransitionTime\\\":\\\"2026-01-28T20:41:01Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-01-28T20:41:01Z\\\",\\\"lastTransitionTime\\\":\\\"2026-01-28T20:41:01Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-01-28T20:41:01Z\\\",\\\"lastTransitionTime\\\":\\\"2026-01-28T20:41:01Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"349dd4ef-f3ea-4c41-bfa2-75ea02498ab0\\\",\\\"systemUUID\\\":\\\"e89ecf32-8beb-4b41-b6df-0f1293ce0213\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-28T20:41:01Z is after 2025-08-24T17:21:41Z" Jan 28 20:41:01 crc kubenswrapper[4746]: I0128 20:41:01.223488 4746 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 28 20:41:01 crc kubenswrapper[4746]: I0128 20:41:01.223529 4746 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 28 20:41:01 crc kubenswrapper[4746]: I0128 20:41:01.223538 4746 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 28 20:41:01 crc kubenswrapper[4746]: I0128 20:41:01.223555 4746 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 28 20:41:01 crc kubenswrapper[4746]: I0128 20:41:01.223570 4746 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-28T20:41:01Z","lastTransitionTime":"2026-01-28T20:41:01Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 28 20:41:01 crc kubenswrapper[4746]: E0128 20:41:01.235245 4746 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-01-28T20:41:01Z\\\",\\\"lastTransitionTime\\\":\\\"2026-01-28T20:41:01Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-01-28T20:41:01Z\\\",\\\"lastTransitionTime\\\":\\\"2026-01-28T20:41:01Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-01-28T20:41:01Z\\\",\\\"lastTransitionTime\\\":\\\"2026-01-28T20:41:01Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-01-28T20:41:01Z\\\",\\\"lastTransitionTime\\\":\\\"2026-01-28T20:41:01Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"349dd4ef-f3ea-4c41-bfa2-75ea02498ab0\\\",\\\"systemUUID\\\":\\\"e89ecf32-8beb-4b41-b6df-0f1293ce0213\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-28T20:41:01Z is after 2025-08-24T17:21:41Z" Jan 28 20:41:01 crc kubenswrapper[4746]: I0128 20:41:01.240237 4746 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 28 20:41:01 crc kubenswrapper[4746]: I0128 20:41:01.240276 4746 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 28 20:41:01 crc kubenswrapper[4746]: I0128 20:41:01.240289 4746 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 28 20:41:01 crc kubenswrapper[4746]: I0128 20:41:01.240309 4746 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 28 20:41:01 crc kubenswrapper[4746]: I0128 20:41:01.240325 4746 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-28T20:41:01Z","lastTransitionTime":"2026-01-28T20:41:01Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 28 20:41:01 crc kubenswrapper[4746]: E0128 20:41:01.258917 4746 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-01-28T20:41:01Z\\\",\\\"lastTransitionTime\\\":\\\"2026-01-28T20:41:01Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-01-28T20:41:01Z\\\",\\\"lastTransitionTime\\\":\\\"2026-01-28T20:41:01Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-01-28T20:41:01Z\\\",\\\"lastTransitionTime\\\":\\\"2026-01-28T20:41:01Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-01-28T20:41:01Z\\\",\\\"lastTransitionTime\\\":\\\"2026-01-28T20:41:01Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"349dd4ef-f3ea-4c41-bfa2-75ea02498ab0\\\",\\\"systemUUID\\\":\\\"e89ecf32-8beb-4b41-b6df-0f1293ce0213\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-28T20:41:01Z is after 2025-08-24T17:21:41Z" Jan 28 20:41:01 crc kubenswrapper[4746]: I0128 20:41:01.263958 4746 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 28 20:41:01 crc kubenswrapper[4746]: I0128 20:41:01.264039 4746 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 28 20:41:01 crc kubenswrapper[4746]: I0128 20:41:01.264063 4746 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 28 20:41:01 crc kubenswrapper[4746]: I0128 20:41:01.264134 4746 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 28 20:41:01 crc kubenswrapper[4746]: I0128 20:41:01.264157 4746 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-28T20:41:01Z","lastTransitionTime":"2026-01-28T20:41:01Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 28 20:41:01 crc kubenswrapper[4746]: E0128 20:41:01.283048 4746 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-01-28T20:41:01Z\\\",\\\"lastTransitionTime\\\":\\\"2026-01-28T20:41:01Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-01-28T20:41:01Z\\\",\\\"lastTransitionTime\\\":\\\"2026-01-28T20:41:01Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-01-28T20:41:01Z\\\",\\\"lastTransitionTime\\\":\\\"2026-01-28T20:41:01Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-01-28T20:41:01Z\\\",\\\"lastTransitionTime\\\":\\\"2026-01-28T20:41:01Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"349dd4ef-f3ea-4c41-bfa2-75ea02498ab0\\\",\\\"systemUUID\\\":\\\"e89ecf32-8beb-4b41-b6df-0f1293ce0213\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-28T20:41:01Z is after 2025-08-24T17:21:41Z" Jan 28 20:41:01 crc kubenswrapper[4746]: E0128 20:41:01.283312 4746 kubelet_node_status.go:572] "Unable to update node status" err="update node status exceeds retry count" Jan 28 20:41:01 crc kubenswrapper[4746]: I0128 20:41:01.285703 4746 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 28 20:41:01 crc kubenswrapper[4746]: I0128 20:41:01.285753 4746 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 28 20:41:01 crc kubenswrapper[4746]: I0128 20:41:01.285773 4746 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 28 20:41:01 crc kubenswrapper[4746]: I0128 20:41:01.285808 4746 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 28 20:41:01 crc kubenswrapper[4746]: I0128 20:41:01.285829 4746 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-28T20:41:01Z","lastTransitionTime":"2026-01-28T20:41:01Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 28 20:41:01 crc kubenswrapper[4746]: I0128 20:41:01.394147 4746 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 28 20:41:01 crc kubenswrapper[4746]: I0128 20:41:01.394223 4746 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 28 20:41:01 crc kubenswrapper[4746]: I0128 20:41:01.394241 4746 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 28 20:41:01 crc kubenswrapper[4746]: I0128 20:41:01.394263 4746 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 28 20:41:01 crc kubenswrapper[4746]: I0128 20:41:01.394278 4746 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-28T20:41:01Z","lastTransitionTime":"2026-01-28T20:41:01Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 28 20:41:01 crc kubenswrapper[4746]: I0128 20:41:01.497962 4746 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 28 20:41:01 crc kubenswrapper[4746]: I0128 20:41:01.498000 4746 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 28 20:41:01 crc kubenswrapper[4746]: I0128 20:41:01.498010 4746 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 28 20:41:01 crc kubenswrapper[4746]: I0128 20:41:01.498026 4746 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 28 20:41:01 crc kubenswrapper[4746]: I0128 20:41:01.498058 4746 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-28T20:41:01Z","lastTransitionTime":"2026-01-28T20:41:01Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 28 20:41:01 crc kubenswrapper[4746]: I0128 20:41:01.600958 4746 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 28 20:41:01 crc kubenswrapper[4746]: I0128 20:41:01.601050 4746 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 28 20:41:01 crc kubenswrapper[4746]: I0128 20:41:01.601065 4746 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 28 20:41:01 crc kubenswrapper[4746]: I0128 20:41:01.601096 4746 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 28 20:41:01 crc kubenswrapper[4746]: I0128 20:41:01.601106 4746 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-28T20:41:01Z","lastTransitionTime":"2026-01-28T20:41:01Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 28 20:41:01 crc kubenswrapper[4746]: I0128 20:41:01.704613 4746 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 28 20:41:01 crc kubenswrapper[4746]: I0128 20:41:01.704650 4746 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 28 20:41:01 crc kubenswrapper[4746]: I0128 20:41:01.704659 4746 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 28 20:41:01 crc kubenswrapper[4746]: I0128 20:41:01.704674 4746 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 28 20:41:01 crc kubenswrapper[4746]: I0128 20:41:01.704684 4746 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-28T20:41:01Z","lastTransitionTime":"2026-01-28T20:41:01Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 28 20:41:01 crc kubenswrapper[4746]: I0128 20:41:01.807650 4746 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 28 20:41:01 crc kubenswrapper[4746]: I0128 20:41:01.807736 4746 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 28 20:41:01 crc kubenswrapper[4746]: I0128 20:41:01.807745 4746 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 28 20:41:01 crc kubenswrapper[4746]: I0128 20:41:01.807758 4746 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 28 20:41:01 crc kubenswrapper[4746]: I0128 20:41:01.807766 4746 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-28T20:41:01Z","lastTransitionTime":"2026-01-28T20:41:01Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 28 20:41:01 crc kubenswrapper[4746]: I0128 20:41:01.835295 4746 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Jan 28 20:41:01 crc kubenswrapper[4746]: E0128 20:41:01.835589 4746 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Jan 28 20:41:01 crc kubenswrapper[4746]: I0128 20:41:01.864021 4746 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2026-01-14 22:01:14.287098617 +0000 UTC Jan 28 20:41:01 crc kubenswrapper[4746]: I0128 20:41:01.911954 4746 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 28 20:41:01 crc kubenswrapper[4746]: I0128 20:41:01.912028 4746 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 28 20:41:01 crc kubenswrapper[4746]: I0128 20:41:01.912043 4746 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 28 20:41:01 crc kubenswrapper[4746]: I0128 20:41:01.912067 4746 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 28 20:41:01 crc kubenswrapper[4746]: I0128 20:41:01.912114 4746 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-28T20:41:01Z","lastTransitionTime":"2026-01-28T20:41:01Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 28 20:41:02 crc kubenswrapper[4746]: I0128 20:41:02.017912 4746 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 28 20:41:02 crc kubenswrapper[4746]: I0128 20:41:02.018004 4746 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 28 20:41:02 crc kubenswrapper[4746]: I0128 20:41:02.018027 4746 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 28 20:41:02 crc kubenswrapper[4746]: I0128 20:41:02.018056 4746 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 28 20:41:02 crc kubenswrapper[4746]: I0128 20:41:02.018074 4746 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-28T20:41:02Z","lastTransitionTime":"2026-01-28T20:41:02Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 28 20:41:02 crc kubenswrapper[4746]: I0128 20:41:02.121943 4746 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 28 20:41:02 crc kubenswrapper[4746]: I0128 20:41:02.121989 4746 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 28 20:41:02 crc kubenswrapper[4746]: I0128 20:41:02.121999 4746 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 28 20:41:02 crc kubenswrapper[4746]: I0128 20:41:02.122017 4746 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 28 20:41:02 crc kubenswrapper[4746]: I0128 20:41:02.122027 4746 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-28T20:41:02Z","lastTransitionTime":"2026-01-28T20:41:02Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 28 20:41:02 crc kubenswrapper[4746]: I0128 20:41:02.225270 4746 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 28 20:41:02 crc kubenswrapper[4746]: I0128 20:41:02.225948 4746 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 28 20:41:02 crc kubenswrapper[4746]: I0128 20:41:02.226247 4746 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 28 20:41:02 crc kubenswrapper[4746]: I0128 20:41:02.226428 4746 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 28 20:41:02 crc kubenswrapper[4746]: I0128 20:41:02.226553 4746 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-28T20:41:02Z","lastTransitionTime":"2026-01-28T20:41:02Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 28 20:41:02 crc kubenswrapper[4746]: I0128 20:41:02.329864 4746 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 28 20:41:02 crc kubenswrapper[4746]: I0128 20:41:02.329918 4746 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 28 20:41:02 crc kubenswrapper[4746]: I0128 20:41:02.329932 4746 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 28 20:41:02 crc kubenswrapper[4746]: I0128 20:41:02.329956 4746 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 28 20:41:02 crc kubenswrapper[4746]: I0128 20:41:02.329973 4746 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-28T20:41:02Z","lastTransitionTime":"2026-01-28T20:41:02Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 28 20:41:02 crc kubenswrapper[4746]: I0128 20:41:02.432607 4746 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 28 20:41:02 crc kubenswrapper[4746]: I0128 20:41:02.432665 4746 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 28 20:41:02 crc kubenswrapper[4746]: I0128 20:41:02.432678 4746 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 28 20:41:02 crc kubenswrapper[4746]: I0128 20:41:02.432698 4746 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 28 20:41:02 crc kubenswrapper[4746]: I0128 20:41:02.432710 4746 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-28T20:41:02Z","lastTransitionTime":"2026-01-28T20:41:02Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 28 20:41:02 crc kubenswrapper[4746]: I0128 20:41:02.535759 4746 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 28 20:41:02 crc kubenswrapper[4746]: I0128 20:41:02.535820 4746 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 28 20:41:02 crc kubenswrapper[4746]: I0128 20:41:02.535835 4746 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 28 20:41:02 crc kubenswrapper[4746]: I0128 20:41:02.535860 4746 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 28 20:41:02 crc kubenswrapper[4746]: I0128 20:41:02.535875 4746 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-28T20:41:02Z","lastTransitionTime":"2026-01-28T20:41:02Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"}
Jan 28 20:41:02 crc kubenswrapper[4746]: I0128 20:41:02.638569 4746 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Jan 28 20:41:02 crc kubenswrapper[4746]: I0128 20:41:02.638609 4746 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Jan 28 20:41:02 crc kubenswrapper[4746]: I0128 20:41:02.638618 4746 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Jan 28 20:41:02 crc kubenswrapper[4746]: I0128 20:41:02.638632 4746 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Jan 28 20:41:02 crc kubenswrapper[4746]: I0128 20:41:02.638643 4746 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-28T20:41:02Z","lastTransitionTime":"2026-01-28T20:41:02Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Jan 28 20:41:02 crc kubenswrapper[4746]: I0128 20:41:02.740771 4746 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Jan 28 20:41:02 crc kubenswrapper[4746]: I0128 20:41:02.740889 4746 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Jan 28 20:41:02 crc kubenswrapper[4746]: I0128 20:41:02.740901 4746 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Jan 28 20:41:02 crc kubenswrapper[4746]: I0128 20:41:02.740915 4746 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Jan 28 20:41:02 crc kubenswrapper[4746]: I0128 20:41:02.740925 4746 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-28T20:41:02Z","lastTransitionTime":"2026-01-28T20:41:02Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Jan 28 20:41:02 crc kubenswrapper[4746]: I0128 20:41:02.835312 4746 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf"
Jan 28 20:41:02 crc kubenswrapper[4746]: I0128 20:41:02.835526 4746 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c"
Jan 28 20:41:02 crc kubenswrapper[4746]: E0128 20:41:02.835710 4746 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5"
Jan 28 20:41:02 crc kubenswrapper[4746]: E0128 20:41:02.835853 4746 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447"
Jan 28 20:41:02 crc kubenswrapper[4746]: I0128 20:41:02.836162 4746 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-2blg6"
Jan 28 20:41:02 crc kubenswrapper[4746]: E0128 20:41:02.836266 4746 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-2blg6" podUID="f60a5487-5012-4cc9-ad94-5dfb4957d74e"
Jan 28 20:41:02 crc kubenswrapper[4746]: I0128 20:41:02.842915 4746 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Jan 28 20:41:02 crc kubenswrapper[4746]: I0128 20:41:02.843032 4746 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Jan 28 20:41:02 crc kubenswrapper[4746]: I0128 20:41:02.843148 4746 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Jan 28 20:41:02 crc kubenswrapper[4746]: I0128 20:41:02.843341 4746 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Jan 28 20:41:02 crc kubenswrapper[4746]: I0128 20:41:02.843518 4746 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-28T20:41:02Z","lastTransitionTime":"2026-01-28T20:41:02Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 28 20:41:02 crc kubenswrapper[4746]: I0128 20:41:02.849018 4746 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-28T20:39:54Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-28T20:39:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-28T20:39:54Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://2f64131d62eca6c3a42e9177d14fa39ed6435e536f60ef5d25ff6bf061d2cc0a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-28T20:39:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod 
\"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-28T20:41:02Z is after 2025-08-24T17:21:41Z" Jan 28 20:41:02 crc kubenswrapper[4746]: I0128 20:41:02.864863 4746 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2025-12-04 06:36:58.045424827 +0000 UTC Jan 28 20:41:02 crc kubenswrapper[4746]: I0128 20:41:02.867432 4746 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-28T20:39:57Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-28T20:39:57Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-28T20:39:57Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://83a275ec9868f2fd9e24135d5a05c27caf843340c144070952291bb6a3715035\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCou
nt\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-28T20:39:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-28T20:41:02Z is after 2025-08-24T17:21:41Z" Jan 28 20:41:02 crc kubenswrapper[4746]: I0128 20:41:02.880408 4746 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-6wrnw" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"6dc8b546-9734-4082-b2b3-2bafe3f1564d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-28T20:39:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-28T20:39:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-28T20:39:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-28T20:39:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://50ec1ea913ec63a096032bd968bda5c5c8879e16a08e6a57727750517d9af038\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-28T20:39:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-z84f9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://66d250466155027405bf90c4e5ed09388238d4ce
e63604e28486a16778f9d188\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-28T20:39:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-z84f9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-28T20:39:53Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-6wrnw\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-28T20:41:02Z is after 2025-08-24T17:21:41Z" Jan 28 20:41:02 crc kubenswrapper[4746]: I0128 20:41:02.899325 4746 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-ht6hp" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"beb2b795-6bf4-4d38-89f7-bcb5512c3e61\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-28T20:39:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-28T20:40:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-28T20:40:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-28T20:40:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://564d42c4c25f218c1a1f52449d2f3d941c1dad920c8100d1d908cadf7cf46607\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-28T20:40:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rr62b\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b2d8bf207108ac2304cd04e34cd9843fcb755ac8610a3f88448538d1e99359f6\
\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b2d8bf207108ac2304cd04e34cd9843fcb755ac8610a3f88448538d1e99359f6\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-28T20:39:54Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-28T20:39:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rr62b\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://40c78d39cb70eba245771f4d85d2c8bc7b77427ea216ad3b89749d24fd550098\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://40c78d39cb70eba245771f4d85d2c8bc7b77427ea216ad3b89749d24fd550098\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-28T20:39:55Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-28T20:39:55Z\\\"}},\\\
"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rr62b\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://498028ca0bd260ca2c4d7b0231ea54a5a6ff5b302f720fbd209a4f640507258b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://498028ca0bd260ca2c4d7b0231ea54a5a6ff5b302f720fbd209a4f640507258b\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-28T20:39:56Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-28T20:39:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rr62b\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://661d4
a9031b6789f38ed1ac8e3c40d72182709130e1680ecdad86c07c1f34b7b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://661d4a9031b6789f38ed1ac8e3c40d72182709130e1680ecdad86c07c1f34b7b\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-28T20:39:57Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-28T20:39:57Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rr62b\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://a49ffe0bbe4151671dc8803d7ab61cb9b3a2eae064b81e2830bc074787f4754e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://a49ffe0bbe4151671dc8803d7ab61cb9b3a2eae064b81e2830bc074787f4754e\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-28T20:39:58Z\\\",\\\"reason\\\":\\\"Co
mpleted\\\",\\\"startedAt\\\":\\\"2026-01-28T20:39:58Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rr62b\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://cd521e9ecb53f581b5732a4d018f92007063ce0912e6d760fe3c920218c23b14\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://cd521e9ecb53f581b5732a4d018f92007063ce0912e6d760fe3c920218c23b14\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-28T20:39:59Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-28T20:39:59Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rr62b\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-28T20:39:53Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-ht6hp\": Internal error occurred: failed 
calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-28T20:41:02Z is after 2025-08-24T17:21:41Z" Jan 28 20:41:02 crc kubenswrapper[4746]: I0128 20:41:02.913904 4746 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-qhpvf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"cdf26de0-b602-4bdf-b492-65b3b6b31434\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-28T20:39:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-28T20:39:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-28T20:40:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-28T20:40:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://9c88cd0b151f23677f83a944570f36a7a6edd30d9be47638a17cbb9098ec1c23\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://f80345dfd4de46e278611370e6c337968cb1ba7ed8275457deea5871704f8367\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\
":\\\"2026-01-28T20:40:40Z\\\",\\\"message\\\":\\\"2026-01-28T20:39:55+00:00 [cnibincopy] Successfully copied files in /usr/src/multus-cni/rhel9/bin/ to /host/opt/cni/bin/upgrade_529a75ec-ce77-49ca-8f6a-9466be7c980e\\\\n2026-01-28T20:39:55+00:00 [cnibincopy] Successfully moved files in /host/opt/cni/bin/upgrade_529a75ec-ce77-49ca-8f6a-9466be7c980e to /host/opt/cni/bin/\\\\n2026-01-28T20:39:55Z [verbose] multus-daemon started\\\\n2026-01-28T20:39:55Z [verbose] Readiness Indicator file check\\\\n2026-01-28T20:40:40Z [error] have you checked that your default network is ready? still waiting for readinessindicatorfile @ /host/run/multus/cni/net.d/10-ovn-kubernetes.conf. pollimmediate error: timed out waiting for the condition\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-01-28T20:39:53Z\\\"}},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-28T20:40:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\
\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jnpsl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-28T20:39:53Z\\\"}}\" for pod \"openshift-multus\"/\"multus-qhpvf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-28T20:41:02Z is after 2025-08-24T17:21:41Z" Jan 28 20:41:02 crc kubenswrapper[4746]: I0128 20:41:02.927912 4746 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-2blg6" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"f60a5487-5012-4cc9-ad94-5dfb4957d74e\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-28T20:40:07Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-28T20:40:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-28T20:40:07Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-28T20:40:07Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-l2www\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-l2www\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-28T20:40:07Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-2blg6\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-28T20:41:02Z is after 2025-08-24T17:21:41Z" Jan 28 20:41:02 crc 
kubenswrapper[4746]: I0128 20:41:02.946650 4746 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"64499f0d-5c10-43a9-b479-9b6c7ef2fb9c\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-28T20:39:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-28T20:39:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-28T20:40:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-28T20:40:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-28T20:39:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://9a304b12b4a1e30cbbbb16aa4f91514f1b1a64c716cf8ac023803a279b310f9c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-28T20:39:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://6af41f85c56be6
c24127bec0aee8e02562dbb04bd22e57c6887f6f5e914a7530\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-28T20:39:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://f31e9e80576ec4eb59ec92039601fd94bd323d29493c353f9ec6b9c61187b0d3\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-28T20:39:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://64f8ae1fcf8b14a65196ffa6d5a50db39aa846dc4e6cc0b92ba54d9b5f6913e3\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"te
rminated\\\":{\\\"containerID\\\":\\\"cri-o://56b250a192cfa698ccd7ac8233b75b9acbf2347549c712b47ccc1b89f144aa83\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-01-28T20:39:52Z\\\",\\\"message\\\":\\\"le observer\\\\nW0128 20:39:52.695680 1 builder.go:272] unable to get owner reference (falling back to namespace): pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI0128 20:39:52.695807 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0128 20:39:52.696767 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-2500116620/tls.crt::/tmp/serving-cert-2500116620/tls.key\\\\\\\"\\\\nI0128 20:39:52.900914 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI0128 20:39:52.915714 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI0128 20:39:52.915750 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI0128 20:39:52.916035 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI0128 20:39:52.916045 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI0128 20:39:52.925040 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW0128 20:39:52.925098 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0128 20:39:52.925104 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0128 20:39:52.925109 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW0128 20:39:52.925113 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW0128 20:39:52.925116 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW0128 
20:39:52.925119 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nI0128 20:39:52.925416 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nF0128 20:39:52.926926 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-01-28T20:39:47Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":2,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-28T20:40:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://03a94dee0b9663441e0ecec16b121c5be4e5c557f17c929b8edfb0d7ba00c3d5\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-28T20:39:35Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c5808a729190035a73529ac58efb0790f2d9c3b98c5f1a8017389e4ca1738c4c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\
\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://c5808a729190035a73529ac58efb0790f2d9c3b98c5f1a8017389e4ca1738c4c\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-28T20:39:33Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-28T20:39:33Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-28T20:39:32Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-28T20:41:02Z is after 2025-08-24T17:21:41Z" Jan 28 20:41:02 crc kubenswrapper[4746]: I0128 20:41:02.947366 4746 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 28 20:41:02 crc kubenswrapper[4746]: I0128 20:41:02.947407 4746 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 28 20:41:02 crc kubenswrapper[4746]: I0128 20:41:02.947420 4746 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 28 20:41:02 crc kubenswrapper[4746]: I0128 20:41:02.947437 4746 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 28 20:41:02 crc kubenswrapper[4746]: I0128 20:41:02.947694 4746 setters.go:603] "Node became not ready" node="crc" 
condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-28T20:41:02Z","lastTransitionTime":"2026-01-28T20:41:02Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 28 20:41:02 crc kubenswrapper[4746]: I0128 20:41:02.961144 4746 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3553d828-b1a0-4f51-9e70-5f4d25f3ee42\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-28T20:39:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-28T20:39:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-28T20:40:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-28T20:40:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-28T20:39:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://2ba6f8586593282902571354d53a7bdc0945ea8d1970cac1bfc2f8cc4019a4ab\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\
\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-28T20:39:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://00d75277c1a1e8fdc34e37a3ae3d697e002007456fde3dae5d49c1c932a0a7f5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-28T20:39:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://cf6d80884e00b1a12051a7a97148a46fba0e8514a1233a180262392c302db77f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-28T20:39:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.1
26.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a4a7e5a6315ccaaa100eb33d1a23bf2481f4ee8aa1ea076927dec54027d4c4cd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"wait-for-host-port\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://a4a7e5a6315ccaaa100eb33d1a23bf2481f4ee8aa1ea076927dec54027d4c4cd\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-28T20:39:33Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-28T20:39:33Z\\\"}}}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-28T20:39:32Z\\\"}}\" for pod \"openshift-kube-scheduler\"/\"openshift-kube-scheduler-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-28T20:41:02Z is after 2025-08-24T17:21:41Z" Jan 28 20:41:02 crc kubenswrapper[4746]: I0128 20:41:02.973239 4746 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"6f62da58-e6d0-4b56-80ab-4955ac619a24\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-28T20:39:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-28T20:39:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-28T20:39:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-28T20:39:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-28T20:39:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0c996cd0a95ad18373212e715f575ffd233fb252c0c1701b6fd7d1c0fcade202\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-crio\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-28T20:39:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kube\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"var-lib-kubelet\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://57975efbbe1fef93d88a55df5a0951063515ba898b928963f616fc23a05b5c09\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962
a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://57975efbbe1fef93d88a55df5a0951063515ba898b928963f616fc23a05b5c09\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-28T20:39:33Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-28T20:39:33Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var\\\",\\\"name\\\":\\\"var-lib-kubelet\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-28T20:39:32Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"kube-rbac-proxy-crio-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-28T20:41:02Z is after 2025-08-24T17:21:41Z" Jan 28 20:41:02 crc kubenswrapper[4746]: I0128 20:41:02.985518 4746 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-d8rwq" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"405572d8-50d2-4d7a-adf0-d8d6adea31ca\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-28T20:39:57Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-28T20:39:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-28T20:39:57Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-28T20:39:57Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://eba305a172c49420674e97d32590a7b2e4c7c2cd7406257508e0636cf42ea244\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-28T20:39:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mb4h4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phas
e\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-28T20:39:56Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-d8rwq\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-28T20:41:02Z is after 2025-08-24T17:21:41Z" Jan 28 20:41:02 crc kubenswrapper[4746]: I0128 20:41:02.999138 4746 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"950d71b3-9b51-43dc-814a-d6a838723f78\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-28T20:39:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-28T20:39:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-28T20:39:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-28T20:39:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-28T20:39:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0411ecf7c85f5397d751adb7d4905977dcbd3d2e169c56ebbe26017192765602\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee8
8051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-28T20:39:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://319cbcaa9981b3c79eb0c8f970f0ad776fb255db10c72c9b6a13469906e9e5ec\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-28T20:39:33Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://eefda7da35cd9ff176512b5d1224ccadcb5debd0035a49822e82a9757d4a3f91\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-28T20:39:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\
\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://145f4f882924f06172df81d94fc9bd683ea1fc04889de5f6cfb5caece8227fd0\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-28T20:39:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-28T20:39:32Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-28T20:41:02Z is after 2025-08-24T17:21:41Z" Jan 28 20:41:03 crc kubenswrapper[4746]: I0128 20:41:03.011977 4746 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-28T20:39:52Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-28T20:39:52Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-28T20:41:03Z is after 2025-08-24T17:21:41Z" Jan 28 20:41:03 crc kubenswrapper[4746]: I0128 20:41:03.026059 4746 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-9zvm7" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"de4fb5f0-7c65-4fdb-8389-d2c8462e130b\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-28T20:40:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-28T20:40:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-28T20:40:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-28T20:40:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://972f4c0079d8e04365abd93dc8d63e88dea04f427dbbbcf5e331cdeeff8c855c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-28T20:40:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jvkqz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://d2340a59c423445dd2b32da57d43146dd1cc3
54b737ae1b4d1875a0d40f62188\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-28T20:40:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jvkqz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-28T20:40:05Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-9zvm7\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-28T20:41:03Z is after 2025-08-24T17:21:41Z" Jan 28 20:41:03 crc kubenswrapper[4746]: I0128 20:41:03.041954 4746 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-28T20:39:52Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-28T20:39:52Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-28T20:41:03Z is after 2025-08-24T17:21:41Z" Jan 28 20:41:03 crc kubenswrapper[4746]: I0128 20:41:03.051144 4746 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 28 20:41:03 crc kubenswrapper[4746]: I0128 20:41:03.051206 4746 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 28 20:41:03 crc kubenswrapper[4746]: I0128 20:41:03.051220 4746 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 28 20:41:03 crc kubenswrapper[4746]: I0128 20:41:03.051247 4746 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 28 20:41:03 crc kubenswrapper[4746]: I0128 20:41:03.051262 4746 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-28T20:41:03Z","lastTransitionTime":"2026-01-28T20:41:03Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady 
message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 28 20:41:03 crc kubenswrapper[4746]: I0128 20:41:03.057557 4746 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-28T20:39:52Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-28T20:39:52Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-28T20:41:03Z is after 2025-08-24T17:21:41Z" Jan 28 20:41:03 crc kubenswrapper[4746]: I0128 20:41:03.072792 4746 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-28T20:39:53Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-28T20:39:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-28T20:39:53Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5dff3e2075b8bbfb6af77c72362ccc37e1d7810a2d97c096cf501681c98699ab\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-28T20:39:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://772e165efa6d865a669c172958e0aea3112146637e354e4587b7e664454112ad\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af
0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-28T20:39:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-28T20:41:03Z is after 2025-08-24T17:21:41Z" Jan 28 20:41:03 crc kubenswrapper[4746]: I0128 20:41:03.086161 4746 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-gcrxx" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"1c3c93a1-7ddf-4339-9ca3-79f3753943b4\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-28T20:39:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-28T20:39:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-28T20:39:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-28T20:39:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5bd8d8214aa8e85e50747643b6354fb6e026866c926310fb2dd79760f916f468\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-28T20:39:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wn6hr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podI
Ps\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-28T20:39:52Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-gcrxx\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-28T20:41:03Z is after 2025-08-24T17:21:41Z" Jan 28 20:41:03 crc kubenswrapper[4746]: I0128 20:41:03.114931 4746 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-8vmvh" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"c4d15639-62fb-41b7-a1d4-6f51f3af6d99\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-28T20:39:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-28T20:39:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-28T20:39:53Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-28T20:39:53Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://2637734601bf1e43535b70bd62cc6aea7fc9f62d0d2a33c83602e7065a793be1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-28T20:39:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-kgrqs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://73e82be84afa7be760acda86ebf6c956d558dcbd4f37f049fba84e08c75fcd9f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-28T20:39:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\
\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-kgrqs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://17d09f7f46014da3713f76f111bca0a7ce8a24a154aa6144291d760d85e7d3ec\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-28T20:39:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-kgrqs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://b7ad895d0ff1ad570136d96671e1bde5c62882af50a407945ef0e7bc52727b88\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-28T20:39:54Z\\\"}},\\\"volumeMounts\\\":[{
\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-kgrqs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://af842ba33fdf06048038dd3e12c59f7a9d0867aa28acee1f2cc4571e7db28b88\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-28T20:39:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-kgrqs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://57a776a1daf1b17eb44d9fa09fe2f7ec01f647aa0a40d0d571736d2f7c2f8e4d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\"
,\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-28T20:39:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-kgrqs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://dac4fcde6d3cb0d62e92dd34133b79787bdd8338dcee58772ed7584830a7cb2c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://dac4fcde6d3cb0d62e92dd34133b79787bdd8338dcee58772ed7584830a7cb2c\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2026-01-28T20:40:44Z\\\",\\\"message\\\":\\\"Spec:ServiceSpec{Ports:[]ServicePort{ServicePort{Name:https-metrics,Protocol:TCP,Port:8443,TargetPort:{0 8443 },NodePort:0,AppProtocol:nil,},},Selector:map[string]string{app: 
catalog-operator,},ClusterIP:10.217.5.204,Type:ClusterIP,ExternalIPs:[],SessionAffinity:None,LoadBalancerIP:,LoadBalancerSourceRanges:[],ExternalName:,ExternalTrafficPolicy:,HealthCheckNodePort:0,PublishNotReadyAddresses:false,SessionAffinityConfig:nil,IPFamilyPolicy:*SingleStack,ClusterIPs:[10.217.5.204],IPFamilies:[IPv4],AllocateLoadBalancerNodePorts:nil,LoadBalancerClass:nil,InternalTrafficPolicy:*Cluster,TrafficDistribution:nil,},Status:ServiceStatus{LoadBalancer:LoadBalancerStatus{Ingress:[]LoadBalancerIngress{},},Conditions:[]Condition{},},}\\\\nI0128 20:40:44.722894 6800 loadbalancer.go:304] Deleted 0 stale LBs for map[string]string{\\\\\\\"k8s.ovn.org/kind\\\\\\\":\\\\\\\"Service\\\\\\\", \\\\\\\"k8s.ovn.org/owner\\\\\\\":\\\\\\\"openshift-operator-lifecycle-manager/olm-operator-metrics\\\\\\\"}\\\\nI0128 20:40:44.724571 6800 ovnkube_controller.go:1292] Config duration recorder: kind/namespace/name service/openshift-ovn-kubernetes/ovn-kubernetes-control-plane. OVN-Kubernetes controller took 0.108725535 seconds. 
No OVN measurement.\\\\nI0128 20:40:44.724576 6800 services_controller.go:360] Finished syncing service olm-operator-metrics on namespace openshift-operator-lifecycle-manager for network=default : 4.634502ms\\\\nI0128 \\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-01-28T20:40:43Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 40s restarting failed container=ovnkube-controller pod=ovnkube-node-8vmvh_openshift-ovn-kubernetes(c4d15639-62fb-41b7-a1d4-6f51f3af6d99)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitc
h\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-kgrqs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://cdd60109dba5f45418dd5bc29c649a50aae8479fa77fc3dc50ca4f7c3042e3fc\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-28T20:39:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-kgrqs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://45e59dfe364d511e17b8264a31157501de3ed23e
7125d79979df83d328242328\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://45e59dfe364d511e17b8264a31157501de3ed23e7125d79979df83d328242328\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-28T20:39:53Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-28T20:39:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-kgrqs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-28T20:39:53Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-8vmvh\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-28T20:41:03Z is after 2025-08-24T17:21:41Z" Jan 28 20:41:03 crc kubenswrapper[4746]: I0128 20:41:03.154349 4746 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 28 20:41:03 crc kubenswrapper[4746]: I0128 20:41:03.154406 4746 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 28 20:41:03 crc kubenswrapper[4746]: I0128 20:41:03.154417 4746 kubelet_node_status.go:724] 
"Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 28 20:41:03 crc kubenswrapper[4746]: I0128 20:41:03.154435 4746 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 28 20:41:03 crc kubenswrapper[4746]: I0128 20:41:03.154447 4746 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-28T20:41:03Z","lastTransitionTime":"2026-01-28T20:41:03Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 28 20:41:03 crc kubenswrapper[4746]: I0128 20:41:03.257576 4746 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 28 20:41:03 crc kubenswrapper[4746]: I0128 20:41:03.257633 4746 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 28 20:41:03 crc kubenswrapper[4746]: I0128 20:41:03.257644 4746 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 28 20:41:03 crc kubenswrapper[4746]: I0128 20:41:03.257656 4746 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 28 20:41:03 crc kubenswrapper[4746]: I0128 20:41:03.257665 4746 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-28T20:41:03Z","lastTransitionTime":"2026-01-28T20:41:03Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 28 20:41:03 crc kubenswrapper[4746]: I0128 20:41:03.360847 4746 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 28 20:41:03 crc kubenswrapper[4746]: I0128 20:41:03.360891 4746 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 28 20:41:03 crc kubenswrapper[4746]: I0128 20:41:03.360901 4746 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 28 20:41:03 crc kubenswrapper[4746]: I0128 20:41:03.360923 4746 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 28 20:41:03 crc kubenswrapper[4746]: I0128 20:41:03.360934 4746 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-28T20:41:03Z","lastTransitionTime":"2026-01-28T20:41:03Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 28 20:41:03 crc kubenswrapper[4746]: I0128 20:41:03.463604 4746 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 28 20:41:03 crc kubenswrapper[4746]: I0128 20:41:03.463643 4746 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 28 20:41:03 crc kubenswrapper[4746]: I0128 20:41:03.463654 4746 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 28 20:41:03 crc kubenswrapper[4746]: I0128 20:41:03.463671 4746 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 28 20:41:03 crc kubenswrapper[4746]: I0128 20:41:03.463681 4746 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-28T20:41:03Z","lastTransitionTime":"2026-01-28T20:41:03Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 28 20:41:03 crc kubenswrapper[4746]: I0128 20:41:03.566513 4746 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 28 20:41:03 crc kubenswrapper[4746]: I0128 20:41:03.566562 4746 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 28 20:41:03 crc kubenswrapper[4746]: I0128 20:41:03.566576 4746 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 28 20:41:03 crc kubenswrapper[4746]: I0128 20:41:03.566594 4746 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 28 20:41:03 crc kubenswrapper[4746]: I0128 20:41:03.566607 4746 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-28T20:41:03Z","lastTransitionTime":"2026-01-28T20:41:03Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 28 20:41:03 crc kubenswrapper[4746]: I0128 20:41:03.669033 4746 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 28 20:41:03 crc kubenswrapper[4746]: I0128 20:41:03.669060 4746 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 28 20:41:03 crc kubenswrapper[4746]: I0128 20:41:03.669069 4746 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 28 20:41:03 crc kubenswrapper[4746]: I0128 20:41:03.669107 4746 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 28 20:41:03 crc kubenswrapper[4746]: I0128 20:41:03.669118 4746 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-28T20:41:03Z","lastTransitionTime":"2026-01-28T20:41:03Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 28 20:41:03 crc kubenswrapper[4746]: I0128 20:41:03.772920 4746 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 28 20:41:03 crc kubenswrapper[4746]: I0128 20:41:03.772964 4746 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 28 20:41:03 crc kubenswrapper[4746]: I0128 20:41:03.772974 4746 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 28 20:41:03 crc kubenswrapper[4746]: I0128 20:41:03.772999 4746 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 28 20:41:03 crc kubenswrapper[4746]: I0128 20:41:03.773012 4746 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-28T20:41:03Z","lastTransitionTime":"2026-01-28T20:41:03Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 28 20:41:03 crc kubenswrapper[4746]: I0128 20:41:03.834955 4746 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Jan 28 20:41:03 crc kubenswrapper[4746]: E0128 20:41:03.835237 4746 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Jan 28 20:41:03 crc kubenswrapper[4746]: I0128 20:41:03.865340 4746 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2025-12-02 05:35:58.236695679 +0000 UTC Jan 28 20:41:03 crc kubenswrapper[4746]: I0128 20:41:03.875880 4746 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 28 20:41:03 crc kubenswrapper[4746]: I0128 20:41:03.875966 4746 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 28 20:41:03 crc kubenswrapper[4746]: I0128 20:41:03.875998 4746 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 28 20:41:03 crc kubenswrapper[4746]: I0128 20:41:03.876034 4746 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 28 20:41:03 crc kubenswrapper[4746]: I0128 20:41:03.876058 4746 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-28T20:41:03Z","lastTransitionTime":"2026-01-28T20:41:03Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 28 20:41:03 crc kubenswrapper[4746]: I0128 20:41:03.978793 4746 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 28 20:41:03 crc kubenswrapper[4746]: I0128 20:41:03.978851 4746 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 28 20:41:03 crc kubenswrapper[4746]: I0128 20:41:03.978909 4746 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 28 20:41:03 crc kubenswrapper[4746]: I0128 20:41:03.978946 4746 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 28 20:41:03 crc kubenswrapper[4746]: I0128 20:41:03.978975 4746 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-28T20:41:03Z","lastTransitionTime":"2026-01-28T20:41:03Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 28 20:41:04 crc kubenswrapper[4746]: I0128 20:41:04.081688 4746 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 28 20:41:04 crc kubenswrapper[4746]: I0128 20:41:04.081728 4746 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 28 20:41:04 crc kubenswrapper[4746]: I0128 20:41:04.081738 4746 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 28 20:41:04 crc kubenswrapper[4746]: I0128 20:41:04.081759 4746 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 28 20:41:04 crc kubenswrapper[4746]: I0128 20:41:04.081773 4746 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-28T20:41:04Z","lastTransitionTime":"2026-01-28T20:41:04Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 28 20:41:04 crc kubenswrapper[4746]: I0128 20:41:04.185355 4746 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 28 20:41:04 crc kubenswrapper[4746]: I0128 20:41:04.185410 4746 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 28 20:41:04 crc kubenswrapper[4746]: I0128 20:41:04.185433 4746 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 28 20:41:04 crc kubenswrapper[4746]: I0128 20:41:04.185468 4746 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 28 20:41:04 crc kubenswrapper[4746]: I0128 20:41:04.185493 4746 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-28T20:41:04Z","lastTransitionTime":"2026-01-28T20:41:04Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 28 20:41:04 crc kubenswrapper[4746]: I0128 20:41:04.288557 4746 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 28 20:41:04 crc kubenswrapper[4746]: I0128 20:41:04.288603 4746 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 28 20:41:04 crc kubenswrapper[4746]: I0128 20:41:04.288617 4746 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 28 20:41:04 crc kubenswrapper[4746]: I0128 20:41:04.288633 4746 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 28 20:41:04 crc kubenswrapper[4746]: I0128 20:41:04.288643 4746 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-28T20:41:04Z","lastTransitionTime":"2026-01-28T20:41:04Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 28 20:41:04 crc kubenswrapper[4746]: I0128 20:41:04.392955 4746 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 28 20:41:04 crc kubenswrapper[4746]: I0128 20:41:04.393066 4746 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 28 20:41:04 crc kubenswrapper[4746]: I0128 20:41:04.393128 4746 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 28 20:41:04 crc kubenswrapper[4746]: I0128 20:41:04.393188 4746 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 28 20:41:04 crc kubenswrapper[4746]: I0128 20:41:04.393215 4746 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-28T20:41:04Z","lastTransitionTime":"2026-01-28T20:41:04Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 28 20:41:04 crc kubenswrapper[4746]: I0128 20:41:04.496259 4746 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 28 20:41:04 crc kubenswrapper[4746]: I0128 20:41:04.496300 4746 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 28 20:41:04 crc kubenswrapper[4746]: I0128 20:41:04.496310 4746 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 28 20:41:04 crc kubenswrapper[4746]: I0128 20:41:04.496325 4746 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 28 20:41:04 crc kubenswrapper[4746]: I0128 20:41:04.496336 4746 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-28T20:41:04Z","lastTransitionTime":"2026-01-28T20:41:04Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 28 20:41:04 crc kubenswrapper[4746]: I0128 20:41:04.598979 4746 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 28 20:41:04 crc kubenswrapper[4746]: I0128 20:41:04.599061 4746 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 28 20:41:04 crc kubenswrapper[4746]: I0128 20:41:04.599130 4746 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 28 20:41:04 crc kubenswrapper[4746]: I0128 20:41:04.599168 4746 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 28 20:41:04 crc kubenswrapper[4746]: I0128 20:41:04.599194 4746 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-28T20:41:04Z","lastTransitionTime":"2026-01-28T20:41:04Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 28 20:41:04 crc kubenswrapper[4746]: I0128 20:41:04.705243 4746 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 28 20:41:04 crc kubenswrapper[4746]: I0128 20:41:04.705325 4746 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 28 20:41:04 crc kubenswrapper[4746]: I0128 20:41:04.705344 4746 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 28 20:41:04 crc kubenswrapper[4746]: I0128 20:41:04.705373 4746 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 28 20:41:04 crc kubenswrapper[4746]: I0128 20:41:04.705393 4746 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-28T20:41:04Z","lastTransitionTime":"2026-01-28T20:41:04Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 28 20:41:04 crc kubenswrapper[4746]: I0128 20:41:04.808724 4746 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 28 20:41:04 crc kubenswrapper[4746]: I0128 20:41:04.808787 4746 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 28 20:41:04 crc kubenswrapper[4746]: I0128 20:41:04.808798 4746 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 28 20:41:04 crc kubenswrapper[4746]: I0128 20:41:04.808816 4746 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 28 20:41:04 crc kubenswrapper[4746]: I0128 20:41:04.808827 4746 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-28T20:41:04Z","lastTransitionTime":"2026-01-28T20:41:04Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 28 20:41:04 crc kubenswrapper[4746]: I0128 20:41:04.835230 4746 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Jan 28 20:41:04 crc kubenswrapper[4746]: I0128 20:41:04.835291 4746 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Jan 28 20:41:04 crc kubenswrapper[4746]: E0128 20:41:04.835365 4746 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Jan 28 20:41:04 crc kubenswrapper[4746]: I0128 20:41:04.835291 4746 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-2blg6" Jan 28 20:41:04 crc kubenswrapper[4746]: E0128 20:41:04.835527 4746 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Jan 28 20:41:04 crc kubenswrapper[4746]: E0128 20:41:04.835632 4746 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-2blg6" podUID="f60a5487-5012-4cc9-ad94-5dfb4957d74e" Jan 28 20:41:04 crc kubenswrapper[4746]: I0128 20:41:04.866181 4746 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2025-12-03 13:31:30.78609595 +0000 UTC Jan 28 20:41:04 crc kubenswrapper[4746]: I0128 20:41:04.913221 4746 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 28 20:41:04 crc kubenswrapper[4746]: I0128 20:41:04.913296 4746 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 28 20:41:04 crc kubenswrapper[4746]: I0128 20:41:04.913313 4746 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 28 20:41:04 crc kubenswrapper[4746]: I0128 20:41:04.913339 4746 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 28 20:41:04 crc kubenswrapper[4746]: I0128 20:41:04.913359 4746 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-28T20:41:04Z","lastTransitionTime":"2026-01-28T20:41:04Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 28 20:41:05 crc kubenswrapper[4746]: I0128 20:41:05.017690 4746 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 28 20:41:05 crc kubenswrapper[4746]: I0128 20:41:05.017754 4746 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 28 20:41:05 crc kubenswrapper[4746]: I0128 20:41:05.017764 4746 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 28 20:41:05 crc kubenswrapper[4746]: I0128 20:41:05.017785 4746 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 28 20:41:05 crc kubenswrapper[4746]: I0128 20:41:05.017807 4746 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-28T20:41:05Z","lastTransitionTime":"2026-01-28T20:41:05Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 28 20:41:05 crc kubenswrapper[4746]: I0128 20:41:05.120719 4746 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 28 20:41:05 crc kubenswrapper[4746]: I0128 20:41:05.120771 4746 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 28 20:41:05 crc kubenswrapper[4746]: I0128 20:41:05.120783 4746 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 28 20:41:05 crc kubenswrapper[4746]: I0128 20:41:05.120803 4746 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 28 20:41:05 crc kubenswrapper[4746]: I0128 20:41:05.120818 4746 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-28T20:41:05Z","lastTransitionTime":"2026-01-28T20:41:05Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 28 20:41:05 crc kubenswrapper[4746]: I0128 20:41:05.226047 4746 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 28 20:41:05 crc kubenswrapper[4746]: I0128 20:41:05.226156 4746 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 28 20:41:05 crc kubenswrapper[4746]: I0128 20:41:05.226168 4746 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 28 20:41:05 crc kubenswrapper[4746]: I0128 20:41:05.226192 4746 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 28 20:41:05 crc kubenswrapper[4746]: I0128 20:41:05.226204 4746 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-28T20:41:05Z","lastTransitionTime":"2026-01-28T20:41:05Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 28 20:41:05 crc kubenswrapper[4746]: I0128 20:41:05.330643 4746 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 28 20:41:05 crc kubenswrapper[4746]: I0128 20:41:05.330714 4746 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 28 20:41:05 crc kubenswrapper[4746]: I0128 20:41:05.330734 4746 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 28 20:41:05 crc kubenswrapper[4746]: I0128 20:41:05.330764 4746 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 28 20:41:05 crc kubenswrapper[4746]: I0128 20:41:05.330795 4746 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-28T20:41:05Z","lastTransitionTime":"2026-01-28T20:41:05Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 28 20:41:05 crc kubenswrapper[4746]: I0128 20:41:05.440351 4746 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 28 20:41:05 crc kubenswrapper[4746]: I0128 20:41:05.440454 4746 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 28 20:41:05 crc kubenswrapper[4746]: I0128 20:41:05.440476 4746 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 28 20:41:05 crc kubenswrapper[4746]: I0128 20:41:05.440695 4746 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 28 20:41:05 crc kubenswrapper[4746]: I0128 20:41:05.440715 4746 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-28T20:41:05Z","lastTransitionTime":"2026-01-28T20:41:05Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 28 20:41:05 crc kubenswrapper[4746]: I0128 20:41:05.546168 4746 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 28 20:41:05 crc kubenswrapper[4746]: I0128 20:41:05.546223 4746 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 28 20:41:05 crc kubenswrapper[4746]: I0128 20:41:05.546232 4746 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 28 20:41:05 crc kubenswrapper[4746]: I0128 20:41:05.546248 4746 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 28 20:41:05 crc kubenswrapper[4746]: I0128 20:41:05.546261 4746 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-28T20:41:05Z","lastTransitionTime":"2026-01-28T20:41:05Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 28 20:41:05 crc kubenswrapper[4746]: I0128 20:41:05.649790 4746 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 28 20:41:05 crc kubenswrapper[4746]: I0128 20:41:05.649865 4746 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 28 20:41:05 crc kubenswrapper[4746]: I0128 20:41:05.649884 4746 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 28 20:41:05 crc kubenswrapper[4746]: I0128 20:41:05.649912 4746 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 28 20:41:05 crc kubenswrapper[4746]: I0128 20:41:05.649931 4746 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-28T20:41:05Z","lastTransitionTime":"2026-01-28T20:41:05Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 28 20:41:05 crc kubenswrapper[4746]: I0128 20:41:05.754111 4746 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 28 20:41:05 crc kubenswrapper[4746]: I0128 20:41:05.754179 4746 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 28 20:41:05 crc kubenswrapper[4746]: I0128 20:41:05.754191 4746 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 28 20:41:05 crc kubenswrapper[4746]: I0128 20:41:05.754215 4746 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 28 20:41:05 crc kubenswrapper[4746]: I0128 20:41:05.754230 4746 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-28T20:41:05Z","lastTransitionTime":"2026-01-28T20:41:05Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 28 20:41:05 crc kubenswrapper[4746]: I0128 20:41:05.835705 4746 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Jan 28 20:41:05 crc kubenswrapper[4746]: E0128 20:41:05.835987 4746 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Jan 28 20:41:05 crc kubenswrapper[4746]: I0128 20:41:05.856132 4746 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 28 20:41:05 crc kubenswrapper[4746]: I0128 20:41:05.856224 4746 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 28 20:41:05 crc kubenswrapper[4746]: I0128 20:41:05.856252 4746 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 28 20:41:05 crc kubenswrapper[4746]: I0128 20:41:05.856287 4746 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 28 20:41:05 crc kubenswrapper[4746]: I0128 20:41:05.856312 4746 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-28T20:41:05Z","lastTransitionTime":"2026-01-28T20:41:05Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 28 20:41:05 crc kubenswrapper[4746]: I0128 20:41:05.867256 4746 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2025-11-24 01:34:15.329702273 +0000 UTC Jan 28 20:41:05 crc kubenswrapper[4746]: I0128 20:41:05.959050 4746 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 28 20:41:05 crc kubenswrapper[4746]: I0128 20:41:05.959124 4746 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 28 20:41:05 crc kubenswrapper[4746]: I0128 20:41:05.959133 4746 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 28 20:41:05 crc kubenswrapper[4746]: I0128 20:41:05.959149 4746 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 28 20:41:05 crc kubenswrapper[4746]: I0128 20:41:05.959159 4746 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-28T20:41:05Z","lastTransitionTime":"2026-01-28T20:41:05Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 28 20:41:06 crc kubenswrapper[4746]: I0128 20:41:06.062628 4746 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 28 20:41:06 crc kubenswrapper[4746]: I0128 20:41:06.062723 4746 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 28 20:41:06 crc kubenswrapper[4746]: I0128 20:41:06.062745 4746 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 28 20:41:06 crc kubenswrapper[4746]: I0128 20:41:06.062777 4746 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 28 20:41:06 crc kubenswrapper[4746]: I0128 20:41:06.062798 4746 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-28T20:41:06Z","lastTransitionTime":"2026-01-28T20:41:06Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 28 20:41:06 crc kubenswrapper[4746]: I0128 20:41:06.166186 4746 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 28 20:41:06 crc kubenswrapper[4746]: I0128 20:41:06.166245 4746 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 28 20:41:06 crc kubenswrapper[4746]: I0128 20:41:06.166255 4746 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 28 20:41:06 crc kubenswrapper[4746]: I0128 20:41:06.166274 4746 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 28 20:41:06 crc kubenswrapper[4746]: I0128 20:41:06.166288 4746 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-28T20:41:06Z","lastTransitionTime":"2026-01-28T20:41:06Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 28 20:41:06 crc kubenswrapper[4746]: I0128 20:41:06.270192 4746 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 28 20:41:06 crc kubenswrapper[4746]: I0128 20:41:06.270266 4746 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 28 20:41:06 crc kubenswrapper[4746]: I0128 20:41:06.270285 4746 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 28 20:41:06 crc kubenswrapper[4746]: I0128 20:41:06.270312 4746 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 28 20:41:06 crc kubenswrapper[4746]: I0128 20:41:06.270331 4746 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-28T20:41:06Z","lastTransitionTime":"2026-01-28T20:41:06Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 28 20:41:06 crc kubenswrapper[4746]: I0128 20:41:06.373739 4746 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 28 20:41:06 crc kubenswrapper[4746]: I0128 20:41:06.373803 4746 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 28 20:41:06 crc kubenswrapper[4746]: I0128 20:41:06.373822 4746 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 28 20:41:06 crc kubenswrapper[4746]: I0128 20:41:06.373852 4746 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 28 20:41:06 crc kubenswrapper[4746]: I0128 20:41:06.373873 4746 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-28T20:41:06Z","lastTransitionTime":"2026-01-28T20:41:06Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 28 20:41:06 crc kubenswrapper[4746]: I0128 20:41:06.476308 4746 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 28 20:41:06 crc kubenswrapper[4746]: I0128 20:41:06.476341 4746 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 28 20:41:06 crc kubenswrapper[4746]: I0128 20:41:06.476357 4746 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 28 20:41:06 crc kubenswrapper[4746]: I0128 20:41:06.476373 4746 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 28 20:41:06 crc kubenswrapper[4746]: I0128 20:41:06.476381 4746 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-28T20:41:06Z","lastTransitionTime":"2026-01-28T20:41:06Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 28 20:41:06 crc kubenswrapper[4746]: I0128 20:41:06.580187 4746 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 28 20:41:06 crc kubenswrapper[4746]: I0128 20:41:06.580274 4746 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 28 20:41:06 crc kubenswrapper[4746]: I0128 20:41:06.580308 4746 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 28 20:41:06 crc kubenswrapper[4746]: I0128 20:41:06.580342 4746 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 28 20:41:06 crc kubenswrapper[4746]: I0128 20:41:06.580376 4746 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-28T20:41:06Z","lastTransitionTime":"2026-01-28T20:41:06Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 28 20:41:06 crc kubenswrapper[4746]: I0128 20:41:06.683193 4746 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 28 20:41:06 crc kubenswrapper[4746]: I0128 20:41:06.683226 4746 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 28 20:41:06 crc kubenswrapper[4746]: I0128 20:41:06.683234 4746 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 28 20:41:06 crc kubenswrapper[4746]: I0128 20:41:06.683246 4746 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 28 20:41:06 crc kubenswrapper[4746]: I0128 20:41:06.683254 4746 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-28T20:41:06Z","lastTransitionTime":"2026-01-28T20:41:06Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 28 20:41:06 crc kubenswrapper[4746]: I0128 20:41:06.785969 4746 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 28 20:41:06 crc kubenswrapper[4746]: I0128 20:41:06.786014 4746 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 28 20:41:06 crc kubenswrapper[4746]: I0128 20:41:06.786023 4746 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 28 20:41:06 crc kubenswrapper[4746]: I0128 20:41:06.786038 4746 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 28 20:41:06 crc kubenswrapper[4746]: I0128 20:41:06.786049 4746 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-28T20:41:06Z","lastTransitionTime":"2026-01-28T20:41:06Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 28 20:41:06 crc kubenswrapper[4746]: I0128 20:41:06.835399 4746 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Jan 28 20:41:06 crc kubenswrapper[4746]: E0128 20:41:06.835543 4746 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Jan 28 20:41:06 crc kubenswrapper[4746]: I0128 20:41:06.835760 4746 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Jan 28 20:41:06 crc kubenswrapper[4746]: E0128 20:41:06.835808 4746 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Jan 28 20:41:06 crc kubenswrapper[4746]: I0128 20:41:06.835979 4746 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-2blg6" Jan 28 20:41:06 crc kubenswrapper[4746]: E0128 20:41:06.836037 4746 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-2blg6" podUID="f60a5487-5012-4cc9-ad94-5dfb4957d74e" Jan 28 20:41:06 crc kubenswrapper[4746]: I0128 20:41:06.868323 4746 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2025-12-29 15:13:44.068096053 +0000 UTC Jan 28 20:41:06 crc kubenswrapper[4746]: I0128 20:41:06.889376 4746 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 28 20:41:06 crc kubenswrapper[4746]: I0128 20:41:06.889424 4746 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 28 20:41:06 crc kubenswrapper[4746]: I0128 20:41:06.889438 4746 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 28 20:41:06 crc kubenswrapper[4746]: I0128 20:41:06.889454 4746 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 28 20:41:06 crc kubenswrapper[4746]: I0128 20:41:06.889465 4746 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-28T20:41:06Z","lastTransitionTime":"2026-01-28T20:41:06Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 28 20:41:06 crc kubenswrapper[4746]: I0128 20:41:06.992586 4746 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 28 20:41:06 crc kubenswrapper[4746]: I0128 20:41:06.992623 4746 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 28 20:41:06 crc kubenswrapper[4746]: I0128 20:41:06.992632 4746 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 28 20:41:06 crc kubenswrapper[4746]: I0128 20:41:06.992650 4746 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 28 20:41:06 crc kubenswrapper[4746]: I0128 20:41:06.992660 4746 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-28T20:41:06Z","lastTransitionTime":"2026-01-28T20:41:06Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 28 20:41:07 crc kubenswrapper[4746]: I0128 20:41:07.096802 4746 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 28 20:41:07 crc kubenswrapper[4746]: I0128 20:41:07.096872 4746 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 28 20:41:07 crc kubenswrapper[4746]: I0128 20:41:07.096893 4746 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 28 20:41:07 crc kubenswrapper[4746]: I0128 20:41:07.096923 4746 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 28 20:41:07 crc kubenswrapper[4746]: I0128 20:41:07.096946 4746 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-28T20:41:07Z","lastTransitionTime":"2026-01-28T20:41:07Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 28 20:41:07 crc kubenswrapper[4746]: I0128 20:41:07.200499 4746 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 28 20:41:07 crc kubenswrapper[4746]: I0128 20:41:07.200533 4746 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 28 20:41:07 crc kubenswrapper[4746]: I0128 20:41:07.200543 4746 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 28 20:41:07 crc kubenswrapper[4746]: I0128 20:41:07.200581 4746 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 28 20:41:07 crc kubenswrapper[4746]: I0128 20:41:07.200590 4746 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-28T20:41:07Z","lastTransitionTime":"2026-01-28T20:41:07Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 28 20:41:07 crc kubenswrapper[4746]: I0128 20:41:07.304303 4746 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 28 20:41:07 crc kubenswrapper[4746]: I0128 20:41:07.304383 4746 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 28 20:41:07 crc kubenswrapper[4746]: I0128 20:41:07.304409 4746 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 28 20:41:07 crc kubenswrapper[4746]: I0128 20:41:07.304454 4746 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 28 20:41:07 crc kubenswrapper[4746]: I0128 20:41:07.304483 4746 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-28T20:41:07Z","lastTransitionTime":"2026-01-28T20:41:07Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 28 20:41:07 crc kubenswrapper[4746]: I0128 20:41:07.406998 4746 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 28 20:41:07 crc kubenswrapper[4746]: I0128 20:41:07.407040 4746 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 28 20:41:07 crc kubenswrapper[4746]: I0128 20:41:07.407052 4746 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 28 20:41:07 crc kubenswrapper[4746]: I0128 20:41:07.407068 4746 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 28 20:41:07 crc kubenswrapper[4746]: I0128 20:41:07.407107 4746 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-28T20:41:07Z","lastTransitionTime":"2026-01-28T20:41:07Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 28 20:41:07 crc kubenswrapper[4746]: I0128 20:41:07.509347 4746 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 28 20:41:07 crc kubenswrapper[4746]: I0128 20:41:07.509376 4746 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 28 20:41:07 crc kubenswrapper[4746]: I0128 20:41:07.509383 4746 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 28 20:41:07 crc kubenswrapper[4746]: I0128 20:41:07.509395 4746 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 28 20:41:07 crc kubenswrapper[4746]: I0128 20:41:07.509404 4746 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-28T20:41:07Z","lastTransitionTime":"2026-01-28T20:41:07Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 28 20:41:07 crc kubenswrapper[4746]: I0128 20:41:07.612709 4746 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 28 20:41:07 crc kubenswrapper[4746]: I0128 20:41:07.612770 4746 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 28 20:41:07 crc kubenswrapper[4746]: I0128 20:41:07.612788 4746 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 28 20:41:07 crc kubenswrapper[4746]: I0128 20:41:07.612816 4746 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 28 20:41:07 crc kubenswrapper[4746]: I0128 20:41:07.612835 4746 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-28T20:41:07Z","lastTransitionTime":"2026-01-28T20:41:07Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 28 20:41:07 crc kubenswrapper[4746]: I0128 20:41:07.716437 4746 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 28 20:41:07 crc kubenswrapper[4746]: I0128 20:41:07.716517 4746 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 28 20:41:07 crc kubenswrapper[4746]: I0128 20:41:07.716538 4746 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 28 20:41:07 crc kubenswrapper[4746]: I0128 20:41:07.716608 4746 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 28 20:41:07 crc kubenswrapper[4746]: I0128 20:41:07.716631 4746 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-28T20:41:07Z","lastTransitionTime":"2026-01-28T20:41:07Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 28 20:41:07 crc kubenswrapper[4746]: I0128 20:41:07.820360 4746 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 28 20:41:07 crc kubenswrapper[4746]: I0128 20:41:07.820446 4746 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 28 20:41:07 crc kubenswrapper[4746]: I0128 20:41:07.820470 4746 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 28 20:41:07 crc kubenswrapper[4746]: I0128 20:41:07.820511 4746 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 28 20:41:07 crc kubenswrapper[4746]: I0128 20:41:07.820569 4746 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-28T20:41:07Z","lastTransitionTime":"2026-01-28T20:41:07Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 28 20:41:07 crc kubenswrapper[4746]: I0128 20:41:07.835112 4746 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Jan 28 20:41:07 crc kubenswrapper[4746]: E0128 20:41:07.835327 4746 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Jan 28 20:41:07 crc kubenswrapper[4746]: I0128 20:41:07.869422 4746 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2025-11-26 01:48:02.313765398 +0000 UTC Jan 28 20:41:07 crc kubenswrapper[4746]: I0128 20:41:07.923490 4746 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 28 20:41:07 crc kubenswrapper[4746]: I0128 20:41:07.923556 4746 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 28 20:41:07 crc kubenswrapper[4746]: I0128 20:41:07.923573 4746 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 28 20:41:07 crc kubenswrapper[4746]: I0128 20:41:07.923608 4746 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 28 20:41:07 crc kubenswrapper[4746]: I0128 20:41:07.923634 4746 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-28T20:41:07Z","lastTransitionTime":"2026-01-28T20:41:07Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 28 20:41:08 crc kubenswrapper[4746]: I0128 20:41:08.027666 4746 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 28 20:41:08 crc kubenswrapper[4746]: I0128 20:41:08.027764 4746 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 28 20:41:08 crc kubenswrapper[4746]: I0128 20:41:08.027788 4746 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 28 20:41:08 crc kubenswrapper[4746]: I0128 20:41:08.027824 4746 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 28 20:41:08 crc kubenswrapper[4746]: I0128 20:41:08.027847 4746 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-28T20:41:08Z","lastTransitionTime":"2026-01-28T20:41:08Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 28 20:41:08 crc kubenswrapper[4746]: I0128 20:41:08.131575 4746 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 28 20:41:08 crc kubenswrapper[4746]: I0128 20:41:08.131711 4746 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 28 20:41:08 crc kubenswrapper[4746]: I0128 20:41:08.131733 4746 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 28 20:41:08 crc kubenswrapper[4746]: I0128 20:41:08.131761 4746 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 28 20:41:08 crc kubenswrapper[4746]: I0128 20:41:08.131783 4746 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-28T20:41:08Z","lastTransitionTime":"2026-01-28T20:41:08Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 28 20:41:08 crc kubenswrapper[4746]: I0128 20:41:08.235461 4746 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 28 20:41:08 crc kubenswrapper[4746]: I0128 20:41:08.235549 4746 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 28 20:41:08 crc kubenswrapper[4746]: I0128 20:41:08.235587 4746 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 28 20:41:08 crc kubenswrapper[4746]: I0128 20:41:08.235624 4746 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 28 20:41:08 crc kubenswrapper[4746]: I0128 20:41:08.235648 4746 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-28T20:41:08Z","lastTransitionTime":"2026-01-28T20:41:08Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 28 20:41:08 crc kubenswrapper[4746]: I0128 20:41:08.339135 4746 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 28 20:41:08 crc kubenswrapper[4746]: I0128 20:41:08.339186 4746 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 28 20:41:08 crc kubenswrapper[4746]: I0128 20:41:08.339203 4746 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 28 20:41:08 crc kubenswrapper[4746]: I0128 20:41:08.339232 4746 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 28 20:41:08 crc kubenswrapper[4746]: I0128 20:41:08.339252 4746 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-28T20:41:08Z","lastTransitionTime":"2026-01-28T20:41:08Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 28 20:41:08 crc kubenswrapper[4746]: I0128 20:41:08.450727 4746 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 28 20:41:08 crc kubenswrapper[4746]: I0128 20:41:08.450807 4746 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 28 20:41:08 crc kubenswrapper[4746]: I0128 20:41:08.450829 4746 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 28 20:41:08 crc kubenswrapper[4746]: I0128 20:41:08.450861 4746 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 28 20:41:08 crc kubenswrapper[4746]: I0128 20:41:08.450883 4746 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-28T20:41:08Z","lastTransitionTime":"2026-01-28T20:41:08Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 28 20:41:08 crc kubenswrapper[4746]: I0128 20:41:08.554152 4746 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 28 20:41:08 crc kubenswrapper[4746]: I0128 20:41:08.554235 4746 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 28 20:41:08 crc kubenswrapper[4746]: I0128 20:41:08.554257 4746 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 28 20:41:08 crc kubenswrapper[4746]: I0128 20:41:08.554290 4746 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 28 20:41:08 crc kubenswrapper[4746]: I0128 20:41:08.554369 4746 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-28T20:41:08Z","lastTransitionTime":"2026-01-28T20:41:08Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 28 20:41:08 crc kubenswrapper[4746]: I0128 20:41:08.657535 4746 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 28 20:41:08 crc kubenswrapper[4746]: I0128 20:41:08.657610 4746 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 28 20:41:08 crc kubenswrapper[4746]: I0128 20:41:08.657625 4746 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 28 20:41:08 crc kubenswrapper[4746]: I0128 20:41:08.657646 4746 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 28 20:41:08 crc kubenswrapper[4746]: I0128 20:41:08.657659 4746 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-28T20:41:08Z","lastTransitionTime":"2026-01-28T20:41:08Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 28 20:41:08 crc kubenswrapper[4746]: I0128 20:41:08.760120 4746 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 28 20:41:08 crc kubenswrapper[4746]: I0128 20:41:08.760164 4746 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 28 20:41:08 crc kubenswrapper[4746]: I0128 20:41:08.760173 4746 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 28 20:41:08 crc kubenswrapper[4746]: I0128 20:41:08.760188 4746 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 28 20:41:08 crc kubenswrapper[4746]: I0128 20:41:08.760198 4746 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-28T20:41:08Z","lastTransitionTime":"2026-01-28T20:41:08Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 28 20:41:08 crc kubenswrapper[4746]: I0128 20:41:08.835436 4746 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Jan 28 20:41:08 crc kubenswrapper[4746]: I0128 20:41:08.835439 4746 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-2blg6" Jan 28 20:41:08 crc kubenswrapper[4746]: E0128 20:41:08.835707 4746 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Jan 28 20:41:08 crc kubenswrapper[4746]: E0128 20:41:08.835767 4746 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-2blg6" podUID="f60a5487-5012-4cc9-ad94-5dfb4957d74e" Jan 28 20:41:08 crc kubenswrapper[4746]: I0128 20:41:08.835458 4746 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Jan 28 20:41:08 crc kubenswrapper[4746]: E0128 20:41:08.835856 4746 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Jan 28 20:41:08 crc kubenswrapper[4746]: I0128 20:41:08.863893 4746 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 28 20:41:08 crc kubenswrapper[4746]: I0128 20:41:08.863956 4746 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 28 20:41:08 crc kubenswrapper[4746]: I0128 20:41:08.863967 4746 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 28 20:41:08 crc kubenswrapper[4746]: I0128 20:41:08.863983 4746 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 28 20:41:08 crc kubenswrapper[4746]: I0128 20:41:08.863995 4746 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-28T20:41:08Z","lastTransitionTime":"2026-01-28T20:41:08Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 28 20:41:08 crc kubenswrapper[4746]: I0128 20:41:08.869562 4746 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2025-11-30 18:36:03.958564931 +0000 UTC Jan 28 20:41:08 crc kubenswrapper[4746]: I0128 20:41:08.967120 4746 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 28 20:41:08 crc kubenswrapper[4746]: I0128 20:41:08.967181 4746 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 28 20:41:08 crc kubenswrapper[4746]: I0128 20:41:08.967192 4746 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 28 20:41:08 crc kubenswrapper[4746]: I0128 20:41:08.967212 4746 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 28 20:41:08 crc kubenswrapper[4746]: I0128 20:41:08.967225 4746 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-28T20:41:08Z","lastTransitionTime":"2026-01-28T20:41:08Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 28 20:41:09 crc kubenswrapper[4746]: I0128 20:41:09.070927 4746 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 28 20:41:09 crc kubenswrapper[4746]: I0128 20:41:09.070997 4746 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 28 20:41:09 crc kubenswrapper[4746]: I0128 20:41:09.071015 4746 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 28 20:41:09 crc kubenswrapper[4746]: I0128 20:41:09.071043 4746 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 28 20:41:09 crc kubenswrapper[4746]: I0128 20:41:09.071062 4746 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-28T20:41:09Z","lastTransitionTime":"2026-01-28T20:41:09Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 28 20:41:09 crc kubenswrapper[4746]: I0128 20:41:09.175555 4746 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 28 20:41:09 crc kubenswrapper[4746]: I0128 20:41:09.175625 4746 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 28 20:41:09 crc kubenswrapper[4746]: I0128 20:41:09.175690 4746 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 28 20:41:09 crc kubenswrapper[4746]: I0128 20:41:09.175775 4746 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 28 20:41:09 crc kubenswrapper[4746]: I0128 20:41:09.175804 4746 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-28T20:41:09Z","lastTransitionTime":"2026-01-28T20:41:09Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 28 20:41:09 crc kubenswrapper[4746]: I0128 20:41:09.279533 4746 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 28 20:41:09 crc kubenswrapper[4746]: I0128 20:41:09.279629 4746 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 28 20:41:09 crc kubenswrapper[4746]: I0128 20:41:09.279658 4746 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 28 20:41:09 crc kubenswrapper[4746]: I0128 20:41:09.279686 4746 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 28 20:41:09 crc kubenswrapper[4746]: I0128 20:41:09.279709 4746 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-28T20:41:09Z","lastTransitionTime":"2026-01-28T20:41:09Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 28 20:41:09 crc kubenswrapper[4746]: I0128 20:41:09.383089 4746 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 28 20:41:09 crc kubenswrapper[4746]: I0128 20:41:09.383135 4746 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 28 20:41:09 crc kubenswrapper[4746]: I0128 20:41:09.383144 4746 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 28 20:41:09 crc kubenswrapper[4746]: I0128 20:41:09.383161 4746 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 28 20:41:09 crc kubenswrapper[4746]: I0128 20:41:09.383170 4746 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-28T20:41:09Z","lastTransitionTime":"2026-01-28T20:41:09Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 28 20:41:09 crc kubenswrapper[4746]: I0128 20:41:09.486472 4746 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 28 20:41:09 crc kubenswrapper[4746]: I0128 20:41:09.486605 4746 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 28 20:41:09 crc kubenswrapper[4746]: I0128 20:41:09.486633 4746 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 28 20:41:09 crc kubenswrapper[4746]: I0128 20:41:09.486675 4746 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 28 20:41:09 crc kubenswrapper[4746]: I0128 20:41:09.486728 4746 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-28T20:41:09Z","lastTransitionTime":"2026-01-28T20:41:09Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 28 20:41:09 crc kubenswrapper[4746]: I0128 20:41:09.590681 4746 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 28 20:41:09 crc kubenswrapper[4746]: I0128 20:41:09.590736 4746 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 28 20:41:09 crc kubenswrapper[4746]: I0128 20:41:09.590749 4746 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 28 20:41:09 crc kubenswrapper[4746]: I0128 20:41:09.590765 4746 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 28 20:41:09 crc kubenswrapper[4746]: I0128 20:41:09.590776 4746 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-28T20:41:09Z","lastTransitionTime":"2026-01-28T20:41:09Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 28 20:41:09 crc kubenswrapper[4746]: I0128 20:41:09.694231 4746 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 28 20:41:09 crc kubenswrapper[4746]: I0128 20:41:09.694319 4746 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 28 20:41:09 crc kubenswrapper[4746]: I0128 20:41:09.694342 4746 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 28 20:41:09 crc kubenswrapper[4746]: I0128 20:41:09.694377 4746 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 28 20:41:09 crc kubenswrapper[4746]: I0128 20:41:09.694400 4746 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-28T20:41:09Z","lastTransitionTime":"2026-01-28T20:41:09Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 28 20:41:09 crc kubenswrapper[4746]: I0128 20:41:09.798727 4746 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 28 20:41:09 crc kubenswrapper[4746]: I0128 20:41:09.798801 4746 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 28 20:41:09 crc kubenswrapper[4746]: I0128 20:41:09.798823 4746 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 28 20:41:09 crc kubenswrapper[4746]: I0128 20:41:09.798853 4746 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 28 20:41:09 crc kubenswrapper[4746]: I0128 20:41:09.798877 4746 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-28T20:41:09Z","lastTransitionTime":"2026-01-28T20:41:09Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 28 20:41:09 crc kubenswrapper[4746]: I0128 20:41:09.835394 4746 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Jan 28 20:41:09 crc kubenswrapper[4746]: E0128 20:41:09.835621 4746 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Jan 28 20:41:09 crc kubenswrapper[4746]: I0128 20:41:09.870149 4746 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2025-12-13 23:27:09.882379993 +0000 UTC Jan 28 20:41:09 crc kubenswrapper[4746]: I0128 20:41:09.902523 4746 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 28 20:41:09 crc kubenswrapper[4746]: I0128 20:41:09.902604 4746 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 28 20:41:09 crc kubenswrapper[4746]: I0128 20:41:09.902628 4746 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 28 20:41:09 crc kubenswrapper[4746]: I0128 20:41:09.902666 4746 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 28 20:41:09 crc kubenswrapper[4746]: I0128 20:41:09.902690 4746 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-28T20:41:09Z","lastTransitionTime":"2026-01-28T20:41:09Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 28 20:41:10 crc kubenswrapper[4746]: I0128 20:41:10.007150 4746 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 28 20:41:10 crc kubenswrapper[4746]: I0128 20:41:10.007239 4746 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 28 20:41:10 crc kubenswrapper[4746]: I0128 20:41:10.007253 4746 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 28 20:41:10 crc kubenswrapper[4746]: I0128 20:41:10.007277 4746 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 28 20:41:10 crc kubenswrapper[4746]: I0128 20:41:10.007292 4746 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-28T20:41:10Z","lastTransitionTime":"2026-01-28T20:41:10Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 28 20:41:10 crc kubenswrapper[4746]: I0128 20:41:10.110384 4746 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 28 20:41:10 crc kubenswrapper[4746]: I0128 20:41:10.110446 4746 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 28 20:41:10 crc kubenswrapper[4746]: I0128 20:41:10.110467 4746 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 28 20:41:10 crc kubenswrapper[4746]: I0128 20:41:10.110494 4746 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 28 20:41:10 crc kubenswrapper[4746]: I0128 20:41:10.110518 4746 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-28T20:41:10Z","lastTransitionTime":"2026-01-28T20:41:10Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 28 20:41:10 crc kubenswrapper[4746]: I0128 20:41:10.214024 4746 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 28 20:41:10 crc kubenswrapper[4746]: I0128 20:41:10.214072 4746 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 28 20:41:10 crc kubenswrapper[4746]: I0128 20:41:10.214104 4746 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 28 20:41:10 crc kubenswrapper[4746]: I0128 20:41:10.214121 4746 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 28 20:41:10 crc kubenswrapper[4746]: I0128 20:41:10.214132 4746 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-28T20:41:10Z","lastTransitionTime":"2026-01-28T20:41:10Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 28 20:41:10 crc kubenswrapper[4746]: I0128 20:41:10.318258 4746 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 28 20:41:10 crc kubenswrapper[4746]: I0128 20:41:10.318323 4746 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 28 20:41:10 crc kubenswrapper[4746]: I0128 20:41:10.318342 4746 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 28 20:41:10 crc kubenswrapper[4746]: I0128 20:41:10.318372 4746 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 28 20:41:10 crc kubenswrapper[4746]: I0128 20:41:10.318391 4746 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-28T20:41:10Z","lastTransitionTime":"2026-01-28T20:41:10Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 28 20:41:10 crc kubenswrapper[4746]: I0128 20:41:10.422617 4746 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 28 20:41:10 crc kubenswrapper[4746]: I0128 20:41:10.422932 4746 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 28 20:41:10 crc kubenswrapper[4746]: I0128 20:41:10.422957 4746 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 28 20:41:10 crc kubenswrapper[4746]: I0128 20:41:10.422994 4746 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 28 20:41:10 crc kubenswrapper[4746]: I0128 20:41:10.423019 4746 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-28T20:41:10Z","lastTransitionTime":"2026-01-28T20:41:10Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 28 20:41:10 crc kubenswrapper[4746]: I0128 20:41:10.526586 4746 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 28 20:41:10 crc kubenswrapper[4746]: I0128 20:41:10.526667 4746 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 28 20:41:10 crc kubenswrapper[4746]: I0128 20:41:10.526686 4746 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 28 20:41:10 crc kubenswrapper[4746]: I0128 20:41:10.526719 4746 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 28 20:41:10 crc kubenswrapper[4746]: I0128 20:41:10.526742 4746 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-28T20:41:10Z","lastTransitionTime":"2026-01-28T20:41:10Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 28 20:41:10 crc kubenswrapper[4746]: I0128 20:41:10.629771 4746 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 28 20:41:10 crc kubenswrapper[4746]: I0128 20:41:10.629823 4746 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 28 20:41:10 crc kubenswrapper[4746]: I0128 20:41:10.629837 4746 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 28 20:41:10 crc kubenswrapper[4746]: I0128 20:41:10.629853 4746 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 28 20:41:10 crc kubenswrapper[4746]: I0128 20:41:10.629863 4746 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-28T20:41:10Z","lastTransitionTime":"2026-01-28T20:41:10Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 28 20:41:10 crc kubenswrapper[4746]: I0128 20:41:10.732955 4746 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 28 20:41:10 crc kubenswrapper[4746]: I0128 20:41:10.733007 4746 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 28 20:41:10 crc kubenswrapper[4746]: I0128 20:41:10.733023 4746 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 28 20:41:10 crc kubenswrapper[4746]: I0128 20:41:10.733050 4746 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 28 20:41:10 crc kubenswrapper[4746]: I0128 20:41:10.733069 4746 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-28T20:41:10Z","lastTransitionTime":"2026-01-28T20:41:10Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 28 20:41:10 crc kubenswrapper[4746]: I0128 20:41:10.835354 4746 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Jan 28 20:41:10 crc kubenswrapper[4746]: I0128 20:41:10.835715 4746 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-2blg6" Jan 28 20:41:10 crc kubenswrapper[4746]: I0128 20:41:10.835736 4746 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Jan 28 20:41:10 crc kubenswrapper[4746]: I0128 20:41:10.835779 4746 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 28 20:41:10 crc kubenswrapper[4746]: I0128 20:41:10.835858 4746 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 28 20:41:10 crc kubenswrapper[4746]: I0128 20:41:10.835877 4746 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 28 20:41:10 crc kubenswrapper[4746]: E0128 20:41:10.835874 4746 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Jan 28 20:41:10 crc kubenswrapper[4746]: I0128 20:41:10.835904 4746 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 28 20:41:10 crc kubenswrapper[4746]: I0128 20:41:10.835924 4746 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-28T20:41:10Z","lastTransitionTime":"2026-01-28T20:41:10Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 28 20:41:10 crc kubenswrapper[4746]: E0128 20:41:10.835962 4746 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-2blg6" podUID="f60a5487-5012-4cc9-ad94-5dfb4957d74e" Jan 28 20:41:10 crc kubenswrapper[4746]: E0128 20:41:10.836015 4746 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Jan 28 20:41:10 crc kubenswrapper[4746]: I0128 20:41:10.870879 4746 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2026-01-10 15:27:57.465262712 +0000 UTC Jan 28 20:41:10 crc kubenswrapper[4746]: I0128 20:41:10.939051 4746 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 28 20:41:10 crc kubenswrapper[4746]: I0128 20:41:10.939157 4746 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 28 20:41:10 crc kubenswrapper[4746]: I0128 20:41:10.939193 4746 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 28 20:41:10 crc kubenswrapper[4746]: I0128 20:41:10.939223 4746 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 28 20:41:10 crc kubenswrapper[4746]: I0128 20:41:10.939246 4746 setters.go:603] "Node 
became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-28T20:41:10Z","lastTransitionTime":"2026-01-28T20:41:10Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 28 20:41:11 crc kubenswrapper[4746]: I0128 20:41:11.017988 4746 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/f60a5487-5012-4cc9-ad94-5dfb4957d74e-metrics-certs\") pod \"network-metrics-daemon-2blg6\" (UID: \"f60a5487-5012-4cc9-ad94-5dfb4957d74e\") " pod="openshift-multus/network-metrics-daemon-2blg6" Jan 28 20:41:11 crc kubenswrapper[4746]: E0128 20:41:11.018249 4746 secret.go:188] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered Jan 28 20:41:11 crc kubenswrapper[4746]: E0128 20:41:11.018355 4746 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/f60a5487-5012-4cc9-ad94-5dfb4957d74e-metrics-certs podName:f60a5487-5012-4cc9-ad94-5dfb4957d74e nodeName:}" failed. No retries permitted until 2026-01-28 20:42:15.018325941 +0000 UTC m=+162.974512335 (durationBeforeRetry 1m4s). 
Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/f60a5487-5012-4cc9-ad94-5dfb4957d74e-metrics-certs") pod "network-metrics-daemon-2blg6" (UID: "f60a5487-5012-4cc9-ad94-5dfb4957d74e") : object "openshift-multus"/"metrics-daemon-secret" not registered Jan 28 20:41:11 crc kubenswrapper[4746]: I0128 20:41:11.042182 4746 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 28 20:41:11 crc kubenswrapper[4746]: I0128 20:41:11.042249 4746 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 28 20:41:11 crc kubenswrapper[4746]: I0128 20:41:11.042266 4746 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 28 20:41:11 crc kubenswrapper[4746]: I0128 20:41:11.042294 4746 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 28 20:41:11 crc kubenswrapper[4746]: I0128 20:41:11.042313 4746 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-28T20:41:11Z","lastTransitionTime":"2026-01-28T20:41:11Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 28 20:41:11 crc kubenswrapper[4746]: I0128 20:41:11.145692 4746 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 28 20:41:11 crc kubenswrapper[4746]: I0128 20:41:11.145797 4746 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 28 20:41:11 crc kubenswrapper[4746]: I0128 20:41:11.145825 4746 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 28 20:41:11 crc kubenswrapper[4746]: I0128 20:41:11.145862 4746 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 28 20:41:11 crc kubenswrapper[4746]: I0128 20:41:11.145900 4746 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-28T20:41:11Z","lastTransitionTime":"2026-01-28T20:41:11Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 28 20:41:11 crc kubenswrapper[4746]: I0128 20:41:11.250599 4746 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 28 20:41:11 crc kubenswrapper[4746]: I0128 20:41:11.250683 4746 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 28 20:41:11 crc kubenswrapper[4746]: I0128 20:41:11.250711 4746 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 28 20:41:11 crc kubenswrapper[4746]: I0128 20:41:11.250748 4746 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 28 20:41:11 crc kubenswrapper[4746]: I0128 20:41:11.250770 4746 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-28T20:41:11Z","lastTransitionTime":"2026-01-28T20:41:11Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 28 20:41:11 crc kubenswrapper[4746]: I0128 20:41:11.354243 4746 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 28 20:41:11 crc kubenswrapper[4746]: I0128 20:41:11.354372 4746 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 28 20:41:11 crc kubenswrapper[4746]: I0128 20:41:11.354397 4746 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 28 20:41:11 crc kubenswrapper[4746]: I0128 20:41:11.354429 4746 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 28 20:41:11 crc kubenswrapper[4746]: I0128 20:41:11.354449 4746 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-28T20:41:11Z","lastTransitionTime":"2026-01-28T20:41:11Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 28 20:41:11 crc kubenswrapper[4746]: I0128 20:41:11.457244 4746 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 28 20:41:11 crc kubenswrapper[4746]: I0128 20:41:11.457329 4746 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 28 20:41:11 crc kubenswrapper[4746]: I0128 20:41:11.457349 4746 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 28 20:41:11 crc kubenswrapper[4746]: I0128 20:41:11.457381 4746 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 28 20:41:11 crc kubenswrapper[4746]: I0128 20:41:11.457410 4746 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-28T20:41:11Z","lastTransitionTime":"2026-01-28T20:41:11Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 28 20:41:11 crc kubenswrapper[4746]: I0128 20:41:11.560134 4746 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 28 20:41:11 crc kubenswrapper[4746]: I0128 20:41:11.560179 4746 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 28 20:41:11 crc kubenswrapper[4746]: I0128 20:41:11.560187 4746 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 28 20:41:11 crc kubenswrapper[4746]: I0128 20:41:11.560202 4746 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 28 20:41:11 crc kubenswrapper[4746]: I0128 20:41:11.560219 4746 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-28T20:41:11Z","lastTransitionTime":"2026-01-28T20:41:11Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 28 20:41:11 crc kubenswrapper[4746]: I0128 20:41:11.620444 4746 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 28 20:41:11 crc kubenswrapper[4746]: I0128 20:41:11.620500 4746 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 28 20:41:11 crc kubenswrapper[4746]: I0128 20:41:11.620510 4746 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 28 20:41:11 crc kubenswrapper[4746]: I0128 20:41:11.620529 4746 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 28 20:41:11 crc kubenswrapper[4746]: I0128 20:41:11.620538 4746 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-28T20:41:11Z","lastTransitionTime":"2026-01-28T20:41:11Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 28 20:41:11 crc kubenswrapper[4746]: I0128 20:41:11.694375 4746 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-cluster-version/cluster-version-operator-5c965bbfc6-ftvwq"] Jan 28 20:41:11 crc kubenswrapper[4746]: I0128 20:41:11.694852 4746 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-ftvwq" Jan 28 20:41:11 crc kubenswrapper[4746]: I0128 20:41:11.698355 4746 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-version"/"kube-root-ca.crt" Jan 28 20:41:11 crc kubenswrapper[4746]: I0128 20:41:11.698404 4746 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-version"/"openshift-service-ca.crt" Jan 28 20:41:11 crc kubenswrapper[4746]: I0128 20:41:11.698845 4746 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-cluster-version"/"cluster-version-operator-serving-cert" Jan 28 20:41:11 crc kubenswrapper[4746]: I0128 20:41:11.698973 4746 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-cluster-version"/"default-dockercfg-gxtc4" Jan 28 20:41:11 crc kubenswrapper[4746]: I0128 20:41:11.771978 4746 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-multus/multus-additional-cni-plugins-ht6hp" podStartSLOduration=79.771954077 podStartE2EDuration="1m19.771954077s" podCreationTimestamp="2026-01-28 20:39:52 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-28 20:41:11.769667307 +0000 UTC m=+99.725853671" watchObservedRunningTime="2026-01-28 20:41:11.771954077 +0000 UTC m=+99.728140431" Jan 28 20:41:11 crc kubenswrapper[4746]: I0128 20:41:11.772760 4746 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-machine-config-operator/machine-config-daemon-6wrnw" podStartSLOduration=79.772753011 podStartE2EDuration="1m19.772753011s" podCreationTimestamp="2026-01-28 20:39:52 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-28 20:41:11.738657959 +0000 UTC m=+99.694844333" watchObservedRunningTime="2026-01-28 20:41:11.772753011 
+0000 UTC m=+99.728939365" Jan 28 20:41:11 crc kubenswrapper[4746]: I0128 20:41:11.793318 4746 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-multus/multus-qhpvf" podStartSLOduration=79.793241198 podStartE2EDuration="1m19.793241198s" podCreationTimestamp="2026-01-28 20:39:52 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-28 20:41:11.79232104 +0000 UTC m=+99.748507394" watchObservedRunningTime="2026-01-28 20:41:11.793241198 +0000 UTC m=+99.749427592" Jan 28 20:41:11 crc kubenswrapper[4746]: I0128 20:41:11.830693 4746 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/161e728c-9398-455b-a195-41bd4d53252b-service-ca\") pod \"cluster-version-operator-5c965bbfc6-ftvwq\" (UID: \"161e728c-9398-455b-a195-41bd4d53252b\") " pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-ftvwq" Jan 28 20:41:11 crc kubenswrapper[4746]: I0128 20:41:11.830758 4746 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-cvo-updatepayloads\" (UniqueName: \"kubernetes.io/host-path/161e728c-9398-455b-a195-41bd4d53252b-etc-cvo-updatepayloads\") pod \"cluster-version-operator-5c965bbfc6-ftvwq\" (UID: \"161e728c-9398-455b-a195-41bd4d53252b\") " pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-ftvwq" Jan 28 20:41:11 crc kubenswrapper[4746]: I0128 20:41:11.830794 4746 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-ssl-certs\" (UniqueName: \"kubernetes.io/host-path/161e728c-9398-455b-a195-41bd4d53252b-etc-ssl-certs\") pod \"cluster-version-operator-5c965bbfc6-ftvwq\" (UID: \"161e728c-9398-455b-a195-41bd4d53252b\") " pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-ftvwq" Jan 28 20:41:11 crc kubenswrapper[4746]: I0128 
20:41:11.830812 4746 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/161e728c-9398-455b-a195-41bd4d53252b-kube-api-access\") pod \"cluster-version-operator-5c965bbfc6-ftvwq\" (UID: \"161e728c-9398-455b-a195-41bd4d53252b\") " pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-ftvwq" Jan 28 20:41:11 crc kubenswrapper[4746]: I0128 20:41:11.830844 4746 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/161e728c-9398-455b-a195-41bd4d53252b-serving-cert\") pod \"cluster-version-operator-5c965bbfc6-ftvwq\" (UID: \"161e728c-9398-455b-a195-41bd4d53252b\") " pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-ftvwq" Jan 28 20:41:11 crc kubenswrapper[4746]: I0128 20:41:11.835290 4746 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Jan 28 20:41:11 crc kubenswrapper[4746]: E0128 20:41:11.835592 4746 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Jan 28 20:41:11 crc kubenswrapper[4746]: I0128 20:41:11.839474 4746 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-kube-apiserver/kube-apiserver-crc" podStartSLOduration=77.839446121 podStartE2EDuration="1m17.839446121s" podCreationTimestamp="2026-01-28 20:39:54 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-28 20:41:11.834729837 +0000 UTC m=+99.790916231" watchObservedRunningTime="2026-01-28 20:41:11.839446121 +0000 UTC m=+99.795632515" Jan 28 20:41:11 crc kubenswrapper[4746]: I0128 20:41:11.871577 4746 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2026-01-05 13:35:32.139666424 +0000 UTC Jan 28 20:41:11 crc kubenswrapper[4746]: I0128 20:41:11.871659 4746 certificate_manager.go:356] kubernetes.io/kubelet-serving: Rotating certificates Jan 28 20:41:11 crc kubenswrapper[4746]: I0128 20:41:11.886844 4746 reflector.go:368] Caches populated for *v1.CertificateSigningRequest from k8s.io/client-go/tools/watch/informerwatcher.go:146 Jan 28 20:41:11 crc kubenswrapper[4746]: I0128 20:41:11.898207 4746 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" podStartSLOduration=19.898172067 podStartE2EDuration="19.898172067s" podCreationTimestamp="2026-01-28 20:40:52 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-28 20:41:11.882874449 +0000 UTC m=+99.839060843" watchObservedRunningTime="2026-01-28 20:41:11.898172067 +0000 UTC m=+99.854358461" Jan 28 20:41:11 crc kubenswrapper[4746]: I0128 20:41:11.920287 4746 pod_startup_latency_tracker.go:104] "Observed pod startup 
duration" pod="openshift-image-registry/node-ca-d8rwq" podStartSLOduration=79.920255303 podStartE2EDuration="1m19.920255303s" podCreationTimestamp="2026-01-28 20:39:52 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-28 20:41:11.898415264 +0000 UTC m=+99.854601618" watchObservedRunningTime="2026-01-28 20:41:11.920255303 +0000 UTC m=+99.876441677" Jan 28 20:41:11 crc kubenswrapper[4746]: I0128 20:41:11.920591 4746 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-kube-controller-manager/kube-controller-manager-crc" podStartSLOduration=78.920583292 podStartE2EDuration="1m18.920583292s" podCreationTimestamp="2026-01-28 20:39:53 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-28 20:41:11.919965473 +0000 UTC m=+99.876151867" watchObservedRunningTime="2026-01-28 20:41:11.920583292 +0000 UTC m=+99.876769656" Jan 28 20:41:11 crc kubenswrapper[4746]: I0128 20:41:11.932184 4746 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/161e728c-9398-455b-a195-41bd4d53252b-service-ca\") pod \"cluster-version-operator-5c965bbfc6-ftvwq\" (UID: \"161e728c-9398-455b-a195-41bd4d53252b\") " pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-ftvwq" Jan 28 20:41:11 crc kubenswrapper[4746]: I0128 20:41:11.932257 4746 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-cvo-updatepayloads\" (UniqueName: \"kubernetes.io/host-path/161e728c-9398-455b-a195-41bd4d53252b-etc-cvo-updatepayloads\") pod \"cluster-version-operator-5c965bbfc6-ftvwq\" (UID: \"161e728c-9398-455b-a195-41bd4d53252b\") " pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-ftvwq" Jan 28 20:41:11 crc kubenswrapper[4746]: I0128 20:41:11.932289 4746 
reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-ssl-certs\" (UniqueName: \"kubernetes.io/host-path/161e728c-9398-455b-a195-41bd4d53252b-etc-ssl-certs\") pod \"cluster-version-operator-5c965bbfc6-ftvwq\" (UID: \"161e728c-9398-455b-a195-41bd4d53252b\") " pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-ftvwq" Jan 28 20:41:11 crc kubenswrapper[4746]: I0128 20:41:11.932334 4746 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/161e728c-9398-455b-a195-41bd4d53252b-kube-api-access\") pod \"cluster-version-operator-5c965bbfc6-ftvwq\" (UID: \"161e728c-9398-455b-a195-41bd4d53252b\") " pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-ftvwq" Jan 28 20:41:11 crc kubenswrapper[4746]: I0128 20:41:11.932402 4746 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-cvo-updatepayloads\" (UniqueName: \"kubernetes.io/host-path/161e728c-9398-455b-a195-41bd4d53252b-etc-cvo-updatepayloads\") pod \"cluster-version-operator-5c965bbfc6-ftvwq\" (UID: \"161e728c-9398-455b-a195-41bd4d53252b\") " pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-ftvwq" Jan 28 20:41:11 crc kubenswrapper[4746]: I0128 20:41:11.932413 4746 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/161e728c-9398-455b-a195-41bd4d53252b-serving-cert\") pod \"cluster-version-operator-5c965bbfc6-ftvwq\" (UID: \"161e728c-9398-455b-a195-41bd4d53252b\") " pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-ftvwq" Jan 28 20:41:11 crc kubenswrapper[4746]: I0128 20:41:11.932566 4746 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-ssl-certs\" (UniqueName: \"kubernetes.io/host-path/161e728c-9398-455b-a195-41bd4d53252b-etc-ssl-certs\") pod \"cluster-version-operator-5c965bbfc6-ftvwq\" (UID: 
\"161e728c-9398-455b-a195-41bd4d53252b\") " pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-ftvwq" Jan 28 20:41:11 crc kubenswrapper[4746]: I0128 20:41:11.933827 4746 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/161e728c-9398-455b-a195-41bd4d53252b-service-ca\") pod \"cluster-version-operator-5c965bbfc6-ftvwq\" (UID: \"161e728c-9398-455b-a195-41bd4d53252b\") " pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-ftvwq" Jan 28 20:41:11 crc kubenswrapper[4746]: I0128 20:41:11.941959 4746 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/161e728c-9398-455b-a195-41bd4d53252b-serving-cert\") pod \"cluster-version-operator-5c965bbfc6-ftvwq\" (UID: \"161e728c-9398-455b-a195-41bd4d53252b\") " pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-ftvwq" Jan 28 20:41:11 crc kubenswrapper[4746]: I0128 20:41:11.952848 4746 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/161e728c-9398-455b-a195-41bd4d53252b-kube-api-access\") pod \"cluster-version-operator-5c965bbfc6-ftvwq\" (UID: \"161e728c-9398-455b-a195-41bd4d53252b\") " pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-ftvwq" Jan 28 20:41:11 crc kubenswrapper[4746]: I0128 20:41:11.968273 4746 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" podStartSLOduration=47.96825086 podStartE2EDuration="47.96825086s" podCreationTimestamp="2026-01-28 20:40:24 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-28 20:41:11.946615638 +0000 UTC m=+99.902802002" watchObservedRunningTime="2026-01-28 20:41:11.96825086 +0000 UTC m=+99.924437214" Jan 28 20:41:11 crc kubenswrapper[4746]: 
I0128 20:41:11.985797 4746 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-9zvm7" podStartSLOduration=78.98576898499999 podStartE2EDuration="1m18.985768985s" podCreationTimestamp="2026-01-28 20:39:53 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-28 20:41:11.968432155 +0000 UTC m=+99.924618509" watchObservedRunningTime="2026-01-28 20:41:11.985768985 +0000 UTC m=+99.941955349" Jan 28 20:41:12 crc kubenswrapper[4746]: I0128 20:41:12.024260 4746 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-ftvwq" Jan 28 20:41:12 crc kubenswrapper[4746]: I0128 20:41:12.075798 4746 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-dns/node-resolver-gcrxx" podStartSLOduration=80.075777348 podStartE2EDuration="1m20.075777348s" podCreationTimestamp="2026-01-28 20:39:52 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-28 20:41:12.041533761 +0000 UTC m=+99.997720165" watchObservedRunningTime="2026-01-28 20:41:12.075777348 +0000 UTC m=+100.031963702" Jan 28 20:41:12 crc kubenswrapper[4746]: I0128 20:41:12.449416 4746 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-ftvwq" event={"ID":"161e728c-9398-455b-a195-41bd4d53252b","Type":"ContainerStarted","Data":"961c65775a2a4d97e0e20993ea6efe85fed24e018552658f85cd7f216be09df8"} Jan 28 20:41:12 crc kubenswrapper[4746]: I0128 20:41:12.449507 4746 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-ftvwq" 
event={"ID":"161e728c-9398-455b-a195-41bd4d53252b","Type":"ContainerStarted","Data":"7a9715e263b3ee5560627429b52ae6602b94177c64eda2fdf3b0537a4cfa2c40"} Jan 28 20:41:12 crc kubenswrapper[4746]: I0128 20:41:12.471979 4746 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-ftvwq" podStartSLOduration=80.471940223 podStartE2EDuration="1m20.471940223s" podCreationTimestamp="2026-01-28 20:39:52 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-28 20:41:12.470077366 +0000 UTC m=+100.426263750" watchObservedRunningTime="2026-01-28 20:41:12.471940223 +0000 UTC m=+100.428126617" Jan 28 20:41:12 crc kubenswrapper[4746]: I0128 20:41:12.835512 4746 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Jan 28 20:41:12 crc kubenswrapper[4746]: I0128 20:41:12.835593 4746 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-2blg6" Jan 28 20:41:12 crc kubenswrapper[4746]: I0128 20:41:12.837376 4746 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Jan 28 20:41:12 crc kubenswrapper[4746]: E0128 20:41:12.837642 4746 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Jan 28 20:41:12 crc kubenswrapper[4746]: E0128 20:41:12.838344 4746 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-2blg6" podUID="f60a5487-5012-4cc9-ad94-5dfb4957d74e" Jan 28 20:41:12 crc kubenswrapper[4746]: E0128 20:41:12.838462 4746 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Jan 28 20:41:12 crc kubenswrapper[4746]: I0128 20:41:12.839289 4746 scope.go:117] "RemoveContainer" containerID="dac4fcde6d3cb0d62e92dd34133b79787bdd8338dcee58772ed7584830a7cb2c" Jan 28 20:41:12 crc kubenswrapper[4746]: E0128 20:41:12.839614 4746 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"ovnkube-controller\" with CrashLoopBackOff: \"back-off 40s restarting failed container=ovnkube-controller pod=ovnkube-node-8vmvh_openshift-ovn-kubernetes(c4d15639-62fb-41b7-a1d4-6f51f3af6d99)\"" pod="openshift-ovn-kubernetes/ovnkube-node-8vmvh" podUID="c4d15639-62fb-41b7-a1d4-6f51f3af6d99" Jan 28 20:41:13 crc kubenswrapper[4746]: I0128 20:41:13.835238 4746 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Jan 28 20:41:13 crc kubenswrapper[4746]: E0128 20:41:13.835400 4746 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Jan 28 20:41:14 crc kubenswrapper[4746]: I0128 20:41:14.835070 4746 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Jan 28 20:41:14 crc kubenswrapper[4746]: I0128 20:41:14.835093 4746 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-2blg6" Jan 28 20:41:14 crc kubenswrapper[4746]: I0128 20:41:14.835320 4746 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Jan 28 20:41:14 crc kubenswrapper[4746]: E0128 20:41:14.835472 4746 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Jan 28 20:41:14 crc kubenswrapper[4746]: E0128 20:41:14.835615 4746 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?" pod="openshift-multus/network-metrics-daemon-2blg6" podUID="f60a5487-5012-4cc9-ad94-5dfb4957d74e" Jan 28 20:41:14 crc kubenswrapper[4746]: E0128 20:41:14.835795 4746 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Jan 28 20:41:15 crc kubenswrapper[4746]: I0128 20:41:15.835432 4746 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Jan 28 20:41:15 crc kubenswrapper[4746]: E0128 20:41:15.835579 4746 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Jan 28 20:41:15 crc kubenswrapper[4746]: I0128 20:41:15.857606 4746 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-etcd/etcd-crc"] Jan 28 20:41:16 crc kubenswrapper[4746]: I0128 20:41:16.835861 4746 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Jan 28 20:41:16 crc kubenswrapper[4746]: E0128 20:41:16.835988 4746 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Jan 28 20:41:16 crc kubenswrapper[4746]: I0128 20:41:16.836059 4746 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-2blg6" Jan 28 20:41:16 crc kubenswrapper[4746]: E0128 20:41:16.836131 4746 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-2blg6" podUID="f60a5487-5012-4cc9-ad94-5dfb4957d74e" Jan 28 20:41:16 crc kubenswrapper[4746]: I0128 20:41:16.836463 4746 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Jan 28 20:41:16 crc kubenswrapper[4746]: E0128 20:41:16.836688 4746 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Jan 28 20:41:17 crc kubenswrapper[4746]: I0128 20:41:17.835504 4746 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Jan 28 20:41:17 crc kubenswrapper[4746]: E0128 20:41:17.835737 4746 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Jan 28 20:41:18 crc kubenswrapper[4746]: I0128 20:41:18.834846 4746 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Jan 28 20:41:18 crc kubenswrapper[4746]: I0128 20:41:18.834922 4746 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-2blg6" Jan 28 20:41:18 crc kubenswrapper[4746]: I0128 20:41:18.834945 4746 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Jan 28 20:41:18 crc kubenswrapper[4746]: E0128 20:41:18.835109 4746 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Jan 28 20:41:18 crc kubenswrapper[4746]: E0128 20:41:18.835218 4746 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?" pod="openshift-multus/network-metrics-daemon-2blg6" podUID="f60a5487-5012-4cc9-ad94-5dfb4957d74e" Jan 28 20:41:18 crc kubenswrapper[4746]: E0128 20:41:18.835295 4746 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Jan 28 20:41:19 crc kubenswrapper[4746]: I0128 20:41:19.835296 4746 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Jan 28 20:41:19 crc kubenswrapper[4746]: E0128 20:41:19.835488 4746 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Jan 28 20:41:20 crc kubenswrapper[4746]: I0128 20:41:20.835980 4746 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Jan 28 20:41:20 crc kubenswrapper[4746]: I0128 20:41:20.836124 4746 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Jan 28 20:41:20 crc kubenswrapper[4746]: I0128 20:41:20.836221 4746 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-multus/network-metrics-daemon-2blg6" Jan 28 20:41:20 crc kubenswrapper[4746]: E0128 20:41:20.836367 4746 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Jan 28 20:41:20 crc kubenswrapper[4746]: E0128 20:41:20.836534 4746 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Jan 28 20:41:20 crc kubenswrapper[4746]: E0128 20:41:20.836711 4746 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-2blg6" podUID="f60a5487-5012-4cc9-ad94-5dfb4957d74e" Jan 28 20:41:21 crc kubenswrapper[4746]: I0128 20:41:21.835805 4746 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Jan 28 20:41:21 crc kubenswrapper[4746]: E0128 20:41:21.836361 4746 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Jan 28 20:41:22 crc kubenswrapper[4746]: I0128 20:41:22.835242 4746 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Jan 28 20:41:22 crc kubenswrapper[4746]: I0128 20:41:22.835296 4746 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-2blg6" Jan 28 20:41:22 crc kubenswrapper[4746]: E0128 20:41:22.837627 4746 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Jan 28 20:41:22 crc kubenswrapper[4746]: I0128 20:41:22.837722 4746 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Jan 28 20:41:22 crc kubenswrapper[4746]: E0128 20:41:22.837872 4746 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?" pod="openshift-multus/network-metrics-daemon-2blg6" podUID="f60a5487-5012-4cc9-ad94-5dfb4957d74e" Jan 28 20:41:22 crc kubenswrapper[4746]: E0128 20:41:22.837976 4746 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Jan 28 20:41:23 crc kubenswrapper[4746]: I0128 20:41:23.834855 4746 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Jan 28 20:41:23 crc kubenswrapper[4746]: E0128 20:41:23.835466 4746 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Jan 28 20:41:24 crc kubenswrapper[4746]: I0128 20:41:24.835277 4746 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-2blg6" Jan 28 20:41:24 crc kubenswrapper[4746]: I0128 20:41:24.835294 4746 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Jan 28 20:41:24 crc kubenswrapper[4746]: I0128 20:41:24.835412 4746 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Jan 28 20:41:24 crc kubenswrapper[4746]: E0128 20:41:24.835550 4746 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-2blg6" podUID="f60a5487-5012-4cc9-ad94-5dfb4957d74e" Jan 28 20:41:24 crc kubenswrapper[4746]: E0128 20:41:24.835766 4746 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Jan 28 20:41:24 crc kubenswrapper[4746]: E0128 20:41:24.835873 4746 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Jan 28 20:41:25 crc kubenswrapper[4746]: I0128 20:41:25.835158 4746 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Jan 28 20:41:25 crc kubenswrapper[4746]: E0128 20:41:25.835357 4746 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Jan 28 20:41:26 crc kubenswrapper[4746]: I0128 20:41:26.835633 4746 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-2blg6" Jan 28 20:41:26 crc kubenswrapper[4746]: E0128 20:41:26.836067 4746 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-2blg6" podUID="f60a5487-5012-4cc9-ad94-5dfb4957d74e" Jan 28 20:41:26 crc kubenswrapper[4746]: I0128 20:41:26.836494 4746 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Jan 28 20:41:26 crc kubenswrapper[4746]: E0128 20:41:26.836626 4746 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Jan 28 20:41:26 crc kubenswrapper[4746]: I0128 20:41:26.837698 4746 scope.go:117] "RemoveContainer" containerID="dac4fcde6d3cb0d62e92dd34133b79787bdd8338dcee58772ed7584830a7cb2c" Jan 28 20:41:26 crc kubenswrapper[4746]: I0128 20:41:26.838226 4746 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Jan 28 20:41:26 crc kubenswrapper[4746]: E0128 20:41:26.838415 4746 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Jan 28 20:41:27 crc kubenswrapper[4746]: I0128 20:41:27.504864 4746 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-qhpvf_cdf26de0-b602-4bdf-b492-65b3b6b31434/kube-multus/1.log" Jan 28 20:41:27 crc kubenswrapper[4746]: I0128 20:41:27.505825 4746 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-qhpvf_cdf26de0-b602-4bdf-b492-65b3b6b31434/kube-multus/0.log" Jan 28 20:41:27 crc kubenswrapper[4746]: I0128 20:41:27.505861 4746 generic.go:334] "Generic (PLEG): container finished" podID="cdf26de0-b602-4bdf-b492-65b3b6b31434" containerID="9c88cd0b151f23677f83a944570f36a7a6edd30d9be47638a17cbb9098ec1c23" exitCode=1 Jan 28 20:41:27 crc kubenswrapper[4746]: I0128 20:41:27.505912 4746 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-qhpvf" event={"ID":"cdf26de0-b602-4bdf-b492-65b3b6b31434","Type":"ContainerDied","Data":"9c88cd0b151f23677f83a944570f36a7a6edd30d9be47638a17cbb9098ec1c23"} Jan 28 20:41:27 crc 
kubenswrapper[4746]: I0128 20:41:27.505958 4746 scope.go:117] "RemoveContainer" containerID="f80345dfd4de46e278611370e6c337968cb1ba7ed8275457deea5871704f8367" Jan 28 20:41:27 crc kubenswrapper[4746]: I0128 20:41:27.506369 4746 scope.go:117] "RemoveContainer" containerID="9c88cd0b151f23677f83a944570f36a7a6edd30d9be47638a17cbb9098ec1c23" Jan 28 20:41:27 crc kubenswrapper[4746]: E0128 20:41:27.506523 4746 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"kube-multus\" with CrashLoopBackOff: \"back-off 10s restarting failed container=kube-multus pod=multus-qhpvf_openshift-multus(cdf26de0-b602-4bdf-b492-65b3b6b31434)\"" pod="openshift-multus/multus-qhpvf" podUID="cdf26de0-b602-4bdf-b492-65b3b6b31434" Jan 28 20:41:27 crc kubenswrapper[4746]: I0128 20:41:27.512422 4746 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-8vmvh_c4d15639-62fb-41b7-a1d4-6f51f3af6d99/ovnkube-controller/3.log" Jan 28 20:41:27 crc kubenswrapper[4746]: I0128 20:41:27.518728 4746 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-8vmvh" event={"ID":"c4d15639-62fb-41b7-a1d4-6f51f3af6d99","Type":"ContainerStarted","Data":"41fb0df09b842458b04f1cb1896433afe675709f7b2453543bdab40d606d6e7b"} Jan 28 20:41:27 crc kubenswrapper[4746]: I0128 20:41:27.519538 4746 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-ovn-kubernetes/ovnkube-node-8vmvh" Jan 28 20:41:27 crc kubenswrapper[4746]: I0128 20:41:27.525841 4746 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-etcd/etcd-crc" podStartSLOduration=12.525828542 podStartE2EDuration="12.525828542s" podCreationTimestamp="2026-01-28 20:41:15 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-28 20:41:22.877548787 +0000 UTC m=+110.833735191" watchObservedRunningTime="2026-01-28 
20:41:27.525828542 +0000 UTC m=+115.482014896" Jan 28 20:41:27 crc kubenswrapper[4746]: I0128 20:41:27.559585 4746 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-ovn-kubernetes/ovnkube-node-8vmvh" podStartSLOduration=95.559568453 podStartE2EDuration="1m35.559568453s" podCreationTimestamp="2026-01-28 20:39:52 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-28 20:41:27.552569339 +0000 UTC m=+115.508755683" watchObservedRunningTime="2026-01-28 20:41:27.559568453 +0000 UTC m=+115.515754807" Jan 28 20:41:27 crc kubenswrapper[4746]: I0128 20:41:27.723621 4746 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-multus/network-metrics-daemon-2blg6"] Jan 28 20:41:27 crc kubenswrapper[4746]: I0128 20:41:27.723764 4746 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-2blg6" Jan 28 20:41:27 crc kubenswrapper[4746]: E0128 20:41:27.723870 4746 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-2blg6" podUID="f60a5487-5012-4cc9-ad94-5dfb4957d74e" Jan 28 20:41:27 crc kubenswrapper[4746]: I0128 20:41:27.835870 4746 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Jan 28 20:41:27 crc kubenswrapper[4746]: E0128 20:41:27.836056 4746 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Jan 28 20:41:28 crc kubenswrapper[4746]: I0128 20:41:28.546283 4746 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-qhpvf_cdf26de0-b602-4bdf-b492-65b3b6b31434/kube-multus/1.log" Jan 28 20:41:28 crc kubenswrapper[4746]: I0128 20:41:28.835946 4746 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Jan 28 20:41:28 crc kubenswrapper[4746]: I0128 20:41:28.836059 4746 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Jan 28 20:41:28 crc kubenswrapper[4746]: E0128 20:41:28.836213 4746 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Jan 28 20:41:28 crc kubenswrapper[4746]: E0128 20:41:28.836394 4746 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Jan 28 20:41:29 crc kubenswrapper[4746]: I0128 20:41:29.835685 4746 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-multus/network-metrics-daemon-2blg6" Jan 28 20:41:29 crc kubenswrapper[4746]: I0128 20:41:29.835724 4746 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Jan 28 20:41:29 crc kubenswrapper[4746]: E0128 20:41:29.836560 4746 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-2blg6" podUID="f60a5487-5012-4cc9-ad94-5dfb4957d74e" Jan 28 20:41:29 crc kubenswrapper[4746]: E0128 20:41:29.836665 4746 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Jan 28 20:41:30 crc kubenswrapper[4746]: I0128 20:41:30.835855 4746 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Jan 28 20:41:30 crc kubenswrapper[4746]: E0128 20:41:30.836348 4746 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Jan 28 20:41:30 crc kubenswrapper[4746]: I0128 20:41:30.835963 4746 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Jan 28 20:41:30 crc kubenswrapper[4746]: E0128 20:41:30.836601 4746 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Jan 28 20:41:31 crc kubenswrapper[4746]: I0128 20:41:31.835359 4746 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-2blg6" Jan 28 20:41:31 crc kubenswrapper[4746]: I0128 20:41:31.835359 4746 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Jan 28 20:41:31 crc kubenswrapper[4746]: E0128 20:41:31.835623 4746 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-2blg6" podUID="f60a5487-5012-4cc9-ad94-5dfb4957d74e" Jan 28 20:41:31 crc kubenswrapper[4746]: E0128 20:41:31.835762 4746 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Jan 28 20:41:32 crc kubenswrapper[4746]: E0128 20:41:32.785941 4746 kubelet_node_status.go:497] "Node not becoming ready in time after startup" Jan 28 20:41:32 crc kubenswrapper[4746]: I0128 20:41:32.835163 4746 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Jan 28 20:41:32 crc kubenswrapper[4746]: E0128 20:41:32.836287 4746 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Jan 28 20:41:32 crc kubenswrapper[4746]: I0128 20:41:32.836335 4746 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Jan 28 20:41:32 crc kubenswrapper[4746]: E0128 20:41:32.836468 4746 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Jan 28 20:41:32 crc kubenswrapper[4746]: E0128 20:41:32.940651 4746 kubelet.go:2916] "Container runtime network not ready" networkReady="NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
Jan 28 20:41:33 crc kubenswrapper[4746]: I0128 20:41:33.835771 4746 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Jan 28 20:41:33 crc kubenswrapper[4746]: I0128 20:41:33.835934 4746 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-2blg6" Jan 28 20:41:33 crc kubenswrapper[4746]: E0128 20:41:33.835976 4746 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Jan 28 20:41:33 crc kubenswrapper[4746]: E0128 20:41:33.836289 4746 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-2blg6" podUID="f60a5487-5012-4cc9-ad94-5dfb4957d74e" Jan 28 20:41:34 crc kubenswrapper[4746]: I0128 20:41:34.835985 4746 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Jan 28 20:41:34 crc kubenswrapper[4746]: I0128 20:41:34.836138 4746 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Jan 28 20:41:34 crc kubenswrapper[4746]: E0128 20:41:34.836371 4746 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Jan 28 20:41:34 crc kubenswrapper[4746]: E0128 20:41:34.836579 4746 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Jan 28 20:41:35 crc kubenswrapper[4746]: I0128 20:41:35.835717 4746 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Jan 28 20:41:35 crc kubenswrapper[4746]: E0128 20:41:35.835883 4746 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Jan 28 20:41:35 crc kubenswrapper[4746]: I0128 20:41:35.835733 4746 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-multus/network-metrics-daemon-2blg6" Jan 28 20:41:35 crc kubenswrapper[4746]: E0128 20:41:35.836031 4746 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-2blg6" podUID="f60a5487-5012-4cc9-ad94-5dfb4957d74e" Jan 28 20:41:36 crc kubenswrapper[4746]: I0128 20:41:36.835136 4746 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Jan 28 20:41:36 crc kubenswrapper[4746]: I0128 20:41:36.835233 4746 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Jan 28 20:41:36 crc kubenswrapper[4746]: E0128 20:41:36.835280 4746 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Jan 28 20:41:36 crc kubenswrapper[4746]: E0128 20:41:36.835387 4746 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447"
Jan 28 20:41:37 crc kubenswrapper[4746]: I0128 20:41:37.834803 4746 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-2blg6"
Jan 28 20:41:37 crc kubenswrapper[4746]: I0128 20:41:37.834879 4746 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g"
Jan 28 20:41:37 crc kubenswrapper[4746]: E0128 20:41:37.834965 4746 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-2blg6" podUID="f60a5487-5012-4cc9-ad94-5dfb4957d74e"
Jan 28 20:41:37 crc kubenswrapper[4746]: E0128 20:41:37.835319 4746 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8"
Jan 28 20:41:37 crc kubenswrapper[4746]: E0128 20:41:37.941899 4746 kubelet.go:2916] "Container runtime network not ready" networkReady="NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"
Jan 28 20:41:38 crc kubenswrapper[4746]: I0128 20:41:38.835175 4746 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf"
Jan 28 20:41:38 crc kubenswrapper[4746]: E0128 20:41:38.835781 4746 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5"
Jan 28 20:41:38 crc kubenswrapper[4746]: I0128 20:41:38.836333 4746 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c"
Jan 28 20:41:38 crc kubenswrapper[4746]: E0128 20:41:38.836671 4746 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447"
Jan 28 20:41:39 crc kubenswrapper[4746]: I0128 20:41:39.835620 4746 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-2blg6"
Jan 28 20:41:39 crc kubenswrapper[4746]: I0128 20:41:39.835678 4746 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g"
Jan 28 20:41:39 crc kubenswrapper[4746]: E0128 20:41:39.835865 4746 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-2blg6" podUID="f60a5487-5012-4cc9-ad94-5dfb4957d74e"
Jan 28 20:41:39 crc kubenswrapper[4746]: E0128 20:41:39.836007 4746 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8"
Jan 28 20:41:40 crc kubenswrapper[4746]: I0128 20:41:40.835594 4746 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c"
Jan 28 20:41:40 crc kubenswrapper[4746]: I0128 20:41:40.835594 4746 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf"
Jan 28 20:41:40 crc kubenswrapper[4746]: E0128 20:41:40.835887 4746 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447"
Jan 28 20:41:40 crc kubenswrapper[4746]: E0128 20:41:40.836046 4746 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5"
Jan 28 20:41:41 crc kubenswrapper[4746]: I0128 20:41:41.835902 4746 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-2blg6"
Jan 28 20:41:41 crc kubenswrapper[4746]: E0128 20:41:41.836224 4746 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-2blg6" podUID="f60a5487-5012-4cc9-ad94-5dfb4957d74e"
Jan 28 20:41:41 crc kubenswrapper[4746]: I0128 20:41:41.836794 4746 scope.go:117] "RemoveContainer" containerID="9c88cd0b151f23677f83a944570f36a7a6edd30d9be47638a17cbb9098ec1c23"
Jan 28 20:41:41 crc kubenswrapper[4746]: I0128 20:41:41.836889 4746 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g"
Jan 28 20:41:41 crc kubenswrapper[4746]: E0128 20:41:41.837131 4746 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8"
Jan 28 20:41:42 crc kubenswrapper[4746]: I0128 20:41:42.596216 4746 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-qhpvf_cdf26de0-b602-4bdf-b492-65b3b6b31434/kube-multus/1.log"
Jan 28 20:41:42 crc kubenswrapper[4746]: I0128 20:41:42.596765 4746 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-qhpvf" event={"ID":"cdf26de0-b602-4bdf-b492-65b3b6b31434","Type":"ContainerStarted","Data":"7739b8614574ec2b9e7c1e6a6e443f85ce5fa487dd8be878363a899877ff42f3"}
Jan 28 20:41:42 crc kubenswrapper[4746]: I0128 20:41:42.835607 4746 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf"
Jan 28 20:41:42 crc kubenswrapper[4746]: I0128 20:41:42.835752 4746 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c"
Jan 28 20:41:42 crc kubenswrapper[4746]: E0128 20:41:42.837729 4746 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5"
Jan 28 20:41:42 crc kubenswrapper[4746]: E0128 20:41:42.837879 4746 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447"
Jan 28 20:41:42 crc kubenswrapper[4746]: E0128 20:41:42.942865 4746 kubelet.go:2916] "Container runtime network not ready" networkReady="NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"
Jan 28 20:41:43 crc kubenswrapper[4746]: I0128 20:41:43.835559 4746 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-2blg6"
Jan 28 20:41:43 crc kubenswrapper[4746]: I0128 20:41:43.835642 4746 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g"
Jan 28 20:41:43 crc kubenswrapper[4746]: E0128 20:41:43.835807 4746 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-2blg6" podUID="f60a5487-5012-4cc9-ad94-5dfb4957d74e"
Jan 28 20:41:43 crc kubenswrapper[4746]: E0128 20:41:43.835954 4746 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8"
Jan 28 20:41:44 crc kubenswrapper[4746]: I0128 20:41:44.835507 4746 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c"
Jan 28 20:41:44 crc kubenswrapper[4746]: I0128 20:41:44.835535 4746 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf"
Jan 28 20:41:44 crc kubenswrapper[4746]: E0128 20:41:44.835838 4746 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447"
Jan 28 20:41:44 crc kubenswrapper[4746]: E0128 20:41:44.836229 4746 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5"
Jan 28 20:41:45 crc kubenswrapper[4746]: I0128 20:41:45.835345 4746 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-2blg6"
Jan 28 20:41:45 crc kubenswrapper[4746]: I0128 20:41:45.835420 4746 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g"
Jan 28 20:41:45 crc kubenswrapper[4746]: E0128 20:41:45.835691 4746 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-2blg6" podUID="f60a5487-5012-4cc9-ad94-5dfb4957d74e"
Jan 28 20:41:45 crc kubenswrapper[4746]: E0128 20:41:45.835893 4746 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8"
Jan 28 20:41:46 crc kubenswrapper[4746]: I0128 20:41:46.454775 4746 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-ovn-kubernetes/ovnkube-node-8vmvh"
Jan 28 20:41:46 crc kubenswrapper[4746]: I0128 20:41:46.835784 4746 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c"
Jan 28 20:41:46 crc kubenswrapper[4746]: I0128 20:41:46.835817 4746 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf"
Jan 28 20:41:46 crc kubenswrapper[4746]: E0128 20:41:46.835976 4746 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447"
Jan 28 20:41:46 crc kubenswrapper[4746]: E0128 20:41:46.836364 4746 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5"
Jan 28 20:41:47 crc kubenswrapper[4746]: I0128 20:41:47.835155 4746 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g"
Jan 28 20:41:47 crc kubenswrapper[4746]: I0128 20:41:47.835228 4746 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-2blg6"
Jan 28 20:41:47 crc kubenswrapper[4746]: E0128 20:41:47.835322 4746 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8"
Jan 28 20:41:47 crc kubenswrapper[4746]: E0128 20:41:47.835492 4746 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-2blg6" podUID="f60a5487-5012-4cc9-ad94-5dfb4957d74e"
Jan 28 20:41:48 crc kubenswrapper[4746]: I0128 20:41:48.836017 4746 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c"
Jan 28 20:41:48 crc kubenswrapper[4746]: I0128 20:41:48.836033 4746 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf"
Jan 28 20:41:48 crc kubenswrapper[4746]: I0128 20:41:48.841466 4746 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-diagnostics"/"openshift-service-ca.crt"
Jan 28 20:41:48 crc kubenswrapper[4746]: I0128 20:41:48.841872 4746 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-diagnostics"/"kube-root-ca.crt"
Jan 28 20:41:49 crc kubenswrapper[4746]: I0128 20:41:49.835778 4746 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-2blg6"
Jan 28 20:41:49 crc kubenswrapper[4746]: I0128 20:41:49.835888 4746 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g"
Jan 28 20:41:49 crc kubenswrapper[4746]: I0128 20:41:49.837880 4746 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-multus"/"metrics-daemon-secret"
Jan 28 20:41:49 crc kubenswrapper[4746]: I0128 20:41:49.838040 4746 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-network-console"/"networking-console-plugin-cert"
Jan 28 20:41:49 crc kubenswrapper[4746]: I0128 20:41:49.838313 4746 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-multus"/"metrics-daemon-sa-dockercfg-d427c"
Jan 28 20:41:49 crc kubenswrapper[4746]: I0128 20:41:49.840323 4746 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-console"/"networking-console-plugin"
Jan 28 20:41:52 crc kubenswrapper[4746]: I0128 20:41:52.628130 4746 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeReady"
Jan 28 20:41:52 crc kubenswrapper[4746]: I0128 20:41:52.672653 4746 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-machine-api/machine-api-operator-5694c8668f-lzj8l"]
Jan 28 20:41:52 crc kubenswrapper[4746]: I0128 20:41:52.673333 4746 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-machine-api/machine-api-operator-5694c8668f-lzj8l"
Jan 28 20:41:52 crc kubenswrapper[4746]: I0128 20:41:52.676849 4746 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-oauth-apiserver/apiserver-7bbb656c7d-nz5l2"]
Jan 28 20:41:52 crc kubenswrapper[4746]: I0128 20:41:52.677398 4746 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-nz5l2"
Jan 28 20:41:52 crc kubenswrapper[4746]: I0128 20:41:52.678676 4746 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-api"/"kube-rbac-proxy"
Jan 28 20:41:52 crc kubenswrapper[4746]: I0128 20:41:52.678844 4746 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-api"/"machine-api-operator-images"
Jan 28 20:41:52 crc kubenswrapper[4746]: I0128 20:41:52.679012 4746 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-api"/"machine-api-operator-dockercfg-mfbb7"
Jan 28 20:41:52 crc kubenswrapper[4746]: I0128 20:41:52.679332 4746 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-api"/"kube-root-ca.crt"
Jan 28 20:41:52 crc kubenswrapper[4746]: I0128 20:41:52.680993 4746 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-cluster-samples-operator/cluster-samples-operator-665b6dd947-xxnb8"]
Jan 28 20:41:52 crc kubenswrapper[4746]: I0128 20:41:52.681765 4746 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-cluster-samples-operator/cluster-samples-operator-665b6dd947-xxnb8"
Jan 28 20:41:52 crc kubenswrapper[4746]: I0128 20:41:52.682905 4746 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-api"/"openshift-service-ca.crt"
Jan 28 20:41:52 crc kubenswrapper[4746]: I0128 20:41:52.683027 4746 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-oauth-apiserver"/"openshift-service-ca.crt"
Jan 28 20:41:52 crc kubenswrapper[4746]: I0128 20:41:52.683164 4746 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-oauth-apiserver"/"audit-1"
Jan 28 20:41:52 crc kubenswrapper[4746]: I0128 20:41:52.683206 4746 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-oauth-apiserver"/"etcd-serving-ca"
Jan 28 20:41:52 crc kubenswrapper[4746]: I0128 20:41:52.684369 4746 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-controller-manager/controller-manager-879f6c89f-5d8cs"]
Jan 28 20:41:52 crc kubenswrapper[4746]: I0128 20:41:52.685189 4746 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-controller-manager/controller-manager-879f6c89f-5d8cs"
Jan 28 20:41:52 crc kubenswrapper[4746]: I0128 20:41:52.685760 4746 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-authentication/oauth-openshift-558db77b4-4g4p7"]
Jan 28 20:41:52 crc kubenswrapper[4746]: I0128 20:41:52.686317 4746 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-authentication/oauth-openshift-558db77b4-4g4p7"
Jan 28 20:41:52 crc kubenswrapper[4746]: I0128 20:41:52.686656 4746 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-oauth-apiserver"/"trusted-ca-bundle"
Jan 28 20:41:52 crc kubenswrapper[4746]: I0128 20:41:52.687181 4746 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-api"/"machine-api-operator-tls"
Jan 28 20:41:52 crc kubenswrapper[4746]: I0128 20:41:52.687376 4746 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-oauth-apiserver"/"kube-root-ca.crt"
Jan 28 20:41:52 crc kubenswrapper[4746]: I0128 20:41:52.687499 4746 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-oauth-apiserver"/"etcd-client"
Jan 28 20:41:52 crc kubenswrapper[4746]: I0128 20:41:52.687833 4746 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-oauth-apiserver"/"encryption-config-1"
Jan 28 20:41:52 crc kubenswrapper[4746]: I0128 20:41:52.688272 4746 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-authentication-operator/authentication-operator-69f744f599-8mmbg"]
Jan 28 20:41:52 crc kubenswrapper[4746]: I0128 20:41:52.688625 4746 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-console/console-f9d7485db-hcxv8"]
Jan 28 20:41:52 crc kubenswrapper[4746]: I0128 20:41:52.689041 4746 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-console/console-f9d7485db-hcxv8"
Jan 28 20:41:52 crc kubenswrapper[4746]: I0128 20:41:52.689472 4746 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-authentication-operator/authentication-operator-69f744f599-8mmbg"
Jan 28 20:41:52 crc kubenswrapper[4746]: I0128 20:41:52.689794 4746 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-console-operator/console-operator-58897d9998-4cmsz"]
Jan 28 20:41:52 crc kubenswrapper[4746]: I0128 20:41:52.690627 4746 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-console-operator/console-operator-58897d9998-4cmsz"
Jan 28 20:41:52 crc kubenswrapper[4746]: I0128 20:41:52.691981 4746 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-oauth-apiserver"/"serving-cert"
Jan 28 20:41:52 crc kubenswrapper[4746]: I0128 20:41:52.692337 4746 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-samples-operator"/"openshift-service-ca.crt"
Jan 28 20:41:52 crc kubenswrapper[4746]: I0128 20:41:52.693727 4746 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-cluster-machine-approver/machine-approver-56656f9798-v7zfx"]
Jan 28 20:41:52 crc kubenswrapper[4746]: I0128 20:41:52.694383 4746 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-z2vr7"]
Jan 28 20:41:52 crc kubenswrapper[4746]: I0128 20:41:52.694449 4746 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-cluster-machine-approver/machine-approver-56656f9798-v7zfx"
Jan 28 20:41:52 crc kubenswrapper[4746]: I0128 20:41:52.694883 4746 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-z2vr7"
Jan 28 20:41:52 crc kubenswrapper[4746]: I0128 20:41:52.695843 4746 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-oauth-apiserver"/"oauth-apiserver-sa-dockercfg-6r2bq"
Jan 28 20:41:52 crc kubenswrapper[4746]: I0128 20:41:52.696019 4746 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-cluster-samples-operator"/"samples-operator-tls"
Jan 28 20:41:52 crc kubenswrapper[4746]: I0128 20:41:52.696055 4746 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-samples-operator"/"kube-root-ca.crt"
Jan 28 20:41:52 crc kubenswrapper[4746]: I0128 20:41:52.696325 4746 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-cluster-samples-operator"/"cluster-samples-operator-dockercfg-xpp9w"
Jan 28 20:41:52 crc kubenswrapper[4746]: I0128 20:41:52.696589 4746 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-route-controller-manager/route-controller-manager-6576b87f9c-sjqv2"]
Jan 28 20:41:52 crc kubenswrapper[4746]: I0128 20:41:52.698278 4746 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-console/downloads-7954f5f757-qrffw"]
Jan 28 20:41:52 crc kubenswrapper[4746]: I0128 20:41:52.698390 4746 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-sjqv2"
Jan 28 20:41:52 crc kubenswrapper[4746]: I0128 20:41:52.698862 4746 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-console/downloads-7954f5f757-qrffw"
Jan 28 20:41:52 crc kubenswrapper[4746]: I0128 20:41:52.702654 4746 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication"/"v4-0-config-system-cliconfig"
Jan 28 20:41:52 crc kubenswrapper[4746]: I0128 20:41:52.722507 4746 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-5gjgg"]
Jan 28 20:41:52 crc kubenswrapper[4746]: I0128 20:41:52.731542 4746 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console-operator"/"openshift-service-ca.crt"
Jan 28 20:41:52 crc kubenswrapper[4746]: I0128 20:41:52.733063 4746 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console"/"oauth-serving-cert"
Jan 28 20:41:52 crc kubenswrapper[4746]: I0128 20:41:52.734024 4746 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-controller-manager"/"openshift-controller-manager-sa-dockercfg-msq4c"
Jan 28 20:41:52 crc kubenswrapper[4746]: I0128 20:41:52.734190 4746 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-5gjgg"
Jan 28 20:41:52 crc kubenswrapper[4746]: I0128 20:41:52.739740 4746 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console-operator"/"console-operator-config"
Jan 28 20:41:52 crc kubenswrapper[4746]: I0128 20:41:52.739994 4746 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication-operator"/"service-ca-bundle"
Jan 28 20:41:52 crc kubenswrapper[4746]: I0128 20:41:52.754749 4746 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console"/"openshift-service-ca.crt"
Jan 28 20:41:52 crc kubenswrapper[4746]: I0128 20:41:52.755206 4746 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console"/"console-config"
Jan 28 20:41:52 crc kubenswrapper[4746]: I0128 20:41:52.755479 4746 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console"/"service-ca"
Jan 28 20:41:52 crc kubenswrapper[4746]: I0128 20:41:52.755617 4746 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication"/"v4-0-config-system-service-ca"
Jan 28 20:41:52 crc kubenswrapper[4746]: I0128 20:41:52.755756 4746 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-machine-approver"/"openshift-service-ca.crt"
Jan 28 20:41:52 crc kubenswrapper[4746]: I0128 20:41:52.755873 4746 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-cluster-machine-approver"/"machine-approver-tls"
Jan 28 20:41:52 crc kubenswrapper[4746]: I0128 20:41:52.756003 4746 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-machine-approver"/"kube-rbac-proxy"
Jan 28 20:41:52 crc kubenswrapper[4746]: I0128 20:41:52.756136 4746 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-machine-approver"/"machine-approver-config"
Jan 28 20:41:52 crc kubenswrapper[4746]: I0128 20:41:52.756592 4746 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"openshift-service-ca.crt"
Jan 28 20:41:52 crc kubenswrapper[4746]: I0128 20:41:52.756972 4746 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-console"/"console-oauth-config"
Jan 28 20:41:52 crc kubenswrapper[4746]: I0128 20:41:52.757180 4746 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication"/"openshift-service-ca.crt"
Jan 28 20:41:52 crc kubenswrapper[4746]: I0128 20:41:52.757395 4746 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"config"
Jan 28 20:41:52 crc kubenswrapper[4746]: I0128 20:41:52.757561 4746 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-controller-manager"/"serving-cert"
Jan 28 20:41:52 crc kubenswrapper[4746]: I0128 20:41:52.757738 4746 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"kube-root-ca.crt"
Jan 28 20:41:52 crc kubenswrapper[4746]: I0128 20:41:52.757889 4746 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-8w6v4"]
Jan 28 20:41:52 crc kubenswrapper[4746]: I0128 20:41:52.757916 4746 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication-operator"/"authentication-operator-config"
Jan 28 20:41:52 crc kubenswrapper[4746]: I0128 20:41:52.758239 4746 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication"/"audit"
Jan 28 20:41:52 crc kubenswrapper[4746]: I0128 20:41:52.758417 4746 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-8w6v4"
Jan 28 20:41:52 crc kubenswrapper[4746]: I0128 20:41:52.758459 4746 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-system-session"
Jan 28 20:41:52 crc kubenswrapper[4746]: I0128 20:41:52.758564 4746 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-user-template-error"
Jan 28 20:41:52 crc kubenswrapper[4746]: I0128 20:41:52.758961 4746 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"openshift-service-ca.crt"
Jan 28 20:41:52 crc kubenswrapper[4746]: I0128 20:41:52.759182 4746 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-user-template-provider-selection"
Jan 28 20:41:52 crc kubenswrapper[4746]: I0128 20:41:52.759464 4746 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication"/"kube-root-ca.crt"
Jan 28 20:41:52 crc kubenswrapper[4746]: I0128 20:41:52.759630 4746 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-user-idp-0-file-data"
Jan 28 20:41:52 crc kubenswrapper[4746]: I0128 20:41:52.759913 4746 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-system-router-certs"
Jan 28 20:41:52 crc kubenswrapper[4746]: I0128 20:41:52.760061 4746 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication-operator"/"openshift-service-ca.crt"
Jan 28 20:41:52 crc kubenswrapper[4746]: I0128 20:41:52.760213 4746 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication-operator"/"serving-cert"
Jan 28 20:41:52 crc kubenswrapper[4746]: I0128 20:41:52.760935 4746 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-console-operator"/"serving-cert"
Jan 28 20:41:52 crc kubenswrapper[4746]: I0128 20:41:52.761279 4746 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console-operator"/"kube-root-ca.crt"
Jan 28 20:41:52 crc kubenswrapper[4746]: I0128 20:41:52.761631 4746 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-console"/"console-dockercfg-f62pw"
Jan 28 20:41:52 crc kubenswrapper[4746]: I0128 20:41:52.761766 4746 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-image-registry"/"cluster-image-registry-operator-dockercfg-m4qtx"
Jan 28 20:41:52 crc kubenswrapper[4746]: I0128 20:41:52.761796 4746 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-route-controller-manager"/"serving-cert"
Jan 28 20:41:52 crc kubenswrapper[4746]: I0128 20:41:52.761894 4746 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-console-operator"/"console-operator-dockercfg-4xjcr"
Jan 28 20:41:52 crc kubenswrapper[4746]: I0128 20:41:52.762020 4746 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-machine-approver"/"kube-root-ca.crt"
Jan 28 20:41:52 crc kubenswrapper[4746]: I0128 20:41:52.762058 4746 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"client-ca"
Jan 28 20:41:52 crc kubenswrapper[4746]: I0128 20:41:52.762116 4746 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"client-ca"
Jan 28 20:41:52 crc kubenswrapper[4746]: I0128 20:41:52.762142 4746 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-image-registry"/"image-registry-operator-tls"
Jan 28 20:41:52 crc kubenswrapper[4746]: I0128 20:41:52.762251 4746 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-console"/"console-serving-cert"
Jan 28 20:41:52 crc kubenswrapper[4746]: I0128 20:41:52.762366 4746 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"oauth-openshift-dockercfg-znhcc"
Jan 28 20:41:52 crc kubenswrapper[4746]: I0128 20:41:52.763159 4746 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"config"
Jan 28 20:41:52 crc kubenswrapper[4746]: I0128 20:41:52.763342 4746 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console"/"kube-root-ca.crt"
Jan 28 20:41:52 crc kubenswrapper[4746]: I0128 20:41:52.763447 4746 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication-operator"/"authentication-operator-dockercfg-mz9bj"
Jan 28 20:41:52 crc kubenswrapper[4746]: I0128 20:41:52.762260 4746 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-console"/"default-dockercfg-chnjx"
Jan 28 20:41:52 crc kubenswrapper[4746]: I0128 20:41:52.764315 4746 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-cluster-machine-approver"/"machine-approver-sa-dockercfg-nl2j4"
Jan 28 20:41:52 crc kubenswrapper[4746]: I0128 20:41:52.765903 4746 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-config-operator/openshift-config-operator-7777fb866f-rktng"]
Jan 28 20:41:52 crc kubenswrapper[4746]: I0128 20:41:52.766504 4746 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-config-operator/openshift-config-operator-7777fb866f-rktng"
Jan 28 20:41:52 crc kubenswrapper[4746]: I0128 20:41:52.768629 4746 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-system-serving-cert"
Jan 28 20:41:52 crc kubenswrapper[4746]: I0128 20:41:52.768822 4746 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager-operator"/"openshift-service-ca.crt"
Jan 28 20:41:52 crc kubenswrapper[4746]: I0128 20:41:52.768972 4746 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication-operator"/"kube-root-ca.crt"
Jan 28 20:41:52 crc kubenswrapper[4746]: I0128 20:41:52.769775 4746 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"kube-root-ca.crt"
Jan 28 20:41:52 crc kubenswrapper[4746]: I0128 20:41:52.770975 4746 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager-operator"/"kube-root-ca.crt"
Jan 28 20:41:52 crc kubenswrapper[4746]: I0128 20:41:52.771182 4746 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-image-registry/image-registry-697d97f7c8-lwhk4"]
Jan 28 20:41:52 crc kubenswrapper[4746]: I0128 20:41:52.772200 4746 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-image-registry/image-registry-697d97f7c8-lwhk4"
Jan 28 20:41:52 crc kubenswrapper[4746]: I0128 20:41:52.776952 4746 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-route-controller-manager"/"route-controller-manager-sa-dockercfg-h2zr2"
Jan 28 20:41:52 crc kubenswrapper[4746]: I0128 20:41:52.778007 4746 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication"/"v4-0-config-system-trusted-ca-bundle"
Jan 28 20:41:52 crc kubenswrapper[4746]: I0128 20:41:52.783839 4746 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console-operator"/"trusted-ca"
Jan 28 20:41:52 crc kubenswrapper[4746]: I0128 20:41:52.783949 4746 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager-operator"/"openshift-controller-manager-operator-config"
Jan 28 20:41:52 crc kubenswrapper[4746]: I0128 20:41:52.784945 4746 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-image-registry"/"trusted-ca"
Jan 28 20:41:52 crc kubenswrapper[4746]: I0128 20:41:52.786980 4746 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console"/"trusted-ca-bundle"
Jan 28 20:41:52 crc kubenswrapper[4746]: I0128 20:41:52.788243 4746 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-controller-manager-operator"/"openshift-controller-manager-operator-dockercfg-vw8fw"
Jan 28 20:41:52 crc kubenswrapper[4746]: I0128 20:41:52.790533 4746 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-562pf"]
Jan 28 20:41:52 crc kubenswrapper[4746]: I0128 20:41:52.792736 4746 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-dns-operator/dns-operator-744455d44c-q8fsc"]
Jan 28 20:41:52 crc kubenswrapper[4746]: I0128 20:41:52.803518 4746 reflector.go:368] Caches populated for *v1.ConfigMap from
object-"openshift-apiserver-operator"/"openshift-apiserver-operator-config" Jan 28 20:41:52 crc kubenswrapper[4746]: I0128 20:41:52.805452 4746 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-562pf" Jan 28 20:41:52 crc kubenswrapper[4746]: I0128 20:41:52.809174 4746 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver-operator"/"openshift-service-ca.crt" Jan 28 20:41:52 crc kubenswrapper[4746]: I0128 20:41:52.809997 4746 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-config-operator"/"openshift-service-ca.crt" Jan 28 20:41:52 crc kubenswrapper[4746]: I0128 20:41:52.810263 4746 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-image-registry"/"installation-pull-secrets" Jan 28 20:41:52 crc kubenswrapper[4746]: I0128 20:41:52.811383 4746 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-scheduler-operator"/"openshift-kube-scheduler-operator-dockercfg-qt55r" Jan 28 20:41:52 crc kubenswrapper[4746]: I0128 20:41:52.812027 4746 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication-operator"/"trusted-ca-bundle" Jan 28 20:41:52 crc kubenswrapper[4746]: I0128 20:41:52.824056 4746 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-config-operator"/"kube-root-ca.crt" Jan 28 20:41:52 crc kubenswrapper[4746]: I0128 20:41:52.826150 4746 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-config-operator"/"openshift-config-operator-dockercfg-7pc5z" Jan 28 20:41:52 crc kubenswrapper[4746]: I0128 20:41:52.826344 4746 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-config-operator"/"config-operator-serving-cert" Jan 28 20:41:52 crc kubenswrapper[4746]: I0128 20:41:52.826428 4746 reflector.go:368] Caches populated for *v1.Secret from 
object-"openshift-image-registry"/"registry-dockercfg-kzzsd" Jan 28 20:41:52 crc kubenswrapper[4746]: I0128 20:41:52.826510 4746 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-image-registry"/"image-registry-tls" Jan 28 20:41:52 crc kubenswrapper[4746]: I0128 20:41:52.829058 4746 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-kube-storage-version-migrator/migrator-59844c95c7-s8b4b"] Jan 28 20:41:52 crc kubenswrapper[4746]: I0128 20:41:52.830727 4746 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-storage-version-migrator/migrator-59844c95c7-s8b4b" Jan 28 20:41:52 crc kubenswrapper[4746]: I0128 20:41:52.831622 4746 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-dns-operator/dns-operator-744455d44c-q8fsc" Jan 28 20:41:52 crc kubenswrapper[4746]: I0128 20:41:52.832097 4746 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-controller-manager-operator"/"openshift-controller-manager-operator-serving-cert" Jan 28 20:41:52 crc kubenswrapper[4746]: I0128 20:41:52.832257 4746 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-scheduler-operator"/"kube-root-ca.crt" Jan 28 20:41:52 crc kubenswrapper[4746]: I0128 20:41:52.832340 4746 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-scheduler-operator"/"kube-scheduler-operator-serving-cert" Jan 28 20:41:52 crc kubenswrapper[4746]: I0128 20:41:52.832096 4746 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-scheduler-operator"/"openshift-kube-scheduler-operator-config" Jan 28 20:41:52 crc kubenswrapper[4746]: I0128 20:41:52.832556 4746 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver-operator"/"kube-root-ca.crt" Jan 28 20:41:52 crc kubenswrapper[4746]: I0128 20:41:52.832706 4746 reflector.go:368] Caches populated for *v1.Secret from 
object-"openshift-apiserver-operator"/"openshift-apiserver-operator-serving-cert" Jan 28 20:41:52 crc kubenswrapper[4746]: I0128 20:41:52.835569 4746 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-apiserver-operator"/"openshift-apiserver-operator-dockercfg-xtcjv" Jan 28 20:41:52 crc kubenswrapper[4746]: I0128 20:41:52.835996 4746 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"openshift-global-ca" Jan 28 20:41:52 crc kubenswrapper[4746]: I0128 20:41:52.838284 4746 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-user-template-login" Jan 28 20:41:52 crc kubenswrapper[4746]: I0128 20:41:52.839008 4746 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-system-ocp-branding-template" Jan 28 20:41:52 crc kubenswrapper[4746]: I0128 20:41:52.844071 4746 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-multus/multus-admission-controller-857f4d67dd-lqlcg"] Jan 28 20:41:52 crc kubenswrapper[4746]: I0128 20:41:52.846694 4746 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-etcd-operator/etcd-operator-b45778765-dbwsb"] Jan 28 20:41:52 crc kubenswrapper[4746]: I0128 20:41:52.848925 4746 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-machine-api/control-plane-machine-set-operator-78cbb6b69f-mznpb"] Jan 28 20:41:52 crc kubenswrapper[4746]: I0128 20:41:52.849373 4746 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/multus-admission-controller-857f4d67dd-lqlcg" Jan 28 20:41:52 crc kubenswrapper[4746]: I0128 20:41:52.849546 4746 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-gbwrh"] Jan 28 20:41:52 crc kubenswrapper[4746]: I0128 20:41:52.849794 4746 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-etcd-operator/etcd-operator-b45778765-dbwsb" Jan 28 20:41:52 crc kubenswrapper[4746]: I0128 20:41:52.856119 4746 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-m6pvw"] Jan 28 20:41:52 crc kubenswrapper[4746]: I0128 20:41:52.856496 4746 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-machine-api/control-plane-machine-set-operator-78cbb6b69f-mznpb" Jan 28 20:41:52 crc kubenswrapper[4746]: I0128 20:41:52.856735 4746 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-ingress-operator/ingress-operator-5b745b69d9-h2q6n"] Jan 28 20:41:52 crc kubenswrapper[4746]: I0128 20:41:52.856803 4746 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-gbwrh" Jan 28 20:41:52 crc kubenswrapper[4746]: I0128 20:41:52.857162 4746 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-m6pvw" Jan 28 20:41:52 crc kubenswrapper[4746]: I0128 20:41:52.857454 4746 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-operator-lifecycle-manager/catalog-operator-68c6474976-wb74c"] Jan 28 20:41:52 crc kubenswrapper[4746]: I0128 20:41:52.858035 4746 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-ingress-operator/ingress-operator-5b745b69d9-h2q6n" Jan 28 20:41:52 crc kubenswrapper[4746]: I0128 20:41:52.859217 4746 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/94cef654-afbe-42c2-8069-5dbcb7294abb-service-ca\") pod \"console-f9d7485db-hcxv8\" (UID: \"94cef654-afbe-42c2-8069-5dbcb7294abb\") " pod="openshift-console/console-f9d7485db-hcxv8" Jan 28 20:41:52 crc kubenswrapper[4746]: I0128 20:41:52.859252 4746 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/0ea55ab8-bec0-44e8-8105-c8c604fc5fa1-trusted-ca\") pod \"console-operator-58897d9998-4cmsz\" (UID: \"0ea55ab8-bec0-44e8-8105-c8c604fc5fa1\") " pod="openshift-console-operator/console-operator-58897d9998-4cmsz" Jan 28 20:41:52 crc kubenswrapper[4746]: I0128 20:41:52.859277 4746 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/f69da289-e18c-4baa-ad6b-4f2e3a44cda5-serving-cert\") pod \"apiserver-7bbb656c7d-nz5l2\" (UID: \"f69da289-e18c-4baa-ad6b-4f2e3a44cda5\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-nz5l2" Jan 28 20:41:52 crc kubenswrapper[4746]: I0128 20:41:52.859293 4746 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-system-router-certs\" (UniqueName: \"kubernetes.io/secret/6208130d-52bc-449e-b371-357b1cc21b22-v4-0-config-system-router-certs\") pod \"oauth-openshift-558db77b4-4g4p7\" (UID: \"6208130d-52bc-449e-b371-357b1cc21b22\") " pod="openshift-authentication/oauth-openshift-558db77b4-4g4p7" Jan 28 20:41:52 crc kubenswrapper[4746]: I0128 20:41:52.859324 4746 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: 
\"kubernetes.io/configmap/3e64bb6e-1131-431b-b87c-71e25d294fe1-config\") pod \"machine-api-operator-5694c8668f-lzj8l\" (UID: \"3e64bb6e-1131-431b-b87c-71e25d294fe1\") " pod="openshift-machine-api/machine-api-operator-5694c8668f-lzj8l" Jan 28 20:41:52 crc kubenswrapper[4746]: I0128 20:41:52.859340 4746 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-user-template-error\" (UniqueName: \"kubernetes.io/secret/6208130d-52bc-449e-b371-357b1cc21b22-v4-0-config-user-template-error\") pod \"oauth-openshift-558db77b4-4g4p7\" (UID: \"6208130d-52bc-449e-b371-357b1cc21b22\") " pod="openshift-authentication/oauth-openshift-558db77b4-4g4p7" Jan 28 20:41:52 crc kubenswrapper[4746]: I0128 20:41:52.859360 4746 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/8c7789bf-f2f7-4cf4-97f1-3d8a7438daca-trusted-ca\") pod \"cluster-image-registry-operator-dc59b4c8b-z2vr7\" (UID: \"8c7789bf-f2f7-4cf4-97f1-3d8a7438daca\") " pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-z2vr7" Jan 28 20:41:52 crc kubenswrapper[4746]: I0128 20:41:52.859379 4746 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/447abd89-31fd-4bb6-a965-97d7954f47bb-client-ca\") pod \"route-controller-manager-6576b87f9c-sjqv2\" (UID: \"447abd89-31fd-4bb6-a965-97d7954f47bb\") " pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-sjqv2" Jan 28 20:41:52 crc kubenswrapper[4746]: I0128 20:41:52.859404 4746 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-system-service-ca\" (UniqueName: \"kubernetes.io/configmap/6208130d-52bc-449e-b371-357b1cc21b22-v4-0-config-system-service-ca\") pod \"oauth-openshift-558db77b4-4g4p7\" (UID: \"6208130d-52bc-449e-b371-357b1cc21b22\") " 
pod="openshift-authentication/oauth-openshift-558db77b4-4g4p7" Jan 28 20:41:52 crc kubenswrapper[4746]: I0128 20:41:52.859429 4746 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-ctjhc\" (UniqueName: \"kubernetes.io/projected/e3e01537-7adb-4a81-a9cb-3deb73a1d5b3-kube-api-access-ctjhc\") pod \"cluster-samples-operator-665b6dd947-xxnb8\" (UID: \"e3e01537-7adb-4a81-a9cb-3deb73a1d5b3\") " pod="openshift-cluster-samples-operator/cluster-samples-operator-665b6dd947-xxnb8" Jan 28 20:41:52 crc kubenswrapper[4746]: I0128 20:41:52.859444 4746 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/8c7789bf-f2f7-4cf4-97f1-3d8a7438daca-bound-sa-token\") pod \"cluster-image-registry-operator-dc59b4c8b-z2vr7\" (UID: \"8c7789bf-f2f7-4cf4-97f1-3d8a7438daca\") " pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-z2vr7" Jan 28 20:41:52 crc kubenswrapper[4746]: I0128 20:41:52.859458 4746 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/6208130d-52bc-449e-b371-357b1cc21b22-audit-policies\") pod \"oauth-openshift-558db77b4-4g4p7\" (UID: \"6208130d-52bc-449e-b371-357b1cc21b22\") " pod="openshift-authentication/oauth-openshift-558db77b4-4g4p7" Jan 28 20:41:52 crc kubenswrapper[4746]: I0128 20:41:52.859476 4746 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/ac478ffc-e1e4-4b72-bf99-d35c2636a78d-trusted-ca-bundle\") pod \"authentication-operator-69f744f599-8mmbg\" (UID: \"ac478ffc-e1e4-4b72-bf99-d35c2636a78d\") " pod="openshift-authentication-operator/authentication-operator-69f744f599-8mmbg" Jan 28 20:41:52 crc kubenswrapper[4746]: I0128 20:41:52.859491 4746 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"etcd-serving-ca\" (UniqueName: \"kubernetes.io/configmap/f69da289-e18c-4baa-ad6b-4f2e3a44cda5-etcd-serving-ca\") pod \"apiserver-7bbb656c7d-nz5l2\" (UID: \"f69da289-e18c-4baa-ad6b-4f2e3a44cda5\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-nz5l2" Jan 28 20:41:52 crc kubenswrapper[4746]: I0128 20:41:52.859508 4746 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-user-template-login\" (UniqueName: \"kubernetes.io/secret/6208130d-52bc-449e-b371-357b1cc21b22-v4-0-config-user-template-login\") pod \"oauth-openshift-558db77b4-4g4p7\" (UID: \"6208130d-52bc-449e-b371-357b1cc21b22\") " pod="openshift-authentication/oauth-openshift-558db77b4-4g4p7" Jan 28 20:41:52 crc kubenswrapper[4746]: I0128 20:41:52.859526 4746 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"samples-operator-tls\" (UniqueName: \"kubernetes.io/secret/e3e01537-7adb-4a81-a9cb-3deb73a1d5b3-samples-operator-tls\") pod \"cluster-samples-operator-665b6dd947-xxnb8\" (UID: \"e3e01537-7adb-4a81-a9cb-3deb73a1d5b3\") " pod="openshift-cluster-samples-operator/cluster-samples-operator-665b6dd947-xxnb8" Jan 28 20:41:52 crc kubenswrapper[4746]: I0128 20:41:52.859548 4746 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"image-registry-operator-tls\" (UniqueName: \"kubernetes.io/secret/8c7789bf-f2f7-4cf4-97f1-3d8a7438daca-image-registry-operator-tls\") pod \"cluster-image-registry-operator-dc59b4c8b-z2vr7\" (UID: \"8c7789bf-f2f7-4cf4-97f1-3d8a7438daca\") " pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-z2vr7" Jan 28 20:41:52 crc kubenswrapper[4746]: I0128 20:41:52.859568 4746 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etcd-client\" (UniqueName: 
\"kubernetes.io/secret/f69da289-e18c-4baa-ad6b-4f2e3a44cda5-etcd-client\") pod \"apiserver-7bbb656c7d-nz5l2\" (UID: \"f69da289-e18c-4baa-ad6b-4f2e3a44cda5\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-nz5l2" Jan 28 20:41:52 crc kubenswrapper[4746]: I0128 20:41:52.859584 4746 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-6krvq\" (UniqueName: \"kubernetes.io/projected/6208130d-52bc-449e-b371-357b1cc21b22-kube-api-access-6krvq\") pod \"oauth-openshift-558db77b4-4g4p7\" (UID: \"6208130d-52bc-449e-b371-357b1cc21b22\") " pod="openshift-authentication/oauth-openshift-558db77b4-4g4p7" Jan 28 20:41:52 crc kubenswrapper[4746]: I0128 20:41:52.859603 4746 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/94cef654-afbe-42c2-8069-5dbcb7294abb-console-oauth-config\") pod \"console-f9d7485db-hcxv8\" (UID: \"94cef654-afbe-42c2-8069-5dbcb7294abb\") " pod="openshift-console/console-f9d7485db-hcxv8" Jan 28 20:41:52 crc kubenswrapper[4746]: I0128 20:41:52.859620 4746 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-6tknp\" (UniqueName: \"kubernetes.io/projected/94cef654-afbe-42c2-8069-5dbcb7294abb-kube-api-access-6tknp\") pod \"console-f9d7485db-hcxv8\" (UID: \"94cef654-afbe-42c2-8069-5dbcb7294abb\") " pod="openshift-console/console-f9d7485db-hcxv8" Jan 28 20:41:52 crc kubenswrapper[4746]: I0128 20:41:52.859641 4746 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/d66e245a-95d4-49c2-9172-2f095fab3e2b-config\") pod \"openshift-controller-manager-operator-756b6f6bc6-5gjgg\" (UID: \"d66e245a-95d4-49c2-9172-2f095fab3e2b\") " pod="openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-5gjgg" Jan 28 20:41:52 crc 
kubenswrapper[4746]: I0128 20:41:52.859658 4746 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/6208130d-52bc-449e-b371-357b1cc21b22-audit-dir\") pod \"oauth-openshift-558db77b4-4g4p7\" (UID: \"6208130d-52bc-449e-b371-357b1cc21b22\") " pod="openshift-authentication/oauth-openshift-558db77b4-4g4p7" Jan 28 20:41:52 crc kubenswrapper[4746]: I0128 20:41:52.859678 4746 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-64bpv\" (UniqueName: \"kubernetes.io/projected/d66e245a-95d4-49c2-9172-2f095fab3e2b-kube-api-access-64bpv\") pod \"openshift-controller-manager-operator-756b6f6bc6-5gjgg\" (UID: \"d66e245a-95d4-49c2-9172-2f095fab3e2b\") " pod="openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-5gjgg" Jan 28 20:41:52 crc kubenswrapper[4746]: I0128 20:41:52.859695 4746 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/94cef654-afbe-42c2-8069-5dbcb7294abb-console-config\") pod \"console-f9d7485db-hcxv8\" (UID: \"94cef654-afbe-42c2-8069-5dbcb7294abb\") " pod="openshift-console/console-f9d7485db-hcxv8" Jan 28 20:41:52 crc kubenswrapper[4746]: I0128 20:41:52.859712 4746 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/0f0eb07a-e0b4-4702-89b0-d94e937471a5-proxy-ca-bundles\") pod \"controller-manager-879f6c89f-5d8cs\" (UID: \"0f0eb07a-e0b4-4702-89b0-d94e937471a5\") " pod="openshift-controller-manager/controller-manager-879f6c89f-5d8cs" Jan 28 20:41:52 crc kubenswrapper[4746]: I0128 20:41:52.859728 4746 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-2k8vv\" (UniqueName: 
\"kubernetes.io/projected/0f0eb07a-e0b4-4702-89b0-d94e937471a5-kube-api-access-2k8vv\") pod \"controller-manager-879f6c89f-5d8cs\" (UID: \"0f0eb07a-e0b4-4702-89b0-d94e937471a5\") " pod="openshift-controller-manager/controller-manager-879f6c89f-5d8cs" Jan 28 20:41:52 crc kubenswrapper[4746]: I0128 20:41:52.859747 4746 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/94cef654-afbe-42c2-8069-5dbcb7294abb-trusted-ca-bundle\") pod \"console-f9d7485db-hcxv8\" (UID: \"94cef654-afbe-42c2-8069-5dbcb7294abb\") " pod="openshift-console/console-f9d7485db-hcxv8" Jan 28 20:41:52 crc kubenswrapper[4746]: I0128 20:41:52.859765 4746 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/0f0eb07a-e0b4-4702-89b0-d94e937471a5-config\") pod \"controller-manager-879f6c89f-5d8cs\" (UID: \"0f0eb07a-e0b4-4702-89b0-d94e937471a5\") " pod="openshift-controller-manager/controller-manager-879f6c89f-5d8cs" Jan 28 20:41:52 crc kubenswrapper[4746]: I0128 20:41:52.859781 4746 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/94cef654-afbe-42c2-8069-5dbcb7294abb-console-serving-cert\") pod \"console-f9d7485db-hcxv8\" (UID: \"94cef654-afbe-42c2-8069-5dbcb7294abb\") " pod="openshift-console/console-f9d7485db-hcxv8" Jan 28 20:41:52 crc kubenswrapper[4746]: I0128 20:41:52.859797 4746 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/ac478ffc-e1e4-4b72-bf99-d35c2636a78d-serving-cert\") pod \"authentication-operator-69f744f599-8mmbg\" (UID: \"ac478ffc-e1e4-4b72-bf99-d35c2636a78d\") " pod="openshift-authentication-operator/authentication-operator-69f744f599-8mmbg" Jan 28 20:41:52 crc kubenswrapper[4746]: I0128 
20:41:52.859812 4746 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/0f0eb07a-e0b4-4702-89b0-d94e937471a5-client-ca\") pod \"controller-manager-879f6c89f-5d8cs\" (UID: \"0f0eb07a-e0b4-4702-89b0-d94e937471a5\") " pod="openshift-controller-manager/controller-manager-879f6c89f-5d8cs" Jan 28 20:41:52 crc kubenswrapper[4746]: I0128 20:41:52.859834 4746 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-b9l2t\" (UniqueName: \"kubernetes.io/projected/6479c53a-0e30-4805-bdb8-314a66127b5c-kube-api-access-b9l2t\") pod \"machine-approver-56656f9798-v7zfx\" (UID: \"6479c53a-0e30-4805-bdb8-314a66127b5c\") " pod="openshift-cluster-machine-approver/machine-approver-56656f9798-v7zfx" Jan 28 20:41:52 crc kubenswrapper[4746]: I0128 20:41:52.859861 4746 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/ac478ffc-e1e4-4b72-bf99-d35c2636a78d-service-ca-bundle\") pod \"authentication-operator-69f744f599-8mmbg\" (UID: \"ac478ffc-e1e4-4b72-bf99-d35c2636a78d\") " pod="openshift-authentication-operator/authentication-operator-69f744f599-8mmbg" Jan 28 20:41:52 crc kubenswrapper[4746]: I0128 20:41:52.859877 4746 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/0ea55ab8-bec0-44e8-8105-c8c604fc5fa1-serving-cert\") pod \"console-operator-58897d9998-4cmsz\" (UID: \"0ea55ab8-bec0-44e8-8105-c8c604fc5fa1\") " pod="openshift-console-operator/console-operator-58897d9998-4cmsz" Jan 28 20:41:52 crc kubenswrapper[4746]: I0128 20:41:52.859893 4746 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/f69da289-e18c-4baa-ad6b-4f2e3a44cda5-audit-dir\") pod 
\"apiserver-7bbb656c7d-nz5l2\" (UID: \"f69da289-e18c-4baa-ad6b-4f2e3a44cda5\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-nz5l2" Jan 28 20:41:52 crc kubenswrapper[4746]: I0128 20:41:52.859911 4746 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-f7rz5\" (UniqueName: \"kubernetes.io/projected/594b8ae9-dfcf-4b89-ad6b-7f7bfaa13960-kube-api-access-f7rz5\") pod \"openshift-apiserver-operator-796bbdcf4f-8w6v4\" (UID: \"594b8ae9-dfcf-4b89-ad6b-7f7bfaa13960\") " pod="openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-8w6v4" Jan 28 20:41:52 crc kubenswrapper[4746]: I0128 20:41:52.859926 4746 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"machine-api-operator-tls\" (UniqueName: \"kubernetes.io/secret/3e64bb6e-1131-431b-b87c-71e25d294fe1-machine-api-operator-tls\") pod \"machine-api-operator-5694c8668f-lzj8l\" (UID: \"3e64bb6e-1131-431b-b87c-71e25d294fe1\") " pod="openshift-machine-api/machine-api-operator-5694c8668f-lzj8l" Jan 28 20:41:52 crc kubenswrapper[4746]: I0128 20:41:52.859942 4746 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-2mg5d\" (UniqueName: \"kubernetes.io/projected/ac478ffc-e1e4-4b72-bf99-d35c2636a78d-kube-api-access-2mg5d\") pod \"authentication-operator-69f744f599-8mmbg\" (UID: \"ac478ffc-e1e4-4b72-bf99-d35c2636a78d\") " pod="openshift-authentication-operator/authentication-operator-69f744f599-8mmbg" Jan 28 20:41:52 crc kubenswrapper[4746]: I0128 20:41:52.859958 4746 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/594b8ae9-dfcf-4b89-ad6b-7f7bfaa13960-serving-cert\") pod \"openshift-apiserver-operator-796bbdcf4f-8w6v4\" (UID: \"594b8ae9-dfcf-4b89-ad6b-7f7bfaa13960\") " pod="openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-8w6v4" Jan 
28 20:41:52 crc kubenswrapper[4746]: I0128 20:41:52.859975 4746 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-system-session\" (UniqueName: \"kubernetes.io/secret/6208130d-52bc-449e-b371-357b1cc21b22-v4-0-config-system-session\") pod \"oauth-openshift-558db77b4-4g4p7\" (UID: \"6208130d-52bc-449e-b371-357b1cc21b22\") " pod="openshift-authentication/oauth-openshift-558db77b4-4g4p7"
Jan 28 20:41:52 crc kubenswrapper[4746]: I0128 20:41:52.859992 4746 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-user-template-provider-selection\" (UniqueName: \"kubernetes.io/secret/6208130d-52bc-449e-b371-357b1cc21b22-v4-0-config-user-template-provider-selection\") pod \"oauth-openshift-558db77b4-4g4p7\" (UID: \"6208130d-52bc-449e-b371-357b1cc21b22\") " pod="openshift-authentication/oauth-openshift-558db77b4-4g4p7"
Jan 28 20:41:52 crc kubenswrapper[4746]: I0128 20:41:52.860009 4746 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"encryption-config\" (UniqueName: \"kubernetes.io/secret/f69da289-e18c-4baa-ad6b-4f2e3a44cda5-encryption-config\") pod \"apiserver-7bbb656c7d-nz5l2\" (UID: \"f69da289-e18c-4baa-ad6b-4f2e3a44cda5\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-nz5l2"
Jan 28 20:41:52 crc kubenswrapper[4746]: I0128 20:41:52.860025 4746 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-system-serving-cert\" (UniqueName: \"kubernetes.io/secret/6208130d-52bc-449e-b371-357b1cc21b22-v4-0-config-system-serving-cert\") pod \"oauth-openshift-558db77b4-4g4p7\" (UID: \"6208130d-52bc-449e-b371-357b1cc21b22\") " pod="openshift-authentication/oauth-openshift-558db77b4-4g4p7"
Jan 28 20:41:52 crc kubenswrapper[4746]: I0128 20:41:52.860042 4746 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-system-cliconfig\" (UniqueName: \"kubernetes.io/configmap/6208130d-52bc-449e-b371-357b1cc21b22-v4-0-config-system-cliconfig\") pod \"oauth-openshift-558db77b4-4g4p7\" (UID: \"6208130d-52bc-449e-b371-357b1cc21b22\") " pod="openshift-authentication/oauth-openshift-558db77b4-4g4p7"
Jan 28 20:41:52 crc kubenswrapper[4746]: I0128 20:41:52.860059 4746 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"machine-approver-tls\" (UniqueName: \"kubernetes.io/secret/6479c53a-0e30-4805-bdb8-314a66127b5c-machine-approver-tls\") pod \"machine-approver-56656f9798-v7zfx\" (UID: \"6479c53a-0e30-4805-bdb8-314a66127b5c\") " pod="openshift-cluster-machine-approver/machine-approver-56656f9798-v7zfx"
Jan 28 20:41:52 crc kubenswrapper[4746]: I0128 20:41:52.860081 4746 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-qjr7f\" (UniqueName: \"kubernetes.io/projected/b34266ce-b971-4f4b-b8b7-c54ff8b6212c-kube-api-access-qjr7f\") pod \"downloads-7954f5f757-qrffw\" (UID: \"b34266ce-b971-4f4b-b8b7-c54ff8b6212c\") " pod="openshift-console/downloads-7954f5f757-qrffw"
Jan 28 20:41:52 crc kubenswrapper[4746]: I0128 20:41:52.860152 4746 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/6479c53a-0e30-4805-bdb8-314a66127b5c-config\") pod \"machine-approver-56656f9798-v7zfx\" (UID: \"6479c53a-0e30-4805-bdb8-314a66127b5c\") " pod="openshift-cluster-machine-approver/machine-approver-56656f9798-v7zfx"
Jan 28 20:41:52 crc kubenswrapper[4746]: I0128 20:41:52.860169 4746 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-h6zqh\" (UniqueName: \"kubernetes.io/projected/8c7789bf-f2f7-4cf4-97f1-3d8a7438daca-kube-api-access-h6zqh\") pod \"cluster-image-registry-operator-dc59b4c8b-z2vr7\" (UID: \"8c7789bf-f2f7-4cf4-97f1-3d8a7438daca\") " pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-z2vr7"
Jan 28 20:41:52 crc kubenswrapper[4746]: I0128 20:41:52.860187 4746 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-cdh2x\" (UniqueName: \"kubernetes.io/projected/0ea55ab8-bec0-44e8-8105-c8c604fc5fa1-kube-api-access-cdh2x\") pod \"console-operator-58897d9998-4cmsz\" (UID: \"0ea55ab8-bec0-44e8-8105-c8c604fc5fa1\") " pod="openshift-console-operator/console-operator-58897d9998-4cmsz"
Jan 28 20:41:52 crc kubenswrapper[4746]: I0128 20:41:52.860208 4746 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/6479c53a-0e30-4805-bdb8-314a66127b5c-auth-proxy-config\") pod \"machine-approver-56656f9798-v7zfx\" (UID: \"6479c53a-0e30-4805-bdb8-314a66127b5c\") " pod="openshift-cluster-machine-approver/machine-approver-56656f9798-v7zfx"
Jan 28 20:41:52 crc kubenswrapper[4746]: I0128 20:41:52.860226 4746 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/594b8ae9-dfcf-4b89-ad6b-7f7bfaa13960-config\") pod \"openshift-apiserver-operator-796bbdcf4f-8w6v4\" (UID: \"594b8ae9-dfcf-4b89-ad6b-7f7bfaa13960\") " pod="openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-8w6v4"
Jan 28 20:41:52 crc kubenswrapper[4746]: I0128 20:41:52.860240 4746 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"images\" (UniqueName: \"kubernetes.io/configmap/3e64bb6e-1131-431b-b87c-71e25d294fe1-images\") pod \"machine-api-operator-5694c8668f-lzj8l\" (UID: \"3e64bb6e-1131-431b-b87c-71e25d294fe1\") " pod="openshift-machine-api/machine-api-operator-5694c8668f-lzj8l"
Jan 28 20:41:52 crc kubenswrapper[4746]: I0128 20:41:52.860258 4746 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/ac478ffc-e1e4-4b72-bf99-d35c2636a78d-config\") pod \"authentication-operator-69f744f599-8mmbg\" (UID: \"ac478ffc-e1e4-4b72-bf99-d35c2636a78d\") " pod="openshift-authentication-operator/authentication-operator-69f744f599-8mmbg"
Jan 28 20:41:52 crc kubenswrapper[4746]: I0128 20:41:52.860278 4746 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/0ea55ab8-bec0-44e8-8105-c8c604fc5fa1-config\") pod \"console-operator-58897d9998-4cmsz\" (UID: \"0ea55ab8-bec0-44e8-8105-c8c604fc5fa1\") " pod="openshift-console-operator/console-operator-58897d9998-4cmsz"
Jan 28 20:41:52 crc kubenswrapper[4746]: I0128 20:41:52.860293 4746 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/f69da289-e18c-4baa-ad6b-4f2e3a44cda5-audit-policies\") pod \"apiserver-7bbb656c7d-nz5l2\" (UID: \"f69da289-e18c-4baa-ad6b-4f2e3a44cda5\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-nz5l2"
Jan 28 20:41:52 crc kubenswrapper[4746]: I0128 20:41:52.860308 4746 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-system-ocp-branding-template\" (UniqueName: \"kubernetes.io/secret/6208130d-52bc-449e-b371-357b1cc21b22-v4-0-config-system-ocp-branding-template\") pod \"oauth-openshift-558db77b4-4g4p7\" (UID: \"6208130d-52bc-449e-b371-357b1cc21b22\") " pod="openshift-authentication/oauth-openshift-558db77b4-4g4p7"
Jan 28 20:41:52 crc kubenswrapper[4746]: I0128 20:41:52.860323 4746 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-user-idp-0-file-data\" (UniqueName: \"kubernetes.io/secret/6208130d-52bc-449e-b371-357b1cc21b22-v4-0-config-user-idp-0-file-data\") pod \"oauth-openshift-558db77b4-4g4p7\" (UID: \"6208130d-52bc-449e-b371-357b1cc21b22\") " pod="openshift-authentication/oauth-openshift-558db77b4-4g4p7"
Jan 28 20:41:52 crc kubenswrapper[4746]: I0128 20:41:52.860346 4746 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-tvxmk\" (UniqueName: \"kubernetes.io/projected/3e64bb6e-1131-431b-b87c-71e25d294fe1-kube-api-access-tvxmk\") pod \"machine-api-operator-5694c8668f-lzj8l\" (UID: \"3e64bb6e-1131-431b-b87c-71e25d294fe1\") " pod="openshift-machine-api/machine-api-operator-5694c8668f-lzj8l"
Jan 28 20:41:52 crc kubenswrapper[4746]: I0128 20:41:52.860363 4746 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-system-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/6208130d-52bc-449e-b371-357b1cc21b22-v4-0-config-system-trusted-ca-bundle\") pod \"oauth-openshift-558db77b4-4g4p7\" (UID: \"6208130d-52bc-449e-b371-357b1cc21b22\") " pod="openshift-authentication/oauth-openshift-558db77b4-4g4p7"
Jan 28 20:41:52 crc kubenswrapper[4746]: I0128 20:41:52.860383 4746 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-rktqc\" (UniqueName: \"kubernetes.io/projected/447abd89-31fd-4bb6-a965-97d7954f47bb-kube-api-access-rktqc\") pod \"route-controller-manager-6576b87f9c-sjqv2\" (UID: \"447abd89-31fd-4bb6-a965-97d7954f47bb\") " pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-sjqv2"
Jan 28 20:41:52 crc kubenswrapper[4746]: I0128 20:41:52.860401 4746 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/0f0eb07a-e0b4-4702-89b0-d94e937471a5-serving-cert\") pod \"controller-manager-879f6c89f-5d8cs\" (UID: \"0f0eb07a-e0b4-4702-89b0-d94e937471a5\") " pod="openshift-controller-manager/controller-manager-879f6c89f-5d8cs"
Jan 28 20:41:52 crc kubenswrapper[4746]: I0128 20:41:52.860416 4746 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/447abd89-31fd-4bb6-a965-97d7954f47bb-config\") pod \"route-controller-manager-6576b87f9c-sjqv2\" (UID: \"447abd89-31fd-4bb6-a965-97d7954f47bb\") " pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-sjqv2"
Jan 28 20:41:52 crc kubenswrapper[4746]: I0128 20:41:52.860434 4746 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/447abd89-31fd-4bb6-a965-97d7954f47bb-serving-cert\") pod \"route-controller-manager-6576b87f9c-sjqv2\" (UID: \"447abd89-31fd-4bb6-a965-97d7954f47bb\") " pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-sjqv2"
Jan 28 20:41:52 crc kubenswrapper[4746]: I0128 20:41:52.860471 4746 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-dbx4z\" (UniqueName: \"kubernetes.io/projected/f69da289-e18c-4baa-ad6b-4f2e3a44cda5-kube-api-access-dbx4z\") pod \"apiserver-7bbb656c7d-nz5l2\" (UID: \"f69da289-e18c-4baa-ad6b-4f2e3a44cda5\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-nz5l2"
Jan 28 20:41:52 crc kubenswrapper[4746]: I0128 20:41:52.860487 4746 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/94cef654-afbe-42c2-8069-5dbcb7294abb-oauth-serving-cert\") pod \"console-f9d7485db-hcxv8\" (UID: \"94cef654-afbe-42c2-8069-5dbcb7294abb\") " pod="openshift-console/console-f9d7485db-hcxv8"
Jan 28 20:41:52 crc kubenswrapper[4746]: I0128 20:41:52.860504 4746 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/d66e245a-95d4-49c2-9172-2f095fab3e2b-serving-cert\") pod \"openshift-controller-manager-operator-756b6f6bc6-5gjgg\" (UID: \"d66e245a-95d4-49c2-9172-2f095fab3e2b\") " pod="openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-5gjgg"
Jan 28 20:41:52 crc kubenswrapper[4746]: I0128 20:41:52.860519 4746 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/f69da289-e18c-4baa-ad6b-4f2e3a44cda5-trusted-ca-bundle\") pod \"apiserver-7bbb656c7d-nz5l2\" (UID: \"f69da289-e18c-4baa-ad6b-4f2e3a44cda5\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-nz5l2"
Jan 28 20:41:52 crc kubenswrapper[4746]: I0128 20:41:52.860676 4746 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-8vvmj"]
Jan 28 20:41:52 crc kubenswrapper[4746]: I0128 20:41:52.861433 4746 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-ingress/router-default-5444994796-rjmhd"]
Jan 28 20:41:52 crc kubenswrapper[4746]: I0128 20:41:52.861564 4746 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-wb74c"
Jan 28 20:41:52 crc kubenswrapper[4746]: I0128 20:41:52.862135 4746 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-apiserver/apiserver-76f77b778f-sk7xz"]
Jan 28 20:41:52 crc kubenswrapper[4746]: I0128 20:41:52.863573 4746 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-hlqz9"]
Jan 28 20:41:52 crc kubenswrapper[4746]: I0128 20:41:52.863684 4746 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-8vvmj"
Jan 28 20:41:52 crc kubenswrapper[4746]: I0128 20:41:52.863948 4746 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ingress/router-default-5444994796-rjmhd"
Jan 28 20:41:52 crc kubenswrapper[4746]: I0128 20:41:52.864259 4746 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-zzrh5"]
Jan 28 20:41:52 crc kubenswrapper[4746]: I0128 20:41:52.864553 4746 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-apiserver/apiserver-76f77b778f-sk7xz"
Jan 28 20:41:52 crc kubenswrapper[4746]: I0128 20:41:52.864816 4746 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-hlqz9"
Jan 28 20:41:52 crc kubenswrapper[4746]: I0128 20:41:52.866899 4746 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-service-ca-operator/service-ca-operator-777779d784-62cbt"]
Jan 28 20:41:52 crc kubenswrapper[4746]: I0128 20:41:52.867508 4746 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29493870-4jnh6"]
Jan 28 20:41:52 crc kubenswrapper[4746]: I0128 20:41:52.868151 4746 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-machine-config-operator/machine-config-controller-84d6567774-rbl7q"]
Jan 28 20:41:52 crc kubenswrapper[4746]: I0128 20:41:52.868700 4746 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29493870-4jnh6"
Jan 28 20:41:52 crc kubenswrapper[4746]: I0128 20:41:52.868978 4746 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-zzrh5"
Jan 28 20:41:52 crc kubenswrapper[4746]: I0128 20:41:52.869234 4746 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-service-ca-operator/service-ca-operator-777779d784-62cbt"
Jan 28 20:41:52 crc kubenswrapper[4746]: I0128 20:41:52.869800 4746 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-storage-version-migrator"/"kube-storage-version-migrator-sa-dockercfg-5xfcg"
Jan 28 20:41:52 crc kubenswrapper[4746]: I0128 20:41:52.870883 4746 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-dns-operator"/"kube-root-ca.crt"
Jan 28 20:41:52 crc kubenswrapper[4746]: I0128 20:41:52.871396 4746 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-storage-version-migrator"/"kube-root-ca.crt"
Jan 28 20:41:52 crc kubenswrapper[4746]: I0128 20:41:52.875487 4746 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-service-ca/service-ca-9c57cc56f-vfcnn"]
Jan 28 20:41:52 crc kubenswrapper[4746]: I0128 20:41:52.875973 4746 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-27vdq"]
Jan 28 20:41:52 crc kubenswrapper[4746]: I0128 20:41:52.876419 4746 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-oauth-apiserver/apiserver-7bbb656c7d-nz5l2"]
Jan 28 20:41:52 crc kubenswrapper[4746]: I0128 20:41:52.876441 4746 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-machine-config-operator/machine-config-operator-74547568cd-wh6h4"]
Jan 28 20:41:52 crc kubenswrapper[4746]: I0128 20:41:52.876664 4746 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-dns-operator"/"dns-operator-dockercfg-9mqw5"
Jan 28 20:41:52 crc kubenswrapper[4746]: I0128 20:41:52.876968 4746 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-dns-operator"/"metrics-tls"
Jan 28 20:41:52 crc kubenswrapper[4746]: I0128 20:41:52.877596 4746 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-machine-config-operator/machine-config-controller-84d6567774-rbl7q"
Jan 28 20:41:52 crc kubenswrapper[4746]: I0128 20:41:52.877932 4746 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-service-ca/service-ca-9c57cc56f-vfcnn"
Jan 28 20:41:52 crc kubenswrapper[4746]: I0128 20:41:52.879962 4746 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-dns-operator"/"openshift-service-ca.crt"
Jan 28 20:41:52 crc kubenswrapper[4746]: I0128 20:41:52.887853 4746 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-27vdq"
Jan 28 20:41:52 crc kubenswrapper[4746]: I0128 20:41:52.900754 4746 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-storage-version-migrator"/"openshift-service-ca.crt"
Jan 28 20:41:52 crc kubenswrapper[4746]: I0128 20:41:52.903397 4746 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/marketplace-operator-79b997595-sqh2q"]
Jan 28 20:41:52 crc kubenswrapper[4746]: I0128 20:41:52.905082 4746 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-machine-api/machine-api-operator-5694c8668f-lzj8l"]
Jan 28 20:41:52 crc kubenswrapper[4746]: I0128 20:41:52.905123 4746 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-cluster-samples-operator/cluster-samples-operator-665b6dd947-xxnb8"]
Jan 28 20:41:52 crc kubenswrapper[4746]: I0128 20:41:52.905165 4746 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-controller-manager/controller-manager-879f6c89f-5d8cs"]
Jan 28 20:41:52 crc kubenswrapper[4746]: I0128 20:41:52.905176 4746 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-console-operator/console-operator-58897d9998-4cmsz"]
Jan 28 20:41:52 crc kubenswrapper[4746]: I0128 20:41:52.905456 4746 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/marketplace-operator-79b997595-sqh2q"
Jan 28 20:41:52 crc kubenswrapper[4746]: I0128 20:41:52.906168 4746 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-machine-config-operator/machine-config-operator-74547568cd-wh6h4"
Jan 28 20:41:52 crc kubenswrapper[4746]: I0128 20:41:52.916092 4746 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-6576b87f9c-sjqv2"]
Jan 28 20:41:52 crc kubenswrapper[4746]: I0128 20:41:52.919744 4746 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-console/console-f9d7485db-hcxv8"]
Jan 28 20:41:52 crc kubenswrapper[4746]: I0128 20:41:52.919822 4746 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-multus"/"multus-admission-controller-secret"
Jan 28 20:41:52 crc kubenswrapper[4746]: I0128 20:41:52.929399 4746 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-authentication-operator/authentication-operator-69f744f599-8mmbg"]
Jan 28 20:41:52 crc kubenswrapper[4746]: I0128 20:41:52.930760 4746 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-machine-config-operator/machine-config-server-q9qqq"]
Jan 28 20:41:52 crc kubenswrapper[4746]: I0128 20:41:52.958562 4746 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-machine-config-operator/machine-config-server-q9qqq"
Jan 28 20:41:52 crc kubenswrapper[4746]: I0128 20:41:52.959528 4746 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-z2vr7"]
Jan 28 20:41:52 crc kubenswrapper[4746]: I0128 20:41:52.960112 4746 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-multus"/"multus-ac-dockercfg-9lkdf"
Jan 28 20:41:52 crc kubenswrapper[4746]: I0128 20:41:52.961157 4746 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-dns-operator/dns-operator-744455d44c-q8fsc"]
Jan 28 20:41:52 crc kubenswrapper[4746]: I0128 20:41:52.961919 4746 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-slszm\" (UniqueName: \"kubernetes.io/projected/c068f936-795b-4eb3-83a8-e363131119e9-kube-api-access-slszm\") pod \"router-default-5444994796-rjmhd\" (UID: \"c068f936-795b-4eb3-83a8-e363131119e9\") " pod="openshift-ingress/router-default-5444994796-rjmhd"
Jan 28 20:41:52 crc kubenswrapper[4746]: I0128 20:41:52.962002 4746 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-b9l2t\" (UniqueName: \"kubernetes.io/projected/6479c53a-0e30-4805-bdb8-314a66127b5c-kube-api-access-b9l2t\") pod \"machine-approver-56656f9798-v7zfx\" (UID: \"6479c53a-0e30-4805-bdb8-314a66127b5c\") " pod="openshift-cluster-machine-approver/machine-approver-56656f9798-v7zfx"
Jan 28 20:41:52 crc kubenswrapper[4746]: I0128 20:41:52.962034 4746 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etcd-service-ca\" (UniqueName: \"kubernetes.io/configmap/f31f9ca0-e467-452d-90db-a28a4b69496e-etcd-service-ca\") pod \"etcd-operator-b45778765-dbwsb\" (UID: \"f31f9ca0-e467-452d-90db-a28a4b69496e\") " pod="openshift-etcd-operator/etcd-operator-b45778765-dbwsb"
Jan 28 20:41:52 crc kubenswrapper[4746]: I0128 20:41:52.962094 4746 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/0cedee54-2c6e-44d4-a51a-b5f8d3ff0833-config\") pod \"kube-controller-manager-operator-78b949d7b-hlqz9\" (UID: \"0cedee54-2c6e-44d4-a51a-b5f8d3ff0833\") " pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-hlqz9"
Jan 28 20:41:52 crc kubenswrapper[4746]: I0128 20:41:52.962152 4746 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"audit\" (UniqueName: \"kubernetes.io/configmap/af3e8fcb-fbde-4b65-92ab-3d8b71b2de07-audit\") pod \"apiserver-76f77b778f-sk7xz\" (UID: \"af3e8fcb-fbde-4b65-92ab-3d8b71b2de07\") " pod="openshift-apiserver/apiserver-76f77b778f-sk7xz"
Jan 28 20:41:52 crc kubenswrapper[4746]: I0128 20:41:52.962189 4746 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-f7rz5\" (UniqueName: \"kubernetes.io/projected/594b8ae9-dfcf-4b89-ad6b-7f7bfaa13960-kube-api-access-f7rz5\") pod \"openshift-apiserver-operator-796bbdcf4f-8w6v4\" (UID: \"594b8ae9-dfcf-4b89-ad6b-7f7bfaa13960\") " pod="openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-8w6v4"
Jan 28 20:41:52 crc kubenswrapper[4746]: I0128 20:41:52.962237 4746 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/f69da289-e18c-4baa-ad6b-4f2e3a44cda5-audit-dir\") pod \"apiserver-7bbb656c7d-nz5l2\" (UID: \"f69da289-e18c-4baa-ad6b-4f2e3a44cda5\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-nz5l2"
Jan 28 20:41:52 crc kubenswrapper[4746]: I0128 20:41:52.962267 4746 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-hjlmv\" (UniqueName: \"kubernetes.io/projected/aa7d6dd6-1d54-400f-a188-628f99083f93-kube-api-access-hjlmv\") pod \"multus-admission-controller-857f4d67dd-lqlcg\" (UID: \"aa7d6dd6-1d54-400f-a188-628f99083f93\") " pod="openshift-multus/multus-admission-controller-857f4d67dd-lqlcg"
Jan 28 20:41:52 crc kubenswrapper[4746]: I0128 20:41:52.962283 4746 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-8w6v4"]
Jan 28 20:41:52 crc kubenswrapper[4746]: I0128 20:41:52.962297 4746 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tmpfs\" (UniqueName: \"kubernetes.io/empty-dir/aa555a55-a0b0-47e3-959f-e2d8d387aae2-tmpfs\") pod \"packageserver-d55dfcdfc-8vvmj\" (UID: \"aa555a55-a0b0-47e3-959f-e2d8d387aae2\") " pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-8vvmj"
Jan 28 20:41:52 crc kubenswrapper[4746]: I0128 20:41:52.962322 4746 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-pullsecrets\" (UniqueName: \"kubernetes.io/host-path/af3e8fcb-fbde-4b65-92ab-3d8b71b2de07-node-pullsecrets\") pod \"apiserver-76f77b778f-sk7xz\" (UID: \"af3e8fcb-fbde-4b65-92ab-3d8b71b2de07\") " pod="openshift-apiserver/apiserver-76f77b778f-sk7xz"
Jan 28 20:41:52 crc kubenswrapper[4746]: I0128 20:41:52.962343 4746 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-2mg5d\" (UniqueName: \"kubernetes.io/projected/ac478ffc-e1e4-4b72-bf99-d35c2636a78d-kube-api-access-2mg5d\") pod \"authentication-operator-69f744f599-8mmbg\" (UID: \"ac478ffc-e1e4-4b72-bf99-d35c2636a78d\") " pod="openshift-authentication-operator/authentication-operator-69f744f599-8mmbg"
Jan 28 20:41:52 crc kubenswrapper[4746]: I0128 20:41:52.962371 4746 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"available-featuregates\" (UniqueName: \"kubernetes.io/empty-dir/18ad1fee-6a3a-4b98-83c5-6a13f22699db-available-featuregates\") pod \"openshift-config-operator-7777fb866f-rktng\" (UID: \"18ad1fee-6a3a-4b98-83c5-6a13f22699db\") " pod="openshift-config-operator/openshift-config-operator-7777fb866f-rktng"
Jan 28 20:41:52 crc kubenswrapper[4746]: I0128 20:41:52.962411 4746 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/f69da289-e18c-4baa-ad6b-4f2e3a44cda5-audit-dir\") pod \"apiserver-7bbb656c7d-nz5l2\" (UID: \"f69da289-e18c-4baa-ad6b-4f2e3a44cda5\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-nz5l2"
Jan 28 20:41:52 crc kubenswrapper[4746]: I0128 20:41:52.962422 4746 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/594b8ae9-dfcf-4b89-ad6b-7f7bfaa13960-serving-cert\") pod \"openshift-apiserver-operator-796bbdcf4f-8w6v4\" (UID: \"594b8ae9-dfcf-4b89-ad6b-7f7bfaa13960\") " pod="openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-8w6v4"
Jan 28 20:41:52 crc kubenswrapper[4746]: I0128 20:41:52.962444 4746 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-user-template-provider-selection\" (UniqueName: \"kubernetes.io/secret/6208130d-52bc-449e-b371-357b1cc21b22-v4-0-config-user-template-provider-selection\") pod \"oauth-openshift-558db77b4-4g4p7\" (UID: \"6208130d-52bc-449e-b371-357b1cc21b22\") " pod="openshift-authentication/oauth-openshift-558db77b4-4g4p7"
Jan 28 20:41:52 crc kubenswrapper[4746]: I0128 20:41:52.962515 4746 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/460c9e37-a0c0-43ea-9607-8f716e2e92bf-kube-api-access\") pod \"kube-apiserver-operator-766d6c64bb-gbwrh\" (UID: \"460c9e37-a0c0-43ea-9607-8f716e2e92bf\") " pod="openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-gbwrh"
Jan 28 20:41:52 crc kubenswrapper[4746]: I0128 20:41:52.964271 4746 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-config-operator/openshift-config-operator-7777fb866f-rktng"]
Jan 28 20:41:52 crc kubenswrapper[4746]: I0128 20:41:52.964722 4746 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"encryption-config\" (UniqueName: \"kubernetes.io/secret/f69da289-e18c-4baa-ad6b-4f2e3a44cda5-encryption-config\") pod \"apiserver-7bbb656c7d-nz5l2\" (UID: \"f69da289-e18c-4baa-ad6b-4f2e3a44cda5\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-nz5l2"
Jan 28 20:41:52 crc kubenswrapper[4746]: I0128 20:41:52.965010 4746 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"profile-collector-cert\" (UniqueName: \"kubernetes.io/secret/4886a33e-8379-47d5-ac13-e58bc623d01c-profile-collector-cert\") pod \"catalog-operator-68c6474976-wb74c\" (UID: \"4886a33e-8379-47d5-ac13-e58bc623d01c\") " pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-wb74c"
Jan 28 20:41:52 crc kubenswrapper[4746]: I0128 20:41:52.964815 4746 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-etcd-operator"/"openshift-service-ca.crt"
Jan 28 20:41:52 crc kubenswrapper[4746]: I0128 20:41:52.965442 4746 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-system-cliconfig\" (UniqueName: \"kubernetes.io/configmap/6208130d-52bc-449e-b371-357b1cc21b22-v4-0-config-system-cliconfig\") pod \"oauth-openshift-558db77b4-4g4p7\" (UID: \"6208130d-52bc-449e-b371-357b1cc21b22\") " pod="openshift-authentication/oauth-openshift-558db77b4-4g4p7"
Jan 28 20:41:52 crc kubenswrapper[4746]: I0128 20:41:52.965841 4746 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-image-registry/image-registry-697d97f7c8-lwhk4"]
Jan 28 20:41:52 crc kubenswrapper[4746]: I0128 20:41:52.966490 4746 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-h6zqh\" (UniqueName: \"kubernetes.io/projected/8c7789bf-f2f7-4cf4-97f1-3d8a7438daca-kube-api-access-h6zqh\") pod \"cluster-image-registry-operator-dc59b4c8b-z2vr7\" (UID: \"8c7789bf-f2f7-4cf4-97f1-3d8a7438daca\") " pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-z2vr7"
Jan 28 20:41:52 crc kubenswrapper[4746]: I0128 20:41:52.966753 4746 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-qjr7f\" (UniqueName: \"kubernetes.io/projected/b34266ce-b971-4f4b-b8b7-c54ff8b6212c-kube-api-access-qjr7f\") pod \"downloads-7954f5f757-qrffw\" (UID: \"b34266ce-b971-4f4b-b8b7-c54ff8b6212c\") " pod="openshift-console/downloads-7954f5f757-qrffw"
Jan 28 20:41:52 crc kubenswrapper[4746]: I0128 20:41:52.966818 4746 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/f3cd630b-3fe5-497f-90da-f58bcb7aac8b-trusted-ca\") pod \"ingress-operator-5b745b69d9-h2q6n\" (UID: \"f3cd630b-3fe5-497f-90da-f58bcb7aac8b\") " pod="openshift-ingress-operator/ingress-operator-5b745b69d9-h2q6n"
Jan 28 20:41:52 crc kubenswrapper[4746]: I0128 20:41:52.966851 4746 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/8b8a9bdb-0612-4627-ba36-98293308c32d-serving-cert\") pod \"openshift-kube-scheduler-operator-5fdd9b5758-562pf\" (UID: \"8b8a9bdb-0612-4627-ba36-98293308c32d\") " pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-562pf"
Jan 28 20:41:52 crc kubenswrapper[4746]: I0128 20:41:52.966885 4746 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/c15d5c5e-19d5-43c6-8955-1bb2ad8b33b6-secret-volume\") pod \"collect-profiles-29493870-4jnh6\" (UID: \"c15d5c5e-19d5-43c6-8955-1bb2ad8b33b6\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29493870-4jnh6"
Jan 28 20:41:52 crc kubenswrapper[4746]: I0128 20:41:52.966909 4746 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/af3e8fcb-fbde-4b65-92ab-3d8b71b2de07-trusted-ca-bundle\") pod \"apiserver-76f77b778f-sk7xz\" (UID: \"af3e8fcb-fbde-4b65-92ab-3d8b71b2de07\") " pod="openshift-apiserver/apiserver-76f77b778f-sk7xz"
Jan 28 20:41:52 crc kubenswrapper[4746]: I0128 20:41:52.966934 4746 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/f3cd630b-3fe5-497f-90da-f58bcb7aac8b-metrics-tls\") pod \"ingress-operator-5b745b69d9-h2q6n\" (UID: \"f3cd630b-3fe5-497f-90da-f58bcb7aac8b\") " pod="openshift-ingress-operator/ingress-operator-5b745b69d9-h2q6n"
Jan 28 20:41:52 crc kubenswrapper[4746]: I0128 20:41:52.966994 4746 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-system-ocp-branding-template\" (UniqueName: \"kubernetes.io/secret/6208130d-52bc-449e-b371-357b1cc21b22-v4-0-config-system-ocp-branding-template\") pod \"oauth-openshift-558db77b4-4g4p7\" (UID: \"6208130d-52bc-449e-b371-357b1cc21b22\") " pod="openshift-authentication/oauth-openshift-558db77b4-4g4p7"
Jan 28 20:41:52 crc kubenswrapper[4746]: I0128 20:41:52.967019 4746 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/ac478ffc-e1e4-4b72-bf99-d35c2636a78d-config\") pod \"authentication-operator-69f744f599-8mmbg\" (UID: \"ac478ffc-e1e4-4b72-bf99-d35c2636a78d\") " pod="openshift-authentication-operator/authentication-operator-69f744f599-8mmbg"
Jan 28 20:41:52 crc kubenswrapper[4746]: I0128 20:41:52.967039 4746 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/0ea55ab8-bec0-44e8-8105-c8c604fc5fa1-config\") pod \"console-operator-58897d9998-4cmsz\" (UID: \"0ea55ab8-bec0-44e8-8105-c8c604fc5fa1\") " pod="openshift-console-operator/console-operator-58897d9998-4cmsz"
Jan 28 20:41:52 crc kubenswrapper[4746]: I0128 20:41:52.967055 4746 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/f69da289-e18c-4baa-ad6b-4f2e3a44cda5-audit-policies\") pod \"apiserver-7bbb656c7d-nz5l2\" (UID: \"f69da289-e18c-4baa-ad6b-4f2e3a44cda5\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-nz5l2"
Jan 28 20:41:52 crc kubenswrapper[4746]: I0128 20:41:52.967088 4746 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-rktqc\" (UniqueName: \"kubernetes.io/projected/447abd89-31fd-4bb6-a965-97d7954f47bb-kube-api-access-rktqc\") pod \"route-controller-manager-6576b87f9c-sjqv2\" (UID: \"447abd89-31fd-4bb6-a965-97d7954f47bb\") " pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-sjqv2"
Jan 28 20:41:52 crc kubenswrapper[4746]: I0128 20:41:52.969246 4746 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/94cef654-afbe-42c2-8069-5dbcb7294abb-oauth-serving-cert\") pod \"console-f9d7485db-hcxv8\" (UID: \"94cef654-afbe-42c2-8069-5dbcb7294abb\") " pod="openshift-console/console-f9d7485db-hcxv8"
Jan 28 20:41:52 crc kubenswrapper[4746]: I0128 20:41:52.969272 4746 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/0f0eb07a-e0b4-4702-89b0-d94e937471a5-serving-cert\") pod \"controller-manager-879f6c89f-5d8cs\" (UID: \"0f0eb07a-e0b4-4702-89b0-d94e937471a5\") " pod="openshift-controller-manager/controller-manager-879f6c89f-5d8cs"
Jan 28 20:41:52 crc kubenswrapper[4746]: I0128 20:41:52.969314 4746 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/447abd89-31fd-4bb6-a965-97d7954f47bb-config\") pod \"route-controller-manager-6576b87f9c-sjqv2\" (UID: \"447abd89-31fd-4bb6-a965-97d7954f47bb\") " pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-sjqv2"
Jan 28 20:41:52 crc kubenswrapper[4746]: I0128 20:41:52.969333 4746 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/447abd89-31fd-4bb6-a965-97d7954f47bb-serving-cert\") pod \"route-controller-manager-6576b87f9c-sjqv2\" (UID: \"447abd89-31fd-4bb6-a965-97d7954f47bb\") " pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-sjqv2"
Jan 28 20:41:52 crc kubenswrapper[4746]: I0128 20:41:52.968764 4746 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/ac478ffc-e1e4-4b72-bf99-d35c2636a78d-config\") pod \"authentication-operator-69f744f599-8mmbg\" (UID: \"ac478ffc-e1e4-4b72-bf99-d35c2636a78d\") " pod="openshift-authentication-operator/authentication-operator-69f744f599-8mmbg"
Jan 28 20:41:52 crc kubenswrapper[4746]: I0128 20:41:52.969353 4746 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/d66e245a-95d4-49c2-9172-2f095fab3e2b-serving-cert\") pod \"openshift-controller-manager-operator-756b6f6bc6-5gjgg\" (UID: \"d66e245a-95d4-49c2-9172-2f095fab3e2b\") " pod="openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-5gjgg"
Jan 28 20:41:52 crc kubenswrapper[4746]: I0128 20:41:52.967244 4746 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-system-cliconfig\" (UniqueName: \"kubernetes.io/configmap/6208130d-52bc-449e-b371-357b1cc21b22-v4-0-config-system-cliconfig\") pod \"oauth-openshift-558db77b4-4g4p7\" (UID: \"6208130d-52bc-449e-b371-357b1cc21b22\") " pod="openshift-authentication/oauth-openshift-558db77b4-4g4p7"
Jan 28 20:41:52 crc kubenswrapper[4746]: I0128 20:41:52.969400 4746 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/aa555a55-a0b0-47e3-959f-e2d8d387aae2-webhook-cert\") pod \"packageserver-d55dfcdfc-8vvmj\" (UID: \"aa555a55-a0b0-47e3-959f-e2d8d387aae2\") " pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-8vvmj"
Jan 28 20:41:52 crc kubenswrapper[4746]: I0128 20:41:52.969423 4746 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-6tgv9\" (UniqueName: \"kubernetes.io/projected/4886a33e-8379-47d5-ac13-e58bc623d01c-kube-api-access-6tgv9\") pod \"catalog-operator-68c6474976-wb74c\" (UID: \"4886a33e-8379-47d5-ac13-e58bc623d01c\") " pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-wb74c"
Jan 28 20:41:52 crc kubenswrapper[4746]: I0128 20:41:52.969465 4746 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-5jq78\" (UniqueName: \"kubernetes.io/projected/c29355d7-4e9a-4e5a-838e-77a4df7c2fda-kube-api-access-5jq78\") pod \"dns-operator-744455d44c-q8fsc\" (UID: \"c29355d7-4e9a-4e5a-838e-77a4df7c2fda\") " pod="openshift-dns-operator/dns-operator-744455d44c-q8fsc"
Jan 28 20:41:52 crc kubenswrapper[4746]: I0128 20:41:52.969487 4746 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/c068f936-795b-4eb3-83a8-e363131119e9-metrics-certs\") pod \"router-default-5444994796-rjmhd\" (UID: \"c068f936-795b-4eb3-83a8-e363131119e9\") " pod="openshift-ingress/router-default-5444994796-rjmhd"
Jan 28 20:41:52 crc kubenswrapper[4746]: I0128 20:41:52.969542 4746 reconciler_common.go:218] "operationExecutor.MountVolume started for volume
\"v4-0-config-system-router-certs\" (UniqueName: \"kubernetes.io/secret/6208130d-52bc-449e-b371-357b1cc21b22-v4-0-config-system-router-certs\") pod \"oauth-openshift-558db77b4-4g4p7\" (UID: \"6208130d-52bc-449e-b371-357b1cc21b22\") " pod="openshift-authentication/oauth-openshift-558db77b4-4g4p7" Jan 28 20:41:52 crc kubenswrapper[4746]: I0128 20:41:52.969569 4746 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/8c7789bf-f2f7-4cf4-97f1-3d8a7438daca-trusted-ca\") pod \"cluster-image-registry-operator-dc59b4c8b-z2vr7\" (UID: \"8c7789bf-f2f7-4cf4-97f1-3d8a7438daca\") " pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-z2vr7" Jan 28 20:41:52 crc kubenswrapper[4746]: I0128 20:41:52.969621 4746 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-lfhc8\" (UniqueName: \"kubernetes.io/projected/f31f9ca0-e467-452d-90db-a28a4b69496e-kube-api-access-lfhc8\") pod \"etcd-operator-b45778765-dbwsb\" (UID: \"f31f9ca0-e467-452d-90db-a28a4b69496e\") " pod="openshift-etcd-operator/etcd-operator-b45778765-dbwsb" Jan 28 20:41:52 crc kubenswrapper[4746]: I0128 20:41:52.969642 4746 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etcd-serving-ca\" (UniqueName: \"kubernetes.io/configmap/af3e8fcb-fbde-4b65-92ab-3d8b71b2de07-etcd-serving-ca\") pod \"apiserver-76f77b778f-sk7xz\" (UID: \"af3e8fcb-fbde-4b65-92ab-3d8b71b2de07\") " pod="openshift-apiserver/apiserver-76f77b778f-sk7xz" Jan 28 20:41:52 crc kubenswrapper[4746]: I0128 20:41:52.969665 4746 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-9kglc\" (UniqueName: \"kubernetes.io/projected/c15d5c5e-19d5-43c6-8955-1bb2ad8b33b6-kube-api-access-9kglc\") pod \"collect-profiles-29493870-4jnh6\" (UID: \"c15d5c5e-19d5-43c6-8955-1bb2ad8b33b6\") " 
pod="openshift-operator-lifecycle-manager/collect-profiles-29493870-4jnh6" Jan 28 20:41:52 crc kubenswrapper[4746]: I0128 20:41:52.969709 4746 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/8c7789bf-f2f7-4cf4-97f1-3d8a7438daca-bound-sa-token\") pod \"cluster-image-registry-operator-dc59b4c8b-z2vr7\" (UID: \"8c7789bf-f2f7-4cf4-97f1-3d8a7438daca\") " pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-z2vr7" Jan 28 20:41:52 crc kubenswrapper[4746]: I0128 20:41:52.969730 4746 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-ctjhc\" (UniqueName: \"kubernetes.io/projected/e3e01537-7adb-4a81-a9cb-3deb73a1d5b3-kube-api-access-ctjhc\") pod \"cluster-samples-operator-665b6dd947-xxnb8\" (UID: \"e3e01537-7adb-4a81-a9cb-3deb73a1d5b3\") " pod="openshift-cluster-samples-operator/cluster-samples-operator-665b6dd947-xxnb8" Jan 28 20:41:52 crc kubenswrapper[4746]: I0128 20:41:52.969749 4746 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etcd-serving-ca\" (UniqueName: \"kubernetes.io/configmap/f69da289-e18c-4baa-ad6b-4f2e3a44cda5-etcd-serving-ca\") pod \"apiserver-7bbb656c7d-nz5l2\" (UID: \"f69da289-e18c-4baa-ad6b-4f2e3a44cda5\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-nz5l2" Jan 28 20:41:52 crc kubenswrapper[4746]: I0128 20:41:52.969792 4746 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-user-template-login\" (UniqueName: \"kubernetes.io/secret/6208130d-52bc-449e-b371-357b1cc21b22-v4-0-config-user-template-login\") pod \"oauth-openshift-558db77b4-4g4p7\" (UID: \"6208130d-52bc-449e-b371-357b1cc21b22\") " pod="openshift-authentication/oauth-openshift-558db77b4-4g4p7" Jan 28 20:41:52 crc kubenswrapper[4746]: I0128 20:41:52.969814 4746 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"trusted-ca-bundle\" (UniqueName: 
\"kubernetes.io/configmap/ac478ffc-e1e4-4b72-bf99-d35c2636a78d-trusted-ca-bundle\") pod \"authentication-operator-69f744f599-8mmbg\" (UID: \"ac478ffc-e1e4-4b72-bf99-d35c2636a78d\") " pod="openshift-authentication-operator/authentication-operator-69f744f599-8mmbg" Jan 28 20:41:52 crc kubenswrapper[4746]: I0128 20:41:52.969832 4746 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"samples-operator-tls\" (UniqueName: \"kubernetes.io/secret/e3e01537-7adb-4a81-a9cb-3deb73a1d5b3-samples-operator-tls\") pod \"cluster-samples-operator-665b6dd947-xxnb8\" (UID: \"e3e01537-7adb-4a81-a9cb-3deb73a1d5b3\") " pod="openshift-cluster-samples-operator/cluster-samples-operator-665b6dd947-xxnb8" Jan 28 20:41:52 crc kubenswrapper[4746]: I0128 20:41:52.969870 4746 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-6krvq\" (UniqueName: \"kubernetes.io/projected/6208130d-52bc-449e-b371-357b1cc21b22-kube-api-access-6krvq\") pod \"oauth-openshift-558db77b4-4g4p7\" (UID: \"6208130d-52bc-449e-b371-357b1cc21b22\") " pod="openshift-authentication/oauth-openshift-558db77b4-4g4p7" Jan 28 20:41:52 crc kubenswrapper[4746]: I0128 20:41:52.969888 4746 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/d66e245a-95d4-49c2-9172-2f095fab3e2b-config\") pod \"openshift-controller-manager-operator-756b6f6bc6-5gjgg\" (UID: \"d66e245a-95d4-49c2-9172-2f095fab3e2b\") " pod="openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-5gjgg" Jan 28 20:41:52 crc kubenswrapper[4746]: I0128 20:41:52.969905 4746 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/c29355d7-4e9a-4e5a-838e-77a4df7c2fda-metrics-tls\") pod \"dns-operator-744455d44c-q8fsc\" (UID: \"c29355d7-4e9a-4e5a-838e-77a4df7c2fda\") " 
pod="openshift-dns-operator/dns-operator-744455d44c-q8fsc" Jan 28 20:41:52 crc kubenswrapper[4746]: I0128 20:41:52.969945 4746 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-pdbcf\" (UniqueName: \"kubernetes.io/projected/aa555a55-a0b0-47e3-959f-e2d8d387aae2-kube-api-access-pdbcf\") pod \"packageserver-d55dfcdfc-8vvmj\" (UID: \"aa555a55-a0b0-47e3-959f-e2d8d387aae2\") " pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-8vvmj" Jan 28 20:41:52 crc kubenswrapper[4746]: I0128 20:41:52.967623 4746 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-storage-version-migrator/migrator-59844c95c7-s8b4b"] Jan 28 20:41:52 crc kubenswrapper[4746]: I0128 20:41:52.969963 4746 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/af3e8fcb-fbde-4b65-92ab-3d8b71b2de07-config\") pod \"apiserver-76f77b778f-sk7xz\" (UID: \"af3e8fcb-fbde-4b65-92ab-3d8b71b2de07\") " pod="openshift-apiserver/apiserver-76f77b778f-sk7xz" Jan 28 20:41:52 crc kubenswrapper[4746]: I0128 20:41:52.969981 4746 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-xdpbf\" (UniqueName: \"kubernetes.io/projected/18ad1fee-6a3a-4b98-83c5-6a13f22699db-kube-api-access-xdpbf\") pod \"openshift-config-operator-7777fb866f-rktng\" (UID: \"18ad1fee-6a3a-4b98-83c5-6a13f22699db\") " pod="openshift-config-operator/openshift-config-operator-7777fb866f-rktng" Jan 28 20:41:52 crc kubenswrapper[4746]: I0128 20:41:52.970033 4746 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/f31f9ca0-e467-452d-90db-a28a4b69496e-serving-cert\") pod \"etcd-operator-b45778765-dbwsb\" (UID: \"f31f9ca0-e467-452d-90db-a28a4b69496e\") " pod="openshift-etcd-operator/etcd-operator-b45778765-dbwsb" Jan 28 20:41:52 
crc kubenswrapper[4746]: I0128 20:41:52.970053 4746 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/94cef654-afbe-42c2-8069-5dbcb7294abb-console-config\") pod \"console-f9d7485db-hcxv8\" (UID: \"94cef654-afbe-42c2-8069-5dbcb7294abb\") " pod="openshift-console/console-f9d7485db-hcxv8" Jan 28 20:41:52 crc kubenswrapper[4746]: I0128 20:41:52.970070 4746 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-64bpv\" (UniqueName: \"kubernetes.io/projected/d66e245a-95d4-49c2-9172-2f095fab3e2b-kube-api-access-64bpv\") pod \"openshift-controller-manager-operator-756b6f6bc6-5gjgg\" (UID: \"d66e245a-95d4-49c2-9172-2f095fab3e2b\") " pod="openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-5gjgg" Jan 28 20:41:52 crc kubenswrapper[4746]: I0128 20:41:52.970129 4746 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/0f0eb07a-e0b4-4702-89b0-d94e937471a5-config\") pod \"controller-manager-879f6c89f-5d8cs\" (UID: \"0f0eb07a-e0b4-4702-89b0-d94e937471a5\") " pod="openshift-controller-manager/controller-manager-879f6c89f-5d8cs" Jan 28 20:41:52 crc kubenswrapper[4746]: I0128 20:41:52.970163 4746 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/f3cd630b-3fe5-497f-90da-f58bcb7aac8b-bound-sa-token\") pod \"ingress-operator-5b745b69d9-h2q6n\" (UID: \"f3cd630b-3fe5-497f-90da-f58bcb7aac8b\") " pod="openshift-ingress-operator/ingress-operator-5b745b69d9-h2q6n" Jan 28 20:41:52 crc kubenswrapper[4746]: I0128 20:41:52.970202 4746 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"encryption-config\" (UniqueName: \"kubernetes.io/secret/af3e8fcb-fbde-4b65-92ab-3d8b71b2de07-encryption-config\") pod 
\"apiserver-76f77b778f-sk7xz\" (UID: \"af3e8fcb-fbde-4b65-92ab-3d8b71b2de07\") " pod="openshift-apiserver/apiserver-76f77b778f-sk7xz" Jan 28 20:41:52 crc kubenswrapper[4746]: I0128 20:41:52.970223 4746 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-gr849\" (UniqueName: \"kubernetes.io/projected/71db2fbb-6c1a-417f-9cc4-a1ae041d0c0a-kube-api-access-gr849\") pod \"package-server-manager-789f6589d5-zzrh5\" (UID: \"71db2fbb-6c1a-417f-9cc4-a1ae041d0c0a\") " pod="openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-zzrh5" Jan 28 20:41:52 crc kubenswrapper[4746]: I0128 20:41:52.970243 4746 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/ac478ffc-e1e4-4b72-bf99-d35c2636a78d-serving-cert\") pod \"authentication-operator-69f744f599-8mmbg\" (UID: \"ac478ffc-e1e4-4b72-bf99-d35c2636a78d\") " pod="openshift-authentication-operator/authentication-operator-69f744f599-8mmbg" Jan 28 20:41:52 crc kubenswrapper[4746]: I0128 20:41:52.970280 4746 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/94cef654-afbe-42c2-8069-5dbcb7294abb-console-serving-cert\") pod \"console-f9d7485db-hcxv8\" (UID: \"94cef654-afbe-42c2-8069-5dbcb7294abb\") " pod="openshift-console/console-f9d7485db-hcxv8" Jan 28 20:41:52 crc kubenswrapper[4746]: I0128 20:41:52.970297 4746 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/0f0eb07a-e0b4-4702-89b0-d94e937471a5-client-ca\") pod \"controller-manager-879f6c89f-5d8cs\" (UID: \"0f0eb07a-e0b4-4702-89b0-d94e937471a5\") " pod="openshift-controller-manager/controller-manager-879f6c89f-5d8cs" Jan 28 20:41:52 crc kubenswrapper[4746]: I0128 20:41:52.970315 4746 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for 
volume \"kube-api-access-g4tp6\" (UniqueName: \"kubernetes.io/projected/4e03d657-3b57-4eea-bb77-f5fe3a519cac-kube-api-access-g4tp6\") pod \"migrator-59844c95c7-s8b4b\" (UID: \"4e03d657-3b57-4eea-bb77-f5fe3a519cac\") " pod="openshift-kube-storage-version-migrator/migrator-59844c95c7-s8b4b" Jan 28 20:41:52 crc kubenswrapper[4746]: I0128 20:41:52.970334 4746 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/c068f936-795b-4eb3-83a8-e363131119e9-service-ca-bundle\") pod \"router-default-5444994796-rjmhd\" (UID: \"c068f936-795b-4eb3-83a8-e363131119e9\") " pod="openshift-ingress/router-default-5444994796-rjmhd" Jan 28 20:41:52 crc kubenswrapper[4746]: I0128 20:41:52.970350 4746 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/af3e8fcb-fbde-4b65-92ab-3d8b71b2de07-etcd-client\") pod \"apiserver-76f77b778f-sk7xz\" (UID: \"af3e8fcb-fbde-4b65-92ab-3d8b71b2de07\") " pod="openshift-apiserver/apiserver-76f77b778f-sk7xz" Jan 28 20:41:52 crc kubenswrapper[4746]: I0128 20:41:52.970370 4746 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/ac478ffc-e1e4-4b72-bf99-d35c2636a78d-service-ca-bundle\") pod \"authentication-operator-69f744f599-8mmbg\" (UID: \"ac478ffc-e1e4-4b72-bf99-d35c2636a78d\") " pod="openshift-authentication-operator/authentication-operator-69f744f599-8mmbg" Jan 28 20:41:52 crc kubenswrapper[4746]: I0128 20:41:52.970405 4746 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/0ea55ab8-bec0-44e8-8105-c8c604fc5fa1-serving-cert\") pod \"console-operator-58897d9998-4cmsz\" (UID: \"0ea55ab8-bec0-44e8-8105-c8c604fc5fa1\") " pod="openshift-console-operator/console-operator-58897d9998-4cmsz" Jan 28 20:41:52 crc 
kubenswrapper[4746]: I0128 20:41:52.970424 4746 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/f31f9ca0-e467-452d-90db-a28a4b69496e-etcd-client\") pod \"etcd-operator-b45778765-dbwsb\" (UID: \"f31f9ca0-e467-452d-90db-a28a4b69496e\") " pod="openshift-etcd-operator/etcd-operator-b45778765-dbwsb" Jan 28 20:41:52 crc kubenswrapper[4746]: I0128 20:41:52.970460 4746 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"package-server-manager-serving-cert\" (UniqueName: \"kubernetes.io/secret/71db2fbb-6c1a-417f-9cc4-a1ae041d0c0a-package-server-manager-serving-cert\") pod \"package-server-manager-789f6589d5-zzrh5\" (UID: \"71db2fbb-6c1a-417f-9cc4-a1ae041d0c0a\") " pod="openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-zzrh5" Jan 28 20:41:52 crc kubenswrapper[4746]: I0128 20:41:52.970521 4746 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"machine-api-operator-tls\" (UniqueName: \"kubernetes.io/secret/3e64bb6e-1131-431b-b87c-71e25d294fe1-machine-api-operator-tls\") pod \"machine-api-operator-5694c8668f-lzj8l\" (UID: \"3e64bb6e-1131-431b-b87c-71e25d294fe1\") " pod="openshift-machine-api/machine-api-operator-5694c8668f-lzj8l" Jan 28 20:41:52 crc kubenswrapper[4746]: I0128 20:41:52.970543 4746 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-system-session\" (UniqueName: \"kubernetes.io/secret/6208130d-52bc-449e-b371-357b1cc21b22-v4-0-config-system-session\") pod \"oauth-openshift-558db77b4-4g4p7\" (UID: \"6208130d-52bc-449e-b371-357b1cc21b22\") " pod="openshift-authentication/oauth-openshift-558db77b4-4g4p7" Jan 28 20:41:52 crc kubenswrapper[4746]: I0128 20:41:52.970566 4746 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-system-serving-cert\" (UniqueName: 
\"kubernetes.io/secret/6208130d-52bc-449e-b371-357b1cc21b22-v4-0-config-system-serving-cert\") pod \"oauth-openshift-558db77b4-4g4p7\" (UID: \"6208130d-52bc-449e-b371-357b1cc21b22\") " pod="openshift-authentication/oauth-openshift-558db77b4-4g4p7" Jan 28 20:41:52 crc kubenswrapper[4746]: I0128 20:41:52.970584 4746 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"machine-approver-tls\" (UniqueName: \"kubernetes.io/secret/6479c53a-0e30-4805-bdb8-314a66127b5c-machine-approver-tls\") pod \"machine-approver-56656f9798-v7zfx\" (UID: \"6479c53a-0e30-4805-bdb8-314a66127b5c\") " pod="openshift-cluster-machine-approver/machine-approver-56656f9798-v7zfx" Jan 28 20:41:52 crc kubenswrapper[4746]: I0128 20:41:52.970602 4746 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/6479c53a-0e30-4805-bdb8-314a66127b5c-config\") pod \"machine-approver-56656f9798-v7zfx\" (UID: \"6479c53a-0e30-4805-bdb8-314a66127b5c\") " pod="openshift-cluster-machine-approver/machine-approver-56656f9798-v7zfx" Jan 28 20:41:52 crc kubenswrapper[4746]: I0128 20:41:52.970633 4746 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-cdh2x\" (UniqueName: \"kubernetes.io/projected/0ea55ab8-bec0-44e8-8105-c8c604fc5fa1-kube-api-access-cdh2x\") pod \"console-operator-58897d9998-4cmsz\" (UID: \"0ea55ab8-bec0-44e8-8105-c8c604fc5fa1\") " pod="openshift-console-operator/console-operator-58897d9998-4cmsz" Jan 28 20:41:52 crc kubenswrapper[4746]: I0128 20:41:52.970650 4746 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/6479c53a-0e30-4805-bdb8-314a66127b5c-auth-proxy-config\") pod \"machine-approver-56656f9798-v7zfx\" (UID: \"6479c53a-0e30-4805-bdb8-314a66127b5c\") " pod="openshift-cluster-machine-approver/machine-approver-56656f9798-v7zfx" Jan 28 20:41:52 crc kubenswrapper[4746]: I0128 
20:41:52.970666 4746 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etcd-ca\" (UniqueName: \"kubernetes.io/configmap/f31f9ca0-e467-452d-90db-a28a4b69496e-etcd-ca\") pod \"etcd-operator-b45778765-dbwsb\" (UID: \"f31f9ca0-e467-452d-90db-a28a4b69496e\") " pod="openshift-etcd-operator/etcd-operator-b45778765-dbwsb" Jan 28 20:41:52 crc kubenswrapper[4746]: I0128 20:41:52.970684 4746 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/8b8a9bdb-0612-4627-ba36-98293308c32d-kube-api-access\") pod \"openshift-kube-scheduler-operator-5fdd9b5758-562pf\" (UID: \"8b8a9bdb-0612-4627-ba36-98293308c32d\") " pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-562pf" Jan 28 20:41:52 crc kubenswrapper[4746]: I0128 20:41:52.970702 4746 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"images\" (UniqueName: \"kubernetes.io/configmap/3e64bb6e-1131-431b-b87c-71e25d294fe1-images\") pod \"machine-api-operator-5694c8668f-lzj8l\" (UID: \"3e64bb6e-1131-431b-b87c-71e25d294fe1\") " pod="openshift-machine-api/machine-api-operator-5694c8668f-lzj8l" Jan 28 20:41:52 crc kubenswrapper[4746]: I0128 20:41:52.970722 4746 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/594b8ae9-dfcf-4b89-ad6b-7f7bfaa13960-config\") pod \"openshift-apiserver-operator-796bbdcf4f-8w6v4\" (UID: \"594b8ae9-dfcf-4b89-ad6b-7f7bfaa13960\") " pod="openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-8w6v4" Jan 28 20:41:52 crc kubenswrapper[4746]: I0128 20:41:52.971131 4746 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-user-idp-0-file-data\" (UniqueName: \"kubernetes.io/secret/6208130d-52bc-449e-b371-357b1cc21b22-v4-0-config-user-idp-0-file-data\") pod \"oauth-openshift-558db77b4-4g4p7\" 
(UID: \"6208130d-52bc-449e-b371-357b1cc21b22\") " pod="openshift-authentication/oauth-openshift-558db77b4-4g4p7" Jan 28 20:41:52 crc kubenswrapper[4746]: I0128 20:41:52.971164 4746 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-tvxmk\" (UniqueName: \"kubernetes.io/projected/3e64bb6e-1131-431b-b87c-71e25d294fe1-kube-api-access-tvxmk\") pod \"machine-api-operator-5694c8668f-lzj8l\" (UID: \"3e64bb6e-1131-431b-b87c-71e25d294fe1\") " pod="openshift-machine-api/machine-api-operator-5694c8668f-lzj8l" Jan 28 20:41:52 crc kubenswrapper[4746]: I0128 20:41:52.971196 4746 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"stats-auth\" (UniqueName: \"kubernetes.io/secret/c068f936-795b-4eb3-83a8-e363131119e9-stats-auth\") pod \"router-default-5444994796-rjmhd\" (UID: \"c068f936-795b-4eb3-83a8-e363131119e9\") " pod="openshift-ingress/router-default-5444994796-rjmhd" Jan 28 20:41:52 crc kubenswrapper[4746]: I0128 20:41:52.971213 4746 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/c15d5c5e-19d5-43c6-8955-1bb2ad8b33b6-config-volume\") pod \"collect-profiles-29493870-4jnh6\" (UID: \"c15d5c5e-19d5-43c6-8955-1bb2ad8b33b6\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29493870-4jnh6" Jan 28 20:41:52 crc kubenswrapper[4746]: I0128 20:41:52.971231 4746 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-system-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/6208130d-52bc-449e-b371-357b1cc21b22-v4-0-config-system-trusted-ca-bundle\") pod \"oauth-openshift-558db77b4-4g4p7\" (UID: \"6208130d-52bc-449e-b371-357b1cc21b22\") " pod="openshift-authentication/oauth-openshift-558db77b4-4g4p7" Jan 28 20:41:52 crc kubenswrapper[4746]: I0128 20:41:52.971247 4746 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume 
started for volume \"srv-cert\" (UniqueName: \"kubernetes.io/secret/4886a33e-8379-47d5-ac13-e58bc623d01c-srv-cert\") pod \"catalog-operator-68c6474976-wb74c\" (UID: \"4886a33e-8379-47d5-ac13-e58bc623d01c\") " pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-wb74c" Jan 28 20:41:52 crc kubenswrapper[4746]: I0128 20:41:52.971266 4746 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-dbx4z\" (UniqueName: \"kubernetes.io/projected/f69da289-e18c-4baa-ad6b-4f2e3a44cda5-kube-api-access-dbx4z\") pod \"apiserver-7bbb656c7d-nz5l2\" (UID: \"f69da289-e18c-4baa-ad6b-4f2e3a44cda5\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-nz5l2" Jan 28 20:41:52 crc kubenswrapper[4746]: I0128 20:41:52.971286 4746 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/f69da289-e18c-4baa-ad6b-4f2e3a44cda5-trusted-ca-bundle\") pod \"apiserver-7bbb656c7d-nz5l2\" (UID: \"f69da289-e18c-4baa-ad6b-4f2e3a44cda5\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-nz5l2" Jan 28 20:41:52 crc kubenswrapper[4746]: I0128 20:41:52.971302 4746 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/0cedee54-2c6e-44d4-a51a-b5f8d3ff0833-kube-api-access\") pod \"kube-controller-manager-operator-78b949d7b-hlqz9\" (UID: \"0cedee54-2c6e-44d4-a51a-b5f8d3ff0833\") " pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-hlqz9" Jan 28 20:41:52 crc kubenswrapper[4746]: I0128 20:41:52.971323 4746 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/94cef654-afbe-42c2-8069-5dbcb7294abb-service-ca\") pod \"console-f9d7485db-hcxv8\" (UID: \"94cef654-afbe-42c2-8069-5dbcb7294abb\") " pod="openshift-console/console-f9d7485db-hcxv8" Jan 28 20:41:52 crc 
kubenswrapper[4746]: I0128 20:41:52.971343 4746 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/aa555a55-a0b0-47e3-959f-e2d8d387aae2-apiservice-cert\") pod \"packageserver-d55dfcdfc-8vvmj\" (UID: \"aa555a55-a0b0-47e3-959f-e2d8d387aae2\") " pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-8vvmj" Jan 28 20:41:52 crc kubenswrapper[4746]: I0128 20:41:52.971407 4746 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-49tzx\" (UniqueName: \"kubernetes.io/projected/af3e8fcb-fbde-4b65-92ab-3d8b71b2de07-kube-api-access-49tzx\") pod \"apiserver-76f77b778f-sk7xz\" (UID: \"af3e8fcb-fbde-4b65-92ab-3d8b71b2de07\") " pod="openshift-apiserver/apiserver-76f77b778f-sk7xz" Jan 28 20:41:52 crc kubenswrapper[4746]: I0128 20:41:52.971427 4746 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/0ea55ab8-bec0-44e8-8105-c8c604fc5fa1-trusted-ca\") pod \"console-operator-58897d9998-4cmsz\" (UID: \"0ea55ab8-bec0-44e8-8105-c8c604fc5fa1\") " pod="openshift-console-operator/console-operator-58897d9998-4cmsz" Jan 28 20:41:52 crc kubenswrapper[4746]: I0128 20:41:52.971479 4746 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"control-plane-machine-set-operator-tls\" (UniqueName: \"kubernetes.io/secret/72e0847f-0a87-4710-9765-a10282cc0529-control-plane-machine-set-operator-tls\") pod \"control-plane-machine-set-operator-78cbb6b69f-mznpb\" (UID: \"72e0847f-0a87-4710-9765-a10282cc0529\") " pod="openshift-machine-api/control-plane-machine-set-operator-78cbb6b69f-mznpb" Jan 28 20:41:52 crc kubenswrapper[4746]: I0128 20:41:52.971500 4746 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: 
\"kubernetes.io/secret/f69da289-e18c-4baa-ad6b-4f2e3a44cda5-serving-cert\") pod \"apiserver-7bbb656c7d-nz5l2\" (UID: \"f69da289-e18c-4baa-ad6b-4f2e3a44cda5\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-nz5l2" Jan 28 20:41:52 crc kubenswrapper[4746]: I0128 20:41:52.971517 4746 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"default-certificate\" (UniqueName: \"kubernetes.io/secret/c068f936-795b-4eb3-83a8-e363131119e9-default-certificate\") pod \"router-default-5444994796-rjmhd\" (UID: \"c068f936-795b-4eb3-83a8-e363131119e9\") " pod="openshift-ingress/router-default-5444994796-rjmhd" Jan 28 20:41:52 crc kubenswrapper[4746]: I0128 20:41:52.971535 4746 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-user-template-error\" (UniqueName: \"kubernetes.io/secret/6208130d-52bc-449e-b371-357b1cc21b22-v4-0-config-user-template-error\") pod \"oauth-openshift-558db77b4-4g4p7\" (UID: \"6208130d-52bc-449e-b371-357b1cc21b22\") " pod="openshift-authentication/oauth-openshift-558db77b4-4g4p7" Jan 28 20:41:52 crc kubenswrapper[4746]: I0128 20:41:52.971552 4746 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/3e64bb6e-1131-431b-b87c-71e25d294fe1-config\") pod \"machine-api-operator-5694c8668f-lzj8l\" (UID: \"3e64bb6e-1131-431b-b87c-71e25d294fe1\") " pod="openshift-machine-api/machine-api-operator-5694c8668f-lzj8l" Jan 28 20:41:52 crc kubenswrapper[4746]: I0128 20:41:52.971570 4746 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/0cedee54-2c6e-44d4-a51a-b5f8d3ff0833-serving-cert\") pod \"kube-controller-manager-operator-78b949d7b-hlqz9\" (UID: \"0cedee54-2c6e-44d4-a51a-b5f8d3ff0833\") " pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-hlqz9" Jan 28 20:41:52 crc 
kubenswrapper[4746]: I0128 20:41:52.971589 4746 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/447abd89-31fd-4bb6-a965-97d7954f47bb-client-ca\") pod \"route-controller-manager-6576b87f9c-sjqv2\" (UID: \"447abd89-31fd-4bb6-a965-97d7954f47bb\") " pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-sjqv2" Jan 28 20:41:52 crc kubenswrapper[4746]: I0128 20:41:52.971606 4746 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"webhook-certs\" (UniqueName: \"kubernetes.io/secret/aa7d6dd6-1d54-400f-a188-628f99083f93-webhook-certs\") pod \"multus-admission-controller-857f4d67dd-lqlcg\" (UID: \"aa7d6dd6-1d54-400f-a188-628f99083f93\") " pod="openshift-multus/multus-admission-controller-857f4d67dd-lqlcg" Jan 28 20:41:52 crc kubenswrapper[4746]: I0128 20:41:52.971624 4746 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/f31f9ca0-e467-452d-90db-a28a4b69496e-config\") pod \"etcd-operator-b45778765-dbwsb\" (UID: \"f31f9ca0-e467-452d-90db-a28a4b69496e\") " pod="openshift-etcd-operator/etcd-operator-b45778765-dbwsb" Jan 28 20:41:52 crc kubenswrapper[4746]: I0128 20:41:52.971641 4746 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/460c9e37-a0c0-43ea-9607-8f716e2e92bf-serving-cert\") pod \"kube-apiserver-operator-766d6c64bb-gbwrh\" (UID: \"460c9e37-a0c0-43ea-9607-8f716e2e92bf\") " pod="openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-gbwrh" Jan 28 20:41:52 crc kubenswrapper[4746]: I0128 20:41:52.971662 4746 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/8b8a9bdb-0612-4627-ba36-98293308c32d-config\") pod 
\"openshift-kube-scheduler-operator-5fdd9b5758-562pf\" (UID: \"8b8a9bdb-0612-4627-ba36-98293308c32d\") " pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-562pf" Jan 28 20:41:52 crc kubenswrapper[4746]: I0128 20:41:52.971680 4746 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"image-import-ca\" (UniqueName: \"kubernetes.io/configmap/af3e8fcb-fbde-4b65-92ab-3d8b71b2de07-image-import-ca\") pod \"apiserver-76f77b778f-sk7xz\" (UID: \"af3e8fcb-fbde-4b65-92ab-3d8b71b2de07\") " pod="openshift-apiserver/apiserver-76f77b778f-sk7xz" Jan 28 20:41:52 crc kubenswrapper[4746]: I0128 20:41:52.971746 4746 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/6208130d-52bc-449e-b371-357b1cc21b22-audit-policies\") pod \"oauth-openshift-558db77b4-4g4p7\" (UID: \"6208130d-52bc-449e-b371-357b1cc21b22\") " pod="openshift-authentication/oauth-openshift-558db77b4-4g4p7" Jan 28 20:41:52 crc kubenswrapper[4746]: I0128 20:41:52.971839 4746 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-system-service-ca\" (UniqueName: \"kubernetes.io/configmap/6208130d-52bc-449e-b371-357b1cc21b22-v4-0-config-system-service-ca\") pod \"oauth-openshift-558db77b4-4g4p7\" (UID: \"6208130d-52bc-449e-b371-357b1cc21b22\") " pod="openshift-authentication/oauth-openshift-558db77b4-4g4p7" Jan 28 20:41:52 crc kubenswrapper[4746]: I0128 20:41:52.971859 4746 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/460c9e37-a0c0-43ea-9607-8f716e2e92bf-config\") pod \"kube-apiserver-operator-766d6c64bb-gbwrh\" (UID: \"460c9e37-a0c0-43ea-9607-8f716e2e92bf\") " pod="openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-gbwrh" Jan 28 20:41:52 crc kubenswrapper[4746]: I0128 20:41:52.971914 4746 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"kube-api-access-6tknp\" (UniqueName: \"kubernetes.io/projected/94cef654-afbe-42c2-8069-5dbcb7294abb-kube-api-access-6tknp\") pod \"console-f9d7485db-hcxv8\" (UID: \"94cef654-afbe-42c2-8069-5dbcb7294abb\") " pod="openshift-console/console-f9d7485db-hcxv8" Jan 28 20:41:52 crc kubenswrapper[4746]: I0128 20:41:52.971933 4746 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-lrbz8\" (UniqueName: \"kubernetes.io/projected/f3cd630b-3fe5-497f-90da-f58bcb7aac8b-kube-api-access-lrbz8\") pod \"ingress-operator-5b745b69d9-h2q6n\" (UID: \"f3cd630b-3fe5-497f-90da-f58bcb7aac8b\") " pod="openshift-ingress-operator/ingress-operator-5b745b69d9-h2q6n" Jan 28 20:41:52 crc kubenswrapper[4746]: I0128 20:41:52.972000 4746 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"image-registry-operator-tls\" (UniqueName: \"kubernetes.io/secret/8c7789bf-f2f7-4cf4-97f1-3d8a7438daca-image-registry-operator-tls\") pod \"cluster-image-registry-operator-dc59b4c8b-z2vr7\" (UID: \"8c7789bf-f2f7-4cf4-97f1-3d8a7438daca\") " pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-z2vr7" Jan 28 20:41:52 crc kubenswrapper[4746]: I0128 20:41:52.972036 4746 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/f69da289-e18c-4baa-ad6b-4f2e3a44cda5-etcd-client\") pod \"apiserver-7bbb656c7d-nz5l2\" (UID: \"f69da289-e18c-4baa-ad6b-4f2e3a44cda5\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-nz5l2" Jan 28 20:41:52 crc kubenswrapper[4746]: I0128 20:41:52.972227 4746 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/94cef654-afbe-42c2-8069-5dbcb7294abb-console-oauth-config\") pod \"console-f9d7485db-hcxv8\" (UID: \"94cef654-afbe-42c2-8069-5dbcb7294abb\") " 
pod="openshift-console/console-f9d7485db-hcxv8" Jan 28 20:41:52 crc kubenswrapper[4746]: I0128 20:41:52.972299 4746 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/6208130d-52bc-449e-b371-357b1cc21b22-audit-dir\") pod \"oauth-openshift-558db77b4-4g4p7\" (UID: \"6208130d-52bc-449e-b371-357b1cc21b22\") " pod="openshift-authentication/oauth-openshift-558db77b4-4g4p7" Jan 28 20:41:52 crc kubenswrapper[4746]: I0128 20:41:52.972332 4746 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-vnkbm\" (UniqueName: \"kubernetes.io/projected/72e0847f-0a87-4710-9765-a10282cc0529-kube-api-access-vnkbm\") pod \"control-plane-machine-set-operator-78cbb6b69f-mznpb\" (UID: \"72e0847f-0a87-4710-9765-a10282cc0529\") " pod="openshift-machine-api/control-plane-machine-set-operator-78cbb6b69f-mznpb" Jan 28 20:41:52 crc kubenswrapper[4746]: I0128 20:41:52.972376 4746 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/18ad1fee-6a3a-4b98-83c5-6a13f22699db-serving-cert\") pod \"openshift-config-operator-7777fb866f-rktng\" (UID: \"18ad1fee-6a3a-4b98-83c5-6a13f22699db\") " pod="openshift-config-operator/openshift-config-operator-7777fb866f-rktng" Jan 28 20:41:52 crc kubenswrapper[4746]: I0128 20:41:52.972409 4746 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/af3e8fcb-fbde-4b65-92ab-3d8b71b2de07-audit-dir\") pod \"apiserver-76f77b778f-sk7xz\" (UID: \"af3e8fcb-fbde-4b65-92ab-3d8b71b2de07\") " pod="openshift-apiserver/apiserver-76f77b778f-sk7xz" Jan 28 20:41:52 crc kubenswrapper[4746]: I0128 20:41:52.972462 4746 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"proxy-ca-bundles\" (UniqueName: 
\"kubernetes.io/configmap/0f0eb07a-e0b4-4702-89b0-d94e937471a5-proxy-ca-bundles\") pod \"controller-manager-879f6c89f-5d8cs\" (UID: \"0f0eb07a-e0b4-4702-89b0-d94e937471a5\") " pod="openshift-controller-manager/controller-manager-879f6c89f-5d8cs" Jan 28 20:41:52 crc kubenswrapper[4746]: I0128 20:41:52.972497 4746 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-2k8vv\" (UniqueName: \"kubernetes.io/projected/0f0eb07a-e0b4-4702-89b0-d94e937471a5-kube-api-access-2k8vv\") pod \"controller-manager-879f6c89f-5d8cs\" (UID: \"0f0eb07a-e0b4-4702-89b0-d94e937471a5\") " pod="openshift-controller-manager/controller-manager-879f6c89f-5d8cs" Jan 28 20:41:52 crc kubenswrapper[4746]: I0128 20:41:52.972535 4746 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/94cef654-afbe-42c2-8069-5dbcb7294abb-trusted-ca-bundle\") pod \"console-f9d7485db-hcxv8\" (UID: \"94cef654-afbe-42c2-8069-5dbcb7294abb\") " pod="openshift-console/console-f9d7485db-hcxv8" Jan 28 20:41:52 crc kubenswrapper[4746]: I0128 20:41:52.972567 4746 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/af3e8fcb-fbde-4b65-92ab-3d8b71b2de07-serving-cert\") pod \"apiserver-76f77b778f-sk7xz\" (UID: \"af3e8fcb-fbde-4b65-92ab-3d8b71b2de07\") " pod="openshift-apiserver/apiserver-76f77b778f-sk7xz" Jan 28 20:41:52 crc kubenswrapper[4746]: I0128 20:41:52.973909 4746 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-apiserver/apiserver-76f77b778f-sk7xz"] Jan 28 20:41:52 crc kubenswrapper[4746]: I0128 20:41:52.973966 4746 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-etcd-operator/etcd-operator-b45778765-dbwsb"] Jan 28 20:41:52 crc kubenswrapper[4746]: I0128 20:41:52.973984 4746 kubelet.go:2428] "SyncLoop UPDATE" source="api" 
pods=["openshift-ingress-operator/ingress-operator-5b745b69d9-h2q6n"] Jan 28 20:41:52 crc kubenswrapper[4746]: I0128 20:41:52.978468 4746 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/f69da289-e18c-4baa-ad6b-4f2e3a44cda5-audit-policies\") pod \"apiserver-7bbb656c7d-nz5l2\" (UID: \"f69da289-e18c-4baa-ad6b-4f2e3a44cda5\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-nz5l2" Jan 28 20:41:52 crc kubenswrapper[4746]: I0128 20:41:52.978636 4746 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/94cef654-afbe-42c2-8069-5dbcb7294abb-oauth-serving-cert\") pod \"console-f9d7485db-hcxv8\" (UID: \"94cef654-afbe-42c2-8069-5dbcb7294abb\") " pod="openshift-console/console-f9d7485db-hcxv8" Jan 28 20:41:52 crc kubenswrapper[4746]: I0128 20:41:52.979108 4746 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-5gjgg"] Jan 28 20:41:52 crc kubenswrapper[4746]: I0128 20:41:52.979168 4746 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-hlqz9"] Jan 28 20:41:52 crc kubenswrapper[4746]: I0128 20:41:52.979185 4746 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-console/downloads-7954f5f757-qrffw"] Jan 28 20:41:52 crc kubenswrapper[4746]: I0128 20:41:52.979661 4746 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-system-ocp-branding-template\" (UniqueName: \"kubernetes.io/secret/6208130d-52bc-449e-b371-357b1cc21b22-v4-0-config-system-ocp-branding-template\") pod \"oauth-openshift-558db77b4-4g4p7\" (UID: \"6208130d-52bc-449e-b371-357b1cc21b22\") " pod="openshift-authentication/oauth-openshift-558db77b4-4g4p7" Jan 28 20:41:52 crc kubenswrapper[4746]: I0128 20:41:52.980537 4746 
operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/94cef654-afbe-42c2-8069-5dbcb7294abb-service-ca\") pod \"console-f9d7485db-hcxv8\" (UID: \"94cef654-afbe-42c2-8069-5dbcb7294abb\") " pod="openshift-console/console-f9d7485db-hcxv8" Jan 28 20:41:52 crc kubenswrapper[4746]: I0128 20:41:52.980979 4746 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-user-template-provider-selection\" (UniqueName: \"kubernetes.io/secret/6208130d-52bc-449e-b371-357b1cc21b22-v4-0-config-user-template-provider-selection\") pod \"oauth-openshift-558db77b4-4g4p7\" (UID: \"6208130d-52bc-449e-b371-357b1cc21b22\") " pod="openshift-authentication/oauth-openshift-558db77b4-4g4p7" Jan 28 20:41:52 crc kubenswrapper[4746]: I0128 20:41:52.982341 4746 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/594b8ae9-dfcf-4b89-ad6b-7f7bfaa13960-serving-cert\") pod \"openshift-apiserver-operator-796bbdcf4f-8w6v4\" (UID: \"594b8ae9-dfcf-4b89-ad6b-7f7bfaa13960\") " pod="openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-8w6v4" Jan 28 20:41:52 crc kubenswrapper[4746]: I0128 20:41:52.982514 4746 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/0f0eb07a-e0b4-4702-89b0-d94e937471a5-client-ca\") pod \"controller-manager-879f6c89f-5d8cs\" (UID: \"0f0eb07a-e0b4-4702-89b0-d94e937471a5\") " pod="openshift-controller-manager/controller-manager-879f6c89f-5d8cs" Jan 28 20:41:52 crc kubenswrapper[4746]: I0128 20:41:52.983147 4746 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etcd-serving-ca\" (UniqueName: \"kubernetes.io/configmap/f69da289-e18c-4baa-ad6b-4f2e3a44cda5-etcd-serving-ca\") pod \"apiserver-7bbb656c7d-nz5l2\" (UID: \"f69da289-e18c-4baa-ad6b-4f2e3a44cda5\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-nz5l2" Jan 28 
20:41:52 crc kubenswrapper[4746]: I0128 20:41:52.983199 4746 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/447abd89-31fd-4bb6-a965-97d7954f47bb-config\") pod \"route-controller-manager-6576b87f9c-sjqv2\" (UID: \"447abd89-31fd-4bb6-a965-97d7954f47bb\") " pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-sjqv2" Jan 28 20:41:52 crc kubenswrapper[4746]: I0128 20:41:52.983240 4746 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/6479c53a-0e30-4805-bdb8-314a66127b5c-auth-proxy-config\") pod \"machine-approver-56656f9798-v7zfx\" (UID: \"6479c53a-0e30-4805-bdb8-314a66127b5c\") " pod="openshift-cluster-machine-approver/machine-approver-56656f9798-v7zfx" Jan 28 20:41:52 crc kubenswrapper[4746]: I0128 20:41:52.983614 4746 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-etcd-operator"/"etcd-operator-dockercfg-r9srn" Jan 28 20:41:52 crc kubenswrapper[4746]: I0128 20:41:52.983931 4746 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"images\" (UniqueName: \"kubernetes.io/configmap/3e64bb6e-1131-431b-b87c-71e25d294fe1-images\") pod \"machine-api-operator-5694c8668f-lzj8l\" (UID: \"3e64bb6e-1131-431b-b87c-71e25d294fe1\") " pod="openshift-machine-api/machine-api-operator-5694c8668f-lzj8l" Jan 28 20:41:52 crc kubenswrapper[4746]: I0128 20:41:52.984275 4746 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-machine-api/control-plane-machine-set-operator-78cbb6b69f-mznpb"] Jan 28 20:41:52 crc kubenswrapper[4746]: I0128 20:41:52.984310 4746 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-gbwrh"] Jan 28 20:41:52 crc kubenswrapper[4746]: I0128 20:41:52.984325 4746 kubelet.go:2428] "SyncLoop UPDATE" source="api" 
pods=["openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-8vvmj"] Jan 28 20:41:52 crc kubenswrapper[4746]: I0128 20:41:52.984601 4746 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/d66e245a-95d4-49c2-9172-2f095fab3e2b-config\") pod \"openshift-controller-manager-operator-756b6f6bc6-5gjgg\" (UID: \"d66e245a-95d4-49c2-9172-2f095fab3e2b\") " pod="openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-5gjgg" Jan 28 20:41:52 crc kubenswrapper[4746]: I0128 20:41:52.984682 4746 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/594b8ae9-dfcf-4b89-ad6b-7f7bfaa13960-config\") pod \"openshift-apiserver-operator-796bbdcf4f-8w6v4\" (UID: \"594b8ae9-dfcf-4b89-ad6b-7f7bfaa13960\") " pod="openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-8w6v4" Jan 28 20:41:52 crc kubenswrapper[4746]: I0128 20:41:52.985317 4746 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/f69da289-e18c-4baa-ad6b-4f2e3a44cda5-trusted-ca-bundle\") pod \"apiserver-7bbb656c7d-nz5l2\" (UID: \"f69da289-e18c-4baa-ad6b-4f2e3a44cda5\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-nz5l2" Jan 28 20:41:52 crc kubenswrapper[4746]: I0128 20:41:52.985990 4746 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"encryption-config\" (UniqueName: \"kubernetes.io/secret/f69da289-e18c-4baa-ad6b-4f2e3a44cda5-encryption-config\") pod \"apiserver-7bbb656c7d-nz5l2\" (UID: \"f69da289-e18c-4baa-ad6b-4f2e3a44cda5\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-nz5l2" Jan 28 20:41:52 crc kubenswrapper[4746]: I0128 20:41:52.986947 4746 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/447abd89-31fd-4bb6-a965-97d7954f47bb-client-ca\") pod 
\"route-controller-manager-6576b87f9c-sjqv2\" (UID: \"447abd89-31fd-4bb6-a965-97d7954f47bb\") " pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-sjqv2" Jan 28 20:41:52 crc kubenswrapper[4746]: I0128 20:41:52.987891 4746 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/6208130d-52bc-449e-b371-357b1cc21b22-audit-policies\") pod \"oauth-openshift-558db77b4-4g4p7\" (UID: \"6208130d-52bc-449e-b371-357b1cc21b22\") " pod="openshift-authentication/oauth-openshift-558db77b4-4g4p7" Jan 28 20:41:52 crc kubenswrapper[4746]: I0128 20:41:52.988273 4746 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/8c7789bf-f2f7-4cf4-97f1-3d8a7438daca-trusted-ca\") pod \"cluster-image-registry-operator-dc59b4c8b-z2vr7\" (UID: \"8c7789bf-f2f7-4cf4-97f1-3d8a7438daca\") " pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-z2vr7" Jan 28 20:41:52 crc kubenswrapper[4746]: I0128 20:41:52.988370 4746 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/0f0eb07a-e0b4-4702-89b0-d94e937471a5-proxy-ca-bundles\") pod \"controller-manager-879f6c89f-5d8cs\" (UID: \"0f0eb07a-e0b4-4702-89b0-d94e937471a5\") " pod="openshift-controller-manager/controller-manager-879f6c89f-5d8cs" Jan 28 20:41:52 crc kubenswrapper[4746]: I0128 20:41:52.988453 4746 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-system-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/6208130d-52bc-449e-b371-357b1cc21b22-v4-0-config-system-trusted-ca-bundle\") pod \"oauth-openshift-558db77b4-4g4p7\" (UID: \"6208130d-52bc-449e-b371-357b1cc21b22\") " pod="openshift-authentication/oauth-openshift-558db77b4-4g4p7" Jan 28 20:41:52 crc kubenswrapper[4746]: I0128 20:41:52.988831 4746 operation_generator.go:637] "MountVolume.SetUp succeeded for 
volume \"v4-0-config-system-service-ca\" (UniqueName: \"kubernetes.io/configmap/6208130d-52bc-449e-b371-357b1cc21b22-v4-0-config-system-service-ca\") pod \"oauth-openshift-558db77b4-4g4p7\" (UID: \"6208130d-52bc-449e-b371-357b1cc21b22\") " pod="openshift-authentication/oauth-openshift-558db77b4-4g4p7" Jan 28 20:41:52 crc kubenswrapper[4746]: I0128 20:41:52.989370 4746 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/f69da289-e18c-4baa-ad6b-4f2e3a44cda5-etcd-client\") pod \"apiserver-7bbb656c7d-nz5l2\" (UID: \"f69da289-e18c-4baa-ad6b-4f2e3a44cda5\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-nz5l2" Jan 28 20:41:52 crc kubenswrapper[4746]: I0128 20:41:52.989712 4746 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-system-router-certs\" (UniqueName: \"kubernetes.io/secret/6208130d-52bc-449e-b371-357b1cc21b22-v4-0-config-system-router-certs\") pod \"oauth-openshift-558db77b4-4g4p7\" (UID: \"6208130d-52bc-449e-b371-357b1cc21b22\") " pod="openshift-authentication/oauth-openshift-558db77b4-4g4p7" Jan 28 20:41:52 crc kubenswrapper[4746]: I0128 20:41:52.989861 4746 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/catalog-operator-68c6474976-wb74c"] Jan 28 20:41:52 crc kubenswrapper[4746]: I0128 20:41:52.989911 4746 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-m6pvw"] Jan 28 20:41:52 crc kubenswrapper[4746]: I0128 20:41:52.989960 4746 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-user-template-error\" (UniqueName: \"kubernetes.io/secret/6208130d-52bc-449e-b371-357b1cc21b22-v4-0-config-user-template-error\") pod \"oauth-openshift-558db77b4-4g4p7\" (UID: \"6208130d-52bc-449e-b371-357b1cc21b22\") " pod="openshift-authentication/oauth-openshift-558db77b4-4g4p7" Jan 28 20:41:52 crc 
kubenswrapper[4746]: I0128 20:41:52.990185 4746 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/6208130d-52bc-449e-b371-357b1cc21b22-audit-dir\") pod \"oauth-openshift-558db77b4-4g4p7\" (UID: \"6208130d-52bc-449e-b371-357b1cc21b22\") " pod="openshift-authentication/oauth-openshift-558db77b4-4g4p7" Jan 28 20:41:52 crc kubenswrapper[4746]: I0128 20:41:52.990933 4746 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/94cef654-afbe-42c2-8069-5dbcb7294abb-console-config\") pod \"console-f9d7485db-hcxv8\" (UID: \"94cef654-afbe-42c2-8069-5dbcb7294abb\") " pod="openshift-console/console-f9d7485db-hcxv8" Jan 28 20:41:52 crc kubenswrapper[4746]: I0128 20:41:52.991104 4746 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/6479c53a-0e30-4805-bdb8-314a66127b5c-config\") pod \"machine-approver-56656f9798-v7zfx\" (UID: \"6479c53a-0e30-4805-bdb8-314a66127b5c\") " pod="openshift-cluster-machine-approver/machine-approver-56656f9798-v7zfx" Jan 28 20:41:52 crc kubenswrapper[4746]: I0128 20:41:52.991141 4746 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/94cef654-afbe-42c2-8069-5dbcb7294abb-trusted-ca-bundle\") pod \"console-f9d7485db-hcxv8\" (UID: \"94cef654-afbe-42c2-8069-5dbcb7294abb\") " pod="openshift-console/console-f9d7485db-hcxv8" Jan 28 20:41:52 crc kubenswrapper[4746]: I0128 20:41:52.991857 4746 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["hostpath-provisioner/csi-hostpathplugin-89zbv"] Jan 28 20:41:52 crc kubenswrapper[4746]: I0128 20:41:52.992199 4746 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/0f0eb07a-e0b4-4702-89b0-d94e937471a5-config\") pod \"controller-manager-879f6c89f-5d8cs\" (UID: 
\"0f0eb07a-e0b4-4702-89b0-d94e937471a5\") " pod="openshift-controller-manager/controller-manager-879f6c89f-5d8cs" Jan 28 20:41:52 crc kubenswrapper[4746]: I0128 20:41:52.992460 4746 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-system-session\" (UniqueName: \"kubernetes.io/secret/6208130d-52bc-449e-b371-357b1cc21b22-v4-0-config-system-session\") pod \"oauth-openshift-558db77b4-4g4p7\" (UID: \"6208130d-52bc-449e-b371-357b1cc21b22\") " pod="openshift-authentication/oauth-openshift-558db77b4-4g4p7" Jan 28 20:41:52 crc kubenswrapper[4746]: I0128 20:41:52.993507 4746 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="hostpath-provisioner/csi-hostpathplugin-89zbv" Jan 28 20:41:52 crc kubenswrapper[4746]: I0128 20:41:52.993571 4746 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/3e64bb6e-1131-431b-b87c-71e25d294fe1-config\") pod \"machine-api-operator-5694c8668f-lzj8l\" (UID: \"3e64bb6e-1131-431b-b87c-71e25d294fe1\") " pod="openshift-machine-api/machine-api-operator-5694c8668f-lzj8l" Jan 28 20:41:52 crc kubenswrapper[4746]: I0128 20:41:52.995710 4746 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-ingress-canary/ingress-canary-wpwwb"] Jan 28 20:41:52 crc kubenswrapper[4746]: I0128 20:41:52.997672 4746 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-etcd-operator"/"etcd-ca-bundle" Jan 28 20:41:52 crc kubenswrapper[4746]: I0128 20:41:52.997739 4746 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/ac478ffc-e1e4-4b72-bf99-d35c2636a78d-trusted-ca-bundle\") pod \"authentication-operator-69f744f599-8mmbg\" (UID: \"ac478ffc-e1e4-4b72-bf99-d35c2636a78d\") " pod="openshift-authentication-operator/authentication-operator-69f744f599-8mmbg" Jan 28 20:41:52 crc kubenswrapper[4746]: I0128 20:41:52.997926 4746 operation_generator.go:637] 
"MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/0f0eb07a-e0b4-4702-89b0-d94e937471a5-serving-cert\") pod \"controller-manager-879f6c89f-5d8cs\" (UID: \"0f0eb07a-e0b4-4702-89b0-d94e937471a5\") " pod="openshift-controller-manager/controller-manager-879f6c89f-5d8cs" Jan 28 20:41:52 crc kubenswrapper[4746]: I0128 20:41:52.998164 4746 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-user-idp-0-file-data\" (UniqueName: \"kubernetes.io/secret/6208130d-52bc-449e-b371-357b1cc21b22-v4-0-config-user-idp-0-file-data\") pod \"oauth-openshift-558db77b4-4g4p7\" (UID: \"6208130d-52bc-449e-b371-357b1cc21b22\") " pod="openshift-authentication/oauth-openshift-558db77b4-4g4p7" Jan 28 20:41:52 crc kubenswrapper[4746]: I0128 20:41:52.998254 4746 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-service-ca-operator/service-ca-operator-777779d784-62cbt"] Jan 28 20:41:52 crc kubenswrapper[4746]: I0128 20:41:52.998275 4746 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-multus/multus-admission-controller-857f4d67dd-lqlcg"] Jan 28 20:41:52 crc kubenswrapper[4746]: I0128 20:41:52.998305 4746 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/94cef654-afbe-42c2-8069-5dbcb7294abb-console-serving-cert\") pod \"console-f9d7485db-hcxv8\" (UID: \"94cef654-afbe-42c2-8069-5dbcb7294abb\") " pod="openshift-console/console-f9d7485db-hcxv8" Jan 28 20:41:52 crc kubenswrapper[4746]: I0128 20:41:52.998429 4746 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-ingress-canary/ingress-canary-wpwwb" Jan 28 20:41:53 crc kubenswrapper[4746]: I0128 20:41:52.998893 4746 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/447abd89-31fd-4bb6-a965-97d7954f47bb-serving-cert\") pod \"route-controller-manager-6576b87f9c-sjqv2\" (UID: \"447abd89-31fd-4bb6-a965-97d7954f47bb\") " pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-sjqv2" Jan 28 20:41:53 crc kubenswrapper[4746]: I0128 20:41:52.999164 4746 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/0ea55ab8-bec0-44e8-8105-c8c604fc5fa1-config\") pod \"console-operator-58897d9998-4cmsz\" (UID: \"0ea55ab8-bec0-44e8-8105-c8c604fc5fa1\") " pod="openshift-console-operator/console-operator-58897d9998-4cmsz" Jan 28 20:41:53 crc kubenswrapper[4746]: I0128 20:41:52.999769 4746 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-machine-config-operator/machine-config-controller-84d6567774-rbl7q"] Jan 28 20:41:53 crc kubenswrapper[4746]: I0128 20:41:53.000050 4746 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/0ea55ab8-bec0-44e8-8105-c8c604fc5fa1-trusted-ca\") pod \"console-operator-58897d9998-4cmsz\" (UID: \"0ea55ab8-bec0-44e8-8105-c8c604fc5fa1\") " pod="openshift-console-operator/console-operator-58897d9998-4cmsz" Jan 28 20:41:53 crc kubenswrapper[4746]: I0128 20:41:53.001019 4746 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"machine-api-operator-tls\" (UniqueName: \"kubernetes.io/secret/3e64bb6e-1131-431b-b87c-71e25d294fe1-machine-api-operator-tls\") pod \"machine-api-operator-5694c8668f-lzj8l\" (UID: \"3e64bb6e-1131-431b-b87c-71e25d294fe1\") " pod="openshift-machine-api/machine-api-operator-5694c8668f-lzj8l" Jan 28 20:41:53 crc kubenswrapper[4746]: I0128 20:41:53.001037 4746 
kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-562pf"] Jan 28 20:41:53 crc kubenswrapper[4746]: I0128 20:41:53.001056 4746 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/94cef654-afbe-42c2-8069-5dbcb7294abb-console-oauth-config\") pod \"console-f9d7485db-hcxv8\" (UID: \"94cef654-afbe-42c2-8069-5dbcb7294abb\") " pod="openshift-console/console-f9d7485db-hcxv8" Jan 28 20:41:53 crc kubenswrapper[4746]: I0128 20:41:53.001672 4746 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-system-serving-cert\" (UniqueName: \"kubernetes.io/secret/6208130d-52bc-449e-b371-357b1cc21b22-v4-0-config-system-serving-cert\") pod \"oauth-openshift-558db77b4-4g4p7\" (UID: \"6208130d-52bc-449e-b371-357b1cc21b22\") " pod="openshift-authentication/oauth-openshift-558db77b4-4g4p7" Jan 28 20:41:53 crc kubenswrapper[4746]: I0128 20:41:53.001690 4746 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/0ea55ab8-bec0-44e8-8105-c8c604fc5fa1-serving-cert\") pod \"console-operator-58897d9998-4cmsz\" (UID: \"0ea55ab8-bec0-44e8-8105-c8c604fc5fa1\") " pod="openshift-console-operator/console-operator-58897d9998-4cmsz" Jan 28 20:41:53 crc kubenswrapper[4746]: I0128 20:41:53.001772 4746 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"image-registry-operator-tls\" (UniqueName: \"kubernetes.io/secret/8c7789bf-f2f7-4cf4-97f1-3d8a7438daca-image-registry-operator-tls\") pod \"cluster-image-registry-operator-dc59b4c8b-z2vr7\" (UID: \"8c7789bf-f2f7-4cf4-97f1-3d8a7438daca\") " pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-z2vr7" Jan 28 20:41:53 crc kubenswrapper[4746]: I0128 20:41:53.001953 4746 kubelet.go:2428] "SyncLoop UPDATE" source="api" 
pods=["openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-zzrh5"] Jan 28 20:41:53 crc kubenswrapper[4746]: I0128 20:41:53.002199 4746 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/d66e245a-95d4-49c2-9172-2f095fab3e2b-serving-cert\") pod \"openshift-controller-manager-operator-756b6f6bc6-5gjgg\" (UID: \"d66e245a-95d4-49c2-9172-2f095fab3e2b\") " pod="openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-5gjgg" Jan 28 20:41:53 crc kubenswrapper[4746]: I0128 20:41:53.002447 4746 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"samples-operator-tls\" (UniqueName: \"kubernetes.io/secret/e3e01537-7adb-4a81-a9cb-3deb73a1d5b3-samples-operator-tls\") pod \"cluster-samples-operator-665b6dd947-xxnb8\" (UID: \"e3e01537-7adb-4a81-a9cb-3deb73a1d5b3\") " pod="openshift-cluster-samples-operator/cluster-samples-operator-665b6dd947-xxnb8" Jan 28 20:41:53 crc kubenswrapper[4746]: I0128 20:41:53.002719 4746 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/f69da289-e18c-4baa-ad6b-4f2e3a44cda5-serving-cert\") pod \"apiserver-7bbb656c7d-nz5l2\" (UID: \"f69da289-e18c-4baa-ad6b-4f2e3a44cda5\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-nz5l2" Jan 28 20:41:53 crc kubenswrapper[4746]: I0128 20:41:53.002984 4746 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29493870-4jnh6"] Jan 28 20:41:53 crc kubenswrapper[4746]: I0128 20:41:53.004161 4746 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-user-template-login\" (UniqueName: \"kubernetes.io/secret/6208130d-52bc-449e-b371-357b1cc21b22-v4-0-config-user-template-login\") pod \"oauth-openshift-558db77b4-4g4p7\" (UID: \"6208130d-52bc-449e-b371-357b1cc21b22\") " pod="openshift-authentication/oauth-openshift-558db77b4-4g4p7" Jan 
28 20:41:53 crc kubenswrapper[4746]: I0128 20:41:53.004561 4746 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"machine-approver-tls\" (UniqueName: \"kubernetes.io/secret/6479c53a-0e30-4805-bdb8-314a66127b5c-machine-approver-tls\") pod \"machine-approver-56656f9798-v7zfx\" (UID: \"6479c53a-0e30-4805-bdb8-314a66127b5c\") " pod="openshift-cluster-machine-approver/machine-approver-56656f9798-v7zfx" Jan 28 20:41:53 crc kubenswrapper[4746]: I0128 20:41:53.005928 4746 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-27vdq"] Jan 28 20:41:53 crc kubenswrapper[4746]: I0128 20:41:53.008446 4746 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-authentication/oauth-openshift-558db77b4-4g4p7"] Jan 28 20:41:53 crc kubenswrapper[4746]: I0128 20:41:53.009798 4746 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/ac478ffc-e1e4-4b72-bf99-d35c2636a78d-service-ca-bundle\") pod \"authentication-operator-69f744f599-8mmbg\" (UID: \"ac478ffc-e1e4-4b72-bf99-d35c2636a78d\") " pod="openshift-authentication-operator/authentication-operator-69f744f599-8mmbg" Jan 28 20:41:53 crc kubenswrapper[4746]: I0128 20:41:53.010308 4746 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/ac478ffc-e1e4-4b72-bf99-d35c2636a78d-serving-cert\") pod \"authentication-operator-69f744f599-8mmbg\" (UID: \"ac478ffc-e1e4-4b72-bf99-d35c2636a78d\") " pod="openshift-authentication-operator/authentication-operator-69f744f599-8mmbg" Jan 28 20:41:53 crc kubenswrapper[4746]: I0128 20:41:53.010363 4746 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["hostpath-provisioner/csi-hostpathplugin-89zbv"] Jan 28 20:41:53 crc kubenswrapper[4746]: I0128 20:41:53.011883 4746 kubelet.go:2428] "SyncLoop UPDATE" source="api" 
pods=["openshift-machine-config-operator/machine-config-operator-74547568cd-wh6h4"] Jan 28 20:41:53 crc kubenswrapper[4746]: I0128 20:41:53.012977 4746 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/marketplace-operator-79b997595-sqh2q"] Jan 28 20:41:53 crc kubenswrapper[4746]: I0128 20:41:53.014187 4746 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-service-ca/service-ca-9c57cc56f-vfcnn"] Jan 28 20:41:53 crc kubenswrapper[4746]: I0128 20:41:53.015272 4746 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-ingress-canary/ingress-canary-wpwwb"] Jan 28 20:41:53 crc kubenswrapper[4746]: I0128 20:41:53.016744 4746 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-dns/dns-default-6hnhg"] Jan 28 20:41:53 crc kubenswrapper[4746]: I0128 20:41:53.017659 4746 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-etcd-operator"/"etcd-client" Jan 28 20:41:53 crc kubenswrapper[4746]: I0128 20:41:53.017848 4746 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-dns/dns-default-6hnhg" Jan 28 20:41:53 crc kubenswrapper[4746]: I0128 20:41:53.020203 4746 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-dns/dns-default-6hnhg"] Jan 28 20:41:53 crc kubenswrapper[4746]: I0128 20:41:53.035544 4746 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-etcd-operator"/"etcd-operator-serving-cert" Jan 28 20:41:53 crc kubenswrapper[4746]: I0128 20:41:53.055471 4746 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-etcd-operator"/"etcd-service-ca-bundle" Jan 28 20:41:53 crc kubenswrapper[4746]: I0128 20:41:53.073213 4746 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"default-certificate\" (UniqueName: \"kubernetes.io/secret/c068f936-795b-4eb3-83a8-e363131119e9-default-certificate\") pod \"router-default-5444994796-rjmhd\" (UID: \"c068f936-795b-4eb3-83a8-e363131119e9\") " pod="openshift-ingress/router-default-5444994796-rjmhd" Jan 28 20:41:53 crc kubenswrapper[4746]: I0128 20:41:53.073251 4746 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/0cedee54-2c6e-44d4-a51a-b5f8d3ff0833-serving-cert\") pod \"kube-controller-manager-operator-78b949d7b-hlqz9\" (UID: \"0cedee54-2c6e-44d4-a51a-b5f8d3ff0833\") " pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-hlqz9" Jan 28 20:41:53 crc kubenswrapper[4746]: I0128 20:41:53.073274 4746 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"webhook-certs\" (UniqueName: \"kubernetes.io/secret/aa7d6dd6-1d54-400f-a188-628f99083f93-webhook-certs\") pod \"multus-admission-controller-857f4d67dd-lqlcg\" (UID: \"aa7d6dd6-1d54-400f-a188-628f99083f93\") " pod="openshift-multus/multus-admission-controller-857f4d67dd-lqlcg" Jan 28 20:41:53 crc kubenswrapper[4746]: I0128 20:41:53.073294 4746 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/f31f9ca0-e467-452d-90db-a28a4b69496e-config\") pod \"etcd-operator-b45778765-dbwsb\" (UID: \"f31f9ca0-e467-452d-90db-a28a4b69496e\") " pod="openshift-etcd-operator/etcd-operator-b45778765-dbwsb" Jan 28 20:41:53 crc kubenswrapper[4746]: I0128 20:41:53.073314 4746 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/460c9e37-a0c0-43ea-9607-8f716e2e92bf-serving-cert\") pod \"kube-apiserver-operator-766d6c64bb-gbwrh\" (UID: \"460c9e37-a0c0-43ea-9607-8f716e2e92bf\") " pod="openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-gbwrh" Jan 28 20:41:53 crc kubenswrapper[4746]: I0128 20:41:53.073330 4746 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/8b8a9bdb-0612-4627-ba36-98293308c32d-config\") pod \"openshift-kube-scheduler-operator-5fdd9b5758-562pf\" (UID: \"8b8a9bdb-0612-4627-ba36-98293308c32d\") " pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-562pf" Jan 28 20:41:53 crc kubenswrapper[4746]: I0128 20:41:53.073348 4746 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"image-import-ca\" (UniqueName: \"kubernetes.io/configmap/af3e8fcb-fbde-4b65-92ab-3d8b71b2de07-image-import-ca\") pod \"apiserver-76f77b778f-sk7xz\" (UID: \"af3e8fcb-fbde-4b65-92ab-3d8b71b2de07\") " pod="openshift-apiserver/apiserver-76f77b778f-sk7xz" Jan 28 20:41:53 crc kubenswrapper[4746]: I0128 20:41:53.073365 4746 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/460c9e37-a0c0-43ea-9607-8f716e2e92bf-config\") pod \"kube-apiserver-operator-766d6c64bb-gbwrh\" (UID: \"460c9e37-a0c0-43ea-9607-8f716e2e92bf\") " pod="openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-gbwrh" Jan 28 20:41:53 
crc kubenswrapper[4746]: I0128 20:41:53.073513 4746 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-lrbz8\" (UniqueName: \"kubernetes.io/projected/f3cd630b-3fe5-497f-90da-f58bcb7aac8b-kube-api-access-lrbz8\") pod \"ingress-operator-5b745b69d9-h2q6n\" (UID: \"f3cd630b-3fe5-497f-90da-f58bcb7aac8b\") " pod="openshift-ingress-operator/ingress-operator-5b745b69d9-h2q6n" Jan 28 20:41:53 crc kubenswrapper[4746]: I0128 20:41:53.073537 4746 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-vnkbm\" (UniqueName: \"kubernetes.io/projected/72e0847f-0a87-4710-9765-a10282cc0529-kube-api-access-vnkbm\") pod \"control-plane-machine-set-operator-78cbb6b69f-mznpb\" (UID: \"72e0847f-0a87-4710-9765-a10282cc0529\") " pod="openshift-machine-api/control-plane-machine-set-operator-78cbb6b69f-mznpb" Jan 28 20:41:53 crc kubenswrapper[4746]: I0128 20:41:53.073573 4746 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/18ad1fee-6a3a-4b98-83c5-6a13f22699db-serving-cert\") pod \"openshift-config-operator-7777fb866f-rktng\" (UID: \"18ad1fee-6a3a-4b98-83c5-6a13f22699db\") " pod="openshift-config-operator/openshift-config-operator-7777fb866f-rktng" Jan 28 20:41:53 crc kubenswrapper[4746]: I0128 20:41:53.073593 4746 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/af3e8fcb-fbde-4b65-92ab-3d8b71b2de07-audit-dir\") pod \"apiserver-76f77b778f-sk7xz\" (UID: \"af3e8fcb-fbde-4b65-92ab-3d8b71b2de07\") " pod="openshift-apiserver/apiserver-76f77b778f-sk7xz" Jan 28 20:41:53 crc kubenswrapper[4746]: I0128 20:41:53.073743 4746 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/af3e8fcb-fbde-4b65-92ab-3d8b71b2de07-audit-dir\") pod \"apiserver-76f77b778f-sk7xz\" (UID: 
\"af3e8fcb-fbde-4b65-92ab-3d8b71b2de07\") " pod="openshift-apiserver/apiserver-76f77b778f-sk7xz" Jan 28 20:41:53 crc kubenswrapper[4746]: I0128 20:41:53.073978 4746 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/af3e8fcb-fbde-4b65-92ab-3d8b71b2de07-serving-cert\") pod \"apiserver-76f77b778f-sk7xz\" (UID: \"af3e8fcb-fbde-4b65-92ab-3d8b71b2de07\") " pod="openshift-apiserver/apiserver-76f77b778f-sk7xz" Jan 28 20:41:53 crc kubenswrapper[4746]: I0128 20:41:53.074006 4746 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-slszm\" (UniqueName: \"kubernetes.io/projected/c068f936-795b-4eb3-83a8-e363131119e9-kube-api-access-slszm\") pod \"router-default-5444994796-rjmhd\" (UID: \"c068f936-795b-4eb3-83a8-e363131119e9\") " pod="openshift-ingress/router-default-5444994796-rjmhd" Jan 28 20:41:53 crc kubenswrapper[4746]: I0128 20:41:53.074038 4746 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etcd-service-ca\" (UniqueName: \"kubernetes.io/configmap/f31f9ca0-e467-452d-90db-a28a4b69496e-etcd-service-ca\") pod \"etcd-operator-b45778765-dbwsb\" (UID: \"f31f9ca0-e467-452d-90db-a28a4b69496e\") " pod="openshift-etcd-operator/etcd-operator-b45778765-dbwsb" Jan 28 20:41:53 crc kubenswrapper[4746]: I0128 20:41:53.074048 4746 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/8b8a9bdb-0612-4627-ba36-98293308c32d-config\") pod \"openshift-kube-scheduler-operator-5fdd9b5758-562pf\" (UID: \"8b8a9bdb-0612-4627-ba36-98293308c32d\") " pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-562pf" Jan 28 20:41:53 crc kubenswrapper[4746]: I0128 20:41:53.074055 4746 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/0cedee54-2c6e-44d4-a51a-b5f8d3ff0833-config\") pod 
\"kube-controller-manager-operator-78b949d7b-hlqz9\" (UID: \"0cedee54-2c6e-44d4-a51a-b5f8d3ff0833\") " pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-hlqz9" Jan 28 20:41:53 crc kubenswrapper[4746]: I0128 20:41:53.074158 4746 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"node-pullsecrets\" (UniqueName: \"kubernetes.io/host-path/af3e8fcb-fbde-4b65-92ab-3d8b71b2de07-node-pullsecrets\") pod \"apiserver-76f77b778f-sk7xz\" (UID: \"af3e8fcb-fbde-4b65-92ab-3d8b71b2de07\") " pod="openshift-apiserver/apiserver-76f77b778f-sk7xz" Jan 28 20:41:53 crc kubenswrapper[4746]: I0128 20:41:53.074184 4746 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"audit\" (UniqueName: \"kubernetes.io/configmap/af3e8fcb-fbde-4b65-92ab-3d8b71b2de07-audit\") pod \"apiserver-76f77b778f-sk7xz\" (UID: \"af3e8fcb-fbde-4b65-92ab-3d8b71b2de07\") " pod="openshift-apiserver/apiserver-76f77b778f-sk7xz" Jan 28 20:41:53 crc kubenswrapper[4746]: I0128 20:41:53.074222 4746 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-hjlmv\" (UniqueName: \"kubernetes.io/projected/aa7d6dd6-1d54-400f-a188-628f99083f93-kube-api-access-hjlmv\") pod \"multus-admission-controller-857f4d67dd-lqlcg\" (UID: \"aa7d6dd6-1d54-400f-a188-628f99083f93\") " pod="openshift-multus/multus-admission-controller-857f4d67dd-lqlcg" Jan 28 20:41:53 crc kubenswrapper[4746]: I0128 20:41:53.074245 4746 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"tmpfs\" (UniqueName: \"kubernetes.io/empty-dir/aa555a55-a0b0-47e3-959f-e2d8d387aae2-tmpfs\") pod \"packageserver-d55dfcdfc-8vvmj\" (UID: \"aa555a55-a0b0-47e3-959f-e2d8d387aae2\") " pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-8vvmj" Jan 28 20:41:53 crc kubenswrapper[4746]: I0128 20:41:53.074255 4746 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"node-pullsecrets\" (UniqueName: 
\"kubernetes.io/host-path/af3e8fcb-fbde-4b65-92ab-3d8b71b2de07-node-pullsecrets\") pod \"apiserver-76f77b778f-sk7xz\" (UID: \"af3e8fcb-fbde-4b65-92ab-3d8b71b2de07\") " pod="openshift-apiserver/apiserver-76f77b778f-sk7xz" Jan 28 20:41:53 crc kubenswrapper[4746]: I0128 20:41:53.074359 4746 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"available-featuregates\" (UniqueName: \"kubernetes.io/empty-dir/18ad1fee-6a3a-4b98-83c5-6a13f22699db-available-featuregates\") pod \"openshift-config-operator-7777fb866f-rktng\" (UID: \"18ad1fee-6a3a-4b98-83c5-6a13f22699db\") " pod="openshift-config-operator/openshift-config-operator-7777fb866f-rktng" Jan 28 20:41:53 crc kubenswrapper[4746]: I0128 20:41:53.074985 4746 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"tmpfs\" (UniqueName: \"kubernetes.io/empty-dir/aa555a55-a0b0-47e3-959f-e2d8d387aae2-tmpfs\") pod \"packageserver-d55dfcdfc-8vvmj\" (UID: \"aa555a55-a0b0-47e3-959f-e2d8d387aae2\") " pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-8vvmj" Jan 28 20:41:53 crc kubenswrapper[4746]: I0128 20:41:53.075356 4746 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etcd-service-ca\" (UniqueName: \"kubernetes.io/configmap/f31f9ca0-e467-452d-90db-a28a4b69496e-etcd-service-ca\") pod \"etcd-operator-b45778765-dbwsb\" (UID: \"f31f9ca0-e467-452d-90db-a28a4b69496e\") " pod="openshift-etcd-operator/etcd-operator-b45778765-dbwsb" Jan 28 20:41:53 crc kubenswrapper[4746]: I0128 20:41:53.075627 4746 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"available-featuregates\" (UniqueName: \"kubernetes.io/empty-dir/18ad1fee-6a3a-4b98-83c5-6a13f22699db-available-featuregates\") pod \"openshift-config-operator-7777fb866f-rktng\" (UID: \"18ad1fee-6a3a-4b98-83c5-6a13f22699db\") " pod="openshift-config-operator/openshift-config-operator-7777fb866f-rktng" Jan 28 20:41:53 crc kubenswrapper[4746]: I0128 20:41:53.076070 4746 reflector.go:368] Caches 
populated for *v1.ConfigMap from object-"openshift-etcd-operator"/"kube-root-ca.crt" Jan 28 20:41:53 crc kubenswrapper[4746]: I0128 20:41:53.076216 4746 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/460c9e37-a0c0-43ea-9607-8f716e2e92bf-kube-api-access\") pod \"kube-apiserver-operator-766d6c64bb-gbwrh\" (UID: \"460c9e37-a0c0-43ea-9607-8f716e2e92bf\") " pod="openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-gbwrh" Jan 28 20:41:53 crc kubenswrapper[4746]: I0128 20:41:53.076257 4746 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"profile-collector-cert\" (UniqueName: \"kubernetes.io/secret/4886a33e-8379-47d5-ac13-e58bc623d01c-profile-collector-cert\") pod \"catalog-operator-68c6474976-wb74c\" (UID: \"4886a33e-8379-47d5-ac13-e58bc623d01c\") " pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-wb74c" Jan 28 20:41:53 crc kubenswrapper[4746]: I0128 20:41:53.076285 4746 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/8b8a9bdb-0612-4627-ba36-98293308c32d-serving-cert\") pod \"openshift-kube-scheduler-operator-5fdd9b5758-562pf\" (UID: \"8b8a9bdb-0612-4627-ba36-98293308c32d\") " pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-562pf" Jan 28 20:41:53 crc kubenswrapper[4746]: I0128 20:41:53.076336 4746 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/f3cd630b-3fe5-497f-90da-f58bcb7aac8b-trusted-ca\") pod \"ingress-operator-5b745b69d9-h2q6n\" (UID: \"f3cd630b-3fe5-497f-90da-f58bcb7aac8b\") " pod="openshift-ingress-operator/ingress-operator-5b745b69d9-h2q6n" Jan 28 20:41:53 crc kubenswrapper[4746]: I0128 20:41:53.076366 4746 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"trusted-ca-bundle\" (UniqueName: 
\"kubernetes.io/configmap/af3e8fcb-fbde-4b65-92ab-3d8b71b2de07-trusted-ca-bundle\") pod \"apiserver-76f77b778f-sk7xz\" (UID: \"af3e8fcb-fbde-4b65-92ab-3d8b71b2de07\") " pod="openshift-apiserver/apiserver-76f77b778f-sk7xz" Jan 28 20:41:53 crc kubenswrapper[4746]: I0128 20:41:53.076398 4746 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/c15d5c5e-19d5-43c6-8955-1bb2ad8b33b6-secret-volume\") pod \"collect-profiles-29493870-4jnh6\" (UID: \"c15d5c5e-19d5-43c6-8955-1bb2ad8b33b6\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29493870-4jnh6" Jan 28 20:41:53 crc kubenswrapper[4746]: I0128 20:41:53.076425 4746 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/f3cd630b-3fe5-497f-90da-f58bcb7aac8b-metrics-tls\") pod \"ingress-operator-5b745b69d9-h2q6n\" (UID: \"f3cd630b-3fe5-497f-90da-f58bcb7aac8b\") " pod="openshift-ingress-operator/ingress-operator-5b745b69d9-h2q6n" Jan 28 20:41:53 crc kubenswrapper[4746]: I0128 20:41:53.076471 4746 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/aa555a55-a0b0-47e3-959f-e2d8d387aae2-webhook-cert\") pod \"packageserver-d55dfcdfc-8vvmj\" (UID: \"aa555a55-a0b0-47e3-959f-e2d8d387aae2\") " pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-8vvmj" Jan 28 20:41:53 crc kubenswrapper[4746]: I0128 20:41:53.076494 4746 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-6tgv9\" (UniqueName: \"kubernetes.io/projected/4886a33e-8379-47d5-ac13-e58bc623d01c-kube-api-access-6tgv9\") pod \"catalog-operator-68c6474976-wb74c\" (UID: \"4886a33e-8379-47d5-ac13-e58bc623d01c\") " pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-wb74c" Jan 28 20:41:53 crc kubenswrapper[4746]: I0128 20:41:53.076518 4746 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"kube-api-access-5jq78\" (UniqueName: \"kubernetes.io/projected/c29355d7-4e9a-4e5a-838e-77a4df7c2fda-kube-api-access-5jq78\") pod \"dns-operator-744455d44c-q8fsc\" (UID: \"c29355d7-4e9a-4e5a-838e-77a4df7c2fda\") " pod="openshift-dns-operator/dns-operator-744455d44c-q8fsc" Jan 28 20:41:53 crc kubenswrapper[4746]: I0128 20:41:53.076829 4746 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/c068f936-795b-4eb3-83a8-e363131119e9-metrics-certs\") pod \"router-default-5444994796-rjmhd\" (UID: \"c068f936-795b-4eb3-83a8-e363131119e9\") " pod="openshift-ingress/router-default-5444994796-rjmhd" Jan 28 20:41:53 crc kubenswrapper[4746]: I0128 20:41:53.076897 4746 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-lfhc8\" (UniqueName: \"kubernetes.io/projected/f31f9ca0-e467-452d-90db-a28a4b69496e-kube-api-access-lfhc8\") pod \"etcd-operator-b45778765-dbwsb\" (UID: \"f31f9ca0-e467-452d-90db-a28a4b69496e\") " pod="openshift-etcd-operator/etcd-operator-b45778765-dbwsb" Jan 28 20:41:53 crc kubenswrapper[4746]: I0128 20:41:53.076922 4746 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etcd-serving-ca\" (UniqueName: \"kubernetes.io/configmap/af3e8fcb-fbde-4b65-92ab-3d8b71b2de07-etcd-serving-ca\") pod \"apiserver-76f77b778f-sk7xz\" (UID: \"af3e8fcb-fbde-4b65-92ab-3d8b71b2de07\") " pod="openshift-apiserver/apiserver-76f77b778f-sk7xz" Jan 28 20:41:53 crc kubenswrapper[4746]: I0128 20:41:53.077028 4746 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-9kglc\" (UniqueName: \"kubernetes.io/projected/c15d5c5e-19d5-43c6-8955-1bb2ad8b33b6-kube-api-access-9kglc\") pod \"collect-profiles-29493870-4jnh6\" (UID: \"c15d5c5e-19d5-43c6-8955-1bb2ad8b33b6\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29493870-4jnh6" Jan 28 20:41:53 crc 
kubenswrapper[4746]: I0128 20:41:53.077108 4746 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/c29355d7-4e9a-4e5a-838e-77a4df7c2fda-metrics-tls\") pod \"dns-operator-744455d44c-q8fsc\" (UID: \"c29355d7-4e9a-4e5a-838e-77a4df7c2fda\") " pod="openshift-dns-operator/dns-operator-744455d44c-q8fsc" Jan 28 20:41:53 crc kubenswrapper[4746]: I0128 20:41:53.077126 4746 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-pdbcf\" (UniqueName: \"kubernetes.io/projected/aa555a55-a0b0-47e3-959f-e2d8d387aae2-kube-api-access-pdbcf\") pod \"packageserver-d55dfcdfc-8vvmj\" (UID: \"aa555a55-a0b0-47e3-959f-e2d8d387aae2\") " pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-8vvmj" Jan 28 20:41:53 crc kubenswrapper[4746]: I0128 20:41:53.078956 4746 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/18ad1fee-6a3a-4b98-83c5-6a13f22699db-serving-cert\") pod \"openshift-config-operator-7777fb866f-rktng\" (UID: \"18ad1fee-6a3a-4b98-83c5-6a13f22699db\") " pod="openshift-config-operator/openshift-config-operator-7777fb866f-rktng" Jan 28 20:41:53 crc kubenswrapper[4746]: I0128 20:41:53.081861 4746 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/c29355d7-4e9a-4e5a-838e-77a4df7c2fda-metrics-tls\") pod \"dns-operator-744455d44c-q8fsc\" (UID: \"c29355d7-4e9a-4e5a-838e-77a4df7c2fda\") " pod="openshift-dns-operator/dns-operator-744455d44c-q8fsc" Jan 28 20:41:53 crc kubenswrapper[4746]: I0128 20:41:53.083927 4746 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/af3e8fcb-fbde-4b65-92ab-3d8b71b2de07-config\") pod \"apiserver-76f77b778f-sk7xz\" (UID: \"af3e8fcb-fbde-4b65-92ab-3d8b71b2de07\") " pod="openshift-apiserver/apiserver-76f77b778f-sk7xz" Jan 28 
20:41:53 crc kubenswrapper[4746]: I0128 20:41:53.083986 4746 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-xdpbf\" (UniqueName: \"kubernetes.io/projected/18ad1fee-6a3a-4b98-83c5-6a13f22699db-kube-api-access-xdpbf\") pod \"openshift-config-operator-7777fb866f-rktng\" (UID: \"18ad1fee-6a3a-4b98-83c5-6a13f22699db\") " pod="openshift-config-operator/openshift-config-operator-7777fb866f-rktng" Jan 28 20:41:53 crc kubenswrapper[4746]: I0128 20:41:53.084017 4746 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/f31f9ca0-e467-452d-90db-a28a4b69496e-serving-cert\") pod \"etcd-operator-b45778765-dbwsb\" (UID: \"f31f9ca0-e467-452d-90db-a28a4b69496e\") " pod="openshift-etcd-operator/etcd-operator-b45778765-dbwsb" Jan 28 20:41:53 crc kubenswrapper[4746]: I0128 20:41:53.084064 4746 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/f3cd630b-3fe5-497f-90da-f58bcb7aac8b-bound-sa-token\") pod \"ingress-operator-5b745b69d9-h2q6n\" (UID: \"f3cd630b-3fe5-497f-90da-f58bcb7aac8b\") " pod="openshift-ingress-operator/ingress-operator-5b745b69d9-h2q6n" Jan 28 20:41:53 crc kubenswrapper[4746]: I0128 20:41:53.084101 4746 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"encryption-config\" (UniqueName: \"kubernetes.io/secret/af3e8fcb-fbde-4b65-92ab-3d8b71b2de07-encryption-config\") pod \"apiserver-76f77b778f-sk7xz\" (UID: \"af3e8fcb-fbde-4b65-92ab-3d8b71b2de07\") " pod="openshift-apiserver/apiserver-76f77b778f-sk7xz" Jan 28 20:41:53 crc kubenswrapper[4746]: I0128 20:41:53.084126 4746 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-gr849\" (UniqueName: \"kubernetes.io/projected/71db2fbb-6c1a-417f-9cc4-a1ae041d0c0a-kube-api-access-gr849\") pod \"package-server-manager-789f6589d5-zzrh5\" (UID: 
\"71db2fbb-6c1a-417f-9cc4-a1ae041d0c0a\") " pod="openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-zzrh5" Jan 28 20:41:53 crc kubenswrapper[4746]: I0128 20:41:53.084156 4746 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-g4tp6\" (UniqueName: \"kubernetes.io/projected/4e03d657-3b57-4eea-bb77-f5fe3a519cac-kube-api-access-g4tp6\") pod \"migrator-59844c95c7-s8b4b\" (UID: \"4e03d657-3b57-4eea-bb77-f5fe3a519cac\") " pod="openshift-kube-storage-version-migrator/migrator-59844c95c7-s8b4b" Jan 28 20:41:53 crc kubenswrapper[4746]: I0128 20:41:53.084178 4746 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/c068f936-795b-4eb3-83a8-e363131119e9-service-ca-bundle\") pod \"router-default-5444994796-rjmhd\" (UID: \"c068f936-795b-4eb3-83a8-e363131119e9\") " pod="openshift-ingress/router-default-5444994796-rjmhd" Jan 28 20:41:53 crc kubenswrapper[4746]: I0128 20:41:53.084204 4746 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/af3e8fcb-fbde-4b65-92ab-3d8b71b2de07-etcd-client\") pod \"apiserver-76f77b778f-sk7xz\" (UID: \"af3e8fcb-fbde-4b65-92ab-3d8b71b2de07\") " pod="openshift-apiserver/apiserver-76f77b778f-sk7xz" Jan 28 20:41:53 crc kubenswrapper[4746]: I0128 20:41:53.084232 4746 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/f31f9ca0-e467-452d-90db-a28a4b69496e-etcd-client\") pod \"etcd-operator-b45778765-dbwsb\" (UID: \"f31f9ca0-e467-452d-90db-a28a4b69496e\") " pod="openshift-etcd-operator/etcd-operator-b45778765-dbwsb" Jan 28 20:41:53 crc kubenswrapper[4746]: I0128 20:41:53.084257 4746 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"package-server-manager-serving-cert\" (UniqueName: 
\"kubernetes.io/secret/71db2fbb-6c1a-417f-9cc4-a1ae041d0c0a-package-server-manager-serving-cert\") pod \"package-server-manager-789f6589d5-zzrh5\" (UID: \"71db2fbb-6c1a-417f-9cc4-a1ae041d0c0a\") " pod="openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-zzrh5" Jan 28 20:41:53 crc kubenswrapper[4746]: I0128 20:41:53.084317 4746 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/8b8a9bdb-0612-4627-ba36-98293308c32d-kube-api-access\") pod \"openshift-kube-scheduler-operator-5fdd9b5758-562pf\" (UID: \"8b8a9bdb-0612-4627-ba36-98293308c32d\") " pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-562pf" Jan 28 20:41:53 crc kubenswrapper[4746]: I0128 20:41:53.084336 4746 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etcd-ca\" (UniqueName: \"kubernetes.io/configmap/f31f9ca0-e467-452d-90db-a28a4b69496e-etcd-ca\") pod \"etcd-operator-b45778765-dbwsb\" (UID: \"f31f9ca0-e467-452d-90db-a28a4b69496e\") " pod="openshift-etcd-operator/etcd-operator-b45778765-dbwsb" Jan 28 20:41:53 crc kubenswrapper[4746]: I0128 20:41:53.084384 4746 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"stats-auth\" (UniqueName: \"kubernetes.io/secret/c068f936-795b-4eb3-83a8-e363131119e9-stats-auth\") pod \"router-default-5444994796-rjmhd\" (UID: \"c068f936-795b-4eb3-83a8-e363131119e9\") " pod="openshift-ingress/router-default-5444994796-rjmhd" Jan 28 20:41:53 crc kubenswrapper[4746]: I0128 20:41:53.084410 4746 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/c15d5c5e-19d5-43c6-8955-1bb2ad8b33b6-config-volume\") pod \"collect-profiles-29493870-4jnh6\" (UID: \"c15d5c5e-19d5-43c6-8955-1bb2ad8b33b6\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29493870-4jnh6" Jan 28 20:41:53 crc kubenswrapper[4746]: I0128 
20:41:53.084435 4746 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"srv-cert\" (UniqueName: \"kubernetes.io/secret/4886a33e-8379-47d5-ac13-e58bc623d01c-srv-cert\") pod \"catalog-operator-68c6474976-wb74c\" (UID: \"4886a33e-8379-47d5-ac13-e58bc623d01c\") " pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-wb74c" Jan 28 20:41:53 crc kubenswrapper[4746]: I0128 20:41:53.084469 4746 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/0cedee54-2c6e-44d4-a51a-b5f8d3ff0833-kube-api-access\") pod \"kube-controller-manager-operator-78b949d7b-hlqz9\" (UID: \"0cedee54-2c6e-44d4-a51a-b5f8d3ff0833\") " pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-hlqz9" Jan 28 20:41:53 crc kubenswrapper[4746]: I0128 20:41:53.084493 4746 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/aa555a55-a0b0-47e3-959f-e2d8d387aae2-apiservice-cert\") pod \"packageserver-d55dfcdfc-8vvmj\" (UID: \"aa555a55-a0b0-47e3-959f-e2d8d387aae2\") " pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-8vvmj" Jan 28 20:41:53 crc kubenswrapper[4746]: I0128 20:41:53.084516 4746 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-49tzx\" (UniqueName: \"kubernetes.io/projected/af3e8fcb-fbde-4b65-92ab-3d8b71b2de07-kube-api-access-49tzx\") pod \"apiserver-76f77b778f-sk7xz\" (UID: \"af3e8fcb-fbde-4b65-92ab-3d8b71b2de07\") " pod="openshift-apiserver/apiserver-76f77b778f-sk7xz" Jan 28 20:41:53 crc kubenswrapper[4746]: I0128 20:41:53.084537 4746 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"control-plane-machine-set-operator-tls\" (UniqueName: \"kubernetes.io/secret/72e0847f-0a87-4710-9765-a10282cc0529-control-plane-machine-set-operator-tls\") pod 
\"control-plane-machine-set-operator-78cbb6b69f-mznpb\" (UID: \"72e0847f-0a87-4710-9765-a10282cc0529\") " pod="openshift-machine-api/control-plane-machine-set-operator-78cbb6b69f-mznpb"
Jan 28 20:41:53 crc kubenswrapper[4746]: I0128 20:41:53.085267 4746 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"webhook-certs\" (UniqueName: \"kubernetes.io/secret/aa7d6dd6-1d54-400f-a188-628f99083f93-webhook-certs\") pod \"multus-admission-controller-857f4d67dd-lqlcg\" (UID: \"aa7d6dd6-1d54-400f-a188-628f99083f93\") " pod="openshift-multus/multus-admission-controller-857f4d67dd-lqlcg"
Jan 28 20:41:53 crc kubenswrapper[4746]: I0128 20:41:53.085877 4746 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etcd-ca\" (UniqueName: \"kubernetes.io/configmap/f31f9ca0-e467-452d-90db-a28a4b69496e-etcd-ca\") pod \"etcd-operator-b45778765-dbwsb\" (UID: \"f31f9ca0-e467-452d-90db-a28a4b69496e\") " pod="openshift-etcd-operator/etcd-operator-b45778765-dbwsb"
Jan 28 20:41:53 crc kubenswrapper[4746]: I0128 20:41:53.087291 4746 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/8b8a9bdb-0612-4627-ba36-98293308c32d-serving-cert\") pod \"openshift-kube-scheduler-operator-5fdd9b5758-562pf\" (UID: \"8b8a9bdb-0612-4627-ba36-98293308c32d\") " pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-562pf"
Jan 28 20:41:53 crc kubenswrapper[4746]: I0128 20:41:53.087638 4746 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/f31f9ca0-e467-452d-90db-a28a4b69496e-etcd-client\") pod \"etcd-operator-b45778765-dbwsb\" (UID: \"f31f9ca0-e467-452d-90db-a28a4b69496e\") " pod="openshift-etcd-operator/etcd-operator-b45778765-dbwsb"
Jan 28 20:41:53 crc kubenswrapper[4746]: I0128 20:41:53.088395 4746 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/f31f9ca0-e467-452d-90db-a28a4b69496e-serving-cert\") pod \"etcd-operator-b45778765-dbwsb\" (UID: \"f31f9ca0-e467-452d-90db-a28a4b69496e\") " pod="openshift-etcd-operator/etcd-operator-b45778765-dbwsb"
Jan 28 20:41:53 crc kubenswrapper[4746]: I0128 20:41:53.096398 4746 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-api"/"control-plane-machine-set-operator-dockercfg-k9rxt"
Jan 28 20:41:53 crc kubenswrapper[4746]: I0128 20:41:53.115734 4746 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-api"/"control-plane-machine-set-operator-tls"
Jan 28 20:41:53 crc kubenswrapper[4746]: I0128 20:41:53.128876 4746 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"control-plane-machine-set-operator-tls\" (UniqueName: \"kubernetes.io/secret/72e0847f-0a87-4710-9765-a10282cc0529-control-plane-machine-set-operator-tls\") pod \"control-plane-machine-set-operator-78cbb6b69f-mznpb\" (UID: \"72e0847f-0a87-4710-9765-a10282cc0529\") " pod="openshift-machine-api/control-plane-machine-set-operator-78cbb6b69f-mznpb"
Jan 28 20:41:53 crc kubenswrapper[4746]: I0128 20:41:53.135483 4746 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-apiserver-operator"/"kube-apiserver-operator-config"
Jan 28 20:41:53 crc kubenswrapper[4746]: I0128 20:41:53.156285 4746 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-apiserver-operator"/"kube-apiserver-operator-dockercfg-x57mr"
Jan 28 20:41:53 crc kubenswrapper[4746]: I0128 20:41:53.176465 4746 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-apiserver-operator"/"kube-apiserver-operator-serving-cert"
Jan 28 20:41:53 crc kubenswrapper[4746]: I0128 20:41:53.180295 4746 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/460c9e37-a0c0-43ea-9607-8f716e2e92bf-config\") pod \"kube-apiserver-operator-766d6c64bb-gbwrh\" (UID: \"460c9e37-a0c0-43ea-9607-8f716e2e92bf\") " pod="openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-gbwrh"
Jan 28 20:41:53 crc kubenswrapper[4746]: I0128 20:41:53.188352 4746 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/460c9e37-a0c0-43ea-9607-8f716e2e92bf-serving-cert\") pod \"kube-apiserver-operator-766d6c64bb-gbwrh\" (UID: \"460c9e37-a0c0-43ea-9607-8f716e2e92bf\") " pod="openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-gbwrh"
Jan 28 20:41:53 crc kubenswrapper[4746]: I0128 20:41:53.196254 4746 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-apiserver-operator"/"kube-root-ca.crt"
Jan 28 20:41:53 crc kubenswrapper[4746]: I0128 20:41:53.216241 4746 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-storage-version-migrator-operator"/"openshift-service-ca.crt"
Jan 28 20:41:53 crc kubenswrapper[4746]: I0128 20:41:53.235688 4746 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-storage-version-migrator-operator"/"serving-cert"
Jan 28 20:41:53 crc kubenswrapper[4746]: I0128 20:41:53.256274 4746 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-storage-version-migrator-operator"/"kube-storage-version-migrator-operator-dockercfg-2bh8d"
Jan 28 20:41:53 crc kubenswrapper[4746]: I0128 20:41:53.275849 4746 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-storage-version-migrator-operator"/"config"
Jan 28 20:41:53 crc kubenswrapper[4746]: I0128 20:41:53.296010 4746 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-storage-version-migrator-operator"/"kube-root-ca.crt"
Jan 28 20:41:53 crc kubenswrapper[4746]: I0128 20:41:53.316254 4746 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ingress-operator"/"openshift-service-ca.crt"
Jan 28 20:41:53 crc kubenswrapper[4746]: I0128 20:41:53.337212 4746 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ingress-operator"/"ingress-operator-dockercfg-7lnqk"
Jan 28 20:41:53 crc kubenswrapper[4746]: I0128 20:41:53.356951 4746 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ingress-operator"/"metrics-tls"
Jan 28 20:41:53 crc kubenswrapper[4746]: I0128 20:41:53.362636 4746 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/f3cd630b-3fe5-497f-90da-f58bcb7aac8b-metrics-tls\") pod \"ingress-operator-5b745b69d9-h2q6n\" (UID: \"f3cd630b-3fe5-497f-90da-f58bcb7aac8b\") " pod="openshift-ingress-operator/ingress-operator-5b745b69d9-h2q6n"
Jan 28 20:41:53 crc kubenswrapper[4746]: I0128 20:41:53.390010 4746 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ingress-operator"/"trusted-ca"
Jan 28 20:41:53 crc kubenswrapper[4746]: I0128 20:41:53.396165 4746 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ingress-operator"/"kube-root-ca.crt"
Jan 28 20:41:53 crc kubenswrapper[4746]: I0128 20:41:53.399485 4746 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/f3cd630b-3fe5-497f-90da-f58bcb7aac8b-trusted-ca\") pod \"ingress-operator-5b745b69d9-h2q6n\" (UID: \"f3cd630b-3fe5-497f-90da-f58bcb7aac8b\") " pod="openshift-ingress-operator/ingress-operator-5b745b69d9-h2q6n"
Jan 28 20:41:53 crc kubenswrapper[4746]: I0128 20:41:53.416417 4746 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-operator-lifecycle-manager"/"openshift-service-ca.crt"
Jan 28 20:41:53 crc kubenswrapper[4746]: I0128 20:41:53.436333 4746 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"olm-operator-serviceaccount-dockercfg-rq7zk"
Jan 28 20:41:53 crc kubenswrapper[4746]: I0128 20:41:53.456556 4746 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"catalog-operator-serving-cert"
Jan 28 20:41:53 crc kubenswrapper[4746]: I0128 20:41:53.470773 4746 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"srv-cert\" (UniqueName: \"kubernetes.io/secret/4886a33e-8379-47d5-ac13-e58bc623d01c-srv-cert\") pod \"catalog-operator-68c6474976-wb74c\" (UID: \"4886a33e-8379-47d5-ac13-e58bc623d01c\") " pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-wb74c"
Jan 28 20:41:53 crc kubenswrapper[4746]: I0128 20:41:53.477587 4746 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"pprof-cert"
Jan 28 20:41:53 crc kubenswrapper[4746]: I0128 20:41:53.483060 4746 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/c15d5c5e-19d5-43c6-8955-1bb2ad8b33b6-secret-volume\") pod \"collect-profiles-29493870-4jnh6\" (UID: \"c15d5c5e-19d5-43c6-8955-1bb2ad8b33b6\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29493870-4jnh6"
Jan 28 20:41:53 crc kubenswrapper[4746]: I0128 20:41:53.493331 4746 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"profile-collector-cert\" (UniqueName: \"kubernetes.io/secret/4886a33e-8379-47d5-ac13-e58bc623d01c-profile-collector-cert\") pod \"catalog-operator-68c6474976-wb74c\" (UID: \"4886a33e-8379-47d5-ac13-e58bc623d01c\") " pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-wb74c"
Jan 28 20:41:53 crc kubenswrapper[4746]: I0128 20:41:53.496304 4746 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-operator-lifecycle-manager"/"kube-root-ca.crt"
Jan 28 20:41:53 crc kubenswrapper[4746]: I0128 20:41:53.516046 4746 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-etcd-operator"/"etcd-operator-config"
Jan 28 20:41:53 crc kubenswrapper[4746]: I0128 20:41:53.525225 4746 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/f31f9ca0-e467-452d-90db-a28a4b69496e-config\") pod \"etcd-operator-b45778765-dbwsb\" (UID: \"f31f9ca0-e467-452d-90db-a28a4b69496e\") " pod="openshift-etcd-operator/etcd-operator-b45778765-dbwsb"
Jan 28 20:41:53 crc kubenswrapper[4746]: I0128 20:41:53.536964 4746 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"packageserver-service-cert"
Jan 28 20:41:53 crc kubenswrapper[4746]: I0128 20:41:53.543585 4746 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/aa555a55-a0b0-47e3-959f-e2d8d387aae2-webhook-cert\") pod \"packageserver-d55dfcdfc-8vvmj\" (UID: \"aa555a55-a0b0-47e3-959f-e2d8d387aae2\") " pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-8vvmj"
Jan 28 20:41:53 crc kubenswrapper[4746]: I0128 20:41:53.553412 4746 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/aa555a55-a0b0-47e3-959f-e2d8d387aae2-apiservice-cert\") pod \"packageserver-d55dfcdfc-8vvmj\" (UID: \"aa555a55-a0b0-47e3-959f-e2d8d387aae2\") " pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-8vvmj"
Jan 28 20:41:53 crc kubenswrapper[4746]: I0128 20:41:53.557209 4746 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ingress"/"service-ca-bundle"
Jan 28 20:41:53 crc kubenswrapper[4746]: I0128 20:41:53.567040 4746 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/c068f936-795b-4eb3-83a8-e363131119e9-service-ca-bundle\") pod \"router-default-5444994796-rjmhd\" (UID: \"c068f936-795b-4eb3-83a8-e363131119e9\") " pod="openshift-ingress/router-default-5444994796-rjmhd"
Jan 28 20:41:53 crc kubenswrapper[4746]: I0128 20:41:53.575616 4746 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ingress"/"router-stats-default"
Jan 28 20:41:53 crc kubenswrapper[4746]: I0128 20:41:53.591709 4746 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"stats-auth\" (UniqueName: \"kubernetes.io/secret/c068f936-795b-4eb3-83a8-e363131119e9-stats-auth\") pod \"router-default-5444994796-rjmhd\" (UID: \"c068f936-795b-4eb3-83a8-e363131119e9\") " pod="openshift-ingress/router-default-5444994796-rjmhd"
Jan 28 20:41:53 crc kubenswrapper[4746]: I0128 20:41:53.595505 4746 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ingress"/"router-metrics-certs-default"
Jan 28 20:41:53 crc kubenswrapper[4746]: I0128 20:41:53.603765 4746 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/c068f936-795b-4eb3-83a8-e363131119e9-metrics-certs\") pod \"router-default-5444994796-rjmhd\" (UID: \"c068f936-795b-4eb3-83a8-e363131119e9\") " pod="openshift-ingress/router-default-5444994796-rjmhd"
Jan 28 20:41:53 crc kubenswrapper[4746]: I0128 20:41:53.616159 4746 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ingress"/"router-dockercfg-zdk86"
Jan 28 20:41:53 crc kubenswrapper[4746]: I0128 20:41:53.635150 4746 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ingress"/"router-certs-default"
Jan 28 20:41:53 crc kubenswrapper[4746]: I0128 20:41:53.648938 4746 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"default-certificate\" (UniqueName: \"kubernetes.io/secret/c068f936-795b-4eb3-83a8-e363131119e9-default-certificate\") pod \"router-default-5444994796-rjmhd\" (UID: \"c068f936-795b-4eb3-83a8-e363131119e9\") " pod="openshift-ingress/router-default-5444994796-rjmhd"
Jan 28 20:41:53 crc kubenswrapper[4746]: I0128 20:41:53.656907 4746 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ingress"/"kube-root-ca.crt"
Jan 28 20:41:53 crc kubenswrapper[4746]: I0128 20:41:53.677943 4746 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ingress"/"openshift-service-ca.crt"
Jan 28 20:41:53 crc kubenswrapper[4746]: I0128 20:41:53.695900 4746 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-controller-manager-operator"/"kube-root-ca.crt"
Jan 28 20:41:53 crc kubenswrapper[4746]: I0128 20:41:53.715926 4746 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver"/"config"
Jan 28 20:41:53 crc kubenswrapper[4746]: I0128 20:41:53.719358 4746 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/af3e8fcb-fbde-4b65-92ab-3d8b71b2de07-config\") pod \"apiserver-76f77b778f-sk7xz\" (UID: \"af3e8fcb-fbde-4b65-92ab-3d8b71b2de07\") " pod="openshift-apiserver/apiserver-76f77b778f-sk7xz"
Jan 28 20:41:53 crc kubenswrapper[4746]: I0128 20:41:53.737016 4746 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-controller-manager-operator"/"kube-controller-manager-operator-dockercfg-gkqpw"
Jan 28 20:41:53 crc kubenswrapper[4746]: I0128 20:41:53.757218 4746 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-controller-manager-operator"/"kube-controller-manager-operator-serving-cert"
Jan 28 20:41:53 crc kubenswrapper[4746]: I0128 20:41:53.770436 4746 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/0cedee54-2c6e-44d4-a51a-b5f8d3ff0833-serving-cert\") pod \"kube-controller-manager-operator-78b949d7b-hlqz9\" (UID: \"0cedee54-2c6e-44d4-a51a-b5f8d3ff0833\") " pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-hlqz9"
Jan 28 20:41:53 crc kubenswrapper[4746]: I0128 20:41:53.777280 4746 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-controller-manager-operator"/"kube-controller-manager-operator-config"
Jan 28 20:41:53 crc kubenswrapper[4746]: I0128 20:41:53.785851 4746 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/0cedee54-2c6e-44d4-a51a-b5f8d3ff0833-config\") pod \"kube-controller-manager-operator-78b949d7b-hlqz9\" (UID: \"0cedee54-2c6e-44d4-a51a-b5f8d3ff0833\") " pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-hlqz9"
Jan 28 20:41:53 crc kubenswrapper[4746]: I0128 20:41:53.797762 4746 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-apiserver"/"serving-cert"
Jan 28 20:41:53 crc kubenswrapper[4746]: I0128 20:41:53.810296 4746 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/af3e8fcb-fbde-4b65-92ab-3d8b71b2de07-serving-cert\") pod \"apiserver-76f77b778f-sk7xz\" (UID: \"af3e8fcb-fbde-4b65-92ab-3d8b71b2de07\") " pod="openshift-apiserver/apiserver-76f77b778f-sk7xz"
Jan 28 20:41:53 crc kubenswrapper[4746]: I0128 20:41:53.816304 4746 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-apiserver"/"encryption-config-1"
Jan 28 20:41:53 crc kubenswrapper[4746]: I0128 20:41:53.830683 4746 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"encryption-config\" (UniqueName: \"kubernetes.io/secret/af3e8fcb-fbde-4b65-92ab-3d8b71b2de07-encryption-config\") pod \"apiserver-76f77b778f-sk7xz\" (UID: \"af3e8fcb-fbde-4b65-92ab-3d8b71b2de07\") " pod="openshift-apiserver/apiserver-76f77b778f-sk7xz"
Jan 28 20:41:53 crc kubenswrapper[4746]: I0128 20:41:53.835710 4746 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-apiserver"/"openshift-apiserver-sa-dockercfg-djjff"
Jan 28 20:41:53 crc kubenswrapper[4746]: I0128 20:41:53.856292 4746 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-apiserver"/"etcd-client"
Jan 28 20:41:53 crc kubenswrapper[4746]: I0128 20:41:53.871278 4746 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/af3e8fcb-fbde-4b65-92ab-3d8b71b2de07-etcd-client\") pod \"apiserver-76f77b778f-sk7xz\" (UID: \"af3e8fcb-fbde-4b65-92ab-3d8b71b2de07\") " pod="openshift-apiserver/apiserver-76f77b778f-sk7xz"
Jan 28 20:41:53 crc kubenswrapper[4746]: I0128 20:41:53.875138 4746 request.go:700] Waited for 1.008289679s due to client-side throttling, not priority and fairness, request: GET:https://api-int.crc.testing:6443/api/v1/namespaces/openshift-apiserver/configmaps?fieldSelector=metadata.name%3Daudit-1&limit=500&resourceVersion=0
Jan 28 20:41:53 crc kubenswrapper[4746]: I0128 20:41:53.876789 4746 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver"/"audit-1"
Jan 28 20:41:53 crc kubenswrapper[4746]: I0128 20:41:53.885151 4746 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"audit\" (UniqueName: \"kubernetes.io/configmap/af3e8fcb-fbde-4b65-92ab-3d8b71b2de07-audit\") pod \"apiserver-76f77b778f-sk7xz\" (UID: \"af3e8fcb-fbde-4b65-92ab-3d8b71b2de07\") " pod="openshift-apiserver/apiserver-76f77b778f-sk7xz"
Jan 28 20:41:53 crc kubenswrapper[4746]: I0128 20:41:53.896303 4746 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver"/"etcd-serving-ca"
Jan 28 20:41:53 crc kubenswrapper[4746]: I0128 20:41:53.898798 4746 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etcd-serving-ca\" (UniqueName: \"kubernetes.io/configmap/af3e8fcb-fbde-4b65-92ab-3d8b71b2de07-etcd-serving-ca\") pod \"apiserver-76f77b778f-sk7xz\" (UID: \"af3e8fcb-fbde-4b65-92ab-3d8b71b2de07\") " pod="openshift-apiserver/apiserver-76f77b778f-sk7xz"
Jan 28 20:41:53 crc kubenswrapper[4746]: I0128 20:41:53.915602 4746 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver"/"image-import-ca"
Jan 28 20:41:53 crc kubenswrapper[4746]: I0128 20:41:53.925907 4746 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"image-import-ca\" (UniqueName: \"kubernetes.io/configmap/af3e8fcb-fbde-4b65-92ab-3d8b71b2de07-image-import-ca\") pod \"apiserver-76f77b778f-sk7xz\" (UID: \"af3e8fcb-fbde-4b65-92ab-3d8b71b2de07\") " pod="openshift-apiserver/apiserver-76f77b778f-sk7xz"
Jan 28 20:41:53 crc kubenswrapper[4746]: I0128 20:41:53.948056 4746 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver"/"trusted-ca-bundle"
Jan 28 20:41:53 crc kubenswrapper[4746]: I0128 20:41:53.951769 4746 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/af3e8fcb-fbde-4b65-92ab-3d8b71b2de07-trusted-ca-bundle\") pod \"apiserver-76f77b778f-sk7xz\" (UID: \"af3e8fcb-fbde-4b65-92ab-3d8b71b2de07\") " pod="openshift-apiserver/apiserver-76f77b778f-sk7xz"
Jan 28 20:41:53 crc kubenswrapper[4746]: I0128 20:41:53.956366 4746 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver"/"kube-root-ca.crt"
Jan 28 20:41:53 crc kubenswrapper[4746]: I0128 20:41:53.976575 4746 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver"/"openshift-service-ca.crt"
Jan 28 20:41:53 crc kubenswrapper[4746]: I0128 20:41:53.997364 4746 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-operator-lifecycle-manager"/"collect-profiles-config"
Jan 28 20:41:54 crc kubenswrapper[4746]: I0128 20:41:54.006371 4746 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/c15d5c5e-19d5-43c6-8955-1bb2ad8b33b6-config-volume\") pod \"collect-profiles-29493870-4jnh6\" (UID: \"c15d5c5e-19d5-43c6-8955-1bb2ad8b33b6\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29493870-4jnh6"
Jan 28 20:41:54 crc kubenswrapper[4746]: I0128 20:41:54.017294 4746 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"package-server-manager-serving-cert"
Jan 28 20:41:54 crc kubenswrapper[4746]: I0128 20:41:54.033809 4746 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"package-server-manager-serving-cert\" (UniqueName: \"kubernetes.io/secret/71db2fbb-6c1a-417f-9cc4-a1ae041d0c0a-package-server-manager-serving-cert\") pod \"package-server-manager-789f6589d5-zzrh5\" (UID: \"71db2fbb-6c1a-417f-9cc4-a1ae041d0c0a\") " pod="openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-zzrh5"
Jan 28 20:41:54 crc kubenswrapper[4746]: I0128 20:41:54.056400 4746 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-service-ca-operator"/"openshift-service-ca.crt"
Jan 28 20:41:54 crc kubenswrapper[4746]: I0128 20:41:54.077019 4746 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"collect-profiles-dockercfg-kzf4t"
Jan 28 20:41:54 crc kubenswrapper[4746]: I0128 20:41:54.097231 4746 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-service-ca-operator"/"service-ca-operator-dockercfg-rg9jl"
Jan 28 20:41:54 crc kubenswrapper[4746]: I0128 20:41:54.116192 4746 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-service-ca-operator"/"serving-cert"
Jan 28 20:41:54 crc kubenswrapper[4746]: I0128 20:41:54.136992 4746 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-service-ca-operator"/"service-ca-operator-config"
Jan 28 20:41:54 crc kubenswrapper[4746]: I0128 20:41:54.157015 4746 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-service-ca-operator"/"kube-root-ca.crt"
Jan 28 20:41:54 crc kubenswrapper[4746]: I0128 20:41:54.177628 4746 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"mcc-proxy-tls"
Jan 28 20:41:54 crc kubenswrapper[4746]: I0128 20:41:54.196888 4746 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"machine-config-controller-dockercfg-c2lfx"
Jan 28 20:41:54 crc kubenswrapper[4746]: I0128 20:41:54.216967 4746 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-service-ca"/"signing-cabundle"
Jan 28 20:41:54 crc kubenswrapper[4746]: I0128 20:41:54.236902 4746 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-service-ca"/"service-ca-dockercfg-pn86c"
Jan 28 20:41:54 crc kubenswrapper[4746]: I0128 20:41:54.257262 4746 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-service-ca"/"signing-key"
Jan 28 20:41:54 crc kubenswrapper[4746]: I0128 20:41:54.277795 4746 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-service-ca"/"kube-root-ca.crt"
Jan 28 20:41:54 crc kubenswrapper[4746]: I0128 20:41:54.296542 4746 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-service-ca"/"openshift-service-ca.crt"
Jan 28 20:41:54 crc kubenswrapper[4746]: I0128 20:41:54.317358 4746 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"olm-operator-serving-cert"
Jan 28 20:41:54 crc kubenswrapper[4746]: I0128 20:41:54.336638 4746 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-marketplace"/"openshift-service-ca.crt"
Jan 28 20:41:54 crc kubenswrapper[4746]: I0128 20:41:54.356547 4746 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-config-operator"/"machine-config-operator-images"
Jan 28 20:41:54 crc kubenswrapper[4746]: I0128 20:41:54.378387 4746 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"marketplace-operator-dockercfg-5nsgg"
Jan 28 20:41:54 crc kubenswrapper[4746]: I0128 20:41:54.396704 4746 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"marketplace-operator-metrics"
Jan 28 20:41:54 crc kubenswrapper[4746]: I0128 20:41:54.424737 4746 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-marketplace"/"marketplace-trusted-ca"
Jan 28 20:41:54 crc kubenswrapper[4746]: I0128 20:41:54.436249 4746 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-marketplace"/"kube-root-ca.crt"
Jan 28 20:41:54 crc kubenswrapper[4746]: I0128 20:41:54.457483 4746 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"mco-proxy-tls"
Jan 28 20:41:54 crc kubenswrapper[4746]: I0128 20:41:54.476908 4746 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"machine-config-operator-dockercfg-98p87"
Jan 28 20:41:54 crc kubenswrapper[4746]: I0128 20:41:54.516705 4746 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"node-bootstrapper-token"
Jan 28 20:41:54 crc kubenswrapper[4746]: I0128 20:41:54.536663 4746 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"machine-config-server-dockercfg-qx5rd"
Jan 28 20:41:54 crc kubenswrapper[4746]: I0128 20:41:54.557895 4746 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"machine-config-server-tls"
Jan 28 20:41:54 crc kubenswrapper[4746]: I0128 20:41:54.603251 4746 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-b9l2t\" (UniqueName: \"kubernetes.io/projected/6479c53a-0e30-4805-bdb8-314a66127b5c-kube-api-access-b9l2t\") pod \"machine-approver-56656f9798-v7zfx\" (UID: \"6479c53a-0e30-4805-bdb8-314a66127b5c\") " pod="openshift-cluster-machine-approver/machine-approver-56656f9798-v7zfx"
Jan 28 20:41:54 crc kubenswrapper[4746]: I0128 20:41:54.617557 4746 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-f7rz5\" (UniqueName: \"kubernetes.io/projected/594b8ae9-dfcf-4b89-ad6b-7f7bfaa13960-kube-api-access-f7rz5\") pod \"openshift-apiserver-operator-796bbdcf4f-8w6v4\" (UID: \"594b8ae9-dfcf-4b89-ad6b-7f7bfaa13960\") " pod="openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-8w6v4"
Jan 28 20:41:54 crc kubenswrapper[4746]: I0128 20:41:54.630611 4746 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-2mg5d\" (UniqueName: \"kubernetes.io/projected/ac478ffc-e1e4-4b72-bf99-d35c2636a78d-kube-api-access-2mg5d\") pod \"authentication-operator-69f744f599-8mmbg\" (UID: \"ac478ffc-e1e4-4b72-bf99-d35c2636a78d\") " pod="openshift-authentication-operator/authentication-operator-69f744f599-8mmbg"
Jan 28 20:41:54 crc kubenswrapper[4746]: I0128 20:41:54.652751 4746 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-cluster-machine-approver/machine-approver-56656f9798-v7zfx"
Jan 28 20:41:54 crc kubenswrapper[4746]: I0128 20:41:54.656600 4746 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-qjr7f\" (UniqueName: \"kubernetes.io/projected/b34266ce-b971-4f4b-b8b7-c54ff8b6212c-kube-api-access-qjr7f\") pod \"downloads-7954f5f757-qrffw\" (UID: \"b34266ce-b971-4f4b-b8b7-c54ff8b6212c\") " pod="openshift-console/downloads-7954f5f757-qrffw"
Jan 28 20:41:54 crc kubenswrapper[4746]: W0128 20:41:54.673407 4746 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod6479c53a_0e30_4805_bdb8_314a66127b5c.slice/crio-db7eaa0676ac9f210f3bf461ac06761c61d194192f22b15807fe0b5ba496acac WatchSource:0}: Error finding container db7eaa0676ac9f210f3bf461ac06761c61d194192f22b15807fe0b5ba496acac: Status 404 returned error can't find the container with id db7eaa0676ac9f210f3bf461ac06761c61d194192f22b15807fe0b5ba496acac
Jan 28 20:41:54 crc kubenswrapper[4746]: I0128 20:41:54.682984 4746 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-rktqc\" (UniqueName: \"kubernetes.io/projected/447abd89-31fd-4bb6-a965-97d7954f47bb-kube-api-access-rktqc\") pod \"route-controller-manager-6576b87f9c-sjqv2\" (UID: \"447abd89-31fd-4bb6-a965-97d7954f47bb\") " pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-sjqv2"
Jan 28 20:41:54 crc kubenswrapper[4746]: I0128 20:41:54.683278 4746 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-sjqv2"
Jan 28 20:41:54 crc kubenswrapper[4746]: I0128 20:41:54.691579 4746 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-h6zqh\" (UniqueName: \"kubernetes.io/projected/8c7789bf-f2f7-4cf4-97f1-3d8a7438daca-kube-api-access-h6zqh\") pod \"cluster-image-registry-operator-dc59b4c8b-z2vr7\" (UID: \"8c7789bf-f2f7-4cf4-97f1-3d8a7438daca\") " pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-z2vr7"
Jan 28 20:41:54 crc kubenswrapper[4746]: I0128 20:41:54.709496 4746 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-console/downloads-7954f5f757-qrffw"
Jan 28 20:41:54 crc kubenswrapper[4746]: I0128 20:41:54.710196 4746 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/8c7789bf-f2f7-4cf4-97f1-3d8a7438daca-bound-sa-token\") pod \"cluster-image-registry-operator-dc59b4c8b-z2vr7\" (UID: \"8c7789bf-f2f7-4cf4-97f1-3d8a7438daca\") " pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-z2vr7"
Jan 28 20:41:54 crc kubenswrapper[4746]: I0128 20:41:54.731067 4746 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-ctjhc\" (UniqueName: \"kubernetes.io/projected/e3e01537-7adb-4a81-a9cb-3deb73a1d5b3-kube-api-access-ctjhc\") pod \"cluster-samples-operator-665b6dd947-xxnb8\" (UID: \"e3e01537-7adb-4a81-a9cb-3deb73a1d5b3\") " pod="openshift-cluster-samples-operator/cluster-samples-operator-665b6dd947-xxnb8"
Jan 28 20:41:54 crc kubenswrapper[4746]: I0128 20:41:54.760648 4746 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-6krvq\" (UniqueName: \"kubernetes.io/projected/6208130d-52bc-449e-b371-357b1cc21b22-kube-api-access-6krvq\") pod \"oauth-openshift-558db77b4-4g4p7\" (UID: \"6208130d-52bc-449e-b371-357b1cc21b22\") " pod="openshift-authentication/oauth-openshift-558db77b4-4g4p7"
Jan 28 20:41:54 crc kubenswrapper[4746]: I0128 20:41:54.778568 4746 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-8w6v4"
Jan 28 20:41:54 crc kubenswrapper[4746]: I0128 20:41:54.781070 4746 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-cdh2x\" (UniqueName: \"kubernetes.io/projected/0ea55ab8-bec0-44e8-8105-c8c604fc5fa1-kube-api-access-cdh2x\") pod \"console-operator-58897d9998-4cmsz\" (UID: \"0ea55ab8-bec0-44e8-8105-c8c604fc5fa1\") " pod="openshift-console-operator/console-operator-58897d9998-4cmsz"
Jan 28 20:41:54 crc kubenswrapper[4746]: I0128 20:41:54.854627 4746 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-dbx4z\" (UniqueName: \"kubernetes.io/projected/f69da289-e18c-4baa-ad6b-4f2e3a44cda5-kube-api-access-dbx4z\") pod \"apiserver-7bbb656c7d-nz5l2\" (UID: \"f69da289-e18c-4baa-ad6b-4f2e3a44cda5\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-nz5l2"
Jan 28 20:41:54 crc kubenswrapper[4746]: I0128 20:41:54.857012 4746 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-cluster-samples-operator/cluster-samples-operator-665b6dd947-xxnb8"
Jan 28 20:41:54 crc kubenswrapper[4746]: I0128 20:41:54.874568 4746 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-tvxmk\" (UniqueName: \"kubernetes.io/projected/3e64bb6e-1131-431b-b87c-71e25d294fe1-kube-api-access-tvxmk\") pod \"machine-api-operator-5694c8668f-lzj8l\" (UID: \"3e64bb6e-1131-431b-b87c-71e25d294fe1\") " pod="openshift-machine-api/machine-api-operator-5694c8668f-lzj8l"
Jan 28 20:41:54 crc kubenswrapper[4746]: I0128 20:41:54.877777 4746 reflector.go:368] Caches populated for *v1.Secret from object-"hostpath-provisioner"/"csi-hostpath-provisioner-sa-dockercfg-qd74k"
Jan 28 20:41:54 crc kubenswrapper[4746]: I0128 20:41:54.880178 4746 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-2k8vv\" (UniqueName: \"kubernetes.io/projected/0f0eb07a-e0b4-4702-89b0-d94e937471a5-kube-api-access-2k8vv\") pod \"controller-manager-879f6c89f-5d8cs\" (UID: \"0f0eb07a-e0b4-4702-89b0-d94e937471a5\") " pod="openshift-controller-manager/controller-manager-879f6c89f-5d8cs"
Jan 28 20:41:54 crc kubenswrapper[4746]: I0128 20:41:54.882824 4746 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-6tknp\" (UniqueName: \"kubernetes.io/projected/94cef654-afbe-42c2-8069-5dbcb7294abb-kube-api-access-6tknp\") pod \"console-f9d7485db-hcxv8\" (UID: \"94cef654-afbe-42c2-8069-5dbcb7294abb\") " pod="openshift-console/console-f9d7485db-hcxv8"
Jan 28 20:41:54 crc kubenswrapper[4746]: I0128 20:41:54.885286 4746 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-authentication/oauth-openshift-558db77b4-4g4p7"
Jan 28 20:41:54 crc kubenswrapper[4746]: I0128 20:41:54.886313 4746 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-64bpv\" (UniqueName: \"kubernetes.io/projected/d66e245a-95d4-49c2-9172-2f095fab3e2b-kube-api-access-64bpv\") pod \"openshift-controller-manager-operator-756b6f6bc6-5gjgg\" (UID: \"d66e245a-95d4-49c2-9172-2f095fab3e2b\") " pod="openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-5gjgg"
Jan 28 20:41:54 crc kubenswrapper[4746]: I0128 20:41:54.894177 4746 request.go:700] Waited for 1.900417839s due to client-side throttling, not priority and fairness, request: GET:https://api-int.crc.testing:6443/api/v1/namespaces/hostpath-provisioner/configmaps?fieldSelector=metadata.name%3Dkube-root-ca.crt&limit=500&resourceVersion=0
Jan 28 20:41:54 crc kubenswrapper[4746]: I0128 20:41:54.896546 4746 reflector.go:368] Caches populated for *v1.ConfigMap from object-"hostpath-provisioner"/"kube-root-ca.crt"
Jan 28 20:41:54 crc kubenswrapper[4746]: I0128 20:41:54.903465 4746 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-console/console-f9d7485db-hcxv8"
Jan 28 20:41:54 crc kubenswrapper[4746]: I0128 20:41:54.916399 4746 reflector.go:368] Caches populated for *v1.ConfigMap from object-"hostpath-provisioner"/"openshift-service-ca.crt"
Jan 28 20:41:54 crc kubenswrapper[4746]: I0128 20:41:54.928700 4746 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-authentication-operator/authentication-operator-69f744f599-8mmbg"
Jan 28 20:41:54 crc kubenswrapper[4746]: I0128 20:41:54.936311 4746 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ingress-canary"/"canary-serving-cert"
Jan 28 20:41:54 crc kubenswrapper[4746]: I0128 20:41:54.945037 4746 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-console-operator/console-operator-58897d9998-4cmsz"
Jan 28 20:41:54 crc kubenswrapper[4746]: I0128 20:41:54.963761 4746 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ingress-canary"/"default-dockercfg-2llfx"
Jan 28 20:41:54 crc kubenswrapper[4746]: I0128 20:41:54.971352 4746 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-z2vr7"
Jan 28 20:41:54 crc kubenswrapper[4746]: I0128 20:41:54.980366 4746 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ingress-canary"/"kube-root-ca.crt"
Jan 28 20:41:54 crc kubenswrapper[4746]: I0128 20:41:54.996898 4746 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ingress-canary"/"openshift-service-ca.crt"
Jan 28 20:41:55 crc kubenswrapper[4746]: I0128 20:41:55.027435 4746 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-5gjgg"
Jan 28 20:41:55 crc kubenswrapper[4746]: I0128 20:41:55.027626 4746 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-dns"/"dns-default-metrics-tls"
Jan 28 20:41:55 crc kubenswrapper[4746]: I0128 20:41:55.036450 4746 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-dns"/"dns-dockercfg-jwfmh"
Jan 28 20:41:55 crc kubenswrapper[4746]: I0128 20:41:55.054115 4746 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-6576b87f9c-sjqv2"]
Jan 28 20:41:55 crc kubenswrapper[4746]: I0128 20:41:55.056433 4746 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-dns"/"dns-default"
Jan 28 20:41:55 crc kubenswrapper[4746]: I0128 20:41:55.100094 4746 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-lrbz8\" (UniqueName: \"kubernetes.io/projected/f3cd630b-3fe5-497f-90da-f58bcb7aac8b-kube-api-access-lrbz8\") pod \"ingress-operator-5b745b69d9-h2q6n\" (UID: \"f3cd630b-3fe5-497f-90da-f58bcb7aac8b\") " pod="openshift-ingress-operator/ingress-operator-5b745b69d9-h2q6n"
Jan 28 20:41:55 crc kubenswrapper[4746]: I0128 20:41:55.102947 4746 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-machine-api/machine-api-operator-5694c8668f-lzj8l"
Jan 28 20:41:55 crc kubenswrapper[4746]: I0128 20:41:55.112284 4746 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-vnkbm\" (UniqueName: \"kubernetes.io/projected/72e0847f-0a87-4710-9765-a10282cc0529-kube-api-access-vnkbm\") pod \"control-plane-machine-set-operator-78cbb6b69f-mznpb\" (UID: \"72e0847f-0a87-4710-9765-a10282cc0529\") " pod="openshift-machine-api/control-plane-machine-set-operator-78cbb6b69f-mznpb"
Jan 28 20:41:55 crc kubenswrapper[4746]: I0128 20:41:55.116808 4746 util.go:30] "No sandbox for pod can be found.
Need to start a new one" pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-nz5l2" Jan 28 20:41:55 crc kubenswrapper[4746]: I0128 20:41:55.133695 4746 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-8w6v4"] Jan 28 20:41:55 crc kubenswrapper[4746]: I0128 20:41:55.140095 4746 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-slszm\" (UniqueName: \"kubernetes.io/projected/c068f936-795b-4eb3-83a8-e363131119e9-kube-api-access-slszm\") pod \"router-default-5444994796-rjmhd\" (UID: \"c068f936-795b-4eb3-83a8-e363131119e9\") " pod="openshift-ingress/router-default-5444994796-rjmhd" Jan 28 20:41:55 crc kubenswrapper[4746]: I0128 20:41:55.149887 4746 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-hjlmv\" (UniqueName: \"kubernetes.io/projected/aa7d6dd6-1d54-400f-a188-628f99083f93-kube-api-access-hjlmv\") pod \"multus-admission-controller-857f4d67dd-lqlcg\" (UID: \"aa7d6dd6-1d54-400f-a188-628f99083f93\") " pod="openshift-multus/multus-admission-controller-857f4d67dd-lqlcg" Jan 28 20:41:55 crc kubenswrapper[4746]: I0128 20:41:55.163880 4746 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-controller-manager/controller-manager-879f6c89f-5d8cs" Jan 28 20:41:55 crc kubenswrapper[4746]: I0128 20:41:55.164290 4746 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-multus/multus-admission-controller-857f4d67dd-lqlcg" Jan 28 20:41:55 crc kubenswrapper[4746]: I0128 20:41:55.177004 4746 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-cluster-samples-operator/cluster-samples-operator-665b6dd947-xxnb8"] Jan 28 20:41:55 crc kubenswrapper[4746]: I0128 20:41:55.186529 4746 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/460c9e37-a0c0-43ea-9607-8f716e2e92bf-kube-api-access\") pod \"kube-apiserver-operator-766d6c64bb-gbwrh\" (UID: \"460c9e37-a0c0-43ea-9607-8f716e2e92bf\") " pod="openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-gbwrh" Jan 28 20:41:55 crc kubenswrapper[4746]: I0128 20:41:55.196894 4746 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-machine-api/control-plane-machine-set-operator-78cbb6b69f-mznpb" Jan 28 20:41:55 crc kubenswrapper[4746]: I0128 20:41:55.204964 4746 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-6tgv9\" (UniqueName: \"kubernetes.io/projected/4886a33e-8379-47d5-ac13-e58bc623d01c-kube-api-access-6tgv9\") pod \"catalog-operator-68c6474976-wb74c\" (UID: \"4886a33e-8379-47d5-ac13-e58bc623d01c\") " pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-wb74c" Jan 28 20:41:55 crc kubenswrapper[4746]: I0128 20:41:55.218018 4746 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-5jq78\" (UniqueName: \"kubernetes.io/projected/c29355d7-4e9a-4e5a-838e-77a4df7c2fda-kube-api-access-5jq78\") pod \"dns-operator-744455d44c-q8fsc\" (UID: \"c29355d7-4e9a-4e5a-838e-77a4df7c2fda\") " pod="openshift-dns-operator/dns-operator-744455d44c-q8fsc" Jan 28 20:41:55 crc kubenswrapper[4746]: I0128 20:41:55.229144 4746 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-wb74c" Jan 28 20:41:55 crc kubenswrapper[4746]: I0128 20:41:55.234370 4746 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-console/downloads-7954f5f757-qrffw"] Jan 28 20:41:55 crc kubenswrapper[4746]: I0128 20:41:55.245449 4746 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ingress/router-default-5444994796-rjmhd" Jan 28 20:41:55 crc kubenswrapper[4746]: I0128 20:41:55.261753 4746 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-lfhc8\" (UniqueName: \"kubernetes.io/projected/f31f9ca0-e467-452d-90db-a28a4b69496e-kube-api-access-lfhc8\") pod \"etcd-operator-b45778765-dbwsb\" (UID: \"f31f9ca0-e467-452d-90db-a28a4b69496e\") " pod="openshift-etcd-operator/etcd-operator-b45778765-dbwsb" Jan 28 20:41:55 crc kubenswrapper[4746]: I0128 20:41:55.267608 4746 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-console-operator/console-operator-58897d9998-4cmsz"] Jan 28 20:41:55 crc kubenswrapper[4746]: I0128 20:41:55.268030 4746 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-pdbcf\" (UniqueName: \"kubernetes.io/projected/aa555a55-a0b0-47e3-959f-e2d8d387aae2-kube-api-access-pdbcf\") pod \"packageserver-d55dfcdfc-8vvmj\" (UID: \"aa555a55-a0b0-47e3-959f-e2d8d387aae2\") " pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-8vvmj" Jan 28 20:41:55 crc kubenswrapper[4746]: I0128 20:41:55.280151 4746 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-gbwrh" Jan 28 20:41:55 crc kubenswrapper[4746]: I0128 20:41:55.293339 4746 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-9kglc\" (UniqueName: \"kubernetes.io/projected/c15d5c5e-19d5-43c6-8955-1bb2ad8b33b6-kube-api-access-9kglc\") pod \"collect-profiles-29493870-4jnh6\" (UID: \"c15d5c5e-19d5-43c6-8955-1bb2ad8b33b6\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29493870-4jnh6" Jan 28 20:41:55 crc kubenswrapper[4746]: I0128 20:41:55.306389 4746 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/0cedee54-2c6e-44d4-a51a-b5f8d3ff0833-kube-api-access\") pod \"kube-controller-manager-operator-78b949d7b-hlqz9\" (UID: \"0cedee54-2c6e-44d4-a51a-b5f8d3ff0833\") " pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-hlqz9" Jan 28 20:41:55 crc kubenswrapper[4746]: I0128 20:41:55.313633 4746 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/8b8a9bdb-0612-4627-ba36-98293308c32d-kube-api-access\") pod \"openshift-kube-scheduler-operator-5fdd9b5758-562pf\" (UID: \"8b8a9bdb-0612-4627-ba36-98293308c32d\") " pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-562pf" Jan 28 20:41:55 crc kubenswrapper[4746]: I0128 20:41:55.332387 4746 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-49tzx\" (UniqueName: \"kubernetes.io/projected/af3e8fcb-fbde-4b65-92ab-3d8b71b2de07-kube-api-access-49tzx\") pod \"apiserver-76f77b778f-sk7xz\" (UID: \"af3e8fcb-fbde-4b65-92ab-3d8b71b2de07\") " pod="openshift-apiserver/apiserver-76f77b778f-sk7xz" Jan 28 20:41:55 crc kubenswrapper[4746]: I0128 20:41:55.357419 4746 operation_generator.go:637] "MountVolume.SetUp succeeded for volume 
\"kube-api-access-xdpbf\" (UniqueName: \"kubernetes.io/projected/18ad1fee-6a3a-4b98-83c5-6a13f22699db-kube-api-access-xdpbf\") pod \"openshift-config-operator-7777fb866f-rktng\" (UID: \"18ad1fee-6a3a-4b98-83c5-6a13f22699db\") " pod="openshift-config-operator/openshift-config-operator-7777fb866f-rktng" Jan 28 20:41:55 crc kubenswrapper[4746]: I0128 20:41:55.374474 4746 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/f3cd630b-3fe5-497f-90da-f58bcb7aac8b-bound-sa-token\") pod \"ingress-operator-5b745b69d9-h2q6n\" (UID: \"f3cd630b-3fe5-497f-90da-f58bcb7aac8b\") " pod="openshift-ingress-operator/ingress-operator-5b745b69d9-h2q6n" Jan 28 20:41:55 crc kubenswrapper[4746]: I0128 20:41:55.406381 4746 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-gr849\" (UniqueName: \"kubernetes.io/projected/71db2fbb-6c1a-417f-9cc4-a1ae041d0c0a-kube-api-access-gr849\") pod \"package-server-manager-789f6589d5-zzrh5\" (UID: \"71db2fbb-6c1a-417f-9cc4-a1ae041d0c0a\") " pod="openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-zzrh5" Jan 28 20:41:55 crc kubenswrapper[4746]: I0128 20:41:55.427258 4746 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-g4tp6\" (UniqueName: \"kubernetes.io/projected/4e03d657-3b57-4eea-bb77-f5fe3a519cac-kube-api-access-g4tp6\") pod \"migrator-59844c95c7-s8b4b\" (UID: \"4e03d657-3b57-4eea-bb77-f5fe3a519cac\") " pod="openshift-kube-storage-version-migrator/migrator-59844c95c7-s8b4b" Jan 28 20:41:55 crc kubenswrapper[4746]: I0128 20:41:55.439716 4746 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-config-operator/openshift-config-operator-7777fb866f-rktng" Jan 28 20:41:55 crc kubenswrapper[4746]: I0128 20:41:55.447305 4746 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-562pf" Jan 28 20:41:55 crc kubenswrapper[4746]: I0128 20:41:55.452118 4746 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-storage-version-migrator/migrator-59844c95c7-s8b4b" Jan 28 20:41:55 crc kubenswrapper[4746]: I0128 20:41:55.458985 4746 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-dns-operator/dns-operator-744455d44c-q8fsc" Jan 28 20:41:55 crc kubenswrapper[4746]: I0128 20:41:55.463024 4746 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-console/console-f9d7485db-hcxv8"] Jan 28 20:41:55 crc kubenswrapper[4746]: I0128 20:41:55.473595 4746 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/3c71e425-b304-49f8-ac53-5e1383f73eb7-config\") pod \"service-ca-operator-777779d784-62cbt\" (UID: \"3c71e425-b304-49f8-ac53-5e1383f73eb7\") " pod="openshift-service-ca-operator/service-ca-operator-777779d784-62cbt" Jan 28 20:41:55 crc kubenswrapper[4746]: I0128 20:41:55.473637 4746 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-t8lv5\" (UniqueName: \"kubernetes.io/projected/bcb19d77-0dbf-4d14-a86c-4cd7e65211e0-kube-api-access-t8lv5\") pod \"machine-config-controller-84d6567774-rbl7q\" (UID: \"bcb19d77-0dbf-4d14-a86c-4cd7e65211e0\") " pod="openshift-machine-config-operator/machine-config-controller-84d6567774-rbl7q" Jan 28 20:41:55 crc kubenswrapper[4746]: I0128 20:41:55.473685 4746 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/627f2e7c-f091-4ea2-9c3c-fce02f2b7669-registry-certificates\") pod \"image-registry-697d97f7c8-lwhk4\" (UID: \"627f2e7c-f091-4ea2-9c3c-fce02f2b7669\") " 
pod="openshift-image-registry/image-registry-697d97f7c8-lwhk4" Jan 28 20:41:55 crc kubenswrapper[4746]: I0128 20:41:55.473814 4746 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/627f2e7c-f091-4ea2-9c3c-fce02f2b7669-registry-tls\") pod \"image-registry-697d97f7c8-lwhk4\" (UID: \"627f2e7c-f091-4ea2-9c3c-fce02f2b7669\") " pod="openshift-image-registry/image-registry-697d97f7c8-lwhk4" Jan 28 20:41:55 crc kubenswrapper[4746]: I0128 20:41:55.473837 4746 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/627f2e7c-f091-4ea2-9c3c-fce02f2b7669-ca-trust-extracted\") pod \"image-registry-697d97f7c8-lwhk4\" (UID: \"627f2e7c-f091-4ea2-9c3c-fce02f2b7669\") " pod="openshift-image-registry/image-registry-697d97f7c8-lwhk4" Jan 28 20:41:55 crc kubenswrapper[4746]: I0128 20:41:55.473893 4746 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-5p4mh\" (UniqueName: \"kubernetes.io/projected/5b1d3e40-8458-4661-854f-c16ab4cd7596-kube-api-access-5p4mh\") pod \"kube-storage-version-migrator-operator-b67b599dd-m6pvw\" (UID: \"5b1d3e40-8458-4661-854f-c16ab4cd7596\") " pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-m6pvw" Jan 28 20:41:55 crc kubenswrapper[4746]: I0128 20:41:55.473918 4746 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-lwhk4\" (UID: \"627f2e7c-f091-4ea2-9c3c-fce02f2b7669\") " pod="openshift-image-registry/image-registry-697d97f7c8-lwhk4" Jan 28 20:41:55 crc kubenswrapper[4746]: I0128 20:41:55.473957 4746 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-sz6vv\" (UniqueName: \"kubernetes.io/projected/3c71e425-b304-49f8-ac53-5e1383f73eb7-kube-api-access-sz6vv\") pod \"service-ca-operator-777779d784-62cbt\" (UID: \"3c71e425-b304-49f8-ac53-5e1383f73eb7\") " pod="openshift-service-ca-operator/service-ca-operator-777779d784-62cbt" Jan 28 20:41:55 crc kubenswrapper[4746]: I0128 20:41:55.474009 4746 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/bcb19d77-0dbf-4d14-a86c-4cd7e65211e0-proxy-tls\") pod \"machine-config-controller-84d6567774-rbl7q\" (UID: \"bcb19d77-0dbf-4d14-a86c-4cd7e65211e0\") " pod="openshift-machine-config-operator/machine-config-controller-84d6567774-rbl7q" Jan 28 20:41:55 crc kubenswrapper[4746]: I0128 20:41:55.474041 4746 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/627f2e7c-f091-4ea2-9c3c-fce02f2b7669-trusted-ca\") pod \"image-registry-697d97f7c8-lwhk4\" (UID: \"627f2e7c-f091-4ea2-9c3c-fce02f2b7669\") " pod="openshift-image-registry/image-registry-697d97f7c8-lwhk4" Jan 28 20:41:55 crc kubenswrapper[4746]: I0128 20:41:55.474067 4746 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/5b1d3e40-8458-4661-854f-c16ab4cd7596-config\") pod \"kube-storage-version-migrator-operator-b67b599dd-m6pvw\" (UID: \"5b1d3e40-8458-4661-854f-c16ab4cd7596\") " pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-m6pvw" Jan 28 20:41:55 crc kubenswrapper[4746]: I0128 20:41:55.474158 4746 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/5b1d3e40-8458-4661-854f-c16ab4cd7596-serving-cert\") pod 
\"kube-storage-version-migrator-operator-b67b599dd-m6pvw\" (UID: \"5b1d3e40-8458-4661-854f-c16ab4cd7596\") " pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-m6pvw" Jan 28 20:41:55 crc kubenswrapper[4746]: I0128 20:41:55.474180 4746 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/3c71e425-b304-49f8-ac53-5e1383f73eb7-serving-cert\") pod \"service-ca-operator-777779d784-62cbt\" (UID: \"3c71e425-b304-49f8-ac53-5e1383f73eb7\") " pod="openshift-service-ca-operator/service-ca-operator-777779d784-62cbt" Jan 28 20:41:55 crc kubenswrapper[4746]: I0128 20:41:55.474205 4746 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-lwdsp\" (UniqueName: \"kubernetes.io/projected/627f2e7c-f091-4ea2-9c3c-fce02f2b7669-kube-api-access-lwdsp\") pod \"image-registry-697d97f7c8-lwhk4\" (UID: \"627f2e7c-f091-4ea2-9c3c-fce02f2b7669\") " pod="openshift-image-registry/image-registry-697d97f7c8-lwhk4" Jan 28 20:41:55 crc kubenswrapper[4746]: I0128 20:41:55.474250 4746 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/627f2e7c-f091-4ea2-9c3c-fce02f2b7669-bound-sa-token\") pod \"image-registry-697d97f7c8-lwhk4\" (UID: \"627f2e7c-f091-4ea2-9c3c-fce02f2b7669\") " pod="openshift-image-registry/image-registry-697d97f7c8-lwhk4" Jan 28 20:41:55 crc kubenswrapper[4746]: I0128 20:41:55.474268 4746 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/627f2e7c-f091-4ea2-9c3c-fce02f2b7669-installation-pull-secrets\") pod \"image-registry-697d97f7c8-lwhk4\" (UID: \"627f2e7c-f091-4ea2-9c3c-fce02f2b7669\") " pod="openshift-image-registry/image-registry-697d97f7c8-lwhk4" Jan 28 20:41:55 crc 
kubenswrapper[4746]: I0128 20:41:55.474337 4746 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"mcc-auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/bcb19d77-0dbf-4d14-a86c-4cd7e65211e0-mcc-auth-proxy-config\") pod \"machine-config-controller-84d6567774-rbl7q\" (UID: \"bcb19d77-0dbf-4d14-a86c-4cd7e65211e0\") " pod="openshift-machine-config-operator/machine-config-controller-84d6567774-rbl7q" Jan 28 20:41:55 crc kubenswrapper[4746]: I0128 20:41:55.478554 4746 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-etcd-operator/etcd-operator-b45778765-dbwsb" Jan 28 20:41:55 crc kubenswrapper[4746]: E0128 20:41:55.486461 4746 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-01-28 20:41:55.986426634 +0000 UTC m=+143.942612988 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-lwhk4" (UID: "627f2e7c-f091-4ea2-9c3c-fce02f2b7669") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Jan 28 20:41:55 crc kubenswrapper[4746]: I0128 20:41:55.520051 4746 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ingress-operator/ingress-operator-5b745b69d9-h2q6n" Jan 28 20:41:55 crc kubenswrapper[4746]: I0128 20:41:55.538309 4746 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-8vvmj" Jan 28 20:41:55 crc kubenswrapper[4746]: I0128 20:41:55.541714 4746 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-z2vr7"] Jan 28 20:41:55 crc kubenswrapper[4746]: I0128 20:41:55.546892 4746 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-authentication/oauth-openshift-558db77b4-4g4p7"] Jan 28 20:41:55 crc kubenswrapper[4746]: I0128 20:41:55.552770 4746 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-5gjgg"] Jan 28 20:41:55 crc kubenswrapper[4746]: I0128 20:41:55.555550 4746 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-authentication-operator/authentication-operator-69f744f599-8mmbg"] Jan 28 20:41:55 crc kubenswrapper[4746]: I0128 20:41:55.553068 4746 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-hlqz9" Jan 28 20:41:55 crc kubenswrapper[4746]: I0128 20:41:55.558296 4746 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-apiserver/apiserver-76f77b778f-sk7xz" Jan 28 20:41:55 crc kubenswrapper[4746]: I0128 20:41:55.559960 4746 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-machine-api/machine-api-operator-5694c8668f-lzj8l"] Jan 28 20:41:55 crc kubenswrapper[4746]: I0128 20:41:55.567493 4746 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29493870-4jnh6" Jan 28 20:41:55 crc kubenswrapper[4746]: I0128 20:41:55.574837 4746 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-zzrh5" Jan 28 20:41:55 crc kubenswrapper[4746]: I0128 20:41:55.580348 4746 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Jan 28 20:41:55 crc kubenswrapper[4746]: I0128 20:41:55.581279 4746 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/250aa960-c630-48e1-b8a2-8b34917bccb1-config-volume\") pod \"dns-default-6hnhg\" (UID: \"250aa960-c630-48e1-b8a2-8b34917bccb1\") " pod="openshift-dns/dns-default-6hnhg" Jan 28 20:41:55 crc kubenswrapper[4746]: I0128 20:41:55.581481 4746 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-5ptth\" (UniqueName: \"kubernetes.io/projected/250aa960-c630-48e1-b8a2-8b34917bccb1-kube-api-access-5ptth\") pod \"dns-default-6hnhg\" (UID: \"250aa960-c630-48e1-b8a2-8b34917bccb1\") " pod="openshift-dns/dns-default-6hnhg" Jan 28 20:41:55 crc kubenswrapper[4746]: I0128 20:41:55.581513 4746 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/627f2e7c-f091-4ea2-9c3c-fce02f2b7669-trusted-ca\") pod \"image-registry-697d97f7c8-lwhk4\" (UID: \"627f2e7c-f091-4ea2-9c3c-fce02f2b7669\") " pod="openshift-image-registry/image-registry-697d97f7c8-lwhk4" Jan 28 20:41:55 crc kubenswrapper[4746]: I0128 20:41:55.581558 4746 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/5b1d3e40-8458-4661-854f-c16ab4cd7596-config\") pod 
\"kube-storage-version-migrator-operator-b67b599dd-m6pvw\" (UID: \"5b1d3e40-8458-4661-854f-c16ab4cd7596\") " pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-m6pvw" Jan 28 20:41:55 crc kubenswrapper[4746]: I0128 20:41:55.581654 4746 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"signing-cabundle\" (UniqueName: \"kubernetes.io/configmap/46800869-0687-4288-9a6e-3512b2e2c499-signing-cabundle\") pod \"service-ca-9c57cc56f-vfcnn\" (UID: \"46800869-0687-4288-9a6e-3512b2e2c499\") " pod="openshift-service-ca/service-ca-9c57cc56f-vfcnn" Jan 28 20:41:55 crc kubenswrapper[4746]: I0128 20:41:55.581749 4746 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/5b1d3e40-8458-4661-854f-c16ab4cd7596-serving-cert\") pod \"kube-storage-version-migrator-operator-b67b599dd-m6pvw\" (UID: \"5b1d3e40-8458-4661-854f-c16ab4cd7596\") " pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-m6pvw" Jan 28 20:41:55 crc kubenswrapper[4746]: I0128 20:41:55.581773 4746 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"socket-dir\" (UniqueName: \"kubernetes.io/host-path/058433b4-653b-4170-83e2-ed7c5d753323-socket-dir\") pod \"csi-hostpathplugin-89zbv\" (UID: \"058433b4-653b-4170-83e2-ed7c5d753323\") " pod="hostpath-provisioner/csi-hostpathplugin-89zbv" Jan 28 20:41:55 crc kubenswrapper[4746]: I0128 20:41:55.581870 4746 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/3c71e425-b304-49f8-ac53-5e1383f73eb7-serving-cert\") pod \"service-ca-operator-777779d784-62cbt\" (UID: \"3c71e425-b304-49f8-ac53-5e1383f73eb7\") " pod="openshift-service-ca-operator/service-ca-operator-777779d784-62cbt" Jan 28 20:41:55 crc kubenswrapper[4746]: I0128 20:41:55.582038 4746 
reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-csb4d\" (UniqueName: \"kubernetes.io/projected/ef8a3065-87eb-468f-a985-936f973c8f1a-kube-api-access-csb4d\") pod \"ingress-canary-wpwwb\" (UID: \"ef8a3065-87eb-468f-a985-936f973c8f1a\") " pod="openshift-ingress-canary/ingress-canary-wpwwb" Jan 28 20:41:55 crc kubenswrapper[4746]: I0128 20:41:55.582065 4746 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-lwdsp\" (UniqueName: \"kubernetes.io/projected/627f2e7c-f091-4ea2-9c3c-fce02f2b7669-kube-api-access-lwdsp\") pod \"image-registry-697d97f7c8-lwhk4\" (UID: \"627f2e7c-f091-4ea2-9c3c-fce02f2b7669\") " pod="openshift-image-registry/image-registry-697d97f7c8-lwhk4" Jan 28 20:41:55 crc kubenswrapper[4746]: I0128 20:41:55.582107 4746 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"signing-key\" (UniqueName: \"kubernetes.io/secret/46800869-0687-4288-9a6e-3512b2e2c499-signing-key\") pod \"service-ca-9c57cc56f-vfcnn\" (UID: \"46800869-0687-4288-9a6e-3512b2e2c499\") " pod="openshift-service-ca/service-ca-9c57cc56f-vfcnn" Jan 28 20:41:55 crc kubenswrapper[4746]: I0128 20:41:55.582155 4746 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/627f2e7c-f091-4ea2-9c3c-fce02f2b7669-bound-sa-token\") pod \"image-registry-697d97f7c8-lwhk4\" (UID: \"627f2e7c-f091-4ea2-9c3c-fce02f2b7669\") " pod="openshift-image-registry/image-registry-697d97f7c8-lwhk4" Jan 28 20:41:55 crc kubenswrapper[4746]: I0128 20:41:55.582213 4746 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/627f2e7c-f091-4ea2-9c3c-fce02f2b7669-installation-pull-secrets\") pod \"image-registry-697d97f7c8-lwhk4\" (UID: \"627f2e7c-f091-4ea2-9c3c-fce02f2b7669\") " 
pod="openshift-image-registry/image-registry-697d97f7c8-lwhk4"
Jan 28 20:41:55 crc kubenswrapper[4746]: I0128 20:41:55.582299 4746 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"marketplace-operator-metrics\" (UniqueName: \"kubernetes.io/secret/3edaca00-e1a6-4b56-9290-cad6311263ee-marketplace-operator-metrics\") pod \"marketplace-operator-79b997595-sqh2q\" (UID: \"3edaca00-e1a6-4b56-9290-cad6311263ee\") " pod="openshift-marketplace/marketplace-operator-79b997595-sqh2q"
Jan 28 20:41:55 crc kubenswrapper[4746]: I0128 20:41:55.582377 4746 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-dfnjg\" (UniqueName: \"kubernetes.io/projected/46800869-0687-4288-9a6e-3512b2e2c499-kube-api-access-dfnjg\") pod \"service-ca-9c57cc56f-vfcnn\" (UID: \"46800869-0687-4288-9a6e-3512b2e2c499\") " pod="openshift-service-ca/service-ca-9c57cc56f-vfcnn"
Jan 28 20:41:55 crc kubenswrapper[4746]: I0128 20:41:55.582418 4746 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-9fx5j\" (UniqueName: \"kubernetes.io/projected/7d1acc2c-a42e-40a6-8ef5-c7e10ce548ee-kube-api-access-9fx5j\") pod \"olm-operator-6b444d44fb-27vdq\" (UID: \"7d1acc2c-a42e-40a6-8ef5-c7e10ce548ee\") " pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-27vdq"
Jan 28 20:41:55 crc kubenswrapper[4746]: I0128 20:41:55.582470 4746 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-bootstrap-token\" (UniqueName: \"kubernetes.io/secret/1f9d4298-e60e-4a66-94fd-20b80ac131cf-node-bootstrap-token\") pod \"machine-config-server-q9qqq\" (UID: \"1f9d4298-e60e-4a66-94fd-20b80ac131cf\") " pod="openshift-machine-config-operator/machine-config-server-q9qqq"
Jan 28 20:41:55 crc kubenswrapper[4746]: I0128 20:41:55.582504 4746 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-fgsjl\" (UniqueName: \"kubernetes.io/projected/89ebd250-beb4-4c8e-8889-bd221f68af5e-kube-api-access-fgsjl\") pod \"machine-config-operator-74547568cd-wh6h4\" (UID: \"89ebd250-beb4-4c8e-8889-bd221f68af5e\") " pod="openshift-machine-config-operator/machine-config-operator-74547568cd-wh6h4"
Jan 28 20:41:55 crc kubenswrapper[4746]: I0128 20:41:55.582522 4746 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"marketplace-trusted-ca\" (UniqueName: \"kubernetes.io/configmap/3edaca00-e1a6-4b56-9290-cad6311263ee-marketplace-trusted-ca\") pod \"marketplace-operator-79b997595-sqh2q\" (UID: \"3edaca00-e1a6-4b56-9290-cad6311263ee\") " pod="openshift-marketplace/marketplace-operator-79b997595-sqh2q"
Jan 28 20:41:55 crc kubenswrapper[4746]: I0128 20:41:55.582654 4746 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"images\" (UniqueName: \"kubernetes.io/configmap/89ebd250-beb4-4c8e-8889-bd221f68af5e-images\") pod \"machine-config-operator-74547568cd-wh6h4\" (UID: \"89ebd250-beb4-4c8e-8889-bd221f68af5e\") " pod="openshift-machine-config-operator/machine-config-operator-74547568cd-wh6h4"
Jan 28 20:41:55 crc kubenswrapper[4746]: I0128 20:41:55.582684 4746 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"mcc-auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/bcb19d77-0dbf-4d14-a86c-4cd7e65211e0-mcc-auth-proxy-config\") pod \"machine-config-controller-84d6567774-rbl7q\" (UID: \"bcb19d77-0dbf-4d14-a86c-4cd7e65211e0\") " pod="openshift-machine-config-operator/machine-config-controller-84d6567774-rbl7q"
Jan 28 20:41:55 crc kubenswrapper[4746]: I0128 20:41:55.582702 4746 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"srv-cert\" (UniqueName: \"kubernetes.io/secret/7d1acc2c-a42e-40a6-8ef5-c7e10ce548ee-srv-cert\") pod \"olm-operator-6b444d44fb-27vdq\" (UID: \"7d1acc2c-a42e-40a6-8ef5-c7e10ce548ee\") " pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-27vdq"
Jan 28 20:41:55 crc kubenswrapper[4746]: I0128 20:41:55.582764 4746 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/250aa960-c630-48e1-b8a2-8b34917bccb1-metrics-tls\") pod \"dns-default-6hnhg\" (UID: \"250aa960-c630-48e1-b8a2-8b34917bccb1\") " pod="openshift-dns/dns-default-6hnhg"
Jan 28 20:41:55 crc kubenswrapper[4746]: I0128 20:41:55.582785 4746 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"registration-dir\" (UniqueName: \"kubernetes.io/host-path/058433b4-653b-4170-83e2-ed7c5d753323-registration-dir\") pod \"csi-hostpathplugin-89zbv\" (UID: \"058433b4-653b-4170-83e2-ed7c5d753323\") " pod="hostpath-provisioner/csi-hostpathplugin-89zbv"
Jan 28 20:41:55 crc kubenswrapper[4746]: I0128 20:41:55.582845 4746 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/3c71e425-b304-49f8-ac53-5e1383f73eb7-config\") pod \"service-ca-operator-777779d784-62cbt\" (UID: \"3c71e425-b304-49f8-ac53-5e1383f73eb7\") " pod="openshift-service-ca-operator/service-ca-operator-777779d784-62cbt"
Jan 28 20:41:55 crc kubenswrapper[4746]: I0128 20:41:55.582863 4746 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-t8lv5\" (UniqueName: \"kubernetes.io/projected/bcb19d77-0dbf-4d14-a86c-4cd7e65211e0-kube-api-access-t8lv5\") pod \"machine-config-controller-84d6567774-rbl7q\" (UID: \"bcb19d77-0dbf-4d14-a86c-4cd7e65211e0\") " pod="openshift-machine-config-operator/machine-config-controller-84d6567774-rbl7q"
Jan 28 20:41:55 crc kubenswrapper[4746]: I0128 20:41:55.582934 4746 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/627f2e7c-f091-4ea2-9c3c-fce02f2b7669-registry-certificates\") pod \"image-registry-697d97f7c8-lwhk4\" (UID: \"627f2e7c-f091-4ea2-9c3c-fce02f2b7669\") " pod="openshift-image-registry/image-registry-697d97f7c8-lwhk4"
Jan 28 20:41:55 crc kubenswrapper[4746]: I0128 20:41:55.582967 4746 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/89ebd250-beb4-4c8e-8889-bd221f68af5e-proxy-tls\") pod \"machine-config-operator-74547568cd-wh6h4\" (UID: \"89ebd250-beb4-4c8e-8889-bd221f68af5e\") " pod="openshift-machine-config-operator/machine-config-operator-74547568cd-wh6h4"
Jan 28 20:41:55 crc kubenswrapper[4746]: I0128 20:41:55.582985 4746 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-fq6tq\" (UniqueName: \"kubernetes.io/projected/1f9d4298-e60e-4a66-94fd-20b80ac131cf-kube-api-access-fq6tq\") pod \"machine-config-server-q9qqq\" (UID: \"1f9d4298-e60e-4a66-94fd-20b80ac131cf\") " pod="openshift-machine-config-operator/machine-config-server-q9qqq"
Jan 28 20:41:55 crc kubenswrapper[4746]: I0128 20:41:55.583010 4746 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"plugins-dir\" (UniqueName: \"kubernetes.io/host-path/058433b4-653b-4170-83e2-ed7c5d753323-plugins-dir\") pod \"csi-hostpathplugin-89zbv\" (UID: \"058433b4-653b-4170-83e2-ed7c5d753323\") " pod="hostpath-provisioner/csi-hostpathplugin-89zbv"
Jan 28 20:41:55 crc kubenswrapper[4746]: I0128 20:41:55.583093 4746 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/627f2e7c-f091-4ea2-9c3c-fce02f2b7669-registry-tls\") pod \"image-registry-697d97f7c8-lwhk4\" (UID: \"627f2e7c-f091-4ea2-9c3c-fce02f2b7669\") " pod="openshift-image-registry/image-registry-697d97f7c8-lwhk4"
Jan 28 20:41:55 crc kubenswrapper[4746]: I0128 20:41:55.583110 4746 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-tk9vt\" (UniqueName: \"kubernetes.io/projected/058433b4-653b-4170-83e2-ed7c5d753323-kube-api-access-tk9vt\") pod \"csi-hostpathplugin-89zbv\" (UID: \"058433b4-653b-4170-83e2-ed7c5d753323\") " pod="hostpath-provisioner/csi-hostpathplugin-89zbv"
Jan 28 20:41:55 crc kubenswrapper[4746]: I0128 20:41:55.583140 4746 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/627f2e7c-f091-4ea2-9c3c-fce02f2b7669-ca-trust-extracted\") pod \"image-registry-697d97f7c8-lwhk4\" (UID: \"627f2e7c-f091-4ea2-9c3c-fce02f2b7669\") " pod="openshift-image-registry/image-registry-697d97f7c8-lwhk4"
Jan 28 20:41:55 crc kubenswrapper[4746]: I0128 20:41:55.583179 4746 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/89ebd250-beb4-4c8e-8889-bd221f68af5e-auth-proxy-config\") pod \"machine-config-operator-74547568cd-wh6h4\" (UID: \"89ebd250-beb4-4c8e-8889-bd221f68af5e\") " pod="openshift-machine-config-operator/machine-config-operator-74547568cd-wh6h4"
Jan 28 20:41:55 crc kubenswrapper[4746]: I0128 20:41:55.583201 4746 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"profile-collector-cert\" (UniqueName: \"kubernetes.io/secret/7d1acc2c-a42e-40a6-8ef5-c7e10ce548ee-profile-collector-cert\") pod \"olm-operator-6b444d44fb-27vdq\" (UID: \"7d1acc2c-a42e-40a6-8ef5-c7e10ce548ee\") " pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-27vdq"
Jan 28 20:41:55 crc kubenswrapper[4746]: I0128 20:41:55.583217 4746 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/ef8a3065-87eb-468f-a985-936f973c8f1a-cert\") pod \"ingress-canary-wpwwb\" (UID: \"ef8a3065-87eb-468f-a985-936f973c8f1a\") " pod="openshift-ingress-canary/ingress-canary-wpwwb"
Jan 28 20:41:55 crc kubenswrapper[4746]: I0128 20:41:55.583232 4746 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"certs\" (UniqueName: \"kubernetes.io/secret/1f9d4298-e60e-4a66-94fd-20b80ac131cf-certs\") pod \"machine-config-server-q9qqq\" (UID: \"1f9d4298-e60e-4a66-94fd-20b80ac131cf\") " pod="openshift-machine-config-operator/machine-config-server-q9qqq"
Jan 28 20:41:55 crc kubenswrapper[4746]: I0128 20:41:55.583291 4746 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-5p4mh\" (UniqueName: \"kubernetes.io/projected/5b1d3e40-8458-4661-854f-c16ab4cd7596-kube-api-access-5p4mh\") pod \"kube-storage-version-migrator-operator-b67b599dd-m6pvw\" (UID: \"5b1d3e40-8458-4661-854f-c16ab4cd7596\") " pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-m6pvw"
Jan 28 20:41:55 crc kubenswrapper[4746]: I0128 20:41:55.583359 4746 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-kbstf\" (UniqueName: \"kubernetes.io/projected/3edaca00-e1a6-4b56-9290-cad6311263ee-kube-api-access-kbstf\") pod \"marketplace-operator-79b997595-sqh2q\" (UID: \"3edaca00-e1a6-4b56-9290-cad6311263ee\") " pod="openshift-marketplace/marketplace-operator-79b997595-sqh2q"
Jan 28 20:41:55 crc kubenswrapper[4746]: I0128 20:41:55.583382 4746 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"csi-data-dir\" (UniqueName: \"kubernetes.io/host-path/058433b4-653b-4170-83e2-ed7c5d753323-csi-data-dir\") pod \"csi-hostpathplugin-89zbv\" (UID: \"058433b4-653b-4170-83e2-ed7c5d753323\") " pod="hostpath-provisioner/csi-hostpathplugin-89zbv"
Jan 28 20:41:55 crc kubenswrapper[4746]: I0128 20:41:55.583436 4746 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-sz6vv\" (UniqueName: \"kubernetes.io/projected/3c71e425-b304-49f8-ac53-5e1383f73eb7-kube-api-access-sz6vv\") pod \"service-ca-operator-777779d784-62cbt\" (UID: \"3c71e425-b304-49f8-ac53-5e1383f73eb7\") " pod="openshift-service-ca-operator/service-ca-operator-777779d784-62cbt"
Jan 28 20:41:55 crc kubenswrapper[4746]: I0128 20:41:55.583546 4746 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/bcb19d77-0dbf-4d14-a86c-4cd7e65211e0-proxy-tls\") pod \"machine-config-controller-84d6567774-rbl7q\" (UID: \"bcb19d77-0dbf-4d14-a86c-4cd7e65211e0\") " pod="openshift-machine-config-operator/machine-config-controller-84d6567774-rbl7q"
Jan 28 20:41:55 crc kubenswrapper[4746]: I0128 20:41:55.583618 4746 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"mountpoint-dir\" (UniqueName: \"kubernetes.io/host-path/058433b4-653b-4170-83e2-ed7c5d753323-mountpoint-dir\") pod \"csi-hostpathplugin-89zbv\" (UID: \"058433b4-653b-4170-83e2-ed7c5d753323\") " pod="hostpath-provisioner/csi-hostpathplugin-89zbv"
Jan 28 20:41:55 crc kubenswrapper[4746]: E0128 20:41:55.583807 4746 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-01-28 20:41:56.083746953 +0000 UTC m=+144.039933307 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Jan 28 20:41:55 crc kubenswrapper[4746]: I0128 20:41:55.584612 4746 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"mcc-auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/bcb19d77-0dbf-4d14-a86c-4cd7e65211e0-mcc-auth-proxy-config\") pod \"machine-config-controller-84d6567774-rbl7q\" (UID: \"bcb19d77-0dbf-4d14-a86c-4cd7e65211e0\") " pod="openshift-machine-config-operator/machine-config-controller-84d6567774-rbl7q"
Jan 28 20:41:55 crc kubenswrapper[4746]: I0128 20:41:55.586339 4746 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/5b1d3e40-8458-4661-854f-c16ab4cd7596-config\") pod \"kube-storage-version-migrator-operator-b67b599dd-m6pvw\" (UID: \"5b1d3e40-8458-4661-854f-c16ab4cd7596\") " pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-m6pvw"
Jan 28 20:41:55 crc kubenswrapper[4746]: I0128 20:41:55.587237 4746 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/627f2e7c-f091-4ea2-9c3c-fce02f2b7669-trusted-ca\") pod \"image-registry-697d97f7c8-lwhk4\" (UID: \"627f2e7c-f091-4ea2-9c3c-fce02f2b7669\") " pod="openshift-image-registry/image-registry-697d97f7c8-lwhk4"
Jan 28 20:41:55 crc kubenswrapper[4746]: I0128 20:41:55.587889 4746 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/627f2e7c-f091-4ea2-9c3c-fce02f2b7669-ca-trust-extracted\") pod \"image-registry-697d97f7c8-lwhk4\" (UID: \"627f2e7c-f091-4ea2-9c3c-fce02f2b7669\") " pod="openshift-image-registry/image-registry-697d97f7c8-lwhk4"
Jan 28 20:41:55 crc kubenswrapper[4746]: I0128 20:41:55.590486 4746 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/3c71e425-b304-49f8-ac53-5e1383f73eb7-config\") pod \"service-ca-operator-777779d784-62cbt\" (UID: \"3c71e425-b304-49f8-ac53-5e1383f73eb7\") " pod="openshift-service-ca-operator/service-ca-operator-777779d784-62cbt"
Jan 28 20:41:55 crc kubenswrapper[4746]: I0128 20:41:55.595931 4746 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/627f2e7c-f091-4ea2-9c3c-fce02f2b7669-registry-certificates\") pod \"image-registry-697d97f7c8-lwhk4\" (UID: \"627f2e7c-f091-4ea2-9c3c-fce02f2b7669\") " pod="openshift-image-registry/image-registry-697d97f7c8-lwhk4"
Jan 28 20:41:55 crc kubenswrapper[4746]: I0128 20:41:55.601003 4746 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/bcb19d77-0dbf-4d14-a86c-4cd7e65211e0-proxy-tls\") pod \"machine-config-controller-84d6567774-rbl7q\" (UID: \"bcb19d77-0dbf-4d14-a86c-4cd7e65211e0\") " pod="openshift-machine-config-operator/machine-config-controller-84d6567774-rbl7q"
Jan 28 20:41:55 crc kubenswrapper[4746]: I0128 20:41:55.601435 4746 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/5b1d3e40-8458-4661-854f-c16ab4cd7596-serving-cert\") pod \"kube-storage-version-migrator-operator-b67b599dd-m6pvw\" (UID: \"5b1d3e40-8458-4661-854f-c16ab4cd7596\") " pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-m6pvw"
Jan 28 20:41:55 crc kubenswrapper[4746]: I0128 20:41:55.602307 4746 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/627f2e7c-f091-4ea2-9c3c-fce02f2b7669-registry-tls\") pod \"image-registry-697d97f7c8-lwhk4\" (UID: \"627f2e7c-f091-4ea2-9c3c-fce02f2b7669\") " pod="openshift-image-registry/image-registry-697d97f7c8-lwhk4"
Jan 28 20:41:55 crc kubenswrapper[4746]: I0128 20:41:55.602564 4746 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/3c71e425-b304-49f8-ac53-5e1383f73eb7-serving-cert\") pod \"service-ca-operator-777779d784-62cbt\" (UID: \"3c71e425-b304-49f8-ac53-5e1383f73eb7\") " pod="openshift-service-ca-operator/service-ca-operator-777779d784-62cbt"
Jan 28 20:41:55 crc kubenswrapper[4746]: I0128 20:41:55.617073 4746 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/627f2e7c-f091-4ea2-9c3c-fce02f2b7669-installation-pull-secrets\") pod \"image-registry-697d97f7c8-lwhk4\" (UID: \"627f2e7c-f091-4ea2-9c3c-fce02f2b7669\") " pod="openshift-image-registry/image-registry-697d97f7c8-lwhk4"
Jan 28 20:41:55 crc kubenswrapper[4746]: I0128 20:41:55.655568 4746 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-console-operator/console-operator-58897d9998-4cmsz" event={"ID":"0ea55ab8-bec0-44e8-8105-c8c604fc5fa1","Type":"ContainerStarted","Data":"727716823ec0e733437aa6c1d25de73ec8cc728a877953eaa49b86e9bc7607db"}
Jan 28 20:41:55 crc kubenswrapper[4746]: I0128 20:41:55.660836 4746 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-lwdsp\" (UniqueName: \"kubernetes.io/projected/627f2e7c-f091-4ea2-9c3c-fce02f2b7669-kube-api-access-lwdsp\") pod \"image-registry-697d97f7c8-lwhk4\" (UID: \"627f2e7c-f091-4ea2-9c3c-fce02f2b7669\") " pod="openshift-image-registry/image-registry-697d97f7c8-lwhk4"
Jan 28 20:41:55 crc kubenswrapper[4746]: I0128 20:41:55.664649 4746 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-z2vr7" event={"ID":"8c7789bf-f2f7-4cf4-97f1-3d8a7438daca","Type":"ContainerStarted","Data":"5d337105a2a0191ec76d6f868c57949df765832dd481abfa8e50955596149432"}
Jan 28 20:41:55 crc kubenswrapper[4746]: I0128 20:41:55.671318 4746 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ingress/router-default-5444994796-rjmhd" event={"ID":"c068f936-795b-4eb3-83a8-e363131119e9","Type":"ContainerStarted","Data":"94d0c6fd2320a0997dc9d81605843fecb4c744ccc7f8052d231377f07ae9365b"}
Jan 28 20:41:55 crc kubenswrapper[4746]: I0128 20:41:55.671380 4746 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ingress/router-default-5444994796-rjmhd" event={"ID":"c068f936-795b-4eb3-83a8-e363131119e9","Type":"ContainerStarted","Data":"152665fe44da0a3b4d983ade4bd27c39ce9cd48ac5ba4fcd0bae0bb5c183fe4e"}
Jan 28 20:41:55 crc kubenswrapper[4746]: I0128 20:41:55.687769 4746 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-sz6vv\" (UniqueName: \"kubernetes.io/projected/3c71e425-b304-49f8-ac53-5e1383f73eb7-kube-api-access-sz6vv\") pod \"service-ca-operator-777779d784-62cbt\" (UID: \"3c71e425-b304-49f8-ac53-5e1383f73eb7\") " pod="openshift-service-ca-operator/service-ca-operator-777779d784-62cbt"
Jan 28 20:41:55 crc kubenswrapper[4746]: I0128 20:41:55.688461 4746 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-5p4mh\" (UniqueName: \"kubernetes.io/projected/5b1d3e40-8458-4661-854f-c16ab4cd7596-kube-api-access-5p4mh\") pod \"kube-storage-version-migrator-operator-b67b599dd-m6pvw\" (UID: \"5b1d3e40-8458-4661-854f-c16ab4cd7596\") " pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-m6pvw"
Jan 28 20:41:55 crc kubenswrapper[4746]: I0128 20:41:55.689784 4746 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"signing-cabundle\" (UniqueName: \"kubernetes.io/configmap/46800869-0687-4288-9a6e-3512b2e2c499-signing-cabundle\") pod \"service-ca-9c57cc56f-vfcnn\" (UID: \"46800869-0687-4288-9a6e-3512b2e2c499\") " pod="openshift-service-ca/service-ca-9c57cc56f-vfcnn"
Jan 28 20:41:55 crc kubenswrapper[4746]: I0128 20:41:55.689840 4746 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"socket-dir\" (UniqueName: \"kubernetes.io/host-path/058433b4-653b-4170-83e2-ed7c5d753323-socket-dir\") pod \"csi-hostpathplugin-89zbv\" (UID: \"058433b4-653b-4170-83e2-ed7c5d753323\") " pod="hostpath-provisioner/csi-hostpathplugin-89zbv"
Jan 28 20:41:55 crc kubenswrapper[4746]: I0128 20:41:55.689877 4746 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-csb4d\" (UniqueName: \"kubernetes.io/projected/ef8a3065-87eb-468f-a985-936f973c8f1a-kube-api-access-csb4d\") pod \"ingress-canary-wpwwb\" (UID: \"ef8a3065-87eb-468f-a985-936f973c8f1a\") " pod="openshift-ingress-canary/ingress-canary-wpwwb"
Jan 28 20:41:55 crc kubenswrapper[4746]: I0128 20:41:55.689910 4746 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"signing-key\" (UniqueName: \"kubernetes.io/secret/46800869-0687-4288-9a6e-3512b2e2c499-signing-key\") pod \"service-ca-9c57cc56f-vfcnn\" (UID: \"46800869-0687-4288-9a6e-3512b2e2c499\") " pod="openshift-service-ca/service-ca-9c57cc56f-vfcnn"
Jan 28 20:41:55 crc kubenswrapper[4746]: I0128 20:41:55.689956 4746 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"marketplace-operator-metrics\" (UniqueName: \"kubernetes.io/secret/3edaca00-e1a6-4b56-9290-cad6311263ee-marketplace-operator-metrics\") pod \"marketplace-operator-79b997595-sqh2q\" (UID: \"3edaca00-e1a6-4b56-9290-cad6311263ee\") " pod="openshift-marketplace/marketplace-operator-79b997595-sqh2q"
Jan 28 20:41:55 crc kubenswrapper[4746]: I0128 20:41:55.689993 4746 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-dfnjg\" (UniqueName: \"kubernetes.io/projected/46800869-0687-4288-9a6e-3512b2e2c499-kube-api-access-dfnjg\") pod \"service-ca-9c57cc56f-vfcnn\" (UID: \"46800869-0687-4288-9a6e-3512b2e2c499\") " pod="openshift-service-ca/service-ca-9c57cc56f-vfcnn"
Jan 28 20:41:55 crc kubenswrapper[4746]: I0128 20:41:55.690024 4746 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-9fx5j\" (UniqueName: \"kubernetes.io/projected/7d1acc2c-a42e-40a6-8ef5-c7e10ce548ee-kube-api-access-9fx5j\") pod \"olm-operator-6b444d44fb-27vdq\" (UID: \"7d1acc2c-a42e-40a6-8ef5-c7e10ce548ee\") " pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-27vdq"
Jan 28 20:41:55 crc kubenswrapper[4746]: I0128 20:41:55.690059 4746 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"node-bootstrap-token\" (UniqueName: \"kubernetes.io/secret/1f9d4298-e60e-4a66-94fd-20b80ac131cf-node-bootstrap-token\") pod \"machine-config-server-q9qqq\" (UID: \"1f9d4298-e60e-4a66-94fd-20b80ac131cf\") " pod="openshift-machine-config-operator/machine-config-server-q9qqq"
Jan 28 20:41:55 crc kubenswrapper[4746]: I0128 20:41:55.690104 4746 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"marketplace-trusted-ca\" (UniqueName: \"kubernetes.io/configmap/3edaca00-e1a6-4b56-9290-cad6311263ee-marketplace-trusted-ca\") pod \"marketplace-operator-79b997595-sqh2q\" (UID: \"3edaca00-e1a6-4b56-9290-cad6311263ee\") " pod="openshift-marketplace/marketplace-operator-79b997595-sqh2q"
Jan 28 20:41:55 crc kubenswrapper[4746]: I0128 20:41:55.690138 4746 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-fgsjl\" (UniqueName: \"kubernetes.io/projected/89ebd250-beb4-4c8e-8889-bd221f68af5e-kube-api-access-fgsjl\") pod \"machine-config-operator-74547568cd-wh6h4\" (UID: \"89ebd250-beb4-4c8e-8889-bd221f68af5e\") " pod="openshift-machine-config-operator/machine-config-operator-74547568cd-wh6h4"
Jan 28 20:41:55 crc kubenswrapper[4746]: I0128 20:41:55.690193 4746 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"images\" (UniqueName: \"kubernetes.io/configmap/89ebd250-beb4-4c8e-8889-bd221f68af5e-images\") pod \"machine-config-operator-74547568cd-wh6h4\" (UID: \"89ebd250-beb4-4c8e-8889-bd221f68af5e\") " pod="openshift-machine-config-operator/machine-config-operator-74547568cd-wh6h4"
Jan 28 20:41:55 crc kubenswrapper[4746]: I0128 20:41:55.690220 4746 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"srv-cert\" (UniqueName: \"kubernetes.io/secret/7d1acc2c-a42e-40a6-8ef5-c7e10ce548ee-srv-cert\") pod \"olm-operator-6b444d44fb-27vdq\" (UID: \"7d1acc2c-a42e-40a6-8ef5-c7e10ce548ee\") " pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-27vdq"
Jan 28 20:41:55 crc kubenswrapper[4746]: I0128 20:41:55.690254 4746 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/250aa960-c630-48e1-b8a2-8b34917bccb1-metrics-tls\") pod \"dns-default-6hnhg\" (UID: \"250aa960-c630-48e1-b8a2-8b34917bccb1\") " pod="openshift-dns/dns-default-6hnhg"
Jan 28 20:41:55 crc kubenswrapper[4746]: I0128 20:41:55.690282 4746 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"registration-dir\" (UniqueName: \"kubernetes.io/host-path/058433b4-653b-4170-83e2-ed7c5d753323-registration-dir\") pod \"csi-hostpathplugin-89zbv\" (UID: \"058433b4-653b-4170-83e2-ed7c5d753323\") " pod="hostpath-provisioner/csi-hostpathplugin-89zbv"
Jan 28 20:41:55 crc kubenswrapper[4746]: I0128 20:41:55.690331 4746 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/89ebd250-beb4-4c8e-8889-bd221f68af5e-proxy-tls\") pod \"machine-config-operator-74547568cd-wh6h4\" (UID: \"89ebd250-beb4-4c8e-8889-bd221f68af5e\") " pod="openshift-machine-config-operator/machine-config-operator-74547568cd-wh6h4"
Jan 28 20:41:55 crc kubenswrapper[4746]: I0128 20:41:55.690363 4746 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-fq6tq\" (UniqueName: \"kubernetes.io/projected/1f9d4298-e60e-4a66-94fd-20b80ac131cf-kube-api-access-fq6tq\") pod \"machine-config-server-q9qqq\" (UID: \"1f9d4298-e60e-4a66-94fd-20b80ac131cf\") " pod="openshift-machine-config-operator/machine-config-server-q9qqq"
Jan 28 20:41:55 crc kubenswrapper[4746]: I0128 20:41:55.690393 4746 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"plugins-dir\" (UniqueName: \"kubernetes.io/host-path/058433b4-653b-4170-83e2-ed7c5d753323-plugins-dir\") pod \"csi-hostpathplugin-89zbv\" (UID: \"058433b4-653b-4170-83e2-ed7c5d753323\") " pod="hostpath-provisioner/csi-hostpathplugin-89zbv"
Jan 28 20:41:55 crc kubenswrapper[4746]: I0128 20:41:55.690437 4746 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-tk9vt\" (UniqueName: \"kubernetes.io/projected/058433b4-653b-4170-83e2-ed7c5d753323-kube-api-access-tk9vt\") pod \"csi-hostpathplugin-89zbv\" (UID: \"058433b4-653b-4170-83e2-ed7c5d753323\") " pod="hostpath-provisioner/csi-hostpathplugin-89zbv"
Jan 28 20:41:55 crc kubenswrapper[4746]: I0128 20:41:55.690482 4746 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/89ebd250-beb4-4c8e-8889-bd221f68af5e-auth-proxy-config\") pod \"machine-config-operator-74547568cd-wh6h4\" (UID: \"89ebd250-beb4-4c8e-8889-bd221f68af5e\") " pod="openshift-machine-config-operator/machine-config-operator-74547568cd-wh6h4"
Jan 28 20:41:55 crc kubenswrapper[4746]: I0128 20:41:55.690516 4746 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"profile-collector-cert\" (UniqueName: \"kubernetes.io/secret/7d1acc2c-a42e-40a6-8ef5-c7e10ce548ee-profile-collector-cert\") pod \"olm-operator-6b444d44fb-27vdq\" (UID: \"7d1acc2c-a42e-40a6-8ef5-c7e10ce548ee\") " pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-27vdq"
Jan 28 20:41:55 crc kubenswrapper[4746]: I0128 20:41:55.690547 4746 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/ef8a3065-87eb-468f-a985-936f973c8f1a-cert\") pod \"ingress-canary-wpwwb\" (UID: \"ef8a3065-87eb-468f-a985-936f973c8f1a\") " pod="openshift-ingress-canary/ingress-canary-wpwwb"
Jan 28 20:41:55 crc kubenswrapper[4746]: I0128 20:41:55.690573 4746 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"certs\" (UniqueName: \"kubernetes.io/secret/1f9d4298-e60e-4a66-94fd-20b80ac131cf-certs\") pod \"machine-config-server-q9qqq\" (UID: \"1f9d4298-e60e-4a66-94fd-20b80ac131cf\") " pod="openshift-machine-config-operator/machine-config-server-q9qqq"
Jan 28 20:41:55 crc kubenswrapper[4746]: I0128 20:41:55.690614 4746 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-lwhk4\" (UID: \"627f2e7c-f091-4ea2-9c3c-fce02f2b7669\") " pod="openshift-image-registry/image-registry-697d97f7c8-lwhk4"
Jan 28 20:41:55 crc kubenswrapper[4746]: I0128 20:41:55.690645 4746 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-kbstf\" (UniqueName: \"kubernetes.io/projected/3edaca00-e1a6-4b56-9290-cad6311263ee-kube-api-access-kbstf\") pod \"marketplace-operator-79b997595-sqh2q\" (UID: \"3edaca00-e1a6-4b56-9290-cad6311263ee\") " pod="openshift-marketplace/marketplace-operator-79b997595-sqh2q"
Jan 28 20:41:55 crc kubenswrapper[4746]: I0128 20:41:55.690677 4746 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"csi-data-dir\" (UniqueName: \"kubernetes.io/host-path/058433b4-653b-4170-83e2-ed7c5d753323-csi-data-dir\") pod \"csi-hostpathplugin-89zbv\" (UID: \"058433b4-653b-4170-83e2-ed7c5d753323\") " pod="hostpath-provisioner/csi-hostpathplugin-89zbv"
Jan 28 20:41:55 crc kubenswrapper[4746]: I0128 20:41:55.690711 4746 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"mountpoint-dir\" (UniqueName: \"kubernetes.io/host-path/058433b4-653b-4170-83e2-ed7c5d753323-mountpoint-dir\") pod \"csi-hostpathplugin-89zbv\" (UID: \"058433b4-653b-4170-83e2-ed7c5d753323\") " pod="hostpath-provisioner/csi-hostpathplugin-89zbv"
Jan 28 20:41:55 crc kubenswrapper[4746]: I0128 20:41:55.690742 4746 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/250aa960-c630-48e1-b8a2-8b34917bccb1-config-volume\") pod \"dns-default-6hnhg\" (UID: \"250aa960-c630-48e1-b8a2-8b34917bccb1\") " pod="openshift-dns/dns-default-6hnhg"
Jan 28 20:41:55 crc kubenswrapper[4746]: I0128 20:41:55.690770 4746 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-5ptth\" (UniqueName: \"kubernetes.io/projected/250aa960-c630-48e1-b8a2-8b34917bccb1-kube-api-access-5ptth\") pod \"dns-default-6hnhg\" (UID: \"250aa960-c630-48e1-b8a2-8b34917bccb1\") " pod="openshift-dns/dns-default-6hnhg"
Jan 28 20:41:55 crc kubenswrapper[4746]: I0128 20:41:55.691839 4746 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"signing-cabundle\" (UniqueName: \"kubernetes.io/configmap/46800869-0687-4288-9a6e-3512b2e2c499-signing-cabundle\") pod \"service-ca-9c57cc56f-vfcnn\" (UID: \"46800869-0687-4288-9a6e-3512b2e2c499\") " pod="openshift-service-ca/service-ca-9c57cc56f-vfcnn"
Jan 28 20:41:55 crc kubenswrapper[4746]: I0128 20:41:55.692113 4746 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"socket-dir\" (UniqueName: \"kubernetes.io/host-path/058433b4-653b-4170-83e2-ed7c5d753323-socket-dir\") pod \"csi-hostpathplugin-89zbv\" (UID: \"058433b4-653b-4170-83e2-ed7c5d753323\") " pod="hostpath-provisioner/csi-hostpathplugin-89zbv"
Jan 28 20:41:55 crc kubenswrapper[4746]: I0128 20:41:55.697035 4746 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"csi-data-dir\" (UniqueName: \"kubernetes.io/host-path/058433b4-653b-4170-83e2-ed7c5d753323-csi-data-dir\") pod \"csi-hostpathplugin-89zbv\" (UID: \"058433b4-653b-4170-83e2-ed7c5d753323\") " pod="hostpath-provisioner/csi-hostpathplugin-89zbv"
Jan 28 20:41:55 crc kubenswrapper[4746]: E0128 20:41:55.697580 4746 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-01-28 20:41:56.197557224 +0000 UTC m=+144.153743578 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-lwhk4" (UID: "627f2e7c-f091-4ea2-9c3c-fce02f2b7669") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Jan 28 20:41:55 crc kubenswrapper[4746]: I0128 20:41:55.697995 4746 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-authentication-operator/authentication-operator-69f744f599-8mmbg" event={"ID":"ac478ffc-e1e4-4b72-bf99-d35c2636a78d","Type":"ContainerStarted","Data":"53f03a07874ae3c21d80e58722b5901501ebcda41599e534c0224bd732d7a895"}
Jan 28 20:41:55 crc kubenswrapper[4746]: I0128 20:41:55.698239 4746 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"mountpoint-dir\" (UniqueName: \"kubernetes.io/host-path/058433b4-653b-4170-83e2-ed7c5d753323-mountpoint-dir\") pod \"csi-hostpathplugin-89zbv\" (UID: \"058433b4-653b-4170-83e2-ed7c5d753323\") " pod="hostpath-provisioner/csi-hostpathplugin-89zbv"
Jan 28 20:41:55 crc kubenswrapper[4746]: I0128 20:41:55.698926 4746 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/89ebd250-beb4-4c8e-8889-bd221f68af5e-auth-proxy-config\") pod \"machine-config-operator-74547568cd-wh6h4\" (UID: \"89ebd250-beb4-4c8e-8889-bd221f68af5e\") " pod="openshift-machine-config-operator/machine-config-operator-74547568cd-wh6h4"
Jan 28 20:41:55 crc kubenswrapper[4746]: I0128 20:41:55.698990 4746 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"plugins-dir\" (UniqueName: \"kubernetes.io/host-path/058433b4-653b-4170-83e2-ed7c5d753323-plugins-dir\") pod \"csi-hostpathplugin-89zbv\" (UID: \"058433b4-653b-4170-83e2-ed7c5d753323\") " pod="hostpath-provisioner/csi-hostpathplugin-89zbv"
Jan 28 20:41:55 crc kubenswrapper[4746]: I0128 20:41:55.699621 4746 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"marketplace-trusted-ca\" (UniqueName: \"kubernetes.io/configmap/3edaca00-e1a6-4b56-9290-cad6311263ee-marketplace-trusted-ca\") pod \"marketplace-operator-79b997595-sqh2q\" (UID: \"3edaca00-e1a6-4b56-9290-cad6311263ee\") " pod="openshift-marketplace/marketplace-operator-79b997595-sqh2q"
Jan 28 20:41:55 crc kubenswrapper[4746]: I0128 20:41:55.700205 4746 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/250aa960-c630-48e1-b8a2-8b34917bccb1-config-volume\") pod \"dns-default-6hnhg\" (UID: \"250aa960-c630-48e1-b8a2-8b34917bccb1\") " pod="openshift-dns/dns-default-6hnhg"
Jan 28 20:41:55 crc kubenswrapper[4746]: I0128 20:41:55.711729 4746 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"registration-dir\" (UniqueName: \"kubernetes.io/host-path/058433b4-653b-4170-83e2-ed7c5d753323-registration-dir\") pod \"csi-hostpathplugin-89zbv\" (UID: \"058433b4-653b-4170-83e2-ed7c5d753323\") " pod="hostpath-provisioner/csi-hostpathplugin-89zbv"
Jan 28 20:41:55 crc kubenswrapper[4746]: I0128 20:41:55.713544 4746 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"images\" (UniqueName: \"kubernetes.io/configmap/89ebd250-beb4-4c8e-8889-bd221f68af5e-images\") pod \"machine-config-operator-74547568cd-wh6h4\" (UID: \"89ebd250-beb4-4c8e-8889-bd221f68af5e\") " pod="openshift-machine-config-operator/machine-config-operator-74547568cd-wh6h4"
Jan 28 20:41:55 crc kubenswrapper[4746]: I0128 20:41:55.717181 4746 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"srv-cert\" (UniqueName: \"kubernetes.io/secret/7d1acc2c-a42e-40a6-8ef5-c7e10ce548ee-srv-cert\") pod \"olm-operator-6b444d44fb-27vdq\" (UID: \"7d1acc2c-a42e-40a6-8ef5-c7e10ce548ee\") " pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-27vdq"
Jan 28 20:41:55 crc kubenswrapper[4746]: I0128 20:41:55.721868 4746 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cert\" (UniqueName: \"kubernetes.io/secret/ef8a3065-87eb-468f-a985-936f973c8f1a-cert\") pod \"ingress-canary-wpwwb\" (UID: \"ef8a3065-87eb-468f-a985-936f973c8f1a\") " pod="openshift-ingress-canary/ingress-canary-wpwwb"
Jan 28 20:41:55 crc kubenswrapper[4746]: I0128 20:41:55.721911 4746 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"certs\" (UniqueName: \"kubernetes.io/secret/1f9d4298-e60e-4a66-94fd-20b80ac131cf-certs\") pod \"machine-config-server-q9qqq\" (UID: \"1f9d4298-e60e-4a66-94fd-20b80ac131cf\") " pod="openshift-machine-config-operator/machine-config-server-q9qqq"
Jan 28 20:41:55 crc kubenswrapper[4746]: I0128 20:41:55.721964 4746 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"signing-key\" (UniqueName: \"kubernetes.io/secret/46800869-0687-4288-9a6e-3512b2e2c499-signing-key\") pod \"service-ca-9c57cc56f-vfcnn\" (UID: \"46800869-0687-4288-9a6e-3512b2e2c499\") " pod="openshift-service-ca/service-ca-9c57cc56f-vfcnn"
Jan 28 20:41:55 crc kubenswrapper[4746]: I0128 20:41:55.723715 4746 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"profile-collector-cert\" (UniqueName: \"kubernetes.io/secret/7d1acc2c-a42e-40a6-8ef5-c7e10ce548ee-profile-collector-cert\") pod \"olm-operator-6b444d44fb-27vdq\" (UID: \"7d1acc2c-a42e-40a6-8ef5-c7e10ce548ee\") " pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-27vdq"
Jan 28 20:41:55 crc kubenswrapper[4746]: I0128 20:41:55.727625 4746 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"marketplace-operator-metrics\" (UniqueName: \"kubernetes.io/secret/3edaca00-e1a6-4b56-9290-cad6311263ee-marketplace-operator-metrics\") pod \"marketplace-operator-79b997595-sqh2q\" (UID: \"3edaca00-e1a6-4b56-9290-cad6311263ee\") " pod="openshift-marketplace/marketplace-operator-79b997595-sqh2q"
Jan 28 20:41:55 crc kubenswrapper[4746]: I0128 20:41:55.733391 4746 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/89ebd250-beb4-4c8e-8889-bd221f68af5e-proxy-tls\") pod \"machine-config-operator-74547568cd-wh6h4\" (UID: \"89ebd250-beb4-4c8e-8889-bd221f68af5e\") " pod="openshift-machine-config-operator/machine-config-operator-74547568cd-wh6h4"
Jan 28 20:41:55 crc kubenswrapper[4746]: I0128 20:41:55.733903 4746 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-8w6v4" event={"ID":"594b8ae9-dfcf-4b89-ad6b-7f7bfaa13960","Type":"ContainerStarted","Data":"977c99526cfe8e12ebf6d63b814c83de1b2962f34324b61b6b8e56d3d287d6b5"}
Jan 28 20:41:55 crc kubenswrapper[4746]: I0128 20:41:55.733970 4746 kubelet.go:2453] "SyncLoop (PLEG): event for pod"
pod="openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-8w6v4" event={"ID":"594b8ae9-dfcf-4b89-ad6b-7f7bfaa13960","Type":"ContainerStarted","Data":"6cf47fbaebdbb398902b7e1e42fd954fb5d3f702c9e8552a2a34c66e2e901a1c"} Jan 28 20:41:55 crc kubenswrapper[4746]: I0128 20:41:55.786021 4746 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/250aa960-c630-48e1-b8a2-8b34917bccb1-metrics-tls\") pod \"dns-default-6hnhg\" (UID: \"250aa960-c630-48e1-b8a2-8b34917bccb1\") " pod="openshift-dns/dns-default-6hnhg" Jan 28 20:41:55 crc kubenswrapper[4746]: I0128 20:41:55.786520 4746 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/627f2e7c-f091-4ea2-9c3c-fce02f2b7669-bound-sa-token\") pod \"image-registry-697d97f7c8-lwhk4\" (UID: \"627f2e7c-f091-4ea2-9c3c-fce02f2b7669\") " pod="openshift-image-registry/image-registry-697d97f7c8-lwhk4" Jan 28 20:41:55 crc kubenswrapper[4746]: I0128 20:41:55.787924 4746 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"node-bootstrap-token\" (UniqueName: \"kubernetes.io/secret/1f9d4298-e60e-4a66-94fd-20b80ac131cf-node-bootstrap-token\") pod \"machine-config-server-q9qqq\" (UID: \"1f9d4298-e60e-4a66-94fd-20b80ac131cf\") " pod="openshift-machine-config-operator/machine-config-server-q9qqq" Jan 28 20:41:55 crc kubenswrapper[4746]: I0128 20:41:55.791741 4746 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-t8lv5\" (UniqueName: \"kubernetes.io/projected/bcb19d77-0dbf-4d14-a86c-4cd7e65211e0-kube-api-access-t8lv5\") pod \"machine-config-controller-84d6567774-rbl7q\" (UID: \"bcb19d77-0dbf-4d14-a86c-4cd7e65211e0\") " pod="openshift-machine-config-operator/machine-config-controller-84d6567774-rbl7q" Jan 28 20:41:55 crc kubenswrapper[4746]: I0128 20:41:55.792173 4746 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume 
\"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Jan 28 20:41:55 crc kubenswrapper[4746]: E0128 20:41:55.793014 4746 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-01-28 20:41:56.29299782 +0000 UTC m=+144.249184174 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Jan 28 20:41:55 crc kubenswrapper[4746]: I0128 20:41:55.794309 4746 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-5gjgg" event={"ID":"d66e245a-95d4-49c2-9172-2f095fab3e2b","Type":"ContainerStarted","Data":"b8fc01124cd6ccdffa6ddd4fb232e9b90bc748c504a0a263b84dba8f512a2b48"} Jan 28 20:41:55 crc kubenswrapper[4746]: I0128 20:41:55.810611 4746 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-api/machine-api-operator-5694c8668f-lzj8l" event={"ID":"3e64bb6e-1131-431b-b87c-71e25d294fe1","Type":"ContainerStarted","Data":"06c7c2e119382d7e24542796ec1d1f2a1613f487c568341c91fa68390671ae57"} Jan 28 20:41:55 crc kubenswrapper[4746]: I0128 20:41:55.814697 4746 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-5ptth\" (UniqueName: 
\"kubernetes.io/projected/250aa960-c630-48e1-b8a2-8b34917bccb1-kube-api-access-5ptth\") pod \"dns-default-6hnhg\" (UID: \"250aa960-c630-48e1-b8a2-8b34917bccb1\") " pod="openshift-dns/dns-default-6hnhg" Jan 28 20:41:55 crc kubenswrapper[4746]: I0128 20:41:55.815007 4746 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-m6pvw" Jan 28 20:41:55 crc kubenswrapper[4746]: I0128 20:41:55.823199 4746 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-console/downloads-7954f5f757-qrffw" event={"ID":"b34266ce-b971-4f4b-b8b7-c54ff8b6212c","Type":"ContainerStarted","Data":"4f249ef05512a076d87b0e500ee114e613757948008a0fe76059ffa7110ff3f9"} Jan 28 20:41:55 crc kubenswrapper[4746]: I0128 20:41:55.825331 4746 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-console/downloads-7954f5f757-qrffw" Jan 28 20:41:55 crc kubenswrapper[4746]: I0128 20:41:55.826120 4746 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-fgsjl\" (UniqueName: \"kubernetes.io/projected/89ebd250-beb4-4c8e-8889-bd221f68af5e-kube-api-access-fgsjl\") pod \"machine-config-operator-74547568cd-wh6h4\" (UID: \"89ebd250-beb4-4c8e-8889-bd221f68af5e\") " pod="openshift-machine-config-operator/machine-config-operator-74547568cd-wh6h4" Jan 28 20:41:55 crc kubenswrapper[4746]: I0128 20:41:55.830853 4746 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/catalog-operator-68c6474976-wb74c"] Jan 28 20:41:55 crc kubenswrapper[4746]: I0128 20:41:55.837663 4746 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-fq6tq\" (UniqueName: \"kubernetes.io/projected/1f9d4298-e60e-4a66-94fd-20b80ac131cf-kube-api-access-fq6tq\") pod \"machine-config-server-q9qqq\" (UID: \"1f9d4298-e60e-4a66-94fd-20b80ac131cf\") " 
pod="openshift-machine-config-operator/machine-config-server-q9qqq" Jan 28 20:41:55 crc kubenswrapper[4746]: I0128 20:41:55.838193 4746 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-csb4d\" (UniqueName: \"kubernetes.io/projected/ef8a3065-87eb-468f-a985-936f973c8f1a-kube-api-access-csb4d\") pod \"ingress-canary-wpwwb\" (UID: \"ef8a3065-87eb-468f-a985-936f973c8f1a\") " pod="openshift-ingress-canary/ingress-canary-wpwwb" Jan 28 20:41:55 crc kubenswrapper[4746]: I0128 20:41:55.847803 4746 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-authentication/oauth-openshift-558db77b4-4g4p7" event={"ID":"6208130d-52bc-449e-b371-357b1cc21b22","Type":"ContainerStarted","Data":"a474433d408f6ddbb5883924433e9db5dbc565a20c96add848c696d601d0784f"} Jan 28 20:41:55 crc kubenswrapper[4746]: I0128 20:41:55.850685 4746 patch_prober.go:28] interesting pod/downloads-7954f5f757-qrffw container/download-server namespace/openshift-console: Readiness probe status=failure output="Get \"http://10.217.0.25:8080/\": dial tcp 10.217.0.25:8080: connect: connection refused" start-of-body= Jan 28 20:41:55 crc kubenswrapper[4746]: I0128 20:41:55.851985 4746 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-console/downloads-7954f5f757-qrffw" podUID="b34266ce-b971-4f4b-b8b7-c54ff8b6212c" containerName="download-server" probeResult="failure" output="Get \"http://10.217.0.25:8080/\": dial tcp 10.217.0.25:8080: connect: connection refused" Jan 28 20:41:55 crc kubenswrapper[4746]: I0128 20:41:55.858694 4746 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-dfnjg\" (UniqueName: \"kubernetes.io/projected/46800869-0687-4288-9a6e-3512b2e2c499-kube-api-access-dfnjg\") pod \"service-ca-9c57cc56f-vfcnn\" (UID: \"46800869-0687-4288-9a6e-3512b2e2c499\") " pod="openshift-service-ca/service-ca-9c57cc56f-vfcnn" Jan 28 20:41:55 crc kubenswrapper[4746]: I0128 20:41:55.858835 4746 operation_generator.go:637] 
"MountVolume.SetUp succeeded for volume \"kube-api-access-kbstf\" (UniqueName: \"kubernetes.io/projected/3edaca00-e1a6-4b56-9290-cad6311263ee-kube-api-access-kbstf\") pod \"marketplace-operator-79b997595-sqh2q\" (UID: \"3edaca00-e1a6-4b56-9290-cad6311263ee\") " pod="openshift-marketplace/marketplace-operator-79b997595-sqh2q" Jan 28 20:41:55 crc kubenswrapper[4746]: I0128 20:41:55.861264 4746 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-machine-api/control-plane-machine-set-operator-78cbb6b69f-mznpb"] Jan 28 20:41:55 crc kubenswrapper[4746]: I0128 20:41:55.868417 4746 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-f9d7485db-hcxv8" event={"ID":"94cef654-afbe-42c2-8069-5dbcb7294abb","Type":"ContainerStarted","Data":"9472ab79081e1688ed461a49bc6fb1a958f71fca59cf6edb83c041bbba201d4d"} Jan 28 20:41:55 crc kubenswrapper[4746]: I0128 20:41:55.886827 4746 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-service-ca-operator/service-ca-operator-777779d784-62cbt" Jan 28 20:41:55 crc kubenswrapper[4746]: I0128 20:41:55.893946 4746 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-machine-config-operator/machine-config-controller-84d6567774-rbl7q" Jan 28 20:41:55 crc kubenswrapper[4746]: I0128 20:41:55.894960 4746 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-lwhk4\" (UID: \"627f2e7c-f091-4ea2-9c3c-fce02f2b7669\") " pod="openshift-image-registry/image-registry-697d97f7c8-lwhk4" Jan 28 20:41:55 crc kubenswrapper[4746]: E0128 20:41:55.895335 4746 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. 
No retries permitted until 2026-01-28 20:41:56.395321262 +0000 UTC m=+144.351507616 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-lwhk4" (UID: "627f2e7c-f091-4ea2-9c3c-fce02f2b7669") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Jan 28 20:41:55 crc kubenswrapper[4746]: I0128 20:41:55.906525 4746 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-service-ca/service-ca-9c57cc56f-vfcnn" Jan 28 20:41:55 crc kubenswrapper[4746]: I0128 20:41:55.906992 4746 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-controller-manager/controller-manager-879f6c89f-5d8cs"] Jan 28 20:41:55 crc kubenswrapper[4746]: I0128 20:41:55.911482 4746 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-9fx5j\" (UniqueName: \"kubernetes.io/projected/7d1acc2c-a42e-40a6-8ef5-c7e10ce548ee-kube-api-access-9fx5j\") pod \"olm-operator-6b444d44fb-27vdq\" (UID: \"7d1acc2c-a42e-40a6-8ef5-c7e10ce548ee\") " pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-27vdq" Jan 28 20:41:55 crc kubenswrapper[4746]: I0128 20:41:55.912015 4746 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/marketplace-operator-79b997595-sqh2q" Jan 28 20:41:55 crc kubenswrapper[4746]: I0128 20:41:55.919436 4746 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-tk9vt\" (UniqueName: \"kubernetes.io/projected/058433b4-653b-4170-83e2-ed7c5d753323-kube-api-access-tk9vt\") pod \"csi-hostpathplugin-89zbv\" (UID: \"058433b4-653b-4170-83e2-ed7c5d753323\") " pod="hostpath-provisioner/csi-hostpathplugin-89zbv" Jan 28 20:41:55 crc kubenswrapper[4746]: I0128 20:41:55.920946 4746 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-machine-config-operator/machine-config-operator-74547568cd-wh6h4" Jan 28 20:41:55 crc kubenswrapper[4746]: I0128 20:41:55.932184 4746 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-27vdq" Jan 28 20:41:55 crc kubenswrapper[4746]: I0128 20:41:55.937156 4746 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-multus/multus-admission-controller-857f4d67dd-lqlcg"] Jan 28 20:41:55 crc kubenswrapper[4746]: I0128 20:41:55.937215 4746 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-machine-config-operator/machine-config-server-q9qqq" Jan 28 20:41:55 crc kubenswrapper[4746]: I0128 20:41:55.937359 4746 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-oauth-apiserver/apiserver-7bbb656c7d-nz5l2"] Jan 28 20:41:55 crc kubenswrapper[4746]: I0128 20:41:55.945153 4746 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-sjqv2" event={"ID":"447abd89-31fd-4bb6-a965-97d7954f47bb","Type":"ContainerStarted","Data":"e3cc08b656c04dd728a73d49429e4cfe55f037ac18b352729b49f276d53467d5"} Jan 28 20:41:55 crc kubenswrapper[4746]: I0128 20:41:55.945537 4746 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-sjqv2" event={"ID":"447abd89-31fd-4bb6-a965-97d7954f47bb","Type":"ContainerStarted","Data":"bc5b0dcf6777f7974c3190d1ac497859037c09767ca8f552acdb4d092538fe0b"} Jan 28 20:41:55 crc kubenswrapper[4746]: I0128 20:41:55.946492 4746 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-sjqv2" Jan 28 20:41:55 crc kubenswrapper[4746]: I0128 20:41:55.955822 4746 patch_prober.go:28] interesting pod/route-controller-manager-6576b87f9c-sjqv2 container/route-controller-manager namespace/openshift-route-controller-manager: Readiness probe status=failure output="Get \"https://10.217.0.12:8443/healthz\": dial tcp 10.217.0.12:8443: connect: connection refused" start-of-body= Jan 28 20:41:55 crc kubenswrapper[4746]: I0128 20:41:55.955914 4746 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-sjqv2" podUID="447abd89-31fd-4bb6-a965-97d7954f47bb" containerName="route-controller-manager" probeResult="failure" output="Get \"https://10.217.0.12:8443/healthz\": dial tcp 10.217.0.12:8443: connect: connection refused" Jan 28 
20:41:55 crc kubenswrapper[4746]: I0128 20:41:55.956023 4746 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="hostpath-provisioner/csi-hostpathplugin-89zbv" Jan 28 20:41:55 crc kubenswrapper[4746]: I0128 20:41:55.959479 4746 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-gbwrh"] Jan 28 20:41:55 crc kubenswrapper[4746]: I0128 20:41:55.963357 4746 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ingress-canary/ingress-canary-wpwwb" Jan 28 20:41:55 crc kubenswrapper[4746]: I0128 20:41:55.972446 4746 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-dns/dns-default-6hnhg" Jan 28 20:41:55 crc kubenswrapper[4746]: I0128 20:41:55.996199 4746 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Jan 28 20:41:55 crc kubenswrapper[4746]: E0128 20:41:55.996397 4746 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-01-28 20:41:56.496364188 +0000 UTC m=+144.452550542 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Jan 28 20:41:55 crc kubenswrapper[4746]: I0128 20:41:55.996627 4746 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-lwhk4\" (UID: \"627f2e7c-f091-4ea2-9c3c-fce02f2b7669\") " pod="openshift-image-registry/image-registry-697d97f7c8-lwhk4" Jan 28 20:41:55 crc kubenswrapper[4746]: E0128 20:41:55.997032 4746 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-01-28 20:41:56.497023918 +0000 UTC m=+144.453210272 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-lwhk4" (UID: "627f2e7c-f091-4ea2-9c3c-fce02f2b7669") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Jan 28 20:41:56 crc kubenswrapper[4746]: I0128 20:41:56.021217 4746 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-storage-version-migrator/migrator-59844c95c7-s8b4b"] Jan 28 20:41:56 crc kubenswrapper[4746]: I0128 20:41:56.045693 4746 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-samples-operator/cluster-samples-operator-665b6dd947-xxnb8" event={"ID":"e3e01537-7adb-4a81-a9cb-3deb73a1d5b3","Type":"ContainerStarted","Data":"ff09703eba44090f642cdd7d899366d739deaa3342e104c27124b0873481726d"} Jan 28 20:41:56 crc kubenswrapper[4746]: I0128 20:41:56.052895 4746 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-machine-approver/machine-approver-56656f9798-v7zfx" event={"ID":"6479c53a-0e30-4805-bdb8-314a66127b5c","Type":"ContainerStarted","Data":"ab4717b24b7677365122d8f6b34c8c7e83fdb3005e3f920d2d0069c661a5d569"} Jan 28 20:41:56 crc kubenswrapper[4746]: I0128 20:41:56.052943 4746 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-machine-approver/machine-approver-56656f9798-v7zfx" event={"ID":"6479c53a-0e30-4805-bdb8-314a66127b5c","Type":"ContainerStarted","Data":"35097c3a8004d9b947cf0a2abee912ed1fa0e63034ee25c30e0f735140fa3596"} Jan 28 20:41:56 crc kubenswrapper[4746]: I0128 20:41:56.052954 4746 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-machine-approver/machine-approver-56656f9798-v7zfx" 
event={"ID":"6479c53a-0e30-4805-bdb8-314a66127b5c","Type":"ContainerStarted","Data":"db7eaa0676ac9f210f3bf461ac06761c61d194192f22b15807fe0b5ba496acac"} Jan 28 20:41:56 crc kubenswrapper[4746]: I0128 20:41:56.094362 4746 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-config-operator/openshift-config-operator-7777fb866f-rktng"] Jan 28 20:41:56 crc kubenswrapper[4746]: I0128 20:41:56.110944 4746 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Jan 28 20:41:56 crc kubenswrapper[4746]: E0128 20:41:56.112692 4746 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-01-28 20:41:56.612646289 +0000 UTC m=+144.568832643 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Jan 28 20:41:56 crc kubenswrapper[4746]: I0128 20:41:56.212009 4746 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-lwhk4\" (UID: \"627f2e7c-f091-4ea2-9c3c-fce02f2b7669\") " pod="openshift-image-registry/image-registry-697d97f7c8-lwhk4" Jan 28 20:41:56 crc kubenswrapper[4746]: E0128 20:41:56.213253 4746 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-01-28 20:41:56.713231582 +0000 UTC m=+144.669417996 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-lwhk4" (UID: "627f2e7c-f091-4ea2-9c3c-fce02f2b7669") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Jan 28 20:41:56 crc kubenswrapper[4746]: I0128 20:41:56.246408 4746 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-ingress/router-default-5444994796-rjmhd" Jan 28 20:41:56 crc kubenswrapper[4746]: I0128 20:41:56.281650 4746 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-apiserver/apiserver-76f77b778f-sk7xz"] Jan 28 20:41:56 crc kubenswrapper[4746]: I0128 20:41:56.313758 4746 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Jan 28 20:41:56 crc kubenswrapper[4746]: E0128 20:41:56.315066 4746 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-01-28 20:41:56.81503173 +0000 UTC m=+144.771218124 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Jan 28 20:41:56 crc kubenswrapper[4746]: I0128 20:41:56.319257 4746 patch_prober.go:28] interesting pod/router-default-5444994796-rjmhd container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Jan 28 20:41:56 crc kubenswrapper[4746]: [-]has-synced failed: reason withheld Jan 28 20:41:56 crc kubenswrapper[4746]: [+]process-running ok Jan 28 20:41:56 crc kubenswrapper[4746]: healthz check failed Jan 28 20:41:56 crc kubenswrapper[4746]: I0128 20:41:56.319298 4746 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-lwhk4\" (UID: \"627f2e7c-f091-4ea2-9c3c-fce02f2b7669\") " pod="openshift-image-registry/image-registry-697d97f7c8-lwhk4" Jan 28 20:41:56 crc kubenswrapper[4746]: I0128 20:41:56.319326 4746 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5444994796-rjmhd" podUID="c068f936-795b-4eb3-83a8-e363131119e9" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Jan 28 20:41:56 crc kubenswrapper[4746]: E0128 20:41:56.319743 4746 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. 
No retries permitted until 2026-01-28 20:41:56.819731124 +0000 UTC m=+144.775917478 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-lwhk4" (UID: "627f2e7c-f091-4ea2-9c3c-fce02f2b7669") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Jan 28 20:41:56 crc kubenswrapper[4746]: I0128 20:41:56.327445 4746 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-ingress/router-default-5444994796-rjmhd" podStartSLOduration=123.327419294 podStartE2EDuration="2m3.327419294s" podCreationTimestamp="2026-01-28 20:39:53 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-28 20:41:56.314764613 +0000 UTC m=+144.270950967" watchObservedRunningTime="2026-01-28 20:41:56.327419294 +0000 UTC m=+144.283605658"
Jan 28 20:41:56 crc kubenswrapper[4746]: I0128 20:41:56.370241 4746 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-etcd-operator/etcd-operator-b45778765-dbwsb"]
Jan 28 20:41:56 crc kubenswrapper[4746]: I0128 20:41:56.393397 4746 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-cluster-machine-approver/machine-approver-56656f9798-v7zfx" podStartSLOduration=124.393364988 podStartE2EDuration="2m4.393364988s" podCreationTimestamp="2026-01-28 20:39:52 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-28 20:41:56.392619786 +0000 UTC m=+144.348806160" watchObservedRunningTime="2026-01-28 20:41:56.393364988 +0000 UTC m=+144.349551342"
Jan 28 20:41:56 crc kubenswrapper[4746]: I0128 20:41:56.421608 4746 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") "
Jan 28 20:41:56 crc kubenswrapper[4746]: E0128 20:41:56.421841 4746 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-01-28 20:41:56.92182142 +0000 UTC m=+144.878007774 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Jan 28 20:41:56 crc kubenswrapper[4746]: I0128 20:41:56.424566 4746 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-lwhk4\" (UID: \"627f2e7c-f091-4ea2-9c3c-fce02f2b7669\") " pod="openshift-image-registry/image-registry-697d97f7c8-lwhk4"
Jan 28 20:41:56 crc kubenswrapper[4746]: E0128 20:41:56.424946 4746 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-01-28 20:41:56.924936869 +0000 UTC m=+144.881123223 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-lwhk4" (UID: "627f2e7c-f091-4ea2-9c3c-fce02f2b7669") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Jan 28 20:41:56 crc kubenswrapper[4746]: I0128 20:41:56.456297 4746 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-sjqv2" podStartSLOduration=123.456262404 podStartE2EDuration="2m3.456262404s" podCreationTimestamp="2026-01-28 20:39:53 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-28 20:41:56.433988468 +0000 UTC m=+144.390174822" watchObservedRunningTime="2026-01-28 20:41:56.456262404 +0000 UTC m=+144.412448758"
Jan 28 20:41:56 crc kubenswrapper[4746]: I0128 20:41:56.526819 4746 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") "
Jan 28 20:41:56 crc kubenswrapper[4746]: E0128 20:41:56.532998 4746 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-01-28 20:41:57.032969594 +0000 UTC m=+144.989155948 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Jan 28 20:41:56 crc kubenswrapper[4746]: I0128 20:41:56.533252 4746 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-lwhk4\" (UID: \"627f2e7c-f091-4ea2-9c3c-fce02f2b7669\") " pod="openshift-image-registry/image-registry-697d97f7c8-lwhk4"
Jan 28 20:41:56 crc kubenswrapper[4746]: E0128 20:41:56.533656 4746 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-01-28 20:41:57.033647154 +0000 UTC m=+144.989833508 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-lwhk4" (UID: "627f2e7c-f091-4ea2-9c3c-fce02f2b7669") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Jan 28 20:41:56 crc kubenswrapper[4746]: I0128 20:41:56.635593 4746 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") "
Jan 28 20:41:56 crc kubenswrapper[4746]: E0128 20:41:56.636597 4746 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-01-28 20:41:57.136575774 +0000 UTC m=+145.092762138 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Jan 28 20:41:56 crc kubenswrapper[4746]: I0128 20:41:56.638525 4746 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-console/downloads-7954f5f757-qrffw" podStartSLOduration=124.63851171 podStartE2EDuration="2m4.63851171s" podCreationTimestamp="2026-01-28 20:39:52 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-28 20:41:56.637991464 +0000 UTC m=+144.594177818" watchObservedRunningTime="2026-01-28 20:41:56.63851171 +0000 UTC m=+144.594698064"
Jan 28 20:41:56 crc kubenswrapper[4746]: I0128 20:41:56.739226 4746 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-lwhk4\" (UID: \"627f2e7c-f091-4ea2-9c3c-fce02f2b7669\") " pod="openshift-image-registry/image-registry-697d97f7c8-lwhk4"
Jan 28 20:41:56 crc kubenswrapper[4746]: E0128 20:41:56.739744 4746 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-01-28 20:41:57.23972861 +0000 UTC m=+145.195914954 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-lwhk4" (UID: "627f2e7c-f091-4ea2-9c3c-fce02f2b7669") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Jan 28 20:41:56 crc kubenswrapper[4746]: I0128 20:41:56.787205 4746 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-562pf"]
Jan 28 20:41:56 crc kubenswrapper[4746]: I0128 20:41:56.790231 4746 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29493870-4jnh6"]
Jan 28 20:41:56 crc kubenswrapper[4746]: I0128 20:41:56.794994 4746 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-8vvmj"]
Jan 28 20:41:56 crc kubenswrapper[4746]: I0128 20:41:56.841230 4746 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") "
Jan 28 20:41:56 crc kubenswrapper[4746]: E0128 20:41:56.841334 4746 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-01-28 20:41:57.341311032 +0000 UTC m=+145.297497396 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Jan 28 20:41:56 crc kubenswrapper[4746]: I0128 20:41:56.841967 4746 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-lwhk4\" (UID: \"627f2e7c-f091-4ea2-9c3c-fce02f2b7669\") " pod="openshift-image-registry/image-registry-697d97f7c8-lwhk4"
Jan 28 20:41:56 crc kubenswrapper[4746]: E0128 20:41:56.861922 4746 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-01-28 20:41:57.36190214 +0000 UTC m=+145.318088494 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-lwhk4" (UID: "627f2e7c-f091-4ea2-9c3c-fce02f2b7669") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Jan 28 20:41:56 crc kubenswrapper[4746]: I0128 20:41:56.884511 4746 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-dns-operator/dns-operator-744455d44c-q8fsc"]
Jan 28 20:41:56 crc kubenswrapper[4746]: I0128 20:41:56.887381 4746 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-zzrh5"]
Jan 28 20:41:56 crc kubenswrapper[4746]: I0128 20:41:56.929112 4746 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-ingress-operator/ingress-operator-5b745b69d9-h2q6n"]
Jan 28 20:41:56 crc kubenswrapper[4746]: I0128 20:41:56.931635 4746 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-hlqz9"]
Jan 28 20:41:56 crc kubenswrapper[4746]: I0128 20:41:56.944762 4746 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") "
Jan 28 20:41:56 crc kubenswrapper[4746]: E0128 20:41:56.945173 4746 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-01-28 20:41:57.445156418 +0000 UTC m=+145.401342772 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Jan 28 20:41:57 crc kubenswrapper[4746]: W0128 20:41:57.002362 4746 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod0cedee54_2c6e_44d4_a51a_b5f8d3ff0833.slice/crio-f20ebec3ad0320e71810494c1c7eb1713165db4de5c6faf6196e5ed59aee7689 WatchSource:0}: Error finding container f20ebec3ad0320e71810494c1c7eb1713165db4de5c6faf6196e5ed59aee7689: Status 404 returned error can't find the container with id f20ebec3ad0320e71810494c1c7eb1713165db4de5c6faf6196e5ed59aee7689
Jan 28 20:41:57 crc kubenswrapper[4746]: I0128 20:41:57.046182 4746 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-lwhk4\" (UID: \"627f2e7c-f091-4ea2-9c3c-fce02f2b7669\") " pod="openshift-image-registry/image-registry-697d97f7c8-lwhk4"
Jan 28 20:41:57 crc kubenswrapper[4746]: E0128 20:41:57.046815 4746 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-01-28 20:41:57.546800691 +0000 UTC m=+145.502987055 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-lwhk4" (UID: "627f2e7c-f091-4ea2-9c3c-fce02f2b7669") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Jan 28 20:41:57 crc kubenswrapper[4746]: I0128 20:41:57.109587 4746 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-machine-config-operator/machine-config-controller-84d6567774-rbl7q"]
Jan 28 20:41:57 crc kubenswrapper[4746]: I0128 20:41:57.123912 4746 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-m6pvw"]
Jan 28 20:41:57 crc kubenswrapper[4746]: I0128 20:41:57.163434 4746 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-f9d7485db-hcxv8" event={"ID":"94cef654-afbe-42c2-8069-5dbcb7294abb","Type":"ContainerStarted","Data":"bf07bf85734432391e1902898b2e3ff5cd6265ec338b9d98a12b388e93d34a0d"}
Jan 28 20:41:57 crc kubenswrapper[4746]: I0128 20:41:57.179791 4746 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") "
Jan 28 20:41:57 crc kubenswrapper[4746]: E0128 20:41:57.180563 4746 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-01-28 20:41:57.680511 +0000 UTC m=+145.636697354 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Jan 28 20:41:57 crc kubenswrapper[4746]: W0128 20:41:57.188005 4746 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podbcb19d77_0dbf_4d14_a86c_4cd7e65211e0.slice/crio-b08a22aeb1f03f1684fdef8782de3c00878b60347363739b7a516ad25de3f4c7 WatchSource:0}: Error finding container b08a22aeb1f03f1684fdef8782de3c00878b60347363739b7a516ad25de3f4c7: Status 404 returned error can't find the container with id b08a22aeb1f03f1684fdef8782de3c00878b60347363739b7a516ad25de3f4c7
Jan 28 20:41:57 crc kubenswrapper[4746]: I0128 20:41:57.190227 4746 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-admission-controller-857f4d67dd-lqlcg" event={"ID":"aa7d6dd6-1d54-400f-a188-628f99083f93","Type":"ContainerStarted","Data":"28cac94f273bb6153b7f2fbaa041a3e49745c294e5d851b4da0e75be43187a3b"}
Jan 28 20:41:57 crc kubenswrapper[4746]: I0128 20:41:57.193906 4746 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-lwhk4\" (UID: \"627f2e7c-f091-4ea2-9c3c-fce02f2b7669\") " pod="openshift-image-registry/image-registry-697d97f7c8-lwhk4"
Jan 28 20:41:57 crc kubenswrapper[4746]: E0128 20:41:57.196295 4746 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-01-28 20:41:57.69626779 +0000 UTC m=+145.652454334 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-lwhk4" (UID: "627f2e7c-f091-4ea2-9c3c-fce02f2b7669") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Jan 28 20:41:57 crc kubenswrapper[4746]: I0128 20:41:57.201787 4746 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-5gjgg" event={"ID":"d66e245a-95d4-49c2-9172-2f095fab3e2b","Type":"ContainerStarted","Data":"01684bd12fc394a6c934acafeb5527af1cde46b2f2f1543cd0d8d68548c54a52"}
Jan 28 20:41:57 crc kubenswrapper[4746]: I0128 20:41:57.204548 4746 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-hlqz9" event={"ID":"0cedee54-2c6e-44d4-a51a-b5f8d3ff0833","Type":"ContainerStarted","Data":"f20ebec3ad0320e71810494c1c7eb1713165db4de5c6faf6196e5ed59aee7689"}
Jan 28 20:41:57 crc kubenswrapper[4746]: I0128 20:41:57.207495 4746 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-config-operator/openshift-config-operator-7777fb866f-rktng" event={"ID":"18ad1fee-6a3a-4b98-83c5-6a13f22699db","Type":"ContainerStarted","Data":"80be6036dad2d7919d79cdab52fb4220506e96dc5985c43cba4add610204e399"}
Jan 28 20:41:57 crc kubenswrapper[4746]: I0128 20:41:57.261965 4746 patch_prober.go:28] interesting pod/router-default-5444994796-rjmhd container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld
Jan 28 20:41:57 crc kubenswrapper[4746]: [-]has-synced failed: reason withheld
Jan 28 20:41:57 crc kubenswrapper[4746]: [+]process-running ok
Jan 28 20:41:57 crc kubenswrapper[4746]: healthz check failed
Jan 28 20:41:57 crc kubenswrapper[4746]: I0128 20:41:57.262044 4746 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5444994796-rjmhd" podUID="c068f936-795b-4eb3-83a8-e363131119e9" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500"
Jan 28 20:41:57 crc kubenswrapper[4746]: W0128 20:41:57.295555 4746 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod5b1d3e40_8458_4661_854f_c16ab4cd7596.slice/crio-0818f37ea6a4be6dccfd2518c7e266c34a0bf86a520d4ffd73da767c47aaa3f1 WatchSource:0}: Error finding container 0818f37ea6a4be6dccfd2518c7e266c34a0bf86a520d4ffd73da767c47aaa3f1: Status 404 returned error can't find the container with id 0818f37ea6a4be6dccfd2518c7e266c34a0bf86a520d4ffd73da767c47aaa3f1
Jan 28 20:41:57 crc kubenswrapper[4746]: I0128 20:41:57.295997 4746 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") "
Jan 28 20:41:57 crc kubenswrapper[4746]: E0128 20:41:57.300329 4746 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-01-28 20:41:57.800306731 +0000 UTC m=+145.756493075 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Jan 28 20:41:57 crc kubenswrapper[4746]: I0128 20:41:57.301357 4746 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-api/machine-api-operator-5694c8668f-lzj8l" event={"ID":"3e64bb6e-1131-431b-b87c-71e25d294fe1","Type":"ContainerStarted","Data":"c8c97944575682bc88efd84979a2f337d1c7a345c1ad3c8f9002ea30753ec0ca"}
Jan 28 20:41:57 crc kubenswrapper[4746]: I0128 20:41:57.330250 4746 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-nz5l2" event={"ID":"f69da289-e18c-4baa-ad6b-4f2e3a44cda5","Type":"ContainerStarted","Data":"79d6afaae628b0d231851d2fa985bbc4bf92fe98604009df180c1902a9820465"}
Jan 28 20:41:57 crc kubenswrapper[4746]: I0128 20:41:57.334722 4746 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-console/downloads-7954f5f757-qrffw" event={"ID":"b34266ce-b971-4f4b-b8b7-c54ff8b6212c","Type":"ContainerStarted","Data":"2b4d75386f3b7bc2a8e41ac3bfb4b4f345fd11f7e4032c67a8ff4f8fe2775f1d"}
Jan 28 20:41:57 crc kubenswrapper[4746]: I0128 20:41:57.335508 4746 patch_prober.go:28] interesting pod/downloads-7954f5f757-qrffw container/download-server namespace/openshift-console: Readiness probe status=failure output="Get \"http://10.217.0.25:8080/\": dial tcp 10.217.0.25:8080: connect: connection refused" start-of-body=
Jan 28 20:41:57 crc kubenswrapper[4746]: I0128 20:41:57.335544 4746 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-console/downloads-7954f5f757-qrffw" podUID="b34266ce-b971-4f4b-b8b7-c54ff8b6212c" containerName="download-server" probeResult="failure" output="Get \"http://10.217.0.25:8080/\": dial tcp 10.217.0.25:8080: connect: connection refused"
Jan 28 20:41:57 crc kubenswrapper[4746]: I0128 20:41:57.353983 4746 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager/controller-manager-879f6c89f-5d8cs" event={"ID":"0f0eb07a-e0b4-4702-89b0-d94e937471a5","Type":"ContainerStarted","Data":"8888ab26476e69ebc3b97c4249886b7c64b042ab0cf20bf596eb2c3ca5c61513"}
Jan 28 20:41:57 crc kubenswrapper[4746]: I0128 20:41:57.374182 4746 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ingress-operator/ingress-operator-5b745b69d9-h2q6n" event={"ID":"f3cd630b-3fe5-497f-90da-f58bcb7aac8b","Type":"ContainerStarted","Data":"a7094372f957db9597eccf80cd961cbf1e00c7df6afac6ec1e5375020583647f"}
Jan 28 20:41:57 crc kubenswrapper[4746]: I0128 20:41:57.374419 4746 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-8w6v4" podStartSLOduration=125.374394957 podStartE2EDuration="2m5.374394957s" podCreationTimestamp="2026-01-28 20:39:52 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-28 20:41:57.353786459 +0000 UTC m=+145.309972813" watchObservedRunningTime="2026-01-28 20:41:57.374394957 +0000 UTC m=+145.330581311"
Jan 28 20:41:57 crc kubenswrapper[4746]: I0128 20:41:57.380795 4746 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-ingress-canary/ingress-canary-wpwwb"]
Jan 28 20:41:57 crc kubenswrapper[4746]: I0128 20:41:57.398449 4746 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-lwhk4\" (UID: \"627f2e7c-f091-4ea2-9c3c-fce02f2b7669\") " pod="openshift-image-registry/image-registry-697d97f7c8-lwhk4"
Jan 28 20:41:57 crc kubenswrapper[4746]: E0128 20:41:57.399171 4746 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-01-28 20:41:57.899159025 +0000 UTC m=+145.855345379 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-lwhk4" (UID: "627f2e7c-f091-4ea2-9c3c-fce02f2b7669") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Jan 28 20:41:57 crc kubenswrapper[4746]: I0128 20:41:57.437029 4746 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-gbwrh" event={"ID":"460c9e37-a0c0-43ea-9607-8f716e2e92bf","Type":"ContainerStarted","Data":"eb0e30f2bc125f854bb304d7f4fa525845a1bd3ae153defe74f34158e63d98ee"}
Jan 28 20:41:57 crc kubenswrapper[4746]: I0128 20:41:57.457038 4746 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-dns/dns-default-6hnhg"]
Jan 28 20:41:57 crc kubenswrapper[4746]: I0128 20:41:57.462336 4746 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-samples-operator/cluster-samples-operator-665b6dd947-xxnb8" event={"ID":"e3e01537-7adb-4a81-a9cb-3deb73a1d5b3","Type":"ContainerStarted","Data":"78f2a4d9ba444db45180ff5fa5a5964bfc3398ba5db73acfbdb41bd30a952851"}
Jan 28 20:41:57 crc kubenswrapper[4746]: I0128 20:41:57.467878 4746 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-machine-config-operator/machine-config-operator-74547568cd-wh6h4"]
Jan 28 20:41:57 crc kubenswrapper[4746]: I0128 20:41:57.473643 4746 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-service-ca/service-ca-9c57cc56f-vfcnn"]
Jan 28 20:41:57 crc kubenswrapper[4746]: I0128 20:41:57.476341 4746 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-wb74c" event={"ID":"4886a33e-8379-47d5-ac13-e58bc623d01c","Type":"ContainerStarted","Data":"b7842248ca0581ba6e3bdb031a3d5b25637dfc3986f3caa320b4de0040525cc6"}
Jan 28 20:41:57 crc kubenswrapper[4746]: I0128 20:41:57.490941 4746 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/marketplace-operator-79b997595-sqh2q"]
Jan 28 20:41:57 crc kubenswrapper[4746]: I0128 20:41:57.496290 4746 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-authentication-operator/authentication-operator-69f744f599-8mmbg" event={"ID":"ac478ffc-e1e4-4b72-bf99-d35c2636a78d","Type":"ContainerStarted","Data":"4a7686a0bee5e98326bc17e58d9ec6212ddeed98482a321277f79e2866855194"}
Jan 28 20:41:57 crc kubenswrapper[4746]: I0128 20:41:57.496465 4746 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-5gjgg" podStartSLOduration=125.496438854 podStartE2EDuration="2m5.496438854s" podCreationTimestamp="2026-01-28 20:39:52 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-28 20:41:57.481427635 +0000 UTC m=+145.437613989" watchObservedRunningTime="2026-01-28 20:41:57.496438854 +0000 UTC m=+145.452625208"
Jan 28 20:41:57 crc kubenswrapper[4746]: I0128 20:41:57.499390 4746 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") "
Jan 28 20:41:57 crc kubenswrapper[4746]: E0128 20:41:57.500477 4746 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-01-28 20:41:58.000460928 +0000 UTC m=+145.956647282 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Jan 28 20:41:57 crc kubenswrapper[4746]: I0128 20:41:57.508049 4746 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-562pf" event={"ID":"8b8a9bdb-0612-4627-ba36-98293308c32d","Type":"ContainerStarted","Data":"dbae96ef3ae63f691310a4f840a7b75f90537e11070eddadf3cc16cbf47bce23"}
Jan 28 20:41:57 crc kubenswrapper[4746]: I0128 20:41:57.510418 4746 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29493870-4jnh6" event={"ID":"c15d5c5e-19d5-43c6-8955-1bb2ad8b33b6","Type":"ContainerStarted","Data":"b18e900b3e8378a4deae407829dbf120d084c32a5a579ef60953929086a0cc81"}
Jan 28 20:41:57 crc kubenswrapper[4746]: I0128 20:41:57.519277 4746 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-console-operator/console-operator-58897d9998-4cmsz" event={"ID":"0ea55ab8-bec0-44e8-8105-c8c604fc5fa1","Type":"ContainerStarted","Data":"a4a7d8eac8f0a983fdec4440ae3abc6c7413feedafa268fc9165ab4d0fa40e05"}
Jan 28 20:41:57 crc kubenswrapper[4746]: I0128 20:41:57.519472 4746 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-console-operator/console-operator-58897d9998-4cmsz"
Jan 28 20:41:57 crc kubenswrapper[4746]: I0128 20:41:57.556108 4746 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-z2vr7" event={"ID":"8c7789bf-f2f7-4cf4-97f1-3d8a7438daca","Type":"ContainerStarted","Data":"01e6e96d1ec48979c39797025d835a7d8dcbecf1903eb2b8ead14aafeb1cc1c1"}
Jan 28 20:41:57 crc kubenswrapper[4746]: I0128 20:41:57.560424 4746 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-api/control-plane-machine-set-operator-78cbb6b69f-mznpb" event={"ID":"72e0847f-0a87-4710-9765-a10282cc0529","Type":"ContainerStarted","Data":"b43c936f086ae0966dc1aaeb434ebd1e385b89dc7a25889d0a3635a4e4c02f8f"}
Jan 28 20:41:57 crc kubenswrapper[4746]: I0128 20:41:57.567706 4746 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-etcd-operator/etcd-operator-b45778765-dbwsb" event={"ID":"f31f9ca0-e467-452d-90db-a28a4b69496e","Type":"ContainerStarted","Data":"3cee42c678939b3a02052dca79fa6136408026f6e7f2003f656c79c2aead754f"}
Jan 28 20:41:57 crc kubenswrapper[4746]: I0128 20:41:57.573073 4746 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-8vvmj" event={"ID":"aa555a55-a0b0-47e3-959f-e2d8d387aae2","Type":"ContainerStarted","Data":"f80c81d13de8279a7cf0751e7c3f33d9d0321439c775af600701ee05f74e0611"}
Jan 28 20:41:57 crc kubenswrapper[4746]: I0128 20:41:57.574301 4746 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-server-q9qqq" event={"ID":"1f9d4298-e60e-4a66-94fd-20b80ac131cf","Type":"ContainerStarted","Data":"80872626f8f1f08bc9df70e24a3676d8a2e852bf03ceeb0f3b31aa62ef966091"}
Jan 28 20:41:57 crc kubenswrapper[4746]: I0128 20:41:57.576798 4746 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-dns-operator/dns-operator-744455d44c-q8fsc" event={"ID":"c29355d7-4e9a-4e5a-838e-77a4df7c2fda","Type":"ContainerStarted","Data":"ff2a8cfae5fa050c44e21d6125de1c3d4e07f4fbbedce928d5fea7d1c9bce832"}
Jan 28 20:41:57 crc kubenswrapper[4746]: W0128 20:41:57.600474 4746 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod250aa960_c630_48e1_b8a2_8b34917bccb1.slice/crio-45c5e68b01fd8a5f616c32c1d62e08965a0745fc777fd583ee881a1b4ce20f71 WatchSource:0}: Error finding container 45c5e68b01fd8a5f616c32c1d62e08965a0745fc777fd583ee881a1b4ce20f71: Status 404 returned error can't find the container with id 45c5e68b01fd8a5f616c32c1d62e08965a0745fc777fd583ee881a1b4ce20f71
Jan 28 20:41:57 crc kubenswrapper[4746]: I0128 20:41:57.600922 4746 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-lwhk4\" (UID: \"627f2e7c-f091-4ea2-9c3c-fce02f2b7669\") " pod="openshift-image-registry/image-registry-697d97f7c8-lwhk4"
Jan 28 20:41:57 crc kubenswrapper[4746]: I0128 20:41:57.607136 4746 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-authentication/oauth-openshift-558db77b4-4g4p7" event={"ID":"6208130d-52bc-449e-b371-357b1cc21b22","Type":"ContainerStarted","Data":"ab259627a2560868a6fdbf9d2cca4fb91aee52f63f4afc20e810c3c81ba3aca9"}
Jan 28 20:41:57 crc kubenswrapper[4746]: E0128 20:41:57.607392 4746 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-01-28 20:41:58.107364542 +0000 UTC m=+146.063550896 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-lwhk4" (UID: "627f2e7c-f091-4ea2-9c3c-fce02f2b7669") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Jan 28 20:41:57 crc kubenswrapper[4746]: I0128 20:41:57.607875 4746 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-authentication/oauth-openshift-558db77b4-4g4p7"
Jan 28 20:41:57 crc kubenswrapper[4746]: I0128 20:41:57.616408 4746 patch_prober.go:28] interesting pod/oauth-openshift-558db77b4-4g4p7 container/oauth-openshift namespace/openshift-authentication: Readiness probe status=failure output="Get \"https://10.217.0.24:6443/healthz\": dial tcp 10.217.0.24:6443: connect: connection refused" start-of-body=
Jan 28 20:41:57 crc kubenswrapper[4746]: I0128 20:41:57.616498 4746 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-authentication/oauth-openshift-558db77b4-4g4p7" podUID="6208130d-52bc-449e-b371-357b1cc21b22" containerName="oauth-openshift" probeResult="failure" output="Get \"https://10.217.0.24:6443/healthz\": dial tcp 10.217.0.24:6443: connect: connection refused"
Jan 28 20:41:57 crc kubenswrapper[4746]: I0128 20:41:57.616992 4746 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-apiserver/apiserver-76f77b778f-sk7xz" event={"ID":"af3e8fcb-fbde-4b65-92ab-3d8b71b2de07","Type":"ContainerStarted","Data":"4df9dc3c81a90cf27ae1025e0c5de92e9b6d64343b73a8e124836339cabf2ef4"}
Jan 28 20:41:57 crc kubenswrapper[4746]: I0128 20:41:57.626315 4746 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-storage-version-migrator/migrator-59844c95c7-s8b4b"
event={"ID":"4e03d657-3b57-4eea-bb77-f5fe3a519cac","Type":"ContainerStarted","Data":"6f6dd704010fa72d15871b01ad767452012a71030e30863eb161c117ae59811e"} Jan 28 20:41:57 crc kubenswrapper[4746]: I0128 20:41:57.646525 4746 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-zzrh5" event={"ID":"71db2fbb-6c1a-417f-9cc4-a1ae041d0c0a","Type":"ContainerStarted","Data":"01658c7c20d82569cf1f25b53f20d10890e0c2f1d4b069acd47c915c48b99076"} Jan 28 20:41:57 crc kubenswrapper[4746]: I0128 20:41:57.656111 4746 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-sjqv2" Jan 28 20:41:57 crc kubenswrapper[4746]: I0128 20:41:57.705278 4746 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Jan 28 20:41:57 crc kubenswrapper[4746]: E0128 20:41:57.707564 4746 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-01-28 20:41:58.207518832 +0000 UTC m=+146.163705186 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Jan 28 20:41:57 crc kubenswrapper[4746]: I0128 20:41:57.716246 4746 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-service-ca-operator/service-ca-operator-777779d784-62cbt"] Jan 28 20:41:57 crc kubenswrapper[4746]: I0128 20:41:57.731273 4746 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-27vdq"] Jan 28 20:41:57 crc kubenswrapper[4746]: I0128 20:41:57.762435 4746 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-lwhk4\" (UID: \"627f2e7c-f091-4ea2-9c3c-fce02f2b7669\") " pod="openshift-image-registry/image-registry-697d97f7c8-lwhk4" Jan 28 20:41:57 crc kubenswrapper[4746]: E0128 20:41:57.768039 4746 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-01-28 20:41:58.26801302 +0000 UTC m=+146.224199564 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-lwhk4" (UID: "627f2e7c-f091-4ea2-9c3c-fce02f2b7669") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Jan 28 20:41:57 crc kubenswrapper[4746]: I0128 20:41:57.793314 4746 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["hostpath-provisioner/csi-hostpathplugin-89zbv"] Jan 28 20:41:57 crc kubenswrapper[4746]: W0128 20:41:57.805071 4746 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod7d1acc2c_a42e_40a6_8ef5_c7e10ce548ee.slice/crio-81286b2a9042b4d2b39427e4ab24713dfb969df805cf806d3f0da92549258bed WatchSource:0}: Error finding container 81286b2a9042b4d2b39427e4ab24713dfb969df805cf806d3f0da92549258bed: Status 404 returned error can't find the container with id 81286b2a9042b4d2b39427e4ab24713dfb969df805cf806d3f0da92549258bed Jan 28 20:41:57 crc kubenswrapper[4746]: I0128 20:41:57.806040 4746 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-console/console-f9d7485db-hcxv8" podStartSLOduration=125.806002125 podStartE2EDuration="2m5.806002125s" podCreationTimestamp="2026-01-28 20:39:52 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-28 20:41:57.765710094 +0000 UTC m=+145.721896468" watchObservedRunningTime="2026-01-28 20:41:57.806002125 +0000 UTC m=+145.762188479" Jan 28 20:41:57 crc kubenswrapper[4746]: I0128 20:41:57.864503 4746 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: 
\"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Jan 28 20:41:57 crc kubenswrapper[4746]: E0128 20:41:57.866467 4746 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-01-28 20:41:58.366429151 +0000 UTC m=+146.322615685 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Jan 28 20:41:57 crc kubenswrapper[4746]: I0128 20:41:57.886480 4746 csr.go:261] certificate signing request csr-2jpqf is approved, waiting to be issued Jan 28 20:41:57 crc kubenswrapper[4746]: I0128 20:41:57.902422 4746 csr.go:257] certificate signing request csr-2jpqf is issued Jan 28 20:41:57 crc kubenswrapper[4746]: I0128 20:41:57.939149 4746 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-cluster-samples-operator/cluster-samples-operator-665b6dd947-xxnb8" podStartSLOduration=125.939131878 podStartE2EDuration="2m5.939131878s" podCreationTimestamp="2026-01-28 20:39:52 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-28 20:41:57.903103808 +0000 UTC m=+145.859290162" watchObservedRunningTime="2026-01-28 20:41:57.939131878 +0000 UTC m=+145.895318222" Jan 28 20:41:57 crc kubenswrapper[4746]: I0128 20:41:57.965269 4746 
pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-authentication/oauth-openshift-558db77b4-4g4p7" podStartSLOduration=125.965252673 podStartE2EDuration="2m5.965252673s" podCreationTimestamp="2026-01-28 20:39:52 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-28 20:41:57.964192374 +0000 UTC m=+145.920378728" watchObservedRunningTime="2026-01-28 20:41:57.965252673 +0000 UTC m=+145.921439027" Jan 28 20:41:57 crc kubenswrapper[4746]: I0128 20:41:57.967046 4746 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-lwhk4\" (UID: \"627f2e7c-f091-4ea2-9c3c-fce02f2b7669\") " pod="openshift-image-registry/image-registry-697d97f7c8-lwhk4" Jan 28 20:41:57 crc kubenswrapper[4746]: E0128 20:41:57.969209 4746 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-01-28 20:41:58.469196546 +0000 UTC m=+146.425382900 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-lwhk4" (UID: "627f2e7c-f091-4ea2-9c3c-fce02f2b7669") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Jan 28 20:41:58 crc kubenswrapper[4746]: I0128 20:41:58.035973 4746 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-machine-api/control-plane-machine-set-operator-78cbb6b69f-mznpb" podStartSLOduration=125.035958523 podStartE2EDuration="2m5.035958523s" podCreationTimestamp="2026-01-28 20:39:53 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-28 20:41:58.033499213 +0000 UTC m=+145.989685567" watchObservedRunningTime="2026-01-28 20:41:58.035958523 +0000 UTC m=+145.992144877" Jan 28 20:41:58 crc kubenswrapper[4746]: I0128 20:41:58.037562 4746 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-z2vr7" podStartSLOduration=126.037554929 podStartE2EDuration="2m6.037554929s" podCreationTimestamp="2026-01-28 20:39:52 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-28 20:41:58.002907899 +0000 UTC m=+145.959094253" watchObservedRunningTime="2026-01-28 20:41:58.037554929 +0000 UTC m=+145.993741283" Jan 28 20:41:58 crc kubenswrapper[4746]: I0128 20:41:58.073630 4746 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod 
\"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Jan 28 20:41:58 crc kubenswrapper[4746]: E0128 20:41:58.074271 4746 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-01-28 20:41:58.574254117 +0000 UTC m=+146.530440471 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Jan 28 20:41:58 crc kubenswrapper[4746]: I0128 20:41:58.101723 4746 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-authentication-operator/authentication-operator-69f744f599-8mmbg" podStartSLOduration=126.101704641 podStartE2EDuration="2m6.101704641s" podCreationTimestamp="2026-01-28 20:39:52 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-28 20:41:58.100244809 +0000 UTC m=+146.056431173" watchObservedRunningTime="2026-01-28 20:41:58.101704641 +0000 UTC m=+146.057890995" Jan 28 20:41:58 crc kubenswrapper[4746]: I0128 20:41:58.106903 4746 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-console-operator/console-operator-58897d9998-4cmsz" podStartSLOduration=126.106888629 podStartE2EDuration="2m6.106888629s" podCreationTimestamp="2026-01-28 20:39:52 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" 
observedRunningTime="2026-01-28 20:41:58.065827716 +0000 UTC m=+146.022014070" watchObservedRunningTime="2026-01-28 20:41:58.106888629 +0000 UTC m=+146.063074993" Jan 28 20:41:58 crc kubenswrapper[4746]: I0128 20:41:58.176135 4746 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-lwhk4\" (UID: \"627f2e7c-f091-4ea2-9c3c-fce02f2b7669\") " pod="openshift-image-registry/image-registry-697d97f7c8-lwhk4" Jan 28 20:41:58 crc kubenswrapper[4746]: E0128 20:41:58.176575 4746 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-01-28 20:41:58.676559228 +0000 UTC m=+146.632745582 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-lwhk4" (UID: "627f2e7c-f091-4ea2-9c3c-fce02f2b7669") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Jan 28 20:41:58 crc kubenswrapper[4746]: I0128 20:41:58.252563 4746 patch_prober.go:28] interesting pod/router-default-5444994796-rjmhd container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Jan 28 20:41:58 crc kubenswrapper[4746]: [-]has-synced failed: reason withheld Jan 28 20:41:58 crc kubenswrapper[4746]: [+]process-running ok Jan 28 20:41:58 crc kubenswrapper[4746]: healthz check failed Jan 28 20:41:58 crc kubenswrapper[4746]: I0128 
20:41:58.252649 4746 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5444994796-rjmhd" podUID="c068f936-795b-4eb3-83a8-e363131119e9" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Jan 28 20:41:58 crc kubenswrapper[4746]: I0128 20:41:58.277547 4746 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Jan 28 20:41:58 crc kubenswrapper[4746]: E0128 20:41:58.277877 4746 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-01-28 20:41:58.777852022 +0000 UTC m=+146.734038366 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Jan 28 20:41:58 crc kubenswrapper[4746]: I0128 20:41:58.278172 4746 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-lwhk4\" (UID: \"627f2e7c-f091-4ea2-9c3c-fce02f2b7669\") " pod="openshift-image-registry/image-registry-697d97f7c8-lwhk4" Jan 28 20:41:58 crc kubenswrapper[4746]: E0128 20:41:58.278592 4746 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-01-28 20:41:58.778585853 +0000 UTC m=+146.734772207 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-lwhk4" (UID: "627f2e7c-f091-4ea2-9c3c-fce02f2b7669") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Jan 28 20:41:58 crc kubenswrapper[4746]: I0128 20:41:58.328491 4746 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-console-operator/console-operator-58897d9998-4cmsz" Jan 28 20:41:58 crc kubenswrapper[4746]: I0128 20:41:58.389766 4746 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Jan 28 20:41:58 crc kubenswrapper[4746]: E0128 20:41:58.390247 4746 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-01-28 20:41:58.890228822 +0000 UTC m=+146.846415176 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Jan 28 20:41:58 crc kubenswrapper[4746]: I0128 20:41:58.390369 4746 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-lwhk4\" (UID: \"627f2e7c-f091-4ea2-9c3c-fce02f2b7669\") " pod="openshift-image-registry/image-registry-697d97f7c8-lwhk4" Jan 28 20:41:58 crc kubenswrapper[4746]: E0128 20:41:58.390678 4746 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-01-28 20:41:58.890670594 +0000 UTC m=+146.846856948 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-lwhk4" (UID: "627f2e7c-f091-4ea2-9c3c-fce02f2b7669") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Jan 28 20:41:58 crc kubenswrapper[4746]: I0128 20:41:58.499565 4746 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Jan 28 20:41:58 crc kubenswrapper[4746]: E0128 20:41:58.499936 4746 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-01-28 20:41:58.999916014 +0000 UTC m=+146.956102368 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Jan 28 20:41:58 crc kubenswrapper[4746]: I0128 20:41:58.601421 4746 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-lwhk4\" (UID: \"627f2e7c-f091-4ea2-9c3c-fce02f2b7669\") " pod="openshift-image-registry/image-registry-697d97f7c8-lwhk4" Jan 28 20:41:58 crc kubenswrapper[4746]: E0128 20:41:58.601837 4746 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-01-28 20:41:59.101816865 +0000 UTC m=+147.058003209 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-lwhk4" (UID: "627f2e7c-f091-4ea2-9c3c-fce02f2b7669") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Jan 28 20:41:58 crc kubenswrapper[4746]: I0128 20:41:58.705301 4746 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Jan 28 20:41:58 crc kubenswrapper[4746]: E0128 20:41:58.705639 4746 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-01-28 20:41:59.20562459 +0000 UTC m=+147.161810944 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Jan 28 20:41:58 crc kubenswrapper[4746]: I0128 20:41:58.706747 4746 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-dns-operator/dns-operator-744455d44c-q8fsc" event={"ID":"c29355d7-4e9a-4e5a-838e-77a4df7c2fda","Type":"ContainerStarted","Data":"98d8cd1b4a9c31b2764ec7cdc2b33f757ed71d335d633b20eba9e3c223cdf07a"} Jan 28 20:41:58 crc kubenswrapper[4746]: I0128 20:41:58.722852 4746 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-samples-operator/cluster-samples-operator-665b6dd947-xxnb8" event={"ID":"e3e01537-7adb-4a81-a9cb-3deb73a1d5b3","Type":"ContainerStarted","Data":"d4a5ec88c1a7633e6ad3458c60b83382afa6f65dacd5644d5d18e46330e7824a"} Jan 28 20:41:58 crc kubenswrapper[4746]: I0128 20:41:58.741679 4746 generic.go:334] "Generic (PLEG): container finished" podID="18ad1fee-6a3a-4b98-83c5-6a13f22699db" containerID="276e1e307217c0afe791c0db313991dbc78691c97eb996df884cd3f4d8f47d79" exitCode=0 Jan 28 20:41:58 crc kubenswrapper[4746]: I0128 20:41:58.742615 4746 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-config-operator/openshift-config-operator-7777fb866f-rktng" event={"ID":"18ad1fee-6a3a-4b98-83c5-6a13f22699db","Type":"ContainerDied","Data":"276e1e307217c0afe791c0db313991dbc78691c97eb996df884cd3f4d8f47d79"} Jan 28 20:41:58 crc kubenswrapper[4746]: I0128 20:41:58.758154 4746 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-service-ca-operator/service-ca-operator-777779d784-62cbt" 
event={"ID":"3c71e425-b304-49f8-ac53-5e1383f73eb7","Type":"ContainerStarted","Data":"462157ae4f6c1252455beb4bec4fa89d65b0a757db0d7d3b6544194cec94ea5c"}
Jan 28 20:41:58 crc kubenswrapper[4746]: I0128 20:41:58.771952    4746 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-8vvmj" event={"ID":"aa555a55-a0b0-47e3-959f-e2d8d387aae2","Type":"ContainerStarted","Data":"6e84f4f7e31d4eb10037433904d9d2f5b3733f97857c411095b9ff9f53ca9e1c"}
Jan 28 20:41:58 crc kubenswrapper[4746]: I0128 20:41:58.773405    4746 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-8vvmj"
Jan 28 20:41:58 crc kubenswrapper[4746]: I0128 20:41:58.777854    4746 patch_prober.go:28] interesting pod/packageserver-d55dfcdfc-8vvmj container/packageserver namespace/openshift-operator-lifecycle-manager: Readiness probe status=failure output="Get \"https://10.217.0.23:5443/healthz\": dial tcp 10.217.0.23:5443: connect: connection refused" start-of-body=
Jan 28 20:41:58 crc kubenswrapper[4746]: I0128 20:41:58.777900    4746 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-8vvmj" podUID="aa555a55-a0b0-47e3-959f-e2d8d387aae2" containerName="packageserver" probeResult="failure" output="Get \"https://10.217.0.23:5443/healthz\": dial tcp 10.217.0.23:5443: connect: connection refused"
Jan 28 20:41:58 crc kubenswrapper[4746]: I0128 20:41:58.780286    4746 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-api/machine-api-operator-5694c8668f-lzj8l" event={"ID":"3e64bb6e-1131-431b-b87c-71e25d294fe1","Type":"ContainerStarted","Data":"a9ff8f783e00f678f3c6cfd384a11021dfa39a6f8b14b994f83a50f6624cb9b8"}
Jan 28 20:41:58 crc kubenswrapper[4746]: I0128 20:41:58.794497    4746 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager/controller-manager-879f6c89f-5d8cs" event={"ID":"0f0eb07a-e0b4-4702-89b0-d94e937471a5","Type":"ContainerStarted","Data":"df9bf263c305c8676e726e87dedce4bc509ccb4a4b19146e7680dedbbb7271af"}
Jan 28 20:41:58 crc kubenswrapper[4746]: I0128 20:41:58.795835    4746 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-controller-manager/controller-manager-879f6c89f-5d8cs"
Jan 28 20:41:58 crc kubenswrapper[4746]: I0128 20:41:58.801304    4746 patch_prober.go:28] interesting pod/controller-manager-879f6c89f-5d8cs container/controller-manager namespace/openshift-controller-manager: Readiness probe status=failure output="Get \"https://10.217.0.8:8443/healthz\": dial tcp 10.217.0.8:8443: connect: connection refused" start-of-body=
Jan 28 20:41:58 crc kubenswrapper[4746]: I0128 20:41:58.801386    4746 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-controller-manager/controller-manager-879f6c89f-5d8cs" podUID="0f0eb07a-e0b4-4702-89b0-d94e937471a5" containerName="controller-manager" probeResult="failure" output="Get \"https://10.217.0.8:8443/healthz\": dial tcp 10.217.0.8:8443: connect: connection refused"
Jan 28 20:41:58 crc kubenswrapper[4746]: I0128 20:41:58.807383    4746 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-27vdq" event={"ID":"7d1acc2c-a42e-40a6-8ef5-c7e10ce548ee","Type":"ContainerStarted","Data":"81286b2a9042b4d2b39427e4ab24713dfb969df805cf806d3f0da92549258bed"}
Jan 28 20:41:58 crc kubenswrapper[4746]: I0128 20:41:58.809719    4746 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-lwhk4\" (UID: \"627f2e7c-f091-4ea2-9c3c-fce02f2b7669\") " pod="openshift-image-registry/image-registry-697d97f7c8-lwhk4"
Jan 28 20:41:58 crc kubenswrapper[4746]: E0128 20:41:58.810173    4746 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-01-28 20:41:59.310156566 +0000 UTC m=+147.266342910 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-lwhk4" (UID: "627f2e7c-f091-4ea2-9c3c-fce02f2b7669") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Jan 28 20:41:58 crc kubenswrapper[4746]: I0128 20:41:58.908692    4746 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-8vvmj" podStartSLOduration=125.908663009 podStartE2EDuration="2m5.908663009s" podCreationTimestamp="2026-01-28 20:39:53 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-28 20:41:58.88666872 +0000 UTC m=+146.842855074" watchObservedRunningTime="2026-01-28 20:41:58.908663009 +0000 UTC m=+146.864849363"
Jan 28 20:41:58 crc kubenswrapper[4746]: I0128 20:41:58.909398    4746 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-machine-api/machine-api-operator-5694c8668f-lzj8l" podStartSLOduration=125.909354899 podStartE2EDuration="2m5.909354899s" podCreationTimestamp="2026-01-28 20:39:53 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-28 20:41:58.843052925 +0000 UTC m=+146.799239279" watchObservedRunningTime="2026-01-28 20:41:58.909354899 +0000 UTC m=+146.865541253"
Jan 28 20:41:58 crc kubenswrapper[4746]: I0128 20:41:58.910414    4746 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2027-01-28 20:36:57 +0000 UTC, rotation deadline is 2026-12-19 22:59:30.481565578 +0000 UTC
Jan 28 20:41:58 crc kubenswrapper[4746]: I0128 20:41:58.910463    4746 certificate_manager.go:356] kubernetes.io/kubelet-serving: Waiting 7802h17m31.571106658s for next certificate rotation
Jan 28 20:41:58 crc kubenswrapper[4746]: I0128 20:41:58.911044    4746 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") "
Jan 28 20:41:58 crc kubenswrapper[4746]: E0128 20:41:58.913319    4746 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-01-28 20:41:59.413303881 +0000 UTC m=+147.369490235 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Jan 28 20:41:58 crc kubenswrapper[4746]: I0128 20:41:58.929926    4746 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-controller-manager/controller-manager-879f6c89f-5d8cs" podStartSLOduration=126.929892955 podStartE2EDuration="2m6.929892955s" podCreationTimestamp="2026-01-28 20:39:52 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-28 20:41:58.917729498 +0000 UTC m=+146.873915852" watchObservedRunningTime="2026-01-28 20:41:58.929892955 +0000 UTC m=+146.886079309"
Jan 28 20:41:58 crc kubenswrapper[4746]: I0128 20:41:58.957834    4746 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-admission-controller-857f4d67dd-lqlcg" event={"ID":"aa7d6dd6-1d54-400f-a188-628f99083f93","Type":"ContainerStarted","Data":"065036a2fd8bbca647796504f06c5db821eaa425db3c86ac07b8f0eb7fd4fac0"}
Jan 28 20:41:59 crc kubenswrapper[4746]: I0128 20:41:59.014545    4746 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-lwhk4\" (UID: \"627f2e7c-f091-4ea2-9c3c-fce02f2b7669\") " pod="openshift-image-registry/image-registry-697d97f7c8-lwhk4"
Jan 28 20:41:59 crc kubenswrapper[4746]: E0128 20:41:59.015526    4746 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-01-28 20:41:59.5155144 +0000 UTC m=+147.471700754 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-lwhk4" (UID: "627f2e7c-f091-4ea2-9c3c-fce02f2b7669") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Jan 28 20:41:59 crc kubenswrapper[4746]: I0128 20:41:59.030015    4746 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-562pf" event={"ID":"8b8a9bdb-0612-4627-ba36-98293308c32d","Type":"ContainerStarted","Data":"a465b794c245fb4df23e1df1050f55e558549b1bb5364340c858aa084d28bdeb"}
Jan 28 20:41:59 crc kubenswrapper[4746]: I0128 20:41:59.061371    4746 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-562pf" podStartSLOduration=126.0613578 podStartE2EDuration="2m6.0613578s" podCreationTimestamp="2026-01-28 20:39:53 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-28 20:41:59.060727772 +0000 UTC m=+147.016914126" watchObservedRunningTime="2026-01-28 20:41:59.0613578 +0000 UTC m=+147.017544154"
Jan 28 20:41:59 crc kubenswrapper[4746]: I0128 20:41:59.094195    4746 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-operator-74547568cd-wh6h4" event={"ID":"89ebd250-beb4-4c8e-8889-bd221f68af5e","Type":"ContainerStarted","Data":"52c6baff7a792fd88fa30cf9a8f5609803bb62212752c932fd3eeabf1881a750"}
Jan 28 20:41:59 crc kubenswrapper[4746]: I0128 20:41:59.116047    4746 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") "
Jan 28 20:41:59 crc kubenswrapper[4746]: E0128 20:41:59.116265    4746 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-01-28 20:41:59.616237457 +0000 UTC m=+147.572423811 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Jan 28 20:41:59 crc kubenswrapper[4746]: I0128 20:41:59.116591    4746 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-lwhk4\" (UID: \"627f2e7c-f091-4ea2-9c3c-fce02f2b7669\") " pod="openshift-image-registry/image-registry-697d97f7c8-lwhk4"
Jan 28 20:41:59 crc kubenswrapper[4746]: E0128 20:41:59.117966    4746 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-01-28 20:41:59.617954067 +0000 UTC m=+147.574140421 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-lwhk4" (UID: "627f2e7c-f091-4ea2-9c3c-fce02f2b7669") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Jan 28 20:41:59 crc kubenswrapper[4746]: I0128 20:41:59.118455    4746 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-storage-version-migrator/migrator-59844c95c7-s8b4b" event={"ID":"4e03d657-3b57-4eea-bb77-f5fe3a519cac","Type":"ContainerStarted","Data":"2d486b67c9903c95e156433ba72a87b362147a61d4f0c0221e76824fc3df7884"}
Jan 28 20:41:59 crc kubenswrapper[4746]: I0128 20:41:59.135695    4746 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-api/control-plane-machine-set-operator-78cbb6b69f-mznpb" event={"ID":"72e0847f-0a87-4710-9765-a10282cc0529","Type":"ContainerStarted","Data":"76fa19d9f11e023ca835e22fa6b799969c6f8ec6a129c87d21ec3b4c3a3c02dd"}
Jan 28 20:41:59 crc kubenswrapper[4746]: I0128 20:41:59.159248    4746 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-hlqz9" event={"ID":"0cedee54-2c6e-44d4-a51a-b5f8d3ff0833","Type":"ContainerStarted","Data":"60dad2cd691811a3c347ef64d6a80300642ba37a8530d772cab87536707ef762"}
Jan 28 20:41:59 crc kubenswrapper[4746]: I0128 20:41:59.172506    4746 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-dns/dns-default-6hnhg" event={"ID":"250aa960-c630-48e1-b8a2-8b34917bccb1","Type":"ContainerStarted","Data":"45c5e68b01fd8a5f616c32c1d62e08965a0745fc777fd583ee881a1b4ce20f71"}
Jan 28 20:41:59 crc kubenswrapper[4746]: I0128 20:41:59.192692    4746 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-server-q9qqq" event={"ID":"1f9d4298-e60e-4a66-94fd-20b80ac131cf","Type":"ContainerStarted","Data":"d1cee12b62b4010e545d05dc3414e1c1de3b52eac0350cfef9f93864549465fc"}
Jan 28 20:41:59 crc kubenswrapper[4746]: I0128 20:41:59.193333    4746 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-kube-storage-version-migrator/migrator-59844c95c7-s8b4b" podStartSLOduration=126.193316769 podStartE2EDuration="2m6.193316769s" podCreationTimestamp="2026-01-28 20:39:53 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-28 20:41:59.144233057 +0000 UTC m=+147.100419421" watchObservedRunningTime="2026-01-28 20:41:59.193316769 +0000 UTC m=+147.149503123"
Jan 28 20:41:59 crc kubenswrapper[4746]: I0128 20:41:59.193431    4746 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-hlqz9" podStartSLOduration=126.193426533 podStartE2EDuration="2m6.193426533s" podCreationTimestamp="2026-01-28 20:39:53 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-28 20:41:59.192859866 +0000 UTC m=+147.149046220" watchObservedRunningTime="2026-01-28 20:41:59.193426533 +0000 UTC m=+147.149612887"
Jan 28 20:41:59 crc kubenswrapper[4746]: I0128 20:41:59.206002    4746 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/marketplace-operator-79b997595-sqh2q" event={"ID":"3edaca00-e1a6-4b56-9290-cad6311263ee","Type":"ContainerStarted","Data":"bf46698bee7daf86d3e201e06503c2f58587dd4227307d61a9381a6addafc3cf"}
Jan 28 20:41:59 crc kubenswrapper[4746]: I0128 20:41:59.208421    4746 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/marketplace-operator-79b997595-sqh2q"
Jan 28 20:41:59 crc kubenswrapper[4746]: I0128 20:41:59.208703    4746 patch_prober.go:28] interesting pod/marketplace-operator-79b997595-sqh2q container/marketplace-operator namespace/openshift-marketplace: Readiness probe status=failure output="Get \"http://10.217.0.40:8080/healthz\": dial tcp 10.217.0.40:8080: connect: connection refused" start-of-body=
Jan 28 20:41:59 crc kubenswrapper[4746]: I0128 20:41:59.210313    4746 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-marketplace/marketplace-operator-79b997595-sqh2q" podUID="3edaca00-e1a6-4b56-9290-cad6311263ee" containerName="marketplace-operator" probeResult="failure" output="Get \"http://10.217.0.40:8080/healthz\": dial tcp 10.217.0.40:8080: connect: connection refused"
Jan 28 20:41:59 crc kubenswrapper[4746]: I0128 20:41:59.218400    4746 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") "
Jan 28 20:41:59 crc kubenswrapper[4746]: E0128 20:41:59.221133    4746 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-01-28 20:41:59.721109643 +0000 UTC m=+147.677295997 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Jan 28 20:41:59 crc kubenswrapper[4746]: I0128 20:41:59.256899    4746 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-m6pvw" event={"ID":"5b1d3e40-8458-4661-854f-c16ab4cd7596","Type":"ContainerStarted","Data":"0818f37ea6a4be6dccfd2518c7e266c34a0bf86a520d4ffd73da767c47aaa3f1"}
Jan 28 20:41:59 crc kubenswrapper[4746]: I0128 20:41:59.262633    4746 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-machine-config-operator/machine-config-server-q9qqq" podStartSLOduration=7.262617269 podStartE2EDuration="7.262617269s" podCreationTimestamp="2026-01-28 20:41:52 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-28 20:41:59.256505434 +0000 UTC m=+147.212691788" watchObservedRunningTime="2026-01-28 20:41:59.262617269 +0000 UTC m=+147.218803623"
Jan 28 20:41:59 crc kubenswrapper[4746]: I0128 20:41:59.262916    4746 patch_prober.go:28] interesting pod/router-default-5444994796-rjmhd container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld
Jan 28 20:41:59 crc kubenswrapper[4746]: [-]has-synced failed: reason withheld
Jan 28 20:41:59 crc kubenswrapper[4746]: [+]process-running ok
Jan 28 20:41:59 crc kubenswrapper[4746]: healthz check failed
Jan 28 20:41:59 crc kubenswrapper[4746]: I0128 20:41:59.266284    4746 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5444994796-rjmhd" podUID="c068f936-795b-4eb3-83a8-e363131119e9" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500"
Jan 28 20:41:59 crc kubenswrapper[4746]: I0128 20:41:59.279008    4746 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-controller-84d6567774-rbl7q" event={"ID":"bcb19d77-0dbf-4d14-a86c-4cd7e65211e0","Type":"ContainerStarted","Data":"529669d694418486f2c76e589f0a5e73637c389116cc8f4c33e65880473b0400"}
Jan 28 20:41:59 crc kubenswrapper[4746]: I0128 20:41:59.279050    4746 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-controller-84d6567774-rbl7q" event={"ID":"bcb19d77-0dbf-4d14-a86c-4cd7e65211e0","Type":"ContainerStarted","Data":"b08a22aeb1f03f1684fdef8782de3c00878b60347363739b7a516ad25de3f4c7"}
Jan 28 20:41:59 crc kubenswrapper[4746]: I0128 20:41:59.287533    4746 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/marketplace-operator-79b997595-sqh2q" podStartSLOduration=126.28751892 podStartE2EDuration="2m6.28751892s" podCreationTimestamp="2026-01-28 20:39:53 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-28 20:41:59.286416708 +0000 UTC m=+147.242603062" watchObservedRunningTime="2026-01-28 20:41:59.28751892 +0000 UTC m=+147.243705274"
Jan 28 20:41:59 crc kubenswrapper[4746]: I0128 20:41:59.314376    4746 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-m6pvw" podStartSLOduration=126.314355826 podStartE2EDuration="2m6.314355826s" podCreationTimestamp="2026-01-28 20:39:53 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-28 20:41:59.313891123 +0000 UTC m=+147.270077477" watchObservedRunningTime="2026-01-28 20:41:59.314355826 +0000 UTC m=+147.270542180"
Jan 28 20:41:59 crc kubenswrapper[4746]: I0128 20:41:59.321541    4746 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-lwhk4\" (UID: \"627f2e7c-f091-4ea2-9c3c-fce02f2b7669\") " pod="openshift-image-registry/image-registry-697d97f7c8-lwhk4"
Jan 28 20:41:59 crc kubenswrapper[4746]: E0128 20:41:59.323739    4746 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-01-28 20:41:59.823724644 +0000 UTC m=+147.779911098 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-lwhk4" (UID: "627f2e7c-f091-4ea2-9c3c-fce02f2b7669") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Jan 28 20:41:59 crc kubenswrapper[4746]: I0128 20:41:59.341453    4746 generic.go:334] "Generic (PLEG): container finished" podID="af3e8fcb-fbde-4b65-92ab-3d8b71b2de07" containerID="b2e96f60f43c325910fa88498c7fc4a008a03f91a0cb794b21e65a9533629dfd" exitCode=0
Jan 28 20:41:59 crc kubenswrapper[4746]: I0128 20:41:59.346605    4746 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-apiserver/apiserver-76f77b778f-sk7xz" event={"ID":"af3e8fcb-fbde-4b65-92ab-3d8b71b2de07","Type":"ContainerDied","Data":"b2e96f60f43c325910fa88498c7fc4a008a03f91a0cb794b21e65a9533629dfd"}
Jan 28 20:41:59 crc kubenswrapper[4746]: I0128 20:41:59.422140    4746 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") "
Jan 28 20:41:59 crc kubenswrapper[4746]: I0128 20:41:59.423461    4746 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-zzrh5" event={"ID":"71db2fbb-6c1a-417f-9cc4-a1ae041d0c0a","Type":"ContainerStarted","Data":"03262c9b75e343a64a0507d48d5aa33980519349ad25c827f8779eeb9b78d0d8"}
Jan 28 20:41:59 crc kubenswrapper[4746]: E0128 20:41:59.423552    4746 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-01-28 20:41:59.923521944 +0000 UTC m=+147.879708298 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Jan 28 20:41:59 crc kubenswrapper[4746]: I0128 20:41:59.432188    4746 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="hostpath-provisioner/csi-hostpathplugin-89zbv" event={"ID":"058433b4-653b-4170-83e2-ed7c5d753323","Type":"ContainerStarted","Data":"2c1f80bd7e56a7c91ef6b7ce17cd9bba86da61cd6955d590b36bff4996926cc7"}
Jan 28 20:41:59 crc kubenswrapper[4746]: I0128 20:41:59.447183    4746 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-gbwrh" event={"ID":"460c9e37-a0c0-43ea-9607-8f716e2e92bf","Type":"ContainerStarted","Data":"cd584fe2c251b15442e2d42ea32ed91133a3b4d207b40db7d59b6fe2817c81c7"}
Jan 28 20:41:59 crc kubenswrapper[4746]: I0128 20:41:59.485761    4746 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-etcd-operator/etcd-operator-b45778765-dbwsb" event={"ID":"f31f9ca0-e467-452d-90db-a28a4b69496e","Type":"ContainerStarted","Data":"0161c09355e56b9b0c0b06db78d514858270c842821091ae9bd7b6c81f0bd9fe"}
Jan 28 20:41:59 crc kubenswrapper[4746]: I0128 20:41:59.519230    4746 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ingress-operator/ingress-operator-5b745b69d9-h2q6n" event={"ID":"f3cd630b-3fe5-497f-90da-f58bcb7aac8b","Type":"ContainerStarted","Data":"b3aee6f76b0934310345eb08ac001057517c1e989e2d681d5810c417f6a6e98c"}
Jan 28 20:41:59 crc kubenswrapper[4746]: I0128 20:41:59.529991    4746 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-gbwrh" podStartSLOduration=126.529972494 podStartE2EDuration="2m6.529972494s" podCreationTimestamp="2026-01-28 20:39:53 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-28 20:41:59.478652058 +0000 UTC m=+147.434838412" watchObservedRunningTime="2026-01-28 20:41:59.529972494 +0000 UTC m=+147.486158848"
Jan 28 20:41:59 crc kubenswrapper[4746]: I0128 20:41:59.530950    4746 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-etcd-operator/etcd-operator-b45778765-dbwsb" podStartSLOduration=127.530945562 podStartE2EDuration="2m7.530945562s" podCreationTimestamp="2026-01-28 20:39:52 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-28 20:41:59.527544815 +0000 UTC m=+147.483731179" watchObservedRunningTime="2026-01-28 20:41:59.530945562 +0000 UTC m=+147.487131916"
Jan 28 20:41:59 crc kubenswrapper[4746]: I0128 20:41:59.534829    4746 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-lwhk4\" (UID: \"627f2e7c-f091-4ea2-9c3c-fce02f2b7669\") " pod="openshift-image-registry/image-registry-697d97f7c8-lwhk4"
Jan 28 20:41:59 crc kubenswrapper[4746]: E0128 20:41:59.536446    4746 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-01-28 20:42:00.036434239 +0000 UTC m=+147.992620593 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-lwhk4" (UID: "627f2e7c-f091-4ea2-9c3c-fce02f2b7669") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Jan 28 20:41:59 crc kubenswrapper[4746]: I0128 20:41:59.540205    4746 generic.go:334] "Generic (PLEG): container finished" podID="f69da289-e18c-4baa-ad6b-4f2e3a44cda5" containerID="539bda1fd4ba15eb6686c27f697e8ddf4ab4cf368f1589cc29f59d7e4def4806" exitCode=0
Jan 28 20:41:59 crc kubenswrapper[4746]: I0128 20:41:59.540264    4746 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-nz5l2" event={"ID":"f69da289-e18c-4baa-ad6b-4f2e3a44cda5","Type":"ContainerDied","Data":"539bda1fd4ba15eb6686c27f697e8ddf4ab4cf368f1589cc29f59d7e4def4806"}
Jan 28 20:41:59 crc kubenswrapper[4746]: I0128 20:41:59.562307    4746 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-wb74c" event={"ID":"4886a33e-8379-47d5-ac13-e58bc623d01c","Type":"ContainerStarted","Data":"180c0671018935165c04a792845788babbd329784498a9f0721c079febcc2dd2"}
Jan 28 20:41:59 crc kubenswrapper[4746]: I0128 20:41:59.563328    4746 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-wb74c"
Jan 28 20:41:59 crc kubenswrapper[4746]: I0128 20:41:59.573349    4746 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-ingress-operator/ingress-operator-5b745b69d9-h2q6n" podStartSLOduration=126.573334193 podStartE2EDuration="2m6.573334193s" podCreationTimestamp="2026-01-28 20:39:53 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-28 20:41:59.570468851 +0000 UTC m=+147.526655205" watchObservedRunningTime="2026-01-28 20:41:59.573334193 +0000 UTC m=+147.529520547"
Jan 28 20:41:59 crc kubenswrapper[4746]: I0128 20:41:59.579006    4746 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ingress-canary/ingress-canary-wpwwb" event={"ID":"ef8a3065-87eb-468f-a985-936f973c8f1a","Type":"ContainerStarted","Data":"d1804da84ca85b59d212c1b7ade665c7dfda4d63fe3a4ad76428d9262a3e2f38"}
Jan 28 20:41:59 crc kubenswrapper[4746]: I0128 20:41:59.611343    4746 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29493870-4jnh6" event={"ID":"c15d5c5e-19d5-43c6-8955-1bb2ad8b33b6","Type":"ContainerStarted","Data":"0285afbcb142cdf5b9d8c281aa5e2ffa73afb12d29dd61ed8bc2f60ab072fa58"}
Jan 28 20:41:59 crc kubenswrapper[4746]: I0128 20:41:59.638397    4746 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-service-ca/service-ca-9c57cc56f-vfcnn" event={"ID":"46800869-0687-4288-9a6e-3512b2e2c499","Type":"ContainerStarted","Data":"912b7bdced54de3a781bf80b76829f73ccf41310369d2a147606b7e779a844bb"}
Jan 28 20:41:59 crc kubenswrapper[4746]: I0128 20:41:59.638528    4746 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-wb74c"
Jan 28 20:41:59 crc kubenswrapper[4746]: I0128 20:41:59.639986    4746 patch_prober.go:28] interesting pod/downloads-7954f5f757-qrffw container/download-server namespace/openshift-console: Readiness probe status=failure output="Get \"http://10.217.0.25:8080/\": dial tcp 10.217.0.25:8080: connect: connection refused" start-of-body=
Jan 28 20:41:59 crc kubenswrapper[4746]: I0128 20:41:59.640029    4746 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-console/downloads-7954f5f757-qrffw" podUID="b34266ce-b971-4f4b-b8b7-c54ff8b6212c" containerName="download-server" probeResult="failure" output="Get \"http://10.217.0.25:8080/\": dial tcp 10.217.0.25:8080: connect: connection refused"
Jan 28 20:41:59 crc kubenswrapper[4746]: I0128 20:41:59.641372    4746 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") "
Jan 28 20:41:59 crc kubenswrapper[4746]: E0128 20:41:59.642527    4746 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-01-28 20:42:00.142512498 +0000 UTC m=+148.098698852 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Jan 28 20:41:59 crc kubenswrapper[4746]: I0128 20:41:59.651604    4746 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-wb74c" podStartSLOduration=126.651590308 podStartE2EDuration="2m6.651590308s" podCreationTimestamp="2026-01-28 20:39:53 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-28 20:41:59.648971633 +0000 UTC m=+147.605157987" watchObservedRunningTime="2026-01-28 20:41:59.651590308 +0000 UTC m=+147.607776662"
Jan 28 20:41:59 crc kubenswrapper[4746]: I0128 20:41:59.659738    4746 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-authentication/oauth-openshift-558db77b4-4g4p7"
Jan 28 20:41:59 crc kubenswrapper[4746]: I0128 20:41:59.705530    4746 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-service-ca/service-ca-9c57cc56f-vfcnn" podStartSLOduration=126.705512388 podStartE2EDuration="2m6.705512388s" podCreationTimestamp="2026-01-28 20:39:53 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-28 20:41:59.681527363 +0000 UTC m=+147.637713717" watchObservedRunningTime="2026-01-28 20:41:59.705512388 +0000 UTC m=+147.661698742"
Jan 28 20:41:59 crc kubenswrapper[4746]: I0128 20:41:59.727054    4746 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-operator-lifecycle-manager/collect-profiles-29493870-4jnh6" podStartSLOduration=126.727031453 podStartE2EDuration="2m6.727031453s" podCreationTimestamp="2026-01-28 20:39:53 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-28 20:41:59.724984294 +0000 UTC m=+147.681170648" watchObservedRunningTime="2026-01-28 20:41:59.727031453 +0000 UTC m=+147.683217807"
Jan 28 20:41:59 crc kubenswrapper[4746]: I0128 20:41:59.746385    4746 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-lwhk4\" (UID: \"627f2e7c-f091-4ea2-9c3c-fce02f2b7669\") " pod="openshift-image-registry/image-registry-697d97f7c8-lwhk4"
Jan 28 20:41:59 crc kubenswrapper[4746]: E0128 20:41:59.751118    4746 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-01-28 20:42:00.25109007 +0000 UTC m=+148.207276424 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-lwhk4" (UID: "627f2e7c-f091-4ea2-9c3c-fce02f2b7669") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Jan 28 20:41:59 crc kubenswrapper[4746]: I0128 20:41:59.793917    4746 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-ingress-canary/ingress-canary-wpwwb" podStartSLOduration=7.793898592 podStartE2EDuration="7.793898592s" podCreationTimestamp="2026-01-28 20:41:52 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-28 20:41:59.748470295 +0000 UTC m=+147.704656649" watchObservedRunningTime="2026-01-28 20:41:59.793898592 +0000 UTC m=+147.750084946"
Jan 28 20:41:59 crc kubenswrapper[4746]: I0128 20:41:59.847736    4746 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") "
Jan 28 20:41:59 crc kubenswrapper[4746]: E0128 20:41:59.848173    4746 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-01-28 20:42:00.348153432 +0000 UTC m=+148.304339786 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Jan 28 20:41:59 crc kubenswrapper[4746]: I0128 20:41:59.949733    4746 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-lwhk4\" (UID: \"627f2e7c-f091-4ea2-9c3c-fce02f2b7669\") " pod="openshift-image-registry/image-registry-697d97f7c8-lwhk4"
Jan 28 20:41:59 crc kubenswrapper[4746]: E0128 20:41:59.950142    4746 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-01-28 20:42:00.450130655 +0000 UTC m=+148.406317009 (durationBeforeRetry 500ms).
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-lwhk4" (UID: "627f2e7c-f091-4ea2-9c3c-fce02f2b7669") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Jan 28 20:42:00 crc kubenswrapper[4746]: I0128 20:42:00.050769 4746 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Jan 28 20:42:00 crc kubenswrapper[4746]: E0128 20:42:00.051293 4746 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-01-28 20:42:00.551268973 +0000 UTC m=+148.507455327 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Jan 28 20:42:00 crc kubenswrapper[4746]: I0128 20:42:00.051857 4746 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-lwhk4\" (UID: \"627f2e7c-f091-4ea2-9c3c-fce02f2b7669\") " pod="openshift-image-registry/image-registry-697d97f7c8-lwhk4" Jan 28 20:42:00 crc kubenswrapper[4746]: E0128 20:42:00.052296 4746 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-01-28 20:42:00.552280432 +0000 UTC m=+148.508466786 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-lwhk4" (UID: "627f2e7c-f091-4ea2-9c3c-fce02f2b7669") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Jan 28 20:42:00 crc kubenswrapper[4746]: I0128 20:42:00.153527 4746 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Jan 28 20:42:00 crc kubenswrapper[4746]: E0128 20:42:00.153669 4746 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-01-28 20:42:00.653650507 +0000 UTC m=+148.609836861 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Jan 28 20:42:00 crc kubenswrapper[4746]: I0128 20:42:00.154061 4746 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-lwhk4\" (UID: \"627f2e7c-f091-4ea2-9c3c-fce02f2b7669\") " pod="openshift-image-registry/image-registry-697d97f7c8-lwhk4" Jan 28 20:42:00 crc kubenswrapper[4746]: E0128 20:42:00.154373 4746 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-01-28 20:42:00.654364638 +0000 UTC m=+148.610550992 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-lwhk4" (UID: "627f2e7c-f091-4ea2-9c3c-fce02f2b7669") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Jan 28 20:42:00 crc kubenswrapper[4746]: I0128 20:42:00.257391 4746 patch_prober.go:28] interesting pod/router-default-5444994796-rjmhd container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Jan 28 20:42:00 crc kubenswrapper[4746]: [-]has-synced failed: reason withheld Jan 28 20:42:00 crc kubenswrapper[4746]: [+]process-running ok Jan 28 20:42:00 crc kubenswrapper[4746]: healthz check failed Jan 28 20:42:00 crc kubenswrapper[4746]: I0128 20:42:00.257455 4746 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5444994796-rjmhd" podUID="c068f936-795b-4eb3-83a8-e363131119e9" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Jan 28 20:42:00 crc kubenswrapper[4746]: I0128 20:42:00.258701 4746 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Jan 28 20:42:00 crc kubenswrapper[4746]: E0128 20:42:00.259600 4746 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. 
No retries permitted until 2026-01-28 20:42:00.759582773 +0000 UTC m=+148.715769127 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Jan 28 20:42:00 crc kubenswrapper[4746]: I0128 20:42:00.361058 4746 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-lwhk4\" (UID: \"627f2e7c-f091-4ea2-9c3c-fce02f2b7669\") " pod="openshift-image-registry/image-registry-697d97f7c8-lwhk4" Jan 28 20:42:00 crc kubenswrapper[4746]: E0128 20:42:00.361489 4746 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-01-28 20:42:00.861461353 +0000 UTC m=+148.817647707 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-lwhk4" (UID: "627f2e7c-f091-4ea2-9c3c-fce02f2b7669") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Jan 28 20:42:00 crc kubenswrapper[4746]: I0128 20:42:00.462899 4746 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Jan 28 20:42:00 crc kubenswrapper[4746]: E0128 20:42:00.463183 4746 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-01-28 20:42:00.963147397 +0000 UTC m=+148.919333751 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Jan 28 20:42:00 crc kubenswrapper[4746]: I0128 20:42:00.463907 4746 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-lwhk4\" (UID: \"627f2e7c-f091-4ea2-9c3c-fce02f2b7669\") " pod="openshift-image-registry/image-registry-697d97f7c8-lwhk4" Jan 28 20:42:00 crc kubenswrapper[4746]: E0128 20:42:00.464299 4746 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-01-28 20:42:00.96429001 +0000 UTC m=+148.920476364 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-lwhk4" (UID: "627f2e7c-f091-4ea2-9c3c-fce02f2b7669") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Jan 28 20:42:00 crc kubenswrapper[4746]: I0128 20:42:00.566048 4746 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Jan 28 20:42:00 crc kubenswrapper[4746]: E0128 20:42:00.566631 4746 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-01-28 20:42:01.066607432 +0000 UTC m=+149.022793786 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Jan 28 20:42:00 crc kubenswrapper[4746]: I0128 20:42:00.650698 4746 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-controller-84d6567774-rbl7q" event={"ID":"bcb19d77-0dbf-4d14-a86c-4cd7e65211e0","Type":"ContainerStarted","Data":"44e0ad5ebb14fb9d388f16505c3fc068912bfc5d438ba3275855a1d8bb8c636b"} Jan 28 20:42:00 crc kubenswrapper[4746]: I0128 20:42:00.654287 4746 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-storage-version-migrator/migrator-59844c95c7-s8b4b" event={"ID":"4e03d657-3b57-4eea-bb77-f5fe3a519cac","Type":"ContainerStarted","Data":"e71184b1543cda3ca302fc5c2af1bc381f9a647b710996644ec8ab557c0a999d"} Jan 28 20:42:00 crc kubenswrapper[4746]: I0128 20:42:00.656980 4746 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-zzrh5" event={"ID":"71db2fbb-6c1a-417f-9cc4-a1ae041d0c0a","Type":"ContainerStarted","Data":"b0e26b95c12964cce763bf828c22eb06bbbd5b13f5b5ed9e608dcc3db32f2bee"} Jan 28 20:42:00 crc kubenswrapper[4746]: I0128 20:42:00.657197 4746 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-zzrh5" Jan 28 20:42:00 crc kubenswrapper[4746]: I0128 20:42:00.667846 4746 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-nz5l2" 
event={"ID":"f69da289-e18c-4baa-ad6b-4f2e3a44cda5","Type":"ContainerStarted","Data":"2e8a00c1f12b7e21d791aa45c66feb85d9ea6a813ad1d7087babf7248e0c2635"} Jan 28 20:42:00 crc kubenswrapper[4746]: I0128 20:42:00.668031 4746 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-lwhk4\" (UID: \"627f2e7c-f091-4ea2-9c3c-fce02f2b7669\") " pod="openshift-image-registry/image-registry-697d97f7c8-lwhk4" Jan 28 20:42:00 crc kubenswrapper[4746]: I0128 20:42:00.668096 4746 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-cqllr\" (UniqueName: \"kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr\") pod \"network-check-target-xd92c\" (UID: \"3b6479f0-333b-4a96-9adf-2099afdc2447\") " pod="openshift-network-diagnostics/network-check-target-xd92c" Jan 28 20:42:00 crc kubenswrapper[4746]: I0128 20:42:00.668121 4746 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"networking-console-plugin-cert\" (UniqueName: \"kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Jan 28 20:42:00 crc kubenswrapper[4746]: I0128 20:42:00.668164 4746 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nginx-conf\" (UniqueName: \"kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Jan 28 20:42:00 crc kubenswrapper[4746]: E0128 20:42:00.668396 4746 
nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-01-28 20:42:01.168378369 +0000 UTC m=+149.124564723 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-lwhk4" (UID: "627f2e7c-f091-4ea2-9c3c-fce02f2b7669") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Jan 28 20:42:00 crc kubenswrapper[4746]: I0128 20:42:00.670536 4746 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"nginx-conf\" (UniqueName: \"kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Jan 28 20:42:00 crc kubenswrapper[4746]: I0128 20:42:00.676461 4746 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"networking-console-plugin-cert\" (UniqueName: \"kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Jan 28 20:42:00 crc kubenswrapper[4746]: I0128 20:42:00.682833 4746 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-cqllr\" (UniqueName: \"kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr\") pod \"network-check-target-xd92c\" (UID: \"3b6479f0-333b-4a96-9adf-2099afdc2447\") " 
pod="openshift-network-diagnostics/network-check-target-xd92c" Jan 28 20:42:00 crc kubenswrapper[4746]: I0128 20:42:00.693198 4746 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-admission-controller-857f4d67dd-lqlcg" event={"ID":"aa7d6dd6-1d54-400f-a188-628f99083f93","Type":"ContainerStarted","Data":"b217211e7cbaf748cdc9bceb0df808b21c134cbc75ddf3431f1f71cf26b16740"} Jan 28 20:42:00 crc kubenswrapper[4746]: I0128 20:42:00.700131 4746 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-dns/dns-default-6hnhg" event={"ID":"250aa960-c630-48e1-b8a2-8b34917bccb1","Type":"ContainerStarted","Data":"f2bcd9a7456f8dc352a80b9c10c9a906da1e15788de1aa40fa288822df3e72a5"} Jan 28 20:42:00 crc kubenswrapper[4746]: I0128 20:42:00.700191 4746 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-dns/dns-default-6hnhg" event={"ID":"250aa960-c630-48e1-b8a2-8b34917bccb1","Type":"ContainerStarted","Data":"1625d6d82074ce0384bc198a20f0f717db0c4d3af6a42ed6967b94ddaaa78823"} Jan 28 20:42:00 crc kubenswrapper[4746]: I0128 20:42:00.700972 4746 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-dns/dns-default-6hnhg" Jan 28 20:42:00 crc kubenswrapper[4746]: I0128 20:42:00.719420 4746 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-m6pvw" event={"ID":"5b1d3e40-8458-4661-854f-c16ab4cd7596","Type":"ContainerStarted","Data":"3a5b977b1f8f500d19e04707eaaa770a246c9950804876f0e0dd60cf0fcd0a58"} Jan 28 20:42:00 crc kubenswrapper[4746]: I0128 20:42:00.721738 4746 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/certified-operators-85ttn"] Jan 28 20:42:00 crc kubenswrapper[4746]: I0128 20:42:00.722990 4746 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-85ttn" Jan 28 20:42:00 crc kubenswrapper[4746]: I0128 20:42:00.736103 4746 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ingress-canary/ingress-canary-wpwwb" event={"ID":"ef8a3065-87eb-468f-a985-936f973c8f1a","Type":"ContainerStarted","Data":"610027f7025d6f019470ade09f77b016fd0162743f21f6b3fd18018ae8a0725d"} Jan 28 20:42:00 crc kubenswrapper[4746]: I0128 20:42:00.740569 4746 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"certified-operators-dockercfg-4rs5g" Jan 28 20:42:00 crc kubenswrapper[4746]: I0128 20:42:00.762640 4746 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-machine-config-operator/machine-config-controller-84d6567774-rbl7q" podStartSLOduration=127.762615651 podStartE2EDuration="2m7.762615651s" podCreationTimestamp="2026-01-28 20:39:53 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-28 20:42:00.740999854 +0000 UTC m=+148.697186198" watchObservedRunningTime="2026-01-28 20:42:00.762615651 +0000 UTC m=+148.718802005" Jan 28 20:42:00 crc kubenswrapper[4746]: I0128 20:42:00.763724 4746 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-85ttn"] Jan 28 20:42:00 crc kubenswrapper[4746]: I0128 20:42:00.763985 4746 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-27vdq" event={"ID":"7d1acc2c-a42e-40a6-8ef5-c7e10ce548ee","Type":"ContainerStarted","Data":"4882e1b5a1403ed1140b210f8143c69bbb04b68550c7e08cbf06d5bbe0cf0dde"} Jan 28 20:42:00 crc kubenswrapper[4746]: I0128 20:42:00.765191 4746 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-27vdq" Jan 28 20:42:00 crc kubenswrapper[4746]: I0128 20:42:00.771648 4746 
patch_prober.go:28] interesting pod/olm-operator-6b444d44fb-27vdq container/olm-operator namespace/openshift-operator-lifecycle-manager: Readiness probe status=failure output="Get \"https://10.217.0.37:8443/healthz\": dial tcp 10.217.0.37:8443: connect: connection refused" start-of-body= Jan 28 20:42:00 crc kubenswrapper[4746]: I0128 20:42:00.771702 4746 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-27vdq" podUID="7d1acc2c-a42e-40a6-8ef5-c7e10ce548ee" containerName="olm-operator" probeResult="failure" output="Get \"https://10.217.0.37:8443/healthz\": dial tcp 10.217.0.37:8443: connect: connection refused" Jan 28 20:42:00 crc kubenswrapper[4746]: I0128 20:42:00.772732 4746 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Jan 28 20:42:00 crc kubenswrapper[4746]: I0128 20:42:00.773598 4746 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-s2dwl\" (UniqueName: \"kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl\") pod \"network-check-source-55646444c4-trplf\" (UID: \"9d751cbb-f2e2-430d-9754-c882a5e924a5\") " pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Jan 28 20:42:00 crc kubenswrapper[4746]: E0128 20:42:00.775836 4746 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-01-28 20:42:01.275812168 +0000 UTC m=+149.231998522 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Jan 28 20:42:00 crc kubenswrapper[4746]: I0128 20:42:00.798933 4746 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-s2dwl\" (UniqueName: \"kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl\") pod \"network-check-source-55646444c4-trplf\" (UID: \"9d751cbb-f2e2-430d-9754-c882a5e924a5\") " pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Jan 28 20:42:00 crc kubenswrapper[4746]: I0128 20:42:00.804893 4746 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/marketplace-operator-79b997595-sqh2q" event={"ID":"3edaca00-e1a6-4b56-9290-cad6311263ee","Type":"ContainerStarted","Data":"247a6dffe30c252f58a1ab345b064bf16a2db97efef470149722bc5c23ef722d"} Jan 28 20:42:00 crc kubenswrapper[4746]: I0128 20:42:00.806218 4746 patch_prober.go:28] interesting pod/marketplace-operator-79b997595-sqh2q container/marketplace-operator namespace/openshift-marketplace: Readiness probe status=failure output="Get \"http://10.217.0.40:8080/healthz\": dial tcp 10.217.0.40:8080: connect: connection refused" start-of-body= Jan 28 20:42:00 crc kubenswrapper[4746]: I0128 20:42:00.806256 4746 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-marketplace/marketplace-operator-79b997595-sqh2q" podUID="3edaca00-e1a6-4b56-9290-cad6311263ee" containerName="marketplace-operator" probeResult="failure" output="Get \"http://10.217.0.40:8080/healthz\": dial tcp 10.217.0.40:8080: connect: connection refused" Jan 28 20:42:00 crc 
kubenswrapper[4746]: I0128 20:42:00.851002 4746 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-multus/multus-admission-controller-857f4d67dd-lqlcg" podStartSLOduration=127.850985044 podStartE2EDuration="2m7.850985044s" podCreationTimestamp="2026-01-28 20:39:53 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-28 20:42:00.849194964 +0000 UTC m=+148.805381318" watchObservedRunningTime="2026-01-28 20:42:00.850985044 +0000 UTC m=+148.807171398" Jan 28 20:42:00 crc kubenswrapper[4746]: I0128 20:42:00.855348 4746 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-dns-operator/dns-operator-744455d44c-q8fsc" event={"ID":"c29355d7-4e9a-4e5a-838e-77a4df7c2fda","Type":"ContainerStarted","Data":"b9a63d51e5068403e84dc22558e1da9b2a51adf7776558f72220ca2eded9fb36"} Jan 28 20:42:00 crc kubenswrapper[4746]: I0128 20:42:00.864428 4746 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Jan 28 20:42:00 crc kubenswrapper[4746]: I0128 20:42:00.866098 4746 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Jan 28 20:42:00 crc kubenswrapper[4746]: I0128 20:42:00.867887 4746 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/community-operators-ghx5p"] Jan 28 20:42:00 crc kubenswrapper[4746]: I0128 20:42:00.869107 4746 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-ghx5p" Jan 28 20:42:00 crc kubenswrapper[4746]: I0128 20:42:00.885259 4746 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"community-operators-dockercfg-dmngl" Jan 28 20:42:00 crc kubenswrapper[4746]: I0128 20:42:00.886673 4746 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-pvnnm\" (UniqueName: \"kubernetes.io/projected/6c585264-9fea-4d40-910d-68a31c553f76-kube-api-access-pvnnm\") pod \"certified-operators-85ttn\" (UID: \"6c585264-9fea-4d40-910d-68a31c553f76\") " pod="openshift-marketplace/certified-operators-85ttn" Jan 28 20:42:00 crc kubenswrapper[4746]: I0128 20:42:00.886845 4746 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/6c585264-9fea-4d40-910d-68a31c553f76-catalog-content\") pod \"certified-operators-85ttn\" (UID: \"6c585264-9fea-4d40-910d-68a31c553f76\") " pod="openshift-marketplace/certified-operators-85ttn" Jan 28 20:42:00 crc kubenswrapper[4746]: I0128 20:42:00.886943 4746 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-lwhk4\" (UID: \"627f2e7c-f091-4ea2-9c3c-fce02f2b7669\") " pod="openshift-image-registry/image-registry-697d97f7c8-lwhk4" Jan 28 20:42:00 crc kubenswrapper[4746]: I0128 20:42:00.887002 4746 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/6c585264-9fea-4d40-910d-68a31c553f76-utilities\") pod \"certified-operators-85ttn\" (UID: \"6c585264-9fea-4d40-910d-68a31c553f76\") " pod="openshift-marketplace/certified-operators-85ttn" Jan 28 20:42:00 crc 
kubenswrapper[4746]: E0128 20:42:00.889070 4746 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-01-28 20:42:01.389055773 +0000 UTC m=+149.345242127 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-lwhk4" (UID: "627f2e7c-f091-4ea2-9c3c-fce02f2b7669") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Jan 28 20:42:00 crc kubenswrapper[4746]: I0128 20:42:00.897605 4746 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-nz5l2" podStartSLOduration=127.897564225 podStartE2EDuration="2m7.897564225s" podCreationTimestamp="2026-01-28 20:39:53 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-28 20:42:00.896890326 +0000 UTC m=+148.853076680" watchObservedRunningTime="2026-01-28 20:42:00.897564225 +0000 UTC m=+148.853750579" Jan 28 20:42:00 crc kubenswrapper[4746]: I0128 20:42:00.905439 4746 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ingress-operator/ingress-operator-5b745b69d9-h2q6n" event={"ID":"f3cd630b-3fe5-497f-90da-f58bcb7aac8b","Type":"ContainerStarted","Data":"8c5cff0706b113508d2d286291d060232d125ffdd2aabf3645f07d0fb8c73075"} Jan 28 20:42:00 crc kubenswrapper[4746]: I0128 20:42:00.939536 4746 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-ghx5p"] Jan 28 20:42:00 crc kubenswrapper[4746]: I0128 20:42:00.959055 4746 pod_startup_latency_tracker.go:104] 
"Observed pod startup duration" pod="openshift-dns/dns-default-6hnhg" podStartSLOduration=8.959030861 podStartE2EDuration="8.959030861s" podCreationTimestamp="2026-01-28 20:41:52 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-28 20:42:00.938637598 +0000 UTC m=+148.894823952" watchObservedRunningTime="2026-01-28 20:42:00.959030861 +0000 UTC m=+148.915217215" Jan 28 20:42:00 crc kubenswrapper[4746]: I0128 20:42:00.959439 4746 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Jan 28 20:42:00 crc kubenswrapper[4746]: I0128 20:42:00.971389 4746 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-operator-74547568cd-wh6h4" event={"ID":"89ebd250-beb4-4c8e-8889-bd221f68af5e","Type":"ContainerStarted","Data":"b76de7d10c25ad5df8170b393e83eb999d858bbafbe3ebea89eb4c6e9464be59"} Jan 28 20:42:00 crc kubenswrapper[4746]: I0128 20:42:00.971457 4746 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-operator-74547568cd-wh6h4" event={"ID":"89ebd250-beb4-4c8e-8889-bd221f68af5e","Type":"ContainerStarted","Data":"b69ae21b2487173f1876f62ba34963da278a7ae889d6f23012f47392b9b5f931"} Jan 28 20:42:00 crc kubenswrapper[4746]: I0128 20:42:00.983476 4746 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-zzrh5" podStartSLOduration=127.983462579 podStartE2EDuration="2m7.983462579s" podCreationTimestamp="2026-01-28 20:39:53 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-28 20:42:00.981422181 +0000 UTC m=+148.937608535" watchObservedRunningTime="2026-01-28 20:42:00.983462579 +0000 UTC m=+148.939648933" Jan 28 20:42:01 
crc kubenswrapper[4746]: I0128 20:42:01.056794 4746 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Jan 28 20:42:01 crc kubenswrapper[4746]: I0128 20:42:01.056995 4746 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/fa890224-0942-4671-a9d8-97b6f465b0df-utilities\") pod \"community-operators-ghx5p\" (UID: \"fa890224-0942-4671-a9d8-97b6f465b0df\") " pod="openshift-marketplace/community-operators-ghx5p" Jan 28 20:42:01 crc kubenswrapper[4746]: I0128 20:42:01.057153 4746 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/6c585264-9fea-4d40-910d-68a31c553f76-catalog-content\") pod \"certified-operators-85ttn\" (UID: \"6c585264-9fea-4d40-910d-68a31c553f76\") " pod="openshift-marketplace/certified-operators-85ttn" Jan 28 20:42:01 crc kubenswrapper[4746]: I0128 20:42:01.057222 4746 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/fa890224-0942-4671-a9d8-97b6f465b0df-catalog-content\") pod \"community-operators-ghx5p\" (UID: \"fa890224-0942-4671-a9d8-97b6f465b0df\") " pod="openshift-marketplace/community-operators-ghx5p" Jan 28 20:42:01 crc kubenswrapper[4746]: I0128 20:42:01.057291 4746 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/6c585264-9fea-4d40-910d-68a31c553f76-utilities\") pod \"certified-operators-85ttn\" (UID: \"6c585264-9fea-4d40-910d-68a31c553f76\") " pod="openshift-marketplace/certified-operators-85ttn" Jan 28 20:42:01 
crc kubenswrapper[4746]: I0128 20:42:01.057332 4746 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-k2s85\" (UniqueName: \"kubernetes.io/projected/fa890224-0942-4671-a9d8-97b6f465b0df-kube-api-access-k2s85\") pod \"community-operators-ghx5p\" (UID: \"fa890224-0942-4671-a9d8-97b6f465b0df\") " pod="openshift-marketplace/community-operators-ghx5p" Jan 28 20:42:01 crc kubenswrapper[4746]: I0128 20:42:01.057372 4746 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-pvnnm\" (UniqueName: \"kubernetes.io/projected/6c585264-9fea-4d40-910d-68a31c553f76-kube-api-access-pvnnm\") pod \"certified-operators-85ttn\" (UID: \"6c585264-9fea-4d40-910d-68a31c553f76\") " pod="openshift-marketplace/certified-operators-85ttn" Jan 28 20:42:01 crc kubenswrapper[4746]: E0128 20:42:01.057842 4746 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-01-28 20:42:01.557817902 +0000 UTC m=+149.514004266 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Jan 28 20:42:01 crc kubenswrapper[4746]: I0128 20:42:01.058970 4746 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/6c585264-9fea-4d40-910d-68a31c553f76-catalog-content\") pod \"certified-operators-85ttn\" (UID: \"6c585264-9fea-4d40-910d-68a31c553f76\") " pod="openshift-marketplace/certified-operators-85ttn" Jan 28 20:42:01 crc kubenswrapper[4746]: I0128 20:42:01.061692 4746 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/6c585264-9fea-4d40-910d-68a31c553f76-utilities\") pod \"certified-operators-85ttn\" (UID: \"6c585264-9fea-4d40-910d-68a31c553f76\") " pod="openshift-marketplace/certified-operators-85ttn" Jan 28 20:42:01 crc kubenswrapper[4746]: I0128 20:42:01.100290 4746 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-27vdq" podStartSLOduration=128.100266304 podStartE2EDuration="2m8.100266304s" podCreationTimestamp="2026-01-28 20:39:53 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-28 20:42:01.064070221 +0000 UTC m=+149.020256585" watchObservedRunningTime="2026-01-28 20:42:01.100266304 +0000 UTC m=+149.056452658" Jan 28 20:42:01 crc kubenswrapper[4746]: I0128 20:42:01.144377 4746 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-pvnnm\" (UniqueName: 
\"kubernetes.io/projected/6c585264-9fea-4d40-910d-68a31c553f76-kube-api-access-pvnnm\") pod \"certified-operators-85ttn\" (UID: \"6c585264-9fea-4d40-910d-68a31c553f76\") " pod="openshift-marketplace/certified-operators-85ttn" Jan 28 20:42:01 crc kubenswrapper[4746]: I0128 20:42:01.145814 4746 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/certified-operators-c9v8x"] Jan 28 20:42:01 crc kubenswrapper[4746]: I0128 20:42:01.151630 4746 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-85ttn" Jan 28 20:42:01 crc kubenswrapper[4746]: I0128 20:42:01.153583 4746 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-c9v8x" Jan 28 20:42:01 crc kubenswrapper[4746]: I0128 20:42:01.160843 4746 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-k2s85\" (UniqueName: \"kubernetes.io/projected/fa890224-0942-4671-a9d8-97b6f465b0df-kube-api-access-k2s85\") pod \"community-operators-ghx5p\" (UID: \"fa890224-0942-4671-a9d8-97b6f465b0df\") " pod="openshift-marketplace/community-operators-ghx5p" Jan 28 20:42:01 crc kubenswrapper[4746]: I0128 20:42:01.160909 4746 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/fa890224-0942-4671-a9d8-97b6f465b0df-utilities\") pod \"community-operators-ghx5p\" (UID: \"fa890224-0942-4671-a9d8-97b6f465b0df\") " pod="openshift-marketplace/community-operators-ghx5p" Jan 28 20:42:01 crc kubenswrapper[4746]: I0128 20:42:01.160978 4746 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/fa890224-0942-4671-a9d8-97b6f465b0df-catalog-content\") pod \"community-operators-ghx5p\" (UID: \"fa890224-0942-4671-a9d8-97b6f465b0df\") " pod="openshift-marketplace/community-operators-ghx5p" Jan 28 20:42:01 crc 
kubenswrapper[4746]: I0128 20:42:01.161001 4746 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-lwhk4\" (UID: \"627f2e7c-f091-4ea2-9c3c-fce02f2b7669\") " pod="openshift-image-registry/image-registry-697d97f7c8-lwhk4" Jan 28 20:42:01 crc kubenswrapper[4746]: E0128 20:42:01.161424 4746 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-01-28 20:42:01.661400371 +0000 UTC m=+149.617586715 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-lwhk4" (UID: "627f2e7c-f091-4ea2-9c3c-fce02f2b7669") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Jan 28 20:42:01 crc kubenswrapper[4746]: I0128 20:42:01.162655 4746 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/fa890224-0942-4671-a9d8-97b6f465b0df-catalog-content\") pod \"community-operators-ghx5p\" (UID: \"fa890224-0942-4671-a9d8-97b6f465b0df\") " pod="openshift-marketplace/community-operators-ghx5p" Jan 28 20:42:01 crc kubenswrapper[4746]: I0128 20:42:01.175966 4746 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-apiserver/apiserver-76f77b778f-sk7xz" event={"ID":"af3e8fcb-fbde-4b65-92ab-3d8b71b2de07","Type":"ContainerStarted","Data":"1674fb3e38f15d74dd6c8e49f9c9e85a7391ca830e1520391b8dbce1da38e29f"} Jan 28 20:42:01 crc 
kubenswrapper[4746]: I0128 20:42:01.176023 4746 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-apiserver/apiserver-76f77b778f-sk7xz" event={"ID":"af3e8fcb-fbde-4b65-92ab-3d8b71b2de07","Type":"ContainerStarted","Data":"8d32c91e2466af2619e91bc7ea7949ba4e355665eb54d08f379b19a3d9bce7a7"} Jan 28 20:42:01 crc kubenswrapper[4746]: I0128 20:42:01.176847 4746 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/fa890224-0942-4671-a9d8-97b6f465b0df-utilities\") pod \"community-operators-ghx5p\" (UID: \"fa890224-0942-4671-a9d8-97b6f465b0df\") " pod="openshift-marketplace/community-operators-ghx5p" Jan 28 20:42:01 crc kubenswrapper[4746]: I0128 20:42:01.180541 4746 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-service-ca/service-ca-9c57cc56f-vfcnn" event={"ID":"46800869-0687-4288-9a6e-3512b2e2c499","Type":"ContainerStarted","Data":"34dbc9557dfbaf0fbbf2d916b72c1cc4f6d137c43d069ea246907257671f2455"} Jan 28 20:42:01 crc kubenswrapper[4746]: I0128 20:42:01.182714 4746 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-config-operator/openshift-config-operator-7777fb866f-rktng" event={"ID":"18ad1fee-6a3a-4b98-83c5-6a13f22699db","Type":"ContainerStarted","Data":"1a77c6970c627c104d72a565351a1a797b206bfed637ca44d9c5ecc8caff2563"} Jan 28 20:42:01 crc kubenswrapper[4746]: I0128 20:42:01.183478 4746 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-config-operator/openshift-config-operator-7777fb866f-rktng" Jan 28 20:42:01 crc kubenswrapper[4746]: I0128 20:42:01.184442 4746 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-service-ca-operator/service-ca-operator-777779d784-62cbt" event={"ID":"3c71e425-b304-49f8-ac53-5e1383f73eb7","Type":"ContainerStarted","Data":"da482b56ea6abbdd0b0eff2a67c459dfb9851405dc28db886f79c54eb421119e"} Jan 28 20:42:01 crc kubenswrapper[4746]: I0128 20:42:01.189427 4746 
pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-machine-config-operator/machine-config-operator-74547568cd-wh6h4" podStartSLOduration=128.18939184 podStartE2EDuration="2m8.18939184s" podCreationTimestamp="2026-01-28 20:39:53 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-28 20:42:01.140162764 +0000 UTC m=+149.096349118" watchObservedRunningTime="2026-01-28 20:42:01.18939184 +0000 UTC m=+149.145578194" Jan 28 20:42:01 crc kubenswrapper[4746]: I0128 20:42:01.190002 4746 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-c9v8x"] Jan 28 20:42:01 crc kubenswrapper[4746]: I0128 20:42:01.213998 4746 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-dns-operator/dns-operator-744455d44c-q8fsc" podStartSLOduration=129.213981793 podStartE2EDuration="2m9.213981793s" podCreationTimestamp="2026-01-28 20:39:52 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-28 20:42:01.191321905 +0000 UTC m=+149.147508259" watchObservedRunningTime="2026-01-28 20:42:01.213981793 +0000 UTC m=+149.170168147" Jan 28 20:42:01 crc kubenswrapper[4746]: I0128 20:42:01.217294 4746 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-k2s85\" (UniqueName: \"kubernetes.io/projected/fa890224-0942-4671-a9d8-97b6f465b0df-kube-api-access-k2s85\") pod \"community-operators-ghx5p\" (UID: \"fa890224-0942-4671-a9d8-97b6f465b0df\") " pod="openshift-marketplace/community-operators-ghx5p" Jan 28 20:42:01 crc kubenswrapper[4746]: I0128 20:42:01.236892 4746 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="hostpath-provisioner/csi-hostpathplugin-89zbv" 
event={"ID":"058433b4-653b-4170-83e2-ed7c5d753323","Type":"ContainerStarted","Data":"d65fc119cf4b76e294c34e75339fbc60b10f845e0f754bd7659156c7917e0235"} Jan 28 20:42:01 crc kubenswrapper[4746]: I0128 20:42:01.239543 4746 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-ghx5p" Jan 28 20:42:01 crc kubenswrapper[4746]: I0128 20:42:01.261736 4746 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Jan 28 20:42:01 crc kubenswrapper[4746]: I0128 20:42:01.262014 4746 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-pjzkd\" (UniqueName: \"kubernetes.io/projected/abe404c6-f1c8-4ad6-92b9-7c082b112b50-kube-api-access-pjzkd\") pod \"certified-operators-c9v8x\" (UID: \"abe404c6-f1c8-4ad6-92b9-7c082b112b50\") " pod="openshift-marketplace/certified-operators-c9v8x" Jan 28 20:42:01 crc kubenswrapper[4746]: I0128 20:42:01.262069 4746 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/abe404c6-f1c8-4ad6-92b9-7c082b112b50-utilities\") pod \"certified-operators-c9v8x\" (UID: \"abe404c6-f1c8-4ad6-92b9-7c082b112b50\") " pod="openshift-marketplace/certified-operators-c9v8x" Jan 28 20:42:01 crc kubenswrapper[4746]: I0128 20:42:01.262109 4746 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/abe404c6-f1c8-4ad6-92b9-7c082b112b50-catalog-content\") pod \"certified-operators-c9v8x\" (UID: \"abe404c6-f1c8-4ad6-92b9-7c082b112b50\") " pod="openshift-marketplace/certified-operators-c9v8x" Jan 28 
20:42:01 crc kubenswrapper[4746]: E0128 20:42:01.262215 4746 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-01-28 20:42:01.76219949 +0000 UTC m=+149.718385844 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Jan 28 20:42:01 crc kubenswrapper[4746]: I0128 20:42:01.272198 4746 patch_prober.go:28] interesting pod/router-default-5444994796-rjmhd container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Jan 28 20:42:01 crc kubenswrapper[4746]: [-]has-synced failed: reason withheld Jan 28 20:42:01 crc kubenswrapper[4746]: [+]process-running ok Jan 28 20:42:01 crc kubenswrapper[4746]: healthz check failed Jan 28 20:42:01 crc kubenswrapper[4746]: I0128 20:42:01.272255 4746 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5444994796-rjmhd" podUID="c068f936-795b-4eb3-83a8-e363131119e9" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Jan 28 20:42:01 crc kubenswrapper[4746]: I0128 20:42:01.278373 4746 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-controller-manager/controller-manager-879f6c89f-5d8cs" Jan 28 20:42:01 crc kubenswrapper[4746]: I0128 20:42:01.297109 4746 kubelet.go:2421] "SyncLoop ADD" source="api" 
pods=["openshift-marketplace/community-operators-hxz5g"] Jan 28 20:42:01 crc kubenswrapper[4746]: I0128 20:42:01.299625 4746 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-hxz5g" Jan 28 20:42:01 crc kubenswrapper[4746]: I0128 20:42:01.327855 4746 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-hxz5g"] Jan 28 20:42:01 crc kubenswrapper[4746]: I0128 20:42:01.347212 4746 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-config-operator/openshift-config-operator-7777fb866f-rktng" podStartSLOduration=129.347182087 podStartE2EDuration="2m9.347182087s" podCreationTimestamp="2026-01-28 20:39:52 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-28 20:42:01.316500301 +0000 UTC m=+149.272686665" watchObservedRunningTime="2026-01-28 20:42:01.347182087 +0000 UTC m=+149.303368441" Jan 28 20:42:01 crc kubenswrapper[4746]: I0128 20:42:01.358262 4746 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-service-ca-operator/service-ca-operator-777779d784-62cbt" podStartSLOduration=128.358247773 podStartE2EDuration="2m8.358247773s" podCreationTimestamp="2026-01-28 20:39:53 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-28 20:42:01.356470072 +0000 UTC m=+149.312656426" watchObservedRunningTime="2026-01-28 20:42:01.358247773 +0000 UTC m=+149.314434127" Jan 28 20:42:01 crc kubenswrapper[4746]: I0128 20:42:01.370163 4746 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/f8f220cb-71e1-4b97-960e-ef8742661130-catalog-content\") pod \"community-operators-hxz5g\" (UID: \"f8f220cb-71e1-4b97-960e-ef8742661130\") " 
pod="openshift-marketplace/community-operators-hxz5g" Jan 28 20:42:01 crc kubenswrapper[4746]: I0128 20:42:01.370205 4746 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/abe404c6-f1c8-4ad6-92b9-7c082b112b50-utilities\") pod \"certified-operators-c9v8x\" (UID: \"abe404c6-f1c8-4ad6-92b9-7c082b112b50\") " pod="openshift-marketplace/certified-operators-c9v8x" Jan 28 20:42:01 crc kubenswrapper[4746]: I0128 20:42:01.370337 4746 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/abe404c6-f1c8-4ad6-92b9-7c082b112b50-catalog-content\") pod \"certified-operators-c9v8x\" (UID: \"abe404c6-f1c8-4ad6-92b9-7c082b112b50\") " pod="openshift-marketplace/certified-operators-c9v8x" Jan 28 20:42:01 crc kubenswrapper[4746]: I0128 20:42:01.370472 4746 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-lwhk4\" (UID: \"627f2e7c-f091-4ea2-9c3c-fce02f2b7669\") " pod="openshift-image-registry/image-registry-697d97f7c8-lwhk4" Jan 28 20:42:01 crc kubenswrapper[4746]: I0128 20:42:01.370725 4746 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-f6jsk\" (UniqueName: \"kubernetes.io/projected/f8f220cb-71e1-4b97-960e-ef8742661130-kube-api-access-f6jsk\") pod \"community-operators-hxz5g\" (UID: \"f8f220cb-71e1-4b97-960e-ef8742661130\") " pod="openshift-marketplace/community-operators-hxz5g" Jan 28 20:42:01 crc kubenswrapper[4746]: I0128 20:42:01.370760 4746 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/f8f220cb-71e1-4b97-960e-ef8742661130-utilities\") pod 
\"community-operators-hxz5g\" (UID: \"f8f220cb-71e1-4b97-960e-ef8742661130\") " pod="openshift-marketplace/community-operators-hxz5g" Jan 28 20:42:01 crc kubenswrapper[4746]: I0128 20:42:01.370813 4746 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-pjzkd\" (UniqueName: \"kubernetes.io/projected/abe404c6-f1c8-4ad6-92b9-7c082b112b50-kube-api-access-pjzkd\") pod \"certified-operators-c9v8x\" (UID: \"abe404c6-f1c8-4ad6-92b9-7c082b112b50\") " pod="openshift-marketplace/certified-operators-c9v8x" Jan 28 20:42:01 crc kubenswrapper[4746]: I0128 20:42:01.391418 4746 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/abe404c6-f1c8-4ad6-92b9-7c082b112b50-utilities\") pod \"certified-operators-c9v8x\" (UID: \"abe404c6-f1c8-4ad6-92b9-7c082b112b50\") " pod="openshift-marketplace/certified-operators-c9v8x" Jan 28 20:42:01 crc kubenswrapper[4746]: I0128 20:42:01.397375 4746 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/abe404c6-f1c8-4ad6-92b9-7c082b112b50-catalog-content\") pod \"certified-operators-c9v8x\" (UID: \"abe404c6-f1c8-4ad6-92b9-7c082b112b50\") " pod="openshift-marketplace/certified-operators-c9v8x" Jan 28 20:42:01 crc kubenswrapper[4746]: E0128 20:42:01.397554 4746 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-01-28 20:42:01.897529155 +0000 UTC m=+149.853715589 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-lwhk4" (UID: "627f2e7c-f091-4ea2-9c3c-fce02f2b7669") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Jan 28 20:42:01 crc kubenswrapper[4746]: I0128 20:42:01.426573 4746 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-apiserver/apiserver-76f77b778f-sk7xz" podStartSLOduration=129.426552103 podStartE2EDuration="2m9.426552103s" podCreationTimestamp="2026-01-28 20:39:52 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-28 20:42:01.415640143 +0000 UTC m=+149.371826507" watchObservedRunningTime="2026-01-28 20:42:01.426552103 +0000 UTC m=+149.382738457" Jan 28 20:42:01 crc kubenswrapper[4746]: I0128 20:42:01.452506 4746 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-pjzkd\" (UniqueName: \"kubernetes.io/projected/abe404c6-f1c8-4ad6-92b9-7c082b112b50-kube-api-access-pjzkd\") pod \"certified-operators-c9v8x\" (UID: \"abe404c6-f1c8-4ad6-92b9-7c082b112b50\") " pod="openshift-marketplace/certified-operators-c9v8x" Jan 28 20:42:01 crc kubenswrapper[4746]: I0128 20:42:01.482907 4746 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Jan 28 20:42:01 crc kubenswrapper[4746]: I0128 20:42:01.483403 4746 reconciler_common.go:218] "operationExecutor.MountVolume started for volume 
\"kube-api-access-f6jsk\" (UniqueName: \"kubernetes.io/projected/f8f220cb-71e1-4b97-960e-ef8742661130-kube-api-access-f6jsk\") pod \"community-operators-hxz5g\" (UID: \"f8f220cb-71e1-4b97-960e-ef8742661130\") " pod="openshift-marketplace/community-operators-hxz5g" Jan 28 20:42:01 crc kubenswrapper[4746]: I0128 20:42:01.483446 4746 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/f8f220cb-71e1-4b97-960e-ef8742661130-utilities\") pod \"community-operators-hxz5g\" (UID: \"f8f220cb-71e1-4b97-960e-ef8742661130\") " pod="openshift-marketplace/community-operators-hxz5g" Jan 28 20:42:01 crc kubenswrapper[4746]: I0128 20:42:01.483510 4746 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/f8f220cb-71e1-4b97-960e-ef8742661130-catalog-content\") pod \"community-operators-hxz5g\" (UID: \"f8f220cb-71e1-4b97-960e-ef8742661130\") " pod="openshift-marketplace/community-operators-hxz5g" Jan 28 20:42:01 crc kubenswrapper[4746]: I0128 20:42:01.484104 4746 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/f8f220cb-71e1-4b97-960e-ef8742661130-catalog-content\") pod \"community-operators-hxz5g\" (UID: \"f8f220cb-71e1-4b97-960e-ef8742661130\") " pod="openshift-marketplace/community-operators-hxz5g" Jan 28 20:42:01 crc kubenswrapper[4746]: E0128 20:42:01.484498 4746 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-01-28 20:42:01.984458358 +0000 UTC m=+149.940644712 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Jan 28 20:42:01 crc kubenswrapper[4746]: I0128 20:42:01.484779 4746 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/f8f220cb-71e1-4b97-960e-ef8742661130-utilities\") pod \"community-operators-hxz5g\" (UID: \"f8f220cb-71e1-4b97-960e-ef8742661130\") " pod="openshift-marketplace/community-operators-hxz5g" Jan 28 20:42:01 crc kubenswrapper[4746]: I0128 20:42:01.532143 4746 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-f6jsk\" (UniqueName: \"kubernetes.io/projected/f8f220cb-71e1-4b97-960e-ef8742661130-kube-api-access-f6jsk\") pod \"community-operators-hxz5g\" (UID: \"f8f220cb-71e1-4b97-960e-ef8742661130\") " pod="openshift-marketplace/community-operators-hxz5g" Jan 28 20:42:01 crc kubenswrapper[4746]: I0128 20:42:01.536832 4746 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-c9v8x" Jan 28 20:42:01 crc kubenswrapper[4746]: I0128 20:42:01.585101 4746 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-lwhk4\" (UID: \"627f2e7c-f091-4ea2-9c3c-fce02f2b7669\") " pod="openshift-image-registry/image-registry-697d97f7c8-lwhk4" Jan 28 20:42:01 crc kubenswrapper[4746]: E0128 20:42:01.585461 4746 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-01-28 20:42:02.085449622 +0000 UTC m=+150.041635976 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-lwhk4" (UID: "627f2e7c-f091-4ea2-9c3c-fce02f2b7669") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Jan 28 20:42:01 crc kubenswrapper[4746]: I0128 20:42:01.687808 4746 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Jan 28 20:42:01 crc kubenswrapper[4746]: E0128 20:42:01.687916 4746 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 
podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-01-28 20:42:02.187900708 +0000 UTC m=+150.144087062 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Jan 28 20:42:01 crc kubenswrapper[4746]: I0128 20:42:01.688146 4746 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-lwhk4\" (UID: \"627f2e7c-f091-4ea2-9c3c-fce02f2b7669\") " pod="openshift-image-registry/image-registry-697d97f7c8-lwhk4" Jan 28 20:42:01 crc kubenswrapper[4746]: E0128 20:42:01.688480 4746 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-01-28 20:42:02.188445764 +0000 UTC m=+150.144632118 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-lwhk4" (UID: "627f2e7c-f091-4ea2-9c3c-fce02f2b7669") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Jan 28 20:42:01 crc kubenswrapper[4746]: I0128 20:42:01.721422 4746 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-hxz5g" Jan 28 20:42:01 crc kubenswrapper[4746]: I0128 20:42:01.793264 4746 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Jan 28 20:42:01 crc kubenswrapper[4746]: E0128 20:42:01.793464 4746 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-01-28 20:42:02.293430843 +0000 UTC m=+150.249617197 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Jan 28 20:42:01 crc kubenswrapper[4746]: I0128 20:42:01.793619 4746 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-lwhk4\" (UID: \"627f2e7c-f091-4ea2-9c3c-fce02f2b7669\") " pod="openshift-image-registry/image-registry-697d97f7c8-lwhk4" Jan 28 20:42:01 crc kubenswrapper[4746]: E0128 20:42:01.794109 4746 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-01-28 20:42:02.294092531 +0000 UTC m=+150.250278895 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-lwhk4" (UID: "627f2e7c-f091-4ea2-9c3c-fce02f2b7669") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Jan 28 20:42:01 crc kubenswrapper[4746]: I0128 20:42:01.896054 4746 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Jan 28 20:42:01 crc kubenswrapper[4746]: E0128 20:42:01.896734 4746 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-01-28 20:42:02.396710552 +0000 UTC m=+150.352896906 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Jan 28 20:42:01 crc kubenswrapper[4746]: I0128 20:42:01.997453 4746 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-lwhk4\" (UID: \"627f2e7c-f091-4ea2-9c3c-fce02f2b7669\") " pod="openshift-image-registry/image-registry-697d97f7c8-lwhk4" Jan 28 20:42:01 crc kubenswrapper[4746]: E0128 20:42:01.998325 4746 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-01-28 20:42:02.498308894 +0000 UTC m=+150.454495248 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-lwhk4" (UID: "627f2e7c-f091-4ea2-9c3c-fce02f2b7669") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Jan 28 20:42:02 crc kubenswrapper[4746]: I0128 20:42:02.099203 4746 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Jan 28 20:42:02 crc kubenswrapper[4746]: E0128 20:42:02.099629 4746 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-01-28 20:42:02.599580747 +0000 UTC m=+150.555767101 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Jan 28 20:42:02 crc kubenswrapper[4746]: I0128 20:42:02.202149 4746 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-lwhk4\" (UID: \"627f2e7c-f091-4ea2-9c3c-fce02f2b7669\") " pod="openshift-image-registry/image-registry-697d97f7c8-lwhk4" Jan 28 20:42:02 crc kubenswrapper[4746]: E0128 20:42:02.202449 4746 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-01-28 20:42:02.702438815 +0000 UTC m=+150.658625169 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-lwhk4" (UID: "627f2e7c-f091-4ea2-9c3c-fce02f2b7669") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Jan 28 20:42:02 crc kubenswrapper[4746]: I0128 20:42:02.237876 4746 patch_prober.go:28] interesting pod/packageserver-d55dfcdfc-8vvmj container/packageserver namespace/openshift-operator-lifecycle-manager: Readiness probe status=failure output="Get \"https://10.217.0.23:5443/healthz\": context deadline exceeded (Client.Timeout exceeded while awaiting headers)" start-of-body= Jan 28 20:42:02 crc kubenswrapper[4746]: I0128 20:42:02.237937 4746 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-8vvmj" podUID="aa555a55-a0b0-47e3-959f-e2d8d387aae2" containerName="packageserver" probeResult="failure" output="Get \"https://10.217.0.23:5443/healthz\": context deadline exceeded (Client.Timeout exceeded while awaiting headers)" Jan 28 20:42:02 crc kubenswrapper[4746]: I0128 20:42:02.254903 4746 patch_prober.go:28] interesting pod/marketplace-operator-79b997595-sqh2q container/marketplace-operator namespace/openshift-marketplace: Readiness probe status=failure output="Get \"http://10.217.0.40:8080/healthz\": dial tcp 10.217.0.40:8080: connect: connection refused" start-of-body= Jan 28 20:42:02 crc kubenswrapper[4746]: I0128 20:42:02.254951 4746 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-marketplace/marketplace-operator-79b997595-sqh2q" podUID="3edaca00-e1a6-4b56-9290-cad6311263ee" containerName="marketplace-operator" probeResult="failure" output="Get \"http://10.217.0.40:8080/healthz\": dial tcp 10.217.0.40:8080: connect: connection 
refused" Jan 28 20:42:02 crc kubenswrapper[4746]: I0128 20:42:02.277290 4746 patch_prober.go:28] interesting pod/router-default-5444994796-rjmhd container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Jan 28 20:42:02 crc kubenswrapper[4746]: [-]has-synced failed: reason withheld Jan 28 20:42:02 crc kubenswrapper[4746]: [+]process-running ok Jan 28 20:42:02 crc kubenswrapper[4746]: healthz check failed Jan 28 20:42:02 crc kubenswrapper[4746]: I0128 20:42:02.277348 4746 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5444994796-rjmhd" podUID="c068f936-795b-4eb3-83a8-e363131119e9" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Jan 28 20:42:02 crc kubenswrapper[4746]: I0128 20:42:02.307847 4746 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Jan 28 20:42:02 crc kubenswrapper[4746]: E0128 20:42:02.308326 4746 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-01-28 20:42:02.808272597 +0000 UTC m=+150.764458951 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Jan 28 20:42:02 crc kubenswrapper[4746]: I0128 20:42:02.352387 4746 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-27vdq" Jan 28 20:42:02 crc kubenswrapper[4746]: I0128 20:42:02.409902 4746 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-lwhk4\" (UID: \"627f2e7c-f091-4ea2-9c3c-fce02f2b7669\") " pod="openshift-image-registry/image-registry-697d97f7c8-lwhk4" Jan 28 20:42:02 crc kubenswrapper[4746]: E0128 20:42:02.411281 4746 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-01-28 20:42:02.911224248 +0000 UTC m=+150.867410772 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-lwhk4" (UID: "627f2e7c-f091-4ea2-9c3c-fce02f2b7669") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Jan 28 20:42:02 crc kubenswrapper[4746]: I0128 20:42:02.512448 4746 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Jan 28 20:42:02 crc kubenswrapper[4746]: E0128 20:42:02.513238 4746 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-01-28 20:42:03.013221841 +0000 UTC m=+150.969408195 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Jan 28 20:42:02 crc kubenswrapper[4746]: I0128 20:42:02.536418 4746 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-8vvmj" Jan 28 20:42:02 crc kubenswrapper[4746]: I0128 20:42:02.614924 4746 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-lwhk4\" (UID: \"627f2e7c-f091-4ea2-9c3c-fce02f2b7669\") " pod="openshift-image-registry/image-registry-697d97f7c8-lwhk4" Jan 28 20:42:02 crc kubenswrapper[4746]: E0128 20:42:02.615284 4746 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-01-28 20:42:03.115272455 +0000 UTC m=+151.071458809 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-lwhk4" (UID: "627f2e7c-f091-4ea2-9c3c-fce02f2b7669") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Jan 28 20:42:02 crc kubenswrapper[4746]: I0128 20:42:02.675345 4746 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-marketplace-4t66g"] Jan 28 20:42:02 crc kubenswrapper[4746]: I0128 20:42:02.676545 4746 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-4t66g" Jan 28 20:42:02 crc kubenswrapper[4746]: I0128 20:42:02.686898 4746 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"redhat-marketplace-dockercfg-x2ctb" Jan 28 20:42:02 crc kubenswrapper[4746]: I0128 20:42:02.708368 4746 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-config-operator/openshift-config-operator-7777fb866f-rktng" Jan 28 20:42:02 crc kubenswrapper[4746]: I0128 20:42:02.720038 4746 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Jan 28 20:42:02 crc kubenswrapper[4746]: E0128 20:42:02.720451 4746 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. 
No retries permitted until 2026-01-28 20:42:03.22043319 +0000 UTC m=+151.176619544 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Jan 28 20:42:02 crc kubenswrapper[4746]: I0128 20:42:02.723134 4746 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-4t66g"] Jan 28 20:42:02 crc kubenswrapper[4746]: I0128 20:42:02.823398 4746 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-lwhk4\" (UID: \"627f2e7c-f091-4ea2-9c3c-fce02f2b7669\") " pod="openshift-image-registry/image-registry-697d97f7c8-lwhk4" Jan 28 20:42:02 crc kubenswrapper[4746]: I0128 20:42:02.829598 4746 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/18cbfd39-cf22-428c-ab2a-708082df0357-catalog-content\") pod \"redhat-marketplace-4t66g\" (UID: \"18cbfd39-cf22-428c-ab2a-708082df0357\") " pod="openshift-marketplace/redhat-marketplace-4t66g" Jan 28 20:42:02 crc kubenswrapper[4746]: I0128 20:42:02.829758 4746 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/18cbfd39-cf22-428c-ab2a-708082df0357-utilities\") pod \"redhat-marketplace-4t66g\" (UID: \"18cbfd39-cf22-428c-ab2a-708082df0357\") " 
pod="openshift-marketplace/redhat-marketplace-4t66g" Jan 28 20:42:02 crc kubenswrapper[4746]: I0128 20:42:02.829853 4746 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-6n9sg\" (UniqueName: \"kubernetes.io/projected/18cbfd39-cf22-428c-ab2a-708082df0357-kube-api-access-6n9sg\") pod \"redhat-marketplace-4t66g\" (UID: \"18cbfd39-cf22-428c-ab2a-708082df0357\") " pod="openshift-marketplace/redhat-marketplace-4t66g" Jan 28 20:42:02 crc kubenswrapper[4746]: E0128 20:42:02.830435 4746 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-01-28 20:42:03.330420691 +0000 UTC m=+151.286607045 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-lwhk4" (UID: "627f2e7c-f091-4ea2-9c3c-fce02f2b7669") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Jan 28 20:42:02 crc kubenswrapper[4746]: I0128 20:42:02.939684 4746 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Jan 28 20:42:02 crc kubenswrapper[4746]: I0128 20:42:02.939859 4746 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/18cbfd39-cf22-428c-ab2a-708082df0357-catalog-content\") pod \"redhat-marketplace-4t66g\" (UID: 
\"18cbfd39-cf22-428c-ab2a-708082df0357\") " pod="openshift-marketplace/redhat-marketplace-4t66g" Jan 28 20:42:02 crc kubenswrapper[4746]: I0128 20:42:02.939884 4746 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/18cbfd39-cf22-428c-ab2a-708082df0357-utilities\") pod \"redhat-marketplace-4t66g\" (UID: \"18cbfd39-cf22-428c-ab2a-708082df0357\") " pod="openshift-marketplace/redhat-marketplace-4t66g" Jan 28 20:42:02 crc kubenswrapper[4746]: I0128 20:42:02.939904 4746 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-6n9sg\" (UniqueName: \"kubernetes.io/projected/18cbfd39-cf22-428c-ab2a-708082df0357-kube-api-access-6n9sg\") pod \"redhat-marketplace-4t66g\" (UID: \"18cbfd39-cf22-428c-ab2a-708082df0357\") " pod="openshift-marketplace/redhat-marketplace-4t66g" Jan 28 20:42:02 crc kubenswrapper[4746]: E0128 20:42:02.940171 4746 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-01-28 20:42:03.440154765 +0000 UTC m=+151.396341119 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Jan 28 20:42:02 crc kubenswrapper[4746]: I0128 20:42:02.940500 4746 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/18cbfd39-cf22-428c-ab2a-708082df0357-catalog-content\") pod \"redhat-marketplace-4t66g\" (UID: \"18cbfd39-cf22-428c-ab2a-708082df0357\") " pod="openshift-marketplace/redhat-marketplace-4t66g" Jan 28 20:42:02 crc kubenswrapper[4746]: I0128 20:42:02.940708 4746 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/18cbfd39-cf22-428c-ab2a-708082df0357-utilities\") pod \"redhat-marketplace-4t66g\" (UID: \"18cbfd39-cf22-428c-ab2a-708082df0357\") " pod="openshift-marketplace/redhat-marketplace-4t66g" Jan 28 20:42:03 crc kubenswrapper[4746]: I0128 20:42:03.013831 4746 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-6n9sg\" (UniqueName: \"kubernetes.io/projected/18cbfd39-cf22-428c-ab2a-708082df0357-kube-api-access-6n9sg\") pod \"redhat-marketplace-4t66g\" (UID: \"18cbfd39-cf22-428c-ab2a-708082df0357\") " pod="openshift-marketplace/redhat-marketplace-4t66g" Jan 28 20:42:03 crc kubenswrapper[4746]: I0128 20:42:03.041554 4746 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-lwhk4\" (UID: \"627f2e7c-f091-4ea2-9c3c-fce02f2b7669\") " 
pod="openshift-image-registry/image-registry-697d97f7c8-lwhk4" Jan 28 20:42:03 crc kubenswrapper[4746]: E0128 20:42:03.041888 4746 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-01-28 20:42:03.5418722 +0000 UTC m=+151.498058554 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-lwhk4" (UID: "627f2e7c-f091-4ea2-9c3c-fce02f2b7669") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Jan 28 20:42:03 crc kubenswrapper[4746]: I0128 20:42:03.074760 4746 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-4t66g" Jan 28 20:42:03 crc kubenswrapper[4746]: I0128 20:42:03.123359 4746 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-85ttn"] Jan 28 20:42:03 crc kubenswrapper[4746]: I0128 20:42:03.123411 4746 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-marketplace-f2t9c"] Jan 28 20:42:03 crc kubenswrapper[4746]: I0128 20:42:03.124682 4746 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-f2t9c" Jan 28 20:42:03 crc kubenswrapper[4746]: I0128 20:42:03.142697 4746 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Jan 28 20:42:03 crc kubenswrapper[4746]: E0128 20:42:03.143130 4746 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-01-28 20:42:03.643106571 +0000 UTC m=+151.599292925 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Jan 28 20:42:03 crc kubenswrapper[4746]: I0128 20:42:03.167928 4746 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-f2t9c"] Jan 28 20:42:03 crc kubenswrapper[4746]: I0128 20:42:03.254292 4746 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-wsqtr\" (UniqueName: \"kubernetes.io/projected/1f42df00-e947-4928-a51f-ddaa3658cc67-kube-api-access-wsqtr\") pod \"redhat-marketplace-f2t9c\" (UID: \"1f42df00-e947-4928-a51f-ddaa3658cc67\") " pod="openshift-marketplace/redhat-marketplace-f2t9c" Jan 28 20:42:03 crc kubenswrapper[4746]: I0128 20:42:03.254364 4746 
reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/1f42df00-e947-4928-a51f-ddaa3658cc67-utilities\") pod \"redhat-marketplace-f2t9c\" (UID: \"1f42df00-e947-4928-a51f-ddaa3658cc67\") " pod="openshift-marketplace/redhat-marketplace-f2t9c" Jan 28 20:42:03 crc kubenswrapper[4746]: I0128 20:42:03.254385 4746 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/1f42df00-e947-4928-a51f-ddaa3658cc67-catalog-content\") pod \"redhat-marketplace-f2t9c\" (UID: \"1f42df00-e947-4928-a51f-ddaa3658cc67\") " pod="openshift-marketplace/redhat-marketplace-f2t9c" Jan 28 20:42:03 crc kubenswrapper[4746]: I0128 20:42:03.254448 4746 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-lwhk4\" (UID: \"627f2e7c-f091-4ea2-9c3c-fce02f2b7669\") " pod="openshift-image-registry/image-registry-697d97f7c8-lwhk4" Jan 28 20:42:03 crc kubenswrapper[4746]: E0128 20:42:03.254737 4746 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-01-28 20:42:03.754725909 +0000 UTC m=+151.710912263 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-lwhk4" (UID: "627f2e7c-f091-4ea2-9c3c-fce02f2b7669") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Jan 28 20:42:03 crc kubenswrapper[4746]: I0128 20:42:03.258288 4746 patch_prober.go:28] interesting pod/router-default-5444994796-rjmhd container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Jan 28 20:42:03 crc kubenswrapper[4746]: [-]has-synced failed: reason withheld Jan 28 20:42:03 crc kubenswrapper[4746]: [+]process-running ok Jan 28 20:42:03 crc kubenswrapper[4746]: healthz check failed Jan 28 20:42:03 crc kubenswrapper[4746]: I0128 20:42:03.258344 4746 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5444994796-rjmhd" podUID="c068f936-795b-4eb3-83a8-e363131119e9" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Jan 28 20:42:03 crc kubenswrapper[4746]: W0128 20:42:03.320390 4746 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod3b6479f0_333b_4a96_9adf_2099afdc2447.slice/crio-f1650c43814b89ec5e1721d263bf42aefa8b8e8d37f850dd4f17ef7b788a467d WatchSource:0}: Error finding container f1650c43814b89ec5e1721d263bf42aefa8b8e8d37f850dd4f17ef7b788a467d: Status 404 returned error can't find the container with id f1650c43814b89ec5e1721d263bf42aefa8b8e8d37f850dd4f17ef7b788a467d Jan 28 20:42:03 crc kubenswrapper[4746]: I0128 20:42:03.352986 4746 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" event={"ID":"9d751cbb-f2e2-430d-9754-c882a5e924a5","Type":"ContainerStarted","Data":"4c2d2fb47ebac843c150c3c044d92391ecd2680671619bda28b367b67608a90a"}
Jan 28 20:42:03 crc kubenswrapper[4746]: I0128 20:42:03.353028 4746 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" event={"ID":"9d751cbb-f2e2-430d-9754-c882a5e924a5","Type":"ContainerStarted","Data":"9b14d7d4011320c62041db50e8a45626b4543ab037afb9caa679312b0481582b"}
Jan 28 20:42:03 crc kubenswrapper[4746]: I0128 20:42:03.357563 4746 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") "
Jan 28 20:42:03 crc kubenswrapper[4746]: I0128 20:42:03.357812 4746 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-wsqtr\" (UniqueName: \"kubernetes.io/projected/1f42df00-e947-4928-a51f-ddaa3658cc67-kube-api-access-wsqtr\") pod \"redhat-marketplace-f2t9c\" (UID: \"1f42df00-e947-4928-a51f-ddaa3658cc67\") " pod="openshift-marketplace/redhat-marketplace-f2t9c"
Jan 28 20:42:03 crc kubenswrapper[4746]: I0128 20:42:03.357862 4746 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/1f42df00-e947-4928-a51f-ddaa3658cc67-utilities\") pod \"redhat-marketplace-f2t9c\" (UID: \"1f42df00-e947-4928-a51f-ddaa3658cc67\") " pod="openshift-marketplace/redhat-marketplace-f2t9c"
Jan 28 20:42:03 crc kubenswrapper[4746]: I0128 20:42:03.357885 4746 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName:
\"kubernetes.io/empty-dir/1f42df00-e947-4928-a51f-ddaa3658cc67-catalog-content\") pod \"redhat-marketplace-f2t9c\" (UID: \"1f42df00-e947-4928-a51f-ddaa3658cc67\") " pod="openshift-marketplace/redhat-marketplace-f2t9c"
Jan 28 20:42:03 crc kubenswrapper[4746]: E0128 20:42:03.358898 4746 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-01-28 20:42:03.858881015 +0000 UTC m=+151.815067369 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Jan 28 20:42:03 crc kubenswrapper[4746]: I0128 20:42:03.359620 4746 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/1f42df00-e947-4928-a51f-ddaa3658cc67-utilities\") pod \"redhat-marketplace-f2t9c\" (UID: \"1f42df00-e947-4928-a51f-ddaa3658cc67\") " pod="openshift-marketplace/redhat-marketplace-f2t9c"
Jan 28 20:42:03 crc kubenswrapper[4746]: I0128 20:42:03.359846 4746 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/1f42df00-e947-4928-a51f-ddaa3658cc67-catalog-content\") pod \"redhat-marketplace-f2t9c\" (UID: \"1f42df00-e947-4928-a51f-ddaa3658cc67\") " pod="openshift-marketplace/redhat-marketplace-f2t9c"
Jan 28 20:42:03 crc kubenswrapper[4746]: I0128 20:42:03.395338 4746 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="hostpath-provisioner/csi-hostpathplugin-89zbv"
event={"ID":"058433b4-653b-4170-83e2-ed7c5d753323","Type":"ContainerStarted","Data":"bb6dd6a305275e60d4b8ac5a352afceffef020d179af397528864d72cf4f7d59"}
Jan 28 20:42:03 crc kubenswrapper[4746]: I0128 20:42:03.453354 4746 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-wsqtr\" (UniqueName: \"kubernetes.io/projected/1f42df00-e947-4928-a51f-ddaa3658cc67-kube-api-access-wsqtr\") pod \"redhat-marketplace-f2t9c\" (UID: \"1f42df00-e947-4928-a51f-ddaa3658cc67\") " pod="openshift-marketplace/redhat-marketplace-f2t9c"
Jan 28 20:42:03 crc kubenswrapper[4746]: I0128 20:42:03.460051 4746 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-lwhk4\" (UID: \"627f2e7c-f091-4ea2-9c3c-fce02f2b7669\") " pod="openshift-image-registry/image-registry-697d97f7c8-lwhk4"
Jan 28 20:42:03 crc kubenswrapper[4746]: E0128 20:42:03.460454 4746 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-01-28 20:42:03.960440495 +0000 UTC m=+151.916626849 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-lwhk4" (UID: "627f2e7c-f091-4ea2-9c3c-fce02f2b7669") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Jan 28 20:42:03 crc kubenswrapper[4746]: I0128 20:42:03.486213 4746 util.go:30] "No sandbox for pod can be found.
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-f2t9c"
Jan 28 20:42:03 crc kubenswrapper[4746]: I0128 20:42:03.494116 4746 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-85ttn" event={"ID":"6c585264-9fea-4d40-910d-68a31c553f76","Type":"ContainerStarted","Data":"1c152bc56d2dadac40f6ef2599da3caee82ce68ae0450e9ed48e6d99dec850bc"}
Jan 28 20:42:03 crc kubenswrapper[4746]: I0128 20:42:03.498839 4746 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-hxz5g"]
Jan 28 20:42:03 crc kubenswrapper[4746]: I0128 20:42:03.504566 4746 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" event={"ID":"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8","Type":"ContainerStarted","Data":"7ec281e219a4d6a0703ee78cf4d5c8aa173385af57a920f7b0a6ce283d6ac8a0"}
Jan 28 20:42:03 crc kubenswrapper[4746]: I0128 20:42:03.522742 4746 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-ghx5p"]
Jan 28 20:42:03 crc kubenswrapper[4746]: I0128 20:42:03.562978 4746 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") "
Jan 28 20:42:03 crc kubenswrapper[4746]: E0128 20:42:03.568997 4746 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-01-28 20:42:04.068975746 +0000 UTC m=+152.025162100 (durationBeforeRetry 500ms).
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Jan 28 20:42:03 crc kubenswrapper[4746]: I0128 20:42:03.666027 4746 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-lwhk4\" (UID: \"627f2e7c-f091-4ea2-9c3c-fce02f2b7669\") " pod="openshift-image-registry/image-registry-697d97f7c8-lwhk4"
Jan 28 20:42:03 crc kubenswrapper[4746]: E0128 20:42:03.666550 4746 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-01-28 20:42:04.166530861 +0000 UTC m=+152.122717215 (durationBeforeRetry 500ms).
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-lwhk4" (UID: "627f2e7c-f091-4ea2-9c3c-fce02f2b7669") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Jan 28 20:42:03 crc kubenswrapper[4746]: I0128 20:42:03.713289 4746 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-c9v8x"]
Jan 28 20:42:03 crc kubenswrapper[4746]: I0128 20:42:03.771746 4746 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") "
Jan 28 20:42:03 crc kubenswrapper[4746]: E0128 20:42:03.772066 4746 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-01-28 20:42:04.272048486 +0000 UTC m=+152.228234840 (durationBeforeRetry 500ms).
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Jan 28 20:42:03 crc kubenswrapper[4746]: I0128 20:42:03.873912 4746 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-lwhk4\" (UID: \"627f2e7c-f091-4ea2-9c3c-fce02f2b7669\") " pod="openshift-image-registry/image-registry-697d97f7c8-lwhk4"
Jan 28 20:42:03 crc kubenswrapper[4746]: E0128 20:42:03.874677 4746 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-01-28 20:42:04.374624385 +0000 UTC m=+152.330810739 (durationBeforeRetry 500ms).
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-lwhk4" (UID: "627f2e7c-f091-4ea2-9c3c-fce02f2b7669") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Jan 28 20:42:03 crc kubenswrapper[4746]: I0128 20:42:03.977617 4746 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") "
Jan 28 20:42:03 crc kubenswrapper[4746]: E0128 20:42:03.978414 4746 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-01-28 20:42:04.478398289 +0000 UTC m=+152.434584643 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Jan 28 20:42:04 crc kubenswrapper[4746]: I0128 20:42:04.019949 4746 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-operators-mlqqs"]
Jan 28 20:42:04 crc kubenswrapper[4746]: I0128 20:42:04.021439 4746 util.go:30] "No sandbox for pod can be found.
Need to start a new one" pod="openshift-marketplace/redhat-operators-mlqqs"
Jan 28 20:42:04 crc kubenswrapper[4746]: I0128 20:42:04.069256 4746 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"redhat-operators-dockercfg-ct8rh"
Jan 28 20:42:04 crc kubenswrapper[4746]: I0128 20:42:04.081809 4746 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-lwhk4\" (UID: \"627f2e7c-f091-4ea2-9c3c-fce02f2b7669\") " pod="openshift-image-registry/image-registry-697d97f7c8-lwhk4"
Jan 28 20:42:04 crc kubenswrapper[4746]: E0128 20:42:04.082514 4746 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-01-28 20:42:04.582490742 +0000 UTC m=+152.538677096 (durationBeforeRetry 500ms).
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-lwhk4" (UID: "627f2e7c-f091-4ea2-9c3c-fce02f2b7669") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Jan 28 20:42:04 crc kubenswrapper[4746]: I0128 20:42:04.132543 4746 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-mlqqs"]
Jan 28 20:42:04 crc kubenswrapper[4746]: I0128 20:42:04.183916 4746 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") "
Jan 28 20:42:04 crc kubenswrapper[4746]: I0128 20:42:04.184143 4746 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-xvjxs\" (UniqueName: \"kubernetes.io/projected/8b55bfa9-466f-44d9-8bc7-753bff9b7a7b-kube-api-access-xvjxs\") pod \"redhat-operators-mlqqs\" (UID: \"8b55bfa9-466f-44d9-8bc7-753bff9b7a7b\") " pod="openshift-marketplace/redhat-operators-mlqqs"
Jan 28 20:42:04 crc kubenswrapper[4746]: I0128 20:42:04.184231 4746 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/8b55bfa9-466f-44d9-8bc7-753bff9b7a7b-utilities\") pod \"redhat-operators-mlqqs\" (UID: \"8b55bfa9-466f-44d9-8bc7-753bff9b7a7b\") " pod="openshift-marketplace/redhat-operators-mlqqs"
Jan 28 20:42:04 crc kubenswrapper[4746]: I0128 20:42:04.184254 4746 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for
volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/8b55bfa9-466f-44d9-8bc7-753bff9b7a7b-catalog-content\") pod \"redhat-operators-mlqqs\" (UID: \"8b55bfa9-466f-44d9-8bc7-753bff9b7a7b\") " pod="openshift-marketplace/redhat-operators-mlqqs"
Jan 28 20:42:04 crc kubenswrapper[4746]: E0128 20:42:04.184375 4746 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-01-28 20:42:04.684357232 +0000 UTC m=+152.640543606 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Jan 28 20:42:04 crc kubenswrapper[4746]: I0128 20:42:04.260691 4746 patch_prober.go:28] interesting pod/router-default-5444994796-rjmhd container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld
Jan 28 20:42:04 crc kubenswrapper[4746]: [-]has-synced failed: reason withheld
Jan 28 20:42:04 crc kubenswrapper[4746]: [+]process-running ok
Jan 28 20:42:04 crc kubenswrapper[4746]: healthz check failed
Jan 28 20:42:04 crc kubenswrapper[4746]: I0128 20:42:04.260744 4746 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5444994796-rjmhd" podUID="c068f936-795b-4eb3-83a8-e363131119e9" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500"
Jan 28 20:42:04 crc kubenswrapper[4746]: I0128 20:42:04.288980 4746
reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-lwhk4\" (UID: \"627f2e7c-f091-4ea2-9c3c-fce02f2b7669\") " pod="openshift-image-registry/image-registry-697d97f7c8-lwhk4"
Jan 28 20:42:04 crc kubenswrapper[4746]: I0128 20:42:04.289035 4746 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/8b55bfa9-466f-44d9-8bc7-753bff9b7a7b-utilities\") pod \"redhat-operators-mlqqs\" (UID: \"8b55bfa9-466f-44d9-8bc7-753bff9b7a7b\") " pod="openshift-marketplace/redhat-operators-mlqqs"
Jan 28 20:42:04 crc kubenswrapper[4746]: I0128 20:42:04.289060 4746 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/8b55bfa9-466f-44d9-8bc7-753bff9b7a7b-catalog-content\") pod \"redhat-operators-mlqqs\" (UID: \"8b55bfa9-466f-44d9-8bc7-753bff9b7a7b\") " pod="openshift-marketplace/redhat-operators-mlqqs"
Jan 28 20:42:04 crc kubenswrapper[4746]: I0128 20:42:04.289175 4746 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-xvjxs\" (UniqueName: \"kubernetes.io/projected/8b55bfa9-466f-44d9-8bc7-753bff9b7a7b-kube-api-access-xvjxs\") pod \"redhat-operators-mlqqs\" (UID: \"8b55bfa9-466f-44d9-8bc7-753bff9b7a7b\") " pod="openshift-marketplace/redhat-operators-mlqqs"
Jan 28 20:42:04 crc kubenswrapper[4746]: E0128 20:42:04.289725 4746 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-01-28 20:42:04.789711481 +0000 UTC m=+152.745897835 (durationBeforeRetry 500ms).
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-lwhk4" (UID: "627f2e7c-f091-4ea2-9c3c-fce02f2b7669") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Jan 28 20:42:04 crc kubenswrapper[4746]: I0128 20:42:04.290677 4746 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/8b55bfa9-466f-44d9-8bc7-753bff9b7a7b-utilities\") pod \"redhat-operators-mlqqs\" (UID: \"8b55bfa9-466f-44d9-8bc7-753bff9b7a7b\") " pod="openshift-marketplace/redhat-operators-mlqqs"
Jan 28 20:42:04 crc kubenswrapper[4746]: I0128 20:42:04.290931 4746 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/8b55bfa9-466f-44d9-8bc7-753bff9b7a7b-catalog-content\") pod \"redhat-operators-mlqqs\" (UID: \"8b55bfa9-466f-44d9-8bc7-753bff9b7a7b\") " pod="openshift-marketplace/redhat-operators-mlqqs"
Jan 28 20:42:04 crc kubenswrapper[4746]: I0128 20:42:04.296134 4746 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-4t66g"]
Jan 28 20:42:04 crc kubenswrapper[4746]: I0128 20:42:04.313257 4746 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-operators-cl6g8"]
Jan 28 20:42:04 crc kubenswrapper[4746]: I0128 20:42:04.314445 4746 util.go:30] "No sandbox for pod can be found.
Need to start a new one" pod="openshift-marketplace/redhat-operators-cl6g8"
Jan 28 20:42:04 crc kubenswrapper[4746]: I0128 20:42:04.346464 4746 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-cl6g8"]
Jan 28 20:42:04 crc kubenswrapper[4746]: I0128 20:42:04.360403 4746 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-xvjxs\" (UniqueName: \"kubernetes.io/projected/8b55bfa9-466f-44d9-8bc7-753bff9b7a7b-kube-api-access-xvjxs\") pod \"redhat-operators-mlqqs\" (UID: \"8b55bfa9-466f-44d9-8bc7-753bff9b7a7b\") " pod="openshift-marketplace/redhat-operators-mlqqs"
Jan 28 20:42:04 crc kubenswrapper[4746]: I0128 20:42:04.391517 4746 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") "
Jan 28 20:42:04 crc kubenswrapper[4746]: E0128 20:42:04.391906 4746 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-01-28 20:42:04.891882549 +0000 UTC m=+152.848068903 (durationBeforeRetry 500ms).
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Jan 28 20:42:04 crc kubenswrapper[4746]: I0128 20:42:04.494648 4746 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/8f41ddf2-d97e-4e32-ac3d-48850a4d47ee-utilities\") pod \"redhat-operators-cl6g8\" (UID: \"8f41ddf2-d97e-4e32-ac3d-48850a4d47ee\") " pod="openshift-marketplace/redhat-operators-cl6g8"
Jan 28 20:42:04 crc kubenswrapper[4746]: I0128 20:42:04.494679 4746 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-lm4t6\" (UniqueName: \"kubernetes.io/projected/8f41ddf2-d97e-4e32-ac3d-48850a4d47ee-kube-api-access-lm4t6\") pod \"redhat-operators-cl6g8\" (UID: \"8f41ddf2-d97e-4e32-ac3d-48850a4d47ee\") " pod="openshift-marketplace/redhat-operators-cl6g8"
Jan 28 20:42:04 crc kubenswrapper[4746]: I0128 20:42:04.494740 4746 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-lwhk4\" (UID: \"627f2e7c-f091-4ea2-9c3c-fce02f2b7669\") " pod="openshift-image-registry/image-registry-697d97f7c8-lwhk4"
Jan 28 20:42:04 crc kubenswrapper[4746]: I0128 20:42:04.494771 4746 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/8f41ddf2-d97e-4e32-ac3d-48850a4d47ee-catalog-content\")
pod \"redhat-operators-cl6g8\" (UID: \"8f41ddf2-d97e-4e32-ac3d-48850a4d47ee\") " pod="openshift-marketplace/redhat-operators-cl6g8"
Jan 28 20:42:04 crc kubenswrapper[4746]: E0128 20:42:04.495586 4746 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-01-28 20:42:04.99557465 +0000 UTC m=+152.951761004 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-lwhk4" (UID: "627f2e7c-f091-4ea2-9c3c-fce02f2b7669") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Jan 28 20:42:04 crc kubenswrapper[4746]: I0128 20:42:04.545235 4746 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-kube-controller-manager/revision-pruner-9-crc"]
Jan 28 20:42:04 crc kubenswrapper[4746]: I0128 20:42:04.545942 4746 util.go:30] "No sandbox for pod can be found.
Need to start a new one" pod="openshift-kube-controller-manager/revision-pruner-9-crc"
Jan 28 20:42:04 crc kubenswrapper[4746]: I0128 20:42:04.553603 4746 plugin_watcher.go:194] "Adding socket path or updating timestamp to desired state cache" path="/var/lib/kubelet/plugins_registry/kubevirt.io.hostpath-provisioner-reg.sock"
Jan 28 20:42:04 crc kubenswrapper[4746]: I0128 20:42:04.553966 4746 generic.go:334] "Generic (PLEG): container finished" podID="6c585264-9fea-4d40-910d-68a31c553f76" containerID="6fac53a10394eaceda79307d69b7558b0c7a7c5ebc41fd8e2a9dabdab713dee5" exitCode=0
Jan 28 20:42:04 crc kubenswrapper[4746]: I0128 20:42:04.554046 4746 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-85ttn" event={"ID":"6c585264-9fea-4d40-910d-68a31c553f76","Type":"ContainerDied","Data":"6fac53a10394eaceda79307d69b7558b0c7a7c5ebc41fd8e2a9dabdab713dee5"}
Jan 28 20:42:04 crc kubenswrapper[4746]: I0128 20:42:04.558432 4746 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-controller-manager"/"installer-sa-dockercfg-kjl2n"
Jan 28 20:42:04 crc kubenswrapper[4746]: I0128 20:42:04.558745 4746 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-controller-manager"/"kube-root-ca.crt"
Jan 28 20:42:04 crc kubenswrapper[4746]: I0128 20:42:04.610293 4746 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") "
Jan 28 20:42:04 crc kubenswrapper[4746]: I0128 20:42:04.610682 4746 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/8f41ddf2-d97e-4e32-ac3d-48850a4d47ee-catalog-content\") pod \"redhat-operators-cl6g8\" (UID:
\"8f41ddf2-d97e-4e32-ac3d-48850a4d47ee\") " pod="openshift-marketplace/redhat-operators-cl6g8"
Jan 28 20:42:04 crc kubenswrapper[4746]: I0128 20:42:04.610750 4746 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/8f41ddf2-d97e-4e32-ac3d-48850a4d47ee-utilities\") pod \"redhat-operators-cl6g8\" (UID: \"8f41ddf2-d97e-4e32-ac3d-48850a4d47ee\") " pod="openshift-marketplace/redhat-operators-cl6g8"
Jan 28 20:42:04 crc kubenswrapper[4746]: I0128 20:42:04.610772 4746 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-lm4t6\" (UniqueName: \"kubernetes.io/projected/8f41ddf2-d97e-4e32-ac3d-48850a4d47ee-kube-api-access-lm4t6\") pod \"redhat-operators-cl6g8\" (UID: \"8f41ddf2-d97e-4e32-ac3d-48850a4d47ee\") " pod="openshift-marketplace/redhat-operators-cl6g8"
Jan 28 20:42:04 crc kubenswrapper[4746]: I0128 20:42:04.612218 4746 provider.go:102] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider
Jan 28 20:42:04 crc kubenswrapper[4746]: E0128 20:42:04.612638 4746 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-01-28 20:42:05.112619924 +0000 UTC m=+153.068806278 (durationBeforeRetry 500ms).
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Jan 28 20:42:04 crc kubenswrapper[4746]: I0128 20:42:04.613001 4746 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/8f41ddf2-d97e-4e32-ac3d-48850a4d47ee-catalog-content\") pod \"redhat-operators-cl6g8\" (UID: \"8f41ddf2-d97e-4e32-ac3d-48850a4d47ee\") " pod="openshift-marketplace/redhat-operators-cl6g8"
Jan 28 20:42:04 crc kubenswrapper[4746]: I0128 20:42:04.613251 4746 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/8f41ddf2-d97e-4e32-ac3d-48850a4d47ee-utilities\") pod \"redhat-operators-cl6g8\" (UID: \"8f41ddf2-d97e-4e32-ac3d-48850a4d47ee\") " pod="openshift-marketplace/redhat-operators-cl6g8"
Jan 28 20:42:04 crc kubenswrapper[4746]: I0128 20:42:04.628269 4746 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-hxz5g" event={"ID":"f8f220cb-71e1-4b97-960e-ef8742661130","Type":"ContainerStarted","Data":"3088251d6df21e737845c981681e5810236877bf93c54cf60f668108a5fec33d"}
Jan 28 20:42:04 crc kubenswrapper[4746]: I0128 20:42:04.646168 4746 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-controller-manager/revision-pruner-9-crc"]
Jan 28 20:42:04 crc kubenswrapper[4746]: I0128 20:42:04.649600 4746 util.go:30] "No sandbox for pod can be found.
Need to start a new one" pod="openshift-marketplace/redhat-operators-mlqqs" Jan 28 20:42:04 crc kubenswrapper[4746]: I0128 20:42:04.650126 4746 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-f2t9c"] Jan 28 20:42:04 crc kubenswrapper[4746]: I0128 20:42:04.651214 4746 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-lm4t6\" (UniqueName: \"kubernetes.io/projected/8f41ddf2-d97e-4e32-ac3d-48850a4d47ee-kube-api-access-lm4t6\") pod \"redhat-operators-cl6g8\" (UID: \"8f41ddf2-d97e-4e32-ac3d-48850a4d47ee\") " pod="openshift-marketplace/redhat-operators-cl6g8" Jan 28 20:42:04 crc kubenswrapper[4746]: I0128 20:42:04.685456 4746 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" event={"ID":"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8","Type":"ContainerStarted","Data":"a8bfa9b6d39809dc1a4f48ff1972ad088e0bdc2439d254290554f68364fc1317"} Jan 28 20:42:04 crc kubenswrapper[4746]: I0128 20:42:04.711845 4746 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/763cd6ec-43ad-4481-bd62-0864f47f1b0e-kube-api-access\") pod \"revision-pruner-9-crc\" (UID: \"763cd6ec-43ad-4481-bd62-0864f47f1b0e\") " pod="openshift-kube-controller-manager/revision-pruner-9-crc" Jan 28 20:42:04 crc kubenswrapper[4746]: I0128 20:42:04.711950 4746 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/763cd6ec-43ad-4481-bd62-0864f47f1b0e-kubelet-dir\") pod \"revision-pruner-9-crc\" (UID: \"763cd6ec-43ad-4481-bd62-0864f47f1b0e\") " pod="openshift-kube-controller-manager/revision-pruner-9-crc" Jan 28 20:42:04 crc kubenswrapper[4746]: I0128 20:42:04.712030 4746 reconciler_common.go:218] "operationExecutor.MountVolume started for volume 
\"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-lwhk4\" (UID: \"627f2e7c-f091-4ea2-9c3c-fce02f2b7669\") " pod="openshift-image-registry/image-registry-697d97f7c8-lwhk4" Jan 28 20:42:04 crc kubenswrapper[4746]: I0128 20:42:04.712156 4746 patch_prober.go:28] interesting pod/downloads-7954f5f757-qrffw container/download-server namespace/openshift-console: Liveness probe status=failure output="Get \"http://10.217.0.25:8080/\": dial tcp 10.217.0.25:8080: connect: connection refused" start-of-body= Jan 28 20:42:04 crc kubenswrapper[4746]: I0128 20:42:04.712205 4746 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-console/downloads-7954f5f757-qrffw" podUID="b34266ce-b971-4f4b-b8b7-c54ff8b6212c" containerName="download-server" probeResult="failure" output="Get \"http://10.217.0.25:8080/\": dial tcp 10.217.0.25:8080: connect: connection refused" Jan 28 20:42:04 crc kubenswrapper[4746]: I0128 20:42:04.712260 4746 patch_prober.go:28] interesting pod/downloads-7954f5f757-qrffw container/download-server namespace/openshift-console: Readiness probe status=failure output="Get \"http://10.217.0.25:8080/\": dial tcp 10.217.0.25:8080: connect: connection refused" start-of-body= Jan 28 20:42:04 crc kubenswrapper[4746]: I0128 20:42:04.712285 4746 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-console/downloads-7954f5f757-qrffw" podUID="b34266ce-b971-4f4b-b8b7-c54ff8b6212c" containerName="download-server" probeResult="failure" output="Get \"http://10.217.0.25:8080/\": dial tcp 10.217.0.25:8080: connect: connection refused" Jan 28 20:42:04 crc kubenswrapper[4746]: E0128 20:42:04.712370 4746 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. 
No retries permitted until 2026-01-28 20:42:05.212356372 +0000 UTC m=+153.168542726 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-lwhk4" (UID: "627f2e7c-f091-4ea2-9c3c-fce02f2b7669") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Jan 28 20:42:04 crc kubenswrapper[4746]: I0128 20:42:04.726339 4746 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-c9v8x" event={"ID":"abe404c6-f1c8-4ad6-92b9-7c082b112b50","Type":"ContainerStarted","Data":"cb598f9ea962c4558236bbf4b7f16c6f3a29aab5024527148774d0e8547405ed"} Jan 28 20:42:04 crc kubenswrapper[4746]: I0128 20:42:04.726402 4746 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-c9v8x" event={"ID":"abe404c6-f1c8-4ad6-92b9-7c082b112b50","Type":"ContainerStarted","Data":"2afa48be8e6e186e182a9adcdf5c0a0656a4e51c7c293d2ec30272750799e7c6"} Jan 28 20:42:04 crc kubenswrapper[4746]: I0128 20:42:04.755676 4746 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-cl6g8" Jan 28 20:42:04 crc kubenswrapper[4746]: I0128 20:42:04.756014 4746 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-4t66g" event={"ID":"18cbfd39-cf22-428c-ab2a-708082df0357","Type":"ContainerStarted","Data":"dbcf39c2a0ca5808d27f687c44c2a3c58ea2e8dd4303076e253d69c06dd6de74"} Jan 28 20:42:04 crc kubenswrapper[4746]: I0128 20:42:04.770326 4746 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" event={"ID":"3b6479f0-333b-4a96-9adf-2099afdc2447","Type":"ContainerStarted","Data":"f70fadd2e87953d6874b18ca48f38f274f4af1f63ac5dfa1a9a6b81d3e071d51"} Jan 28 20:42:04 crc kubenswrapper[4746]: I0128 20:42:04.770370 4746 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" event={"ID":"3b6479f0-333b-4a96-9adf-2099afdc2447","Type":"ContainerStarted","Data":"f1650c43814b89ec5e1721d263bf42aefa8b8e8d37f850dd4f17ef7b788a467d"} Jan 28 20:42:04 crc kubenswrapper[4746]: I0128 20:42:04.770622 4746 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-network-diagnostics/network-check-target-xd92c" Jan 28 20:42:04 crc kubenswrapper[4746]: I0128 20:42:04.788696 4746 generic.go:334] "Generic (PLEG): container finished" podID="fa890224-0942-4671-a9d8-97b6f465b0df" containerID="7d8495ae7bda633303f679c164a3d194416ae004312340eab8c9690fba4331f2" exitCode=0 Jan 28 20:42:04 crc kubenswrapper[4746]: I0128 20:42:04.788774 4746 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-ghx5p" event={"ID":"fa890224-0942-4671-a9d8-97b6f465b0df","Type":"ContainerDied","Data":"7d8495ae7bda633303f679c164a3d194416ae004312340eab8c9690fba4331f2"} Jan 28 20:42:04 crc kubenswrapper[4746]: I0128 20:42:04.788801 4746 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openshift-marketplace/community-operators-ghx5p" event={"ID":"fa890224-0942-4671-a9d8-97b6f465b0df","Type":"ContainerStarted","Data":"1ec69a65aba434976ee0a2f50750e942a4b966ba8d025d472492feb7673348a6"} Jan 28 20:42:04 crc kubenswrapper[4746]: I0128 20:42:04.815953 4746 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Jan 28 20:42:04 crc kubenswrapper[4746]: I0128 20:42:04.816300 4746 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/763cd6ec-43ad-4481-bd62-0864f47f1b0e-kube-api-access\") pod \"revision-pruner-9-crc\" (UID: \"763cd6ec-43ad-4481-bd62-0864f47f1b0e\") " pod="openshift-kube-controller-manager/revision-pruner-9-crc" Jan 28 20:42:04 crc kubenswrapper[4746]: I0128 20:42:04.816392 4746 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/763cd6ec-43ad-4481-bd62-0864f47f1b0e-kubelet-dir\") pod \"revision-pruner-9-crc\" (UID: \"763cd6ec-43ad-4481-bd62-0864f47f1b0e\") " pod="openshift-kube-controller-manager/revision-pruner-9-crc" Jan 28 20:42:04 crc kubenswrapper[4746]: I0128 20:42:04.816589 4746 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/763cd6ec-43ad-4481-bd62-0864f47f1b0e-kubelet-dir\") pod \"revision-pruner-9-crc\" (UID: \"763cd6ec-43ad-4481-bd62-0864f47f1b0e\") " pod="openshift-kube-controller-manager/revision-pruner-9-crc" Jan 28 20:42:04 crc kubenswrapper[4746]: E0128 20:42:04.816682 4746 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 
podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-01-28 20:42:05.316663982 +0000 UTC m=+153.272850336 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Jan 28 20:42:04 crc kubenswrapper[4746]: I0128 20:42:04.865198 4746 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/763cd6ec-43ad-4481-bd62-0864f47f1b0e-kube-api-access\") pod \"revision-pruner-9-crc\" (UID: \"763cd6ec-43ad-4481-bd62-0864f47f1b0e\") " pod="openshift-kube-controller-manager/revision-pruner-9-crc" Jan 28 20:42:04 crc kubenswrapper[4746]: I0128 20:42:04.911455 4746 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-console/console-f9d7485db-hcxv8" Jan 28 20:42:04 crc kubenswrapper[4746]: I0128 20:42:04.911884 4746 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-console/console-f9d7485db-hcxv8" Jan 28 20:42:04 crc kubenswrapper[4746]: I0128 20:42:04.911899 4746 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="hostpath-provisioner/csi-hostpathplugin-89zbv" event={"ID":"058433b4-653b-4170-83e2-ed7c5d753323","Type":"ContainerStarted","Data":"a0a8b74aae8044950d5f6286dd6721d203e96cc0157a42f905ea3d0a3f8cd6a9"} Jan 28 20:42:04 crc kubenswrapper[4746]: I0128 20:42:04.911757 4746 patch_prober.go:28] interesting pod/console-f9d7485db-hcxv8 container/console namespace/openshift-console: Startup probe status=failure output="Get \"https://10.217.0.9:8443/health\": dial tcp 10.217.0.9:8443: connect: 
connection refused" start-of-body= Jan 28 20:42:04 crc kubenswrapper[4746]: I0128 20:42:04.912605 4746 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-console/console-f9d7485db-hcxv8" podUID="94cef654-afbe-42c2-8069-5dbcb7294abb" containerName="console" probeResult="failure" output="Get \"https://10.217.0.9:8443/health\": dial tcp 10.217.0.9:8443: connect: connection refused" Jan 28 20:42:04 crc kubenswrapper[4746]: I0128 20:42:04.919407 4746 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-lwhk4\" (UID: \"627f2e7c-f091-4ea2-9c3c-fce02f2b7669\") " pod="openshift-image-registry/image-registry-697d97f7c8-lwhk4" Jan 28 20:42:04 crc kubenswrapper[4746]: E0128 20:42:04.920059 4746 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-01-28 20:42:05.420047545 +0000 UTC m=+153.376233899 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-lwhk4" (UID: "627f2e7c-f091-4ea2-9c3c-fce02f2b7669") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Jan 28 20:42:05 crc kubenswrapper[4746]: I0128 20:42:05.021226 4746 reconciler.go:161] "OperationExecutor.RegisterPlugin started" plugin={"SocketPath":"/var/lib/kubelet/plugins_registry/kubevirt.io.hostpath-provisioner-reg.sock","Timestamp":"2026-01-28T20:42:04.553622139Z","Handler":null,"Name":""} Jan 28 20:42:05 crc kubenswrapper[4746]: I0128 20:42:05.021541 4746 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Jan 28 20:42:05 crc kubenswrapper[4746]: E0128 20:42:05.023583 4746 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-01-28 20:42:05.523562942 +0000 UTC m=+153.479749296 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Jan 28 20:42:05 crc kubenswrapper[4746]: I0128 20:42:05.032100 4746 csi_plugin.go:100] kubernetes.io/csi: Trying to validate a new CSI Driver with name: kubevirt.io.hostpath-provisioner endpoint: /var/lib/kubelet/plugins/csi-hostpath/csi.sock versions: 1.0.0 Jan 28 20:42:05 crc kubenswrapper[4746]: I0128 20:42:05.032144 4746 csi_plugin.go:113] kubernetes.io/csi: Register new plugin with name: kubevirt.io.hostpath-provisioner at endpoint: /var/lib/kubelet/plugins/csi-hostpath/csi.sock Jan 28 20:42:05 crc kubenswrapper[4746]: I0128 20:42:05.055719 4746 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-controller-manager/revision-pruner-9-crc" Jan 28 20:42:05 crc kubenswrapper[4746]: I0128 20:42:05.119551 4746 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-nz5l2" Jan 28 20:42:05 crc kubenswrapper[4746]: I0128 20:42:05.119604 4746 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-nz5l2" Jan 28 20:42:05 crc kubenswrapper[4746]: I0128 20:42:05.133135 4746 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-lwhk4\" (UID: \"627f2e7c-f091-4ea2-9c3c-fce02f2b7669\") " pod="openshift-image-registry/image-registry-697d97f7c8-lwhk4" Jan 28 20:42:05 crc kubenswrapper[4746]: I0128 20:42:05.157006 4746 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-nz5l2" Jan 28 20:42:05 crc kubenswrapper[4746]: I0128 20:42:05.163510 4746 csi_attacher.go:380] kubernetes.io/csi: attacher.MountDevice STAGE_UNSTAGE_VOLUME capability not set. Skipping MountDevice... 
Jan 28 20:42:05 crc kubenswrapper[4746]: I0128 20:42:05.163550 4746 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-lwhk4\" (UID: \"627f2e7c-f091-4ea2-9c3c-fce02f2b7669\") device mount path \"/var/lib/kubelet/plugins/kubernetes.io/csi/kubevirt.io.hostpath-provisioner/1f4776af88835e41c12b831b4c9fed40233456d14189815a54dbe7f892fc1983/globalmount\"" pod="openshift-image-registry/image-registry-697d97f7c8-lwhk4" Jan 28 20:42:05 crc kubenswrapper[4746]: I0128 20:42:05.247298 4746 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-ingress/router-default-5444994796-rjmhd" Jan 28 20:42:05 crc kubenswrapper[4746]: I0128 20:42:05.249852 4746 patch_prober.go:28] interesting pod/router-default-5444994796-rjmhd container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Jan 28 20:42:05 crc kubenswrapper[4746]: [-]has-synced failed: reason withheld Jan 28 20:42:05 crc kubenswrapper[4746]: [+]process-running ok Jan 28 20:42:05 crc kubenswrapper[4746]: healthz check failed Jan 28 20:42:05 crc kubenswrapper[4746]: I0128 20:42:05.249906 4746 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5444994796-rjmhd" podUID="c068f936-795b-4eb3-83a8-e363131119e9" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Jan 28 20:42:05 crc kubenswrapper[4746]: I0128 20:42:05.273786 4746 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-lwhk4\" (UID: 
\"627f2e7c-f091-4ea2-9c3c-fce02f2b7669\") " pod="openshift-image-registry/image-registry-697d97f7c8-lwhk4" Jan 28 20:42:05 crc kubenswrapper[4746]: I0128 20:42:05.338011 4746 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-image-registry/image-registry-697d97f7c8-lwhk4" Jan 28 20:42:05 crc kubenswrapper[4746]: I0128 20:42:05.345640 4746 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Jan 28 20:42:05 crc kubenswrapper[4746]: I0128 20:42:05.415555 4746 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (OuterVolumeSpecName: "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b"). InnerVolumeSpecName "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8". 
PluginName "kubernetes.io/csi", VolumeGidValue "" Jan 28 20:42:05 crc kubenswrapper[4746]: I0128 20:42:05.507465 4746 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-mlqqs"] Jan 28 20:42:05 crc kubenswrapper[4746]: I0128 20:42:05.552491 4746 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-cl6g8"] Jan 28 20:42:05 crc kubenswrapper[4746]: I0128 20:42:05.560303 4746 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-apiserver/apiserver-76f77b778f-sk7xz" Jan 28 20:42:05 crc kubenswrapper[4746]: I0128 20:42:05.560355 4746 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-apiserver/apiserver-76f77b778f-sk7xz" Jan 28 20:42:05 crc kubenswrapper[4746]: I0128 20:42:05.587149 4746 patch_prober.go:28] interesting pod/apiserver-76f77b778f-sk7xz container/openshift-apiserver namespace/openshift-apiserver: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[+]ping ok Jan 28 20:42:05 crc kubenswrapper[4746]: [+]log ok Jan 28 20:42:05 crc kubenswrapper[4746]: [+]etcd ok Jan 28 20:42:05 crc kubenswrapper[4746]: [+]poststarthook/start-apiserver-admission-initializer ok Jan 28 20:42:05 crc kubenswrapper[4746]: [+]poststarthook/generic-apiserver-start-informers ok Jan 28 20:42:05 crc kubenswrapper[4746]: [+]poststarthook/max-in-flight-filter ok Jan 28 20:42:05 crc kubenswrapper[4746]: [+]poststarthook/storage-object-count-tracker-hook ok Jan 28 20:42:05 crc kubenswrapper[4746]: [+]poststarthook/image.openshift.io-apiserver-caches ok Jan 28 20:42:05 crc kubenswrapper[4746]: [-]poststarthook/authorization.openshift.io-bootstrapclusterroles failed: reason withheld Jan 28 20:42:05 crc kubenswrapper[4746]: [-]poststarthook/authorization.openshift.io-ensurenodebootstrap-sa failed: reason withheld Jan 28 20:42:05 crc kubenswrapper[4746]: [+]poststarthook/project.openshift.io-projectcache ok Jan 28 
20:42:05 crc kubenswrapper[4746]: [+]poststarthook/project.openshift.io-projectauthorizationcache ok Jan 28 20:42:05 crc kubenswrapper[4746]: [+]poststarthook/openshift.io-startinformers ok Jan 28 20:42:05 crc kubenswrapper[4746]: [+]poststarthook/openshift.io-restmapperupdater ok Jan 28 20:42:05 crc kubenswrapper[4746]: [+]poststarthook/quota.openshift.io-clusterquotamapping ok Jan 28 20:42:05 crc kubenswrapper[4746]: livez check failed Jan 28 20:42:05 crc kubenswrapper[4746]: I0128 20:42:05.587231 4746 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-apiserver/apiserver-76f77b778f-sk7xz" podUID="af3e8fcb-fbde-4b65-92ab-3d8b71b2de07" containerName="openshift-apiserver" probeResult="failure" output="HTTP probe failed with statuscode: 500" Jan 28 20:42:05 crc kubenswrapper[4746]: I0128 20:42:05.676445 4746 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-controller-manager/revision-pruner-9-crc"] Jan 28 20:42:05 crc kubenswrapper[4746]: I0128 20:42:05.792979 4746 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-image-registry/image-registry-697d97f7c8-lwhk4"] Jan 28 20:42:05 crc kubenswrapper[4746]: I0128 20:42:05.856986 4746 generic.go:334] "Generic (PLEG): container finished" podID="abe404c6-f1c8-4ad6-92b9-7c082b112b50" containerID="cb598f9ea962c4558236bbf4b7f16c6f3a29aab5024527148774d0e8547405ed" exitCode=0 Jan 28 20:42:05 crc kubenswrapper[4746]: I0128 20:42:05.857067 4746 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-c9v8x" event={"ID":"abe404c6-f1c8-4ad6-92b9-7c082b112b50","Type":"ContainerDied","Data":"cb598f9ea962c4558236bbf4b7f16c6f3a29aab5024527148774d0e8547405ed"} Jan 28 20:42:05 crc kubenswrapper[4746]: I0128 20:42:05.860955 4746 generic.go:334] "Generic (PLEG): container finished" podID="18cbfd39-cf22-428c-ab2a-708082df0357" containerID="640584f5445acc292be3e899fbe66326f8914115659d5783f447fd6048c07e74" exitCode=0 Jan 28 20:42:05 crc 
kubenswrapper[4746]: I0128 20:42:05.861384 4746 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-4t66g" event={"ID":"18cbfd39-cf22-428c-ab2a-708082df0357","Type":"ContainerDied","Data":"640584f5445acc292be3e899fbe66326f8914115659d5783f447fd6048c07e74"} Jan 28 20:42:05 crc kubenswrapper[4746]: I0128 20:42:05.867203 4746 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-mlqqs" event={"ID":"8b55bfa9-466f-44d9-8bc7-753bff9b7a7b","Type":"ContainerStarted","Data":"813e0f60a52a5e3ac9fdec874e3f88c3de70aa414cf90abda02a22436f40adaa"} Jan 28 20:42:05 crc kubenswrapper[4746]: I0128 20:42:05.869564 4746 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/image-registry-697d97f7c8-lwhk4" event={"ID":"627f2e7c-f091-4ea2-9c3c-fce02f2b7669","Type":"ContainerStarted","Data":"d0c002ddefd779f35ff45eca4808099be622b3caba190689f19423097af3c16d"} Jan 28 20:42:05 crc kubenswrapper[4746]: I0128 20:42:05.885971 4746 generic.go:334] "Generic (PLEG): container finished" podID="1f42df00-e947-4928-a51f-ddaa3658cc67" containerID="17f2ca649624860330af720f180c4cae93097af14ba360d93ac5e2164112d0c4" exitCode=0 Jan 28 20:42:05 crc kubenswrapper[4746]: I0128 20:42:05.886064 4746 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-f2t9c" event={"ID":"1f42df00-e947-4928-a51f-ddaa3658cc67","Type":"ContainerDied","Data":"17f2ca649624860330af720f180c4cae93097af14ba360d93ac5e2164112d0c4"} Jan 28 20:42:05 crc kubenswrapper[4746]: I0128 20:42:05.886196 4746 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-f2t9c" event={"ID":"1f42df00-e947-4928-a51f-ddaa3658cc67","Type":"ContainerStarted","Data":"59f40486ad366506798fc88daee90c33d4b58a66f1f18831ade1f670ae41b699"} Jan 28 20:42:05 crc kubenswrapper[4746]: I0128 20:42:05.894097 4746 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="hostpath-provisioner/csi-hostpathplugin-89zbv" event={"ID":"058433b4-653b-4170-83e2-ed7c5d753323","Type":"ContainerStarted","Data":"a1739c23c158d33f8f009e0bed0bd32375c1fe54afbdbfe278074774524f86d0"} Jan 28 20:42:05 crc kubenswrapper[4746]: I0128 20:42:05.896982 4746 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager/revision-pruner-9-crc" event={"ID":"763cd6ec-43ad-4481-bd62-0864f47f1b0e","Type":"ContainerStarted","Data":"c368a40ae649e0aaace0bc72712493a6571ffc8e80a3294c9dc2e809ff3bdd17"} Jan 28 20:42:05 crc kubenswrapper[4746]: I0128 20:42:05.912063 4746 generic.go:334] "Generic (PLEG): container finished" podID="c15d5c5e-19d5-43c6-8955-1bb2ad8b33b6" containerID="0285afbcb142cdf5b9d8c281aa5e2ffa73afb12d29dd61ed8bc2f60ab072fa58" exitCode=0 Jan 28 20:42:05 crc kubenswrapper[4746]: I0128 20:42:05.912156 4746 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29493870-4jnh6" event={"ID":"c15d5c5e-19d5-43c6-8955-1bb2ad8b33b6","Type":"ContainerDied","Data":"0285afbcb142cdf5b9d8c281aa5e2ffa73afb12d29dd61ed8bc2f60ab072fa58"} Jan 28 20:42:05 crc kubenswrapper[4746]: I0128 20:42:05.917410 4746 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-cl6g8" event={"ID":"8f41ddf2-d97e-4e32-ac3d-48850a4d47ee","Type":"ContainerStarted","Data":"130f1ae53c7bccf0e87fcafd6727a8da8f02b9bb27bcdb9627449d3dd4380dde"} Jan 28 20:42:05 crc kubenswrapper[4746]: I0128 20:42:05.918472 4746 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/marketplace-operator-79b997595-sqh2q" Jan 28 20:42:05 crc kubenswrapper[4746]: I0128 20:42:05.923572 4746 generic.go:334] "Generic (PLEG): container finished" podID="f8f220cb-71e1-4b97-960e-ef8742661130" containerID="ee30384131315f8e32eb7d79d6e8219c181b4d8d31cea43f077290c1cfe6bddb" exitCode=0 Jan 28 20:42:05 crc kubenswrapper[4746]: I0128 20:42:05.924618 4746 kubelet.go:2453] 
"SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-hxz5g" event={"ID":"f8f220cb-71e1-4b97-960e-ef8742661130","Type":"ContainerDied","Data":"ee30384131315f8e32eb7d79d6e8219c181b4d8d31cea43f077290c1cfe6bddb"} Jan 28 20:42:05 crc kubenswrapper[4746]: I0128 20:42:05.943765 4746 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-nz5l2" Jan 28 20:42:05 crc kubenswrapper[4746]: I0128 20:42:05.945464 4746 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="hostpath-provisioner/csi-hostpathplugin-89zbv" podStartSLOduration=13.945451442 podStartE2EDuration="13.945451442s" podCreationTimestamp="2026-01-28 20:41:52 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-28 20:42:05.929806025 +0000 UTC m=+153.885992399" watchObservedRunningTime="2026-01-28 20:42:05.945451442 +0000 UTC m=+153.901637796" Jan 28 20:42:06 crc kubenswrapper[4746]: I0128 20:42:06.252253 4746 patch_prober.go:28] interesting pod/router-default-5444994796-rjmhd container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Jan 28 20:42:06 crc kubenswrapper[4746]: [-]has-synced failed: reason withheld Jan 28 20:42:06 crc kubenswrapper[4746]: [+]process-running ok Jan 28 20:42:06 crc kubenswrapper[4746]: healthz check failed Jan 28 20:42:06 crc kubenswrapper[4746]: I0128 20:42:06.252319 4746 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5444994796-rjmhd" podUID="c068f936-795b-4eb3-83a8-e363131119e9" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Jan 28 20:42:06 crc kubenswrapper[4746]: I0128 20:42:06.889998 4746 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" 
podUID="8f668bae-612b-4b75-9490-919e737c6a3b" path="/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes" Jan 28 20:42:06 crc kubenswrapper[4746]: I0128 20:42:06.942809 4746 generic.go:334] "Generic (PLEG): container finished" podID="8b55bfa9-466f-44d9-8bc7-753bff9b7a7b" containerID="9499d6f79c0f3462bab4e907e6efe9120566a0596c9ba1f9be584a03dd93d8d1" exitCode=0 Jan 28 20:42:06 crc kubenswrapper[4746]: I0128 20:42:06.942996 4746 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-mlqqs" event={"ID":"8b55bfa9-466f-44d9-8bc7-753bff9b7a7b","Type":"ContainerDied","Data":"9499d6f79c0f3462bab4e907e6efe9120566a0596c9ba1f9be584a03dd93d8d1"} Jan 28 20:42:06 crc kubenswrapper[4746]: I0128 20:42:06.953896 4746 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/image-registry-697d97f7c8-lwhk4" event={"ID":"627f2e7c-f091-4ea2-9c3c-fce02f2b7669","Type":"ContainerStarted","Data":"d9c0f04370f1bdb461f2c7bd32fdc1dac6f456ff6b66d902ff14df19d51c7911"} Jan 28 20:42:06 crc kubenswrapper[4746]: I0128 20:42:06.954227 4746 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-image-registry/image-registry-697d97f7c8-lwhk4" Jan 28 20:42:06 crc kubenswrapper[4746]: I0128 20:42:06.958244 4746 generic.go:334] "Generic (PLEG): container finished" podID="8f41ddf2-d97e-4e32-ac3d-48850a4d47ee" containerID="6345f42eeb2545e0d80d647e35269349daf234ba1a1afcf516046fb572b40f7b" exitCode=0 Jan 28 20:42:06 crc kubenswrapper[4746]: I0128 20:42:06.958294 4746 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-cl6g8" event={"ID":"8f41ddf2-d97e-4e32-ac3d-48850a4d47ee","Type":"ContainerDied","Data":"6345f42eeb2545e0d80d647e35269349daf234ba1a1afcf516046fb572b40f7b"} Jan 28 20:42:06 crc kubenswrapper[4746]: I0128 20:42:06.966328 4746 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager/revision-pruner-9-crc" 
event={"ID":"763cd6ec-43ad-4481-bd62-0864f47f1b0e","Type":"ContainerStarted","Data":"20c9e80f71157caebc479f3320e5ad17e6b71bf2525598eca902a6994caae7d3"} Jan 28 20:42:07 crc kubenswrapper[4746]: I0128 20:42:07.014180 4746 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-image-registry/image-registry-697d97f7c8-lwhk4" podStartSLOduration=135.014159296 podStartE2EDuration="2m15.014159296s" podCreationTimestamp="2026-01-28 20:39:52 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-28 20:42:07.011490579 +0000 UTC m=+154.967676933" watchObservedRunningTime="2026-01-28 20:42:07.014159296 +0000 UTC m=+154.970345650" Jan 28 20:42:07 crc kubenswrapper[4746]: I0128 20:42:07.094438 4746 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-kube-controller-manager/revision-pruner-9-crc" podStartSLOduration=3.094395808 podStartE2EDuration="3.094395808s" podCreationTimestamp="2026-01-28 20:42:04 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-28 20:42:07.079344068 +0000 UTC m=+155.035530422" watchObservedRunningTime="2026-01-28 20:42:07.094395808 +0000 UTC m=+155.050582162" Jan 28 20:42:07 crc kubenswrapper[4746]: I0128 20:42:07.255753 4746 patch_prober.go:28] interesting pod/router-default-5444994796-rjmhd container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Jan 28 20:42:07 crc kubenswrapper[4746]: [-]has-synced failed: reason withheld Jan 28 20:42:07 crc kubenswrapper[4746]: [+]process-running ok Jan 28 20:42:07 crc kubenswrapper[4746]: healthz check failed Jan 28 20:42:07 crc kubenswrapper[4746]: I0128 20:42:07.255809 4746 prober.go:107] "Probe failed" probeType="Startup" 
pod="openshift-ingress/router-default-5444994796-rjmhd" podUID="c068f936-795b-4eb3-83a8-e363131119e9" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Jan 28 20:42:07 crc kubenswrapper[4746]: I0128 20:42:07.408907 4746 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29493870-4jnh6" Jan 28 20:42:07 crc kubenswrapper[4746]: I0128 20:42:07.501422 4746 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-9kglc\" (UniqueName: \"kubernetes.io/projected/c15d5c5e-19d5-43c6-8955-1bb2ad8b33b6-kube-api-access-9kglc\") pod \"c15d5c5e-19d5-43c6-8955-1bb2ad8b33b6\" (UID: \"c15d5c5e-19d5-43c6-8955-1bb2ad8b33b6\") " Jan 28 20:42:07 crc kubenswrapper[4746]: I0128 20:42:07.501563 4746 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/c15d5c5e-19d5-43c6-8955-1bb2ad8b33b6-secret-volume\") pod \"c15d5c5e-19d5-43c6-8955-1bb2ad8b33b6\" (UID: \"c15d5c5e-19d5-43c6-8955-1bb2ad8b33b6\") " Jan 28 20:42:07 crc kubenswrapper[4746]: I0128 20:42:07.502200 4746 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/c15d5c5e-19d5-43c6-8955-1bb2ad8b33b6-config-volume\") pod \"c15d5c5e-19d5-43c6-8955-1bb2ad8b33b6\" (UID: \"c15d5c5e-19d5-43c6-8955-1bb2ad8b33b6\") " Jan 28 20:42:07 crc kubenswrapper[4746]: I0128 20:42:07.503247 4746 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/c15d5c5e-19d5-43c6-8955-1bb2ad8b33b6-config-volume" (OuterVolumeSpecName: "config-volume") pod "c15d5c5e-19d5-43c6-8955-1bb2ad8b33b6" (UID: "c15d5c5e-19d5-43c6-8955-1bb2ad8b33b6"). InnerVolumeSpecName "config-volume". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 28 20:42:07 crc kubenswrapper[4746]: I0128 20:42:07.516192 4746 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/c15d5c5e-19d5-43c6-8955-1bb2ad8b33b6-kube-api-access-9kglc" (OuterVolumeSpecName: "kube-api-access-9kglc") pod "c15d5c5e-19d5-43c6-8955-1bb2ad8b33b6" (UID: "c15d5c5e-19d5-43c6-8955-1bb2ad8b33b6"). InnerVolumeSpecName "kube-api-access-9kglc". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 28 20:42:07 crc kubenswrapper[4746]: I0128 20:42:07.529828 4746 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/c15d5c5e-19d5-43c6-8955-1bb2ad8b33b6-secret-volume" (OuterVolumeSpecName: "secret-volume") pod "c15d5c5e-19d5-43c6-8955-1bb2ad8b33b6" (UID: "c15d5c5e-19d5-43c6-8955-1bb2ad8b33b6"). InnerVolumeSpecName "secret-volume". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 28 20:42:07 crc kubenswrapper[4746]: I0128 20:42:07.604339 4746 reconciler_common.go:293] "Volume detached for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/c15d5c5e-19d5-43c6-8955-1bb2ad8b33b6-secret-volume\") on node \"crc\" DevicePath \"\"" Jan 28 20:42:07 crc kubenswrapper[4746]: I0128 20:42:07.604386 4746 reconciler_common.go:293] "Volume detached for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/c15d5c5e-19d5-43c6-8955-1bb2ad8b33b6-config-volume\") on node \"crc\" DevicePath \"\"" Jan 28 20:42:07 crc kubenswrapper[4746]: I0128 20:42:07.604397 4746 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-9kglc\" (UniqueName: \"kubernetes.io/projected/c15d5c5e-19d5-43c6-8955-1bb2ad8b33b6-kube-api-access-9kglc\") on node \"crc\" DevicePath \"\"" Jan 28 20:42:07 crc kubenswrapper[4746]: I0128 20:42:07.747466 4746 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-kube-apiserver/revision-pruner-8-crc"] Jan 28 20:42:07 crc kubenswrapper[4746]: E0128 20:42:07.747734 4746 
cpu_manager.go:410] "RemoveStaleState: removing container" podUID="c15d5c5e-19d5-43c6-8955-1bb2ad8b33b6" containerName="collect-profiles" Jan 28 20:42:07 crc kubenswrapper[4746]: I0128 20:42:07.747747 4746 state_mem.go:107] "Deleted CPUSet assignment" podUID="c15d5c5e-19d5-43c6-8955-1bb2ad8b33b6" containerName="collect-profiles" Jan 28 20:42:07 crc kubenswrapper[4746]: I0128 20:42:07.747860 4746 memory_manager.go:354] "RemoveStaleState removing state" podUID="c15d5c5e-19d5-43c6-8955-1bb2ad8b33b6" containerName="collect-profiles" Jan 28 20:42:07 crc kubenswrapper[4746]: I0128 20:42:07.748517 4746 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-apiserver/revision-pruner-8-crc" Jan 28 20:42:07 crc kubenswrapper[4746]: I0128 20:42:07.757138 4746 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-apiserver/revision-pruner-8-crc"] Jan 28 20:42:07 crc kubenswrapper[4746]: I0128 20:42:07.758955 4746 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-apiserver"/"installer-sa-dockercfg-5pr6n" Jan 28 20:42:07 crc kubenswrapper[4746]: I0128 20:42:07.759222 4746 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-apiserver"/"kube-root-ca.crt" Jan 28 20:42:07 crc kubenswrapper[4746]: I0128 20:42:07.909356 4746 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/6379e9ad-dca2-43a4-92eb-368b25447884-kube-api-access\") pod \"revision-pruner-8-crc\" (UID: \"6379e9ad-dca2-43a4-92eb-368b25447884\") " pod="openshift-kube-apiserver/revision-pruner-8-crc" Jan 28 20:42:07 crc kubenswrapper[4746]: I0128 20:42:07.909472 4746 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/6379e9ad-dca2-43a4-92eb-368b25447884-kubelet-dir\") pod \"revision-pruner-8-crc\" (UID: 
\"6379e9ad-dca2-43a4-92eb-368b25447884\") " pod="openshift-kube-apiserver/revision-pruner-8-crc" Jan 28 20:42:07 crc kubenswrapper[4746]: I0128 20:42:07.983693 4746 generic.go:334] "Generic (PLEG): container finished" podID="763cd6ec-43ad-4481-bd62-0864f47f1b0e" containerID="20c9e80f71157caebc479f3320e5ad17e6b71bf2525598eca902a6994caae7d3" exitCode=0 Jan 28 20:42:07 crc kubenswrapper[4746]: I0128 20:42:07.983766 4746 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager/revision-pruner-9-crc" event={"ID":"763cd6ec-43ad-4481-bd62-0864f47f1b0e","Type":"ContainerDied","Data":"20c9e80f71157caebc479f3320e5ad17e6b71bf2525598eca902a6994caae7d3"} Jan 28 20:42:07 crc kubenswrapper[4746]: I0128 20:42:07.988534 4746 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29493870-4jnh6" event={"ID":"c15d5c5e-19d5-43c6-8955-1bb2ad8b33b6","Type":"ContainerDied","Data":"b18e900b3e8378a4deae407829dbf120d084c32a5a579ef60953929086a0cc81"} Jan 28 20:42:07 crc kubenswrapper[4746]: I0128 20:42:07.988579 4746 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="b18e900b3e8378a4deae407829dbf120d084c32a5a579ef60953929086a0cc81" Jan 28 20:42:07 crc kubenswrapper[4746]: I0128 20:42:07.988610 4746 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29493870-4jnh6" Jan 28 20:42:08 crc kubenswrapper[4746]: I0128 20:42:08.010474 4746 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/6379e9ad-dca2-43a4-92eb-368b25447884-kubelet-dir\") pod \"revision-pruner-8-crc\" (UID: \"6379e9ad-dca2-43a4-92eb-368b25447884\") " pod="openshift-kube-apiserver/revision-pruner-8-crc" Jan 28 20:42:08 crc kubenswrapper[4746]: I0128 20:42:08.010606 4746 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/6379e9ad-dca2-43a4-92eb-368b25447884-kubelet-dir\") pod \"revision-pruner-8-crc\" (UID: \"6379e9ad-dca2-43a4-92eb-368b25447884\") " pod="openshift-kube-apiserver/revision-pruner-8-crc" Jan 28 20:42:08 crc kubenswrapper[4746]: I0128 20:42:08.010621 4746 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/6379e9ad-dca2-43a4-92eb-368b25447884-kube-api-access\") pod \"revision-pruner-8-crc\" (UID: \"6379e9ad-dca2-43a4-92eb-368b25447884\") " pod="openshift-kube-apiserver/revision-pruner-8-crc" Jan 28 20:42:08 crc kubenswrapper[4746]: I0128 20:42:08.045421 4746 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/6379e9ad-dca2-43a4-92eb-368b25447884-kube-api-access\") pod \"revision-pruner-8-crc\" (UID: \"6379e9ad-dca2-43a4-92eb-368b25447884\") " pod="openshift-kube-apiserver/revision-pruner-8-crc" Jan 28 20:42:08 crc kubenswrapper[4746]: I0128 20:42:08.113724 4746 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-apiserver/revision-pruner-8-crc" Jan 28 20:42:08 crc kubenswrapper[4746]: I0128 20:42:08.250656 4746 patch_prober.go:28] interesting pod/router-default-5444994796-rjmhd container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Jan 28 20:42:08 crc kubenswrapper[4746]: [-]has-synced failed: reason withheld Jan 28 20:42:08 crc kubenswrapper[4746]: [+]process-running ok Jan 28 20:42:08 crc kubenswrapper[4746]: healthz check failed Jan 28 20:42:08 crc kubenswrapper[4746]: I0128 20:42:08.250719 4746 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5444994796-rjmhd" podUID="c068f936-795b-4eb3-83a8-e363131119e9" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Jan 28 20:42:08 crc kubenswrapper[4746]: I0128 20:42:08.698918 4746 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-apiserver/revision-pruner-8-crc"] Jan 28 20:42:08 crc kubenswrapper[4746]: W0128 20:42:08.748745 4746 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-pod6379e9ad_dca2_43a4_92eb_368b25447884.slice/crio-be76f68a6f1900c6dd5b125ed615f7b82e8479ea1f447c332cdd6cbb11263eff WatchSource:0}: Error finding container be76f68a6f1900c6dd5b125ed615f7b82e8479ea1f447c332cdd6cbb11263eff: Status 404 returned error can't find the container with id be76f68a6f1900c6dd5b125ed615f7b82e8479ea1f447c332cdd6cbb11263eff Jan 28 20:42:09 crc kubenswrapper[4746]: I0128 20:42:09.070950 4746 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/revision-pruner-8-crc" event={"ID":"6379e9ad-dca2-43a4-92eb-368b25447884","Type":"ContainerStarted","Data":"be76f68a6f1900c6dd5b125ed615f7b82e8479ea1f447c332cdd6cbb11263eff"} Jan 28 20:42:09 crc kubenswrapper[4746]: I0128 20:42:09.252309 4746 patch_prober.go:28] 
interesting pod/router-default-5444994796-rjmhd container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Jan 28 20:42:09 crc kubenswrapper[4746]: [-]has-synced failed: reason withheld Jan 28 20:42:09 crc kubenswrapper[4746]: [+]process-running ok Jan 28 20:42:09 crc kubenswrapper[4746]: healthz check failed Jan 28 20:42:09 crc kubenswrapper[4746]: I0128 20:42:09.252759 4746 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5444994796-rjmhd" podUID="c068f936-795b-4eb3-83a8-e363131119e9" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Jan 28 20:42:09 crc kubenswrapper[4746]: I0128 20:42:09.470970 4746 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-kube-controller-manager/revision-pruner-9-crc" Jan 28 20:42:09 crc kubenswrapper[4746]: I0128 20:42:09.553731 4746 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/763cd6ec-43ad-4481-bd62-0864f47f1b0e-kube-api-access\") pod \"763cd6ec-43ad-4481-bd62-0864f47f1b0e\" (UID: \"763cd6ec-43ad-4481-bd62-0864f47f1b0e\") " Jan 28 20:42:09 crc kubenswrapper[4746]: I0128 20:42:09.553854 4746 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/763cd6ec-43ad-4481-bd62-0864f47f1b0e-kubelet-dir\") pod \"763cd6ec-43ad-4481-bd62-0864f47f1b0e\" (UID: \"763cd6ec-43ad-4481-bd62-0864f47f1b0e\") " Jan 28 20:42:09 crc kubenswrapper[4746]: I0128 20:42:09.554155 4746 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/763cd6ec-43ad-4481-bd62-0864f47f1b0e-kubelet-dir" (OuterVolumeSpecName: "kubelet-dir") pod "763cd6ec-43ad-4481-bd62-0864f47f1b0e" (UID: "763cd6ec-43ad-4481-bd62-0864f47f1b0e"). 
InnerVolumeSpecName "kubelet-dir". PluginName "kubernetes.io/host-path", VolumeGidValue "" Jan 28 20:42:09 crc kubenswrapper[4746]: I0128 20:42:09.564230 4746 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/763cd6ec-43ad-4481-bd62-0864f47f1b0e-kube-api-access" (OuterVolumeSpecName: "kube-api-access") pod "763cd6ec-43ad-4481-bd62-0864f47f1b0e" (UID: "763cd6ec-43ad-4481-bd62-0864f47f1b0e"). InnerVolumeSpecName "kube-api-access". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 28 20:42:09 crc kubenswrapper[4746]: I0128 20:42:09.655643 4746 reconciler_common.go:293] "Volume detached for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/763cd6ec-43ad-4481-bd62-0864f47f1b0e-kubelet-dir\") on node \"crc\" DevicePath \"\"" Jan 28 20:42:09 crc kubenswrapper[4746]: I0128 20:42:09.655676 4746 reconciler_common.go:293] "Volume detached for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/763cd6ec-43ad-4481-bd62-0864f47f1b0e-kube-api-access\") on node \"crc\" DevicePath \"\"" Jan 28 20:42:10 crc kubenswrapper[4746]: I0128 20:42:10.107311 4746 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/revision-pruner-8-crc" event={"ID":"6379e9ad-dca2-43a4-92eb-368b25447884","Type":"ContainerStarted","Data":"812bf20723bccd4093b7516cbac9dd8d3d05606dc7ad10113e2b817665a1d711"} Jan 28 20:42:10 crc kubenswrapper[4746]: I0128 20:42:10.137134 4746 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-kube-apiserver/revision-pruner-8-crc" podStartSLOduration=3.137102573 podStartE2EDuration="3.137102573s" podCreationTimestamp="2026-01-28 20:42:07 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-28 20:42:10.123440462 +0000 UTC m=+158.079626816" watchObservedRunningTime="2026-01-28 20:42:10.137102573 +0000 UTC m=+158.093288927" Jan 28 20:42:10 crc 
kubenswrapper[4746]: I0128 20:42:10.138667 4746 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager/revision-pruner-9-crc" event={"ID":"763cd6ec-43ad-4481-bd62-0864f47f1b0e","Type":"ContainerDied","Data":"c368a40ae649e0aaace0bc72712493a6571ffc8e80a3294c9dc2e809ff3bdd17"} Jan 28 20:42:10 crc kubenswrapper[4746]: I0128 20:42:10.138713 4746 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="c368a40ae649e0aaace0bc72712493a6571ffc8e80a3294c9dc2e809ff3bdd17" Jan 28 20:42:10 crc kubenswrapper[4746]: I0128 20:42:10.138842 4746 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-kube-controller-manager/revision-pruner-9-crc" Jan 28 20:42:10 crc kubenswrapper[4746]: I0128 20:42:10.251352 4746 patch_prober.go:28] interesting pod/router-default-5444994796-rjmhd container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Jan 28 20:42:10 crc kubenswrapper[4746]: [-]has-synced failed: reason withheld Jan 28 20:42:10 crc kubenswrapper[4746]: [+]process-running ok Jan 28 20:42:10 crc kubenswrapper[4746]: healthz check failed Jan 28 20:42:10 crc kubenswrapper[4746]: I0128 20:42:10.251798 4746 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5444994796-rjmhd" podUID="c068f936-795b-4eb3-83a8-e363131119e9" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Jan 28 20:42:10 crc kubenswrapper[4746]: I0128 20:42:10.567231 4746 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-apiserver/apiserver-76f77b778f-sk7xz" Jan 28 20:42:10 crc kubenswrapper[4746]: I0128 20:42:10.574140 4746 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-apiserver/apiserver-76f77b778f-sk7xz" Jan 28 20:42:10 crc kubenswrapper[4746]: I0128 
20:42:10.978595 4746 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-dns/dns-default-6hnhg" Jan 28 20:42:11 crc kubenswrapper[4746]: I0128 20:42:11.201784 4746 generic.go:334] "Generic (PLEG): container finished" podID="6379e9ad-dca2-43a4-92eb-368b25447884" containerID="812bf20723bccd4093b7516cbac9dd8d3d05606dc7ad10113e2b817665a1d711" exitCode=0 Jan 28 20:42:11 crc kubenswrapper[4746]: I0128 20:42:11.203234 4746 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/revision-pruner-8-crc" event={"ID":"6379e9ad-dca2-43a4-92eb-368b25447884","Type":"ContainerDied","Data":"812bf20723bccd4093b7516cbac9dd8d3d05606dc7ad10113e2b817665a1d711"} Jan 28 20:42:11 crc kubenswrapper[4746]: I0128 20:42:11.259328 4746 patch_prober.go:28] interesting pod/router-default-5444994796-rjmhd container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Jan 28 20:42:11 crc kubenswrapper[4746]: [-]has-synced failed: reason withheld Jan 28 20:42:11 crc kubenswrapper[4746]: [+]process-running ok Jan 28 20:42:11 crc kubenswrapper[4746]: healthz check failed Jan 28 20:42:11 crc kubenswrapper[4746]: I0128 20:42:11.259407 4746 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5444994796-rjmhd" podUID="c068f936-795b-4eb3-83a8-e363131119e9" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Jan 28 20:42:12 crc kubenswrapper[4746]: I0128 20:42:12.250557 4746 patch_prober.go:28] interesting pod/router-default-5444994796-rjmhd container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Jan 28 20:42:12 crc kubenswrapper[4746]: [-]has-synced failed: reason withheld Jan 28 20:42:12 crc kubenswrapper[4746]: [+]process-running ok Jan 28 
20:42:12 crc kubenswrapper[4746]: healthz check failed Jan 28 20:42:12 crc kubenswrapper[4746]: I0128 20:42:12.251129 4746 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5444994796-rjmhd" podUID="c068f936-795b-4eb3-83a8-e363131119e9" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Jan 28 20:42:13 crc kubenswrapper[4746]: I0128 20:42:13.249295 4746 patch_prober.go:28] interesting pod/router-default-5444994796-rjmhd container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Jan 28 20:42:13 crc kubenswrapper[4746]: [-]has-synced failed: reason withheld Jan 28 20:42:13 crc kubenswrapper[4746]: [+]process-running ok Jan 28 20:42:13 crc kubenswrapper[4746]: healthz check failed Jan 28 20:42:13 crc kubenswrapper[4746]: I0128 20:42:13.249364 4746 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5444994796-rjmhd" podUID="c068f936-795b-4eb3-83a8-e363131119e9" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Jan 28 20:42:14 crc kubenswrapper[4746]: I0128 20:42:14.255577 4746 patch_prober.go:28] interesting pod/router-default-5444994796-rjmhd container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Jan 28 20:42:14 crc kubenswrapper[4746]: [-]has-synced failed: reason withheld Jan 28 20:42:14 crc kubenswrapper[4746]: [+]process-running ok Jan 28 20:42:14 crc kubenswrapper[4746]: healthz check failed Jan 28 20:42:14 crc kubenswrapper[4746]: I0128 20:42:14.256099 4746 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5444994796-rjmhd" podUID="c068f936-795b-4eb3-83a8-e363131119e9" containerName="router" probeResult="failure" output="HTTP probe 
failed with statuscode: 500" Jan 28 20:42:14 crc kubenswrapper[4746]: I0128 20:42:14.711595 4746 patch_prober.go:28] interesting pod/downloads-7954f5f757-qrffw container/download-server namespace/openshift-console: Readiness probe status=failure output="Get \"http://10.217.0.25:8080/\": dial tcp 10.217.0.25:8080: connect: connection refused" start-of-body= Jan 28 20:42:14 crc kubenswrapper[4746]: I0128 20:42:14.711690 4746 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-console/downloads-7954f5f757-qrffw" podUID="b34266ce-b971-4f4b-b8b7-c54ff8b6212c" containerName="download-server" probeResult="failure" output="Get \"http://10.217.0.25:8080/\": dial tcp 10.217.0.25:8080: connect: connection refused" Jan 28 20:42:14 crc kubenswrapper[4746]: I0128 20:42:14.711959 4746 patch_prober.go:28] interesting pod/downloads-7954f5f757-qrffw container/download-server namespace/openshift-console: Liveness probe status=failure output="Get \"http://10.217.0.25:8080/\": dial tcp 10.217.0.25:8080: connect: connection refused" start-of-body= Jan 28 20:42:14 crc kubenswrapper[4746]: I0128 20:42:14.713147 4746 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-console/downloads-7954f5f757-qrffw" podUID="b34266ce-b971-4f4b-b8b7-c54ff8b6212c" containerName="download-server" probeResult="failure" output="Get \"http://10.217.0.25:8080/\": dial tcp 10.217.0.25:8080: connect: connection refused" Jan 28 20:42:14 crc kubenswrapper[4746]: I0128 20:42:14.912650 4746 patch_prober.go:28] interesting pod/console-f9d7485db-hcxv8 container/console namespace/openshift-console: Startup probe status=failure output="Get \"https://10.217.0.9:8443/health\": dial tcp 10.217.0.9:8443: connect: connection refused" start-of-body= Jan 28 20:42:14 crc kubenswrapper[4746]: I0128 20:42:14.912716 4746 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-console/console-f9d7485db-hcxv8" podUID="94cef654-afbe-42c2-8069-5dbcb7294abb" containerName="console" 
probeResult="failure" output="Get \"https://10.217.0.9:8443/health\": dial tcp 10.217.0.9:8443: connect: connection refused" Jan 28 20:42:15 crc kubenswrapper[4746]: I0128 20:42:15.078390 4746 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/f60a5487-5012-4cc9-ad94-5dfb4957d74e-metrics-certs\") pod \"network-metrics-daemon-2blg6\" (UID: \"f60a5487-5012-4cc9-ad94-5dfb4957d74e\") " pod="openshift-multus/network-metrics-daemon-2blg6" Jan 28 20:42:15 crc kubenswrapper[4746]: I0128 20:42:15.086365 4746 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/f60a5487-5012-4cc9-ad94-5dfb4957d74e-metrics-certs\") pod \"network-metrics-daemon-2blg6\" (UID: \"f60a5487-5012-4cc9-ad94-5dfb4957d74e\") " pod="openshift-multus/network-metrics-daemon-2blg6" Jan 28 20:42:15 crc kubenswrapper[4746]: I0128 20:42:15.250481 4746 patch_prober.go:28] interesting pod/router-default-5444994796-rjmhd container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Jan 28 20:42:15 crc kubenswrapper[4746]: [-]has-synced failed: reason withheld Jan 28 20:42:15 crc kubenswrapper[4746]: [+]process-running ok Jan 28 20:42:15 crc kubenswrapper[4746]: healthz check failed Jan 28 20:42:15 crc kubenswrapper[4746]: I0128 20:42:15.250575 4746 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5444994796-rjmhd" podUID="c068f936-795b-4eb3-83a8-e363131119e9" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Jan 28 20:42:15 crc kubenswrapper[4746]: I0128 20:42:15.350591 4746 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-multus/network-metrics-daemon-2blg6" Jan 28 20:42:15 crc kubenswrapper[4746]: I0128 20:42:15.872380 4746 patch_prober.go:28] interesting pod/machine-config-daemon-6wrnw container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Jan 28 20:42:15 crc kubenswrapper[4746]: I0128 20:42:15.872793 4746 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-6wrnw" podUID="6dc8b546-9734-4082-b2b3-2bafe3f1564d" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Jan 28 20:42:16 crc kubenswrapper[4746]: I0128 20:42:16.250045 4746 patch_prober.go:28] interesting pod/router-default-5444994796-rjmhd container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Jan 28 20:42:16 crc kubenswrapper[4746]: [-]has-synced failed: reason withheld Jan 28 20:42:16 crc kubenswrapper[4746]: [+]process-running ok Jan 28 20:42:16 crc kubenswrapper[4746]: healthz check failed Jan 28 20:42:16 crc kubenswrapper[4746]: I0128 20:42:16.250354 4746 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5444994796-rjmhd" podUID="c068f936-795b-4eb3-83a8-e363131119e9" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Jan 28 20:42:17 crc kubenswrapper[4746]: I0128 20:42:17.254764 4746 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-ingress/router-default-5444994796-rjmhd" Jan 28 20:42:17 crc kubenswrapper[4746]: I0128 20:42:17.262455 4746 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" 
pod="openshift-ingress/router-default-5444994796-rjmhd" Jan 28 20:42:18 crc kubenswrapper[4746]: I0128 20:42:18.834117 4746 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-controller-manager/controller-manager-879f6c89f-5d8cs"] Jan 28 20:42:18 crc kubenswrapper[4746]: I0128 20:42:18.834730 4746 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-controller-manager/controller-manager-879f6c89f-5d8cs" podUID="0f0eb07a-e0b4-4702-89b0-d94e937471a5" containerName="controller-manager" containerID="cri-o://df9bf263c305c8676e726e87dedce4bc509ccb4a4b19146e7680dedbbb7271af" gracePeriod=30 Jan 28 20:42:18 crc kubenswrapper[4746]: I0128 20:42:18.876118 4746 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-6576b87f9c-sjqv2"] Jan 28 20:42:18 crc kubenswrapper[4746]: I0128 20:42:18.876366 4746 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-sjqv2" podUID="447abd89-31fd-4bb6-a965-97d7954f47bb" containerName="route-controller-manager" containerID="cri-o://e3cc08b656c04dd728a73d49429e4cfe55f037ac18b352729b49f276d53467d5" gracePeriod=30 Jan 28 20:42:20 crc kubenswrapper[4746]: I0128 20:42:20.355072 4746 generic.go:334] "Generic (PLEG): container finished" podID="0f0eb07a-e0b4-4702-89b0-d94e937471a5" containerID="df9bf263c305c8676e726e87dedce4bc509ccb4a4b19146e7680dedbbb7271af" exitCode=0 Jan 28 20:42:20 crc kubenswrapper[4746]: I0128 20:42:20.355206 4746 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager/controller-manager-879f6c89f-5d8cs" event={"ID":"0f0eb07a-e0b4-4702-89b0-d94e937471a5","Type":"ContainerDied","Data":"df9bf263c305c8676e726e87dedce4bc509ccb4a4b19146e7680dedbbb7271af"} Jan 28 20:42:20 crc kubenswrapper[4746]: I0128 20:42:20.358403 4746 generic.go:334] "Generic (PLEG): container finished" 
podID="447abd89-31fd-4bb6-a965-97d7954f47bb" containerID="e3cc08b656c04dd728a73d49429e4cfe55f037ac18b352729b49f276d53467d5" exitCode=0 Jan 28 20:42:20 crc kubenswrapper[4746]: I0128 20:42:20.358439 4746 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-sjqv2" event={"ID":"447abd89-31fd-4bb6-a965-97d7954f47bb","Type":"ContainerDied","Data":"e3cc08b656c04dd728a73d49429e4cfe55f037ac18b352729b49f276d53467d5"} Jan 28 20:42:21 crc kubenswrapper[4746]: I0128 20:42:21.634629 4746 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-kube-apiserver/revision-pruner-8-crc" Jan 28 20:42:21 crc kubenswrapper[4746]: I0128 20:42:21.636109 4746 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-sjqv2" Jan 28 20:42:21 crc kubenswrapper[4746]: I0128 20:42:21.714718 4746 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/6379e9ad-dca2-43a4-92eb-368b25447884-kubelet-dir\") pod \"6379e9ad-dca2-43a4-92eb-368b25447884\" (UID: \"6379e9ad-dca2-43a4-92eb-368b25447884\") " Jan 28 20:42:21 crc kubenswrapper[4746]: I0128 20:42:21.714822 4746 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-rktqc\" (UniqueName: \"kubernetes.io/projected/447abd89-31fd-4bb6-a965-97d7954f47bb-kube-api-access-rktqc\") pod \"447abd89-31fd-4bb6-a965-97d7954f47bb\" (UID: \"447abd89-31fd-4bb6-a965-97d7954f47bb\") " Jan 28 20:42:21 crc kubenswrapper[4746]: I0128 20:42:21.714828 4746 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/6379e9ad-dca2-43a4-92eb-368b25447884-kubelet-dir" (OuterVolumeSpecName: "kubelet-dir") pod "6379e9ad-dca2-43a4-92eb-368b25447884" (UID: "6379e9ad-dca2-43a4-92eb-368b25447884"). 
InnerVolumeSpecName "kubelet-dir". PluginName "kubernetes.io/host-path", VolumeGidValue "" Jan 28 20:42:21 crc kubenswrapper[4746]: I0128 20:42:21.714860 4746 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/447abd89-31fd-4bb6-a965-97d7954f47bb-client-ca\") pod \"447abd89-31fd-4bb6-a965-97d7954f47bb\" (UID: \"447abd89-31fd-4bb6-a965-97d7954f47bb\") " Jan 28 20:42:21 crc kubenswrapper[4746]: I0128 20:42:21.716001 4746 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/447abd89-31fd-4bb6-a965-97d7954f47bb-client-ca" (OuterVolumeSpecName: "client-ca") pod "447abd89-31fd-4bb6-a965-97d7954f47bb" (UID: "447abd89-31fd-4bb6-a965-97d7954f47bb"). InnerVolumeSpecName "client-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 28 20:42:21 crc kubenswrapper[4746]: I0128 20:42:21.716486 4746 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/447abd89-31fd-4bb6-a965-97d7954f47bb-serving-cert\") pod \"447abd89-31fd-4bb6-a965-97d7954f47bb\" (UID: \"447abd89-31fd-4bb6-a965-97d7954f47bb\") " Jan 28 20:42:21 crc kubenswrapper[4746]: I0128 20:42:21.716545 4746 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/447abd89-31fd-4bb6-a965-97d7954f47bb-config\") pod \"447abd89-31fd-4bb6-a965-97d7954f47bb\" (UID: \"447abd89-31fd-4bb6-a965-97d7954f47bb\") " Jan 28 20:42:21 crc kubenswrapper[4746]: I0128 20:42:21.716609 4746 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/6379e9ad-dca2-43a4-92eb-368b25447884-kube-api-access\") pod \"6379e9ad-dca2-43a4-92eb-368b25447884\" (UID: \"6379e9ad-dca2-43a4-92eb-368b25447884\") " Jan 28 20:42:21 crc kubenswrapper[4746]: I0128 20:42:21.717177 4746 reconciler_common.go:293] 
"Volume detached for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/6379e9ad-dca2-43a4-92eb-368b25447884-kubelet-dir\") on node \"crc\" DevicePath \"\"" Jan 28 20:42:21 crc kubenswrapper[4746]: I0128 20:42:21.717193 4746 reconciler_common.go:293] "Volume detached for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/447abd89-31fd-4bb6-a965-97d7954f47bb-client-ca\") on node \"crc\" DevicePath \"\"" Jan 28 20:42:21 crc kubenswrapper[4746]: I0128 20:42:21.717413 4746 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/447abd89-31fd-4bb6-a965-97d7954f47bb-config" (OuterVolumeSpecName: "config") pod "447abd89-31fd-4bb6-a965-97d7954f47bb" (UID: "447abd89-31fd-4bb6-a965-97d7954f47bb"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 28 20:42:21 crc kubenswrapper[4746]: I0128 20:42:21.722196 4746 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/447abd89-31fd-4bb6-a965-97d7954f47bb-kube-api-access-rktqc" (OuterVolumeSpecName: "kube-api-access-rktqc") pod "447abd89-31fd-4bb6-a965-97d7954f47bb" (UID: "447abd89-31fd-4bb6-a965-97d7954f47bb"). InnerVolumeSpecName "kube-api-access-rktqc". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 28 20:42:21 crc kubenswrapper[4746]: I0128 20:42:21.722575 4746 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/6379e9ad-dca2-43a4-92eb-368b25447884-kube-api-access" (OuterVolumeSpecName: "kube-api-access") pod "6379e9ad-dca2-43a4-92eb-368b25447884" (UID: "6379e9ad-dca2-43a4-92eb-368b25447884"). InnerVolumeSpecName "kube-api-access". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 28 20:42:21 crc kubenswrapper[4746]: I0128 20:42:21.723151 4746 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/447abd89-31fd-4bb6-a965-97d7954f47bb-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "447abd89-31fd-4bb6-a965-97d7954f47bb" (UID: "447abd89-31fd-4bb6-a965-97d7954f47bb"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 28 20:42:21 crc kubenswrapper[4746]: I0128 20:42:21.818444 4746 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/447abd89-31fd-4bb6-a965-97d7954f47bb-config\") on node \"crc\" DevicePath \"\"" Jan 28 20:42:21 crc kubenswrapper[4746]: I0128 20:42:21.818477 4746 reconciler_common.go:293] "Volume detached for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/6379e9ad-dca2-43a4-92eb-368b25447884-kube-api-access\") on node \"crc\" DevicePath \"\"" Jan 28 20:42:21 crc kubenswrapper[4746]: I0128 20:42:21.818490 4746 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-rktqc\" (UniqueName: \"kubernetes.io/projected/447abd89-31fd-4bb6-a965-97d7954f47bb-kube-api-access-rktqc\") on node \"crc\" DevicePath \"\"" Jan 28 20:42:21 crc kubenswrapper[4746]: I0128 20:42:21.818500 4746 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/447abd89-31fd-4bb6-a965-97d7954f47bb-serving-cert\") on node \"crc\" DevicePath \"\"" Jan 28 20:42:22 crc kubenswrapper[4746]: I0128 20:42:22.374263 4746 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-sjqv2" event={"ID":"447abd89-31fd-4bb6-a965-97d7954f47bb","Type":"ContainerDied","Data":"bc5b0dcf6777f7974c3190d1ac497859037c09767ca8f552acdb4d092538fe0b"} Jan 28 20:42:22 crc kubenswrapper[4746]: I0128 20:42:22.374337 4746 util.go:48] "No ready sandbox 
for pod can be found. Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-sjqv2" Jan 28 20:42:22 crc kubenswrapper[4746]: I0128 20:42:22.374932 4746 scope.go:117] "RemoveContainer" containerID="e3cc08b656c04dd728a73d49429e4cfe55f037ac18b352729b49f276d53467d5" Jan 28 20:42:22 crc kubenswrapper[4746]: I0128 20:42:22.376252 4746 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/revision-pruner-8-crc" event={"ID":"6379e9ad-dca2-43a4-92eb-368b25447884","Type":"ContainerDied","Data":"be76f68a6f1900c6dd5b125ed615f7b82e8479ea1f447c332cdd6cbb11263eff"} Jan 28 20:42:22 crc kubenswrapper[4746]: I0128 20:42:22.376310 4746 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="be76f68a6f1900c6dd5b125ed615f7b82e8479ea1f447c332cdd6cbb11263eff" Jan 28 20:42:22 crc kubenswrapper[4746]: I0128 20:42:22.376403 4746 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-kube-apiserver/revision-pruner-8-crc" Jan 28 20:42:22 crc kubenswrapper[4746]: I0128 20:42:22.418879 4746 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-6576b87f9c-sjqv2"] Jan 28 20:42:22 crc kubenswrapper[4746]: I0128 20:42:22.429917 4746 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-6576b87f9c-sjqv2"] Jan 28 20:42:22 crc kubenswrapper[4746]: I0128 20:42:22.844597 4746 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="447abd89-31fd-4bb6-a965-97d7954f47bb" path="/var/lib/kubelet/pods/447abd89-31fd-4bb6-a965-97d7954f47bb/volumes" Jan 28 20:42:23 crc kubenswrapper[4746]: I0128 20:42:23.937808 4746 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-route-controller-manager/route-controller-manager-6ccff4b8f8-99rzl"] Jan 28 20:42:23 crc kubenswrapper[4746]: E0128 20:42:23.938174 4746 cpu_manager.go:410] 
"RemoveStaleState: removing container" podUID="447abd89-31fd-4bb6-a965-97d7954f47bb" containerName="route-controller-manager" Jan 28 20:42:23 crc kubenswrapper[4746]: I0128 20:42:23.938192 4746 state_mem.go:107] "Deleted CPUSet assignment" podUID="447abd89-31fd-4bb6-a965-97d7954f47bb" containerName="route-controller-manager" Jan 28 20:42:23 crc kubenswrapper[4746]: E0128 20:42:23.938218 4746 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="763cd6ec-43ad-4481-bd62-0864f47f1b0e" containerName="pruner" Jan 28 20:42:23 crc kubenswrapper[4746]: I0128 20:42:23.938227 4746 state_mem.go:107] "Deleted CPUSet assignment" podUID="763cd6ec-43ad-4481-bd62-0864f47f1b0e" containerName="pruner" Jan 28 20:42:23 crc kubenswrapper[4746]: E0128 20:42:23.938240 4746 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="6379e9ad-dca2-43a4-92eb-368b25447884" containerName="pruner" Jan 28 20:42:23 crc kubenswrapper[4746]: I0128 20:42:23.938248 4746 state_mem.go:107] "Deleted CPUSet assignment" podUID="6379e9ad-dca2-43a4-92eb-368b25447884" containerName="pruner" Jan 28 20:42:23 crc kubenswrapper[4746]: I0128 20:42:23.938375 4746 memory_manager.go:354] "RemoveStaleState removing state" podUID="6379e9ad-dca2-43a4-92eb-368b25447884" containerName="pruner" Jan 28 20:42:23 crc kubenswrapper[4746]: I0128 20:42:23.938388 4746 memory_manager.go:354] "RemoveStaleState removing state" podUID="447abd89-31fd-4bb6-a965-97d7954f47bb" containerName="route-controller-manager" Jan 28 20:42:23 crc kubenswrapper[4746]: I0128 20:42:23.938406 4746 memory_manager.go:354] "RemoveStaleState removing state" podUID="763cd6ec-43ad-4481-bd62-0864f47f1b0e" containerName="pruner" Jan 28 20:42:23 crc kubenswrapper[4746]: I0128 20:42:23.938967 4746 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-6ccff4b8f8-99rzl" Jan 28 20:42:23 crc kubenswrapper[4746]: I0128 20:42:23.944867 4746 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"openshift-service-ca.crt" Jan 28 20:42:23 crc kubenswrapper[4746]: I0128 20:42:23.945353 4746 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-route-controller-manager"/"serving-cert" Jan 28 20:42:23 crc kubenswrapper[4746]: I0128 20:42:23.946221 4746 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"client-ca" Jan 28 20:42:23 crc kubenswrapper[4746]: I0128 20:42:23.946535 4746 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-route-controller-manager"/"route-controller-manager-sa-dockercfg-h2zr2" Jan 28 20:42:23 crc kubenswrapper[4746]: I0128 20:42:23.946767 4746 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"kube-root-ca.crt" Jan 28 20:42:23 crc kubenswrapper[4746]: I0128 20:42:23.946947 4746 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"config" Jan 28 20:42:23 crc kubenswrapper[4746]: I0128 20:42:23.951386 4746 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-6ccff4b8f8-99rzl"] Jan 28 20:42:24 crc kubenswrapper[4746]: I0128 20:42:24.059382 4746 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/02bbbb13-0f81-478b-9284-a905b8326768-serving-cert\") pod \"route-controller-manager-6ccff4b8f8-99rzl\" (UID: \"02bbbb13-0f81-478b-9284-a905b8326768\") " pod="openshift-route-controller-manager/route-controller-manager-6ccff4b8f8-99rzl" Jan 28 20:42:24 crc kubenswrapper[4746]: I0128 20:42:24.059885 4746 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-vmj6r\" (UniqueName: \"kubernetes.io/projected/02bbbb13-0f81-478b-9284-a905b8326768-kube-api-access-vmj6r\") pod \"route-controller-manager-6ccff4b8f8-99rzl\" (UID: \"02bbbb13-0f81-478b-9284-a905b8326768\") " pod="openshift-route-controller-manager/route-controller-manager-6ccff4b8f8-99rzl" Jan 28 20:42:24 crc kubenswrapper[4746]: I0128 20:42:24.059934 4746 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/02bbbb13-0f81-478b-9284-a905b8326768-client-ca\") pod \"route-controller-manager-6ccff4b8f8-99rzl\" (UID: \"02bbbb13-0f81-478b-9284-a905b8326768\") " pod="openshift-route-controller-manager/route-controller-manager-6ccff4b8f8-99rzl" Jan 28 20:42:24 crc kubenswrapper[4746]: I0128 20:42:24.059954 4746 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/02bbbb13-0f81-478b-9284-a905b8326768-config\") pod \"route-controller-manager-6ccff4b8f8-99rzl\" (UID: \"02bbbb13-0f81-478b-9284-a905b8326768\") " pod="openshift-route-controller-manager/route-controller-manager-6ccff4b8f8-99rzl" Jan 28 20:42:24 crc kubenswrapper[4746]: I0128 20:42:24.161348 4746 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-vmj6r\" (UniqueName: \"kubernetes.io/projected/02bbbb13-0f81-478b-9284-a905b8326768-kube-api-access-vmj6r\") pod \"route-controller-manager-6ccff4b8f8-99rzl\" (UID: \"02bbbb13-0f81-478b-9284-a905b8326768\") " pod="openshift-route-controller-manager/route-controller-manager-6ccff4b8f8-99rzl" Jan 28 20:42:24 crc kubenswrapper[4746]: I0128 20:42:24.161436 4746 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/02bbbb13-0f81-478b-9284-a905b8326768-client-ca\") pod 
\"route-controller-manager-6ccff4b8f8-99rzl\" (UID: \"02bbbb13-0f81-478b-9284-a905b8326768\") " pod="openshift-route-controller-manager/route-controller-manager-6ccff4b8f8-99rzl" Jan 28 20:42:24 crc kubenswrapper[4746]: I0128 20:42:24.161466 4746 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/02bbbb13-0f81-478b-9284-a905b8326768-config\") pod \"route-controller-manager-6ccff4b8f8-99rzl\" (UID: \"02bbbb13-0f81-478b-9284-a905b8326768\") " pod="openshift-route-controller-manager/route-controller-manager-6ccff4b8f8-99rzl" Jan 28 20:42:24 crc kubenswrapper[4746]: I0128 20:42:24.161498 4746 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/02bbbb13-0f81-478b-9284-a905b8326768-serving-cert\") pod \"route-controller-manager-6ccff4b8f8-99rzl\" (UID: \"02bbbb13-0f81-478b-9284-a905b8326768\") " pod="openshift-route-controller-manager/route-controller-manager-6ccff4b8f8-99rzl" Jan 28 20:42:24 crc kubenswrapper[4746]: I0128 20:42:24.164832 4746 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/02bbbb13-0f81-478b-9284-a905b8326768-client-ca\") pod \"route-controller-manager-6ccff4b8f8-99rzl\" (UID: \"02bbbb13-0f81-478b-9284-a905b8326768\") " pod="openshift-route-controller-manager/route-controller-manager-6ccff4b8f8-99rzl" Jan 28 20:42:24 crc kubenswrapper[4746]: I0128 20:42:24.165440 4746 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/02bbbb13-0f81-478b-9284-a905b8326768-config\") pod \"route-controller-manager-6ccff4b8f8-99rzl\" (UID: \"02bbbb13-0f81-478b-9284-a905b8326768\") " pod="openshift-route-controller-manager/route-controller-manager-6ccff4b8f8-99rzl" Jan 28 20:42:24 crc kubenswrapper[4746]: I0128 20:42:24.170460 4746 operation_generator.go:637] "MountVolume.SetUp succeeded 
for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/02bbbb13-0f81-478b-9284-a905b8326768-serving-cert\") pod \"route-controller-manager-6ccff4b8f8-99rzl\" (UID: \"02bbbb13-0f81-478b-9284-a905b8326768\") " pod="openshift-route-controller-manager/route-controller-manager-6ccff4b8f8-99rzl" Jan 28 20:42:24 crc kubenswrapper[4746]: I0128 20:42:24.184651 4746 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-vmj6r\" (UniqueName: \"kubernetes.io/projected/02bbbb13-0f81-478b-9284-a905b8326768-kube-api-access-vmj6r\") pod \"route-controller-manager-6ccff4b8f8-99rzl\" (UID: \"02bbbb13-0f81-478b-9284-a905b8326768\") " pod="openshift-route-controller-manager/route-controller-manager-6ccff4b8f8-99rzl" Jan 28 20:42:24 crc kubenswrapper[4746]: I0128 20:42:24.266994 4746 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-6ccff4b8f8-99rzl" Jan 28 20:42:24 crc kubenswrapper[4746]: I0128 20:42:24.731172 4746 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-console/downloads-7954f5f757-qrffw" Jan 28 20:42:24 crc kubenswrapper[4746]: I0128 20:42:24.909774 4746 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-console/console-f9d7485db-hcxv8" Jan 28 20:42:24 crc kubenswrapper[4746]: I0128 20:42:24.920405 4746 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-console/console-f9d7485db-hcxv8" Jan 28 20:42:25 crc kubenswrapper[4746]: I0128 20:42:25.349817 4746 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-image-registry/image-registry-697d97f7c8-lwhk4" Jan 28 20:42:26 crc kubenswrapper[4746]: I0128 20:42:26.165447 4746 patch_prober.go:28] interesting pod/controller-manager-879f6c89f-5d8cs container/controller-manager namespace/openshift-controller-manager: Readiness probe status=failure output="Get 
\"https://10.217.0.8:8443/healthz\": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers)" start-of-body= Jan 28 20:42:26 crc kubenswrapper[4746]: I0128 20:42:26.165586 4746 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-controller-manager/controller-manager-879f6c89f-5d8cs" podUID="0f0eb07a-e0b4-4702-89b0-d94e937471a5" containerName="controller-manager" probeResult="failure" output="Get \"https://10.217.0.8:8443/healthz\": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers)" Jan 28 20:42:27 crc kubenswrapper[4746]: I0128 20:42:27.058205 4746 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-controller-manager/controller-manager-879f6c89f-5d8cs" Jan 28 20:42:27 crc kubenswrapper[4746]: I0128 20:42:27.215402 4746 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/0f0eb07a-e0b4-4702-89b0-d94e937471a5-serving-cert\") pod \"0f0eb07a-e0b4-4702-89b0-d94e937471a5\" (UID: \"0f0eb07a-e0b4-4702-89b0-d94e937471a5\") " Jan 28 20:42:27 crc kubenswrapper[4746]: I0128 20:42:27.215481 4746 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/0f0eb07a-e0b4-4702-89b0-d94e937471a5-config\") pod \"0f0eb07a-e0b4-4702-89b0-d94e937471a5\" (UID: \"0f0eb07a-e0b4-4702-89b0-d94e937471a5\") " Jan 28 20:42:27 crc kubenswrapper[4746]: I0128 20:42:27.215578 4746 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/0f0eb07a-e0b4-4702-89b0-d94e937471a5-client-ca\") pod \"0f0eb07a-e0b4-4702-89b0-d94e937471a5\" (UID: \"0f0eb07a-e0b4-4702-89b0-d94e937471a5\") " Jan 28 20:42:27 crc kubenswrapper[4746]: I0128 20:42:27.215609 4746 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume 
\"kube-api-access-2k8vv\" (UniqueName: \"kubernetes.io/projected/0f0eb07a-e0b4-4702-89b0-d94e937471a5-kube-api-access-2k8vv\") pod \"0f0eb07a-e0b4-4702-89b0-d94e937471a5\" (UID: \"0f0eb07a-e0b4-4702-89b0-d94e937471a5\") " Jan 28 20:42:27 crc kubenswrapper[4746]: I0128 20:42:27.215683 4746 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/0f0eb07a-e0b4-4702-89b0-d94e937471a5-proxy-ca-bundles\") pod \"0f0eb07a-e0b4-4702-89b0-d94e937471a5\" (UID: \"0f0eb07a-e0b4-4702-89b0-d94e937471a5\") " Jan 28 20:42:27 crc kubenswrapper[4746]: I0128 20:42:27.217221 4746 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/0f0eb07a-e0b4-4702-89b0-d94e937471a5-client-ca" (OuterVolumeSpecName: "client-ca") pod "0f0eb07a-e0b4-4702-89b0-d94e937471a5" (UID: "0f0eb07a-e0b4-4702-89b0-d94e937471a5"). InnerVolumeSpecName "client-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 28 20:42:27 crc kubenswrapper[4746]: I0128 20:42:27.217633 4746 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/0f0eb07a-e0b4-4702-89b0-d94e937471a5-proxy-ca-bundles" (OuterVolumeSpecName: "proxy-ca-bundles") pod "0f0eb07a-e0b4-4702-89b0-d94e937471a5" (UID: "0f0eb07a-e0b4-4702-89b0-d94e937471a5"). InnerVolumeSpecName "proxy-ca-bundles". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 28 20:42:27 crc kubenswrapper[4746]: I0128 20:42:27.224181 4746 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/0f0eb07a-e0b4-4702-89b0-d94e937471a5-config" (OuterVolumeSpecName: "config") pod "0f0eb07a-e0b4-4702-89b0-d94e937471a5" (UID: "0f0eb07a-e0b4-4702-89b0-d94e937471a5"). InnerVolumeSpecName "config". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 28 20:42:27 crc kubenswrapper[4746]: I0128 20:42:27.224249 4746 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/0f0eb07a-e0b4-4702-89b0-d94e937471a5-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "0f0eb07a-e0b4-4702-89b0-d94e937471a5" (UID: "0f0eb07a-e0b4-4702-89b0-d94e937471a5"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 28 20:42:27 crc kubenswrapper[4746]: I0128 20:42:27.224806 4746 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/0f0eb07a-e0b4-4702-89b0-d94e937471a5-kube-api-access-2k8vv" (OuterVolumeSpecName: "kube-api-access-2k8vv") pod "0f0eb07a-e0b4-4702-89b0-d94e937471a5" (UID: "0f0eb07a-e0b4-4702-89b0-d94e937471a5"). InnerVolumeSpecName "kube-api-access-2k8vv". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 28 20:42:27 crc kubenswrapper[4746]: I0128 20:42:27.316976 4746 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/0f0eb07a-e0b4-4702-89b0-d94e937471a5-config\") on node \"crc\" DevicePath \"\"" Jan 28 20:42:27 crc kubenswrapper[4746]: I0128 20:42:27.317019 4746 reconciler_common.go:293] "Volume detached for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/0f0eb07a-e0b4-4702-89b0-d94e937471a5-client-ca\") on node \"crc\" DevicePath \"\"" Jan 28 20:42:27 crc kubenswrapper[4746]: I0128 20:42:27.317031 4746 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-2k8vv\" (UniqueName: \"kubernetes.io/projected/0f0eb07a-e0b4-4702-89b0-d94e937471a5-kube-api-access-2k8vv\") on node \"crc\" DevicePath \"\"" Jan 28 20:42:27 crc kubenswrapper[4746]: I0128 20:42:27.317068 4746 reconciler_common.go:293] "Volume detached for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/0f0eb07a-e0b4-4702-89b0-d94e937471a5-proxy-ca-bundles\") on node \"crc\" 
DevicePath \"\"" Jan 28 20:42:27 crc kubenswrapper[4746]: I0128 20:42:27.317092 4746 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/0f0eb07a-e0b4-4702-89b0-d94e937471a5-serving-cert\") on node \"crc\" DevicePath \"\"" Jan 28 20:42:27 crc kubenswrapper[4746]: I0128 20:42:27.416605 4746 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager/controller-manager-879f6c89f-5d8cs" event={"ID":"0f0eb07a-e0b4-4702-89b0-d94e937471a5","Type":"ContainerDied","Data":"8888ab26476e69ebc3b97c4249886b7c64b042ab0cf20bf596eb2c3ca5c61513"} Jan 28 20:42:27 crc kubenswrapper[4746]: I0128 20:42:27.416689 4746 scope.go:117] "RemoveContainer" containerID="df9bf263c305c8676e726e87dedce4bc509ccb4a4b19146e7680dedbbb7271af" Jan 28 20:42:27 crc kubenswrapper[4746]: I0128 20:42:27.416697 4746 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-controller-manager/controller-manager-879f6c89f-5d8cs" Jan 28 20:42:27 crc kubenswrapper[4746]: I0128 20:42:27.454383 4746 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-controller-manager/controller-manager-879f6c89f-5d8cs"] Jan 28 20:42:27 crc kubenswrapper[4746]: I0128 20:42:27.457305 4746 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-controller-manager/controller-manager-879f6c89f-5d8cs"] Jan 28 20:42:27 crc kubenswrapper[4746]: I0128 20:42:27.953901 4746 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-controller-manager/controller-manager-666dd99597-bcgsc"] Jan 28 20:42:27 crc kubenswrapper[4746]: E0128 20:42:27.954604 4746 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="0f0eb07a-e0b4-4702-89b0-d94e937471a5" containerName="controller-manager" Jan 28 20:42:27 crc kubenswrapper[4746]: I0128 20:42:27.954622 4746 state_mem.go:107] "Deleted CPUSet assignment" podUID="0f0eb07a-e0b4-4702-89b0-d94e937471a5" containerName="controller-manager" Jan 28 20:42:27 crc 
kubenswrapper[4746]: I0128 20:42:27.954763 4746 memory_manager.go:354] "RemoveStaleState removing state" podUID="0f0eb07a-e0b4-4702-89b0-d94e937471a5" containerName="controller-manager" Jan 28 20:42:27 crc kubenswrapper[4746]: I0128 20:42:27.955241 4746 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-controller-manager/controller-manager-666dd99597-bcgsc" Jan 28 20:42:27 crc kubenswrapper[4746]: I0128 20:42:27.957660 4746 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"config" Jan 28 20:42:27 crc kubenswrapper[4746]: I0128 20:42:27.958349 4746 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-controller-manager"/"serving-cert" Jan 28 20:42:27 crc kubenswrapper[4746]: I0128 20:42:27.958542 4746 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"openshift-service-ca.crt" Jan 28 20:42:27 crc kubenswrapper[4746]: I0128 20:42:27.960163 4746 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"client-ca" Jan 28 20:42:27 crc kubenswrapper[4746]: I0128 20:42:27.960974 4746 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-controller-manager"/"openshift-controller-manager-sa-dockercfg-msq4c" Jan 28 20:42:27 crc kubenswrapper[4746]: I0128 20:42:27.962123 4746 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"kube-root-ca.crt" Jan 28 20:42:27 crc kubenswrapper[4746]: I0128 20:42:27.965141 4746 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-controller-manager/controller-manager-666dd99597-bcgsc"] Jan 28 20:42:27 crc kubenswrapper[4746]: I0128 20:42:27.973723 4746 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"openshift-global-ca" Jan 28 20:42:28 crc kubenswrapper[4746]: I0128 20:42:28.028852 4746 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/03192e9a-18c9-4c69-8f44-5bf1c083b7fc-client-ca\") pod \"controller-manager-666dd99597-bcgsc\" (UID: \"03192e9a-18c9-4c69-8f44-5bf1c083b7fc\") " pod="openshift-controller-manager/controller-manager-666dd99597-bcgsc" Jan 28 20:42:28 crc kubenswrapper[4746]: I0128 20:42:28.028917 4746 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/03192e9a-18c9-4c69-8f44-5bf1c083b7fc-serving-cert\") pod \"controller-manager-666dd99597-bcgsc\" (UID: \"03192e9a-18c9-4c69-8f44-5bf1c083b7fc\") " pod="openshift-controller-manager/controller-manager-666dd99597-bcgsc" Jan 28 20:42:28 crc kubenswrapper[4746]: I0128 20:42:28.028967 4746 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/03192e9a-18c9-4c69-8f44-5bf1c083b7fc-proxy-ca-bundles\") pod \"controller-manager-666dd99597-bcgsc\" (UID: \"03192e9a-18c9-4c69-8f44-5bf1c083b7fc\") " pod="openshift-controller-manager/controller-manager-666dd99597-bcgsc" Jan 28 20:42:28 crc kubenswrapper[4746]: I0128 20:42:28.029130 4746 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/03192e9a-18c9-4c69-8f44-5bf1c083b7fc-config\") pod \"controller-manager-666dd99597-bcgsc\" (UID: \"03192e9a-18c9-4c69-8f44-5bf1c083b7fc\") " pod="openshift-controller-manager/controller-manager-666dd99597-bcgsc" Jan 28 20:42:28 crc kubenswrapper[4746]: I0128 20:42:28.029329 4746 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-lxxbd\" (UniqueName: \"kubernetes.io/projected/03192e9a-18c9-4c69-8f44-5bf1c083b7fc-kube-api-access-lxxbd\") pod \"controller-manager-666dd99597-bcgsc\" (UID: 
\"03192e9a-18c9-4c69-8f44-5bf1c083b7fc\") " pod="openshift-controller-manager/controller-manager-666dd99597-bcgsc" Jan 28 20:42:28 crc kubenswrapper[4746]: I0128 20:42:28.134302 4746 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/03192e9a-18c9-4c69-8f44-5bf1c083b7fc-proxy-ca-bundles\") pod \"controller-manager-666dd99597-bcgsc\" (UID: \"03192e9a-18c9-4c69-8f44-5bf1c083b7fc\") " pod="openshift-controller-manager/controller-manager-666dd99597-bcgsc" Jan 28 20:42:28 crc kubenswrapper[4746]: I0128 20:42:28.131407 4746 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/03192e9a-18c9-4c69-8f44-5bf1c083b7fc-proxy-ca-bundles\") pod \"controller-manager-666dd99597-bcgsc\" (UID: \"03192e9a-18c9-4c69-8f44-5bf1c083b7fc\") " pod="openshift-controller-manager/controller-manager-666dd99597-bcgsc" Jan 28 20:42:28 crc kubenswrapper[4746]: I0128 20:42:28.134515 4746 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/03192e9a-18c9-4c69-8f44-5bf1c083b7fc-config\") pod \"controller-manager-666dd99597-bcgsc\" (UID: \"03192e9a-18c9-4c69-8f44-5bf1c083b7fc\") " pod="openshift-controller-manager/controller-manager-666dd99597-bcgsc" Jan 28 20:42:28 crc kubenswrapper[4746]: I0128 20:42:28.135359 4746 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-lxxbd\" (UniqueName: \"kubernetes.io/projected/03192e9a-18c9-4c69-8f44-5bf1c083b7fc-kube-api-access-lxxbd\") pod \"controller-manager-666dd99597-bcgsc\" (UID: \"03192e9a-18c9-4c69-8f44-5bf1c083b7fc\") " pod="openshift-controller-manager/controller-manager-666dd99597-bcgsc" Jan 28 20:42:28 crc kubenswrapper[4746]: I0128 20:42:28.135624 4746 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"client-ca\" (UniqueName: 
\"kubernetes.io/configmap/03192e9a-18c9-4c69-8f44-5bf1c083b7fc-client-ca\") pod \"controller-manager-666dd99597-bcgsc\" (UID: \"03192e9a-18c9-4c69-8f44-5bf1c083b7fc\") " pod="openshift-controller-manager/controller-manager-666dd99597-bcgsc" Jan 28 20:42:28 crc kubenswrapper[4746]: I0128 20:42:28.135685 4746 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/03192e9a-18c9-4c69-8f44-5bf1c083b7fc-serving-cert\") pod \"controller-manager-666dd99597-bcgsc\" (UID: \"03192e9a-18c9-4c69-8f44-5bf1c083b7fc\") " pod="openshift-controller-manager/controller-manager-666dd99597-bcgsc" Jan 28 20:42:28 crc kubenswrapper[4746]: I0128 20:42:28.136534 4746 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/03192e9a-18c9-4c69-8f44-5bf1c083b7fc-config\") pod \"controller-manager-666dd99597-bcgsc\" (UID: \"03192e9a-18c9-4c69-8f44-5bf1c083b7fc\") " pod="openshift-controller-manager/controller-manager-666dd99597-bcgsc" Jan 28 20:42:28 crc kubenswrapper[4746]: I0128 20:42:28.137362 4746 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/03192e9a-18c9-4c69-8f44-5bf1c083b7fc-client-ca\") pod \"controller-manager-666dd99597-bcgsc\" (UID: \"03192e9a-18c9-4c69-8f44-5bf1c083b7fc\") " pod="openshift-controller-manager/controller-manager-666dd99597-bcgsc" Jan 28 20:42:28 crc kubenswrapper[4746]: I0128 20:42:28.149408 4746 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/03192e9a-18c9-4c69-8f44-5bf1c083b7fc-serving-cert\") pod \"controller-manager-666dd99597-bcgsc\" (UID: \"03192e9a-18c9-4c69-8f44-5bf1c083b7fc\") " pod="openshift-controller-manager/controller-manager-666dd99597-bcgsc" Jan 28 20:42:28 crc kubenswrapper[4746]: I0128 20:42:28.157234 4746 operation_generator.go:637] "MountVolume.SetUp succeeded for 
volume \"kube-api-access-lxxbd\" (UniqueName: \"kubernetes.io/projected/03192e9a-18c9-4c69-8f44-5bf1c083b7fc-kube-api-access-lxxbd\") pod \"controller-manager-666dd99597-bcgsc\" (UID: \"03192e9a-18c9-4c69-8f44-5bf1c083b7fc\") " pod="openshift-controller-manager/controller-manager-666dd99597-bcgsc" Jan 28 20:42:28 crc kubenswrapper[4746]: I0128 20:42:28.279019 4746 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-controller-manager/controller-manager-666dd99597-bcgsc" Jan 28 20:42:28 crc kubenswrapper[4746]: I0128 20:42:28.843068 4746 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="0f0eb07a-e0b4-4702-89b0-d94e937471a5" path="/var/lib/kubelet/pods/0f0eb07a-e0b4-4702-89b0-d94e937471a5/volumes" Jan 28 20:42:35 crc kubenswrapper[4746]: I0128 20:42:35.581264 4746 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-zzrh5" Jan 28 20:42:37 crc kubenswrapper[4746]: E0128 20:42:37.613208 4746 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled" image="registry.redhat.io/redhat/certified-operator-index:v4.18" Jan 28 20:42:37 crc kubenswrapper[4746]: E0128 20:42:37.613491 4746 kuberuntime_manager.go:1274] "Unhandled Error" err="init container &Container{Name:extract-content,Image:registry.redhat.io/redhat/certified-operator-index:v4.18,Command:[/utilities/copy-content],Args:[--catalog.from=/configs --catalog.to=/extracted-catalog/catalog --cache.from=/tmp/cache 
--cache.to=/extracted-catalog/cache],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:utilities,ReadOnly:false,MountPath:/utilities,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:catalog-content,ReadOnly:false,MountPath:/extracted-catalog,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-pvnnm,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:Always,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*1000170000,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:FallbackToLogsOnError,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod certified-operators-85ttn_openshift-marketplace(6c585264-9fea-4d40-910d-68a31c553f76): ErrImagePull: rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled" logger="UnhandledError" Jan 28 20:42:37 crc kubenswrapper[4746]: E0128 20:42:37.614699 4746 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"extract-content\" with ErrImagePull: \"rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled\"" pod="openshift-marketplace/certified-operators-85ttn" podUID="6c585264-9fea-4d40-910d-68a31c553f76" Jan 28 20:42:38 crc 
kubenswrapper[4746]: I0128 20:42:38.765416 4746 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-controller-manager/controller-manager-666dd99597-bcgsc"] Jan 28 20:42:38 crc kubenswrapper[4746]: I0128 20:42:38.860303 4746 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-6ccff4b8f8-99rzl"] Jan 28 20:42:40 crc kubenswrapper[4746]: E0128 20:42:40.523261 4746 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"extract-content\" with ImagePullBackOff: \"Back-off pulling image \\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"\"" pod="openshift-marketplace/certified-operators-85ttn" podUID="6c585264-9fea-4d40-910d-68a31c553f76" Jan 28 20:42:40 crc kubenswrapper[4746]: E0128 20:42:40.597196 4746 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled" image="registry.redhat.io/redhat/redhat-operator-index:v4.18" Jan 28 20:42:40 crc kubenswrapper[4746]: E0128 20:42:40.597398 4746 kuberuntime_manager.go:1274] "Unhandled Error" err="init container &Container{Name:extract-content,Image:registry.redhat.io/redhat/redhat-operator-index:v4.18,Command:[/utilities/copy-content],Args:[--catalog.from=/configs --catalog.to=/extracted-catalog/catalog --cache.from=/tmp/cache 
--cache.to=/extracted-catalog/cache],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:utilities,ReadOnly:false,MountPath:/utilities,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:catalog-content,ReadOnly:false,MountPath:/extracted-catalog,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-xvjxs,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:Always,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*1000170000,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:FallbackToLogsOnError,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod redhat-operators-mlqqs_openshift-marketplace(8b55bfa9-466f-44d9-8bc7-753bff9b7a7b): ErrImagePull: rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled" logger="UnhandledError" Jan 28 20:42:40 crc kubenswrapper[4746]: E0128 20:42:40.598453 4746 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled" image="registry.redhat.io/redhat/community-operator-index:v4.18" Jan 28 20:42:40 crc kubenswrapper[4746]: E0128 20:42:40.598554 4746 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to 
\"StartContainer\" for \"extract-content\" with ErrImagePull: \"rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled\"" pod="openshift-marketplace/redhat-operators-mlqqs" podUID="8b55bfa9-466f-44d9-8bc7-753bff9b7a7b" Jan 28 20:42:40 crc kubenswrapper[4746]: E0128 20:42:40.598644 4746 kuberuntime_manager.go:1274] "Unhandled Error" err="init container &Container{Name:extract-content,Image:registry.redhat.io/redhat/community-operator-index:v4.18,Command:[/utilities/copy-content],Args:[--catalog.from=/configs --catalog.to=/extracted-catalog/catalog --cache.from=/tmp/cache --cache.to=/extracted-catalog/cache],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:utilities,ReadOnly:false,MountPath:/utilities,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:catalog-content,ReadOnly:false,MountPath:/extracted-catalog,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-f6jsk,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:Always,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*1000170000,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:FallbackToLogsOnError,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod 
community-operators-hxz5g_openshift-marketplace(f8f220cb-71e1-4b97-960e-ef8742661130): ErrImagePull: rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled" logger="UnhandledError" Jan 28 20:42:40 crc kubenswrapper[4746]: E0128 20:42:40.599841 4746 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"extract-content\" with ErrImagePull: \"rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled\"" pod="openshift-marketplace/community-operators-hxz5g" podUID="f8f220cb-71e1-4b97-960e-ef8742661130" Jan 28 20:42:40 crc kubenswrapper[4746]: I0128 20:42:40.871115 4746 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-network-diagnostics/network-check-target-xd92c" Jan 28 20:42:42 crc kubenswrapper[4746]: E0128 20:42:42.440438 4746 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled" image="registry.redhat.io/redhat/certified-operator-index:v4.18" Jan 28 20:42:42 crc kubenswrapper[4746]: E0128 20:42:42.441510 4746 kuberuntime_manager.go:1274] "Unhandled Error" err="init container &Container{Name:extract-content,Image:registry.redhat.io/redhat/certified-operator-index:v4.18,Command:[/utilities/copy-content],Args:[--catalog.from=/configs --catalog.to=/extracted-catalog/catalog --cache.from=/tmp/cache 
--cache.to=/extracted-catalog/cache],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:utilities,ReadOnly:false,MountPath:/utilities,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:catalog-content,ReadOnly:false,MountPath:/extracted-catalog,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-pjzkd,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:Always,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*1000170000,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:FallbackToLogsOnError,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod certified-operators-c9v8x_openshift-marketplace(abe404c6-f1c8-4ad6-92b9-7c082b112b50): ErrImagePull: rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled" logger="UnhandledError" Jan 28 20:42:42 crc kubenswrapper[4746]: E0128 20:42:42.442827 4746 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"extract-content\" with ErrImagePull: \"rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled\"" pod="openshift-marketplace/certified-operators-c9v8x" podUID="abe404c6-f1c8-4ad6-92b9-7c082b112b50" Jan 28 20:42:43 crc 
kubenswrapper[4746]: I0128 20:42:43.536165 4746 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-kube-apiserver/revision-pruner-9-crc"] Jan 28 20:42:43 crc kubenswrapper[4746]: I0128 20:42:43.537496 4746 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-apiserver/revision-pruner-9-crc" Jan 28 20:42:43 crc kubenswrapper[4746]: I0128 20:42:43.545503 4746 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-apiserver"/"installer-sa-dockercfg-5pr6n" Jan 28 20:42:43 crc kubenswrapper[4746]: I0128 20:42:43.545984 4746 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-apiserver"/"kube-root-ca.crt" Jan 28 20:42:43 crc kubenswrapper[4746]: I0128 20:42:43.550341 4746 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-apiserver/revision-pruner-9-crc"] Jan 28 20:42:43 crc kubenswrapper[4746]: I0128 20:42:43.669490 4746 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/04d5cb49-01ac-4bb2-aad0-2f3ed5ec3c04-kubelet-dir\") pod \"revision-pruner-9-crc\" (UID: \"04d5cb49-01ac-4bb2-aad0-2f3ed5ec3c04\") " pod="openshift-kube-apiserver/revision-pruner-9-crc" Jan 28 20:42:43 crc kubenswrapper[4746]: I0128 20:42:43.669566 4746 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/04d5cb49-01ac-4bb2-aad0-2f3ed5ec3c04-kube-api-access\") pod \"revision-pruner-9-crc\" (UID: \"04d5cb49-01ac-4bb2-aad0-2f3ed5ec3c04\") " pod="openshift-kube-apiserver/revision-pruner-9-crc" Jan 28 20:42:43 crc kubenswrapper[4746]: I0128 20:42:43.771615 4746 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/04d5cb49-01ac-4bb2-aad0-2f3ed5ec3c04-kubelet-dir\") pod \"revision-pruner-9-crc\" (UID: 
\"04d5cb49-01ac-4bb2-aad0-2f3ed5ec3c04\") " pod="openshift-kube-apiserver/revision-pruner-9-crc" Jan 28 20:42:43 crc kubenswrapper[4746]: I0128 20:42:43.771716 4746 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/04d5cb49-01ac-4bb2-aad0-2f3ed5ec3c04-kube-api-access\") pod \"revision-pruner-9-crc\" (UID: \"04d5cb49-01ac-4bb2-aad0-2f3ed5ec3c04\") " pod="openshift-kube-apiserver/revision-pruner-9-crc" Jan 28 20:42:43 crc kubenswrapper[4746]: I0128 20:42:43.771753 4746 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/04d5cb49-01ac-4bb2-aad0-2f3ed5ec3c04-kubelet-dir\") pod \"revision-pruner-9-crc\" (UID: \"04d5cb49-01ac-4bb2-aad0-2f3ed5ec3c04\") " pod="openshift-kube-apiserver/revision-pruner-9-crc" Jan 28 20:42:43 crc kubenswrapper[4746]: I0128 20:42:43.795797 4746 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/04d5cb49-01ac-4bb2-aad0-2f3ed5ec3c04-kube-api-access\") pod \"revision-pruner-9-crc\" (UID: \"04d5cb49-01ac-4bb2-aad0-2f3ed5ec3c04\") " pod="openshift-kube-apiserver/revision-pruner-9-crc" Jan 28 20:42:43 crc kubenswrapper[4746]: I0128 20:42:43.872947 4746 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-apiserver/revision-pruner-9-crc" Jan 28 20:42:43 crc kubenswrapper[4746]: E0128 20:42:43.876416 4746 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"extract-content\" with ImagePullBackOff: \"Back-off pulling image \\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"\"" pod="openshift-marketplace/community-operators-hxz5g" podUID="f8f220cb-71e1-4b97-960e-ef8742661130" Jan 28 20:42:43 crc kubenswrapper[4746]: E0128 20:42:43.876459 4746 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"extract-content\" with ImagePullBackOff: \"Back-off pulling image \\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"\"" pod="openshift-marketplace/redhat-operators-mlqqs" podUID="8b55bfa9-466f-44d9-8bc7-753bff9b7a7b" Jan 28 20:42:43 crc kubenswrapper[4746]: E0128 20:42:43.876486 4746 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"extract-content\" with ImagePullBackOff: \"Back-off pulling image \\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"\"" pod="openshift-marketplace/certified-operators-c9v8x" podUID="abe404c6-f1c8-4ad6-92b9-7c082b112b50" Jan 28 20:42:43 crc kubenswrapper[4746]: E0128 20:42:43.945540 4746 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled" image="registry.redhat.io/redhat/redhat-marketplace-index:v4.18" Jan 28 20:42:43 crc kubenswrapper[4746]: E0128 20:42:43.946127 4746 kuberuntime_manager.go:1274] "Unhandled Error" err="init container &Container{Name:extract-content,Image:registry.redhat.io/redhat/redhat-marketplace-index:v4.18,Command:[/utilities/copy-content],Args:[--catalog.from=/configs --catalog.to=/extracted-catalog/catalog --cache.from=/tmp/cache 
--cache.to=/extracted-catalog/cache],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:utilities,ReadOnly:false,MountPath:/utilities,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:catalog-content,ReadOnly:false,MountPath:/extracted-catalog,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-6n9sg,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:Always,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*1000170000,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:FallbackToLogsOnError,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod redhat-marketplace-4t66g_openshift-marketplace(18cbfd39-cf22-428c-ab2a-708082df0357): ErrImagePull: rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled" logger="UnhandledError" Jan 28 20:42:43 crc kubenswrapper[4746]: E0128 20:42:43.947601 4746 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"extract-content\" with ErrImagePull: \"rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled\"" pod="openshift-marketplace/redhat-marketplace-4t66g" podUID="18cbfd39-cf22-428c-ab2a-708082df0357" Jan 28 20:42:44 crc 
kubenswrapper[4746]: E0128 20:42:44.004954 4746 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled" image="registry.redhat.io/redhat/community-operator-index:v4.18" Jan 28 20:42:44 crc kubenswrapper[4746]: E0128 20:42:44.005145 4746 kuberuntime_manager.go:1274] "Unhandled Error" err="init container &Container{Name:extract-content,Image:registry.redhat.io/redhat/community-operator-index:v4.18,Command:[/utilities/copy-content],Args:[--catalog.from=/configs --catalog.to=/extracted-catalog/catalog --cache.from=/tmp/cache --cache.to=/extracted-catalog/cache],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:utilities,ReadOnly:false,MountPath:/utilities,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:catalog-content,ReadOnly:false,MountPath:/extracted-catalog,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-k2s85,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:Always,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*1000170000,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:FallbackToLogsOnError,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod 
community-operators-ghx5p_openshift-marketplace(fa890224-0942-4671-a9d8-97b6f465b0df): ErrImagePull: rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled" logger="UnhandledError" Jan 28 20:42:44 crc kubenswrapper[4746]: E0128 20:42:44.006483 4746 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"extract-content\" with ErrImagePull: \"rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled\"" pod="openshift-marketplace/community-operators-ghx5p" podUID="fa890224-0942-4671-a9d8-97b6f465b0df" Jan 28 20:42:44 crc kubenswrapper[4746]: E0128 20:42:44.019252 4746 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled" image="registry.redhat.io/redhat/redhat-marketplace-index:v4.18" Jan 28 20:42:44 crc kubenswrapper[4746]: E0128 20:42:44.020054 4746 kuberuntime_manager.go:1274] "Unhandled Error" err="init container &Container{Name:extract-content,Image:registry.redhat.io/redhat/redhat-marketplace-index:v4.18,Command:[/utilities/copy-content],Args:[--catalog.from=/configs --catalog.to=/extracted-catalog/catalog --cache.from=/tmp/cache 
--cache.to=/extracted-catalog/cache],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:utilities,ReadOnly:false,MountPath:/utilities,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:catalog-content,ReadOnly:false,MountPath:/extracted-catalog,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-wsqtr,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:Always,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*1000170000,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:FallbackToLogsOnError,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod redhat-marketplace-f2t9c_openshift-marketplace(1f42df00-e947-4928-a51f-ddaa3658cc67): ErrImagePull: rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled" logger="UnhandledError" Jan 28 20:42:44 crc kubenswrapper[4746]: E0128 20:42:44.022669 4746 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"extract-content\" with ErrImagePull: \"rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled\"" pod="openshift-marketplace/redhat-marketplace-f2t9c" podUID="1f42df00-e947-4928-a51f-ddaa3658cc67" Jan 28 20:42:44 crc 
kubenswrapper[4746]: I0128 20:42:44.330615 4746 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-multus/network-metrics-daemon-2blg6"] Jan 28 20:42:44 crc kubenswrapper[4746]: W0128 20:42:44.339410 4746 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podf60a5487_5012_4cc9_ad94_5dfb4957d74e.slice/crio-1a430d85257ebdd6c2620cd0239a6559a29a4d5bf3da5a31b52cfc96604778ac WatchSource:0}: Error finding container 1a430d85257ebdd6c2620cd0239a6559a29a4d5bf3da5a31b52cfc96604778ac: Status 404 returned error can't find the container with id 1a430d85257ebdd6c2620cd0239a6559a29a4d5bf3da5a31b52cfc96604778ac Jan 28 20:42:44 crc kubenswrapper[4746]: I0128 20:42:44.390387 4746 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-6ccff4b8f8-99rzl"] Jan 28 20:42:44 crc kubenswrapper[4746]: I0128 20:42:44.448027 4746 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-apiserver/revision-pruner-9-crc"] Jan 28 20:42:44 crc kubenswrapper[4746]: I0128 20:42:44.462259 4746 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-controller-manager/controller-manager-666dd99597-bcgsc"] Jan 28 20:42:44 crc kubenswrapper[4746]: I0128 20:42:44.528627 4746 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-route-controller-manager/route-controller-manager-6ccff4b8f8-99rzl" event={"ID":"02bbbb13-0f81-478b-9284-a905b8326768","Type":"ContainerStarted","Data":"04bff03ac282ac7e498c3c9bd3db36205ed63ba07316ee4bafb66683568023ff"} Jan 28 20:42:44 crc kubenswrapper[4746]: I0128 20:42:44.538025 4746 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/revision-pruner-9-crc" event={"ID":"04d5cb49-01ac-4bb2-aad0-2f3ed5ec3c04","Type":"ContainerStarted","Data":"d20610d2488afe1e332f8296d5198ee3e687624a58b715e6f84f7132e76a9e63"} Jan 28 20:42:44 crc kubenswrapper[4746]: I0128 20:42:44.539633 4746 
kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager/controller-manager-666dd99597-bcgsc" event={"ID":"03192e9a-18c9-4c69-8f44-5bf1c083b7fc","Type":"ContainerStarted","Data":"25ae3c2bc08dffadea1a5d0a6b0e221f42bf2b318f68ee14614016a63f30dd90"} Jan 28 20:42:44 crc kubenswrapper[4746]: I0128 20:42:44.540926 4746 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/network-metrics-daemon-2blg6" event={"ID":"f60a5487-5012-4cc9-ad94-5dfb4957d74e","Type":"ContainerStarted","Data":"1a430d85257ebdd6c2620cd0239a6559a29a4d5bf3da5a31b52cfc96604778ac"} Jan 28 20:42:44 crc kubenswrapper[4746]: I0128 20:42:44.545152 4746 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-cl6g8" event={"ID":"8f41ddf2-d97e-4e32-ac3d-48850a4d47ee","Type":"ContainerStarted","Data":"b791457d961abcfb593dd83a572da88bcc9eab6f3644ef823352350e466ab91e"} Jan 28 20:42:44 crc kubenswrapper[4746]: E0128 20:42:44.547902 4746 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"extract-content\" with ImagePullBackOff: \"Back-off pulling image \\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"\"" pod="openshift-marketplace/community-operators-ghx5p" podUID="fa890224-0942-4671-a9d8-97b6f465b0df" Jan 28 20:42:44 crc kubenswrapper[4746]: E0128 20:42:44.548579 4746 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"extract-content\" with ImagePullBackOff: \"Back-off pulling image \\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"\"" pod="openshift-marketplace/redhat-marketplace-4t66g" podUID="18cbfd39-cf22-428c-ab2a-708082df0357" Jan 28 20:42:44 crc kubenswrapper[4746]: E0128 20:42:44.548766 4746 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"extract-content\" with ImagePullBackOff: \"Back-off pulling image \\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"\"" 
pod="openshift-marketplace/redhat-marketplace-f2t9c" podUID="1f42df00-e947-4928-a51f-ddaa3658cc67" Jan 28 20:42:45 crc kubenswrapper[4746]: I0128 20:42:45.553997 4746 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/network-metrics-daemon-2blg6" event={"ID":"f60a5487-5012-4cc9-ad94-5dfb4957d74e","Type":"ContainerStarted","Data":"25737c504f870b7e2c3b8518605b5c427a4ade3d7ff7bf86d6239de263df09e1"} Jan 28 20:42:45 crc kubenswrapper[4746]: I0128 20:42:45.554501 4746 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/network-metrics-daemon-2blg6" event={"ID":"f60a5487-5012-4cc9-ad94-5dfb4957d74e","Type":"ContainerStarted","Data":"c27b13361294567e0f1df96c4382758a4194c9bfb7cfe304dedfde8df6ea4520"} Jan 28 20:42:45 crc kubenswrapper[4746]: I0128 20:42:45.557844 4746 generic.go:334] "Generic (PLEG): container finished" podID="8f41ddf2-d97e-4e32-ac3d-48850a4d47ee" containerID="b791457d961abcfb593dd83a572da88bcc9eab6f3644ef823352350e466ab91e" exitCode=0 Jan 28 20:42:45 crc kubenswrapper[4746]: I0128 20:42:45.557938 4746 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-cl6g8" event={"ID":"8f41ddf2-d97e-4e32-ac3d-48850a4d47ee","Type":"ContainerDied","Data":"b791457d961abcfb593dd83a572da88bcc9eab6f3644ef823352350e466ab91e"} Jan 28 20:42:45 crc kubenswrapper[4746]: I0128 20:42:45.563036 4746 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-route-controller-manager/route-controller-manager-6ccff4b8f8-99rzl" event={"ID":"02bbbb13-0f81-478b-9284-a905b8326768","Type":"ContainerStarted","Data":"b18729b4a167d9d05e5ced0f1db35f394c546f826ab5a89343e608e2a4fc3474"} Jan 28 20:42:45 crc kubenswrapper[4746]: I0128 20:42:45.563327 4746 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-route-controller-manager/route-controller-manager-6ccff4b8f8-99rzl" podUID="02bbbb13-0f81-478b-9284-a905b8326768" containerName="route-controller-manager" 
containerID="cri-o://b18729b4a167d9d05e5ced0f1db35f394c546f826ab5a89343e608e2a4fc3474" gracePeriod=30 Jan 28 20:42:45 crc kubenswrapper[4746]: I0128 20:42:45.563910 4746 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-route-controller-manager/route-controller-manager-6ccff4b8f8-99rzl" Jan 28 20:42:45 crc kubenswrapper[4746]: I0128 20:42:45.583034 4746 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-multus/network-metrics-daemon-2blg6" podStartSLOduration=173.583013448 podStartE2EDuration="2m53.583013448s" podCreationTimestamp="2026-01-28 20:39:52 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-28 20:42:45.572297512 +0000 UTC m=+193.528483866" watchObservedRunningTime="2026-01-28 20:42:45.583013448 +0000 UTC m=+193.539199802" Jan 28 20:42:45 crc kubenswrapper[4746]: I0128 20:42:45.587815 4746 patch_prober.go:28] interesting pod/route-controller-manager-6ccff4b8f8-99rzl container/route-controller-manager namespace/openshift-route-controller-manager: Readiness probe status=failure output="Get \"https://10.217.0.54:8443/healthz\": EOF" start-of-body= Jan 28 20:42:45 crc kubenswrapper[4746]: I0128 20:42:45.587937 4746 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-route-controller-manager/route-controller-manager-6ccff4b8f8-99rzl" podUID="02bbbb13-0f81-478b-9284-a905b8326768" containerName="route-controller-manager" probeResult="failure" output="Get \"https://10.217.0.54:8443/healthz\": EOF" Jan 28 20:42:45 crc kubenswrapper[4746]: I0128 20:42:45.588337 4746 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/revision-pruner-9-crc" event={"ID":"04d5cb49-01ac-4bb2-aad0-2f3ed5ec3c04","Type":"ContainerStarted","Data":"e92c8e92331bf9dd167150bb33cf9fca6b295935ee0b99e304f3cfafef2b25f7"} Jan 28 20:42:45 crc kubenswrapper[4746]: I0128 20:42:45.593202 4746 kubelet.go:2453] 
"SyncLoop (PLEG): event for pod" pod="openshift-controller-manager/controller-manager-666dd99597-bcgsc" event={"ID":"03192e9a-18c9-4c69-8f44-5bf1c083b7fc","Type":"ContainerStarted","Data":"78c3550fbc07038b5c9ce6151fbccf75a21193ece1c0238dfd186ed9a6e8fa0b"} Jan 28 20:42:45 crc kubenswrapper[4746]: I0128 20:42:45.593471 4746 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-controller-manager/controller-manager-666dd99597-bcgsc" podUID="03192e9a-18c9-4c69-8f44-5bf1c083b7fc" containerName="controller-manager" containerID="cri-o://78c3550fbc07038b5c9ce6151fbccf75a21193ece1c0238dfd186ed9a6e8fa0b" gracePeriod=30 Jan 28 20:42:45 crc kubenswrapper[4746]: I0128 20:42:45.593901 4746 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-controller-manager/controller-manager-666dd99597-bcgsc" Jan 28 20:42:45 crc kubenswrapper[4746]: I0128 20:42:45.604107 4746 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-route-controller-manager/route-controller-manager-6ccff4b8f8-99rzl" podStartSLOduration=27.604068979 podStartE2EDuration="27.604068979s" podCreationTimestamp="2026-01-28 20:42:18 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-28 20:42:45.595782663 +0000 UTC m=+193.551969017" watchObservedRunningTime="2026-01-28 20:42:45.604068979 +0000 UTC m=+193.560255333" Jan 28 20:42:45 crc kubenswrapper[4746]: I0128 20:42:45.604351 4746 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-controller-manager/controller-manager-666dd99597-bcgsc" Jan 28 20:42:45 crc kubenswrapper[4746]: I0128 20:42:45.643099 4746 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-controller-manager/controller-manager-666dd99597-bcgsc" podStartSLOduration=27.643066032 podStartE2EDuration="27.643066032s" podCreationTimestamp="2026-01-28 20:42:18 +0000 UTC" 
firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-28 20:42:45.636912857 +0000 UTC m=+193.593099211" watchObservedRunningTime="2026-01-28 20:42:45.643066032 +0000 UTC m=+193.599252386" Jan 28 20:42:45 crc kubenswrapper[4746]: I0128 20:42:45.871779 4746 patch_prober.go:28] interesting pod/machine-config-daemon-6wrnw container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Jan 28 20:42:45 crc kubenswrapper[4746]: I0128 20:42:45.872179 4746 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-6wrnw" podUID="6dc8b546-9734-4082-b2b3-2bafe3f1564d" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Jan 28 20:42:45 crc kubenswrapper[4746]: I0128 20:42:45.990736 4746 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-6ccff4b8f8-99rzl" Jan 28 20:42:46 crc kubenswrapper[4746]: I0128 20:42:46.011212 4746 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-kube-apiserver/revision-pruner-9-crc" podStartSLOduration=3.011195277 podStartE2EDuration="3.011195277s" podCreationTimestamp="2026-01-28 20:42:43 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-28 20:42:45.663743383 +0000 UTC m=+193.619929737" watchObservedRunningTime="2026-01-28 20:42:46.011195277 +0000 UTC m=+193.967381631" Jan 28 20:42:46 crc kubenswrapper[4746]: I0128 20:42:46.031016 4746 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-route-controller-manager/route-controller-manager-7c874fbf6-kz5dx"] Jan 28 20:42:46 crc kubenswrapper[4746]: E0128 20:42:46.031534 4746 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="02bbbb13-0f81-478b-9284-a905b8326768" containerName="route-controller-manager" Jan 28 20:42:46 crc kubenswrapper[4746]: I0128 20:42:46.031558 4746 state_mem.go:107] "Deleted CPUSet assignment" podUID="02bbbb13-0f81-478b-9284-a905b8326768" containerName="route-controller-manager" Jan 28 20:42:46 crc kubenswrapper[4746]: I0128 20:42:46.031876 4746 memory_manager.go:354] "RemoveStaleState removing state" podUID="02bbbb13-0f81-478b-9284-a905b8326768" containerName="route-controller-manager" Jan 28 20:42:46 crc kubenswrapper[4746]: I0128 20:42:46.032634 4746 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-7c874fbf6-kz5dx" Jan 28 20:42:46 crc kubenswrapper[4746]: I0128 20:42:46.052790 4746 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-7c874fbf6-kz5dx"] Jan 28 20:42:46 crc kubenswrapper[4746]: I0128 20:42:46.059589 4746 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-controller-manager/controller-manager-666dd99597-bcgsc" Jan 28 20:42:46 crc kubenswrapper[4746]: I0128 20:42:46.129191 4746 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/02bbbb13-0f81-478b-9284-a905b8326768-config\") pod \"02bbbb13-0f81-478b-9284-a905b8326768\" (UID: \"02bbbb13-0f81-478b-9284-a905b8326768\") " Jan 28 20:42:46 crc kubenswrapper[4746]: I0128 20:42:46.129316 4746 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/02bbbb13-0f81-478b-9284-a905b8326768-serving-cert\") pod \"02bbbb13-0f81-478b-9284-a905b8326768\" (UID: \"02bbbb13-0f81-478b-9284-a905b8326768\") " Jan 28 20:42:46 crc kubenswrapper[4746]: I0128 20:42:46.129401 4746 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-vmj6r\" (UniqueName: \"kubernetes.io/projected/02bbbb13-0f81-478b-9284-a905b8326768-kube-api-access-vmj6r\") pod \"02bbbb13-0f81-478b-9284-a905b8326768\" (UID: \"02bbbb13-0f81-478b-9284-a905b8326768\") " Jan 28 20:42:46 crc kubenswrapper[4746]: I0128 20:42:46.129436 4746 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/03192e9a-18c9-4c69-8f44-5bf1c083b7fc-client-ca\") pod \"03192e9a-18c9-4c69-8f44-5bf1c083b7fc\" (UID: \"03192e9a-18c9-4c69-8f44-5bf1c083b7fc\") " Jan 28 20:42:46 crc kubenswrapper[4746]: I0128 20:42:46.129491 4746 
reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/03192e9a-18c9-4c69-8f44-5bf1c083b7fc-serving-cert\") pod \"03192e9a-18c9-4c69-8f44-5bf1c083b7fc\" (UID: \"03192e9a-18c9-4c69-8f44-5bf1c083b7fc\") " Jan 28 20:42:46 crc kubenswrapper[4746]: I0128 20:42:46.129516 4746 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/03192e9a-18c9-4c69-8f44-5bf1c083b7fc-proxy-ca-bundles\") pod \"03192e9a-18c9-4c69-8f44-5bf1c083b7fc\" (UID: \"03192e9a-18c9-4c69-8f44-5bf1c083b7fc\") " Jan 28 20:42:46 crc kubenswrapper[4746]: I0128 20:42:46.129562 4746 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-lxxbd\" (UniqueName: \"kubernetes.io/projected/03192e9a-18c9-4c69-8f44-5bf1c083b7fc-kube-api-access-lxxbd\") pod \"03192e9a-18c9-4c69-8f44-5bf1c083b7fc\" (UID: \"03192e9a-18c9-4c69-8f44-5bf1c083b7fc\") " Jan 28 20:42:46 crc kubenswrapper[4746]: I0128 20:42:46.129589 4746 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/02bbbb13-0f81-478b-9284-a905b8326768-client-ca\") pod \"02bbbb13-0f81-478b-9284-a905b8326768\" (UID: \"02bbbb13-0f81-478b-9284-a905b8326768\") " Jan 28 20:42:46 crc kubenswrapper[4746]: I0128 20:42:46.129621 4746 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/03192e9a-18c9-4c69-8f44-5bf1c083b7fc-config\") pod \"03192e9a-18c9-4c69-8f44-5bf1c083b7fc\" (UID: \"03192e9a-18c9-4c69-8f44-5bf1c083b7fc\") " Jan 28 20:42:46 crc kubenswrapper[4746]: I0128 20:42:46.129912 4746 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/d78c5ee4-f3b6-4203-a658-c64799bbf6ab-client-ca\") pod 
\"route-controller-manager-7c874fbf6-kz5dx\" (UID: \"d78c5ee4-f3b6-4203-a658-c64799bbf6ab\") " pod="openshift-route-controller-manager/route-controller-manager-7c874fbf6-kz5dx" Jan 28 20:42:46 crc kubenswrapper[4746]: I0128 20:42:46.129959 4746 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/d78c5ee4-f3b6-4203-a658-c64799bbf6ab-serving-cert\") pod \"route-controller-manager-7c874fbf6-kz5dx\" (UID: \"d78c5ee4-f3b6-4203-a658-c64799bbf6ab\") " pod="openshift-route-controller-manager/route-controller-manager-7c874fbf6-kz5dx" Jan 28 20:42:46 crc kubenswrapper[4746]: I0128 20:42:46.130027 4746 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/d78c5ee4-f3b6-4203-a658-c64799bbf6ab-config\") pod \"route-controller-manager-7c874fbf6-kz5dx\" (UID: \"d78c5ee4-f3b6-4203-a658-c64799bbf6ab\") " pod="openshift-route-controller-manager/route-controller-manager-7c874fbf6-kz5dx" Jan 28 20:42:46 crc kubenswrapper[4746]: I0128 20:42:46.130075 4746 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-gjtmf\" (UniqueName: \"kubernetes.io/projected/d78c5ee4-f3b6-4203-a658-c64799bbf6ab-kube-api-access-gjtmf\") pod \"route-controller-manager-7c874fbf6-kz5dx\" (UID: \"d78c5ee4-f3b6-4203-a658-c64799bbf6ab\") " pod="openshift-route-controller-manager/route-controller-manager-7c874fbf6-kz5dx" Jan 28 20:42:46 crc kubenswrapper[4746]: I0128 20:42:46.131280 4746 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/02bbbb13-0f81-478b-9284-a905b8326768-config" (OuterVolumeSpecName: "config") pod "02bbbb13-0f81-478b-9284-a905b8326768" (UID: "02bbbb13-0f81-478b-9284-a905b8326768"). InnerVolumeSpecName "config". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 28 20:42:46 crc kubenswrapper[4746]: I0128 20:42:46.132961 4746 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/02bbbb13-0f81-478b-9284-a905b8326768-client-ca" (OuterVolumeSpecName: "client-ca") pod "02bbbb13-0f81-478b-9284-a905b8326768" (UID: "02bbbb13-0f81-478b-9284-a905b8326768"). InnerVolumeSpecName "client-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 28 20:42:46 crc kubenswrapper[4746]: I0128 20:42:46.133209 4746 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/03192e9a-18c9-4c69-8f44-5bf1c083b7fc-proxy-ca-bundles" (OuterVolumeSpecName: "proxy-ca-bundles") pod "03192e9a-18c9-4c69-8f44-5bf1c083b7fc" (UID: "03192e9a-18c9-4c69-8f44-5bf1c083b7fc"). InnerVolumeSpecName "proxy-ca-bundles". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 28 20:42:46 crc kubenswrapper[4746]: I0128 20:42:46.133462 4746 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/03192e9a-18c9-4c69-8f44-5bf1c083b7fc-config" (OuterVolumeSpecName: "config") pod "03192e9a-18c9-4c69-8f44-5bf1c083b7fc" (UID: "03192e9a-18c9-4c69-8f44-5bf1c083b7fc"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 28 20:42:46 crc kubenswrapper[4746]: I0128 20:42:46.133500 4746 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/03192e9a-18c9-4c69-8f44-5bf1c083b7fc-client-ca" (OuterVolumeSpecName: "client-ca") pod "03192e9a-18c9-4c69-8f44-5bf1c083b7fc" (UID: "03192e9a-18c9-4c69-8f44-5bf1c083b7fc"). InnerVolumeSpecName "client-ca". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 28 20:42:46 crc kubenswrapper[4746]: I0128 20:42:46.139054 4746 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/03192e9a-18c9-4c69-8f44-5bf1c083b7fc-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "03192e9a-18c9-4c69-8f44-5bf1c083b7fc" (UID: "03192e9a-18c9-4c69-8f44-5bf1c083b7fc"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 28 20:42:46 crc kubenswrapper[4746]: I0128 20:42:46.141287 4746 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/03192e9a-18c9-4c69-8f44-5bf1c083b7fc-kube-api-access-lxxbd" (OuterVolumeSpecName: "kube-api-access-lxxbd") pod "03192e9a-18c9-4c69-8f44-5bf1c083b7fc" (UID: "03192e9a-18c9-4c69-8f44-5bf1c083b7fc"). InnerVolumeSpecName "kube-api-access-lxxbd". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 28 20:42:46 crc kubenswrapper[4746]: I0128 20:42:46.142225 4746 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/02bbbb13-0f81-478b-9284-a905b8326768-kube-api-access-vmj6r" (OuterVolumeSpecName: "kube-api-access-vmj6r") pod "02bbbb13-0f81-478b-9284-a905b8326768" (UID: "02bbbb13-0f81-478b-9284-a905b8326768"). InnerVolumeSpecName "kube-api-access-vmj6r". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 28 20:42:46 crc kubenswrapper[4746]: I0128 20:42:46.146428 4746 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/02bbbb13-0f81-478b-9284-a905b8326768-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "02bbbb13-0f81-478b-9284-a905b8326768" (UID: "02bbbb13-0f81-478b-9284-a905b8326768"). InnerVolumeSpecName "serving-cert". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 28 20:42:46 crc kubenswrapper[4746]: I0128 20:42:46.232152 4746 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/d78c5ee4-f3b6-4203-a658-c64799bbf6ab-config\") pod \"route-controller-manager-7c874fbf6-kz5dx\" (UID: \"d78c5ee4-f3b6-4203-a658-c64799bbf6ab\") " pod="openshift-route-controller-manager/route-controller-manager-7c874fbf6-kz5dx" Jan 28 20:42:46 crc kubenswrapper[4746]: I0128 20:42:46.232445 4746 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-gjtmf\" (UniqueName: \"kubernetes.io/projected/d78c5ee4-f3b6-4203-a658-c64799bbf6ab-kube-api-access-gjtmf\") pod \"route-controller-manager-7c874fbf6-kz5dx\" (UID: \"d78c5ee4-f3b6-4203-a658-c64799bbf6ab\") " pod="openshift-route-controller-manager/route-controller-manager-7c874fbf6-kz5dx" Jan 28 20:42:46 crc kubenswrapper[4746]: I0128 20:42:46.232497 4746 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/d78c5ee4-f3b6-4203-a658-c64799bbf6ab-client-ca\") pod \"route-controller-manager-7c874fbf6-kz5dx\" (UID: \"d78c5ee4-f3b6-4203-a658-c64799bbf6ab\") " pod="openshift-route-controller-manager/route-controller-manager-7c874fbf6-kz5dx" Jan 28 20:42:46 crc kubenswrapper[4746]: I0128 20:42:46.232556 4746 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/d78c5ee4-f3b6-4203-a658-c64799bbf6ab-serving-cert\") pod \"route-controller-manager-7c874fbf6-kz5dx\" (UID: \"d78c5ee4-f3b6-4203-a658-c64799bbf6ab\") " pod="openshift-route-controller-manager/route-controller-manager-7c874fbf6-kz5dx" Jan 28 20:42:46 crc kubenswrapper[4746]: I0128 20:42:46.232608 4746 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-lxxbd\" (UniqueName: 
\"kubernetes.io/projected/03192e9a-18c9-4c69-8f44-5bf1c083b7fc-kube-api-access-lxxbd\") on node \"crc\" DevicePath \"\"" Jan 28 20:42:46 crc kubenswrapper[4746]: I0128 20:42:46.232637 4746 reconciler_common.go:293] "Volume detached for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/02bbbb13-0f81-478b-9284-a905b8326768-client-ca\") on node \"crc\" DevicePath \"\"" Jan 28 20:42:46 crc kubenswrapper[4746]: I0128 20:42:46.232648 4746 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/03192e9a-18c9-4c69-8f44-5bf1c083b7fc-config\") on node \"crc\" DevicePath \"\"" Jan 28 20:42:46 crc kubenswrapper[4746]: I0128 20:42:46.232659 4746 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/02bbbb13-0f81-478b-9284-a905b8326768-config\") on node \"crc\" DevicePath \"\"" Jan 28 20:42:46 crc kubenswrapper[4746]: I0128 20:42:46.232669 4746 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/02bbbb13-0f81-478b-9284-a905b8326768-serving-cert\") on node \"crc\" DevicePath \"\"" Jan 28 20:42:46 crc kubenswrapper[4746]: I0128 20:42:46.232679 4746 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-vmj6r\" (UniqueName: \"kubernetes.io/projected/02bbbb13-0f81-478b-9284-a905b8326768-kube-api-access-vmj6r\") on node \"crc\" DevicePath \"\"" Jan 28 20:42:46 crc kubenswrapper[4746]: I0128 20:42:46.232690 4746 reconciler_common.go:293] "Volume detached for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/03192e9a-18c9-4c69-8f44-5bf1c083b7fc-client-ca\") on node \"crc\" DevicePath \"\"" Jan 28 20:42:46 crc kubenswrapper[4746]: I0128 20:42:46.232699 4746 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/03192e9a-18c9-4c69-8f44-5bf1c083b7fc-serving-cert\") on node \"crc\" DevicePath \"\"" Jan 28 20:42:46 crc kubenswrapper[4746]: I0128 
20:42:46.232710 4746 reconciler_common.go:293] "Volume detached for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/03192e9a-18c9-4c69-8f44-5bf1c083b7fc-proxy-ca-bundles\") on node \"crc\" DevicePath \"\"" Jan 28 20:42:46 crc kubenswrapper[4746]: I0128 20:42:46.233719 4746 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/d78c5ee4-f3b6-4203-a658-c64799bbf6ab-client-ca\") pod \"route-controller-manager-7c874fbf6-kz5dx\" (UID: \"d78c5ee4-f3b6-4203-a658-c64799bbf6ab\") " pod="openshift-route-controller-manager/route-controller-manager-7c874fbf6-kz5dx" Jan 28 20:42:46 crc kubenswrapper[4746]: I0128 20:42:46.234062 4746 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/d78c5ee4-f3b6-4203-a658-c64799bbf6ab-config\") pod \"route-controller-manager-7c874fbf6-kz5dx\" (UID: \"d78c5ee4-f3b6-4203-a658-c64799bbf6ab\") " pod="openshift-route-controller-manager/route-controller-manager-7c874fbf6-kz5dx" Jan 28 20:42:46 crc kubenswrapper[4746]: I0128 20:42:46.239551 4746 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/d78c5ee4-f3b6-4203-a658-c64799bbf6ab-serving-cert\") pod \"route-controller-manager-7c874fbf6-kz5dx\" (UID: \"d78c5ee4-f3b6-4203-a658-c64799bbf6ab\") " pod="openshift-route-controller-manager/route-controller-manager-7c874fbf6-kz5dx" Jan 28 20:42:46 crc kubenswrapper[4746]: I0128 20:42:46.254376 4746 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-gjtmf\" (UniqueName: \"kubernetes.io/projected/d78c5ee4-f3b6-4203-a658-c64799bbf6ab-kube-api-access-gjtmf\") pod \"route-controller-manager-7c874fbf6-kz5dx\" (UID: \"d78c5ee4-f3b6-4203-a658-c64799bbf6ab\") " pod="openshift-route-controller-manager/route-controller-manager-7c874fbf6-kz5dx" Jan 28 20:42:46 crc kubenswrapper[4746]: I0128 20:42:46.368531 4746 
util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-7c874fbf6-kz5dx" Jan 28 20:42:46 crc kubenswrapper[4746]: I0128 20:42:46.547881 4746 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-7c874fbf6-kz5dx"] Jan 28 20:42:46 crc kubenswrapper[4746]: W0128 20:42:46.558752 4746 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podd78c5ee4_f3b6_4203_a658_c64799bbf6ab.slice/crio-12092dd133976d5b100a05868e21a0c2938ad0e8f5bb5802a3fb8afe3e28dbf0 WatchSource:0}: Error finding container 12092dd133976d5b100a05868e21a0c2938ad0e8f5bb5802a3fb8afe3e28dbf0: Status 404 returned error can't find the container with id 12092dd133976d5b100a05868e21a0c2938ad0e8f5bb5802a3fb8afe3e28dbf0 Jan 28 20:42:46 crc kubenswrapper[4746]: I0128 20:42:46.605599 4746 generic.go:334] "Generic (PLEG): container finished" podID="02bbbb13-0f81-478b-9284-a905b8326768" containerID="b18729b4a167d9d05e5ced0f1db35f394c546f826ab5a89343e608e2a4fc3474" exitCode=0 Jan 28 20:42:46 crc kubenswrapper[4746]: I0128 20:42:46.605686 4746 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-route-controller-manager/route-controller-manager-6ccff4b8f8-99rzl" event={"ID":"02bbbb13-0f81-478b-9284-a905b8326768","Type":"ContainerDied","Data":"b18729b4a167d9d05e5ced0f1db35f394c546f826ab5a89343e608e2a4fc3474"} Jan 28 20:42:46 crc kubenswrapper[4746]: I0128 20:42:46.605745 4746 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-6ccff4b8f8-99rzl" Jan 28 20:42:46 crc kubenswrapper[4746]: I0128 20:42:46.606467 4746 scope.go:117] "RemoveContainer" containerID="b18729b4a167d9d05e5ced0f1db35f394c546f826ab5a89343e608e2a4fc3474" Jan 28 20:42:46 crc kubenswrapper[4746]: I0128 20:42:46.613012 4746 generic.go:334] "Generic (PLEG): container finished" podID="04d5cb49-01ac-4bb2-aad0-2f3ed5ec3c04" containerID="e92c8e92331bf9dd167150bb33cf9fca6b295935ee0b99e304f3cfafef2b25f7" exitCode=0 Jan 28 20:42:46 crc kubenswrapper[4746]: I0128 20:42:46.606335 4746 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-route-controller-manager/route-controller-manager-6ccff4b8f8-99rzl" event={"ID":"02bbbb13-0f81-478b-9284-a905b8326768","Type":"ContainerDied","Data":"04bff03ac282ac7e498c3c9bd3db36205ed63ba07316ee4bafb66683568023ff"} Jan 28 20:42:46 crc kubenswrapper[4746]: I0128 20:42:46.613577 4746 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/revision-pruner-9-crc" event={"ID":"04d5cb49-01ac-4bb2-aad0-2f3ed5ec3c04","Type":"ContainerDied","Data":"e92c8e92331bf9dd167150bb33cf9fca6b295935ee0b99e304f3cfafef2b25f7"} Jan 28 20:42:46 crc kubenswrapper[4746]: I0128 20:42:46.617215 4746 generic.go:334] "Generic (PLEG): container finished" podID="03192e9a-18c9-4c69-8f44-5bf1c083b7fc" containerID="78c3550fbc07038b5c9ce6151fbccf75a21193ece1c0238dfd186ed9a6e8fa0b" exitCode=0 Jan 28 20:42:46 crc kubenswrapper[4746]: I0128 20:42:46.617262 4746 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager/controller-manager-666dd99597-bcgsc" event={"ID":"03192e9a-18c9-4c69-8f44-5bf1c083b7fc","Type":"ContainerDied","Data":"78c3550fbc07038b5c9ce6151fbccf75a21193ece1c0238dfd186ed9a6e8fa0b"} Jan 28 20:42:46 crc kubenswrapper[4746]: I0128 20:42:46.617271 4746 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-controller-manager/controller-manager-666dd99597-bcgsc" Jan 28 20:42:46 crc kubenswrapper[4746]: I0128 20:42:46.617291 4746 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager/controller-manager-666dd99597-bcgsc" event={"ID":"03192e9a-18c9-4c69-8f44-5bf1c083b7fc","Type":"ContainerDied","Data":"25ae3c2bc08dffadea1a5d0a6b0e221f42bf2b318f68ee14614016a63f30dd90"} Jan 28 20:42:46 crc kubenswrapper[4746]: I0128 20:42:46.619010 4746 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-route-controller-manager/route-controller-manager-7c874fbf6-kz5dx" event={"ID":"d78c5ee4-f3b6-4203-a658-c64799bbf6ab","Type":"ContainerStarted","Data":"12092dd133976d5b100a05868e21a0c2938ad0e8f5bb5802a3fb8afe3e28dbf0"} Jan 28 20:42:46 crc kubenswrapper[4746]: I0128 20:42:46.623658 4746 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-cl6g8" event={"ID":"8f41ddf2-d97e-4e32-ac3d-48850a4d47ee","Type":"ContainerStarted","Data":"aa273c3888084de81fbc3edee374e8d06772f532894719ca0847702e23dfc782"} Jan 28 20:42:46 crc kubenswrapper[4746]: I0128 20:42:46.637361 4746 scope.go:117] "RemoveContainer" containerID="b18729b4a167d9d05e5ced0f1db35f394c546f826ab5a89343e608e2a4fc3474" Jan 28 20:42:46 crc kubenswrapper[4746]: E0128 20:42:46.638337 4746 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"b18729b4a167d9d05e5ced0f1db35f394c546f826ab5a89343e608e2a4fc3474\": container with ID starting with b18729b4a167d9d05e5ced0f1db35f394c546f826ab5a89343e608e2a4fc3474 not found: ID does not exist" containerID="b18729b4a167d9d05e5ced0f1db35f394c546f826ab5a89343e608e2a4fc3474" Jan 28 20:42:46 crc kubenswrapper[4746]: I0128 20:42:46.638440 4746 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"b18729b4a167d9d05e5ced0f1db35f394c546f826ab5a89343e608e2a4fc3474"} err="failed to 
get container status \"b18729b4a167d9d05e5ced0f1db35f394c546f826ab5a89343e608e2a4fc3474\": rpc error: code = NotFound desc = could not find container \"b18729b4a167d9d05e5ced0f1db35f394c546f826ab5a89343e608e2a4fc3474\": container with ID starting with b18729b4a167d9d05e5ced0f1db35f394c546f826ab5a89343e608e2a4fc3474 not found: ID does not exist" Jan 28 20:42:46 crc kubenswrapper[4746]: I0128 20:42:46.638623 4746 scope.go:117] "RemoveContainer" containerID="78c3550fbc07038b5c9ce6151fbccf75a21193ece1c0238dfd186ed9a6e8fa0b" Jan 28 20:42:46 crc kubenswrapper[4746]: I0128 20:42:46.657577 4746 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-6ccff4b8f8-99rzl"] Jan 28 20:42:46 crc kubenswrapper[4746]: I0128 20:42:46.661439 4746 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-6ccff4b8f8-99rzl"] Jan 28 20:42:46 crc kubenswrapper[4746]: I0128 20:42:46.673501 4746 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-operators-cl6g8" podStartSLOduration=3.497297214 podStartE2EDuration="42.673474332s" podCreationTimestamp="2026-01-28 20:42:04 +0000 UTC" firstStartedPulling="2026-01-28 20:42:06.959507215 +0000 UTC m=+154.915693569" lastFinishedPulling="2026-01-28 20:42:46.135684333 +0000 UTC m=+194.091870687" observedRunningTime="2026-01-28 20:42:46.671963609 +0000 UTC m=+194.628149963" watchObservedRunningTime="2026-01-28 20:42:46.673474332 +0000 UTC m=+194.629660686" Jan 28 20:42:46 crc kubenswrapper[4746]: I0128 20:42:46.683318 4746 scope.go:117] "RemoveContainer" containerID="78c3550fbc07038b5c9ce6151fbccf75a21193ece1c0238dfd186ed9a6e8fa0b" Jan 28 20:42:46 crc kubenswrapper[4746]: E0128 20:42:46.685527 4746 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"78c3550fbc07038b5c9ce6151fbccf75a21193ece1c0238dfd186ed9a6e8fa0b\": 
container with ID starting with 78c3550fbc07038b5c9ce6151fbccf75a21193ece1c0238dfd186ed9a6e8fa0b not found: ID does not exist" containerID="78c3550fbc07038b5c9ce6151fbccf75a21193ece1c0238dfd186ed9a6e8fa0b" Jan 28 20:42:46 crc kubenswrapper[4746]: I0128 20:42:46.685566 4746 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"78c3550fbc07038b5c9ce6151fbccf75a21193ece1c0238dfd186ed9a6e8fa0b"} err="failed to get container status \"78c3550fbc07038b5c9ce6151fbccf75a21193ece1c0238dfd186ed9a6e8fa0b\": rpc error: code = NotFound desc = could not find container \"78c3550fbc07038b5c9ce6151fbccf75a21193ece1c0238dfd186ed9a6e8fa0b\": container with ID starting with 78c3550fbc07038b5c9ce6151fbccf75a21193ece1c0238dfd186ed9a6e8fa0b not found: ID does not exist" Jan 28 20:42:46 crc kubenswrapper[4746]: I0128 20:42:46.698400 4746 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-controller-manager/controller-manager-666dd99597-bcgsc"] Jan 28 20:42:46 crc kubenswrapper[4746]: I0128 20:42:46.701161 4746 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-controller-manager/controller-manager-666dd99597-bcgsc"] Jan 28 20:42:46 crc kubenswrapper[4746]: I0128 20:42:46.843813 4746 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="02bbbb13-0f81-478b-9284-a905b8326768" path="/var/lib/kubelet/pods/02bbbb13-0f81-478b-9284-a905b8326768/volumes" Jan 28 20:42:46 crc kubenswrapper[4746]: I0128 20:42:46.844430 4746 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="03192e9a-18c9-4c69-8f44-5bf1c083b7fc" path="/var/lib/kubelet/pods/03192e9a-18c9-4c69-8f44-5bf1c083b7fc/volumes" Jan 28 20:42:47 crc kubenswrapper[4746]: I0128 20:42:47.629719 4746 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-route-controller-manager/route-controller-manager-7c874fbf6-kz5dx" 
event={"ID":"d78c5ee4-f3b6-4203-a658-c64799bbf6ab","Type":"ContainerStarted","Data":"273e2a6cd0cae1f380a2631d7e51fbf2897d6f7f32e57f9e0fcc4a40d0fc3091"} Jan 28 20:42:47 crc kubenswrapper[4746]: I0128 20:42:47.630398 4746 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-route-controller-manager/route-controller-manager-7c874fbf6-kz5dx" Jan 28 20:42:47 crc kubenswrapper[4746]: I0128 20:42:47.637748 4746 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-route-controller-manager/route-controller-manager-7c874fbf6-kz5dx" Jan 28 20:42:47 crc kubenswrapper[4746]: I0128 20:42:47.652487 4746 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-route-controller-manager/route-controller-manager-7c874fbf6-kz5dx" podStartSLOduration=9.652466415 podStartE2EDuration="9.652466415s" podCreationTimestamp="2026-01-28 20:42:38 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-28 20:42:47.649969343 +0000 UTC m=+195.606155697" watchObservedRunningTime="2026-01-28 20:42:47.652466415 +0000 UTC m=+195.608652769" Jan 28 20:42:47 crc kubenswrapper[4746]: I0128 20:42:47.899523 4746 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-apiserver/revision-pruner-9-crc"
Jan 28 20:42:48 crc kubenswrapper[4746]: I0128 20:42:48.069453 4746 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/04d5cb49-01ac-4bb2-aad0-2f3ed5ec3c04-kubelet-dir\") pod \"04d5cb49-01ac-4bb2-aad0-2f3ed5ec3c04\" (UID: \"04d5cb49-01ac-4bb2-aad0-2f3ed5ec3c04\") "
Jan 28 20:42:48 crc kubenswrapper[4746]: I0128 20:42:48.069577 4746 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/04d5cb49-01ac-4bb2-aad0-2f3ed5ec3c04-kube-api-access\") pod \"04d5cb49-01ac-4bb2-aad0-2f3ed5ec3c04\" (UID: \"04d5cb49-01ac-4bb2-aad0-2f3ed5ec3c04\") "
Jan 28 20:42:48 crc kubenswrapper[4746]: I0128 20:42:48.070975 4746 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/04d5cb49-01ac-4bb2-aad0-2f3ed5ec3c04-kubelet-dir" (OuterVolumeSpecName: "kubelet-dir") pod "04d5cb49-01ac-4bb2-aad0-2f3ed5ec3c04" (UID: "04d5cb49-01ac-4bb2-aad0-2f3ed5ec3c04"). InnerVolumeSpecName "kubelet-dir". PluginName "kubernetes.io/host-path", VolumeGidValue ""
Jan 28 20:42:48 crc kubenswrapper[4746]: I0128 20:42:48.077442 4746 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/04d5cb49-01ac-4bb2-aad0-2f3ed5ec3c04-kube-api-access" (OuterVolumeSpecName: "kube-api-access") pod "04d5cb49-01ac-4bb2-aad0-2f3ed5ec3c04" (UID: "04d5cb49-01ac-4bb2-aad0-2f3ed5ec3c04"). InnerVolumeSpecName "kube-api-access". PluginName "kubernetes.io/projected", VolumeGidValue ""
Jan 28 20:42:48 crc kubenswrapper[4746]: I0128 20:42:48.171384 4746 reconciler_common.go:293] "Volume detached for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/04d5cb49-01ac-4bb2-aad0-2f3ed5ec3c04-kubelet-dir\") on node \"crc\" DevicePath \"\""
Jan 28 20:42:48 crc kubenswrapper[4746]: I0128 20:42:48.171470 4746 reconciler_common.go:293] "Volume detached for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/04d5cb49-01ac-4bb2-aad0-2f3ed5ec3c04-kube-api-access\") on node \"crc\" DevicePath \"\""
Jan 28 20:42:48 crc kubenswrapper[4746]: I0128 20:42:48.646152 4746 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-kube-apiserver/revision-pruner-9-crc"
Jan 28 20:42:48 crc kubenswrapper[4746]: I0128 20:42:48.646132 4746 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/revision-pruner-9-crc" event={"ID":"04d5cb49-01ac-4bb2-aad0-2f3ed5ec3c04","Type":"ContainerDied","Data":"d20610d2488afe1e332f8296d5198ee3e687624a58b715e6f84f7132e76a9e63"}
Jan 28 20:42:48 crc kubenswrapper[4746]: I0128 20:42:48.647156 4746 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="d20610d2488afe1e332f8296d5198ee3e687624a58b715e6f84f7132e76a9e63"
Jan 28 20:42:48 crc kubenswrapper[4746]: I0128 20:42:48.959250 4746 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-controller-manager/controller-manager-dc8f69579-d5l5z"]
Jan 28 20:42:48 crc kubenswrapper[4746]: E0128 20:42:48.959490 4746 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="03192e9a-18c9-4c69-8f44-5bf1c083b7fc" containerName="controller-manager"
Jan 28 20:42:48 crc kubenswrapper[4746]: I0128 20:42:48.959501 4746 state_mem.go:107] "Deleted CPUSet assignment" podUID="03192e9a-18c9-4c69-8f44-5bf1c083b7fc" containerName="controller-manager"
Jan 28 20:42:48 crc kubenswrapper[4746]: E0128 20:42:48.959515 4746 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="04d5cb49-01ac-4bb2-aad0-2f3ed5ec3c04" containerName="pruner"
Jan 28 20:42:48 crc kubenswrapper[4746]: I0128 20:42:48.959521 4746 state_mem.go:107] "Deleted CPUSet assignment" podUID="04d5cb49-01ac-4bb2-aad0-2f3ed5ec3c04" containerName="pruner"
Jan 28 20:42:48 crc kubenswrapper[4746]: I0128 20:42:48.959616 4746 memory_manager.go:354] "RemoveStaleState removing state" podUID="03192e9a-18c9-4c69-8f44-5bf1c083b7fc" containerName="controller-manager"
Jan 28 20:42:48 crc kubenswrapper[4746]: I0128 20:42:48.959628 4746 memory_manager.go:354] "RemoveStaleState removing state" podUID="04d5cb49-01ac-4bb2-aad0-2f3ed5ec3c04" containerName="pruner"
Jan 28 20:42:48 crc kubenswrapper[4746]: I0128 20:42:48.959979 4746 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-controller-manager/controller-manager-dc8f69579-d5l5z"
Jan 28 20:42:48 crc kubenswrapper[4746]: I0128 20:42:48.963502 4746 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"config"
Jan 28 20:42:48 crc kubenswrapper[4746]: I0128 20:42:48.963582 4746 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-controller-manager"/"serving-cert"
Jan 28 20:42:48 crc kubenswrapper[4746]: I0128 20:42:48.963816 4746 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-controller-manager"/"openshift-controller-manager-sa-dockercfg-msq4c"
Jan 28 20:42:48 crc kubenswrapper[4746]: I0128 20:42:48.965002 4746 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"kube-root-ca.crt"
Jan 28 20:42:48 crc kubenswrapper[4746]: I0128 20:42:48.965337 4746 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"client-ca"
Jan 28 20:42:48 crc kubenswrapper[4746]: I0128 20:42:48.965330 4746 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"openshift-service-ca.crt"
Jan 28 20:42:48 crc kubenswrapper[4746]: I0128 20:42:48.970284 4746 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-controller-manager/controller-manager-dc8f69579-d5l5z"]
Jan 28 20:42:48 crc kubenswrapper[4746]: I0128 20:42:48.972319 4746 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"openshift-global-ca"
Jan 28 20:42:49 crc kubenswrapper[4746]: I0128 20:42:49.086275 4746 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/c53d0862-8035-43b1-9b6d-374d70bec983-serving-cert\") pod \"controller-manager-dc8f69579-d5l5z\" (UID: \"c53d0862-8035-43b1-9b6d-374d70bec983\") " pod="openshift-controller-manager/controller-manager-dc8f69579-d5l5z"
Jan 28 20:42:49 crc kubenswrapper[4746]: I0128 20:42:49.086843 4746 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/c53d0862-8035-43b1-9b6d-374d70bec983-client-ca\") pod \"controller-manager-dc8f69579-d5l5z\" (UID: \"c53d0862-8035-43b1-9b6d-374d70bec983\") " pod="openshift-controller-manager/controller-manager-dc8f69579-d5l5z"
Jan 28 20:42:49 crc kubenswrapper[4746]: I0128 20:42:49.087143 4746 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-r4cv7\" (UniqueName: \"kubernetes.io/projected/c53d0862-8035-43b1-9b6d-374d70bec983-kube-api-access-r4cv7\") pod \"controller-manager-dc8f69579-d5l5z\" (UID: \"c53d0862-8035-43b1-9b6d-374d70bec983\") " pod="openshift-controller-manager/controller-manager-dc8f69579-d5l5z"
Jan 28 20:42:49 crc kubenswrapper[4746]: I0128 20:42:49.087278 4746 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/c53d0862-8035-43b1-9b6d-374d70bec983-config\") pod \"controller-manager-dc8f69579-d5l5z\" (UID: \"c53d0862-8035-43b1-9b6d-374d70bec983\") " pod="openshift-controller-manager/controller-manager-dc8f69579-d5l5z"
Jan 28 20:42:49 crc kubenswrapper[4746]: I0128 20:42:49.087320 4746 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/c53d0862-8035-43b1-9b6d-374d70bec983-proxy-ca-bundles\") pod \"controller-manager-dc8f69579-d5l5z\" (UID: \"c53d0862-8035-43b1-9b6d-374d70bec983\") " pod="openshift-controller-manager/controller-manager-dc8f69579-d5l5z"
Jan 28 20:42:49 crc kubenswrapper[4746]: I0128 20:42:49.137778 4746 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-kube-apiserver/installer-9-crc"]
Jan 28 20:42:49 crc kubenswrapper[4746]: I0128 20:42:49.138587 4746 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-apiserver/installer-9-crc"
Jan 28 20:42:49 crc kubenswrapper[4746]: I0128 20:42:49.141123 4746 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-apiserver"/"installer-sa-dockercfg-5pr6n"
Jan 28 20:42:49 crc kubenswrapper[4746]: I0128 20:42:49.141423 4746 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-apiserver"/"kube-root-ca.crt"
Jan 28 20:42:49 crc kubenswrapper[4746]: I0128 20:42:49.159523 4746 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-apiserver/installer-9-crc"]
Jan 28 20:42:49 crc kubenswrapper[4746]: I0128 20:42:49.188522 4746 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/c53d0862-8035-43b1-9b6d-374d70bec983-serving-cert\") pod \"controller-manager-dc8f69579-d5l5z\" (UID: \"c53d0862-8035-43b1-9b6d-374d70bec983\") " pod="openshift-controller-manager/controller-manager-dc8f69579-d5l5z"
Jan 28 20:42:49 crc kubenswrapper[4746]: I0128 20:42:49.188586 4746 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/c53d0862-8035-43b1-9b6d-374d70bec983-client-ca\") pod \"controller-manager-dc8f69579-d5l5z\" (UID: \"c53d0862-8035-43b1-9b6d-374d70bec983\") " pod="openshift-controller-manager/controller-manager-dc8f69579-d5l5z"
Jan 28 20:42:49 crc kubenswrapper[4746]: I0128 20:42:49.188652 4746 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-r4cv7\" (UniqueName: \"kubernetes.io/projected/c53d0862-8035-43b1-9b6d-374d70bec983-kube-api-access-r4cv7\") pod \"controller-manager-dc8f69579-d5l5z\" (UID: \"c53d0862-8035-43b1-9b6d-374d70bec983\") " pod="openshift-controller-manager/controller-manager-dc8f69579-d5l5z"
Jan 28 20:42:49 crc kubenswrapper[4746]: I0128 20:42:49.188702 4746 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/c53d0862-8035-43b1-9b6d-374d70bec983-config\") pod \"controller-manager-dc8f69579-d5l5z\" (UID: \"c53d0862-8035-43b1-9b6d-374d70bec983\") " pod="openshift-controller-manager/controller-manager-dc8f69579-d5l5z"
Jan 28 20:42:49 crc kubenswrapper[4746]: I0128 20:42:49.188738 4746 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/c53d0862-8035-43b1-9b6d-374d70bec983-proxy-ca-bundles\") pod \"controller-manager-dc8f69579-d5l5z\" (UID: \"c53d0862-8035-43b1-9b6d-374d70bec983\") " pod="openshift-controller-manager/controller-manager-dc8f69579-d5l5z"
Jan 28 20:42:49 crc kubenswrapper[4746]: I0128 20:42:49.190707 4746 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/c53d0862-8035-43b1-9b6d-374d70bec983-proxy-ca-bundles\") pod \"controller-manager-dc8f69579-d5l5z\" (UID: \"c53d0862-8035-43b1-9b6d-374d70bec983\") " pod="openshift-controller-manager/controller-manager-dc8f69579-d5l5z"
Jan 28 20:42:49 crc kubenswrapper[4746]: I0128 20:42:49.190892 4746 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/c53d0862-8035-43b1-9b6d-374d70bec983-config\") pod \"controller-manager-dc8f69579-d5l5z\" (UID: \"c53d0862-8035-43b1-9b6d-374d70bec983\") " pod="openshift-controller-manager/controller-manager-dc8f69579-d5l5z"
Jan 28 20:42:49 crc kubenswrapper[4746]: I0128 20:42:49.191343 4746 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/c53d0862-8035-43b1-9b6d-374d70bec983-client-ca\") pod \"controller-manager-dc8f69579-d5l5z\" (UID: \"c53d0862-8035-43b1-9b6d-374d70bec983\") " pod="openshift-controller-manager/controller-manager-dc8f69579-d5l5z"
Jan 28 20:42:49 crc kubenswrapper[4746]: I0128 20:42:49.197815 4746 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/c53d0862-8035-43b1-9b6d-374d70bec983-serving-cert\") pod \"controller-manager-dc8f69579-d5l5z\" (UID: \"c53d0862-8035-43b1-9b6d-374d70bec983\") " pod="openshift-controller-manager/controller-manager-dc8f69579-d5l5z"
Jan 28 20:42:49 crc kubenswrapper[4746]: I0128 20:42:49.208357 4746 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-r4cv7\" (UniqueName: \"kubernetes.io/projected/c53d0862-8035-43b1-9b6d-374d70bec983-kube-api-access-r4cv7\") pod \"controller-manager-dc8f69579-d5l5z\" (UID: \"c53d0862-8035-43b1-9b6d-374d70bec983\") " pod="openshift-controller-manager/controller-manager-dc8f69579-d5l5z"
Jan 28 20:42:49 crc kubenswrapper[4746]: I0128 20:42:49.282280 4746 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-controller-manager/controller-manager-dc8f69579-d5l5z"
Jan 28 20:42:49 crc kubenswrapper[4746]: I0128 20:42:49.289655 4746 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/b727dece-e4ea-4d38-899f-3c5c4c941a6c-kube-api-access\") pod \"installer-9-crc\" (UID: \"b727dece-e4ea-4d38-899f-3c5c4c941a6c\") " pod="openshift-kube-apiserver/installer-9-crc"
Jan 28 20:42:49 crc kubenswrapper[4746]: I0128 20:42:49.289718 4746 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/b727dece-e4ea-4d38-899f-3c5c4c941a6c-kubelet-dir\") pod \"installer-9-crc\" (UID: \"b727dece-e4ea-4d38-899f-3c5c4c941a6c\") " pod="openshift-kube-apiserver/installer-9-crc"
Jan 28 20:42:49 crc kubenswrapper[4746]: I0128 20:42:49.289748 4746 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-lock\" (UniqueName: \"kubernetes.io/host-path/b727dece-e4ea-4d38-899f-3c5c4c941a6c-var-lock\") pod \"installer-9-crc\" (UID: \"b727dece-e4ea-4d38-899f-3c5c4c941a6c\") " pod="openshift-kube-apiserver/installer-9-crc"
Jan 28 20:42:49 crc kubenswrapper[4746]: I0128 20:42:49.391204 4746 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/b727dece-e4ea-4d38-899f-3c5c4c941a6c-kube-api-access\") pod \"installer-9-crc\" (UID: \"b727dece-e4ea-4d38-899f-3c5c4c941a6c\") " pod="openshift-kube-apiserver/installer-9-crc"
Jan 28 20:42:49 crc kubenswrapper[4746]: I0128 20:42:49.391266 4746 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/b727dece-e4ea-4d38-899f-3c5c4c941a6c-kubelet-dir\") pod \"installer-9-crc\" (UID: \"b727dece-e4ea-4d38-899f-3c5c4c941a6c\") " pod="openshift-kube-apiserver/installer-9-crc"
Jan 28 20:42:49 crc kubenswrapper[4746]: I0128 20:42:49.391302 4746 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-lock\" (UniqueName: \"kubernetes.io/host-path/b727dece-e4ea-4d38-899f-3c5c4c941a6c-var-lock\") pod \"installer-9-crc\" (UID: \"b727dece-e4ea-4d38-899f-3c5c4c941a6c\") " pod="openshift-kube-apiserver/installer-9-crc"
Jan 28 20:42:49 crc kubenswrapper[4746]: I0128 20:42:49.391391 4746 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-lock\" (UniqueName: \"kubernetes.io/host-path/b727dece-e4ea-4d38-899f-3c5c4c941a6c-var-lock\") pod \"installer-9-crc\" (UID: \"b727dece-e4ea-4d38-899f-3c5c4c941a6c\") " pod="openshift-kube-apiserver/installer-9-crc"
Jan 28 20:42:49 crc kubenswrapper[4746]: I0128 20:42:49.391822 4746 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/b727dece-e4ea-4d38-899f-3c5c4c941a6c-kubelet-dir\") pod \"installer-9-crc\" (UID: \"b727dece-e4ea-4d38-899f-3c5c4c941a6c\") " pod="openshift-kube-apiserver/installer-9-crc"
Jan 28 20:42:49 crc kubenswrapper[4746]: I0128 20:42:49.415674 4746 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/b727dece-e4ea-4d38-899f-3c5c4c941a6c-kube-api-access\") pod \"installer-9-crc\" (UID: \"b727dece-e4ea-4d38-899f-3c5c4c941a6c\") " pod="openshift-kube-apiserver/installer-9-crc"
Jan 28 20:42:49 crc kubenswrapper[4746]: I0128 20:42:49.455487 4746 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-apiserver/installer-9-crc"
Jan 28 20:42:49 crc kubenswrapper[4746]: I0128 20:42:49.574222 4746 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-controller-manager/controller-manager-dc8f69579-d5l5z"]
Jan 28 20:42:49 crc kubenswrapper[4746]: W0128 20:42:49.583927 4746 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podc53d0862_8035_43b1_9b6d_374d70bec983.slice/crio-503feb6548cedc39815cef9ce48468edaf90a7b018da506af8f948803d6a6f8e WatchSource:0}: Error finding container 503feb6548cedc39815cef9ce48468edaf90a7b018da506af8f948803d6a6f8e: Status 404 returned error can't find the container with id 503feb6548cedc39815cef9ce48468edaf90a7b018da506af8f948803d6a6f8e
Jan 28 20:42:49 crc kubenswrapper[4746]: I0128 20:42:49.673895 4746 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager/controller-manager-dc8f69579-d5l5z" event={"ID":"c53d0862-8035-43b1-9b6d-374d70bec983","Type":"ContainerStarted","Data":"503feb6548cedc39815cef9ce48468edaf90a7b018da506af8f948803d6a6f8e"}
Jan 28 20:42:49 crc kubenswrapper[4746]: I0128 20:42:49.693132 4746 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-apiserver/installer-9-crc"]
Jan 28 20:42:49 crc kubenswrapper[4746]: W0128 20:42:49.704977 4746 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-podb727dece_e4ea_4d38_899f_3c5c4c941a6c.slice/crio-fa6f7bcd1f424d0b6b5677fbeee1a4ed355f2b279b3ddf100d2ccee5dd87d729 WatchSource:0}: Error finding container fa6f7bcd1f424d0b6b5677fbeee1a4ed355f2b279b3ddf100d2ccee5dd87d729: Status 404 returned error can't find the container with id fa6f7bcd1f424d0b6b5677fbeee1a4ed355f2b279b3ddf100d2ccee5dd87d729
Jan 28 20:42:50 crc kubenswrapper[4746]: I0128 20:42:50.678119 4746 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/installer-9-crc" event={"ID":"b727dece-e4ea-4d38-899f-3c5c4c941a6c","Type":"ContainerStarted","Data":"d2bd3327e728b7371e4489ba1aad2337b15cb382bc8a32c2b74f59da95955c6b"}
Jan 28 20:42:50 crc kubenswrapper[4746]: I0128 20:42:50.678452 4746 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/installer-9-crc" event={"ID":"b727dece-e4ea-4d38-899f-3c5c4c941a6c","Type":"ContainerStarted","Data":"fa6f7bcd1f424d0b6b5677fbeee1a4ed355f2b279b3ddf100d2ccee5dd87d729"}
Jan 28 20:42:50 crc kubenswrapper[4746]: I0128 20:42:50.679923 4746 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager/controller-manager-dc8f69579-d5l5z" event={"ID":"c53d0862-8035-43b1-9b6d-374d70bec983","Type":"ContainerStarted","Data":"7a15e2b0bfca4caf4d1369cede56c806a1e4907df1b641b3f25aae44c7a2b8ae"}
Jan 28 20:42:50 crc kubenswrapper[4746]: I0128 20:42:50.680818 4746 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-controller-manager/controller-manager-dc8f69579-d5l5z"
Jan 28 20:42:50 crc kubenswrapper[4746]: I0128 20:42:50.705964 4746 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-kube-apiserver/installer-9-crc" podStartSLOduration=1.705940706 podStartE2EDuration="1.705940706s" podCreationTimestamp="2026-01-28 20:42:49 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-28 20:42:50.703536708 +0000 UTC m=+198.659723062" watchObservedRunningTime="2026-01-28 20:42:50.705940706 +0000 UTC m=+198.662127060"
Jan 28 20:42:50 crc kubenswrapper[4746]: I0128 20:42:50.709489 4746 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-controller-manager/controller-manager-dc8f69579-d5l5z"
Jan 28 20:42:50 crc kubenswrapper[4746]: I0128 20:42:50.744137 4746 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-controller-manager/controller-manager-dc8f69579-d5l5z" podStartSLOduration=12.744108987 podStartE2EDuration="12.744108987s" podCreationTimestamp="2026-01-28 20:42:38 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-28 20:42:50.742184391 +0000 UTC m=+198.698370766" watchObservedRunningTime="2026-01-28 20:42:50.744108987 +0000 UTC m=+198.700295341"
Jan 28 20:42:52 crc kubenswrapper[4746]: I0128 20:42:52.898752 4746 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-authentication/oauth-openshift-558db77b4-4g4p7"]
Jan 28 20:42:53 crc kubenswrapper[4746]: I0128 20:42:53.704131 4746 generic.go:334] "Generic (PLEG): container finished" podID="6c585264-9fea-4d40-910d-68a31c553f76" containerID="2705ebede777955e1f157a73a21abb2133c3b3f0ad608862ab9a313f6568a8a8" exitCode=0
Jan 28 20:42:53 crc kubenswrapper[4746]: I0128 20:42:53.704194 4746 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-85ttn" event={"ID":"6c585264-9fea-4d40-910d-68a31c553f76","Type":"ContainerDied","Data":"2705ebede777955e1f157a73a21abb2133c3b3f0ad608862ab9a313f6568a8a8"}
Jan 28 20:42:54 crc kubenswrapper[4746]: I0128 20:42:54.716321 4746 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-85ttn" event={"ID":"6c585264-9fea-4d40-910d-68a31c553f76","Type":"ContainerStarted","Data":"3209e2910865c45825fc585990fa2b0e07aa9b8fcd03af6eb5bddeea419f0bcf"}
Jan 28 20:42:54 crc kubenswrapper[4746]: I0128 20:42:54.740265 4746 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/certified-operators-85ttn" podStartSLOduration=5.147729518 podStartE2EDuration="54.740242114s" podCreationTimestamp="2026-01-28 20:42:00 +0000 UTC" firstStartedPulling="2026-01-28 20:42:04.611878892 +0000 UTC m=+152.568065246" lastFinishedPulling="2026-01-28 20:42:54.204391488 +0000 UTC m=+202.160577842" observedRunningTime="2026-01-28 20:42:54.739371488 +0000 UTC m=+202.695557852" watchObservedRunningTime="2026-01-28 20:42:54.740242114 +0000 UTC m=+202.696428468"
Jan 28 20:42:54 crc kubenswrapper[4746]: I0128 20:42:54.756380 4746 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-operators-cl6g8"
Jan 28 20:42:54 crc kubenswrapper[4746]: I0128 20:42:54.757179 4746 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-operators-cl6g8"
Jan 28 20:42:54 crc kubenswrapper[4746]: I0128 20:42:54.911709 4746 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-operators-cl6g8"
Jan 28 20:42:55 crc kubenswrapper[4746]: I0128 20:42:55.772434 4746 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-operators-cl6g8"
Jan 28 20:42:55 crc kubenswrapper[4746]: I0128 20:42:55.952252 4746 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-cl6g8"]
Jan 28 20:42:57 crc kubenswrapper[4746]: I0128 20:42:57.734711 4746 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-hxz5g" event={"ID":"f8f220cb-71e1-4b97-960e-ef8742661130","Type":"ContainerStarted","Data":"e66e84ecc9aef3f38aa7901fcda5dbebd35192cf04ae069e2d37723f5cf3bd30"}
Jan 28 20:42:57 crc kubenswrapper[4746]: I0128 20:42:57.738028 4746 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-c9v8x" event={"ID":"abe404c6-f1c8-4ad6-92b9-7c082b112b50","Type":"ContainerStarted","Data":"4b30acf98a4f9337a249e775286542e3e54002c749e6801ae5b8e7864b1188dc"}
Jan 28 20:42:57 crc kubenswrapper[4746]: I0128 20:42:57.740654 4746 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-operators-cl6g8" podUID="8f41ddf2-d97e-4e32-ac3d-48850a4d47ee" containerName="registry-server" containerID="cri-o://aa273c3888084de81fbc3edee374e8d06772f532894719ca0847702e23dfc782" gracePeriod=2
Jan 28 20:42:57 crc kubenswrapper[4746]: I0128 20:42:57.740824 4746 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-4t66g" event={"ID":"18cbfd39-cf22-428c-ab2a-708082df0357","Type":"ContainerStarted","Data":"3d2f7c7bdc3aee38214514d5420ddf8c61bd01380a6c2d114f424e19461defae"}
Jan 28 20:42:58 crc kubenswrapper[4746]: I0128 20:42:58.197875 4746 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-cl6g8"
Jan 28 20:42:58 crc kubenswrapper[4746]: I0128 20:42:58.355914 4746 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/8f41ddf2-d97e-4e32-ac3d-48850a4d47ee-utilities\") pod \"8f41ddf2-d97e-4e32-ac3d-48850a4d47ee\" (UID: \"8f41ddf2-d97e-4e32-ac3d-48850a4d47ee\") "
Jan 28 20:42:58 crc kubenswrapper[4746]: I0128 20:42:58.356051 4746 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-lm4t6\" (UniqueName: \"kubernetes.io/projected/8f41ddf2-d97e-4e32-ac3d-48850a4d47ee-kube-api-access-lm4t6\") pod \"8f41ddf2-d97e-4e32-ac3d-48850a4d47ee\" (UID: \"8f41ddf2-d97e-4e32-ac3d-48850a4d47ee\") "
Jan 28 20:42:58 crc kubenswrapper[4746]: I0128 20:42:58.356122 4746 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/8f41ddf2-d97e-4e32-ac3d-48850a4d47ee-catalog-content\") pod \"8f41ddf2-d97e-4e32-ac3d-48850a4d47ee\" (UID: \"8f41ddf2-d97e-4e32-ac3d-48850a4d47ee\") "
Jan 28 20:42:58 crc kubenswrapper[4746]: I0128 20:42:58.358159 4746 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/8f41ddf2-d97e-4e32-ac3d-48850a4d47ee-utilities" (OuterVolumeSpecName: "utilities") pod "8f41ddf2-d97e-4e32-ac3d-48850a4d47ee" (UID: "8f41ddf2-d97e-4e32-ac3d-48850a4d47ee"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Jan 28 20:42:58 crc kubenswrapper[4746]: I0128 20:42:58.364409 4746 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/8f41ddf2-d97e-4e32-ac3d-48850a4d47ee-kube-api-access-lm4t6" (OuterVolumeSpecName: "kube-api-access-lm4t6") pod "8f41ddf2-d97e-4e32-ac3d-48850a4d47ee" (UID: "8f41ddf2-d97e-4e32-ac3d-48850a4d47ee"). InnerVolumeSpecName "kube-api-access-lm4t6". PluginName "kubernetes.io/projected", VolumeGidValue ""
Jan 28 20:42:58 crc kubenswrapper[4746]: I0128 20:42:58.457748 4746 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-lm4t6\" (UniqueName: \"kubernetes.io/projected/8f41ddf2-d97e-4e32-ac3d-48850a4d47ee-kube-api-access-lm4t6\") on node \"crc\" DevicePath \"\""
Jan 28 20:42:58 crc kubenswrapper[4746]: I0128 20:42:58.457783 4746 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/8f41ddf2-d97e-4e32-ac3d-48850a4d47ee-utilities\") on node \"crc\" DevicePath \"\""
Jan 28 20:42:58 crc kubenswrapper[4746]: I0128 20:42:58.501198 4746 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/8f41ddf2-d97e-4e32-ac3d-48850a4d47ee-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "8f41ddf2-d97e-4e32-ac3d-48850a4d47ee" (UID: "8f41ddf2-d97e-4e32-ac3d-48850a4d47ee"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Jan 28 20:42:58 crc kubenswrapper[4746]: I0128 20:42:58.559558 4746 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/8f41ddf2-d97e-4e32-ac3d-48850a4d47ee-catalog-content\") on node \"crc\" DevicePath \"\""
Jan 28 20:42:58 crc kubenswrapper[4746]: I0128 20:42:58.747914 4746 generic.go:334] "Generic (PLEG): container finished" podID="abe404c6-f1c8-4ad6-92b9-7c082b112b50" containerID="4b30acf98a4f9337a249e775286542e3e54002c749e6801ae5b8e7864b1188dc" exitCode=0
Jan 28 20:42:58 crc kubenswrapper[4746]: I0128 20:42:58.748099 4746 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-c9v8x" event={"ID":"abe404c6-f1c8-4ad6-92b9-7c082b112b50","Type":"ContainerDied","Data":"4b30acf98a4f9337a249e775286542e3e54002c749e6801ae5b8e7864b1188dc"}
Jan 28 20:42:58 crc kubenswrapper[4746]: I0128 20:42:58.750562 4746 generic.go:334] "Generic (PLEG): container finished" podID="18cbfd39-cf22-428c-ab2a-708082df0357" containerID="3d2f7c7bdc3aee38214514d5420ddf8c61bd01380a6c2d114f424e19461defae" exitCode=0
Jan 28 20:42:58 crc kubenswrapper[4746]: I0128 20:42:58.750641 4746 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-4t66g" event={"ID":"18cbfd39-cf22-428c-ab2a-708082df0357","Type":"ContainerDied","Data":"3d2f7c7bdc3aee38214514d5420ddf8c61bd01380a6c2d114f424e19461defae"}
Jan 28 20:42:58 crc kubenswrapper[4746]: I0128 20:42:58.753948 4746 generic.go:334] "Generic (PLEG): container finished" podID="8f41ddf2-d97e-4e32-ac3d-48850a4d47ee" containerID="aa273c3888084de81fbc3edee374e8d06772f532894719ca0847702e23dfc782" exitCode=0
Jan 28 20:42:58 crc kubenswrapper[4746]: I0128 20:42:58.754023 4746 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-cl6g8" event={"ID":"8f41ddf2-d97e-4e32-ac3d-48850a4d47ee","Type":"ContainerDied","Data":"aa273c3888084de81fbc3edee374e8d06772f532894719ca0847702e23dfc782"}
Jan 28 20:42:58 crc kubenswrapper[4746]: I0128 20:42:58.754060 4746 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-cl6g8"
Jan 28 20:42:58 crc kubenswrapper[4746]: I0128 20:42:58.754091 4746 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-cl6g8" event={"ID":"8f41ddf2-d97e-4e32-ac3d-48850a4d47ee","Type":"ContainerDied","Data":"130f1ae53c7bccf0e87fcafd6727a8da8f02b9bb27bcdb9627449d3dd4380dde"}
Jan 28 20:42:58 crc kubenswrapper[4746]: I0128 20:42:58.754104 4746 scope.go:117] "RemoveContainer" containerID="aa273c3888084de81fbc3edee374e8d06772f532894719ca0847702e23dfc782"
Jan 28 20:42:58 crc kubenswrapper[4746]: I0128 20:42:58.759054 4746 generic.go:334] "Generic (PLEG): container finished" podID="1f42df00-e947-4928-a51f-ddaa3658cc67" containerID="d64d7a1f62f2db500c6c8da501014bd83a035579a9931041f5819b3ce1f832e2" exitCode=0
Jan 28 20:42:58 crc kubenswrapper[4746]: I0128 20:42:58.759110 4746 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-f2t9c" event={"ID":"1f42df00-e947-4928-a51f-ddaa3658cc67","Type":"ContainerDied","Data":"d64d7a1f62f2db500c6c8da501014bd83a035579a9931041f5819b3ce1f832e2"}
Jan 28 20:42:58 crc kubenswrapper[4746]: I0128 20:42:58.762052 4746 generic.go:334] "Generic (PLEG): container finished" podID="f8f220cb-71e1-4b97-960e-ef8742661130" containerID="e66e84ecc9aef3f38aa7901fcda5dbebd35192cf04ae069e2d37723f5cf3bd30" exitCode=0
Jan 28 20:42:58 crc kubenswrapper[4746]: I0128 20:42:58.762100 4746 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-hxz5g" event={"ID":"f8f220cb-71e1-4b97-960e-ef8742661130","Type":"ContainerDied","Data":"e66e84ecc9aef3f38aa7901fcda5dbebd35192cf04ae069e2d37723f5cf3bd30"}
Jan 28 20:42:58 crc kubenswrapper[4746]: I0128 20:42:58.789715 4746 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-cl6g8"]
Jan 28 20:42:58 crc kubenswrapper[4746]: I0128 20:42:58.791727 4746 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-operators-cl6g8"]
Jan 28 20:42:58 crc kubenswrapper[4746]: I0128 20:42:58.800522 4746 scope.go:117] "RemoveContainer" containerID="b791457d961abcfb593dd83a572da88bcc9eab6f3644ef823352350e466ab91e"
Jan 28 20:42:58 crc kubenswrapper[4746]: I0128 20:42:58.835790 4746 scope.go:117] "RemoveContainer" containerID="6345f42eeb2545e0d80d647e35269349daf234ba1a1afcf516046fb572b40f7b"
Jan 28 20:42:58 crc kubenswrapper[4746]: I0128 20:42:58.850608 4746 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="8f41ddf2-d97e-4e32-ac3d-48850a4d47ee" path="/var/lib/kubelet/pods/8f41ddf2-d97e-4e32-ac3d-48850a4d47ee/volumes"
Jan 28 20:42:58 crc kubenswrapper[4746]: I0128 20:42:58.864987 4746 scope.go:117] "RemoveContainer" containerID="aa273c3888084de81fbc3edee374e8d06772f532894719ca0847702e23dfc782"
Jan 28 20:42:58 crc kubenswrapper[4746]: E0128 20:42:58.866215 4746 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"aa273c3888084de81fbc3edee374e8d06772f532894719ca0847702e23dfc782\": container with ID starting with aa273c3888084de81fbc3edee374e8d06772f532894719ca0847702e23dfc782 not found: ID does not exist" containerID="aa273c3888084de81fbc3edee374e8d06772f532894719ca0847702e23dfc782"
Jan 28 20:42:58 crc kubenswrapper[4746]: I0128 20:42:58.866296 4746 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"aa273c3888084de81fbc3edee374e8d06772f532894719ca0847702e23dfc782"} err="failed to get container status \"aa273c3888084de81fbc3edee374e8d06772f532894719ca0847702e23dfc782\": rpc error: code = NotFound desc = could not find container \"aa273c3888084de81fbc3edee374e8d06772f532894719ca0847702e23dfc782\": container with ID starting with aa273c3888084de81fbc3edee374e8d06772f532894719ca0847702e23dfc782 not found: ID does not exist"
Jan 28 20:42:58 crc kubenswrapper[4746]: I0128 20:42:58.866339 4746 scope.go:117] "RemoveContainer" containerID="b791457d961abcfb593dd83a572da88bcc9eab6f3644ef823352350e466ab91e"
Jan 28 20:42:58 crc kubenswrapper[4746]: E0128 20:42:58.866936 4746 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"b791457d961abcfb593dd83a572da88bcc9eab6f3644ef823352350e466ab91e\": container with ID starting with b791457d961abcfb593dd83a572da88bcc9eab6f3644ef823352350e466ab91e not found: ID does not exist" containerID="b791457d961abcfb593dd83a572da88bcc9eab6f3644ef823352350e466ab91e"
Jan 28 20:42:58 crc kubenswrapper[4746]: I0128 20:42:58.867180 4746 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"b791457d961abcfb593dd83a572da88bcc9eab6f3644ef823352350e466ab91e"} err="failed to get container status \"b791457d961abcfb593dd83a572da88bcc9eab6f3644ef823352350e466ab91e\": rpc error: code = NotFound desc = could not find container \"b791457d961abcfb593dd83a572da88bcc9eab6f3644ef823352350e466ab91e\": container with ID starting with b791457d961abcfb593dd83a572da88bcc9eab6f3644ef823352350e466ab91e not found: ID does not exist"
Jan 28 20:42:58 crc kubenswrapper[4746]: I0128 20:42:58.867252 4746 scope.go:117] "RemoveContainer" containerID="6345f42eeb2545e0d80d647e35269349daf234ba1a1afcf516046fb572b40f7b"
Jan 28 20:42:58 crc kubenswrapper[4746]: E0128 20:42:58.869342 4746 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"6345f42eeb2545e0d80d647e35269349daf234ba1a1afcf516046fb572b40f7b\": container with ID starting with 6345f42eeb2545e0d80d647e35269349daf234ba1a1afcf516046fb572b40f7b not found: ID does not exist" containerID="6345f42eeb2545e0d80d647e35269349daf234ba1a1afcf516046fb572b40f7b"
Jan 28 20:42:58 crc kubenswrapper[4746]: I0128 20:42:58.869392 4746 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"6345f42eeb2545e0d80d647e35269349daf234ba1a1afcf516046fb572b40f7b"} err="failed to get container status \"6345f42eeb2545e0d80d647e35269349daf234ba1a1afcf516046fb572b40f7b\": rpc error: code = NotFound desc = could not find container \"6345f42eeb2545e0d80d647e35269349daf234ba1a1afcf516046fb572b40f7b\": container with ID starting with 6345f42eeb2545e0d80d647e35269349daf234ba1a1afcf516046fb572b40f7b not found: ID does not exist"
Jan 28 20:42:59 crc kubenswrapper[4746]: I0128 20:42:59.771016 4746 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-f2t9c" event={"ID":"1f42df00-e947-4928-a51f-ddaa3658cc67","Type":"ContainerStarted","Data":"e53569df4787d5c5def247f3bc6929834be74259f6b3599f9fa4a4189d6e2620"}
Jan 28 20:42:59 crc kubenswrapper[4746]: I0128 20:42:59.774897 4746 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-hxz5g" event={"ID":"f8f220cb-71e1-4b97-960e-ef8742661130","Type":"ContainerStarted","Data":"79f78a3339aa87612d90f4b9442c37cd103a06f265033bde9faaddbd4951ba43"}
Jan 28 20:42:59 crc kubenswrapper[4746]: I0128 20:42:59.778249 4746 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-c9v8x" event={"ID":"abe404c6-f1c8-4ad6-92b9-7c082b112b50","Type":"ContainerStarted","Data":"93c872d144be30351aeaa3702258771b2f9ef8c46a0fb016fdd102eba5675926"}
Jan 28 20:42:59 crc kubenswrapper[4746]: I0128 20:42:59.781256 4746 generic.go:334] "Generic (PLEG): container finished" podID="8b55bfa9-466f-44d9-8bc7-753bff9b7a7b" containerID="578f2b2763780b4281717d9382666e95eb03ed3741ab3a72e97691170c3313bd" exitCode=0
Jan 28 20:42:59 crc kubenswrapper[4746]: I0128 20:42:59.781327 4746 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-mlqqs" event={"ID":"8b55bfa9-466f-44d9-8bc7-753bff9b7a7b","Type":"ContainerDied","Data":"578f2b2763780b4281717d9382666e95eb03ed3741ab3a72e97691170c3313bd"}
Jan 28 20:42:59 crc kubenswrapper[4746]: I0128 20:42:59.783679 4746 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-4t66g" event={"ID":"18cbfd39-cf22-428c-ab2a-708082df0357","Type":"ContainerStarted","Data":"143e6acaec90736206f109f4678283fa5aa04d423876a786f92adf3ec3991ecf"}
Jan 28 20:42:59 crc kubenswrapper[4746]: I0128 20:42:59.803235 4746 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-marketplace-f2t9c" podStartSLOduration=3.5135538889999998 podStartE2EDuration="56.803218721s" podCreationTimestamp="2026-01-28 20:42:03 +0000 UTC" firstStartedPulling="2026-01-28 20:42:05.92259811 +0000 UTC m=+153.878784464" lastFinishedPulling="2026-01-28 20:42:59.212262942 +0000 UTC m=+207.168449296" observedRunningTime="2026-01-28 20:42:59.801613155 +0000 UTC m=+207.757799519" watchObservedRunningTime="2026-01-28 20:42:59.803218721 +0000 UTC m=+207.759405065"
Jan 28 20:42:59 crc kubenswrapper[4746]: I0128 20:42:59.844969 4746 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/community-operators-hxz5g" podStartSLOduration=5.515388871 podStartE2EDuration="58.844952423s" podCreationTimestamp="2026-01-28 20:42:01 +0000 UTC" firstStartedPulling="2026-01-28 20:42:05.92611916 +0000 UTC m=+153.882305514" lastFinishedPulling="2026-01-28 20:42:59.255682712 +0000 UTC m=+207.211869066" observedRunningTime="2026-01-28 20:42:59.844402367 +0000 UTC m=+207.800588721" watchObservedRunningTime="2026-01-28 20:42:59.844952423 +0000 UTC m=+207.801138777"
Jan 28 20:42:59 crc kubenswrapper[4746]: I0128 20:42:59.886597 4746 pod_startup_latency_tracker.go:104] "Observed pod startup duration"
pod="openshift-marketplace/redhat-marketplace-4t66g" podStartSLOduration=4.458166545 podStartE2EDuration="57.886578971s" podCreationTimestamp="2026-01-28 20:42:02 +0000 UTC" firstStartedPulling="2026-01-28 20:42:05.922947309 +0000 UTC m=+153.879133663" lastFinishedPulling="2026-01-28 20:42:59.351359735 +0000 UTC m=+207.307546089" observedRunningTime="2026-01-28 20:42:59.884497982 +0000 UTC m=+207.840684346" watchObservedRunningTime="2026-01-28 20:42:59.886578971 +0000 UTC m=+207.842765325" Jan 28 20:42:59 crc kubenswrapper[4746]: I0128 20:42:59.911715 4746 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/certified-operators-c9v8x" podStartSLOduration=4.324595739 podStartE2EDuration="58.911699459s" podCreationTimestamp="2026-01-28 20:42:01 +0000 UTC" firstStartedPulling="2026-01-28 20:42:04.748191146 +0000 UTC m=+152.704377500" lastFinishedPulling="2026-01-28 20:42:59.335294866 +0000 UTC m=+207.291481220" observedRunningTime="2026-01-28 20:42:59.909957759 +0000 UTC m=+207.866144123" watchObservedRunningTime="2026-01-28 20:42:59.911699459 +0000 UTC m=+207.867885813" Jan 28 20:43:00 crc kubenswrapper[4746]: I0128 20:43:00.795509 4746 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-ghx5p" event={"ID":"fa890224-0942-4671-a9d8-97b6f465b0df","Type":"ContainerStarted","Data":"21fc934156582dff48d7bbd6ca5af885c135910d225d3ce4e1850dd45c5df869"} Jan 28 20:43:01 crc kubenswrapper[4746]: I0128 20:43:01.152580 4746 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/certified-operators-85ttn" Jan 28 20:43:01 crc kubenswrapper[4746]: I0128 20:43:01.152640 4746 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/certified-operators-85ttn" Jan 28 20:43:01 crc kubenswrapper[4746]: I0128 20:43:01.201114 4746 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" 
pod="openshift-marketplace/certified-operators-85ttn" Jan 28 20:43:01 crc kubenswrapper[4746]: I0128 20:43:01.537399 4746 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/certified-operators-c9v8x" Jan 28 20:43:01 crc kubenswrapper[4746]: I0128 20:43:01.537458 4746 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/certified-operators-c9v8x" Jan 28 20:43:01 crc kubenswrapper[4746]: I0128 20:43:01.612832 4746 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/certified-operators-c9v8x" Jan 28 20:43:01 crc kubenswrapper[4746]: I0128 20:43:01.723192 4746 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/community-operators-hxz5g" Jan 28 20:43:01 crc kubenswrapper[4746]: I0128 20:43:01.723274 4746 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/community-operators-hxz5g" Jan 28 20:43:01 crc kubenswrapper[4746]: I0128 20:43:01.805110 4746 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-mlqqs" event={"ID":"8b55bfa9-466f-44d9-8bc7-753bff9b7a7b","Type":"ContainerStarted","Data":"1b916d1fcc57694a8d2d4f44b18cff98f131a258a4a52f7839523c53e042b66e"} Jan 28 20:43:01 crc kubenswrapper[4746]: I0128 20:43:01.807526 4746 generic.go:334] "Generic (PLEG): container finished" podID="fa890224-0942-4671-a9d8-97b6f465b0df" containerID="21fc934156582dff48d7bbd6ca5af885c135910d225d3ce4e1850dd45c5df869" exitCode=0 Jan 28 20:43:01 crc kubenswrapper[4746]: I0128 20:43:01.808158 4746 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-ghx5p" event={"ID":"fa890224-0942-4671-a9d8-97b6f465b0df","Type":"ContainerDied","Data":"21fc934156582dff48d7bbd6ca5af885c135910d225d3ce4e1850dd45c5df869"} Jan 28 20:43:01 crc kubenswrapper[4746]: I0128 20:43:01.856343 4746 
pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-operators-mlqqs" podStartSLOduration=4.344717958 podStartE2EDuration="58.856317301s" podCreationTimestamp="2026-01-28 20:42:03 +0000 UTC" firstStartedPulling="2026-01-28 20:42:06.945276009 +0000 UTC m=+154.901462353" lastFinishedPulling="2026-01-28 20:43:01.456875342 +0000 UTC m=+209.413061696" observedRunningTime="2026-01-28 20:43:01.825707786 +0000 UTC m=+209.781894140" watchObservedRunningTime="2026-01-28 20:43:01.856317301 +0000 UTC m=+209.812503655" Jan 28 20:43:01 crc kubenswrapper[4746]: I0128 20:43:01.861776 4746 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/certified-operators-85ttn" Jan 28 20:43:02 crc kubenswrapper[4746]: I0128 20:43:02.771368 4746 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-marketplace/community-operators-hxz5g" podUID="f8f220cb-71e1-4b97-960e-ef8742661130" containerName="registry-server" probeResult="failure" output=< Jan 28 20:43:02 crc kubenswrapper[4746]: timeout: failed to connect service ":50051" within 1s Jan 28 20:43:02 crc kubenswrapper[4746]: > Jan 28 20:43:03 crc kubenswrapper[4746]: I0128 20:43:03.075552 4746 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-marketplace-4t66g" Jan 28 20:43:03 crc kubenswrapper[4746]: I0128 20:43:03.075604 4746 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-marketplace-4t66g" Jan 28 20:43:03 crc kubenswrapper[4746]: I0128 20:43:03.122763 4746 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-marketplace-4t66g" Jan 28 20:43:03 crc kubenswrapper[4746]: I0128 20:43:03.487646 4746 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-marketplace-f2t9c" Jan 28 20:43:03 crc kubenswrapper[4746]: I0128 20:43:03.487696 4746 
kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-marketplace-f2t9c" Jan 28 20:43:03 crc kubenswrapper[4746]: I0128 20:43:03.537310 4746 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-marketplace-f2t9c" Jan 28 20:43:03 crc kubenswrapper[4746]: I0128 20:43:03.835028 4746 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-ghx5p" event={"ID":"fa890224-0942-4671-a9d8-97b6f465b0df","Type":"ContainerStarted","Data":"a97c4a3cc18a3236a655a4072cc1a10ac809fb9a1cffaec7954c6199ad833df8"} Jan 28 20:43:03 crc kubenswrapper[4746]: I0128 20:43:03.862015 4746 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/community-operators-ghx5p" podStartSLOduration=6.067665014 podStartE2EDuration="1m3.861955486s" podCreationTimestamp="2026-01-28 20:42:00 +0000 UTC" firstStartedPulling="2026-01-28 20:42:04.809862348 +0000 UTC m=+152.766048702" lastFinishedPulling="2026-01-28 20:43:02.60415283 +0000 UTC m=+210.560339174" observedRunningTime="2026-01-28 20:43:03.856370706 +0000 UTC m=+211.812557070" watchObservedRunningTime="2026-01-28 20:43:03.861955486 +0000 UTC m=+211.818141880" Jan 28 20:43:04 crc kubenswrapper[4746]: I0128 20:43:04.651189 4746 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-operators-mlqqs" Jan 28 20:43:04 crc kubenswrapper[4746]: I0128 20:43:04.651250 4746 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-operators-mlqqs" Jan 28 20:43:05 crc kubenswrapper[4746]: I0128 20:43:05.732694 4746 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-marketplace/redhat-operators-mlqqs" podUID="8b55bfa9-466f-44d9-8bc7-753bff9b7a7b" containerName="registry-server" probeResult="failure" output=< Jan 28 20:43:05 crc kubenswrapper[4746]: timeout: failed to connect service 
":50051" within 1s Jan 28 20:43:05 crc kubenswrapper[4746]: > Jan 28 20:43:11 crc kubenswrapper[4746]: I0128 20:43:11.241052 4746 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/community-operators-ghx5p" Jan 28 20:43:11 crc kubenswrapper[4746]: I0128 20:43:11.241414 4746 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/community-operators-ghx5p" Jan 28 20:43:11 crc kubenswrapper[4746]: I0128 20:43:11.280173 4746 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/community-operators-ghx5p" Jan 28 20:43:11 crc kubenswrapper[4746]: I0128 20:43:11.576096 4746 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/certified-operators-c9v8x" Jan 28 20:43:11 crc kubenswrapper[4746]: I0128 20:43:11.623517 4746 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-c9v8x"] Jan 28 20:43:11 crc kubenswrapper[4746]: I0128 20:43:11.797570 4746 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/community-operators-hxz5g" Jan 28 20:43:11 crc kubenswrapper[4746]: I0128 20:43:11.861385 4746 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/community-operators-hxz5g" Jan 28 20:43:11 crc kubenswrapper[4746]: I0128 20:43:11.879832 4746 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/certified-operators-c9v8x" podUID="abe404c6-f1c8-4ad6-92b9-7c082b112b50" containerName="registry-server" containerID="cri-o://93c872d144be30351aeaa3702258771b2f9ef8c46a0fb016fdd102eba5675926" gracePeriod=2 Jan 28 20:43:11 crc kubenswrapper[4746]: I0128 20:43:11.957443 4746 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/community-operators-ghx5p" Jan 28 20:43:12 crc kubenswrapper[4746]: I0128 
20:43:12.890608 4746 generic.go:334] "Generic (PLEG): container finished" podID="abe404c6-f1c8-4ad6-92b9-7c082b112b50" containerID="93c872d144be30351aeaa3702258771b2f9ef8c46a0fb016fdd102eba5675926" exitCode=0 Jan 28 20:43:12 crc kubenswrapper[4746]: I0128 20:43:12.890693 4746 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-c9v8x" event={"ID":"abe404c6-f1c8-4ad6-92b9-7c082b112b50","Type":"ContainerDied","Data":"93c872d144be30351aeaa3702258771b2f9ef8c46a0fb016fdd102eba5675926"} Jan 28 20:43:13 crc kubenswrapper[4746]: I0128 20:43:13.144810 4746 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-marketplace-4t66g" Jan 28 20:43:13 crc kubenswrapper[4746]: I0128 20:43:13.563129 4746 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-marketplace-f2t9c" Jan 28 20:43:13 crc kubenswrapper[4746]: I0128 20:43:13.597402 4746 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-c9v8x" Jan 28 20:43:13 crc kubenswrapper[4746]: I0128 20:43:13.687705 4746 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-pjzkd\" (UniqueName: \"kubernetes.io/projected/abe404c6-f1c8-4ad6-92b9-7c082b112b50-kube-api-access-pjzkd\") pod \"abe404c6-f1c8-4ad6-92b9-7c082b112b50\" (UID: \"abe404c6-f1c8-4ad6-92b9-7c082b112b50\") " Jan 28 20:43:13 crc kubenswrapper[4746]: I0128 20:43:13.687838 4746 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/abe404c6-f1c8-4ad6-92b9-7c082b112b50-utilities\") pod \"abe404c6-f1c8-4ad6-92b9-7c082b112b50\" (UID: \"abe404c6-f1c8-4ad6-92b9-7c082b112b50\") " Jan 28 20:43:13 crc kubenswrapper[4746]: I0128 20:43:13.687886 4746 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/abe404c6-f1c8-4ad6-92b9-7c082b112b50-catalog-content\") pod \"abe404c6-f1c8-4ad6-92b9-7c082b112b50\" (UID: \"abe404c6-f1c8-4ad6-92b9-7c082b112b50\") " Jan 28 20:43:13 crc kubenswrapper[4746]: I0128 20:43:13.689059 4746 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/abe404c6-f1c8-4ad6-92b9-7c082b112b50-utilities" (OuterVolumeSpecName: "utilities") pod "abe404c6-f1c8-4ad6-92b9-7c082b112b50" (UID: "abe404c6-f1c8-4ad6-92b9-7c082b112b50"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 28 20:43:13 crc kubenswrapper[4746]: I0128 20:43:13.696194 4746 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/abe404c6-f1c8-4ad6-92b9-7c082b112b50-kube-api-access-pjzkd" (OuterVolumeSpecName: "kube-api-access-pjzkd") pod "abe404c6-f1c8-4ad6-92b9-7c082b112b50" (UID: "abe404c6-f1c8-4ad6-92b9-7c082b112b50"). InnerVolumeSpecName "kube-api-access-pjzkd". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 28 20:43:13 crc kubenswrapper[4746]: I0128 20:43:13.743873 4746 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/abe404c6-f1c8-4ad6-92b9-7c082b112b50-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "abe404c6-f1c8-4ad6-92b9-7c082b112b50" (UID: "abe404c6-f1c8-4ad6-92b9-7c082b112b50"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 28 20:43:13 crc kubenswrapper[4746]: I0128 20:43:13.790072 4746 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-pjzkd\" (UniqueName: \"kubernetes.io/projected/abe404c6-f1c8-4ad6-92b9-7c082b112b50-kube-api-access-pjzkd\") on node \"crc\" DevicePath \"\"" Jan 28 20:43:13 crc kubenswrapper[4746]: I0128 20:43:13.790179 4746 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/abe404c6-f1c8-4ad6-92b9-7c082b112b50-utilities\") on node \"crc\" DevicePath \"\"" Jan 28 20:43:13 crc kubenswrapper[4746]: I0128 20:43:13.790200 4746 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/abe404c6-f1c8-4ad6-92b9-7c082b112b50-catalog-content\") on node \"crc\" DevicePath \"\"" Jan 28 20:43:13 crc kubenswrapper[4746]: I0128 20:43:13.901132 4746 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-c9v8x" event={"ID":"abe404c6-f1c8-4ad6-92b9-7c082b112b50","Type":"ContainerDied","Data":"2afa48be8e6e186e182a9adcdf5c0a0656a4e51c7c293d2ec30272750799e7c6"} Jan 28 20:43:13 crc kubenswrapper[4746]: I0128 20:43:13.901197 4746 scope.go:117] "RemoveContainer" containerID="93c872d144be30351aeaa3702258771b2f9ef8c46a0fb016fdd102eba5675926" Jan 28 20:43:13 crc kubenswrapper[4746]: I0128 20:43:13.901770 4746 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-c9v8x" Jan 28 20:43:13 crc kubenswrapper[4746]: I0128 20:43:13.922699 4746 scope.go:117] "RemoveContainer" containerID="4b30acf98a4f9337a249e775286542e3e54002c749e6801ae5b8e7864b1188dc" Jan 28 20:43:13 crc kubenswrapper[4746]: I0128 20:43:13.939046 4746 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-c9v8x"] Jan 28 20:43:13 crc kubenswrapper[4746]: I0128 20:43:13.944817 4746 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/certified-operators-c9v8x"] Jan 28 20:43:13 crc kubenswrapper[4746]: I0128 20:43:13.956518 4746 scope.go:117] "RemoveContainer" containerID="cb598f9ea962c4558236bbf4b7f16c6f3a29aab5024527148774d0e8547405ed" Jan 28 20:43:14 crc kubenswrapper[4746]: I0128 20:43:14.131726 4746 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-hxz5g"] Jan 28 20:43:14 crc kubenswrapper[4746]: I0128 20:43:14.132311 4746 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/community-operators-hxz5g" podUID="f8f220cb-71e1-4b97-960e-ef8742661130" containerName="registry-server" containerID="cri-o://79f78a3339aa87612d90f4b9442c37cd103a06f265033bde9faaddbd4951ba43" gracePeriod=2 Jan 28 20:43:14 crc kubenswrapper[4746]: I0128 20:43:14.632710 4746 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-hxz5g" Jan 28 20:43:14 crc kubenswrapper[4746]: I0128 20:43:14.702428 4746 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-operators-mlqqs" Jan 28 20:43:14 crc kubenswrapper[4746]: I0128 20:43:14.704953 4746 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/f8f220cb-71e1-4b97-960e-ef8742661130-utilities\") pod \"f8f220cb-71e1-4b97-960e-ef8742661130\" (UID: \"f8f220cb-71e1-4b97-960e-ef8742661130\") " Jan 28 20:43:14 crc kubenswrapper[4746]: I0128 20:43:14.705110 4746 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/f8f220cb-71e1-4b97-960e-ef8742661130-catalog-content\") pod \"f8f220cb-71e1-4b97-960e-ef8742661130\" (UID: \"f8f220cb-71e1-4b97-960e-ef8742661130\") " Jan 28 20:43:14 crc kubenswrapper[4746]: I0128 20:43:14.705185 4746 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-f6jsk\" (UniqueName: \"kubernetes.io/projected/f8f220cb-71e1-4b97-960e-ef8742661130-kube-api-access-f6jsk\") pod \"f8f220cb-71e1-4b97-960e-ef8742661130\" (UID: \"f8f220cb-71e1-4b97-960e-ef8742661130\") " Jan 28 20:43:14 crc kubenswrapper[4746]: I0128 20:43:14.706239 4746 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/f8f220cb-71e1-4b97-960e-ef8742661130-utilities" (OuterVolumeSpecName: "utilities") pod "f8f220cb-71e1-4b97-960e-ef8742661130" (UID: "f8f220cb-71e1-4b97-960e-ef8742661130"). InnerVolumeSpecName "utilities". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 28 20:43:14 crc kubenswrapper[4746]: I0128 20:43:14.712588 4746 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/f8f220cb-71e1-4b97-960e-ef8742661130-kube-api-access-f6jsk" (OuterVolumeSpecName: "kube-api-access-f6jsk") pod "f8f220cb-71e1-4b97-960e-ef8742661130" (UID: "f8f220cb-71e1-4b97-960e-ef8742661130"). InnerVolumeSpecName "kube-api-access-f6jsk". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 28 20:43:14 crc kubenswrapper[4746]: I0128 20:43:14.766960 4746 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-operators-mlqqs" Jan 28 20:43:14 crc kubenswrapper[4746]: I0128 20:43:14.775885 4746 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/f8f220cb-71e1-4b97-960e-ef8742661130-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "f8f220cb-71e1-4b97-960e-ef8742661130" (UID: "f8f220cb-71e1-4b97-960e-ef8742661130"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 28 20:43:14 crc kubenswrapper[4746]: I0128 20:43:14.807040 4746 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/f8f220cb-71e1-4b97-960e-ef8742661130-catalog-content\") on node \"crc\" DevicePath \"\"" Jan 28 20:43:14 crc kubenswrapper[4746]: I0128 20:43:14.807108 4746 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-f6jsk\" (UniqueName: \"kubernetes.io/projected/f8f220cb-71e1-4b97-960e-ef8742661130-kube-api-access-f6jsk\") on node \"crc\" DevicePath \"\"" Jan 28 20:43:14 crc kubenswrapper[4746]: I0128 20:43:14.807130 4746 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/f8f220cb-71e1-4b97-960e-ef8742661130-utilities\") on node \"crc\" DevicePath \"\"" Jan 28 20:43:14 crc kubenswrapper[4746]: I0128 20:43:14.843043 4746 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="abe404c6-f1c8-4ad6-92b9-7c082b112b50" path="/var/lib/kubelet/pods/abe404c6-f1c8-4ad6-92b9-7c082b112b50/volumes" Jan 28 20:43:14 crc kubenswrapper[4746]: I0128 20:43:14.909296 4746 generic.go:334] "Generic (PLEG): container finished" podID="f8f220cb-71e1-4b97-960e-ef8742661130" containerID="79f78a3339aa87612d90f4b9442c37cd103a06f265033bde9faaddbd4951ba43" exitCode=0 Jan 28 20:43:14 crc kubenswrapper[4746]: I0128 20:43:14.909428 4746 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-hxz5g" Jan 28 20:43:14 crc kubenswrapper[4746]: I0128 20:43:14.909503 4746 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-hxz5g" event={"ID":"f8f220cb-71e1-4b97-960e-ef8742661130","Type":"ContainerDied","Data":"79f78a3339aa87612d90f4b9442c37cd103a06f265033bde9faaddbd4951ba43"} Jan 28 20:43:14 crc kubenswrapper[4746]: I0128 20:43:14.909560 4746 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-hxz5g" event={"ID":"f8f220cb-71e1-4b97-960e-ef8742661130","Type":"ContainerDied","Data":"3088251d6df21e737845c981681e5810236877bf93c54cf60f668108a5fec33d"} Jan 28 20:43:14 crc kubenswrapper[4746]: I0128 20:43:14.909590 4746 scope.go:117] "RemoveContainer" containerID="79f78a3339aa87612d90f4b9442c37cd103a06f265033bde9faaddbd4951ba43" Jan 28 20:43:14 crc kubenswrapper[4746]: I0128 20:43:14.928385 4746 scope.go:117] "RemoveContainer" containerID="e66e84ecc9aef3f38aa7901fcda5dbebd35192cf04ae069e2d37723f5cf3bd30" Jan 28 20:43:14 crc kubenswrapper[4746]: I0128 20:43:14.932388 4746 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-hxz5g"] Jan 28 20:43:14 crc kubenswrapper[4746]: I0128 20:43:14.937418 4746 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/community-operators-hxz5g"] Jan 28 20:43:14 crc kubenswrapper[4746]: I0128 20:43:14.945561 4746 scope.go:117] "RemoveContainer" containerID="ee30384131315f8e32eb7d79d6e8219c181b4d8d31cea43f077290c1cfe6bddb" Jan 28 20:43:14 crc kubenswrapper[4746]: I0128 20:43:14.963362 4746 scope.go:117] "RemoveContainer" containerID="79f78a3339aa87612d90f4b9442c37cd103a06f265033bde9faaddbd4951ba43" Jan 28 20:43:14 crc kubenswrapper[4746]: E0128 20:43:14.964102 4746 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container 
\"79f78a3339aa87612d90f4b9442c37cd103a06f265033bde9faaddbd4951ba43\": container with ID starting with 79f78a3339aa87612d90f4b9442c37cd103a06f265033bde9faaddbd4951ba43 not found: ID does not exist" containerID="79f78a3339aa87612d90f4b9442c37cd103a06f265033bde9faaddbd4951ba43" Jan 28 20:43:14 crc kubenswrapper[4746]: I0128 20:43:14.964146 4746 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"79f78a3339aa87612d90f4b9442c37cd103a06f265033bde9faaddbd4951ba43"} err="failed to get container status \"79f78a3339aa87612d90f4b9442c37cd103a06f265033bde9faaddbd4951ba43\": rpc error: code = NotFound desc = could not find container \"79f78a3339aa87612d90f4b9442c37cd103a06f265033bde9faaddbd4951ba43\": container with ID starting with 79f78a3339aa87612d90f4b9442c37cd103a06f265033bde9faaddbd4951ba43 not found: ID does not exist" Jan 28 20:43:14 crc kubenswrapper[4746]: I0128 20:43:14.964177 4746 scope.go:117] "RemoveContainer" containerID="e66e84ecc9aef3f38aa7901fcda5dbebd35192cf04ae069e2d37723f5cf3bd30" Jan 28 20:43:14 crc kubenswrapper[4746]: E0128 20:43:14.964519 4746 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"e66e84ecc9aef3f38aa7901fcda5dbebd35192cf04ae069e2d37723f5cf3bd30\": container with ID starting with e66e84ecc9aef3f38aa7901fcda5dbebd35192cf04ae069e2d37723f5cf3bd30 not found: ID does not exist" containerID="e66e84ecc9aef3f38aa7901fcda5dbebd35192cf04ae069e2d37723f5cf3bd30" Jan 28 20:43:14 crc kubenswrapper[4746]: I0128 20:43:14.964617 4746 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"e66e84ecc9aef3f38aa7901fcda5dbebd35192cf04ae069e2d37723f5cf3bd30"} err="failed to get container status \"e66e84ecc9aef3f38aa7901fcda5dbebd35192cf04ae069e2d37723f5cf3bd30\": rpc error: code = NotFound desc = could not find container \"e66e84ecc9aef3f38aa7901fcda5dbebd35192cf04ae069e2d37723f5cf3bd30\": container with ID 
starting with e66e84ecc9aef3f38aa7901fcda5dbebd35192cf04ae069e2d37723f5cf3bd30 not found: ID does not exist" Jan 28 20:43:14 crc kubenswrapper[4746]: I0128 20:43:14.964738 4746 scope.go:117] "RemoveContainer" containerID="ee30384131315f8e32eb7d79d6e8219c181b4d8d31cea43f077290c1cfe6bddb" Jan 28 20:43:14 crc kubenswrapper[4746]: E0128 20:43:14.965123 4746 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"ee30384131315f8e32eb7d79d6e8219c181b4d8d31cea43f077290c1cfe6bddb\": container with ID starting with ee30384131315f8e32eb7d79d6e8219c181b4d8d31cea43f077290c1cfe6bddb not found: ID does not exist" containerID="ee30384131315f8e32eb7d79d6e8219c181b4d8d31cea43f077290c1cfe6bddb" Jan 28 20:43:14 crc kubenswrapper[4746]: I0128 20:43:14.965168 4746 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"ee30384131315f8e32eb7d79d6e8219c181b4d8d31cea43f077290c1cfe6bddb"} err="failed to get container status \"ee30384131315f8e32eb7d79d6e8219c181b4d8d31cea43f077290c1cfe6bddb\": rpc error: code = NotFound desc = could not find container \"ee30384131315f8e32eb7d79d6e8219c181b4d8d31cea43f077290c1cfe6bddb\": container with ID starting with ee30384131315f8e32eb7d79d6e8219c181b4d8d31cea43f077290c1cfe6bddb not found: ID does not exist" Jan 28 20:43:15 crc kubenswrapper[4746]: I0128 20:43:15.871762 4746 patch_prober.go:28] interesting pod/machine-config-daemon-6wrnw container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Jan 28 20:43:15 crc kubenswrapper[4746]: I0128 20:43:15.871850 4746 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-6wrnw" podUID="6dc8b546-9734-4082-b2b3-2bafe3f1564d" containerName="machine-config-daemon" probeResult="failure" 
output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Jan 28 20:43:15 crc kubenswrapper[4746]: I0128 20:43:15.871911 4746 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-6wrnw" Jan 28 20:43:15 crc kubenswrapper[4746]: I0128 20:43:15.872730 4746 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"66d250466155027405bf90c4e5ed09388238d4cee63604e28486a16778f9d188"} pod="openshift-machine-config-operator/machine-config-daemon-6wrnw" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted" Jan 28 20:43:15 crc kubenswrapper[4746]: I0128 20:43:15.872800 4746 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-6wrnw" podUID="6dc8b546-9734-4082-b2b3-2bafe3f1564d" containerName="machine-config-daemon" containerID="cri-o://66d250466155027405bf90c4e5ed09388238d4cee63604e28486a16778f9d188" gracePeriod=600 Jan 28 20:43:15 crc kubenswrapper[4746]: I0128 20:43:15.921279 4746 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-f2t9c"] Jan 28 20:43:15 crc kubenswrapper[4746]: I0128 20:43:15.921584 4746 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-marketplace-f2t9c" podUID="1f42df00-e947-4928-a51f-ddaa3658cc67" containerName="registry-server" containerID="cri-o://e53569df4787d5c5def247f3bc6929834be74259f6b3599f9fa4a4189d6e2620" gracePeriod=2 Jan 28 20:43:16 crc kubenswrapper[4746]: I0128 20:43:16.379011 4746 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-f2t9c" Jan 28 20:43:16 crc kubenswrapper[4746]: I0128 20:43:16.465978 4746 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-wsqtr\" (UniqueName: \"kubernetes.io/projected/1f42df00-e947-4928-a51f-ddaa3658cc67-kube-api-access-wsqtr\") pod \"1f42df00-e947-4928-a51f-ddaa3658cc67\" (UID: \"1f42df00-e947-4928-a51f-ddaa3658cc67\") " Jan 28 20:43:16 crc kubenswrapper[4746]: I0128 20:43:16.466113 4746 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/1f42df00-e947-4928-a51f-ddaa3658cc67-utilities\") pod \"1f42df00-e947-4928-a51f-ddaa3658cc67\" (UID: \"1f42df00-e947-4928-a51f-ddaa3658cc67\") " Jan 28 20:43:16 crc kubenswrapper[4746]: I0128 20:43:16.466183 4746 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/1f42df00-e947-4928-a51f-ddaa3658cc67-catalog-content\") pod \"1f42df00-e947-4928-a51f-ddaa3658cc67\" (UID: \"1f42df00-e947-4928-a51f-ddaa3658cc67\") " Jan 28 20:43:16 crc kubenswrapper[4746]: I0128 20:43:16.467224 4746 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/1f42df00-e947-4928-a51f-ddaa3658cc67-utilities" (OuterVolumeSpecName: "utilities") pod "1f42df00-e947-4928-a51f-ddaa3658cc67" (UID: "1f42df00-e947-4928-a51f-ddaa3658cc67"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 28 20:43:16 crc kubenswrapper[4746]: I0128 20:43:16.479847 4746 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/1f42df00-e947-4928-a51f-ddaa3658cc67-kube-api-access-wsqtr" (OuterVolumeSpecName: "kube-api-access-wsqtr") pod "1f42df00-e947-4928-a51f-ddaa3658cc67" (UID: "1f42df00-e947-4928-a51f-ddaa3658cc67"). InnerVolumeSpecName "kube-api-access-wsqtr". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 28 20:43:16 crc kubenswrapper[4746]: I0128 20:43:16.492148 4746 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/1f42df00-e947-4928-a51f-ddaa3658cc67-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "1f42df00-e947-4928-a51f-ddaa3658cc67" (UID: "1f42df00-e947-4928-a51f-ddaa3658cc67"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 28 20:43:16 crc kubenswrapper[4746]: I0128 20:43:16.568205 4746 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-wsqtr\" (UniqueName: \"kubernetes.io/projected/1f42df00-e947-4928-a51f-ddaa3658cc67-kube-api-access-wsqtr\") on node \"crc\" DevicePath \"\"" Jan 28 20:43:16 crc kubenswrapper[4746]: I0128 20:43:16.568742 4746 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/1f42df00-e947-4928-a51f-ddaa3658cc67-utilities\") on node \"crc\" DevicePath \"\"" Jan 28 20:43:16 crc kubenswrapper[4746]: I0128 20:43:16.568758 4746 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/1f42df00-e947-4928-a51f-ddaa3658cc67-catalog-content\") on node \"crc\" DevicePath \"\"" Jan 28 20:43:16 crc kubenswrapper[4746]: I0128 20:43:16.844952 4746 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="f8f220cb-71e1-4b97-960e-ef8742661130" path="/var/lib/kubelet/pods/f8f220cb-71e1-4b97-960e-ef8742661130/volumes" Jan 28 20:43:16 crc kubenswrapper[4746]: I0128 20:43:16.924204 4746 generic.go:334] "Generic (PLEG): container finished" podID="6dc8b546-9734-4082-b2b3-2bafe3f1564d" containerID="66d250466155027405bf90c4e5ed09388238d4cee63604e28486a16778f9d188" exitCode=0 Jan 28 20:43:16 crc kubenswrapper[4746]: I0128 20:43:16.924280 4746 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openshift-machine-config-operator/machine-config-daemon-6wrnw" event={"ID":"6dc8b546-9734-4082-b2b3-2bafe3f1564d","Type":"ContainerDied","Data":"66d250466155027405bf90c4e5ed09388238d4cee63604e28486a16778f9d188"} Jan 28 20:43:16 crc kubenswrapper[4746]: I0128 20:43:16.924358 4746 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-6wrnw" event={"ID":"6dc8b546-9734-4082-b2b3-2bafe3f1564d","Type":"ContainerStarted","Data":"f197cfecdfc241f837aa378849ac18dd600cbcb4c925257a8bc1cdfb97e289aa"} Jan 28 20:43:16 crc kubenswrapper[4746]: I0128 20:43:16.926312 4746 generic.go:334] "Generic (PLEG): container finished" podID="1f42df00-e947-4928-a51f-ddaa3658cc67" containerID="e53569df4787d5c5def247f3bc6929834be74259f6b3599f9fa4a4189d6e2620" exitCode=0 Jan 28 20:43:16 crc kubenswrapper[4746]: I0128 20:43:16.926357 4746 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-f2t9c" Jan 28 20:43:16 crc kubenswrapper[4746]: I0128 20:43:16.926376 4746 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-f2t9c" event={"ID":"1f42df00-e947-4928-a51f-ddaa3658cc67","Type":"ContainerDied","Data":"e53569df4787d5c5def247f3bc6929834be74259f6b3599f9fa4a4189d6e2620"} Jan 28 20:43:16 crc kubenswrapper[4746]: I0128 20:43:16.926420 4746 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-f2t9c" event={"ID":"1f42df00-e947-4928-a51f-ddaa3658cc67","Type":"ContainerDied","Data":"59f40486ad366506798fc88daee90c33d4b58a66f1f18831ade1f670ae41b699"} Jan 28 20:43:16 crc kubenswrapper[4746]: I0128 20:43:16.926444 4746 scope.go:117] "RemoveContainer" containerID="e53569df4787d5c5def247f3bc6929834be74259f6b3599f9fa4a4189d6e2620" Jan 28 20:43:16 crc kubenswrapper[4746]: I0128 20:43:16.943522 4746 scope.go:117] "RemoveContainer" 
containerID="d64d7a1f62f2db500c6c8da501014bd83a035579a9931041f5819b3ce1f832e2" Jan 28 20:43:16 crc kubenswrapper[4746]: I0128 20:43:16.962359 4746 scope.go:117] "RemoveContainer" containerID="17f2ca649624860330af720f180c4cae93097af14ba360d93ac5e2164112d0c4" Jan 28 20:43:16 crc kubenswrapper[4746]: I0128 20:43:16.977952 4746 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-f2t9c"] Jan 28 20:43:16 crc kubenswrapper[4746]: I0128 20:43:16.982987 4746 scope.go:117] "RemoveContainer" containerID="e53569df4787d5c5def247f3bc6929834be74259f6b3599f9fa4a4189d6e2620" Jan 28 20:43:16 crc kubenswrapper[4746]: I0128 20:43:16.983883 4746 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-marketplace-f2t9c"] Jan 28 20:43:16 crc kubenswrapper[4746]: E0128 20:43:16.986113 4746 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"e53569df4787d5c5def247f3bc6929834be74259f6b3599f9fa4a4189d6e2620\": container with ID starting with e53569df4787d5c5def247f3bc6929834be74259f6b3599f9fa4a4189d6e2620 not found: ID does not exist" containerID="e53569df4787d5c5def247f3bc6929834be74259f6b3599f9fa4a4189d6e2620" Jan 28 20:43:16 crc kubenswrapper[4746]: I0128 20:43:16.986197 4746 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"e53569df4787d5c5def247f3bc6929834be74259f6b3599f9fa4a4189d6e2620"} err="failed to get container status \"e53569df4787d5c5def247f3bc6929834be74259f6b3599f9fa4a4189d6e2620\": rpc error: code = NotFound desc = could not find container \"e53569df4787d5c5def247f3bc6929834be74259f6b3599f9fa4a4189d6e2620\": container with ID starting with e53569df4787d5c5def247f3bc6929834be74259f6b3599f9fa4a4189d6e2620 not found: ID does not exist" Jan 28 20:43:16 crc kubenswrapper[4746]: I0128 20:43:16.986263 4746 scope.go:117] "RemoveContainer" 
containerID="d64d7a1f62f2db500c6c8da501014bd83a035579a9931041f5819b3ce1f832e2" Jan 28 20:43:16 crc kubenswrapper[4746]: E0128 20:43:16.986891 4746 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"d64d7a1f62f2db500c6c8da501014bd83a035579a9931041f5819b3ce1f832e2\": container with ID starting with d64d7a1f62f2db500c6c8da501014bd83a035579a9931041f5819b3ce1f832e2 not found: ID does not exist" containerID="d64d7a1f62f2db500c6c8da501014bd83a035579a9931041f5819b3ce1f832e2" Jan 28 20:43:16 crc kubenswrapper[4746]: I0128 20:43:16.986920 4746 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"d64d7a1f62f2db500c6c8da501014bd83a035579a9931041f5819b3ce1f832e2"} err="failed to get container status \"d64d7a1f62f2db500c6c8da501014bd83a035579a9931041f5819b3ce1f832e2\": rpc error: code = NotFound desc = could not find container \"d64d7a1f62f2db500c6c8da501014bd83a035579a9931041f5819b3ce1f832e2\": container with ID starting with d64d7a1f62f2db500c6c8da501014bd83a035579a9931041f5819b3ce1f832e2 not found: ID does not exist" Jan 28 20:43:16 crc kubenswrapper[4746]: I0128 20:43:16.986942 4746 scope.go:117] "RemoveContainer" containerID="17f2ca649624860330af720f180c4cae93097af14ba360d93ac5e2164112d0c4" Jan 28 20:43:16 crc kubenswrapper[4746]: E0128 20:43:16.987399 4746 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"17f2ca649624860330af720f180c4cae93097af14ba360d93ac5e2164112d0c4\": container with ID starting with 17f2ca649624860330af720f180c4cae93097af14ba360d93ac5e2164112d0c4 not found: ID does not exist" containerID="17f2ca649624860330af720f180c4cae93097af14ba360d93ac5e2164112d0c4" Jan 28 20:43:16 crc kubenswrapper[4746]: I0128 20:43:16.987429 4746 pod_container_deletor.go:53] "DeleteContainer returned error" 
containerID={"Type":"cri-o","ID":"17f2ca649624860330af720f180c4cae93097af14ba360d93ac5e2164112d0c4"} err="failed to get container status \"17f2ca649624860330af720f180c4cae93097af14ba360d93ac5e2164112d0c4\": rpc error: code = NotFound desc = could not find container \"17f2ca649624860330af720f180c4cae93097af14ba360d93ac5e2164112d0c4\": container with ID starting with 17f2ca649624860330af720f180c4cae93097af14ba360d93ac5e2164112d0c4 not found: ID does not exist" Jan 28 20:43:17 crc kubenswrapper[4746]: I0128 20:43:17.927659 4746 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-authentication/oauth-openshift-558db77b4-4g4p7" podUID="6208130d-52bc-449e-b371-357b1cc21b22" containerName="oauth-openshift" containerID="cri-o://ab259627a2560868a6fdbf9d2cca4fb91aee52f63f4afc20e810c3c81ba3aca9" gracePeriod=15 Jan 28 20:43:18 crc kubenswrapper[4746]: I0128 20:43:18.457357 4746 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-authentication/oauth-openshift-558db77b4-4g4p7" Jan 28 20:43:18 crc kubenswrapper[4746]: I0128 20:43:18.607735 4746 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-system-session\" (UniqueName: \"kubernetes.io/secret/6208130d-52bc-449e-b371-357b1cc21b22-v4-0-config-system-session\") pod \"6208130d-52bc-449e-b371-357b1cc21b22\" (UID: \"6208130d-52bc-449e-b371-357b1cc21b22\") " Jan 28 20:43:18 crc kubenswrapper[4746]: I0128 20:43:18.607844 4746 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-system-router-certs\" (UniqueName: \"kubernetes.io/secret/6208130d-52bc-449e-b371-357b1cc21b22-v4-0-config-system-router-certs\") pod \"6208130d-52bc-449e-b371-357b1cc21b22\" (UID: \"6208130d-52bc-449e-b371-357b1cc21b22\") " Jan 28 20:43:18 crc kubenswrapper[4746]: I0128 20:43:18.607879 4746 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"audit-policies\" (UniqueName: 
\"kubernetes.io/configmap/6208130d-52bc-449e-b371-357b1cc21b22-audit-policies\") pod \"6208130d-52bc-449e-b371-357b1cc21b22\" (UID: \"6208130d-52bc-449e-b371-357b1cc21b22\") " Jan 28 20:43:18 crc kubenswrapper[4746]: I0128 20:43:18.607901 4746 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-system-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/6208130d-52bc-449e-b371-357b1cc21b22-v4-0-config-system-trusted-ca-bundle\") pod \"6208130d-52bc-449e-b371-357b1cc21b22\" (UID: \"6208130d-52bc-449e-b371-357b1cc21b22\") " Jan 28 20:43:18 crc kubenswrapper[4746]: I0128 20:43:18.607923 4746 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-system-cliconfig\" (UniqueName: \"kubernetes.io/configmap/6208130d-52bc-449e-b371-357b1cc21b22-v4-0-config-system-cliconfig\") pod \"6208130d-52bc-449e-b371-357b1cc21b22\" (UID: \"6208130d-52bc-449e-b371-357b1cc21b22\") " Jan 28 20:43:18 crc kubenswrapper[4746]: I0128 20:43:18.607951 4746 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/6208130d-52bc-449e-b371-357b1cc21b22-audit-dir\") pod \"6208130d-52bc-449e-b371-357b1cc21b22\" (UID: \"6208130d-52bc-449e-b371-357b1cc21b22\") " Jan 28 20:43:18 crc kubenswrapper[4746]: I0128 20:43:18.607975 4746 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-system-serving-cert\" (UniqueName: \"kubernetes.io/secret/6208130d-52bc-449e-b371-357b1cc21b22-v4-0-config-system-serving-cert\") pod \"6208130d-52bc-449e-b371-357b1cc21b22\" (UID: \"6208130d-52bc-449e-b371-357b1cc21b22\") " Jan 28 20:43:18 crc kubenswrapper[4746]: I0128 20:43:18.607998 4746 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-user-template-provider-selection\" (UniqueName: 
\"kubernetes.io/secret/6208130d-52bc-449e-b371-357b1cc21b22-v4-0-config-user-template-provider-selection\") pod \"6208130d-52bc-449e-b371-357b1cc21b22\" (UID: \"6208130d-52bc-449e-b371-357b1cc21b22\") " Jan 28 20:43:18 crc kubenswrapper[4746]: I0128 20:43:18.608020 4746 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-6krvq\" (UniqueName: \"kubernetes.io/projected/6208130d-52bc-449e-b371-357b1cc21b22-kube-api-access-6krvq\") pod \"6208130d-52bc-449e-b371-357b1cc21b22\" (UID: \"6208130d-52bc-449e-b371-357b1cc21b22\") " Jan 28 20:43:18 crc kubenswrapper[4746]: I0128 20:43:18.608043 4746 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-system-service-ca\" (UniqueName: \"kubernetes.io/configmap/6208130d-52bc-449e-b371-357b1cc21b22-v4-0-config-system-service-ca\") pod \"6208130d-52bc-449e-b371-357b1cc21b22\" (UID: \"6208130d-52bc-449e-b371-357b1cc21b22\") " Jan 28 20:43:18 crc kubenswrapper[4746]: I0128 20:43:18.608068 4746 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-user-idp-0-file-data\" (UniqueName: \"kubernetes.io/secret/6208130d-52bc-449e-b371-357b1cc21b22-v4-0-config-user-idp-0-file-data\") pod \"6208130d-52bc-449e-b371-357b1cc21b22\" (UID: \"6208130d-52bc-449e-b371-357b1cc21b22\") " Jan 28 20:43:18 crc kubenswrapper[4746]: I0128 20:43:18.608067 4746 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/6208130d-52bc-449e-b371-357b1cc21b22-audit-dir" (OuterVolumeSpecName: "audit-dir") pod "6208130d-52bc-449e-b371-357b1cc21b22" (UID: "6208130d-52bc-449e-b371-357b1cc21b22"). InnerVolumeSpecName "audit-dir". 
PluginName "kubernetes.io/host-path", VolumeGidValue "" Jan 28 20:43:18 crc kubenswrapper[4746]: I0128 20:43:18.608115 4746 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-user-template-error\" (UniqueName: \"kubernetes.io/secret/6208130d-52bc-449e-b371-357b1cc21b22-v4-0-config-user-template-error\") pod \"6208130d-52bc-449e-b371-357b1cc21b22\" (UID: \"6208130d-52bc-449e-b371-357b1cc21b22\") " Jan 28 20:43:18 crc kubenswrapper[4746]: I0128 20:43:18.608241 4746 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-user-template-login\" (UniqueName: \"kubernetes.io/secret/6208130d-52bc-449e-b371-357b1cc21b22-v4-0-config-user-template-login\") pod \"6208130d-52bc-449e-b371-357b1cc21b22\" (UID: \"6208130d-52bc-449e-b371-357b1cc21b22\") " Jan 28 20:43:18 crc kubenswrapper[4746]: I0128 20:43:18.608303 4746 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-system-ocp-branding-template\" (UniqueName: \"kubernetes.io/secret/6208130d-52bc-449e-b371-357b1cc21b22-v4-0-config-system-ocp-branding-template\") pod \"6208130d-52bc-449e-b371-357b1cc21b22\" (UID: \"6208130d-52bc-449e-b371-357b1cc21b22\") " Jan 28 20:43:18 crc kubenswrapper[4746]: I0128 20:43:18.608873 4746 reconciler_common.go:293] "Volume detached for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/6208130d-52bc-449e-b371-357b1cc21b22-audit-dir\") on node \"crc\" DevicePath \"\"" Jan 28 20:43:18 crc kubenswrapper[4746]: I0128 20:43:18.608883 4746 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/6208130d-52bc-449e-b371-357b1cc21b22-v4-0-config-system-trusted-ca-bundle" (OuterVolumeSpecName: "v4-0-config-system-trusted-ca-bundle") pod "6208130d-52bc-449e-b371-357b1cc21b22" (UID: "6208130d-52bc-449e-b371-357b1cc21b22"). InnerVolumeSpecName "v4-0-config-system-trusted-ca-bundle". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 28 20:43:18 crc kubenswrapper[4746]: I0128 20:43:18.610072 4746 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/6208130d-52bc-449e-b371-357b1cc21b22-audit-policies" (OuterVolumeSpecName: "audit-policies") pod "6208130d-52bc-449e-b371-357b1cc21b22" (UID: "6208130d-52bc-449e-b371-357b1cc21b22"). InnerVolumeSpecName "audit-policies". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 28 20:43:18 crc kubenswrapper[4746]: I0128 20:43:18.611103 4746 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/6208130d-52bc-449e-b371-357b1cc21b22-v4-0-config-system-cliconfig" (OuterVolumeSpecName: "v4-0-config-system-cliconfig") pod "6208130d-52bc-449e-b371-357b1cc21b22" (UID: "6208130d-52bc-449e-b371-357b1cc21b22"). InnerVolumeSpecName "v4-0-config-system-cliconfig". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 28 20:43:18 crc kubenswrapper[4746]: I0128 20:43:18.614471 4746 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/6208130d-52bc-449e-b371-357b1cc21b22-v4-0-config-system-service-ca" (OuterVolumeSpecName: "v4-0-config-system-service-ca") pod "6208130d-52bc-449e-b371-357b1cc21b22" (UID: "6208130d-52bc-449e-b371-357b1cc21b22"). InnerVolumeSpecName "v4-0-config-system-service-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 28 20:43:18 crc kubenswrapper[4746]: I0128 20:43:18.614910 4746 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/6208130d-52bc-449e-b371-357b1cc21b22-v4-0-config-system-router-certs" (OuterVolumeSpecName: "v4-0-config-system-router-certs") pod "6208130d-52bc-449e-b371-357b1cc21b22" (UID: "6208130d-52bc-449e-b371-357b1cc21b22"). InnerVolumeSpecName "v4-0-config-system-router-certs". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 28 20:43:18 crc kubenswrapper[4746]: I0128 20:43:18.615436 4746 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/6208130d-52bc-449e-b371-357b1cc21b22-v4-0-config-user-template-error" (OuterVolumeSpecName: "v4-0-config-user-template-error") pod "6208130d-52bc-449e-b371-357b1cc21b22" (UID: "6208130d-52bc-449e-b371-357b1cc21b22"). InnerVolumeSpecName "v4-0-config-user-template-error". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 28 20:43:18 crc kubenswrapper[4746]: I0128 20:43:18.615898 4746 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/6208130d-52bc-449e-b371-357b1cc21b22-kube-api-access-6krvq" (OuterVolumeSpecName: "kube-api-access-6krvq") pod "6208130d-52bc-449e-b371-357b1cc21b22" (UID: "6208130d-52bc-449e-b371-357b1cc21b22"). InnerVolumeSpecName "kube-api-access-6krvq". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 28 20:43:18 crc kubenswrapper[4746]: I0128 20:43:18.615950 4746 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/6208130d-52bc-449e-b371-357b1cc21b22-v4-0-config-system-ocp-branding-template" (OuterVolumeSpecName: "v4-0-config-system-ocp-branding-template") pod "6208130d-52bc-449e-b371-357b1cc21b22" (UID: "6208130d-52bc-449e-b371-357b1cc21b22"). InnerVolumeSpecName "v4-0-config-system-ocp-branding-template". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 28 20:43:18 crc kubenswrapper[4746]: I0128 20:43:18.622618 4746 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/6208130d-52bc-449e-b371-357b1cc21b22-v4-0-config-user-template-provider-selection" (OuterVolumeSpecName: "v4-0-config-user-template-provider-selection") pod "6208130d-52bc-449e-b371-357b1cc21b22" (UID: "6208130d-52bc-449e-b371-357b1cc21b22"). InnerVolumeSpecName "v4-0-config-user-template-provider-selection". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 28 20:43:18 crc kubenswrapper[4746]: I0128 20:43:18.623204 4746 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/6208130d-52bc-449e-b371-357b1cc21b22-v4-0-config-user-template-login" (OuterVolumeSpecName: "v4-0-config-user-template-login") pod "6208130d-52bc-449e-b371-357b1cc21b22" (UID: "6208130d-52bc-449e-b371-357b1cc21b22"). InnerVolumeSpecName "v4-0-config-user-template-login". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 28 20:43:18 crc kubenswrapper[4746]: I0128 20:43:18.623477 4746 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/6208130d-52bc-449e-b371-357b1cc21b22-v4-0-config-system-session" (OuterVolumeSpecName: "v4-0-config-system-session") pod "6208130d-52bc-449e-b371-357b1cc21b22" (UID: "6208130d-52bc-449e-b371-357b1cc21b22"). InnerVolumeSpecName "v4-0-config-system-session". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 28 20:43:18 crc kubenswrapper[4746]: I0128 20:43:18.623689 4746 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/6208130d-52bc-449e-b371-357b1cc21b22-v4-0-config-user-idp-0-file-data" (OuterVolumeSpecName: "v4-0-config-user-idp-0-file-data") pod "6208130d-52bc-449e-b371-357b1cc21b22" (UID: "6208130d-52bc-449e-b371-357b1cc21b22"). InnerVolumeSpecName "v4-0-config-user-idp-0-file-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 28 20:43:18 crc kubenswrapper[4746]: I0128 20:43:18.624046 4746 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/6208130d-52bc-449e-b371-357b1cc21b22-v4-0-config-system-serving-cert" (OuterVolumeSpecName: "v4-0-config-system-serving-cert") pod "6208130d-52bc-449e-b371-357b1cc21b22" (UID: "6208130d-52bc-449e-b371-357b1cc21b22"). InnerVolumeSpecName "v4-0-config-system-serving-cert". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 28 20:43:18 crc kubenswrapper[4746]: I0128 20:43:18.710405 4746 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-system-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/6208130d-52bc-449e-b371-357b1cc21b22-v4-0-config-system-trusted-ca-bundle\") on node \"crc\" DevicePath \"\"" Jan 28 20:43:18 crc kubenswrapper[4746]: I0128 20:43:18.710448 4746 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-system-cliconfig\" (UniqueName: \"kubernetes.io/configmap/6208130d-52bc-449e-b371-357b1cc21b22-v4-0-config-system-cliconfig\") on node \"crc\" DevicePath \"\"" Jan 28 20:43:18 crc kubenswrapper[4746]: I0128 20:43:18.710459 4746 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-system-serving-cert\" (UniqueName: \"kubernetes.io/secret/6208130d-52bc-449e-b371-357b1cc21b22-v4-0-config-system-serving-cert\") on node \"crc\" DevicePath \"\"" Jan 28 20:43:18 crc kubenswrapper[4746]: I0128 20:43:18.710471 4746 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-user-template-provider-selection\" (UniqueName: \"kubernetes.io/secret/6208130d-52bc-449e-b371-357b1cc21b22-v4-0-config-user-template-provider-selection\") on node \"crc\" DevicePath \"\"" Jan 28 20:43:18 crc kubenswrapper[4746]: I0128 20:43:18.710485 4746 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-6krvq\" (UniqueName: \"kubernetes.io/projected/6208130d-52bc-449e-b371-357b1cc21b22-kube-api-access-6krvq\") on node \"crc\" DevicePath \"\"" Jan 28 20:43:18 crc kubenswrapper[4746]: I0128 20:43:18.710496 4746 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-system-service-ca\" (UniqueName: \"kubernetes.io/configmap/6208130d-52bc-449e-b371-357b1cc21b22-v4-0-config-system-service-ca\") on node \"crc\" DevicePath \"\"" Jan 28 20:43:18 crc kubenswrapper[4746]: I0128 20:43:18.710509 4746 reconciler_common.go:293] "Volume detached for volume 
\"v4-0-config-user-idp-0-file-data\" (UniqueName: \"kubernetes.io/secret/6208130d-52bc-449e-b371-357b1cc21b22-v4-0-config-user-idp-0-file-data\") on node \"crc\" DevicePath \"\"" Jan 28 20:43:18 crc kubenswrapper[4746]: I0128 20:43:18.710521 4746 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-user-template-error\" (UniqueName: \"kubernetes.io/secret/6208130d-52bc-449e-b371-357b1cc21b22-v4-0-config-user-template-error\") on node \"crc\" DevicePath \"\"" Jan 28 20:43:18 crc kubenswrapper[4746]: I0128 20:43:18.710532 4746 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-user-template-login\" (UniqueName: \"kubernetes.io/secret/6208130d-52bc-449e-b371-357b1cc21b22-v4-0-config-user-template-login\") on node \"crc\" DevicePath \"\"" Jan 28 20:43:18 crc kubenswrapper[4746]: I0128 20:43:18.710543 4746 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-system-ocp-branding-template\" (UniqueName: \"kubernetes.io/secret/6208130d-52bc-449e-b371-357b1cc21b22-v4-0-config-system-ocp-branding-template\") on node \"crc\" DevicePath \"\"" Jan 28 20:43:18 crc kubenswrapper[4746]: I0128 20:43:18.710554 4746 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-system-session\" (UniqueName: \"kubernetes.io/secret/6208130d-52bc-449e-b371-357b1cc21b22-v4-0-config-system-session\") on node \"crc\" DevicePath \"\"" Jan 28 20:43:18 crc kubenswrapper[4746]: I0128 20:43:18.710564 4746 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-system-router-certs\" (UniqueName: \"kubernetes.io/secret/6208130d-52bc-449e-b371-357b1cc21b22-v4-0-config-system-router-certs\") on node \"crc\" DevicePath \"\"" Jan 28 20:43:18 crc kubenswrapper[4746]: I0128 20:43:18.710575 4746 reconciler_common.go:293] "Volume detached for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/6208130d-52bc-449e-b371-357b1cc21b22-audit-policies\") on node \"crc\" DevicePath \"\"" Jan 28 20:43:18 crc kubenswrapper[4746]: I0128 
20:43:18.737443 4746 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-controller-manager/controller-manager-dc8f69579-d5l5z"] Jan 28 20:43:18 crc kubenswrapper[4746]: I0128 20:43:18.738169 4746 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-controller-manager/controller-manager-dc8f69579-d5l5z" podUID="c53d0862-8035-43b1-9b6d-374d70bec983" containerName="controller-manager" containerID="cri-o://7a15e2b0bfca4caf4d1369cede56c806a1e4907df1b641b3f25aae44c7a2b8ae" gracePeriod=30 Jan 28 20:43:18 crc kubenswrapper[4746]: I0128 20:43:18.834357 4746 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-7c874fbf6-kz5dx"] Jan 28 20:43:18 crc kubenswrapper[4746]: I0128 20:43:18.834667 4746 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-route-controller-manager/route-controller-manager-7c874fbf6-kz5dx" podUID="d78c5ee4-f3b6-4203-a658-c64799bbf6ab" containerName="route-controller-manager" containerID="cri-o://273e2a6cd0cae1f380a2631d7e51fbf2897d6f7f32e57f9e0fcc4a40d0fc3091" gracePeriod=30 Jan 28 20:43:18 crc kubenswrapper[4746]: I0128 20:43:18.846873 4746 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="1f42df00-e947-4928-a51f-ddaa3658cc67" path="/var/lib/kubelet/pods/1f42df00-e947-4928-a51f-ddaa3658cc67/volumes" Jan 28 20:43:18 crc kubenswrapper[4746]: I0128 20:43:18.970210 4746 generic.go:334] "Generic (PLEG): container finished" podID="c53d0862-8035-43b1-9b6d-374d70bec983" containerID="7a15e2b0bfca4caf4d1369cede56c806a1e4907df1b641b3f25aae44c7a2b8ae" exitCode=0 Jan 28 20:43:18 crc kubenswrapper[4746]: I0128 20:43:18.970508 4746 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager/controller-manager-dc8f69579-d5l5z" event={"ID":"c53d0862-8035-43b1-9b6d-374d70bec983","Type":"ContainerDied","Data":"7a15e2b0bfca4caf4d1369cede56c806a1e4907df1b641b3f25aae44c7a2b8ae"} Jan 28 
20:43:18 crc kubenswrapper[4746]: I0128 20:43:18.972206 4746 generic.go:334] "Generic (PLEG): container finished" podID="6208130d-52bc-449e-b371-357b1cc21b22" containerID="ab259627a2560868a6fdbf9d2cca4fb91aee52f63f4afc20e810c3c81ba3aca9" exitCode=0 Jan 28 20:43:18 crc kubenswrapper[4746]: I0128 20:43:18.972273 4746 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-authentication/oauth-openshift-558db77b4-4g4p7" event={"ID":"6208130d-52bc-449e-b371-357b1cc21b22","Type":"ContainerDied","Data":"ab259627a2560868a6fdbf9d2cca4fb91aee52f63f4afc20e810c3c81ba3aca9"} Jan 28 20:43:18 crc kubenswrapper[4746]: I0128 20:43:18.972308 4746 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-authentication/oauth-openshift-558db77b4-4g4p7" event={"ID":"6208130d-52bc-449e-b371-357b1cc21b22","Type":"ContainerDied","Data":"a474433d408f6ddbb5883924433e9db5dbc565a20c96add848c696d601d0784f"} Jan 28 20:43:18 crc kubenswrapper[4746]: I0128 20:43:18.972328 4746 scope.go:117] "RemoveContainer" containerID="ab259627a2560868a6fdbf9d2cca4fb91aee52f63f4afc20e810c3c81ba3aca9" Jan 28 20:43:18 crc kubenswrapper[4746]: I0128 20:43:18.972452 4746 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-authentication/oauth-openshift-558db77b4-4g4p7" Jan 28 20:43:19 crc kubenswrapper[4746]: I0128 20:43:19.000565 4746 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-authentication/oauth-openshift-558db77b4-4g4p7"] Jan 28 20:43:19 crc kubenswrapper[4746]: I0128 20:43:19.006388 4746 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-authentication/oauth-openshift-558db77b4-4g4p7"] Jan 28 20:43:19 crc kubenswrapper[4746]: I0128 20:43:19.011518 4746 scope.go:117] "RemoveContainer" containerID="ab259627a2560868a6fdbf9d2cca4fb91aee52f63f4afc20e810c3c81ba3aca9" Jan 28 20:43:19 crc kubenswrapper[4746]: E0128 20:43:19.012157 4746 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"ab259627a2560868a6fdbf9d2cca4fb91aee52f63f4afc20e810c3c81ba3aca9\": container with ID starting with ab259627a2560868a6fdbf9d2cca4fb91aee52f63f4afc20e810c3c81ba3aca9 not found: ID does not exist" containerID="ab259627a2560868a6fdbf9d2cca4fb91aee52f63f4afc20e810c3c81ba3aca9" Jan 28 20:43:19 crc kubenswrapper[4746]: I0128 20:43:19.012202 4746 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"ab259627a2560868a6fdbf9d2cca4fb91aee52f63f4afc20e810c3c81ba3aca9"} err="failed to get container status \"ab259627a2560868a6fdbf9d2cca4fb91aee52f63f4afc20e810c3c81ba3aca9\": rpc error: code = NotFound desc = could not find container \"ab259627a2560868a6fdbf9d2cca4fb91aee52f63f4afc20e810c3c81ba3aca9\": container with ID starting with ab259627a2560868a6fdbf9d2cca4fb91aee52f63f4afc20e810c3c81ba3aca9 not found: ID does not exist" Jan 28 20:43:19 crc kubenswrapper[4746]: I0128 20:43:19.400529 4746 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-7c874fbf6-kz5dx" Jan 28 20:43:19 crc kubenswrapper[4746]: I0128 20:43:19.466248 4746 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-controller-manager/controller-manager-dc8f69579-d5l5z" Jan 28 20:43:19 crc kubenswrapper[4746]: I0128 20:43:19.524485 4746 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/d78c5ee4-f3b6-4203-a658-c64799bbf6ab-serving-cert\") pod \"d78c5ee4-f3b6-4203-a658-c64799bbf6ab\" (UID: \"d78c5ee4-f3b6-4203-a658-c64799bbf6ab\") " Jan 28 20:43:19 crc kubenswrapper[4746]: I0128 20:43:19.524596 4746 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-gjtmf\" (UniqueName: \"kubernetes.io/projected/d78c5ee4-f3b6-4203-a658-c64799bbf6ab-kube-api-access-gjtmf\") pod \"d78c5ee4-f3b6-4203-a658-c64799bbf6ab\" (UID: \"d78c5ee4-f3b6-4203-a658-c64799bbf6ab\") " Jan 28 20:43:19 crc kubenswrapper[4746]: I0128 20:43:19.524709 4746 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/d78c5ee4-f3b6-4203-a658-c64799bbf6ab-config\") pod \"d78c5ee4-f3b6-4203-a658-c64799bbf6ab\" (UID: \"d78c5ee4-f3b6-4203-a658-c64799bbf6ab\") " Jan 28 20:43:19 crc kubenswrapper[4746]: I0128 20:43:19.524793 4746 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/d78c5ee4-f3b6-4203-a658-c64799bbf6ab-client-ca\") pod \"d78c5ee4-f3b6-4203-a658-c64799bbf6ab\" (UID: \"d78c5ee4-f3b6-4203-a658-c64799bbf6ab\") " Jan 28 20:43:19 crc kubenswrapper[4746]: I0128 20:43:19.525702 4746 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/d78c5ee4-f3b6-4203-a658-c64799bbf6ab-client-ca" (OuterVolumeSpecName: "client-ca") pod 
"d78c5ee4-f3b6-4203-a658-c64799bbf6ab" (UID: "d78c5ee4-f3b6-4203-a658-c64799bbf6ab"). InnerVolumeSpecName "client-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 28 20:43:19 crc kubenswrapper[4746]: I0128 20:43:19.525749 4746 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/d78c5ee4-f3b6-4203-a658-c64799bbf6ab-config" (OuterVolumeSpecName: "config") pod "d78c5ee4-f3b6-4203-a658-c64799bbf6ab" (UID: "d78c5ee4-f3b6-4203-a658-c64799bbf6ab"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 28 20:43:19 crc kubenswrapper[4746]: I0128 20:43:19.531073 4746 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/d78c5ee4-f3b6-4203-a658-c64799bbf6ab-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "d78c5ee4-f3b6-4203-a658-c64799bbf6ab" (UID: "d78c5ee4-f3b6-4203-a658-c64799bbf6ab"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 28 20:43:19 crc kubenswrapper[4746]: I0128 20:43:19.531341 4746 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/d78c5ee4-f3b6-4203-a658-c64799bbf6ab-kube-api-access-gjtmf" (OuterVolumeSpecName: "kube-api-access-gjtmf") pod "d78c5ee4-f3b6-4203-a658-c64799bbf6ab" (UID: "d78c5ee4-f3b6-4203-a658-c64799bbf6ab"). InnerVolumeSpecName "kube-api-access-gjtmf". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 28 20:43:19 crc kubenswrapper[4746]: I0128 20:43:19.626329 4746 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/c53d0862-8035-43b1-9b6d-374d70bec983-client-ca\") pod \"c53d0862-8035-43b1-9b6d-374d70bec983\" (UID: \"c53d0862-8035-43b1-9b6d-374d70bec983\") " Jan 28 20:43:19 crc kubenswrapper[4746]: I0128 20:43:19.626419 4746 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/c53d0862-8035-43b1-9b6d-374d70bec983-proxy-ca-bundles\") pod \"c53d0862-8035-43b1-9b6d-374d70bec983\" (UID: \"c53d0862-8035-43b1-9b6d-374d70bec983\") " Jan 28 20:43:19 crc kubenswrapper[4746]: I0128 20:43:19.626517 4746 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-r4cv7\" (UniqueName: \"kubernetes.io/projected/c53d0862-8035-43b1-9b6d-374d70bec983-kube-api-access-r4cv7\") pod \"c53d0862-8035-43b1-9b6d-374d70bec983\" (UID: \"c53d0862-8035-43b1-9b6d-374d70bec983\") " Jan 28 20:43:19 crc kubenswrapper[4746]: I0128 20:43:19.626537 4746 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/c53d0862-8035-43b1-9b6d-374d70bec983-serving-cert\") pod \"c53d0862-8035-43b1-9b6d-374d70bec983\" (UID: \"c53d0862-8035-43b1-9b6d-374d70bec983\") " Jan 28 20:43:19 crc kubenswrapper[4746]: I0128 20:43:19.626581 4746 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/c53d0862-8035-43b1-9b6d-374d70bec983-config\") pod \"c53d0862-8035-43b1-9b6d-374d70bec983\" (UID: \"c53d0862-8035-43b1-9b6d-374d70bec983\") " Jan 28 20:43:19 crc kubenswrapper[4746]: I0128 20:43:19.626805 4746 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: 
\"kubernetes.io/secret/d78c5ee4-f3b6-4203-a658-c64799bbf6ab-serving-cert\") on node \"crc\" DevicePath \"\"" Jan 28 20:43:19 crc kubenswrapper[4746]: I0128 20:43:19.626817 4746 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-gjtmf\" (UniqueName: \"kubernetes.io/projected/d78c5ee4-f3b6-4203-a658-c64799bbf6ab-kube-api-access-gjtmf\") on node \"crc\" DevicePath \"\"" Jan 28 20:43:19 crc kubenswrapper[4746]: I0128 20:43:19.626828 4746 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/d78c5ee4-f3b6-4203-a658-c64799bbf6ab-config\") on node \"crc\" DevicePath \"\"" Jan 28 20:43:19 crc kubenswrapper[4746]: I0128 20:43:19.626838 4746 reconciler_common.go:293] "Volume detached for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/d78c5ee4-f3b6-4203-a658-c64799bbf6ab-client-ca\") on node \"crc\" DevicePath \"\"" Jan 28 20:43:19 crc kubenswrapper[4746]: I0128 20:43:19.627895 4746 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/c53d0862-8035-43b1-9b6d-374d70bec983-proxy-ca-bundles" (OuterVolumeSpecName: "proxy-ca-bundles") pod "c53d0862-8035-43b1-9b6d-374d70bec983" (UID: "c53d0862-8035-43b1-9b6d-374d70bec983"). InnerVolumeSpecName "proxy-ca-bundles". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 28 20:43:19 crc kubenswrapper[4746]: I0128 20:43:19.628186 4746 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/c53d0862-8035-43b1-9b6d-374d70bec983-config" (OuterVolumeSpecName: "config") pod "c53d0862-8035-43b1-9b6d-374d70bec983" (UID: "c53d0862-8035-43b1-9b6d-374d70bec983"). InnerVolumeSpecName "config". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 28 20:43:19 crc kubenswrapper[4746]: I0128 20:43:19.628440 4746 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/c53d0862-8035-43b1-9b6d-374d70bec983-client-ca" (OuterVolumeSpecName: "client-ca") pod "c53d0862-8035-43b1-9b6d-374d70bec983" (UID: "c53d0862-8035-43b1-9b6d-374d70bec983"). InnerVolumeSpecName "client-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 28 20:43:19 crc kubenswrapper[4746]: I0128 20:43:19.630440 4746 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/c53d0862-8035-43b1-9b6d-374d70bec983-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "c53d0862-8035-43b1-9b6d-374d70bec983" (UID: "c53d0862-8035-43b1-9b6d-374d70bec983"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 28 20:43:19 crc kubenswrapper[4746]: I0128 20:43:19.631253 4746 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/c53d0862-8035-43b1-9b6d-374d70bec983-kube-api-access-r4cv7" (OuterVolumeSpecName: "kube-api-access-r4cv7") pod "c53d0862-8035-43b1-9b6d-374d70bec983" (UID: "c53d0862-8035-43b1-9b6d-374d70bec983"). InnerVolumeSpecName "kube-api-access-r4cv7". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 28 20:43:19 crc kubenswrapper[4746]: I0128 20:43:19.728217 4746 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-r4cv7\" (UniqueName: \"kubernetes.io/projected/c53d0862-8035-43b1-9b6d-374d70bec983-kube-api-access-r4cv7\") on node \"crc\" DevicePath \"\"" Jan 28 20:43:19 crc kubenswrapper[4746]: I0128 20:43:19.728277 4746 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/c53d0862-8035-43b1-9b6d-374d70bec983-serving-cert\") on node \"crc\" DevicePath \"\"" Jan 28 20:43:19 crc kubenswrapper[4746]: I0128 20:43:19.728289 4746 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/c53d0862-8035-43b1-9b6d-374d70bec983-config\") on node \"crc\" DevicePath \"\"" Jan 28 20:43:19 crc kubenswrapper[4746]: I0128 20:43:19.728298 4746 reconciler_common.go:293] "Volume detached for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/c53d0862-8035-43b1-9b6d-374d70bec983-client-ca\") on node \"crc\" DevicePath \"\"" Jan 28 20:43:19 crc kubenswrapper[4746]: I0128 20:43:19.728308 4746 reconciler_common.go:293] "Volume detached for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/c53d0862-8035-43b1-9b6d-374d70bec983-proxy-ca-bundles\") on node \"crc\" DevicePath \"\"" Jan 28 20:43:19 crc kubenswrapper[4746]: I0128 20:43:19.989058 4746 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-controller-manager/controller-manager-dc8f69579-d5l5z" Jan 28 20:43:19 crc kubenswrapper[4746]: I0128 20:43:19.989048 4746 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager/controller-manager-dc8f69579-d5l5z" event={"ID":"c53d0862-8035-43b1-9b6d-374d70bec983","Type":"ContainerDied","Data":"503feb6548cedc39815cef9ce48468edaf90a7b018da506af8f948803d6a6f8e"} Jan 28 20:43:19 crc kubenswrapper[4746]: I0128 20:43:19.989353 4746 scope.go:117] "RemoveContainer" containerID="7a15e2b0bfca4caf4d1369cede56c806a1e4907df1b641b3f25aae44c7a2b8ae" Jan 28 20:43:20 crc kubenswrapper[4746]: I0128 20:43:19.993225 4746 generic.go:334] "Generic (PLEG): container finished" podID="d78c5ee4-f3b6-4203-a658-c64799bbf6ab" containerID="273e2a6cd0cae1f380a2631d7e51fbf2897d6f7f32e57f9e0fcc4a40d0fc3091" exitCode=0 Jan 28 20:43:20 crc kubenswrapper[4746]: I0128 20:43:19.993341 4746 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-7c874fbf6-kz5dx" Jan 28 20:43:20 crc kubenswrapper[4746]: I0128 20:43:19.993377 4746 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-route-controller-manager/route-controller-manager-7c874fbf6-kz5dx" event={"ID":"d78c5ee4-f3b6-4203-a658-c64799bbf6ab","Type":"ContainerDied","Data":"273e2a6cd0cae1f380a2631d7e51fbf2897d6f7f32e57f9e0fcc4a40d0fc3091"} Jan 28 20:43:20 crc kubenswrapper[4746]: I0128 20:43:19.993416 4746 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-route-controller-manager/route-controller-manager-7c874fbf6-kz5dx" event={"ID":"d78c5ee4-f3b6-4203-a658-c64799bbf6ab","Type":"ContainerDied","Data":"12092dd133976d5b100a05868e21a0c2938ad0e8f5bb5802a3fb8afe3e28dbf0"} Jan 28 20:43:20 crc kubenswrapper[4746]: I0128 20:43:20.001961 4746 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-controller-manager/controller-manager-78b9587cd6-5jh8p"] Jan 28 20:43:20 crc 
kubenswrapper[4746]: E0128 20:43:20.002544 4746 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f8f220cb-71e1-4b97-960e-ef8742661130" containerName="extract-content" Jan 28 20:43:20 crc kubenswrapper[4746]: I0128 20:43:20.002592 4746 state_mem.go:107] "Deleted CPUSet assignment" podUID="f8f220cb-71e1-4b97-960e-ef8742661130" containerName="extract-content" Jan 28 20:43:20 crc kubenswrapper[4746]: E0128 20:43:20.002626 4746 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="8f41ddf2-d97e-4e32-ac3d-48850a4d47ee" containerName="registry-server" Jan 28 20:43:20 crc kubenswrapper[4746]: I0128 20:43:20.002644 4746 state_mem.go:107] "Deleted CPUSet assignment" podUID="8f41ddf2-d97e-4e32-ac3d-48850a4d47ee" containerName="registry-server" Jan 28 20:43:20 crc kubenswrapper[4746]: E0128 20:43:20.002664 4746 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="d78c5ee4-f3b6-4203-a658-c64799bbf6ab" containerName="route-controller-manager" Jan 28 20:43:20 crc kubenswrapper[4746]: I0128 20:43:20.002680 4746 state_mem.go:107] "Deleted CPUSet assignment" podUID="d78c5ee4-f3b6-4203-a658-c64799bbf6ab" containerName="route-controller-manager" Jan 28 20:43:20 crc kubenswrapper[4746]: E0128 20:43:20.002709 4746 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="1f42df00-e947-4928-a51f-ddaa3658cc67" containerName="registry-server" Jan 28 20:43:20 crc kubenswrapper[4746]: I0128 20:43:20.002726 4746 state_mem.go:107] "Deleted CPUSet assignment" podUID="1f42df00-e947-4928-a51f-ddaa3658cc67" containerName="registry-server" Jan 28 20:43:20 crc kubenswrapper[4746]: E0128 20:43:20.002753 4746 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="6208130d-52bc-449e-b371-357b1cc21b22" containerName="oauth-openshift" Jan 28 20:43:20 crc kubenswrapper[4746]: I0128 20:43:20.002769 4746 state_mem.go:107] "Deleted CPUSet assignment" podUID="6208130d-52bc-449e-b371-357b1cc21b22" containerName="oauth-openshift" Jan 28 20:43:20 
crc kubenswrapper[4746]: E0128 20:43:20.002792 4746 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="c53d0862-8035-43b1-9b6d-374d70bec983" containerName="controller-manager" Jan 28 20:43:20 crc kubenswrapper[4746]: I0128 20:43:20.002807 4746 state_mem.go:107] "Deleted CPUSet assignment" podUID="c53d0862-8035-43b1-9b6d-374d70bec983" containerName="controller-manager" Jan 28 20:43:20 crc kubenswrapper[4746]: E0128 20:43:20.002827 4746 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="abe404c6-f1c8-4ad6-92b9-7c082b112b50" containerName="extract-content" Jan 28 20:43:20 crc kubenswrapper[4746]: I0128 20:43:20.002879 4746 state_mem.go:107] "Deleted CPUSet assignment" podUID="abe404c6-f1c8-4ad6-92b9-7c082b112b50" containerName="extract-content" Jan 28 20:43:20 crc kubenswrapper[4746]: E0128 20:43:20.002899 4746 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="abe404c6-f1c8-4ad6-92b9-7c082b112b50" containerName="registry-server" Jan 28 20:43:20 crc kubenswrapper[4746]: I0128 20:43:20.002917 4746 state_mem.go:107] "Deleted CPUSet assignment" podUID="abe404c6-f1c8-4ad6-92b9-7c082b112b50" containerName="registry-server" Jan 28 20:43:20 crc kubenswrapper[4746]: E0128 20:43:20.002943 4746 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="abe404c6-f1c8-4ad6-92b9-7c082b112b50" containerName="extract-utilities" Jan 28 20:43:20 crc kubenswrapper[4746]: I0128 20:43:20.002963 4746 state_mem.go:107] "Deleted CPUSet assignment" podUID="abe404c6-f1c8-4ad6-92b9-7c082b112b50" containerName="extract-utilities" Jan 28 20:43:20 crc kubenswrapper[4746]: E0128 20:43:20.002993 4746 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="1f42df00-e947-4928-a51f-ddaa3658cc67" containerName="extract-utilities" Jan 28 20:43:20 crc kubenswrapper[4746]: I0128 20:43:20.003010 4746 state_mem.go:107] "Deleted CPUSet assignment" podUID="1f42df00-e947-4928-a51f-ddaa3658cc67" containerName="extract-utilities" Jan 28 20:43:20 
crc kubenswrapper[4746]: E0128 20:43:20.003026 4746 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="8f41ddf2-d97e-4e32-ac3d-48850a4d47ee" containerName="extract-utilities" Jan 28 20:43:20 crc kubenswrapper[4746]: I0128 20:43:20.003044 4746 state_mem.go:107] "Deleted CPUSet assignment" podUID="8f41ddf2-d97e-4e32-ac3d-48850a4d47ee" containerName="extract-utilities" Jan 28 20:43:20 crc kubenswrapper[4746]: E0128 20:43:20.003061 4746 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="8f41ddf2-d97e-4e32-ac3d-48850a4d47ee" containerName="extract-content" Jan 28 20:43:20 crc kubenswrapper[4746]: I0128 20:43:20.003111 4746 state_mem.go:107] "Deleted CPUSet assignment" podUID="8f41ddf2-d97e-4e32-ac3d-48850a4d47ee" containerName="extract-content" Jan 28 20:43:20 crc kubenswrapper[4746]: E0128 20:43:20.003135 4746 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f8f220cb-71e1-4b97-960e-ef8742661130" containerName="extract-utilities" Jan 28 20:43:20 crc kubenswrapper[4746]: I0128 20:43:20.003152 4746 state_mem.go:107] "Deleted CPUSet assignment" podUID="f8f220cb-71e1-4b97-960e-ef8742661130" containerName="extract-utilities" Jan 28 20:43:20 crc kubenswrapper[4746]: E0128 20:43:20.003175 4746 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f8f220cb-71e1-4b97-960e-ef8742661130" containerName="registry-server" Jan 28 20:43:20 crc kubenswrapper[4746]: I0128 20:43:20.003190 4746 state_mem.go:107] "Deleted CPUSet assignment" podUID="f8f220cb-71e1-4b97-960e-ef8742661130" containerName="registry-server" Jan 28 20:43:20 crc kubenswrapper[4746]: E0128 20:43:20.003222 4746 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="1f42df00-e947-4928-a51f-ddaa3658cc67" containerName="extract-content" Jan 28 20:43:20 crc kubenswrapper[4746]: I0128 20:43:20.003240 4746 state_mem.go:107] "Deleted CPUSet assignment" podUID="1f42df00-e947-4928-a51f-ddaa3658cc67" containerName="extract-content" Jan 28 20:43:20 crc 
kubenswrapper[4746]: I0128 20:43:20.003462 4746 memory_manager.go:354] "RemoveStaleState removing state" podUID="1f42df00-e947-4928-a51f-ddaa3658cc67" containerName="registry-server" Jan 28 20:43:20 crc kubenswrapper[4746]: I0128 20:43:20.003483 4746 memory_manager.go:354] "RemoveStaleState removing state" podUID="8f41ddf2-d97e-4e32-ac3d-48850a4d47ee" containerName="registry-server" Jan 28 20:43:20 crc kubenswrapper[4746]: I0128 20:43:20.003511 4746 memory_manager.go:354] "RemoveStaleState removing state" podUID="6208130d-52bc-449e-b371-357b1cc21b22" containerName="oauth-openshift" Jan 28 20:43:20 crc kubenswrapper[4746]: I0128 20:43:20.003536 4746 memory_manager.go:354] "RemoveStaleState removing state" podUID="f8f220cb-71e1-4b97-960e-ef8742661130" containerName="registry-server" Jan 28 20:43:20 crc kubenswrapper[4746]: I0128 20:43:20.003560 4746 memory_manager.go:354] "RemoveStaleState removing state" podUID="abe404c6-f1c8-4ad6-92b9-7c082b112b50" containerName="registry-server" Jan 28 20:43:20 crc kubenswrapper[4746]: I0128 20:43:20.003584 4746 memory_manager.go:354] "RemoveStaleState removing state" podUID="c53d0862-8035-43b1-9b6d-374d70bec983" containerName="controller-manager" Jan 28 20:43:20 crc kubenswrapper[4746]: I0128 20:43:20.003602 4746 memory_manager.go:354] "RemoveStaleState removing state" podUID="d78c5ee4-f3b6-4203-a658-c64799bbf6ab" containerName="route-controller-manager" Jan 28 20:43:20 crc kubenswrapper[4746]: I0128 20:43:20.004566 4746 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-controller-manager/controller-manager-78b9587cd6-5jh8p" Jan 28 20:43:20 crc kubenswrapper[4746]: I0128 20:43:20.008233 4746 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-route-controller-manager/route-controller-manager-5d7856c4bb-d95sg"] Jan 28 20:43:20 crc kubenswrapper[4746]: I0128 20:43:20.009161 4746 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"openshift-service-ca.crt" Jan 28 20:43:20 crc kubenswrapper[4746]: I0128 20:43:20.009424 4746 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-5d7856c4bb-d95sg" Jan 28 20:43:20 crc kubenswrapper[4746]: I0128 20:43:20.009856 4746 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"config" Jan 28 20:43:20 crc kubenswrapper[4746]: I0128 20:43:20.010357 4746 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-controller-manager"/"serving-cert" Jan 28 20:43:20 crc kubenswrapper[4746]: I0128 20:43:20.010502 4746 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"client-ca" Jan 28 20:43:20 crc kubenswrapper[4746]: I0128 20:43:20.011162 4746 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-controller-manager"/"openshift-controller-manager-sa-dockercfg-msq4c" Jan 28 20:43:20 crc kubenswrapper[4746]: I0128 20:43:20.011411 4746 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"openshift-service-ca.crt" Jan 28 20:43:20 crc kubenswrapper[4746]: I0128 20:43:20.013758 4746 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"config" Jan 28 20:43:20 crc kubenswrapper[4746]: I0128 20:43:20.014054 4746 reflector.go:368] Caches populated for *v1.Secret from 
object-"openshift-route-controller-manager"/"route-controller-manager-sa-dockercfg-h2zr2" Jan 28 20:43:20 crc kubenswrapper[4746]: I0128 20:43:20.014379 4746 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"kube-root-ca.crt" Jan 28 20:43:20 crc kubenswrapper[4746]: I0128 20:43:20.014977 4746 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-route-controller-manager"/"serving-cert" Jan 28 20:43:20 crc kubenswrapper[4746]: I0128 20:43:20.015181 4746 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"client-ca" Jan 28 20:43:20 crc kubenswrapper[4746]: I0128 20:43:20.015792 4746 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"kube-root-ca.crt" Jan 28 20:43:20 crc kubenswrapper[4746]: I0128 20:43:20.022804 4746 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"openshift-global-ca" Jan 28 20:43:20 crc kubenswrapper[4746]: I0128 20:43:20.024816 4746 scope.go:117] "RemoveContainer" containerID="273e2a6cd0cae1f380a2631d7e51fbf2897d6f7f32e57f9e0fcc4a40d0fc3091" Jan 28 20:43:20 crc kubenswrapper[4746]: I0128 20:43:20.036538 4746 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-controller-manager/controller-manager-78b9587cd6-5jh8p"] Jan 28 20:43:20 crc kubenswrapper[4746]: I0128 20:43:20.047150 4746 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-5d7856c4bb-d95sg"] Jan 28 20:43:20 crc kubenswrapper[4746]: I0128 20:43:20.120674 4746 scope.go:117] "RemoveContainer" containerID="273e2a6cd0cae1f380a2631d7e51fbf2897d6f7f32e57f9e0fcc4a40d0fc3091" Jan 28 20:43:20 crc kubenswrapper[4746]: E0128 20:43:20.121339 4746 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container 
\"273e2a6cd0cae1f380a2631d7e51fbf2897d6f7f32e57f9e0fcc4a40d0fc3091\": container with ID starting with 273e2a6cd0cae1f380a2631d7e51fbf2897d6f7f32e57f9e0fcc4a40d0fc3091 not found: ID does not exist" containerID="273e2a6cd0cae1f380a2631d7e51fbf2897d6f7f32e57f9e0fcc4a40d0fc3091" Jan 28 20:43:20 crc kubenswrapper[4746]: I0128 20:43:20.121387 4746 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"273e2a6cd0cae1f380a2631d7e51fbf2897d6f7f32e57f9e0fcc4a40d0fc3091"} err="failed to get container status \"273e2a6cd0cae1f380a2631d7e51fbf2897d6f7f32e57f9e0fcc4a40d0fc3091\": rpc error: code = NotFound desc = could not find container \"273e2a6cd0cae1f380a2631d7e51fbf2897d6f7f32e57f9e0fcc4a40d0fc3091\": container with ID starting with 273e2a6cd0cae1f380a2631d7e51fbf2897d6f7f32e57f9e0fcc4a40d0fc3091 not found: ID does not exist" Jan 28 20:43:20 crc kubenswrapper[4746]: I0128 20:43:20.137900 4746 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/1837e217-cd33-4aa4-8c46-6f344a75aded-client-ca\") pod \"route-controller-manager-5d7856c4bb-d95sg\" (UID: \"1837e217-cd33-4aa4-8c46-6f344a75aded\") " pod="openshift-route-controller-manager/route-controller-manager-5d7856c4bb-d95sg" Jan 28 20:43:20 crc kubenswrapper[4746]: I0128 20:43:20.137951 4746 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/ffced8b3-c5f6-4e6b-9848-0fa63a34b1fa-proxy-ca-bundles\") pod \"controller-manager-78b9587cd6-5jh8p\" (UID: \"ffced8b3-c5f6-4e6b-9848-0fa63a34b1fa\") " pod="openshift-controller-manager/controller-manager-78b9587cd6-5jh8p" Jan 28 20:43:20 crc kubenswrapper[4746]: I0128 20:43:20.137981 4746 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"client-ca\" (UniqueName: 
\"kubernetes.io/configmap/ffced8b3-c5f6-4e6b-9848-0fa63a34b1fa-client-ca\") pod \"controller-manager-78b9587cd6-5jh8p\" (UID: \"ffced8b3-c5f6-4e6b-9848-0fa63a34b1fa\") " pod="openshift-controller-manager/controller-manager-78b9587cd6-5jh8p" Jan 28 20:43:20 crc kubenswrapper[4746]: I0128 20:43:20.138011 4746 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/1837e217-cd33-4aa4-8c46-6f344a75aded-serving-cert\") pod \"route-controller-manager-5d7856c4bb-d95sg\" (UID: \"1837e217-cd33-4aa4-8c46-6f344a75aded\") " pod="openshift-route-controller-manager/route-controller-manager-5d7856c4bb-d95sg" Jan 28 20:43:20 crc kubenswrapper[4746]: I0128 20:43:20.138033 4746 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/1837e217-cd33-4aa4-8c46-6f344a75aded-config\") pod \"route-controller-manager-5d7856c4bb-d95sg\" (UID: \"1837e217-cd33-4aa4-8c46-6f344a75aded\") " pod="openshift-route-controller-manager/route-controller-manager-5d7856c4bb-d95sg" Jan 28 20:43:20 crc kubenswrapper[4746]: I0128 20:43:20.138058 4746 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-pmffh\" (UniqueName: \"kubernetes.io/projected/ffced8b3-c5f6-4e6b-9848-0fa63a34b1fa-kube-api-access-pmffh\") pod \"controller-manager-78b9587cd6-5jh8p\" (UID: \"ffced8b3-c5f6-4e6b-9848-0fa63a34b1fa\") " pod="openshift-controller-manager/controller-manager-78b9587cd6-5jh8p" Jan 28 20:43:20 crc kubenswrapper[4746]: I0128 20:43:20.138096 4746 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-dlwc2\" (UniqueName: \"kubernetes.io/projected/1837e217-cd33-4aa4-8c46-6f344a75aded-kube-api-access-dlwc2\") pod \"route-controller-manager-5d7856c4bb-d95sg\" (UID: \"1837e217-cd33-4aa4-8c46-6f344a75aded\") " 
pod="openshift-route-controller-manager/route-controller-manager-5d7856c4bb-d95sg" Jan 28 20:43:20 crc kubenswrapper[4746]: I0128 20:43:20.138114 4746 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/ffced8b3-c5f6-4e6b-9848-0fa63a34b1fa-serving-cert\") pod \"controller-manager-78b9587cd6-5jh8p\" (UID: \"ffced8b3-c5f6-4e6b-9848-0fa63a34b1fa\") " pod="openshift-controller-manager/controller-manager-78b9587cd6-5jh8p" Jan 28 20:43:20 crc kubenswrapper[4746]: I0128 20:43:20.138154 4746 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/ffced8b3-c5f6-4e6b-9848-0fa63a34b1fa-config\") pod \"controller-manager-78b9587cd6-5jh8p\" (UID: \"ffced8b3-c5f6-4e6b-9848-0fa63a34b1fa\") " pod="openshift-controller-manager/controller-manager-78b9587cd6-5jh8p" Jan 28 20:43:20 crc kubenswrapper[4746]: I0128 20:43:20.138683 4746 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-7c874fbf6-kz5dx"] Jan 28 20:43:20 crc kubenswrapper[4746]: I0128 20:43:20.141536 4746 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-7c874fbf6-kz5dx"] Jan 28 20:43:20 crc kubenswrapper[4746]: I0128 20:43:20.147858 4746 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-controller-manager/controller-manager-dc8f69579-d5l5z"] Jan 28 20:43:20 crc kubenswrapper[4746]: I0128 20:43:20.161133 4746 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-controller-manager/controller-manager-dc8f69579-d5l5z"] Jan 28 20:43:20 crc kubenswrapper[4746]: I0128 20:43:20.239720 4746 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/1837e217-cd33-4aa4-8c46-6f344a75aded-serving-cert\") pod 
\"route-controller-manager-5d7856c4bb-d95sg\" (UID: \"1837e217-cd33-4aa4-8c46-6f344a75aded\") " pod="openshift-route-controller-manager/route-controller-manager-5d7856c4bb-d95sg"
Jan 28 20:43:20 crc kubenswrapper[4746]: I0128 20:43:20.239810 4746 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/1837e217-cd33-4aa4-8c46-6f344a75aded-config\") pod \"route-controller-manager-5d7856c4bb-d95sg\" (UID: \"1837e217-cd33-4aa4-8c46-6f344a75aded\") " pod="openshift-route-controller-manager/route-controller-manager-5d7856c4bb-d95sg"
Jan 28 20:43:20 crc kubenswrapper[4746]: I0128 20:43:20.239873 4746 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-pmffh\" (UniqueName: \"kubernetes.io/projected/ffced8b3-c5f6-4e6b-9848-0fa63a34b1fa-kube-api-access-pmffh\") pod \"controller-manager-78b9587cd6-5jh8p\" (UID: \"ffced8b3-c5f6-4e6b-9848-0fa63a34b1fa\") " pod="openshift-controller-manager/controller-manager-78b9587cd6-5jh8p"
Jan 28 20:43:20 crc kubenswrapper[4746]: I0128 20:43:20.239918 4746 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-dlwc2\" (UniqueName: \"kubernetes.io/projected/1837e217-cd33-4aa4-8c46-6f344a75aded-kube-api-access-dlwc2\") pod \"route-controller-manager-5d7856c4bb-d95sg\" (UID: \"1837e217-cd33-4aa4-8c46-6f344a75aded\") " pod="openshift-route-controller-manager/route-controller-manager-5d7856c4bb-d95sg"
Jan 28 20:43:20 crc kubenswrapper[4746]: I0128 20:43:20.239957 4746 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/ffced8b3-c5f6-4e6b-9848-0fa63a34b1fa-serving-cert\") pod \"controller-manager-78b9587cd6-5jh8p\" (UID: \"ffced8b3-c5f6-4e6b-9848-0fa63a34b1fa\") " pod="openshift-controller-manager/controller-manager-78b9587cd6-5jh8p"
Jan 28 20:43:20 crc kubenswrapper[4746]: I0128 20:43:20.240034 4746 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/ffced8b3-c5f6-4e6b-9848-0fa63a34b1fa-config\") pod \"controller-manager-78b9587cd6-5jh8p\" (UID: \"ffced8b3-c5f6-4e6b-9848-0fa63a34b1fa\") " pod="openshift-controller-manager/controller-manager-78b9587cd6-5jh8p"
Jan 28 20:43:20 crc kubenswrapper[4746]: I0128 20:43:20.240070 4746 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/1837e217-cd33-4aa4-8c46-6f344a75aded-client-ca\") pod \"route-controller-manager-5d7856c4bb-d95sg\" (UID: \"1837e217-cd33-4aa4-8c46-6f344a75aded\") " pod="openshift-route-controller-manager/route-controller-manager-5d7856c4bb-d95sg"
Jan 28 20:43:20 crc kubenswrapper[4746]: I0128 20:43:20.240165 4746 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/ffced8b3-c5f6-4e6b-9848-0fa63a34b1fa-proxy-ca-bundles\") pod \"controller-manager-78b9587cd6-5jh8p\" (UID: \"ffced8b3-c5f6-4e6b-9848-0fa63a34b1fa\") " pod="openshift-controller-manager/controller-manager-78b9587cd6-5jh8p"
Jan 28 20:43:20 crc kubenswrapper[4746]: I0128 20:43:20.240235 4746 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/ffced8b3-c5f6-4e6b-9848-0fa63a34b1fa-client-ca\") pod \"controller-manager-78b9587cd6-5jh8p\" (UID: \"ffced8b3-c5f6-4e6b-9848-0fa63a34b1fa\") " pod="openshift-controller-manager/controller-manager-78b9587cd6-5jh8p"
Jan 28 20:43:20 crc kubenswrapper[4746]: I0128 20:43:20.241908 4746 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/1837e217-cd33-4aa4-8c46-6f344a75aded-client-ca\") pod \"route-controller-manager-5d7856c4bb-d95sg\" (UID: \"1837e217-cd33-4aa4-8c46-6f344a75aded\") " pod="openshift-route-controller-manager/route-controller-manager-5d7856c4bb-d95sg"
Jan 28 20:43:20 crc kubenswrapper[4746]: I0128 20:43:20.242505 4746 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/ffced8b3-c5f6-4e6b-9848-0fa63a34b1fa-proxy-ca-bundles\") pod \"controller-manager-78b9587cd6-5jh8p\" (UID: \"ffced8b3-c5f6-4e6b-9848-0fa63a34b1fa\") " pod="openshift-controller-manager/controller-manager-78b9587cd6-5jh8p"
Jan 28 20:43:20 crc kubenswrapper[4746]: I0128 20:43:20.242704 4746 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/1837e217-cd33-4aa4-8c46-6f344a75aded-config\") pod \"route-controller-manager-5d7856c4bb-d95sg\" (UID: \"1837e217-cd33-4aa4-8c46-6f344a75aded\") " pod="openshift-route-controller-manager/route-controller-manager-5d7856c4bb-d95sg"
Jan 28 20:43:20 crc kubenswrapper[4746]: I0128 20:43:20.243296 4746 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/ffced8b3-c5f6-4e6b-9848-0fa63a34b1fa-client-ca\") pod \"controller-manager-78b9587cd6-5jh8p\" (UID: \"ffced8b3-c5f6-4e6b-9848-0fa63a34b1fa\") " pod="openshift-controller-manager/controller-manager-78b9587cd6-5jh8p"
Jan 28 20:43:20 crc kubenswrapper[4746]: I0128 20:43:20.244606 4746 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/ffced8b3-c5f6-4e6b-9848-0fa63a34b1fa-config\") pod \"controller-manager-78b9587cd6-5jh8p\" (UID: \"ffced8b3-c5f6-4e6b-9848-0fa63a34b1fa\") " pod="openshift-controller-manager/controller-manager-78b9587cd6-5jh8p"
Jan 28 20:43:20 crc kubenswrapper[4746]: I0128 20:43:20.245954 4746 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/1837e217-cd33-4aa4-8c46-6f344a75aded-serving-cert\") pod \"route-controller-manager-5d7856c4bb-d95sg\" (UID: \"1837e217-cd33-4aa4-8c46-6f344a75aded\") " pod="openshift-route-controller-manager/route-controller-manager-5d7856c4bb-d95sg"
Jan 28 20:43:20 crc kubenswrapper[4746]: I0128 20:43:20.247628 4746 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/ffced8b3-c5f6-4e6b-9848-0fa63a34b1fa-serving-cert\") pod \"controller-manager-78b9587cd6-5jh8p\" (UID: \"ffced8b3-c5f6-4e6b-9848-0fa63a34b1fa\") " pod="openshift-controller-manager/controller-manager-78b9587cd6-5jh8p"
Jan 28 20:43:20 crc kubenswrapper[4746]: I0128 20:43:20.262735 4746 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-pmffh\" (UniqueName: \"kubernetes.io/projected/ffced8b3-c5f6-4e6b-9848-0fa63a34b1fa-kube-api-access-pmffh\") pod \"controller-manager-78b9587cd6-5jh8p\" (UID: \"ffced8b3-c5f6-4e6b-9848-0fa63a34b1fa\") " pod="openshift-controller-manager/controller-manager-78b9587cd6-5jh8p"
Jan 28 20:43:20 crc kubenswrapper[4746]: I0128 20:43:20.263903 4746 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-dlwc2\" (UniqueName: \"kubernetes.io/projected/1837e217-cd33-4aa4-8c46-6f344a75aded-kube-api-access-dlwc2\") pod \"route-controller-manager-5d7856c4bb-d95sg\" (UID: \"1837e217-cd33-4aa4-8c46-6f344a75aded\") " pod="openshift-route-controller-manager/route-controller-manager-5d7856c4bb-d95sg"
Jan 28 20:43:20 crc kubenswrapper[4746]: I0128 20:43:20.282667 4746 patch_prober.go:28] interesting pod/controller-manager-dc8f69579-d5l5z container/controller-manager namespace/openshift-controller-manager: Readiness probe status=failure output="Get \"https://10.217.0.58:8443/healthz\": dial tcp 10.217.0.58:8443: i/o timeout" start-of-body=
Jan 28 20:43:20 crc kubenswrapper[4746]: I0128 20:43:20.282732 4746 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-controller-manager/controller-manager-dc8f69579-d5l5z" podUID="c53d0862-8035-43b1-9b6d-374d70bec983" containerName="controller-manager" probeResult="failure" output="Get \"https://10.217.0.58:8443/healthz\": dial tcp 10.217.0.58:8443: i/o timeout"
Jan 28 20:43:20 crc kubenswrapper[4746]: I0128 20:43:20.429836 4746 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-controller-manager/controller-manager-78b9587cd6-5jh8p"
Jan 28 20:43:20 crc kubenswrapper[4746]: I0128 20:43:20.436131 4746 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-5d7856c4bb-d95sg"
Jan 28 20:43:20 crc kubenswrapper[4746]: I0128 20:43:20.736207 4746 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-5d7856c4bb-d95sg"]
Jan 28 20:43:20 crc kubenswrapper[4746]: W0128 20:43:20.741276 4746 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod1837e217_cd33_4aa4_8c46_6f344a75aded.slice/crio-8f8d0679a91a56a0f0899050a8254fd14b35112e9b2f2bd0abfe00d092436ed9 WatchSource:0}: Error finding container 8f8d0679a91a56a0f0899050a8254fd14b35112e9b2f2bd0abfe00d092436ed9: Status 404 returned error can't find the container with id 8f8d0679a91a56a0f0899050a8254fd14b35112e9b2f2bd0abfe00d092436ed9
Jan 28 20:43:20 crc kubenswrapper[4746]: I0128 20:43:20.845841 4746 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="6208130d-52bc-449e-b371-357b1cc21b22" path="/var/lib/kubelet/pods/6208130d-52bc-449e-b371-357b1cc21b22/volumes"
Jan 28 20:43:20 crc kubenswrapper[4746]: I0128 20:43:20.847691 4746 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="c53d0862-8035-43b1-9b6d-374d70bec983" path="/var/lib/kubelet/pods/c53d0862-8035-43b1-9b6d-374d70bec983/volumes"
Jan 28 20:43:20 crc kubenswrapper[4746]: I0128 20:43:20.848521 4746 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="d78c5ee4-f3b6-4203-a658-c64799bbf6ab" path="/var/lib/kubelet/pods/d78c5ee4-f3b6-4203-a658-c64799bbf6ab/volumes"
Jan 28 20:43:20 crc kubenswrapper[4746]: I0128 20:43:20.876374 4746 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-controller-manager/controller-manager-78b9587cd6-5jh8p"]
Jan 28 20:43:20 crc kubenswrapper[4746]: W0128 20:43:20.884816 4746 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podffced8b3_c5f6_4e6b_9848_0fa63a34b1fa.slice/crio-bf98b71b72e961dd2467de92f0a93f8e5a1c7bb98dc7845bf75e2994985af40e WatchSource:0}: Error finding container bf98b71b72e961dd2467de92f0a93f8e5a1c7bb98dc7845bf75e2994985af40e: Status 404 returned error can't find the container with id bf98b71b72e961dd2467de92f0a93f8e5a1c7bb98dc7845bf75e2994985af40e
Jan 28 20:43:21 crc kubenswrapper[4746]: I0128 20:43:21.006631 4746 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-route-controller-manager/route-controller-manager-5d7856c4bb-d95sg" event={"ID":"1837e217-cd33-4aa4-8c46-6f344a75aded","Type":"ContainerStarted","Data":"4249062ea83a5500279c1caa99cf02c9cbebe962dbca5691afbe4068d55f7bdc"}
Jan 28 20:43:21 crc kubenswrapper[4746]: I0128 20:43:21.006677 4746 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-route-controller-manager/route-controller-manager-5d7856c4bb-d95sg" event={"ID":"1837e217-cd33-4aa4-8c46-6f344a75aded","Type":"ContainerStarted","Data":"8f8d0679a91a56a0f0899050a8254fd14b35112e9b2f2bd0abfe00d092436ed9"}
Jan 28 20:43:21 crc kubenswrapper[4746]: I0128 20:43:21.006983 4746 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-route-controller-manager/route-controller-manager-5d7856c4bb-d95sg"
Jan 28 20:43:21 crc kubenswrapper[4746]: I0128 20:43:21.010446 4746 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager/controller-manager-78b9587cd6-5jh8p" event={"ID":"ffced8b3-c5f6-4e6b-9848-0fa63a34b1fa","Type":"ContainerStarted","Data":"bf98b71b72e961dd2467de92f0a93f8e5a1c7bb98dc7845bf75e2994985af40e"}
Jan 28 20:43:21 crc kubenswrapper[4746]: I0128 20:43:21.026749 4746 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-route-controller-manager/route-controller-manager-5d7856c4bb-d95sg" podStartSLOduration=3.026727401 podStartE2EDuration="3.026727401s" podCreationTimestamp="2026-01-28 20:43:18 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-28 20:43:21.023115957 +0000 UTC m=+228.979302311" watchObservedRunningTime="2026-01-28 20:43:21.026727401 +0000 UTC m=+228.982913755"
Jan 28 20:43:21 crc kubenswrapper[4746]: I0128 20:43:21.354491 4746 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-route-controller-manager/route-controller-manager-5d7856c4bb-d95sg"
Jan 28 20:43:22 crc kubenswrapper[4746]: I0128 20:43:22.017418 4746 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager/controller-manager-78b9587cd6-5jh8p" event={"ID":"ffced8b3-c5f6-4e6b-9848-0fa63a34b1fa","Type":"ContainerStarted","Data":"6aa206c1994a75b99540a5d2dac4916054ffcf9e2fd802e89171eae7e01f29bd"}
Jan 28 20:43:22 crc kubenswrapper[4746]: I0128 20:43:22.017817 4746 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-controller-manager/controller-manager-78b9587cd6-5jh8p"
Jan 28 20:43:22 crc kubenswrapper[4746]: I0128 20:43:22.022800 4746 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-controller-manager/controller-manager-78b9587cd6-5jh8p"
Jan 28 20:43:22 crc kubenswrapper[4746]: I0128 20:43:22.036834 4746 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-controller-manager/controller-manager-78b9587cd6-5jh8p" podStartSLOduration=4.03680894 podStartE2EDuration="4.03680894s" podCreationTimestamp="2026-01-28 20:43:18 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-28 20:43:22.034344319 +0000 UTC m=+229.990530663" watchObservedRunningTime="2026-01-28 20:43:22.03680894 +0000 UTC m=+229.992995294"
Jan 28 20:43:25 crc kubenswrapper[4746]: I0128 20:43:25.988235 4746 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-authentication/oauth-openshift-8ccb4757-8qk64"]
Jan 28 20:43:25 crc kubenswrapper[4746]: I0128 20:43:25.989786 4746 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-authentication/oauth-openshift-8ccb4757-8qk64"
Jan 28 20:43:25 crc kubenswrapper[4746]: I0128 20:43:25.994795 4746 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication"/"openshift-service-ca.crt"
Jan 28 20:43:25 crc kubenswrapper[4746]: I0128 20:43:25.994824 4746 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-user-idp-0-file-data"
Jan 28 20:43:25 crc kubenswrapper[4746]: I0128 20:43:25.997481 4746 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"oauth-openshift-dockercfg-znhcc"
Jan 28 20:43:25 crc kubenswrapper[4746]: I0128 20:43:25.997704 4746 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-system-router-certs"
Jan 28 20:43:25 crc kubenswrapper[4746]: I0128 20:43:25.997950 4746 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-system-session"
Jan 28 20:43:25 crc kubenswrapper[4746]: I0128 20:43:25.997955 4746 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication"/"audit"
Jan 28 20:43:25 crc kubenswrapper[4746]: I0128 20:43:25.998057 4746 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication"/"kube-root-ca.crt"
Jan 28 20:43:25 crc kubenswrapper[4746]: I0128 20:43:25.998070 4746 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-system-serving-cert"
Jan 28 20:43:25 crc kubenswrapper[4746]: I0128 20:43:25.998367 4746 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-user-template-error"
Jan 28 20:43:25 crc kubenswrapper[4746]: I0128 20:43:25.998505 4746 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication"/"v4-0-config-system-service-ca"
Jan 28 20:43:25 crc kubenswrapper[4746]: I0128 20:43:25.998696 4746 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-user-template-provider-selection"
Jan 28 20:43:25 crc kubenswrapper[4746]: I0128 20:43:25.999117 4746 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication"/"v4-0-config-system-cliconfig"
Jan 28 20:43:26 crc kubenswrapper[4746]: I0128 20:43:26.004781 4746 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-user-template-login"
Jan 28 20:43:26 crc kubenswrapper[4746]: I0128 20:43:26.008839 4746 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication"/"v4-0-config-system-trusted-ca-bundle"
Jan 28 20:43:26 crc kubenswrapper[4746]: I0128 20:43:26.020896 4746 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-authentication/oauth-openshift-8ccb4757-8qk64"]
Jan 28 20:43:26 crc kubenswrapper[4746]: I0128 20:43:26.025660 4746 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-system-ocp-branding-template"
Jan 28 20:43:26 crc kubenswrapper[4746]: I0128 20:43:26.157544 4746 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-system-service-ca\" (UniqueName: \"kubernetes.io/configmap/d5a73916-bbee-4c88-9bab-9e6f460ca96d-v4-0-config-system-service-ca\") pod \"oauth-openshift-8ccb4757-8qk64\" (UID: \"d5a73916-bbee-4c88-9bab-9e6f460ca96d\") " pod="openshift-authentication/oauth-openshift-8ccb4757-8qk64"
Jan 28 20:43:26 crc kubenswrapper[4746]: I0128 20:43:26.157635 4746 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-system-serving-cert\" (UniqueName: \"kubernetes.io/secret/d5a73916-bbee-4c88-9bab-9e6f460ca96d-v4-0-config-system-serving-cert\") pod \"oauth-openshift-8ccb4757-8qk64\" (UID: \"d5a73916-bbee-4c88-9bab-9e6f460ca96d\") " pod="openshift-authentication/oauth-openshift-8ccb4757-8qk64"
Jan 28 20:43:26 crc kubenswrapper[4746]: I0128 20:43:26.157743 4746 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-user-idp-0-file-data\" (UniqueName: \"kubernetes.io/secret/d5a73916-bbee-4c88-9bab-9e6f460ca96d-v4-0-config-user-idp-0-file-data\") pod \"oauth-openshift-8ccb4757-8qk64\" (UID: \"d5a73916-bbee-4c88-9bab-9e6f460ca96d\") " pod="openshift-authentication/oauth-openshift-8ccb4757-8qk64"
Jan 28 20:43:26 crc kubenswrapper[4746]: I0128 20:43:26.157812 4746 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-system-router-certs\" (UniqueName: \"kubernetes.io/secret/d5a73916-bbee-4c88-9bab-9e6f460ca96d-v4-0-config-system-router-certs\") pod \"oauth-openshift-8ccb4757-8qk64\" (UID: \"d5a73916-bbee-4c88-9bab-9e6f460ca96d\") " pod="openshift-authentication/oauth-openshift-8ccb4757-8qk64"
Jan 28 20:43:26 crc kubenswrapper[4746]: I0128 20:43:26.157861 4746 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-user-template-login\" (UniqueName: \"kubernetes.io/secret/d5a73916-bbee-4c88-9bab-9e6f460ca96d-v4-0-config-user-template-login\") pod \"oauth-openshift-8ccb4757-8qk64\" (UID: \"d5a73916-bbee-4c88-9bab-9e6f460ca96d\") " pod="openshift-authentication/oauth-openshift-8ccb4757-8qk64"
Jan 28 20:43:26 crc kubenswrapper[4746]: I0128 20:43:26.157932 4746 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/d5a73916-bbee-4c88-9bab-9e6f460ca96d-audit-dir\") pod \"oauth-openshift-8ccb4757-8qk64\" (UID: \"d5a73916-bbee-4c88-9bab-9e6f460ca96d\") " pod="openshift-authentication/oauth-openshift-8ccb4757-8qk64"
Jan 28 20:43:26 crc kubenswrapper[4746]: I0128 20:43:26.158168 4746 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-system-session\" (UniqueName: \"kubernetes.io/secret/d5a73916-bbee-4c88-9bab-9e6f460ca96d-v4-0-config-system-session\") pod \"oauth-openshift-8ccb4757-8qk64\" (UID: \"d5a73916-bbee-4c88-9bab-9e6f460ca96d\") " pod="openshift-authentication/oauth-openshift-8ccb4757-8qk64"
Jan 28 20:43:26 crc kubenswrapper[4746]: I0128 20:43:26.158303 4746 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-system-cliconfig\" (UniqueName: \"kubernetes.io/configmap/d5a73916-bbee-4c88-9bab-9e6f460ca96d-v4-0-config-system-cliconfig\") pod \"oauth-openshift-8ccb4757-8qk64\" (UID: \"d5a73916-bbee-4c88-9bab-9e6f460ca96d\") " pod="openshift-authentication/oauth-openshift-8ccb4757-8qk64"
Jan 28 20:43:26 crc kubenswrapper[4746]: I0128 20:43:26.158386 4746 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-system-ocp-branding-template\" (UniqueName: \"kubernetes.io/secret/d5a73916-bbee-4c88-9bab-9e6f460ca96d-v4-0-config-system-ocp-branding-template\") pod \"oauth-openshift-8ccb4757-8qk64\" (UID: \"d5a73916-bbee-4c88-9bab-9e6f460ca96d\") " pod="openshift-authentication/oauth-openshift-8ccb4757-8qk64"
Jan 28 20:43:26 crc kubenswrapper[4746]: I0128 20:43:26.158539 4746 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/d5a73916-bbee-4c88-9bab-9e6f460ca96d-audit-policies\") pod \"oauth-openshift-8ccb4757-8qk64\" (UID: \"d5a73916-bbee-4c88-9bab-9e6f460ca96d\") " pod="openshift-authentication/oauth-openshift-8ccb4757-8qk64"
Jan 28 20:43:26 crc kubenswrapper[4746]: I0128 20:43:26.158594 4746 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-6zfhz\" (UniqueName: \"kubernetes.io/projected/d5a73916-bbee-4c88-9bab-9e6f460ca96d-kube-api-access-6zfhz\") pod \"oauth-openshift-8ccb4757-8qk64\" (UID: \"d5a73916-bbee-4c88-9bab-9e6f460ca96d\") " pod="openshift-authentication/oauth-openshift-8ccb4757-8qk64"
Jan 28 20:43:26 crc kubenswrapper[4746]: I0128 20:43:26.158718 4746 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-user-template-error\" (UniqueName: \"kubernetes.io/secret/d5a73916-bbee-4c88-9bab-9e6f460ca96d-v4-0-config-user-template-error\") pod \"oauth-openshift-8ccb4757-8qk64\" (UID: \"d5a73916-bbee-4c88-9bab-9e6f460ca96d\") " pod="openshift-authentication/oauth-openshift-8ccb4757-8qk64"
Jan 28 20:43:26 crc kubenswrapper[4746]: I0128 20:43:26.158795 4746 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-user-template-provider-selection\" (UniqueName: \"kubernetes.io/secret/d5a73916-bbee-4c88-9bab-9e6f460ca96d-v4-0-config-user-template-provider-selection\") pod \"oauth-openshift-8ccb4757-8qk64\" (UID: \"d5a73916-bbee-4c88-9bab-9e6f460ca96d\") " pod="openshift-authentication/oauth-openshift-8ccb4757-8qk64"
Jan 28 20:43:26 crc kubenswrapper[4746]: I0128 20:43:26.158876 4746 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-system-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/d5a73916-bbee-4c88-9bab-9e6f460ca96d-v4-0-config-system-trusted-ca-bundle\") pod \"oauth-openshift-8ccb4757-8qk64\" (UID: \"d5a73916-bbee-4c88-9bab-9e6f460ca96d\") " pod="openshift-authentication/oauth-openshift-8ccb4757-8qk64"
Jan 28 20:43:26 crc kubenswrapper[4746]: I0128 20:43:26.260118 4746 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-user-template-error\" (UniqueName: \"kubernetes.io/secret/d5a73916-bbee-4c88-9bab-9e6f460ca96d-v4-0-config-user-template-error\") pod \"oauth-openshift-8ccb4757-8qk64\" (UID: \"d5a73916-bbee-4c88-9bab-9e6f460ca96d\") " pod="openshift-authentication/oauth-openshift-8ccb4757-8qk64"
Jan 28 20:43:26 crc kubenswrapper[4746]: I0128 20:43:26.260198 4746 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-user-template-provider-selection\" (UniqueName: \"kubernetes.io/secret/d5a73916-bbee-4c88-9bab-9e6f460ca96d-v4-0-config-user-template-provider-selection\") pod \"oauth-openshift-8ccb4757-8qk64\" (UID: \"d5a73916-bbee-4c88-9bab-9e6f460ca96d\") " pod="openshift-authentication/oauth-openshift-8ccb4757-8qk64"
Jan 28 20:43:26 crc kubenswrapper[4746]: I0128 20:43:26.260254 4746 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-system-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/d5a73916-bbee-4c88-9bab-9e6f460ca96d-v4-0-config-system-trusted-ca-bundle\") pod \"oauth-openshift-8ccb4757-8qk64\" (UID: \"d5a73916-bbee-4c88-9bab-9e6f460ca96d\") " pod="openshift-authentication/oauth-openshift-8ccb4757-8qk64"
Jan 28 20:43:26 crc kubenswrapper[4746]: I0128 20:43:26.261288 4746 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-system-service-ca\" (UniqueName: \"kubernetes.io/configmap/d5a73916-bbee-4c88-9bab-9e6f460ca96d-v4-0-config-system-service-ca\") pod \"oauth-openshift-8ccb4757-8qk64\" (UID: \"d5a73916-bbee-4c88-9bab-9e6f460ca96d\") " pod="openshift-authentication/oauth-openshift-8ccb4757-8qk64"
Jan 28 20:43:26 crc kubenswrapper[4746]: I0128 20:43:26.261329 4746 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-system-serving-cert\" (UniqueName: \"kubernetes.io/secret/d5a73916-bbee-4c88-9bab-9e6f460ca96d-v4-0-config-system-serving-cert\") pod \"oauth-openshift-8ccb4757-8qk64\" (UID: \"d5a73916-bbee-4c88-9bab-9e6f460ca96d\") " pod="openshift-authentication/oauth-openshift-8ccb4757-8qk64"
Jan 28 20:43:26 crc kubenswrapper[4746]: I0128 20:43:26.261362 4746 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-user-idp-0-file-data\" (UniqueName: \"kubernetes.io/secret/d5a73916-bbee-4c88-9bab-9e6f460ca96d-v4-0-config-user-idp-0-file-data\") pod \"oauth-openshift-8ccb4757-8qk64\" (UID: \"d5a73916-bbee-4c88-9bab-9e6f460ca96d\") " pod="openshift-authentication/oauth-openshift-8ccb4757-8qk64"
Jan 28 20:43:26 crc kubenswrapper[4746]: I0128 20:43:26.261387 4746 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-system-router-certs\" (UniqueName: \"kubernetes.io/secret/d5a73916-bbee-4c88-9bab-9e6f460ca96d-v4-0-config-system-router-certs\") pod \"oauth-openshift-8ccb4757-8qk64\" (UID: \"d5a73916-bbee-4c88-9bab-9e6f460ca96d\") " pod="openshift-authentication/oauth-openshift-8ccb4757-8qk64"
Jan 28 20:43:26 crc kubenswrapper[4746]: I0128 20:43:26.261410 4746 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-user-template-login\" (UniqueName: \"kubernetes.io/secret/d5a73916-bbee-4c88-9bab-9e6f460ca96d-v4-0-config-user-template-login\") pod \"oauth-openshift-8ccb4757-8qk64\" (UID: \"d5a73916-bbee-4c88-9bab-9e6f460ca96d\") " pod="openshift-authentication/oauth-openshift-8ccb4757-8qk64"
Jan 28 20:43:26 crc kubenswrapper[4746]: I0128 20:43:26.261431 4746 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/d5a73916-bbee-4c88-9bab-9e6f460ca96d-audit-dir\") pod \"oauth-openshift-8ccb4757-8qk64\" (UID: \"d5a73916-bbee-4c88-9bab-9e6f460ca96d\") " pod="openshift-authentication/oauth-openshift-8ccb4757-8qk64"
Jan 28 20:43:26 crc kubenswrapper[4746]: I0128 20:43:26.261454 4746 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-system-session\" (UniqueName: \"kubernetes.io/secret/d5a73916-bbee-4c88-9bab-9e6f460ca96d-v4-0-config-system-session\") pod \"oauth-openshift-8ccb4757-8qk64\" (UID: \"d5a73916-bbee-4c88-9bab-9e6f460ca96d\") " pod="openshift-authentication/oauth-openshift-8ccb4757-8qk64"
Jan 28 20:43:26 crc kubenswrapper[4746]: I0128 20:43:26.261475 4746 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-system-cliconfig\" (UniqueName: \"kubernetes.io/configmap/d5a73916-bbee-4c88-9bab-9e6f460ca96d-v4-0-config-system-cliconfig\") pod \"oauth-openshift-8ccb4757-8qk64\" (UID: \"d5a73916-bbee-4c88-9bab-9e6f460ca96d\") " pod="openshift-authentication/oauth-openshift-8ccb4757-8qk64"
Jan 28 20:43:26 crc kubenswrapper[4746]: I0128 20:43:26.261496 4746 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-system-ocp-branding-template\" (UniqueName: \"kubernetes.io/secret/d5a73916-bbee-4c88-9bab-9e6f460ca96d-v4-0-config-system-ocp-branding-template\") pod \"oauth-openshift-8ccb4757-8qk64\" (UID: \"d5a73916-bbee-4c88-9bab-9e6f460ca96d\") " pod="openshift-authentication/oauth-openshift-8ccb4757-8qk64"
Jan 28 20:43:26 crc kubenswrapper[4746]: I0128 20:43:26.261494 4746 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-system-service-ca\" (UniqueName: \"kubernetes.io/configmap/d5a73916-bbee-4c88-9bab-9e6f460ca96d-v4-0-config-system-service-ca\") pod \"oauth-openshift-8ccb4757-8qk64\" (UID: \"d5a73916-bbee-4c88-9bab-9e6f460ca96d\") " pod="openshift-authentication/oauth-openshift-8ccb4757-8qk64"
Jan 28 20:43:26 crc kubenswrapper[4746]: I0128 20:43:26.261524 4746 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/d5a73916-bbee-4c88-9bab-9e6f460ca96d-audit-policies\") pod \"oauth-openshift-8ccb4757-8qk64\" (UID: \"d5a73916-bbee-4c88-9bab-9e6f460ca96d\") " pod="openshift-authentication/oauth-openshift-8ccb4757-8qk64"
Jan 28 20:43:26 crc kubenswrapper[4746]: I0128 20:43:26.261546 4746 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-6zfhz\" (UniqueName: \"kubernetes.io/projected/d5a73916-bbee-4c88-9bab-9e6f460ca96d-kube-api-access-6zfhz\") pod \"oauth-openshift-8ccb4757-8qk64\" (UID: \"d5a73916-bbee-4c88-9bab-9e6f460ca96d\") " pod="openshift-authentication/oauth-openshift-8ccb4757-8qk64"
Jan 28 20:43:26 crc kubenswrapper[4746]: I0128 20:43:26.261594 4746 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/d5a73916-bbee-4c88-9bab-9e6f460ca96d-audit-dir\") pod \"oauth-openshift-8ccb4757-8qk64\" (UID: \"d5a73916-bbee-4c88-9bab-9e6f460ca96d\") " pod="openshift-authentication/oauth-openshift-8ccb4757-8qk64"
Jan 28 20:43:26 crc kubenswrapper[4746]: I0128 20:43:26.261977 4746 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-system-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/d5a73916-bbee-4c88-9bab-9e6f460ca96d-v4-0-config-system-trusted-ca-bundle\") pod \"oauth-openshift-8ccb4757-8qk64\" (UID: \"d5a73916-bbee-4c88-9bab-9e6f460ca96d\") " pod="openshift-authentication/oauth-openshift-8ccb4757-8qk64"
Jan 28 20:43:26 crc kubenswrapper[4746]: I0128 20:43:26.262641 4746 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/d5a73916-bbee-4c88-9bab-9e6f460ca96d-audit-policies\") pod \"oauth-openshift-8ccb4757-8qk64\" (UID: \"d5a73916-bbee-4c88-9bab-9e6f460ca96d\") " pod="openshift-authentication/oauth-openshift-8ccb4757-8qk64"
Jan 28 20:43:26 crc kubenswrapper[4746]: I0128 20:43:26.262729 4746 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-system-cliconfig\" (UniqueName: \"kubernetes.io/configmap/d5a73916-bbee-4c88-9bab-9e6f460ca96d-v4-0-config-system-cliconfig\") pod \"oauth-openshift-8ccb4757-8qk64\" (UID: \"d5a73916-bbee-4c88-9bab-9e6f460ca96d\") " pod="openshift-authentication/oauth-openshift-8ccb4757-8qk64"
Jan 28 20:43:26 crc kubenswrapper[4746]: I0128 20:43:26.267790 4746 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-user-template-login\" (UniqueName: \"kubernetes.io/secret/d5a73916-bbee-4c88-9bab-9e6f460ca96d-v4-0-config-user-template-login\") pod \"oauth-openshift-8ccb4757-8qk64\" (UID: \"d5a73916-bbee-4c88-9bab-9e6f460ca96d\") " pod="openshift-authentication/oauth-openshift-8ccb4757-8qk64"
Jan 28 20:43:26 crc kubenswrapper[4746]: I0128 20:43:26.267997 4746 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-system-serving-cert\" (UniqueName: \"kubernetes.io/secret/d5a73916-bbee-4c88-9bab-9e6f460ca96d-v4-0-config-system-serving-cert\") pod \"oauth-openshift-8ccb4757-8qk64\" (UID: \"d5a73916-bbee-4c88-9bab-9e6f460ca96d\") " pod="openshift-authentication/oauth-openshift-8ccb4757-8qk64"
Jan 28 20:43:26 crc kubenswrapper[4746]: I0128 20:43:26.268103 4746 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-user-template-error\" (UniqueName: \"kubernetes.io/secret/d5a73916-bbee-4c88-9bab-9e6f460ca96d-v4-0-config-user-template-error\") pod \"oauth-openshift-8ccb4757-8qk64\" (UID: \"d5a73916-bbee-4c88-9bab-9e6f460ca96d\") " pod="openshift-authentication/oauth-openshift-8ccb4757-8qk64"
Jan 28 20:43:26 crc kubenswrapper[4746]: I0128 20:43:26.268943 4746 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-user-idp-0-file-data\" (UniqueName: \"kubernetes.io/secret/d5a73916-bbee-4c88-9bab-9e6f460ca96d-v4-0-config-user-idp-0-file-data\") pod \"oauth-openshift-8ccb4757-8qk64\" (UID: \"d5a73916-bbee-4c88-9bab-9e6f460ca96d\") " pod="openshift-authentication/oauth-openshift-8ccb4757-8qk64"
Jan 28 20:43:26 crc kubenswrapper[4746]: I0128 20:43:26.268943 4746 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-system-ocp-branding-template\" (UniqueName: \"kubernetes.io/secret/d5a73916-bbee-4c88-9bab-9e6f460ca96d-v4-0-config-system-ocp-branding-template\") pod \"oauth-openshift-8ccb4757-8qk64\" (UID: \"d5a73916-bbee-4c88-9bab-9e6f460ca96d\") " pod="openshift-authentication/oauth-openshift-8ccb4757-8qk64"
Jan 28 20:43:26 crc kubenswrapper[4746]: I0128 20:43:26.270855 4746 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-system-router-certs\" (UniqueName: \"kubernetes.io/secret/d5a73916-bbee-4c88-9bab-9e6f460ca96d-v4-0-config-system-router-certs\") pod \"oauth-openshift-8ccb4757-8qk64\" (UID: \"d5a73916-bbee-4c88-9bab-9e6f460ca96d\") " pod="openshift-authentication/oauth-openshift-8ccb4757-8qk64"
Jan 28 20:43:26 crc kubenswrapper[4746]: I0128 20:43:26.273571 4746 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-user-template-provider-selection\" (UniqueName: \"kubernetes.io/secret/d5a73916-bbee-4c88-9bab-9e6f460ca96d-v4-0-config-user-template-provider-selection\") pod \"oauth-openshift-8ccb4757-8qk64\" (UID: \"d5a73916-bbee-4c88-9bab-9e6f460ca96d\") " pod="openshift-authentication/oauth-openshift-8ccb4757-8qk64"
Jan 28 20:43:26 crc kubenswrapper[4746]: I0128 20:43:26.276398 4746 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-system-session\" (UniqueName: \"kubernetes.io/secret/d5a73916-bbee-4c88-9bab-9e6f460ca96d-v4-0-config-system-session\") pod \"oauth-openshift-8ccb4757-8qk64\" (UID: \"d5a73916-bbee-4c88-9bab-9e6f460ca96d\") " pod="openshift-authentication/oauth-openshift-8ccb4757-8qk64"
Jan 28 20:43:26 crc kubenswrapper[4746]: I0128 20:43:26.282526 4746 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-6zfhz\" (UniqueName: \"kubernetes.io/projected/d5a73916-bbee-4c88-9bab-9e6f460ca96d-kube-api-access-6zfhz\") pod \"oauth-openshift-8ccb4757-8qk64\" (UID: \"d5a73916-bbee-4c88-9bab-9e6f460ca96d\") " pod="openshift-authentication/oauth-openshift-8ccb4757-8qk64"
Jan 28 20:43:26 crc kubenswrapper[4746]: I0128 20:43:26.326912 4746 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-authentication/oauth-openshift-8ccb4757-8qk64"
Jan 28 20:43:26 crc kubenswrapper[4746]: I0128 20:43:26.829645 4746 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-authentication/oauth-openshift-8ccb4757-8qk64"]
Jan 28 20:43:27 crc kubenswrapper[4746]: I0128 20:43:27.072526 4746 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-authentication/oauth-openshift-8ccb4757-8qk64" event={"ID":"d5a73916-bbee-4c88-9bab-9e6f460ca96d","Type":"ContainerStarted","Data":"4bf3c208a2b317beab25d34caf1a4ae5b38a2b3d4f8fa645c4d1b4ccc6838fec"}
Jan 28 20:43:27 crc kubenswrapper[4746]: I0128 20:43:27.934112 4746 kubelet.go:2421] "SyncLoop ADD" source="file" pods=["openshift-kube-apiserver/kube-apiserver-startup-monitor-crc"]
Jan 28 20:43:27 crc kubenswrapper[4746]: I0128 20:43:27.935389 4746 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc"
Jan 28 20:43:27 crc kubenswrapper[4746]: I0128 20:43:27.936708 4746 kubelet.go:2431] "SyncLoop REMOVE" source="file" pods=["openshift-kube-apiserver/kube-apiserver-crc"]
Jan 28 20:43:27 crc kubenswrapper[4746]: I0128 20:43:27.937056 4746 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver" containerID="cri-o://9a304b12b4a1e30cbbbb16aa4f91514f1b1a64c716cf8ac023803a279b310f9c" gracePeriod=15
Jan 28 20:43:27 crc kubenswrapper[4746]: I0128 20:43:27.937074 4746 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-check-endpoints" containerID="cri-o://64f8ae1fcf8b14a65196ffa6d5a50db39aa846dc4e6cc0b92ba54d9b5f6913e3" gracePeriod=15
Jan 28 20:43:27 crc kubenswrapper[4746]: I0128 20:43:27.937125 4746 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-cert-regeneration-controller" containerID="cri-o://6af41f85c56be6c24127bec0aee8e02562dbb04bd22e57c6887f6f5e914a7530" gracePeriod=15
Jan 28 20:43:27 crc kubenswrapper[4746]: I0128 20:43:27.937190 4746 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-cert-syncer" containerID="cri-o://f31e9e80576ec4eb59ec92039601fd94bd323d29493c353f9ec6b9c61187b0d3" gracePeriod=15
Jan 28 20:43:27 crc kubenswrapper[4746]: I0128 20:43:27.937204 4746 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="f4b27818a5e8e43d0dc095d08835c792"
containerName="kube-apiserver-insecure-readyz" containerID="cri-o://03a94dee0b9663441e0ecec16b121c5be4e5c557f17c929b8edfb0d7ba00c3d5" gracePeriod=15 Jan 28 20:43:27 crc kubenswrapper[4746]: I0128 20:43:27.937671 4746 kubelet.go:2421] "SyncLoop ADD" source="file" pods=["openshift-kube-apiserver/kube-apiserver-crc"] Jan 28 20:43:27 crc kubenswrapper[4746]: E0128 20:43:27.937821 4746 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-insecure-readyz" Jan 28 20:43:27 crc kubenswrapper[4746]: I0128 20:43:27.937843 4746 state_mem.go:107] "Deleted CPUSet assignment" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-insecure-readyz" Jan 28 20:43:27 crc kubenswrapper[4746]: E0128 20:43:27.937852 4746 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-cert-regeneration-controller" Jan 28 20:43:27 crc kubenswrapper[4746]: I0128 20:43:27.937860 4746 state_mem.go:107] "Deleted CPUSet assignment" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-cert-regeneration-controller" Jan 28 20:43:27 crc kubenswrapper[4746]: E0128 20:43:27.937868 4746 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-check-endpoints" Jan 28 20:43:27 crc kubenswrapper[4746]: I0128 20:43:27.937873 4746 state_mem.go:107] "Deleted CPUSet assignment" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-check-endpoints" Jan 28 20:43:27 crc kubenswrapper[4746]: E0128 20:43:27.937883 4746 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-cert-syncer" Jan 28 20:43:27 crc kubenswrapper[4746]: I0128 20:43:27.937890 4746 state_mem.go:107] "Deleted CPUSet assignment" podUID="f4b27818a5e8e43d0dc095d08835c792" 
containerName="kube-apiserver-cert-syncer" Jan 28 20:43:27 crc kubenswrapper[4746]: E0128 20:43:27.937899 4746 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver" Jan 28 20:43:27 crc kubenswrapper[4746]: I0128 20:43:27.937905 4746 state_mem.go:107] "Deleted CPUSet assignment" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver" Jan 28 20:43:27 crc kubenswrapper[4746]: E0128 20:43:27.937911 4746 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-check-endpoints" Jan 28 20:43:27 crc kubenswrapper[4746]: I0128 20:43:27.937917 4746 state_mem.go:107] "Deleted CPUSet assignment" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-check-endpoints" Jan 28 20:43:27 crc kubenswrapper[4746]: E0128 20:43:27.937926 4746 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="setup" Jan 28 20:43:27 crc kubenswrapper[4746]: I0128 20:43:27.937933 4746 state_mem.go:107] "Deleted CPUSet assignment" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="setup" Jan 28 20:43:27 crc kubenswrapper[4746]: I0128 20:43:27.938038 4746 memory_manager.go:354] "RemoveStaleState removing state" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-insecure-readyz" Jan 28 20:43:27 crc kubenswrapper[4746]: I0128 20:43:27.938050 4746 memory_manager.go:354] "RemoveStaleState removing state" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-cert-regeneration-controller" Jan 28 20:43:27 crc kubenswrapper[4746]: I0128 20:43:27.938060 4746 memory_manager.go:354] "RemoveStaleState removing state" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-check-endpoints" Jan 28 20:43:27 crc kubenswrapper[4746]: I0128 20:43:27.938069 4746 memory_manager.go:354] "RemoveStaleState removing state" 
podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver" Jan 28 20:43:27 crc kubenswrapper[4746]: I0128 20:43:27.938094 4746 memory_manager.go:354] "RemoveStaleState removing state" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-cert-syncer" Jan 28 20:43:27 crc kubenswrapper[4746]: I0128 20:43:27.938100 4746 memory_manager.go:354] "RemoveStaleState removing state" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-check-endpoints" Jan 28 20:43:27 crc kubenswrapper[4746]: I0128 20:43:27.938108 4746 memory_manager.go:354] "RemoveStaleState removing state" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-check-endpoints" Jan 28 20:43:27 crc kubenswrapper[4746]: E0128 20:43:27.938216 4746 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-check-endpoints" Jan 28 20:43:27 crc kubenswrapper[4746]: I0128 20:43:27.938223 4746 state_mem.go:107] "Deleted CPUSet assignment" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-check-endpoints" Jan 28 20:43:27 crc kubenswrapper[4746]: I0128 20:43:27.977652 4746 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-kube-apiserver/kube-apiserver-startup-monitor-crc"] Jan 28 20:43:28 crc kubenswrapper[4746]: I0128 20:43:28.082795 4746 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-authentication/oauth-openshift-8ccb4757-8qk64" event={"ID":"d5a73916-bbee-4c88-9bab-9e6f460ca96d","Type":"ContainerStarted","Data":"bb72334a574e912999552bb6827cc0efc1be291084471a743aab8136f48ba3a3"} Jan 28 20:43:28 crc kubenswrapper[4746]: I0128 20:43:28.083279 4746 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-authentication/oauth-openshift-8ccb4757-8qk64" Jan 28 20:43:28 crc kubenswrapper[4746]: I0128 20:43:28.083787 4746 status_manager.go:851] "Failed to get status for pod" 
podUID="d5a73916-bbee-4c88-9bab-9e6f460ca96d" pod="openshift-authentication/oauth-openshift-8ccb4757-8qk64" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-authentication/pods/oauth-openshift-8ccb4757-8qk64\": dial tcp 38.102.83.201:6443: connect: connection refused" Jan 28 20:43:28 crc kubenswrapper[4746]: I0128 20:43:28.084275 4746 status_manager.go:851] "Failed to get status for pod" podUID="f4b27818a5e8e43d0dc095d08835c792" pod="openshift-kube-apiserver/kube-apiserver-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\": dial tcp 38.102.83.201:6443: connect: connection refused" Jan 28 20:43:28 crc kubenswrapper[4746]: I0128 20:43:28.084756 4746 status_manager.go:851] "Failed to get status for pod" podUID="f85e55b1a89d02b0cb034b1ea31ed45a" pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-startup-monitor-crc\": dial tcp 38.102.83.201:6443: connect: connection refused" Jan 28 20:43:28 crc kubenswrapper[4746]: I0128 20:43:28.086051 4746 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-apiserver_kube-apiserver-crc_f4b27818a5e8e43d0dc095d08835c792/kube-apiserver-check-endpoints/1.log" Jan 28 20:43:28 crc kubenswrapper[4746]: I0128 20:43:28.089270 4746 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-apiserver_kube-apiserver-crc_f4b27818a5e8e43d0dc095d08835c792/kube-apiserver-cert-syncer/0.log" Jan 28 20:43:28 crc kubenswrapper[4746]: I0128 20:43:28.090286 4746 generic.go:334] "Generic (PLEG): container finished" podID="f4b27818a5e8e43d0dc095d08835c792" containerID="64f8ae1fcf8b14a65196ffa6d5a50db39aa846dc4e6cc0b92ba54d9b5f6913e3" exitCode=0 Jan 28 20:43:28 crc kubenswrapper[4746]: I0128 20:43:28.090408 4746 scope.go:117] "RemoveContainer" 
containerID="56b250a192cfa698ccd7ac8233b75b9acbf2347549c712b47ccc1b89f144aa83" Jan 28 20:43:28 crc kubenswrapper[4746]: I0128 20:43:28.090415 4746 generic.go:334] "Generic (PLEG): container finished" podID="f4b27818a5e8e43d0dc095d08835c792" containerID="03a94dee0b9663441e0ecec16b121c5be4e5c557f17c929b8edfb0d7ba00c3d5" exitCode=0 Jan 28 20:43:28 crc kubenswrapper[4746]: I0128 20:43:28.090557 4746 generic.go:334] "Generic (PLEG): container finished" podID="f4b27818a5e8e43d0dc095d08835c792" containerID="6af41f85c56be6c24127bec0aee8e02562dbb04bd22e57c6887f6f5e914a7530" exitCode=0 Jan 28 20:43:28 crc kubenswrapper[4746]: I0128 20:43:28.090583 4746 generic.go:334] "Generic (PLEG): container finished" podID="f4b27818a5e8e43d0dc095d08835c792" containerID="f31e9e80576ec4eb59ec92039601fd94bd323d29493c353f9ec6b9c61187b0d3" exitCode=2 Jan 28 20:43:28 crc kubenswrapper[4746]: I0128 20:43:28.091415 4746 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-resource-dir\") pod \"kube-apiserver-startup-monitor-crc\" (UID: \"f85e55b1a89d02b0cb034b1ea31ed45a\") " pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" Jan 28 20:43:28 crc kubenswrapper[4746]: I0128 20:43:28.091538 4746 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"manifests\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-manifests\") pod \"kube-apiserver-startup-monitor-crc\" (UID: \"f85e55b1a89d02b0cb034b1ea31ed45a\") " pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" Jan 28 20:43:28 crc kubenswrapper[4746]: I0128 20:43:28.091641 4746 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"pod-resource-dir\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-pod-resource-dir\") pod \"kube-apiserver-startup-monitor-crc\" (UID: 
\"f85e55b1a89d02b0cb034b1ea31ed45a\") " pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" Jan 28 20:43:28 crc kubenswrapper[4746]: I0128 20:43:28.091728 4746 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-lock\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-var-lock\") pod \"kube-apiserver-startup-monitor-crc\" (UID: \"f85e55b1a89d02b0cb034b1ea31ed45a\") " pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" Jan 28 20:43:28 crc kubenswrapper[4746]: I0128 20:43:28.091857 4746 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/71bb4a3aecc4ba5b26c4b7318770ce13-cert-dir\") pod \"kube-apiserver-crc\" (UID: \"71bb4a3aecc4ba5b26c4b7318770ce13\") " pod="openshift-kube-apiserver/kube-apiserver-crc" Jan 28 20:43:28 crc kubenswrapper[4746]: I0128 20:43:28.091947 4746 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/71bb4a3aecc4ba5b26c4b7318770ce13-audit-dir\") pod \"kube-apiserver-crc\" (UID: \"71bb4a3aecc4ba5b26c4b7318770ce13\") " pod="openshift-kube-apiserver/kube-apiserver-crc" Jan 28 20:43:28 crc kubenswrapper[4746]: I0128 20:43:28.092025 4746 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/71bb4a3aecc4ba5b26c4b7318770ce13-resource-dir\") pod \"kube-apiserver-crc\" (UID: \"71bb4a3aecc4ba5b26c4b7318770ce13\") " pod="openshift-kube-apiserver/kube-apiserver-crc" Jan 28 20:43:28 crc kubenswrapper[4746]: I0128 20:43:28.092146 4746 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-log\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-var-log\") pod \"kube-apiserver-startup-monitor-crc\" (UID: 
\"f85e55b1a89d02b0cb034b1ea31ed45a\") " pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" Jan 28 20:43:28 crc kubenswrapper[4746]: I0128 20:43:28.093685 4746 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-authentication/oauth-openshift-8ccb4757-8qk64" Jan 28 20:43:28 crc kubenswrapper[4746]: I0128 20:43:28.094610 4746 status_manager.go:851] "Failed to get status for pod" podUID="f85e55b1a89d02b0cb034b1ea31ed45a" pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-startup-monitor-crc\": dial tcp 38.102.83.201:6443: connect: connection refused" Jan 28 20:43:28 crc kubenswrapper[4746]: I0128 20:43:28.095300 4746 status_manager.go:851] "Failed to get status for pod" podUID="d5a73916-bbee-4c88-9bab-9e6f460ca96d" pod="openshift-authentication/oauth-openshift-8ccb4757-8qk64" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-authentication/pods/oauth-openshift-8ccb4757-8qk64\": dial tcp 38.102.83.201:6443: connect: connection refused" Jan 28 20:43:28 crc kubenswrapper[4746]: I0128 20:43:28.095627 4746 status_manager.go:851] "Failed to get status for pod" podUID="f4b27818a5e8e43d0dc095d08835c792" pod="openshift-kube-apiserver/kube-apiserver-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\": dial tcp 38.102.83.201:6443: connect: connection refused" Jan 28 20:43:28 crc kubenswrapper[4746]: I0128 20:43:28.193324 4746 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/71bb4a3aecc4ba5b26c4b7318770ce13-cert-dir\") pod \"kube-apiserver-crc\" (UID: \"71bb4a3aecc4ba5b26c4b7318770ce13\") " pod="openshift-kube-apiserver/kube-apiserver-crc" Jan 28 20:43:28 crc kubenswrapper[4746]: I0128 20:43:28.193801 4746 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/71bb4a3aecc4ba5b26c4b7318770ce13-audit-dir\") pod \"kube-apiserver-crc\" (UID: \"71bb4a3aecc4ba5b26c4b7318770ce13\") " pod="openshift-kube-apiserver/kube-apiserver-crc" Jan 28 20:43:28 crc kubenswrapper[4746]: I0128 20:43:28.194137 4746 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/71bb4a3aecc4ba5b26c4b7318770ce13-audit-dir\") pod \"kube-apiserver-crc\" (UID: \"71bb4a3aecc4ba5b26c4b7318770ce13\") " pod="openshift-kube-apiserver/kube-apiserver-crc" Jan 28 20:43:28 crc kubenswrapper[4746]: I0128 20:43:28.193461 4746 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/71bb4a3aecc4ba5b26c4b7318770ce13-cert-dir\") pod \"kube-apiserver-crc\" (UID: \"71bb4a3aecc4ba5b26c4b7318770ce13\") " pod="openshift-kube-apiserver/kube-apiserver-crc" Jan 28 20:43:28 crc kubenswrapper[4746]: I0128 20:43:28.194781 4746 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/71bb4a3aecc4ba5b26c4b7318770ce13-resource-dir\") pod \"kube-apiserver-crc\" (UID: \"71bb4a3aecc4ba5b26c4b7318770ce13\") " pod="openshift-kube-apiserver/kube-apiserver-crc" Jan 28 20:43:28 crc kubenswrapper[4746]: I0128 20:43:28.194898 4746 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/71bb4a3aecc4ba5b26c4b7318770ce13-resource-dir\") pod \"kube-apiserver-crc\" (UID: \"71bb4a3aecc4ba5b26c4b7318770ce13\") " pod="openshift-kube-apiserver/kube-apiserver-crc" Jan 28 20:43:28 crc kubenswrapper[4746]: I0128 20:43:28.195221 4746 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-log\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-var-log\") pod 
\"kube-apiserver-startup-monitor-crc\" (UID: \"f85e55b1a89d02b0cb034b1ea31ed45a\") " pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" Jan 28 20:43:28 crc kubenswrapper[4746]: I0128 20:43:28.195417 4746 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-log\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-var-log\") pod \"kube-apiserver-startup-monitor-crc\" (UID: \"f85e55b1a89d02b0cb034b1ea31ed45a\") " pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" Jan 28 20:43:28 crc kubenswrapper[4746]: I0128 20:43:28.195702 4746 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-resource-dir\") pod \"kube-apiserver-startup-monitor-crc\" (UID: \"f85e55b1a89d02b0cb034b1ea31ed45a\") " pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" Jan 28 20:43:28 crc kubenswrapper[4746]: I0128 20:43:28.196378 4746 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"manifests\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-manifests\") pod \"kube-apiserver-startup-monitor-crc\" (UID: \"f85e55b1a89d02b0cb034b1ea31ed45a\") " pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" Jan 28 20:43:28 crc kubenswrapper[4746]: I0128 20:43:28.195846 4746 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-resource-dir\") pod \"kube-apiserver-startup-monitor-crc\" (UID: \"f85e55b1a89d02b0cb034b1ea31ed45a\") " pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" Jan 28 20:43:28 crc kubenswrapper[4746]: I0128 20:43:28.196517 4746 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"manifests\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-manifests\") pod 
\"kube-apiserver-startup-monitor-crc\" (UID: \"f85e55b1a89d02b0cb034b1ea31ed45a\") " pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" Jan 28 20:43:28 crc kubenswrapper[4746]: I0128 20:43:28.197106 4746 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pod-resource-dir\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-pod-resource-dir\") pod \"kube-apiserver-startup-monitor-crc\" (UID: \"f85e55b1a89d02b0cb034b1ea31ed45a\") " pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" Jan 28 20:43:28 crc kubenswrapper[4746]: I0128 20:43:28.197110 4746 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"pod-resource-dir\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-pod-resource-dir\") pod \"kube-apiserver-startup-monitor-crc\" (UID: \"f85e55b1a89d02b0cb034b1ea31ed45a\") " pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" Jan 28 20:43:28 crc kubenswrapper[4746]: I0128 20:43:28.197380 4746 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-lock\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-var-lock\") pod \"kube-apiserver-startup-monitor-crc\" (UID: \"f85e55b1a89d02b0cb034b1ea31ed45a\") " pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" Jan 28 20:43:28 crc kubenswrapper[4746]: I0128 20:43:28.197382 4746 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-lock\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-var-lock\") pod \"kube-apiserver-startup-monitor-crc\" (UID: \"f85e55b1a89d02b0cb034b1ea31ed45a\") " pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" Jan 28 20:43:28 crc kubenswrapper[4746]: I0128 20:43:28.275668 4746 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" Jan 28 20:43:28 crc kubenswrapper[4746]: W0128 20:43:28.305491 4746 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podf85e55b1a89d02b0cb034b1ea31ed45a.slice/crio-b579b58e8f5ecc4b090b7e792da0663e101a7316941cc43418f8a5c069b464c2 WatchSource:0}: Error finding container b579b58e8f5ecc4b090b7e792da0663e101a7316941cc43418f8a5c069b464c2: Status 404 returned error can't find the container with id b579b58e8f5ecc4b090b7e792da0663e101a7316941cc43418f8a5c069b464c2 Jan 28 20:43:28 crc kubenswrapper[4746]: E0128 20:43:28.309214 4746 event.go:368] "Unable to write event (may retry after sleeping)" err="Post \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/events\": dial tcp 38.102.83.201:6443: connect: connection refused" event="&Event{ObjectMeta:{kube-apiserver-startup-monitor-crc.188effd85b0b0a15 openshift-kube-apiserver 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-apiserver,Name:kube-apiserver-startup-monitor-crc,UID:f85e55b1a89d02b0cb034b1ea31ed45a,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{startup-monitor},},Reason:Pulled,Message:Container image \"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\" already present on machine,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-01-28 20:43:28.308652565 +0000 UTC m=+236.264838929,LastTimestamp:2026-01-28 20:43:28.308652565 +0000 UTC m=+236.264838929,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Jan 28 20:43:29 crc kubenswrapper[4746]: I0128 20:43:29.100771 4746 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" event={"ID":"f85e55b1a89d02b0cb034b1ea31ed45a","Type":"ContainerStarted","Data":"516e4abe96c1a2de5a07ad273f06d4d5abe6209ec1947f4945d0353fba3b4511"} Jan 28 20:43:29 crc kubenswrapper[4746]: I0128 20:43:29.101555 4746 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" event={"ID":"f85e55b1a89d02b0cb034b1ea31ed45a","Type":"ContainerStarted","Data":"b579b58e8f5ecc4b090b7e792da0663e101a7316941cc43418f8a5c069b464c2"} Jan 28 20:43:29 crc kubenswrapper[4746]: I0128 20:43:29.102610 4746 status_manager.go:851] "Failed to get status for pod" podUID="f85e55b1a89d02b0cb034b1ea31ed45a" pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-startup-monitor-crc\": dial tcp 38.102.83.201:6443: connect: connection refused" Jan 28 20:43:29 crc kubenswrapper[4746]: I0128 20:43:29.103144 4746 status_manager.go:851] "Failed to get status for pod" podUID="d5a73916-bbee-4c88-9bab-9e6f460ca96d" pod="openshift-authentication/oauth-openshift-8ccb4757-8qk64" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-authentication/pods/oauth-openshift-8ccb4757-8qk64\": dial tcp 38.102.83.201:6443: connect: connection refused" Jan 28 20:43:29 crc kubenswrapper[4746]: I0128 20:43:29.106597 4746 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-apiserver_kube-apiserver-crc_f4b27818a5e8e43d0dc095d08835c792/kube-apiserver-cert-syncer/0.log" Jan 28 20:43:29 crc kubenswrapper[4746]: I0128 20:43:29.111763 4746 generic.go:334] "Generic (PLEG): container finished" podID="b727dece-e4ea-4d38-899f-3c5c4c941a6c" containerID="d2bd3327e728b7371e4489ba1aad2337b15cb382bc8a32c2b74f59da95955c6b" exitCode=0 Jan 28 20:43:29 crc kubenswrapper[4746]: I0128 20:43:29.111893 4746 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openshift-kube-apiserver/installer-9-crc" event={"ID":"b727dece-e4ea-4d38-899f-3c5c4c941a6c","Type":"ContainerDied","Data":"d2bd3327e728b7371e4489ba1aad2337b15cb382bc8a32c2b74f59da95955c6b"} Jan 28 20:43:29 crc kubenswrapper[4746]: I0128 20:43:29.113072 4746 status_manager.go:851] "Failed to get status for pod" podUID="b727dece-e4ea-4d38-899f-3c5c4c941a6c" pod="openshift-kube-apiserver/installer-9-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/installer-9-crc\": dial tcp 38.102.83.201:6443: connect: connection refused" Jan 28 20:43:29 crc kubenswrapper[4746]: I0128 20:43:29.113799 4746 status_manager.go:851] "Failed to get status for pod" podUID="d5a73916-bbee-4c88-9bab-9e6f460ca96d" pod="openshift-authentication/oauth-openshift-8ccb4757-8qk64" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-authentication/pods/oauth-openshift-8ccb4757-8qk64\": dial tcp 38.102.83.201:6443: connect: connection refused" Jan 28 20:43:29 crc kubenswrapper[4746]: I0128 20:43:29.114592 4746 status_manager.go:851] "Failed to get status for pod" podUID="f85e55b1a89d02b0cb034b1ea31ed45a" pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-startup-monitor-crc\": dial tcp 38.102.83.201:6443: connect: connection refused" Jan 28 20:43:30 crc kubenswrapper[4746]: I0128 20:43:30.443065 4746 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-apiserver/installer-9-crc" Jan 28 20:43:30 crc kubenswrapper[4746]: I0128 20:43:30.444256 4746 status_manager.go:851] "Failed to get status for pod" podUID="d5a73916-bbee-4c88-9bab-9e6f460ca96d" pod="openshift-authentication/oauth-openshift-8ccb4757-8qk64" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-authentication/pods/oauth-openshift-8ccb4757-8qk64\": dial tcp 38.102.83.201:6443: connect: connection refused" Jan 28 20:43:30 crc kubenswrapper[4746]: I0128 20:43:30.444506 4746 status_manager.go:851] "Failed to get status for pod" podUID="f85e55b1a89d02b0cb034b1ea31ed45a" pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-startup-monitor-crc\": dial tcp 38.102.83.201:6443: connect: connection refused" Jan 28 20:43:30 crc kubenswrapper[4746]: I0128 20:43:30.444708 4746 status_manager.go:851] "Failed to get status for pod" podUID="b727dece-e4ea-4d38-899f-3c5c4c941a6c" pod="openshift-kube-apiserver/installer-9-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/installer-9-crc\": dial tcp 38.102.83.201:6443: connect: connection refused" Jan 28 20:43:30 crc kubenswrapper[4746]: I0128 20:43:30.541317 4746 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"var-lock\" (UniqueName: \"kubernetes.io/host-path/b727dece-e4ea-4d38-899f-3c5c4c941a6c-var-lock\") pod \"b727dece-e4ea-4d38-899f-3c5c4c941a6c\" (UID: \"b727dece-e4ea-4d38-899f-3c5c4c941a6c\") " Jan 28 20:43:30 crc kubenswrapper[4746]: I0128 20:43:30.541514 4746 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/b727dece-e4ea-4d38-899f-3c5c4c941a6c-kubelet-dir\") pod \"b727dece-e4ea-4d38-899f-3c5c4c941a6c\" (UID: \"b727dece-e4ea-4d38-899f-3c5c4c941a6c\") " Jan 28 20:43:30 crc 
kubenswrapper[4746]: I0128 20:43:30.541495 4746 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/b727dece-e4ea-4d38-899f-3c5c4c941a6c-var-lock" (OuterVolumeSpecName: "var-lock") pod "b727dece-e4ea-4d38-899f-3c5c4c941a6c" (UID: "b727dece-e4ea-4d38-899f-3c5c4c941a6c"). InnerVolumeSpecName "var-lock". PluginName "kubernetes.io/host-path", VolumeGidValue "" Jan 28 20:43:30 crc kubenswrapper[4746]: I0128 20:43:30.541566 4746 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/b727dece-e4ea-4d38-899f-3c5c4c941a6c-kube-api-access\") pod \"b727dece-e4ea-4d38-899f-3c5c4c941a6c\" (UID: \"b727dece-e4ea-4d38-899f-3c5c4c941a6c\") " Jan 28 20:43:30 crc kubenswrapper[4746]: I0128 20:43:30.541583 4746 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/b727dece-e4ea-4d38-899f-3c5c4c941a6c-kubelet-dir" (OuterVolumeSpecName: "kubelet-dir") pod "b727dece-e4ea-4d38-899f-3c5c4c941a6c" (UID: "b727dece-e4ea-4d38-899f-3c5c4c941a6c"). InnerVolumeSpecName "kubelet-dir". 
PluginName "kubernetes.io/host-path", VolumeGidValue "" Jan 28 20:43:30 crc kubenswrapper[4746]: I0128 20:43:30.541880 4746 reconciler_common.go:293] "Volume detached for volume \"var-lock\" (UniqueName: \"kubernetes.io/host-path/b727dece-e4ea-4d38-899f-3c5c4c941a6c-var-lock\") on node \"crc\" DevicePath \"\"" Jan 28 20:43:30 crc kubenswrapper[4746]: I0128 20:43:30.541913 4746 reconciler_common.go:293] "Volume detached for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/b727dece-e4ea-4d38-899f-3c5c4c941a6c-kubelet-dir\") on node \"crc\" DevicePath \"\"" Jan 28 20:43:30 crc kubenswrapper[4746]: I0128 20:43:30.550164 4746 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/b727dece-e4ea-4d38-899f-3c5c4c941a6c-kube-api-access" (OuterVolumeSpecName: "kube-api-access") pod "b727dece-e4ea-4d38-899f-3c5c4c941a6c" (UID: "b727dece-e4ea-4d38-899f-3c5c4c941a6c"). InnerVolumeSpecName "kube-api-access". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 28 20:43:30 crc kubenswrapper[4746]: I0128 20:43:30.643360 4746 reconciler_common.go:293] "Volume detached for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/b727dece-e4ea-4d38-899f-3c5c4c941a6c-kube-api-access\") on node \"crc\" DevicePath \"\"" Jan 28 20:43:30 crc kubenswrapper[4746]: I0128 20:43:30.811789 4746 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-apiserver_kube-apiserver-crc_f4b27818a5e8e43d0dc095d08835c792/kube-apiserver-cert-syncer/0.log" Jan 28 20:43:30 crc kubenswrapper[4746]: I0128 20:43:30.813071 4746 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-apiserver/kube-apiserver-crc" Jan 28 20:43:30 crc kubenswrapper[4746]: I0128 20:43:30.813943 4746 status_manager.go:851] "Failed to get status for pod" podUID="b727dece-e4ea-4d38-899f-3c5c4c941a6c" pod="openshift-kube-apiserver/installer-9-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/installer-9-crc\": dial tcp 38.102.83.201:6443: connect: connection refused" Jan 28 20:43:30 crc kubenswrapper[4746]: I0128 20:43:30.814788 4746 status_manager.go:851] "Failed to get status for pod" podUID="d5a73916-bbee-4c88-9bab-9e6f460ca96d" pod="openshift-authentication/oauth-openshift-8ccb4757-8qk64" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-authentication/pods/oauth-openshift-8ccb4757-8qk64\": dial tcp 38.102.83.201:6443: connect: connection refused" Jan 28 20:43:30 crc kubenswrapper[4746]: I0128 20:43:30.815495 4746 status_manager.go:851] "Failed to get status for pod" podUID="f4b27818a5e8e43d0dc095d08835c792" pod="openshift-kube-apiserver/kube-apiserver-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\": dial tcp 38.102.83.201:6443: connect: connection refused" Jan 28 20:43:30 crc kubenswrapper[4746]: I0128 20:43:30.815885 4746 status_manager.go:851] "Failed to get status for pod" podUID="f85e55b1a89d02b0cb034b1ea31ed45a" pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-startup-monitor-crc\": dial tcp 38.102.83.201:6443: connect: connection refused" Jan 28 20:43:30 crc kubenswrapper[4746]: I0128 20:43:30.946215 4746 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/f4b27818a5e8e43d0dc095d08835c792-audit-dir\") pod \"f4b27818a5e8e43d0dc095d08835c792\" (UID: 
\"f4b27818a5e8e43d0dc095d08835c792\") " Jan 28 20:43:30 crc kubenswrapper[4746]: I0128 20:43:30.946331 4746 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/f4b27818a5e8e43d0dc095d08835c792-cert-dir\") pod \"f4b27818a5e8e43d0dc095d08835c792\" (UID: \"f4b27818a5e8e43d0dc095d08835c792\") " Jan 28 20:43:30 crc kubenswrapper[4746]: I0128 20:43:30.946410 4746 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/f4b27818a5e8e43d0dc095d08835c792-audit-dir" (OuterVolumeSpecName: "audit-dir") pod "f4b27818a5e8e43d0dc095d08835c792" (UID: "f4b27818a5e8e43d0dc095d08835c792"). InnerVolumeSpecName "audit-dir". PluginName "kubernetes.io/host-path", VolumeGidValue "" Jan 28 20:43:30 crc kubenswrapper[4746]: I0128 20:43:30.946445 4746 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/f4b27818a5e8e43d0dc095d08835c792-resource-dir\") pod \"f4b27818a5e8e43d0dc095d08835c792\" (UID: \"f4b27818a5e8e43d0dc095d08835c792\") " Jan 28 20:43:30 crc kubenswrapper[4746]: I0128 20:43:30.946470 4746 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/f4b27818a5e8e43d0dc095d08835c792-cert-dir" (OuterVolumeSpecName: "cert-dir") pod "f4b27818a5e8e43d0dc095d08835c792" (UID: "f4b27818a5e8e43d0dc095d08835c792"). InnerVolumeSpecName "cert-dir". PluginName "kubernetes.io/host-path", VolumeGidValue "" Jan 28 20:43:30 crc kubenswrapper[4746]: I0128 20:43:30.946569 4746 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/f4b27818a5e8e43d0dc095d08835c792-resource-dir" (OuterVolumeSpecName: "resource-dir") pod "f4b27818a5e8e43d0dc095d08835c792" (UID: "f4b27818a5e8e43d0dc095d08835c792"). InnerVolumeSpecName "resource-dir". 
PluginName "kubernetes.io/host-path", VolumeGidValue "" Jan 28 20:43:30 crc kubenswrapper[4746]: I0128 20:43:30.947174 4746 reconciler_common.go:293] "Volume detached for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/f4b27818a5e8e43d0dc095d08835c792-resource-dir\") on node \"crc\" DevicePath \"\"" Jan 28 20:43:30 crc kubenswrapper[4746]: I0128 20:43:30.947198 4746 reconciler_common.go:293] "Volume detached for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/f4b27818a5e8e43d0dc095d08835c792-audit-dir\") on node \"crc\" DevicePath \"\"" Jan 28 20:43:30 crc kubenswrapper[4746]: I0128 20:43:30.947212 4746 reconciler_common.go:293] "Volume detached for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/f4b27818a5e8e43d0dc095d08835c792-cert-dir\") on node \"crc\" DevicePath \"\"" Jan 28 20:43:31 crc kubenswrapper[4746]: I0128 20:43:31.134046 4746 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-apiserver_kube-apiserver-crc_f4b27818a5e8e43d0dc095d08835c792/kube-apiserver-cert-syncer/0.log" Jan 28 20:43:31 crc kubenswrapper[4746]: I0128 20:43:31.135201 4746 generic.go:334] "Generic (PLEG): container finished" podID="f4b27818a5e8e43d0dc095d08835c792" containerID="9a304b12b4a1e30cbbbb16aa4f91514f1b1a64c716cf8ac023803a279b310f9c" exitCode=0 Jan 28 20:43:31 crc kubenswrapper[4746]: I0128 20:43:31.135308 4746 scope.go:117] "RemoveContainer" containerID="64f8ae1fcf8b14a65196ffa6d5a50db39aa846dc4e6cc0b92ba54d9b5f6913e3" Jan 28 20:43:31 crc kubenswrapper[4746]: I0128 20:43:31.135466 4746 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-apiserver/kube-apiserver-crc" Jan 28 20:43:31 crc kubenswrapper[4746]: I0128 20:43:31.136804 4746 status_manager.go:851] "Failed to get status for pod" podUID="f85e55b1a89d02b0cb034b1ea31ed45a" pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-startup-monitor-crc\": dial tcp 38.102.83.201:6443: connect: connection refused" Jan 28 20:43:31 crc kubenswrapper[4746]: I0128 20:43:31.137474 4746 status_manager.go:851] "Failed to get status for pod" podUID="b727dece-e4ea-4d38-899f-3c5c4c941a6c" pod="openshift-kube-apiserver/installer-9-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/installer-9-crc\": dial tcp 38.102.83.201:6443: connect: connection refused" Jan 28 20:43:31 crc kubenswrapper[4746]: I0128 20:43:31.138046 4746 status_manager.go:851] "Failed to get status for pod" podUID="d5a73916-bbee-4c88-9bab-9e6f460ca96d" pod="openshift-authentication/oauth-openshift-8ccb4757-8qk64" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-authentication/pods/oauth-openshift-8ccb4757-8qk64\": dial tcp 38.102.83.201:6443: connect: connection refused" Jan 28 20:43:31 crc kubenswrapper[4746]: I0128 20:43:31.140420 4746 status_manager.go:851] "Failed to get status for pod" podUID="f4b27818a5e8e43d0dc095d08835c792" pod="openshift-kube-apiserver/kube-apiserver-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\": dial tcp 38.102.83.201:6443: connect: connection refused" Jan 28 20:43:31 crc kubenswrapper[4746]: I0128 20:43:31.140554 4746 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/installer-9-crc" event={"ID":"b727dece-e4ea-4d38-899f-3c5c4c941a6c","Type":"ContainerDied","Data":"fa6f7bcd1f424d0b6b5677fbeee1a4ed355f2b279b3ddf100d2ccee5dd87d729"} Jan 28 
20:43:31 crc kubenswrapper[4746]: I0128 20:43:31.140629 4746 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="fa6f7bcd1f424d0b6b5677fbeee1a4ed355f2b279b3ddf100d2ccee5dd87d729" Jan 28 20:43:31 crc kubenswrapper[4746]: I0128 20:43:31.140881 4746 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-kube-apiserver/installer-9-crc" Jan 28 20:43:31 crc kubenswrapper[4746]: I0128 20:43:31.146370 4746 status_manager.go:851] "Failed to get status for pod" podUID="f85e55b1a89d02b0cb034b1ea31ed45a" pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-startup-monitor-crc\": dial tcp 38.102.83.201:6443: connect: connection refused" Jan 28 20:43:31 crc kubenswrapper[4746]: I0128 20:43:31.147074 4746 status_manager.go:851] "Failed to get status for pod" podUID="b727dece-e4ea-4d38-899f-3c5c4c941a6c" pod="openshift-kube-apiserver/installer-9-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/installer-9-crc\": dial tcp 38.102.83.201:6443: connect: connection refused" Jan 28 20:43:31 crc kubenswrapper[4746]: I0128 20:43:31.147396 4746 status_manager.go:851] "Failed to get status for pod" podUID="d5a73916-bbee-4c88-9bab-9e6f460ca96d" pod="openshift-authentication/oauth-openshift-8ccb4757-8qk64" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-authentication/pods/oauth-openshift-8ccb4757-8qk64\": dial tcp 38.102.83.201:6443: connect: connection refused" Jan 28 20:43:31 crc kubenswrapper[4746]: I0128 20:43:31.147654 4746 status_manager.go:851] "Failed to get status for pod" podUID="f4b27818a5e8e43d0dc095d08835c792" pod="openshift-kube-apiserver/kube-apiserver-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\": dial tcp 38.102.83.201:6443: connect: connection 
refused" Jan 28 20:43:31 crc kubenswrapper[4746]: I0128 20:43:31.154265 4746 status_manager.go:851] "Failed to get status for pod" podUID="f4b27818a5e8e43d0dc095d08835c792" pod="openshift-kube-apiserver/kube-apiserver-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\": dial tcp 38.102.83.201:6443: connect: connection refused" Jan 28 20:43:31 crc kubenswrapper[4746]: I0128 20:43:31.154747 4746 status_manager.go:851] "Failed to get status for pod" podUID="f85e55b1a89d02b0cb034b1ea31ed45a" pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-startup-monitor-crc\": dial tcp 38.102.83.201:6443: connect: connection refused" Jan 28 20:43:31 crc kubenswrapper[4746]: I0128 20:43:31.155065 4746 status_manager.go:851] "Failed to get status for pod" podUID="b727dece-e4ea-4d38-899f-3c5c4c941a6c" pod="openshift-kube-apiserver/installer-9-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/installer-9-crc\": dial tcp 38.102.83.201:6443: connect: connection refused" Jan 28 20:43:31 crc kubenswrapper[4746]: I0128 20:43:31.155436 4746 status_manager.go:851] "Failed to get status for pod" podUID="d5a73916-bbee-4c88-9bab-9e6f460ca96d" pod="openshift-authentication/oauth-openshift-8ccb4757-8qk64" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-authentication/pods/oauth-openshift-8ccb4757-8qk64\": dial tcp 38.102.83.201:6443: connect: connection refused" Jan 28 20:43:31 crc kubenswrapper[4746]: I0128 20:43:31.164202 4746 scope.go:117] "RemoveContainer" containerID="03a94dee0b9663441e0ecec16b121c5be4e5c557f17c929b8edfb0d7ba00c3d5" Jan 28 20:43:31 crc kubenswrapper[4746]: I0128 20:43:31.186781 4746 scope.go:117] "RemoveContainer" containerID="6af41f85c56be6c24127bec0aee8e02562dbb04bd22e57c6887f6f5e914a7530" Jan 28 20:43:31 crc 
kubenswrapper[4746]: I0128 20:43:31.205260 4746 scope.go:117] "RemoveContainer" containerID="f31e9e80576ec4eb59ec92039601fd94bd323d29493c353f9ec6b9c61187b0d3" Jan 28 20:43:31 crc kubenswrapper[4746]: I0128 20:43:31.228067 4746 scope.go:117] "RemoveContainer" containerID="9a304b12b4a1e30cbbbb16aa4f91514f1b1a64c716cf8ac023803a279b310f9c" Jan 28 20:43:31 crc kubenswrapper[4746]: I0128 20:43:31.246370 4746 scope.go:117] "RemoveContainer" containerID="c5808a729190035a73529ac58efb0790f2d9c3b98c5f1a8017389e4ca1738c4c" Jan 28 20:43:31 crc kubenswrapper[4746]: I0128 20:43:31.293854 4746 scope.go:117] "RemoveContainer" containerID="64f8ae1fcf8b14a65196ffa6d5a50db39aa846dc4e6cc0b92ba54d9b5f6913e3" Jan 28 20:43:31 crc kubenswrapper[4746]: E0128 20:43:31.294550 4746 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"64f8ae1fcf8b14a65196ffa6d5a50db39aa846dc4e6cc0b92ba54d9b5f6913e3\": container with ID starting with 64f8ae1fcf8b14a65196ffa6d5a50db39aa846dc4e6cc0b92ba54d9b5f6913e3 not found: ID does not exist" containerID="64f8ae1fcf8b14a65196ffa6d5a50db39aa846dc4e6cc0b92ba54d9b5f6913e3" Jan 28 20:43:31 crc kubenswrapper[4746]: I0128 20:43:31.294616 4746 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"64f8ae1fcf8b14a65196ffa6d5a50db39aa846dc4e6cc0b92ba54d9b5f6913e3"} err="failed to get container status \"64f8ae1fcf8b14a65196ffa6d5a50db39aa846dc4e6cc0b92ba54d9b5f6913e3\": rpc error: code = NotFound desc = could not find container \"64f8ae1fcf8b14a65196ffa6d5a50db39aa846dc4e6cc0b92ba54d9b5f6913e3\": container with ID starting with 64f8ae1fcf8b14a65196ffa6d5a50db39aa846dc4e6cc0b92ba54d9b5f6913e3 not found: ID does not exist" Jan 28 20:43:31 crc kubenswrapper[4746]: I0128 20:43:31.294652 4746 scope.go:117] "RemoveContainer" containerID="03a94dee0b9663441e0ecec16b121c5be4e5c557f17c929b8edfb0d7ba00c3d5" Jan 28 20:43:31 crc kubenswrapper[4746]: E0128 20:43:31.295086 
4746 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"03a94dee0b9663441e0ecec16b121c5be4e5c557f17c929b8edfb0d7ba00c3d5\": container with ID starting with 03a94dee0b9663441e0ecec16b121c5be4e5c557f17c929b8edfb0d7ba00c3d5 not found: ID does not exist" containerID="03a94dee0b9663441e0ecec16b121c5be4e5c557f17c929b8edfb0d7ba00c3d5" Jan 28 20:43:31 crc kubenswrapper[4746]: I0128 20:43:31.295132 4746 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"03a94dee0b9663441e0ecec16b121c5be4e5c557f17c929b8edfb0d7ba00c3d5"} err="failed to get container status \"03a94dee0b9663441e0ecec16b121c5be4e5c557f17c929b8edfb0d7ba00c3d5\": rpc error: code = NotFound desc = could not find container \"03a94dee0b9663441e0ecec16b121c5be4e5c557f17c929b8edfb0d7ba00c3d5\": container with ID starting with 03a94dee0b9663441e0ecec16b121c5be4e5c557f17c929b8edfb0d7ba00c3d5 not found: ID does not exist" Jan 28 20:43:31 crc kubenswrapper[4746]: I0128 20:43:31.295168 4746 scope.go:117] "RemoveContainer" containerID="6af41f85c56be6c24127bec0aee8e02562dbb04bd22e57c6887f6f5e914a7530" Jan 28 20:43:31 crc kubenswrapper[4746]: E0128 20:43:31.295539 4746 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"6af41f85c56be6c24127bec0aee8e02562dbb04bd22e57c6887f6f5e914a7530\": container with ID starting with 6af41f85c56be6c24127bec0aee8e02562dbb04bd22e57c6887f6f5e914a7530 not found: ID does not exist" containerID="6af41f85c56be6c24127bec0aee8e02562dbb04bd22e57c6887f6f5e914a7530" Jan 28 20:43:31 crc kubenswrapper[4746]: I0128 20:43:31.295573 4746 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"6af41f85c56be6c24127bec0aee8e02562dbb04bd22e57c6887f6f5e914a7530"} err="failed to get container status \"6af41f85c56be6c24127bec0aee8e02562dbb04bd22e57c6887f6f5e914a7530\": rpc error: code = 
NotFound desc = could not find container \"6af41f85c56be6c24127bec0aee8e02562dbb04bd22e57c6887f6f5e914a7530\": container with ID starting with 6af41f85c56be6c24127bec0aee8e02562dbb04bd22e57c6887f6f5e914a7530 not found: ID does not exist" Jan 28 20:43:31 crc kubenswrapper[4746]: I0128 20:43:31.295591 4746 scope.go:117] "RemoveContainer" containerID="f31e9e80576ec4eb59ec92039601fd94bd323d29493c353f9ec6b9c61187b0d3" Jan 28 20:43:31 crc kubenswrapper[4746]: E0128 20:43:31.296044 4746 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"f31e9e80576ec4eb59ec92039601fd94bd323d29493c353f9ec6b9c61187b0d3\": container with ID starting with f31e9e80576ec4eb59ec92039601fd94bd323d29493c353f9ec6b9c61187b0d3 not found: ID does not exist" containerID="f31e9e80576ec4eb59ec92039601fd94bd323d29493c353f9ec6b9c61187b0d3" Jan 28 20:43:31 crc kubenswrapper[4746]: I0128 20:43:31.296159 4746 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"f31e9e80576ec4eb59ec92039601fd94bd323d29493c353f9ec6b9c61187b0d3"} err="failed to get container status \"f31e9e80576ec4eb59ec92039601fd94bd323d29493c353f9ec6b9c61187b0d3\": rpc error: code = NotFound desc = could not find container \"f31e9e80576ec4eb59ec92039601fd94bd323d29493c353f9ec6b9c61187b0d3\": container with ID starting with f31e9e80576ec4eb59ec92039601fd94bd323d29493c353f9ec6b9c61187b0d3 not found: ID does not exist" Jan 28 20:43:31 crc kubenswrapper[4746]: I0128 20:43:31.296177 4746 scope.go:117] "RemoveContainer" containerID="9a304b12b4a1e30cbbbb16aa4f91514f1b1a64c716cf8ac023803a279b310f9c" Jan 28 20:43:31 crc kubenswrapper[4746]: E0128 20:43:31.296549 4746 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"9a304b12b4a1e30cbbbb16aa4f91514f1b1a64c716cf8ac023803a279b310f9c\": container with ID starting with 
9a304b12b4a1e30cbbbb16aa4f91514f1b1a64c716cf8ac023803a279b310f9c not found: ID does not exist" containerID="9a304b12b4a1e30cbbbb16aa4f91514f1b1a64c716cf8ac023803a279b310f9c" Jan 28 20:43:31 crc kubenswrapper[4746]: I0128 20:43:31.296632 4746 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"9a304b12b4a1e30cbbbb16aa4f91514f1b1a64c716cf8ac023803a279b310f9c"} err="failed to get container status \"9a304b12b4a1e30cbbbb16aa4f91514f1b1a64c716cf8ac023803a279b310f9c\": rpc error: code = NotFound desc = could not find container \"9a304b12b4a1e30cbbbb16aa4f91514f1b1a64c716cf8ac023803a279b310f9c\": container with ID starting with 9a304b12b4a1e30cbbbb16aa4f91514f1b1a64c716cf8ac023803a279b310f9c not found: ID does not exist" Jan 28 20:43:31 crc kubenswrapper[4746]: I0128 20:43:31.296672 4746 scope.go:117] "RemoveContainer" containerID="c5808a729190035a73529ac58efb0790f2d9c3b98c5f1a8017389e4ca1738c4c" Jan 28 20:43:31 crc kubenswrapper[4746]: E0128 20:43:31.297139 4746 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"c5808a729190035a73529ac58efb0790f2d9c3b98c5f1a8017389e4ca1738c4c\": container with ID starting with c5808a729190035a73529ac58efb0790f2d9c3b98c5f1a8017389e4ca1738c4c not found: ID does not exist" containerID="c5808a729190035a73529ac58efb0790f2d9c3b98c5f1a8017389e4ca1738c4c" Jan 28 20:43:31 crc kubenswrapper[4746]: I0128 20:43:31.297178 4746 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"c5808a729190035a73529ac58efb0790f2d9c3b98c5f1a8017389e4ca1738c4c"} err="failed to get container status \"c5808a729190035a73529ac58efb0790f2d9c3b98c5f1a8017389e4ca1738c4c\": rpc error: code = NotFound desc = could not find container \"c5808a729190035a73529ac58efb0790f2d9c3b98c5f1a8017389e4ca1738c4c\": container with ID starting with c5808a729190035a73529ac58efb0790f2d9c3b98c5f1a8017389e4ca1738c4c not found: ID does not 
exist" Jan 28 20:43:32 crc kubenswrapper[4746]: I0128 20:43:32.839955 4746 status_manager.go:851] "Failed to get status for pod" podUID="f85e55b1a89d02b0cb034b1ea31ed45a" pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-startup-monitor-crc\": dial tcp 38.102.83.201:6443: connect: connection refused" Jan 28 20:43:32 crc kubenswrapper[4746]: I0128 20:43:32.842330 4746 status_manager.go:851] "Failed to get status for pod" podUID="b727dece-e4ea-4d38-899f-3c5c4c941a6c" pod="openshift-kube-apiserver/installer-9-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/installer-9-crc\": dial tcp 38.102.83.201:6443: connect: connection refused" Jan 28 20:43:32 crc kubenswrapper[4746]: I0128 20:43:32.843063 4746 status_manager.go:851] "Failed to get status for pod" podUID="d5a73916-bbee-4c88-9bab-9e6f460ca96d" pod="openshift-authentication/oauth-openshift-8ccb4757-8qk64" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-authentication/pods/oauth-openshift-8ccb4757-8qk64\": dial tcp 38.102.83.201:6443: connect: connection refused" Jan 28 20:43:32 crc kubenswrapper[4746]: I0128 20:43:32.844595 4746 status_manager.go:851] "Failed to get status for pod" podUID="f4b27818a5e8e43d0dc095d08835c792" pod="openshift-kube-apiserver/kube-apiserver-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\": dial tcp 38.102.83.201:6443: connect: connection refused" Jan 28 20:43:32 crc kubenswrapper[4746]: I0128 20:43:32.847728 4746 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="f4b27818a5e8e43d0dc095d08835c792" path="/var/lib/kubelet/pods/f4b27818a5e8e43d0dc095d08835c792/volumes" Jan 28 20:43:33 crc kubenswrapper[4746]: E0128 20:43:33.850543 4746 desired_state_of_world_populator.go:312] "Error processing volume" err="error 
processing PVC openshift-image-registry/crc-image-registry-storage: failed to fetch PVC from API server: Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-image-registry/persistentvolumeclaims/crc-image-registry-storage\": dial tcp 38.102.83.201:6443: connect: connection refused" pod="openshift-image-registry/image-registry-697d97f7c8-lwhk4" volumeName="registry-storage" Jan 28 20:43:34 crc kubenswrapper[4746]: E0128 20:43:34.086167 4746 controller.go:195] "Failed to update lease" err="Put \"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/crc?timeout=10s\": dial tcp 38.102.83.201:6443: connect: connection refused" Jan 28 20:43:34 crc kubenswrapper[4746]: E0128 20:43:34.087771 4746 controller.go:195] "Failed to update lease" err="Put \"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/crc?timeout=10s\": dial tcp 38.102.83.201:6443: connect: connection refused" Jan 28 20:43:34 crc kubenswrapper[4746]: E0128 20:43:34.088918 4746 controller.go:195] "Failed to update lease" err="Put \"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/crc?timeout=10s\": dial tcp 38.102.83.201:6443: connect: connection refused" Jan 28 20:43:34 crc kubenswrapper[4746]: E0128 20:43:34.089775 4746 controller.go:195] "Failed to update lease" err="Put \"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/crc?timeout=10s\": dial tcp 38.102.83.201:6443: connect: connection refused" Jan 28 20:43:34 crc kubenswrapper[4746]: E0128 20:43:34.090427 4746 controller.go:195] "Failed to update lease" err="Put \"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/crc?timeout=10s\": dial tcp 38.102.83.201:6443: connect: connection refused" Jan 28 20:43:34 crc kubenswrapper[4746]: I0128 20:43:34.090470 4746 controller.go:115] "failed to update lease using latest lease, 
fallback to ensure lease" err="failed 5 attempts to update lease" Jan 28 20:43:34 crc kubenswrapper[4746]: E0128 20:43:34.090886 4746 controller.go:145] "Failed to ensure lease exists, will retry" err="Get \"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/crc?timeout=10s\": dial tcp 38.102.83.201:6443: connect: connection refused" interval="200ms" Jan 28 20:43:34 crc kubenswrapper[4746]: E0128 20:43:34.291845 4746 controller.go:145] "Failed to ensure lease exists, will retry" err="Get \"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/crc?timeout=10s\": dial tcp 38.102.83.201:6443: connect: connection refused" interval="400ms" Jan 28 20:43:34 crc kubenswrapper[4746]: E0128 20:43:34.693390 4746 controller.go:145] "Failed to ensure lease exists, will retry" err="Get \"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/crc?timeout=10s\": dial tcp 38.102.83.201:6443: connect: connection refused" interval="800ms" Jan 28 20:43:35 crc kubenswrapper[4746]: E0128 20:43:35.246653 4746 event.go:368] "Unable to write event (may retry after sleeping)" err="Post \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/events\": dial tcp 38.102.83.201:6443: connect: connection refused" event="&Event{ObjectMeta:{kube-apiserver-startup-monitor-crc.188effd85b0b0a15 openshift-kube-apiserver 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-apiserver,Name:kube-apiserver-startup-monitor-crc,UID:f85e55b1a89d02b0cb034b1ea31ed45a,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{startup-monitor},},Reason:Pulled,Message:Container image \"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\" already present on 
machine,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-01-28 20:43:28.308652565 +0000 UTC m=+236.264838929,LastTimestamp:2026-01-28 20:43:28.308652565 +0000 UTC m=+236.264838929,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Jan 28 20:43:35 crc kubenswrapper[4746]: E0128 20:43:35.496235 4746 controller.go:145] "Failed to ensure lease exists, will retry" err="Get \"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/crc?timeout=10s\": dial tcp 38.102.83.201:6443: connect: connection refused" interval="1.6s" Jan 28 20:43:37 crc kubenswrapper[4746]: E0128 20:43:37.097584 4746 controller.go:145] "Failed to ensure lease exists, will retry" err="Get \"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/crc?timeout=10s\": dial tcp 38.102.83.201:6443: connect: connection refused" interval="3.2s" Jan 28 20:43:40 crc kubenswrapper[4746]: E0128 20:43:40.299651 4746 controller.go:145] "Failed to ensure lease exists, will retry" err="Get \"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/crc?timeout=10s\": dial tcp 38.102.83.201:6443: connect: connection refused" interval="6.4s" Jan 28 20:43:40 crc kubenswrapper[4746]: I0128 20:43:40.836331 4746 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-apiserver/kube-apiserver-crc" Jan 28 20:43:40 crc kubenswrapper[4746]: I0128 20:43:40.838011 4746 status_manager.go:851] "Failed to get status for pod" podUID="f85e55b1a89d02b0cb034b1ea31ed45a" pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-startup-monitor-crc\": dial tcp 38.102.83.201:6443: connect: connection refused" Jan 28 20:43:40 crc kubenswrapper[4746]: I0128 20:43:40.838835 4746 status_manager.go:851] "Failed to get status for pod" podUID="b727dece-e4ea-4d38-899f-3c5c4c941a6c" pod="openshift-kube-apiserver/installer-9-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/installer-9-crc\": dial tcp 38.102.83.201:6443: connect: connection refused" Jan 28 20:43:40 crc kubenswrapper[4746]: I0128 20:43:40.839331 4746 status_manager.go:851] "Failed to get status for pod" podUID="d5a73916-bbee-4c88-9bab-9e6f460ca96d" pod="openshift-authentication/oauth-openshift-8ccb4757-8qk64" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-authentication/pods/oauth-openshift-8ccb4757-8qk64\": dial tcp 38.102.83.201:6443: connect: connection refused" Jan 28 20:43:40 crc kubenswrapper[4746]: I0128 20:43:40.857729 4746 kubelet.go:1909] "Trying to delete pod" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="64499f0d-5c10-43a9-b479-9b6c7ef2fb9c" Jan 28 20:43:40 crc kubenswrapper[4746]: I0128 20:43:40.858456 4746 mirror_client.go:130] "Deleting a mirror pod" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="64499f0d-5c10-43a9-b479-9b6c7ef2fb9c" Jan 28 20:43:40 crc kubenswrapper[4746]: E0128 20:43:40.859371 4746 mirror_client.go:138] "Failed deleting a mirror pod" err="Delete \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\": dial tcp 38.102.83.201:6443: connect: connection 
refused" pod="openshift-kube-apiserver/kube-apiserver-crc" Jan 28 20:43:40 crc kubenswrapper[4746]: I0128 20:43:40.860311 4746 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-apiserver/kube-apiserver-crc" Jan 28 20:43:40 crc kubenswrapper[4746]: W0128 20:43:40.885932 4746 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod71bb4a3aecc4ba5b26c4b7318770ce13.slice/crio-49a1ac1d36acc9bf34e66d6eec0d3dff4bf9a4e9d543ea0bdf6328ec4d9a2f30 WatchSource:0}: Error finding container 49a1ac1d36acc9bf34e66d6eec0d3dff4bf9a4e9d543ea0bdf6328ec4d9a2f30: Status 404 returned error can't find the container with id 49a1ac1d36acc9bf34e66d6eec0d3dff4bf9a4e9d543ea0bdf6328ec4d9a2f30 Jan 28 20:43:41 crc kubenswrapper[4746]: I0128 20:43:41.233732 4746 generic.go:334] "Generic (PLEG): container finished" podID="71bb4a3aecc4ba5b26c4b7318770ce13" containerID="4cce517ab2f3010e75e4601178a8cb7b51cccd5618406217e3a7c67e3b9616bb" exitCode=0 Jan 28 20:43:41 crc kubenswrapper[4746]: I0128 20:43:41.233858 4746 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" event={"ID":"71bb4a3aecc4ba5b26c4b7318770ce13","Type":"ContainerDied","Data":"4cce517ab2f3010e75e4601178a8cb7b51cccd5618406217e3a7c67e3b9616bb"} Jan 28 20:43:41 crc kubenswrapper[4746]: I0128 20:43:41.234328 4746 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" event={"ID":"71bb4a3aecc4ba5b26c4b7318770ce13","Type":"ContainerStarted","Data":"49a1ac1d36acc9bf34e66d6eec0d3dff4bf9a4e9d543ea0bdf6328ec4d9a2f30"} Jan 28 20:43:41 crc kubenswrapper[4746]: I0128 20:43:41.234700 4746 kubelet.go:1909] "Trying to delete pod" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="64499f0d-5c10-43a9-b479-9b6c7ef2fb9c" Jan 28 20:43:41 crc kubenswrapper[4746]: I0128 20:43:41.234721 4746 mirror_client.go:130] "Deleting a mirror pod" 
pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="64499f0d-5c10-43a9-b479-9b6c7ef2fb9c" Jan 28 20:43:41 crc kubenswrapper[4746]: I0128 20:43:41.235363 4746 status_manager.go:851] "Failed to get status for pod" podUID="f85e55b1a89d02b0cb034b1ea31ed45a" pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-startup-monitor-crc\": dial tcp 38.102.83.201:6443: connect: connection refused" Jan 28 20:43:41 crc kubenswrapper[4746]: E0128 20:43:41.235369 4746 mirror_client.go:138] "Failed deleting a mirror pod" err="Delete \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\": dial tcp 38.102.83.201:6443: connect: connection refused" pod="openshift-kube-apiserver/kube-apiserver-crc" Jan 28 20:43:41 crc kubenswrapper[4746]: I0128 20:43:41.235763 4746 status_manager.go:851] "Failed to get status for pod" podUID="b727dece-e4ea-4d38-899f-3c5c4c941a6c" pod="openshift-kube-apiserver/installer-9-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/installer-9-crc\": dial tcp 38.102.83.201:6443: connect: connection refused" Jan 28 20:43:41 crc kubenswrapper[4746]: I0128 20:43:41.236055 4746 status_manager.go:851] "Failed to get status for pod" podUID="d5a73916-bbee-4c88-9bab-9e6f460ca96d" pod="openshift-authentication/oauth-openshift-8ccb4757-8qk64" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-authentication/pods/oauth-openshift-8ccb4757-8qk64\": dial tcp 38.102.83.201:6443: connect: connection refused" Jan 28 20:43:42 crc kubenswrapper[4746]: I0128 20:43:42.248831 4746 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-controller-manager_kube-controller-manager-crc_f614b9022728cf315e60c057852e563e/kube-controller-manager/0.log" Jan 28 20:43:42 crc kubenswrapper[4746]: I0128 20:43:42.248913 4746 generic.go:334] 
"Generic (PLEG): container finished" podID="f614b9022728cf315e60c057852e563e" containerID="319cbcaa9981b3c79eb0c8f970f0ad776fb255db10c72c9b6a13469906e9e5ec" exitCode=1 Jan 28 20:43:42 crc kubenswrapper[4746]: I0128 20:43:42.249002 4746 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" event={"ID":"f614b9022728cf315e60c057852e563e","Type":"ContainerDied","Data":"319cbcaa9981b3c79eb0c8f970f0ad776fb255db10c72c9b6a13469906e9e5ec"} Jan 28 20:43:42 crc kubenswrapper[4746]: I0128 20:43:42.249622 4746 scope.go:117] "RemoveContainer" containerID="319cbcaa9981b3c79eb0c8f970f0ad776fb255db10c72c9b6a13469906e9e5ec" Jan 28 20:43:42 crc kubenswrapper[4746]: I0128 20:43:42.253967 4746 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" event={"ID":"71bb4a3aecc4ba5b26c4b7318770ce13","Type":"ContainerStarted","Data":"4bd8b7bb464d5a8b955f9fabf779ce786000002fb2ac735b74a089ff913ec6d3"} Jan 28 20:43:42 crc kubenswrapper[4746]: I0128 20:43:42.254022 4746 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" event={"ID":"71bb4a3aecc4ba5b26c4b7318770ce13","Type":"ContainerStarted","Data":"e3ab3e26c026edfa3c9d621b981afd3ab04171037307dfb2bfa32de051b2f093"} Jan 28 20:43:42 crc kubenswrapper[4746]: I0128 20:43:42.254033 4746 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" event={"ID":"71bb4a3aecc4ba5b26c4b7318770ce13","Type":"ContainerStarted","Data":"c23d65ebcb1e26e6277f95be85612fcb5644fd15f72c6686c5ea7031a9d19328"} Jan 28 20:43:43 crc kubenswrapper[4746]: I0128 20:43:43.262581 4746 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" event={"ID":"71bb4a3aecc4ba5b26c4b7318770ce13","Type":"ContainerStarted","Data":"5d9c71c19d6be8e03ee0be5af7bd36065379ee038d728aea1b100ad53a6f1990"} Jan 28 20:43:43 crc kubenswrapper[4746]: I0128 
20:43:43.263284 4746 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-kube-apiserver/kube-apiserver-crc" Jan 28 20:43:43 crc kubenswrapper[4746]: I0128 20:43:43.262738 4746 kubelet.go:1909] "Trying to delete pod" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="64499f0d-5c10-43a9-b479-9b6c7ef2fb9c" Jan 28 20:43:43 crc kubenswrapper[4746]: I0128 20:43:43.263314 4746 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" event={"ID":"71bb4a3aecc4ba5b26c4b7318770ce13","Type":"ContainerStarted","Data":"d40525a6e97b5a3fa5588ebab30748e57165f6611ca14b88791c51dac5d0f790"} Jan 28 20:43:43 crc kubenswrapper[4746]: I0128 20:43:43.263327 4746 mirror_client.go:130] "Deleting a mirror pod" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="64499f0d-5c10-43a9-b479-9b6c7ef2fb9c" Jan 28 20:43:43 crc kubenswrapper[4746]: I0128 20:43:43.266412 4746 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-controller-manager_kube-controller-manager-crc_f614b9022728cf315e60c057852e563e/kube-controller-manager/0.log" Jan 28 20:43:43 crc kubenswrapper[4746]: I0128 20:43:43.266463 4746 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" event={"ID":"f614b9022728cf315e60c057852e563e","Type":"ContainerStarted","Data":"5d6cb7e7e21ac6646d2210a99b0c689073e1c863606a2c798ef52d4f574324d7"} Jan 28 20:43:43 crc kubenswrapper[4746]: I0128 20:43:43.332918 4746 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-kube-controller-manager/kube-controller-manager-crc" Jan 28 20:43:43 crc kubenswrapper[4746]: I0128 20:43:43.338075 4746 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-kube-controller-manager/kube-controller-manager-crc" Jan 28 20:43:44 crc kubenswrapper[4746]: I0128 20:43:44.271482 4746 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" 
pod="openshift-kube-controller-manager/kube-controller-manager-crc" Jan 28 20:43:45 crc kubenswrapper[4746]: I0128 20:43:45.860635 4746 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-kube-apiserver/kube-apiserver-crc" Jan 28 20:43:45 crc kubenswrapper[4746]: I0128 20:43:45.860701 4746 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-kube-apiserver/kube-apiserver-crc" Jan 28 20:43:45 crc kubenswrapper[4746]: I0128 20:43:45.870764 4746 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-kube-apiserver/kube-apiserver-crc" Jan 28 20:43:48 crc kubenswrapper[4746]: I0128 20:43:48.275770 4746 kubelet.go:1914] "Deleted mirror pod because it is outdated" pod="openshift-kube-apiserver/kube-apiserver-crc" Jan 28 20:43:49 crc kubenswrapper[4746]: I0128 20:43:49.306031 4746 kubelet.go:1909] "Trying to delete pod" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="64499f0d-5c10-43a9-b479-9b6c7ef2fb9c" Jan 28 20:43:49 crc kubenswrapper[4746]: I0128 20:43:49.306706 4746 mirror_client.go:130] "Deleting a mirror pod" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="64499f0d-5c10-43a9-b479-9b6c7ef2fb9c" Jan 28 20:43:49 crc kubenswrapper[4746]: I0128 20:43:49.312762 4746 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-kube-apiserver/kube-apiserver-crc" Jan 28 20:43:49 crc kubenswrapper[4746]: I0128 20:43:49.316288 4746 status_manager.go:861] "Pod was deleted and then recreated, skipping status update" pod="openshift-kube-apiserver/kube-apiserver-crc" oldPodUID="71bb4a3aecc4ba5b26c4b7318770ce13" podUID="9c2d12ff-a3ff-4b27-9a29-c546ba58624d" Jan 28 20:43:50 crc kubenswrapper[4746]: I0128 20:43:50.315462 4746 kubelet.go:1909] "Trying to delete pod" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="64499f0d-5c10-43a9-b479-9b6c7ef2fb9c" Jan 28 20:43:50 crc kubenswrapper[4746]: I0128 20:43:50.315520 4746 mirror_client.go:130] 
"Deleting a mirror pod" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="64499f0d-5c10-43a9-b479-9b6c7ef2fb9c" Jan 28 20:43:52 crc kubenswrapper[4746]: I0128 20:43:52.871404 4746 status_manager.go:861] "Pod was deleted and then recreated, skipping status update" pod="openshift-kube-apiserver/kube-apiserver-crc" oldPodUID="71bb4a3aecc4ba5b26c4b7318770ce13" podUID="9c2d12ff-a3ff-4b27-9a29-c546ba58624d" Jan 28 20:43:58 crc kubenswrapper[4746]: I0128 20:43:58.217595 4746 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager-operator"/"openshift-service-ca.crt" Jan 28 20:43:58 crc kubenswrapper[4746]: I0128 20:43:58.462652 4746 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-kube-controller-manager/kube-controller-manager-crc" Jan 28 20:43:58 crc kubenswrapper[4746]: I0128 20:43:58.662284 4746 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-oauth-apiserver"/"openshift-service-ca.crt" Jan 28 20:43:58 crc kubenswrapper[4746]: I0128 20:43:58.779358 4746 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-multus"/"multus-daemon-config" Jan 28 20:43:59 crc kubenswrapper[4746]: I0128 20:43:59.191160 4746 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-service-ca-operator"/"service-ca-operator-config" Jan 28 20:43:59 crc kubenswrapper[4746]: I0128 20:43:59.278173 4746 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-multus"/"metrics-daemon-sa-dockercfg-d427c" Jan 28 20:43:59 crc kubenswrapper[4746]: I0128 20:43:59.335279 4746 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-console"/"default-dockercfg-chnjx" Jan 28 20:43:59 crc kubenswrapper[4746]: I0128 20:43:59.665724 4746 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"client-ca" Jan 28 20:43:59 crc kubenswrapper[4746]: I0128 20:43:59.681035 4746 
reflector.go:368] Caches populated for *v1.Secret from object-"openshift-cluster-samples-operator"/"cluster-samples-operator-dockercfg-xpp9w" Jan 28 20:43:59 crc kubenswrapper[4746]: I0128 20:43:59.688880 4746 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"oauth-openshift-dockercfg-znhcc" Jan 28 20:43:59 crc kubenswrapper[4746]: I0128 20:43:59.953953 4746 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-etcd-operator"/"etcd-client" Jan 28 20:43:59 crc kubenswrapper[4746]: I0128 20:43:59.986747 4746 reflector.go:368] Caches populated for *v1.CSIDriver from k8s.io/client-go/informers/factory.go:160 Jan 28 20:44:00 crc kubenswrapper[4746]: I0128 20:44:00.043533 4746 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-console"/"console-oauth-config" Jan 28 20:44:00 crc kubenswrapper[4746]: I0128 20:44:00.108800 4746 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ovn-kubernetes"/"env-overrides" Jan 28 20:44:00 crc kubenswrapper[4746]: I0128 20:44:00.115379 4746 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ingress"/"openshift-service-ca.crt" Jan 28 20:44:00 crc kubenswrapper[4746]: I0128 20:44:00.317146 4746 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-image-registry"/"node-ca-dockercfg-4777p" Jan 28 20:44:00 crc kubenswrapper[4746]: I0128 20:44:00.354980 4746 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-operator-lifecycle-manager"/"kube-root-ca.crt" Jan 28 20:44:00 crc kubenswrapper[4746]: I0128 20:44:00.469914 4746 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"redhat-marketplace-dockercfg-x2ctb" Jan 28 20:44:00 crc kubenswrapper[4746]: I0128 20:44:00.528501 4746 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver"/"image-import-ca" Jan 28 20:44:00 crc kubenswrapper[4746]: I0128 
20:44:00.725305 4746 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-controller-manager-operator"/"kube-root-ca.crt" Jan 28 20:44:00 crc kubenswrapper[4746]: I0128 20:44:00.757606 4746 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"mcc-proxy-tls" Jan 28 20:44:00 crc kubenswrapper[4746]: I0128 20:44:00.869641 4746 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-multus"/"kube-root-ca.crt" Jan 28 20:44:01 crc kubenswrapper[4746]: I0128 20:44:01.155980 4746 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-storage-version-migrator-operator"/"openshift-service-ca.crt" Jan 28 20:44:01 crc kubenswrapper[4746]: I0128 20:44:01.189405 4746 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-multus"/"multus-ancillary-tools-dockercfg-vnmsz" Jan 28 20:44:01 crc kubenswrapper[4746]: I0128 20:44:01.373747 4746 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-machine-approver"/"kube-root-ca.crt" Jan 28 20:44:01 crc kubenswrapper[4746]: I0128 20:44:01.416074 4746 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver-operator"/"openshift-service-ca.crt" Jan 28 20:44:01 crc kubenswrapper[4746]: I0128 20:44:01.499606 4746 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-scheduler-operator"/"openshift-kube-scheduler-operator-config" Jan 28 20:44:01 crc kubenswrapper[4746]: I0128 20:44:01.548732 4746 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-service-ca"/"signing-cabundle" Jan 28 20:44:01 crc kubenswrapper[4746]: I0128 20:44:01.661785 4746 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-network-node-identity"/"network-node-identity-cert" Jan 28 20:44:01 crc kubenswrapper[4746]: I0128 20:44:01.767909 4746 reflector.go:368] Caches populated for *v1.Secret from 
object-"openshift-console"/"console-dockercfg-f62pw" Jan 28 20:44:01 crc kubenswrapper[4746]: I0128 20:44:01.881432 4746 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-multus"/"default-dockercfg-2q5b6" Jan 28 20:44:01 crc kubenswrapper[4746]: I0128 20:44:01.931061 4746 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"olm-operator-serving-cert" Jan 28 20:44:02 crc kubenswrapper[4746]: I0128 20:44:02.005218 4746 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-api"/"control-plane-machine-set-operator-dockercfg-k9rxt" Jan 28 20:44:02 crc kubenswrapper[4746]: I0128 20:44:02.084780 4746 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication"/"v4-0-config-system-trusted-ca-bundle" Jan 28 20:44:02 crc kubenswrapper[4746]: I0128 20:44:02.189730 4746 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-etcd-operator"/"etcd-service-ca-bundle" Jan 28 20:44:02 crc kubenswrapper[4746]: I0128 20:44:02.287566 4746 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ovn-kubernetes"/"openshift-service-ca.crt" Jan 28 20:44:02 crc kubenswrapper[4746]: I0128 20:44:02.614387 4746 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-config-operator"/"kube-root-ca.crt" Jan 28 20:44:02 crc kubenswrapper[4746]: I0128 20:44:02.640057 4746 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-diagnostics"/"openshift-service-ca.crt" Jan 28 20:44:02 crc kubenswrapper[4746]: I0128 20:44:02.659019 4746 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ingress-canary"/"openshift-service-ca.crt" Jan 28 20:44:02 crc kubenswrapper[4746]: I0128 20:44:02.686253 4746 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-image-registry"/"cluster-image-registry-operator-dockercfg-m4qtx" Jan 28 20:44:02 crc 
kubenswrapper[4746]: I0128 20:44:02.724739 4746 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication-operator"/"authentication-operator-config" Jan 28 20:44:02 crc kubenswrapper[4746]: I0128 20:44:02.743713 4746 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-node-identity"/"openshift-service-ca.crt" Jan 28 20:44:02 crc kubenswrapper[4746]: I0128 20:44:02.801277 4746 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-storage-version-migrator-operator"/"config" Jan 28 20:44:02 crc kubenswrapper[4746]: I0128 20:44:02.926256 4746 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"kube-root-ca.crt" Jan 28 20:44:03 crc kubenswrapper[4746]: I0128 20:44:03.144403 4746 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-scheduler-operator"/"kube-scheduler-operator-serving-cert" Jan 28 20:44:03 crc kubenswrapper[4746]: I0128 20:44:03.272413 4746 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-apiserver-operator"/"kube-apiserver-operator-dockercfg-x57mr" Jan 28 20:44:03 crc kubenswrapper[4746]: I0128 20:44:03.282116 4746 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-marketplace"/"openshift-service-ca.crt" Jan 28 20:44:03 crc kubenswrapper[4746]: I0128 20:44:03.329668 4746 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-operator-lifecycle-manager"/"openshift-service-ca.crt" Jan 28 20:44:03 crc kubenswrapper[4746]: I0128 20:44:03.341542 4746 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-controller-manager-operator"/"kube-controller-manager-operator-serving-cert" Jan 28 20:44:03 crc kubenswrapper[4746]: I0128 20:44:03.520716 4746 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console"/"oauth-serving-cert" Jan 28 20:44:03 crc kubenswrapper[4746]: I0128 
20:44:03.534285 4746 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ovn-kubernetes"/"ovnkube-script-lib" Jan 28 20:44:03 crc kubenswrapper[4746]: I0128 20:44:03.539235 4746 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-etcd-operator"/"etcd-ca-bundle" Jan 28 20:44:03 crc kubenswrapper[4746]: I0128 20:44:03.546378 4746 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"catalog-operator-serving-cert" Jan 28 20:44:03 crc kubenswrapper[4746]: I0128 20:44:03.631638 4746 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-diagnostics"/"kube-root-ca.crt" Jan 28 20:44:03 crc kubenswrapper[4746]: I0128 20:44:03.634438 4746 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"machine-config-controller-dockercfg-c2lfx" Jan 28 20:44:03 crc kubenswrapper[4746]: I0128 20:44:03.720875 4746 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-operator"/"iptables-alerter-script" Jan 28 20:44:03 crc kubenswrapper[4746]: I0128 20:44:03.750001 4746 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-oauth-apiserver"/"audit-1" Jan 28 20:44:03 crc kubenswrapper[4746]: I0128 20:44:03.824520 4746 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-version"/"kube-root-ca.crt" Jan 28 20:44:03 crc kubenswrapper[4746]: I0128 20:44:03.827034 4746 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication"/"kube-root-ca.crt" Jan 28 20:44:03 crc kubenswrapper[4746]: I0128 20:44:03.840150 4746 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-cluster-samples-operator"/"samples-operator-tls" Jan 28 20:44:03 crc kubenswrapper[4746]: I0128 20:44:03.856745 4746 reflector.go:368] Caches populated for *v1.Secret from 
object-"openshift-route-controller-manager"/"route-controller-manager-sa-dockercfg-h2zr2" Jan 28 20:44:03 crc kubenswrapper[4746]: I0128 20:44:03.947418 4746 reflector.go:368] Caches populated for *v1.ConfigMap from object-"hostpath-provisioner"/"openshift-service-ca.crt" Jan 28 20:44:03 crc kubenswrapper[4746]: I0128 20:44:03.950744 4746 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-network-operator"/"metrics-tls" Jan 28 20:44:04 crc kubenswrapper[4746]: I0128 20:44:04.023641 4746 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"kube-root-ca.crt" Jan 28 20:44:04 crc kubenswrapper[4746]: I0128 20:44:04.031527 4746 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-machine-approver"/"machine-approver-config" Jan 28 20:44:04 crc kubenswrapper[4746]: I0128 20:44:04.079402 4746 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-image-registry"/"image-registry-certificates" Jan 28 20:44:04 crc kubenswrapper[4746]: I0128 20:44:04.090993 4746 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-multus"/"multus-admission-controller-secret" Jan 28 20:44:04 crc kubenswrapper[4746]: I0128 20:44:04.107733 4746 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-api"/"machine-api-operator-dockercfg-mfbb7" Jan 28 20:44:04 crc kubenswrapper[4746]: I0128 20:44:04.115807 4746 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"redhat-operators-dockercfg-ct8rh" Jan 28 20:44:04 crc kubenswrapper[4746]: I0128 20:44:04.160328 4746 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-config-operator"/"kube-root-ca.crt" Jan 28 20:44:04 crc kubenswrapper[4746]: I0128 20:44:04.208461 4746 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"node-bootstrapper-token" Jan 28 20:44:04 crc 
kubenswrapper[4746]: I0128 20:44:04.238760 4746 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-operator"/"kube-root-ca.crt" Jan 28 20:44:04 crc kubenswrapper[4746]: I0128 20:44:04.287553 4746 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-etcd-operator"/"etcd-operator-serving-cert" Jan 28 20:44:04 crc kubenswrapper[4746]: I0128 20:44:04.310373 4746 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-cluster-machine-approver"/"machine-approver-sa-dockercfg-nl2j4" Jan 28 20:44:04 crc kubenswrapper[4746]: I0128 20:44:04.321321 4746 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-config-operator"/"kube-rbac-proxy" Jan 28 20:44:04 crc kubenswrapper[4746]: I0128 20:44:04.377541 4746 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"openshift-global-ca" Jan 28 20:44:04 crc kubenswrapper[4746]: I0128 20:44:04.414988 4746 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"openshift-service-ca.crt" Jan 28 20:44:04 crc kubenswrapper[4746]: I0128 20:44:04.517789 4746 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-oauth-apiserver"/"kube-root-ca.crt" Jan 28 20:44:04 crc kubenswrapper[4746]: I0128 20:44:04.518713 4746 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-user-template-error" Jan 28 20:44:04 crc kubenswrapper[4746]: I0128 20:44:04.526231 4746 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-controller-manager-operator"/"kube-controller-manager-operator-config" Jan 28 20:44:04 crc kubenswrapper[4746]: I0128 20:44:04.549469 4746 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-multus"/"default-cni-sysctl-allowlist" Jan 28 20:44:04 crc kubenswrapper[4746]: I0128 20:44:04.710188 4746 reflector.go:368] Caches populated 
for *v1.Secret from object-"openshift-controller-manager-operator"/"openshift-controller-manager-operator-dockercfg-vw8fw" Jan 28 20:44:04 crc kubenswrapper[4746]: I0128 20:44:04.777482 4746 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-samples-operator"/"openshift-service-ca.crt" Jan 28 20:44:04 crc kubenswrapper[4746]: I0128 20:44:04.828452 4746 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-dns"/"dns-default-metrics-tls" Jan 28 20:44:04 crc kubenswrapper[4746]: I0128 20:44:04.828742 4746 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-scheduler-operator"/"kube-root-ca.crt" Jan 28 20:44:04 crc kubenswrapper[4746]: I0128 20:44:04.828845 4746 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ingress"/"router-certs-default" Jan 28 20:44:04 crc kubenswrapper[4746]: I0128 20:44:04.829650 4746 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-apiserver-operator"/"openshift-apiserver-operator-serving-cert" Jan 28 20:44:04 crc kubenswrapper[4746]: I0128 20:44:04.898027 4746 reflector.go:368] Caches populated for *v1.Pod from pkg/kubelet/config/apiserver.go:66 Jan 28 20:44:04 crc kubenswrapper[4746]: I0128 20:44:04.899229 4746 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-authentication/oauth-openshift-8ccb4757-8qk64" podStartSLOduration=72.899207433 podStartE2EDuration="1m12.899207433s" podCreationTimestamp="2026-01-28 20:42:52 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-28 20:43:48.130705315 +0000 UTC m=+256.086891679" watchObservedRunningTime="2026-01-28 20:44:04.899207433 +0000 UTC m=+272.855393807" Jan 28 20:44:04 crc kubenswrapper[4746]: I0128 20:44:04.900572 4746 pod_startup_latency_tracker.go:104] "Observed pod startup duration" 
pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" podStartSLOduration=37.900564511 podStartE2EDuration="37.900564511s" podCreationTimestamp="2026-01-28 20:43:27 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-28 20:43:48.183970436 +0000 UTC m=+256.140156830" watchObservedRunningTime="2026-01-28 20:44:04.900564511 +0000 UTC m=+272.856750885" Jan 28 20:44:04 crc kubenswrapper[4746]: I0128 20:44:04.905847 4746 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-kube-apiserver/kube-apiserver-crc"] Jan 28 20:44:04 crc kubenswrapper[4746]: I0128 20:44:04.905951 4746 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-kube-apiserver/kube-apiserver-crc"] Jan 28 20:44:04 crc kubenswrapper[4746]: I0128 20:44:04.920449 4746 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-kube-apiserver/kube-apiserver-crc" Jan 28 20:44:04 crc kubenswrapper[4746]: I0128 20:44:04.948201 4746 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-kube-apiserver/kube-apiserver-crc" podStartSLOduration=16.948176208 podStartE2EDuration="16.948176208s" podCreationTimestamp="2026-01-28 20:43:48 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-28 20:44:04.946231254 +0000 UTC m=+272.902417628" watchObservedRunningTime="2026-01-28 20:44:04.948176208 +0000 UTC m=+272.904362562" Jan 28 20:44:05 crc kubenswrapper[4746]: I0128 20:44:05.024583 4746 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-marketplace"/"marketplace-trusted-ca" Jan 28 20:44:05 crc kubenswrapper[4746]: I0128 20:44:05.138349 4746 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager-operator"/"openshift-controller-manager-operator-config" Jan 28 20:44:05 crc kubenswrapper[4746]: 
I0128 20:44:05.154460 4746 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-image-registry"/"openshift-service-ca.crt" Jan 28 20:44:05 crc kubenswrapper[4746]: I0128 20:44:05.165808 4746 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-oauth-apiserver"/"etcd-serving-ca" Jan 28 20:44:05 crc kubenswrapper[4746]: I0128 20:44:05.303031 4746 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-apiserver"/"openshift-apiserver-sa-dockercfg-djjff" Jan 28 20:44:05 crc kubenswrapper[4746]: I0128 20:44:05.304765 4746 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-oauth-apiserver"/"serving-cert" Jan 28 20:44:05 crc kubenswrapper[4746]: I0128 20:44:05.389740 4746 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-machine-approver"/"kube-rbac-proxy" Jan 28 20:44:05 crc kubenswrapper[4746]: I0128 20:44:05.556779 4746 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ingress-canary"/"canary-serving-cert" Jan 28 20:44:05 crc kubenswrapper[4746]: I0128 20:44:05.559905 4746 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver"/"audit-1" Jan 28 20:44:05 crc kubenswrapper[4746]: I0128 20:44:05.574971 4746 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-cluster-version"/"cluster-version-operator-serving-cert" Jan 28 20:44:05 crc kubenswrapper[4746]: I0128 20:44:05.591121 4746 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"client-ca" Jan 28 20:44:05 crc kubenswrapper[4746]: I0128 20:44:05.672726 4746 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver"/"kube-root-ca.crt" Jan 28 20:44:05 crc kubenswrapper[4746]: I0128 20:44:05.777354 4746 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"packageserver-service-cert" Jan 28 20:44:05 crc 
kubenswrapper[4746]: I0128 20:44:05.875009 4746 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication-operator"/"openshift-service-ca.crt" Jan 28 20:44:05 crc kubenswrapper[4746]: I0128 20:44:05.948005 4746 reflector.go:368] Caches populated for *v1.Service from k8s.io/client-go/informers/factory.go:160 Jan 28 20:44:05 crc kubenswrapper[4746]: I0128 20:44:05.971801 4746 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-api"/"machine-api-operator-images" Jan 28 20:44:06 crc kubenswrapper[4746]: I0128 20:44:06.101975 4746 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-service-ca-operator"/"serving-cert" Jan 28 20:44:06 crc kubenswrapper[4746]: I0128 20:44:06.159807 4746 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console"/"kube-root-ca.crt" Jan 28 20:44:06 crc kubenswrapper[4746]: I0128 20:44:06.171664 4746 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-dns"/"node-resolver-dockercfg-kz9s7" Jan 28 20:44:06 crc kubenswrapper[4746]: I0128 20:44:06.262004 4746 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-controller-manager"/"serving-cert" Jan 28 20:44:06 crc kubenswrapper[4746]: I0128 20:44:06.262878 4746 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ingress-operator"/"metrics-tls" Jan 28 20:44:06 crc kubenswrapper[4746]: I0128 20:44:06.415756 4746 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console-operator"/"trusted-ca" Jan 28 20:44:06 crc kubenswrapper[4746]: I0128 20:44:06.433283 4746 reflector.go:368] Caches populated for *v1.ConfigMap from object-"hostpath-provisioner"/"kube-root-ca.crt" Jan 28 20:44:06 crc kubenswrapper[4746]: I0128 20:44:06.487434 4746 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-system-ocp-branding-template" Jan 28 20:44:06 crc kubenswrapper[4746]: 
I0128 20:44:06.495127 4746 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ingress-canary"/"default-dockercfg-2llfx" Jan 28 20:44:06 crc kubenswrapper[4746]: I0128 20:44:06.691667 4746 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ingress-operator"/"openshift-service-ca.crt" Jan 28 20:44:06 crc kubenswrapper[4746]: I0128 20:44:06.814637 4746 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ovn-kubernetes"/"ovn-kubernetes-control-plane-dockercfg-gs7dd" Jan 28 20:44:06 crc kubenswrapper[4746]: I0128 20:44:06.819445 4746 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ovn-kubernetes"/"ovn-node-metrics-cert" Jan 28 20:44:06 crc kubenswrapper[4746]: I0128 20:44:06.983698 4746 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"config" Jan 28 20:44:06 crc kubenswrapper[4746]: I0128 20:44:06.999600 4746 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-api"/"kube-root-ca.crt" Jan 28 20:44:07 crc kubenswrapper[4746]: I0128 20:44:07.023658 4746 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-console"/"networking-console-plugin" Jan 28 20:44:07 crc kubenswrapper[4746]: I0128 20:44:07.049035 4746 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-dns-operator"/"metrics-tls" Jan 28 20:44:07 crc kubenswrapper[4746]: I0128 20:44:07.057019 4746 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-config-operator"/"openshift-config-operator-dockercfg-7pc5z" Jan 28 20:44:07 crc kubenswrapper[4746]: I0128 20:44:07.245682 4746 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-dns"/"dns-dockercfg-jwfmh" Jan 28 20:44:07 crc kubenswrapper[4746]: I0128 20:44:07.255007 4746 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-dns-operator"/"kube-root-ca.crt" Jan 28 20:44:07 crc 
kubenswrapper[4746]: I0128 20:44:07.297902 4746 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-service-ca"/"kube-root-ca.crt"
Jan 28 20:44:07 crc kubenswrapper[4746]: I0128 20:44:07.311441 4746 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-image-registry"/"trusted-ca"
Jan 28 20:44:07 crc kubenswrapper[4746]: I0128 20:44:07.518652 4746 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"certified-operators-dockercfg-4rs5g"
Jan 28 20:44:07 crc kubenswrapper[4746]: I0128 20:44:07.635301 4746 reflector.go:368] Caches populated for *v1.Secret from object-"hostpath-provisioner"/"csi-hostpath-provisioner-sa-dockercfg-qd74k"
Jan 28 20:44:07 crc kubenswrapper[4746]: I0128 20:44:07.657028 4746 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-oauth-apiserver"/"trusted-ca-bundle"
Jan 28 20:44:07 crc kubenswrapper[4746]: I0128 20:44:07.680182 4746 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-image-registry"/"installation-pull-secrets"
Jan 28 20:44:07 crc kubenswrapper[4746]: I0128 20:44:07.724754 4746 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-storage-version-migrator"/"openshift-service-ca.crt"
Jan 28 20:44:07 crc kubenswrapper[4746]: I0128 20:44:07.731279 4746 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-etcd-operator"/"etcd-operator-dockercfg-r9srn"
Jan 28 20:44:07 crc kubenswrapper[4746]: I0128 20:44:07.736177 4746 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-etcd-operator"/"etcd-operator-config"
Jan 28 20:44:07 crc kubenswrapper[4746]: I0128 20:44:07.768131 4746 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ingress-operator"/"trusted-ca"
Jan 28 20:44:07 crc kubenswrapper[4746]: I0128 20:44:07.874976 4746 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-route-controller-manager"/"serving-cert"
Jan 28 20:44:07 crc kubenswrapper[4746]: I0128 20:44:07.877521 4746 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-marketplace"/"kube-root-ca.crt"
Jan 28 20:44:07 crc kubenswrapper[4746]: I0128 20:44:07.889229 4746 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"package-server-manager-serving-cert"
Jan 28 20:44:07 crc kubenswrapper[4746]: I0128 20:44:07.891614 4746 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-operator"/"openshift-service-ca.crt"
Jan 28 20:44:08 crc kubenswrapper[4746]: I0128 20:44:08.004110 4746 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver"/"etcd-serving-ca"
Jan 28 20:44:08 crc kubenswrapper[4746]: I0128 20:44:08.029879 4746 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-oauth-apiserver"/"oauth-apiserver-sa-dockercfg-6r2bq"
Jan 28 20:44:08 crc kubenswrapper[4746]: I0128 20:44:08.036202 4746 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-controller-manager-operator"/"kube-controller-manager-operator-dockercfg-gkqpw"
Jan 28 20:44:08 crc kubenswrapper[4746]: I0128 20:44:08.045517 4746 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-network-console"/"networking-console-plugin-cert"
Jan 28 20:44:08 crc kubenswrapper[4746]: I0128 20:44:08.301951 4746 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-dns-operator"/"openshift-service-ca.crt"
Jan 28 20:44:08 crc kubenswrapper[4746]: I0128 20:44:08.345695 4746 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-storage-version-migrator-operator"/"kube-storage-version-migrator-operator-dockercfg-2bh8d"
Jan 28 20:44:08 crc kubenswrapper[4746]: I0128 20:44:08.364973 4746 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-apiserver"/"etcd-client"
Jan 28 20:44:08 crc kubenswrapper[4746]: I0128 20:44:08.392992 4746 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"proxy-tls"
Jan 28 20:44:08 crc kubenswrapper[4746]: I0128 20:44:08.468654 4746 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ingress"/"router-stats-default"
Jan 28 20:44:08 crc kubenswrapper[4746]: I0128 20:44:08.546720 4746 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-api"/"kube-rbac-proxy"
Jan 28 20:44:08 crc kubenswrapper[4746]: I0128 20:44:08.556433 4746 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"olm-operator-serviceaccount-dockercfg-rq7zk"
Jan 28 20:44:08 crc kubenswrapper[4746]: I0128 20:44:08.605420 4746 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ingress"/"router-metrics-certs-default"
Jan 28 20:44:08 crc kubenswrapper[4746]: I0128 20:44:08.635597 4746 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-system-router-certs"
Jan 28 20:44:08 crc kubenswrapper[4746]: I0128 20:44:08.639906 4746 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-machine-approver"/"openshift-service-ca.crt"
Jan 28 20:44:08 crc kubenswrapper[4746]: I0128 20:44:08.644154 4746 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-user-template-login"
Jan 28 20:44:08 crc kubenswrapper[4746]: I0128 20:44:08.794562 4746 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-api"/"openshift-service-ca.crt"
Jan 28 20:44:08 crc kubenswrapper[4746]: I0128 20:44:08.823228 4746 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"machine-config-operator-dockercfg-98p87"
Jan 28 20:44:08 crc kubenswrapper[4746]: I0128 20:44:08.882632 4746 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver"/"config"
Jan 28 20:44:08 crc kubenswrapper[4746]: I0128 20:44:08.920374 4746 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-user-idp-0-file-data"
Jan 28 20:44:09 crc kubenswrapper[4746]: I0128 20:44:09.069831 4746 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver-operator"/"kube-root-ca.crt"
Jan 28 20:44:09 crc kubenswrapper[4746]: I0128 20:44:09.147950 4746 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-image-registry"/"image-registry-tls"
Jan 28 20:44:09 crc kubenswrapper[4746]: I0128 20:44:09.155245 4746 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ingress-operator"/"kube-root-ca.crt"
Jan 28 20:44:09 crc kubenswrapper[4746]: I0128 20:44:09.178869 4746 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ingress"/"kube-root-ca.crt"
Jan 28 20:44:09 crc kubenswrapper[4746]: I0128 20:44:09.240884 4746 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console-operator"/"console-operator-config"
Jan 28 20:44:09 crc kubenswrapper[4746]: I0128 20:44:09.263381 4746 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-apiserver"/"encryption-config-1"
Jan 28 20:44:09 crc kubenswrapper[4746]: I0128 20:44:09.289454 4746 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"machine-config-server-tls"
Jan 28 20:44:09 crc kubenswrapper[4746]: I0128 20:44:09.327644 4746 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-api"/"control-plane-machine-set-operator-tls"
Jan 28 20:44:09 crc kubenswrapper[4746]: I0128 20:44:09.337683 4746 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ovn-kubernetes"/"ovn-control-plane-metrics-cert"
Jan 28 20:44:09 crc kubenswrapper[4746]: I0128 20:44:09.341439 4746 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver-operator"/"openshift-apiserver-operator-config"
Jan 28 20:44:09 crc kubenswrapper[4746]: I0128 20:44:09.384798 4746 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-apiserver-operator"/"kube-root-ca.crt"
Jan 28 20:44:09 crc kubenswrapper[4746]: I0128 20:44:09.393749 4746 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-storage-version-migrator"/"kube-storage-version-migrator-sa-dockercfg-5xfcg"
Jan 28 20:44:09 crc kubenswrapper[4746]: I0128 20:44:09.436286 4746 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication-operator"/"trusted-ca-bundle"
Jan 28 20:44:09 crc kubenswrapper[4746]: I0128 20:44:09.449798 4746 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication-operator"/"authentication-operator-dockercfg-mz9bj"
Jan 28 20:44:09 crc kubenswrapper[4746]: I0128 20:44:09.464972 4746 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-cluster-machine-approver"/"machine-approver-tls"
Jan 28 20:44:09 crc kubenswrapper[4746]: I0128 20:44:09.467477 4746 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-storage-version-migrator-operator"/"kube-root-ca.crt"
Jan 28 20:44:09 crc kubenswrapper[4746]: I0128 20:44:09.468742 4746 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-cluster-version"/"default-dockercfg-gxtc4"
Jan 28 20:44:09 crc kubenswrapper[4746]: I0128 20:44:09.473122 4746 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-config-operator"/"openshift-service-ca.crt"
Jan 28 20:44:09 crc kubenswrapper[4746]: I0128 20:44:09.479756 4746 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-image-registry"/"registry-dockercfg-kzzsd"
Jan 28 20:44:09 crc kubenswrapper[4746]: I0128 20:44:09.492199 4746 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-etcd-operator"/"openshift-service-ca.crt"
Jan 28 20:44:09 crc kubenswrapper[4746]: I0128 20:44:09.544188 4746 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-apiserver-operator"/"openshift-apiserver-operator-dockercfg-xtcjv"
Jan 28 20:44:09 crc kubenswrapper[4746]: I0128 20:44:09.556677 4746 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-multus"/"cni-copy-resources"
Jan 28 20:44:09 crc kubenswrapper[4746]: I0128 20:44:09.557032 4746 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-controller-manager"/"openshift-controller-manager-sa-dockercfg-msq4c"
Jan 28 20:44:09 crc kubenswrapper[4746]: I0128 20:44:09.582914 4746 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console"/"openshift-service-ca.crt"
Jan 28 20:44:09 crc kubenswrapper[4746]: I0128 20:44:09.678401 4746 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-image-registry"/"image-registry-operator-tls"
Jan 28 20:44:09 crc kubenswrapper[4746]: I0128 20:44:09.842067 4746 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-storage-version-migrator"/"kube-root-ca.crt"
Jan 28 20:44:09 crc kubenswrapper[4746]: I0128 20:44:09.867806 4746 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-console-operator"/"serving-cert"
Jan 28 20:44:09 crc kubenswrapper[4746]: I0128 20:44:09.930325 4746 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-storage-version-migrator-operator"/"serving-cert"
Jan 28 20:44:09 crc kubenswrapper[4746]: I0128 20:44:09.931205 4746 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console-operator"/"openshift-service-ca.crt"
Jan 28 20:44:09 crc kubenswrapper[4746]: I0128 20:44:09.982504 4746 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-multus"/"multus-ac-dockercfg-9lkdf"
Jan 28 20:44:09 crc kubenswrapper[4746]: I0128 20:44:09.984021 4746 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-node-identity"/"ovnkube-identity-cm"
Jan 28 20:44:10 crc kubenswrapper[4746]: I0128 20:44:10.015569 4746 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"openshift-service-ca.crt"
Jan 28 20:44:10 crc kubenswrapper[4746]: I0128 20:44:10.098247 4746 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication-operator"/"kube-root-ca.crt"
Jan 28 20:44:10 crc kubenswrapper[4746]: I0128 20:44:10.185225 4746 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-oauth-apiserver"/"encryption-config-1"
Jan 28 20:44:10 crc kubenswrapper[4746]: I0128 20:44:10.278417 4746 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ingress"/"router-dockercfg-zdk86"
Jan 28 20:44:10 crc kubenswrapper[4746]: I0128 20:44:10.278791 4746 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console"/"console-config"
Jan 28 20:44:10 crc kubenswrapper[4746]: I0128 20:44:10.328382 4746 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-version"/"openshift-service-ca.crt"
Jan 28 20:44:10 crc kubenswrapper[4746]: I0128 20:44:10.340654 4746 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-multus"/"openshift-service-ca.crt"
Jan 28 20:44:10 crc kubenswrapper[4746]: I0128 20:44:10.349278 4746 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication-operator"/"serving-cert"
Jan 28 20:44:10 crc kubenswrapper[4746]: I0128 20:44:10.352782 4746 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-dns"/"kube-root-ca.crt"
Jan 28 20:44:10 crc kubenswrapper[4746]: I0128 20:44:10.470978 4746 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-multus"/"metrics-daemon-secret"
Jan 28 20:44:10 crc kubenswrapper[4746]: I0128 20:44:10.892129 4746 kubelet.go:2431] "SyncLoop REMOVE" source="file" pods=["openshift-kube-apiserver/kube-apiserver-startup-monitor-crc"]
Jan 28 20:44:10 crc kubenswrapper[4746]: I0128 20:44:10.892434 4746 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" podUID="f85e55b1a89d02b0cb034b1ea31ed45a" containerName="startup-monitor" containerID="cri-o://516e4abe96c1a2de5a07ad273f06d4d5abe6209ec1947f4945d0353fba3b4511" gracePeriod=5
Jan 28 20:44:11 crc kubenswrapper[4746]: I0128 20:44:11.142576 4746 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ovn-kubernetes"/"kube-root-ca.crt"
Jan 28 20:44:11 crc kubenswrapper[4746]: I0128 20:44:11.187775 4746 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-console"/"console-serving-cert"
Jan 28 20:44:11 crc kubenswrapper[4746]: I0128 20:44:11.243861 4746 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console"/"service-ca"
Jan 28 20:44:11 crc kubenswrapper[4746]: I0128 20:44:11.265516 4746 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"machine-config-daemon-dockercfg-r5tcq"
Jan 28 20:44:11 crc kubenswrapper[4746]: I0128 20:44:11.330800 4746 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-service-ca"/"signing-key"
Jan 28 20:44:11 crc kubenswrapper[4746]: I0128 20:44:11.372959 4746 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-node-identity"/"env-overrides"
Jan 28 20:44:11 crc kubenswrapper[4746]: I0128 20:44:11.494339 4746 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-apiserver-operator"/"kube-apiserver-operator-config"
Jan 28 20:44:11 crc kubenswrapper[4746]: I0128 20:44:11.522141 4746 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-samples-operator"/"kube-root-ca.crt"
Jan 28 20:44:11 crc kubenswrapper[4746]: I0128 20:44:11.561473 4746 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"marketplace-operator-dockercfg-5nsgg"
Jan 28 20:44:11 crc kubenswrapper[4746]: I0128 20:44:11.570798 4746 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-system-serving-cert"
Jan 28 20:44:11 crc kubenswrapper[4746]: I0128 20:44:11.571606 4746 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-system-session"
Jan 28 20:44:11 crc kubenswrapper[4746]: I0128 20:44:11.574894 4746 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication"/"audit"
Jan 28 20:44:11 crc kubenswrapper[4746]: I0128 20:44:11.583768 4746 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console-operator"/"kube-root-ca.crt"
Jan 28 20:44:11 crc kubenswrapper[4746]: I0128 20:44:11.646349 4746 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-etcd-operator"/"kube-root-ca.crt"
Jan 28 20:44:11 crc kubenswrapper[4746]: I0128 20:44:11.653602 4746 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication-operator"/"service-ca-bundle"
Jan 28 20:44:11 crc kubenswrapper[4746]: I0128 20:44:11.669145 4746 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ingress"/"service-ca-bundle"
Jan 28 20:44:11 crc kubenswrapper[4746]: I0128 20:44:11.866138 4746 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-service-ca-operator"/"kube-root-ca.crt"
Jan 28 20:44:11 crc kubenswrapper[4746]: I0128 20:44:11.873969 4746 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-service-ca"/"service-ca-dockercfg-pn86c"
Jan 28 20:44:11 crc kubenswrapper[4746]: I0128 20:44:11.944488 4746 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-config-operator"/"machine-config-operator-images"
Jan 28 20:44:12 crc kubenswrapper[4746]: I0128 20:44:12.027880 4746 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ovn-kubernetes"/"ovn-kubernetes-node-dockercfg-pwtwl"
Jan 28 20:44:12 crc kubenswrapper[4746]: I0128 20:44:12.030765 4746 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"config"
Jan 28 20:44:12 crc kubenswrapper[4746]: I0128 20:44:12.134624 4746 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-user-template-provider-selection"
Jan 28 20:44:12 crc kubenswrapper[4746]: I0128 20:44:12.247547 4746 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication"/"v4-0-config-system-service-ca"
Jan 28 20:44:12 crc kubenswrapper[4746]: I0128 20:44:12.252348 4746 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-service-ca-operator"/"openshift-service-ca.crt"
Jan 28 20:44:12 crc kubenswrapper[4746]: I0128 20:44:12.307548 4746 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-api"/"machine-api-operator-tls"
Jan 28 20:44:12 crc kubenswrapper[4746]: I0128 20:44:12.358472 4746 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"machine-config-server-dockercfg-qx5rd"
Jan 28 20:44:12 crc kubenswrapper[4746]: I0128 20:44:12.421058 4746 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-node-identity"/"kube-root-ca.crt"
Jan 28 20:44:12 crc kubenswrapper[4746]: I0128 20:44:12.431547 4746 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-config-operator"/"config-operator-serving-cert"
Jan 28 20:44:12 crc kubenswrapper[4746]: I0128 20:44:12.486120 4746 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-scheduler-operator"/"openshift-kube-scheduler-operator-dockercfg-qt55r"
Jan 28 20:44:12 crc kubenswrapper[4746]: I0128 20:44:12.612669 4746 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-controller-manager-operator"/"openshift-controller-manager-operator-serving-cert"
Jan 28 20:44:12 crc kubenswrapper[4746]: I0128 20:44:12.669266 4746 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication"/"v4-0-config-system-cliconfig"
Jan 28 20:44:12 crc kubenswrapper[4746]: I0128 20:44:12.676719 4746 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-apiserver"/"serving-cert"
Jan 28 20:44:12 crc kubenswrapper[4746]: I0128 20:44:12.715026 4746 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"community-operators-dockercfg-dmngl"
Jan 28 20:44:12 crc kubenswrapper[4746]: I0128 20:44:12.911445 4746 reflector.go:368] Caches populated for *v1.Node from k8s.io/client-go/informers/factory.go:160
Jan 28 20:44:13 crc kubenswrapper[4746]: I0128 20:44:13.013752 4746 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"marketplace-operator-metrics"
Jan 28 20:44:13 crc kubenswrapper[4746]: I0128 20:44:13.074595 4746 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"pprof-cert"
Jan 28 20:44:13 crc kubenswrapper[4746]: I0128 20:44:13.151447 4746 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ovn-kubernetes"/"ovnkube-config"
Jan 28 20:44:13 crc kubenswrapper[4746]: I0128 20:44:13.321483 4746 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"mco-proxy-tls"
Jan 28 20:44:13 crc kubenswrapper[4746]: I0128 20:44:13.393114 4746 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-image-registry"/"kube-root-ca.crt"
Jan 28 20:44:13 crc kubenswrapper[4746]: I0128 20:44:13.525584 4746 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-apiserver-operator"/"kube-apiserver-operator-serving-cert"
Jan 28 20:44:13 crc kubenswrapper[4746]: I0128 20:44:13.557955 4746 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-dns"/"openshift-service-ca.crt"
Jan 28 20:44:13 crc kubenswrapper[4746]: I0128 20:44:13.732487 4746 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-dns-operator"/"dns-operator-dockercfg-9mqw5"
Jan 28 20:44:13 crc kubenswrapper[4746]: I0128 20:44:13.891754 4746 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver"/"openshift-service-ca.crt"
Jan 28 20:44:13 crc kubenswrapper[4746]: I0128 20:44:13.949238 4746 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager-operator"/"kube-root-ca.crt"
Jan 28 20:44:14 crc kubenswrapper[4746]: I0128 20:44:14.057459 4746 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-oauth-apiserver"/"etcd-client"
Jan 28 20:44:14 crc kubenswrapper[4746]: I0128 20:44:14.124199 4746 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver"/"trusted-ca-bundle"
Jan 28 20:44:14 crc kubenswrapper[4746]: I0128 20:44:14.316414 4746 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-dns"/"dns-default"
Jan 28 20:44:14 crc kubenswrapper[4746]: I0128 20:44:14.354053 4746 reflector.go:368] Caches populated for *v1.RuntimeClass from k8s.io/client-go/informers/factory.go:160
Jan 28 20:44:14 crc kubenswrapper[4746]: I0128 20:44:14.401178 4746 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-console-operator"/"console-operator-dockercfg-4xjcr"
Jan 28 20:44:14 crc kubenswrapper[4746]: I0128 20:44:14.439188 4746 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-service-ca"/"openshift-service-ca.crt"
Jan 28 20:44:14 crc kubenswrapper[4746]: I0128 20:44:14.613376 4746 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ingress-operator"/"ingress-operator-dockercfg-7lnqk"
Jan 28 20:44:14 crc kubenswrapper[4746]: I0128 20:44:14.786109 4746 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ingress-canary"/"kube-root-ca.crt"
Jan 28 20:44:15 crc kubenswrapper[4746]: I0128 20:44:15.371416 4746 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication"/"openshift-service-ca.crt"
Jan 28 20:44:15 crc kubenswrapper[4746]: I0128 20:44:15.682118 4746 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-config-operator"/"openshift-service-ca.crt"
Jan 28 20:44:16 crc kubenswrapper[4746]: I0128 20:44:16.013224 4746 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-service-ca-operator"/"service-ca-operator-dockercfg-rg9jl"
Jan 28 20:44:16 crc kubenswrapper[4746]: I0128 20:44:16.034117 4746 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-apiserver_kube-apiserver-startup-monitor-crc_f85e55b1a89d02b0cb034b1ea31ed45a/startup-monitor/0.log"
Jan 28 20:44:16 crc kubenswrapper[4746]: I0128 20:44:16.034221 4746 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc"
Jan 28 20:44:16 crc kubenswrapper[4746]: I0128 20:44:16.106123 4746 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"manifests\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-manifests\") pod \"f85e55b1a89d02b0cb034b1ea31ed45a\" (UID: \"f85e55b1a89d02b0cb034b1ea31ed45a\") "
Jan 28 20:44:16 crc kubenswrapper[4746]: I0128 20:44:16.106176 4746 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"var-log\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-var-log\") pod \"f85e55b1a89d02b0cb034b1ea31ed45a\" (UID: \"f85e55b1a89d02b0cb034b1ea31ed45a\") "
Jan 28 20:44:16 crc kubenswrapper[4746]: I0128 20:44:16.106266 4746 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pod-resource-dir\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-pod-resource-dir\") pod \"f85e55b1a89d02b0cb034b1ea31ed45a\" (UID: \"f85e55b1a89d02b0cb034b1ea31ed45a\") "
Jan 28 20:44:16 crc kubenswrapper[4746]: I0128 20:44:16.106340 4746 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-resource-dir\") pod \"f85e55b1a89d02b0cb034b1ea31ed45a\" (UID: \"f85e55b1a89d02b0cb034b1ea31ed45a\") "
Jan 28 20:44:16 crc kubenswrapper[4746]: I0128 20:44:16.106248 4746 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-manifests" (OuterVolumeSpecName: "manifests") pod "f85e55b1a89d02b0cb034b1ea31ed45a" (UID: "f85e55b1a89d02b0cb034b1ea31ed45a"). InnerVolumeSpecName "manifests". PluginName "kubernetes.io/host-path", VolumeGidValue ""
Jan 28 20:44:16 crc kubenswrapper[4746]: I0128 20:44:16.106376 4746 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"var-lock\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-var-lock\") pod \"f85e55b1a89d02b0cb034b1ea31ed45a\" (UID: \"f85e55b1a89d02b0cb034b1ea31ed45a\") "
Jan 28 20:44:16 crc kubenswrapper[4746]: I0128 20:44:16.106338 4746 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-var-log" (OuterVolumeSpecName: "var-log") pod "f85e55b1a89d02b0cb034b1ea31ed45a" (UID: "f85e55b1a89d02b0cb034b1ea31ed45a"). InnerVolumeSpecName "var-log". PluginName "kubernetes.io/host-path", VolumeGidValue ""
Jan 28 20:44:16 crc kubenswrapper[4746]: I0128 20:44:16.106486 4746 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-resource-dir" (OuterVolumeSpecName: "resource-dir") pod "f85e55b1a89d02b0cb034b1ea31ed45a" (UID: "f85e55b1a89d02b0cb034b1ea31ed45a"). InnerVolumeSpecName "resource-dir". PluginName "kubernetes.io/host-path", VolumeGidValue ""
Jan 28 20:44:16 crc kubenswrapper[4746]: I0128 20:44:16.106525 4746 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-var-lock" (OuterVolumeSpecName: "var-lock") pod "f85e55b1a89d02b0cb034b1ea31ed45a" (UID: "f85e55b1a89d02b0cb034b1ea31ed45a"). InnerVolumeSpecName "var-lock". PluginName "kubernetes.io/host-path", VolumeGidValue ""
Jan 28 20:44:16 crc kubenswrapper[4746]: I0128 20:44:16.106823 4746 reconciler_common.go:293] "Volume detached for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-resource-dir\") on node \"crc\" DevicePath \"\""
Jan 28 20:44:16 crc kubenswrapper[4746]: I0128 20:44:16.106843 4746 reconciler_common.go:293] "Volume detached for volume \"var-lock\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-var-lock\") on node \"crc\" DevicePath \"\""
Jan 28 20:44:16 crc kubenswrapper[4746]: I0128 20:44:16.106859 4746 reconciler_common.go:293] "Volume detached for volume \"manifests\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-manifests\") on node \"crc\" DevicePath \"\""
Jan 28 20:44:16 crc kubenswrapper[4746]: I0128 20:44:16.106872 4746 reconciler_common.go:293] "Volume detached for volume \"var-log\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-var-log\") on node \"crc\" DevicePath \"\""
Jan 28 20:44:16 crc kubenswrapper[4746]: I0128 20:44:16.116259 4746 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-pod-resource-dir" (OuterVolumeSpecName: "pod-resource-dir") pod "f85e55b1a89d02b0cb034b1ea31ed45a" (UID: "f85e55b1a89d02b0cb034b1ea31ed45a"). InnerVolumeSpecName "pod-resource-dir". PluginName "kubernetes.io/host-path", VolumeGidValue ""
Jan 28 20:44:16 crc kubenswrapper[4746]: I0128 20:44:16.209924 4746 reconciler_common.go:293] "Volume detached for volume \"pod-resource-dir\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-pod-resource-dir\") on node \"crc\" DevicePath \"\""
Jan 28 20:44:16 crc kubenswrapper[4746]: I0128 20:44:16.492095 4746 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-apiserver_kube-apiserver-startup-monitor-crc_f85e55b1a89d02b0cb034b1ea31ed45a/startup-monitor/0.log"
Jan 28 20:44:16 crc kubenswrapper[4746]: I0128 20:44:16.492175 4746 generic.go:334] "Generic (PLEG): container finished" podID="f85e55b1a89d02b0cb034b1ea31ed45a" containerID="516e4abe96c1a2de5a07ad273f06d4d5abe6209ec1947f4945d0353fba3b4511" exitCode=137
Jan 28 20:44:16 crc kubenswrapper[4746]: I0128 20:44:16.492250 4746 scope.go:117] "RemoveContainer" containerID="516e4abe96c1a2de5a07ad273f06d4d5abe6209ec1947f4945d0353fba3b4511"
Jan 28 20:44:16 crc kubenswrapper[4746]: I0128 20:44:16.492260 4746 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc"
Jan 28 20:44:16 crc kubenswrapper[4746]: I0128 20:44:16.513817 4746 scope.go:117] "RemoveContainer" containerID="516e4abe96c1a2de5a07ad273f06d4d5abe6209ec1947f4945d0353fba3b4511"
Jan 28 20:44:16 crc kubenswrapper[4746]: E0128 20:44:16.514521 4746 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"516e4abe96c1a2de5a07ad273f06d4d5abe6209ec1947f4945d0353fba3b4511\": container with ID starting with 516e4abe96c1a2de5a07ad273f06d4d5abe6209ec1947f4945d0353fba3b4511 not found: ID does not exist" containerID="516e4abe96c1a2de5a07ad273f06d4d5abe6209ec1947f4945d0353fba3b4511"
Jan 28 20:44:16 crc kubenswrapper[4746]: I0128 20:44:16.514575 4746 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"516e4abe96c1a2de5a07ad273f06d4d5abe6209ec1947f4945d0353fba3b4511"} err="failed to get container status \"516e4abe96c1a2de5a07ad273f06d4d5abe6209ec1947f4945d0353fba3b4511\": rpc error: code = NotFound desc = could not find container \"516e4abe96c1a2de5a07ad273f06d4d5abe6209ec1947f4945d0353fba3b4511\": container with ID starting with 516e4abe96c1a2de5a07ad273f06d4d5abe6209ec1947f4945d0353fba3b4511 not found: ID does not exist"
Jan 28 20:44:16 crc kubenswrapper[4746]: I0128 20:44:16.846133 4746 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="f85e55b1a89d02b0cb034b1ea31ed45a" path="/var/lib/kubelet/pods/f85e55b1a89d02b0cb034b1ea31ed45a/volumes"
Jan 28 20:44:16 crc kubenswrapper[4746]: I0128 20:44:16.846453 4746 mirror_client.go:130] "Deleting a mirror pod" pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" podUID=""
Jan 28 20:44:16 crc kubenswrapper[4746]: I0128 20:44:16.857665 4746 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-kube-apiserver/kube-apiserver-startup-monitor-crc"]
Jan 28 20:44:16 crc kubenswrapper[4746]: I0128 20:44:16.857731 4746 kubelet.go:2649] "Unable to find pod for mirror pod, skipping" mirrorPod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" mirrorPodUID="a4a263a4-9ee1-4472-a320-0916270530c7"
Jan 28 20:44:16 crc kubenswrapper[4746]: I0128 20:44:16.861445 4746 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-kube-apiserver/kube-apiserver-startup-monitor-crc"]
Jan 28 20:44:16 crc kubenswrapper[4746]: I0128 20:44:16.861472 4746 kubelet.go:2673] "Unable to find pod for mirror pod, skipping" mirrorPod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" mirrorPodUID="a4a263a4-9ee1-4472-a320-0916270530c7"
Jan 28 20:44:17 crc kubenswrapper[4746]: I0128 20:44:17.253032 4746 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console"/"trusted-ca-bundle"
Jan 28 20:44:18 crc kubenswrapper[4746]: I0128 20:44:18.754555 4746 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-controller-manager/controller-manager-78b9587cd6-5jh8p"]
Jan 28 20:44:18 crc kubenswrapper[4746]: I0128 20:44:18.754932 4746 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-controller-manager/controller-manager-78b9587cd6-5jh8p" podUID="ffced8b3-c5f6-4e6b-9848-0fa63a34b1fa" containerName="controller-manager" containerID="cri-o://6aa206c1994a75b99540a5d2dac4916054ffcf9e2fd802e89171eae7e01f29bd" gracePeriod=30
Jan 28 20:44:18 crc kubenswrapper[4746]: I0128 20:44:18.844932 4746 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-5d7856c4bb-d95sg"]
Jan 28 20:44:18 crc kubenswrapper[4746]: I0128 20:44:18.845181 4746 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-route-controller-manager/route-controller-manager-5d7856c4bb-d95sg" podUID="1837e217-cd33-4aa4-8c46-6f344a75aded" containerName="route-controller-manager" containerID="cri-o://4249062ea83a5500279c1caa99cf02c9cbebe962dbca5691afbe4068d55f7bdc" gracePeriod=30
Jan 28 20:44:19 crc kubenswrapper[4746]: I0128 20:44:19.207529 4746 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-controller-manager/controller-manager-78b9587cd6-5jh8p"
Jan 28 20:44:19 crc kubenswrapper[4746]: I0128 20:44:19.276038 4746 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-5d7856c4bb-d95sg"
Jan 28 20:44:19 crc kubenswrapper[4746]: I0128 20:44:19.386663 4746 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-dlwc2\" (UniqueName: \"kubernetes.io/projected/1837e217-cd33-4aa4-8c46-6f344a75aded-kube-api-access-dlwc2\") pod \"1837e217-cd33-4aa4-8c46-6f344a75aded\" (UID: \"1837e217-cd33-4aa4-8c46-6f344a75aded\") "
Jan 28 20:44:19 crc kubenswrapper[4746]: I0128 20:44:19.386738 4746 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/ffced8b3-c5f6-4e6b-9848-0fa63a34b1fa-config\") pod \"ffced8b3-c5f6-4e6b-9848-0fa63a34b1fa\" (UID: \"ffced8b3-c5f6-4e6b-9848-0fa63a34b1fa\") "
Jan 28 20:44:19 crc kubenswrapper[4746]: I0128 20:44:19.386845 4746 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/1837e217-cd33-4aa4-8c46-6f344a75aded-serving-cert\") pod \"1837e217-cd33-4aa4-8c46-6f344a75aded\" (UID: \"1837e217-cd33-4aa4-8c46-6f344a75aded\") "
Jan 28 20:44:19 crc kubenswrapper[4746]: I0128 20:44:19.386870 4746 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/1837e217-cd33-4aa4-8c46-6f344a75aded-client-ca\") pod \"1837e217-cd33-4aa4-8c46-6f344a75aded\" (UID: \"1837e217-cd33-4aa4-8c46-6f344a75aded\") "
Jan 28 20:44:19 crc kubenswrapper[4746]: I0128 20:44:19.386902 4746 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/ffced8b3-c5f6-4e6b-9848-0fa63a34b1fa-proxy-ca-bundles\") pod \"ffced8b3-c5f6-4e6b-9848-0fa63a34b1fa\" (UID: \"ffced8b3-c5f6-4e6b-9848-0fa63a34b1fa\") "
Jan 28 20:44:19 crc kubenswrapper[4746]: I0128 20:44:19.386923 4746 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/ffced8b3-c5f6-4e6b-9848-0fa63a34b1fa-client-ca\") pod \"ffced8b3-c5f6-4e6b-9848-0fa63a34b1fa\" (UID: \"ffced8b3-c5f6-4e6b-9848-0fa63a34b1fa\") "
Jan 28 20:44:19 crc kubenswrapper[4746]: I0128 20:44:19.386961 4746 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/ffced8b3-c5f6-4e6b-9848-0fa63a34b1fa-serving-cert\") pod \"ffced8b3-c5f6-4e6b-9848-0fa63a34b1fa\" (UID: \"ffced8b3-c5f6-4e6b-9848-0fa63a34b1fa\") "
Jan 28 20:44:19 crc kubenswrapper[4746]: I0128 20:44:19.386990 4746 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-pmffh\" (UniqueName: \"kubernetes.io/projected/ffced8b3-c5f6-4e6b-9848-0fa63a34b1fa-kube-api-access-pmffh\") pod \"ffced8b3-c5f6-4e6b-9848-0fa63a34b1fa\" (UID: \"ffced8b3-c5f6-4e6b-9848-0fa63a34b1fa\") "
Jan 28 20:44:19 crc kubenswrapper[4746]: I0128 20:44:19.387011 4746 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/1837e217-cd33-4aa4-8c46-6f344a75aded-config\") pod \"1837e217-cd33-4aa4-8c46-6f344a75aded\" (UID: \"1837e217-cd33-4aa4-8c46-6f344a75aded\") "
Jan 28 20:44:19 crc kubenswrapper[4746]: I0128 20:44:19.387925 4746 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/ffced8b3-c5f6-4e6b-9848-0fa63a34b1fa-config" (OuterVolumeSpecName: "config") pod "ffced8b3-c5f6-4e6b-9848-0fa63a34b1fa" (UID: "ffced8b3-c5f6-4e6b-9848-0fa63a34b1fa"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Jan 28 20:44:19 crc kubenswrapper[4746]: I0128 20:44:19.387959 4746 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/ffced8b3-c5f6-4e6b-9848-0fa63a34b1fa-client-ca" (OuterVolumeSpecName: "client-ca") pod "ffced8b3-c5f6-4e6b-9848-0fa63a34b1fa" (UID: "ffced8b3-c5f6-4e6b-9848-0fa63a34b1fa"). InnerVolumeSpecName "client-ca". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Jan 28 20:44:19 crc kubenswrapper[4746]: I0128 20:44:19.388424 4746 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/1837e217-cd33-4aa4-8c46-6f344a75aded-config" (OuterVolumeSpecName: "config") pod "1837e217-cd33-4aa4-8c46-6f344a75aded" (UID: "1837e217-cd33-4aa4-8c46-6f344a75aded"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Jan 28 20:44:19 crc kubenswrapper[4746]: I0128 20:44:19.388492 4746 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/ffced8b3-c5f6-4e6b-9848-0fa63a34b1fa-proxy-ca-bundles" (OuterVolumeSpecName: "proxy-ca-bundles") pod "ffced8b3-c5f6-4e6b-9848-0fa63a34b1fa" (UID: "ffced8b3-c5f6-4e6b-9848-0fa63a34b1fa"). InnerVolumeSpecName "proxy-ca-bundles". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Jan 28 20:44:19 crc kubenswrapper[4746]: I0128 20:44:19.388574 4746 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/1837e217-cd33-4aa4-8c46-6f344a75aded-client-ca" (OuterVolumeSpecName: "client-ca") pod "1837e217-cd33-4aa4-8c46-6f344a75aded" (UID: "1837e217-cd33-4aa4-8c46-6f344a75aded"). InnerVolumeSpecName "client-ca".
PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 28 20:44:19 crc kubenswrapper[4746]: I0128 20:44:19.393889 4746 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/ffced8b3-c5f6-4e6b-9848-0fa63a34b1fa-kube-api-access-pmffh" (OuterVolumeSpecName: "kube-api-access-pmffh") pod "ffced8b3-c5f6-4e6b-9848-0fa63a34b1fa" (UID: "ffced8b3-c5f6-4e6b-9848-0fa63a34b1fa"). InnerVolumeSpecName "kube-api-access-pmffh". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 28 20:44:19 crc kubenswrapper[4746]: I0128 20:44:19.394018 4746 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/ffced8b3-c5f6-4e6b-9848-0fa63a34b1fa-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "ffced8b3-c5f6-4e6b-9848-0fa63a34b1fa" (UID: "ffced8b3-c5f6-4e6b-9848-0fa63a34b1fa"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 28 20:44:19 crc kubenswrapper[4746]: I0128 20:44:19.394781 4746 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/1837e217-cd33-4aa4-8c46-6f344a75aded-kube-api-access-dlwc2" (OuterVolumeSpecName: "kube-api-access-dlwc2") pod "1837e217-cd33-4aa4-8c46-6f344a75aded" (UID: "1837e217-cd33-4aa4-8c46-6f344a75aded"). InnerVolumeSpecName "kube-api-access-dlwc2". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 28 20:44:19 crc kubenswrapper[4746]: I0128 20:44:19.396372 4746 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/1837e217-cd33-4aa4-8c46-6f344a75aded-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "1837e217-cd33-4aa4-8c46-6f344a75aded" (UID: "1837e217-cd33-4aa4-8c46-6f344a75aded"). InnerVolumeSpecName "serving-cert". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 28 20:44:19 crc kubenswrapper[4746]: I0128 20:44:19.489345 4746 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-pmffh\" (UniqueName: \"kubernetes.io/projected/ffced8b3-c5f6-4e6b-9848-0fa63a34b1fa-kube-api-access-pmffh\") on node \"crc\" DevicePath \"\"" Jan 28 20:44:19 crc kubenswrapper[4746]: I0128 20:44:19.489895 4746 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/1837e217-cd33-4aa4-8c46-6f344a75aded-config\") on node \"crc\" DevicePath \"\"" Jan 28 20:44:19 crc kubenswrapper[4746]: I0128 20:44:19.490145 4746 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-dlwc2\" (UniqueName: \"kubernetes.io/projected/1837e217-cd33-4aa4-8c46-6f344a75aded-kube-api-access-dlwc2\") on node \"crc\" DevicePath \"\"" Jan 28 20:44:19 crc kubenswrapper[4746]: I0128 20:44:19.490296 4746 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/ffced8b3-c5f6-4e6b-9848-0fa63a34b1fa-config\") on node \"crc\" DevicePath \"\"" Jan 28 20:44:19 crc kubenswrapper[4746]: I0128 20:44:19.490439 4746 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/1837e217-cd33-4aa4-8c46-6f344a75aded-serving-cert\") on node \"crc\" DevicePath \"\"" Jan 28 20:44:19 crc kubenswrapper[4746]: I0128 20:44:19.490565 4746 reconciler_common.go:293] "Volume detached for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/1837e217-cd33-4aa4-8c46-6f344a75aded-client-ca\") on node \"crc\" DevicePath \"\"" Jan 28 20:44:19 crc kubenswrapper[4746]: I0128 20:44:19.490680 4746 reconciler_common.go:293] "Volume detached for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/ffced8b3-c5f6-4e6b-9848-0fa63a34b1fa-proxy-ca-bundles\") on node \"crc\" DevicePath \"\"" Jan 28 20:44:19 crc kubenswrapper[4746]: I0128 20:44:19.490805 4746 
reconciler_common.go:293] "Volume detached for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/ffced8b3-c5f6-4e6b-9848-0fa63a34b1fa-client-ca\") on node \"crc\" DevicePath \"\"" Jan 28 20:44:19 crc kubenswrapper[4746]: I0128 20:44:19.490930 4746 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/ffced8b3-c5f6-4e6b-9848-0fa63a34b1fa-serving-cert\") on node \"crc\" DevicePath \"\"" Jan 28 20:44:19 crc kubenswrapper[4746]: I0128 20:44:19.514634 4746 generic.go:334] "Generic (PLEG): container finished" podID="1837e217-cd33-4aa4-8c46-6f344a75aded" containerID="4249062ea83a5500279c1caa99cf02c9cbebe962dbca5691afbe4068d55f7bdc" exitCode=0 Jan 28 20:44:19 crc kubenswrapper[4746]: I0128 20:44:19.514814 4746 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-route-controller-manager/route-controller-manager-5d7856c4bb-d95sg" event={"ID":"1837e217-cd33-4aa4-8c46-6f344a75aded","Type":"ContainerDied","Data":"4249062ea83a5500279c1caa99cf02c9cbebe962dbca5691afbe4068d55f7bdc"} Jan 28 20:44:19 crc kubenswrapper[4746]: I0128 20:44:19.514918 4746 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-route-controller-manager/route-controller-manager-5d7856c4bb-d95sg" event={"ID":"1837e217-cd33-4aa4-8c46-6f344a75aded","Type":"ContainerDied","Data":"8f8d0679a91a56a0f0899050a8254fd14b35112e9b2f2bd0abfe00d092436ed9"} Jan 28 20:44:19 crc kubenswrapper[4746]: I0128 20:44:19.514969 4746 scope.go:117] "RemoveContainer" containerID="4249062ea83a5500279c1caa99cf02c9cbebe962dbca5691afbe4068d55f7bdc" Jan 28 20:44:19 crc kubenswrapper[4746]: I0128 20:44:19.515377 4746 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-5d7856c4bb-d95sg" Jan 28 20:44:19 crc kubenswrapper[4746]: I0128 20:44:19.517206 4746 generic.go:334] "Generic (PLEG): container finished" podID="ffced8b3-c5f6-4e6b-9848-0fa63a34b1fa" containerID="6aa206c1994a75b99540a5d2dac4916054ffcf9e2fd802e89171eae7e01f29bd" exitCode=0 Jan 28 20:44:19 crc kubenswrapper[4746]: I0128 20:44:19.517283 4746 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager/controller-manager-78b9587cd6-5jh8p" event={"ID":"ffced8b3-c5f6-4e6b-9848-0fa63a34b1fa","Type":"ContainerDied","Data":"6aa206c1994a75b99540a5d2dac4916054ffcf9e2fd802e89171eae7e01f29bd"} Jan 28 20:44:19 crc kubenswrapper[4746]: I0128 20:44:19.517324 4746 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager/controller-manager-78b9587cd6-5jh8p" event={"ID":"ffced8b3-c5f6-4e6b-9848-0fa63a34b1fa","Type":"ContainerDied","Data":"bf98b71b72e961dd2467de92f0a93f8e5a1c7bb98dc7845bf75e2994985af40e"} Jan 28 20:44:19 crc kubenswrapper[4746]: I0128 20:44:19.517979 4746 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-controller-manager/controller-manager-78b9587cd6-5jh8p" Jan 28 20:44:19 crc kubenswrapper[4746]: I0128 20:44:19.547536 4746 scope.go:117] "RemoveContainer" containerID="4249062ea83a5500279c1caa99cf02c9cbebe962dbca5691afbe4068d55f7bdc" Jan 28 20:44:19 crc kubenswrapper[4746]: E0128 20:44:19.548756 4746 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"4249062ea83a5500279c1caa99cf02c9cbebe962dbca5691afbe4068d55f7bdc\": container with ID starting with 4249062ea83a5500279c1caa99cf02c9cbebe962dbca5691afbe4068d55f7bdc not found: ID does not exist" containerID="4249062ea83a5500279c1caa99cf02c9cbebe962dbca5691afbe4068d55f7bdc" Jan 28 20:44:19 crc kubenswrapper[4746]: I0128 20:44:19.548836 4746 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"4249062ea83a5500279c1caa99cf02c9cbebe962dbca5691afbe4068d55f7bdc"} err="failed to get container status \"4249062ea83a5500279c1caa99cf02c9cbebe962dbca5691afbe4068d55f7bdc\": rpc error: code = NotFound desc = could not find container \"4249062ea83a5500279c1caa99cf02c9cbebe962dbca5691afbe4068d55f7bdc\": container with ID starting with 4249062ea83a5500279c1caa99cf02c9cbebe962dbca5691afbe4068d55f7bdc not found: ID does not exist" Jan 28 20:44:19 crc kubenswrapper[4746]: I0128 20:44:19.548875 4746 scope.go:117] "RemoveContainer" containerID="6aa206c1994a75b99540a5d2dac4916054ffcf9e2fd802e89171eae7e01f29bd" Jan 28 20:44:19 crc kubenswrapper[4746]: I0128 20:44:19.566628 4746 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-5d7856c4bb-d95sg"] Jan 28 20:44:19 crc kubenswrapper[4746]: I0128 20:44:19.569628 4746 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-5d7856c4bb-d95sg"] Jan 28 20:44:19 crc kubenswrapper[4746]: I0128 20:44:19.579445 4746 scope.go:117] 
"RemoveContainer" containerID="6aa206c1994a75b99540a5d2dac4916054ffcf9e2fd802e89171eae7e01f29bd" Jan 28 20:44:19 crc kubenswrapper[4746]: E0128 20:44:19.580291 4746 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"6aa206c1994a75b99540a5d2dac4916054ffcf9e2fd802e89171eae7e01f29bd\": container with ID starting with 6aa206c1994a75b99540a5d2dac4916054ffcf9e2fd802e89171eae7e01f29bd not found: ID does not exist" containerID="6aa206c1994a75b99540a5d2dac4916054ffcf9e2fd802e89171eae7e01f29bd" Jan 28 20:44:19 crc kubenswrapper[4746]: I0128 20:44:19.580374 4746 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"6aa206c1994a75b99540a5d2dac4916054ffcf9e2fd802e89171eae7e01f29bd"} err="failed to get container status \"6aa206c1994a75b99540a5d2dac4916054ffcf9e2fd802e89171eae7e01f29bd\": rpc error: code = NotFound desc = could not find container \"6aa206c1994a75b99540a5d2dac4916054ffcf9e2fd802e89171eae7e01f29bd\": container with ID starting with 6aa206c1994a75b99540a5d2dac4916054ffcf9e2fd802e89171eae7e01f29bd not found: ID does not exist" Jan 28 20:44:19 crc kubenswrapper[4746]: I0128 20:44:19.582857 4746 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-controller-manager/controller-manager-78b9587cd6-5jh8p"] Jan 28 20:44:19 crc kubenswrapper[4746]: I0128 20:44:19.587635 4746 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-controller-manager/controller-manager-78b9587cd6-5jh8p"] Jan 28 20:44:20 crc kubenswrapper[4746]: I0128 20:44:20.037245 4746 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-controller-manager/controller-manager-699b895958-sjrh2"] Jan 28 20:44:20 crc kubenswrapper[4746]: E0128 20:44:20.038639 4746 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f85e55b1a89d02b0cb034b1ea31ed45a" containerName="startup-monitor" Jan 28 20:44:20 crc kubenswrapper[4746]: I0128 20:44:20.038857 4746 
state_mem.go:107] "Deleted CPUSet assignment" podUID="f85e55b1a89d02b0cb034b1ea31ed45a" containerName="startup-monitor" Jan 28 20:44:20 crc kubenswrapper[4746]: E0128 20:44:20.038979 4746 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="b727dece-e4ea-4d38-899f-3c5c4c941a6c" containerName="installer" Jan 28 20:44:20 crc kubenswrapper[4746]: I0128 20:44:20.039273 4746 state_mem.go:107] "Deleted CPUSet assignment" podUID="b727dece-e4ea-4d38-899f-3c5c4c941a6c" containerName="installer" Jan 28 20:44:20 crc kubenswrapper[4746]: E0128 20:44:20.039423 4746 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="1837e217-cd33-4aa4-8c46-6f344a75aded" containerName="route-controller-manager" Jan 28 20:44:20 crc kubenswrapper[4746]: I0128 20:44:20.039538 4746 state_mem.go:107] "Deleted CPUSet assignment" podUID="1837e217-cd33-4aa4-8c46-6f344a75aded" containerName="route-controller-manager" Jan 28 20:44:20 crc kubenswrapper[4746]: E0128 20:44:20.039671 4746 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="ffced8b3-c5f6-4e6b-9848-0fa63a34b1fa" containerName="controller-manager" Jan 28 20:44:20 crc kubenswrapper[4746]: I0128 20:44:20.039798 4746 state_mem.go:107] "Deleted CPUSet assignment" podUID="ffced8b3-c5f6-4e6b-9848-0fa63a34b1fa" containerName="controller-manager" Jan 28 20:44:20 crc kubenswrapper[4746]: I0128 20:44:20.040150 4746 memory_manager.go:354] "RemoveStaleState removing state" podUID="1837e217-cd33-4aa4-8c46-6f344a75aded" containerName="route-controller-manager" Jan 28 20:44:20 crc kubenswrapper[4746]: I0128 20:44:20.040314 4746 memory_manager.go:354] "RemoveStaleState removing state" podUID="f85e55b1a89d02b0cb034b1ea31ed45a" containerName="startup-monitor" Jan 28 20:44:20 crc kubenswrapper[4746]: I0128 20:44:20.040438 4746 memory_manager.go:354] "RemoveStaleState removing state" podUID="b727dece-e4ea-4d38-899f-3c5c4c941a6c" containerName="installer" Jan 28 20:44:20 crc kubenswrapper[4746]: I0128 20:44:20.040570 4746 
memory_manager.go:354] "RemoveStaleState removing state" podUID="ffced8b3-c5f6-4e6b-9848-0fa63a34b1fa" containerName="controller-manager" Jan 28 20:44:20 crc kubenswrapper[4746]: I0128 20:44:20.041320 4746 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-route-controller-manager/route-controller-manager-55969676b6-nm42c"] Jan 28 20:44:20 crc kubenswrapper[4746]: I0128 20:44:20.042267 4746 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-controller-manager/controller-manager-699b895958-sjrh2" Jan 28 20:44:20 crc kubenswrapper[4746]: I0128 20:44:20.043536 4746 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-55969676b6-nm42c" Jan 28 20:44:20 crc kubenswrapper[4746]: I0128 20:44:20.046812 4746 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-controller-manager"/"openshift-controller-manager-sa-dockercfg-msq4c" Jan 28 20:44:20 crc kubenswrapper[4746]: I0128 20:44:20.047323 4746 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"kube-root-ca.crt" Jan 28 20:44:20 crc kubenswrapper[4746]: I0128 20:44:20.047768 4746 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"client-ca" Jan 28 20:44:20 crc kubenswrapper[4746]: I0128 20:44:20.048195 4746 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"openshift-service-ca.crt" Jan 28 20:44:20 crc kubenswrapper[4746]: I0128 20:44:20.048572 4746 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"config" Jan 28 20:44:20 crc kubenswrapper[4746]: I0128 20:44:20.049959 4746 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"client-ca" Jan 28 20:44:20 crc kubenswrapper[4746]: I0128 20:44:20.050536 4746 reflector.go:368] Caches populated for 
*v1.ConfigMap from object-"openshift-route-controller-manager"/"openshift-service-ca.crt" Jan 28 20:44:20 crc kubenswrapper[4746]: I0128 20:44:20.050797 4746 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"config" Jan 28 20:44:20 crc kubenswrapper[4746]: I0128 20:44:20.051025 4746 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-route-controller-manager"/"route-controller-manager-sa-dockercfg-h2zr2" Jan 28 20:44:20 crc kubenswrapper[4746]: I0128 20:44:20.051066 4746 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-route-controller-manager"/"serving-cert" Jan 28 20:44:20 crc kubenswrapper[4746]: I0128 20:44:20.051322 4746 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"kube-root-ca.crt" Jan 28 20:44:20 crc kubenswrapper[4746]: I0128 20:44:20.050529 4746 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-controller-manager/controller-manager-699b895958-sjrh2"] Jan 28 20:44:20 crc kubenswrapper[4746]: I0128 20:44:20.052787 4746 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-controller-manager"/"serving-cert" Jan 28 20:44:20 crc kubenswrapper[4746]: I0128 20:44:20.060795 4746 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"openshift-global-ca" Jan 28 20:44:20 crc kubenswrapper[4746]: I0128 20:44:20.106672 4746 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-55969676b6-nm42c"] Jan 28 20:44:20 crc kubenswrapper[4746]: I0128 20:44:20.200064 4746 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/9629e562-2adf-453b-8dfb-37e9195b996e-config\") pod \"route-controller-manager-55969676b6-nm42c\" (UID: \"9629e562-2adf-453b-8dfb-37e9195b996e\") " 
pod="openshift-route-controller-manager/route-controller-manager-55969676b6-nm42c" Jan 28 20:44:20 crc kubenswrapper[4746]: I0128 20:44:20.200133 4746 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/ea55a0dc-b65d-4a6a-b497-5c23f4ef6d1d-client-ca\") pod \"controller-manager-699b895958-sjrh2\" (UID: \"ea55a0dc-b65d-4a6a-b497-5c23f4ef6d1d\") " pod="openshift-controller-manager/controller-manager-699b895958-sjrh2" Jan 28 20:44:20 crc kubenswrapper[4746]: I0128 20:44:20.200203 4746 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/ea55a0dc-b65d-4a6a-b497-5c23f4ef6d1d-proxy-ca-bundles\") pod \"controller-manager-699b895958-sjrh2\" (UID: \"ea55a0dc-b65d-4a6a-b497-5c23f4ef6d1d\") " pod="openshift-controller-manager/controller-manager-699b895958-sjrh2" Jan 28 20:44:20 crc kubenswrapper[4746]: I0128 20:44:20.200546 4746 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-8nfwq\" (UniqueName: \"kubernetes.io/projected/ea55a0dc-b65d-4a6a-b497-5c23f4ef6d1d-kube-api-access-8nfwq\") pod \"controller-manager-699b895958-sjrh2\" (UID: \"ea55a0dc-b65d-4a6a-b497-5c23f4ef6d1d\") " pod="openshift-controller-manager/controller-manager-699b895958-sjrh2" Jan 28 20:44:20 crc kubenswrapper[4746]: I0128 20:44:20.200622 4746 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/9629e562-2adf-453b-8dfb-37e9195b996e-serving-cert\") pod \"route-controller-manager-55969676b6-nm42c\" (UID: \"9629e562-2adf-453b-8dfb-37e9195b996e\") " pod="openshift-route-controller-manager/route-controller-manager-55969676b6-nm42c" Jan 28 20:44:20 crc kubenswrapper[4746]: I0128 20:44:20.200743 4746 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/9629e562-2adf-453b-8dfb-37e9195b996e-client-ca\") pod \"route-controller-manager-55969676b6-nm42c\" (UID: \"9629e562-2adf-453b-8dfb-37e9195b996e\") " pod="openshift-route-controller-manager/route-controller-manager-55969676b6-nm42c" Jan 28 20:44:20 crc kubenswrapper[4746]: I0128 20:44:20.200813 4746 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/ea55a0dc-b65d-4a6a-b497-5c23f4ef6d1d-config\") pod \"controller-manager-699b895958-sjrh2\" (UID: \"ea55a0dc-b65d-4a6a-b497-5c23f4ef6d1d\") " pod="openshift-controller-manager/controller-manager-699b895958-sjrh2" Jan 28 20:44:20 crc kubenswrapper[4746]: I0128 20:44:20.201512 4746 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-hwdfd\" (UniqueName: \"kubernetes.io/projected/9629e562-2adf-453b-8dfb-37e9195b996e-kube-api-access-hwdfd\") pod \"route-controller-manager-55969676b6-nm42c\" (UID: \"9629e562-2adf-453b-8dfb-37e9195b996e\") " pod="openshift-route-controller-manager/route-controller-manager-55969676b6-nm42c" Jan 28 20:44:20 crc kubenswrapper[4746]: I0128 20:44:20.201559 4746 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/ea55a0dc-b65d-4a6a-b497-5c23f4ef6d1d-serving-cert\") pod \"controller-manager-699b895958-sjrh2\" (UID: \"ea55a0dc-b65d-4a6a-b497-5c23f4ef6d1d\") " pod="openshift-controller-manager/controller-manager-699b895958-sjrh2" Jan 28 20:44:20 crc kubenswrapper[4746]: I0128 20:44:20.303462 4746 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-8nfwq\" (UniqueName: \"kubernetes.io/projected/ea55a0dc-b65d-4a6a-b497-5c23f4ef6d1d-kube-api-access-8nfwq\") pod 
\"controller-manager-699b895958-sjrh2\" (UID: \"ea55a0dc-b65d-4a6a-b497-5c23f4ef6d1d\") " pod="openshift-controller-manager/controller-manager-699b895958-sjrh2" Jan 28 20:44:20 crc kubenswrapper[4746]: I0128 20:44:20.303526 4746 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/9629e562-2adf-453b-8dfb-37e9195b996e-serving-cert\") pod \"route-controller-manager-55969676b6-nm42c\" (UID: \"9629e562-2adf-453b-8dfb-37e9195b996e\") " pod="openshift-route-controller-manager/route-controller-manager-55969676b6-nm42c" Jan 28 20:44:20 crc kubenswrapper[4746]: I0128 20:44:20.303574 4746 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/9629e562-2adf-453b-8dfb-37e9195b996e-client-ca\") pod \"route-controller-manager-55969676b6-nm42c\" (UID: \"9629e562-2adf-453b-8dfb-37e9195b996e\") " pod="openshift-route-controller-manager/route-controller-manager-55969676b6-nm42c" Jan 28 20:44:20 crc kubenswrapper[4746]: I0128 20:44:20.303616 4746 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/ea55a0dc-b65d-4a6a-b497-5c23f4ef6d1d-config\") pod \"controller-manager-699b895958-sjrh2\" (UID: \"ea55a0dc-b65d-4a6a-b497-5c23f4ef6d1d\") " pod="openshift-controller-manager/controller-manager-699b895958-sjrh2" Jan 28 20:44:20 crc kubenswrapper[4746]: I0128 20:44:20.303657 4746 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-hwdfd\" (UniqueName: \"kubernetes.io/projected/9629e562-2adf-453b-8dfb-37e9195b996e-kube-api-access-hwdfd\") pod \"route-controller-manager-55969676b6-nm42c\" (UID: \"9629e562-2adf-453b-8dfb-37e9195b996e\") " pod="openshift-route-controller-manager/route-controller-manager-55969676b6-nm42c" Jan 28 20:44:20 crc kubenswrapper[4746]: I0128 20:44:20.303680 4746 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/ea55a0dc-b65d-4a6a-b497-5c23f4ef6d1d-serving-cert\") pod \"controller-manager-699b895958-sjrh2\" (UID: \"ea55a0dc-b65d-4a6a-b497-5c23f4ef6d1d\") " pod="openshift-controller-manager/controller-manager-699b895958-sjrh2" Jan 28 20:44:20 crc kubenswrapper[4746]: I0128 20:44:20.303720 4746 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/9629e562-2adf-453b-8dfb-37e9195b996e-config\") pod \"route-controller-manager-55969676b6-nm42c\" (UID: \"9629e562-2adf-453b-8dfb-37e9195b996e\") " pod="openshift-route-controller-manager/route-controller-manager-55969676b6-nm42c" Jan 28 20:44:20 crc kubenswrapper[4746]: I0128 20:44:20.303745 4746 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/ea55a0dc-b65d-4a6a-b497-5c23f4ef6d1d-client-ca\") pod \"controller-manager-699b895958-sjrh2\" (UID: \"ea55a0dc-b65d-4a6a-b497-5c23f4ef6d1d\") " pod="openshift-controller-manager/controller-manager-699b895958-sjrh2" Jan 28 20:44:20 crc kubenswrapper[4746]: I0128 20:44:20.303795 4746 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/ea55a0dc-b65d-4a6a-b497-5c23f4ef6d1d-proxy-ca-bundles\") pod \"controller-manager-699b895958-sjrh2\" (UID: \"ea55a0dc-b65d-4a6a-b497-5c23f4ef6d1d\") " pod="openshift-controller-manager/controller-manager-699b895958-sjrh2" Jan 28 20:44:20 crc kubenswrapper[4746]: I0128 20:44:20.305426 4746 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/ea55a0dc-b65d-4a6a-b497-5c23f4ef6d1d-proxy-ca-bundles\") pod \"controller-manager-699b895958-sjrh2\" (UID: \"ea55a0dc-b65d-4a6a-b497-5c23f4ef6d1d\") " 
pod="openshift-controller-manager/controller-manager-699b895958-sjrh2" Jan 28 20:44:20 crc kubenswrapper[4746]: I0128 20:44:20.306922 4746 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/ea55a0dc-b65d-4a6a-b497-5c23f4ef6d1d-client-ca\") pod \"controller-manager-699b895958-sjrh2\" (UID: \"ea55a0dc-b65d-4a6a-b497-5c23f4ef6d1d\") " pod="openshift-controller-manager/controller-manager-699b895958-sjrh2" Jan 28 20:44:20 crc kubenswrapper[4746]: I0128 20:44:20.307072 4746 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/ea55a0dc-b65d-4a6a-b497-5c23f4ef6d1d-config\") pod \"controller-manager-699b895958-sjrh2\" (UID: \"ea55a0dc-b65d-4a6a-b497-5c23f4ef6d1d\") " pod="openshift-controller-manager/controller-manager-699b895958-sjrh2" Jan 28 20:44:20 crc kubenswrapper[4746]: I0128 20:44:20.307268 4746 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/9629e562-2adf-453b-8dfb-37e9195b996e-client-ca\") pod \"route-controller-manager-55969676b6-nm42c\" (UID: \"9629e562-2adf-453b-8dfb-37e9195b996e\") " pod="openshift-route-controller-manager/route-controller-manager-55969676b6-nm42c" Jan 28 20:44:20 crc kubenswrapper[4746]: I0128 20:44:20.309247 4746 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/9629e562-2adf-453b-8dfb-37e9195b996e-config\") pod \"route-controller-manager-55969676b6-nm42c\" (UID: \"9629e562-2adf-453b-8dfb-37e9195b996e\") " pod="openshift-route-controller-manager/route-controller-manager-55969676b6-nm42c" Jan 28 20:44:20 crc kubenswrapper[4746]: I0128 20:44:20.311237 4746 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/9629e562-2adf-453b-8dfb-37e9195b996e-serving-cert\") pod 
\"route-controller-manager-55969676b6-nm42c\" (UID: \"9629e562-2adf-453b-8dfb-37e9195b996e\") " pod="openshift-route-controller-manager/route-controller-manager-55969676b6-nm42c" Jan 28 20:44:20 crc kubenswrapper[4746]: I0128 20:44:20.311264 4746 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/ea55a0dc-b65d-4a6a-b497-5c23f4ef6d1d-serving-cert\") pod \"controller-manager-699b895958-sjrh2\" (UID: \"ea55a0dc-b65d-4a6a-b497-5c23f4ef6d1d\") " pod="openshift-controller-manager/controller-manager-699b895958-sjrh2" Jan 28 20:44:20 crc kubenswrapper[4746]: I0128 20:44:20.335880 4746 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-8nfwq\" (UniqueName: \"kubernetes.io/projected/ea55a0dc-b65d-4a6a-b497-5c23f4ef6d1d-kube-api-access-8nfwq\") pod \"controller-manager-699b895958-sjrh2\" (UID: \"ea55a0dc-b65d-4a6a-b497-5c23f4ef6d1d\") " pod="openshift-controller-manager/controller-manager-699b895958-sjrh2" Jan 28 20:44:20 crc kubenswrapper[4746]: I0128 20:44:20.336412 4746 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-hwdfd\" (UniqueName: \"kubernetes.io/projected/9629e562-2adf-453b-8dfb-37e9195b996e-kube-api-access-hwdfd\") pod \"route-controller-manager-55969676b6-nm42c\" (UID: \"9629e562-2adf-453b-8dfb-37e9195b996e\") " pod="openshift-route-controller-manager/route-controller-manager-55969676b6-nm42c" Jan 28 20:44:20 crc kubenswrapper[4746]: I0128 20:44:20.410434 4746 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-controller-manager/controller-manager-699b895958-sjrh2" Jan 28 20:44:20 crc kubenswrapper[4746]: I0128 20:44:20.416140 4746 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-55969676b6-nm42c" Jan 28 20:44:20 crc kubenswrapper[4746]: I0128 20:44:20.657200 4746 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-55969676b6-nm42c"] Jan 28 20:44:20 crc kubenswrapper[4746]: I0128 20:44:20.703755 4746 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-controller-manager/controller-manager-699b895958-sjrh2"] Jan 28 20:44:20 crc kubenswrapper[4746]: W0128 20:44:20.704738 4746 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podea55a0dc_b65d_4a6a_b497_5c23f4ef6d1d.slice/crio-58c6a8e51b10b8d194c10aa508840e171a87595752281f5addea1902a26d5702 WatchSource:0}: Error finding container 58c6a8e51b10b8d194c10aa508840e171a87595752281f5addea1902a26d5702: Status 404 returned error can't find the container with id 58c6a8e51b10b8d194c10aa508840e171a87595752281f5addea1902a26d5702 Jan 28 20:44:20 crc kubenswrapper[4746]: I0128 20:44:20.843976 4746 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="1837e217-cd33-4aa4-8c46-6f344a75aded" path="/var/lib/kubelet/pods/1837e217-cd33-4aa4-8c46-6f344a75aded/volumes" Jan 28 20:44:20 crc kubenswrapper[4746]: I0128 20:44:20.845294 4746 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="ffced8b3-c5f6-4e6b-9848-0fa63a34b1fa" path="/var/lib/kubelet/pods/ffced8b3-c5f6-4e6b-9848-0fa63a34b1fa/volumes" Jan 28 20:44:21 crc kubenswrapper[4746]: I0128 20:44:21.534965 4746 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager/controller-manager-699b895958-sjrh2" event={"ID":"ea55a0dc-b65d-4a6a-b497-5c23f4ef6d1d","Type":"ContainerStarted","Data":"32b460399a59ae6cccc37e51c952d58cfdc97c15d21a41d6064a2de0051eae83"} Jan 28 20:44:21 crc kubenswrapper[4746]: I0128 20:44:21.535476 4746 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openshift-controller-manager/controller-manager-699b895958-sjrh2" event={"ID":"ea55a0dc-b65d-4a6a-b497-5c23f4ef6d1d","Type":"ContainerStarted","Data":"58c6a8e51b10b8d194c10aa508840e171a87595752281f5addea1902a26d5702"} Jan 28 20:44:21 crc kubenswrapper[4746]: I0128 20:44:21.537757 4746 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-controller-manager/controller-manager-699b895958-sjrh2" Jan 28 20:44:21 crc kubenswrapper[4746]: I0128 20:44:21.539182 4746 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-route-controller-manager/route-controller-manager-55969676b6-nm42c" event={"ID":"9629e562-2adf-453b-8dfb-37e9195b996e","Type":"ContainerStarted","Data":"cc5d41ae4726f814b4dfa4ceff7391a250ab5f3fd0305acd5cef81fce402e610"} Jan 28 20:44:21 crc kubenswrapper[4746]: I0128 20:44:21.539208 4746 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-route-controller-manager/route-controller-manager-55969676b6-nm42c" event={"ID":"9629e562-2adf-453b-8dfb-37e9195b996e","Type":"ContainerStarted","Data":"da2ed9f0cc3c2fb3bb93f40e32a382f2f62edc00897027fcd47573a42bfb49c1"} Jan 28 20:44:21 crc kubenswrapper[4746]: I0128 20:44:21.539953 4746 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-route-controller-manager/route-controller-manager-55969676b6-nm42c" Jan 28 20:44:21 crc kubenswrapper[4746]: I0128 20:44:21.542029 4746 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-controller-manager/controller-manager-699b895958-sjrh2" Jan 28 20:44:21 crc kubenswrapper[4746]: I0128 20:44:21.546292 4746 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-route-controller-manager/route-controller-manager-55969676b6-nm42c" Jan 28 20:44:21 crc kubenswrapper[4746]: I0128 20:44:21.596070 4746 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-controller-manager/controller-manager-699b895958-sjrh2" 
podStartSLOduration=3.596032741 podStartE2EDuration="3.596032741s" podCreationTimestamp="2026-01-28 20:44:18 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-28 20:44:21.561895843 +0000 UTC m=+289.518082197" watchObservedRunningTime="2026-01-28 20:44:21.596032741 +0000 UTC m=+289.552219135" Jan 28 20:44:21 crc kubenswrapper[4746]: I0128 20:44:21.617061 4746 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-route-controller-manager/route-controller-manager-55969676b6-nm42c" podStartSLOduration=3.617041471 podStartE2EDuration="3.617041471s" podCreationTimestamp="2026-01-28 20:44:18 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-28 20:44:21.613520603 +0000 UTC m=+289.569707007" watchObservedRunningTime="2026-01-28 20:44:21.617041471 +0000 UTC m=+289.573227825" Jan 28 20:44:32 crc kubenswrapper[4746]: I0128 20:44:32.624799 4746 cert_rotation.go:91] certificate rotation detected, shutting down client connections to start using new credentials Jan 28 20:44:33 crc kubenswrapper[4746]: I0128 20:44:33.614683 4746 generic.go:334] "Generic (PLEG): container finished" podID="3edaca00-e1a6-4b56-9290-cad6311263ee" containerID="247a6dffe30c252f58a1ab345b064bf16a2db97efef470149722bc5c23ef722d" exitCode=0 Jan 28 20:44:33 crc kubenswrapper[4746]: I0128 20:44:33.615194 4746 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/marketplace-operator-79b997595-sqh2q" event={"ID":"3edaca00-e1a6-4b56-9290-cad6311263ee","Type":"ContainerDied","Data":"247a6dffe30c252f58a1ab345b064bf16a2db97efef470149722bc5c23ef722d"} Jan 28 20:44:33 crc kubenswrapper[4746]: I0128 20:44:33.615689 4746 scope.go:117] "RemoveContainer" containerID="247a6dffe30c252f58a1ab345b064bf16a2db97efef470149722bc5c23ef722d" Jan 28 20:44:34 crc kubenswrapper[4746]: I0128 
20:44:34.625114 4746 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/marketplace-operator-79b997595-sqh2q" event={"ID":"3edaca00-e1a6-4b56-9290-cad6311263ee","Type":"ContainerStarted","Data":"3c9b761d21788e5d9286b6f107962e6f405210483aad309dac0ce9df1dcdffbb"} Jan 28 20:44:34 crc kubenswrapper[4746]: I0128 20:44:34.625901 4746 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/marketplace-operator-79b997595-sqh2q" Jan 28 20:44:34 crc kubenswrapper[4746]: I0128 20:44:34.627992 4746 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/marketplace-operator-79b997595-sqh2q" Jan 28 20:44:38 crc kubenswrapper[4746]: I0128 20:44:38.757655 4746 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-controller-manager/controller-manager-699b895958-sjrh2"] Jan 28 20:44:38 crc kubenswrapper[4746]: I0128 20:44:38.758411 4746 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-controller-manager/controller-manager-699b895958-sjrh2" podUID="ea55a0dc-b65d-4a6a-b497-5c23f4ef6d1d" containerName="controller-manager" containerID="cri-o://32b460399a59ae6cccc37e51c952d58cfdc97c15d21a41d6064a2de0051eae83" gracePeriod=30 Jan 28 20:44:38 crc kubenswrapper[4746]: I0128 20:44:38.790993 4746 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-55969676b6-nm42c"] Jan 28 20:44:38 crc kubenswrapper[4746]: I0128 20:44:38.791547 4746 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-route-controller-manager/route-controller-manager-55969676b6-nm42c" podUID="9629e562-2adf-453b-8dfb-37e9195b996e" containerName="route-controller-manager" containerID="cri-o://cc5d41ae4726f814b4dfa4ceff7391a250ab5f3fd0305acd5cef81fce402e610" gracePeriod=30 Jan 28 20:44:39 crc kubenswrapper[4746]: I0128 20:44:39.657587 4746 generic.go:334] "Generic (PLEG): container 
finished" podID="ea55a0dc-b65d-4a6a-b497-5c23f4ef6d1d" containerID="32b460399a59ae6cccc37e51c952d58cfdc97c15d21a41d6064a2de0051eae83" exitCode=0 Jan 28 20:44:39 crc kubenswrapper[4746]: I0128 20:44:39.657682 4746 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager/controller-manager-699b895958-sjrh2" event={"ID":"ea55a0dc-b65d-4a6a-b497-5c23f4ef6d1d","Type":"ContainerDied","Data":"32b460399a59ae6cccc37e51c952d58cfdc97c15d21a41d6064a2de0051eae83"} Jan 28 20:44:39 crc kubenswrapper[4746]: I0128 20:44:39.659726 4746 generic.go:334] "Generic (PLEG): container finished" podID="9629e562-2adf-453b-8dfb-37e9195b996e" containerID="cc5d41ae4726f814b4dfa4ceff7391a250ab5f3fd0305acd5cef81fce402e610" exitCode=0 Jan 28 20:44:39 crc kubenswrapper[4746]: I0128 20:44:39.659796 4746 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-route-controller-manager/route-controller-manager-55969676b6-nm42c" event={"ID":"9629e562-2adf-453b-8dfb-37e9195b996e","Type":"ContainerDied","Data":"cc5d41ae4726f814b4dfa4ceff7391a250ab5f3fd0305acd5cef81fce402e610"} Jan 28 20:44:39 crc kubenswrapper[4746]: I0128 20:44:39.950370 4746 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-55969676b6-nm42c" Jan 28 20:44:39 crc kubenswrapper[4746]: I0128 20:44:39.991613 4746 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-route-controller-manager/route-controller-manager-75ddb6d6df-nr5j5"] Jan 28 20:44:39 crc kubenswrapper[4746]: E0128 20:44:39.992155 4746 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="9629e562-2adf-453b-8dfb-37e9195b996e" containerName="route-controller-manager" Jan 28 20:44:39 crc kubenswrapper[4746]: I0128 20:44:39.992170 4746 state_mem.go:107] "Deleted CPUSet assignment" podUID="9629e562-2adf-453b-8dfb-37e9195b996e" containerName="route-controller-manager" Jan 28 20:44:39 crc kubenswrapper[4746]: I0128 20:44:39.992275 4746 memory_manager.go:354] "RemoveStaleState removing state" podUID="9629e562-2adf-453b-8dfb-37e9195b996e" containerName="route-controller-manager" Jan 28 20:44:39 crc kubenswrapper[4746]: I0128 20:44:39.992759 4746 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-75ddb6d6df-nr5j5" Jan 28 20:44:39 crc kubenswrapper[4746]: I0128 20:44:39.997361 4746 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-75ddb6d6df-nr5j5"] Jan 28 20:44:40 crc kubenswrapper[4746]: I0128 20:44:40.059386 4746 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-controller-manager/controller-manager-699b895958-sjrh2" Jan 28 20:44:40 crc kubenswrapper[4746]: I0128 20:44:40.084495 4746 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/9629e562-2adf-453b-8dfb-37e9195b996e-client-ca\") pod \"9629e562-2adf-453b-8dfb-37e9195b996e\" (UID: \"9629e562-2adf-453b-8dfb-37e9195b996e\") " Jan 28 20:44:40 crc kubenswrapper[4746]: I0128 20:44:40.084887 4746 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/9629e562-2adf-453b-8dfb-37e9195b996e-serving-cert\") pod \"9629e562-2adf-453b-8dfb-37e9195b996e\" (UID: \"9629e562-2adf-453b-8dfb-37e9195b996e\") " Jan 28 20:44:40 crc kubenswrapper[4746]: I0128 20:44:40.084945 4746 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/9629e562-2adf-453b-8dfb-37e9195b996e-config\") pod \"9629e562-2adf-453b-8dfb-37e9195b996e\" (UID: \"9629e562-2adf-453b-8dfb-37e9195b996e\") " Jan 28 20:44:40 crc kubenswrapper[4746]: I0128 20:44:40.084971 4746 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-hwdfd\" (UniqueName: \"kubernetes.io/projected/9629e562-2adf-453b-8dfb-37e9195b996e-kube-api-access-hwdfd\") pod \"9629e562-2adf-453b-8dfb-37e9195b996e\" (UID: \"9629e562-2adf-453b-8dfb-37e9195b996e\") " Jan 28 20:44:40 crc kubenswrapper[4746]: I0128 20:44:40.085229 4746 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/ec78c48e-432a-4d99-9010-65ea9dc8642f-config\") pod \"route-controller-manager-75ddb6d6df-nr5j5\" (UID: \"ec78c48e-432a-4d99-9010-65ea9dc8642f\") " pod="openshift-route-controller-manager/route-controller-manager-75ddb6d6df-nr5j5" Jan 28 20:44:40 crc kubenswrapper[4746]: I0128 20:44:40.085263 
4746 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-dth4k\" (UniqueName: \"kubernetes.io/projected/ec78c48e-432a-4d99-9010-65ea9dc8642f-kube-api-access-dth4k\") pod \"route-controller-manager-75ddb6d6df-nr5j5\" (UID: \"ec78c48e-432a-4d99-9010-65ea9dc8642f\") " pod="openshift-route-controller-manager/route-controller-manager-75ddb6d6df-nr5j5" Jan 28 20:44:40 crc kubenswrapper[4746]: I0128 20:44:40.085326 4746 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/ec78c48e-432a-4d99-9010-65ea9dc8642f-serving-cert\") pod \"route-controller-manager-75ddb6d6df-nr5j5\" (UID: \"ec78c48e-432a-4d99-9010-65ea9dc8642f\") " pod="openshift-route-controller-manager/route-controller-manager-75ddb6d6df-nr5j5" Jan 28 20:44:40 crc kubenswrapper[4746]: I0128 20:44:40.085359 4746 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/ec78c48e-432a-4d99-9010-65ea9dc8642f-client-ca\") pod \"route-controller-manager-75ddb6d6df-nr5j5\" (UID: \"ec78c48e-432a-4d99-9010-65ea9dc8642f\") " pod="openshift-route-controller-manager/route-controller-manager-75ddb6d6df-nr5j5" Jan 28 20:44:40 crc kubenswrapper[4746]: I0128 20:44:40.085666 4746 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/9629e562-2adf-453b-8dfb-37e9195b996e-client-ca" (OuterVolumeSpecName: "client-ca") pod "9629e562-2adf-453b-8dfb-37e9195b996e" (UID: "9629e562-2adf-453b-8dfb-37e9195b996e"). InnerVolumeSpecName "client-ca". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 28 20:44:40 crc kubenswrapper[4746]: I0128 20:44:40.086417 4746 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/9629e562-2adf-453b-8dfb-37e9195b996e-config" (OuterVolumeSpecName: "config") pod "9629e562-2adf-453b-8dfb-37e9195b996e" (UID: "9629e562-2adf-453b-8dfb-37e9195b996e"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 28 20:44:40 crc kubenswrapper[4746]: I0128 20:44:40.093275 4746 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/9629e562-2adf-453b-8dfb-37e9195b996e-kube-api-access-hwdfd" (OuterVolumeSpecName: "kube-api-access-hwdfd") pod "9629e562-2adf-453b-8dfb-37e9195b996e" (UID: "9629e562-2adf-453b-8dfb-37e9195b996e"). InnerVolumeSpecName "kube-api-access-hwdfd". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 28 20:44:40 crc kubenswrapper[4746]: I0128 20:44:40.099048 4746 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/9629e562-2adf-453b-8dfb-37e9195b996e-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "9629e562-2adf-453b-8dfb-37e9195b996e" (UID: "9629e562-2adf-453b-8dfb-37e9195b996e"). InnerVolumeSpecName "serving-cert". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 28 20:44:40 crc kubenswrapper[4746]: I0128 20:44:40.186401 4746 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/ea55a0dc-b65d-4a6a-b497-5c23f4ef6d1d-proxy-ca-bundles\") pod \"ea55a0dc-b65d-4a6a-b497-5c23f4ef6d1d\" (UID: \"ea55a0dc-b65d-4a6a-b497-5c23f4ef6d1d\") " Jan 28 20:44:40 crc kubenswrapper[4746]: I0128 20:44:40.186575 4746 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-8nfwq\" (UniqueName: \"kubernetes.io/projected/ea55a0dc-b65d-4a6a-b497-5c23f4ef6d1d-kube-api-access-8nfwq\") pod \"ea55a0dc-b65d-4a6a-b497-5c23f4ef6d1d\" (UID: \"ea55a0dc-b65d-4a6a-b497-5c23f4ef6d1d\") " Jan 28 20:44:40 crc kubenswrapper[4746]: I0128 20:44:40.186644 4746 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/ea55a0dc-b65d-4a6a-b497-5c23f4ef6d1d-serving-cert\") pod \"ea55a0dc-b65d-4a6a-b497-5c23f4ef6d1d\" (UID: \"ea55a0dc-b65d-4a6a-b497-5c23f4ef6d1d\") " Jan 28 20:44:40 crc kubenswrapper[4746]: I0128 20:44:40.187987 4746 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/ea55a0dc-b65d-4a6a-b497-5c23f4ef6d1d-proxy-ca-bundles" (OuterVolumeSpecName: "proxy-ca-bundles") pod "ea55a0dc-b65d-4a6a-b497-5c23f4ef6d1d" (UID: "ea55a0dc-b65d-4a6a-b497-5c23f4ef6d1d"). InnerVolumeSpecName "proxy-ca-bundles". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 28 20:44:40 crc kubenswrapper[4746]: I0128 20:44:40.188203 4746 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/ea55a0dc-b65d-4a6a-b497-5c23f4ef6d1d-client-ca\") pod \"ea55a0dc-b65d-4a6a-b497-5c23f4ef6d1d\" (UID: \"ea55a0dc-b65d-4a6a-b497-5c23f4ef6d1d\") " Jan 28 20:44:40 crc kubenswrapper[4746]: I0128 20:44:40.188308 4746 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/ea55a0dc-b65d-4a6a-b497-5c23f4ef6d1d-config\") pod \"ea55a0dc-b65d-4a6a-b497-5c23f4ef6d1d\" (UID: \"ea55a0dc-b65d-4a6a-b497-5c23f4ef6d1d\") " Jan 28 20:44:40 crc kubenswrapper[4746]: I0128 20:44:40.188732 4746 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/ec78c48e-432a-4d99-9010-65ea9dc8642f-serving-cert\") pod \"route-controller-manager-75ddb6d6df-nr5j5\" (UID: \"ec78c48e-432a-4d99-9010-65ea9dc8642f\") " pod="openshift-route-controller-manager/route-controller-manager-75ddb6d6df-nr5j5" Jan 28 20:44:40 crc kubenswrapper[4746]: I0128 20:44:40.188807 4746 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/ec78c48e-432a-4d99-9010-65ea9dc8642f-client-ca\") pod \"route-controller-manager-75ddb6d6df-nr5j5\" (UID: \"ec78c48e-432a-4d99-9010-65ea9dc8642f\") " pod="openshift-route-controller-manager/route-controller-manager-75ddb6d6df-nr5j5" Jan 28 20:44:40 crc kubenswrapper[4746]: I0128 20:44:40.188893 4746 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/ec78c48e-432a-4d99-9010-65ea9dc8642f-config\") pod \"route-controller-manager-75ddb6d6df-nr5j5\" (UID: \"ec78c48e-432a-4d99-9010-65ea9dc8642f\") " 
pod="openshift-route-controller-manager/route-controller-manager-75ddb6d6df-nr5j5" Jan 28 20:44:40 crc kubenswrapper[4746]: I0128 20:44:40.188948 4746 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-dth4k\" (UniqueName: \"kubernetes.io/projected/ec78c48e-432a-4d99-9010-65ea9dc8642f-kube-api-access-dth4k\") pod \"route-controller-manager-75ddb6d6df-nr5j5\" (UID: \"ec78c48e-432a-4d99-9010-65ea9dc8642f\") " pod="openshift-route-controller-manager/route-controller-manager-75ddb6d6df-nr5j5" Jan 28 20:44:40 crc kubenswrapper[4746]: I0128 20:44:40.188990 4746 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/ea55a0dc-b65d-4a6a-b497-5c23f4ef6d1d-config" (OuterVolumeSpecName: "config") pod "ea55a0dc-b65d-4a6a-b497-5c23f4ef6d1d" (UID: "ea55a0dc-b65d-4a6a-b497-5c23f4ef6d1d"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 28 20:44:40 crc kubenswrapper[4746]: I0128 20:44:40.189286 4746 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/9629e562-2adf-453b-8dfb-37e9195b996e-serving-cert\") on node \"crc\" DevicePath \"\"" Jan 28 20:44:40 crc kubenswrapper[4746]: I0128 20:44:40.189442 4746 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/9629e562-2adf-453b-8dfb-37e9195b996e-config\") on node \"crc\" DevicePath \"\"" Jan 28 20:44:40 crc kubenswrapper[4746]: I0128 20:44:40.189535 4746 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-hwdfd\" (UniqueName: \"kubernetes.io/projected/9629e562-2adf-453b-8dfb-37e9195b996e-kube-api-access-hwdfd\") on node \"crc\" DevicePath \"\"" Jan 28 20:44:40 crc kubenswrapper[4746]: I0128 20:44:40.189636 4746 reconciler_common.go:293] "Volume detached for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/ea55a0dc-b65d-4a6a-b497-5c23f4ef6d1d-proxy-ca-bundles\") on node 
\"crc\" DevicePath \"\"" Jan 28 20:44:40 crc kubenswrapper[4746]: I0128 20:44:40.189703 4746 reconciler_common.go:293] "Volume detached for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/9629e562-2adf-453b-8dfb-37e9195b996e-client-ca\") on node \"crc\" DevicePath \"\"" Jan 28 20:44:40 crc kubenswrapper[4746]: I0128 20:44:40.189930 4746 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/ea55a0dc-b65d-4a6a-b497-5c23f4ef6d1d-client-ca" (OuterVolumeSpecName: "client-ca") pod "ea55a0dc-b65d-4a6a-b497-5c23f4ef6d1d" (UID: "ea55a0dc-b65d-4a6a-b497-5c23f4ef6d1d"). InnerVolumeSpecName "client-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 28 20:44:40 crc kubenswrapper[4746]: I0128 20:44:40.190145 4746 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/ec78c48e-432a-4d99-9010-65ea9dc8642f-client-ca\") pod \"route-controller-manager-75ddb6d6df-nr5j5\" (UID: \"ec78c48e-432a-4d99-9010-65ea9dc8642f\") " pod="openshift-route-controller-manager/route-controller-manager-75ddb6d6df-nr5j5" Jan 28 20:44:40 crc kubenswrapper[4746]: I0128 20:44:40.191050 4746 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/ec78c48e-432a-4d99-9010-65ea9dc8642f-config\") pod \"route-controller-manager-75ddb6d6df-nr5j5\" (UID: \"ec78c48e-432a-4d99-9010-65ea9dc8642f\") " pod="openshift-route-controller-manager/route-controller-manager-75ddb6d6df-nr5j5" Jan 28 20:44:40 crc kubenswrapper[4746]: I0128 20:44:40.191236 4746 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/ea55a0dc-b65d-4a6a-b497-5c23f4ef6d1d-kube-api-access-8nfwq" (OuterVolumeSpecName: "kube-api-access-8nfwq") pod "ea55a0dc-b65d-4a6a-b497-5c23f4ef6d1d" (UID: "ea55a0dc-b65d-4a6a-b497-5c23f4ef6d1d"). InnerVolumeSpecName "kube-api-access-8nfwq". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 28 20:44:40 crc kubenswrapper[4746]: I0128 20:44:40.191290 4746 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/ea55a0dc-b65d-4a6a-b497-5c23f4ef6d1d-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "ea55a0dc-b65d-4a6a-b497-5c23f4ef6d1d" (UID: "ea55a0dc-b65d-4a6a-b497-5c23f4ef6d1d"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 28 20:44:40 crc kubenswrapper[4746]: I0128 20:44:40.195592 4746 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/ec78c48e-432a-4d99-9010-65ea9dc8642f-serving-cert\") pod \"route-controller-manager-75ddb6d6df-nr5j5\" (UID: \"ec78c48e-432a-4d99-9010-65ea9dc8642f\") " pod="openshift-route-controller-manager/route-controller-manager-75ddb6d6df-nr5j5" Jan 28 20:44:40 crc kubenswrapper[4746]: I0128 20:44:40.212705 4746 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-dth4k\" (UniqueName: \"kubernetes.io/projected/ec78c48e-432a-4d99-9010-65ea9dc8642f-kube-api-access-dth4k\") pod \"route-controller-manager-75ddb6d6df-nr5j5\" (UID: \"ec78c48e-432a-4d99-9010-65ea9dc8642f\") " pod="openshift-route-controller-manager/route-controller-manager-75ddb6d6df-nr5j5" Jan 28 20:44:40 crc kubenswrapper[4746]: I0128 20:44:40.294472 4746 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-8nfwq\" (UniqueName: \"kubernetes.io/projected/ea55a0dc-b65d-4a6a-b497-5c23f4ef6d1d-kube-api-access-8nfwq\") on node \"crc\" DevicePath \"\"" Jan 28 20:44:40 crc kubenswrapper[4746]: I0128 20:44:40.294540 4746 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/ea55a0dc-b65d-4a6a-b497-5c23f4ef6d1d-serving-cert\") on node \"crc\" DevicePath \"\"" Jan 28 20:44:40 crc kubenswrapper[4746]: I0128 20:44:40.294556 4746 reconciler_common.go:293] 
"Volume detached for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/ea55a0dc-b65d-4a6a-b497-5c23f4ef6d1d-client-ca\") on node \"crc\" DevicePath \"\"" Jan 28 20:44:40 crc kubenswrapper[4746]: I0128 20:44:40.294570 4746 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/ea55a0dc-b65d-4a6a-b497-5c23f4ef6d1d-config\") on node \"crc\" DevicePath \"\"" Jan 28 20:44:40 crc kubenswrapper[4746]: I0128 20:44:40.356774 4746 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-75ddb6d6df-nr5j5" Jan 28 20:44:40 crc kubenswrapper[4746]: I0128 20:44:40.667165 4746 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-route-controller-manager/route-controller-manager-55969676b6-nm42c" event={"ID":"9629e562-2adf-453b-8dfb-37e9195b996e","Type":"ContainerDied","Data":"da2ed9f0cc3c2fb3bb93f40e32a382f2f62edc00897027fcd47573a42bfb49c1"} Jan 28 20:44:40 crc kubenswrapper[4746]: I0128 20:44:40.667716 4746 scope.go:117] "RemoveContainer" containerID="cc5d41ae4726f814b4dfa4ceff7391a250ab5f3fd0305acd5cef81fce402e610" Jan 28 20:44:40 crc kubenswrapper[4746]: I0128 20:44:40.667230 4746 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-55969676b6-nm42c" Jan 28 20:44:40 crc kubenswrapper[4746]: I0128 20:44:40.668432 4746 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager/controller-manager-699b895958-sjrh2" event={"ID":"ea55a0dc-b65d-4a6a-b497-5c23f4ef6d1d","Type":"ContainerDied","Data":"58c6a8e51b10b8d194c10aa508840e171a87595752281f5addea1902a26d5702"} Jan 28 20:44:40 crc kubenswrapper[4746]: I0128 20:44:40.668479 4746 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-controller-manager/controller-manager-699b895958-sjrh2" Jan 28 20:44:40 crc kubenswrapper[4746]: I0128 20:44:40.689043 4746 scope.go:117] "RemoveContainer" containerID="32b460399a59ae6cccc37e51c952d58cfdc97c15d21a41d6064a2de0051eae83" Jan 28 20:44:40 crc kubenswrapper[4746]: I0128 20:44:40.702387 4746 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-55969676b6-nm42c"] Jan 28 20:44:40 crc kubenswrapper[4746]: I0128 20:44:40.708926 4746 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-55969676b6-nm42c"] Jan 28 20:44:40 crc kubenswrapper[4746]: I0128 20:44:40.728362 4746 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-controller-manager/controller-manager-699b895958-sjrh2"] Jan 28 20:44:40 crc kubenswrapper[4746]: I0128 20:44:40.732530 4746 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-controller-manager/controller-manager-699b895958-sjrh2"] Jan 28 20:44:40 crc kubenswrapper[4746]: I0128 20:44:40.845271 4746 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="9629e562-2adf-453b-8dfb-37e9195b996e" path="/var/lib/kubelet/pods/9629e562-2adf-453b-8dfb-37e9195b996e/volumes" Jan 28 20:44:40 crc kubenswrapper[4746]: I0128 20:44:40.846330 4746 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="ea55a0dc-b65d-4a6a-b497-5c23f4ef6d1d" path="/var/lib/kubelet/pods/ea55a0dc-b65d-4a6a-b497-5c23f4ef6d1d/volumes" Jan 28 20:44:40 crc kubenswrapper[4746]: I0128 20:44:40.847636 4746 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-75ddb6d6df-nr5j5"] Jan 28 20:44:40 crc kubenswrapper[4746]: W0128 20:44:40.848294 4746 manager.go:1169] Failed to process watch event {EventType:0 
Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podec78c48e_432a_4d99_9010_65ea9dc8642f.slice/crio-a24ef2e44d4def6d400af49cb575bdcaa3043a2ffa5718502e5ca26cf3dc6f27 WatchSource:0}: Error finding container a24ef2e44d4def6d400af49cb575bdcaa3043a2ffa5718502e5ca26cf3dc6f27: Status 404 returned error can't find the container with id a24ef2e44d4def6d400af49cb575bdcaa3043a2ffa5718502e5ca26cf3dc6f27 Jan 28 20:44:41 crc kubenswrapper[4746]: I0128 20:44:41.677577 4746 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-route-controller-manager/route-controller-manager-75ddb6d6df-nr5j5" event={"ID":"ec78c48e-432a-4d99-9010-65ea9dc8642f","Type":"ContainerStarted","Data":"1e8ddcf0621502f2aac6c119442326d492dca6f7742ec123e55d8f6bbd56cb0e"} Jan 28 20:44:41 crc kubenswrapper[4746]: I0128 20:44:41.677634 4746 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-route-controller-manager/route-controller-manager-75ddb6d6df-nr5j5" event={"ID":"ec78c48e-432a-4d99-9010-65ea9dc8642f","Type":"ContainerStarted","Data":"a24ef2e44d4def6d400af49cb575bdcaa3043a2ffa5718502e5ca26cf3dc6f27"} Jan 28 20:44:41 crc kubenswrapper[4746]: I0128 20:44:41.679829 4746 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-route-controller-manager/route-controller-manager-75ddb6d6df-nr5j5" Jan 28 20:44:41 crc kubenswrapper[4746]: I0128 20:44:41.685705 4746 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-route-controller-manager/route-controller-manager-75ddb6d6df-nr5j5" Jan 28 20:44:41 crc kubenswrapper[4746]: I0128 20:44:41.696621 4746 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-route-controller-manager/route-controller-manager-75ddb6d6df-nr5j5" podStartSLOduration=3.696599971 podStartE2EDuration="3.696599971s" podCreationTimestamp="2026-01-28 20:44:38 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" 
observedRunningTime="2026-01-28 20:44:41.694645856 +0000 UTC m=+309.650832220" watchObservedRunningTime="2026-01-28 20:44:41.696599971 +0000 UTC m=+309.652786325" Jan 28 20:44:42 crc kubenswrapper[4746]: I0128 20:44:42.508566 4746 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-controller-manager/controller-manager-76bbb855b9-nr44n"] Jan 28 20:44:42 crc kubenswrapper[4746]: E0128 20:44:42.509692 4746 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="ea55a0dc-b65d-4a6a-b497-5c23f4ef6d1d" containerName="controller-manager" Jan 28 20:44:42 crc kubenswrapper[4746]: I0128 20:44:42.509726 4746 state_mem.go:107] "Deleted CPUSet assignment" podUID="ea55a0dc-b65d-4a6a-b497-5c23f4ef6d1d" containerName="controller-manager" Jan 28 20:44:42 crc kubenswrapper[4746]: I0128 20:44:42.510006 4746 memory_manager.go:354] "RemoveStaleState removing state" podUID="ea55a0dc-b65d-4a6a-b497-5c23f4ef6d1d" containerName="controller-manager" Jan 28 20:44:42 crc kubenswrapper[4746]: I0128 20:44:42.510949 4746 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-controller-manager/controller-manager-76bbb855b9-nr44n" Jan 28 20:44:42 crc kubenswrapper[4746]: I0128 20:44:42.514680 4746 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"openshift-service-ca.crt" Jan 28 20:44:42 crc kubenswrapper[4746]: I0128 20:44:42.514791 4746 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"config" Jan 28 20:44:42 crc kubenswrapper[4746]: I0128 20:44:42.515379 4746 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"kube-root-ca.crt" Jan 28 20:44:42 crc kubenswrapper[4746]: I0128 20:44:42.515501 4746 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"client-ca" Jan 28 20:44:42 crc kubenswrapper[4746]: I0128 20:44:42.516232 4746 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-controller-manager"/"openshift-controller-manager-sa-dockercfg-msq4c" Jan 28 20:44:42 crc kubenswrapper[4746]: I0128 20:44:42.518020 4746 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-controller-manager"/"serving-cert" Jan 28 20:44:42 crc kubenswrapper[4746]: I0128 20:44:42.524519 4746 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"openshift-global-ca" Jan 28 20:44:42 crc kubenswrapper[4746]: I0128 20:44:42.526551 4746 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-controller-manager/controller-manager-76bbb855b9-nr44n"] Jan 28 20:44:42 crc kubenswrapper[4746]: I0128 20:44:42.530958 4746 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/313e5ac4-1f33-4961-a247-3a72fe8138d0-serving-cert\") pod \"controller-manager-76bbb855b9-nr44n\" (UID: \"313e5ac4-1f33-4961-a247-3a72fe8138d0\") " 
pod="openshift-controller-manager/controller-manager-76bbb855b9-nr44n" Jan 28 20:44:42 crc kubenswrapper[4746]: I0128 20:44:42.531066 4746 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/313e5ac4-1f33-4961-a247-3a72fe8138d0-config\") pod \"controller-manager-76bbb855b9-nr44n\" (UID: \"313e5ac4-1f33-4961-a247-3a72fe8138d0\") " pod="openshift-controller-manager/controller-manager-76bbb855b9-nr44n" Jan 28 20:44:42 crc kubenswrapper[4746]: I0128 20:44:42.531133 4746 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/313e5ac4-1f33-4961-a247-3a72fe8138d0-client-ca\") pod \"controller-manager-76bbb855b9-nr44n\" (UID: \"313e5ac4-1f33-4961-a247-3a72fe8138d0\") " pod="openshift-controller-manager/controller-manager-76bbb855b9-nr44n" Jan 28 20:44:42 crc kubenswrapper[4746]: I0128 20:44:42.531192 4746 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-h7wxc\" (UniqueName: \"kubernetes.io/projected/313e5ac4-1f33-4961-a247-3a72fe8138d0-kube-api-access-h7wxc\") pod \"controller-manager-76bbb855b9-nr44n\" (UID: \"313e5ac4-1f33-4961-a247-3a72fe8138d0\") " pod="openshift-controller-manager/controller-manager-76bbb855b9-nr44n" Jan 28 20:44:42 crc kubenswrapper[4746]: I0128 20:44:42.531269 4746 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/313e5ac4-1f33-4961-a247-3a72fe8138d0-proxy-ca-bundles\") pod \"controller-manager-76bbb855b9-nr44n\" (UID: \"313e5ac4-1f33-4961-a247-3a72fe8138d0\") " pod="openshift-controller-manager/controller-manager-76bbb855b9-nr44n" Jan 28 20:44:42 crc kubenswrapper[4746]: I0128 20:44:42.632216 4746 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" 
(UniqueName: \"kubernetes.io/secret/313e5ac4-1f33-4961-a247-3a72fe8138d0-serving-cert\") pod \"controller-manager-76bbb855b9-nr44n\" (UID: \"313e5ac4-1f33-4961-a247-3a72fe8138d0\") " pod="openshift-controller-manager/controller-manager-76bbb855b9-nr44n" Jan 28 20:44:42 crc kubenswrapper[4746]: I0128 20:44:42.632366 4746 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/313e5ac4-1f33-4961-a247-3a72fe8138d0-config\") pod \"controller-manager-76bbb855b9-nr44n\" (UID: \"313e5ac4-1f33-4961-a247-3a72fe8138d0\") " pod="openshift-controller-manager/controller-manager-76bbb855b9-nr44n" Jan 28 20:44:42 crc kubenswrapper[4746]: I0128 20:44:42.632389 4746 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/313e5ac4-1f33-4961-a247-3a72fe8138d0-client-ca\") pod \"controller-manager-76bbb855b9-nr44n\" (UID: \"313e5ac4-1f33-4961-a247-3a72fe8138d0\") " pod="openshift-controller-manager/controller-manager-76bbb855b9-nr44n" Jan 28 20:44:42 crc kubenswrapper[4746]: I0128 20:44:42.634438 4746 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-h7wxc\" (UniqueName: \"kubernetes.io/projected/313e5ac4-1f33-4961-a247-3a72fe8138d0-kube-api-access-h7wxc\") pod \"controller-manager-76bbb855b9-nr44n\" (UID: \"313e5ac4-1f33-4961-a247-3a72fe8138d0\") " pod="openshift-controller-manager/controller-manager-76bbb855b9-nr44n" Jan 28 20:44:42 crc kubenswrapper[4746]: I0128 20:44:42.634530 4746 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/313e5ac4-1f33-4961-a247-3a72fe8138d0-config\") pod \"controller-manager-76bbb855b9-nr44n\" (UID: \"313e5ac4-1f33-4961-a247-3a72fe8138d0\") " pod="openshift-controller-manager/controller-manager-76bbb855b9-nr44n" Jan 28 20:44:42 crc kubenswrapper[4746]: I0128 20:44:42.634452 4746 
operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/313e5ac4-1f33-4961-a247-3a72fe8138d0-client-ca\") pod \"controller-manager-76bbb855b9-nr44n\" (UID: \"313e5ac4-1f33-4961-a247-3a72fe8138d0\") " pod="openshift-controller-manager/controller-manager-76bbb855b9-nr44n" Jan 28 20:44:42 crc kubenswrapper[4746]: I0128 20:44:42.634644 4746 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/313e5ac4-1f33-4961-a247-3a72fe8138d0-proxy-ca-bundles\") pod \"controller-manager-76bbb855b9-nr44n\" (UID: \"313e5ac4-1f33-4961-a247-3a72fe8138d0\") " pod="openshift-controller-manager/controller-manager-76bbb855b9-nr44n" Jan 28 20:44:42 crc kubenswrapper[4746]: I0128 20:44:42.636022 4746 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/313e5ac4-1f33-4961-a247-3a72fe8138d0-proxy-ca-bundles\") pod \"controller-manager-76bbb855b9-nr44n\" (UID: \"313e5ac4-1f33-4961-a247-3a72fe8138d0\") " pod="openshift-controller-manager/controller-manager-76bbb855b9-nr44n" Jan 28 20:44:42 crc kubenswrapper[4746]: I0128 20:44:42.644173 4746 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/313e5ac4-1f33-4961-a247-3a72fe8138d0-serving-cert\") pod \"controller-manager-76bbb855b9-nr44n\" (UID: \"313e5ac4-1f33-4961-a247-3a72fe8138d0\") " pod="openshift-controller-manager/controller-manager-76bbb855b9-nr44n" Jan 28 20:44:42 crc kubenswrapper[4746]: I0128 20:44:42.651159 4746 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-h7wxc\" (UniqueName: \"kubernetes.io/projected/313e5ac4-1f33-4961-a247-3a72fe8138d0-kube-api-access-h7wxc\") pod \"controller-manager-76bbb855b9-nr44n\" (UID: \"313e5ac4-1f33-4961-a247-3a72fe8138d0\") " 
pod="openshift-controller-manager/controller-manager-76bbb855b9-nr44n" Jan 28 20:44:42 crc kubenswrapper[4746]: I0128 20:44:42.834250 4746 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-controller-manager/controller-manager-76bbb855b9-nr44n" Jan 28 20:44:43 crc kubenswrapper[4746]: I0128 20:44:43.289603 4746 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-controller-manager/controller-manager-76bbb855b9-nr44n"] Jan 28 20:44:43 crc kubenswrapper[4746]: I0128 20:44:43.695155 4746 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager/controller-manager-76bbb855b9-nr44n" event={"ID":"313e5ac4-1f33-4961-a247-3a72fe8138d0","Type":"ContainerStarted","Data":"dbfe8b15e772bf05ecba765ba9ad63f04980cd1a7bb4c11ea556997addf5e8e7"} Jan 28 20:44:43 crc kubenswrapper[4746]: I0128 20:44:43.695232 4746 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager/controller-manager-76bbb855b9-nr44n" event={"ID":"313e5ac4-1f33-4961-a247-3a72fe8138d0","Type":"ContainerStarted","Data":"6ca802f7c9166668186654a279e699e1884f7216549c930d82b20ca11addea3b"} Jan 28 20:44:43 crc kubenswrapper[4746]: I0128 20:44:43.716365 4746 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-controller-manager/controller-manager-76bbb855b9-nr44n" podStartSLOduration=5.716333842 podStartE2EDuration="5.716333842s" podCreationTimestamp="2026-01-28 20:44:38 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-28 20:44:43.712509304 +0000 UTC m=+311.668695658" watchObservedRunningTime="2026-01-28 20:44:43.716333842 +0000 UTC m=+311.672520196" Jan 28 20:44:44 crc kubenswrapper[4746]: I0128 20:44:44.701025 4746 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-controller-manager/controller-manager-76bbb855b9-nr44n" Jan 28 20:44:44 crc kubenswrapper[4746]: I0128 
20:44:44.706283 4746 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-controller-manager/controller-manager-76bbb855b9-nr44n" Jan 28 20:44:58 crc kubenswrapper[4746]: I0128 20:44:58.747543 4746 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-controller-manager/controller-manager-76bbb855b9-nr44n"] Jan 28 20:44:58 crc kubenswrapper[4746]: I0128 20:44:58.748427 4746 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-controller-manager/controller-manager-76bbb855b9-nr44n" podUID="313e5ac4-1f33-4961-a247-3a72fe8138d0" containerName="controller-manager" containerID="cri-o://dbfe8b15e772bf05ecba765ba9ad63f04980cd1a7bb4c11ea556997addf5e8e7" gracePeriod=30 Jan 28 20:44:58 crc kubenswrapper[4746]: I0128 20:44:58.767985 4746 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-75ddb6d6df-nr5j5"] Jan 28 20:44:58 crc kubenswrapper[4746]: I0128 20:44:58.768289 4746 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-route-controller-manager/route-controller-manager-75ddb6d6df-nr5j5" podUID="ec78c48e-432a-4d99-9010-65ea9dc8642f" containerName="route-controller-manager" containerID="cri-o://1e8ddcf0621502f2aac6c119442326d492dca6f7742ec123e55d8f6bbd56cb0e" gracePeriod=30 Jan 28 20:44:59 crc kubenswrapper[4746]: I0128 20:44:59.370102 4746 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-75ddb6d6df-nr5j5" Jan 28 20:44:59 crc kubenswrapper[4746]: I0128 20:44:59.431013 4746 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/ec78c48e-432a-4d99-9010-65ea9dc8642f-config\") pod \"ec78c48e-432a-4d99-9010-65ea9dc8642f\" (UID: \"ec78c48e-432a-4d99-9010-65ea9dc8642f\") " Jan 28 20:44:59 crc kubenswrapper[4746]: I0128 20:44:59.431216 4746 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/ec78c48e-432a-4d99-9010-65ea9dc8642f-client-ca\") pod \"ec78c48e-432a-4d99-9010-65ea9dc8642f\" (UID: \"ec78c48e-432a-4d99-9010-65ea9dc8642f\") " Jan 28 20:44:59 crc kubenswrapper[4746]: I0128 20:44:59.431294 4746 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/ec78c48e-432a-4d99-9010-65ea9dc8642f-serving-cert\") pod \"ec78c48e-432a-4d99-9010-65ea9dc8642f\" (UID: \"ec78c48e-432a-4d99-9010-65ea9dc8642f\") " Jan 28 20:44:59 crc kubenswrapper[4746]: I0128 20:44:59.431363 4746 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-dth4k\" (UniqueName: \"kubernetes.io/projected/ec78c48e-432a-4d99-9010-65ea9dc8642f-kube-api-access-dth4k\") pod \"ec78c48e-432a-4d99-9010-65ea9dc8642f\" (UID: \"ec78c48e-432a-4d99-9010-65ea9dc8642f\") " Jan 28 20:44:59 crc kubenswrapper[4746]: I0128 20:44:59.432279 4746 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/ec78c48e-432a-4d99-9010-65ea9dc8642f-config" (OuterVolumeSpecName: "config") pod "ec78c48e-432a-4d99-9010-65ea9dc8642f" (UID: "ec78c48e-432a-4d99-9010-65ea9dc8642f"). InnerVolumeSpecName "config". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 28 20:44:59 crc kubenswrapper[4746]: I0128 20:44:59.432375 4746 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/ec78c48e-432a-4d99-9010-65ea9dc8642f-client-ca" (OuterVolumeSpecName: "client-ca") pod "ec78c48e-432a-4d99-9010-65ea9dc8642f" (UID: "ec78c48e-432a-4d99-9010-65ea9dc8642f"). InnerVolumeSpecName "client-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 28 20:44:59 crc kubenswrapper[4746]: I0128 20:44:59.438999 4746 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/ec78c48e-432a-4d99-9010-65ea9dc8642f-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "ec78c48e-432a-4d99-9010-65ea9dc8642f" (UID: "ec78c48e-432a-4d99-9010-65ea9dc8642f"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 28 20:44:59 crc kubenswrapper[4746]: I0128 20:44:59.450261 4746 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/ec78c48e-432a-4d99-9010-65ea9dc8642f-kube-api-access-dth4k" (OuterVolumeSpecName: "kube-api-access-dth4k") pod "ec78c48e-432a-4d99-9010-65ea9dc8642f" (UID: "ec78c48e-432a-4d99-9010-65ea9dc8642f"). InnerVolumeSpecName "kube-api-access-dth4k". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 28 20:44:59 crc kubenswrapper[4746]: I0128 20:44:59.484502 4746 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-controller-manager/controller-manager-76bbb855b9-nr44n" Jan 28 20:44:59 crc kubenswrapper[4746]: I0128 20:44:59.532645 4746 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/313e5ac4-1f33-4961-a247-3a72fe8138d0-client-ca\") pod \"313e5ac4-1f33-4961-a247-3a72fe8138d0\" (UID: \"313e5ac4-1f33-4961-a247-3a72fe8138d0\") " Jan 28 20:44:59 crc kubenswrapper[4746]: I0128 20:44:59.532743 4746 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-h7wxc\" (UniqueName: \"kubernetes.io/projected/313e5ac4-1f33-4961-a247-3a72fe8138d0-kube-api-access-h7wxc\") pod \"313e5ac4-1f33-4961-a247-3a72fe8138d0\" (UID: \"313e5ac4-1f33-4961-a247-3a72fe8138d0\") " Jan 28 20:44:59 crc kubenswrapper[4746]: I0128 20:44:59.532806 4746 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/313e5ac4-1f33-4961-a247-3a72fe8138d0-config\") pod \"313e5ac4-1f33-4961-a247-3a72fe8138d0\" (UID: \"313e5ac4-1f33-4961-a247-3a72fe8138d0\") " Jan 28 20:44:59 crc kubenswrapper[4746]: I0128 20:44:59.532896 4746 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/313e5ac4-1f33-4961-a247-3a72fe8138d0-proxy-ca-bundles\") pod \"313e5ac4-1f33-4961-a247-3a72fe8138d0\" (UID: \"313e5ac4-1f33-4961-a247-3a72fe8138d0\") " Jan 28 20:44:59 crc kubenswrapper[4746]: I0128 20:44:59.532968 4746 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/313e5ac4-1f33-4961-a247-3a72fe8138d0-serving-cert\") pod \"313e5ac4-1f33-4961-a247-3a72fe8138d0\" (UID: \"313e5ac4-1f33-4961-a247-3a72fe8138d0\") " Jan 28 20:44:59 crc kubenswrapper[4746]: I0128 20:44:59.533485 4746 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume 
"kubernetes.io/configmap/313e5ac4-1f33-4961-a247-3a72fe8138d0-client-ca" (OuterVolumeSpecName: "client-ca") pod "313e5ac4-1f33-4961-a247-3a72fe8138d0" (UID: "313e5ac4-1f33-4961-a247-3a72fe8138d0"). InnerVolumeSpecName "client-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 28 20:44:59 crc kubenswrapper[4746]: I0128 20:44:59.533753 4746 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/313e5ac4-1f33-4961-a247-3a72fe8138d0-proxy-ca-bundles" (OuterVolumeSpecName: "proxy-ca-bundles") pod "313e5ac4-1f33-4961-a247-3a72fe8138d0" (UID: "313e5ac4-1f33-4961-a247-3a72fe8138d0"). InnerVolumeSpecName "proxy-ca-bundles". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 28 20:44:59 crc kubenswrapper[4746]: I0128 20:44:59.533942 4746 reconciler_common.go:293] "Volume detached for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/ec78c48e-432a-4d99-9010-65ea9dc8642f-client-ca\") on node \"crc\" DevicePath \"\"" Jan 28 20:44:59 crc kubenswrapper[4746]: I0128 20:44:59.533955 4746 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/313e5ac4-1f33-4961-a247-3a72fe8138d0-config" (OuterVolumeSpecName: "config") pod "313e5ac4-1f33-4961-a247-3a72fe8138d0" (UID: "313e5ac4-1f33-4961-a247-3a72fe8138d0"). InnerVolumeSpecName "config". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 28 20:44:59 crc kubenswrapper[4746]: I0128 20:44:59.533986 4746 reconciler_common.go:293] "Volume detached for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/313e5ac4-1f33-4961-a247-3a72fe8138d0-proxy-ca-bundles\") on node \"crc\" DevicePath \"\"" Jan 28 20:44:59 crc kubenswrapper[4746]: I0128 20:44:59.534113 4746 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/ec78c48e-432a-4d99-9010-65ea9dc8642f-serving-cert\") on node \"crc\" DevicePath \"\"" Jan 28 20:44:59 crc kubenswrapper[4746]: I0128 20:44:59.534139 4746 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-dth4k\" (UniqueName: \"kubernetes.io/projected/ec78c48e-432a-4d99-9010-65ea9dc8642f-kube-api-access-dth4k\") on node \"crc\" DevicePath \"\"" Jan 28 20:44:59 crc kubenswrapper[4746]: I0128 20:44:59.534161 4746 reconciler_common.go:293] "Volume detached for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/313e5ac4-1f33-4961-a247-3a72fe8138d0-client-ca\") on node \"crc\" DevicePath \"\"" Jan 28 20:44:59 crc kubenswrapper[4746]: I0128 20:44:59.534184 4746 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/ec78c48e-432a-4d99-9010-65ea9dc8642f-config\") on node \"crc\" DevicePath \"\"" Jan 28 20:44:59 crc kubenswrapper[4746]: I0128 20:44:59.536260 4746 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/313e5ac4-1f33-4961-a247-3a72fe8138d0-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "313e5ac4-1f33-4961-a247-3a72fe8138d0" (UID: "313e5ac4-1f33-4961-a247-3a72fe8138d0"). InnerVolumeSpecName "serving-cert". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 28 20:44:59 crc kubenswrapper[4746]: I0128 20:44:59.537249 4746 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/313e5ac4-1f33-4961-a247-3a72fe8138d0-kube-api-access-h7wxc" (OuterVolumeSpecName: "kube-api-access-h7wxc") pod "313e5ac4-1f33-4961-a247-3a72fe8138d0" (UID: "313e5ac4-1f33-4961-a247-3a72fe8138d0"). InnerVolumeSpecName "kube-api-access-h7wxc". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 28 20:44:59 crc kubenswrapper[4746]: I0128 20:44:59.635445 4746 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-h7wxc\" (UniqueName: \"kubernetes.io/projected/313e5ac4-1f33-4961-a247-3a72fe8138d0-kube-api-access-h7wxc\") on node \"crc\" DevicePath \"\"" Jan 28 20:44:59 crc kubenswrapper[4746]: I0128 20:44:59.635493 4746 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/313e5ac4-1f33-4961-a247-3a72fe8138d0-config\") on node \"crc\" DevicePath \"\"" Jan 28 20:44:59 crc kubenswrapper[4746]: I0128 20:44:59.635505 4746 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/313e5ac4-1f33-4961-a247-3a72fe8138d0-serving-cert\") on node \"crc\" DevicePath \"\"" Jan 28 20:44:59 crc kubenswrapper[4746]: I0128 20:44:59.796533 4746 generic.go:334] "Generic (PLEG): container finished" podID="313e5ac4-1f33-4961-a247-3a72fe8138d0" containerID="dbfe8b15e772bf05ecba765ba9ad63f04980cd1a7bb4c11ea556997addf5e8e7" exitCode=0 Jan 28 20:44:59 crc kubenswrapper[4746]: I0128 20:44:59.796662 4746 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager/controller-manager-76bbb855b9-nr44n" event={"ID":"313e5ac4-1f33-4961-a247-3a72fe8138d0","Type":"ContainerDied","Data":"dbfe8b15e772bf05ecba765ba9ad63f04980cd1a7bb4c11ea556997addf5e8e7"} Jan 28 20:44:59 crc kubenswrapper[4746]: I0128 20:44:59.796684 4746 util.go:48] "No ready 
sandbox for pod can be found. Need to start a new one" pod="openshift-controller-manager/controller-manager-76bbb855b9-nr44n" Jan 28 20:44:59 crc kubenswrapper[4746]: I0128 20:44:59.796718 4746 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager/controller-manager-76bbb855b9-nr44n" event={"ID":"313e5ac4-1f33-4961-a247-3a72fe8138d0","Type":"ContainerDied","Data":"6ca802f7c9166668186654a279e699e1884f7216549c930d82b20ca11addea3b"} Jan 28 20:44:59 crc kubenswrapper[4746]: I0128 20:44:59.796742 4746 scope.go:117] "RemoveContainer" containerID="dbfe8b15e772bf05ecba765ba9ad63f04980cd1a7bb4c11ea556997addf5e8e7" Jan 28 20:44:59 crc kubenswrapper[4746]: I0128 20:44:59.800328 4746 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-75ddb6d6df-nr5j5" Jan 28 20:44:59 crc kubenswrapper[4746]: I0128 20:44:59.801373 4746 generic.go:334] "Generic (PLEG): container finished" podID="ec78c48e-432a-4d99-9010-65ea9dc8642f" containerID="1e8ddcf0621502f2aac6c119442326d492dca6f7742ec123e55d8f6bbd56cb0e" exitCode=0 Jan 28 20:44:59 crc kubenswrapper[4746]: I0128 20:44:59.801468 4746 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-route-controller-manager/route-controller-manager-75ddb6d6df-nr5j5" event={"ID":"ec78c48e-432a-4d99-9010-65ea9dc8642f","Type":"ContainerDied","Data":"1e8ddcf0621502f2aac6c119442326d492dca6f7742ec123e55d8f6bbd56cb0e"} Jan 28 20:44:59 crc kubenswrapper[4746]: I0128 20:44:59.801893 4746 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-route-controller-manager/route-controller-manager-75ddb6d6df-nr5j5" event={"ID":"ec78c48e-432a-4d99-9010-65ea9dc8642f","Type":"ContainerDied","Data":"a24ef2e44d4def6d400af49cb575bdcaa3043a2ffa5718502e5ca26cf3dc6f27"} Jan 28 20:44:59 crc kubenswrapper[4746]: I0128 20:44:59.827944 4746 scope.go:117] "RemoveContainer" 
containerID="dbfe8b15e772bf05ecba765ba9ad63f04980cd1a7bb4c11ea556997addf5e8e7" Jan 28 20:44:59 crc kubenswrapper[4746]: E0128 20:44:59.828820 4746 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"dbfe8b15e772bf05ecba765ba9ad63f04980cd1a7bb4c11ea556997addf5e8e7\": container with ID starting with dbfe8b15e772bf05ecba765ba9ad63f04980cd1a7bb4c11ea556997addf5e8e7 not found: ID does not exist" containerID="dbfe8b15e772bf05ecba765ba9ad63f04980cd1a7bb4c11ea556997addf5e8e7" Jan 28 20:44:59 crc kubenswrapper[4746]: I0128 20:44:59.828914 4746 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"dbfe8b15e772bf05ecba765ba9ad63f04980cd1a7bb4c11ea556997addf5e8e7"} err="failed to get container status \"dbfe8b15e772bf05ecba765ba9ad63f04980cd1a7bb4c11ea556997addf5e8e7\": rpc error: code = NotFound desc = could not find container \"dbfe8b15e772bf05ecba765ba9ad63f04980cd1a7bb4c11ea556997addf5e8e7\": container with ID starting with dbfe8b15e772bf05ecba765ba9ad63f04980cd1a7bb4c11ea556997addf5e8e7 not found: ID does not exist" Jan 28 20:44:59 crc kubenswrapper[4746]: I0128 20:44:59.828975 4746 scope.go:117] "RemoveContainer" containerID="1e8ddcf0621502f2aac6c119442326d492dca6f7742ec123e55d8f6bbd56cb0e" Jan 28 20:44:59 crc kubenswrapper[4746]: I0128 20:44:59.848226 4746 scope.go:117] "RemoveContainer" containerID="1e8ddcf0621502f2aac6c119442326d492dca6f7742ec123e55d8f6bbd56cb0e" Jan 28 20:44:59 crc kubenswrapper[4746]: E0128 20:44:59.848944 4746 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"1e8ddcf0621502f2aac6c119442326d492dca6f7742ec123e55d8f6bbd56cb0e\": container with ID starting with 1e8ddcf0621502f2aac6c119442326d492dca6f7742ec123e55d8f6bbd56cb0e not found: ID does not exist" containerID="1e8ddcf0621502f2aac6c119442326d492dca6f7742ec123e55d8f6bbd56cb0e" Jan 28 20:44:59 crc 
kubenswrapper[4746]: I0128 20:44:59.849003 4746 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"1e8ddcf0621502f2aac6c119442326d492dca6f7742ec123e55d8f6bbd56cb0e"} err="failed to get container status \"1e8ddcf0621502f2aac6c119442326d492dca6f7742ec123e55d8f6bbd56cb0e\": rpc error: code = NotFound desc = could not find container \"1e8ddcf0621502f2aac6c119442326d492dca6f7742ec123e55d8f6bbd56cb0e\": container with ID starting with 1e8ddcf0621502f2aac6c119442326d492dca6f7742ec123e55d8f6bbd56cb0e not found: ID does not exist" Jan 28 20:44:59 crc kubenswrapper[4746]: I0128 20:44:59.850499 4746 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-controller-manager/controller-manager-76bbb855b9-nr44n"] Jan 28 20:44:59 crc kubenswrapper[4746]: I0128 20:44:59.860273 4746 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-controller-manager/controller-manager-76bbb855b9-nr44n"] Jan 28 20:44:59 crc kubenswrapper[4746]: I0128 20:44:59.871615 4746 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-75ddb6d6df-nr5j5"] Jan 28 20:44:59 crc kubenswrapper[4746]: I0128 20:44:59.877119 4746 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-75ddb6d6df-nr5j5"] Jan 28 20:45:00 crc kubenswrapper[4746]: I0128 20:45:00.202800 4746 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29493885-pgrr9"] Jan 28 20:45:00 crc kubenswrapper[4746]: E0128 20:45:00.203031 4746 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="ec78c48e-432a-4d99-9010-65ea9dc8642f" containerName="route-controller-manager" Jan 28 20:45:00 crc kubenswrapper[4746]: I0128 20:45:00.203044 4746 state_mem.go:107] "Deleted CPUSet assignment" podUID="ec78c48e-432a-4d99-9010-65ea9dc8642f" containerName="route-controller-manager" Jan 28 20:45:00 crc 
kubenswrapper[4746]: E0128 20:45:00.203063 4746 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="313e5ac4-1f33-4961-a247-3a72fe8138d0" containerName="controller-manager" Jan 28 20:45:00 crc kubenswrapper[4746]: I0128 20:45:00.203070 4746 state_mem.go:107] "Deleted CPUSet assignment" podUID="313e5ac4-1f33-4961-a247-3a72fe8138d0" containerName="controller-manager" Jan 28 20:45:00 crc kubenswrapper[4746]: I0128 20:45:00.203214 4746 memory_manager.go:354] "RemoveStaleState removing state" podUID="313e5ac4-1f33-4961-a247-3a72fe8138d0" containerName="controller-manager" Jan 28 20:45:00 crc kubenswrapper[4746]: I0128 20:45:00.203232 4746 memory_manager.go:354] "RemoveStaleState removing state" podUID="ec78c48e-432a-4d99-9010-65ea9dc8642f" containerName="route-controller-manager" Jan 28 20:45:00 crc kubenswrapper[4746]: I0128 20:45:00.203633 4746 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29493885-pgrr9" Jan 28 20:45:00 crc kubenswrapper[4746]: I0128 20:45:00.207301 4746 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-operator-lifecycle-manager"/"collect-profiles-config" Jan 28 20:45:00 crc kubenswrapper[4746]: I0128 20:45:00.210032 4746 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"collect-profiles-dockercfg-kzf4t" Jan 28 20:45:00 crc kubenswrapper[4746]: I0128 20:45:00.215039 4746 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29493885-pgrr9"] Jan 28 20:45:00 crc kubenswrapper[4746]: I0128 20:45:00.243701 4746 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-h62vp\" (UniqueName: \"kubernetes.io/projected/189cf38e-34c5-4cd9-ad46-db8bc26b458e-kube-api-access-h62vp\") pod \"collect-profiles-29493885-pgrr9\" (UID: \"189cf38e-34c5-4cd9-ad46-db8bc26b458e\") " 
pod="openshift-operator-lifecycle-manager/collect-profiles-29493885-pgrr9" Jan 28 20:45:00 crc kubenswrapper[4746]: I0128 20:45:00.243798 4746 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/189cf38e-34c5-4cd9-ad46-db8bc26b458e-config-volume\") pod \"collect-profiles-29493885-pgrr9\" (UID: \"189cf38e-34c5-4cd9-ad46-db8bc26b458e\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29493885-pgrr9" Jan 28 20:45:00 crc kubenswrapper[4746]: I0128 20:45:00.243835 4746 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/189cf38e-34c5-4cd9-ad46-db8bc26b458e-secret-volume\") pod \"collect-profiles-29493885-pgrr9\" (UID: \"189cf38e-34c5-4cd9-ad46-db8bc26b458e\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29493885-pgrr9" Jan 28 20:45:00 crc kubenswrapper[4746]: I0128 20:45:00.345747 4746 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-h62vp\" (UniqueName: \"kubernetes.io/projected/189cf38e-34c5-4cd9-ad46-db8bc26b458e-kube-api-access-h62vp\") pod \"collect-profiles-29493885-pgrr9\" (UID: \"189cf38e-34c5-4cd9-ad46-db8bc26b458e\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29493885-pgrr9" Jan 28 20:45:00 crc kubenswrapper[4746]: I0128 20:45:00.345825 4746 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/189cf38e-34c5-4cd9-ad46-db8bc26b458e-config-volume\") pod \"collect-profiles-29493885-pgrr9\" (UID: \"189cf38e-34c5-4cd9-ad46-db8bc26b458e\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29493885-pgrr9" Jan 28 20:45:00 crc kubenswrapper[4746]: I0128 20:45:00.345857 4746 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"secret-volume\" (UniqueName: 
\"kubernetes.io/secret/189cf38e-34c5-4cd9-ad46-db8bc26b458e-secret-volume\") pod \"collect-profiles-29493885-pgrr9\" (UID: \"189cf38e-34c5-4cd9-ad46-db8bc26b458e\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29493885-pgrr9" Jan 28 20:45:00 crc kubenswrapper[4746]: I0128 20:45:00.346955 4746 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/189cf38e-34c5-4cd9-ad46-db8bc26b458e-config-volume\") pod \"collect-profiles-29493885-pgrr9\" (UID: \"189cf38e-34c5-4cd9-ad46-db8bc26b458e\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29493885-pgrr9" Jan 28 20:45:00 crc kubenswrapper[4746]: I0128 20:45:00.352853 4746 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/189cf38e-34c5-4cd9-ad46-db8bc26b458e-secret-volume\") pod \"collect-profiles-29493885-pgrr9\" (UID: \"189cf38e-34c5-4cd9-ad46-db8bc26b458e\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29493885-pgrr9" Jan 28 20:45:00 crc kubenswrapper[4746]: I0128 20:45:00.364039 4746 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-h62vp\" (UniqueName: \"kubernetes.io/projected/189cf38e-34c5-4cd9-ad46-db8bc26b458e-kube-api-access-h62vp\") pod \"collect-profiles-29493885-pgrr9\" (UID: \"189cf38e-34c5-4cd9-ad46-db8bc26b458e\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29493885-pgrr9" Jan 28 20:45:00 crc kubenswrapper[4746]: I0128 20:45:00.520521 4746 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29493885-pgrr9" Jan 28 20:45:00 crc kubenswrapper[4746]: I0128 20:45:00.526709 4746 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-controller-manager/controller-manager-699b895958-xsn9c"] Jan 28 20:45:00 crc kubenswrapper[4746]: I0128 20:45:00.528219 4746 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-controller-manager/controller-manager-699b895958-xsn9c" Jan 28 20:45:00 crc kubenswrapper[4746]: I0128 20:45:00.531715 4746 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-controller-manager"/"serving-cert" Jan 28 20:45:00 crc kubenswrapper[4746]: I0128 20:45:00.532151 4746 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"config" Jan 28 20:45:00 crc kubenswrapper[4746]: I0128 20:45:00.532754 4746 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"openshift-service-ca.crt" Jan 28 20:45:00 crc kubenswrapper[4746]: I0128 20:45:00.533608 4746 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-controller-manager"/"openshift-controller-manager-sa-dockercfg-msq4c" Jan 28 20:45:00 crc kubenswrapper[4746]: I0128 20:45:00.533840 4746 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"kube-root-ca.crt" Jan 28 20:45:00 crc kubenswrapper[4746]: I0128 20:45:00.538137 4746 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"client-ca" Jan 28 20:45:00 crc kubenswrapper[4746]: I0128 20:45:00.540035 4746 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"openshift-global-ca" Jan 28 20:45:00 crc kubenswrapper[4746]: I0128 20:45:00.541194 4746 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-route-controller-manager/route-controller-manager-55969676b6-hhpzt"] Jan 28 
20:45:00 crc kubenswrapper[4746]: I0128 20:45:00.542398 4746 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-55969676b6-hhpzt" Jan 28 20:45:00 crc kubenswrapper[4746]: I0128 20:45:00.544147 4746 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"openshift-service-ca.crt" Jan 28 20:45:00 crc kubenswrapper[4746]: I0128 20:45:00.546041 4746 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-route-controller-manager"/"route-controller-manager-sa-dockercfg-h2zr2" Jan 28 20:45:00 crc kubenswrapper[4746]: I0128 20:45:00.546326 4746 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"kube-root-ca.crt" Jan 28 20:45:00 crc kubenswrapper[4746]: I0128 20:45:00.547166 4746 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-route-controller-manager"/"serving-cert" Jan 28 20:45:00 crc kubenswrapper[4746]: I0128 20:45:00.547314 4746 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"client-ca" Jan 28 20:45:00 crc kubenswrapper[4746]: I0128 20:45:00.547437 4746 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"config" Jan 28 20:45:00 crc kubenswrapper[4746]: I0128 20:45:00.548208 4746 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-55969676b6-hhpzt"] Jan 28 20:45:00 crc kubenswrapper[4746]: I0128 20:45:00.548373 4746 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/10cb3a36-c59d-417d-9811-563e33354461-serving-cert\") pod \"controller-manager-699b895958-xsn9c\" (UID: \"10cb3a36-c59d-417d-9811-563e33354461\") " pod="openshift-controller-manager/controller-manager-699b895958-xsn9c" Jan 28 
20:45:00 crc kubenswrapper[4746]: I0128 20:45:00.548459 4746 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/10cb3a36-c59d-417d-9811-563e33354461-proxy-ca-bundles\") pod \"controller-manager-699b895958-xsn9c\" (UID: \"10cb3a36-c59d-417d-9811-563e33354461\") " pod="openshift-controller-manager/controller-manager-699b895958-xsn9c" Jan 28 20:45:00 crc kubenswrapper[4746]: I0128 20:45:00.548509 4746 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/10cb3a36-c59d-417d-9811-563e33354461-client-ca\") pod \"controller-manager-699b895958-xsn9c\" (UID: \"10cb3a36-c59d-417d-9811-563e33354461\") " pod="openshift-controller-manager/controller-manager-699b895958-xsn9c" Jan 28 20:45:00 crc kubenswrapper[4746]: I0128 20:45:00.548556 4746 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/10cb3a36-c59d-417d-9811-563e33354461-config\") pod \"controller-manager-699b895958-xsn9c\" (UID: \"10cb3a36-c59d-417d-9811-563e33354461\") " pod="openshift-controller-manager/controller-manager-699b895958-xsn9c" Jan 28 20:45:00 crc kubenswrapper[4746]: I0128 20:45:00.548624 4746 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-crkrn\" (UniqueName: \"kubernetes.io/projected/10cb3a36-c59d-417d-9811-563e33354461-kube-api-access-crkrn\") pod \"controller-manager-699b895958-xsn9c\" (UID: \"10cb3a36-c59d-417d-9811-563e33354461\") " pod="openshift-controller-manager/controller-manager-699b895958-xsn9c" Jan 28 20:45:00 crc kubenswrapper[4746]: I0128 20:45:00.555489 4746 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-controller-manager/controller-manager-699b895958-xsn9c"] Jan 28 20:45:00 crc kubenswrapper[4746]: I0128 
20:45:00.650203 4746 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/10cb3a36-c59d-417d-9811-563e33354461-serving-cert\") pod \"controller-manager-699b895958-xsn9c\" (UID: \"10cb3a36-c59d-417d-9811-563e33354461\") " pod="openshift-controller-manager/controller-manager-699b895958-xsn9c" Jan 28 20:45:00 crc kubenswrapper[4746]: I0128 20:45:00.650739 4746 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/4a0e9a24-0499-4392-8326-d54cb4cf91ec-serving-cert\") pod \"route-controller-manager-55969676b6-hhpzt\" (UID: \"4a0e9a24-0499-4392-8326-d54cb4cf91ec\") " pod="openshift-route-controller-manager/route-controller-manager-55969676b6-hhpzt" Jan 28 20:45:00 crc kubenswrapper[4746]: I0128 20:45:00.650784 4746 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/10cb3a36-c59d-417d-9811-563e33354461-proxy-ca-bundles\") pod \"controller-manager-699b895958-xsn9c\" (UID: \"10cb3a36-c59d-417d-9811-563e33354461\") " pod="openshift-controller-manager/controller-manager-699b895958-xsn9c" Jan 28 20:45:00 crc kubenswrapper[4746]: I0128 20:45:00.650820 4746 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/10cb3a36-c59d-417d-9811-563e33354461-client-ca\") pod \"controller-manager-699b895958-xsn9c\" (UID: \"10cb3a36-c59d-417d-9811-563e33354461\") " pod="openshift-controller-manager/controller-manager-699b895958-xsn9c" Jan 28 20:45:00 crc kubenswrapper[4746]: I0128 20:45:00.650849 4746 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/4a0e9a24-0499-4392-8326-d54cb4cf91ec-client-ca\") pod \"route-controller-manager-55969676b6-hhpzt\" (UID: 
\"4a0e9a24-0499-4392-8326-d54cb4cf91ec\") " pod="openshift-route-controller-manager/route-controller-manager-55969676b6-hhpzt" Jan 28 20:45:00 crc kubenswrapper[4746]: I0128 20:45:00.650897 4746 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/10cb3a36-c59d-417d-9811-563e33354461-config\") pod \"controller-manager-699b895958-xsn9c\" (UID: \"10cb3a36-c59d-417d-9811-563e33354461\") " pod="openshift-controller-manager/controller-manager-699b895958-xsn9c" Jan 28 20:45:00 crc kubenswrapper[4746]: I0128 20:45:00.650933 4746 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-crkrn\" (UniqueName: \"kubernetes.io/projected/10cb3a36-c59d-417d-9811-563e33354461-kube-api-access-crkrn\") pod \"controller-manager-699b895958-xsn9c\" (UID: \"10cb3a36-c59d-417d-9811-563e33354461\") " pod="openshift-controller-manager/controller-manager-699b895958-xsn9c" Jan 28 20:45:00 crc kubenswrapper[4746]: I0128 20:45:00.650970 4746 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/4a0e9a24-0499-4392-8326-d54cb4cf91ec-config\") pod \"route-controller-manager-55969676b6-hhpzt\" (UID: \"4a0e9a24-0499-4392-8326-d54cb4cf91ec\") " pod="openshift-route-controller-manager/route-controller-manager-55969676b6-hhpzt" Jan 28 20:45:00 crc kubenswrapper[4746]: I0128 20:45:00.651009 4746 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-hqk9c\" (UniqueName: \"kubernetes.io/projected/4a0e9a24-0499-4392-8326-d54cb4cf91ec-kube-api-access-hqk9c\") pod \"route-controller-manager-55969676b6-hhpzt\" (UID: \"4a0e9a24-0499-4392-8326-d54cb4cf91ec\") " pod="openshift-route-controller-manager/route-controller-manager-55969676b6-hhpzt" Jan 28 20:45:00 crc kubenswrapper[4746]: I0128 20:45:00.652980 4746 operation_generator.go:637] "MountVolume.SetUp 
succeeded for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/10cb3a36-c59d-417d-9811-563e33354461-proxy-ca-bundles\") pod \"controller-manager-699b895958-xsn9c\" (UID: \"10cb3a36-c59d-417d-9811-563e33354461\") " pod="openshift-controller-manager/controller-manager-699b895958-xsn9c" Jan 28 20:45:00 crc kubenswrapper[4746]: I0128 20:45:00.653300 4746 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/10cb3a36-c59d-417d-9811-563e33354461-config\") pod \"controller-manager-699b895958-xsn9c\" (UID: \"10cb3a36-c59d-417d-9811-563e33354461\") " pod="openshift-controller-manager/controller-manager-699b895958-xsn9c" Jan 28 20:45:00 crc kubenswrapper[4746]: I0128 20:45:00.653924 4746 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/10cb3a36-c59d-417d-9811-563e33354461-client-ca\") pod \"controller-manager-699b895958-xsn9c\" (UID: \"10cb3a36-c59d-417d-9811-563e33354461\") " pod="openshift-controller-manager/controller-manager-699b895958-xsn9c" Jan 28 20:45:00 crc kubenswrapper[4746]: I0128 20:45:00.662827 4746 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/10cb3a36-c59d-417d-9811-563e33354461-serving-cert\") pod \"controller-manager-699b895958-xsn9c\" (UID: \"10cb3a36-c59d-417d-9811-563e33354461\") " pod="openshift-controller-manager/controller-manager-699b895958-xsn9c" Jan 28 20:45:00 crc kubenswrapper[4746]: I0128 20:45:00.670764 4746 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-crkrn\" (UniqueName: \"kubernetes.io/projected/10cb3a36-c59d-417d-9811-563e33354461-kube-api-access-crkrn\") pod \"controller-manager-699b895958-xsn9c\" (UID: \"10cb3a36-c59d-417d-9811-563e33354461\") " pod="openshift-controller-manager/controller-manager-699b895958-xsn9c" Jan 28 20:45:00 crc kubenswrapper[4746]: I0128 
20:45:00.745529 4746 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29493885-pgrr9"] Jan 28 20:45:00 crc kubenswrapper[4746]: I0128 20:45:00.752333 4746 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/4a0e9a24-0499-4392-8326-d54cb4cf91ec-client-ca\") pod \"route-controller-manager-55969676b6-hhpzt\" (UID: \"4a0e9a24-0499-4392-8326-d54cb4cf91ec\") " pod="openshift-route-controller-manager/route-controller-manager-55969676b6-hhpzt" Jan 28 20:45:00 crc kubenswrapper[4746]: I0128 20:45:00.752424 4746 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/4a0e9a24-0499-4392-8326-d54cb4cf91ec-config\") pod \"route-controller-manager-55969676b6-hhpzt\" (UID: \"4a0e9a24-0499-4392-8326-d54cb4cf91ec\") " pod="openshift-route-controller-manager/route-controller-manager-55969676b6-hhpzt" Jan 28 20:45:00 crc kubenswrapper[4746]: I0128 20:45:00.752453 4746 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-hqk9c\" (UniqueName: \"kubernetes.io/projected/4a0e9a24-0499-4392-8326-d54cb4cf91ec-kube-api-access-hqk9c\") pod \"route-controller-manager-55969676b6-hhpzt\" (UID: \"4a0e9a24-0499-4392-8326-d54cb4cf91ec\") " pod="openshift-route-controller-manager/route-controller-manager-55969676b6-hhpzt" Jan 28 20:45:00 crc kubenswrapper[4746]: I0128 20:45:00.752490 4746 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/4a0e9a24-0499-4392-8326-d54cb4cf91ec-serving-cert\") pod \"route-controller-manager-55969676b6-hhpzt\" (UID: \"4a0e9a24-0499-4392-8326-d54cb4cf91ec\") " pod="openshift-route-controller-manager/route-controller-manager-55969676b6-hhpzt" Jan 28 20:45:00 crc kubenswrapper[4746]: I0128 20:45:00.753357 4746 operation_generator.go:637] 
"MountVolume.SetUp succeeded for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/4a0e9a24-0499-4392-8326-d54cb4cf91ec-client-ca\") pod \"route-controller-manager-55969676b6-hhpzt\" (UID: \"4a0e9a24-0499-4392-8326-d54cb4cf91ec\") " pod="openshift-route-controller-manager/route-controller-manager-55969676b6-hhpzt" Jan 28 20:45:00 crc kubenswrapper[4746]: I0128 20:45:00.754254 4746 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/4a0e9a24-0499-4392-8326-d54cb4cf91ec-config\") pod \"route-controller-manager-55969676b6-hhpzt\" (UID: \"4a0e9a24-0499-4392-8326-d54cb4cf91ec\") " pod="openshift-route-controller-manager/route-controller-manager-55969676b6-hhpzt" Jan 28 20:45:00 crc kubenswrapper[4746]: I0128 20:45:00.757535 4746 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/4a0e9a24-0499-4392-8326-d54cb4cf91ec-serving-cert\") pod \"route-controller-manager-55969676b6-hhpzt\" (UID: \"4a0e9a24-0499-4392-8326-d54cb4cf91ec\") " pod="openshift-route-controller-manager/route-controller-manager-55969676b6-hhpzt" Jan 28 20:45:00 crc kubenswrapper[4746]: I0128 20:45:00.770502 4746 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-hqk9c\" (UniqueName: \"kubernetes.io/projected/4a0e9a24-0499-4392-8326-d54cb4cf91ec-kube-api-access-hqk9c\") pod \"route-controller-manager-55969676b6-hhpzt\" (UID: \"4a0e9a24-0499-4392-8326-d54cb4cf91ec\") " pod="openshift-route-controller-manager/route-controller-manager-55969676b6-hhpzt" Jan 28 20:45:00 crc kubenswrapper[4746]: I0128 20:45:00.808717 4746 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29493885-pgrr9" event={"ID":"189cf38e-34c5-4cd9-ad46-db8bc26b458e","Type":"ContainerStarted","Data":"1891265f86c306d32998c393fc9baeb1ac0c5c292ebd02bcc584ba69bdee285e"} Jan 28 20:45:00 crc kubenswrapper[4746]: 
I0128 20:45:00.842712 4746 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="313e5ac4-1f33-4961-a247-3a72fe8138d0" path="/var/lib/kubelet/pods/313e5ac4-1f33-4961-a247-3a72fe8138d0/volumes" Jan 28 20:45:00 crc kubenswrapper[4746]: I0128 20:45:00.843612 4746 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="ec78c48e-432a-4d99-9010-65ea9dc8642f" path="/var/lib/kubelet/pods/ec78c48e-432a-4d99-9010-65ea9dc8642f/volumes" Jan 28 20:45:00 crc kubenswrapper[4746]: I0128 20:45:00.897210 4746 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-controller-manager/controller-manager-699b895958-xsn9c" Jan 28 20:45:00 crc kubenswrapper[4746]: I0128 20:45:00.905992 4746 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-55969676b6-hhpzt" Jan 28 20:45:01 crc kubenswrapper[4746]: I0128 20:45:01.187687 4746 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-controller-manager/controller-manager-699b895958-xsn9c"] Jan 28 20:45:01 crc kubenswrapper[4746]: W0128 20:45:01.204755 4746 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod10cb3a36_c59d_417d_9811_563e33354461.slice/crio-8eb1d64fa3a641f04279cd84ba12ca85581e29038366ad29736e4c7e299d077c WatchSource:0}: Error finding container 8eb1d64fa3a641f04279cd84ba12ca85581e29038366ad29736e4c7e299d077c: Status 404 returned error can't find the container with id 8eb1d64fa3a641f04279cd84ba12ca85581e29038366ad29736e4c7e299d077c Jan 28 20:45:01 crc kubenswrapper[4746]: I0128 20:45:01.332456 4746 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-55969676b6-hhpzt"] Jan 28 20:45:01 crc kubenswrapper[4746]: W0128 20:45:01.341355 4746 manager.go:1169] Failed to process watch event {EventType:0 
Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod4a0e9a24_0499_4392_8326_d54cb4cf91ec.slice/crio-28d600a407f50f9b96626e74ed4520e598a90d0aa822bdd55d650b232c15859a WatchSource:0}: Error finding container 28d600a407f50f9b96626e74ed4520e598a90d0aa822bdd55d650b232c15859a: Status 404 returned error can't find the container with id 28d600a407f50f9b96626e74ed4520e598a90d0aa822bdd55d650b232c15859a Jan 28 20:45:01 crc kubenswrapper[4746]: I0128 20:45:01.818919 4746 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-route-controller-manager/route-controller-manager-55969676b6-hhpzt" event={"ID":"4a0e9a24-0499-4392-8326-d54cb4cf91ec","Type":"ContainerStarted","Data":"1216c22cdf9c519323385619d45e432f6d5d14989a7004764bd46d9972921eda"} Jan 28 20:45:01 crc kubenswrapper[4746]: I0128 20:45:01.818996 4746 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-route-controller-manager/route-controller-manager-55969676b6-hhpzt" event={"ID":"4a0e9a24-0499-4392-8326-d54cb4cf91ec","Type":"ContainerStarted","Data":"28d600a407f50f9b96626e74ed4520e598a90d0aa822bdd55d650b232c15859a"} Jan 28 20:45:01 crc kubenswrapper[4746]: I0128 20:45:01.819349 4746 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-route-controller-manager/route-controller-manager-55969676b6-hhpzt" Jan 28 20:45:01 crc kubenswrapper[4746]: I0128 20:45:01.820656 4746 generic.go:334] "Generic (PLEG): container finished" podID="189cf38e-34c5-4cd9-ad46-db8bc26b458e" containerID="53d24f4b0daeba9c3cedfb4178cddf963faa1db0f701a2bdaf0499613c7e320c" exitCode=0 Jan 28 20:45:01 crc kubenswrapper[4746]: I0128 20:45:01.820717 4746 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29493885-pgrr9" event={"ID":"189cf38e-34c5-4cd9-ad46-db8bc26b458e","Type":"ContainerDied","Data":"53d24f4b0daeba9c3cedfb4178cddf963faa1db0f701a2bdaf0499613c7e320c"} Jan 28 20:45:01 crc kubenswrapper[4746]: I0128 20:45:01.822234 4746 
kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager/controller-manager-699b895958-xsn9c" event={"ID":"10cb3a36-c59d-417d-9811-563e33354461","Type":"ContainerStarted","Data":"66c46d9447da074fd8a647063cb647d770a7abca50a129ce18f13422624f47fd"} Jan 28 20:45:01 crc kubenswrapper[4746]: I0128 20:45:01.822262 4746 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager/controller-manager-699b895958-xsn9c" event={"ID":"10cb3a36-c59d-417d-9811-563e33354461","Type":"ContainerStarted","Data":"8eb1d64fa3a641f04279cd84ba12ca85581e29038366ad29736e4c7e299d077c"} Jan 28 20:45:01 crc kubenswrapper[4746]: I0128 20:45:01.823997 4746 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-controller-manager/controller-manager-699b895958-xsn9c" Jan 28 20:45:01 crc kubenswrapper[4746]: I0128 20:45:01.827117 4746 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-controller-manager/controller-manager-699b895958-xsn9c" Jan 28 20:45:01 crc kubenswrapper[4746]: I0128 20:45:01.868256 4746 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-route-controller-manager/route-controller-manager-55969676b6-hhpzt" podStartSLOduration=3.868232114 podStartE2EDuration="3.868232114s" podCreationTimestamp="2026-01-28 20:44:58 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-28 20:45:01.843727276 +0000 UTC m=+329.799913630" watchObservedRunningTime="2026-01-28 20:45:01.868232114 +0000 UTC m=+329.824418468" Jan 28 20:45:01 crc kubenswrapper[4746]: I0128 20:45:01.890253 4746 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-controller-manager/controller-manager-699b895958-xsn9c" podStartSLOduration=3.890230652 podStartE2EDuration="3.890230652s" podCreationTimestamp="2026-01-28 20:44:58 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 
+0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-28 20:45:01.86594583 +0000 UTC m=+329.822132184" watchObservedRunningTime="2026-01-28 20:45:01.890230652 +0000 UTC m=+329.846417006" Jan 28 20:45:02 crc kubenswrapper[4746]: I0128 20:45:02.051839 4746 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-route-controller-manager/route-controller-manager-55969676b6-hhpzt" Jan 28 20:45:03 crc kubenswrapper[4746]: I0128 20:45:03.220562 4746 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29493885-pgrr9" Jan 28 20:45:03 crc kubenswrapper[4746]: I0128 20:45:03.289673 4746 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/189cf38e-34c5-4cd9-ad46-db8bc26b458e-config-volume\") pod \"189cf38e-34c5-4cd9-ad46-db8bc26b458e\" (UID: \"189cf38e-34c5-4cd9-ad46-db8bc26b458e\") " Jan 28 20:45:03 crc kubenswrapper[4746]: I0128 20:45:03.289725 4746 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-h62vp\" (UniqueName: \"kubernetes.io/projected/189cf38e-34c5-4cd9-ad46-db8bc26b458e-kube-api-access-h62vp\") pod \"189cf38e-34c5-4cd9-ad46-db8bc26b458e\" (UID: \"189cf38e-34c5-4cd9-ad46-db8bc26b458e\") " Jan 28 20:45:03 crc kubenswrapper[4746]: I0128 20:45:03.289794 4746 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/189cf38e-34c5-4cd9-ad46-db8bc26b458e-secret-volume\") pod \"189cf38e-34c5-4cd9-ad46-db8bc26b458e\" (UID: \"189cf38e-34c5-4cd9-ad46-db8bc26b458e\") " Jan 28 20:45:03 crc kubenswrapper[4746]: I0128 20:45:03.290676 4746 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/189cf38e-34c5-4cd9-ad46-db8bc26b458e-config-volume" (OuterVolumeSpecName: "config-volume") pod 
"189cf38e-34c5-4cd9-ad46-db8bc26b458e" (UID: "189cf38e-34c5-4cd9-ad46-db8bc26b458e"). InnerVolumeSpecName "config-volume". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 28 20:45:03 crc kubenswrapper[4746]: I0128 20:45:03.295998 4746 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/189cf38e-34c5-4cd9-ad46-db8bc26b458e-kube-api-access-h62vp" (OuterVolumeSpecName: "kube-api-access-h62vp") pod "189cf38e-34c5-4cd9-ad46-db8bc26b458e" (UID: "189cf38e-34c5-4cd9-ad46-db8bc26b458e"). InnerVolumeSpecName "kube-api-access-h62vp". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 28 20:45:03 crc kubenswrapper[4746]: I0128 20:45:03.296055 4746 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/189cf38e-34c5-4cd9-ad46-db8bc26b458e-secret-volume" (OuterVolumeSpecName: "secret-volume") pod "189cf38e-34c5-4cd9-ad46-db8bc26b458e" (UID: "189cf38e-34c5-4cd9-ad46-db8bc26b458e"). InnerVolumeSpecName "secret-volume". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 28 20:45:03 crc kubenswrapper[4746]: I0128 20:45:03.390926 4746 reconciler_common.go:293] "Volume detached for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/189cf38e-34c5-4cd9-ad46-db8bc26b458e-secret-volume\") on node \"crc\" DevicePath \"\"" Jan 28 20:45:03 crc kubenswrapper[4746]: I0128 20:45:03.390970 4746 reconciler_common.go:293] "Volume detached for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/189cf38e-34c5-4cd9-ad46-db8bc26b458e-config-volume\") on node \"crc\" DevicePath \"\"" Jan 28 20:45:03 crc kubenswrapper[4746]: I0128 20:45:03.390986 4746 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-h62vp\" (UniqueName: \"kubernetes.io/projected/189cf38e-34c5-4cd9-ad46-db8bc26b458e-kube-api-access-h62vp\") on node \"crc\" DevicePath \"\"" Jan 28 20:45:03 crc kubenswrapper[4746]: I0128 20:45:03.839110 4746 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29493885-pgrr9" event={"ID":"189cf38e-34c5-4cd9-ad46-db8bc26b458e","Type":"ContainerDied","Data":"1891265f86c306d32998c393fc9baeb1ac0c5c292ebd02bcc584ba69bdee285e"} Jan 28 20:45:03 crc kubenswrapper[4746]: I0128 20:45:03.839480 4746 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="1891265f86c306d32998c393fc9baeb1ac0c5c292ebd02bcc584ba69bdee285e" Jan 28 20:45:03 crc kubenswrapper[4746]: I0128 20:45:03.839374 4746 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29493885-pgrr9" Jan 28 20:45:04 crc kubenswrapper[4746]: I0128 20:45:04.780734 4746 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-image-registry/image-registry-66df7c8f76-mxrrr"] Jan 28 20:45:04 crc kubenswrapper[4746]: E0128 20:45:04.783223 4746 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="189cf38e-34c5-4cd9-ad46-db8bc26b458e" containerName="collect-profiles" Jan 28 20:45:04 crc kubenswrapper[4746]: I0128 20:45:04.783243 4746 state_mem.go:107] "Deleted CPUSet assignment" podUID="189cf38e-34c5-4cd9-ad46-db8bc26b458e" containerName="collect-profiles" Jan 28 20:45:04 crc kubenswrapper[4746]: I0128 20:45:04.783408 4746 memory_manager.go:354] "RemoveStaleState removing state" podUID="189cf38e-34c5-4cd9-ad46-db8bc26b458e" containerName="collect-profiles" Jan 28 20:45:04 crc kubenswrapper[4746]: I0128 20:45:04.783986 4746 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-image-registry/image-registry-66df7c8f76-mxrrr" Jan 28 20:45:04 crc kubenswrapper[4746]: I0128 20:45:04.798598 4746 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-image-registry/image-registry-66df7c8f76-mxrrr"] Jan 28 20:45:04 crc kubenswrapper[4746]: I0128 20:45:04.945907 4746 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/4fc3ec9e-c1a7-42fc-8d07-4ea53d614c31-ca-trust-extracted\") pod \"image-registry-66df7c8f76-mxrrr\" (UID: \"4fc3ec9e-c1a7-42fc-8d07-4ea53d614c31\") " pod="openshift-image-registry/image-registry-66df7c8f76-mxrrr" Jan 28 20:45:04 crc kubenswrapper[4746]: I0128 20:45:04.946157 4746 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-f5lzx\" (UniqueName: \"kubernetes.io/projected/4fc3ec9e-c1a7-42fc-8d07-4ea53d614c31-kube-api-access-f5lzx\") pod 
\"image-registry-66df7c8f76-mxrrr\" (UID: \"4fc3ec9e-c1a7-42fc-8d07-4ea53d614c31\") " pod="openshift-image-registry/image-registry-66df7c8f76-mxrrr" Jan 28 20:45:04 crc kubenswrapper[4746]: I0128 20:45:04.946298 4746 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/4fc3ec9e-c1a7-42fc-8d07-4ea53d614c31-installation-pull-secrets\") pod \"image-registry-66df7c8f76-mxrrr\" (UID: \"4fc3ec9e-c1a7-42fc-8d07-4ea53d614c31\") " pod="openshift-image-registry/image-registry-66df7c8f76-mxrrr" Jan 28 20:45:04 crc kubenswrapper[4746]: I0128 20:45:04.946368 4746 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/4fc3ec9e-c1a7-42fc-8d07-4ea53d614c31-registry-certificates\") pod \"image-registry-66df7c8f76-mxrrr\" (UID: \"4fc3ec9e-c1a7-42fc-8d07-4ea53d614c31\") " pod="openshift-image-registry/image-registry-66df7c8f76-mxrrr" Jan 28 20:45:04 crc kubenswrapper[4746]: I0128 20:45:04.946489 4746 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-66df7c8f76-mxrrr\" (UID: \"4fc3ec9e-c1a7-42fc-8d07-4ea53d614c31\") " pod="openshift-image-registry/image-registry-66df7c8f76-mxrrr" Jan 28 20:45:04 crc kubenswrapper[4746]: I0128 20:45:04.946609 4746 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/4fc3ec9e-c1a7-42fc-8d07-4ea53d614c31-trusted-ca\") pod \"image-registry-66df7c8f76-mxrrr\" (UID: \"4fc3ec9e-c1a7-42fc-8d07-4ea53d614c31\") " pod="openshift-image-registry/image-registry-66df7c8f76-mxrrr" Jan 28 20:45:04 crc kubenswrapper[4746]: I0128 20:45:04.946640 4746 
reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/4fc3ec9e-c1a7-42fc-8d07-4ea53d614c31-bound-sa-token\") pod \"image-registry-66df7c8f76-mxrrr\" (UID: \"4fc3ec9e-c1a7-42fc-8d07-4ea53d614c31\") " pod="openshift-image-registry/image-registry-66df7c8f76-mxrrr" Jan 28 20:45:04 crc kubenswrapper[4746]: I0128 20:45:04.946694 4746 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/4fc3ec9e-c1a7-42fc-8d07-4ea53d614c31-registry-tls\") pod \"image-registry-66df7c8f76-mxrrr\" (UID: \"4fc3ec9e-c1a7-42fc-8d07-4ea53d614c31\") " pod="openshift-image-registry/image-registry-66df7c8f76-mxrrr" Jan 28 20:45:04 crc kubenswrapper[4746]: I0128 20:45:04.974322 4746 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-66df7c8f76-mxrrr\" (UID: \"4fc3ec9e-c1a7-42fc-8d07-4ea53d614c31\") " pod="openshift-image-registry/image-registry-66df7c8f76-mxrrr" Jan 28 20:45:05 crc kubenswrapper[4746]: I0128 20:45:05.048137 4746 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/4fc3ec9e-c1a7-42fc-8d07-4ea53d614c31-installation-pull-secrets\") pod \"image-registry-66df7c8f76-mxrrr\" (UID: \"4fc3ec9e-c1a7-42fc-8d07-4ea53d614c31\") " pod="openshift-image-registry/image-registry-66df7c8f76-mxrrr" Jan 28 20:45:05 crc kubenswrapper[4746]: I0128 20:45:05.048202 4746 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/4fc3ec9e-c1a7-42fc-8d07-4ea53d614c31-registry-certificates\") pod \"image-registry-66df7c8f76-mxrrr\" (UID: 
\"4fc3ec9e-c1a7-42fc-8d07-4ea53d614c31\") " pod="openshift-image-registry/image-registry-66df7c8f76-mxrrr" Jan 28 20:45:05 crc kubenswrapper[4746]: I0128 20:45:05.048262 4746 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/4fc3ec9e-c1a7-42fc-8d07-4ea53d614c31-trusted-ca\") pod \"image-registry-66df7c8f76-mxrrr\" (UID: \"4fc3ec9e-c1a7-42fc-8d07-4ea53d614c31\") " pod="openshift-image-registry/image-registry-66df7c8f76-mxrrr" Jan 28 20:45:05 crc kubenswrapper[4746]: I0128 20:45:05.048288 4746 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/4fc3ec9e-c1a7-42fc-8d07-4ea53d614c31-bound-sa-token\") pod \"image-registry-66df7c8f76-mxrrr\" (UID: \"4fc3ec9e-c1a7-42fc-8d07-4ea53d614c31\") " pod="openshift-image-registry/image-registry-66df7c8f76-mxrrr" Jan 28 20:45:05 crc kubenswrapper[4746]: I0128 20:45:05.048315 4746 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/4fc3ec9e-c1a7-42fc-8d07-4ea53d614c31-registry-tls\") pod \"image-registry-66df7c8f76-mxrrr\" (UID: \"4fc3ec9e-c1a7-42fc-8d07-4ea53d614c31\") " pod="openshift-image-registry/image-registry-66df7c8f76-mxrrr" Jan 28 20:45:05 crc kubenswrapper[4746]: I0128 20:45:05.048352 4746 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/4fc3ec9e-c1a7-42fc-8d07-4ea53d614c31-ca-trust-extracted\") pod \"image-registry-66df7c8f76-mxrrr\" (UID: \"4fc3ec9e-c1a7-42fc-8d07-4ea53d614c31\") " pod="openshift-image-registry/image-registry-66df7c8f76-mxrrr" Jan 28 20:45:05 crc kubenswrapper[4746]: I0128 20:45:05.048396 4746 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-f5lzx\" (UniqueName: 
\"kubernetes.io/projected/4fc3ec9e-c1a7-42fc-8d07-4ea53d614c31-kube-api-access-f5lzx\") pod \"image-registry-66df7c8f76-mxrrr\" (UID: \"4fc3ec9e-c1a7-42fc-8d07-4ea53d614c31\") " pod="openshift-image-registry/image-registry-66df7c8f76-mxrrr" Jan 28 20:45:05 crc kubenswrapper[4746]: I0128 20:45:05.049480 4746 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/4fc3ec9e-c1a7-42fc-8d07-4ea53d614c31-ca-trust-extracted\") pod \"image-registry-66df7c8f76-mxrrr\" (UID: \"4fc3ec9e-c1a7-42fc-8d07-4ea53d614c31\") " pod="openshift-image-registry/image-registry-66df7c8f76-mxrrr" Jan 28 20:45:05 crc kubenswrapper[4746]: I0128 20:45:05.050566 4746 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/4fc3ec9e-c1a7-42fc-8d07-4ea53d614c31-trusted-ca\") pod \"image-registry-66df7c8f76-mxrrr\" (UID: \"4fc3ec9e-c1a7-42fc-8d07-4ea53d614c31\") " pod="openshift-image-registry/image-registry-66df7c8f76-mxrrr" Jan 28 20:45:05 crc kubenswrapper[4746]: I0128 20:45:05.051188 4746 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/4fc3ec9e-c1a7-42fc-8d07-4ea53d614c31-registry-certificates\") pod \"image-registry-66df7c8f76-mxrrr\" (UID: \"4fc3ec9e-c1a7-42fc-8d07-4ea53d614c31\") " pod="openshift-image-registry/image-registry-66df7c8f76-mxrrr" Jan 28 20:45:05 crc kubenswrapper[4746]: I0128 20:45:05.053552 4746 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/4fc3ec9e-c1a7-42fc-8d07-4ea53d614c31-registry-tls\") pod \"image-registry-66df7c8f76-mxrrr\" (UID: \"4fc3ec9e-c1a7-42fc-8d07-4ea53d614c31\") " pod="openshift-image-registry/image-registry-66df7c8f76-mxrrr" Jan 28 20:45:05 crc kubenswrapper[4746]: I0128 20:45:05.053873 4746 operation_generator.go:637] "MountVolume.SetUp succeeded for volume 
\"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/4fc3ec9e-c1a7-42fc-8d07-4ea53d614c31-installation-pull-secrets\") pod \"image-registry-66df7c8f76-mxrrr\" (UID: \"4fc3ec9e-c1a7-42fc-8d07-4ea53d614c31\") " pod="openshift-image-registry/image-registry-66df7c8f76-mxrrr" Jan 28 20:45:05 crc kubenswrapper[4746]: I0128 20:45:05.066116 4746 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/4fc3ec9e-c1a7-42fc-8d07-4ea53d614c31-bound-sa-token\") pod \"image-registry-66df7c8f76-mxrrr\" (UID: \"4fc3ec9e-c1a7-42fc-8d07-4ea53d614c31\") " pod="openshift-image-registry/image-registry-66df7c8f76-mxrrr" Jan 28 20:45:05 crc kubenswrapper[4746]: I0128 20:45:05.066969 4746 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-f5lzx\" (UniqueName: \"kubernetes.io/projected/4fc3ec9e-c1a7-42fc-8d07-4ea53d614c31-kube-api-access-f5lzx\") pod \"image-registry-66df7c8f76-mxrrr\" (UID: \"4fc3ec9e-c1a7-42fc-8d07-4ea53d614c31\") " pod="openshift-image-registry/image-registry-66df7c8f76-mxrrr" Jan 28 20:45:05 crc kubenswrapper[4746]: I0128 20:45:05.145484 4746 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-image-registry/image-registry-66df7c8f76-mxrrr" Jan 28 20:45:05 crc kubenswrapper[4746]: I0128 20:45:05.569765 4746 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-image-registry/image-registry-66df7c8f76-mxrrr"] Jan 28 20:45:05 crc kubenswrapper[4746]: I0128 20:45:05.858015 4746 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/image-registry-66df7c8f76-mxrrr" event={"ID":"4fc3ec9e-c1a7-42fc-8d07-4ea53d614c31","Type":"ContainerStarted","Data":"3b935cbee1a59b7c7d022604c573b77b2a69da7e0b82651ab1101b0926b684a3"} Jan 28 20:45:05 crc kubenswrapper[4746]: I0128 20:45:05.858099 4746 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/image-registry-66df7c8f76-mxrrr" event={"ID":"4fc3ec9e-c1a7-42fc-8d07-4ea53d614c31","Type":"ContainerStarted","Data":"d6bb76eaf2d8cf4aeb6a9c062fce949d9ba86d4fc497ed4463eb94cefac35ccb"} Jan 28 20:45:05 crc kubenswrapper[4746]: I0128 20:45:05.858367 4746 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-image-registry/image-registry-66df7c8f76-mxrrr" Jan 28 20:45:05 crc kubenswrapper[4746]: I0128 20:45:05.887829 4746 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-image-registry/image-registry-66df7c8f76-mxrrr" podStartSLOduration=1.887810987 podStartE2EDuration="1.887810987s" podCreationTimestamp="2026-01-28 20:45:04 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-28 20:45:05.886610403 +0000 UTC m=+333.842796767" watchObservedRunningTime="2026-01-28 20:45:05.887810987 +0000 UTC m=+333.843997341" Jan 28 20:45:25 crc kubenswrapper[4746]: I0128 20:45:25.158447 4746 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-image-registry/image-registry-66df7c8f76-mxrrr" Jan 28 20:45:25 crc kubenswrapper[4746]: I0128 20:45:25.257497 4746 
kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-image-registry/image-registry-697d97f7c8-lwhk4"] Jan 28 20:45:33 crc kubenswrapper[4746]: I0128 20:45:33.879527 4746 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-85ttn"] Jan 28 20:45:33 crc kubenswrapper[4746]: I0128 20:45:33.880791 4746 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/certified-operators-85ttn" podUID="6c585264-9fea-4d40-910d-68a31c553f76" containerName="registry-server" containerID="cri-o://3209e2910865c45825fc585990fa2b0e07aa9b8fcd03af6eb5bddeea419f0bcf" gracePeriod=30 Jan 28 20:45:33 crc kubenswrapper[4746]: I0128 20:45:33.888032 4746 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-ghx5p"] Jan 28 20:45:33 crc kubenswrapper[4746]: I0128 20:45:33.888382 4746 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/community-operators-ghx5p" podUID="fa890224-0942-4671-a9d8-97b6f465b0df" containerName="registry-server" containerID="cri-o://a97c4a3cc18a3236a655a4072cc1a10ac809fb9a1cffaec7954c6199ad833df8" gracePeriod=30 Jan 28 20:45:33 crc kubenswrapper[4746]: I0128 20:45:33.906856 4746 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/marketplace-operator-79b997595-sqh2q"] Jan 28 20:45:33 crc kubenswrapper[4746]: I0128 20:45:33.907240 4746 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/marketplace-operator-79b997595-sqh2q" podUID="3edaca00-e1a6-4b56-9290-cad6311263ee" containerName="marketplace-operator" containerID="cri-o://3c9b761d21788e5d9286b6f107962e6f405210483aad309dac0ce9df1dcdffbb" gracePeriod=30 Jan 28 20:45:33 crc kubenswrapper[4746]: I0128 20:45:33.911309 4746 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-4t66g"] Jan 28 20:45:33 crc kubenswrapper[4746]: I0128 
20:45:33.911626 4746 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-marketplace-4t66g" podUID="18cbfd39-cf22-428c-ab2a-708082df0357" containerName="registry-server" containerID="cri-o://143e6acaec90736206f109f4678283fa5aa04d423876a786f92adf3ec3991ecf" gracePeriod=30 Jan 28 20:45:33 crc kubenswrapper[4746]: I0128 20:45:33.916214 4746 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-mlqqs"] Jan 28 20:45:33 crc kubenswrapper[4746]: I0128 20:45:33.916442 4746 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-operators-mlqqs" podUID="8b55bfa9-466f-44d9-8bc7-753bff9b7a7b" containerName="registry-server" containerID="cri-o://1b916d1fcc57694a8d2d4f44b18cff98f131a258a4a52f7839523c53e042b66e" gracePeriod=30 Jan 28 20:45:33 crc kubenswrapper[4746]: I0128 20:45:33.930602 4746 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/marketplace-operator-79b997595-bgtlc"] Jan 28 20:45:33 crc kubenswrapper[4746]: I0128 20:45:33.931446 4746 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/marketplace-operator-79b997595-bgtlc" Jan 28 20:45:33 crc kubenswrapper[4746]: I0128 20:45:33.949670 4746 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/marketplace-operator-79b997595-bgtlc"] Jan 28 20:45:34 crc kubenswrapper[4746]: I0128 20:45:34.068714 4746 generic.go:334] "Generic (PLEG): container finished" podID="fa890224-0942-4671-a9d8-97b6f465b0df" containerID="a97c4a3cc18a3236a655a4072cc1a10ac809fb9a1cffaec7954c6199ad833df8" exitCode=0 Jan 28 20:45:34 crc kubenswrapper[4746]: I0128 20:45:34.068796 4746 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-ghx5p" event={"ID":"fa890224-0942-4671-a9d8-97b6f465b0df","Type":"ContainerDied","Data":"a97c4a3cc18a3236a655a4072cc1a10ac809fb9a1cffaec7954c6199ad833df8"} Jan 28 20:45:34 crc kubenswrapper[4746]: I0128 20:45:34.075373 4746 generic.go:334] "Generic (PLEG): container finished" podID="3edaca00-e1a6-4b56-9290-cad6311263ee" containerID="3c9b761d21788e5d9286b6f107962e6f405210483aad309dac0ce9df1dcdffbb" exitCode=0 Jan 28 20:45:34 crc kubenswrapper[4746]: I0128 20:45:34.075475 4746 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/marketplace-operator-79b997595-sqh2q" event={"ID":"3edaca00-e1a6-4b56-9290-cad6311263ee","Type":"ContainerDied","Data":"3c9b761d21788e5d9286b6f107962e6f405210483aad309dac0ce9df1dcdffbb"} Jan 28 20:45:34 crc kubenswrapper[4746]: I0128 20:45:34.075548 4746 scope.go:117] "RemoveContainer" containerID="247a6dffe30c252f58a1ab345b064bf16a2db97efef470149722bc5c23ef722d" Jan 28 20:45:34 crc kubenswrapper[4746]: I0128 20:45:34.079098 4746 generic.go:334] "Generic (PLEG): container finished" podID="18cbfd39-cf22-428c-ab2a-708082df0357" containerID="143e6acaec90736206f109f4678283fa5aa04d423876a786f92adf3ec3991ecf" exitCode=0 Jan 28 20:45:34 crc kubenswrapper[4746]: I0128 20:45:34.079189 4746 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openshift-marketplace/redhat-marketplace-4t66g" event={"ID":"18cbfd39-cf22-428c-ab2a-708082df0357","Type":"ContainerDied","Data":"143e6acaec90736206f109f4678283fa5aa04d423876a786f92adf3ec3991ecf"} Jan 28 20:45:34 crc kubenswrapper[4746]: I0128 20:45:34.081342 4746 generic.go:334] "Generic (PLEG): container finished" podID="8b55bfa9-466f-44d9-8bc7-753bff9b7a7b" containerID="1b916d1fcc57694a8d2d4f44b18cff98f131a258a4a52f7839523c53e042b66e" exitCode=0 Jan 28 20:45:34 crc kubenswrapper[4746]: I0128 20:45:34.081373 4746 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-mlqqs" event={"ID":"8b55bfa9-466f-44d9-8bc7-753bff9b7a7b","Type":"ContainerDied","Data":"1b916d1fcc57694a8d2d4f44b18cff98f131a258a4a52f7839523c53e042b66e"} Jan 28 20:45:34 crc kubenswrapper[4746]: I0128 20:45:34.087364 4746 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"marketplace-operator-metrics\" (UniqueName: \"kubernetes.io/secret/6663df81-0144-46d7-90a2-a1ff5edb9474-marketplace-operator-metrics\") pod \"marketplace-operator-79b997595-bgtlc\" (UID: \"6663df81-0144-46d7-90a2-a1ff5edb9474\") " pod="openshift-marketplace/marketplace-operator-79b997595-bgtlc" Jan 28 20:45:34 crc kubenswrapper[4746]: I0128 20:45:34.087440 4746 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-8p52d\" (UniqueName: \"kubernetes.io/projected/6663df81-0144-46d7-90a2-a1ff5edb9474-kube-api-access-8p52d\") pod \"marketplace-operator-79b997595-bgtlc\" (UID: \"6663df81-0144-46d7-90a2-a1ff5edb9474\") " pod="openshift-marketplace/marketplace-operator-79b997595-bgtlc" Jan 28 20:45:34 crc kubenswrapper[4746]: I0128 20:45:34.087464 4746 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"marketplace-trusted-ca\" (UniqueName: \"kubernetes.io/configmap/6663df81-0144-46d7-90a2-a1ff5edb9474-marketplace-trusted-ca\") pod 
\"marketplace-operator-79b997595-bgtlc\" (UID: \"6663df81-0144-46d7-90a2-a1ff5edb9474\") " pod="openshift-marketplace/marketplace-operator-79b997595-bgtlc" Jan 28 20:45:34 crc kubenswrapper[4746]: I0128 20:45:34.189188 4746 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"marketplace-operator-metrics\" (UniqueName: \"kubernetes.io/secret/6663df81-0144-46d7-90a2-a1ff5edb9474-marketplace-operator-metrics\") pod \"marketplace-operator-79b997595-bgtlc\" (UID: \"6663df81-0144-46d7-90a2-a1ff5edb9474\") " pod="openshift-marketplace/marketplace-operator-79b997595-bgtlc" Jan 28 20:45:34 crc kubenswrapper[4746]: I0128 20:45:34.189240 4746 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-8p52d\" (UniqueName: \"kubernetes.io/projected/6663df81-0144-46d7-90a2-a1ff5edb9474-kube-api-access-8p52d\") pod \"marketplace-operator-79b997595-bgtlc\" (UID: \"6663df81-0144-46d7-90a2-a1ff5edb9474\") " pod="openshift-marketplace/marketplace-operator-79b997595-bgtlc" Jan 28 20:45:34 crc kubenswrapper[4746]: I0128 20:45:34.189268 4746 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"marketplace-trusted-ca\" (UniqueName: \"kubernetes.io/configmap/6663df81-0144-46d7-90a2-a1ff5edb9474-marketplace-trusted-ca\") pod \"marketplace-operator-79b997595-bgtlc\" (UID: \"6663df81-0144-46d7-90a2-a1ff5edb9474\") " pod="openshift-marketplace/marketplace-operator-79b997595-bgtlc" Jan 28 20:45:34 crc kubenswrapper[4746]: I0128 20:45:34.190506 4746 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"marketplace-trusted-ca\" (UniqueName: \"kubernetes.io/configmap/6663df81-0144-46d7-90a2-a1ff5edb9474-marketplace-trusted-ca\") pod \"marketplace-operator-79b997595-bgtlc\" (UID: \"6663df81-0144-46d7-90a2-a1ff5edb9474\") " pod="openshift-marketplace/marketplace-operator-79b997595-bgtlc" Jan 28 20:45:34 crc kubenswrapper[4746]: I0128 20:45:34.197003 4746 operation_generator.go:637] 
"MountVolume.SetUp succeeded for volume \"marketplace-operator-metrics\" (UniqueName: \"kubernetes.io/secret/6663df81-0144-46d7-90a2-a1ff5edb9474-marketplace-operator-metrics\") pod \"marketplace-operator-79b997595-bgtlc\" (UID: \"6663df81-0144-46d7-90a2-a1ff5edb9474\") " pod="openshift-marketplace/marketplace-operator-79b997595-bgtlc" Jan 28 20:45:34 crc kubenswrapper[4746]: I0128 20:45:34.207864 4746 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-8p52d\" (UniqueName: \"kubernetes.io/projected/6663df81-0144-46d7-90a2-a1ff5edb9474-kube-api-access-8p52d\") pod \"marketplace-operator-79b997595-bgtlc\" (UID: \"6663df81-0144-46d7-90a2-a1ff5edb9474\") " pod="openshift-marketplace/marketplace-operator-79b997595-bgtlc" Jan 28 20:45:34 crc kubenswrapper[4746]: I0128 20:45:34.255382 4746 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/marketplace-operator-79b997595-bgtlc" Jan 28 20:45:34 crc kubenswrapper[4746]: I0128 20:45:34.460228 4746 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-4t66g" Jan 28 20:45:34 crc kubenswrapper[4746]: I0128 20:45:34.522785 4746 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/marketplace-operator-79b997595-sqh2q" Jan 28 20:45:34 crc kubenswrapper[4746]: I0128 20:45:34.583681 4746 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-85ttn" Jan 28 20:45:34 crc kubenswrapper[4746]: I0128 20:45:34.594647 4746 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-6n9sg\" (UniqueName: \"kubernetes.io/projected/18cbfd39-cf22-428c-ab2a-708082df0357-kube-api-access-6n9sg\") pod \"18cbfd39-cf22-428c-ab2a-708082df0357\" (UID: \"18cbfd39-cf22-428c-ab2a-708082df0357\") " Jan 28 20:45:34 crc kubenswrapper[4746]: I0128 20:45:34.594770 4746 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/18cbfd39-cf22-428c-ab2a-708082df0357-catalog-content\") pod \"18cbfd39-cf22-428c-ab2a-708082df0357\" (UID: \"18cbfd39-cf22-428c-ab2a-708082df0357\") " Jan 28 20:45:34 crc kubenswrapper[4746]: I0128 20:45:34.594878 4746 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/18cbfd39-cf22-428c-ab2a-708082df0357-utilities\") pod \"18cbfd39-cf22-428c-ab2a-708082df0357\" (UID: \"18cbfd39-cf22-428c-ab2a-708082df0357\") " Jan 28 20:45:34 crc kubenswrapper[4746]: I0128 20:45:34.596222 4746 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/18cbfd39-cf22-428c-ab2a-708082df0357-utilities" (OuterVolumeSpecName: "utilities") pod "18cbfd39-cf22-428c-ab2a-708082df0357" (UID: "18cbfd39-cf22-428c-ab2a-708082df0357"). InnerVolumeSpecName "utilities". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 28 20:45:34 crc kubenswrapper[4746]: I0128 20:45:34.596542 4746 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/18cbfd39-cf22-428c-ab2a-708082df0357-utilities\") on node \"crc\" DevicePath \"\"" Jan 28 20:45:34 crc kubenswrapper[4746]: I0128 20:45:34.600640 4746 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/18cbfd39-cf22-428c-ab2a-708082df0357-kube-api-access-6n9sg" (OuterVolumeSpecName: "kube-api-access-6n9sg") pod "18cbfd39-cf22-428c-ab2a-708082df0357" (UID: "18cbfd39-cf22-428c-ab2a-708082df0357"). InnerVolumeSpecName "kube-api-access-6n9sg". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 28 20:45:34 crc kubenswrapper[4746]: I0128 20:45:34.639286 4746 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/18cbfd39-cf22-428c-ab2a-708082df0357-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "18cbfd39-cf22-428c-ab2a-708082df0357" (UID: "18cbfd39-cf22-428c-ab2a-708082df0357"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 28 20:45:34 crc kubenswrapper[4746]: E0128 20:45:34.651671 4746 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = NotFound desc = container is not created or running: checking if PID of 1b916d1fcc57694a8d2d4f44b18cff98f131a258a4a52f7839523c53e042b66e is running failed: container process not found" containerID="1b916d1fcc57694a8d2d4f44b18cff98f131a258a4a52f7839523c53e042b66e" cmd=["grpc_health_probe","-addr=:50051"] Jan 28 20:45:34 crc kubenswrapper[4746]: E0128 20:45:34.652285 4746 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = NotFound desc = container is not created or running: checking if PID of 1b916d1fcc57694a8d2d4f44b18cff98f131a258a4a52f7839523c53e042b66e is running failed: container process not found" containerID="1b916d1fcc57694a8d2d4f44b18cff98f131a258a4a52f7839523c53e042b66e" cmd=["grpc_health_probe","-addr=:50051"] Jan 28 20:45:34 crc kubenswrapper[4746]: E0128 20:45:34.652847 4746 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = NotFound desc = container is not created or running: checking if PID of 1b916d1fcc57694a8d2d4f44b18cff98f131a258a4a52f7839523c53e042b66e is running failed: container process not found" containerID="1b916d1fcc57694a8d2d4f44b18cff98f131a258a4a52f7839523c53e042b66e" cmd=["grpc_health_probe","-addr=:50051"] Jan 28 20:45:34 crc kubenswrapper[4746]: E0128 20:45:34.653024 4746 prober.go:104] "Probe errored" err="rpc error: code = NotFound desc = container is not created or running: checking if PID of 1b916d1fcc57694a8d2d4f44b18cff98f131a258a4a52f7839523c53e042b66e is running failed: container process not found" probeType="Readiness" pod="openshift-marketplace/redhat-operators-mlqqs" podUID="8b55bfa9-466f-44d9-8bc7-753bff9b7a7b" containerName="registry-server" Jan 28 20:45:34 crc kubenswrapper[4746]: I0128 20:45:34.698127 4746 reconciler_common.go:159] "operationExecutor.UnmountVolume started 
for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/6c585264-9fea-4d40-910d-68a31c553f76-utilities\") pod \"6c585264-9fea-4d40-910d-68a31c553f76\" (UID: \"6c585264-9fea-4d40-910d-68a31c553f76\") " Jan 28 20:45:34 crc kubenswrapper[4746]: I0128 20:45:34.698196 4746 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/6c585264-9fea-4d40-910d-68a31c553f76-catalog-content\") pod \"6c585264-9fea-4d40-910d-68a31c553f76\" (UID: \"6c585264-9fea-4d40-910d-68a31c553f76\") " Jan 28 20:45:34 crc kubenswrapper[4746]: I0128 20:45:34.698251 4746 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"marketplace-trusted-ca\" (UniqueName: \"kubernetes.io/configmap/3edaca00-e1a6-4b56-9290-cad6311263ee-marketplace-trusted-ca\") pod \"3edaca00-e1a6-4b56-9290-cad6311263ee\" (UID: \"3edaca00-e1a6-4b56-9290-cad6311263ee\") " Jan 28 20:45:34 crc kubenswrapper[4746]: I0128 20:45:34.698324 4746 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-pvnnm\" (UniqueName: \"kubernetes.io/projected/6c585264-9fea-4d40-910d-68a31c553f76-kube-api-access-pvnnm\") pod \"6c585264-9fea-4d40-910d-68a31c553f76\" (UID: \"6c585264-9fea-4d40-910d-68a31c553f76\") " Jan 28 20:45:34 crc kubenswrapper[4746]: I0128 20:45:34.698348 4746 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-kbstf\" (UniqueName: \"kubernetes.io/projected/3edaca00-e1a6-4b56-9290-cad6311263ee-kube-api-access-kbstf\") pod \"3edaca00-e1a6-4b56-9290-cad6311263ee\" (UID: \"3edaca00-e1a6-4b56-9290-cad6311263ee\") " Jan 28 20:45:34 crc kubenswrapper[4746]: I0128 20:45:34.698473 4746 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"marketplace-operator-metrics\" (UniqueName: \"kubernetes.io/secret/3edaca00-e1a6-4b56-9290-cad6311263ee-marketplace-operator-metrics\") pod \"3edaca00-e1a6-4b56-9290-cad6311263ee\" 
(UID: \"3edaca00-e1a6-4b56-9290-cad6311263ee\") " Jan 28 20:45:34 crc kubenswrapper[4746]: I0128 20:45:34.698757 4746 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-6n9sg\" (UniqueName: \"kubernetes.io/projected/18cbfd39-cf22-428c-ab2a-708082df0357-kube-api-access-6n9sg\") on node \"crc\" DevicePath \"\"" Jan 28 20:45:34 crc kubenswrapper[4746]: I0128 20:45:34.698775 4746 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/18cbfd39-cf22-428c-ab2a-708082df0357-catalog-content\") on node \"crc\" DevicePath \"\"" Jan 28 20:45:34 crc kubenswrapper[4746]: I0128 20:45:34.699156 4746 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/6c585264-9fea-4d40-910d-68a31c553f76-utilities" (OuterVolumeSpecName: "utilities") pod "6c585264-9fea-4d40-910d-68a31c553f76" (UID: "6c585264-9fea-4d40-910d-68a31c553f76"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 28 20:45:34 crc kubenswrapper[4746]: I0128 20:45:34.699727 4746 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/3edaca00-e1a6-4b56-9290-cad6311263ee-marketplace-trusted-ca" (OuterVolumeSpecName: "marketplace-trusted-ca") pod "3edaca00-e1a6-4b56-9290-cad6311263ee" (UID: "3edaca00-e1a6-4b56-9290-cad6311263ee"). InnerVolumeSpecName "marketplace-trusted-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 28 20:45:34 crc kubenswrapper[4746]: I0128 20:45:34.704485 4746 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/3edaca00-e1a6-4b56-9290-cad6311263ee-kube-api-access-kbstf" (OuterVolumeSpecName: "kube-api-access-kbstf") pod "3edaca00-e1a6-4b56-9290-cad6311263ee" (UID: "3edaca00-e1a6-4b56-9290-cad6311263ee"). InnerVolumeSpecName "kube-api-access-kbstf". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 28 20:45:34 crc kubenswrapper[4746]: I0128 20:45:34.704634 4746 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/6c585264-9fea-4d40-910d-68a31c553f76-kube-api-access-pvnnm" (OuterVolumeSpecName: "kube-api-access-pvnnm") pod "6c585264-9fea-4d40-910d-68a31c553f76" (UID: "6c585264-9fea-4d40-910d-68a31c553f76"). InnerVolumeSpecName "kube-api-access-pvnnm". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 28 20:45:34 crc kubenswrapper[4746]: I0128 20:45:34.705112 4746 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/3edaca00-e1a6-4b56-9290-cad6311263ee-marketplace-operator-metrics" (OuterVolumeSpecName: "marketplace-operator-metrics") pod "3edaca00-e1a6-4b56-9290-cad6311263ee" (UID: "3edaca00-e1a6-4b56-9290-cad6311263ee"). InnerVolumeSpecName "marketplace-operator-metrics". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 28 20:45:34 crc kubenswrapper[4746]: I0128 20:45:34.748539 4746 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/6c585264-9fea-4d40-910d-68a31c553f76-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "6c585264-9fea-4d40-910d-68a31c553f76" (UID: "6c585264-9fea-4d40-910d-68a31c553f76"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 28 20:45:34 crc kubenswrapper[4746]: W0128 20:45:34.772193 4746 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod6663df81_0144_46d7_90a2_a1ff5edb9474.slice/crio-1d2feb3a40f27df22f757a3885e93f8cc6599639c499db636492efd779262a7f WatchSource:0}: Error finding container 1d2feb3a40f27df22f757a3885e93f8cc6599639c499db636492efd779262a7f: Status 404 returned error can't find the container with id 1d2feb3a40f27df22f757a3885e93f8cc6599639c499db636492efd779262a7f Jan 28 20:45:34 crc kubenswrapper[4746]: I0128 20:45:34.775363 4746 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/marketplace-operator-79b997595-bgtlc"] Jan 28 20:45:34 crc kubenswrapper[4746]: I0128 20:45:34.800466 4746 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-pvnnm\" (UniqueName: \"kubernetes.io/projected/6c585264-9fea-4d40-910d-68a31c553f76-kube-api-access-pvnnm\") on node \"crc\" DevicePath \"\"" Jan 28 20:45:34 crc kubenswrapper[4746]: I0128 20:45:34.800500 4746 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-kbstf\" (UniqueName: \"kubernetes.io/projected/3edaca00-e1a6-4b56-9290-cad6311263ee-kube-api-access-kbstf\") on node \"crc\" DevicePath \"\"" Jan 28 20:45:34 crc kubenswrapper[4746]: I0128 20:45:34.800515 4746 reconciler_common.go:293] "Volume detached for volume \"marketplace-operator-metrics\" (UniqueName: \"kubernetes.io/secret/3edaca00-e1a6-4b56-9290-cad6311263ee-marketplace-operator-metrics\") on node \"crc\" DevicePath \"\"" Jan 28 20:45:34 crc kubenswrapper[4746]: I0128 20:45:34.800531 4746 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/6c585264-9fea-4d40-910d-68a31c553f76-utilities\") on node \"crc\" DevicePath \"\"" Jan 28 20:45:34 crc kubenswrapper[4746]: I0128 20:45:34.800547 4746 reconciler_common.go:293] "Volume 
detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/6c585264-9fea-4d40-910d-68a31c553f76-catalog-content\") on node \"crc\" DevicePath \"\"" Jan 28 20:45:34 crc kubenswrapper[4746]: I0128 20:45:34.800559 4746 reconciler_common.go:293] "Volume detached for volume \"marketplace-trusted-ca\" (UniqueName: \"kubernetes.io/configmap/3edaca00-e1a6-4b56-9290-cad6311263ee-marketplace-trusted-ca\") on node \"crc\" DevicePath \"\"" Jan 28 20:45:34 crc kubenswrapper[4746]: I0128 20:45:34.854363 4746 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-ghx5p" Jan 28 20:45:34 crc kubenswrapper[4746]: I0128 20:45:34.938637 4746 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-mlqqs" Jan 28 20:45:35 crc kubenswrapper[4746]: I0128 20:45:35.004421 4746 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-k2s85\" (UniqueName: \"kubernetes.io/projected/fa890224-0942-4671-a9d8-97b6f465b0df-kube-api-access-k2s85\") pod \"fa890224-0942-4671-a9d8-97b6f465b0df\" (UID: \"fa890224-0942-4671-a9d8-97b6f465b0df\") " Jan 28 20:45:35 crc kubenswrapper[4746]: I0128 20:45:35.004623 4746 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/fa890224-0942-4671-a9d8-97b6f465b0df-catalog-content\") pod \"fa890224-0942-4671-a9d8-97b6f465b0df\" (UID: \"fa890224-0942-4671-a9d8-97b6f465b0df\") " Jan 28 20:45:35 crc kubenswrapper[4746]: I0128 20:45:35.004657 4746 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/fa890224-0942-4671-a9d8-97b6f465b0df-utilities\") pod \"fa890224-0942-4671-a9d8-97b6f465b0df\" (UID: \"fa890224-0942-4671-a9d8-97b6f465b0df\") " Jan 28 20:45:35 crc kubenswrapper[4746]: I0128 20:45:35.005770 4746 
operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/fa890224-0942-4671-a9d8-97b6f465b0df-utilities" (OuterVolumeSpecName: "utilities") pod "fa890224-0942-4671-a9d8-97b6f465b0df" (UID: "fa890224-0942-4671-a9d8-97b6f465b0df"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 28 20:45:35 crc kubenswrapper[4746]: I0128 20:45:35.013195 4746 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/fa890224-0942-4671-a9d8-97b6f465b0df-kube-api-access-k2s85" (OuterVolumeSpecName: "kube-api-access-k2s85") pod "fa890224-0942-4671-a9d8-97b6f465b0df" (UID: "fa890224-0942-4671-a9d8-97b6f465b0df"). InnerVolumeSpecName "kube-api-access-k2s85". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 28 20:45:35 crc kubenswrapper[4746]: I0128 20:45:35.061654 4746 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/fa890224-0942-4671-a9d8-97b6f465b0df-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "fa890224-0942-4671-a9d8-97b6f465b0df" (UID: "fa890224-0942-4671-a9d8-97b6f465b0df"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 28 20:45:35 crc kubenswrapper[4746]: I0128 20:45:35.089504 4746 generic.go:334] "Generic (PLEG): container finished" podID="6c585264-9fea-4d40-910d-68a31c553f76" containerID="3209e2910865c45825fc585990fa2b0e07aa9b8fcd03af6eb5bddeea419f0bcf" exitCode=0 Jan 28 20:45:35 crc kubenswrapper[4746]: I0128 20:45:35.089616 4746 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-85ttn" event={"ID":"6c585264-9fea-4d40-910d-68a31c553f76","Type":"ContainerDied","Data":"3209e2910865c45825fc585990fa2b0e07aa9b8fcd03af6eb5bddeea419f0bcf"} Jan 28 20:45:35 crc kubenswrapper[4746]: I0128 20:45:35.089628 4746 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-85ttn" Jan 28 20:45:35 crc kubenswrapper[4746]: I0128 20:45:35.089661 4746 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-85ttn" event={"ID":"6c585264-9fea-4d40-910d-68a31c553f76","Type":"ContainerDied","Data":"1c152bc56d2dadac40f6ef2599da3caee82ce68ae0450e9ed48e6d99dec850bc"} Jan 28 20:45:35 crc kubenswrapper[4746]: I0128 20:45:35.089689 4746 scope.go:117] "RemoveContainer" containerID="3209e2910865c45825fc585990fa2b0e07aa9b8fcd03af6eb5bddeea419f0bcf" Jan 28 20:45:35 crc kubenswrapper[4746]: I0128 20:45:35.092177 4746 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/marketplace-operator-79b997595-sqh2q" Jan 28 20:45:35 crc kubenswrapper[4746]: I0128 20:45:35.092172 4746 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/marketplace-operator-79b997595-sqh2q" event={"ID":"3edaca00-e1a6-4b56-9290-cad6311263ee","Type":"ContainerDied","Data":"bf46698bee7daf86d3e201e06503c2f58587dd4227307d61a9381a6addafc3cf"} Jan 28 20:45:35 crc kubenswrapper[4746]: I0128 20:45:35.095658 4746 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-4t66g" event={"ID":"18cbfd39-cf22-428c-ab2a-708082df0357","Type":"ContainerDied","Data":"dbcf39c2a0ca5808d27f687c44c2a3c58ea2e8dd4303076e253d69c06dd6de74"} Jan 28 20:45:35 crc kubenswrapper[4746]: I0128 20:45:35.095760 4746 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-4t66g" Jan 28 20:45:35 crc kubenswrapper[4746]: I0128 20:45:35.098591 4746 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-mlqqs" Jan 28 20:45:35 crc kubenswrapper[4746]: I0128 20:45:35.098588 4746 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-mlqqs" event={"ID":"8b55bfa9-466f-44d9-8bc7-753bff9b7a7b","Type":"ContainerDied","Data":"813e0f60a52a5e3ac9fdec874e3f88c3de70aa414cf90abda02a22436f40adaa"} Jan 28 20:45:35 crc kubenswrapper[4746]: I0128 20:45:35.101611 4746 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/marketplace-operator-79b997595-bgtlc" event={"ID":"6663df81-0144-46d7-90a2-a1ff5edb9474","Type":"ContainerStarted","Data":"a9bb2ed5bf770de13001b86ab7fac91e2e100d9983572c64c81f01a182f506ab"} Jan 28 20:45:35 crc kubenswrapper[4746]: I0128 20:45:35.101652 4746 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/marketplace-operator-79b997595-bgtlc" event={"ID":"6663df81-0144-46d7-90a2-a1ff5edb9474","Type":"ContainerStarted","Data":"1d2feb3a40f27df22f757a3885e93f8cc6599639c499db636492efd779262a7f"} Jan 28 20:45:35 crc kubenswrapper[4746]: I0128 20:45:35.102781 4746 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/marketplace-operator-79b997595-bgtlc" Jan 28 20:45:35 crc kubenswrapper[4746]: I0128 20:45:35.114441 4746 patch_prober.go:28] interesting pod/marketplace-operator-79b997595-bgtlc container/marketplace-operator namespace/openshift-marketplace: Readiness probe status=failure output="Get \"http://10.217.0.72:8080/healthz\": dial tcp 10.217.0.72:8080: connect: connection refused" start-of-body= Jan 28 20:45:35 crc kubenswrapper[4746]: I0128 20:45:35.114540 4746 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-marketplace/marketplace-operator-79b997595-bgtlc" podUID="6663df81-0144-46d7-90a2-a1ff5edb9474" containerName="marketplace-operator" probeResult="failure" output="Get \"http://10.217.0.72:8080/healthz\": dial tcp 10.217.0.72:8080: connect: 
connection refused" Jan 28 20:45:35 crc kubenswrapper[4746]: I0128 20:45:35.115617 4746 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/8b55bfa9-466f-44d9-8bc7-753bff9b7a7b-utilities\") pod \"8b55bfa9-466f-44d9-8bc7-753bff9b7a7b\" (UID: \"8b55bfa9-466f-44d9-8bc7-753bff9b7a7b\") " Jan 28 20:45:35 crc kubenswrapper[4746]: I0128 20:45:35.115667 4746 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/8b55bfa9-466f-44d9-8bc7-753bff9b7a7b-catalog-content\") pod \"8b55bfa9-466f-44d9-8bc7-753bff9b7a7b\" (UID: \"8b55bfa9-466f-44d9-8bc7-753bff9b7a7b\") " Jan 28 20:45:35 crc kubenswrapper[4746]: I0128 20:45:35.115747 4746 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-xvjxs\" (UniqueName: \"kubernetes.io/projected/8b55bfa9-466f-44d9-8bc7-753bff9b7a7b-kube-api-access-xvjxs\") pod \"8b55bfa9-466f-44d9-8bc7-753bff9b7a7b\" (UID: \"8b55bfa9-466f-44d9-8bc7-753bff9b7a7b\") " Jan 28 20:45:35 crc kubenswrapper[4746]: I0128 20:45:35.116438 4746 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-k2s85\" (UniqueName: \"kubernetes.io/projected/fa890224-0942-4671-a9d8-97b6f465b0df-kube-api-access-k2s85\") on node \"crc\" DevicePath \"\"" Jan 28 20:45:35 crc kubenswrapper[4746]: I0128 20:45:35.116466 4746 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/fa890224-0942-4671-a9d8-97b6f465b0df-catalog-content\") on node \"crc\" DevicePath \"\"" Jan 28 20:45:35 crc kubenswrapper[4746]: I0128 20:45:35.116477 4746 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/fa890224-0942-4671-a9d8-97b6f465b0df-utilities\") on node \"crc\" DevicePath \"\"" Jan 28 20:45:35 crc kubenswrapper[4746]: I0128 20:45:35.118276 4746 operation_generator.go:803] 
UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/8b55bfa9-466f-44d9-8bc7-753bff9b7a7b-utilities" (OuterVolumeSpecName: "utilities") pod "8b55bfa9-466f-44d9-8bc7-753bff9b7a7b" (UID: "8b55bfa9-466f-44d9-8bc7-753bff9b7a7b"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 28 20:45:35 crc kubenswrapper[4746]: I0128 20:45:35.120421 4746 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/8b55bfa9-466f-44d9-8bc7-753bff9b7a7b-kube-api-access-xvjxs" (OuterVolumeSpecName: "kube-api-access-xvjxs") pod "8b55bfa9-466f-44d9-8bc7-753bff9b7a7b" (UID: "8b55bfa9-466f-44d9-8bc7-753bff9b7a7b"). InnerVolumeSpecName "kube-api-access-xvjxs". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 28 20:45:35 crc kubenswrapper[4746]: I0128 20:45:35.121730 4746 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-ghx5p" event={"ID":"fa890224-0942-4671-a9d8-97b6f465b0df","Type":"ContainerDied","Data":"1ec69a65aba434976ee0a2f50750e942a4b966ba8d025d472492feb7673348a6"} Jan 28 20:45:35 crc kubenswrapper[4746]: I0128 20:45:35.121919 4746 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-ghx5p" Jan 28 20:45:35 crc kubenswrapper[4746]: I0128 20:45:35.136711 4746 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-4t66g"] Jan 28 20:45:35 crc kubenswrapper[4746]: I0128 20:45:35.141861 4746 scope.go:117] "RemoveContainer" containerID="2705ebede777955e1f157a73a21abb2133c3b3f0ad608862ab9a313f6568a8a8" Jan 28 20:45:35 crc kubenswrapper[4746]: I0128 20:45:35.143367 4746 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-marketplace-4t66g"] Jan 28 20:45:35 crc kubenswrapper[4746]: I0128 20:45:35.154642 4746 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/marketplace-operator-79b997595-sqh2q"] Jan 28 20:45:35 crc kubenswrapper[4746]: I0128 20:45:35.164191 4746 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/marketplace-operator-79b997595-sqh2q"] Jan 28 20:45:35 crc kubenswrapper[4746]: I0128 20:45:35.165880 4746 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-85ttn"] Jan 28 20:45:35 crc kubenswrapper[4746]: I0128 20:45:35.168796 4746 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/certified-operators-85ttn"] Jan 28 20:45:35 crc kubenswrapper[4746]: I0128 20:45:35.178918 4746 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/marketplace-operator-79b997595-bgtlc" podStartSLOduration=2.17888948 podStartE2EDuration="2.17888948s" podCreationTimestamp="2026-01-28 20:45:33 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-28 20:45:35.174784805 +0000 UTC m=+363.130971159" watchObservedRunningTime="2026-01-28 20:45:35.17888948 +0000 UTC m=+363.135075834" Jan 28 20:45:35 crc kubenswrapper[4746]: I0128 20:45:35.180595 4746 scope.go:117] 
"RemoveContainer" containerID="6fac53a10394eaceda79307d69b7558b0c7a7c5ebc41fd8e2a9dabdab713dee5" Jan 28 20:45:35 crc kubenswrapper[4746]: I0128 20:45:35.193617 4746 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-ghx5p"] Jan 28 20:45:35 crc kubenswrapper[4746]: I0128 20:45:35.202483 4746 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/community-operators-ghx5p"] Jan 28 20:45:35 crc kubenswrapper[4746]: I0128 20:45:35.205691 4746 scope.go:117] "RemoveContainer" containerID="3209e2910865c45825fc585990fa2b0e07aa9b8fcd03af6eb5bddeea419f0bcf" Jan 28 20:45:35 crc kubenswrapper[4746]: E0128 20:45:35.206065 4746 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"3209e2910865c45825fc585990fa2b0e07aa9b8fcd03af6eb5bddeea419f0bcf\": container with ID starting with 3209e2910865c45825fc585990fa2b0e07aa9b8fcd03af6eb5bddeea419f0bcf not found: ID does not exist" containerID="3209e2910865c45825fc585990fa2b0e07aa9b8fcd03af6eb5bddeea419f0bcf" Jan 28 20:45:35 crc kubenswrapper[4746]: I0128 20:45:35.206114 4746 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"3209e2910865c45825fc585990fa2b0e07aa9b8fcd03af6eb5bddeea419f0bcf"} err="failed to get container status \"3209e2910865c45825fc585990fa2b0e07aa9b8fcd03af6eb5bddeea419f0bcf\": rpc error: code = NotFound desc = could not find container \"3209e2910865c45825fc585990fa2b0e07aa9b8fcd03af6eb5bddeea419f0bcf\": container with ID starting with 3209e2910865c45825fc585990fa2b0e07aa9b8fcd03af6eb5bddeea419f0bcf not found: ID does not exist" Jan 28 20:45:35 crc kubenswrapper[4746]: I0128 20:45:35.206143 4746 scope.go:117] "RemoveContainer" containerID="2705ebede777955e1f157a73a21abb2133c3b3f0ad608862ab9a313f6568a8a8" Jan 28 20:45:35 crc kubenswrapper[4746]: E0128 20:45:35.206887 4746 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: 
code = NotFound desc = could not find container \"2705ebede777955e1f157a73a21abb2133c3b3f0ad608862ab9a313f6568a8a8\": container with ID starting with 2705ebede777955e1f157a73a21abb2133c3b3f0ad608862ab9a313f6568a8a8 not found: ID does not exist" containerID="2705ebede777955e1f157a73a21abb2133c3b3f0ad608862ab9a313f6568a8a8" Jan 28 20:45:35 crc kubenswrapper[4746]: I0128 20:45:35.206911 4746 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"2705ebede777955e1f157a73a21abb2133c3b3f0ad608862ab9a313f6568a8a8"} err="failed to get container status \"2705ebede777955e1f157a73a21abb2133c3b3f0ad608862ab9a313f6568a8a8\": rpc error: code = NotFound desc = could not find container \"2705ebede777955e1f157a73a21abb2133c3b3f0ad608862ab9a313f6568a8a8\": container with ID starting with 2705ebede777955e1f157a73a21abb2133c3b3f0ad608862ab9a313f6568a8a8 not found: ID does not exist" Jan 28 20:45:35 crc kubenswrapper[4746]: I0128 20:45:35.206926 4746 scope.go:117] "RemoveContainer" containerID="6fac53a10394eaceda79307d69b7558b0c7a7c5ebc41fd8e2a9dabdab713dee5" Jan 28 20:45:35 crc kubenswrapper[4746]: E0128 20:45:35.207240 4746 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"6fac53a10394eaceda79307d69b7558b0c7a7c5ebc41fd8e2a9dabdab713dee5\": container with ID starting with 6fac53a10394eaceda79307d69b7558b0c7a7c5ebc41fd8e2a9dabdab713dee5 not found: ID does not exist" containerID="6fac53a10394eaceda79307d69b7558b0c7a7c5ebc41fd8e2a9dabdab713dee5" Jan 28 20:45:35 crc kubenswrapper[4746]: I0128 20:45:35.207307 4746 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"6fac53a10394eaceda79307d69b7558b0c7a7c5ebc41fd8e2a9dabdab713dee5"} err="failed to get container status \"6fac53a10394eaceda79307d69b7558b0c7a7c5ebc41fd8e2a9dabdab713dee5\": rpc error: code = NotFound desc = could not find container 
\"6fac53a10394eaceda79307d69b7558b0c7a7c5ebc41fd8e2a9dabdab713dee5\": container with ID starting with 6fac53a10394eaceda79307d69b7558b0c7a7c5ebc41fd8e2a9dabdab713dee5 not found: ID does not exist" Jan 28 20:45:35 crc kubenswrapper[4746]: I0128 20:45:35.207344 4746 scope.go:117] "RemoveContainer" containerID="3c9b761d21788e5d9286b6f107962e6f405210483aad309dac0ce9df1dcdffbb" Jan 28 20:45:35 crc kubenswrapper[4746]: I0128 20:45:35.220552 4746 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/8b55bfa9-466f-44d9-8bc7-753bff9b7a7b-utilities\") on node \"crc\" DevicePath \"\"" Jan 28 20:45:35 crc kubenswrapper[4746]: I0128 20:45:35.220653 4746 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-xvjxs\" (UniqueName: \"kubernetes.io/projected/8b55bfa9-466f-44d9-8bc7-753bff9b7a7b-kube-api-access-xvjxs\") on node \"crc\" DevicePath \"\"" Jan 28 20:45:35 crc kubenswrapper[4746]: I0128 20:45:35.224569 4746 scope.go:117] "RemoveContainer" containerID="143e6acaec90736206f109f4678283fa5aa04d423876a786f92adf3ec3991ecf" Jan 28 20:45:35 crc kubenswrapper[4746]: I0128 20:45:35.243085 4746 scope.go:117] "RemoveContainer" containerID="3d2f7c7bdc3aee38214514d5420ddf8c61bd01380a6c2d114f424e19461defae" Jan 28 20:45:35 crc kubenswrapper[4746]: I0128 20:45:35.262419 4746 scope.go:117] "RemoveContainer" containerID="640584f5445acc292be3e899fbe66326f8914115659d5783f447fd6048c07e74" Jan 28 20:45:35 crc kubenswrapper[4746]: I0128 20:45:35.278797 4746 scope.go:117] "RemoveContainer" containerID="1b916d1fcc57694a8d2d4f44b18cff98f131a258a4a52f7839523c53e042b66e" Jan 28 20:45:35 crc kubenswrapper[4746]: I0128 20:45:35.282600 4746 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/8b55bfa9-466f-44d9-8bc7-753bff9b7a7b-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "8b55bfa9-466f-44d9-8bc7-753bff9b7a7b" (UID: "8b55bfa9-466f-44d9-8bc7-753bff9b7a7b"). 
InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 28 20:45:35 crc kubenswrapper[4746]: I0128 20:45:35.298463 4746 scope.go:117] "RemoveContainer" containerID="578f2b2763780b4281717d9382666e95eb03ed3741ab3a72e97691170c3313bd" Jan 28 20:45:35 crc kubenswrapper[4746]: I0128 20:45:35.322043 4746 scope.go:117] "RemoveContainer" containerID="9499d6f79c0f3462bab4e907e6efe9120566a0596c9ba1f9be584a03dd93d8d1" Jan 28 20:45:35 crc kubenswrapper[4746]: I0128 20:45:35.323272 4746 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/8b55bfa9-466f-44d9-8bc7-753bff9b7a7b-catalog-content\") on node \"crc\" DevicePath \"\"" Jan 28 20:45:35 crc kubenswrapper[4746]: I0128 20:45:35.338308 4746 scope.go:117] "RemoveContainer" containerID="a97c4a3cc18a3236a655a4072cc1a10ac809fb9a1cffaec7954c6199ad833df8" Jan 28 20:45:35 crc kubenswrapper[4746]: I0128 20:45:35.357712 4746 scope.go:117] "RemoveContainer" containerID="21fc934156582dff48d7bbd6ca5af885c135910d225d3ce4e1850dd45c5df869" Jan 28 20:45:35 crc kubenswrapper[4746]: I0128 20:45:35.377471 4746 scope.go:117] "RemoveContainer" containerID="7d8495ae7bda633303f679c164a3d194416ae004312340eab8c9690fba4331f2" Jan 28 20:45:35 crc kubenswrapper[4746]: I0128 20:45:35.432807 4746 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-mlqqs"] Jan 28 20:45:35 crc kubenswrapper[4746]: I0128 20:45:35.438536 4746 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-operators-mlqqs"] Jan 28 20:45:36 crc kubenswrapper[4746]: I0128 20:45:36.099281 4746 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-marketplace-525kj"] Jan 28 20:45:36 crc kubenswrapper[4746]: E0128 20:45:36.099963 4746 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="6c585264-9fea-4d40-910d-68a31c553f76" containerName="extract-content" Jan 28 20:45:36 crc 
kubenswrapper[4746]: I0128 20:45:36.099982 4746 state_mem.go:107] "Deleted CPUSet assignment" podUID="6c585264-9fea-4d40-910d-68a31c553f76" containerName="extract-content" Jan 28 20:45:36 crc kubenswrapper[4746]: E0128 20:45:36.099997 4746 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="3edaca00-e1a6-4b56-9290-cad6311263ee" containerName="marketplace-operator" Jan 28 20:45:36 crc kubenswrapper[4746]: I0128 20:45:36.100005 4746 state_mem.go:107] "Deleted CPUSet assignment" podUID="3edaca00-e1a6-4b56-9290-cad6311263ee" containerName="marketplace-operator" Jan 28 20:45:36 crc kubenswrapper[4746]: E0128 20:45:36.100018 4746 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="8b55bfa9-466f-44d9-8bc7-753bff9b7a7b" containerName="extract-content" Jan 28 20:45:36 crc kubenswrapper[4746]: I0128 20:45:36.100027 4746 state_mem.go:107] "Deleted CPUSet assignment" podUID="8b55bfa9-466f-44d9-8bc7-753bff9b7a7b" containerName="extract-content" Jan 28 20:45:36 crc kubenswrapper[4746]: E0128 20:45:36.100038 4746 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="fa890224-0942-4671-a9d8-97b6f465b0df" containerName="extract-utilities" Jan 28 20:45:36 crc kubenswrapper[4746]: I0128 20:45:36.100046 4746 state_mem.go:107] "Deleted CPUSet assignment" podUID="fa890224-0942-4671-a9d8-97b6f465b0df" containerName="extract-utilities" Jan 28 20:45:36 crc kubenswrapper[4746]: E0128 20:45:36.100059 4746 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="fa890224-0942-4671-a9d8-97b6f465b0df" containerName="extract-content" Jan 28 20:45:36 crc kubenswrapper[4746]: I0128 20:45:36.100066 4746 state_mem.go:107] "Deleted CPUSet assignment" podUID="fa890224-0942-4671-a9d8-97b6f465b0df" containerName="extract-content" Jan 28 20:45:36 crc kubenswrapper[4746]: E0128 20:45:36.100096 4746 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="3edaca00-e1a6-4b56-9290-cad6311263ee" containerName="marketplace-operator" Jan 28 20:45:36 
crc kubenswrapper[4746]: I0128 20:45:36.100104 4746 state_mem.go:107] "Deleted CPUSet assignment" podUID="3edaca00-e1a6-4b56-9290-cad6311263ee" containerName="marketplace-operator" Jan 28 20:45:36 crc kubenswrapper[4746]: E0128 20:45:36.100116 4746 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="6c585264-9fea-4d40-910d-68a31c553f76" containerName="extract-utilities" Jan 28 20:45:36 crc kubenswrapper[4746]: I0128 20:45:36.100124 4746 state_mem.go:107] "Deleted CPUSet assignment" podUID="6c585264-9fea-4d40-910d-68a31c553f76" containerName="extract-utilities" Jan 28 20:45:36 crc kubenswrapper[4746]: E0128 20:45:36.100137 4746 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="18cbfd39-cf22-428c-ab2a-708082df0357" containerName="registry-server" Jan 28 20:45:36 crc kubenswrapper[4746]: I0128 20:45:36.100145 4746 state_mem.go:107] "Deleted CPUSet assignment" podUID="18cbfd39-cf22-428c-ab2a-708082df0357" containerName="registry-server" Jan 28 20:45:36 crc kubenswrapper[4746]: E0128 20:45:36.100158 4746 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="6c585264-9fea-4d40-910d-68a31c553f76" containerName="registry-server" Jan 28 20:45:36 crc kubenswrapper[4746]: I0128 20:45:36.100166 4746 state_mem.go:107] "Deleted CPUSet assignment" podUID="6c585264-9fea-4d40-910d-68a31c553f76" containerName="registry-server" Jan 28 20:45:36 crc kubenswrapper[4746]: E0128 20:45:36.100179 4746 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="18cbfd39-cf22-428c-ab2a-708082df0357" containerName="extract-utilities" Jan 28 20:45:36 crc kubenswrapper[4746]: I0128 20:45:36.100186 4746 state_mem.go:107] "Deleted CPUSet assignment" podUID="18cbfd39-cf22-428c-ab2a-708082df0357" containerName="extract-utilities" Jan 28 20:45:36 crc kubenswrapper[4746]: E0128 20:45:36.100197 4746 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="18cbfd39-cf22-428c-ab2a-708082df0357" containerName="extract-content" Jan 28 20:45:36 
crc kubenswrapper[4746]: I0128 20:45:36.100204 4746 state_mem.go:107] "Deleted CPUSet assignment" podUID="18cbfd39-cf22-428c-ab2a-708082df0357" containerName="extract-content" Jan 28 20:45:36 crc kubenswrapper[4746]: E0128 20:45:36.100215 4746 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="fa890224-0942-4671-a9d8-97b6f465b0df" containerName="registry-server" Jan 28 20:45:36 crc kubenswrapper[4746]: I0128 20:45:36.100222 4746 state_mem.go:107] "Deleted CPUSet assignment" podUID="fa890224-0942-4671-a9d8-97b6f465b0df" containerName="registry-server" Jan 28 20:45:36 crc kubenswrapper[4746]: E0128 20:45:36.100234 4746 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="8b55bfa9-466f-44d9-8bc7-753bff9b7a7b" containerName="registry-server" Jan 28 20:45:36 crc kubenswrapper[4746]: I0128 20:45:36.100241 4746 state_mem.go:107] "Deleted CPUSet assignment" podUID="8b55bfa9-466f-44d9-8bc7-753bff9b7a7b" containerName="registry-server" Jan 28 20:45:36 crc kubenswrapper[4746]: E0128 20:45:36.100252 4746 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="8b55bfa9-466f-44d9-8bc7-753bff9b7a7b" containerName="extract-utilities" Jan 28 20:45:36 crc kubenswrapper[4746]: I0128 20:45:36.100259 4746 state_mem.go:107] "Deleted CPUSet assignment" podUID="8b55bfa9-466f-44d9-8bc7-753bff9b7a7b" containerName="extract-utilities" Jan 28 20:45:36 crc kubenswrapper[4746]: I0128 20:45:36.100374 4746 memory_manager.go:354] "RemoveStaleState removing state" podUID="18cbfd39-cf22-428c-ab2a-708082df0357" containerName="registry-server" Jan 28 20:45:36 crc kubenswrapper[4746]: I0128 20:45:36.100388 4746 memory_manager.go:354] "RemoveStaleState removing state" podUID="8b55bfa9-466f-44d9-8bc7-753bff9b7a7b" containerName="registry-server" Jan 28 20:45:36 crc kubenswrapper[4746]: I0128 20:45:36.100398 4746 memory_manager.go:354] "RemoveStaleState removing state" podUID="3edaca00-e1a6-4b56-9290-cad6311263ee" containerName="marketplace-operator" Jan 28 
20:45:36 crc kubenswrapper[4746]: I0128 20:45:36.100407 4746 memory_manager.go:354] "RemoveStaleState removing state" podUID="fa890224-0942-4671-a9d8-97b6f465b0df" containerName="registry-server" Jan 28 20:45:36 crc kubenswrapper[4746]: I0128 20:45:36.100423 4746 memory_manager.go:354] "RemoveStaleState removing state" podUID="6c585264-9fea-4d40-910d-68a31c553f76" containerName="registry-server" Jan 28 20:45:36 crc kubenswrapper[4746]: I0128 20:45:36.100433 4746 memory_manager.go:354] "RemoveStaleState removing state" podUID="3edaca00-e1a6-4b56-9290-cad6311263ee" containerName="marketplace-operator" Jan 28 20:45:36 crc kubenswrapper[4746]: I0128 20:45:36.101437 4746 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-525kj" Jan 28 20:45:36 crc kubenswrapper[4746]: I0128 20:45:36.104595 4746 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"redhat-marketplace-dockercfg-x2ctb" Jan 28 20:45:36 crc kubenswrapper[4746]: I0128 20:45:36.124223 4746 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-525kj"] Jan 28 20:45:36 crc kubenswrapper[4746]: I0128 20:45:36.134825 4746 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-pjqxq\" (UniqueName: \"kubernetes.io/projected/13f17d99-49bb-4710-8d6b-e01933b5d396-kube-api-access-pjqxq\") pod \"redhat-marketplace-525kj\" (UID: \"13f17d99-49bb-4710-8d6b-e01933b5d396\") " pod="openshift-marketplace/redhat-marketplace-525kj" Jan 28 20:45:36 crc kubenswrapper[4746]: I0128 20:45:36.134903 4746 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/13f17d99-49bb-4710-8d6b-e01933b5d396-catalog-content\") pod \"redhat-marketplace-525kj\" (UID: \"13f17d99-49bb-4710-8d6b-e01933b5d396\") " 
pod="openshift-marketplace/redhat-marketplace-525kj" Jan 28 20:45:36 crc kubenswrapper[4746]: I0128 20:45:36.134940 4746 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/13f17d99-49bb-4710-8d6b-e01933b5d396-utilities\") pod \"redhat-marketplace-525kj\" (UID: \"13f17d99-49bb-4710-8d6b-e01933b5d396\") " pod="openshift-marketplace/redhat-marketplace-525kj" Jan 28 20:45:36 crc kubenswrapper[4746]: I0128 20:45:36.175749 4746 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/marketplace-operator-79b997595-bgtlc" Jan 28 20:45:36 crc kubenswrapper[4746]: I0128 20:45:36.236027 4746 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-pjqxq\" (UniqueName: \"kubernetes.io/projected/13f17d99-49bb-4710-8d6b-e01933b5d396-kube-api-access-pjqxq\") pod \"redhat-marketplace-525kj\" (UID: \"13f17d99-49bb-4710-8d6b-e01933b5d396\") " pod="openshift-marketplace/redhat-marketplace-525kj" Jan 28 20:45:36 crc kubenswrapper[4746]: I0128 20:45:36.236114 4746 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/13f17d99-49bb-4710-8d6b-e01933b5d396-catalog-content\") pod \"redhat-marketplace-525kj\" (UID: \"13f17d99-49bb-4710-8d6b-e01933b5d396\") " pod="openshift-marketplace/redhat-marketplace-525kj" Jan 28 20:45:36 crc kubenswrapper[4746]: I0128 20:45:36.236150 4746 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/13f17d99-49bb-4710-8d6b-e01933b5d396-utilities\") pod \"redhat-marketplace-525kj\" (UID: \"13f17d99-49bb-4710-8d6b-e01933b5d396\") " pod="openshift-marketplace/redhat-marketplace-525kj" Jan 28 20:45:36 crc kubenswrapper[4746]: I0128 20:45:36.237491 4746 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" 
(UniqueName: \"kubernetes.io/empty-dir/13f17d99-49bb-4710-8d6b-e01933b5d396-catalog-content\") pod \"redhat-marketplace-525kj\" (UID: \"13f17d99-49bb-4710-8d6b-e01933b5d396\") " pod="openshift-marketplace/redhat-marketplace-525kj" Jan 28 20:45:36 crc kubenswrapper[4746]: I0128 20:45:36.238020 4746 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/13f17d99-49bb-4710-8d6b-e01933b5d396-utilities\") pod \"redhat-marketplace-525kj\" (UID: \"13f17d99-49bb-4710-8d6b-e01933b5d396\") " pod="openshift-marketplace/redhat-marketplace-525kj" Jan 28 20:45:36 crc kubenswrapper[4746]: I0128 20:45:36.260651 4746 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-pjqxq\" (UniqueName: \"kubernetes.io/projected/13f17d99-49bb-4710-8d6b-e01933b5d396-kube-api-access-pjqxq\") pod \"redhat-marketplace-525kj\" (UID: \"13f17d99-49bb-4710-8d6b-e01933b5d396\") " pod="openshift-marketplace/redhat-marketplace-525kj" Jan 28 20:45:36 crc kubenswrapper[4746]: I0128 20:45:36.298989 4746 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-operators-hp8gs"] Jan 28 20:45:36 crc kubenswrapper[4746]: I0128 20:45:36.304363 4746 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-hp8gs" Jan 28 20:45:36 crc kubenswrapper[4746]: I0128 20:45:36.307994 4746 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"redhat-operators-dockercfg-ct8rh" Jan 28 20:45:36 crc kubenswrapper[4746]: I0128 20:45:36.313138 4746 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-hp8gs"] Jan 28 20:45:36 crc kubenswrapper[4746]: I0128 20:45:36.439431 4746 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/b5a70df4-251e-4d72-b220-9772f0b70727-utilities\") pod \"redhat-operators-hp8gs\" (UID: \"b5a70df4-251e-4d72-b220-9772f0b70727\") " pod="openshift-marketplace/redhat-operators-hp8gs" Jan 28 20:45:36 crc kubenswrapper[4746]: I0128 20:45:36.439501 4746 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-c6k26\" (UniqueName: \"kubernetes.io/projected/b5a70df4-251e-4d72-b220-9772f0b70727-kube-api-access-c6k26\") pod \"redhat-operators-hp8gs\" (UID: \"b5a70df4-251e-4d72-b220-9772f0b70727\") " pod="openshift-marketplace/redhat-operators-hp8gs" Jan 28 20:45:36 crc kubenswrapper[4746]: I0128 20:45:36.439573 4746 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/b5a70df4-251e-4d72-b220-9772f0b70727-catalog-content\") pod \"redhat-operators-hp8gs\" (UID: \"b5a70df4-251e-4d72-b220-9772f0b70727\") " pod="openshift-marketplace/redhat-operators-hp8gs" Jan 28 20:45:36 crc kubenswrapper[4746]: I0128 20:45:36.485393 4746 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-525kj" Jan 28 20:45:36 crc kubenswrapper[4746]: I0128 20:45:36.540564 4746 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/b5a70df4-251e-4d72-b220-9772f0b70727-utilities\") pod \"redhat-operators-hp8gs\" (UID: \"b5a70df4-251e-4d72-b220-9772f0b70727\") " pod="openshift-marketplace/redhat-operators-hp8gs" Jan 28 20:45:36 crc kubenswrapper[4746]: I0128 20:45:36.541066 4746 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-c6k26\" (UniqueName: \"kubernetes.io/projected/b5a70df4-251e-4d72-b220-9772f0b70727-kube-api-access-c6k26\") pod \"redhat-operators-hp8gs\" (UID: \"b5a70df4-251e-4d72-b220-9772f0b70727\") " pod="openshift-marketplace/redhat-operators-hp8gs" Jan 28 20:45:36 crc kubenswrapper[4746]: I0128 20:45:36.541264 4746 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/b5a70df4-251e-4d72-b220-9772f0b70727-catalog-content\") pod \"redhat-operators-hp8gs\" (UID: \"b5a70df4-251e-4d72-b220-9772f0b70727\") " pod="openshift-marketplace/redhat-operators-hp8gs" Jan 28 20:45:36 crc kubenswrapper[4746]: I0128 20:45:36.541520 4746 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/b5a70df4-251e-4d72-b220-9772f0b70727-utilities\") pod \"redhat-operators-hp8gs\" (UID: \"b5a70df4-251e-4d72-b220-9772f0b70727\") " pod="openshift-marketplace/redhat-operators-hp8gs" Jan 28 20:45:36 crc kubenswrapper[4746]: I0128 20:45:36.541677 4746 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/b5a70df4-251e-4d72-b220-9772f0b70727-catalog-content\") pod \"redhat-operators-hp8gs\" (UID: \"b5a70df4-251e-4d72-b220-9772f0b70727\") " 
pod="openshift-marketplace/redhat-operators-hp8gs" Jan 28 20:45:36 crc kubenswrapper[4746]: I0128 20:45:36.572883 4746 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-c6k26\" (UniqueName: \"kubernetes.io/projected/b5a70df4-251e-4d72-b220-9772f0b70727-kube-api-access-c6k26\") pod \"redhat-operators-hp8gs\" (UID: \"b5a70df4-251e-4d72-b220-9772f0b70727\") " pod="openshift-marketplace/redhat-operators-hp8gs" Jan 28 20:45:36 crc kubenswrapper[4746]: I0128 20:45:36.632446 4746 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-hp8gs" Jan 28 20:45:36 crc kubenswrapper[4746]: I0128 20:45:36.845195 4746 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="18cbfd39-cf22-428c-ab2a-708082df0357" path="/var/lib/kubelet/pods/18cbfd39-cf22-428c-ab2a-708082df0357/volumes" Jan 28 20:45:36 crc kubenswrapper[4746]: I0128 20:45:36.848096 4746 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="3edaca00-e1a6-4b56-9290-cad6311263ee" path="/var/lib/kubelet/pods/3edaca00-e1a6-4b56-9290-cad6311263ee/volumes" Jan 28 20:45:36 crc kubenswrapper[4746]: I0128 20:45:36.848669 4746 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="6c585264-9fea-4d40-910d-68a31c553f76" path="/var/lib/kubelet/pods/6c585264-9fea-4d40-910d-68a31c553f76/volumes" Jan 28 20:45:36 crc kubenswrapper[4746]: I0128 20:45:36.849779 4746 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="8b55bfa9-466f-44d9-8bc7-753bff9b7a7b" path="/var/lib/kubelet/pods/8b55bfa9-466f-44d9-8bc7-753bff9b7a7b/volumes" Jan 28 20:45:36 crc kubenswrapper[4746]: I0128 20:45:36.850460 4746 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="fa890224-0942-4671-a9d8-97b6f465b0df" path="/var/lib/kubelet/pods/fa890224-0942-4671-a9d8-97b6f465b0df/volumes" Jan 28 20:45:36 crc kubenswrapper[4746]: I0128 20:45:36.909178 4746 kubelet.go:2428] "SyncLoop UPDATE" 
source="api" pods=["openshift-marketplace/redhat-marketplace-525kj"] Jan 28 20:45:36 crc kubenswrapper[4746]: W0128 20:45:36.913754 4746 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod13f17d99_49bb_4710_8d6b_e01933b5d396.slice/crio-6db160a8f6c3902a7846233f5e3e4de70118b98d1ec2b12e78cf75e58a35b67a WatchSource:0}: Error finding container 6db160a8f6c3902a7846233f5e3e4de70118b98d1ec2b12e78cf75e58a35b67a: Status 404 returned error can't find the container with id 6db160a8f6c3902a7846233f5e3e4de70118b98d1ec2b12e78cf75e58a35b67a Jan 28 20:45:37 crc kubenswrapper[4746]: I0128 20:45:37.071786 4746 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-hp8gs"] Jan 28 20:45:37 crc kubenswrapper[4746]: I0128 20:45:37.147558 4746 generic.go:334] "Generic (PLEG): container finished" podID="13f17d99-49bb-4710-8d6b-e01933b5d396" containerID="3ceebdeacf394149f4b2c9eadd7e4a6ac8a11dbe0f71b2fac5f193c6a049d9f0" exitCode=0 Jan 28 20:45:37 crc kubenswrapper[4746]: I0128 20:45:37.147678 4746 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-525kj" event={"ID":"13f17d99-49bb-4710-8d6b-e01933b5d396","Type":"ContainerDied","Data":"3ceebdeacf394149f4b2c9eadd7e4a6ac8a11dbe0f71b2fac5f193c6a049d9f0"} Jan 28 20:45:37 crc kubenswrapper[4746]: I0128 20:45:37.147734 4746 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-525kj" event={"ID":"13f17d99-49bb-4710-8d6b-e01933b5d396","Type":"ContainerStarted","Data":"6db160a8f6c3902a7846233f5e3e4de70118b98d1ec2b12e78cf75e58a35b67a"} Jan 28 20:45:38 crc kubenswrapper[4746]: I0128 20:45:38.157275 4746 generic.go:334] "Generic (PLEG): container finished" podID="13f17d99-49bb-4710-8d6b-e01933b5d396" containerID="04aa225a227adc0e87333ebad82eff21319d3e60d5d98b12c6b056811d9b2cf6" exitCode=0 Jan 28 20:45:38 crc kubenswrapper[4746]: I0128 20:45:38.157550 
4746 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-525kj" event={"ID":"13f17d99-49bb-4710-8d6b-e01933b5d396","Type":"ContainerDied","Data":"04aa225a227adc0e87333ebad82eff21319d3e60d5d98b12c6b056811d9b2cf6"} Jan 28 20:45:38 crc kubenswrapper[4746]: I0128 20:45:38.159841 4746 generic.go:334] "Generic (PLEG): container finished" podID="b5a70df4-251e-4d72-b220-9772f0b70727" containerID="8a14af09e46bf1f74810f8efbe07c01dc9b39174909690dcdeba0c17796fd26f" exitCode=0 Jan 28 20:45:38 crc kubenswrapper[4746]: I0128 20:45:38.160699 4746 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-hp8gs" event={"ID":"b5a70df4-251e-4d72-b220-9772f0b70727","Type":"ContainerDied","Data":"8a14af09e46bf1f74810f8efbe07c01dc9b39174909690dcdeba0c17796fd26f"} Jan 28 20:45:38 crc kubenswrapper[4746]: I0128 20:45:38.160757 4746 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-hp8gs" event={"ID":"b5a70df4-251e-4d72-b220-9772f0b70727","Type":"ContainerStarted","Data":"efa3521a931ac8891b91ffc82005fafa278880a191cf26af125b4554ca561068"} Jan 28 20:45:38 crc kubenswrapper[4746]: I0128 20:45:38.499373 4746 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/certified-operators-s6mqb"] Jan 28 20:45:38 crc kubenswrapper[4746]: I0128 20:45:38.500909 4746 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-s6mqb" Jan 28 20:45:38 crc kubenswrapper[4746]: I0128 20:45:38.512667 4746 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"certified-operators-dockercfg-4rs5g" Jan 28 20:45:38 crc kubenswrapper[4746]: I0128 20:45:38.518007 4746 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-s6mqb"] Jan 28 20:45:38 crc kubenswrapper[4746]: I0128 20:45:38.575792 4746 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/f854deb7-4783-4ba8-8357-ffe4d1124a12-utilities\") pod \"certified-operators-s6mqb\" (UID: \"f854deb7-4783-4ba8-8357-ffe4d1124a12\") " pod="openshift-marketplace/certified-operators-s6mqb" Jan 28 20:45:38 crc kubenswrapper[4746]: I0128 20:45:38.575840 4746 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-zn9jz\" (UniqueName: \"kubernetes.io/projected/f854deb7-4783-4ba8-8357-ffe4d1124a12-kube-api-access-zn9jz\") pod \"certified-operators-s6mqb\" (UID: \"f854deb7-4783-4ba8-8357-ffe4d1124a12\") " pod="openshift-marketplace/certified-operators-s6mqb" Jan 28 20:45:38 crc kubenswrapper[4746]: I0128 20:45:38.575880 4746 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/f854deb7-4783-4ba8-8357-ffe4d1124a12-catalog-content\") pod \"certified-operators-s6mqb\" (UID: \"f854deb7-4783-4ba8-8357-ffe4d1124a12\") " pod="openshift-marketplace/certified-operators-s6mqb" Jan 28 20:45:38 crc kubenswrapper[4746]: I0128 20:45:38.677613 4746 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/f854deb7-4783-4ba8-8357-ffe4d1124a12-catalog-content\") pod \"certified-operators-s6mqb\" (UID: 
\"f854deb7-4783-4ba8-8357-ffe4d1124a12\") " pod="openshift-marketplace/certified-operators-s6mqb" Jan 28 20:45:38 crc kubenswrapper[4746]: I0128 20:45:38.677753 4746 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/f854deb7-4783-4ba8-8357-ffe4d1124a12-utilities\") pod \"certified-operators-s6mqb\" (UID: \"f854deb7-4783-4ba8-8357-ffe4d1124a12\") " pod="openshift-marketplace/certified-operators-s6mqb" Jan 28 20:45:38 crc kubenswrapper[4746]: I0128 20:45:38.677787 4746 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-zn9jz\" (UniqueName: \"kubernetes.io/projected/f854deb7-4783-4ba8-8357-ffe4d1124a12-kube-api-access-zn9jz\") pod \"certified-operators-s6mqb\" (UID: \"f854deb7-4783-4ba8-8357-ffe4d1124a12\") " pod="openshift-marketplace/certified-operators-s6mqb" Jan 28 20:45:38 crc kubenswrapper[4746]: I0128 20:45:38.678231 4746 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/f854deb7-4783-4ba8-8357-ffe4d1124a12-catalog-content\") pod \"certified-operators-s6mqb\" (UID: \"f854deb7-4783-4ba8-8357-ffe4d1124a12\") " pod="openshift-marketplace/certified-operators-s6mqb" Jan 28 20:45:38 crc kubenswrapper[4746]: I0128 20:45:38.678653 4746 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/f854deb7-4783-4ba8-8357-ffe4d1124a12-utilities\") pod \"certified-operators-s6mqb\" (UID: \"f854deb7-4783-4ba8-8357-ffe4d1124a12\") " pod="openshift-marketplace/certified-operators-s6mqb" Jan 28 20:45:38 crc kubenswrapper[4746]: I0128 20:45:38.700713 4746 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/community-operators-88s2s"] Jan 28 20:45:38 crc kubenswrapper[4746]: I0128 20:45:38.702617 4746 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-88s2s" Jan 28 20:45:38 crc kubenswrapper[4746]: I0128 20:45:38.704110 4746 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-zn9jz\" (UniqueName: \"kubernetes.io/projected/f854deb7-4783-4ba8-8357-ffe4d1124a12-kube-api-access-zn9jz\") pod \"certified-operators-s6mqb\" (UID: \"f854deb7-4783-4ba8-8357-ffe4d1124a12\") " pod="openshift-marketplace/certified-operators-s6mqb" Jan 28 20:45:38 crc kubenswrapper[4746]: I0128 20:45:38.705509 4746 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"community-operators-dockercfg-dmngl" Jan 28 20:45:38 crc kubenswrapper[4746]: I0128 20:45:38.711947 4746 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-88s2s"] Jan 28 20:45:38 crc kubenswrapper[4746]: I0128 20:45:38.826986 4746 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-s6mqb" Jan 28 20:45:38 crc kubenswrapper[4746]: I0128 20:45:38.884568 4746 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-mv5tl\" (UniqueName: \"kubernetes.io/projected/2f7998d2-06e5-4567-8502-79dadd37daec-kube-api-access-mv5tl\") pod \"community-operators-88s2s\" (UID: \"2f7998d2-06e5-4567-8502-79dadd37daec\") " pod="openshift-marketplace/community-operators-88s2s" Jan 28 20:45:38 crc kubenswrapper[4746]: I0128 20:45:38.885096 4746 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/2f7998d2-06e5-4567-8502-79dadd37daec-utilities\") pod \"community-operators-88s2s\" (UID: \"2f7998d2-06e5-4567-8502-79dadd37daec\") " pod="openshift-marketplace/community-operators-88s2s" Jan 28 20:45:38 crc kubenswrapper[4746]: I0128 20:45:38.885120 4746 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/2f7998d2-06e5-4567-8502-79dadd37daec-catalog-content\") pod \"community-operators-88s2s\" (UID: \"2f7998d2-06e5-4567-8502-79dadd37daec\") " pod="openshift-marketplace/community-operators-88s2s" Jan 28 20:45:38 crc kubenswrapper[4746]: I0128 20:45:38.986371 4746 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/2f7998d2-06e5-4567-8502-79dadd37daec-utilities\") pod \"community-operators-88s2s\" (UID: \"2f7998d2-06e5-4567-8502-79dadd37daec\") " pod="openshift-marketplace/community-operators-88s2s" Jan 28 20:45:38 crc kubenswrapper[4746]: I0128 20:45:38.986414 4746 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/2f7998d2-06e5-4567-8502-79dadd37daec-catalog-content\") pod \"community-operators-88s2s\" (UID: \"2f7998d2-06e5-4567-8502-79dadd37daec\") " pod="openshift-marketplace/community-operators-88s2s" Jan 28 20:45:38 crc kubenswrapper[4746]: I0128 20:45:38.986488 4746 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-mv5tl\" (UniqueName: \"kubernetes.io/projected/2f7998d2-06e5-4567-8502-79dadd37daec-kube-api-access-mv5tl\") pod \"community-operators-88s2s\" (UID: \"2f7998d2-06e5-4567-8502-79dadd37daec\") " pod="openshift-marketplace/community-operators-88s2s" Jan 28 20:45:38 crc kubenswrapper[4746]: I0128 20:45:38.986954 4746 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/2f7998d2-06e5-4567-8502-79dadd37daec-utilities\") pod \"community-operators-88s2s\" (UID: \"2f7998d2-06e5-4567-8502-79dadd37daec\") " pod="openshift-marketplace/community-operators-88s2s" Jan 28 20:45:38 crc kubenswrapper[4746]: I0128 20:45:38.986954 4746 operation_generator.go:637] 
"MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/2f7998d2-06e5-4567-8502-79dadd37daec-catalog-content\") pod \"community-operators-88s2s\" (UID: \"2f7998d2-06e5-4567-8502-79dadd37daec\") " pod="openshift-marketplace/community-operators-88s2s" Jan 28 20:45:39 crc kubenswrapper[4746]: I0128 20:45:39.011697 4746 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-mv5tl\" (UniqueName: \"kubernetes.io/projected/2f7998d2-06e5-4567-8502-79dadd37daec-kube-api-access-mv5tl\") pod \"community-operators-88s2s\" (UID: \"2f7998d2-06e5-4567-8502-79dadd37daec\") " pod="openshift-marketplace/community-operators-88s2s" Jan 28 20:45:39 crc kubenswrapper[4746]: I0128 20:45:39.061187 4746 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-88s2s" Jan 28 20:45:39 crc kubenswrapper[4746]: I0128 20:45:39.170261 4746 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-hp8gs" event={"ID":"b5a70df4-251e-4d72-b220-9772f0b70727","Type":"ContainerStarted","Data":"fd0180da72b1d94f4e7566f9896afa4f50b1eecb776290499fe115bba7f04b3d"} Jan 28 20:45:39 crc kubenswrapper[4746]: I0128 20:45:39.178582 4746 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-525kj" event={"ID":"13f17d99-49bb-4710-8d6b-e01933b5d396","Type":"ContainerStarted","Data":"83fcccb9809160fee3793908e81c2af023e162caf6d6326c4dcf7d1eeeba9289"} Jan 28 20:45:39 crc kubenswrapper[4746]: I0128 20:45:39.226842 4746 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-marketplace-525kj" podStartSLOduration=1.779423209 podStartE2EDuration="3.226824089s" podCreationTimestamp="2026-01-28 20:45:36 +0000 UTC" firstStartedPulling="2026-01-28 20:45:37.149158532 +0000 UTC m=+365.105344886" lastFinishedPulling="2026-01-28 20:45:38.596559412 +0000 UTC m=+366.552745766" 
observedRunningTime="2026-01-28 20:45:39.221185271 +0000 UTC m=+367.177371625" watchObservedRunningTime="2026-01-28 20:45:39.226824089 +0000 UTC m=+367.183010443" Jan 28 20:45:39 crc kubenswrapper[4746]: I0128 20:45:39.292414 4746 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-s6mqb"] Jan 28 20:45:39 crc kubenswrapper[4746]: W0128 20:45:39.301107 4746 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podf854deb7_4783_4ba8_8357_ffe4d1124a12.slice/crio-413fa9a2c20067530673b39d23f12794682407c91bc2aeec0469025bf0dddada WatchSource:0}: Error finding container 413fa9a2c20067530673b39d23f12794682407c91bc2aeec0469025bf0dddada: Status 404 returned error can't find the container with id 413fa9a2c20067530673b39d23f12794682407c91bc2aeec0469025bf0dddada Jan 28 20:45:39 crc kubenswrapper[4746]: I0128 20:45:39.486010 4746 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-88s2s"] Jan 28 20:45:39 crc kubenswrapper[4746]: W0128 20:45:39.515833 4746 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod2f7998d2_06e5_4567_8502_79dadd37daec.slice/crio-d5f6bc8a861feb6422d15d73e40085348dec960ad0fba7150c1a825053353132 WatchSource:0}: Error finding container d5f6bc8a861feb6422d15d73e40085348dec960ad0fba7150c1a825053353132: Status 404 returned error can't find the container with id d5f6bc8a861feb6422d15d73e40085348dec960ad0fba7150c1a825053353132 Jan 28 20:45:40 crc kubenswrapper[4746]: I0128 20:45:40.189307 4746 generic.go:334] "Generic (PLEG): container finished" podID="b5a70df4-251e-4d72-b220-9772f0b70727" containerID="fd0180da72b1d94f4e7566f9896afa4f50b1eecb776290499fe115bba7f04b3d" exitCode=0 Jan 28 20:45:40 crc kubenswrapper[4746]: I0128 20:45:40.189396 4746 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openshift-marketplace/redhat-operators-hp8gs" event={"ID":"b5a70df4-251e-4d72-b220-9772f0b70727","Type":"ContainerDied","Data":"fd0180da72b1d94f4e7566f9896afa4f50b1eecb776290499fe115bba7f04b3d"} Jan 28 20:45:40 crc kubenswrapper[4746]: I0128 20:45:40.193622 4746 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-s6mqb" event={"ID":"f854deb7-4783-4ba8-8357-ffe4d1124a12","Type":"ContainerDied","Data":"fc089671a7b7efa345349612efbe95f8f41036144c13d84a03aca4e0aa1da969"} Jan 28 20:45:40 crc kubenswrapper[4746]: I0128 20:45:40.193562 4746 generic.go:334] "Generic (PLEG): container finished" podID="f854deb7-4783-4ba8-8357-ffe4d1124a12" containerID="fc089671a7b7efa345349612efbe95f8f41036144c13d84a03aca4e0aa1da969" exitCode=0 Jan 28 20:45:40 crc kubenswrapper[4746]: I0128 20:45:40.193883 4746 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-s6mqb" event={"ID":"f854deb7-4783-4ba8-8357-ffe4d1124a12","Type":"ContainerStarted","Data":"413fa9a2c20067530673b39d23f12794682407c91bc2aeec0469025bf0dddada"} Jan 28 20:45:40 crc kubenswrapper[4746]: I0128 20:45:40.198421 4746 generic.go:334] "Generic (PLEG): container finished" podID="2f7998d2-06e5-4567-8502-79dadd37daec" containerID="c470b7c9d0247bb7f0c5a6c290e638b4b01fcb6ed1358d7876525bd7eed3baef" exitCode=0 Jan 28 20:45:40 crc kubenswrapper[4746]: I0128 20:45:40.199612 4746 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-88s2s" event={"ID":"2f7998d2-06e5-4567-8502-79dadd37daec","Type":"ContainerDied","Data":"c470b7c9d0247bb7f0c5a6c290e638b4b01fcb6ed1358d7876525bd7eed3baef"} Jan 28 20:45:40 crc kubenswrapper[4746]: I0128 20:45:40.199637 4746 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-88s2s" 
event={"ID":"2f7998d2-06e5-4567-8502-79dadd37daec","Type":"ContainerStarted","Data":"d5f6bc8a861feb6422d15d73e40085348dec960ad0fba7150c1a825053353132"} Jan 28 20:45:42 crc kubenswrapper[4746]: I0128 20:45:42.228408 4746 generic.go:334] "Generic (PLEG): container finished" podID="2f7998d2-06e5-4567-8502-79dadd37daec" containerID="2ea5cc17978ea75750badf376cea48cb3fda092f44fdc2342f800669531e3b98" exitCode=0 Jan 28 20:45:42 crc kubenswrapper[4746]: I0128 20:45:42.228651 4746 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-88s2s" event={"ID":"2f7998d2-06e5-4567-8502-79dadd37daec","Type":"ContainerDied","Data":"2ea5cc17978ea75750badf376cea48cb3fda092f44fdc2342f800669531e3b98"} Jan 28 20:45:42 crc kubenswrapper[4746]: I0128 20:45:42.236430 4746 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-hp8gs" event={"ID":"b5a70df4-251e-4d72-b220-9772f0b70727","Type":"ContainerStarted","Data":"dfddfa5c9047a2ef7cd63544d1c7f24a99691b3d15f2e413ada6695095233ada"} Jan 28 20:45:42 crc kubenswrapper[4746]: I0128 20:45:42.240612 4746 generic.go:334] "Generic (PLEG): container finished" podID="f854deb7-4783-4ba8-8357-ffe4d1124a12" containerID="d5b2313d4b6fc875d1d2879c981862d46af0d11e2a6849219d9ed8db75b88804" exitCode=0 Jan 28 20:45:42 crc kubenswrapper[4746]: I0128 20:45:42.240678 4746 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-s6mqb" event={"ID":"f854deb7-4783-4ba8-8357-ffe4d1124a12","Type":"ContainerDied","Data":"d5b2313d4b6fc875d1d2879c981862d46af0d11e2a6849219d9ed8db75b88804"} Jan 28 20:45:42 crc kubenswrapper[4746]: I0128 20:45:42.296915 4746 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-operators-hp8gs" podStartSLOduration=3.40243427 podStartE2EDuration="6.296885241s" podCreationTimestamp="2026-01-28 20:45:36 +0000 UTC" firstStartedPulling="2026-01-28 20:45:38.161331352 +0000 UTC 
m=+366.117517706" lastFinishedPulling="2026-01-28 20:45:41.055782323 +0000 UTC m=+369.011968677" observedRunningTime="2026-01-28 20:45:42.278251298 +0000 UTC m=+370.234437652" watchObservedRunningTime="2026-01-28 20:45:42.296885241 +0000 UTC m=+370.253071615" Jan 28 20:45:43 crc kubenswrapper[4746]: I0128 20:45:43.250281 4746 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-s6mqb" event={"ID":"f854deb7-4783-4ba8-8357-ffe4d1124a12","Type":"ContainerStarted","Data":"dcefab010bdb0c8724a3b3e438a70f00b63153f577ea95b2a1a2f6aaef94bd83"} Jan 28 20:45:43 crc kubenswrapper[4746]: I0128 20:45:43.253937 4746 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-88s2s" event={"ID":"2f7998d2-06e5-4567-8502-79dadd37daec","Type":"ContainerStarted","Data":"240630b3f929a7174a368a00a6d6b5373aacd39f665bb7935f5d4e3915a0e412"} Jan 28 20:45:43 crc kubenswrapper[4746]: I0128 20:45:43.272275 4746 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/certified-operators-s6mqb" podStartSLOduration=2.806487552 podStartE2EDuration="5.272247567s" podCreationTimestamp="2026-01-28 20:45:38 +0000 UTC" firstStartedPulling="2026-01-28 20:45:40.197452082 +0000 UTC m=+368.153638436" lastFinishedPulling="2026-01-28 20:45:42.663212097 +0000 UTC m=+370.619398451" observedRunningTime="2026-01-28 20:45:43.26913195 +0000 UTC m=+371.225318304" watchObservedRunningTime="2026-01-28 20:45:43.272247567 +0000 UTC m=+371.228433931" Jan 28 20:45:43 crc kubenswrapper[4746]: I0128 20:45:43.293384 4746 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/community-operators-88s2s" podStartSLOduration=2.7814693310000003 podStartE2EDuration="5.29335422s" podCreationTimestamp="2026-01-28 20:45:38 +0000 UTC" firstStartedPulling="2026-01-28 20:45:40.200126908 +0000 UTC m=+368.156313272" lastFinishedPulling="2026-01-28 20:45:42.712011807 +0000 UTC 
m=+370.668198161" observedRunningTime="2026-01-28 20:45:43.289853682 +0000 UTC m=+371.246040036" watchObservedRunningTime="2026-01-28 20:45:43.29335422 +0000 UTC m=+371.249540584" Jan 28 20:45:45 crc kubenswrapper[4746]: I0128 20:45:45.872166 4746 patch_prober.go:28] interesting pod/machine-config-daemon-6wrnw container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Jan 28 20:45:45 crc kubenswrapper[4746]: I0128 20:45:45.872936 4746 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-6wrnw" podUID="6dc8b546-9734-4082-b2b3-2bafe3f1564d" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Jan 28 20:45:46 crc kubenswrapper[4746]: I0128 20:45:46.486717 4746 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-marketplace-525kj" Jan 28 20:45:46 crc kubenswrapper[4746]: I0128 20:45:46.487316 4746 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-marketplace-525kj" Jan 28 20:45:46 crc kubenswrapper[4746]: I0128 20:45:46.549775 4746 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-marketplace-525kj" Jan 28 20:45:46 crc kubenswrapper[4746]: I0128 20:45:46.632910 4746 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-operators-hp8gs" Jan 28 20:45:46 crc kubenswrapper[4746]: I0128 20:45:46.633160 4746 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-operators-hp8gs" Jan 28 20:45:47 crc kubenswrapper[4746]: I0128 20:45:47.334768 4746 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" 
status="ready" pod="openshift-marketplace/redhat-marketplace-525kj" Jan 28 20:45:47 crc kubenswrapper[4746]: I0128 20:45:47.685823 4746 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-marketplace/redhat-operators-hp8gs" podUID="b5a70df4-251e-4d72-b220-9772f0b70727" containerName="registry-server" probeResult="failure" output=< Jan 28 20:45:47 crc kubenswrapper[4746]: timeout: failed to connect service ":50051" within 1s Jan 28 20:45:47 crc kubenswrapper[4746]: > Jan 28 20:45:48 crc kubenswrapper[4746]: I0128 20:45:48.827864 4746 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/certified-operators-s6mqb" Jan 28 20:45:48 crc kubenswrapper[4746]: I0128 20:45:48.828492 4746 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/certified-operators-s6mqb" Jan 28 20:45:48 crc kubenswrapper[4746]: I0128 20:45:48.881309 4746 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/certified-operators-s6mqb" Jan 28 20:45:49 crc kubenswrapper[4746]: I0128 20:45:49.062234 4746 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/community-operators-88s2s" Jan 28 20:45:49 crc kubenswrapper[4746]: I0128 20:45:49.062331 4746 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/community-operators-88s2s" Jan 28 20:45:49 crc kubenswrapper[4746]: I0128 20:45:49.126869 4746 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/community-operators-88s2s" Jan 28 20:45:49 crc kubenswrapper[4746]: I0128 20:45:49.340187 4746 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/certified-operators-s6mqb" Jan 28 20:45:49 crc kubenswrapper[4746]: I0128 20:45:49.358993 4746 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" 
pod="openshift-marketplace/community-operators-88s2s" Jan 28 20:45:50 crc kubenswrapper[4746]: I0128 20:45:50.310236 4746 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-image-registry/image-registry-697d97f7c8-lwhk4" podUID="627f2e7c-f091-4ea2-9c3c-fce02f2b7669" containerName="registry" containerID="cri-o://d9c0f04370f1bdb461f2c7bd32fdc1dac6f456ff6b66d902ff14df19d51c7911" gracePeriod=30 Jan 28 20:45:51 crc kubenswrapper[4746]: I0128 20:45:51.307045 4746 generic.go:334] "Generic (PLEG): container finished" podID="627f2e7c-f091-4ea2-9c3c-fce02f2b7669" containerID="d9c0f04370f1bdb461f2c7bd32fdc1dac6f456ff6b66d902ff14df19d51c7911" exitCode=0 Jan 28 20:45:51 crc kubenswrapper[4746]: I0128 20:45:51.307156 4746 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/image-registry-697d97f7c8-lwhk4" event={"ID":"627f2e7c-f091-4ea2-9c3c-fce02f2b7669","Type":"ContainerDied","Data":"d9c0f04370f1bdb461f2c7bd32fdc1dac6f456ff6b66d902ff14df19d51c7911"} Jan 28 20:45:51 crc kubenswrapper[4746]: I0128 20:45:51.989566 4746 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-image-registry/image-registry-697d97f7c8-lwhk4" Jan 28 20:45:52 crc kubenswrapper[4746]: I0128 20:45:52.104346 4746 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-lwdsp\" (UniqueName: \"kubernetes.io/projected/627f2e7c-f091-4ea2-9c3c-fce02f2b7669-kube-api-access-lwdsp\") pod \"627f2e7c-f091-4ea2-9c3c-fce02f2b7669\" (UID: \"627f2e7c-f091-4ea2-9c3c-fce02f2b7669\") " Jan 28 20:45:52 crc kubenswrapper[4746]: I0128 20:45:52.104448 4746 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/627f2e7c-f091-4ea2-9c3c-fce02f2b7669-ca-trust-extracted\") pod \"627f2e7c-f091-4ea2-9c3c-fce02f2b7669\" (UID: \"627f2e7c-f091-4ea2-9c3c-fce02f2b7669\") " Jan 28 20:45:52 crc kubenswrapper[4746]: I0128 20:45:52.104480 4746 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/627f2e7c-f091-4ea2-9c3c-fce02f2b7669-trusted-ca\") pod \"627f2e7c-f091-4ea2-9c3c-fce02f2b7669\" (UID: \"627f2e7c-f091-4ea2-9c3c-fce02f2b7669\") " Jan 28 20:45:52 crc kubenswrapper[4746]: I0128 20:45:52.104581 4746 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/627f2e7c-f091-4ea2-9c3c-fce02f2b7669-bound-sa-token\") pod \"627f2e7c-f091-4ea2-9c3c-fce02f2b7669\" (UID: \"627f2e7c-f091-4ea2-9c3c-fce02f2b7669\") " Jan 28 20:45:52 crc kubenswrapper[4746]: I0128 20:45:52.104632 4746 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/627f2e7c-f091-4ea2-9c3c-fce02f2b7669-installation-pull-secrets\") pod \"627f2e7c-f091-4ea2-9c3c-fce02f2b7669\" (UID: \"627f2e7c-f091-4ea2-9c3c-fce02f2b7669\") " Jan 28 20:45:52 crc kubenswrapper[4746]: I0128 20:45:52.104656 4746 reconciler_common.go:159] 
"operationExecutor.UnmountVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/627f2e7c-f091-4ea2-9c3c-fce02f2b7669-registry-tls\") pod \"627f2e7c-f091-4ea2-9c3c-fce02f2b7669\" (UID: \"627f2e7c-f091-4ea2-9c3c-fce02f2b7669\") " Jan 28 20:45:52 crc kubenswrapper[4746]: I0128 20:45:52.104973 4746 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"registry-storage\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"627f2e7c-f091-4ea2-9c3c-fce02f2b7669\" (UID: \"627f2e7c-f091-4ea2-9c3c-fce02f2b7669\") " Jan 28 20:45:52 crc kubenswrapper[4746]: I0128 20:45:52.104999 4746 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/627f2e7c-f091-4ea2-9c3c-fce02f2b7669-registry-certificates\") pod \"627f2e7c-f091-4ea2-9c3c-fce02f2b7669\" (UID: \"627f2e7c-f091-4ea2-9c3c-fce02f2b7669\") " Jan 28 20:45:52 crc kubenswrapper[4746]: I0128 20:45:52.105463 4746 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/627f2e7c-f091-4ea2-9c3c-fce02f2b7669-trusted-ca" (OuterVolumeSpecName: "trusted-ca") pod "627f2e7c-f091-4ea2-9c3c-fce02f2b7669" (UID: "627f2e7c-f091-4ea2-9c3c-fce02f2b7669"). InnerVolumeSpecName "trusted-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 28 20:45:52 crc kubenswrapper[4746]: I0128 20:45:52.105682 4746 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/627f2e7c-f091-4ea2-9c3c-fce02f2b7669-registry-certificates" (OuterVolumeSpecName: "registry-certificates") pod "627f2e7c-f091-4ea2-9c3c-fce02f2b7669" (UID: "627f2e7c-f091-4ea2-9c3c-fce02f2b7669"). InnerVolumeSpecName "registry-certificates". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 28 20:45:52 crc kubenswrapper[4746]: I0128 20:45:52.107174 4746 reconciler_common.go:293] "Volume detached for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/627f2e7c-f091-4ea2-9c3c-fce02f2b7669-registry-certificates\") on node \"crc\" DevicePath \"\"" Jan 28 20:45:52 crc kubenswrapper[4746]: I0128 20:45:52.107211 4746 reconciler_common.go:293] "Volume detached for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/627f2e7c-f091-4ea2-9c3c-fce02f2b7669-trusted-ca\") on node \"crc\" DevicePath \"\"" Jan 28 20:45:52 crc kubenswrapper[4746]: I0128 20:45:52.111541 4746 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/627f2e7c-f091-4ea2-9c3c-fce02f2b7669-kube-api-access-lwdsp" (OuterVolumeSpecName: "kube-api-access-lwdsp") pod "627f2e7c-f091-4ea2-9c3c-fce02f2b7669" (UID: "627f2e7c-f091-4ea2-9c3c-fce02f2b7669"). InnerVolumeSpecName "kube-api-access-lwdsp". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 28 20:45:52 crc kubenswrapper[4746]: I0128 20:45:52.112962 4746 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/627f2e7c-f091-4ea2-9c3c-fce02f2b7669-bound-sa-token" (OuterVolumeSpecName: "bound-sa-token") pod "627f2e7c-f091-4ea2-9c3c-fce02f2b7669" (UID: "627f2e7c-f091-4ea2-9c3c-fce02f2b7669"). InnerVolumeSpecName "bound-sa-token". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 28 20:45:52 crc kubenswrapper[4746]: I0128 20:45:52.113396 4746 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/627f2e7c-f091-4ea2-9c3c-fce02f2b7669-registry-tls" (OuterVolumeSpecName: "registry-tls") pod "627f2e7c-f091-4ea2-9c3c-fce02f2b7669" (UID: "627f2e7c-f091-4ea2-9c3c-fce02f2b7669"). InnerVolumeSpecName "registry-tls". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 28 20:45:52 crc kubenswrapper[4746]: I0128 20:45:52.117234 4746 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (OuterVolumeSpecName: "registry-storage") pod "627f2e7c-f091-4ea2-9c3c-fce02f2b7669" (UID: "627f2e7c-f091-4ea2-9c3c-fce02f2b7669"). InnerVolumeSpecName "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8". PluginName "kubernetes.io/csi", VolumeGidValue "" Jan 28 20:45:52 crc kubenswrapper[4746]: I0128 20:45:52.117453 4746 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/627f2e7c-f091-4ea2-9c3c-fce02f2b7669-installation-pull-secrets" (OuterVolumeSpecName: "installation-pull-secrets") pod "627f2e7c-f091-4ea2-9c3c-fce02f2b7669" (UID: "627f2e7c-f091-4ea2-9c3c-fce02f2b7669"). InnerVolumeSpecName "installation-pull-secrets". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 28 20:45:52 crc kubenswrapper[4746]: I0128 20:45:52.127395 4746 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/627f2e7c-f091-4ea2-9c3c-fce02f2b7669-ca-trust-extracted" (OuterVolumeSpecName: "ca-trust-extracted") pod "627f2e7c-f091-4ea2-9c3c-fce02f2b7669" (UID: "627f2e7c-f091-4ea2-9c3c-fce02f2b7669"). InnerVolumeSpecName "ca-trust-extracted". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 28 20:45:52 crc kubenswrapper[4746]: I0128 20:45:52.208434 4746 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-lwdsp\" (UniqueName: \"kubernetes.io/projected/627f2e7c-f091-4ea2-9c3c-fce02f2b7669-kube-api-access-lwdsp\") on node \"crc\" DevicePath \"\"" Jan 28 20:45:52 crc kubenswrapper[4746]: I0128 20:45:52.208481 4746 reconciler_common.go:293] "Volume detached for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/627f2e7c-f091-4ea2-9c3c-fce02f2b7669-ca-trust-extracted\") on node \"crc\" DevicePath \"\"" Jan 28 20:45:52 crc kubenswrapper[4746]: I0128 20:45:52.208490 4746 reconciler_common.go:293] "Volume detached for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/627f2e7c-f091-4ea2-9c3c-fce02f2b7669-bound-sa-token\") on node \"crc\" DevicePath \"\"" Jan 28 20:45:52 crc kubenswrapper[4746]: I0128 20:45:52.208499 4746 reconciler_common.go:293] "Volume detached for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/627f2e7c-f091-4ea2-9c3c-fce02f2b7669-installation-pull-secrets\") on node \"crc\" DevicePath \"\"" Jan 28 20:45:52 crc kubenswrapper[4746]: I0128 20:45:52.208510 4746 reconciler_common.go:293] "Volume detached for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/627f2e7c-f091-4ea2-9c3c-fce02f2b7669-registry-tls\") on node \"crc\" DevicePath \"\"" Jan 28 20:45:52 crc kubenswrapper[4746]: I0128 20:45:52.322196 4746 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/image-registry-697d97f7c8-lwhk4" event={"ID":"627f2e7c-f091-4ea2-9c3c-fce02f2b7669","Type":"ContainerDied","Data":"d0c002ddefd779f35ff45eca4808099be622b3caba190689f19423097af3c16d"} Jan 28 20:45:52 crc kubenswrapper[4746]: I0128 20:45:52.322271 4746 scope.go:117] "RemoveContainer" containerID="d9c0f04370f1bdb461f2c7bd32fdc1dac6f456ff6b66d902ff14df19d51c7911" Jan 28 20:45:52 crc kubenswrapper[4746]: I0128 
20:45:52.322375 4746 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-image-registry/image-registry-697d97f7c8-lwhk4" Jan 28 20:45:52 crc kubenswrapper[4746]: I0128 20:45:52.367794 4746 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-image-registry/image-registry-697d97f7c8-lwhk4"] Jan 28 20:45:52 crc kubenswrapper[4746]: I0128 20:45:52.375934 4746 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-image-registry/image-registry-697d97f7c8-lwhk4"] Jan 28 20:45:52 crc kubenswrapper[4746]: I0128 20:45:52.842476 4746 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="627f2e7c-f091-4ea2-9c3c-fce02f2b7669" path="/var/lib/kubelet/pods/627f2e7c-f091-4ea2-9c3c-fce02f2b7669/volumes" Jan 28 20:45:56 crc kubenswrapper[4746]: I0128 20:45:56.676059 4746 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-operators-hp8gs" Jan 28 20:45:56 crc kubenswrapper[4746]: I0128 20:45:56.733636 4746 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-operators-hp8gs" Jan 28 20:46:15 crc kubenswrapper[4746]: I0128 20:46:15.871446 4746 patch_prober.go:28] interesting pod/machine-config-daemon-6wrnw container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Jan 28 20:46:15 crc kubenswrapper[4746]: I0128 20:46:15.872444 4746 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-6wrnw" podUID="6dc8b546-9734-4082-b2b3-2bafe3f1564d" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Jan 28 20:46:45 crc kubenswrapper[4746]: I0128 20:46:45.871288 4746 patch_prober.go:28] interesting 
pod/machine-config-daemon-6wrnw container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Jan 28 20:46:45 crc kubenswrapper[4746]: I0128 20:46:45.871872 4746 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-6wrnw" podUID="6dc8b546-9734-4082-b2b3-2bafe3f1564d" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Jan 28 20:46:45 crc kubenswrapper[4746]: I0128 20:46:45.871927 4746 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-6wrnw" Jan 28 20:46:45 crc kubenswrapper[4746]: I0128 20:46:45.872679 4746 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"f197cfecdfc241f837aa378849ac18dd600cbcb4c925257a8bc1cdfb97e289aa"} pod="openshift-machine-config-operator/machine-config-daemon-6wrnw" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted" Jan 28 20:46:45 crc kubenswrapper[4746]: I0128 20:46:45.872730 4746 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-6wrnw" podUID="6dc8b546-9734-4082-b2b3-2bafe3f1564d" containerName="machine-config-daemon" containerID="cri-o://f197cfecdfc241f837aa378849ac18dd600cbcb4c925257a8bc1cdfb97e289aa" gracePeriod=600 Jan 28 20:46:46 crc kubenswrapper[4746]: I0128 20:46:46.724345 4746 generic.go:334] "Generic (PLEG): container finished" podID="6dc8b546-9734-4082-b2b3-2bafe3f1564d" containerID="f197cfecdfc241f837aa378849ac18dd600cbcb4c925257a8bc1cdfb97e289aa" exitCode=0 Jan 28 20:46:46 crc kubenswrapper[4746]: I0128 20:46:46.724510 
4746 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-6wrnw" event={"ID":"6dc8b546-9734-4082-b2b3-2bafe3f1564d","Type":"ContainerDied","Data":"f197cfecdfc241f837aa378849ac18dd600cbcb4c925257a8bc1cdfb97e289aa"} Jan 28 20:46:46 crc kubenswrapper[4746]: I0128 20:46:46.724791 4746 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-6wrnw" event={"ID":"6dc8b546-9734-4082-b2b3-2bafe3f1564d","Type":"ContainerStarted","Data":"e798c8759f80b0bd2201227041fd066d21ab146038c8bedf3a5228a982d21b64"} Jan 28 20:46:46 crc kubenswrapper[4746]: I0128 20:46:46.724814 4746 scope.go:117] "RemoveContainer" containerID="66d250466155027405bf90c4e5ed09388238d4cee63604e28486a16778f9d188" Jan 28 20:49:15 crc kubenswrapper[4746]: I0128 20:49:15.871220 4746 patch_prober.go:28] interesting pod/machine-config-daemon-6wrnw container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Jan 28 20:49:15 crc kubenswrapper[4746]: I0128 20:49:15.872471 4746 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-6wrnw" podUID="6dc8b546-9734-4082-b2b3-2bafe3f1564d" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Jan 28 20:49:45 crc kubenswrapper[4746]: I0128 20:49:45.872016 4746 patch_prober.go:28] interesting pod/machine-config-daemon-6wrnw container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Jan 28 20:49:45 crc kubenswrapper[4746]: I0128 20:49:45.872877 4746 prober.go:107] "Probe failed" 
probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-6wrnw" podUID="6dc8b546-9734-4082-b2b3-2bafe3f1564d" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Jan 28 20:50:15 crc kubenswrapper[4746]: I0128 20:50:15.872143 4746 patch_prober.go:28] interesting pod/machine-config-daemon-6wrnw container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Jan 28 20:50:15 crc kubenswrapper[4746]: I0128 20:50:15.873140 4746 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-6wrnw" podUID="6dc8b546-9734-4082-b2b3-2bafe3f1564d" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Jan 28 20:50:15 crc kubenswrapper[4746]: I0128 20:50:15.873222 4746 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-6wrnw" Jan 28 20:50:15 crc kubenswrapper[4746]: I0128 20:50:15.874292 4746 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"e798c8759f80b0bd2201227041fd066d21ab146038c8bedf3a5228a982d21b64"} pod="openshift-machine-config-operator/machine-config-daemon-6wrnw" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted" Jan 28 20:50:15 crc kubenswrapper[4746]: I0128 20:50:15.874403 4746 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-6wrnw" podUID="6dc8b546-9734-4082-b2b3-2bafe3f1564d" containerName="machine-config-daemon" 
containerID="cri-o://e798c8759f80b0bd2201227041fd066d21ab146038c8bedf3a5228a982d21b64" gracePeriod=600 Jan 28 20:50:16 crc kubenswrapper[4746]: I0128 20:50:16.313121 4746 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-6wrnw" event={"ID":"6dc8b546-9734-4082-b2b3-2bafe3f1564d","Type":"ContainerDied","Data":"e798c8759f80b0bd2201227041fd066d21ab146038c8bedf3a5228a982d21b64"} Jan 28 20:50:16 crc kubenswrapper[4746]: I0128 20:50:16.313122 4746 generic.go:334] "Generic (PLEG): container finished" podID="6dc8b546-9734-4082-b2b3-2bafe3f1564d" containerID="e798c8759f80b0bd2201227041fd066d21ab146038c8bedf3a5228a982d21b64" exitCode=0 Jan 28 20:50:16 crc kubenswrapper[4746]: I0128 20:50:16.313538 4746 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-6wrnw" event={"ID":"6dc8b546-9734-4082-b2b3-2bafe3f1564d","Type":"ContainerStarted","Data":"4dbcdfa14610109c45d3514591f8d6ce15356b36ba815407076266ee1f95c6fd"} Jan 28 20:50:16 crc kubenswrapper[4746]: I0128 20:50:16.313486 4746 scope.go:117] "RemoveContainer" containerID="f197cfecdfc241f837aa378849ac18dd600cbcb4c925257a8bc1cdfb97e289aa" Jan 28 20:50:23 crc kubenswrapper[4746]: I0128 20:50:23.001040 4746 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/98629960b44b381d1a86cff1d1439a8df43509c9ad24579158c59d0f08m6kc8"] Jan 28 20:50:23 crc kubenswrapper[4746]: E0128 20:50:23.001992 4746 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="627f2e7c-f091-4ea2-9c3c-fce02f2b7669" containerName="registry" Jan 28 20:50:23 crc kubenswrapper[4746]: I0128 20:50:23.002007 4746 state_mem.go:107] "Deleted CPUSet assignment" podUID="627f2e7c-f091-4ea2-9c3c-fce02f2b7669" containerName="registry" Jan 28 20:50:23 crc kubenswrapper[4746]: I0128 20:50:23.002113 4746 memory_manager.go:354] "RemoveStaleState removing state" podUID="627f2e7c-f091-4ea2-9c3c-fce02f2b7669" containerName="registry" 
Jan 28 20:50:23 crc kubenswrapper[4746]: I0128 20:50:23.002844 4746 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/98629960b44b381d1a86cff1d1439a8df43509c9ad24579158c59d0f08m6kc8" Jan 28 20:50:23 crc kubenswrapper[4746]: I0128 20:50:23.005941 4746 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"default-dockercfg-vmwhc" Jan 28 20:50:23 crc kubenswrapper[4746]: I0128 20:50:23.014451 4746 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/98629960b44b381d1a86cff1d1439a8df43509c9ad24579158c59d0f08m6kc8"] Jan 28 20:50:23 crc kubenswrapper[4746]: I0128 20:50:23.063753 4746 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/3e93805e-6f5f-4618-b962-d9fca6cfe272-util\") pod \"98629960b44b381d1a86cff1d1439a8df43509c9ad24579158c59d0f08m6kc8\" (UID: \"3e93805e-6f5f-4618-b962-d9fca6cfe272\") " pod="openshift-marketplace/98629960b44b381d1a86cff1d1439a8df43509c9ad24579158c59d0f08m6kc8" Jan 28 20:50:23 crc kubenswrapper[4746]: I0128 20:50:23.063807 4746 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/3e93805e-6f5f-4618-b962-d9fca6cfe272-bundle\") pod \"98629960b44b381d1a86cff1d1439a8df43509c9ad24579158c59d0f08m6kc8\" (UID: \"3e93805e-6f5f-4618-b962-d9fca6cfe272\") " pod="openshift-marketplace/98629960b44b381d1a86cff1d1439a8df43509c9ad24579158c59d0f08m6kc8" Jan 28 20:50:23 crc kubenswrapper[4746]: I0128 20:50:23.063864 4746 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-zrrpn\" (UniqueName: \"kubernetes.io/projected/3e93805e-6f5f-4618-b962-d9fca6cfe272-kube-api-access-zrrpn\") pod \"98629960b44b381d1a86cff1d1439a8df43509c9ad24579158c59d0f08m6kc8\" (UID: \"3e93805e-6f5f-4618-b962-d9fca6cfe272\") " 
pod="openshift-marketplace/98629960b44b381d1a86cff1d1439a8df43509c9ad24579158c59d0f08m6kc8" Jan 28 20:50:23 crc kubenswrapper[4746]: I0128 20:50:23.165243 4746 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-zrrpn\" (UniqueName: \"kubernetes.io/projected/3e93805e-6f5f-4618-b962-d9fca6cfe272-kube-api-access-zrrpn\") pod \"98629960b44b381d1a86cff1d1439a8df43509c9ad24579158c59d0f08m6kc8\" (UID: \"3e93805e-6f5f-4618-b962-d9fca6cfe272\") " pod="openshift-marketplace/98629960b44b381d1a86cff1d1439a8df43509c9ad24579158c59d0f08m6kc8" Jan 28 20:50:23 crc kubenswrapper[4746]: I0128 20:50:23.165351 4746 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/3e93805e-6f5f-4618-b962-d9fca6cfe272-util\") pod \"98629960b44b381d1a86cff1d1439a8df43509c9ad24579158c59d0f08m6kc8\" (UID: \"3e93805e-6f5f-4618-b962-d9fca6cfe272\") " pod="openshift-marketplace/98629960b44b381d1a86cff1d1439a8df43509c9ad24579158c59d0f08m6kc8" Jan 28 20:50:23 crc kubenswrapper[4746]: I0128 20:50:23.165391 4746 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/3e93805e-6f5f-4618-b962-d9fca6cfe272-bundle\") pod \"98629960b44b381d1a86cff1d1439a8df43509c9ad24579158c59d0f08m6kc8\" (UID: \"3e93805e-6f5f-4618-b962-d9fca6cfe272\") " pod="openshift-marketplace/98629960b44b381d1a86cff1d1439a8df43509c9ad24579158c59d0f08m6kc8" Jan 28 20:50:23 crc kubenswrapper[4746]: I0128 20:50:23.166189 4746 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/3e93805e-6f5f-4618-b962-d9fca6cfe272-util\") pod \"98629960b44b381d1a86cff1d1439a8df43509c9ad24579158c59d0f08m6kc8\" (UID: \"3e93805e-6f5f-4618-b962-d9fca6cfe272\") " pod="openshift-marketplace/98629960b44b381d1a86cff1d1439a8df43509c9ad24579158c59d0f08m6kc8" Jan 28 20:50:23 crc kubenswrapper[4746]: I0128 20:50:23.166307 4746 
operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/3e93805e-6f5f-4618-b962-d9fca6cfe272-bundle\") pod \"98629960b44b381d1a86cff1d1439a8df43509c9ad24579158c59d0f08m6kc8\" (UID: \"3e93805e-6f5f-4618-b962-d9fca6cfe272\") " pod="openshift-marketplace/98629960b44b381d1a86cff1d1439a8df43509c9ad24579158c59d0f08m6kc8" Jan 28 20:50:23 crc kubenswrapper[4746]: I0128 20:50:23.190360 4746 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-zrrpn\" (UniqueName: \"kubernetes.io/projected/3e93805e-6f5f-4618-b962-d9fca6cfe272-kube-api-access-zrrpn\") pod \"98629960b44b381d1a86cff1d1439a8df43509c9ad24579158c59d0f08m6kc8\" (UID: \"3e93805e-6f5f-4618-b962-d9fca6cfe272\") " pod="openshift-marketplace/98629960b44b381d1a86cff1d1439a8df43509c9ad24579158c59d0f08m6kc8" Jan 28 20:50:23 crc kubenswrapper[4746]: I0128 20:50:23.357730 4746 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/98629960b44b381d1a86cff1d1439a8df43509c9ad24579158c59d0f08m6kc8" Jan 28 20:50:23 crc kubenswrapper[4746]: I0128 20:50:23.802598 4746 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/98629960b44b381d1a86cff1d1439a8df43509c9ad24579158c59d0f08m6kc8"] Jan 28 20:50:24 crc kubenswrapper[4746]: I0128 20:50:24.371335 4746 generic.go:334] "Generic (PLEG): container finished" podID="3e93805e-6f5f-4618-b962-d9fca6cfe272" containerID="7b21522ab8b6b2ec4086b7eed9e804ffb28da1161642807dc040aaea34333fd4" exitCode=0 Jan 28 20:50:24 crc kubenswrapper[4746]: I0128 20:50:24.371403 4746 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/98629960b44b381d1a86cff1d1439a8df43509c9ad24579158c59d0f08m6kc8" event={"ID":"3e93805e-6f5f-4618-b962-d9fca6cfe272","Type":"ContainerDied","Data":"7b21522ab8b6b2ec4086b7eed9e804ffb28da1161642807dc040aaea34333fd4"} Jan 28 20:50:24 crc kubenswrapper[4746]: I0128 20:50:24.371794 4746 kubelet.go:2453] 
"SyncLoop (PLEG): event for pod" pod="openshift-marketplace/98629960b44b381d1a86cff1d1439a8df43509c9ad24579158c59d0f08m6kc8" event={"ID":"3e93805e-6f5f-4618-b962-d9fca6cfe272","Type":"ContainerStarted","Data":"b862d816f5677cb6f5d4c410e0a7bdde0e06b766c6e6e9d899a42de37eb23988"} Jan 28 20:50:24 crc kubenswrapper[4746]: I0128 20:50:24.374107 4746 provider.go:102] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Jan 28 20:50:26 crc kubenswrapper[4746]: I0128 20:50:26.388728 4746 generic.go:334] "Generic (PLEG): container finished" podID="3e93805e-6f5f-4618-b962-d9fca6cfe272" containerID="7292fcf521a2647368e67165cd93e587e7cdf0cb57e6a1801931455bbdcf48ec" exitCode=0 Jan 28 20:50:26 crc kubenswrapper[4746]: I0128 20:50:26.388834 4746 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/98629960b44b381d1a86cff1d1439a8df43509c9ad24579158c59d0f08m6kc8" event={"ID":"3e93805e-6f5f-4618-b962-d9fca6cfe272","Type":"ContainerDied","Data":"7292fcf521a2647368e67165cd93e587e7cdf0cb57e6a1801931455bbdcf48ec"} Jan 28 20:50:27 crc kubenswrapper[4746]: I0128 20:50:27.399282 4746 generic.go:334] "Generic (PLEG): container finished" podID="3e93805e-6f5f-4618-b962-d9fca6cfe272" containerID="d417f54055e7bcc3b3373fa1e268cbecfef7a579bae89ce898df995c4aae5310" exitCode=0 Jan 28 20:50:27 crc kubenswrapper[4746]: I0128 20:50:27.399366 4746 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/98629960b44b381d1a86cff1d1439a8df43509c9ad24579158c59d0f08m6kc8" event={"ID":"3e93805e-6f5f-4618-b962-d9fca6cfe272","Type":"ContainerDied","Data":"d417f54055e7bcc3b3373fa1e268cbecfef7a579bae89ce898df995c4aae5310"} Jan 28 20:50:28 crc kubenswrapper[4746]: I0128 20:50:28.710606 4746 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/98629960b44b381d1a86cff1d1439a8df43509c9ad24579158c59d0f08m6kc8" Jan 28 20:50:28 crc kubenswrapper[4746]: I0128 20:50:28.851369 4746 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-zrrpn\" (UniqueName: \"kubernetes.io/projected/3e93805e-6f5f-4618-b962-d9fca6cfe272-kube-api-access-zrrpn\") pod \"3e93805e-6f5f-4618-b962-d9fca6cfe272\" (UID: \"3e93805e-6f5f-4618-b962-d9fca6cfe272\") " Jan 28 20:50:28 crc kubenswrapper[4746]: I0128 20:50:28.852037 4746 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/3e93805e-6f5f-4618-b962-d9fca6cfe272-bundle\") pod \"3e93805e-6f5f-4618-b962-d9fca6cfe272\" (UID: \"3e93805e-6f5f-4618-b962-d9fca6cfe272\") " Jan 28 20:50:28 crc kubenswrapper[4746]: I0128 20:50:28.852124 4746 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/3e93805e-6f5f-4618-b962-d9fca6cfe272-util\") pod \"3e93805e-6f5f-4618-b962-d9fca6cfe272\" (UID: \"3e93805e-6f5f-4618-b962-d9fca6cfe272\") " Jan 28 20:50:28 crc kubenswrapper[4746]: I0128 20:50:28.854650 4746 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/3e93805e-6f5f-4618-b962-d9fca6cfe272-bundle" (OuterVolumeSpecName: "bundle") pod "3e93805e-6f5f-4618-b962-d9fca6cfe272" (UID: "3e93805e-6f5f-4618-b962-d9fca6cfe272"). InnerVolumeSpecName "bundle". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 28 20:50:28 crc kubenswrapper[4746]: I0128 20:50:28.862609 4746 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/3e93805e-6f5f-4618-b962-d9fca6cfe272-kube-api-access-zrrpn" (OuterVolumeSpecName: "kube-api-access-zrrpn") pod "3e93805e-6f5f-4618-b962-d9fca6cfe272" (UID: "3e93805e-6f5f-4618-b962-d9fca6cfe272"). InnerVolumeSpecName "kube-api-access-zrrpn". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 28 20:50:28 crc kubenswrapper[4746]: I0128 20:50:28.875367 4746 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/3e93805e-6f5f-4618-b962-d9fca6cfe272-util" (OuterVolumeSpecName: "util") pod "3e93805e-6f5f-4618-b962-d9fca6cfe272" (UID: "3e93805e-6f5f-4618-b962-d9fca6cfe272"). InnerVolumeSpecName "util". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 28 20:50:28 crc kubenswrapper[4746]: I0128 20:50:28.954477 4746 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-zrrpn\" (UniqueName: \"kubernetes.io/projected/3e93805e-6f5f-4618-b962-d9fca6cfe272-kube-api-access-zrrpn\") on node \"crc\" DevicePath \"\"" Jan 28 20:50:28 crc kubenswrapper[4746]: I0128 20:50:28.954539 4746 reconciler_common.go:293] "Volume detached for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/3e93805e-6f5f-4618-b962-d9fca6cfe272-bundle\") on node \"crc\" DevicePath \"\"" Jan 28 20:50:28 crc kubenswrapper[4746]: I0128 20:50:28.954554 4746 reconciler_common.go:293] "Volume detached for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/3e93805e-6f5f-4618-b962-d9fca6cfe272-util\") on node \"crc\" DevicePath \"\"" Jan 28 20:50:29 crc kubenswrapper[4746]: I0128 20:50:29.418258 4746 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/98629960b44b381d1a86cff1d1439a8df43509c9ad24579158c59d0f08m6kc8" event={"ID":"3e93805e-6f5f-4618-b962-d9fca6cfe272","Type":"ContainerDied","Data":"b862d816f5677cb6f5d4c410e0a7bdde0e06b766c6e6e9d899a42de37eb23988"} Jan 28 20:50:29 crc kubenswrapper[4746]: I0128 20:50:29.418332 4746 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="b862d816f5677cb6f5d4c410e0a7bdde0e06b766c6e6e9d899a42de37eb23988" Jan 28 20:50:29 crc kubenswrapper[4746]: I0128 20:50:29.418348 4746 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/98629960b44b381d1a86cff1d1439a8df43509c9ad24579158c59d0f08m6kc8" Jan 28 20:50:34 crc kubenswrapper[4746]: I0128 20:50:34.001729 4746 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-ovn-kubernetes/ovnkube-node-8vmvh"] Jan 28 20:50:34 crc kubenswrapper[4746]: I0128 20:50:34.002853 4746 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-ovn-kubernetes/ovnkube-node-8vmvh" podUID="c4d15639-62fb-41b7-a1d4-6f51f3af6d99" containerName="ovn-controller" containerID="cri-o://57a776a1daf1b17eb44d9fa09fe2f7ec01f647aa0a40d0d571736d2f7c2f8e4d" gracePeriod=30 Jan 28 20:50:34 crc kubenswrapper[4746]: I0128 20:50:34.002917 4746 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-ovn-kubernetes/ovnkube-node-8vmvh" podUID="c4d15639-62fb-41b7-a1d4-6f51f3af6d99" containerName="kube-rbac-proxy-ovn-metrics" containerID="cri-o://73e82be84afa7be760acda86ebf6c956d558dcbd4f37f049fba84e08c75fcd9f" gracePeriod=30 Jan 28 20:50:34 crc kubenswrapper[4746]: I0128 20:50:34.002963 4746 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-ovn-kubernetes/ovnkube-node-8vmvh" podUID="c4d15639-62fb-41b7-a1d4-6f51f3af6d99" containerName="sbdb" containerID="cri-o://cdd60109dba5f45418dd5bc29c649a50aae8479fa77fc3dc50ca4f7c3042e3fc" gracePeriod=30 Jan 28 20:50:34 crc kubenswrapper[4746]: I0128 20:50:34.002980 4746 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-ovn-kubernetes/ovnkube-node-8vmvh" podUID="c4d15639-62fb-41b7-a1d4-6f51f3af6d99" containerName="ovn-acl-logging" containerID="cri-o://af842ba33fdf06048038dd3e12c59f7a9d0867aa28acee1f2cc4571e7db28b88" gracePeriod=30 Jan 28 20:50:34 crc kubenswrapper[4746]: I0128 20:50:34.002948 4746 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-ovn-kubernetes/ovnkube-node-8vmvh" podUID="c4d15639-62fb-41b7-a1d4-6f51f3af6d99" 
containerName="northd" containerID="cri-o://b7ad895d0ff1ad570136d96671e1bde5c62882af50a407945ef0e7bc52727b88" gracePeriod=30 Jan 28 20:50:34 crc kubenswrapper[4746]: I0128 20:50:34.003053 4746 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-ovn-kubernetes/ovnkube-node-8vmvh" podUID="c4d15639-62fb-41b7-a1d4-6f51f3af6d99" containerName="kube-rbac-proxy-node" containerID="cri-o://2637734601bf1e43535b70bd62cc6aea7fc9f62d0d2a33c83602e7065a793be1" gracePeriod=30 Jan 28 20:50:34 crc kubenswrapper[4746]: I0128 20:50:34.002917 4746 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-ovn-kubernetes/ovnkube-node-8vmvh" podUID="c4d15639-62fb-41b7-a1d4-6f51f3af6d99" containerName="nbdb" containerID="cri-o://17d09f7f46014da3713f76f111bca0a7ce8a24a154aa6144291d760d85e7d3ec" gracePeriod=30 Jan 28 20:50:34 crc kubenswrapper[4746]: I0128 20:50:34.049730 4746 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-ovn-kubernetes/ovnkube-node-8vmvh" podUID="c4d15639-62fb-41b7-a1d4-6f51f3af6d99" containerName="ovnkube-controller" containerID="cri-o://41fb0df09b842458b04f1cb1896433afe675709f7b2453543bdab40d606d6e7b" gracePeriod=30 Jan 28 20:50:34 crc kubenswrapper[4746]: I0128 20:50:34.449905 4746 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-8vmvh_c4d15639-62fb-41b7-a1d4-6f51f3af6d99/ovnkube-controller/3.log" Jan 28 20:50:34 crc kubenswrapper[4746]: I0128 20:50:34.452871 4746 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-8vmvh_c4d15639-62fb-41b7-a1d4-6f51f3af6d99/ovnkube-controller/3.log" Jan 28 20:50:34 crc kubenswrapper[4746]: I0128 20:50:34.453330 4746 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-8vmvh_c4d15639-62fb-41b7-a1d4-6f51f3af6d99/ovn-acl-logging/0.log" Jan 28 20:50:34 crc kubenswrapper[4746]: I0128 20:50:34.457918 4746 
log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-8vmvh_c4d15639-62fb-41b7-a1d4-6f51f3af6d99/ovn-controller/0.log" Jan 28 20:50:34 crc kubenswrapper[4746]: I0128 20:50:34.458567 4746 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-ovn-kubernetes/ovnkube-node-8vmvh" Jan 28 20:50:34 crc kubenswrapper[4746]: I0128 20:50:34.459299 4746 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-8vmvh_c4d15639-62fb-41b7-a1d4-6f51f3af6d99/ovn-acl-logging/0.log" Jan 28 20:50:34 crc kubenswrapper[4746]: I0128 20:50:34.459832 4746 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-8vmvh_c4d15639-62fb-41b7-a1d4-6f51f3af6d99/ovn-controller/0.log" Jan 28 20:50:34 crc kubenswrapper[4746]: I0128 20:50:34.460193 4746 generic.go:334] "Generic (PLEG): container finished" podID="c4d15639-62fb-41b7-a1d4-6f51f3af6d99" containerID="41fb0df09b842458b04f1cb1896433afe675709f7b2453543bdab40d606d6e7b" exitCode=0 Jan 28 20:50:34 crc kubenswrapper[4746]: I0128 20:50:34.460223 4746 generic.go:334] "Generic (PLEG): container finished" podID="c4d15639-62fb-41b7-a1d4-6f51f3af6d99" containerID="cdd60109dba5f45418dd5bc29c649a50aae8479fa77fc3dc50ca4f7c3042e3fc" exitCode=0 Jan 28 20:50:34 crc kubenswrapper[4746]: I0128 20:50:34.460231 4746 generic.go:334] "Generic (PLEG): container finished" podID="c4d15639-62fb-41b7-a1d4-6f51f3af6d99" containerID="17d09f7f46014da3713f76f111bca0a7ce8a24a154aa6144291d760d85e7d3ec" exitCode=0 Jan 28 20:50:34 crc kubenswrapper[4746]: I0128 20:50:34.460242 4746 generic.go:334] "Generic (PLEG): container finished" podID="c4d15639-62fb-41b7-a1d4-6f51f3af6d99" containerID="b7ad895d0ff1ad570136d96671e1bde5c62882af50a407945ef0e7bc52727b88" exitCode=0 Jan 28 20:50:34 crc kubenswrapper[4746]: I0128 20:50:34.460253 4746 generic.go:334] "Generic (PLEG): container finished" 
podID="c4d15639-62fb-41b7-a1d4-6f51f3af6d99" containerID="73e82be84afa7be760acda86ebf6c956d558dcbd4f37f049fba84e08c75fcd9f" exitCode=0 Jan 28 20:50:34 crc kubenswrapper[4746]: I0128 20:50:34.460261 4746 generic.go:334] "Generic (PLEG): container finished" podID="c4d15639-62fb-41b7-a1d4-6f51f3af6d99" containerID="2637734601bf1e43535b70bd62cc6aea7fc9f62d0d2a33c83602e7065a793be1" exitCode=0 Jan 28 20:50:34 crc kubenswrapper[4746]: I0128 20:50:34.460269 4746 generic.go:334] "Generic (PLEG): container finished" podID="c4d15639-62fb-41b7-a1d4-6f51f3af6d99" containerID="af842ba33fdf06048038dd3e12c59f7a9d0867aa28acee1f2cc4571e7db28b88" exitCode=143 Jan 28 20:50:34 crc kubenswrapper[4746]: I0128 20:50:34.460261 4746 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-8vmvh" event={"ID":"c4d15639-62fb-41b7-a1d4-6f51f3af6d99","Type":"ContainerDied","Data":"41fb0df09b842458b04f1cb1896433afe675709f7b2453543bdab40d606d6e7b"} Jan 28 20:50:34 crc kubenswrapper[4746]: I0128 20:50:34.460309 4746 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-8vmvh" event={"ID":"c4d15639-62fb-41b7-a1d4-6f51f3af6d99","Type":"ContainerDied","Data":"cdd60109dba5f45418dd5bc29c649a50aae8479fa77fc3dc50ca4f7c3042e3fc"} Jan 28 20:50:34 crc kubenswrapper[4746]: I0128 20:50:34.460324 4746 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-8vmvh" event={"ID":"c4d15639-62fb-41b7-a1d4-6f51f3af6d99","Type":"ContainerDied","Data":"17d09f7f46014da3713f76f111bca0a7ce8a24a154aa6144291d760d85e7d3ec"} Jan 28 20:50:34 crc kubenswrapper[4746]: I0128 20:50:34.460338 4746 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-8vmvh" event={"ID":"c4d15639-62fb-41b7-a1d4-6f51f3af6d99","Type":"ContainerDied","Data":"b7ad895d0ff1ad570136d96671e1bde5c62882af50a407945ef0e7bc52727b88"} Jan 28 20:50:34 crc kubenswrapper[4746]: I0128 20:50:34.460349 4746 
kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-8vmvh" event={"ID":"c4d15639-62fb-41b7-a1d4-6f51f3af6d99","Type":"ContainerDied","Data":"73e82be84afa7be760acda86ebf6c956d558dcbd4f37f049fba84e08c75fcd9f"} Jan 28 20:50:34 crc kubenswrapper[4746]: I0128 20:50:34.460369 4746 scope.go:117] "RemoveContainer" containerID="41fb0df09b842458b04f1cb1896433afe675709f7b2453543bdab40d606d6e7b" Jan 28 20:50:34 crc kubenswrapper[4746]: I0128 20:50:34.460370 4746 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-8vmvh" event={"ID":"c4d15639-62fb-41b7-a1d4-6f51f3af6d99","Type":"ContainerDied","Data":"2637734601bf1e43535b70bd62cc6aea7fc9f62d0d2a33c83602e7065a793be1"} Jan 28 20:50:34 crc kubenswrapper[4746]: I0128 20:50:34.460501 4746 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"dac4fcde6d3cb0d62e92dd34133b79787bdd8338dcee58772ed7584830a7cb2c"} Jan 28 20:50:34 crc kubenswrapper[4746]: I0128 20:50:34.460517 4746 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"cdd60109dba5f45418dd5bc29c649a50aae8479fa77fc3dc50ca4f7c3042e3fc"} Jan 28 20:50:34 crc kubenswrapper[4746]: I0128 20:50:34.460523 4746 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"17d09f7f46014da3713f76f111bca0a7ce8a24a154aa6144291d760d85e7d3ec"} Jan 28 20:50:34 crc kubenswrapper[4746]: I0128 20:50:34.460546 4746 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"b7ad895d0ff1ad570136d96671e1bde5c62882af50a407945ef0e7bc52727b88"} Jan 28 20:50:34 crc kubenswrapper[4746]: I0128 20:50:34.460554 4746 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"73e82be84afa7be760acda86ebf6c956d558dcbd4f37f049fba84e08c75fcd9f"} Jan 28 
20:50:34 crc kubenswrapper[4746]: I0128 20:50:34.460560 4746 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"2637734601bf1e43535b70bd62cc6aea7fc9f62d0d2a33c83602e7065a793be1"} Jan 28 20:50:34 crc kubenswrapper[4746]: I0128 20:50:34.460565 4746 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"af842ba33fdf06048038dd3e12c59f7a9d0867aa28acee1f2cc4571e7db28b88"} Jan 28 20:50:34 crc kubenswrapper[4746]: I0128 20:50:34.460570 4746 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"57a776a1daf1b17eb44d9fa09fe2f7ec01f647aa0a40d0d571736d2f7c2f8e4d"} Jan 28 20:50:34 crc kubenswrapper[4746]: I0128 20:50:34.460575 4746 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"45e59dfe364d511e17b8264a31157501de3ed23e7125d79979df83d328242328"} Jan 28 20:50:34 crc kubenswrapper[4746]: I0128 20:50:34.460584 4746 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-8vmvh" event={"ID":"c4d15639-62fb-41b7-a1d4-6f51f3af6d99","Type":"ContainerDied","Data":"af842ba33fdf06048038dd3e12c59f7a9d0867aa28acee1f2cc4571e7db28b88"} Jan 28 20:50:34 crc kubenswrapper[4746]: I0128 20:50:34.460594 4746 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"41fb0df09b842458b04f1cb1896433afe675709f7b2453543bdab40d606d6e7b"} Jan 28 20:50:34 crc kubenswrapper[4746]: I0128 20:50:34.460603 4746 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"dac4fcde6d3cb0d62e92dd34133b79787bdd8338dcee58772ed7584830a7cb2c"} Jan 28 20:50:34 crc kubenswrapper[4746]: I0128 20:50:34.460659 4746 pod_container_deletor.go:114] "Failed to issue the request to remove container" 
containerID={"Type":"cri-o","ID":"cdd60109dba5f45418dd5bc29c649a50aae8479fa77fc3dc50ca4f7c3042e3fc"} Jan 28 20:50:34 crc kubenswrapper[4746]: I0128 20:50:34.460665 4746 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"17d09f7f46014da3713f76f111bca0a7ce8a24a154aa6144291d760d85e7d3ec"} Jan 28 20:50:34 crc kubenswrapper[4746]: I0128 20:50:34.460688 4746 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"b7ad895d0ff1ad570136d96671e1bde5c62882af50a407945ef0e7bc52727b88"} Jan 28 20:50:34 crc kubenswrapper[4746]: I0128 20:50:34.460695 4746 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"73e82be84afa7be760acda86ebf6c956d558dcbd4f37f049fba84e08c75fcd9f"} Jan 28 20:50:34 crc kubenswrapper[4746]: I0128 20:50:34.460702 4746 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"2637734601bf1e43535b70bd62cc6aea7fc9f62d0d2a33c83602e7065a793be1"} Jan 28 20:50:34 crc kubenswrapper[4746]: I0128 20:50:34.460708 4746 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"af842ba33fdf06048038dd3e12c59f7a9d0867aa28acee1f2cc4571e7db28b88"} Jan 28 20:50:34 crc kubenswrapper[4746]: I0128 20:50:34.460714 4746 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"57a776a1daf1b17eb44d9fa09fe2f7ec01f647aa0a40d0d571736d2f7c2f8e4d"} Jan 28 20:50:34 crc kubenswrapper[4746]: I0128 20:50:34.460719 4746 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"45e59dfe364d511e17b8264a31157501de3ed23e7125d79979df83d328242328"} Jan 28 20:50:34 crc kubenswrapper[4746]: I0128 20:50:34.460279 4746 generic.go:334] "Generic (PLEG): container finished" 
podID="c4d15639-62fb-41b7-a1d4-6f51f3af6d99" containerID="57a776a1daf1b17eb44d9fa09fe2f7ec01f647aa0a40d0d571736d2f7c2f8e4d" exitCode=143 Jan 28 20:50:34 crc kubenswrapper[4746]: I0128 20:50:34.460727 4746 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-8vmvh" event={"ID":"c4d15639-62fb-41b7-a1d4-6f51f3af6d99","Type":"ContainerDied","Data":"57a776a1daf1b17eb44d9fa09fe2f7ec01f647aa0a40d0d571736d2f7c2f8e4d"} Jan 28 20:50:34 crc kubenswrapper[4746]: I0128 20:50:34.460739 4746 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"41fb0df09b842458b04f1cb1896433afe675709f7b2453543bdab40d606d6e7b"} Jan 28 20:50:34 crc kubenswrapper[4746]: I0128 20:50:34.460747 4746 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"dac4fcde6d3cb0d62e92dd34133b79787bdd8338dcee58772ed7584830a7cb2c"} Jan 28 20:50:34 crc kubenswrapper[4746]: I0128 20:50:34.460753 4746 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"cdd60109dba5f45418dd5bc29c649a50aae8479fa77fc3dc50ca4f7c3042e3fc"} Jan 28 20:50:34 crc kubenswrapper[4746]: I0128 20:50:34.460758 4746 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"17d09f7f46014da3713f76f111bca0a7ce8a24a154aa6144291d760d85e7d3ec"} Jan 28 20:50:34 crc kubenswrapper[4746]: I0128 20:50:34.460763 4746 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"b7ad895d0ff1ad570136d96671e1bde5c62882af50a407945ef0e7bc52727b88"} Jan 28 20:50:34 crc kubenswrapper[4746]: I0128 20:50:34.460768 4746 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"73e82be84afa7be760acda86ebf6c956d558dcbd4f37f049fba84e08c75fcd9f"} Jan 28 20:50:34 crc kubenswrapper[4746]: I0128 
20:50:34.460774 4746 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"2637734601bf1e43535b70bd62cc6aea7fc9f62d0d2a33c83602e7065a793be1"} Jan 28 20:50:34 crc kubenswrapper[4746]: I0128 20:50:34.460779 4746 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"af842ba33fdf06048038dd3e12c59f7a9d0867aa28acee1f2cc4571e7db28b88"} Jan 28 20:50:34 crc kubenswrapper[4746]: I0128 20:50:34.460784 4746 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"57a776a1daf1b17eb44d9fa09fe2f7ec01f647aa0a40d0d571736d2f7c2f8e4d"} Jan 28 20:50:34 crc kubenswrapper[4746]: I0128 20:50:34.460789 4746 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"45e59dfe364d511e17b8264a31157501de3ed23e7125d79979df83d328242328"} Jan 28 20:50:34 crc kubenswrapper[4746]: I0128 20:50:34.460797 4746 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-8vmvh" event={"ID":"c4d15639-62fb-41b7-a1d4-6f51f3af6d99","Type":"ContainerDied","Data":"898c5dcede9f2cde269997afb517c22e410533983244db06d196bc12f2c22793"} Jan 28 20:50:34 crc kubenswrapper[4746]: I0128 20:50:34.460806 4746 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"41fb0df09b842458b04f1cb1896433afe675709f7b2453543bdab40d606d6e7b"} Jan 28 20:50:34 crc kubenswrapper[4746]: I0128 20:50:34.460831 4746 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"dac4fcde6d3cb0d62e92dd34133b79787bdd8338dcee58772ed7584830a7cb2c"} Jan 28 20:50:34 crc kubenswrapper[4746]: I0128 20:50:34.460836 4746 pod_container_deletor.go:114] "Failed to issue the request to remove container" 
containerID={"Type":"cri-o","ID":"cdd60109dba5f45418dd5bc29c649a50aae8479fa77fc3dc50ca4f7c3042e3fc"} Jan 28 20:50:34 crc kubenswrapper[4746]: I0128 20:50:34.460842 4746 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"17d09f7f46014da3713f76f111bca0a7ce8a24a154aa6144291d760d85e7d3ec"} Jan 28 20:50:34 crc kubenswrapper[4746]: I0128 20:50:34.460848 4746 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"b7ad895d0ff1ad570136d96671e1bde5c62882af50a407945ef0e7bc52727b88"} Jan 28 20:50:34 crc kubenswrapper[4746]: I0128 20:50:34.460852 4746 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"73e82be84afa7be760acda86ebf6c956d558dcbd4f37f049fba84e08c75fcd9f"} Jan 28 20:50:34 crc kubenswrapper[4746]: I0128 20:50:34.460858 4746 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"2637734601bf1e43535b70bd62cc6aea7fc9f62d0d2a33c83602e7065a793be1"} Jan 28 20:50:34 crc kubenswrapper[4746]: I0128 20:50:34.460863 4746 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"af842ba33fdf06048038dd3e12c59f7a9d0867aa28acee1f2cc4571e7db28b88"} Jan 28 20:50:34 crc kubenswrapper[4746]: I0128 20:50:34.460870 4746 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"57a776a1daf1b17eb44d9fa09fe2f7ec01f647aa0a40d0d571736d2f7c2f8e4d"} Jan 28 20:50:34 crc kubenswrapper[4746]: I0128 20:50:34.460875 4746 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"45e59dfe364d511e17b8264a31157501de3ed23e7125d79979df83d328242328"} Jan 28 20:50:34 crc kubenswrapper[4746]: I0128 20:50:34.463340 4746 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openshift-multus_multus-qhpvf_cdf26de0-b602-4bdf-b492-65b3b6b31434/kube-multus/2.log" Jan 28 20:50:34 crc kubenswrapper[4746]: I0128 20:50:34.463721 4746 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-qhpvf_cdf26de0-b602-4bdf-b492-65b3b6b31434/kube-multus/1.log" Jan 28 20:50:34 crc kubenswrapper[4746]: I0128 20:50:34.463757 4746 generic.go:334] "Generic (PLEG): container finished" podID="cdf26de0-b602-4bdf-b492-65b3b6b31434" containerID="7739b8614574ec2b9e7c1e6a6e443f85ce5fa487dd8be878363a899877ff42f3" exitCode=2 Jan 28 20:50:34 crc kubenswrapper[4746]: I0128 20:50:34.463793 4746 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-qhpvf" event={"ID":"cdf26de0-b602-4bdf-b492-65b3b6b31434","Type":"ContainerDied","Data":"7739b8614574ec2b9e7c1e6a6e443f85ce5fa487dd8be878363a899877ff42f3"} Jan 28 20:50:34 crc kubenswrapper[4746]: I0128 20:50:34.463806 4746 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"9c88cd0b151f23677f83a944570f36a7a6edd30d9be47638a17cbb9098ec1c23"} Jan 28 20:50:34 crc kubenswrapper[4746]: I0128 20:50:34.464452 4746 scope.go:117] "RemoveContainer" containerID="7739b8614574ec2b9e7c1e6a6e443f85ce5fa487dd8be878363a899877ff42f3" Jan 28 20:50:34 crc kubenswrapper[4746]: E0128 20:50:34.464782 4746 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"kube-multus\" with CrashLoopBackOff: \"back-off 20s restarting failed container=kube-multus pod=multus-qhpvf_openshift-multus(cdf26de0-b602-4bdf-b492-65b3b6b31434)\"" pod="openshift-multus/multus-qhpvf" podUID="cdf26de0-b602-4bdf-b492-65b3b6b31434" Jan 28 20:50:34 crc kubenswrapper[4746]: I0128 20:50:34.490047 4746 scope.go:117] "RemoveContainer" containerID="dac4fcde6d3cb0d62e92dd34133b79787bdd8338dcee58772ed7584830a7cb2c" Jan 28 20:50:34 crc kubenswrapper[4746]: I0128 20:50:34.510962 4746 scope.go:117] "RemoveContainer" 
containerID="cdd60109dba5f45418dd5bc29c649a50aae8479fa77fc3dc50ca4f7c3042e3fc" Jan 28 20:50:34 crc kubenswrapper[4746]: I0128 20:50:34.547315 4746 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etc-openvswitch\" (UniqueName: \"kubernetes.io/host-path/c4d15639-62fb-41b7-a1d4-6f51f3af6d99-etc-openvswitch\") pod \"c4d15639-62fb-41b7-a1d4-6f51f3af6d99\" (UID: \"c4d15639-62fb-41b7-a1d4-6f51f3af6d99\") " Jan 28 20:50:34 crc kubenswrapper[4746]: I0128 20:50:34.547371 4746 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/c4d15639-62fb-41b7-a1d4-6f51f3af6d99-env-overrides\") pod \"c4d15639-62fb-41b7-a1d4-6f51f3af6d99\" (UID: \"c4d15639-62fb-41b7-a1d4-6f51f3af6d99\") " Jan 28 20:50:34 crc kubenswrapper[4746]: I0128 20:50:34.547413 4746 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/c4d15639-62fb-41b7-a1d4-6f51f3af6d99-host-slash\") pod \"c4d15639-62fb-41b7-a1d4-6f51f3af6d99\" (UID: \"c4d15639-62fb-41b7-a1d4-6f51f3af6d99\") " Jan 28 20:50:34 crc kubenswrapper[4746]: I0128 20:50:34.547451 4746 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"host-run-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/c4d15639-62fb-41b7-a1d4-6f51f3af6d99-host-run-ovn-kubernetes\") pod \"c4d15639-62fb-41b7-a1d4-6f51f3af6d99\" (UID: \"c4d15639-62fb-41b7-a1d4-6f51f3af6d99\") " Jan 28 20:50:34 crc kubenswrapper[4746]: I0128 20:50:34.547528 4746 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"var-lib-openvswitch\" (UniqueName: \"kubernetes.io/host-path/c4d15639-62fb-41b7-a1d4-6f51f3af6d99-var-lib-openvswitch\") pod \"c4d15639-62fb-41b7-a1d4-6f51f3af6d99\" (UID: \"c4d15639-62fb-41b7-a1d4-6f51f3af6d99\") " Jan 28 20:50:34 crc kubenswrapper[4746]: I0128 20:50:34.547511 4746 operation_generator.go:803] UnmountVolume.TearDown succeeded 
for volume "kubernetes.io/host-path/c4d15639-62fb-41b7-a1d4-6f51f3af6d99-etc-openvswitch" (OuterVolumeSpecName: "etc-openvswitch") pod "c4d15639-62fb-41b7-a1d4-6f51f3af6d99" (UID: "c4d15639-62fb-41b7-a1d4-6f51f3af6d99"). InnerVolumeSpecName "etc-openvswitch". PluginName "kubernetes.io/host-path", VolumeGidValue "" Jan 28 20:50:34 crc kubenswrapper[4746]: I0128 20:50:34.547570 4746 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/c4d15639-62fb-41b7-a1d4-6f51f3af6d99-host-slash" (OuterVolumeSpecName: "host-slash") pod "c4d15639-62fb-41b7-a1d4-6f51f3af6d99" (UID: "c4d15639-62fb-41b7-a1d4-6f51f3af6d99"). InnerVolumeSpecName "host-slash". PluginName "kubernetes.io/host-path", VolumeGidValue "" Jan 28 20:50:34 crc kubenswrapper[4746]: I0128 20:50:34.547638 4746 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/c4d15639-62fb-41b7-a1d4-6f51f3af6d99-host-run-ovn-kubernetes" (OuterVolumeSpecName: "host-run-ovn-kubernetes") pod "c4d15639-62fb-41b7-a1d4-6f51f3af6d99" (UID: "c4d15639-62fb-41b7-a1d4-6f51f3af6d99"). InnerVolumeSpecName "host-run-ovn-kubernetes". PluginName "kubernetes.io/host-path", VolumeGidValue "" Jan 28 20:50:34 crc kubenswrapper[4746]: I0128 20:50:34.547685 4746 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/c4d15639-62fb-41b7-a1d4-6f51f3af6d99-var-lib-openvswitch" (OuterVolumeSpecName: "var-lib-openvswitch") pod "c4d15639-62fb-41b7-a1d4-6f51f3af6d99" (UID: "c4d15639-62fb-41b7-a1d4-6f51f3af6d99"). InnerVolumeSpecName "var-lib-openvswitch". 
PluginName "kubernetes.io/host-path", VolumeGidValue "" Jan 28 20:50:34 crc kubenswrapper[4746]: I0128 20:50:34.547689 4746 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"systemd-units\" (UniqueName: \"kubernetes.io/host-path/c4d15639-62fb-41b7-a1d4-6f51f3af6d99-systemd-units\") pod \"c4d15639-62fb-41b7-a1d4-6f51f3af6d99\" (UID: \"c4d15639-62fb-41b7-a1d4-6f51f3af6d99\") " Jan 28 20:50:34 crc kubenswrapper[4746]: I0128 20:50:34.547719 4746 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/c4d15639-62fb-41b7-a1d4-6f51f3af6d99-systemd-units" (OuterVolumeSpecName: "systemd-units") pod "c4d15639-62fb-41b7-a1d4-6f51f3af6d99" (UID: "c4d15639-62fb-41b7-a1d4-6f51f3af6d99"). InnerVolumeSpecName "systemd-units". PluginName "kubernetes.io/host-path", VolumeGidValue "" Jan 28 20:50:34 crc kubenswrapper[4746]: I0128 20:50:34.547811 4746 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"host-cni-bin\" (UniqueName: \"kubernetes.io/host-path/c4d15639-62fb-41b7-a1d4-6f51f3af6d99-host-cni-bin\") pod \"c4d15639-62fb-41b7-a1d4-6f51f3af6d99\" (UID: \"c4d15639-62fb-41b7-a1d4-6f51f3af6d99\") " Jan 28 20:50:34 crc kubenswrapper[4746]: I0128 20:50:34.547851 4746 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-kgrqs\" (UniqueName: \"kubernetes.io/projected/c4d15639-62fb-41b7-a1d4-6f51f3af6d99-kube-api-access-kgrqs\") pod \"c4d15639-62fb-41b7-a1d4-6f51f3af6d99\" (UID: \"c4d15639-62fb-41b7-a1d4-6f51f3af6d99\") " Jan 28 20:50:34 crc kubenswrapper[4746]: I0128 20:50:34.547876 4746 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/c4d15639-62fb-41b7-a1d4-6f51f3af6d99-ovnkube-config\") pod \"c4d15639-62fb-41b7-a1d4-6f51f3af6d99\" (UID: \"c4d15639-62fb-41b7-a1d4-6f51f3af6d99\") " Jan 28 20:50:34 crc kubenswrapper[4746]: I0128 20:50:34.547897 4746 
reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"host-kubelet\" (UniqueName: \"kubernetes.io/host-path/c4d15639-62fb-41b7-a1d4-6f51f3af6d99-host-kubelet\") pod \"c4d15639-62fb-41b7-a1d4-6f51f3af6d99\" (UID: \"c4d15639-62fb-41b7-a1d4-6f51f3af6d99\") " Jan 28 20:50:34 crc kubenswrapper[4746]: I0128 20:50:34.547915 4746 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/c4d15639-62fb-41b7-a1d4-6f51f3af6d99-host-cni-bin" (OuterVolumeSpecName: "host-cni-bin") pod "c4d15639-62fb-41b7-a1d4-6f51f3af6d99" (UID: "c4d15639-62fb-41b7-a1d4-6f51f3af6d99"). InnerVolumeSpecName "host-cni-bin". PluginName "kubernetes.io/host-path", VolumeGidValue "" Jan 28 20:50:34 crc kubenswrapper[4746]: I0128 20:50:34.547927 4746 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovn-node-metrics-cert\" (UniqueName: \"kubernetes.io/secret/c4d15639-62fb-41b7-a1d4-6f51f3af6d99-ovn-node-metrics-cert\") pod \"c4d15639-62fb-41b7-a1d4-6f51f3af6d99\" (UID: \"c4d15639-62fb-41b7-a1d4-6f51f3af6d99\") " Jan 28 20:50:34 crc kubenswrapper[4746]: I0128 20:50:34.547985 4746 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/c4d15639-62fb-41b7-a1d4-6f51f3af6d99-host-kubelet" (OuterVolumeSpecName: "host-kubelet") pod "c4d15639-62fb-41b7-a1d4-6f51f3af6d99" (UID: "c4d15639-62fb-41b7-a1d4-6f51f3af6d99"). InnerVolumeSpecName "host-kubelet". 
PluginName "kubernetes.io/host-path", VolumeGidValue "" Jan 28 20:50:34 crc kubenswrapper[4746]: I0128 20:50:34.548072 4746 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"run-openvswitch\" (UniqueName: \"kubernetes.io/host-path/c4d15639-62fb-41b7-a1d4-6f51f3af6d99-run-openvswitch\") pod \"c4d15639-62fb-41b7-a1d4-6f51f3af6d99\" (UID: \"c4d15639-62fb-41b7-a1d4-6f51f3af6d99\") " Jan 28 20:50:34 crc kubenswrapper[4746]: I0128 20:50:34.547951 4746 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/c4d15639-62fb-41b7-a1d4-6f51f3af6d99-env-overrides" (OuterVolumeSpecName: "env-overrides") pod "c4d15639-62fb-41b7-a1d4-6f51f3af6d99" (UID: "c4d15639-62fb-41b7-a1d4-6f51f3af6d99"). InnerVolumeSpecName "env-overrides". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 28 20:50:34 crc kubenswrapper[4746]: I0128 20:50:34.548148 4746 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"host-var-lib-cni-networks-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/c4d15639-62fb-41b7-a1d4-6f51f3af6d99-host-var-lib-cni-networks-ovn-kubernetes\") pod \"c4d15639-62fb-41b7-a1d4-6f51f3af6d99\" (UID: \"c4d15639-62fb-41b7-a1d4-6f51f3af6d99\") " Jan 28 20:50:34 crc kubenswrapper[4746]: I0128 20:50:34.548273 4746 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"run-ovn\" (UniqueName: \"kubernetes.io/host-path/c4d15639-62fb-41b7-a1d4-6f51f3af6d99-run-ovn\") pod \"c4d15639-62fb-41b7-a1d4-6f51f3af6d99\" (UID: \"c4d15639-62fb-41b7-a1d4-6f51f3af6d99\") " Jan 28 20:50:34 crc kubenswrapper[4746]: I0128 20:50:34.548379 4746 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovnkube-script-lib\" (UniqueName: \"kubernetes.io/configmap/c4d15639-62fb-41b7-a1d4-6f51f3af6d99-ovnkube-script-lib\") pod \"c4d15639-62fb-41b7-a1d4-6f51f3af6d99\" (UID: \"c4d15639-62fb-41b7-a1d4-6f51f3af6d99\") " Jan 28 20:50:34 crc kubenswrapper[4746]: 
I0128 20:50:34.548171 4746 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/c4d15639-62fb-41b7-a1d4-6f51f3af6d99-run-openvswitch" (OuterVolumeSpecName: "run-openvswitch") pod "c4d15639-62fb-41b7-a1d4-6f51f3af6d99" (UID: "c4d15639-62fb-41b7-a1d4-6f51f3af6d99"). InnerVolumeSpecName "run-openvswitch". PluginName "kubernetes.io/host-path", VolumeGidValue "" Jan 28 20:50:34 crc kubenswrapper[4746]: I0128 20:50:34.548200 4746 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/c4d15639-62fb-41b7-a1d4-6f51f3af6d99-host-var-lib-cni-networks-ovn-kubernetes" (OuterVolumeSpecName: "host-var-lib-cni-networks-ovn-kubernetes") pod "c4d15639-62fb-41b7-a1d4-6f51f3af6d99" (UID: "c4d15639-62fb-41b7-a1d4-6f51f3af6d99"). InnerVolumeSpecName "host-var-lib-cni-networks-ovn-kubernetes". PluginName "kubernetes.io/host-path", VolumeGidValue "" Jan 28 20:50:34 crc kubenswrapper[4746]: I0128 20:50:34.548318 4746 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/c4d15639-62fb-41b7-a1d4-6f51f3af6d99-run-ovn" (OuterVolumeSpecName: "run-ovn") pod "c4d15639-62fb-41b7-a1d4-6f51f3af6d99" (UID: "c4d15639-62fb-41b7-a1d4-6f51f3af6d99"). InnerVolumeSpecName "run-ovn". 
PluginName "kubernetes.io/host-path", VolumeGidValue "" Jan 28 20:50:34 crc kubenswrapper[4746]: I0128 20:50:34.548470 4746 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"host-cni-netd\" (UniqueName: \"kubernetes.io/host-path/c4d15639-62fb-41b7-a1d4-6f51f3af6d99-host-cni-netd\") pod \"c4d15639-62fb-41b7-a1d4-6f51f3af6d99\" (UID: \"c4d15639-62fb-41b7-a1d4-6f51f3af6d99\") " Jan 28 20:50:34 crc kubenswrapper[4746]: I0128 20:50:34.548476 4746 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/c4d15639-62fb-41b7-a1d4-6f51f3af6d99-ovnkube-config" (OuterVolumeSpecName: "ovnkube-config") pod "c4d15639-62fb-41b7-a1d4-6f51f3af6d99" (UID: "c4d15639-62fb-41b7-a1d4-6f51f3af6d99"). InnerVolumeSpecName "ovnkube-config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 28 20:50:34 crc kubenswrapper[4746]: I0128 20:50:34.548546 4746 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/c4d15639-62fb-41b7-a1d4-6f51f3af6d99-host-cni-netd" (OuterVolumeSpecName: "host-cni-netd") pod "c4d15639-62fb-41b7-a1d4-6f51f3af6d99" (UID: "c4d15639-62fb-41b7-a1d4-6f51f3af6d99"). InnerVolumeSpecName "host-cni-netd". 
PluginName "kubernetes.io/host-path", VolumeGidValue "" Jan 28 20:50:34 crc kubenswrapper[4746]: I0128 20:50:34.548505 4746 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"run-systemd\" (UniqueName: \"kubernetes.io/host-path/c4d15639-62fb-41b7-a1d4-6f51f3af6d99-run-systemd\") pod \"c4d15639-62fb-41b7-a1d4-6f51f3af6d99\" (UID: \"c4d15639-62fb-41b7-a1d4-6f51f3af6d99\") " Jan 28 20:50:34 crc kubenswrapper[4746]: I0128 20:50:34.548611 4746 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/c4d15639-62fb-41b7-a1d4-6f51f3af6d99-host-run-netns\") pod \"c4d15639-62fb-41b7-a1d4-6f51f3af6d99\" (UID: \"c4d15639-62fb-41b7-a1d4-6f51f3af6d99\") " Jan 28 20:50:34 crc kubenswrapper[4746]: I0128 20:50:34.548635 4746 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"log-socket\" (UniqueName: \"kubernetes.io/host-path/c4d15639-62fb-41b7-a1d4-6f51f3af6d99-log-socket\") pod \"c4d15639-62fb-41b7-a1d4-6f51f3af6d99\" (UID: \"c4d15639-62fb-41b7-a1d4-6f51f3af6d99\") " Jan 28 20:50:34 crc kubenswrapper[4746]: I0128 20:50:34.548683 4746 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/c4d15639-62fb-41b7-a1d4-6f51f3af6d99-host-run-netns" (OuterVolumeSpecName: "host-run-netns") pod "c4d15639-62fb-41b7-a1d4-6f51f3af6d99" (UID: "c4d15639-62fb-41b7-a1d4-6f51f3af6d99"). InnerVolumeSpecName "host-run-netns". PluginName "kubernetes.io/host-path", VolumeGidValue "" Jan 28 20:50:34 crc kubenswrapper[4746]: I0128 20:50:34.548691 4746 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/c4d15639-62fb-41b7-a1d4-6f51f3af6d99-log-socket" (OuterVolumeSpecName: "log-socket") pod "c4d15639-62fb-41b7-a1d4-6f51f3af6d99" (UID: "c4d15639-62fb-41b7-a1d4-6f51f3af6d99"). InnerVolumeSpecName "log-socket". 
PluginName "kubernetes.io/host-path", VolumeGidValue "" Jan 28 20:50:34 crc kubenswrapper[4746]: I0128 20:50:34.548715 4746 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"node-log\" (UniqueName: \"kubernetes.io/host-path/c4d15639-62fb-41b7-a1d4-6f51f3af6d99-node-log\") pod \"c4d15639-62fb-41b7-a1d4-6f51f3af6d99\" (UID: \"c4d15639-62fb-41b7-a1d4-6f51f3af6d99\") " Jan 28 20:50:34 crc kubenswrapper[4746]: I0128 20:50:34.548735 4746 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/c4d15639-62fb-41b7-a1d4-6f51f3af6d99-node-log" (OuterVolumeSpecName: "node-log") pod "c4d15639-62fb-41b7-a1d4-6f51f3af6d99" (UID: "c4d15639-62fb-41b7-a1d4-6f51f3af6d99"). InnerVolumeSpecName "node-log". PluginName "kubernetes.io/host-path", VolumeGidValue "" Jan 28 20:50:34 crc kubenswrapper[4746]: I0128 20:50:34.549352 4746 reconciler_common.go:293] "Volume detached for volume \"host-cni-bin\" (UniqueName: \"kubernetes.io/host-path/c4d15639-62fb-41b7-a1d4-6f51f3af6d99-host-cni-bin\") on node \"crc\" DevicePath \"\"" Jan 28 20:50:34 crc kubenswrapper[4746]: I0128 20:50:34.549373 4746 reconciler_common.go:293] "Volume detached for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/c4d15639-62fb-41b7-a1d4-6f51f3af6d99-ovnkube-config\") on node \"crc\" DevicePath \"\"" Jan 28 20:50:34 crc kubenswrapper[4746]: I0128 20:50:34.549386 4746 reconciler_common.go:293] "Volume detached for volume \"host-kubelet\" (UniqueName: \"kubernetes.io/host-path/c4d15639-62fb-41b7-a1d4-6f51f3af6d99-host-kubelet\") on node \"crc\" DevicePath \"\"" Jan 28 20:50:34 crc kubenswrapper[4746]: I0128 20:50:34.549396 4746 reconciler_common.go:293] "Volume detached for volume \"run-openvswitch\" (UniqueName: \"kubernetes.io/host-path/c4d15639-62fb-41b7-a1d4-6f51f3af6d99-run-openvswitch\") on node \"crc\" DevicePath \"\"" Jan 28 20:50:34 crc kubenswrapper[4746]: I0128 20:50:34.549407 4746 reconciler_common.go:293] "Volume detached 
for volume \"host-var-lib-cni-networks-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/c4d15639-62fb-41b7-a1d4-6f51f3af6d99-host-var-lib-cni-networks-ovn-kubernetes\") on node \"crc\" DevicePath \"\"" Jan 28 20:50:34 crc kubenswrapper[4746]: I0128 20:50:34.549421 4746 reconciler_common.go:293] "Volume detached for volume \"run-ovn\" (UniqueName: \"kubernetes.io/host-path/c4d15639-62fb-41b7-a1d4-6f51f3af6d99-run-ovn\") on node \"crc\" DevicePath \"\"" Jan 28 20:50:34 crc kubenswrapper[4746]: I0128 20:50:34.549428 4746 reconciler_common.go:293] "Volume detached for volume \"host-cni-netd\" (UniqueName: \"kubernetes.io/host-path/c4d15639-62fb-41b7-a1d4-6f51f3af6d99-host-cni-netd\") on node \"crc\" DevicePath \"\"" Jan 28 20:50:34 crc kubenswrapper[4746]: I0128 20:50:34.549439 4746 reconciler_common.go:293] "Volume detached for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/c4d15639-62fb-41b7-a1d4-6f51f3af6d99-host-run-netns\") on node \"crc\" DevicePath \"\"" Jan 28 20:50:34 crc kubenswrapper[4746]: I0128 20:50:34.549448 4746 reconciler_common.go:293] "Volume detached for volume \"log-socket\" (UniqueName: \"kubernetes.io/host-path/c4d15639-62fb-41b7-a1d4-6f51f3af6d99-log-socket\") on node \"crc\" DevicePath \"\"" Jan 28 20:50:34 crc kubenswrapper[4746]: I0128 20:50:34.549456 4746 reconciler_common.go:293] "Volume detached for volume \"node-log\" (UniqueName: \"kubernetes.io/host-path/c4d15639-62fb-41b7-a1d4-6f51f3af6d99-node-log\") on node \"crc\" DevicePath \"\"" Jan 28 20:50:34 crc kubenswrapper[4746]: I0128 20:50:34.549464 4746 reconciler_common.go:293] "Volume detached for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/c4d15639-62fb-41b7-a1d4-6f51f3af6d99-env-overrides\") on node \"crc\" DevicePath \"\"" Jan 28 20:50:34 crc kubenswrapper[4746]: I0128 20:50:34.549474 4746 reconciler_common.go:293] "Volume detached for volume \"etc-openvswitch\" (UniqueName: 
\"kubernetes.io/host-path/c4d15639-62fb-41b7-a1d4-6f51f3af6d99-etc-openvswitch\") on node \"crc\" DevicePath \"\"" Jan 28 20:50:34 crc kubenswrapper[4746]: I0128 20:50:34.549485 4746 reconciler_common.go:293] "Volume detached for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/c4d15639-62fb-41b7-a1d4-6f51f3af6d99-host-slash\") on node \"crc\" DevicePath \"\"" Jan 28 20:50:34 crc kubenswrapper[4746]: I0128 20:50:34.549496 4746 reconciler_common.go:293] "Volume detached for volume \"host-run-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/c4d15639-62fb-41b7-a1d4-6f51f3af6d99-host-run-ovn-kubernetes\") on node \"crc\" DevicePath \"\"" Jan 28 20:50:34 crc kubenswrapper[4746]: I0128 20:50:34.549505 4746 reconciler_common.go:293] "Volume detached for volume \"var-lib-openvswitch\" (UniqueName: \"kubernetes.io/host-path/c4d15639-62fb-41b7-a1d4-6f51f3af6d99-var-lib-openvswitch\") on node \"crc\" DevicePath \"\"" Jan 28 20:50:34 crc kubenswrapper[4746]: I0128 20:50:34.549514 4746 reconciler_common.go:293] "Volume detached for volume \"systemd-units\" (UniqueName: \"kubernetes.io/host-path/c4d15639-62fb-41b7-a1d4-6f51f3af6d99-systemd-units\") on node \"crc\" DevicePath \"\"" Jan 28 20:50:34 crc kubenswrapper[4746]: I0128 20:50:34.549759 4746 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/c4d15639-62fb-41b7-a1d4-6f51f3af6d99-ovnkube-script-lib" (OuterVolumeSpecName: "ovnkube-script-lib") pod "c4d15639-62fb-41b7-a1d4-6f51f3af6d99" (UID: "c4d15639-62fb-41b7-a1d4-6f51f3af6d99"). InnerVolumeSpecName "ovnkube-script-lib". 
PluginName "kubernetes.io/configmap", VolumeGidValue ""
Jan 28 20:50:34 crc kubenswrapper[4746]: I0128 20:50:34.557861 4746 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/c4d15639-62fb-41b7-a1d4-6f51f3af6d99-kube-api-access-kgrqs" (OuterVolumeSpecName: "kube-api-access-kgrqs") pod "c4d15639-62fb-41b7-a1d4-6f51f3af6d99" (UID: "c4d15639-62fb-41b7-a1d4-6f51f3af6d99"). InnerVolumeSpecName "kube-api-access-kgrqs". PluginName "kubernetes.io/projected", VolumeGidValue ""
Jan 28 20:50:34 crc kubenswrapper[4746]: I0128 20:50:34.565396 4746 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/c4d15639-62fb-41b7-a1d4-6f51f3af6d99-ovn-node-metrics-cert" (OuterVolumeSpecName: "ovn-node-metrics-cert") pod "c4d15639-62fb-41b7-a1d4-6f51f3af6d99" (UID: "c4d15639-62fb-41b7-a1d4-6f51f3af6d99"). InnerVolumeSpecName "ovn-node-metrics-cert". PluginName "kubernetes.io/secret", VolumeGidValue ""
Jan 28 20:50:34 crc kubenswrapper[4746]: I0128 20:50:34.567258 4746 scope.go:117] "RemoveContainer" containerID="17d09f7f46014da3713f76f111bca0a7ce8a24a154aa6144291d760d85e7d3ec"
Jan 28 20:50:34 crc kubenswrapper[4746]: I0128 20:50:34.570832 4746 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/c4d15639-62fb-41b7-a1d4-6f51f3af6d99-run-systemd" (OuterVolumeSpecName: "run-systemd") pod "c4d15639-62fb-41b7-a1d4-6f51f3af6d99" (UID: "c4d15639-62fb-41b7-a1d4-6f51f3af6d99"). InnerVolumeSpecName "run-systemd". PluginName "kubernetes.io/host-path", VolumeGidValue ""
Jan 28 20:50:34 crc kubenswrapper[4746]: I0128 20:50:34.587504 4746 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-ovn-kubernetes/ovnkube-node-48hq4"]
Jan 28 20:50:34 crc kubenswrapper[4746]: E0128 20:50:34.587769 4746 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="c4d15639-62fb-41b7-a1d4-6f51f3af6d99" containerName="northd"
Jan 28 20:50:34 crc kubenswrapper[4746]: I0128 20:50:34.587799 4746 state_mem.go:107] "Deleted CPUSet assignment" podUID="c4d15639-62fb-41b7-a1d4-6f51f3af6d99" containerName="northd"
Jan 28 20:50:34 crc kubenswrapper[4746]: E0128 20:50:34.587811 4746 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="c4d15639-62fb-41b7-a1d4-6f51f3af6d99" containerName="ovnkube-controller"
Jan 28 20:50:34 crc kubenswrapper[4746]: I0128 20:50:34.587820 4746 state_mem.go:107] "Deleted CPUSet assignment" podUID="c4d15639-62fb-41b7-a1d4-6f51f3af6d99" containerName="ovnkube-controller"
Jan 28 20:50:34 crc kubenswrapper[4746]: E0128 20:50:34.587829 4746 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="c4d15639-62fb-41b7-a1d4-6f51f3af6d99" containerName="kube-rbac-proxy-ovn-metrics"
Jan 28 20:50:34 crc kubenswrapper[4746]: I0128 20:50:34.587836 4746 state_mem.go:107] "Deleted CPUSet assignment" podUID="c4d15639-62fb-41b7-a1d4-6f51f3af6d99" containerName="kube-rbac-proxy-ovn-metrics"
Jan 28 20:50:34 crc kubenswrapper[4746]: E0128 20:50:34.587842 4746 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="c4d15639-62fb-41b7-a1d4-6f51f3af6d99" containerName="kubecfg-setup"
Jan 28 20:50:34 crc kubenswrapper[4746]: I0128 20:50:34.587848 4746 state_mem.go:107] "Deleted CPUSet assignment" podUID="c4d15639-62fb-41b7-a1d4-6f51f3af6d99" containerName="kubecfg-setup"
Jan 28 20:50:34 crc kubenswrapper[4746]: E0128 20:50:34.587870 4746 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="c4d15639-62fb-41b7-a1d4-6f51f3af6d99" containerName="ovnkube-controller"
Jan 28 20:50:34 crc kubenswrapper[4746]: I0128 20:50:34.587878 4746 state_mem.go:107] "Deleted CPUSet assignment" podUID="c4d15639-62fb-41b7-a1d4-6f51f3af6d99" containerName="ovnkube-controller"
Jan 28 20:50:34 crc kubenswrapper[4746]: E0128 20:50:34.587885 4746 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="c4d15639-62fb-41b7-a1d4-6f51f3af6d99" containerName="ovnkube-controller"
Jan 28 20:50:34 crc kubenswrapper[4746]: I0128 20:50:34.587891 4746 state_mem.go:107] "Deleted CPUSet assignment" podUID="c4d15639-62fb-41b7-a1d4-6f51f3af6d99" containerName="ovnkube-controller"
Jan 28 20:50:34 crc kubenswrapper[4746]: E0128 20:50:34.587900 4746 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="3e93805e-6f5f-4618-b962-d9fca6cfe272" containerName="util"
Jan 28 20:50:34 crc kubenswrapper[4746]: I0128 20:50:34.587906 4746 state_mem.go:107] "Deleted CPUSet assignment" podUID="3e93805e-6f5f-4618-b962-d9fca6cfe272" containerName="util"
Jan 28 20:50:34 crc kubenswrapper[4746]: E0128 20:50:34.587914 4746 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="3e93805e-6f5f-4618-b962-d9fca6cfe272" containerName="pull"
Jan 28 20:50:34 crc kubenswrapper[4746]: I0128 20:50:34.587919 4746 state_mem.go:107] "Deleted CPUSet assignment" podUID="3e93805e-6f5f-4618-b962-d9fca6cfe272" containerName="pull"
Jan 28 20:50:34 crc kubenswrapper[4746]: E0128 20:50:34.587927 4746 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="c4d15639-62fb-41b7-a1d4-6f51f3af6d99" containerName="ovnkube-controller"
Jan 28 20:50:34 crc kubenswrapper[4746]: I0128 20:50:34.587949 4746 state_mem.go:107] "Deleted CPUSet assignment" podUID="c4d15639-62fb-41b7-a1d4-6f51f3af6d99" containerName="ovnkube-controller"
Jan 28 20:50:34 crc kubenswrapper[4746]: E0128 20:50:34.587961 4746 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="c4d15639-62fb-41b7-a1d4-6f51f3af6d99" containerName="kube-rbac-proxy-node"
Jan 28 20:50:34 crc kubenswrapper[4746]: I0128 20:50:34.587968 4746 state_mem.go:107] "Deleted CPUSet assignment" podUID="c4d15639-62fb-41b7-a1d4-6f51f3af6d99" containerName="kube-rbac-proxy-node"
Jan 28 20:50:34 crc kubenswrapper[4746]: E0128 20:50:34.587981 4746 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="c4d15639-62fb-41b7-a1d4-6f51f3af6d99" containerName="sbdb"
Jan 28 20:50:34 crc kubenswrapper[4746]: I0128 20:50:34.587988 4746 state_mem.go:107] "Deleted CPUSet assignment" podUID="c4d15639-62fb-41b7-a1d4-6f51f3af6d99" containerName="sbdb"
Jan 28 20:50:34 crc kubenswrapper[4746]: E0128 20:50:34.588002 4746 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="c4d15639-62fb-41b7-a1d4-6f51f3af6d99" containerName="nbdb"
Jan 28 20:50:34 crc kubenswrapper[4746]: I0128 20:50:34.588009 4746 state_mem.go:107] "Deleted CPUSet assignment" podUID="c4d15639-62fb-41b7-a1d4-6f51f3af6d99" containerName="nbdb"
Jan 28 20:50:34 crc kubenswrapper[4746]: E0128 20:50:34.588037 4746 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="3e93805e-6f5f-4618-b962-d9fca6cfe272" containerName="extract"
Jan 28 20:50:34 crc kubenswrapper[4746]: I0128 20:50:34.588043 4746 state_mem.go:107] "Deleted CPUSet assignment" podUID="3e93805e-6f5f-4618-b962-d9fca6cfe272" containerName="extract"
Jan 28 20:50:34 crc kubenswrapper[4746]: E0128 20:50:34.588054 4746 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="c4d15639-62fb-41b7-a1d4-6f51f3af6d99" containerName="ovn-controller"
Jan 28 20:50:34 crc kubenswrapper[4746]: I0128 20:50:34.588062 4746 state_mem.go:107] "Deleted CPUSet assignment" podUID="c4d15639-62fb-41b7-a1d4-6f51f3af6d99" containerName="ovn-controller"
Jan 28 20:50:34 crc kubenswrapper[4746]: E0128 20:50:34.588112 4746 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="c4d15639-62fb-41b7-a1d4-6f51f3af6d99" containerName="ovn-acl-logging"
Jan 28 20:50:34 crc kubenswrapper[4746]: I0128 20:50:34.588122 4746 state_mem.go:107] "Deleted CPUSet assignment" podUID="c4d15639-62fb-41b7-a1d4-6f51f3af6d99" containerName="ovn-acl-logging"
Jan 28 20:50:34 crc kubenswrapper[4746]: I0128 20:50:34.588265 4746 memory_manager.go:354] "RemoveStaleState removing state" podUID="c4d15639-62fb-41b7-a1d4-6f51f3af6d99" containerName="ovnkube-controller"
Jan 28 20:50:34 crc kubenswrapper[4746]: I0128 20:50:34.588277 4746 memory_manager.go:354] "RemoveStaleState removing state" podUID="c4d15639-62fb-41b7-a1d4-6f51f3af6d99" containerName="ovnkube-controller"
Jan 28 20:50:34 crc kubenswrapper[4746]: I0128 20:50:34.588285 4746 memory_manager.go:354] "RemoveStaleState removing state" podUID="c4d15639-62fb-41b7-a1d4-6f51f3af6d99" containerName="nbdb"
Jan 28 20:50:34 crc kubenswrapper[4746]: I0128 20:50:34.588294 4746 memory_manager.go:354] "RemoveStaleState removing state" podUID="c4d15639-62fb-41b7-a1d4-6f51f3af6d99" containerName="kube-rbac-proxy-node"
Jan 28 20:50:34 crc kubenswrapper[4746]: I0128 20:50:34.588302 4746 memory_manager.go:354] "RemoveStaleState removing state" podUID="c4d15639-62fb-41b7-a1d4-6f51f3af6d99" containerName="ovnkube-controller"
Jan 28 20:50:34 crc kubenswrapper[4746]: I0128 20:50:34.588308 4746 memory_manager.go:354] "RemoveStaleState removing state" podUID="c4d15639-62fb-41b7-a1d4-6f51f3af6d99" containerName="ovnkube-controller"
Jan 28 20:50:34 crc kubenswrapper[4746]: I0128 20:50:34.588316 4746 memory_manager.go:354] "RemoveStaleState removing state" podUID="c4d15639-62fb-41b7-a1d4-6f51f3af6d99" containerName="northd"
Jan 28 20:50:34 crc kubenswrapper[4746]: I0128 20:50:34.588350 4746 memory_manager.go:354] "RemoveStaleState removing state" podUID="c4d15639-62fb-41b7-a1d4-6f51f3af6d99" containerName="kube-rbac-proxy-ovn-metrics"
Jan 28 20:50:34 crc kubenswrapper[4746]: I0128 20:50:34.588363 4746 memory_manager.go:354] "RemoveStaleState removing state" podUID="c4d15639-62fb-41b7-a1d4-6f51f3af6d99" containerName="ovn-controller"
Jan 28 20:50:34 crc kubenswrapper[4746]: I0128 20:50:34.588374 4746 memory_manager.go:354] "RemoveStaleState removing state" podUID="c4d15639-62fb-41b7-a1d4-6f51f3af6d99" containerName="ovn-acl-logging"
Jan 28 20:50:34 crc kubenswrapper[4746]: I0128 20:50:34.588386 4746 memory_manager.go:354] "RemoveStaleState removing state" podUID="3e93805e-6f5f-4618-b962-d9fca6cfe272" containerName="extract"
Jan 28 20:50:34 crc kubenswrapper[4746]: I0128 20:50:34.588398 4746 memory_manager.go:354] "RemoveStaleState removing state" podUID="c4d15639-62fb-41b7-a1d4-6f51f3af6d99" containerName="sbdb"
Jan 28 20:50:34 crc kubenswrapper[4746]: E0128 20:50:34.588541 4746 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="c4d15639-62fb-41b7-a1d4-6f51f3af6d99" containerName="ovnkube-controller"
Jan 28 20:50:34 crc kubenswrapper[4746]: I0128 20:50:34.588551 4746 state_mem.go:107] "Deleted CPUSet assignment" podUID="c4d15639-62fb-41b7-a1d4-6f51f3af6d99" containerName="ovnkube-controller"
Jan 28 20:50:34 crc kubenswrapper[4746]: I0128 20:50:34.588709 4746 memory_manager.go:354] "RemoveStaleState removing state" podUID="c4d15639-62fb-41b7-a1d4-6f51f3af6d99" containerName="ovnkube-controller"
Jan 28 20:50:34 crc kubenswrapper[4746]: I0128 20:50:34.594720 4746 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ovn-kubernetes/ovnkube-node-48hq4"
Jan 28 20:50:34 crc kubenswrapper[4746]: I0128 20:50:34.600629 4746 scope.go:117] "RemoveContainer" containerID="b7ad895d0ff1ad570136d96671e1bde5c62882af50a407945ef0e7bc52727b88"
Jan 28 20:50:34 crc kubenswrapper[4746]: I0128 20:50:34.623162 4746 scope.go:117] "RemoveContainer" containerID="73e82be84afa7be760acda86ebf6c956d558dcbd4f37f049fba84e08c75fcd9f"
Jan 28 20:50:34 crc kubenswrapper[4746]: I0128 20:50:34.646511 4746 scope.go:117] "RemoveContainer" containerID="2637734601bf1e43535b70bd62cc6aea7fc9f62d0d2a33c83602e7065a793be1"
Jan 28 20:50:34 crc kubenswrapper[4746]: I0128 20:50:34.651464 4746 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-kgrqs\" (UniqueName: \"kubernetes.io/projected/c4d15639-62fb-41b7-a1d4-6f51f3af6d99-kube-api-access-kgrqs\") on node \"crc\" DevicePath \"\""
Jan 28 20:50:34 crc kubenswrapper[4746]: I0128 20:50:34.651515 4746 reconciler_common.go:293] "Volume detached for volume \"ovn-node-metrics-cert\" (UniqueName: \"kubernetes.io/secret/c4d15639-62fb-41b7-a1d4-6f51f3af6d99-ovn-node-metrics-cert\") on node \"crc\" DevicePath \"\""
Jan 28 20:50:34 crc kubenswrapper[4746]: I0128 20:50:34.651531 4746 reconciler_common.go:293] "Volume detached for volume \"ovnkube-script-lib\" (UniqueName: \"kubernetes.io/configmap/c4d15639-62fb-41b7-a1d4-6f51f3af6d99-ovnkube-script-lib\") on node \"crc\" DevicePath \"\""
Jan 28 20:50:34 crc kubenswrapper[4746]: I0128 20:50:34.651544 4746 reconciler_common.go:293] "Volume detached for volume \"run-systemd\" (UniqueName: \"kubernetes.io/host-path/c4d15639-62fb-41b7-a1d4-6f51f3af6d99-run-systemd\") on node \"crc\" DevicePath \"\""
Jan 28 20:50:34 crc kubenswrapper[4746]: I0128 20:50:34.663981 4746 scope.go:117] "RemoveContainer" containerID="af842ba33fdf06048038dd3e12c59f7a9d0867aa28acee1f2cc4571e7db28b88"
Jan 28 20:50:34 crc kubenswrapper[4746]: I0128 20:50:34.680497 4746 scope.go:117] "RemoveContainer" containerID="57a776a1daf1b17eb44d9fa09fe2f7ec01f647aa0a40d0d571736d2f7c2f8e4d"
Jan 28 20:50:34 crc kubenswrapper[4746]: I0128 20:50:34.703224 4746 scope.go:117] "RemoveContainer" containerID="45e59dfe364d511e17b8264a31157501de3ed23e7125d79979df83d328242328"
Jan 28 20:50:34 crc kubenswrapper[4746]: I0128 20:50:34.721639 4746 scope.go:117] "RemoveContainer" containerID="41fb0df09b842458b04f1cb1896433afe675709f7b2453543bdab40d606d6e7b"
Jan 28 20:50:34 crc kubenswrapper[4746]: E0128 20:50:34.722237 4746 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"41fb0df09b842458b04f1cb1896433afe675709f7b2453543bdab40d606d6e7b\": container with ID starting with 41fb0df09b842458b04f1cb1896433afe675709f7b2453543bdab40d606d6e7b not found: ID does not exist" containerID="41fb0df09b842458b04f1cb1896433afe675709f7b2453543bdab40d606d6e7b"
Jan 28 20:50:34 crc kubenswrapper[4746]: I0128 20:50:34.722283 4746 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"41fb0df09b842458b04f1cb1896433afe675709f7b2453543bdab40d606d6e7b"} err="failed to get container status \"41fb0df09b842458b04f1cb1896433afe675709f7b2453543bdab40d606d6e7b\": rpc error: code = NotFound desc = could not find container \"41fb0df09b842458b04f1cb1896433afe675709f7b2453543bdab40d606d6e7b\": container with ID starting with 41fb0df09b842458b04f1cb1896433afe675709f7b2453543bdab40d606d6e7b not found: ID does not exist"
Jan 28 20:50:34 crc kubenswrapper[4746]: I0128 20:50:34.722309 4746 scope.go:117] "RemoveContainer" containerID="dac4fcde6d3cb0d62e92dd34133b79787bdd8338dcee58772ed7584830a7cb2c"
Jan 28 20:50:34 crc kubenswrapper[4746]: E0128 20:50:34.722787 4746 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"dac4fcde6d3cb0d62e92dd34133b79787bdd8338dcee58772ed7584830a7cb2c\": container with ID starting with dac4fcde6d3cb0d62e92dd34133b79787bdd8338dcee58772ed7584830a7cb2c not found: ID does not exist" containerID="dac4fcde6d3cb0d62e92dd34133b79787bdd8338dcee58772ed7584830a7cb2c"
Jan 28 20:50:34 crc kubenswrapper[4746]: I0128 20:50:34.722852 4746 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"dac4fcde6d3cb0d62e92dd34133b79787bdd8338dcee58772ed7584830a7cb2c"} err="failed to get container status \"dac4fcde6d3cb0d62e92dd34133b79787bdd8338dcee58772ed7584830a7cb2c\": rpc error: code = NotFound desc = could not find container \"dac4fcde6d3cb0d62e92dd34133b79787bdd8338dcee58772ed7584830a7cb2c\": container with ID starting with dac4fcde6d3cb0d62e92dd34133b79787bdd8338dcee58772ed7584830a7cb2c not found: ID does not exist"
Jan 28 20:50:34 crc kubenswrapper[4746]: I0128 20:50:34.722918 4746 scope.go:117] "RemoveContainer" containerID="cdd60109dba5f45418dd5bc29c649a50aae8479fa77fc3dc50ca4f7c3042e3fc"
Jan 28 20:50:34 crc kubenswrapper[4746]: E0128 20:50:34.723342 4746 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"cdd60109dba5f45418dd5bc29c649a50aae8479fa77fc3dc50ca4f7c3042e3fc\": container with ID starting with cdd60109dba5f45418dd5bc29c649a50aae8479fa77fc3dc50ca4f7c3042e3fc not found: ID does not exist" containerID="cdd60109dba5f45418dd5bc29c649a50aae8479fa77fc3dc50ca4f7c3042e3fc"
Jan 28 20:50:34 crc kubenswrapper[4746]: I0128 20:50:34.723370 4746 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"cdd60109dba5f45418dd5bc29c649a50aae8479fa77fc3dc50ca4f7c3042e3fc"} err="failed to get container status \"cdd60109dba5f45418dd5bc29c649a50aae8479fa77fc3dc50ca4f7c3042e3fc\": rpc error: code = NotFound desc = could not find container \"cdd60109dba5f45418dd5bc29c649a50aae8479fa77fc3dc50ca4f7c3042e3fc\": container with ID starting with cdd60109dba5f45418dd5bc29c649a50aae8479fa77fc3dc50ca4f7c3042e3fc not found: ID does not exist"
Jan 28 20:50:34 crc kubenswrapper[4746]: I0128 20:50:34.723387 4746 scope.go:117] "RemoveContainer" containerID="17d09f7f46014da3713f76f111bca0a7ce8a24a154aa6144291d760d85e7d3ec"
Jan 28 20:50:34 crc kubenswrapper[4746]: E0128 20:50:34.723760 4746 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"17d09f7f46014da3713f76f111bca0a7ce8a24a154aa6144291d760d85e7d3ec\": container with ID starting with 17d09f7f46014da3713f76f111bca0a7ce8a24a154aa6144291d760d85e7d3ec not found: ID does not exist" containerID="17d09f7f46014da3713f76f111bca0a7ce8a24a154aa6144291d760d85e7d3ec"
Jan 28 20:50:34 crc kubenswrapper[4746]: I0128 20:50:34.723785 4746 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"17d09f7f46014da3713f76f111bca0a7ce8a24a154aa6144291d760d85e7d3ec"} err="failed to get container status \"17d09f7f46014da3713f76f111bca0a7ce8a24a154aa6144291d760d85e7d3ec\": rpc error: code = NotFound desc = could not find container \"17d09f7f46014da3713f76f111bca0a7ce8a24a154aa6144291d760d85e7d3ec\": container with ID starting with 17d09f7f46014da3713f76f111bca0a7ce8a24a154aa6144291d760d85e7d3ec not found: ID does not exist"
Jan 28 20:50:34 crc kubenswrapper[4746]: I0128 20:50:34.723802 4746 scope.go:117] "RemoveContainer" containerID="b7ad895d0ff1ad570136d96671e1bde5c62882af50a407945ef0e7bc52727b88"
Jan 28 20:50:34 crc kubenswrapper[4746]: E0128 20:50:34.724063 4746 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"b7ad895d0ff1ad570136d96671e1bde5c62882af50a407945ef0e7bc52727b88\": container with ID starting with b7ad895d0ff1ad570136d96671e1bde5c62882af50a407945ef0e7bc52727b88 not found: ID does not exist" containerID="b7ad895d0ff1ad570136d96671e1bde5c62882af50a407945ef0e7bc52727b88"
Jan 28 20:50:34 crc kubenswrapper[4746]: I0128 20:50:34.724122 4746 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"b7ad895d0ff1ad570136d96671e1bde5c62882af50a407945ef0e7bc52727b88"} err="failed to get container status \"b7ad895d0ff1ad570136d96671e1bde5c62882af50a407945ef0e7bc52727b88\": rpc error: code = NotFound desc = could not find container \"b7ad895d0ff1ad570136d96671e1bde5c62882af50a407945ef0e7bc52727b88\": container with ID starting with b7ad895d0ff1ad570136d96671e1bde5c62882af50a407945ef0e7bc52727b88 not found: ID does not exist"
Jan 28 20:50:34 crc kubenswrapper[4746]: I0128 20:50:34.724144 4746 scope.go:117] "RemoveContainer" containerID="73e82be84afa7be760acda86ebf6c956d558dcbd4f37f049fba84e08c75fcd9f"
Jan 28 20:50:34 crc kubenswrapper[4746]: E0128 20:50:34.724459 4746 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"73e82be84afa7be760acda86ebf6c956d558dcbd4f37f049fba84e08c75fcd9f\": container with ID starting with 73e82be84afa7be760acda86ebf6c956d558dcbd4f37f049fba84e08c75fcd9f not found: ID does not exist" containerID="73e82be84afa7be760acda86ebf6c956d558dcbd4f37f049fba84e08c75fcd9f"
Jan 28 20:50:34 crc kubenswrapper[4746]: I0128 20:50:34.724489 4746 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"73e82be84afa7be760acda86ebf6c956d558dcbd4f37f049fba84e08c75fcd9f"} err="failed to get container status \"73e82be84afa7be760acda86ebf6c956d558dcbd4f37f049fba84e08c75fcd9f\": rpc error: code = NotFound desc = could not find container \"73e82be84afa7be760acda86ebf6c956d558dcbd4f37f049fba84e08c75fcd9f\": container with ID starting with 73e82be84afa7be760acda86ebf6c956d558dcbd4f37f049fba84e08c75fcd9f not found: ID does not exist"
Jan 28 20:50:34 crc kubenswrapper[4746]: I0128 20:50:34.724508 4746 scope.go:117] "RemoveContainer" containerID="2637734601bf1e43535b70bd62cc6aea7fc9f62d0d2a33c83602e7065a793be1"
Jan 28 20:50:34 crc kubenswrapper[4746]: E0128 20:50:34.724879 4746 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"2637734601bf1e43535b70bd62cc6aea7fc9f62d0d2a33c83602e7065a793be1\": container with ID starting with 2637734601bf1e43535b70bd62cc6aea7fc9f62d0d2a33c83602e7065a793be1 not found: ID does not exist" containerID="2637734601bf1e43535b70bd62cc6aea7fc9f62d0d2a33c83602e7065a793be1"
Jan 28 20:50:34 crc kubenswrapper[4746]: I0128 20:50:34.724925 4746 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"2637734601bf1e43535b70bd62cc6aea7fc9f62d0d2a33c83602e7065a793be1"} err="failed to get container status \"2637734601bf1e43535b70bd62cc6aea7fc9f62d0d2a33c83602e7065a793be1\": rpc error: code = NotFound desc = could not find container \"2637734601bf1e43535b70bd62cc6aea7fc9f62d0d2a33c83602e7065a793be1\": container with ID starting with 2637734601bf1e43535b70bd62cc6aea7fc9f62d0d2a33c83602e7065a793be1 not found: ID does not exist"
Jan 28 20:50:34 crc kubenswrapper[4746]: I0128 20:50:34.724942 4746 scope.go:117] "RemoveContainer" containerID="af842ba33fdf06048038dd3e12c59f7a9d0867aa28acee1f2cc4571e7db28b88"
Jan 28 20:50:34 crc kubenswrapper[4746]: E0128 20:50:34.725245 4746 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"af842ba33fdf06048038dd3e12c59f7a9d0867aa28acee1f2cc4571e7db28b88\": container with ID starting with af842ba33fdf06048038dd3e12c59f7a9d0867aa28acee1f2cc4571e7db28b88 not found: ID does not exist" containerID="af842ba33fdf06048038dd3e12c59f7a9d0867aa28acee1f2cc4571e7db28b88"
Jan 28 20:50:34 crc kubenswrapper[4746]: I0128 20:50:34.725271 4746 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"af842ba33fdf06048038dd3e12c59f7a9d0867aa28acee1f2cc4571e7db28b88"} err="failed to get container status \"af842ba33fdf06048038dd3e12c59f7a9d0867aa28acee1f2cc4571e7db28b88\": rpc error: code = NotFound desc = could not find container \"af842ba33fdf06048038dd3e12c59f7a9d0867aa28acee1f2cc4571e7db28b88\": container with ID starting with af842ba33fdf06048038dd3e12c59f7a9d0867aa28acee1f2cc4571e7db28b88 not found: ID does not exist"
Jan 28 20:50:34 crc kubenswrapper[4746]: I0128 20:50:34.725288 4746 scope.go:117] "RemoveContainer" containerID="57a776a1daf1b17eb44d9fa09fe2f7ec01f647aa0a40d0d571736d2f7c2f8e4d"
Jan 28 20:50:34 crc kubenswrapper[4746]: E0128 20:50:34.729447 4746 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"57a776a1daf1b17eb44d9fa09fe2f7ec01f647aa0a40d0d571736d2f7c2f8e4d\": container with ID starting with 57a776a1daf1b17eb44d9fa09fe2f7ec01f647aa0a40d0d571736d2f7c2f8e4d not found: ID does not exist" containerID="57a776a1daf1b17eb44d9fa09fe2f7ec01f647aa0a40d0d571736d2f7c2f8e4d"
Jan 28 20:50:34 crc kubenswrapper[4746]: I0128 20:50:34.729482 4746 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"57a776a1daf1b17eb44d9fa09fe2f7ec01f647aa0a40d0d571736d2f7c2f8e4d"} err="failed to get container status \"57a776a1daf1b17eb44d9fa09fe2f7ec01f647aa0a40d0d571736d2f7c2f8e4d\": rpc error: code = NotFound desc = could not find container \"57a776a1daf1b17eb44d9fa09fe2f7ec01f647aa0a40d0d571736d2f7c2f8e4d\": container with ID starting with 57a776a1daf1b17eb44d9fa09fe2f7ec01f647aa0a40d0d571736d2f7c2f8e4d not found: ID does not exist"
Jan 28 20:50:34 crc kubenswrapper[4746]: I0128 20:50:34.729498 4746 scope.go:117] "RemoveContainer" containerID="45e59dfe364d511e17b8264a31157501de3ed23e7125d79979df83d328242328"
Jan 28 20:50:34 crc kubenswrapper[4746]: E0128 20:50:34.729743 4746 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"45e59dfe364d511e17b8264a31157501de3ed23e7125d79979df83d328242328\": container with ID starting with 45e59dfe364d511e17b8264a31157501de3ed23e7125d79979df83d328242328 not found: ID does not exist" containerID="45e59dfe364d511e17b8264a31157501de3ed23e7125d79979df83d328242328"
Jan 28 20:50:34 crc kubenswrapper[4746]: I0128 20:50:34.729767 4746 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"45e59dfe364d511e17b8264a31157501de3ed23e7125d79979df83d328242328"} err="failed to get container status \"45e59dfe364d511e17b8264a31157501de3ed23e7125d79979df83d328242328\": rpc error: code = NotFound desc = could not find container \"45e59dfe364d511e17b8264a31157501de3ed23e7125d79979df83d328242328\": container with ID starting with 45e59dfe364d511e17b8264a31157501de3ed23e7125d79979df83d328242328 not found: ID does not exist"
Jan 28 20:50:34 crc kubenswrapper[4746]: I0128 20:50:34.729782 4746 scope.go:117] "RemoveContainer" containerID="41fb0df09b842458b04f1cb1896433afe675709f7b2453543bdab40d606d6e7b"
Jan 28 20:50:34 crc kubenswrapper[4746]: I0128 20:50:34.729993 4746 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"41fb0df09b842458b04f1cb1896433afe675709f7b2453543bdab40d606d6e7b"} err="failed to get container status \"41fb0df09b842458b04f1cb1896433afe675709f7b2453543bdab40d606d6e7b\": rpc error: code = NotFound desc = could not find container \"41fb0df09b842458b04f1cb1896433afe675709f7b2453543bdab40d606d6e7b\": container with ID starting with 41fb0df09b842458b04f1cb1896433afe675709f7b2453543bdab40d606d6e7b not found: ID does not exist"
Jan 28 20:50:34 crc kubenswrapper[4746]: I0128 20:50:34.730021 4746 scope.go:117] "RemoveContainer" containerID="dac4fcde6d3cb0d62e92dd34133b79787bdd8338dcee58772ed7584830a7cb2c"
Jan 28 20:50:34 crc kubenswrapper[4746]: I0128 20:50:34.730231 4746 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"dac4fcde6d3cb0d62e92dd34133b79787bdd8338dcee58772ed7584830a7cb2c"} err="failed to get container status \"dac4fcde6d3cb0d62e92dd34133b79787bdd8338dcee58772ed7584830a7cb2c\": rpc error: code = NotFound desc = could not find container \"dac4fcde6d3cb0d62e92dd34133b79787bdd8338dcee58772ed7584830a7cb2c\": container with ID starting with dac4fcde6d3cb0d62e92dd34133b79787bdd8338dcee58772ed7584830a7cb2c not found: ID does not exist"
Jan 28 20:50:34 crc kubenswrapper[4746]: I0128 20:50:34.730253 4746 scope.go:117] "RemoveContainer" containerID="cdd60109dba5f45418dd5bc29c649a50aae8479fa77fc3dc50ca4f7c3042e3fc"
Jan 28 20:50:34 crc kubenswrapper[4746]: I0128 20:50:34.730448 4746 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"cdd60109dba5f45418dd5bc29c649a50aae8479fa77fc3dc50ca4f7c3042e3fc"} err="failed to get container status \"cdd60109dba5f45418dd5bc29c649a50aae8479fa77fc3dc50ca4f7c3042e3fc\": rpc error: code = NotFound desc = could not find container \"cdd60109dba5f45418dd5bc29c649a50aae8479fa77fc3dc50ca4f7c3042e3fc\": container with ID starting with cdd60109dba5f45418dd5bc29c649a50aae8479fa77fc3dc50ca4f7c3042e3fc not found: ID does not exist"
Jan 28 20:50:34 crc kubenswrapper[4746]: I0128 20:50:34.730471 4746 scope.go:117] "RemoveContainer" containerID="17d09f7f46014da3713f76f111bca0a7ce8a24a154aa6144291d760d85e7d3ec"
Jan 28 20:50:34 crc kubenswrapper[4746]: I0128 20:50:34.730677 4746 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"17d09f7f46014da3713f76f111bca0a7ce8a24a154aa6144291d760d85e7d3ec"} err="failed to get container status \"17d09f7f46014da3713f76f111bca0a7ce8a24a154aa6144291d760d85e7d3ec\": rpc error: code = NotFound desc = could not find container \"17d09f7f46014da3713f76f111bca0a7ce8a24a154aa6144291d760d85e7d3ec\": container with ID starting with 17d09f7f46014da3713f76f111bca0a7ce8a24a154aa6144291d760d85e7d3ec not found: ID does not exist"
Jan 28 20:50:34 crc kubenswrapper[4746]: I0128 20:50:34.730703 4746 scope.go:117] "RemoveContainer" containerID="b7ad895d0ff1ad570136d96671e1bde5c62882af50a407945ef0e7bc52727b88"
Jan 28 20:50:34 crc kubenswrapper[4746]: I0128 20:50:34.730907 4746 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"b7ad895d0ff1ad570136d96671e1bde5c62882af50a407945ef0e7bc52727b88"} err="failed to get container status \"b7ad895d0ff1ad570136d96671e1bde5c62882af50a407945ef0e7bc52727b88\": rpc error: code = NotFound desc = could not find container \"b7ad895d0ff1ad570136d96671e1bde5c62882af50a407945ef0e7bc52727b88\": container with ID starting with b7ad895d0ff1ad570136d96671e1bde5c62882af50a407945ef0e7bc52727b88 not found: ID does not exist"
Jan 28 20:50:34 crc kubenswrapper[4746]: I0128 20:50:34.730927 4746 scope.go:117] "RemoveContainer" containerID="73e82be84afa7be760acda86ebf6c956d558dcbd4f37f049fba84e08c75fcd9f"
Jan 28 20:50:34 crc kubenswrapper[4746]: I0128 20:50:34.731165 4746 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"73e82be84afa7be760acda86ebf6c956d558dcbd4f37f049fba84e08c75fcd9f"} err="failed to get container status \"73e82be84afa7be760acda86ebf6c956d558dcbd4f37f049fba84e08c75fcd9f\": rpc error: code = NotFound desc = could not find container \"73e82be84afa7be760acda86ebf6c956d558dcbd4f37f049fba84e08c75fcd9f\": container with ID starting with 73e82be84afa7be760acda86ebf6c956d558dcbd4f37f049fba84e08c75fcd9f not found: ID does not exist"
Jan 28 20:50:34 crc kubenswrapper[4746]: I0128 20:50:34.731187 4746 scope.go:117] "RemoveContainer" containerID="2637734601bf1e43535b70bd62cc6aea7fc9f62d0d2a33c83602e7065a793be1"
Jan 28 20:50:34 crc kubenswrapper[4746]: I0128 20:50:34.731409 4746 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"2637734601bf1e43535b70bd62cc6aea7fc9f62d0d2a33c83602e7065a793be1"} err="failed to get container status \"2637734601bf1e43535b70bd62cc6aea7fc9f62d0d2a33c83602e7065a793be1\": rpc error: code = NotFound desc = could not find container \"2637734601bf1e43535b70bd62cc6aea7fc9f62d0d2a33c83602e7065a793be1\": container with ID starting with 2637734601bf1e43535b70bd62cc6aea7fc9f62d0d2a33c83602e7065a793be1 not found: ID does not exist"
Jan 28 20:50:34 crc kubenswrapper[4746]: I0128 20:50:34.731435 4746 scope.go:117] "RemoveContainer" containerID="af842ba33fdf06048038dd3e12c59f7a9d0867aa28acee1f2cc4571e7db28b88"
Jan 28 20:50:34 crc kubenswrapper[4746]: I0128 20:50:34.731644 4746 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"af842ba33fdf06048038dd3e12c59f7a9d0867aa28acee1f2cc4571e7db28b88"} err="failed to get container status \"af842ba33fdf06048038dd3e12c59f7a9d0867aa28acee1f2cc4571e7db28b88\": rpc error: code = NotFound desc = could not find container \"af842ba33fdf06048038dd3e12c59f7a9d0867aa28acee1f2cc4571e7db28b88\": container with ID starting with af842ba33fdf06048038dd3e12c59f7a9d0867aa28acee1f2cc4571e7db28b88 not found: ID does not exist"
Jan 28 20:50:34 crc kubenswrapper[4746]: I0128 20:50:34.731670 4746 scope.go:117] "RemoveContainer" containerID="57a776a1daf1b17eb44d9fa09fe2f7ec01f647aa0a40d0d571736d2f7c2f8e4d"
Jan 28 20:50:34 crc kubenswrapper[4746]: I0128 20:50:34.731896 4746 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"57a776a1daf1b17eb44d9fa09fe2f7ec01f647aa0a40d0d571736d2f7c2f8e4d"} err="failed to get container status \"57a776a1daf1b17eb44d9fa09fe2f7ec01f647aa0a40d0d571736d2f7c2f8e4d\": rpc error: code = NotFound desc = could not find container \"57a776a1daf1b17eb44d9fa09fe2f7ec01f647aa0a40d0d571736d2f7c2f8e4d\": container with ID starting with 57a776a1daf1b17eb44d9fa09fe2f7ec01f647aa0a40d0d571736d2f7c2f8e4d not found: ID does not exist"
Jan 28 20:50:34 crc kubenswrapper[4746]: I0128 20:50:34.731919 4746 scope.go:117] "RemoveContainer" containerID="45e59dfe364d511e17b8264a31157501de3ed23e7125d79979df83d328242328"
Jan 28 20:50:34 crc kubenswrapper[4746]: I0128 20:50:34.732148 4746 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"45e59dfe364d511e17b8264a31157501de3ed23e7125d79979df83d328242328"} err="failed to get container status \"45e59dfe364d511e17b8264a31157501de3ed23e7125d79979df83d328242328\": rpc error: code = NotFound desc = could not find container \"45e59dfe364d511e17b8264a31157501de3ed23e7125d79979df83d328242328\": container with ID starting with 45e59dfe364d511e17b8264a31157501de3ed23e7125d79979df83d328242328 not found: ID does not exist"
Jan 28 20:50:34 crc kubenswrapper[4746]: I0128 20:50:34.732172 4746 scope.go:117] "RemoveContainer" containerID="41fb0df09b842458b04f1cb1896433afe675709f7b2453543bdab40d606d6e7b"
Jan 28 20:50:34 crc kubenswrapper[4746]: I0128 20:50:34.732376 4746 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"41fb0df09b842458b04f1cb1896433afe675709f7b2453543bdab40d606d6e7b"} err="failed to get container status \"41fb0df09b842458b04f1cb1896433afe675709f7b2453543bdab40d606d6e7b\": rpc error: code = NotFound desc = could not find container \"41fb0df09b842458b04f1cb1896433afe675709f7b2453543bdab40d606d6e7b\": container with ID starting with 41fb0df09b842458b04f1cb1896433afe675709f7b2453543bdab40d606d6e7b not found: ID does not exist"
Jan 28 20:50:34 crc kubenswrapper[4746]: I0128 20:50:34.732400 4746 scope.go:117] "RemoveContainer" containerID="dac4fcde6d3cb0d62e92dd34133b79787bdd8338dcee58772ed7584830a7cb2c"
Jan 28 20:50:34 crc kubenswrapper[4746]: I0128 20:50:34.732607 4746 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"dac4fcde6d3cb0d62e92dd34133b79787bdd8338dcee58772ed7584830a7cb2c"} err="failed to get container status \"dac4fcde6d3cb0d62e92dd34133b79787bdd8338dcee58772ed7584830a7cb2c\": rpc error: code = NotFound desc = could not find container \"dac4fcde6d3cb0d62e92dd34133b79787bdd8338dcee58772ed7584830a7cb2c\": container with ID starting with dac4fcde6d3cb0d62e92dd34133b79787bdd8338dcee58772ed7584830a7cb2c not found: ID does not exist"
Jan 28 20:50:34 crc kubenswrapper[4746]: I0128 20:50:34.732626 4746 scope.go:117] "RemoveContainer" containerID="cdd60109dba5f45418dd5bc29c649a50aae8479fa77fc3dc50ca4f7c3042e3fc"
Jan 28 20:50:34 crc kubenswrapper[4746]: I0128 20:50:34.732928 4746 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"cdd60109dba5f45418dd5bc29c649a50aae8479fa77fc3dc50ca4f7c3042e3fc"} err="failed to get container status \"cdd60109dba5f45418dd5bc29c649a50aae8479fa77fc3dc50ca4f7c3042e3fc\": rpc error: code = NotFound desc = could not find container \"cdd60109dba5f45418dd5bc29c649a50aae8479fa77fc3dc50ca4f7c3042e3fc\": container with ID starting with cdd60109dba5f45418dd5bc29c649a50aae8479fa77fc3dc50ca4f7c3042e3fc not found: ID does not exist"
Jan 28 20:50:34 crc kubenswrapper[4746]: I0128 20:50:34.732947 4746 scope.go:117] "RemoveContainer" containerID="17d09f7f46014da3713f76f111bca0a7ce8a24a154aa6144291d760d85e7d3ec"
Jan 28 20:50:34 crc kubenswrapper[4746]: I0128 20:50:34.733241 4746 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"17d09f7f46014da3713f76f111bca0a7ce8a24a154aa6144291d760d85e7d3ec"} err="failed to get container status \"17d09f7f46014da3713f76f111bca0a7ce8a24a154aa6144291d760d85e7d3ec\": rpc error: code = NotFound desc = could not find container \"17d09f7f46014da3713f76f111bca0a7ce8a24a154aa6144291d760d85e7d3ec\": container with ID starting with 17d09f7f46014da3713f76f111bca0a7ce8a24a154aa6144291d760d85e7d3ec not found: ID does not exist"
Jan 28 20:50:34 crc kubenswrapper[4746]: I0128 20:50:34.733266 4746 scope.go:117] "RemoveContainer" containerID="b7ad895d0ff1ad570136d96671e1bde5c62882af50a407945ef0e7bc52727b88"
Jan 28 20:50:34 crc kubenswrapper[4746]: I0128 20:50:34.733655 4746 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"b7ad895d0ff1ad570136d96671e1bde5c62882af50a407945ef0e7bc52727b88"} err="failed to get container status \"b7ad895d0ff1ad570136d96671e1bde5c62882af50a407945ef0e7bc52727b88\": rpc error: code = NotFound desc = could not find container \"b7ad895d0ff1ad570136d96671e1bde5c62882af50a407945ef0e7bc52727b88\": container with ID starting with b7ad895d0ff1ad570136d96671e1bde5c62882af50a407945ef0e7bc52727b88 not found: ID does not exist"
Jan 28 20:50:34 crc kubenswrapper[4746]: I0128 20:50:34.733674 4746 scope.go:117] "RemoveContainer" containerID="73e82be84afa7be760acda86ebf6c956d558dcbd4f37f049fba84e08c75fcd9f"
Jan 28 20:50:34 crc kubenswrapper[4746]: I0128 20:50:34.734111 4746 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"73e82be84afa7be760acda86ebf6c956d558dcbd4f37f049fba84e08c75fcd9f"} err="failed to get container status \"73e82be84afa7be760acda86ebf6c956d558dcbd4f37f049fba84e08c75fcd9f\": rpc error: code = NotFound desc = could not find container \"73e82be84afa7be760acda86ebf6c956d558dcbd4f37f049fba84e08c75fcd9f\": container with ID starting with 73e82be84afa7be760acda86ebf6c956d558dcbd4f37f049fba84e08c75fcd9f not found: ID does not exist"
Jan 28 20:50:34 crc kubenswrapper[4746]: I0128 20:50:34.734130 4746 scope.go:117] "RemoveContainer" containerID="2637734601bf1e43535b70bd62cc6aea7fc9f62d0d2a33c83602e7065a793be1"
Jan 28 20:50:34 crc kubenswrapper[4746]: I0128 20:50:34.734514 4746 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"2637734601bf1e43535b70bd62cc6aea7fc9f62d0d2a33c83602e7065a793be1"} err="failed to get container status \"2637734601bf1e43535b70bd62cc6aea7fc9f62d0d2a33c83602e7065a793be1\": rpc error: code = NotFound desc = could not find container \"2637734601bf1e43535b70bd62cc6aea7fc9f62d0d2a33c83602e7065a793be1\": container with ID starting with 2637734601bf1e43535b70bd62cc6aea7fc9f62d0d2a33c83602e7065a793be1 not found: ID does not exist"
Jan 28 20:50:34 crc kubenswrapper[4746]: I0128 20:50:34.734534 4746 scope.go:117] "RemoveContainer" containerID="af842ba33fdf06048038dd3e12c59f7a9d0867aa28acee1f2cc4571e7db28b88"
Jan 28 20:50:34 crc kubenswrapper[4746]: I0128 20:50:34.735187 4746 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"af842ba33fdf06048038dd3e12c59f7a9d0867aa28acee1f2cc4571e7db28b88"} err="failed to get container status \"af842ba33fdf06048038dd3e12c59f7a9d0867aa28acee1f2cc4571e7db28b88\": rpc error: code = NotFound desc = could not find container \"af842ba33fdf06048038dd3e12c59f7a9d0867aa28acee1f2cc4571e7db28b88\": container with ID starting with af842ba33fdf06048038dd3e12c59f7a9d0867aa28acee1f2cc4571e7db28b88 not found: ID does not exist"
Jan 28 20:50:34 crc kubenswrapper[4746]: I0128 20:50:34.735268 4746 scope.go:117] "RemoveContainer" containerID="57a776a1daf1b17eb44d9fa09fe2f7ec01f647aa0a40d0d571736d2f7c2f8e4d"
Jan 28 20:50:34 crc kubenswrapper[4746]: I0128 20:50:34.735783 4746 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"57a776a1daf1b17eb44d9fa09fe2f7ec01f647aa0a40d0d571736d2f7c2f8e4d"} err="failed to get container status \"57a776a1daf1b17eb44d9fa09fe2f7ec01f647aa0a40d0d571736d2f7c2f8e4d\": rpc error: code = NotFound desc = could not find container \"57a776a1daf1b17eb44d9fa09fe2f7ec01f647aa0a40d0d571736d2f7c2f8e4d\": container with ID starting with 57a776a1daf1b17eb44d9fa09fe2f7ec01f647aa0a40d0d571736d2f7c2f8e4d not found: ID does not exist"
Jan 28 20:50:34 crc kubenswrapper[4746]: I0128 20:50:34.735804 4746 scope.go:117] "RemoveContainer" containerID="45e59dfe364d511e17b8264a31157501de3ed23e7125d79979df83d328242328"
Jan 28 20:50:34 crc kubenswrapper[4746]: I0128 20:50:34.736160 4746 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"45e59dfe364d511e17b8264a31157501de3ed23e7125d79979df83d328242328"} err="failed to get container status \"45e59dfe364d511e17b8264a31157501de3ed23e7125d79979df83d328242328\": rpc error: code = NotFound desc = could 
not find container \"45e59dfe364d511e17b8264a31157501de3ed23e7125d79979df83d328242328\": container with ID starting with 45e59dfe364d511e17b8264a31157501de3ed23e7125d79979df83d328242328 not found: ID does not exist" Jan 28 20:50:34 crc kubenswrapper[4746]: I0128 20:50:34.736184 4746 scope.go:117] "RemoveContainer" containerID="41fb0df09b842458b04f1cb1896433afe675709f7b2453543bdab40d606d6e7b" Jan 28 20:50:34 crc kubenswrapper[4746]: I0128 20:50:34.736501 4746 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"41fb0df09b842458b04f1cb1896433afe675709f7b2453543bdab40d606d6e7b"} err="failed to get container status \"41fb0df09b842458b04f1cb1896433afe675709f7b2453543bdab40d606d6e7b\": rpc error: code = NotFound desc = could not find container \"41fb0df09b842458b04f1cb1896433afe675709f7b2453543bdab40d606d6e7b\": container with ID starting with 41fb0df09b842458b04f1cb1896433afe675709f7b2453543bdab40d606d6e7b not found: ID does not exist" Jan 28 20:50:34 crc kubenswrapper[4746]: I0128 20:50:34.736522 4746 scope.go:117] "RemoveContainer" containerID="dac4fcde6d3cb0d62e92dd34133b79787bdd8338dcee58772ed7584830a7cb2c" Jan 28 20:50:34 crc kubenswrapper[4746]: I0128 20:50:34.736816 4746 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"dac4fcde6d3cb0d62e92dd34133b79787bdd8338dcee58772ed7584830a7cb2c"} err="failed to get container status \"dac4fcde6d3cb0d62e92dd34133b79787bdd8338dcee58772ed7584830a7cb2c\": rpc error: code = NotFound desc = could not find container \"dac4fcde6d3cb0d62e92dd34133b79787bdd8338dcee58772ed7584830a7cb2c\": container with ID starting with dac4fcde6d3cb0d62e92dd34133b79787bdd8338dcee58772ed7584830a7cb2c not found: ID does not exist" Jan 28 20:50:34 crc kubenswrapper[4746]: I0128 20:50:34.736837 4746 scope.go:117] "RemoveContainer" containerID="cdd60109dba5f45418dd5bc29c649a50aae8479fa77fc3dc50ca4f7c3042e3fc" Jan 28 20:50:34 crc kubenswrapper[4746]: I0128 
20:50:34.737135 4746 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"cdd60109dba5f45418dd5bc29c649a50aae8479fa77fc3dc50ca4f7c3042e3fc"} err="failed to get container status \"cdd60109dba5f45418dd5bc29c649a50aae8479fa77fc3dc50ca4f7c3042e3fc\": rpc error: code = NotFound desc = could not find container \"cdd60109dba5f45418dd5bc29c649a50aae8479fa77fc3dc50ca4f7c3042e3fc\": container with ID starting with cdd60109dba5f45418dd5bc29c649a50aae8479fa77fc3dc50ca4f7c3042e3fc not found: ID does not exist" Jan 28 20:50:34 crc kubenswrapper[4746]: I0128 20:50:34.737170 4746 scope.go:117] "RemoveContainer" containerID="17d09f7f46014da3713f76f111bca0a7ce8a24a154aa6144291d760d85e7d3ec" Jan 28 20:50:34 crc kubenswrapper[4746]: I0128 20:50:34.737502 4746 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"17d09f7f46014da3713f76f111bca0a7ce8a24a154aa6144291d760d85e7d3ec"} err="failed to get container status \"17d09f7f46014da3713f76f111bca0a7ce8a24a154aa6144291d760d85e7d3ec\": rpc error: code = NotFound desc = could not find container \"17d09f7f46014da3713f76f111bca0a7ce8a24a154aa6144291d760d85e7d3ec\": container with ID starting with 17d09f7f46014da3713f76f111bca0a7ce8a24a154aa6144291d760d85e7d3ec not found: ID does not exist" Jan 28 20:50:34 crc kubenswrapper[4746]: I0128 20:50:34.737541 4746 scope.go:117] "RemoveContainer" containerID="b7ad895d0ff1ad570136d96671e1bde5c62882af50a407945ef0e7bc52727b88" Jan 28 20:50:34 crc kubenswrapper[4746]: I0128 20:50:34.737836 4746 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"b7ad895d0ff1ad570136d96671e1bde5c62882af50a407945ef0e7bc52727b88"} err="failed to get container status \"b7ad895d0ff1ad570136d96671e1bde5c62882af50a407945ef0e7bc52727b88\": rpc error: code = NotFound desc = could not find container \"b7ad895d0ff1ad570136d96671e1bde5c62882af50a407945ef0e7bc52727b88\": container with ID starting with 
b7ad895d0ff1ad570136d96671e1bde5c62882af50a407945ef0e7bc52727b88 not found: ID does not exist" Jan 28 20:50:34 crc kubenswrapper[4746]: I0128 20:50:34.737860 4746 scope.go:117] "RemoveContainer" containerID="73e82be84afa7be760acda86ebf6c956d558dcbd4f37f049fba84e08c75fcd9f" Jan 28 20:50:34 crc kubenswrapper[4746]: I0128 20:50:34.738170 4746 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"73e82be84afa7be760acda86ebf6c956d558dcbd4f37f049fba84e08c75fcd9f"} err="failed to get container status \"73e82be84afa7be760acda86ebf6c956d558dcbd4f37f049fba84e08c75fcd9f\": rpc error: code = NotFound desc = could not find container \"73e82be84afa7be760acda86ebf6c956d558dcbd4f37f049fba84e08c75fcd9f\": container with ID starting with 73e82be84afa7be760acda86ebf6c956d558dcbd4f37f049fba84e08c75fcd9f not found: ID does not exist" Jan 28 20:50:34 crc kubenswrapper[4746]: I0128 20:50:34.738190 4746 scope.go:117] "RemoveContainer" containerID="2637734601bf1e43535b70bd62cc6aea7fc9f62d0d2a33c83602e7065a793be1" Jan 28 20:50:34 crc kubenswrapper[4746]: I0128 20:50:34.738467 4746 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"2637734601bf1e43535b70bd62cc6aea7fc9f62d0d2a33c83602e7065a793be1"} err="failed to get container status \"2637734601bf1e43535b70bd62cc6aea7fc9f62d0d2a33c83602e7065a793be1\": rpc error: code = NotFound desc = could not find container \"2637734601bf1e43535b70bd62cc6aea7fc9f62d0d2a33c83602e7065a793be1\": container with ID starting with 2637734601bf1e43535b70bd62cc6aea7fc9f62d0d2a33c83602e7065a793be1 not found: ID does not exist" Jan 28 20:50:34 crc kubenswrapper[4746]: I0128 20:50:34.738486 4746 scope.go:117] "RemoveContainer" containerID="af842ba33fdf06048038dd3e12c59f7a9d0867aa28acee1f2cc4571e7db28b88" Jan 28 20:50:34 crc kubenswrapper[4746]: I0128 20:50:34.738745 4746 pod_container_deletor.go:53] "DeleteContainer returned error" 
containerID={"Type":"cri-o","ID":"af842ba33fdf06048038dd3e12c59f7a9d0867aa28acee1f2cc4571e7db28b88"} err="failed to get container status \"af842ba33fdf06048038dd3e12c59f7a9d0867aa28acee1f2cc4571e7db28b88\": rpc error: code = NotFound desc = could not find container \"af842ba33fdf06048038dd3e12c59f7a9d0867aa28acee1f2cc4571e7db28b88\": container with ID starting with af842ba33fdf06048038dd3e12c59f7a9d0867aa28acee1f2cc4571e7db28b88 not found: ID does not exist" Jan 28 20:50:34 crc kubenswrapper[4746]: I0128 20:50:34.738763 4746 scope.go:117] "RemoveContainer" containerID="57a776a1daf1b17eb44d9fa09fe2f7ec01f647aa0a40d0d571736d2f7c2f8e4d" Jan 28 20:50:34 crc kubenswrapper[4746]: I0128 20:50:34.739029 4746 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"57a776a1daf1b17eb44d9fa09fe2f7ec01f647aa0a40d0d571736d2f7c2f8e4d"} err="failed to get container status \"57a776a1daf1b17eb44d9fa09fe2f7ec01f647aa0a40d0d571736d2f7c2f8e4d\": rpc error: code = NotFound desc = could not find container \"57a776a1daf1b17eb44d9fa09fe2f7ec01f647aa0a40d0d571736d2f7c2f8e4d\": container with ID starting with 57a776a1daf1b17eb44d9fa09fe2f7ec01f647aa0a40d0d571736d2f7c2f8e4d not found: ID does not exist" Jan 28 20:50:34 crc kubenswrapper[4746]: I0128 20:50:34.739047 4746 scope.go:117] "RemoveContainer" containerID="45e59dfe364d511e17b8264a31157501de3ed23e7125d79979df83d328242328" Jan 28 20:50:34 crc kubenswrapper[4746]: I0128 20:50:34.739319 4746 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"45e59dfe364d511e17b8264a31157501de3ed23e7125d79979df83d328242328"} err="failed to get container status \"45e59dfe364d511e17b8264a31157501de3ed23e7125d79979df83d328242328\": rpc error: code = NotFound desc = could not find container \"45e59dfe364d511e17b8264a31157501de3ed23e7125d79979df83d328242328\": container with ID starting with 45e59dfe364d511e17b8264a31157501de3ed23e7125d79979df83d328242328 not found: ID does not 
exist" Jan 28 20:50:34 crc kubenswrapper[4746]: I0128 20:50:34.739352 4746 scope.go:117] "RemoveContainer" containerID="41fb0df09b842458b04f1cb1896433afe675709f7b2453543bdab40d606d6e7b" Jan 28 20:50:34 crc kubenswrapper[4746]: I0128 20:50:34.739569 4746 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"41fb0df09b842458b04f1cb1896433afe675709f7b2453543bdab40d606d6e7b"} err="failed to get container status \"41fb0df09b842458b04f1cb1896433afe675709f7b2453543bdab40d606d6e7b\": rpc error: code = NotFound desc = could not find container \"41fb0df09b842458b04f1cb1896433afe675709f7b2453543bdab40d606d6e7b\": container with ID starting with 41fb0df09b842458b04f1cb1896433afe675709f7b2453543bdab40d606d6e7b not found: ID does not exist" Jan 28 20:50:34 crc kubenswrapper[4746]: I0128 20:50:34.752738 4746 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-var-lib-cni-networks-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/3f18c1eb-ae42-4def-9588-60e567b82089-host-var-lib-cni-networks-ovn-kubernetes\") pod \"ovnkube-node-48hq4\" (UID: \"3f18c1eb-ae42-4def-9588-60e567b82089\") " pod="openshift-ovn-kubernetes/ovnkube-node-48hq4" Jan 28 20:50:34 crc kubenswrapper[4746]: I0128 20:50:34.752827 4746 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-ovn\" (UniqueName: \"kubernetes.io/host-path/3f18c1eb-ae42-4def-9588-60e567b82089-run-ovn\") pod \"ovnkube-node-48hq4\" (UID: \"3f18c1eb-ae42-4def-9588-60e567b82089\") " pod="openshift-ovn-kubernetes/ovnkube-node-48hq4" Jan 28 20:50:34 crc kubenswrapper[4746]: I0128 20:50:34.752861 4746 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/3f18c1eb-ae42-4def-9588-60e567b82089-host-run-ovn-kubernetes\") pod \"ovnkube-node-48hq4\" (UID: \"3f18c1eb-ae42-4def-9588-60e567b82089\") " 
pod="openshift-ovn-kubernetes/ovnkube-node-48hq4" Jan 28 20:50:34 crc kubenswrapper[4746]: I0128 20:50:34.752882 4746 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/3f18c1eb-ae42-4def-9588-60e567b82089-host-slash\") pod \"ovnkube-node-48hq4\" (UID: \"3f18c1eb-ae42-4def-9588-60e567b82089\") " pod="openshift-ovn-kubernetes/ovnkube-node-48hq4" Jan 28 20:50:34 crc kubenswrapper[4746]: I0128 20:50:34.753034 4746 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-systemd\" (UniqueName: \"kubernetes.io/host-path/3f18c1eb-ae42-4def-9588-60e567b82089-run-systemd\") pod \"ovnkube-node-48hq4\" (UID: \"3f18c1eb-ae42-4def-9588-60e567b82089\") " pod="openshift-ovn-kubernetes/ovnkube-node-48hq4" Jan 28 20:50:34 crc kubenswrapper[4746]: I0128 20:50:34.753111 4746 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-openvswitch\" (UniqueName: \"kubernetes.io/host-path/3f18c1eb-ae42-4def-9588-60e567b82089-etc-openvswitch\") pod \"ovnkube-node-48hq4\" (UID: \"3f18c1eb-ae42-4def-9588-60e567b82089\") " pod="openshift-ovn-kubernetes/ovnkube-node-48hq4" Jan 28 20:50:34 crc kubenswrapper[4746]: I0128 20:50:34.753137 4746 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/3f18c1eb-ae42-4def-9588-60e567b82089-host-run-netns\") pod \"ovnkube-node-48hq4\" (UID: \"3f18c1eb-ae42-4def-9588-60e567b82089\") " pod="openshift-ovn-kubernetes/ovnkube-node-48hq4" Jan 28 20:50:34 crc kubenswrapper[4746]: I0128 20:50:34.753157 4746 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovnkube-script-lib\" (UniqueName: \"kubernetes.io/configmap/3f18c1eb-ae42-4def-9588-60e567b82089-ovnkube-script-lib\") pod \"ovnkube-node-48hq4\" (UID: 
\"3f18c1eb-ae42-4def-9588-60e567b82089\") " pod="openshift-ovn-kubernetes/ovnkube-node-48hq4" Jan 28 20:50:34 crc kubenswrapper[4746]: I0128 20:50:34.753221 4746 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-cni-netd\" (UniqueName: \"kubernetes.io/host-path/3f18c1eb-ae42-4def-9588-60e567b82089-host-cni-netd\") pod \"ovnkube-node-48hq4\" (UID: \"3f18c1eb-ae42-4def-9588-60e567b82089\") " pod="openshift-ovn-kubernetes/ovnkube-node-48hq4" Jan 28 20:50:34 crc kubenswrapper[4746]: I0128 20:50:34.753256 4746 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-kubelet\" (UniqueName: \"kubernetes.io/host-path/3f18c1eb-ae42-4def-9588-60e567b82089-host-kubelet\") pod \"ovnkube-node-48hq4\" (UID: \"3f18c1eb-ae42-4def-9588-60e567b82089\") " pod="openshift-ovn-kubernetes/ovnkube-node-48hq4" Jan 28 20:50:34 crc kubenswrapper[4746]: I0128 20:50:34.753290 4746 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"log-socket\" (UniqueName: \"kubernetes.io/host-path/3f18c1eb-ae42-4def-9588-60e567b82089-log-socket\") pod \"ovnkube-node-48hq4\" (UID: \"3f18c1eb-ae42-4def-9588-60e567b82089\") " pod="openshift-ovn-kubernetes/ovnkube-node-48hq4" Jan 28 20:50:34 crc kubenswrapper[4746]: I0128 20:50:34.753323 4746 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-lib-openvswitch\" (UniqueName: \"kubernetes.io/host-path/3f18c1eb-ae42-4def-9588-60e567b82089-var-lib-openvswitch\") pod \"ovnkube-node-48hq4\" (UID: \"3f18c1eb-ae42-4def-9588-60e567b82089\") " pod="openshift-ovn-kubernetes/ovnkube-node-48hq4" Jan 28 20:50:34 crc kubenswrapper[4746]: I0128 20:50:34.753350 4746 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-cni-bin\" (UniqueName: \"kubernetes.io/host-path/3f18c1eb-ae42-4def-9588-60e567b82089-host-cni-bin\") pod 
\"ovnkube-node-48hq4\" (UID: \"3f18c1eb-ae42-4def-9588-60e567b82089\") " pod="openshift-ovn-kubernetes/ovnkube-node-48hq4" Jan 28 20:50:34 crc kubenswrapper[4746]: I0128 20:50:34.753374 4746 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-log\" (UniqueName: \"kubernetes.io/host-path/3f18c1eb-ae42-4def-9588-60e567b82089-node-log\") pod \"ovnkube-node-48hq4\" (UID: \"3f18c1eb-ae42-4def-9588-60e567b82089\") " pod="openshift-ovn-kubernetes/ovnkube-node-48hq4" Jan 28 20:50:34 crc kubenswrapper[4746]: I0128 20:50:34.753416 4746 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-2rd29\" (UniqueName: \"kubernetes.io/projected/3f18c1eb-ae42-4def-9588-60e567b82089-kube-api-access-2rd29\") pod \"ovnkube-node-48hq4\" (UID: \"3f18c1eb-ae42-4def-9588-60e567b82089\") " pod="openshift-ovn-kubernetes/ovnkube-node-48hq4" Jan 28 20:50:34 crc kubenswrapper[4746]: I0128 20:50:34.753453 4746 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"systemd-units\" (UniqueName: \"kubernetes.io/host-path/3f18c1eb-ae42-4def-9588-60e567b82089-systemd-units\") pod \"ovnkube-node-48hq4\" (UID: \"3f18c1eb-ae42-4def-9588-60e567b82089\") " pod="openshift-ovn-kubernetes/ovnkube-node-48hq4" Jan 28 20:50:34 crc kubenswrapper[4746]: I0128 20:50:34.753470 4746 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/3f18c1eb-ae42-4def-9588-60e567b82089-ovnkube-config\") pod \"ovnkube-node-48hq4\" (UID: \"3f18c1eb-ae42-4def-9588-60e567b82089\") " pod="openshift-ovn-kubernetes/ovnkube-node-48hq4" Jan 28 20:50:34 crc kubenswrapper[4746]: I0128 20:50:34.753490 4746 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"env-overrides\" (UniqueName: 
\"kubernetes.io/configmap/3f18c1eb-ae42-4def-9588-60e567b82089-env-overrides\") pod \"ovnkube-node-48hq4\" (UID: \"3f18c1eb-ae42-4def-9588-60e567b82089\") " pod="openshift-ovn-kubernetes/ovnkube-node-48hq4" Jan 28 20:50:34 crc kubenswrapper[4746]: I0128 20:50:34.753533 4746 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-openvswitch\" (UniqueName: \"kubernetes.io/host-path/3f18c1eb-ae42-4def-9588-60e567b82089-run-openvswitch\") pod \"ovnkube-node-48hq4\" (UID: \"3f18c1eb-ae42-4def-9588-60e567b82089\") " pod="openshift-ovn-kubernetes/ovnkube-node-48hq4" Jan 28 20:50:34 crc kubenswrapper[4746]: I0128 20:50:34.753551 4746 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovn-node-metrics-cert\" (UniqueName: \"kubernetes.io/secret/3f18c1eb-ae42-4def-9588-60e567b82089-ovn-node-metrics-cert\") pod \"ovnkube-node-48hq4\" (UID: \"3f18c1eb-ae42-4def-9588-60e567b82089\") " pod="openshift-ovn-kubernetes/ovnkube-node-48hq4" Jan 28 20:50:34 crc kubenswrapper[4746]: I0128 20:50:34.854791 4746 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run-systemd\" (UniqueName: \"kubernetes.io/host-path/3f18c1eb-ae42-4def-9588-60e567b82089-run-systemd\") pod \"ovnkube-node-48hq4\" (UID: \"3f18c1eb-ae42-4def-9588-60e567b82089\") " pod="openshift-ovn-kubernetes/ovnkube-node-48hq4" Jan 28 20:50:34 crc kubenswrapper[4746]: I0128 20:50:34.854837 4746 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-openvswitch\" (UniqueName: \"kubernetes.io/host-path/3f18c1eb-ae42-4def-9588-60e567b82089-etc-openvswitch\") pod \"ovnkube-node-48hq4\" (UID: \"3f18c1eb-ae42-4def-9588-60e567b82089\") " pod="openshift-ovn-kubernetes/ovnkube-node-48hq4" Jan 28 20:50:34 crc kubenswrapper[4746]: I0128 20:50:34.854890 4746 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-run-netns\" (UniqueName: 
\"kubernetes.io/host-path/3f18c1eb-ae42-4def-9588-60e567b82089-host-run-netns\") pod \"ovnkube-node-48hq4\" (UID: \"3f18c1eb-ae42-4def-9588-60e567b82089\") " pod="openshift-ovn-kubernetes/ovnkube-node-48hq4" Jan 28 20:50:34 crc kubenswrapper[4746]: I0128 20:50:34.854912 4746 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovnkube-script-lib\" (UniqueName: \"kubernetes.io/configmap/3f18c1eb-ae42-4def-9588-60e567b82089-ovnkube-script-lib\") pod \"ovnkube-node-48hq4\" (UID: \"3f18c1eb-ae42-4def-9588-60e567b82089\") " pod="openshift-ovn-kubernetes/ovnkube-node-48hq4" Jan 28 20:50:34 crc kubenswrapper[4746]: I0128 20:50:34.854936 4746 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run-systemd\" (UniqueName: \"kubernetes.io/host-path/3f18c1eb-ae42-4def-9588-60e567b82089-run-systemd\") pod \"ovnkube-node-48hq4\" (UID: \"3f18c1eb-ae42-4def-9588-60e567b82089\") " pod="openshift-ovn-kubernetes/ovnkube-node-48hq4" Jan 28 20:50:34 crc kubenswrapper[4746]: I0128 20:50:34.854945 4746 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-cni-netd\" (UniqueName: \"kubernetes.io/host-path/3f18c1eb-ae42-4def-9588-60e567b82089-host-cni-netd\") pod \"ovnkube-node-48hq4\" (UID: \"3f18c1eb-ae42-4def-9588-60e567b82089\") " pod="openshift-ovn-kubernetes/ovnkube-node-48hq4" Jan 28 20:50:34 crc kubenswrapper[4746]: I0128 20:50:34.854965 4746 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-kubelet\" (UniqueName: \"kubernetes.io/host-path/3f18c1eb-ae42-4def-9588-60e567b82089-host-kubelet\") pod \"ovnkube-node-48hq4\" (UID: \"3f18c1eb-ae42-4def-9588-60e567b82089\") " pod="openshift-ovn-kubernetes/ovnkube-node-48hq4" Jan 28 20:50:34 crc kubenswrapper[4746]: I0128 20:50:34.854984 4746 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"log-socket\" (UniqueName: \"kubernetes.io/host-path/3f18c1eb-ae42-4def-9588-60e567b82089-log-socket\") pod 
\"ovnkube-node-48hq4\" (UID: \"3f18c1eb-ae42-4def-9588-60e567b82089\") " pod="openshift-ovn-kubernetes/ovnkube-node-48hq4" Jan 28 20:50:34 crc kubenswrapper[4746]: I0128 20:50:34.855002 4746 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-lib-openvswitch\" (UniqueName: \"kubernetes.io/host-path/3f18c1eb-ae42-4def-9588-60e567b82089-var-lib-openvswitch\") pod \"ovnkube-node-48hq4\" (UID: \"3f18c1eb-ae42-4def-9588-60e567b82089\") " pod="openshift-ovn-kubernetes/ovnkube-node-48hq4" Jan 28 20:50:34 crc kubenswrapper[4746]: I0128 20:50:34.855020 4746 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-cni-bin\" (UniqueName: \"kubernetes.io/host-path/3f18c1eb-ae42-4def-9588-60e567b82089-host-cni-bin\") pod \"ovnkube-node-48hq4\" (UID: \"3f18c1eb-ae42-4def-9588-60e567b82089\") " pod="openshift-ovn-kubernetes/ovnkube-node-48hq4" Jan 28 20:50:34 crc kubenswrapper[4746]: I0128 20:50:34.855023 4746 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-openvswitch\" (UniqueName: \"kubernetes.io/host-path/3f18c1eb-ae42-4def-9588-60e567b82089-etc-openvswitch\") pod \"ovnkube-node-48hq4\" (UID: \"3f18c1eb-ae42-4def-9588-60e567b82089\") " pod="openshift-ovn-kubernetes/ovnkube-node-48hq4" Jan 28 20:50:34 crc kubenswrapper[4746]: I0128 20:50:34.855038 4746 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"node-log\" (UniqueName: \"kubernetes.io/host-path/3f18c1eb-ae42-4def-9588-60e567b82089-node-log\") pod \"ovnkube-node-48hq4\" (UID: \"3f18c1eb-ae42-4def-9588-60e567b82089\") " pod="openshift-ovn-kubernetes/ovnkube-node-48hq4" Jan 28 20:50:34 crc kubenswrapper[4746]: I0128 20:50:34.855050 4746 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"log-socket\" (UniqueName: \"kubernetes.io/host-path/3f18c1eb-ae42-4def-9588-60e567b82089-log-socket\") pod \"ovnkube-node-48hq4\" (UID: \"3f18c1eb-ae42-4def-9588-60e567b82089\") " 
pod="openshift-ovn-kubernetes/ovnkube-node-48hq4" Jan 28 20:50:34 crc kubenswrapper[4746]: I0128 20:50:34.855060 4746 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-2rd29\" (UniqueName: \"kubernetes.io/projected/3f18c1eb-ae42-4def-9588-60e567b82089-kube-api-access-2rd29\") pod \"ovnkube-node-48hq4\" (UID: \"3f18c1eb-ae42-4def-9588-60e567b82089\") " pod="openshift-ovn-kubernetes/ovnkube-node-48hq4" Jan 28 20:50:34 crc kubenswrapper[4746]: I0128 20:50:34.855098 4746 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"systemd-units\" (UniqueName: \"kubernetes.io/host-path/3f18c1eb-ae42-4def-9588-60e567b82089-systemd-units\") pod \"ovnkube-node-48hq4\" (UID: \"3f18c1eb-ae42-4def-9588-60e567b82089\") " pod="openshift-ovn-kubernetes/ovnkube-node-48hq4" Jan 28 20:50:34 crc kubenswrapper[4746]: I0128 20:50:34.855113 4746 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/3f18c1eb-ae42-4def-9588-60e567b82089-ovnkube-config\") pod \"ovnkube-node-48hq4\" (UID: \"3f18c1eb-ae42-4def-9588-60e567b82089\") " pod="openshift-ovn-kubernetes/ovnkube-node-48hq4" Jan 28 20:50:34 crc kubenswrapper[4746]: I0128 20:50:34.855132 4746 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/3f18c1eb-ae42-4def-9588-60e567b82089-env-overrides\") pod \"ovnkube-node-48hq4\" (UID: \"3f18c1eb-ae42-4def-9588-60e567b82089\") " pod="openshift-ovn-kubernetes/ovnkube-node-48hq4" Jan 28 20:50:34 crc kubenswrapper[4746]: I0128 20:50:34.855155 4746 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run-openvswitch\" (UniqueName: \"kubernetes.io/host-path/3f18c1eb-ae42-4def-9588-60e567b82089-run-openvswitch\") pod \"ovnkube-node-48hq4\" (UID: \"3f18c1eb-ae42-4def-9588-60e567b82089\") " pod="openshift-ovn-kubernetes/ovnkube-node-48hq4" Jan 28 20:50:34 
crc kubenswrapper[4746]: I0128 20:50:34.855172 4746 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovn-node-metrics-cert\" (UniqueName: \"kubernetes.io/secret/3f18c1eb-ae42-4def-9588-60e567b82089-ovn-node-metrics-cert\") pod \"ovnkube-node-48hq4\" (UID: \"3f18c1eb-ae42-4def-9588-60e567b82089\") " pod="openshift-ovn-kubernetes/ovnkube-node-48hq4" Jan 28 20:50:34 crc kubenswrapper[4746]: I0128 20:50:34.855211 4746 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-var-lib-cni-networks-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/3f18c1eb-ae42-4def-9588-60e567b82089-host-var-lib-cni-networks-ovn-kubernetes\") pod \"ovnkube-node-48hq4\" (UID: \"3f18c1eb-ae42-4def-9588-60e567b82089\") " pod="openshift-ovn-kubernetes/ovnkube-node-48hq4" Jan 28 20:50:34 crc kubenswrapper[4746]: I0128 20:50:34.855251 4746 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run-ovn\" (UniqueName: \"kubernetes.io/host-path/3f18c1eb-ae42-4def-9588-60e567b82089-run-ovn\") pod \"ovnkube-node-48hq4\" (UID: \"3f18c1eb-ae42-4def-9588-60e567b82089\") " pod="openshift-ovn-kubernetes/ovnkube-node-48hq4" Jan 28 20:50:34 crc kubenswrapper[4746]: I0128 20:50:34.855260 4746 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-cni-netd\" (UniqueName: \"kubernetes.io/host-path/3f18c1eb-ae42-4def-9588-60e567b82089-host-cni-netd\") pod \"ovnkube-node-48hq4\" (UID: \"3f18c1eb-ae42-4def-9588-60e567b82089\") " pod="openshift-ovn-kubernetes/ovnkube-node-48hq4" Jan 28 20:50:34 crc kubenswrapper[4746]: I0128 20:50:34.855294 4746 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/3f18c1eb-ae42-4def-9588-60e567b82089-host-slash\") pod \"ovnkube-node-48hq4\" (UID: \"3f18c1eb-ae42-4def-9588-60e567b82089\") " pod="openshift-ovn-kubernetes/ovnkube-node-48hq4" Jan 28 20:50:34 crc kubenswrapper[4746]: I0128 20:50:34.855268 4746 
reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/3f18c1eb-ae42-4def-9588-60e567b82089-host-slash\") pod \"ovnkube-node-48hq4\" (UID: \"3f18c1eb-ae42-4def-9588-60e567b82089\") " pod="openshift-ovn-kubernetes/ovnkube-node-48hq4" Jan 28 20:50:34 crc kubenswrapper[4746]: I0128 20:50:34.855327 4746 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-lib-openvswitch\" (UniqueName: \"kubernetes.io/host-path/3f18c1eb-ae42-4def-9588-60e567b82089-var-lib-openvswitch\") pod \"ovnkube-node-48hq4\" (UID: \"3f18c1eb-ae42-4def-9588-60e567b82089\") " pod="openshift-ovn-kubernetes/ovnkube-node-48hq4" Jan 28 20:50:34 crc kubenswrapper[4746]: I0128 20:50:34.854997 4746 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/3f18c1eb-ae42-4def-9588-60e567b82089-host-run-netns\") pod \"ovnkube-node-48hq4\" (UID: \"3f18c1eb-ae42-4def-9588-60e567b82089\") " pod="openshift-ovn-kubernetes/ovnkube-node-48hq4" Jan 28 20:50:34 crc kubenswrapper[4746]: I0128 20:50:34.855361 4746 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-run-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/3f18c1eb-ae42-4def-9588-60e567b82089-host-run-ovn-kubernetes\") pod \"ovnkube-node-48hq4\" (UID: \"3f18c1eb-ae42-4def-9588-60e567b82089\") " pod="openshift-ovn-kubernetes/ovnkube-node-48hq4" Jan 28 20:50:34 crc kubenswrapper[4746]: I0128 20:50:34.855385 4746 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"node-log\" (UniqueName: \"kubernetes.io/host-path/3f18c1eb-ae42-4def-9588-60e567b82089-node-log\") pod \"ovnkube-node-48hq4\" (UID: \"3f18c1eb-ae42-4def-9588-60e567b82089\") " pod="openshift-ovn-kubernetes/ovnkube-node-48hq4" Jan 28 20:50:34 crc kubenswrapper[4746]: I0128 20:50:34.855477 4746 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run-ovn\" (UniqueName: 
\"kubernetes.io/host-path/3f18c1eb-ae42-4def-9588-60e567b82089-run-ovn\") pod \"ovnkube-node-48hq4\" (UID: \"3f18c1eb-ae42-4def-9588-60e567b82089\") " pod="openshift-ovn-kubernetes/ovnkube-node-48hq4" Jan 28 20:50:34 crc kubenswrapper[4746]: I0128 20:50:34.855481 4746 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-var-lib-cni-networks-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/3f18c1eb-ae42-4def-9588-60e567b82089-host-var-lib-cni-networks-ovn-kubernetes\") pod \"ovnkube-node-48hq4\" (UID: \"3f18c1eb-ae42-4def-9588-60e567b82089\") " pod="openshift-ovn-kubernetes/ovnkube-node-48hq4" Jan 28 20:50:34 crc kubenswrapper[4746]: I0128 20:50:34.855512 4746 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-run-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/3f18c1eb-ae42-4def-9588-60e567b82089-host-run-ovn-kubernetes\") pod \"ovnkube-node-48hq4\" (UID: \"3f18c1eb-ae42-4def-9588-60e567b82089\") " pod="openshift-ovn-kubernetes/ovnkube-node-48hq4" Jan 28 20:50:34 crc kubenswrapper[4746]: I0128 20:50:34.855363 4746 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-cni-bin\" (UniqueName: \"kubernetes.io/host-path/3f18c1eb-ae42-4def-9588-60e567b82089-host-cni-bin\") pod \"ovnkube-node-48hq4\" (UID: \"3f18c1eb-ae42-4def-9588-60e567b82089\") " pod="openshift-ovn-kubernetes/ovnkube-node-48hq4" Jan 28 20:50:34 crc kubenswrapper[4746]: I0128 20:50:34.855904 4746 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run-openvswitch\" (UniqueName: \"kubernetes.io/host-path/3f18c1eb-ae42-4def-9588-60e567b82089-run-openvswitch\") pod \"ovnkube-node-48hq4\" (UID: \"3f18c1eb-ae42-4def-9588-60e567b82089\") " pod="openshift-ovn-kubernetes/ovnkube-node-48hq4" Jan 28 20:50:34 crc kubenswrapper[4746]: I0128 20:50:34.855910 4746 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovnkube-script-lib\" (UniqueName: 
\"kubernetes.io/configmap/3f18c1eb-ae42-4def-9588-60e567b82089-ovnkube-script-lib\") pod \"ovnkube-node-48hq4\" (UID: \"3f18c1eb-ae42-4def-9588-60e567b82089\") " pod="openshift-ovn-kubernetes/ovnkube-node-48hq4" Jan 28 20:50:34 crc kubenswrapper[4746]: I0128 20:50:34.855948 4746 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-kubelet\" (UniqueName: \"kubernetes.io/host-path/3f18c1eb-ae42-4def-9588-60e567b82089-host-kubelet\") pod \"ovnkube-node-48hq4\" (UID: \"3f18c1eb-ae42-4def-9588-60e567b82089\") " pod="openshift-ovn-kubernetes/ovnkube-node-48hq4" Jan 28 20:50:34 crc kubenswrapper[4746]: I0128 20:50:34.855973 4746 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"systemd-units\" (UniqueName: \"kubernetes.io/host-path/3f18c1eb-ae42-4def-9588-60e567b82089-systemd-units\") pod \"ovnkube-node-48hq4\" (UID: \"3f18c1eb-ae42-4def-9588-60e567b82089\") " pod="openshift-ovn-kubernetes/ovnkube-node-48hq4" Jan 28 20:50:34 crc kubenswrapper[4746]: I0128 20:50:34.855981 4746 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/3f18c1eb-ae42-4def-9588-60e567b82089-env-overrides\") pod \"ovnkube-node-48hq4\" (UID: \"3f18c1eb-ae42-4def-9588-60e567b82089\") " pod="openshift-ovn-kubernetes/ovnkube-node-48hq4" Jan 28 20:50:34 crc kubenswrapper[4746]: I0128 20:50:34.856445 4746 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/3f18c1eb-ae42-4def-9588-60e567b82089-ovnkube-config\") pod \"ovnkube-node-48hq4\" (UID: \"3f18c1eb-ae42-4def-9588-60e567b82089\") " pod="openshift-ovn-kubernetes/ovnkube-node-48hq4" Jan 28 20:50:34 crc kubenswrapper[4746]: I0128 20:50:34.859772 4746 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovn-node-metrics-cert\" (UniqueName: \"kubernetes.io/secret/3f18c1eb-ae42-4def-9588-60e567b82089-ovn-node-metrics-cert\") pod \"ovnkube-node-48hq4\" 
(UID: \"3f18c1eb-ae42-4def-9588-60e567b82089\") " pod="openshift-ovn-kubernetes/ovnkube-node-48hq4" Jan 28 20:50:34 crc kubenswrapper[4746]: I0128 20:50:34.876601 4746 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-2rd29\" (UniqueName: \"kubernetes.io/projected/3f18c1eb-ae42-4def-9588-60e567b82089-kube-api-access-2rd29\") pod \"ovnkube-node-48hq4\" (UID: \"3f18c1eb-ae42-4def-9588-60e567b82089\") " pod="openshift-ovn-kubernetes/ovnkube-node-48hq4" Jan 28 20:50:34 crc kubenswrapper[4746]: I0128 20:50:34.913120 4746 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ovn-kubernetes/ovnkube-node-48hq4" Jan 28 20:50:34 crc kubenswrapper[4746]: W0128 20:50:34.932906 4746 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod3f18c1eb_ae42_4def_9588_60e567b82089.slice/crio-dcf11b3626ea468365e6c4715c8e6f7f86a6e7982642e3d1a81cc1ca429fe561 WatchSource:0}: Error finding container dcf11b3626ea468365e6c4715c8e6f7f86a6e7982642e3d1a81cc1ca429fe561: Status 404 returned error can't find the container with id dcf11b3626ea468365e6c4715c8e6f7f86a6e7982642e3d1a81cc1ca429fe561 Jan 28 20:50:35 crc kubenswrapper[4746]: I0128 20:50:35.470057 4746 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-ovn-kubernetes/ovnkube-node-8vmvh" Jan 28 20:50:35 crc kubenswrapper[4746]: I0128 20:50:35.473256 4746 generic.go:334] "Generic (PLEG): container finished" podID="3f18c1eb-ae42-4def-9588-60e567b82089" containerID="9357b8a08f9624fa28d5d2c6e2978184d9f104e81367d10a174b523bf9a7d48b" exitCode=0 Jan 28 20:50:35 crc kubenswrapper[4746]: I0128 20:50:35.473311 4746 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-48hq4" event={"ID":"3f18c1eb-ae42-4def-9588-60e567b82089","Type":"ContainerDied","Data":"9357b8a08f9624fa28d5d2c6e2978184d9f104e81367d10a174b523bf9a7d48b"} Jan 28 20:50:35 crc kubenswrapper[4746]: I0128 20:50:35.473339 4746 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-48hq4" event={"ID":"3f18c1eb-ae42-4def-9588-60e567b82089","Type":"ContainerStarted","Data":"dcf11b3626ea468365e6c4715c8e6f7f86a6e7982642e3d1a81cc1ca429fe561"} Jan 28 20:50:35 crc kubenswrapper[4746]: I0128 20:50:35.507719 4746 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-ovn-kubernetes/ovnkube-node-8vmvh"] Jan 28 20:50:35 crc kubenswrapper[4746]: I0128 20:50:35.513357 4746 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-ovn-kubernetes/ovnkube-node-8vmvh"] Jan 28 20:50:36 crc kubenswrapper[4746]: I0128 20:50:36.484160 4746 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-48hq4" event={"ID":"3f18c1eb-ae42-4def-9588-60e567b82089","Type":"ContainerStarted","Data":"3da1efee15f41689d5332085ffa79ffec11067140548fda6c381e8298f20d165"} Jan 28 20:50:36 crc kubenswrapper[4746]: I0128 20:50:36.484654 4746 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-48hq4" event={"ID":"3f18c1eb-ae42-4def-9588-60e567b82089","Type":"ContainerStarted","Data":"791a8f2f5f6954fee6cfb0ece1e3d78ee2ca1bbb61fcfc3a809c59714715fd9b"} Jan 28 20:50:36 crc kubenswrapper[4746]: I0128 
20:50:36.484666 4746 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-48hq4" event={"ID":"3f18c1eb-ae42-4def-9588-60e567b82089","Type":"ContainerStarted","Data":"b2f224d381763ec261e9590be44616b88141463fab2b3a8d00b5caff58c296c4"} Jan 28 20:50:36 crc kubenswrapper[4746]: I0128 20:50:36.484676 4746 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-48hq4" event={"ID":"3f18c1eb-ae42-4def-9588-60e567b82089","Type":"ContainerStarted","Data":"e3ee096acd4e129f27820d0763d68d7ce9449a1c865aa2288bbcf435f5f44c61"} Jan 28 20:50:36 crc kubenswrapper[4746]: I0128 20:50:36.484687 4746 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-48hq4" event={"ID":"3f18c1eb-ae42-4def-9588-60e567b82089","Type":"ContainerStarted","Data":"c297de32cbc99fa2a1fe54ca694fb064a769126f525a87017689b368d07e04c5"} Jan 28 20:50:36 crc kubenswrapper[4746]: I0128 20:50:36.843585 4746 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="c4d15639-62fb-41b7-a1d4-6f51f3af6d99" path="/var/lib/kubelet/pods/c4d15639-62fb-41b7-a1d4-6f51f3af6d99/volumes" Jan 28 20:50:37 crc kubenswrapper[4746]: I0128 20:50:37.492569 4746 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-48hq4" event={"ID":"3f18c1eb-ae42-4def-9588-60e567b82089","Type":"ContainerStarted","Data":"8430bbb81e93178d3866f1b08468ec6c8ec43d4e8ee7aa9dfd116f4a03e8ec2d"} Jan 28 20:50:38 crc kubenswrapper[4746]: I0128 20:50:38.675624 4746 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-operators/obo-prometheus-operator-68bc856cb9-sx9pg"] Jan 28 20:50:38 crc kubenswrapper[4746]: I0128 20:50:38.676327 4746 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-operators/obo-prometheus-operator-68bc856cb9-sx9pg" Jan 28 20:50:38 crc kubenswrapper[4746]: I0128 20:50:38.678578 4746 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operators"/"obo-prometheus-operator-dockercfg-5qqs7" Jan 28 20:50:38 crc kubenswrapper[4746]: I0128 20:50:38.679679 4746 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-operators"/"kube-root-ca.crt" Jan 28 20:50:38 crc kubenswrapper[4746]: I0128 20:50:38.682923 4746 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-operators"/"openshift-service-ca.crt" Jan 28 20:50:38 crc kubenswrapper[4746]: I0128 20:50:38.797212 4746 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-operators/obo-prometheus-operator-admission-webhook-565bff74c4-9nt8k"] Jan 28 20:50:38 crc kubenswrapper[4746]: I0128 20:50:38.797949 4746 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operators/obo-prometheus-operator-admission-webhook-565bff74c4-9nt8k" Jan 28 20:50:38 crc kubenswrapper[4746]: I0128 20:50:38.802800 4746 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operators"/"obo-prometheus-operator-admission-webhook-service-cert" Jan 28 20:50:38 crc kubenswrapper[4746]: I0128 20:50:38.804185 4746 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operators"/"obo-prometheus-operator-admission-webhook-dockercfg-d294w" Jan 28 20:50:38 crc kubenswrapper[4746]: I0128 20:50:38.814242 4746 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-operators/obo-prometheus-operator-admission-webhook-565bff74c4-stlnx"] Jan 28 20:50:38 crc kubenswrapper[4746]: I0128 20:50:38.815100 4746 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-operators/obo-prometheus-operator-admission-webhook-565bff74c4-stlnx" Jan 28 20:50:38 crc kubenswrapper[4746]: I0128 20:50:38.825144 4746 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-w5cmr\" (UniqueName: \"kubernetes.io/projected/0e1b10c8-2491-403a-9ea3-9805d8167d7a-kube-api-access-w5cmr\") pod \"obo-prometheus-operator-68bc856cb9-sx9pg\" (UID: \"0e1b10c8-2491-403a-9ea3-9805d8167d7a\") " pod="openshift-operators/obo-prometheus-operator-68bc856cb9-sx9pg" Jan 28 20:50:38 crc kubenswrapper[4746]: I0128 20:50:38.926943 4746 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/09345bfc-4171-49c5-85e3-32616db6ce17-webhook-cert\") pod \"obo-prometheus-operator-admission-webhook-565bff74c4-stlnx\" (UID: \"09345bfc-4171-49c5-85e3-32616db6ce17\") " pod="openshift-operators/obo-prometheus-operator-admission-webhook-565bff74c4-stlnx" Jan 28 20:50:38 crc kubenswrapper[4746]: I0128 20:50:38.927484 4746 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/10acdec7-69f6-42e1-b065-c84b8d82fd03-webhook-cert\") pod \"obo-prometheus-operator-admission-webhook-565bff74c4-9nt8k\" (UID: \"10acdec7-69f6-42e1-b065-c84b8d82fd03\") " pod="openshift-operators/obo-prometheus-operator-admission-webhook-565bff74c4-9nt8k" Jan 28 20:50:38 crc kubenswrapper[4746]: I0128 20:50:38.927568 4746 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-w5cmr\" (UniqueName: \"kubernetes.io/projected/0e1b10c8-2491-403a-9ea3-9805d8167d7a-kube-api-access-w5cmr\") pod \"obo-prometheus-operator-68bc856cb9-sx9pg\" (UID: \"0e1b10c8-2491-403a-9ea3-9805d8167d7a\") " pod="openshift-operators/obo-prometheus-operator-68bc856cb9-sx9pg" Jan 28 20:50:38 crc kubenswrapper[4746]: I0128 
20:50:38.927599 4746 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/09345bfc-4171-49c5-85e3-32616db6ce17-apiservice-cert\") pod \"obo-prometheus-operator-admission-webhook-565bff74c4-stlnx\" (UID: \"09345bfc-4171-49c5-85e3-32616db6ce17\") " pod="openshift-operators/obo-prometheus-operator-admission-webhook-565bff74c4-stlnx" Jan 28 20:50:38 crc kubenswrapper[4746]: I0128 20:50:38.927629 4746 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/10acdec7-69f6-42e1-b065-c84b8d82fd03-apiservice-cert\") pod \"obo-prometheus-operator-admission-webhook-565bff74c4-9nt8k\" (UID: \"10acdec7-69f6-42e1-b065-c84b8d82fd03\") " pod="openshift-operators/obo-prometheus-operator-admission-webhook-565bff74c4-9nt8k" Jan 28 20:50:38 crc kubenswrapper[4746]: I0128 20:50:38.951112 4746 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-w5cmr\" (UniqueName: \"kubernetes.io/projected/0e1b10c8-2491-403a-9ea3-9805d8167d7a-kube-api-access-w5cmr\") pod \"obo-prometheus-operator-68bc856cb9-sx9pg\" (UID: \"0e1b10c8-2491-403a-9ea3-9805d8167d7a\") " pod="openshift-operators/obo-prometheus-operator-68bc856cb9-sx9pg" Jan 28 20:50:38 crc kubenswrapper[4746]: I0128 20:50:38.993382 4746 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operators/obo-prometheus-operator-68bc856cb9-sx9pg" Jan 28 20:50:39 crc kubenswrapper[4746]: I0128 20:50:39.015670 4746 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-operators/observability-operator-59bdc8b94-m2mx9"] Jan 28 20:50:39 crc kubenswrapper[4746]: I0128 20:50:39.016523 4746 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-operators/observability-operator-59bdc8b94-m2mx9" Jan 28 20:50:39 crc kubenswrapper[4746]: I0128 20:50:39.021475 4746 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operators"/"observability-operator-sa-dockercfg-25bvw" Jan 28 20:50:39 crc kubenswrapper[4746]: I0128 20:50:39.022049 4746 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operators"/"observability-operator-tls" Jan 28 20:50:39 crc kubenswrapper[4746]: E0128 20:50:39.027611 4746 log.go:32] "RunPodSandbox from runtime service failed" err="rpc error: code = Unknown desc = failed to create pod network sandbox k8s_obo-prometheus-operator-68bc856cb9-sx9pg_openshift-operators_0e1b10c8-2491-403a-9ea3-9805d8167d7a_0(dbeece577b98014e29c91652b20bbd7ee3a51b2916289d395eda6215a77982d7): no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" Jan 28 20:50:39 crc kubenswrapper[4746]: E0128 20:50:39.027928 4746 kuberuntime_sandbox.go:72] "Failed to create sandbox for pod" err="rpc error: code = Unknown desc = failed to create pod network sandbox k8s_obo-prometheus-operator-68bc856cb9-sx9pg_openshift-operators_0e1b10c8-2491-403a-9ea3-9805d8167d7a_0(dbeece577b98014e29c91652b20bbd7ee3a51b2916289d395eda6215a77982d7): no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-operators/obo-prometheus-operator-68bc856cb9-sx9pg" Jan 28 20:50:39 crc kubenswrapper[4746]: E0128 20:50:39.028076 4746 kuberuntime_manager.go:1170] "CreatePodSandbox for pod failed" err="rpc error: code = Unknown desc = failed to create pod network sandbox k8s_obo-prometheus-operator-68bc856cb9-sx9pg_openshift-operators_0e1b10c8-2491-403a-9ea3-9805d8167d7a_0(dbeece577b98014e29c91652b20bbd7ee3a51b2916289d395eda6215a77982d7): no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-operators/obo-prometheus-operator-68bc856cb9-sx9pg" Jan 28 20:50:39 crc kubenswrapper[4746]: E0128 20:50:39.028242 4746 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"CreatePodSandbox\" for \"obo-prometheus-operator-68bc856cb9-sx9pg_openshift-operators(0e1b10c8-2491-403a-9ea3-9805d8167d7a)\" with CreatePodSandboxError: \"Failed to create sandbox for pod \\\"obo-prometheus-operator-68bc856cb9-sx9pg_openshift-operators(0e1b10c8-2491-403a-9ea3-9805d8167d7a)\\\": rpc error: code = Unknown desc = failed to create pod network sandbox k8s_obo-prometheus-operator-68bc856cb9-sx9pg_openshift-operators_0e1b10c8-2491-403a-9ea3-9805d8167d7a_0(dbeece577b98014e29c91652b20bbd7ee3a51b2916289d395eda6215a77982d7): no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?\"" pod="openshift-operators/obo-prometheus-operator-68bc856cb9-sx9pg" podUID="0e1b10c8-2491-403a-9ea3-9805d8167d7a" Jan 28 20:50:39 crc kubenswrapper[4746]: I0128 20:50:39.028382 4746 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/10acdec7-69f6-42e1-b065-c84b8d82fd03-apiservice-cert\") pod \"obo-prometheus-operator-admission-webhook-565bff74c4-9nt8k\" (UID: \"10acdec7-69f6-42e1-b065-c84b8d82fd03\") " pod="openshift-operators/obo-prometheus-operator-admission-webhook-565bff74c4-9nt8k" Jan 28 20:50:39 crc kubenswrapper[4746]: I0128 20:50:39.028474 4746 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/09345bfc-4171-49c5-85e3-32616db6ce17-webhook-cert\") pod \"obo-prometheus-operator-admission-webhook-565bff74c4-stlnx\" (UID: \"09345bfc-4171-49c5-85e3-32616db6ce17\") " pod="openshift-operators/obo-prometheus-operator-admission-webhook-565bff74c4-stlnx" Jan 28 20:50:39 crc kubenswrapper[4746]: I0128 20:50:39.028506 4746 reconciler_common.go:218] "operationExecutor.MountVolume started for 
volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/10acdec7-69f6-42e1-b065-c84b8d82fd03-webhook-cert\") pod \"obo-prometheus-operator-admission-webhook-565bff74c4-9nt8k\" (UID: \"10acdec7-69f6-42e1-b065-c84b8d82fd03\") " pod="openshift-operators/obo-prometheus-operator-admission-webhook-565bff74c4-9nt8k" Jan 28 20:50:39 crc kubenswrapper[4746]: I0128 20:50:39.028549 4746 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/09345bfc-4171-49c5-85e3-32616db6ce17-apiservice-cert\") pod \"obo-prometheus-operator-admission-webhook-565bff74c4-stlnx\" (UID: \"09345bfc-4171-49c5-85e3-32616db6ce17\") " pod="openshift-operators/obo-prometheus-operator-admission-webhook-565bff74c4-stlnx" Jan 28 20:50:39 crc kubenswrapper[4746]: I0128 20:50:39.032546 4746 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/10acdec7-69f6-42e1-b065-c84b8d82fd03-webhook-cert\") pod \"obo-prometheus-operator-admission-webhook-565bff74c4-9nt8k\" (UID: \"10acdec7-69f6-42e1-b065-c84b8d82fd03\") " pod="openshift-operators/obo-prometheus-operator-admission-webhook-565bff74c4-9nt8k" Jan 28 20:50:39 crc kubenswrapper[4746]: I0128 20:50:39.033876 4746 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/09345bfc-4171-49c5-85e3-32616db6ce17-webhook-cert\") pod \"obo-prometheus-operator-admission-webhook-565bff74c4-stlnx\" (UID: \"09345bfc-4171-49c5-85e3-32616db6ce17\") " pod="openshift-operators/obo-prometheus-operator-admission-webhook-565bff74c4-stlnx" Jan 28 20:50:39 crc kubenswrapper[4746]: I0128 20:50:39.038851 4746 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/09345bfc-4171-49c5-85e3-32616db6ce17-apiservice-cert\") pod \"obo-prometheus-operator-admission-webhook-565bff74c4-stlnx\" (UID: 
\"09345bfc-4171-49c5-85e3-32616db6ce17\") " pod="openshift-operators/obo-prometheus-operator-admission-webhook-565bff74c4-stlnx" Jan 28 20:50:39 crc kubenswrapper[4746]: I0128 20:50:39.046517 4746 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/10acdec7-69f6-42e1-b065-c84b8d82fd03-apiservice-cert\") pod \"obo-prometheus-operator-admission-webhook-565bff74c4-9nt8k\" (UID: \"10acdec7-69f6-42e1-b065-c84b8d82fd03\") " pod="openshift-operators/obo-prometheus-operator-admission-webhook-565bff74c4-9nt8k" Jan 28 20:50:39 crc kubenswrapper[4746]: I0128 20:50:39.114797 4746 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operators/obo-prometheus-operator-admission-webhook-565bff74c4-9nt8k" Jan 28 20:50:39 crc kubenswrapper[4746]: I0128 20:50:39.130194 4746 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-bpc4w\" (UniqueName: \"kubernetes.io/projected/2788b8ac-4eb0-46cb-8861-c55d6b302dd7-kube-api-access-bpc4w\") pod \"observability-operator-59bdc8b94-m2mx9\" (UID: \"2788b8ac-4eb0-46cb-8861-c55d6b302dd7\") " pod="openshift-operators/observability-operator-59bdc8b94-m2mx9" Jan 28 20:50:39 crc kubenswrapper[4746]: I0128 20:50:39.130270 4746 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"observability-operator-tls\" (UniqueName: \"kubernetes.io/secret/2788b8ac-4eb0-46cb-8861-c55d6b302dd7-observability-operator-tls\") pod \"observability-operator-59bdc8b94-m2mx9\" (UID: \"2788b8ac-4eb0-46cb-8861-c55d6b302dd7\") " pod="openshift-operators/observability-operator-59bdc8b94-m2mx9" Jan 28 20:50:39 crc kubenswrapper[4746]: I0128 20:50:39.130919 4746 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-operators/obo-prometheus-operator-admission-webhook-565bff74c4-stlnx" Jan 28 20:50:39 crc kubenswrapper[4746]: E0128 20:50:39.152725 4746 log.go:32] "RunPodSandbox from runtime service failed" err="rpc error: code = Unknown desc = failed to create pod network sandbox k8s_obo-prometheus-operator-admission-webhook-565bff74c4-9nt8k_openshift-operators_10acdec7-69f6-42e1-b065-c84b8d82fd03_0(6cae519f7da3ba8d163f959ab184ebcf5561c53de04cba192e183aa0d8ecd932): no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" Jan 28 20:50:39 crc kubenswrapper[4746]: E0128 20:50:39.152825 4746 kuberuntime_sandbox.go:72] "Failed to create sandbox for pod" err="rpc error: code = Unknown desc = failed to create pod network sandbox k8s_obo-prometheus-operator-admission-webhook-565bff74c4-9nt8k_openshift-operators_10acdec7-69f6-42e1-b065-c84b8d82fd03_0(6cae519f7da3ba8d163f959ab184ebcf5561c53de04cba192e183aa0d8ecd932): no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-operators/obo-prometheus-operator-admission-webhook-565bff74c4-9nt8k" Jan 28 20:50:39 crc kubenswrapper[4746]: E0128 20:50:39.152853 4746 kuberuntime_manager.go:1170] "CreatePodSandbox for pod failed" err="rpc error: code = Unknown desc = failed to create pod network sandbox k8s_obo-prometheus-operator-admission-webhook-565bff74c4-9nt8k_openshift-operators_10acdec7-69f6-42e1-b065-c84b8d82fd03_0(6cae519f7da3ba8d163f959ab184ebcf5561c53de04cba192e183aa0d8ecd932): no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-operators/obo-prometheus-operator-admission-webhook-565bff74c4-9nt8k" Jan 28 20:50:39 crc kubenswrapper[4746]: E0128 20:50:39.152925 4746 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"CreatePodSandbox\" for \"obo-prometheus-operator-admission-webhook-565bff74c4-9nt8k_openshift-operators(10acdec7-69f6-42e1-b065-c84b8d82fd03)\" with CreatePodSandboxError: \"Failed to create sandbox for pod \\\"obo-prometheus-operator-admission-webhook-565bff74c4-9nt8k_openshift-operators(10acdec7-69f6-42e1-b065-c84b8d82fd03)\\\": rpc error: code = Unknown desc = failed to create pod network sandbox k8s_obo-prometheus-operator-admission-webhook-565bff74c4-9nt8k_openshift-operators_10acdec7-69f6-42e1-b065-c84b8d82fd03_0(6cae519f7da3ba8d163f959ab184ebcf5561c53de04cba192e183aa0d8ecd932): no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?\"" pod="openshift-operators/obo-prometheus-operator-admission-webhook-565bff74c4-9nt8k" podUID="10acdec7-69f6-42e1-b065-c84b8d82fd03" Jan 28 20:50:39 crc kubenswrapper[4746]: E0128 20:50:39.166555 4746 log.go:32] "RunPodSandbox from runtime service failed" err="rpc error: code = Unknown desc = failed to create pod network sandbox k8s_obo-prometheus-operator-admission-webhook-565bff74c4-stlnx_openshift-operators_09345bfc-4171-49c5-85e3-32616db6ce17_0(7a230c0ce158d153b76deccb2d2d89be608e809c70a02be5b9c06dc0a0d58118): no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" Jan 28 20:50:39 crc kubenswrapper[4746]: E0128 20:50:39.166627 4746 kuberuntime_sandbox.go:72] "Failed to create sandbox for pod" err="rpc error: code = Unknown desc = failed to create pod network sandbox k8s_obo-prometheus-operator-admission-webhook-565bff74c4-stlnx_openshift-operators_09345bfc-4171-49c5-85e3-32616db6ce17_0(7a230c0ce158d153b76deccb2d2d89be608e809c70a02be5b9c06dc0a0d58118): no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?" pod="openshift-operators/obo-prometheus-operator-admission-webhook-565bff74c4-stlnx" Jan 28 20:50:39 crc kubenswrapper[4746]: E0128 20:50:39.166654 4746 kuberuntime_manager.go:1170] "CreatePodSandbox for pod failed" err="rpc error: code = Unknown desc = failed to create pod network sandbox k8s_obo-prometheus-operator-admission-webhook-565bff74c4-stlnx_openshift-operators_09345bfc-4171-49c5-85e3-32616db6ce17_0(7a230c0ce158d153b76deccb2d2d89be608e809c70a02be5b9c06dc0a0d58118): no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-operators/obo-prometheus-operator-admission-webhook-565bff74c4-stlnx" Jan 28 20:50:39 crc kubenswrapper[4746]: E0128 20:50:39.166710 4746 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"CreatePodSandbox\" for \"obo-prometheus-operator-admission-webhook-565bff74c4-stlnx_openshift-operators(09345bfc-4171-49c5-85e3-32616db6ce17)\" with CreatePodSandboxError: \"Failed to create sandbox for pod \\\"obo-prometheus-operator-admission-webhook-565bff74c4-stlnx_openshift-operators(09345bfc-4171-49c5-85e3-32616db6ce17)\\\": rpc error: code = Unknown desc = failed to create pod network sandbox k8s_obo-prometheus-operator-admission-webhook-565bff74c4-stlnx_openshift-operators_09345bfc-4171-49c5-85e3-32616db6ce17_0(7a230c0ce158d153b76deccb2d2d89be608e809c70a02be5b9c06dc0a0d58118): no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?\"" pod="openshift-operators/obo-prometheus-operator-admission-webhook-565bff74c4-stlnx" podUID="09345bfc-4171-49c5-85e3-32616db6ce17" Jan 28 20:50:39 crc kubenswrapper[4746]: I0128 20:50:39.224277 4746 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-operators/perses-operator-5bf474d74f-jnzwc"] Jan 28 20:50:39 crc kubenswrapper[4746]: I0128 20:50:39.225099 4746 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-operators/perses-operator-5bf474d74f-jnzwc" Jan 28 20:50:39 crc kubenswrapper[4746]: I0128 20:50:39.229451 4746 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operators"/"perses-operator-dockercfg-5n589" Jan 28 20:50:39 crc kubenswrapper[4746]: I0128 20:50:39.231534 4746 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"observability-operator-tls\" (UniqueName: \"kubernetes.io/secret/2788b8ac-4eb0-46cb-8861-c55d6b302dd7-observability-operator-tls\") pod \"observability-operator-59bdc8b94-m2mx9\" (UID: \"2788b8ac-4eb0-46cb-8861-c55d6b302dd7\") " pod="openshift-operators/observability-operator-59bdc8b94-m2mx9" Jan 28 20:50:39 crc kubenswrapper[4746]: I0128 20:50:39.231614 4746 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-bpc4w\" (UniqueName: \"kubernetes.io/projected/2788b8ac-4eb0-46cb-8861-c55d6b302dd7-kube-api-access-bpc4w\") pod \"observability-operator-59bdc8b94-m2mx9\" (UID: \"2788b8ac-4eb0-46cb-8861-c55d6b302dd7\") " pod="openshift-operators/observability-operator-59bdc8b94-m2mx9" Jan 28 20:50:39 crc kubenswrapper[4746]: I0128 20:50:39.235759 4746 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"observability-operator-tls\" (UniqueName: \"kubernetes.io/secret/2788b8ac-4eb0-46cb-8861-c55d6b302dd7-observability-operator-tls\") pod \"observability-operator-59bdc8b94-m2mx9\" (UID: \"2788b8ac-4eb0-46cb-8861-c55d6b302dd7\") " pod="openshift-operators/observability-operator-59bdc8b94-m2mx9" Jan 28 20:50:39 crc kubenswrapper[4746]: I0128 20:50:39.309809 4746 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-bpc4w\" (UniqueName: \"kubernetes.io/projected/2788b8ac-4eb0-46cb-8861-c55d6b302dd7-kube-api-access-bpc4w\") pod \"observability-operator-59bdc8b94-m2mx9\" (UID: \"2788b8ac-4eb0-46cb-8861-c55d6b302dd7\") " pod="openshift-operators/observability-operator-59bdc8b94-m2mx9" 
Jan 28 20:50:39 crc kubenswrapper[4746]: I0128 20:50:39.334092 4746 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-hkx8w\" (UniqueName: \"kubernetes.io/projected/f13f3a63-44b1-4644-8bea-99e25a6764c3-kube-api-access-hkx8w\") pod \"perses-operator-5bf474d74f-jnzwc\" (UID: \"f13f3a63-44b1-4644-8bea-99e25a6764c3\") " pod="openshift-operators/perses-operator-5bf474d74f-jnzwc" Jan 28 20:50:39 crc kubenswrapper[4746]: I0128 20:50:39.334519 4746 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"openshift-service-ca\" (UniqueName: \"kubernetes.io/configmap/f13f3a63-44b1-4644-8bea-99e25a6764c3-openshift-service-ca\") pod \"perses-operator-5bf474d74f-jnzwc\" (UID: \"f13f3a63-44b1-4644-8bea-99e25a6764c3\") " pod="openshift-operators/perses-operator-5bf474d74f-jnzwc" Jan 28 20:50:39 crc kubenswrapper[4746]: I0128 20:50:39.387773 4746 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operators/observability-operator-59bdc8b94-m2mx9" Jan 28 20:50:39 crc kubenswrapper[4746]: E0128 20:50:39.412393 4746 log.go:32] "RunPodSandbox from runtime service failed" err="rpc error: code = Unknown desc = failed to create pod network sandbox k8s_observability-operator-59bdc8b94-m2mx9_openshift-operators_2788b8ac-4eb0-46cb-8861-c55d6b302dd7_0(a3e2613307d1f24b843319472dec0c7843639f3a37009ed5ac0d41d97ae2c1b9): no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" Jan 28 20:50:39 crc kubenswrapper[4746]: E0128 20:50:39.412483 4746 kuberuntime_sandbox.go:72] "Failed to create sandbox for pod" err="rpc error: code = Unknown desc = failed to create pod network sandbox k8s_observability-operator-59bdc8b94-m2mx9_openshift-operators_2788b8ac-4eb0-46cb-8861-c55d6b302dd7_0(a3e2613307d1f24b843319472dec0c7843639f3a37009ed5ac0d41d97ae2c1b9): no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?" pod="openshift-operators/observability-operator-59bdc8b94-m2mx9"
Jan 28 20:50:39 crc kubenswrapper[4746]: E0128 20:50:39.412515 4746 kuberuntime_manager.go:1170] "CreatePodSandbox for pod failed" err="rpc error: code = Unknown desc = failed to create pod network sandbox k8s_observability-operator-59bdc8b94-m2mx9_openshift-operators_2788b8ac-4eb0-46cb-8861-c55d6b302dd7_0(a3e2613307d1f24b843319472dec0c7843639f3a37009ed5ac0d41d97ae2c1b9): no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-operators/observability-operator-59bdc8b94-m2mx9"
Jan 28 20:50:39 crc kubenswrapper[4746]: E0128 20:50:39.412597 4746 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"CreatePodSandbox\" for \"observability-operator-59bdc8b94-m2mx9_openshift-operators(2788b8ac-4eb0-46cb-8861-c55d6b302dd7)\" with CreatePodSandboxError: \"Failed to create sandbox for pod \\\"observability-operator-59bdc8b94-m2mx9_openshift-operators(2788b8ac-4eb0-46cb-8861-c55d6b302dd7)\\\": rpc error: code = Unknown desc = failed to create pod network sandbox k8s_observability-operator-59bdc8b94-m2mx9_openshift-operators_2788b8ac-4eb0-46cb-8861-c55d6b302dd7_0(a3e2613307d1f24b843319472dec0c7843639f3a37009ed5ac0d41d97ae2c1b9): no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?\"" pod="openshift-operators/observability-operator-59bdc8b94-m2mx9" podUID="2788b8ac-4eb0-46cb-8861-c55d6b302dd7"
Jan 28 20:50:39 crc kubenswrapper[4746]: I0128 20:50:39.436255 4746 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"openshift-service-ca\" (UniqueName: \"kubernetes.io/configmap/f13f3a63-44b1-4644-8bea-99e25a6764c3-openshift-service-ca\") pod \"perses-operator-5bf474d74f-jnzwc\" (UID: \"f13f3a63-44b1-4644-8bea-99e25a6764c3\") " pod="openshift-operators/perses-operator-5bf474d74f-jnzwc"
Jan 28 20:50:39 crc kubenswrapper[4746]: I0128 20:50:39.436346 4746 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-hkx8w\" (UniqueName: \"kubernetes.io/projected/f13f3a63-44b1-4644-8bea-99e25a6764c3-kube-api-access-hkx8w\") pod \"perses-operator-5bf474d74f-jnzwc\" (UID: \"f13f3a63-44b1-4644-8bea-99e25a6764c3\") " pod="openshift-operators/perses-operator-5bf474d74f-jnzwc"
Jan 28 20:50:39 crc kubenswrapper[4746]: I0128 20:50:39.437583 4746 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"openshift-service-ca\" (UniqueName: \"kubernetes.io/configmap/f13f3a63-44b1-4644-8bea-99e25a6764c3-openshift-service-ca\") pod \"perses-operator-5bf474d74f-jnzwc\" (UID: \"f13f3a63-44b1-4644-8bea-99e25a6764c3\") " pod="openshift-operators/perses-operator-5bf474d74f-jnzwc"
Jan 28 20:50:39 crc kubenswrapper[4746]: I0128 20:50:39.453145 4746 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-hkx8w\" (UniqueName: \"kubernetes.io/projected/f13f3a63-44b1-4644-8bea-99e25a6764c3-kube-api-access-hkx8w\") pod \"perses-operator-5bf474d74f-jnzwc\" (UID: \"f13f3a63-44b1-4644-8bea-99e25a6764c3\") " pod="openshift-operators/perses-operator-5bf474d74f-jnzwc"
Jan 28 20:50:39 crc kubenswrapper[4746]: I0128 20:50:39.508453 4746 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-48hq4" event={"ID":"3f18c1eb-ae42-4def-9588-60e567b82089","Type":"ContainerStarted","Data":"d304c35e3edaaa5946466283fb14e762787391080b6004f421eb50d6137c8af4"}
Jan 28 20:50:39 crc kubenswrapper[4746]: I0128 20:50:39.541062 4746 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operators/perses-operator-5bf474d74f-jnzwc"
Jan 28 20:50:39 crc kubenswrapper[4746]: E0128 20:50:39.561949 4746 log.go:32] "RunPodSandbox from runtime service failed" err="rpc error: code = Unknown desc = failed to create pod network sandbox k8s_perses-operator-5bf474d74f-jnzwc_openshift-operators_f13f3a63-44b1-4644-8bea-99e25a6764c3_0(6e86f6086cd14b1266fec37b18f69620b66f96b98589a505004989d6aa48b431): no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"
Jan 28 20:50:39 crc kubenswrapper[4746]: E0128 20:50:39.562048 4746 kuberuntime_sandbox.go:72] "Failed to create sandbox for pod" err="rpc error: code = Unknown desc = failed to create pod network sandbox k8s_perses-operator-5bf474d74f-jnzwc_openshift-operators_f13f3a63-44b1-4644-8bea-99e25a6764c3_0(6e86f6086cd14b1266fec37b18f69620b66f96b98589a505004989d6aa48b431): no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-operators/perses-operator-5bf474d74f-jnzwc"
Jan 28 20:50:39 crc kubenswrapper[4746]: E0128 20:50:39.562151 4746 kuberuntime_manager.go:1170] "CreatePodSandbox for pod failed" err="rpc error: code = Unknown desc = failed to create pod network sandbox k8s_perses-operator-5bf474d74f-jnzwc_openshift-operators_f13f3a63-44b1-4644-8bea-99e25a6764c3_0(6e86f6086cd14b1266fec37b18f69620b66f96b98589a505004989d6aa48b431): no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-operators/perses-operator-5bf474d74f-jnzwc"
Jan 28 20:50:39 crc kubenswrapper[4746]: E0128 20:50:39.562235 4746 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"CreatePodSandbox\" for \"perses-operator-5bf474d74f-jnzwc_openshift-operators(f13f3a63-44b1-4644-8bea-99e25a6764c3)\" with CreatePodSandboxError: \"Failed to create sandbox for pod \\\"perses-operator-5bf474d74f-jnzwc_openshift-operators(f13f3a63-44b1-4644-8bea-99e25a6764c3)\\\": rpc error: code = Unknown desc = failed to create pod network sandbox k8s_perses-operator-5bf474d74f-jnzwc_openshift-operators_f13f3a63-44b1-4644-8bea-99e25a6764c3_0(6e86f6086cd14b1266fec37b18f69620b66f96b98589a505004989d6aa48b431): no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?\"" pod="openshift-operators/perses-operator-5bf474d74f-jnzwc" podUID="f13f3a63-44b1-4644-8bea-99e25a6764c3"
Jan 28 20:50:41 crc kubenswrapper[4746]: I0128 20:50:41.526199 4746 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-48hq4" event={"ID":"3f18c1eb-ae42-4def-9588-60e567b82089","Type":"ContainerStarted","Data":"6b45ceb468b37276ca7b8632a215d49d504817a301898cb1b01539656bff23d2"}
Jan 28 20:50:41 crc kubenswrapper[4746]: I0128 20:50:41.526731 4746 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-ovn-kubernetes/ovnkube-node-48hq4"
Jan 28 20:50:41 crc kubenswrapper[4746]: I0128 20:50:41.526787 4746 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-ovn-kubernetes/ovnkube-node-48hq4"
Jan 28 20:50:41 crc kubenswrapper[4746]: I0128 20:50:41.565207 4746 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-ovn-kubernetes/ovnkube-node-48hq4" podStartSLOduration=7.565181148 podStartE2EDuration="7.565181148s" podCreationTimestamp="2026-01-28 20:50:34 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-28 20:50:41.560411547 +0000 UTC m=+669.516597901" watchObservedRunningTime="2026-01-28 20:50:41.565181148 +0000 UTC m=+669.521367502"
Jan 28 20:50:41 crc kubenswrapper[4746]: I0128 20:50:41.568861 4746 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-ovn-kubernetes/ovnkube-node-48hq4"
Jan 28 20:50:42 crc kubenswrapper[4746]: I0128 20:50:42.153430 4746 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operators/obo-prometheus-operator-68bc856cb9-sx9pg"]
Jan 28 20:50:42 crc kubenswrapper[4746]: I0128 20:50:42.154090 4746 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operators/obo-prometheus-operator-68bc856cb9-sx9pg"
Jan 28 20:50:42 crc kubenswrapper[4746]: I0128 20:50:42.154692 4746 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operators/obo-prometheus-operator-68bc856cb9-sx9pg"
Jan 28 20:50:42 crc kubenswrapper[4746]: I0128 20:50:42.168337 4746 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operators/obo-prometheus-operator-admission-webhook-565bff74c4-9nt8k"]
Jan 28 20:50:42 crc kubenswrapper[4746]: I0128 20:50:42.168513 4746 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operators/obo-prometheus-operator-admission-webhook-565bff74c4-9nt8k"
Jan 28 20:50:42 crc kubenswrapper[4746]: I0128 20:50:42.169054 4746 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operators/obo-prometheus-operator-admission-webhook-565bff74c4-9nt8k"
Jan 28 20:50:42 crc kubenswrapper[4746]: I0128 20:50:42.199394 4746 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operators/perses-operator-5bf474d74f-jnzwc"]
Jan 28 20:50:42 crc kubenswrapper[4746]: I0128 20:50:42.199557 4746 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operators/perses-operator-5bf474d74f-jnzwc"
Jan 28 20:50:42 crc kubenswrapper[4746]: I0128 20:50:42.200236 4746 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operators/perses-operator-5bf474d74f-jnzwc"
Jan 28 20:50:42 crc kubenswrapper[4746]: E0128 20:50:42.224411 4746 log.go:32] "RunPodSandbox from runtime service failed" err="rpc error: code = Unknown desc = failed to create pod network sandbox k8s_obo-prometheus-operator-68bc856cb9-sx9pg_openshift-operators_0e1b10c8-2491-403a-9ea3-9805d8167d7a_0(b95a1d63b387effb1f129273448ebcff58a0d3919b8d7036b0d142de82f2d7e5): no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"
Jan 28 20:50:42 crc kubenswrapper[4746]: E0128 20:50:42.224537 4746 kuberuntime_sandbox.go:72] "Failed to create sandbox for pod" err="rpc error: code = Unknown desc = failed to create pod network sandbox k8s_obo-prometheus-operator-68bc856cb9-sx9pg_openshift-operators_0e1b10c8-2491-403a-9ea3-9805d8167d7a_0(b95a1d63b387effb1f129273448ebcff58a0d3919b8d7036b0d142de82f2d7e5): no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-operators/obo-prometheus-operator-68bc856cb9-sx9pg"
Jan 28 20:50:42 crc kubenswrapper[4746]: E0128 20:50:42.224602 4746 kuberuntime_manager.go:1170] "CreatePodSandbox for pod failed" err="rpc error: code = Unknown desc = failed to create pod network sandbox k8s_obo-prometheus-operator-68bc856cb9-sx9pg_openshift-operators_0e1b10c8-2491-403a-9ea3-9805d8167d7a_0(b95a1d63b387effb1f129273448ebcff58a0d3919b8d7036b0d142de82f2d7e5): no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-operators/obo-prometheus-operator-68bc856cb9-sx9pg"
Jan 28 20:50:42 crc kubenswrapper[4746]: E0128 20:50:42.224689 4746 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"CreatePodSandbox\" for \"obo-prometheus-operator-68bc856cb9-sx9pg_openshift-operators(0e1b10c8-2491-403a-9ea3-9805d8167d7a)\" with CreatePodSandboxError: \"Failed to create sandbox for pod \\\"obo-prometheus-operator-68bc856cb9-sx9pg_openshift-operators(0e1b10c8-2491-403a-9ea3-9805d8167d7a)\\\": rpc error: code = Unknown desc = failed to create pod network sandbox k8s_obo-prometheus-operator-68bc856cb9-sx9pg_openshift-operators_0e1b10c8-2491-403a-9ea3-9805d8167d7a_0(b95a1d63b387effb1f129273448ebcff58a0d3919b8d7036b0d142de82f2d7e5): no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?\"" pod="openshift-operators/obo-prometheus-operator-68bc856cb9-sx9pg" podUID="0e1b10c8-2491-403a-9ea3-9805d8167d7a"
Jan 28 20:50:42 crc kubenswrapper[4746]: E0128 20:50:42.238513 4746 log.go:32] "RunPodSandbox from runtime service failed" err="rpc error: code = Unknown desc = failed to create pod network sandbox k8s_obo-prometheus-operator-admission-webhook-565bff74c4-9nt8k_openshift-operators_10acdec7-69f6-42e1-b065-c84b8d82fd03_0(07fb1fafdeea27534d1c630825d60d9a6e08d4c2ab41ecc44b8a4d2f73328766): no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"
Jan 28 20:50:42 crc kubenswrapper[4746]: E0128 20:50:42.238642 4746 kuberuntime_sandbox.go:72] "Failed to create sandbox for pod" err="rpc error: code = Unknown desc = failed to create pod network sandbox k8s_obo-prometheus-operator-admission-webhook-565bff74c4-9nt8k_openshift-operators_10acdec7-69f6-42e1-b065-c84b8d82fd03_0(07fb1fafdeea27534d1c630825d60d9a6e08d4c2ab41ecc44b8a4d2f73328766): no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-operators/obo-prometheus-operator-admission-webhook-565bff74c4-9nt8k"
Jan 28 20:50:42 crc kubenswrapper[4746]: E0128 20:50:42.238678 4746 kuberuntime_manager.go:1170] "CreatePodSandbox for pod failed" err="rpc error: code = Unknown desc = failed to create pod network sandbox k8s_obo-prometheus-operator-admission-webhook-565bff74c4-9nt8k_openshift-operators_10acdec7-69f6-42e1-b065-c84b8d82fd03_0(07fb1fafdeea27534d1c630825d60d9a6e08d4c2ab41ecc44b8a4d2f73328766): no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-operators/obo-prometheus-operator-admission-webhook-565bff74c4-9nt8k"
Jan 28 20:50:42 crc kubenswrapper[4746]: E0128 20:50:42.238752 4746 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"CreatePodSandbox\" for \"obo-prometheus-operator-admission-webhook-565bff74c4-9nt8k_openshift-operators(10acdec7-69f6-42e1-b065-c84b8d82fd03)\" with CreatePodSandboxError: \"Failed to create sandbox for pod \\\"obo-prometheus-operator-admission-webhook-565bff74c4-9nt8k_openshift-operators(10acdec7-69f6-42e1-b065-c84b8d82fd03)\\\": rpc error: code = Unknown desc = failed to create pod network sandbox k8s_obo-prometheus-operator-admission-webhook-565bff74c4-9nt8k_openshift-operators_10acdec7-69f6-42e1-b065-c84b8d82fd03_0(07fb1fafdeea27534d1c630825d60d9a6e08d4c2ab41ecc44b8a4d2f73328766): no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?\"" pod="openshift-operators/obo-prometheus-operator-admission-webhook-565bff74c4-9nt8k" podUID="10acdec7-69f6-42e1-b065-c84b8d82fd03"
Jan 28 20:50:42 crc kubenswrapper[4746]: E0128 20:50:42.270175 4746 log.go:32] "RunPodSandbox from runtime service failed" err="rpc error: code = Unknown desc = failed to create pod network sandbox k8s_perses-operator-5bf474d74f-jnzwc_openshift-operators_f13f3a63-44b1-4644-8bea-99e25a6764c3_0(6a296ce7a52fc893a55cd23bfb61081da5696118c1caa1ee2379bd241ae8addf): no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"
Jan 28 20:50:42 crc kubenswrapper[4746]: E0128 20:50:42.270284 4746 kuberuntime_sandbox.go:72] "Failed to create sandbox for pod" err="rpc error: code = Unknown desc = failed to create pod network sandbox k8s_perses-operator-5bf474d74f-jnzwc_openshift-operators_f13f3a63-44b1-4644-8bea-99e25a6764c3_0(6a296ce7a52fc893a55cd23bfb61081da5696118c1caa1ee2379bd241ae8addf): no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-operators/perses-operator-5bf474d74f-jnzwc"
Jan 28 20:50:42 crc kubenswrapper[4746]: E0128 20:50:42.270311 4746 kuberuntime_manager.go:1170] "CreatePodSandbox for pod failed" err="rpc error: code = Unknown desc = failed to create pod network sandbox k8s_perses-operator-5bf474d74f-jnzwc_openshift-operators_f13f3a63-44b1-4644-8bea-99e25a6764c3_0(6a296ce7a52fc893a55cd23bfb61081da5696118c1caa1ee2379bd241ae8addf): no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-operators/perses-operator-5bf474d74f-jnzwc"
Jan 28 20:50:42 crc kubenswrapper[4746]: E0128 20:50:42.270368 4746 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"CreatePodSandbox\" for \"perses-operator-5bf474d74f-jnzwc_openshift-operators(f13f3a63-44b1-4644-8bea-99e25a6764c3)\" with CreatePodSandboxError: \"Failed to create sandbox for pod \\\"perses-operator-5bf474d74f-jnzwc_openshift-operators(f13f3a63-44b1-4644-8bea-99e25a6764c3)\\\": rpc error: code = Unknown desc = failed to create pod network sandbox k8s_perses-operator-5bf474d74f-jnzwc_openshift-operators_f13f3a63-44b1-4644-8bea-99e25a6764c3_0(6a296ce7a52fc893a55cd23bfb61081da5696118c1caa1ee2379bd241ae8addf): no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?\"" pod="openshift-operators/perses-operator-5bf474d74f-jnzwc" podUID="f13f3a63-44b1-4644-8bea-99e25a6764c3"
Jan 28 20:50:42 crc kubenswrapper[4746]: I0128 20:50:42.270456 4746 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operators/observability-operator-59bdc8b94-m2mx9"]
Jan 28 20:50:42 crc kubenswrapper[4746]: I0128 20:50:42.270603 4746 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operators/observability-operator-59bdc8b94-m2mx9"
Jan 28 20:50:42 crc kubenswrapper[4746]: I0128 20:50:42.271172 4746 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operators/observability-operator-59bdc8b94-m2mx9"
Jan 28 20:50:42 crc kubenswrapper[4746]: E0128 20:50:42.327498 4746 log.go:32] "RunPodSandbox from runtime service failed" err="rpc error: code = Unknown desc = failed to create pod network sandbox k8s_observability-operator-59bdc8b94-m2mx9_openshift-operators_2788b8ac-4eb0-46cb-8861-c55d6b302dd7_0(e9ab23283ad81f1133afe38d57a51447f072add95153b6e7a182ad0f165c04d0): no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"
Jan 28 20:50:42 crc kubenswrapper[4746]: E0128 20:50:42.327594 4746 kuberuntime_sandbox.go:72] "Failed to create sandbox for pod" err="rpc error: code = Unknown desc = failed to create pod network sandbox k8s_observability-operator-59bdc8b94-m2mx9_openshift-operators_2788b8ac-4eb0-46cb-8861-c55d6b302dd7_0(e9ab23283ad81f1133afe38d57a51447f072add95153b6e7a182ad0f165c04d0): no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-operators/observability-operator-59bdc8b94-m2mx9"
Jan 28 20:50:42 crc kubenswrapper[4746]: E0128 20:50:42.327628 4746 kuberuntime_manager.go:1170] "CreatePodSandbox for pod failed" err="rpc error: code = Unknown desc = failed to create pod network sandbox k8s_observability-operator-59bdc8b94-m2mx9_openshift-operators_2788b8ac-4eb0-46cb-8861-c55d6b302dd7_0(e9ab23283ad81f1133afe38d57a51447f072add95153b6e7a182ad0f165c04d0): no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-operators/observability-operator-59bdc8b94-m2mx9"
Jan 28 20:50:42 crc kubenswrapper[4746]: E0128 20:50:42.327692 4746 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"CreatePodSandbox\" for \"observability-operator-59bdc8b94-m2mx9_openshift-operators(2788b8ac-4eb0-46cb-8861-c55d6b302dd7)\" with CreatePodSandboxError: \"Failed to create sandbox for pod \\\"observability-operator-59bdc8b94-m2mx9_openshift-operators(2788b8ac-4eb0-46cb-8861-c55d6b302dd7)\\\": rpc error: code = Unknown desc = failed to create pod network sandbox k8s_observability-operator-59bdc8b94-m2mx9_openshift-operators_2788b8ac-4eb0-46cb-8861-c55d6b302dd7_0(e9ab23283ad81f1133afe38d57a51447f072add95153b6e7a182ad0f165c04d0): no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?\"" pod="openshift-operators/observability-operator-59bdc8b94-m2mx9" podUID="2788b8ac-4eb0-46cb-8861-c55d6b302dd7"
Jan 28 20:50:42 crc kubenswrapper[4746]: I0128 20:50:42.386626 4746 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operators/obo-prometheus-operator-admission-webhook-565bff74c4-stlnx"]
Jan 28 20:50:42 crc kubenswrapper[4746]: I0128 20:50:42.386800 4746 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operators/obo-prometheus-operator-admission-webhook-565bff74c4-stlnx"
Jan 28 20:50:42 crc kubenswrapper[4746]: I0128 20:50:42.387386 4746 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operators/obo-prometheus-operator-admission-webhook-565bff74c4-stlnx"
Jan 28 20:50:42 crc kubenswrapper[4746]: E0128 20:50:42.409504 4746 log.go:32] "RunPodSandbox from runtime service failed" err="rpc error: code = Unknown desc = failed to create pod network sandbox k8s_obo-prometheus-operator-admission-webhook-565bff74c4-stlnx_openshift-operators_09345bfc-4171-49c5-85e3-32616db6ce17_0(e03c75152feb4aca3346a764b753264af41ffcf56574cf422664fe95013afa32): no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"
Jan 28 20:50:42 crc kubenswrapper[4746]: E0128 20:50:42.409581 4746 kuberuntime_sandbox.go:72] "Failed to create sandbox for pod" err="rpc error: code = Unknown desc = failed to create pod network sandbox k8s_obo-prometheus-operator-admission-webhook-565bff74c4-stlnx_openshift-operators_09345bfc-4171-49c5-85e3-32616db6ce17_0(e03c75152feb4aca3346a764b753264af41ffcf56574cf422664fe95013afa32): no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-operators/obo-prometheus-operator-admission-webhook-565bff74c4-stlnx"
Jan 28 20:50:42 crc kubenswrapper[4746]: E0128 20:50:42.409607 4746 kuberuntime_manager.go:1170] "CreatePodSandbox for pod failed" err="rpc error: code = Unknown desc = failed to create pod network sandbox k8s_obo-prometheus-operator-admission-webhook-565bff74c4-stlnx_openshift-operators_09345bfc-4171-49c5-85e3-32616db6ce17_0(e03c75152feb4aca3346a764b753264af41ffcf56574cf422664fe95013afa32): no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-operators/obo-prometheus-operator-admission-webhook-565bff74c4-stlnx"
Jan 28 20:50:42 crc kubenswrapper[4746]: E0128 20:50:42.409656 4746 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"CreatePodSandbox\" for \"obo-prometheus-operator-admission-webhook-565bff74c4-stlnx_openshift-operators(09345bfc-4171-49c5-85e3-32616db6ce17)\" with CreatePodSandboxError: \"Failed to create sandbox for pod \\\"obo-prometheus-operator-admission-webhook-565bff74c4-stlnx_openshift-operators(09345bfc-4171-49c5-85e3-32616db6ce17)\\\": rpc error: code = Unknown desc = failed to create pod network sandbox k8s_obo-prometheus-operator-admission-webhook-565bff74c4-stlnx_openshift-operators_09345bfc-4171-49c5-85e3-32616db6ce17_0(e03c75152feb4aca3346a764b753264af41ffcf56574cf422664fe95013afa32): no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?\"" pod="openshift-operators/obo-prometheus-operator-admission-webhook-565bff74c4-stlnx" podUID="09345bfc-4171-49c5-85e3-32616db6ce17"
Jan 28 20:50:42 crc kubenswrapper[4746]: I0128 20:50:42.531989 4746 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-ovn-kubernetes/ovnkube-node-48hq4"
Jan 28 20:50:42 crc kubenswrapper[4746]: I0128 20:50:42.591512 4746 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-ovn-kubernetes/ovnkube-node-48hq4"
Jan 28 20:50:44 crc kubenswrapper[4746]: I0128 20:50:44.292529 4746 scope.go:117] "RemoveContainer" containerID="9c88cd0b151f23677f83a944570f36a7a6edd30d9be47638a17cbb9098ec1c23"
Jan 28 20:50:44 crc kubenswrapper[4746]: I0128 20:50:44.543768 4746 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-qhpvf_cdf26de0-b602-4bdf-b492-65b3b6b31434/kube-multus/2.log"
Jan 28 20:50:49 crc kubenswrapper[4746]: I0128 20:50:49.836141 4746 scope.go:117] "RemoveContainer" containerID="7739b8614574ec2b9e7c1e6a6e443f85ce5fa487dd8be878363a899877ff42f3"
Jan 28 20:50:49 crc kubenswrapper[4746]: E0128 20:50:49.837167 4746 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"kube-multus\" with CrashLoopBackOff: \"back-off 20s restarting failed container=kube-multus pod=multus-qhpvf_openshift-multus(cdf26de0-b602-4bdf-b492-65b3b6b31434)\"" pod="openshift-multus/multus-qhpvf" podUID="cdf26de0-b602-4bdf-b492-65b3b6b31434"
Jan 28 20:50:53 crc kubenswrapper[4746]: I0128 20:50:53.835808 4746 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operators/observability-operator-59bdc8b94-m2mx9"
Jan 28 20:50:53 crc kubenswrapper[4746]: I0128 20:50:53.836854 4746 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operators/observability-operator-59bdc8b94-m2mx9"
Jan 28 20:50:53 crc kubenswrapper[4746]: E0128 20:50:53.877334 4746 log.go:32] "RunPodSandbox from runtime service failed" err="rpc error: code = Unknown desc = failed to create pod network sandbox k8s_observability-operator-59bdc8b94-m2mx9_openshift-operators_2788b8ac-4eb0-46cb-8861-c55d6b302dd7_0(f3f50f755486cad7ab204ddd07672590ed3d85907bc2dfb29cd9fcfbaf95e97d): no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"
Jan 28 20:50:53 crc kubenswrapper[4746]: E0128 20:50:53.877409 4746 kuberuntime_sandbox.go:72] "Failed to create sandbox for pod" err="rpc error: code = Unknown desc = failed to create pod network sandbox k8s_observability-operator-59bdc8b94-m2mx9_openshift-operators_2788b8ac-4eb0-46cb-8861-c55d6b302dd7_0(f3f50f755486cad7ab204ddd07672590ed3d85907bc2dfb29cd9fcfbaf95e97d): no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-operators/observability-operator-59bdc8b94-m2mx9"
Jan 28 20:50:53 crc kubenswrapper[4746]: E0128 20:50:53.877428 4746 kuberuntime_manager.go:1170] "CreatePodSandbox for pod failed" err="rpc error: code = Unknown desc = failed to create pod network sandbox k8s_observability-operator-59bdc8b94-m2mx9_openshift-operators_2788b8ac-4eb0-46cb-8861-c55d6b302dd7_0(f3f50f755486cad7ab204ddd07672590ed3d85907bc2dfb29cd9fcfbaf95e97d): no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-operators/observability-operator-59bdc8b94-m2mx9"
Jan 28 20:50:53 crc kubenswrapper[4746]: E0128 20:50:53.877476 4746 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"CreatePodSandbox\" for \"observability-operator-59bdc8b94-m2mx9_openshift-operators(2788b8ac-4eb0-46cb-8861-c55d6b302dd7)\" with CreatePodSandboxError: \"Failed to create sandbox for pod \\\"observability-operator-59bdc8b94-m2mx9_openshift-operators(2788b8ac-4eb0-46cb-8861-c55d6b302dd7)\\\": rpc error: code = Unknown desc = failed to create pod network sandbox k8s_observability-operator-59bdc8b94-m2mx9_openshift-operators_2788b8ac-4eb0-46cb-8861-c55d6b302dd7_0(f3f50f755486cad7ab204ddd07672590ed3d85907bc2dfb29cd9fcfbaf95e97d): no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?\"" pod="openshift-operators/observability-operator-59bdc8b94-m2mx9" podUID="2788b8ac-4eb0-46cb-8861-c55d6b302dd7"
Jan 28 20:50:54 crc kubenswrapper[4746]: I0128 20:50:54.835532 4746 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operators/perses-operator-5bf474d74f-jnzwc"
Jan 28 20:50:54 crc kubenswrapper[4746]: I0128 20:50:54.836166 4746 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operators/perses-operator-5bf474d74f-jnzwc"
Jan 28 20:50:54 crc kubenswrapper[4746]: E0128 20:50:54.860816 4746 log.go:32] "RunPodSandbox from runtime service failed" err="rpc error: code = Unknown desc = failed to create pod network sandbox k8s_perses-operator-5bf474d74f-jnzwc_openshift-operators_f13f3a63-44b1-4644-8bea-99e25a6764c3_0(a1334963c4db08a2da4c18efaaa7ead0d118e11534f5a2dd219d8eddbdb9dabb): no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"
Jan 28 20:50:54 crc kubenswrapper[4746]: E0128 20:50:54.860903 4746 kuberuntime_sandbox.go:72] "Failed to create sandbox for pod" err="rpc error: code = Unknown desc = failed to create pod network sandbox k8s_perses-operator-5bf474d74f-jnzwc_openshift-operators_f13f3a63-44b1-4644-8bea-99e25a6764c3_0(a1334963c4db08a2da4c18efaaa7ead0d118e11534f5a2dd219d8eddbdb9dabb): no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-operators/perses-operator-5bf474d74f-jnzwc"
Jan 28 20:50:54 crc kubenswrapper[4746]: E0128 20:50:54.860930 4746 kuberuntime_manager.go:1170] "CreatePodSandbox for pod failed" err="rpc error: code = Unknown desc = failed to create pod network sandbox k8s_perses-operator-5bf474d74f-jnzwc_openshift-operators_f13f3a63-44b1-4644-8bea-99e25a6764c3_0(a1334963c4db08a2da4c18efaaa7ead0d118e11534f5a2dd219d8eddbdb9dabb): no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-operators/perses-operator-5bf474d74f-jnzwc"
Jan 28 20:50:54 crc kubenswrapper[4746]: E0128 20:50:54.860995 4746 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"CreatePodSandbox\" for \"perses-operator-5bf474d74f-jnzwc_openshift-operators(f13f3a63-44b1-4644-8bea-99e25a6764c3)\" with CreatePodSandboxError: \"Failed to create sandbox for pod \\\"perses-operator-5bf474d74f-jnzwc_openshift-operators(f13f3a63-44b1-4644-8bea-99e25a6764c3)\\\": rpc error: code = Unknown desc = failed to create pod network sandbox k8s_perses-operator-5bf474d74f-jnzwc_openshift-operators_f13f3a63-44b1-4644-8bea-99e25a6764c3_0(a1334963c4db08a2da4c18efaaa7ead0d118e11534f5a2dd219d8eddbdb9dabb): no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?\"" pod="openshift-operators/perses-operator-5bf474d74f-jnzwc" podUID="f13f3a63-44b1-4644-8bea-99e25a6764c3"
Jan 28 20:50:55 crc kubenswrapper[4746]: I0128 20:50:55.835538 4746 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operators/obo-prometheus-operator-68bc856cb9-sx9pg"
Jan 28 20:50:55 crc kubenswrapper[4746]: I0128 20:50:55.837655 4746 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operators/obo-prometheus-operator-68bc856cb9-sx9pg"
Jan 28 20:50:55 crc kubenswrapper[4746]: E0128 20:50:55.859798 4746 log.go:32] "RunPodSandbox from runtime service failed" err="rpc error: code = Unknown desc = failed to create pod network sandbox k8s_obo-prometheus-operator-68bc856cb9-sx9pg_openshift-operators_0e1b10c8-2491-403a-9ea3-9805d8167d7a_0(84806b8422784c227b0a1f014f599492f954201597e06410385b1d27f18a5c72): no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"
Jan 28 20:50:55 crc kubenswrapper[4746]: E0128 20:50:55.859891 4746 kuberuntime_sandbox.go:72] "Failed to create sandbox for pod" err="rpc error: code = Unknown desc = failed to create pod network sandbox k8s_obo-prometheus-operator-68bc856cb9-sx9pg_openshift-operators_0e1b10c8-2491-403a-9ea3-9805d8167d7a_0(84806b8422784c227b0a1f014f599492f954201597e06410385b1d27f18a5c72): no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-operators/obo-prometheus-operator-68bc856cb9-sx9pg"
Jan 28 20:50:55 crc kubenswrapper[4746]: E0128 20:50:55.859926 4746 kuberuntime_manager.go:1170] "CreatePodSandbox for pod failed" err="rpc error: code = Unknown desc = failed to create pod network sandbox k8s_obo-prometheus-operator-68bc856cb9-sx9pg_openshift-operators_0e1b10c8-2491-403a-9ea3-9805d8167d7a_0(84806b8422784c227b0a1f014f599492f954201597e06410385b1d27f18a5c72): no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-operators/obo-prometheus-operator-68bc856cb9-sx9pg"
Jan 28 20:50:55 crc kubenswrapper[4746]: E0128 20:50:55.859984 4746 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"CreatePodSandbox\" for \"obo-prometheus-operator-68bc856cb9-sx9pg_openshift-operators(0e1b10c8-2491-403a-9ea3-9805d8167d7a)\" with CreatePodSandboxError: \"Failed to create sandbox for pod \\\"obo-prometheus-operator-68bc856cb9-sx9pg_openshift-operators(0e1b10c8-2491-403a-9ea3-9805d8167d7a)\\\": rpc error: code = Unknown desc = failed to create pod network sandbox k8s_obo-prometheus-operator-68bc856cb9-sx9pg_openshift-operators_0e1b10c8-2491-403a-9ea3-9805d8167d7a_0(84806b8422784c227b0a1f014f599492f954201597e06410385b1d27f18a5c72): no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?\"" pod="openshift-operators/obo-prometheus-operator-68bc856cb9-sx9pg" podUID="0e1b10c8-2491-403a-9ea3-9805d8167d7a"
Jan 28 20:50:56 crc kubenswrapper[4746]: I0128 20:50:56.837790 4746 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operators/obo-prometheus-operator-admission-webhook-565bff74c4-9nt8k"
Jan 28 20:50:56 crc kubenswrapper[4746]: I0128 20:50:56.838634 4746 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operators/obo-prometheus-operator-admission-webhook-565bff74c4-9nt8k"
Jan 28 20:50:56 crc kubenswrapper[4746]: E0128 20:50:56.869746 4746 log.go:32] "RunPodSandbox from runtime service failed" err="rpc error: code = Unknown desc = failed to create pod network sandbox k8s_obo-prometheus-operator-admission-webhook-565bff74c4-9nt8k_openshift-operators_10acdec7-69f6-42e1-b065-c84b8d82fd03_0(87508c96096927d8bab902a7fea6aa88bba8104ddbc11e8207aa795f39113a96): no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"
Jan 28 20:50:56 crc kubenswrapper[4746]: E0128 20:50:56.869845 4746 kuberuntime_sandbox.go:72] "Failed to create sandbox for pod" err="rpc error: code = Unknown desc = failed to create pod network sandbox k8s_obo-prometheus-operator-admission-webhook-565bff74c4-9nt8k_openshift-operators_10acdec7-69f6-42e1-b065-c84b8d82fd03_0(87508c96096927d8bab902a7fea6aa88bba8104ddbc11e8207aa795f39113a96): no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-operators/obo-prometheus-operator-admission-webhook-565bff74c4-9nt8k"
Jan 28 20:50:56 crc kubenswrapper[4746]: E0128 20:50:56.869875 4746 kuberuntime_manager.go:1170] "CreatePodSandbox for pod failed" err="rpc error: code = Unknown desc = failed to create pod network sandbox k8s_obo-prometheus-operator-admission-webhook-565bff74c4-9nt8k_openshift-operators_10acdec7-69f6-42e1-b065-c84b8d82fd03_0(87508c96096927d8bab902a7fea6aa88bba8104ddbc11e8207aa795f39113a96): no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-operators/obo-prometheus-operator-admission-webhook-565bff74c4-9nt8k"
Jan 28 20:50:56 crc kubenswrapper[4746]: E0128 20:50:56.869947 4746 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"CreatePodSandbox\" for \"obo-prometheus-operator-admission-webhook-565bff74c4-9nt8k_openshift-operators(10acdec7-69f6-42e1-b065-c84b8d82fd03)\" with CreatePodSandboxError: \"Failed to create sandbox for pod \\\"obo-prometheus-operator-admission-webhook-565bff74c4-9nt8k_openshift-operators(10acdec7-69f6-42e1-b065-c84b8d82fd03)\\\": rpc error: code = Unknown desc = failed to create pod network sandbox k8s_obo-prometheus-operator-admission-webhook-565bff74c4-9nt8k_openshift-operators_10acdec7-69f6-42e1-b065-c84b8d82fd03_0(87508c96096927d8bab902a7fea6aa88bba8104ddbc11e8207aa795f39113a96): no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?\"" pod="openshift-operators/obo-prometheus-operator-admission-webhook-565bff74c4-9nt8k" podUID="10acdec7-69f6-42e1-b065-c84b8d82fd03"
Jan 28 20:50:57 crc kubenswrapper[4746]: I0128 20:50:57.835335 4746 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operators/obo-prometheus-operator-admission-webhook-565bff74c4-stlnx"
Jan 28 20:50:57 crc kubenswrapper[4746]: I0128 20:50:57.835939 4746 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operators/obo-prometheus-operator-admission-webhook-565bff74c4-stlnx"
Jan 28 20:50:57 crc kubenswrapper[4746]: E0128 20:50:57.861872 4746 log.go:32] "RunPodSandbox from runtime service failed" err="rpc error: code = Unknown desc = failed to create pod network sandbox k8s_obo-prometheus-operator-admission-webhook-565bff74c4-stlnx_openshift-operators_09345bfc-4171-49c5-85e3-32616db6ce17_0(73b7c906a369db750df7e7e2def7eeac8e2df7588ffa61162747ce624220c930): no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"
Jan 28 20:50:57 crc kubenswrapper[4746]: E0128 20:50:57.861987 4746 kuberuntime_sandbox.go:72] "Failed to create sandbox for pod" err="rpc error: code = Unknown desc = failed to create pod network sandbox k8s_obo-prometheus-operator-admission-webhook-565bff74c4-stlnx_openshift-operators_09345bfc-4171-49c5-85e3-32616db6ce17_0(73b7c906a369db750df7e7e2def7eeac8e2df7588ffa61162747ce624220c930): no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-operators/obo-prometheus-operator-admission-webhook-565bff74c4-stlnx"
Jan 28 20:50:57 crc kubenswrapper[4746]: E0128 20:50:57.862033 4746 kuberuntime_manager.go:1170] "CreatePodSandbox for pod failed" err="rpc error: code = Unknown desc = failed to create pod network sandbox k8s_obo-prometheus-operator-admission-webhook-565bff74c4-stlnx_openshift-operators_09345bfc-4171-49c5-85e3-32616db6ce17_0(73b7c906a369db750df7e7e2def7eeac8e2df7588ffa61162747ce624220c930): no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-operators/obo-prometheus-operator-admission-webhook-565bff74c4-stlnx"
Jan 28 20:50:57 crc kubenswrapper[4746]: E0128 20:50:57.862128 4746 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"CreatePodSandbox\" for \"obo-prometheus-operator-admission-webhook-565bff74c4-stlnx_openshift-operators(09345bfc-4171-49c5-85e3-32616db6ce17)\" with CreatePodSandboxError: \"Failed to create sandbox for pod \\\"obo-prometheus-operator-admission-webhook-565bff74c4-stlnx_openshift-operators(09345bfc-4171-49c5-85e3-32616db6ce17)\\\": rpc error: code = Unknown desc = failed to create pod network sandbox k8s_obo-prometheus-operator-admission-webhook-565bff74c4-stlnx_openshift-operators_09345bfc-4171-49c5-85e3-32616db6ce17_0(73b7c906a369db750df7e7e2def7eeac8e2df7588ffa61162747ce624220c930): no CNI configuration file in /etc/kubernetes/cni/net.d/.
Has your network provider started?\"" pod="openshift-operators/obo-prometheus-operator-admission-webhook-565bff74c4-stlnx" podUID="09345bfc-4171-49c5-85e3-32616db6ce17" Jan 28 20:51:04 crc kubenswrapper[4746]: I0128 20:51:04.836473 4746 scope.go:117] "RemoveContainer" containerID="7739b8614574ec2b9e7c1e6a6e443f85ce5fa487dd8be878363a899877ff42f3" Jan 28 20:51:04 crc kubenswrapper[4746]: I0128 20:51:04.945188 4746 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-ovn-kubernetes/ovnkube-node-48hq4" Jan 28 20:51:05 crc kubenswrapper[4746]: I0128 20:51:05.679364 4746 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-qhpvf_cdf26de0-b602-4bdf-b492-65b3b6b31434/kube-multus/2.log" Jan 28 20:51:05 crc kubenswrapper[4746]: I0128 20:51:05.679467 4746 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-qhpvf" event={"ID":"cdf26de0-b602-4bdf-b492-65b3b6b31434","Type":"ContainerStarted","Data":"aad1e1acab42418c5bb825ce446b77372bbaaa49b87070497e0ed4a1974229e2"} Jan 28 20:51:07 crc kubenswrapper[4746]: I0128 20:51:07.835908 4746 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operators/observability-operator-59bdc8b94-m2mx9" Jan 28 20:51:07 crc kubenswrapper[4746]: I0128 20:51:07.837267 4746 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-operators/observability-operator-59bdc8b94-m2mx9" Jan 28 20:51:08 crc kubenswrapper[4746]: I0128 20:51:08.107454 4746 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operators/observability-operator-59bdc8b94-m2mx9"] Jan 28 20:51:08 crc kubenswrapper[4746]: I0128 20:51:08.701180 4746 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operators/observability-operator-59bdc8b94-m2mx9" event={"ID":"2788b8ac-4eb0-46cb-8861-c55d6b302dd7","Type":"ContainerStarted","Data":"fd739833b06c0d4ea75e6fef95d999c664ab83fa44e61aa6aab2e35442e2b412"} Jan 28 20:51:08 crc kubenswrapper[4746]: I0128 20:51:08.840274 4746 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operators/perses-operator-5bf474d74f-jnzwc" Jan 28 20:51:08 crc kubenswrapper[4746]: I0128 20:51:08.840472 4746 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operators/perses-operator-5bf474d74f-jnzwc" Jan 28 20:51:09 crc kubenswrapper[4746]: I0128 20:51:09.146470 4746 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operators/perses-operator-5bf474d74f-jnzwc"] Jan 28 20:51:09 crc kubenswrapper[4746]: I0128 20:51:09.710033 4746 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operators/perses-operator-5bf474d74f-jnzwc" event={"ID":"f13f3a63-44b1-4644-8bea-99e25a6764c3","Type":"ContainerStarted","Data":"0a7fb8f8d5c7416c79f8c7d133b181392bd192a14e774f2b176f800e6d0edfb7"} Jan 28 20:51:09 crc kubenswrapper[4746]: I0128 20:51:09.835783 4746 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operators/obo-prometheus-operator-admission-webhook-565bff74c4-9nt8k" Jan 28 20:51:09 crc kubenswrapper[4746]: I0128 20:51:09.836610 4746 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-operators/obo-prometheus-operator-admission-webhook-565bff74c4-9nt8k" Jan 28 20:51:10 crc kubenswrapper[4746]: I0128 20:51:10.094679 4746 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operators/obo-prometheus-operator-admission-webhook-565bff74c4-9nt8k"] Jan 28 20:51:10 crc kubenswrapper[4746]: W0128 20:51:10.103815 4746 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod10acdec7_69f6_42e1_b065_c84b8d82fd03.slice/crio-2821ba7fde629ed0fc0966dee2dcf8094ad2171c5d055d82a18d164b44dda9cc WatchSource:0}: Error finding container 2821ba7fde629ed0fc0966dee2dcf8094ad2171c5d055d82a18d164b44dda9cc: Status 404 returned error can't find the container with id 2821ba7fde629ed0fc0966dee2dcf8094ad2171c5d055d82a18d164b44dda9cc Jan 28 20:51:10 crc kubenswrapper[4746]: I0128 20:51:10.725720 4746 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operators/obo-prometheus-operator-admission-webhook-565bff74c4-9nt8k" event={"ID":"10acdec7-69f6-42e1-b065-c84b8d82fd03","Type":"ContainerStarted","Data":"2821ba7fde629ed0fc0966dee2dcf8094ad2171c5d055d82a18d164b44dda9cc"} Jan 28 20:51:10 crc kubenswrapper[4746]: I0128 20:51:10.835979 4746 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operators/obo-prometheus-operator-68bc856cb9-sx9pg" Jan 28 20:51:10 crc kubenswrapper[4746]: I0128 20:51:10.836095 4746 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operators/obo-prometheus-operator-admission-webhook-565bff74c4-stlnx" Jan 28 20:51:10 crc kubenswrapper[4746]: I0128 20:51:10.836682 4746 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operators/obo-prometheus-operator-68bc856cb9-sx9pg" Jan 28 20:51:10 crc kubenswrapper[4746]: I0128 20:51:10.836804 4746 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-operators/obo-prometheus-operator-admission-webhook-565bff74c4-stlnx" Jan 28 20:51:11 crc kubenswrapper[4746]: I0128 20:51:11.077694 4746 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operators/obo-prometheus-operator-admission-webhook-565bff74c4-stlnx"] Jan 28 20:51:11 crc kubenswrapper[4746]: W0128 20:51:11.089818 4746 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod09345bfc_4171_49c5_85e3_32616db6ce17.slice/crio-18b7b54e5a75b107ff0641022a26bfe71853c6263b460de96f1673b44e10fda6 WatchSource:0}: Error finding container 18b7b54e5a75b107ff0641022a26bfe71853c6263b460de96f1673b44e10fda6: Status 404 returned error can't find the container with id 18b7b54e5a75b107ff0641022a26bfe71853c6263b460de96f1673b44e10fda6 Jan 28 20:51:11 crc kubenswrapper[4746]: I0128 20:51:11.116898 4746 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operators/obo-prometheus-operator-68bc856cb9-sx9pg"] Jan 28 20:51:11 crc kubenswrapper[4746]: W0128 20:51:11.126595 4746 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod0e1b10c8_2491_403a_9ea3_9805d8167d7a.slice/crio-97668fe00b1f3e93731d066dc13fd4f63d78801ad1257354c5450f22c1e001d8 WatchSource:0}: Error finding container 97668fe00b1f3e93731d066dc13fd4f63d78801ad1257354c5450f22c1e001d8: Status 404 returned error can't find the container with id 97668fe00b1f3e93731d066dc13fd4f63d78801ad1257354c5450f22c1e001d8 Jan 28 20:51:11 crc kubenswrapper[4746]: I0128 20:51:11.735484 4746 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operators/obo-prometheus-operator-admission-webhook-565bff74c4-stlnx" event={"ID":"09345bfc-4171-49c5-85e3-32616db6ce17","Type":"ContainerStarted","Data":"18b7b54e5a75b107ff0641022a26bfe71853c6263b460de96f1673b44e10fda6"} Jan 28 20:51:11 crc kubenswrapper[4746]: I0128 20:51:11.736920 
4746 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operators/obo-prometheus-operator-68bc856cb9-sx9pg" event={"ID":"0e1b10c8-2491-403a-9ea3-9805d8167d7a","Type":"ContainerStarted","Data":"97668fe00b1f3e93731d066dc13fd4f63d78801ad1257354c5450f22c1e001d8"} Jan 28 20:51:17 crc kubenswrapper[4746]: I0128 20:51:17.778704 4746 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operators/obo-prometheus-operator-68bc856cb9-sx9pg" event={"ID":"0e1b10c8-2491-403a-9ea3-9805d8167d7a","Type":"ContainerStarted","Data":"006b47fdd3d0c71931d695c9d5e03c77360a2f8fa678c14526fb32b42fdcf3ae"} Jan 28 20:51:17 crc kubenswrapper[4746]: I0128 20:51:17.780252 4746 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operators/perses-operator-5bf474d74f-jnzwc" event={"ID":"f13f3a63-44b1-4644-8bea-99e25a6764c3","Type":"ContainerStarted","Data":"1dde72e03c885b15cd5c685cdaf8ac36da9f405ce8a63f6fd181933fa43d4128"} Jan 28 20:51:17 crc kubenswrapper[4746]: I0128 20:51:17.780424 4746 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-operators/perses-operator-5bf474d74f-jnzwc" Jan 28 20:51:17 crc kubenswrapper[4746]: I0128 20:51:17.782110 4746 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operators/obo-prometheus-operator-admission-webhook-565bff74c4-9nt8k" event={"ID":"10acdec7-69f6-42e1-b065-c84b8d82fd03","Type":"ContainerStarted","Data":"9890f1afca1dbabda62d40b49def0907e34764f4dea7ac0568ea7133c2061550"} Jan 28 20:51:17 crc kubenswrapper[4746]: I0128 20:51:17.785449 4746 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operators/obo-prometheus-operator-admission-webhook-565bff74c4-stlnx" event={"ID":"09345bfc-4171-49c5-85e3-32616db6ce17","Type":"ContainerStarted","Data":"0b6b6684aee99fb0107f70c18106d8486cc32fe04be70bfaedaa9ca772d29e62"} Jan 28 20:51:17 crc kubenswrapper[4746]: I0128 20:51:17.827715 4746 pod_startup_latency_tracker.go:104] "Observed pod startup duration" 
pod="openshift-operators/obo-prometheus-operator-68bc856cb9-sx9pg" podStartSLOduration=33.762944051 podStartE2EDuration="39.827690614s" podCreationTimestamp="2026-01-28 20:50:38 +0000 UTC" firstStartedPulling="2026-01-28 20:51:11.130477799 +0000 UTC m=+699.086664143" lastFinishedPulling="2026-01-28 20:51:17.195224352 +0000 UTC m=+705.151410706" observedRunningTime="2026-01-28 20:51:17.801394011 +0000 UTC m=+705.757580365" watchObservedRunningTime="2026-01-28 20:51:17.827690614 +0000 UTC m=+705.783876978" Jan 28 20:51:17 crc kubenswrapper[4746]: I0128 20:51:17.827861 4746 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-operators/perses-operator-5bf474d74f-jnzwc" podStartSLOduration=30.801364666 podStartE2EDuration="38.827854589s" podCreationTimestamp="2026-01-28 20:50:39 +0000 UTC" firstStartedPulling="2026-01-28 20:51:09.167465094 +0000 UTC m=+697.123651438" lastFinishedPulling="2026-01-28 20:51:17.193955007 +0000 UTC m=+705.150141361" observedRunningTime="2026-01-28 20:51:17.824213228 +0000 UTC m=+705.780399582" watchObservedRunningTime="2026-01-28 20:51:17.827854589 +0000 UTC m=+705.784040953" Jan 28 20:51:17 crc kubenswrapper[4746]: I0128 20:51:17.846799 4746 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-operators/obo-prometheus-operator-admission-webhook-565bff74c4-stlnx" podStartSLOduration=33.740755131 podStartE2EDuration="39.846767359s" podCreationTimestamp="2026-01-28 20:50:38 +0000 UTC" firstStartedPulling="2026-01-28 20:51:11.093667156 +0000 UTC m=+699.049853510" lastFinishedPulling="2026-01-28 20:51:17.199679374 +0000 UTC m=+705.155865738" observedRunningTime="2026-01-28 20:51:17.842256815 +0000 UTC m=+705.798443179" watchObservedRunningTime="2026-01-28 20:51:17.846767359 +0000 UTC m=+705.802953723" Jan 28 20:51:17 crc kubenswrapper[4746]: I0128 20:51:17.865730 4746 pod_startup_latency_tracker.go:104] "Observed pod startup duration" 
pod="openshift-operators/obo-prometheus-operator-admission-webhook-565bff74c4-9nt8k" podStartSLOduration=32.780847664 podStartE2EDuration="39.865698531s" podCreationTimestamp="2026-01-28 20:50:38 +0000 UTC" firstStartedPulling="2026-01-28 20:51:10.1091241 +0000 UTC m=+698.065310464" lastFinishedPulling="2026-01-28 20:51:17.193974977 +0000 UTC m=+705.150161331" observedRunningTime="2026-01-28 20:51:17.86458207 +0000 UTC m=+705.820768454" watchObservedRunningTime="2026-01-28 20:51:17.865698531 +0000 UTC m=+705.821884885" Jan 28 20:51:21 crc kubenswrapper[4746]: I0128 20:51:21.823277 4746 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operators/observability-operator-59bdc8b94-m2mx9" event={"ID":"2788b8ac-4eb0-46cb-8861-c55d6b302dd7","Type":"ContainerStarted","Data":"eb6b7172ad1a6b1caff3127422016eecb193a7fe0f4f2305cc11c36fd124d6a2"} Jan 28 20:51:21 crc kubenswrapper[4746]: I0128 20:51:21.824178 4746 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-operators/observability-operator-59bdc8b94-m2mx9" Jan 28 20:51:21 crc kubenswrapper[4746]: I0128 20:51:21.829380 4746 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-operators/observability-operator-59bdc8b94-m2mx9" Jan 28 20:51:21 crc kubenswrapper[4746]: I0128 20:51:21.877693 4746 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-operators/observability-operator-59bdc8b94-m2mx9" podStartSLOduration=31.331679765 podStartE2EDuration="43.877663946s" podCreationTimestamp="2026-01-28 20:50:38 +0000 UTC" firstStartedPulling="2026-01-28 20:51:08.118514215 +0000 UTC m=+696.074700569" lastFinishedPulling="2026-01-28 20:51:20.664498396 +0000 UTC m=+708.620684750" observedRunningTime="2026-01-28 20:51:21.854119198 +0000 UTC m=+709.810305572" watchObservedRunningTime="2026-01-28 20:51:21.877663946 +0000 UTC m=+709.833850330" Jan 28 20:51:29 crc kubenswrapper[4746]: I0128 20:51:29.544592 4746 kubelet.go:2542] "SyncLoop 
(probe)" probe="readiness" status="ready" pod="openshift-operators/perses-operator-5bf474d74f-jnzwc" Jan 28 20:51:32 crc kubenswrapper[4746]: I0128 20:51:32.163059 4746 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["cert-manager/cert-manager-858654f9db-5dcxq"] Jan 28 20:51:32 crc kubenswrapper[4746]: I0128 20:51:32.164099 4746 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="cert-manager/cert-manager-858654f9db-5dcxq" Jan 28 20:51:32 crc kubenswrapper[4746]: I0128 20:51:32.169459 4746 reflector.go:368] Caches populated for *v1.Secret from object-"cert-manager"/"cert-manager-dockercfg-j6n8j" Jan 28 20:51:32 crc kubenswrapper[4746]: I0128 20:51:32.169680 4746 reflector.go:368] Caches populated for *v1.ConfigMap from object-"cert-manager"/"kube-root-ca.crt" Jan 28 20:51:32 crc kubenswrapper[4746]: I0128 20:51:32.169815 4746 reflector.go:368] Caches populated for *v1.ConfigMap from object-"cert-manager"/"openshift-service-ca.crt" Jan 28 20:51:32 crc kubenswrapper[4746]: I0128 20:51:32.187583 4746 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["cert-manager/cert-manager-858654f9db-5dcxq"] Jan 28 20:51:32 crc kubenswrapper[4746]: I0128 20:51:32.192869 4746 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["cert-manager/cert-manager-cainjector-cf98fcc89-m56r8"] Jan 28 20:51:32 crc kubenswrapper[4746]: I0128 20:51:32.193758 4746 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="cert-manager/cert-manager-cainjector-cf98fcc89-m56r8" Jan 28 20:51:32 crc kubenswrapper[4746]: I0128 20:51:32.201980 4746 reflector.go:368] Caches populated for *v1.Secret from object-"cert-manager"/"cert-manager-cainjector-dockercfg-lvcvr" Jan 28 20:51:32 crc kubenswrapper[4746]: I0128 20:51:32.212949 4746 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["cert-manager/cert-manager-webhook-687f57d79b-bzzv6"] Jan 28 20:51:32 crc kubenswrapper[4746]: I0128 20:51:32.213842 4746 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="cert-manager/cert-manager-webhook-687f57d79b-bzzv6" Jan 28 20:51:32 crc kubenswrapper[4746]: I0128 20:51:32.218442 4746 reflector.go:368] Caches populated for *v1.Secret from object-"cert-manager"/"cert-manager-webhook-dockercfg-kfdv6" Jan 28 20:51:32 crc kubenswrapper[4746]: I0128 20:51:32.227984 4746 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["cert-manager/cert-manager-webhook-687f57d79b-bzzv6"] Jan 28 20:51:32 crc kubenswrapper[4746]: I0128 20:51:32.234151 4746 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["cert-manager/cert-manager-cainjector-cf98fcc89-m56r8"] Jan 28 20:51:32 crc kubenswrapper[4746]: I0128 20:51:32.242049 4746 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-jvn5v\" (UniqueName: \"kubernetes.io/projected/e669e571-cde2-4753-a233-bd4ff6c76f02-kube-api-access-jvn5v\") pod \"cert-manager-cainjector-cf98fcc89-m56r8\" (UID: \"e669e571-cde2-4753-a233-bd4ff6c76f02\") " pod="cert-manager/cert-manager-cainjector-cf98fcc89-m56r8" Jan 28 20:51:32 crc kubenswrapper[4746]: I0128 20:51:32.242192 4746 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-hz6hx\" (UniqueName: \"kubernetes.io/projected/6ff603d5-0f8d-415a-8616-55be576956bf-kube-api-access-hz6hx\") pod \"cert-manager-webhook-687f57d79b-bzzv6\" (UID: \"6ff603d5-0f8d-415a-8616-55be576956bf\") " pod="cert-manager/cert-manager-webhook-687f57d79b-bzzv6" Jan 28 20:51:32 crc kubenswrapper[4746]: I0128 20:51:32.242227 4746 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-lwbnw\" (UniqueName: \"kubernetes.io/projected/ca33d567-a88a-4cad-b323-ffbb4ac0e02e-kube-api-access-lwbnw\") pod \"cert-manager-858654f9db-5dcxq\" (UID: \"ca33d567-a88a-4cad-b323-ffbb4ac0e02e\") " pod="cert-manager/cert-manager-858654f9db-5dcxq" Jan 28 20:51:32 crc kubenswrapper[4746]: I0128 20:51:32.344025 
4746 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-hz6hx\" (UniqueName: \"kubernetes.io/projected/6ff603d5-0f8d-415a-8616-55be576956bf-kube-api-access-hz6hx\") pod \"cert-manager-webhook-687f57d79b-bzzv6\" (UID: \"6ff603d5-0f8d-415a-8616-55be576956bf\") " pod="cert-manager/cert-manager-webhook-687f57d79b-bzzv6" Jan 28 20:51:32 crc kubenswrapper[4746]: I0128 20:51:32.344100 4746 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-lwbnw\" (UniqueName: \"kubernetes.io/projected/ca33d567-a88a-4cad-b323-ffbb4ac0e02e-kube-api-access-lwbnw\") pod \"cert-manager-858654f9db-5dcxq\" (UID: \"ca33d567-a88a-4cad-b323-ffbb4ac0e02e\") " pod="cert-manager/cert-manager-858654f9db-5dcxq" Jan 28 20:51:32 crc kubenswrapper[4746]: I0128 20:51:32.344145 4746 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-jvn5v\" (UniqueName: \"kubernetes.io/projected/e669e571-cde2-4753-a233-bd4ff6c76f02-kube-api-access-jvn5v\") pod \"cert-manager-cainjector-cf98fcc89-m56r8\" (UID: \"e669e571-cde2-4753-a233-bd4ff6c76f02\") " pod="cert-manager/cert-manager-cainjector-cf98fcc89-m56r8" Jan 28 20:51:32 crc kubenswrapper[4746]: I0128 20:51:32.365161 4746 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-jvn5v\" (UniqueName: \"kubernetes.io/projected/e669e571-cde2-4753-a233-bd4ff6c76f02-kube-api-access-jvn5v\") pod \"cert-manager-cainjector-cf98fcc89-m56r8\" (UID: \"e669e571-cde2-4753-a233-bd4ff6c76f02\") " pod="cert-manager/cert-manager-cainjector-cf98fcc89-m56r8" Jan 28 20:51:32 crc kubenswrapper[4746]: I0128 20:51:32.365173 4746 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-lwbnw\" (UniqueName: \"kubernetes.io/projected/ca33d567-a88a-4cad-b323-ffbb4ac0e02e-kube-api-access-lwbnw\") pod \"cert-manager-858654f9db-5dcxq\" (UID: \"ca33d567-a88a-4cad-b323-ffbb4ac0e02e\") " 
pod="cert-manager/cert-manager-858654f9db-5dcxq" Jan 28 20:51:32 crc kubenswrapper[4746]: I0128 20:51:32.381019 4746 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-hz6hx\" (UniqueName: \"kubernetes.io/projected/6ff603d5-0f8d-415a-8616-55be576956bf-kube-api-access-hz6hx\") pod \"cert-manager-webhook-687f57d79b-bzzv6\" (UID: \"6ff603d5-0f8d-415a-8616-55be576956bf\") " pod="cert-manager/cert-manager-webhook-687f57d79b-bzzv6" Jan 28 20:51:32 crc kubenswrapper[4746]: I0128 20:51:32.482174 4746 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="cert-manager/cert-manager-858654f9db-5dcxq" Jan 28 20:51:32 crc kubenswrapper[4746]: I0128 20:51:32.514851 4746 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="cert-manager/cert-manager-cainjector-cf98fcc89-m56r8" Jan 28 20:51:32 crc kubenswrapper[4746]: I0128 20:51:32.530036 4746 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="cert-manager/cert-manager-webhook-687f57d79b-bzzv6" Jan 28 20:51:32 crc kubenswrapper[4746]: I0128 20:51:32.802864 4746 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["cert-manager/cert-manager-858654f9db-5dcxq"] Jan 28 20:51:32 crc kubenswrapper[4746]: I0128 20:51:32.927361 4746 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="cert-manager/cert-manager-858654f9db-5dcxq" event={"ID":"ca33d567-a88a-4cad-b323-ffbb4ac0e02e","Type":"ContainerStarted","Data":"63dc55b77bf5a8f21bfc3291fb0d45459ca1495ec7d72e67a02a3b2ee6b899db"} Jan 28 20:51:33 crc kubenswrapper[4746]: I0128 20:51:33.129064 4746 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["cert-manager/cert-manager-webhook-687f57d79b-bzzv6"] Jan 28 20:51:33 crc kubenswrapper[4746]: W0128 20:51:33.137034 4746 manager.go:1169] Failed to process watch event {EventType:0 
Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod6ff603d5_0f8d_415a_8616_55be576956bf.slice/crio-f8e78fde82929b2646137b89716f1ecf3794b71c683f620e3423439657bd449e WatchSource:0}: Error finding container f8e78fde82929b2646137b89716f1ecf3794b71c683f620e3423439657bd449e: Status 404 returned error can't find the container with id f8e78fde82929b2646137b89716f1ecf3794b71c683f620e3423439657bd449e Jan 28 20:51:33 crc kubenswrapper[4746]: I0128 20:51:33.192780 4746 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["cert-manager/cert-manager-cainjector-cf98fcc89-m56r8"] Jan 28 20:51:33 crc kubenswrapper[4746]: I0128 20:51:33.945760 4746 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="cert-manager/cert-manager-cainjector-cf98fcc89-m56r8" event={"ID":"e669e571-cde2-4753-a233-bd4ff6c76f02","Type":"ContainerStarted","Data":"d82f2a73b973124ec9f0aecb163818c4af455b4b7ee6534acba2fbb48aa04566"} Jan 28 20:51:33 crc kubenswrapper[4746]: I0128 20:51:33.950904 4746 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="cert-manager/cert-manager-webhook-687f57d79b-bzzv6" event={"ID":"6ff603d5-0f8d-415a-8616-55be576956bf","Type":"ContainerStarted","Data":"f8e78fde82929b2646137b89716f1ecf3794b71c683f620e3423439657bd449e"} Jan 28 20:51:37 crc kubenswrapper[4746]: I0128 20:51:37.989477 4746 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="cert-manager/cert-manager-858654f9db-5dcxq" event={"ID":"ca33d567-a88a-4cad-b323-ffbb4ac0e02e","Type":"ContainerStarted","Data":"12f3600ffd9834423108fed43bbed41a2905abdaf88a501dbe44a914f8e35cb3"} Jan 28 20:51:38 crc kubenswrapper[4746]: I0128 20:51:38.010793 4746 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="cert-manager/cert-manager-cainjector-cf98fcc89-m56r8" event={"ID":"e669e571-cde2-4753-a233-bd4ff6c76f02","Type":"ContainerStarted","Data":"bc00f03c9c5f7ff14ec6f97d97d6901ccf773a854bca4873db5f2456cb22db42"} Jan 28 20:51:38 crc kubenswrapper[4746]: I0128 20:51:38.019921 4746 kubelet.go:2453] "SyncLoop (PLEG): 
event for pod" pod="cert-manager/cert-manager-webhook-687f57d79b-bzzv6" event={"ID":"6ff603d5-0f8d-415a-8616-55be576956bf","Type":"ContainerStarted","Data":"73cfa23ec756b8ff050c05558ca07824cbc3c6d44ba11fc2dce045b5c09510f2"} Jan 28 20:51:38 crc kubenswrapper[4746]: I0128 20:51:38.021146 4746 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="cert-manager/cert-manager-webhook-687f57d79b-bzzv6" Jan 28 20:51:38 crc kubenswrapper[4746]: I0128 20:51:38.046792 4746 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="cert-manager/cert-manager-858654f9db-5dcxq" podStartSLOduration=1.422118363 podStartE2EDuration="6.046761919s" podCreationTimestamp="2026-01-28 20:51:32 +0000 UTC" firstStartedPulling="2026-01-28 20:51:32.83290307 +0000 UTC m=+720.789089424" lastFinishedPulling="2026-01-28 20:51:37.457546626 +0000 UTC m=+725.413732980" observedRunningTime="2026-01-28 20:51:38.04013388 +0000 UTC m=+725.996320234" watchObservedRunningTime="2026-01-28 20:51:38.046761919 +0000 UTC m=+726.002948273" Jan 28 20:51:38 crc kubenswrapper[4746]: I0128 20:51:38.086346 4746 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="cert-manager/cert-manager-cainjector-cf98fcc89-m56r8" podStartSLOduration=1.8999810419999998 podStartE2EDuration="6.086295146s" podCreationTimestamp="2026-01-28 20:51:32 +0000 UTC" firstStartedPulling="2026-01-28 20:51:33.198627982 +0000 UTC m=+721.154814336" lastFinishedPulling="2026-01-28 20:51:37.384942066 +0000 UTC m=+725.341128440" observedRunningTime="2026-01-28 20:51:38.082538695 +0000 UTC m=+726.038725049" watchObservedRunningTime="2026-01-28 20:51:38.086295146 +0000 UTC m=+726.042481500" Jan 28 20:51:38 crc kubenswrapper[4746]: I0128 20:51:38.117450 4746 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="cert-manager/cert-manager-webhook-687f57d79b-bzzv6" podStartSLOduration=1.871740209 podStartE2EDuration="6.117421476s" podCreationTimestamp="2026-01-28 20:51:32 +0000 UTC" 
firstStartedPulling="2026-01-28 20:51:33.139213588 +0000 UTC m=+721.095399942" lastFinishedPulling="2026-01-28 20:51:37.384894815 +0000 UTC m=+725.341081209" observedRunningTime="2026-01-28 20:51:38.115859153 +0000 UTC m=+726.072045517" watchObservedRunningTime="2026-01-28 20:51:38.117421476 +0000 UTC m=+726.073607830" Jan 28 20:51:42 crc kubenswrapper[4746]: I0128 20:51:42.533037 4746 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="cert-manager/cert-manager-webhook-687f57d79b-bzzv6" Jan 28 20:52:06 crc kubenswrapper[4746]: I0128 20:52:06.295327 4746 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/3e572a74f8b8ca2bcfe04329d4f26bd9689911be5d166a7403bd6ae773vp7fg"] Jan 28 20:52:06 crc kubenswrapper[4746]: I0128 20:52:06.297876 4746 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/3e572a74f8b8ca2bcfe04329d4f26bd9689911be5d166a7403bd6ae773vp7fg" Jan 28 20:52:06 crc kubenswrapper[4746]: I0128 20:52:06.302464 4746 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"default-dockercfg-vmwhc" Jan 28 20:52:06 crc kubenswrapper[4746]: I0128 20:52:06.313878 4746 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/3e572a74f8b8ca2bcfe04329d4f26bd9689911be5d166a7403bd6ae773vp7fg"] Jan 28 20:52:06 crc kubenswrapper[4746]: I0128 20:52:06.332525 4746 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/2e9f38fa-c869-458e-8a7c-fb15ec9acccd-bundle\") pod \"3e572a74f8b8ca2bcfe04329d4f26bd9689911be5d166a7403bd6ae773vp7fg\" (UID: \"2e9f38fa-c869-458e-8a7c-fb15ec9acccd\") " pod="openshift-marketplace/3e572a74f8b8ca2bcfe04329d4f26bd9689911be5d166a7403bd6ae773vp7fg" Jan 28 20:52:06 crc kubenswrapper[4746]: I0128 20:52:06.332921 4746 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume 
\"kube-api-access-pfbpj\" (UniqueName: \"kubernetes.io/projected/2e9f38fa-c869-458e-8a7c-fb15ec9acccd-kube-api-access-pfbpj\") pod \"3e572a74f8b8ca2bcfe04329d4f26bd9689911be5d166a7403bd6ae773vp7fg\" (UID: \"2e9f38fa-c869-458e-8a7c-fb15ec9acccd\") " pod="openshift-marketplace/3e572a74f8b8ca2bcfe04329d4f26bd9689911be5d166a7403bd6ae773vp7fg" Jan 28 20:52:06 crc kubenswrapper[4746]: I0128 20:52:06.333118 4746 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/2e9f38fa-c869-458e-8a7c-fb15ec9acccd-util\") pod \"3e572a74f8b8ca2bcfe04329d4f26bd9689911be5d166a7403bd6ae773vp7fg\" (UID: \"2e9f38fa-c869-458e-8a7c-fb15ec9acccd\") " pod="openshift-marketplace/3e572a74f8b8ca2bcfe04329d4f26bd9689911be5d166a7403bd6ae773vp7fg" Jan 28 20:52:06 crc kubenswrapper[4746]: I0128 20:52:06.434865 4746 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/2e9f38fa-c869-458e-8a7c-fb15ec9acccd-util\") pod \"3e572a74f8b8ca2bcfe04329d4f26bd9689911be5d166a7403bd6ae773vp7fg\" (UID: \"2e9f38fa-c869-458e-8a7c-fb15ec9acccd\") " pod="openshift-marketplace/3e572a74f8b8ca2bcfe04329d4f26bd9689911be5d166a7403bd6ae773vp7fg" Jan 28 20:52:06 crc kubenswrapper[4746]: I0128 20:52:06.435277 4746 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/2e9f38fa-c869-458e-8a7c-fb15ec9acccd-bundle\") pod \"3e572a74f8b8ca2bcfe04329d4f26bd9689911be5d166a7403bd6ae773vp7fg\" (UID: \"2e9f38fa-c869-458e-8a7c-fb15ec9acccd\") " pod="openshift-marketplace/3e572a74f8b8ca2bcfe04329d4f26bd9689911be5d166a7403bd6ae773vp7fg" Jan 28 20:52:06 crc kubenswrapper[4746]: I0128 20:52:06.435399 4746 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-pfbpj\" (UniqueName: \"kubernetes.io/projected/2e9f38fa-c869-458e-8a7c-fb15ec9acccd-kube-api-access-pfbpj\") pod 
\"3e572a74f8b8ca2bcfe04329d4f26bd9689911be5d166a7403bd6ae773vp7fg\" (UID: \"2e9f38fa-c869-458e-8a7c-fb15ec9acccd\") " pod="openshift-marketplace/3e572a74f8b8ca2bcfe04329d4f26bd9689911be5d166a7403bd6ae773vp7fg" Jan 28 20:52:06 crc kubenswrapper[4746]: I0128 20:52:06.435500 4746 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/2e9f38fa-c869-458e-8a7c-fb15ec9acccd-util\") pod \"3e572a74f8b8ca2bcfe04329d4f26bd9689911be5d166a7403bd6ae773vp7fg\" (UID: \"2e9f38fa-c869-458e-8a7c-fb15ec9acccd\") " pod="openshift-marketplace/3e572a74f8b8ca2bcfe04329d4f26bd9689911be5d166a7403bd6ae773vp7fg" Jan 28 20:52:06 crc kubenswrapper[4746]: I0128 20:52:06.435634 4746 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/2e9f38fa-c869-458e-8a7c-fb15ec9acccd-bundle\") pod \"3e572a74f8b8ca2bcfe04329d4f26bd9689911be5d166a7403bd6ae773vp7fg\" (UID: \"2e9f38fa-c869-458e-8a7c-fb15ec9acccd\") " pod="openshift-marketplace/3e572a74f8b8ca2bcfe04329d4f26bd9689911be5d166a7403bd6ae773vp7fg" Jan 28 20:52:06 crc kubenswrapper[4746]: I0128 20:52:06.458778 4746 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-pfbpj\" (UniqueName: \"kubernetes.io/projected/2e9f38fa-c869-458e-8a7c-fb15ec9acccd-kube-api-access-pfbpj\") pod \"3e572a74f8b8ca2bcfe04329d4f26bd9689911be5d166a7403bd6ae773vp7fg\" (UID: \"2e9f38fa-c869-458e-8a7c-fb15ec9acccd\") " pod="openshift-marketplace/3e572a74f8b8ca2bcfe04329d4f26bd9689911be5d166a7403bd6ae773vp7fg" Jan 28 20:52:06 crc kubenswrapper[4746]: I0128 20:52:06.616499 4746 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/3e572a74f8b8ca2bcfe04329d4f26bd9689911be5d166a7403bd6ae773vp7fg" Jan 28 20:52:06 crc kubenswrapper[4746]: I0128 20:52:06.858142 4746 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/3e572a74f8b8ca2bcfe04329d4f26bd9689911be5d166a7403bd6ae773vp7fg"] Jan 28 20:52:07 crc kubenswrapper[4746]: I0128 20:52:07.256813 4746 generic.go:334] "Generic (PLEG): container finished" podID="2e9f38fa-c869-458e-8a7c-fb15ec9acccd" containerID="444bb831902b736299e934af856c327c5bb41a3927a76818160c00ebf7e00510" exitCode=0 Jan 28 20:52:07 crc kubenswrapper[4746]: I0128 20:52:07.256908 4746 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/3e572a74f8b8ca2bcfe04329d4f26bd9689911be5d166a7403bd6ae773vp7fg" event={"ID":"2e9f38fa-c869-458e-8a7c-fb15ec9acccd","Type":"ContainerDied","Data":"444bb831902b736299e934af856c327c5bb41a3927a76818160c00ebf7e00510"} Jan 28 20:52:07 crc kubenswrapper[4746]: I0128 20:52:07.256962 4746 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/3e572a74f8b8ca2bcfe04329d4f26bd9689911be5d166a7403bd6ae773vp7fg" event={"ID":"2e9f38fa-c869-458e-8a7c-fb15ec9acccd","Type":"ContainerStarted","Data":"1bbaf75f5d052f49092bd5d412915744d07169edd56650a163bb120f28bb85fa"} Jan 28 20:52:08 crc kubenswrapper[4746]: I0128 20:52:08.601237 4746 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-operators-8lltf"] Jan 28 20:52:08 crc kubenswrapper[4746]: I0128 20:52:08.604106 4746 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-8lltf" Jan 28 20:52:08 crc kubenswrapper[4746]: I0128 20:52:08.622832 4746 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-8lltf"] Jan 28 20:52:08 crc kubenswrapper[4746]: I0128 20:52:08.668659 4746 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/c83c6519-1cce-431c-9869-b0ab5716d2ed-catalog-content\") pod \"redhat-operators-8lltf\" (UID: \"c83c6519-1cce-431c-9869-b0ab5716d2ed\") " pod="openshift-marketplace/redhat-operators-8lltf" Jan 28 20:52:08 crc kubenswrapper[4746]: I0128 20:52:08.668728 4746 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-6dnkc\" (UniqueName: \"kubernetes.io/projected/c83c6519-1cce-431c-9869-b0ab5716d2ed-kube-api-access-6dnkc\") pod \"redhat-operators-8lltf\" (UID: \"c83c6519-1cce-431c-9869-b0ab5716d2ed\") " pod="openshift-marketplace/redhat-operators-8lltf" Jan 28 20:52:08 crc kubenswrapper[4746]: I0128 20:52:08.668771 4746 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/c83c6519-1cce-431c-9869-b0ab5716d2ed-utilities\") pod \"redhat-operators-8lltf\" (UID: \"c83c6519-1cce-431c-9869-b0ab5716d2ed\") " pod="openshift-marketplace/redhat-operators-8lltf" Jan 28 20:52:08 crc kubenswrapper[4746]: I0128 20:52:08.770113 4746 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/c83c6519-1cce-431c-9869-b0ab5716d2ed-utilities\") pod \"redhat-operators-8lltf\" (UID: \"c83c6519-1cce-431c-9869-b0ab5716d2ed\") " pod="openshift-marketplace/redhat-operators-8lltf" Jan 28 20:52:08 crc kubenswrapper[4746]: I0128 20:52:08.770227 4746 reconciler_common.go:218] "operationExecutor.MountVolume started for volume 
\"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/c83c6519-1cce-431c-9869-b0ab5716d2ed-catalog-content\") pod \"redhat-operators-8lltf\" (UID: \"c83c6519-1cce-431c-9869-b0ab5716d2ed\") " pod="openshift-marketplace/redhat-operators-8lltf" Jan 28 20:52:08 crc kubenswrapper[4746]: I0128 20:52:08.770285 4746 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-6dnkc\" (UniqueName: \"kubernetes.io/projected/c83c6519-1cce-431c-9869-b0ab5716d2ed-kube-api-access-6dnkc\") pod \"redhat-operators-8lltf\" (UID: \"c83c6519-1cce-431c-9869-b0ab5716d2ed\") " pod="openshift-marketplace/redhat-operators-8lltf" Jan 28 20:52:08 crc kubenswrapper[4746]: I0128 20:52:08.770895 4746 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/c83c6519-1cce-431c-9869-b0ab5716d2ed-utilities\") pod \"redhat-operators-8lltf\" (UID: \"c83c6519-1cce-431c-9869-b0ab5716d2ed\") " pod="openshift-marketplace/redhat-operators-8lltf" Jan 28 20:52:08 crc kubenswrapper[4746]: I0128 20:52:08.770938 4746 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/c83c6519-1cce-431c-9869-b0ab5716d2ed-catalog-content\") pod \"redhat-operators-8lltf\" (UID: \"c83c6519-1cce-431c-9869-b0ab5716d2ed\") " pod="openshift-marketplace/redhat-operators-8lltf" Jan 28 20:52:08 crc kubenswrapper[4746]: I0128 20:52:08.792648 4746 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-6dnkc\" (UniqueName: \"kubernetes.io/projected/c83c6519-1cce-431c-9869-b0ab5716d2ed-kube-api-access-6dnkc\") pod \"redhat-operators-8lltf\" (UID: \"c83c6519-1cce-431c-9869-b0ab5716d2ed\") " pod="openshift-marketplace/redhat-operators-8lltf" Jan 28 20:52:08 crc kubenswrapper[4746]: I0128 20:52:08.855788 4746 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["minio-dev/minio"] Jan 28 20:52:08 crc kubenswrapper[4746]: I0128 20:52:08.856836 
4746 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="minio-dev/minio" Jan 28 20:52:08 crc kubenswrapper[4746]: I0128 20:52:08.860739 4746 reflector.go:368] Caches populated for *v1.ConfigMap from object-"minio-dev"/"openshift-service-ca.crt" Jan 28 20:52:08 crc kubenswrapper[4746]: I0128 20:52:08.860920 4746 reflector.go:368] Caches populated for *v1.ConfigMap from object-"minio-dev"/"kube-root-ca.crt" Jan 28 20:52:08 crc kubenswrapper[4746]: I0128 20:52:08.868489 4746 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["minio-dev/minio"] Jan 28 20:52:08 crc kubenswrapper[4746]: I0128 20:52:08.940551 4746 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-8lltf" Jan 28 20:52:08 crc kubenswrapper[4746]: I0128 20:52:08.974825 4746 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"pvc-0471cd9c-8dd2-4064-8133-041dd094c43d\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-0471cd9c-8dd2-4064-8133-041dd094c43d\") pod \"minio\" (UID: \"4454cf22-d4de-41b2-aa88-cba40aedf606\") " pod="minio-dev/minio" Jan 28 20:52:08 crc kubenswrapper[4746]: I0128 20:52:08.974882 4746 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-vtjk9\" (UniqueName: \"kubernetes.io/projected/4454cf22-d4de-41b2-aa88-cba40aedf606-kube-api-access-vtjk9\") pod \"minio\" (UID: \"4454cf22-d4de-41b2-aa88-cba40aedf606\") " pod="minio-dev/minio" Jan 28 20:52:09 crc kubenswrapper[4746]: I0128 20:52:09.078381 4746 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-0471cd9c-8dd2-4064-8133-041dd094c43d\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-0471cd9c-8dd2-4064-8133-041dd094c43d\") pod \"minio\" (UID: \"4454cf22-d4de-41b2-aa88-cba40aedf606\") " pod="minio-dev/minio" Jan 28 20:52:09 crc kubenswrapper[4746]: I0128 20:52:09.079006 4746 
reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-vtjk9\" (UniqueName: \"kubernetes.io/projected/4454cf22-d4de-41b2-aa88-cba40aedf606-kube-api-access-vtjk9\") pod \"minio\" (UID: \"4454cf22-d4de-41b2-aa88-cba40aedf606\") " pod="minio-dev/minio" Jan 28 20:52:09 crc kubenswrapper[4746]: I0128 20:52:09.082859 4746 csi_attacher.go:380] kubernetes.io/csi: attacher.MountDevice STAGE_UNSTAGE_VOLUME capability not set. Skipping MountDevice... Jan 28 20:52:09 crc kubenswrapper[4746]: I0128 20:52:09.082899 4746 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"pvc-0471cd9c-8dd2-4064-8133-041dd094c43d\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-0471cd9c-8dd2-4064-8133-041dd094c43d\") pod \"minio\" (UID: \"4454cf22-d4de-41b2-aa88-cba40aedf606\") device mount path \"/var/lib/kubelet/plugins/kubernetes.io/csi/kubevirt.io.hostpath-provisioner/00ac37901245cdaf2ce1c9ae20f90dcc8eea893bace26da352fee767778bcc8f/globalmount\"" pod="minio-dev/minio" Jan 28 20:52:09 crc kubenswrapper[4746]: I0128 20:52:09.116195 4746 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-vtjk9\" (UniqueName: \"kubernetes.io/projected/4454cf22-d4de-41b2-aa88-cba40aedf606-kube-api-access-vtjk9\") pod \"minio\" (UID: \"4454cf22-d4de-41b2-aa88-cba40aedf606\") " pod="minio-dev/minio" Jan 28 20:52:09 crc kubenswrapper[4746]: I0128 20:52:09.134730 4746 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"pvc-0471cd9c-8dd2-4064-8133-041dd094c43d\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-0471cd9c-8dd2-4064-8133-041dd094c43d\") pod \"minio\" (UID: \"4454cf22-d4de-41b2-aa88-cba40aedf606\") " pod="minio-dev/minio" Jan 28 20:52:09 crc kubenswrapper[4746]: I0128 20:52:09.182909 4746 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="minio-dev/minio" Jan 28 20:52:09 crc kubenswrapper[4746]: I0128 20:52:09.223723 4746 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-8lltf"] Jan 28 20:52:09 crc kubenswrapper[4746]: W0128 20:52:09.248913 4746 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podc83c6519_1cce_431c_9869_b0ab5716d2ed.slice/crio-0697696982d370b572dd01af49da370cf72f393c68d88e0b1ffacd351c52d0c1 WatchSource:0}: Error finding container 0697696982d370b572dd01af49da370cf72f393c68d88e0b1ffacd351c52d0c1: Status 404 returned error can't find the container with id 0697696982d370b572dd01af49da370cf72f393c68d88e0b1ffacd351c52d0c1 Jan 28 20:52:09 crc kubenswrapper[4746]: I0128 20:52:09.279031 4746 dynamic_cafile_content.go:123] "Loaded a new CA Bundle and Verifier" name="client-ca-bundle::/etc/kubernetes/kubelet-ca.crt" Jan 28 20:52:09 crc kubenswrapper[4746]: I0128 20:52:09.279216 4746 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-8lltf" event={"ID":"c83c6519-1cce-431c-9869-b0ab5716d2ed","Type":"ContainerStarted","Data":"0697696982d370b572dd01af49da370cf72f393c68d88e0b1ffacd351c52d0c1"} Jan 28 20:52:09 crc kubenswrapper[4746]: I0128 20:52:09.281701 4746 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/3e572a74f8b8ca2bcfe04329d4f26bd9689911be5d166a7403bd6ae773vp7fg" event={"ID":"2e9f38fa-c869-458e-8a7c-fb15ec9acccd","Type":"ContainerStarted","Data":"f8af53c75c2f51ce7b355850c988f22b9e60cc0b0bb90a836914dde28e86ac3c"} Jan 28 20:52:09 crc kubenswrapper[4746]: I0128 20:52:09.498996 4746 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["minio-dev/minio"] Jan 28 20:52:10 crc kubenswrapper[4746]: I0128 20:52:10.291386 4746 generic.go:334] "Generic (PLEG): container finished" podID="c83c6519-1cce-431c-9869-b0ab5716d2ed" 
containerID="f9812bcdd880382f07e19c70cfdd912d1ce26029d51357f205d3906916e1fb62" exitCode=0 Jan 28 20:52:10 crc kubenswrapper[4746]: I0128 20:52:10.291474 4746 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-8lltf" event={"ID":"c83c6519-1cce-431c-9869-b0ab5716d2ed","Type":"ContainerDied","Data":"f9812bcdd880382f07e19c70cfdd912d1ce26029d51357f205d3906916e1fb62"} Jan 28 20:52:10 crc kubenswrapper[4746]: I0128 20:52:10.296317 4746 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="minio-dev/minio" event={"ID":"4454cf22-d4de-41b2-aa88-cba40aedf606","Type":"ContainerStarted","Data":"25bd1947437130852c1c82dd62b146949eff8a491d8304c961aec9566e07fdd0"} Jan 28 20:52:10 crc kubenswrapper[4746]: I0128 20:52:10.315605 4746 generic.go:334] "Generic (PLEG): container finished" podID="2e9f38fa-c869-458e-8a7c-fb15ec9acccd" containerID="f8af53c75c2f51ce7b355850c988f22b9e60cc0b0bb90a836914dde28e86ac3c" exitCode=0 Jan 28 20:52:10 crc kubenswrapper[4746]: I0128 20:52:10.315670 4746 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/3e572a74f8b8ca2bcfe04329d4f26bd9689911be5d166a7403bd6ae773vp7fg" event={"ID":"2e9f38fa-c869-458e-8a7c-fb15ec9acccd","Type":"ContainerDied","Data":"f8af53c75c2f51ce7b355850c988f22b9e60cc0b0bb90a836914dde28e86ac3c"} Jan 28 20:52:11 crc kubenswrapper[4746]: I0128 20:52:11.325484 4746 generic.go:334] "Generic (PLEG): container finished" podID="2e9f38fa-c869-458e-8a7c-fb15ec9acccd" containerID="e03f26156e05062e1df0ec8e5e7e7ff2b319dcc2b274519b73bf9fbeb8d4664a" exitCode=0 Jan 28 20:52:11 crc kubenswrapper[4746]: I0128 20:52:11.325678 4746 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/3e572a74f8b8ca2bcfe04329d4f26bd9689911be5d166a7403bd6ae773vp7fg" event={"ID":"2e9f38fa-c869-458e-8a7c-fb15ec9acccd","Type":"ContainerDied","Data":"e03f26156e05062e1df0ec8e5e7e7ff2b319dcc2b274519b73bf9fbeb8d4664a"} Jan 28 20:52:12 crc kubenswrapper[4746]: I0128 20:52:12.341753 4746 
generic.go:334] "Generic (PLEG): container finished" podID="c83c6519-1cce-431c-9869-b0ab5716d2ed" containerID="9886693d95a884abf3121cff63fe3fd986cfcea83119fd1fe16f978bdf5793f6" exitCode=0 Jan 28 20:52:12 crc kubenswrapper[4746]: I0128 20:52:12.341972 4746 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-8lltf" event={"ID":"c83c6519-1cce-431c-9869-b0ab5716d2ed","Type":"ContainerDied","Data":"9886693d95a884abf3121cff63fe3fd986cfcea83119fd1fe16f978bdf5793f6"} Jan 28 20:52:12 crc kubenswrapper[4746]: I0128 20:52:12.947801 4746 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/3e572a74f8b8ca2bcfe04329d4f26bd9689911be5d166a7403bd6ae773vp7fg" Jan 28 20:52:13 crc kubenswrapper[4746]: I0128 20:52:13.044969 4746 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/2e9f38fa-c869-458e-8a7c-fb15ec9acccd-util\") pod \"2e9f38fa-c869-458e-8a7c-fb15ec9acccd\" (UID: \"2e9f38fa-c869-458e-8a7c-fb15ec9acccd\") " Jan 28 20:52:13 crc kubenswrapper[4746]: I0128 20:52:13.045032 4746 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/2e9f38fa-c869-458e-8a7c-fb15ec9acccd-bundle\") pod \"2e9f38fa-c869-458e-8a7c-fb15ec9acccd\" (UID: \"2e9f38fa-c869-458e-8a7c-fb15ec9acccd\") " Jan 28 20:52:13 crc kubenswrapper[4746]: I0128 20:52:13.045073 4746 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-pfbpj\" (UniqueName: \"kubernetes.io/projected/2e9f38fa-c869-458e-8a7c-fb15ec9acccd-kube-api-access-pfbpj\") pod \"2e9f38fa-c869-458e-8a7c-fb15ec9acccd\" (UID: \"2e9f38fa-c869-458e-8a7c-fb15ec9acccd\") " Jan 28 20:52:13 crc kubenswrapper[4746]: I0128 20:52:13.045986 4746 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/2e9f38fa-c869-458e-8a7c-fb15ec9acccd-bundle" 
(OuterVolumeSpecName: "bundle") pod "2e9f38fa-c869-458e-8a7c-fb15ec9acccd" (UID: "2e9f38fa-c869-458e-8a7c-fb15ec9acccd"). InnerVolumeSpecName "bundle". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 28 20:52:13 crc kubenswrapper[4746]: I0128 20:52:13.058953 4746 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/2e9f38fa-c869-458e-8a7c-fb15ec9acccd-util" (OuterVolumeSpecName: "util") pod "2e9f38fa-c869-458e-8a7c-fb15ec9acccd" (UID: "2e9f38fa-c869-458e-8a7c-fb15ec9acccd"). InnerVolumeSpecName "util". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 28 20:52:13 crc kubenswrapper[4746]: I0128 20:52:13.065252 4746 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/2e9f38fa-c869-458e-8a7c-fb15ec9acccd-kube-api-access-pfbpj" (OuterVolumeSpecName: "kube-api-access-pfbpj") pod "2e9f38fa-c869-458e-8a7c-fb15ec9acccd" (UID: "2e9f38fa-c869-458e-8a7c-fb15ec9acccd"). InnerVolumeSpecName "kube-api-access-pfbpj". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 28 20:52:13 crc kubenswrapper[4746]: I0128 20:52:13.147096 4746 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-pfbpj\" (UniqueName: \"kubernetes.io/projected/2e9f38fa-c869-458e-8a7c-fb15ec9acccd-kube-api-access-pfbpj\") on node \"crc\" DevicePath \"\"" Jan 28 20:52:13 crc kubenswrapper[4746]: I0128 20:52:13.147146 4746 reconciler_common.go:293] "Volume detached for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/2e9f38fa-c869-458e-8a7c-fb15ec9acccd-util\") on node \"crc\" DevicePath \"\"" Jan 28 20:52:13 crc kubenswrapper[4746]: I0128 20:52:13.147157 4746 reconciler_common.go:293] "Volume detached for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/2e9f38fa-c869-458e-8a7c-fb15ec9acccd-bundle\") on node \"crc\" DevicePath \"\"" Jan 28 20:52:13 crc kubenswrapper[4746]: I0128 20:52:13.351448 4746 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/3e572a74f8b8ca2bcfe04329d4f26bd9689911be5d166a7403bd6ae773vp7fg" event={"ID":"2e9f38fa-c869-458e-8a7c-fb15ec9acccd","Type":"ContainerDied","Data":"1bbaf75f5d052f49092bd5d412915744d07169edd56650a163bb120f28bb85fa"} Jan 28 20:52:13 crc kubenswrapper[4746]: I0128 20:52:13.351496 4746 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="1bbaf75f5d052f49092bd5d412915744d07169edd56650a163bb120f28bb85fa" Jan 28 20:52:13 crc kubenswrapper[4746]: I0128 20:52:13.351503 4746 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/3e572a74f8b8ca2bcfe04329d4f26bd9689911be5d166a7403bd6ae773vp7fg" Jan 28 20:52:14 crc kubenswrapper[4746]: I0128 20:52:14.360637 4746 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-8lltf" event={"ID":"c83c6519-1cce-431c-9869-b0ab5716d2ed","Type":"ContainerStarted","Data":"ea5ebbe0e3c3739dddf8d75a5216612d2bbea352018b5688164fcf14b667221e"} Jan 28 20:52:14 crc kubenswrapper[4746]: I0128 20:52:14.363433 4746 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="minio-dev/minio" event={"ID":"4454cf22-d4de-41b2-aa88-cba40aedf606","Type":"ContainerStarted","Data":"d0583d229b9fa4c486b040dfe0f72d1fa366c1c44a569fc3098d9ba1a484f778"} Jan 28 20:52:14 crc kubenswrapper[4746]: I0128 20:52:14.386630 4746 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-operators-8lltf" podStartSLOduration=3.4383781239999998 podStartE2EDuration="6.386604385s" podCreationTimestamp="2026-01-28 20:52:08 +0000 UTC" firstStartedPulling="2026-01-28 20:52:10.295207271 +0000 UTC m=+758.251393625" lastFinishedPulling="2026-01-28 20:52:13.243433532 +0000 UTC m=+761.199619886" observedRunningTime="2026-01-28 20:52:14.38197061 +0000 UTC m=+762.338156954" watchObservedRunningTime="2026-01-28 20:52:14.386604385 +0000 UTC m=+762.342790739" Jan 28 20:52:14 crc kubenswrapper[4746]: I0128 20:52:14.407177 4746 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="minio-dev/minio" podStartSLOduration=5.684836276 podStartE2EDuration="9.407141829s" podCreationTimestamp="2026-01-28 20:52:05 +0000 UTC" firstStartedPulling="2026-01-28 20:52:09.521855709 +0000 UTC m=+757.478042063" lastFinishedPulling="2026-01-28 20:52:13.244161262 +0000 UTC m=+761.200347616" observedRunningTime="2026-01-28 20:52:14.400151641 +0000 UTC m=+762.356338005" watchObservedRunningTime="2026-01-28 20:52:14.407141829 +0000 UTC m=+762.363328193" Jan 28 20:52:18 crc 
kubenswrapper[4746]: I0128 20:52:18.940878 4746 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-operators-8lltf" Jan 28 20:52:18 crc kubenswrapper[4746]: I0128 20:52:18.941551 4746 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-operators-8lltf" Jan 28 20:52:19 crc kubenswrapper[4746]: I0128 20:52:19.030032 4746 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-operators-redhat/loki-operator-controller-manager-6866b6794-24l8g"] Jan 28 20:52:19 crc kubenswrapper[4746]: E0128 20:52:19.030310 4746 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="2e9f38fa-c869-458e-8a7c-fb15ec9acccd" containerName="extract" Jan 28 20:52:19 crc kubenswrapper[4746]: I0128 20:52:19.030325 4746 state_mem.go:107] "Deleted CPUSet assignment" podUID="2e9f38fa-c869-458e-8a7c-fb15ec9acccd" containerName="extract" Jan 28 20:52:19 crc kubenswrapper[4746]: E0128 20:52:19.030338 4746 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="2e9f38fa-c869-458e-8a7c-fb15ec9acccd" containerName="util" Jan 28 20:52:19 crc kubenswrapper[4746]: I0128 20:52:19.030344 4746 state_mem.go:107] "Deleted CPUSet assignment" podUID="2e9f38fa-c869-458e-8a7c-fb15ec9acccd" containerName="util" Jan 28 20:52:19 crc kubenswrapper[4746]: E0128 20:52:19.030358 4746 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="2e9f38fa-c869-458e-8a7c-fb15ec9acccd" containerName="pull" Jan 28 20:52:19 crc kubenswrapper[4746]: I0128 20:52:19.030365 4746 state_mem.go:107] "Deleted CPUSet assignment" podUID="2e9f38fa-c869-458e-8a7c-fb15ec9acccd" containerName="pull" Jan 28 20:52:19 crc kubenswrapper[4746]: I0128 20:52:19.030474 4746 memory_manager.go:354] "RemoveStaleState removing state" podUID="2e9f38fa-c869-458e-8a7c-fb15ec9acccd" containerName="extract" Jan 28 20:52:19 crc kubenswrapper[4746]: I0128 20:52:19.033031 4746 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-operators-redhat/loki-operator-controller-manager-6866b6794-24l8g" Jan 28 20:52:19 crc kubenswrapper[4746]: I0128 20:52:19.036530 4746 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-operators-redhat"/"loki-operator-manager-config" Jan 28 20:52:19 crc kubenswrapper[4746]: I0128 20:52:19.036784 4746 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operators-redhat"/"loki-operator-controller-manager-dockercfg-vvqv5" Jan 28 20:52:19 crc kubenswrapper[4746]: I0128 20:52:19.036835 4746 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-operators-redhat"/"openshift-service-ca.crt" Jan 28 20:52:19 crc kubenswrapper[4746]: I0128 20:52:19.036900 4746 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operators-redhat"/"loki-operator-metrics" Jan 28 20:52:19 crc kubenswrapper[4746]: I0128 20:52:19.036908 4746 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operators-redhat"/"loki-operator-controller-manager-service-cert" Jan 28 20:52:19 crc kubenswrapper[4746]: I0128 20:52:19.037244 4746 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-operators-redhat"/"kube-root-ca.crt" Jan 28 20:52:19 crc kubenswrapper[4746]: I0128 20:52:19.052444 4746 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operators-redhat/loki-operator-controller-manager-6866b6794-24l8g"] Jan 28 20:52:19 crc kubenswrapper[4746]: I0128 20:52:19.143013 4746 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"loki-operator-metrics-cert\" (UniqueName: \"kubernetes.io/secret/cfda6c5a-4e09-4579-9149-ba5c87aaf387-loki-operator-metrics-cert\") pod \"loki-operator-controller-manager-6866b6794-24l8g\" (UID: \"cfda6c5a-4e09-4579-9149-ba5c87aaf387\") " pod="openshift-operators-redhat/loki-operator-controller-manager-6866b6794-24l8g" Jan 28 20:52:19 crc kubenswrapper[4746]: 
I0128 20:52:19.143121 4746 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/cfda6c5a-4e09-4579-9149-ba5c87aaf387-webhook-cert\") pod \"loki-operator-controller-manager-6866b6794-24l8g\" (UID: \"cfda6c5a-4e09-4579-9149-ba5c87aaf387\") " pod="openshift-operators-redhat/loki-operator-controller-manager-6866b6794-24l8g" Jan 28 20:52:19 crc kubenswrapper[4746]: I0128 20:52:19.143152 4746 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"manager-config\" (UniqueName: \"kubernetes.io/configmap/cfda6c5a-4e09-4579-9149-ba5c87aaf387-manager-config\") pod \"loki-operator-controller-manager-6866b6794-24l8g\" (UID: \"cfda6c5a-4e09-4579-9149-ba5c87aaf387\") " pod="openshift-operators-redhat/loki-operator-controller-manager-6866b6794-24l8g" Jan 28 20:52:19 crc kubenswrapper[4746]: I0128 20:52:19.143179 4746 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/cfda6c5a-4e09-4579-9149-ba5c87aaf387-apiservice-cert\") pod \"loki-operator-controller-manager-6866b6794-24l8g\" (UID: \"cfda6c5a-4e09-4579-9149-ba5c87aaf387\") " pod="openshift-operators-redhat/loki-operator-controller-manager-6866b6794-24l8g" Jan 28 20:52:19 crc kubenswrapper[4746]: I0128 20:52:19.143232 4746 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-6htr4\" (UniqueName: \"kubernetes.io/projected/cfda6c5a-4e09-4579-9149-ba5c87aaf387-kube-api-access-6htr4\") pod \"loki-operator-controller-manager-6866b6794-24l8g\" (UID: \"cfda6c5a-4e09-4579-9149-ba5c87aaf387\") " pod="openshift-operators-redhat/loki-operator-controller-manager-6866b6794-24l8g" Jan 28 20:52:19 crc kubenswrapper[4746]: I0128 20:52:19.244060 4746 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"webhook-cert\" (UniqueName: 
\"kubernetes.io/secret/cfda6c5a-4e09-4579-9149-ba5c87aaf387-webhook-cert\") pod \"loki-operator-controller-manager-6866b6794-24l8g\" (UID: \"cfda6c5a-4e09-4579-9149-ba5c87aaf387\") " pod="openshift-operators-redhat/loki-operator-controller-manager-6866b6794-24l8g" Jan 28 20:52:19 crc kubenswrapper[4746]: I0128 20:52:19.244127 4746 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"manager-config\" (UniqueName: \"kubernetes.io/configmap/cfda6c5a-4e09-4579-9149-ba5c87aaf387-manager-config\") pod \"loki-operator-controller-manager-6866b6794-24l8g\" (UID: \"cfda6c5a-4e09-4579-9149-ba5c87aaf387\") " pod="openshift-operators-redhat/loki-operator-controller-manager-6866b6794-24l8g" Jan 28 20:52:19 crc kubenswrapper[4746]: I0128 20:52:19.244160 4746 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/cfda6c5a-4e09-4579-9149-ba5c87aaf387-apiservice-cert\") pod \"loki-operator-controller-manager-6866b6794-24l8g\" (UID: \"cfda6c5a-4e09-4579-9149-ba5c87aaf387\") " pod="openshift-operators-redhat/loki-operator-controller-manager-6866b6794-24l8g" Jan 28 20:52:19 crc kubenswrapper[4746]: I0128 20:52:19.244218 4746 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-6htr4\" (UniqueName: \"kubernetes.io/projected/cfda6c5a-4e09-4579-9149-ba5c87aaf387-kube-api-access-6htr4\") pod \"loki-operator-controller-manager-6866b6794-24l8g\" (UID: \"cfda6c5a-4e09-4579-9149-ba5c87aaf387\") " pod="openshift-operators-redhat/loki-operator-controller-manager-6866b6794-24l8g" Jan 28 20:52:19 crc kubenswrapper[4746]: I0128 20:52:19.244244 4746 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"loki-operator-metrics-cert\" (UniqueName: \"kubernetes.io/secret/cfda6c5a-4e09-4579-9149-ba5c87aaf387-loki-operator-metrics-cert\") pod \"loki-operator-controller-manager-6866b6794-24l8g\" (UID: \"cfda6c5a-4e09-4579-9149-ba5c87aaf387\") 
" pod="openshift-operators-redhat/loki-operator-controller-manager-6866b6794-24l8g" Jan 28 20:52:19 crc kubenswrapper[4746]: I0128 20:52:19.245456 4746 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"manager-config\" (UniqueName: \"kubernetes.io/configmap/cfda6c5a-4e09-4579-9149-ba5c87aaf387-manager-config\") pod \"loki-operator-controller-manager-6866b6794-24l8g\" (UID: \"cfda6c5a-4e09-4579-9149-ba5c87aaf387\") " pod="openshift-operators-redhat/loki-operator-controller-manager-6866b6794-24l8g" Jan 28 20:52:19 crc kubenswrapper[4746]: I0128 20:52:19.253372 4746 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/cfda6c5a-4e09-4579-9149-ba5c87aaf387-webhook-cert\") pod \"loki-operator-controller-manager-6866b6794-24l8g\" (UID: \"cfda6c5a-4e09-4579-9149-ba5c87aaf387\") " pod="openshift-operators-redhat/loki-operator-controller-manager-6866b6794-24l8g" Jan 28 20:52:19 crc kubenswrapper[4746]: I0128 20:52:19.257820 4746 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/cfda6c5a-4e09-4579-9149-ba5c87aaf387-apiservice-cert\") pod \"loki-operator-controller-manager-6866b6794-24l8g\" (UID: \"cfda6c5a-4e09-4579-9149-ba5c87aaf387\") " pod="openshift-operators-redhat/loki-operator-controller-manager-6866b6794-24l8g" Jan 28 20:52:19 crc kubenswrapper[4746]: I0128 20:52:19.260267 4746 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"loki-operator-metrics-cert\" (UniqueName: \"kubernetes.io/secret/cfda6c5a-4e09-4579-9149-ba5c87aaf387-loki-operator-metrics-cert\") pod \"loki-operator-controller-manager-6866b6794-24l8g\" (UID: \"cfda6c5a-4e09-4579-9149-ba5c87aaf387\") " pod="openshift-operators-redhat/loki-operator-controller-manager-6866b6794-24l8g" Jan 28 20:52:19 crc kubenswrapper[4746]: I0128 20:52:19.277019 4746 operation_generator.go:637] "MountVolume.SetUp succeeded for volume 
\"kube-api-access-6htr4\" (UniqueName: \"kubernetes.io/projected/cfda6c5a-4e09-4579-9149-ba5c87aaf387-kube-api-access-6htr4\") pod \"loki-operator-controller-manager-6866b6794-24l8g\" (UID: \"cfda6c5a-4e09-4579-9149-ba5c87aaf387\") " pod="openshift-operators-redhat/loki-operator-controller-manager-6866b6794-24l8g" Jan 28 20:52:19 crc kubenswrapper[4746]: I0128 20:52:19.351152 4746 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operators-redhat/loki-operator-controller-manager-6866b6794-24l8g" Jan 28 20:52:19 crc kubenswrapper[4746]: I0128 20:52:19.578102 4746 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operators-redhat/loki-operator-controller-manager-6866b6794-24l8g"] Jan 28 20:52:19 crc kubenswrapper[4746]: I0128 20:52:19.987029 4746 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-marketplace/redhat-operators-8lltf" podUID="c83c6519-1cce-431c-9869-b0ab5716d2ed" containerName="registry-server" probeResult="failure" output=< Jan 28 20:52:19 crc kubenswrapper[4746]: timeout: failed to connect service ":50051" within 1s Jan 28 20:52:19 crc kubenswrapper[4746]: > Jan 28 20:52:20 crc kubenswrapper[4746]: I0128 20:52:20.406361 4746 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operators-redhat/loki-operator-controller-manager-6866b6794-24l8g" event={"ID":"cfda6c5a-4e09-4579-9149-ba5c87aaf387","Type":"ContainerStarted","Data":"027bd44eb735c20e4631b911fba72f7149449389f53658308aa8e341a43fb3f9"} Jan 28 20:52:26 crc kubenswrapper[4746]: I0128 20:52:26.459858 4746 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operators-redhat/loki-operator-controller-manager-6866b6794-24l8g" event={"ID":"cfda6c5a-4e09-4579-9149-ba5c87aaf387","Type":"ContainerStarted","Data":"0b78b59e8de72a715c4652aad77c226be07d8931a75dc97ace15fc0e43093cba"} Jan 28 20:52:28 crc kubenswrapper[4746]: I0128 20:52:28.988556 4746 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" 
pod="openshift-marketplace/redhat-operators-8lltf" Jan 28 20:52:29 crc kubenswrapper[4746]: I0128 20:52:29.062772 4746 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-operators-8lltf" Jan 28 20:52:29 crc kubenswrapper[4746]: I0128 20:52:29.590381 4746 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-8lltf"] Jan 28 20:52:30 crc kubenswrapper[4746]: I0128 20:52:30.495273 4746 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-operators-8lltf" podUID="c83c6519-1cce-431c-9869-b0ab5716d2ed" containerName="registry-server" containerID="cri-o://ea5ebbe0e3c3739dddf8d75a5216612d2bbea352018b5688164fcf14b667221e" gracePeriod=2 Jan 28 20:52:31 crc kubenswrapper[4746]: I0128 20:52:31.517873 4746 generic.go:334] "Generic (PLEG): container finished" podID="c83c6519-1cce-431c-9869-b0ab5716d2ed" containerID="ea5ebbe0e3c3739dddf8d75a5216612d2bbea352018b5688164fcf14b667221e" exitCode=0 Jan 28 20:52:31 crc kubenswrapper[4746]: I0128 20:52:31.517957 4746 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-8lltf" event={"ID":"c83c6519-1cce-431c-9869-b0ab5716d2ed","Type":"ContainerDied","Data":"ea5ebbe0e3c3739dddf8d75a5216612d2bbea352018b5688164fcf14b667221e"} Jan 28 20:52:32 crc kubenswrapper[4746]: I0128 20:52:32.031848 4746 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-8lltf" Jan 28 20:52:32 crc kubenswrapper[4746]: I0128 20:52:32.061317 4746 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/c83c6519-1cce-431c-9869-b0ab5716d2ed-catalog-content\") pod \"c83c6519-1cce-431c-9869-b0ab5716d2ed\" (UID: \"c83c6519-1cce-431c-9869-b0ab5716d2ed\") " Jan 28 20:52:32 crc kubenswrapper[4746]: I0128 20:52:32.061543 4746 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/c83c6519-1cce-431c-9869-b0ab5716d2ed-utilities\") pod \"c83c6519-1cce-431c-9869-b0ab5716d2ed\" (UID: \"c83c6519-1cce-431c-9869-b0ab5716d2ed\") " Jan 28 20:52:32 crc kubenswrapper[4746]: I0128 20:52:32.061657 4746 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-6dnkc\" (UniqueName: \"kubernetes.io/projected/c83c6519-1cce-431c-9869-b0ab5716d2ed-kube-api-access-6dnkc\") pod \"c83c6519-1cce-431c-9869-b0ab5716d2ed\" (UID: \"c83c6519-1cce-431c-9869-b0ab5716d2ed\") " Jan 28 20:52:32 crc kubenswrapper[4746]: I0128 20:52:32.063001 4746 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/c83c6519-1cce-431c-9869-b0ab5716d2ed-utilities" (OuterVolumeSpecName: "utilities") pod "c83c6519-1cce-431c-9869-b0ab5716d2ed" (UID: "c83c6519-1cce-431c-9869-b0ab5716d2ed"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 28 20:52:32 crc kubenswrapper[4746]: I0128 20:52:32.075709 4746 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/c83c6519-1cce-431c-9869-b0ab5716d2ed-kube-api-access-6dnkc" (OuterVolumeSpecName: "kube-api-access-6dnkc") pod "c83c6519-1cce-431c-9869-b0ab5716d2ed" (UID: "c83c6519-1cce-431c-9869-b0ab5716d2ed"). InnerVolumeSpecName "kube-api-access-6dnkc". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 28 20:52:32 crc kubenswrapper[4746]: I0128 20:52:32.164139 4746 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/c83c6519-1cce-431c-9869-b0ab5716d2ed-utilities\") on node \"crc\" DevicePath \"\"" Jan 28 20:52:32 crc kubenswrapper[4746]: I0128 20:52:32.164174 4746 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-6dnkc\" (UniqueName: \"kubernetes.io/projected/c83c6519-1cce-431c-9869-b0ab5716d2ed-kube-api-access-6dnkc\") on node \"crc\" DevicePath \"\"" Jan 28 20:52:32 crc kubenswrapper[4746]: I0128 20:52:32.199049 4746 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/c83c6519-1cce-431c-9869-b0ab5716d2ed-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "c83c6519-1cce-431c-9869-b0ab5716d2ed" (UID: "c83c6519-1cce-431c-9869-b0ab5716d2ed"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 28 20:52:32 crc kubenswrapper[4746]: I0128 20:52:32.265098 4746 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/c83c6519-1cce-431c-9869-b0ab5716d2ed-catalog-content\") on node \"crc\" DevicePath \"\"" Jan 28 20:52:32 crc kubenswrapper[4746]: I0128 20:52:32.531227 4746 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-8lltf" event={"ID":"c83c6519-1cce-431c-9869-b0ab5716d2ed","Type":"ContainerDied","Data":"0697696982d370b572dd01af49da370cf72f393c68d88e0b1ffacd351c52d0c1"} Jan 28 20:52:32 crc kubenswrapper[4746]: I0128 20:52:32.531294 4746 scope.go:117] "RemoveContainer" containerID="ea5ebbe0e3c3739dddf8d75a5216612d2bbea352018b5688164fcf14b667221e" Jan 28 20:52:32 crc kubenswrapper[4746]: I0128 20:52:32.531359 4746 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-8lltf" Jan 28 20:52:32 crc kubenswrapper[4746]: I0128 20:52:32.534502 4746 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operators-redhat/loki-operator-controller-manager-6866b6794-24l8g" event={"ID":"cfda6c5a-4e09-4579-9149-ba5c87aaf387","Type":"ContainerStarted","Data":"ece4e0c27b1c0d424a4525a4e2e536c0fc4094629f8e06ed0b52633f24c1acfb"} Jan 28 20:52:32 crc kubenswrapper[4746]: I0128 20:52:32.535682 4746 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-operators-redhat/loki-operator-controller-manager-6866b6794-24l8g" Jan 28 20:52:32 crc kubenswrapper[4746]: I0128 20:52:32.538941 4746 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-operators-redhat/loki-operator-controller-manager-6866b6794-24l8g" Jan 28 20:52:32 crc kubenswrapper[4746]: I0128 20:52:32.578547 4746 scope.go:117] "RemoveContainer" containerID="9886693d95a884abf3121cff63fe3fd986cfcea83119fd1fe16f978bdf5793f6" Jan 28 20:52:32 crc kubenswrapper[4746]: I0128 20:52:32.611198 4746 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-operators-redhat/loki-operator-controller-manager-6866b6794-24l8g" podStartSLOduration=1.344801426 podStartE2EDuration="13.611175246s" podCreationTimestamp="2026-01-28 20:52:19 +0000 UTC" firstStartedPulling="2026-01-28 20:52:19.594354328 +0000 UTC m=+767.550540683" lastFinishedPulling="2026-01-28 20:52:31.860728149 +0000 UTC m=+779.816914503" observedRunningTime="2026-01-28 20:52:32.59057691 +0000 UTC m=+780.546763284" watchObservedRunningTime="2026-01-28 20:52:32.611175246 +0000 UTC m=+780.567361620" Jan 28 20:52:32 crc kubenswrapper[4746]: I0128 20:52:32.613442 4746 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-8lltf"] Jan 28 20:52:32 crc kubenswrapper[4746]: I0128 20:52:32.617691 4746 scope.go:117] "RemoveContainer" 
containerID="f9812bcdd880382f07e19c70cfdd912d1ce26029d51357f205d3906916e1fb62" Jan 28 20:52:32 crc kubenswrapper[4746]: I0128 20:52:32.622690 4746 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-operators-8lltf"] Jan 28 20:52:32 crc kubenswrapper[4746]: I0128 20:52:32.846853 4746 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="c83c6519-1cce-431c-9869-b0ab5716d2ed" path="/var/lib/kubelet/pods/c83c6519-1cce-431c-9869-b0ab5716d2ed/volumes" Jan 28 20:52:45 crc kubenswrapper[4746]: I0128 20:52:45.871711 4746 patch_prober.go:28] interesting pod/machine-config-daemon-6wrnw container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Jan 28 20:52:45 crc kubenswrapper[4746]: I0128 20:52:45.872864 4746 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-6wrnw" podUID="6dc8b546-9734-4082-b2b3-2bafe3f1564d" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Jan 28 20:52:53 crc kubenswrapper[4746]: I0128 20:52:53.666835 4746 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/53efe8611d43ac2275911d954e05efbbba7920a530aff9253ed1cec713xz8cr"] Jan 28 20:52:53 crc kubenswrapper[4746]: E0128 20:52:53.667959 4746 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="c83c6519-1cce-431c-9869-b0ab5716d2ed" containerName="extract-content" Jan 28 20:52:53 crc kubenswrapper[4746]: I0128 20:52:53.667977 4746 state_mem.go:107] "Deleted CPUSet assignment" podUID="c83c6519-1cce-431c-9869-b0ab5716d2ed" containerName="extract-content" Jan 28 20:52:53 crc kubenswrapper[4746]: E0128 20:52:53.667988 4746 cpu_manager.go:410] "RemoveStaleState: removing container" 
podUID="c83c6519-1cce-431c-9869-b0ab5716d2ed" containerName="extract-utilities" Jan 28 20:52:53 crc kubenswrapper[4746]: I0128 20:52:53.667995 4746 state_mem.go:107] "Deleted CPUSet assignment" podUID="c83c6519-1cce-431c-9869-b0ab5716d2ed" containerName="extract-utilities" Jan 28 20:52:53 crc kubenswrapper[4746]: E0128 20:52:53.668002 4746 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="c83c6519-1cce-431c-9869-b0ab5716d2ed" containerName="registry-server" Jan 28 20:52:53 crc kubenswrapper[4746]: I0128 20:52:53.668009 4746 state_mem.go:107] "Deleted CPUSet assignment" podUID="c83c6519-1cce-431c-9869-b0ab5716d2ed" containerName="registry-server" Jan 28 20:52:53 crc kubenswrapper[4746]: I0128 20:52:53.668157 4746 memory_manager.go:354] "RemoveStaleState removing state" podUID="c83c6519-1cce-431c-9869-b0ab5716d2ed" containerName="registry-server" Jan 28 20:52:53 crc kubenswrapper[4746]: I0128 20:52:53.669123 4746 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/53efe8611d43ac2275911d954e05efbbba7920a530aff9253ed1cec713xz8cr" Jan 28 20:52:53 crc kubenswrapper[4746]: I0128 20:52:53.671397 4746 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"default-dockercfg-vmwhc" Jan 28 20:52:53 crc kubenswrapper[4746]: I0128 20:52:53.680882 4746 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/53efe8611d43ac2275911d954e05efbbba7920a530aff9253ed1cec713xz8cr"] Jan 28 20:52:53 crc kubenswrapper[4746]: I0128 20:52:53.700301 4746 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/743dd52c-2031-4ffc-a4f2-57dfa9438e4e-util\") pod \"53efe8611d43ac2275911d954e05efbbba7920a530aff9253ed1cec713xz8cr\" (UID: \"743dd52c-2031-4ffc-a4f2-57dfa9438e4e\") " pod="openshift-marketplace/53efe8611d43ac2275911d954e05efbbba7920a530aff9253ed1cec713xz8cr" Jan 28 20:52:53 crc 
kubenswrapper[4746]: I0128 20:52:53.700382 4746 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/743dd52c-2031-4ffc-a4f2-57dfa9438e4e-bundle\") pod \"53efe8611d43ac2275911d954e05efbbba7920a530aff9253ed1cec713xz8cr\" (UID: \"743dd52c-2031-4ffc-a4f2-57dfa9438e4e\") " pod="openshift-marketplace/53efe8611d43ac2275911d954e05efbbba7920a530aff9253ed1cec713xz8cr" Jan 28 20:52:53 crc kubenswrapper[4746]: I0128 20:52:53.700421 4746 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-ctspm\" (UniqueName: \"kubernetes.io/projected/743dd52c-2031-4ffc-a4f2-57dfa9438e4e-kube-api-access-ctspm\") pod \"53efe8611d43ac2275911d954e05efbbba7920a530aff9253ed1cec713xz8cr\" (UID: \"743dd52c-2031-4ffc-a4f2-57dfa9438e4e\") " pod="openshift-marketplace/53efe8611d43ac2275911d954e05efbbba7920a530aff9253ed1cec713xz8cr" Jan 28 20:52:53 crc kubenswrapper[4746]: I0128 20:52:53.801688 4746 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/743dd52c-2031-4ffc-a4f2-57dfa9438e4e-bundle\") pod \"53efe8611d43ac2275911d954e05efbbba7920a530aff9253ed1cec713xz8cr\" (UID: \"743dd52c-2031-4ffc-a4f2-57dfa9438e4e\") " pod="openshift-marketplace/53efe8611d43ac2275911d954e05efbbba7920a530aff9253ed1cec713xz8cr" Jan 28 20:52:53 crc kubenswrapper[4746]: I0128 20:52:53.801784 4746 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-ctspm\" (UniqueName: \"kubernetes.io/projected/743dd52c-2031-4ffc-a4f2-57dfa9438e4e-kube-api-access-ctspm\") pod \"53efe8611d43ac2275911d954e05efbbba7920a530aff9253ed1cec713xz8cr\" (UID: \"743dd52c-2031-4ffc-a4f2-57dfa9438e4e\") " pod="openshift-marketplace/53efe8611d43ac2275911d954e05efbbba7920a530aff9253ed1cec713xz8cr" Jan 28 20:52:53 crc kubenswrapper[4746]: I0128 20:52:53.801856 4746 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/743dd52c-2031-4ffc-a4f2-57dfa9438e4e-util\") pod \"53efe8611d43ac2275911d954e05efbbba7920a530aff9253ed1cec713xz8cr\" (UID: \"743dd52c-2031-4ffc-a4f2-57dfa9438e4e\") " pod="openshift-marketplace/53efe8611d43ac2275911d954e05efbbba7920a530aff9253ed1cec713xz8cr" Jan 28 20:52:53 crc kubenswrapper[4746]: I0128 20:52:53.802334 4746 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/743dd52c-2031-4ffc-a4f2-57dfa9438e4e-bundle\") pod \"53efe8611d43ac2275911d954e05efbbba7920a530aff9253ed1cec713xz8cr\" (UID: \"743dd52c-2031-4ffc-a4f2-57dfa9438e4e\") " pod="openshift-marketplace/53efe8611d43ac2275911d954e05efbbba7920a530aff9253ed1cec713xz8cr" Jan 28 20:52:53 crc kubenswrapper[4746]: I0128 20:52:53.802455 4746 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/743dd52c-2031-4ffc-a4f2-57dfa9438e4e-util\") pod \"53efe8611d43ac2275911d954e05efbbba7920a530aff9253ed1cec713xz8cr\" (UID: \"743dd52c-2031-4ffc-a4f2-57dfa9438e4e\") " pod="openshift-marketplace/53efe8611d43ac2275911d954e05efbbba7920a530aff9253ed1cec713xz8cr" Jan 28 20:52:53 crc kubenswrapper[4746]: I0128 20:52:53.826382 4746 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-ctspm\" (UniqueName: \"kubernetes.io/projected/743dd52c-2031-4ffc-a4f2-57dfa9438e4e-kube-api-access-ctspm\") pod \"53efe8611d43ac2275911d954e05efbbba7920a530aff9253ed1cec713xz8cr\" (UID: \"743dd52c-2031-4ffc-a4f2-57dfa9438e4e\") " pod="openshift-marketplace/53efe8611d43ac2275911d954e05efbbba7920a530aff9253ed1cec713xz8cr" Jan 28 20:52:53 crc kubenswrapper[4746]: I0128 20:52:53.987882 4746 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/53efe8611d43ac2275911d954e05efbbba7920a530aff9253ed1cec713xz8cr" Jan 28 20:52:54 crc kubenswrapper[4746]: I0128 20:52:54.436865 4746 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/53efe8611d43ac2275911d954e05efbbba7920a530aff9253ed1cec713xz8cr"] Jan 28 20:52:54 crc kubenswrapper[4746]: I0128 20:52:54.707863 4746 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/53efe8611d43ac2275911d954e05efbbba7920a530aff9253ed1cec713xz8cr" event={"ID":"743dd52c-2031-4ffc-a4f2-57dfa9438e4e","Type":"ContainerStarted","Data":"ca1246effebd926234051cfbf99b463cc9eb80bc54a1bca3d69708910b14de63"} Jan 28 20:52:54 crc kubenswrapper[4746]: I0128 20:52:54.708499 4746 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/53efe8611d43ac2275911d954e05efbbba7920a530aff9253ed1cec713xz8cr" event={"ID":"743dd52c-2031-4ffc-a4f2-57dfa9438e4e","Type":"ContainerStarted","Data":"882fcca327d7ec4395d18c2757bedf8c605479a0128d849156becaa2670f8763"} Jan 28 20:52:55 crc kubenswrapper[4746]: I0128 20:52:55.715676 4746 generic.go:334] "Generic (PLEG): container finished" podID="743dd52c-2031-4ffc-a4f2-57dfa9438e4e" containerID="ca1246effebd926234051cfbf99b463cc9eb80bc54a1bca3d69708910b14de63" exitCode=0 Jan 28 20:52:55 crc kubenswrapper[4746]: I0128 20:52:55.715727 4746 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/53efe8611d43ac2275911d954e05efbbba7920a530aff9253ed1cec713xz8cr" event={"ID":"743dd52c-2031-4ffc-a4f2-57dfa9438e4e","Type":"ContainerDied","Data":"ca1246effebd926234051cfbf99b463cc9eb80bc54a1bca3d69708910b14de63"} Jan 28 20:52:57 crc kubenswrapper[4746]: I0128 20:52:57.730128 4746 generic.go:334] "Generic (PLEG): container finished" podID="743dd52c-2031-4ffc-a4f2-57dfa9438e4e" containerID="a5edcc5cbe381f7f0314742a3f8bf4a083057b451a6a9bc42a42b516c9822635" exitCode=0 Jan 28 20:52:57 crc kubenswrapper[4746]: I0128 20:52:57.730231 4746 
kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/53efe8611d43ac2275911d954e05efbbba7920a530aff9253ed1cec713xz8cr" event={"ID":"743dd52c-2031-4ffc-a4f2-57dfa9438e4e","Type":"ContainerDied","Data":"a5edcc5cbe381f7f0314742a3f8bf4a083057b451a6a9bc42a42b516c9822635"} Jan 28 20:52:58 crc kubenswrapper[4746]: I0128 20:52:58.744145 4746 generic.go:334] "Generic (PLEG): container finished" podID="743dd52c-2031-4ffc-a4f2-57dfa9438e4e" containerID="a18e5319bf586a16473fb6f84a2b1663a5926da5c7c126eb7510db40ba20ae82" exitCode=0 Jan 28 20:52:58 crc kubenswrapper[4746]: I0128 20:52:58.744445 4746 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/53efe8611d43ac2275911d954e05efbbba7920a530aff9253ed1cec713xz8cr" event={"ID":"743dd52c-2031-4ffc-a4f2-57dfa9438e4e","Type":"ContainerDied","Data":"a18e5319bf586a16473fb6f84a2b1663a5926da5c7c126eb7510db40ba20ae82"} Jan 28 20:53:00 crc kubenswrapper[4746]: I0128 20:53:00.111187 4746 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/53efe8611d43ac2275911d954e05efbbba7920a530aff9253ed1cec713xz8cr" Jan 28 20:53:00 crc kubenswrapper[4746]: I0128 20:53:00.203993 4746 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-ctspm\" (UniqueName: \"kubernetes.io/projected/743dd52c-2031-4ffc-a4f2-57dfa9438e4e-kube-api-access-ctspm\") pod \"743dd52c-2031-4ffc-a4f2-57dfa9438e4e\" (UID: \"743dd52c-2031-4ffc-a4f2-57dfa9438e4e\") " Jan 28 20:53:00 crc kubenswrapper[4746]: I0128 20:53:00.204106 4746 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/743dd52c-2031-4ffc-a4f2-57dfa9438e4e-util\") pod \"743dd52c-2031-4ffc-a4f2-57dfa9438e4e\" (UID: \"743dd52c-2031-4ffc-a4f2-57dfa9438e4e\") " Jan 28 20:53:00 crc kubenswrapper[4746]: I0128 20:53:00.204178 4746 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/743dd52c-2031-4ffc-a4f2-57dfa9438e4e-bundle\") pod \"743dd52c-2031-4ffc-a4f2-57dfa9438e4e\" (UID: \"743dd52c-2031-4ffc-a4f2-57dfa9438e4e\") " Jan 28 20:53:00 crc kubenswrapper[4746]: I0128 20:53:00.204826 4746 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/743dd52c-2031-4ffc-a4f2-57dfa9438e4e-bundle" (OuterVolumeSpecName: "bundle") pod "743dd52c-2031-4ffc-a4f2-57dfa9438e4e" (UID: "743dd52c-2031-4ffc-a4f2-57dfa9438e4e"). InnerVolumeSpecName "bundle". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 28 20:53:00 crc kubenswrapper[4746]: I0128 20:53:00.205028 4746 reconciler_common.go:293] "Volume detached for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/743dd52c-2031-4ffc-a4f2-57dfa9438e4e-bundle\") on node \"crc\" DevicePath \"\"" Jan 28 20:53:00 crc kubenswrapper[4746]: I0128 20:53:00.210179 4746 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/743dd52c-2031-4ffc-a4f2-57dfa9438e4e-kube-api-access-ctspm" (OuterVolumeSpecName: "kube-api-access-ctspm") pod "743dd52c-2031-4ffc-a4f2-57dfa9438e4e" (UID: "743dd52c-2031-4ffc-a4f2-57dfa9438e4e"). InnerVolumeSpecName "kube-api-access-ctspm". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 28 20:53:00 crc kubenswrapper[4746]: I0128 20:53:00.219761 4746 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/743dd52c-2031-4ffc-a4f2-57dfa9438e4e-util" (OuterVolumeSpecName: "util") pod "743dd52c-2031-4ffc-a4f2-57dfa9438e4e" (UID: "743dd52c-2031-4ffc-a4f2-57dfa9438e4e"). InnerVolumeSpecName "util". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 28 20:53:00 crc kubenswrapper[4746]: I0128 20:53:00.306931 4746 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-ctspm\" (UniqueName: \"kubernetes.io/projected/743dd52c-2031-4ffc-a4f2-57dfa9438e4e-kube-api-access-ctspm\") on node \"crc\" DevicePath \"\"" Jan 28 20:53:00 crc kubenswrapper[4746]: I0128 20:53:00.306977 4746 reconciler_common.go:293] "Volume detached for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/743dd52c-2031-4ffc-a4f2-57dfa9438e4e-util\") on node \"crc\" DevicePath \"\"" Jan 28 20:53:00 crc kubenswrapper[4746]: I0128 20:53:00.767541 4746 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/53efe8611d43ac2275911d954e05efbbba7920a530aff9253ed1cec713xz8cr" event={"ID":"743dd52c-2031-4ffc-a4f2-57dfa9438e4e","Type":"ContainerDied","Data":"882fcca327d7ec4395d18c2757bedf8c605479a0128d849156becaa2670f8763"} Jan 28 20:53:00 crc kubenswrapper[4746]: I0128 20:53:00.767600 4746 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="882fcca327d7ec4395d18c2757bedf8c605479a0128d849156becaa2670f8763" Jan 28 20:53:00 crc kubenswrapper[4746]: I0128 20:53:00.767607 4746 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/53efe8611d43ac2275911d954e05efbbba7920a530aff9253ed1cec713xz8cr" Jan 28 20:53:02 crc kubenswrapper[4746]: I0128 20:53:02.755755 4746 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-nmstate/nmstate-operator-646758c888-x2kbm"] Jan 28 20:53:02 crc kubenswrapper[4746]: E0128 20:53:02.756585 4746 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="743dd52c-2031-4ffc-a4f2-57dfa9438e4e" containerName="pull" Jan 28 20:53:02 crc kubenswrapper[4746]: I0128 20:53:02.756604 4746 state_mem.go:107] "Deleted CPUSet assignment" podUID="743dd52c-2031-4ffc-a4f2-57dfa9438e4e" containerName="pull" Jan 28 20:53:02 crc kubenswrapper[4746]: E0128 20:53:02.756616 4746 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="743dd52c-2031-4ffc-a4f2-57dfa9438e4e" containerName="extract" Jan 28 20:53:02 crc kubenswrapper[4746]: I0128 20:53:02.756625 4746 state_mem.go:107] "Deleted CPUSet assignment" podUID="743dd52c-2031-4ffc-a4f2-57dfa9438e4e" containerName="extract" Jan 28 20:53:02 crc kubenswrapper[4746]: E0128 20:53:02.756640 4746 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="743dd52c-2031-4ffc-a4f2-57dfa9438e4e" containerName="util" Jan 28 20:53:02 crc kubenswrapper[4746]: I0128 20:53:02.756649 4746 state_mem.go:107] "Deleted CPUSet assignment" podUID="743dd52c-2031-4ffc-a4f2-57dfa9438e4e" containerName="util" Jan 28 20:53:02 crc kubenswrapper[4746]: I0128 20:53:02.756817 4746 memory_manager.go:354] "RemoveStaleState removing state" podUID="743dd52c-2031-4ffc-a4f2-57dfa9438e4e" containerName="extract" Jan 28 20:53:02 crc kubenswrapper[4746]: I0128 20:53:02.757420 4746 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-nmstate/nmstate-operator-646758c888-x2kbm" Jan 28 20:53:02 crc kubenswrapper[4746]: I0128 20:53:02.761118 4746 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-nmstate"/"kube-root-ca.crt" Jan 28 20:53:02 crc kubenswrapper[4746]: I0128 20:53:02.761939 4746 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-nmstate"/"openshift-service-ca.crt" Jan 28 20:53:02 crc kubenswrapper[4746]: I0128 20:53:02.763418 4746 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-nmstate"/"nmstate-operator-dockercfg-f4skg" Jan 28 20:53:02 crc kubenswrapper[4746]: I0128 20:53:02.778920 4746 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-nmstate/nmstate-operator-646758c888-x2kbm"] Jan 28 20:53:02 crc kubenswrapper[4746]: I0128 20:53:02.848180 4746 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-zz9g4\" (UniqueName: \"kubernetes.io/projected/e8f9251b-82e6-4fde-8a14-4430af400661-kube-api-access-zz9g4\") pod \"nmstate-operator-646758c888-x2kbm\" (UID: \"e8f9251b-82e6-4fde-8a14-4430af400661\") " pod="openshift-nmstate/nmstate-operator-646758c888-x2kbm" Jan 28 20:53:02 crc kubenswrapper[4746]: I0128 20:53:02.950150 4746 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-zz9g4\" (UniqueName: \"kubernetes.io/projected/e8f9251b-82e6-4fde-8a14-4430af400661-kube-api-access-zz9g4\") pod \"nmstate-operator-646758c888-x2kbm\" (UID: \"e8f9251b-82e6-4fde-8a14-4430af400661\") " pod="openshift-nmstate/nmstate-operator-646758c888-x2kbm" Jan 28 20:53:02 crc kubenswrapper[4746]: I0128 20:53:02.973314 4746 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-zz9g4\" (UniqueName: \"kubernetes.io/projected/e8f9251b-82e6-4fde-8a14-4430af400661-kube-api-access-zz9g4\") pod \"nmstate-operator-646758c888-x2kbm\" (UID: 
\"e8f9251b-82e6-4fde-8a14-4430af400661\") " pod="openshift-nmstate/nmstate-operator-646758c888-x2kbm" Jan 28 20:53:03 crc kubenswrapper[4746]: I0128 20:53:03.074303 4746 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-nmstate/nmstate-operator-646758c888-x2kbm" Jan 28 20:53:03 crc kubenswrapper[4746]: I0128 20:53:03.350064 4746 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-nmstate/nmstate-operator-646758c888-x2kbm"] Jan 28 20:53:03 crc kubenswrapper[4746]: I0128 20:53:03.791875 4746 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-nmstate/nmstate-operator-646758c888-x2kbm" event={"ID":"e8f9251b-82e6-4fde-8a14-4430af400661","Type":"ContainerStarted","Data":"7a4fd9c7dc13eea54520ef0d7e1f0d9b536d224370b253cb7815bd8bca4ec504"} Jan 28 20:53:05 crc kubenswrapper[4746]: I0128 20:53:05.812300 4746 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-nmstate/nmstate-operator-646758c888-x2kbm" event={"ID":"e8f9251b-82e6-4fde-8a14-4430af400661","Type":"ContainerStarted","Data":"3e3712fd17ce3583dd3a68ccebe268bf85e928c220af19dcdd27067be5509bb2"} Jan 28 20:53:05 crc kubenswrapper[4746]: I0128 20:53:05.839937 4746 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-nmstate/nmstate-operator-646758c888-x2kbm" podStartSLOduration=1.574023487 podStartE2EDuration="3.839906982s" podCreationTimestamp="2026-01-28 20:53:02 +0000 UTC" firstStartedPulling="2026-01-28 20:53:03.36060716 +0000 UTC m=+811.316793514" lastFinishedPulling="2026-01-28 20:53:05.626490655 +0000 UTC m=+813.582677009" observedRunningTime="2026-01-28 20:53:05.83612075 +0000 UTC m=+813.792307114" watchObservedRunningTime="2026-01-28 20:53:05.839906982 +0000 UTC m=+813.796093336" Jan 28 20:53:06 crc kubenswrapper[4746]: I0128 20:53:06.782737 4746 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-nmstate/nmstate-metrics-54757c584b-v6k5d"] Jan 28 20:53:06 crc kubenswrapper[4746]: I0128 
20:53:06.784106 4746 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-nmstate/nmstate-metrics-54757c584b-v6k5d" Jan 28 20:53:06 crc kubenswrapper[4746]: I0128 20:53:06.787740 4746 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-nmstate"/"nmstate-handler-dockercfg-w8bmm" Jan 28 20:53:06 crc kubenswrapper[4746]: I0128 20:53:06.793713 4746 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-nmstate/nmstate-webhook-8474b5b9d8-vffsx"] Jan 28 20:53:06 crc kubenswrapper[4746]: I0128 20:53:06.794724 4746 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-nmstate/nmstate-webhook-8474b5b9d8-vffsx" Jan 28 20:53:06 crc kubenswrapper[4746]: I0128 20:53:06.798560 4746 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-nmstate"/"openshift-nmstate-webhook" Jan 28 20:53:06 crc kubenswrapper[4746]: I0128 20:53:06.814539 4746 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tls-key-pair\" (UniqueName: \"kubernetes.io/secret/4108ee2d-3096-4956-95b4-7c2b8327175c-tls-key-pair\") pod \"nmstate-webhook-8474b5b9d8-vffsx\" (UID: \"4108ee2d-3096-4956-95b4-7c2b8327175c\") " pod="openshift-nmstate/nmstate-webhook-8474b5b9d8-vffsx" Jan 28 20:53:06 crc kubenswrapper[4746]: I0128 20:53:06.814602 4746 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-s9q4l\" (UniqueName: \"kubernetes.io/projected/48781ec4-e4a7-402c-a111-22310cfe0305-kube-api-access-s9q4l\") pod \"nmstate-metrics-54757c584b-v6k5d\" (UID: \"48781ec4-e4a7-402c-a111-22310cfe0305\") " pod="openshift-nmstate/nmstate-metrics-54757c584b-v6k5d" Jan 28 20:53:06 crc kubenswrapper[4746]: I0128 20:53:06.814634 4746 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-dkpsf\" (UniqueName: 
\"kubernetes.io/projected/4108ee2d-3096-4956-95b4-7c2b8327175c-kube-api-access-dkpsf\") pod \"nmstate-webhook-8474b5b9d8-vffsx\" (UID: \"4108ee2d-3096-4956-95b4-7c2b8327175c\") " pod="openshift-nmstate/nmstate-webhook-8474b5b9d8-vffsx" Jan 28 20:53:06 crc kubenswrapper[4746]: I0128 20:53:06.817463 4746 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-nmstate/nmstate-handler-sbwgb"] Jan 28 20:53:06 crc kubenswrapper[4746]: I0128 20:53:06.818625 4746 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-nmstate/nmstate-handler-sbwgb" Jan 28 20:53:06 crc kubenswrapper[4746]: I0128 20:53:06.848621 4746 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-nmstate/nmstate-webhook-8474b5b9d8-vffsx"] Jan 28 20:53:06 crc kubenswrapper[4746]: I0128 20:53:06.917422 4746 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"tls-key-pair\" (UniqueName: \"kubernetes.io/secret/4108ee2d-3096-4956-95b4-7c2b8327175c-tls-key-pair\") pod \"nmstate-webhook-8474b5b9d8-vffsx\" (UID: \"4108ee2d-3096-4956-95b4-7c2b8327175c\") " pod="openshift-nmstate/nmstate-webhook-8474b5b9d8-vffsx" Jan 28 20:53:06 crc kubenswrapper[4746]: I0128 20:53:06.917522 4746 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dbus-socket\" (UniqueName: \"kubernetes.io/host-path/30f828e0-bffb-4b84-be14-53eac55a3ca3-dbus-socket\") pod \"nmstate-handler-sbwgb\" (UID: \"30f828e0-bffb-4b84-be14-53eac55a3ca3\") " pod="openshift-nmstate/nmstate-handler-sbwgb" Jan 28 20:53:06 crc kubenswrapper[4746]: I0128 20:53:06.917555 4746 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-s9q4l\" (UniqueName: \"kubernetes.io/projected/48781ec4-e4a7-402c-a111-22310cfe0305-kube-api-access-s9q4l\") pod \"nmstate-metrics-54757c584b-v6k5d\" (UID: \"48781ec4-e4a7-402c-a111-22310cfe0305\") " pod="openshift-nmstate/nmstate-metrics-54757c584b-v6k5d" Jan 28 20:53:06 crc 
kubenswrapper[4746]: I0128 20:53:06.917589 4746 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"nmstate-lock\" (UniqueName: \"kubernetes.io/host-path/30f828e0-bffb-4b84-be14-53eac55a3ca3-nmstate-lock\") pod \"nmstate-handler-sbwgb\" (UID: \"30f828e0-bffb-4b84-be14-53eac55a3ca3\") " pod="openshift-nmstate/nmstate-handler-sbwgb" Jan 28 20:53:06 crc kubenswrapper[4746]: I0128 20:53:06.917621 4746 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-5p62f\" (UniqueName: \"kubernetes.io/projected/30f828e0-bffb-4b84-be14-53eac55a3ca3-kube-api-access-5p62f\") pod \"nmstate-handler-sbwgb\" (UID: \"30f828e0-bffb-4b84-be14-53eac55a3ca3\") " pod="openshift-nmstate/nmstate-handler-sbwgb" Jan 28 20:53:06 crc kubenswrapper[4746]: I0128 20:53:06.917687 4746 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-dkpsf\" (UniqueName: \"kubernetes.io/projected/4108ee2d-3096-4956-95b4-7c2b8327175c-kube-api-access-dkpsf\") pod \"nmstate-webhook-8474b5b9d8-vffsx\" (UID: \"4108ee2d-3096-4956-95b4-7c2b8327175c\") " pod="openshift-nmstate/nmstate-webhook-8474b5b9d8-vffsx" Jan 28 20:53:06 crc kubenswrapper[4746]: I0128 20:53:06.917779 4746 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovs-socket\" (UniqueName: \"kubernetes.io/host-path/30f828e0-bffb-4b84-be14-53eac55a3ca3-ovs-socket\") pod \"nmstate-handler-sbwgb\" (UID: \"30f828e0-bffb-4b84-be14-53eac55a3ca3\") " pod="openshift-nmstate/nmstate-handler-sbwgb" Jan 28 20:53:06 crc kubenswrapper[4746]: I0128 20:53:06.920515 4746 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-nmstate/nmstate-metrics-54757c584b-v6k5d"] Jan 28 20:53:06 crc kubenswrapper[4746]: E0128 20:53:06.923510 4746 secret.go:188] Couldn't get secret openshift-nmstate/openshift-nmstate-webhook: secret "openshift-nmstate-webhook" not found Jan 28 20:53:06 crc 
kubenswrapper[4746]: E0128 20:53:06.923632 4746 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/4108ee2d-3096-4956-95b4-7c2b8327175c-tls-key-pair podName:4108ee2d-3096-4956-95b4-7c2b8327175c nodeName:}" failed. No retries permitted until 2026-01-28 20:53:07.42360367 +0000 UTC m=+815.379790024 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "tls-key-pair" (UniqueName: "kubernetes.io/secret/4108ee2d-3096-4956-95b4-7c2b8327175c-tls-key-pair") pod "nmstate-webhook-8474b5b9d8-vffsx" (UID: "4108ee2d-3096-4956-95b4-7c2b8327175c") : secret "openshift-nmstate-webhook" not found Jan 28 20:53:06 crc kubenswrapper[4746]: I0128 20:53:06.955726 4746 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-s9q4l\" (UniqueName: \"kubernetes.io/projected/48781ec4-e4a7-402c-a111-22310cfe0305-kube-api-access-s9q4l\") pod \"nmstate-metrics-54757c584b-v6k5d\" (UID: \"48781ec4-e4a7-402c-a111-22310cfe0305\") " pod="openshift-nmstate/nmstate-metrics-54757c584b-v6k5d" Jan 28 20:53:06 crc kubenswrapper[4746]: I0128 20:53:06.957125 4746 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-dkpsf\" (UniqueName: \"kubernetes.io/projected/4108ee2d-3096-4956-95b4-7c2b8327175c-kube-api-access-dkpsf\") pod \"nmstate-webhook-8474b5b9d8-vffsx\" (UID: \"4108ee2d-3096-4956-95b4-7c2b8327175c\") " pod="openshift-nmstate/nmstate-webhook-8474b5b9d8-vffsx" Jan 28 20:53:06 crc kubenswrapper[4746]: I0128 20:53:06.982274 4746 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-nmstate/nmstate-console-plugin-7754f76f8b-4s5cl"] Jan 28 20:53:06 crc kubenswrapper[4746]: I0128 20:53:06.983171 4746 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-nmstate/nmstate-console-plugin-7754f76f8b-4s5cl" Jan 28 20:53:06 crc kubenswrapper[4746]: I0128 20:53:06.993225 4746 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-nmstate"/"default-dockercfg-xhpgz" Jan 28 20:53:06 crc kubenswrapper[4746]: I0128 20:53:06.993607 4746 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-nmstate"/"plugin-serving-cert" Jan 28 20:53:06 crc kubenswrapper[4746]: I0128 20:53:06.994525 4746 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-nmstate"/"nginx-conf" Jan 28 20:53:06 crc kubenswrapper[4746]: I0128 20:53:06.994952 4746 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-nmstate/nmstate-console-plugin-7754f76f8b-4s5cl"] Jan 28 20:53:07 crc kubenswrapper[4746]: I0128 20:53:07.019547 4746 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"nginx-conf\" (UniqueName: \"kubernetes.io/configmap/6078a6ee-9b98-476e-89f3-5430a34e7ec9-nginx-conf\") pod \"nmstate-console-plugin-7754f76f8b-4s5cl\" (UID: \"6078a6ee-9b98-476e-89f3-5430a34e7ec9\") " pod="openshift-nmstate/nmstate-console-plugin-7754f76f8b-4s5cl" Jan 28 20:53:07 crc kubenswrapper[4746]: I0128 20:53:07.019647 4746 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-ldw4n\" (UniqueName: \"kubernetes.io/projected/6078a6ee-9b98-476e-89f3-5430a34e7ec9-kube-api-access-ldw4n\") pod \"nmstate-console-plugin-7754f76f8b-4s5cl\" (UID: \"6078a6ee-9b98-476e-89f3-5430a34e7ec9\") " pod="openshift-nmstate/nmstate-console-plugin-7754f76f8b-4s5cl" Jan 28 20:53:07 crc kubenswrapper[4746]: I0128 20:53:07.019723 4746 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dbus-socket\" (UniqueName: \"kubernetes.io/host-path/30f828e0-bffb-4b84-be14-53eac55a3ca3-dbus-socket\") pod \"nmstate-handler-sbwgb\" (UID: \"30f828e0-bffb-4b84-be14-53eac55a3ca3\") " 
pod="openshift-nmstate/nmstate-handler-sbwgb" Jan 28 20:53:07 crc kubenswrapper[4746]: I0128 20:53:07.019746 4746 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nmstate-lock\" (UniqueName: \"kubernetes.io/host-path/30f828e0-bffb-4b84-be14-53eac55a3ca3-nmstate-lock\") pod \"nmstate-handler-sbwgb\" (UID: \"30f828e0-bffb-4b84-be14-53eac55a3ca3\") " pod="openshift-nmstate/nmstate-handler-sbwgb" Jan 28 20:53:07 crc kubenswrapper[4746]: I0128 20:53:07.019766 4746 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-5p62f\" (UniqueName: \"kubernetes.io/projected/30f828e0-bffb-4b84-be14-53eac55a3ca3-kube-api-access-5p62f\") pod \"nmstate-handler-sbwgb\" (UID: \"30f828e0-bffb-4b84-be14-53eac55a3ca3\") " pod="openshift-nmstate/nmstate-handler-sbwgb" Jan 28 20:53:07 crc kubenswrapper[4746]: I0128 20:53:07.019796 4746 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"plugin-serving-cert\" (UniqueName: \"kubernetes.io/secret/6078a6ee-9b98-476e-89f3-5430a34e7ec9-plugin-serving-cert\") pod \"nmstate-console-plugin-7754f76f8b-4s5cl\" (UID: \"6078a6ee-9b98-476e-89f3-5430a34e7ec9\") " pod="openshift-nmstate/nmstate-console-plugin-7754f76f8b-4s5cl" Jan 28 20:53:07 crc kubenswrapper[4746]: I0128 20:53:07.019821 4746 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovs-socket\" (UniqueName: \"kubernetes.io/host-path/30f828e0-bffb-4b84-be14-53eac55a3ca3-ovs-socket\") pod \"nmstate-handler-sbwgb\" (UID: \"30f828e0-bffb-4b84-be14-53eac55a3ca3\") " pod="openshift-nmstate/nmstate-handler-sbwgb" Jan 28 20:53:07 crc kubenswrapper[4746]: I0128 20:53:07.019902 4746 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovs-socket\" (UniqueName: \"kubernetes.io/host-path/30f828e0-bffb-4b84-be14-53eac55a3ca3-ovs-socket\") pod \"nmstate-handler-sbwgb\" (UID: \"30f828e0-bffb-4b84-be14-53eac55a3ca3\") " 
pod="openshift-nmstate/nmstate-handler-sbwgb" Jan 28 20:53:07 crc kubenswrapper[4746]: I0128 20:53:07.020137 4746 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dbus-socket\" (UniqueName: \"kubernetes.io/host-path/30f828e0-bffb-4b84-be14-53eac55a3ca3-dbus-socket\") pod \"nmstate-handler-sbwgb\" (UID: \"30f828e0-bffb-4b84-be14-53eac55a3ca3\") " pod="openshift-nmstate/nmstate-handler-sbwgb" Jan 28 20:53:07 crc kubenswrapper[4746]: I0128 20:53:07.020161 4746 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"nmstate-lock\" (UniqueName: \"kubernetes.io/host-path/30f828e0-bffb-4b84-be14-53eac55a3ca3-nmstate-lock\") pod \"nmstate-handler-sbwgb\" (UID: \"30f828e0-bffb-4b84-be14-53eac55a3ca3\") " pod="openshift-nmstate/nmstate-handler-sbwgb" Jan 28 20:53:07 crc kubenswrapper[4746]: I0128 20:53:07.053234 4746 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-5p62f\" (UniqueName: \"kubernetes.io/projected/30f828e0-bffb-4b84-be14-53eac55a3ca3-kube-api-access-5p62f\") pod \"nmstate-handler-sbwgb\" (UID: \"30f828e0-bffb-4b84-be14-53eac55a3ca3\") " pod="openshift-nmstate/nmstate-handler-sbwgb" Jan 28 20:53:07 crc kubenswrapper[4746]: I0128 20:53:07.107459 4746 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-nmstate/nmstate-metrics-54757c584b-v6k5d" Jan 28 20:53:07 crc kubenswrapper[4746]: I0128 20:53:07.121344 4746 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"plugin-serving-cert\" (UniqueName: \"kubernetes.io/secret/6078a6ee-9b98-476e-89f3-5430a34e7ec9-plugin-serving-cert\") pod \"nmstate-console-plugin-7754f76f8b-4s5cl\" (UID: \"6078a6ee-9b98-476e-89f3-5430a34e7ec9\") " pod="openshift-nmstate/nmstate-console-plugin-7754f76f8b-4s5cl" Jan 28 20:53:07 crc kubenswrapper[4746]: I0128 20:53:07.121842 4746 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nginx-conf\" (UniqueName: \"kubernetes.io/configmap/6078a6ee-9b98-476e-89f3-5430a34e7ec9-nginx-conf\") pod \"nmstate-console-plugin-7754f76f8b-4s5cl\" (UID: \"6078a6ee-9b98-476e-89f3-5430a34e7ec9\") " pod="openshift-nmstate/nmstate-console-plugin-7754f76f8b-4s5cl" Jan 28 20:53:07 crc kubenswrapper[4746]: I0128 20:53:07.121879 4746 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-ldw4n\" (UniqueName: \"kubernetes.io/projected/6078a6ee-9b98-476e-89f3-5430a34e7ec9-kube-api-access-ldw4n\") pod \"nmstate-console-plugin-7754f76f8b-4s5cl\" (UID: \"6078a6ee-9b98-476e-89f3-5430a34e7ec9\") " pod="openshift-nmstate/nmstate-console-plugin-7754f76f8b-4s5cl" Jan 28 20:53:07 crc kubenswrapper[4746]: E0128 20:53:07.121591 4746 secret.go:188] Couldn't get secret openshift-nmstate/plugin-serving-cert: secret "plugin-serving-cert" not found Jan 28 20:53:07 crc kubenswrapper[4746]: E0128 20:53:07.122032 4746 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/6078a6ee-9b98-476e-89f3-5430a34e7ec9-plugin-serving-cert podName:6078a6ee-9b98-476e-89f3-5430a34e7ec9 nodeName:}" failed. No retries permitted until 2026-01-28 20:53:07.622002104 +0000 UTC m=+815.578188458 (durationBeforeRetry 500ms). 
Error: MountVolume.SetUp failed for volume "plugin-serving-cert" (UniqueName: "kubernetes.io/secret/6078a6ee-9b98-476e-89f3-5430a34e7ec9-plugin-serving-cert") pod "nmstate-console-plugin-7754f76f8b-4s5cl" (UID: "6078a6ee-9b98-476e-89f3-5430a34e7ec9") : secret "plugin-serving-cert" not found Jan 28 20:53:07 crc kubenswrapper[4746]: I0128 20:53:07.122828 4746 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"nginx-conf\" (UniqueName: \"kubernetes.io/configmap/6078a6ee-9b98-476e-89f3-5430a34e7ec9-nginx-conf\") pod \"nmstate-console-plugin-7754f76f8b-4s5cl\" (UID: \"6078a6ee-9b98-476e-89f3-5430a34e7ec9\") " pod="openshift-nmstate/nmstate-console-plugin-7754f76f8b-4s5cl" Jan 28 20:53:07 crc kubenswrapper[4746]: I0128 20:53:07.148479 4746 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-nmstate/nmstate-handler-sbwgb" Jan 28 20:53:07 crc kubenswrapper[4746]: I0128 20:53:07.155044 4746 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-ldw4n\" (UniqueName: \"kubernetes.io/projected/6078a6ee-9b98-476e-89f3-5430a34e7ec9-kube-api-access-ldw4n\") pod \"nmstate-console-plugin-7754f76f8b-4s5cl\" (UID: \"6078a6ee-9b98-476e-89f3-5430a34e7ec9\") " pod="openshift-nmstate/nmstate-console-plugin-7754f76f8b-4s5cl" Jan 28 20:53:07 crc kubenswrapper[4746]: I0128 20:53:07.261578 4746 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-console/console-64487fb756-szs6b"] Jan 28 20:53:07 crc kubenswrapper[4746]: I0128 20:53:07.262613 4746 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-console/console-64487fb756-szs6b" Jan 28 20:53:07 crc kubenswrapper[4746]: I0128 20:53:07.292740 4746 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-console/console-64487fb756-szs6b"] Jan 28 20:53:07 crc kubenswrapper[4746]: I0128 20:53:07.330478 4746 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/f043266b-71f5-4cca-8903-e635d4d4e4a1-console-config\") pod \"console-64487fb756-szs6b\" (UID: \"f043266b-71f5-4cca-8903-e635d4d4e4a1\") " pod="openshift-console/console-64487fb756-szs6b" Jan 28 20:53:07 crc kubenswrapper[4746]: I0128 20:53:07.330604 4746 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/f043266b-71f5-4cca-8903-e635d4d4e4a1-trusted-ca-bundle\") pod \"console-64487fb756-szs6b\" (UID: \"f043266b-71f5-4cca-8903-e635d4d4e4a1\") " pod="openshift-console/console-64487fb756-szs6b" Jan 28 20:53:07 crc kubenswrapper[4746]: I0128 20:53:07.330635 4746 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/f043266b-71f5-4cca-8903-e635d4d4e4a1-oauth-serving-cert\") pod \"console-64487fb756-szs6b\" (UID: \"f043266b-71f5-4cca-8903-e635d4d4e4a1\") " pod="openshift-console/console-64487fb756-szs6b" Jan 28 20:53:07 crc kubenswrapper[4746]: I0128 20:53:07.330652 4746 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/f043266b-71f5-4cca-8903-e635d4d4e4a1-console-oauth-config\") pod \"console-64487fb756-szs6b\" (UID: \"f043266b-71f5-4cca-8903-e635d4d4e4a1\") " pod="openshift-console/console-64487fb756-szs6b" Jan 28 20:53:07 crc kubenswrapper[4746]: I0128 20:53:07.330700 4746 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/f043266b-71f5-4cca-8903-e635d4d4e4a1-service-ca\") pod \"console-64487fb756-szs6b\" (UID: \"f043266b-71f5-4cca-8903-e635d4d4e4a1\") " pod="openshift-console/console-64487fb756-szs6b" Jan 28 20:53:07 crc kubenswrapper[4746]: I0128 20:53:07.330725 4746 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-s96nd\" (UniqueName: \"kubernetes.io/projected/f043266b-71f5-4cca-8903-e635d4d4e4a1-kube-api-access-s96nd\") pod \"console-64487fb756-szs6b\" (UID: \"f043266b-71f5-4cca-8903-e635d4d4e4a1\") " pod="openshift-console/console-64487fb756-szs6b" Jan 28 20:53:07 crc kubenswrapper[4746]: I0128 20:53:07.330743 4746 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/f043266b-71f5-4cca-8903-e635d4d4e4a1-console-serving-cert\") pod \"console-64487fb756-szs6b\" (UID: \"f043266b-71f5-4cca-8903-e635d4d4e4a1\") " pod="openshift-console/console-64487fb756-szs6b" Jan 28 20:53:07 crc kubenswrapper[4746]: I0128 20:53:07.433095 4746 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"tls-key-pair\" (UniqueName: \"kubernetes.io/secret/4108ee2d-3096-4956-95b4-7c2b8327175c-tls-key-pair\") pod \"nmstate-webhook-8474b5b9d8-vffsx\" (UID: \"4108ee2d-3096-4956-95b4-7c2b8327175c\") " pod="openshift-nmstate/nmstate-webhook-8474b5b9d8-vffsx" Jan 28 20:53:07 crc kubenswrapper[4746]: I0128 20:53:07.433479 4746 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/f043266b-71f5-4cca-8903-e635d4d4e4a1-service-ca\") pod \"console-64487fb756-szs6b\" (UID: \"f043266b-71f5-4cca-8903-e635d4d4e4a1\") " pod="openshift-console/console-64487fb756-szs6b" Jan 28 20:53:07 crc kubenswrapper[4746]: I0128 20:53:07.433506 4746 
reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-s96nd\" (UniqueName: \"kubernetes.io/projected/f043266b-71f5-4cca-8903-e635d4d4e4a1-kube-api-access-s96nd\") pod \"console-64487fb756-szs6b\" (UID: \"f043266b-71f5-4cca-8903-e635d4d4e4a1\") " pod="openshift-console/console-64487fb756-szs6b" Jan 28 20:53:07 crc kubenswrapper[4746]: I0128 20:53:07.433545 4746 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/f043266b-71f5-4cca-8903-e635d4d4e4a1-console-serving-cert\") pod \"console-64487fb756-szs6b\" (UID: \"f043266b-71f5-4cca-8903-e635d4d4e4a1\") " pod="openshift-console/console-64487fb756-szs6b" Jan 28 20:53:07 crc kubenswrapper[4746]: I0128 20:53:07.433564 4746 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/f043266b-71f5-4cca-8903-e635d4d4e4a1-console-config\") pod \"console-64487fb756-szs6b\" (UID: \"f043266b-71f5-4cca-8903-e635d4d4e4a1\") " pod="openshift-console/console-64487fb756-szs6b" Jan 28 20:53:07 crc kubenswrapper[4746]: I0128 20:53:07.433619 4746 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/f043266b-71f5-4cca-8903-e635d4d4e4a1-trusted-ca-bundle\") pod \"console-64487fb756-szs6b\" (UID: \"f043266b-71f5-4cca-8903-e635d4d4e4a1\") " pod="openshift-console/console-64487fb756-szs6b" Jan 28 20:53:07 crc kubenswrapper[4746]: I0128 20:53:07.433645 4746 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/f043266b-71f5-4cca-8903-e635d4d4e4a1-oauth-serving-cert\") pod \"console-64487fb756-szs6b\" (UID: \"f043266b-71f5-4cca-8903-e635d4d4e4a1\") " pod="openshift-console/console-64487fb756-szs6b" Jan 28 20:53:07 crc kubenswrapper[4746]: I0128 20:53:07.433667 4746 
reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/f043266b-71f5-4cca-8903-e635d4d4e4a1-console-oauth-config\") pod \"console-64487fb756-szs6b\" (UID: \"f043266b-71f5-4cca-8903-e635d4d4e4a1\") " pod="openshift-console/console-64487fb756-szs6b" Jan 28 20:53:07 crc kubenswrapper[4746]: I0128 20:53:07.438801 4746 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/f043266b-71f5-4cca-8903-e635d4d4e4a1-console-config\") pod \"console-64487fb756-szs6b\" (UID: \"f043266b-71f5-4cca-8903-e635d4d4e4a1\") " pod="openshift-console/console-64487fb756-szs6b" Jan 28 20:53:07 crc kubenswrapper[4746]: I0128 20:53:07.440183 4746 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/f043266b-71f5-4cca-8903-e635d4d4e4a1-trusted-ca-bundle\") pod \"console-64487fb756-szs6b\" (UID: \"f043266b-71f5-4cca-8903-e635d4d4e4a1\") " pod="openshift-console/console-64487fb756-szs6b" Jan 28 20:53:07 crc kubenswrapper[4746]: I0128 20:53:07.440853 4746 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/f043266b-71f5-4cca-8903-e635d4d4e4a1-service-ca\") pod \"console-64487fb756-szs6b\" (UID: \"f043266b-71f5-4cca-8903-e635d4d4e4a1\") " pod="openshift-console/console-64487fb756-szs6b" Jan 28 20:53:07 crc kubenswrapper[4746]: I0128 20:53:07.441095 4746 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/f043266b-71f5-4cca-8903-e635d4d4e4a1-oauth-serving-cert\") pod \"console-64487fb756-szs6b\" (UID: \"f043266b-71f5-4cca-8903-e635d4d4e4a1\") " pod="openshift-console/console-64487fb756-szs6b" Jan 28 20:53:07 crc kubenswrapper[4746]: I0128 20:53:07.444450 4746 operation_generator.go:637] "MountVolume.SetUp succeeded for volume 
\"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/f043266b-71f5-4cca-8903-e635d4d4e4a1-console-serving-cert\") pod \"console-64487fb756-szs6b\" (UID: \"f043266b-71f5-4cca-8903-e635d4d4e4a1\") " pod="openshift-console/console-64487fb756-szs6b" Jan 28 20:53:07 crc kubenswrapper[4746]: I0128 20:53:07.450825 4746 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/f043266b-71f5-4cca-8903-e635d4d4e4a1-console-oauth-config\") pod \"console-64487fb756-szs6b\" (UID: \"f043266b-71f5-4cca-8903-e635d4d4e4a1\") " pod="openshift-console/console-64487fb756-szs6b" Jan 28 20:53:07 crc kubenswrapper[4746]: I0128 20:53:07.451640 4746 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"tls-key-pair\" (UniqueName: \"kubernetes.io/secret/4108ee2d-3096-4956-95b4-7c2b8327175c-tls-key-pair\") pod \"nmstate-webhook-8474b5b9d8-vffsx\" (UID: \"4108ee2d-3096-4956-95b4-7c2b8327175c\") " pod="openshift-nmstate/nmstate-webhook-8474b5b9d8-vffsx" Jan 28 20:53:07 crc kubenswrapper[4746]: I0128 20:53:07.467762 4746 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-s96nd\" (UniqueName: \"kubernetes.io/projected/f043266b-71f5-4cca-8903-e635d4d4e4a1-kube-api-access-s96nd\") pod \"console-64487fb756-szs6b\" (UID: \"f043266b-71f5-4cca-8903-e635d4d4e4a1\") " pod="openshift-console/console-64487fb756-szs6b" Jan 28 20:53:07 crc kubenswrapper[4746]: I0128 20:53:07.578716 4746 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-console/console-64487fb756-szs6b" Jan 28 20:53:07 crc kubenswrapper[4746]: I0128 20:53:07.635027 4746 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"plugin-serving-cert\" (UniqueName: \"kubernetes.io/secret/6078a6ee-9b98-476e-89f3-5430a34e7ec9-plugin-serving-cert\") pod \"nmstate-console-plugin-7754f76f8b-4s5cl\" (UID: \"6078a6ee-9b98-476e-89f3-5430a34e7ec9\") " pod="openshift-nmstate/nmstate-console-plugin-7754f76f8b-4s5cl" Jan 28 20:53:07 crc kubenswrapper[4746]: I0128 20:53:07.640513 4746 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"plugin-serving-cert\" (UniqueName: \"kubernetes.io/secret/6078a6ee-9b98-476e-89f3-5430a34e7ec9-plugin-serving-cert\") pod \"nmstate-console-plugin-7754f76f8b-4s5cl\" (UID: \"6078a6ee-9b98-476e-89f3-5430a34e7ec9\") " pod="openshift-nmstate/nmstate-console-plugin-7754f76f8b-4s5cl" Jan 28 20:53:07 crc kubenswrapper[4746]: I0128 20:53:07.717767 4746 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-nmstate/nmstate-webhook-8474b5b9d8-vffsx" Jan 28 20:53:07 crc kubenswrapper[4746]: I0128 20:53:07.776771 4746 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-nmstate/nmstate-metrics-54757c584b-v6k5d"] Jan 28 20:53:07 crc kubenswrapper[4746]: W0128 20:53:07.787375 4746 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod48781ec4_e4a7_402c_a111_22310cfe0305.slice/crio-f37675cbfa212c9bef319dd7eb0d6f9f95c2806affa66c1236b449c2b01780f5 WatchSource:0}: Error finding container f37675cbfa212c9bef319dd7eb0d6f9f95c2806affa66c1236b449c2b01780f5: Status 404 returned error can't find the container with id f37675cbfa212c9bef319dd7eb0d6f9f95c2806affa66c1236b449c2b01780f5 Jan 28 20:53:07 crc kubenswrapper[4746]: I0128 20:53:07.802922 4746 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-console/console-64487fb756-szs6b"] Jan 28 20:53:07 crc kubenswrapper[4746]: W0128 20:53:07.813201 4746 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podf043266b_71f5_4cca_8903_e635d4d4e4a1.slice/crio-540502fa87d655230814b82c513afd550fc8c2383fc010db4b0a052f207ddff1 WatchSource:0}: Error finding container 540502fa87d655230814b82c513afd550fc8c2383fc010db4b0a052f207ddff1: Status 404 returned error can't find the container with id 540502fa87d655230814b82c513afd550fc8c2383fc010db4b0a052f207ddff1 Jan 28 20:53:07 crc kubenswrapper[4746]: I0128 20:53:07.828427 4746 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-nmstate/nmstate-handler-sbwgb" event={"ID":"30f828e0-bffb-4b84-be14-53eac55a3ca3","Type":"ContainerStarted","Data":"e954dca6e9629db1af20c051a599dfc35a3ee2561a173abe65924a74d89657e7"} Jan 28 20:53:07 crc kubenswrapper[4746]: I0128 20:53:07.840704 4746 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-nmstate/nmstate-metrics-54757c584b-v6k5d" 
event={"ID":"48781ec4-e4a7-402c-a111-22310cfe0305","Type":"ContainerStarted","Data":"f37675cbfa212c9bef319dd7eb0d6f9f95c2806affa66c1236b449c2b01780f5"} Jan 28 20:53:07 crc kubenswrapper[4746]: I0128 20:53:07.853293 4746 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-64487fb756-szs6b" event={"ID":"f043266b-71f5-4cca-8903-e635d4d4e4a1","Type":"ContainerStarted","Data":"540502fa87d655230814b82c513afd550fc8c2383fc010db4b0a052f207ddff1"} Jan 28 20:53:07 crc kubenswrapper[4746]: I0128 20:53:07.912891 4746 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-nmstate/nmstate-console-plugin-7754f76f8b-4s5cl" Jan 28 20:53:08 crc kubenswrapper[4746]: I0128 20:53:08.037746 4746 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-nmstate/nmstate-webhook-8474b5b9d8-vffsx"] Jan 28 20:53:08 crc kubenswrapper[4746]: I0128 20:53:08.189588 4746 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-nmstate/nmstate-console-plugin-7754f76f8b-4s5cl"] Jan 28 20:53:08 crc kubenswrapper[4746]: W0128 20:53:08.195351 4746 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod6078a6ee_9b98_476e_89f3_5430a34e7ec9.slice/crio-4e9af2f02a525c170e15456da9ef91dbc0c3e1bc1c7781079d832e7c6827ee15 WatchSource:0}: Error finding container 4e9af2f02a525c170e15456da9ef91dbc0c3e1bc1c7781079d832e7c6827ee15: Status 404 returned error can't find the container with id 4e9af2f02a525c170e15456da9ef91dbc0c3e1bc1c7781079d832e7c6827ee15 Jan 28 20:53:08 crc kubenswrapper[4746]: I0128 20:53:08.881488 4746 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-nmstate/nmstate-webhook-8474b5b9d8-vffsx" event={"ID":"4108ee2d-3096-4956-95b4-7c2b8327175c","Type":"ContainerStarted","Data":"b66a6b9ef9571e307d913e5d285c31c2d55ef47df713f0884aabfa70434748a9"} Jan 28 20:53:08 crc kubenswrapper[4746]: I0128 20:53:08.883107 4746 kubelet.go:2453] "SyncLoop 
(PLEG): event for pod" pod="openshift-nmstate/nmstate-console-plugin-7754f76f8b-4s5cl" event={"ID":"6078a6ee-9b98-476e-89f3-5430a34e7ec9","Type":"ContainerStarted","Data":"4e9af2f02a525c170e15456da9ef91dbc0c3e1bc1c7781079d832e7c6827ee15"} Jan 28 20:53:08 crc kubenswrapper[4746]: I0128 20:53:08.884941 4746 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-64487fb756-szs6b" event={"ID":"f043266b-71f5-4cca-8903-e635d4d4e4a1","Type":"ContainerStarted","Data":"8a0b1b3411c30bd25c64b5a392fc862bbd1937b0b93bc34760f20c62f339c5b6"} Jan 28 20:53:08 crc kubenswrapper[4746]: I0128 20:53:08.913520 4746 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-console/console-64487fb756-szs6b" podStartSLOduration=1.913493087 podStartE2EDuration="1.913493087s" podCreationTimestamp="2026-01-28 20:53:07 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-28 20:53:08.904320941 +0000 UTC m=+816.860507315" watchObservedRunningTime="2026-01-28 20:53:08.913493087 +0000 UTC m=+816.869679441" Jan 28 20:53:11 crc kubenswrapper[4746]: I0128 20:53:11.934343 4746 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-nmstate/nmstate-handler-sbwgb" event={"ID":"30f828e0-bffb-4b84-be14-53eac55a3ca3","Type":"ContainerStarted","Data":"e8a4886c96d23790563fe071c7a07aa05e1e958a9f193634fe71ea17999744d0"} Jan 28 20:53:11 crc kubenswrapper[4746]: I0128 20:53:11.937281 4746 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-nmstate/nmstate-handler-sbwgb" Jan 28 20:53:11 crc kubenswrapper[4746]: I0128 20:53:11.938924 4746 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-nmstate/nmstate-metrics-54757c584b-v6k5d" event={"ID":"48781ec4-e4a7-402c-a111-22310cfe0305","Type":"ContainerStarted","Data":"dbecf1bbdd5e3a1c27ae631f64262810cd94655debdac8522f1310afc58a6e3d"} Jan 28 20:53:11 crc kubenswrapper[4746]: I0128 
20:53:11.941718 4746 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-nmstate/nmstate-webhook-8474b5b9d8-vffsx" event={"ID":"4108ee2d-3096-4956-95b4-7c2b8327175c","Type":"ContainerStarted","Data":"42e046fd85717551400e0bf4794ef5fe9c5075d46fd2508c773e460d56ae4f76"} Jan 28 20:53:11 crc kubenswrapper[4746]: I0128 20:53:11.941883 4746 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-nmstate/nmstate-webhook-8474b5b9d8-vffsx" Jan 28 20:53:11 crc kubenswrapper[4746]: I0128 20:53:11.943387 4746 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-nmstate/nmstate-console-plugin-7754f76f8b-4s5cl" event={"ID":"6078a6ee-9b98-476e-89f3-5430a34e7ec9","Type":"ContainerStarted","Data":"6658d22a9027f51905b62f5777a4db11e1a84ba55ad32e2fcc27313f0a95948d"} Jan 28 20:53:11 crc kubenswrapper[4746]: I0128 20:53:11.989421 4746 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-nmstate/nmstate-handler-sbwgb" podStartSLOduration=2.396452388 podStartE2EDuration="5.989393696s" podCreationTimestamp="2026-01-28 20:53:06 +0000 UTC" firstStartedPulling="2026-01-28 20:53:07.182324291 +0000 UTC m=+815.138510645" lastFinishedPulling="2026-01-28 20:53:10.775265599 +0000 UTC m=+818.731451953" observedRunningTime="2026-01-28 20:53:11.973531218 +0000 UTC m=+819.929717572" watchObservedRunningTime="2026-01-28 20:53:11.989393696 +0000 UTC m=+819.945580050" Jan 28 20:53:12 crc kubenswrapper[4746]: I0128 20:53:12.015346 4746 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-nmstate/nmstate-console-plugin-7754f76f8b-4s5cl" podStartSLOduration=3.342555814 podStartE2EDuration="6.015320236s" podCreationTimestamp="2026-01-28 20:53:06 +0000 UTC" firstStartedPulling="2026-01-28 20:53:08.197221393 +0000 UTC m=+816.153407757" lastFinishedPulling="2026-01-28 20:53:10.869985785 +0000 UTC m=+818.826172179" observedRunningTime="2026-01-28 20:53:12.007871594 +0000 UTC m=+819.964057948" 
watchObservedRunningTime="2026-01-28 20:53:12.015320236 +0000 UTC m=+819.971506590" Jan 28 20:53:12 crc kubenswrapper[4746]: I0128 20:53:12.082072 4746 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-nmstate/nmstate-webhook-8474b5b9d8-vffsx" podStartSLOduration=3.377708173 podStartE2EDuration="6.082047865s" podCreationTimestamp="2026-01-28 20:53:06 +0000 UTC" firstStartedPulling="2026-01-28 20:53:08.079480937 +0000 UTC m=+816.035667291" lastFinishedPulling="2026-01-28 20:53:10.783820629 +0000 UTC m=+818.740006983" observedRunningTime="2026-01-28 20:53:12.073273699 +0000 UTC m=+820.029460053" watchObservedRunningTime="2026-01-28 20:53:12.082047865 +0000 UTC m=+820.038234219" Jan 28 20:53:13 crc kubenswrapper[4746]: I0128 20:53:13.967103 4746 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-nmstate/nmstate-metrics-54757c584b-v6k5d" event={"ID":"48781ec4-e4a7-402c-a111-22310cfe0305","Type":"ContainerStarted","Data":"34de24051c574897b9193416937b4453bc8e1abac23402cd2c314fd359648010"} Jan 28 20:53:14 crc kubenswrapper[4746]: I0128 20:53:14.003339 4746 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-nmstate/nmstate-metrics-54757c584b-v6k5d" podStartSLOduration=2.221140898 podStartE2EDuration="8.003308941s" podCreationTimestamp="2026-01-28 20:53:06 +0000 UTC" firstStartedPulling="2026-01-28 20:53:07.791195618 +0000 UTC m=+815.747381972" lastFinishedPulling="2026-01-28 20:53:13.573363651 +0000 UTC m=+821.529550015" observedRunningTime="2026-01-28 20:53:13.995424099 +0000 UTC m=+821.951610463" watchObservedRunningTime="2026-01-28 20:53:14.003308941 +0000 UTC m=+821.959495315" Jan 28 20:53:15 crc kubenswrapper[4746]: I0128 20:53:15.872150 4746 patch_prober.go:28] interesting pod/machine-config-daemon-6wrnw container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: 
connection refused" start-of-body= Jan 28 20:53:15 crc kubenswrapper[4746]: I0128 20:53:15.872638 4746 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-6wrnw" podUID="6dc8b546-9734-4082-b2b3-2bafe3f1564d" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Jan 28 20:53:17 crc kubenswrapper[4746]: I0128 20:53:17.176113 4746 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-nmstate/nmstate-handler-sbwgb" Jan 28 20:53:17 crc kubenswrapper[4746]: I0128 20:53:17.580484 4746 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-console/console-64487fb756-szs6b" Jan 28 20:53:17 crc kubenswrapper[4746]: I0128 20:53:17.580543 4746 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-console/console-64487fb756-szs6b" Jan 28 20:53:17 crc kubenswrapper[4746]: I0128 20:53:17.585490 4746 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-console/console-64487fb756-szs6b" Jan 28 20:53:18 crc kubenswrapper[4746]: I0128 20:53:18.003537 4746 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-console/console-64487fb756-szs6b" Jan 28 20:53:18 crc kubenswrapper[4746]: I0128 20:53:18.069058 4746 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-console/console-f9d7485db-hcxv8"] Jan 28 20:53:27 crc kubenswrapper[4746]: I0128 20:53:27.725285 4746 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-nmstate/nmstate-webhook-8474b5b9d8-vffsx" Jan 28 20:53:43 crc kubenswrapper[4746]: I0128 20:53:43.124400 4746 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-console/console-f9d7485db-hcxv8" podUID="94cef654-afbe-42c2-8069-5dbcb7294abb" containerName="console" 
containerID="cri-o://bf07bf85734432391e1902898b2e3ff5cd6265ec338b9d98a12b388e93d34a0d" gracePeriod=15 Jan 28 20:53:43 crc kubenswrapper[4746]: I0128 20:53:43.815380 4746 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console_console-f9d7485db-hcxv8_94cef654-afbe-42c2-8069-5dbcb7294abb/console/0.log" Jan 28 20:53:43 crc kubenswrapper[4746]: I0128 20:53:43.815773 4746 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-console/console-f9d7485db-hcxv8" Jan 28 20:53:43 crc kubenswrapper[4746]: I0128 20:53:43.822330 4746 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console_console-f9d7485db-hcxv8_94cef654-afbe-42c2-8069-5dbcb7294abb/console/0.log" Jan 28 20:53:43 crc kubenswrapper[4746]: I0128 20:53:43.822380 4746 generic.go:334] "Generic (PLEG): container finished" podID="94cef654-afbe-42c2-8069-5dbcb7294abb" containerID="bf07bf85734432391e1902898b2e3ff5cd6265ec338b9d98a12b388e93d34a0d" exitCode=2 Jan 28 20:53:43 crc kubenswrapper[4746]: I0128 20:53:43.822422 4746 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-f9d7485db-hcxv8" event={"ID":"94cef654-afbe-42c2-8069-5dbcb7294abb","Type":"ContainerDied","Data":"bf07bf85734432391e1902898b2e3ff5cd6265ec338b9d98a12b388e93d34a0d"} Jan 28 20:53:43 crc kubenswrapper[4746]: I0128 20:53:43.822458 4746 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-f9d7485db-hcxv8" event={"ID":"94cef654-afbe-42c2-8069-5dbcb7294abb","Type":"ContainerDied","Data":"9472ab79081e1688ed461a49bc6fb1a958f71fca59cf6edb83c041bbba201d4d"} Jan 28 20:53:43 crc kubenswrapper[4746]: I0128 20:53:43.822478 4746 scope.go:117] "RemoveContainer" containerID="bf07bf85734432391e1902898b2e3ff5cd6265ec338b9d98a12b388e93d34a0d" Jan 28 20:53:43 crc kubenswrapper[4746]: I0128 20:53:43.822603 4746 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-console/console-f9d7485db-hcxv8" Jan 28 20:53:43 crc kubenswrapper[4746]: I0128 20:53:43.842954 4746 scope.go:117] "RemoveContainer" containerID="bf07bf85734432391e1902898b2e3ff5cd6265ec338b9d98a12b388e93d34a0d" Jan 28 20:53:43 crc kubenswrapper[4746]: E0128 20:53:43.846400 4746 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"bf07bf85734432391e1902898b2e3ff5cd6265ec338b9d98a12b388e93d34a0d\": container with ID starting with bf07bf85734432391e1902898b2e3ff5cd6265ec338b9d98a12b388e93d34a0d not found: ID does not exist" containerID="bf07bf85734432391e1902898b2e3ff5cd6265ec338b9d98a12b388e93d34a0d" Jan 28 20:53:43 crc kubenswrapper[4746]: I0128 20:53:43.846447 4746 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"bf07bf85734432391e1902898b2e3ff5cd6265ec338b9d98a12b388e93d34a0d"} err="failed to get container status \"bf07bf85734432391e1902898b2e3ff5cd6265ec338b9d98a12b388e93d34a0d\": rpc error: code = NotFound desc = could not find container \"bf07bf85734432391e1902898b2e3ff5cd6265ec338b9d98a12b388e93d34a0d\": container with ID starting with bf07bf85734432391e1902898b2e3ff5cd6265ec338b9d98a12b388e93d34a0d not found: ID does not exist" Jan 28 20:53:43 crc kubenswrapper[4746]: I0128 20:53:43.966074 4746 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/94cef654-afbe-42c2-8069-5dbcb7294abb-console-config\") pod \"94cef654-afbe-42c2-8069-5dbcb7294abb\" (UID: \"94cef654-afbe-42c2-8069-5dbcb7294abb\") " Jan 28 20:53:43 crc kubenswrapper[4746]: I0128 20:53:43.966165 4746 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/94cef654-afbe-42c2-8069-5dbcb7294abb-trusted-ca-bundle\") pod \"94cef654-afbe-42c2-8069-5dbcb7294abb\" (UID: 
\"94cef654-afbe-42c2-8069-5dbcb7294abb\") " Jan 28 20:53:43 crc kubenswrapper[4746]: I0128 20:53:43.966240 4746 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-6tknp\" (UniqueName: \"kubernetes.io/projected/94cef654-afbe-42c2-8069-5dbcb7294abb-kube-api-access-6tknp\") pod \"94cef654-afbe-42c2-8069-5dbcb7294abb\" (UID: \"94cef654-afbe-42c2-8069-5dbcb7294abb\") " Jan 28 20:53:43 crc kubenswrapper[4746]: I0128 20:53:43.966317 4746 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/94cef654-afbe-42c2-8069-5dbcb7294abb-service-ca\") pod \"94cef654-afbe-42c2-8069-5dbcb7294abb\" (UID: \"94cef654-afbe-42c2-8069-5dbcb7294abb\") " Jan 28 20:53:43 crc kubenswrapper[4746]: I0128 20:53:43.966337 4746 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/94cef654-afbe-42c2-8069-5dbcb7294abb-oauth-serving-cert\") pod \"94cef654-afbe-42c2-8069-5dbcb7294abb\" (UID: \"94cef654-afbe-42c2-8069-5dbcb7294abb\") " Jan 28 20:53:43 crc kubenswrapper[4746]: I0128 20:53:43.966366 4746 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/94cef654-afbe-42c2-8069-5dbcb7294abb-console-serving-cert\") pod \"94cef654-afbe-42c2-8069-5dbcb7294abb\" (UID: \"94cef654-afbe-42c2-8069-5dbcb7294abb\") " Jan 28 20:53:43 crc kubenswrapper[4746]: I0128 20:53:43.966408 4746 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/94cef654-afbe-42c2-8069-5dbcb7294abb-console-oauth-config\") pod \"94cef654-afbe-42c2-8069-5dbcb7294abb\" (UID: \"94cef654-afbe-42c2-8069-5dbcb7294abb\") " Jan 28 20:53:43 crc kubenswrapper[4746]: I0128 20:53:43.967303 4746 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume 
"kubernetes.io/configmap/94cef654-afbe-42c2-8069-5dbcb7294abb-trusted-ca-bundle" (OuterVolumeSpecName: "trusted-ca-bundle") pod "94cef654-afbe-42c2-8069-5dbcb7294abb" (UID: "94cef654-afbe-42c2-8069-5dbcb7294abb"). InnerVolumeSpecName "trusted-ca-bundle". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 28 20:53:43 crc kubenswrapper[4746]: I0128 20:53:43.967603 4746 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/94cef654-afbe-42c2-8069-5dbcb7294abb-oauth-serving-cert" (OuterVolumeSpecName: "oauth-serving-cert") pod "94cef654-afbe-42c2-8069-5dbcb7294abb" (UID: "94cef654-afbe-42c2-8069-5dbcb7294abb"). InnerVolumeSpecName "oauth-serving-cert". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 28 20:53:43 crc kubenswrapper[4746]: I0128 20:53:43.967754 4746 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/94cef654-afbe-42c2-8069-5dbcb7294abb-service-ca" (OuterVolumeSpecName: "service-ca") pod "94cef654-afbe-42c2-8069-5dbcb7294abb" (UID: "94cef654-afbe-42c2-8069-5dbcb7294abb"). InnerVolumeSpecName "service-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 28 20:53:43 crc kubenswrapper[4746]: I0128 20:53:43.968673 4746 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/94cef654-afbe-42c2-8069-5dbcb7294abb-console-config" (OuterVolumeSpecName: "console-config") pod "94cef654-afbe-42c2-8069-5dbcb7294abb" (UID: "94cef654-afbe-42c2-8069-5dbcb7294abb"). InnerVolumeSpecName "console-config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 28 20:53:43 crc kubenswrapper[4746]: I0128 20:53:43.973633 4746 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/94cef654-afbe-42c2-8069-5dbcb7294abb-kube-api-access-6tknp" (OuterVolumeSpecName: "kube-api-access-6tknp") pod "94cef654-afbe-42c2-8069-5dbcb7294abb" (UID: "94cef654-afbe-42c2-8069-5dbcb7294abb"). 
InnerVolumeSpecName "kube-api-access-6tknp". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 28 20:53:43 crc kubenswrapper[4746]: I0128 20:53:43.974346 4746 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/94cef654-afbe-42c2-8069-5dbcb7294abb-console-serving-cert" (OuterVolumeSpecName: "console-serving-cert") pod "94cef654-afbe-42c2-8069-5dbcb7294abb" (UID: "94cef654-afbe-42c2-8069-5dbcb7294abb"). InnerVolumeSpecName "console-serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 28 20:53:43 crc kubenswrapper[4746]: I0128 20:53:43.975571 4746 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/94cef654-afbe-42c2-8069-5dbcb7294abb-console-oauth-config" (OuterVolumeSpecName: "console-oauth-config") pod "94cef654-afbe-42c2-8069-5dbcb7294abb" (UID: "94cef654-afbe-42c2-8069-5dbcb7294abb"). InnerVolumeSpecName "console-oauth-config". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 28 20:53:44 crc kubenswrapper[4746]: I0128 20:53:44.067791 4746 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-6tknp\" (UniqueName: \"kubernetes.io/projected/94cef654-afbe-42c2-8069-5dbcb7294abb-kube-api-access-6tknp\") on node \"crc\" DevicePath \"\"" Jan 28 20:53:44 crc kubenswrapper[4746]: I0128 20:53:44.067840 4746 reconciler_common.go:293] "Volume detached for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/94cef654-afbe-42c2-8069-5dbcb7294abb-service-ca\") on node \"crc\" DevicePath \"\"" Jan 28 20:53:44 crc kubenswrapper[4746]: I0128 20:53:44.067851 4746 reconciler_common.go:293] "Volume detached for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/94cef654-afbe-42c2-8069-5dbcb7294abb-oauth-serving-cert\") on node \"crc\" DevicePath \"\"" Jan 28 20:53:44 crc kubenswrapper[4746]: I0128 20:53:44.067861 4746 reconciler_common.go:293] "Volume detached for volume \"console-serving-cert\" (UniqueName: 
\"kubernetes.io/secret/94cef654-afbe-42c2-8069-5dbcb7294abb-console-serving-cert\") on node \"crc\" DevicePath \"\"" Jan 28 20:53:44 crc kubenswrapper[4746]: I0128 20:53:44.067874 4746 reconciler_common.go:293] "Volume detached for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/94cef654-afbe-42c2-8069-5dbcb7294abb-console-oauth-config\") on node \"crc\" DevicePath \"\"" Jan 28 20:53:44 crc kubenswrapper[4746]: I0128 20:53:44.067882 4746 reconciler_common.go:293] "Volume detached for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/94cef654-afbe-42c2-8069-5dbcb7294abb-console-config\") on node \"crc\" DevicePath \"\"" Jan 28 20:53:44 crc kubenswrapper[4746]: I0128 20:53:44.067891 4746 reconciler_common.go:293] "Volume detached for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/94cef654-afbe-42c2-8069-5dbcb7294abb-trusted-ca-bundle\") on node \"crc\" DevicePath \"\"" Jan 28 20:53:44 crc kubenswrapper[4746]: I0128 20:53:44.160628 4746 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-console/console-f9d7485db-hcxv8"] Jan 28 20:53:44 crc kubenswrapper[4746]: I0128 20:53:44.167883 4746 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-console/console-f9d7485db-hcxv8"] Jan 28 20:53:44 crc kubenswrapper[4746]: I0128 20:53:44.256469 4746 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/270996307cd21d144be796860235064b5127c2fcf62ccccd6689c259dcjnv2x"] Jan 28 20:53:44 crc kubenswrapper[4746]: E0128 20:53:44.256781 4746 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="94cef654-afbe-42c2-8069-5dbcb7294abb" containerName="console" Jan 28 20:53:44 crc kubenswrapper[4746]: I0128 20:53:44.256802 4746 state_mem.go:107] "Deleted CPUSet assignment" podUID="94cef654-afbe-42c2-8069-5dbcb7294abb" containerName="console" Jan 28 20:53:44 crc kubenswrapper[4746]: I0128 20:53:44.256933 4746 memory_manager.go:354] "RemoveStaleState removing state" 
podUID="94cef654-afbe-42c2-8069-5dbcb7294abb" containerName="console" Jan 28 20:53:44 crc kubenswrapper[4746]: I0128 20:53:44.257863 4746 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/270996307cd21d144be796860235064b5127c2fcf62ccccd6689c259dcjnv2x" Jan 28 20:53:44 crc kubenswrapper[4746]: I0128 20:53:44.260350 4746 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"default-dockercfg-vmwhc" Jan 28 20:53:44 crc kubenswrapper[4746]: I0128 20:53:44.277997 4746 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/270996307cd21d144be796860235064b5127c2fcf62ccccd6689c259dcjnv2x"] Jan 28 20:53:44 crc kubenswrapper[4746]: I0128 20:53:44.371521 4746 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-6jx6c\" (UniqueName: \"kubernetes.io/projected/b238ee1e-c43f-4eb7-8f69-de9f58747168-kube-api-access-6jx6c\") pod \"270996307cd21d144be796860235064b5127c2fcf62ccccd6689c259dcjnv2x\" (UID: \"b238ee1e-c43f-4eb7-8f69-de9f58747168\") " pod="openshift-marketplace/270996307cd21d144be796860235064b5127c2fcf62ccccd6689c259dcjnv2x" Jan 28 20:53:44 crc kubenswrapper[4746]: I0128 20:53:44.371596 4746 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/b238ee1e-c43f-4eb7-8f69-de9f58747168-util\") pod \"270996307cd21d144be796860235064b5127c2fcf62ccccd6689c259dcjnv2x\" (UID: \"b238ee1e-c43f-4eb7-8f69-de9f58747168\") " pod="openshift-marketplace/270996307cd21d144be796860235064b5127c2fcf62ccccd6689c259dcjnv2x" Jan 28 20:53:44 crc kubenswrapper[4746]: I0128 20:53:44.371622 4746 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/b238ee1e-c43f-4eb7-8f69-de9f58747168-bundle\") pod \"270996307cd21d144be796860235064b5127c2fcf62ccccd6689c259dcjnv2x\" 
(UID: \"b238ee1e-c43f-4eb7-8f69-de9f58747168\") " pod="openshift-marketplace/270996307cd21d144be796860235064b5127c2fcf62ccccd6689c259dcjnv2x" Jan 28 20:53:44 crc kubenswrapper[4746]: I0128 20:53:44.472906 4746 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/b238ee1e-c43f-4eb7-8f69-de9f58747168-util\") pod \"270996307cd21d144be796860235064b5127c2fcf62ccccd6689c259dcjnv2x\" (UID: \"b238ee1e-c43f-4eb7-8f69-de9f58747168\") " pod="openshift-marketplace/270996307cd21d144be796860235064b5127c2fcf62ccccd6689c259dcjnv2x" Jan 28 20:53:44 crc kubenswrapper[4746]: I0128 20:53:44.472968 4746 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/b238ee1e-c43f-4eb7-8f69-de9f58747168-bundle\") pod \"270996307cd21d144be796860235064b5127c2fcf62ccccd6689c259dcjnv2x\" (UID: \"b238ee1e-c43f-4eb7-8f69-de9f58747168\") " pod="openshift-marketplace/270996307cd21d144be796860235064b5127c2fcf62ccccd6689c259dcjnv2x" Jan 28 20:53:44 crc kubenswrapper[4746]: I0128 20:53:44.473061 4746 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-6jx6c\" (UniqueName: \"kubernetes.io/projected/b238ee1e-c43f-4eb7-8f69-de9f58747168-kube-api-access-6jx6c\") pod \"270996307cd21d144be796860235064b5127c2fcf62ccccd6689c259dcjnv2x\" (UID: \"b238ee1e-c43f-4eb7-8f69-de9f58747168\") " pod="openshift-marketplace/270996307cd21d144be796860235064b5127c2fcf62ccccd6689c259dcjnv2x" Jan 28 20:53:44 crc kubenswrapper[4746]: I0128 20:53:44.474251 4746 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/b238ee1e-c43f-4eb7-8f69-de9f58747168-util\") pod \"270996307cd21d144be796860235064b5127c2fcf62ccccd6689c259dcjnv2x\" (UID: \"b238ee1e-c43f-4eb7-8f69-de9f58747168\") " pod="openshift-marketplace/270996307cd21d144be796860235064b5127c2fcf62ccccd6689c259dcjnv2x" Jan 28 20:53:44 crc 
kubenswrapper[4746]: I0128 20:53:44.475002 4746 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/b238ee1e-c43f-4eb7-8f69-de9f58747168-bundle\") pod \"270996307cd21d144be796860235064b5127c2fcf62ccccd6689c259dcjnv2x\" (UID: \"b238ee1e-c43f-4eb7-8f69-de9f58747168\") " pod="openshift-marketplace/270996307cd21d144be796860235064b5127c2fcf62ccccd6689c259dcjnv2x" Jan 28 20:53:44 crc kubenswrapper[4746]: I0128 20:53:44.492880 4746 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-6jx6c\" (UniqueName: \"kubernetes.io/projected/b238ee1e-c43f-4eb7-8f69-de9f58747168-kube-api-access-6jx6c\") pod \"270996307cd21d144be796860235064b5127c2fcf62ccccd6689c259dcjnv2x\" (UID: \"b238ee1e-c43f-4eb7-8f69-de9f58747168\") " pod="openshift-marketplace/270996307cd21d144be796860235064b5127c2fcf62ccccd6689c259dcjnv2x" Jan 28 20:53:44 crc kubenswrapper[4746]: I0128 20:53:44.572667 4746 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/270996307cd21d144be796860235064b5127c2fcf62ccccd6689c259dcjnv2x" Jan 28 20:53:44 crc kubenswrapper[4746]: I0128 20:53:44.844182 4746 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="94cef654-afbe-42c2-8069-5dbcb7294abb" path="/var/lib/kubelet/pods/94cef654-afbe-42c2-8069-5dbcb7294abb/volumes" Jan 28 20:53:44 crc kubenswrapper[4746]: I0128 20:53:44.854140 4746 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/270996307cd21d144be796860235064b5127c2fcf62ccccd6689c259dcjnv2x"] Jan 28 20:53:45 crc kubenswrapper[4746]: I0128 20:53:45.840992 4746 generic.go:334] "Generic (PLEG): container finished" podID="b238ee1e-c43f-4eb7-8f69-de9f58747168" containerID="ab9baaa6c41488d568020f01b15df36f0584ccdf128d94e974044e6278c7b0e8" exitCode=0 Jan 28 20:53:45 crc kubenswrapper[4746]: I0128 20:53:45.841094 4746 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/270996307cd21d144be796860235064b5127c2fcf62ccccd6689c259dcjnv2x" event={"ID":"b238ee1e-c43f-4eb7-8f69-de9f58747168","Type":"ContainerDied","Data":"ab9baaa6c41488d568020f01b15df36f0584ccdf128d94e974044e6278c7b0e8"} Jan 28 20:53:45 crc kubenswrapper[4746]: I0128 20:53:45.841131 4746 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/270996307cd21d144be796860235064b5127c2fcf62ccccd6689c259dcjnv2x" event={"ID":"b238ee1e-c43f-4eb7-8f69-de9f58747168","Type":"ContainerStarted","Data":"be7ecb728f88111b16978db1a1f3e7cffbe9137f3e7395df51a5f2ac8f4422c7"} Jan 28 20:53:45 crc kubenswrapper[4746]: I0128 20:53:45.871285 4746 patch_prober.go:28] interesting pod/machine-config-daemon-6wrnw container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Jan 28 20:53:45 crc kubenswrapper[4746]: I0128 20:53:45.871374 4746 prober.go:107] "Probe 
failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-6wrnw" podUID="6dc8b546-9734-4082-b2b3-2bafe3f1564d" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Jan 28 20:53:45 crc kubenswrapper[4746]: I0128 20:53:45.871431 4746 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-6wrnw" Jan 28 20:53:45 crc kubenswrapper[4746]: I0128 20:53:45.872315 4746 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"4dbcdfa14610109c45d3514591f8d6ce15356b36ba815407076266ee1f95c6fd"} pod="openshift-machine-config-operator/machine-config-daemon-6wrnw" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted" Jan 28 20:53:45 crc kubenswrapper[4746]: I0128 20:53:45.872393 4746 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-6wrnw" podUID="6dc8b546-9734-4082-b2b3-2bafe3f1564d" containerName="machine-config-daemon" containerID="cri-o://4dbcdfa14610109c45d3514591f8d6ce15356b36ba815407076266ee1f95c6fd" gracePeriod=600 Jan 28 20:53:46 crc kubenswrapper[4746]: I0128 20:53:46.849952 4746 generic.go:334] "Generic (PLEG): container finished" podID="6dc8b546-9734-4082-b2b3-2bafe3f1564d" containerID="4dbcdfa14610109c45d3514591f8d6ce15356b36ba815407076266ee1f95c6fd" exitCode=0 Jan 28 20:53:46 crc kubenswrapper[4746]: I0128 20:53:46.850019 4746 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-6wrnw" event={"ID":"6dc8b546-9734-4082-b2b3-2bafe3f1564d","Type":"ContainerDied","Data":"4dbcdfa14610109c45d3514591f8d6ce15356b36ba815407076266ee1f95c6fd"} Jan 28 20:53:46 crc kubenswrapper[4746]: I0128 20:53:46.850973 4746 
kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-6wrnw" event={"ID":"6dc8b546-9734-4082-b2b3-2bafe3f1564d","Type":"ContainerStarted","Data":"635dfdb81316e9a80fdcd2f942f907e439906f4018e69db1be59f1c63b3c993e"} Jan 28 20:53:46 crc kubenswrapper[4746]: I0128 20:53:46.851010 4746 scope.go:117] "RemoveContainer" containerID="e798c8759f80b0bd2201227041fd066d21ab146038c8bedf3a5228a982d21b64" Jan 28 20:53:47 crc kubenswrapper[4746]: I0128 20:53:47.865222 4746 generic.go:334] "Generic (PLEG): container finished" podID="b238ee1e-c43f-4eb7-8f69-de9f58747168" containerID="7adc80a7ee8059791146a610d87b14bd20d01d01ab212d6b00f559ed77dc3783" exitCode=0 Jan 28 20:53:47 crc kubenswrapper[4746]: I0128 20:53:47.865291 4746 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/270996307cd21d144be796860235064b5127c2fcf62ccccd6689c259dcjnv2x" event={"ID":"b238ee1e-c43f-4eb7-8f69-de9f58747168","Type":"ContainerDied","Data":"7adc80a7ee8059791146a610d87b14bd20d01d01ab212d6b00f559ed77dc3783"} Jan 28 20:53:48 crc kubenswrapper[4746]: I0128 20:53:48.880264 4746 generic.go:334] "Generic (PLEG): container finished" podID="b238ee1e-c43f-4eb7-8f69-de9f58747168" containerID="3bc7e7dba9e235e26995911d0882def3fc37e4188832eb2c61cbf9a08948ab91" exitCode=0 Jan 28 20:53:48 crc kubenswrapper[4746]: I0128 20:53:48.880397 4746 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/270996307cd21d144be796860235064b5127c2fcf62ccccd6689c259dcjnv2x" event={"ID":"b238ee1e-c43f-4eb7-8f69-de9f58747168","Type":"ContainerDied","Data":"3bc7e7dba9e235e26995911d0882def3fc37e4188832eb2c61cbf9a08948ab91"} Jan 28 20:53:50 crc kubenswrapper[4746]: I0128 20:53:50.181096 4746 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/270996307cd21d144be796860235064b5127c2fcf62ccccd6689c259dcjnv2x" Jan 28 20:53:50 crc kubenswrapper[4746]: I0128 20:53:50.267779 4746 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/b238ee1e-c43f-4eb7-8f69-de9f58747168-util\") pod \"b238ee1e-c43f-4eb7-8f69-de9f58747168\" (UID: \"b238ee1e-c43f-4eb7-8f69-de9f58747168\") " Jan 28 20:53:50 crc kubenswrapper[4746]: I0128 20:53:50.267875 4746 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/b238ee1e-c43f-4eb7-8f69-de9f58747168-bundle\") pod \"b238ee1e-c43f-4eb7-8f69-de9f58747168\" (UID: \"b238ee1e-c43f-4eb7-8f69-de9f58747168\") " Jan 28 20:53:50 crc kubenswrapper[4746]: I0128 20:53:50.267960 4746 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-6jx6c\" (UniqueName: \"kubernetes.io/projected/b238ee1e-c43f-4eb7-8f69-de9f58747168-kube-api-access-6jx6c\") pod \"b238ee1e-c43f-4eb7-8f69-de9f58747168\" (UID: \"b238ee1e-c43f-4eb7-8f69-de9f58747168\") " Jan 28 20:53:50 crc kubenswrapper[4746]: I0128 20:53:50.268885 4746 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/b238ee1e-c43f-4eb7-8f69-de9f58747168-bundle" (OuterVolumeSpecName: "bundle") pod "b238ee1e-c43f-4eb7-8f69-de9f58747168" (UID: "b238ee1e-c43f-4eb7-8f69-de9f58747168"). InnerVolumeSpecName "bundle". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 28 20:53:50 crc kubenswrapper[4746]: I0128 20:53:50.275968 4746 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/b238ee1e-c43f-4eb7-8f69-de9f58747168-kube-api-access-6jx6c" (OuterVolumeSpecName: "kube-api-access-6jx6c") pod "b238ee1e-c43f-4eb7-8f69-de9f58747168" (UID: "b238ee1e-c43f-4eb7-8f69-de9f58747168"). InnerVolumeSpecName "kube-api-access-6jx6c". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 28 20:53:50 crc kubenswrapper[4746]: I0128 20:53:50.283872 4746 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/b238ee1e-c43f-4eb7-8f69-de9f58747168-util" (OuterVolumeSpecName: "util") pod "b238ee1e-c43f-4eb7-8f69-de9f58747168" (UID: "b238ee1e-c43f-4eb7-8f69-de9f58747168"). InnerVolumeSpecName "util". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 28 20:53:50 crc kubenswrapper[4746]: I0128 20:53:50.369901 4746 reconciler_common.go:293] "Volume detached for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/b238ee1e-c43f-4eb7-8f69-de9f58747168-util\") on node \"crc\" DevicePath \"\"" Jan 28 20:53:50 crc kubenswrapper[4746]: I0128 20:53:50.369942 4746 reconciler_common.go:293] "Volume detached for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/b238ee1e-c43f-4eb7-8f69-de9f58747168-bundle\") on node \"crc\" DevicePath \"\"" Jan 28 20:53:50 crc kubenswrapper[4746]: I0128 20:53:50.369953 4746 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-6jx6c\" (UniqueName: \"kubernetes.io/projected/b238ee1e-c43f-4eb7-8f69-de9f58747168-kube-api-access-6jx6c\") on node \"crc\" DevicePath \"\"" Jan 28 20:53:50 crc kubenswrapper[4746]: I0128 20:53:50.897087 4746 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/270996307cd21d144be796860235064b5127c2fcf62ccccd6689c259dcjnv2x" event={"ID":"b238ee1e-c43f-4eb7-8f69-de9f58747168","Type":"ContainerDied","Data":"be7ecb728f88111b16978db1a1f3e7cffbe9137f3e7395df51a5f2ac8f4422c7"} Jan 28 20:53:50 crc kubenswrapper[4746]: I0128 20:53:50.897156 4746 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="be7ecb728f88111b16978db1a1f3e7cffbe9137f3e7395df51a5f2ac8f4422c7" Jan 28 20:53:50 crc kubenswrapper[4746]: I0128 20:53:50.897105 4746 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/270996307cd21d144be796860235064b5127c2fcf62ccccd6689c259dcjnv2x" Jan 28 20:53:59 crc kubenswrapper[4746]: I0128 20:53:59.054706 4746 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["metallb-system/metallb-operator-controller-manager-5999cb5f6c-ndf7t"] Jan 28 20:53:59 crc kubenswrapper[4746]: E0128 20:53:59.055985 4746 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="b238ee1e-c43f-4eb7-8f69-de9f58747168" containerName="util" Jan 28 20:53:59 crc kubenswrapper[4746]: I0128 20:53:59.056010 4746 state_mem.go:107] "Deleted CPUSet assignment" podUID="b238ee1e-c43f-4eb7-8f69-de9f58747168" containerName="util" Jan 28 20:53:59 crc kubenswrapper[4746]: E0128 20:53:59.056041 4746 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="b238ee1e-c43f-4eb7-8f69-de9f58747168" containerName="extract" Jan 28 20:53:59 crc kubenswrapper[4746]: I0128 20:53:59.056052 4746 state_mem.go:107] "Deleted CPUSet assignment" podUID="b238ee1e-c43f-4eb7-8f69-de9f58747168" containerName="extract" Jan 28 20:53:59 crc kubenswrapper[4746]: E0128 20:53:59.056064 4746 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="b238ee1e-c43f-4eb7-8f69-de9f58747168" containerName="pull" Jan 28 20:53:59 crc kubenswrapper[4746]: I0128 20:53:59.056117 4746 state_mem.go:107] "Deleted CPUSet assignment" podUID="b238ee1e-c43f-4eb7-8f69-de9f58747168" containerName="pull" Jan 28 20:53:59 crc kubenswrapper[4746]: I0128 20:53:59.056271 4746 memory_manager.go:354] "RemoveStaleState removing state" podUID="b238ee1e-c43f-4eb7-8f69-de9f58747168" containerName="extract" Jan 28 20:53:59 crc kubenswrapper[4746]: I0128 20:53:59.056902 4746 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="metallb-system/metallb-operator-controller-manager-5999cb5f6c-ndf7t" Jan 28 20:53:59 crc kubenswrapper[4746]: I0128 20:53:59.060930 4746 reflector.go:368] Caches populated for *v1.Secret from object-"metallb-system"/"metallb-operator-controller-manager-service-cert" Jan 28 20:53:59 crc kubenswrapper[4746]: I0128 20:53:59.061000 4746 reflector.go:368] Caches populated for *v1.ConfigMap from object-"metallb-system"/"kube-root-ca.crt" Jan 28 20:53:59 crc kubenswrapper[4746]: I0128 20:53:59.061847 4746 reflector.go:368] Caches populated for *v1.Secret from object-"metallb-system"/"metallb-operator-webhook-server-cert" Jan 28 20:53:59 crc kubenswrapper[4746]: I0128 20:53:59.061891 4746 reflector.go:368] Caches populated for *v1.ConfigMap from object-"metallb-system"/"openshift-service-ca.crt" Jan 28 20:53:59 crc kubenswrapper[4746]: I0128 20:53:59.067683 4746 reflector.go:368] Caches populated for *v1.Secret from object-"metallb-system"/"manager-account-dockercfg-4l7h9" Jan 28 20:53:59 crc kubenswrapper[4746]: I0128 20:53:59.073806 4746 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["metallb-system/metallb-operator-controller-manager-5999cb5f6c-ndf7t"] Jan 28 20:53:59 crc kubenswrapper[4746]: I0128 20:53:59.227981 4746 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/eb7a3d58-a895-43a6-8f29-240cfb61ed98-webhook-cert\") pod \"metallb-operator-controller-manager-5999cb5f6c-ndf7t\" (UID: \"eb7a3d58-a895-43a6-8f29-240cfb61ed98\") " pod="metallb-system/metallb-operator-controller-manager-5999cb5f6c-ndf7t" Jan 28 20:53:59 crc kubenswrapper[4746]: I0128 20:53:59.228059 4746 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-kn4wv\" (UniqueName: \"kubernetes.io/projected/eb7a3d58-a895-43a6-8f29-240cfb61ed98-kube-api-access-kn4wv\") pod 
\"metallb-operator-controller-manager-5999cb5f6c-ndf7t\" (UID: \"eb7a3d58-a895-43a6-8f29-240cfb61ed98\") " pod="metallb-system/metallb-operator-controller-manager-5999cb5f6c-ndf7t" Jan 28 20:53:59 crc kubenswrapper[4746]: I0128 20:53:59.228157 4746 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/eb7a3d58-a895-43a6-8f29-240cfb61ed98-apiservice-cert\") pod \"metallb-operator-controller-manager-5999cb5f6c-ndf7t\" (UID: \"eb7a3d58-a895-43a6-8f29-240cfb61ed98\") " pod="metallb-system/metallb-operator-controller-manager-5999cb5f6c-ndf7t" Jan 28 20:53:59 crc kubenswrapper[4746]: I0128 20:53:59.316237 4746 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["metallb-system/metallb-operator-webhook-server-79f4bb6c4-wm9hh"] Jan 28 20:53:59 crc kubenswrapper[4746]: I0128 20:53:59.317343 4746 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="metallb-system/metallb-operator-webhook-server-79f4bb6c4-wm9hh" Jan 28 20:53:59 crc kubenswrapper[4746]: I0128 20:53:59.319825 4746 reflector.go:368] Caches populated for *v1.Secret from object-"metallb-system"/"metallb-webhook-cert" Jan 28 20:53:59 crc kubenswrapper[4746]: I0128 20:53:59.319850 4746 reflector.go:368] Caches populated for *v1.Secret from object-"metallb-system"/"controller-dockercfg-m4kkv" Jan 28 20:53:59 crc kubenswrapper[4746]: I0128 20:53:59.320512 4746 reflector.go:368] Caches populated for *v1.Secret from object-"metallb-system"/"metallb-operator-webhook-server-service-cert" Jan 28 20:53:59 crc kubenswrapper[4746]: I0128 20:53:59.329498 4746 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/eb7a3d58-a895-43a6-8f29-240cfb61ed98-webhook-cert\") pod \"metallb-operator-controller-manager-5999cb5f6c-ndf7t\" (UID: \"eb7a3d58-a895-43a6-8f29-240cfb61ed98\") " 
pod="metallb-system/metallb-operator-controller-manager-5999cb5f6c-ndf7t" Jan 28 20:53:59 crc kubenswrapper[4746]: I0128 20:53:59.329575 4746 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-kn4wv\" (UniqueName: \"kubernetes.io/projected/eb7a3d58-a895-43a6-8f29-240cfb61ed98-kube-api-access-kn4wv\") pod \"metallb-operator-controller-manager-5999cb5f6c-ndf7t\" (UID: \"eb7a3d58-a895-43a6-8f29-240cfb61ed98\") " pod="metallb-system/metallb-operator-controller-manager-5999cb5f6c-ndf7t" Jan 28 20:53:59 crc kubenswrapper[4746]: I0128 20:53:59.329612 4746 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/eb7a3d58-a895-43a6-8f29-240cfb61ed98-apiservice-cert\") pod \"metallb-operator-controller-manager-5999cb5f6c-ndf7t\" (UID: \"eb7a3d58-a895-43a6-8f29-240cfb61ed98\") " pod="metallb-system/metallb-operator-controller-manager-5999cb5f6c-ndf7t" Jan 28 20:53:59 crc kubenswrapper[4746]: I0128 20:53:59.343195 4746 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/eb7a3d58-a895-43a6-8f29-240cfb61ed98-apiservice-cert\") pod \"metallb-operator-controller-manager-5999cb5f6c-ndf7t\" (UID: \"eb7a3d58-a895-43a6-8f29-240cfb61ed98\") " pod="metallb-system/metallb-operator-controller-manager-5999cb5f6c-ndf7t" Jan 28 20:53:59 crc kubenswrapper[4746]: I0128 20:53:59.344242 4746 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["metallb-system/metallb-operator-webhook-server-79f4bb6c4-wm9hh"] Jan 28 20:53:59 crc kubenswrapper[4746]: I0128 20:53:59.359100 4746 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-kn4wv\" (UniqueName: \"kubernetes.io/projected/eb7a3d58-a895-43a6-8f29-240cfb61ed98-kube-api-access-kn4wv\") pod \"metallb-operator-controller-manager-5999cb5f6c-ndf7t\" (UID: \"eb7a3d58-a895-43a6-8f29-240cfb61ed98\") " 
pod="metallb-system/metallb-operator-controller-manager-5999cb5f6c-ndf7t" Jan 28 20:53:59 crc kubenswrapper[4746]: I0128 20:53:59.359586 4746 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/eb7a3d58-a895-43a6-8f29-240cfb61ed98-webhook-cert\") pod \"metallb-operator-controller-manager-5999cb5f6c-ndf7t\" (UID: \"eb7a3d58-a895-43a6-8f29-240cfb61ed98\") " pod="metallb-system/metallb-operator-controller-manager-5999cb5f6c-ndf7t" Jan 28 20:53:59 crc kubenswrapper[4746]: I0128 20:53:59.383538 4746 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="metallb-system/metallb-operator-controller-manager-5999cb5f6c-ndf7t" Jan 28 20:53:59 crc kubenswrapper[4746]: I0128 20:53:59.437644 4746 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/1d703849-bf20-4333-9213-23b52999ae43-apiservice-cert\") pod \"metallb-operator-webhook-server-79f4bb6c4-wm9hh\" (UID: \"1d703849-bf20-4333-9213-23b52999ae43\") " pod="metallb-system/metallb-operator-webhook-server-79f4bb6c4-wm9hh" Jan 28 20:53:59 crc kubenswrapper[4746]: I0128 20:53:59.437735 4746 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/1d703849-bf20-4333-9213-23b52999ae43-webhook-cert\") pod \"metallb-operator-webhook-server-79f4bb6c4-wm9hh\" (UID: \"1d703849-bf20-4333-9213-23b52999ae43\") " pod="metallb-system/metallb-operator-webhook-server-79f4bb6c4-wm9hh" Jan 28 20:53:59 crc kubenswrapper[4746]: I0128 20:53:59.437770 4746 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-h756f\" (UniqueName: \"kubernetes.io/projected/1d703849-bf20-4333-9213-23b52999ae43-kube-api-access-h756f\") pod \"metallb-operator-webhook-server-79f4bb6c4-wm9hh\" (UID: 
\"1d703849-bf20-4333-9213-23b52999ae43\") " pod="metallb-system/metallb-operator-webhook-server-79f4bb6c4-wm9hh" Jan 28 20:53:59 crc kubenswrapper[4746]: I0128 20:53:59.540460 4746 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/1d703849-bf20-4333-9213-23b52999ae43-apiservice-cert\") pod \"metallb-operator-webhook-server-79f4bb6c4-wm9hh\" (UID: \"1d703849-bf20-4333-9213-23b52999ae43\") " pod="metallb-system/metallb-operator-webhook-server-79f4bb6c4-wm9hh" Jan 28 20:53:59 crc kubenswrapper[4746]: I0128 20:53:59.541065 4746 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/1d703849-bf20-4333-9213-23b52999ae43-webhook-cert\") pod \"metallb-operator-webhook-server-79f4bb6c4-wm9hh\" (UID: \"1d703849-bf20-4333-9213-23b52999ae43\") " pod="metallb-system/metallb-operator-webhook-server-79f4bb6c4-wm9hh" Jan 28 20:53:59 crc kubenswrapper[4746]: I0128 20:53:59.541178 4746 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-h756f\" (UniqueName: \"kubernetes.io/projected/1d703849-bf20-4333-9213-23b52999ae43-kube-api-access-h756f\") pod \"metallb-operator-webhook-server-79f4bb6c4-wm9hh\" (UID: \"1d703849-bf20-4333-9213-23b52999ae43\") " pod="metallb-system/metallb-operator-webhook-server-79f4bb6c4-wm9hh" Jan 28 20:53:59 crc kubenswrapper[4746]: I0128 20:53:59.547190 4746 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/1d703849-bf20-4333-9213-23b52999ae43-apiservice-cert\") pod \"metallb-operator-webhook-server-79f4bb6c4-wm9hh\" (UID: \"1d703849-bf20-4333-9213-23b52999ae43\") " pod="metallb-system/metallb-operator-webhook-server-79f4bb6c4-wm9hh" Jan 28 20:53:59 crc kubenswrapper[4746]: I0128 20:53:59.559149 4746 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"webhook-cert\" 
(UniqueName: \"kubernetes.io/secret/1d703849-bf20-4333-9213-23b52999ae43-webhook-cert\") pod \"metallb-operator-webhook-server-79f4bb6c4-wm9hh\" (UID: \"1d703849-bf20-4333-9213-23b52999ae43\") " pod="metallb-system/metallb-operator-webhook-server-79f4bb6c4-wm9hh" Jan 28 20:53:59 crc kubenswrapper[4746]: I0128 20:53:59.563006 4746 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-h756f\" (UniqueName: \"kubernetes.io/projected/1d703849-bf20-4333-9213-23b52999ae43-kube-api-access-h756f\") pod \"metallb-operator-webhook-server-79f4bb6c4-wm9hh\" (UID: \"1d703849-bf20-4333-9213-23b52999ae43\") " pod="metallb-system/metallb-operator-webhook-server-79f4bb6c4-wm9hh" Jan 28 20:53:59 crc kubenswrapper[4746]: I0128 20:53:59.635922 4746 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="metallb-system/metallb-operator-webhook-server-79f4bb6c4-wm9hh" Jan 28 20:53:59 crc kubenswrapper[4746]: I0128 20:53:59.695300 4746 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["metallb-system/metallb-operator-controller-manager-5999cb5f6c-ndf7t"] Jan 28 20:53:59 crc kubenswrapper[4746]: I0128 20:53:59.960002 4746 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/metallb-operator-controller-manager-5999cb5f6c-ndf7t" event={"ID":"eb7a3d58-a895-43a6-8f29-240cfb61ed98","Type":"ContainerStarted","Data":"d479430ee9249460fa52700943b81c59677f83adfae169b067b41fe62a1e1634"} Jan 28 20:53:59 crc kubenswrapper[4746]: I0128 20:53:59.989736 4746 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["metallb-system/metallb-operator-webhook-server-79f4bb6c4-wm9hh"] Jan 28 20:53:59 crc kubenswrapper[4746]: W0128 20:53:59.992601 4746 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod1d703849_bf20_4333_9213_23b52999ae43.slice/crio-30bd60d689918536b6c2657793479f19e3daa972bb67686c1e6f742df5ddcd88 WatchSource:0}: Error finding container 
30bd60d689918536b6c2657793479f19e3daa972bb67686c1e6f742df5ddcd88: Status 404 returned error can't find the container with id 30bd60d689918536b6c2657793479f19e3daa972bb67686c1e6f742df5ddcd88 Jan 28 20:54:00 crc kubenswrapper[4746]: I0128 20:54:00.967618 4746 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/metallb-operator-webhook-server-79f4bb6c4-wm9hh" event={"ID":"1d703849-bf20-4333-9213-23b52999ae43","Type":"ContainerStarted","Data":"30bd60d689918536b6c2657793479f19e3daa972bb67686c1e6f742df5ddcd88"} Jan 28 20:54:06 crc kubenswrapper[4746]: I0128 20:54:06.034405 4746 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/metallb-operator-controller-manager-5999cb5f6c-ndf7t" event={"ID":"eb7a3d58-a895-43a6-8f29-240cfb61ed98","Type":"ContainerStarted","Data":"acd1b48b55901c11ce5162653acc7ed10e4891613516cd6a3f267c4fe5b905b9"} Jan 28 20:54:06 crc kubenswrapper[4746]: I0128 20:54:06.036759 4746 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="metallb-system/metallb-operator-webhook-server-79f4bb6c4-wm9hh" Jan 28 20:54:06 crc kubenswrapper[4746]: I0128 20:54:06.058757 4746 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="metallb-system/metallb-operator-webhook-server-79f4bb6c4-wm9hh" podStartSLOduration=1.203693627 podStartE2EDuration="7.058732363s" podCreationTimestamp="2026-01-28 20:53:59 +0000 UTC" firstStartedPulling="2026-01-28 20:53:59.995975191 +0000 UTC m=+867.952161555" lastFinishedPulling="2026-01-28 20:54:05.851013947 +0000 UTC m=+873.807200291" observedRunningTime="2026-01-28 20:54:06.055444204 +0000 UTC m=+874.011630568" watchObservedRunningTime="2026-01-28 20:54:06.058732363 +0000 UTC m=+874.014918717" Jan 28 20:54:07 crc kubenswrapper[4746]: I0128 20:54:07.045498 4746 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/metallb-operator-webhook-server-79f4bb6c4-wm9hh" 
event={"ID":"1d703849-bf20-4333-9213-23b52999ae43","Type":"ContainerStarted","Data":"93f4ffac4a0783443e505cb4fa9f5da3edfe602f81ef838a562d4d3d38a4a0c2"} Jan 28 20:54:07 crc kubenswrapper[4746]: I0128 20:54:07.046123 4746 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="metallb-system/metallb-operator-controller-manager-5999cb5f6c-ndf7t" Jan 28 20:54:07 crc kubenswrapper[4746]: I0128 20:54:07.076614 4746 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="metallb-system/metallb-operator-controller-manager-5999cb5f6c-ndf7t" podStartSLOduration=1.9513952030000001 podStartE2EDuration="8.076585986s" podCreationTimestamp="2026-01-28 20:53:59 +0000 UTC" firstStartedPulling="2026-01-28 20:53:59.70714572 +0000 UTC m=+867.663332074" lastFinishedPulling="2026-01-28 20:54:05.832336503 +0000 UTC m=+873.788522857" observedRunningTime="2026-01-28 20:54:07.073534614 +0000 UTC m=+875.029720988" watchObservedRunningTime="2026-01-28 20:54:07.076585986 +0000 UTC m=+875.032772350" Jan 28 20:54:19 crc kubenswrapper[4746]: I0128 20:54:19.644937 4746 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="metallb-system/metallb-operator-webhook-server-79f4bb6c4-wm9hh" Jan 28 20:54:31 crc kubenswrapper[4746]: I0128 20:54:31.132317 4746 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-marketplace-fcx4p"] Jan 28 20:54:31 crc kubenswrapper[4746]: I0128 20:54:31.135610 4746 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-fcx4p" Jan 28 20:54:31 crc kubenswrapper[4746]: I0128 20:54:31.156183 4746 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-fcx4p"] Jan 28 20:54:31 crc kubenswrapper[4746]: I0128 20:54:31.257899 4746 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/8b9535cd-70dd-4373-8309-0ea6ad3dfd34-utilities\") pod \"redhat-marketplace-fcx4p\" (UID: \"8b9535cd-70dd-4373-8309-0ea6ad3dfd34\") " pod="openshift-marketplace/redhat-marketplace-fcx4p" Jan 28 20:54:31 crc kubenswrapper[4746]: I0128 20:54:31.258333 4746 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/8b9535cd-70dd-4373-8309-0ea6ad3dfd34-catalog-content\") pod \"redhat-marketplace-fcx4p\" (UID: \"8b9535cd-70dd-4373-8309-0ea6ad3dfd34\") " pod="openshift-marketplace/redhat-marketplace-fcx4p" Jan 28 20:54:31 crc kubenswrapper[4746]: I0128 20:54:31.258386 4746 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-mq7r2\" (UniqueName: \"kubernetes.io/projected/8b9535cd-70dd-4373-8309-0ea6ad3dfd34-kube-api-access-mq7r2\") pod \"redhat-marketplace-fcx4p\" (UID: \"8b9535cd-70dd-4373-8309-0ea6ad3dfd34\") " pod="openshift-marketplace/redhat-marketplace-fcx4p" Jan 28 20:54:31 crc kubenswrapper[4746]: I0128 20:54:31.360983 4746 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/8b9535cd-70dd-4373-8309-0ea6ad3dfd34-catalog-content\") pod \"redhat-marketplace-fcx4p\" (UID: \"8b9535cd-70dd-4373-8309-0ea6ad3dfd34\") " pod="openshift-marketplace/redhat-marketplace-fcx4p" Jan 28 20:54:31 crc kubenswrapper[4746]: I0128 20:54:31.360060 4746 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/8b9535cd-70dd-4373-8309-0ea6ad3dfd34-catalog-content\") pod \"redhat-marketplace-fcx4p\" (UID: \"8b9535cd-70dd-4373-8309-0ea6ad3dfd34\") " pod="openshift-marketplace/redhat-marketplace-fcx4p" Jan 28 20:54:31 crc kubenswrapper[4746]: I0128 20:54:31.361249 4746 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-mq7r2\" (UniqueName: \"kubernetes.io/projected/8b9535cd-70dd-4373-8309-0ea6ad3dfd34-kube-api-access-mq7r2\") pod \"redhat-marketplace-fcx4p\" (UID: \"8b9535cd-70dd-4373-8309-0ea6ad3dfd34\") " pod="openshift-marketplace/redhat-marketplace-fcx4p" Jan 28 20:54:31 crc kubenswrapper[4746]: I0128 20:54:31.361380 4746 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/8b9535cd-70dd-4373-8309-0ea6ad3dfd34-utilities\") pod \"redhat-marketplace-fcx4p\" (UID: \"8b9535cd-70dd-4373-8309-0ea6ad3dfd34\") " pod="openshift-marketplace/redhat-marketplace-fcx4p" Jan 28 20:54:31 crc kubenswrapper[4746]: I0128 20:54:31.361910 4746 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/8b9535cd-70dd-4373-8309-0ea6ad3dfd34-utilities\") pod \"redhat-marketplace-fcx4p\" (UID: \"8b9535cd-70dd-4373-8309-0ea6ad3dfd34\") " pod="openshift-marketplace/redhat-marketplace-fcx4p" Jan 28 20:54:31 crc kubenswrapper[4746]: I0128 20:54:31.389334 4746 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-mq7r2\" (UniqueName: \"kubernetes.io/projected/8b9535cd-70dd-4373-8309-0ea6ad3dfd34-kube-api-access-mq7r2\") pod \"redhat-marketplace-fcx4p\" (UID: \"8b9535cd-70dd-4373-8309-0ea6ad3dfd34\") " pod="openshift-marketplace/redhat-marketplace-fcx4p" Jan 28 20:54:31 crc kubenswrapper[4746]: I0128 20:54:31.467288 4746 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-fcx4p" Jan 28 20:54:31 crc kubenswrapper[4746]: I0128 20:54:31.951546 4746 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-fcx4p"] Jan 28 20:54:32 crc kubenswrapper[4746]: I0128 20:54:32.236862 4746 generic.go:334] "Generic (PLEG): container finished" podID="8b9535cd-70dd-4373-8309-0ea6ad3dfd34" containerID="cfb8b48d9d7b3295b816d461d4d5539e4ac32cab1dcc7144051ee26877167ff6" exitCode=0 Jan 28 20:54:32 crc kubenswrapper[4746]: I0128 20:54:32.236928 4746 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-fcx4p" event={"ID":"8b9535cd-70dd-4373-8309-0ea6ad3dfd34","Type":"ContainerDied","Data":"cfb8b48d9d7b3295b816d461d4d5539e4ac32cab1dcc7144051ee26877167ff6"} Jan 28 20:54:32 crc kubenswrapper[4746]: I0128 20:54:32.237317 4746 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-fcx4p" event={"ID":"8b9535cd-70dd-4373-8309-0ea6ad3dfd34","Type":"ContainerStarted","Data":"e67c9e8283f130c19b441b963a01e60d4a9e6c53dfed4fb3d66fbc7def2b6ee8"} Jan 28 20:54:33 crc kubenswrapper[4746]: I0128 20:54:33.245974 4746 generic.go:334] "Generic (PLEG): container finished" podID="8b9535cd-70dd-4373-8309-0ea6ad3dfd34" containerID="1d4a38770efc0618437a5f55f98c709e9453d6d18b2e61ae677f2d7358c3da12" exitCode=0 Jan 28 20:54:33 crc kubenswrapper[4746]: I0128 20:54:33.246036 4746 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-fcx4p" event={"ID":"8b9535cd-70dd-4373-8309-0ea6ad3dfd34","Type":"ContainerDied","Data":"1d4a38770efc0618437a5f55f98c709e9453d6d18b2e61ae677f2d7358c3da12"} Jan 28 20:54:34 crc kubenswrapper[4746]: I0128 20:54:34.258793 4746 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-fcx4p" 
event={"ID":"8b9535cd-70dd-4373-8309-0ea6ad3dfd34","Type":"ContainerStarted","Data":"9329f83f85b1cbdac37c069f9697ac7193e757b7400086d07685152d4c2d6cd8"} Jan 28 20:54:34 crc kubenswrapper[4746]: I0128 20:54:34.295190 4746 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-marketplace-fcx4p" podStartSLOduration=1.875713952 podStartE2EDuration="3.295161944s" podCreationTimestamp="2026-01-28 20:54:31 +0000 UTC" firstStartedPulling="2026-01-28 20:54:32.239493881 +0000 UTC m=+900.195680235" lastFinishedPulling="2026-01-28 20:54:33.658941863 +0000 UTC m=+901.615128227" observedRunningTime="2026-01-28 20:54:34.286677036 +0000 UTC m=+902.242863440" watchObservedRunningTime="2026-01-28 20:54:34.295161944 +0000 UTC m=+902.251348338" Jan 28 20:54:35 crc kubenswrapper[4746]: I0128 20:54:35.715984 4746 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/certified-operators-n5r4d"] Jan 28 20:54:35 crc kubenswrapper[4746]: I0128 20:54:35.717858 4746 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-n5r4d" Jan 28 20:54:35 crc kubenswrapper[4746]: I0128 20:54:35.734886 4746 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-n5r4d"] Jan 28 20:54:35 crc kubenswrapper[4746]: I0128 20:54:35.858068 4746 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-9cqzq\" (UniqueName: \"kubernetes.io/projected/f872cd48-4b0a-45b8-95de-1b42589d574a-kube-api-access-9cqzq\") pod \"certified-operators-n5r4d\" (UID: \"f872cd48-4b0a-45b8-95de-1b42589d574a\") " pod="openshift-marketplace/certified-operators-n5r4d" Jan 28 20:54:35 crc kubenswrapper[4746]: I0128 20:54:35.858200 4746 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/f872cd48-4b0a-45b8-95de-1b42589d574a-catalog-content\") pod \"certified-operators-n5r4d\" (UID: \"f872cd48-4b0a-45b8-95de-1b42589d574a\") " pod="openshift-marketplace/certified-operators-n5r4d" Jan 28 20:54:35 crc kubenswrapper[4746]: I0128 20:54:35.858273 4746 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/f872cd48-4b0a-45b8-95de-1b42589d574a-utilities\") pod \"certified-operators-n5r4d\" (UID: \"f872cd48-4b0a-45b8-95de-1b42589d574a\") " pod="openshift-marketplace/certified-operators-n5r4d" Jan 28 20:54:35 crc kubenswrapper[4746]: I0128 20:54:35.959818 4746 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-9cqzq\" (UniqueName: \"kubernetes.io/projected/f872cd48-4b0a-45b8-95de-1b42589d574a-kube-api-access-9cqzq\") pod \"certified-operators-n5r4d\" (UID: \"f872cd48-4b0a-45b8-95de-1b42589d574a\") " pod="openshift-marketplace/certified-operators-n5r4d" Jan 28 20:54:35 crc kubenswrapper[4746]: I0128 20:54:35.959895 4746 
reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/f872cd48-4b0a-45b8-95de-1b42589d574a-catalog-content\") pod \"certified-operators-n5r4d\" (UID: \"f872cd48-4b0a-45b8-95de-1b42589d574a\") " pod="openshift-marketplace/certified-operators-n5r4d" Jan 28 20:54:35 crc kubenswrapper[4746]: I0128 20:54:35.959982 4746 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/f872cd48-4b0a-45b8-95de-1b42589d574a-utilities\") pod \"certified-operators-n5r4d\" (UID: \"f872cd48-4b0a-45b8-95de-1b42589d574a\") " pod="openshift-marketplace/certified-operators-n5r4d" Jan 28 20:54:35 crc kubenswrapper[4746]: I0128 20:54:35.960628 4746 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/f872cd48-4b0a-45b8-95de-1b42589d574a-catalog-content\") pod \"certified-operators-n5r4d\" (UID: \"f872cd48-4b0a-45b8-95de-1b42589d574a\") " pod="openshift-marketplace/certified-operators-n5r4d" Jan 28 20:54:35 crc kubenswrapper[4746]: I0128 20:54:35.960759 4746 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/f872cd48-4b0a-45b8-95de-1b42589d574a-utilities\") pod \"certified-operators-n5r4d\" (UID: \"f872cd48-4b0a-45b8-95de-1b42589d574a\") " pod="openshift-marketplace/certified-operators-n5r4d" Jan 28 20:54:35 crc kubenswrapper[4746]: I0128 20:54:35.989155 4746 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-9cqzq\" (UniqueName: \"kubernetes.io/projected/f872cd48-4b0a-45b8-95de-1b42589d574a-kube-api-access-9cqzq\") pod \"certified-operators-n5r4d\" (UID: \"f872cd48-4b0a-45b8-95de-1b42589d574a\") " pod="openshift-marketplace/certified-operators-n5r4d" Jan 28 20:54:36 crc kubenswrapper[4746]: I0128 20:54:36.045026 4746 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-n5r4d" Jan 28 20:54:36 crc kubenswrapper[4746]: I0128 20:54:36.624470 4746 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-n5r4d"] Jan 28 20:54:36 crc kubenswrapper[4746]: W0128 20:54:36.632597 4746 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podf872cd48_4b0a_45b8_95de_1b42589d574a.slice/crio-1f480bc6be7af1de3ce586cb2ab77875d80871461d725300cddef8257bcec9b9 WatchSource:0}: Error finding container 1f480bc6be7af1de3ce586cb2ab77875d80871461d725300cddef8257bcec9b9: Status 404 returned error can't find the container with id 1f480bc6be7af1de3ce586cb2ab77875d80871461d725300cddef8257bcec9b9 Jan 28 20:54:37 crc kubenswrapper[4746]: I0128 20:54:37.282140 4746 generic.go:334] "Generic (PLEG): container finished" podID="f872cd48-4b0a-45b8-95de-1b42589d574a" containerID="926cd735fd62c33dfd234c154769d997132dcbb79c6e700b487194684268ca15" exitCode=0 Jan 28 20:54:37 crc kubenswrapper[4746]: I0128 20:54:37.282632 4746 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-n5r4d" event={"ID":"f872cd48-4b0a-45b8-95de-1b42589d574a","Type":"ContainerDied","Data":"926cd735fd62c33dfd234c154769d997132dcbb79c6e700b487194684268ca15"} Jan 28 20:54:37 crc kubenswrapper[4746]: I0128 20:54:37.282668 4746 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-n5r4d" event={"ID":"f872cd48-4b0a-45b8-95de-1b42589d574a","Type":"ContainerStarted","Data":"1f480bc6be7af1de3ce586cb2ab77875d80871461d725300cddef8257bcec9b9"} Jan 28 20:54:38 crc kubenswrapper[4746]: I0128 20:54:38.291611 4746 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-n5r4d" 
event={"ID":"f872cd48-4b0a-45b8-95de-1b42589d574a","Type":"ContainerStarted","Data":"a02a844a7d12231993293fde0d2effffe4998f25c384c23f1f265f3bd8483b65"} Jan 28 20:54:39 crc kubenswrapper[4746]: I0128 20:54:39.300257 4746 generic.go:334] "Generic (PLEG): container finished" podID="f872cd48-4b0a-45b8-95de-1b42589d574a" containerID="a02a844a7d12231993293fde0d2effffe4998f25c384c23f1f265f3bd8483b65" exitCode=0 Jan 28 20:54:39 crc kubenswrapper[4746]: I0128 20:54:39.300310 4746 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-n5r4d" event={"ID":"f872cd48-4b0a-45b8-95de-1b42589d574a","Type":"ContainerDied","Data":"a02a844a7d12231993293fde0d2effffe4998f25c384c23f1f265f3bd8483b65"} Jan 28 20:54:39 crc kubenswrapper[4746]: I0128 20:54:39.389103 4746 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="metallb-system/metallb-operator-controller-manager-5999cb5f6c-ndf7t" Jan 28 20:54:40 crc kubenswrapper[4746]: I0128 20:54:40.469218 4746 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["metallb-system/frr-k8s-hws9w"] Jan 28 20:54:40 crc kubenswrapper[4746]: I0128 20:54:40.472248 4746 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="metallb-system/frr-k8s-hws9w" Jan 28 20:54:40 crc kubenswrapper[4746]: I0128 20:54:40.474036 4746 reflector.go:368] Caches populated for *v1.Secret from object-"metallb-system"/"frr-k8s-daemon-dockercfg-9ng96" Jan 28 20:54:40 crc kubenswrapper[4746]: I0128 20:54:40.475259 4746 reflector.go:368] Caches populated for *v1.ConfigMap from object-"metallb-system"/"frr-startup" Jan 28 20:54:40 crc kubenswrapper[4746]: I0128 20:54:40.478215 4746 reflector.go:368] Caches populated for *v1.Secret from object-"metallb-system"/"frr-k8s-certs-secret" Jan 28 20:54:40 crc kubenswrapper[4746]: I0128 20:54:40.485349 4746 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["metallb-system/frr-k8s-webhook-server-7df86c4f6c-5crvf"] Jan 28 20:54:40 crc kubenswrapper[4746]: I0128 20:54:40.486565 4746 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="metallb-system/frr-k8s-webhook-server-7df86c4f6c-5crvf" Jan 28 20:54:40 crc kubenswrapper[4746]: I0128 20:54:40.488610 4746 reflector.go:368] Caches populated for *v1.Secret from object-"metallb-system"/"frr-k8s-webhook-server-cert" Jan 28 20:54:40 crc kubenswrapper[4746]: I0128 20:54:40.499160 4746 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["metallb-system/frr-k8s-webhook-server-7df86c4f6c-5crvf"] Jan 28 20:54:40 crc kubenswrapper[4746]: I0128 20:54:40.543514 4746 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"frr-conf\" (UniqueName: \"kubernetes.io/empty-dir/be75902a-e591-4378-89b8-9cab1f53dc5f-frr-conf\") pod \"frr-k8s-hws9w\" (UID: \"be75902a-e591-4378-89b8-9cab1f53dc5f\") " pod="metallb-system/frr-k8s-hws9w" Jan 28 20:54:40 crc kubenswrapper[4746]: I0128 20:54:40.543587 4746 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"frr-sockets\" (UniqueName: \"kubernetes.io/empty-dir/be75902a-e591-4378-89b8-9cab1f53dc5f-frr-sockets\") pod \"frr-k8s-hws9w\" (UID: 
\"be75902a-e591-4378-89b8-9cab1f53dc5f\") " pod="metallb-system/frr-k8s-hws9w" Jan 28 20:54:40 crc kubenswrapper[4746]: I0128 20:54:40.543621 4746 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics\" (UniqueName: \"kubernetes.io/empty-dir/be75902a-e591-4378-89b8-9cab1f53dc5f-metrics\") pod \"frr-k8s-hws9w\" (UID: \"be75902a-e591-4378-89b8-9cab1f53dc5f\") " pod="metallb-system/frr-k8s-hws9w" Jan 28 20:54:40 crc kubenswrapper[4746]: I0128 20:54:40.543644 4746 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"reloader\" (UniqueName: \"kubernetes.io/empty-dir/be75902a-e591-4378-89b8-9cab1f53dc5f-reloader\") pod \"frr-k8s-hws9w\" (UID: \"be75902a-e591-4378-89b8-9cab1f53dc5f\") " pod="metallb-system/frr-k8s-hws9w" Jan 28 20:54:40 crc kubenswrapper[4746]: I0128 20:54:40.543676 4746 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-swm5f\" (UniqueName: \"kubernetes.io/projected/be75902a-e591-4378-89b8-9cab1f53dc5f-kube-api-access-swm5f\") pod \"frr-k8s-hws9w\" (UID: \"be75902a-e591-4378-89b8-9cab1f53dc5f\") " pod="metallb-system/frr-k8s-hws9w" Jan 28 20:54:40 crc kubenswrapper[4746]: I0128 20:54:40.543995 4746 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"frr-startup\" (UniqueName: \"kubernetes.io/configmap/be75902a-e591-4378-89b8-9cab1f53dc5f-frr-startup\") pod \"frr-k8s-hws9w\" (UID: \"be75902a-e591-4378-89b8-9cab1f53dc5f\") " pod="metallb-system/frr-k8s-hws9w" Jan 28 20:54:40 crc kubenswrapper[4746]: I0128 20:54:40.544121 4746 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/be75902a-e591-4378-89b8-9cab1f53dc5f-metrics-certs\") pod \"frr-k8s-hws9w\" (UID: \"be75902a-e591-4378-89b8-9cab1f53dc5f\") " pod="metallb-system/frr-k8s-hws9w" Jan 28 20:54:40 crc 
kubenswrapper[4746]: I0128 20:54:40.544167 4746 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-m4ljp\" (UniqueName: \"kubernetes.io/projected/64704f76-28dc-42cf-a696-9473b337eee9-kube-api-access-m4ljp\") pod \"frr-k8s-webhook-server-7df86c4f6c-5crvf\" (UID: \"64704f76-28dc-42cf-a696-9473b337eee9\") " pod="metallb-system/frr-k8s-webhook-server-7df86c4f6c-5crvf" Jan 28 20:54:40 crc kubenswrapper[4746]: I0128 20:54:40.544331 4746 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/64704f76-28dc-42cf-a696-9473b337eee9-cert\") pod \"frr-k8s-webhook-server-7df86c4f6c-5crvf\" (UID: \"64704f76-28dc-42cf-a696-9473b337eee9\") " pod="metallb-system/frr-k8s-webhook-server-7df86c4f6c-5crvf" Jan 28 20:54:40 crc kubenswrapper[4746]: I0128 20:54:40.600285 4746 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["metallb-system/speaker-m55jn"] Jan 28 20:54:40 crc kubenswrapper[4746]: I0128 20:54:40.601801 4746 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="metallb-system/speaker-m55jn"
Jan 28 20:54:40 crc kubenswrapper[4746]: I0128 20:54:40.605747 4746 reflector.go:368] Caches populated for *v1.Secret from object-"metallb-system"/"speaker-dockercfg-nz4q8"
Jan 28 20:54:40 crc kubenswrapper[4746]: I0128 20:54:40.605948 4746 reflector.go:368] Caches populated for *v1.ConfigMap from object-"metallb-system"/"metallb-excludel2"
Jan 28 20:54:40 crc kubenswrapper[4746]: I0128 20:54:40.606096 4746 reflector.go:368] Caches populated for *v1.Secret from object-"metallb-system"/"metallb-memberlist"
Jan 28 20:54:40 crc kubenswrapper[4746]: I0128 20:54:40.613287 4746 reflector.go:368] Caches populated for *v1.Secret from object-"metallb-system"/"speaker-certs-secret"
Jan 28 20:54:40 crc kubenswrapper[4746]: I0128 20:54:40.618664 4746 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["metallb-system/controller-6968d8fdc4-r2vlm"]
Jan 28 20:54:40 crc kubenswrapper[4746]: I0128 20:54:40.619805 4746 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="metallb-system/controller-6968d8fdc4-r2vlm"
Jan 28 20:54:40 crc kubenswrapper[4746]: I0128 20:54:40.622544 4746 reflector.go:368] Caches populated for *v1.Secret from object-"metallb-system"/"controller-certs-secret"
Jan 28 20:54:40 crc kubenswrapper[4746]: I0128 20:54:40.640975 4746 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["metallb-system/controller-6968d8fdc4-r2vlm"]
Jan 28 20:54:40 crc kubenswrapper[4746]: I0128 20:54:40.645447 4746 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"frr-sockets\" (UniqueName: \"kubernetes.io/empty-dir/be75902a-e591-4378-89b8-9cab1f53dc5f-frr-sockets\") pod \"frr-k8s-hws9w\" (UID: \"be75902a-e591-4378-89b8-9cab1f53dc5f\") " pod="metallb-system/frr-k8s-hws9w"
Jan 28 20:54:40 crc kubenswrapper[4746]: I0128 20:54:40.645518 4746 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics\" (UniqueName: \"kubernetes.io/empty-dir/be75902a-e591-4378-89b8-9cab1f53dc5f-metrics\") pod \"frr-k8s-hws9w\" (UID: \"be75902a-e591-4378-89b8-9cab1f53dc5f\") " pod="metallb-system/frr-k8s-hws9w"
Jan 28 20:54:40 crc kubenswrapper[4746]: I0128 20:54:40.645560 4746 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metallb-excludel2\" (UniqueName: \"kubernetes.io/configmap/c3d285c6-0abf-4c0b-92f5-1c91659d1de1-metallb-excludel2\") pod \"speaker-m55jn\" (UID: \"c3d285c6-0abf-4c0b-92f5-1c91659d1de1\") " pod="metallb-system/speaker-m55jn"
Jan 28 20:54:40 crc kubenswrapper[4746]: I0128 20:54:40.645586 4746 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"reloader\" (UniqueName: \"kubernetes.io/empty-dir/be75902a-e591-4378-89b8-9cab1f53dc5f-reloader\") pod \"frr-k8s-hws9w\" (UID: \"be75902a-e591-4378-89b8-9cab1f53dc5f\") " pod="metallb-system/frr-k8s-hws9w"
Jan 28 20:54:40 crc kubenswrapper[4746]: I0128 20:54:40.645622 4746 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-cm6zl\" (UniqueName: \"kubernetes.io/projected/c3d285c6-0abf-4c0b-92f5-1c91659d1de1-kube-api-access-cm6zl\") pod \"speaker-m55jn\" (UID: \"c3d285c6-0abf-4c0b-92f5-1c91659d1de1\") " pod="metallb-system/speaker-m55jn"
Jan 28 20:54:40 crc kubenswrapper[4746]: I0128 20:54:40.645651 4746 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-swm5f\" (UniqueName: \"kubernetes.io/projected/be75902a-e591-4378-89b8-9cab1f53dc5f-kube-api-access-swm5f\") pod \"frr-k8s-hws9w\" (UID: \"be75902a-e591-4378-89b8-9cab1f53dc5f\") " pod="metallb-system/frr-k8s-hws9w"
Jan 28 20:54:40 crc kubenswrapper[4746]: I0128 20:54:40.645697 4746 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"frr-startup\" (UniqueName: \"kubernetes.io/configmap/be75902a-e591-4378-89b8-9cab1f53dc5f-frr-startup\") pod \"frr-k8s-hws9w\" (UID: \"be75902a-e591-4378-89b8-9cab1f53dc5f\") " pod="metallb-system/frr-k8s-hws9w"
Jan 28 20:54:40 crc kubenswrapper[4746]: I0128 20:54:40.645730 4746 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/c3d285c6-0abf-4c0b-92f5-1c91659d1de1-metrics-certs\") pod \"speaker-m55jn\" (UID: \"c3d285c6-0abf-4c0b-92f5-1c91659d1de1\") " pod="metallb-system/speaker-m55jn"
Jan 28 20:54:40 crc kubenswrapper[4746]: I0128 20:54:40.645756 4746 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/be75902a-e591-4378-89b8-9cab1f53dc5f-metrics-certs\") pod \"frr-k8s-hws9w\" (UID: \"be75902a-e591-4378-89b8-9cab1f53dc5f\") " pod="metallb-system/frr-k8s-hws9w"
Jan 28 20:54:40 crc kubenswrapper[4746]: I0128 20:54:40.645788 4746 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-m4ljp\" (UniqueName: \"kubernetes.io/projected/64704f76-28dc-42cf-a696-9473b337eee9-kube-api-access-m4ljp\") pod \"frr-k8s-webhook-server-7df86c4f6c-5crvf\" (UID: \"64704f76-28dc-42cf-a696-9473b337eee9\") " pod="metallb-system/frr-k8s-webhook-server-7df86c4f6c-5crvf"
Jan 28 20:54:40 crc kubenswrapper[4746]: I0128 20:54:40.645825 4746 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/64704f76-28dc-42cf-a696-9473b337eee9-cert\") pod \"frr-k8s-webhook-server-7df86c4f6c-5crvf\" (UID: \"64704f76-28dc-42cf-a696-9473b337eee9\") " pod="metallb-system/frr-k8s-webhook-server-7df86c4f6c-5crvf"
Jan 28 20:54:40 crc kubenswrapper[4746]: I0128 20:54:40.645859 4746 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"memberlist\" (UniqueName: \"kubernetes.io/secret/c3d285c6-0abf-4c0b-92f5-1c91659d1de1-memberlist\") pod \"speaker-m55jn\" (UID: \"c3d285c6-0abf-4c0b-92f5-1c91659d1de1\") " pod="metallb-system/speaker-m55jn"
Jan 28 20:54:40 crc kubenswrapper[4746]: I0128 20:54:40.645887 4746 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"frr-conf\" (UniqueName: \"kubernetes.io/empty-dir/be75902a-e591-4378-89b8-9cab1f53dc5f-frr-conf\") pod \"frr-k8s-hws9w\" (UID: \"be75902a-e591-4378-89b8-9cab1f53dc5f\") " pod="metallb-system/frr-k8s-hws9w"
Jan 28 20:54:40 crc kubenswrapper[4746]: I0128 20:54:40.645934 4746 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"frr-sockets\" (UniqueName: \"kubernetes.io/empty-dir/be75902a-e591-4378-89b8-9cab1f53dc5f-frr-sockets\") pod \"frr-k8s-hws9w\" (UID: \"be75902a-e591-4378-89b8-9cab1f53dc5f\") " pod="metallb-system/frr-k8s-hws9w"
Jan 28 20:54:40 crc kubenswrapper[4746]: I0128 20:54:40.646339 4746 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics\" (UniqueName: \"kubernetes.io/empty-dir/be75902a-e591-4378-89b8-9cab1f53dc5f-metrics\") pod \"frr-k8s-hws9w\" (UID: \"be75902a-e591-4378-89b8-9cab1f53dc5f\") " pod="metallb-system/frr-k8s-hws9w"
Jan 28 20:54:40 crc kubenswrapper[4746]: E0128 20:54:40.646362 4746 secret.go:188] Couldn't get secret metallb-system/frr-k8s-webhook-server-cert: secret "frr-k8s-webhook-server-cert" not found
Jan 28 20:54:40 crc kubenswrapper[4746]: I0128 20:54:40.646440 4746 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"frr-conf\" (UniqueName: \"kubernetes.io/empty-dir/be75902a-e591-4378-89b8-9cab1f53dc5f-frr-conf\") pod \"frr-k8s-hws9w\" (UID: \"be75902a-e591-4378-89b8-9cab1f53dc5f\") " pod="metallb-system/frr-k8s-hws9w"
Jan 28 20:54:40 crc kubenswrapper[4746]: E0128 20:54:40.646466 4746 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/64704f76-28dc-42cf-a696-9473b337eee9-cert podName:64704f76-28dc-42cf-a696-9473b337eee9 nodeName:}" failed. No retries permitted until 2026-01-28 20:54:41.146438764 +0000 UTC m=+909.102625118 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/64704f76-28dc-42cf-a696-9473b337eee9-cert") pod "frr-k8s-webhook-server-7df86c4f6c-5crvf" (UID: "64704f76-28dc-42cf-a696-9473b337eee9") : secret "frr-k8s-webhook-server-cert" not found
Jan 28 20:54:40 crc kubenswrapper[4746]: E0128 20:54:40.646603 4746 secret.go:188] Couldn't get secret metallb-system/frr-k8s-certs-secret: secret "frr-k8s-certs-secret" not found
Jan 28 20:54:40 crc kubenswrapper[4746]: E0128 20:54:40.646659 4746 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/be75902a-e591-4378-89b8-9cab1f53dc5f-metrics-certs podName:be75902a-e591-4378-89b8-9cab1f53dc5f nodeName:}" failed. No retries permitted until 2026-01-28 20:54:41.146642529 +0000 UTC m=+909.102828883 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/be75902a-e591-4378-89b8-9cab1f53dc5f-metrics-certs") pod "frr-k8s-hws9w" (UID: "be75902a-e591-4378-89b8-9cab1f53dc5f") : secret "frr-k8s-certs-secret" not found
Jan 28 20:54:40 crc kubenswrapper[4746]: I0128 20:54:40.646832 4746 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"frr-startup\" (UniqueName: \"kubernetes.io/configmap/be75902a-e591-4378-89b8-9cab1f53dc5f-frr-startup\") pod \"frr-k8s-hws9w\" (UID: \"be75902a-e591-4378-89b8-9cab1f53dc5f\") " pod="metallb-system/frr-k8s-hws9w"
Jan 28 20:54:40 crc kubenswrapper[4746]: I0128 20:54:40.650504 4746 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"reloader\" (UniqueName: \"kubernetes.io/empty-dir/be75902a-e591-4378-89b8-9cab1f53dc5f-reloader\") pod \"frr-k8s-hws9w\" (UID: \"be75902a-e591-4378-89b8-9cab1f53dc5f\") " pod="metallb-system/frr-k8s-hws9w"
Jan 28 20:54:40 crc kubenswrapper[4746]: I0128 20:54:40.679802 4746 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-m4ljp\" (UniqueName: \"kubernetes.io/projected/64704f76-28dc-42cf-a696-9473b337eee9-kube-api-access-m4ljp\") pod \"frr-k8s-webhook-server-7df86c4f6c-5crvf\" (UID: \"64704f76-28dc-42cf-a696-9473b337eee9\") " pod="metallb-system/frr-k8s-webhook-server-7df86c4f6c-5crvf"
Jan 28 20:54:40 crc kubenswrapper[4746]: I0128 20:54:40.687843 4746 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-swm5f\" (UniqueName: \"kubernetes.io/projected/be75902a-e591-4378-89b8-9cab1f53dc5f-kube-api-access-swm5f\") pod \"frr-k8s-hws9w\" (UID: \"be75902a-e591-4378-89b8-9cab1f53dc5f\") " pod="metallb-system/frr-k8s-hws9w"
Jan 28 20:54:40 crc kubenswrapper[4746]: I0128 20:54:40.747406 4746 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-5d2mg\" (UniqueName: \"kubernetes.io/projected/b5150ca9-e86d-4087-bc5d-c2dd26234ecd-kube-api-access-5d2mg\") pod \"controller-6968d8fdc4-r2vlm\" (UID: \"b5150ca9-e86d-4087-bc5d-c2dd26234ecd\") " pod="metallb-system/controller-6968d8fdc4-r2vlm"
Jan 28 20:54:40 crc kubenswrapper[4746]: I0128 20:54:40.747469 4746 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"memberlist\" (UniqueName: \"kubernetes.io/secret/c3d285c6-0abf-4c0b-92f5-1c91659d1de1-memberlist\") pod \"speaker-m55jn\" (UID: \"c3d285c6-0abf-4c0b-92f5-1c91659d1de1\") " pod="metallb-system/speaker-m55jn"
Jan 28 20:54:40 crc kubenswrapper[4746]: I0128 20:54:40.747506 4746 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/b5150ca9-e86d-4087-bc5d-c2dd26234ecd-metrics-certs\") pod \"controller-6968d8fdc4-r2vlm\" (UID: \"b5150ca9-e86d-4087-bc5d-c2dd26234ecd\") " pod="metallb-system/controller-6968d8fdc4-r2vlm"
Jan 28 20:54:40 crc kubenswrapper[4746]: I0128 20:54:40.747547 4746 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metallb-excludel2\" (UniqueName: \"kubernetes.io/configmap/c3d285c6-0abf-4c0b-92f5-1c91659d1de1-metallb-excludel2\") pod \"speaker-m55jn\" (UID: \"c3d285c6-0abf-4c0b-92f5-1c91659d1de1\") " pod="metallb-system/speaker-m55jn"
Jan 28 20:54:40 crc kubenswrapper[4746]: I0128 20:54:40.747573 4746 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-cm6zl\" (UniqueName: \"kubernetes.io/projected/c3d285c6-0abf-4c0b-92f5-1c91659d1de1-kube-api-access-cm6zl\") pod \"speaker-m55jn\" (UID: \"c3d285c6-0abf-4c0b-92f5-1c91659d1de1\") " pod="metallb-system/speaker-m55jn"
Jan 28 20:54:40 crc kubenswrapper[4746]: I0128 20:54:40.747593 4746 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/b5150ca9-e86d-4087-bc5d-c2dd26234ecd-cert\") pod \"controller-6968d8fdc4-r2vlm\" (UID: \"b5150ca9-e86d-4087-bc5d-c2dd26234ecd\") " pod="metallb-system/controller-6968d8fdc4-r2vlm"
Jan 28 20:54:40 crc kubenswrapper[4746]: I0128 20:54:40.747631 4746 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/c3d285c6-0abf-4c0b-92f5-1c91659d1de1-metrics-certs\") pod \"speaker-m55jn\" (UID: \"c3d285c6-0abf-4c0b-92f5-1c91659d1de1\") " pod="metallb-system/speaker-m55jn"
Jan 28 20:54:40 crc kubenswrapper[4746]: E0128 20:54:40.747674 4746 secret.go:188] Couldn't get secret metallb-system/metallb-memberlist: secret "metallb-memberlist" not found
Jan 28 20:54:40 crc kubenswrapper[4746]: E0128 20:54:40.747765 4746 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/c3d285c6-0abf-4c0b-92f5-1c91659d1de1-memberlist podName:c3d285c6-0abf-4c0b-92f5-1c91659d1de1 nodeName:}" failed. No retries permitted until 2026-01-28 20:54:41.247720801 +0000 UTC m=+909.203907155 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "memberlist" (UniqueName: "kubernetes.io/secret/c3d285c6-0abf-4c0b-92f5-1c91659d1de1-memberlist") pod "speaker-m55jn" (UID: "c3d285c6-0abf-4c0b-92f5-1c91659d1de1") : secret "metallb-memberlist" not found
Jan 28 20:54:40 crc kubenswrapper[4746]: E0128 20:54:40.747825 4746 secret.go:188] Couldn't get secret metallb-system/speaker-certs-secret: secret "speaker-certs-secret" not found
Jan 28 20:54:40 crc kubenswrapper[4746]: E0128 20:54:40.747896 4746 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/c3d285c6-0abf-4c0b-92f5-1c91659d1de1-metrics-certs podName:c3d285c6-0abf-4c0b-92f5-1c91659d1de1 nodeName:}" failed. No retries permitted until 2026-01-28 20:54:41.247871925 +0000 UTC m=+909.204058269 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/c3d285c6-0abf-4c0b-92f5-1c91659d1de1-metrics-certs") pod "speaker-m55jn" (UID: "c3d285c6-0abf-4c0b-92f5-1c91659d1de1") : secret "speaker-certs-secret" not found
Jan 28 20:54:40 crc kubenswrapper[4746]: I0128 20:54:40.748539 4746 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metallb-excludel2\" (UniqueName: \"kubernetes.io/configmap/c3d285c6-0abf-4c0b-92f5-1c91659d1de1-metallb-excludel2\") pod \"speaker-m55jn\" (UID: \"c3d285c6-0abf-4c0b-92f5-1c91659d1de1\") " pod="metallb-system/speaker-m55jn"
Jan 28 20:54:40 crc kubenswrapper[4746]: I0128 20:54:40.775275 4746 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-cm6zl\" (UniqueName: \"kubernetes.io/projected/c3d285c6-0abf-4c0b-92f5-1c91659d1de1-kube-api-access-cm6zl\") pod \"speaker-m55jn\" (UID: \"c3d285c6-0abf-4c0b-92f5-1c91659d1de1\") " pod="metallb-system/speaker-m55jn"
Jan 28 20:54:40 crc kubenswrapper[4746]: I0128 20:54:40.849287 4746 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/b5150ca9-e86d-4087-bc5d-c2dd26234ecd-metrics-certs\") pod \"controller-6968d8fdc4-r2vlm\" (UID: \"b5150ca9-e86d-4087-bc5d-c2dd26234ecd\") " pod="metallb-system/controller-6968d8fdc4-r2vlm"
Jan 28 20:54:40 crc kubenswrapper[4746]: I0128 20:54:40.849377 4746 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/b5150ca9-e86d-4087-bc5d-c2dd26234ecd-cert\") pod \"controller-6968d8fdc4-r2vlm\" (UID: \"b5150ca9-e86d-4087-bc5d-c2dd26234ecd\") " pod="metallb-system/controller-6968d8fdc4-r2vlm"
Jan 28 20:54:40 crc kubenswrapper[4746]: I0128 20:54:40.849466 4746 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-5d2mg\" (UniqueName: \"kubernetes.io/projected/b5150ca9-e86d-4087-bc5d-c2dd26234ecd-kube-api-access-5d2mg\") pod \"controller-6968d8fdc4-r2vlm\" (UID: \"b5150ca9-e86d-4087-bc5d-c2dd26234ecd\") " pod="metallb-system/controller-6968d8fdc4-r2vlm"
Jan 28 20:54:40 crc kubenswrapper[4746]: E0128 20:54:40.849510 4746 secret.go:188] Couldn't get secret metallb-system/controller-certs-secret: secret "controller-certs-secret" not found
Jan 28 20:54:40 crc kubenswrapper[4746]: E0128 20:54:40.849611 4746 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/b5150ca9-e86d-4087-bc5d-c2dd26234ecd-metrics-certs podName:b5150ca9-e86d-4087-bc5d-c2dd26234ecd nodeName:}" failed. No retries permitted until 2026-01-28 20:54:41.349587435 +0000 UTC m=+909.305773789 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/b5150ca9-e86d-4087-bc5d-c2dd26234ecd-metrics-certs") pod "controller-6968d8fdc4-r2vlm" (UID: "b5150ca9-e86d-4087-bc5d-c2dd26234ecd") : secret "controller-certs-secret" not found
Jan 28 20:54:40 crc kubenswrapper[4746]: I0128 20:54:40.853886 4746 reflector.go:368] Caches populated for *v1.Secret from object-"metallb-system"/"metallb-webhook-cert"
Jan 28 20:54:40 crc kubenswrapper[4746]: I0128 20:54:40.875258 4746 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cert\" (UniqueName: \"kubernetes.io/secret/b5150ca9-e86d-4087-bc5d-c2dd26234ecd-cert\") pod \"controller-6968d8fdc4-r2vlm\" (UID: \"b5150ca9-e86d-4087-bc5d-c2dd26234ecd\") " pod="metallb-system/controller-6968d8fdc4-r2vlm"
Jan 28 20:54:40 crc kubenswrapper[4746]: I0128 20:54:40.883297 4746 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-5d2mg\" (UniqueName: \"kubernetes.io/projected/b5150ca9-e86d-4087-bc5d-c2dd26234ecd-kube-api-access-5d2mg\") pod \"controller-6968d8fdc4-r2vlm\" (UID: \"b5150ca9-e86d-4087-bc5d-c2dd26234ecd\") " pod="metallb-system/controller-6968d8fdc4-r2vlm"
Jan 28 20:54:41 crc kubenswrapper[4746]: I0128 20:54:41.154034 4746 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/64704f76-28dc-42cf-a696-9473b337eee9-cert\") pod \"frr-k8s-webhook-server-7df86c4f6c-5crvf\" (UID: \"64704f76-28dc-42cf-a696-9473b337eee9\") " pod="metallb-system/frr-k8s-webhook-server-7df86c4f6c-5crvf"
Jan 28 20:54:41 crc kubenswrapper[4746]: I0128 20:54:41.154504 4746 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/be75902a-e591-4378-89b8-9cab1f53dc5f-metrics-certs\") pod \"frr-k8s-hws9w\" (UID: \"be75902a-e591-4378-89b8-9cab1f53dc5f\") " pod="metallb-system/frr-k8s-hws9w"
Jan 28 20:54:41 crc kubenswrapper[4746]: I0128 20:54:41.158372 4746 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/be75902a-e591-4378-89b8-9cab1f53dc5f-metrics-certs\") pod \"frr-k8s-hws9w\" (UID: \"be75902a-e591-4378-89b8-9cab1f53dc5f\") " pod="metallb-system/frr-k8s-hws9w"
Jan 28 20:54:41 crc kubenswrapper[4746]: I0128 20:54:41.158437 4746 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cert\" (UniqueName: \"kubernetes.io/secret/64704f76-28dc-42cf-a696-9473b337eee9-cert\") pod \"frr-k8s-webhook-server-7df86c4f6c-5crvf\" (UID: \"64704f76-28dc-42cf-a696-9473b337eee9\") " pod="metallb-system/frr-k8s-webhook-server-7df86c4f6c-5crvf"
Jan 28 20:54:41 crc kubenswrapper[4746]: I0128 20:54:41.255620 4746 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/c3d285c6-0abf-4c0b-92f5-1c91659d1de1-metrics-certs\") pod \"speaker-m55jn\" (UID: \"c3d285c6-0abf-4c0b-92f5-1c91659d1de1\") " pod="metallb-system/speaker-m55jn"
Jan 28 20:54:41 crc kubenswrapper[4746]: I0128 20:54:41.255701 4746 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"memberlist\" (UniqueName: \"kubernetes.io/secret/c3d285c6-0abf-4c0b-92f5-1c91659d1de1-memberlist\") pod \"speaker-m55jn\" (UID: \"c3d285c6-0abf-4c0b-92f5-1c91659d1de1\") " pod="metallb-system/speaker-m55jn"
Jan 28 20:54:41 crc kubenswrapper[4746]: E0128 20:54:41.255883 4746 secret.go:188] Couldn't get secret metallb-system/metallb-memberlist: secret "metallb-memberlist" not found
Jan 28 20:54:41 crc kubenswrapper[4746]: E0128 20:54:41.255947 4746 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/c3d285c6-0abf-4c0b-92f5-1c91659d1de1-memberlist podName:c3d285c6-0abf-4c0b-92f5-1c91659d1de1 nodeName:}" failed. No retries permitted until 2026-01-28 20:54:42.255932289 +0000 UTC m=+910.212118643 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "memberlist" (UniqueName: "kubernetes.io/secret/c3d285c6-0abf-4c0b-92f5-1c91659d1de1-memberlist") pod "speaker-m55jn" (UID: "c3d285c6-0abf-4c0b-92f5-1c91659d1de1") : secret "metallb-memberlist" not found
Jan 28 20:54:41 crc kubenswrapper[4746]: I0128 20:54:41.258910 4746 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/c3d285c6-0abf-4c0b-92f5-1c91659d1de1-metrics-certs\") pod \"speaker-m55jn\" (UID: \"c3d285c6-0abf-4c0b-92f5-1c91659d1de1\") " pod="metallb-system/speaker-m55jn"
Jan 28 20:54:41 crc kubenswrapper[4746]: I0128 20:54:41.321051 4746 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-n5r4d" event={"ID":"f872cd48-4b0a-45b8-95de-1b42589d574a","Type":"ContainerStarted","Data":"b922a27da17231c8ee3d0efbde46795643956d243801d65f3415f2b9972deb67"}
Jan 28 20:54:41 crc kubenswrapper[4746]: I0128 20:54:41.346637 4746 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/certified-operators-n5r4d" podStartSLOduration=3.070150822 podStartE2EDuration="6.346608731s" podCreationTimestamp="2026-01-28 20:54:35 +0000 UTC" firstStartedPulling="2026-01-28 20:54:37.28400404 +0000 UTC m=+905.240190394" lastFinishedPulling="2026-01-28 20:54:40.560461959 +0000 UTC m=+908.516648303" observedRunningTime="2026-01-28 20:54:41.343432816 +0000 UTC m=+909.299619170" watchObservedRunningTime="2026-01-28 20:54:41.346608731 +0000 UTC m=+909.302795085"
Jan 28 20:54:41 crc kubenswrapper[4746]: I0128 20:54:41.357649 4746 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/b5150ca9-e86d-4087-bc5d-c2dd26234ecd-metrics-certs\") pod \"controller-6968d8fdc4-r2vlm\" (UID: \"b5150ca9-e86d-4087-bc5d-c2dd26234ecd\") " pod="metallb-system/controller-6968d8fdc4-r2vlm"
Jan 28 20:54:41 crc kubenswrapper[4746]: I0128 20:54:41.361406 4746 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/b5150ca9-e86d-4087-bc5d-c2dd26234ecd-metrics-certs\") pod \"controller-6968d8fdc4-r2vlm\" (UID: \"b5150ca9-e86d-4087-bc5d-c2dd26234ecd\") " pod="metallb-system/controller-6968d8fdc4-r2vlm"
Jan 28 20:54:41 crc kubenswrapper[4746]: I0128 20:54:41.389462 4746 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="metallb-system/frr-k8s-hws9w"
Jan 28 20:54:41 crc kubenswrapper[4746]: I0128 20:54:41.404176 4746 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="metallb-system/frr-k8s-webhook-server-7df86c4f6c-5crvf"
Jan 28 20:54:41 crc kubenswrapper[4746]: I0128 20:54:41.468148 4746 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-marketplace-fcx4p"
Jan 28 20:54:41 crc kubenswrapper[4746]: I0128 20:54:41.468206 4746 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-marketplace-fcx4p"
Jan 28 20:54:41 crc kubenswrapper[4746]: I0128 20:54:41.627491 4746 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="metallb-system/controller-6968d8fdc4-r2vlm"
Jan 28 20:54:41 crc kubenswrapper[4746]: I0128 20:54:41.635801 4746 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-marketplace-fcx4p"
Jan 28 20:54:41 crc kubenswrapper[4746]: I0128 20:54:41.845010 4746 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["metallb-system/frr-k8s-webhook-server-7df86c4f6c-5crvf"]
Jan 28 20:54:41 crc kubenswrapper[4746]: I0128 20:54:41.896191 4746 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["metallb-system/controller-6968d8fdc4-r2vlm"]
Jan 28 20:54:41 crc kubenswrapper[4746]: W0128 20:54:41.907023 4746 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podb5150ca9_e86d_4087_bc5d_c2dd26234ecd.slice/crio-a5c3688fd1530615ae84a61ca3f6009a260910fbd9d7624a42b96c26a695c441 WatchSource:0}: Error finding container a5c3688fd1530615ae84a61ca3f6009a260910fbd9d7624a42b96c26a695c441: Status 404 returned error can't find the container with id a5c3688fd1530615ae84a61ca3f6009a260910fbd9d7624a42b96c26a695c441
Jan 28 20:54:42 crc kubenswrapper[4746]: I0128 20:54:42.278971 4746 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"memberlist\" (UniqueName: \"kubernetes.io/secret/c3d285c6-0abf-4c0b-92f5-1c91659d1de1-memberlist\") pod \"speaker-m55jn\" (UID: \"c3d285c6-0abf-4c0b-92f5-1c91659d1de1\") " pod="metallb-system/speaker-m55jn"
Jan 28 20:54:42 crc kubenswrapper[4746]: I0128 20:54:42.290490 4746 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"memberlist\" (UniqueName: \"kubernetes.io/secret/c3d285c6-0abf-4c0b-92f5-1c91659d1de1-memberlist\") pod \"speaker-m55jn\" (UID: \"c3d285c6-0abf-4c0b-92f5-1c91659d1de1\") " pod="metallb-system/speaker-m55jn"
Jan 28 20:54:42 crc kubenswrapper[4746]: I0128 20:54:42.331157 4746 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/controller-6968d8fdc4-r2vlm" event={"ID":"b5150ca9-e86d-4087-bc5d-c2dd26234ecd","Type":"ContainerStarted","Data":"97ffdc5f1dc403b93a8d979ba7dce2b8775ad1aa9d17131dbaef14c072263bc5"}
Jan 28 20:54:42 crc kubenswrapper[4746]: I0128 20:54:42.331224 4746 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/controller-6968d8fdc4-r2vlm" event={"ID":"b5150ca9-e86d-4087-bc5d-c2dd26234ecd","Type":"ContainerStarted","Data":"b5bdfdb8c8a9283734751244884f7a68700d1593c06375ee239fcfed2d41ebef"}
Jan 28 20:54:42 crc kubenswrapper[4746]: I0128 20:54:42.331237 4746 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/controller-6968d8fdc4-r2vlm" event={"ID":"b5150ca9-e86d-4087-bc5d-c2dd26234ecd","Type":"ContainerStarted","Data":"a5c3688fd1530615ae84a61ca3f6009a260910fbd9d7624a42b96c26a695c441"}
Jan 28 20:54:42 crc kubenswrapper[4746]: I0128 20:54:42.332229 4746 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="metallb-system/controller-6968d8fdc4-r2vlm"
Jan 28 20:54:42 crc kubenswrapper[4746]: I0128 20:54:42.333152 4746 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/frr-k8s-webhook-server-7df86c4f6c-5crvf" event={"ID":"64704f76-28dc-42cf-a696-9473b337eee9","Type":"ContainerStarted","Data":"df3515a25229fd69ec8941c2fcbeb1fe9662350864477fbedfdbaa506e784862"}
Jan 28 20:54:42 crc kubenswrapper[4746]: I0128 20:54:42.334339 4746 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/frr-k8s-hws9w" event={"ID":"be75902a-e591-4378-89b8-9cab1f53dc5f","Type":"ContainerStarted","Data":"2316e12c9ae7fec0209109f345f4304cc9faa68732fcd03784c65959257370e0"}
Jan 28 20:54:42 crc kubenswrapper[4746]: I0128 20:54:42.355348 4746 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="metallb-system/controller-6968d8fdc4-r2vlm" podStartSLOduration=2.355323089 podStartE2EDuration="2.355323089s" podCreationTimestamp="2026-01-28 20:54:40 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-28 20:54:42.353464138 +0000 UTC m=+910.309650492" watchObservedRunningTime="2026-01-28 20:54:42.355323089 +0000 UTC m=+910.311509443"
Jan 28 20:54:42 crc kubenswrapper[4746]: I0128 20:54:42.381871 4746 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-marketplace-fcx4p"
Jan 28 20:54:42 crc kubenswrapper[4746]: I0128 20:54:42.459987 4746 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="metallb-system/speaker-m55jn"
Jan 28 20:54:43 crc kubenswrapper[4746]: I0128 20:54:43.402524 4746 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/speaker-m55jn" event={"ID":"c3d285c6-0abf-4c0b-92f5-1c91659d1de1","Type":"ContainerStarted","Data":"d38a10c82da7d3ccea17f5e4ab576279b3f920b2533fa2d911b4c73e7b4b1cf4"}
Jan 28 20:54:43 crc kubenswrapper[4746]: I0128 20:54:43.402586 4746 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/speaker-m55jn" event={"ID":"c3d285c6-0abf-4c0b-92f5-1c91659d1de1","Type":"ContainerStarted","Data":"96a711aa9a2180b00c9def0a64d24a198ed21a198cd942dcdd39cf489f490474"}
Jan 28 20:54:43 crc kubenswrapper[4746]: I0128 20:54:43.402597 4746 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/speaker-m55jn" event={"ID":"c3d285c6-0abf-4c0b-92f5-1c91659d1de1","Type":"ContainerStarted","Data":"72a1c83a031266b91e430691954aef431a9c566f84503a993f4e69466521789a"}
Jan 28 20:54:43 crc kubenswrapper[4746]: I0128 20:54:43.404188 4746 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="metallb-system/speaker-m55jn"
Jan 28 20:54:43 crc kubenswrapper[4746]: I0128 20:54:43.436529 4746 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="metallb-system/speaker-m55jn" podStartSLOduration=3.436501218 podStartE2EDuration="3.436501218s" podCreationTimestamp="2026-01-28 20:54:40 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-28 20:54:43.434699769 +0000 UTC m=+911.390886123" watchObservedRunningTime="2026-01-28 20:54:43.436501218 +0000 UTC m=+911.392687572"
Jan 28 20:54:45 crc kubenswrapper[4746]: I0128 20:54:45.112002 4746 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-fcx4p"]
Jan 28 20:54:45 crc kubenswrapper[4746]: I0128 20:54:45.112439 4746 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-marketplace-fcx4p" podUID="8b9535cd-70dd-4373-8309-0ea6ad3dfd34" containerName="registry-server" containerID="cri-o://9329f83f85b1cbdac37c069f9697ac7193e757b7400086d07685152d4c2d6cd8" gracePeriod=2
Jan 28 20:54:45 crc kubenswrapper[4746]: I0128 20:54:45.423843 4746 generic.go:334] "Generic (PLEG): container finished" podID="8b9535cd-70dd-4373-8309-0ea6ad3dfd34" containerID="9329f83f85b1cbdac37c069f9697ac7193e757b7400086d07685152d4c2d6cd8" exitCode=0
Jan 28 20:54:45 crc kubenswrapper[4746]: I0128 20:54:45.423896 4746 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-fcx4p" event={"ID":"8b9535cd-70dd-4373-8309-0ea6ad3dfd34","Type":"ContainerDied","Data":"9329f83f85b1cbdac37c069f9697ac7193e757b7400086d07685152d4c2d6cd8"}
Jan 28 20:54:45 crc kubenswrapper[4746]: I0128 20:54:45.545519 4746 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-fcx4p"
Jan 28 20:54:45 crc kubenswrapper[4746]: I0128 20:54:45.643899 4746 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-mq7r2\" (UniqueName: \"kubernetes.io/projected/8b9535cd-70dd-4373-8309-0ea6ad3dfd34-kube-api-access-mq7r2\") pod \"8b9535cd-70dd-4373-8309-0ea6ad3dfd34\" (UID: \"8b9535cd-70dd-4373-8309-0ea6ad3dfd34\") "
Jan 28 20:54:45 crc kubenswrapper[4746]: I0128 20:54:45.643981 4746 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/8b9535cd-70dd-4373-8309-0ea6ad3dfd34-catalog-content\") pod \"8b9535cd-70dd-4373-8309-0ea6ad3dfd34\" (UID: \"8b9535cd-70dd-4373-8309-0ea6ad3dfd34\") "
Jan 28 20:54:45 crc kubenswrapper[4746]: I0128 20:54:45.644019 4746 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/8b9535cd-70dd-4373-8309-0ea6ad3dfd34-utilities\") pod \"8b9535cd-70dd-4373-8309-0ea6ad3dfd34\" (UID: \"8b9535cd-70dd-4373-8309-0ea6ad3dfd34\") "
Jan 28 20:54:45 crc kubenswrapper[4746]: I0128 20:54:45.645208 4746 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/8b9535cd-70dd-4373-8309-0ea6ad3dfd34-utilities" (OuterVolumeSpecName: "utilities") pod "8b9535cd-70dd-4373-8309-0ea6ad3dfd34" (UID: "8b9535cd-70dd-4373-8309-0ea6ad3dfd34"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Jan 28 20:54:45 crc kubenswrapper[4746]: I0128 20:54:45.653438 4746 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/8b9535cd-70dd-4373-8309-0ea6ad3dfd34-kube-api-access-mq7r2" (OuterVolumeSpecName: "kube-api-access-mq7r2") pod "8b9535cd-70dd-4373-8309-0ea6ad3dfd34" (UID: "8b9535cd-70dd-4373-8309-0ea6ad3dfd34"). InnerVolumeSpecName "kube-api-access-mq7r2". PluginName "kubernetes.io/projected", VolumeGidValue ""
Jan 28 20:54:45 crc kubenswrapper[4746]: I0128 20:54:45.667238 4746 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/8b9535cd-70dd-4373-8309-0ea6ad3dfd34-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "8b9535cd-70dd-4373-8309-0ea6ad3dfd34" (UID: "8b9535cd-70dd-4373-8309-0ea6ad3dfd34"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Jan 28 20:54:45 crc kubenswrapper[4746]: I0128 20:54:45.746409 4746 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/8b9535cd-70dd-4373-8309-0ea6ad3dfd34-utilities\") on node \"crc\" DevicePath \"\""
Jan 28 20:54:45 crc kubenswrapper[4746]: I0128 20:54:45.746449 4746 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-mq7r2\" (UniqueName: \"kubernetes.io/projected/8b9535cd-70dd-4373-8309-0ea6ad3dfd34-kube-api-access-mq7r2\") on node \"crc\" DevicePath \"\""
Jan 28 20:54:45 crc kubenswrapper[4746]: I0128 20:54:45.746463 4746 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/8b9535cd-70dd-4373-8309-0ea6ad3dfd34-catalog-content\") on node \"crc\" DevicePath \"\""
Jan 28 20:54:46 crc kubenswrapper[4746]: I0128 20:54:46.045707 4746 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/certified-operators-n5r4d"
Jan 28 20:54:46 crc kubenswrapper[4746]: I0128 20:54:46.046908 4746 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/certified-operators-n5r4d"
Jan 28 20:54:46 crc kubenswrapper[4746]: I0128 20:54:46.106774 4746 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/certified-operators-n5r4d"
Jan 28 20:54:46 crc kubenswrapper[4746]: I0128 20:54:46.441328 4746 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-fcx4p" event={"ID":"8b9535cd-70dd-4373-8309-0ea6ad3dfd34","Type":"ContainerDied","Data":"e67c9e8283f130c19b441b963a01e60d4a9e6c53dfed4fb3d66fbc7def2b6ee8"}
Jan 28 20:54:46 crc kubenswrapper[4746]: I0128 20:54:46.441414 4746 scope.go:117] "RemoveContainer" containerID="9329f83f85b1cbdac37c069f9697ac7193e757b7400086d07685152d4c2d6cd8"
Jan 28 20:54:46 crc kubenswrapper[4746]: I0128 20:54:46.443048 4746 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-fcx4p"
Jan 28 20:54:46 crc kubenswrapper[4746]: I0128 20:54:46.473426 4746 scope.go:117] "RemoveContainer" containerID="1d4a38770efc0618437a5f55f98c709e9453d6d18b2e61ae677f2d7358c3da12"
Jan 28 20:54:46 crc kubenswrapper[4746]: I0128 20:54:46.495628 4746 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-fcx4p"]
Jan 28 20:54:46 crc kubenswrapper[4746]: I0128 20:54:46.498656 4746 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/certified-operators-n5r4d"
Jan 28 20:54:46 crc kubenswrapper[4746]: I0128 20:54:46.509648 4746 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-marketplace-fcx4p"]
Jan 28 20:54:46 crc kubenswrapper[4746]: I0128 20:54:46.528484 4746 scope.go:117] "RemoveContainer" containerID="cfb8b48d9d7b3295b816d461d4d5539e4ac32cab1dcc7144051ee26877167ff6"
Jan 28 20:54:46 crc kubenswrapper[4746]: I0128 20:54:46.847064 4746 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="8b9535cd-70dd-4373-8309-0ea6ad3dfd34" path="/var/lib/kubelet/pods/8b9535cd-70dd-4373-8309-0ea6ad3dfd34/volumes"
Jan 28 20:54:47 crc kubenswrapper[4746]: I0128 20:54:47.313916 4746 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-n5r4d"]
Jan 28 20:54:48 crc kubenswrapper[4746]: I0128 20:54:48.464018 4746 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/certified-operators-n5r4d" podUID="f872cd48-4b0a-45b8-95de-1b42589d574a" containerName="registry-server" containerID="cri-o://b922a27da17231c8ee3d0efbde46795643956d243801d65f3415f2b9972deb67" gracePeriod=2
Jan 28 20:54:49 crc kubenswrapper[4746]: I0128 20:54:49.472764 4746 generic.go:334] "Generic (PLEG): container finished" podID="f872cd48-4b0a-45b8-95de-1b42589d574a" containerID="b922a27da17231c8ee3d0efbde46795643956d243801d65f3415f2b9972deb67" exitCode=0
Jan 28 20:54:49 crc kubenswrapper[4746]: I0128 20:54:49.472827 4746 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-n5r4d" event={"ID":"f872cd48-4b0a-45b8-95de-1b42589d574a","Type":"ContainerDied","Data":"b922a27da17231c8ee3d0efbde46795643956d243801d65f3415f2b9972deb67"}
Jan 28 20:54:49 crc kubenswrapper[4746]: I0128 20:54:49.887059 4746 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-n5r4d"
Jan 28 20:54:49 crc kubenswrapper[4746]: I0128 20:54:49.913011 4746 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-9cqzq\" (UniqueName: \"kubernetes.io/projected/f872cd48-4b0a-45b8-95de-1b42589d574a-kube-api-access-9cqzq\") pod \"f872cd48-4b0a-45b8-95de-1b42589d574a\" (UID: \"f872cd48-4b0a-45b8-95de-1b42589d574a\") "
Jan 28 20:54:49 crc kubenswrapper[4746]: I0128 20:54:49.913128 4746 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/f872cd48-4b0a-45b8-95de-1b42589d574a-catalog-content\") pod \"f872cd48-4b0a-45b8-95de-1b42589d574a\" (UID: \"f872cd48-4b0a-45b8-95de-1b42589d574a\") "
Jan 28 20:54:49 crc kubenswrapper[4746]: I0128 20:54:49.913222 4746 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: 
\"kubernetes.io/empty-dir/f872cd48-4b0a-45b8-95de-1b42589d574a-utilities\") pod \"f872cd48-4b0a-45b8-95de-1b42589d574a\" (UID: \"f872cd48-4b0a-45b8-95de-1b42589d574a\") " Jan 28 20:54:49 crc kubenswrapper[4746]: I0128 20:54:49.914379 4746 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/f872cd48-4b0a-45b8-95de-1b42589d574a-utilities" (OuterVolumeSpecName: "utilities") pod "f872cd48-4b0a-45b8-95de-1b42589d574a" (UID: "f872cd48-4b0a-45b8-95de-1b42589d574a"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 28 20:54:49 crc kubenswrapper[4746]: I0128 20:54:49.920435 4746 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/f872cd48-4b0a-45b8-95de-1b42589d574a-kube-api-access-9cqzq" (OuterVolumeSpecName: "kube-api-access-9cqzq") pod "f872cd48-4b0a-45b8-95de-1b42589d574a" (UID: "f872cd48-4b0a-45b8-95de-1b42589d574a"). InnerVolumeSpecName "kube-api-access-9cqzq". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 28 20:54:49 crc kubenswrapper[4746]: I0128 20:54:49.981929 4746 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/f872cd48-4b0a-45b8-95de-1b42589d574a-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "f872cd48-4b0a-45b8-95de-1b42589d574a" (UID: "f872cd48-4b0a-45b8-95de-1b42589d574a"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 28 20:54:50 crc kubenswrapper[4746]: I0128 20:54:50.015543 4746 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-9cqzq\" (UniqueName: \"kubernetes.io/projected/f872cd48-4b0a-45b8-95de-1b42589d574a-kube-api-access-9cqzq\") on node \"crc\" DevicePath \"\"" Jan 28 20:54:50 crc kubenswrapper[4746]: I0128 20:54:50.015596 4746 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/f872cd48-4b0a-45b8-95de-1b42589d574a-catalog-content\") on node \"crc\" DevicePath \"\"" Jan 28 20:54:50 crc kubenswrapper[4746]: I0128 20:54:50.015607 4746 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/f872cd48-4b0a-45b8-95de-1b42589d574a-utilities\") on node \"crc\" DevicePath \"\"" Jan 28 20:54:50 crc kubenswrapper[4746]: I0128 20:54:50.484303 4746 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-n5r4d" event={"ID":"f872cd48-4b0a-45b8-95de-1b42589d574a","Type":"ContainerDied","Data":"1f480bc6be7af1de3ce586cb2ab77875d80871461d725300cddef8257bcec9b9"} Jan 28 20:54:50 crc kubenswrapper[4746]: I0128 20:54:50.484342 4746 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-n5r4d" Jan 28 20:54:50 crc kubenswrapper[4746]: I0128 20:54:50.484439 4746 scope.go:117] "RemoveContainer" containerID="b922a27da17231c8ee3d0efbde46795643956d243801d65f3415f2b9972deb67" Jan 28 20:54:50 crc kubenswrapper[4746]: I0128 20:54:50.487017 4746 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/frr-k8s-webhook-server-7df86c4f6c-5crvf" event={"ID":"64704f76-28dc-42cf-a696-9473b337eee9","Type":"ContainerStarted","Data":"9e6bea446932d4e3a30cd63097114d1046852a7597c0771324a892f52e68dc96"} Jan 28 20:54:50 crc kubenswrapper[4746]: I0128 20:54:50.489364 4746 generic.go:334] "Generic (PLEG): container finished" podID="be75902a-e591-4378-89b8-9cab1f53dc5f" containerID="1fb6f06d1092981dd35c462eb69ab9f3f7287dca242edd47588d3f0457a50c42" exitCode=0 Jan 28 20:54:50 crc kubenswrapper[4746]: I0128 20:54:50.489420 4746 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/frr-k8s-hws9w" event={"ID":"be75902a-e591-4378-89b8-9cab1f53dc5f","Type":"ContainerDied","Data":"1fb6f06d1092981dd35c462eb69ab9f3f7287dca242edd47588d3f0457a50c42"} Jan 28 20:54:50 crc kubenswrapper[4746]: I0128 20:54:50.508879 4746 scope.go:117] "RemoveContainer" containerID="a02a844a7d12231993293fde0d2effffe4998f25c384c23f1f265f3bd8483b65" Jan 28 20:54:50 crc kubenswrapper[4746]: I0128 20:54:50.516136 4746 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="metallb-system/frr-k8s-webhook-server-7df86c4f6c-5crvf" podStartSLOduration=2.680589779 podStartE2EDuration="10.516124391s" podCreationTimestamp="2026-01-28 20:54:40 +0000 UTC" firstStartedPulling="2026-01-28 20:54:41.870276845 +0000 UTC m=+909.826463199" lastFinishedPulling="2026-01-28 20:54:49.705811457 +0000 UTC m=+917.661997811" observedRunningTime="2026-01-28 20:54:50.509952225 +0000 UTC m=+918.466138579" watchObservedRunningTime="2026-01-28 20:54:50.516124391 +0000 UTC m=+918.472310735" Jan 28 20:54:50 crc 
kubenswrapper[4746]: I0128 20:54:50.538101 4746 scope.go:117] "RemoveContainer" containerID="926cd735fd62c33dfd234c154769d997132dcbb79c6e700b487194684268ca15" Jan 28 20:54:50 crc kubenswrapper[4746]: I0128 20:54:50.563373 4746 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-n5r4d"] Jan 28 20:54:50 crc kubenswrapper[4746]: I0128 20:54:50.569029 4746 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/certified-operators-n5r4d"] Jan 28 20:54:50 crc kubenswrapper[4746]: I0128 20:54:50.844184 4746 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="f872cd48-4b0a-45b8-95de-1b42589d574a" path="/var/lib/kubelet/pods/f872cd48-4b0a-45b8-95de-1b42589d574a/volumes" Jan 28 20:54:51 crc kubenswrapper[4746]: I0128 20:54:51.404657 4746 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="metallb-system/frr-k8s-webhook-server-7df86c4f6c-5crvf" Jan 28 20:54:51 crc kubenswrapper[4746]: I0128 20:54:51.499719 4746 generic.go:334] "Generic (PLEG): container finished" podID="be75902a-e591-4378-89b8-9cab1f53dc5f" containerID="4a3987d7fb18a4543e7c9a2764a0dc55c1ff1e903a26214887b29d505d1006ef" exitCode=0 Jan 28 20:54:51 crc kubenswrapper[4746]: I0128 20:54:51.500304 4746 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/frr-k8s-hws9w" event={"ID":"be75902a-e591-4378-89b8-9cab1f53dc5f","Type":"ContainerDied","Data":"4a3987d7fb18a4543e7c9a2764a0dc55c1ff1e903a26214887b29d505d1006ef"} Jan 28 20:54:52 crc kubenswrapper[4746]: I0128 20:54:52.466352 4746 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="metallb-system/speaker-m55jn" Jan 28 20:54:52 crc kubenswrapper[4746]: I0128 20:54:52.517984 4746 generic.go:334] "Generic (PLEG): container finished" podID="be75902a-e591-4378-89b8-9cab1f53dc5f" containerID="92323a8f36f8c0b4bad61d1adf9cf090b136a945b4b64f7f200296895637c74a" exitCode=0 Jan 28 20:54:52 crc kubenswrapper[4746]: I0128 20:54:52.518224 
4746 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/frr-k8s-hws9w" event={"ID":"be75902a-e591-4378-89b8-9cab1f53dc5f","Type":"ContainerDied","Data":"92323a8f36f8c0b4bad61d1adf9cf090b136a945b4b64f7f200296895637c74a"} Jan 28 20:54:53 crc kubenswrapper[4746]: I0128 20:54:53.535459 4746 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/frr-k8s-hws9w" event={"ID":"be75902a-e591-4378-89b8-9cab1f53dc5f","Type":"ContainerStarted","Data":"6276d60bbe8a48529f8f99c60d862a6716a9bb87e4e3de445ee451f12a986c3b"} Jan 28 20:54:53 crc kubenswrapper[4746]: I0128 20:54:53.535965 4746 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/frr-k8s-hws9w" event={"ID":"be75902a-e591-4378-89b8-9cab1f53dc5f","Type":"ContainerStarted","Data":"4fe16b399fb8f85c67e50e78589012e9851799b508847100f10d0729c94fa197"} Jan 28 20:54:53 crc kubenswrapper[4746]: I0128 20:54:53.535983 4746 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/frr-k8s-hws9w" event={"ID":"be75902a-e591-4378-89b8-9cab1f53dc5f","Type":"ContainerStarted","Data":"a0f7ae18ed934ea56b087bee39ecc7daf364b76a0e37ddab42ccfc5ce4e89d85"} Jan 28 20:54:53 crc kubenswrapper[4746]: I0128 20:54:53.535995 4746 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/frr-k8s-hws9w" event={"ID":"be75902a-e591-4378-89b8-9cab1f53dc5f","Type":"ContainerStarted","Data":"926d6637f743902e51ce33485e978ba1f5940128e14e58fc6705fada9e32163d"} Jan 28 20:54:53 crc kubenswrapper[4746]: I0128 20:54:53.536008 4746 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/frr-k8s-hws9w" event={"ID":"be75902a-e591-4378-89b8-9cab1f53dc5f","Type":"ContainerStarted","Data":"ea579ad1ed4aba233ddc25559b8eec534816cb7ad67fa784f7cd3ccb644b6f18"} Jan 28 20:54:54 crc kubenswrapper[4746]: I0128 20:54:54.558677 4746 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/frr-k8s-hws9w" 
event={"ID":"be75902a-e591-4378-89b8-9cab1f53dc5f","Type":"ContainerStarted","Data":"4403d6af8987952ce336da955ff29bdbb9490b39ee2a3e574b96c601f2a9b243"} Jan 28 20:54:54 crc kubenswrapper[4746]: I0128 20:54:54.559331 4746 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="metallb-system/frr-k8s-hws9w" Jan 28 20:54:54 crc kubenswrapper[4746]: I0128 20:54:54.588321 4746 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="metallb-system/frr-k8s-hws9w" podStartSLOduration=6.515785741 podStartE2EDuration="14.588292355s" podCreationTimestamp="2026-01-28 20:54:40 +0000 UTC" firstStartedPulling="2026-01-28 20:54:41.672724215 +0000 UTC m=+909.628910569" lastFinishedPulling="2026-01-28 20:54:49.745230829 +0000 UTC m=+917.701417183" observedRunningTime="2026-01-28 20:54:54.58252726 +0000 UTC m=+922.538713624" watchObservedRunningTime="2026-01-28 20:54:54.588292355 +0000 UTC m=+922.544478709" Jan 28 20:54:55 crc kubenswrapper[4746]: I0128 20:54:55.441392 4746 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/openstack-operator-index-22fd6"] Jan 28 20:54:55 crc kubenswrapper[4746]: E0128 20:54:55.441678 4746 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="8b9535cd-70dd-4373-8309-0ea6ad3dfd34" containerName="extract-utilities" Jan 28 20:54:55 crc kubenswrapper[4746]: I0128 20:54:55.441709 4746 state_mem.go:107] "Deleted CPUSet assignment" podUID="8b9535cd-70dd-4373-8309-0ea6ad3dfd34" containerName="extract-utilities" Jan 28 20:54:55 crc kubenswrapper[4746]: E0128 20:54:55.441721 4746 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f872cd48-4b0a-45b8-95de-1b42589d574a" containerName="extract-content" Jan 28 20:54:55 crc kubenswrapper[4746]: I0128 20:54:55.441728 4746 state_mem.go:107] "Deleted CPUSet assignment" podUID="f872cd48-4b0a-45b8-95de-1b42589d574a" containerName="extract-content" Jan 28 20:54:55 crc kubenswrapper[4746]: E0128 20:54:55.441738 4746 cpu_manager.go:410] 
"RemoveStaleState: removing container" podUID="8b9535cd-70dd-4373-8309-0ea6ad3dfd34" containerName="registry-server" Jan 28 20:54:55 crc kubenswrapper[4746]: I0128 20:54:55.441745 4746 state_mem.go:107] "Deleted CPUSet assignment" podUID="8b9535cd-70dd-4373-8309-0ea6ad3dfd34" containerName="registry-server" Jan 28 20:54:55 crc kubenswrapper[4746]: E0128 20:54:55.441758 4746 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="8b9535cd-70dd-4373-8309-0ea6ad3dfd34" containerName="extract-content" Jan 28 20:54:55 crc kubenswrapper[4746]: I0128 20:54:55.441764 4746 state_mem.go:107] "Deleted CPUSet assignment" podUID="8b9535cd-70dd-4373-8309-0ea6ad3dfd34" containerName="extract-content" Jan 28 20:54:55 crc kubenswrapper[4746]: E0128 20:54:55.441777 4746 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f872cd48-4b0a-45b8-95de-1b42589d574a" containerName="registry-server" Jan 28 20:54:55 crc kubenswrapper[4746]: I0128 20:54:55.441782 4746 state_mem.go:107] "Deleted CPUSet assignment" podUID="f872cd48-4b0a-45b8-95de-1b42589d574a" containerName="registry-server" Jan 28 20:54:55 crc kubenswrapper[4746]: E0128 20:54:55.441799 4746 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f872cd48-4b0a-45b8-95de-1b42589d574a" containerName="extract-utilities" Jan 28 20:54:55 crc kubenswrapper[4746]: I0128 20:54:55.441805 4746 state_mem.go:107] "Deleted CPUSet assignment" podUID="f872cd48-4b0a-45b8-95de-1b42589d574a" containerName="extract-utilities" Jan 28 20:54:55 crc kubenswrapper[4746]: I0128 20:54:55.441915 4746 memory_manager.go:354] "RemoveStaleState removing state" podUID="f872cd48-4b0a-45b8-95de-1b42589d574a" containerName="registry-server" Jan 28 20:54:55 crc kubenswrapper[4746]: I0128 20:54:55.441944 4746 memory_manager.go:354] "RemoveStaleState removing state" podUID="8b9535cd-70dd-4373-8309-0ea6ad3dfd34" containerName="registry-server" Jan 28 20:54:55 crc kubenswrapper[4746]: I0128 20:54:55.442476 4746 util.go:30] "No 
sandbox for pod can be found. Need to start a new one" pod="openstack-operators/openstack-operator-index-22fd6" Jan 28 20:54:55 crc kubenswrapper[4746]: I0128 20:54:55.445126 4746 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack-operators"/"openshift-service-ca.crt" Jan 28 20:54:55 crc kubenswrapper[4746]: I0128 20:54:55.445131 4746 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack-operators"/"kube-root-ca.crt" Jan 28 20:54:55 crc kubenswrapper[4746]: I0128 20:54:55.449713 4746 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"openstack-operator-index-dockercfg-5rlcj" Jan 28 20:54:55 crc kubenswrapper[4746]: I0128 20:54:55.458242 4746 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/openstack-operator-index-22fd6"] Jan 28 20:54:55 crc kubenswrapper[4746]: I0128 20:54:55.502314 4746 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-9wd7b\" (UniqueName: \"kubernetes.io/projected/18bf7107-67e1-4ee2-8d94-663a18a54a23-kube-api-access-9wd7b\") pod \"openstack-operator-index-22fd6\" (UID: \"18bf7107-67e1-4ee2-8d94-663a18a54a23\") " pod="openstack-operators/openstack-operator-index-22fd6" Jan 28 20:54:55 crc kubenswrapper[4746]: I0128 20:54:55.604067 4746 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-9wd7b\" (UniqueName: \"kubernetes.io/projected/18bf7107-67e1-4ee2-8d94-663a18a54a23-kube-api-access-9wd7b\") pod \"openstack-operator-index-22fd6\" (UID: \"18bf7107-67e1-4ee2-8d94-663a18a54a23\") " pod="openstack-operators/openstack-operator-index-22fd6" Jan 28 20:54:55 crc kubenswrapper[4746]: I0128 20:54:55.625989 4746 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-9wd7b\" (UniqueName: \"kubernetes.io/projected/18bf7107-67e1-4ee2-8d94-663a18a54a23-kube-api-access-9wd7b\") pod \"openstack-operator-index-22fd6\" (UID: 
\"18bf7107-67e1-4ee2-8d94-663a18a54a23\") " pod="openstack-operators/openstack-operator-index-22fd6" Jan 28 20:54:55 crc kubenswrapper[4746]: I0128 20:54:55.761768 4746 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/openstack-operator-index-22fd6" Jan 28 20:54:56 crc kubenswrapper[4746]: I0128 20:54:56.296573 4746 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/openstack-operator-index-22fd6"] Jan 28 20:54:56 crc kubenswrapper[4746]: I0128 20:54:56.390230 4746 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="metallb-system/frr-k8s-hws9w" Jan 28 20:54:56 crc kubenswrapper[4746]: I0128 20:54:56.447375 4746 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="metallb-system/frr-k8s-hws9w" Jan 28 20:54:56 crc kubenswrapper[4746]: I0128 20:54:56.575032 4746 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/openstack-operator-index-22fd6" event={"ID":"18bf7107-67e1-4ee2-8d94-663a18a54a23","Type":"ContainerStarted","Data":"401b55a11deefca43ec977cc38784eb0cb706831143910093e48ecb717056522"} Jan 28 20:54:58 crc kubenswrapper[4746]: I0128 20:54:58.809341 4746 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-operators/openstack-operator-index-22fd6"] Jan 28 20:54:59 crc kubenswrapper[4746]: I0128 20:54:59.415989 4746 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/openstack-operator-index-hk7sk"] Jan 28 20:54:59 crc kubenswrapper[4746]: I0128 20:54:59.417135 4746 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/openstack-operator-index-hk7sk" Jan 28 20:54:59 crc kubenswrapper[4746]: I0128 20:54:59.439397 4746 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/openstack-operator-index-hk7sk"] Jan 28 20:54:59 crc kubenswrapper[4746]: I0128 20:54:59.486702 4746 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-c8bl9\" (UniqueName: \"kubernetes.io/projected/161bd1ce-304a-4bcd-9188-568b362f4739-kube-api-access-c8bl9\") pod \"openstack-operator-index-hk7sk\" (UID: \"161bd1ce-304a-4bcd-9188-568b362f4739\") " pod="openstack-operators/openstack-operator-index-hk7sk" Jan 28 20:54:59 crc kubenswrapper[4746]: I0128 20:54:59.588609 4746 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-c8bl9\" (UniqueName: \"kubernetes.io/projected/161bd1ce-304a-4bcd-9188-568b362f4739-kube-api-access-c8bl9\") pod \"openstack-operator-index-hk7sk\" (UID: \"161bd1ce-304a-4bcd-9188-568b362f4739\") " pod="openstack-operators/openstack-operator-index-hk7sk" Jan 28 20:54:59 crc kubenswrapper[4746]: I0128 20:54:59.600513 4746 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/openstack-operator-index-22fd6" event={"ID":"18bf7107-67e1-4ee2-8d94-663a18a54a23","Type":"ContainerStarted","Data":"0d02e0084d6f0e5ecb8473ea0f36186de2d0a04fccfb9f2cebb08fd17523b8c9"} Jan 28 20:54:59 crc kubenswrapper[4746]: I0128 20:54:59.600664 4746 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack-operators/openstack-operator-index-22fd6" podUID="18bf7107-67e1-4ee2-8d94-663a18a54a23" containerName="registry-server" containerID="cri-o://0d02e0084d6f0e5ecb8473ea0f36186de2d0a04fccfb9f2cebb08fd17523b8c9" gracePeriod=2 Jan 28 20:54:59 crc kubenswrapper[4746]: I0128 20:54:59.616457 4746 pod_startup_latency_tracker.go:104] "Observed pod startup duration" 
pod="openstack-operators/openstack-operator-index-22fd6" podStartSLOduration=1.6748935839999999 podStartE2EDuration="4.616435838s" podCreationTimestamp="2026-01-28 20:54:55 +0000 UTC" firstStartedPulling="2026-01-28 20:54:56.307116508 +0000 UTC m=+924.263302862" lastFinishedPulling="2026-01-28 20:54:59.248658762 +0000 UTC m=+927.204845116" observedRunningTime="2026-01-28 20:54:59.615386099 +0000 UTC m=+927.571572453" watchObservedRunningTime="2026-01-28 20:54:59.616435838 +0000 UTC m=+927.572622192" Jan 28 20:54:59 crc kubenswrapper[4746]: I0128 20:54:59.622184 4746 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-c8bl9\" (UniqueName: \"kubernetes.io/projected/161bd1ce-304a-4bcd-9188-568b362f4739-kube-api-access-c8bl9\") pod \"openstack-operator-index-hk7sk\" (UID: \"161bd1ce-304a-4bcd-9188-568b362f4739\") " pod="openstack-operators/openstack-operator-index-hk7sk" Jan 28 20:54:59 crc kubenswrapper[4746]: I0128 20:54:59.733238 4746 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/openstack-operator-index-hk7sk" Jan 28 20:55:00 crc kubenswrapper[4746]: I0128 20:55:00.003178 4746 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/openstack-operator-index-22fd6" Jan 28 20:55:00 crc kubenswrapper[4746]: I0128 20:55:00.097226 4746 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-9wd7b\" (UniqueName: \"kubernetes.io/projected/18bf7107-67e1-4ee2-8d94-663a18a54a23-kube-api-access-9wd7b\") pod \"18bf7107-67e1-4ee2-8d94-663a18a54a23\" (UID: \"18bf7107-67e1-4ee2-8d94-663a18a54a23\") " Jan 28 20:55:00 crc kubenswrapper[4746]: I0128 20:55:00.100666 4746 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/18bf7107-67e1-4ee2-8d94-663a18a54a23-kube-api-access-9wd7b" (OuterVolumeSpecName: "kube-api-access-9wd7b") pod "18bf7107-67e1-4ee2-8d94-663a18a54a23" (UID: "18bf7107-67e1-4ee2-8d94-663a18a54a23"). InnerVolumeSpecName "kube-api-access-9wd7b". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 28 20:55:00 crc kubenswrapper[4746]: I0128 20:55:00.198616 4746 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-9wd7b\" (UniqueName: \"kubernetes.io/projected/18bf7107-67e1-4ee2-8d94-663a18a54a23-kube-api-access-9wd7b\") on node \"crc\" DevicePath \"\"" Jan 28 20:55:00 crc kubenswrapper[4746]: I0128 20:55:00.204943 4746 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/openstack-operator-index-hk7sk"] Jan 28 20:55:00 crc kubenswrapper[4746]: W0128 20:55:00.210734 4746 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod161bd1ce_304a_4bcd_9188_568b362f4739.slice/crio-297a04813b1b032c6173e0333a557e61106b5e0422c09424b6fbb85736ea1f22 WatchSource:0}: Error finding container 297a04813b1b032c6173e0333a557e61106b5e0422c09424b6fbb85736ea1f22: Status 404 returned error can't find the container with id 297a04813b1b032c6173e0333a557e61106b5e0422c09424b6fbb85736ea1f22 Jan 28 20:55:00 crc kubenswrapper[4746]: I0128 20:55:00.615509 4746 generic.go:334] 
"Generic (PLEG): container finished" podID="18bf7107-67e1-4ee2-8d94-663a18a54a23" containerID="0d02e0084d6f0e5ecb8473ea0f36186de2d0a04fccfb9f2cebb08fd17523b8c9" exitCode=0 Jan 28 20:55:00 crc kubenswrapper[4746]: I0128 20:55:00.620448 4746 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/openstack-operator-index-22fd6" event={"ID":"18bf7107-67e1-4ee2-8d94-663a18a54a23","Type":"ContainerDied","Data":"0d02e0084d6f0e5ecb8473ea0f36186de2d0a04fccfb9f2cebb08fd17523b8c9"} Jan 28 20:55:00 crc kubenswrapper[4746]: I0128 20:55:00.623368 4746 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/openstack-operator-index-22fd6" event={"ID":"18bf7107-67e1-4ee2-8d94-663a18a54a23","Type":"ContainerDied","Data":"401b55a11deefca43ec977cc38784eb0cb706831143910093e48ecb717056522"} Jan 28 20:55:00 crc kubenswrapper[4746]: I0128 20:55:00.623404 4746 scope.go:117] "RemoveContainer" containerID="0d02e0084d6f0e5ecb8473ea0f36186de2d0a04fccfb9f2cebb08fd17523b8c9" Jan 28 20:55:00 crc kubenswrapper[4746]: I0128 20:55:00.620741 4746 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/openstack-operator-index-22fd6" Jan 28 20:55:00 crc kubenswrapper[4746]: I0128 20:55:00.633016 4746 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/openstack-operator-index-hk7sk" event={"ID":"161bd1ce-304a-4bcd-9188-568b362f4739","Type":"ContainerStarted","Data":"a013c516fc2ad5b8596fbb2d5f7cdd9caf22712cb4f90cd0fccc2e9e3d1a40b9"} Jan 28 20:55:00 crc kubenswrapper[4746]: I0128 20:55:00.633318 4746 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/openstack-operator-index-hk7sk" event={"ID":"161bd1ce-304a-4bcd-9188-568b362f4739","Type":"ContainerStarted","Data":"297a04813b1b032c6173e0333a557e61106b5e0422c09424b6fbb85736ea1f22"} Jan 28 20:55:00 crc kubenswrapper[4746]: I0128 20:55:00.666436 4746 scope.go:117] "RemoveContainer" containerID="0d02e0084d6f0e5ecb8473ea0f36186de2d0a04fccfb9f2cebb08fd17523b8c9" Jan 28 20:55:00 crc kubenswrapper[4746]: E0128 20:55:00.671621 4746 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"0d02e0084d6f0e5ecb8473ea0f36186de2d0a04fccfb9f2cebb08fd17523b8c9\": container with ID starting with 0d02e0084d6f0e5ecb8473ea0f36186de2d0a04fccfb9f2cebb08fd17523b8c9 not found: ID does not exist" containerID="0d02e0084d6f0e5ecb8473ea0f36186de2d0a04fccfb9f2cebb08fd17523b8c9" Jan 28 20:55:00 crc kubenswrapper[4746]: I0128 20:55:00.671693 4746 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"0d02e0084d6f0e5ecb8473ea0f36186de2d0a04fccfb9f2cebb08fd17523b8c9"} err="failed to get container status \"0d02e0084d6f0e5ecb8473ea0f36186de2d0a04fccfb9f2cebb08fd17523b8c9\": rpc error: code = NotFound desc = could not find container \"0d02e0084d6f0e5ecb8473ea0f36186de2d0a04fccfb9f2cebb08fd17523b8c9\": container with ID starting with 0d02e0084d6f0e5ecb8473ea0f36186de2d0a04fccfb9f2cebb08fd17523b8c9 not found: ID does not exist" Jan 28 20:55:00 crc 
kubenswrapper[4746]: I0128 20:55:00.672451 4746 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/openstack-operator-index-hk7sk" podStartSLOduration=1.6169484729999999 podStartE2EDuration="1.672437168s" podCreationTimestamp="2026-01-28 20:54:59 +0000 UTC" firstStartedPulling="2026-01-28 20:55:00.216905789 +0000 UTC m=+928.173092143" lastFinishedPulling="2026-01-28 20:55:00.272394464 +0000 UTC m=+928.228580838" observedRunningTime="2026-01-28 20:55:00.648381901 +0000 UTC m=+928.604568275" watchObservedRunningTime="2026-01-28 20:55:00.672437168 +0000 UTC m=+928.628623522" Jan 28 20:55:00 crc kubenswrapper[4746]: I0128 20:55:00.681176 4746 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-operators/openstack-operator-index-22fd6"] Jan 28 20:55:00 crc kubenswrapper[4746]: I0128 20:55:00.686907 4746 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack-operators/openstack-operator-index-22fd6"] Jan 28 20:55:00 crc kubenswrapper[4746]: I0128 20:55:00.853721 4746 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="18bf7107-67e1-4ee2-8d94-663a18a54a23" path="/var/lib/kubelet/pods/18bf7107-67e1-4ee2-8d94-663a18a54a23/volumes" Jan 28 20:55:01 crc kubenswrapper[4746]: I0128 20:55:01.423701 4746 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="metallb-system/frr-k8s-webhook-server-7df86c4f6c-5crvf" Jan 28 20:55:01 crc kubenswrapper[4746]: I0128 20:55:01.641035 4746 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="metallb-system/controller-6968d8fdc4-r2vlm" Jan 28 20:55:03 crc kubenswrapper[4746]: I0128 20:55:03.236778 4746 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/community-operators-tbgr9"] Jan 28 20:55:03 crc kubenswrapper[4746]: E0128 20:55:03.237884 4746 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="18bf7107-67e1-4ee2-8d94-663a18a54a23" containerName="registry-server" Jan 28 20:55:03 
crc kubenswrapper[4746]: I0128 20:55:03.237916 4746 state_mem.go:107] "Deleted CPUSet assignment" podUID="18bf7107-67e1-4ee2-8d94-663a18a54a23" containerName="registry-server" Jan 28 20:55:03 crc kubenswrapper[4746]: I0128 20:55:03.238298 4746 memory_manager.go:354] "RemoveStaleState removing state" podUID="18bf7107-67e1-4ee2-8d94-663a18a54a23" containerName="registry-server" Jan 28 20:55:03 crc kubenswrapper[4746]: I0128 20:55:03.240377 4746 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-tbgr9" Jan 28 20:55:03 crc kubenswrapper[4746]: I0128 20:55:03.248580 4746 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-tbgr9"] Jan 28 20:55:03 crc kubenswrapper[4746]: I0128 20:55:03.345981 4746 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/1c4b9237-e5b6-43f5-8f23-f4239c9e515c-catalog-content\") pod \"community-operators-tbgr9\" (UID: \"1c4b9237-e5b6-43f5-8f23-f4239c9e515c\") " pod="openshift-marketplace/community-operators-tbgr9" Jan 28 20:55:03 crc kubenswrapper[4746]: I0128 20:55:03.346067 4746 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-cwzrd\" (UniqueName: \"kubernetes.io/projected/1c4b9237-e5b6-43f5-8f23-f4239c9e515c-kube-api-access-cwzrd\") pod \"community-operators-tbgr9\" (UID: \"1c4b9237-e5b6-43f5-8f23-f4239c9e515c\") " pod="openshift-marketplace/community-operators-tbgr9" Jan 28 20:55:03 crc kubenswrapper[4746]: I0128 20:55:03.346116 4746 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/1c4b9237-e5b6-43f5-8f23-f4239c9e515c-utilities\") pod \"community-operators-tbgr9\" (UID: \"1c4b9237-e5b6-43f5-8f23-f4239c9e515c\") " pod="openshift-marketplace/community-operators-tbgr9" Jan 28 
20:55:03 crc kubenswrapper[4746]: I0128 20:55:03.446999 4746 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-cwzrd\" (UniqueName: \"kubernetes.io/projected/1c4b9237-e5b6-43f5-8f23-f4239c9e515c-kube-api-access-cwzrd\") pod \"community-operators-tbgr9\" (UID: \"1c4b9237-e5b6-43f5-8f23-f4239c9e515c\") " pod="openshift-marketplace/community-operators-tbgr9" Jan 28 20:55:03 crc kubenswrapper[4746]: I0128 20:55:03.447054 4746 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/1c4b9237-e5b6-43f5-8f23-f4239c9e515c-utilities\") pod \"community-operators-tbgr9\" (UID: \"1c4b9237-e5b6-43f5-8f23-f4239c9e515c\") " pod="openshift-marketplace/community-operators-tbgr9" Jan 28 20:55:03 crc kubenswrapper[4746]: I0128 20:55:03.447123 4746 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/1c4b9237-e5b6-43f5-8f23-f4239c9e515c-catalog-content\") pod \"community-operators-tbgr9\" (UID: \"1c4b9237-e5b6-43f5-8f23-f4239c9e515c\") " pod="openshift-marketplace/community-operators-tbgr9" Jan 28 20:55:03 crc kubenswrapper[4746]: I0128 20:55:03.447889 4746 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/1c4b9237-e5b6-43f5-8f23-f4239c9e515c-catalog-content\") pod \"community-operators-tbgr9\" (UID: \"1c4b9237-e5b6-43f5-8f23-f4239c9e515c\") " pod="openshift-marketplace/community-operators-tbgr9" Jan 28 20:55:03 crc kubenswrapper[4746]: I0128 20:55:03.448141 4746 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/1c4b9237-e5b6-43f5-8f23-f4239c9e515c-utilities\") pod \"community-operators-tbgr9\" (UID: \"1c4b9237-e5b6-43f5-8f23-f4239c9e515c\") " pod="openshift-marketplace/community-operators-tbgr9" Jan 28 20:55:03 crc kubenswrapper[4746]: I0128 
20:55:03.469910 4746 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-cwzrd\" (UniqueName: \"kubernetes.io/projected/1c4b9237-e5b6-43f5-8f23-f4239c9e515c-kube-api-access-cwzrd\") pod \"community-operators-tbgr9\" (UID: \"1c4b9237-e5b6-43f5-8f23-f4239c9e515c\") " pod="openshift-marketplace/community-operators-tbgr9" Jan 28 20:55:03 crc kubenswrapper[4746]: I0128 20:55:03.564115 4746 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-tbgr9" Jan 28 20:55:03 crc kubenswrapper[4746]: I0128 20:55:03.859985 4746 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-tbgr9"] Jan 28 20:55:04 crc kubenswrapper[4746]: I0128 20:55:04.671369 4746 generic.go:334] "Generic (PLEG): container finished" podID="1c4b9237-e5b6-43f5-8f23-f4239c9e515c" containerID="26781c8d7ce980d50ad22fc950d35a536bd0fe1537047259ec3eb457efa394b4" exitCode=0 Jan 28 20:55:04 crc kubenswrapper[4746]: I0128 20:55:04.671881 4746 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-tbgr9" event={"ID":"1c4b9237-e5b6-43f5-8f23-f4239c9e515c","Type":"ContainerDied","Data":"26781c8d7ce980d50ad22fc950d35a536bd0fe1537047259ec3eb457efa394b4"} Jan 28 20:55:04 crc kubenswrapper[4746]: I0128 20:55:04.671926 4746 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-tbgr9" event={"ID":"1c4b9237-e5b6-43f5-8f23-f4239c9e515c","Type":"ContainerStarted","Data":"47945d4d84775fa145f0a64499018656cb40de52df2c4738eb4ee3676fa56cb9"} Jan 28 20:55:05 crc kubenswrapper[4746]: I0128 20:55:05.682617 4746 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-tbgr9" event={"ID":"1c4b9237-e5b6-43f5-8f23-f4239c9e515c","Type":"ContainerStarted","Data":"7b49908ae7ad888d3a261c8b5f37290e45ce52165f2a127363d2f85273690772"} Jan 28 20:55:06 crc kubenswrapper[4746]: I0128 
20:55:06.692030 4746 generic.go:334] "Generic (PLEG): container finished" podID="1c4b9237-e5b6-43f5-8f23-f4239c9e515c" containerID="7b49908ae7ad888d3a261c8b5f37290e45ce52165f2a127363d2f85273690772" exitCode=0 Jan 28 20:55:06 crc kubenswrapper[4746]: I0128 20:55:06.692097 4746 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-tbgr9" event={"ID":"1c4b9237-e5b6-43f5-8f23-f4239c9e515c","Type":"ContainerDied","Data":"7b49908ae7ad888d3a261c8b5f37290e45ce52165f2a127363d2f85273690772"} Jan 28 20:55:07 crc kubenswrapper[4746]: I0128 20:55:07.702834 4746 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-tbgr9" event={"ID":"1c4b9237-e5b6-43f5-8f23-f4239c9e515c","Type":"ContainerStarted","Data":"7897650b8bdf63f3de597ce7d02ec7fe553613c4207077e2d688436b3e37ef9e"} Jan 28 20:55:07 crc kubenswrapper[4746]: I0128 20:55:07.728897 4746 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/community-operators-tbgr9" podStartSLOduration=2.280904256 podStartE2EDuration="4.728863637s" podCreationTimestamp="2026-01-28 20:55:03 +0000 UTC" firstStartedPulling="2026-01-28 20:55:04.681680537 +0000 UTC m=+932.637866931" lastFinishedPulling="2026-01-28 20:55:07.129639948 +0000 UTC m=+935.085826312" observedRunningTime="2026-01-28 20:55:07.726467233 +0000 UTC m=+935.682653617" watchObservedRunningTime="2026-01-28 20:55:07.728863637 +0000 UTC m=+935.685049991" Jan 28 20:55:09 crc kubenswrapper[4746]: I0128 20:55:09.734464 4746 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack-operators/openstack-operator-index-hk7sk" Jan 28 20:55:09 crc kubenswrapper[4746]: I0128 20:55:09.734918 4746 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/openstack-operator-index-hk7sk" Jan 28 20:55:09 crc kubenswrapper[4746]: I0128 20:55:09.776676 4746 kubelet.go:2542] "SyncLoop (probe)" probe="startup" 
status="started" pod="openstack-operators/openstack-operator-index-hk7sk" Jan 28 20:55:10 crc kubenswrapper[4746]: I0128 20:55:10.769596 4746 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/openstack-operator-index-hk7sk" Jan 28 20:55:11 crc kubenswrapper[4746]: I0128 20:55:11.393186 4746 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="metallb-system/frr-k8s-hws9w" Jan 28 20:55:13 crc kubenswrapper[4746]: I0128 20:55:13.564497 4746 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/community-operators-tbgr9" Jan 28 20:55:13 crc kubenswrapper[4746]: I0128 20:55:13.565568 4746 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/community-operators-tbgr9" Jan 28 20:55:13 crc kubenswrapper[4746]: I0128 20:55:13.640771 4746 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/community-operators-tbgr9" Jan 28 20:55:13 crc kubenswrapper[4746]: I0128 20:55:13.805807 4746 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/community-operators-tbgr9" Jan 28 20:55:16 crc kubenswrapper[4746]: I0128 20:55:16.675070 4746 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/1dcfae1c51044912e4345cf7baf7051d6a45f7abdb8f477454f0bd2ab0g68tp"] Jan 28 20:55:16 crc kubenswrapper[4746]: I0128 20:55:16.678600 4746 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/1dcfae1c51044912e4345cf7baf7051d6a45f7abdb8f477454f0bd2ab0g68tp" Jan 28 20:55:16 crc kubenswrapper[4746]: I0128 20:55:16.683246 4746 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"default-dockercfg-sxfnf" Jan 28 20:55:16 crc kubenswrapper[4746]: I0128 20:55:16.683386 4746 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/1dcfae1c51044912e4345cf7baf7051d6a45f7abdb8f477454f0bd2ab0g68tp"] Jan 28 20:55:16 crc kubenswrapper[4746]: I0128 20:55:16.753919 4746 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-57qg6\" (UniqueName: \"kubernetes.io/projected/c1bfc71e-7105-4567-b92a-37c08b17a97c-kube-api-access-57qg6\") pod \"1dcfae1c51044912e4345cf7baf7051d6a45f7abdb8f477454f0bd2ab0g68tp\" (UID: \"c1bfc71e-7105-4567-b92a-37c08b17a97c\") " pod="openstack-operators/1dcfae1c51044912e4345cf7baf7051d6a45f7abdb8f477454f0bd2ab0g68tp" Jan 28 20:55:16 crc kubenswrapper[4746]: I0128 20:55:16.753968 4746 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/c1bfc71e-7105-4567-b92a-37c08b17a97c-bundle\") pod \"1dcfae1c51044912e4345cf7baf7051d6a45f7abdb8f477454f0bd2ab0g68tp\" (UID: \"c1bfc71e-7105-4567-b92a-37c08b17a97c\") " pod="openstack-operators/1dcfae1c51044912e4345cf7baf7051d6a45f7abdb8f477454f0bd2ab0g68tp" Jan 28 20:55:16 crc kubenswrapper[4746]: I0128 20:55:16.753998 4746 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/c1bfc71e-7105-4567-b92a-37c08b17a97c-util\") pod \"1dcfae1c51044912e4345cf7baf7051d6a45f7abdb8f477454f0bd2ab0g68tp\" (UID: \"c1bfc71e-7105-4567-b92a-37c08b17a97c\") " pod="openstack-operators/1dcfae1c51044912e4345cf7baf7051d6a45f7abdb8f477454f0bd2ab0g68tp" Jan 28 20:55:16 crc kubenswrapper[4746]: I0128 
20:55:16.855452 4746 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-57qg6\" (UniqueName: \"kubernetes.io/projected/c1bfc71e-7105-4567-b92a-37c08b17a97c-kube-api-access-57qg6\") pod \"1dcfae1c51044912e4345cf7baf7051d6a45f7abdb8f477454f0bd2ab0g68tp\" (UID: \"c1bfc71e-7105-4567-b92a-37c08b17a97c\") " pod="openstack-operators/1dcfae1c51044912e4345cf7baf7051d6a45f7abdb8f477454f0bd2ab0g68tp" Jan 28 20:55:16 crc kubenswrapper[4746]: I0128 20:55:16.855508 4746 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/c1bfc71e-7105-4567-b92a-37c08b17a97c-bundle\") pod \"1dcfae1c51044912e4345cf7baf7051d6a45f7abdb8f477454f0bd2ab0g68tp\" (UID: \"c1bfc71e-7105-4567-b92a-37c08b17a97c\") " pod="openstack-operators/1dcfae1c51044912e4345cf7baf7051d6a45f7abdb8f477454f0bd2ab0g68tp" Jan 28 20:55:16 crc kubenswrapper[4746]: I0128 20:55:16.855549 4746 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/c1bfc71e-7105-4567-b92a-37c08b17a97c-util\") pod \"1dcfae1c51044912e4345cf7baf7051d6a45f7abdb8f477454f0bd2ab0g68tp\" (UID: \"c1bfc71e-7105-4567-b92a-37c08b17a97c\") " pod="openstack-operators/1dcfae1c51044912e4345cf7baf7051d6a45f7abdb8f477454f0bd2ab0g68tp" Jan 28 20:55:16 crc kubenswrapper[4746]: I0128 20:55:16.856115 4746 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/c1bfc71e-7105-4567-b92a-37c08b17a97c-util\") pod \"1dcfae1c51044912e4345cf7baf7051d6a45f7abdb8f477454f0bd2ab0g68tp\" (UID: \"c1bfc71e-7105-4567-b92a-37c08b17a97c\") " pod="openstack-operators/1dcfae1c51044912e4345cf7baf7051d6a45f7abdb8f477454f0bd2ab0g68tp" Jan 28 20:55:16 crc kubenswrapper[4746]: I0128 20:55:16.856113 4746 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"bundle\" (UniqueName: 
\"kubernetes.io/empty-dir/c1bfc71e-7105-4567-b92a-37c08b17a97c-bundle\") pod \"1dcfae1c51044912e4345cf7baf7051d6a45f7abdb8f477454f0bd2ab0g68tp\" (UID: \"c1bfc71e-7105-4567-b92a-37c08b17a97c\") " pod="openstack-operators/1dcfae1c51044912e4345cf7baf7051d6a45f7abdb8f477454f0bd2ab0g68tp" Jan 28 20:55:16 crc kubenswrapper[4746]: I0128 20:55:16.877183 4746 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-57qg6\" (UniqueName: \"kubernetes.io/projected/c1bfc71e-7105-4567-b92a-37c08b17a97c-kube-api-access-57qg6\") pod \"1dcfae1c51044912e4345cf7baf7051d6a45f7abdb8f477454f0bd2ab0g68tp\" (UID: \"c1bfc71e-7105-4567-b92a-37c08b17a97c\") " pod="openstack-operators/1dcfae1c51044912e4345cf7baf7051d6a45f7abdb8f477454f0bd2ab0g68tp" Jan 28 20:55:17 crc kubenswrapper[4746]: I0128 20:55:17.038132 4746 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/1dcfae1c51044912e4345cf7baf7051d6a45f7abdb8f477454f0bd2ab0g68tp" Jan 28 20:55:17 crc kubenswrapper[4746]: I0128 20:55:17.419960 4746 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-tbgr9"] Jan 28 20:55:17 crc kubenswrapper[4746]: I0128 20:55:17.421003 4746 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/community-operators-tbgr9" podUID="1c4b9237-e5b6-43f5-8f23-f4239c9e515c" containerName="registry-server" containerID="cri-o://7897650b8bdf63f3de597ce7d02ec7fe553613c4207077e2d688436b3e37ef9e" gracePeriod=2 Jan 28 20:55:17 crc kubenswrapper[4746]: I0128 20:55:17.494808 4746 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/1dcfae1c51044912e4345cf7baf7051d6a45f7abdb8f477454f0bd2ab0g68tp"] Jan 28 20:55:17 crc kubenswrapper[4746]: I0128 20:55:17.785809 4746 generic.go:334] "Generic (PLEG): container finished" podID="1c4b9237-e5b6-43f5-8f23-f4239c9e515c" containerID="7897650b8bdf63f3de597ce7d02ec7fe553613c4207077e2d688436b3e37ef9e" exitCode=0 
Jan 28 20:55:17 crc kubenswrapper[4746]: I0128 20:55:17.785905 4746 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-tbgr9" event={"ID":"1c4b9237-e5b6-43f5-8f23-f4239c9e515c","Type":"ContainerDied","Data":"7897650b8bdf63f3de597ce7d02ec7fe553613c4207077e2d688436b3e37ef9e"} Jan 28 20:55:17 crc kubenswrapper[4746]: I0128 20:55:17.787388 4746 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/1dcfae1c51044912e4345cf7baf7051d6a45f7abdb8f477454f0bd2ab0g68tp" event={"ID":"c1bfc71e-7105-4567-b92a-37c08b17a97c","Type":"ContainerStarted","Data":"d5afaa12f8005c83a6b477900135056626e87edd696d7614043af394e6c9d70a"} Jan 28 20:55:17 crc kubenswrapper[4746]: I0128 20:55:17.997165 4746 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-tbgr9" Jan 28 20:55:18 crc kubenswrapper[4746]: I0128 20:55:18.072891 4746 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-cwzrd\" (UniqueName: \"kubernetes.io/projected/1c4b9237-e5b6-43f5-8f23-f4239c9e515c-kube-api-access-cwzrd\") pod \"1c4b9237-e5b6-43f5-8f23-f4239c9e515c\" (UID: \"1c4b9237-e5b6-43f5-8f23-f4239c9e515c\") " Jan 28 20:55:18 crc kubenswrapper[4746]: I0128 20:55:18.072962 4746 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/1c4b9237-e5b6-43f5-8f23-f4239c9e515c-utilities\") pod \"1c4b9237-e5b6-43f5-8f23-f4239c9e515c\" (UID: \"1c4b9237-e5b6-43f5-8f23-f4239c9e515c\") " Jan 28 20:55:18 crc kubenswrapper[4746]: I0128 20:55:18.073071 4746 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/1c4b9237-e5b6-43f5-8f23-f4239c9e515c-catalog-content\") pod \"1c4b9237-e5b6-43f5-8f23-f4239c9e515c\" (UID: \"1c4b9237-e5b6-43f5-8f23-f4239c9e515c\") " Jan 28 20:55:18 crc kubenswrapper[4746]: I0128 
20:55:18.073825 4746 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/1c4b9237-e5b6-43f5-8f23-f4239c9e515c-utilities" (OuterVolumeSpecName: "utilities") pod "1c4b9237-e5b6-43f5-8f23-f4239c9e515c" (UID: "1c4b9237-e5b6-43f5-8f23-f4239c9e515c"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 28 20:55:18 crc kubenswrapper[4746]: I0128 20:55:18.094388 4746 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/1c4b9237-e5b6-43f5-8f23-f4239c9e515c-kube-api-access-cwzrd" (OuterVolumeSpecName: "kube-api-access-cwzrd") pod "1c4b9237-e5b6-43f5-8f23-f4239c9e515c" (UID: "1c4b9237-e5b6-43f5-8f23-f4239c9e515c"). InnerVolumeSpecName "kube-api-access-cwzrd". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 28 20:55:18 crc kubenswrapper[4746]: I0128 20:55:18.128743 4746 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/1c4b9237-e5b6-43f5-8f23-f4239c9e515c-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "1c4b9237-e5b6-43f5-8f23-f4239c9e515c" (UID: "1c4b9237-e5b6-43f5-8f23-f4239c9e515c"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 28 20:55:18 crc kubenswrapper[4746]: I0128 20:55:18.175226 4746 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/1c4b9237-e5b6-43f5-8f23-f4239c9e515c-catalog-content\") on node \"crc\" DevicePath \"\"" Jan 28 20:55:18 crc kubenswrapper[4746]: I0128 20:55:18.175267 4746 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-cwzrd\" (UniqueName: \"kubernetes.io/projected/1c4b9237-e5b6-43f5-8f23-f4239c9e515c-kube-api-access-cwzrd\") on node \"crc\" DevicePath \"\"" Jan 28 20:55:18 crc kubenswrapper[4746]: I0128 20:55:18.175281 4746 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/1c4b9237-e5b6-43f5-8f23-f4239c9e515c-utilities\") on node \"crc\" DevicePath \"\"" Jan 28 20:55:18 crc kubenswrapper[4746]: I0128 20:55:18.796743 4746 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-tbgr9" Jan 28 20:55:18 crc kubenswrapper[4746]: I0128 20:55:18.796726 4746 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-tbgr9" event={"ID":"1c4b9237-e5b6-43f5-8f23-f4239c9e515c","Type":"ContainerDied","Data":"47945d4d84775fa145f0a64499018656cb40de52df2c4738eb4ee3676fa56cb9"} Jan 28 20:55:18 crc kubenswrapper[4746]: I0128 20:55:18.796928 4746 scope.go:117] "RemoveContainer" containerID="7897650b8bdf63f3de597ce7d02ec7fe553613c4207077e2d688436b3e37ef9e" Jan 28 20:55:18 crc kubenswrapper[4746]: I0128 20:55:18.798573 4746 generic.go:334] "Generic (PLEG): container finished" podID="c1bfc71e-7105-4567-b92a-37c08b17a97c" containerID="cc10cc821d64c78c8a35af42c55259565d65b2bbb82952f6c3c63865ba0c73c2" exitCode=0 Jan 28 20:55:18 crc kubenswrapper[4746]: I0128 20:55:18.798628 4746 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openstack-operators/1dcfae1c51044912e4345cf7baf7051d6a45f7abdb8f477454f0bd2ab0g68tp" event={"ID":"c1bfc71e-7105-4567-b92a-37c08b17a97c","Type":"ContainerDied","Data":"cc10cc821d64c78c8a35af42c55259565d65b2bbb82952f6c3c63865ba0c73c2"} Jan 28 20:55:18 crc kubenswrapper[4746]: I0128 20:55:18.850429 4746 scope.go:117] "RemoveContainer" containerID="7b49908ae7ad888d3a261c8b5f37290e45ce52165f2a127363d2f85273690772" Jan 28 20:55:18 crc kubenswrapper[4746]: I0128 20:55:18.867881 4746 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-tbgr9"] Jan 28 20:55:18 crc kubenswrapper[4746]: I0128 20:55:18.877261 4746 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/community-operators-tbgr9"] Jan 28 20:55:18 crc kubenswrapper[4746]: I0128 20:55:18.922212 4746 scope.go:117] "RemoveContainer" containerID="26781c8d7ce980d50ad22fc950d35a536bd0fe1537047259ec3eb457efa394b4" Jan 28 20:55:19 crc kubenswrapper[4746]: I0128 20:55:19.809058 4746 generic.go:334] "Generic (PLEG): container finished" podID="c1bfc71e-7105-4567-b92a-37c08b17a97c" containerID="bb0e63ab6aeffaa65eecd2a34a36c2ce0e7170e8413ec9983a57cc9321afa309" exitCode=0 Jan 28 20:55:19 crc kubenswrapper[4746]: I0128 20:55:19.809133 4746 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/1dcfae1c51044912e4345cf7baf7051d6a45f7abdb8f477454f0bd2ab0g68tp" event={"ID":"c1bfc71e-7105-4567-b92a-37c08b17a97c","Type":"ContainerDied","Data":"bb0e63ab6aeffaa65eecd2a34a36c2ce0e7170e8413ec9983a57cc9321afa309"} Jan 28 20:55:20 crc kubenswrapper[4746]: I0128 20:55:20.831065 4746 generic.go:334] "Generic (PLEG): container finished" podID="c1bfc71e-7105-4567-b92a-37c08b17a97c" containerID="48338689c77ae000ebc0c9f8363a53a869c525e801a95fe55917aa5a640f8b9a" exitCode=0 Jan 28 20:55:20 crc kubenswrapper[4746]: I0128 20:55:20.831143 4746 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openstack-operators/1dcfae1c51044912e4345cf7baf7051d6a45f7abdb8f477454f0bd2ab0g68tp" event={"ID":"c1bfc71e-7105-4567-b92a-37c08b17a97c","Type":"ContainerDied","Data":"48338689c77ae000ebc0c9f8363a53a869c525e801a95fe55917aa5a640f8b9a"} Jan 28 20:55:20 crc kubenswrapper[4746]: I0128 20:55:20.843901 4746 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="1c4b9237-e5b6-43f5-8f23-f4239c9e515c" path="/var/lib/kubelet/pods/1c4b9237-e5b6-43f5-8f23-f4239c9e515c/volumes" Jan 28 20:55:22 crc kubenswrapper[4746]: I0128 20:55:22.144262 4746 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack-operators/1dcfae1c51044912e4345cf7baf7051d6a45f7abdb8f477454f0bd2ab0g68tp" Jan 28 20:55:22 crc kubenswrapper[4746]: I0128 20:55:22.245921 4746 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/c1bfc71e-7105-4567-b92a-37c08b17a97c-util\") pod \"c1bfc71e-7105-4567-b92a-37c08b17a97c\" (UID: \"c1bfc71e-7105-4567-b92a-37c08b17a97c\") " Jan 28 20:55:22 crc kubenswrapper[4746]: I0128 20:55:22.246060 4746 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/c1bfc71e-7105-4567-b92a-37c08b17a97c-bundle\") pod \"c1bfc71e-7105-4567-b92a-37c08b17a97c\" (UID: \"c1bfc71e-7105-4567-b92a-37c08b17a97c\") " Jan 28 20:55:22 crc kubenswrapper[4746]: I0128 20:55:22.246153 4746 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-57qg6\" (UniqueName: \"kubernetes.io/projected/c1bfc71e-7105-4567-b92a-37c08b17a97c-kube-api-access-57qg6\") pod \"c1bfc71e-7105-4567-b92a-37c08b17a97c\" (UID: \"c1bfc71e-7105-4567-b92a-37c08b17a97c\") " Jan 28 20:55:22 crc kubenswrapper[4746]: I0128 20:55:22.246881 4746 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/c1bfc71e-7105-4567-b92a-37c08b17a97c-bundle" (OuterVolumeSpecName: 
"bundle") pod "c1bfc71e-7105-4567-b92a-37c08b17a97c" (UID: "c1bfc71e-7105-4567-b92a-37c08b17a97c"). InnerVolumeSpecName "bundle". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 28 20:55:22 crc kubenswrapper[4746]: I0128 20:55:22.253445 4746 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/c1bfc71e-7105-4567-b92a-37c08b17a97c-kube-api-access-57qg6" (OuterVolumeSpecName: "kube-api-access-57qg6") pod "c1bfc71e-7105-4567-b92a-37c08b17a97c" (UID: "c1bfc71e-7105-4567-b92a-37c08b17a97c"). InnerVolumeSpecName "kube-api-access-57qg6". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 28 20:55:22 crc kubenswrapper[4746]: I0128 20:55:22.261663 4746 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/c1bfc71e-7105-4567-b92a-37c08b17a97c-util" (OuterVolumeSpecName: "util") pod "c1bfc71e-7105-4567-b92a-37c08b17a97c" (UID: "c1bfc71e-7105-4567-b92a-37c08b17a97c"). InnerVolumeSpecName "util". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 28 20:55:22 crc kubenswrapper[4746]: I0128 20:55:22.347552 4746 reconciler_common.go:293] "Volume detached for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/c1bfc71e-7105-4567-b92a-37c08b17a97c-util\") on node \"crc\" DevicePath \"\"" Jan 28 20:55:22 crc kubenswrapper[4746]: I0128 20:55:22.347607 4746 reconciler_common.go:293] "Volume detached for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/c1bfc71e-7105-4567-b92a-37c08b17a97c-bundle\") on node \"crc\" DevicePath \"\"" Jan 28 20:55:22 crc kubenswrapper[4746]: I0128 20:55:22.347618 4746 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-57qg6\" (UniqueName: \"kubernetes.io/projected/c1bfc71e-7105-4567-b92a-37c08b17a97c-kube-api-access-57qg6\") on node \"crc\" DevicePath \"\"" Jan 28 20:55:22 crc kubenswrapper[4746]: I0128 20:55:22.856298 4746 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/1dcfae1c51044912e4345cf7baf7051d6a45f7abdb8f477454f0bd2ab0g68tp" event={"ID":"c1bfc71e-7105-4567-b92a-37c08b17a97c","Type":"ContainerDied","Data":"d5afaa12f8005c83a6b477900135056626e87edd696d7614043af394e6c9d70a"} Jan 28 20:55:22 crc kubenswrapper[4746]: I0128 20:55:22.856699 4746 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="d5afaa12f8005c83a6b477900135056626e87edd696d7614043af394e6c9d70a" Jan 28 20:55:22 crc kubenswrapper[4746]: I0128 20:55:22.856435 4746 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/1dcfae1c51044912e4345cf7baf7051d6a45f7abdb8f477454f0bd2ab0g68tp" Jan 28 20:55:25 crc kubenswrapper[4746]: I0128 20:55:25.368744 4746 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/openstack-operator-controller-init-58bd5c8549-ggt7x"] Jan 28 20:55:25 crc kubenswrapper[4746]: E0128 20:55:25.369012 4746 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="1c4b9237-e5b6-43f5-8f23-f4239c9e515c" containerName="extract-utilities" Jan 28 20:55:25 crc kubenswrapper[4746]: I0128 20:55:25.369025 4746 state_mem.go:107] "Deleted CPUSet assignment" podUID="1c4b9237-e5b6-43f5-8f23-f4239c9e515c" containerName="extract-utilities" Jan 28 20:55:25 crc kubenswrapper[4746]: E0128 20:55:25.369040 4746 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="c1bfc71e-7105-4567-b92a-37c08b17a97c" containerName="extract" Jan 28 20:55:25 crc kubenswrapper[4746]: I0128 20:55:25.369046 4746 state_mem.go:107] "Deleted CPUSet assignment" podUID="c1bfc71e-7105-4567-b92a-37c08b17a97c" containerName="extract" Jan 28 20:55:25 crc kubenswrapper[4746]: E0128 20:55:25.369066 4746 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="1c4b9237-e5b6-43f5-8f23-f4239c9e515c" containerName="extract-content" Jan 28 20:55:25 crc kubenswrapper[4746]: I0128 20:55:25.369072 4746 state_mem.go:107] "Deleted CPUSet assignment" podUID="1c4b9237-e5b6-43f5-8f23-f4239c9e515c" containerName="extract-content" Jan 28 20:55:25 crc kubenswrapper[4746]: E0128 20:55:25.369096 4746 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="c1bfc71e-7105-4567-b92a-37c08b17a97c" containerName="pull" Jan 28 20:55:25 crc kubenswrapper[4746]: I0128 20:55:25.369102 4746 state_mem.go:107] "Deleted CPUSet assignment" podUID="c1bfc71e-7105-4567-b92a-37c08b17a97c" containerName="pull" Jan 28 20:55:25 crc kubenswrapper[4746]: E0128 20:55:25.369112 4746 cpu_manager.go:410] "RemoveStaleState: removing container" 
podUID="1c4b9237-e5b6-43f5-8f23-f4239c9e515c" containerName="registry-server" Jan 28 20:55:25 crc kubenswrapper[4746]: I0128 20:55:25.369117 4746 state_mem.go:107] "Deleted CPUSet assignment" podUID="1c4b9237-e5b6-43f5-8f23-f4239c9e515c" containerName="registry-server" Jan 28 20:55:25 crc kubenswrapper[4746]: E0128 20:55:25.369126 4746 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="c1bfc71e-7105-4567-b92a-37c08b17a97c" containerName="util" Jan 28 20:55:25 crc kubenswrapper[4746]: I0128 20:55:25.369132 4746 state_mem.go:107] "Deleted CPUSet assignment" podUID="c1bfc71e-7105-4567-b92a-37c08b17a97c" containerName="util" Jan 28 20:55:25 crc kubenswrapper[4746]: I0128 20:55:25.369231 4746 memory_manager.go:354] "RemoveStaleState removing state" podUID="c1bfc71e-7105-4567-b92a-37c08b17a97c" containerName="extract" Jan 28 20:55:25 crc kubenswrapper[4746]: I0128 20:55:25.369240 4746 memory_manager.go:354] "RemoveStaleState removing state" podUID="1c4b9237-e5b6-43f5-8f23-f4239c9e515c" containerName="registry-server" Jan 28 20:55:25 crc kubenswrapper[4746]: I0128 20:55:25.369720 4746 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/openstack-operator-controller-init-58bd5c8549-ggt7x" Jan 28 20:55:25 crc kubenswrapper[4746]: I0128 20:55:25.372643 4746 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"openstack-operator-controller-init-dockercfg-wnp4c" Jan 28 20:55:25 crc kubenswrapper[4746]: I0128 20:55:25.390589 4746 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/openstack-operator-controller-init-58bd5c8549-ggt7x"] Jan 28 20:55:25 crc kubenswrapper[4746]: I0128 20:55:25.489409 4746 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-vxb27\" (UniqueName: \"kubernetes.io/projected/0e81bc43-baa9-4cbd-a255-233e12e2b84b-kube-api-access-vxb27\") pod \"openstack-operator-controller-init-58bd5c8549-ggt7x\" (UID: \"0e81bc43-baa9-4cbd-a255-233e12e2b84b\") " pod="openstack-operators/openstack-operator-controller-init-58bd5c8549-ggt7x" Jan 28 20:55:25 crc kubenswrapper[4746]: I0128 20:55:25.590921 4746 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-vxb27\" (UniqueName: \"kubernetes.io/projected/0e81bc43-baa9-4cbd-a255-233e12e2b84b-kube-api-access-vxb27\") pod \"openstack-operator-controller-init-58bd5c8549-ggt7x\" (UID: \"0e81bc43-baa9-4cbd-a255-233e12e2b84b\") " pod="openstack-operators/openstack-operator-controller-init-58bd5c8549-ggt7x" Jan 28 20:55:25 crc kubenswrapper[4746]: I0128 20:55:25.616886 4746 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-vxb27\" (UniqueName: \"kubernetes.io/projected/0e81bc43-baa9-4cbd-a255-233e12e2b84b-kube-api-access-vxb27\") pod \"openstack-operator-controller-init-58bd5c8549-ggt7x\" (UID: \"0e81bc43-baa9-4cbd-a255-233e12e2b84b\") " pod="openstack-operators/openstack-operator-controller-init-58bd5c8549-ggt7x" Jan 28 20:55:25 crc kubenswrapper[4746]: I0128 20:55:25.690643 4746 util.go:30] "No sandbox for pod can be 
found. Need to start a new one" pod="openstack-operators/openstack-operator-controller-init-58bd5c8549-ggt7x" Jan 28 20:55:26 crc kubenswrapper[4746]: I0128 20:55:26.170960 4746 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/openstack-operator-controller-init-58bd5c8549-ggt7x"] Jan 28 20:55:26 crc kubenswrapper[4746]: I0128 20:55:26.186782 4746 provider.go:102] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Jan 28 20:55:26 crc kubenswrapper[4746]: I0128 20:55:26.882089 4746 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/openstack-operator-controller-init-58bd5c8549-ggt7x" event={"ID":"0e81bc43-baa9-4cbd-a255-233e12e2b84b","Type":"ContainerStarted","Data":"0f2ac606a007f0591809e984dc9487337fbf711aa11041080e220da643b651de"} Jan 28 20:55:30 crc kubenswrapper[4746]: I0128 20:55:30.922701 4746 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/openstack-operator-controller-init-58bd5c8549-ggt7x" event={"ID":"0e81bc43-baa9-4cbd-a255-233e12e2b84b","Type":"ContainerStarted","Data":"b48db345a0068ae4182be928337873dddbf870b59bd24fb4729c0a33b75c2811"} Jan 28 20:55:30 crc kubenswrapper[4746]: I0128 20:55:30.923929 4746 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/openstack-operator-controller-init-58bd5c8549-ggt7x" Jan 28 20:55:30 crc kubenswrapper[4746]: I0128 20:55:30.966935 4746 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/openstack-operator-controller-init-58bd5c8549-ggt7x" podStartSLOduration=1.6225325229999998 podStartE2EDuration="5.966901779s" podCreationTimestamp="2026-01-28 20:55:25 +0000 UTC" firstStartedPulling="2026-01-28 20:55:26.18651061 +0000 UTC m=+954.142696964" lastFinishedPulling="2026-01-28 20:55:30.530879866 +0000 UTC m=+958.487066220" observedRunningTime="2026-01-28 20:55:30.964034512 +0000 UTC m=+958.920220866" watchObservedRunningTime="2026-01-28 
20:55:30.966901779 +0000 UTC m=+958.923088173" Jan 28 20:55:35 crc kubenswrapper[4746]: I0128 20:55:35.694038 4746 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/openstack-operator-controller-init-58bd5c8549-ggt7x" Jan 28 20:55:56 crc kubenswrapper[4746]: I0128 20:55:56.581438 4746 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/barbican-operator-controller-manager-7f86f8796f-kll6j"] Jan 28 20:55:56 crc kubenswrapper[4746]: I0128 20:55:56.583322 4746 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/barbican-operator-controller-manager-7f86f8796f-kll6j" Jan 28 20:55:56 crc kubenswrapper[4746]: I0128 20:55:56.585947 4746 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"barbican-operator-controller-manager-dockercfg-dwzcw" Jan 28 20:55:56 crc kubenswrapper[4746]: I0128 20:55:56.592609 4746 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/cinder-operator-controller-manager-7478f7dbf9-n6qr7"] Jan 28 20:55:56 crc kubenswrapper[4746]: I0128 20:55:56.593676 4746 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/cinder-operator-controller-manager-7478f7dbf9-n6qr7" Jan 28 20:55:56 crc kubenswrapper[4746]: I0128 20:55:56.596969 4746 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"cinder-operator-controller-manager-dockercfg-64r6g" Jan 28 20:55:56 crc kubenswrapper[4746]: I0128 20:55:56.598185 4746 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/barbican-operator-controller-manager-7f86f8796f-kll6j"] Jan 28 20:55:56 crc kubenswrapper[4746]: I0128 20:55:56.607980 4746 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/cinder-operator-controller-manager-7478f7dbf9-n6qr7"] Jan 28 20:55:56 crc kubenswrapper[4746]: I0128 20:55:56.639919 4746 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/designate-operator-controller-manager-b45d7bf98-cm85d"] Jan 28 20:55:56 crc kubenswrapper[4746]: I0128 20:55:56.641057 4746 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/designate-operator-controller-manager-b45d7bf98-cm85d" Jan 28 20:55:56 crc kubenswrapper[4746]: I0128 20:55:56.647918 4746 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"designate-operator-controller-manager-dockercfg-62scq" Jan 28 20:55:56 crc kubenswrapper[4746]: I0128 20:55:56.660294 4746 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/designate-operator-controller-manager-b45d7bf98-cm85d"] Jan 28 20:55:56 crc kubenswrapper[4746]: I0128 20:55:56.662491 4746 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-fpv9p\" (UniqueName: \"kubernetes.io/projected/f86e66ed-9f28-4514-8ff8-97b8353026d1-kube-api-access-fpv9p\") pod \"cinder-operator-controller-manager-7478f7dbf9-n6qr7\" (UID: \"f86e66ed-9f28-4514-8ff8-97b8353026d1\") " pod="openstack-operators/cinder-operator-controller-manager-7478f7dbf9-n6qr7" Jan 28 20:55:56 crc kubenswrapper[4746]: I0128 20:55:56.662699 4746 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-c7hl4\" (UniqueName: \"kubernetes.io/projected/3c81bd6e-961b-42ae-8840-2607a13046df-kube-api-access-c7hl4\") pod \"barbican-operator-controller-manager-7f86f8796f-kll6j\" (UID: \"3c81bd6e-961b-42ae-8840-2607a13046df\") " pod="openstack-operators/barbican-operator-controller-manager-7f86f8796f-kll6j" Jan 28 20:55:56 crc kubenswrapper[4746]: I0128 20:55:56.667202 4746 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/glance-operator-controller-manager-78fdd796fd-bxtxd"] Jan 28 20:55:56 crc kubenswrapper[4746]: I0128 20:55:56.668735 4746 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/glance-operator-controller-manager-78fdd796fd-bxtxd" Jan 28 20:55:56 crc kubenswrapper[4746]: I0128 20:55:56.685867 4746 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"glance-operator-controller-manager-dockercfg-r4kj6" Jan 28 20:55:56 crc kubenswrapper[4746]: I0128 20:55:56.688754 4746 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/glance-operator-controller-manager-78fdd796fd-bxtxd"] Jan 28 20:55:56 crc kubenswrapper[4746]: I0128 20:55:56.704502 4746 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/horizon-operator-controller-manager-77d5c5b54f-ws7k7"] Jan 28 20:55:56 crc kubenswrapper[4746]: I0128 20:55:56.728778 4746 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/horizon-operator-controller-manager-77d5c5b54f-ws7k7" Jan 28 20:55:56 crc kubenswrapper[4746]: I0128 20:55:56.735451 4746 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"horizon-operator-controller-manager-dockercfg-tjzxp" Jan 28 20:55:56 crc kubenswrapper[4746]: I0128 20:55:56.764064 4746 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/heat-operator-controller-manager-594c8c9d5d-p6qjg"] Jan 28 20:55:56 crc kubenswrapper[4746]: I0128 20:55:56.777248 4746 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-74pq9\" (UniqueName: \"kubernetes.io/projected/fe660f4f-8806-4674-ab58-ea3303f51683-kube-api-access-74pq9\") pod \"glance-operator-controller-manager-78fdd796fd-bxtxd\" (UID: \"fe660f4f-8806-4674-ab58-ea3303f51683\") " pod="openstack-operators/glance-operator-controller-manager-78fdd796fd-bxtxd" Jan 28 20:55:56 crc kubenswrapper[4746]: I0128 20:55:56.777433 4746 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-6sf94\" (UniqueName: 
\"kubernetes.io/projected/63794c40-0128-457d-b223-84e87943cca9-kube-api-access-6sf94\") pod \"designate-operator-controller-manager-b45d7bf98-cm85d\" (UID: \"63794c40-0128-457d-b223-84e87943cca9\") " pod="openstack-operators/designate-operator-controller-manager-b45d7bf98-cm85d" Jan 28 20:55:56 crc kubenswrapper[4746]: I0128 20:55:56.777579 4746 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-q29xm\" (UniqueName: \"kubernetes.io/projected/760877c4-6e86-4445-a4cf-002b48e93841-kube-api-access-q29xm\") pod \"horizon-operator-controller-manager-77d5c5b54f-ws7k7\" (UID: \"760877c4-6e86-4445-a4cf-002b48e93841\") " pod="openstack-operators/horizon-operator-controller-manager-77d5c5b54f-ws7k7" Jan 28 20:55:56 crc kubenswrapper[4746]: I0128 20:55:56.777654 4746 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-fpv9p\" (UniqueName: \"kubernetes.io/projected/f86e66ed-9f28-4514-8ff8-97b8353026d1-kube-api-access-fpv9p\") pod \"cinder-operator-controller-manager-7478f7dbf9-n6qr7\" (UID: \"f86e66ed-9f28-4514-8ff8-97b8353026d1\") " pod="openstack-operators/cinder-operator-controller-manager-7478f7dbf9-n6qr7" Jan 28 20:55:56 crc kubenswrapper[4746]: I0128 20:55:56.777752 4746 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-c7hl4\" (UniqueName: \"kubernetes.io/projected/3c81bd6e-961b-42ae-8840-2607a13046df-kube-api-access-c7hl4\") pod \"barbican-operator-controller-manager-7f86f8796f-kll6j\" (UID: \"3c81bd6e-961b-42ae-8840-2607a13046df\") " pod="openstack-operators/barbican-operator-controller-manager-7f86f8796f-kll6j" Jan 28 20:55:56 crc kubenswrapper[4746]: I0128 20:55:56.788426 4746 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/heat-operator-controller-manager-594c8c9d5d-p6qjg" Jan 28 20:55:56 crc kubenswrapper[4746]: I0128 20:55:56.796654 4746 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"heat-operator-controller-manager-dockercfg-6cflg" Jan 28 20:55:56 crc kubenswrapper[4746]: I0128 20:55:56.849438 4746 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-c7hl4\" (UniqueName: \"kubernetes.io/projected/3c81bd6e-961b-42ae-8840-2607a13046df-kube-api-access-c7hl4\") pod \"barbican-operator-controller-manager-7f86f8796f-kll6j\" (UID: \"3c81bd6e-961b-42ae-8840-2607a13046df\") " pod="openstack-operators/barbican-operator-controller-manager-7f86f8796f-kll6j" Jan 28 20:55:56 crc kubenswrapper[4746]: I0128 20:55:56.868052 4746 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-fpv9p\" (UniqueName: \"kubernetes.io/projected/f86e66ed-9f28-4514-8ff8-97b8353026d1-kube-api-access-fpv9p\") pod \"cinder-operator-controller-manager-7478f7dbf9-n6qr7\" (UID: \"f86e66ed-9f28-4514-8ff8-97b8353026d1\") " pod="openstack-operators/cinder-operator-controller-manager-7478f7dbf9-n6qr7" Jan 28 20:55:56 crc kubenswrapper[4746]: I0128 20:55:56.872921 4746 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/horizon-operator-controller-manager-77d5c5b54f-ws7k7"] Jan 28 20:55:56 crc kubenswrapper[4746]: I0128 20:55:56.877300 4746 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/heat-operator-controller-manager-594c8c9d5d-p6qjg"] Jan 28 20:55:56 crc kubenswrapper[4746]: I0128 20:55:56.878770 4746 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-q29xm\" (UniqueName: \"kubernetes.io/projected/760877c4-6e86-4445-a4cf-002b48e93841-kube-api-access-q29xm\") pod \"horizon-operator-controller-manager-77d5c5b54f-ws7k7\" (UID: \"760877c4-6e86-4445-a4cf-002b48e93841\") " 
pod="openstack-operators/horizon-operator-controller-manager-77d5c5b54f-ws7k7" Jan 28 20:55:56 crc kubenswrapper[4746]: I0128 20:55:56.878856 4746 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-4dbwr\" (UniqueName: \"kubernetes.io/projected/677d2ab0-897d-4fd5-8ca5-b75f310e38da-kube-api-access-4dbwr\") pod \"heat-operator-controller-manager-594c8c9d5d-p6qjg\" (UID: \"677d2ab0-897d-4fd5-8ca5-b75f310e38da\") " pod="openstack-operators/heat-operator-controller-manager-594c8c9d5d-p6qjg" Jan 28 20:55:56 crc kubenswrapper[4746]: I0128 20:55:56.878892 4746 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-74pq9\" (UniqueName: \"kubernetes.io/projected/fe660f4f-8806-4674-ab58-ea3303f51683-kube-api-access-74pq9\") pod \"glance-operator-controller-manager-78fdd796fd-bxtxd\" (UID: \"fe660f4f-8806-4674-ab58-ea3303f51683\") " pod="openstack-operators/glance-operator-controller-manager-78fdd796fd-bxtxd" Jan 28 20:55:56 crc kubenswrapper[4746]: I0128 20:55:56.878955 4746 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-6sf94\" (UniqueName: \"kubernetes.io/projected/63794c40-0128-457d-b223-84e87943cca9-kube-api-access-6sf94\") pod \"designate-operator-controller-manager-b45d7bf98-cm85d\" (UID: \"63794c40-0128-457d-b223-84e87943cca9\") " pod="openstack-operators/designate-operator-controller-manager-b45d7bf98-cm85d" Jan 28 20:55:56 crc kubenswrapper[4746]: I0128 20:55:56.898482 4746 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/infra-operator-controller-manager-694cf4f878-th2hg"] Jan 28 20:55:56 crc kubenswrapper[4746]: I0128 20:55:56.899768 4746 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/infra-operator-controller-manager-694cf4f878-th2hg" Jan 28 20:55:56 crc kubenswrapper[4746]: I0128 20:55:56.908623 4746 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"infra-operator-webhook-server-cert" Jan 28 20:55:56 crc kubenswrapper[4746]: I0128 20:55:56.908894 4746 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"infra-operator-controller-manager-dockercfg-rz87l" Jan 28 20:55:56 crc kubenswrapper[4746]: I0128 20:55:56.912858 4746 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/barbican-operator-controller-manager-7f86f8796f-kll6j" Jan 28 20:55:56 crc kubenswrapper[4746]: I0128 20:55:56.915253 4746 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-q29xm\" (UniqueName: \"kubernetes.io/projected/760877c4-6e86-4445-a4cf-002b48e93841-kube-api-access-q29xm\") pod \"horizon-operator-controller-manager-77d5c5b54f-ws7k7\" (UID: \"760877c4-6e86-4445-a4cf-002b48e93841\") " pod="openstack-operators/horizon-operator-controller-manager-77d5c5b54f-ws7k7" Jan 28 20:55:56 crc kubenswrapper[4746]: I0128 20:55:56.924203 4746 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/infra-operator-controller-manager-694cf4f878-th2hg"] Jan 28 20:55:56 crc kubenswrapper[4746]: I0128 20:55:56.930458 4746 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/cinder-operator-controller-manager-7478f7dbf9-n6qr7" Jan 28 20:55:56 crc kubenswrapper[4746]: I0128 20:55:56.931068 4746 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-6sf94\" (UniqueName: \"kubernetes.io/projected/63794c40-0128-457d-b223-84e87943cca9-kube-api-access-6sf94\") pod \"designate-operator-controller-manager-b45d7bf98-cm85d\" (UID: \"63794c40-0128-457d-b223-84e87943cca9\") " pod="openstack-operators/designate-operator-controller-manager-b45d7bf98-cm85d" Jan 28 20:55:56 crc kubenswrapper[4746]: I0128 20:55:56.931639 4746 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-74pq9\" (UniqueName: \"kubernetes.io/projected/fe660f4f-8806-4674-ab58-ea3303f51683-kube-api-access-74pq9\") pod \"glance-operator-controller-manager-78fdd796fd-bxtxd\" (UID: \"fe660f4f-8806-4674-ab58-ea3303f51683\") " pod="openstack-operators/glance-operator-controller-manager-78fdd796fd-bxtxd" Jan 28 20:55:56 crc kubenswrapper[4746]: I0128 20:55:56.940790 4746 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/keystone-operator-controller-manager-b8b6d4659-65qb5"] Jan 28 20:55:56 crc kubenswrapper[4746]: I0128 20:55:56.941676 4746 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/keystone-operator-controller-manager-b8b6d4659-65qb5" Jan 28 20:55:56 crc kubenswrapper[4746]: I0128 20:55:56.945375 4746 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"keystone-operator-controller-manager-dockercfg-pmz2z" Jan 28 20:55:56 crc kubenswrapper[4746]: I0128 20:55:56.957214 4746 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/keystone-operator-controller-manager-b8b6d4659-65qb5"] Jan 28 20:55:56 crc kubenswrapper[4746]: I0128 20:55:56.958582 4746 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/manila-operator-controller-manager-78c6999f6f-m5qbs"] Jan 28 20:55:56 crc kubenswrapper[4746]: I0128 20:55:56.961104 4746 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/manila-operator-controller-manager-78c6999f6f-m5qbs" Jan 28 20:55:56 crc kubenswrapper[4746]: I0128 20:55:56.977626 4746 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/designate-operator-controller-manager-b45d7bf98-cm85d" Jan 28 20:55:56 crc kubenswrapper[4746]: I0128 20:55:56.977626 4746 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/mariadb-operator-controller-manager-6b9fb5fdcb-zcvgn"] Jan 28 20:55:56 crc kubenswrapper[4746]: I0128 20:55:56.979771 4746 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-kgqxx\" (UniqueName: \"kubernetes.io/projected/f682c47e-2151-466d-8cc5-9ef0fca79785-kube-api-access-kgqxx\") pod \"manila-operator-controller-manager-78c6999f6f-m5qbs\" (UID: \"f682c47e-2151-466d-8cc5-9ef0fca79785\") " pod="openstack-operators/manila-operator-controller-manager-78c6999f6f-m5qbs" Jan 28 20:55:56 crc kubenswrapper[4746]: I0128 20:55:56.979800 4746 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-rrwpt\" (UniqueName: \"kubernetes.io/projected/28de2427-e250-44f5-add2-1b738cf6ce3b-kube-api-access-rrwpt\") pod \"infra-operator-controller-manager-694cf4f878-th2hg\" (UID: \"28de2427-e250-44f5-add2-1b738cf6ce3b\") " pod="openstack-operators/infra-operator-controller-manager-694cf4f878-th2hg" Jan 28 20:55:56 crc kubenswrapper[4746]: I0128 20:55:56.979833 4746 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-gjxzk\" (UniqueName: \"kubernetes.io/projected/fc220202-4669-4c2e-94b0-583048b56c83-kube-api-access-gjxzk\") pod \"keystone-operator-controller-manager-b8b6d4659-65qb5\" (UID: \"fc220202-4669-4c2e-94b0-583048b56c83\") " pod="openstack-operators/keystone-operator-controller-manager-b8b6d4659-65qb5" Jan 28 20:55:56 crc kubenswrapper[4746]: I0128 20:55:56.979882 4746 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-4dbwr\" (UniqueName: 
\"kubernetes.io/projected/677d2ab0-897d-4fd5-8ca5-b75f310e38da-kube-api-access-4dbwr\") pod \"heat-operator-controller-manager-594c8c9d5d-p6qjg\" (UID: \"677d2ab0-897d-4fd5-8ca5-b75f310e38da\") " pod="openstack-operators/heat-operator-controller-manager-594c8c9d5d-p6qjg" Jan 28 20:55:56 crc kubenswrapper[4746]: I0128 20:55:56.979905 4746 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/28de2427-e250-44f5-add2-1b738cf6ce3b-cert\") pod \"infra-operator-controller-manager-694cf4f878-th2hg\" (UID: \"28de2427-e250-44f5-add2-1b738cf6ce3b\") " pod="openstack-operators/infra-operator-controller-manager-694cf4f878-th2hg" Jan 28 20:55:56 crc kubenswrapper[4746]: I0128 20:55:56.980003 4746 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/mariadb-operator-controller-manager-6b9fb5fdcb-zcvgn" Jan 28 20:55:56 crc kubenswrapper[4746]: I0128 20:55:56.992949 4746 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"manila-operator-controller-manager-dockercfg-5gg6f" Jan 28 20:55:57 crc kubenswrapper[4746]: I0128 20:55:56.998044 4746 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"mariadb-operator-controller-manager-dockercfg-gwqgg" Jan 28 20:55:57 crc kubenswrapper[4746]: I0128 20:55:56.999667 4746 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/mariadb-operator-controller-manager-6b9fb5fdcb-zcvgn"] Jan 28 20:55:57 crc kubenswrapper[4746]: I0128 20:55:57.009194 4746 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/manila-operator-controller-manager-78c6999f6f-m5qbs"] Jan 28 20:55:57 crc kubenswrapper[4746]: I0128 20:55:57.019903 4746 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/neutron-operator-controller-manager-78d58447c5-pcprz"] Jan 28 20:55:57 crc kubenswrapper[4746]: I0128 20:55:57.020790 4746 
util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/neutron-operator-controller-manager-78d58447c5-pcprz" Jan 28 20:55:57 crc kubenswrapper[4746]: I0128 20:55:57.023135 4746 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/ironic-operator-controller-manager-598f7747c9-5lc6j"] Jan 28 20:55:57 crc kubenswrapper[4746]: I0128 20:55:57.024204 4746 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/ironic-operator-controller-manager-598f7747c9-5lc6j" Jan 28 20:55:57 crc kubenswrapper[4746]: I0128 20:55:57.036799 4746 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"neutron-operator-controller-manager-dockercfg-2g9vj" Jan 28 20:55:57 crc kubenswrapper[4746]: I0128 20:55:57.036942 4746 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-4dbwr\" (UniqueName: \"kubernetes.io/projected/677d2ab0-897d-4fd5-8ca5-b75f310e38da-kube-api-access-4dbwr\") pod \"heat-operator-controller-manager-594c8c9d5d-p6qjg\" (UID: \"677d2ab0-897d-4fd5-8ca5-b75f310e38da\") " pod="openstack-operators/heat-operator-controller-manager-594c8c9d5d-p6qjg" Jan 28 20:55:57 crc kubenswrapper[4746]: I0128 20:55:57.038318 4746 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"ironic-operator-controller-manager-dockercfg-sdznh" Jan 28 20:55:57 crc kubenswrapper[4746]: I0128 20:55:57.039920 4746 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/neutron-operator-controller-manager-78d58447c5-pcprz"] Jan 28 20:55:57 crc kubenswrapper[4746]: I0128 20:55:57.047217 4746 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/nova-operator-controller-manager-7bdb645866-pg4s4"] Jan 28 20:55:57 crc kubenswrapper[4746]: I0128 20:55:57.048533 4746 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/glance-operator-controller-manager-78fdd796fd-bxtxd" Jan 28 20:55:57 crc kubenswrapper[4746]: I0128 20:55:57.050198 4746 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/nova-operator-controller-manager-7bdb645866-pg4s4" Jan 28 20:55:57 crc kubenswrapper[4746]: I0128 20:55:57.056511 4746 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"nova-operator-controller-manager-dockercfg-kjbtl" Jan 28 20:55:57 crc kubenswrapper[4746]: I0128 20:55:57.060244 4746 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/ironic-operator-controller-manager-598f7747c9-5lc6j"] Jan 28 20:55:57 crc kubenswrapper[4746]: I0128 20:55:57.085052 4746 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-kgqxx\" (UniqueName: \"kubernetes.io/projected/f682c47e-2151-466d-8cc5-9ef0fca79785-kube-api-access-kgqxx\") pod \"manila-operator-controller-manager-78c6999f6f-m5qbs\" (UID: \"f682c47e-2151-466d-8cc5-9ef0fca79785\") " pod="openstack-operators/manila-operator-controller-manager-78c6999f6f-m5qbs" Jan 28 20:55:57 crc kubenswrapper[4746]: I0128 20:55:57.085132 4746 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-rrwpt\" (UniqueName: \"kubernetes.io/projected/28de2427-e250-44f5-add2-1b738cf6ce3b-kube-api-access-rrwpt\") pod \"infra-operator-controller-manager-694cf4f878-th2hg\" (UID: \"28de2427-e250-44f5-add2-1b738cf6ce3b\") " pod="openstack-operators/infra-operator-controller-manager-694cf4f878-th2hg" Jan 28 20:55:57 crc kubenswrapper[4746]: I0128 20:55:57.085163 4746 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-99fj2\" (UniqueName: \"kubernetes.io/projected/b182a0df-d0f9-46d6-9a0c-a3e332c84cff-kube-api-access-99fj2\") pod \"neutron-operator-controller-manager-78d58447c5-pcprz\" (UID: 
\"b182a0df-d0f9-46d6-9a0c-a3e332c84cff\") " pod="openstack-operators/neutron-operator-controller-manager-78d58447c5-pcprz" Jan 28 20:55:57 crc kubenswrapper[4746]: I0128 20:55:57.085200 4746 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-nglmv\" (UniqueName: \"kubernetes.io/projected/5521c5f5-d2f6-461b-a2fc-ee97a5b2df11-kube-api-access-nglmv\") pod \"mariadb-operator-controller-manager-6b9fb5fdcb-zcvgn\" (UID: \"5521c5f5-d2f6-461b-a2fc-ee97a5b2df11\") " pod="openstack-operators/mariadb-operator-controller-manager-6b9fb5fdcb-zcvgn" Jan 28 20:55:57 crc kubenswrapper[4746]: I0128 20:55:57.085237 4746 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-gjxzk\" (UniqueName: \"kubernetes.io/projected/fc220202-4669-4c2e-94b0-583048b56c83-kube-api-access-gjxzk\") pod \"keystone-operator-controller-manager-b8b6d4659-65qb5\" (UID: \"fc220202-4669-4c2e-94b0-583048b56c83\") " pod="openstack-operators/keystone-operator-controller-manager-b8b6d4659-65qb5" Jan 28 20:55:57 crc kubenswrapper[4746]: I0128 20:55:57.085464 4746 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/horizon-operator-controller-manager-77d5c5b54f-ws7k7" Jan 28 20:55:57 crc kubenswrapper[4746]: I0128 20:55:57.087065 4746 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-rrqs2\" (UniqueName: \"kubernetes.io/projected/b44b1510-0a60-4b4e-9541-cc6d18e10a7f-kube-api-access-rrqs2\") pod \"ironic-operator-controller-manager-598f7747c9-5lc6j\" (UID: \"b44b1510-0a60-4b4e-9541-cc6d18e10a7f\") " pod="openstack-operators/ironic-operator-controller-manager-598f7747c9-5lc6j" Jan 28 20:55:57 crc kubenswrapper[4746]: I0128 20:55:57.087140 4746 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-555m9\" (UniqueName: \"kubernetes.io/projected/ced3eeee-ed33-4c50-8531-a7e4df1849f6-kube-api-access-555m9\") pod \"nova-operator-controller-manager-7bdb645866-pg4s4\" (UID: \"ced3eeee-ed33-4c50-8531-a7e4df1849f6\") " pod="openstack-operators/nova-operator-controller-manager-7bdb645866-pg4s4" Jan 28 20:55:57 crc kubenswrapper[4746]: I0128 20:55:57.087213 4746 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/28de2427-e250-44f5-add2-1b738cf6ce3b-cert\") pod \"infra-operator-controller-manager-694cf4f878-th2hg\" (UID: \"28de2427-e250-44f5-add2-1b738cf6ce3b\") " pod="openstack-operators/infra-operator-controller-manager-694cf4f878-th2hg" Jan 28 20:55:57 crc kubenswrapper[4746]: E0128 20:55:57.087373 4746 secret.go:188] Couldn't get secret openstack-operators/infra-operator-webhook-server-cert: secret "infra-operator-webhook-server-cert" not found Jan 28 20:55:57 crc kubenswrapper[4746]: E0128 20:55:57.087432 4746 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/28de2427-e250-44f5-add2-1b738cf6ce3b-cert podName:28de2427-e250-44f5-add2-1b738cf6ce3b nodeName:}" failed. 
No retries permitted until 2026-01-28 20:55:57.587417321 +0000 UTC m=+985.543603675 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/28de2427-e250-44f5-add2-1b738cf6ce3b-cert") pod "infra-operator-controller-manager-694cf4f878-th2hg" (UID: "28de2427-e250-44f5-add2-1b738cf6ce3b") : secret "infra-operator-webhook-server-cert" not found Jan 28 20:55:57 crc kubenswrapper[4746]: I0128 20:55:57.092566 4746 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/nova-operator-controller-manager-7bdb645866-pg4s4"] Jan 28 20:55:57 crc kubenswrapper[4746]: I0128 20:55:57.101930 4746 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/octavia-operator-controller-manager-5f4cd88d46-hb6t9"] Jan 28 20:55:57 crc kubenswrapper[4746]: I0128 20:55:57.117195 4746 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/octavia-operator-controller-manager-5f4cd88d46-hb6t9" Jan 28 20:55:57 crc kubenswrapper[4746]: I0128 20:55:57.118819 4746 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-kgqxx\" (UniqueName: \"kubernetes.io/projected/f682c47e-2151-466d-8cc5-9ef0fca79785-kube-api-access-kgqxx\") pod \"manila-operator-controller-manager-78c6999f6f-m5qbs\" (UID: \"f682c47e-2151-466d-8cc5-9ef0fca79785\") " pod="openstack-operators/manila-operator-controller-manager-78c6999f6f-m5qbs" Jan 28 20:55:57 crc kubenswrapper[4746]: I0128 20:55:57.123596 4746 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-gjxzk\" (UniqueName: \"kubernetes.io/projected/fc220202-4669-4c2e-94b0-583048b56c83-kube-api-access-gjxzk\") pod \"keystone-operator-controller-manager-b8b6d4659-65qb5\" (UID: \"fc220202-4669-4c2e-94b0-583048b56c83\") " pod="openstack-operators/keystone-operator-controller-manager-b8b6d4659-65qb5" Jan 28 20:55:57 crc kubenswrapper[4746]: I0128 20:55:57.124735 4746 reflector.go:368] 
Caches populated for *v1.Secret from object-"openstack-operators"/"octavia-operator-controller-manager-dockercfg-vg69w" Jan 28 20:55:57 crc kubenswrapper[4746]: I0128 20:55:57.141567 4746 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/octavia-operator-controller-manager-5f4cd88d46-hb6t9"] Jan 28 20:55:57 crc kubenswrapper[4746]: I0128 20:55:57.141857 4746 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-rrwpt\" (UniqueName: \"kubernetes.io/projected/28de2427-e250-44f5-add2-1b738cf6ce3b-kube-api-access-rrwpt\") pod \"infra-operator-controller-manager-694cf4f878-th2hg\" (UID: \"28de2427-e250-44f5-add2-1b738cf6ce3b\") " pod="openstack-operators/infra-operator-controller-manager-694cf4f878-th2hg" Jan 28 20:55:57 crc kubenswrapper[4746]: I0128 20:55:57.161718 4746 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/ovn-operator-controller-manager-6f75f45d54-kpcqr"] Jan 28 20:55:57 crc kubenswrapper[4746]: I0128 20:55:57.162595 4746 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/ovn-operator-controller-manager-6f75f45d54-kpcqr" Jan 28 20:55:57 crc kubenswrapper[4746]: I0128 20:55:57.163459 4746 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/heat-operator-controller-manager-594c8c9d5d-p6qjg" Jan 28 20:55:57 crc kubenswrapper[4746]: I0128 20:55:57.178600 4746 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"ovn-operator-controller-manager-dockercfg-vdv59" Jan 28 20:55:57 crc kubenswrapper[4746]: I0128 20:55:57.196965 4746 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-cqbzd\" (UniqueName: \"kubernetes.io/projected/3b28dc9c-6dcf-4fd1-8cbd-f13d0da9e954-kube-api-access-cqbzd\") pod \"octavia-operator-controller-manager-5f4cd88d46-hb6t9\" (UID: \"3b28dc9c-6dcf-4fd1-8cbd-f13d0da9e954\") " pod="openstack-operators/octavia-operator-controller-manager-5f4cd88d46-hb6t9" Jan 28 20:55:57 crc kubenswrapper[4746]: I0128 20:55:57.198157 4746 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-99fj2\" (UniqueName: \"kubernetes.io/projected/b182a0df-d0f9-46d6-9a0c-a3e332c84cff-kube-api-access-99fj2\") pod \"neutron-operator-controller-manager-78d58447c5-pcprz\" (UID: \"b182a0df-d0f9-46d6-9a0c-a3e332c84cff\") " pod="openstack-operators/neutron-operator-controller-manager-78d58447c5-pcprz" Jan 28 20:55:57 crc kubenswrapper[4746]: I0128 20:55:57.198184 4746 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-nglmv\" (UniqueName: \"kubernetes.io/projected/5521c5f5-d2f6-461b-a2fc-ee97a5b2df11-kube-api-access-nglmv\") pod \"mariadb-operator-controller-manager-6b9fb5fdcb-zcvgn\" (UID: \"5521c5f5-d2f6-461b-a2fc-ee97a5b2df11\") " pod="openstack-operators/mariadb-operator-controller-manager-6b9fb5fdcb-zcvgn" Jan 28 20:55:57 crc kubenswrapper[4746]: I0128 20:55:57.198258 4746 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-x6p67\" (UniqueName: \"kubernetes.io/projected/e3360f0f-1430-4b7e-9ee0-0a126a9b657d-kube-api-access-x6p67\") pod 
\"ovn-operator-controller-manager-6f75f45d54-kpcqr\" (UID: \"e3360f0f-1430-4b7e-9ee0-0a126a9b657d\") " pod="openstack-operators/ovn-operator-controller-manager-6f75f45d54-kpcqr" Jan 28 20:55:57 crc kubenswrapper[4746]: I0128 20:55:57.198303 4746 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-rrqs2\" (UniqueName: \"kubernetes.io/projected/b44b1510-0a60-4b4e-9541-cc6d18e10a7f-kube-api-access-rrqs2\") pod \"ironic-operator-controller-manager-598f7747c9-5lc6j\" (UID: \"b44b1510-0a60-4b4e-9541-cc6d18e10a7f\") " pod="openstack-operators/ironic-operator-controller-manager-598f7747c9-5lc6j" Jan 28 20:55:57 crc kubenswrapper[4746]: I0128 20:55:57.198350 4746 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-555m9\" (UniqueName: \"kubernetes.io/projected/ced3eeee-ed33-4c50-8531-a7e4df1849f6-kube-api-access-555m9\") pod \"nova-operator-controller-manager-7bdb645866-pg4s4\" (UID: \"ced3eeee-ed33-4c50-8531-a7e4df1849f6\") " pod="openstack-operators/nova-operator-controller-manager-7bdb645866-pg4s4" Jan 28 20:55:57 crc kubenswrapper[4746]: I0128 20:55:57.199095 4746 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/openstack-baremetal-operator-controller-manager-6b68b8b85449cmp"] Jan 28 20:55:57 crc kubenswrapper[4746]: I0128 20:55:57.200107 4746 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/openstack-baremetal-operator-controller-manager-6b68b8b85449cmp" Jan 28 20:55:57 crc kubenswrapper[4746]: I0128 20:55:57.206169 4746 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"openstack-baremetal-operator-webhook-server-cert" Jan 28 20:55:57 crc kubenswrapper[4746]: I0128 20:55:57.206400 4746 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"openstack-baremetal-operator-controller-manager-dockercfg-9knjr" Jan 28 20:55:57 crc kubenswrapper[4746]: I0128 20:55:57.229066 4746 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-555m9\" (UniqueName: \"kubernetes.io/projected/ced3eeee-ed33-4c50-8531-a7e4df1849f6-kube-api-access-555m9\") pod \"nova-operator-controller-manager-7bdb645866-pg4s4\" (UID: \"ced3eeee-ed33-4c50-8531-a7e4df1849f6\") " pod="openstack-operators/nova-operator-controller-manager-7bdb645866-pg4s4" Jan 28 20:55:57 crc kubenswrapper[4746]: I0128 20:55:57.232018 4746 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/ovn-operator-controller-manager-6f75f45d54-kpcqr"] Jan 28 20:55:57 crc kubenswrapper[4746]: I0128 20:55:57.273596 4746 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/openstack-baremetal-operator-controller-manager-6b68b8b85449cmp"] Jan 28 20:55:57 crc kubenswrapper[4746]: I0128 20:55:57.274529 4746 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-rrqs2\" (UniqueName: \"kubernetes.io/projected/b44b1510-0a60-4b4e-9541-cc6d18e10a7f-kube-api-access-rrqs2\") pod \"ironic-operator-controller-manager-598f7747c9-5lc6j\" (UID: \"b44b1510-0a60-4b4e-9541-cc6d18e10a7f\") " pod="openstack-operators/ironic-operator-controller-manager-598f7747c9-5lc6j" Jan 28 20:55:57 crc kubenswrapper[4746]: I0128 20:55:57.275136 4746 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/nova-operator-controller-manager-7bdb645866-pg4s4" Jan 28 20:55:57 crc kubenswrapper[4746]: I0128 20:55:57.278798 4746 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-99fj2\" (UniqueName: \"kubernetes.io/projected/b182a0df-d0f9-46d6-9a0c-a3e332c84cff-kube-api-access-99fj2\") pod \"neutron-operator-controller-manager-78d58447c5-pcprz\" (UID: \"b182a0df-d0f9-46d6-9a0c-a3e332c84cff\") " pod="openstack-operators/neutron-operator-controller-manager-78d58447c5-pcprz" Jan 28 20:55:57 crc kubenswrapper[4746]: I0128 20:55:57.279370 4746 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-nglmv\" (UniqueName: \"kubernetes.io/projected/5521c5f5-d2f6-461b-a2fc-ee97a5b2df11-kube-api-access-nglmv\") pod \"mariadb-operator-controller-manager-6b9fb5fdcb-zcvgn\" (UID: \"5521c5f5-d2f6-461b-a2fc-ee97a5b2df11\") " pod="openstack-operators/mariadb-operator-controller-manager-6b9fb5fdcb-zcvgn" Jan 28 20:55:57 crc kubenswrapper[4746]: I0128 20:55:57.302157 4746 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/3ff4c44c-0290-4ab0-abb8-316375200dc0-cert\") pod \"openstack-baremetal-operator-controller-manager-6b68b8b85449cmp\" (UID: \"3ff4c44c-0290-4ab0-abb8-316375200dc0\") " pod="openstack-operators/openstack-baremetal-operator-controller-manager-6b68b8b85449cmp" Jan 28 20:55:57 crc kubenswrapper[4746]: I0128 20:55:57.302433 4746 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-x6p67\" (UniqueName: \"kubernetes.io/projected/e3360f0f-1430-4b7e-9ee0-0a126a9b657d-kube-api-access-x6p67\") pod \"ovn-operator-controller-manager-6f75f45d54-kpcqr\" (UID: \"e3360f0f-1430-4b7e-9ee0-0a126a9b657d\") " pod="openstack-operators/ovn-operator-controller-manager-6f75f45d54-kpcqr" Jan 28 20:55:57 crc kubenswrapper[4746]: I0128 20:55:57.302519 4746 
reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-jcc4f\" (UniqueName: \"kubernetes.io/projected/3ff4c44c-0290-4ab0-abb8-316375200dc0-kube-api-access-jcc4f\") pod \"openstack-baremetal-operator-controller-manager-6b68b8b85449cmp\" (UID: \"3ff4c44c-0290-4ab0-abb8-316375200dc0\") " pod="openstack-operators/openstack-baremetal-operator-controller-manager-6b68b8b85449cmp" Jan 28 20:55:57 crc kubenswrapper[4746]: I0128 20:55:57.302594 4746 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-cqbzd\" (UniqueName: \"kubernetes.io/projected/3b28dc9c-6dcf-4fd1-8cbd-f13d0da9e954-kube-api-access-cqbzd\") pod \"octavia-operator-controller-manager-5f4cd88d46-hb6t9\" (UID: \"3b28dc9c-6dcf-4fd1-8cbd-f13d0da9e954\") " pod="openstack-operators/octavia-operator-controller-manager-5f4cd88d46-hb6t9" Jan 28 20:55:57 crc kubenswrapper[4746]: I0128 20:55:57.317239 4746 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/placement-operator-controller-manager-79d5ccc684-6klzp"] Jan 28 20:55:57 crc kubenswrapper[4746]: I0128 20:55:57.319101 4746 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/placement-operator-controller-manager-79d5ccc684-6klzp" Jan 28 20:55:57 crc kubenswrapper[4746]: I0128 20:55:57.325877 4746 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"placement-operator-controller-manager-dockercfg-qkzbv" Jan 28 20:55:57 crc kubenswrapper[4746]: I0128 20:55:57.356563 4746 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/keystone-operator-controller-manager-b8b6d4659-65qb5" Jan 28 20:55:57 crc kubenswrapper[4746]: I0128 20:55:57.360856 4746 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/swift-operator-controller-manager-547cbdb99f-fjs9l"] Jan 28 20:55:57 crc kubenswrapper[4746]: I0128 20:55:57.363268 4746 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/manila-operator-controller-manager-78c6999f6f-m5qbs" Jan 28 20:55:57 crc kubenswrapper[4746]: I0128 20:55:57.363723 4746 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/swift-operator-controller-manager-547cbdb99f-fjs9l" Jan 28 20:55:57 crc kubenswrapper[4746]: I0128 20:55:57.371536 4746 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"swift-operator-controller-manager-dockercfg-prft7" Jan 28 20:55:57 crc kubenswrapper[4746]: I0128 20:55:57.405137 4746 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-cqbzd\" (UniqueName: \"kubernetes.io/projected/3b28dc9c-6dcf-4fd1-8cbd-f13d0da9e954-kube-api-access-cqbzd\") pod \"octavia-operator-controller-manager-5f4cd88d46-hb6t9\" (UID: \"3b28dc9c-6dcf-4fd1-8cbd-f13d0da9e954\") " pod="openstack-operators/octavia-operator-controller-manager-5f4cd88d46-hb6t9" Jan 28 20:55:57 crc kubenswrapper[4746]: I0128 20:55:57.406488 4746 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/placement-operator-controller-manager-79d5ccc684-6klzp"] Jan 28 20:55:57 crc kubenswrapper[4746]: I0128 20:55:57.419524 4746 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/mariadb-operator-controller-manager-6b9fb5fdcb-zcvgn" Jan 28 20:55:57 crc kubenswrapper[4746]: I0128 20:55:57.420768 4746 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-ll8vr\" (UniqueName: \"kubernetes.io/projected/1f4e2d58-bbd0-45d2-81ba-2b4b47ab5af6-kube-api-access-ll8vr\") pod \"swift-operator-controller-manager-547cbdb99f-fjs9l\" (UID: \"1f4e2d58-bbd0-45d2-81ba-2b4b47ab5af6\") " pod="openstack-operators/swift-operator-controller-manager-547cbdb99f-fjs9l" Jan 28 20:55:57 crc kubenswrapper[4746]: I0128 20:55:57.420826 4746 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/3ff4c44c-0290-4ab0-abb8-316375200dc0-cert\") pod \"openstack-baremetal-operator-controller-manager-6b68b8b85449cmp\" (UID: \"3ff4c44c-0290-4ab0-abb8-316375200dc0\") " pod="openstack-operators/openstack-baremetal-operator-controller-manager-6b68b8b85449cmp" Jan 28 20:55:57 crc kubenswrapper[4746]: I0128 20:55:57.420892 4746 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-jcc4f\" (UniqueName: \"kubernetes.io/projected/3ff4c44c-0290-4ab0-abb8-316375200dc0-kube-api-access-jcc4f\") pod \"openstack-baremetal-operator-controller-manager-6b68b8b85449cmp\" (UID: \"3ff4c44c-0290-4ab0-abb8-316375200dc0\") " pod="openstack-operators/openstack-baremetal-operator-controller-manager-6b68b8b85449cmp" Jan 28 20:55:57 crc kubenswrapper[4746]: E0128 20:55:57.423184 4746 secret.go:188] Couldn't get secret openstack-operators/openstack-baremetal-operator-webhook-server-cert: secret "openstack-baremetal-operator-webhook-server-cert" not found Jan 28 20:55:57 crc kubenswrapper[4746]: E0128 20:55:57.423256 4746 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/3ff4c44c-0290-4ab0-abb8-316375200dc0-cert podName:3ff4c44c-0290-4ab0-abb8-316375200dc0 nodeName:}" 
failed. No retries permitted until 2026-01-28 20:55:57.923234533 +0000 UTC m=+985.879420887 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/3ff4c44c-0290-4ab0-abb8-316375200dc0-cert") pod "openstack-baremetal-operator-controller-manager-6b68b8b85449cmp" (UID: "3ff4c44c-0290-4ab0-abb8-316375200dc0") : secret "openstack-baremetal-operator-webhook-server-cert" not found Jan 28 20:55:57 crc kubenswrapper[4746]: I0128 20:55:57.423260 4746 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-x6p67\" (UniqueName: \"kubernetes.io/projected/e3360f0f-1430-4b7e-9ee0-0a126a9b657d-kube-api-access-x6p67\") pod \"ovn-operator-controller-manager-6f75f45d54-kpcqr\" (UID: \"e3360f0f-1430-4b7e-9ee0-0a126a9b657d\") " pod="openstack-operators/ovn-operator-controller-manager-6f75f45d54-kpcqr" Jan 28 20:55:57 crc kubenswrapper[4746]: I0128 20:55:57.443480 4746 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/neutron-operator-controller-manager-78d58447c5-pcprz" Jan 28 20:55:57 crc kubenswrapper[4746]: I0128 20:55:57.479179 4746 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-jcc4f\" (UniqueName: \"kubernetes.io/projected/3ff4c44c-0290-4ab0-abb8-316375200dc0-kube-api-access-jcc4f\") pod \"openstack-baremetal-operator-controller-manager-6b68b8b85449cmp\" (UID: \"3ff4c44c-0290-4ab0-abb8-316375200dc0\") " pod="openstack-operators/openstack-baremetal-operator-controller-manager-6b68b8b85449cmp" Jan 28 20:55:57 crc kubenswrapper[4746]: I0128 20:55:57.485486 4746 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/swift-operator-controller-manager-547cbdb99f-fjs9l"] Jan 28 20:55:57 crc kubenswrapper[4746]: I0128 20:55:57.520534 4746 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/telemetry-operator-controller-manager-9477bbd48-z984g"] Jan 28 20:55:57 crc 
kubenswrapper[4746]: I0128 20:55:57.521505 4746 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/ironic-operator-controller-manager-598f7747c9-5lc6j" Jan 28 20:55:57 crc kubenswrapper[4746]: I0128 20:55:57.522306 4746 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/telemetry-operator-controller-manager-9477bbd48-z984g" Jan 28 20:55:57 crc kubenswrapper[4746]: I0128 20:55:57.524957 4746 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-ll8vr\" (UniqueName: \"kubernetes.io/projected/1f4e2d58-bbd0-45d2-81ba-2b4b47ab5af6-kube-api-access-ll8vr\") pod \"swift-operator-controller-manager-547cbdb99f-fjs9l\" (UID: \"1f4e2d58-bbd0-45d2-81ba-2b4b47ab5af6\") " pod="openstack-operators/swift-operator-controller-manager-547cbdb99f-fjs9l" Jan 28 20:55:57 crc kubenswrapper[4746]: I0128 20:55:57.525058 4746 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-svd5j\" (UniqueName: \"kubernetes.io/projected/370a5739-7af0-4065-986c-af68a265423c-kube-api-access-svd5j\") pod \"placement-operator-controller-manager-79d5ccc684-6klzp\" (UID: \"370a5739-7af0-4065-986c-af68a265423c\") " pod="openstack-operators/placement-operator-controller-manager-79d5ccc684-6klzp" Jan 28 20:55:57 crc kubenswrapper[4746]: I0128 20:55:57.533064 4746 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"telemetry-operator-controller-manager-dockercfg-sn7ln" Jan 28 20:55:57 crc kubenswrapper[4746]: I0128 20:55:57.563367 4746 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/telemetry-operator-controller-manager-9477bbd48-z984g"] Jan 28 20:55:57 crc kubenswrapper[4746]: I0128 20:55:57.568914 4746 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-ll8vr\" (UniqueName: 
\"kubernetes.io/projected/1f4e2d58-bbd0-45d2-81ba-2b4b47ab5af6-kube-api-access-ll8vr\") pod \"swift-operator-controller-manager-547cbdb99f-fjs9l\" (UID: \"1f4e2d58-bbd0-45d2-81ba-2b4b47ab5af6\") " pod="openstack-operators/swift-operator-controller-manager-547cbdb99f-fjs9l" Jan 28 20:55:57 crc kubenswrapper[4746]: I0128 20:55:57.600938 4746 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/ovn-operator-controller-manager-6f75f45d54-kpcqr" Jan 28 20:55:57 crc kubenswrapper[4746]: I0128 20:55:57.620469 4746 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/test-operator-controller-manager-69797bbcbd-m4x6j"] Jan 28 20:55:57 crc kubenswrapper[4746]: I0128 20:55:57.623113 4746 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/test-operator-controller-manager-69797bbcbd-m4x6j" Jan 28 20:55:57 crc kubenswrapper[4746]: I0128 20:55:57.629552 4746 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"test-operator-controller-manager-dockercfg-kb8df" Jan 28 20:55:57 crc kubenswrapper[4746]: I0128 20:55:57.629979 4746 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/octavia-operator-controller-manager-5f4cd88d46-hb6t9" Jan 28 20:55:57 crc kubenswrapper[4746]: I0128 20:55:57.644314 4746 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/test-operator-controller-manager-69797bbcbd-m4x6j"] Jan 28 20:55:57 crc kubenswrapper[4746]: I0128 20:55:57.656393 4746 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-gg4hr\" (UniqueName: \"kubernetes.io/projected/beba987e-69be-47aa-a84c-7ea511c4d151-kube-api-access-gg4hr\") pod \"test-operator-controller-manager-69797bbcbd-m4x6j\" (UID: \"beba987e-69be-47aa-a84c-7ea511c4d151\") " pod="openstack-operators/test-operator-controller-manager-69797bbcbd-m4x6j" Jan 28 20:55:57 crc kubenswrapper[4746]: I0128 20:55:57.656496 4746 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-svd5j\" (UniqueName: \"kubernetes.io/projected/370a5739-7af0-4065-986c-af68a265423c-kube-api-access-svd5j\") pod \"placement-operator-controller-manager-79d5ccc684-6klzp\" (UID: \"370a5739-7af0-4065-986c-af68a265423c\") " pod="openstack-operators/placement-operator-controller-manager-79d5ccc684-6klzp" Jan 28 20:55:57 crc kubenswrapper[4746]: I0128 20:55:57.656617 4746 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-6htnl\" (UniqueName: \"kubernetes.io/projected/e42669f3-6865-4ab6-9a9a-241c7b07509d-kube-api-access-6htnl\") pod \"telemetry-operator-controller-manager-9477bbd48-z984g\" (UID: \"e42669f3-6865-4ab6-9a9a-241c7b07509d\") " pod="openstack-operators/telemetry-operator-controller-manager-9477bbd48-z984g" Jan 28 20:55:57 crc kubenswrapper[4746]: I0128 20:55:57.656683 4746 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/28de2427-e250-44f5-add2-1b738cf6ce3b-cert\") pod 
\"infra-operator-controller-manager-694cf4f878-th2hg\" (UID: \"28de2427-e250-44f5-add2-1b738cf6ce3b\") " pod="openstack-operators/infra-operator-controller-manager-694cf4f878-th2hg" Jan 28 20:55:57 crc kubenswrapper[4746]: E0128 20:55:57.656980 4746 secret.go:188] Couldn't get secret openstack-operators/infra-operator-webhook-server-cert: secret "infra-operator-webhook-server-cert" not found Jan 28 20:55:57 crc kubenswrapper[4746]: E0128 20:55:57.657030 4746 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/28de2427-e250-44f5-add2-1b738cf6ce3b-cert podName:28de2427-e250-44f5-add2-1b738cf6ce3b nodeName:}" failed. No retries permitted until 2026-01-28 20:55:58.65701274 +0000 UTC m=+986.613199094 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/28de2427-e250-44f5-add2-1b738cf6ce3b-cert") pod "infra-operator-controller-manager-694cf4f878-th2hg" (UID: "28de2427-e250-44f5-add2-1b738cf6ce3b") : secret "infra-operator-webhook-server-cert" not found Jan 28 20:55:57 crc kubenswrapper[4746]: I0128 20:55:57.701735 4746 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/watcher-operator-controller-manager-564965969-hd4k9"] Jan 28 20:55:57 crc kubenswrapper[4746]: I0128 20:55:57.702713 4746 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/watcher-operator-controller-manager-564965969-hd4k9" Jan 28 20:55:57 crc kubenswrapper[4746]: I0128 20:55:57.705732 4746 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"watcher-operator-controller-manager-dockercfg-k4x42" Jan 28 20:55:57 crc kubenswrapper[4746]: I0128 20:55:57.709313 4746 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-svd5j\" (UniqueName: \"kubernetes.io/projected/370a5739-7af0-4065-986c-af68a265423c-kube-api-access-svd5j\") pod \"placement-operator-controller-manager-79d5ccc684-6klzp\" (UID: \"370a5739-7af0-4065-986c-af68a265423c\") " pod="openstack-operators/placement-operator-controller-manager-79d5ccc684-6klzp" Jan 28 20:55:57 crc kubenswrapper[4746]: I0128 20:55:57.719561 4746 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/watcher-operator-controller-manager-564965969-hd4k9"] Jan 28 20:55:57 crc kubenswrapper[4746]: I0128 20:55:57.758767 4746 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-gg4hr\" (UniqueName: \"kubernetes.io/projected/beba987e-69be-47aa-a84c-7ea511c4d151-kube-api-access-gg4hr\") pod \"test-operator-controller-manager-69797bbcbd-m4x6j\" (UID: \"beba987e-69be-47aa-a84c-7ea511c4d151\") " pod="openstack-operators/test-operator-controller-manager-69797bbcbd-m4x6j" Jan 28 20:55:57 crc kubenswrapper[4746]: I0128 20:55:57.770413 4746 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-6htnl\" (UniqueName: \"kubernetes.io/projected/e42669f3-6865-4ab6-9a9a-241c7b07509d-kube-api-access-6htnl\") pod \"telemetry-operator-controller-manager-9477bbd48-z984g\" (UID: \"e42669f3-6865-4ab6-9a9a-241c7b07509d\") " pod="openstack-operators/telemetry-operator-controller-manager-9477bbd48-z984g" Jan 28 20:55:57 crc kubenswrapper[4746]: I0128 20:55:57.766202 4746 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/placement-operator-controller-manager-79d5ccc684-6klzp" Jan 28 20:55:57 crc kubenswrapper[4746]: I0128 20:55:57.775408 4746 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/swift-operator-controller-manager-547cbdb99f-fjs9l" Jan 28 20:55:57 crc kubenswrapper[4746]: I0128 20:55:57.794616 4746 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/openstack-operator-controller-manager-65d466cb7d-vf8n9"] Jan 28 20:55:57 crc kubenswrapper[4746]: I0128 20:55:57.795725 4746 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/openstack-operator-controller-manager-65d466cb7d-vf8n9" Jan 28 20:55:57 crc kubenswrapper[4746]: I0128 20:55:57.816280 4746 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"openstack-operator-controller-manager-dockercfg-44gql" Jan 28 20:55:57 crc kubenswrapper[4746]: I0128 20:55:57.816489 4746 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"webhook-server-cert" Jan 28 20:55:57 crc kubenswrapper[4746]: I0128 20:55:57.816556 4746 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-6htnl\" (UniqueName: \"kubernetes.io/projected/e42669f3-6865-4ab6-9a9a-241c7b07509d-kube-api-access-6htnl\") pod \"telemetry-operator-controller-manager-9477bbd48-z984g\" (UID: \"e42669f3-6865-4ab6-9a9a-241c7b07509d\") " pod="openstack-operators/telemetry-operator-controller-manager-9477bbd48-z984g" Jan 28 20:55:57 crc kubenswrapper[4746]: I0128 20:55:57.816588 4746 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"metrics-server-cert" Jan 28 20:55:57 crc kubenswrapper[4746]: I0128 20:55:57.817222 4746 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/openstack-operator-controller-manager-65d466cb7d-vf8n9"] Jan 28 20:55:57 crc kubenswrapper[4746]: I0128 
20:55:57.817424 4746 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-gg4hr\" (UniqueName: \"kubernetes.io/projected/beba987e-69be-47aa-a84c-7ea511c4d151-kube-api-access-gg4hr\") pod \"test-operator-controller-manager-69797bbcbd-m4x6j\" (UID: \"beba987e-69be-47aa-a84c-7ea511c4d151\") " pod="openstack-operators/test-operator-controller-manager-69797bbcbd-m4x6j" Jan 28 20:55:57 crc kubenswrapper[4746]: I0128 20:55:57.873297 4746 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-vsf9v\" (UniqueName: \"kubernetes.io/projected/90c190b4-36db-406b-bca5-6c45ac745ed6-kube-api-access-vsf9v\") pod \"watcher-operator-controller-manager-564965969-hd4k9\" (UID: \"90c190b4-36db-406b-bca5-6c45ac745ed6\") " pod="openstack-operators/watcher-operator-controller-manager-564965969-hd4k9" Jan 28 20:55:57 crc kubenswrapper[4746]: I0128 20:55:57.873956 4746 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"webhook-certs\" (UniqueName: \"kubernetes.io/secret/a7c2547a-3282-4748-a823-c3a0cc41ad46-webhook-certs\") pod \"openstack-operator-controller-manager-65d466cb7d-vf8n9\" (UID: \"a7c2547a-3282-4748-a823-c3a0cc41ad46\") " pod="openstack-operators/openstack-operator-controller-manager-65d466cb7d-vf8n9" Jan 28 20:55:57 crc kubenswrapper[4746]: I0128 20:55:57.874034 4746 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/a7c2547a-3282-4748-a823-c3a0cc41ad46-metrics-certs\") pod \"openstack-operator-controller-manager-65d466cb7d-vf8n9\" (UID: \"a7c2547a-3282-4748-a823-c3a0cc41ad46\") " pod="openstack-operators/openstack-operator-controller-manager-65d466cb7d-vf8n9" Jan 28 20:55:57 crc kubenswrapper[4746]: I0128 20:55:57.874057 4746 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-xhknp\" 
(UniqueName: \"kubernetes.io/projected/a7c2547a-3282-4748-a823-c3a0cc41ad46-kube-api-access-xhknp\") pod \"openstack-operator-controller-manager-65d466cb7d-vf8n9\" (UID: \"a7c2547a-3282-4748-a823-c3a0cc41ad46\") " pod="openstack-operators/openstack-operator-controller-manager-65d466cb7d-vf8n9" Jan 28 20:55:57 crc kubenswrapper[4746]: I0128 20:55:57.875453 4746 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/rabbitmq-cluster-operator-manager-668c99d594-8kpht"] Jan 28 20:55:57 crc kubenswrapper[4746]: I0128 20:55:57.876768 4746 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/rabbitmq-cluster-operator-manager-668c99d594-8kpht" Jan 28 20:55:57 crc kubenswrapper[4746]: I0128 20:55:57.880460 4746 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"rabbitmq-cluster-operator-controller-manager-dockercfg-mpjk5" Jan 28 20:55:57 crc kubenswrapper[4746]: I0128 20:55:57.886783 4746 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/rabbitmq-cluster-operator-manager-668c99d594-8kpht"] Jan 28 20:55:57 crc kubenswrapper[4746]: W0128 20:55:57.900978 4746 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod3c81bd6e_961b_42ae_8840_2607a13046df.slice/crio-b447d6ab37d2e83a8ee71aee68821bf8d02db8873640445ab85b9373e9d84b45 WatchSource:0}: Error finding container b447d6ab37d2e83a8ee71aee68821bf8d02db8873640445ab85b9373e9d84b45: Status 404 returned error can't find the container with id b447d6ab37d2e83a8ee71aee68821bf8d02db8873640445ab85b9373e9d84b45 Jan 28 20:55:57 crc kubenswrapper[4746]: I0128 20:55:57.920844 4746 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/barbican-operator-controller-manager-7f86f8796f-kll6j"] Jan 28 20:55:57 crc kubenswrapper[4746]: I0128 20:55:57.936912 4746 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/test-operator-controller-manager-69797bbcbd-m4x6j" Jan 28 20:55:57 crc kubenswrapper[4746]: I0128 20:55:57.974862 4746 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/3ff4c44c-0290-4ab0-abb8-316375200dc0-cert\") pod \"openstack-baremetal-operator-controller-manager-6b68b8b85449cmp\" (UID: \"3ff4c44c-0290-4ab0-abb8-316375200dc0\") " pod="openstack-operators/openstack-baremetal-operator-controller-manager-6b68b8b85449cmp" Jan 28 20:55:57 crc kubenswrapper[4746]: I0128 20:55:57.974905 4746 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-4hhdd\" (UniqueName: \"kubernetes.io/projected/6b7a0005-11ec-4c8a-87e9-872855585d4d-kube-api-access-4hhdd\") pod \"rabbitmq-cluster-operator-manager-668c99d594-8kpht\" (UID: \"6b7a0005-11ec-4c8a-87e9-872855585d4d\") " pod="openstack-operators/rabbitmq-cluster-operator-manager-668c99d594-8kpht" Jan 28 20:55:57 crc kubenswrapper[4746]: I0128 20:55:57.974925 4746 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-vsf9v\" (UniqueName: \"kubernetes.io/projected/90c190b4-36db-406b-bca5-6c45ac745ed6-kube-api-access-vsf9v\") pod \"watcher-operator-controller-manager-564965969-hd4k9\" (UID: \"90c190b4-36db-406b-bca5-6c45ac745ed6\") " pod="openstack-operators/watcher-operator-controller-manager-564965969-hd4k9" Jan 28 20:55:57 crc kubenswrapper[4746]: I0128 20:55:57.974960 4746 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"webhook-certs\" (UniqueName: \"kubernetes.io/secret/a7c2547a-3282-4748-a823-c3a0cc41ad46-webhook-certs\") pod \"openstack-operator-controller-manager-65d466cb7d-vf8n9\" (UID: \"a7c2547a-3282-4748-a823-c3a0cc41ad46\") " pod="openstack-operators/openstack-operator-controller-manager-65d466cb7d-vf8n9" Jan 28 20:55:57 crc kubenswrapper[4746]: I0128 20:55:57.974997 4746 
reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/a7c2547a-3282-4748-a823-c3a0cc41ad46-metrics-certs\") pod \"openstack-operator-controller-manager-65d466cb7d-vf8n9\" (UID: \"a7c2547a-3282-4748-a823-c3a0cc41ad46\") " pod="openstack-operators/openstack-operator-controller-manager-65d466cb7d-vf8n9" Jan 28 20:55:57 crc kubenswrapper[4746]: I0128 20:55:57.975015 4746 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-xhknp\" (UniqueName: \"kubernetes.io/projected/a7c2547a-3282-4748-a823-c3a0cc41ad46-kube-api-access-xhknp\") pod \"openstack-operator-controller-manager-65d466cb7d-vf8n9\" (UID: \"a7c2547a-3282-4748-a823-c3a0cc41ad46\") " pod="openstack-operators/openstack-operator-controller-manager-65d466cb7d-vf8n9" Jan 28 20:55:57 crc kubenswrapper[4746]: I0128 20:55:57.975257 4746 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/cinder-operator-controller-manager-7478f7dbf9-n6qr7"] Jan 28 20:55:57 crc kubenswrapper[4746]: E0128 20:55:57.975800 4746 secret.go:188] Couldn't get secret openstack-operators/webhook-server-cert: secret "webhook-server-cert" not found Jan 28 20:55:57 crc kubenswrapper[4746]: E0128 20:55:57.975841 4746 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/a7c2547a-3282-4748-a823-c3a0cc41ad46-webhook-certs podName:a7c2547a-3282-4748-a823-c3a0cc41ad46 nodeName:}" failed. No retries permitted until 2026-01-28 20:55:58.475827583 +0000 UTC m=+986.432013937 (durationBeforeRetry 500ms). 
Error: MountVolume.SetUp failed for volume "webhook-certs" (UniqueName: "kubernetes.io/secret/a7c2547a-3282-4748-a823-c3a0cc41ad46-webhook-certs") pod "openstack-operator-controller-manager-65d466cb7d-vf8n9" (UID: "a7c2547a-3282-4748-a823-c3a0cc41ad46") : secret "webhook-server-cert" not found Jan 28 20:55:57 crc kubenswrapper[4746]: E0128 20:55:57.975917 4746 secret.go:188] Couldn't get secret openstack-operators/openstack-baremetal-operator-webhook-server-cert: secret "openstack-baremetal-operator-webhook-server-cert" not found Jan 28 20:55:57 crc kubenswrapper[4746]: E0128 20:55:57.975998 4746 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/3ff4c44c-0290-4ab0-abb8-316375200dc0-cert podName:3ff4c44c-0290-4ab0-abb8-316375200dc0 nodeName:}" failed. No retries permitted until 2026-01-28 20:55:58.975963927 +0000 UTC m=+986.932150501 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/3ff4c44c-0290-4ab0-abb8-316375200dc0-cert") pod "openstack-baremetal-operator-controller-manager-6b68b8b85449cmp" (UID: "3ff4c44c-0290-4ab0-abb8-316375200dc0") : secret "openstack-baremetal-operator-webhook-server-cert" not found Jan 28 20:55:57 crc kubenswrapper[4746]: E0128 20:55:57.976000 4746 secret.go:188] Couldn't get secret openstack-operators/metrics-server-cert: secret "metrics-server-cert" not found Jan 28 20:55:57 crc kubenswrapper[4746]: E0128 20:55:57.976051 4746 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/a7c2547a-3282-4748-a823-c3a0cc41ad46-metrics-certs podName:a7c2547a-3282-4748-a823-c3a0cc41ad46 nodeName:}" failed. No retries permitted until 2026-01-28 20:55:58.476040959 +0000 UTC m=+986.432227313 (durationBeforeRetry 500ms). 
Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/a7c2547a-3282-4748-a823-c3a0cc41ad46-metrics-certs") pod "openstack-operator-controller-manager-65d466cb7d-vf8n9" (UID: "a7c2547a-3282-4748-a823-c3a0cc41ad46") : secret "metrics-server-cert" not found Jan 28 20:55:58 crc kubenswrapper[4746]: I0128 20:55:58.015052 4746 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-vsf9v\" (UniqueName: \"kubernetes.io/projected/90c190b4-36db-406b-bca5-6c45ac745ed6-kube-api-access-vsf9v\") pod \"watcher-operator-controller-manager-564965969-hd4k9\" (UID: \"90c190b4-36db-406b-bca5-6c45ac745ed6\") " pod="openstack-operators/watcher-operator-controller-manager-564965969-hd4k9" Jan 28 20:55:58 crc kubenswrapper[4746]: I0128 20:55:58.016331 4746 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-xhknp\" (UniqueName: \"kubernetes.io/projected/a7c2547a-3282-4748-a823-c3a0cc41ad46-kube-api-access-xhknp\") pod \"openstack-operator-controller-manager-65d466cb7d-vf8n9\" (UID: \"a7c2547a-3282-4748-a823-c3a0cc41ad46\") " pod="openstack-operators/openstack-operator-controller-manager-65d466cb7d-vf8n9" Jan 28 20:55:58 crc kubenswrapper[4746]: I0128 20:55:58.080505 4746 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-4hhdd\" (UniqueName: \"kubernetes.io/projected/6b7a0005-11ec-4c8a-87e9-872855585d4d-kube-api-access-4hhdd\") pod \"rabbitmq-cluster-operator-manager-668c99d594-8kpht\" (UID: \"6b7a0005-11ec-4c8a-87e9-872855585d4d\") " pod="openstack-operators/rabbitmq-cluster-operator-manager-668c99d594-8kpht" Jan 28 20:55:58 crc kubenswrapper[4746]: W0128 20:55:58.088050 4746 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podf86e66ed_9f28_4514_8ff8_97b8353026d1.slice/crio-3f570df23f717d8a0d061e747e690a92ae0d628b143e822c2c7a5f787dd24475 WatchSource:0}: Error finding 
container 3f570df23f717d8a0d061e747e690a92ae0d628b143e822c2c7a5f787dd24475: Status 404 returned error can't find the container with id 3f570df23f717d8a0d061e747e690a92ae0d628b143e822c2c7a5f787dd24475 Jan 28 20:55:58 crc kubenswrapper[4746]: I0128 20:55:58.100556 4746 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/telemetry-operator-controller-manager-9477bbd48-z984g" Jan 28 20:55:58 crc kubenswrapper[4746]: I0128 20:55:58.105451 4746 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-4hhdd\" (UniqueName: \"kubernetes.io/projected/6b7a0005-11ec-4c8a-87e9-872855585d4d-kube-api-access-4hhdd\") pod \"rabbitmq-cluster-operator-manager-668c99d594-8kpht\" (UID: \"6b7a0005-11ec-4c8a-87e9-872855585d4d\") " pod="openstack-operators/rabbitmq-cluster-operator-manager-668c99d594-8kpht" Jan 28 20:55:58 crc kubenswrapper[4746]: I0128 20:55:58.128216 4746 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/watcher-operator-controller-manager-564965969-hd4k9" Jan 28 20:55:58 crc kubenswrapper[4746]: I0128 20:55:58.177199 4746 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/rabbitmq-cluster-operator-manager-668c99d594-8kpht" Jan 28 20:55:58 crc kubenswrapper[4746]: I0128 20:55:58.290757 4746 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/cinder-operator-controller-manager-7478f7dbf9-n6qr7" event={"ID":"f86e66ed-9f28-4514-8ff8-97b8353026d1","Type":"ContainerStarted","Data":"3f570df23f717d8a0d061e747e690a92ae0d628b143e822c2c7a5f787dd24475"} Jan 28 20:55:58 crc kubenswrapper[4746]: I0128 20:55:58.292682 4746 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/barbican-operator-controller-manager-7f86f8796f-kll6j" event={"ID":"3c81bd6e-961b-42ae-8840-2607a13046df","Type":"ContainerStarted","Data":"b447d6ab37d2e83a8ee71aee68821bf8d02db8873640445ab85b9373e9d84b45"} Jan 28 20:55:58 crc kubenswrapper[4746]: I0128 20:55:58.494827 4746 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"webhook-certs\" (UniqueName: \"kubernetes.io/secret/a7c2547a-3282-4748-a823-c3a0cc41ad46-webhook-certs\") pod \"openstack-operator-controller-manager-65d466cb7d-vf8n9\" (UID: \"a7c2547a-3282-4748-a823-c3a0cc41ad46\") " pod="openstack-operators/openstack-operator-controller-manager-65d466cb7d-vf8n9" Jan 28 20:55:58 crc kubenswrapper[4746]: E0128 20:55:58.495110 4746 secret.go:188] Couldn't get secret openstack-operators/webhook-server-cert: secret "webhook-server-cert" not found Jan 28 20:55:58 crc kubenswrapper[4746]: E0128 20:55:58.495282 4746 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/a7c2547a-3282-4748-a823-c3a0cc41ad46-webhook-certs podName:a7c2547a-3282-4748-a823-c3a0cc41ad46 nodeName:}" failed. No retries permitted until 2026-01-28 20:55:59.495253188 +0000 UTC m=+987.451439722 (durationBeforeRetry 1s). 
Error: MountVolume.SetUp failed for volume "webhook-certs" (UniqueName: "kubernetes.io/secret/a7c2547a-3282-4748-a823-c3a0cc41ad46-webhook-certs") pod "openstack-operator-controller-manager-65d466cb7d-vf8n9" (UID: "a7c2547a-3282-4748-a823-c3a0cc41ad46") : secret "webhook-server-cert" not found Jan 28 20:55:58 crc kubenswrapper[4746]: E0128 20:55:58.495448 4746 secret.go:188] Couldn't get secret openstack-operators/metrics-server-cert: secret "metrics-server-cert" not found Jan 28 20:55:58 crc kubenswrapper[4746]: E0128 20:55:58.495590 4746 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/a7c2547a-3282-4748-a823-c3a0cc41ad46-metrics-certs podName:a7c2547a-3282-4748-a823-c3a0cc41ad46 nodeName:}" failed. No retries permitted until 2026-01-28 20:55:59.495569867 +0000 UTC m=+987.451756221 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/a7c2547a-3282-4748-a823-c3a0cc41ad46-metrics-certs") pod "openstack-operator-controller-manager-65d466cb7d-vf8n9" (UID: "a7c2547a-3282-4748-a823-c3a0cc41ad46") : secret "metrics-server-cert" not found Jan 28 20:55:58 crc kubenswrapper[4746]: I0128 20:55:58.495180 4746 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/a7c2547a-3282-4748-a823-c3a0cc41ad46-metrics-certs\") pod \"openstack-operator-controller-manager-65d466cb7d-vf8n9\" (UID: \"a7c2547a-3282-4748-a823-c3a0cc41ad46\") " pod="openstack-operators/openstack-operator-controller-manager-65d466cb7d-vf8n9" Jan 28 20:55:58 crc kubenswrapper[4746]: I0128 20:55:58.684805 4746 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/glance-operator-controller-manager-78fdd796fd-bxtxd"] Jan 28 20:55:58 crc kubenswrapper[4746]: I0128 20:55:58.698254 4746 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: 
\"kubernetes.io/secret/28de2427-e250-44f5-add2-1b738cf6ce3b-cert\") pod \"infra-operator-controller-manager-694cf4f878-th2hg\" (UID: \"28de2427-e250-44f5-add2-1b738cf6ce3b\") " pod="openstack-operators/infra-operator-controller-manager-694cf4f878-th2hg" Jan 28 20:55:58 crc kubenswrapper[4746]: E0128 20:55:58.698511 4746 secret.go:188] Couldn't get secret openstack-operators/infra-operator-webhook-server-cert: secret "infra-operator-webhook-server-cert" not found Jan 28 20:55:58 crc kubenswrapper[4746]: E0128 20:55:58.706702 4746 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/28de2427-e250-44f5-add2-1b738cf6ce3b-cert podName:28de2427-e250-44f5-add2-1b738cf6ce3b nodeName:}" failed. No retries permitted until 2026-01-28 20:56:00.706647542 +0000 UTC m=+988.662833896 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/28de2427-e250-44f5-add2-1b738cf6ce3b-cert") pod "infra-operator-controller-manager-694cf4f878-th2hg" (UID: "28de2427-e250-44f5-add2-1b738cf6ce3b") : secret "infra-operator-webhook-server-cert" not found Jan 28 20:55:58 crc kubenswrapper[4746]: I0128 20:55:58.746173 4746 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/heat-operator-controller-manager-594c8c9d5d-p6qjg"] Jan 28 20:55:58 crc kubenswrapper[4746]: I0128 20:55:58.767354 4746 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/horizon-operator-controller-manager-77d5c5b54f-ws7k7"] Jan 28 20:55:58 crc kubenswrapper[4746]: I0128 20:55:58.779336 4746 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/designate-operator-controller-manager-b45d7bf98-cm85d"] Jan 28 20:55:58 crc kubenswrapper[4746]: W0128 20:55:58.981835 4746 manager.go:1169] Failed to process watch event {EventType:0 
Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod3b28dc9c_6dcf_4fd1_8cbd_f13d0da9e954.slice/crio-853799931f2b01efb67f742f2b3224906b278b0e35707e8c53fd658a71b22602 WatchSource:0}: Error finding container 853799931f2b01efb67f742f2b3224906b278b0e35707e8c53fd658a71b22602: Status 404 returned error can't find the container with id 853799931f2b01efb67f742f2b3224906b278b0e35707e8c53fd658a71b22602 Jan 28 20:55:59 crc kubenswrapper[4746]: I0128 20:55:59.002410 4746 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/3ff4c44c-0290-4ab0-abb8-316375200dc0-cert\") pod \"openstack-baremetal-operator-controller-manager-6b68b8b85449cmp\" (UID: \"3ff4c44c-0290-4ab0-abb8-316375200dc0\") " pod="openstack-operators/openstack-baremetal-operator-controller-manager-6b68b8b85449cmp" Jan 28 20:55:59 crc kubenswrapper[4746]: E0128 20:55:59.002688 4746 secret.go:188] Couldn't get secret openstack-operators/openstack-baremetal-operator-webhook-server-cert: secret "openstack-baremetal-operator-webhook-server-cert" not found Jan 28 20:55:59 crc kubenswrapper[4746]: E0128 20:55:59.002739 4746 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/3ff4c44c-0290-4ab0-abb8-316375200dc0-cert podName:3ff4c44c-0290-4ab0-abb8-316375200dc0 nodeName:}" failed. No retries permitted until 2026-01-28 20:56:01.002723941 +0000 UTC m=+988.958910285 (durationBeforeRetry 2s). 
Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/3ff4c44c-0290-4ab0-abb8-316375200dc0-cert") pod "openstack-baremetal-operator-controller-manager-6b68b8b85449cmp" (UID: "3ff4c44c-0290-4ab0-abb8-316375200dc0") : secret "openstack-baremetal-operator-webhook-server-cert" not found Jan 28 20:55:59 crc kubenswrapper[4746]: I0128 20:55:59.006306 4746 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/octavia-operator-controller-manager-5f4cd88d46-hb6t9"] Jan 28 20:55:59 crc kubenswrapper[4746]: I0128 20:55:59.012241 4746 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/manila-operator-controller-manager-78c6999f6f-m5qbs"] Jan 28 20:55:59 crc kubenswrapper[4746]: I0128 20:55:59.029981 4746 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/mariadb-operator-controller-manager-6b9fb5fdcb-zcvgn"] Jan 28 20:55:59 crc kubenswrapper[4746]: I0128 20:55:59.037017 4746 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/keystone-operator-controller-manager-b8b6d4659-65qb5"] Jan 28 20:55:59 crc kubenswrapper[4746]: I0128 20:55:59.043297 4746 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/nova-operator-controller-manager-7bdb645866-pg4s4"] Jan 28 20:55:59 crc kubenswrapper[4746]: I0128 20:55:59.180179 4746 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/ovn-operator-controller-manager-6f75f45d54-kpcqr"] Jan 28 20:55:59 crc kubenswrapper[4746]: I0128 20:55:59.225055 4746 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/ironic-operator-controller-manager-598f7747c9-5lc6j"] Jan 28 20:55:59 crc kubenswrapper[4746]: W0128 20:55:59.225109 4746 manager.go:1169] Failed to process watch event {EventType:0 
Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podb44b1510_0a60_4b4e_9541_cc6d18e10a7f.slice/crio-f974695a1a885b8ad73dbd76de7baa136518398766f16ea43f6176613289a749 WatchSource:0}: Error finding container f974695a1a885b8ad73dbd76de7baa136518398766f16ea43f6176613289a749: Status 404 returned error can't find the container with id f974695a1a885b8ad73dbd76de7baa136518398766f16ea43f6176613289a749 Jan 28 20:55:59 crc kubenswrapper[4746]: W0128 20:55:59.229137 4746 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podbeba987e_69be_47aa_a84c_7ea511c4d151.slice/crio-4a165dbcf0a5deb32bbdf006322c01923506653af394753f6e5f4c503f0426ea WatchSource:0}: Error finding container 4a165dbcf0a5deb32bbdf006322c01923506653af394753f6e5f4c503f0426ea: Status 404 returned error can't find the container with id 4a165dbcf0a5deb32bbdf006322c01923506653af394753f6e5f4c503f0426ea Jan 28 20:55:59 crc kubenswrapper[4746]: E0128 20:55:59.241860 4746 kuberuntime_manager.go:1274] "Unhandled Error" err="container &Container{Name:manager,Image:quay.io/openstack-k8s-operators/test-operator@sha256:c8dde42dafd41026ed2e4cfc26efc0fff63c4ba9d31326ae7dc644ccceaafa9d,Command:[/manager],Args:[--leader-elect --health-probe-bind-address=:8081 --metrics-bind-address=127.0.0.1:8080],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:LEASE_DURATION,Value:30,ValueFrom:nil,},EnvVar{Name:RENEW_DEADLINE,Value:20,ValueFrom:nil,},EnvVar{Name:RETRY_PERIOD,Value:5,ValueFrom:nil,},EnvVar{Name:ENABLE_WEBHOOKS,Value:false,ValueFrom:nil,},EnvVar{Name:METRICS_CERTS,Value:false,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{cpu: {{500 -3} {} 500m DecimalSI},memory: {{536870912 0} {} BinarySI},},Requests:ResourceList{cpu: {{10 -3} {} 10m DecimalSI},memory: {{268435456 0} {} 
BinarySI},},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:kube-api-access-gg4hr,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/healthz,Port:{0 8081 },Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:15,TimeoutSeconds:1,PeriodSeconds:20,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},ReadinessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/readyz,Port:{0 8081 },Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:5,TimeoutSeconds:1,PeriodSeconds:10,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[MKNOD],},Privileged:nil,SELinuxOptions:nil,RunAsUser:nil,RunAsNonRoot:nil,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod test-operator-controller-manager-69797bbcbd-m4x6j_openstack-operators(beba987e-69be-47aa-a84c-7ea511c4d151): ErrImagePull: pull QPS exceeded" logger="UnhandledError" Jan 28 20:55:59 crc kubenswrapper[4746]: E0128 20:55:59.243099 4746 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ErrImagePull: \"pull QPS exceeded\"" pod="openstack-operators/test-operator-controller-manager-69797bbcbd-m4x6j" podUID="beba987e-69be-47aa-a84c-7ea511c4d151" Jan 28 20:55:59 crc 
kubenswrapper[4746]: I0128 20:55:59.244262 4746 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/test-operator-controller-manager-69797bbcbd-m4x6j"] Jan 28 20:55:59 crc kubenswrapper[4746]: E0128 20:55:59.245737 4746 kuberuntime_manager.go:1274] "Unhandled Error" err="container &Container{Name:manager,Image:38.102.83.50:5001/openstack-k8s-operators/telemetry-operator:a5bcf05e2d71c610156d017fdf197f7c58570d79,Command:[/manager],Args:[--leader-elect --health-probe-bind-address=:8081 --metrics-bind-address=127.0.0.1:8080],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:LEASE_DURATION,Value:30,ValueFrom:nil,},EnvVar{Name:RENEW_DEADLINE,Value:20,ValueFrom:nil,},EnvVar{Name:RETRY_PERIOD,Value:5,ValueFrom:nil,},EnvVar{Name:ENABLE_WEBHOOKS,Value:false,ValueFrom:nil,},EnvVar{Name:METRICS_CERTS,Value:false,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{cpu: {{500 -3} {} 500m DecimalSI},memory: {{536870912 0} {} BinarySI},},Requests:ResourceList{cpu: {{10 -3} {} 10m DecimalSI},memory: {{268435456 0} {} BinarySI},},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:kube-api-access-6htnl,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/healthz,Port:{0 8081 },Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:15,TimeoutSeconds:1,PeriodSeconds:20,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},ReadinessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/readyz,Port:{0 8081 
},Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:5,TimeoutSeconds:1,PeriodSeconds:10,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[MKNOD],},Privileged:nil,SELinuxOptions:nil,RunAsUser:nil,RunAsNonRoot:nil,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod telemetry-operator-controller-manager-9477bbd48-z984g_openstack-operators(e42669f3-6865-4ab6-9a9a-241c7b07509d): ErrImagePull: pull QPS exceeded" logger="UnhandledError" Jan 28 20:55:59 crc kubenswrapper[4746]: E0128 20:55:59.247452 4746 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ErrImagePull: \"pull QPS exceeded\"" pod="openstack-operators/telemetry-operator-controller-manager-9477bbd48-z984g" podUID="e42669f3-6865-4ab6-9a9a-241c7b07509d" Jan 28 20:55:59 crc kubenswrapper[4746]: W0128 20:55:59.252245 4746 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podb182a0df_d0f9_46d6_9a0c_a3e332c84cff.slice/crio-00ec183221201382aacc189e46d0c1a6c154bd80e0cc4fbe3fedd17c9cb6ddfe WatchSource:0}: Error finding container 00ec183221201382aacc189e46d0c1a6c154bd80e0cc4fbe3fedd17c9cb6ddfe: Status 404 returned error can't find the container with id 00ec183221201382aacc189e46d0c1a6c154bd80e0cc4fbe3fedd17c9cb6ddfe Jan 28 20:55:59 crc kubenswrapper[4746]: I0128 20:55:59.256166 4746 kubelet.go:2428] "SyncLoop UPDATE" source="api" 
pods=["openstack-operators/telemetry-operator-controller-manager-9477bbd48-z984g"] Jan 28 20:55:59 crc kubenswrapper[4746]: E0128 20:55:59.259159 4746 kuberuntime_manager.go:1274] "Unhandled Error" err="container &Container{Name:manager,Image:quay.io/openstack-k8s-operators/neutron-operator@sha256:816d474f502d730d6a2522a272b0e09a2d579ac63617817655d60c54bda4191e,Command:[/manager],Args:[--leader-elect --health-probe-bind-address=:8081 --metrics-bind-address=127.0.0.1:8080],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:LEASE_DURATION,Value:30,ValueFrom:nil,},EnvVar{Name:RENEW_DEADLINE,Value:20,ValueFrom:nil,},EnvVar{Name:RETRY_PERIOD,Value:5,ValueFrom:nil,},EnvVar{Name:ENABLE_WEBHOOKS,Value:false,ValueFrom:nil,},EnvVar{Name:METRICS_CERTS,Value:false,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{cpu: {{500 -3} {} 500m DecimalSI},memory: {{536870912 0} {} BinarySI},},Requests:ResourceList{cpu: {{10 -3} {} 10m DecimalSI},memory: {{268435456 0} {} BinarySI},},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:kube-api-access-99fj2,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/healthz,Port:{0 8081 },Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:15,TimeoutSeconds:1,PeriodSeconds:20,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},ReadinessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/readyz,Port:{0 8081 
},Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:5,TimeoutSeconds:1,PeriodSeconds:10,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[MKNOD],},Privileged:nil,SELinuxOptions:nil,RunAsUser:nil,RunAsNonRoot:nil,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod neutron-operator-controller-manager-78d58447c5-pcprz_openstack-operators(b182a0df-d0f9-46d6-9a0c-a3e332c84cff): ErrImagePull: pull QPS exceeded" logger="UnhandledError" Jan 28 20:55:59 crc kubenswrapper[4746]: E0128 20:55:59.260276 4746 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ErrImagePull: \"pull QPS exceeded\"" pod="openstack-operators/neutron-operator-controller-manager-78d58447c5-pcprz" podUID="b182a0df-d0f9-46d6-9a0c-a3e332c84cff" Jan 28 20:55:59 crc kubenswrapper[4746]: W0128 20:55:59.273378 4746 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod1f4e2d58_bbd0_45d2_81ba_2b4b47ab5af6.slice/crio-c026018bd62ed15f9d81479af0ecec8c911255c5b2552222665513e6d863f85d WatchSource:0}: Error finding container c026018bd62ed15f9d81479af0ecec8c911255c5b2552222665513e6d863f85d: Status 404 returned error can't find the container with id c026018bd62ed15f9d81479af0ecec8c911255c5b2552222665513e6d863f85d Jan 28 20:55:59 crc kubenswrapper[4746]: I0128 20:55:59.273832 4746 kubelet.go:2428] "SyncLoop UPDATE" source="api" 
pods=["openstack-operators/watcher-operator-controller-manager-564965969-hd4k9"] Jan 28 20:55:59 crc kubenswrapper[4746]: I0128 20:55:59.283491 4746 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/neutron-operator-controller-manager-78d58447c5-pcprz"] Jan 28 20:55:59 crc kubenswrapper[4746]: E0128 20:55:59.288182 4746 kuberuntime_manager.go:1274] "Unhandled Error" err="container &Container{Name:manager,Image:quay.io/openstack-k8s-operators/swift-operator@sha256:445e951df2f21df6d33a466f75917e0f6103052ae751ae11887136e8ab165922,Command:[/manager],Args:[--leader-elect --health-probe-bind-address=:8081 --metrics-bind-address=127.0.0.1:8080],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:LEASE_DURATION,Value:30,ValueFrom:nil,},EnvVar{Name:RENEW_DEADLINE,Value:20,ValueFrom:nil,},EnvVar{Name:RETRY_PERIOD,Value:5,ValueFrom:nil,},EnvVar{Name:ENABLE_WEBHOOKS,Value:false,ValueFrom:nil,},EnvVar{Name:METRICS_CERTS,Value:false,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{cpu: {{500 -3} {} 500m DecimalSI},memory: {{536870912 0} {} BinarySI},},Requests:ResourceList{cpu: {{10 -3} {} 10m DecimalSI},memory: {{268435456 0} {} BinarySI},},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:kube-api-access-ll8vr,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/healthz,Port:{0 8081 },Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:15,TimeoutSeconds:1,PeriodSeconds:20,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},ReadinessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/readyz,Port:{0 8081 
},Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:5,TimeoutSeconds:1,PeriodSeconds:10,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*1000660000,RunAsNonRoot:nil,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod swift-operator-controller-manager-547cbdb99f-fjs9l_openstack-operators(1f4e2d58-bbd0-45d2-81ba-2b4b47ab5af6): ErrImagePull: pull QPS exceeded" logger="UnhandledError" Jan 28 20:55:59 crc kubenswrapper[4746]: E0128 20:55:59.289223 4746 kuberuntime_manager.go:1274] "Unhandled Error" err="container &Container{Name:manager,Image:quay.io/openstack-k8s-operators/placement-operator@sha256:013c0ad82d21a21c7eece5cd4b5d5c4b8eb410b6671ac33a6f3fb78c8510811d,Command:[/manager],Args:[--leader-elect --health-probe-bind-address=:8081 --metrics-bind-address=127.0.0.1:8080],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:LEASE_DURATION,Value:30,ValueFrom:nil,},EnvVar{Name:RENEW_DEADLINE,Value:20,ValueFrom:nil,},EnvVar{Name:RETRY_PERIOD,Value:5,ValueFrom:nil,},EnvVar{Name:ENABLE_WEBHOOKS,Value:false,ValueFrom:nil,},EnvVar{Name:METRICS_CERTS,Value:false,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{cpu: {{500 -3} {} 500m DecimalSI},memory: {{536870912 0} {} BinarySI},},Requests:ResourceList{cpu: {{10 -3} {} 10m DecimalSI},memory: {{268435456 0} {} 
BinarySI},},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:kube-api-access-svd5j,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/healthz,Port:{0 8081 },Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:15,TimeoutSeconds:1,PeriodSeconds:20,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},ReadinessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/readyz,Port:{0 8081 },Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:5,TimeoutSeconds:1,PeriodSeconds:10,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[MKNOD],},Privileged:nil,SELinuxOptions:nil,RunAsUser:nil,RunAsNonRoot:nil,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod placement-operator-controller-manager-79d5ccc684-6klzp_openstack-operators(370a5739-7af0-4065-986c-af68a265423c): ErrImagePull: pull QPS exceeded" logger="UnhandledError" Jan 28 20:55:59 crc kubenswrapper[4746]: E0128 20:55:59.289283 4746 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ErrImagePull: \"pull QPS exceeded\"" pod="openstack-operators/swift-operator-controller-manager-547cbdb99f-fjs9l" podUID="1f4e2d58-bbd0-45d2-81ba-2b4b47ab5af6" Jan 28 20:55:59 crc 
kubenswrapper[4746]: E0128 20:55:59.290362 4746 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ErrImagePull: \"pull QPS exceeded\"" pod="openstack-operators/placement-operator-controller-manager-79d5ccc684-6klzp" podUID="370a5739-7af0-4065-986c-af68a265423c" Jan 28 20:55:59 crc kubenswrapper[4746]: E0128 20:55:59.295225 4746 kuberuntime_manager.go:1274] "Unhandled Error" err="container &Container{Name:operator,Image:quay.io/openstack-k8s-operators/rabbitmq-cluster-operator@sha256:893e66303c1b0bc1d00a299a3f0380bad55c8dc813c8a1c6a4aab379f5aa12a2,Command:[/manager],Args:[],WorkingDir:,Ports:[]ContainerPort{ContainerPort{Name:metrics,HostPort:0,ContainerPort:9782,Protocol:TCP,HostIP:,},},Env:[]EnvVar{EnvVar{Name:OPERATOR_NAMESPACE,Value:,ValueFrom:&EnvVarSource{FieldRef:&ObjectFieldSelector{APIVersion:v1,FieldPath:metadata.namespace,},ResourceFieldRef:nil,ConfigMapKeyRef:nil,SecretKeyRef:nil,},},EnvVar{Name:LEASE_DURATION,Value:30,ValueFrom:nil,},EnvVar{Name:RENEW_DEADLINE,Value:20,ValueFrom:nil,},EnvVar{Name:RETRY_PERIOD,Value:5,ValueFrom:nil,},EnvVar{Name:ENABLE_WEBHOOKS,Value:false,ValueFrom:nil,},EnvVar{Name:METRICS_CERTS,Value:false,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{cpu: {{200 -3} {} 200m DecimalSI},memory: {{524288000 0} {} 500Mi BinarySI},},Requests:ResourceList{cpu: {{5 -3} {} 5m DecimalSI},memory: {{67108864 0} {} 
BinarySI},},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:kube-api-access-4hhdd,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*1000660000,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod rabbitmq-cluster-operator-manager-668c99d594-8kpht_openstack-operators(6b7a0005-11ec-4c8a-87e9-872855585d4d): ErrImagePull: pull QPS exceeded" logger="UnhandledError" Jan 28 20:55:59 crc kubenswrapper[4746]: E0128 20:55:59.296533 4746 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"operator\" with ErrImagePull: \"pull QPS exceeded\"" pod="openstack-operators/rabbitmq-cluster-operator-manager-668c99d594-8kpht" podUID="6b7a0005-11ec-4c8a-87e9-872855585d4d" Jan 28 20:55:59 crc kubenswrapper[4746]: I0128 20:55:59.298747 4746 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/swift-operator-controller-manager-547cbdb99f-fjs9l"] Jan 28 20:55:59 crc kubenswrapper[4746]: I0128 20:55:59.303817 4746 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/test-operator-controller-manager-69797bbcbd-m4x6j" event={"ID":"beba987e-69be-47aa-a84c-7ea511c4d151","Type":"ContainerStarted","Data":"4a165dbcf0a5deb32bbdf006322c01923506653af394753f6e5f4c503f0426ea"} Jan 28 20:55:59 crc kubenswrapper[4746]: E0128 20:55:59.305396 
4746 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/openstack-k8s-operators/test-operator@sha256:c8dde42dafd41026ed2e4cfc26efc0fff63c4ba9d31326ae7dc644ccceaafa9d\\\"\"" pod="openstack-operators/test-operator-controller-manager-69797bbcbd-m4x6j" podUID="beba987e-69be-47aa-a84c-7ea511c4d151" Jan 28 20:55:59 crc kubenswrapper[4746]: I0128 20:55:59.305721 4746 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/watcher-operator-controller-manager-564965969-hd4k9" event={"ID":"90c190b4-36db-406b-bca5-6c45ac745ed6","Type":"ContainerStarted","Data":"3698de267b265973566b84eaf98f209367701c84a75659b62b04c008f68d05ec"} Jan 28 20:55:59 crc kubenswrapper[4746]: I0128 20:55:59.306752 4746 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/swift-operator-controller-manager-547cbdb99f-fjs9l" event={"ID":"1f4e2d58-bbd0-45d2-81ba-2b4b47ab5af6","Type":"ContainerStarted","Data":"c026018bd62ed15f9d81479af0ecec8c911255c5b2552222665513e6d863f85d"} Jan 28 20:55:59 crc kubenswrapper[4746]: E0128 20:55:59.308531 4746 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/openstack-k8s-operators/swift-operator@sha256:445e951df2f21df6d33a466f75917e0f6103052ae751ae11887136e8ab165922\\\"\"" pod="openstack-operators/swift-operator-controller-manager-547cbdb99f-fjs9l" podUID="1f4e2d58-bbd0-45d2-81ba-2b4b47ab5af6" Jan 28 20:55:59 crc kubenswrapper[4746]: I0128 20:55:59.312223 4746 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/glance-operator-controller-manager-78fdd796fd-bxtxd" event={"ID":"fe660f4f-8806-4674-ab58-ea3303f51683","Type":"ContainerStarted","Data":"43c456aad096f8c18073399c8e5e28588618501b082da48b89e202ea1afb91bc"} Jan 28 20:55:59 crc kubenswrapper[4746]: I0128 20:55:59.312613 4746 
kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/placement-operator-controller-manager-79d5ccc684-6klzp"] Jan 28 20:55:59 crc kubenswrapper[4746]: I0128 20:55:59.313592 4746 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/telemetry-operator-controller-manager-9477bbd48-z984g" event={"ID":"e42669f3-6865-4ab6-9a9a-241c7b07509d","Type":"ContainerStarted","Data":"3a41400ef317f0555632eccd06fab91237f42249e48330327f5fb9221169ae89"} Jan 28 20:55:59 crc kubenswrapper[4746]: E0128 20:55:59.315402 4746 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ImagePullBackOff: \"Back-off pulling image \\\"38.102.83.50:5001/openstack-k8s-operators/telemetry-operator:a5bcf05e2d71c610156d017fdf197f7c58570d79\\\"\"" pod="openstack-operators/telemetry-operator-controller-manager-9477bbd48-z984g" podUID="e42669f3-6865-4ab6-9a9a-241c7b07509d" Jan 28 20:55:59 crc kubenswrapper[4746]: I0128 20:55:59.315434 4746 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/nova-operator-controller-manager-7bdb645866-pg4s4" event={"ID":"ced3eeee-ed33-4c50-8531-a7e4df1849f6","Type":"ContainerStarted","Data":"f1067062d75fc7ccbd34769f5f3d3351165fb222418764d87dd00d548ee30c1b"} Jan 28 20:55:59 crc kubenswrapper[4746]: I0128 20:55:59.316809 4746 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/heat-operator-controller-manager-594c8c9d5d-p6qjg" event={"ID":"677d2ab0-897d-4fd5-8ca5-b75f310e38da","Type":"ContainerStarted","Data":"58858abd792a3ff6d75514d300fd0e50c6aab00c49292bcc5f4bf42c5c3eb4f6"} Jan 28 20:55:59 crc kubenswrapper[4746]: I0128 20:55:59.318681 4746 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/mariadb-operator-controller-manager-6b9fb5fdcb-zcvgn" event={"ID":"5521c5f5-d2f6-461b-a2fc-ee97a5b2df11","Type":"ContainerStarted","Data":"59bdeef1ba9b57b1811fb8dfffe5ae571abecb5a9dbc8cf9b84409f09e7ab6c9"} Jan 28 20:55:59 crc 
kubenswrapper[4746]: I0128 20:55:59.326198 4746 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/rabbitmq-cluster-operator-manager-668c99d594-8kpht"] Jan 28 20:55:59 crc kubenswrapper[4746]: I0128 20:55:59.326238 4746 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/keystone-operator-controller-manager-b8b6d4659-65qb5" event={"ID":"fc220202-4669-4c2e-94b0-583048b56c83","Type":"ContainerStarted","Data":"a22df86a6b1665e20f548e816632a879e36010ed018cc0fd47d69f489615a6a6"} Jan 28 20:55:59 crc kubenswrapper[4746]: I0128 20:55:59.338624 4746 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/designate-operator-controller-manager-b45d7bf98-cm85d" event={"ID":"63794c40-0128-457d-b223-84e87943cca9","Type":"ContainerStarted","Data":"100dd6023735f37ca10e1e3f62ad5239151fe0d8320f6bf873b714f0d81a372e"} Jan 28 20:55:59 crc kubenswrapper[4746]: I0128 20:55:59.341484 4746 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/horizon-operator-controller-manager-77d5c5b54f-ws7k7" event={"ID":"760877c4-6e86-4445-a4cf-002b48e93841","Type":"ContainerStarted","Data":"2059be607e3f80e608ab01990be18fb3afeeca4eefe49f9049b534b20fbdfb50"} Jan 28 20:55:59 crc kubenswrapper[4746]: I0128 20:55:59.345241 4746 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/octavia-operator-controller-manager-5f4cd88d46-hb6t9" event={"ID":"3b28dc9c-6dcf-4fd1-8cbd-f13d0da9e954","Type":"ContainerStarted","Data":"853799931f2b01efb67f742f2b3224906b278b0e35707e8c53fd658a71b22602"} Jan 28 20:55:59 crc kubenswrapper[4746]: I0128 20:55:59.347107 4746 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/placement-operator-controller-manager-79d5ccc684-6klzp" event={"ID":"370a5739-7af0-4065-986c-af68a265423c","Type":"ContainerStarted","Data":"c2480e99c096ea7d9121a5d0fcdfebb2b3b1effc9df8e8ef779761aa62008b48"} Jan 28 20:55:59 crc kubenswrapper[4746]: I0128 20:55:59.348932 4746 
kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/neutron-operator-controller-manager-78d58447c5-pcprz" event={"ID":"b182a0df-d0f9-46d6-9a0c-a3e332c84cff","Type":"ContainerStarted","Data":"00ec183221201382aacc189e46d0c1a6c154bd80e0cc4fbe3fedd17c9cb6ddfe"} Jan 28 20:55:59 crc kubenswrapper[4746]: E0128 20:55:59.349917 4746 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/openstack-k8s-operators/placement-operator@sha256:013c0ad82d21a21c7eece5cd4b5d5c4b8eb410b6671ac33a6f3fb78c8510811d\\\"\"" pod="openstack-operators/placement-operator-controller-manager-79d5ccc684-6klzp" podUID="370a5739-7af0-4065-986c-af68a265423c" Jan 28 20:55:59 crc kubenswrapper[4746]: I0128 20:55:59.354813 4746 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/manila-operator-controller-manager-78c6999f6f-m5qbs" event={"ID":"f682c47e-2151-466d-8cc5-9ef0fca79785","Type":"ContainerStarted","Data":"391972afc5b390c063a557ddca58b287e065ead5a76e52e1528fb5e396d10233"} Jan 28 20:55:59 crc kubenswrapper[4746]: E0128 20:55:59.354967 4746 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/openstack-k8s-operators/neutron-operator@sha256:816d474f502d730d6a2522a272b0e09a2d579ac63617817655d60c54bda4191e\\\"\"" pod="openstack-operators/neutron-operator-controller-manager-78d58447c5-pcprz" podUID="b182a0df-d0f9-46d6-9a0c-a3e332c84cff" Jan 28 20:55:59 crc kubenswrapper[4746]: I0128 20:55:59.361855 4746 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/ironic-operator-controller-manager-598f7747c9-5lc6j" event={"ID":"b44b1510-0a60-4b4e-9541-cc6d18e10a7f","Type":"ContainerStarted","Data":"f974695a1a885b8ad73dbd76de7baa136518398766f16ea43f6176613289a749"} Jan 28 20:55:59 crc kubenswrapper[4746]: I0128 20:55:59.367321 4746 
kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/ovn-operator-controller-manager-6f75f45d54-kpcqr" event={"ID":"e3360f0f-1430-4b7e-9ee0-0a126a9b657d","Type":"ContainerStarted","Data":"66610a82cd113f002ae40afbe8c0b86c12ab4d5297b6127342502fc002cb0ff3"} Jan 28 20:55:59 crc kubenswrapper[4746]: I0128 20:55:59.516493 4746 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/a7c2547a-3282-4748-a823-c3a0cc41ad46-metrics-certs\") pod \"openstack-operator-controller-manager-65d466cb7d-vf8n9\" (UID: \"a7c2547a-3282-4748-a823-c3a0cc41ad46\") " pod="openstack-operators/openstack-operator-controller-manager-65d466cb7d-vf8n9" Jan 28 20:55:59 crc kubenswrapper[4746]: I0128 20:55:59.516633 4746 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"webhook-certs\" (UniqueName: \"kubernetes.io/secret/a7c2547a-3282-4748-a823-c3a0cc41ad46-webhook-certs\") pod \"openstack-operator-controller-manager-65d466cb7d-vf8n9\" (UID: \"a7c2547a-3282-4748-a823-c3a0cc41ad46\") " pod="openstack-operators/openstack-operator-controller-manager-65d466cb7d-vf8n9" Jan 28 20:55:59 crc kubenswrapper[4746]: E0128 20:55:59.516716 4746 secret.go:188] Couldn't get secret openstack-operators/metrics-server-cert: secret "metrics-server-cert" not found Jan 28 20:55:59 crc kubenswrapper[4746]: E0128 20:55:59.516748 4746 secret.go:188] Couldn't get secret openstack-operators/webhook-server-cert: secret "webhook-server-cert" not found Jan 28 20:55:59 crc kubenswrapper[4746]: E0128 20:55:59.516802 4746 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/a7c2547a-3282-4748-a823-c3a0cc41ad46-webhook-certs podName:a7c2547a-3282-4748-a823-c3a0cc41ad46 nodeName:}" failed. No retries permitted until 2026-01-28 20:56:01.516788631 +0000 UTC m=+989.472974985 (durationBeforeRetry 2s). 
Error: MountVolume.SetUp failed for volume "webhook-certs" (UniqueName: "kubernetes.io/secret/a7c2547a-3282-4748-a823-c3a0cc41ad46-webhook-certs") pod "openstack-operator-controller-manager-65d466cb7d-vf8n9" (UID: "a7c2547a-3282-4748-a823-c3a0cc41ad46") : secret "webhook-server-cert" not found Jan 28 20:55:59 crc kubenswrapper[4746]: E0128 20:55:59.516828 4746 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/a7c2547a-3282-4748-a823-c3a0cc41ad46-metrics-certs podName:a7c2547a-3282-4748-a823-c3a0cc41ad46 nodeName:}" failed. No retries permitted until 2026-01-28 20:56:01.516821682 +0000 UTC m=+989.473008036 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/a7c2547a-3282-4748-a823-c3a0cc41ad46-metrics-certs") pod "openstack-operator-controller-manager-65d466cb7d-vf8n9" (UID: "a7c2547a-3282-4748-a823-c3a0cc41ad46") : secret "metrics-server-cert" not found Jan 28 20:56:00 crc kubenswrapper[4746]: I0128 20:56:00.383257 4746 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/rabbitmq-cluster-operator-manager-668c99d594-8kpht" event={"ID":"6b7a0005-11ec-4c8a-87e9-872855585d4d","Type":"ContainerStarted","Data":"2c77bfe5e07c7d77d6e6458836e0cfb46c6840faac22a36621a608c7a930cc87"} Jan 28 20:56:00 crc kubenswrapper[4746]: E0128 20:56:00.384962 4746 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/openstack-k8s-operators/test-operator@sha256:c8dde42dafd41026ed2e4cfc26efc0fff63c4ba9d31326ae7dc644ccceaafa9d\\\"\"" pod="openstack-operators/test-operator-controller-manager-69797bbcbd-m4x6j" podUID="beba987e-69be-47aa-a84c-7ea511c4d151" Jan 28 20:56:00 crc kubenswrapper[4746]: E0128 20:56:00.385626 4746 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"operator\" with ImagePullBackOff: \"Back-off pulling image 
\\\"quay.io/openstack-k8s-operators/rabbitmq-cluster-operator@sha256:893e66303c1b0bc1d00a299a3f0380bad55c8dc813c8a1c6a4aab379f5aa12a2\\\"\"" pod="openstack-operators/rabbitmq-cluster-operator-manager-668c99d594-8kpht" podUID="6b7a0005-11ec-4c8a-87e9-872855585d4d" Jan 28 20:56:00 crc kubenswrapper[4746]: E0128 20:56:00.385874 4746 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/openstack-k8s-operators/swift-operator@sha256:445e951df2f21df6d33a466f75917e0f6103052ae751ae11887136e8ab165922\\\"\"" pod="openstack-operators/swift-operator-controller-manager-547cbdb99f-fjs9l" podUID="1f4e2d58-bbd0-45d2-81ba-2b4b47ab5af6" Jan 28 20:56:00 crc kubenswrapper[4746]: E0128 20:56:00.386339 4746 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/openstack-k8s-operators/placement-operator@sha256:013c0ad82d21a21c7eece5cd4b5d5c4b8eb410b6671ac33a6f3fb78c8510811d\\\"\"" pod="openstack-operators/placement-operator-controller-manager-79d5ccc684-6klzp" podUID="370a5739-7af0-4065-986c-af68a265423c" Jan 28 20:56:00 crc kubenswrapper[4746]: E0128 20:56:00.386567 4746 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/openstack-k8s-operators/neutron-operator@sha256:816d474f502d730d6a2522a272b0e09a2d579ac63617817655d60c54bda4191e\\\"\"" pod="openstack-operators/neutron-operator-controller-manager-78d58447c5-pcprz" podUID="b182a0df-d0f9-46d6-9a0c-a3e332c84cff" Jan 28 20:56:00 crc kubenswrapper[4746]: E0128 20:56:00.386683 4746 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ImagePullBackOff: \"Back-off pulling image 
\\\"38.102.83.50:5001/openstack-k8s-operators/telemetry-operator:a5bcf05e2d71c610156d017fdf197f7c58570d79\\\"\"" pod="openstack-operators/telemetry-operator-controller-manager-9477bbd48-z984g" podUID="e42669f3-6865-4ab6-9a9a-241c7b07509d" Jan 28 20:56:00 crc kubenswrapper[4746]: I0128 20:56:00.748748 4746 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/28de2427-e250-44f5-add2-1b738cf6ce3b-cert\") pod \"infra-operator-controller-manager-694cf4f878-th2hg\" (UID: \"28de2427-e250-44f5-add2-1b738cf6ce3b\") " pod="openstack-operators/infra-operator-controller-manager-694cf4f878-th2hg" Jan 28 20:56:00 crc kubenswrapper[4746]: E0128 20:56:00.748908 4746 secret.go:188] Couldn't get secret openstack-operators/infra-operator-webhook-server-cert: secret "infra-operator-webhook-server-cert" not found Jan 28 20:56:00 crc kubenswrapper[4746]: E0128 20:56:00.749380 4746 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/28de2427-e250-44f5-add2-1b738cf6ce3b-cert podName:28de2427-e250-44f5-add2-1b738cf6ce3b nodeName:}" failed. No retries permitted until 2026-01-28 20:56:04.749364258 +0000 UTC m=+992.705550612 (durationBeforeRetry 4s). 
Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/28de2427-e250-44f5-add2-1b738cf6ce3b-cert") pod "infra-operator-controller-manager-694cf4f878-th2hg" (UID: "28de2427-e250-44f5-add2-1b738cf6ce3b") : secret "infra-operator-webhook-server-cert" not found Jan 28 20:56:01 crc kubenswrapper[4746]: I0128 20:56:01.054432 4746 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/3ff4c44c-0290-4ab0-abb8-316375200dc0-cert\") pod \"openstack-baremetal-operator-controller-manager-6b68b8b85449cmp\" (UID: \"3ff4c44c-0290-4ab0-abb8-316375200dc0\") " pod="openstack-operators/openstack-baremetal-operator-controller-manager-6b68b8b85449cmp" Jan 28 20:56:01 crc kubenswrapper[4746]: E0128 20:56:01.054734 4746 secret.go:188] Couldn't get secret openstack-operators/openstack-baremetal-operator-webhook-server-cert: secret "openstack-baremetal-operator-webhook-server-cert" not found Jan 28 20:56:01 crc kubenswrapper[4746]: E0128 20:56:01.054873 4746 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/3ff4c44c-0290-4ab0-abb8-316375200dc0-cert podName:3ff4c44c-0290-4ab0-abb8-316375200dc0 nodeName:}" failed. No retries permitted until 2026-01-28 20:56:05.054839372 +0000 UTC m=+993.011025726 (durationBeforeRetry 4s). 
Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/3ff4c44c-0290-4ab0-abb8-316375200dc0-cert") pod "openstack-baremetal-operator-controller-manager-6b68b8b85449cmp" (UID: "3ff4c44c-0290-4ab0-abb8-316375200dc0") : secret "openstack-baremetal-operator-webhook-server-cert" not found Jan 28 20:56:01 crc kubenswrapper[4746]: E0128 20:56:01.412001 4746 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"operator\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/openstack-k8s-operators/rabbitmq-cluster-operator@sha256:893e66303c1b0bc1d00a299a3f0380bad55c8dc813c8a1c6a4aab379f5aa12a2\\\"\"" pod="openstack-operators/rabbitmq-cluster-operator-manager-668c99d594-8kpht" podUID="6b7a0005-11ec-4c8a-87e9-872855585d4d" Jan 28 20:56:01 crc kubenswrapper[4746]: I0128 20:56:01.566035 4746 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"webhook-certs\" (UniqueName: \"kubernetes.io/secret/a7c2547a-3282-4748-a823-c3a0cc41ad46-webhook-certs\") pod \"openstack-operator-controller-manager-65d466cb7d-vf8n9\" (UID: \"a7c2547a-3282-4748-a823-c3a0cc41ad46\") " pod="openstack-operators/openstack-operator-controller-manager-65d466cb7d-vf8n9" Jan 28 20:56:01 crc kubenswrapper[4746]: I0128 20:56:01.566122 4746 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/a7c2547a-3282-4748-a823-c3a0cc41ad46-metrics-certs\") pod \"openstack-operator-controller-manager-65d466cb7d-vf8n9\" (UID: \"a7c2547a-3282-4748-a823-c3a0cc41ad46\") " pod="openstack-operators/openstack-operator-controller-manager-65d466cb7d-vf8n9" Jan 28 20:56:01 crc kubenswrapper[4746]: E0128 20:56:01.566261 4746 secret.go:188] Couldn't get secret openstack-operators/metrics-server-cert: secret "metrics-server-cert" not found Jan 28 20:56:01 crc kubenswrapper[4746]: E0128 20:56:01.566328 4746 nestedpendingoperations.go:348] Operation for 
"{volumeName:kubernetes.io/secret/a7c2547a-3282-4748-a823-c3a0cc41ad46-metrics-certs podName:a7c2547a-3282-4748-a823-c3a0cc41ad46 nodeName:}" failed. No retries permitted until 2026-01-28 20:56:05.566311902 +0000 UTC m=+993.522498256 (durationBeforeRetry 4s). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/a7c2547a-3282-4748-a823-c3a0cc41ad46-metrics-certs") pod "openstack-operator-controller-manager-65d466cb7d-vf8n9" (UID: "a7c2547a-3282-4748-a823-c3a0cc41ad46") : secret "metrics-server-cert" not found Jan 28 20:56:01 crc kubenswrapper[4746]: E0128 20:56:01.566660 4746 secret.go:188] Couldn't get secret openstack-operators/webhook-server-cert: secret "webhook-server-cert" not found Jan 28 20:56:01 crc kubenswrapper[4746]: E0128 20:56:01.566704 4746 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/a7c2547a-3282-4748-a823-c3a0cc41ad46-webhook-certs podName:a7c2547a-3282-4748-a823-c3a0cc41ad46 nodeName:}" failed. No retries permitted until 2026-01-28 20:56:05.566695212 +0000 UTC m=+993.522881566 (durationBeforeRetry 4s). 
Error: MountVolume.SetUp failed for volume "webhook-certs" (UniqueName: "kubernetes.io/secret/a7c2547a-3282-4748-a823-c3a0cc41ad46-webhook-certs") pod "openstack-operator-controller-manager-65d466cb7d-vf8n9" (UID: "a7c2547a-3282-4748-a823-c3a0cc41ad46") : secret "webhook-server-cert" not found Jan 28 20:56:04 crc kubenswrapper[4746]: I0128 20:56:04.844017 4746 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/28de2427-e250-44f5-add2-1b738cf6ce3b-cert\") pod \"infra-operator-controller-manager-694cf4f878-th2hg\" (UID: \"28de2427-e250-44f5-add2-1b738cf6ce3b\") " pod="openstack-operators/infra-operator-controller-manager-694cf4f878-th2hg" Jan 28 20:56:04 crc kubenswrapper[4746]: E0128 20:56:04.844221 4746 secret.go:188] Couldn't get secret openstack-operators/infra-operator-webhook-server-cert: secret "infra-operator-webhook-server-cert" not found Jan 28 20:56:04 crc kubenswrapper[4746]: E0128 20:56:04.844608 4746 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/28de2427-e250-44f5-add2-1b738cf6ce3b-cert podName:28de2427-e250-44f5-add2-1b738cf6ce3b nodeName:}" failed. No retries permitted until 2026-01-28 20:56:12.844583537 +0000 UTC m=+1000.800769961 (durationBeforeRetry 8s). 
Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/28de2427-e250-44f5-add2-1b738cf6ce3b-cert") pod "infra-operator-controller-manager-694cf4f878-th2hg" (UID: "28de2427-e250-44f5-add2-1b738cf6ce3b") : secret "infra-operator-webhook-server-cert" not found Jan 28 20:56:05 crc kubenswrapper[4746]: I0128 20:56:05.153175 4746 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/3ff4c44c-0290-4ab0-abb8-316375200dc0-cert\") pod \"openstack-baremetal-operator-controller-manager-6b68b8b85449cmp\" (UID: \"3ff4c44c-0290-4ab0-abb8-316375200dc0\") " pod="openstack-operators/openstack-baremetal-operator-controller-manager-6b68b8b85449cmp" Jan 28 20:56:05 crc kubenswrapper[4746]: E0128 20:56:05.153513 4746 secret.go:188] Couldn't get secret openstack-operators/openstack-baremetal-operator-webhook-server-cert: secret "openstack-baremetal-operator-webhook-server-cert" not found Jan 28 20:56:05 crc kubenswrapper[4746]: E0128 20:56:05.153613 4746 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/3ff4c44c-0290-4ab0-abb8-316375200dc0-cert podName:3ff4c44c-0290-4ab0-abb8-316375200dc0 nodeName:}" failed. No retries permitted until 2026-01-28 20:56:13.153568654 +0000 UTC m=+1001.109755008 (durationBeforeRetry 8s). 
Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/3ff4c44c-0290-4ab0-abb8-316375200dc0-cert") pod "openstack-baremetal-operator-controller-manager-6b68b8b85449cmp" (UID: "3ff4c44c-0290-4ab0-abb8-316375200dc0") : secret "openstack-baremetal-operator-webhook-server-cert" not found Jan 28 20:56:05 crc kubenswrapper[4746]: I0128 20:56:05.660850 4746 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"webhook-certs\" (UniqueName: \"kubernetes.io/secret/a7c2547a-3282-4748-a823-c3a0cc41ad46-webhook-certs\") pod \"openstack-operator-controller-manager-65d466cb7d-vf8n9\" (UID: \"a7c2547a-3282-4748-a823-c3a0cc41ad46\") " pod="openstack-operators/openstack-operator-controller-manager-65d466cb7d-vf8n9" Jan 28 20:56:05 crc kubenswrapper[4746]: I0128 20:56:05.660918 4746 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/a7c2547a-3282-4748-a823-c3a0cc41ad46-metrics-certs\") pod \"openstack-operator-controller-manager-65d466cb7d-vf8n9\" (UID: \"a7c2547a-3282-4748-a823-c3a0cc41ad46\") " pod="openstack-operators/openstack-operator-controller-manager-65d466cb7d-vf8n9" Jan 28 20:56:05 crc kubenswrapper[4746]: E0128 20:56:05.661048 4746 secret.go:188] Couldn't get secret openstack-operators/metrics-server-cert: secret "metrics-server-cert" not found Jan 28 20:56:05 crc kubenswrapper[4746]: E0128 20:56:05.661093 4746 secret.go:188] Couldn't get secret openstack-operators/webhook-server-cert: secret "webhook-server-cert" not found Jan 28 20:56:05 crc kubenswrapper[4746]: E0128 20:56:05.661114 4746 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/a7c2547a-3282-4748-a823-c3a0cc41ad46-metrics-certs podName:a7c2547a-3282-4748-a823-c3a0cc41ad46 nodeName:}" failed. No retries permitted until 2026-01-28 20:56:13.661100679 +0000 UTC m=+1001.617287033 (durationBeforeRetry 8s). 
Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/a7c2547a-3282-4748-a823-c3a0cc41ad46-metrics-certs") pod "openstack-operator-controller-manager-65d466cb7d-vf8n9" (UID: "a7c2547a-3282-4748-a823-c3a0cc41ad46") : secret "metrics-server-cert" not found Jan 28 20:56:05 crc kubenswrapper[4746]: E0128 20:56:05.661201 4746 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/a7c2547a-3282-4748-a823-c3a0cc41ad46-webhook-certs podName:a7c2547a-3282-4748-a823-c3a0cc41ad46 nodeName:}" failed. No retries permitted until 2026-01-28 20:56:13.661182541 +0000 UTC m=+1001.617368895 (durationBeforeRetry 8s). Error: MountVolume.SetUp failed for volume "webhook-certs" (UniqueName: "kubernetes.io/secret/a7c2547a-3282-4748-a823-c3a0cc41ad46-webhook-certs") pod "openstack-operator-controller-manager-65d466cb7d-vf8n9" (UID: "a7c2547a-3282-4748-a823-c3a0cc41ad46") : secret "webhook-server-cert" not found Jan 28 20:56:12 crc kubenswrapper[4746]: I0128 20:56:12.899007 4746 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/28de2427-e250-44f5-add2-1b738cf6ce3b-cert\") pod \"infra-operator-controller-manager-694cf4f878-th2hg\" (UID: \"28de2427-e250-44f5-add2-1b738cf6ce3b\") " pod="openstack-operators/infra-operator-controller-manager-694cf4f878-th2hg" Jan 28 20:56:12 crc kubenswrapper[4746]: I0128 20:56:12.908169 4746 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cert\" (UniqueName: \"kubernetes.io/secret/28de2427-e250-44f5-add2-1b738cf6ce3b-cert\") pod \"infra-operator-controller-manager-694cf4f878-th2hg\" (UID: \"28de2427-e250-44f5-add2-1b738cf6ce3b\") " pod="openstack-operators/infra-operator-controller-manager-694cf4f878-th2hg" Jan 28 20:56:12 crc kubenswrapper[4746]: I0128 20:56:12.937240 4746 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/infra-operator-controller-manager-694cf4f878-th2hg" Jan 28 20:56:13 crc kubenswrapper[4746]: I0128 20:56:13.204237 4746 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/3ff4c44c-0290-4ab0-abb8-316375200dc0-cert\") pod \"openstack-baremetal-operator-controller-manager-6b68b8b85449cmp\" (UID: \"3ff4c44c-0290-4ab0-abb8-316375200dc0\") " pod="openstack-operators/openstack-baremetal-operator-controller-manager-6b68b8b85449cmp" Jan 28 20:56:13 crc kubenswrapper[4746]: I0128 20:56:13.209873 4746 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cert\" (UniqueName: \"kubernetes.io/secret/3ff4c44c-0290-4ab0-abb8-316375200dc0-cert\") pod \"openstack-baremetal-operator-controller-manager-6b68b8b85449cmp\" (UID: \"3ff4c44c-0290-4ab0-abb8-316375200dc0\") " pod="openstack-operators/openstack-baremetal-operator-controller-manager-6b68b8b85449cmp" Jan 28 20:56:13 crc kubenswrapper[4746]: I0128 20:56:13.320533 4746 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/openstack-baremetal-operator-controller-manager-6b68b8b85449cmp" Jan 28 20:56:13 crc kubenswrapper[4746]: I0128 20:56:13.710236 4746 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/a7c2547a-3282-4748-a823-c3a0cc41ad46-metrics-certs\") pod \"openstack-operator-controller-manager-65d466cb7d-vf8n9\" (UID: \"a7c2547a-3282-4748-a823-c3a0cc41ad46\") " pod="openstack-operators/openstack-operator-controller-manager-65d466cb7d-vf8n9" Jan 28 20:56:13 crc kubenswrapper[4746]: I0128 20:56:13.710718 4746 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"webhook-certs\" (UniqueName: \"kubernetes.io/secret/a7c2547a-3282-4748-a823-c3a0cc41ad46-webhook-certs\") pod \"openstack-operator-controller-manager-65d466cb7d-vf8n9\" (UID: \"a7c2547a-3282-4748-a823-c3a0cc41ad46\") " pod="openstack-operators/openstack-operator-controller-manager-65d466cb7d-vf8n9" Jan 28 20:56:13 crc kubenswrapper[4746]: I0128 20:56:13.714765 4746 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/a7c2547a-3282-4748-a823-c3a0cc41ad46-metrics-certs\") pod \"openstack-operator-controller-manager-65d466cb7d-vf8n9\" (UID: \"a7c2547a-3282-4748-a823-c3a0cc41ad46\") " pod="openstack-operators/openstack-operator-controller-manager-65d466cb7d-vf8n9" Jan 28 20:56:13 crc kubenswrapper[4746]: I0128 20:56:13.721141 4746 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"webhook-certs\" (UniqueName: \"kubernetes.io/secret/a7c2547a-3282-4748-a823-c3a0cc41ad46-webhook-certs\") pod \"openstack-operator-controller-manager-65d466cb7d-vf8n9\" (UID: \"a7c2547a-3282-4748-a823-c3a0cc41ad46\") " pod="openstack-operators/openstack-operator-controller-manager-65d466cb7d-vf8n9" Jan 28 20:56:13 crc kubenswrapper[4746]: I0128 20:56:13.750161 4746 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/openstack-operator-controller-manager-65d466cb7d-vf8n9" Jan 28 20:56:15 crc kubenswrapper[4746]: E0128 20:56:15.094808 4746 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying config: context canceled" image="quay.io/openstack-k8s-operators/designate-operator@sha256:6c88312afa9673f7b72c558368034d7a488ead73080cdcdf581fe85b99263ece" Jan 28 20:56:15 crc kubenswrapper[4746]: E0128 20:56:15.095545 4746 kuberuntime_manager.go:1274] "Unhandled Error" err="container &Container{Name:manager,Image:quay.io/openstack-k8s-operators/designate-operator@sha256:6c88312afa9673f7b72c558368034d7a488ead73080cdcdf581fe85b99263ece,Command:[/manager],Args:[--leader-elect --health-probe-bind-address=:8081 --metrics-bind-address=127.0.0.1:8080],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:LEASE_DURATION,Value:30,ValueFrom:nil,},EnvVar{Name:RENEW_DEADLINE,Value:20,ValueFrom:nil,},EnvVar{Name:RETRY_PERIOD,Value:5,ValueFrom:nil,},EnvVar{Name:ENABLE_WEBHOOKS,Value:false,ValueFrom:nil,},EnvVar{Name:METRICS_CERTS,Value:false,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{cpu: {{500 -3} {} 500m DecimalSI},memory: {{536870912 0} {} BinarySI},},Requests:ResourceList{cpu: {{10 -3} {} 10m DecimalSI},memory: {{268435456 0} {} BinarySI},},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:kube-api-access-6sf94,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/healthz,Port:{0 8081 },Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:15,TimeoutSeconds:1,PeriodSeconds:20,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},ReadinessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/readyz,Port:{0 
8081 },Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:5,TimeoutSeconds:1,PeriodSeconds:10,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[MKNOD],},Privileged:nil,SELinuxOptions:nil,RunAsUser:nil,RunAsNonRoot:nil,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod designate-operator-controller-manager-b45d7bf98-cm85d_openstack-operators(63794c40-0128-457d-b223-84e87943cca9): ErrImagePull: rpc error: code = Canceled desc = copying config: context canceled" logger="UnhandledError" Jan 28 20:56:15 crc kubenswrapper[4746]: E0128 20:56:15.097036 4746 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ErrImagePull: \"rpc error: code = Canceled desc = copying config: context canceled\"" pod="openstack-operators/designate-operator-controller-manager-b45d7bf98-cm85d" podUID="63794c40-0128-457d-b223-84e87943cca9" Jan 28 20:56:15 crc kubenswrapper[4746]: E0128 20:56:15.571643 4746 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/openstack-k8s-operators/designate-operator@sha256:6c88312afa9673f7b72c558368034d7a488ead73080cdcdf581fe85b99263ece\\\"\"" pod="openstack-operators/designate-operator-controller-manager-b45d7bf98-cm85d" podUID="63794c40-0128-457d-b223-84e87943cca9" Jan 28 20:56:15 crc kubenswrapper[4746]: E0128 20:56:15.848608 4746 log.go:32] "PullImage from image 
service failed" err="rpc error: code = Canceled desc = copying config: context canceled" image="quay.io/openstack-k8s-operators/cinder-operator@sha256:b916c87806b7eadd83e0ca890c3c24fb990fc5beb48ddc4537e3384efd3e62f7" Jan 28 20:56:15 crc kubenswrapper[4746]: E0128 20:56:15.848784 4746 kuberuntime_manager.go:1274] "Unhandled Error" err="container &Container{Name:manager,Image:quay.io/openstack-k8s-operators/cinder-operator@sha256:b916c87806b7eadd83e0ca890c3c24fb990fc5beb48ddc4537e3384efd3e62f7,Command:[/manager],Args:[--leader-elect --health-probe-bind-address=:8081 --metrics-bind-address=127.0.0.1:8080],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:LEASE_DURATION,Value:30,ValueFrom:nil,},EnvVar{Name:RENEW_DEADLINE,Value:20,ValueFrom:nil,},EnvVar{Name:RETRY_PERIOD,Value:5,ValueFrom:nil,},EnvVar{Name:ENABLE_WEBHOOKS,Value:false,ValueFrom:nil,},EnvVar{Name:METRICS_CERTS,Value:false,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{cpu: {{500 -3} {} 500m DecimalSI},memory: {{536870912 0} {} BinarySI},},Requests:ResourceList{cpu: {{10 -3} {} 10m DecimalSI},memory: {{268435456 0} {} BinarySI},},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:kube-api-access-fpv9p,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/healthz,Port:{0 8081 },Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:15,TimeoutSeconds:1,PeriodSeconds:20,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},ReadinessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/readyz,Port:{0 8081 
},Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:5,TimeoutSeconds:1,PeriodSeconds:10,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[MKNOD],},Privileged:nil,SELinuxOptions:nil,RunAsUser:nil,RunAsNonRoot:nil,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod cinder-operator-controller-manager-7478f7dbf9-n6qr7_openstack-operators(f86e66ed-9f28-4514-8ff8-97b8353026d1): ErrImagePull: rpc error: code = Canceled desc = copying config: context canceled" logger="UnhandledError" Jan 28 20:56:15 crc kubenswrapper[4746]: E0128 20:56:15.849943 4746 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ErrImagePull: \"rpc error: code = Canceled desc = copying config: context canceled\"" pod="openstack-operators/cinder-operator-controller-manager-7478f7dbf9-n6qr7" podUID="f86e66ed-9f28-4514-8ff8-97b8353026d1" Jan 28 20:56:15 crc kubenswrapper[4746]: I0128 20:56:15.871333 4746 patch_prober.go:28] interesting pod/machine-config-daemon-6wrnw container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Jan 28 20:56:15 crc kubenswrapper[4746]: I0128 20:56:15.871423 4746 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-6wrnw" podUID="6dc8b546-9734-4082-b2b3-2bafe3f1564d" 
containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Jan 28 20:56:16 crc kubenswrapper[4746]: E0128 20:56:16.580587 4746 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/openstack-k8s-operators/cinder-operator@sha256:b916c87806b7eadd83e0ca890c3c24fb990fc5beb48ddc4537e3384efd3e62f7\\\"\"" pod="openstack-operators/cinder-operator-controller-manager-7478f7dbf9-n6qr7" podUID="f86e66ed-9f28-4514-8ff8-97b8353026d1" Jan 28 20:56:16 crc kubenswrapper[4746]: E0128 20:56:16.921453 4746 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying config: context canceled" image="quay.io/openstack-k8s-operators/nova-operator@sha256:8abfbec47f0119a6c22c61a0ff80a4b1c6c14439a327bc75d4c529c5d8f59658" Jan 28 20:56:16 crc kubenswrapper[4746]: E0128 20:56:16.921680 4746 kuberuntime_manager.go:1274] "Unhandled Error" err="container &Container{Name:manager,Image:quay.io/openstack-k8s-operators/nova-operator@sha256:8abfbec47f0119a6c22c61a0ff80a4b1c6c14439a327bc75d4c529c5d8f59658,Command:[/manager],Args:[--leader-elect --health-probe-bind-address=:8081 --metrics-bind-address=127.0.0.1:8080],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:LEASE_DURATION,Value:30,ValueFrom:nil,},EnvVar{Name:RENEW_DEADLINE,Value:20,ValueFrom:nil,},EnvVar{Name:RETRY_PERIOD,Value:5,ValueFrom:nil,},EnvVar{Name:ENABLE_WEBHOOKS,Value:false,ValueFrom:nil,},EnvVar{Name:METRICS_CERTS,Value:false,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{cpu: {{500 -3} {} 500m DecimalSI},memory: {{536870912 0} {} BinarySI},},Requests:ResourceList{cpu: {{10 -3} {} 10m DecimalSI},memory: {{268435456 0} {} 
BinarySI},},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:kube-api-access-555m9,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/healthz,Port:{0 8081 },Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:15,TimeoutSeconds:1,PeriodSeconds:20,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},ReadinessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/readyz,Port:{0 8081 },Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:5,TimeoutSeconds:1,PeriodSeconds:10,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[MKNOD],},Privileged:nil,SELinuxOptions:nil,RunAsUser:nil,RunAsNonRoot:nil,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod nova-operator-controller-manager-7bdb645866-pg4s4_openstack-operators(ced3eeee-ed33-4c50-8531-a7e4df1849f6): ErrImagePull: rpc error: code = Canceled desc = copying config: context canceled" logger="UnhandledError" Jan 28 20:56:16 crc kubenswrapper[4746]: E0128 20:56:16.923462 4746 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ErrImagePull: \"rpc error: code = Canceled desc = copying config: context canceled\"" 
pod="openstack-operators/nova-operator-controller-manager-7bdb645866-pg4s4" podUID="ced3eeee-ed33-4c50-8531-a7e4df1849f6" Jan 28 20:56:17 crc kubenswrapper[4746]: E0128 20:56:17.504731 4746 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying config: context canceled" image="quay.io/openstack-k8s-operators/keystone-operator@sha256:8e340ff11922b38e811261de96982e1aff5f4eb8f225d1d9f5973025a4fe8349" Jan 28 20:56:17 crc kubenswrapper[4746]: E0128 20:56:17.505379 4746 kuberuntime_manager.go:1274] "Unhandled Error" err="container &Container{Name:manager,Image:quay.io/openstack-k8s-operators/keystone-operator@sha256:8e340ff11922b38e811261de96982e1aff5f4eb8f225d1d9f5973025a4fe8349,Command:[/manager],Args:[--leader-elect --health-probe-bind-address=:8081 --metrics-bind-address=127.0.0.1:8080],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:LEASE_DURATION,Value:30,ValueFrom:nil,},EnvVar{Name:RENEW_DEADLINE,Value:20,ValueFrom:nil,},EnvVar{Name:RETRY_PERIOD,Value:5,ValueFrom:nil,},EnvVar{Name:ENABLE_WEBHOOKS,Value:false,ValueFrom:nil,},EnvVar{Name:METRICS_CERTS,Value:false,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{cpu: {{500 -3} {} 500m DecimalSI},memory: {{536870912 0} {} BinarySI},},Requests:ResourceList{cpu: {{10 -3} {} 10m DecimalSI},memory: {{268435456 0} {} BinarySI},},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:kube-api-access-gjxzk,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/healthz,Port:{0 8081 
},Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:15,TimeoutSeconds:1,PeriodSeconds:20,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},ReadinessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/readyz,Port:{0 8081 },Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:5,TimeoutSeconds:1,PeriodSeconds:10,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[MKNOD],},Privileged:nil,SELinuxOptions:nil,RunAsUser:nil,RunAsNonRoot:nil,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod keystone-operator-controller-manager-b8b6d4659-65qb5_openstack-operators(fc220202-4669-4c2e-94b0-583048b56c83): ErrImagePull: rpc error: code = Canceled desc = copying config: context canceled" logger="UnhandledError" Jan 28 20:56:17 crc kubenswrapper[4746]: E0128 20:56:17.506549 4746 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ErrImagePull: \"rpc error: code = Canceled desc = copying config: context canceled\"" pod="openstack-operators/keystone-operator-controller-manager-b8b6d4659-65qb5" podUID="fc220202-4669-4c2e-94b0-583048b56c83" Jan 28 20:56:17 crc kubenswrapper[4746]: E0128 20:56:17.592047 4746 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ImagePullBackOff: \"Back-off pulling image 
\\\"quay.io/openstack-k8s-operators/keystone-operator@sha256:8e340ff11922b38e811261de96982e1aff5f4eb8f225d1d9f5973025a4fe8349\\\"\"" pod="openstack-operators/keystone-operator-controller-manager-b8b6d4659-65qb5" podUID="fc220202-4669-4c2e-94b0-583048b56c83" Jan 28 20:56:17 crc kubenswrapper[4746]: E0128 20:56:17.592232 4746 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/openstack-k8s-operators/nova-operator@sha256:8abfbec47f0119a6c22c61a0ff80a4b1c6c14439a327bc75d4c529c5d8f59658\\\"\"" pod="openstack-operators/nova-operator-controller-manager-7bdb645866-pg4s4" podUID="ced3eeee-ed33-4c50-8531-a7e4df1849f6" Jan 28 20:56:28 crc kubenswrapper[4746]: E0128 20:56:28.317907 4746 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying config: context canceled" image="38.102.83.50:5001/openstack-k8s-operators/telemetry-operator:a5bcf05e2d71c610156d017fdf197f7c58570d79" Jan 28 20:56:28 crc kubenswrapper[4746]: E0128 20:56:28.318544 4746 kuberuntime_image.go:55] "Failed to pull image" err="rpc error: code = Canceled desc = copying config: context canceled" image="38.102.83.50:5001/openstack-k8s-operators/telemetry-operator:a5bcf05e2d71c610156d017fdf197f7c58570d79" Jan 28 20:56:28 crc kubenswrapper[4746]: E0128 20:56:28.318782 4746 kuberuntime_manager.go:1274] "Unhandled Error" err="container &Container{Name:manager,Image:38.102.83.50:5001/openstack-k8s-operators/telemetry-operator:a5bcf05e2d71c610156d017fdf197f7c58570d79,Command:[/manager],Args:[--leader-elect --health-probe-bind-address=:8081 
--metrics-bind-address=127.0.0.1:8080],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:LEASE_DURATION,Value:30,ValueFrom:nil,},EnvVar{Name:RENEW_DEADLINE,Value:20,ValueFrom:nil,},EnvVar{Name:RETRY_PERIOD,Value:5,ValueFrom:nil,},EnvVar{Name:ENABLE_WEBHOOKS,Value:false,ValueFrom:nil,},EnvVar{Name:METRICS_CERTS,Value:false,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{cpu: {{500 -3} {} 500m DecimalSI},memory: {{536870912 0} {} BinarySI},},Requests:ResourceList{cpu: {{10 -3} {} 10m DecimalSI},memory: {{268435456 0} {} BinarySI},},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:kube-api-access-6htnl,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/healthz,Port:{0 8081 },Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:15,TimeoutSeconds:1,PeriodSeconds:20,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},ReadinessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/readyz,Port:{0 8081 
},Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:5,TimeoutSeconds:1,PeriodSeconds:10,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[MKNOD],},Privileged:nil,SELinuxOptions:nil,RunAsUser:nil,RunAsNonRoot:nil,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod telemetry-operator-controller-manager-9477bbd48-z984g_openstack-operators(e42669f3-6865-4ab6-9a9a-241c7b07509d): ErrImagePull: rpc error: code = Canceled desc = copying config: context canceled" logger="UnhandledError" Jan 28 20:56:28 crc kubenswrapper[4746]: E0128 20:56:28.320002 4746 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ErrImagePull: \"rpc error: code = Canceled desc = copying config: context canceled\"" pod="openstack-operators/telemetry-operator-controller-manager-9477bbd48-z984g" podUID="e42669f3-6865-4ab6-9a9a-241c7b07509d" Jan 28 20:56:28 crc kubenswrapper[4746]: I0128 20:56:28.635301 4746 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/infra-operator-controller-manager-694cf4f878-th2hg"] Jan 28 20:56:28 crc kubenswrapper[4746]: I0128 20:56:28.642854 4746 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/openstack-baremetal-operator-controller-manager-6b68b8b85449cmp"] Jan 28 20:56:28 crc kubenswrapper[4746]: I0128 20:56:28.719901 4746 kubelet.go:2428] "SyncLoop UPDATE" source="api" 
pods=["openstack-operators/openstack-operator-controller-manager-65d466cb7d-vf8n9"] Jan 28 20:56:28 crc kubenswrapper[4746]: W0128 20:56:28.817140 4746 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod3ff4c44c_0290_4ab0_abb8_316375200dc0.slice/crio-f5cf3062e0efe9043c81cd552c478ab42b5ee77132fbd26534f1651cde7e486e WatchSource:0}: Error finding container f5cf3062e0efe9043c81cd552c478ab42b5ee77132fbd26534f1651cde7e486e: Status 404 returned error can't find the container with id f5cf3062e0efe9043c81cd552c478ab42b5ee77132fbd26534f1651cde7e486e Jan 28 20:56:28 crc kubenswrapper[4746]: W0128 20:56:28.818555 4746 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-poda7c2547a_3282_4748_a823_c3a0cc41ad46.slice/crio-c0912de9eaa49522b5ac51ee8ce9ca55e0c73952ede17a27778a9722dbd9a3ea WatchSource:0}: Error finding container c0912de9eaa49522b5ac51ee8ce9ca55e0c73952ede17a27778a9722dbd9a3ea: Status 404 returned error can't find the container with id c0912de9eaa49522b5ac51ee8ce9ca55e0c73952ede17a27778a9722dbd9a3ea Jan 28 20:56:28 crc kubenswrapper[4746]: W0128 20:56:28.825762 4746 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod28de2427_e250_44f5_add2_1b738cf6ce3b.slice/crio-d8bdfdf1894957fbe1b175a7c2054096787660fb4609624a84fc6d9018fe341d WatchSource:0}: Error finding container d8bdfdf1894957fbe1b175a7c2054096787660fb4609624a84fc6d9018fe341d: Status 404 returned error can't find the container with id d8bdfdf1894957fbe1b175a7c2054096787660fb4609624a84fc6d9018fe341d Jan 28 20:56:28 crc kubenswrapper[4746]: E0128 20:56:28.826225 4746 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying config: context canceled" 
image="quay.io/openstack-k8s-operators/rabbitmq-cluster-operator@sha256:893e66303c1b0bc1d00a299a3f0380bad55c8dc813c8a1c6a4aab379f5aa12a2" Jan 28 20:56:28 crc kubenswrapper[4746]: E0128 20:56:28.826496 4746 kuberuntime_manager.go:1274] "Unhandled Error" err="container &Container{Name:operator,Image:quay.io/openstack-k8s-operators/rabbitmq-cluster-operator@sha256:893e66303c1b0bc1d00a299a3f0380bad55c8dc813c8a1c6a4aab379f5aa12a2,Command:[/manager],Args:[],WorkingDir:,Ports:[]ContainerPort{ContainerPort{Name:metrics,HostPort:0,ContainerPort:9782,Protocol:TCP,HostIP:,},},Env:[]EnvVar{EnvVar{Name:OPERATOR_NAMESPACE,Value:,ValueFrom:&EnvVarSource{FieldRef:&ObjectFieldSelector{APIVersion:v1,FieldPath:metadata.namespace,},ResourceFieldRef:nil,ConfigMapKeyRef:nil,SecretKeyRef:nil,},},EnvVar{Name:LEASE_DURATION,Value:30,ValueFrom:nil,},EnvVar{Name:RENEW_DEADLINE,Value:20,ValueFrom:nil,},EnvVar{Name:RETRY_PERIOD,Value:5,ValueFrom:nil,},EnvVar{Name:ENABLE_WEBHOOKS,Value:false,ValueFrom:nil,},EnvVar{Name:METRICS_CERTS,Value:false,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{cpu: {{200 -3} {} 200m DecimalSI},memory: {{524288000 0} {} 500Mi BinarySI},},Requests:ResourceList{cpu: {{5 -3} {} 5m DecimalSI},memory: {{67108864 0} {} 
BinarySI},},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:kube-api-access-4hhdd,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*1000660000,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod rabbitmq-cluster-operator-manager-668c99d594-8kpht_openstack-operators(6b7a0005-11ec-4c8a-87e9-872855585d4d): ErrImagePull: rpc error: code = Canceled desc = copying config: context canceled" logger="UnhandledError" Jan 28 20:56:28 crc kubenswrapper[4746]: E0128 20:56:28.827889 4746 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"operator\" with ErrImagePull: \"rpc error: code = Canceled desc = copying config: context canceled\"" pod="openstack-operators/rabbitmq-cluster-operator-manager-668c99d594-8kpht" podUID="6b7a0005-11ec-4c8a-87e9-872855585d4d" Jan 28 20:56:29 crc kubenswrapper[4746]: I0128 20:56:29.743544 4746 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/heat-operator-controller-manager-594c8c9d5d-p6qjg" event={"ID":"677d2ab0-897d-4fd5-8ca5-b75f310e38da","Type":"ContainerStarted","Data":"fd012700a43a655f827c152b6a0bf542ce386b679075b04c066f9c4723bc05d6"} Jan 28 20:56:29 crc kubenswrapper[4746]: I0128 20:56:29.745657 4746 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" 
pod="openstack-operators/heat-operator-controller-manager-594c8c9d5d-p6qjg" Jan 28 20:56:29 crc kubenswrapper[4746]: I0128 20:56:29.759243 4746 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/openstack-baremetal-operator-controller-manager-6b68b8b85449cmp" event={"ID":"3ff4c44c-0290-4ab0-abb8-316375200dc0","Type":"ContainerStarted","Data":"f5cf3062e0efe9043c81cd552c478ab42b5ee77132fbd26534f1651cde7e486e"} Jan 28 20:56:29 crc kubenswrapper[4746]: I0128 20:56:29.800296 4746 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/manila-operator-controller-manager-78c6999f6f-m5qbs" event={"ID":"f682c47e-2151-466d-8cc5-9ef0fca79785","Type":"ContainerStarted","Data":"260feb2d67e756e14263b3c21b50002f9b8e1ba9c263cdde3ee9490afe7351dc"} Jan 28 20:56:29 crc kubenswrapper[4746]: I0128 20:56:29.800423 4746 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/heat-operator-controller-manager-594c8c9d5d-p6qjg" podStartSLOduration=9.286271505 podStartE2EDuration="33.800393253s" podCreationTimestamp="2026-01-28 20:55:56 +0000 UTC" firstStartedPulling="2026-01-28 20:55:58.752598251 +0000 UTC m=+986.708784605" lastFinishedPulling="2026-01-28 20:56:23.266719989 +0000 UTC m=+1011.222906353" observedRunningTime="2026-01-28 20:56:29.791545585 +0000 UTC m=+1017.747731939" watchObservedRunningTime="2026-01-28 20:56:29.800393253 +0000 UTC m=+1017.756579607" Jan 28 20:56:29 crc kubenswrapper[4746]: I0128 20:56:29.801338 4746 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/manila-operator-controller-manager-78c6999f6f-m5qbs" Jan 28 20:56:29 crc kubenswrapper[4746]: I0128 20:56:29.836797 4746 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/watcher-operator-controller-manager-564965969-hd4k9" event={"ID":"90c190b4-36db-406b-bca5-6c45ac745ed6","Type":"ContainerStarted","Data":"9d82c40a886767ee34cb262495e3af741532266c237ca09f1a4bb414db75c0fe"} 
Jan 28 20:56:29 crc kubenswrapper[4746]: I0128 20:56:29.839495 4746 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/watcher-operator-controller-manager-564965969-hd4k9" Jan 28 20:56:29 crc kubenswrapper[4746]: I0128 20:56:29.846714 4746 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/octavia-operator-controller-manager-5f4cd88d46-hb6t9" event={"ID":"3b28dc9c-6dcf-4fd1-8cbd-f13d0da9e954","Type":"ContainerStarted","Data":"3755cb6901f1488fa704c7912f97c31d16d1bbec44bd5b8169073ea1c860e40c"} Jan 28 20:56:29 crc kubenswrapper[4746]: I0128 20:56:29.847625 4746 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/octavia-operator-controller-manager-5f4cd88d46-hb6t9" Jan 28 20:56:29 crc kubenswrapper[4746]: I0128 20:56:29.848966 4746 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/manila-operator-controller-manager-78c6999f6f-m5qbs" podStartSLOduration=14.53760857 podStartE2EDuration="33.848954884s" podCreationTimestamp="2026-01-28 20:55:56 +0000 UTC" firstStartedPulling="2026-01-28 20:55:58.981224761 +0000 UTC m=+986.937411115" lastFinishedPulling="2026-01-28 20:56:18.292571075 +0000 UTC m=+1006.248757429" observedRunningTime="2026-01-28 20:56:29.840823784 +0000 UTC m=+1017.797010138" watchObservedRunningTime="2026-01-28 20:56:29.848954884 +0000 UTC m=+1017.805141238" Jan 28 20:56:29 crc kubenswrapper[4746]: I0128 20:56:29.875506 4746 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/watcher-operator-controller-manager-564965969-hd4k9" podStartSLOduration=8.850020317 podStartE2EDuration="32.87548658s" podCreationTimestamp="2026-01-28 20:55:57 +0000 UTC" firstStartedPulling="2026-01-28 20:55:59.241209755 +0000 UTC m=+987.197396109" lastFinishedPulling="2026-01-28 20:56:23.266676028 +0000 UTC m=+1011.222862372" observedRunningTime="2026-01-28 20:56:29.872667644 +0000 UTC 
m=+1017.828853998" watchObservedRunningTime="2026-01-28 20:56:29.87548658 +0000 UTC m=+1017.831672934" Jan 28 20:56:29 crc kubenswrapper[4746]: I0128 20:56:29.880150 4746 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/mariadb-operator-controller-manager-6b9fb5fdcb-zcvgn" event={"ID":"5521c5f5-d2f6-461b-a2fc-ee97a5b2df11","Type":"ContainerStarted","Data":"cc61ab4e45a72d5269a683816d8225b79d716830f5d1c03d9c62ccb3f95ea4c0"} Jan 28 20:56:29 crc kubenswrapper[4746]: I0128 20:56:29.880623 4746 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/mariadb-operator-controller-manager-6b9fb5fdcb-zcvgn" Jan 28 20:56:29 crc kubenswrapper[4746]: I0128 20:56:29.908008 4746 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/ironic-operator-controller-manager-598f7747c9-5lc6j" event={"ID":"b44b1510-0a60-4b4e-9541-cc6d18e10a7f","Type":"ContainerStarted","Data":"b0a2fee6ab7b7b3bec4553bca76b6d937e7201f0209ad80ba24c004796e409fb"} Jan 28 20:56:29 crc kubenswrapper[4746]: I0128 20:56:29.908839 4746 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/ironic-operator-controller-manager-598f7747c9-5lc6j" Jan 28 20:56:29 crc kubenswrapper[4746]: I0128 20:56:29.914364 4746 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/openstack-operator-controller-manager-65d466cb7d-vf8n9" event={"ID":"a7c2547a-3282-4748-a823-c3a0cc41ad46","Type":"ContainerStarted","Data":"c0912de9eaa49522b5ac51ee8ce9ca55e0c73952ede17a27778a9722dbd9a3ea"} Jan 28 20:56:29 crc kubenswrapper[4746]: I0128 20:56:29.919834 4746 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/glance-operator-controller-manager-78fdd796fd-bxtxd" event={"ID":"fe660f4f-8806-4674-ab58-ea3303f51683","Type":"ContainerStarted","Data":"52db49b5fc7ff2117da73d60e99bd9a7f2b9ff773678cd57002ab70fc424b072"} Jan 28 20:56:29 crc kubenswrapper[4746]: I0128 20:56:29.920715 4746 
kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/glance-operator-controller-manager-78fdd796fd-bxtxd" Jan 28 20:56:29 crc kubenswrapper[4746]: I0128 20:56:29.921885 4746 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/horizon-operator-controller-manager-77d5c5b54f-ws7k7" event={"ID":"760877c4-6e86-4445-a4cf-002b48e93841","Type":"ContainerStarted","Data":"379a0a1908ca819445c7b32d7c7663f4a2a60685de978888d71a0ca201782f6e"} Jan 28 20:56:29 crc kubenswrapper[4746]: I0128 20:56:29.922282 4746 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/horizon-operator-controller-manager-77d5c5b54f-ws7k7" Jan 28 20:56:29 crc kubenswrapper[4746]: I0128 20:56:29.925322 4746 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/infra-operator-controller-manager-694cf4f878-th2hg" event={"ID":"28de2427-e250-44f5-add2-1b738cf6ce3b","Type":"ContainerStarted","Data":"d8bdfdf1894957fbe1b175a7c2054096787660fb4609624a84fc6d9018fe341d"} Jan 28 20:56:29 crc kubenswrapper[4746]: I0128 20:56:29.938611 4746 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/barbican-operator-controller-manager-7f86f8796f-kll6j" event={"ID":"3c81bd6e-961b-42ae-8840-2607a13046df","Type":"ContainerStarted","Data":"0d55b9578eeb2f77881e5fbb19b19777a0c7902facb8b04cadbd1bcf02402e75"} Jan 28 20:56:29 crc kubenswrapper[4746]: I0128 20:56:29.939320 4746 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/barbican-operator-controller-manager-7f86f8796f-kll6j" Jan 28 20:56:29 crc kubenswrapper[4746]: I0128 20:56:29.941274 4746 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/ovn-operator-controller-manager-6f75f45d54-kpcqr" event={"ID":"e3360f0f-1430-4b7e-9ee0-0a126a9b657d","Type":"ContainerStarted","Data":"e8760807cca57e8133a3a8a4a9391b13292a68cae44ed529cc0adac07565a028"} Jan 28 20:56:29 crc kubenswrapper[4746]: 
I0128 20:56:29.941612 4746 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/ovn-operator-controller-manager-6f75f45d54-kpcqr" Jan 28 20:56:29 crc kubenswrapper[4746]: I0128 20:56:29.987973 4746 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/octavia-operator-controller-manager-5f4cd88d46-hb6t9" podStartSLOduration=14.680882205 podStartE2EDuration="33.987955784s" podCreationTimestamp="2026-01-28 20:55:56 +0000 UTC" firstStartedPulling="2026-01-28 20:55:58.985365673 +0000 UTC m=+986.941552027" lastFinishedPulling="2026-01-28 20:56:18.292439252 +0000 UTC m=+1006.248625606" observedRunningTime="2026-01-28 20:56:29.984718597 +0000 UTC m=+1017.940904951" watchObservedRunningTime="2026-01-28 20:56:29.987955784 +0000 UTC m=+1017.944142138" Jan 28 20:56:30 crc kubenswrapper[4746]: I0128 20:56:30.039238 4746 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/mariadb-operator-controller-manager-6b9fb5fdcb-zcvgn" podStartSLOduration=14.748578362 podStartE2EDuration="34.039212648s" podCreationTimestamp="2026-01-28 20:55:56 +0000 UTC" firstStartedPulling="2026-01-28 20:55:59.000968323 +0000 UTC m=+986.957154687" lastFinishedPulling="2026-01-28 20:56:18.291602619 +0000 UTC m=+1006.247788973" observedRunningTime="2026-01-28 20:56:30.029997629 +0000 UTC m=+1017.986183983" watchObservedRunningTime="2026-01-28 20:56:30.039212648 +0000 UTC m=+1017.995399002" Jan 28 20:56:30 crc kubenswrapper[4746]: I0128 20:56:30.081180 4746 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/glance-operator-controller-manager-78fdd796fd-bxtxd" podStartSLOduration=14.498441323 podStartE2EDuration="34.08116507s" podCreationTimestamp="2026-01-28 20:55:56 +0000 UTC" firstStartedPulling="2026-01-28 20:55:58.708022429 +0000 UTC m=+986.664208783" lastFinishedPulling="2026-01-28 20:56:18.290746176 +0000 UTC m=+1006.246932530" 
observedRunningTime="2026-01-28 20:56:30.079863734 +0000 UTC m=+1018.036050088" watchObservedRunningTime="2026-01-28 20:56:30.08116507 +0000 UTC m=+1018.037351424"
Jan 28 20:56:30 crc kubenswrapper[4746]: I0128 20:56:30.146272 4746 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/horizon-operator-controller-manager-77d5c5b54f-ws7k7" podStartSLOduration=7.451498149 podStartE2EDuration="34.146253505s" podCreationTimestamp="2026-01-28 20:55:56 +0000 UTC" firstStartedPulling="2026-01-28 20:55:58.763915697 +0000 UTC m=+986.720102051" lastFinishedPulling="2026-01-28 20:56:25.458671063 +0000 UTC m=+1013.414857407" observedRunningTime="2026-01-28 20:56:30.140562932 +0000 UTC m=+1018.096749286" watchObservedRunningTime="2026-01-28 20:56:30.146253505 +0000 UTC m=+1018.102439859"
Jan 28 20:56:30 crc kubenswrapper[4746]: I0128 20:56:30.183336 4746 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/ironic-operator-controller-manager-598f7747c9-5lc6j" podStartSLOduration=15.121194546 podStartE2EDuration="34.183314476s" podCreationTimestamp="2026-01-28 20:55:56 +0000 UTC" firstStartedPulling="2026-01-28 20:55:59.228627356 +0000 UTC m=+987.184813710" lastFinishedPulling="2026-01-28 20:56:18.290747286 +0000 UTC m=+1006.246933640" observedRunningTime="2026-01-28 20:56:30.180988143 +0000 UTC m=+1018.137174507" watchObservedRunningTime="2026-01-28 20:56:30.183314476 +0000 UTC m=+1018.139500830"
Jan 28 20:56:30 crc kubenswrapper[4746]: I0128 20:56:30.246495 4746 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/ovn-operator-controller-manager-6f75f45d54-kpcqr" podStartSLOduration=10.184856622 podStartE2EDuration="34.24647115s" podCreationTimestamp="2026-01-28 20:55:56 +0000 UTC" firstStartedPulling="2026-01-28 20:55:59.205107012 +0000 UTC m=+987.161293366" lastFinishedPulling="2026-01-28 20:56:23.26672153 +0000 UTC m=+1011.222907894" observedRunningTime="2026-01-28 20:56:30.220508069 +0000 UTC m=+1018.176694443" watchObservedRunningTime="2026-01-28 20:56:30.24647115 +0000 UTC m=+1018.202657504"
Jan 28 20:56:30 crc kubenswrapper[4746]: I0128 20:56:30.955362 4746 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/designate-operator-controller-manager-b45d7bf98-cm85d" event={"ID":"63794c40-0128-457d-b223-84e87943cca9","Type":"ContainerStarted","Data":"a5be25a83caa2ad76396bf12aaa1193b1e8df68b42ceb20350b1ef28f1d0dadb"}
Jan 28 20:56:30 crc kubenswrapper[4746]: I0128 20:56:30.956222 4746 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/designate-operator-controller-manager-b45d7bf98-cm85d"
Jan 28 20:56:30 crc kubenswrapper[4746]: I0128 20:56:30.958391 4746 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/swift-operator-controller-manager-547cbdb99f-fjs9l" event={"ID":"1f4e2d58-bbd0-45d2-81ba-2b4b47ab5af6","Type":"ContainerStarted","Data":"44a3e33f1ebabf684ca570e29d6bd96db4f4f682e15c7b040d5c8c3262a8dc69"}
Jan 28 20:56:30 crc kubenswrapper[4746]: I0128 20:56:30.958676 4746 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/swift-operator-controller-manager-547cbdb99f-fjs9l"
Jan 28 20:56:30 crc kubenswrapper[4746]: I0128 20:56:30.963141 4746 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/neutron-operator-controller-manager-78d58447c5-pcprz" event={"ID":"b182a0df-d0f9-46d6-9a0c-a3e332c84cff","Type":"ContainerStarted","Data":"1d2c89c3d6b38140e9224ff6fa9ea3ad81fcfbab4ead9c23f2fcb146c39ea235"}
Jan 28 20:56:30 crc kubenswrapper[4746]: I0128 20:56:30.964503 4746 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/neutron-operator-controller-manager-78d58447c5-pcprz"
Jan 28 20:56:30 crc kubenswrapper[4746]: I0128 20:56:30.979576 4746 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/barbican-operator-controller-manager-7f86f8796f-kll6j" podStartSLOduration=15.35078528 podStartE2EDuration="34.979534299s" podCreationTimestamp="2026-01-28 20:55:56 +0000 UTC" firstStartedPulling="2026-01-28 20:55:57.912804562 +0000 UTC m=+985.868990916" lastFinishedPulling="2026-01-28 20:56:17.541553581 +0000 UTC m=+1005.497739935" observedRunningTime="2026-01-28 20:56:30.24388028 +0000 UTC m=+1018.200066634" watchObservedRunningTime="2026-01-28 20:56:30.979534299 +0000 UTC m=+1018.935720653"
Jan 28 20:56:30 crc kubenswrapper[4746]: I0128 20:56:30.984000 4746 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/designate-operator-controller-manager-b45d7bf98-cm85d" podStartSLOduration=4.507711938 podStartE2EDuration="34.983974549s" podCreationTimestamp="2026-01-28 20:55:56 +0000 UTC" firstStartedPulling="2026-01-28 20:55:58.77180375 +0000 UTC m=+986.727990104" lastFinishedPulling="2026-01-28 20:56:29.248066361 +0000 UTC m=+1017.204252715" observedRunningTime="2026-01-28 20:56:30.975570092 +0000 UTC m=+1018.931756446" watchObservedRunningTime="2026-01-28 20:56:30.983974549 +0000 UTC m=+1018.940160903"
Jan 28 20:56:30 crc kubenswrapper[4746]: I0128 20:56:30.987011 4746 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/nova-operator-controller-manager-7bdb645866-pg4s4" event={"ID":"ced3eeee-ed33-4c50-8531-a7e4df1849f6","Type":"ContainerStarted","Data":"310698e4e0049786129258c7376e1a9aa0d6ab0c3482ad8794998a17947d33f6"}
Jan 28 20:56:30 crc kubenswrapper[4746]: I0128 20:56:30.988099 4746 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/nova-operator-controller-manager-7bdb645866-pg4s4"
Jan 28 20:56:30 crc kubenswrapper[4746]: I0128 20:56:30.999553 4746 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/swift-operator-controller-manager-547cbdb99f-fjs9l" podStartSLOduration=5.464195408 podStartE2EDuration="34.999533109s" podCreationTimestamp="2026-01-28 20:55:56 +0000 UTC" firstStartedPulling="2026-01-28 20:55:59.28805487 +0000 UTC m=+987.244241224" lastFinishedPulling="2026-01-28 20:56:28.823392571 +0000 UTC m=+1016.779578925" observedRunningTime="2026-01-28 20:56:30.998635195 +0000 UTC m=+1018.954821549" watchObservedRunningTime="2026-01-28 20:56:30.999533109 +0000 UTC m=+1018.955719453"
Jan 28 20:56:31 crc kubenswrapper[4746]: I0128 20:56:31.004318 4746 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/test-operator-controller-manager-69797bbcbd-m4x6j" event={"ID":"beba987e-69be-47aa-a84c-7ea511c4d151","Type":"ContainerStarted","Data":"5b66ba71d59b390dbb46148f5047c8622555deb1cd6b58fcf0eb579ad71a9102"}
Jan 28 20:56:31 crc kubenswrapper[4746]: I0128 20:56:31.005039 4746 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/test-operator-controller-manager-69797bbcbd-m4x6j"
Jan 28 20:56:31 crc kubenswrapper[4746]: I0128 20:56:31.018966 4746 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/placement-operator-controller-manager-79d5ccc684-6klzp" event={"ID":"370a5739-7af0-4065-986c-af68a265423c","Type":"ContainerStarted","Data":"16f003df1b0ac17e03098b7d8dd56a7890f41d89b4c17c2d326ab00ba5833c37"}
Jan 28 20:56:31 crc kubenswrapper[4746]: I0128 20:56:31.019652 4746 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/placement-operator-controller-manager-79d5ccc684-6klzp"
Jan 28 20:56:31 crc kubenswrapper[4746]: I0128 20:56:31.019937 4746 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/neutron-operator-controller-manager-78d58447c5-pcprz" podStartSLOduration=5.4545890870000004 podStartE2EDuration="35.019925299s" podCreationTimestamp="2026-01-28 20:55:56 +0000 UTC" firstStartedPulling="2026-01-28 20:55:59.258996695 +0000 UTC m=+987.215183059" lastFinishedPulling="2026-01-28 20:56:28.824332917 +0000 UTC m=+1016.780519271" observedRunningTime="2026-01-28 20:56:31.017062402 +0000 UTC m=+1018.973248756" watchObservedRunningTime="2026-01-28 20:56:31.019925299 +0000 UTC m=+1018.976111653"
Jan 28 20:56:31 crc kubenswrapper[4746]: I0128 20:56:31.026710 4746 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/keystone-operator-controller-manager-b8b6d4659-65qb5" event={"ID":"fc220202-4669-4c2e-94b0-583048b56c83","Type":"ContainerStarted","Data":"2a45edc5ccb65f3d2fad6d986c56d33c0a4403e8c9fcb8e2acdc3337c6507be8"}
Jan 28 20:56:31 crc kubenswrapper[4746]: I0128 20:56:31.027443 4746 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/keystone-operator-controller-manager-b8b6d4659-65qb5"
Jan 28 20:56:31 crc kubenswrapper[4746]: I0128 20:56:31.035014 4746 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/cinder-operator-controller-manager-7478f7dbf9-n6qr7" event={"ID":"f86e66ed-9f28-4514-8ff8-97b8353026d1","Type":"ContainerStarted","Data":"f80ba2578396404c2b51b0817d27dbf8164850063ee0ff1dfe9efb7ff0fcaef0"}
Jan 28 20:56:31 crc kubenswrapper[4746]: I0128 20:56:31.035404 4746 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/cinder-operator-controller-manager-7478f7dbf9-n6qr7"
Jan 28 20:56:31 crc kubenswrapper[4746]: I0128 20:56:31.037769 4746 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/openstack-operator-controller-manager-65d466cb7d-vf8n9" event={"ID":"a7c2547a-3282-4748-a823-c3a0cc41ad46","Type":"ContainerStarted","Data":"2cf19894c1d86d7c1581e2634880c81efd58b048e5715bcce25b7fe4006b2655"}
Jan 28 20:56:31 crc kubenswrapper[4746]: I0128 20:56:31.037804 4746 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/openstack-operator-controller-manager-65d466cb7d-vf8n9"
Jan 28 20:56:31 crc kubenswrapper[4746]: I0128 20:56:31.040414 4746 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/test-operator-controller-manager-69797bbcbd-m4x6j" podStartSLOduration=4.457800865 podStartE2EDuration="34.040401782s" podCreationTimestamp="2026-01-28 20:55:57 +0000 UTC" firstStartedPulling="2026-01-28 20:55:59.241675108 +0000 UTC m=+987.197861462" lastFinishedPulling="2026-01-28 20:56:28.824276025 +0000 UTC m=+1016.780462379" observedRunningTime="2026-01-28 20:56:31.036455775 +0000 UTC m=+1018.992642129" watchObservedRunningTime="2026-01-28 20:56:31.040401782 +0000 UTC m=+1018.996588136"
Jan 28 20:56:31 crc kubenswrapper[4746]: I0128 20:56:31.070624 4746 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/nova-operator-controller-manager-7bdb645866-pg4s4" podStartSLOduration=3.80358472 podStartE2EDuration="35.070603896s" podCreationTimestamp="2026-01-28 20:55:56 +0000 UTC" firstStartedPulling="2026-01-28 20:55:59.022224647 +0000 UTC m=+986.978411001" lastFinishedPulling="2026-01-28 20:56:30.289243823 +0000 UTC m=+1018.245430177" observedRunningTime="2026-01-28 20:56:31.06591889 +0000 UTC m=+1019.022105244" watchObservedRunningTime="2026-01-28 20:56:31.070603896 +0000 UTC m=+1019.026790250"
Jan 28 20:56:31 crc kubenswrapper[4746]: I0128 20:56:31.104819 4746 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/cinder-operator-controller-manager-7478f7dbf9-n6qr7" podStartSLOduration=4.28134518 podStartE2EDuration="35.104803359s" podCreationTimestamp="2026-01-28 20:55:56 +0000 UTC" firstStartedPulling="2026-01-28 20:55:58.102948842 +0000 UTC m=+986.059135206" lastFinishedPulling="2026-01-28 20:56:28.926407021 +0000 UTC m=+1016.882593385" observedRunningTime="2026-01-28 20:56:31.100445632 +0000 UTC m=+1019.056631986" watchObservedRunningTime="2026-01-28 20:56:31.104803359 +0000 UTC m=+1019.060989703"
Jan 28 20:56:31 crc kubenswrapper[4746]: I0128 20:56:31.134443 4746 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/keystone-operator-controller-manager-b8b6d4659-65qb5" podStartSLOduration=4.738451826 podStartE2EDuration="35.134420769s" podCreationTimestamp="2026-01-28 20:55:56 +0000 UTC" firstStartedPulling="2026-01-28 20:55:59.019397241 +0000 UTC m=+986.975583595" lastFinishedPulling="2026-01-28 20:56:29.415366184 +0000 UTC m=+1017.371552538" observedRunningTime="2026-01-28 20:56:31.129634459 +0000 UTC m=+1019.085820813" watchObservedRunningTime="2026-01-28 20:56:31.134420769 +0000 UTC m=+1019.090607123"
Jan 28 20:56:31 crc kubenswrapper[4746]: I0128 20:56:31.168174 4746 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/placement-operator-controller-manager-79d5ccc684-6klzp" podStartSLOduration=5.620630958 podStartE2EDuration="35.168158399s" podCreationTimestamp="2026-01-28 20:55:56 +0000 UTC" firstStartedPulling="2026-01-28 20:55:59.289098838 +0000 UTC m=+987.245285192" lastFinishedPulling="2026-01-28 20:56:28.836626259 +0000 UTC m=+1016.792812633" observedRunningTime="2026-01-28 20:56:31.164692295 +0000 UTC m=+1019.120878649" watchObservedRunningTime="2026-01-28 20:56:31.168158399 +0000 UTC m=+1019.124344753"
Jan 28 20:56:31 crc kubenswrapper[4746]: I0128 20:56:31.260674 4746 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/openstack-operator-controller-manager-65d466cb7d-vf8n9" podStartSLOduration=34.260659905 podStartE2EDuration="34.260659905s" podCreationTimestamp="2026-01-28 20:55:57 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-28 20:56:31.257210652 +0000 UTC m=+1019.213397006" watchObservedRunningTime="2026-01-28 20:56:31.260659905 +0000 UTC m=+1019.216846259"
Jan 28 20:56:35 crc kubenswrapper[4746]: I0128 20:56:35.089703 4746 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/openstack-baremetal-operator-controller-manager-6b68b8b85449cmp" event={"ID":"3ff4c44c-0290-4ab0-abb8-316375200dc0","Type":"ContainerStarted","Data":"fa4c73656ec0048573ad6e314fcdb75c579a2cb8a0b071b5e6f23ea3449c3d51"}
Jan 28 20:56:35 crc kubenswrapper[4746]: I0128 20:56:35.090552 4746 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/openstack-baremetal-operator-controller-manager-6b68b8b85449cmp"
Jan 28 20:56:35 crc kubenswrapper[4746]: I0128 20:56:35.093022 4746 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/infra-operator-controller-manager-694cf4f878-th2hg" event={"ID":"28de2427-e250-44f5-add2-1b738cf6ce3b","Type":"ContainerStarted","Data":"c479aa9952a2a67b17ed6b7d01145eda6e7a286108e86a794150ba6f1c87a637"}
Jan 28 20:56:35 crc kubenswrapper[4746]: I0128 20:56:35.093235 4746 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/infra-operator-controller-manager-694cf4f878-th2hg"
Jan 28 20:56:35 crc kubenswrapper[4746]: I0128 20:56:35.132300 4746 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/openstack-baremetal-operator-controller-manager-6b68b8b85449cmp" podStartSLOduration=34.055960058 podStartE2EDuration="39.132275359s" podCreationTimestamp="2026-01-28 20:55:56 +0000 UTC" firstStartedPulling="2026-01-28 20:56:28.841224863 +0000 UTC m=+1016.797411217" lastFinishedPulling="2026-01-28 20:56:33.917540164 +0000 UTC m=+1021.873726518" observedRunningTime="2026-01-28 20:56:35.123116223 +0000 UTC m=+1023.079302617" watchObservedRunningTime="2026-01-28 20:56:35.132275359 +0000 UTC m=+1023.088461753"
Jan 28 20:56:35 crc kubenswrapper[4746]: I0128 20:56:35.149294 4746 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/infra-operator-controller-manager-694cf4f878-th2hg" podStartSLOduration=34.071605281 podStartE2EDuration="39.149265868s" podCreationTimestamp="2026-01-28 20:55:56 +0000 UTC" firstStartedPulling="2026-01-28 20:56:28.837740229 +0000 UTC m=+1016.793926583" lastFinishedPulling="2026-01-28 20:56:33.915400796 +0000 UTC m=+1021.871587170" observedRunningTime="2026-01-28 20:56:35.141987422 +0000 UTC m=+1023.098173826" watchObservedRunningTime="2026-01-28 20:56:35.149265868 +0000 UTC m=+1023.105452232"
Jan 28 20:56:36 crc kubenswrapper[4746]: I0128 20:56:36.919709 4746 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/barbican-operator-controller-manager-7f86f8796f-kll6j"
Jan 28 20:56:36 crc kubenswrapper[4746]: I0128 20:56:36.934779 4746 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/cinder-operator-controller-manager-7478f7dbf9-n6qr7"
Jan 28 20:56:36 crc kubenswrapper[4746]: I0128 20:56:36.984485 4746 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/designate-operator-controller-manager-b45d7bf98-cm85d"
Jan 28 20:56:37 crc kubenswrapper[4746]: I0128 20:56:37.052178 4746 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/glance-operator-controller-manager-78fdd796fd-bxtxd"
Jan 28 20:56:37 crc kubenswrapper[4746]: I0128 20:56:37.088926 4746 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/horizon-operator-controller-manager-77d5c5b54f-ws7k7"
Jan 28 20:56:37 crc kubenswrapper[4746]: I0128 20:56:37.175351 4746 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/heat-operator-controller-manager-594c8c9d5d-p6qjg"
Jan 28 20:56:37 crc kubenswrapper[4746]: I0128 20:56:37.279782 4746 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/nova-operator-controller-manager-7bdb645866-pg4s4"
Jan 28 20:56:37 crc kubenswrapper[4746]: I0128 20:56:37.367298 4746 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/manila-operator-controller-manager-78c6999f6f-m5qbs"
Jan 28 20:56:37 crc kubenswrapper[4746]: I0128 20:56:37.375101 4746 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/keystone-operator-controller-manager-b8b6d4659-65qb5"
Jan 28 20:56:37 crc kubenswrapper[4746]: I0128 20:56:37.423474 4746 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/mariadb-operator-controller-manager-6b9fb5fdcb-zcvgn"
Jan 28 20:56:37 crc kubenswrapper[4746]: I0128 20:56:37.463960 4746 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/neutron-operator-controller-manager-78d58447c5-pcprz"
Jan 28 20:56:37 crc kubenswrapper[4746]: I0128 20:56:37.528592 4746 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/ironic-operator-controller-manager-598f7747c9-5lc6j"
Jan 28 20:56:37 crc kubenswrapper[4746]: I0128 20:56:37.620704 4746 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/ovn-operator-controller-manager-6f75f45d54-kpcqr"
Jan 28 20:56:37 crc kubenswrapper[4746]: I0128 20:56:37.653741 4746 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/octavia-operator-controller-manager-5f4cd88d46-hb6t9"
Jan 28 20:56:37 crc kubenswrapper[4746]: I0128 20:56:37.768762 4746 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/placement-operator-controller-manager-79d5ccc684-6klzp"
Jan 28 20:56:37 crc kubenswrapper[4746]: I0128 20:56:37.779571 4746 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/swift-operator-controller-manager-547cbdb99f-fjs9l"
Jan 28 20:56:37 crc kubenswrapper[4746]: I0128 20:56:37.941195 4746 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/test-operator-controller-manager-69797bbcbd-m4x6j"
Jan 28 20:56:38 crc kubenswrapper[4746]: I0128 20:56:38.132155 4746 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/watcher-operator-controller-manager-564965969-hd4k9"
Jan 28 20:56:40 crc kubenswrapper[4746]: E0128 20:56:40.837876 4746 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"operator\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/openstack-k8s-operators/rabbitmq-cluster-operator@sha256:893e66303c1b0bc1d00a299a3f0380bad55c8dc813c8a1c6a4aab379f5aa12a2\\\"\"" pod="openstack-operators/rabbitmq-cluster-operator-manager-668c99d594-8kpht" podUID="6b7a0005-11ec-4c8a-87e9-872855585d4d"
Jan 28 20:56:41 crc kubenswrapper[4746]: E0128 20:56:41.839904 4746 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ImagePullBackOff: \"Back-off pulling image \\\"38.102.83.50:5001/openstack-k8s-operators/telemetry-operator:a5bcf05e2d71c610156d017fdf197f7c58570d79\\\"\"" pod="openstack-operators/telemetry-operator-controller-manager-9477bbd48-z984g" podUID="e42669f3-6865-4ab6-9a9a-241c7b07509d"
Jan 28 20:56:42 crc kubenswrapper[4746]: I0128 20:56:42.944912 4746 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/infra-operator-controller-manager-694cf4f878-th2hg"
Jan 28 20:56:43 crc kubenswrapper[4746]: I0128 20:56:43.328575 4746 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/openstack-baremetal-operator-controller-manager-6b68b8b85449cmp"
Jan 28 20:56:43 crc kubenswrapper[4746]: I0128 20:56:43.755985 4746 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/openstack-operator-controller-manager-65d466cb7d-vf8n9"
Jan 28 20:56:45 crc kubenswrapper[4746]: I0128 20:56:45.871239 4746 patch_prober.go:28] interesting pod/machine-config-daemon-6wrnw container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body=
Jan 28 20:56:45 crc kubenswrapper[4746]: I0128 20:56:45.871736 4746 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-6wrnw" podUID="6dc8b546-9734-4082-b2b3-2bafe3f1564d" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused"
Jan 28 20:56:55 crc kubenswrapper[4746]: I0128 20:56:55.264910 4746 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/rabbitmq-cluster-operator-manager-668c99d594-8kpht" event={"ID":"6b7a0005-11ec-4c8a-87e9-872855585d4d","Type":"ContainerStarted","Data":"42bbb23ff7222ff44187dc099f62c54b8260e35e9055b5c0225a2bf307d4de35"}
Jan 28 20:56:55 crc kubenswrapper[4746]: I0128 20:56:55.297747 4746 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/rabbitmq-cluster-operator-manager-668c99d594-8kpht" podStartSLOduration=3.304154025 podStartE2EDuration="58.297723398s" podCreationTimestamp="2026-01-28 20:55:57 +0000 UTC" firstStartedPulling="2026-01-28 20:55:59.295055269 +0000 UTC m=+987.251241623" lastFinishedPulling="2026-01-28 20:56:54.288624642 +0000 UTC m=+1042.244810996" observedRunningTime="2026-01-28 20:56:55.287722579 +0000 UTC m=+1043.243908933" watchObservedRunningTime="2026-01-28 20:56:55.297723398 +0000 UTC m=+1043.253909762"
Jan 28 20:56:57 crc kubenswrapper[4746]: I0128 20:56:57.286070 4746 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/telemetry-operator-controller-manager-9477bbd48-z984g" event={"ID":"e42669f3-6865-4ab6-9a9a-241c7b07509d","Type":"ContainerStarted","Data":"1bbacf0512f1b06a4cccc5fb5b2bde1558b00e319c2fd57a4c1b1a0c542ed6c4"}
Jan 28 20:56:57 crc kubenswrapper[4746]: I0128 20:56:57.287806 4746 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/telemetry-operator-controller-manager-9477bbd48-z984g"
Jan 28 20:56:57 crc kubenswrapper[4746]: I0128 20:56:57.313491 4746 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/telemetry-operator-controller-manager-9477bbd48-z984g" podStartSLOduration=2.648985278 podStartE2EDuration="1m0.313466162s" podCreationTimestamp="2026-01-28 20:55:57 +0000 UTC" firstStartedPulling="2026-01-28 20:55:59.245437169 +0000 UTC m=+987.201623543" lastFinishedPulling="2026-01-28 20:56:56.909918073 +0000 UTC m=+1044.866104427" observedRunningTime="2026-01-28 20:56:57.311031168 +0000 UTC m=+1045.267217542" watchObservedRunningTime="2026-01-28 20:56:57.313466162 +0000 UTC m=+1045.269652526"
Jan 28 20:57:08 crc kubenswrapper[4746]: I0128 20:57:08.105677 4746 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/telemetry-operator-controller-manager-9477bbd48-z984g"
Jan 28 20:57:15 crc kubenswrapper[4746]: I0128 20:57:15.872165 4746 patch_prober.go:28] interesting pod/machine-config-daemon-6wrnw container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body=
Jan 28 20:57:15 crc kubenswrapper[4746]: I0128 20:57:15.872931 4746 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-6wrnw" podUID="6dc8b546-9734-4082-b2b3-2bafe3f1564d" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused"
Jan 28 20:57:15 crc kubenswrapper[4746]: I0128 20:57:15.873004 4746 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-6wrnw"
Jan 28 20:57:15 crc kubenswrapper[4746]: I0128 20:57:15.873962 4746 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"635dfdb81316e9a80fdcd2f942f907e439906f4018e69db1be59f1c63b3c993e"} pod="openshift-machine-config-operator/machine-config-daemon-6wrnw" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted"
Jan 28 20:57:15 crc kubenswrapper[4746]: I0128 20:57:15.874071 4746 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-6wrnw" podUID="6dc8b546-9734-4082-b2b3-2bafe3f1564d" containerName="machine-config-daemon" containerID="cri-o://635dfdb81316e9a80fdcd2f942f907e439906f4018e69db1be59f1c63b3c993e" gracePeriod=600
Jan 28 20:57:16 crc kubenswrapper[4746]: I0128 20:57:16.480966 4746 generic.go:334] "Generic (PLEG): container finished" podID="6dc8b546-9734-4082-b2b3-2bafe3f1564d" containerID="635dfdb81316e9a80fdcd2f942f907e439906f4018e69db1be59f1c63b3c993e" exitCode=0
Jan 28 20:57:16 crc kubenswrapper[4746]: I0128 20:57:16.481057 4746 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-6wrnw" event={"ID":"6dc8b546-9734-4082-b2b3-2bafe3f1564d","Type":"ContainerDied","Data":"635dfdb81316e9a80fdcd2f942f907e439906f4018e69db1be59f1c63b3c993e"}
Jan 28 20:57:16 crc kubenswrapper[4746]: I0128 20:57:16.481436 4746 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-6wrnw" event={"ID":"6dc8b546-9734-4082-b2b3-2bafe3f1564d","Type":"ContainerStarted","Data":"6862c0afb8f6ee7e41759258bd8f935df2c29be354b170c8fd2a76edbba23242"}
Jan 28 20:57:16 crc kubenswrapper[4746]: I0128 20:57:16.481464 4746 scope.go:117] "RemoveContainer" containerID="4dbcdfa14610109c45d3514591f8d6ce15356b36ba815407076266ee1f95c6fd"
Jan 28 20:57:25 crc kubenswrapper[4746]: I0128 20:57:25.517586 4746 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-675f4bcbfc-m2457"]
Jan 28 20:57:25 crc kubenswrapper[4746]: I0128 20:57:25.519374 4746 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-675f4bcbfc-m2457"
Jan 28 20:57:25 crc kubenswrapper[4746]: I0128 20:57:25.522113 4746 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dnsmasq-dns-dockercfg-74s65"
Jan 28 20:57:25 crc kubenswrapper[4746]: I0128 20:57:25.522278 4746 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openshift-service-ca.crt"
Jan 28 20:57:25 crc kubenswrapper[4746]: I0128 20:57:25.522397 4746 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"dns"
Jan 28 20:57:25 crc kubenswrapper[4746]: I0128 20:57:25.523039 4746 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"kube-root-ca.crt"
Jan 28 20:57:25 crc kubenswrapper[4746]: I0128 20:57:25.563204 4746 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-675f4bcbfc-m2457"]
Jan 28 20:57:25 crc kubenswrapper[4746]: I0128 20:57:25.610715 4746 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/5f99e78a-c369-416e-935b-4f9c4f3ad490-config\") pod \"dnsmasq-dns-675f4bcbfc-m2457\" (UID: \"5f99e78a-c369-416e-935b-4f9c4f3ad490\") " pod="openstack/dnsmasq-dns-675f4bcbfc-m2457"
Jan 28 20:57:25 crc kubenswrapper[4746]: I0128 20:57:25.610829 4746 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-9pghh\" (UniqueName: \"kubernetes.io/projected/5f99e78a-c369-416e-935b-4f9c4f3ad490-kube-api-access-9pghh\") pod \"dnsmasq-dns-675f4bcbfc-m2457\" (UID: \"5f99e78a-c369-416e-935b-4f9c4f3ad490\") " pod="openstack/dnsmasq-dns-675f4bcbfc-m2457"
Jan 28 20:57:25 crc kubenswrapper[4746]: I0128 20:57:25.699062 4746 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-78dd6ddcc-gkmq8"]
Jan 28 20:57:25 crc kubenswrapper[4746]: I0128 20:57:25.708644 4746 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-78dd6ddcc-gkmq8"
Jan 28 20:57:25 crc kubenswrapper[4746]: I0128 20:57:25.711991 4746 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-78dd6ddcc-gkmq8"]
Jan 28 20:57:25 crc kubenswrapper[4746]: I0128 20:57:25.713756 4746 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/5f99e78a-c369-416e-935b-4f9c4f3ad490-config\") pod \"dnsmasq-dns-675f4bcbfc-m2457\" (UID: \"5f99e78a-c369-416e-935b-4f9c4f3ad490\") " pod="openstack/dnsmasq-dns-675f4bcbfc-m2457"
Jan 28 20:57:25 crc kubenswrapper[4746]: I0128 20:57:25.713860 4746 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-9pghh\" (UniqueName: \"kubernetes.io/projected/5f99e78a-c369-416e-935b-4f9c4f3ad490-kube-api-access-9pghh\") pod \"dnsmasq-dns-675f4bcbfc-m2457\" (UID: \"5f99e78a-c369-416e-935b-4f9c4f3ad490\") " pod="openstack/dnsmasq-dns-675f4bcbfc-m2457"
Jan 28 20:57:25 crc kubenswrapper[4746]: I0128 20:57:25.714436 4746 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"dns-svc"
Jan 28 20:57:25 crc kubenswrapper[4746]: I0128 20:57:25.715300 4746 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/5f99e78a-c369-416e-935b-4f9c4f3ad490-config\") pod \"dnsmasq-dns-675f4bcbfc-m2457\" (UID: \"5f99e78a-c369-416e-935b-4f9c4f3ad490\") " pod="openstack/dnsmasq-dns-675f4bcbfc-m2457"
Jan 28 20:57:25 crc kubenswrapper[4746]: I0128 20:57:25.753174 4746 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-9pghh\" (UniqueName: \"kubernetes.io/projected/5f99e78a-c369-416e-935b-4f9c4f3ad490-kube-api-access-9pghh\") pod \"dnsmasq-dns-675f4bcbfc-m2457\" (UID: \"5f99e78a-c369-416e-935b-4f9c4f3ad490\") " pod="openstack/dnsmasq-dns-675f4bcbfc-m2457"
Jan 28 20:57:25 crc kubenswrapper[4746]: I0128 20:57:25.815507 4746 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/e3a48dbd-649e-4fc6-b2a4-70c587c8237f-dns-svc\") pod \"dnsmasq-dns-78dd6ddcc-gkmq8\" (UID: \"e3a48dbd-649e-4fc6-b2a4-70c587c8237f\") " pod="openstack/dnsmasq-dns-78dd6ddcc-gkmq8"
Jan 28 20:57:25 crc kubenswrapper[4746]: I0128 20:57:25.815811 4746 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/e3a48dbd-649e-4fc6-b2a4-70c587c8237f-config\") pod \"dnsmasq-dns-78dd6ddcc-gkmq8\" (UID: \"e3a48dbd-649e-4fc6-b2a4-70c587c8237f\") " pod="openstack/dnsmasq-dns-78dd6ddcc-gkmq8"
Jan 28 20:57:25 crc kubenswrapper[4746]: I0128 20:57:25.815891 4746 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-cvh5p\" (UniqueName: \"kubernetes.io/projected/e3a48dbd-649e-4fc6-b2a4-70c587c8237f-kube-api-access-cvh5p\") pod \"dnsmasq-dns-78dd6ddcc-gkmq8\" (UID: \"e3a48dbd-649e-4fc6-b2a4-70c587c8237f\") " pod="openstack/dnsmasq-dns-78dd6ddcc-gkmq8"
Jan 28 20:57:25 crc kubenswrapper[4746]: I0128 20:57:25.838016 4746 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-675f4bcbfc-m2457"
Jan 28 20:57:25 crc kubenswrapper[4746]: I0128 20:57:25.917915 4746 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-cvh5p\" (UniqueName: \"kubernetes.io/projected/e3a48dbd-649e-4fc6-b2a4-70c587c8237f-kube-api-access-cvh5p\") pod \"dnsmasq-dns-78dd6ddcc-gkmq8\" (UID: \"e3a48dbd-649e-4fc6-b2a4-70c587c8237f\") " pod="openstack/dnsmasq-dns-78dd6ddcc-gkmq8"
Jan 28 20:57:25 crc kubenswrapper[4746]: I0128 20:57:25.918003 4746 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/e3a48dbd-649e-4fc6-b2a4-70c587c8237f-dns-svc\") pod \"dnsmasq-dns-78dd6ddcc-gkmq8\" (UID: \"e3a48dbd-649e-4fc6-b2a4-70c587c8237f\") " pod="openstack/dnsmasq-dns-78dd6ddcc-gkmq8"
Jan 28 20:57:25 crc kubenswrapper[4746]: I0128 20:57:25.918030 4746 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/e3a48dbd-649e-4fc6-b2a4-70c587c8237f-config\") pod \"dnsmasq-dns-78dd6ddcc-gkmq8\" (UID: \"e3a48dbd-649e-4fc6-b2a4-70c587c8237f\") " pod="openstack/dnsmasq-dns-78dd6ddcc-gkmq8"
Jan 28 20:57:25 crc kubenswrapper[4746]: I0128 20:57:25.920090 4746 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/e3a48dbd-649e-4fc6-b2a4-70c587c8237f-dns-svc\") pod \"dnsmasq-dns-78dd6ddcc-gkmq8\" (UID: \"e3a48dbd-649e-4fc6-b2a4-70c587c8237f\") " pod="openstack/dnsmasq-dns-78dd6ddcc-gkmq8"
Jan 28 20:57:25 crc kubenswrapper[4746]: I0128 20:57:25.920248 4746 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/e3a48dbd-649e-4fc6-b2a4-70c587c8237f-config\") pod \"dnsmasq-dns-78dd6ddcc-gkmq8\" (UID: \"e3a48dbd-649e-4fc6-b2a4-70c587c8237f\") " pod="openstack/dnsmasq-dns-78dd6ddcc-gkmq8"
Jan 28 20:57:25 crc kubenswrapper[4746]: I0128 20:57:25.940224 4746 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-cvh5p\" (UniqueName: \"kubernetes.io/projected/e3a48dbd-649e-4fc6-b2a4-70c587c8237f-kube-api-access-cvh5p\") pod \"dnsmasq-dns-78dd6ddcc-gkmq8\" (UID: \"e3a48dbd-649e-4fc6-b2a4-70c587c8237f\") " pod="openstack/dnsmasq-dns-78dd6ddcc-gkmq8"
Jan 28 20:57:26 crc kubenswrapper[4746]: I0128 20:57:26.032596 4746 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-78dd6ddcc-gkmq8"
Jan 28 20:57:26 crc kubenswrapper[4746]: I0128 20:57:26.336434 4746 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-675f4bcbfc-m2457"]
Jan 28 20:57:26 crc kubenswrapper[4746]: W0128 20:57:26.558962 4746 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pode3a48dbd_649e_4fc6_b2a4_70c587c8237f.slice/crio-8abc0399150cde17ff910dad6117ea320f57a484a974d3dab0333253f20ec545 WatchSource:0}: Error finding container 8abc0399150cde17ff910dad6117ea320f57a484a974d3dab0333253f20ec545: Status 404 returned error can't find the container with id 8abc0399150cde17ff910dad6117ea320f57a484a974d3dab0333253f20ec545
Jan 28 20:57:26 crc kubenswrapper[4746]: I0128 20:57:26.562584 4746 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-78dd6ddcc-gkmq8"]
Jan 28 20:57:26 crc kubenswrapper[4746]: I0128 20:57:26.568292 4746 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-78dd6ddcc-gkmq8" event={"ID":"e3a48dbd-649e-4fc6-b2a4-70c587c8237f","Type":"ContainerStarted","Data":"8abc0399150cde17ff910dad6117ea320f57a484a974d3dab0333253f20ec545"}
Jan 28 20:57:26 crc kubenswrapper[4746]: I0128 20:57:26.570493 4746 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-675f4bcbfc-m2457" event={"ID":"5f99e78a-c369-416e-935b-4f9c4f3ad490","Type":"ContainerStarted","Data":"408ca0f5431ad66517b40a051feaacb63df9daaa0d1e02a2efd3bfd418dbe2a6"}
Jan 28 20:57:28 crc kubenswrapper[4746]: I0128 20:57:28.505305 4746 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-675f4bcbfc-m2457"]
Jan 28 20:57:28 crc kubenswrapper[4746]: I0128 20:57:28.529050 4746 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-666b6646f7-gbg8l"]
Jan 28 20:57:28 crc kubenswrapper[4746]: I0128 20:57:28.530217 4746 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-666b6646f7-gbg8l"
Jan 28 20:57:28 crc kubenswrapper[4746]: I0128 20:57:28.606087 4746 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-666b6646f7-gbg8l"]
Jan 28 20:57:28 crc kubenswrapper[4746]: I0128 20:57:28.607224 4746 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/fcb4de78-f54a-4f35-ba3f-960655540032-dns-svc\") pod \"dnsmasq-dns-666b6646f7-gbg8l\" (UID: \"fcb4de78-f54a-4f35-ba3f-960655540032\") " pod="openstack/dnsmasq-dns-666b6646f7-gbg8l"
Jan 28 20:57:28 crc kubenswrapper[4746]: I0128 20:57:28.607342 4746 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-lz2v8\" (UniqueName: \"kubernetes.io/projected/fcb4de78-f54a-4f35-ba3f-960655540032-kube-api-access-lz2v8\") pod \"dnsmasq-dns-666b6646f7-gbg8l\" (UID: \"fcb4de78-f54a-4f35-ba3f-960655540032\") " pod="openstack/dnsmasq-dns-666b6646f7-gbg8l"
Jan 28 20:57:28 crc kubenswrapper[4746]: I0128 20:57:28.607368 4746 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/fcb4de78-f54a-4f35-ba3f-960655540032-config\") pod \"dnsmasq-dns-666b6646f7-gbg8l\" (UID: \"fcb4de78-f54a-4f35-ba3f-960655540032\") " pod="openstack/dnsmasq-dns-666b6646f7-gbg8l"
Jan 28 20:57:28 crc kubenswrapper[4746]: I0128 20:57:28.708794 4746 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-lz2v8\" (UniqueName: \"kubernetes.io/projected/fcb4de78-f54a-4f35-ba3f-960655540032-kube-api-access-lz2v8\") pod \"dnsmasq-dns-666b6646f7-gbg8l\" (UID: \"fcb4de78-f54a-4f35-ba3f-960655540032\") " pod="openstack/dnsmasq-dns-666b6646f7-gbg8l"
Jan 28 20:57:28 crc kubenswrapper[4746]: I0128 20:57:28.708831 4746 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/fcb4de78-f54a-4f35-ba3f-960655540032-config\") pod \"dnsmasq-dns-666b6646f7-gbg8l\" (UID: \"fcb4de78-f54a-4f35-ba3f-960655540032\") " pod="openstack/dnsmasq-dns-666b6646f7-gbg8l"
Jan 28 20:57:28 crc kubenswrapper[4746]: I0128 20:57:28.708867 4746 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/fcb4de78-f54a-4f35-ba3f-960655540032-dns-svc\") pod \"dnsmasq-dns-666b6646f7-gbg8l\" (UID: \"fcb4de78-f54a-4f35-ba3f-960655540032\") " pod="openstack/dnsmasq-dns-666b6646f7-gbg8l"
Jan 28 20:57:28 crc kubenswrapper[4746]: I0128 20:57:28.709653 4746 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/fcb4de78-f54a-4f35-ba3f-960655540032-dns-svc\") pod \"dnsmasq-dns-666b6646f7-gbg8l\" (UID: \"fcb4de78-f54a-4f35-ba3f-960655540032\") " pod="openstack/dnsmasq-dns-666b6646f7-gbg8l"
Jan 28 20:57:28 crc kubenswrapper[4746]: I0128 20:57:28.710103 4746 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/fcb4de78-f54a-4f35-ba3f-960655540032-config\") pod \"dnsmasq-dns-666b6646f7-gbg8l\" (UID: \"fcb4de78-f54a-4f35-ba3f-960655540032\") " pod="openstack/dnsmasq-dns-666b6646f7-gbg8l"
Jan 28 20:57:28 crc kubenswrapper[4746]: I0128 20:57:28.736517 4746
operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-lz2v8\" (UniqueName: \"kubernetes.io/projected/fcb4de78-f54a-4f35-ba3f-960655540032-kube-api-access-lz2v8\") pod \"dnsmasq-dns-666b6646f7-gbg8l\" (UID: \"fcb4de78-f54a-4f35-ba3f-960655540032\") " pod="openstack/dnsmasq-dns-666b6646f7-gbg8l" Jan 28 20:57:28 crc kubenswrapper[4746]: I0128 20:57:28.864399 4746 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-666b6646f7-gbg8l" Jan 28 20:57:28 crc kubenswrapper[4746]: I0128 20:57:28.987592 4746 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-78dd6ddcc-gkmq8"] Jan 28 20:57:29 crc kubenswrapper[4746]: I0128 20:57:29.030138 4746 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-57d769cc4f-vfmkg"] Jan 28 20:57:29 crc kubenswrapper[4746]: I0128 20:57:29.031315 4746 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-57d769cc4f-vfmkg" Jan 28 20:57:29 crc kubenswrapper[4746]: I0128 20:57:29.039120 4746 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-57d769cc4f-vfmkg"] Jan 28 20:57:29 crc kubenswrapper[4746]: I0128 20:57:29.126911 4746 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-gc6mn\" (UniqueName: \"kubernetes.io/projected/0e48f2c5-a005-440d-b1d4-885bd3dd4a82-kube-api-access-gc6mn\") pod \"dnsmasq-dns-57d769cc4f-vfmkg\" (UID: \"0e48f2c5-a005-440d-b1d4-885bd3dd4a82\") " pod="openstack/dnsmasq-dns-57d769cc4f-vfmkg" Jan 28 20:57:29 crc kubenswrapper[4746]: I0128 20:57:29.126971 4746 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/0e48f2c5-a005-440d-b1d4-885bd3dd4a82-dns-svc\") pod \"dnsmasq-dns-57d769cc4f-vfmkg\" (UID: \"0e48f2c5-a005-440d-b1d4-885bd3dd4a82\") " pod="openstack/dnsmasq-dns-57d769cc4f-vfmkg" 
Jan 28 20:57:29 crc kubenswrapper[4746]: I0128 20:57:29.127048 4746 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/0e48f2c5-a005-440d-b1d4-885bd3dd4a82-config\") pod \"dnsmasq-dns-57d769cc4f-vfmkg\" (UID: \"0e48f2c5-a005-440d-b1d4-885bd3dd4a82\") " pod="openstack/dnsmasq-dns-57d769cc4f-vfmkg" Jan 28 20:57:29 crc kubenswrapper[4746]: I0128 20:57:29.232093 4746 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-gc6mn\" (UniqueName: \"kubernetes.io/projected/0e48f2c5-a005-440d-b1d4-885bd3dd4a82-kube-api-access-gc6mn\") pod \"dnsmasq-dns-57d769cc4f-vfmkg\" (UID: \"0e48f2c5-a005-440d-b1d4-885bd3dd4a82\") " pod="openstack/dnsmasq-dns-57d769cc4f-vfmkg" Jan 28 20:57:29 crc kubenswrapper[4746]: I0128 20:57:29.232136 4746 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/0e48f2c5-a005-440d-b1d4-885bd3dd4a82-dns-svc\") pod \"dnsmasq-dns-57d769cc4f-vfmkg\" (UID: \"0e48f2c5-a005-440d-b1d4-885bd3dd4a82\") " pod="openstack/dnsmasq-dns-57d769cc4f-vfmkg" Jan 28 20:57:29 crc kubenswrapper[4746]: I0128 20:57:29.232200 4746 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/0e48f2c5-a005-440d-b1d4-885bd3dd4a82-config\") pod \"dnsmasq-dns-57d769cc4f-vfmkg\" (UID: \"0e48f2c5-a005-440d-b1d4-885bd3dd4a82\") " pod="openstack/dnsmasq-dns-57d769cc4f-vfmkg" Jan 28 20:57:29 crc kubenswrapper[4746]: I0128 20:57:29.233179 4746 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/0e48f2c5-a005-440d-b1d4-885bd3dd4a82-config\") pod \"dnsmasq-dns-57d769cc4f-vfmkg\" (UID: \"0e48f2c5-a005-440d-b1d4-885bd3dd4a82\") " pod="openstack/dnsmasq-dns-57d769cc4f-vfmkg" Jan 28 20:57:29 crc kubenswrapper[4746]: I0128 20:57:29.233485 4746 
operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/0e48f2c5-a005-440d-b1d4-885bd3dd4a82-dns-svc\") pod \"dnsmasq-dns-57d769cc4f-vfmkg\" (UID: \"0e48f2c5-a005-440d-b1d4-885bd3dd4a82\") " pod="openstack/dnsmasq-dns-57d769cc4f-vfmkg" Jan 28 20:57:29 crc kubenswrapper[4746]: I0128 20:57:29.263234 4746 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-gc6mn\" (UniqueName: \"kubernetes.io/projected/0e48f2c5-a005-440d-b1d4-885bd3dd4a82-kube-api-access-gc6mn\") pod \"dnsmasq-dns-57d769cc4f-vfmkg\" (UID: \"0e48f2c5-a005-440d-b1d4-885bd3dd4a82\") " pod="openstack/dnsmasq-dns-57d769cc4f-vfmkg" Jan 28 20:57:29 crc kubenswrapper[4746]: I0128 20:57:29.371571 4746 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-57d769cc4f-vfmkg" Jan 28 20:57:29 crc kubenswrapper[4746]: I0128 20:57:29.463983 4746 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-666b6646f7-gbg8l"] Jan 28 20:57:29 crc kubenswrapper[4746]: W0128 20:57:29.476803 4746 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podfcb4de78_f54a_4f35_ba3f_960655540032.slice/crio-f854e6067df0fc7b7e3aca35111e6885a57a2463e6f0764f4d221e954127918a WatchSource:0}: Error finding container f854e6067df0fc7b7e3aca35111e6885a57a2463e6f0764f4d221e954127918a: Status 404 returned error can't find the container with id f854e6067df0fc7b7e3aca35111e6885a57a2463e6f0764f4d221e954127918a Jan 28 20:57:29 crc kubenswrapper[4746]: I0128 20:57:29.616662 4746 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-666b6646f7-gbg8l" event={"ID":"fcb4de78-f54a-4f35-ba3f-960655540032","Type":"ContainerStarted","Data":"f854e6067df0fc7b7e3aca35111e6885a57a2463e6f0764f4d221e954127918a"} Jan 28 20:57:29 crc kubenswrapper[4746]: I0128 20:57:29.737856 4746 kubelet.go:2421] "SyncLoop ADD" 
source="api" pods=["openstack/rabbitmq-server-0"] Jan 28 20:57:29 crc kubenswrapper[4746]: I0128 20:57:29.739266 4746 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/rabbitmq-server-0" Jan 28 20:57:29 crc kubenswrapper[4746]: I0128 20:57:29.742320 4746 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"rabbitmq-erlang-cookie" Jan 28 20:57:29 crc kubenswrapper[4746]: I0128 20:57:29.742646 4746 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"rabbitmq-plugins-conf" Jan 28 20:57:29 crc kubenswrapper[4746]: I0128 20:57:29.743119 4746 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-rabbitmq-svc" Jan 28 20:57:29 crc kubenswrapper[4746]: I0128 20:57:29.743134 4746 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"rabbitmq-server-dockercfg-tsllx" Jan 28 20:57:29 crc kubenswrapper[4746]: I0128 20:57:29.745977 4746 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"rabbitmq-config-data" Jan 28 20:57:29 crc kubenswrapper[4746]: I0128 20:57:29.747942 4746 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"rabbitmq-server-conf" Jan 28 20:57:29 crc kubenswrapper[4746]: I0128 20:57:29.751166 4746 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"rabbitmq-default-user" Jan 28 20:57:29 crc kubenswrapper[4746]: I0128 20:57:29.756327 4746 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/rabbitmq-server-0"] Jan 28 20:57:29 crc kubenswrapper[4746]: I0128 20:57:29.844264 4746 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-bkb6z\" (UniqueName: \"kubernetes.io/projected/88718387-09d6-4e3d-a06f-4353ba42ce91-kube-api-access-bkb6z\") pod \"rabbitmq-server-0\" (UID: \"88718387-09d6-4e3d-a06f-4353ba42ce91\") " pod="openstack/rabbitmq-server-0" Jan 28 20:57:29 crc kubenswrapper[4746]: I0128 
20:57:29.844907 4746 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rabbitmq-tls\" (UniqueName: \"kubernetes.io/projected/88718387-09d6-4e3d-a06f-4353ba42ce91-rabbitmq-tls\") pod \"rabbitmq-server-0\" (UID: \"88718387-09d6-4e3d-a06f-4353ba42ce91\") " pod="openstack/rabbitmq-server-0" Jan 28 20:57:29 crc kubenswrapper[4746]: I0128 20:57:29.844958 4746 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/88718387-09d6-4e3d-a06f-4353ba42ce91-rabbitmq-erlang-cookie\") pod \"rabbitmq-server-0\" (UID: \"88718387-09d6-4e3d-a06f-4353ba42ce91\") " pod="openstack/rabbitmq-server-0" Jan 28 20:57:29 crc kubenswrapper[4746]: I0128 20:57:29.844995 4746 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/88718387-09d6-4e3d-a06f-4353ba42ce91-erlang-cookie-secret\") pod \"rabbitmq-server-0\" (UID: \"88718387-09d6-4e3d-a06f-4353ba42ce91\") " pod="openstack/rabbitmq-server-0" Jan 28 20:57:29 crc kubenswrapper[4746]: I0128 20:57:29.845023 4746 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/88718387-09d6-4e3d-a06f-4353ba42ce91-pod-info\") pod \"rabbitmq-server-0\" (UID: \"88718387-09d6-4e3d-a06f-4353ba42ce91\") " pod="openstack/rabbitmq-server-0" Jan 28 20:57:29 crc kubenswrapper[4746]: I0128 20:57:29.845068 4746 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/88718387-09d6-4e3d-a06f-4353ba42ce91-config-data\") pod \"rabbitmq-server-0\" (UID: \"88718387-09d6-4e3d-a06f-4353ba42ce91\") " pod="openstack/rabbitmq-server-0" Jan 28 20:57:29 crc kubenswrapper[4746]: I0128 20:57:29.845120 4746 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/88718387-09d6-4e3d-a06f-4353ba42ce91-rabbitmq-plugins\") pod \"rabbitmq-server-0\" (UID: \"88718387-09d6-4e3d-a06f-4353ba42ce91\") " pod="openstack/rabbitmq-server-0" Jan 28 20:57:29 crc kubenswrapper[4746]: I0128 20:57:29.845152 4746 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/88718387-09d6-4e3d-a06f-4353ba42ce91-rabbitmq-confd\") pod \"rabbitmq-server-0\" (UID: \"88718387-09d6-4e3d-a06f-4353ba42ce91\") " pod="openstack/rabbitmq-server-0" Jan 28 20:57:29 crc kubenswrapper[4746]: I0128 20:57:29.845177 4746 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"pvc-970e444a-5d69-45cf-8cae-8be3d06f1dae\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-970e444a-5d69-45cf-8cae-8be3d06f1dae\") pod \"rabbitmq-server-0\" (UID: \"88718387-09d6-4e3d-a06f-4353ba42ce91\") " pod="openstack/rabbitmq-server-0" Jan 28 20:57:29 crc kubenswrapper[4746]: I0128 20:57:29.845205 4746 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"server-conf\" (UniqueName: \"kubernetes.io/configmap/88718387-09d6-4e3d-a06f-4353ba42ce91-server-conf\") pod \"rabbitmq-server-0\" (UID: \"88718387-09d6-4e3d-a06f-4353ba42ce91\") " pod="openstack/rabbitmq-server-0" Jan 28 20:57:29 crc kubenswrapper[4746]: I0128 20:57:29.845237 4746 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/88718387-09d6-4e3d-a06f-4353ba42ce91-plugins-conf\") pod \"rabbitmq-server-0\" (UID: \"88718387-09d6-4e3d-a06f-4353ba42ce91\") " pod="openstack/rabbitmq-server-0" Jan 28 20:57:29 crc kubenswrapper[4746]: I0128 20:57:29.946993 4746 reconciler_common.go:218] "operationExecutor.MountVolume 
started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/88718387-09d6-4e3d-a06f-4353ba42ce91-config-data\") pod \"rabbitmq-server-0\" (UID: \"88718387-09d6-4e3d-a06f-4353ba42ce91\") " pod="openstack/rabbitmq-server-0" Jan 28 20:57:29 crc kubenswrapper[4746]: I0128 20:57:29.947044 4746 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/88718387-09d6-4e3d-a06f-4353ba42ce91-rabbitmq-plugins\") pod \"rabbitmq-server-0\" (UID: \"88718387-09d6-4e3d-a06f-4353ba42ce91\") " pod="openstack/rabbitmq-server-0" Jan 28 20:57:29 crc kubenswrapper[4746]: I0128 20:57:29.947064 4746 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/88718387-09d6-4e3d-a06f-4353ba42ce91-rabbitmq-confd\") pod \"rabbitmq-server-0\" (UID: \"88718387-09d6-4e3d-a06f-4353ba42ce91\") " pod="openstack/rabbitmq-server-0" Jan 28 20:57:29 crc kubenswrapper[4746]: I0128 20:57:29.947098 4746 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-970e444a-5d69-45cf-8cae-8be3d06f1dae\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-970e444a-5d69-45cf-8cae-8be3d06f1dae\") pod \"rabbitmq-server-0\" (UID: \"88718387-09d6-4e3d-a06f-4353ba42ce91\") " pod="openstack/rabbitmq-server-0" Jan 28 20:57:29 crc kubenswrapper[4746]: I0128 20:57:29.947119 4746 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"server-conf\" (UniqueName: \"kubernetes.io/configmap/88718387-09d6-4e3d-a06f-4353ba42ce91-server-conf\") pod \"rabbitmq-server-0\" (UID: \"88718387-09d6-4e3d-a06f-4353ba42ce91\") " pod="openstack/rabbitmq-server-0" Jan 28 20:57:29 crc kubenswrapper[4746]: I0128 20:57:29.947146 4746 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/88718387-09d6-4e3d-a06f-4353ba42ce91-plugins-conf\") 
pod \"rabbitmq-server-0\" (UID: \"88718387-09d6-4e3d-a06f-4353ba42ce91\") " pod="openstack/rabbitmq-server-0" Jan 28 20:57:29 crc kubenswrapper[4746]: I0128 20:57:29.947203 4746 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-bkb6z\" (UniqueName: \"kubernetes.io/projected/88718387-09d6-4e3d-a06f-4353ba42ce91-kube-api-access-bkb6z\") pod \"rabbitmq-server-0\" (UID: \"88718387-09d6-4e3d-a06f-4353ba42ce91\") " pod="openstack/rabbitmq-server-0" Jan 28 20:57:29 crc kubenswrapper[4746]: I0128 20:57:29.947220 4746 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rabbitmq-tls\" (UniqueName: \"kubernetes.io/projected/88718387-09d6-4e3d-a06f-4353ba42ce91-rabbitmq-tls\") pod \"rabbitmq-server-0\" (UID: \"88718387-09d6-4e3d-a06f-4353ba42ce91\") " pod="openstack/rabbitmq-server-0" Jan 28 20:57:29 crc kubenswrapper[4746]: I0128 20:57:29.947260 4746 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/88718387-09d6-4e3d-a06f-4353ba42ce91-rabbitmq-erlang-cookie\") pod \"rabbitmq-server-0\" (UID: \"88718387-09d6-4e3d-a06f-4353ba42ce91\") " pod="openstack/rabbitmq-server-0" Jan 28 20:57:29 crc kubenswrapper[4746]: I0128 20:57:29.947281 4746 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/88718387-09d6-4e3d-a06f-4353ba42ce91-erlang-cookie-secret\") pod \"rabbitmq-server-0\" (UID: \"88718387-09d6-4e3d-a06f-4353ba42ce91\") " pod="openstack/rabbitmq-server-0" Jan 28 20:57:29 crc kubenswrapper[4746]: I0128 20:57:29.947301 4746 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/88718387-09d6-4e3d-a06f-4353ba42ce91-pod-info\") pod \"rabbitmq-server-0\" (UID: \"88718387-09d6-4e3d-a06f-4353ba42ce91\") " pod="openstack/rabbitmq-server-0" Jan 28 20:57:29 crc 
kubenswrapper[4746]: I0128 20:57:29.950946 4746 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/88718387-09d6-4e3d-a06f-4353ba42ce91-rabbitmq-plugins\") pod \"rabbitmq-server-0\" (UID: \"88718387-09d6-4e3d-a06f-4353ba42ce91\") " pod="openstack/rabbitmq-server-0" Jan 28 20:57:29 crc kubenswrapper[4746]: I0128 20:57:29.950991 4746 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/88718387-09d6-4e3d-a06f-4353ba42ce91-plugins-conf\") pod \"rabbitmq-server-0\" (UID: \"88718387-09d6-4e3d-a06f-4353ba42ce91\") " pod="openstack/rabbitmq-server-0" Jan 28 20:57:29 crc kubenswrapper[4746]: I0128 20:57:29.951272 4746 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/88718387-09d6-4e3d-a06f-4353ba42ce91-rabbitmq-erlang-cookie\") pod \"rabbitmq-server-0\" (UID: \"88718387-09d6-4e3d-a06f-4353ba42ce91\") " pod="openstack/rabbitmq-server-0" Jan 28 20:57:29 crc kubenswrapper[4746]: I0128 20:57:29.952295 4746 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/88718387-09d6-4e3d-a06f-4353ba42ce91-config-data\") pod \"rabbitmq-server-0\" (UID: \"88718387-09d6-4e3d-a06f-4353ba42ce91\") " pod="openstack/rabbitmq-server-0" Jan 28 20:57:29 crc kubenswrapper[4746]: I0128 20:57:29.953037 4746 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"server-conf\" (UniqueName: \"kubernetes.io/configmap/88718387-09d6-4e3d-a06f-4353ba42ce91-server-conf\") pod \"rabbitmq-server-0\" (UID: \"88718387-09d6-4e3d-a06f-4353ba42ce91\") " pod="openstack/rabbitmq-server-0" Jan 28 20:57:29 crc kubenswrapper[4746]: I0128 20:57:29.958606 4746 csi_attacher.go:380] kubernetes.io/csi: attacher.MountDevice STAGE_UNSTAGE_VOLUME capability not set. Skipping MountDevice... 
Jan 28 20:57:29 crc kubenswrapper[4746]: I0128 20:57:29.958653 4746 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"pvc-970e444a-5d69-45cf-8cae-8be3d06f1dae\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-970e444a-5d69-45cf-8cae-8be3d06f1dae\") pod \"rabbitmq-server-0\" (UID: \"88718387-09d6-4e3d-a06f-4353ba42ce91\") device mount path \"/var/lib/kubelet/plugins/kubernetes.io/csi/kubevirt.io.hostpath-provisioner/6f3ec12aa3a4e2e24baea6243845f770d59c6a449418d7f6cc0746fe9e1bbf7f/globalmount\"" pod="openstack/rabbitmq-server-0" Jan 28 20:57:29 crc kubenswrapper[4746]: I0128 20:57:29.961502 4746 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rabbitmq-tls\" (UniqueName: \"kubernetes.io/projected/88718387-09d6-4e3d-a06f-4353ba42ce91-rabbitmq-tls\") pod \"rabbitmq-server-0\" (UID: \"88718387-09d6-4e3d-a06f-4353ba42ce91\") " pod="openstack/rabbitmq-server-0" Jan 28 20:57:29 crc kubenswrapper[4746]: I0128 20:57:29.966469 4746 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/88718387-09d6-4e3d-a06f-4353ba42ce91-rabbitmq-confd\") pod \"rabbitmq-server-0\" (UID: \"88718387-09d6-4e3d-a06f-4353ba42ce91\") " pod="openstack/rabbitmq-server-0" Jan 28 20:57:29 crc kubenswrapper[4746]: I0128 20:57:29.982411 4746 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/88718387-09d6-4e3d-a06f-4353ba42ce91-erlang-cookie-secret\") pod \"rabbitmq-server-0\" (UID: \"88718387-09d6-4e3d-a06f-4353ba42ce91\") " pod="openstack/rabbitmq-server-0" Jan 28 20:57:29 crc kubenswrapper[4746]: I0128 20:57:29.987616 4746 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/88718387-09d6-4e3d-a06f-4353ba42ce91-pod-info\") pod \"rabbitmq-server-0\" (UID: \"88718387-09d6-4e3d-a06f-4353ba42ce91\") " 
pod="openstack/rabbitmq-server-0" Jan 28 20:57:29 crc kubenswrapper[4746]: I0128 20:57:29.998724 4746 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-57d769cc4f-vfmkg"] Jan 28 20:57:30 crc kubenswrapper[4746]: I0128 20:57:30.002777 4746 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-bkb6z\" (UniqueName: \"kubernetes.io/projected/88718387-09d6-4e3d-a06f-4353ba42ce91-kube-api-access-bkb6z\") pod \"rabbitmq-server-0\" (UID: \"88718387-09d6-4e3d-a06f-4353ba42ce91\") " pod="openstack/rabbitmq-server-0" Jan 28 20:57:30 crc kubenswrapper[4746]: I0128 20:57:30.036028 4746 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"pvc-970e444a-5d69-45cf-8cae-8be3d06f1dae\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-970e444a-5d69-45cf-8cae-8be3d06f1dae\") pod \"rabbitmq-server-0\" (UID: \"88718387-09d6-4e3d-a06f-4353ba42ce91\") " pod="openstack/rabbitmq-server-0" Jan 28 20:57:30 crc kubenswrapper[4746]: I0128 20:57:30.083377 4746 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/rabbitmq-server-0" Jan 28 20:57:30 crc kubenswrapper[4746]: I0128 20:57:30.122513 4746 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/rabbitmq-cell1-server-0"] Jan 28 20:57:30 crc kubenswrapper[4746]: I0128 20:57:30.124335 4746 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/rabbitmq-cell1-server-0" Jan 28 20:57:30 crc kubenswrapper[4746]: I0128 20:57:30.131381 4746 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-rabbitmq-cell1-svc" Jan 28 20:57:30 crc kubenswrapper[4746]: I0128 20:57:30.131779 4746 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"rabbitmq-cell1-default-user" Jan 28 20:57:30 crc kubenswrapper[4746]: I0128 20:57:30.132031 4746 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"rabbitmq-cell1-server-conf" Jan 28 20:57:30 crc kubenswrapper[4746]: I0128 20:57:30.132245 4746 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"rabbitmq-cell1-config-data" Jan 28 20:57:30 crc kubenswrapper[4746]: I0128 20:57:30.132342 4746 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"rabbitmq-cell1-plugins-conf" Jan 28 20:57:30 crc kubenswrapper[4746]: I0128 20:57:30.132480 4746 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"rabbitmq-cell1-server-dockercfg-ktgk4" Jan 28 20:57:30 crc kubenswrapper[4746]: I0128 20:57:30.132582 4746 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"rabbitmq-cell1-erlang-cookie" Jan 28 20:57:30 crc kubenswrapper[4746]: I0128 20:57:30.145348 4746 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/rabbitmq-cell1-server-0"] Jan 28 20:57:30 crc kubenswrapper[4746]: I0128 20:57:30.255867 4746 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/701360b2-121a-4cb4-9a4f-9ce63391e740-pod-info\") pod \"rabbitmq-cell1-server-0\" (UID: \"701360b2-121a-4cb4-9a4f-9ce63391e740\") " pod="openstack/rabbitmq-cell1-server-0" Jan 28 20:57:30 crc kubenswrapper[4746]: I0128 20:57:30.255939 4746 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume 
\"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/701360b2-121a-4cb4-9a4f-9ce63391e740-rabbitmq-confd\") pod \"rabbitmq-cell1-server-0\" (UID: \"701360b2-121a-4cb4-9a4f-9ce63391e740\") " pod="openstack/rabbitmq-cell1-server-0" Jan 28 20:57:30 crc kubenswrapper[4746]: I0128 20:57:30.255965 4746 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"pvc-4ec30e86-c9d3-4c1e-a1f0-0bea12b5bd54\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-4ec30e86-c9d3-4c1e-a1f0-0bea12b5bd54\") pod \"rabbitmq-cell1-server-0\" (UID: \"701360b2-121a-4cb4-9a4f-9ce63391e740\") " pod="openstack/rabbitmq-cell1-server-0" Jan 28 20:57:30 crc kubenswrapper[4746]: I0128 20:57:30.255983 4746 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"server-conf\" (UniqueName: \"kubernetes.io/configmap/701360b2-121a-4cb4-9a4f-9ce63391e740-server-conf\") pod \"rabbitmq-cell1-server-0\" (UID: \"701360b2-121a-4cb4-9a4f-9ce63391e740\") " pod="openstack/rabbitmq-cell1-server-0" Jan 28 20:57:30 crc kubenswrapper[4746]: I0128 20:57:30.256002 4746 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/701360b2-121a-4cb4-9a4f-9ce63391e740-erlang-cookie-secret\") pod \"rabbitmq-cell1-server-0\" (UID: \"701360b2-121a-4cb4-9a4f-9ce63391e740\") " pod="openstack/rabbitmq-cell1-server-0" Jan 28 20:57:30 crc kubenswrapper[4746]: I0128 20:57:30.256019 4746 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/701360b2-121a-4cb4-9a4f-9ce63391e740-config-data\") pod \"rabbitmq-cell1-server-0\" (UID: \"701360b2-121a-4cb4-9a4f-9ce63391e740\") " pod="openstack/rabbitmq-cell1-server-0" Jan 28 20:57:30 crc kubenswrapper[4746]: I0128 20:57:30.256064 4746 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/701360b2-121a-4cb4-9a4f-9ce63391e740-rabbitmq-plugins\") pod \"rabbitmq-cell1-server-0\" (UID: \"701360b2-121a-4cb4-9a4f-9ce63391e740\") " pod="openstack/rabbitmq-cell1-server-0" Jan 28 20:57:30 crc kubenswrapper[4746]: I0128 20:57:30.256102 4746 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rabbitmq-tls\" (UniqueName: \"kubernetes.io/projected/701360b2-121a-4cb4-9a4f-9ce63391e740-rabbitmq-tls\") pod \"rabbitmq-cell1-server-0\" (UID: \"701360b2-121a-4cb4-9a4f-9ce63391e740\") " pod="openstack/rabbitmq-cell1-server-0" Jan 28 20:57:30 crc kubenswrapper[4746]: I0128 20:57:30.256118 4746 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/701360b2-121a-4cb4-9a4f-9ce63391e740-plugins-conf\") pod \"rabbitmq-cell1-server-0\" (UID: \"701360b2-121a-4cb4-9a4f-9ce63391e740\") " pod="openstack/rabbitmq-cell1-server-0" Jan 28 20:57:30 crc kubenswrapper[4746]: I0128 20:57:30.256133 4746 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-wr2kk\" (UniqueName: \"kubernetes.io/projected/701360b2-121a-4cb4-9a4f-9ce63391e740-kube-api-access-wr2kk\") pod \"rabbitmq-cell1-server-0\" (UID: \"701360b2-121a-4cb4-9a4f-9ce63391e740\") " pod="openstack/rabbitmq-cell1-server-0" Jan 28 20:57:30 crc kubenswrapper[4746]: I0128 20:57:30.256149 4746 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/701360b2-121a-4cb4-9a4f-9ce63391e740-rabbitmq-erlang-cookie\") pod \"rabbitmq-cell1-server-0\" (UID: \"701360b2-121a-4cb4-9a4f-9ce63391e740\") " pod="openstack/rabbitmq-cell1-server-0" Jan 28 20:57:30 crc kubenswrapper[4746]: I0128 20:57:30.357181 4746 
reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/701360b2-121a-4cb4-9a4f-9ce63391e740-pod-info\") pod \"rabbitmq-cell1-server-0\" (UID: \"701360b2-121a-4cb4-9a4f-9ce63391e740\") " pod="openstack/rabbitmq-cell1-server-0"
Jan 28 20:57:30 crc kubenswrapper[4746]: I0128 20:57:30.358122 4746 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/701360b2-121a-4cb4-9a4f-9ce63391e740-rabbitmq-confd\") pod \"rabbitmq-cell1-server-0\" (UID: \"701360b2-121a-4cb4-9a4f-9ce63391e740\") " pod="openstack/rabbitmq-cell1-server-0"
Jan 28 20:57:30 crc kubenswrapper[4746]: I0128 20:57:30.358158 4746 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-4ec30e86-c9d3-4c1e-a1f0-0bea12b5bd54\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-4ec30e86-c9d3-4c1e-a1f0-0bea12b5bd54\") pod \"rabbitmq-cell1-server-0\" (UID: \"701360b2-121a-4cb4-9a4f-9ce63391e740\") " pod="openstack/rabbitmq-cell1-server-0"
Jan 28 20:57:30 crc kubenswrapper[4746]: I0128 20:57:30.358184 4746 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"server-conf\" (UniqueName: \"kubernetes.io/configmap/701360b2-121a-4cb4-9a4f-9ce63391e740-server-conf\") pod \"rabbitmq-cell1-server-0\" (UID: \"701360b2-121a-4cb4-9a4f-9ce63391e740\") " pod="openstack/rabbitmq-cell1-server-0"
Jan 28 20:57:30 crc kubenswrapper[4746]: I0128 20:57:30.358232 4746 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/701360b2-121a-4cb4-9a4f-9ce63391e740-erlang-cookie-secret\") pod \"rabbitmq-cell1-server-0\" (UID: \"701360b2-121a-4cb4-9a4f-9ce63391e740\") " pod="openstack/rabbitmq-cell1-server-0"
Jan 28 20:57:30 crc kubenswrapper[4746]: I0128 20:57:30.358259 4746 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/701360b2-121a-4cb4-9a4f-9ce63391e740-config-data\") pod \"rabbitmq-cell1-server-0\" (UID: \"701360b2-121a-4cb4-9a4f-9ce63391e740\") " pod="openstack/rabbitmq-cell1-server-0"
Jan 28 20:57:30 crc kubenswrapper[4746]: I0128 20:57:30.358326 4746 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/701360b2-121a-4cb4-9a4f-9ce63391e740-rabbitmq-plugins\") pod \"rabbitmq-cell1-server-0\" (UID: \"701360b2-121a-4cb4-9a4f-9ce63391e740\") " pod="openstack/rabbitmq-cell1-server-0"
Jan 28 20:57:30 crc kubenswrapper[4746]: I0128 20:57:30.358363 4746 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rabbitmq-tls\" (UniqueName: \"kubernetes.io/projected/701360b2-121a-4cb4-9a4f-9ce63391e740-rabbitmq-tls\") pod \"rabbitmq-cell1-server-0\" (UID: \"701360b2-121a-4cb4-9a4f-9ce63391e740\") " pod="openstack/rabbitmq-cell1-server-0"
Jan 28 20:57:30 crc kubenswrapper[4746]: I0128 20:57:30.358383 4746 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/701360b2-121a-4cb4-9a4f-9ce63391e740-plugins-conf\") pod \"rabbitmq-cell1-server-0\" (UID: \"701360b2-121a-4cb4-9a4f-9ce63391e740\") " pod="openstack/rabbitmq-cell1-server-0"
Jan 28 20:57:30 crc kubenswrapper[4746]: I0128 20:57:30.358398 4746 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-wr2kk\" (UniqueName: \"kubernetes.io/projected/701360b2-121a-4cb4-9a4f-9ce63391e740-kube-api-access-wr2kk\") pod \"rabbitmq-cell1-server-0\" (UID: \"701360b2-121a-4cb4-9a4f-9ce63391e740\") " pod="openstack/rabbitmq-cell1-server-0"
Jan 28 20:57:30 crc kubenswrapper[4746]: I0128 20:57:30.358420 4746 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/701360b2-121a-4cb4-9a4f-9ce63391e740-rabbitmq-erlang-cookie\") pod \"rabbitmq-cell1-server-0\" (UID: \"701360b2-121a-4cb4-9a4f-9ce63391e740\") " pod="openstack/rabbitmq-cell1-server-0"
Jan 28 20:57:30 crc kubenswrapper[4746]: I0128 20:57:30.360783 4746 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"server-conf\" (UniqueName: \"kubernetes.io/configmap/701360b2-121a-4cb4-9a4f-9ce63391e740-server-conf\") pod \"rabbitmq-cell1-server-0\" (UID: \"701360b2-121a-4cb4-9a4f-9ce63391e740\") " pod="openstack/rabbitmq-cell1-server-0"
Jan 28 20:57:30 crc kubenswrapper[4746]: I0128 20:57:30.361852 4746 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/701360b2-121a-4cb4-9a4f-9ce63391e740-plugins-conf\") pod \"rabbitmq-cell1-server-0\" (UID: \"701360b2-121a-4cb4-9a4f-9ce63391e740\") " pod="openstack/rabbitmq-cell1-server-0"
Jan 28 20:57:30 crc kubenswrapper[4746]: I0128 20:57:30.362643 4746 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/701360b2-121a-4cb4-9a4f-9ce63391e740-config-data\") pod \"rabbitmq-cell1-server-0\" (UID: \"701360b2-121a-4cb4-9a4f-9ce63391e740\") " pod="openstack/rabbitmq-cell1-server-0"
Jan 28 20:57:30 crc kubenswrapper[4746]: I0128 20:57:30.363069 4746 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/701360b2-121a-4cb4-9a4f-9ce63391e740-rabbitmq-plugins\") pod \"rabbitmq-cell1-server-0\" (UID: \"701360b2-121a-4cb4-9a4f-9ce63391e740\") " pod="openstack/rabbitmq-cell1-server-0"
Jan 28 20:57:30 crc kubenswrapper[4746]: I0128 20:57:30.363285 4746 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/701360b2-121a-4cb4-9a4f-9ce63391e740-rabbitmq-erlang-cookie\") pod \"rabbitmq-cell1-server-0\" (UID: \"701360b2-121a-4cb4-9a4f-9ce63391e740\") " pod="openstack/rabbitmq-cell1-server-0"
Jan 28 20:57:30 crc kubenswrapper[4746]: I0128 20:57:30.370612 4746 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/701360b2-121a-4cb4-9a4f-9ce63391e740-rabbitmq-confd\") pod \"rabbitmq-cell1-server-0\" (UID: \"701360b2-121a-4cb4-9a4f-9ce63391e740\") " pod="openstack/rabbitmq-cell1-server-0"
Jan 28 20:57:30 crc kubenswrapper[4746]: I0128 20:57:30.372868 4746 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/701360b2-121a-4cb4-9a4f-9ce63391e740-pod-info\") pod \"rabbitmq-cell1-server-0\" (UID: \"701360b2-121a-4cb4-9a4f-9ce63391e740\") " pod="openstack/rabbitmq-cell1-server-0"
Jan 28 20:57:30 crc kubenswrapper[4746]: I0128 20:57:30.373790 4746 csi_attacher.go:380] kubernetes.io/csi: attacher.MountDevice STAGE_UNSTAGE_VOLUME capability not set. Skipping MountDevice...
Jan 28 20:57:30 crc kubenswrapper[4746]: I0128 20:57:30.373820 4746 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"pvc-4ec30e86-c9d3-4c1e-a1f0-0bea12b5bd54\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-4ec30e86-c9d3-4c1e-a1f0-0bea12b5bd54\") pod \"rabbitmq-cell1-server-0\" (UID: \"701360b2-121a-4cb4-9a4f-9ce63391e740\") device mount path \"/var/lib/kubelet/plugins/kubernetes.io/csi/kubevirt.io.hostpath-provisioner/cc06eb5d713361bb8bc23b99b651543751fb837cba7bff1aeb5a7aa259796cdb/globalmount\"" pod="openstack/rabbitmq-cell1-server-0"
Jan 28 20:57:30 crc kubenswrapper[4746]: I0128 20:57:30.375998 4746 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rabbitmq-tls\" (UniqueName: \"kubernetes.io/projected/701360b2-121a-4cb4-9a4f-9ce63391e740-rabbitmq-tls\") pod \"rabbitmq-cell1-server-0\" (UID: \"701360b2-121a-4cb4-9a4f-9ce63391e740\") " pod="openstack/rabbitmq-cell1-server-0"
Jan 28 20:57:30 crc kubenswrapper[4746]: I0128 20:57:30.381553 4746 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/701360b2-121a-4cb4-9a4f-9ce63391e740-erlang-cookie-secret\") pod \"rabbitmq-cell1-server-0\" (UID: \"701360b2-121a-4cb4-9a4f-9ce63391e740\") " pod="openstack/rabbitmq-cell1-server-0"
Jan 28 20:57:30 crc kubenswrapper[4746]: I0128 20:57:30.390460 4746 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-wr2kk\" (UniqueName: \"kubernetes.io/projected/701360b2-121a-4cb4-9a4f-9ce63391e740-kube-api-access-wr2kk\") pod \"rabbitmq-cell1-server-0\" (UID: \"701360b2-121a-4cb4-9a4f-9ce63391e740\") " pod="openstack/rabbitmq-cell1-server-0"
Jan 28 20:57:30 crc kubenswrapper[4746]: I0128 20:57:30.438232 4746 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"pvc-4ec30e86-c9d3-4c1e-a1f0-0bea12b5bd54\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-4ec30e86-c9d3-4c1e-a1f0-0bea12b5bd54\") pod \"rabbitmq-cell1-server-0\" (UID: \"701360b2-121a-4cb4-9a4f-9ce63391e740\") " pod="openstack/rabbitmq-cell1-server-0"
Jan 28 20:57:30 crc kubenswrapper[4746]: I0128 20:57:30.466707 4746 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/rabbitmq-cell1-server-0"
Jan 28 20:57:30 crc kubenswrapper[4746]: I0128 20:57:30.628268 4746 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-57d769cc4f-vfmkg" event={"ID":"0e48f2c5-a005-440d-b1d4-885bd3dd4a82","Type":"ContainerStarted","Data":"329fb4ca85ba72bf66c84f9a20f3bfedf90782590cd83726463253a42452a8d8"}
Jan 28 20:57:30 crc kubenswrapper[4746]: I0128 20:57:30.764928 4746 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/rabbitmq-server-0"]
Jan 28 20:57:30 crc kubenswrapper[4746]: I0128 20:57:30.940462 4746 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/rabbitmq-cell1-server-0"]
Jan 28 20:57:31 crc kubenswrapper[4746]: I0128 20:57:31.358655 4746 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/openstack-galera-0"]
Jan 28 20:57:31 crc kubenswrapper[4746]: I0128 20:57:31.360327 4746 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/openstack-galera-0"
Jan 28 20:57:31 crc kubenswrapper[4746]: I0128 20:57:31.364822 4746 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-galera-openstack-svc"
Jan 28 20:57:31 crc kubenswrapper[4746]: I0128 20:57:31.365288 4746 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-scripts"
Jan 28 20:57:31 crc kubenswrapper[4746]: I0128 20:57:31.365662 4746 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-config-data"
Jan 28 20:57:31 crc kubenswrapper[4746]: I0128 20:57:31.366710 4746 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"galera-openstack-dockercfg-ln9jf"
Jan 28 20:57:31 crc kubenswrapper[4746]: I0128 20:57:31.386420 4746 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"combined-ca-bundle"
Jan 28 20:57:31 crc kubenswrapper[4746]: I0128 20:57:31.395161 4746 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/openstack-galera-0"]
Jan 28 20:57:31 crc kubenswrapper[4746]: I0128 20:57:31.486381 4746 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"pvc-574bad40-35f9-4b6d-bcbe-e4f2c6ef8574\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-574bad40-35f9-4b6d-bcbe-e4f2c6ef8574\") pod \"openstack-galera-0\" (UID: \"e98da54b-efd0-4811-a433-9ce8134feb13\") " pod="openstack/openstack-galera-0"
Jan 28 20:57:31 crc kubenswrapper[4746]: I0128 20:57:31.486460 4746 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/e98da54b-efd0-4811-a433-9ce8134feb13-operator-scripts\") pod \"openstack-galera-0\" (UID: \"e98da54b-efd0-4811-a433-9ce8134feb13\") " pod="openstack/openstack-galera-0"
Jan 28 20:57:31 crc kubenswrapper[4746]: I0128 20:57:31.486482 4746 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e98da54b-efd0-4811-a433-9ce8134feb13-combined-ca-bundle\") pod \"openstack-galera-0\" (UID: \"e98da54b-efd0-4811-a433-9ce8134feb13\") " pod="openstack/openstack-galera-0"
Jan 28 20:57:31 crc kubenswrapper[4746]: I0128 20:57:31.486502 4746 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"galera-tls-certs\" (UniqueName: \"kubernetes.io/secret/e98da54b-efd0-4811-a433-9ce8134feb13-galera-tls-certs\") pod \"openstack-galera-0\" (UID: \"e98da54b-efd0-4811-a433-9ce8134feb13\") " pod="openstack/openstack-galera-0"
Jan 28 20:57:31 crc kubenswrapper[4746]: I0128 20:57:31.486530 4746 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-m5kxk\" (UniqueName: \"kubernetes.io/projected/e98da54b-efd0-4811-a433-9ce8134feb13-kube-api-access-m5kxk\") pod \"openstack-galera-0\" (UID: \"e98da54b-efd0-4811-a433-9ce8134feb13\") " pod="openstack/openstack-galera-0"
Jan 28 20:57:31 crc kubenswrapper[4746]: I0128 20:57:31.486557 4746 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-generated\" (UniqueName: \"kubernetes.io/empty-dir/e98da54b-efd0-4811-a433-9ce8134feb13-config-data-generated\") pod \"openstack-galera-0\" (UID: \"e98da54b-efd0-4811-a433-9ce8134feb13\") " pod="openstack/openstack-galera-0"
Jan 28 20:57:31 crc kubenswrapper[4746]: I0128 20:57:31.486601 4746 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-default\" (UniqueName: \"kubernetes.io/configmap/e98da54b-efd0-4811-a433-9ce8134feb13-config-data-default\") pod \"openstack-galera-0\" (UID: \"e98da54b-efd0-4811-a433-9ce8134feb13\") " pod="openstack/openstack-galera-0"
Jan 28 20:57:31 crc kubenswrapper[4746]: I0128 20:57:31.486619 4746 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kolla-config\" (UniqueName: \"kubernetes.io/configmap/e98da54b-efd0-4811-a433-9ce8134feb13-kolla-config\") pod \"openstack-galera-0\" (UID: \"e98da54b-efd0-4811-a433-9ce8134feb13\") " pod="openstack/openstack-galera-0"
Jan 28 20:57:31 crc kubenswrapper[4746]: I0128 20:57:31.587693 4746 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-default\" (UniqueName: \"kubernetes.io/configmap/e98da54b-efd0-4811-a433-9ce8134feb13-config-data-default\") pod \"openstack-galera-0\" (UID: \"e98da54b-efd0-4811-a433-9ce8134feb13\") " pod="openstack/openstack-galera-0"
Jan 28 20:57:31 crc kubenswrapper[4746]: I0128 20:57:31.587736 4746 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kolla-config\" (UniqueName: \"kubernetes.io/configmap/e98da54b-efd0-4811-a433-9ce8134feb13-kolla-config\") pod \"openstack-galera-0\" (UID: \"e98da54b-efd0-4811-a433-9ce8134feb13\") " pod="openstack/openstack-galera-0"
Jan 28 20:57:31 crc kubenswrapper[4746]: I0128 20:57:31.587816 4746 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-574bad40-35f9-4b6d-bcbe-e4f2c6ef8574\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-574bad40-35f9-4b6d-bcbe-e4f2c6ef8574\") pod \"openstack-galera-0\" (UID: \"e98da54b-efd0-4811-a433-9ce8134feb13\") " pod="openstack/openstack-galera-0"
Jan 28 20:57:31 crc kubenswrapper[4746]: I0128 20:57:31.587841 4746 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/e98da54b-efd0-4811-a433-9ce8134feb13-operator-scripts\") pod \"openstack-galera-0\" (UID: \"e98da54b-efd0-4811-a433-9ce8134feb13\") " pod="openstack/openstack-galera-0"
Jan 28 20:57:31 crc kubenswrapper[4746]: I0128 20:57:31.587861 4746 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e98da54b-efd0-4811-a433-9ce8134feb13-combined-ca-bundle\") pod \"openstack-galera-0\" (UID: \"e98da54b-efd0-4811-a433-9ce8134feb13\") " pod="openstack/openstack-galera-0"
Jan 28 20:57:31 crc kubenswrapper[4746]: I0128 20:57:31.587891 4746 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"galera-tls-certs\" (UniqueName: \"kubernetes.io/secret/e98da54b-efd0-4811-a433-9ce8134feb13-galera-tls-certs\") pod \"openstack-galera-0\" (UID: \"e98da54b-efd0-4811-a433-9ce8134feb13\") " pod="openstack/openstack-galera-0"
Jan 28 20:57:31 crc kubenswrapper[4746]: I0128 20:57:31.587930 4746 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-m5kxk\" (UniqueName: \"kubernetes.io/projected/e98da54b-efd0-4811-a433-9ce8134feb13-kube-api-access-m5kxk\") pod \"openstack-galera-0\" (UID: \"e98da54b-efd0-4811-a433-9ce8134feb13\") " pod="openstack/openstack-galera-0"
Jan 28 20:57:31 crc kubenswrapper[4746]: I0128 20:57:31.587962 4746 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-generated\" (UniqueName: \"kubernetes.io/empty-dir/e98da54b-efd0-4811-a433-9ce8134feb13-config-data-generated\") pod \"openstack-galera-0\" (UID: \"e98da54b-efd0-4811-a433-9ce8134feb13\") " pod="openstack/openstack-galera-0"
Jan 28 20:57:31 crc kubenswrapper[4746]: I0128 20:57:31.588601 4746 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-generated\" (UniqueName: \"kubernetes.io/empty-dir/e98da54b-efd0-4811-a433-9ce8134feb13-config-data-generated\") pod \"openstack-galera-0\" (UID: \"e98da54b-efd0-4811-a433-9ce8134feb13\") " pod="openstack/openstack-galera-0"
Jan 28 20:57:31 crc kubenswrapper[4746]: I0128 20:57:31.589017 4746 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kolla-config\" (UniqueName: \"kubernetes.io/configmap/e98da54b-efd0-4811-a433-9ce8134feb13-kolla-config\") pod \"openstack-galera-0\" (UID: \"e98da54b-efd0-4811-a433-9ce8134feb13\") " pod="openstack/openstack-galera-0"
Jan 28 20:57:31 crc kubenswrapper[4746]: I0128 20:57:31.592345 4746 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/e98da54b-efd0-4811-a433-9ce8134feb13-operator-scripts\") pod \"openstack-galera-0\" (UID: \"e98da54b-efd0-4811-a433-9ce8134feb13\") " pod="openstack/openstack-galera-0"
Jan 28 20:57:31 crc kubenswrapper[4746]: I0128 20:57:31.592601 4746 csi_attacher.go:380] kubernetes.io/csi: attacher.MountDevice STAGE_UNSTAGE_VOLUME capability not set. Skipping MountDevice...
Jan 28 20:57:31 crc kubenswrapper[4746]: I0128 20:57:31.592656 4746 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"pvc-574bad40-35f9-4b6d-bcbe-e4f2c6ef8574\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-574bad40-35f9-4b6d-bcbe-e4f2c6ef8574\") pod \"openstack-galera-0\" (UID: \"e98da54b-efd0-4811-a433-9ce8134feb13\") device mount path \"/var/lib/kubelet/plugins/kubernetes.io/csi/kubevirt.io.hostpath-provisioner/41420077eaeb4d78306081ef8cda55ed1dbca5dd8a0bb7187e68d9b3da8c1c4c/globalmount\"" pod="openstack/openstack-galera-0"
Jan 28 20:57:31 crc kubenswrapper[4746]: I0128 20:57:31.597256 4746 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"galera-tls-certs\" (UniqueName: \"kubernetes.io/secret/e98da54b-efd0-4811-a433-9ce8134feb13-galera-tls-certs\") pod \"openstack-galera-0\" (UID: \"e98da54b-efd0-4811-a433-9ce8134feb13\") " pod="openstack/openstack-galera-0"
Jan 28 20:57:31 crc kubenswrapper[4746]: I0128 20:57:31.612651 4746 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-default\" (UniqueName: \"kubernetes.io/configmap/e98da54b-efd0-4811-a433-9ce8134feb13-config-data-default\") pod \"openstack-galera-0\" (UID: \"e98da54b-efd0-4811-a433-9ce8134feb13\") " pod="openstack/openstack-galera-0"
Jan 28 20:57:31 crc kubenswrapper[4746]: I0128 20:57:31.619688 4746 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e98da54b-efd0-4811-a433-9ce8134feb13-combined-ca-bundle\") pod \"openstack-galera-0\" (UID: \"e98da54b-efd0-4811-a433-9ce8134feb13\") " pod="openstack/openstack-galera-0"
Jan 28 20:57:31 crc kubenswrapper[4746]: I0128 20:57:31.626555 4746 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-m5kxk\" (UniqueName: \"kubernetes.io/projected/e98da54b-efd0-4811-a433-9ce8134feb13-kube-api-access-m5kxk\") pod \"openstack-galera-0\" (UID: \"e98da54b-efd0-4811-a433-9ce8134feb13\") " pod="openstack/openstack-galera-0"
Jan 28 20:57:31 crc kubenswrapper[4746]: I0128 20:57:31.648511 4746 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"pvc-574bad40-35f9-4b6d-bcbe-e4f2c6ef8574\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-574bad40-35f9-4b6d-bcbe-e4f2c6ef8574\") pod \"openstack-galera-0\" (UID: \"e98da54b-efd0-4811-a433-9ce8134feb13\") " pod="openstack/openstack-galera-0"
Jan 28 20:57:31 crc kubenswrapper[4746]: I0128 20:57:31.690099 4746 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/openstack-galera-0"
Jan 28 20:57:32 crc kubenswrapper[4746]: I0128 20:57:32.897506 4746 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/openstack-cell1-galera-0"]
Jan 28 20:57:32 crc kubenswrapper[4746]: I0128 20:57:32.906993 4746 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/openstack-cell1-galera-0"
Jan 28 20:57:32 crc kubenswrapper[4746]: I0128 20:57:32.912667 4746 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-galera-openstack-cell1-svc"
Jan 28 20:57:32 crc kubenswrapper[4746]: I0128 20:57:32.912890 4746 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"galera-openstack-cell1-dockercfg-bl2wd"
Jan 28 20:57:32 crc kubenswrapper[4746]: I0128 20:57:32.913182 4746 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-cell1-scripts"
Jan 28 20:57:32 crc kubenswrapper[4746]: I0128 20:57:32.916621 4746 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/openstack-cell1-galera-0"]
Jan 28 20:57:32 crc kubenswrapper[4746]: I0128 20:57:32.918462 4746 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-cell1-config-data"
Jan 28 20:57:32 crc kubenswrapper[4746]: I0128 20:57:32.956677 4746 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/memcached-0"]
Jan 28 20:57:32 crc kubenswrapper[4746]: I0128 20:57:32.957599 4746 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/memcached-0"
Jan 28 20:57:32 crc kubenswrapper[4746]: I0128 20:57:32.972107 4746 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-memcached-svc"
Jan 28 20:57:32 crc kubenswrapper[4746]: I0128 20:57:32.972330 4746 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"memcached-config-data"
Jan 28 20:57:32 crc kubenswrapper[4746]: I0128 20:57:32.972474 4746 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"memcached-memcached-dockercfg-7lwpr"
Jan 28 20:57:32 crc kubenswrapper[4746]: I0128 20:57:32.995643 4746 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/memcached-0"]
Jan 28 20:57:33 crc kubenswrapper[4746]: I0128 20:57:33.019206 4746 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"pvc-775461f2-88d6-4136-9b27-bc42baf51e65\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-775461f2-88d6-4136-9b27-bc42baf51e65\") pod \"openstack-cell1-galera-0\" (UID: \"7257206d-db68-4f31-84d1-ceb4175ea394\") " pod="openstack/openstack-cell1-galera-0"
Jan 28 20:57:33 crc kubenswrapper[4746]: I0128 20:57:33.019257 4746 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/7257206d-db68-4f31-84d1-ceb4175ea394-operator-scripts\") pod \"openstack-cell1-galera-0\" (UID: \"7257206d-db68-4f31-84d1-ceb4175ea394\") " pod="openstack/openstack-cell1-galera-0"
Jan 28 20:57:33 crc kubenswrapper[4746]: I0128 20:57:33.019278 4746 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-default\" (UniqueName: \"kubernetes.io/configmap/7257206d-db68-4f31-84d1-ceb4175ea394-config-data-default\") pod \"openstack-cell1-galera-0\" (UID: \"7257206d-db68-4f31-84d1-ceb4175ea394\") " pod="openstack/openstack-cell1-galera-0"
Jan 28 20:57:33 crc kubenswrapper[4746]: I0128 20:57:33.019301 4746 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-hh2vx\" (UniqueName: \"kubernetes.io/projected/7257206d-db68-4f31-84d1-ceb4175ea394-kube-api-access-hh2vx\") pod \"openstack-cell1-galera-0\" (UID: \"7257206d-db68-4f31-84d1-ceb4175ea394\") " pod="openstack/openstack-cell1-galera-0"
Jan 28 20:57:33 crc kubenswrapper[4746]: I0128 20:57:33.019327 4746 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kolla-config\" (UniqueName: \"kubernetes.io/configmap/8c01b9a6-3e78-4a0c-9825-e39856c2df93-kolla-config\") pod \"memcached-0\" (UID: \"8c01b9a6-3e78-4a0c-9825-e39856c2df93\") " pod="openstack/memcached-0"
Jan 28 20:57:33 crc kubenswrapper[4746]: I0128 20:57:33.019345 4746 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-qvj9b\" (UniqueName: \"kubernetes.io/projected/8c01b9a6-3e78-4a0c-9825-e39856c2df93-kube-api-access-qvj9b\") pod \"memcached-0\" (UID: \"8c01b9a6-3e78-4a0c-9825-e39856c2df93\") " pod="openstack/memcached-0"
Jan 28 20:57:33 crc kubenswrapper[4746]: I0128 20:57:33.019387 4746 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/8c01b9a6-3e78-4a0c-9825-e39856c2df93-config-data\") pod \"memcached-0\" (UID: \"8c01b9a6-3e78-4a0c-9825-e39856c2df93\") " pod="openstack/memcached-0"
Jan 28 20:57:33 crc kubenswrapper[4746]: I0128 20:57:33.019406 4746 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"memcached-tls-certs\" (UniqueName: \"kubernetes.io/secret/8c01b9a6-3e78-4a0c-9825-e39856c2df93-memcached-tls-certs\") pod \"memcached-0\" (UID: \"8c01b9a6-3e78-4a0c-9825-e39856c2df93\") " pod="openstack/memcached-0"
Jan 28 20:57:33 crc kubenswrapper[4746]: I0128 20:57:33.019427 4746 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/7257206d-db68-4f31-84d1-ceb4175ea394-combined-ca-bundle\") pod \"openstack-cell1-galera-0\" (UID: \"7257206d-db68-4f31-84d1-ceb4175ea394\") " pod="openstack/openstack-cell1-galera-0"
Jan 28 20:57:33 crc kubenswrapper[4746]: I0128 20:57:33.019457 4746 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-generated\" (UniqueName: \"kubernetes.io/empty-dir/7257206d-db68-4f31-84d1-ceb4175ea394-config-data-generated\") pod \"openstack-cell1-galera-0\" (UID: \"7257206d-db68-4f31-84d1-ceb4175ea394\") " pod="openstack/openstack-cell1-galera-0"
Jan 28 20:57:33 crc kubenswrapper[4746]: I0128 20:57:33.019478 4746 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"galera-tls-certs\" (UniqueName: \"kubernetes.io/secret/7257206d-db68-4f31-84d1-ceb4175ea394-galera-tls-certs\") pod \"openstack-cell1-galera-0\" (UID: \"7257206d-db68-4f31-84d1-ceb4175ea394\") " pod="openstack/openstack-cell1-galera-0"
Jan 28 20:57:33 crc kubenswrapper[4746]: I0128 20:57:33.019499 4746 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/8c01b9a6-3e78-4a0c-9825-e39856c2df93-combined-ca-bundle\") pod \"memcached-0\" (UID: \"8c01b9a6-3e78-4a0c-9825-e39856c2df93\") " pod="openstack/memcached-0"
Jan 28 20:57:33 crc kubenswrapper[4746]: I0128 20:57:33.019517 4746 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kolla-config\" (UniqueName: \"kubernetes.io/configmap/7257206d-db68-4f31-84d1-ceb4175ea394-kolla-config\") pod \"openstack-cell1-galera-0\" (UID: \"7257206d-db68-4f31-84d1-ceb4175ea394\") " pod="openstack/openstack-cell1-galera-0"
Jan 28 20:57:33 crc kubenswrapper[4746]: I0128 20:57:33.122390 4746 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/8c01b9a6-3e78-4a0c-9825-e39856c2df93-config-data\") pod \"memcached-0\" (UID: \"8c01b9a6-3e78-4a0c-9825-e39856c2df93\") " pod="openstack/memcached-0"
Jan 28 20:57:33 crc kubenswrapper[4746]: I0128 20:57:33.122451 4746 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"memcached-tls-certs\" (UniqueName: \"kubernetes.io/secret/8c01b9a6-3e78-4a0c-9825-e39856c2df93-memcached-tls-certs\") pod \"memcached-0\" (UID: \"8c01b9a6-3e78-4a0c-9825-e39856c2df93\") " pod="openstack/memcached-0"
Jan 28 20:57:33 crc kubenswrapper[4746]: I0128 20:57:33.122481 4746 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/7257206d-db68-4f31-84d1-ceb4175ea394-combined-ca-bundle\") pod \"openstack-cell1-galera-0\" (UID: \"7257206d-db68-4f31-84d1-ceb4175ea394\") " pod="openstack/openstack-cell1-galera-0"
Jan 28 20:57:33 crc kubenswrapper[4746]: I0128 20:57:33.122530 4746 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-generated\" (UniqueName: \"kubernetes.io/empty-dir/7257206d-db68-4f31-84d1-ceb4175ea394-config-data-generated\") pod \"openstack-cell1-galera-0\" (UID: \"7257206d-db68-4f31-84d1-ceb4175ea394\") " pod="openstack/openstack-cell1-galera-0"
Jan 28 20:57:33 crc kubenswrapper[4746]: I0128 20:57:33.122561 4746 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"galera-tls-certs\" (UniqueName: \"kubernetes.io/secret/7257206d-db68-4f31-84d1-ceb4175ea394-galera-tls-certs\") pod \"openstack-cell1-galera-0\" (UID: \"7257206d-db68-4f31-84d1-ceb4175ea394\") " pod="openstack/openstack-cell1-galera-0"
Jan 28 20:57:33 crc kubenswrapper[4746]: I0128 20:57:33.122592 4746 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/8c01b9a6-3e78-4a0c-9825-e39856c2df93-combined-ca-bundle\") pod \"memcached-0\" (UID: \"8c01b9a6-3e78-4a0c-9825-e39856c2df93\") " pod="openstack/memcached-0"
Jan 28 20:57:33 crc kubenswrapper[4746]: I0128 20:57:33.122618 4746 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kolla-config\" (UniqueName: \"kubernetes.io/configmap/7257206d-db68-4f31-84d1-ceb4175ea394-kolla-config\") pod \"openstack-cell1-galera-0\" (UID: \"7257206d-db68-4f31-84d1-ceb4175ea394\") " pod="openstack/openstack-cell1-galera-0"
Jan 28 20:57:33 crc kubenswrapper[4746]: I0128 20:57:33.122687 4746 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-775461f2-88d6-4136-9b27-bc42baf51e65\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-775461f2-88d6-4136-9b27-bc42baf51e65\") pod \"openstack-cell1-galera-0\" (UID: \"7257206d-db68-4f31-84d1-ceb4175ea394\") " pod="openstack/openstack-cell1-galera-0"
Jan 28 20:57:33 crc kubenswrapper[4746]: I0128 20:57:33.122713 4746 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/7257206d-db68-4f31-84d1-ceb4175ea394-operator-scripts\") pod \"openstack-cell1-galera-0\" (UID: \"7257206d-db68-4f31-84d1-ceb4175ea394\") " pod="openstack/openstack-cell1-galera-0"
Jan 28 20:57:33 crc kubenswrapper[4746]: I0128 20:57:33.122741 4746 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-default\" (UniqueName: \"kubernetes.io/configmap/7257206d-db68-4f31-84d1-ceb4175ea394-config-data-default\") pod \"openstack-cell1-galera-0\" (UID: \"7257206d-db68-4f31-84d1-ceb4175ea394\") " pod="openstack/openstack-cell1-galera-0"
Jan 28 20:57:33 crc kubenswrapper[4746]: I0128 20:57:33.122769 4746 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-hh2vx\" (UniqueName: \"kubernetes.io/projected/7257206d-db68-4f31-84d1-ceb4175ea394-kube-api-access-hh2vx\") pod \"openstack-cell1-galera-0\" (UID: \"7257206d-db68-4f31-84d1-ceb4175ea394\") " pod="openstack/openstack-cell1-galera-0"
Jan 28 20:57:33 crc kubenswrapper[4746]: I0128 20:57:33.122800 4746 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kolla-config\" (UniqueName: \"kubernetes.io/configmap/8c01b9a6-3e78-4a0c-9825-e39856c2df93-kolla-config\") pod \"memcached-0\" (UID: \"8c01b9a6-3e78-4a0c-9825-e39856c2df93\") " pod="openstack/memcached-0"
Jan 28 20:57:33 crc kubenswrapper[4746]: I0128 20:57:33.122825 4746 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-qvj9b\" (UniqueName: \"kubernetes.io/projected/8c01b9a6-3e78-4a0c-9825-e39856c2df93-kube-api-access-qvj9b\") pod \"memcached-0\" (UID: \"8c01b9a6-3e78-4a0c-9825-e39856c2df93\") " pod="openstack/memcached-0"
Jan 28 20:57:33 crc kubenswrapper[4746]: I0128 20:57:33.124318 4746 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/8c01b9a6-3e78-4a0c-9825-e39856c2df93-config-data\") pod \"memcached-0\" (UID: \"8c01b9a6-3e78-4a0c-9825-e39856c2df93\") " pod="openstack/memcached-0"
Jan 28 20:57:33 crc kubenswrapper[4746]: I0128 20:57:33.124833 4746 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kolla-config\" (UniqueName: \"kubernetes.io/configmap/7257206d-db68-4f31-84d1-ceb4175ea394-kolla-config\") pod \"openstack-cell1-galera-0\" (UID: \"7257206d-db68-4f31-84d1-ceb4175ea394\") " pod="openstack/openstack-cell1-galera-0"
Jan 28 20:57:33 crc kubenswrapper[4746]: I0128 20:57:33.126001 4746 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/7257206d-db68-4f31-84d1-ceb4175ea394-operator-scripts\") pod \"openstack-cell1-galera-0\" (UID: \"7257206d-db68-4f31-84d1-ceb4175ea394\") " pod="openstack/openstack-cell1-galera-0"
Jan 28 20:57:33 crc kubenswrapper[4746]: I0128 20:57:33.126501 4746 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-generated\" (UniqueName: \"kubernetes.io/empty-dir/7257206d-db68-4f31-84d1-ceb4175ea394-config-data-generated\") pod \"openstack-cell1-galera-0\" (UID: \"7257206d-db68-4f31-84d1-ceb4175ea394\") " pod="openstack/openstack-cell1-galera-0"
Jan 28 20:57:33 crc kubenswrapper[4746]: I0128 20:57:33.130496 4746 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kolla-config\" (UniqueName: \"kubernetes.io/configmap/8c01b9a6-3e78-4a0c-9825-e39856c2df93-kolla-config\") pod \"memcached-0\" (UID: \"8c01b9a6-3e78-4a0c-9825-e39856c2df93\") " pod="openstack/memcached-0"
Jan 28 20:57:33 crc kubenswrapper[4746]: I0128 20:57:33.131215 4746 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"memcached-tls-certs\" (UniqueName: \"kubernetes.io/secret/8c01b9a6-3e78-4a0c-9825-e39856c2df93-memcached-tls-certs\") pod \"memcached-0\" (UID: \"8c01b9a6-3e78-4a0c-9825-e39856c2df93\") " pod="openstack/memcached-0"
Jan 28 20:57:33 crc kubenswrapper[4746]: I0128 20:57:33.142347 4746 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-default\" (UniqueName: \"kubernetes.io/configmap/7257206d-db68-4f31-84d1-ceb4175ea394-config-data-default\") pod \"openstack-cell1-galera-0\" (UID: \"7257206d-db68-4f31-84d1-ceb4175ea394\") " pod="openstack/openstack-cell1-galera-0"
Jan 28 20:57:33 crc kubenswrapper[4746]: I0128 20:57:33.143250 4746 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"galera-tls-certs\" (UniqueName: \"kubernetes.io/secret/7257206d-db68-4f31-84d1-ceb4175ea394-galera-tls-certs\") pod \"openstack-cell1-galera-0\" (UID: \"7257206d-db68-4f31-84d1-ceb4175ea394\") " pod="openstack/openstack-cell1-galera-0"
Jan 28 20:57:33 crc kubenswrapper[4746]: I0128 20:57:33.143902 4746 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/7257206d-db68-4f31-84d1-ceb4175ea394-combined-ca-bundle\") pod \"openstack-cell1-galera-0\" (UID: \"7257206d-db68-4f31-84d1-ceb4175ea394\") " pod="openstack/openstack-cell1-galera-0"
Jan 28 20:57:33 crc kubenswrapper[4746]: I0128 20:57:33.146871 4746 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/8c01b9a6-3e78-4a0c-9825-e39856c2df93-combined-ca-bundle\") pod \"memcached-0\" (UID: \"8c01b9a6-3e78-4a0c-9825-e39856c2df93\") " pod="openstack/memcached-0"
Jan 28 20:57:33 crc kubenswrapper[4746]: I0128 20:57:33.149346 4746 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-qvj9b\" (UniqueName: \"kubernetes.io/projected/8c01b9a6-3e78-4a0c-9825-e39856c2df93-kube-api-access-qvj9b\") pod \"memcached-0\" (UID: \"8c01b9a6-3e78-4a0c-9825-e39856c2df93\") " pod="openstack/memcached-0"
Jan 28 20:57:33 crc kubenswrapper[4746]: I0128 20:57:33.149897 4746 csi_attacher.go:380] kubernetes.io/csi: attacher.MountDevice STAGE_UNSTAGE_VOLUME capability not set. Skipping MountDevice...
Jan 28 20:57:33 crc kubenswrapper[4746]: I0128 20:57:33.149930 4746 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"pvc-775461f2-88d6-4136-9b27-bc42baf51e65\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-775461f2-88d6-4136-9b27-bc42baf51e65\") pod \"openstack-cell1-galera-0\" (UID: \"7257206d-db68-4f31-84d1-ceb4175ea394\") device mount path \"/var/lib/kubelet/plugins/kubernetes.io/csi/kubevirt.io.hostpath-provisioner/8b7ff00f4c260bb06f7cf2f85e4b6a288b875bad3aae1dea9205ab95a009ae01/globalmount\"" pod="openstack/openstack-cell1-galera-0" Jan 28 20:57:33 crc kubenswrapper[4746]: I0128 20:57:33.162858 4746 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-hh2vx\" (UniqueName: \"kubernetes.io/projected/7257206d-db68-4f31-84d1-ceb4175ea394-kube-api-access-hh2vx\") pod \"openstack-cell1-galera-0\" (UID: \"7257206d-db68-4f31-84d1-ceb4175ea394\") " pod="openstack/openstack-cell1-galera-0" Jan 28 20:57:33 crc kubenswrapper[4746]: I0128 20:57:33.225630 4746 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"pvc-775461f2-88d6-4136-9b27-bc42baf51e65\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-775461f2-88d6-4136-9b27-bc42baf51e65\") pod \"openstack-cell1-galera-0\" (UID: \"7257206d-db68-4f31-84d1-ceb4175ea394\") " pod="openstack/openstack-cell1-galera-0" Jan 28 20:57:33 crc kubenswrapper[4746]: I0128 20:57:33.290390 4746 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/openstack-cell1-galera-0" Jan 28 20:57:33 crc kubenswrapper[4746]: I0128 20:57:33.319370 4746 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/memcached-0" Jan 28 20:57:34 crc kubenswrapper[4746]: I0128 20:57:34.828511 4746 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/kube-state-metrics-0"] Jan 28 20:57:34 crc kubenswrapper[4746]: I0128 20:57:34.829597 4746 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/kube-state-metrics-0" Jan 28 20:57:34 crc kubenswrapper[4746]: I0128 20:57:34.832023 4746 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"telemetry-ceilometer-dockercfg-ftsk7" Jan 28 20:57:34 crc kubenswrapper[4746]: I0128 20:57:34.875969 4746 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-ksj4w\" (UniqueName: \"kubernetes.io/projected/4aa3c3d3-f7a7-4e26-bf26-630de3cc7a17-kube-api-access-ksj4w\") pod \"kube-state-metrics-0\" (UID: \"4aa3c3d3-f7a7-4e26-bf26-630de3cc7a17\") " pod="openstack/kube-state-metrics-0" Jan 28 20:57:34 crc kubenswrapper[4746]: I0128 20:57:34.898337 4746 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/kube-state-metrics-0"] Jan 28 20:57:34 crc kubenswrapper[4746]: I0128 20:57:34.979485 4746 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-ksj4w\" (UniqueName: \"kubernetes.io/projected/4aa3c3d3-f7a7-4e26-bf26-630de3cc7a17-kube-api-access-ksj4w\") pod \"kube-state-metrics-0\" (UID: \"4aa3c3d3-f7a7-4e26-bf26-630de3cc7a17\") " pod="openstack/kube-state-metrics-0" Jan 28 20:57:35 crc kubenswrapper[4746]: I0128 20:57:35.008467 4746 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-ksj4w\" (UniqueName: \"kubernetes.io/projected/4aa3c3d3-f7a7-4e26-bf26-630de3cc7a17-kube-api-access-ksj4w\") pod \"kube-state-metrics-0\" (UID: \"4aa3c3d3-f7a7-4e26-bf26-630de3cc7a17\") " pod="openstack/kube-state-metrics-0" Jan 28 20:57:35 crc kubenswrapper[4746]: I0128 20:57:35.174774 4746 util.go:30] "No sandbox for pod can be 
found. Need to start a new one" pod="openstack/kube-state-metrics-0" Jan 28 20:57:35 crc kubenswrapper[4746]: I0128 20:57:35.557733 4746 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/alertmanager-metric-storage-0"] Jan 28 20:57:35 crc kubenswrapper[4746]: I0128 20:57:35.559465 4746 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/alertmanager-metric-storage-0" Jan 28 20:57:35 crc kubenswrapper[4746]: I0128 20:57:35.564794 4746 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/alertmanager-metric-storage-0"] Jan 28 20:57:35 crc kubenswrapper[4746]: I0128 20:57:35.568181 4746 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"alertmanager-metric-storage-tls-assets-0" Jan 28 20:57:35 crc kubenswrapper[4746]: I0128 20:57:35.568523 4746 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"metric-storage-alertmanager-dockercfg-bwsj7" Jan 28 20:57:35 crc kubenswrapper[4746]: I0128 20:57:35.571961 4746 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"alertmanager-metric-storage-web-config" Jan 28 20:57:35 crc kubenswrapper[4746]: I0128 20:57:35.572068 4746 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"alertmanager-metric-storage-generated" Jan 28 20:57:35 crc kubenswrapper[4746]: I0128 20:57:35.572914 4746 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"alertmanager-metric-storage-cluster-tls-config" Jan 28 20:57:35 crc kubenswrapper[4746]: I0128 20:57:35.721871 4746 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/secret/0701e4bf-44d6-462c-a55b-140c2efceb6b-config-volume\") pod \"alertmanager-metric-storage-0\" (UID: \"0701e4bf-44d6-462c-a55b-140c2efceb6b\") " pod="openstack/alertmanager-metric-storage-0" Jan 28 20:57:35 crc kubenswrapper[4746]: I0128 20:57:35.721925 4746 
reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tls-assets\" (UniqueName: \"kubernetes.io/projected/0701e4bf-44d6-462c-a55b-140c2efceb6b-tls-assets\") pod \"alertmanager-metric-storage-0\" (UID: \"0701e4bf-44d6-462c-a55b-140c2efceb6b\") " pod="openstack/alertmanager-metric-storage-0" Jan 28 20:57:35 crc kubenswrapper[4746]: I0128 20:57:35.721959 4746 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-out\" (UniqueName: \"kubernetes.io/empty-dir/0701e4bf-44d6-462c-a55b-140c2efceb6b-config-out\") pod \"alertmanager-metric-storage-0\" (UID: \"0701e4bf-44d6-462c-a55b-140c2efceb6b\") " pod="openstack/alertmanager-metric-storage-0" Jan 28 20:57:35 crc kubenswrapper[4746]: I0128 20:57:35.721999 4746 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"alertmanager-metric-storage-db\" (UniqueName: \"kubernetes.io/empty-dir/0701e4bf-44d6-462c-a55b-140c2efceb6b-alertmanager-metric-storage-db\") pod \"alertmanager-metric-storage-0\" (UID: \"0701e4bf-44d6-462c-a55b-140c2efceb6b\") " pod="openstack/alertmanager-metric-storage-0" Jan 28 20:57:35 crc kubenswrapper[4746]: I0128 20:57:35.722037 4746 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"web-config\" (UniqueName: \"kubernetes.io/secret/0701e4bf-44d6-462c-a55b-140c2efceb6b-web-config\") pod \"alertmanager-metric-storage-0\" (UID: \"0701e4bf-44d6-462c-a55b-140c2efceb6b\") " pod="openstack/alertmanager-metric-storage-0" Jan 28 20:57:35 crc kubenswrapper[4746]: I0128 20:57:35.722097 4746 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-nqswl\" (UniqueName: \"kubernetes.io/projected/0701e4bf-44d6-462c-a55b-140c2efceb6b-kube-api-access-nqswl\") pod \"alertmanager-metric-storage-0\" (UID: \"0701e4bf-44d6-462c-a55b-140c2efceb6b\") " 
pod="openstack/alertmanager-metric-storage-0" Jan 28 20:57:35 crc kubenswrapper[4746]: I0128 20:57:35.722129 4746 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cluster-tls-config\" (UniqueName: \"kubernetes.io/secret/0701e4bf-44d6-462c-a55b-140c2efceb6b-cluster-tls-config\") pod \"alertmanager-metric-storage-0\" (UID: \"0701e4bf-44d6-462c-a55b-140c2efceb6b\") " pod="openstack/alertmanager-metric-storage-0" Jan 28 20:57:35 crc kubenswrapper[4746]: I0128 20:57:35.823865 4746 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/secret/0701e4bf-44d6-462c-a55b-140c2efceb6b-config-volume\") pod \"alertmanager-metric-storage-0\" (UID: \"0701e4bf-44d6-462c-a55b-140c2efceb6b\") " pod="openstack/alertmanager-metric-storage-0" Jan 28 20:57:35 crc kubenswrapper[4746]: I0128 20:57:35.824223 4746 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"tls-assets\" (UniqueName: \"kubernetes.io/projected/0701e4bf-44d6-462c-a55b-140c2efceb6b-tls-assets\") pod \"alertmanager-metric-storage-0\" (UID: \"0701e4bf-44d6-462c-a55b-140c2efceb6b\") " pod="openstack/alertmanager-metric-storage-0" Jan 28 20:57:35 crc kubenswrapper[4746]: I0128 20:57:35.824244 4746 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-out\" (UniqueName: \"kubernetes.io/empty-dir/0701e4bf-44d6-462c-a55b-140c2efceb6b-config-out\") pod \"alertmanager-metric-storage-0\" (UID: \"0701e4bf-44d6-462c-a55b-140c2efceb6b\") " pod="openstack/alertmanager-metric-storage-0" Jan 28 20:57:35 crc kubenswrapper[4746]: I0128 20:57:35.824277 4746 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"alertmanager-metric-storage-db\" (UniqueName: \"kubernetes.io/empty-dir/0701e4bf-44d6-462c-a55b-140c2efceb6b-alertmanager-metric-storage-db\") pod \"alertmanager-metric-storage-0\" (UID: \"0701e4bf-44d6-462c-a55b-140c2efceb6b\") " 
pod="openstack/alertmanager-metric-storage-0" Jan 28 20:57:35 crc kubenswrapper[4746]: I0128 20:57:35.824302 4746 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"web-config\" (UniqueName: \"kubernetes.io/secret/0701e4bf-44d6-462c-a55b-140c2efceb6b-web-config\") pod \"alertmanager-metric-storage-0\" (UID: \"0701e4bf-44d6-462c-a55b-140c2efceb6b\") " pod="openstack/alertmanager-metric-storage-0" Jan 28 20:57:35 crc kubenswrapper[4746]: I0128 20:57:35.824329 4746 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-nqswl\" (UniqueName: \"kubernetes.io/projected/0701e4bf-44d6-462c-a55b-140c2efceb6b-kube-api-access-nqswl\") pod \"alertmanager-metric-storage-0\" (UID: \"0701e4bf-44d6-462c-a55b-140c2efceb6b\") " pod="openstack/alertmanager-metric-storage-0" Jan 28 20:57:35 crc kubenswrapper[4746]: I0128 20:57:35.824349 4746 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cluster-tls-config\" (UniqueName: \"kubernetes.io/secret/0701e4bf-44d6-462c-a55b-140c2efceb6b-cluster-tls-config\") pod \"alertmanager-metric-storage-0\" (UID: \"0701e4bf-44d6-462c-a55b-140c2efceb6b\") " pod="openstack/alertmanager-metric-storage-0" Jan 28 20:57:35 crc kubenswrapper[4746]: I0128 20:57:35.824908 4746 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"alertmanager-metric-storage-db\" (UniqueName: \"kubernetes.io/empty-dir/0701e4bf-44d6-462c-a55b-140c2efceb6b-alertmanager-metric-storage-db\") pod \"alertmanager-metric-storage-0\" (UID: \"0701e4bf-44d6-462c-a55b-140c2efceb6b\") " pod="openstack/alertmanager-metric-storage-0" Jan 28 20:57:35 crc kubenswrapper[4746]: I0128 20:57:35.834684 4746 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-out\" (UniqueName: \"kubernetes.io/empty-dir/0701e4bf-44d6-462c-a55b-140c2efceb6b-config-out\") pod \"alertmanager-metric-storage-0\" (UID: \"0701e4bf-44d6-462c-a55b-140c2efceb6b\") " 
pod="openstack/alertmanager-metric-storage-0" Jan 28 20:57:35 crc kubenswrapper[4746]: I0128 20:57:35.839875 4746 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"tls-assets\" (UniqueName: \"kubernetes.io/projected/0701e4bf-44d6-462c-a55b-140c2efceb6b-tls-assets\") pod \"alertmanager-metric-storage-0\" (UID: \"0701e4bf-44d6-462c-a55b-140c2efceb6b\") " pod="openstack/alertmanager-metric-storage-0" Jan 28 20:57:35 crc kubenswrapper[4746]: I0128 20:57:35.849853 4746 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"web-config\" (UniqueName: \"kubernetes.io/secret/0701e4bf-44d6-462c-a55b-140c2efceb6b-web-config\") pod \"alertmanager-metric-storage-0\" (UID: \"0701e4bf-44d6-462c-a55b-140c2efceb6b\") " pod="openstack/alertmanager-metric-storage-0" Jan 28 20:57:35 crc kubenswrapper[4746]: I0128 20:57:35.853260 4746 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-volume\" (UniqueName: \"kubernetes.io/secret/0701e4bf-44d6-462c-a55b-140c2efceb6b-config-volume\") pod \"alertmanager-metric-storage-0\" (UID: \"0701e4bf-44d6-462c-a55b-140c2efceb6b\") " pod="openstack/alertmanager-metric-storage-0" Jan 28 20:57:35 crc kubenswrapper[4746]: I0128 20:57:35.858741 4746 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cluster-tls-config\" (UniqueName: \"kubernetes.io/secret/0701e4bf-44d6-462c-a55b-140c2efceb6b-cluster-tls-config\") pod \"alertmanager-metric-storage-0\" (UID: \"0701e4bf-44d6-462c-a55b-140c2efceb6b\") " pod="openstack/alertmanager-metric-storage-0" Jan 28 20:57:35 crc kubenswrapper[4746]: I0128 20:57:35.886103 4746 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-nqswl\" (UniqueName: \"kubernetes.io/projected/0701e4bf-44d6-462c-a55b-140c2efceb6b-kube-api-access-nqswl\") pod \"alertmanager-metric-storage-0\" (UID: \"0701e4bf-44d6-462c-a55b-140c2efceb6b\") " pod="openstack/alertmanager-metric-storage-0" Jan 28 20:57:36 crc kubenswrapper[4746]: 
I0128 20:57:36.178308 4746 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/alertmanager-metric-storage-0" Jan 28 20:57:36 crc kubenswrapper[4746]: I0128 20:57:36.181958 4746 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/prometheus-metric-storage-0"] Jan 28 20:57:36 crc kubenswrapper[4746]: I0128 20:57:36.186516 4746 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/prometheus-metric-storage-0" Jan 28 20:57:36 crc kubenswrapper[4746]: I0128 20:57:36.189114 4746 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"prometheus-metric-storage-tls-assets-0" Jan 28 20:57:36 crc kubenswrapper[4746]: I0128 20:57:36.189138 4746 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"prometheus-metric-storage-thanos-prometheus-http-client-file" Jan 28 20:57:36 crc kubenswrapper[4746]: I0128 20:57:36.189158 4746 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"prometheus-metric-storage-web-config" Jan 28 20:57:36 crc kubenswrapper[4746]: I0128 20:57:36.189885 4746 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"prometheus-metric-storage-rulefiles-1" Jan 28 20:57:36 crc kubenswrapper[4746]: I0128 20:57:36.190134 4746 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"prometheus-metric-storage-rulefiles-0" Jan 28 20:57:36 crc kubenswrapper[4746]: I0128 20:57:36.190227 4746 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"metric-storage-prometheus-dockercfg-9bwbj" Jan 28 20:57:36 crc kubenswrapper[4746]: I0128 20:57:36.193038 4746 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"prometheus-metric-storage" Jan 28 20:57:36 crc kubenswrapper[4746]: I0128 20:57:36.195936 4746 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/prometheus-metric-storage-0"] Jan 28 20:57:36 crc kubenswrapper[4746]: I0128 
20:57:36.198267 4746 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"prometheus-metric-storage-rulefiles-2" Jan 28 20:57:36 crc kubenswrapper[4746]: I0128 20:57:36.335756 4746 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"prometheus-metric-storage-rulefiles-2\" (UniqueName: \"kubernetes.io/configmap/fde93743-7b9d-4175-abdf-bd74008cf4b0-prometheus-metric-storage-rulefiles-2\") pod \"prometheus-metric-storage-0\" (UID: \"fde93743-7b9d-4175-abdf-bd74008cf4b0\") " pod="openstack/prometheus-metric-storage-0" Jan 28 20:57:36 crc kubenswrapper[4746]: I0128 20:57:36.335809 4746 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"prometheus-metric-storage-rulefiles-0\" (UniqueName: \"kubernetes.io/configmap/fde93743-7b9d-4175-abdf-bd74008cf4b0-prometheus-metric-storage-rulefiles-0\") pod \"prometheus-metric-storage-0\" (UID: \"fde93743-7b9d-4175-abdf-bd74008cf4b0\") " pod="openstack/prometheus-metric-storage-0" Jan 28 20:57:36 crc kubenswrapper[4746]: I0128 20:57:36.335883 4746 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"prometheus-metric-storage-rulefiles-1\" (UniqueName: \"kubernetes.io/configmap/fde93743-7b9d-4175-abdf-bd74008cf4b0-prometheus-metric-storage-rulefiles-1\") pod \"prometheus-metric-storage-0\" (UID: \"fde93743-7b9d-4175-abdf-bd74008cf4b0\") " pod="openstack/prometheus-metric-storage-0" Jan 28 20:57:36 crc kubenswrapper[4746]: I0128 20:57:36.335945 4746 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"thanos-prometheus-http-client-file\" (UniqueName: \"kubernetes.io/secret/fde93743-7b9d-4175-abdf-bd74008cf4b0-thanos-prometheus-http-client-file\") pod \"prometheus-metric-storage-0\" (UID: \"fde93743-7b9d-4175-abdf-bd74008cf4b0\") " pod="openstack/prometheus-metric-storage-0" Jan 28 20:57:36 crc kubenswrapper[4746]: I0128 20:57:36.336005 
4746 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/secret/fde93743-7b9d-4175-abdf-bd74008cf4b0-config\") pod \"prometheus-metric-storage-0\" (UID: \"fde93743-7b9d-4175-abdf-bd74008cf4b0\") " pod="openstack/prometheus-metric-storage-0" Jan 28 20:57:36 crc kubenswrapper[4746]: I0128 20:57:36.336030 4746 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tls-assets\" (UniqueName: \"kubernetes.io/projected/fde93743-7b9d-4175-abdf-bd74008cf4b0-tls-assets\") pod \"prometheus-metric-storage-0\" (UID: \"fde93743-7b9d-4175-abdf-bd74008cf4b0\") " pod="openstack/prometheus-metric-storage-0" Jan 28 20:57:36 crc kubenswrapper[4746]: I0128 20:57:36.336045 4746 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-fd6fp\" (UniqueName: \"kubernetes.io/projected/fde93743-7b9d-4175-abdf-bd74008cf4b0-kube-api-access-fd6fp\") pod \"prometheus-metric-storage-0\" (UID: \"fde93743-7b9d-4175-abdf-bd74008cf4b0\") " pod="openstack/prometheus-metric-storage-0" Jan 28 20:57:36 crc kubenswrapper[4746]: I0128 20:57:36.336068 4746 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"web-config\" (UniqueName: \"kubernetes.io/secret/fde93743-7b9d-4175-abdf-bd74008cf4b0-web-config\") pod \"prometheus-metric-storage-0\" (UID: \"fde93743-7b9d-4175-abdf-bd74008cf4b0\") " pod="openstack/prometheus-metric-storage-0" Jan 28 20:57:36 crc kubenswrapper[4746]: I0128 20:57:36.336103 4746 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"pvc-f32ea73f-dc0c-45fe-af2d-08fac84e44a0\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-f32ea73f-dc0c-45fe-af2d-08fac84e44a0\") pod \"prometheus-metric-storage-0\" (UID: \"fde93743-7b9d-4175-abdf-bd74008cf4b0\") " pod="openstack/prometheus-metric-storage-0" Jan 28 
20:57:36 crc kubenswrapper[4746]: I0128 20:57:36.336331 4746 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-out\" (UniqueName: \"kubernetes.io/empty-dir/fde93743-7b9d-4175-abdf-bd74008cf4b0-config-out\") pod \"prometheus-metric-storage-0\" (UID: \"fde93743-7b9d-4175-abdf-bd74008cf4b0\") " pod="openstack/prometheus-metric-storage-0" Jan 28 20:57:36 crc kubenswrapper[4746]: I0128 20:57:36.437435 4746 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"prometheus-metric-storage-rulefiles-1\" (UniqueName: \"kubernetes.io/configmap/fde93743-7b9d-4175-abdf-bd74008cf4b0-prometheus-metric-storage-rulefiles-1\") pod \"prometheus-metric-storage-0\" (UID: \"fde93743-7b9d-4175-abdf-bd74008cf4b0\") " pod="openstack/prometheus-metric-storage-0" Jan 28 20:57:36 crc kubenswrapper[4746]: I0128 20:57:36.437493 4746 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"thanos-prometheus-http-client-file\" (UniqueName: \"kubernetes.io/secret/fde93743-7b9d-4175-abdf-bd74008cf4b0-thanos-prometheus-http-client-file\") pod \"prometheus-metric-storage-0\" (UID: \"fde93743-7b9d-4175-abdf-bd74008cf4b0\") " pod="openstack/prometheus-metric-storage-0" Jan 28 20:57:36 crc kubenswrapper[4746]: I0128 20:57:36.437517 4746 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/secret/fde93743-7b9d-4175-abdf-bd74008cf4b0-config\") pod \"prometheus-metric-storage-0\" (UID: \"fde93743-7b9d-4175-abdf-bd74008cf4b0\") " pod="openstack/prometheus-metric-storage-0" Jan 28 20:57:36 crc kubenswrapper[4746]: I0128 20:57:36.437541 4746 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"tls-assets\" (UniqueName: \"kubernetes.io/projected/fde93743-7b9d-4175-abdf-bd74008cf4b0-tls-assets\") pod \"prometheus-metric-storage-0\" (UID: \"fde93743-7b9d-4175-abdf-bd74008cf4b0\") " pod="openstack/prometheus-metric-storage-0" Jan 
28 20:57:36 crc kubenswrapper[4746]: I0128 20:57:36.437560 4746 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-fd6fp\" (UniqueName: \"kubernetes.io/projected/fde93743-7b9d-4175-abdf-bd74008cf4b0-kube-api-access-fd6fp\") pod \"prometheus-metric-storage-0\" (UID: \"fde93743-7b9d-4175-abdf-bd74008cf4b0\") " pod="openstack/prometheus-metric-storage-0" Jan 28 20:57:36 crc kubenswrapper[4746]: I0128 20:57:36.437583 4746 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"web-config\" (UniqueName: \"kubernetes.io/secret/fde93743-7b9d-4175-abdf-bd74008cf4b0-web-config\") pod \"prometheus-metric-storage-0\" (UID: \"fde93743-7b9d-4175-abdf-bd74008cf4b0\") " pod="openstack/prometheus-metric-storage-0" Jan 28 20:57:36 crc kubenswrapper[4746]: I0128 20:57:36.437929 4746 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-f32ea73f-dc0c-45fe-af2d-08fac84e44a0\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-f32ea73f-dc0c-45fe-af2d-08fac84e44a0\") pod \"prometheus-metric-storage-0\" (UID: \"fde93743-7b9d-4175-abdf-bd74008cf4b0\") " pod="openstack/prometheus-metric-storage-0" Jan 28 20:57:36 crc kubenswrapper[4746]: I0128 20:57:36.438431 4746 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"prometheus-metric-storage-rulefiles-1\" (UniqueName: \"kubernetes.io/configmap/fde93743-7b9d-4175-abdf-bd74008cf4b0-prometheus-metric-storage-rulefiles-1\") pod \"prometheus-metric-storage-0\" (UID: \"fde93743-7b9d-4175-abdf-bd74008cf4b0\") " pod="openstack/prometheus-metric-storage-0" Jan 28 20:57:36 crc kubenswrapper[4746]: I0128 20:57:36.439020 4746 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-out\" (UniqueName: \"kubernetes.io/empty-dir/fde93743-7b9d-4175-abdf-bd74008cf4b0-config-out\") pod \"prometheus-metric-storage-0\" (UID: \"fde93743-7b9d-4175-abdf-bd74008cf4b0\") " 
pod="openstack/prometheus-metric-storage-0" Jan 28 20:57:36 crc kubenswrapper[4746]: I0128 20:57:36.439109 4746 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"prometheus-metric-storage-rulefiles-2\" (UniqueName: \"kubernetes.io/configmap/fde93743-7b9d-4175-abdf-bd74008cf4b0-prometheus-metric-storage-rulefiles-2\") pod \"prometheus-metric-storage-0\" (UID: \"fde93743-7b9d-4175-abdf-bd74008cf4b0\") " pod="openstack/prometheus-metric-storage-0" Jan 28 20:57:36 crc kubenswrapper[4746]: I0128 20:57:36.439129 4746 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"prometheus-metric-storage-rulefiles-0\" (UniqueName: \"kubernetes.io/configmap/fde93743-7b9d-4175-abdf-bd74008cf4b0-prometheus-metric-storage-rulefiles-0\") pod \"prometheus-metric-storage-0\" (UID: \"fde93743-7b9d-4175-abdf-bd74008cf4b0\") " pod="openstack/prometheus-metric-storage-0" Jan 28 20:57:36 crc kubenswrapper[4746]: I0128 20:57:36.439663 4746 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"prometheus-metric-storage-rulefiles-0\" (UniqueName: \"kubernetes.io/configmap/fde93743-7b9d-4175-abdf-bd74008cf4b0-prometheus-metric-storage-rulefiles-0\") pod \"prometheus-metric-storage-0\" (UID: \"fde93743-7b9d-4175-abdf-bd74008cf4b0\") " pod="openstack/prometheus-metric-storage-0" Jan 28 20:57:36 crc kubenswrapper[4746]: I0128 20:57:36.441068 4746 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"prometheus-metric-storage-rulefiles-2\" (UniqueName: \"kubernetes.io/configmap/fde93743-7b9d-4175-abdf-bd74008cf4b0-prometheus-metric-storage-rulefiles-2\") pod \"prometheus-metric-storage-0\" (UID: \"fde93743-7b9d-4175-abdf-bd74008cf4b0\") " pod="openstack/prometheus-metric-storage-0" Jan 28 20:57:36 crc kubenswrapper[4746]: I0128 20:57:36.442683 4746 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"tls-assets\" (UniqueName: 
\"kubernetes.io/projected/fde93743-7b9d-4175-abdf-bd74008cf4b0-tls-assets\") pod \"prometheus-metric-storage-0\" (UID: \"fde93743-7b9d-4175-abdf-bd74008cf4b0\") " pod="openstack/prometheus-metric-storage-0" Jan 28 20:57:36 crc kubenswrapper[4746]: I0128 20:57:36.446498 4746 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/secret/fde93743-7b9d-4175-abdf-bd74008cf4b0-config\") pod \"prometheus-metric-storage-0\" (UID: \"fde93743-7b9d-4175-abdf-bd74008cf4b0\") " pod="openstack/prometheus-metric-storage-0" Jan 28 20:57:36 crc kubenswrapper[4746]: I0128 20:57:36.446859 4746 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-out\" (UniqueName: \"kubernetes.io/empty-dir/fde93743-7b9d-4175-abdf-bd74008cf4b0-config-out\") pod \"prometheus-metric-storage-0\" (UID: \"fde93743-7b9d-4175-abdf-bd74008cf4b0\") " pod="openstack/prometheus-metric-storage-0" Jan 28 20:57:36 crc kubenswrapper[4746]: I0128 20:57:36.451559 4746 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"thanos-prometheus-http-client-file\" (UniqueName: \"kubernetes.io/secret/fde93743-7b9d-4175-abdf-bd74008cf4b0-thanos-prometheus-http-client-file\") pod \"prometheus-metric-storage-0\" (UID: \"fde93743-7b9d-4175-abdf-bd74008cf4b0\") " pod="openstack/prometheus-metric-storage-0" Jan 28 20:57:36 crc kubenswrapper[4746]: I0128 20:57:36.455877 4746 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"web-config\" (UniqueName: \"kubernetes.io/secret/fde93743-7b9d-4175-abdf-bd74008cf4b0-web-config\") pod \"prometheus-metric-storage-0\" (UID: \"fde93743-7b9d-4175-abdf-bd74008cf4b0\") " pod="openstack/prometheus-metric-storage-0" Jan 28 20:57:36 crc kubenswrapper[4746]: I0128 20:57:36.465093 4746 csi_attacher.go:380] kubernetes.io/csi: attacher.MountDevice STAGE_UNSTAGE_VOLUME capability not set. Skipping MountDevice... 
Jan 28 20:57:36 crc kubenswrapper[4746]: I0128 20:57:36.465142 4746 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"pvc-f32ea73f-dc0c-45fe-af2d-08fac84e44a0\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-f32ea73f-dc0c-45fe-af2d-08fac84e44a0\") pod \"prometheus-metric-storage-0\" (UID: \"fde93743-7b9d-4175-abdf-bd74008cf4b0\") device mount path \"/var/lib/kubelet/plugins/kubernetes.io/csi/kubevirt.io.hostpath-provisioner/02b2347eca05efae332ac6f226bae6da2144dba7ab9a77b7543473de2684cdce/globalmount\"" pod="openstack/prometheus-metric-storage-0" Jan 28 20:57:36 crc kubenswrapper[4746]: I0128 20:57:36.474482 4746 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-fd6fp\" (UniqueName: \"kubernetes.io/projected/fde93743-7b9d-4175-abdf-bd74008cf4b0-kube-api-access-fd6fp\") pod \"prometheus-metric-storage-0\" (UID: \"fde93743-7b9d-4175-abdf-bd74008cf4b0\") " pod="openstack/prometheus-metric-storage-0" Jan 28 20:57:36 crc kubenswrapper[4746]: I0128 20:57:36.530367 4746 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"pvc-f32ea73f-dc0c-45fe-af2d-08fac84e44a0\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-f32ea73f-dc0c-45fe-af2d-08fac84e44a0\") pod \"prometheus-metric-storage-0\" (UID: \"fde93743-7b9d-4175-abdf-bd74008cf4b0\") " pod="openstack/prometheus-metric-storage-0" Jan 28 20:57:36 crc kubenswrapper[4746]: I0128 20:57:36.808724 4746 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/prometheus-metric-storage-0" Jan 28 20:57:38 crc kubenswrapper[4746]: I0128 20:57:38.969060 4746 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ovn-controller-ms9wc"] Jan 28 20:57:38 crc kubenswrapper[4746]: I0128 20:57:38.975467 4746 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/ovn-controller-ms9wc" Jan 28 20:57:38 crc kubenswrapper[4746]: I0128 20:57:38.978348 4746 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"ovncontroller-scripts" Jan 28 20:57:38 crc kubenswrapper[4746]: I0128 20:57:38.978650 4746 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-ovncontroller-ovndbs" Jan 28 20:57:38 crc kubenswrapper[4746]: I0128 20:57:38.978892 4746 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ovncontroller-ovncontroller-dockercfg-cpvh2" Jan 28 20:57:38 crc kubenswrapper[4746]: I0128 20:57:38.990005 4746 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ovn-controller-ovs-fcvh6"] Jan 28 20:57:38 crc kubenswrapper[4746]: I0128 20:57:38.991663 4746 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ovn-controller-ovs-fcvh6" Jan 28 20:57:39 crc kubenswrapper[4746]: I0128 20:57:39.003528 4746 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovn-controller-ms9wc"] Jan 28 20:57:39 crc kubenswrapper[4746]: I0128 20:57:39.040504 4746 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovn-controller-ovs-fcvh6"] Jan 28 20:57:39 crc kubenswrapper[4746]: I0128 20:57:39.086436 4746 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/2b1288d6-9c28-48e5-a97f-bdd75de9b8a2-scripts\") pod \"ovn-controller-ovs-fcvh6\" (UID: \"2b1288d6-9c28-48e5-a97f-bdd75de9b8a2\") " pod="openstack/ovn-controller-ovs-fcvh6" Jan 28 20:57:39 crc kubenswrapper[4746]: I0128 20:57:39.086518 4746 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovn-controller-tls-certs\" (UniqueName: \"kubernetes.io/secret/754a9c43-4753-41cd-945d-93f7fa2b715e-ovn-controller-tls-certs\") pod \"ovn-controller-ms9wc\" (UID: \"754a9c43-4753-41cd-945d-93f7fa2b715e\") " 
pod="openstack/ovn-controller-ms9wc" Jan 28 20:57:39 crc kubenswrapper[4746]: I0128 20:57:39.086543 4746 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-ovs\" (UniqueName: \"kubernetes.io/host-path/2b1288d6-9c28-48e5-a97f-bdd75de9b8a2-etc-ovs\") pod \"ovn-controller-ovs-fcvh6\" (UID: \"2b1288d6-9c28-48e5-a97f-bdd75de9b8a2\") " pod="openstack/ovn-controller-ovs-fcvh6" Jan 28 20:57:39 crc kubenswrapper[4746]: I0128 20:57:39.086563 4746 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-run-ovn\" (UniqueName: \"kubernetes.io/host-path/754a9c43-4753-41cd-945d-93f7fa2b715e-var-run-ovn\") pod \"ovn-controller-ms9wc\" (UID: \"754a9c43-4753-41cd-945d-93f7fa2b715e\") " pod="openstack/ovn-controller-ms9wc" Jan 28 20:57:39 crc kubenswrapper[4746]: I0128 20:57:39.086590 4746 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/754a9c43-4753-41cd-945d-93f7fa2b715e-combined-ca-bundle\") pod \"ovn-controller-ms9wc\" (UID: \"754a9c43-4753-41cd-945d-93f7fa2b715e\") " pod="openstack/ovn-controller-ms9wc" Jan 28 20:57:39 crc kubenswrapper[4746]: I0128 20:57:39.086619 4746 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-lib\" (UniqueName: \"kubernetes.io/host-path/2b1288d6-9c28-48e5-a97f-bdd75de9b8a2-var-lib\") pod \"ovn-controller-ovs-fcvh6\" (UID: \"2b1288d6-9c28-48e5-a97f-bdd75de9b8a2\") " pod="openstack/ovn-controller-ovs-fcvh6" Jan 28 20:57:39 crc kubenswrapper[4746]: I0128 20:57:39.086636 4746 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-run\" (UniqueName: \"kubernetes.io/host-path/2b1288d6-9c28-48e5-a97f-bdd75de9b8a2-var-run\") pod \"ovn-controller-ovs-fcvh6\" (UID: \"2b1288d6-9c28-48e5-a97f-bdd75de9b8a2\") " pod="openstack/ovn-controller-ovs-fcvh6" Jan 28 20:57:39 crc 
kubenswrapper[4746]: I0128 20:57:39.086730 4746 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-run\" (UniqueName: \"kubernetes.io/host-path/754a9c43-4753-41cd-945d-93f7fa2b715e-var-run\") pod \"ovn-controller-ms9wc\" (UID: \"754a9c43-4753-41cd-945d-93f7fa2b715e\") " pod="openstack/ovn-controller-ms9wc" Jan 28 20:57:39 crc kubenswrapper[4746]: I0128 20:57:39.086753 4746 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-5c6pf\" (UniqueName: \"kubernetes.io/projected/754a9c43-4753-41cd-945d-93f7fa2b715e-kube-api-access-5c6pf\") pod \"ovn-controller-ms9wc\" (UID: \"754a9c43-4753-41cd-945d-93f7fa2b715e\") " pod="openstack/ovn-controller-ms9wc" Jan 28 20:57:39 crc kubenswrapper[4746]: I0128 20:57:39.086768 4746 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-log\" (UniqueName: \"kubernetes.io/host-path/2b1288d6-9c28-48e5-a97f-bdd75de9b8a2-var-log\") pod \"ovn-controller-ovs-fcvh6\" (UID: \"2b1288d6-9c28-48e5-a97f-bdd75de9b8a2\") " pod="openstack/ovn-controller-ovs-fcvh6" Jan 28 20:57:39 crc kubenswrapper[4746]: I0128 20:57:39.086820 4746 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-log-ovn\" (UniqueName: \"kubernetes.io/host-path/754a9c43-4753-41cd-945d-93f7fa2b715e-var-log-ovn\") pod \"ovn-controller-ms9wc\" (UID: \"754a9c43-4753-41cd-945d-93f7fa2b715e\") " pod="openstack/ovn-controller-ms9wc" Jan 28 20:57:39 crc kubenswrapper[4746]: I0128 20:57:39.086884 4746 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/754a9c43-4753-41cd-945d-93f7fa2b715e-scripts\") pod \"ovn-controller-ms9wc\" (UID: \"754a9c43-4753-41cd-945d-93f7fa2b715e\") " pod="openstack/ovn-controller-ms9wc" Jan 28 20:57:39 crc kubenswrapper[4746]: I0128 20:57:39.086909 4746 
reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-jx77j\" (UniqueName: \"kubernetes.io/projected/2b1288d6-9c28-48e5-a97f-bdd75de9b8a2-kube-api-access-jx77j\") pod \"ovn-controller-ovs-fcvh6\" (UID: \"2b1288d6-9c28-48e5-a97f-bdd75de9b8a2\") " pod="openstack/ovn-controller-ovs-fcvh6" Jan 28 20:57:39 crc kubenswrapper[4746]: I0128 20:57:39.188182 4746 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-log-ovn\" (UniqueName: \"kubernetes.io/host-path/754a9c43-4753-41cd-945d-93f7fa2b715e-var-log-ovn\") pod \"ovn-controller-ms9wc\" (UID: \"754a9c43-4753-41cd-945d-93f7fa2b715e\") " pod="openstack/ovn-controller-ms9wc" Jan 28 20:57:39 crc kubenswrapper[4746]: I0128 20:57:39.188249 4746 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/754a9c43-4753-41cd-945d-93f7fa2b715e-scripts\") pod \"ovn-controller-ms9wc\" (UID: \"754a9c43-4753-41cd-945d-93f7fa2b715e\") " pod="openstack/ovn-controller-ms9wc" Jan 28 20:57:39 crc kubenswrapper[4746]: I0128 20:57:39.188279 4746 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-jx77j\" (UniqueName: \"kubernetes.io/projected/2b1288d6-9c28-48e5-a97f-bdd75de9b8a2-kube-api-access-jx77j\") pod \"ovn-controller-ovs-fcvh6\" (UID: \"2b1288d6-9c28-48e5-a97f-bdd75de9b8a2\") " pod="openstack/ovn-controller-ovs-fcvh6" Jan 28 20:57:39 crc kubenswrapper[4746]: I0128 20:57:39.188316 4746 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/2b1288d6-9c28-48e5-a97f-bdd75de9b8a2-scripts\") pod \"ovn-controller-ovs-fcvh6\" (UID: \"2b1288d6-9c28-48e5-a97f-bdd75de9b8a2\") " pod="openstack/ovn-controller-ovs-fcvh6" Jan 28 20:57:39 crc kubenswrapper[4746]: I0128 20:57:39.188339 4746 reconciler_common.go:218] "operationExecutor.MountVolume started for volume 
\"ovn-controller-tls-certs\" (UniqueName: \"kubernetes.io/secret/754a9c43-4753-41cd-945d-93f7fa2b715e-ovn-controller-tls-certs\") pod \"ovn-controller-ms9wc\" (UID: \"754a9c43-4753-41cd-945d-93f7fa2b715e\") " pod="openstack/ovn-controller-ms9wc" Jan 28 20:57:39 crc kubenswrapper[4746]: I0128 20:57:39.188362 4746 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-ovs\" (UniqueName: \"kubernetes.io/host-path/2b1288d6-9c28-48e5-a97f-bdd75de9b8a2-etc-ovs\") pod \"ovn-controller-ovs-fcvh6\" (UID: \"2b1288d6-9c28-48e5-a97f-bdd75de9b8a2\") " pod="openstack/ovn-controller-ovs-fcvh6" Jan 28 20:57:39 crc kubenswrapper[4746]: I0128 20:57:39.188387 4746 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-run-ovn\" (UniqueName: \"kubernetes.io/host-path/754a9c43-4753-41cd-945d-93f7fa2b715e-var-run-ovn\") pod \"ovn-controller-ms9wc\" (UID: \"754a9c43-4753-41cd-945d-93f7fa2b715e\") " pod="openstack/ovn-controller-ms9wc" Jan 28 20:57:39 crc kubenswrapper[4746]: I0128 20:57:39.188417 4746 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/754a9c43-4753-41cd-945d-93f7fa2b715e-combined-ca-bundle\") pod \"ovn-controller-ms9wc\" (UID: \"754a9c43-4753-41cd-945d-93f7fa2b715e\") " pod="openstack/ovn-controller-ms9wc" Jan 28 20:57:39 crc kubenswrapper[4746]: I0128 20:57:39.188451 4746 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-lib\" (UniqueName: \"kubernetes.io/host-path/2b1288d6-9c28-48e5-a97f-bdd75de9b8a2-var-lib\") pod \"ovn-controller-ovs-fcvh6\" (UID: \"2b1288d6-9c28-48e5-a97f-bdd75de9b8a2\") " pod="openstack/ovn-controller-ovs-fcvh6" Jan 28 20:57:39 crc kubenswrapper[4746]: I0128 20:57:39.188473 4746 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-run\" (UniqueName: \"kubernetes.io/host-path/2b1288d6-9c28-48e5-a97f-bdd75de9b8a2-var-run\") pod 
\"ovn-controller-ovs-fcvh6\" (UID: \"2b1288d6-9c28-48e5-a97f-bdd75de9b8a2\") " pod="openstack/ovn-controller-ovs-fcvh6" Jan 28 20:57:39 crc kubenswrapper[4746]: I0128 20:57:39.188515 4746 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-run\" (UniqueName: \"kubernetes.io/host-path/754a9c43-4753-41cd-945d-93f7fa2b715e-var-run\") pod \"ovn-controller-ms9wc\" (UID: \"754a9c43-4753-41cd-945d-93f7fa2b715e\") " pod="openstack/ovn-controller-ms9wc" Jan 28 20:57:39 crc kubenswrapper[4746]: I0128 20:57:39.188546 4746 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-5c6pf\" (UniqueName: \"kubernetes.io/projected/754a9c43-4753-41cd-945d-93f7fa2b715e-kube-api-access-5c6pf\") pod \"ovn-controller-ms9wc\" (UID: \"754a9c43-4753-41cd-945d-93f7fa2b715e\") " pod="openstack/ovn-controller-ms9wc" Jan 28 20:57:39 crc kubenswrapper[4746]: I0128 20:57:39.188568 4746 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-log\" (UniqueName: \"kubernetes.io/host-path/2b1288d6-9c28-48e5-a97f-bdd75de9b8a2-var-log\") pod \"ovn-controller-ovs-fcvh6\" (UID: \"2b1288d6-9c28-48e5-a97f-bdd75de9b8a2\") " pod="openstack/ovn-controller-ovs-fcvh6" Jan 28 20:57:39 crc kubenswrapper[4746]: I0128 20:57:39.189227 4746 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-log\" (UniqueName: \"kubernetes.io/host-path/2b1288d6-9c28-48e5-a97f-bdd75de9b8a2-var-log\") pod \"ovn-controller-ovs-fcvh6\" (UID: \"2b1288d6-9c28-48e5-a97f-bdd75de9b8a2\") " pod="openstack/ovn-controller-ovs-fcvh6" Jan 28 20:57:39 crc kubenswrapper[4746]: I0128 20:57:39.189344 4746 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-log-ovn\" (UniqueName: \"kubernetes.io/host-path/754a9c43-4753-41cd-945d-93f7fa2b715e-var-log-ovn\") pod \"ovn-controller-ms9wc\" (UID: \"754a9c43-4753-41cd-945d-93f7fa2b715e\") " pod="openstack/ovn-controller-ms9wc" Jan 28 20:57:39 crc kubenswrapper[4746]: I0128 
20:57:39.191651 4746 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/754a9c43-4753-41cd-945d-93f7fa2b715e-scripts\") pod \"ovn-controller-ms9wc\" (UID: \"754a9c43-4753-41cd-945d-93f7fa2b715e\") " pod="openstack/ovn-controller-ms9wc" Jan 28 20:57:39 crc kubenswrapper[4746]: I0128 20:57:39.193887 4746 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/2b1288d6-9c28-48e5-a97f-bdd75de9b8a2-scripts\") pod \"ovn-controller-ovs-fcvh6\" (UID: \"2b1288d6-9c28-48e5-a97f-bdd75de9b8a2\") " pod="openstack/ovn-controller-ovs-fcvh6" Jan 28 20:57:39 crc kubenswrapper[4746]: I0128 20:57:39.194769 4746 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-lib\" (UniqueName: \"kubernetes.io/host-path/2b1288d6-9c28-48e5-a97f-bdd75de9b8a2-var-lib\") pod \"ovn-controller-ovs-fcvh6\" (UID: \"2b1288d6-9c28-48e5-a97f-bdd75de9b8a2\") " pod="openstack/ovn-controller-ovs-fcvh6" Jan 28 20:57:39 crc kubenswrapper[4746]: I0128 20:57:39.194866 4746 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-run-ovn\" (UniqueName: \"kubernetes.io/host-path/754a9c43-4753-41cd-945d-93f7fa2b715e-var-run-ovn\") pod \"ovn-controller-ms9wc\" (UID: \"754a9c43-4753-41cd-945d-93f7fa2b715e\") " pod="openstack/ovn-controller-ms9wc" Jan 28 20:57:39 crc kubenswrapper[4746]: I0128 20:57:39.194969 4746 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-run\" (UniqueName: \"kubernetes.io/host-path/754a9c43-4753-41cd-945d-93f7fa2b715e-var-run\") pod \"ovn-controller-ms9wc\" (UID: \"754a9c43-4753-41cd-945d-93f7fa2b715e\") " pod="openstack/ovn-controller-ms9wc" Jan 28 20:57:39 crc kubenswrapper[4746]: I0128 20:57:39.195023 4746 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-run\" (UniqueName: \"kubernetes.io/host-path/2b1288d6-9c28-48e5-a97f-bdd75de9b8a2-var-run\") pod \"ovn-controller-ovs-fcvh6\" 
(UID: \"2b1288d6-9c28-48e5-a97f-bdd75de9b8a2\") " pod="openstack/ovn-controller-ovs-fcvh6" Jan 28 20:57:39 crc kubenswrapper[4746]: I0128 20:57:39.195019 4746 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-ovs\" (UniqueName: \"kubernetes.io/host-path/2b1288d6-9c28-48e5-a97f-bdd75de9b8a2-etc-ovs\") pod \"ovn-controller-ovs-fcvh6\" (UID: \"2b1288d6-9c28-48e5-a97f-bdd75de9b8a2\") " pod="openstack/ovn-controller-ovs-fcvh6" Jan 28 20:57:39 crc kubenswrapper[4746]: I0128 20:57:39.200968 4746 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/754a9c43-4753-41cd-945d-93f7fa2b715e-combined-ca-bundle\") pod \"ovn-controller-ms9wc\" (UID: \"754a9c43-4753-41cd-945d-93f7fa2b715e\") " pod="openstack/ovn-controller-ms9wc" Jan 28 20:57:39 crc kubenswrapper[4746]: I0128 20:57:39.200970 4746 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovn-controller-tls-certs\" (UniqueName: \"kubernetes.io/secret/754a9c43-4753-41cd-945d-93f7fa2b715e-ovn-controller-tls-certs\") pod \"ovn-controller-ms9wc\" (UID: \"754a9c43-4753-41cd-945d-93f7fa2b715e\") " pod="openstack/ovn-controller-ms9wc" Jan 28 20:57:39 crc kubenswrapper[4746]: I0128 20:57:39.212754 4746 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-jx77j\" (UniqueName: \"kubernetes.io/projected/2b1288d6-9c28-48e5-a97f-bdd75de9b8a2-kube-api-access-jx77j\") pod \"ovn-controller-ovs-fcvh6\" (UID: \"2b1288d6-9c28-48e5-a97f-bdd75de9b8a2\") " pod="openstack/ovn-controller-ovs-fcvh6" Jan 28 20:57:39 crc kubenswrapper[4746]: I0128 20:57:39.216252 4746 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-5c6pf\" (UniqueName: \"kubernetes.io/projected/754a9c43-4753-41cd-945d-93f7fa2b715e-kube-api-access-5c6pf\") pod \"ovn-controller-ms9wc\" (UID: \"754a9c43-4753-41cd-945d-93f7fa2b715e\") " pod="openstack/ovn-controller-ms9wc" Jan 28 20:57:39 crc 
kubenswrapper[4746]: I0128 20:57:39.327876 4746 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ovn-controller-ms9wc" Jan 28 20:57:39 crc kubenswrapper[4746]: I0128 20:57:39.340022 4746 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ovn-controller-ovs-fcvh6" Jan 28 20:57:39 crc kubenswrapper[4746]: I0128 20:57:39.647972 4746 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ovsdbserver-nb-0"] Jan 28 20:57:39 crc kubenswrapper[4746]: I0128 20:57:39.649241 4746 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ovsdbserver-nb-0" Jan 28 20:57:39 crc kubenswrapper[4746]: I0128 20:57:39.652865 4746 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"ovndbcluster-nb-config" Jan 28 20:57:39 crc kubenswrapper[4746]: I0128 20:57:39.653046 4746 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ovncluster-ovndbcluster-nb-dockercfg-wwrw9" Jan 28 20:57:39 crc kubenswrapper[4746]: I0128 20:57:39.653698 4746 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-ovn-metrics" Jan 28 20:57:39 crc kubenswrapper[4746]: I0128 20:57:39.653847 4746 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-ovndbcluster-nb-ovndbs" Jan 28 20:57:39 crc kubenswrapper[4746]: I0128 20:57:39.654150 4746 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"ovndbcluster-nb-scripts" Jan 28 20:57:39 crc kubenswrapper[4746]: I0128 20:57:39.670904 4746 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovsdbserver-nb-0"] Jan 28 20:57:39 crc kubenswrapper[4746]: I0128 20:57:39.712451 4746 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-server-0" event={"ID":"88718387-09d6-4e3d-a06f-4353ba42ce91","Type":"ContainerStarted","Data":"64fd52a5f6f1b19040c4a7e9bea22fab671a685a5e6979c081c503d6ab6812c7"} Jan 28 20:57:39 crc 
kubenswrapper[4746]: I0128 20:57:39.713492 4746 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-cell1-server-0" event={"ID":"701360b2-121a-4cb4-9a4f-9ce63391e740","Type":"ContainerStarted","Data":"288c0fefb19fd38219e1688e0284a2c2b633a493917d689a3750d7b80adf7b08"} Jan 28 20:57:39 crc kubenswrapper[4746]: I0128 20:57:39.804703 4746 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/d1e0be80-baed-4c8f-affd-33a252b527ad-config\") pod \"ovsdbserver-nb-0\" (UID: \"d1e0be80-baed-4c8f-affd-33a252b527ad\") " pod="openstack/ovsdbserver-nb-0" Jan 28 20:57:39 crc kubenswrapper[4746]: I0128 20:57:39.804793 4746 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-7ng5q\" (UniqueName: \"kubernetes.io/projected/d1e0be80-baed-4c8f-affd-33a252b527ad-kube-api-access-7ng5q\") pod \"ovsdbserver-nb-0\" (UID: \"d1e0be80-baed-4c8f-affd-33a252b527ad\") " pod="openstack/ovsdbserver-nb-0" Jan 28 20:57:39 crc kubenswrapper[4746]: I0128 20:57:39.804826 4746 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdb-rundir\" (UniqueName: \"kubernetes.io/empty-dir/d1e0be80-baed-4c8f-affd-33a252b527ad-ovsdb-rundir\") pod \"ovsdbserver-nb-0\" (UID: \"d1e0be80-baed-4c8f-affd-33a252b527ad\") " pod="openstack/ovsdbserver-nb-0" Jan 28 20:57:39 crc kubenswrapper[4746]: I0128 20:57:39.804864 4746 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d1e0be80-baed-4c8f-affd-33a252b527ad-combined-ca-bundle\") pod \"ovsdbserver-nb-0\" (UID: \"d1e0be80-baed-4c8f-affd-33a252b527ad\") " pod="openstack/ovsdbserver-nb-0" Jan 28 20:57:39 crc kubenswrapper[4746]: I0128 20:57:39.804901 4746 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume 
\"ovsdbserver-nb-tls-certs\" (UniqueName: \"kubernetes.io/secret/d1e0be80-baed-4c8f-affd-33a252b527ad-ovsdbserver-nb-tls-certs\") pod \"ovsdbserver-nb-0\" (UID: \"d1e0be80-baed-4c8f-affd-33a252b527ad\") " pod="openstack/ovsdbserver-nb-0" Jan 28 20:57:39 crc kubenswrapper[4746]: I0128 20:57:39.804931 4746 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-certs-tls-certs\" (UniqueName: \"kubernetes.io/secret/d1e0be80-baed-4c8f-affd-33a252b527ad-metrics-certs-tls-certs\") pod \"ovsdbserver-nb-0\" (UID: \"d1e0be80-baed-4c8f-affd-33a252b527ad\") " pod="openstack/ovsdbserver-nb-0" Jan 28 20:57:39 crc kubenswrapper[4746]: I0128 20:57:39.804975 4746 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"pvc-f1dba101-a5a3-465f-b82f-7071cde7e786\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-f1dba101-a5a3-465f-b82f-7071cde7e786\") pod \"ovsdbserver-nb-0\" (UID: \"d1e0be80-baed-4c8f-affd-33a252b527ad\") " pod="openstack/ovsdbserver-nb-0" Jan 28 20:57:39 crc kubenswrapper[4746]: I0128 20:57:39.805225 4746 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/d1e0be80-baed-4c8f-affd-33a252b527ad-scripts\") pod \"ovsdbserver-nb-0\" (UID: \"d1e0be80-baed-4c8f-affd-33a252b527ad\") " pod="openstack/ovsdbserver-nb-0" Jan 28 20:57:39 crc kubenswrapper[4746]: I0128 20:57:39.906331 4746 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/d1e0be80-baed-4c8f-affd-33a252b527ad-scripts\") pod \"ovsdbserver-nb-0\" (UID: \"d1e0be80-baed-4c8f-affd-33a252b527ad\") " pod="openstack/ovsdbserver-nb-0" Jan 28 20:57:39 crc kubenswrapper[4746]: I0128 20:57:39.907587 4746 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: 
\"kubernetes.io/configmap/d1e0be80-baed-4c8f-affd-33a252b527ad-scripts\") pod \"ovsdbserver-nb-0\" (UID: \"d1e0be80-baed-4c8f-affd-33a252b527ad\") " pod="openstack/ovsdbserver-nb-0" Jan 28 20:57:39 crc kubenswrapper[4746]: I0128 20:57:39.908204 4746 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/d1e0be80-baed-4c8f-affd-33a252b527ad-config\") pod \"ovsdbserver-nb-0\" (UID: \"d1e0be80-baed-4c8f-affd-33a252b527ad\") " pod="openstack/ovsdbserver-nb-0" Jan 28 20:57:39 crc kubenswrapper[4746]: I0128 20:57:39.908653 4746 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-7ng5q\" (UniqueName: \"kubernetes.io/projected/d1e0be80-baed-4c8f-affd-33a252b527ad-kube-api-access-7ng5q\") pod \"ovsdbserver-nb-0\" (UID: \"d1e0be80-baed-4c8f-affd-33a252b527ad\") " pod="openstack/ovsdbserver-nb-0" Jan 28 20:57:39 crc kubenswrapper[4746]: I0128 20:57:39.908692 4746 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdb-rundir\" (UniqueName: \"kubernetes.io/empty-dir/d1e0be80-baed-4c8f-affd-33a252b527ad-ovsdb-rundir\") pod \"ovsdbserver-nb-0\" (UID: \"d1e0be80-baed-4c8f-affd-33a252b527ad\") " pod="openstack/ovsdbserver-nb-0" Jan 28 20:57:39 crc kubenswrapper[4746]: I0128 20:57:39.908735 4746 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d1e0be80-baed-4c8f-affd-33a252b527ad-combined-ca-bundle\") pod \"ovsdbserver-nb-0\" (UID: \"d1e0be80-baed-4c8f-affd-33a252b527ad\") " pod="openstack/ovsdbserver-nb-0" Jan 28 20:57:39 crc kubenswrapper[4746]: I0128 20:57:39.908781 4746 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-nb-tls-certs\" (UniqueName: \"kubernetes.io/secret/d1e0be80-baed-4c8f-affd-33a252b527ad-ovsdbserver-nb-tls-certs\") pod \"ovsdbserver-nb-0\" (UID: \"d1e0be80-baed-4c8f-affd-33a252b527ad\") " 
pod="openstack/ovsdbserver-nb-0" Jan 28 20:57:39 crc kubenswrapper[4746]: I0128 20:57:39.908807 4746 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs-tls-certs\" (UniqueName: \"kubernetes.io/secret/d1e0be80-baed-4c8f-affd-33a252b527ad-metrics-certs-tls-certs\") pod \"ovsdbserver-nb-0\" (UID: \"d1e0be80-baed-4c8f-affd-33a252b527ad\") " pod="openstack/ovsdbserver-nb-0" Jan 28 20:57:39 crc kubenswrapper[4746]: I0128 20:57:39.908839 4746 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-f1dba101-a5a3-465f-b82f-7071cde7e786\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-f1dba101-a5a3-465f-b82f-7071cde7e786\") pod \"ovsdbserver-nb-0\" (UID: \"d1e0be80-baed-4c8f-affd-33a252b527ad\") " pod="openstack/ovsdbserver-nb-0" Jan 28 20:57:39 crc kubenswrapper[4746]: I0128 20:57:39.909301 4746 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdb-rundir\" (UniqueName: \"kubernetes.io/empty-dir/d1e0be80-baed-4c8f-affd-33a252b527ad-ovsdb-rundir\") pod \"ovsdbserver-nb-0\" (UID: \"d1e0be80-baed-4c8f-affd-33a252b527ad\") " pod="openstack/ovsdbserver-nb-0" Jan 28 20:57:39 crc kubenswrapper[4746]: I0128 20:57:39.909959 4746 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/d1e0be80-baed-4c8f-affd-33a252b527ad-config\") pod \"ovsdbserver-nb-0\" (UID: \"d1e0be80-baed-4c8f-affd-33a252b527ad\") " pod="openstack/ovsdbserver-nb-0" Jan 28 20:57:39 crc kubenswrapper[4746]: I0128 20:57:39.914237 4746 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-nb-tls-certs\" (UniqueName: \"kubernetes.io/secret/d1e0be80-baed-4c8f-affd-33a252b527ad-ovsdbserver-nb-tls-certs\") pod \"ovsdbserver-nb-0\" (UID: \"d1e0be80-baed-4c8f-affd-33a252b527ad\") " pod="openstack/ovsdbserver-nb-0" Jan 28 20:57:39 crc kubenswrapper[4746]: I0128 20:57:39.914590 4746 operation_generator.go:637] 
"MountVolume.SetUp succeeded for volume \"metrics-certs-tls-certs\" (UniqueName: \"kubernetes.io/secret/d1e0be80-baed-4c8f-affd-33a252b527ad-metrics-certs-tls-certs\") pod \"ovsdbserver-nb-0\" (UID: \"d1e0be80-baed-4c8f-affd-33a252b527ad\") " pod="openstack/ovsdbserver-nb-0" Jan 28 20:57:39 crc kubenswrapper[4746]: I0128 20:57:39.915960 4746 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d1e0be80-baed-4c8f-affd-33a252b527ad-combined-ca-bundle\") pod \"ovsdbserver-nb-0\" (UID: \"d1e0be80-baed-4c8f-affd-33a252b527ad\") " pod="openstack/ovsdbserver-nb-0" Jan 28 20:57:39 crc kubenswrapper[4746]: I0128 20:57:39.915992 4746 csi_attacher.go:380] kubernetes.io/csi: attacher.MountDevice STAGE_UNSTAGE_VOLUME capability not set. Skipping MountDevice... Jan 28 20:57:39 crc kubenswrapper[4746]: I0128 20:57:39.916036 4746 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"pvc-f1dba101-a5a3-465f-b82f-7071cde7e786\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-f1dba101-a5a3-465f-b82f-7071cde7e786\") pod \"ovsdbserver-nb-0\" (UID: \"d1e0be80-baed-4c8f-affd-33a252b527ad\") device mount path \"/var/lib/kubelet/plugins/kubernetes.io/csi/kubevirt.io.hostpath-provisioner/aa10d0d47615218621245b919aa9e6dee3f3395e2a7df7eb67a21b861b23fa97/globalmount\"" pod="openstack/ovsdbserver-nb-0" Jan 28 20:57:39 crc kubenswrapper[4746]: I0128 20:57:39.958027 4746 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-7ng5q\" (UniqueName: \"kubernetes.io/projected/d1e0be80-baed-4c8f-affd-33a252b527ad-kube-api-access-7ng5q\") pod \"ovsdbserver-nb-0\" (UID: \"d1e0be80-baed-4c8f-affd-33a252b527ad\") " pod="openstack/ovsdbserver-nb-0" Jan 28 20:57:39 crc kubenswrapper[4746]: I0128 20:57:39.963470 4746 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"pvc-f1dba101-a5a3-465f-b82f-7071cde7e786\" (UniqueName: 
\"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-f1dba101-a5a3-465f-b82f-7071cde7e786\") pod \"ovsdbserver-nb-0\" (UID: \"d1e0be80-baed-4c8f-affd-33a252b527ad\") " pod="openstack/ovsdbserver-nb-0" Jan 28 20:57:39 crc kubenswrapper[4746]: I0128 20:57:39.974598 4746 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ovsdbserver-nb-0" Jan 28 20:57:43 crc kubenswrapper[4746]: I0128 20:57:43.281919 4746 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ovsdbserver-sb-0"] Jan 28 20:57:43 crc kubenswrapper[4746]: I0128 20:57:43.283597 4746 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ovsdbserver-sb-0" Jan 28 20:57:43 crc kubenswrapper[4746]: I0128 20:57:43.287079 4746 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-ovndbcluster-sb-ovndbs" Jan 28 20:57:43 crc kubenswrapper[4746]: I0128 20:57:43.287468 4746 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"ovndbcluster-sb-scripts" Jan 28 20:57:43 crc kubenswrapper[4746]: I0128 20:57:43.288105 4746 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ovncluster-ovndbcluster-sb-dockercfg-rpsqd" Jan 28 20:57:43 crc kubenswrapper[4746]: I0128 20:57:43.288261 4746 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"ovndbcluster-sb-config" Jan 28 20:57:43 crc kubenswrapper[4746]: I0128 20:57:43.307556 4746 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovsdbserver-sb-0"] Jan 28 20:57:43 crc kubenswrapper[4746]: I0128 20:57:43.460164 4746 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/692d10ed-801f-47d2-b069-b3a0cb8dc4b7-combined-ca-bundle\") pod \"ovsdbserver-sb-0\" (UID: \"692d10ed-801f-47d2-b069-b3a0cb8dc4b7\") " pod="openstack/ovsdbserver-sb-0" Jan 28 20:57:43 crc kubenswrapper[4746]: I0128 
20:57:43.460217 4746 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/692d10ed-801f-47d2-b069-b3a0cb8dc4b7-scripts\") pod \"ovsdbserver-sb-0\" (UID: \"692d10ed-801f-47d2-b069-b3a0cb8dc4b7\") " pod="openstack/ovsdbserver-sb-0" Jan 28 20:57:43 crc kubenswrapper[4746]: I0128 20:57:43.460253 4746 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-certs-tls-certs\" (UniqueName: \"kubernetes.io/secret/692d10ed-801f-47d2-b069-b3a0cb8dc4b7-metrics-certs-tls-certs\") pod \"ovsdbserver-sb-0\" (UID: \"692d10ed-801f-47d2-b069-b3a0cb8dc4b7\") " pod="openstack/ovsdbserver-sb-0" Jan 28 20:57:43 crc kubenswrapper[4746]: I0128 20:57:43.460276 4746 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"pvc-575f7911-53da-432c-8bdc-cb90b4422b8a\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-575f7911-53da-432c-8bdc-cb90b4422b8a\") pod \"ovsdbserver-sb-0\" (UID: \"692d10ed-801f-47d2-b069-b3a0cb8dc4b7\") " pod="openstack/ovsdbserver-sb-0" Jan 28 20:57:43 crc kubenswrapper[4746]: I0128 20:57:43.460383 4746 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdb-rundir\" (UniqueName: \"kubernetes.io/empty-dir/692d10ed-801f-47d2-b069-b3a0cb8dc4b7-ovsdb-rundir\") pod \"ovsdbserver-sb-0\" (UID: \"692d10ed-801f-47d2-b069-b3a0cb8dc4b7\") " pod="openstack/ovsdbserver-sb-0" Jan 28 20:57:43 crc kubenswrapper[4746]: I0128 20:57:43.460439 4746 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-mn8fd\" (UniqueName: \"kubernetes.io/projected/692d10ed-801f-47d2-b069-b3a0cb8dc4b7-kube-api-access-mn8fd\") pod \"ovsdbserver-sb-0\" (UID: \"692d10ed-801f-47d2-b069-b3a0cb8dc4b7\") " pod="openstack/ovsdbserver-sb-0" Jan 28 20:57:43 crc kubenswrapper[4746]: I0128 20:57:43.460525 4746 
reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/692d10ed-801f-47d2-b069-b3a0cb8dc4b7-config\") pod \"ovsdbserver-sb-0\" (UID: \"692d10ed-801f-47d2-b069-b3a0cb8dc4b7\") " pod="openstack/ovsdbserver-sb-0" Jan 28 20:57:43 crc kubenswrapper[4746]: I0128 20:57:43.460624 4746 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-sb-tls-certs\" (UniqueName: \"kubernetes.io/secret/692d10ed-801f-47d2-b069-b3a0cb8dc4b7-ovsdbserver-sb-tls-certs\") pod \"ovsdbserver-sb-0\" (UID: \"692d10ed-801f-47d2-b069-b3a0cb8dc4b7\") " pod="openstack/ovsdbserver-sb-0" Jan 28 20:57:43 crc kubenswrapper[4746]: I0128 20:57:43.562312 4746 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/692d10ed-801f-47d2-b069-b3a0cb8dc4b7-config\") pod \"ovsdbserver-sb-0\" (UID: \"692d10ed-801f-47d2-b069-b3a0cb8dc4b7\") " pod="openstack/ovsdbserver-sb-0" Jan 28 20:57:43 crc kubenswrapper[4746]: I0128 20:57:43.562424 4746 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-sb-tls-certs\" (UniqueName: \"kubernetes.io/secret/692d10ed-801f-47d2-b069-b3a0cb8dc4b7-ovsdbserver-sb-tls-certs\") pod \"ovsdbserver-sb-0\" (UID: \"692d10ed-801f-47d2-b069-b3a0cb8dc4b7\") " pod="openstack/ovsdbserver-sb-0" Jan 28 20:57:43 crc kubenswrapper[4746]: I0128 20:57:43.562617 4746 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/692d10ed-801f-47d2-b069-b3a0cb8dc4b7-combined-ca-bundle\") pod \"ovsdbserver-sb-0\" (UID: \"692d10ed-801f-47d2-b069-b3a0cb8dc4b7\") " pod="openstack/ovsdbserver-sb-0" Jan 28 20:57:43 crc kubenswrapper[4746]: I0128 20:57:43.562659 4746 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: 
\"kubernetes.io/configmap/692d10ed-801f-47d2-b069-b3a0cb8dc4b7-scripts\") pod \"ovsdbserver-sb-0\" (UID: \"692d10ed-801f-47d2-b069-b3a0cb8dc4b7\") " pod="openstack/ovsdbserver-sb-0" Jan 28 20:57:43 crc kubenswrapper[4746]: I0128 20:57:43.563478 4746 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs-tls-certs\" (UniqueName: \"kubernetes.io/secret/692d10ed-801f-47d2-b069-b3a0cb8dc4b7-metrics-certs-tls-certs\") pod \"ovsdbserver-sb-0\" (UID: \"692d10ed-801f-47d2-b069-b3a0cb8dc4b7\") " pod="openstack/ovsdbserver-sb-0" Jan 28 20:57:43 crc kubenswrapper[4746]: I0128 20:57:43.563507 4746 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-575f7911-53da-432c-8bdc-cb90b4422b8a\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-575f7911-53da-432c-8bdc-cb90b4422b8a\") pod \"ovsdbserver-sb-0\" (UID: \"692d10ed-801f-47d2-b069-b3a0cb8dc4b7\") " pod="openstack/ovsdbserver-sb-0" Jan 28 20:57:43 crc kubenswrapper[4746]: I0128 20:57:43.563410 4746 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/692d10ed-801f-47d2-b069-b3a0cb8dc4b7-config\") pod \"ovsdbserver-sb-0\" (UID: \"692d10ed-801f-47d2-b069-b3a0cb8dc4b7\") " pod="openstack/ovsdbserver-sb-0" Jan 28 20:57:43 crc kubenswrapper[4746]: I0128 20:57:43.563538 4746 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdb-rundir\" (UniqueName: \"kubernetes.io/empty-dir/692d10ed-801f-47d2-b069-b3a0cb8dc4b7-ovsdb-rundir\") pod \"ovsdbserver-sb-0\" (UID: \"692d10ed-801f-47d2-b069-b3a0cb8dc4b7\") " pod="openstack/ovsdbserver-sb-0" Jan 28 20:57:43 crc kubenswrapper[4746]: I0128 20:57:43.563605 4746 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-mn8fd\" (UniqueName: \"kubernetes.io/projected/692d10ed-801f-47d2-b069-b3a0cb8dc4b7-kube-api-access-mn8fd\") pod \"ovsdbserver-sb-0\" (UID: 
\"692d10ed-801f-47d2-b069-b3a0cb8dc4b7\") " pod="openstack/ovsdbserver-sb-0" Jan 28 20:57:43 crc kubenswrapper[4746]: I0128 20:57:43.563866 4746 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdb-rundir\" (UniqueName: \"kubernetes.io/empty-dir/692d10ed-801f-47d2-b069-b3a0cb8dc4b7-ovsdb-rundir\") pod \"ovsdbserver-sb-0\" (UID: \"692d10ed-801f-47d2-b069-b3a0cb8dc4b7\") " pod="openstack/ovsdbserver-sb-0" Jan 28 20:57:43 crc kubenswrapper[4746]: I0128 20:57:43.564327 4746 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/692d10ed-801f-47d2-b069-b3a0cb8dc4b7-scripts\") pod \"ovsdbserver-sb-0\" (UID: \"692d10ed-801f-47d2-b069-b3a0cb8dc4b7\") " pod="openstack/ovsdbserver-sb-0" Jan 28 20:57:43 crc kubenswrapper[4746]: I0128 20:57:43.569943 4746 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-sb-tls-certs\" (UniqueName: \"kubernetes.io/secret/692d10ed-801f-47d2-b069-b3a0cb8dc4b7-ovsdbserver-sb-tls-certs\") pod \"ovsdbserver-sb-0\" (UID: \"692d10ed-801f-47d2-b069-b3a0cb8dc4b7\") " pod="openstack/ovsdbserver-sb-0" Jan 28 20:57:43 crc kubenswrapper[4746]: I0128 20:57:43.573161 4746 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/692d10ed-801f-47d2-b069-b3a0cb8dc4b7-combined-ca-bundle\") pod \"ovsdbserver-sb-0\" (UID: \"692d10ed-801f-47d2-b069-b3a0cb8dc4b7\") " pod="openstack/ovsdbserver-sb-0" Jan 28 20:57:43 crc kubenswrapper[4746]: I0128 20:57:43.573775 4746 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics-certs-tls-certs\" (UniqueName: \"kubernetes.io/secret/692d10ed-801f-47d2-b069-b3a0cb8dc4b7-metrics-certs-tls-certs\") pod \"ovsdbserver-sb-0\" (UID: \"692d10ed-801f-47d2-b069-b3a0cb8dc4b7\") " pod="openstack/ovsdbserver-sb-0" Jan 28 20:57:43 crc kubenswrapper[4746]: I0128 20:57:43.590015 4746 operation_generator.go:637] "MountVolume.SetUp 
succeeded for volume \"kube-api-access-mn8fd\" (UniqueName: \"kubernetes.io/projected/692d10ed-801f-47d2-b069-b3a0cb8dc4b7-kube-api-access-mn8fd\") pod \"ovsdbserver-sb-0\" (UID: \"692d10ed-801f-47d2-b069-b3a0cb8dc4b7\") " pod="openstack/ovsdbserver-sb-0" Jan 28 20:57:43 crc kubenswrapper[4746]: I0128 20:57:43.591923 4746 csi_attacher.go:380] kubernetes.io/csi: attacher.MountDevice STAGE_UNSTAGE_VOLUME capability not set. Skipping MountDevice... Jan 28 20:57:43 crc kubenswrapper[4746]: I0128 20:57:43.591964 4746 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"pvc-575f7911-53da-432c-8bdc-cb90b4422b8a\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-575f7911-53da-432c-8bdc-cb90b4422b8a\") pod \"ovsdbserver-sb-0\" (UID: \"692d10ed-801f-47d2-b069-b3a0cb8dc4b7\") device mount path \"/var/lib/kubelet/plugins/kubernetes.io/csi/kubevirt.io.hostpath-provisioner/9ed9212e3c3e9040899025ffc4d8422d9a3cced5664560aeb505632fdd2667d9/globalmount\"" pod="openstack/ovsdbserver-sb-0" Jan 28 20:57:43 crc kubenswrapper[4746]: I0128 20:57:43.704201 4746 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"pvc-575f7911-53da-432c-8bdc-cb90b4422b8a\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-575f7911-53da-432c-8bdc-cb90b4422b8a\") pod \"ovsdbserver-sb-0\" (UID: \"692d10ed-801f-47d2-b069-b3a0cb8dc4b7\") " pod="openstack/ovsdbserver-sb-0" Jan 28 20:57:43 crc kubenswrapper[4746]: I0128 20:57:43.923053 4746 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ovsdbserver-sb-0" Jan 28 20:57:47 crc kubenswrapper[4746]: I0128 20:57:47.569398 4746 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/cloudkitty-lokistack-distributor-66dfd9bb-55rmf"] Jan 28 20:57:47 crc kubenswrapper[4746]: I0128 20:57:47.570742 4746 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/cloudkitty-lokistack-distributor-66dfd9bb-55rmf" Jan 28 20:57:47 crc kubenswrapper[4746]: I0128 20:57:47.575132 4746 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cloudkitty-lokistack-distributor-http" Jan 28 20:57:47 crc kubenswrapper[4746]: I0128 20:57:47.575305 4746 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"cloudkitty-lokistack-ca-bundle" Jan 28 20:57:47 crc kubenswrapper[4746]: I0128 20:57:47.579450 4746 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cloudkitty-lokistack-distributor-grpc" Jan 28 20:57:47 crc kubenswrapper[4746]: I0128 20:57:47.583484 4746 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cloudkitty-lokistack-dockercfg-hkgqf" Jan 28 20:57:47 crc kubenswrapper[4746]: I0128 20:57:47.594352 4746 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"cloudkitty-lokistack-config" Jan 28 20:57:47 crc kubenswrapper[4746]: I0128 20:57:47.609274 4746 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cloudkitty-lokistack-distributor-66dfd9bb-55rmf"] Jan 28 20:57:47 crc kubenswrapper[4746]: I0128 20:57:47.735127 4746 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cloudkitty-lokistack-distributor-http\" (UniqueName: \"kubernetes.io/secret/7b3d4385-f154-424c-b7b6-280c36a88967-cloudkitty-lokistack-distributor-http\") pod \"cloudkitty-lokistack-distributor-66dfd9bb-55rmf\" (UID: \"7b3d4385-f154-424c-b7b6-280c36a88967\") " pod="openstack/cloudkitty-lokistack-distributor-66dfd9bb-55rmf" Jan 28 20:57:47 crc kubenswrapper[4746]: I0128 20:57:47.735229 4746 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cloudkitty-lokistack-distributor-grpc\" (UniqueName: \"kubernetes.io/secret/7b3d4385-f154-424c-b7b6-280c36a88967-cloudkitty-lokistack-distributor-grpc\") pod 
\"cloudkitty-lokistack-distributor-66dfd9bb-55rmf\" (UID: \"7b3d4385-f154-424c-b7b6-280c36a88967\") " pod="openstack/cloudkitty-lokistack-distributor-66dfd9bb-55rmf" Jan 28 20:57:47 crc kubenswrapper[4746]: I0128 20:57:47.735302 4746 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cloudkitty-lokistack-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/7b3d4385-f154-424c-b7b6-280c36a88967-cloudkitty-lokistack-ca-bundle\") pod \"cloudkitty-lokistack-distributor-66dfd9bb-55rmf\" (UID: \"7b3d4385-f154-424c-b7b6-280c36a88967\") " pod="openstack/cloudkitty-lokistack-distributor-66dfd9bb-55rmf" Jan 28 20:57:47 crc kubenswrapper[4746]: I0128 20:57:47.735518 4746 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/7b3d4385-f154-424c-b7b6-280c36a88967-config\") pod \"cloudkitty-lokistack-distributor-66dfd9bb-55rmf\" (UID: \"7b3d4385-f154-424c-b7b6-280c36a88967\") " pod="openstack/cloudkitty-lokistack-distributor-66dfd9bb-55rmf" Jan 28 20:57:47 crc kubenswrapper[4746]: I0128 20:57:47.735556 4746 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-vmhqt\" (UniqueName: \"kubernetes.io/projected/7b3d4385-f154-424c-b7b6-280c36a88967-kube-api-access-vmhqt\") pod \"cloudkitty-lokistack-distributor-66dfd9bb-55rmf\" (UID: \"7b3d4385-f154-424c-b7b6-280c36a88967\") " pod="openstack/cloudkitty-lokistack-distributor-66dfd9bb-55rmf" Jan 28 20:57:48 crc kubenswrapper[4746]: I0128 20:57:47.836620 4746 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cloudkitty-lokistack-distributor-http\" (UniqueName: \"kubernetes.io/secret/7b3d4385-f154-424c-b7b6-280c36a88967-cloudkitty-lokistack-distributor-http\") pod \"cloudkitty-lokistack-distributor-66dfd9bb-55rmf\" (UID: \"7b3d4385-f154-424c-b7b6-280c36a88967\") " 
pod="openstack/cloudkitty-lokistack-distributor-66dfd9bb-55rmf" Jan 28 20:57:48 crc kubenswrapper[4746]: I0128 20:57:47.836882 4746 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cloudkitty-lokistack-distributor-grpc\" (UniqueName: \"kubernetes.io/secret/7b3d4385-f154-424c-b7b6-280c36a88967-cloudkitty-lokistack-distributor-grpc\") pod \"cloudkitty-lokistack-distributor-66dfd9bb-55rmf\" (UID: \"7b3d4385-f154-424c-b7b6-280c36a88967\") " pod="openstack/cloudkitty-lokistack-distributor-66dfd9bb-55rmf" Jan 28 20:57:48 crc kubenswrapper[4746]: I0128 20:57:47.836935 4746 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cloudkitty-lokistack-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/7b3d4385-f154-424c-b7b6-280c36a88967-cloudkitty-lokistack-ca-bundle\") pod \"cloudkitty-lokistack-distributor-66dfd9bb-55rmf\" (UID: \"7b3d4385-f154-424c-b7b6-280c36a88967\") " pod="openstack/cloudkitty-lokistack-distributor-66dfd9bb-55rmf" Jan 28 20:57:48 crc kubenswrapper[4746]: I0128 20:57:47.836990 4746 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/7b3d4385-f154-424c-b7b6-280c36a88967-config\") pod \"cloudkitty-lokistack-distributor-66dfd9bb-55rmf\" (UID: \"7b3d4385-f154-424c-b7b6-280c36a88967\") " pod="openstack/cloudkitty-lokistack-distributor-66dfd9bb-55rmf" Jan 28 20:57:48 crc kubenswrapper[4746]: I0128 20:57:47.837008 4746 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-vmhqt\" (UniqueName: \"kubernetes.io/projected/7b3d4385-f154-424c-b7b6-280c36a88967-kube-api-access-vmhqt\") pod \"cloudkitty-lokistack-distributor-66dfd9bb-55rmf\" (UID: \"7b3d4385-f154-424c-b7b6-280c36a88967\") " pod="openstack/cloudkitty-lokistack-distributor-66dfd9bb-55rmf" Jan 28 20:57:48 crc kubenswrapper[4746]: I0128 20:57:47.838379 4746 operation_generator.go:637] "MountVolume.SetUp succeeded for volume 
\"cloudkitty-lokistack-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/7b3d4385-f154-424c-b7b6-280c36a88967-cloudkitty-lokistack-ca-bundle\") pod \"cloudkitty-lokistack-distributor-66dfd9bb-55rmf\" (UID: \"7b3d4385-f154-424c-b7b6-280c36a88967\") " pod="openstack/cloudkitty-lokistack-distributor-66dfd9bb-55rmf" Jan 28 20:57:48 crc kubenswrapper[4746]: I0128 20:57:47.838998 4746 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/7b3d4385-f154-424c-b7b6-280c36a88967-config\") pod \"cloudkitty-lokistack-distributor-66dfd9bb-55rmf\" (UID: \"7b3d4385-f154-424c-b7b6-280c36a88967\") " pod="openstack/cloudkitty-lokistack-distributor-66dfd9bb-55rmf" Jan 28 20:57:48 crc kubenswrapper[4746]: I0128 20:57:47.852944 4746 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cloudkitty-lokistack-distributor-http\" (UniqueName: \"kubernetes.io/secret/7b3d4385-f154-424c-b7b6-280c36a88967-cloudkitty-lokistack-distributor-http\") pod \"cloudkitty-lokistack-distributor-66dfd9bb-55rmf\" (UID: \"7b3d4385-f154-424c-b7b6-280c36a88967\") " pod="openstack/cloudkitty-lokistack-distributor-66dfd9bb-55rmf" Jan 28 20:57:48 crc kubenswrapper[4746]: I0128 20:57:47.858220 4746 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cloudkitty-lokistack-distributor-grpc\" (UniqueName: \"kubernetes.io/secret/7b3d4385-f154-424c-b7b6-280c36a88967-cloudkitty-lokistack-distributor-grpc\") pod \"cloudkitty-lokistack-distributor-66dfd9bb-55rmf\" (UID: \"7b3d4385-f154-424c-b7b6-280c36a88967\") " pod="openstack/cloudkitty-lokistack-distributor-66dfd9bb-55rmf" Jan 28 20:57:48 crc kubenswrapper[4746]: I0128 20:57:47.863023 4746 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-vmhqt\" (UniqueName: \"kubernetes.io/projected/7b3d4385-f154-424c-b7b6-280c36a88967-kube-api-access-vmhqt\") pod \"cloudkitty-lokistack-distributor-66dfd9bb-55rmf\" (UID: 
\"7b3d4385-f154-424c-b7b6-280c36a88967\") " pod="openstack/cloudkitty-lokistack-distributor-66dfd9bb-55rmf" Jan 28 20:57:48 crc kubenswrapper[4746]: I0128 20:57:47.891305 4746 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/cloudkitty-lokistack-distributor-66dfd9bb-55rmf" Jan 28 20:57:48 crc kubenswrapper[4746]: I0128 20:57:47.906634 4746 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/cloudkitty-lokistack-querier-795fd8f8cc-gb5z2"] Jan 28 20:57:48 crc kubenswrapper[4746]: I0128 20:57:47.907684 4746 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/cloudkitty-lokistack-querier-795fd8f8cc-gb5z2" Jan 28 20:57:48 crc kubenswrapper[4746]: I0128 20:57:47.911225 4746 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cloudkitty-lokistack-querier-grpc" Jan 28 20:57:48 crc kubenswrapper[4746]: I0128 20:57:47.913282 4746 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cloudkitty-lokistack-querier-http" Jan 28 20:57:48 crc kubenswrapper[4746]: I0128 20:57:47.928119 4746 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cloudkitty-loki-s3" Jan 28 20:57:48 crc kubenswrapper[4746]: I0128 20:57:47.934898 4746 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cloudkitty-lokistack-querier-795fd8f8cc-gb5z2"] Jan 28 20:57:48 crc kubenswrapper[4746]: I0128 20:57:48.019307 4746 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/cloudkitty-lokistack-query-frontend-5cd44666df-gnlkh"] Jan 28 20:57:48 crc kubenswrapper[4746]: I0128 20:57:48.023516 4746 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/cloudkitty-lokistack-query-frontend-5cd44666df-gnlkh" Jan 28 20:57:48 crc kubenswrapper[4746]: I0128 20:57:48.029385 4746 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cloudkitty-lokistack-query-frontend-http" Jan 28 20:57:48 crc kubenswrapper[4746]: I0128 20:57:48.029759 4746 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cloudkitty-lokistack-query-frontend-grpc" Jan 28 20:57:48 crc kubenswrapper[4746]: I0128 20:57:48.037118 4746 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cloudkitty-lokistack-query-frontend-5cd44666df-gnlkh"] Jan 28 20:57:48 crc kubenswrapper[4746]: I0128 20:57:48.040895 4746 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cloudkitty-lokistack-querier-grpc\" (UniqueName: \"kubernetes.io/secret/9f570ea4-b303-46ab-8a65-cf64391aeb3b-cloudkitty-lokistack-querier-grpc\") pod \"cloudkitty-lokistack-querier-795fd8f8cc-gb5z2\" (UID: \"9f570ea4-b303-46ab-8a65-cf64391aeb3b\") " pod="openstack/cloudkitty-lokistack-querier-795fd8f8cc-gb5z2" Jan 28 20:57:48 crc kubenswrapper[4746]: I0128 20:57:48.040972 4746 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-9xt25\" (UniqueName: \"kubernetes.io/projected/9f570ea4-b303-46ab-8a65-cf64391aeb3b-kube-api-access-9xt25\") pod \"cloudkitty-lokistack-querier-795fd8f8cc-gb5z2\" (UID: \"9f570ea4-b303-46ab-8a65-cf64391aeb3b\") " pod="openstack/cloudkitty-lokistack-querier-795fd8f8cc-gb5z2" Jan 28 20:57:48 crc kubenswrapper[4746]: I0128 20:57:48.041027 4746 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cloudkitty-lokistack-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/9f570ea4-b303-46ab-8a65-cf64391aeb3b-cloudkitty-lokistack-ca-bundle\") pod \"cloudkitty-lokistack-querier-795fd8f8cc-gb5z2\" (UID: \"9f570ea4-b303-46ab-8a65-cf64391aeb3b\") " 
pod="openstack/cloudkitty-lokistack-querier-795fd8f8cc-gb5z2" Jan 28 20:57:48 crc kubenswrapper[4746]: I0128 20:57:48.041050 4746 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cloudkitty-lokistack-querier-http\" (UniqueName: \"kubernetes.io/secret/9f570ea4-b303-46ab-8a65-cf64391aeb3b-cloudkitty-lokistack-querier-http\") pod \"cloudkitty-lokistack-querier-795fd8f8cc-gb5z2\" (UID: \"9f570ea4-b303-46ab-8a65-cf64391aeb3b\") " pod="openstack/cloudkitty-lokistack-querier-795fd8f8cc-gb5z2" Jan 28 20:57:48 crc kubenswrapper[4746]: I0128 20:57:48.041118 4746 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/9f570ea4-b303-46ab-8a65-cf64391aeb3b-config\") pod \"cloudkitty-lokistack-querier-795fd8f8cc-gb5z2\" (UID: \"9f570ea4-b303-46ab-8a65-cf64391aeb3b\") " pod="openstack/cloudkitty-lokistack-querier-795fd8f8cc-gb5z2" Jan 28 20:57:48 crc kubenswrapper[4746]: I0128 20:57:48.041153 4746 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cloudkitty-loki-s3\" (UniqueName: \"kubernetes.io/secret/9f570ea4-b303-46ab-8a65-cf64391aeb3b-cloudkitty-loki-s3\") pod \"cloudkitty-lokistack-querier-795fd8f8cc-gb5z2\" (UID: \"9f570ea4-b303-46ab-8a65-cf64391aeb3b\") " pod="openstack/cloudkitty-lokistack-querier-795fd8f8cc-gb5z2" Jan 28 20:57:48 crc kubenswrapper[4746]: I0128 20:57:48.142188 4746 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cloudkitty-lokistack-querier-grpc\" (UniqueName: \"kubernetes.io/secret/9f570ea4-b303-46ab-8a65-cf64391aeb3b-cloudkitty-lokistack-querier-grpc\") pod \"cloudkitty-lokistack-querier-795fd8f8cc-gb5z2\" (UID: \"9f570ea4-b303-46ab-8a65-cf64391aeb3b\") " pod="openstack/cloudkitty-lokistack-querier-795fd8f8cc-gb5z2" Jan 28 20:57:48 crc kubenswrapper[4746]: I0128 20:57:48.142249 4746 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"cloudkitty-lokistack-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/add39f1a-2338-41e9-9a61-d32fe5a28097-cloudkitty-lokistack-ca-bundle\") pod \"cloudkitty-lokistack-query-frontend-5cd44666df-gnlkh\" (UID: \"add39f1a-2338-41e9-9a61-d32fe5a28097\") " pod="openstack/cloudkitty-lokistack-query-frontend-5cd44666df-gnlkh" Jan 28 20:57:48 crc kubenswrapper[4746]: I0128 20:57:48.142285 4746 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-9xt25\" (UniqueName: \"kubernetes.io/projected/9f570ea4-b303-46ab-8a65-cf64391aeb3b-kube-api-access-9xt25\") pod \"cloudkitty-lokistack-querier-795fd8f8cc-gb5z2\" (UID: \"9f570ea4-b303-46ab-8a65-cf64391aeb3b\") " pod="openstack/cloudkitty-lokistack-querier-795fd8f8cc-gb5z2" Jan 28 20:57:48 crc kubenswrapper[4746]: I0128 20:57:48.142314 4746 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cloudkitty-lokistack-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/9f570ea4-b303-46ab-8a65-cf64391aeb3b-cloudkitty-lokistack-ca-bundle\") pod \"cloudkitty-lokistack-querier-795fd8f8cc-gb5z2\" (UID: \"9f570ea4-b303-46ab-8a65-cf64391aeb3b\") " pod="openstack/cloudkitty-lokistack-querier-795fd8f8cc-gb5z2" Jan 28 20:57:48 crc kubenswrapper[4746]: I0128 20:57:48.142337 4746 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cloudkitty-lokistack-querier-http\" (UniqueName: \"kubernetes.io/secret/9f570ea4-b303-46ab-8a65-cf64391aeb3b-cloudkitty-lokistack-querier-http\") pod \"cloudkitty-lokistack-querier-795fd8f8cc-gb5z2\" (UID: \"9f570ea4-b303-46ab-8a65-cf64391aeb3b\") " pod="openstack/cloudkitty-lokistack-querier-795fd8f8cc-gb5z2" Jan 28 20:57:48 crc kubenswrapper[4746]: I0128 20:57:48.142370 4746 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cloudkitty-lokistack-query-frontend-grpc\" (UniqueName: 
\"kubernetes.io/secret/add39f1a-2338-41e9-9a61-d32fe5a28097-cloudkitty-lokistack-query-frontend-grpc\") pod \"cloudkitty-lokistack-query-frontend-5cd44666df-gnlkh\" (UID: \"add39f1a-2338-41e9-9a61-d32fe5a28097\") " pod="openstack/cloudkitty-lokistack-query-frontend-5cd44666df-gnlkh" Jan 28 20:57:48 crc kubenswrapper[4746]: I0128 20:57:48.143208 4746 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/cloudkitty-lokistack-gateway-7db4f4db8c-jptw9"] Jan 28 20:57:48 crc kubenswrapper[4746]: I0128 20:57:48.144569 4746 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/cloudkitty-lokistack-gateway-7db4f4db8c-jptw9" Jan 28 20:57:48 crc kubenswrapper[4746]: I0128 20:57:48.145004 4746 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cloudkitty-lokistack-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/9f570ea4-b303-46ab-8a65-cf64391aeb3b-cloudkitty-lokistack-ca-bundle\") pod \"cloudkitty-lokistack-querier-795fd8f8cc-gb5z2\" (UID: \"9f570ea4-b303-46ab-8a65-cf64391aeb3b\") " pod="openstack/cloudkitty-lokistack-querier-795fd8f8cc-gb5z2" Jan 28 20:57:48 crc kubenswrapper[4746]: I0128 20:57:48.145178 4746 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/9f570ea4-b303-46ab-8a65-cf64391aeb3b-config\") pod \"cloudkitty-lokistack-querier-795fd8f8cc-gb5z2\" (UID: \"9f570ea4-b303-46ab-8a65-cf64391aeb3b\") " pod="openstack/cloudkitty-lokistack-querier-795fd8f8cc-gb5z2" Jan 28 20:57:48 crc kubenswrapper[4746]: I0128 20:57:48.145260 4746 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-xrtnl\" (UniqueName: \"kubernetes.io/projected/add39f1a-2338-41e9-9a61-d32fe5a28097-kube-api-access-xrtnl\") pod \"cloudkitty-lokistack-query-frontend-5cd44666df-gnlkh\" (UID: \"add39f1a-2338-41e9-9a61-d32fe5a28097\") " pod="openstack/cloudkitty-lokistack-query-frontend-5cd44666df-gnlkh" Jan 28 
20:57:48 crc kubenswrapper[4746]: I0128 20:57:48.145288 4746 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cloudkitty-loki-s3\" (UniqueName: \"kubernetes.io/secret/9f570ea4-b303-46ab-8a65-cf64391aeb3b-cloudkitty-loki-s3\") pod \"cloudkitty-lokistack-querier-795fd8f8cc-gb5z2\" (UID: \"9f570ea4-b303-46ab-8a65-cf64391aeb3b\") " pod="openstack/cloudkitty-lokistack-querier-795fd8f8cc-gb5z2" Jan 28 20:57:48 crc kubenswrapper[4746]: I0128 20:57:48.145355 4746 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cloudkitty-lokistack-query-frontend-http\" (UniqueName: \"kubernetes.io/secret/add39f1a-2338-41e9-9a61-d32fe5a28097-cloudkitty-lokistack-query-frontend-http\") pod \"cloudkitty-lokistack-query-frontend-5cd44666df-gnlkh\" (UID: \"add39f1a-2338-41e9-9a61-d32fe5a28097\") " pod="openstack/cloudkitty-lokistack-query-frontend-5cd44666df-gnlkh" Jan 28 20:57:48 crc kubenswrapper[4746]: I0128 20:57:48.145386 4746 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/add39f1a-2338-41e9-9a61-d32fe5a28097-config\") pod \"cloudkitty-lokistack-query-frontend-5cd44666df-gnlkh\" (UID: \"add39f1a-2338-41e9-9a61-d32fe5a28097\") " pod="openstack/cloudkitty-lokistack-query-frontend-5cd44666df-gnlkh" Jan 28 20:57:48 crc kubenswrapper[4746]: I0128 20:57:48.147040 4746 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/9f570ea4-b303-46ab-8a65-cf64391aeb3b-config\") pod \"cloudkitty-lokistack-querier-795fd8f8cc-gb5z2\" (UID: \"9f570ea4-b303-46ab-8a65-cf64391aeb3b\") " pod="openstack/cloudkitty-lokistack-querier-795fd8f8cc-gb5z2" Jan 28 20:57:48 crc kubenswrapper[4746]: I0128 20:57:48.147321 4746 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cloudkitty-lokistack-gateway-client-http" Jan 28 20:57:48 crc kubenswrapper[4746]: I0128 
20:57:48.147327 4746 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cloudkitty-lokistack-gateway" Jan 28 20:57:48 crc kubenswrapper[4746]: I0128 20:57:48.148779 4746 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cloudkitty-lokistack-gateway-http" Jan 28 20:57:48 crc kubenswrapper[4746]: I0128 20:57:48.148940 4746 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"cloudkitty-lokistack-gateway" Jan 28 20:57:48 crc kubenswrapper[4746]: I0128 20:57:48.149043 4746 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"cloudkitty-lokistack-ca" Jan 28 20:57:48 crc kubenswrapper[4746]: I0128 20:57:48.151643 4746 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cloudkitty-loki-s3\" (UniqueName: \"kubernetes.io/secret/9f570ea4-b303-46ab-8a65-cf64391aeb3b-cloudkitty-loki-s3\") pod \"cloudkitty-lokistack-querier-795fd8f8cc-gb5z2\" (UID: \"9f570ea4-b303-46ab-8a65-cf64391aeb3b\") " pod="openstack/cloudkitty-lokistack-querier-795fd8f8cc-gb5z2" Jan 28 20:57:48 crc kubenswrapper[4746]: I0128 20:57:48.151878 4746 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cloudkitty-lokistack-querier-grpc\" (UniqueName: \"kubernetes.io/secret/9f570ea4-b303-46ab-8a65-cf64391aeb3b-cloudkitty-lokistack-querier-grpc\") pod \"cloudkitty-lokistack-querier-795fd8f8cc-gb5z2\" (UID: \"9f570ea4-b303-46ab-8a65-cf64391aeb3b\") " pod="openstack/cloudkitty-lokistack-querier-795fd8f8cc-gb5z2" Jan 28 20:57:48 crc kubenswrapper[4746]: I0128 20:57:48.157007 4746 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"cloudkitty-lokistack-gateway-ca-bundle" Jan 28 20:57:48 crc kubenswrapper[4746]: I0128 20:57:48.163947 4746 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cloudkitty-lokistack-querier-http\" (UniqueName: \"kubernetes.io/secret/9f570ea4-b303-46ab-8a65-cf64391aeb3b-cloudkitty-lokistack-querier-http\") pod 
\"cloudkitty-lokistack-querier-795fd8f8cc-gb5z2\" (UID: \"9f570ea4-b303-46ab-8a65-cf64391aeb3b\") " pod="openstack/cloudkitty-lokistack-querier-795fd8f8cc-gb5z2" Jan 28 20:57:48 crc kubenswrapper[4746]: I0128 20:57:48.173418 4746 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-9xt25\" (UniqueName: \"kubernetes.io/projected/9f570ea4-b303-46ab-8a65-cf64391aeb3b-kube-api-access-9xt25\") pod \"cloudkitty-lokistack-querier-795fd8f8cc-gb5z2\" (UID: \"9f570ea4-b303-46ab-8a65-cf64391aeb3b\") " pod="openstack/cloudkitty-lokistack-querier-795fd8f8cc-gb5z2" Jan 28 20:57:48 crc kubenswrapper[4746]: I0128 20:57:48.185198 4746 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cloudkitty-lokistack-gateway-7db4f4db8c-jptw9"] Jan 28 20:57:48 crc kubenswrapper[4746]: I0128 20:57:48.190271 4746 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/cloudkitty-lokistack-gateway-7db4f4db8c-s5zzt"] Jan 28 20:57:48 crc kubenswrapper[4746]: I0128 20:57:48.191649 4746 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/cloudkitty-lokistack-gateway-7db4f4db8c-s5zzt" Jan 28 20:57:48 crc kubenswrapper[4746]: I0128 20:57:48.196835 4746 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cloudkitty-lokistack-gateway-dockercfg-kq4qm" Jan 28 20:57:48 crc kubenswrapper[4746]: I0128 20:57:48.228345 4746 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cloudkitty-lokistack-gateway-7db4f4db8c-s5zzt"] Jan 28 20:57:48 crc kubenswrapper[4746]: I0128 20:57:48.244400 4746 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/cloudkitty-lokistack-querier-795fd8f8cc-gb5z2" Jan 28 20:57:48 crc kubenswrapper[4746]: I0128 20:57:48.246527 4746 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-mw2vk\" (UniqueName: \"kubernetes.io/projected/247c16c1-2e4e-48dd-b836-0792f7231417-kube-api-access-mw2vk\") pod \"cloudkitty-lokistack-gateway-7db4f4db8c-jptw9\" (UID: \"247c16c1-2e4e-48dd-b836-0792f7231417\") " pod="openstack/cloudkitty-lokistack-gateway-7db4f4db8c-jptw9" Jan 28 20:57:48 crc kubenswrapper[4746]: I0128 20:57:48.246576 4746 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cloudkitty-lokistack-query-frontend-grpc\" (UniqueName: \"kubernetes.io/secret/add39f1a-2338-41e9-9a61-d32fe5a28097-cloudkitty-lokistack-query-frontend-grpc\") pod \"cloudkitty-lokistack-query-frontend-5cd44666df-gnlkh\" (UID: \"add39f1a-2338-41e9-9a61-d32fe5a28097\") " pod="openstack/cloudkitty-lokistack-query-frontend-5cd44666df-gnlkh" Jan 28 20:57:48 crc kubenswrapper[4746]: I0128 20:57:48.246608 4746 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cloudkitty-lokistack-gateway-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/247c16c1-2e4e-48dd-b836-0792f7231417-cloudkitty-lokistack-gateway-ca-bundle\") pod \"cloudkitty-lokistack-gateway-7db4f4db8c-jptw9\" (UID: \"247c16c1-2e4e-48dd-b836-0792f7231417\") " pod="openstack/cloudkitty-lokistack-gateway-7db4f4db8c-jptw9" Jan 28 20:57:48 crc kubenswrapper[4746]: I0128 20:57:48.246637 4746 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tls-secret\" (UniqueName: \"kubernetes.io/secret/f6b72417-5723-4d82-928b-f4be94e4bbfd-tls-secret\") pod \"cloudkitty-lokistack-gateway-7db4f4db8c-s5zzt\" (UID: \"f6b72417-5723-4d82-928b-f4be94e4bbfd\") " pod="openstack/cloudkitty-lokistack-gateway-7db4f4db8c-s5zzt" Jan 28 20:57:48 crc kubenswrapper[4746]: I0128 
20:57:48.246692 4746 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cloudkitty-lokistack-gateway-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/f6b72417-5723-4d82-928b-f4be94e4bbfd-cloudkitty-lokistack-gateway-ca-bundle\") pod \"cloudkitty-lokistack-gateway-7db4f4db8c-s5zzt\" (UID: \"f6b72417-5723-4d82-928b-f4be94e4bbfd\") " pod="openstack/cloudkitty-lokistack-gateway-7db4f4db8c-s5zzt" Jan 28 20:57:48 crc kubenswrapper[4746]: I0128 20:57:48.246822 4746 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-874kl\" (UniqueName: \"kubernetes.io/projected/f6b72417-5723-4d82-928b-f4be94e4bbfd-kube-api-access-874kl\") pod \"cloudkitty-lokistack-gateway-7db4f4db8c-s5zzt\" (UID: \"f6b72417-5723-4d82-928b-f4be94e4bbfd\") " pod="openstack/cloudkitty-lokistack-gateway-7db4f4db8c-s5zzt" Jan 28 20:57:48 crc kubenswrapper[4746]: I0128 20:57:48.246879 4746 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tenants\" (UniqueName: \"kubernetes.io/secret/f6b72417-5723-4d82-928b-f4be94e4bbfd-tenants\") pod \"cloudkitty-lokistack-gateway-7db4f4db8c-s5zzt\" (UID: \"f6b72417-5723-4d82-928b-f4be94e4bbfd\") " pod="openstack/cloudkitty-lokistack-gateway-7db4f4db8c-s5zzt" Jan 28 20:57:48 crc kubenswrapper[4746]: I0128 20:57:48.246935 4746 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-xrtnl\" (UniqueName: \"kubernetes.io/projected/add39f1a-2338-41e9-9a61-d32fe5a28097-kube-api-access-xrtnl\") pod \"cloudkitty-lokistack-query-frontend-5cd44666df-gnlkh\" (UID: \"add39f1a-2338-41e9-9a61-d32fe5a28097\") " pod="openstack/cloudkitty-lokistack-query-frontend-5cd44666df-gnlkh" Jan 28 20:57:48 crc kubenswrapper[4746]: I0128 20:57:48.246972 4746 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cloudkitty-ca-bundle\" (UniqueName: 
\"kubernetes.io/configmap/f6b72417-5723-4d82-928b-f4be94e4bbfd-cloudkitty-ca-bundle\") pod \"cloudkitty-lokistack-gateway-7db4f4db8c-s5zzt\" (UID: \"f6b72417-5723-4d82-928b-f4be94e4bbfd\") " pod="openstack/cloudkitty-lokistack-gateway-7db4f4db8c-s5zzt" Jan 28 20:57:48 crc kubenswrapper[4746]: I0128 20:57:48.246997 4746 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"lokistack-gateway\" (UniqueName: \"kubernetes.io/configmap/247c16c1-2e4e-48dd-b836-0792f7231417-lokistack-gateway\") pod \"cloudkitty-lokistack-gateway-7db4f4db8c-jptw9\" (UID: \"247c16c1-2e4e-48dd-b836-0792f7231417\") " pod="openstack/cloudkitty-lokistack-gateway-7db4f4db8c-jptw9" Jan 28 20:57:48 crc kubenswrapper[4746]: I0128 20:57:48.247042 4746 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cloudkitty-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/247c16c1-2e4e-48dd-b836-0792f7231417-cloudkitty-ca-bundle\") pod \"cloudkitty-lokistack-gateway-7db4f4db8c-jptw9\" (UID: \"247c16c1-2e4e-48dd-b836-0792f7231417\") " pod="openstack/cloudkitty-lokistack-gateway-7db4f4db8c-jptw9" Jan 28 20:57:48 crc kubenswrapper[4746]: I0128 20:57:48.247129 4746 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rbac\" (UniqueName: \"kubernetes.io/configmap/f6b72417-5723-4d82-928b-f4be94e4bbfd-rbac\") pod \"cloudkitty-lokistack-gateway-7db4f4db8c-s5zzt\" (UID: \"f6b72417-5723-4d82-928b-f4be94e4bbfd\") " pod="openstack/cloudkitty-lokistack-gateway-7db4f4db8c-s5zzt" Jan 28 20:57:48 crc kubenswrapper[4746]: I0128 20:57:48.247172 4746 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tenants\" (UniqueName: \"kubernetes.io/secret/247c16c1-2e4e-48dd-b836-0792f7231417-tenants\") pod \"cloudkitty-lokistack-gateway-7db4f4db8c-jptw9\" (UID: \"247c16c1-2e4e-48dd-b836-0792f7231417\") " 
pod="openstack/cloudkitty-lokistack-gateway-7db4f4db8c-jptw9" Jan 28 20:57:48 crc kubenswrapper[4746]: I0128 20:57:48.247200 4746 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cloudkitty-lokistack-query-frontend-http\" (UniqueName: \"kubernetes.io/secret/add39f1a-2338-41e9-9a61-d32fe5a28097-cloudkitty-lokistack-query-frontend-http\") pod \"cloudkitty-lokistack-query-frontend-5cd44666df-gnlkh\" (UID: \"add39f1a-2338-41e9-9a61-d32fe5a28097\") " pod="openstack/cloudkitty-lokistack-query-frontend-5cd44666df-gnlkh" Jan 28 20:57:48 crc kubenswrapper[4746]: I0128 20:57:48.247249 4746 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/add39f1a-2338-41e9-9a61-d32fe5a28097-config\") pod \"cloudkitty-lokistack-query-frontend-5cd44666df-gnlkh\" (UID: \"add39f1a-2338-41e9-9a61-d32fe5a28097\") " pod="openstack/cloudkitty-lokistack-query-frontend-5cd44666df-gnlkh" Jan 28 20:57:48 crc kubenswrapper[4746]: I0128 20:57:48.247330 4746 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cloudkitty-lokistack-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/f6b72417-5723-4d82-928b-f4be94e4bbfd-cloudkitty-lokistack-ca-bundle\") pod \"cloudkitty-lokistack-gateway-7db4f4db8c-s5zzt\" (UID: \"f6b72417-5723-4d82-928b-f4be94e4bbfd\") " pod="openstack/cloudkitty-lokistack-gateway-7db4f4db8c-s5zzt" Jan 28 20:57:48 crc kubenswrapper[4746]: I0128 20:57:48.247414 4746 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cloudkitty-lokistack-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/247c16c1-2e4e-48dd-b836-0792f7231417-cloudkitty-lokistack-ca-bundle\") pod \"cloudkitty-lokistack-gateway-7db4f4db8c-jptw9\" (UID: \"247c16c1-2e4e-48dd-b836-0792f7231417\") " pod="openstack/cloudkitty-lokistack-gateway-7db4f4db8c-jptw9" Jan 28 20:57:48 crc kubenswrapper[4746]: I0128 20:57:48.247451 4746 
reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"lokistack-gateway\" (UniqueName: \"kubernetes.io/configmap/f6b72417-5723-4d82-928b-f4be94e4bbfd-lokistack-gateway\") pod \"cloudkitty-lokistack-gateway-7db4f4db8c-s5zzt\" (UID: \"f6b72417-5723-4d82-928b-f4be94e4bbfd\") " pod="openstack/cloudkitty-lokistack-gateway-7db4f4db8c-s5zzt" Jan 28 20:57:48 crc kubenswrapper[4746]: I0128 20:57:48.247500 4746 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cloudkitty-lokistack-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/add39f1a-2338-41e9-9a61-d32fe5a28097-cloudkitty-lokistack-ca-bundle\") pod \"cloudkitty-lokistack-query-frontend-5cd44666df-gnlkh\" (UID: \"add39f1a-2338-41e9-9a61-d32fe5a28097\") " pod="openstack/cloudkitty-lokistack-query-frontend-5cd44666df-gnlkh" Jan 28 20:57:48 crc kubenswrapper[4746]: I0128 20:57:48.247546 4746 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tls-secret\" (UniqueName: \"kubernetes.io/secret/247c16c1-2e4e-48dd-b836-0792f7231417-tls-secret\") pod \"cloudkitty-lokistack-gateway-7db4f4db8c-jptw9\" (UID: \"247c16c1-2e4e-48dd-b836-0792f7231417\") " pod="openstack/cloudkitty-lokistack-gateway-7db4f4db8c-jptw9" Jan 28 20:57:48 crc kubenswrapper[4746]: I0128 20:57:48.247582 4746 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cloudkitty-lokistack-gateway-client-http\" (UniqueName: \"kubernetes.io/secret/f6b72417-5723-4d82-928b-f4be94e4bbfd-cloudkitty-lokistack-gateway-client-http\") pod \"cloudkitty-lokistack-gateway-7db4f4db8c-s5zzt\" (UID: \"f6b72417-5723-4d82-928b-f4be94e4bbfd\") " pod="openstack/cloudkitty-lokistack-gateway-7db4f4db8c-s5zzt" Jan 28 20:57:48 crc kubenswrapper[4746]: I0128 20:57:48.247658 4746 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rbac\" (UniqueName: 
\"kubernetes.io/configmap/247c16c1-2e4e-48dd-b836-0792f7231417-rbac\") pod \"cloudkitty-lokistack-gateway-7db4f4db8c-jptw9\" (UID: \"247c16c1-2e4e-48dd-b836-0792f7231417\") " pod="openstack/cloudkitty-lokistack-gateway-7db4f4db8c-jptw9" Jan 28 20:57:48 crc kubenswrapper[4746]: I0128 20:57:48.247681 4746 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cloudkitty-lokistack-gateway-client-http\" (UniqueName: \"kubernetes.io/secret/247c16c1-2e4e-48dd-b836-0792f7231417-cloudkitty-lokistack-gateway-client-http\") pod \"cloudkitty-lokistack-gateway-7db4f4db8c-jptw9\" (UID: \"247c16c1-2e4e-48dd-b836-0792f7231417\") " pod="openstack/cloudkitty-lokistack-gateway-7db4f4db8c-jptw9" Jan 28 20:57:48 crc kubenswrapper[4746]: I0128 20:57:48.250455 4746 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cloudkitty-lokistack-query-frontend-grpc\" (UniqueName: \"kubernetes.io/secret/add39f1a-2338-41e9-9a61-d32fe5a28097-cloudkitty-lokistack-query-frontend-grpc\") pod \"cloudkitty-lokistack-query-frontend-5cd44666df-gnlkh\" (UID: \"add39f1a-2338-41e9-9a61-d32fe5a28097\") " pod="openstack/cloudkitty-lokistack-query-frontend-5cd44666df-gnlkh" Jan 28 20:57:48 crc kubenswrapper[4746]: I0128 20:57:48.250695 4746 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/add39f1a-2338-41e9-9a61-d32fe5a28097-config\") pod \"cloudkitty-lokistack-query-frontend-5cd44666df-gnlkh\" (UID: \"add39f1a-2338-41e9-9a61-d32fe5a28097\") " pod="openstack/cloudkitty-lokistack-query-frontend-5cd44666df-gnlkh" Jan 28 20:57:48 crc kubenswrapper[4746]: I0128 20:57:48.252727 4746 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cloudkitty-lokistack-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/add39f1a-2338-41e9-9a61-d32fe5a28097-cloudkitty-lokistack-ca-bundle\") pod \"cloudkitty-lokistack-query-frontend-5cd44666df-gnlkh\" (UID: 
\"add39f1a-2338-41e9-9a61-d32fe5a28097\") " pod="openstack/cloudkitty-lokistack-query-frontend-5cd44666df-gnlkh" Jan 28 20:57:48 crc kubenswrapper[4746]: I0128 20:57:48.269572 4746 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cloudkitty-lokistack-query-frontend-http\" (UniqueName: \"kubernetes.io/secret/add39f1a-2338-41e9-9a61-d32fe5a28097-cloudkitty-lokistack-query-frontend-http\") pod \"cloudkitty-lokistack-query-frontend-5cd44666df-gnlkh\" (UID: \"add39f1a-2338-41e9-9a61-d32fe5a28097\") " pod="openstack/cloudkitty-lokistack-query-frontend-5cd44666df-gnlkh" Jan 28 20:57:48 crc kubenswrapper[4746]: I0128 20:57:48.272955 4746 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-xrtnl\" (UniqueName: \"kubernetes.io/projected/add39f1a-2338-41e9-9a61-d32fe5a28097-kube-api-access-xrtnl\") pod \"cloudkitty-lokistack-query-frontend-5cd44666df-gnlkh\" (UID: \"add39f1a-2338-41e9-9a61-d32fe5a28097\") " pod="openstack/cloudkitty-lokistack-query-frontend-5cd44666df-gnlkh" Jan 28 20:57:48 crc kubenswrapper[4746]: I0128 20:57:48.349647 4746 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"tls-secret\" (UniqueName: \"kubernetes.io/secret/247c16c1-2e4e-48dd-b836-0792f7231417-tls-secret\") pod \"cloudkitty-lokistack-gateway-7db4f4db8c-jptw9\" (UID: \"247c16c1-2e4e-48dd-b836-0792f7231417\") " pod="openstack/cloudkitty-lokistack-gateway-7db4f4db8c-jptw9" Jan 28 20:57:48 crc kubenswrapper[4746]: I0128 20:57:48.349728 4746 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cloudkitty-lokistack-gateway-client-http\" (UniqueName: \"kubernetes.io/secret/f6b72417-5723-4d82-928b-f4be94e4bbfd-cloudkitty-lokistack-gateway-client-http\") pod \"cloudkitty-lokistack-gateway-7db4f4db8c-s5zzt\" (UID: \"f6b72417-5723-4d82-928b-f4be94e4bbfd\") " pod="openstack/cloudkitty-lokistack-gateway-7db4f4db8c-s5zzt" Jan 28 20:57:48 crc kubenswrapper[4746]: I0128 20:57:48.349781 4746 
reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rbac\" (UniqueName: \"kubernetes.io/configmap/247c16c1-2e4e-48dd-b836-0792f7231417-rbac\") pod \"cloudkitty-lokistack-gateway-7db4f4db8c-jptw9\" (UID: \"247c16c1-2e4e-48dd-b836-0792f7231417\") " pod="openstack/cloudkitty-lokistack-gateway-7db4f4db8c-jptw9" Jan 28 20:57:48 crc kubenswrapper[4746]: I0128 20:57:48.349806 4746 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cloudkitty-lokistack-gateway-client-http\" (UniqueName: \"kubernetes.io/secret/247c16c1-2e4e-48dd-b836-0792f7231417-cloudkitty-lokistack-gateway-client-http\") pod \"cloudkitty-lokistack-gateway-7db4f4db8c-jptw9\" (UID: \"247c16c1-2e4e-48dd-b836-0792f7231417\") " pod="openstack/cloudkitty-lokistack-gateway-7db4f4db8c-jptw9" Jan 28 20:57:48 crc kubenswrapper[4746]: I0128 20:57:48.349865 4746 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-mw2vk\" (UniqueName: \"kubernetes.io/projected/247c16c1-2e4e-48dd-b836-0792f7231417-kube-api-access-mw2vk\") pod \"cloudkitty-lokistack-gateway-7db4f4db8c-jptw9\" (UID: \"247c16c1-2e4e-48dd-b836-0792f7231417\") " pod="openstack/cloudkitty-lokistack-gateway-7db4f4db8c-jptw9" Jan 28 20:57:48 crc kubenswrapper[4746]: E0128 20:57:48.349873 4746 secret.go:188] Couldn't get secret openstack/cloudkitty-lokistack-gateway-http: secret "cloudkitty-lokistack-gateway-http" not found Jan 28 20:57:48 crc kubenswrapper[4746]: I0128 20:57:48.349916 4746 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cloudkitty-lokistack-gateway-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/247c16c1-2e4e-48dd-b836-0792f7231417-cloudkitty-lokistack-gateway-ca-bundle\") pod \"cloudkitty-lokistack-gateway-7db4f4db8c-jptw9\" (UID: \"247c16c1-2e4e-48dd-b836-0792f7231417\") " pod="openstack/cloudkitty-lokistack-gateway-7db4f4db8c-jptw9" Jan 28 20:57:48 crc kubenswrapper[4746]: I0128 20:57:48.349943 4746 
reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"tls-secret\" (UniqueName: \"kubernetes.io/secret/f6b72417-5723-4d82-928b-f4be94e4bbfd-tls-secret\") pod \"cloudkitty-lokistack-gateway-7db4f4db8c-s5zzt\" (UID: \"f6b72417-5723-4d82-928b-f4be94e4bbfd\") " pod="openstack/cloudkitty-lokistack-gateway-7db4f4db8c-s5zzt" Jan 28 20:57:48 crc kubenswrapper[4746]: E0128 20:57:48.349986 4746 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/247c16c1-2e4e-48dd-b836-0792f7231417-tls-secret podName:247c16c1-2e4e-48dd-b836-0792f7231417 nodeName:}" failed. No retries permitted until 2026-01-28 20:57:48.849960509 +0000 UTC m=+1096.806147043 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "tls-secret" (UniqueName: "kubernetes.io/secret/247c16c1-2e4e-48dd-b836-0792f7231417-tls-secret") pod "cloudkitty-lokistack-gateway-7db4f4db8c-jptw9" (UID: "247c16c1-2e4e-48dd-b836-0792f7231417") : secret "cloudkitty-lokistack-gateway-http" not found Jan 28 20:57:48 crc kubenswrapper[4746]: E0128 20:57:48.350121 4746 secret.go:188] Couldn't get secret openstack/cloudkitty-lokistack-gateway-http: secret "cloudkitty-lokistack-gateway-http" not found Jan 28 20:57:48 crc kubenswrapper[4746]: E0128 20:57:48.350228 4746 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/f6b72417-5723-4d82-928b-f4be94e4bbfd-tls-secret podName:f6b72417-5723-4d82-928b-f4be94e4bbfd nodeName:}" failed. No retries permitted until 2026-01-28 20:57:48.850195435 +0000 UTC m=+1096.806381979 (durationBeforeRetry 500ms). 
Error: MountVolume.SetUp failed for volume "tls-secret" (UniqueName: "kubernetes.io/secret/f6b72417-5723-4d82-928b-f4be94e4bbfd-tls-secret") pod "cloudkitty-lokistack-gateway-7db4f4db8c-s5zzt" (UID: "f6b72417-5723-4d82-928b-f4be94e4bbfd") : secret "cloudkitty-lokistack-gateway-http" not found Jan 28 20:57:48 crc kubenswrapper[4746]: I0128 20:57:48.351442 4746 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rbac\" (UniqueName: \"kubernetes.io/configmap/247c16c1-2e4e-48dd-b836-0792f7231417-rbac\") pod \"cloudkitty-lokistack-gateway-7db4f4db8c-jptw9\" (UID: \"247c16c1-2e4e-48dd-b836-0792f7231417\") " pod="openstack/cloudkitty-lokistack-gateway-7db4f4db8c-jptw9" Jan 28 20:57:48 crc kubenswrapper[4746]: I0128 20:57:48.351422 4746 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cloudkitty-lokistack-gateway-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/247c16c1-2e4e-48dd-b836-0792f7231417-cloudkitty-lokistack-gateway-ca-bundle\") pod \"cloudkitty-lokistack-gateway-7db4f4db8c-jptw9\" (UID: \"247c16c1-2e4e-48dd-b836-0792f7231417\") " pod="openstack/cloudkitty-lokistack-gateway-7db4f4db8c-jptw9" Jan 28 20:57:48 crc kubenswrapper[4746]: I0128 20:57:48.351592 4746 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cloudkitty-lokistack-gateway-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/f6b72417-5723-4d82-928b-f4be94e4bbfd-cloudkitty-lokistack-gateway-ca-bundle\") pod \"cloudkitty-lokistack-gateway-7db4f4db8c-s5zzt\" (UID: \"f6b72417-5723-4d82-928b-f4be94e4bbfd\") " pod="openstack/cloudkitty-lokistack-gateway-7db4f4db8c-s5zzt" Jan 28 20:57:48 crc kubenswrapper[4746]: I0128 20:57:48.351634 4746 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cloudkitty-lokistack-gateway-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/f6b72417-5723-4d82-928b-f4be94e4bbfd-cloudkitty-lokistack-gateway-ca-bundle\") pod \"cloudkitty-lokistack-gateway-7db4f4db8c-s5zzt\" (UID: 
\"f6b72417-5723-4d82-928b-f4be94e4bbfd\") " pod="openstack/cloudkitty-lokistack-gateway-7db4f4db8c-s5zzt" Jan 28 20:57:48 crc kubenswrapper[4746]: I0128 20:57:48.351769 4746 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-874kl\" (UniqueName: \"kubernetes.io/projected/f6b72417-5723-4d82-928b-f4be94e4bbfd-kube-api-access-874kl\") pod \"cloudkitty-lokistack-gateway-7db4f4db8c-s5zzt\" (UID: \"f6b72417-5723-4d82-928b-f4be94e4bbfd\") " pod="openstack/cloudkitty-lokistack-gateway-7db4f4db8c-s5zzt" Jan 28 20:57:48 crc kubenswrapper[4746]: I0128 20:57:48.351844 4746 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"tenants\" (UniqueName: \"kubernetes.io/secret/f6b72417-5723-4d82-928b-f4be94e4bbfd-tenants\") pod \"cloudkitty-lokistack-gateway-7db4f4db8c-s5zzt\" (UID: \"f6b72417-5723-4d82-928b-f4be94e4bbfd\") " pod="openstack/cloudkitty-lokistack-gateway-7db4f4db8c-s5zzt" Jan 28 20:57:48 crc kubenswrapper[4746]: I0128 20:57:48.352335 4746 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cloudkitty-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/f6b72417-5723-4d82-928b-f4be94e4bbfd-cloudkitty-ca-bundle\") pod \"cloudkitty-lokistack-gateway-7db4f4db8c-s5zzt\" (UID: \"f6b72417-5723-4d82-928b-f4be94e4bbfd\") " pod="openstack/cloudkitty-lokistack-gateway-7db4f4db8c-s5zzt" Jan 28 20:57:48 crc kubenswrapper[4746]: I0128 20:57:48.352362 4746 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"lokistack-gateway\" (UniqueName: \"kubernetes.io/configmap/247c16c1-2e4e-48dd-b836-0792f7231417-lokistack-gateway\") pod \"cloudkitty-lokistack-gateway-7db4f4db8c-jptw9\" (UID: \"247c16c1-2e4e-48dd-b836-0792f7231417\") " pod="openstack/cloudkitty-lokistack-gateway-7db4f4db8c-jptw9" Jan 28 20:57:48 crc kubenswrapper[4746]: I0128 20:57:48.352422 4746 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cloudkitty-ca-bundle\" (UniqueName: 
\"kubernetes.io/configmap/247c16c1-2e4e-48dd-b836-0792f7231417-cloudkitty-ca-bundle\") pod \"cloudkitty-lokistack-gateway-7db4f4db8c-jptw9\" (UID: \"247c16c1-2e4e-48dd-b836-0792f7231417\") " pod="openstack/cloudkitty-lokistack-gateway-7db4f4db8c-jptw9" Jan 28 20:57:48 crc kubenswrapper[4746]: I0128 20:57:48.352521 4746 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rbac\" (UniqueName: \"kubernetes.io/configmap/f6b72417-5723-4d82-928b-f4be94e4bbfd-rbac\") pod \"cloudkitty-lokistack-gateway-7db4f4db8c-s5zzt\" (UID: \"f6b72417-5723-4d82-928b-f4be94e4bbfd\") " pod="openstack/cloudkitty-lokistack-gateway-7db4f4db8c-s5zzt" Jan 28 20:57:48 crc kubenswrapper[4746]: I0128 20:57:48.352570 4746 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"tenants\" (UniqueName: \"kubernetes.io/secret/247c16c1-2e4e-48dd-b836-0792f7231417-tenants\") pod \"cloudkitty-lokistack-gateway-7db4f4db8c-jptw9\" (UID: \"247c16c1-2e4e-48dd-b836-0792f7231417\") " pod="openstack/cloudkitty-lokistack-gateway-7db4f4db8c-jptw9" Jan 28 20:57:48 crc kubenswrapper[4746]: I0128 20:57:48.352674 4746 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cloudkitty-lokistack-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/f6b72417-5723-4d82-928b-f4be94e4bbfd-cloudkitty-lokistack-ca-bundle\") pod \"cloudkitty-lokistack-gateway-7db4f4db8c-s5zzt\" (UID: \"f6b72417-5723-4d82-928b-f4be94e4bbfd\") " pod="openstack/cloudkitty-lokistack-gateway-7db4f4db8c-s5zzt" Jan 28 20:57:48 crc kubenswrapper[4746]: I0128 20:57:48.352761 4746 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cloudkitty-lokistack-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/247c16c1-2e4e-48dd-b836-0792f7231417-cloudkitty-lokistack-ca-bundle\") pod \"cloudkitty-lokistack-gateway-7db4f4db8c-jptw9\" (UID: \"247c16c1-2e4e-48dd-b836-0792f7231417\") " pod="openstack/cloudkitty-lokistack-gateway-7db4f4db8c-jptw9" Jan 28 20:57:48 crc 
kubenswrapper[4746]: I0128 20:57:48.352800 4746 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"lokistack-gateway\" (UniqueName: \"kubernetes.io/configmap/f6b72417-5723-4d82-928b-f4be94e4bbfd-lokistack-gateway\") pod \"cloudkitty-lokistack-gateway-7db4f4db8c-s5zzt\" (UID: \"f6b72417-5723-4d82-928b-f4be94e4bbfd\") " pod="openstack/cloudkitty-lokistack-gateway-7db4f4db8c-s5zzt" Jan 28 20:57:48 crc kubenswrapper[4746]: I0128 20:57:48.353266 4746 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cloudkitty-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/f6b72417-5723-4d82-928b-f4be94e4bbfd-cloudkitty-ca-bundle\") pod \"cloudkitty-lokistack-gateway-7db4f4db8c-s5zzt\" (UID: \"f6b72417-5723-4d82-928b-f4be94e4bbfd\") " pod="openstack/cloudkitty-lokistack-gateway-7db4f4db8c-s5zzt" Jan 28 20:57:48 crc kubenswrapper[4746]: I0128 20:57:48.356222 4746 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cloudkitty-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/247c16c1-2e4e-48dd-b836-0792f7231417-cloudkitty-ca-bundle\") pod \"cloudkitty-lokistack-gateway-7db4f4db8c-jptw9\" (UID: \"247c16c1-2e4e-48dd-b836-0792f7231417\") " pod="openstack/cloudkitty-lokistack-gateway-7db4f4db8c-jptw9" Jan 28 20:57:48 crc kubenswrapper[4746]: I0128 20:57:48.356791 4746 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/cloudkitty-lokistack-query-frontend-5cd44666df-gnlkh" Jan 28 20:57:48 crc kubenswrapper[4746]: I0128 20:57:48.356920 4746 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"lokistack-gateway\" (UniqueName: \"kubernetes.io/configmap/f6b72417-5723-4d82-928b-f4be94e4bbfd-lokistack-gateway\") pod \"cloudkitty-lokistack-gateway-7db4f4db8c-s5zzt\" (UID: \"f6b72417-5723-4d82-928b-f4be94e4bbfd\") " pod="openstack/cloudkitty-lokistack-gateway-7db4f4db8c-s5zzt" Jan 28 20:57:48 crc kubenswrapper[4746]: I0128 20:57:48.356955 4746 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cloudkitty-lokistack-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/f6b72417-5723-4d82-928b-f4be94e4bbfd-cloudkitty-lokistack-ca-bundle\") pod \"cloudkitty-lokistack-gateway-7db4f4db8c-s5zzt\" (UID: \"f6b72417-5723-4d82-928b-f4be94e4bbfd\") " pod="openstack/cloudkitty-lokistack-gateway-7db4f4db8c-s5zzt" Jan 28 20:57:48 crc kubenswrapper[4746]: I0128 20:57:48.357249 4746 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rbac\" (UniqueName: \"kubernetes.io/configmap/f6b72417-5723-4d82-928b-f4be94e4bbfd-rbac\") pod \"cloudkitty-lokistack-gateway-7db4f4db8c-s5zzt\" (UID: \"f6b72417-5723-4d82-928b-f4be94e4bbfd\") " pod="openstack/cloudkitty-lokistack-gateway-7db4f4db8c-s5zzt" Jan 28 20:57:48 crc kubenswrapper[4746]: I0128 20:57:48.357678 4746 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"tenants\" (UniqueName: \"kubernetes.io/secret/247c16c1-2e4e-48dd-b836-0792f7231417-tenants\") pod \"cloudkitty-lokistack-gateway-7db4f4db8c-jptw9\" (UID: \"247c16c1-2e4e-48dd-b836-0792f7231417\") " pod="openstack/cloudkitty-lokistack-gateway-7db4f4db8c-jptw9" Jan 28 20:57:48 crc kubenswrapper[4746]: I0128 20:57:48.358410 4746 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cloudkitty-lokistack-gateway-client-http\" (UniqueName: 
\"kubernetes.io/secret/f6b72417-5723-4d82-928b-f4be94e4bbfd-cloudkitty-lokistack-gateway-client-http\") pod \"cloudkitty-lokistack-gateway-7db4f4db8c-s5zzt\" (UID: \"f6b72417-5723-4d82-928b-f4be94e4bbfd\") " pod="openstack/cloudkitty-lokistack-gateway-7db4f4db8c-s5zzt" Jan 28 20:57:48 crc kubenswrapper[4746]: I0128 20:57:48.359065 4746 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cloudkitty-lokistack-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/247c16c1-2e4e-48dd-b836-0792f7231417-cloudkitty-lokistack-ca-bundle\") pod \"cloudkitty-lokistack-gateway-7db4f4db8c-jptw9\" (UID: \"247c16c1-2e4e-48dd-b836-0792f7231417\") " pod="openstack/cloudkitty-lokistack-gateway-7db4f4db8c-jptw9" Jan 28 20:57:48 crc kubenswrapper[4746]: I0128 20:57:48.359116 4746 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"tenants\" (UniqueName: \"kubernetes.io/secret/f6b72417-5723-4d82-928b-f4be94e4bbfd-tenants\") pod \"cloudkitty-lokistack-gateway-7db4f4db8c-s5zzt\" (UID: \"f6b72417-5723-4d82-928b-f4be94e4bbfd\") " pod="openstack/cloudkitty-lokistack-gateway-7db4f4db8c-s5zzt" Jan 28 20:57:48 crc kubenswrapper[4746]: I0128 20:57:48.359165 4746 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"lokistack-gateway\" (UniqueName: \"kubernetes.io/configmap/247c16c1-2e4e-48dd-b836-0792f7231417-lokistack-gateway\") pod \"cloudkitty-lokistack-gateway-7db4f4db8c-jptw9\" (UID: \"247c16c1-2e4e-48dd-b836-0792f7231417\") " pod="openstack/cloudkitty-lokistack-gateway-7db4f4db8c-jptw9" Jan 28 20:57:48 crc kubenswrapper[4746]: I0128 20:57:48.359216 4746 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cloudkitty-lokistack-gateway-client-http\" (UniqueName: \"kubernetes.io/secret/247c16c1-2e4e-48dd-b836-0792f7231417-cloudkitty-lokistack-gateway-client-http\") pod \"cloudkitty-lokistack-gateway-7db4f4db8c-jptw9\" (UID: \"247c16c1-2e4e-48dd-b836-0792f7231417\") " pod="openstack/cloudkitty-lokistack-gateway-7db4f4db8c-jptw9" 
Jan 28 20:57:48 crc kubenswrapper[4746]: I0128 20:57:48.369674 4746 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-mw2vk\" (UniqueName: \"kubernetes.io/projected/247c16c1-2e4e-48dd-b836-0792f7231417-kube-api-access-mw2vk\") pod \"cloudkitty-lokistack-gateway-7db4f4db8c-jptw9\" (UID: \"247c16c1-2e4e-48dd-b836-0792f7231417\") " pod="openstack/cloudkitty-lokistack-gateway-7db4f4db8c-jptw9" Jan 28 20:57:48 crc kubenswrapper[4746]: I0128 20:57:48.370531 4746 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-874kl\" (UniqueName: \"kubernetes.io/projected/f6b72417-5723-4d82-928b-f4be94e4bbfd-kube-api-access-874kl\") pod \"cloudkitty-lokistack-gateway-7db4f4db8c-s5zzt\" (UID: \"f6b72417-5723-4d82-928b-f4be94e4bbfd\") " pod="openstack/cloudkitty-lokistack-gateway-7db4f4db8c-s5zzt" Jan 28 20:57:48 crc kubenswrapper[4746]: I0128 20:57:48.857277 4746 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/cloudkitty-lokistack-ingester-0"] Jan 28 20:57:48 crc kubenswrapper[4746]: I0128 20:57:48.859257 4746 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/cloudkitty-lokistack-ingester-0" Jan 28 20:57:48 crc kubenswrapper[4746]: I0128 20:57:48.864471 4746 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cloudkitty-lokistack-ingester-grpc" Jan 28 20:57:48 crc kubenswrapper[4746]: I0128 20:57:48.864583 4746 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cloudkitty-lokistack-ingester-http" Jan 28 20:57:48 crc kubenswrapper[4746]: I0128 20:57:48.865355 4746 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"tls-secret\" (UniqueName: \"kubernetes.io/secret/f6b72417-5723-4d82-928b-f4be94e4bbfd-tls-secret\") pod \"cloudkitty-lokistack-gateway-7db4f4db8c-s5zzt\" (UID: \"f6b72417-5723-4d82-928b-f4be94e4bbfd\") " pod="openstack/cloudkitty-lokistack-gateway-7db4f4db8c-s5zzt" Jan 28 20:57:48 crc kubenswrapper[4746]: I0128 20:57:48.865463 4746 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"tls-secret\" (UniqueName: \"kubernetes.io/secret/247c16c1-2e4e-48dd-b836-0792f7231417-tls-secret\") pod \"cloudkitty-lokistack-gateway-7db4f4db8c-jptw9\" (UID: \"247c16c1-2e4e-48dd-b836-0792f7231417\") " pod="openstack/cloudkitty-lokistack-gateway-7db4f4db8c-jptw9" Jan 28 20:57:48 crc kubenswrapper[4746]: I0128 20:57:48.872951 4746 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"tls-secret\" (UniqueName: \"kubernetes.io/secret/247c16c1-2e4e-48dd-b836-0792f7231417-tls-secret\") pod \"cloudkitty-lokistack-gateway-7db4f4db8c-jptw9\" (UID: \"247c16c1-2e4e-48dd-b836-0792f7231417\") " pod="openstack/cloudkitty-lokistack-gateway-7db4f4db8c-jptw9" Jan 28 20:57:48 crc kubenswrapper[4746]: I0128 20:57:48.875418 4746 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"tls-secret\" (UniqueName: \"kubernetes.io/secret/f6b72417-5723-4d82-928b-f4be94e4bbfd-tls-secret\") pod \"cloudkitty-lokistack-gateway-7db4f4db8c-s5zzt\" (UID: \"f6b72417-5723-4d82-928b-f4be94e4bbfd\") 
" pod="openstack/cloudkitty-lokistack-gateway-7db4f4db8c-s5zzt" Jan 28 20:57:48 crc kubenswrapper[4746]: I0128 20:57:48.881066 4746 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cloudkitty-lokistack-ingester-0"] Jan 28 20:57:48 crc kubenswrapper[4746]: I0128 20:57:48.967230 4746 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cloudkitty-lokistack-ingester-http\" (UniqueName: \"kubernetes.io/secret/d3cad0b0-7b53-4280-9dec-05e01692820c-cloudkitty-lokistack-ingester-http\") pod \"cloudkitty-lokistack-ingester-0\" (UID: \"d3cad0b0-7b53-4280-9dec-05e01692820c\") " pod="openstack/cloudkitty-lokistack-ingester-0" Jan 28 20:57:48 crc kubenswrapper[4746]: I0128 20:57:48.967306 4746 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/d3cad0b0-7b53-4280-9dec-05e01692820c-config\") pod \"cloudkitty-lokistack-ingester-0\" (UID: \"d3cad0b0-7b53-4280-9dec-05e01692820c\") " pod="openstack/cloudkitty-lokistack-ingester-0" Jan 28 20:57:48 crc kubenswrapper[4746]: I0128 20:57:48.967360 4746 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"local-storage06-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage06-crc\") pod \"cloudkitty-lokistack-ingester-0\" (UID: \"d3cad0b0-7b53-4280-9dec-05e01692820c\") " pod="openstack/cloudkitty-lokistack-ingester-0" Jan 28 20:57:48 crc kubenswrapper[4746]: I0128 20:57:48.967385 4746 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cloudkitty-loki-s3\" (UniqueName: \"kubernetes.io/secret/d3cad0b0-7b53-4280-9dec-05e01692820c-cloudkitty-loki-s3\") pod \"cloudkitty-lokistack-ingester-0\" (UID: \"d3cad0b0-7b53-4280-9dec-05e01692820c\") " pod="openstack/cloudkitty-lokistack-ingester-0" Jan 28 20:57:48 crc kubenswrapper[4746]: I0128 20:57:48.967452 4746 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"cloudkitty-lokistack-ingester-grpc\" (UniqueName: \"kubernetes.io/secret/d3cad0b0-7b53-4280-9dec-05e01692820c-cloudkitty-lokistack-ingester-grpc\") pod \"cloudkitty-lokistack-ingester-0\" (UID: \"d3cad0b0-7b53-4280-9dec-05e01692820c\") " pod="openstack/cloudkitty-lokistack-ingester-0" Jan 28 20:57:48 crc kubenswrapper[4746]: I0128 20:57:48.967513 4746 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"local-storage11-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage11-crc\") pod \"cloudkitty-lokistack-ingester-0\" (UID: \"d3cad0b0-7b53-4280-9dec-05e01692820c\") " pod="openstack/cloudkitty-lokistack-ingester-0" Jan 28 20:57:48 crc kubenswrapper[4746]: I0128 20:57:48.967594 4746 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cloudkitty-lokistack-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/d3cad0b0-7b53-4280-9dec-05e01692820c-cloudkitty-lokistack-ca-bundle\") pod \"cloudkitty-lokistack-ingester-0\" (UID: \"d3cad0b0-7b53-4280-9dec-05e01692820c\") " pod="openstack/cloudkitty-lokistack-ingester-0" Jan 28 20:57:48 crc kubenswrapper[4746]: I0128 20:57:48.967620 4746 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-wqw9q\" (UniqueName: \"kubernetes.io/projected/d3cad0b0-7b53-4280-9dec-05e01692820c-kube-api-access-wqw9q\") pod \"cloudkitty-lokistack-ingester-0\" (UID: \"d3cad0b0-7b53-4280-9dec-05e01692820c\") " pod="openstack/cloudkitty-lokistack-ingester-0" Jan 28 20:57:48 crc kubenswrapper[4746]: I0128 20:57:48.993239 4746 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/cloudkitty-lokistack-compactor-0"] Jan 28 20:57:48 crc kubenswrapper[4746]: I0128 20:57:48.996435 4746 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/cloudkitty-lokistack-compactor-0" Jan 28 20:57:48 crc kubenswrapper[4746]: I0128 20:57:48.998696 4746 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cloudkitty-lokistack-compactor-grpc" Jan 28 20:57:48 crc kubenswrapper[4746]: I0128 20:57:48.998815 4746 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cloudkitty-lokistack-compactor-http" Jan 28 20:57:49 crc kubenswrapper[4746]: I0128 20:57:49.011554 4746 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cloudkitty-lokistack-compactor-0"] Jan 28 20:57:49 crc kubenswrapper[4746]: I0128 20:57:49.070037 4746 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cloudkitty-lokistack-ingester-grpc\" (UniqueName: \"kubernetes.io/secret/d3cad0b0-7b53-4280-9dec-05e01692820c-cloudkitty-lokistack-ingester-grpc\") pod \"cloudkitty-lokistack-ingester-0\" (UID: \"d3cad0b0-7b53-4280-9dec-05e01692820c\") " pod="openstack/cloudkitty-lokistack-ingester-0" Jan 28 20:57:49 crc kubenswrapper[4746]: I0128 20:57:49.070137 4746 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"local-storage11-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage11-crc\") pod \"cloudkitty-lokistack-ingester-0\" (UID: \"d3cad0b0-7b53-4280-9dec-05e01692820c\") " pod="openstack/cloudkitty-lokistack-ingester-0" Jan 28 20:57:49 crc kubenswrapper[4746]: I0128 20:57:49.070189 4746 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cloudkitty-lokistack-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/d3cad0b0-7b53-4280-9dec-05e01692820c-cloudkitty-lokistack-ca-bundle\") pod \"cloudkitty-lokistack-ingester-0\" (UID: \"d3cad0b0-7b53-4280-9dec-05e01692820c\") " pod="openstack/cloudkitty-lokistack-ingester-0" Jan 28 20:57:49 crc kubenswrapper[4746]: I0128 20:57:49.070218 4746 reconciler_common.go:218] "operationExecutor.MountVolume started for volume 
\"kube-api-access-wqw9q\" (UniqueName: \"kubernetes.io/projected/d3cad0b0-7b53-4280-9dec-05e01692820c-kube-api-access-wqw9q\") pod \"cloudkitty-lokistack-ingester-0\" (UID: \"d3cad0b0-7b53-4280-9dec-05e01692820c\") " pod="openstack/cloudkitty-lokistack-ingester-0" Jan 28 20:57:49 crc kubenswrapper[4746]: I0128 20:57:49.070287 4746 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cloudkitty-lokistack-ingester-http\" (UniqueName: \"kubernetes.io/secret/d3cad0b0-7b53-4280-9dec-05e01692820c-cloudkitty-lokistack-ingester-http\") pod \"cloudkitty-lokistack-ingester-0\" (UID: \"d3cad0b0-7b53-4280-9dec-05e01692820c\") " pod="openstack/cloudkitty-lokistack-ingester-0" Jan 28 20:57:49 crc kubenswrapper[4746]: I0128 20:57:49.070352 4746 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/d3cad0b0-7b53-4280-9dec-05e01692820c-config\") pod \"cloudkitty-lokistack-ingester-0\" (UID: \"d3cad0b0-7b53-4280-9dec-05e01692820c\") " pod="openstack/cloudkitty-lokistack-ingester-0" Jan 28 20:57:49 crc kubenswrapper[4746]: I0128 20:57:49.070403 4746 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"local-storage06-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage06-crc\") pod \"cloudkitty-lokistack-ingester-0\" (UID: \"d3cad0b0-7b53-4280-9dec-05e01692820c\") " pod="openstack/cloudkitty-lokistack-ingester-0" Jan 28 20:57:49 crc kubenswrapper[4746]: I0128 20:57:49.070424 4746 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cloudkitty-loki-s3\" (UniqueName: \"kubernetes.io/secret/d3cad0b0-7b53-4280-9dec-05e01692820c-cloudkitty-loki-s3\") pod \"cloudkitty-lokistack-ingester-0\" (UID: \"d3cad0b0-7b53-4280-9dec-05e01692820c\") " pod="openstack/cloudkitty-lokistack-ingester-0" Jan 28 20:57:49 crc kubenswrapper[4746]: I0128 20:57:49.070583 4746 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume 
\"local-storage11-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage11-crc\") pod \"cloudkitty-lokistack-ingester-0\" (UID: \"d3cad0b0-7b53-4280-9dec-05e01692820c\") device mount path \"/mnt/openstack/pv11\"" pod="openstack/cloudkitty-lokistack-ingester-0" Jan 28 20:57:49 crc kubenswrapper[4746]: I0128 20:57:49.071423 4746 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cloudkitty-lokistack-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/d3cad0b0-7b53-4280-9dec-05e01692820c-cloudkitty-lokistack-ca-bundle\") pod \"cloudkitty-lokistack-ingester-0\" (UID: \"d3cad0b0-7b53-4280-9dec-05e01692820c\") " pod="openstack/cloudkitty-lokistack-ingester-0" Jan 28 20:57:49 crc kubenswrapper[4746]: I0128 20:57:49.071748 4746 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"local-storage06-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage06-crc\") pod \"cloudkitty-lokistack-ingester-0\" (UID: \"d3cad0b0-7b53-4280-9dec-05e01692820c\") device mount path \"/mnt/openstack/pv06\"" pod="openstack/cloudkitty-lokistack-ingester-0" Jan 28 20:57:49 crc kubenswrapper[4746]: I0128 20:57:49.071909 4746 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/d3cad0b0-7b53-4280-9dec-05e01692820c-config\") pod \"cloudkitty-lokistack-ingester-0\" (UID: \"d3cad0b0-7b53-4280-9dec-05e01692820c\") " pod="openstack/cloudkitty-lokistack-ingester-0" Jan 28 20:57:49 crc kubenswrapper[4746]: I0128 20:57:49.089926 4746 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cloudkitty-lokistack-ingester-http\" (UniqueName: \"kubernetes.io/secret/d3cad0b0-7b53-4280-9dec-05e01692820c-cloudkitty-lokistack-ingester-http\") pod \"cloudkitty-lokistack-ingester-0\" (UID: \"d3cad0b0-7b53-4280-9dec-05e01692820c\") " pod="openstack/cloudkitty-lokistack-ingester-0" Jan 28 20:57:49 crc kubenswrapper[4746]: I0128 20:57:49.096622 4746 operation_generator.go:637] 
"MountVolume.SetUp succeeded for volume \"cloudkitty-loki-s3\" (UniqueName: \"kubernetes.io/secret/d3cad0b0-7b53-4280-9dec-05e01692820c-cloudkitty-loki-s3\") pod \"cloudkitty-lokistack-ingester-0\" (UID: \"d3cad0b0-7b53-4280-9dec-05e01692820c\") " pod="openstack/cloudkitty-lokistack-ingester-0" Jan 28 20:57:49 crc kubenswrapper[4746]: I0128 20:57:49.101088 4746 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-wqw9q\" (UniqueName: \"kubernetes.io/projected/d3cad0b0-7b53-4280-9dec-05e01692820c-kube-api-access-wqw9q\") pod \"cloudkitty-lokistack-ingester-0\" (UID: \"d3cad0b0-7b53-4280-9dec-05e01692820c\") " pod="openstack/cloudkitty-lokistack-ingester-0" Jan 28 20:57:49 crc kubenswrapper[4746]: I0128 20:57:49.105212 4746 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/cloudkitty-lokistack-index-gateway-0"] Jan 28 20:57:49 crc kubenswrapper[4746]: I0128 20:57:49.106591 4746 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cloudkitty-lokistack-ingester-grpc\" (UniqueName: \"kubernetes.io/secret/d3cad0b0-7b53-4280-9dec-05e01692820c-cloudkitty-lokistack-ingester-grpc\") pod \"cloudkitty-lokistack-ingester-0\" (UID: \"d3cad0b0-7b53-4280-9dec-05e01692820c\") " pod="openstack/cloudkitty-lokistack-ingester-0" Jan 28 20:57:49 crc kubenswrapper[4746]: I0128 20:57:49.107382 4746 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/cloudkitty-lokistack-index-gateway-0" Jan 28 20:57:49 crc kubenswrapper[4746]: I0128 20:57:49.113547 4746 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cloudkitty-lokistack-index-gateway-http" Jan 28 20:57:49 crc kubenswrapper[4746]: I0128 20:57:49.113840 4746 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cloudkitty-lokistack-index-gateway-grpc" Jan 28 20:57:49 crc kubenswrapper[4746]: I0128 20:57:49.116171 4746 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cloudkitty-lokistack-index-gateway-0"] Jan 28 20:57:49 crc kubenswrapper[4746]: I0128 20:57:49.127766 4746 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"local-storage11-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage11-crc\") pod \"cloudkitty-lokistack-ingester-0\" (UID: \"d3cad0b0-7b53-4280-9dec-05e01692820c\") " pod="openstack/cloudkitty-lokistack-ingester-0" Jan 28 20:57:49 crc kubenswrapper[4746]: I0128 20:57:49.141454 4746 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/cloudkitty-lokistack-gateway-7db4f4db8c-jptw9" Jan 28 20:57:49 crc kubenswrapper[4746]: I0128 20:57:49.145301 4746 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"local-storage06-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage06-crc\") pod \"cloudkitty-lokistack-ingester-0\" (UID: \"d3cad0b0-7b53-4280-9dec-05e01692820c\") " pod="openstack/cloudkitty-lokistack-ingester-0" Jan 28 20:57:49 crc kubenswrapper[4746]: I0128 20:57:49.149592 4746 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/cloudkitty-lokistack-gateway-7db4f4db8c-s5zzt" Jan 28 20:57:49 crc kubenswrapper[4746]: I0128 20:57:49.172686 4746 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/6edc718f-ce48-415e-ae81-574ef1f48cb6-config\") pod \"cloudkitty-lokistack-compactor-0\" (UID: \"6edc718f-ce48-415e-ae81-574ef1f48cb6\") " pod="openstack/cloudkitty-lokistack-compactor-0" Jan 28 20:57:49 crc kubenswrapper[4746]: I0128 20:57:49.172932 4746 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-j5nkz\" (UniqueName: \"kubernetes.io/projected/6edc718f-ce48-415e-ae81-574ef1f48cb6-kube-api-access-j5nkz\") pod \"cloudkitty-lokistack-compactor-0\" (UID: \"6edc718f-ce48-415e-ae81-574ef1f48cb6\") " pod="openstack/cloudkitty-lokistack-compactor-0" Jan 28 20:57:49 crc kubenswrapper[4746]: I0128 20:57:49.172972 4746 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cloudkitty-lokistack-compactor-grpc\" (UniqueName: \"kubernetes.io/secret/6edc718f-ce48-415e-ae81-574ef1f48cb6-cloudkitty-lokistack-compactor-grpc\") pod \"cloudkitty-lokistack-compactor-0\" (UID: \"6edc718f-ce48-415e-ae81-574ef1f48cb6\") " pod="openstack/cloudkitty-lokistack-compactor-0" Jan 28 20:57:49 crc kubenswrapper[4746]: I0128 20:57:49.173172 4746 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cloudkitty-loki-s3\" (UniqueName: \"kubernetes.io/secret/6edc718f-ce48-415e-ae81-574ef1f48cb6-cloudkitty-loki-s3\") pod \"cloudkitty-lokistack-compactor-0\" (UID: \"6edc718f-ce48-415e-ae81-574ef1f48cb6\") " pod="openstack/cloudkitty-lokistack-compactor-0" Jan 28 20:57:49 crc kubenswrapper[4746]: I0128 20:57:49.173205 4746 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cloudkitty-lokistack-compactor-http\" 
(UniqueName: \"kubernetes.io/secret/6edc718f-ce48-415e-ae81-574ef1f48cb6-cloudkitty-lokistack-compactor-http\") pod \"cloudkitty-lokistack-compactor-0\" (UID: \"6edc718f-ce48-415e-ae81-574ef1f48cb6\") " pod="openstack/cloudkitty-lokistack-compactor-0" Jan 28 20:57:49 crc kubenswrapper[4746]: I0128 20:57:49.173255 4746 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"local-storage02-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage02-crc\") pod \"cloudkitty-lokistack-compactor-0\" (UID: \"6edc718f-ce48-415e-ae81-574ef1f48cb6\") " pod="openstack/cloudkitty-lokistack-compactor-0" Jan 28 20:57:49 crc kubenswrapper[4746]: I0128 20:57:49.173274 4746 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cloudkitty-lokistack-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/6edc718f-ce48-415e-ae81-574ef1f48cb6-cloudkitty-lokistack-ca-bundle\") pod \"cloudkitty-lokistack-compactor-0\" (UID: \"6edc718f-ce48-415e-ae81-574ef1f48cb6\") " pod="openstack/cloudkitty-lokistack-compactor-0" Jan 28 20:57:49 crc kubenswrapper[4746]: I0128 20:57:49.231577 4746 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/cloudkitty-lokistack-ingester-0" Jan 28 20:57:49 crc kubenswrapper[4746]: I0128 20:57:49.275089 4746 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/6edc718f-ce48-415e-ae81-574ef1f48cb6-config\") pod \"cloudkitty-lokistack-compactor-0\" (UID: \"6edc718f-ce48-415e-ae81-574ef1f48cb6\") " pod="openstack/cloudkitty-lokistack-compactor-0" Jan 28 20:57:49 crc kubenswrapper[4746]: I0128 20:57:49.275158 4746 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"local-storage04-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage04-crc\") pod \"cloudkitty-lokistack-index-gateway-0\" (UID: \"e6e1dec5-d0eb-4a49-b8c7-c89f3defbcef\") " pod="openstack/cloudkitty-lokistack-index-gateway-0" Jan 28 20:57:49 crc kubenswrapper[4746]: I0128 20:57:49.275221 4746 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-j5nkz\" (UniqueName: \"kubernetes.io/projected/6edc718f-ce48-415e-ae81-574ef1f48cb6-kube-api-access-j5nkz\") pod \"cloudkitty-lokistack-compactor-0\" (UID: \"6edc718f-ce48-415e-ae81-574ef1f48cb6\") " pod="openstack/cloudkitty-lokistack-compactor-0" Jan 28 20:57:49 crc kubenswrapper[4746]: I0128 20:57:49.275249 4746 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cloudkitty-lokistack-compactor-grpc\" (UniqueName: \"kubernetes.io/secret/6edc718f-ce48-415e-ae81-574ef1f48cb6-cloudkitty-lokistack-compactor-grpc\") pod \"cloudkitty-lokistack-compactor-0\" (UID: \"6edc718f-ce48-415e-ae81-574ef1f48cb6\") " pod="openstack/cloudkitty-lokistack-compactor-0" Jan 28 20:57:49 crc kubenswrapper[4746]: I0128 20:57:49.275267 4746 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cloudkitty-loki-s3\" (UniqueName: \"kubernetes.io/secret/6edc718f-ce48-415e-ae81-574ef1f48cb6-cloudkitty-loki-s3\") pod 
\"cloudkitty-lokistack-compactor-0\" (UID: \"6edc718f-ce48-415e-ae81-574ef1f48cb6\") " pod="openstack/cloudkitty-lokistack-compactor-0" Jan 28 20:57:49 crc kubenswrapper[4746]: I0128 20:57:49.275284 4746 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cloudkitty-loki-s3\" (UniqueName: \"kubernetes.io/secret/e6e1dec5-d0eb-4a49-b8c7-c89f3defbcef-cloudkitty-loki-s3\") pod \"cloudkitty-lokistack-index-gateway-0\" (UID: \"e6e1dec5-d0eb-4a49-b8c7-c89f3defbcef\") " pod="openstack/cloudkitty-lokistack-index-gateway-0" Jan 28 20:57:49 crc kubenswrapper[4746]: I0128 20:57:49.275301 4746 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cloudkitty-lokistack-compactor-http\" (UniqueName: \"kubernetes.io/secret/6edc718f-ce48-415e-ae81-574ef1f48cb6-cloudkitty-lokistack-compactor-http\") pod \"cloudkitty-lokistack-compactor-0\" (UID: \"6edc718f-ce48-415e-ae81-574ef1f48cb6\") " pod="openstack/cloudkitty-lokistack-compactor-0" Jan 28 20:57:49 crc kubenswrapper[4746]: I0128 20:57:49.275322 4746 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cloudkitty-lokistack-index-gateway-grpc\" (UniqueName: \"kubernetes.io/secret/e6e1dec5-d0eb-4a49-b8c7-c89f3defbcef-cloudkitty-lokistack-index-gateway-grpc\") pod \"cloudkitty-lokistack-index-gateway-0\" (UID: \"e6e1dec5-d0eb-4a49-b8c7-c89f3defbcef\") " pod="openstack/cloudkitty-lokistack-index-gateway-0" Jan 28 20:57:49 crc kubenswrapper[4746]: I0128 20:57:49.275342 4746 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/e6e1dec5-d0eb-4a49-b8c7-c89f3defbcef-config\") pod \"cloudkitty-lokistack-index-gateway-0\" (UID: \"e6e1dec5-d0eb-4a49-b8c7-c89f3defbcef\") " pod="openstack/cloudkitty-lokistack-index-gateway-0" Jan 28 20:57:49 crc kubenswrapper[4746]: I0128 20:57:49.275360 4746 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"local-storage02-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage02-crc\") pod \"cloudkitty-lokistack-compactor-0\" (UID: \"6edc718f-ce48-415e-ae81-574ef1f48cb6\") " pod="openstack/cloudkitty-lokistack-compactor-0" Jan 28 20:57:49 crc kubenswrapper[4746]: I0128 20:57:49.275377 4746 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cloudkitty-lokistack-index-gateway-http\" (UniqueName: \"kubernetes.io/secret/e6e1dec5-d0eb-4a49-b8c7-c89f3defbcef-cloudkitty-lokistack-index-gateway-http\") pod \"cloudkitty-lokistack-index-gateway-0\" (UID: \"e6e1dec5-d0eb-4a49-b8c7-c89f3defbcef\") " pod="openstack/cloudkitty-lokistack-index-gateway-0" Jan 28 20:57:49 crc kubenswrapper[4746]: I0128 20:57:49.275394 4746 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-csnfv\" (UniqueName: \"kubernetes.io/projected/e6e1dec5-d0eb-4a49-b8c7-c89f3defbcef-kube-api-access-csnfv\") pod \"cloudkitty-lokistack-index-gateway-0\" (UID: \"e6e1dec5-d0eb-4a49-b8c7-c89f3defbcef\") " pod="openstack/cloudkitty-lokistack-index-gateway-0" Jan 28 20:57:49 crc kubenswrapper[4746]: I0128 20:57:49.275413 4746 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cloudkitty-lokistack-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/6edc718f-ce48-415e-ae81-574ef1f48cb6-cloudkitty-lokistack-ca-bundle\") pod \"cloudkitty-lokistack-compactor-0\" (UID: \"6edc718f-ce48-415e-ae81-574ef1f48cb6\") " pod="openstack/cloudkitty-lokistack-compactor-0" Jan 28 20:57:49 crc kubenswrapper[4746]: I0128 20:57:49.275440 4746 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cloudkitty-lokistack-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/e6e1dec5-d0eb-4a49-b8c7-c89f3defbcef-cloudkitty-lokistack-ca-bundle\") pod \"cloudkitty-lokistack-index-gateway-0\" (UID: 
\"e6e1dec5-d0eb-4a49-b8c7-c89f3defbcef\") " pod="openstack/cloudkitty-lokistack-index-gateway-0" Jan 28 20:57:49 crc kubenswrapper[4746]: I0128 20:57:49.275778 4746 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"local-storage02-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage02-crc\") pod \"cloudkitty-lokistack-compactor-0\" (UID: \"6edc718f-ce48-415e-ae81-574ef1f48cb6\") device mount path \"/mnt/openstack/pv02\"" pod="openstack/cloudkitty-lokistack-compactor-0" Jan 28 20:57:49 crc kubenswrapper[4746]: I0128 20:57:49.276346 4746 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cloudkitty-lokistack-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/6edc718f-ce48-415e-ae81-574ef1f48cb6-cloudkitty-lokistack-ca-bundle\") pod \"cloudkitty-lokistack-compactor-0\" (UID: \"6edc718f-ce48-415e-ae81-574ef1f48cb6\") " pod="openstack/cloudkitty-lokistack-compactor-0" Jan 28 20:57:49 crc kubenswrapper[4746]: I0128 20:57:49.276900 4746 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/6edc718f-ce48-415e-ae81-574ef1f48cb6-config\") pod \"cloudkitty-lokistack-compactor-0\" (UID: \"6edc718f-ce48-415e-ae81-574ef1f48cb6\") " pod="openstack/cloudkitty-lokistack-compactor-0" Jan 28 20:57:49 crc kubenswrapper[4746]: I0128 20:57:49.281717 4746 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cloudkitty-lokistack-compactor-http\" (UniqueName: \"kubernetes.io/secret/6edc718f-ce48-415e-ae81-574ef1f48cb6-cloudkitty-lokistack-compactor-http\") pod \"cloudkitty-lokistack-compactor-0\" (UID: \"6edc718f-ce48-415e-ae81-574ef1f48cb6\") " pod="openstack/cloudkitty-lokistack-compactor-0" Jan 28 20:57:49 crc kubenswrapper[4746]: I0128 20:57:49.282782 4746 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cloudkitty-lokistack-compactor-grpc\" (UniqueName: 
\"kubernetes.io/secret/6edc718f-ce48-415e-ae81-574ef1f48cb6-cloudkitty-lokistack-compactor-grpc\") pod \"cloudkitty-lokistack-compactor-0\" (UID: \"6edc718f-ce48-415e-ae81-574ef1f48cb6\") " pod="openstack/cloudkitty-lokistack-compactor-0" Jan 28 20:57:49 crc kubenswrapper[4746]: I0128 20:57:49.282924 4746 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cloudkitty-loki-s3\" (UniqueName: \"kubernetes.io/secret/6edc718f-ce48-415e-ae81-574ef1f48cb6-cloudkitty-loki-s3\") pod \"cloudkitty-lokistack-compactor-0\" (UID: \"6edc718f-ce48-415e-ae81-574ef1f48cb6\") " pod="openstack/cloudkitty-lokistack-compactor-0" Jan 28 20:57:49 crc kubenswrapper[4746]: I0128 20:57:49.295027 4746 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-j5nkz\" (UniqueName: \"kubernetes.io/projected/6edc718f-ce48-415e-ae81-574ef1f48cb6-kube-api-access-j5nkz\") pod \"cloudkitty-lokistack-compactor-0\" (UID: \"6edc718f-ce48-415e-ae81-574ef1f48cb6\") " pod="openstack/cloudkitty-lokistack-compactor-0" Jan 28 20:57:49 crc kubenswrapper[4746]: I0128 20:57:49.318849 4746 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"local-storage02-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage02-crc\") pod \"cloudkitty-lokistack-compactor-0\" (UID: \"6edc718f-ce48-415e-ae81-574ef1f48cb6\") " pod="openstack/cloudkitty-lokistack-compactor-0" Jan 28 20:57:49 crc kubenswrapper[4746]: I0128 20:57:49.358012 4746 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/cloudkitty-lokistack-compactor-0" Jan 28 20:57:49 crc kubenswrapper[4746]: I0128 20:57:49.379034 4746 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"local-storage04-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage04-crc\") pod \"cloudkitty-lokistack-index-gateway-0\" (UID: \"e6e1dec5-d0eb-4a49-b8c7-c89f3defbcef\") " pod="openstack/cloudkitty-lokistack-index-gateway-0" Jan 28 20:57:49 crc kubenswrapper[4746]: I0128 20:57:49.379306 4746 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"local-storage04-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage04-crc\") pod \"cloudkitty-lokistack-index-gateway-0\" (UID: \"e6e1dec5-d0eb-4a49-b8c7-c89f3defbcef\") device mount path \"/mnt/openstack/pv04\"" pod="openstack/cloudkitty-lokistack-index-gateway-0" Jan 28 20:57:49 crc kubenswrapper[4746]: I0128 20:57:49.379342 4746 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cloudkitty-loki-s3\" (UniqueName: \"kubernetes.io/secret/e6e1dec5-d0eb-4a49-b8c7-c89f3defbcef-cloudkitty-loki-s3\") pod \"cloudkitty-lokistack-index-gateway-0\" (UID: \"e6e1dec5-d0eb-4a49-b8c7-c89f3defbcef\") " pod="openstack/cloudkitty-lokistack-index-gateway-0" Jan 28 20:57:49 crc kubenswrapper[4746]: I0128 20:57:49.379650 4746 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cloudkitty-lokistack-index-gateway-grpc\" (UniqueName: \"kubernetes.io/secret/e6e1dec5-d0eb-4a49-b8c7-c89f3defbcef-cloudkitty-lokistack-index-gateway-grpc\") pod \"cloudkitty-lokistack-index-gateway-0\" (UID: \"e6e1dec5-d0eb-4a49-b8c7-c89f3defbcef\") " pod="openstack/cloudkitty-lokistack-index-gateway-0" Jan 28 20:57:49 crc kubenswrapper[4746]: I0128 20:57:49.379692 4746 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/e6e1dec5-d0eb-4a49-b8c7-c89f3defbcef-config\") pod 
\"cloudkitty-lokistack-index-gateway-0\" (UID: \"e6e1dec5-d0eb-4a49-b8c7-c89f3defbcef\") " pod="openstack/cloudkitty-lokistack-index-gateway-0" Jan 28 20:57:49 crc kubenswrapper[4746]: I0128 20:57:49.379722 4746 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cloudkitty-lokistack-index-gateway-http\" (UniqueName: \"kubernetes.io/secret/e6e1dec5-d0eb-4a49-b8c7-c89f3defbcef-cloudkitty-lokistack-index-gateway-http\") pod \"cloudkitty-lokistack-index-gateway-0\" (UID: \"e6e1dec5-d0eb-4a49-b8c7-c89f3defbcef\") " pod="openstack/cloudkitty-lokistack-index-gateway-0" Jan 28 20:57:49 crc kubenswrapper[4746]: I0128 20:57:49.379750 4746 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-csnfv\" (UniqueName: \"kubernetes.io/projected/e6e1dec5-d0eb-4a49-b8c7-c89f3defbcef-kube-api-access-csnfv\") pod \"cloudkitty-lokistack-index-gateway-0\" (UID: \"e6e1dec5-d0eb-4a49-b8c7-c89f3defbcef\") " pod="openstack/cloudkitty-lokistack-index-gateway-0" Jan 28 20:57:49 crc kubenswrapper[4746]: I0128 20:57:49.379908 4746 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cloudkitty-lokistack-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/e6e1dec5-d0eb-4a49-b8c7-c89f3defbcef-cloudkitty-lokistack-ca-bundle\") pod \"cloudkitty-lokistack-index-gateway-0\" (UID: \"e6e1dec5-d0eb-4a49-b8c7-c89f3defbcef\") " pod="openstack/cloudkitty-lokistack-index-gateway-0" Jan 28 20:57:49 crc kubenswrapper[4746]: I0128 20:57:49.381898 4746 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/e6e1dec5-d0eb-4a49-b8c7-c89f3defbcef-config\") pod \"cloudkitty-lokistack-index-gateway-0\" (UID: \"e6e1dec5-d0eb-4a49-b8c7-c89f3defbcef\") " pod="openstack/cloudkitty-lokistack-index-gateway-0" Jan 28 20:57:49 crc kubenswrapper[4746]: I0128 20:57:49.382884 4746 operation_generator.go:637] "MountVolume.SetUp succeeded for volume 
\"cloudkitty-lokistack-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/e6e1dec5-d0eb-4a49-b8c7-c89f3defbcef-cloudkitty-lokistack-ca-bundle\") pod \"cloudkitty-lokistack-index-gateway-0\" (UID: \"e6e1dec5-d0eb-4a49-b8c7-c89f3defbcef\") " pod="openstack/cloudkitty-lokistack-index-gateway-0" Jan 28 20:57:49 crc kubenswrapper[4746]: I0128 20:57:49.385254 4746 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cloudkitty-loki-s3\" (UniqueName: \"kubernetes.io/secret/e6e1dec5-d0eb-4a49-b8c7-c89f3defbcef-cloudkitty-loki-s3\") pod \"cloudkitty-lokistack-index-gateway-0\" (UID: \"e6e1dec5-d0eb-4a49-b8c7-c89f3defbcef\") " pod="openstack/cloudkitty-lokistack-index-gateway-0" Jan 28 20:57:49 crc kubenswrapper[4746]: I0128 20:57:49.386031 4746 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cloudkitty-lokistack-index-gateway-grpc\" (UniqueName: \"kubernetes.io/secret/e6e1dec5-d0eb-4a49-b8c7-c89f3defbcef-cloudkitty-lokistack-index-gateway-grpc\") pod \"cloudkitty-lokistack-index-gateway-0\" (UID: \"e6e1dec5-d0eb-4a49-b8c7-c89f3defbcef\") " pod="openstack/cloudkitty-lokistack-index-gateway-0" Jan 28 20:57:49 crc kubenswrapper[4746]: I0128 20:57:49.386539 4746 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cloudkitty-lokistack-index-gateway-http\" (UniqueName: \"kubernetes.io/secret/e6e1dec5-d0eb-4a49-b8c7-c89f3defbcef-cloudkitty-lokistack-index-gateway-http\") pod \"cloudkitty-lokistack-index-gateway-0\" (UID: \"e6e1dec5-d0eb-4a49-b8c7-c89f3defbcef\") " pod="openstack/cloudkitty-lokistack-index-gateway-0" Jan 28 20:57:49 crc kubenswrapper[4746]: I0128 20:57:49.406096 4746 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"local-storage04-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage04-crc\") pod \"cloudkitty-lokistack-index-gateway-0\" (UID: \"e6e1dec5-d0eb-4a49-b8c7-c89f3defbcef\") " pod="openstack/cloudkitty-lokistack-index-gateway-0" Jan 28 20:57:49 crc kubenswrapper[4746]: 
I0128 20:57:49.411273 4746 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-csnfv\" (UniqueName: \"kubernetes.io/projected/e6e1dec5-d0eb-4a49-b8c7-c89f3defbcef-kube-api-access-csnfv\") pod \"cloudkitty-lokistack-index-gateway-0\" (UID: \"e6e1dec5-d0eb-4a49-b8c7-c89f3defbcef\") " pod="openstack/cloudkitty-lokistack-index-gateway-0" Jan 28 20:57:49 crc kubenswrapper[4746]: I0128 20:57:49.493983 4746 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/cloudkitty-lokistack-index-gateway-0" Jan 28 20:57:55 crc kubenswrapper[4746]: I0128 20:57:55.558161 4746 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/openstack-cell1-galera-0"] Jan 28 20:57:55 crc kubenswrapper[4746]: I0128 20:57:55.655429 4746 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/alertmanager-metric-storage-0"] Jan 28 20:57:57 crc kubenswrapper[4746]: E0128 20:57:57.338710 4746 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying config: context canceled" image="quay.io/podified-antelope-centos9/openstack-neutron-server:current-podified" Jan 28 20:57:57 crc kubenswrapper[4746]: E0128 20:57:57.340470 4746 kuberuntime_manager.go:1274] "Unhandled Error" err="init container &Container{Name:init,Image:quay.io/podified-antelope-centos9/openstack-neutron-server:current-podified,Command:[/bin/bash],Args:[-c dnsmasq --interface=* --conf-dir=/etc/dnsmasq.d --hostsdir=/etc/dnsmasq.d/hosts --keep-in-foreground --log-debug --bind-interfaces --listen-address=$(POD_IP) --port 5353 --log-facility=- --no-hosts --domain-needed --no-resolv --bogus-priv --log-queries 
--test],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:CONFIG_HASH,Value:ndfhb5h667h568h584h5f9h58dh565h664h587h597h577h64bh5c4h66fh647hbdh68ch5c5h68dh686h5f7h64hd7hc6h55fh57bh98h57fh87h5fh57fq,ValueFrom:nil,},EnvVar{Name:POD_IP,Value:,ValueFrom:&EnvVarSource{FieldRef:&ObjectFieldSelector{APIVersion:v1,FieldPath:status.podIP,},ResourceFieldRef:nil,ConfigMapKeyRef:nil,SecretKeyRef:nil,},},},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:config,ReadOnly:true,MountPath:/etc/dnsmasq.d/config.cfg,SubPath:dns,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:dns-svc,ReadOnly:true,MountPath:/etc/dnsmasq.d/hosts/dns-svc,SubPath:dns-svc,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-cvh5p,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*1000650000,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:&SeccompProfile{Type:RuntimeDefault,LocalhostProfile:nil,},AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod dnsmasq-dns-78dd6ddcc-gkmq8_openstack(e3a48dbd-649e-4fc6-b2a4-70c587c8237f): ErrImagePull: rpc error: code = Canceled desc = copying config: context canceled" logger="UnhandledError" Jan 28 20:57:57 crc kubenswrapper[4746]: E0128 20:57:57.342024 4746 pod_workers.go:1301] "Error 
syncing pod, skipping" err="failed to \"StartContainer\" for \"init\" with ErrImagePull: \"rpc error: code = Canceled desc = copying config: context canceled\"" pod="openstack/dnsmasq-dns-78dd6ddcc-gkmq8" podUID="e3a48dbd-649e-4fc6-b2a4-70c587c8237f" Jan 28 20:57:57 crc kubenswrapper[4746]: E0128 20:57:57.410831 4746 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying config: context canceled" image="quay.io/podified-antelope-centos9/openstack-neutron-server:current-podified" Jan 28 20:57:57 crc kubenswrapper[4746]: E0128 20:57:57.411385 4746 kuberuntime_manager.go:1274] "Unhandled Error" err="init container &Container{Name:init,Image:quay.io/podified-antelope-centos9/openstack-neutron-server:current-podified,Command:[/bin/bash],Args:[-c dnsmasq --interface=* --conf-dir=/etc/dnsmasq.d --hostsdir=/etc/dnsmasq.d/hosts --keep-in-foreground --log-debug --bind-interfaces --listen-address=$(POD_IP) --port 5353 --log-facility=- --no-hosts --domain-needed --no-resolv --bogus-priv --log-queries 
--test],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:CONFIG_HASH,Value:nffh5bdhf4h5f8h79h55h77h58fh56dh7bh6fh578hbch55dh68h56bhd9h65dh57ch658hc9h566h666h688h58h65dh684h5d7h6ch575h5d6h88q,ValueFrom:nil,},EnvVar{Name:POD_IP,Value:,ValueFrom:&EnvVarSource{FieldRef:&ObjectFieldSelector{APIVersion:v1,FieldPath:status.podIP,},ResourceFieldRef:nil,ConfigMapKeyRef:nil,SecretKeyRef:nil,},},},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:config,ReadOnly:true,MountPath:/etc/dnsmasq.d/config.cfg,SubPath:dns,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-9pghh,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*1000650000,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:&SeccompProfile{Type:RuntimeDefault,LocalhostProfile:nil,},AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod dnsmasq-dns-675f4bcbfc-m2457_openstack(5f99e78a-c369-416e-935b-4f9c4f3ad490): ErrImagePull: rpc error: code = Canceled desc = copying config: context canceled" logger="UnhandledError" Jan 28 20:57:57 crc kubenswrapper[4746]: E0128 20:57:57.412622 4746 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"init\" with ErrImagePull: \"rpc error: code = Canceled desc = copying config: context canceled\"" 
pod="openstack/dnsmasq-dns-675f4bcbfc-m2457" podUID="5f99e78a-c369-416e-935b-4f9c4f3ad490" Jan 28 20:57:57 crc kubenswrapper[4746]: E0128 20:57:57.445367 4746 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying config: context canceled" image="quay.io/podified-antelope-centos9/openstack-neutron-server:current-podified" Jan 28 20:57:57 crc kubenswrapper[4746]: E0128 20:57:57.446313 4746 kuberuntime_manager.go:1274] "Unhandled Error" err="init container &Container{Name:init,Image:quay.io/podified-antelope-centos9/openstack-neutron-server:current-podified,Command:[/bin/bash],Args:[-c dnsmasq --interface=* --conf-dir=/etc/dnsmasq.d --hostsdir=/etc/dnsmasq.d/hosts --keep-in-foreground --log-debug --bind-interfaces --listen-address=$(POD_IP) --port 5353 --log-facility=- --no-hosts --domain-needed --no-resolv --bogus-priv --log-queries --test],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:CONFIG_HASH,Value:n68chd6h679hbfh55fhc6h5ffh5d8h94h56ch589hb4hc5h57bh677hcdh655h8dh667h675h654h66ch567h8fh659h5b4h675h566h55bh54h67dh6dq,ValueFrom:nil,},EnvVar{Name:POD_IP,Value:,ValueFrom:&EnvVarSource{FieldRef:&ObjectFieldSelector{APIVersion:v1,FieldPath:status.podIP,},ResourceFieldRef:nil,ConfigMapKeyRef:nil,SecretKeyRef:nil,},},},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:config,ReadOnly:true,MountPath:/etc/dnsmasq.d/config.cfg,SubPath:dns,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:dns-svc,ReadOnly:true,MountPath:/etc/dnsmasq.d/hosts/dns-svc,SubPath:dns-svc,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-lz2v8,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullP
olicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*1000650000,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:&SeccompProfile{Type:RuntimeDefault,LocalhostProfile:nil,},AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod dnsmasq-dns-666b6646f7-gbg8l_openstack(fcb4de78-f54a-4f35-ba3f-960655540032): ErrImagePull: rpc error: code = Canceled desc = copying config: context canceled" logger="UnhandledError" Jan 28 20:57:57 crc kubenswrapper[4746]: E0128 20:57:57.447440 4746 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"init\" with ErrImagePull: \"rpc error: code = Canceled desc = copying config: context canceled\"" pod="openstack/dnsmasq-dns-666b6646f7-gbg8l" podUID="fcb4de78-f54a-4f35-ba3f-960655540032" Jan 28 20:57:57 crc kubenswrapper[4746]: E0128 20:57:57.456740 4746 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying config: context canceled" image="quay.io/podified-antelope-centos9/openstack-neutron-server:current-podified" Jan 28 20:57:57 crc kubenswrapper[4746]: E0128 20:57:57.456866 4746 kuberuntime_manager.go:1274] "Unhandled Error" err="init container &Container{Name:init,Image:quay.io/podified-antelope-centos9/openstack-neutron-server:current-podified,Command:[/bin/bash],Args:[-c dnsmasq --interface=* --conf-dir=/etc/dnsmasq.d --hostsdir=/etc/dnsmasq.d/hosts --keep-in-foreground --log-debug --bind-interfaces --listen-address=$(POD_IP) --port 5353 --log-facility=- --no-hosts --domain-needed --no-resolv --bogus-priv --log-queries 
--test],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:CONFIG_HASH,Value:n659h4h664hbh658h587h67ch89h587h8fh679hc6hf9h55fh644h5d5h698h68dh5cdh5ffh669h54ch9h689hb8hd4h5bfhd8h5d7h5fh665h574q,ValueFrom:nil,},EnvVar{Name:POD_IP,Value:,ValueFrom:&EnvVarSource{FieldRef:&ObjectFieldSelector{APIVersion:v1,FieldPath:status.podIP,},ResourceFieldRef:nil,ConfigMapKeyRef:nil,SecretKeyRef:nil,},},},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:config,ReadOnly:true,MountPath:/etc/dnsmasq.d/config.cfg,SubPath:dns,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:dns-svc,ReadOnly:true,MountPath:/etc/dnsmasq.d/hosts/dns-svc,SubPath:dns-svc,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-gc6mn,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*1000650000,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:&SeccompProfile{Type:RuntimeDefault,LocalhostProfile:nil,},AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod dnsmasq-dns-57d769cc4f-vfmkg_openstack(0e48f2c5-a005-440d-b1d4-885bd3dd4a82): ErrImagePull: rpc error: code = Canceled desc = copying config: context canceled" logger="UnhandledError" Jan 28 20:57:57 crc kubenswrapper[4746]: E0128 20:57:57.458128 4746 pod_workers.go:1301] "Error syncing 
pod, skipping" err="failed to \"StartContainer\" for \"init\" with ErrImagePull: \"rpc error: code = Canceled desc = copying config: context canceled\"" pod="openstack/dnsmasq-dns-57d769cc4f-vfmkg" podUID="0e48f2c5-a005-440d-b1d4-885bd3dd4a82" Jan 28 20:57:57 crc kubenswrapper[4746]: I0128 20:57:57.847598 4746 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/alertmanager-metric-storage-0" event={"ID":"0701e4bf-44d6-462c-a55b-140c2efceb6b","Type":"ContainerStarted","Data":"e9930d0bdbcaf92c0832920b1d21f14df695bf54ba22767c6388a1d65b39ec3d"} Jan 28 20:57:57 crc kubenswrapper[4746]: I0128 20:57:57.850058 4746 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/openstack-cell1-galera-0" event={"ID":"7257206d-db68-4f31-84d1-ceb4175ea394","Type":"ContainerStarted","Data":"fc4d754816b1f98249e2a56d879ea1a3fe963f2f3e68b667b03ad529afdea14a"} Jan 28 20:57:57 crc kubenswrapper[4746]: E0128 20:57:57.852264 4746 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"init\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/podified-antelope-centos9/openstack-neutron-server:current-podified\\\"\"" pod="openstack/dnsmasq-dns-57d769cc4f-vfmkg" podUID="0e48f2c5-a005-440d-b1d4-885bd3dd4a82" Jan 28 20:57:57 crc kubenswrapper[4746]: E0128 20:57:57.857263 4746 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"init\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/podified-antelope-centos9/openstack-neutron-server:current-podified\\\"\"" pod="openstack/dnsmasq-dns-666b6646f7-gbg8l" podUID="fcb4de78-f54a-4f35-ba3f-960655540032" Jan 28 20:57:58 crc kubenswrapper[4746]: I0128 20:57:58.421868 4746 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/openstack-galera-0"] Jan 28 20:57:58 crc kubenswrapper[4746]: I0128 20:57:58.732653 4746 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cloudkitty-lokistack-query-frontend-5cd44666df-gnlkh"] 
Jan 28 20:57:58 crc kubenswrapper[4746]: W0128 20:57:58.756149 4746 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod7b3d4385_f154_424c_b7b6_280c36a88967.slice/crio-4eb86d212c360c8ed79a6a53ad26a5d6ca76d0a5bdc3b9d11e18cbe3a5b079ff WatchSource:0}: Error finding container 4eb86d212c360c8ed79a6a53ad26a5d6ca76d0a5bdc3b9d11e18cbe3a5b079ff: Status 404 returned error can't find the container with id 4eb86d212c360c8ed79a6a53ad26a5d6ca76d0a5bdc3b9d11e18cbe3a5b079ff Jan 28 20:57:58 crc kubenswrapper[4746]: I0128 20:57:58.757396 4746 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cloudkitty-lokistack-distributor-66dfd9bb-55rmf"] Jan 28 20:57:58 crc kubenswrapper[4746]: W0128 20:57:58.758447 4746 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podadd39f1a_2338_41e9_9a61_d32fe5a28097.slice/crio-5cf73607b6b844e349b1aedfbd810e4fa9d7115aae06a9d49fa3af28bd1cc242 WatchSource:0}: Error finding container 5cf73607b6b844e349b1aedfbd810e4fa9d7115aae06a9d49fa3af28bd1cc242: Status 404 returned error can't find the container with id 5cf73607b6b844e349b1aedfbd810e4fa9d7115aae06a9d49fa3af28bd1cc242 Jan 28 20:57:58 crc kubenswrapper[4746]: I0128 20:57:58.776013 4746 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cloudkitty-lokistack-gateway-7db4f4db8c-jptw9"] Jan 28 20:57:58 crc kubenswrapper[4746]: I0128 20:57:58.790298 4746 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cloudkitty-lokistack-gateway-7db4f4db8c-s5zzt"] Jan 28 20:57:58 crc kubenswrapper[4746]: I0128 20:57:58.809257 4746 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cloudkitty-lokistack-querier-795fd8f8cc-gb5z2"] Jan 28 20:57:58 crc kubenswrapper[4746]: I0128 20:57:58.819962 4746 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/memcached-0"] Jan 28 20:57:58 crc kubenswrapper[4746]: 
I0128 20:57:58.858260 4746 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cloudkitty-lokistack-distributor-66dfd9bb-55rmf" event={"ID":"7b3d4385-f154-424c-b7b6-280c36a88967","Type":"ContainerStarted","Data":"4eb86d212c360c8ed79a6a53ad26a5d6ca76d0a5bdc3b9d11e18cbe3a5b079ff"} Jan 28 20:57:58 crc kubenswrapper[4746]: I0128 20:57:58.858606 4746 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-78dd6ddcc-gkmq8" Jan 28 20:57:58 crc kubenswrapper[4746]: I0128 20:57:58.859490 4746 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-78dd6ddcc-gkmq8" event={"ID":"e3a48dbd-649e-4fc6-b2a4-70c587c8237f","Type":"ContainerDied","Data":"8abc0399150cde17ff910dad6117ea320f57a484a974d3dab0333253f20ec545"} Jan 28 20:57:58 crc kubenswrapper[4746]: I0128 20:57:58.864286 4746 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/openstack-galera-0" event={"ID":"e98da54b-efd0-4811-a433-9ce8134feb13","Type":"ContainerStarted","Data":"726ee27a4e061bc8b980d968dd0b0976bbdc29258c2b7cfb21e0fcfa95e32803"} Jan 28 20:57:58 crc kubenswrapper[4746]: I0128 20:57:58.865751 4746 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-675f4bcbfc-m2457" Jan 28 20:57:58 crc kubenswrapper[4746]: I0128 20:57:58.866243 4746 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cloudkitty-lokistack-gateway-7db4f4db8c-jptw9" event={"ID":"247c16c1-2e4e-48dd-b836-0792f7231417","Type":"ContainerStarted","Data":"d1f972dd8179aab4cb5d5fe676b677240742c66f403e5be29e4b46f4bf1ae6b7"} Jan 28 20:57:58 crc kubenswrapper[4746]: I0128 20:57:58.867309 4746 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-675f4bcbfc-m2457" event={"ID":"5f99e78a-c369-416e-935b-4f9c4f3ad490","Type":"ContainerDied","Data":"408ca0f5431ad66517b40a051feaacb63df9daaa0d1e02a2efd3bfd418dbe2a6"} Jan 28 20:57:58 crc kubenswrapper[4746]: I0128 20:57:58.867371 4746 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-675f4bcbfc-m2457" Jan 28 20:57:58 crc kubenswrapper[4746]: I0128 20:57:58.869335 4746 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cloudkitty-lokistack-gateway-7db4f4db8c-s5zzt" event={"ID":"f6b72417-5723-4d82-928b-f4be94e4bbfd","Type":"ContainerStarted","Data":"c01aa82ac858bfad77a1c05b19d562b0e93a18059c1a2cd9ec2de0d7fc061af0"} Jan 28 20:57:58 crc kubenswrapper[4746]: I0128 20:57:58.884753 4746 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cloudkitty-lokistack-query-frontend-5cd44666df-gnlkh" event={"ID":"add39f1a-2338-41e9-9a61-d32fe5a28097","Type":"ContainerStarted","Data":"5cf73607b6b844e349b1aedfbd810e4fa9d7115aae06a9d49fa3af28bd1cc242"} Jan 28 20:57:58 crc kubenswrapper[4746]: I0128 20:57:58.886616 4746 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cloudkitty-lokistack-querier-795fd8f8cc-gb5z2" event={"ID":"9f570ea4-b303-46ab-8a65-cf64391aeb3b","Type":"ContainerStarted","Data":"ca244ad31781f099bb307b6aac9f5554d1efaf1adefa8f24aaee35a05125781f"} Jan 28 20:57:58 crc kubenswrapper[4746]: I0128 20:57:58.888667 4746 
reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/e3a48dbd-649e-4fc6-b2a4-70c587c8237f-config\") pod \"e3a48dbd-649e-4fc6-b2a4-70c587c8237f\" (UID: \"e3a48dbd-649e-4fc6-b2a4-70c587c8237f\") " Jan 28 20:57:58 crc kubenswrapper[4746]: I0128 20:57:58.888997 4746 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-9pghh\" (UniqueName: \"kubernetes.io/projected/5f99e78a-c369-416e-935b-4f9c4f3ad490-kube-api-access-9pghh\") pod \"5f99e78a-c369-416e-935b-4f9c4f3ad490\" (UID: \"5f99e78a-c369-416e-935b-4f9c4f3ad490\") " Jan 28 20:57:58 crc kubenswrapper[4746]: I0128 20:57:58.889179 4746 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/e3a48dbd-649e-4fc6-b2a4-70c587c8237f-dns-svc\") pod \"e3a48dbd-649e-4fc6-b2a4-70c587c8237f\" (UID: \"e3a48dbd-649e-4fc6-b2a4-70c587c8237f\") " Jan 28 20:57:58 crc kubenswrapper[4746]: I0128 20:57:58.889239 4746 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-cvh5p\" (UniqueName: \"kubernetes.io/projected/e3a48dbd-649e-4fc6-b2a4-70c587c8237f-kube-api-access-cvh5p\") pod \"e3a48dbd-649e-4fc6-b2a4-70c587c8237f\" (UID: \"e3a48dbd-649e-4fc6-b2a4-70c587c8237f\") " Jan 28 20:57:58 crc kubenswrapper[4746]: I0128 20:57:58.889293 4746 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/5f99e78a-c369-416e-935b-4f9c4f3ad490-config\") pod \"5f99e78a-c369-416e-935b-4f9c4f3ad490\" (UID: \"5f99e78a-c369-416e-935b-4f9c4f3ad490\") " Jan 28 20:57:58 crc kubenswrapper[4746]: I0128 20:57:58.889485 4746 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/e3a48dbd-649e-4fc6-b2a4-70c587c8237f-config" (OuterVolumeSpecName: "config") pod "e3a48dbd-649e-4fc6-b2a4-70c587c8237f" (UID: 
"e3a48dbd-649e-4fc6-b2a4-70c587c8237f"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 28 20:57:58 crc kubenswrapper[4746]: I0128 20:57:58.890065 4746 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/e3a48dbd-649e-4fc6-b2a4-70c587c8237f-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "e3a48dbd-649e-4fc6-b2a4-70c587c8237f" (UID: "e3a48dbd-649e-4fc6-b2a4-70c587c8237f"). InnerVolumeSpecName "dns-svc". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 28 20:57:58 crc kubenswrapper[4746]: I0128 20:57:58.890930 4746 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/5f99e78a-c369-416e-935b-4f9c4f3ad490-config" (OuterVolumeSpecName: "config") pod "5f99e78a-c369-416e-935b-4f9c4f3ad490" (UID: "5f99e78a-c369-416e-935b-4f9c4f3ad490"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 28 20:57:58 crc kubenswrapper[4746]: I0128 20:57:58.894327 4746 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/e3a48dbd-649e-4fc6-b2a4-70c587c8237f-config\") on node \"crc\" DevicePath \"\"" Jan 28 20:57:58 crc kubenswrapper[4746]: I0128 20:57:58.996227 4746 reconciler_common.go:293] "Volume detached for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/e3a48dbd-649e-4fc6-b2a4-70c587c8237f-dns-svc\") on node \"crc\" DevicePath \"\"" Jan 28 20:57:58 crc kubenswrapper[4746]: I0128 20:57:58.996259 4746 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/5f99e78a-c369-416e-935b-4f9c4f3ad490-config\") on node \"crc\" DevicePath \"\"" Jan 28 20:57:58 crc kubenswrapper[4746]: I0128 20:57:58.997334 4746 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/5f99e78a-c369-416e-935b-4f9c4f3ad490-kube-api-access-9pghh" (OuterVolumeSpecName: "kube-api-access-9pghh") pod 
"5f99e78a-c369-416e-935b-4f9c4f3ad490" (UID: "5f99e78a-c369-416e-935b-4f9c4f3ad490"). InnerVolumeSpecName "kube-api-access-9pghh". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 28 20:57:58 crc kubenswrapper[4746]: I0128 20:57:58.997397 4746 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/e3a48dbd-649e-4fc6-b2a4-70c587c8237f-kube-api-access-cvh5p" (OuterVolumeSpecName: "kube-api-access-cvh5p") pod "e3a48dbd-649e-4fc6-b2a4-70c587c8237f" (UID: "e3a48dbd-649e-4fc6-b2a4-70c587c8237f"). InnerVolumeSpecName "kube-api-access-cvh5p". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 28 20:57:59 crc kubenswrapper[4746]: I0128 20:57:59.098476 4746 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-9pghh\" (UniqueName: \"kubernetes.io/projected/5f99e78a-c369-416e-935b-4f9c4f3ad490-kube-api-access-9pghh\") on node \"crc\" DevicePath \"\"" Jan 28 20:57:59 crc kubenswrapper[4746]: I0128 20:57:59.098786 4746 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-cvh5p\" (UniqueName: \"kubernetes.io/projected/e3a48dbd-649e-4fc6-b2a4-70c587c8237f-kube-api-access-cvh5p\") on node \"crc\" DevicePath \"\"" Jan 28 20:57:59 crc kubenswrapper[4746]: W0128 20:57:59.123520 4746 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod8c01b9a6_3e78_4a0c_9825_e39856c2df93.slice/crio-5eb114b2fd4e47e49c6317a6b69bfea6fd32686ad8a1d0881723bf647419d4e9 WatchSource:0}: Error finding container 5eb114b2fd4e47e49c6317a6b69bfea6fd32686ad8a1d0881723bf647419d4e9: Status 404 returned error can't find the container with id 5eb114b2fd4e47e49c6317a6b69bfea6fd32686ad8a1d0881723bf647419d4e9 Jan 28 20:57:59 crc kubenswrapper[4746]: I0128 20:57:59.139430 4746 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovsdbserver-nb-0"] Jan 28 20:57:59 crc kubenswrapper[4746]: I0128 20:57:59.227049 4746 kubelet.go:2437] 
"SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-675f4bcbfc-m2457"] Jan 28 20:57:59 crc kubenswrapper[4746]: I0128 20:57:59.233296 4746 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-675f4bcbfc-m2457"] Jan 28 20:57:59 crc kubenswrapper[4746]: I0128 20:57:59.299305 4746 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cloudkitty-lokistack-index-gateway-0"] Jan 28 20:57:59 crc kubenswrapper[4746]: I0128 20:57:59.384190 4746 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cloudkitty-lokistack-compactor-0"] Jan 28 20:57:59 crc kubenswrapper[4746]: I0128 20:57:59.406468 4746 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovn-controller-ms9wc"] Jan 28 20:57:59 crc kubenswrapper[4746]: I0128 20:57:59.412583 4746 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/prometheus-metric-storage-0"] Jan 28 20:57:59 crc kubenswrapper[4746]: I0128 20:57:59.419290 4746 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/kube-state-metrics-0"] Jan 28 20:57:59 crc kubenswrapper[4746]: I0128 20:57:59.432051 4746 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cloudkitty-lokistack-ingester-0"] Jan 28 20:57:59 crc kubenswrapper[4746]: I0128 20:57:59.439122 4746 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovsdbserver-sb-0"] Jan 28 20:57:59 crc kubenswrapper[4746]: I0128 20:57:59.497344 4746 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovn-controller-ovs-fcvh6"] Jan 28 20:57:59 crc kubenswrapper[4746]: W0128 20:57:59.755219 4746 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podd3cad0b0_7b53_4280_9dec_05e01692820c.slice/crio-a9f86144a154356a1425b19bae9e7744c6e7dde9bdf77f63e63e01011d9929e9 WatchSource:0}: Error finding container a9f86144a154356a1425b19bae9e7744c6e7dde9bdf77f63e63e01011d9929e9: Status 404 returned error can't find the 
container with id a9f86144a154356a1425b19bae9e7744c6e7dde9bdf77f63e63e01011d9929e9 Jan 28 20:57:59 crc kubenswrapper[4746]: W0128 20:57:59.763254 4746 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podd1e0be80_baed_4c8f_affd_33a252b527ad.slice/crio-9f8015ce7b76914f21b8547ecc407e34c3c2b6bdf03eae3626b9409ab7d6f9ce WatchSource:0}: Error finding container 9f8015ce7b76914f21b8547ecc407e34c3c2b6bdf03eae3626b9409ab7d6f9ce: Status 404 returned error can't find the container with id 9f8015ce7b76914f21b8547ecc407e34c3c2b6bdf03eae3626b9409ab7d6f9ce Jan 28 20:57:59 crc kubenswrapper[4746]: W0128 20:57:59.770014 4746 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod754a9c43_4753_41cd_945d_93f7fa2b715e.slice/crio-08dd4b34348854e38e4cd699591314303e9dd7a8d00dd6f6a1cebe70e950dbc8 WatchSource:0}: Error finding container 08dd4b34348854e38e4cd699591314303e9dd7a8d00dd6f6a1cebe70e950dbc8: Status 404 returned error can't find the container with id 08dd4b34348854e38e4cd699591314303e9dd7a8d00dd6f6a1cebe70e950dbc8 Jan 28 20:57:59 crc kubenswrapper[4746]: W0128 20:57:59.774026 4746 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pode6e1dec5_d0eb_4a49_b8c7_c89f3defbcef.slice/crio-ad43d728b402747bbac0b343767d8c778035e0b9c25c3f36121caa8282437212 WatchSource:0}: Error finding container ad43d728b402747bbac0b343767d8c778035e0b9c25c3f36121caa8282437212: Status 404 returned error can't find the container with id ad43d728b402747bbac0b343767d8c778035e0b9c25c3f36121caa8282437212 Jan 28 20:57:59 crc kubenswrapper[4746]: W0128 20:57:59.779686 4746 manager.go:1169] Failed to process watch event {EventType:0 
Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod692d10ed_801f_47d2_b069_b3a0cb8dc4b7.slice/crio-d3481d17153250a5fd435f2307134593284c9250122a74cd2bd4332b7ba28753 WatchSource:0}: Error finding container d3481d17153250a5fd435f2307134593284c9250122a74cd2bd4332b7ba28753: Status 404 returned error can't find the container with id d3481d17153250a5fd435f2307134593284c9250122a74cd2bd4332b7ba28753 Jan 28 20:57:59 crc kubenswrapper[4746]: W0128 20:57:59.780955 4746 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod4aa3c3d3_f7a7_4e26_bf26_630de3cc7a17.slice/crio-bf44be3c0234f66aaa68f27d0dbf90d6605b576dcc3ac644a36e0b8b3263f898 WatchSource:0}: Error finding container bf44be3c0234f66aaa68f27d0dbf90d6605b576dcc3ac644a36e0b8b3263f898: Status 404 returned error can't find the container with id bf44be3c0234f66aaa68f27d0dbf90d6605b576dcc3ac644a36e0b8b3263f898 Jan 28 20:57:59 crc kubenswrapper[4746]: W0128 20:57:59.796769 4746 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod6edc718f_ce48_415e_ae81_574ef1f48cb6.slice/crio-fa025734146ab05c246a67ac8d9c620134b11a9628e3c5365a171be8d2ce0354 WatchSource:0}: Error finding container fa025734146ab05c246a67ac8d9c620134b11a9628e3c5365a171be8d2ce0354: Status 404 returned error can't find the container with id fa025734146ab05c246a67ac8d9c620134b11a9628e3c5365a171be8d2ce0354 Jan 28 20:57:59 crc kubenswrapper[4746]: I0128 20:57:59.906872 4746 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/memcached-0" event={"ID":"8c01b9a6-3e78-4a0c-9825-e39856c2df93","Type":"ContainerStarted","Data":"5eb114b2fd4e47e49c6317a6b69bfea6fd32686ad8a1d0881723bf647419d4e9"} Jan 28 20:57:59 crc kubenswrapper[4746]: I0128 20:57:59.907996 4746 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/prometheus-metric-storage-0" 
event={"ID":"fde93743-7b9d-4175-abdf-bd74008cf4b0","Type":"ContainerStarted","Data":"40d7ba0b2886f49e10fa30076957a979cdf53e9e3cd86fc83316978ff59321c8"} Jan 28 20:57:59 crc kubenswrapper[4746]: I0128 20:57:59.909689 4746 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cloudkitty-lokistack-ingester-0" event={"ID":"d3cad0b0-7b53-4280-9dec-05e01692820c","Type":"ContainerStarted","Data":"a9f86144a154356a1425b19bae9e7744c6e7dde9bdf77f63e63e01011d9929e9"} Jan 28 20:57:59 crc kubenswrapper[4746]: I0128 20:57:59.911164 4746 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-server-0" event={"ID":"88718387-09d6-4e3d-a06f-4353ba42ce91","Type":"ContainerStarted","Data":"ff6dfe0b7527df02f28dd43e980cf6fdbb546706b9ce1c38c5e9ebe0e3c4c38a"} Jan 28 20:57:59 crc kubenswrapper[4746]: I0128 20:57:59.912136 4746 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-controller-ms9wc" event={"ID":"754a9c43-4753-41cd-945d-93f7fa2b715e","Type":"ContainerStarted","Data":"08dd4b34348854e38e4cd699591314303e9dd7a8d00dd6f6a1cebe70e950dbc8"} Jan 28 20:57:59 crc kubenswrapper[4746]: I0128 20:57:59.915653 4746 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/kube-state-metrics-0" event={"ID":"4aa3c3d3-f7a7-4e26-bf26-630de3cc7a17","Type":"ContainerStarted","Data":"bf44be3c0234f66aaa68f27d0dbf90d6605b576dcc3ac644a36e0b8b3263f898"} Jan 28 20:57:59 crc kubenswrapper[4746]: I0128 20:57:59.917412 4746 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovsdbserver-nb-0" event={"ID":"d1e0be80-baed-4c8f-affd-33a252b527ad","Type":"ContainerStarted","Data":"9f8015ce7b76914f21b8547ecc407e34c3c2b6bdf03eae3626b9409ab7d6f9ce"} Jan 28 20:57:59 crc kubenswrapper[4746]: I0128 20:57:59.918602 4746 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cloudkitty-lokistack-compactor-0" 
event={"ID":"6edc718f-ce48-415e-ae81-574ef1f48cb6","Type":"ContainerStarted","Data":"fa025734146ab05c246a67ac8d9c620134b11a9628e3c5365a171be8d2ce0354"} Jan 28 20:57:59 crc kubenswrapper[4746]: I0128 20:57:59.920114 4746 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovsdbserver-sb-0" event={"ID":"692d10ed-801f-47d2-b069-b3a0cb8dc4b7","Type":"ContainerStarted","Data":"d3481d17153250a5fd435f2307134593284c9250122a74cd2bd4332b7ba28753"} Jan 28 20:57:59 crc kubenswrapper[4746]: I0128 20:57:59.921970 4746 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-controller-ovs-fcvh6" event={"ID":"2b1288d6-9c28-48e5-a97f-bdd75de9b8a2","Type":"ContainerStarted","Data":"a022a0ae477df769b5e63ab16168161e71d234cb00e13b069640328c2fb06f19"} Jan 28 20:57:59 crc kubenswrapper[4746]: I0128 20:57:59.922989 4746 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cloudkitty-lokistack-index-gateway-0" event={"ID":"e6e1dec5-d0eb-4a49-b8c7-c89f3defbcef","Type":"ContainerStarted","Data":"ad43d728b402747bbac0b343767d8c778035e0b9c25c3f36121caa8282437212"} Jan 28 20:57:59 crc kubenswrapper[4746]: I0128 20:57:59.924318 4746 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-78dd6ddcc-gkmq8" Jan 28 20:57:59 crc kubenswrapper[4746]: I0128 20:57:59.924439 4746 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-cell1-server-0" event={"ID":"701360b2-121a-4cb4-9a4f-9ce63391e740","Type":"ContainerStarted","Data":"f2f571246c74fa9c9e1b471c32df4a45d7e6ced3641e96b15c5ddca28302d0b2"} Jan 28 20:58:00 crc kubenswrapper[4746]: I0128 20:58:00.020648 4746 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-78dd6ddcc-gkmq8"] Jan 28 20:58:00 crc kubenswrapper[4746]: I0128 20:58:00.035506 4746 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-78dd6ddcc-gkmq8"] Jan 28 20:58:00 crc kubenswrapper[4746]: E0128 20:58:00.173828 4746 cadvisor_stats_provider.go:516] "Partial failure issuing cadvisor.ContainerInfoV2" err="partial failures: [\"/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pode3a48dbd_649e_4fc6_b2a4_70c587c8237f.slice\": RecentStats: unable to find data in memory cache], [\"/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pode3a48dbd_649e_4fc6_b2a4_70c587c8237f.slice/crio-8abc0399150cde17ff910dad6117ea320f57a484a974d3dab0333253f20ec545\": RecentStats: unable to find data in memory cache]" Jan 28 20:58:00 crc kubenswrapper[4746]: I0128 20:58:00.865811 4746 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="5f99e78a-c369-416e-935b-4f9c4f3ad490" path="/var/lib/kubelet/pods/5f99e78a-c369-416e-935b-4f9c4f3ad490/volumes" Jan 28 20:58:00 crc kubenswrapper[4746]: I0128 20:58:00.866188 4746 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="e3a48dbd-649e-4fc6-b2a4-70c587c8237f" path="/var/lib/kubelet/pods/e3a48dbd-649e-4fc6-b2a4-70c587c8237f/volumes" Jan 28 20:58:02 crc kubenswrapper[4746]: I0128 20:58:02.998983 4746 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/alertmanager-metric-storage-0" 
event={"ID":"0701e4bf-44d6-462c-a55b-140c2efceb6b","Type":"ContainerStarted","Data":"1cc370ef10bdd16098796a7cb6bbd8a489680efff656ff0c5aaf3bf340fd296a"}
Jan 28 20:58:09 crc kubenswrapper[4746]: I0128 20:58:09.063247 4746 generic.go:334] "Generic (PLEG): container finished" podID="0701e4bf-44d6-462c-a55b-140c2efceb6b" containerID="1cc370ef10bdd16098796a7cb6bbd8a489680efff656ff0c5aaf3bf340fd296a" exitCode=0
Jan 28 20:58:09 crc kubenswrapper[4746]: I0128 20:58:09.063328 4746 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/alertmanager-metric-storage-0" event={"ID":"0701e4bf-44d6-462c-a55b-140c2efceb6b","Type":"ContainerDied","Data":"1cc370ef10bdd16098796a7cb6bbd8a489680efff656ff0c5aaf3bf340fd296a"}
Jan 28 20:58:10 crc kubenswrapper[4746]: I0128 20:58:10.072749 4746 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/memcached-0" event={"ID":"8c01b9a6-3e78-4a0c-9825-e39856c2df93","Type":"ContainerStarted","Data":"f859c82a67f480dc4f677c4cc3d32764569e8477f691d93973a0f1cc341285ec"}
Jan 28 20:58:10 crc kubenswrapper[4746]: I0128 20:58:10.075129 4746 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/memcached-0"
Jan 28 20:58:10 crc kubenswrapper[4746]: I0128 20:58:10.099951 4746 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/memcached-0" podStartSLOduration=28.684493386 podStartE2EDuration="38.099928769s" podCreationTimestamp="2026-01-28 20:57:32 +0000 UTC" firstStartedPulling="2026-01-28 20:57:59.173393562 +0000 UTC m=+1107.129579916" lastFinishedPulling="2026-01-28 20:58:08.588828945 +0000 UTC m=+1116.545015299" observedRunningTime="2026-01-28 20:58:10.090534278 +0000 UTC m=+1118.046720652" watchObservedRunningTime="2026-01-28 20:58:10.099928769 +0000 UTC m=+1118.056115123"
Jan 28 20:58:11 crc kubenswrapper[4746]: I0128 20:58:11.082804 4746 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovsdbserver-sb-0" event={"ID":"692d10ed-801f-47d2-b069-b3a0cb8dc4b7","Type":"ContainerStarted","Data":"cdb0bc25231550ab5878f6c117d723ac007423fe883ad4f3fccfd97efd8e8631"}
Jan 28 20:58:11 crc kubenswrapper[4746]: I0128 20:58:11.084813 4746 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cloudkitty-lokistack-distributor-66dfd9bb-55rmf" event={"ID":"7b3d4385-f154-424c-b7b6-280c36a88967","Type":"ContainerStarted","Data":"23fa38c62c7972479b3b5a95e4112d622453fbe9234d4532b7655d32c1ea1c93"}
Jan 28 20:58:11 crc kubenswrapper[4746]: I0128 20:58:11.085359 4746 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/cloudkitty-lokistack-distributor-66dfd9bb-55rmf"
Jan 28 20:58:11 crc kubenswrapper[4746]: I0128 20:58:11.088888 4746 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cloudkitty-lokistack-compactor-0" event={"ID":"6edc718f-ce48-415e-ae81-574ef1f48cb6","Type":"ContainerStarted","Data":"5354e614290e8832408b74d8f0713b3da51f59227b8f99efbb2fc0eabbea24cd"}
Jan 28 20:58:11 crc kubenswrapper[4746]: I0128 20:58:11.089615 4746 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/cloudkitty-lokistack-compactor-0"
Jan 28 20:58:11 crc kubenswrapper[4746]: I0128 20:58:11.094643 4746 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cloudkitty-lokistack-gateway-7db4f4db8c-s5zzt" event={"ID":"f6b72417-5723-4d82-928b-f4be94e4bbfd","Type":"ContainerStarted","Data":"a04cf2e83da46bd142591eddc0d95224bf09c009bcb91f21a46d948b178f876e"}
Jan 28 20:58:11 crc kubenswrapper[4746]: I0128 20:58:11.095746 4746 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/cloudkitty-lokistack-gateway-7db4f4db8c-s5zzt"
Jan 28 20:58:11 crc kubenswrapper[4746]: I0128 20:58:11.099053 4746 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cloudkitty-lokistack-query-frontend-5cd44666df-gnlkh" event={"ID":"add39f1a-2338-41e9-9a61-d32fe5a28097","Type":"ContainerStarted","Data":"f90959ebc7b9ad2ac4f5b09e869a3d0d338102e0f072d86196255d8ae23cedfb"}
Jan 28 20:58:11 crc kubenswrapper[4746]: I0128 20:58:11.099677 4746 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/cloudkitty-lokistack-query-frontend-5cd44666df-gnlkh"
Jan 28 20:58:11 crc kubenswrapper[4746]: I0128 20:58:11.108311 4746 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/cloudkitty-lokistack-distributor-66dfd9bb-55rmf" podStartSLOduration=14.283260881 podStartE2EDuration="24.108288695s" podCreationTimestamp="2026-01-28 20:57:47 +0000 UTC" firstStartedPulling="2026-01-28 20:57:58.763595325 +0000 UTC m=+1106.719781679" lastFinishedPulling="2026-01-28 20:58:08.588623139 +0000 UTC m=+1116.544809493" observedRunningTime="2026-01-28 20:58:11.105586102 +0000 UTC m=+1119.061772456" watchObservedRunningTime="2026-01-28 20:58:11.108288695 +0000 UTC m=+1119.064475059"
Jan 28 20:58:11 crc kubenswrapper[4746]: I0128 20:58:11.112673 4746 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cloudkitty-lokistack-ingester-0" event={"ID":"d3cad0b0-7b53-4280-9dec-05e01692820c","Type":"ContainerStarted","Data":"ceaa61a61470f279f61aa3c6cb11d83e33840fc94875f8916a8de624e7c5a84a"}
Jan 28 20:58:11 crc kubenswrapper[4746]: I0128 20:58:11.113612 4746 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/cloudkitty-lokistack-ingester-0"
Jan 28 20:58:11 crc kubenswrapper[4746]: I0128 20:58:11.119352 4746 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cloudkitty-lokistack-index-gateway-0" event={"ID":"e6e1dec5-d0eb-4a49-b8c7-c89f3defbcef","Type":"ContainerStarted","Data":"2e3af671061082442ef2cf003ef7feebf2654ad46700e4bec31739987a77e52c"}
Jan 28 20:58:11 crc kubenswrapper[4746]: I0128 20:58:11.119501 4746 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/cloudkitty-lokistack-index-gateway-0"
Jan 28 20:58:11 crc kubenswrapper[4746]: I0128 20:58:11.122524 4746 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/openstack-cell1-galera-0" event={"ID":"7257206d-db68-4f31-84d1-ceb4175ea394","Type":"ContainerStarted","Data":"c4d73b2f92e6eeaf0cd370487c8b8dcf4e4a0fa19f0e5cec00bc27f0b70988d1"}
Jan 28 20:58:11 crc kubenswrapper[4746]: I0128 20:58:11.135684 4746 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/openstack-galera-0" event={"ID":"e98da54b-efd0-4811-a433-9ce8134feb13","Type":"ContainerStarted","Data":"3af966fe0f7cafb0f0891d536b2adf38e9026a7c01f85db57d6e4d5ace99e33c"}
Jan 28 20:58:11 crc kubenswrapper[4746]: I0128 20:58:11.137072 4746 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/cloudkitty-lokistack-gateway-7db4f4db8c-s5zzt"
Jan 28 20:58:11 crc kubenswrapper[4746]: I0128 20:58:11.150545 4746 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/cloudkitty-lokistack-query-frontend-5cd44666df-gnlkh" podStartSLOduration=13.576298517 podStartE2EDuration="24.150523038s" podCreationTimestamp="2026-01-28 20:57:47 +0000 UTC" firstStartedPulling="2026-01-28 20:57:58.763553134 +0000 UTC m=+1106.719739488" lastFinishedPulling="2026-01-28 20:58:09.337777655 +0000 UTC m=+1117.293964009" observedRunningTime="2026-01-28 20:58:11.133729458 +0000 UTC m=+1119.089915812" watchObservedRunningTime="2026-01-28 20:58:11.150523038 +0000 UTC m=+1119.106709392"
Jan 28 20:58:11 crc kubenswrapper[4746]: I0128 20:58:11.157254 4746 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-controller-ovs-fcvh6" event={"ID":"2b1288d6-9c28-48e5-a97f-bdd75de9b8a2","Type":"ContainerStarted","Data":"15ff7baffe1bcfa53cd2f4b25f16f50aae8e5187111ac09c8490506669d2787d"}
Jan 28 20:58:11 crc kubenswrapper[4746]: I0128 20:58:11.171719 4746 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/cloudkitty-lokistack-gateway-7db4f4db8c-s5zzt" podStartSLOduration=12.844927669 podStartE2EDuration="23.171696636s" podCreationTimestamp="2026-01-28 20:57:48 +0000 UTC" firstStartedPulling="2026-01-28 20:57:58.788970255 +0000 UTC m=+1106.745156609" lastFinishedPulling="2026-01-28 20:58:09.115739222 +0000 UTC m=+1117.071925576" observedRunningTime="2026-01-28 20:58:11.165410817 +0000 UTC m=+1119.121597191" watchObservedRunningTime="2026-01-28 20:58:11.171696636 +0000 UTC m=+1119.127882990"
Jan 28 20:58:11 crc kubenswrapper[4746]: I0128 20:58:11.200305 4746 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/cloudkitty-lokistack-compactor-0" podStartSLOduration=14.536720356 podStartE2EDuration="24.200289592s" podCreationTimestamp="2026-01-28 20:57:47 +0000 UTC" firstStartedPulling="2026-01-28 20:57:59.842300016 +0000 UTC m=+1107.798486360" lastFinishedPulling="2026-01-28 20:58:09.505869242 +0000 UTC m=+1117.462055596" observedRunningTime="2026-01-28 20:58:11.189651087 +0000 UTC m=+1119.145837441" watchObservedRunningTime="2026-01-28 20:58:11.200289592 +0000 UTC m=+1119.156475946"
Jan 28 20:58:11 crc kubenswrapper[4746]: I0128 20:58:11.259746 4746 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/cloudkitty-lokistack-index-gateway-0" podStartSLOduration=13.754078064 podStartE2EDuration="23.259723526s" podCreationTimestamp="2026-01-28 20:57:48 +0000 UTC" firstStartedPulling="2026-01-28 20:57:59.778450314 +0000 UTC m=+1107.734636678" lastFinishedPulling="2026-01-28 20:58:09.284095786 +0000 UTC m=+1117.240282140" observedRunningTime="2026-01-28 20:58:11.255406809 +0000 UTC m=+1119.211593153" watchObservedRunningTime="2026-01-28 20:58:11.259723526 +0000 UTC m=+1119.215909880"
Jan 28 20:58:11 crc kubenswrapper[4746]: I0128 20:58:11.349783 4746 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/cloudkitty-lokistack-ingester-0" podStartSLOduration=14.775133439 podStartE2EDuration="24.3497627s" podCreationTimestamp="2026-01-28 20:57:47 +0000 UTC" firstStartedPulling="2026-01-28 20:57:59.762022804 +0000 UTC m=+1107.718209158" lastFinishedPulling="2026-01-28 20:58:09.336652065 +0000 UTC m=+1117.292838419" observedRunningTime="2026-01-28 20:58:11.33858714 +0000 UTC m=+1119.294773494" watchObservedRunningTime="2026-01-28 20:58:11.3497627 +0000 UTC m=+1119.305949064"
Jan 28 20:58:12 crc kubenswrapper[4746]: I0128 20:58:12.161388 4746 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovsdbserver-nb-0" event={"ID":"d1e0be80-baed-4c8f-affd-33a252b527ad","Type":"ContainerStarted","Data":"44f1a83e32229b4b419778ecb06a35b12ac75b394280f302ca4b1fd957a1dbd4"}
Jan 28 20:58:12 crc kubenswrapper[4746]: I0128 20:58:12.162830 4746 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/prometheus-metric-storage-0" event={"ID":"fde93743-7b9d-4175-abdf-bd74008cf4b0","Type":"ContainerStarted","Data":"af4cdf1309c1a6bdd11844ef3a15794d9f902b1f3ec405bfe53f763a4322e93a"}
Jan 28 20:58:12 crc kubenswrapper[4746]: I0128 20:58:12.164258 4746 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cloudkitty-lokistack-gateway-7db4f4db8c-jptw9" event={"ID":"247c16c1-2e4e-48dd-b836-0792f7231417","Type":"ContainerStarted","Data":"d0cbdbb0a5f192f5a989781bfef215e24f79250e918defb2cdb7e0ebb715eae7"}
Jan 28 20:58:12 crc kubenswrapper[4746]: I0128 20:58:12.164688 4746 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/cloudkitty-lokistack-gateway-7db4f4db8c-jptw9"
Jan 28 20:58:12 crc kubenswrapper[4746]: I0128 20:58:12.165680 4746 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cloudkitty-lokistack-querier-795fd8f8cc-gb5z2" event={"ID":"9f570ea4-b303-46ab-8a65-cf64391aeb3b","Type":"ContainerStarted","Data":"cc9b947c0553b52a4f74b261fc091bf58deaae8582ae1e813d229764a0607d40"}
Jan 28 20:58:12 crc kubenswrapper[4746]: I0128 20:58:12.166033 4746 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/cloudkitty-lokistack-querier-795fd8f8cc-gb5z2"
Jan 28 20:58:12 crc kubenswrapper[4746]: I0128 20:58:12.167206 4746 generic.go:334] "Generic (PLEG): container finished" podID="2b1288d6-9c28-48e5-a97f-bdd75de9b8a2" containerID="15ff7baffe1bcfa53cd2f4b25f16f50aae8e5187111ac09c8490506669d2787d" exitCode=0
Jan 28 20:58:12 crc kubenswrapper[4746]: I0128 20:58:12.167278 4746 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-controller-ovs-fcvh6" event={"ID":"2b1288d6-9c28-48e5-a97f-bdd75de9b8a2","Type":"ContainerDied","Data":"15ff7baffe1bcfa53cd2f4b25f16f50aae8e5187111ac09c8490506669d2787d"}
Jan 28 20:58:12 crc kubenswrapper[4746]: I0128 20:58:12.169194 4746 generic.go:334] "Generic (PLEG): container finished" podID="0e48f2c5-a005-440d-b1d4-885bd3dd4a82" containerID="86285df83ebe099072ee72ae8481ed25518db8ae522dc26fa319741f5e10321d" exitCode=0
Jan 28 20:58:12 crc kubenswrapper[4746]: I0128 20:58:12.169248 4746 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-57d769cc4f-vfmkg" event={"ID":"0e48f2c5-a005-440d-b1d4-885bd3dd4a82","Type":"ContainerDied","Data":"86285df83ebe099072ee72ae8481ed25518db8ae522dc26fa319741f5e10321d"}
Jan 28 20:58:12 crc kubenswrapper[4746]: I0128 20:58:12.171470 4746 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-controller-ms9wc" event={"ID":"754a9c43-4753-41cd-945d-93f7fa2b715e","Type":"ContainerStarted","Data":"3e3e24762c8e17ccb1d3430948822669e7daaca1b2139ed562c4d75bbda9f459"}
Jan 28 20:58:12 crc kubenswrapper[4746]: I0128 20:58:12.171889 4746 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/ovn-controller-ms9wc"
Jan 28 20:58:12 crc kubenswrapper[4746]: I0128 20:58:12.174806 4746 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/kube-state-metrics-0" event={"ID":"4aa3c3d3-f7a7-4e26-bf26-630de3cc7a17","Type":"ContainerStarted","Data":"cf6d9052c57bbb311927bde6702b43fb502f545e155aa24d1396694d9c822738"}
Jan 28 20:58:12 crc kubenswrapper[4746]: I0128 20:58:12.174840 4746 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/kube-state-metrics-0"
Jan 28 20:58:12 crc kubenswrapper[4746]: I0128 20:58:12.182342 4746 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/cloudkitty-lokistack-gateway-7db4f4db8c-jptw9"
Jan 28 20:58:12 crc kubenswrapper[4746]: I0128 20:58:12.231023 4746 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/ovn-controller-ms9wc" podStartSLOduration=24.669083917000002 podStartE2EDuration="34.230999147s" podCreationTimestamp="2026-01-28 20:57:38 +0000 UTC" firstStartedPulling="2026-01-28 20:57:59.774621422 +0000 UTC m=+1107.730807776" lastFinishedPulling="2026-01-28 20:58:09.336536652 +0000 UTC m=+1117.292723006" observedRunningTime="2026-01-28 20:58:12.223692661 +0000 UTC m=+1120.179879005" watchObservedRunningTime="2026-01-28 20:58:12.230999147 +0000 UTC m=+1120.187185501"
Jan 28 20:58:12 crc kubenswrapper[4746]: I0128 20:58:12.257865 4746 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/cloudkitty-lokistack-gateway-7db4f4db8c-jptw9" podStartSLOduration=13.709896029 podStartE2EDuration="24.257849906s" podCreationTimestamp="2026-01-28 20:57:48 +0000 UTC" firstStartedPulling="2026-01-28 20:57:58.788820151 +0000 UTC m=+1106.745006505" lastFinishedPulling="2026-01-28 20:58:09.336774028 +0000 UTC m=+1117.292960382" observedRunningTime="2026-01-28 20:58:12.253558542 +0000 UTC m=+1120.209744896" watchObservedRunningTime="2026-01-28 20:58:12.257849906 +0000 UTC m=+1120.214036260"
Jan 28 20:58:12 crc kubenswrapper[4746]: I0128 20:58:12.290032 4746 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/cloudkitty-lokistack-querier-795fd8f8cc-gb5z2" podStartSLOduration=14.582324048 podStartE2EDuration="25.290011699s" podCreationTimestamp="2026-01-28 20:57:47 +0000 UTC" firstStartedPulling="2026-01-28 20:57:58.798330585 +0000 UTC m=+1106.754516939" lastFinishedPulling="2026-01-28 20:58:09.506018236 +0000 UTC m=+1117.462204590" observedRunningTime="2026-01-28 20:58:12.273685822 +0000 UTC m=+1120.229872176" watchObservedRunningTime="2026-01-28 20:58:12.290011699 +0000 UTC m=+1120.246198053"
Jan 28 20:58:12 crc kubenswrapper[4746]: I0128 20:58:12.322825 4746 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/kube-state-metrics-0" podStartSLOduration=27.820023983 podStartE2EDuration="38.322796218s" podCreationTimestamp="2026-01-28 20:57:34 +0000 UTC" firstStartedPulling="2026-01-28 20:57:59.840664473 +0000 UTC m=+1107.796850827" lastFinishedPulling="2026-01-28 20:58:10.343436708 +0000 UTC m=+1118.299623062" observedRunningTime="2026-01-28 20:58:12.31837749 +0000 UTC m=+1120.274563834" watchObservedRunningTime="2026-01-28 20:58:12.322796218 +0000 UTC m=+1120.278982572"
Jan 28 20:58:13 crc kubenswrapper[4746]: I0128 20:58:13.187140 4746 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/alertmanager-metric-storage-0" event={"ID":"0701e4bf-44d6-462c-a55b-140c2efceb6b","Type":"ContainerStarted","Data":"d949d3e8860af06309546bd7725df6c19d96587826735e12ff3a00d9c394f2f8"}
Jan 28 20:58:13 crc kubenswrapper[4746]: I0128 20:58:13.189193 4746 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-57d769cc4f-vfmkg" event={"ID":"0e48f2c5-a005-440d-b1d4-885bd3dd4a82","Type":"ContainerStarted","Data":"c77139041dada346e753825ccb577fd7e0206e3e918bc5faa9fb0bae913ce06a"}
Jan 28 20:58:13 crc kubenswrapper[4746]: I0128 20:58:13.213738 4746 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/dnsmasq-dns-57d769cc4f-vfmkg" podStartSLOduration=4.856399633 podStartE2EDuration="45.213705365s" podCreationTimestamp="2026-01-28 20:57:28 +0000 UTC" firstStartedPulling="2026-01-28 20:57:30.00006781 +0000 UTC m=+1077.956254174" lastFinishedPulling="2026-01-28 20:58:10.357373552 +0000 UTC m=+1118.313559906" observedRunningTime="2026-01-28 20:58:13.212516713 +0000 UTC m=+1121.168703067" watchObservedRunningTime="2026-01-28 20:58:13.213705365 +0000 UTC m=+1121.169891719"
Jan 28 20:58:14 crc kubenswrapper[4746]: I0128 20:58:14.201420 4746 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-controller-ovs-fcvh6" event={"ID":"2b1288d6-9c28-48e5-a97f-bdd75de9b8a2","Type":"ContainerStarted","Data":"af63a2fd3d01e131e8558ef08af7ee141352fabc32302e844b8b461f7f79bad1"}
Jan 28 20:58:14 crc kubenswrapper[4746]: I0128 20:58:14.201725 4746 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-controller-ovs-fcvh6" event={"ID":"2b1288d6-9c28-48e5-a97f-bdd75de9b8a2","Type":"ContainerStarted","Data":"4b0b5276420a24a1714c7ad2889e873629ceb243863d0bce683048700052c199"}
Jan 28 20:58:14 crc kubenswrapper[4746]: I0128 20:58:14.201859 4746 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/ovn-controller-ovs-fcvh6"
Jan 28 20:58:14 crc kubenswrapper[4746]: I0128 20:58:14.205106 4746 generic.go:334] "Generic (PLEG): container finished" podID="fcb4de78-f54a-4f35-ba3f-960655540032" containerID="76b1bef1de0973ca2cf0cf3c38c5244dac5c458a108541869e7a5f63d1d680a6" exitCode=0
Jan 28 20:58:14 crc kubenswrapper[4746]: I0128 20:58:14.206054 4746 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-666b6646f7-gbg8l" event={"ID":"fcb4de78-f54a-4f35-ba3f-960655540032","Type":"ContainerDied","Data":"76b1bef1de0973ca2cf0cf3c38c5244dac5c458a108541869e7a5f63d1d680a6"}
Jan 28 20:58:14 crc kubenswrapper[4746]: I0128 20:58:14.246264 4746 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/ovn-controller-ovs-fcvh6" podStartSLOduration=26.8667515 podStartE2EDuration="36.246245239s" podCreationTimestamp="2026-01-28 20:57:38 +0000 UTC" firstStartedPulling="2026-01-28 20:57:59.750698741 +0000 UTC m=+1107.706885095" lastFinishedPulling="2026-01-28 20:58:09.13019248 +0000 UTC m=+1117.086378834" observedRunningTime="2026-01-28 20:58:14.237856414 +0000 UTC m=+1122.194042778" watchObservedRunningTime="2026-01-28 20:58:14.246245239 +0000 UTC m=+1122.202431593"
Jan 28 20:58:14 crc kubenswrapper[4746]: I0128 20:58:14.340882 4746 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/ovn-controller-ovs-fcvh6"
Jan 28 20:58:14 crc kubenswrapper[4746]: I0128 20:58:14.372693 4746 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/dnsmasq-dns-57d769cc4f-vfmkg"
Jan 28 20:58:18 crc kubenswrapper[4746]: I0128 20:58:18.258004 4746 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-666b6646f7-gbg8l" event={"ID":"fcb4de78-f54a-4f35-ba3f-960655540032","Type":"ContainerStarted","Data":"89d9445e59761ee006ed7801dc0f527c81e1ce476e94645ba8b7dff5b9b2fa01"}
Jan 28 20:58:18 crc kubenswrapper[4746]: I0128 20:58:18.258686 4746 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/dnsmasq-dns-666b6646f7-gbg8l"
Jan 28 20:58:18 crc kubenswrapper[4746]: I0128 20:58:18.266445 4746 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/alertmanager-metric-storage-0" event={"ID":"0701e4bf-44d6-462c-a55b-140c2efceb6b","Type":"ContainerStarted","Data":"047f751691be2b66374970dc9402252f48ee590d2458dca188664f8613d1e210"}
Jan 28 20:58:18 crc kubenswrapper[4746]: I0128 20:58:18.266906 4746 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/alertmanager-metric-storage-0"
Jan 28 20:58:18 crc kubenswrapper[4746]: I0128 20:58:18.270345 4746 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/alertmanager-metric-storage-0"
Jan 28 20:58:18 crc kubenswrapper[4746]: I0128 20:58:18.286394 4746 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/dnsmasq-dns-666b6646f7-gbg8l" podStartSLOduration=-9223371986.568401 podStartE2EDuration="50.286374541s" podCreationTimestamp="2026-01-28 20:57:28 +0000 UTC" firstStartedPulling="2026-01-28 20:57:29.48125766 +0000 UTC m=+1077.437444014" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-28 20:58:18.282437045 +0000 UTC m=+1126.238623439" watchObservedRunningTime="2026-01-28 20:58:18.286374541 +0000 UTC m=+1126.242560895"
Jan 28 20:58:18 crc kubenswrapper[4746]: I0128 20:58:18.310074 4746 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/alertmanager-metric-storage-0" podStartSLOduration=28.209852026 podStartE2EDuration="43.310048036s" podCreationTimestamp="2026-01-28 20:57:35 +0000 UTC" firstStartedPulling="2026-01-28 20:57:57.354749671 +0000 UTC m=+1105.310936025" lastFinishedPulling="2026-01-28 20:58:12.454945671 +0000 UTC m=+1120.411132035" observedRunningTime="2026-01-28 20:58:18.300762696 +0000 UTC m=+1126.256949100" watchObservedRunningTime="2026-01-28 20:58:18.310048036 +0000 UTC m=+1126.266234400"
Jan 28 20:58:18 crc kubenswrapper[4746]: I0128 20:58:18.320400 4746 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/memcached-0"
Jan 28 20:58:19 crc kubenswrapper[4746]: I0128 20:58:19.278056 4746 generic.go:334] "Generic (PLEG): container finished" podID="fde93743-7b9d-4175-abdf-bd74008cf4b0" containerID="af4cdf1309c1a6bdd11844ef3a15794d9f902b1f3ec405bfe53f763a4322e93a" exitCode=0
Jan 28 20:58:19 crc kubenswrapper[4746]: I0128 20:58:19.278208 4746 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/prometheus-metric-storage-0" event={"ID":"fde93743-7b9d-4175-abdf-bd74008cf4b0","Type":"ContainerDied","Data":"af4cdf1309c1a6bdd11844ef3a15794d9f902b1f3ec405bfe53f763a4322e93a"}
Jan 28 20:58:19 crc kubenswrapper[4746]: I0128 20:58:19.373605 4746 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/dnsmasq-dns-57d769cc4f-vfmkg"
Jan 28 20:58:19 crc kubenswrapper[4746]: I0128 20:58:19.452035 4746 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-666b6646f7-gbg8l"]
Jan 28 20:58:20 crc kubenswrapper[4746]: I0128 20:58:20.289270 4746 generic.go:334] "Generic (PLEG): container finished" podID="7257206d-db68-4f31-84d1-ceb4175ea394" containerID="c4d73b2f92e6eeaf0cd370487c8b8dcf4e4a0fa19f0e5cec00bc27f0b70988d1" exitCode=0
Jan 28 20:58:20 crc kubenswrapper[4746]: I0128 20:58:20.289404 4746 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/openstack-cell1-galera-0" event={"ID":"7257206d-db68-4f31-84d1-ceb4175ea394","Type":"ContainerDied","Data":"c4d73b2f92e6eeaf0cd370487c8b8dcf4e4a0fa19f0e5cec00bc27f0b70988d1"}
Jan 28 20:58:20 crc kubenswrapper[4746]: I0128 20:58:20.291677 4746 generic.go:334] "Generic (PLEG): container finished" podID="e98da54b-efd0-4811-a433-9ce8134feb13" containerID="3af966fe0f7cafb0f0891d536b2adf38e9026a7c01f85db57d6e4d5ace99e33c" exitCode=0
Jan 28 20:58:20 crc kubenswrapper[4746]: I0128 20:58:20.291817 4746 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/openstack-galera-0" event={"ID":"e98da54b-efd0-4811-a433-9ce8134feb13","Type":"ContainerDied","Data":"3af966fe0f7cafb0f0891d536b2adf38e9026a7c01f85db57d6e4d5ace99e33c"}
Jan 28 20:58:20 crc kubenswrapper[4746]: I0128 20:58:20.293186 4746 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/dnsmasq-dns-666b6646f7-gbg8l" podUID="fcb4de78-f54a-4f35-ba3f-960655540032" containerName="dnsmasq-dns" containerID="cri-o://89d9445e59761ee006ed7801dc0f527c81e1ce476e94645ba8b7dff5b9b2fa01" gracePeriod=10
Jan 28 20:58:22 crc kubenswrapper[4746]: I0128 20:58:22.312317 4746 generic.go:334] "Generic (PLEG): container finished" podID="fcb4de78-f54a-4f35-ba3f-960655540032" containerID="89d9445e59761ee006ed7801dc0f527c81e1ce476e94645ba8b7dff5b9b2fa01" exitCode=0
Jan 28 20:58:22 crc kubenswrapper[4746]: I0128 20:58:22.312382 4746 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-666b6646f7-gbg8l" event={"ID":"fcb4de78-f54a-4f35-ba3f-960655540032","Type":"ContainerDied","Data":"89d9445e59761ee006ed7801dc0f527c81e1ce476e94645ba8b7dff5b9b2fa01"}
Jan 28 20:58:22 crc kubenswrapper[4746]: I0128 20:58:22.471664 4746 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-666b6646f7-gbg8l"
Jan 28 20:58:22 crc kubenswrapper[4746]: I0128 20:58:22.610284 4746 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-lz2v8\" (UniqueName: \"kubernetes.io/projected/fcb4de78-f54a-4f35-ba3f-960655540032-kube-api-access-lz2v8\") pod \"fcb4de78-f54a-4f35-ba3f-960655540032\" (UID: \"fcb4de78-f54a-4f35-ba3f-960655540032\") "
Jan 28 20:58:22 crc kubenswrapper[4746]: I0128 20:58:22.612265 4746 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/fcb4de78-f54a-4f35-ba3f-960655540032-dns-svc\") pod \"fcb4de78-f54a-4f35-ba3f-960655540032\" (UID: \"fcb4de78-f54a-4f35-ba3f-960655540032\") "
Jan 28 20:58:22 crc kubenswrapper[4746]: I0128 20:58:22.612344 4746 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/fcb4de78-f54a-4f35-ba3f-960655540032-config\") pod \"fcb4de78-f54a-4f35-ba3f-960655540032\" (UID: \"fcb4de78-f54a-4f35-ba3f-960655540032\") "
Jan 28 20:58:22 crc kubenswrapper[4746]: I0128 20:58:22.614021 4746 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/fcb4de78-f54a-4f35-ba3f-960655540032-kube-api-access-lz2v8" (OuterVolumeSpecName: "kube-api-access-lz2v8") pod "fcb4de78-f54a-4f35-ba3f-960655540032" (UID: "fcb4de78-f54a-4f35-ba3f-960655540032"). InnerVolumeSpecName "kube-api-access-lz2v8". PluginName "kubernetes.io/projected", VolumeGidValue ""
Jan 28 20:58:22 crc kubenswrapper[4746]: I0128 20:58:22.671555 4746 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/fcb4de78-f54a-4f35-ba3f-960655540032-config" (OuterVolumeSpecName: "config") pod "fcb4de78-f54a-4f35-ba3f-960655540032" (UID: "fcb4de78-f54a-4f35-ba3f-960655540032"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Jan 28 20:58:22 crc kubenswrapper[4746]: I0128 20:58:22.695883 4746 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/fcb4de78-f54a-4f35-ba3f-960655540032-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "fcb4de78-f54a-4f35-ba3f-960655540032" (UID: "fcb4de78-f54a-4f35-ba3f-960655540032"). InnerVolumeSpecName "dns-svc". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Jan 28 20:58:22 crc kubenswrapper[4746]: I0128 20:58:22.715064 4746 reconciler_common.go:293] "Volume detached for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/fcb4de78-f54a-4f35-ba3f-960655540032-dns-svc\") on node \"crc\" DevicePath \"\""
Jan 28 20:58:22 crc kubenswrapper[4746]: I0128 20:58:22.715120 4746 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/fcb4de78-f54a-4f35-ba3f-960655540032-config\") on node \"crc\" DevicePath \"\""
Jan 28 20:58:22 crc kubenswrapper[4746]: I0128 20:58:22.715133 4746 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-lz2v8\" (UniqueName: \"kubernetes.io/projected/fcb4de78-f54a-4f35-ba3f-960655540032-kube-api-access-lz2v8\") on node \"crc\" DevicePath \"\""
Jan 28 20:58:23 crc kubenswrapper[4746]: I0128 20:58:23.322589 4746 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovsdbserver-nb-0" event={"ID":"d1e0be80-baed-4c8f-affd-33a252b527ad","Type":"ContainerStarted","Data":"ecf7c122898e4c5726194876044eabe51f41effd9ed5a1cced66c8c9bcb4c85f"}
Jan 28 20:58:23 crc kubenswrapper[4746]: I0128 20:58:23.324727 4746 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-666b6646f7-gbg8l" event={"ID":"fcb4de78-f54a-4f35-ba3f-960655540032","Type":"ContainerDied","Data":"f854e6067df0fc7b7e3aca35111e6885a57a2463e6f0764f4d221e954127918a"}
Jan 28 20:58:23 crc kubenswrapper[4746]: I0128 20:58:23.324763 4746 scope.go:117] "RemoveContainer" containerID="89d9445e59761ee006ed7801dc0f527c81e1ce476e94645ba8b7dff5b9b2fa01"
Jan 28 20:58:23 crc kubenswrapper[4746]: I0128 20:58:23.324863 4746 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-666b6646f7-gbg8l"
Jan 28 20:58:23 crc kubenswrapper[4746]: I0128 20:58:23.327534 4746 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/openstack-cell1-galera-0" event={"ID":"7257206d-db68-4f31-84d1-ceb4175ea394","Type":"ContainerStarted","Data":"421a5f9a04b8e20c91df88238c7458911913a3763e0661859df75574e3bb1e3b"}
Jan 28 20:58:23 crc kubenswrapper[4746]: I0128 20:58:23.333270 4746 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/openstack-galera-0" event={"ID":"e98da54b-efd0-4811-a433-9ce8134feb13","Type":"ContainerStarted","Data":"4627fa441d65d9a5ac4bcbe88ed214d820073b14626aff29f5494a90b6094ac1"}
Jan 28 20:58:23 crc kubenswrapper[4746]: I0128 20:58:23.336654 4746 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovsdbserver-sb-0" event={"ID":"692d10ed-801f-47d2-b069-b3a0cb8dc4b7","Type":"ContainerStarted","Data":"295523659c5b2f13e237d3b6106eb3f54f5e1ae8468f69982c7207050d964ce4"}
Jan 28 20:58:23 crc kubenswrapper[4746]: I0128 20:58:23.351274 4746 scope.go:117] "RemoveContainer" containerID="76b1bef1de0973ca2cf0cf3c38c5244dac5c458a108541869e7a5f63d1d680a6"
Jan 28 20:58:23 crc kubenswrapper[4746]: I0128 20:58:23.355369 4746 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/ovsdbserver-nb-0" podStartSLOduration=22.593652951 podStartE2EDuration="45.355351298s" podCreationTimestamp="2026-01-28 20:57:38 +0000 UTC" firstStartedPulling="2026-01-28 20:57:59.766884444 +0000 UTC m=+1107.723070798" lastFinishedPulling="2026-01-28 20:58:22.528582791 +0000 UTC m=+1130.484769145" observedRunningTime="2026-01-28 20:58:23.348985688 +0000 UTC m=+1131.305172052" watchObservedRunningTime="2026-01-28 20:58:23.355351298 +0000 UTC m=+1131.311537652"
Jan 28 20:58:23 crc kubenswrapper[4746]: I0128 20:58:23.370563 4746 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-666b6646f7-gbg8l"]
Jan 28 20:58:23 crc kubenswrapper[4746]: I0128 20:58:23.384684 4746 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-666b6646f7-gbg8l"]
Jan 28 20:58:23 crc kubenswrapper[4746]: I0128 20:58:23.395023 4746 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/ovsdbserver-sb-0" podStartSLOduration=18.697057665 podStartE2EDuration="41.395005232s" podCreationTimestamp="2026-01-28 20:57:42 +0000 UTC" firstStartedPulling="2026-01-28 20:57:59.790174239 +0000 UTC m=+1107.746360583" lastFinishedPulling="2026-01-28 20:58:22.488121796 +0000 UTC m=+1130.444308150" observedRunningTime="2026-01-28 20:58:23.384700705 +0000 UTC m=+1131.340887079" watchObservedRunningTime="2026-01-28 20:58:23.395005232 +0000 UTC m=+1131.351191586"
Jan 28 20:58:23 crc kubenswrapper[4746]: I0128 20:58:23.421771 4746 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/openstack-cell1-galera-0" podStartSLOduration=40.439819804 podStartE2EDuration="52.421722508s" podCreationTimestamp="2026-01-28 20:57:31 +0000 UTC" firstStartedPulling="2026-01-28 20:57:57.354680899 +0000 UTC m=+1105.310867253" lastFinishedPulling="2026-01-28 20:58:09.336583603 +0000 UTC m=+1117.292769957" observedRunningTime="2026-01-28 20:58:23.415070289 +0000 UTC m=+1131.371256643" watchObservedRunningTime="2026-01-28 20:58:23.421722508 +0000 UTC m=+1131.377908862"
Jan 28 20:58:23 crc kubenswrapper[4746]: I0128 20:58:23.437823 4746 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/openstack-galera-0" podStartSLOduration=42.54401155 podStartE2EDuration="53.437805619s" podCreationTimestamp="2026-01-28 20:57:30 +0000 UTC" firstStartedPulling="2026-01-28 20:57:58.441646133 +0000 UTC m=+1106.397832487" lastFinishedPulling="2026-01-28 20:58:09.335440202 +0000 UTC m=+1117.291626556" observedRunningTime="2026-01-28 20:58:23.433134614 +0000 UTC m=+1131.389320968" watchObservedRunningTime="2026-01-28 20:58:23.437805619 +0000 UTC m=+1131.393991973"
Jan 28 20:58:23 crc kubenswrapper[4746]: I0128 20:58:23.924527 4746 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/ovsdbserver-sb-0"
Jan 28 20:58:24 crc kubenswrapper[4746]: I0128 20:58:24.844347 4746 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="fcb4de78-f54a-4f35-ba3f-960655540032" path="/var/lib/kubelet/pods/fcb4de78-f54a-4f35-ba3f-960655540032/volumes"
Jan 28 20:58:24 crc kubenswrapper[4746]: I0128 20:58:24.976250 4746 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/ovsdbserver-nb-0"
Jan 28 20:58:24 crc kubenswrapper[4746]: I0128 20:58:24.976314 4746 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/ovsdbserver-nb-0"
Jan 28 20:58:25 crc kubenswrapper[4746]: I0128 20:58:25.059989 4746 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/ovsdbserver-nb-0"
Jan 28 20:58:25 crc kubenswrapper[4746]: I0128 20:58:25.139258 4746 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-7cb5889db5-922j4"]
Jan 28 20:58:25 crc kubenswrapper[4746]: E0128 20:58:25.139866 4746 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="fcb4de78-f54a-4f35-ba3f-960655540032" containerName="init"
Jan 28 20:58:25 crc kubenswrapper[4746]: I0128 20:58:25.139948 4746 state_mem.go:107] "Deleted CPUSet assignment" podUID="fcb4de78-f54a-4f35-ba3f-960655540032" containerName="init"
Jan 28 20:58:25 crc kubenswrapper[4746]: E0128 20:58:25.140019 4746 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="fcb4de78-f54a-4f35-ba3f-960655540032" containerName="dnsmasq-dns"
Jan 28 20:58:25 crc kubenswrapper[4746]: I0128 20:58:25.140094 4746 state_mem.go:107] "Deleted CPUSet assignment" podUID="fcb4de78-f54a-4f35-ba3f-960655540032" containerName="dnsmasq-dns"
Jan 28 20:58:25 crc kubenswrapper[4746]: I0128 20:58:25.140302 4746 memory_manager.go:354] "RemoveStaleState removing state" podUID="fcb4de78-f54a-4f35-ba3f-960655540032" containerName="dnsmasq-dns"
Jan 28 20:58:25 crc kubenswrapper[4746]: I0128 20:58:25.141237 4746 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-7cb5889db5-922j4"
Jan 28 20:58:25 crc kubenswrapper[4746]: I0128 20:58:25.187431 4746 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-7cb5889db5-922j4"]
Jan 28 20:58:25 crc kubenswrapper[4746]: I0128 20:58:25.237838 4746 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/kube-state-metrics-0"
Jan 28 20:58:25 crc kubenswrapper[4746]: I0128 20:58:25.274553 4746 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-85q6g\" (UniqueName: \"kubernetes.io/projected/d4bec6f2-d32d-4c25-ac4f-fddbd615754a-kube-api-access-85q6g\") pod \"dnsmasq-dns-7cb5889db5-922j4\" (UID: \"d4bec6f2-d32d-4c25-ac4f-fddbd615754a\") " pod="openstack/dnsmasq-dns-7cb5889db5-922j4"
Jan 28 20:58:25 crc kubenswrapper[4746]: I0128 20:58:25.274604 4746 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/d4bec6f2-d32d-4c25-ac4f-fddbd615754a-config\") pod \"dnsmasq-dns-7cb5889db5-922j4\" (UID: \"d4bec6f2-d32d-4c25-ac4f-fddbd615754a\") " pod="openstack/dnsmasq-dns-7cb5889db5-922j4"
Jan 28 20:58:25 crc kubenswrapper[4746]: I0128 20:58:25.274663 4746 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/d4bec6f2-d32d-4c25-ac4f-fddbd615754a-dns-svc\") pod \"dnsmasq-dns-7cb5889db5-922j4\" (UID: \"d4bec6f2-d32d-4c25-ac4f-fddbd615754a\") " pod="openstack/dnsmasq-dns-7cb5889db5-922j4"
Jan 28 20:58:25 crc kubenswrapper[4746]: I0128 20:58:25.376035 4746 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-85q6g\" (UniqueName: \"kubernetes.io/projected/d4bec6f2-d32d-4c25-ac4f-fddbd615754a-kube-api-access-85q6g\") pod \"dnsmasq-dns-7cb5889db5-922j4\" (UID: \"d4bec6f2-d32d-4c25-ac4f-fddbd615754a\") " pod="openstack/dnsmasq-dns-7cb5889db5-922j4"
Jan 28 20:58:25 crc kubenswrapper[4746]: I0128 20:58:25.376381 4746 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/d4bec6f2-d32d-4c25-ac4f-fddbd615754a-config\") pod \"dnsmasq-dns-7cb5889db5-922j4\" (UID: \"d4bec6f2-d32d-4c25-ac4f-fddbd615754a\") " pod="openstack/dnsmasq-dns-7cb5889db5-922j4"
Jan 28 20:58:25 crc kubenswrapper[4746]: I0128 20:58:25.376438 4746 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/d4bec6f2-d32d-4c25-ac4f-fddbd615754a-dns-svc\") pod \"dnsmasq-dns-7cb5889db5-922j4\" (UID: \"d4bec6f2-d32d-4c25-ac4f-fddbd615754a\") " pod="openstack/dnsmasq-dns-7cb5889db5-922j4"
Jan 28 20:58:25 crc kubenswrapper[4746]: I0128 20:58:25.377415 4746 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/d4bec6f2-d32d-4c25-ac4f-fddbd615754a-dns-svc\") pod \"dnsmasq-dns-7cb5889db5-922j4\" (UID: \"d4bec6f2-d32d-4c25-ac4f-fddbd615754a\") " pod="openstack/dnsmasq-dns-7cb5889db5-922j4"
Jan 28 20:58:25 crc kubenswrapper[4746]: I0128 20:58:25.377609 4746 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/d4bec6f2-d32d-4c25-ac4f-fddbd615754a-config\") pod \"dnsmasq-dns-7cb5889db5-922j4\" (UID: \"d4bec6f2-d32d-4c25-ac4f-fddbd615754a\") " pod="openstack/dnsmasq-dns-7cb5889db5-922j4"
Jan 28 20:58:25 crc kubenswrapper[4746]: I0128 20:58:25.404638 4746 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-85q6g\" (UniqueName: \"kubernetes.io/projected/d4bec6f2-d32d-4c25-ac4f-fddbd615754a-kube-api-access-85q6g\") pod \"dnsmasq-dns-7cb5889db5-922j4\" (UID: \"d4bec6f2-d32d-4c25-ac4f-fddbd615754a\") " pod="openstack/dnsmasq-dns-7cb5889db5-922j4"
Jan 28 20:58:25 crc kubenswrapper[4746]: I0128 20:58:25.472326 4746 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-7cb5889db5-922j4"
Jan 28 20:58:25 crc kubenswrapper[4746]: I0128 20:58:25.478141 4746 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ovn-controller-metrics-5vb9w"]
Jan 28 20:58:25 crc kubenswrapper[4746]: I0128 20:58:25.479393 4746 util.go:30] "No sandbox for pod can be found.
Need to start a new one" pod="openstack/ovn-controller-metrics-5vb9w" Jan 28 20:58:25 crc kubenswrapper[4746]: I0128 20:58:25.484602 4746 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/ovsdbserver-nb-0" Jan 28 20:58:25 crc kubenswrapper[4746]: I0128 20:58:25.484934 4746 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"ovncontroller-metrics-config" Jan 28 20:58:25 crc kubenswrapper[4746]: I0128 20:58:25.498065 4746 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovn-controller-metrics-5vb9w"] Jan 28 20:58:25 crc kubenswrapper[4746]: I0128 20:58:25.585139 4746 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/7349005b-b4d2-40b0-bc5c-d83acafaf9e3-config\") pod \"ovn-controller-metrics-5vb9w\" (UID: \"7349005b-b4d2-40b0-bc5c-d83acafaf9e3\") " pod="openstack/ovn-controller-metrics-5vb9w" Jan 28 20:58:25 crc kubenswrapper[4746]: I0128 20:58:25.585213 4746 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovs-rundir\" (UniqueName: \"kubernetes.io/host-path/7349005b-b4d2-40b0-bc5c-d83acafaf9e3-ovs-rundir\") pod \"ovn-controller-metrics-5vb9w\" (UID: \"7349005b-b4d2-40b0-bc5c-d83acafaf9e3\") " pod="openstack/ovn-controller-metrics-5vb9w" Jan 28 20:58:25 crc kubenswrapper[4746]: I0128 20:58:25.585240 4746 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-7gwpr\" (UniqueName: \"kubernetes.io/projected/7349005b-b4d2-40b0-bc5c-d83acafaf9e3-kube-api-access-7gwpr\") pod \"ovn-controller-metrics-5vb9w\" (UID: \"7349005b-b4d2-40b0-bc5c-d83acafaf9e3\") " pod="openstack/ovn-controller-metrics-5vb9w" Jan 28 20:58:25 crc kubenswrapper[4746]: I0128 20:58:25.585315 4746 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: 
\"kubernetes.io/secret/7349005b-b4d2-40b0-bc5c-d83acafaf9e3-combined-ca-bundle\") pod \"ovn-controller-metrics-5vb9w\" (UID: \"7349005b-b4d2-40b0-bc5c-d83acafaf9e3\") " pod="openstack/ovn-controller-metrics-5vb9w" Jan 28 20:58:25 crc kubenswrapper[4746]: I0128 20:58:25.585344 4746 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-certs-tls-certs\" (UniqueName: \"kubernetes.io/secret/7349005b-b4d2-40b0-bc5c-d83acafaf9e3-metrics-certs-tls-certs\") pod \"ovn-controller-metrics-5vb9w\" (UID: \"7349005b-b4d2-40b0-bc5c-d83acafaf9e3\") " pod="openstack/ovn-controller-metrics-5vb9w" Jan 28 20:58:25 crc kubenswrapper[4746]: I0128 20:58:25.585397 4746 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovn-rundir\" (UniqueName: \"kubernetes.io/host-path/7349005b-b4d2-40b0-bc5c-d83acafaf9e3-ovn-rundir\") pod \"ovn-controller-metrics-5vb9w\" (UID: \"7349005b-b4d2-40b0-bc5c-d83acafaf9e3\") " pod="openstack/ovn-controller-metrics-5vb9w" Jan 28 20:58:25 crc kubenswrapper[4746]: I0128 20:58:25.687568 4746 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovn-rundir\" (UniqueName: \"kubernetes.io/host-path/7349005b-b4d2-40b0-bc5c-d83acafaf9e3-ovn-rundir\") pod \"ovn-controller-metrics-5vb9w\" (UID: \"7349005b-b4d2-40b0-bc5c-d83acafaf9e3\") " pod="openstack/ovn-controller-metrics-5vb9w" Jan 28 20:58:25 crc kubenswrapper[4746]: I0128 20:58:25.687653 4746 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/7349005b-b4d2-40b0-bc5c-d83acafaf9e3-config\") pod \"ovn-controller-metrics-5vb9w\" (UID: \"7349005b-b4d2-40b0-bc5c-d83acafaf9e3\") " pod="openstack/ovn-controller-metrics-5vb9w" Jan 28 20:58:25 crc kubenswrapper[4746]: I0128 20:58:25.687688 4746 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovs-rundir\" (UniqueName: 
\"kubernetes.io/host-path/7349005b-b4d2-40b0-bc5c-d83acafaf9e3-ovs-rundir\") pod \"ovn-controller-metrics-5vb9w\" (UID: \"7349005b-b4d2-40b0-bc5c-d83acafaf9e3\") " pod="openstack/ovn-controller-metrics-5vb9w" Jan 28 20:58:25 crc kubenswrapper[4746]: I0128 20:58:25.687708 4746 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-7gwpr\" (UniqueName: \"kubernetes.io/projected/7349005b-b4d2-40b0-bc5c-d83acafaf9e3-kube-api-access-7gwpr\") pod \"ovn-controller-metrics-5vb9w\" (UID: \"7349005b-b4d2-40b0-bc5c-d83acafaf9e3\") " pod="openstack/ovn-controller-metrics-5vb9w" Jan 28 20:58:25 crc kubenswrapper[4746]: I0128 20:58:25.687766 4746 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/7349005b-b4d2-40b0-bc5c-d83acafaf9e3-combined-ca-bundle\") pod \"ovn-controller-metrics-5vb9w\" (UID: \"7349005b-b4d2-40b0-bc5c-d83acafaf9e3\") " pod="openstack/ovn-controller-metrics-5vb9w" Jan 28 20:58:25 crc kubenswrapper[4746]: I0128 20:58:25.687788 4746 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs-tls-certs\" (UniqueName: \"kubernetes.io/secret/7349005b-b4d2-40b0-bc5c-d83acafaf9e3-metrics-certs-tls-certs\") pod \"ovn-controller-metrics-5vb9w\" (UID: \"7349005b-b4d2-40b0-bc5c-d83acafaf9e3\") " pod="openstack/ovn-controller-metrics-5vb9w" Jan 28 20:58:25 crc kubenswrapper[4746]: I0128 20:58:25.688011 4746 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovn-rundir\" (UniqueName: \"kubernetes.io/host-path/7349005b-b4d2-40b0-bc5c-d83acafaf9e3-ovn-rundir\") pod \"ovn-controller-metrics-5vb9w\" (UID: \"7349005b-b4d2-40b0-bc5c-d83acafaf9e3\") " pod="openstack/ovn-controller-metrics-5vb9w" Jan 28 20:58:25 crc kubenswrapper[4746]: I0128 20:58:25.688138 4746 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovs-rundir\" (UniqueName: 
\"kubernetes.io/host-path/7349005b-b4d2-40b0-bc5c-d83acafaf9e3-ovs-rundir\") pod \"ovn-controller-metrics-5vb9w\" (UID: \"7349005b-b4d2-40b0-bc5c-d83acafaf9e3\") " pod="openstack/ovn-controller-metrics-5vb9w" Jan 28 20:58:25 crc kubenswrapper[4746]: I0128 20:58:25.721502 4746 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/7349005b-b4d2-40b0-bc5c-d83acafaf9e3-config\") pod \"ovn-controller-metrics-5vb9w\" (UID: \"7349005b-b4d2-40b0-bc5c-d83acafaf9e3\") " pod="openstack/ovn-controller-metrics-5vb9w" Jan 28 20:58:25 crc kubenswrapper[4746]: I0128 20:58:25.727050 4746 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics-certs-tls-certs\" (UniqueName: \"kubernetes.io/secret/7349005b-b4d2-40b0-bc5c-d83acafaf9e3-metrics-certs-tls-certs\") pod \"ovn-controller-metrics-5vb9w\" (UID: \"7349005b-b4d2-40b0-bc5c-d83acafaf9e3\") " pod="openstack/ovn-controller-metrics-5vb9w" Jan 28 20:58:25 crc kubenswrapper[4746]: I0128 20:58:25.735615 4746 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/7349005b-b4d2-40b0-bc5c-d83acafaf9e3-combined-ca-bundle\") pod \"ovn-controller-metrics-5vb9w\" (UID: \"7349005b-b4d2-40b0-bc5c-d83acafaf9e3\") " pod="openstack/ovn-controller-metrics-5vb9w" Jan 28 20:58:25 crc kubenswrapper[4746]: I0128 20:58:25.740215 4746 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-7gwpr\" (UniqueName: \"kubernetes.io/projected/7349005b-b4d2-40b0-bc5c-d83acafaf9e3-kube-api-access-7gwpr\") pod \"ovn-controller-metrics-5vb9w\" (UID: \"7349005b-b4d2-40b0-bc5c-d83acafaf9e3\") " pod="openstack/ovn-controller-metrics-5vb9w" Jan 28 20:58:25 crc kubenswrapper[4746]: I0128 20:58:25.757964 4746 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-7cb5889db5-922j4"] Jan 28 20:58:25 crc kubenswrapper[4746]: I0128 20:58:25.817968 4746 util.go:30] "No 
sandbox for pod can be found. Need to start a new one" pod="openstack/ovn-controller-metrics-5vb9w" Jan 28 20:58:25 crc kubenswrapper[4746]: I0128 20:58:25.821827 4746 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-74f6f696b9-vwgs9"] Jan 28 20:58:25 crc kubenswrapper[4746]: I0128 20:58:25.823753 4746 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-74f6f696b9-vwgs9" Jan 28 20:58:25 crc kubenswrapper[4746]: I0128 20:58:25.829482 4746 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"ovsdbserver-nb" Jan 28 20:58:25 crc kubenswrapper[4746]: I0128 20:58:25.833109 4746 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-74f6f696b9-vwgs9"] Jan 28 20:58:25 crc kubenswrapper[4746]: I0128 20:58:25.891010 4746 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/8bee76ad-4034-469f-8a4e-8dff76089361-ovsdbserver-nb\") pod \"dnsmasq-dns-74f6f696b9-vwgs9\" (UID: \"8bee76ad-4034-469f-8a4e-8dff76089361\") " pod="openstack/dnsmasq-dns-74f6f696b9-vwgs9" Jan 28 20:58:25 crc kubenswrapper[4746]: I0128 20:58:25.891116 4746 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/8bee76ad-4034-469f-8a4e-8dff76089361-config\") pod \"dnsmasq-dns-74f6f696b9-vwgs9\" (UID: \"8bee76ad-4034-469f-8a4e-8dff76089361\") " pod="openstack/dnsmasq-dns-74f6f696b9-vwgs9" Jan 28 20:58:25 crc kubenswrapper[4746]: I0128 20:58:25.891389 4746 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-8v7bg\" (UniqueName: \"kubernetes.io/projected/8bee76ad-4034-469f-8a4e-8dff76089361-kube-api-access-8v7bg\") pod \"dnsmasq-dns-74f6f696b9-vwgs9\" (UID: \"8bee76ad-4034-469f-8a4e-8dff76089361\") " pod="openstack/dnsmasq-dns-74f6f696b9-vwgs9" Jan 28 
20:58:25 crc kubenswrapper[4746]: I0128 20:58:25.891580 4746 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/8bee76ad-4034-469f-8a4e-8dff76089361-dns-svc\") pod \"dnsmasq-dns-74f6f696b9-vwgs9\" (UID: \"8bee76ad-4034-469f-8a4e-8dff76089361\") " pod="openstack/dnsmasq-dns-74f6f696b9-vwgs9" Jan 28 20:58:25 crc kubenswrapper[4746]: I0128 20:58:25.924684 4746 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/ovsdbserver-sb-0" Jan 28 20:58:25 crc kubenswrapper[4746]: I0128 20:58:25.960321 4746 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-74f6f696b9-vwgs9"] Jan 28 20:58:25 crc kubenswrapper[4746]: E0128 20:58:25.961027 4746 pod_workers.go:1301] "Error syncing pod, skipping" err="unmounted volumes=[config dns-svc kube-api-access-8v7bg ovsdbserver-nb], unattached volumes=[], failed to process volumes=[]: context canceled" pod="openstack/dnsmasq-dns-74f6f696b9-vwgs9" podUID="8bee76ad-4034-469f-8a4e-8dff76089361" Jan 28 20:58:25 crc kubenswrapper[4746]: I0128 20:58:25.987279 4746 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-698758b865-pdnv6"] Jan 28 20:58:25 crc kubenswrapper[4746]: I0128 20:58:25.988674 4746 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-698758b865-pdnv6" Jan 28 20:58:25 crc kubenswrapper[4746]: I0128 20:58:25.992462 4746 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"ovsdbserver-sb" Jan 28 20:58:25 crc kubenswrapper[4746]: I0128 20:58:25.995712 4746 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/8bee76ad-4034-469f-8a4e-8dff76089361-ovsdbserver-nb\") pod \"dnsmasq-dns-74f6f696b9-vwgs9\" (UID: \"8bee76ad-4034-469f-8a4e-8dff76089361\") " pod="openstack/dnsmasq-dns-74f6f696b9-vwgs9" Jan 28 20:58:25 crc kubenswrapper[4746]: I0128 20:58:25.995775 4746 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/8bee76ad-4034-469f-8a4e-8dff76089361-config\") pod \"dnsmasq-dns-74f6f696b9-vwgs9\" (UID: \"8bee76ad-4034-469f-8a4e-8dff76089361\") " pod="openstack/dnsmasq-dns-74f6f696b9-vwgs9" Jan 28 20:58:25 crc kubenswrapper[4746]: I0128 20:58:25.995839 4746 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-8v7bg\" (UniqueName: \"kubernetes.io/projected/8bee76ad-4034-469f-8a4e-8dff76089361-kube-api-access-8v7bg\") pod \"dnsmasq-dns-74f6f696b9-vwgs9\" (UID: \"8bee76ad-4034-469f-8a4e-8dff76089361\") " pod="openstack/dnsmasq-dns-74f6f696b9-vwgs9" Jan 28 20:58:25 crc kubenswrapper[4746]: I0128 20:58:25.995882 4746 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/8bee76ad-4034-469f-8a4e-8dff76089361-dns-svc\") pod \"dnsmasq-dns-74f6f696b9-vwgs9\" (UID: \"8bee76ad-4034-469f-8a4e-8dff76089361\") " pod="openstack/dnsmasq-dns-74f6f696b9-vwgs9" Jan 28 20:58:25 crc kubenswrapper[4746]: I0128 20:58:25.996882 4746 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: 
\"kubernetes.io/configmap/8bee76ad-4034-469f-8a4e-8dff76089361-dns-svc\") pod \"dnsmasq-dns-74f6f696b9-vwgs9\" (UID: \"8bee76ad-4034-469f-8a4e-8dff76089361\") " pod="openstack/dnsmasq-dns-74f6f696b9-vwgs9" Jan 28 20:58:25 crc kubenswrapper[4746]: I0128 20:58:25.997070 4746 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/8bee76ad-4034-469f-8a4e-8dff76089361-ovsdbserver-nb\") pod \"dnsmasq-dns-74f6f696b9-vwgs9\" (UID: \"8bee76ad-4034-469f-8a4e-8dff76089361\") " pod="openstack/dnsmasq-dns-74f6f696b9-vwgs9" Jan 28 20:58:26 crc kubenswrapper[4746]: I0128 20:58:26.002340 4746 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/8bee76ad-4034-469f-8a4e-8dff76089361-config\") pod \"dnsmasq-dns-74f6f696b9-vwgs9\" (UID: \"8bee76ad-4034-469f-8a4e-8dff76089361\") " pod="openstack/dnsmasq-dns-74f6f696b9-vwgs9" Jan 28 20:58:26 crc kubenswrapper[4746]: I0128 20:58:26.010047 4746 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-698758b865-pdnv6"] Jan 28 20:58:26 crc kubenswrapper[4746]: I0128 20:58:26.017066 4746 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/ovsdbserver-sb-0" Jan 28 20:58:26 crc kubenswrapper[4746]: I0128 20:58:26.025056 4746 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-8v7bg\" (UniqueName: \"kubernetes.io/projected/8bee76ad-4034-469f-8a4e-8dff76089361-kube-api-access-8v7bg\") pod \"dnsmasq-dns-74f6f696b9-vwgs9\" (UID: \"8bee76ad-4034-469f-8a4e-8dff76089361\") " pod="openstack/dnsmasq-dns-74f6f696b9-vwgs9" Jan 28 20:58:26 crc kubenswrapper[4746]: I0128 20:58:26.097247 4746 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/b1a60d3b-bcc7-47e3-94b0-12acae8ccfda-dns-svc\") pod \"dnsmasq-dns-698758b865-pdnv6\" (UID: 
\"b1a60d3b-bcc7-47e3-94b0-12acae8ccfda\") " pod="openstack/dnsmasq-dns-698758b865-pdnv6" Jan 28 20:58:26 crc kubenswrapper[4746]: I0128 20:58:26.097666 4746 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-frf92\" (UniqueName: \"kubernetes.io/projected/b1a60d3b-bcc7-47e3-94b0-12acae8ccfda-kube-api-access-frf92\") pod \"dnsmasq-dns-698758b865-pdnv6\" (UID: \"b1a60d3b-bcc7-47e3-94b0-12acae8ccfda\") " pod="openstack/dnsmasq-dns-698758b865-pdnv6" Jan 28 20:58:26 crc kubenswrapper[4746]: I0128 20:58:26.097708 4746 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/b1a60d3b-bcc7-47e3-94b0-12acae8ccfda-ovsdbserver-nb\") pod \"dnsmasq-dns-698758b865-pdnv6\" (UID: \"b1a60d3b-bcc7-47e3-94b0-12acae8ccfda\") " pod="openstack/dnsmasq-dns-698758b865-pdnv6" Jan 28 20:58:26 crc kubenswrapper[4746]: I0128 20:58:26.097836 4746 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/b1a60d3b-bcc7-47e3-94b0-12acae8ccfda-config\") pod \"dnsmasq-dns-698758b865-pdnv6\" (UID: \"b1a60d3b-bcc7-47e3-94b0-12acae8ccfda\") " pod="openstack/dnsmasq-dns-698758b865-pdnv6" Jan 28 20:58:26 crc kubenswrapper[4746]: I0128 20:58:26.097906 4746 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/b1a60d3b-bcc7-47e3-94b0-12acae8ccfda-ovsdbserver-sb\") pod \"dnsmasq-dns-698758b865-pdnv6\" (UID: \"b1a60d3b-bcc7-47e3-94b0-12acae8ccfda\") " pod="openstack/dnsmasq-dns-698758b865-pdnv6" Jan 28 20:58:26 crc kubenswrapper[4746]: I0128 20:58:26.199246 4746 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/b1a60d3b-bcc7-47e3-94b0-12acae8ccfda-ovsdbserver-sb\") pod 
\"dnsmasq-dns-698758b865-pdnv6\" (UID: \"b1a60d3b-bcc7-47e3-94b0-12acae8ccfda\") " pod="openstack/dnsmasq-dns-698758b865-pdnv6" Jan 28 20:58:26 crc kubenswrapper[4746]: I0128 20:58:26.199319 4746 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/b1a60d3b-bcc7-47e3-94b0-12acae8ccfda-dns-svc\") pod \"dnsmasq-dns-698758b865-pdnv6\" (UID: \"b1a60d3b-bcc7-47e3-94b0-12acae8ccfda\") " pod="openstack/dnsmasq-dns-698758b865-pdnv6" Jan 28 20:58:26 crc kubenswrapper[4746]: I0128 20:58:26.199365 4746 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-frf92\" (UniqueName: \"kubernetes.io/projected/b1a60d3b-bcc7-47e3-94b0-12acae8ccfda-kube-api-access-frf92\") pod \"dnsmasq-dns-698758b865-pdnv6\" (UID: \"b1a60d3b-bcc7-47e3-94b0-12acae8ccfda\") " pod="openstack/dnsmasq-dns-698758b865-pdnv6" Jan 28 20:58:26 crc kubenswrapper[4746]: I0128 20:58:26.199385 4746 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/b1a60d3b-bcc7-47e3-94b0-12acae8ccfda-ovsdbserver-nb\") pod \"dnsmasq-dns-698758b865-pdnv6\" (UID: \"b1a60d3b-bcc7-47e3-94b0-12acae8ccfda\") " pod="openstack/dnsmasq-dns-698758b865-pdnv6" Jan 28 20:58:26 crc kubenswrapper[4746]: I0128 20:58:26.199452 4746 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/b1a60d3b-bcc7-47e3-94b0-12acae8ccfda-config\") pod \"dnsmasq-dns-698758b865-pdnv6\" (UID: \"b1a60d3b-bcc7-47e3-94b0-12acae8ccfda\") " pod="openstack/dnsmasq-dns-698758b865-pdnv6" Jan 28 20:58:26 crc kubenswrapper[4746]: I0128 20:58:26.200518 4746 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/b1a60d3b-bcc7-47e3-94b0-12acae8ccfda-config\") pod \"dnsmasq-dns-698758b865-pdnv6\" (UID: \"b1a60d3b-bcc7-47e3-94b0-12acae8ccfda\") " 
pod="openstack/dnsmasq-dns-698758b865-pdnv6" Jan 28 20:58:26 crc kubenswrapper[4746]: I0128 20:58:26.200596 4746 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/b1a60d3b-bcc7-47e3-94b0-12acae8ccfda-dns-svc\") pod \"dnsmasq-dns-698758b865-pdnv6\" (UID: \"b1a60d3b-bcc7-47e3-94b0-12acae8ccfda\") " pod="openstack/dnsmasq-dns-698758b865-pdnv6" Jan 28 20:58:26 crc kubenswrapper[4746]: I0128 20:58:26.200638 4746 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/b1a60d3b-bcc7-47e3-94b0-12acae8ccfda-ovsdbserver-nb\") pod \"dnsmasq-dns-698758b865-pdnv6\" (UID: \"b1a60d3b-bcc7-47e3-94b0-12acae8ccfda\") " pod="openstack/dnsmasq-dns-698758b865-pdnv6" Jan 28 20:58:26 crc kubenswrapper[4746]: I0128 20:58:26.200938 4746 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/b1a60d3b-bcc7-47e3-94b0-12acae8ccfda-ovsdbserver-sb\") pod \"dnsmasq-dns-698758b865-pdnv6\" (UID: \"b1a60d3b-bcc7-47e3-94b0-12acae8ccfda\") " pod="openstack/dnsmasq-dns-698758b865-pdnv6" Jan 28 20:58:26 crc kubenswrapper[4746]: I0128 20:58:26.222717 4746 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-frf92\" (UniqueName: \"kubernetes.io/projected/b1a60d3b-bcc7-47e3-94b0-12acae8ccfda-kube-api-access-frf92\") pod \"dnsmasq-dns-698758b865-pdnv6\" (UID: \"b1a60d3b-bcc7-47e3-94b0-12acae8ccfda\") " pod="openstack/dnsmasq-dns-698758b865-pdnv6" Jan 28 20:58:26 crc kubenswrapper[4746]: I0128 20:58:26.304399 4746 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-698758b865-pdnv6" Jan 28 20:58:26 crc kubenswrapper[4746]: I0128 20:58:26.314351 4746 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/swift-storage-0"] Jan 28 20:58:26 crc kubenswrapper[4746]: I0128 20:58:26.321209 4746 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/swift-storage-0" Jan 28 20:58:26 crc kubenswrapper[4746]: I0128 20:58:26.325332 4746 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"swift-ring-files" Jan 28 20:58:26 crc kubenswrapper[4746]: I0128 20:58:26.325411 4746 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"swift-storage-config-data" Jan 28 20:58:26 crc kubenswrapper[4746]: I0128 20:58:26.326136 4746 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"swift-swift-dockercfg-4fq7d" Jan 28 20:58:26 crc kubenswrapper[4746]: I0128 20:58:26.330723 4746 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"swift-conf" Jan 28 20:58:26 crc kubenswrapper[4746]: I0128 20:58:26.339301 4746 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/swift-storage-0"] Jan 28 20:58:26 crc kubenswrapper[4746]: I0128 20:58:26.386167 4746 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-74f6f696b9-vwgs9" Jan 28 20:58:26 crc kubenswrapper[4746]: I0128 20:58:26.406289 4746 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/projected/39e8de66-78c6-45cf-b026-7783ef89922d-etc-swift\") pod \"swift-storage-0\" (UID: \"39e8de66-78c6-45cf-b026-7783ef89922d\") " pod="openstack/swift-storage-0" Jan 28 20:58:26 crc kubenswrapper[4746]: I0128 20:58:26.406350 4746 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cache\" (UniqueName: \"kubernetes.io/empty-dir/39e8de66-78c6-45cf-b026-7783ef89922d-cache\") pod \"swift-storage-0\" (UID: \"39e8de66-78c6-45cf-b026-7783ef89922d\") " pod="openstack/swift-storage-0" Jan 28 20:58:26 crc kubenswrapper[4746]: I0128 20:58:26.406395 4746 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-96mx6\" (UniqueName: \"kubernetes.io/projected/39e8de66-78c6-45cf-b026-7783ef89922d-kube-api-access-96mx6\") pod \"swift-storage-0\" (UID: \"39e8de66-78c6-45cf-b026-7783ef89922d\") " pod="openstack/swift-storage-0" Jan 28 20:58:26 crc kubenswrapper[4746]: I0128 20:58:26.406461 4746 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"lock\" (UniqueName: \"kubernetes.io/empty-dir/39e8de66-78c6-45cf-b026-7783ef89922d-lock\") pod \"swift-storage-0\" (UID: \"39e8de66-78c6-45cf-b026-7783ef89922d\") " pod="openstack/swift-storage-0" Jan 28 20:58:26 crc kubenswrapper[4746]: I0128 20:58:26.406528 4746 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"pvc-6e1d7b3f-c034-4be2-9f14-bfffce532119\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-6e1d7b3f-c034-4be2-9f14-bfffce532119\") pod \"swift-storage-0\" (UID: \"39e8de66-78c6-45cf-b026-7783ef89922d\") " pod="openstack/swift-storage-0" Jan 28 
20:58:26 crc kubenswrapper[4746]: I0128 20:58:26.406574 4746 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/39e8de66-78c6-45cf-b026-7783ef89922d-combined-ca-bundle\") pod \"swift-storage-0\" (UID: \"39e8de66-78c6-45cf-b026-7783ef89922d\") " pod="openstack/swift-storage-0" Jan 28 20:58:26 crc kubenswrapper[4746]: I0128 20:58:26.416884 4746 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-74f6f696b9-vwgs9" Jan 28 20:58:26 crc kubenswrapper[4746]: I0128 20:58:26.434997 4746 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/ovsdbserver-sb-0" Jan 28 20:58:26 crc kubenswrapper[4746]: I0128 20:58:26.507929 4746 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/8bee76ad-4034-469f-8a4e-8dff76089361-ovsdbserver-nb\") pod \"8bee76ad-4034-469f-8a4e-8dff76089361\" (UID: \"8bee76ad-4034-469f-8a4e-8dff76089361\") " Jan 28 20:58:26 crc kubenswrapper[4746]: I0128 20:58:26.507970 4746 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/8bee76ad-4034-469f-8a4e-8dff76089361-config\") pod \"8bee76ad-4034-469f-8a4e-8dff76089361\" (UID: \"8bee76ad-4034-469f-8a4e-8dff76089361\") " Jan 28 20:58:26 crc kubenswrapper[4746]: I0128 20:58:26.508008 4746 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-8v7bg\" (UniqueName: \"kubernetes.io/projected/8bee76ad-4034-469f-8a4e-8dff76089361-kube-api-access-8v7bg\") pod \"8bee76ad-4034-469f-8a4e-8dff76089361\" (UID: \"8bee76ad-4034-469f-8a4e-8dff76089361\") " Jan 28 20:58:26 crc kubenswrapper[4746]: I0128 20:58:26.508068 4746 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-svc\" (UniqueName: 
\"kubernetes.io/configmap/8bee76ad-4034-469f-8a4e-8dff76089361-dns-svc\") pod \"8bee76ad-4034-469f-8a4e-8dff76089361\" (UID: \"8bee76ad-4034-469f-8a4e-8dff76089361\") " Jan 28 20:58:26 crc kubenswrapper[4746]: I0128 20:58:26.508469 4746 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/8bee76ad-4034-469f-8a4e-8dff76089361-config" (OuterVolumeSpecName: "config") pod "8bee76ad-4034-469f-8a4e-8dff76089361" (UID: "8bee76ad-4034-469f-8a4e-8dff76089361"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 28 20:58:26 crc kubenswrapper[4746]: I0128 20:58:26.508494 4746 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/projected/39e8de66-78c6-45cf-b026-7783ef89922d-etc-swift\") pod \"swift-storage-0\" (UID: \"39e8de66-78c6-45cf-b026-7783ef89922d\") " pod="openstack/swift-storage-0" Jan 28 20:58:26 crc kubenswrapper[4746]: I0128 20:58:26.508553 4746 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cache\" (UniqueName: \"kubernetes.io/empty-dir/39e8de66-78c6-45cf-b026-7783ef89922d-cache\") pod \"swift-storage-0\" (UID: \"39e8de66-78c6-45cf-b026-7783ef89922d\") " pod="openstack/swift-storage-0" Jan 28 20:58:26 crc kubenswrapper[4746]: I0128 20:58:26.508587 4746 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-96mx6\" (UniqueName: \"kubernetes.io/projected/39e8de66-78c6-45cf-b026-7783ef89922d-kube-api-access-96mx6\") pod \"swift-storage-0\" (UID: \"39e8de66-78c6-45cf-b026-7783ef89922d\") " pod="openstack/swift-storage-0" Jan 28 20:58:26 crc kubenswrapper[4746]: I0128 20:58:26.508647 4746 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"lock\" (UniqueName: \"kubernetes.io/empty-dir/39e8de66-78c6-45cf-b026-7783ef89922d-lock\") pod \"swift-storage-0\" (UID: \"39e8de66-78c6-45cf-b026-7783ef89922d\") " 
pod="openstack/swift-storage-0" Jan 28 20:58:26 crc kubenswrapper[4746]: I0128 20:58:26.508721 4746 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-6e1d7b3f-c034-4be2-9f14-bfffce532119\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-6e1d7b3f-c034-4be2-9f14-bfffce532119\") pod \"swift-storage-0\" (UID: \"39e8de66-78c6-45cf-b026-7783ef89922d\") " pod="openstack/swift-storage-0" Jan 28 20:58:26 crc kubenswrapper[4746]: I0128 20:58:26.508757 4746 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/39e8de66-78c6-45cf-b026-7783ef89922d-combined-ca-bundle\") pod \"swift-storage-0\" (UID: \"39e8de66-78c6-45cf-b026-7783ef89922d\") " pod="openstack/swift-storage-0" Jan 28 20:58:26 crc kubenswrapper[4746]: I0128 20:58:26.508753 4746 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/8bee76ad-4034-469f-8a4e-8dff76089361-ovsdbserver-nb" (OuterVolumeSpecName: "ovsdbserver-nb") pod "8bee76ad-4034-469f-8a4e-8dff76089361" (UID: "8bee76ad-4034-469f-8a4e-8dff76089361"). InnerVolumeSpecName "ovsdbserver-nb". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 28 20:58:26 crc kubenswrapper[4746]: I0128 20:58:26.508908 4746 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/8bee76ad-4034-469f-8a4e-8dff76089361-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "8bee76ad-4034-469f-8a4e-8dff76089361" (UID: "8bee76ad-4034-469f-8a4e-8dff76089361"). InnerVolumeSpecName "dns-svc". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 28 20:58:26 crc kubenswrapper[4746]: I0128 20:58:26.509058 4746 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/8bee76ad-4034-469f-8a4e-8dff76089361-ovsdbserver-nb\") on node \"crc\" DevicePath \"\"" Jan 28 20:58:26 crc kubenswrapper[4746]: I0128 20:58:26.509102 4746 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/8bee76ad-4034-469f-8a4e-8dff76089361-config\") on node \"crc\" DevicePath \"\"" Jan 28 20:58:26 crc kubenswrapper[4746]: E0128 20:58:26.510148 4746 projected.go:288] Couldn't get configMap openstack/swift-ring-files: configmap "swift-ring-files" not found Jan 28 20:58:26 crc kubenswrapper[4746]: E0128 20:58:26.510193 4746 projected.go:194] Error preparing data for projected volume etc-swift for pod openstack/swift-storage-0: configmap "swift-ring-files" not found Jan 28 20:58:26 crc kubenswrapper[4746]: E0128 20:58:26.510250 4746 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/39e8de66-78c6-45cf-b026-7783ef89922d-etc-swift podName:39e8de66-78c6-45cf-b026-7783ef89922d nodeName:}" failed. No retries permitted until 2026-01-28 20:58:27.010232836 +0000 UTC m=+1134.966419260 (durationBeforeRetry 500ms). 
Error: MountVolume.SetUp failed for volume "etc-swift" (UniqueName: "kubernetes.io/projected/39e8de66-78c6-45cf-b026-7783ef89922d-etc-swift") pod "swift-storage-0" (UID: "39e8de66-78c6-45cf-b026-7783ef89922d") : configmap "swift-ring-files" not found Jan 28 20:58:26 crc kubenswrapper[4746]: I0128 20:58:26.510533 4746 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cache\" (UniqueName: \"kubernetes.io/empty-dir/39e8de66-78c6-45cf-b026-7783ef89922d-cache\") pod \"swift-storage-0\" (UID: \"39e8de66-78c6-45cf-b026-7783ef89922d\") " pod="openstack/swift-storage-0" Jan 28 20:58:26 crc kubenswrapper[4746]: I0128 20:58:26.510580 4746 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"lock\" (UniqueName: \"kubernetes.io/empty-dir/39e8de66-78c6-45cf-b026-7783ef89922d-lock\") pod \"swift-storage-0\" (UID: \"39e8de66-78c6-45cf-b026-7783ef89922d\") " pod="openstack/swift-storage-0" Jan 28 20:58:26 crc kubenswrapper[4746]: I0128 20:58:26.518666 4746 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/8bee76ad-4034-469f-8a4e-8dff76089361-kube-api-access-8v7bg" (OuterVolumeSpecName: "kube-api-access-8v7bg") pod "8bee76ad-4034-469f-8a4e-8dff76089361" (UID: "8bee76ad-4034-469f-8a4e-8dff76089361"). InnerVolumeSpecName "kube-api-access-8v7bg". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 28 20:58:26 crc kubenswrapper[4746]: I0128 20:58:26.525192 4746 csi_attacher.go:380] kubernetes.io/csi: attacher.MountDevice STAGE_UNSTAGE_VOLUME capability not set. Skipping MountDevice... 
Jan 28 20:58:26 crc kubenswrapper[4746]: I0128 20:58:26.525225 4746 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"pvc-6e1d7b3f-c034-4be2-9f14-bfffce532119\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-6e1d7b3f-c034-4be2-9f14-bfffce532119\") pod \"swift-storage-0\" (UID: \"39e8de66-78c6-45cf-b026-7783ef89922d\") device mount path \"/var/lib/kubelet/plugins/kubernetes.io/csi/kubevirt.io.hostpath-provisioner/63864e6c829043faddd5e095781f290746f419cb1d7dba0ed4c58f37312fc970/globalmount\"" pod="openstack/swift-storage-0" Jan 28 20:58:26 crc kubenswrapper[4746]: I0128 20:58:26.526501 4746 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/39e8de66-78c6-45cf-b026-7783ef89922d-combined-ca-bundle\") pod \"swift-storage-0\" (UID: \"39e8de66-78c6-45cf-b026-7783ef89922d\") " pod="openstack/swift-storage-0" Jan 28 20:58:26 crc kubenswrapper[4746]: I0128 20:58:26.544174 4746 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-96mx6\" (UniqueName: \"kubernetes.io/projected/39e8de66-78c6-45cf-b026-7783ef89922d-kube-api-access-96mx6\") pod \"swift-storage-0\" (UID: \"39e8de66-78c6-45cf-b026-7783ef89922d\") " pod="openstack/swift-storage-0" Jan 28 20:58:26 crc kubenswrapper[4746]: I0128 20:58:26.566068 4746 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"pvc-6e1d7b3f-c034-4be2-9f14-bfffce532119\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-6e1d7b3f-c034-4be2-9f14-bfffce532119\") pod \"swift-storage-0\" (UID: \"39e8de66-78c6-45cf-b026-7783ef89922d\") " pod="openstack/swift-storage-0" Jan 28 20:58:26 crc kubenswrapper[4746]: I0128 20:58:26.611445 4746 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-8v7bg\" (UniqueName: \"kubernetes.io/projected/8bee76ad-4034-469f-8a4e-8dff76089361-kube-api-access-8v7bg\") on node \"crc\" DevicePath \"\"" Jan 28 
20:58:26 crc kubenswrapper[4746]: I0128 20:58:26.611494 4746 reconciler_common.go:293] "Volume detached for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/8bee76ad-4034-469f-8a4e-8dff76089361-dns-svc\") on node \"crc\" DevicePath \"\"" Jan 28 20:58:26 crc kubenswrapper[4746]: I0128 20:58:26.806252 4746 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ovn-northd-0"] Jan 28 20:58:26 crc kubenswrapper[4746]: I0128 20:58:26.808934 4746 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ovn-northd-0" Jan 28 20:58:26 crc kubenswrapper[4746]: I0128 20:58:26.813724 4746 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"ovnnorthd-scripts" Jan 28 20:58:26 crc kubenswrapper[4746]: I0128 20:58:26.813837 4746 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"ovnnorthd-config" Jan 28 20:58:26 crc kubenswrapper[4746]: I0128 20:58:26.813848 4746 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ovnnorthd-ovnnorthd-dockercfg-gtw9n" Jan 28 20:58:26 crc kubenswrapper[4746]: I0128 20:58:26.814005 4746 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-ovnnorthd-ovndbs" Jan 28 20:58:26 crc kubenswrapper[4746]: I0128 20:58:26.856907 4746 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovn-northd-0"] Jan 28 20:58:26 crc kubenswrapper[4746]: I0128 20:58:26.915998 4746 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovn-rundir\" (UniqueName: \"kubernetes.io/empty-dir/af4de16a-caed-4c86-9cf8-da6f9214ca5f-ovn-rundir\") pod \"ovn-northd-0\" (UID: \"af4de16a-caed-4c86-9cf8-da6f9214ca5f\") " pod="openstack/ovn-northd-0" Jan 28 20:58:26 crc kubenswrapper[4746]: I0128 20:58:26.916037 4746 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-rmq6l\" (UniqueName: 
\"kubernetes.io/projected/af4de16a-caed-4c86-9cf8-da6f9214ca5f-kube-api-access-rmq6l\") pod \"ovn-northd-0\" (UID: \"af4de16a-caed-4c86-9cf8-da6f9214ca5f\") " pod="openstack/ovn-northd-0" Jan 28 20:58:26 crc kubenswrapper[4746]: I0128 20:58:26.916113 4746 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-certs-tls-certs\" (UniqueName: \"kubernetes.io/secret/af4de16a-caed-4c86-9cf8-da6f9214ca5f-metrics-certs-tls-certs\") pod \"ovn-northd-0\" (UID: \"af4de16a-caed-4c86-9cf8-da6f9214ca5f\") " pod="openstack/ovn-northd-0" Jan 28 20:58:26 crc kubenswrapper[4746]: I0128 20:58:26.916187 4746 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/af4de16a-caed-4c86-9cf8-da6f9214ca5f-scripts\") pod \"ovn-northd-0\" (UID: \"af4de16a-caed-4c86-9cf8-da6f9214ca5f\") " pod="openstack/ovn-northd-0" Jan 28 20:58:26 crc kubenswrapper[4746]: I0128 20:58:26.916210 4746 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/af4de16a-caed-4c86-9cf8-da6f9214ca5f-config\") pod \"ovn-northd-0\" (UID: \"af4de16a-caed-4c86-9cf8-da6f9214ca5f\") " pod="openstack/ovn-northd-0" Jan 28 20:58:26 crc kubenswrapper[4746]: I0128 20:58:26.916227 4746 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovn-northd-tls-certs\" (UniqueName: \"kubernetes.io/secret/af4de16a-caed-4c86-9cf8-da6f9214ca5f-ovn-northd-tls-certs\") pod \"ovn-northd-0\" (UID: \"af4de16a-caed-4c86-9cf8-da6f9214ca5f\") " pod="openstack/ovn-northd-0" Jan 28 20:58:26 crc kubenswrapper[4746]: I0128 20:58:26.916288 4746 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/af4de16a-caed-4c86-9cf8-da6f9214ca5f-combined-ca-bundle\") pod \"ovn-northd-0\" (UID: 
\"af4de16a-caed-4c86-9cf8-da6f9214ca5f\") " pod="openstack/ovn-northd-0" Jan 28 20:58:26 crc kubenswrapper[4746]: I0128 20:58:26.917190 4746 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/swift-ring-rebalance-bggx2"] Jan 28 20:58:26 crc kubenswrapper[4746]: I0128 20:58:26.918407 4746 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/swift-ring-rebalance-bggx2" Jan 28 20:58:26 crc kubenswrapper[4746]: I0128 20:58:26.922853 4746 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"swift-ring-scripts" Jan 28 20:58:26 crc kubenswrapper[4746]: I0128 20:58:26.923227 4746 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"swift-proxy-config-data" Jan 28 20:58:26 crc kubenswrapper[4746]: I0128 20:58:26.923362 4746 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"swift-ring-config-data" Jan 28 20:58:26 crc kubenswrapper[4746]: I0128 20:58:26.927287 4746 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/swift-ring-rebalance-bggx2"] Jan 28 20:58:26 crc kubenswrapper[4746]: I0128 20:58:26.978465 4746 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/swift-ring-rebalance-bggx2"] Jan 28 20:58:26 crc kubenswrapper[4746]: E0128 20:58:26.979550 4746 pod_workers.go:1301] "Error syncing pod, skipping" err="unmounted volumes=[combined-ca-bundle dispersionconf etc-swift kube-api-access-549rm ring-data-devices scripts swiftconf], unattached volumes=[], failed to process volumes=[combined-ca-bundle dispersionconf etc-swift kube-api-access-549rm ring-data-devices scripts swiftconf]: context canceled" pod="openstack/swift-ring-rebalance-bggx2" podUID="b1828503-4f67-4655-b5b5-428bd1afc8f4" Jan 28 20:58:26 crc kubenswrapper[4746]: I0128 20:58:26.990519 4746 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/swift-ring-rebalance-5sxj6"] Jan 28 20:58:26 crc kubenswrapper[4746]: I0128 20:58:26.992043 4746 util.go:30] "No 
sandbox for pod can be found. Need to start a new one" pod="openstack/swift-ring-rebalance-5sxj6" Jan 28 20:58:27 crc kubenswrapper[4746]: I0128 20:58:27.004018 4746 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/swift-ring-rebalance-5sxj6"] Jan 28 20:58:27 crc kubenswrapper[4746]: I0128 20:58:27.016893 4746 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovn-rundir\" (UniqueName: \"kubernetes.io/empty-dir/af4de16a-caed-4c86-9cf8-da6f9214ca5f-ovn-rundir\") pod \"ovn-northd-0\" (UID: \"af4de16a-caed-4c86-9cf8-da6f9214ca5f\") " pod="openstack/ovn-northd-0" Jan 28 20:58:27 crc kubenswrapper[4746]: I0128 20:58:27.016945 4746 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-rmq6l\" (UniqueName: \"kubernetes.io/projected/af4de16a-caed-4c86-9cf8-da6f9214ca5f-kube-api-access-rmq6l\") pod \"ovn-northd-0\" (UID: \"af4de16a-caed-4c86-9cf8-da6f9214ca5f\") " pod="openstack/ovn-northd-0" Jan 28 20:58:27 crc kubenswrapper[4746]: I0128 20:58:27.016976 4746 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ring-data-devices\" (UniqueName: \"kubernetes.io/configmap/61a4ff02-ae06-438a-a39c-8264c8e61b38-ring-data-devices\") pod \"swift-ring-rebalance-5sxj6\" (UID: \"61a4ff02-ae06-438a-a39c-8264c8e61b38\") " pod="openstack/swift-ring-rebalance-5sxj6" Jan 28 20:58:27 crc kubenswrapper[4746]: I0128 20:58:27.017009 4746 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs-tls-certs\" (UniqueName: \"kubernetes.io/secret/af4de16a-caed-4c86-9cf8-da6f9214ca5f-metrics-certs-tls-certs\") pod \"ovn-northd-0\" (UID: \"af4de16a-caed-4c86-9cf8-da6f9214ca5f\") " pod="openstack/ovn-northd-0" Jan 28 20:58:27 crc kubenswrapper[4746]: I0128 20:58:27.017046 4746 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"swiftconf\" (UniqueName: 
\"kubernetes.io/secret/b1828503-4f67-4655-b5b5-428bd1afc8f4-swiftconf\") pod \"swift-ring-rebalance-bggx2\" (UID: \"b1828503-4f67-4655-b5b5-428bd1afc8f4\") " pod="openstack/swift-ring-rebalance-bggx2" Jan 28 20:58:27 crc kubenswrapper[4746]: I0128 20:58:27.017068 4746 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/b1828503-4f67-4655-b5b5-428bd1afc8f4-scripts\") pod \"swift-ring-rebalance-bggx2\" (UID: \"b1828503-4f67-4655-b5b5-428bd1afc8f4\") " pod="openstack/swift-ring-rebalance-bggx2" Jan 28 20:58:27 crc kubenswrapper[4746]: I0128 20:58:27.017115 4746 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"swiftconf\" (UniqueName: \"kubernetes.io/secret/61a4ff02-ae06-438a-a39c-8264c8e61b38-swiftconf\") pod \"swift-ring-rebalance-5sxj6\" (UID: \"61a4ff02-ae06-438a-a39c-8264c8e61b38\") " pod="openstack/swift-ring-rebalance-5sxj6" Jan 28 20:58:27 crc kubenswrapper[4746]: I0128 20:58:27.017147 4746 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/empty-dir/61a4ff02-ae06-438a-a39c-8264c8e61b38-etc-swift\") pod \"swift-ring-rebalance-5sxj6\" (UID: \"61a4ff02-ae06-438a-a39c-8264c8e61b38\") " pod="openstack/swift-ring-rebalance-5sxj6" Jan 28 20:58:27 crc kubenswrapper[4746]: I0128 20:58:27.017167 4746 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/61a4ff02-ae06-438a-a39c-8264c8e61b38-combined-ca-bundle\") pod \"swift-ring-rebalance-5sxj6\" (UID: \"61a4ff02-ae06-438a-a39c-8264c8e61b38\") " pod="openstack/swift-ring-rebalance-5sxj6" Jan 28 20:58:27 crc kubenswrapper[4746]: I0128 20:58:27.017190 4746 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dispersionconf\" (UniqueName: 
\"kubernetes.io/secret/61a4ff02-ae06-438a-a39c-8264c8e61b38-dispersionconf\") pod \"swift-ring-rebalance-5sxj6\" (UID: \"61a4ff02-ae06-438a-a39c-8264c8e61b38\") " pod="openstack/swift-ring-rebalance-5sxj6" Jan 28 20:58:27 crc kubenswrapper[4746]: I0128 20:58:27.017226 4746 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/projected/39e8de66-78c6-45cf-b026-7783ef89922d-etc-swift\") pod \"swift-storage-0\" (UID: \"39e8de66-78c6-45cf-b026-7783ef89922d\") " pod="openstack/swift-storage-0" Jan 28 20:58:27 crc kubenswrapper[4746]: I0128 20:58:27.017246 4746 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/empty-dir/b1828503-4f67-4655-b5b5-428bd1afc8f4-etc-swift\") pod \"swift-ring-rebalance-bggx2\" (UID: \"b1828503-4f67-4655-b5b5-428bd1afc8f4\") " pod="openstack/swift-ring-rebalance-bggx2" Jan 28 20:58:27 crc kubenswrapper[4746]: I0128 20:58:27.017271 4746 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/61a4ff02-ae06-438a-a39c-8264c8e61b38-scripts\") pod \"swift-ring-rebalance-5sxj6\" (UID: \"61a4ff02-ae06-438a-a39c-8264c8e61b38\") " pod="openstack/swift-ring-rebalance-5sxj6" Jan 28 20:58:27 crc kubenswrapper[4746]: I0128 20:58:27.017296 4746 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/b1828503-4f67-4655-b5b5-428bd1afc8f4-combined-ca-bundle\") pod \"swift-ring-rebalance-bggx2\" (UID: \"b1828503-4f67-4655-b5b5-428bd1afc8f4\") " pod="openstack/swift-ring-rebalance-bggx2" Jan 28 20:58:27 crc kubenswrapper[4746]: I0128 20:58:27.017319 4746 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-549rm\" (UniqueName: 
\"kubernetes.io/projected/b1828503-4f67-4655-b5b5-428bd1afc8f4-kube-api-access-549rm\") pod \"swift-ring-rebalance-bggx2\" (UID: \"b1828503-4f67-4655-b5b5-428bd1afc8f4\") " pod="openstack/swift-ring-rebalance-bggx2" Jan 28 20:58:27 crc kubenswrapper[4746]: I0128 20:58:27.017342 4746 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/af4de16a-caed-4c86-9cf8-da6f9214ca5f-scripts\") pod \"ovn-northd-0\" (UID: \"af4de16a-caed-4c86-9cf8-da6f9214ca5f\") " pod="openstack/ovn-northd-0" Jan 28 20:58:27 crc kubenswrapper[4746]: I0128 20:58:27.017408 4746 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/af4de16a-caed-4c86-9cf8-da6f9214ca5f-config\") pod \"ovn-northd-0\" (UID: \"af4de16a-caed-4c86-9cf8-da6f9214ca5f\") " pod="openstack/ovn-northd-0" Jan 28 20:58:27 crc kubenswrapper[4746]: I0128 20:58:27.017423 4746 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ring-data-devices\" (UniqueName: \"kubernetes.io/configmap/b1828503-4f67-4655-b5b5-428bd1afc8f4-ring-data-devices\") pod \"swift-ring-rebalance-bggx2\" (UID: \"b1828503-4f67-4655-b5b5-428bd1afc8f4\") " pod="openstack/swift-ring-rebalance-bggx2" Jan 28 20:58:27 crc kubenswrapper[4746]: I0128 20:58:27.017465 4746 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovn-northd-tls-certs\" (UniqueName: \"kubernetes.io/secret/af4de16a-caed-4c86-9cf8-da6f9214ca5f-ovn-northd-tls-certs\") pod \"ovn-northd-0\" (UID: \"af4de16a-caed-4c86-9cf8-da6f9214ca5f\") " pod="openstack/ovn-northd-0" Jan 28 20:58:27 crc kubenswrapper[4746]: I0128 20:58:27.017514 4746 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/af4de16a-caed-4c86-9cf8-da6f9214ca5f-combined-ca-bundle\") pod \"ovn-northd-0\" (UID: 
\"af4de16a-caed-4c86-9cf8-da6f9214ca5f\") " pod="openstack/ovn-northd-0" Jan 28 20:58:27 crc kubenswrapper[4746]: I0128 20:58:27.017563 4746 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dispersionconf\" (UniqueName: \"kubernetes.io/secret/b1828503-4f67-4655-b5b5-428bd1afc8f4-dispersionconf\") pod \"swift-ring-rebalance-bggx2\" (UID: \"b1828503-4f67-4655-b5b5-428bd1afc8f4\") " pod="openstack/swift-ring-rebalance-bggx2" Jan 28 20:58:27 crc kubenswrapper[4746]: I0128 20:58:27.017583 4746 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-bvfrh\" (UniqueName: \"kubernetes.io/projected/61a4ff02-ae06-438a-a39c-8264c8e61b38-kube-api-access-bvfrh\") pod \"swift-ring-rebalance-5sxj6\" (UID: \"61a4ff02-ae06-438a-a39c-8264c8e61b38\") " pod="openstack/swift-ring-rebalance-5sxj6" Jan 28 20:58:27 crc kubenswrapper[4746]: E0128 20:58:27.017859 4746 projected.go:288] Couldn't get configMap openstack/swift-ring-files: configmap "swift-ring-files" not found Jan 28 20:58:27 crc kubenswrapper[4746]: E0128 20:58:27.017882 4746 projected.go:194] Error preparing data for projected volume etc-swift for pod openstack/swift-storage-0: configmap "swift-ring-files" not found Jan 28 20:58:27 crc kubenswrapper[4746]: E0128 20:58:27.017927 4746 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/39e8de66-78c6-45cf-b026-7783ef89922d-etc-swift podName:39e8de66-78c6-45cf-b026-7783ef89922d nodeName:}" failed. No retries permitted until 2026-01-28 20:58:28.017911807 +0000 UTC m=+1135.974098161 (durationBeforeRetry 1s). 
Error: MountVolume.SetUp failed for volume "etc-swift" (UniqueName: "kubernetes.io/projected/39e8de66-78c6-45cf-b026-7783ef89922d-etc-swift") pod "swift-storage-0" (UID: "39e8de66-78c6-45cf-b026-7783ef89922d") : configmap "swift-ring-files" not found Jan 28 20:58:27 crc kubenswrapper[4746]: I0128 20:58:27.017459 4746 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovn-rundir\" (UniqueName: \"kubernetes.io/empty-dir/af4de16a-caed-4c86-9cf8-da6f9214ca5f-ovn-rundir\") pod \"ovn-northd-0\" (UID: \"af4de16a-caed-4c86-9cf8-da6f9214ca5f\") " pod="openstack/ovn-northd-0" Jan 28 20:58:27 crc kubenswrapper[4746]: I0128 20:58:27.018916 4746 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/af4de16a-caed-4c86-9cf8-da6f9214ca5f-scripts\") pod \"ovn-northd-0\" (UID: \"af4de16a-caed-4c86-9cf8-da6f9214ca5f\") " pod="openstack/ovn-northd-0" Jan 28 20:58:27 crc kubenswrapper[4746]: I0128 20:58:27.019576 4746 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/af4de16a-caed-4c86-9cf8-da6f9214ca5f-config\") pod \"ovn-northd-0\" (UID: \"af4de16a-caed-4c86-9cf8-da6f9214ca5f\") " pod="openstack/ovn-northd-0" Jan 28 20:58:27 crc kubenswrapper[4746]: I0128 20:58:27.023563 4746 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/af4de16a-caed-4c86-9cf8-da6f9214ca5f-combined-ca-bundle\") pod \"ovn-northd-0\" (UID: \"af4de16a-caed-4c86-9cf8-da6f9214ca5f\") " pod="openstack/ovn-northd-0" Jan 28 20:58:27 crc kubenswrapper[4746]: I0128 20:58:27.023713 4746 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics-certs-tls-certs\" (UniqueName: \"kubernetes.io/secret/af4de16a-caed-4c86-9cf8-da6f9214ca5f-metrics-certs-tls-certs\") pod \"ovn-northd-0\" (UID: \"af4de16a-caed-4c86-9cf8-da6f9214ca5f\") " pod="openstack/ovn-northd-0" Jan 28 20:58:27 crc 
kubenswrapper[4746]: I0128 20:58:27.025187 4746 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovn-northd-tls-certs\" (UniqueName: \"kubernetes.io/secret/af4de16a-caed-4c86-9cf8-da6f9214ca5f-ovn-northd-tls-certs\") pod \"ovn-northd-0\" (UID: \"af4de16a-caed-4c86-9cf8-da6f9214ca5f\") " pod="openstack/ovn-northd-0" Jan 28 20:58:27 crc kubenswrapper[4746]: I0128 20:58:27.035018 4746 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-rmq6l\" (UniqueName: \"kubernetes.io/projected/af4de16a-caed-4c86-9cf8-da6f9214ca5f-kube-api-access-rmq6l\") pod \"ovn-northd-0\" (UID: \"af4de16a-caed-4c86-9cf8-da6f9214ca5f\") " pod="openstack/ovn-northd-0" Jan 28 20:58:27 crc kubenswrapper[4746]: I0128 20:58:27.118551 4746 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dispersionconf\" (UniqueName: \"kubernetes.io/secret/b1828503-4f67-4655-b5b5-428bd1afc8f4-dispersionconf\") pod \"swift-ring-rebalance-bggx2\" (UID: \"b1828503-4f67-4655-b5b5-428bd1afc8f4\") " pod="openstack/swift-ring-rebalance-bggx2" Jan 28 20:58:27 crc kubenswrapper[4746]: I0128 20:58:27.118600 4746 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-bvfrh\" (UniqueName: \"kubernetes.io/projected/61a4ff02-ae06-438a-a39c-8264c8e61b38-kube-api-access-bvfrh\") pod \"swift-ring-rebalance-5sxj6\" (UID: \"61a4ff02-ae06-438a-a39c-8264c8e61b38\") " pod="openstack/swift-ring-rebalance-5sxj6" Jan 28 20:58:27 crc kubenswrapper[4746]: I0128 20:58:27.118645 4746 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ring-data-devices\" (UniqueName: \"kubernetes.io/configmap/61a4ff02-ae06-438a-a39c-8264c8e61b38-ring-data-devices\") pod \"swift-ring-rebalance-5sxj6\" (UID: \"61a4ff02-ae06-438a-a39c-8264c8e61b38\") " pod="openstack/swift-ring-rebalance-5sxj6" Jan 28 20:58:27 crc kubenswrapper[4746]: I0128 20:58:27.118685 4746 reconciler_common.go:218] "operationExecutor.MountVolume 
started for volume \"swiftconf\" (UniqueName: \"kubernetes.io/secret/b1828503-4f67-4655-b5b5-428bd1afc8f4-swiftconf\") pod \"swift-ring-rebalance-bggx2\" (UID: \"b1828503-4f67-4655-b5b5-428bd1afc8f4\") " pod="openstack/swift-ring-rebalance-bggx2" Jan 28 20:58:27 crc kubenswrapper[4746]: I0128 20:58:27.118705 4746 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/b1828503-4f67-4655-b5b5-428bd1afc8f4-scripts\") pod \"swift-ring-rebalance-bggx2\" (UID: \"b1828503-4f67-4655-b5b5-428bd1afc8f4\") " pod="openstack/swift-ring-rebalance-bggx2" Jan 28 20:58:27 crc kubenswrapper[4746]: I0128 20:58:27.118761 4746 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"swiftconf\" (UniqueName: \"kubernetes.io/secret/61a4ff02-ae06-438a-a39c-8264c8e61b38-swiftconf\") pod \"swift-ring-rebalance-5sxj6\" (UID: \"61a4ff02-ae06-438a-a39c-8264c8e61b38\") " pod="openstack/swift-ring-rebalance-5sxj6" Jan 28 20:58:27 crc kubenswrapper[4746]: I0128 20:58:27.118792 4746 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/empty-dir/61a4ff02-ae06-438a-a39c-8264c8e61b38-etc-swift\") pod \"swift-ring-rebalance-5sxj6\" (UID: \"61a4ff02-ae06-438a-a39c-8264c8e61b38\") " pod="openstack/swift-ring-rebalance-5sxj6" Jan 28 20:58:27 crc kubenswrapper[4746]: I0128 20:58:27.118807 4746 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/61a4ff02-ae06-438a-a39c-8264c8e61b38-combined-ca-bundle\") pod \"swift-ring-rebalance-5sxj6\" (UID: \"61a4ff02-ae06-438a-a39c-8264c8e61b38\") " pod="openstack/swift-ring-rebalance-5sxj6" Jan 28 20:58:27 crc kubenswrapper[4746]: I0128 20:58:27.118827 4746 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dispersionconf\" (UniqueName: 
\"kubernetes.io/secret/61a4ff02-ae06-438a-a39c-8264c8e61b38-dispersionconf\") pod \"swift-ring-rebalance-5sxj6\" (UID: \"61a4ff02-ae06-438a-a39c-8264c8e61b38\") " pod="openstack/swift-ring-rebalance-5sxj6" Jan 28 20:58:27 crc kubenswrapper[4746]: I0128 20:58:27.118847 4746 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/empty-dir/b1828503-4f67-4655-b5b5-428bd1afc8f4-etc-swift\") pod \"swift-ring-rebalance-bggx2\" (UID: \"b1828503-4f67-4655-b5b5-428bd1afc8f4\") " pod="openstack/swift-ring-rebalance-bggx2" Jan 28 20:58:27 crc kubenswrapper[4746]: I0128 20:58:27.118874 4746 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/61a4ff02-ae06-438a-a39c-8264c8e61b38-scripts\") pod \"swift-ring-rebalance-5sxj6\" (UID: \"61a4ff02-ae06-438a-a39c-8264c8e61b38\") " pod="openstack/swift-ring-rebalance-5sxj6" Jan 28 20:58:27 crc kubenswrapper[4746]: I0128 20:58:27.118893 4746 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/b1828503-4f67-4655-b5b5-428bd1afc8f4-combined-ca-bundle\") pod \"swift-ring-rebalance-bggx2\" (UID: \"b1828503-4f67-4655-b5b5-428bd1afc8f4\") " pod="openstack/swift-ring-rebalance-bggx2" Jan 28 20:58:27 crc kubenswrapper[4746]: I0128 20:58:27.118910 4746 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-549rm\" (UniqueName: \"kubernetes.io/projected/b1828503-4f67-4655-b5b5-428bd1afc8f4-kube-api-access-549rm\") pod \"swift-ring-rebalance-bggx2\" (UID: \"b1828503-4f67-4655-b5b5-428bd1afc8f4\") " pod="openstack/swift-ring-rebalance-bggx2" Jan 28 20:58:27 crc kubenswrapper[4746]: I0128 20:58:27.118939 4746 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ring-data-devices\" (UniqueName: 
\"kubernetes.io/configmap/b1828503-4f67-4655-b5b5-428bd1afc8f4-ring-data-devices\") pod \"swift-ring-rebalance-bggx2\" (UID: \"b1828503-4f67-4655-b5b5-428bd1afc8f4\") " pod="openstack/swift-ring-rebalance-bggx2" Jan 28 20:58:27 crc kubenswrapper[4746]: I0128 20:58:27.119656 4746 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ring-data-devices\" (UniqueName: \"kubernetes.io/configmap/61a4ff02-ae06-438a-a39c-8264c8e61b38-ring-data-devices\") pod \"swift-ring-rebalance-5sxj6\" (UID: \"61a4ff02-ae06-438a-a39c-8264c8e61b38\") " pod="openstack/swift-ring-rebalance-5sxj6" Jan 28 20:58:27 crc kubenswrapper[4746]: I0128 20:58:27.119996 4746 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-swift\" (UniqueName: \"kubernetes.io/empty-dir/b1828503-4f67-4655-b5b5-428bd1afc8f4-etc-swift\") pod \"swift-ring-rebalance-bggx2\" (UID: \"b1828503-4f67-4655-b5b5-428bd1afc8f4\") " pod="openstack/swift-ring-rebalance-bggx2" Jan 28 20:58:27 crc kubenswrapper[4746]: I0128 20:58:27.120141 4746 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ring-data-devices\" (UniqueName: \"kubernetes.io/configmap/b1828503-4f67-4655-b5b5-428bd1afc8f4-ring-data-devices\") pod \"swift-ring-rebalance-bggx2\" (UID: \"b1828503-4f67-4655-b5b5-428bd1afc8f4\") " pod="openstack/swift-ring-rebalance-bggx2" Jan 28 20:58:27 crc kubenswrapper[4746]: I0128 20:58:27.120032 4746 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/61a4ff02-ae06-438a-a39c-8264c8e61b38-scripts\") pod \"swift-ring-rebalance-5sxj6\" (UID: \"61a4ff02-ae06-438a-a39c-8264c8e61b38\") " pod="openstack/swift-ring-rebalance-5sxj6" Jan 28 20:58:27 crc kubenswrapper[4746]: I0128 20:58:27.120284 4746 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-swift\" (UniqueName: \"kubernetes.io/empty-dir/61a4ff02-ae06-438a-a39c-8264c8e61b38-etc-swift\") pod \"swift-ring-rebalance-5sxj6\" (UID: 
\"61a4ff02-ae06-438a-a39c-8264c8e61b38\") " pod="openstack/swift-ring-rebalance-5sxj6" Jan 28 20:58:27 crc kubenswrapper[4746]: I0128 20:58:27.122201 4746 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/b1828503-4f67-4655-b5b5-428bd1afc8f4-scripts\") pod \"swift-ring-rebalance-bggx2\" (UID: \"b1828503-4f67-4655-b5b5-428bd1afc8f4\") " pod="openstack/swift-ring-rebalance-bggx2" Jan 28 20:58:27 crc kubenswrapper[4746]: I0128 20:58:27.123599 4746 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"swiftconf\" (UniqueName: \"kubernetes.io/secret/61a4ff02-ae06-438a-a39c-8264c8e61b38-swiftconf\") pod \"swift-ring-rebalance-5sxj6\" (UID: \"61a4ff02-ae06-438a-a39c-8264c8e61b38\") " pod="openstack/swift-ring-rebalance-5sxj6" Jan 28 20:58:27 crc kubenswrapper[4746]: I0128 20:58:27.127603 4746 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/61a4ff02-ae06-438a-a39c-8264c8e61b38-combined-ca-bundle\") pod \"swift-ring-rebalance-5sxj6\" (UID: \"61a4ff02-ae06-438a-a39c-8264c8e61b38\") " pod="openstack/swift-ring-rebalance-5sxj6" Jan 28 20:58:27 crc kubenswrapper[4746]: I0128 20:58:27.127883 4746 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"swiftconf\" (UniqueName: \"kubernetes.io/secret/b1828503-4f67-4655-b5b5-428bd1afc8f4-swiftconf\") pod \"swift-ring-rebalance-bggx2\" (UID: \"b1828503-4f67-4655-b5b5-428bd1afc8f4\") " pod="openstack/swift-ring-rebalance-bggx2" Jan 28 20:58:27 crc kubenswrapper[4746]: I0128 20:58:27.131288 4746 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dispersionconf\" (UniqueName: \"kubernetes.io/secret/b1828503-4f67-4655-b5b5-428bd1afc8f4-dispersionconf\") pod \"swift-ring-rebalance-bggx2\" (UID: \"b1828503-4f67-4655-b5b5-428bd1afc8f4\") " pod="openstack/swift-ring-rebalance-bggx2" Jan 28 20:58:27 crc kubenswrapper[4746]: I0128 20:58:27.136703 
4746 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dispersionconf\" (UniqueName: \"kubernetes.io/secret/61a4ff02-ae06-438a-a39c-8264c8e61b38-dispersionconf\") pod \"swift-ring-rebalance-5sxj6\" (UID: \"61a4ff02-ae06-438a-a39c-8264c8e61b38\") " pod="openstack/swift-ring-rebalance-5sxj6" Jan 28 20:58:27 crc kubenswrapper[4746]: I0128 20:58:27.137469 4746 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/b1828503-4f67-4655-b5b5-428bd1afc8f4-combined-ca-bundle\") pod \"swift-ring-rebalance-bggx2\" (UID: \"b1828503-4f67-4655-b5b5-428bd1afc8f4\") " pod="openstack/swift-ring-rebalance-bggx2" Jan 28 20:58:27 crc kubenswrapper[4746]: I0128 20:58:27.138691 4746 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-549rm\" (UniqueName: \"kubernetes.io/projected/b1828503-4f67-4655-b5b5-428bd1afc8f4-kube-api-access-549rm\") pod \"swift-ring-rebalance-bggx2\" (UID: \"b1828503-4f67-4655-b5b5-428bd1afc8f4\") " pod="openstack/swift-ring-rebalance-bggx2" Jan 28 20:58:27 crc kubenswrapper[4746]: I0128 20:58:27.148192 4746 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-bvfrh\" (UniqueName: \"kubernetes.io/projected/61a4ff02-ae06-438a-a39c-8264c8e61b38-kube-api-access-bvfrh\") pod \"swift-ring-rebalance-5sxj6\" (UID: \"61a4ff02-ae06-438a-a39c-8264c8e61b38\") " pod="openstack/swift-ring-rebalance-5sxj6" Jan 28 20:58:27 crc kubenswrapper[4746]: I0128 20:58:27.162650 4746 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ovn-northd-0" Jan 28 20:58:27 crc kubenswrapper[4746]: I0128 20:58:27.310109 4746 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/swift-ring-rebalance-5sxj6" Jan 28 20:58:27 crc kubenswrapper[4746]: I0128 20:58:27.399069 4746 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/swift-ring-rebalance-bggx2" Jan 28 20:58:27 crc kubenswrapper[4746]: I0128 20:58:27.399683 4746 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-74f6f696b9-vwgs9" Jan 28 20:58:27 crc kubenswrapper[4746]: I0128 20:58:27.415236 4746 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/swift-ring-rebalance-bggx2" Jan 28 20:58:27 crc kubenswrapper[4746]: I0128 20:58:27.485147 4746 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-74f6f696b9-vwgs9"] Jan 28 20:58:27 crc kubenswrapper[4746]: I0128 20:58:27.498994 4746 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-74f6f696b9-vwgs9"] Jan 28 20:58:27 crc kubenswrapper[4746]: I0128 20:58:27.526473 4746 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-549rm\" (UniqueName: \"kubernetes.io/projected/b1828503-4f67-4655-b5b5-428bd1afc8f4-kube-api-access-549rm\") pod \"b1828503-4f67-4655-b5b5-428bd1afc8f4\" (UID: \"b1828503-4f67-4655-b5b5-428bd1afc8f4\") " Jan 28 20:58:27 crc kubenswrapper[4746]: I0128 20:58:27.526582 4746 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ring-data-devices\" (UniqueName: \"kubernetes.io/configmap/b1828503-4f67-4655-b5b5-428bd1afc8f4-ring-data-devices\") pod \"b1828503-4f67-4655-b5b5-428bd1afc8f4\" (UID: \"b1828503-4f67-4655-b5b5-428bd1afc8f4\") " Jan 28 20:58:27 crc kubenswrapper[4746]: I0128 20:58:27.526643 4746 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/b1828503-4f67-4655-b5b5-428bd1afc8f4-combined-ca-bundle\") pod \"b1828503-4f67-4655-b5b5-428bd1afc8f4\" (UID: \"b1828503-4f67-4655-b5b5-428bd1afc8f4\") " Jan 28 20:58:27 crc kubenswrapper[4746]: I0128 20:58:27.526706 4746 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume 
\"scripts\" (UniqueName: \"kubernetes.io/configmap/b1828503-4f67-4655-b5b5-428bd1afc8f4-scripts\") pod \"b1828503-4f67-4655-b5b5-428bd1afc8f4\" (UID: \"b1828503-4f67-4655-b5b5-428bd1afc8f4\") " Jan 28 20:58:27 crc kubenswrapper[4746]: I0128 20:58:27.526726 4746 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dispersionconf\" (UniqueName: \"kubernetes.io/secret/b1828503-4f67-4655-b5b5-428bd1afc8f4-dispersionconf\") pod \"b1828503-4f67-4655-b5b5-428bd1afc8f4\" (UID: \"b1828503-4f67-4655-b5b5-428bd1afc8f4\") " Jan 28 20:58:27 crc kubenswrapper[4746]: I0128 20:58:27.526763 4746 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/empty-dir/b1828503-4f67-4655-b5b5-428bd1afc8f4-etc-swift\") pod \"b1828503-4f67-4655-b5b5-428bd1afc8f4\" (UID: \"b1828503-4f67-4655-b5b5-428bd1afc8f4\") " Jan 28 20:58:27 crc kubenswrapper[4746]: I0128 20:58:27.526844 4746 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"swiftconf\" (UniqueName: \"kubernetes.io/secret/b1828503-4f67-4655-b5b5-428bd1afc8f4-swiftconf\") pod \"b1828503-4f67-4655-b5b5-428bd1afc8f4\" (UID: \"b1828503-4f67-4655-b5b5-428bd1afc8f4\") " Jan 28 20:58:27 crc kubenswrapper[4746]: I0128 20:58:27.530974 4746 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/b1828503-4f67-4655-b5b5-428bd1afc8f4-kube-api-access-549rm" (OuterVolumeSpecName: "kube-api-access-549rm") pod "b1828503-4f67-4655-b5b5-428bd1afc8f4" (UID: "b1828503-4f67-4655-b5b5-428bd1afc8f4"). InnerVolumeSpecName "kube-api-access-549rm". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 28 20:58:27 crc kubenswrapper[4746]: I0128 20:58:27.531040 4746 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/b1828503-4f67-4655-b5b5-428bd1afc8f4-ring-data-devices" (OuterVolumeSpecName: "ring-data-devices") pod "b1828503-4f67-4655-b5b5-428bd1afc8f4" (UID: "b1828503-4f67-4655-b5b5-428bd1afc8f4"). InnerVolumeSpecName "ring-data-devices". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 28 20:58:27 crc kubenswrapper[4746]: I0128 20:58:27.531331 4746 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/b1828503-4f67-4655-b5b5-428bd1afc8f4-etc-swift" (OuterVolumeSpecName: "etc-swift") pod "b1828503-4f67-4655-b5b5-428bd1afc8f4" (UID: "b1828503-4f67-4655-b5b5-428bd1afc8f4"). InnerVolumeSpecName "etc-swift". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 28 20:58:27 crc kubenswrapper[4746]: I0128 20:58:27.537304 4746 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/b1828503-4f67-4655-b5b5-428bd1afc8f4-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "b1828503-4f67-4655-b5b5-428bd1afc8f4" (UID: "b1828503-4f67-4655-b5b5-428bd1afc8f4"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 28 20:58:27 crc kubenswrapper[4746]: I0128 20:58:27.543717 4746 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/b1828503-4f67-4655-b5b5-428bd1afc8f4-scripts" (OuterVolumeSpecName: "scripts") pod "b1828503-4f67-4655-b5b5-428bd1afc8f4" (UID: "b1828503-4f67-4655-b5b5-428bd1afc8f4"). InnerVolumeSpecName "scripts". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 28 20:58:27 crc kubenswrapper[4746]: I0128 20:58:27.546525 4746 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/b1828503-4f67-4655-b5b5-428bd1afc8f4-swiftconf" (OuterVolumeSpecName: "swiftconf") pod "b1828503-4f67-4655-b5b5-428bd1afc8f4" (UID: "b1828503-4f67-4655-b5b5-428bd1afc8f4"). InnerVolumeSpecName "swiftconf". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 28 20:58:27 crc kubenswrapper[4746]: I0128 20:58:27.557247 4746 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/b1828503-4f67-4655-b5b5-428bd1afc8f4-dispersionconf" (OuterVolumeSpecName: "dispersionconf") pod "b1828503-4f67-4655-b5b5-428bd1afc8f4" (UID: "b1828503-4f67-4655-b5b5-428bd1afc8f4"). InnerVolumeSpecName "dispersionconf". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 28 20:58:27 crc kubenswrapper[4746]: I0128 20:58:27.631039 4746 reconciler_common.go:293] "Volume detached for volume \"swiftconf\" (UniqueName: \"kubernetes.io/secret/b1828503-4f67-4655-b5b5-428bd1afc8f4-swiftconf\") on node \"crc\" DevicePath \"\"" Jan 28 20:58:27 crc kubenswrapper[4746]: I0128 20:58:27.631102 4746 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-549rm\" (UniqueName: \"kubernetes.io/projected/b1828503-4f67-4655-b5b5-428bd1afc8f4-kube-api-access-549rm\") on node \"crc\" DevicePath \"\"" Jan 28 20:58:27 crc kubenswrapper[4746]: I0128 20:58:27.631117 4746 reconciler_common.go:293] "Volume detached for volume \"ring-data-devices\" (UniqueName: \"kubernetes.io/configmap/b1828503-4f67-4655-b5b5-428bd1afc8f4-ring-data-devices\") on node \"crc\" DevicePath \"\"" Jan 28 20:58:27 crc kubenswrapper[4746]: I0128 20:58:27.631127 4746 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/b1828503-4f67-4655-b5b5-428bd1afc8f4-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" 
Jan 28 20:58:27 crc kubenswrapper[4746]: I0128 20:58:27.631137 4746 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/b1828503-4f67-4655-b5b5-428bd1afc8f4-scripts\") on node \"crc\" DevicePath \"\"" Jan 28 20:58:27 crc kubenswrapper[4746]: I0128 20:58:27.631147 4746 reconciler_common.go:293] "Volume detached for volume \"dispersionconf\" (UniqueName: \"kubernetes.io/secret/b1828503-4f67-4655-b5b5-428bd1afc8f4-dispersionconf\") on node \"crc\" DevicePath \"\"" Jan 28 20:58:27 crc kubenswrapper[4746]: I0128 20:58:27.631157 4746 reconciler_common.go:293] "Volume detached for volume \"etc-swift\" (UniqueName: \"kubernetes.io/empty-dir/b1828503-4f67-4655-b5b5-428bd1afc8f4-etc-swift\") on node \"crc\" DevicePath \"\"" Jan 28 20:58:27 crc kubenswrapper[4746]: I0128 20:58:27.895822 4746 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/cloudkitty-lokistack-distributor-66dfd9bb-55rmf" Jan 28 20:58:28 crc kubenswrapper[4746]: I0128 20:58:28.039186 4746 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/projected/39e8de66-78c6-45cf-b026-7783ef89922d-etc-swift\") pod \"swift-storage-0\" (UID: \"39e8de66-78c6-45cf-b026-7783ef89922d\") " pod="openstack/swift-storage-0" Jan 28 20:58:28 crc kubenswrapper[4746]: E0128 20:58:28.039379 4746 projected.go:288] Couldn't get configMap openstack/swift-ring-files: configmap "swift-ring-files" not found Jan 28 20:58:28 crc kubenswrapper[4746]: E0128 20:58:28.039653 4746 projected.go:194] Error preparing data for projected volume etc-swift for pod openstack/swift-storage-0: configmap "swift-ring-files" not found Jan 28 20:58:28 crc kubenswrapper[4746]: E0128 20:58:28.039710 4746 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/39e8de66-78c6-45cf-b026-7783ef89922d-etc-swift podName:39e8de66-78c6-45cf-b026-7783ef89922d nodeName:}" failed. 
No retries permitted until 2026-01-28 20:58:30.039689193 +0000 UTC m=+1137.995875607 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "etc-swift" (UniqueName: "kubernetes.io/projected/39e8de66-78c6-45cf-b026-7783ef89922d-etc-swift") pod "swift-storage-0" (UID: "39e8de66-78c6-45cf-b026-7783ef89922d") : configmap "swift-ring-files" not found Jan 28 20:58:28 crc kubenswrapper[4746]: I0128 20:58:28.251836 4746 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/cloudkitty-lokistack-querier-795fd8f8cc-gb5z2" Jan 28 20:58:28 crc kubenswrapper[4746]: I0128 20:58:28.361734 4746 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/cloudkitty-lokistack-query-frontend-5cd44666df-gnlkh" Jan 28 20:58:28 crc kubenswrapper[4746]: I0128 20:58:28.408429 4746 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/swift-ring-rebalance-bggx2" Jan 28 20:58:28 crc kubenswrapper[4746]: I0128 20:58:28.490854 4746 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/swift-ring-rebalance-bggx2"] Jan 28 20:58:28 crc kubenswrapper[4746]: I0128 20:58:28.514576 4746 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/swift-ring-rebalance-bggx2"] Jan 28 20:58:28 crc kubenswrapper[4746]: I0128 20:58:28.850044 4746 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="8bee76ad-4034-469f-8a4e-8dff76089361" path="/var/lib/kubelet/pods/8bee76ad-4034-469f-8a4e-8dff76089361/volumes" Jan 28 20:58:28 crc kubenswrapper[4746]: I0128 20:58:28.850663 4746 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="b1828503-4f67-4655-b5b5-428bd1afc8f4" path="/var/lib/kubelet/pods/b1828503-4f67-4655-b5b5-428bd1afc8f4/volumes" Jan 28 20:58:29 crc kubenswrapper[4746]: I0128 20:58:29.244492 4746 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/cloudkitty-lokistack-ingester-0" podUID="d3cad0b0-7b53-4280-9dec-05e01692820c" 
containerName="loki-ingester" probeResult="failure" output="HTTP probe failed with statuscode: 503" Jan 28 20:58:29 crc kubenswrapper[4746]: I0128 20:58:29.459315 4746 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/cloudkitty-lokistack-compactor-0" Jan 28 20:58:29 crc kubenswrapper[4746]: I0128 20:58:29.638429 4746 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/cloudkitty-lokistack-index-gateway-0" Jan 28 20:58:30 crc kubenswrapper[4746]: I0128 20:58:30.095255 4746 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/projected/39e8de66-78c6-45cf-b026-7783ef89922d-etc-swift\") pod \"swift-storage-0\" (UID: \"39e8de66-78c6-45cf-b026-7783ef89922d\") " pod="openstack/swift-storage-0" Jan 28 20:58:30 crc kubenswrapper[4746]: E0128 20:58:30.095472 4746 projected.go:288] Couldn't get configMap openstack/swift-ring-files: configmap "swift-ring-files" not found Jan 28 20:58:30 crc kubenswrapper[4746]: E0128 20:58:30.095867 4746 projected.go:194] Error preparing data for projected volume etc-swift for pod openstack/swift-storage-0: configmap "swift-ring-files" not found Jan 28 20:58:30 crc kubenswrapper[4746]: E0128 20:58:30.095938 4746 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/39e8de66-78c6-45cf-b026-7783ef89922d-etc-swift podName:39e8de66-78c6-45cf-b026-7783ef89922d nodeName:}" failed. No retries permitted until 2026-01-28 20:58:34.095914434 +0000 UTC m=+1142.052100788 (durationBeforeRetry 4s). 
Error: MountVolume.SetUp failed for volume "etc-swift" (UniqueName: "kubernetes.io/projected/39e8de66-78c6-45cf-b026-7783ef89922d-etc-swift") pod "swift-storage-0" (UID: "39e8de66-78c6-45cf-b026-7783ef89922d") : configmap "swift-ring-files" not found Jan 28 20:58:30 crc kubenswrapper[4746]: I0128 20:58:30.246901 4746 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-7cb5889db5-922j4"] Jan 28 20:58:30 crc kubenswrapper[4746]: W0128 20:58:30.253620 4746 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod61a4ff02_ae06_438a_a39c_8264c8e61b38.slice/crio-a8d31a39a56a93819c68ad91f78932813cd375e53a1b32ba0f782db3d6547413 WatchSource:0}: Error finding container a8d31a39a56a93819c68ad91f78932813cd375e53a1b32ba0f782db3d6547413: Status 404 returned error can't find the container with id a8d31a39a56a93819c68ad91f78932813cd375e53a1b32ba0f782db3d6547413 Jan 28 20:58:30 crc kubenswrapper[4746]: I0128 20:58:30.262801 4746 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/swift-ring-rebalance-5sxj6"] Jan 28 20:58:30 crc kubenswrapper[4746]: I0128 20:58:30.412759 4746 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovn-controller-metrics-5vb9w"] Jan 28 20:58:30 crc kubenswrapper[4746]: I0128 20:58:30.434544 4746 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-ring-rebalance-5sxj6" event={"ID":"61a4ff02-ae06-438a-a39c-8264c8e61b38","Type":"ContainerStarted","Data":"a8d31a39a56a93819c68ad91f78932813cd375e53a1b32ba0f782db3d6547413"} Jan 28 20:58:30 crc kubenswrapper[4746]: I0128 20:58:30.437908 4746 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-7cb5889db5-922j4" event={"ID":"d4bec6f2-d32d-4c25-ac4f-fddbd615754a","Type":"ContainerStarted","Data":"80dfcf13951da04bef9c83083ff0e5f2bc46d481831555f6fd844173e38dccbf"} Jan 28 20:58:30 crc kubenswrapper[4746]: I0128 20:58:30.437962 4746 kubelet.go:2453] 
"SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-7cb5889db5-922j4" event={"ID":"d4bec6f2-d32d-4c25-ac4f-fddbd615754a","Type":"ContainerStarted","Data":"e3132bd500f133a4edd912867d1338f0815a1b65298ab8b0d617d91032b0d9b1"} Jan 28 20:58:30 crc kubenswrapper[4746]: I0128 20:58:30.437979 4746 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/dnsmasq-dns-7cb5889db5-922j4" podUID="d4bec6f2-d32d-4c25-ac4f-fddbd615754a" containerName="init" containerID="cri-o://80dfcf13951da04bef9c83083ff0e5f2bc46d481831555f6fd844173e38dccbf" gracePeriod=10 Jan 28 20:58:30 crc kubenswrapper[4746]: I0128 20:58:30.442953 4746 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/prometheus-metric-storage-0" event={"ID":"fde93743-7b9d-4175-abdf-bd74008cf4b0","Type":"ContainerStarted","Data":"7f07d0762d2d78bb7a35ece131dabc38efd5554409eccb72f034b896277380c1"} Jan 28 20:58:30 crc kubenswrapper[4746]: I0128 20:58:30.529275 4746 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovn-northd-0"] Jan 28 20:58:30 crc kubenswrapper[4746]: I0128 20:58:30.537625 4746 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-698758b865-pdnv6"] Jan 28 20:58:30 crc kubenswrapper[4746]: I0128 20:58:30.756334 4746 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-7cb5889db5-922j4" Jan 28 20:58:30 crc kubenswrapper[4746]: I0128 20:58:30.910701 4746 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-85q6g\" (UniqueName: \"kubernetes.io/projected/d4bec6f2-d32d-4c25-ac4f-fddbd615754a-kube-api-access-85q6g\") pod \"d4bec6f2-d32d-4c25-ac4f-fddbd615754a\" (UID: \"d4bec6f2-d32d-4c25-ac4f-fddbd615754a\") " Jan 28 20:58:30 crc kubenswrapper[4746]: I0128 20:58:30.910852 4746 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/d4bec6f2-d32d-4c25-ac4f-fddbd615754a-dns-svc\") pod \"d4bec6f2-d32d-4c25-ac4f-fddbd615754a\" (UID: \"d4bec6f2-d32d-4c25-ac4f-fddbd615754a\") " Jan 28 20:58:30 crc kubenswrapper[4746]: I0128 20:58:30.910907 4746 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/d4bec6f2-d32d-4c25-ac4f-fddbd615754a-config\") pod \"d4bec6f2-d32d-4c25-ac4f-fddbd615754a\" (UID: \"d4bec6f2-d32d-4c25-ac4f-fddbd615754a\") " Jan 28 20:58:30 crc kubenswrapper[4746]: I0128 20:58:30.916272 4746 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/d4bec6f2-d32d-4c25-ac4f-fddbd615754a-kube-api-access-85q6g" (OuterVolumeSpecName: "kube-api-access-85q6g") pod "d4bec6f2-d32d-4c25-ac4f-fddbd615754a" (UID: "d4bec6f2-d32d-4c25-ac4f-fddbd615754a"). InnerVolumeSpecName "kube-api-access-85q6g". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 28 20:58:30 crc kubenswrapper[4746]: I0128 20:58:30.931704 4746 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/d4bec6f2-d32d-4c25-ac4f-fddbd615754a-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "d4bec6f2-d32d-4c25-ac4f-fddbd615754a" (UID: "d4bec6f2-d32d-4c25-ac4f-fddbd615754a"). InnerVolumeSpecName "dns-svc". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 28 20:58:30 crc kubenswrapper[4746]: I0128 20:58:30.931959 4746 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/d4bec6f2-d32d-4c25-ac4f-fddbd615754a-config" (OuterVolumeSpecName: "config") pod "d4bec6f2-d32d-4c25-ac4f-fddbd615754a" (UID: "d4bec6f2-d32d-4c25-ac4f-fddbd615754a"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 28 20:58:31 crc kubenswrapper[4746]: I0128 20:58:31.013743 4746 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-85q6g\" (UniqueName: \"kubernetes.io/projected/d4bec6f2-d32d-4c25-ac4f-fddbd615754a-kube-api-access-85q6g\") on node \"crc\" DevicePath \"\"" Jan 28 20:58:31 crc kubenswrapper[4746]: I0128 20:58:31.013782 4746 reconciler_common.go:293] "Volume detached for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/d4bec6f2-d32d-4c25-ac4f-fddbd615754a-dns-svc\") on node \"crc\" DevicePath \"\"" Jan 28 20:58:31 crc kubenswrapper[4746]: I0128 20:58:31.013795 4746 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/d4bec6f2-d32d-4c25-ac4f-fddbd615754a-config\") on node \"crc\" DevicePath \"\"" Jan 28 20:58:31 crc kubenswrapper[4746]: I0128 20:58:31.455694 4746 generic.go:334] "Generic (PLEG): container finished" podID="d4bec6f2-d32d-4c25-ac4f-fddbd615754a" containerID="80dfcf13951da04bef9c83083ff0e5f2bc46d481831555f6fd844173e38dccbf" exitCode=0 Jan 28 20:58:31 crc kubenswrapper[4746]: I0128 20:58:31.455744 4746 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-7cb5889db5-922j4" event={"ID":"d4bec6f2-d32d-4c25-ac4f-fddbd615754a","Type":"ContainerDied","Data":"80dfcf13951da04bef9c83083ff0e5f2bc46d481831555f6fd844173e38dccbf"} Jan 28 20:58:31 crc kubenswrapper[4746]: I0128 20:58:31.455750 4746 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-7cb5889db5-922j4" Jan 28 20:58:31 crc kubenswrapper[4746]: I0128 20:58:31.456135 4746 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-7cb5889db5-922j4" event={"ID":"d4bec6f2-d32d-4c25-ac4f-fddbd615754a","Type":"ContainerDied","Data":"e3132bd500f133a4edd912867d1338f0815a1b65298ab8b0d617d91032b0d9b1"} Jan 28 20:58:31 crc kubenswrapper[4746]: I0128 20:58:31.456202 4746 scope.go:117] "RemoveContainer" containerID="80dfcf13951da04bef9c83083ff0e5f2bc46d481831555f6fd844173e38dccbf" Jan 28 20:58:31 crc kubenswrapper[4746]: I0128 20:58:31.458598 4746 generic.go:334] "Generic (PLEG): container finished" podID="b1a60d3b-bcc7-47e3-94b0-12acae8ccfda" containerID="35baa398a2ff8247262c262e8c9b0d34000d20d9745720e78ff444d5c8abb948" exitCode=0 Jan 28 20:58:31 crc kubenswrapper[4746]: I0128 20:58:31.458684 4746 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-698758b865-pdnv6" event={"ID":"b1a60d3b-bcc7-47e3-94b0-12acae8ccfda","Type":"ContainerDied","Data":"35baa398a2ff8247262c262e8c9b0d34000d20d9745720e78ff444d5c8abb948"} Jan 28 20:58:31 crc kubenswrapper[4746]: I0128 20:58:31.458727 4746 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-698758b865-pdnv6" event={"ID":"b1a60d3b-bcc7-47e3-94b0-12acae8ccfda","Type":"ContainerStarted","Data":"d6add7168af2abe8b03cc109943ce81073fed15fb20d52da0172097ad1356ba3"} Jan 28 20:58:31 crc kubenswrapper[4746]: I0128 20:58:31.460988 4746 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-northd-0" event={"ID":"af4de16a-caed-4c86-9cf8-da6f9214ca5f","Type":"ContainerStarted","Data":"930bee50235aa093904cf59ce2194e6b4ca945e96856deed6cb1a0f1230bba8f"} Jan 28 20:58:31 crc kubenswrapper[4746]: I0128 20:58:31.462843 4746 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-controller-metrics-5vb9w" 
event={"ID":"7349005b-b4d2-40b0-bc5c-d83acafaf9e3","Type":"ContainerStarted","Data":"6db0755e4a821eef7eaadb1a4f4595113d17717450174e556ddbce7c0ce6c85a"} Jan 28 20:58:31 crc kubenswrapper[4746]: I0128 20:58:31.462865 4746 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-controller-metrics-5vb9w" event={"ID":"7349005b-b4d2-40b0-bc5c-d83acafaf9e3","Type":"ContainerStarted","Data":"7cdbd381f5ef498f9c00f3f9584f9583d4d67cc70fbe7e8f30a14ce0aa6e452d"} Jan 28 20:58:31 crc kubenswrapper[4746]: I0128 20:58:31.513584 4746 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/ovn-controller-metrics-5vb9w" podStartSLOduration=6.513567773 podStartE2EDuration="6.513567773s" podCreationTimestamp="2026-01-28 20:58:25 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-28 20:58:31.498332154 +0000 UTC m=+1139.454518518" watchObservedRunningTime="2026-01-28 20:58:31.513567773 +0000 UTC m=+1139.469754127" Jan 28 20:58:31 crc kubenswrapper[4746]: I0128 20:58:31.691167 4746 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/openstack-galera-0" Jan 28 20:58:31 crc kubenswrapper[4746]: I0128 20:58:31.691215 4746 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/openstack-galera-0" Jan 28 20:58:31 crc kubenswrapper[4746]: I0128 20:58:31.770142 4746 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-7cb5889db5-922j4"] Jan 28 20:58:31 crc kubenswrapper[4746]: I0128 20:58:31.787472 4746 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-7cb5889db5-922j4"] Jan 28 20:58:31 crc kubenswrapper[4746]: I0128 20:58:31.815812 4746 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/openstack-galera-0" Jan 28 20:58:32 crc kubenswrapper[4746]: I0128 20:58:32.001126 4746 scope.go:117] "RemoveContainer" 
containerID="80dfcf13951da04bef9c83083ff0e5f2bc46d481831555f6fd844173e38dccbf" Jan 28 20:58:32 crc kubenswrapper[4746]: E0128 20:58:32.002162 4746 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"80dfcf13951da04bef9c83083ff0e5f2bc46d481831555f6fd844173e38dccbf\": container with ID starting with 80dfcf13951da04bef9c83083ff0e5f2bc46d481831555f6fd844173e38dccbf not found: ID does not exist" containerID="80dfcf13951da04bef9c83083ff0e5f2bc46d481831555f6fd844173e38dccbf" Jan 28 20:58:32 crc kubenswrapper[4746]: I0128 20:58:32.002209 4746 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"80dfcf13951da04bef9c83083ff0e5f2bc46d481831555f6fd844173e38dccbf"} err="failed to get container status \"80dfcf13951da04bef9c83083ff0e5f2bc46d481831555f6fd844173e38dccbf\": rpc error: code = NotFound desc = could not find container \"80dfcf13951da04bef9c83083ff0e5f2bc46d481831555f6fd844173e38dccbf\": container with ID starting with 80dfcf13951da04bef9c83083ff0e5f2bc46d481831555f6fd844173e38dccbf not found: ID does not exist" Jan 28 20:58:32 crc kubenswrapper[4746]: I0128 20:58:32.479903 4746 generic.go:334] "Generic (PLEG): container finished" podID="88718387-09d6-4e3d-a06f-4353ba42ce91" containerID="ff6dfe0b7527df02f28dd43e980cf6fdbb546706b9ce1c38c5e9ebe0e3c4c38a" exitCode=0 Jan 28 20:58:32 crc kubenswrapper[4746]: I0128 20:58:32.479978 4746 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-server-0" event={"ID":"88718387-09d6-4e3d-a06f-4353ba42ce91","Type":"ContainerDied","Data":"ff6dfe0b7527df02f28dd43e980cf6fdbb546706b9ce1c38c5e9ebe0e3c4c38a"} Jan 28 20:58:32 crc kubenswrapper[4746]: I0128 20:58:32.495029 4746 generic.go:334] "Generic (PLEG): container finished" podID="701360b2-121a-4cb4-9a4f-9ce63391e740" containerID="f2f571246c74fa9c9e1b471c32df4a45d7e6ced3641e96b15c5ddca28302d0b2" exitCode=0 Jan 28 20:58:32 crc kubenswrapper[4746]: I0128 
20:58:32.495940 4746 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-cell1-server-0" event={"ID":"701360b2-121a-4cb4-9a4f-9ce63391e740","Type":"ContainerDied","Data":"f2f571246c74fa9c9e1b471c32df4a45d7e6ced3641e96b15c5ddca28302d0b2"} Jan 28 20:58:32 crc kubenswrapper[4746]: I0128 20:58:32.587115 4746 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/openstack-galera-0" Jan 28 20:58:32 crc kubenswrapper[4746]: I0128 20:58:32.859461 4746 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="d4bec6f2-d32d-4c25-ac4f-fddbd615754a" path="/var/lib/kubelet/pods/d4bec6f2-d32d-4c25-ac4f-fddbd615754a/volumes" Jan 28 20:58:32 crc kubenswrapper[4746]: I0128 20:58:32.935278 4746 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/keystone-350d-account-create-update-q5qhn"] Jan 28 20:58:32 crc kubenswrapper[4746]: E0128 20:58:32.935644 4746 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="d4bec6f2-d32d-4c25-ac4f-fddbd615754a" containerName="init" Jan 28 20:58:32 crc kubenswrapper[4746]: I0128 20:58:32.935661 4746 state_mem.go:107] "Deleted CPUSet assignment" podUID="d4bec6f2-d32d-4c25-ac4f-fddbd615754a" containerName="init" Jan 28 20:58:32 crc kubenswrapper[4746]: I0128 20:58:32.935837 4746 memory_manager.go:354] "RemoveStaleState removing state" podUID="d4bec6f2-d32d-4c25-ac4f-fddbd615754a" containerName="init" Jan 28 20:58:32 crc kubenswrapper[4746]: I0128 20:58:32.936494 4746 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/keystone-350d-account-create-update-q5qhn" Jan 28 20:58:32 crc kubenswrapper[4746]: I0128 20:58:32.938679 4746 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone-db-secret" Jan 28 20:58:32 crc kubenswrapper[4746]: I0128 20:58:32.947053 4746 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/keystone-350d-account-create-update-q5qhn"] Jan 28 20:58:32 crc kubenswrapper[4746]: I0128 20:58:32.990315 4746 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/keystone-db-create-dlfdf"] Jan 28 20:58:32 crc kubenswrapper[4746]: I0128 20:58:32.991742 4746 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-db-create-dlfdf" Jan 28 20:58:33 crc kubenswrapper[4746]: I0128 20:58:33.000319 4746 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/keystone-db-create-dlfdf"] Jan 28 20:58:33 crc kubenswrapper[4746]: I0128 20:58:33.071010 4746 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-7jn7w\" (UniqueName: \"kubernetes.io/projected/d587d573-77d2-41a6-a9c9-3cf63b24512d-kube-api-access-7jn7w\") pod \"keystone-350d-account-create-update-q5qhn\" (UID: \"d587d573-77d2-41a6-a9c9-3cf63b24512d\") " pod="openstack/keystone-350d-account-create-update-q5qhn" Jan 28 20:58:33 crc kubenswrapper[4746]: I0128 20:58:33.071102 4746 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/d587d573-77d2-41a6-a9c9-3cf63b24512d-operator-scripts\") pod \"keystone-350d-account-create-update-q5qhn\" (UID: \"d587d573-77d2-41a6-a9c9-3cf63b24512d\") " pod="openstack/keystone-350d-account-create-update-q5qhn" Jan 28 20:58:33 crc kubenswrapper[4746]: I0128 20:58:33.071142 4746 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-q5s8l\" 
(UniqueName: \"kubernetes.io/projected/5d335033-aade-4271-ae71-4bb277438111-kube-api-access-q5s8l\") pod \"keystone-db-create-dlfdf\" (UID: \"5d335033-aade-4271-ae71-4bb277438111\") " pod="openstack/keystone-db-create-dlfdf" Jan 28 20:58:33 crc kubenswrapper[4746]: I0128 20:58:33.071186 4746 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/5d335033-aade-4271-ae71-4bb277438111-operator-scripts\") pod \"keystone-db-create-dlfdf\" (UID: \"5d335033-aade-4271-ae71-4bb277438111\") " pod="openstack/keystone-db-create-dlfdf" Jan 28 20:58:33 crc kubenswrapper[4746]: I0128 20:58:33.177686 4746 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-7jn7w\" (UniqueName: \"kubernetes.io/projected/d587d573-77d2-41a6-a9c9-3cf63b24512d-kube-api-access-7jn7w\") pod \"keystone-350d-account-create-update-q5qhn\" (UID: \"d587d573-77d2-41a6-a9c9-3cf63b24512d\") " pod="openstack/keystone-350d-account-create-update-q5qhn" Jan 28 20:58:33 crc kubenswrapper[4746]: I0128 20:58:33.177763 4746 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/d587d573-77d2-41a6-a9c9-3cf63b24512d-operator-scripts\") pod \"keystone-350d-account-create-update-q5qhn\" (UID: \"d587d573-77d2-41a6-a9c9-3cf63b24512d\") " pod="openstack/keystone-350d-account-create-update-q5qhn" Jan 28 20:58:33 crc kubenswrapper[4746]: I0128 20:58:33.177807 4746 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-q5s8l\" (UniqueName: \"kubernetes.io/projected/5d335033-aade-4271-ae71-4bb277438111-kube-api-access-q5s8l\") pod \"keystone-db-create-dlfdf\" (UID: \"5d335033-aade-4271-ae71-4bb277438111\") " pod="openstack/keystone-db-create-dlfdf" Jan 28 20:58:33 crc kubenswrapper[4746]: I0128 20:58:33.177862 4746 reconciler_common.go:218] "operationExecutor.MountVolume 
started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/5d335033-aade-4271-ae71-4bb277438111-operator-scripts\") pod \"keystone-db-create-dlfdf\" (UID: \"5d335033-aade-4271-ae71-4bb277438111\") " pod="openstack/keystone-db-create-dlfdf" Jan 28 20:58:33 crc kubenswrapper[4746]: I0128 20:58:33.179730 4746 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/d587d573-77d2-41a6-a9c9-3cf63b24512d-operator-scripts\") pod \"keystone-350d-account-create-update-q5qhn\" (UID: \"d587d573-77d2-41a6-a9c9-3cf63b24512d\") " pod="openstack/keystone-350d-account-create-update-q5qhn" Jan 28 20:58:33 crc kubenswrapper[4746]: I0128 20:58:33.179796 4746 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/5d335033-aade-4271-ae71-4bb277438111-operator-scripts\") pod \"keystone-db-create-dlfdf\" (UID: \"5d335033-aade-4271-ae71-4bb277438111\") " pod="openstack/keystone-db-create-dlfdf" Jan 28 20:58:33 crc kubenswrapper[4746]: I0128 20:58:33.207186 4746 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-q5s8l\" (UniqueName: \"kubernetes.io/projected/5d335033-aade-4271-ae71-4bb277438111-kube-api-access-q5s8l\") pod \"keystone-db-create-dlfdf\" (UID: \"5d335033-aade-4271-ae71-4bb277438111\") " pod="openstack/keystone-db-create-dlfdf" Jan 28 20:58:33 crc kubenswrapper[4746]: I0128 20:58:33.217550 4746 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/placement-db-create-zmt7g"] Jan 28 20:58:33 crc kubenswrapper[4746]: I0128 20:58:33.221458 4746 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-7jn7w\" (UniqueName: \"kubernetes.io/projected/d587d573-77d2-41a6-a9c9-3cf63b24512d-kube-api-access-7jn7w\") pod \"keystone-350d-account-create-update-q5qhn\" (UID: \"d587d573-77d2-41a6-a9c9-3cf63b24512d\") " 
pod="openstack/keystone-350d-account-create-update-q5qhn" Jan 28 20:58:33 crc kubenswrapper[4746]: I0128 20:58:33.222346 4746 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/placement-db-create-zmt7g" Jan 28 20:58:33 crc kubenswrapper[4746]: I0128 20:58:33.258315 4746 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/placement-db-create-zmt7g"] Jan 28 20:58:33 crc kubenswrapper[4746]: I0128 20:58:33.277603 4746 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-350d-account-create-update-q5qhn" Jan 28 20:58:33 crc kubenswrapper[4746]: I0128 20:58:33.291342 4746 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/openstack-cell1-galera-0" Jan 28 20:58:33 crc kubenswrapper[4746]: I0128 20:58:33.291728 4746 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/openstack-cell1-galera-0" Jan 28 20:58:33 crc kubenswrapper[4746]: I0128 20:58:33.308981 4746 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-db-create-dlfdf" Jan 28 20:58:33 crc kubenswrapper[4746]: I0128 20:58:33.349681 4746 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/placement-20b2-account-create-update-6nbn5"] Jan 28 20:58:33 crc kubenswrapper[4746]: I0128 20:58:33.351207 4746 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/placement-20b2-account-create-update-6nbn5" Jan 28 20:58:33 crc kubenswrapper[4746]: I0128 20:58:33.354916 4746 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"placement-db-secret" Jan 28 20:58:33 crc kubenswrapper[4746]: I0128 20:58:33.361033 4746 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/placement-20b2-account-create-update-6nbn5"] Jan 28 20:58:33 crc kubenswrapper[4746]: I0128 20:58:33.381380 4746 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-tjjzv\" (UniqueName: \"kubernetes.io/projected/80e2e7c1-645d-4709-b83e-c5604fcc4dfe-kube-api-access-tjjzv\") pod \"placement-db-create-zmt7g\" (UID: \"80e2e7c1-645d-4709-b83e-c5604fcc4dfe\") " pod="openstack/placement-db-create-zmt7g" Jan 28 20:58:33 crc kubenswrapper[4746]: I0128 20:58:33.381483 4746 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/80e2e7c1-645d-4709-b83e-c5604fcc4dfe-operator-scripts\") pod \"placement-db-create-zmt7g\" (UID: \"80e2e7c1-645d-4709-b83e-c5604fcc4dfe\") " pod="openstack/placement-db-create-zmt7g" Jan 28 20:58:33 crc kubenswrapper[4746]: I0128 20:58:33.414251 4746 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/openstack-cell1-galera-0" Jan 28 20:58:33 crc kubenswrapper[4746]: I0128 20:58:33.488838 4746 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-tjjzv\" (UniqueName: \"kubernetes.io/projected/80e2e7c1-645d-4709-b83e-c5604fcc4dfe-kube-api-access-tjjzv\") pod \"placement-db-create-zmt7g\" (UID: \"80e2e7c1-645d-4709-b83e-c5604fcc4dfe\") " pod="openstack/placement-db-create-zmt7g" Jan 28 20:58:33 crc kubenswrapper[4746]: I0128 20:58:33.489489 4746 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" 
(UniqueName: \"kubernetes.io/configmap/80e2e7c1-645d-4709-b83e-c5604fcc4dfe-operator-scripts\") pod \"placement-db-create-zmt7g\" (UID: \"80e2e7c1-645d-4709-b83e-c5604fcc4dfe\") " pod="openstack/placement-db-create-zmt7g" Jan 28 20:58:33 crc kubenswrapper[4746]: I0128 20:58:33.489532 4746 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-4729z\" (UniqueName: \"kubernetes.io/projected/95d6c762-62af-4a0e-bbb9-af154d84b913-kube-api-access-4729z\") pod \"placement-20b2-account-create-update-6nbn5\" (UID: \"95d6c762-62af-4a0e-bbb9-af154d84b913\") " pod="openstack/placement-20b2-account-create-update-6nbn5" Jan 28 20:58:33 crc kubenswrapper[4746]: I0128 20:58:33.489574 4746 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/95d6c762-62af-4a0e-bbb9-af154d84b913-operator-scripts\") pod \"placement-20b2-account-create-update-6nbn5\" (UID: \"95d6c762-62af-4a0e-bbb9-af154d84b913\") " pod="openstack/placement-20b2-account-create-update-6nbn5" Jan 28 20:58:33 crc kubenswrapper[4746]: I0128 20:58:33.490496 4746 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/80e2e7c1-645d-4709-b83e-c5604fcc4dfe-operator-scripts\") pod \"placement-db-create-zmt7g\" (UID: \"80e2e7c1-645d-4709-b83e-c5604fcc4dfe\") " pod="openstack/placement-db-create-zmt7g" Jan 28 20:58:33 crc kubenswrapper[4746]: I0128 20:58:33.509862 4746 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/prometheus-metric-storage-0" event={"ID":"fde93743-7b9d-4175-abdf-bd74008cf4b0","Type":"ContainerStarted","Data":"87a1fc8f482054b4ebcdef39a9f24f66265346cafd84734616e8affa161376bd"} Jan 28 20:58:33 crc kubenswrapper[4746]: I0128 20:58:33.534957 4746 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-tjjzv\" (UniqueName: 
\"kubernetes.io/projected/80e2e7c1-645d-4709-b83e-c5604fcc4dfe-kube-api-access-tjjzv\") pod \"placement-db-create-zmt7g\" (UID: \"80e2e7c1-645d-4709-b83e-c5604fcc4dfe\") " pod="openstack/placement-db-create-zmt7g" Jan 28 20:58:33 crc kubenswrapper[4746]: I0128 20:58:33.591492 4746 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-4729z\" (UniqueName: \"kubernetes.io/projected/95d6c762-62af-4a0e-bbb9-af154d84b913-kube-api-access-4729z\") pod \"placement-20b2-account-create-update-6nbn5\" (UID: \"95d6c762-62af-4a0e-bbb9-af154d84b913\") " pod="openstack/placement-20b2-account-create-update-6nbn5" Jan 28 20:58:33 crc kubenswrapper[4746]: I0128 20:58:33.591539 4746 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/95d6c762-62af-4a0e-bbb9-af154d84b913-operator-scripts\") pod \"placement-20b2-account-create-update-6nbn5\" (UID: \"95d6c762-62af-4a0e-bbb9-af154d84b913\") " pod="openstack/placement-20b2-account-create-update-6nbn5" Jan 28 20:58:33 crc kubenswrapper[4746]: I0128 20:58:33.593005 4746 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/95d6c762-62af-4a0e-bbb9-af154d84b913-operator-scripts\") pod \"placement-20b2-account-create-update-6nbn5\" (UID: \"95d6c762-62af-4a0e-bbb9-af154d84b913\") " pod="openstack/placement-20b2-account-create-update-6nbn5" Jan 28 20:58:33 crc kubenswrapper[4746]: I0128 20:58:33.597339 4746 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/placement-db-create-zmt7g" Jan 28 20:58:33 crc kubenswrapper[4746]: I0128 20:58:33.611982 4746 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-4729z\" (UniqueName: \"kubernetes.io/projected/95d6c762-62af-4a0e-bbb9-af154d84b913-kube-api-access-4729z\") pod \"placement-20b2-account-create-update-6nbn5\" (UID: \"95d6c762-62af-4a0e-bbb9-af154d84b913\") " pod="openstack/placement-20b2-account-create-update-6nbn5" Jan 28 20:58:33 crc kubenswrapper[4746]: I0128 20:58:33.643739 4746 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/openstack-cell1-galera-0" Jan 28 20:58:33 crc kubenswrapper[4746]: I0128 20:58:33.696058 4746 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/placement-20b2-account-create-update-6nbn5" Jan 28 20:58:34 crc kubenswrapper[4746]: I0128 20:58:34.099432 4746 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/projected/39e8de66-78c6-45cf-b026-7783ef89922d-etc-swift\") pod \"swift-storage-0\" (UID: \"39e8de66-78c6-45cf-b026-7783ef89922d\") " pod="openstack/swift-storage-0" Jan 28 20:58:34 crc kubenswrapper[4746]: E0128 20:58:34.099707 4746 projected.go:288] Couldn't get configMap openstack/swift-ring-files: configmap "swift-ring-files" not found Jan 28 20:58:34 crc kubenswrapper[4746]: E0128 20:58:34.099723 4746 projected.go:194] Error preparing data for projected volume etc-swift for pod openstack/swift-storage-0: configmap "swift-ring-files" not found Jan 28 20:58:34 crc kubenswrapper[4746]: E0128 20:58:34.099768 4746 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/39e8de66-78c6-45cf-b026-7783ef89922d-etc-swift podName:39e8de66-78c6-45cf-b026-7783ef89922d nodeName:}" failed. No retries permitted until 2026-01-28 20:58:42.099751603 +0000 UTC m=+1150.055937957 (durationBeforeRetry 8s). 
Error: MountVolume.SetUp failed for volume "etc-swift" (UniqueName: "kubernetes.io/projected/39e8de66-78c6-45cf-b026-7783ef89922d-etc-swift") pod "swift-storage-0" (UID: "39e8de66-78c6-45cf-b026-7783ef89922d") : configmap "swift-ring-files" not found Jan 28 20:58:38 crc kubenswrapper[4746]: I0128 20:58:38.474161 4746 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/placement-db-create-zmt7g"] Jan 28 20:58:38 crc kubenswrapper[4746]: I0128 20:58:38.482957 4746 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/keystone-350d-account-create-update-q5qhn"] Jan 28 20:58:38 crc kubenswrapper[4746]: I0128 20:58:38.523683 4746 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/glance-db-create-4wrhd"] Jan 28 20:58:38 crc kubenswrapper[4746]: I0128 20:58:38.524925 4746 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/glance-db-create-4wrhd" Jan 28 20:58:38 crc kubenswrapper[4746]: I0128 20:58:38.529733 4746 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-db-create-4wrhd"] Jan 28 20:58:38 crc kubenswrapper[4746]: I0128 20:58:38.618966 4746 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-kzq6k\" (UniqueName: \"kubernetes.io/projected/617495c8-6e95-4b00-a9ae-8a89fdf3eb3f-kube-api-access-kzq6k\") pod \"glance-db-create-4wrhd\" (UID: \"617495c8-6e95-4b00-a9ae-8a89fdf3eb3f\") " pod="openstack/glance-db-create-4wrhd" Jan 28 20:58:38 crc kubenswrapper[4746]: I0128 20:58:38.619055 4746 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/617495c8-6e95-4b00-a9ae-8a89fdf3eb3f-operator-scripts\") pod \"glance-db-create-4wrhd\" (UID: \"617495c8-6e95-4b00-a9ae-8a89fdf3eb3f\") " pod="openstack/glance-db-create-4wrhd" Jan 28 20:58:38 crc kubenswrapper[4746]: I0128 20:58:38.672464 4746 kubelet.go:2453] "SyncLoop (PLEG): event 
for pod" pod="openstack/keystone-350d-account-create-update-q5qhn" event={"ID":"d587d573-77d2-41a6-a9c9-3cf63b24512d","Type":"ContainerStarted","Data":"229f8ed31081767179b20914f7dbed3f1c55965b79494282e19251ec19bf368a"} Jan 28 20:58:38 crc kubenswrapper[4746]: I0128 20:58:38.725110 4746 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-kzq6k\" (UniqueName: \"kubernetes.io/projected/617495c8-6e95-4b00-a9ae-8a89fdf3eb3f-kube-api-access-kzq6k\") pod \"glance-db-create-4wrhd\" (UID: \"617495c8-6e95-4b00-a9ae-8a89fdf3eb3f\") " pod="openstack/glance-db-create-4wrhd" Jan 28 20:58:38 crc kubenswrapper[4746]: I0128 20:58:38.725167 4746 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/617495c8-6e95-4b00-a9ae-8a89fdf3eb3f-operator-scripts\") pod \"glance-db-create-4wrhd\" (UID: \"617495c8-6e95-4b00-a9ae-8a89fdf3eb3f\") " pod="openstack/glance-db-create-4wrhd" Jan 28 20:58:38 crc kubenswrapper[4746]: I0128 20:58:38.746735 4746 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/glance-2435-account-create-update-bm9tc"] Jan 28 20:58:38 crc kubenswrapper[4746]: I0128 20:58:38.749873 4746 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/617495c8-6e95-4b00-a9ae-8a89fdf3eb3f-operator-scripts\") pod \"glance-db-create-4wrhd\" (UID: \"617495c8-6e95-4b00-a9ae-8a89fdf3eb3f\") " pod="openstack/glance-db-create-4wrhd" Jan 28 20:58:38 crc kubenswrapper[4746]: I0128 20:58:38.750595 4746 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/glance-2435-account-create-update-bm9tc" Jan 28 20:58:38 crc kubenswrapper[4746]: I0128 20:58:38.760088 4746 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"glance-db-secret" Jan 28 20:58:38 crc kubenswrapper[4746]: I0128 20:58:38.794249 4746 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-kzq6k\" (UniqueName: \"kubernetes.io/projected/617495c8-6e95-4b00-a9ae-8a89fdf3eb3f-kube-api-access-kzq6k\") pod \"glance-db-create-4wrhd\" (UID: \"617495c8-6e95-4b00-a9ae-8a89fdf3eb3f\") " pod="openstack/glance-db-create-4wrhd" Jan 28 20:58:38 crc kubenswrapper[4746]: I0128 20:58:38.802156 4746 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-2435-account-create-update-bm9tc"] Jan 28 20:58:38 crc kubenswrapper[4746]: I0128 20:58:38.817992 4746 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/keystone-db-create-dlfdf"] Jan 28 20:58:38 crc kubenswrapper[4746]: I0128 20:58:38.825925 4746 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/placement-20b2-account-create-update-6nbn5"] Jan 28 20:58:38 crc kubenswrapper[4746]: I0128 20:58:38.826918 4746 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/53a7b226-48b5-4c3c-ba60-fe472d7c6694-operator-scripts\") pod \"glance-2435-account-create-update-bm9tc\" (UID: \"53a7b226-48b5-4c3c-ba60-fe472d7c6694\") " pod="openstack/glance-2435-account-create-update-bm9tc" Jan 28 20:58:38 crc kubenswrapper[4746]: I0128 20:58:38.826966 4746 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-spsgd\" (UniqueName: \"kubernetes.io/projected/53a7b226-48b5-4c3c-ba60-fe472d7c6694-kube-api-access-spsgd\") pod \"glance-2435-account-create-update-bm9tc\" (UID: \"53a7b226-48b5-4c3c-ba60-fe472d7c6694\") " 
pod="openstack/glance-2435-account-create-update-bm9tc" Jan 28 20:58:38 crc kubenswrapper[4746]: I0128 20:58:38.884948 4746 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/glance-db-create-4wrhd" Jan 28 20:58:38 crc kubenswrapper[4746]: I0128 20:58:38.929141 4746 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/53a7b226-48b5-4c3c-ba60-fe472d7c6694-operator-scripts\") pod \"glance-2435-account-create-update-bm9tc\" (UID: \"53a7b226-48b5-4c3c-ba60-fe472d7c6694\") " pod="openstack/glance-2435-account-create-update-bm9tc" Jan 28 20:58:38 crc kubenswrapper[4746]: I0128 20:58:38.929211 4746 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-spsgd\" (UniqueName: \"kubernetes.io/projected/53a7b226-48b5-4c3c-ba60-fe472d7c6694-kube-api-access-spsgd\") pod \"glance-2435-account-create-update-bm9tc\" (UID: \"53a7b226-48b5-4c3c-ba60-fe472d7c6694\") " pod="openstack/glance-2435-account-create-update-bm9tc" Jan 28 20:58:38 crc kubenswrapper[4746]: I0128 20:58:38.930467 4746 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/53a7b226-48b5-4c3c-ba60-fe472d7c6694-operator-scripts\") pod \"glance-2435-account-create-update-bm9tc\" (UID: \"53a7b226-48b5-4c3c-ba60-fe472d7c6694\") " pod="openstack/glance-2435-account-create-update-bm9tc" Jan 28 20:58:38 crc kubenswrapper[4746]: I0128 20:58:38.954185 4746 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-spsgd\" (UniqueName: \"kubernetes.io/projected/53a7b226-48b5-4c3c-ba60-fe472d7c6694-kube-api-access-spsgd\") pod \"glance-2435-account-create-update-bm9tc\" (UID: \"53a7b226-48b5-4c3c-ba60-fe472d7c6694\") " pod="openstack/glance-2435-account-create-update-bm9tc" Jan 28 20:58:39 crc kubenswrapper[4746]: I0128 20:58:39.120148 4746 util.go:30] "No sandbox for pod 
can be found. Need to start a new one" pod="openstack/glance-2435-account-create-update-bm9tc" Jan 28 20:58:39 crc kubenswrapper[4746]: I0128 20:58:39.240223 4746 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/cloudkitty-lokistack-ingester-0" podUID="d3cad0b0-7b53-4280-9dec-05e01692820c" containerName="loki-ingester" probeResult="failure" output="HTTP probe failed with statuscode: 503" Jan 28 20:58:39 crc kubenswrapper[4746]: I0128 20:58:39.684982 4746 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-northd-0" event={"ID":"af4de16a-caed-4c86-9cf8-da6f9214ca5f","Type":"ContainerStarted","Data":"08de9ebb33afab33f1c87e6cf6687af5e070722a920948781cba1b5fb0258361"} Jan 28 20:58:39 crc kubenswrapper[4746]: I0128 20:58:39.692134 4746 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-20b2-account-create-update-6nbn5" event={"ID":"95d6c762-62af-4a0e-bbb9-af154d84b913","Type":"ContainerStarted","Data":"486ce78ac89b281a51f2829209528be0d12420f3b58f4875805da5e7f306e7e7"} Jan 28 20:58:39 crc kubenswrapper[4746]: I0128 20:58:39.695925 4746 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-server-0" event={"ID":"88718387-09d6-4e3d-a06f-4353ba42ce91","Type":"ContainerStarted","Data":"d03a46ad73048c6c7b226dc21426667c6e3fd353d111f06a75a8bf91d50aa9fd"} Jan 28 20:58:39 crc kubenswrapper[4746]: I0128 20:58:39.697719 4746 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-db-create-zmt7g" event={"ID":"80e2e7c1-645d-4709-b83e-c5604fcc4dfe","Type":"ContainerStarted","Data":"cfea17c1b96e092fa1e766d80c511bedf03ad7e0fe8fc7c506392d53cd4e245f"} Jan 28 20:58:39 crc kubenswrapper[4746]: I0128 20:58:39.702426 4746 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-ring-rebalance-5sxj6" event={"ID":"61a4ff02-ae06-438a-a39c-8264c8e61b38","Type":"ContainerStarted","Data":"8366ad10dfa6f272d73ade37166fb74c82fc244a58136aa35816741924f87c1b"} Jan 28 20:58:39 crc 
kubenswrapper[4746]: I0128 20:58:39.706519 4746 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-cell1-server-0" event={"ID":"701360b2-121a-4cb4-9a4f-9ce63391e740","Type":"ContainerStarted","Data":"ed9100b5aaafbed1ab7eef3015c4df8d2843b7bec57597f5b18c522d626ddc01"} Jan 28 20:58:39 crc kubenswrapper[4746]: I0128 20:58:39.707461 4746 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/rabbitmq-cell1-server-0" Jan 28 20:58:39 crc kubenswrapper[4746]: I0128 20:58:39.713830 4746 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-db-create-dlfdf" event={"ID":"5d335033-aade-4271-ae71-4bb277438111","Type":"ContainerStarted","Data":"2f9ca91f55d34efc25d3332edca4dbbfccf3919698d62c7f837018ab74b20e4b"} Jan 28 20:58:39 crc kubenswrapper[4746]: I0128 20:58:39.723387 4746 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-698758b865-pdnv6" event={"ID":"b1a60d3b-bcc7-47e3-94b0-12acae8ccfda","Type":"ContainerStarted","Data":"c8423b419407b795d407f9fc1c460c07a6195201c2880c2f205d0e52f6c92d2c"} Jan 28 20:58:39 crc kubenswrapper[4746]: I0128 20:58:39.724619 4746 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/dnsmasq-dns-698758b865-pdnv6" Jan 28 20:58:39 crc kubenswrapper[4746]: I0128 20:58:39.735906 4746 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/swift-ring-rebalance-5sxj6" podStartSLOduration=5.953897909 podStartE2EDuration="13.735757013s" podCreationTimestamp="2026-01-28 20:58:26 +0000 UTC" firstStartedPulling="2026-01-28 20:58:30.256503039 +0000 UTC m=+1138.212689393" lastFinishedPulling="2026-01-28 20:58:38.038362143 +0000 UTC m=+1145.994548497" observedRunningTime="2026-01-28 20:58:39.719754814 +0000 UTC m=+1147.675941168" watchObservedRunningTime="2026-01-28 20:58:39.735757013 +0000 UTC m=+1147.691943367" Jan 28 20:58:39 crc kubenswrapper[4746]: I0128 20:58:39.785715 4746 pod_startup_latency_tracker.go:104] 
"Observed pod startup duration" pod="openstack/rabbitmq-cell1-server-0" podStartSLOduration=52.158119088 podStartE2EDuration="1m10.785696972s" podCreationTimestamp="2026-01-28 20:57:29 +0000 UTC" firstStartedPulling="2026-01-28 20:57:38.920537752 +0000 UTC m=+1086.876724106" lastFinishedPulling="2026-01-28 20:57:57.548115636 +0000 UTC m=+1105.504301990" observedRunningTime="2026-01-28 20:58:39.75876965 +0000 UTC m=+1147.714956004" watchObservedRunningTime="2026-01-28 20:58:39.785696972 +0000 UTC m=+1147.741883326" Jan 28 20:58:39 crc kubenswrapper[4746]: I0128 20:58:39.790816 4746 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-db-create-4wrhd"] Jan 28 20:58:39 crc kubenswrapper[4746]: I0128 20:58:39.797428 4746 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/dnsmasq-dns-698758b865-pdnv6" podStartSLOduration=14.797403356 podStartE2EDuration="14.797403356s" podCreationTimestamp="2026-01-28 20:58:25 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-28 20:58:39.785930248 +0000 UTC m=+1147.742116602" watchObservedRunningTime="2026-01-28 20:58:39.797403356 +0000 UTC m=+1147.753589710" Jan 28 20:58:39 crc kubenswrapper[4746]: I0128 20:58:39.807966 4746 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-2435-account-create-update-bm9tc"] Jan 28 20:58:39 crc kubenswrapper[4746]: W0128 20:58:39.829618 4746 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod53a7b226_48b5_4c3c_ba60_fe472d7c6694.slice/crio-77c66b5040e35894d3f957a92b9634968c7ca9df690cf73fcf17a10e625b2c6c WatchSource:0}: Error finding container 77c66b5040e35894d3f957a92b9634968c7ca9df690cf73fcf17a10e625b2c6c: Status 404 returned error can't find the container with id 77c66b5040e35894d3f957a92b9634968c7ca9df690cf73fcf17a10e625b2c6c Jan 28 20:58:40 crc 
kubenswrapper[4746]: I0128 20:58:40.326898 4746 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/root-account-create-update-hhpql"] Jan 28 20:58:40 crc kubenswrapper[4746]: I0128 20:58:40.328782 4746 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/root-account-create-update-hhpql" Jan 28 20:58:40 crc kubenswrapper[4746]: I0128 20:58:40.331305 4746 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-mariadb-root-db-secret" Jan 28 20:58:40 crc kubenswrapper[4746]: I0128 20:58:40.337757 4746 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/root-account-create-update-hhpql"] Jan 28 20:58:40 crc kubenswrapper[4746]: I0128 20:58:40.481599 4746 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/eb3fedce-560c-4a95-860a-fedc72ad4d04-operator-scripts\") pod \"root-account-create-update-hhpql\" (UID: \"eb3fedce-560c-4a95-860a-fedc72ad4d04\") " pod="openstack/root-account-create-update-hhpql" Jan 28 20:58:40 crc kubenswrapper[4746]: I0128 20:58:40.481684 4746 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-9zdtk\" (UniqueName: \"kubernetes.io/projected/eb3fedce-560c-4a95-860a-fedc72ad4d04-kube-api-access-9zdtk\") pod \"root-account-create-update-hhpql\" (UID: \"eb3fedce-560c-4a95-860a-fedc72ad4d04\") " pod="openstack/root-account-create-update-hhpql" Jan 28 20:58:40 crc kubenswrapper[4746]: I0128 20:58:40.583565 4746 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/eb3fedce-560c-4a95-860a-fedc72ad4d04-operator-scripts\") pod \"root-account-create-update-hhpql\" (UID: \"eb3fedce-560c-4a95-860a-fedc72ad4d04\") " pod="openstack/root-account-create-update-hhpql" Jan 28 20:58:40 crc kubenswrapper[4746]: I0128 20:58:40.583664 4746 
reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-9zdtk\" (UniqueName: \"kubernetes.io/projected/eb3fedce-560c-4a95-860a-fedc72ad4d04-kube-api-access-9zdtk\") pod \"root-account-create-update-hhpql\" (UID: \"eb3fedce-560c-4a95-860a-fedc72ad4d04\") " pod="openstack/root-account-create-update-hhpql" Jan 28 20:58:40 crc kubenswrapper[4746]: I0128 20:58:40.585144 4746 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/eb3fedce-560c-4a95-860a-fedc72ad4d04-operator-scripts\") pod \"root-account-create-update-hhpql\" (UID: \"eb3fedce-560c-4a95-860a-fedc72ad4d04\") " pod="openstack/root-account-create-update-hhpql" Jan 28 20:58:40 crc kubenswrapper[4746]: I0128 20:58:40.608394 4746 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-9zdtk\" (UniqueName: \"kubernetes.io/projected/eb3fedce-560c-4a95-860a-fedc72ad4d04-kube-api-access-9zdtk\") pod \"root-account-create-update-hhpql\" (UID: \"eb3fedce-560c-4a95-860a-fedc72ad4d04\") " pod="openstack/root-account-create-update-hhpql" Jan 28 20:58:40 crc kubenswrapper[4746]: I0128 20:58:40.650592 4746 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/root-account-create-update-hhpql" Jan 28 20:58:40 crc kubenswrapper[4746]: I0128 20:58:40.738919 4746 generic.go:334] "Generic (PLEG): container finished" podID="80e2e7c1-645d-4709-b83e-c5604fcc4dfe" containerID="2dc4c36f3f34e1b08ec18c6496cc292ab5dccc3a1294d7cc357e44ca14604cc7" exitCode=0 Jan 28 20:58:40 crc kubenswrapper[4746]: I0128 20:58:40.739017 4746 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-db-create-zmt7g" event={"ID":"80e2e7c1-645d-4709-b83e-c5604fcc4dfe","Type":"ContainerDied","Data":"2dc4c36f3f34e1b08ec18c6496cc292ab5dccc3a1294d7cc357e44ca14604cc7"} Jan 28 20:58:40 crc kubenswrapper[4746]: I0128 20:58:40.742013 4746 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-2435-account-create-update-bm9tc" event={"ID":"53a7b226-48b5-4c3c-ba60-fe472d7c6694","Type":"ContainerStarted","Data":"0c3c88cb04a0226468c5a48fbfb28b8557b199093188a9fd188e97ad04b45164"} Jan 28 20:58:40 crc kubenswrapper[4746]: I0128 20:58:40.742048 4746 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-2435-account-create-update-bm9tc" event={"ID":"53a7b226-48b5-4c3c-ba60-fe472d7c6694","Type":"ContainerStarted","Data":"77c66b5040e35894d3f957a92b9634968c7ca9df690cf73fcf17a10e625b2c6c"} Jan 28 20:58:40 crc kubenswrapper[4746]: I0128 20:58:40.746170 4746 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-db-create-4wrhd" event={"ID":"617495c8-6e95-4b00-a9ae-8a89fdf3eb3f","Type":"ContainerStarted","Data":"e004cc641b94cea2da5f068e6c073b64b753e32e8bc0a28daff4bf2d2e5875da"} Jan 28 20:58:40 crc kubenswrapper[4746]: I0128 20:58:40.746230 4746 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-db-create-4wrhd" event={"ID":"617495c8-6e95-4b00-a9ae-8a89fdf3eb3f","Type":"ContainerStarted","Data":"8dc33b68939ae4490a995b8c7d3a0478a3f9948d9e16f2440d19f728d2d2b899"} Jan 28 20:58:40 crc kubenswrapper[4746]: I0128 20:58:40.750199 4746 kubelet.go:2453] 
"SyncLoop (PLEG): event for pod" pod="openstack/keystone-db-create-dlfdf" event={"ID":"5d335033-aade-4271-ae71-4bb277438111","Type":"ContainerStarted","Data":"f0f29a1577f25521e0aec68786abc50afce8b05ccc2caa26cfb0a16ffb4c82ce"} Jan 28 20:58:40 crc kubenswrapper[4746]: I0128 20:58:40.753657 4746 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-northd-0" event={"ID":"af4de16a-caed-4c86-9cf8-da6f9214ca5f","Type":"ContainerStarted","Data":"619cb726a724a271ac6758cc06055ae011ba6da77341de9359dc1530c14af836"} Jan 28 20:58:40 crc kubenswrapper[4746]: I0128 20:58:40.754310 4746 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/ovn-northd-0" Jan 28 20:58:40 crc kubenswrapper[4746]: I0128 20:58:40.756123 4746 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-20b2-account-create-update-6nbn5" event={"ID":"95d6c762-62af-4a0e-bbb9-af154d84b913","Type":"ContainerStarted","Data":"e8afef6b78d19d71b1ada15e5cbe8848c9bdd70ed3fa51c3220528d8c7b7eec8"} Jan 28 20:58:40 crc kubenswrapper[4746]: I0128 20:58:40.762255 4746 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-350d-account-create-update-q5qhn" event={"ID":"d587d573-77d2-41a6-a9c9-3cf63b24512d","Type":"ContainerStarted","Data":"ed4f4d5a3a8a4e636f9e1a6642cf8b94121509448f29e951cec4ffb529f71c91"} Jan 28 20:58:40 crc kubenswrapper[4746]: I0128 20:58:40.813067 4746 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/glance-2435-account-create-update-bm9tc" podStartSLOduration=2.813039277 podStartE2EDuration="2.813039277s" podCreationTimestamp="2026-01-28 20:58:38 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-28 20:58:40.811965818 +0000 UTC m=+1148.768152172" watchObservedRunningTime="2026-01-28 20:58:40.813039277 +0000 UTC m=+1148.769225631" Jan 28 20:58:40 crc kubenswrapper[4746]: I0128 20:58:40.842011 4746 
pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/glance-db-create-4wrhd" podStartSLOduration=2.8419896529999997 podStartE2EDuration="2.841989653s" podCreationTimestamp="2026-01-28 20:58:38 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-28 20:58:40.838857369 +0000 UTC m=+1148.795043723" watchObservedRunningTime="2026-01-28 20:58:40.841989653 +0000 UTC m=+1148.798176007" Jan 28 20:58:40 crc kubenswrapper[4746]: I0128 20:58:40.874125 4746 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/placement-20b2-account-create-update-6nbn5" podStartSLOduration=7.874102584 podStartE2EDuration="7.874102584s" podCreationTimestamp="2026-01-28 20:58:33 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-28 20:58:40.857766426 +0000 UTC m=+1148.813952790" watchObservedRunningTime="2026-01-28 20:58:40.874102584 +0000 UTC m=+1148.830288938" Jan 28 20:58:40 crc kubenswrapper[4746]: I0128 20:58:40.896927 4746 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/ovn-northd-0" podStartSLOduration=7.480204421 podStartE2EDuration="14.896899995s" podCreationTimestamp="2026-01-28 20:58:26 +0000 UTC" firstStartedPulling="2026-01-28 20:58:30.525610734 +0000 UTC m=+1138.481797088" lastFinishedPulling="2026-01-28 20:58:37.942306308 +0000 UTC m=+1145.898492662" observedRunningTime="2026-01-28 20:58:40.886114716 +0000 UTC m=+1148.842301070" watchObservedRunningTime="2026-01-28 20:58:40.896899995 +0000 UTC m=+1148.853086349" Jan 28 20:58:40 crc kubenswrapper[4746]: I0128 20:58:40.918891 4746 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/keystone-db-create-dlfdf" podStartSLOduration=8.918867554 podStartE2EDuration="8.918867554s" podCreationTimestamp="2026-01-28 20:58:32 +0000 UTC" 
firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-28 20:58:40.909001839 +0000 UTC m=+1148.865188203" watchObservedRunningTime="2026-01-28 20:58:40.918867554 +0000 UTC m=+1148.875053908" Jan 28 20:58:40 crc kubenswrapper[4746]: I0128 20:58:40.944025 4746 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/rabbitmq-server-0" podStartSLOduration=54.315343416 podStartE2EDuration="1m12.944002468s" podCreationTimestamp="2026-01-28 20:57:28 +0000 UTC" firstStartedPulling="2026-01-28 20:57:38.920640175 +0000 UTC m=+1086.876826529" lastFinishedPulling="2026-01-28 20:57:57.549299227 +0000 UTC m=+1105.505485581" observedRunningTime="2026-01-28 20:58:40.931070901 +0000 UTC m=+1148.887257275" watchObservedRunningTime="2026-01-28 20:58:40.944002468 +0000 UTC m=+1148.900188822" Jan 28 20:58:40 crc kubenswrapper[4746]: I0128 20:58:40.959118 4746 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/keystone-350d-account-create-update-q5qhn" podStartSLOduration=8.959093662 podStartE2EDuration="8.959093662s" podCreationTimestamp="2026-01-28 20:58:32 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-28 20:58:40.9541247 +0000 UTC m=+1148.910311054" watchObservedRunningTime="2026-01-28 20:58:40.959093662 +0000 UTC m=+1148.915280016" Jan 28 20:58:41 crc kubenswrapper[4746]: E0128 20:58:41.205998 4746 cadvisor_stats_provider.go:516] "Partial failure issuing cadvisor.ContainerInfoV2" err="partial failures: [\"/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod617495c8_6e95_4b00_a9ae_8a89fdf3eb3f.slice/crio-conmon-e004cc641b94cea2da5f068e6c073b64b753e32e8bc0a28daff4bf2d2e5875da.scope\": RecentStats: unable to find data in memory cache]" Jan 28 20:58:41 crc kubenswrapper[4746]: I0128 20:58:41.768208 4746 generic.go:334] "Generic (PLEG): 
container finished" podID="d587d573-77d2-41a6-a9c9-3cf63b24512d" containerID="ed4f4d5a3a8a4e636f9e1a6642cf8b94121509448f29e951cec4ffb529f71c91" exitCode=0 Jan 28 20:58:41 crc kubenswrapper[4746]: I0128 20:58:41.768278 4746 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-350d-account-create-update-q5qhn" event={"ID":"d587d573-77d2-41a6-a9c9-3cf63b24512d","Type":"ContainerDied","Data":"ed4f4d5a3a8a4e636f9e1a6642cf8b94121509448f29e951cec4ffb529f71c91"} Jan 28 20:58:41 crc kubenswrapper[4746]: I0128 20:58:41.769846 4746 generic.go:334] "Generic (PLEG): container finished" podID="53a7b226-48b5-4c3c-ba60-fe472d7c6694" containerID="0c3c88cb04a0226468c5a48fbfb28b8557b199093188a9fd188e97ad04b45164" exitCode=0 Jan 28 20:58:41 crc kubenswrapper[4746]: I0128 20:58:41.769932 4746 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-2435-account-create-update-bm9tc" event={"ID":"53a7b226-48b5-4c3c-ba60-fe472d7c6694","Type":"ContainerDied","Data":"0c3c88cb04a0226468c5a48fbfb28b8557b199093188a9fd188e97ad04b45164"} Jan 28 20:58:41 crc kubenswrapper[4746]: I0128 20:58:41.771546 4746 generic.go:334] "Generic (PLEG): container finished" podID="617495c8-6e95-4b00-a9ae-8a89fdf3eb3f" containerID="e004cc641b94cea2da5f068e6c073b64b753e32e8bc0a28daff4bf2d2e5875da" exitCode=0 Jan 28 20:58:41 crc kubenswrapper[4746]: I0128 20:58:41.771593 4746 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-db-create-4wrhd" event={"ID":"617495c8-6e95-4b00-a9ae-8a89fdf3eb3f","Type":"ContainerDied","Data":"e004cc641b94cea2da5f068e6c073b64b753e32e8bc0a28daff4bf2d2e5875da"} Jan 28 20:58:41 crc kubenswrapper[4746]: I0128 20:58:41.772855 4746 generic.go:334] "Generic (PLEG): container finished" podID="5d335033-aade-4271-ae71-4bb277438111" containerID="f0f29a1577f25521e0aec68786abc50afce8b05ccc2caa26cfb0a16ffb4c82ce" exitCode=0 Jan 28 20:58:41 crc kubenswrapper[4746]: I0128 20:58:41.772927 4746 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openstack/keystone-db-create-dlfdf" event={"ID":"5d335033-aade-4271-ae71-4bb277438111","Type":"ContainerDied","Data":"f0f29a1577f25521e0aec68786abc50afce8b05ccc2caa26cfb0a16ffb4c82ce"} Jan 28 20:58:41 crc kubenswrapper[4746]: I0128 20:58:41.774538 4746 generic.go:334] "Generic (PLEG): container finished" podID="95d6c762-62af-4a0e-bbb9-af154d84b913" containerID="e8afef6b78d19d71b1ada15e5cbe8848c9bdd70ed3fa51c3220528d8c7b7eec8" exitCode=0 Jan 28 20:58:41 crc kubenswrapper[4746]: I0128 20:58:41.774610 4746 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-20b2-account-create-update-6nbn5" event={"ID":"95d6c762-62af-4a0e-bbb9-af154d84b913","Type":"ContainerDied","Data":"e8afef6b78d19d71b1ada15e5cbe8848c9bdd70ed3fa51c3220528d8c7b7eec8"} Jan 28 20:58:42 crc kubenswrapper[4746]: I0128 20:58:42.121242 4746 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/projected/39e8de66-78c6-45cf-b026-7783ef89922d-etc-swift\") pod \"swift-storage-0\" (UID: \"39e8de66-78c6-45cf-b026-7783ef89922d\") " pod="openstack/swift-storage-0" Jan 28 20:58:42 crc kubenswrapper[4746]: E0128 20:58:42.121430 4746 projected.go:288] Couldn't get configMap openstack/swift-ring-files: configmap "swift-ring-files" not found Jan 28 20:58:42 crc kubenswrapper[4746]: E0128 20:58:42.122303 4746 projected.go:194] Error preparing data for projected volume etc-swift for pod openstack/swift-storage-0: configmap "swift-ring-files" not found Jan 28 20:58:42 crc kubenswrapper[4746]: E0128 20:58:42.122419 4746 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/39e8de66-78c6-45cf-b026-7783ef89922d-etc-swift podName:39e8de66-78c6-45cf-b026-7783ef89922d nodeName:}" failed. No retries permitted until 2026-01-28 20:58:58.122399293 +0000 UTC m=+1166.078585667 (durationBeforeRetry 16s). 
Error: MountVolume.SetUp failed for volume "etc-swift" (UniqueName: "kubernetes.io/projected/39e8de66-78c6-45cf-b026-7783ef89922d-etc-swift") pod "swift-storage-0" (UID: "39e8de66-78c6-45cf-b026-7783ef89922d") : configmap "swift-ring-files" not found Jan 28 20:58:42 crc kubenswrapper[4746]: I0128 20:58:42.358289 4746 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/placement-db-create-zmt7g" Jan 28 20:58:42 crc kubenswrapper[4746]: I0128 20:58:42.428648 4746 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/80e2e7c1-645d-4709-b83e-c5604fcc4dfe-operator-scripts\") pod \"80e2e7c1-645d-4709-b83e-c5604fcc4dfe\" (UID: \"80e2e7c1-645d-4709-b83e-c5604fcc4dfe\") " Jan 28 20:58:42 crc kubenswrapper[4746]: I0128 20:58:42.428942 4746 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-tjjzv\" (UniqueName: \"kubernetes.io/projected/80e2e7c1-645d-4709-b83e-c5604fcc4dfe-kube-api-access-tjjzv\") pod \"80e2e7c1-645d-4709-b83e-c5604fcc4dfe\" (UID: \"80e2e7c1-645d-4709-b83e-c5604fcc4dfe\") " Jan 28 20:58:42 crc kubenswrapper[4746]: I0128 20:58:42.430815 4746 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/80e2e7c1-645d-4709-b83e-c5604fcc4dfe-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "80e2e7c1-645d-4709-b83e-c5604fcc4dfe" (UID: "80e2e7c1-645d-4709-b83e-c5604fcc4dfe"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 28 20:58:42 crc kubenswrapper[4746]: I0128 20:58:42.458721 4746 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/80e2e7c1-645d-4709-b83e-c5604fcc4dfe-kube-api-access-tjjzv" (OuterVolumeSpecName: "kube-api-access-tjjzv") pod "80e2e7c1-645d-4709-b83e-c5604fcc4dfe" (UID: "80e2e7c1-645d-4709-b83e-c5604fcc4dfe"). 
InnerVolumeSpecName "kube-api-access-tjjzv". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 28 20:58:42 crc kubenswrapper[4746]: I0128 20:58:42.534875 4746 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-tjjzv\" (UniqueName: \"kubernetes.io/projected/80e2e7c1-645d-4709-b83e-c5604fcc4dfe-kube-api-access-tjjzv\") on node \"crc\" DevicePath \"\"" Jan 28 20:58:42 crc kubenswrapper[4746]: I0128 20:58:42.534931 4746 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/80e2e7c1-645d-4709-b83e-c5604fcc4dfe-operator-scripts\") on node \"crc\" DevicePath \"\"" Jan 28 20:58:42 crc kubenswrapper[4746]: I0128 20:58:42.785004 4746 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-db-create-zmt7g" event={"ID":"80e2e7c1-645d-4709-b83e-c5604fcc4dfe","Type":"ContainerDied","Data":"cfea17c1b96e092fa1e766d80c511bedf03ad7e0fe8fc7c506392d53cd4e245f"} Jan 28 20:58:42 crc kubenswrapper[4746]: I0128 20:58:42.785402 4746 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="cfea17c1b96e092fa1e766d80c511bedf03ad7e0fe8fc7c506392d53cd4e245f" Jan 28 20:58:42 crc kubenswrapper[4746]: I0128 20:58:42.785042 4746 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/placement-db-create-zmt7g" Jan 28 20:58:42 crc kubenswrapper[4746]: I0128 20:58:42.789041 4746 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/prometheus-metric-storage-0" event={"ID":"fde93743-7b9d-4175-abdf-bd74008cf4b0","Type":"ContainerStarted","Data":"c0ddb2ac8b981c68e3c5cdd2b45c613d4095039ce06d3da8fa9d8eabf8cbe483"} Jan 28 20:58:42 crc kubenswrapper[4746]: I0128 20:58:42.822563 4746 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/root-account-create-update-hhpql"] Jan 28 20:58:42 crc kubenswrapper[4746]: I0128 20:58:42.853299 4746 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/prometheus-metric-storage-0" podStartSLOduration=25.365757972 podStartE2EDuration="1m7.853272969s" podCreationTimestamp="2026-01-28 20:57:35 +0000 UTC" firstStartedPulling="2026-01-28 20:57:59.752268903 +0000 UTC m=+1107.708455257" lastFinishedPulling="2026-01-28 20:58:42.2397839 +0000 UTC m=+1150.195970254" observedRunningTime="2026-01-28 20:58:42.842794238 +0000 UTC m=+1150.798980592" watchObservedRunningTime="2026-01-28 20:58:42.853272969 +0000 UTC m=+1150.809459323" Jan 28 20:58:43 crc kubenswrapper[4746]: I0128 20:58:43.313829 4746 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/glance-db-create-4wrhd" Jan 28 20:58:43 crc kubenswrapper[4746]: I0128 20:58:43.464010 4746 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/617495c8-6e95-4b00-a9ae-8a89fdf3eb3f-operator-scripts\") pod \"617495c8-6e95-4b00-a9ae-8a89fdf3eb3f\" (UID: \"617495c8-6e95-4b00-a9ae-8a89fdf3eb3f\") " Jan 28 20:58:43 crc kubenswrapper[4746]: I0128 20:58:43.464229 4746 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-kzq6k\" (UniqueName: \"kubernetes.io/projected/617495c8-6e95-4b00-a9ae-8a89fdf3eb3f-kube-api-access-kzq6k\") pod \"617495c8-6e95-4b00-a9ae-8a89fdf3eb3f\" (UID: \"617495c8-6e95-4b00-a9ae-8a89fdf3eb3f\") " Jan 28 20:58:43 crc kubenswrapper[4746]: I0128 20:58:43.465528 4746 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/617495c8-6e95-4b00-a9ae-8a89fdf3eb3f-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "617495c8-6e95-4b00-a9ae-8a89fdf3eb3f" (UID: "617495c8-6e95-4b00-a9ae-8a89fdf3eb3f"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 28 20:58:43 crc kubenswrapper[4746]: I0128 20:58:43.481888 4746 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/617495c8-6e95-4b00-a9ae-8a89fdf3eb3f-kube-api-access-kzq6k" (OuterVolumeSpecName: "kube-api-access-kzq6k") pod "617495c8-6e95-4b00-a9ae-8a89fdf3eb3f" (UID: "617495c8-6e95-4b00-a9ae-8a89fdf3eb3f"). InnerVolumeSpecName "kube-api-access-kzq6k". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 28 20:58:43 crc kubenswrapper[4746]: I0128 20:58:43.569631 4746 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-kzq6k\" (UniqueName: \"kubernetes.io/projected/617495c8-6e95-4b00-a9ae-8a89fdf3eb3f-kube-api-access-kzq6k\") on node \"crc\" DevicePath \"\"" Jan 28 20:58:43 crc kubenswrapper[4746]: I0128 20:58:43.569656 4746 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/617495c8-6e95-4b00-a9ae-8a89fdf3eb3f-operator-scripts\") on node \"crc\" DevicePath \"\"" Jan 28 20:58:43 crc kubenswrapper[4746]: I0128 20:58:43.582592 4746 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/glance-2435-account-create-update-bm9tc" Jan 28 20:58:43 crc kubenswrapper[4746]: I0128 20:58:43.586821 4746 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-350d-account-create-update-q5qhn" Jan 28 20:58:43 crc kubenswrapper[4746]: I0128 20:58:43.591626 4746 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-db-create-dlfdf" Jan 28 20:58:43 crc kubenswrapper[4746]: I0128 20:58:43.592964 4746 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/placement-20b2-account-create-update-6nbn5" Jan 28 20:58:43 crc kubenswrapper[4746]: I0128 20:58:43.671953 4746 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-spsgd\" (UniqueName: \"kubernetes.io/projected/53a7b226-48b5-4c3c-ba60-fe472d7c6694-kube-api-access-spsgd\") pod \"53a7b226-48b5-4c3c-ba60-fe472d7c6694\" (UID: \"53a7b226-48b5-4c3c-ba60-fe472d7c6694\") " Jan 28 20:58:43 crc kubenswrapper[4746]: I0128 20:58:43.672035 4746 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-7jn7w\" (UniqueName: \"kubernetes.io/projected/d587d573-77d2-41a6-a9c9-3cf63b24512d-kube-api-access-7jn7w\") pod \"d587d573-77d2-41a6-a9c9-3cf63b24512d\" (UID: \"d587d573-77d2-41a6-a9c9-3cf63b24512d\") " Jan 28 20:58:43 crc kubenswrapper[4746]: I0128 20:58:43.672136 4746 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/53a7b226-48b5-4c3c-ba60-fe472d7c6694-operator-scripts\") pod \"53a7b226-48b5-4c3c-ba60-fe472d7c6694\" (UID: \"53a7b226-48b5-4c3c-ba60-fe472d7c6694\") " Jan 28 20:58:43 crc kubenswrapper[4746]: I0128 20:58:43.672195 4746 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/d587d573-77d2-41a6-a9c9-3cf63b24512d-operator-scripts\") pod \"d587d573-77d2-41a6-a9c9-3cf63b24512d\" (UID: \"d587d573-77d2-41a6-a9c9-3cf63b24512d\") " Jan 28 20:58:43 crc kubenswrapper[4746]: I0128 20:58:43.673019 4746 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/d587d573-77d2-41a6-a9c9-3cf63b24512d-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "d587d573-77d2-41a6-a9c9-3cf63b24512d" (UID: "d587d573-77d2-41a6-a9c9-3cf63b24512d"). InnerVolumeSpecName "operator-scripts". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 28 20:58:43 crc kubenswrapper[4746]: I0128 20:58:43.684361 4746 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/53a7b226-48b5-4c3c-ba60-fe472d7c6694-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "53a7b226-48b5-4c3c-ba60-fe472d7c6694" (UID: "53a7b226-48b5-4c3c-ba60-fe472d7c6694"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 28 20:58:43 crc kubenswrapper[4746]: I0128 20:58:43.686582 4746 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/d587d573-77d2-41a6-a9c9-3cf63b24512d-kube-api-access-7jn7w" (OuterVolumeSpecName: "kube-api-access-7jn7w") pod "d587d573-77d2-41a6-a9c9-3cf63b24512d" (UID: "d587d573-77d2-41a6-a9c9-3cf63b24512d"). InnerVolumeSpecName "kube-api-access-7jn7w". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 28 20:58:43 crc kubenswrapper[4746]: I0128 20:58:43.687179 4746 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/53a7b226-48b5-4c3c-ba60-fe472d7c6694-kube-api-access-spsgd" (OuterVolumeSpecName: "kube-api-access-spsgd") pod "53a7b226-48b5-4c3c-ba60-fe472d7c6694" (UID: "53a7b226-48b5-4c3c-ba60-fe472d7c6694"). InnerVolumeSpecName "kube-api-access-spsgd". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 28 20:58:43 crc kubenswrapper[4746]: I0128 20:58:43.773710 4746 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/5d335033-aade-4271-ae71-4bb277438111-operator-scripts\") pod \"5d335033-aade-4271-ae71-4bb277438111\" (UID: \"5d335033-aade-4271-ae71-4bb277438111\") " Jan 28 20:58:43 crc kubenswrapper[4746]: I0128 20:58:43.773776 4746 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-4729z\" (UniqueName: \"kubernetes.io/projected/95d6c762-62af-4a0e-bbb9-af154d84b913-kube-api-access-4729z\") pod \"95d6c762-62af-4a0e-bbb9-af154d84b913\" (UID: \"95d6c762-62af-4a0e-bbb9-af154d84b913\") " Jan 28 20:58:43 crc kubenswrapper[4746]: I0128 20:58:43.773847 4746 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/95d6c762-62af-4a0e-bbb9-af154d84b913-operator-scripts\") pod \"95d6c762-62af-4a0e-bbb9-af154d84b913\" (UID: \"95d6c762-62af-4a0e-bbb9-af154d84b913\") " Jan 28 20:58:43 crc kubenswrapper[4746]: I0128 20:58:43.773941 4746 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-q5s8l\" (UniqueName: \"kubernetes.io/projected/5d335033-aade-4271-ae71-4bb277438111-kube-api-access-q5s8l\") pod \"5d335033-aade-4271-ae71-4bb277438111\" (UID: \"5d335033-aade-4271-ae71-4bb277438111\") " Jan 28 20:58:43 crc kubenswrapper[4746]: I0128 20:58:43.774267 4746 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/5d335033-aade-4271-ae71-4bb277438111-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "5d335033-aade-4271-ae71-4bb277438111" (UID: "5d335033-aade-4271-ae71-4bb277438111"). InnerVolumeSpecName "operator-scripts". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 28 20:58:43 crc kubenswrapper[4746]: I0128 20:58:43.774646 4746 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/95d6c762-62af-4a0e-bbb9-af154d84b913-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "95d6c762-62af-4a0e-bbb9-af154d84b913" (UID: "95d6c762-62af-4a0e-bbb9-af154d84b913"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 28 20:58:43 crc kubenswrapper[4746]: I0128 20:58:43.774945 4746 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-spsgd\" (UniqueName: \"kubernetes.io/projected/53a7b226-48b5-4c3c-ba60-fe472d7c6694-kube-api-access-spsgd\") on node \"crc\" DevicePath \"\"" Jan 28 20:58:43 crc kubenswrapper[4746]: I0128 20:58:43.774965 4746 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/95d6c762-62af-4a0e-bbb9-af154d84b913-operator-scripts\") on node \"crc\" DevicePath \"\"" Jan 28 20:58:43 crc kubenswrapper[4746]: I0128 20:58:43.774975 4746 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-7jn7w\" (UniqueName: \"kubernetes.io/projected/d587d573-77d2-41a6-a9c9-3cf63b24512d-kube-api-access-7jn7w\") on node \"crc\" DevicePath \"\"" Jan 28 20:58:43 crc kubenswrapper[4746]: I0128 20:58:43.774985 4746 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/53a7b226-48b5-4c3c-ba60-fe472d7c6694-operator-scripts\") on node \"crc\" DevicePath \"\"" Jan 28 20:58:43 crc kubenswrapper[4746]: I0128 20:58:43.774994 4746 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/d587d573-77d2-41a6-a9c9-3cf63b24512d-operator-scripts\") on node \"crc\" DevicePath \"\"" Jan 28 20:58:43 crc kubenswrapper[4746]: I0128 20:58:43.775002 4746 reconciler_common.go:293] "Volume detached for 
volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/5d335033-aade-4271-ae71-4bb277438111-operator-scripts\") on node \"crc\" DevicePath \"\"" Jan 28 20:58:43 crc kubenswrapper[4746]: I0128 20:58:43.777641 4746 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/95d6c762-62af-4a0e-bbb9-af154d84b913-kube-api-access-4729z" (OuterVolumeSpecName: "kube-api-access-4729z") pod "95d6c762-62af-4a0e-bbb9-af154d84b913" (UID: "95d6c762-62af-4a0e-bbb9-af154d84b913"). InnerVolumeSpecName "kube-api-access-4729z". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 28 20:58:43 crc kubenswrapper[4746]: I0128 20:58:43.777811 4746 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/5d335033-aade-4271-ae71-4bb277438111-kube-api-access-q5s8l" (OuterVolumeSpecName: "kube-api-access-q5s8l") pod "5d335033-aade-4271-ae71-4bb277438111" (UID: "5d335033-aade-4271-ae71-4bb277438111"). InnerVolumeSpecName "kube-api-access-q5s8l". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 28 20:58:43 crc kubenswrapper[4746]: I0128 20:58:43.798460 4746 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-db-create-dlfdf" event={"ID":"5d335033-aade-4271-ae71-4bb277438111","Type":"ContainerDied","Data":"2f9ca91f55d34efc25d3332edca4dbbfccf3919698d62c7f837018ab74b20e4b"} Jan 28 20:58:43 crc kubenswrapper[4746]: I0128 20:58:43.798511 4746 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="2f9ca91f55d34efc25d3332edca4dbbfccf3919698d62c7f837018ab74b20e4b" Jan 28 20:58:43 crc kubenswrapper[4746]: I0128 20:58:43.798522 4746 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/keystone-db-create-dlfdf" Jan 28 20:58:43 crc kubenswrapper[4746]: I0128 20:58:43.800712 4746 generic.go:334] "Generic (PLEG): container finished" podID="eb3fedce-560c-4a95-860a-fedc72ad4d04" containerID="5efd43f67cdb780cefb6d4e7f5b9be112a6ae66e6f4fa4a140595b1baac5638d" exitCode=0 Jan 28 20:58:43 crc kubenswrapper[4746]: I0128 20:58:43.800785 4746 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/root-account-create-update-hhpql" event={"ID":"eb3fedce-560c-4a95-860a-fedc72ad4d04","Type":"ContainerDied","Data":"5efd43f67cdb780cefb6d4e7f5b9be112a6ae66e6f4fa4a140595b1baac5638d"} Jan 28 20:58:43 crc kubenswrapper[4746]: I0128 20:58:43.800810 4746 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/root-account-create-update-hhpql" event={"ID":"eb3fedce-560c-4a95-860a-fedc72ad4d04","Type":"ContainerStarted","Data":"97b847ae7520bffba20c22f9bddd03c659c98066aaa9b579a9fc042deda33c01"} Jan 28 20:58:43 crc kubenswrapper[4746]: I0128 20:58:43.810225 4746 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-20b2-account-create-update-6nbn5" event={"ID":"95d6c762-62af-4a0e-bbb9-af154d84b913","Type":"ContainerDied","Data":"486ce78ac89b281a51f2829209528be0d12420f3b58f4875805da5e7f306e7e7"} Jan 28 20:58:43 crc kubenswrapper[4746]: I0128 20:58:43.810274 4746 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="486ce78ac89b281a51f2829209528be0d12420f3b58f4875805da5e7f306e7e7" Jan 28 20:58:43 crc kubenswrapper[4746]: I0128 20:58:43.810284 4746 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/placement-20b2-account-create-update-6nbn5" Jan 28 20:58:43 crc kubenswrapper[4746]: I0128 20:58:43.821290 4746 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/keystone-350d-account-create-update-q5qhn" Jan 28 20:58:43 crc kubenswrapper[4746]: I0128 20:58:43.823195 4746 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-350d-account-create-update-q5qhn" event={"ID":"d587d573-77d2-41a6-a9c9-3cf63b24512d","Type":"ContainerDied","Data":"229f8ed31081767179b20914f7dbed3f1c55965b79494282e19251ec19bf368a"} Jan 28 20:58:43 crc kubenswrapper[4746]: I0128 20:58:43.823258 4746 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="229f8ed31081767179b20914f7dbed3f1c55965b79494282e19251ec19bf368a" Jan 28 20:58:43 crc kubenswrapper[4746]: I0128 20:58:43.830368 4746 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-2435-account-create-update-bm9tc" event={"ID":"53a7b226-48b5-4c3c-ba60-fe472d7c6694","Type":"ContainerDied","Data":"77c66b5040e35894d3f957a92b9634968c7ca9df690cf73fcf17a10e625b2c6c"} Jan 28 20:58:43 crc kubenswrapper[4746]: I0128 20:58:43.830419 4746 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="77c66b5040e35894d3f957a92b9634968c7ca9df690cf73fcf17a10e625b2c6c" Jan 28 20:58:43 crc kubenswrapper[4746]: I0128 20:58:43.830505 4746 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/glance-2435-account-create-update-bm9tc" Jan 28 20:58:43 crc kubenswrapper[4746]: I0128 20:58:43.835956 4746 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/glance-db-create-4wrhd" Jan 28 20:58:43 crc kubenswrapper[4746]: I0128 20:58:43.835969 4746 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-db-create-4wrhd" event={"ID":"617495c8-6e95-4b00-a9ae-8a89fdf3eb3f","Type":"ContainerDied","Data":"8dc33b68939ae4490a995b8c7d3a0478a3f9948d9e16f2440d19f728d2d2b899"} Jan 28 20:58:43 crc kubenswrapper[4746]: I0128 20:58:43.836314 4746 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="8dc33b68939ae4490a995b8c7d3a0478a3f9948d9e16f2440d19f728d2d2b899" Jan 28 20:58:43 crc kubenswrapper[4746]: I0128 20:58:43.879697 4746 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-q5s8l\" (UniqueName: \"kubernetes.io/projected/5d335033-aade-4271-ae71-4bb277438111-kube-api-access-q5s8l\") on node \"crc\" DevicePath \"\"" Jan 28 20:58:43 crc kubenswrapper[4746]: I0128 20:58:43.879738 4746 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-4729z\" (UniqueName: \"kubernetes.io/projected/95d6c762-62af-4a0e-bbb9-af154d84b913-kube-api-access-4729z\") on node \"crc\" DevicePath \"\"" Jan 28 20:58:44 crc kubenswrapper[4746]: I0128 20:58:44.379009 4746 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/ovn-controller-ms9wc" podUID="754a9c43-4753-41cd-945d-93f7fa2b715e" containerName="ovn-controller" probeResult="failure" output=< Jan 28 20:58:44 crc kubenswrapper[4746]: ERROR - ovn-controller connection status is 'not connected', expecting 'connected' status Jan 28 20:58:44 crc kubenswrapper[4746]: > Jan 28 20:58:44 crc kubenswrapper[4746]: I0128 20:58:44.384138 4746 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/ovn-controller-ovs-fcvh6" Jan 28 20:58:44 crc kubenswrapper[4746]: I0128 20:58:44.389339 4746 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/ovn-controller-ovs-fcvh6" Jan 28 20:58:44 crc kubenswrapper[4746]: I0128 
20:58:44.647320 4746 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ovn-controller-ms9wc-config-t7qwd"] Jan 28 20:58:44 crc kubenswrapper[4746]: E0128 20:58:44.647781 4746 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="95d6c762-62af-4a0e-bbb9-af154d84b913" containerName="mariadb-account-create-update" Jan 28 20:58:44 crc kubenswrapper[4746]: I0128 20:58:44.647806 4746 state_mem.go:107] "Deleted CPUSet assignment" podUID="95d6c762-62af-4a0e-bbb9-af154d84b913" containerName="mariadb-account-create-update" Jan 28 20:58:44 crc kubenswrapper[4746]: E0128 20:58:44.647826 4746 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="617495c8-6e95-4b00-a9ae-8a89fdf3eb3f" containerName="mariadb-database-create" Jan 28 20:58:44 crc kubenswrapper[4746]: I0128 20:58:44.647834 4746 state_mem.go:107] "Deleted CPUSet assignment" podUID="617495c8-6e95-4b00-a9ae-8a89fdf3eb3f" containerName="mariadb-database-create" Jan 28 20:58:44 crc kubenswrapper[4746]: E0128 20:58:44.647846 4746 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="d587d573-77d2-41a6-a9c9-3cf63b24512d" containerName="mariadb-account-create-update" Jan 28 20:58:44 crc kubenswrapper[4746]: I0128 20:58:44.647854 4746 state_mem.go:107] "Deleted CPUSet assignment" podUID="d587d573-77d2-41a6-a9c9-3cf63b24512d" containerName="mariadb-account-create-update" Jan 28 20:58:44 crc kubenswrapper[4746]: E0128 20:58:44.647865 4746 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="80e2e7c1-645d-4709-b83e-c5604fcc4dfe" containerName="mariadb-database-create" Jan 28 20:58:44 crc kubenswrapper[4746]: I0128 20:58:44.647872 4746 state_mem.go:107] "Deleted CPUSet assignment" podUID="80e2e7c1-645d-4709-b83e-c5604fcc4dfe" containerName="mariadb-database-create" Jan 28 20:58:44 crc kubenswrapper[4746]: E0128 20:58:44.647884 4746 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="53a7b226-48b5-4c3c-ba60-fe472d7c6694" 
containerName="mariadb-account-create-update" Jan 28 20:58:44 crc kubenswrapper[4746]: I0128 20:58:44.647892 4746 state_mem.go:107] "Deleted CPUSet assignment" podUID="53a7b226-48b5-4c3c-ba60-fe472d7c6694" containerName="mariadb-account-create-update" Jan 28 20:58:44 crc kubenswrapper[4746]: E0128 20:58:44.647905 4746 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="5d335033-aade-4271-ae71-4bb277438111" containerName="mariadb-database-create" Jan 28 20:58:44 crc kubenswrapper[4746]: I0128 20:58:44.647912 4746 state_mem.go:107] "Deleted CPUSet assignment" podUID="5d335033-aade-4271-ae71-4bb277438111" containerName="mariadb-database-create" Jan 28 20:58:44 crc kubenswrapper[4746]: I0128 20:58:44.648173 4746 memory_manager.go:354] "RemoveStaleState removing state" podUID="617495c8-6e95-4b00-a9ae-8a89fdf3eb3f" containerName="mariadb-database-create" Jan 28 20:58:44 crc kubenswrapper[4746]: I0128 20:58:44.648193 4746 memory_manager.go:354] "RemoveStaleState removing state" podUID="d587d573-77d2-41a6-a9c9-3cf63b24512d" containerName="mariadb-account-create-update" Jan 28 20:58:44 crc kubenswrapper[4746]: I0128 20:58:44.648213 4746 memory_manager.go:354] "RemoveStaleState removing state" podUID="80e2e7c1-645d-4709-b83e-c5604fcc4dfe" containerName="mariadb-database-create" Jan 28 20:58:44 crc kubenswrapper[4746]: I0128 20:58:44.648225 4746 memory_manager.go:354] "RemoveStaleState removing state" podUID="95d6c762-62af-4a0e-bbb9-af154d84b913" containerName="mariadb-account-create-update" Jan 28 20:58:44 crc kubenswrapper[4746]: I0128 20:58:44.648235 4746 memory_manager.go:354] "RemoveStaleState removing state" podUID="53a7b226-48b5-4c3c-ba60-fe472d7c6694" containerName="mariadb-account-create-update" Jan 28 20:58:44 crc kubenswrapper[4746]: I0128 20:58:44.648250 4746 memory_manager.go:354] "RemoveStaleState removing state" podUID="5d335033-aade-4271-ae71-4bb277438111" containerName="mariadb-database-create" Jan 28 20:58:44 crc kubenswrapper[4746]: I0128 
20:58:44.649190 4746 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ovn-controller-ms9wc-config-t7qwd" Jan 28 20:58:44 crc kubenswrapper[4746]: I0128 20:58:44.652612 4746 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"ovncontroller-extra-scripts" Jan 28 20:58:44 crc kubenswrapper[4746]: I0128 20:58:44.667918 4746 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovn-controller-ms9wc-config-t7qwd"] Jan 28 20:58:44 crc kubenswrapper[4746]: I0128 20:58:44.701264 4746 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-log-ovn\" (UniqueName: \"kubernetes.io/host-path/54c7c3a5-c7c4-499b-8ded-2a2341a7d928-var-log-ovn\") pod \"ovn-controller-ms9wc-config-t7qwd\" (UID: \"54c7c3a5-c7c4-499b-8ded-2a2341a7d928\") " pod="openstack/ovn-controller-ms9wc-config-t7qwd" Jan 28 20:58:44 crc kubenswrapper[4746]: I0128 20:58:44.701357 4746 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/54c7c3a5-c7c4-499b-8ded-2a2341a7d928-scripts\") pod \"ovn-controller-ms9wc-config-t7qwd\" (UID: \"54c7c3a5-c7c4-499b-8ded-2a2341a7d928\") " pod="openstack/ovn-controller-ms9wc-config-t7qwd" Jan 28 20:58:44 crc kubenswrapper[4746]: I0128 20:58:44.701387 4746 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-run-ovn\" (UniqueName: \"kubernetes.io/host-path/54c7c3a5-c7c4-499b-8ded-2a2341a7d928-var-run-ovn\") pod \"ovn-controller-ms9wc-config-t7qwd\" (UID: \"54c7c3a5-c7c4-499b-8ded-2a2341a7d928\") " pod="openstack/ovn-controller-ms9wc-config-t7qwd" Jan 28 20:58:44 crc kubenswrapper[4746]: I0128 20:58:44.701439 4746 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-run\" (UniqueName: \"kubernetes.io/host-path/54c7c3a5-c7c4-499b-8ded-2a2341a7d928-var-run\") pod 
\"ovn-controller-ms9wc-config-t7qwd\" (UID: \"54c7c3a5-c7c4-499b-8ded-2a2341a7d928\") " pod="openstack/ovn-controller-ms9wc-config-t7qwd" Jan 28 20:58:44 crc kubenswrapper[4746]: I0128 20:58:44.701531 4746 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"additional-scripts\" (UniqueName: \"kubernetes.io/configmap/54c7c3a5-c7c4-499b-8ded-2a2341a7d928-additional-scripts\") pod \"ovn-controller-ms9wc-config-t7qwd\" (UID: \"54c7c3a5-c7c4-499b-8ded-2a2341a7d928\") " pod="openstack/ovn-controller-ms9wc-config-t7qwd" Jan 28 20:58:44 crc kubenswrapper[4746]: I0128 20:58:44.701554 4746 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-j7mb5\" (UniqueName: \"kubernetes.io/projected/54c7c3a5-c7c4-499b-8ded-2a2341a7d928-kube-api-access-j7mb5\") pod \"ovn-controller-ms9wc-config-t7qwd\" (UID: \"54c7c3a5-c7c4-499b-8ded-2a2341a7d928\") " pod="openstack/ovn-controller-ms9wc-config-t7qwd" Jan 28 20:58:44 crc kubenswrapper[4746]: I0128 20:58:44.802807 4746 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-run\" (UniqueName: \"kubernetes.io/host-path/54c7c3a5-c7c4-499b-8ded-2a2341a7d928-var-run\") pod \"ovn-controller-ms9wc-config-t7qwd\" (UID: \"54c7c3a5-c7c4-499b-8ded-2a2341a7d928\") " pod="openstack/ovn-controller-ms9wc-config-t7qwd" Jan 28 20:58:44 crc kubenswrapper[4746]: I0128 20:58:44.802923 4746 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"additional-scripts\" (UniqueName: \"kubernetes.io/configmap/54c7c3a5-c7c4-499b-8ded-2a2341a7d928-additional-scripts\") pod \"ovn-controller-ms9wc-config-t7qwd\" (UID: \"54c7c3a5-c7c4-499b-8ded-2a2341a7d928\") " pod="openstack/ovn-controller-ms9wc-config-t7qwd" Jan 28 20:58:44 crc kubenswrapper[4746]: I0128 20:58:44.802946 4746 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-j7mb5\" (UniqueName: 
\"kubernetes.io/projected/54c7c3a5-c7c4-499b-8ded-2a2341a7d928-kube-api-access-j7mb5\") pod \"ovn-controller-ms9wc-config-t7qwd\" (UID: \"54c7c3a5-c7c4-499b-8ded-2a2341a7d928\") " pod="openstack/ovn-controller-ms9wc-config-t7qwd" Jan 28 20:58:44 crc kubenswrapper[4746]: I0128 20:58:44.802977 4746 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-log-ovn\" (UniqueName: \"kubernetes.io/host-path/54c7c3a5-c7c4-499b-8ded-2a2341a7d928-var-log-ovn\") pod \"ovn-controller-ms9wc-config-t7qwd\" (UID: \"54c7c3a5-c7c4-499b-8ded-2a2341a7d928\") " pod="openstack/ovn-controller-ms9wc-config-t7qwd" Jan 28 20:58:44 crc kubenswrapper[4746]: I0128 20:58:44.803026 4746 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/54c7c3a5-c7c4-499b-8ded-2a2341a7d928-scripts\") pod \"ovn-controller-ms9wc-config-t7qwd\" (UID: \"54c7c3a5-c7c4-499b-8ded-2a2341a7d928\") " pod="openstack/ovn-controller-ms9wc-config-t7qwd" Jan 28 20:58:44 crc kubenswrapper[4746]: I0128 20:58:44.803056 4746 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-run-ovn\" (UniqueName: \"kubernetes.io/host-path/54c7c3a5-c7c4-499b-8ded-2a2341a7d928-var-run-ovn\") pod \"ovn-controller-ms9wc-config-t7qwd\" (UID: \"54c7c3a5-c7c4-499b-8ded-2a2341a7d928\") " pod="openstack/ovn-controller-ms9wc-config-t7qwd" Jan 28 20:58:44 crc kubenswrapper[4746]: I0128 20:58:44.803227 4746 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-run\" (UniqueName: \"kubernetes.io/host-path/54c7c3a5-c7c4-499b-8ded-2a2341a7d928-var-run\") pod \"ovn-controller-ms9wc-config-t7qwd\" (UID: \"54c7c3a5-c7c4-499b-8ded-2a2341a7d928\") " pod="openstack/ovn-controller-ms9wc-config-t7qwd" Jan 28 20:58:44 crc kubenswrapper[4746]: I0128 20:58:44.803254 4746 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-run-ovn\" (UniqueName: 
\"kubernetes.io/host-path/54c7c3a5-c7c4-499b-8ded-2a2341a7d928-var-run-ovn\") pod \"ovn-controller-ms9wc-config-t7qwd\" (UID: \"54c7c3a5-c7c4-499b-8ded-2a2341a7d928\") " pod="openstack/ovn-controller-ms9wc-config-t7qwd" Jan 28 20:58:44 crc kubenswrapper[4746]: I0128 20:58:44.803265 4746 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-log-ovn\" (UniqueName: \"kubernetes.io/host-path/54c7c3a5-c7c4-499b-8ded-2a2341a7d928-var-log-ovn\") pod \"ovn-controller-ms9wc-config-t7qwd\" (UID: \"54c7c3a5-c7c4-499b-8ded-2a2341a7d928\") " pod="openstack/ovn-controller-ms9wc-config-t7qwd" Jan 28 20:58:44 crc kubenswrapper[4746]: I0128 20:58:44.803831 4746 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"additional-scripts\" (UniqueName: \"kubernetes.io/configmap/54c7c3a5-c7c4-499b-8ded-2a2341a7d928-additional-scripts\") pod \"ovn-controller-ms9wc-config-t7qwd\" (UID: \"54c7c3a5-c7c4-499b-8ded-2a2341a7d928\") " pod="openstack/ovn-controller-ms9wc-config-t7qwd" Jan 28 20:58:44 crc kubenswrapper[4746]: I0128 20:58:44.805305 4746 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/54c7c3a5-c7c4-499b-8ded-2a2341a7d928-scripts\") pod \"ovn-controller-ms9wc-config-t7qwd\" (UID: \"54c7c3a5-c7c4-499b-8ded-2a2341a7d928\") " pod="openstack/ovn-controller-ms9wc-config-t7qwd" Jan 28 20:58:44 crc kubenswrapper[4746]: I0128 20:58:44.827837 4746 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-j7mb5\" (UniqueName: \"kubernetes.io/projected/54c7c3a5-c7c4-499b-8ded-2a2341a7d928-kube-api-access-j7mb5\") pod \"ovn-controller-ms9wc-config-t7qwd\" (UID: \"54c7c3a5-c7c4-499b-8ded-2a2341a7d928\") " pod="openstack/ovn-controller-ms9wc-config-t7qwd" Jan 28 20:58:44 crc kubenswrapper[4746]: I0128 20:58:44.965465 4746 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/ovn-controller-ms9wc-config-t7qwd" Jan 28 20:58:45 crc kubenswrapper[4746]: I0128 20:58:45.281200 4746 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/root-account-create-update-hhpql" Jan 28 20:58:45 crc kubenswrapper[4746]: I0128 20:58:45.414278 4746 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/eb3fedce-560c-4a95-860a-fedc72ad4d04-operator-scripts\") pod \"eb3fedce-560c-4a95-860a-fedc72ad4d04\" (UID: \"eb3fedce-560c-4a95-860a-fedc72ad4d04\") " Jan 28 20:58:45 crc kubenswrapper[4746]: I0128 20:58:45.414410 4746 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-9zdtk\" (UniqueName: \"kubernetes.io/projected/eb3fedce-560c-4a95-860a-fedc72ad4d04-kube-api-access-9zdtk\") pod \"eb3fedce-560c-4a95-860a-fedc72ad4d04\" (UID: \"eb3fedce-560c-4a95-860a-fedc72ad4d04\") " Jan 28 20:58:45 crc kubenswrapper[4746]: I0128 20:58:45.415200 4746 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/eb3fedce-560c-4a95-860a-fedc72ad4d04-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "eb3fedce-560c-4a95-860a-fedc72ad4d04" (UID: "eb3fedce-560c-4a95-860a-fedc72ad4d04"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 28 20:58:45 crc kubenswrapper[4746]: I0128 20:58:45.436287 4746 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/eb3fedce-560c-4a95-860a-fedc72ad4d04-kube-api-access-9zdtk" (OuterVolumeSpecName: "kube-api-access-9zdtk") pod "eb3fedce-560c-4a95-860a-fedc72ad4d04" (UID: "eb3fedce-560c-4a95-860a-fedc72ad4d04"). InnerVolumeSpecName "kube-api-access-9zdtk". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 28 20:58:45 crc kubenswrapper[4746]: I0128 20:58:45.493283 4746 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovn-controller-ms9wc-config-t7qwd"] Jan 28 20:58:45 crc kubenswrapper[4746]: I0128 20:58:45.516496 4746 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/eb3fedce-560c-4a95-860a-fedc72ad4d04-operator-scripts\") on node \"crc\" DevicePath \"\"" Jan 28 20:58:45 crc kubenswrapper[4746]: I0128 20:58:45.516557 4746 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-9zdtk\" (UniqueName: \"kubernetes.io/projected/eb3fedce-560c-4a95-860a-fedc72ad4d04-kube-api-access-9zdtk\") on node \"crc\" DevicePath \"\"" Jan 28 20:58:45 crc kubenswrapper[4746]: I0128 20:58:45.865904 4746 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-controller-ms9wc-config-t7qwd" event={"ID":"54c7c3a5-c7c4-499b-8ded-2a2341a7d928","Type":"ContainerStarted","Data":"d4a7991e03a107fe03667d4568bdff822eccb58ad8a1254b56c2585d609c36b0"} Jan 28 20:58:45 crc kubenswrapper[4746]: I0128 20:58:45.866647 4746 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-controller-ms9wc-config-t7qwd" event={"ID":"54c7c3a5-c7c4-499b-8ded-2a2341a7d928","Type":"ContainerStarted","Data":"135a6bcb81886df2f7909d9bbaaa999ddf19ed694f4b70e173c93451365476dc"} Jan 28 20:58:45 crc kubenswrapper[4746]: I0128 20:58:45.871814 4746 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/root-account-create-update-hhpql" event={"ID":"eb3fedce-560c-4a95-860a-fedc72ad4d04","Type":"ContainerDied","Data":"97b847ae7520bffba20c22f9bddd03c659c98066aaa9b579a9fc042deda33c01"} Jan 28 20:58:45 crc kubenswrapper[4746]: I0128 20:58:45.871852 4746 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="97b847ae7520bffba20c22f9bddd03c659c98066aaa9b579a9fc042deda33c01" Jan 28 20:58:45 crc kubenswrapper[4746]: I0128 
20:58:45.871928 4746 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/root-account-create-update-hhpql" Jan 28 20:58:45 crc kubenswrapper[4746]: I0128 20:58:45.887661 4746 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/ovn-controller-ms9wc-config-t7qwd" podStartSLOduration=1.887640615 podStartE2EDuration="1.887640615s" podCreationTimestamp="2026-01-28 20:58:44 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-28 20:58:45.882633011 +0000 UTC m=+1153.838819375" watchObservedRunningTime="2026-01-28 20:58:45.887640615 +0000 UTC m=+1153.843826969" Jan 28 20:58:46 crc kubenswrapper[4746]: I0128 20:58:46.306307 4746 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/dnsmasq-dns-698758b865-pdnv6" Jan 28 20:58:46 crc kubenswrapper[4746]: I0128 20:58:46.385741 4746 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-57d769cc4f-vfmkg"] Jan 28 20:58:46 crc kubenswrapper[4746]: I0128 20:58:46.386331 4746 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/dnsmasq-dns-57d769cc4f-vfmkg" podUID="0e48f2c5-a005-440d-b1d4-885bd3dd4a82" containerName="dnsmasq-dns" containerID="cri-o://c77139041dada346e753825ccb577fd7e0206e3e918bc5faa9fb0bae913ce06a" gracePeriod=10 Jan 28 20:58:46 crc kubenswrapper[4746]: I0128 20:58:46.809267 4746 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/prometheus-metric-storage-0" Jan 28 20:58:46 crc kubenswrapper[4746]: I0128 20:58:46.888041 4746 generic.go:334] "Generic (PLEG): container finished" podID="54c7c3a5-c7c4-499b-8ded-2a2341a7d928" containerID="d4a7991e03a107fe03667d4568bdff822eccb58ad8a1254b56c2585d609c36b0" exitCode=0 Jan 28 20:58:46 crc kubenswrapper[4746]: I0128 20:58:46.888125 4746 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openstack/ovn-controller-ms9wc-config-t7qwd" event={"ID":"54c7c3a5-c7c4-499b-8ded-2a2341a7d928","Type":"ContainerDied","Data":"d4a7991e03a107fe03667d4568bdff822eccb58ad8a1254b56c2585d609c36b0"} Jan 28 20:58:46 crc kubenswrapper[4746]: I0128 20:58:46.890631 4746 generic.go:334] "Generic (PLEG): container finished" podID="0e48f2c5-a005-440d-b1d4-885bd3dd4a82" containerID="c77139041dada346e753825ccb577fd7e0206e3e918bc5faa9fb0bae913ce06a" exitCode=0 Jan 28 20:58:46 crc kubenswrapper[4746]: I0128 20:58:46.890689 4746 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-57d769cc4f-vfmkg" event={"ID":"0e48f2c5-a005-440d-b1d4-885bd3dd4a82","Type":"ContainerDied","Data":"c77139041dada346e753825ccb577fd7e0206e3e918bc5faa9fb0bae913ce06a"} Jan 28 20:58:46 crc kubenswrapper[4746]: I0128 20:58:46.890728 4746 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-57d769cc4f-vfmkg" event={"ID":"0e48f2c5-a005-440d-b1d4-885bd3dd4a82","Type":"ContainerDied","Data":"329fb4ca85ba72bf66c84f9a20f3bfedf90782590cd83726463253a42452a8d8"} Jan 28 20:58:46 crc kubenswrapper[4746]: I0128 20:58:46.890748 4746 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="329fb4ca85ba72bf66c84f9a20f3bfedf90782590cd83726463253a42452a8d8" Jan 28 20:58:46 crc kubenswrapper[4746]: I0128 20:58:46.904504 4746 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-57d769cc4f-vfmkg" Jan 28 20:58:46 crc kubenswrapper[4746]: I0128 20:58:46.913487 4746 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/root-account-create-update-hhpql"] Jan 28 20:58:46 crc kubenswrapper[4746]: I0128 20:58:46.920464 4746 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/root-account-create-update-hhpql"] Jan 28 20:58:46 crc kubenswrapper[4746]: I0128 20:58:46.942734 4746 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-gc6mn\" (UniqueName: \"kubernetes.io/projected/0e48f2c5-a005-440d-b1d4-885bd3dd4a82-kube-api-access-gc6mn\") pod \"0e48f2c5-a005-440d-b1d4-885bd3dd4a82\" (UID: \"0e48f2c5-a005-440d-b1d4-885bd3dd4a82\") " Jan 28 20:58:46 crc kubenswrapper[4746]: I0128 20:58:46.943105 4746 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/0e48f2c5-a005-440d-b1d4-885bd3dd4a82-dns-svc\") pod \"0e48f2c5-a005-440d-b1d4-885bd3dd4a82\" (UID: \"0e48f2c5-a005-440d-b1d4-885bd3dd4a82\") " Jan 28 20:58:46 crc kubenswrapper[4746]: I0128 20:58:46.943298 4746 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/0e48f2c5-a005-440d-b1d4-885bd3dd4a82-config\") pod \"0e48f2c5-a005-440d-b1d4-885bd3dd4a82\" (UID: \"0e48f2c5-a005-440d-b1d4-885bd3dd4a82\") " Jan 28 20:58:46 crc kubenswrapper[4746]: I0128 20:58:46.980723 4746 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/0e48f2c5-a005-440d-b1d4-885bd3dd4a82-kube-api-access-gc6mn" (OuterVolumeSpecName: "kube-api-access-gc6mn") pod "0e48f2c5-a005-440d-b1d4-885bd3dd4a82" (UID: "0e48f2c5-a005-440d-b1d4-885bd3dd4a82"). InnerVolumeSpecName "kube-api-access-gc6mn". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 28 20:58:47 crc kubenswrapper[4746]: I0128 20:58:47.003960 4746 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/0e48f2c5-a005-440d-b1d4-885bd3dd4a82-config" (OuterVolumeSpecName: "config") pod "0e48f2c5-a005-440d-b1d4-885bd3dd4a82" (UID: "0e48f2c5-a005-440d-b1d4-885bd3dd4a82"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 28 20:58:47 crc kubenswrapper[4746]: I0128 20:58:47.005387 4746 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/0e48f2c5-a005-440d-b1d4-885bd3dd4a82-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "0e48f2c5-a005-440d-b1d4-885bd3dd4a82" (UID: "0e48f2c5-a005-440d-b1d4-885bd3dd4a82"). InnerVolumeSpecName "dns-svc". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 28 20:58:47 crc kubenswrapper[4746]: I0128 20:58:47.045322 4746 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-gc6mn\" (UniqueName: \"kubernetes.io/projected/0e48f2c5-a005-440d-b1d4-885bd3dd4a82-kube-api-access-gc6mn\") on node \"crc\" DevicePath \"\"" Jan 28 20:58:47 crc kubenswrapper[4746]: I0128 20:58:47.045350 4746 reconciler_common.go:293] "Volume detached for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/0e48f2c5-a005-440d-b1d4-885bd3dd4a82-dns-svc\") on node \"crc\" DevicePath \"\"" Jan 28 20:58:47 crc kubenswrapper[4746]: I0128 20:58:47.045359 4746 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/0e48f2c5-a005-440d-b1d4-885bd3dd4a82-config\") on node \"crc\" DevicePath \"\"" Jan 28 20:58:47 crc kubenswrapper[4746]: I0128 20:58:47.901392 4746 generic.go:334] "Generic (PLEG): container finished" podID="61a4ff02-ae06-438a-a39c-8264c8e61b38" containerID="8366ad10dfa6f272d73ade37166fb74c82fc244a58136aa35816741924f87c1b" exitCode=0 Jan 28 20:58:47 crc kubenswrapper[4746]: I0128 
20:58:47.901486 4746 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-57d769cc4f-vfmkg" Jan 28 20:58:47 crc kubenswrapper[4746]: I0128 20:58:47.902323 4746 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-ring-rebalance-5sxj6" event={"ID":"61a4ff02-ae06-438a-a39c-8264c8e61b38","Type":"ContainerDied","Data":"8366ad10dfa6f272d73ade37166fb74c82fc244a58136aa35816741924f87c1b"} Jan 28 20:58:47 crc kubenswrapper[4746]: I0128 20:58:47.959769 4746 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-57d769cc4f-vfmkg"] Jan 28 20:58:47 crc kubenswrapper[4746]: I0128 20:58:47.966855 4746 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-57d769cc4f-vfmkg"] Jan 28 20:58:48 crc kubenswrapper[4746]: I0128 20:58:48.297063 4746 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/ovn-controller-ms9wc-config-t7qwd" Jan 28 20:58:48 crc kubenswrapper[4746]: I0128 20:58:48.370449 4746 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-j7mb5\" (UniqueName: \"kubernetes.io/projected/54c7c3a5-c7c4-499b-8ded-2a2341a7d928-kube-api-access-j7mb5\") pod \"54c7c3a5-c7c4-499b-8ded-2a2341a7d928\" (UID: \"54c7c3a5-c7c4-499b-8ded-2a2341a7d928\") " Jan 28 20:58:48 crc kubenswrapper[4746]: I0128 20:58:48.370555 4746 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"var-log-ovn\" (UniqueName: \"kubernetes.io/host-path/54c7c3a5-c7c4-499b-8ded-2a2341a7d928-var-log-ovn\") pod \"54c7c3a5-c7c4-499b-8ded-2a2341a7d928\" (UID: \"54c7c3a5-c7c4-499b-8ded-2a2341a7d928\") " Jan 28 20:58:48 crc kubenswrapper[4746]: I0128 20:58:48.370597 4746 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"additional-scripts\" (UniqueName: \"kubernetes.io/configmap/54c7c3a5-c7c4-499b-8ded-2a2341a7d928-additional-scripts\") pod 
\"54c7c3a5-c7c4-499b-8ded-2a2341a7d928\" (UID: \"54c7c3a5-c7c4-499b-8ded-2a2341a7d928\") " Jan 28 20:58:48 crc kubenswrapper[4746]: I0128 20:58:48.370624 4746 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/54c7c3a5-c7c4-499b-8ded-2a2341a7d928-scripts\") pod \"54c7c3a5-c7c4-499b-8ded-2a2341a7d928\" (UID: \"54c7c3a5-c7c4-499b-8ded-2a2341a7d928\") " Jan 28 20:58:48 crc kubenswrapper[4746]: I0128 20:58:48.370638 4746 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"var-run\" (UniqueName: \"kubernetes.io/host-path/54c7c3a5-c7c4-499b-8ded-2a2341a7d928-var-run\") pod \"54c7c3a5-c7c4-499b-8ded-2a2341a7d928\" (UID: \"54c7c3a5-c7c4-499b-8ded-2a2341a7d928\") " Jan 28 20:58:48 crc kubenswrapper[4746]: I0128 20:58:48.370679 4746 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"var-run-ovn\" (UniqueName: \"kubernetes.io/host-path/54c7c3a5-c7c4-499b-8ded-2a2341a7d928-var-run-ovn\") pod \"54c7c3a5-c7c4-499b-8ded-2a2341a7d928\" (UID: \"54c7c3a5-c7c4-499b-8ded-2a2341a7d928\") " Jan 28 20:58:48 crc kubenswrapper[4746]: I0128 20:58:48.371081 4746 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/54c7c3a5-c7c4-499b-8ded-2a2341a7d928-var-run-ovn" (OuterVolumeSpecName: "var-run-ovn") pod "54c7c3a5-c7c4-499b-8ded-2a2341a7d928" (UID: "54c7c3a5-c7c4-499b-8ded-2a2341a7d928"). InnerVolumeSpecName "var-run-ovn". PluginName "kubernetes.io/host-path", VolumeGidValue "" Jan 28 20:58:48 crc kubenswrapper[4746]: I0128 20:58:48.371440 4746 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/54c7c3a5-c7c4-499b-8ded-2a2341a7d928-var-run" (OuterVolumeSpecName: "var-run") pod "54c7c3a5-c7c4-499b-8ded-2a2341a7d928" (UID: "54c7c3a5-c7c4-499b-8ded-2a2341a7d928"). InnerVolumeSpecName "var-run". 
PluginName "kubernetes.io/host-path", VolumeGidValue "" Jan 28 20:58:48 crc kubenswrapper[4746]: I0128 20:58:48.371520 4746 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/54c7c3a5-c7c4-499b-8ded-2a2341a7d928-var-log-ovn" (OuterVolumeSpecName: "var-log-ovn") pod "54c7c3a5-c7c4-499b-8ded-2a2341a7d928" (UID: "54c7c3a5-c7c4-499b-8ded-2a2341a7d928"). InnerVolumeSpecName "var-log-ovn". PluginName "kubernetes.io/host-path", VolumeGidValue "" Jan 28 20:58:48 crc kubenswrapper[4746]: I0128 20:58:48.372088 4746 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/54c7c3a5-c7c4-499b-8ded-2a2341a7d928-additional-scripts" (OuterVolumeSpecName: "additional-scripts") pod "54c7c3a5-c7c4-499b-8ded-2a2341a7d928" (UID: "54c7c3a5-c7c4-499b-8ded-2a2341a7d928"). InnerVolumeSpecName "additional-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 28 20:58:48 crc kubenswrapper[4746]: I0128 20:58:48.372195 4746 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/54c7c3a5-c7c4-499b-8ded-2a2341a7d928-scripts" (OuterVolumeSpecName: "scripts") pod "54c7c3a5-c7c4-499b-8ded-2a2341a7d928" (UID: "54c7c3a5-c7c4-499b-8ded-2a2341a7d928"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 28 20:58:48 crc kubenswrapper[4746]: I0128 20:58:48.394276 4746 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/54c7c3a5-c7c4-499b-8ded-2a2341a7d928-kube-api-access-j7mb5" (OuterVolumeSpecName: "kube-api-access-j7mb5") pod "54c7c3a5-c7c4-499b-8ded-2a2341a7d928" (UID: "54c7c3a5-c7c4-499b-8ded-2a2341a7d928"). InnerVolumeSpecName "kube-api-access-j7mb5". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 28 20:58:48 crc kubenswrapper[4746]: I0128 20:58:48.472210 4746 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-j7mb5\" (UniqueName: \"kubernetes.io/projected/54c7c3a5-c7c4-499b-8ded-2a2341a7d928-kube-api-access-j7mb5\") on node \"crc\" DevicePath \"\"" Jan 28 20:58:48 crc kubenswrapper[4746]: I0128 20:58:48.472243 4746 reconciler_common.go:293] "Volume detached for volume \"var-log-ovn\" (UniqueName: \"kubernetes.io/host-path/54c7c3a5-c7c4-499b-8ded-2a2341a7d928-var-log-ovn\") on node \"crc\" DevicePath \"\"" Jan 28 20:58:48 crc kubenswrapper[4746]: I0128 20:58:48.472251 4746 reconciler_common.go:293] "Volume detached for volume \"additional-scripts\" (UniqueName: \"kubernetes.io/configmap/54c7c3a5-c7c4-499b-8ded-2a2341a7d928-additional-scripts\") on node \"crc\" DevicePath \"\"" Jan 28 20:58:48 crc kubenswrapper[4746]: I0128 20:58:48.472261 4746 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/54c7c3a5-c7c4-499b-8ded-2a2341a7d928-scripts\") on node \"crc\" DevicePath \"\"" Jan 28 20:58:48 crc kubenswrapper[4746]: I0128 20:58:48.472269 4746 reconciler_common.go:293] "Volume detached for volume \"var-run\" (UniqueName: \"kubernetes.io/host-path/54c7c3a5-c7c4-499b-8ded-2a2341a7d928-var-run\") on node \"crc\" DevicePath \"\"" Jan 28 20:58:48 crc kubenswrapper[4746]: I0128 20:58:48.472276 4746 reconciler_common.go:293] "Volume detached for volume \"var-run-ovn\" (UniqueName: \"kubernetes.io/host-path/54c7c3a5-c7c4-499b-8ded-2a2341a7d928-var-run-ovn\") on node \"crc\" DevicePath \"\"" Jan 28 20:58:48 crc kubenswrapper[4746]: I0128 20:58:48.711446 4746 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/glance-db-sync-dsp4s"] Jan 28 20:58:48 crc kubenswrapper[4746]: E0128 20:58:48.711881 4746 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="54c7c3a5-c7c4-499b-8ded-2a2341a7d928" containerName="ovn-config" 
Jan 28 20:58:48 crc kubenswrapper[4746]: I0128 20:58:48.711898 4746 state_mem.go:107] "Deleted CPUSet assignment" podUID="54c7c3a5-c7c4-499b-8ded-2a2341a7d928" containerName="ovn-config" Jan 28 20:58:48 crc kubenswrapper[4746]: E0128 20:58:48.711909 4746 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="0e48f2c5-a005-440d-b1d4-885bd3dd4a82" containerName="init" Jan 28 20:58:48 crc kubenswrapper[4746]: I0128 20:58:48.711915 4746 state_mem.go:107] "Deleted CPUSet assignment" podUID="0e48f2c5-a005-440d-b1d4-885bd3dd4a82" containerName="init" Jan 28 20:58:48 crc kubenswrapper[4746]: E0128 20:58:48.711926 4746 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="0e48f2c5-a005-440d-b1d4-885bd3dd4a82" containerName="dnsmasq-dns" Jan 28 20:58:48 crc kubenswrapper[4746]: I0128 20:58:48.711933 4746 state_mem.go:107] "Deleted CPUSet assignment" podUID="0e48f2c5-a005-440d-b1d4-885bd3dd4a82" containerName="dnsmasq-dns" Jan 28 20:58:48 crc kubenswrapper[4746]: E0128 20:58:48.711946 4746 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="eb3fedce-560c-4a95-860a-fedc72ad4d04" containerName="mariadb-account-create-update" Jan 28 20:58:48 crc kubenswrapper[4746]: I0128 20:58:48.711951 4746 state_mem.go:107] "Deleted CPUSet assignment" podUID="eb3fedce-560c-4a95-860a-fedc72ad4d04" containerName="mariadb-account-create-update" Jan 28 20:58:48 crc kubenswrapper[4746]: I0128 20:58:48.712122 4746 memory_manager.go:354] "RemoveStaleState removing state" podUID="54c7c3a5-c7c4-499b-8ded-2a2341a7d928" containerName="ovn-config" Jan 28 20:58:48 crc kubenswrapper[4746]: I0128 20:58:48.712139 4746 memory_manager.go:354] "RemoveStaleState removing state" podUID="0e48f2c5-a005-440d-b1d4-885bd3dd4a82" containerName="dnsmasq-dns" Jan 28 20:58:48 crc kubenswrapper[4746]: I0128 20:58:48.712153 4746 memory_manager.go:354] "RemoveStaleState removing state" podUID="eb3fedce-560c-4a95-860a-fedc72ad4d04" containerName="mariadb-account-create-update" Jan 28 
20:58:48 crc kubenswrapper[4746]: I0128 20:58:48.712759 4746 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/glance-db-sync-dsp4s" Jan 28 20:58:48 crc kubenswrapper[4746]: I0128 20:58:48.715633 4746 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"glance-config-data" Jan 28 20:58:48 crc kubenswrapper[4746]: I0128 20:58:48.715872 4746 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"glance-glance-dockercfg-sx27z" Jan 28 20:58:48 crc kubenswrapper[4746]: I0128 20:58:48.724905 4746 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-db-sync-dsp4s"] Jan 28 20:58:48 crc kubenswrapper[4746]: I0128 20:58:48.777669 4746 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-r6qc5\" (UniqueName: \"kubernetes.io/projected/faabc487-475c-4f5b-b135-5a96d1ed9269-kube-api-access-r6qc5\") pod \"glance-db-sync-dsp4s\" (UID: \"faabc487-475c-4f5b-b135-5a96d1ed9269\") " pod="openstack/glance-db-sync-dsp4s" Jan 28 20:58:48 crc kubenswrapper[4746]: I0128 20:58:48.777745 4746 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/faabc487-475c-4f5b-b135-5a96d1ed9269-combined-ca-bundle\") pod \"glance-db-sync-dsp4s\" (UID: \"faabc487-475c-4f5b-b135-5a96d1ed9269\") " pod="openstack/glance-db-sync-dsp4s" Jan 28 20:58:48 crc kubenswrapper[4746]: I0128 20:58:48.777815 4746 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/faabc487-475c-4f5b-b135-5a96d1ed9269-db-sync-config-data\") pod \"glance-db-sync-dsp4s\" (UID: \"faabc487-475c-4f5b-b135-5a96d1ed9269\") " pod="openstack/glance-db-sync-dsp4s" Jan 28 20:58:48 crc kubenswrapper[4746]: I0128 20:58:48.777843 4746 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/faabc487-475c-4f5b-b135-5a96d1ed9269-config-data\") pod \"glance-db-sync-dsp4s\" (UID: \"faabc487-475c-4f5b-b135-5a96d1ed9269\") " pod="openstack/glance-db-sync-dsp4s" Jan 28 20:58:48 crc kubenswrapper[4746]: I0128 20:58:48.849891 4746 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="0e48f2c5-a005-440d-b1d4-885bd3dd4a82" path="/var/lib/kubelet/pods/0e48f2c5-a005-440d-b1d4-885bd3dd4a82/volumes" Jan 28 20:58:48 crc kubenswrapper[4746]: I0128 20:58:48.850966 4746 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="eb3fedce-560c-4a95-860a-fedc72ad4d04" path="/var/lib/kubelet/pods/eb3fedce-560c-4a95-860a-fedc72ad4d04/volumes" Jan 28 20:58:48 crc kubenswrapper[4746]: I0128 20:58:48.879652 4746 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-r6qc5\" (UniqueName: \"kubernetes.io/projected/faabc487-475c-4f5b-b135-5a96d1ed9269-kube-api-access-r6qc5\") pod \"glance-db-sync-dsp4s\" (UID: \"faabc487-475c-4f5b-b135-5a96d1ed9269\") " pod="openstack/glance-db-sync-dsp4s" Jan 28 20:58:48 crc kubenswrapper[4746]: I0128 20:58:48.879709 4746 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/faabc487-475c-4f5b-b135-5a96d1ed9269-combined-ca-bundle\") pod \"glance-db-sync-dsp4s\" (UID: \"faabc487-475c-4f5b-b135-5a96d1ed9269\") " pod="openstack/glance-db-sync-dsp4s" Jan 28 20:58:48 crc kubenswrapper[4746]: I0128 20:58:48.879749 4746 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/faabc487-475c-4f5b-b135-5a96d1ed9269-db-sync-config-data\") pod \"glance-db-sync-dsp4s\" (UID: \"faabc487-475c-4f5b-b135-5a96d1ed9269\") " pod="openstack/glance-db-sync-dsp4s" Jan 28 20:58:48 crc kubenswrapper[4746]: I0128 20:58:48.879767 
4746 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/faabc487-475c-4f5b-b135-5a96d1ed9269-config-data\") pod \"glance-db-sync-dsp4s\" (UID: \"faabc487-475c-4f5b-b135-5a96d1ed9269\") " pod="openstack/glance-db-sync-dsp4s" Jan 28 20:58:48 crc kubenswrapper[4746]: I0128 20:58:48.883820 4746 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/faabc487-475c-4f5b-b135-5a96d1ed9269-db-sync-config-data\") pod \"glance-db-sync-dsp4s\" (UID: \"faabc487-475c-4f5b-b135-5a96d1ed9269\") " pod="openstack/glance-db-sync-dsp4s" Jan 28 20:58:48 crc kubenswrapper[4746]: I0128 20:58:48.884293 4746 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/faabc487-475c-4f5b-b135-5a96d1ed9269-config-data\") pod \"glance-db-sync-dsp4s\" (UID: \"faabc487-475c-4f5b-b135-5a96d1ed9269\") " pod="openstack/glance-db-sync-dsp4s" Jan 28 20:58:48 crc kubenswrapper[4746]: I0128 20:58:48.888808 4746 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/faabc487-475c-4f5b-b135-5a96d1ed9269-combined-ca-bundle\") pod \"glance-db-sync-dsp4s\" (UID: \"faabc487-475c-4f5b-b135-5a96d1ed9269\") " pod="openstack/glance-db-sync-dsp4s" Jan 28 20:58:48 crc kubenswrapper[4746]: I0128 20:58:48.908844 4746 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-r6qc5\" (UniqueName: \"kubernetes.io/projected/faabc487-475c-4f5b-b135-5a96d1ed9269-kube-api-access-r6qc5\") pod \"glance-db-sync-dsp4s\" (UID: \"faabc487-475c-4f5b-b135-5a96d1ed9269\") " pod="openstack/glance-db-sync-dsp4s" Jan 28 20:58:48 crc kubenswrapper[4746]: I0128 20:58:48.915733 4746 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/ovn-controller-ms9wc-config-t7qwd" Jan 28 20:58:48 crc kubenswrapper[4746]: I0128 20:58:48.915729 4746 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-controller-ms9wc-config-t7qwd" event={"ID":"54c7c3a5-c7c4-499b-8ded-2a2341a7d928","Type":"ContainerDied","Data":"135a6bcb81886df2f7909d9bbaaa999ddf19ed694f4b70e173c93451365476dc"} Jan 28 20:58:48 crc kubenswrapper[4746]: I0128 20:58:48.915791 4746 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="135a6bcb81886df2f7909d9bbaaa999ddf19ed694f4b70e173c93451365476dc" Jan 28 20:58:49 crc kubenswrapper[4746]: I0128 20:58:49.009923 4746 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ovn-controller-ms9wc-config-t7qwd"] Jan 28 20:58:49 crc kubenswrapper[4746]: I0128 20:58:49.017112 4746 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/ovn-controller-ms9wc-config-t7qwd"] Jan 28 20:58:49 crc kubenswrapper[4746]: I0128 20:58:49.030187 4746 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/glance-db-sync-dsp4s" Jan 28 20:58:49 crc kubenswrapper[4746]: I0128 20:58:49.238788 4746 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/cloudkitty-lokistack-ingester-0" podUID="d3cad0b0-7b53-4280-9dec-05e01692820c" containerName="loki-ingester" probeResult="failure" output="HTTP probe failed with statuscode: 503" Jan 28 20:58:49 crc kubenswrapper[4746]: I0128 20:58:49.268152 4746 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/swift-ring-rebalance-5sxj6" Jan 28 20:58:49 crc kubenswrapper[4746]: I0128 20:58:49.286232 4746 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dispersionconf\" (UniqueName: \"kubernetes.io/secret/61a4ff02-ae06-438a-a39c-8264c8e61b38-dispersionconf\") pod \"61a4ff02-ae06-438a-a39c-8264c8e61b38\" (UID: \"61a4ff02-ae06-438a-a39c-8264c8e61b38\") " Jan 28 20:58:49 crc kubenswrapper[4746]: I0128 20:58:49.286356 4746 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"swiftconf\" (UniqueName: \"kubernetes.io/secret/61a4ff02-ae06-438a-a39c-8264c8e61b38-swiftconf\") pod \"61a4ff02-ae06-438a-a39c-8264c8e61b38\" (UID: \"61a4ff02-ae06-438a-a39c-8264c8e61b38\") " Jan 28 20:58:49 crc kubenswrapper[4746]: I0128 20:58:49.286411 4746 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/61a4ff02-ae06-438a-a39c-8264c8e61b38-scripts\") pod \"61a4ff02-ae06-438a-a39c-8264c8e61b38\" (UID: \"61a4ff02-ae06-438a-a39c-8264c8e61b38\") " Jan 28 20:58:49 crc kubenswrapper[4746]: I0128 20:58:49.286507 4746 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ring-data-devices\" (UniqueName: \"kubernetes.io/configmap/61a4ff02-ae06-438a-a39c-8264c8e61b38-ring-data-devices\") pod \"61a4ff02-ae06-438a-a39c-8264c8e61b38\" (UID: \"61a4ff02-ae06-438a-a39c-8264c8e61b38\") " Jan 28 20:58:49 crc kubenswrapper[4746]: I0128 20:58:49.286572 4746 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/61a4ff02-ae06-438a-a39c-8264c8e61b38-combined-ca-bundle\") pod \"61a4ff02-ae06-438a-a39c-8264c8e61b38\" (UID: \"61a4ff02-ae06-438a-a39c-8264c8e61b38\") " Jan 28 20:58:49 crc kubenswrapper[4746]: I0128 20:58:49.287349 4746 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume 
"kubernetes.io/configmap/61a4ff02-ae06-438a-a39c-8264c8e61b38-ring-data-devices" (OuterVolumeSpecName: "ring-data-devices") pod "61a4ff02-ae06-438a-a39c-8264c8e61b38" (UID: "61a4ff02-ae06-438a-a39c-8264c8e61b38"). InnerVolumeSpecName "ring-data-devices". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 28 20:58:49 crc kubenswrapper[4746]: I0128 20:58:49.287405 4746 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-bvfrh\" (UniqueName: \"kubernetes.io/projected/61a4ff02-ae06-438a-a39c-8264c8e61b38-kube-api-access-bvfrh\") pod \"61a4ff02-ae06-438a-a39c-8264c8e61b38\" (UID: \"61a4ff02-ae06-438a-a39c-8264c8e61b38\") " Jan 28 20:58:49 crc kubenswrapper[4746]: I0128 20:58:49.287549 4746 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/empty-dir/61a4ff02-ae06-438a-a39c-8264c8e61b38-etc-swift\") pod \"61a4ff02-ae06-438a-a39c-8264c8e61b38\" (UID: \"61a4ff02-ae06-438a-a39c-8264c8e61b38\") " Jan 28 20:58:49 crc kubenswrapper[4746]: I0128 20:58:49.288717 4746 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/61a4ff02-ae06-438a-a39c-8264c8e61b38-etc-swift" (OuterVolumeSpecName: "etc-swift") pod "61a4ff02-ae06-438a-a39c-8264c8e61b38" (UID: "61a4ff02-ae06-438a-a39c-8264c8e61b38"). InnerVolumeSpecName "etc-swift". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 28 20:58:49 crc kubenswrapper[4746]: I0128 20:58:49.289502 4746 reconciler_common.go:293] "Volume detached for volume \"etc-swift\" (UniqueName: \"kubernetes.io/empty-dir/61a4ff02-ae06-438a-a39c-8264c8e61b38-etc-swift\") on node \"crc\" DevicePath \"\"" Jan 28 20:58:49 crc kubenswrapper[4746]: I0128 20:58:49.289525 4746 reconciler_common.go:293] "Volume detached for volume \"ring-data-devices\" (UniqueName: \"kubernetes.io/configmap/61a4ff02-ae06-438a-a39c-8264c8e61b38-ring-data-devices\") on node \"crc\" DevicePath \"\"" Jan 28 20:58:49 crc kubenswrapper[4746]: I0128 20:58:49.294832 4746 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/61a4ff02-ae06-438a-a39c-8264c8e61b38-kube-api-access-bvfrh" (OuterVolumeSpecName: "kube-api-access-bvfrh") pod "61a4ff02-ae06-438a-a39c-8264c8e61b38" (UID: "61a4ff02-ae06-438a-a39c-8264c8e61b38"). InnerVolumeSpecName "kube-api-access-bvfrh". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 28 20:58:49 crc kubenswrapper[4746]: I0128 20:58:49.299495 4746 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/61a4ff02-ae06-438a-a39c-8264c8e61b38-dispersionconf" (OuterVolumeSpecName: "dispersionconf") pod "61a4ff02-ae06-438a-a39c-8264c8e61b38" (UID: "61a4ff02-ae06-438a-a39c-8264c8e61b38"). InnerVolumeSpecName "dispersionconf". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 28 20:58:49 crc kubenswrapper[4746]: I0128 20:58:49.327381 4746 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/61a4ff02-ae06-438a-a39c-8264c8e61b38-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "61a4ff02-ae06-438a-a39c-8264c8e61b38" (UID: "61a4ff02-ae06-438a-a39c-8264c8e61b38"). InnerVolumeSpecName "combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 28 20:58:49 crc kubenswrapper[4746]: I0128 20:58:49.331294 4746 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/61a4ff02-ae06-438a-a39c-8264c8e61b38-swiftconf" (OuterVolumeSpecName: "swiftconf") pod "61a4ff02-ae06-438a-a39c-8264c8e61b38" (UID: "61a4ff02-ae06-438a-a39c-8264c8e61b38"). InnerVolumeSpecName "swiftconf". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 28 20:58:49 crc kubenswrapper[4746]: I0128 20:58:49.353362 4746 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/61a4ff02-ae06-438a-a39c-8264c8e61b38-scripts" (OuterVolumeSpecName: "scripts") pod "61a4ff02-ae06-438a-a39c-8264c8e61b38" (UID: "61a4ff02-ae06-438a-a39c-8264c8e61b38"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 28 20:58:49 crc kubenswrapper[4746]: I0128 20:58:49.390046 4746 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/ovn-controller-ms9wc" Jan 28 20:58:49 crc kubenswrapper[4746]: I0128 20:58:49.395307 4746 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/61a4ff02-ae06-438a-a39c-8264c8e61b38-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Jan 28 20:58:49 crc kubenswrapper[4746]: I0128 20:58:49.395351 4746 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-bvfrh\" (UniqueName: \"kubernetes.io/projected/61a4ff02-ae06-438a-a39c-8264c8e61b38-kube-api-access-bvfrh\") on node \"crc\" DevicePath \"\"" Jan 28 20:58:49 crc kubenswrapper[4746]: I0128 20:58:49.395360 4746 reconciler_common.go:293] "Volume detached for volume \"dispersionconf\" (UniqueName: \"kubernetes.io/secret/61a4ff02-ae06-438a-a39c-8264c8e61b38-dispersionconf\") on node \"crc\" DevicePath \"\"" Jan 28 20:58:49 crc kubenswrapper[4746]: I0128 20:58:49.395371 4746 reconciler_common.go:293] "Volume detached 
for volume \"swiftconf\" (UniqueName: \"kubernetes.io/secret/61a4ff02-ae06-438a-a39c-8264c8e61b38-swiftconf\") on node \"crc\" DevicePath \"\"" Jan 28 20:58:49 crc kubenswrapper[4746]: I0128 20:58:49.395380 4746 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/61a4ff02-ae06-438a-a39c-8264c8e61b38-scripts\") on node \"crc\" DevicePath \"\"" Jan 28 20:58:49 crc kubenswrapper[4746]: I0128 20:58:49.859492 4746 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-db-sync-dsp4s"] Jan 28 20:58:49 crc kubenswrapper[4746]: W0128 20:58:49.865173 4746 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podfaabc487_475c_4f5b_b135_5a96d1ed9269.slice/crio-adcf78c9f27a696bc9f434c166d6cfc01dfee6269abbb5b91e2272e9575b3219 WatchSource:0}: Error finding container adcf78c9f27a696bc9f434c166d6cfc01dfee6269abbb5b91e2272e9575b3219: Status 404 returned error can't find the container with id adcf78c9f27a696bc9f434c166d6cfc01dfee6269abbb5b91e2272e9575b3219 Jan 28 20:58:49 crc kubenswrapper[4746]: I0128 20:58:49.927344 4746 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-ring-rebalance-5sxj6" event={"ID":"61a4ff02-ae06-438a-a39c-8264c8e61b38","Type":"ContainerDied","Data":"a8d31a39a56a93819c68ad91f78932813cd375e53a1b32ba0f782db3d6547413"} Jan 28 20:58:49 crc kubenswrapper[4746]: I0128 20:58:49.927402 4746 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="a8d31a39a56a93819c68ad91f78932813cd375e53a1b32ba0f782db3d6547413" Jan 28 20:58:49 crc kubenswrapper[4746]: I0128 20:58:49.927363 4746 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/swift-ring-rebalance-5sxj6" Jan 28 20:58:49 crc kubenswrapper[4746]: I0128 20:58:49.928575 4746 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-db-sync-dsp4s" event={"ID":"faabc487-475c-4f5b-b135-5a96d1ed9269","Type":"ContainerStarted","Data":"adcf78c9f27a696bc9f434c166d6cfc01dfee6269abbb5b91e2272e9575b3219"} Jan 28 20:58:50 crc kubenswrapper[4746]: I0128 20:58:50.084127 4746 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/rabbitmq-server-0" Jan 28 20:58:50 crc kubenswrapper[4746]: I0128 20:58:50.086455 4746 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/rabbitmq-server-0" Jan 28 20:58:50 crc kubenswrapper[4746]: I0128 20:58:50.443612 4746 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/root-account-create-update-9xsgn"] Jan 28 20:58:50 crc kubenswrapper[4746]: E0128 20:58:50.443980 4746 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="61a4ff02-ae06-438a-a39c-8264c8e61b38" containerName="swift-ring-rebalance" Jan 28 20:58:50 crc kubenswrapper[4746]: I0128 20:58:50.443993 4746 state_mem.go:107] "Deleted CPUSet assignment" podUID="61a4ff02-ae06-438a-a39c-8264c8e61b38" containerName="swift-ring-rebalance" Jan 28 20:58:50 crc kubenswrapper[4746]: I0128 20:58:50.444208 4746 memory_manager.go:354] "RemoveStaleState removing state" podUID="61a4ff02-ae06-438a-a39c-8264c8e61b38" containerName="swift-ring-rebalance" Jan 28 20:58:50 crc kubenswrapper[4746]: I0128 20:58:50.444938 4746 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/root-account-create-update-9xsgn" Jan 28 20:58:50 crc kubenswrapper[4746]: I0128 20:58:50.448115 4746 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-mariadb-root-db-secret" Jan 28 20:58:50 crc kubenswrapper[4746]: I0128 20:58:50.471142 4746 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/root-account-create-update-9xsgn"] Jan 28 20:58:50 crc kubenswrapper[4746]: I0128 20:58:50.476291 4746 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/rabbitmq-cell1-server-0" Jan 28 20:58:50 crc kubenswrapper[4746]: I0128 20:58:50.517424 4746 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/90854a3f-3d4b-4a6b-a269-d4b33cbf94c9-operator-scripts\") pod \"root-account-create-update-9xsgn\" (UID: \"90854a3f-3d4b-4a6b-a269-d4b33cbf94c9\") " pod="openstack/root-account-create-update-9xsgn" Jan 28 20:58:50 crc kubenswrapper[4746]: I0128 20:58:50.517631 4746 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-hqh9z\" (UniqueName: \"kubernetes.io/projected/90854a3f-3d4b-4a6b-a269-d4b33cbf94c9-kube-api-access-hqh9z\") pod \"root-account-create-update-9xsgn\" (UID: \"90854a3f-3d4b-4a6b-a269-d4b33cbf94c9\") " pod="openstack/root-account-create-update-9xsgn" Jan 28 20:58:50 crc kubenswrapper[4746]: I0128 20:58:50.622594 4746 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/90854a3f-3d4b-4a6b-a269-d4b33cbf94c9-operator-scripts\") pod \"root-account-create-update-9xsgn\" (UID: \"90854a3f-3d4b-4a6b-a269-d4b33cbf94c9\") " pod="openstack/root-account-create-update-9xsgn" Jan 28 20:58:50 crc kubenswrapper[4746]: I0128 20:58:50.622723 4746 reconciler_common.go:218] "operationExecutor.MountVolume started for volume 
\"kube-api-access-hqh9z\" (UniqueName: \"kubernetes.io/projected/90854a3f-3d4b-4a6b-a269-d4b33cbf94c9-kube-api-access-hqh9z\") pod \"root-account-create-update-9xsgn\" (UID: \"90854a3f-3d4b-4a6b-a269-d4b33cbf94c9\") " pod="openstack/root-account-create-update-9xsgn" Jan 28 20:58:50 crc kubenswrapper[4746]: I0128 20:58:50.623686 4746 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/90854a3f-3d4b-4a6b-a269-d4b33cbf94c9-operator-scripts\") pod \"root-account-create-update-9xsgn\" (UID: \"90854a3f-3d4b-4a6b-a269-d4b33cbf94c9\") " pod="openstack/root-account-create-update-9xsgn" Jan 28 20:58:50 crc kubenswrapper[4746]: I0128 20:58:50.680879 4746 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/cinder-2635-account-create-update-hz28g"] Jan 28 20:58:50 crc kubenswrapper[4746]: I0128 20:58:50.682716 4746 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-2635-account-create-update-hz28g" Jan 28 20:58:50 crc kubenswrapper[4746]: I0128 20:58:50.684057 4746 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-hqh9z\" (UniqueName: \"kubernetes.io/projected/90854a3f-3d4b-4a6b-a269-d4b33cbf94c9-kube-api-access-hqh9z\") pod \"root-account-create-update-9xsgn\" (UID: \"90854a3f-3d4b-4a6b-a269-d4b33cbf94c9\") " pod="openstack/root-account-create-update-9xsgn" Jan 28 20:58:50 crc kubenswrapper[4746]: I0128 20:58:50.688772 4746 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cinder-db-secret" Jan 28 20:58:50 crc kubenswrapper[4746]: I0128 20:58:50.744697 4746 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/cinder-db-create-kp8xl"] Jan 28 20:58:50 crc kubenswrapper[4746]: I0128 20:58:50.746036 4746 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/cinder-db-create-kp8xl" Jan 28 20:58:50 crc kubenswrapper[4746]: I0128 20:58:50.772798 4746 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cinder-db-create-kp8xl"] Jan 28 20:58:50 crc kubenswrapper[4746]: I0128 20:58:50.781325 4746 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/root-account-create-update-9xsgn" Jan 28 20:58:50 crc kubenswrapper[4746]: I0128 20:58:50.803488 4746 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cinder-2635-account-create-update-hz28g"] Jan 28 20:58:50 crc kubenswrapper[4746]: I0128 20:58:50.826226 4746 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-jdhpp\" (UniqueName: \"kubernetes.io/projected/e851c5c3-8d3c-4a3d-ac8c-95d5eb7113fc-kube-api-access-jdhpp\") pod \"cinder-2635-account-create-update-hz28g\" (UID: \"e851c5c3-8d3c-4a3d-ac8c-95d5eb7113fc\") " pod="openstack/cinder-2635-account-create-update-hz28g" Jan 28 20:58:50 crc kubenswrapper[4746]: I0128 20:58:50.826464 4746 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/e851c5c3-8d3c-4a3d-ac8c-95d5eb7113fc-operator-scripts\") pod \"cinder-2635-account-create-update-hz28g\" (UID: \"e851c5c3-8d3c-4a3d-ac8c-95d5eb7113fc\") " pod="openstack/cinder-2635-account-create-update-hz28g" Jan 28 20:58:50 crc kubenswrapper[4746]: I0128 20:58:50.870862 4746 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="54c7c3a5-c7c4-499b-8ded-2a2341a7d928" path="/var/lib/kubelet/pods/54c7c3a5-c7c4-499b-8ded-2a2341a7d928/volumes" Jan 28 20:58:50 crc kubenswrapper[4746]: I0128 20:58:50.929025 4746 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-jdhpp\" (UniqueName: \"kubernetes.io/projected/e851c5c3-8d3c-4a3d-ac8c-95d5eb7113fc-kube-api-access-jdhpp\") pod 
\"cinder-2635-account-create-update-hz28g\" (UID: \"e851c5c3-8d3c-4a3d-ac8c-95d5eb7113fc\") " pod="openstack/cinder-2635-account-create-update-hz28g" Jan 28 20:58:50 crc kubenswrapper[4746]: I0128 20:58:50.929126 4746 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/db403f32-6720-4260-a66d-45e5d0e7b5c6-operator-scripts\") pod \"cinder-db-create-kp8xl\" (UID: \"db403f32-6720-4260-a66d-45e5d0e7b5c6\") " pod="openstack/cinder-db-create-kp8xl" Jan 28 20:58:50 crc kubenswrapper[4746]: I0128 20:58:50.929180 4746 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-jl9hg\" (UniqueName: \"kubernetes.io/projected/db403f32-6720-4260-a66d-45e5d0e7b5c6-kube-api-access-jl9hg\") pod \"cinder-db-create-kp8xl\" (UID: \"db403f32-6720-4260-a66d-45e5d0e7b5c6\") " pod="openstack/cinder-db-create-kp8xl" Jan 28 20:58:50 crc kubenswrapper[4746]: I0128 20:58:50.929241 4746 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/e851c5c3-8d3c-4a3d-ac8c-95d5eb7113fc-operator-scripts\") pod \"cinder-2635-account-create-update-hz28g\" (UID: \"e851c5c3-8d3c-4a3d-ac8c-95d5eb7113fc\") " pod="openstack/cinder-2635-account-create-update-hz28g" Jan 28 20:58:50 crc kubenswrapper[4746]: I0128 20:58:50.929993 4746 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/e851c5c3-8d3c-4a3d-ac8c-95d5eb7113fc-operator-scripts\") pod \"cinder-2635-account-create-update-hz28g\" (UID: \"e851c5c3-8d3c-4a3d-ac8c-95d5eb7113fc\") " pod="openstack/cinder-2635-account-create-update-hz28g" Jan 28 20:58:50 crc kubenswrapper[4746]: I0128 20:58:50.944442 4746 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/keystone-db-sync-7mlnk"] Jan 28 20:58:50 crc kubenswrapper[4746]: I0128 20:58:50.945590 
4746 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-db-sync-7mlnk" Jan 28 20:58:50 crc kubenswrapper[4746]: I0128 20:58:50.949657 4746 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone-scripts" Jan 28 20:58:50 crc kubenswrapper[4746]: I0128 20:58:50.950494 4746 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone-keystone-dockercfg-j5js5" Jan 28 20:58:50 crc kubenswrapper[4746]: I0128 20:58:50.950687 4746 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone" Jan 28 20:58:50 crc kubenswrapper[4746]: I0128 20:58:50.950758 4746 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone-config-data" Jan 28 20:58:50 crc kubenswrapper[4746]: I0128 20:58:50.976717 4746 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/keystone-db-sync-7mlnk"] Jan 28 20:58:50 crc kubenswrapper[4746]: I0128 20:58:50.994422 4746 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-jdhpp\" (UniqueName: \"kubernetes.io/projected/e851c5c3-8d3c-4a3d-ac8c-95d5eb7113fc-kube-api-access-jdhpp\") pod \"cinder-2635-account-create-update-hz28g\" (UID: \"e851c5c3-8d3c-4a3d-ac8c-95d5eb7113fc\") " pod="openstack/cinder-2635-account-create-update-hz28g" Jan 28 20:58:51 crc kubenswrapper[4746]: I0128 20:58:51.017233 4746 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/barbican-db-create-27prr"] Jan 28 20:58:51 crc kubenswrapper[4746]: I0128 20:58:51.018366 4746 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/barbican-db-create-27prr" Jan 28 20:58:51 crc kubenswrapper[4746]: I0128 20:58:51.033806 4746 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-jl9hg\" (UniqueName: \"kubernetes.io/projected/db403f32-6720-4260-a66d-45e5d0e7b5c6-kube-api-access-jl9hg\") pod \"cinder-db-create-kp8xl\" (UID: \"db403f32-6720-4260-a66d-45e5d0e7b5c6\") " pod="openstack/cinder-db-create-kp8xl" Jan 28 20:58:51 crc kubenswrapper[4746]: I0128 20:58:51.034224 4746 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/db403f32-6720-4260-a66d-45e5d0e7b5c6-operator-scripts\") pod \"cinder-db-create-kp8xl\" (UID: \"db403f32-6720-4260-a66d-45e5d0e7b5c6\") " pod="openstack/cinder-db-create-kp8xl" Jan 28 20:58:51 crc kubenswrapper[4746]: I0128 20:58:51.034940 4746 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/db403f32-6720-4260-a66d-45e5d0e7b5c6-operator-scripts\") pod \"cinder-db-create-kp8xl\" (UID: \"db403f32-6720-4260-a66d-45e5d0e7b5c6\") " pod="openstack/cinder-db-create-kp8xl" Jan 28 20:58:51 crc kubenswrapper[4746]: I0128 20:58:51.046042 4746 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/barbican-db-create-27prr"] Jan 28 20:58:51 crc kubenswrapper[4746]: I0128 20:58:51.059936 4746 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-2635-account-create-update-hz28g" Jan 28 20:58:51 crc kubenswrapper[4746]: I0128 20:58:51.075706 4746 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/neutron-564b-account-create-update-m6nh8"] Jan 28 20:58:51 crc kubenswrapper[4746]: I0128 20:58:51.077217 4746 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/neutron-564b-account-create-update-m6nh8" Jan 28 20:58:51 crc kubenswrapper[4746]: I0128 20:58:51.078360 4746 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-jl9hg\" (UniqueName: \"kubernetes.io/projected/db403f32-6720-4260-a66d-45e5d0e7b5c6-kube-api-access-jl9hg\") pod \"cinder-db-create-kp8xl\" (UID: \"db403f32-6720-4260-a66d-45e5d0e7b5c6\") " pod="openstack/cinder-db-create-kp8xl" Jan 28 20:58:51 crc kubenswrapper[4746]: I0128 20:58:51.081772 4746 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"neutron-db-secret" Jan 28 20:58:51 crc kubenswrapper[4746]: I0128 20:58:51.098532 4746 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-db-create-kp8xl" Jan 28 20:58:51 crc kubenswrapper[4746]: I0128 20:58:51.135659 4746 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/1176d52c-0fec-4346-ad79-af25ac4c3f62-config-data\") pod \"keystone-db-sync-7mlnk\" (UID: \"1176d52c-0fec-4346-ad79-af25ac4c3f62\") " pod="openstack/keystone-db-sync-7mlnk" Jan 28 20:58:51 crc kubenswrapper[4746]: I0128 20:58:51.135726 4746 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/a7ca0a5b-c5ce-48d9-89bc-5f693869c1d5-operator-scripts\") pod \"barbican-db-create-27prr\" (UID: \"a7ca0a5b-c5ce-48d9-89bc-5f693869c1d5\") " pod="openstack/barbican-db-create-27prr" Jan 28 20:58:51 crc kubenswrapper[4746]: I0128 20:58:51.135877 4746 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-jl58t\" (UniqueName: \"kubernetes.io/projected/a7ca0a5b-c5ce-48d9-89bc-5f693869c1d5-kube-api-access-jl58t\") pod \"barbican-db-create-27prr\" (UID: \"a7ca0a5b-c5ce-48d9-89bc-5f693869c1d5\") " 
pod="openstack/barbican-db-create-27prr" Jan 28 20:58:51 crc kubenswrapper[4746]: I0128 20:58:51.135925 4746 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-k5vb7\" (UniqueName: \"kubernetes.io/projected/1176d52c-0fec-4346-ad79-af25ac4c3f62-kube-api-access-k5vb7\") pod \"keystone-db-sync-7mlnk\" (UID: \"1176d52c-0fec-4346-ad79-af25ac4c3f62\") " pod="openstack/keystone-db-sync-7mlnk" Jan 28 20:58:51 crc kubenswrapper[4746]: I0128 20:58:51.141595 4746 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/1176d52c-0fec-4346-ad79-af25ac4c3f62-combined-ca-bundle\") pod \"keystone-db-sync-7mlnk\" (UID: \"1176d52c-0fec-4346-ad79-af25ac4c3f62\") " pod="openstack/keystone-db-sync-7mlnk" Jan 28 20:58:51 crc kubenswrapper[4746]: I0128 20:58:51.150597 4746 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/cloudkitty-db-create-6pvd6"] Jan 28 20:58:51 crc kubenswrapper[4746]: I0128 20:58:51.151974 4746 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/cloudkitty-db-create-6pvd6" Jan 28 20:58:51 crc kubenswrapper[4746]: I0128 20:58:51.174670 4746 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/neutron-564b-account-create-update-m6nh8"] Jan 28 20:58:51 crc kubenswrapper[4746]: I0128 20:58:51.186190 4746 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cloudkitty-db-create-6pvd6"] Jan 28 20:58:51 crc kubenswrapper[4746]: I0128 20:58:51.236075 4746 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/barbican-4eb8-account-create-update-k449n"] Jan 28 20:58:51 crc kubenswrapper[4746]: I0128 20:58:51.237470 4746 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/barbican-4eb8-account-create-update-k449n" Jan 28 20:58:51 crc kubenswrapper[4746]: I0128 20:58:51.241050 4746 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"barbican-db-secret" Jan 28 20:58:51 crc kubenswrapper[4746]: I0128 20:58:51.242893 4746 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-drf9n\" (UniqueName: \"kubernetes.io/projected/da1723e7-7026-414e-b1e1-79911e331408-kube-api-access-drf9n\") pod \"neutron-564b-account-create-update-m6nh8\" (UID: \"da1723e7-7026-414e-b1e1-79911e331408\") " pod="openstack/neutron-564b-account-create-update-m6nh8" Jan 28 20:58:51 crc kubenswrapper[4746]: I0128 20:58:51.242936 4746 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/da1723e7-7026-414e-b1e1-79911e331408-operator-scripts\") pod \"neutron-564b-account-create-update-m6nh8\" (UID: \"da1723e7-7026-414e-b1e1-79911e331408\") " pod="openstack/neutron-564b-account-create-update-m6nh8" Jan 28 20:58:51 crc kubenswrapper[4746]: I0128 20:58:51.242959 4746 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/1176d52c-0fec-4346-ad79-af25ac4c3f62-config-data\") pod \"keystone-db-sync-7mlnk\" (UID: \"1176d52c-0fec-4346-ad79-af25ac4c3f62\") " pod="openstack/keystone-db-sync-7mlnk" Jan 28 20:58:51 crc kubenswrapper[4746]: I0128 20:58:51.242982 4746 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/2aed034c-35b5-4fd4-b0c4-cebbdfb41da2-operator-scripts\") pod \"cloudkitty-db-create-6pvd6\" (UID: \"2aed034c-35b5-4fd4-b0c4-cebbdfb41da2\") " pod="openstack/cloudkitty-db-create-6pvd6" Jan 28 20:58:51 crc kubenswrapper[4746]: I0128 20:58:51.243026 4746 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-dbdpr\" (UniqueName: \"kubernetes.io/projected/2aed034c-35b5-4fd4-b0c4-cebbdfb41da2-kube-api-access-dbdpr\") pod \"cloudkitty-db-create-6pvd6\" (UID: \"2aed034c-35b5-4fd4-b0c4-cebbdfb41da2\") " pod="openstack/cloudkitty-db-create-6pvd6"
Jan 28 20:58:51 crc kubenswrapper[4746]: I0128 20:58:51.243052 4746 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/a7ca0a5b-c5ce-48d9-89bc-5f693869c1d5-operator-scripts\") pod \"barbican-db-create-27prr\" (UID: \"a7ca0a5b-c5ce-48d9-89bc-5f693869c1d5\") " pod="openstack/barbican-db-create-27prr"
Jan 28 20:58:51 crc kubenswrapper[4746]: I0128 20:58:51.243142 4746 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-jl58t\" (UniqueName: \"kubernetes.io/projected/a7ca0a5b-c5ce-48d9-89bc-5f693869c1d5-kube-api-access-jl58t\") pod \"barbican-db-create-27prr\" (UID: \"a7ca0a5b-c5ce-48d9-89bc-5f693869c1d5\") " pod="openstack/barbican-db-create-27prr"
Jan 28 20:58:51 crc kubenswrapper[4746]: I0128 20:58:51.243172 4746 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-k5vb7\" (UniqueName: \"kubernetes.io/projected/1176d52c-0fec-4346-ad79-af25ac4c3f62-kube-api-access-k5vb7\") pod \"keystone-db-sync-7mlnk\" (UID: \"1176d52c-0fec-4346-ad79-af25ac4c3f62\") " pod="openstack/keystone-db-sync-7mlnk"
Jan 28 20:58:51 crc kubenswrapper[4746]: I0128 20:58:51.243212 4746 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/1176d52c-0fec-4346-ad79-af25ac4c3f62-combined-ca-bundle\") pod \"keystone-db-sync-7mlnk\" (UID: \"1176d52c-0fec-4346-ad79-af25ac4c3f62\") " pod="openstack/keystone-db-sync-7mlnk"
Jan 28 20:58:51 crc kubenswrapper[4746]: I0128 20:58:51.248052 4746 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/a7ca0a5b-c5ce-48d9-89bc-5f693869c1d5-operator-scripts\") pod \"barbican-db-create-27prr\" (UID: \"a7ca0a5b-c5ce-48d9-89bc-5f693869c1d5\") " pod="openstack/barbican-db-create-27prr"
Jan 28 20:58:51 crc kubenswrapper[4746]: I0128 20:58:51.249290 4746 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/barbican-4eb8-account-create-update-k449n"]
Jan 28 20:58:51 crc kubenswrapper[4746]: I0128 20:58:51.254744 4746 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/1176d52c-0fec-4346-ad79-af25ac4c3f62-combined-ca-bundle\") pod \"keystone-db-sync-7mlnk\" (UID: \"1176d52c-0fec-4346-ad79-af25ac4c3f62\") " pod="openstack/keystone-db-sync-7mlnk"
Jan 28 20:58:51 crc kubenswrapper[4746]: I0128 20:58:51.262778 4746 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/1176d52c-0fec-4346-ad79-af25ac4c3f62-config-data\") pod \"keystone-db-sync-7mlnk\" (UID: \"1176d52c-0fec-4346-ad79-af25ac4c3f62\") " pod="openstack/keystone-db-sync-7mlnk"
Jan 28 20:58:51 crc kubenswrapper[4746]: I0128 20:58:51.286872 4746 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-k5vb7\" (UniqueName: \"kubernetes.io/projected/1176d52c-0fec-4346-ad79-af25ac4c3f62-kube-api-access-k5vb7\") pod \"keystone-db-sync-7mlnk\" (UID: \"1176d52c-0fec-4346-ad79-af25ac4c3f62\") " pod="openstack/keystone-db-sync-7mlnk"
Jan 28 20:58:51 crc kubenswrapper[4746]: I0128 20:58:51.288350 4746 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-jl58t\" (UniqueName: \"kubernetes.io/projected/a7ca0a5b-c5ce-48d9-89bc-5f693869c1d5-kube-api-access-jl58t\") pod \"barbican-db-create-27prr\" (UID: \"a7ca0a5b-c5ce-48d9-89bc-5f693869c1d5\") " pod="openstack/barbican-db-create-27prr"
Jan 28 20:58:51 crc kubenswrapper[4746]: I0128 20:58:51.289352 4746 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-db-sync-7mlnk"
Jan 28 20:58:51 crc kubenswrapper[4746]: I0128 20:58:51.345208 4746 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/cloudkitty-6f70-account-create-update-zsths"]
Jan 28 20:58:51 crc kubenswrapper[4746]: I0128 20:58:51.346436 4746 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/cloudkitty-6f70-account-create-update-zsths"
Jan 28 20:58:51 crc kubenswrapper[4746]: I0128 20:58:51.348271 4746 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/eaf1cab3-21ae-4850-a732-d0e75f55ffc4-operator-scripts\") pod \"barbican-4eb8-account-create-update-k449n\" (UID: \"eaf1cab3-21ae-4850-a732-d0e75f55ffc4\") " pod="openstack/barbican-4eb8-account-create-update-k449n"
Jan 28 20:58:51 crc kubenswrapper[4746]: I0128 20:58:51.348449 4746 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-drf9n\" (UniqueName: \"kubernetes.io/projected/da1723e7-7026-414e-b1e1-79911e331408-kube-api-access-drf9n\") pod \"neutron-564b-account-create-update-m6nh8\" (UID: \"da1723e7-7026-414e-b1e1-79911e331408\") " pod="openstack/neutron-564b-account-create-update-m6nh8"
Jan 28 20:58:51 crc kubenswrapper[4746]: I0128 20:58:51.348481 4746 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/da1723e7-7026-414e-b1e1-79911e331408-operator-scripts\") pod \"neutron-564b-account-create-update-m6nh8\" (UID: \"da1723e7-7026-414e-b1e1-79911e331408\") " pod="openstack/neutron-564b-account-create-update-m6nh8"
Jan 28 20:58:51 crc kubenswrapper[4746]: I0128 20:58:51.348500 4746 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/2aed034c-35b5-4fd4-b0c4-cebbdfb41da2-operator-scripts\") pod \"cloudkitty-db-create-6pvd6\" (UID: \"2aed034c-35b5-4fd4-b0c4-cebbdfb41da2\") " pod="openstack/cloudkitty-db-create-6pvd6"
Jan 28 20:58:51 crc kubenswrapper[4746]: I0128 20:58:51.348525 4746 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-l8w79\" (UniqueName: \"kubernetes.io/projected/eaf1cab3-21ae-4850-a732-d0e75f55ffc4-kube-api-access-l8w79\") pod \"barbican-4eb8-account-create-update-k449n\" (UID: \"eaf1cab3-21ae-4850-a732-d0e75f55ffc4\") " pod="openstack/barbican-4eb8-account-create-update-k449n"
Jan 28 20:58:51 crc kubenswrapper[4746]: I0128 20:58:51.348549 4746 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-dbdpr\" (UniqueName: \"kubernetes.io/projected/2aed034c-35b5-4fd4-b0c4-cebbdfb41da2-kube-api-access-dbdpr\") pod \"cloudkitty-db-create-6pvd6\" (UID: \"2aed034c-35b5-4fd4-b0c4-cebbdfb41da2\") " pod="openstack/cloudkitty-db-create-6pvd6"
Jan 28 20:58:51 crc kubenswrapper[4746]: I0128 20:58:51.350186 4746 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/da1723e7-7026-414e-b1e1-79911e331408-operator-scripts\") pod \"neutron-564b-account-create-update-m6nh8\" (UID: \"da1723e7-7026-414e-b1e1-79911e331408\") " pod="openstack/neutron-564b-account-create-update-m6nh8"
Jan 28 20:58:51 crc kubenswrapper[4746]: I0128 20:58:51.350524 4746 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/2aed034c-35b5-4fd4-b0c4-cebbdfb41da2-operator-scripts\") pod \"cloudkitty-db-create-6pvd6\" (UID: \"2aed034c-35b5-4fd4-b0c4-cebbdfb41da2\") " pod="openstack/cloudkitty-db-create-6pvd6"
Jan 28 20:58:51 crc kubenswrapper[4746]: I0128 20:58:51.351219 4746 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cloudkitty-db-secret"
Jan 28 20:58:51 crc kubenswrapper[4746]: I0128 20:58:51.364822 4746 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/neutron-db-create-dlwzg"]
Jan 28 20:58:51 crc kubenswrapper[4746]: I0128 20:58:51.366487 4746 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/neutron-db-create-dlwzg"
Jan 28 20:58:51 crc kubenswrapper[4746]: I0128 20:58:51.376888 4746 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-drf9n\" (UniqueName: \"kubernetes.io/projected/da1723e7-7026-414e-b1e1-79911e331408-kube-api-access-drf9n\") pod \"neutron-564b-account-create-update-m6nh8\" (UID: \"da1723e7-7026-414e-b1e1-79911e331408\") " pod="openstack/neutron-564b-account-create-update-m6nh8"
Jan 28 20:58:51 crc kubenswrapper[4746]: I0128 20:58:51.382583 4746 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cloudkitty-6f70-account-create-update-zsths"]
Jan 28 20:58:51 crc kubenswrapper[4746]: I0128 20:58:51.385208 4746 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-dbdpr\" (UniqueName: \"kubernetes.io/projected/2aed034c-35b5-4fd4-b0c4-cebbdfb41da2-kube-api-access-dbdpr\") pod \"cloudkitty-db-create-6pvd6\" (UID: \"2aed034c-35b5-4fd4-b0c4-cebbdfb41da2\") " pod="openstack/cloudkitty-db-create-6pvd6"
Jan 28 20:58:51 crc kubenswrapper[4746]: I0128 20:58:51.418307 4746 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/neutron-db-create-dlwzg"]
Jan 28 20:58:51 crc kubenswrapper[4746]: I0128 20:58:51.450864 4746 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/c4a37860-0f42-4d7c-89e0-8505e8e49c59-operator-scripts\") pod \"neutron-db-create-dlwzg\" (UID: \"c4a37860-0f42-4d7c-89e0-8505e8e49c59\") " pod="openstack/neutron-db-create-dlwzg"
Jan 28 20:58:51 crc kubenswrapper[4746]: I0128 20:58:51.451062 4746 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/fb4a489a-d80d-49a4-9624-ecfe6a4200ca-operator-scripts\") pod \"cloudkitty-6f70-account-create-update-zsths\" (UID: \"fb4a489a-d80d-49a4-9624-ecfe6a4200ca\") " pod="openstack/cloudkitty-6f70-account-create-update-zsths"
Jan 28 20:58:51 crc kubenswrapper[4746]: I0128 20:58:51.451119 4746 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-7sx7r\" (UniqueName: \"kubernetes.io/projected/c4a37860-0f42-4d7c-89e0-8505e8e49c59-kube-api-access-7sx7r\") pod \"neutron-db-create-dlwzg\" (UID: \"c4a37860-0f42-4d7c-89e0-8505e8e49c59\") " pod="openstack/neutron-db-create-dlwzg"
Jan 28 20:58:51 crc kubenswrapper[4746]: I0128 20:58:51.451153 4746 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-l8w79\" (UniqueName: \"kubernetes.io/projected/eaf1cab3-21ae-4850-a732-d0e75f55ffc4-kube-api-access-l8w79\") pod \"barbican-4eb8-account-create-update-k449n\" (UID: \"eaf1cab3-21ae-4850-a732-d0e75f55ffc4\") " pod="openstack/barbican-4eb8-account-create-update-k449n"
Jan 28 20:58:51 crc kubenswrapper[4746]: I0128 20:58:51.451261 4746 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/eaf1cab3-21ae-4850-a732-d0e75f55ffc4-operator-scripts\") pod \"barbican-4eb8-account-create-update-k449n\" (UID: \"eaf1cab3-21ae-4850-a732-d0e75f55ffc4\") " pod="openstack/barbican-4eb8-account-create-update-k449n"
Jan 28 20:58:51 crc kubenswrapper[4746]: I0128 20:58:51.451302 4746 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-qrxbb\" (UniqueName: \"kubernetes.io/projected/fb4a489a-d80d-49a4-9624-ecfe6a4200ca-kube-api-access-qrxbb\") pod \"cloudkitty-6f70-account-create-update-zsths\" (UID: \"fb4a489a-d80d-49a4-9624-ecfe6a4200ca\") " pod="openstack/cloudkitty-6f70-account-create-update-zsths"
Jan 28 20:58:51 crc kubenswrapper[4746]: I0128 20:58:51.453352 4746 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/eaf1cab3-21ae-4850-a732-d0e75f55ffc4-operator-scripts\") pod \"barbican-4eb8-account-create-update-k449n\" (UID: \"eaf1cab3-21ae-4850-a732-d0e75f55ffc4\") " pod="openstack/barbican-4eb8-account-create-update-k449n"
Jan 28 20:58:51 crc kubenswrapper[4746]: I0128 20:58:51.466117 4746 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/barbican-db-create-27prr"
Jan 28 20:58:51 crc kubenswrapper[4746]: I0128 20:58:51.482877 4746 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-l8w79\" (UniqueName: \"kubernetes.io/projected/eaf1cab3-21ae-4850-a732-d0e75f55ffc4-kube-api-access-l8w79\") pod \"barbican-4eb8-account-create-update-k449n\" (UID: \"eaf1cab3-21ae-4850-a732-d0e75f55ffc4\") " pod="openstack/barbican-4eb8-account-create-update-k449n"
Jan 28 20:58:51 crc kubenswrapper[4746]: I0128 20:58:51.528060 4746 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/neutron-564b-account-create-update-m6nh8"
Jan 28 20:58:51 crc kubenswrapper[4746]: I0128 20:58:51.546610 4746 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/cloudkitty-db-create-6pvd6"
Jan 28 20:58:51 crc kubenswrapper[4746]: I0128 20:58:51.553396 4746 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/fb4a489a-d80d-49a4-9624-ecfe6a4200ca-operator-scripts\") pod \"cloudkitty-6f70-account-create-update-zsths\" (UID: \"fb4a489a-d80d-49a4-9624-ecfe6a4200ca\") " pod="openstack/cloudkitty-6f70-account-create-update-zsths"
Jan 28 20:58:51 crc kubenswrapper[4746]: I0128 20:58:51.553491 4746 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-7sx7r\" (UniqueName: \"kubernetes.io/projected/c4a37860-0f42-4d7c-89e0-8505e8e49c59-kube-api-access-7sx7r\") pod \"neutron-db-create-dlwzg\" (UID: \"c4a37860-0f42-4d7c-89e0-8505e8e49c59\") " pod="openstack/neutron-db-create-dlwzg"
Jan 28 20:58:51 crc kubenswrapper[4746]: I0128 20:58:51.553677 4746 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-qrxbb\" (UniqueName: \"kubernetes.io/projected/fb4a489a-d80d-49a4-9624-ecfe6a4200ca-kube-api-access-qrxbb\") pod \"cloudkitty-6f70-account-create-update-zsths\" (UID: \"fb4a489a-d80d-49a4-9624-ecfe6a4200ca\") " pod="openstack/cloudkitty-6f70-account-create-update-zsths"
Jan 28 20:58:51 crc kubenswrapper[4746]: I0128 20:58:51.553768 4746 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/c4a37860-0f42-4d7c-89e0-8505e8e49c59-operator-scripts\") pod \"neutron-db-create-dlwzg\" (UID: \"c4a37860-0f42-4d7c-89e0-8505e8e49c59\") " pod="openstack/neutron-db-create-dlwzg"
Jan 28 20:58:51 crc kubenswrapper[4746]: I0128 20:58:51.554545 4746 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/fb4a489a-d80d-49a4-9624-ecfe6a4200ca-operator-scripts\") pod \"cloudkitty-6f70-account-create-update-zsths\" (UID: \"fb4a489a-d80d-49a4-9624-ecfe6a4200ca\") " pod="openstack/cloudkitty-6f70-account-create-update-zsths"
Jan 28 20:58:51 crc kubenswrapper[4746]: I0128 20:58:51.554626 4746 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/c4a37860-0f42-4d7c-89e0-8505e8e49c59-operator-scripts\") pod \"neutron-db-create-dlwzg\" (UID: \"c4a37860-0f42-4d7c-89e0-8505e8e49c59\") " pod="openstack/neutron-db-create-dlwzg"
Jan 28 20:58:51 crc kubenswrapper[4746]: I0128 20:58:51.573722 4746 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-qrxbb\" (UniqueName: \"kubernetes.io/projected/fb4a489a-d80d-49a4-9624-ecfe6a4200ca-kube-api-access-qrxbb\") pod \"cloudkitty-6f70-account-create-update-zsths\" (UID: \"fb4a489a-d80d-49a4-9624-ecfe6a4200ca\") " pod="openstack/cloudkitty-6f70-account-create-update-zsths"
Jan 28 20:58:51 crc kubenswrapper[4746]: I0128 20:58:51.576423 4746 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-7sx7r\" (UniqueName: \"kubernetes.io/projected/c4a37860-0f42-4d7c-89e0-8505e8e49c59-kube-api-access-7sx7r\") pod \"neutron-db-create-dlwzg\" (UID: \"c4a37860-0f42-4d7c-89e0-8505e8e49c59\") " pod="openstack/neutron-db-create-dlwzg"
Jan 28 20:58:51 crc kubenswrapper[4746]: I0128 20:58:51.630395 4746 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/barbican-4eb8-account-create-update-k449n"
Jan 28 20:58:51 crc kubenswrapper[4746]: I0128 20:58:51.671424 4746 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/cloudkitty-6f70-account-create-update-zsths"
Jan 28 20:58:51 crc kubenswrapper[4746]: I0128 20:58:51.695209 4746 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/neutron-db-create-dlwzg"
Jan 28 20:58:51 crc kubenswrapper[4746]: I0128 20:58:51.740525 4746 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/root-account-create-update-9xsgn"]
Jan 28 20:58:51 crc kubenswrapper[4746]: I0128 20:58:51.812974 4746 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/prometheus-metric-storage-0"
Jan 28 20:58:51 crc kubenswrapper[4746]: I0128 20:58:51.836887 4746 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/prometheus-metric-storage-0"
Jan 28 20:58:51 crc kubenswrapper[4746]: I0128 20:58:51.885962 4746 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cinder-2635-account-create-update-hz28g"]
Jan 28 20:58:51 crc kubenswrapper[4746]: I0128 20:58:51.963446 4746 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/keystone-db-sync-7mlnk"]
Jan 28 20:58:52 crc kubenswrapper[4746]: I0128 20:58:52.017348 4746 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cinder-db-create-kp8xl"]
Jan 28 20:58:52 crc kubenswrapper[4746]: I0128 20:58:52.032296 4746 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/root-account-create-update-9xsgn" event={"ID":"90854a3f-3d4b-4a6b-a269-d4b33cbf94c9","Type":"ContainerStarted","Data":"d1ed7ff7cfd603e02a0adeb2f5ab7edae52e633ff5053605c37bbfcfec9a6916"}
Jan 28 20:58:52 crc kubenswrapper[4746]: I0128 20:58:52.036832 4746 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/prometheus-metric-storage-0"
Jan 28 20:58:52 crc kubenswrapper[4746]: W0128 20:58:52.138870 4746 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-poddb403f32_6720_4260_a66d_45e5d0e7b5c6.slice/crio-af87909f074419f5a49706ff02eabf8a083f5ad2a5fbeb910a8082b93318a832 WatchSource:0}: Error finding container af87909f074419f5a49706ff02eabf8a083f5ad2a5fbeb910a8082b93318a832: Status 404 returned error can't find the container with id af87909f074419f5a49706ff02eabf8a083f5ad2a5fbeb910a8082b93318a832
Jan 28 20:58:52 crc kubenswrapper[4746]: I0128 20:58:52.292238 4746 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/barbican-db-create-27prr"]
Jan 28 20:58:52 crc kubenswrapper[4746]: I0128 20:58:52.322453 4746 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/neutron-564b-account-create-update-m6nh8"]
Jan 28 20:58:52 crc kubenswrapper[4746]: I0128 20:58:52.570946 4746 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cloudkitty-db-create-6pvd6"]
Jan 28 20:58:52 crc kubenswrapper[4746]: W0128 20:58:52.582238 4746 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod2aed034c_35b5_4fd4_b0c4_cebbdfb41da2.slice/crio-d471a2b18dd03278c2450d4227c778cf7b7906bceb6a8b8ea2b168e5987e8b0f WatchSource:0}: Error finding container d471a2b18dd03278c2450d4227c778cf7b7906bceb6a8b8ea2b168e5987e8b0f: Status 404 returned error can't find the container with id d471a2b18dd03278c2450d4227c778cf7b7906bceb6a8b8ea2b168e5987e8b0f
Jan 28 20:58:52 crc kubenswrapper[4746]: I0128 20:58:52.715576 4746 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/barbican-4eb8-account-create-update-k449n"]
Jan 28 20:58:52 crc kubenswrapper[4746]: W0128 20:58:52.715639 4746 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podc4a37860_0f42_4d7c_89e0_8505e8e49c59.slice/crio-ad2f4a3c5954186ccb0ca39a6ce3172a0a1c5ef7ec6d594a1e9269865d91d60d WatchSource:0}: Error finding container ad2f4a3c5954186ccb0ca39a6ce3172a0a1c5ef7ec6d594a1e9269865d91d60d: Status 404 returned error can't find the container with id ad2f4a3c5954186ccb0ca39a6ce3172a0a1c5ef7ec6d594a1e9269865d91d60d
Jan 28 20:58:52 crc kubenswrapper[4746]: I0128 20:58:52.744034 4746 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/neutron-db-create-dlwzg"]
Jan 28 20:58:52 crc kubenswrapper[4746]: I0128 20:58:52.744692 4746 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cloudkitty-6f70-account-create-update-zsths"]
Jan 28 20:58:52 crc kubenswrapper[4746]: W0128 20:58:52.751234 4746 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podfb4a489a_d80d_49a4_9624_ecfe6a4200ca.slice/crio-5c70ad139cf4e200678dce14e55d5e86bcd738be3708dfbc0ec0f8f471848a24 WatchSource:0}: Error finding container 5c70ad139cf4e200678dce14e55d5e86bcd738be3708dfbc0ec0f8f471848a24: Status 404 returned error can't find the container with id 5c70ad139cf4e200678dce14e55d5e86bcd738be3708dfbc0ec0f8f471848a24
Jan 28 20:58:53 crc kubenswrapper[4746]: I0128 20:58:53.057749 4746 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-564b-account-create-update-m6nh8" event={"ID":"da1723e7-7026-414e-b1e1-79911e331408","Type":"ContainerStarted","Data":"d7e34ce150b2322e65670a106c0878e3573205aebc249421f89da536b3a40251"}
Jan 28 20:58:53 crc kubenswrapper[4746]: I0128 20:58:53.058031 4746 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-564b-account-create-update-m6nh8" event={"ID":"da1723e7-7026-414e-b1e1-79911e331408","Type":"ContainerStarted","Data":"d8ae3e54e8cbd8df2ee805212faea4258c0efc0f59edb18becf670d8a0186ce0"}
Jan 28 20:58:53 crc kubenswrapper[4746]: I0128 20:58:53.065167 4746 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-db-create-dlwzg" event={"ID":"c4a37860-0f42-4d7c-89e0-8505e8e49c59","Type":"ContainerStarted","Data":"51190e28e35697c5ce6351b8e77e5a505b8502b14e2c1dcb41d9348a50e9aae0"}
Jan 28 20:58:53 crc kubenswrapper[4746]: I0128 20:58:53.065197 4746 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-db-create-dlwzg" event={"ID":"c4a37860-0f42-4d7c-89e0-8505e8e49c59","Type":"ContainerStarted","Data":"ad2f4a3c5954186ccb0ca39a6ce3172a0a1c5ef7ec6d594a1e9269865d91d60d"}
Jan 28 20:58:53 crc kubenswrapper[4746]: I0128 20:58:53.067830 4746 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-db-create-27prr" event={"ID":"a7ca0a5b-c5ce-48d9-89bc-5f693869c1d5","Type":"ContainerStarted","Data":"68613854b61f89e684047d7c10e6855b82ef3a9fefe04b5e735ff2578df9b978"}
Jan 28 20:58:53 crc kubenswrapper[4746]: I0128 20:58:53.067852 4746 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-db-create-27prr" event={"ID":"a7ca0a5b-c5ce-48d9-89bc-5f693869c1d5","Type":"ContainerStarted","Data":"5b075e56aa69af819848f0626947761447a29601516062badfcaf022a77735c3"}
Jan 28 20:58:53 crc kubenswrapper[4746]: I0128 20:58:53.070888 4746 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-db-sync-7mlnk" event={"ID":"1176d52c-0fec-4346-ad79-af25ac4c3f62","Type":"ContainerStarted","Data":"047893e9bcb3936b14de9e12190ed07b929028ec05ed100b6d303d345d2956a8"}
Jan 28 20:58:53 crc kubenswrapper[4746]: I0128 20:58:53.073416 4746 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cloudkitty-6f70-account-create-update-zsths" event={"ID":"fb4a489a-d80d-49a4-9624-ecfe6a4200ca","Type":"ContainerStarted","Data":"5c70ad139cf4e200678dce14e55d5e86bcd738be3708dfbc0ec0f8f471848a24"}
Jan 28 20:58:53 crc kubenswrapper[4746]: I0128 20:58:53.077902 4746 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-db-create-kp8xl" event={"ID":"db403f32-6720-4260-a66d-45e5d0e7b5c6","Type":"ContainerStarted","Data":"389aa29b705d25b729a72555f23f81dc2c5269229ea8207977650d2dd547da96"}
Jan 28 20:58:53 crc kubenswrapper[4746]: I0128 20:58:53.077928 4746 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-db-create-kp8xl" event={"ID":"db403f32-6720-4260-a66d-45e5d0e7b5c6","Type":"ContainerStarted","Data":"af87909f074419f5a49706ff02eabf8a083f5ad2a5fbeb910a8082b93318a832"}
Jan 28 20:58:53 crc kubenswrapper[4746]: I0128 20:58:53.085803 4746 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-4eb8-account-create-update-k449n" event={"ID":"eaf1cab3-21ae-4850-a732-d0e75f55ffc4","Type":"ContainerStarted","Data":"6e3e149dc3582a03e82019199a0aca09e0e0edb92775c952b6d66f45ebbdcdc5"}
Jan 28 20:58:53 crc kubenswrapper[4746]: I0128 20:58:53.091386 4746 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/root-account-create-update-9xsgn" event={"ID":"90854a3f-3d4b-4a6b-a269-d4b33cbf94c9","Type":"ContainerStarted","Data":"906ae965fc7672b88c668edce8c19b1b658c73b843592f53f153c5d2ad471532"}
Jan 28 20:58:53 crc kubenswrapper[4746]: I0128 20:58:53.094279 4746 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-2635-account-create-update-hz28g" event={"ID":"e851c5c3-8d3c-4a3d-ac8c-95d5eb7113fc","Type":"ContainerStarted","Data":"41aa4c87a1fca91145b590894f1cb3b229262a1f21e38f4a02bcd4065c56e8e1"}
Jan 28 20:58:53 crc kubenswrapper[4746]: I0128 20:58:53.094309 4746 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-2635-account-create-update-hz28g" event={"ID":"e851c5c3-8d3c-4a3d-ac8c-95d5eb7113fc","Type":"ContainerStarted","Data":"beb7a3385952ec7bacb02835a4fa41d4b54cf4dda046274d7627549eb92bc11b"}
Jan 28 20:58:53 crc kubenswrapper[4746]: I0128 20:58:53.119366 4746 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cloudkitty-db-create-6pvd6" event={"ID":"2aed034c-35b5-4fd4-b0c4-cebbdfb41da2","Type":"ContainerStarted","Data":"d97d47184e7df00db5d7e48a8f7a9092af3bdb44d39a15968acfaacb8c9ef76d"}
Jan 28 20:58:53 crc kubenswrapper[4746]: I0128 20:58:53.119406 4746 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cloudkitty-db-create-6pvd6" event={"ID":"2aed034c-35b5-4fd4-b0c4-cebbdfb41da2","Type":"ContainerStarted","Data":"d471a2b18dd03278c2450d4227c778cf7b7906bceb6a8b8ea2b168e5987e8b0f"}
Jan 28 20:58:53 crc kubenswrapper[4746]: I0128 20:58:53.195771 4746 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/root-account-create-update-9xsgn" podStartSLOduration=3.195752917 podStartE2EDuration="3.195752917s" podCreationTimestamp="2026-01-28 20:58:50 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-28 20:58:53.115590948 +0000 UTC m=+1161.071777302" watchObservedRunningTime="2026-01-28 20:58:53.195752917 +0000 UTC m=+1161.151939271"
Jan 28 20:58:53 crc kubenswrapper[4746]: I0128 20:58:53.202353 4746 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/cinder-db-create-kp8xl" podStartSLOduration=3.202334293 podStartE2EDuration="3.202334293s" podCreationTimestamp="2026-01-28 20:58:50 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-28 20:58:53.130256751 +0000 UTC m=+1161.086443095" watchObservedRunningTime="2026-01-28 20:58:53.202334293 +0000 UTC m=+1161.158520647"
Jan 28 20:58:53 crc kubenswrapper[4746]: I0128 20:58:53.215679 4746 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/neutron-564b-account-create-update-m6nh8" podStartSLOduration=2.215659881 podStartE2EDuration="2.215659881s" podCreationTimestamp="2026-01-28 20:58:51 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-28 20:58:53.153661899 +0000 UTC m=+1161.109848263" watchObservedRunningTime="2026-01-28 20:58:53.215659881 +0000 UTC m=+1161.171846235"
Jan 28 20:58:53 crc kubenswrapper[4746]: I0128 20:58:53.235698 4746 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/cinder-2635-account-create-update-hz28g" podStartSLOduration=3.235677517 podStartE2EDuration="3.235677517s" podCreationTimestamp="2026-01-28 20:58:50 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-28 20:58:53.17346692 +0000 UTC m=+1161.129653274" watchObservedRunningTime="2026-01-28 20:58:53.235677517 +0000 UTC m=+1161.191863871"
Jan 28 20:58:53 crc kubenswrapper[4746]: I0128 20:58:53.237032 4746 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/barbican-db-create-27prr" podStartSLOduration=3.237025853 podStartE2EDuration="3.237025853s" podCreationTimestamp="2026-01-28 20:58:50 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-28 20:58:53.192283724 +0000 UTC m=+1161.148470088" watchObservedRunningTime="2026-01-28 20:58:53.237025853 +0000 UTC m=+1161.193212207"
Jan 28 20:58:53 crc kubenswrapper[4746]: I0128 20:58:53.254531 4746 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/cloudkitty-db-create-6pvd6" podStartSLOduration=2.254510653 podStartE2EDuration="2.254510653s" podCreationTimestamp="2026-01-28 20:58:51 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-28 20:58:53.233409447 +0000 UTC m=+1161.189595801" watchObservedRunningTime="2026-01-28 20:58:53.254510653 +0000 UTC m=+1161.210697007"
Jan 28 20:58:54 crc kubenswrapper[4746]: I0128 20:58:54.151562 4746 generic.go:334] "Generic (PLEG): container finished" podID="2aed034c-35b5-4fd4-b0c4-cebbdfb41da2" containerID="d97d47184e7df00db5d7e48a8f7a9092af3bdb44d39a15968acfaacb8c9ef76d" exitCode=0
Jan 28 20:58:54 crc kubenswrapper[4746]: I0128 20:58:54.151851 4746 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cloudkitty-db-create-6pvd6" event={"ID":"2aed034c-35b5-4fd4-b0c4-cebbdfb41da2","Type":"ContainerDied","Data":"d97d47184e7df00db5d7e48a8f7a9092af3bdb44d39a15968acfaacb8c9ef76d"}
Jan 28 20:58:54 crc kubenswrapper[4746]: I0128 20:58:54.155506 4746 generic.go:334] "Generic (PLEG): container finished" podID="c4a37860-0f42-4d7c-89e0-8505e8e49c59" containerID="51190e28e35697c5ce6351b8e77e5a505b8502b14e2c1dcb41d9348a50e9aae0" exitCode=0
Jan 28 20:58:54 crc kubenswrapper[4746]: I0128 20:58:54.155692 4746 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-db-create-dlwzg" event={"ID":"c4a37860-0f42-4d7c-89e0-8505e8e49c59","Type":"ContainerDied","Data":"51190e28e35697c5ce6351b8e77e5a505b8502b14e2c1dcb41d9348a50e9aae0"}
Jan 28 20:58:54 crc kubenswrapper[4746]: I0128 20:58:54.169364 4746 generic.go:334] "Generic (PLEG): container finished" podID="e851c5c3-8d3c-4a3d-ac8c-95d5eb7113fc" containerID="41aa4c87a1fca91145b590894f1cb3b229262a1f21e38f4a02bcd4065c56e8e1" exitCode=0
Jan 28 20:58:54 crc kubenswrapper[4746]: I0128 20:58:54.169577 4746 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-2635-account-create-update-hz28g" event={"ID":"e851c5c3-8d3c-4a3d-ac8c-95d5eb7113fc","Type":"ContainerDied","Data":"41aa4c87a1fca91145b590894f1cb3b229262a1f21e38f4a02bcd4065c56e8e1"}
Jan 28 20:58:54 crc kubenswrapper[4746]: I0128 20:58:54.171627 4746 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cloudkitty-6f70-account-create-update-zsths" event={"ID":"fb4a489a-d80d-49a4-9624-ecfe6a4200ca","Type":"ContainerStarted","Data":"5948b5a9f847a8b980f1decf747168335524f42737ad09efc0b1fca00783847a"}
Jan 28 20:58:54 crc kubenswrapper[4746]: I0128 20:58:54.175987 4746 generic.go:334] "Generic (PLEG): container finished" podID="da1723e7-7026-414e-b1e1-79911e331408" containerID="d7e34ce150b2322e65670a106c0878e3573205aebc249421f89da536b3a40251" exitCode=0
Jan 28 20:58:54 crc kubenswrapper[4746]: I0128 20:58:54.176058 4746 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-564b-account-create-update-m6nh8" event={"ID":"da1723e7-7026-414e-b1e1-79911e331408","Type":"ContainerDied","Data":"d7e34ce150b2322e65670a106c0878e3573205aebc249421f89da536b3a40251"}
Jan 28 20:58:54 crc kubenswrapper[4746]: I0128 20:58:54.179007 4746 generic.go:334] "Generic (PLEG): container finished" podID="a7ca0a5b-c5ce-48d9-89bc-5f693869c1d5" containerID="68613854b61f89e684047d7c10e6855b82ef3a9fefe04b5e735ff2578df9b978" exitCode=0
Jan 28 20:58:54 crc kubenswrapper[4746]: I0128 20:58:54.179217 4746 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-db-create-27prr" event={"ID":"a7ca0a5b-c5ce-48d9-89bc-5f693869c1d5","Type":"ContainerDied","Data":"68613854b61f89e684047d7c10e6855b82ef3a9fefe04b5e735ff2578df9b978"}
Jan 28 20:58:54 crc kubenswrapper[4746]: I0128 20:58:54.188459 4746 generic.go:334] "Generic (PLEG): container finished" podID="eaf1cab3-21ae-4850-a732-d0e75f55ffc4" containerID="b6c27ab23e35d3e023d033707ffd8c5967ba793323f79bc0ea3639bf0b9ccab6" exitCode=0
Jan 28 20:58:54 crc kubenswrapper[4746]: I0128 20:58:54.188679 4746 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-4eb8-account-create-update-k449n" event={"ID":"eaf1cab3-21ae-4850-a732-d0e75f55ffc4","Type":"ContainerDied","Data":"b6c27ab23e35d3e023d033707ffd8c5967ba793323f79bc0ea3639bf0b9ccab6"}
Jan 28 20:58:54 crc kubenswrapper[4746]: I0128 20:58:54.191472 4746 generic.go:334] "Generic (PLEG): container finished" podID="90854a3f-3d4b-4a6b-a269-d4b33cbf94c9" containerID="906ae965fc7672b88c668edce8c19b1b658c73b843592f53f153c5d2ad471532" exitCode=0
Jan 28 20:58:54 crc kubenswrapper[4746]: I0128 20:58:54.191548 4746 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/root-account-create-update-9xsgn" event={"ID":"90854a3f-3d4b-4a6b-a269-d4b33cbf94c9","Type":"ContainerDied","Data":"906ae965fc7672b88c668edce8c19b1b658c73b843592f53f153c5d2ad471532"}
Jan 28 20:58:54 crc kubenswrapper[4746]: I0128 20:58:54.195017 4746 generic.go:334] "Generic (PLEG): container finished" podID="db403f32-6720-4260-a66d-45e5d0e7b5c6" containerID="389aa29b705d25b729a72555f23f81dc2c5269229ea8207977650d2dd547da96" exitCode=0
Jan 28 20:58:54 crc kubenswrapper[4746]: I0128 20:58:54.195071 4746 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-db-create-kp8xl" event={"ID":"db403f32-6720-4260-a66d-45e5d0e7b5c6","Type":"ContainerDied","Data":"389aa29b705d25b729a72555f23f81dc2c5269229ea8207977650d2dd547da96"}
Jan 28 20:58:55 crc kubenswrapper[4746]: I0128 20:58:55.217224 4746 generic.go:334] "Generic (PLEG): container finished" podID="fb4a489a-d80d-49a4-9624-ecfe6a4200ca" containerID="5948b5a9f847a8b980f1decf747168335524f42737ad09efc0b1fca00783847a" exitCode=0
Jan 28 20:58:55 crc kubenswrapper[4746]: I0128 20:58:55.217334 4746 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cloudkitty-6f70-account-create-update-zsths" event={"ID":"fb4a489a-d80d-49a4-9624-ecfe6a4200ca","Type":"ContainerDied","Data":"5948b5a9f847a8b980f1decf747168335524f42737ad09efc0b1fca00783847a"}
Jan 28 20:58:56 crc kubenswrapper[4746]: I0128 20:58:56.042350 4746 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/prometheus-metric-storage-0"]
Jan 28 20:58:56 crc kubenswrapper[4746]: I0128 20:58:56.042805 4746 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/prometheus-metric-storage-0" podUID="fde93743-7b9d-4175-abdf-bd74008cf4b0" containerName="prometheus" containerID="cri-o://7f07d0762d2d78bb7a35ece131dabc38efd5554409eccb72f034b896277380c1" gracePeriod=600
Jan 28 20:58:56 crc kubenswrapper[4746]: I0128 20:58:56.043209 4746 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/prometheus-metric-storage-0" podUID="fde93743-7b9d-4175-abdf-bd74008cf4b0" containerName="thanos-sidecar" containerID="cri-o://c0ddb2ac8b981c68e3c5cdd2b45c613d4095039ce06d3da8fa9d8eabf8cbe483" gracePeriod=600
Jan 28 20:58:56 crc kubenswrapper[4746]: I0128 20:58:56.043265 4746 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/prometheus-metric-storage-0" podUID="fde93743-7b9d-4175-abdf-bd74008cf4b0" containerName="config-reloader" containerID="cri-o://87a1fc8f482054b4ebcdef39a9f24f66265346cafd84734616e8affa161376bd" gracePeriod=600
Jan 28 20:58:56 crc kubenswrapper[4746]: I0128 20:58:56.230115 4746 generic.go:334] "Generic (PLEG): container finished" podID="fde93743-7b9d-4175-abdf-bd74008cf4b0" containerID="c0ddb2ac8b981c68e3c5cdd2b45c613d4095039ce06d3da8fa9d8eabf8cbe483" exitCode=0
Jan 28 20:58:56 crc kubenswrapper[4746]: I0128 20:58:56.230155 4746 generic.go:334] "Generic (PLEG): container finished" podID="fde93743-7b9d-4175-abdf-bd74008cf4b0" containerID="7f07d0762d2d78bb7a35ece131dabc38efd5554409eccb72f034b896277380c1" exitCode=0
Jan 28 20:58:56 crc kubenswrapper[4746]: I0128 20:58:56.230178 4746 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/prometheus-metric-storage-0" event={"ID":"fde93743-7b9d-4175-abdf-bd74008cf4b0","Type":"ContainerDied","Data":"c0ddb2ac8b981c68e3c5cdd2b45c613d4095039ce06d3da8fa9d8eabf8cbe483"}
Jan 28 20:58:56 crc kubenswrapper[4746]: I0128 20:58:56.230214 4746 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/prometheus-metric-storage-0" event={"ID":"fde93743-7b9d-4175-abdf-bd74008cf4b0","Type":"ContainerDied","Data":"7f07d0762d2d78bb7a35ece131dabc38efd5554409eccb72f034b896277380c1"}
Jan 28 20:58:56 crc kubenswrapper[4746]: I0128 20:58:56.809508 4746 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/prometheus-metric-storage-0" podUID="fde93743-7b9d-4175-abdf-bd74008cf4b0" containerName="prometheus" probeResult="failure" output="Get \"http://10.217.0.113:9090/-/ready\": dial tcp 10.217.0.113:9090: connect: connection refused"
Jan 28 20:58:57 crc kubenswrapper[4746]: I0128 20:58:57.242015 4746 generic.go:334] "Generic (PLEG): container finished"
podID="fde93743-7b9d-4175-abdf-bd74008cf4b0" containerID="87a1fc8f482054b4ebcdef39a9f24f66265346cafd84734616e8affa161376bd" exitCode=0 Jan 28 20:58:57 crc kubenswrapper[4746]: I0128 20:58:57.242053 4746 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/prometheus-metric-storage-0" event={"ID":"fde93743-7b9d-4175-abdf-bd74008cf4b0","Type":"ContainerDied","Data":"87a1fc8f482054b4ebcdef39a9f24f66265346cafd84734616e8affa161376bd"} Jan 28 20:58:57 crc kubenswrapper[4746]: I0128 20:58:57.253962 4746 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/ovn-northd-0" Jan 28 20:58:58 crc kubenswrapper[4746]: I0128 20:58:58.222049 4746 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/projected/39e8de66-78c6-45cf-b026-7783ef89922d-etc-swift\") pod \"swift-storage-0\" (UID: \"39e8de66-78c6-45cf-b026-7783ef89922d\") " pod="openstack/swift-storage-0" Jan 28 20:58:58 crc kubenswrapper[4746]: I0128 20:58:58.236512 4746 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-swift\" (UniqueName: \"kubernetes.io/projected/39e8de66-78c6-45cf-b026-7783ef89922d-etc-swift\") pod \"swift-storage-0\" (UID: \"39e8de66-78c6-45cf-b026-7783ef89922d\") " pod="openstack/swift-storage-0" Jan 28 20:58:58 crc kubenswrapper[4746]: I0128 20:58:58.441289 4746 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/swift-storage-0" Jan 28 20:58:59 crc kubenswrapper[4746]: I0128 20:58:59.236946 4746 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/cloudkitty-lokistack-ingester-0" podUID="d3cad0b0-7b53-4280-9dec-05e01692820c" containerName="loki-ingester" probeResult="failure" output="HTTP probe failed with statuscode: 503" Jan 28 20:58:59 crc kubenswrapper[4746]: I0128 20:58:59.445095 4746 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/cinder-db-create-kp8xl" Jan 28 20:58:59 crc kubenswrapper[4746]: I0128 20:58:59.456785 4746 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/neutron-564b-account-create-update-m6nh8" Jan 28 20:58:59 crc kubenswrapper[4746]: I0128 20:58:59.481659 4746 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/cloudkitty-6f70-account-create-update-zsths" Jan 28 20:58:59 crc kubenswrapper[4746]: I0128 20:58:59.489949 4746 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-2635-account-create-update-hz28g" Jan 28 20:58:59 crc kubenswrapper[4746]: I0128 20:58:59.499995 4746 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/cloudkitty-db-create-6pvd6" Jan 28 20:58:59 crc kubenswrapper[4746]: I0128 20:58:59.521733 4746 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/neutron-db-create-dlwzg" Jan 28 20:58:59 crc kubenswrapper[4746]: I0128 20:58:59.528734 4746 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/barbican-db-create-27prr" Jan 28 20:58:59 crc kubenswrapper[4746]: I0128 20:58:59.545568 4746 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/root-account-create-update-9xsgn" Jan 28 20:58:59 crc kubenswrapper[4746]: I0128 20:58:59.550614 4746 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/barbican-4eb8-account-create-update-k449n" Jan 28 20:58:59 crc kubenswrapper[4746]: I0128 20:58:59.556594 4746 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-jl9hg\" (UniqueName: \"kubernetes.io/projected/db403f32-6720-4260-a66d-45e5d0e7b5c6-kube-api-access-jl9hg\") pod \"db403f32-6720-4260-a66d-45e5d0e7b5c6\" (UID: \"db403f32-6720-4260-a66d-45e5d0e7b5c6\") " Jan 28 20:58:59 crc kubenswrapper[4746]: I0128 20:58:59.556710 4746 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/da1723e7-7026-414e-b1e1-79911e331408-operator-scripts\") pod \"da1723e7-7026-414e-b1e1-79911e331408\" (UID: \"da1723e7-7026-414e-b1e1-79911e331408\") " Jan 28 20:58:59 crc kubenswrapper[4746]: I0128 20:58:59.556986 4746 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-drf9n\" (UniqueName: \"kubernetes.io/projected/da1723e7-7026-414e-b1e1-79911e331408-kube-api-access-drf9n\") pod \"da1723e7-7026-414e-b1e1-79911e331408\" (UID: \"da1723e7-7026-414e-b1e1-79911e331408\") " Jan 28 20:58:59 crc kubenswrapper[4746]: I0128 20:58:59.557026 4746 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/fb4a489a-d80d-49a4-9624-ecfe6a4200ca-operator-scripts\") pod \"fb4a489a-d80d-49a4-9624-ecfe6a4200ca\" (UID: \"fb4a489a-d80d-49a4-9624-ecfe6a4200ca\") " Jan 28 20:58:59 crc kubenswrapper[4746]: I0128 20:58:59.557114 4746 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-qrxbb\" (UniqueName: \"kubernetes.io/projected/fb4a489a-d80d-49a4-9624-ecfe6a4200ca-kube-api-access-qrxbb\") pod \"fb4a489a-d80d-49a4-9624-ecfe6a4200ca\" (UID: \"fb4a489a-d80d-49a4-9624-ecfe6a4200ca\") " Jan 28 20:58:59 crc kubenswrapper[4746]: I0128 20:58:59.557200 4746 reconciler_common.go:159] 
"operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/db403f32-6720-4260-a66d-45e5d0e7b5c6-operator-scripts\") pod \"db403f32-6720-4260-a66d-45e5d0e7b5c6\" (UID: \"db403f32-6720-4260-a66d-45e5d0e7b5c6\") " Jan 28 20:58:59 crc kubenswrapper[4746]: I0128 20:58:59.558983 4746 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/fb4a489a-d80d-49a4-9624-ecfe6a4200ca-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "fb4a489a-d80d-49a4-9624-ecfe6a4200ca" (UID: "fb4a489a-d80d-49a4-9624-ecfe6a4200ca"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 28 20:58:59 crc kubenswrapper[4746]: I0128 20:58:59.559062 4746 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/da1723e7-7026-414e-b1e1-79911e331408-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "da1723e7-7026-414e-b1e1-79911e331408" (UID: "da1723e7-7026-414e-b1e1-79911e331408"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 28 20:58:59 crc kubenswrapper[4746]: I0128 20:58:59.559072 4746 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/db403f32-6720-4260-a66d-45e5d0e7b5c6-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "db403f32-6720-4260-a66d-45e5d0e7b5c6" (UID: "db403f32-6720-4260-a66d-45e5d0e7b5c6"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 28 20:58:59 crc kubenswrapper[4746]: I0128 20:58:59.564293 4746 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/fb4a489a-d80d-49a4-9624-ecfe6a4200ca-kube-api-access-qrxbb" (OuterVolumeSpecName: "kube-api-access-qrxbb") pod "fb4a489a-d80d-49a4-9624-ecfe6a4200ca" (UID: "fb4a489a-d80d-49a4-9624-ecfe6a4200ca"). 
InnerVolumeSpecName "kube-api-access-qrxbb". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 28 20:58:59 crc kubenswrapper[4746]: I0128 20:58:59.564836 4746 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/db403f32-6720-4260-a66d-45e5d0e7b5c6-kube-api-access-jl9hg" (OuterVolumeSpecName: "kube-api-access-jl9hg") pod "db403f32-6720-4260-a66d-45e5d0e7b5c6" (UID: "db403f32-6720-4260-a66d-45e5d0e7b5c6"). InnerVolumeSpecName "kube-api-access-jl9hg". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 28 20:58:59 crc kubenswrapper[4746]: I0128 20:58:59.590938 4746 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/da1723e7-7026-414e-b1e1-79911e331408-kube-api-access-drf9n" (OuterVolumeSpecName: "kube-api-access-drf9n") pod "da1723e7-7026-414e-b1e1-79911e331408" (UID: "da1723e7-7026-414e-b1e1-79911e331408"). InnerVolumeSpecName "kube-api-access-drf9n". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 28 20:58:59 crc kubenswrapper[4746]: I0128 20:58:59.658828 4746 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/e851c5c3-8d3c-4a3d-ac8c-95d5eb7113fc-operator-scripts\") pod \"e851c5c3-8d3c-4a3d-ac8c-95d5eb7113fc\" (UID: \"e851c5c3-8d3c-4a3d-ac8c-95d5eb7113fc\") " Jan 28 20:58:59 crc kubenswrapper[4746]: I0128 20:58:59.658870 4746 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-7sx7r\" (UniqueName: \"kubernetes.io/projected/c4a37860-0f42-4d7c-89e0-8505e8e49c59-kube-api-access-7sx7r\") pod \"c4a37860-0f42-4d7c-89e0-8505e8e49c59\" (UID: \"c4a37860-0f42-4d7c-89e0-8505e8e49c59\") " Jan 28 20:58:59 crc kubenswrapper[4746]: I0128 20:58:59.658910 4746 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-l8w79\" (UniqueName: 
\"kubernetes.io/projected/eaf1cab3-21ae-4850-a732-d0e75f55ffc4-kube-api-access-l8w79\") pod \"eaf1cab3-21ae-4850-a732-d0e75f55ffc4\" (UID: \"eaf1cab3-21ae-4850-a732-d0e75f55ffc4\") " Jan 28 20:58:59 crc kubenswrapper[4746]: I0128 20:58:59.658946 4746 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/2aed034c-35b5-4fd4-b0c4-cebbdfb41da2-operator-scripts\") pod \"2aed034c-35b5-4fd4-b0c4-cebbdfb41da2\" (UID: \"2aed034c-35b5-4fd4-b0c4-cebbdfb41da2\") " Jan 28 20:58:59 crc kubenswrapper[4746]: I0128 20:58:59.659004 4746 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/c4a37860-0f42-4d7c-89e0-8505e8e49c59-operator-scripts\") pod \"c4a37860-0f42-4d7c-89e0-8505e8e49c59\" (UID: \"c4a37860-0f42-4d7c-89e0-8505e8e49c59\") " Jan 28 20:58:59 crc kubenswrapper[4746]: I0128 20:58:59.659063 4746 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/90854a3f-3d4b-4a6b-a269-d4b33cbf94c9-operator-scripts\") pod \"90854a3f-3d4b-4a6b-a269-d4b33cbf94c9\" (UID: \"90854a3f-3d4b-4a6b-a269-d4b33cbf94c9\") " Jan 28 20:58:59 crc kubenswrapper[4746]: I0128 20:58:59.659148 4746 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/eaf1cab3-21ae-4850-a732-d0e75f55ffc4-operator-scripts\") pod \"eaf1cab3-21ae-4850-a732-d0e75f55ffc4\" (UID: \"eaf1cab3-21ae-4850-a732-d0e75f55ffc4\") " Jan 28 20:58:59 crc kubenswrapper[4746]: I0128 20:58:59.659197 4746 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-dbdpr\" (UniqueName: \"kubernetes.io/projected/2aed034c-35b5-4fd4-b0c4-cebbdfb41da2-kube-api-access-dbdpr\") pod \"2aed034c-35b5-4fd4-b0c4-cebbdfb41da2\" (UID: \"2aed034c-35b5-4fd4-b0c4-cebbdfb41da2\") " Jan 28 20:58:59 
crc kubenswrapper[4746]: I0128 20:58:59.659222 4746 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-hqh9z\" (UniqueName: \"kubernetes.io/projected/90854a3f-3d4b-4a6b-a269-d4b33cbf94c9-kube-api-access-hqh9z\") pod \"90854a3f-3d4b-4a6b-a269-d4b33cbf94c9\" (UID: \"90854a3f-3d4b-4a6b-a269-d4b33cbf94c9\") " Jan 28 20:58:59 crc kubenswrapper[4746]: I0128 20:58:59.659239 4746 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-jl58t\" (UniqueName: \"kubernetes.io/projected/a7ca0a5b-c5ce-48d9-89bc-5f693869c1d5-kube-api-access-jl58t\") pod \"a7ca0a5b-c5ce-48d9-89bc-5f693869c1d5\" (UID: \"a7ca0a5b-c5ce-48d9-89bc-5f693869c1d5\") " Jan 28 20:58:59 crc kubenswrapper[4746]: I0128 20:58:59.659276 4746 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/a7ca0a5b-c5ce-48d9-89bc-5f693869c1d5-operator-scripts\") pod \"a7ca0a5b-c5ce-48d9-89bc-5f693869c1d5\" (UID: \"a7ca0a5b-c5ce-48d9-89bc-5f693869c1d5\") " Jan 28 20:58:59 crc kubenswrapper[4746]: I0128 20:58:59.659302 4746 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-jdhpp\" (UniqueName: \"kubernetes.io/projected/e851c5c3-8d3c-4a3d-ac8c-95d5eb7113fc-kube-api-access-jdhpp\") pod \"e851c5c3-8d3c-4a3d-ac8c-95d5eb7113fc\" (UID: \"e851c5c3-8d3c-4a3d-ac8c-95d5eb7113fc\") " Jan 28 20:58:59 crc kubenswrapper[4746]: I0128 20:58:59.659677 4746 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-jl9hg\" (UniqueName: \"kubernetes.io/projected/db403f32-6720-4260-a66d-45e5d0e7b5c6-kube-api-access-jl9hg\") on node \"crc\" DevicePath \"\"" Jan 28 20:58:59 crc kubenswrapper[4746]: I0128 20:58:59.659695 4746 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/da1723e7-7026-414e-b1e1-79911e331408-operator-scripts\") on node \"crc\" DevicePath \"\"" 
Jan 28 20:58:59 crc kubenswrapper[4746]: I0128 20:58:59.659704 4746 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-drf9n\" (UniqueName: \"kubernetes.io/projected/da1723e7-7026-414e-b1e1-79911e331408-kube-api-access-drf9n\") on node \"crc\" DevicePath \"\"" Jan 28 20:58:59 crc kubenswrapper[4746]: I0128 20:58:59.659714 4746 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/fb4a489a-d80d-49a4-9624-ecfe6a4200ca-operator-scripts\") on node \"crc\" DevicePath \"\"" Jan 28 20:58:59 crc kubenswrapper[4746]: I0128 20:58:59.659722 4746 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-qrxbb\" (UniqueName: \"kubernetes.io/projected/fb4a489a-d80d-49a4-9624-ecfe6a4200ca-kube-api-access-qrxbb\") on node \"crc\" DevicePath \"\"" Jan 28 20:58:59 crc kubenswrapper[4746]: I0128 20:58:59.659731 4746 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/db403f32-6720-4260-a66d-45e5d0e7b5c6-operator-scripts\") on node \"crc\" DevicePath \"\"" Jan 28 20:58:59 crc kubenswrapper[4746]: I0128 20:58:59.659904 4746 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/a7ca0a5b-c5ce-48d9-89bc-5f693869c1d5-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "a7ca0a5b-c5ce-48d9-89bc-5f693869c1d5" (UID: "a7ca0a5b-c5ce-48d9-89bc-5f693869c1d5"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 28 20:58:59 crc kubenswrapper[4746]: I0128 20:58:59.659989 4746 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/eaf1cab3-21ae-4850-a732-d0e75f55ffc4-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "eaf1cab3-21ae-4850-a732-d0e75f55ffc4" (UID: "eaf1cab3-21ae-4850-a732-d0e75f55ffc4"). InnerVolumeSpecName "operator-scripts". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 28 20:58:59 crc kubenswrapper[4746]: I0128 20:58:59.660034 4746 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/90854a3f-3d4b-4a6b-a269-d4b33cbf94c9-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "90854a3f-3d4b-4a6b-a269-d4b33cbf94c9" (UID: "90854a3f-3d4b-4a6b-a269-d4b33cbf94c9"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 28 20:58:59 crc kubenswrapper[4746]: I0128 20:58:59.660330 4746 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/c4a37860-0f42-4d7c-89e0-8505e8e49c59-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "c4a37860-0f42-4d7c-89e0-8505e8e49c59" (UID: "c4a37860-0f42-4d7c-89e0-8505e8e49c59"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 28 20:58:59 crc kubenswrapper[4746]: I0128 20:58:59.660413 4746 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/e851c5c3-8d3c-4a3d-ac8c-95d5eb7113fc-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "e851c5c3-8d3c-4a3d-ac8c-95d5eb7113fc" (UID: "e851c5c3-8d3c-4a3d-ac8c-95d5eb7113fc"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 28 20:58:59 crc kubenswrapper[4746]: I0128 20:58:59.660597 4746 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/2aed034c-35b5-4fd4-b0c4-cebbdfb41da2-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "2aed034c-35b5-4fd4-b0c4-cebbdfb41da2" (UID: "2aed034c-35b5-4fd4-b0c4-cebbdfb41da2"). InnerVolumeSpecName "operator-scripts". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 28 20:58:59 crc kubenswrapper[4746]: I0128 20:58:59.662910 4746 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/90854a3f-3d4b-4a6b-a269-d4b33cbf94c9-kube-api-access-hqh9z" (OuterVolumeSpecName: "kube-api-access-hqh9z") pod "90854a3f-3d4b-4a6b-a269-d4b33cbf94c9" (UID: "90854a3f-3d4b-4a6b-a269-d4b33cbf94c9"). InnerVolumeSpecName "kube-api-access-hqh9z". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 28 20:58:59 crc kubenswrapper[4746]: I0128 20:58:59.663161 4746 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/a7ca0a5b-c5ce-48d9-89bc-5f693869c1d5-kube-api-access-jl58t" (OuterVolumeSpecName: "kube-api-access-jl58t") pod "a7ca0a5b-c5ce-48d9-89bc-5f693869c1d5" (UID: "a7ca0a5b-c5ce-48d9-89bc-5f693869c1d5"). InnerVolumeSpecName "kube-api-access-jl58t". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 28 20:58:59 crc kubenswrapper[4746]: I0128 20:58:59.664337 4746 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/e851c5c3-8d3c-4a3d-ac8c-95d5eb7113fc-kube-api-access-jdhpp" (OuterVolumeSpecName: "kube-api-access-jdhpp") pod "e851c5c3-8d3c-4a3d-ac8c-95d5eb7113fc" (UID: "e851c5c3-8d3c-4a3d-ac8c-95d5eb7113fc"). InnerVolumeSpecName "kube-api-access-jdhpp". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 28 20:58:59 crc kubenswrapper[4746]: I0128 20:58:59.666553 4746 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/c4a37860-0f42-4d7c-89e0-8505e8e49c59-kube-api-access-7sx7r" (OuterVolumeSpecName: "kube-api-access-7sx7r") pod "c4a37860-0f42-4d7c-89e0-8505e8e49c59" (UID: "c4a37860-0f42-4d7c-89e0-8505e8e49c59"). InnerVolumeSpecName "kube-api-access-7sx7r". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 28 20:58:59 crc kubenswrapper[4746]: I0128 20:58:59.666792 4746 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/2aed034c-35b5-4fd4-b0c4-cebbdfb41da2-kube-api-access-dbdpr" (OuterVolumeSpecName: "kube-api-access-dbdpr") pod "2aed034c-35b5-4fd4-b0c4-cebbdfb41da2" (UID: "2aed034c-35b5-4fd4-b0c4-cebbdfb41da2"). InnerVolumeSpecName "kube-api-access-dbdpr". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 28 20:58:59 crc kubenswrapper[4746]: I0128 20:58:59.667213 4746 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/eaf1cab3-21ae-4850-a732-d0e75f55ffc4-kube-api-access-l8w79" (OuterVolumeSpecName: "kube-api-access-l8w79") pod "eaf1cab3-21ae-4850-a732-d0e75f55ffc4" (UID: "eaf1cab3-21ae-4850-a732-d0e75f55ffc4"). InnerVolumeSpecName "kube-api-access-l8w79". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 28 20:58:59 crc kubenswrapper[4746]: I0128 20:58:59.761198 4746 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/eaf1cab3-21ae-4850-a732-d0e75f55ffc4-operator-scripts\") on node \"crc\" DevicePath \"\"" Jan 28 20:58:59 crc kubenswrapper[4746]: I0128 20:58:59.761241 4746 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-dbdpr\" (UniqueName: \"kubernetes.io/projected/2aed034c-35b5-4fd4-b0c4-cebbdfb41da2-kube-api-access-dbdpr\") on node \"crc\" DevicePath \"\"" Jan 28 20:58:59 crc kubenswrapper[4746]: I0128 20:58:59.761254 4746 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-hqh9z\" (UniqueName: \"kubernetes.io/projected/90854a3f-3d4b-4a6b-a269-d4b33cbf94c9-kube-api-access-hqh9z\") on node \"crc\" DevicePath \"\"" Jan 28 20:58:59 crc kubenswrapper[4746]: I0128 20:58:59.761284 4746 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-jl58t\" (UniqueName: 
\"kubernetes.io/projected/a7ca0a5b-c5ce-48d9-89bc-5f693869c1d5-kube-api-access-jl58t\") on node \"crc\" DevicePath \"\"" Jan 28 20:58:59 crc kubenswrapper[4746]: I0128 20:58:59.761294 4746 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/a7ca0a5b-c5ce-48d9-89bc-5f693869c1d5-operator-scripts\") on node \"crc\" DevicePath \"\"" Jan 28 20:58:59 crc kubenswrapper[4746]: I0128 20:58:59.761302 4746 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-jdhpp\" (UniqueName: \"kubernetes.io/projected/e851c5c3-8d3c-4a3d-ac8c-95d5eb7113fc-kube-api-access-jdhpp\") on node \"crc\" DevicePath \"\"" Jan 28 20:58:59 crc kubenswrapper[4746]: I0128 20:58:59.761316 4746 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/e851c5c3-8d3c-4a3d-ac8c-95d5eb7113fc-operator-scripts\") on node \"crc\" DevicePath \"\"" Jan 28 20:58:59 crc kubenswrapper[4746]: I0128 20:58:59.761325 4746 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-7sx7r\" (UniqueName: \"kubernetes.io/projected/c4a37860-0f42-4d7c-89e0-8505e8e49c59-kube-api-access-7sx7r\") on node \"crc\" DevicePath \"\"" Jan 28 20:58:59 crc kubenswrapper[4746]: I0128 20:58:59.761333 4746 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-l8w79\" (UniqueName: \"kubernetes.io/projected/eaf1cab3-21ae-4850-a732-d0e75f55ffc4-kube-api-access-l8w79\") on node \"crc\" DevicePath \"\"" Jan 28 20:58:59 crc kubenswrapper[4746]: I0128 20:58:59.761342 4746 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/2aed034c-35b5-4fd4-b0c4-cebbdfb41da2-operator-scripts\") on node \"crc\" DevicePath \"\"" Jan 28 20:58:59 crc kubenswrapper[4746]: I0128 20:58:59.761351 4746 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: 
\"kubernetes.io/configmap/c4a37860-0f42-4d7c-89e0-8505e8e49c59-operator-scripts\") on node \"crc\" DevicePath \"\"" Jan 28 20:58:59 crc kubenswrapper[4746]: I0128 20:58:59.761360 4746 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/90854a3f-3d4b-4a6b-a269-d4b33cbf94c9-operator-scripts\") on node \"crc\" DevicePath \"\"" Jan 28 20:59:00 crc kubenswrapper[4746]: I0128 20:59:00.288884 4746 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/root-account-create-update-9xsgn" event={"ID":"90854a3f-3d4b-4a6b-a269-d4b33cbf94c9","Type":"ContainerDied","Data":"d1ed7ff7cfd603e02a0adeb2f5ab7edae52e633ff5053605c37bbfcfec9a6916"} Jan 28 20:59:00 crc kubenswrapper[4746]: I0128 20:59:00.288919 4746 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/root-account-create-update-9xsgn" Jan 28 20:59:00 crc kubenswrapper[4746]: I0128 20:59:00.288929 4746 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="d1ed7ff7cfd603e02a0adeb2f5ab7edae52e633ff5053605c37bbfcfec9a6916" Jan 28 20:59:00 crc kubenswrapper[4746]: I0128 20:59:00.291355 4746 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-2635-account-create-update-hz28g" event={"ID":"e851c5c3-8d3c-4a3d-ac8c-95d5eb7113fc","Type":"ContainerDied","Data":"beb7a3385952ec7bacb02835a4fa41d4b54cf4dda046274d7627549eb92bc11b"} Jan 28 20:59:00 crc kubenswrapper[4746]: I0128 20:59:00.291388 4746 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="beb7a3385952ec7bacb02835a4fa41d4b54cf4dda046274d7627549eb92bc11b" Jan 28 20:59:00 crc kubenswrapper[4746]: I0128 20:59:00.291406 4746 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/cinder-2635-account-create-update-hz28g" Jan 28 20:59:00 crc kubenswrapper[4746]: I0128 20:59:00.293598 4746 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cloudkitty-6f70-account-create-update-zsths" event={"ID":"fb4a489a-d80d-49a4-9624-ecfe6a4200ca","Type":"ContainerDied","Data":"5c70ad139cf4e200678dce14e55d5e86bcd738be3708dfbc0ec0f8f471848a24"} Jan 28 20:59:00 crc kubenswrapper[4746]: I0128 20:59:00.293619 4746 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="5c70ad139cf4e200678dce14e55d5e86bcd738be3708dfbc0ec0f8f471848a24" Jan 28 20:59:00 crc kubenswrapper[4746]: I0128 20:59:00.293624 4746 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/cloudkitty-6f70-account-create-update-zsths" Jan 28 20:59:00 crc kubenswrapper[4746]: I0128 20:59:00.295436 4746 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-db-create-kp8xl" event={"ID":"db403f32-6720-4260-a66d-45e5d0e7b5c6","Type":"ContainerDied","Data":"af87909f074419f5a49706ff02eabf8a083f5ad2a5fbeb910a8082b93318a832"} Jan 28 20:59:00 crc kubenswrapper[4746]: I0128 20:59:00.295458 4746 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="af87909f074419f5a49706ff02eabf8a083f5ad2a5fbeb910a8082b93318a832" Jan 28 20:59:00 crc kubenswrapper[4746]: I0128 20:59:00.295459 4746 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/cinder-db-create-kp8xl" Jan 28 20:59:00 crc kubenswrapper[4746]: I0128 20:59:00.297660 4746 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cloudkitty-db-create-6pvd6" event={"ID":"2aed034c-35b5-4fd4-b0c4-cebbdfb41da2","Type":"ContainerDied","Data":"d471a2b18dd03278c2450d4227c778cf7b7906bceb6a8b8ea2b168e5987e8b0f"} Jan 28 20:59:00 crc kubenswrapper[4746]: I0128 20:59:00.297788 4746 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="d471a2b18dd03278c2450d4227c778cf7b7906bceb6a8b8ea2b168e5987e8b0f" Jan 28 20:59:00 crc kubenswrapper[4746]: I0128 20:59:00.297688 4746 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/cloudkitty-db-create-6pvd6" Jan 28 20:59:00 crc kubenswrapper[4746]: I0128 20:59:00.299155 4746 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-db-create-dlwzg" event={"ID":"c4a37860-0f42-4d7c-89e0-8505e8e49c59","Type":"ContainerDied","Data":"ad2f4a3c5954186ccb0ca39a6ce3172a0a1c5ef7ec6d594a1e9269865d91d60d"} Jan 28 20:59:00 crc kubenswrapper[4746]: I0128 20:59:00.299191 4746 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/neutron-db-create-dlwzg" Jan 28 20:59:00 crc kubenswrapper[4746]: I0128 20:59:00.299195 4746 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="ad2f4a3c5954186ccb0ca39a6ce3172a0a1c5ef7ec6d594a1e9269865d91d60d" Jan 28 20:59:00 crc kubenswrapper[4746]: I0128 20:59:00.302856 4746 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-4eb8-account-create-update-k449n" event={"ID":"eaf1cab3-21ae-4850-a732-d0e75f55ffc4","Type":"ContainerDied","Data":"6e3e149dc3582a03e82019199a0aca09e0e0edb92775c952b6d66f45ebbdcdc5"} Jan 28 20:59:00 crc kubenswrapper[4746]: I0128 20:59:00.302884 4746 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="6e3e149dc3582a03e82019199a0aca09e0e0edb92775c952b6d66f45ebbdcdc5" Jan 28 20:59:00 crc kubenswrapper[4746]: I0128 20:59:00.302922 4746 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/barbican-4eb8-account-create-update-k449n" Jan 28 20:59:00 crc kubenswrapper[4746]: I0128 20:59:00.309811 4746 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-564b-account-create-update-m6nh8" event={"ID":"da1723e7-7026-414e-b1e1-79911e331408","Type":"ContainerDied","Data":"d8ae3e54e8cbd8df2ee805212faea4258c0efc0f59edb18becf670d8a0186ce0"} Jan 28 20:59:00 crc kubenswrapper[4746]: I0128 20:59:00.309839 4746 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="d8ae3e54e8cbd8df2ee805212faea4258c0efc0f59edb18becf670d8a0186ce0" Jan 28 20:59:00 crc kubenswrapper[4746]: I0128 20:59:00.309859 4746 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/neutron-564b-account-create-update-m6nh8" Jan 28 20:59:00 crc kubenswrapper[4746]: I0128 20:59:00.311287 4746 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-db-create-27prr" event={"ID":"a7ca0a5b-c5ce-48d9-89bc-5f693869c1d5","Type":"ContainerDied","Data":"5b075e56aa69af819848f0626947761447a29601516062badfcaf022a77735c3"} Jan 28 20:59:00 crc kubenswrapper[4746]: I0128 20:59:00.311320 4746 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="5b075e56aa69af819848f0626947761447a29601516062badfcaf022a77735c3" Jan 28 20:59:00 crc kubenswrapper[4746]: I0128 20:59:00.311383 4746 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/barbican-db-create-27prr" Jan 28 20:59:01 crc kubenswrapper[4746]: I0128 20:59:01.908488 4746 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/root-account-create-update-9xsgn"] Jan 28 20:59:01 crc kubenswrapper[4746]: I0128 20:59:01.918382 4746 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/root-account-create-update-9xsgn"] Jan 28 20:59:02 crc kubenswrapper[4746]: I0128 20:59:02.862056 4746 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="90854a3f-3d4b-4a6b-a269-d4b33cbf94c9" path="/var/lib/kubelet/pods/90854a3f-3d4b-4a6b-a269-d4b33cbf94c9/volumes" Jan 28 20:59:04 crc kubenswrapper[4746]: I0128 20:59:04.810664 4746 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/prometheus-metric-storage-0" podUID="fde93743-7b9d-4175-abdf-bd74008cf4b0" containerName="prometheus" probeResult="failure" output="Get \"http://10.217.0.113:9090/-/ready\": context deadline exceeded (Client.Timeout exceeded while awaiting headers)" Jan 28 20:59:05 crc kubenswrapper[4746]: I0128 20:59:05.473929 4746 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/root-account-create-update-q885p"] Jan 28 20:59:05 crc kubenswrapper[4746]: E0128 20:59:05.474448 4746 
cpu_manager.go:410] "RemoveStaleState: removing container" podUID="db403f32-6720-4260-a66d-45e5d0e7b5c6" containerName="mariadb-database-create" Jan 28 20:59:05 crc kubenswrapper[4746]: I0128 20:59:05.474479 4746 state_mem.go:107] "Deleted CPUSet assignment" podUID="db403f32-6720-4260-a66d-45e5d0e7b5c6" containerName="mariadb-database-create" Jan 28 20:59:05 crc kubenswrapper[4746]: E0128 20:59:05.474506 4746 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="c4a37860-0f42-4d7c-89e0-8505e8e49c59" containerName="mariadb-database-create" Jan 28 20:59:05 crc kubenswrapper[4746]: I0128 20:59:05.474515 4746 state_mem.go:107] "Deleted CPUSet assignment" podUID="c4a37860-0f42-4d7c-89e0-8505e8e49c59" containerName="mariadb-database-create" Jan 28 20:59:05 crc kubenswrapper[4746]: E0128 20:59:05.474537 4746 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="eaf1cab3-21ae-4850-a732-d0e75f55ffc4" containerName="mariadb-account-create-update" Jan 28 20:59:05 crc kubenswrapper[4746]: I0128 20:59:05.474545 4746 state_mem.go:107] "Deleted CPUSet assignment" podUID="eaf1cab3-21ae-4850-a732-d0e75f55ffc4" containerName="mariadb-account-create-update" Jan 28 20:59:05 crc kubenswrapper[4746]: E0128 20:59:05.474563 4746 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="fb4a489a-d80d-49a4-9624-ecfe6a4200ca" containerName="mariadb-account-create-update" Jan 28 20:59:05 crc kubenswrapper[4746]: I0128 20:59:05.474571 4746 state_mem.go:107] "Deleted CPUSet assignment" podUID="fb4a489a-d80d-49a4-9624-ecfe6a4200ca" containerName="mariadb-account-create-update" Jan 28 20:59:05 crc kubenswrapper[4746]: E0128 20:59:05.474588 4746 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="da1723e7-7026-414e-b1e1-79911e331408" containerName="mariadb-account-create-update" Jan 28 20:59:05 crc kubenswrapper[4746]: I0128 20:59:05.474596 4746 state_mem.go:107] "Deleted CPUSet assignment" podUID="da1723e7-7026-414e-b1e1-79911e331408" 
containerName="mariadb-account-create-update" Jan 28 20:59:05 crc kubenswrapper[4746]: E0128 20:59:05.474609 4746 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="e851c5c3-8d3c-4a3d-ac8c-95d5eb7113fc" containerName="mariadb-account-create-update" Jan 28 20:59:05 crc kubenswrapper[4746]: I0128 20:59:05.474619 4746 state_mem.go:107] "Deleted CPUSet assignment" podUID="e851c5c3-8d3c-4a3d-ac8c-95d5eb7113fc" containerName="mariadb-account-create-update" Jan 28 20:59:05 crc kubenswrapper[4746]: E0128 20:59:05.474630 4746 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="2aed034c-35b5-4fd4-b0c4-cebbdfb41da2" containerName="mariadb-database-create" Jan 28 20:59:05 crc kubenswrapper[4746]: I0128 20:59:05.474637 4746 state_mem.go:107] "Deleted CPUSet assignment" podUID="2aed034c-35b5-4fd4-b0c4-cebbdfb41da2" containerName="mariadb-database-create" Jan 28 20:59:05 crc kubenswrapper[4746]: E0128 20:59:05.474653 4746 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="a7ca0a5b-c5ce-48d9-89bc-5f693869c1d5" containerName="mariadb-database-create" Jan 28 20:59:05 crc kubenswrapper[4746]: I0128 20:59:05.474660 4746 state_mem.go:107] "Deleted CPUSet assignment" podUID="a7ca0a5b-c5ce-48d9-89bc-5f693869c1d5" containerName="mariadb-database-create" Jan 28 20:59:05 crc kubenswrapper[4746]: E0128 20:59:05.474672 4746 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="90854a3f-3d4b-4a6b-a269-d4b33cbf94c9" containerName="mariadb-account-create-update" Jan 28 20:59:05 crc kubenswrapper[4746]: I0128 20:59:05.474680 4746 state_mem.go:107] "Deleted CPUSet assignment" podUID="90854a3f-3d4b-4a6b-a269-d4b33cbf94c9" containerName="mariadb-account-create-update" Jan 28 20:59:05 crc kubenswrapper[4746]: I0128 20:59:05.474897 4746 memory_manager.go:354] "RemoveStaleState removing state" podUID="fb4a489a-d80d-49a4-9624-ecfe6a4200ca" containerName="mariadb-account-create-update" Jan 28 20:59:05 crc kubenswrapper[4746]: I0128 
20:59:05.474915 4746 memory_manager.go:354] "RemoveStaleState removing state" podUID="90854a3f-3d4b-4a6b-a269-d4b33cbf94c9" containerName="mariadb-account-create-update" Jan 28 20:59:05 crc kubenswrapper[4746]: I0128 20:59:05.474928 4746 memory_manager.go:354] "RemoveStaleState removing state" podUID="da1723e7-7026-414e-b1e1-79911e331408" containerName="mariadb-account-create-update" Jan 28 20:59:05 crc kubenswrapper[4746]: I0128 20:59:05.474941 4746 memory_manager.go:354] "RemoveStaleState removing state" podUID="db403f32-6720-4260-a66d-45e5d0e7b5c6" containerName="mariadb-database-create" Jan 28 20:59:05 crc kubenswrapper[4746]: I0128 20:59:05.474953 4746 memory_manager.go:354] "RemoveStaleState removing state" podUID="2aed034c-35b5-4fd4-b0c4-cebbdfb41da2" containerName="mariadb-database-create" Jan 28 20:59:05 crc kubenswrapper[4746]: I0128 20:59:05.474968 4746 memory_manager.go:354] "RemoveStaleState removing state" podUID="a7ca0a5b-c5ce-48d9-89bc-5f693869c1d5" containerName="mariadb-database-create" Jan 28 20:59:05 crc kubenswrapper[4746]: I0128 20:59:05.474981 4746 memory_manager.go:354] "RemoveStaleState removing state" podUID="e851c5c3-8d3c-4a3d-ac8c-95d5eb7113fc" containerName="mariadb-account-create-update" Jan 28 20:59:05 crc kubenswrapper[4746]: I0128 20:59:05.474993 4746 memory_manager.go:354] "RemoveStaleState removing state" podUID="c4a37860-0f42-4d7c-89e0-8505e8e49c59" containerName="mariadb-database-create" Jan 28 20:59:05 crc kubenswrapper[4746]: I0128 20:59:05.475006 4746 memory_manager.go:354] "RemoveStaleState removing state" podUID="eaf1cab3-21ae-4850-a732-d0e75f55ffc4" containerName="mariadb-account-create-update" Jan 28 20:59:05 crc kubenswrapper[4746]: I0128 20:59:05.475778 4746 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/root-account-create-update-q885p" Jan 28 20:59:05 crc kubenswrapper[4746]: I0128 20:59:05.479999 4746 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-mariadb-root-db-secret" Jan 28 20:59:05 crc kubenswrapper[4746]: I0128 20:59:05.486875 4746 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/root-account-create-update-q885p"] Jan 28 20:59:05 crc kubenswrapper[4746]: I0128 20:59:05.614874 4746 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/0229a209-4581-4685-ae13-fb7e3be8e743-operator-scripts\") pod \"root-account-create-update-q885p\" (UID: \"0229a209-4581-4685-ae13-fb7e3be8e743\") " pod="openstack/root-account-create-update-q885p" Jan 28 20:59:05 crc kubenswrapper[4746]: I0128 20:59:05.615592 4746 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-h4xx5\" (UniqueName: \"kubernetes.io/projected/0229a209-4581-4685-ae13-fb7e3be8e743-kube-api-access-h4xx5\") pod \"root-account-create-update-q885p\" (UID: \"0229a209-4581-4685-ae13-fb7e3be8e743\") " pod="openstack/root-account-create-update-q885p" Jan 28 20:59:05 crc kubenswrapper[4746]: I0128 20:59:05.717045 4746 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/0229a209-4581-4685-ae13-fb7e3be8e743-operator-scripts\") pod \"root-account-create-update-q885p\" (UID: \"0229a209-4581-4685-ae13-fb7e3be8e743\") " pod="openstack/root-account-create-update-q885p" Jan 28 20:59:05 crc kubenswrapper[4746]: I0128 20:59:05.717208 4746 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-h4xx5\" (UniqueName: \"kubernetes.io/projected/0229a209-4581-4685-ae13-fb7e3be8e743-kube-api-access-h4xx5\") pod \"root-account-create-update-q885p\" (UID: 
\"0229a209-4581-4685-ae13-fb7e3be8e743\") " pod="openstack/root-account-create-update-q885p" Jan 28 20:59:05 crc kubenswrapper[4746]: I0128 20:59:05.717871 4746 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/0229a209-4581-4685-ae13-fb7e3be8e743-operator-scripts\") pod \"root-account-create-update-q885p\" (UID: \"0229a209-4581-4685-ae13-fb7e3be8e743\") " pod="openstack/root-account-create-update-q885p" Jan 28 20:59:05 crc kubenswrapper[4746]: I0128 20:59:05.747272 4746 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-h4xx5\" (UniqueName: \"kubernetes.io/projected/0229a209-4581-4685-ae13-fb7e3be8e743-kube-api-access-h4xx5\") pod \"root-account-create-update-q885p\" (UID: \"0229a209-4581-4685-ae13-fb7e3be8e743\") " pod="openstack/root-account-create-update-q885p" Jan 28 20:59:05 crc kubenswrapper[4746]: I0128 20:59:05.834571 4746 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/root-account-create-update-q885p" Jan 28 20:59:08 crc kubenswrapper[4746]: I0128 20:59:08.027793 4746 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/prometheus-metric-storage-0" Jan 28 20:59:08 crc kubenswrapper[4746]: I0128 20:59:08.163295 4746 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-out\" (UniqueName: \"kubernetes.io/empty-dir/fde93743-7b9d-4175-abdf-bd74008cf4b0-config-out\") pod \"fde93743-7b9d-4175-abdf-bd74008cf4b0\" (UID: \"fde93743-7b9d-4175-abdf-bd74008cf4b0\") " Jan 28 20:59:08 crc kubenswrapper[4746]: I0128 20:59:08.163354 4746 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/secret/fde93743-7b9d-4175-abdf-bd74008cf4b0-config\") pod \"fde93743-7b9d-4175-abdf-bd74008cf4b0\" (UID: \"fde93743-7b9d-4175-abdf-bd74008cf4b0\") " Jan 28 20:59:08 crc kubenswrapper[4746]: I0128 20:59:08.163414 4746 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"prometheus-metric-storage-rulefiles-0\" (UniqueName: \"kubernetes.io/configmap/fde93743-7b9d-4175-abdf-bd74008cf4b0-prometheus-metric-storage-rulefiles-0\") pod \"fde93743-7b9d-4175-abdf-bd74008cf4b0\" (UID: \"fde93743-7b9d-4175-abdf-bd74008cf4b0\") " Jan 28 20:59:08 crc kubenswrapper[4746]: I0128 20:59:08.163483 4746 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"web-config\" (UniqueName: \"kubernetes.io/secret/fde93743-7b9d-4175-abdf-bd74008cf4b0-web-config\") pod \"fde93743-7b9d-4175-abdf-bd74008cf4b0\" (UID: \"fde93743-7b9d-4175-abdf-bd74008cf4b0\") " Jan 28 20:59:08 crc kubenswrapper[4746]: I0128 20:59:08.163521 4746 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"tls-assets\" (UniqueName: \"kubernetes.io/projected/fde93743-7b9d-4175-abdf-bd74008cf4b0-tls-assets\") pod \"fde93743-7b9d-4175-abdf-bd74008cf4b0\" (UID: \"fde93743-7b9d-4175-abdf-bd74008cf4b0\") " Jan 28 20:59:08 crc kubenswrapper[4746]: I0128 20:59:08.163618 4746 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume 
\"kube-api-access-fd6fp\" (UniqueName: \"kubernetes.io/projected/fde93743-7b9d-4175-abdf-bd74008cf4b0-kube-api-access-fd6fp\") pod \"fde93743-7b9d-4175-abdf-bd74008cf4b0\" (UID: \"fde93743-7b9d-4175-abdf-bd74008cf4b0\") " Jan 28 20:59:08 crc kubenswrapper[4746]: I0128 20:59:08.163839 4746 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"prometheus-metric-storage-db\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-f32ea73f-dc0c-45fe-af2d-08fac84e44a0\") pod \"fde93743-7b9d-4175-abdf-bd74008cf4b0\" (UID: \"fde93743-7b9d-4175-abdf-bd74008cf4b0\") " Jan 28 20:59:08 crc kubenswrapper[4746]: I0128 20:59:08.163877 4746 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"prometheus-metric-storage-rulefiles-2\" (UniqueName: \"kubernetes.io/configmap/fde93743-7b9d-4175-abdf-bd74008cf4b0-prometheus-metric-storage-rulefiles-2\") pod \"fde93743-7b9d-4175-abdf-bd74008cf4b0\" (UID: \"fde93743-7b9d-4175-abdf-bd74008cf4b0\") " Jan 28 20:59:08 crc kubenswrapper[4746]: I0128 20:59:08.163941 4746 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"prometheus-metric-storage-rulefiles-1\" (UniqueName: \"kubernetes.io/configmap/fde93743-7b9d-4175-abdf-bd74008cf4b0-prometheus-metric-storage-rulefiles-1\") pod \"fde93743-7b9d-4175-abdf-bd74008cf4b0\" (UID: \"fde93743-7b9d-4175-abdf-bd74008cf4b0\") " Jan 28 20:59:08 crc kubenswrapper[4746]: I0128 20:59:08.163973 4746 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"thanos-prometheus-http-client-file\" (UniqueName: \"kubernetes.io/secret/fde93743-7b9d-4175-abdf-bd74008cf4b0-thanos-prometheus-http-client-file\") pod \"fde93743-7b9d-4175-abdf-bd74008cf4b0\" (UID: \"fde93743-7b9d-4175-abdf-bd74008cf4b0\") " Jan 28 20:59:08 crc kubenswrapper[4746]: I0128 20:59:08.164551 4746 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume 
"kubernetes.io/configmap/fde93743-7b9d-4175-abdf-bd74008cf4b0-prometheus-metric-storage-rulefiles-0" (OuterVolumeSpecName: "prometheus-metric-storage-rulefiles-0") pod "fde93743-7b9d-4175-abdf-bd74008cf4b0" (UID: "fde93743-7b9d-4175-abdf-bd74008cf4b0"). InnerVolumeSpecName "prometheus-metric-storage-rulefiles-0". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 28 20:59:08 crc kubenswrapper[4746]: I0128 20:59:08.164931 4746 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/fde93743-7b9d-4175-abdf-bd74008cf4b0-prometheus-metric-storage-rulefiles-2" (OuterVolumeSpecName: "prometheus-metric-storage-rulefiles-2") pod "fde93743-7b9d-4175-abdf-bd74008cf4b0" (UID: "fde93743-7b9d-4175-abdf-bd74008cf4b0"). InnerVolumeSpecName "prometheus-metric-storage-rulefiles-2". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 28 20:59:08 crc kubenswrapper[4746]: I0128 20:59:08.165271 4746 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/fde93743-7b9d-4175-abdf-bd74008cf4b0-prometheus-metric-storage-rulefiles-1" (OuterVolumeSpecName: "prometheus-metric-storage-rulefiles-1") pod "fde93743-7b9d-4175-abdf-bd74008cf4b0" (UID: "fde93743-7b9d-4175-abdf-bd74008cf4b0"). InnerVolumeSpecName "prometheus-metric-storage-rulefiles-1". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 28 20:59:08 crc kubenswrapper[4746]: I0128 20:59:08.169105 4746 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/fde93743-7b9d-4175-abdf-bd74008cf4b0-config-out" (OuterVolumeSpecName: "config-out") pod "fde93743-7b9d-4175-abdf-bd74008cf4b0" (UID: "fde93743-7b9d-4175-abdf-bd74008cf4b0"). InnerVolumeSpecName "config-out". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 28 20:59:08 crc kubenswrapper[4746]: I0128 20:59:08.169490 4746 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/fde93743-7b9d-4175-abdf-bd74008cf4b0-config" (OuterVolumeSpecName: "config") pod "fde93743-7b9d-4175-abdf-bd74008cf4b0" (UID: "fde93743-7b9d-4175-abdf-bd74008cf4b0"). InnerVolumeSpecName "config". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 28 20:59:08 crc kubenswrapper[4746]: I0128 20:59:08.171261 4746 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/fde93743-7b9d-4175-abdf-bd74008cf4b0-tls-assets" (OuterVolumeSpecName: "tls-assets") pod "fde93743-7b9d-4175-abdf-bd74008cf4b0" (UID: "fde93743-7b9d-4175-abdf-bd74008cf4b0"). InnerVolumeSpecName "tls-assets". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 28 20:59:08 crc kubenswrapper[4746]: I0128 20:59:08.171514 4746 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/fde93743-7b9d-4175-abdf-bd74008cf4b0-kube-api-access-fd6fp" (OuterVolumeSpecName: "kube-api-access-fd6fp") pod "fde93743-7b9d-4175-abdf-bd74008cf4b0" (UID: "fde93743-7b9d-4175-abdf-bd74008cf4b0"). InnerVolumeSpecName "kube-api-access-fd6fp". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 28 20:59:08 crc kubenswrapper[4746]: I0128 20:59:08.179142 4746 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/fde93743-7b9d-4175-abdf-bd74008cf4b0-thanos-prometheus-http-client-file" (OuterVolumeSpecName: "thanos-prometheus-http-client-file") pod "fde93743-7b9d-4175-abdf-bd74008cf4b0" (UID: "fde93743-7b9d-4175-abdf-bd74008cf4b0"). InnerVolumeSpecName "thanos-prometheus-http-client-file". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 28 20:59:08 crc kubenswrapper[4746]: I0128 20:59:08.206389 4746 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/fde93743-7b9d-4175-abdf-bd74008cf4b0-web-config" (OuterVolumeSpecName: "web-config") pod "fde93743-7b9d-4175-abdf-bd74008cf4b0" (UID: "fde93743-7b9d-4175-abdf-bd74008cf4b0"). InnerVolumeSpecName "web-config". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 28 20:59:08 crc kubenswrapper[4746]: I0128 20:59:08.206612 4746 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-f32ea73f-dc0c-45fe-af2d-08fac84e44a0" (OuterVolumeSpecName: "prometheus-metric-storage-db") pod "fde93743-7b9d-4175-abdf-bd74008cf4b0" (UID: "fde93743-7b9d-4175-abdf-bd74008cf4b0"). InnerVolumeSpecName "pvc-f32ea73f-dc0c-45fe-af2d-08fac84e44a0". PluginName "kubernetes.io/csi", VolumeGidValue "" Jan 28 20:59:08 crc kubenswrapper[4746]: I0128 20:59:08.267158 4746 reconciler_common.go:293] "Volume detached for volume \"prometheus-metric-storage-rulefiles-2\" (UniqueName: \"kubernetes.io/configmap/fde93743-7b9d-4175-abdf-bd74008cf4b0-prometheus-metric-storage-rulefiles-2\") on node \"crc\" DevicePath \"\"" Jan 28 20:59:08 crc kubenswrapper[4746]: I0128 20:59:08.267295 4746 reconciler_common.go:293] "Volume detached for volume \"prometheus-metric-storage-rulefiles-1\" (UniqueName: \"kubernetes.io/configmap/fde93743-7b9d-4175-abdf-bd74008cf4b0-prometheus-metric-storage-rulefiles-1\") on node \"crc\" DevicePath \"\"" Jan 28 20:59:08 crc kubenswrapper[4746]: I0128 20:59:08.267313 4746 reconciler_common.go:293] "Volume detached for volume \"thanos-prometheus-http-client-file\" (UniqueName: \"kubernetes.io/secret/fde93743-7b9d-4175-abdf-bd74008cf4b0-thanos-prometheus-http-client-file\") on node \"crc\" DevicePath \"\"" Jan 28 20:59:08 crc kubenswrapper[4746]: I0128 20:59:08.267326 4746 reconciler_common.go:293] 
"Volume detached for volume \"config-out\" (UniqueName: \"kubernetes.io/empty-dir/fde93743-7b9d-4175-abdf-bd74008cf4b0-config-out\") on node \"crc\" DevicePath \"\"" Jan 28 20:59:08 crc kubenswrapper[4746]: I0128 20:59:08.267574 4746 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/secret/fde93743-7b9d-4175-abdf-bd74008cf4b0-config\") on node \"crc\" DevicePath \"\"" Jan 28 20:59:08 crc kubenswrapper[4746]: I0128 20:59:08.267587 4746 reconciler_common.go:293] "Volume detached for volume \"prometheus-metric-storage-rulefiles-0\" (UniqueName: \"kubernetes.io/configmap/fde93743-7b9d-4175-abdf-bd74008cf4b0-prometheus-metric-storage-rulefiles-0\") on node \"crc\" DevicePath \"\"" Jan 28 20:59:08 crc kubenswrapper[4746]: I0128 20:59:08.267599 4746 reconciler_common.go:293] "Volume detached for volume \"web-config\" (UniqueName: \"kubernetes.io/secret/fde93743-7b9d-4175-abdf-bd74008cf4b0-web-config\") on node \"crc\" DevicePath \"\"" Jan 28 20:59:08 crc kubenswrapper[4746]: I0128 20:59:08.267611 4746 reconciler_common.go:293] "Volume detached for volume \"tls-assets\" (UniqueName: \"kubernetes.io/projected/fde93743-7b9d-4175-abdf-bd74008cf4b0-tls-assets\") on node \"crc\" DevicePath \"\"" Jan 28 20:59:08 crc kubenswrapper[4746]: I0128 20:59:08.267622 4746 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-fd6fp\" (UniqueName: \"kubernetes.io/projected/fde93743-7b9d-4175-abdf-bd74008cf4b0-kube-api-access-fd6fp\") on node \"crc\" DevicePath \"\"" Jan 28 20:59:08 crc kubenswrapper[4746]: I0128 20:59:08.267695 4746 reconciler_common.go:286] "operationExecutor.UnmountDevice started for volume \"pvc-f32ea73f-dc0c-45fe-af2d-08fac84e44a0\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-f32ea73f-dc0c-45fe-af2d-08fac84e44a0\") on node \"crc\" " Jan 28 20:59:08 crc kubenswrapper[4746]: I0128 20:59:08.296638 4746 csi_attacher.go:630] kubernetes.io/csi: attacher.UnmountDevice STAGE_UNSTAGE_VOLUME 
capability not set. Skipping UnmountDevice... Jan 28 20:59:08 crc kubenswrapper[4746]: I0128 20:59:08.296821 4746 operation_generator.go:917] UnmountDevice succeeded for volume "pvc-f32ea73f-dc0c-45fe-af2d-08fac84e44a0" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-f32ea73f-dc0c-45fe-af2d-08fac84e44a0") on node "crc" Jan 28 20:59:08 crc kubenswrapper[4746]: I0128 20:59:08.341663 4746 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/swift-storage-0"] Jan 28 20:59:08 crc kubenswrapper[4746]: I0128 20:59:08.369794 4746 reconciler_common.go:293] "Volume detached for volume \"pvc-f32ea73f-dc0c-45fe-af2d-08fac84e44a0\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-f32ea73f-dc0c-45fe-af2d-08fac84e44a0\") on node \"crc\" DevicePath \"\"" Jan 28 20:59:08 crc kubenswrapper[4746]: I0128 20:59:08.390941 4746 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/root-account-create-update-q885p"] Jan 28 20:59:08 crc kubenswrapper[4746]: W0128 20:59:08.393455 4746 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod0229a209_4581_4685_ae13_fb7e3be8e743.slice/crio-ee3d3ce9f76d8c7d2f45f698c036b1cd63a7749c429a508346870bde69a490e2 WatchSource:0}: Error finding container ee3d3ce9f76d8c7d2f45f698c036b1cd63a7749c429a508346870bde69a490e2: Status 404 returned error can't find the container with id ee3d3ce9f76d8c7d2f45f698c036b1cd63a7749c429a508346870bde69a490e2 Jan 28 20:59:08 crc kubenswrapper[4746]: I0128 20:59:08.396368 4746 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-db-sync-7mlnk" event={"ID":"1176d52c-0fec-4346-ad79-af25ac4c3f62","Type":"ContainerStarted","Data":"81caf5ae6d81fbfe614995cb0c9805b2ad35632a010c63d5790e8ab11eccc724"} Jan 28 20:59:08 crc kubenswrapper[4746]: I0128 20:59:08.409848 4746 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/prometheus-metric-storage-0" 
event={"ID":"fde93743-7b9d-4175-abdf-bd74008cf4b0","Type":"ContainerDied","Data":"40d7ba0b2886f49e10fa30076957a979cdf53e9e3cd86fc83316978ff59321c8"} Jan 28 20:59:08 crc kubenswrapper[4746]: I0128 20:59:08.409930 4746 scope.go:117] "RemoveContainer" containerID="c0ddb2ac8b981c68e3c5cdd2b45c613d4095039ce06d3da8fa9d8eabf8cbe483" Jan 28 20:59:08 crc kubenswrapper[4746]: I0128 20:59:08.410166 4746 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/prometheus-metric-storage-0" Jan 28 20:59:08 crc kubenswrapper[4746]: I0128 20:59:08.416915 4746 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-storage-0" event={"ID":"39e8de66-78c6-45cf-b026-7783ef89922d","Type":"ContainerStarted","Data":"b609c3a3a336b55b5e5daffaa4574768bb5918c3fde343a5c9b8e3591d51ac98"} Jan 28 20:59:08 crc kubenswrapper[4746]: I0128 20:59:08.418297 4746 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/keystone-db-sync-7mlnk" podStartSLOduration=2.619757204 podStartE2EDuration="18.418281478s" podCreationTimestamp="2026-01-28 20:58:50 +0000 UTC" firstStartedPulling="2026-01-28 20:58:52.044684145 +0000 UTC m=+1160.000870499" lastFinishedPulling="2026-01-28 20:59:07.843208399 +0000 UTC m=+1175.799394773" observedRunningTime="2026-01-28 20:59:08.415874283 +0000 UTC m=+1176.372060637" watchObservedRunningTime="2026-01-28 20:59:08.418281478 +0000 UTC m=+1176.374467832" Jan 28 20:59:08 crc kubenswrapper[4746]: I0128 20:59:08.433398 4746 scope.go:117] "RemoveContainer" containerID="87a1fc8f482054b4ebcdef39a9f24f66265346cafd84734616e8affa161376bd" Jan 28 20:59:08 crc kubenswrapper[4746]: I0128 20:59:08.453215 4746 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/prometheus-metric-storage-0"] Jan 28 20:59:08 crc kubenswrapper[4746]: I0128 20:59:08.462858 4746 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/prometheus-metric-storage-0"] Jan 28 20:59:08 crc kubenswrapper[4746]: I0128 
20:59:08.494037 4746 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/prometheus-metric-storage-0"] Jan 28 20:59:08 crc kubenswrapper[4746]: E0128 20:59:08.495256 4746 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="fde93743-7b9d-4175-abdf-bd74008cf4b0" containerName="prometheus" Jan 28 20:59:08 crc kubenswrapper[4746]: I0128 20:59:08.495276 4746 state_mem.go:107] "Deleted CPUSet assignment" podUID="fde93743-7b9d-4175-abdf-bd74008cf4b0" containerName="prometheus" Jan 28 20:59:08 crc kubenswrapper[4746]: E0128 20:59:08.495286 4746 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="fde93743-7b9d-4175-abdf-bd74008cf4b0" containerName="init-config-reloader" Jan 28 20:59:08 crc kubenswrapper[4746]: I0128 20:59:08.495293 4746 state_mem.go:107] "Deleted CPUSet assignment" podUID="fde93743-7b9d-4175-abdf-bd74008cf4b0" containerName="init-config-reloader" Jan 28 20:59:08 crc kubenswrapper[4746]: E0128 20:59:08.495312 4746 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="fde93743-7b9d-4175-abdf-bd74008cf4b0" containerName="thanos-sidecar" Jan 28 20:59:08 crc kubenswrapper[4746]: I0128 20:59:08.495319 4746 state_mem.go:107] "Deleted CPUSet assignment" podUID="fde93743-7b9d-4175-abdf-bd74008cf4b0" containerName="thanos-sidecar" Jan 28 20:59:08 crc kubenswrapper[4746]: E0128 20:59:08.495329 4746 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="fde93743-7b9d-4175-abdf-bd74008cf4b0" containerName="config-reloader" Jan 28 20:59:08 crc kubenswrapper[4746]: I0128 20:59:08.495335 4746 state_mem.go:107] "Deleted CPUSet assignment" podUID="fde93743-7b9d-4175-abdf-bd74008cf4b0" containerName="config-reloader" Jan 28 20:59:08 crc kubenswrapper[4746]: I0128 20:59:08.495489 4746 memory_manager.go:354] "RemoveStaleState removing state" podUID="fde93743-7b9d-4175-abdf-bd74008cf4b0" containerName="config-reloader" Jan 28 20:59:08 crc kubenswrapper[4746]: I0128 20:59:08.495501 4746 memory_manager.go:354] 
"RemoveStaleState removing state" podUID="fde93743-7b9d-4175-abdf-bd74008cf4b0" containerName="prometheus" Jan 28 20:59:08 crc kubenswrapper[4746]: I0128 20:59:08.495515 4746 memory_manager.go:354] "RemoveStaleState removing state" podUID="fde93743-7b9d-4175-abdf-bd74008cf4b0" containerName="thanos-sidecar" Jan 28 20:59:08 crc kubenswrapper[4746]: I0128 20:59:08.497597 4746 scope.go:117] "RemoveContainer" containerID="7f07d0762d2d78bb7a35ece131dabc38efd5554409eccb72f034b896277380c1" Jan 28 20:59:08 crc kubenswrapper[4746]: I0128 20:59:08.499327 4746 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/prometheus-metric-storage-0" Jan 28 20:59:08 crc kubenswrapper[4746]: I0128 20:59:08.502064 4746 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"prometheus-metric-storage-thanos-prometheus-http-client-file" Jan 28 20:59:08 crc kubenswrapper[4746]: I0128 20:59:08.502227 4746 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/prometheus-metric-storage-0"] Jan 28 20:59:08 crc kubenswrapper[4746]: I0128 20:59:08.505949 4746 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"prometheus-metric-storage" Jan 28 20:59:08 crc kubenswrapper[4746]: I0128 20:59:08.506215 4746 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"prometheus-metric-storage-rulefiles-1" Jan 28 20:59:08 crc kubenswrapper[4746]: I0128 20:59:08.506698 4746 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"prometheus-metric-storage-web-config" Jan 28 20:59:08 crc kubenswrapper[4746]: I0128 20:59:08.507181 4746 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"prometheus-metric-storage-rulefiles-2" Jan 28 20:59:08 crc kubenswrapper[4746]: I0128 20:59:08.507278 4746 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"prometheus-metric-storage-rulefiles-0" Jan 28 20:59:08 crc kubenswrapper[4746]: I0128 
20:59:08.507364 4746 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-metric-storage-prometheus-svc"
Jan 28 20:59:08 crc kubenswrapper[4746]: I0128 20:59:08.508351 4746 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"metric-storage-prometheus-dockercfg-9bwbj"
Jan 28 20:59:08 crc kubenswrapper[4746]: I0128 20:59:08.509073 4746 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"prometheus-metric-storage-tls-assets-0"
Jan 28 20:59:08 crc kubenswrapper[4746]: I0128 20:59:08.548058 4746 scope.go:117] "RemoveContainer" containerID="af4cdf1309c1a6bdd11844ef3a15794d9f902b1f3ec405bfe53f763a4322e93a"
Jan 28 20:59:08 crc kubenswrapper[4746]: I0128 20:59:08.677921 4746 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"web-config-tls-secret-key-cert-metric-storage-promethe-dc638c2d\" (UniqueName: \"kubernetes.io/secret/914308b3-0f5e-4716-bc87-948f8a8acfb3-web-config-tls-secret-key-cert-metric-storage-promethe-dc638c2d\") pod \"prometheus-metric-storage-0\" (UID: \"914308b3-0f5e-4716-bc87-948f8a8acfb3\") " pod="openstack/prometheus-metric-storage-0"
Jan 28 20:59:08 crc kubenswrapper[4746]: I0128 20:59:08.678021 4746 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"thanos-prometheus-http-client-file\" (UniqueName: \"kubernetes.io/secret/914308b3-0f5e-4716-bc87-948f8a8acfb3-thanos-prometheus-http-client-file\") pod \"prometheus-metric-storage-0\" (UID: \"914308b3-0f5e-4716-bc87-948f8a8acfb3\") " pod="openstack/prometheus-metric-storage-0"
Jan 28 20:59:08 crc kubenswrapper[4746]: I0128 20:59:08.678061 4746 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tls-assets\" (UniqueName: \"kubernetes.io/projected/914308b3-0f5e-4716-bc87-948f8a8acfb3-tls-assets\") pod \"prometheus-metric-storage-0\" (UID: \"914308b3-0f5e-4716-bc87-948f8a8acfb3\") " pod="openstack/prometheus-metric-storage-0"
Jan 28 20:59:08 crc kubenswrapper[4746]: I0128 20:59:08.678133 4746 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"web-config-tls-secret-cert-cert-metric-storage-prometh-dc638c2d\" (UniqueName: \"kubernetes.io/secret/914308b3-0f5e-4716-bc87-948f8a8acfb3-web-config-tls-secret-cert-cert-metric-storage-prometh-dc638c2d\") pod \"prometheus-metric-storage-0\" (UID: \"914308b3-0f5e-4716-bc87-948f8a8acfb3\") " pod="openstack/prometheus-metric-storage-0"
Jan 28 20:59:08 crc kubenswrapper[4746]: I0128 20:59:08.678162 4746 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"prometheus-metric-storage-rulefiles-0\" (UniqueName: \"kubernetes.io/configmap/914308b3-0f5e-4716-bc87-948f8a8acfb3-prometheus-metric-storage-rulefiles-0\") pod \"prometheus-metric-storage-0\" (UID: \"914308b3-0f5e-4716-bc87-948f8a8acfb3\") " pod="openstack/prometheus-metric-storage-0"
Jan 28 20:59:08 crc kubenswrapper[4746]: I0128 20:59:08.678219 4746 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"prometheus-metric-storage-rulefiles-2\" (UniqueName: \"kubernetes.io/configmap/914308b3-0f5e-4716-bc87-948f8a8acfb3-prometheus-metric-storage-rulefiles-2\") pod \"prometheus-metric-storage-0\" (UID: \"914308b3-0f5e-4716-bc87-948f8a8acfb3\") " pod="openstack/prometheus-metric-storage-0"
Jan 28 20:59:08 crc kubenswrapper[4746]: I0128 20:59:08.678312 4746 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-schm6\" (UniqueName: \"kubernetes.io/projected/914308b3-0f5e-4716-bc87-948f8a8acfb3-kube-api-access-schm6\") pod \"prometheus-metric-storage-0\" (UID: \"914308b3-0f5e-4716-bc87-948f8a8acfb3\") " pod="openstack/prometheus-metric-storage-0"
Jan 28 20:59:08 crc kubenswrapper[4746]: I0128 20:59:08.678357 4746 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-out\" (UniqueName: \"kubernetes.io/empty-dir/914308b3-0f5e-4716-bc87-948f8a8acfb3-config-out\") pod \"prometheus-metric-storage-0\" (UID: \"914308b3-0f5e-4716-bc87-948f8a8acfb3\") " pod="openstack/prometheus-metric-storage-0"
Jan 28 20:59:08 crc kubenswrapper[4746]: I0128 20:59:08.678388 4746 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"prometheus-metric-storage-rulefiles-1\" (UniqueName: \"kubernetes.io/configmap/914308b3-0f5e-4716-bc87-948f8a8acfb3-prometheus-metric-storage-rulefiles-1\") pod \"prometheus-metric-storage-0\" (UID: \"914308b3-0f5e-4716-bc87-948f8a8acfb3\") " pod="openstack/prometheus-metric-storage-0"
Jan 28 20:59:08 crc kubenswrapper[4746]: I0128 20:59:08.678536 4746 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"web-config\" (UniqueName: \"kubernetes.io/secret/914308b3-0f5e-4716-bc87-948f8a8acfb3-web-config\") pod \"prometheus-metric-storage-0\" (UID: \"914308b3-0f5e-4716-bc87-948f8a8acfb3\") " pod="openstack/prometheus-metric-storage-0"
Jan 28 20:59:08 crc kubenswrapper[4746]: I0128 20:59:08.678576 4746 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"pvc-f32ea73f-dc0c-45fe-af2d-08fac84e44a0\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-f32ea73f-dc0c-45fe-af2d-08fac84e44a0\") pod \"prometheus-metric-storage-0\" (UID: \"914308b3-0f5e-4716-bc87-948f8a8acfb3\") " pod="openstack/prometheus-metric-storage-0"
Jan 28 20:59:08 crc kubenswrapper[4746]: I0128 20:59:08.678607 4746 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/914308b3-0f5e-4716-bc87-948f8a8acfb3-secret-combined-ca-bundle\") pod \"prometheus-metric-storage-0\" (UID: \"914308b3-0f5e-4716-bc87-948f8a8acfb3\") " pod="openstack/prometheus-metric-storage-0"
Jan 28 20:59:08 crc kubenswrapper[4746]: I0128 20:59:08.678712 4746 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/secret/914308b3-0f5e-4716-bc87-948f8a8acfb3-config\") pod \"prometheus-metric-storage-0\" (UID: \"914308b3-0f5e-4716-bc87-948f8a8acfb3\") " pod="openstack/prometheus-metric-storage-0"
Jan 28 20:59:08 crc kubenswrapper[4746]: I0128 20:59:08.780564 4746 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-schm6\" (UniqueName: \"kubernetes.io/projected/914308b3-0f5e-4716-bc87-948f8a8acfb3-kube-api-access-schm6\") pod \"prometheus-metric-storage-0\" (UID: \"914308b3-0f5e-4716-bc87-948f8a8acfb3\") " pod="openstack/prometheus-metric-storage-0"
Jan 28 20:59:08 crc kubenswrapper[4746]: I0128 20:59:08.780628 4746 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-out\" (UniqueName: \"kubernetes.io/empty-dir/914308b3-0f5e-4716-bc87-948f8a8acfb3-config-out\") pod \"prometheus-metric-storage-0\" (UID: \"914308b3-0f5e-4716-bc87-948f8a8acfb3\") " pod="openstack/prometheus-metric-storage-0"
Jan 28 20:59:08 crc kubenswrapper[4746]: I0128 20:59:08.780673 4746 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"prometheus-metric-storage-rulefiles-1\" (UniqueName: \"kubernetes.io/configmap/914308b3-0f5e-4716-bc87-948f8a8acfb3-prometheus-metric-storage-rulefiles-1\") pod \"prometheus-metric-storage-0\" (UID: \"914308b3-0f5e-4716-bc87-948f8a8acfb3\") " pod="openstack/prometheus-metric-storage-0"
Jan 28 20:59:08 crc kubenswrapper[4746]: I0128 20:59:08.780715 4746 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-f32ea73f-dc0c-45fe-af2d-08fac84e44a0\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-f32ea73f-dc0c-45fe-af2d-08fac84e44a0\") pod \"prometheus-metric-storage-0\" (UID: \"914308b3-0f5e-4716-bc87-948f8a8acfb3\") " pod="openstack/prometheus-metric-storage-0"
Jan 28 20:59:08 crc kubenswrapper[4746]: I0128 20:59:08.780736 4746 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"web-config\" (UniqueName: \"kubernetes.io/secret/914308b3-0f5e-4716-bc87-948f8a8acfb3-web-config\") pod \"prometheus-metric-storage-0\" (UID: \"914308b3-0f5e-4716-bc87-948f8a8acfb3\") " pod="openstack/prometheus-metric-storage-0"
Jan 28 20:59:08 crc kubenswrapper[4746]: I0128 20:59:08.780784 4746 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"secret-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/914308b3-0f5e-4716-bc87-948f8a8acfb3-secret-combined-ca-bundle\") pod \"prometheus-metric-storage-0\" (UID: \"914308b3-0f5e-4716-bc87-948f8a8acfb3\") " pod="openstack/prometheus-metric-storage-0"
Jan 28 20:59:08 crc kubenswrapper[4746]: I0128 20:59:08.780828 4746 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/secret/914308b3-0f5e-4716-bc87-948f8a8acfb3-config\") pod \"prometheus-metric-storage-0\" (UID: \"914308b3-0f5e-4716-bc87-948f8a8acfb3\") " pod="openstack/prometheus-metric-storage-0"
Jan 28 20:59:08 crc kubenswrapper[4746]: I0128 20:59:08.780889 4746 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"web-config-tls-secret-key-cert-metric-storage-promethe-dc638c2d\" (UniqueName: \"kubernetes.io/secret/914308b3-0f5e-4716-bc87-948f8a8acfb3-web-config-tls-secret-key-cert-metric-storage-promethe-dc638c2d\") pod \"prometheus-metric-storage-0\" (UID: \"914308b3-0f5e-4716-bc87-948f8a8acfb3\") " pod="openstack/prometheus-metric-storage-0"
Jan 28 20:59:08 crc kubenswrapper[4746]: I0128 20:59:08.780929 4746 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"thanos-prometheus-http-client-file\" (UniqueName: \"kubernetes.io/secret/914308b3-0f5e-4716-bc87-948f8a8acfb3-thanos-prometheus-http-client-file\") pod \"prometheus-metric-storage-0\" (UID: \"914308b3-0f5e-4716-bc87-948f8a8acfb3\") " pod="openstack/prometheus-metric-storage-0"
Jan 28 20:59:08 crc kubenswrapper[4746]: I0128 20:59:08.780956 4746 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"tls-assets\" (UniqueName: \"kubernetes.io/projected/914308b3-0f5e-4716-bc87-948f8a8acfb3-tls-assets\") pod \"prometheus-metric-storage-0\" (UID: \"914308b3-0f5e-4716-bc87-948f8a8acfb3\") " pod="openstack/prometheus-metric-storage-0"
Jan 28 20:59:08 crc kubenswrapper[4746]: I0128 20:59:08.780998 4746 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"web-config-tls-secret-cert-cert-metric-storage-prometh-dc638c2d\" (UniqueName: \"kubernetes.io/secret/914308b3-0f5e-4716-bc87-948f8a8acfb3-web-config-tls-secret-cert-cert-metric-storage-prometh-dc638c2d\") pod \"prometheus-metric-storage-0\" (UID: \"914308b3-0f5e-4716-bc87-948f8a8acfb3\") " pod="openstack/prometheus-metric-storage-0"
Jan 28 20:59:08 crc kubenswrapper[4746]: I0128 20:59:08.781025 4746 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"prometheus-metric-storage-rulefiles-0\" (UniqueName: \"kubernetes.io/configmap/914308b3-0f5e-4716-bc87-948f8a8acfb3-prometheus-metric-storage-rulefiles-0\") pod \"prometheus-metric-storage-0\" (UID: \"914308b3-0f5e-4716-bc87-948f8a8acfb3\") " pod="openstack/prometheus-metric-storage-0"
Jan 28 20:59:08 crc kubenswrapper[4746]: I0128 20:59:08.781058 4746 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"prometheus-metric-storage-rulefiles-2\" (UniqueName: \"kubernetes.io/configmap/914308b3-0f5e-4716-bc87-948f8a8acfb3-prometheus-metric-storage-rulefiles-2\") pod \"prometheus-metric-storage-0\" (UID: \"914308b3-0f5e-4716-bc87-948f8a8acfb3\") " pod="openstack/prometheus-metric-storage-0"
Jan 28 20:59:08 crc kubenswrapper[4746]: I0128 20:59:08.781657 4746 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"prometheus-metric-storage-rulefiles-1\" (UniqueName: \"kubernetes.io/configmap/914308b3-0f5e-4716-bc87-948f8a8acfb3-prometheus-metric-storage-rulefiles-1\") pod \"prometheus-metric-storage-0\" (UID: \"914308b3-0f5e-4716-bc87-948f8a8acfb3\") " pod="openstack/prometheus-metric-storage-0"
Jan 28 20:59:08 crc kubenswrapper[4746]: I0128 20:59:08.781695 4746 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"prometheus-metric-storage-rulefiles-2\" (UniqueName: \"kubernetes.io/configmap/914308b3-0f5e-4716-bc87-948f8a8acfb3-prometheus-metric-storage-rulefiles-2\") pod \"prometheus-metric-storage-0\" (UID: \"914308b3-0f5e-4716-bc87-948f8a8acfb3\") " pod="openstack/prometheus-metric-storage-0"
Jan 28 20:59:08 crc kubenswrapper[4746]: I0128 20:59:08.782375 4746 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"prometheus-metric-storage-rulefiles-0\" (UniqueName: \"kubernetes.io/configmap/914308b3-0f5e-4716-bc87-948f8a8acfb3-prometheus-metric-storage-rulefiles-0\") pod \"prometheus-metric-storage-0\" (UID: \"914308b3-0f5e-4716-bc87-948f8a8acfb3\") " pod="openstack/prometheus-metric-storage-0"
Jan 28 20:59:08 crc kubenswrapper[4746]: I0128 20:59:08.784406 4746 csi_attacher.go:380] kubernetes.io/csi: attacher.MountDevice STAGE_UNSTAGE_VOLUME capability not set. Skipping MountDevice...
Jan 28 20:59:08 crc kubenswrapper[4746]: I0128 20:59:08.784465 4746 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"pvc-f32ea73f-dc0c-45fe-af2d-08fac84e44a0\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-f32ea73f-dc0c-45fe-af2d-08fac84e44a0\") pod \"prometheus-metric-storage-0\" (UID: \"914308b3-0f5e-4716-bc87-948f8a8acfb3\") device mount path \"/var/lib/kubelet/plugins/kubernetes.io/csi/kubevirt.io.hostpath-provisioner/02b2347eca05efae332ac6f226bae6da2144dba7ab9a77b7543473de2684cdce/globalmount\"" pod="openstack/prometheus-metric-storage-0"
Jan 28 20:59:08 crc kubenswrapper[4746]: I0128 20:59:08.785903 4746 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"thanos-prometheus-http-client-file\" (UniqueName: \"kubernetes.io/secret/914308b3-0f5e-4716-bc87-948f8a8acfb3-thanos-prometheus-http-client-file\") pod \"prometheus-metric-storage-0\" (UID: \"914308b3-0f5e-4716-bc87-948f8a8acfb3\") " pod="openstack/prometheus-metric-storage-0"
Jan 28 20:59:08 crc kubenswrapper[4746]: I0128 20:59:08.788729 4746 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"web-config-tls-secret-key-cert-metric-storage-promethe-dc638c2d\" (UniqueName: \"kubernetes.io/secret/914308b3-0f5e-4716-bc87-948f8a8acfb3-web-config-tls-secret-key-cert-metric-storage-promethe-dc638c2d\") pod \"prometheus-metric-storage-0\" (UID: \"914308b3-0f5e-4716-bc87-948f8a8acfb3\") " pod="openstack/prometheus-metric-storage-0"
Jan 28 20:59:08 crc kubenswrapper[4746]: I0128 20:59:08.789046 4746 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"web-config-tls-secret-cert-cert-metric-storage-prometh-dc638c2d\" (UniqueName: \"kubernetes.io/secret/914308b3-0f5e-4716-bc87-948f8a8acfb3-web-config-tls-secret-cert-cert-metric-storage-prometh-dc638c2d\") pod \"prometheus-metric-storage-0\" (UID: \"914308b3-0f5e-4716-bc87-948f8a8acfb3\") " pod="openstack/prometheus-metric-storage-0"
Jan 28 20:59:08 crc kubenswrapper[4746]: I0128 20:59:08.789832 4746 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"secret-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/914308b3-0f5e-4716-bc87-948f8a8acfb3-secret-combined-ca-bundle\") pod \"prometheus-metric-storage-0\" (UID: \"914308b3-0f5e-4716-bc87-948f8a8acfb3\") " pod="openstack/prometheus-metric-storage-0"
Jan 28 20:59:08 crc kubenswrapper[4746]: I0128 20:59:08.790954 4746 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"web-config\" (UniqueName: \"kubernetes.io/secret/914308b3-0f5e-4716-bc87-948f8a8acfb3-web-config\") pod \"prometheus-metric-storage-0\" (UID: \"914308b3-0f5e-4716-bc87-948f8a8acfb3\") " pod="openstack/prometheus-metric-storage-0"
Jan 28 20:59:08 crc kubenswrapper[4746]: I0128 20:59:08.791128 4746 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"tls-assets\" (UniqueName: \"kubernetes.io/projected/914308b3-0f5e-4716-bc87-948f8a8acfb3-tls-assets\") pod \"prometheus-metric-storage-0\" (UID: \"914308b3-0f5e-4716-bc87-948f8a8acfb3\") " pod="openstack/prometheus-metric-storage-0"
Jan 28 20:59:08 crc kubenswrapper[4746]: I0128 20:59:08.792407 4746 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/secret/914308b3-0f5e-4716-bc87-948f8a8acfb3-config\") pod \"prometheus-metric-storage-0\" (UID: \"914308b3-0f5e-4716-bc87-948f8a8acfb3\") " pod="openstack/prometheus-metric-storage-0"
Jan 28 20:59:08 crc kubenswrapper[4746]: I0128 20:59:08.794520 4746 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-out\" (UniqueName: \"kubernetes.io/empty-dir/914308b3-0f5e-4716-bc87-948f8a8acfb3-config-out\") pod \"prometheus-metric-storage-0\" (UID: \"914308b3-0f5e-4716-bc87-948f8a8acfb3\") " pod="openstack/prometheus-metric-storage-0"
Jan 28 20:59:08 crc kubenswrapper[4746]: I0128 20:59:08.800927 4746 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-schm6\" (UniqueName: \"kubernetes.io/projected/914308b3-0f5e-4716-bc87-948f8a8acfb3-kube-api-access-schm6\") pod \"prometheus-metric-storage-0\" (UID: \"914308b3-0f5e-4716-bc87-948f8a8acfb3\") " pod="openstack/prometheus-metric-storage-0"
Jan 28 20:59:08 crc kubenswrapper[4746]: I0128 20:59:08.819353 4746 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"pvc-f32ea73f-dc0c-45fe-af2d-08fac84e44a0\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-f32ea73f-dc0c-45fe-af2d-08fac84e44a0\") pod \"prometheus-metric-storage-0\" (UID: \"914308b3-0f5e-4716-bc87-948f8a8acfb3\") " pod="openstack/prometheus-metric-storage-0"
Jan 28 20:59:08 crc kubenswrapper[4746]: I0128 20:59:08.839395 4746 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/prometheus-metric-storage-0"
Jan 28 20:59:08 crc kubenswrapper[4746]: I0128 20:59:08.846027 4746 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="fde93743-7b9d-4175-abdf-bd74008cf4b0" path="/var/lib/kubelet/pods/fde93743-7b9d-4175-abdf-bd74008cf4b0/volumes"
Jan 28 20:59:09 crc kubenswrapper[4746]: I0128 20:59:09.236405 4746 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/cloudkitty-lokistack-ingester-0"
Jan 28 20:59:09 crc kubenswrapper[4746]: I0128 20:59:09.310471 4746 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/prometheus-metric-storage-0"]
Jan 28 20:59:09 crc kubenswrapper[4746]: I0128 20:59:09.443641 4746 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-db-sync-dsp4s" event={"ID":"faabc487-475c-4f5b-b135-5a96d1ed9269","Type":"ContainerStarted","Data":"adf6407c69131aebac7a4d54d91e59942a3a262af5a8ce578e90d9e795210918"}
Jan 28 20:59:09 crc kubenswrapper[4746]: I0128 20:59:09.456318 4746 generic.go:334] "Generic (PLEG): container finished" podID="0229a209-4581-4685-ae13-fb7e3be8e743" containerID="745dea18a1ae2a9ea69ab20defd44eb5a20396c2fb60159aa14e6d18d0ea07b8" exitCode=0
Jan 28 20:59:09 crc kubenswrapper[4746]: I0128 20:59:09.456396 4746 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/root-account-create-update-q885p" event={"ID":"0229a209-4581-4685-ae13-fb7e3be8e743","Type":"ContainerDied","Data":"745dea18a1ae2a9ea69ab20defd44eb5a20396c2fb60159aa14e6d18d0ea07b8"}
Jan 28 20:59:09 crc kubenswrapper[4746]: I0128 20:59:09.456446 4746 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/root-account-create-update-q885p" event={"ID":"0229a209-4581-4685-ae13-fb7e3be8e743","Type":"ContainerStarted","Data":"ee3d3ce9f76d8c7d2f45f698c036b1cd63a7749c429a508346870bde69a490e2"}
Jan 28 20:59:09 crc kubenswrapper[4746]: I0128 20:59:09.481007 4746 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/glance-db-sync-dsp4s" podStartSLOduration=3.46307058 podStartE2EDuration="21.480973403s" podCreationTimestamp="2026-01-28 20:58:48 +0000 UTC" firstStartedPulling="2026-01-28 20:58:49.867983234 +0000 UTC m=+1157.824169588" lastFinishedPulling="2026-01-28 20:59:07.885886057 +0000 UTC m=+1175.842072411" observedRunningTime="2026-01-28 20:59:09.463470392 +0000 UTC m=+1177.419656746" watchObservedRunningTime="2026-01-28 20:59:09.480973403 +0000 UTC m=+1177.437159757"
Jan 28 20:59:09 crc kubenswrapper[4746]: W0128 20:59:09.643551 4746 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod914308b3_0f5e_4716_bc87_948f8a8acfb3.slice/crio-0cc7cb7f178db4cd198234961b55b96b5e70452d5182b281dddda4f6a7107ebe WatchSource:0}: Error finding container 0cc7cb7f178db4cd198234961b55b96b5e70452d5182b281dddda4f6a7107ebe: Status 404 returned error can't find the container with id 0cc7cb7f178db4cd198234961b55b96b5e70452d5182b281dddda4f6a7107ebe
Jan 28 20:59:09 crc kubenswrapper[4746]: I0128 20:59:09.810741 4746 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/prometheus-metric-storage-0" podUID="fde93743-7b9d-4175-abdf-bd74008cf4b0" containerName="prometheus" probeResult="failure" output="Get \"http://10.217.0.113:9090/-/ready\": context deadline exceeded (Client.Timeout exceeded while awaiting headers)"
Jan 28 20:59:10 crc kubenswrapper[4746]: I0128 20:59:10.467049 4746 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/prometheus-metric-storage-0" event={"ID":"914308b3-0f5e-4716-bc87-948f8a8acfb3","Type":"ContainerStarted","Data":"0cc7cb7f178db4cd198234961b55b96b5e70452d5182b281dddda4f6a7107ebe"}
Jan 28 20:59:10 crc kubenswrapper[4746]: I0128 20:59:10.471320 4746 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-storage-0" event={"ID":"39e8de66-78c6-45cf-b026-7783ef89922d","Type":"ContainerStarted","Data":"3be260cc22f991c83df36f686cb5b98f083109707f5f44ae73c065ebae1feb5b"}
Jan 28 20:59:10 crc kubenswrapper[4746]: I0128 20:59:10.471385 4746 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-storage-0" event={"ID":"39e8de66-78c6-45cf-b026-7783ef89922d","Type":"ContainerStarted","Data":"a4d51545ace7abc32b761224ee85d3de52e46dc48809e3b235ddb10e325b54e6"}
Jan 28 20:59:10 crc kubenswrapper[4746]: I0128 20:59:10.471408 4746 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-storage-0" event={"ID":"39e8de66-78c6-45cf-b026-7783ef89922d","Type":"ContainerStarted","Data":"b284d4b5a06e9e615f792b0c2c980eb71dca92102e2cba37022066763b9769d3"}
Jan 28 20:59:10 crc kubenswrapper[4746]: I0128 20:59:10.742915 4746 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/root-account-create-update-q885p"
Jan 28 20:59:10 crc kubenswrapper[4746]: I0128 20:59:10.833919 4746 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-h4xx5\" (UniqueName: \"kubernetes.io/projected/0229a209-4581-4685-ae13-fb7e3be8e743-kube-api-access-h4xx5\") pod \"0229a209-4581-4685-ae13-fb7e3be8e743\" (UID: \"0229a209-4581-4685-ae13-fb7e3be8e743\") "
Jan 28 20:59:10 crc kubenswrapper[4746]: I0128 20:59:10.834122 4746 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/0229a209-4581-4685-ae13-fb7e3be8e743-operator-scripts\") pod \"0229a209-4581-4685-ae13-fb7e3be8e743\" (UID: \"0229a209-4581-4685-ae13-fb7e3be8e743\") "
Jan 28 20:59:10 crc kubenswrapper[4746]: I0128 20:59:10.834611 4746 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/0229a209-4581-4685-ae13-fb7e3be8e743-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "0229a209-4581-4685-ae13-fb7e3be8e743" (UID: "0229a209-4581-4685-ae13-fb7e3be8e743"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Jan 28 20:59:10 crc kubenswrapper[4746]: I0128 20:59:10.837806 4746 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/0229a209-4581-4685-ae13-fb7e3be8e743-kube-api-access-h4xx5" (OuterVolumeSpecName: "kube-api-access-h4xx5") pod "0229a209-4581-4685-ae13-fb7e3be8e743" (UID: "0229a209-4581-4685-ae13-fb7e3be8e743"). InnerVolumeSpecName "kube-api-access-h4xx5". PluginName "kubernetes.io/projected", VolumeGidValue ""
Jan 28 20:59:10 crc kubenswrapper[4746]: I0128 20:59:10.935650 4746 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-h4xx5\" (UniqueName: \"kubernetes.io/projected/0229a209-4581-4685-ae13-fb7e3be8e743-kube-api-access-h4xx5\") on node \"crc\" DevicePath \"\""
Jan 28 20:59:10 crc kubenswrapper[4746]: I0128 20:59:10.935878 4746 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/0229a209-4581-4685-ae13-fb7e3be8e743-operator-scripts\") on node \"crc\" DevicePath \"\""
Jan 28 20:59:11 crc kubenswrapper[4746]: I0128 20:59:11.483332 4746 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/root-account-create-update-q885p" event={"ID":"0229a209-4581-4685-ae13-fb7e3be8e743","Type":"ContainerDied","Data":"ee3d3ce9f76d8c7d2f45f698c036b1cd63a7749c429a508346870bde69a490e2"}
Jan 28 20:59:11 crc kubenswrapper[4746]: I0128 20:59:11.483381 4746 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="ee3d3ce9f76d8c7d2f45f698c036b1cd63a7749c429a508346870bde69a490e2"
Jan 28 20:59:11 crc kubenswrapper[4746]: I0128 20:59:11.483443 4746 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/root-account-create-update-q885p"
Jan 28 20:59:11 crc kubenswrapper[4746]: I0128 20:59:11.491202 4746 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-storage-0" event={"ID":"39e8de66-78c6-45cf-b026-7783ef89922d","Type":"ContainerStarted","Data":"616dbcbba4fffe40e2de50996a92017ee7c6230eacade3737b87571d8255a9ad"}
Jan 28 20:59:11 crc kubenswrapper[4746]: I0128 20:59:11.955623 4746 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/root-account-create-update-q885p"]
Jan 28 20:59:11 crc kubenswrapper[4746]: I0128 20:59:11.964290 4746 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/root-account-create-update-q885p"]
Jan 28 20:59:12 crc kubenswrapper[4746]: I0128 20:59:12.506277 4746 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-storage-0" event={"ID":"39e8de66-78c6-45cf-b026-7783ef89922d","Type":"ContainerStarted","Data":"081079233e9e46af835e9cb18a7d81078df1f765a7938c7b8c0e8dc7edff4250"}
Jan 28 20:59:12 crc kubenswrapper[4746]: I0128 20:59:12.506672 4746 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-storage-0" event={"ID":"39e8de66-78c6-45cf-b026-7783ef89922d","Type":"ContainerStarted","Data":"fea7e811e740471873a39b1e0daecdb6dd8b63e1a5d98fd49a008f44993c18f7"}
Jan 28 20:59:12 crc kubenswrapper[4746]: I0128 20:59:12.858705 4746 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="0229a209-4581-4685-ae13-fb7e3be8e743" path="/var/lib/kubelet/pods/0229a209-4581-4685-ae13-fb7e3be8e743/volumes"
Jan 28 20:59:13 crc kubenswrapper[4746]: I0128 20:59:13.520026 4746 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/prometheus-metric-storage-0" event={"ID":"914308b3-0f5e-4716-bc87-948f8a8acfb3","Type":"ContainerStarted","Data":"d4996791d963ecd750f539082e489ee83a1e085a98ce40656fb05ae4b8c66f2a"}
Jan 28 20:59:13 crc kubenswrapper[4746]: I0128 20:59:13.524715 4746 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-storage-0" event={"ID":"39e8de66-78c6-45cf-b026-7783ef89922d","Type":"ContainerStarted","Data":"646ccedc31e23404c16fc747f5b4374f7b874d635c4905eff2834f6ca6dcfca1"}
Jan 28 20:59:13 crc kubenswrapper[4746]: I0128 20:59:13.524751 4746 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-storage-0" event={"ID":"39e8de66-78c6-45cf-b026-7783ef89922d","Type":"ContainerStarted","Data":"a8909d9c059f207702b843f95b30a3fe45539bc4c4e2255dde1449a227fae808"}
Jan 28 20:59:13 crc kubenswrapper[4746]: I0128 20:59:13.526219 4746 generic.go:334] "Generic (PLEG): container finished" podID="1176d52c-0fec-4346-ad79-af25ac4c3f62" containerID="81caf5ae6d81fbfe614995cb0c9805b2ad35632a010c63d5790e8ab11eccc724" exitCode=0
Jan 28 20:59:13 crc kubenswrapper[4746]: I0128 20:59:13.526279 4746 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-db-sync-7mlnk" event={"ID":"1176d52c-0fec-4346-ad79-af25ac4c3f62","Type":"ContainerDied","Data":"81caf5ae6d81fbfe614995cb0c9805b2ad35632a010c63d5790e8ab11eccc724"}
Jan 28 20:59:14 crc kubenswrapper[4746]: I0128 20:59:14.541151 4746 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-storage-0" event={"ID":"39e8de66-78c6-45cf-b026-7783ef89922d","Type":"ContainerStarted","Data":"b510d048081b43bfa1fc0c70fcbd5935979926b05f3fc75013c66772c8e20c8b"}
Jan 28 20:59:14 crc kubenswrapper[4746]: I0128 20:59:14.825252 4746 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-db-sync-7mlnk"
Jan 28 20:59:14 crc kubenswrapper[4746]: I0128 20:59:14.907569 4746 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/1176d52c-0fec-4346-ad79-af25ac4c3f62-combined-ca-bundle\") pod \"1176d52c-0fec-4346-ad79-af25ac4c3f62\" (UID: \"1176d52c-0fec-4346-ad79-af25ac4c3f62\") "
Jan 28 20:59:14 crc kubenswrapper[4746]: I0128 20:59:14.907630 4746 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/1176d52c-0fec-4346-ad79-af25ac4c3f62-config-data\") pod \"1176d52c-0fec-4346-ad79-af25ac4c3f62\" (UID: \"1176d52c-0fec-4346-ad79-af25ac4c3f62\") "
Jan 28 20:59:14 crc kubenswrapper[4746]: I0128 20:59:14.907702 4746 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-k5vb7\" (UniqueName: \"kubernetes.io/projected/1176d52c-0fec-4346-ad79-af25ac4c3f62-kube-api-access-k5vb7\") pod \"1176d52c-0fec-4346-ad79-af25ac4c3f62\" (UID: \"1176d52c-0fec-4346-ad79-af25ac4c3f62\") "
Jan 28 20:59:14 crc kubenswrapper[4746]: I0128 20:59:14.912550 4746 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/1176d52c-0fec-4346-ad79-af25ac4c3f62-kube-api-access-k5vb7" (OuterVolumeSpecName: "kube-api-access-k5vb7") pod "1176d52c-0fec-4346-ad79-af25ac4c3f62" (UID: "1176d52c-0fec-4346-ad79-af25ac4c3f62"). InnerVolumeSpecName "kube-api-access-k5vb7". PluginName "kubernetes.io/projected", VolumeGidValue ""
Jan 28 20:59:14 crc kubenswrapper[4746]: I0128 20:59:14.951118 4746 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/1176d52c-0fec-4346-ad79-af25ac4c3f62-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "1176d52c-0fec-4346-ad79-af25ac4c3f62" (UID: "1176d52c-0fec-4346-ad79-af25ac4c3f62"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue ""
Jan 28 20:59:14 crc kubenswrapper[4746]: I0128 20:59:14.978042 4746 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/1176d52c-0fec-4346-ad79-af25ac4c3f62-config-data" (OuterVolumeSpecName: "config-data") pod "1176d52c-0fec-4346-ad79-af25ac4c3f62" (UID: "1176d52c-0fec-4346-ad79-af25ac4c3f62"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue ""
Jan 28 20:59:15 crc kubenswrapper[4746]: I0128 20:59:15.011219 4746 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/1176d52c-0fec-4346-ad79-af25ac4c3f62-combined-ca-bundle\") on node \"crc\" DevicePath \"\""
Jan 28 20:59:15 crc kubenswrapper[4746]: I0128 20:59:15.011621 4746 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/1176d52c-0fec-4346-ad79-af25ac4c3f62-config-data\") on node \"crc\" DevicePath \"\""
Jan 28 20:59:15 crc kubenswrapper[4746]: I0128 20:59:15.011637 4746 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-k5vb7\" (UniqueName: \"kubernetes.io/projected/1176d52c-0fec-4346-ad79-af25ac4c3f62-kube-api-access-k5vb7\") on node \"crc\" DevicePath \"\""
Jan 28 20:59:15 crc kubenswrapper[4746]: I0128 20:59:15.496185 4746 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/root-account-create-update-9v856"]
Jan 28 20:59:15 crc kubenswrapper[4746]: E0128 20:59:15.496708 4746 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="1176d52c-0fec-4346-ad79-af25ac4c3f62" containerName="keystone-db-sync"
Jan 28 20:59:15 crc kubenswrapper[4746]: I0128 20:59:15.496734 4746 state_mem.go:107] "Deleted CPUSet assignment" podUID="1176d52c-0fec-4346-ad79-af25ac4c3f62" containerName="keystone-db-sync"
Jan 28 20:59:15 crc kubenswrapper[4746]: E0128 20:59:15.496757 4746 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="0229a209-4581-4685-ae13-fb7e3be8e743" containerName="mariadb-account-create-update"
Jan 28 20:59:15 crc kubenswrapper[4746]: I0128 20:59:15.496766 4746 state_mem.go:107] "Deleted CPUSet assignment" podUID="0229a209-4581-4685-ae13-fb7e3be8e743" containerName="mariadb-account-create-update"
Jan 28 20:59:15 crc kubenswrapper[4746]: I0128 20:59:15.497101 4746 memory_manager.go:354] "RemoveStaleState removing state" podUID="1176d52c-0fec-4346-ad79-af25ac4c3f62" containerName="keystone-db-sync"
Jan 28 20:59:15 crc kubenswrapper[4746]: I0128 20:59:15.497155 4746 memory_manager.go:354] "RemoveStaleState removing state" podUID="0229a209-4581-4685-ae13-fb7e3be8e743" containerName="mariadb-account-create-update"
Jan 28 20:59:15 crc kubenswrapper[4746]: I0128 20:59:15.497957 4746 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/root-account-create-update-9v856"
Jan 28 20:59:15 crc kubenswrapper[4746]: I0128 20:59:15.500319 4746 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-mariadb-root-db-secret"
Jan 28 20:59:15 crc kubenswrapper[4746]: I0128 20:59:15.501639 4746 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/root-account-create-update-9v856"]
Jan 28 20:59:15 crc kubenswrapper[4746]: I0128 20:59:15.580159 4746 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-storage-0" event={"ID":"39e8de66-78c6-45cf-b026-7783ef89922d","Type":"ContainerStarted","Data":"c7e2f9c94e702877fbfced50e04ed1840ce9bae8014419cfc361807f729e3fbb"}
Jan 28 20:59:15 crc kubenswrapper[4746]: I0128 20:59:15.580208 4746 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-storage-0" event={"ID":"39e8de66-78c6-45cf-b026-7783ef89922d","Type":"ContainerStarted","Data":"88dcb970217a43e640019b8a6378d6e83216499726bf447c893659e95244d7d2"}
Jan 28 20:59:15 crc kubenswrapper[4746]: I0128 20:59:15.580219 4746 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-storage-0" event={"ID":"39e8de66-78c6-45cf-b026-7783ef89922d","Type":"ContainerStarted","Data":"7fb878153273dada28e1008419f3a1aa725270137aaa0d2fd1f4f14c4873fe16"}
Jan 28 20:59:15 crc kubenswrapper[4746]: I0128 20:59:15.580229 4746 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-storage-0" event={"ID":"39e8de66-78c6-45cf-b026-7783ef89922d","Type":"ContainerStarted","Data":"bc2af8d3b3c8001a5517e0a556cb70752b1a44494048a91090ad4ab78f83d792"}
Jan 28 20:59:15 crc kubenswrapper[4746]: I0128 20:59:15.586009 4746 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-db-sync-7mlnk" event={"ID":"1176d52c-0fec-4346-ad79-af25ac4c3f62","Type":"ContainerDied","Data":"047893e9bcb3936b14de9e12190ed07b929028ec05ed100b6d303d345d2956a8"}
Jan 28 20:59:15 crc kubenswrapper[4746]: I0128 20:59:15.586042 4746 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="047893e9bcb3936b14de9e12190ed07b929028ec05ed100b6d303d345d2956a8"
Jan 28 20:59:15 crc kubenswrapper[4746]: I0128 20:59:15.586159 4746 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-db-sync-7mlnk"
Jan 28 20:59:15 crc kubenswrapper[4746]: I0128 20:59:15.624215 4746 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-tmzps\" (UniqueName: \"kubernetes.io/projected/8dd916f9-77ae-48e4-86f1-e224e2f1dd6c-kube-api-access-tmzps\") pod \"root-account-create-update-9v856\" (UID: \"8dd916f9-77ae-48e4-86f1-e224e2f1dd6c\") " pod="openstack/root-account-create-update-9v856"
Jan 28 20:59:15 crc kubenswrapper[4746]: I0128 20:59:15.624365 4746 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/8dd916f9-77ae-48e4-86f1-e224e2f1dd6c-operator-scripts\") pod \"root-account-create-update-9v856\" (UID: \"8dd916f9-77ae-48e4-86f1-e224e2f1dd6c\") " pod="openstack/root-account-create-update-9v856"
Jan 28 20:59:15 crc kubenswrapper[4746]: I0128 20:59:15.739471 4746 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/8dd916f9-77ae-48e4-86f1-e224e2f1dd6c-operator-scripts\") pod \"root-account-create-update-9v856\" (UID: \"8dd916f9-77ae-48e4-86f1-e224e2f1dd6c\") " pod="openstack/root-account-create-update-9v856"
Jan 28 20:59:15 crc kubenswrapper[4746]: I0128 20:59:15.739606 4746 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-tmzps\" (UniqueName: \"kubernetes.io/projected/8dd916f9-77ae-48e4-86f1-e224e2f1dd6c-kube-api-access-tmzps\") pod \"root-account-create-update-9v856\" (UID: \"8dd916f9-77ae-48e4-86f1-e224e2f1dd6c\") " pod="openstack/root-account-create-update-9v856"
Jan 28 20:59:15 crc kubenswrapper[4746]: I0128 20:59:15.740758 4746 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/8dd916f9-77ae-48e4-86f1-e224e2f1dd6c-operator-scripts\") pod \"root-account-create-update-9v856\" (UID: \"8dd916f9-77ae-48e4-86f1-e224e2f1dd6c\") " pod="openstack/root-account-create-update-9v856"
Jan 28 20:59:15 crc kubenswrapper[4746]: I0128 20:59:15.767554 4746 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-tmzps\" (UniqueName: \"kubernetes.io/projected/8dd916f9-77ae-48e4-86f1-e224e2f1dd6c-kube-api-access-tmzps\") pod \"root-account-create-update-9v856\" (UID: \"8dd916f9-77ae-48e4-86f1-e224e2f1dd6c\") " pod="openstack/root-account-create-update-9v856"
Jan 28 20:59:15 crc kubenswrapper[4746]: I0128 20:59:15.822882 4746 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-f877ddd87-rkvws"]
Jan 28 20:59:15 crc kubenswrapper[4746]: I0128 20:59:15.824680 4746 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-f877ddd87-rkvws"
Jan 28 20:59:15 crc kubenswrapper[4746]: I0128 20:59:15.851103 4746 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-f877ddd87-rkvws"]
Jan 28 20:59:15 crc kubenswrapper[4746]: I0128 20:59:15.870702 4746 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/keystone-bootstrap-jhdzr"]
Jan 28 20:59:15 crc kubenswrapper[4746]: I0128 20:59:15.872228 4746 util.go:30] "No sandbox for pod can be found.
Need to start a new one" pod="openstack/keystone-bootstrap-jhdzr" Jan 28 20:59:15 crc kubenswrapper[4746]: I0128 20:59:15.878868 4746 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"osp-secret" Jan 28 20:59:15 crc kubenswrapper[4746]: I0128 20:59:15.878887 4746 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone-scripts" Jan 28 20:59:15 crc kubenswrapper[4746]: I0128 20:59:15.878902 4746 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone-config-data" Jan 28 20:59:15 crc kubenswrapper[4746]: I0128 20:59:15.879598 4746 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone-keystone-dockercfg-j5js5" Jan 28 20:59:15 crc kubenswrapper[4746]: I0128 20:59:15.879718 4746 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone" Jan 28 20:59:15 crc kubenswrapper[4746]: I0128 20:59:15.891217 4746 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/keystone-bootstrap-jhdzr"] Jan 28 20:59:15 crc kubenswrapper[4746]: I0128 20:59:15.894339 4746 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/root-account-create-update-9v856" Jan 28 20:59:15 crc kubenswrapper[4746]: I0128 20:59:15.944786 4746 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/7fdad9a4-c388-49e0-aff0-b73c137b9862-config\") pod \"dnsmasq-dns-f877ddd87-rkvws\" (UID: \"7fdad9a4-c388-49e0-aff0-b73c137b9862\") " pod="openstack/dnsmasq-dns-f877ddd87-rkvws" Jan 28 20:59:15 crc kubenswrapper[4746]: I0128 20:59:15.944954 4746 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/7fdad9a4-c388-49e0-aff0-b73c137b9862-dns-svc\") pod \"dnsmasq-dns-f877ddd87-rkvws\" (UID: \"7fdad9a4-c388-49e0-aff0-b73c137b9862\") " pod="openstack/dnsmasq-dns-f877ddd87-rkvws" Jan 28 20:59:15 crc kubenswrapper[4746]: I0128 20:59:15.945126 4746 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/7fdad9a4-c388-49e0-aff0-b73c137b9862-ovsdbserver-sb\") pod \"dnsmasq-dns-f877ddd87-rkvws\" (UID: \"7fdad9a4-c388-49e0-aff0-b73c137b9862\") " pod="openstack/dnsmasq-dns-f877ddd87-rkvws" Jan 28 20:59:15 crc kubenswrapper[4746]: I0128 20:59:15.945194 4746 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-p25td\" (UniqueName: \"kubernetes.io/projected/7fdad9a4-c388-49e0-aff0-b73c137b9862-kube-api-access-p25td\") pod \"dnsmasq-dns-f877ddd87-rkvws\" (UID: \"7fdad9a4-c388-49e0-aff0-b73c137b9862\") " pod="openstack/dnsmasq-dns-f877ddd87-rkvws" Jan 28 20:59:15 crc kubenswrapper[4746]: I0128 20:59:15.945410 4746 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/7fdad9a4-c388-49e0-aff0-b73c137b9862-ovsdbserver-nb\") pod \"dnsmasq-dns-f877ddd87-rkvws\" (UID: 
\"7fdad9a4-c388-49e0-aff0-b73c137b9862\") " pod="openstack/dnsmasq-dns-f877ddd87-rkvws" Jan 28 20:59:16 crc kubenswrapper[4746]: I0128 20:59:16.028207 4746 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ceilometer-0"] Jan 28 20:59:16 crc kubenswrapper[4746]: I0128 20:59:16.030613 4746 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0" Jan 28 20:59:16 crc kubenswrapper[4746]: I0128 20:59:16.047030 4746 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-j5w7r\" (UniqueName: \"kubernetes.io/projected/d575464e-96b9-46bf-9ae9-37e25dafb223-kube-api-access-j5w7r\") pod \"keystone-bootstrap-jhdzr\" (UID: \"d575464e-96b9-46bf-9ae9-37e25dafb223\") " pod="openstack/keystone-bootstrap-jhdzr" Jan 28 20:59:16 crc kubenswrapper[4746]: I0128 20:59:16.047117 4746 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/7fdad9a4-c388-49e0-aff0-b73c137b9862-ovsdbserver-sb\") pod \"dnsmasq-dns-f877ddd87-rkvws\" (UID: \"7fdad9a4-c388-49e0-aff0-b73c137b9862\") " pod="openstack/dnsmasq-dns-f877ddd87-rkvws" Jan 28 20:59:16 crc kubenswrapper[4746]: I0128 20:59:16.047136 4746 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/d575464e-96b9-46bf-9ae9-37e25dafb223-config-data\") pod \"keystone-bootstrap-jhdzr\" (UID: \"d575464e-96b9-46bf-9ae9-37e25dafb223\") " pod="openstack/keystone-bootstrap-jhdzr" Jan 28 20:59:16 crc kubenswrapper[4746]: I0128 20:59:16.047169 4746 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-p25td\" (UniqueName: \"kubernetes.io/projected/7fdad9a4-c388-49e0-aff0-b73c137b9862-kube-api-access-p25td\") pod \"dnsmasq-dns-f877ddd87-rkvws\" (UID: \"7fdad9a4-c388-49e0-aff0-b73c137b9862\") " pod="openstack/dnsmasq-dns-f877ddd87-rkvws" Jan 
28 20:59:16 crc kubenswrapper[4746]: I0128 20:59:16.047197 4746 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d575464e-96b9-46bf-9ae9-37e25dafb223-combined-ca-bundle\") pod \"keystone-bootstrap-jhdzr\" (UID: \"d575464e-96b9-46bf-9ae9-37e25dafb223\") " pod="openstack/keystone-bootstrap-jhdzr" Jan 28 20:59:16 crc kubenswrapper[4746]: I0128 20:59:16.047219 4746 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/d575464e-96b9-46bf-9ae9-37e25dafb223-scripts\") pod \"keystone-bootstrap-jhdzr\" (UID: \"d575464e-96b9-46bf-9ae9-37e25dafb223\") " pod="openstack/keystone-bootstrap-jhdzr" Jan 28 20:59:16 crc kubenswrapper[4746]: I0128 20:59:16.047252 4746 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/7fdad9a4-c388-49e0-aff0-b73c137b9862-ovsdbserver-nb\") pod \"dnsmasq-dns-f877ddd87-rkvws\" (UID: \"7fdad9a4-c388-49e0-aff0-b73c137b9862\") " pod="openstack/dnsmasq-dns-f877ddd87-rkvws" Jan 28 20:59:16 crc kubenswrapper[4746]: I0128 20:59:16.047286 4746 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/d575464e-96b9-46bf-9ae9-37e25dafb223-fernet-keys\") pod \"keystone-bootstrap-jhdzr\" (UID: \"d575464e-96b9-46bf-9ae9-37e25dafb223\") " pod="openstack/keystone-bootstrap-jhdzr" Jan 28 20:59:16 crc kubenswrapper[4746]: I0128 20:59:16.047323 4746 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/7fdad9a4-c388-49e0-aff0-b73c137b9862-config\") pod \"dnsmasq-dns-f877ddd87-rkvws\" (UID: \"7fdad9a4-c388-49e0-aff0-b73c137b9862\") " pod="openstack/dnsmasq-dns-f877ddd87-rkvws" Jan 28 20:59:16 crc kubenswrapper[4746]: I0128 20:59:16.047359 
4746 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/d575464e-96b9-46bf-9ae9-37e25dafb223-credential-keys\") pod \"keystone-bootstrap-jhdzr\" (UID: \"d575464e-96b9-46bf-9ae9-37e25dafb223\") " pod="openstack/keystone-bootstrap-jhdzr" Jan 28 20:59:16 crc kubenswrapper[4746]: I0128 20:59:16.047391 4746 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/7fdad9a4-c388-49e0-aff0-b73c137b9862-dns-svc\") pod \"dnsmasq-dns-f877ddd87-rkvws\" (UID: \"7fdad9a4-c388-49e0-aff0-b73c137b9862\") " pod="openstack/dnsmasq-dns-f877ddd87-rkvws" Jan 28 20:59:16 crc kubenswrapper[4746]: I0128 20:59:16.048387 4746 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceilometer-scripts" Jan 28 20:59:16 crc kubenswrapper[4746]: I0128 20:59:16.048503 4746 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/7fdad9a4-c388-49e0-aff0-b73c137b9862-ovsdbserver-nb\") pod \"dnsmasq-dns-f877ddd87-rkvws\" (UID: \"7fdad9a4-c388-49e0-aff0-b73c137b9862\") " pod="openstack/dnsmasq-dns-f877ddd87-rkvws" Jan 28 20:59:16 crc kubenswrapper[4746]: I0128 20:59:16.049474 4746 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/7fdad9a4-c388-49e0-aff0-b73c137b9862-config\") pod \"dnsmasq-dns-f877ddd87-rkvws\" (UID: \"7fdad9a4-c388-49e0-aff0-b73c137b9862\") " pod="openstack/dnsmasq-dns-f877ddd87-rkvws" Jan 28 20:59:16 crc kubenswrapper[4746]: I0128 20:59:16.049594 4746 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceilometer-config-data" Jan 28 20:59:16 crc kubenswrapper[4746]: I0128 20:59:16.051276 4746 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ceilometer-0"] Jan 28 20:59:16 crc kubenswrapper[4746]: I0128 20:59:16.060758 4746 
operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/7fdad9a4-c388-49e0-aff0-b73c137b9862-dns-svc\") pod \"dnsmasq-dns-f877ddd87-rkvws\" (UID: \"7fdad9a4-c388-49e0-aff0-b73c137b9862\") " pod="openstack/dnsmasq-dns-f877ddd87-rkvws" Jan 28 20:59:16 crc kubenswrapper[4746]: I0128 20:59:16.063255 4746 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/7fdad9a4-c388-49e0-aff0-b73c137b9862-ovsdbserver-sb\") pod \"dnsmasq-dns-f877ddd87-rkvws\" (UID: \"7fdad9a4-c388-49e0-aff0-b73c137b9862\") " pod="openstack/dnsmasq-dns-f877ddd87-rkvws" Jan 28 20:59:16 crc kubenswrapper[4746]: I0128 20:59:16.095147 4746 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/cinder-db-sync-wwfkc"] Jan 28 20:59:16 crc kubenswrapper[4746]: I0128 20:59:16.095960 4746 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-p25td\" (UniqueName: \"kubernetes.io/projected/7fdad9a4-c388-49e0-aff0-b73c137b9862-kube-api-access-p25td\") pod \"dnsmasq-dns-f877ddd87-rkvws\" (UID: \"7fdad9a4-c388-49e0-aff0-b73c137b9862\") " pod="openstack/dnsmasq-dns-f877ddd87-rkvws" Jan 28 20:59:16 crc kubenswrapper[4746]: I0128 20:59:16.096706 4746 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-db-sync-wwfkc" Jan 28 20:59:16 crc kubenswrapper[4746]: I0128 20:59:16.119182 4746 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cinder-cinder-dockercfg-wqf28" Jan 28 20:59:16 crc kubenswrapper[4746]: I0128 20:59:16.119351 4746 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/cloudkitty-db-sync-p9ghn"] Jan 28 20:59:16 crc kubenswrapper[4746]: I0128 20:59:16.120533 4746 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/cloudkitty-db-sync-p9ghn" Jan 28 20:59:16 crc kubenswrapper[4746]: I0128 20:59:16.121148 4746 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cinder-config-data" Jan 28 20:59:16 crc kubenswrapper[4746]: I0128 20:59:16.130878 4746 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cloudkitty-scripts" Jan 28 20:59:16 crc kubenswrapper[4746]: I0128 20:59:16.131124 4746 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cloudkitty-config-data" Jan 28 20:59:16 crc kubenswrapper[4746]: I0128 20:59:16.131258 4746 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cloudkitty-cloudkitty-dockercfg-bjgvh" Jan 28 20:59:16 crc kubenswrapper[4746]: I0128 20:59:16.131332 4746 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/neutron-db-sync-g7kpz"] Jan 28 20:59:16 crc kubenswrapper[4746]: I0128 20:59:16.131506 4746 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cinder-scripts" Jan 28 20:59:16 crc kubenswrapper[4746]: I0128 20:59:16.133005 4746 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/neutron-db-sync-g7kpz" Jan 28 20:59:16 crc kubenswrapper[4746]: I0128 20:59:16.133556 4746 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-cloudkitty-client-internal" Jan 28 20:59:16 crc kubenswrapper[4746]: I0128 20:59:16.140136 4746 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"neutron-httpd-config" Jan 28 20:59:16 crc kubenswrapper[4746]: I0128 20:59:16.140407 4746 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"neutron-neutron-dockercfg-s8hft" Jan 28 20:59:16 crc kubenswrapper[4746]: I0128 20:59:16.142293 4746 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"neutron-config" Jan 28 20:59:16 crc kubenswrapper[4746]: I0128 20:59:16.151177 4746 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/e9abc9de-4ca9-4ba4-afca-bb6b5bf50924-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"e9abc9de-4ca9-4ba4-afca-bb6b5bf50924\") " pod="openstack/ceilometer-0" Jan 28 20:59:16 crc kubenswrapper[4746]: I0128 20:59:16.151253 4746 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-j5w7r\" (UniqueName: \"kubernetes.io/projected/d575464e-96b9-46bf-9ae9-37e25dafb223-kube-api-access-j5w7r\") pod \"keystone-bootstrap-jhdzr\" (UID: \"d575464e-96b9-46bf-9ae9-37e25dafb223\") " pod="openstack/keystone-bootstrap-jhdzr" Jan 28 20:59:16 crc kubenswrapper[4746]: I0128 20:59:16.151281 4746 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-rjbtc\" (UniqueName: \"kubernetes.io/projected/e9abc9de-4ca9-4ba4-afca-bb6b5bf50924-kube-api-access-rjbtc\") pod \"ceilometer-0\" (UID: \"e9abc9de-4ca9-4ba4-afca-bb6b5bf50924\") " pod="openstack/ceilometer-0" Jan 28 20:59:16 crc kubenswrapper[4746]: I0128 20:59:16.152400 4746 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/e9abc9de-4ca9-4ba4-afca-bb6b5bf50924-run-httpd\") pod \"ceilometer-0\" (UID: \"e9abc9de-4ca9-4ba4-afca-bb6b5bf50924\") " pod="openstack/ceilometer-0" Jan 28 20:59:16 crc kubenswrapper[4746]: I0128 20:59:16.152441 4746 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/d575464e-96b9-46bf-9ae9-37e25dafb223-config-data\") pod \"keystone-bootstrap-jhdzr\" (UID: \"d575464e-96b9-46bf-9ae9-37e25dafb223\") " pod="openstack/keystone-bootstrap-jhdzr" Jan 28 20:59:16 crc kubenswrapper[4746]: I0128 20:59:16.152481 4746 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e9abc9de-4ca9-4ba4-afca-bb6b5bf50924-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"e9abc9de-4ca9-4ba4-afca-bb6b5bf50924\") " pod="openstack/ceilometer-0" Jan 28 20:59:16 crc kubenswrapper[4746]: I0128 20:59:16.152529 4746 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d575464e-96b9-46bf-9ae9-37e25dafb223-combined-ca-bundle\") pod \"keystone-bootstrap-jhdzr\" (UID: \"d575464e-96b9-46bf-9ae9-37e25dafb223\") " pod="openstack/keystone-bootstrap-jhdzr" Jan 28 20:59:16 crc kubenswrapper[4746]: I0128 20:59:16.152554 4746 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/d575464e-96b9-46bf-9ae9-37e25dafb223-scripts\") pod \"keystone-bootstrap-jhdzr\" (UID: \"d575464e-96b9-46bf-9ae9-37e25dafb223\") " pod="openstack/keystone-bootstrap-jhdzr" Jan 28 20:59:16 crc kubenswrapper[4746]: I0128 20:59:16.152600 4746 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: 
\"kubernetes.io/secret/e9abc9de-4ca9-4ba4-afca-bb6b5bf50924-config-data\") pod \"ceilometer-0\" (UID: \"e9abc9de-4ca9-4ba4-afca-bb6b5bf50924\") " pod="openstack/ceilometer-0" Jan 28 20:59:16 crc kubenswrapper[4746]: I0128 20:59:16.152646 4746 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/d575464e-96b9-46bf-9ae9-37e25dafb223-fernet-keys\") pod \"keystone-bootstrap-jhdzr\" (UID: \"d575464e-96b9-46bf-9ae9-37e25dafb223\") " pod="openstack/keystone-bootstrap-jhdzr" Jan 28 20:59:16 crc kubenswrapper[4746]: I0128 20:59:16.152741 4746 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/d575464e-96b9-46bf-9ae9-37e25dafb223-credential-keys\") pod \"keystone-bootstrap-jhdzr\" (UID: \"d575464e-96b9-46bf-9ae9-37e25dafb223\") " pod="openstack/keystone-bootstrap-jhdzr" Jan 28 20:59:16 crc kubenswrapper[4746]: I0128 20:59:16.152791 4746 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/e9abc9de-4ca9-4ba4-afca-bb6b5bf50924-log-httpd\") pod \"ceilometer-0\" (UID: \"e9abc9de-4ca9-4ba4-afca-bb6b5bf50924\") " pod="openstack/ceilometer-0" Jan 28 20:59:16 crc kubenswrapper[4746]: I0128 20:59:16.152840 4746 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/e9abc9de-4ca9-4ba4-afca-bb6b5bf50924-scripts\") pod \"ceilometer-0\" (UID: \"e9abc9de-4ca9-4ba4-afca-bb6b5bf50924\") " pod="openstack/ceilometer-0" Jan 28 20:59:16 crc kubenswrapper[4746]: I0128 20:59:16.159674 4746 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cloudkitty-db-sync-p9ghn"] Jan 28 20:59:16 crc kubenswrapper[4746]: I0128 20:59:16.160286 4746 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-f877ddd87-rkvws" Jan 28 20:59:16 crc kubenswrapper[4746]: I0128 20:59:16.176507 4746 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/d575464e-96b9-46bf-9ae9-37e25dafb223-config-data\") pod \"keystone-bootstrap-jhdzr\" (UID: \"d575464e-96b9-46bf-9ae9-37e25dafb223\") " pod="openstack/keystone-bootstrap-jhdzr" Jan 28 20:59:16 crc kubenswrapper[4746]: I0128 20:59:16.178452 4746 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/d575464e-96b9-46bf-9ae9-37e25dafb223-scripts\") pod \"keystone-bootstrap-jhdzr\" (UID: \"d575464e-96b9-46bf-9ae9-37e25dafb223\") " pod="openstack/keystone-bootstrap-jhdzr" Jan 28 20:59:16 crc kubenswrapper[4746]: I0128 20:59:16.178816 4746 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/d575464e-96b9-46bf-9ae9-37e25dafb223-credential-keys\") pod \"keystone-bootstrap-jhdzr\" (UID: \"d575464e-96b9-46bf-9ae9-37e25dafb223\") " pod="openstack/keystone-bootstrap-jhdzr" Jan 28 20:59:16 crc kubenswrapper[4746]: I0128 20:59:16.184662 4746 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/d575464e-96b9-46bf-9ae9-37e25dafb223-fernet-keys\") pod \"keystone-bootstrap-jhdzr\" (UID: \"d575464e-96b9-46bf-9ae9-37e25dafb223\") " pod="openstack/keystone-bootstrap-jhdzr" Jan 28 20:59:16 crc kubenswrapper[4746]: I0128 20:59:16.191668 4746 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d575464e-96b9-46bf-9ae9-37e25dafb223-combined-ca-bundle\") pod \"keystone-bootstrap-jhdzr\" (UID: \"d575464e-96b9-46bf-9ae9-37e25dafb223\") " pod="openstack/keystone-bootstrap-jhdzr" Jan 28 20:59:16 crc kubenswrapper[4746]: I0128 20:59:16.202393 4746 kubelet.go:2428] "SyncLoop 
UPDATE" source="api" pods=["openstack/cinder-db-sync-wwfkc"] Jan 28 20:59:16 crc kubenswrapper[4746]: I0128 20:59:16.269259 4746 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/e9abc9de-4ca9-4ba4-afca-bb6b5bf50924-scripts\") pod \"ceilometer-0\" (UID: \"e9abc9de-4ca9-4ba4-afca-bb6b5bf50924\") " pod="openstack/ceilometer-0" Jan 28 20:59:16 crc kubenswrapper[4746]: I0128 20:59:16.269338 4746 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/secret/b820b96e-5237-4984-a3e9-246b04980cbb-config\") pod \"neutron-db-sync-g7kpz\" (UID: \"b820b96e-5237-4984-a3e9-246b04980cbb\") " pod="openstack/neutron-db-sync-g7kpz" Jan 28 20:59:16 crc kubenswrapper[4746]: I0128 20:59:16.269383 4746 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/a587d3d9-972c-47ae-8e29-5bfd977ff429-combined-ca-bundle\") pod \"cinder-db-sync-wwfkc\" (UID: \"a587d3d9-972c-47ae-8e29-5bfd977ff429\") " pod="openstack/cinder-db-sync-wwfkc" Jan 28 20:59:16 crc kubenswrapper[4746]: I0128 20:59:16.269415 4746 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/e9abc9de-4ca9-4ba4-afca-bb6b5bf50924-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"e9abc9de-4ca9-4ba4-afca-bb6b5bf50924\") " pod="openstack/ceilometer-0" Jan 28 20:59:16 crc kubenswrapper[4746]: I0128 20:59:16.269456 4746 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/a587d3d9-972c-47ae-8e29-5bfd977ff429-db-sync-config-data\") pod \"cinder-db-sync-wwfkc\" (UID: \"a587d3d9-972c-47ae-8e29-5bfd977ff429\") " pod="openstack/cinder-db-sync-wwfkc" Jan 28 20:59:16 crc kubenswrapper[4746]: I0128 20:59:16.269491 4746 
reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/a587d3d9-972c-47ae-8e29-5bfd977ff429-config-data\") pod \"cinder-db-sync-wwfkc\" (UID: \"a587d3d9-972c-47ae-8e29-5bfd977ff429\") " pod="openstack/cinder-db-sync-wwfkc" Jan 28 20:59:16 crc kubenswrapper[4746]: I0128 20:59:16.269527 4746 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-rjbtc\" (UniqueName: \"kubernetes.io/projected/e9abc9de-4ca9-4ba4-afca-bb6b5bf50924-kube-api-access-rjbtc\") pod \"ceilometer-0\" (UID: \"e9abc9de-4ca9-4ba4-afca-bb6b5bf50924\") " pod="openstack/ceilometer-0" Jan 28 20:59:16 crc kubenswrapper[4746]: I0128 20:59:16.269555 4746 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/a587d3d9-972c-47ae-8e29-5bfd977ff429-scripts\") pod \"cinder-db-sync-wwfkc\" (UID: \"a587d3d9-972c-47ae-8e29-5bfd977ff429\") " pod="openstack/cinder-db-sync-wwfkc" Jan 28 20:59:16 crc kubenswrapper[4746]: I0128 20:59:16.269608 4746 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/cce95230-2b72-4598-9d28-3a1465803567-combined-ca-bundle\") pod \"cloudkitty-db-sync-p9ghn\" (UID: \"cce95230-2b72-4598-9d28-3a1465803567\") " pod="openstack/cloudkitty-db-sync-p9ghn" Jan 28 20:59:16 crc kubenswrapper[4746]: I0128 20:59:16.269635 4746 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/e9abc9de-4ca9-4ba4-afca-bb6b5bf50924-run-httpd\") pod \"ceilometer-0\" (UID: \"e9abc9de-4ca9-4ba4-afca-bb6b5bf50924\") " pod="openstack/ceilometer-0" Jan 28 20:59:16 crc kubenswrapper[4746]: I0128 20:59:16.277700 4746 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-machine-id\" 
(UniqueName: \"kubernetes.io/host-path/a587d3d9-972c-47ae-8e29-5bfd977ff429-etc-machine-id\") pod \"cinder-db-sync-wwfkc\" (UID: \"a587d3d9-972c-47ae-8e29-5bfd977ff429\") " pod="openstack/cinder-db-sync-wwfkc" Jan 28 20:59:16 crc kubenswrapper[4746]: I0128 20:59:16.277770 4746 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e9abc9de-4ca9-4ba4-afca-bb6b5bf50924-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"e9abc9de-4ca9-4ba4-afca-bb6b5bf50924\") " pod="openstack/ceilometer-0" Jan 28 20:59:16 crc kubenswrapper[4746]: I0128 20:59:16.277839 4746 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-84g7x\" (UniqueName: \"kubernetes.io/projected/b820b96e-5237-4984-a3e9-246b04980cbb-kube-api-access-84g7x\") pod \"neutron-db-sync-g7kpz\" (UID: \"b820b96e-5237-4984-a3e9-246b04980cbb\") " pod="openstack/neutron-db-sync-g7kpz" Jan 28 20:59:16 crc kubenswrapper[4746]: I0128 20:59:16.277900 4746 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/cce95230-2b72-4598-9d28-3a1465803567-scripts\") pod \"cloudkitty-db-sync-p9ghn\" (UID: \"cce95230-2b72-4598-9d28-3a1465803567\") " pod="openstack/cloudkitty-db-sync-p9ghn" Jan 28 20:59:16 crc kubenswrapper[4746]: I0128 20:59:16.277922 4746 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"certs\" (UniqueName: \"kubernetes.io/projected/cce95230-2b72-4598-9d28-3a1465803567-certs\") pod \"cloudkitty-db-sync-p9ghn\" (UID: \"cce95230-2b72-4598-9d28-3a1465803567\") " pod="openstack/cloudkitty-db-sync-p9ghn" Jan 28 20:59:16 crc kubenswrapper[4746]: I0128 20:59:16.277953 4746 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/e9abc9de-4ca9-4ba4-afca-bb6b5bf50924-config-data\") pod 
\"ceilometer-0\" (UID: \"e9abc9de-4ca9-4ba4-afca-bb6b5bf50924\") " pod="openstack/ceilometer-0" Jan 28 20:59:16 crc kubenswrapper[4746]: I0128 20:59:16.278031 4746 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-bm2k7\" (UniqueName: \"kubernetes.io/projected/cce95230-2b72-4598-9d28-3a1465803567-kube-api-access-bm2k7\") pod \"cloudkitty-db-sync-p9ghn\" (UID: \"cce95230-2b72-4598-9d28-3a1465803567\") " pod="openstack/cloudkitty-db-sync-p9ghn" Jan 28 20:59:16 crc kubenswrapper[4746]: I0128 20:59:16.278074 4746 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/cce95230-2b72-4598-9d28-3a1465803567-config-data\") pod \"cloudkitty-db-sync-p9ghn\" (UID: \"cce95230-2b72-4598-9d28-3a1465803567\") " pod="openstack/cloudkitty-db-sync-p9ghn" Jan 28 20:59:16 crc kubenswrapper[4746]: I0128 20:59:16.278119 4746 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/b820b96e-5237-4984-a3e9-246b04980cbb-combined-ca-bundle\") pod \"neutron-db-sync-g7kpz\" (UID: \"b820b96e-5237-4984-a3e9-246b04980cbb\") " pod="openstack/neutron-db-sync-g7kpz" Jan 28 20:59:16 crc kubenswrapper[4746]: I0128 20:59:16.278256 4746 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/e9abc9de-4ca9-4ba4-afca-bb6b5bf50924-log-httpd\") pod \"ceilometer-0\" (UID: \"e9abc9de-4ca9-4ba4-afca-bb6b5bf50924\") " pod="openstack/ceilometer-0" Jan 28 20:59:16 crc kubenswrapper[4746]: I0128 20:59:16.278314 4746 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-hvpq6\" (UniqueName: \"kubernetes.io/projected/a587d3d9-972c-47ae-8e29-5bfd977ff429-kube-api-access-hvpq6\") pod \"cinder-db-sync-wwfkc\" (UID: 
\"a587d3d9-972c-47ae-8e29-5bfd977ff429\") " pod="openstack/cinder-db-sync-wwfkc"
Jan 28 20:59:16 crc kubenswrapper[4746]: I0128 20:59:16.298232 4746 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/neutron-db-sync-g7kpz"]
Jan 28 20:59:16 crc kubenswrapper[4746]: I0128 20:59:16.299095 4746 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/e9abc9de-4ca9-4ba4-afca-bb6b5bf50924-run-httpd\") pod \"ceilometer-0\" (UID: \"e9abc9de-4ca9-4ba4-afca-bb6b5bf50924\") " pod="openstack/ceilometer-0"
Jan 28 20:59:16 crc kubenswrapper[4746]: I0128 20:59:16.299650 4746 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/e9abc9de-4ca9-4ba4-afca-bb6b5bf50924-log-httpd\") pod \"ceilometer-0\" (UID: \"e9abc9de-4ca9-4ba4-afca-bb6b5bf50924\") " pod="openstack/ceilometer-0"
Jan 28 20:59:16 crc kubenswrapper[4746]: I0128 20:59:16.323766 4746 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/e9abc9de-4ca9-4ba4-afca-bb6b5bf50924-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"e9abc9de-4ca9-4ba4-afca-bb6b5bf50924\") " pod="openstack/ceilometer-0"
Jan 28 20:59:16 crc kubenswrapper[4746]: I0128 20:59:16.324400 4746 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-j5w7r\" (UniqueName: \"kubernetes.io/projected/d575464e-96b9-46bf-9ae9-37e25dafb223-kube-api-access-j5w7r\") pod \"keystone-bootstrap-jhdzr\" (UID: \"d575464e-96b9-46bf-9ae9-37e25dafb223\") " pod="openstack/keystone-bootstrap-jhdzr"
Jan 28 20:59:16 crc kubenswrapper[4746]: I0128 20:59:16.329861 4746 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/e9abc9de-4ca9-4ba4-afca-bb6b5bf50924-config-data\") pod \"ceilometer-0\" (UID: \"e9abc9de-4ca9-4ba4-afca-bb6b5bf50924\") " pod="openstack/ceilometer-0"
Jan 28 20:59:16 crc kubenswrapper[4746]: I0128 20:59:16.330888 4746 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e9abc9de-4ca9-4ba4-afca-bb6b5bf50924-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"e9abc9de-4ca9-4ba4-afca-bb6b5bf50924\") " pod="openstack/ceilometer-0"
Jan 28 20:59:16 crc kubenswrapper[4746]: I0128 20:59:16.342602 4746 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/e9abc9de-4ca9-4ba4-afca-bb6b5bf50924-scripts\") pod \"ceilometer-0\" (UID: \"e9abc9de-4ca9-4ba4-afca-bb6b5bf50924\") " pod="openstack/ceilometer-0"
Jan 28 20:59:16 crc kubenswrapper[4746]: I0128 20:59:16.359352 4746 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-rjbtc\" (UniqueName: \"kubernetes.io/projected/e9abc9de-4ca9-4ba4-afca-bb6b5bf50924-kube-api-access-rjbtc\") pod \"ceilometer-0\" (UID: \"e9abc9de-4ca9-4ba4-afca-bb6b5bf50924\") " pod="openstack/ceilometer-0"
Jan 28 20:59:16 crc kubenswrapper[4746]: I0128 20:59:16.379469 4746 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-hvpq6\" (UniqueName: \"kubernetes.io/projected/a587d3d9-972c-47ae-8e29-5bfd977ff429-kube-api-access-hvpq6\") pod \"cinder-db-sync-wwfkc\" (UID: \"a587d3d9-972c-47ae-8e29-5bfd977ff429\") " pod="openstack/cinder-db-sync-wwfkc"
Jan 28 20:59:16 crc kubenswrapper[4746]: I0128 20:59:16.379524 4746 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/secret/b820b96e-5237-4984-a3e9-246b04980cbb-config\") pod \"neutron-db-sync-g7kpz\" (UID: \"b820b96e-5237-4984-a3e9-246b04980cbb\") " pod="openstack/neutron-db-sync-g7kpz"
Jan 28 20:59:16 crc kubenswrapper[4746]: I0128 20:59:16.379553 4746 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/a587d3d9-972c-47ae-8e29-5bfd977ff429-combined-ca-bundle\") pod \"cinder-db-sync-wwfkc\" (UID: \"a587d3d9-972c-47ae-8e29-5bfd977ff429\") " pod="openstack/cinder-db-sync-wwfkc"
Jan 28 20:59:16 crc kubenswrapper[4746]: I0128 20:59:16.379578 4746 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/a587d3d9-972c-47ae-8e29-5bfd977ff429-db-sync-config-data\") pod \"cinder-db-sync-wwfkc\" (UID: \"a587d3d9-972c-47ae-8e29-5bfd977ff429\") " pod="openstack/cinder-db-sync-wwfkc"
Jan 28 20:59:16 crc kubenswrapper[4746]: I0128 20:59:16.379604 4746 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/a587d3d9-972c-47ae-8e29-5bfd977ff429-config-data\") pod \"cinder-db-sync-wwfkc\" (UID: \"a587d3d9-972c-47ae-8e29-5bfd977ff429\") " pod="openstack/cinder-db-sync-wwfkc"
Jan 28 20:59:16 crc kubenswrapper[4746]: I0128 20:59:16.379624 4746 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/a587d3d9-972c-47ae-8e29-5bfd977ff429-scripts\") pod \"cinder-db-sync-wwfkc\" (UID: \"a587d3d9-972c-47ae-8e29-5bfd977ff429\") " pod="openstack/cinder-db-sync-wwfkc"
Jan 28 20:59:16 crc kubenswrapper[4746]: I0128 20:59:16.379650 4746 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/cce95230-2b72-4598-9d28-3a1465803567-combined-ca-bundle\") pod \"cloudkitty-db-sync-p9ghn\" (UID: \"cce95230-2b72-4598-9d28-3a1465803567\") " pod="openstack/cloudkitty-db-sync-p9ghn"
Jan 28 20:59:16 crc kubenswrapper[4746]: I0128 20:59:16.379677 4746 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/a587d3d9-972c-47ae-8e29-5bfd977ff429-etc-machine-id\") pod \"cinder-db-sync-wwfkc\" (UID: \"a587d3d9-972c-47ae-8e29-5bfd977ff429\") " pod="openstack/cinder-db-sync-wwfkc"
Jan 28 20:59:16 crc kubenswrapper[4746]: I0128 20:59:16.379702 4746 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-84g7x\" (UniqueName: \"kubernetes.io/projected/b820b96e-5237-4984-a3e9-246b04980cbb-kube-api-access-84g7x\") pod \"neutron-db-sync-g7kpz\" (UID: \"b820b96e-5237-4984-a3e9-246b04980cbb\") " pod="openstack/neutron-db-sync-g7kpz"
Jan 28 20:59:16 crc kubenswrapper[4746]: I0128 20:59:16.379726 4746 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/cce95230-2b72-4598-9d28-3a1465803567-scripts\") pod \"cloudkitty-db-sync-p9ghn\" (UID: \"cce95230-2b72-4598-9d28-3a1465803567\") " pod="openstack/cloudkitty-db-sync-p9ghn"
Jan 28 20:59:16 crc kubenswrapper[4746]: I0128 20:59:16.379741 4746 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"certs\" (UniqueName: \"kubernetes.io/projected/cce95230-2b72-4598-9d28-3a1465803567-certs\") pod \"cloudkitty-db-sync-p9ghn\" (UID: \"cce95230-2b72-4598-9d28-3a1465803567\") " pod="openstack/cloudkitty-db-sync-p9ghn"
Jan 28 20:59:16 crc kubenswrapper[4746]: I0128 20:59:16.379770 4746 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-bm2k7\" (UniqueName: \"kubernetes.io/projected/cce95230-2b72-4598-9d28-3a1465803567-kube-api-access-bm2k7\") pod \"cloudkitty-db-sync-p9ghn\" (UID: \"cce95230-2b72-4598-9d28-3a1465803567\") " pod="openstack/cloudkitty-db-sync-p9ghn"
Jan 28 20:59:16 crc kubenswrapper[4746]: I0128 20:59:16.379792 4746 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/b820b96e-5237-4984-a3e9-246b04980cbb-combined-ca-bundle\") pod \"neutron-db-sync-g7kpz\" (UID: \"b820b96e-5237-4984-a3e9-246b04980cbb\") " pod="openstack/neutron-db-sync-g7kpz"
Jan 28 20:59:16 crc kubenswrapper[4746]: I0128 20:59:16.379808 4746 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/cce95230-2b72-4598-9d28-3a1465803567-config-data\") pod \"cloudkitty-db-sync-p9ghn\" (UID: \"cce95230-2b72-4598-9d28-3a1465803567\") " pod="openstack/cloudkitty-db-sync-p9ghn"
Jan 28 20:59:16 crc kubenswrapper[4746]: I0128 20:59:16.391666 4746 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/a587d3d9-972c-47ae-8e29-5bfd977ff429-etc-machine-id\") pod \"cinder-db-sync-wwfkc\" (UID: \"a587d3d9-972c-47ae-8e29-5bfd977ff429\") " pod="openstack/cinder-db-sync-wwfkc"
Jan 28 20:59:16 crc kubenswrapper[4746]: I0128 20:59:16.430911 4746 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/secret/b820b96e-5237-4984-a3e9-246b04980cbb-config\") pod \"neutron-db-sync-g7kpz\" (UID: \"b820b96e-5237-4984-a3e9-246b04980cbb\") " pod="openstack/neutron-db-sync-g7kpz"
Jan 28 20:59:16 crc kubenswrapper[4746]: I0128 20:59:16.438761 4746 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-hvpq6\" (UniqueName: \"kubernetes.io/projected/a587d3d9-972c-47ae-8e29-5bfd977ff429-kube-api-access-hvpq6\") pod \"cinder-db-sync-wwfkc\" (UID: \"a587d3d9-972c-47ae-8e29-5bfd977ff429\") " pod="openstack/cinder-db-sync-wwfkc"
Jan 28 20:59:16 crc kubenswrapper[4746]: I0128 20:59:16.439644 4746 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/a587d3d9-972c-47ae-8e29-5bfd977ff429-db-sync-config-data\") pod \"cinder-db-sync-wwfkc\" (UID: \"a587d3d9-972c-47ae-8e29-5bfd977ff429\") " pod="openstack/cinder-db-sync-wwfkc"
Jan 28 20:59:16 crc kubenswrapper[4746]: I0128 20:59:16.440193 4746 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/cce95230-2b72-4598-9d28-3a1465803567-combined-ca-bundle\") pod \"cloudkitty-db-sync-p9ghn\" (UID: \"cce95230-2b72-4598-9d28-3a1465803567\") " pod="openstack/cloudkitty-db-sync-p9ghn"
Jan 28 20:59:16 crc kubenswrapper[4746]: I0128 20:59:16.440499 4746 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/cce95230-2b72-4598-9d28-3a1465803567-scripts\") pod \"cloudkitty-db-sync-p9ghn\" (UID: \"cce95230-2b72-4598-9d28-3a1465803567\") " pod="openstack/cloudkitty-db-sync-p9ghn"
Jan 28 20:59:16 crc kubenswrapper[4746]: I0128 20:59:16.440658 4746 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/cce95230-2b72-4598-9d28-3a1465803567-config-data\") pod \"cloudkitty-db-sync-p9ghn\" (UID: \"cce95230-2b72-4598-9d28-3a1465803567\") " pod="openstack/cloudkitty-db-sync-p9ghn"
Jan 28 20:59:16 crc kubenswrapper[4746]: I0128 20:59:16.441070 4746 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"certs\" (UniqueName: \"kubernetes.io/projected/cce95230-2b72-4598-9d28-3a1465803567-certs\") pod \"cloudkitty-db-sync-p9ghn\" (UID: \"cce95230-2b72-4598-9d28-3a1465803567\") " pod="openstack/cloudkitty-db-sync-p9ghn"
Jan 28 20:59:16 crc kubenswrapper[4746]: I0128 20:59:16.452488 4746 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/b820b96e-5237-4984-a3e9-246b04980cbb-combined-ca-bundle\") pod \"neutron-db-sync-g7kpz\" (UID: \"b820b96e-5237-4984-a3e9-246b04980cbb\") " pod="openstack/neutron-db-sync-g7kpz"
Jan 28 20:59:16 crc kubenswrapper[4746]: I0128 20:59:16.452871 4746 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/a587d3d9-972c-47ae-8e29-5bfd977ff429-scripts\") pod \"cinder-db-sync-wwfkc\" (UID: \"a587d3d9-972c-47ae-8e29-5bfd977ff429\") " pod="openstack/cinder-db-sync-wwfkc"
Jan 28 20:59:16 crc kubenswrapper[4746]: I0128 20:59:16.453761 4746 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/a587d3d9-972c-47ae-8e29-5bfd977ff429-config-data\") pod \"cinder-db-sync-wwfkc\" (UID: \"a587d3d9-972c-47ae-8e29-5bfd977ff429\") " pod="openstack/cinder-db-sync-wwfkc"
Jan 28 20:59:16 crc kubenswrapper[4746]: I0128 20:59:16.460503 4746 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-bm2k7\" (UniqueName: \"kubernetes.io/projected/cce95230-2b72-4598-9d28-3a1465803567-kube-api-access-bm2k7\") pod \"cloudkitty-db-sync-p9ghn\" (UID: \"cce95230-2b72-4598-9d28-3a1465803567\") " pod="openstack/cloudkitty-db-sync-p9ghn"
Jan 28 20:59:16 crc kubenswrapper[4746]: I0128 20:59:16.461041 4746 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/a587d3d9-972c-47ae-8e29-5bfd977ff429-combined-ca-bundle\") pod \"cinder-db-sync-wwfkc\" (UID: \"a587d3d9-972c-47ae-8e29-5bfd977ff429\") " pod="openstack/cinder-db-sync-wwfkc"
Jan 28 20:59:16 crc kubenswrapper[4746]: I0128 20:59:16.489780 4746 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/cloudkitty-db-sync-p9ghn"
Jan 28 20:59:16 crc kubenswrapper[4746]: I0128 20:59:16.490702 4746 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-84g7x\" (UniqueName: \"kubernetes.io/projected/b820b96e-5237-4984-a3e9-246b04980cbb-kube-api-access-84g7x\") pod \"neutron-db-sync-g7kpz\" (UID: \"b820b96e-5237-4984-a3e9-246b04980cbb\") " pod="openstack/neutron-db-sync-g7kpz"
Jan 28 20:59:16 crc kubenswrapper[4746]: I0128 20:59:16.515662 4746 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/neutron-db-sync-g7kpz"
Jan 28 20:59:16 crc kubenswrapper[4746]: I0128 20:59:16.517126 4746 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-bootstrap-jhdzr"
Jan 28 20:59:17 crc kubenswrapper[4746]: I0128 20:59:16.633508 4746 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-f877ddd87-rkvws"]
Jan 28 20:59:17 crc kubenswrapper[4746]: I0128 20:59:16.657345 4746 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0"
Jan 28 20:59:17 crc kubenswrapper[4746]: I0128 20:59:16.667414 4746 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-storage-0" event={"ID":"39e8de66-78c6-45cf-b026-7783ef89922d","Type":"ContainerStarted","Data":"626ceefb78931c8bdada82af40a2c6904feb08d46e68862eb060b9edca89c9c0"}
Jan 28 20:59:17 crc kubenswrapper[4746]: I0128 20:59:16.667487 4746 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-storage-0" event={"ID":"39e8de66-78c6-45cf-b026-7783ef89922d","Type":"ContainerStarted","Data":"4d55fc13fcb632aacf916417c8bb2cbbc44d919619a5548368a5a2fdbc88dcdd"}
Jan 28 20:59:17 crc kubenswrapper[4746]: I0128 20:59:16.679883 4746 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/barbican-db-sync-29n2p"]
Jan 28 20:59:17 crc kubenswrapper[4746]: I0128 20:59:16.693608 4746 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/barbican-db-sync-29n2p"
Jan 28 20:59:17 crc kubenswrapper[4746]: I0128 20:59:16.701235 4746 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"barbican-config-data"
Jan 28 20:59:17 crc kubenswrapper[4746]: I0128 20:59:16.701468 4746 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"barbican-barbican-dockercfg-cqvw2"
Jan 28 20:59:17 crc kubenswrapper[4746]: I0128 20:59:16.710319 4746 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/placement-db-sync-jms86"]
Jan 28 20:59:17 crc kubenswrapper[4746]: I0128 20:59:16.712409 4746 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/placement-db-sync-jms86"
Jan 28 20:59:17 crc kubenswrapper[4746]: I0128 20:59:16.715639 4746 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-68dcc9cf6f-gb5kd"]
Jan 28 20:59:17 crc kubenswrapper[4746]: I0128 20:59:16.718438 4746 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-68dcc9cf6f-gb5kd"
Jan 28 20:59:17 crc kubenswrapper[4746]: I0128 20:59:16.721202 4746 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"placement-config-data"
Jan 28 20:59:17 crc kubenswrapper[4746]: I0128 20:59:16.721369 4746 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"placement-placement-dockercfg-l5j6d"
Jan 28 20:59:17 crc kubenswrapper[4746]: I0128 20:59:16.721477 4746 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"placement-scripts"
Jan 28 20:59:17 crc kubenswrapper[4746]: I0128 20:59:16.751250 4746 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-db-sync-wwfkc"
Jan 28 20:59:17 crc kubenswrapper[4746]: I0128 20:59:16.771357 4746 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/barbican-db-sync-29n2p"]
Jan 28 20:59:17 crc kubenswrapper[4746]: I0128 20:59:16.819492 4746 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/37690909-5abe-4d42-9b66-d1398879fd15-ovsdbserver-nb\") pod \"dnsmasq-dns-68dcc9cf6f-gb5kd\" (UID: \"37690909-5abe-4d42-9b66-d1398879fd15\") " pod="openstack/dnsmasq-dns-68dcc9cf6f-gb5kd"
Jan 28 20:59:17 crc kubenswrapper[4746]: I0128 20:59:16.819565 4746 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/37690909-5abe-4d42-9b66-d1398879fd15-ovsdbserver-sb\") pod \"dnsmasq-dns-68dcc9cf6f-gb5kd\" (UID: \"37690909-5abe-4d42-9b66-d1398879fd15\") " pod="openstack/dnsmasq-dns-68dcc9cf6f-gb5kd"
Jan 28 20:59:17 crc kubenswrapper[4746]: I0128 20:59:16.819675 4746 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/37690909-5abe-4d42-9b66-d1398879fd15-dns-svc\") pod \"dnsmasq-dns-68dcc9cf6f-gb5kd\" (UID: \"37690909-5abe-4d42-9b66-d1398879fd15\") " pod="openstack/dnsmasq-dns-68dcc9cf6f-gb5kd"
Jan 28 20:59:17 crc kubenswrapper[4746]: I0128 20:59:16.819902 4746 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-cskxd\" (UniqueName: \"kubernetes.io/projected/1d79950b-c574-4952-8620-ff635db5e8de-kube-api-access-cskxd\") pod \"placement-db-sync-jms86\" (UID: \"1d79950b-c574-4952-8620-ff635db5e8de\") " pod="openstack/placement-db-sync-jms86"
Jan 28 20:59:17 crc kubenswrapper[4746]: I0128 20:59:16.819956 4746 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-5l8q4\" (UniqueName: \"kubernetes.io/projected/37690909-5abe-4d42-9b66-d1398879fd15-kube-api-access-5l8q4\") pod \"dnsmasq-dns-68dcc9cf6f-gb5kd\" (UID: \"37690909-5abe-4d42-9b66-d1398879fd15\") " pod="openstack/dnsmasq-dns-68dcc9cf6f-gb5kd"
Jan 28 20:59:17 crc kubenswrapper[4746]: I0128 20:59:16.820125 4746 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/1d79950b-c574-4952-8620-ff635db5e8de-scripts\") pod \"placement-db-sync-jms86\" (UID: \"1d79950b-c574-4952-8620-ff635db5e8de\") " pod="openstack/placement-db-sync-jms86"
Jan 28 20:59:17 crc kubenswrapper[4746]: I0128 20:59:16.820187 4746 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-hz7rb\" (UniqueName: \"kubernetes.io/projected/70e766dc-9f84-4d0c-af5b-3b044e06c09f-kube-api-access-hz7rb\") pod \"barbican-db-sync-29n2p\" (UID: \"70e766dc-9f84-4d0c-af5b-3b044e06c09f\") " pod="openstack/barbican-db-sync-29n2p"
Jan 28 20:59:17 crc kubenswrapper[4746]: I0128 20:59:16.820223 4746 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/1d79950b-c574-4952-8620-ff635db5e8de-config-data\") pod \"placement-db-sync-jms86\" (UID: \"1d79950b-c574-4952-8620-ff635db5e8de\") " pod="openstack/placement-db-sync-jms86"
Jan 28 20:59:17 crc kubenswrapper[4746]: I0128 20:59:16.820268 4746 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/70e766dc-9f84-4d0c-af5b-3b044e06c09f-db-sync-config-data\") pod \"barbican-db-sync-29n2p\" (UID: \"70e766dc-9f84-4d0c-af5b-3b044e06c09f\") " pod="openstack/barbican-db-sync-29n2p"
Jan 28 20:59:17 crc kubenswrapper[4746]: I0128 20:59:16.820399 4746 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/1d79950b-c574-4952-8620-ff635db5e8de-logs\") pod \"placement-db-sync-jms86\" (UID: \"1d79950b-c574-4952-8620-ff635db5e8de\") " pod="openstack/placement-db-sync-jms86"
Jan 28 20:59:17 crc kubenswrapper[4746]: I0128 20:59:16.820497 4746 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/37690909-5abe-4d42-9b66-d1398879fd15-config\") pod \"dnsmasq-dns-68dcc9cf6f-gb5kd\" (UID: \"37690909-5abe-4d42-9b66-d1398879fd15\") " pod="openstack/dnsmasq-dns-68dcc9cf6f-gb5kd"
Jan 28 20:59:17 crc kubenswrapper[4746]: I0128 20:59:16.820573 4746 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/70e766dc-9f84-4d0c-af5b-3b044e06c09f-combined-ca-bundle\") pod \"barbican-db-sync-29n2p\" (UID: \"70e766dc-9f84-4d0c-af5b-3b044e06c09f\") " pod="openstack/barbican-db-sync-29n2p"
Jan 28 20:59:17 crc kubenswrapper[4746]: I0128 20:59:16.820600 4746 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/1d79950b-c574-4952-8620-ff635db5e8de-combined-ca-bundle\") pod \"placement-db-sync-jms86\" (UID: \"1d79950b-c574-4952-8620-ff635db5e8de\") " pod="openstack/placement-db-sync-jms86"
Jan 28 20:59:17 crc kubenswrapper[4746]: I0128 20:59:16.986057 4746 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/1d79950b-c574-4952-8620-ff635db5e8de-logs\") pod \"placement-db-sync-jms86\" (UID: \"1d79950b-c574-4952-8620-ff635db5e8de\") " pod="openstack/placement-db-sync-jms86"
Jan 28 20:59:17 crc kubenswrapper[4746]: I0128 20:59:16.986454 4746 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/37690909-5abe-4d42-9b66-d1398879fd15-config\") pod \"dnsmasq-dns-68dcc9cf6f-gb5kd\" (UID: \"37690909-5abe-4d42-9b66-d1398879fd15\") " pod="openstack/dnsmasq-dns-68dcc9cf6f-gb5kd"
Jan 28 20:59:17 crc kubenswrapper[4746]: I0128 20:59:16.986786 4746 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/70e766dc-9f84-4d0c-af5b-3b044e06c09f-combined-ca-bundle\") pod \"barbican-db-sync-29n2p\" (UID: \"70e766dc-9f84-4d0c-af5b-3b044e06c09f\") " pod="openstack/barbican-db-sync-29n2p"
Jan 28 20:59:17 crc kubenswrapper[4746]: I0128 20:59:16.986807 4746 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/1d79950b-c574-4952-8620-ff635db5e8de-logs\") pod \"placement-db-sync-jms86\" (UID: \"1d79950b-c574-4952-8620-ff635db5e8de\") " pod="openstack/placement-db-sync-jms86"
Jan 28 20:59:17 crc kubenswrapper[4746]: I0128 20:59:16.986833 4746 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/1d79950b-c574-4952-8620-ff635db5e8de-combined-ca-bundle\") pod \"placement-db-sync-jms86\" (UID: \"1d79950b-c574-4952-8620-ff635db5e8de\") " pod="openstack/placement-db-sync-jms86"
Jan 28 20:59:17 crc kubenswrapper[4746]: I0128 20:59:16.986965 4746 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/37690909-5abe-4d42-9b66-d1398879fd15-ovsdbserver-nb\") pod \"dnsmasq-dns-68dcc9cf6f-gb5kd\" (UID: \"37690909-5abe-4d42-9b66-d1398879fd15\") " pod="openstack/dnsmasq-dns-68dcc9cf6f-gb5kd"
Jan 28 20:59:17 crc kubenswrapper[4746]: I0128 20:59:16.986988 4746 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/37690909-5abe-4d42-9b66-d1398879fd15-ovsdbserver-sb\") pod \"dnsmasq-dns-68dcc9cf6f-gb5kd\" (UID: \"37690909-5abe-4d42-9b66-d1398879fd15\") " pod="openstack/dnsmasq-dns-68dcc9cf6f-gb5kd"
Jan 28 20:59:17 crc kubenswrapper[4746]: I0128 20:59:16.987068 4746 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/37690909-5abe-4d42-9b66-d1398879fd15-dns-svc\") pod \"dnsmasq-dns-68dcc9cf6f-gb5kd\" (UID: \"37690909-5abe-4d42-9b66-d1398879fd15\") " pod="openstack/dnsmasq-dns-68dcc9cf6f-gb5kd"
Jan 28 20:59:17 crc kubenswrapper[4746]: I0128 20:59:16.987168 4746 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-cskxd\" (UniqueName: \"kubernetes.io/projected/1d79950b-c574-4952-8620-ff635db5e8de-kube-api-access-cskxd\") pod \"placement-db-sync-jms86\" (UID: \"1d79950b-c574-4952-8620-ff635db5e8de\") " pod="openstack/placement-db-sync-jms86"
Jan 28 20:59:17 crc kubenswrapper[4746]: I0128 20:59:16.987204 4746 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-5l8q4\" (UniqueName: \"kubernetes.io/projected/37690909-5abe-4d42-9b66-d1398879fd15-kube-api-access-5l8q4\") pod \"dnsmasq-dns-68dcc9cf6f-gb5kd\" (UID: \"37690909-5abe-4d42-9b66-d1398879fd15\") " pod="openstack/dnsmasq-dns-68dcc9cf6f-gb5kd"
Jan 28 20:59:17 crc kubenswrapper[4746]: I0128 20:59:16.987252 4746 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/1d79950b-c574-4952-8620-ff635db5e8de-scripts\") pod \"placement-db-sync-jms86\" (UID: \"1d79950b-c574-4952-8620-ff635db5e8de\") " pod="openstack/placement-db-sync-jms86"
Jan 28 20:59:17 crc kubenswrapper[4746]: I0128 20:59:16.987295 4746 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-hz7rb\" (UniqueName: \"kubernetes.io/projected/70e766dc-9f84-4d0c-af5b-3b044e06c09f-kube-api-access-hz7rb\") pod \"barbican-db-sync-29n2p\" (UID: \"70e766dc-9f84-4d0c-af5b-3b044e06c09f\") " pod="openstack/barbican-db-sync-29n2p"
Jan 28 20:59:17 crc kubenswrapper[4746]: I0128 20:59:16.987327 4746 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/1d79950b-c574-4952-8620-ff635db5e8de-config-data\") pod \"placement-db-sync-jms86\" (UID: \"1d79950b-c574-4952-8620-ff635db5e8de\") " pod="openstack/placement-db-sync-jms86"
Jan 28 20:59:17 crc kubenswrapper[4746]: I0128 20:59:16.987360 4746 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/70e766dc-9f84-4d0c-af5b-3b044e06c09f-db-sync-config-data\") pod \"barbican-db-sync-29n2p\" (UID: \"70e766dc-9f84-4d0c-af5b-3b044e06c09f\") " pod="openstack/barbican-db-sync-29n2p"
Jan 28 20:59:17 crc kubenswrapper[4746]: I0128 20:59:16.987894 4746 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/swift-storage-0" podStartSLOduration=46.05048031 podStartE2EDuration="51.987864868s" podCreationTimestamp="2026-01-28 20:58:25 +0000 UTC" firstStartedPulling="2026-01-28 20:59:08.353165076 +0000 UTC m=+1176.309351430" lastFinishedPulling="2026-01-28 20:59:14.290549634 +0000 UTC m=+1182.246735988" observedRunningTime="2026-01-28 20:59:16.729975171 +0000 UTC m=+1184.686161525" watchObservedRunningTime="2026-01-28 20:59:16.987864868 +0000 UTC m=+1184.944051222"
Jan 28 20:59:17 crc kubenswrapper[4746]: I0128 20:59:16.989037 4746 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/37690909-5abe-4d42-9b66-d1398879fd15-config\") pod \"dnsmasq-dns-68dcc9cf6f-gb5kd\" (UID: \"37690909-5abe-4d42-9b66-d1398879fd15\") " pod="openstack/dnsmasq-dns-68dcc9cf6f-gb5kd"
Jan 28 20:59:17 crc kubenswrapper[4746]: I0128 20:59:16.989755 4746 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/37690909-5abe-4d42-9b66-d1398879fd15-ovsdbserver-nb\") pod \"dnsmasq-dns-68dcc9cf6f-gb5kd\" (UID: \"37690909-5abe-4d42-9b66-d1398879fd15\") " pod="openstack/dnsmasq-dns-68dcc9cf6f-gb5kd"
Jan 28 20:59:17 crc kubenswrapper[4746]: I0128 20:59:16.990260 4746 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/37690909-5abe-4d42-9b66-d1398879fd15-ovsdbserver-sb\") pod \"dnsmasq-dns-68dcc9cf6f-gb5kd\" (UID: \"37690909-5abe-4d42-9b66-d1398879fd15\") " pod="openstack/dnsmasq-dns-68dcc9cf6f-gb5kd"
Jan 28 20:59:17 crc kubenswrapper[4746]: I0128 20:59:16.990835 4746 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/37690909-5abe-4d42-9b66-d1398879fd15-dns-svc\") pod \"dnsmasq-dns-68dcc9cf6f-gb5kd\" (UID: \"37690909-5abe-4d42-9b66-d1398879fd15\") " pod="openstack/dnsmasq-dns-68dcc9cf6f-gb5kd"
Jan 28 20:59:17 crc kubenswrapper[4746]: I0128 20:59:17.017813 4746 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/70e766dc-9f84-4d0c-af5b-3b044e06c09f-db-sync-config-data\") pod \"barbican-db-sync-29n2p\" (UID: \"70e766dc-9f84-4d0c-af5b-3b044e06c09f\") " pod="openstack/barbican-db-sync-29n2p"
Jan 28 20:59:17 crc kubenswrapper[4746]: I0128 20:59:17.017862 4746 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/1d79950b-c574-4952-8620-ff635db5e8de-scripts\") pod \"placement-db-sync-jms86\" (UID: \"1d79950b-c574-4952-8620-ff635db5e8de\") " pod="openstack/placement-db-sync-jms86"
Jan 28 20:59:17 crc kubenswrapper[4746]: I0128 20:59:17.021114 4746 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/70e766dc-9f84-4d0c-af5b-3b044e06c09f-combined-ca-bundle\") pod \"barbican-db-sync-29n2p\" (UID: \"70e766dc-9f84-4d0c-af5b-3b044e06c09f\") " pod="openstack/barbican-db-sync-29n2p"
Jan 28 20:59:17 crc kubenswrapper[4746]: I0128 20:59:17.023604 4746 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/1d79950b-c574-4952-8620-ff635db5e8de-config-data\") pod \"placement-db-sync-jms86\" (UID: \"1d79950b-c574-4952-8620-ff635db5e8de\") " pod="openstack/placement-db-sync-jms86"
Jan 28 20:59:17 crc kubenswrapper[4746]: I0128 20:59:17.030334 4746 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/placement-db-sync-jms86"]
Jan 28 20:59:17 crc kubenswrapper[4746]: I0128 20:59:17.030395 4746 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-68dcc9cf6f-gb5kd"]
Jan 28 20:59:17 crc kubenswrapper[4746]: I0128 20:59:17.035797 4746 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/1d79950b-c574-4952-8620-ff635db5e8de-combined-ca-bundle\") pod \"placement-db-sync-jms86\" (UID: \"1d79950b-c574-4952-8620-ff635db5e8de\") " pod="openstack/placement-db-sync-jms86"
Jan 28 20:59:17 crc kubenswrapper[4746]: I0128 20:59:17.040496 4746 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-cskxd\" (UniqueName: \"kubernetes.io/projected/1d79950b-c574-4952-8620-ff635db5e8de-kube-api-access-cskxd\") pod \"placement-db-sync-jms86\" (UID: \"1d79950b-c574-4952-8620-ff635db5e8de\") " pod="openstack/placement-db-sync-jms86"
Jan 28 20:59:17 crc kubenswrapper[4746]: I0128 20:59:17.043361 4746 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-5l8q4\" (UniqueName: \"kubernetes.io/projected/37690909-5abe-4d42-9b66-d1398879fd15-kube-api-access-5l8q4\") pod \"dnsmasq-dns-68dcc9cf6f-gb5kd\" (UID: \"37690909-5abe-4d42-9b66-d1398879fd15\") " pod="openstack/dnsmasq-dns-68dcc9cf6f-gb5kd"
Jan 28 20:59:17 crc kubenswrapper[4746]: I0128 20:59:17.043412 4746 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-hz7rb\" (UniqueName: \"kubernetes.io/projected/70e766dc-9f84-4d0c-af5b-3b044e06c09f-kube-api-access-hz7rb\") pod \"barbican-db-sync-29n2p\" (UID: \"70e766dc-9f84-4d0c-af5b-3b044e06c09f\") " pod="openstack/barbican-db-sync-29n2p"
Jan 28 20:59:17 crc kubenswrapper[4746]: I0128 20:59:17.112446 4746 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/root-account-create-update-9v856"]
Jan 28 20:59:17 crc kubenswrapper[4746]: I0128 20:59:17.114624 4746 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/barbican-db-sync-29n2p"
Jan 28 20:59:17 crc kubenswrapper[4746]: I0128 20:59:17.130826 4746 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-68dcc9cf6f-gb5kd"]
Jan 28 20:59:17 crc kubenswrapper[4746]: I0128 20:59:17.131271 4746 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-68dcc9cf6f-gb5kd"
Jan 28 20:59:17 crc kubenswrapper[4746]: I0128 20:59:17.142016 4746 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-58dd9ff6bc-d9h8x"]
Jan 28 20:59:17 crc kubenswrapper[4746]: I0128 20:59:17.143704 4746 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-58dd9ff6bc-d9h8x"
Jan 28 20:59:17 crc kubenswrapper[4746]: I0128 20:59:17.150235 4746 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-58dd9ff6bc-d9h8x"]
Jan 28 20:59:17 crc kubenswrapper[4746]: I0128 20:59:17.152861 4746 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"dns-swift-storage-0"
Jan 28 20:59:17 crc kubenswrapper[4746]: I0128 20:59:17.200130 4746 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/placement-db-sync-jms86"
Jan 28 20:59:17 crc kubenswrapper[4746]: I0128 20:59:17.295017 4746 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/3710bcff-5321-46a6-8763-94e622eb38cb-config\") pod \"dnsmasq-dns-58dd9ff6bc-d9h8x\" (UID: \"3710bcff-5321-46a6-8763-94e622eb38cb\") " pod="openstack/dnsmasq-dns-58dd9ff6bc-d9h8x"
Jan 28 20:59:17 crc kubenswrapper[4746]: I0128 20:59:17.295135 4746 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/3710bcff-5321-46a6-8763-94e622eb38cb-ovsdbserver-nb\") pod \"dnsmasq-dns-58dd9ff6bc-d9h8x\" (UID: \"3710bcff-5321-46a6-8763-94e622eb38cb\") " pod="openstack/dnsmasq-dns-58dd9ff6bc-d9h8x"
Jan 28 20:59:17 crc kubenswrapper[4746]: I0128 20:59:17.295182 4746 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-55xtw\" (UniqueName: \"kubernetes.io/projected/3710bcff-5321-46a6-8763-94e622eb38cb-kube-api-access-55xtw\") pod \"dnsmasq-dns-58dd9ff6bc-d9h8x\" (UID: \"3710bcff-5321-46a6-8763-94e622eb38cb\") " pod="openstack/dnsmasq-dns-58dd9ff6bc-d9h8x"
Jan 28 20:59:17 crc kubenswrapper[4746]: I0128 20:59:17.295218 4746 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/3710bcff-5321-46a6-8763-94e622eb38cb-dns-swift-storage-0\") pod \"dnsmasq-dns-58dd9ff6bc-d9h8x\" (UID: \"3710bcff-5321-46a6-8763-94e622eb38cb\") " pod="openstack/dnsmasq-dns-58dd9ff6bc-d9h8x"
Jan 28 20:59:17 crc kubenswrapper[4746]: I0128 20:59:17.295313 4746 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/3710bcff-5321-46a6-8763-94e622eb38cb-ovsdbserver-sb\") pod \"dnsmasq-dns-58dd9ff6bc-d9h8x\" (UID: \"3710bcff-5321-46a6-8763-94e622eb38cb\") " pod="openstack/dnsmasq-dns-58dd9ff6bc-d9h8x"
Jan 28 20:59:17 crc kubenswrapper[4746]: I0128 20:59:17.295341 4746 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/3710bcff-5321-46a6-8763-94e622eb38cb-dns-svc\") pod \"dnsmasq-dns-58dd9ff6bc-d9h8x\" (UID: \"3710bcff-5321-46a6-8763-94e622eb38cb\") " pod="openstack/dnsmasq-dns-58dd9ff6bc-d9h8x"
Jan 28 20:59:17 crc kubenswrapper[4746]: I0128 20:59:17.397220 4746 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/3710bcff-5321-46a6-8763-94e622eb38cb-dns-swift-storage-0\") pod \"dnsmasq-dns-58dd9ff6bc-d9h8x\" (UID: \"3710bcff-5321-46a6-8763-94e622eb38cb\") " pod="openstack/dnsmasq-dns-58dd9ff6bc-d9h8x"
Jan 28 20:59:17 crc kubenswrapper[4746]: I0128 20:59:17.397582 4746 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/3710bcff-5321-46a6-8763-94e622eb38cb-ovsdbserver-sb\") pod \"dnsmasq-dns-58dd9ff6bc-d9h8x\" (UID: \"3710bcff-5321-46a6-8763-94e622eb38cb\") " pod="openstack/dnsmasq-dns-58dd9ff6bc-d9h8x"
Jan 28 20:59:17 crc kubenswrapper[4746]: I0128 20:59:17.397608 4746 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/3710bcff-5321-46a6-8763-94e622eb38cb-dns-svc\") pod \"dnsmasq-dns-58dd9ff6bc-d9h8x\" (UID: \"3710bcff-5321-46a6-8763-94e622eb38cb\") " pod="openstack/dnsmasq-dns-58dd9ff6bc-d9h8x"
Jan 28 20:59:17 crc kubenswrapper[4746]: I0128 20:59:17.397662 4746 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/3710bcff-5321-46a6-8763-94e622eb38cb-config\") pod \"dnsmasq-dns-58dd9ff6bc-d9h8x\" (UID:
\"3710bcff-5321-46a6-8763-94e622eb38cb\") " pod="openstack/dnsmasq-dns-58dd9ff6bc-d9h8x" Jan 28 20:59:17 crc kubenswrapper[4746]: I0128 20:59:17.397715 4746 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/3710bcff-5321-46a6-8763-94e622eb38cb-ovsdbserver-nb\") pod \"dnsmasq-dns-58dd9ff6bc-d9h8x\" (UID: \"3710bcff-5321-46a6-8763-94e622eb38cb\") " pod="openstack/dnsmasq-dns-58dd9ff6bc-d9h8x" Jan 28 20:59:17 crc kubenswrapper[4746]: I0128 20:59:17.397729 4746 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-55xtw\" (UniqueName: \"kubernetes.io/projected/3710bcff-5321-46a6-8763-94e622eb38cb-kube-api-access-55xtw\") pod \"dnsmasq-dns-58dd9ff6bc-d9h8x\" (UID: \"3710bcff-5321-46a6-8763-94e622eb38cb\") " pod="openstack/dnsmasq-dns-58dd9ff6bc-d9h8x" Jan 28 20:59:17 crc kubenswrapper[4746]: I0128 20:59:17.398795 4746 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/3710bcff-5321-46a6-8763-94e622eb38cb-dns-swift-storage-0\") pod \"dnsmasq-dns-58dd9ff6bc-d9h8x\" (UID: \"3710bcff-5321-46a6-8763-94e622eb38cb\") " pod="openstack/dnsmasq-dns-58dd9ff6bc-d9h8x" Jan 28 20:59:17 crc kubenswrapper[4746]: I0128 20:59:17.399349 4746 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/3710bcff-5321-46a6-8763-94e622eb38cb-ovsdbserver-sb\") pod \"dnsmasq-dns-58dd9ff6bc-d9h8x\" (UID: \"3710bcff-5321-46a6-8763-94e622eb38cb\") " pod="openstack/dnsmasq-dns-58dd9ff6bc-d9h8x" Jan 28 20:59:17 crc kubenswrapper[4746]: I0128 20:59:17.400010 4746 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/3710bcff-5321-46a6-8763-94e622eb38cb-config\") pod \"dnsmasq-dns-58dd9ff6bc-d9h8x\" (UID: \"3710bcff-5321-46a6-8763-94e622eb38cb\") " 
pod="openstack/dnsmasq-dns-58dd9ff6bc-d9h8x" Jan 28 20:59:17 crc kubenswrapper[4746]: I0128 20:59:17.400557 4746 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/3710bcff-5321-46a6-8763-94e622eb38cb-ovsdbserver-nb\") pod \"dnsmasq-dns-58dd9ff6bc-d9h8x\" (UID: \"3710bcff-5321-46a6-8763-94e622eb38cb\") " pod="openstack/dnsmasq-dns-58dd9ff6bc-d9h8x" Jan 28 20:59:17 crc kubenswrapper[4746]: I0128 20:59:17.400716 4746 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/3710bcff-5321-46a6-8763-94e622eb38cb-dns-svc\") pod \"dnsmasq-dns-58dd9ff6bc-d9h8x\" (UID: \"3710bcff-5321-46a6-8763-94e622eb38cb\") " pod="openstack/dnsmasq-dns-58dd9ff6bc-d9h8x" Jan 28 20:59:17 crc kubenswrapper[4746]: I0128 20:59:17.422583 4746 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-55xtw\" (UniqueName: \"kubernetes.io/projected/3710bcff-5321-46a6-8763-94e622eb38cb-kube-api-access-55xtw\") pod \"dnsmasq-dns-58dd9ff6bc-d9h8x\" (UID: \"3710bcff-5321-46a6-8763-94e622eb38cb\") " pod="openstack/dnsmasq-dns-58dd9ff6bc-d9h8x" Jan 28 20:59:17 crc kubenswrapper[4746]: I0128 20:59:17.476697 4746 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-58dd9ff6bc-d9h8x" Jan 28 20:59:17 crc kubenswrapper[4746]: I0128 20:59:17.680897 4746 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/root-account-create-update-9v856" event={"ID":"8dd916f9-77ae-48e4-86f1-e224e2f1dd6c","Type":"ContainerStarted","Data":"67006d1c6795615fb8ceda95b35ade7bd11941f66b5f0268526602e315e60f7e"} Jan 28 20:59:17 crc kubenswrapper[4746]: I0128 20:59:17.681330 4746 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/root-account-create-update-9v856" event={"ID":"8dd916f9-77ae-48e4-86f1-e224e2f1dd6c","Type":"ContainerStarted","Data":"176c61603675cd020e70d77d4e9bf83f66b5cae5b8a48a3dc463254ab3d4e678"} Jan 28 20:59:17 crc kubenswrapper[4746]: I0128 20:59:17.710166 4746 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/root-account-create-update-9v856" podStartSLOduration=2.710136436 podStartE2EDuration="2.710136436s" podCreationTimestamp="2026-01-28 20:59:15 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-28 20:59:17.704587527 +0000 UTC m=+1185.660773881" watchObservedRunningTime="2026-01-28 20:59:17.710136436 +0000 UTC m=+1185.666322780" Jan 28 20:59:18 crc kubenswrapper[4746]: I0128 20:59:18.162073 4746 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/keystone-bootstrap-jhdzr"] Jan 28 20:59:18 crc kubenswrapper[4746]: I0128 20:59:18.171799 4746 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-f877ddd87-rkvws"] Jan 28 20:59:18 crc kubenswrapper[4746]: I0128 20:59:18.191831 4746 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cinder-db-sync-wwfkc"] Jan 28 20:59:18 crc kubenswrapper[4746]: I0128 20:59:18.206155 4746 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ceilometer-0"] Jan 28 20:59:18 crc kubenswrapper[4746]: I0128 20:59:18.219140 4746 kubelet.go:2428] 
"SyncLoop UPDATE" source="api" pods=["openstack/cloudkitty-db-sync-p9ghn"] Jan 28 20:59:18 crc kubenswrapper[4746]: I0128 20:59:18.246119 4746 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-68dcc9cf6f-gb5kd"] Jan 28 20:59:18 crc kubenswrapper[4746]: I0128 20:59:18.258934 4746 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/placement-db-sync-jms86"] Jan 28 20:59:18 crc kubenswrapper[4746]: I0128 20:59:18.280939 4746 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/barbican-db-sync-29n2p"] Jan 28 20:59:18 crc kubenswrapper[4746]: I0128 20:59:18.288524 4746 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/neutron-db-sync-g7kpz"] Jan 28 20:59:18 crc kubenswrapper[4746]: I0128 20:59:18.360629 4746 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ceilometer-0"] Jan 28 20:59:18 crc kubenswrapper[4746]: I0128 20:59:18.402427 4746 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-58dd9ff6bc-d9h8x"] Jan 28 20:59:18 crc kubenswrapper[4746]: I0128 20:59:18.711657 4746 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cloudkitty-db-sync-p9ghn" event={"ID":"cce95230-2b72-4598-9d28-3a1465803567","Type":"ContainerStarted","Data":"9a71935b892fd6120995b5eb42c6d3debef679a8c19bebe2074b3ddd0243700e"} Jan 28 20:59:18 crc kubenswrapper[4746]: I0128 20:59:18.716120 4746 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-58dd9ff6bc-d9h8x" event={"ID":"3710bcff-5321-46a6-8763-94e622eb38cb","Type":"ContainerStarted","Data":"4ab4d76730e4d4819e4a7e6efd86c1db7824a7e3af2dec4e6924c55ae6c00ada"} Jan 28 20:59:18 crc kubenswrapper[4746]: I0128 20:59:18.720713 4746 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-bootstrap-jhdzr" event={"ID":"d575464e-96b9-46bf-9ae9-37e25dafb223","Type":"ContainerStarted","Data":"5b81ad9abb59abd9f8f451c344b0874a1b516d93aabeaaf5427f757f88f1afac"} Jan 28 20:59:18 crc 
kubenswrapper[4746]: I0128 20:59:18.724011 4746 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-68dcc9cf6f-gb5kd" event={"ID":"37690909-5abe-4d42-9b66-d1398879fd15","Type":"ContainerStarted","Data":"3347b8a79a5590c1bf26d7b426f40be0f93a24c0f1c38137027da8922c227562"} Jan 28 20:59:18 crc kubenswrapper[4746]: I0128 20:59:18.728374 4746 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-db-sync-wwfkc" event={"ID":"a587d3d9-972c-47ae-8e29-5bfd977ff429","Type":"ContainerStarted","Data":"08ee86cb6fcc85ad85b5d6b47293b98bb17a56f48df6088960efbd11a5e808e8"} Jan 28 20:59:18 crc kubenswrapper[4746]: I0128 20:59:18.730270 4746 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-db-sync-jms86" event={"ID":"1d79950b-c574-4952-8620-ff635db5e8de","Type":"ContainerStarted","Data":"d57f309a75fd50699f5749a419ddbb64210cff0c9de91f7336112d123a4bb307"} Jan 28 20:59:18 crc kubenswrapper[4746]: I0128 20:59:18.738967 4746 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"e9abc9de-4ca9-4ba4-afca-bb6b5bf50924","Type":"ContainerStarted","Data":"ca468a76c2b0fc437cfff1fe1d6721a46f3de9f302641d999ead09846737c3fe"} Jan 28 20:59:18 crc kubenswrapper[4746]: I0128 20:59:18.740795 4746 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-db-sync-g7kpz" event={"ID":"b820b96e-5237-4984-a3e9-246b04980cbb","Type":"ContainerStarted","Data":"bace3c13ffe904c5674c425f8137fd20313d3e39f537b02b0112e1579040b6ee"} Jan 28 20:59:18 crc kubenswrapper[4746]: I0128 20:59:18.743277 4746 generic.go:334] "Generic (PLEG): container finished" podID="8dd916f9-77ae-48e4-86f1-e224e2f1dd6c" containerID="67006d1c6795615fb8ceda95b35ade7bd11941f66b5f0268526602e315e60f7e" exitCode=0 Jan 28 20:59:18 crc kubenswrapper[4746]: I0128 20:59:18.743427 4746 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/root-account-create-update-9v856" 
event={"ID":"8dd916f9-77ae-48e4-86f1-e224e2f1dd6c","Type":"ContainerDied","Data":"67006d1c6795615fb8ceda95b35ade7bd11941f66b5f0268526602e315e60f7e"} Jan 28 20:59:18 crc kubenswrapper[4746]: I0128 20:59:18.748415 4746 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-f877ddd87-rkvws" event={"ID":"7fdad9a4-c388-49e0-aff0-b73c137b9862","Type":"ContainerStarted","Data":"11cae4f9758827f4893100cecf4e6b12063b85459ac28bf13a3a6bf58945a26f"} Jan 28 20:59:18 crc kubenswrapper[4746]: I0128 20:59:18.749724 4746 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-db-sync-29n2p" event={"ID":"70e766dc-9f84-4d0c-af5b-3b044e06c09f","Type":"ContainerStarted","Data":"f8fc35da8ff29fd471e54b600102c93bb7f1dece67edccd18a4ebde23e0bf1ce"} Jan 28 20:59:19 crc kubenswrapper[4746]: I0128 20:59:19.766649 4746 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-db-sync-g7kpz" event={"ID":"b820b96e-5237-4984-a3e9-246b04980cbb","Type":"ContainerStarted","Data":"3e8bf0fad3a221f9307149e72a8c7a4c16b411a4b0558781df4f71024476360a"} Jan 28 20:59:19 crc kubenswrapper[4746]: I0128 20:59:19.770228 4746 generic.go:334] "Generic (PLEG): container finished" podID="914308b3-0f5e-4716-bc87-948f8a8acfb3" containerID="d4996791d963ecd750f539082e489ee83a1e085a98ce40656fb05ae4b8c66f2a" exitCode=0 Jan 28 20:59:19 crc kubenswrapper[4746]: I0128 20:59:19.770279 4746 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/prometheus-metric-storage-0" event={"ID":"914308b3-0f5e-4716-bc87-948f8a8acfb3","Type":"ContainerDied","Data":"d4996791d963ecd750f539082e489ee83a1e085a98ce40656fb05ae4b8c66f2a"} Jan 28 20:59:19 crc kubenswrapper[4746]: I0128 20:59:19.785570 4746 generic.go:334] "Generic (PLEG): container finished" podID="7fdad9a4-c388-49e0-aff0-b73c137b9862" containerID="a8df888fe89039bb2762de6a50015ca572f586da3f2ac69040f99ffb51566dac" exitCode=0 Jan 28 20:59:19 crc kubenswrapper[4746]: I0128 20:59:19.790195 4746 kubelet.go:2453] "SyncLoop 
(PLEG): event for pod" pod="openstack/dnsmasq-dns-f877ddd87-rkvws" event={"ID":"7fdad9a4-c388-49e0-aff0-b73c137b9862","Type":"ContainerDied","Data":"a8df888fe89039bb2762de6a50015ca572f586da3f2ac69040f99ffb51566dac"} Jan 28 20:59:19 crc kubenswrapper[4746]: I0128 20:59:19.796949 4746 generic.go:334] "Generic (PLEG): container finished" podID="3710bcff-5321-46a6-8763-94e622eb38cb" containerID="c22169d83f3f2782b74332c459ce50634172feb1cacb9d57d5cbf093c2c11259" exitCode=0 Jan 28 20:59:19 crc kubenswrapper[4746]: I0128 20:59:19.797024 4746 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-58dd9ff6bc-d9h8x" event={"ID":"3710bcff-5321-46a6-8763-94e622eb38cb","Type":"ContainerDied","Data":"c22169d83f3f2782b74332c459ce50634172feb1cacb9d57d5cbf093c2c11259"} Jan 28 20:59:19 crc kubenswrapper[4746]: I0128 20:59:19.800950 4746 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/neutron-db-sync-g7kpz" podStartSLOduration=3.800925387 podStartE2EDuration="3.800925387s" podCreationTimestamp="2026-01-28 20:59:16 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-28 20:59:19.790544597 +0000 UTC m=+1187.746730951" watchObservedRunningTime="2026-01-28 20:59:19.800925387 +0000 UTC m=+1187.757111731" Jan 28 20:59:19 crc kubenswrapper[4746]: I0128 20:59:19.801126 4746 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-bootstrap-jhdzr" event={"ID":"d575464e-96b9-46bf-9ae9-37e25dafb223","Type":"ContainerStarted","Data":"029111cbc3349dbbb9ed934ead4096eddb944845f11de68a6c2f8df6bb8f1d66"} Jan 28 20:59:19 crc kubenswrapper[4746]: I0128 20:59:19.805873 4746 generic.go:334] "Generic (PLEG): container finished" podID="37690909-5abe-4d42-9b66-d1398879fd15" containerID="0c3b27b4aa2aee796493bf7e6d91bdf07cba1cf8d7cb3b8b6b8fa10097aec927" exitCode=0 Jan 28 20:59:19 crc kubenswrapper[4746]: I0128 20:59:19.805946 4746 kubelet.go:2453] 
"SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-68dcc9cf6f-gb5kd" event={"ID":"37690909-5abe-4d42-9b66-d1398879fd15","Type":"ContainerDied","Data":"0c3b27b4aa2aee796493bf7e6d91bdf07cba1cf8d7cb3b8b6b8fa10097aec927"} Jan 28 20:59:19 crc kubenswrapper[4746]: I0128 20:59:19.809637 4746 generic.go:334] "Generic (PLEG): container finished" podID="faabc487-475c-4f5b-b135-5a96d1ed9269" containerID="adf6407c69131aebac7a4d54d91e59942a3a262af5a8ce578e90d9e795210918" exitCode=0 Jan 28 20:59:19 crc kubenswrapper[4746]: I0128 20:59:19.809802 4746 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-db-sync-dsp4s" event={"ID":"faabc487-475c-4f5b-b135-5a96d1ed9269","Type":"ContainerDied","Data":"adf6407c69131aebac7a4d54d91e59942a3a262af5a8ce578e90d9e795210918"} Jan 28 20:59:19 crc kubenswrapper[4746]: I0128 20:59:19.920504 4746 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/keystone-bootstrap-jhdzr" podStartSLOduration=4.920485712 podStartE2EDuration="4.920485712s" podCreationTimestamp="2026-01-28 20:59:15 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-28 20:59:19.914725647 +0000 UTC m=+1187.870912001" watchObservedRunningTime="2026-01-28 20:59:19.920485712 +0000 UTC m=+1187.876672056" Jan 28 20:59:20 crc kubenswrapper[4746]: I0128 20:59:20.538852 4746 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/root-account-create-update-9v856" Jan 28 20:59:20 crc kubenswrapper[4746]: I0128 20:59:20.612700 4746 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-68dcc9cf6f-gb5kd" Jan 28 20:59:20 crc kubenswrapper[4746]: I0128 20:59:20.621623 4746 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-tmzps\" (UniqueName: \"kubernetes.io/projected/8dd916f9-77ae-48e4-86f1-e224e2f1dd6c-kube-api-access-tmzps\") pod \"8dd916f9-77ae-48e4-86f1-e224e2f1dd6c\" (UID: \"8dd916f9-77ae-48e4-86f1-e224e2f1dd6c\") " Jan 28 20:59:20 crc kubenswrapper[4746]: I0128 20:59:20.622123 4746 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/8dd916f9-77ae-48e4-86f1-e224e2f1dd6c-operator-scripts\") pod \"8dd916f9-77ae-48e4-86f1-e224e2f1dd6c\" (UID: \"8dd916f9-77ae-48e4-86f1-e224e2f1dd6c\") " Jan 28 20:59:20 crc kubenswrapper[4746]: I0128 20:59:20.622842 4746 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/8dd916f9-77ae-48e4-86f1-e224e2f1dd6c-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "8dd916f9-77ae-48e4-86f1-e224e2f1dd6c" (UID: "8dd916f9-77ae-48e4-86f1-e224e2f1dd6c"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 28 20:59:20 crc kubenswrapper[4746]: I0128 20:59:20.637227 4746 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/8dd916f9-77ae-48e4-86f1-e224e2f1dd6c-kube-api-access-tmzps" (OuterVolumeSpecName: "kube-api-access-tmzps") pod "8dd916f9-77ae-48e4-86f1-e224e2f1dd6c" (UID: "8dd916f9-77ae-48e4-86f1-e224e2f1dd6c"). InnerVolumeSpecName "kube-api-access-tmzps". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 28 20:59:20 crc kubenswrapper[4746]: I0128 20:59:20.637938 4746 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-f877ddd87-rkvws" Jan 28 20:59:20 crc kubenswrapper[4746]: I0128 20:59:20.723454 4746 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/7fdad9a4-c388-49e0-aff0-b73c137b9862-ovsdbserver-sb\") pod \"7fdad9a4-c388-49e0-aff0-b73c137b9862\" (UID: \"7fdad9a4-c388-49e0-aff0-b73c137b9862\") " Jan 28 20:59:20 crc kubenswrapper[4746]: I0128 20:59:20.723724 4746 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/7fdad9a4-c388-49e0-aff0-b73c137b9862-config\") pod \"7fdad9a4-c388-49e0-aff0-b73c137b9862\" (UID: \"7fdad9a4-c388-49e0-aff0-b73c137b9862\") " Jan 28 20:59:20 crc kubenswrapper[4746]: I0128 20:59:20.723857 4746 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/37690909-5abe-4d42-9b66-d1398879fd15-ovsdbserver-sb\") pod \"37690909-5abe-4d42-9b66-d1398879fd15\" (UID: \"37690909-5abe-4d42-9b66-d1398879fd15\") " Jan 28 20:59:20 crc kubenswrapper[4746]: I0128 20:59:20.724051 4746 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-5l8q4\" (UniqueName: \"kubernetes.io/projected/37690909-5abe-4d42-9b66-d1398879fd15-kube-api-access-5l8q4\") pod \"37690909-5abe-4d42-9b66-d1398879fd15\" (UID: \"37690909-5abe-4d42-9b66-d1398879fd15\") " Jan 28 20:59:20 crc kubenswrapper[4746]: I0128 20:59:20.724172 4746 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/37690909-5abe-4d42-9b66-d1398879fd15-dns-svc\") pod \"37690909-5abe-4d42-9b66-d1398879fd15\" (UID: \"37690909-5abe-4d42-9b66-d1398879fd15\") " Jan 28 20:59:20 crc kubenswrapper[4746]: I0128 20:59:20.724304 4746 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-nb\" 
(UniqueName: \"kubernetes.io/configmap/7fdad9a4-c388-49e0-aff0-b73c137b9862-ovsdbserver-nb\") pod \"7fdad9a4-c388-49e0-aff0-b73c137b9862\" (UID: \"7fdad9a4-c388-49e0-aff0-b73c137b9862\") " Jan 28 20:59:20 crc kubenswrapper[4746]: I0128 20:59:20.724447 4746 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/37690909-5abe-4d42-9b66-d1398879fd15-ovsdbserver-nb\") pod \"37690909-5abe-4d42-9b66-d1398879fd15\" (UID: \"37690909-5abe-4d42-9b66-d1398879fd15\") " Jan 28 20:59:20 crc kubenswrapper[4746]: I0128 20:59:20.724619 4746 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/7fdad9a4-c388-49e0-aff0-b73c137b9862-dns-svc\") pod \"7fdad9a4-c388-49e0-aff0-b73c137b9862\" (UID: \"7fdad9a4-c388-49e0-aff0-b73c137b9862\") " Jan 28 20:59:20 crc kubenswrapper[4746]: I0128 20:59:20.724699 4746 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/37690909-5abe-4d42-9b66-d1398879fd15-config\") pod \"37690909-5abe-4d42-9b66-d1398879fd15\" (UID: \"37690909-5abe-4d42-9b66-d1398879fd15\") " Jan 28 20:59:20 crc kubenswrapper[4746]: I0128 20:59:20.724789 4746 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-p25td\" (UniqueName: \"kubernetes.io/projected/7fdad9a4-c388-49e0-aff0-b73c137b9862-kube-api-access-p25td\") pod \"7fdad9a4-c388-49e0-aff0-b73c137b9862\" (UID: \"7fdad9a4-c388-49e0-aff0-b73c137b9862\") " Jan 28 20:59:20 crc kubenswrapper[4746]: I0128 20:59:20.725413 4746 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/8dd916f9-77ae-48e4-86f1-e224e2f1dd6c-operator-scripts\") on node \"crc\" DevicePath \"\"" Jan 28 20:59:20 crc kubenswrapper[4746]: I0128 20:59:20.725552 4746 reconciler_common.go:293] "Volume detached for volume 
\"kube-api-access-tmzps\" (UniqueName: \"kubernetes.io/projected/8dd916f9-77ae-48e4-86f1-e224e2f1dd6c-kube-api-access-tmzps\") on node \"crc\" DevicePath \"\"" Jan 28 20:59:20 crc kubenswrapper[4746]: I0128 20:59:20.733786 4746 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/7fdad9a4-c388-49e0-aff0-b73c137b9862-kube-api-access-p25td" (OuterVolumeSpecName: "kube-api-access-p25td") pod "7fdad9a4-c388-49e0-aff0-b73c137b9862" (UID: "7fdad9a4-c388-49e0-aff0-b73c137b9862"). InnerVolumeSpecName "kube-api-access-p25td". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 28 20:59:20 crc kubenswrapper[4746]: I0128 20:59:20.763634 4746 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/37690909-5abe-4d42-9b66-d1398879fd15-kube-api-access-5l8q4" (OuterVolumeSpecName: "kube-api-access-5l8q4") pod "37690909-5abe-4d42-9b66-d1398879fd15" (UID: "37690909-5abe-4d42-9b66-d1398879fd15"). InnerVolumeSpecName "kube-api-access-5l8q4". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 28 20:59:20 crc kubenswrapper[4746]: I0128 20:59:20.800785 4746 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/7fdad9a4-c388-49e0-aff0-b73c137b9862-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "7fdad9a4-c388-49e0-aff0-b73c137b9862" (UID: "7fdad9a4-c388-49e0-aff0-b73c137b9862"). InnerVolumeSpecName "dns-svc". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 28 20:59:20 crc kubenswrapper[4746]: I0128 20:59:20.813204 4746 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/7fdad9a4-c388-49e0-aff0-b73c137b9862-ovsdbserver-sb" (OuterVolumeSpecName: "ovsdbserver-sb") pod "7fdad9a4-c388-49e0-aff0-b73c137b9862" (UID: "7fdad9a4-c388-49e0-aff0-b73c137b9862"). InnerVolumeSpecName "ovsdbserver-sb". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 28 20:59:20 crc kubenswrapper[4746]: I0128 20:59:20.813550 4746 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/37690909-5abe-4d42-9b66-d1398879fd15-ovsdbserver-nb" (OuterVolumeSpecName: "ovsdbserver-nb") pod "37690909-5abe-4d42-9b66-d1398879fd15" (UID: "37690909-5abe-4d42-9b66-d1398879fd15"). InnerVolumeSpecName "ovsdbserver-nb". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 28 20:59:20 crc kubenswrapper[4746]: I0128 20:59:20.827239 4746 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-5l8q4\" (UniqueName: \"kubernetes.io/projected/37690909-5abe-4d42-9b66-d1398879fd15-kube-api-access-5l8q4\") on node \"crc\" DevicePath \"\"" Jan 28 20:59:20 crc kubenswrapper[4746]: I0128 20:59:20.827271 4746 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/37690909-5abe-4d42-9b66-d1398879fd15-ovsdbserver-nb\") on node \"crc\" DevicePath \"\"" Jan 28 20:59:20 crc kubenswrapper[4746]: I0128 20:59:20.827280 4746 reconciler_common.go:293] "Volume detached for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/7fdad9a4-c388-49e0-aff0-b73c137b9862-dns-svc\") on node \"crc\" DevicePath \"\"" Jan 28 20:59:20 crc kubenswrapper[4746]: I0128 20:59:20.827292 4746 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-p25td\" (UniqueName: \"kubernetes.io/projected/7fdad9a4-c388-49e0-aff0-b73c137b9862-kube-api-access-p25td\") on node \"crc\" DevicePath \"\"" Jan 28 20:59:20 crc kubenswrapper[4746]: I0128 20:59:20.827300 4746 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/7fdad9a4-c388-49e0-aff0-b73c137b9862-ovsdbserver-sb\") on node \"crc\" DevicePath \"\"" Jan 28 20:59:20 crc kubenswrapper[4746]: I0128 20:59:20.827952 4746 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume 
"kubernetes.io/configmap/37690909-5abe-4d42-9b66-d1398879fd15-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "37690909-5abe-4d42-9b66-d1398879fd15" (UID: "37690909-5abe-4d42-9b66-d1398879fd15"). InnerVolumeSpecName "dns-svc". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Jan 28 20:59:20 crc kubenswrapper[4746]: I0128 20:59:20.840769 4746 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/37690909-5abe-4d42-9b66-d1398879fd15-ovsdbserver-sb" (OuterVolumeSpecName: "ovsdbserver-sb") pod "37690909-5abe-4d42-9b66-d1398879fd15" (UID: "37690909-5abe-4d42-9b66-d1398879fd15"). InnerVolumeSpecName "ovsdbserver-sb". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Jan 28 20:59:20 crc kubenswrapper[4746]: I0128 20:59:20.860752 4746 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/7fdad9a4-c388-49e0-aff0-b73c137b9862-ovsdbserver-nb" (OuterVolumeSpecName: "ovsdbserver-nb") pod "7fdad9a4-c388-49e0-aff0-b73c137b9862" (UID: "7fdad9a4-c388-49e0-aff0-b73c137b9862"). InnerVolumeSpecName "ovsdbserver-nb". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Jan 28 20:59:20 crc kubenswrapper[4746]: I0128 20:59:20.871447 4746 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/37690909-5abe-4d42-9b66-d1398879fd15-config" (OuterVolumeSpecName: "config") pod "37690909-5abe-4d42-9b66-d1398879fd15" (UID: "37690909-5abe-4d42-9b66-d1398879fd15"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Jan 28 20:59:20 crc kubenswrapper[4746]: I0128 20:59:20.891017 4746 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/7fdad9a4-c388-49e0-aff0-b73c137b9862-config" (OuterVolumeSpecName: "config") pod "7fdad9a4-c388-49e0-aff0-b73c137b9862" (UID: "7fdad9a4-c388-49e0-aff0-b73c137b9862"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Jan 28 20:59:20 crc kubenswrapper[4746]: I0128 20:59:20.929153 4746 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-68dcc9cf6f-gb5kd"
Jan 28 20:59:20 crc kubenswrapper[4746]: I0128 20:59:20.936446 4746 reconciler_common.go:293] "Volume detached for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/37690909-5abe-4d42-9b66-d1398879fd15-dns-svc\") on node \"crc\" DevicePath \"\""
Jan 28 20:59:20 crc kubenswrapper[4746]: I0128 20:59:20.944445 4746 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/root-account-create-update-9v856"
Jan 28 20:59:20 crc kubenswrapper[4746]: I0128 20:59:20.950357 4746 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/7fdad9a4-c388-49e0-aff0-b73c137b9862-ovsdbserver-nb\") on node \"crc\" DevicePath \"\""
Jan 28 20:59:20 crc kubenswrapper[4746]: I0128 20:59:20.953047 4746 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/dnsmasq-dns-58dd9ff6bc-d9h8x"
Jan 28 20:59:20 crc kubenswrapper[4746]: I0128 20:59:20.974155 4746 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-58dd9ff6bc-d9h8x" event={"ID":"3710bcff-5321-46a6-8763-94e622eb38cb","Type":"ContainerStarted","Data":"5ba395585c8aa92dd55ae0dedeea33152ce0bd55c3762da0432ff57b87331f99"}
Jan 28 20:59:20 crc kubenswrapper[4746]: I0128 20:59:20.974394 4746 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-68dcc9cf6f-gb5kd" event={"ID":"37690909-5abe-4d42-9b66-d1398879fd15","Type":"ContainerDied","Data":"3347b8a79a5590c1bf26d7b426f40be0f93a24c0f1c38137027da8922c227562"}
Jan 28 20:59:20 crc kubenswrapper[4746]: I0128 20:59:20.974489 4746 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/root-account-create-update-9v856" event={"ID":"8dd916f9-77ae-48e4-86f1-e224e2f1dd6c","Type":"ContainerDied","Data":"176c61603675cd020e70d77d4e9bf83f66b5cae5b8a48a3dc463254ab3d4e678"}
Jan 28 20:59:20 crc kubenswrapper[4746]: I0128 20:59:20.974595 4746 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="176c61603675cd020e70d77d4e9bf83f66b5cae5b8a48a3dc463254ab3d4e678"
Jan 28 20:59:20 crc kubenswrapper[4746]: I0128 20:59:20.974681 4746 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/prometheus-metric-storage-0" event={"ID":"914308b3-0f5e-4716-bc87-948f8a8acfb3","Type":"ContainerStarted","Data":"ba078dc5f6914ae047b980b8906708e0fd399fdc4ce72cb545254c78421275d1"}
Jan 28 20:59:20 crc kubenswrapper[4746]: I0128 20:59:20.974738 4746 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-f877ddd87-rkvws" event={"ID":"7fdad9a4-c388-49e0-aff0-b73c137b9862","Type":"ContainerDied","Data":"11cae4f9758827f4893100cecf4e6b12063b85459ac28bf13a3a6bf58945a26f"}
Jan 28 20:59:20 crc kubenswrapper[4746]: I0128 20:59:20.974834 4746 scope.go:117] "RemoveContainer" containerID="0c3b27b4aa2aee796493bf7e6d91bdf07cba1cf8d7cb3b8b6b8fa10097aec927"
Jan 28 20:59:20 crc kubenswrapper[4746]: I0128 20:59:20.958234 4746 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-f877ddd87-rkvws"
Jan 28 20:59:20 crc kubenswrapper[4746]: I0128 20:59:20.994352 4746 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/37690909-5abe-4d42-9b66-d1398879fd15-config\") on node \"crc\" DevicePath \"\""
Jan 28 20:59:20 crc kubenswrapper[4746]: I0128 20:59:20.995695 4746 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/dnsmasq-dns-58dd9ff6bc-d9h8x" podStartSLOduration=3.995669453 podStartE2EDuration="3.995669453s" podCreationTimestamp="2026-01-28 20:59:17 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-28 20:59:20.949163693 +0000 UTC m=+1188.905350047" watchObservedRunningTime="2026-01-28 20:59:20.995669453 +0000 UTC m=+1188.951855807"
Jan 28 20:59:20 crc kubenswrapper[4746]: I0128 20:59:20.996626 4746 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/7fdad9a4-c388-49e0-aff0-b73c137b9862-config\") on node \"crc\" DevicePath \"\""
Jan 28 20:59:20 crc kubenswrapper[4746]: I0128 20:59:20.997183 4746 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/37690909-5abe-4d42-9b66-d1398879fd15-ovsdbserver-sb\") on node \"crc\" DevicePath \"\""
Jan 28 20:59:21 crc kubenswrapper[4746]: I0128 20:59:21.077030 4746 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-f877ddd87-rkvws"]
Jan 28 20:59:21 crc kubenswrapper[4746]: I0128 20:59:21.087508 4746 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-f877ddd87-rkvws"]
Jan 28 20:59:21 crc kubenswrapper[4746]: I0128 20:59:21.120342 4746 scope.go:117] "RemoveContainer" containerID="a8df888fe89039bb2762de6a50015ca572f586da3f2ac69040f99ffb51566dac"
Jan 28 20:59:21 crc kubenswrapper[4746]: I0128 20:59:21.153186 4746 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-68dcc9cf6f-gb5kd"]
Jan 28 20:59:21 crc kubenswrapper[4746]: I0128 20:59:21.180002 4746 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-68dcc9cf6f-gb5kd"]
Jan 28 20:59:21 crc kubenswrapper[4746]: I0128 20:59:21.747733 4746 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/glance-db-sync-dsp4s"
Jan 28 20:59:21 crc kubenswrapper[4746]: I0128 20:59:21.925885 4746 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/faabc487-475c-4f5b-b135-5a96d1ed9269-combined-ca-bundle\") pod \"faabc487-475c-4f5b-b135-5a96d1ed9269\" (UID: \"faabc487-475c-4f5b-b135-5a96d1ed9269\") "
Jan 28 20:59:21 crc kubenswrapper[4746]: I0128 20:59:21.925967 4746 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-r6qc5\" (UniqueName: \"kubernetes.io/projected/faabc487-475c-4f5b-b135-5a96d1ed9269-kube-api-access-r6qc5\") pod \"faabc487-475c-4f5b-b135-5a96d1ed9269\" (UID: \"faabc487-475c-4f5b-b135-5a96d1ed9269\") "
Jan 28 20:59:21 crc kubenswrapper[4746]: I0128 20:59:21.929191 4746 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/faabc487-475c-4f5b-b135-5a96d1ed9269-db-sync-config-data\") pod \"faabc487-475c-4f5b-b135-5a96d1ed9269\" (UID: \"faabc487-475c-4f5b-b135-5a96d1ed9269\") "
Jan 28 20:59:21 crc kubenswrapper[4746]: I0128 20:59:21.929226 4746 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/faabc487-475c-4f5b-b135-5a96d1ed9269-config-data\") pod \"faabc487-475c-4f5b-b135-5a96d1ed9269\" (UID: \"faabc487-475c-4f5b-b135-5a96d1ed9269\") "
Jan 28 20:59:21 crc kubenswrapper[4746]: I0128 20:59:21.935826 4746 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/faabc487-475c-4f5b-b135-5a96d1ed9269-db-sync-config-data" (OuterVolumeSpecName: "db-sync-config-data") pod "faabc487-475c-4f5b-b135-5a96d1ed9269" (UID: "faabc487-475c-4f5b-b135-5a96d1ed9269"). InnerVolumeSpecName "db-sync-config-data". PluginName "kubernetes.io/secret", VolumeGidValue ""
Jan 28 20:59:21 crc kubenswrapper[4746]: I0128 20:59:21.946914 4746 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/faabc487-475c-4f5b-b135-5a96d1ed9269-kube-api-access-r6qc5" (OuterVolumeSpecName: "kube-api-access-r6qc5") pod "faabc487-475c-4f5b-b135-5a96d1ed9269" (UID: "faabc487-475c-4f5b-b135-5a96d1ed9269"). InnerVolumeSpecName "kube-api-access-r6qc5". PluginName "kubernetes.io/projected", VolumeGidValue ""
Jan 28 20:59:21 crc kubenswrapper[4746]: I0128 20:59:21.979290 4746 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/faabc487-475c-4f5b-b135-5a96d1ed9269-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "faabc487-475c-4f5b-b135-5a96d1ed9269" (UID: "faabc487-475c-4f5b-b135-5a96d1ed9269"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue ""
Jan 28 20:59:21 crc kubenswrapper[4746]: I0128 20:59:21.986481 4746 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-db-sync-dsp4s" event={"ID":"faabc487-475c-4f5b-b135-5a96d1ed9269","Type":"ContainerDied","Data":"adcf78c9f27a696bc9f434c166d6cfc01dfee6269abbb5b91e2272e9575b3219"}
Jan 28 20:59:21 crc kubenswrapper[4746]: I0128 20:59:21.986519 4746 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="adcf78c9f27a696bc9f434c166d6cfc01dfee6269abbb5b91e2272e9575b3219"
Jan 28 20:59:21 crc kubenswrapper[4746]: I0128 20:59:21.986576 4746 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/glance-db-sync-dsp4s"
Jan 28 20:59:22 crc kubenswrapper[4746]: I0128 20:59:22.039713 4746 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/faabc487-475c-4f5b-b135-5a96d1ed9269-combined-ca-bundle\") on node \"crc\" DevicePath \"\""
Jan 28 20:59:22 crc kubenswrapper[4746]: I0128 20:59:22.039763 4746 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-r6qc5\" (UniqueName: \"kubernetes.io/projected/faabc487-475c-4f5b-b135-5a96d1ed9269-kube-api-access-r6qc5\") on node \"crc\" DevicePath \"\""
Jan 28 20:59:22 crc kubenswrapper[4746]: I0128 20:59:22.039776 4746 reconciler_common.go:293] "Volume detached for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/faabc487-475c-4f5b-b135-5a96d1ed9269-db-sync-config-data\") on node \"crc\" DevicePath \"\""
Jan 28 20:59:22 crc kubenswrapper[4746]: I0128 20:59:22.040387 4746 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/root-account-create-update-9v856"]
Jan 28 20:59:22 crc kubenswrapper[4746]: I0128 20:59:22.047302 4746 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/root-account-create-update-9v856"]
Jan 28 20:59:22 crc kubenswrapper[4746]: I0128 20:59:22.127580 4746 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/faabc487-475c-4f5b-b135-5a96d1ed9269-config-data" (OuterVolumeSpecName: "config-data") pod "faabc487-475c-4f5b-b135-5a96d1ed9269" (UID: "faabc487-475c-4f5b-b135-5a96d1ed9269"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue ""
Jan 28 20:59:22 crc kubenswrapper[4746]: I0128 20:59:22.142894 4746 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/faabc487-475c-4f5b-b135-5a96d1ed9269-config-data\") on node \"crc\" DevicePath \"\""
Jan 28 20:59:22 crc kubenswrapper[4746]: I0128 20:59:22.343626 4746 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-58dd9ff6bc-d9h8x"]
Jan 28 20:59:22 crc kubenswrapper[4746]: I0128 20:59:22.390553 4746 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-785d8bcb8c-kkq9p"]
Jan 28 20:59:22 crc kubenswrapper[4746]: E0128 20:59:22.390998 4746 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="faabc487-475c-4f5b-b135-5a96d1ed9269" containerName="glance-db-sync"
Jan 28 20:59:22 crc kubenswrapper[4746]: I0128 20:59:22.391014 4746 state_mem.go:107] "Deleted CPUSet assignment" podUID="faabc487-475c-4f5b-b135-5a96d1ed9269" containerName="glance-db-sync"
Jan 28 20:59:22 crc kubenswrapper[4746]: E0128 20:59:22.391029 4746 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="8dd916f9-77ae-48e4-86f1-e224e2f1dd6c" containerName="mariadb-account-create-update"
Jan 28 20:59:22 crc kubenswrapper[4746]: I0128 20:59:22.391036 4746 state_mem.go:107] "Deleted CPUSet assignment" podUID="8dd916f9-77ae-48e4-86f1-e224e2f1dd6c" containerName="mariadb-account-create-update"
Jan 28 20:59:22 crc kubenswrapper[4746]: E0128 20:59:22.391061 4746 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="37690909-5abe-4d42-9b66-d1398879fd15" containerName="init"
Jan 28 20:59:22 crc kubenswrapper[4746]: I0128 20:59:22.391068 4746 state_mem.go:107] "Deleted CPUSet assignment" podUID="37690909-5abe-4d42-9b66-d1398879fd15" containerName="init"
Jan 28 20:59:22 crc kubenswrapper[4746]: E0128 20:59:22.391095 4746 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="7fdad9a4-c388-49e0-aff0-b73c137b9862" containerName="init"
Jan 28 20:59:22 crc kubenswrapper[4746]: I0128 20:59:22.391101 4746 state_mem.go:107] "Deleted CPUSet assignment" podUID="7fdad9a4-c388-49e0-aff0-b73c137b9862" containerName="init"
Jan 28 20:59:22 crc kubenswrapper[4746]: I0128 20:59:22.391386 4746 memory_manager.go:354] "RemoveStaleState removing state" podUID="7fdad9a4-c388-49e0-aff0-b73c137b9862" containerName="init"
Jan 28 20:59:22 crc kubenswrapper[4746]: I0128 20:59:22.391398 4746 memory_manager.go:354] "RemoveStaleState removing state" podUID="8dd916f9-77ae-48e4-86f1-e224e2f1dd6c" containerName="mariadb-account-create-update"
Jan 28 20:59:22 crc kubenswrapper[4746]: I0128 20:59:22.391417 4746 memory_manager.go:354] "RemoveStaleState removing state" podUID="37690909-5abe-4d42-9b66-d1398879fd15" containerName="init"
Jan 28 20:59:22 crc kubenswrapper[4746]: I0128 20:59:22.391443 4746 memory_manager.go:354] "RemoveStaleState removing state" podUID="faabc487-475c-4f5b-b135-5a96d1ed9269" containerName="glance-db-sync"
Jan 28 20:59:22 crc kubenswrapper[4746]: I0128 20:59:22.394974 4746 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-785d8bcb8c-kkq9p"
Jan 28 20:59:22 crc kubenswrapper[4746]: I0128 20:59:22.421170 4746 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-785d8bcb8c-kkq9p"]
Jan 28 20:59:22 crc kubenswrapper[4746]: I0128 20:59:22.450482 4746 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/3c3fee24-35f9-4695-b25d-b49430708c43-ovsdbserver-sb\") pod \"dnsmasq-dns-785d8bcb8c-kkq9p\" (UID: \"3c3fee24-35f9-4695-b25d-b49430708c43\") " pod="openstack/dnsmasq-dns-785d8bcb8c-kkq9p"
Jan 28 20:59:22 crc kubenswrapper[4746]: I0128 20:59:22.463576 4746 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/3c3fee24-35f9-4695-b25d-b49430708c43-dns-swift-storage-0\") pod \"dnsmasq-dns-785d8bcb8c-kkq9p\" (UID: \"3c3fee24-35f9-4695-b25d-b49430708c43\") " pod="openstack/dnsmasq-dns-785d8bcb8c-kkq9p"
Jan 28 20:59:22 crc kubenswrapper[4746]: I0128 20:59:22.464019 4746 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/3c3fee24-35f9-4695-b25d-b49430708c43-dns-svc\") pod \"dnsmasq-dns-785d8bcb8c-kkq9p\" (UID: \"3c3fee24-35f9-4695-b25d-b49430708c43\") " pod="openstack/dnsmasq-dns-785d8bcb8c-kkq9p"
Jan 28 20:59:22 crc kubenswrapper[4746]: I0128 20:59:22.464192 4746 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-sggwc\" (UniqueName: \"kubernetes.io/projected/3c3fee24-35f9-4695-b25d-b49430708c43-kube-api-access-sggwc\") pod \"dnsmasq-dns-785d8bcb8c-kkq9p\" (UID: \"3c3fee24-35f9-4695-b25d-b49430708c43\") " pod="openstack/dnsmasq-dns-785d8bcb8c-kkq9p"
Jan 28 20:59:22 crc kubenswrapper[4746]: I0128 20:59:22.464399 4746 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/3c3fee24-35f9-4695-b25d-b49430708c43-ovsdbserver-nb\") pod \"dnsmasq-dns-785d8bcb8c-kkq9p\" (UID: \"3c3fee24-35f9-4695-b25d-b49430708c43\") " pod="openstack/dnsmasq-dns-785d8bcb8c-kkq9p"
Jan 28 20:59:22 crc kubenswrapper[4746]: I0128 20:59:22.464827 4746 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/3c3fee24-35f9-4695-b25d-b49430708c43-config\") pod \"dnsmasq-dns-785d8bcb8c-kkq9p\" (UID: \"3c3fee24-35f9-4695-b25d-b49430708c43\") " pod="openstack/dnsmasq-dns-785d8bcb8c-kkq9p"
Jan 28 20:59:22 crc kubenswrapper[4746]: I0128 20:59:22.566830 4746 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/3c3fee24-35f9-4695-b25d-b49430708c43-dns-svc\") pod \"dnsmasq-dns-785d8bcb8c-kkq9p\" (UID: \"3c3fee24-35f9-4695-b25d-b49430708c43\") " pod="openstack/dnsmasq-dns-785d8bcb8c-kkq9p"
Jan 28 20:59:22 crc kubenswrapper[4746]: I0128 20:59:22.567351 4746 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-sggwc\" (UniqueName: \"kubernetes.io/projected/3c3fee24-35f9-4695-b25d-b49430708c43-kube-api-access-sggwc\") pod \"dnsmasq-dns-785d8bcb8c-kkq9p\" (UID: \"3c3fee24-35f9-4695-b25d-b49430708c43\") " pod="openstack/dnsmasq-dns-785d8bcb8c-kkq9p"
Jan 28 20:59:22 crc kubenswrapper[4746]: I0128 20:59:22.567510 4746 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/3c3fee24-35f9-4695-b25d-b49430708c43-ovsdbserver-nb\") pod \"dnsmasq-dns-785d8bcb8c-kkq9p\" (UID: \"3c3fee24-35f9-4695-b25d-b49430708c43\") " pod="openstack/dnsmasq-dns-785d8bcb8c-kkq9p"
Jan 28 20:59:22 crc kubenswrapper[4746]: I0128 20:59:22.567744 4746 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/3c3fee24-35f9-4695-b25d-b49430708c43-config\") pod \"dnsmasq-dns-785d8bcb8c-kkq9p\" (UID: \"3c3fee24-35f9-4695-b25d-b49430708c43\") " pod="openstack/dnsmasq-dns-785d8bcb8c-kkq9p"
Jan 28 20:59:22 crc kubenswrapper[4746]: I0128 20:59:22.567913 4746 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/3c3fee24-35f9-4695-b25d-b49430708c43-ovsdbserver-sb\") pod \"dnsmasq-dns-785d8bcb8c-kkq9p\" (UID: \"3c3fee24-35f9-4695-b25d-b49430708c43\") " pod="openstack/dnsmasq-dns-785d8bcb8c-kkq9p"
Jan 28 20:59:22 crc kubenswrapper[4746]: I0128 20:59:22.568031 4746 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/3c3fee24-35f9-4695-b25d-b49430708c43-dns-swift-storage-0\") pod \"dnsmasq-dns-785d8bcb8c-kkq9p\" (UID: \"3c3fee24-35f9-4695-b25d-b49430708c43\") " pod="openstack/dnsmasq-dns-785d8bcb8c-kkq9p"
Jan 28 20:59:22 crc kubenswrapper[4746]: I0128 20:59:22.569235 4746 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/3c3fee24-35f9-4695-b25d-b49430708c43-dns-swift-storage-0\") pod \"dnsmasq-dns-785d8bcb8c-kkq9p\" (UID: \"3c3fee24-35f9-4695-b25d-b49430708c43\") " pod="openstack/dnsmasq-dns-785d8bcb8c-kkq9p"
Jan 28 20:59:22 crc kubenswrapper[4746]: I0128 20:59:22.569325 4746 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/3c3fee24-35f9-4695-b25d-b49430708c43-ovsdbserver-sb\") pod \"dnsmasq-dns-785d8bcb8c-kkq9p\" (UID: \"3c3fee24-35f9-4695-b25d-b49430708c43\") " pod="openstack/dnsmasq-dns-785d8bcb8c-kkq9p"
Jan 28 20:59:22 crc kubenswrapper[4746]: I0128 20:59:22.569450 4746 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/3c3fee24-35f9-4695-b25d-b49430708c43-ovsdbserver-nb\") pod \"dnsmasq-dns-785d8bcb8c-kkq9p\" (UID: \"3c3fee24-35f9-4695-b25d-b49430708c43\") " pod="openstack/dnsmasq-dns-785d8bcb8c-kkq9p"
Jan 28 20:59:22 crc kubenswrapper[4746]: I0128 20:59:22.570683 4746 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/3c3fee24-35f9-4695-b25d-b49430708c43-dns-svc\") pod \"dnsmasq-dns-785d8bcb8c-kkq9p\" (UID: \"3c3fee24-35f9-4695-b25d-b49430708c43\") " pod="openstack/dnsmasq-dns-785d8bcb8c-kkq9p"
Jan 28 20:59:22 crc kubenswrapper[4746]: I0128 20:59:22.571266 4746 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/3c3fee24-35f9-4695-b25d-b49430708c43-config\") pod \"dnsmasq-dns-785d8bcb8c-kkq9p\" (UID: \"3c3fee24-35f9-4695-b25d-b49430708c43\") " pod="openstack/dnsmasq-dns-785d8bcb8c-kkq9p"
Jan 28 20:59:22 crc kubenswrapper[4746]: I0128 20:59:22.593275 4746 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-sggwc\" (UniqueName: \"kubernetes.io/projected/3c3fee24-35f9-4695-b25d-b49430708c43-kube-api-access-sggwc\") pod \"dnsmasq-dns-785d8bcb8c-kkq9p\" (UID: \"3c3fee24-35f9-4695-b25d-b49430708c43\") " pod="openstack/dnsmasq-dns-785d8bcb8c-kkq9p"
Jan 28 20:59:22 crc kubenswrapper[4746]: I0128 20:59:22.735650 4746 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-785d8bcb8c-kkq9p"
Jan 28 20:59:22 crc kubenswrapper[4746]: I0128 20:59:22.884212 4746 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="37690909-5abe-4d42-9b66-d1398879fd15" path="/var/lib/kubelet/pods/37690909-5abe-4d42-9b66-d1398879fd15/volumes"
Jan 28 20:59:22 crc kubenswrapper[4746]: I0128 20:59:22.884933 4746 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="7fdad9a4-c388-49e0-aff0-b73c137b9862" path="/var/lib/kubelet/pods/7fdad9a4-c388-49e0-aff0-b73c137b9862/volumes"
Jan 28 20:59:22 crc kubenswrapper[4746]: I0128 20:59:22.885568 4746 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="8dd916f9-77ae-48e4-86f1-e224e2f1dd6c" path="/var/lib/kubelet/pods/8dd916f9-77ae-48e4-86f1-e224e2f1dd6c/volumes"
Jan 28 20:59:23 crc kubenswrapper[4746]: I0128 20:59:23.005298 4746 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/dnsmasq-dns-58dd9ff6bc-d9h8x" podUID="3710bcff-5321-46a6-8763-94e622eb38cb" containerName="dnsmasq-dns" containerID="cri-o://5ba395585c8aa92dd55ae0dedeea33152ce0bd55c3762da0432ff57b87331f99" gracePeriod=10
Jan 28 20:59:23 crc kubenswrapper[4746]: I0128 20:59:23.218204 4746 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/glance-default-external-api-0"]
Jan 28 20:59:23 crc kubenswrapper[4746]: I0128 20:59:23.220160 4746 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/glance-default-external-api-0"
Jan 28 20:59:23 crc kubenswrapper[4746]: I0128 20:59:23.223835 4746 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"glance-glance-dockercfg-sx27z"
Jan 28 20:59:23 crc kubenswrapper[4746]: I0128 20:59:23.224008 4746 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"glance-scripts"
Jan 28 20:59:23 crc kubenswrapper[4746]: I0128 20:59:23.224655 4746 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"glance-default-external-config-data"
Jan 28 20:59:23 crc kubenswrapper[4746]: I0128 20:59:23.233130 4746 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-default-external-api-0"]
Jan 28 20:59:23 crc kubenswrapper[4746]: I0128 20:59:23.293553 4746 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"pvc-95469877-e687-4d8b-97fe-080814cf4383\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-95469877-e687-4d8b-97fe-080814cf4383\") pod \"glance-default-external-api-0\" (UID: \"be7f3403-801e-45de-9517-8b3ca91d9682\") " pod="openstack/glance-default-external-api-0"
Jan 28 20:59:23 crc kubenswrapper[4746]: I0128 20:59:23.293903 4746 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/be7f3403-801e-45de-9517-8b3ca91d9682-config-data\") pod \"glance-default-external-api-0\" (UID: \"be7f3403-801e-45de-9517-8b3ca91d9682\") " pod="openstack/glance-default-external-api-0"
Jan 28 20:59:23 crc kubenswrapper[4746]: I0128 20:59:23.293940 4746 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/be7f3403-801e-45de-9517-8b3ca91d9682-httpd-run\") pod \"glance-default-external-api-0\" (UID: \"be7f3403-801e-45de-9517-8b3ca91d9682\") " pod="openstack/glance-default-external-api-0"
Jan 28 20:59:23 crc kubenswrapper[4746]: I0128 20:59:23.293973 4746 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/be7f3403-801e-45de-9517-8b3ca91d9682-combined-ca-bundle\") pod \"glance-default-external-api-0\" (UID: \"be7f3403-801e-45de-9517-8b3ca91d9682\") " pod="openstack/glance-default-external-api-0"
Jan 28 20:59:23 crc kubenswrapper[4746]: I0128 20:59:23.294042 4746 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/be7f3403-801e-45de-9517-8b3ca91d9682-scripts\") pod \"glance-default-external-api-0\" (UID: \"be7f3403-801e-45de-9517-8b3ca91d9682\") " pod="openstack/glance-default-external-api-0"
Jan 28 20:59:23 crc kubenswrapper[4746]: I0128 20:59:23.294099 4746 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/be7f3403-801e-45de-9517-8b3ca91d9682-logs\") pod \"glance-default-external-api-0\" (UID: \"be7f3403-801e-45de-9517-8b3ca91d9682\") " pod="openstack/glance-default-external-api-0"
Jan 28 20:59:23 crc kubenswrapper[4746]: I0128 20:59:23.294119 4746 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-shg2k\" (UniqueName: \"kubernetes.io/projected/be7f3403-801e-45de-9517-8b3ca91d9682-kube-api-access-shg2k\") pod \"glance-default-external-api-0\" (UID: \"be7f3403-801e-45de-9517-8b3ca91d9682\") " pod="openstack/glance-default-external-api-0"
Jan 28 20:59:23 crc kubenswrapper[4746]: I0128 20:59:23.344306 4746 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-785d8bcb8c-kkq9p"]
Jan 28 20:59:23 crc kubenswrapper[4746]: I0128 20:59:23.397249 4746 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/be7f3403-801e-45de-9517-8b3ca91d9682-logs\") pod \"glance-default-external-api-0\" (UID: \"be7f3403-801e-45de-9517-8b3ca91d9682\") " pod="openstack/glance-default-external-api-0"
Jan 28 20:59:23 crc kubenswrapper[4746]: I0128 20:59:23.397301 4746 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-shg2k\" (UniqueName: \"kubernetes.io/projected/be7f3403-801e-45de-9517-8b3ca91d9682-kube-api-access-shg2k\") pod \"glance-default-external-api-0\" (UID: \"be7f3403-801e-45de-9517-8b3ca91d9682\") " pod="openstack/glance-default-external-api-0"
Jan 28 20:59:23 crc kubenswrapper[4746]: I0128 20:59:23.397367 4746 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-95469877-e687-4d8b-97fe-080814cf4383\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-95469877-e687-4d8b-97fe-080814cf4383\") pod \"glance-default-external-api-0\" (UID: \"be7f3403-801e-45de-9517-8b3ca91d9682\") " pod="openstack/glance-default-external-api-0"
Jan 28 20:59:23 crc kubenswrapper[4746]: I0128 20:59:23.397411 4746 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/be7f3403-801e-45de-9517-8b3ca91d9682-config-data\") pod \"glance-default-external-api-0\" (UID: \"be7f3403-801e-45de-9517-8b3ca91d9682\") " pod="openstack/glance-default-external-api-0"
Jan 28 20:59:23 crc kubenswrapper[4746]: I0128 20:59:23.397439 4746 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/be7f3403-801e-45de-9517-8b3ca91d9682-httpd-run\") pod \"glance-default-external-api-0\" (UID: \"be7f3403-801e-45de-9517-8b3ca91d9682\") " pod="openstack/glance-default-external-api-0"
Jan 28 20:59:23 crc kubenswrapper[4746]: I0128 20:59:23.397466 4746 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/be7f3403-801e-45de-9517-8b3ca91d9682-combined-ca-bundle\") pod \"glance-default-external-api-0\" (UID: \"be7f3403-801e-45de-9517-8b3ca91d9682\") " pod="openstack/glance-default-external-api-0"
Jan 28 20:59:23 crc kubenswrapper[4746]: I0128 20:59:23.397520 4746 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/be7f3403-801e-45de-9517-8b3ca91d9682-scripts\") pod \"glance-default-external-api-0\" (UID: \"be7f3403-801e-45de-9517-8b3ca91d9682\") " pod="openstack/glance-default-external-api-0"
Jan 28 20:59:23 crc kubenswrapper[4746]: I0128 20:59:23.400073 4746 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/be7f3403-801e-45de-9517-8b3ca91d9682-logs\") pod \"glance-default-external-api-0\" (UID: \"be7f3403-801e-45de-9517-8b3ca91d9682\") " pod="openstack/glance-default-external-api-0"
Jan 28 20:59:23 crc kubenswrapper[4746]: I0128 20:59:23.401764 4746 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/be7f3403-801e-45de-9517-8b3ca91d9682-httpd-run\") pod \"glance-default-external-api-0\" (UID: \"be7f3403-801e-45de-9517-8b3ca91d9682\") " pod="openstack/glance-default-external-api-0"
Jan 28 20:59:23 crc kubenswrapper[4746]: I0128 20:59:23.406669 4746 csi_attacher.go:380] kubernetes.io/csi: attacher.MountDevice STAGE_UNSTAGE_VOLUME capability not set. Skipping MountDevice...
Jan 28 20:59:23 crc kubenswrapper[4746]: I0128 20:59:23.406719 4746 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"pvc-95469877-e687-4d8b-97fe-080814cf4383\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-95469877-e687-4d8b-97fe-080814cf4383\") pod \"glance-default-external-api-0\" (UID: \"be7f3403-801e-45de-9517-8b3ca91d9682\") device mount path \"/var/lib/kubelet/plugins/kubernetes.io/csi/kubevirt.io.hostpath-provisioner/ca8c336a9061554abfcbd88e82c904ba958cd0f903c4270744870a313861497c/globalmount\"" pod="openstack/glance-default-external-api-0"
Jan 28 20:59:23 crc kubenswrapper[4746]: I0128 20:59:23.409404 4746 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/be7f3403-801e-45de-9517-8b3ca91d9682-scripts\") pod \"glance-default-external-api-0\" (UID: \"be7f3403-801e-45de-9517-8b3ca91d9682\") " pod="openstack/glance-default-external-api-0"
Jan 28 20:59:23 crc kubenswrapper[4746]: I0128 20:59:23.411205 4746 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/be7f3403-801e-45de-9517-8b3ca91d9682-combined-ca-bundle\") pod \"glance-default-external-api-0\" (UID: \"be7f3403-801e-45de-9517-8b3ca91d9682\") " pod="openstack/glance-default-external-api-0"
Jan 28 20:59:23 crc kubenswrapper[4746]: I0128 20:59:23.413188 4746 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/be7f3403-801e-45de-9517-8b3ca91d9682-config-data\") pod \"glance-default-external-api-0\" (UID: \"be7f3403-801e-45de-9517-8b3ca91d9682\") " pod="openstack/glance-default-external-api-0"
Jan 28 20:59:23 crc kubenswrapper[4746]: I0128 20:59:23.427783 4746 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-shg2k\" (UniqueName: \"kubernetes.io/projected/be7f3403-801e-45de-9517-8b3ca91d9682-kube-api-access-shg2k\") pod \"glance-default-external-api-0\" (UID: \"be7f3403-801e-45de-9517-8b3ca91d9682\") " pod="openstack/glance-default-external-api-0"
Jan 28 20:59:23 crc kubenswrapper[4746]: I0128 20:59:23.491430 4746 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"pvc-95469877-e687-4d8b-97fe-080814cf4383\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-95469877-e687-4d8b-97fe-080814cf4383\") pod \"glance-default-external-api-0\" (UID: \"be7f3403-801e-45de-9517-8b3ca91d9682\") " pod="openstack/glance-default-external-api-0"
Jan 28 20:59:23 crc kubenswrapper[4746]: I0128 20:59:23.497850 4746 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/glance-default-internal-api-0"]
Jan 28 20:59:23 crc kubenswrapper[4746]: I0128 20:59:23.499806 4746 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/glance-default-internal-api-0"
Jan 28 20:59:23 crc kubenswrapper[4746]: I0128 20:59:23.502872 4746 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"glance-default-internal-config-data"
Jan 28 20:59:23 crc kubenswrapper[4746]: I0128 20:59:23.513125 4746 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-default-internal-api-0"]
Jan 28 20:59:23 crc kubenswrapper[4746]: I0128 20:59:23.543499 4746 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/glance-default-external-api-0"
Jan 28 20:59:23 crc kubenswrapper[4746]: I0128 20:59:23.602879 4746 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e3a2491c-4637-4cba-a11d-7c8dd8c703d1-combined-ca-bundle\") pod \"glance-default-internal-api-0\" (UID: \"e3a2491c-4637-4cba-a11d-7c8dd8c703d1\") " pod="openstack/glance-default-internal-api-0"
Jan 28 20:59:23 crc kubenswrapper[4746]: I0128 20:59:23.602929 4746 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/e3a2491c-4637-4cba-a11d-7c8dd8c703d1-scripts\") pod \"glance-default-internal-api-0\" (UID: \"e3a2491c-4637-4cba-a11d-7c8dd8c703d1\") " pod="openstack/glance-default-internal-api-0"
Jan 28 20:59:23 crc kubenswrapper[4746]: I0128 20:59:23.603025 4746 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/e3a2491c-4637-4cba-a11d-7c8dd8c703d1-httpd-run\") pod \"glance-default-internal-api-0\" (UID: \"e3a2491c-4637-4cba-a11d-7c8dd8c703d1\") " pod="openstack/glance-default-internal-api-0"
Jan 28 20:59:23 crc kubenswrapper[4746]: I0128 20:59:23.603107 4746 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"pvc-d0cac59c-8041-46c0-a355-d5d8d1738a5d\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-d0cac59c-8041-46c0-a355-d5d8d1738a5d\") pod \"glance-default-internal-api-0\" (UID: \"e3a2491c-4637-4cba-a11d-7c8dd8c703d1\") " pod="openstack/glance-default-internal-api-0"
Jan 28 20:59:23 crc kubenswrapper[4746]: I0128 20:59:23.603175 4746 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/e3a2491c-4637-4cba-a11d-7c8dd8c703d1-config-data\") pod \"glance-default-internal-api-0\" (UID: \"e3a2491c-4637-4cba-a11d-7c8dd8c703d1\") " pod="openstack/glance-default-internal-api-0"
Jan 28 20:59:23 crc kubenswrapper[4746]: I0128 20:59:23.603232 4746 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-6wzsg\" (UniqueName: \"kubernetes.io/projected/e3a2491c-4637-4cba-a11d-7c8dd8c703d1-kube-api-access-6wzsg\") pod \"glance-default-internal-api-0\" (UID: \"e3a2491c-4637-4cba-a11d-7c8dd8c703d1\") " pod="openstack/glance-default-internal-api-0"
Jan 28 20:59:23 crc kubenswrapper[4746]: I0128 20:59:23.603258 4746 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/e3a2491c-4637-4cba-a11d-7c8dd8c703d1-logs\") pod \"glance-default-internal-api-0\" (UID: \"e3a2491c-4637-4cba-a11d-7c8dd8c703d1\") " pod="openstack/glance-default-internal-api-0"
Jan 28 20:59:23 crc kubenswrapper[4746]: I0128 20:59:23.705000 4746 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-6wzsg\" (UniqueName: \"kubernetes.io/projected/e3a2491c-4637-4cba-a11d-7c8dd8c703d1-kube-api-access-6wzsg\") pod \"glance-default-internal-api-0\" (UID: \"e3a2491c-4637-4cba-a11d-7c8dd8c703d1\") " pod="openstack/glance-default-internal-api-0"
Jan 28 20:59:23 crc kubenswrapper[4746]: I0128 20:59:23.705060 4746 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/e3a2491c-4637-4cba-a11d-7c8dd8c703d1-logs\") pod \"glance-default-internal-api-0\" (UID: \"e3a2491c-4637-4cba-a11d-7c8dd8c703d1\") " pod="openstack/glance-default-internal-api-0"
Jan 28 20:59:23 crc kubenswrapper[4746]: I0128 20:59:23.705124 4746 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e3a2491c-4637-4cba-a11d-7c8dd8c703d1-combined-ca-bundle\") pod \"glance-default-internal-api-0\" (UID: \"e3a2491c-4637-4cba-a11d-7c8dd8c703d1\") " pod="openstack/glance-default-internal-api-0"
Jan 28 20:59:23 crc kubenswrapper[4746]: I0128 20:59:23.705152 4746 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/e3a2491c-4637-4cba-a11d-7c8dd8c703d1-scripts\") pod \"glance-default-internal-api-0\" (UID: \"e3a2491c-4637-4cba-a11d-7c8dd8c703d1\") " pod="openstack/glance-default-internal-api-0"
Jan 28 20:59:23 crc kubenswrapper[4746]: I0128 20:59:23.705240 4746 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/e3a2491c-4637-4cba-a11d-7c8dd8c703d1-httpd-run\") pod \"glance-default-internal-api-0\" (UID: \"e3a2491c-4637-4cba-a11d-7c8dd8c703d1\") " pod="openstack/glance-default-internal-api-0"
Jan 28 20:59:23 crc kubenswrapper[4746]: I0128 20:59:23.705304 4746 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-d0cac59c-8041-46c0-a355-d5d8d1738a5d\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-d0cac59c-8041-46c0-a355-d5d8d1738a5d\") pod \"glance-default-internal-api-0\" (UID: \"e3a2491c-4637-4cba-a11d-7c8dd8c703d1\") " pod="openstack/glance-default-internal-api-0"
Jan 28 20:59:23 crc kubenswrapper[4746]: I0128 20:59:23.705368 4746 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/e3a2491c-4637-4cba-a11d-7c8dd8c703d1-config-data\") pod \"glance-default-internal-api-0\" (UID: \"e3a2491c-4637-4cba-a11d-7c8dd8c703d1\") " pod="openstack/glance-default-internal-api-0"
Jan 28 20:59:23 crc kubenswrapper[4746]: I0128 20:59:23.708174 4746 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/e3a2491c-4637-4cba-a11d-7c8dd8c703d1-logs\") pod \"glance-default-internal-api-0\" (UID:
\"e3a2491c-4637-4cba-a11d-7c8dd8c703d1\") " pod="openstack/glance-default-internal-api-0" Jan 28 20:59:23 crc kubenswrapper[4746]: I0128 20:59:23.708362 4746 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/e3a2491c-4637-4cba-a11d-7c8dd8c703d1-httpd-run\") pod \"glance-default-internal-api-0\" (UID: \"e3a2491c-4637-4cba-a11d-7c8dd8c703d1\") " pod="openstack/glance-default-internal-api-0" Jan 28 20:59:23 crc kubenswrapper[4746]: I0128 20:59:23.712139 4746 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/e3a2491c-4637-4cba-a11d-7c8dd8c703d1-scripts\") pod \"glance-default-internal-api-0\" (UID: \"e3a2491c-4637-4cba-a11d-7c8dd8c703d1\") " pod="openstack/glance-default-internal-api-0" Jan 28 20:59:23 crc kubenswrapper[4746]: I0128 20:59:23.712709 4746 csi_attacher.go:380] kubernetes.io/csi: attacher.MountDevice STAGE_UNSTAGE_VOLUME capability not set. Skipping MountDevice... 
Jan 28 20:59:23 crc kubenswrapper[4746]: I0128 20:59:23.712750 4746 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"pvc-d0cac59c-8041-46c0-a355-d5d8d1738a5d\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-d0cac59c-8041-46c0-a355-d5d8d1738a5d\") pod \"glance-default-internal-api-0\" (UID: \"e3a2491c-4637-4cba-a11d-7c8dd8c703d1\") device mount path \"/var/lib/kubelet/plugins/kubernetes.io/csi/kubevirt.io.hostpath-provisioner/04ac8a11379bea80fa995f6a81e21b4d865afcddeb1b26b6c4ebff6fe431a0b6/globalmount\"" pod="openstack/glance-default-internal-api-0" Jan 28 20:59:23 crc kubenswrapper[4746]: I0128 20:59:23.717794 4746 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/e3a2491c-4637-4cba-a11d-7c8dd8c703d1-config-data\") pod \"glance-default-internal-api-0\" (UID: \"e3a2491c-4637-4cba-a11d-7c8dd8c703d1\") " pod="openstack/glance-default-internal-api-0" Jan 28 20:59:23 crc kubenswrapper[4746]: I0128 20:59:23.719382 4746 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e3a2491c-4637-4cba-a11d-7c8dd8c703d1-combined-ca-bundle\") pod \"glance-default-internal-api-0\" (UID: \"e3a2491c-4637-4cba-a11d-7c8dd8c703d1\") " pod="openstack/glance-default-internal-api-0" Jan 28 20:59:23 crc kubenswrapper[4746]: I0128 20:59:23.734343 4746 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-6wzsg\" (UniqueName: \"kubernetes.io/projected/e3a2491c-4637-4cba-a11d-7c8dd8c703d1-kube-api-access-6wzsg\") pod \"glance-default-internal-api-0\" (UID: \"e3a2491c-4637-4cba-a11d-7c8dd8c703d1\") " pod="openstack/glance-default-internal-api-0" Jan 28 20:59:23 crc kubenswrapper[4746]: I0128 20:59:23.777765 4746 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"pvc-d0cac59c-8041-46c0-a355-d5d8d1738a5d\" (UniqueName: 
\"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-d0cac59c-8041-46c0-a355-d5d8d1738a5d\") pod \"glance-default-internal-api-0\" (UID: \"e3a2491c-4637-4cba-a11d-7c8dd8c703d1\") " pod="openstack/glance-default-internal-api-0" Jan 28 20:59:23 crc kubenswrapper[4746]: I0128 20:59:23.892819 4746 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/glance-default-internal-api-0" Jan 28 20:59:24 crc kubenswrapper[4746]: I0128 20:59:24.057353 4746 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/prometheus-metric-storage-0" event={"ID":"914308b3-0f5e-4716-bc87-948f8a8acfb3","Type":"ContainerStarted","Data":"0bf8e97e8ef5f68f228708d889bdabb04e7c4e8cb53401b32f6fe443152976f5"} Jan 28 20:59:24 crc kubenswrapper[4746]: I0128 20:59:24.062233 4746 generic.go:334] "Generic (PLEG): container finished" podID="3710bcff-5321-46a6-8763-94e622eb38cb" containerID="5ba395585c8aa92dd55ae0dedeea33152ce0bd55c3762da0432ff57b87331f99" exitCode=0 Jan 28 20:59:24 crc kubenswrapper[4746]: I0128 20:59:24.062354 4746 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-58dd9ff6bc-d9h8x" event={"ID":"3710bcff-5321-46a6-8763-94e622eb38cb","Type":"ContainerDied","Data":"5ba395585c8aa92dd55ae0dedeea33152ce0bd55c3762da0432ff57b87331f99"} Jan 28 20:59:24 crc kubenswrapper[4746]: I0128 20:59:24.068458 4746 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-785d8bcb8c-kkq9p" event={"ID":"3c3fee24-35f9-4695-b25d-b49430708c43","Type":"ContainerStarted","Data":"9ea3cec0589a36c1cb235bb18def62f3de008784dfaf83103c8a03b7d88cb517"} Jan 28 20:59:25 crc kubenswrapper[4746]: I0128 20:59:25.081712 4746 generic.go:334] "Generic (PLEG): container finished" podID="d575464e-96b9-46bf-9ae9-37e25dafb223" containerID="029111cbc3349dbbb9ed934ead4096eddb944845f11de68a6c2f8df6bb8f1d66" exitCode=0 Jan 28 20:59:25 crc kubenswrapper[4746]: I0128 20:59:25.081787 4746 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openstack/keystone-bootstrap-jhdzr" event={"ID":"d575464e-96b9-46bf-9ae9-37e25dafb223","Type":"ContainerDied","Data":"029111cbc3349dbbb9ed934ead4096eddb944845f11de68a6c2f8df6bb8f1d66"} Jan 28 20:59:25 crc kubenswrapper[4746]: I0128 20:59:25.572974 4746 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/root-account-create-update-msxbp"] Jan 28 20:59:25 crc kubenswrapper[4746]: I0128 20:59:25.574889 4746 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/root-account-create-update-msxbp" Jan 28 20:59:25 crc kubenswrapper[4746]: I0128 20:59:25.577393 4746 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-mariadb-root-db-secret" Jan 28 20:59:25 crc kubenswrapper[4746]: I0128 20:59:25.589002 4746 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/root-account-create-update-msxbp"] Jan 28 20:59:25 crc kubenswrapper[4746]: I0128 20:59:25.743774 4746 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-qcdrb\" (UniqueName: \"kubernetes.io/projected/854c60ea-889a-449a-a74b-39f6f973f52c-kube-api-access-qcdrb\") pod \"root-account-create-update-msxbp\" (UID: \"854c60ea-889a-449a-a74b-39f6f973f52c\") " pod="openstack/root-account-create-update-msxbp" Jan 28 20:59:25 crc kubenswrapper[4746]: I0128 20:59:25.743864 4746 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/854c60ea-889a-449a-a74b-39f6f973f52c-operator-scripts\") pod \"root-account-create-update-msxbp\" (UID: \"854c60ea-889a-449a-a74b-39f6f973f52c\") " pod="openstack/root-account-create-update-msxbp" Jan 28 20:59:25 crc kubenswrapper[4746]: I0128 20:59:25.845292 4746 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-qcdrb\" (UniqueName: 
\"kubernetes.io/projected/854c60ea-889a-449a-a74b-39f6f973f52c-kube-api-access-qcdrb\") pod \"root-account-create-update-msxbp\" (UID: \"854c60ea-889a-449a-a74b-39f6f973f52c\") " pod="openstack/root-account-create-update-msxbp" Jan 28 20:59:25 crc kubenswrapper[4746]: I0128 20:59:25.845387 4746 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/854c60ea-889a-449a-a74b-39f6f973f52c-operator-scripts\") pod \"root-account-create-update-msxbp\" (UID: \"854c60ea-889a-449a-a74b-39f6f973f52c\") " pod="openstack/root-account-create-update-msxbp" Jan 28 20:59:25 crc kubenswrapper[4746]: I0128 20:59:25.846288 4746 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/854c60ea-889a-449a-a74b-39f6f973f52c-operator-scripts\") pod \"root-account-create-update-msxbp\" (UID: \"854c60ea-889a-449a-a74b-39f6f973f52c\") " pod="openstack/root-account-create-update-msxbp" Jan 28 20:59:25 crc kubenswrapper[4746]: I0128 20:59:25.869109 4746 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-qcdrb\" (UniqueName: \"kubernetes.io/projected/854c60ea-889a-449a-a74b-39f6f973f52c-kube-api-access-qcdrb\") pod \"root-account-create-update-msxbp\" (UID: \"854c60ea-889a-449a-a74b-39f6f973f52c\") " pod="openstack/root-account-create-update-msxbp" Jan 28 20:59:25 crc kubenswrapper[4746]: I0128 20:59:25.910592 4746 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/root-account-create-update-msxbp" Jan 28 20:59:26 crc kubenswrapper[4746]: I0128 20:59:26.487734 4746 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/glance-default-external-api-0"] Jan 28 20:59:26 crc kubenswrapper[4746]: I0128 20:59:26.562195 4746 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/glance-default-internal-api-0"] Jan 28 20:59:27 crc kubenswrapper[4746]: I0128 20:59:27.137904 4746 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-58dd9ff6bc-d9h8x" event={"ID":"3710bcff-5321-46a6-8763-94e622eb38cb","Type":"ContainerDied","Data":"4ab4d76730e4d4819e4a7e6efd86c1db7824a7e3af2dec4e6924c55ae6c00ada"} Jan 28 20:59:27 crc kubenswrapper[4746]: I0128 20:59:27.138724 4746 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="4ab4d76730e4d4819e4a7e6efd86c1db7824a7e3af2dec4e6924c55ae6c00ada" Jan 28 20:59:27 crc kubenswrapper[4746]: I0128 20:59:27.219807 4746 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-58dd9ff6bc-d9h8x" Jan 28 20:59:27 crc kubenswrapper[4746]: I0128 20:59:27.373840 4746 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-55xtw\" (UniqueName: \"kubernetes.io/projected/3710bcff-5321-46a6-8763-94e622eb38cb-kube-api-access-55xtw\") pod \"3710bcff-5321-46a6-8763-94e622eb38cb\" (UID: \"3710bcff-5321-46a6-8763-94e622eb38cb\") " Jan 28 20:59:27 crc kubenswrapper[4746]: I0128 20:59:27.373983 4746 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/3710bcff-5321-46a6-8763-94e622eb38cb-ovsdbserver-nb\") pod \"3710bcff-5321-46a6-8763-94e622eb38cb\" (UID: \"3710bcff-5321-46a6-8763-94e622eb38cb\") " Jan 28 20:59:27 crc kubenswrapper[4746]: I0128 20:59:27.374023 4746 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/3710bcff-5321-46a6-8763-94e622eb38cb-config\") pod \"3710bcff-5321-46a6-8763-94e622eb38cb\" (UID: \"3710bcff-5321-46a6-8763-94e622eb38cb\") " Jan 28 20:59:27 crc kubenswrapper[4746]: I0128 20:59:27.374045 4746 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/3710bcff-5321-46a6-8763-94e622eb38cb-ovsdbserver-sb\") pod \"3710bcff-5321-46a6-8763-94e622eb38cb\" (UID: \"3710bcff-5321-46a6-8763-94e622eb38cb\") " Jan 28 20:59:27 crc kubenswrapper[4746]: I0128 20:59:27.374158 4746 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/3710bcff-5321-46a6-8763-94e622eb38cb-dns-svc\") pod \"3710bcff-5321-46a6-8763-94e622eb38cb\" (UID: \"3710bcff-5321-46a6-8763-94e622eb38cb\") " Jan 28 20:59:27 crc kubenswrapper[4746]: I0128 20:59:27.374204 4746 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-swift-storage-0\" 
(UniqueName: \"kubernetes.io/configmap/3710bcff-5321-46a6-8763-94e622eb38cb-dns-swift-storage-0\") pod \"3710bcff-5321-46a6-8763-94e622eb38cb\" (UID: \"3710bcff-5321-46a6-8763-94e622eb38cb\") " Jan 28 20:59:27 crc kubenswrapper[4746]: I0128 20:59:27.380842 4746 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/3710bcff-5321-46a6-8763-94e622eb38cb-kube-api-access-55xtw" (OuterVolumeSpecName: "kube-api-access-55xtw") pod "3710bcff-5321-46a6-8763-94e622eb38cb" (UID: "3710bcff-5321-46a6-8763-94e622eb38cb"). InnerVolumeSpecName "kube-api-access-55xtw". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 28 20:59:27 crc kubenswrapper[4746]: I0128 20:59:27.439543 4746 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/3710bcff-5321-46a6-8763-94e622eb38cb-dns-swift-storage-0" (OuterVolumeSpecName: "dns-swift-storage-0") pod "3710bcff-5321-46a6-8763-94e622eb38cb" (UID: "3710bcff-5321-46a6-8763-94e622eb38cb"). InnerVolumeSpecName "dns-swift-storage-0". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 28 20:59:27 crc kubenswrapper[4746]: I0128 20:59:27.441990 4746 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/3710bcff-5321-46a6-8763-94e622eb38cb-ovsdbserver-sb" (OuterVolumeSpecName: "ovsdbserver-sb") pod "3710bcff-5321-46a6-8763-94e622eb38cb" (UID: "3710bcff-5321-46a6-8763-94e622eb38cb"). InnerVolumeSpecName "ovsdbserver-sb". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 28 20:59:27 crc kubenswrapper[4746]: I0128 20:59:27.445953 4746 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/3710bcff-5321-46a6-8763-94e622eb38cb-ovsdbserver-nb" (OuterVolumeSpecName: "ovsdbserver-nb") pod "3710bcff-5321-46a6-8763-94e622eb38cb" (UID: "3710bcff-5321-46a6-8763-94e622eb38cb"). InnerVolumeSpecName "ovsdbserver-nb". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 28 20:59:27 crc kubenswrapper[4746]: I0128 20:59:27.450130 4746 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/3710bcff-5321-46a6-8763-94e622eb38cb-config" (OuterVolumeSpecName: "config") pod "3710bcff-5321-46a6-8763-94e622eb38cb" (UID: "3710bcff-5321-46a6-8763-94e622eb38cb"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 28 20:59:27 crc kubenswrapper[4746]: I0128 20:59:27.477250 4746 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-55xtw\" (UniqueName: \"kubernetes.io/projected/3710bcff-5321-46a6-8763-94e622eb38cb-kube-api-access-55xtw\") on node \"crc\" DevicePath \"\"" Jan 28 20:59:27 crc kubenswrapper[4746]: I0128 20:59:27.477280 4746 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/3710bcff-5321-46a6-8763-94e622eb38cb-ovsdbserver-nb\") on node \"crc\" DevicePath \"\"" Jan 28 20:59:27 crc kubenswrapper[4746]: I0128 20:59:27.477289 4746 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/3710bcff-5321-46a6-8763-94e622eb38cb-config\") on node \"crc\" DevicePath \"\"" Jan 28 20:59:27 crc kubenswrapper[4746]: I0128 20:59:27.477296 4746 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/3710bcff-5321-46a6-8763-94e622eb38cb-ovsdbserver-sb\") on node \"crc\" DevicePath \"\"" Jan 28 20:59:27 crc kubenswrapper[4746]: I0128 20:59:27.477304 4746 reconciler_common.go:293] "Volume detached for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/3710bcff-5321-46a6-8763-94e622eb38cb-dns-swift-storage-0\") on node \"crc\" DevicePath \"\"" Jan 28 20:59:27 crc kubenswrapper[4746]: I0128 20:59:27.492354 4746 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume 
"kubernetes.io/configmap/3710bcff-5321-46a6-8763-94e622eb38cb-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "3710bcff-5321-46a6-8763-94e622eb38cb" (UID: "3710bcff-5321-46a6-8763-94e622eb38cb"). InnerVolumeSpecName "dns-svc". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 28 20:59:27 crc kubenswrapper[4746]: I0128 20:59:27.579119 4746 reconciler_common.go:293] "Volume detached for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/3710bcff-5321-46a6-8763-94e622eb38cb-dns-svc\") on node \"crc\" DevicePath \"\"" Jan 28 20:59:28 crc kubenswrapper[4746]: I0128 20:59:28.150188 4746 generic.go:334] "Generic (PLEG): container finished" podID="3c3fee24-35f9-4695-b25d-b49430708c43" containerID="5a93bb99a49c68abca554da1927ef539fe9de6b02067e921ad5160164c5dd2cf" exitCode=0 Jan 28 20:59:28 crc kubenswrapper[4746]: I0128 20:59:28.150265 4746 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-785d8bcb8c-kkq9p" event={"ID":"3c3fee24-35f9-4695-b25d-b49430708c43","Type":"ContainerDied","Data":"5a93bb99a49c68abca554da1927ef539fe9de6b02067e921ad5160164c5dd2cf"} Jan 28 20:59:28 crc kubenswrapper[4746]: I0128 20:59:28.150674 4746 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-58dd9ff6bc-d9h8x" Jan 28 20:59:28 crc kubenswrapper[4746]: I0128 20:59:28.210605 4746 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-58dd9ff6bc-d9h8x"] Jan 28 20:59:28 crc kubenswrapper[4746]: I0128 20:59:28.222551 4746 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-58dd9ff6bc-d9h8x"] Jan 28 20:59:28 crc kubenswrapper[4746]: I0128 20:59:28.853486 4746 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="3710bcff-5321-46a6-8763-94e622eb38cb" path="/var/lib/kubelet/pods/3710bcff-5321-46a6-8763-94e622eb38cb/volumes" Jan 28 20:59:29 crc kubenswrapper[4746]: I0128 20:59:29.277913 4746 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/keystone-bootstrap-jhdzr" Jan 28 20:59:29 crc kubenswrapper[4746]: I0128 20:59:29.461620 4746 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/d575464e-96b9-46bf-9ae9-37e25dafb223-config-data\") pod \"d575464e-96b9-46bf-9ae9-37e25dafb223\" (UID: \"d575464e-96b9-46bf-9ae9-37e25dafb223\") " Jan 28 20:59:29 crc kubenswrapper[4746]: I0128 20:59:29.462130 4746 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/d575464e-96b9-46bf-9ae9-37e25dafb223-scripts\") pod \"d575464e-96b9-46bf-9ae9-37e25dafb223\" (UID: \"d575464e-96b9-46bf-9ae9-37e25dafb223\") " Jan 28 20:59:29 crc kubenswrapper[4746]: I0128 20:59:29.462182 4746 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/d575464e-96b9-46bf-9ae9-37e25dafb223-fernet-keys\") pod \"d575464e-96b9-46bf-9ae9-37e25dafb223\" (UID: \"d575464e-96b9-46bf-9ae9-37e25dafb223\") " Jan 28 20:59:29 crc kubenswrapper[4746]: I0128 20:59:29.464878 4746 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d575464e-96b9-46bf-9ae9-37e25dafb223-combined-ca-bundle\") pod \"d575464e-96b9-46bf-9ae9-37e25dafb223\" (UID: \"d575464e-96b9-46bf-9ae9-37e25dafb223\") " Jan 28 20:59:29 crc kubenswrapper[4746]: I0128 20:59:29.465025 4746 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-j5w7r\" (UniqueName: \"kubernetes.io/projected/d575464e-96b9-46bf-9ae9-37e25dafb223-kube-api-access-j5w7r\") pod \"d575464e-96b9-46bf-9ae9-37e25dafb223\" (UID: \"d575464e-96b9-46bf-9ae9-37e25dafb223\") " Jan 28 20:59:29 crc kubenswrapper[4746]: I0128 20:59:29.465134 4746 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"credential-keys\" (UniqueName: 
\"kubernetes.io/secret/d575464e-96b9-46bf-9ae9-37e25dafb223-credential-keys\") pod \"d575464e-96b9-46bf-9ae9-37e25dafb223\" (UID: \"d575464e-96b9-46bf-9ae9-37e25dafb223\") " Jan 28 20:59:29 crc kubenswrapper[4746]: I0128 20:59:29.469741 4746 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/d575464e-96b9-46bf-9ae9-37e25dafb223-fernet-keys" (OuterVolumeSpecName: "fernet-keys") pod "d575464e-96b9-46bf-9ae9-37e25dafb223" (UID: "d575464e-96b9-46bf-9ae9-37e25dafb223"). InnerVolumeSpecName "fernet-keys". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 28 20:59:29 crc kubenswrapper[4746]: I0128 20:59:29.471138 4746 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/d575464e-96b9-46bf-9ae9-37e25dafb223-scripts" (OuterVolumeSpecName: "scripts") pod "d575464e-96b9-46bf-9ae9-37e25dafb223" (UID: "d575464e-96b9-46bf-9ae9-37e25dafb223"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 28 20:59:29 crc kubenswrapper[4746]: I0128 20:59:29.471610 4746 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/d575464e-96b9-46bf-9ae9-37e25dafb223-credential-keys" (OuterVolumeSpecName: "credential-keys") pod "d575464e-96b9-46bf-9ae9-37e25dafb223" (UID: "d575464e-96b9-46bf-9ae9-37e25dafb223"). InnerVolumeSpecName "credential-keys". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 28 20:59:29 crc kubenswrapper[4746]: I0128 20:59:29.472665 4746 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/d575464e-96b9-46bf-9ae9-37e25dafb223-kube-api-access-j5w7r" (OuterVolumeSpecName: "kube-api-access-j5w7r") pod "d575464e-96b9-46bf-9ae9-37e25dafb223" (UID: "d575464e-96b9-46bf-9ae9-37e25dafb223"). InnerVolumeSpecName "kube-api-access-j5w7r". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 28 20:59:29 crc kubenswrapper[4746]: I0128 20:59:29.501038 4746 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/d575464e-96b9-46bf-9ae9-37e25dafb223-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "d575464e-96b9-46bf-9ae9-37e25dafb223" (UID: "d575464e-96b9-46bf-9ae9-37e25dafb223"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 28 20:59:29 crc kubenswrapper[4746]: I0128 20:59:29.506094 4746 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/d575464e-96b9-46bf-9ae9-37e25dafb223-config-data" (OuterVolumeSpecName: "config-data") pod "d575464e-96b9-46bf-9ae9-37e25dafb223" (UID: "d575464e-96b9-46bf-9ae9-37e25dafb223"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 28 20:59:29 crc kubenswrapper[4746]: I0128 20:59:29.569740 4746 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-j5w7r\" (UniqueName: \"kubernetes.io/projected/d575464e-96b9-46bf-9ae9-37e25dafb223-kube-api-access-j5w7r\") on node \"crc\" DevicePath \"\"" Jan 28 20:59:29 crc kubenswrapper[4746]: I0128 20:59:29.569786 4746 reconciler_common.go:293] "Volume detached for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/d575464e-96b9-46bf-9ae9-37e25dafb223-credential-keys\") on node \"crc\" DevicePath \"\"" Jan 28 20:59:29 crc kubenswrapper[4746]: I0128 20:59:29.569799 4746 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/d575464e-96b9-46bf-9ae9-37e25dafb223-config-data\") on node \"crc\" DevicePath \"\"" Jan 28 20:59:29 crc kubenswrapper[4746]: I0128 20:59:29.569811 4746 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/d575464e-96b9-46bf-9ae9-37e25dafb223-scripts\") on node \"crc\" DevicePath \"\"" Jan 28 
20:59:29 crc kubenswrapper[4746]: I0128 20:59:29.569823 4746 reconciler_common.go:293] "Volume detached for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/d575464e-96b9-46bf-9ae9-37e25dafb223-fernet-keys\") on node \"crc\" DevicePath \"\"" Jan 28 20:59:29 crc kubenswrapper[4746]: I0128 20:59:29.569833 4746 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d575464e-96b9-46bf-9ae9-37e25dafb223-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Jan 28 20:59:30 crc kubenswrapper[4746]: I0128 20:59:30.168154 4746 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-bootstrap-jhdzr" event={"ID":"d575464e-96b9-46bf-9ae9-37e25dafb223","Type":"ContainerDied","Data":"5b81ad9abb59abd9f8f451c344b0874a1b516d93aabeaaf5427f757f88f1afac"} Jan 28 20:59:30 crc kubenswrapper[4746]: I0128 20:59:30.168603 4746 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="5b81ad9abb59abd9f8f451c344b0874a1b516d93aabeaaf5427f757f88f1afac" Jan 28 20:59:30 crc kubenswrapper[4746]: I0128 20:59:30.168269 4746 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/keystone-bootstrap-jhdzr"
Jan 28 20:59:30 crc kubenswrapper[4746]: I0128 20:59:30.411712 4746 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/keystone-bootstrap-jhdzr"]
Jan 28 20:59:30 crc kubenswrapper[4746]: I0128 20:59:30.426833 4746 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/keystone-bootstrap-jhdzr"]
Jan 28 20:59:30 crc kubenswrapper[4746]: I0128 20:59:30.468949 4746 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/keystone-bootstrap-lwqj2"]
Jan 28 20:59:30 crc kubenswrapper[4746]: E0128 20:59:30.469339 4746 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="3710bcff-5321-46a6-8763-94e622eb38cb" containerName="dnsmasq-dns"
Jan 28 20:59:30 crc kubenswrapper[4746]: I0128 20:59:30.469355 4746 state_mem.go:107] "Deleted CPUSet assignment" podUID="3710bcff-5321-46a6-8763-94e622eb38cb" containerName="dnsmasq-dns"
Jan 28 20:59:30 crc kubenswrapper[4746]: E0128 20:59:30.469383 4746 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="d575464e-96b9-46bf-9ae9-37e25dafb223" containerName="keystone-bootstrap"
Jan 28 20:59:30 crc kubenswrapper[4746]: I0128 20:59:30.469391 4746 state_mem.go:107] "Deleted CPUSet assignment" podUID="d575464e-96b9-46bf-9ae9-37e25dafb223" containerName="keystone-bootstrap"
Jan 28 20:59:30 crc kubenswrapper[4746]: E0128 20:59:30.469407 4746 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="3710bcff-5321-46a6-8763-94e622eb38cb" containerName="init"
Jan 28 20:59:30 crc kubenswrapper[4746]: I0128 20:59:30.469412 4746 state_mem.go:107] "Deleted CPUSet assignment" podUID="3710bcff-5321-46a6-8763-94e622eb38cb" containerName="init"
Jan 28 20:59:30 crc kubenswrapper[4746]: I0128 20:59:30.469591 4746 memory_manager.go:354] "RemoveStaleState removing state" podUID="3710bcff-5321-46a6-8763-94e622eb38cb" containerName="dnsmasq-dns"
Jan 28 20:59:30 crc kubenswrapper[4746]: I0128 20:59:30.469620 4746 memory_manager.go:354] "RemoveStaleState removing state" podUID="d575464e-96b9-46bf-9ae9-37e25dafb223" containerName="keystone-bootstrap"
Jan 28 20:59:30 crc kubenswrapper[4746]: I0128 20:59:30.470337 4746 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-bootstrap-lwqj2"
Jan 28 20:59:30 crc kubenswrapper[4746]: I0128 20:59:30.477227 4746 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/keystone-bootstrap-lwqj2"]
Jan 28 20:59:30 crc kubenswrapper[4746]: I0128 20:59:30.515734 4746 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone-config-data"
Jan 28 20:59:30 crc kubenswrapper[4746]: I0128 20:59:30.515963 4746 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone-keystone-dockercfg-j5js5"
Jan 28 20:59:30 crc kubenswrapper[4746]: I0128 20:59:30.516006 4746 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone-scripts"
Jan 28 20:59:30 crc kubenswrapper[4746]: I0128 20:59:30.515980 4746 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone"
Jan 28 20:59:30 crc kubenswrapper[4746]: I0128 20:59:30.517172 4746 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/f81b35d0-5755-476e-a5c9-30036d654d53-config-data\") pod \"keystone-bootstrap-lwqj2\" (UID: \"f81b35d0-5755-476e-a5c9-30036d654d53\") " pod="openstack/keystone-bootstrap-lwqj2"
Jan 28 20:59:30 crc kubenswrapper[4746]: I0128 20:59:30.517589 4746 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-vgtdv\" (UniqueName: \"kubernetes.io/projected/f81b35d0-5755-476e-a5c9-30036d654d53-kube-api-access-vgtdv\") pod \"keystone-bootstrap-lwqj2\" (UID: \"f81b35d0-5755-476e-a5c9-30036d654d53\") " pod="openstack/keystone-bootstrap-lwqj2"
Jan 28 20:59:30 crc kubenswrapper[4746]: I0128 20:59:30.517676 4746 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/f81b35d0-5755-476e-a5c9-30036d654d53-credential-keys\") pod \"keystone-bootstrap-lwqj2\" (UID: \"f81b35d0-5755-476e-a5c9-30036d654d53\") " pod="openstack/keystone-bootstrap-lwqj2"
Jan 28 20:59:30 crc kubenswrapper[4746]: I0128 20:59:30.517794 4746 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/f81b35d0-5755-476e-a5c9-30036d654d53-scripts\") pod \"keystone-bootstrap-lwqj2\" (UID: \"f81b35d0-5755-476e-a5c9-30036d654d53\") " pod="openstack/keystone-bootstrap-lwqj2"
Jan 28 20:59:30 crc kubenswrapper[4746]: I0128 20:59:30.517991 4746 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/f81b35d0-5755-476e-a5c9-30036d654d53-combined-ca-bundle\") pod \"keystone-bootstrap-lwqj2\" (UID: \"f81b35d0-5755-476e-a5c9-30036d654d53\") " pod="openstack/keystone-bootstrap-lwqj2"
Jan 28 20:59:30 crc kubenswrapper[4746]: I0128 20:59:30.518181 4746 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/f81b35d0-5755-476e-a5c9-30036d654d53-fernet-keys\") pod \"keystone-bootstrap-lwqj2\" (UID: \"f81b35d0-5755-476e-a5c9-30036d654d53\") " pod="openstack/keystone-bootstrap-lwqj2"
Jan 28 20:59:30 crc kubenswrapper[4746]: I0128 20:59:30.618913 4746 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-vgtdv\" (UniqueName: \"kubernetes.io/projected/f81b35d0-5755-476e-a5c9-30036d654d53-kube-api-access-vgtdv\") pod \"keystone-bootstrap-lwqj2\" (UID: \"f81b35d0-5755-476e-a5c9-30036d654d53\") " pod="openstack/keystone-bootstrap-lwqj2"
Jan 28 20:59:30 crc kubenswrapper[4746]: I0128 20:59:30.618970 4746 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/f81b35d0-5755-476e-a5c9-30036d654d53-credential-keys\") pod \"keystone-bootstrap-lwqj2\" (UID: \"f81b35d0-5755-476e-a5c9-30036d654d53\") " pod="openstack/keystone-bootstrap-lwqj2"
Jan 28 20:59:30 crc kubenswrapper[4746]: I0128 20:59:30.618994 4746 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/f81b35d0-5755-476e-a5c9-30036d654d53-scripts\") pod \"keystone-bootstrap-lwqj2\" (UID: \"f81b35d0-5755-476e-a5c9-30036d654d53\") " pod="openstack/keystone-bootstrap-lwqj2"
Jan 28 20:59:30 crc kubenswrapper[4746]: I0128 20:59:30.619046 4746 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/f81b35d0-5755-476e-a5c9-30036d654d53-combined-ca-bundle\") pod \"keystone-bootstrap-lwqj2\" (UID: \"f81b35d0-5755-476e-a5c9-30036d654d53\") " pod="openstack/keystone-bootstrap-lwqj2"
Jan 28 20:59:30 crc kubenswrapper[4746]: I0128 20:59:30.619098 4746 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/f81b35d0-5755-476e-a5c9-30036d654d53-fernet-keys\") pod \"keystone-bootstrap-lwqj2\" (UID: \"f81b35d0-5755-476e-a5c9-30036d654d53\") " pod="openstack/keystone-bootstrap-lwqj2"
Jan 28 20:59:30 crc kubenswrapper[4746]: I0128 20:59:30.619172 4746 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/f81b35d0-5755-476e-a5c9-30036d654d53-config-data\") pod \"keystone-bootstrap-lwqj2\" (UID: \"f81b35d0-5755-476e-a5c9-30036d654d53\") " pod="openstack/keystone-bootstrap-lwqj2"
Jan 28 20:59:30 crc kubenswrapper[4746]: I0128 20:59:30.624444 4746 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/f81b35d0-5755-476e-a5c9-30036d654d53-scripts\") pod \"keystone-bootstrap-lwqj2\" (UID: \"f81b35d0-5755-476e-a5c9-30036d654d53\") " pod="openstack/keystone-bootstrap-lwqj2"
Jan 28 20:59:30 crc kubenswrapper[4746]: I0128 20:59:30.625190 4746 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/f81b35d0-5755-476e-a5c9-30036d654d53-config-data\") pod \"keystone-bootstrap-lwqj2\" (UID: \"f81b35d0-5755-476e-a5c9-30036d654d53\") " pod="openstack/keystone-bootstrap-lwqj2"
Jan 28 20:59:30 crc kubenswrapper[4746]: I0128 20:59:30.625351 4746 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/f81b35d0-5755-476e-a5c9-30036d654d53-credential-keys\") pod \"keystone-bootstrap-lwqj2\" (UID: \"f81b35d0-5755-476e-a5c9-30036d654d53\") " pod="openstack/keystone-bootstrap-lwqj2"
Jan 28 20:59:30 crc kubenswrapper[4746]: I0128 20:59:30.625417 4746 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/f81b35d0-5755-476e-a5c9-30036d654d53-fernet-keys\") pod \"keystone-bootstrap-lwqj2\" (UID: \"f81b35d0-5755-476e-a5c9-30036d654d53\") " pod="openstack/keystone-bootstrap-lwqj2"
Jan 28 20:59:30 crc kubenswrapper[4746]: I0128 20:59:30.629688 4746 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/f81b35d0-5755-476e-a5c9-30036d654d53-combined-ca-bundle\") pod \"keystone-bootstrap-lwqj2\" (UID: \"f81b35d0-5755-476e-a5c9-30036d654d53\") " pod="openstack/keystone-bootstrap-lwqj2"
Jan 28 20:59:30 crc kubenswrapper[4746]: I0128 20:59:30.639585 4746 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-vgtdv\" (UniqueName: \"kubernetes.io/projected/f81b35d0-5755-476e-a5c9-30036d654d53-kube-api-access-vgtdv\") pod \"keystone-bootstrap-lwqj2\" (UID: \"f81b35d0-5755-476e-a5c9-30036d654d53\") " pod="openstack/keystone-bootstrap-lwqj2"
Jan 28 20:59:30 crc kubenswrapper[4746]: I0128 20:59:30.835903 4746 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-bootstrap-lwqj2"
Jan 28 20:59:30 crc kubenswrapper[4746]: I0128 20:59:30.847618 4746 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="d575464e-96b9-46bf-9ae9-37e25dafb223" path="/var/lib/kubelet/pods/d575464e-96b9-46bf-9ae9-37e25dafb223/volumes"
Jan 28 20:59:36 crc kubenswrapper[4746]: I0128 20:59:36.230190 4746 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/root-account-create-update-msxbp"]
Jan 28 20:59:42 crc kubenswrapper[4746]: I0128 20:59:42.308199 4746 generic.go:334] "Generic (PLEG): container finished" podID="b820b96e-5237-4984-a3e9-246b04980cbb" containerID="3e8bf0fad3a221f9307149e72a8c7a4c16b411a4b0558781df4f71024476360a" exitCode=0
Jan 28 20:59:42 crc kubenswrapper[4746]: I0128 20:59:42.308368 4746 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-db-sync-g7kpz" event={"ID":"b820b96e-5237-4984-a3e9-246b04980cbb","Type":"ContainerDied","Data":"3e8bf0fad3a221f9307149e72a8c7a4c16b411a4b0558781df4f71024476360a"}
Jan 28 20:59:43 crc kubenswrapper[4746]: E0128 20:59:43.624251 4746 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying config: context canceled" image="quay.io/podified-antelope-centos9/openstack-barbican-api:current-podified"
Jan 28 20:59:43 crc kubenswrapper[4746]: E0128 20:59:43.624660 4746 kuberuntime_manager.go:1274] "Unhandled Error" err="container &Container{Name:barbican-db-sync,Image:quay.io/podified-antelope-centos9/openstack-barbican-api:current-podified,Command:[/bin/bash],Args:[-c barbican-manage db upgrade],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:KOLLA_BOOTSTRAP,Value:TRUE,ValueFrom:nil,},EnvVar{Name:KOLLA_CONFIG_STRATEGY,Value:COPY_ALWAYS,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:db-sync-config-data,ReadOnly:true,MountPath:/etc/barbican/barbican.conf.d,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:combined-ca-bundle,ReadOnly:true,MountPath:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem,SubPath:tls-ca-bundle.pem,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-hz7rb,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[MKNOD],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*42403,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:nil,RunAsGroup:*42403,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod barbican-db-sync-29n2p_openstack(70e766dc-9f84-4d0c-af5b-3b044e06c09f): ErrImagePull: rpc error: code = Canceled desc = copying config: context canceled" logger="UnhandledError"
Jan 28 20:59:43 crc kubenswrapper[4746]: E0128 20:59:43.626498 4746 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"barbican-db-sync\" with ErrImagePull: \"rpc error: code = Canceled desc = copying config: context canceled\"" pod="openstack/barbican-db-sync-29n2p" podUID="70e766dc-9f84-4d0c-af5b-3b044e06c09f"
Jan 28 20:59:43 crc kubenswrapper[4746]: W0128 20:59:43.996799 4746 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod854c60ea_889a_449a_a74b_39f6f973f52c.slice/crio-6452d5f9caabb8c64cdd061784f9211b6c19b98a9a8d5cf5dd8d8f8fbc77b260 WatchSource:0}: Error finding container 6452d5f9caabb8c64cdd061784f9211b6c19b98a9a8d5cf5dd8d8f8fbc77b260: Status 404 returned error can't find the container with id 6452d5f9caabb8c64cdd061784f9211b6c19b98a9a8d5cf5dd8d8f8fbc77b260
Jan 28 20:59:44 crc kubenswrapper[4746]: I0128 20:59:44.332169 4746 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/root-account-create-update-msxbp" event={"ID":"854c60ea-889a-449a-a74b-39f6f973f52c","Type":"ContainerStarted","Data":"6452d5f9caabb8c64cdd061784f9211b6c19b98a9a8d5cf5dd8d8f8fbc77b260"}
Jan 28 20:59:44 crc kubenswrapper[4746]: E0128 20:59:44.333889 4746 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"barbican-db-sync\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/podified-antelope-centos9/openstack-barbican-api:current-podified\\\"\"" pod="openstack/barbican-db-sync-29n2p" podUID="70e766dc-9f84-4d0c-af5b-3b044e06c09f"
Jan 28 20:59:45 crc kubenswrapper[4746]: I0128 20:59:45.057408 4746 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-mariadb-root-db-secret"
Jan 28 20:59:45 crc kubenswrapper[4746]: E0128 20:59:45.107855 4746 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying config: context canceled" image="quay.io/podified-antelope-centos9/openstack-cinder-api:current-podified"
Jan 28 20:59:45 crc kubenswrapper[4746]: E0128 20:59:45.108050 4746 kuberuntime_manager.go:1274] "Unhandled Error" err="container &Container{Name:cinder-db-sync,Image:quay.io/podified-antelope-centos9/openstack-cinder-api:current-podified,Command:[/bin/bash],Args:[-c /usr/local/bin/kolla_set_configs && /usr/local/bin/kolla_start],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:KOLLA_BOOTSTRAP,Value:TRUE,ValueFrom:nil,},EnvVar{Name:KOLLA_CONFIG_STRATEGY,Value:COPY_ALWAYS,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:etc-machine-id,ReadOnly:true,MountPath:/etc/machine-id,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:scripts,ReadOnly:true,MountPath:/usr/local/bin/container-scripts,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:config-data,ReadOnly:true,MountPath:/var/lib/config-data/merged,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:config-data,ReadOnly:true,MountPath:/etc/my.cnf,SubPath:my.cnf,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:db-sync-config-data,ReadOnly:true,MountPath:/etc/cinder/cinder.conf.d,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:config-data,ReadOnly:true,MountPath:/var/lib/kolla/config_files/config.json,SubPath:db-sync-config.json,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:combined-ca-bundle,ReadOnly:true,MountPath:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem,SubPath:tls-ca-bundle.pem,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-hvpq6,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:nil,Privileged:nil,SELinuxOptions:nil,RunAsUser:*0,RunAsNonRoot:nil,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:nil,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod cinder-db-sync-wwfkc_openstack(a587d3d9-972c-47ae-8e29-5bfd977ff429): ErrImagePull: rpc error: code = Canceled desc = copying config: context canceled" logger="UnhandledError"
Jan 28 20:59:45 crc kubenswrapper[4746]: E0128 20:59:45.109316 4746 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"cinder-db-sync\" with ErrImagePull: \"rpc error: code = Canceled desc = copying config: context canceled\"" pod="openstack/cinder-db-sync-wwfkc" podUID="a587d3d9-972c-47ae-8e29-5bfd977ff429"
Jan 28 20:59:45 crc kubenswrapper[4746]: E0128 20:59:45.344922 4746 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"cinder-db-sync\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/podified-antelope-centos9/openstack-cinder-api:current-podified\\\"\"" pod="openstack/cinder-db-sync-wwfkc" podUID="a587d3d9-972c-47ae-8e29-5bfd977ff429"
Jan 28 20:59:45 crc kubenswrapper[4746]: I0128 20:59:45.871396 4746 patch_prober.go:28] interesting pod/machine-config-daemon-6wrnw container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body=
Jan 28 20:59:45 crc kubenswrapper[4746]: I0128 20:59:45.871480 4746 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-6wrnw" podUID="6dc8b546-9734-4082-b2b3-2bafe3f1564d" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused"
Jan 28 20:59:47 crc kubenswrapper[4746]: I0128 20:59:47.365421 4746 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/neutron-db-sync-g7kpz"
Jan 28 20:59:47 crc kubenswrapper[4746]: I0128 20:59:47.406784 4746 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-db-sync-g7kpz" event={"ID":"b820b96e-5237-4984-a3e9-246b04980cbb","Type":"ContainerDied","Data":"bace3c13ffe904c5674c425f8137fd20313d3e39f537b02b0112e1579040b6ee"}
Jan 28 20:59:47 crc kubenswrapper[4746]: I0128 20:59:47.406822 4746 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="bace3c13ffe904c5674c425f8137fd20313d3e39f537b02b0112e1579040b6ee"
Jan 28 20:59:47 crc kubenswrapper[4746]: I0128 20:59:47.406836 4746 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/neutron-db-sync-g7kpz"
Jan 28 20:59:47 crc kubenswrapper[4746]: I0128 20:59:47.413966 4746 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/secret/b820b96e-5237-4984-a3e9-246b04980cbb-config\") pod \"b820b96e-5237-4984-a3e9-246b04980cbb\" (UID: \"b820b96e-5237-4984-a3e9-246b04980cbb\") "
Jan 28 20:59:47 crc kubenswrapper[4746]: I0128 20:59:47.414060 4746 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/b820b96e-5237-4984-a3e9-246b04980cbb-combined-ca-bundle\") pod \"b820b96e-5237-4984-a3e9-246b04980cbb\" (UID: \"b820b96e-5237-4984-a3e9-246b04980cbb\") "
Jan 28 20:59:47 crc kubenswrapper[4746]: I0128 20:59:47.414184 4746 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-84g7x\" (UniqueName: \"kubernetes.io/projected/b820b96e-5237-4984-a3e9-246b04980cbb-kube-api-access-84g7x\") pod \"b820b96e-5237-4984-a3e9-246b04980cbb\" (UID: \"b820b96e-5237-4984-a3e9-246b04980cbb\") "
Jan 28 20:59:47 crc kubenswrapper[4746]: I0128 20:59:47.442839 4746 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/b820b96e-5237-4984-a3e9-246b04980cbb-kube-api-access-84g7x" (OuterVolumeSpecName: "kube-api-access-84g7x") pod "b820b96e-5237-4984-a3e9-246b04980cbb" (UID: "b820b96e-5237-4984-a3e9-246b04980cbb"). InnerVolumeSpecName "kube-api-access-84g7x". PluginName "kubernetes.io/projected", VolumeGidValue ""
Jan 28 20:59:47 crc kubenswrapper[4746]: I0128 20:59:47.452634 4746 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/b820b96e-5237-4984-a3e9-246b04980cbb-config" (OuterVolumeSpecName: "config") pod "b820b96e-5237-4984-a3e9-246b04980cbb" (UID: "b820b96e-5237-4984-a3e9-246b04980cbb"). InnerVolumeSpecName "config". PluginName "kubernetes.io/secret", VolumeGidValue ""
Jan 28 20:59:47 crc kubenswrapper[4746]: I0128 20:59:47.490646 4746 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/b820b96e-5237-4984-a3e9-246b04980cbb-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "b820b96e-5237-4984-a3e9-246b04980cbb" (UID: "b820b96e-5237-4984-a3e9-246b04980cbb"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue ""
Jan 28 20:59:47 crc kubenswrapper[4746]: I0128 20:59:47.516173 4746 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-84g7x\" (UniqueName: \"kubernetes.io/projected/b820b96e-5237-4984-a3e9-246b04980cbb-kube-api-access-84g7x\") on node \"crc\" DevicePath \"\""
Jan 28 20:59:47 crc kubenswrapper[4746]: I0128 20:59:47.516207 4746 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/secret/b820b96e-5237-4984-a3e9-246b04980cbb-config\") on node \"crc\" DevicePath \"\""
Jan 28 20:59:47 crc kubenswrapper[4746]: I0128 20:59:47.516218 4746 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/b820b96e-5237-4984-a3e9-246b04980cbb-combined-ca-bundle\") on node \"crc\" DevicePath \"\""
Jan 28 20:59:47 crc kubenswrapper[4746]: I0128 20:59:47.781729 4746 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/glance-default-external-api-0"]
Jan 28 20:59:47 crc kubenswrapper[4746]: I0128 20:59:47.856466 4746 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/glance-default-internal-api-0"]
Jan 28 20:59:48 crc kubenswrapper[4746]: I0128 20:59:48.555791 4746 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-785d8bcb8c-kkq9p"]
Jan 28 20:59:48 crc kubenswrapper[4746]: I0128 20:59:48.615853 4746 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/neutron-bd578f44d-rfllv"]
Jan 28 20:59:48 crc kubenswrapper[4746]: E0128 20:59:48.616980 4746 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="b820b96e-5237-4984-a3e9-246b04980cbb" containerName="neutron-db-sync"
Jan 28 20:59:48 crc kubenswrapper[4746]: I0128 20:59:48.617008 4746 state_mem.go:107] "Deleted CPUSet assignment" podUID="b820b96e-5237-4984-a3e9-246b04980cbb" containerName="neutron-db-sync"
Jan 28 20:59:48 crc kubenswrapper[4746]: I0128 20:59:48.617361 4746 memory_manager.go:354] "RemoveStaleState removing state" podUID="b820b96e-5237-4984-a3e9-246b04980cbb" containerName="neutron-db-sync"
Jan 28 20:59:48 crc kubenswrapper[4746]: I0128 20:59:48.621034 4746 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/neutron-bd578f44d-rfllv"
Jan 28 20:59:48 crc kubenswrapper[4746]: I0128 20:59:48.626285 4746 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"neutron-httpd-config"
Jan 28 20:59:48 crc kubenswrapper[4746]: I0128 20:59:48.626693 4746 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"neutron-config"
Jan 28 20:59:48 crc kubenswrapper[4746]: I0128 20:59:48.626892 4746 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"neutron-neutron-dockercfg-s8hft"
Jan 28 20:59:48 crc kubenswrapper[4746]: I0128 20:59:48.632258 4746 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-neutron-ovndbs"
Jan 28 20:59:48 crc kubenswrapper[4746]: I0128 20:59:48.636335 4746 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-55f844cf75-8hh8h"]
Jan 28 20:59:48 crc kubenswrapper[4746]: I0128 20:59:48.644226 4746 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-kvdq7\" (UniqueName: \"kubernetes.io/projected/875165a4-1092-4e9d-ae24-5044a726e174-kube-api-access-kvdq7\") pod \"neutron-bd578f44d-rfllv\" (UID: \"875165a4-1092-4e9d-ae24-5044a726e174\") " pod="openstack/neutron-bd578f44d-rfllv"
Jan 28 20:59:48 crc kubenswrapper[4746]: I0128 20:59:48.644292 4746 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovndb-tls-certs\" (UniqueName: \"kubernetes.io/secret/875165a4-1092-4e9d-ae24-5044a726e174-ovndb-tls-certs\") pod \"neutron-bd578f44d-rfllv\" (UID: \"875165a4-1092-4e9d-ae24-5044a726e174\") " pod="openstack/neutron-bd578f44d-rfllv"
Jan 28 20:59:48 crc kubenswrapper[4746]: I0128 20:59:48.644329 4746 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"httpd-config\" (UniqueName: \"kubernetes.io/secret/875165a4-1092-4e9d-ae24-5044a726e174-httpd-config\") pod \"neutron-bd578f44d-rfllv\" (UID: \"875165a4-1092-4e9d-ae24-5044a726e174\") " pod="openstack/neutron-bd578f44d-rfllv"
Jan 28 20:59:48 crc kubenswrapper[4746]: I0128 20:59:48.644369 4746 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/875165a4-1092-4e9d-ae24-5044a726e174-combined-ca-bundle\") pod \"neutron-bd578f44d-rfllv\" (UID: \"875165a4-1092-4e9d-ae24-5044a726e174\") " pod="openstack/neutron-bd578f44d-rfllv"
Jan 28 20:59:48 crc kubenswrapper[4746]: I0128 20:59:48.644849 4746 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/secret/875165a4-1092-4e9d-ae24-5044a726e174-config\") pod \"neutron-bd578f44d-rfllv\" (UID: \"875165a4-1092-4e9d-ae24-5044a726e174\") " pod="openstack/neutron-bd578f44d-rfllv"
Jan 28 20:59:48 crc kubenswrapper[4746]: I0128 20:59:48.646590 4746 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-55f844cf75-8hh8h"
Jan 28 20:59:48 crc kubenswrapper[4746]: I0128 20:59:48.680341 4746 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/neutron-bd578f44d-rfllv"]
Jan 28 20:59:48 crc kubenswrapper[4746]: I0128 20:59:48.705832 4746 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-55f844cf75-8hh8h"]
Jan 28 20:59:48 crc kubenswrapper[4746]: I0128 20:59:48.750267 4746 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-kvdq7\" (UniqueName: \"kubernetes.io/projected/875165a4-1092-4e9d-ae24-5044a726e174-kube-api-access-kvdq7\") pod \"neutron-bd578f44d-rfllv\" (UID: \"875165a4-1092-4e9d-ae24-5044a726e174\") " pod="openstack/neutron-bd578f44d-rfllv"
Jan 28 20:59:48 crc kubenswrapper[4746]: I0128 20:59:48.750645 4746 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovndb-tls-certs\" (UniqueName: \"kubernetes.io/secret/875165a4-1092-4e9d-ae24-5044a726e174-ovndb-tls-certs\") pod \"neutron-bd578f44d-rfllv\" (UID: \"875165a4-1092-4e9d-ae24-5044a726e174\") " pod="openstack/neutron-bd578f44d-rfllv"
Jan 28 20:59:48 crc kubenswrapper[4746]: I0128 20:59:48.750684 4746 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/ee1b35a7-970d-4abf-b645-eebbcadd7e8e-ovsdbserver-sb\") pod \"dnsmasq-dns-55f844cf75-8hh8h\" (UID: \"ee1b35a7-970d-4abf-b645-eebbcadd7e8e\") " pod="openstack/dnsmasq-dns-55f844cf75-8hh8h"
Jan 28 20:59:48 crc kubenswrapper[4746]: I0128 20:59:48.750710 4746 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"httpd-config\" (UniqueName: \"kubernetes.io/secret/875165a4-1092-4e9d-ae24-5044a726e174-httpd-config\") pod \"neutron-bd578f44d-rfllv\" (UID: \"875165a4-1092-4e9d-ae24-5044a726e174\") " pod="openstack/neutron-bd578f44d-rfllv"
Jan 28 20:59:48 crc kubenswrapper[4746]: I0128 20:59:48.750736 4746 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/875165a4-1092-4e9d-ae24-5044a726e174-combined-ca-bundle\") pod \"neutron-bd578f44d-rfllv\" (UID: \"875165a4-1092-4e9d-ae24-5044a726e174\") " pod="openstack/neutron-bd578f44d-rfllv"
Jan 28 20:59:48 crc kubenswrapper[4746]: I0128 20:59:48.750776 4746 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/ee1b35a7-970d-4abf-b645-eebbcadd7e8e-dns-svc\") pod \"dnsmasq-dns-55f844cf75-8hh8h\" (UID: \"ee1b35a7-970d-4abf-b645-eebbcadd7e8e\") " pod="openstack/dnsmasq-dns-55f844cf75-8hh8h"
Jan 28 20:59:48 crc kubenswrapper[4746]: I0128 20:59:48.750805 4746 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/ee1b35a7-970d-4abf-b645-eebbcadd7e8e-dns-swift-storage-0\") pod \"dnsmasq-dns-55f844cf75-8hh8h\" (UID: \"ee1b35a7-970d-4abf-b645-eebbcadd7e8e\") " pod="openstack/dnsmasq-dns-55f844cf75-8hh8h"
Jan 28 20:59:48 crc kubenswrapper[4746]: I0128 20:59:48.750827 4746 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/ee1b35a7-970d-4abf-b645-eebbcadd7e8e-ovsdbserver-nb\") pod \"dnsmasq-dns-55f844cf75-8hh8h\" (UID: \"ee1b35a7-970d-4abf-b645-eebbcadd7e8e\") " pod="openstack/dnsmasq-dns-55f844cf75-8hh8h"
Jan 28 20:59:48 crc kubenswrapper[4746]: I0128 20:59:48.750846 4746 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/secret/875165a4-1092-4e9d-ae24-5044a726e174-config\") pod \"neutron-bd578f44d-rfllv\" (UID: \"875165a4-1092-4e9d-ae24-5044a726e174\") " pod="openstack/neutron-bd578f44d-rfllv"
Jan 28 20:59:48 crc kubenswrapper[4746]: I0128 20:59:48.750889 4746 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/ee1b35a7-970d-4abf-b645-eebbcadd7e8e-config\") pod \"dnsmasq-dns-55f844cf75-8hh8h\" (UID: \"ee1b35a7-970d-4abf-b645-eebbcadd7e8e\") " pod="openstack/dnsmasq-dns-55f844cf75-8hh8h"
Jan 28 20:59:48 crc kubenswrapper[4746]: I0128 20:59:48.750911 4746 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-4g669\" (UniqueName: \"kubernetes.io/projected/ee1b35a7-970d-4abf-b645-eebbcadd7e8e-kube-api-access-4g669\") pod \"dnsmasq-dns-55f844cf75-8hh8h\" (UID: \"ee1b35a7-970d-4abf-b645-eebbcadd7e8e\") " pod="openstack/dnsmasq-dns-55f844cf75-8hh8h"
Jan 28 20:59:48 crc kubenswrapper[4746]: I0128 20:59:48.759041 4746 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"httpd-config\" (UniqueName: \"kubernetes.io/secret/875165a4-1092-4e9d-ae24-5044a726e174-httpd-config\") pod \"neutron-bd578f44d-rfllv\" (UID: \"875165a4-1092-4e9d-ae24-5044a726e174\") " pod="openstack/neutron-bd578f44d-rfllv"
Jan 28 20:59:48 crc kubenswrapper[4746]: I0128 20:59:48.761016 4746 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/secret/875165a4-1092-4e9d-ae24-5044a726e174-config\") pod \"neutron-bd578f44d-rfllv\" (UID: \"875165a4-1092-4e9d-ae24-5044a726e174\") " pod="openstack/neutron-bd578f44d-rfllv"
Jan 28 20:59:48 crc kubenswrapper[4746]: I0128 20:59:48.762611 4746 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovndb-tls-certs\" (UniqueName: \"kubernetes.io/secret/875165a4-1092-4e9d-ae24-5044a726e174-ovndb-tls-certs\") pod \"neutron-bd578f44d-rfllv\" (UID: \"875165a4-1092-4e9d-ae24-5044a726e174\") " pod="openstack/neutron-bd578f44d-rfllv"
Jan 28 20:59:48 crc kubenswrapper[4746]: I0128 20:59:48.763199 4746 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/875165a4-1092-4e9d-ae24-5044a726e174-combined-ca-bundle\") pod \"neutron-bd578f44d-rfllv\" (UID: \"875165a4-1092-4e9d-ae24-5044a726e174\") " pod="openstack/neutron-bd578f44d-rfllv"
Jan 28 20:59:48 crc kubenswrapper[4746]: I0128 20:59:48.779056 4746 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-kvdq7\" (UniqueName: \"kubernetes.io/projected/875165a4-1092-4e9d-ae24-5044a726e174-kube-api-access-kvdq7\") pod \"neutron-bd578f44d-rfllv\" (UID: \"875165a4-1092-4e9d-ae24-5044a726e174\") " pod="openstack/neutron-bd578f44d-rfllv"
Jan 28 20:59:48 crc kubenswrapper[4746]: I0128 20:59:48.862188 4746 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/ee1b35a7-970d-4abf-b645-eebbcadd7e8e-ovsdbserver-sb\") pod \"dnsmasq-dns-55f844cf75-8hh8h\" (UID: \"ee1b35a7-970d-4abf-b645-eebbcadd7e8e\") " pod="openstack/dnsmasq-dns-55f844cf75-8hh8h"
Jan 28 20:59:48 crc kubenswrapper[4746]: I0128 20:59:48.862311 4746 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/ee1b35a7-970d-4abf-b645-eebbcadd7e8e-dns-svc\") pod \"dnsmasq-dns-55f844cf75-8hh8h\" (UID: \"ee1b35a7-970d-4abf-b645-eebbcadd7e8e\") " pod="openstack/dnsmasq-dns-55f844cf75-8hh8h"
Jan 28 20:59:48 crc kubenswrapper[4746]: I0128 20:59:48.862356 4746 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/ee1b35a7-970d-4abf-b645-eebbcadd7e8e-dns-swift-storage-0\") pod \"dnsmasq-dns-55f844cf75-8hh8h\" (UID: \"ee1b35a7-970d-4abf-b645-eebbcadd7e8e\") " pod="openstack/dnsmasq-dns-55f844cf75-8hh8h"
Jan 28 20:59:48 crc kubenswrapper[4746]: I0128 20:59:48.862388 4746 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/ee1b35a7-970d-4abf-b645-eebbcadd7e8e-ovsdbserver-nb\") pod \"dnsmasq-dns-55f844cf75-8hh8h\" (UID: \"ee1b35a7-970d-4abf-b645-eebbcadd7e8e\") " pod="openstack/dnsmasq-dns-55f844cf75-8hh8h"
Jan 28 20:59:48 crc kubenswrapper[4746]: I0128 20:59:48.862419 4746 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/ee1b35a7-970d-4abf-b645-eebbcadd7e8e-config\") pod \"dnsmasq-dns-55f844cf75-8hh8h\" (UID: \"ee1b35a7-970d-4abf-b645-eebbcadd7e8e\") " pod="openstack/dnsmasq-dns-55f844cf75-8hh8h"
Jan 28 20:59:48 crc kubenswrapper[4746]: I0128 20:59:48.862447 4746 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-4g669\" (UniqueName: \"kubernetes.io/projected/ee1b35a7-970d-4abf-b645-eebbcadd7e8e-kube-api-access-4g669\") pod \"dnsmasq-dns-55f844cf75-8hh8h\" (UID: \"ee1b35a7-970d-4abf-b645-eebbcadd7e8e\") " pod="openstack/dnsmasq-dns-55f844cf75-8hh8h"
Jan 28 20:59:48 crc kubenswrapper[4746]: I0128 20:59:48.867215 4746 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/ee1b35a7-970d-4abf-b645-eebbcadd7e8e-dns-swift-storage-0\") pod \"dnsmasq-dns-55f844cf75-8hh8h\" (UID: \"ee1b35a7-970d-4abf-b645-eebbcadd7e8e\") " pod="openstack/dnsmasq-dns-55f844cf75-8hh8h"
Jan 28 20:59:48 crc kubenswrapper[4746]: I0128 20:59:48.868483 4746 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/ee1b35a7-970d-4abf-b645-eebbcadd7e8e-ovsdbserver-sb\") pod \"dnsmasq-dns-55f844cf75-8hh8h\" (UID: \"ee1b35a7-970d-4abf-b645-eebbcadd7e8e\") " pod="openstack/dnsmasq-dns-55f844cf75-8hh8h"
Jan 28 20:59:48 crc kubenswrapper[4746]: I0128 20:59:48.877563 4746 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/ee1b35a7-970d-4abf-b645-eebbcadd7e8e-dns-svc\") pod \"dnsmasq-dns-55f844cf75-8hh8h\" (UID: \"ee1b35a7-970d-4abf-b645-eebbcadd7e8e\") " pod="openstack/dnsmasq-dns-55f844cf75-8hh8h"
Jan 28 20:59:48 crc kubenswrapper[4746]: I0128 20:59:48.878105 4746 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/ee1b35a7-970d-4abf-b645-eebbcadd7e8e-ovsdbserver-nb\") pod \"dnsmasq-dns-55f844cf75-8hh8h\" (UID: \"ee1b35a7-970d-4abf-b645-eebbcadd7e8e\") " pod="openstack/dnsmasq-dns-55f844cf75-8hh8h"
Jan 28 20:59:48 crc kubenswrapper[4746]: I0128 20:59:48.879226 4746 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/ee1b35a7-970d-4abf-b645-eebbcadd7e8e-config\") pod \"dnsmasq-dns-55f844cf75-8hh8h\" (UID: \"ee1b35a7-970d-4abf-b645-eebbcadd7e8e\") " pod="openstack/dnsmasq-dns-55f844cf75-8hh8h"
Jan 28 20:59:48 crc kubenswrapper[4746]: I0128 20:59:48.887129 4746 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-4g669\" (UniqueName: \"kubernetes.io/projected/ee1b35a7-970d-4abf-b645-eebbcadd7e8e-kube-api-access-4g669\") pod \"dnsmasq-dns-55f844cf75-8hh8h\" (UID: \"ee1b35a7-970d-4abf-b645-eebbcadd7e8e\") " pod="openstack/dnsmasq-dns-55f844cf75-8hh8h"
Jan 28 20:59:48 crc kubenswrapper[4746]: I0128 20:59:48.974565 4746 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/neutron-bd578f44d-rfllv"
Jan 28 20:59:48 crc kubenswrapper[4746]: I0128 20:59:48.985541 4746 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-55f844cf75-8hh8h"
Jan 28 20:59:50 crc kubenswrapper[4746]: I0128 20:59:50.931884 4746 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/keystone-bootstrap-lwqj2"]
Jan 28 20:59:50 crc kubenswrapper[4746]: I0128 20:59:50.995121 4746 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/neutron-66b8c6d9b5-z7qnr"]
Jan 28 20:59:50 crc kubenswrapper[4746]: I0128 20:59:50.996643 4746 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/neutron-66b8c6d9b5-z7qnr"
Jan 28 20:59:51 crc kubenswrapper[4746]: I0128 20:59:51.000066 4746 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-neutron-internal-svc"
Jan 28 20:59:51 crc kubenswrapper[4746]: I0128 20:59:51.001646 4746 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-neutron-public-svc"
Jan 28 20:59:51 crc kubenswrapper[4746]: I0128 20:59:51.020412 4746 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/neutron-66b8c6d9b5-z7qnr"]
Jan 28 20:59:51 crc kubenswrapper[4746]: I0128 20:59:51.041692 4746 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/40f7281e-2e3c-4ce9-8b0f-876312390c0b-combined-ca-bundle\") pod \"neutron-66b8c6d9b5-z7qnr\" (UID: \"40f7281e-2e3c-4ce9-8b0f-876312390c0b\") " pod="openstack/neutron-66b8c6d9b5-z7qnr"
Jan 28 20:59:51 crc kubenswrapper[4746]: I0128 20:59:51.041757 4746 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-dkznq\" (UniqueName: \"kubernetes.io/projected/40f7281e-2e3c-4ce9-8b0f-876312390c0b-kube-api-access-dkznq\") pod \"neutron-66b8c6d9b5-z7qnr\" (UID: \"40f7281e-2e3c-4ce9-8b0f-876312390c0b\") " pod="openstack/neutron-66b8c6d9b5-z7qnr"
Jan 28 20:59:51 crc kubenswrapper[4746]: I0128 20:59:51.041789 4746 reconciler_common.go:245]
"operationExecutor.VerifyControllerAttachedVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/40f7281e-2e3c-4ce9-8b0f-876312390c0b-public-tls-certs\") pod \"neutron-66b8c6d9b5-z7qnr\" (UID: \"40f7281e-2e3c-4ce9-8b0f-876312390c0b\") " pod="openstack/neutron-66b8c6d9b5-z7qnr" Jan 28 20:59:51 crc kubenswrapper[4746]: I0128 20:59:51.041862 4746 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovndb-tls-certs\" (UniqueName: \"kubernetes.io/secret/40f7281e-2e3c-4ce9-8b0f-876312390c0b-ovndb-tls-certs\") pod \"neutron-66b8c6d9b5-z7qnr\" (UID: \"40f7281e-2e3c-4ce9-8b0f-876312390c0b\") " pod="openstack/neutron-66b8c6d9b5-z7qnr" Jan 28 20:59:51 crc kubenswrapper[4746]: I0128 20:59:51.041888 4746 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"httpd-config\" (UniqueName: \"kubernetes.io/secret/40f7281e-2e3c-4ce9-8b0f-876312390c0b-httpd-config\") pod \"neutron-66b8c6d9b5-z7qnr\" (UID: \"40f7281e-2e3c-4ce9-8b0f-876312390c0b\") " pod="openstack/neutron-66b8c6d9b5-z7qnr" Jan 28 20:59:51 crc kubenswrapper[4746]: I0128 20:59:51.041938 4746 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/40f7281e-2e3c-4ce9-8b0f-876312390c0b-internal-tls-certs\") pod \"neutron-66b8c6d9b5-z7qnr\" (UID: \"40f7281e-2e3c-4ce9-8b0f-876312390c0b\") " pod="openstack/neutron-66b8c6d9b5-z7qnr" Jan 28 20:59:51 crc kubenswrapper[4746]: I0128 20:59:51.041984 4746 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/secret/40f7281e-2e3c-4ce9-8b0f-876312390c0b-config\") pod \"neutron-66b8c6d9b5-z7qnr\" (UID: \"40f7281e-2e3c-4ce9-8b0f-876312390c0b\") " pod="openstack/neutron-66b8c6d9b5-z7qnr" Jan 28 20:59:51 crc kubenswrapper[4746]: I0128 20:59:51.147809 4746 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/40f7281e-2e3c-4ce9-8b0f-876312390c0b-combined-ca-bundle\") pod \"neutron-66b8c6d9b5-z7qnr\" (UID: \"40f7281e-2e3c-4ce9-8b0f-876312390c0b\") " pod="openstack/neutron-66b8c6d9b5-z7qnr" Jan 28 20:59:51 crc kubenswrapper[4746]: I0128 20:59:51.148577 4746 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-dkznq\" (UniqueName: \"kubernetes.io/projected/40f7281e-2e3c-4ce9-8b0f-876312390c0b-kube-api-access-dkznq\") pod \"neutron-66b8c6d9b5-z7qnr\" (UID: \"40f7281e-2e3c-4ce9-8b0f-876312390c0b\") " pod="openstack/neutron-66b8c6d9b5-z7qnr" Jan 28 20:59:51 crc kubenswrapper[4746]: I0128 20:59:51.148687 4746 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/40f7281e-2e3c-4ce9-8b0f-876312390c0b-public-tls-certs\") pod \"neutron-66b8c6d9b5-z7qnr\" (UID: \"40f7281e-2e3c-4ce9-8b0f-876312390c0b\") " pod="openstack/neutron-66b8c6d9b5-z7qnr" Jan 28 20:59:51 crc kubenswrapper[4746]: I0128 20:59:51.148879 4746 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovndb-tls-certs\" (UniqueName: \"kubernetes.io/secret/40f7281e-2e3c-4ce9-8b0f-876312390c0b-ovndb-tls-certs\") pod \"neutron-66b8c6d9b5-z7qnr\" (UID: \"40f7281e-2e3c-4ce9-8b0f-876312390c0b\") " pod="openstack/neutron-66b8c6d9b5-z7qnr" Jan 28 20:59:51 crc kubenswrapper[4746]: I0128 20:59:51.148937 4746 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"httpd-config\" (UniqueName: \"kubernetes.io/secret/40f7281e-2e3c-4ce9-8b0f-876312390c0b-httpd-config\") pod \"neutron-66b8c6d9b5-z7qnr\" (UID: \"40f7281e-2e3c-4ce9-8b0f-876312390c0b\") " pod="openstack/neutron-66b8c6d9b5-z7qnr" Jan 28 20:59:51 crc kubenswrapper[4746]: I0128 20:59:51.149375 4746 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"internal-tls-certs\" 
(UniqueName: \"kubernetes.io/secret/40f7281e-2e3c-4ce9-8b0f-876312390c0b-internal-tls-certs\") pod \"neutron-66b8c6d9b5-z7qnr\" (UID: \"40f7281e-2e3c-4ce9-8b0f-876312390c0b\") " pod="openstack/neutron-66b8c6d9b5-z7qnr" Jan 28 20:59:51 crc kubenswrapper[4746]: I0128 20:59:51.149433 4746 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/secret/40f7281e-2e3c-4ce9-8b0f-876312390c0b-config\") pod \"neutron-66b8c6d9b5-z7qnr\" (UID: \"40f7281e-2e3c-4ce9-8b0f-876312390c0b\") " pod="openstack/neutron-66b8c6d9b5-z7qnr" Jan 28 20:59:51 crc kubenswrapper[4746]: I0128 20:59:51.154914 4746 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/40f7281e-2e3c-4ce9-8b0f-876312390c0b-public-tls-certs\") pod \"neutron-66b8c6d9b5-z7qnr\" (UID: \"40f7281e-2e3c-4ce9-8b0f-876312390c0b\") " pod="openstack/neutron-66b8c6d9b5-z7qnr" Jan 28 20:59:51 crc kubenswrapper[4746]: I0128 20:59:51.155243 4746 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovndb-tls-certs\" (UniqueName: \"kubernetes.io/secret/40f7281e-2e3c-4ce9-8b0f-876312390c0b-ovndb-tls-certs\") pod \"neutron-66b8c6d9b5-z7qnr\" (UID: \"40f7281e-2e3c-4ce9-8b0f-876312390c0b\") " pod="openstack/neutron-66b8c6d9b5-z7qnr" Jan 28 20:59:51 crc kubenswrapper[4746]: I0128 20:59:51.155910 4746 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"httpd-config\" (UniqueName: \"kubernetes.io/secret/40f7281e-2e3c-4ce9-8b0f-876312390c0b-httpd-config\") pod \"neutron-66b8c6d9b5-z7qnr\" (UID: \"40f7281e-2e3c-4ce9-8b0f-876312390c0b\") " pod="openstack/neutron-66b8c6d9b5-z7qnr" Jan 28 20:59:51 crc kubenswrapper[4746]: I0128 20:59:51.156546 4746 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/40f7281e-2e3c-4ce9-8b0f-876312390c0b-internal-tls-certs\") pod \"neutron-66b8c6d9b5-z7qnr\" (UID: 
\"40f7281e-2e3c-4ce9-8b0f-876312390c0b\") " pod="openstack/neutron-66b8c6d9b5-z7qnr" Jan 28 20:59:51 crc kubenswrapper[4746]: I0128 20:59:51.157232 4746 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/40f7281e-2e3c-4ce9-8b0f-876312390c0b-combined-ca-bundle\") pod \"neutron-66b8c6d9b5-z7qnr\" (UID: \"40f7281e-2e3c-4ce9-8b0f-876312390c0b\") " pod="openstack/neutron-66b8c6d9b5-z7qnr" Jan 28 20:59:51 crc kubenswrapper[4746]: I0128 20:59:51.158192 4746 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/secret/40f7281e-2e3c-4ce9-8b0f-876312390c0b-config\") pod \"neutron-66b8c6d9b5-z7qnr\" (UID: \"40f7281e-2e3c-4ce9-8b0f-876312390c0b\") " pod="openstack/neutron-66b8c6d9b5-z7qnr" Jan 28 20:59:51 crc kubenswrapper[4746]: I0128 20:59:51.168726 4746 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-dkznq\" (UniqueName: \"kubernetes.io/projected/40f7281e-2e3c-4ce9-8b0f-876312390c0b-kube-api-access-dkznq\") pod \"neutron-66b8c6d9b5-z7qnr\" (UID: \"40f7281e-2e3c-4ce9-8b0f-876312390c0b\") " pod="openstack/neutron-66b8c6d9b5-z7qnr" Jan 28 20:59:51 crc kubenswrapper[4746]: I0128 20:59:51.340850 4746 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/neutron-66b8c6d9b5-z7qnr" Jan 28 20:59:51 crc kubenswrapper[4746]: E0128 20:59:51.386435 4746 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying config: context canceled" image="quay.rdoproject.org/podified-master-centos10/openstack-cloudkitty-api:current" Jan 28 20:59:51 crc kubenswrapper[4746]: E0128 20:59:51.386493 4746 kuberuntime_image.go:55] "Failed to pull image" err="rpc error: code = Canceled desc = copying config: context canceled" image="quay.rdoproject.org/podified-master-centos10/openstack-cloudkitty-api:current" Jan 28 20:59:51 crc kubenswrapper[4746]: E0128 20:59:51.386628 4746 kuberuntime_manager.go:1274] "Unhandled Error" err="container &Container{Name:cloudkitty-db-sync,Image:quay.rdoproject.org/podified-master-centos10/openstack-cloudkitty-api:current,Command:[/bin/bash],Args:[-c /usr/local/bin/kolla_start],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:CloudKittyPassword,Value:,ValueFrom:&EnvVarSource{FieldRef:nil,ResourceFieldRef:nil,ConfigMapKeyRef:nil,SecretKeyRef:&SecretKeySelector{LocalObjectReference:LocalObjectReference{Name:osp-secret,},Key:CloudKittyPassword,Optional:nil,},},},EnvVar{Name:KOLLA_BOOTSTRAP,Value:TRUE,ValueFrom:nil,},EnvVar{Name:KOLLA_CONFIG_STRATEGY,Value:COPY_ALWAYS,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:scripts,ReadOnly:true,MountPath:/var/lib/openstack/bin,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:config-data,ReadOnly:true,MountPath:/var/lib/openstack/config,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:config-data,ReadOnly:true,MountPath:/var/lib/kolla/config_files/config.json,SubPath:cloudkitty-dbsync-config.json,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:certs,ReadOnly:true,MountPath:/
var/lib/openstack/loki-certs,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:combined-ca-bundle,ReadOnly:true,MountPath:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem,SubPath:tls-ca-bundle.pem,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-bm2k7,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[MKNOD],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*42406,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:nil,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod cloudkitty-db-sync-p9ghn_openstack(cce95230-2b72-4598-9d28-3a1465803567): ErrImagePull: rpc error: code = Canceled desc = copying config: context canceled" logger="UnhandledError" Jan 28 20:59:51 crc kubenswrapper[4746]: E0128 20:59:51.388366 4746 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"cloudkitty-db-sync\" with ErrImagePull: \"rpc error: code = Canceled desc = copying config: context canceled\"" pod="openstack/cloudkitty-db-sync-p9ghn" podUID="cce95230-2b72-4598-9d28-3a1465803567" Jan 28 20:59:51 crc kubenswrapper[4746]: I0128 20:59:51.474344 4746 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-internal-api-0" event={"ID":"e3a2491c-4637-4cba-a11d-7c8dd8c703d1","Type":"ContainerStarted","Data":"db43805063895c8d2ee0a81f3d782da9c0c4a76f2cc971a8cd4481cc8c37c0d2"} Jan 28 20:59:51 crc 
kubenswrapper[4746]: I0128 20:59:51.484209 4746 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-785d8bcb8c-kkq9p" event={"ID":"3c3fee24-35f9-4695-b25d-b49430708c43","Type":"ContainerStarted","Data":"4e9edbde0ef7a7f269bc15422a131f159583007fe51ecb82844c5ee7a049ebb9"} Jan 28 20:59:51 crc kubenswrapper[4746]: I0128 20:59:51.484406 4746 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/dnsmasq-dns-785d8bcb8c-kkq9p" podUID="3c3fee24-35f9-4695-b25d-b49430708c43" containerName="dnsmasq-dns" containerID="cri-o://4e9edbde0ef7a7f269bc15422a131f159583007fe51ecb82844c5ee7a049ebb9" gracePeriod=10 Jan 28 20:59:51 crc kubenswrapper[4746]: I0128 20:59:51.485234 4746 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/dnsmasq-dns-785d8bcb8c-kkq9p" Jan 28 20:59:51 crc kubenswrapper[4746]: I0128 20:59:51.492237 4746 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-bootstrap-lwqj2" event={"ID":"f81b35d0-5755-476e-a5c9-30036d654d53","Type":"ContainerStarted","Data":"cd23339f5df85b446202742504369732d4a27be983614bcb13cac53292b9e8cd"} Jan 28 20:59:51 crc kubenswrapper[4746]: I0128 20:59:51.505121 4746 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-external-api-0" event={"ID":"be7f3403-801e-45de-9517-8b3ca91d9682","Type":"ContainerStarted","Data":"b8e4ac400720964fcbec31f5564ece3c113b3114169cec21addcb297ed3ec4e6"} Jan 28 20:59:51 crc kubenswrapper[4746]: I0128 20:59:51.529267 4746 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/dnsmasq-dns-785d8bcb8c-kkq9p" podStartSLOduration=29.529241605 podStartE2EDuration="29.529241605s" podCreationTimestamp="2026-01-28 20:59:22 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-28 20:59:51.510967884 +0000 UTC m=+1219.467154228" watchObservedRunningTime="2026-01-28 
20:59:51.529241605 +0000 UTC m=+1219.485427979" Jan 28 20:59:51 crc kubenswrapper[4746]: I0128 20:59:51.541344 4746 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/prometheus-metric-storage-0" event={"ID":"914308b3-0f5e-4716-bc87-948f8a8acfb3","Type":"ContainerStarted","Data":"36c889c5008b6ee77215004bedd7435a02ca1c4eb8308bf41e72ba9d3e73befa"} Jan 28 20:59:51 crc kubenswrapper[4746]: E0128 20:59:51.578156 4746 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"cloudkitty-db-sync\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.rdoproject.org/podified-master-centos10/openstack-cloudkitty-api:current\\\"\"" pod="openstack/cloudkitty-db-sync-p9ghn" podUID="cce95230-2b72-4598-9d28-3a1465803567" Jan 28 20:59:51 crc kubenswrapper[4746]: I0128 20:59:51.619767 4746 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/prometheus-metric-storage-0" podStartSLOduration=43.61974703 podStartE2EDuration="43.61974703s" podCreationTimestamp="2026-01-28 20:59:08 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-28 20:59:51.602541728 +0000 UTC m=+1219.558728082" watchObservedRunningTime="2026-01-28 20:59:51.61974703 +0000 UTC m=+1219.575933384" Jan 28 20:59:52 crc kubenswrapper[4746]: I0128 20:59:52.150487 4746 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-785d8bcb8c-kkq9p" Jan 28 20:59:52 crc kubenswrapper[4746]: I0128 20:59:52.247453 4746 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-55f844cf75-8hh8h"] Jan 28 20:59:52 crc kubenswrapper[4746]: I0128 20:59:52.276112 4746 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/3c3fee24-35f9-4695-b25d-b49430708c43-ovsdbserver-sb\") pod \"3c3fee24-35f9-4695-b25d-b49430708c43\" (UID: \"3c3fee24-35f9-4695-b25d-b49430708c43\") " Jan 28 20:59:52 crc kubenswrapper[4746]: I0128 20:59:52.276295 4746 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/3c3fee24-35f9-4695-b25d-b49430708c43-dns-swift-storage-0\") pod \"3c3fee24-35f9-4695-b25d-b49430708c43\" (UID: \"3c3fee24-35f9-4695-b25d-b49430708c43\") " Jan 28 20:59:52 crc kubenswrapper[4746]: I0128 20:59:52.276352 4746 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/3c3fee24-35f9-4695-b25d-b49430708c43-ovsdbserver-nb\") pod \"3c3fee24-35f9-4695-b25d-b49430708c43\" (UID: \"3c3fee24-35f9-4695-b25d-b49430708c43\") " Jan 28 20:59:52 crc kubenswrapper[4746]: I0128 20:59:52.276378 4746 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-sggwc\" (UniqueName: \"kubernetes.io/projected/3c3fee24-35f9-4695-b25d-b49430708c43-kube-api-access-sggwc\") pod \"3c3fee24-35f9-4695-b25d-b49430708c43\" (UID: \"3c3fee24-35f9-4695-b25d-b49430708c43\") " Jan 28 20:59:52 crc kubenswrapper[4746]: I0128 20:59:52.276399 4746 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/3c3fee24-35f9-4695-b25d-b49430708c43-config\") pod \"3c3fee24-35f9-4695-b25d-b49430708c43\" (UID: 
\"3c3fee24-35f9-4695-b25d-b49430708c43\") " Jan 28 20:59:52 crc kubenswrapper[4746]: I0128 20:59:52.276488 4746 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/3c3fee24-35f9-4695-b25d-b49430708c43-dns-svc\") pod \"3c3fee24-35f9-4695-b25d-b49430708c43\" (UID: \"3c3fee24-35f9-4695-b25d-b49430708c43\") " Jan 28 20:59:52 crc kubenswrapper[4746]: I0128 20:59:52.311012 4746 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/3c3fee24-35f9-4695-b25d-b49430708c43-kube-api-access-sggwc" (OuterVolumeSpecName: "kube-api-access-sggwc") pod "3c3fee24-35f9-4695-b25d-b49430708c43" (UID: "3c3fee24-35f9-4695-b25d-b49430708c43"). InnerVolumeSpecName "kube-api-access-sggwc". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 28 20:59:52 crc kubenswrapper[4746]: I0128 20:59:52.334002 4746 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/neutron-bd578f44d-rfllv"] Jan 28 20:59:52 crc kubenswrapper[4746]: I0128 20:59:52.358772 4746 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/3c3fee24-35f9-4695-b25d-b49430708c43-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "3c3fee24-35f9-4695-b25d-b49430708c43" (UID: "3c3fee24-35f9-4695-b25d-b49430708c43"). InnerVolumeSpecName "dns-svc". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 28 20:59:52 crc kubenswrapper[4746]: I0128 20:59:52.365469 4746 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/3c3fee24-35f9-4695-b25d-b49430708c43-ovsdbserver-nb" (OuterVolumeSpecName: "ovsdbserver-nb") pod "3c3fee24-35f9-4695-b25d-b49430708c43" (UID: "3c3fee24-35f9-4695-b25d-b49430708c43"). InnerVolumeSpecName "ovsdbserver-nb". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 28 20:59:52 crc kubenswrapper[4746]: I0128 20:59:52.380218 4746 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-sggwc\" (UniqueName: \"kubernetes.io/projected/3c3fee24-35f9-4695-b25d-b49430708c43-kube-api-access-sggwc\") on node \"crc\" DevicePath \"\"" Jan 28 20:59:52 crc kubenswrapper[4746]: I0128 20:59:52.380247 4746 reconciler_common.go:293] "Volume detached for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/3c3fee24-35f9-4695-b25d-b49430708c43-dns-svc\") on node \"crc\" DevicePath \"\"" Jan 28 20:59:52 crc kubenswrapper[4746]: I0128 20:59:52.380256 4746 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/3c3fee24-35f9-4695-b25d-b49430708c43-ovsdbserver-nb\") on node \"crc\" DevicePath \"\"" Jan 28 20:59:52 crc kubenswrapper[4746]: I0128 20:59:52.433339 4746 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/neutron-66b8c6d9b5-z7qnr"] Jan 28 20:59:52 crc kubenswrapper[4746]: W0128 20:59:52.446951 4746 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod40f7281e_2e3c_4ce9_8b0f_876312390c0b.slice/crio-61c5f777b1fc0ee7cd1bb33efb0c01a97b06132e1dfad46546e508ed29e9122b WatchSource:0}: Error finding container 61c5f777b1fc0ee7cd1bb33efb0c01a97b06132e1dfad46546e508ed29e9122b: Status 404 returned error can't find the container with id 61c5f777b1fc0ee7cd1bb33efb0c01a97b06132e1dfad46546e508ed29e9122b Jan 28 20:59:52 crc kubenswrapper[4746]: I0128 20:59:52.451697 4746 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/3c3fee24-35f9-4695-b25d-b49430708c43-dns-swift-storage-0" (OuterVolumeSpecName: "dns-swift-storage-0") pod "3c3fee24-35f9-4695-b25d-b49430708c43" (UID: "3c3fee24-35f9-4695-b25d-b49430708c43"). InnerVolumeSpecName "dns-swift-storage-0". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 28 20:59:52 crc kubenswrapper[4746]: I0128 20:59:52.483336 4746 reconciler_common.go:293] "Volume detached for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/3c3fee24-35f9-4695-b25d-b49430708c43-dns-swift-storage-0\") on node \"crc\" DevicePath \"\"" Jan 28 20:59:52 crc kubenswrapper[4746]: I0128 20:59:52.532260 4746 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/3c3fee24-35f9-4695-b25d-b49430708c43-config" (OuterVolumeSpecName: "config") pod "3c3fee24-35f9-4695-b25d-b49430708c43" (UID: "3c3fee24-35f9-4695-b25d-b49430708c43"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 28 20:59:52 crc kubenswrapper[4746]: I0128 20:59:52.549038 4746 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/3c3fee24-35f9-4695-b25d-b49430708c43-ovsdbserver-sb" (OuterVolumeSpecName: "ovsdbserver-sb") pod "3c3fee24-35f9-4695-b25d-b49430708c43" (UID: "3c3fee24-35f9-4695-b25d-b49430708c43"). InnerVolumeSpecName "ovsdbserver-sb". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 28 20:59:52 crc kubenswrapper[4746]: I0128 20:59:52.556046 4746 generic.go:334] "Generic (PLEG): container finished" podID="3c3fee24-35f9-4695-b25d-b49430708c43" containerID="4e9edbde0ef7a7f269bc15422a131f159583007fe51ecb82844c5ee7a049ebb9" exitCode=0 Jan 28 20:59:52 crc kubenswrapper[4746]: I0128 20:59:52.556136 4746 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-785d8bcb8c-kkq9p" event={"ID":"3c3fee24-35f9-4695-b25d-b49430708c43","Type":"ContainerDied","Data":"4e9edbde0ef7a7f269bc15422a131f159583007fe51ecb82844c5ee7a049ebb9"} Jan 28 20:59:52 crc kubenswrapper[4746]: I0128 20:59:52.556168 4746 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-785d8bcb8c-kkq9p" event={"ID":"3c3fee24-35f9-4695-b25d-b49430708c43","Type":"ContainerDied","Data":"9ea3cec0589a36c1cb235bb18def62f3de008784dfaf83103c8a03b7d88cb517"} Jan 28 20:59:52 crc kubenswrapper[4746]: I0128 20:59:52.556188 4746 scope.go:117] "RemoveContainer" containerID="4e9edbde0ef7a7f269bc15422a131f159583007fe51ecb82844c5ee7a049ebb9" Jan 28 20:59:52 crc kubenswrapper[4746]: I0128 20:59:52.556326 4746 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-785d8bcb8c-kkq9p" Jan 28 20:59:52 crc kubenswrapper[4746]: I0128 20:59:52.564792 4746 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-db-sync-jms86" event={"ID":"1d79950b-c574-4952-8620-ff635db5e8de","Type":"ContainerStarted","Data":"d23c067486da07f978541c7dba8e6461c6957dc68fe5452d6fe0d4b93cf13ed7"} Jan 28 20:59:52 crc kubenswrapper[4746]: I0128 20:59:52.593219 4746 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/3c3fee24-35f9-4695-b25d-b49430708c43-ovsdbserver-sb\") on node \"crc\" DevicePath \"\"" Jan 28 20:59:52 crc kubenswrapper[4746]: I0128 20:59:52.593258 4746 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/3c3fee24-35f9-4695-b25d-b49430708c43-config\") on node \"crc\" DevicePath \"\"" Jan 28 20:59:52 crc kubenswrapper[4746]: I0128 20:59:52.598804 4746 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/placement-db-sync-jms86" podStartSLOduration=9.82478642 podStartE2EDuration="36.598785185s" podCreationTimestamp="2026-01-28 20:59:16 +0000 UTC" firstStartedPulling="2026-01-28 20:59:18.280306863 +0000 UTC m=+1186.236493227" lastFinishedPulling="2026-01-28 20:59:45.054305638 +0000 UTC m=+1213.010491992" observedRunningTime="2026-01-28 20:59:52.589942967 +0000 UTC m=+1220.546129321" watchObservedRunningTime="2026-01-28 20:59:52.598785185 +0000 UTC m=+1220.554971539" Jan 28 20:59:52 crc kubenswrapper[4746]: I0128 20:59:52.599966 4746 generic.go:334] "Generic (PLEG): container finished" podID="854c60ea-889a-449a-a74b-39f6f973f52c" containerID="3c603454762a8447520867b8a7b5bc8f97ead8fd9b37d66bcdf6542f41afe2d5" exitCode=0 Jan 28 20:59:52 crc kubenswrapper[4746]: I0128 20:59:52.600050 4746 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/root-account-create-update-msxbp" 
event={"ID":"854c60ea-889a-449a-a74b-39f6f973f52c","Type":"ContainerDied","Data":"3c603454762a8447520867b8a7b5bc8f97ead8fd9b37d66bcdf6542f41afe2d5"} Jan 28 20:59:52 crc kubenswrapper[4746]: I0128 20:59:52.605193 4746 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-bootstrap-lwqj2" event={"ID":"f81b35d0-5755-476e-a5c9-30036d654d53","Type":"ContainerStarted","Data":"b2ac020924b66e52e098563168d3a468eaa3d99aa0608464dec14ec0357795ec"} Jan 28 20:59:52 crc kubenswrapper[4746]: I0128 20:59:52.608317 4746 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"e9abc9de-4ca9-4ba4-afca-bb6b5bf50924","Type":"ContainerStarted","Data":"e3997fad01bd1a3636c4f7b48290559049e0c983129537cc89988b435cd7c658"} Jan 28 20:59:52 crc kubenswrapper[4746]: I0128 20:59:52.610147 4746 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-55f844cf75-8hh8h" event={"ID":"ee1b35a7-970d-4abf-b645-eebbcadd7e8e","Type":"ContainerStarted","Data":"ef953ba0e39534e4f0d56311f6dc84a03fe95bc7ca5f0e6fb8a1dda880f72e34"} Jan 28 20:59:52 crc kubenswrapper[4746]: I0128 20:59:52.618156 4746 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-66b8c6d9b5-z7qnr" event={"ID":"40f7281e-2e3c-4ce9-8b0f-876312390c0b","Type":"ContainerStarted","Data":"61c5f777b1fc0ee7cd1bb33efb0c01a97b06132e1dfad46546e508ed29e9122b"} Jan 28 20:59:52 crc kubenswrapper[4746]: I0128 20:59:52.631851 4746 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-bd578f44d-rfllv" event={"ID":"875165a4-1092-4e9d-ae24-5044a726e174","Type":"ContainerStarted","Data":"eaa4be35ac05930b0aaf65c487338144a247098d838392db5fbfbed3ea6cd2e6"} Jan 28 20:59:52 crc kubenswrapper[4746]: I0128 20:59:52.662177 4746 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/keystone-bootstrap-lwqj2" podStartSLOduration=22.662128858 podStartE2EDuration="22.662128858s" podCreationTimestamp="2026-01-28 20:59:30 +0000 UTC" 
firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-28 20:59:52.647887226 +0000 UTC m=+1220.604073580" watchObservedRunningTime="2026-01-28 20:59:52.662128858 +0000 UTC m=+1220.618315232" Jan 28 20:59:52 crc kubenswrapper[4746]: I0128 20:59:52.678156 4746 scope.go:117] "RemoveContainer" containerID="5a93bb99a49c68abca554da1927ef539fe9de6b02067e921ad5160164c5dd2cf" Jan 28 20:59:52 crc kubenswrapper[4746]: I0128 20:59:52.799382 4746 scope.go:117] "RemoveContainer" containerID="4e9edbde0ef7a7f269bc15422a131f159583007fe51ecb82844c5ee7a049ebb9" Jan 28 20:59:52 crc kubenswrapper[4746]: E0128 20:59:52.801034 4746 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"4e9edbde0ef7a7f269bc15422a131f159583007fe51ecb82844c5ee7a049ebb9\": container with ID starting with 4e9edbde0ef7a7f269bc15422a131f159583007fe51ecb82844c5ee7a049ebb9 not found: ID does not exist" containerID="4e9edbde0ef7a7f269bc15422a131f159583007fe51ecb82844c5ee7a049ebb9" Jan 28 20:59:52 crc kubenswrapper[4746]: I0128 20:59:52.801062 4746 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"4e9edbde0ef7a7f269bc15422a131f159583007fe51ecb82844c5ee7a049ebb9"} err="failed to get container status \"4e9edbde0ef7a7f269bc15422a131f159583007fe51ecb82844c5ee7a049ebb9\": rpc error: code = NotFound desc = could not find container \"4e9edbde0ef7a7f269bc15422a131f159583007fe51ecb82844c5ee7a049ebb9\": container with ID starting with 4e9edbde0ef7a7f269bc15422a131f159583007fe51ecb82844c5ee7a049ebb9 not found: ID does not exist" Jan 28 20:59:52 crc kubenswrapper[4746]: I0128 20:59:52.801085 4746 scope.go:117] "RemoveContainer" containerID="5a93bb99a49c68abca554da1927ef539fe9de6b02067e921ad5160164c5dd2cf" Jan 28 20:59:52 crc kubenswrapper[4746]: E0128 20:59:52.802323 4746 log.go:32] "ContainerStatus from runtime service failed" err="rpc 
error: code = NotFound desc = could not find container \"5a93bb99a49c68abca554da1927ef539fe9de6b02067e921ad5160164c5dd2cf\": container with ID starting with 5a93bb99a49c68abca554da1927ef539fe9de6b02067e921ad5160164c5dd2cf not found: ID does not exist" containerID="5a93bb99a49c68abca554da1927ef539fe9de6b02067e921ad5160164c5dd2cf" Jan 28 20:59:52 crc kubenswrapper[4746]: I0128 20:59:52.802351 4746 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"5a93bb99a49c68abca554da1927ef539fe9de6b02067e921ad5160164c5dd2cf"} err="failed to get container status \"5a93bb99a49c68abca554da1927ef539fe9de6b02067e921ad5160164c5dd2cf\": rpc error: code = NotFound desc = could not find container \"5a93bb99a49c68abca554da1927ef539fe9de6b02067e921ad5160164c5dd2cf\": container with ID starting with 5a93bb99a49c68abca554da1927ef539fe9de6b02067e921ad5160164c5dd2cf not found: ID does not exist" Jan 28 20:59:52 crc kubenswrapper[4746]: I0128 20:59:52.855006 4746 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-785d8bcb8c-kkq9p"] Jan 28 20:59:52 crc kubenswrapper[4746]: I0128 20:59:52.868439 4746 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-785d8bcb8c-kkq9p"] Jan 28 20:59:53 crc kubenswrapper[4746]: I0128 20:59:53.664390 4746 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-internal-api-0" event={"ID":"e3a2491c-4637-4cba-a11d-7c8dd8c703d1","Type":"ContainerStarted","Data":"8b1171cc065d5d7915beb03ba398c2872dee6b0f54de2064335253d18a44f563"} Jan 28 20:59:53 crc kubenswrapper[4746]: I0128 20:59:53.664807 4746 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-internal-api-0" event={"ID":"e3a2491c-4637-4cba-a11d-7c8dd8c703d1","Type":"ContainerStarted","Data":"92e177f5ba767ba3a0ccc9b14089d91ff587cfd6340355ea56c80ef975741ee1"} Jan 28 20:59:53 crc kubenswrapper[4746]: I0128 20:59:53.664948 4746 kuberuntime_container.go:808] "Killing 
container with a grace period" pod="openstack/glance-default-internal-api-0" podUID="e3a2491c-4637-4cba-a11d-7c8dd8c703d1" containerName="glance-log" containerID="cri-o://92e177f5ba767ba3a0ccc9b14089d91ff587cfd6340355ea56c80ef975741ee1" gracePeriod=30 Jan 28 20:59:53 crc kubenswrapper[4746]: I0128 20:59:53.666704 4746 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/glance-default-internal-api-0" podUID="e3a2491c-4637-4cba-a11d-7c8dd8c703d1" containerName="glance-httpd" containerID="cri-o://8b1171cc065d5d7915beb03ba398c2872dee6b0f54de2064335253d18a44f563" gracePeriod=30 Jan 28 20:59:53 crc kubenswrapper[4746]: I0128 20:59:53.690189 4746 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-external-api-0" event={"ID":"be7f3403-801e-45de-9517-8b3ca91d9682","Type":"ContainerStarted","Data":"835bc7c38d859b9196b288139c7a228017c05f4f15ad9d28b666a732288589e7"} Jan 28 20:59:53 crc kubenswrapper[4746]: I0128 20:59:53.703551 4746 generic.go:334] "Generic (PLEG): container finished" podID="ee1b35a7-970d-4abf-b645-eebbcadd7e8e" containerID="348c53e2e4cc51ad0b2a0899e9d685d1ae75a36ea387f74a447d30fc80071eee" exitCode=0 Jan 28 20:59:53 crc kubenswrapper[4746]: I0128 20:59:53.703831 4746 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-55f844cf75-8hh8h" event={"ID":"ee1b35a7-970d-4abf-b645-eebbcadd7e8e","Type":"ContainerDied","Data":"348c53e2e4cc51ad0b2a0899e9d685d1ae75a36ea387f74a447d30fc80071eee"} Jan 28 20:59:53 crc kubenswrapper[4746]: I0128 20:59:53.710236 4746 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-66b8c6d9b5-z7qnr" event={"ID":"40f7281e-2e3c-4ce9-8b0f-876312390c0b","Type":"ContainerStarted","Data":"7b0a672bdbe84c0ab3283f2f4f2a082d7f780e806e7442f059d9e6da34f9ef64"} Jan 28 20:59:53 crc kubenswrapper[4746]: I0128 20:59:53.710313 4746 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-66b8c6d9b5-z7qnr" 
event={"ID":"40f7281e-2e3c-4ce9-8b0f-876312390c0b","Type":"ContainerStarted","Data":"f4d3b5e2216f8a4fe2a3ab22e9e3ac51f770c2bc3e05d57ffd1bc03c025cbfbb"} Jan 28 20:59:53 crc kubenswrapper[4746]: I0128 20:59:53.714018 4746 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/neutron-66b8c6d9b5-z7qnr" Jan 28 20:59:53 crc kubenswrapper[4746]: I0128 20:59:53.724828 4746 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-bd578f44d-rfllv" event={"ID":"875165a4-1092-4e9d-ae24-5044a726e174","Type":"ContainerStarted","Data":"b5b98064c5cf0b8cf0bae5f8b9c12ae68524d3be0f4240ff9272bdb1f0ba8e6e"} Jan 28 20:59:53 crc kubenswrapper[4746]: I0128 20:59:53.724893 4746 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-bd578f44d-rfllv" event={"ID":"875165a4-1092-4e9d-ae24-5044a726e174","Type":"ContainerStarted","Data":"6fc58bbe85e7f3c3353de5258e71e22d05d995e6b763b3ca70e96349116d341e"} Jan 28 20:59:53 crc kubenswrapper[4746]: I0128 20:59:53.726111 4746 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/neutron-bd578f44d-rfllv" Jan 28 20:59:53 crc kubenswrapper[4746]: I0128 20:59:53.751521 4746 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/glance-default-internal-api-0" podStartSLOduration=31.751496081 podStartE2EDuration="31.751496081s" podCreationTimestamp="2026-01-28 20:59:22 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-28 20:59:53.687429088 +0000 UTC m=+1221.643615452" watchObservedRunningTime="2026-01-28 20:59:53.751496081 +0000 UTC m=+1221.707682445" Jan 28 20:59:53 crc kubenswrapper[4746]: I0128 20:59:53.806910 4746 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/neutron-bd578f44d-rfllv" podStartSLOduration=5.806879501 podStartE2EDuration="5.806879501s" podCreationTimestamp="2026-01-28 20:59:48 +0000 UTC" 
firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-28 20:59:53.803830649 +0000 UTC m=+1221.760017003" watchObservedRunningTime="2026-01-28 20:59:53.806879501 +0000 UTC m=+1221.763065855" Jan 28 20:59:53 crc kubenswrapper[4746]: I0128 20:59:53.815755 4746 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/neutron-66b8c6d9b5-z7qnr" podStartSLOduration=3.815729679 podStartE2EDuration="3.815729679s" podCreationTimestamp="2026-01-28 20:59:50 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-28 20:59:53.773636787 +0000 UTC m=+1221.729823141" watchObservedRunningTime="2026-01-28 20:59:53.815729679 +0000 UTC m=+1221.771916033" Jan 28 20:59:53 crc kubenswrapper[4746]: I0128 20:59:53.840385 4746 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/prometheus-metric-storage-0" Jan 28 20:59:53 crc kubenswrapper[4746]: I0128 20:59:53.840424 4746 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/prometheus-metric-storage-0" Jan 28 20:59:53 crc kubenswrapper[4746]: I0128 20:59:53.847326 4746 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/prometheus-metric-storage-0" Jan 28 20:59:53 crc kubenswrapper[4746]: I0128 20:59:53.893346 4746 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/glance-default-internal-api-0" Jan 28 20:59:53 crc kubenswrapper[4746]: I0128 20:59:53.893441 4746 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/glance-default-internal-api-0" Jan 28 20:59:54 crc kubenswrapper[4746]: I0128 20:59:54.731213 4746 generic.go:334] "Generic (PLEG): container finished" podID="e3a2491c-4637-4cba-a11d-7c8dd8c703d1" containerID="8b1171cc065d5d7915beb03ba398c2872dee6b0f54de2064335253d18a44f563" exitCode=143 Jan 28 
20:59:54 crc kubenswrapper[4746]: I0128 20:59:54.731493 4746 generic.go:334] "Generic (PLEG): container finished" podID="e3a2491c-4637-4cba-a11d-7c8dd8c703d1" containerID="92e177f5ba767ba3a0ccc9b14089d91ff587cfd6340355ea56c80ef975741ee1" exitCode=143 Jan 28 20:59:54 crc kubenswrapper[4746]: I0128 20:59:54.731529 4746 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-internal-api-0" event={"ID":"e3a2491c-4637-4cba-a11d-7c8dd8c703d1","Type":"ContainerDied","Data":"8b1171cc065d5d7915beb03ba398c2872dee6b0f54de2064335253d18a44f563"} Jan 28 20:59:54 crc kubenswrapper[4746]: I0128 20:59:54.731556 4746 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-internal-api-0" event={"ID":"e3a2491c-4637-4cba-a11d-7c8dd8c703d1","Type":"ContainerDied","Data":"92e177f5ba767ba3a0ccc9b14089d91ff587cfd6340355ea56c80ef975741ee1"} Jan 28 20:59:54 crc kubenswrapper[4746]: I0128 20:59:54.734323 4746 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/glance-default-external-api-0" podUID="be7f3403-801e-45de-9517-8b3ca91d9682" containerName="glance-log" containerID="cri-o://835bc7c38d859b9196b288139c7a228017c05f4f15ad9d28b666a732288589e7" gracePeriod=30 Jan 28 20:59:54 crc kubenswrapper[4746]: I0128 20:59:54.734582 4746 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-external-api-0" event={"ID":"be7f3403-801e-45de-9517-8b3ca91d9682","Type":"ContainerStarted","Data":"91b72c15237fea2bc0c6ce6901566a6e8af4fa5443e2c5efbbebe6e5c8e9d81b"} Jan 28 20:59:54 crc kubenswrapper[4746]: I0128 20:59:54.735773 4746 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/glance-default-external-api-0" podUID="be7f3403-801e-45de-9517-8b3ca91d9682" containerName="glance-httpd" containerID="cri-o://91b72c15237fea2bc0c6ce6901566a6e8af4fa5443e2c5efbbebe6e5c8e9d81b" gracePeriod=30 Jan 28 20:59:54 crc kubenswrapper[4746]: I0128 20:59:54.743078 4746 kubelet.go:2542] 
"SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/prometheus-metric-storage-0" Jan 28 20:59:54 crc kubenswrapper[4746]: I0128 20:59:54.778257 4746 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/glance-default-external-api-0" podStartSLOduration=32.77823885 podStartE2EDuration="32.77823885s" podCreationTimestamp="2026-01-28 20:59:22 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-28 20:59:54.770665586 +0000 UTC m=+1222.726851940" watchObservedRunningTime="2026-01-28 20:59:54.77823885 +0000 UTC m=+1222.734425194" Jan 28 20:59:54 crc kubenswrapper[4746]: I0128 20:59:54.855303 4746 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="3c3fee24-35f9-4695-b25d-b49430708c43" path="/var/lib/kubelet/pods/3c3fee24-35f9-4695-b25d-b49430708c43/volumes" Jan 28 20:59:55 crc kubenswrapper[4746]: I0128 20:59:55.771836 4746 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/root-account-create-update-msxbp" event={"ID":"854c60ea-889a-449a-a74b-39f6f973f52c","Type":"ContainerDied","Data":"6452d5f9caabb8c64cdd061784f9211b6c19b98a9a8d5cf5dd8d8f8fbc77b260"} Jan 28 20:59:55 crc kubenswrapper[4746]: I0128 20:59:55.772055 4746 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="6452d5f9caabb8c64cdd061784f9211b6c19b98a9a8d5cf5dd8d8f8fbc77b260" Jan 28 20:59:55 crc kubenswrapper[4746]: I0128 20:59:55.774476 4746 generic.go:334] "Generic (PLEG): container finished" podID="be7f3403-801e-45de-9517-8b3ca91d9682" containerID="91b72c15237fea2bc0c6ce6901566a6e8af4fa5443e2c5efbbebe6e5c8e9d81b" exitCode=0 Jan 28 20:59:55 crc kubenswrapper[4746]: I0128 20:59:55.774512 4746 generic.go:334] "Generic (PLEG): container finished" podID="be7f3403-801e-45de-9517-8b3ca91d9682" containerID="835bc7c38d859b9196b288139c7a228017c05f4f15ad9d28b666a732288589e7" exitCode=143 Jan 28 20:59:55 crc 
kubenswrapper[4746]: I0128 20:59:55.774671 4746 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-external-api-0" event={"ID":"be7f3403-801e-45de-9517-8b3ca91d9682","Type":"ContainerDied","Data":"91b72c15237fea2bc0c6ce6901566a6e8af4fa5443e2c5efbbebe6e5c8e9d81b"} Jan 28 20:59:55 crc kubenswrapper[4746]: I0128 20:59:55.774720 4746 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-external-api-0" event={"ID":"be7f3403-801e-45de-9517-8b3ca91d9682","Type":"ContainerDied","Data":"835bc7c38d859b9196b288139c7a228017c05f4f15ad9d28b666a732288589e7"} Jan 28 20:59:55 crc kubenswrapper[4746]: I0128 20:59:55.982920 4746 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/root-account-create-update-msxbp" Jan 28 20:59:56 crc kubenswrapper[4746]: I0128 20:59:56.076202 4746 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/glance-default-internal-api-0" Jan 28 20:59:56 crc kubenswrapper[4746]: I0128 20:59:56.119573 4746 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/854c60ea-889a-449a-a74b-39f6f973f52c-operator-scripts\") pod \"854c60ea-889a-449a-a74b-39f6f973f52c\" (UID: \"854c60ea-889a-449a-a74b-39f6f973f52c\") " Jan 28 20:59:56 crc kubenswrapper[4746]: I0128 20:59:56.119642 4746 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-qcdrb\" (UniqueName: \"kubernetes.io/projected/854c60ea-889a-449a-a74b-39f6f973f52c-kube-api-access-qcdrb\") pod \"854c60ea-889a-449a-a74b-39f6f973f52c\" (UID: \"854c60ea-889a-449a-a74b-39f6f973f52c\") " Jan 28 20:59:56 crc kubenswrapper[4746]: I0128 20:59:56.121917 4746 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/854c60ea-889a-449a-a74b-39f6f973f52c-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod 
"854c60ea-889a-449a-a74b-39f6f973f52c" (UID: "854c60ea-889a-449a-a74b-39f6f973f52c"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 28 20:59:56 crc kubenswrapper[4746]: I0128 20:59:56.127392 4746 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/854c60ea-889a-449a-a74b-39f6f973f52c-kube-api-access-qcdrb" (OuterVolumeSpecName: "kube-api-access-qcdrb") pod "854c60ea-889a-449a-a74b-39f6f973f52c" (UID: "854c60ea-889a-449a-a74b-39f6f973f52c"). InnerVolumeSpecName "kube-api-access-qcdrb". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 28 20:59:56 crc kubenswrapper[4746]: I0128 20:59:56.221897 4746 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/e3a2491c-4637-4cba-a11d-7c8dd8c703d1-config-data\") pod \"e3a2491c-4637-4cba-a11d-7c8dd8c703d1\" (UID: \"e3a2491c-4637-4cba-a11d-7c8dd8c703d1\") " Jan 28 20:59:56 crc kubenswrapper[4746]: I0128 20:59:56.222006 4746 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/e3a2491c-4637-4cba-a11d-7c8dd8c703d1-logs\") pod \"e3a2491c-4637-4cba-a11d-7c8dd8c703d1\" (UID: \"e3a2491c-4637-4cba-a11d-7c8dd8c703d1\") " Jan 28 20:59:56 crc kubenswrapper[4746]: I0128 20:59:56.222110 4746 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e3a2491c-4637-4cba-a11d-7c8dd8c703d1-combined-ca-bundle\") pod \"e3a2491c-4637-4cba-a11d-7c8dd8c703d1\" (UID: \"e3a2491c-4637-4cba-a11d-7c8dd8c703d1\") " Jan 28 20:59:56 crc kubenswrapper[4746]: I0128 20:59:56.222140 4746 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/e3a2491c-4637-4cba-a11d-7c8dd8c703d1-httpd-run\") pod \"e3a2491c-4637-4cba-a11d-7c8dd8c703d1\" (UID: 
\"e3a2491c-4637-4cba-a11d-7c8dd8c703d1\") " Jan 28 20:59:56 crc kubenswrapper[4746]: I0128 20:59:56.222257 4746 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/e3a2491c-4637-4cba-a11d-7c8dd8c703d1-scripts\") pod \"e3a2491c-4637-4cba-a11d-7c8dd8c703d1\" (UID: \"e3a2491c-4637-4cba-a11d-7c8dd8c703d1\") " Jan 28 20:59:56 crc kubenswrapper[4746]: I0128 20:59:56.222303 4746 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-6wzsg\" (UniqueName: \"kubernetes.io/projected/e3a2491c-4637-4cba-a11d-7c8dd8c703d1-kube-api-access-6wzsg\") pod \"e3a2491c-4637-4cba-a11d-7c8dd8c703d1\" (UID: \"e3a2491c-4637-4cba-a11d-7c8dd8c703d1\") " Jan 28 20:59:56 crc kubenswrapper[4746]: I0128 20:59:56.222434 4746 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"glance\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-d0cac59c-8041-46c0-a355-d5d8d1738a5d\") pod \"e3a2491c-4637-4cba-a11d-7c8dd8c703d1\" (UID: \"e3a2491c-4637-4cba-a11d-7c8dd8c703d1\") " Jan 28 20:59:56 crc kubenswrapper[4746]: I0128 20:59:56.222929 4746 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-qcdrb\" (UniqueName: \"kubernetes.io/projected/854c60ea-889a-449a-a74b-39f6f973f52c-kube-api-access-qcdrb\") on node \"crc\" DevicePath \"\"" Jan 28 20:59:56 crc kubenswrapper[4746]: I0128 20:59:56.222965 4746 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/854c60ea-889a-449a-a74b-39f6f973f52c-operator-scripts\") on node \"crc\" DevicePath \"\"" Jan 28 20:59:56 crc kubenswrapper[4746]: I0128 20:59:56.223624 4746 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/e3a2491c-4637-4cba-a11d-7c8dd8c703d1-httpd-run" (OuterVolumeSpecName: "httpd-run") pod "e3a2491c-4637-4cba-a11d-7c8dd8c703d1" (UID: 
"e3a2491c-4637-4cba-a11d-7c8dd8c703d1"). InnerVolumeSpecName "httpd-run". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 28 20:59:56 crc kubenswrapper[4746]: I0128 20:59:56.226078 4746 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/e3a2491c-4637-4cba-a11d-7c8dd8c703d1-logs" (OuterVolumeSpecName: "logs") pod "e3a2491c-4637-4cba-a11d-7c8dd8c703d1" (UID: "e3a2491c-4637-4cba-a11d-7c8dd8c703d1"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 28 20:59:56 crc kubenswrapper[4746]: I0128 20:59:56.226468 4746 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/e3a2491c-4637-4cba-a11d-7c8dd8c703d1-scripts" (OuterVolumeSpecName: "scripts") pod "e3a2491c-4637-4cba-a11d-7c8dd8c703d1" (UID: "e3a2491c-4637-4cba-a11d-7c8dd8c703d1"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 28 20:59:56 crc kubenswrapper[4746]: I0128 20:59:56.231309 4746 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/e3a2491c-4637-4cba-a11d-7c8dd8c703d1-kube-api-access-6wzsg" (OuterVolumeSpecName: "kube-api-access-6wzsg") pod "e3a2491c-4637-4cba-a11d-7c8dd8c703d1" (UID: "e3a2491c-4637-4cba-a11d-7c8dd8c703d1"). InnerVolumeSpecName "kube-api-access-6wzsg". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 28 20:59:56 crc kubenswrapper[4746]: I0128 20:59:56.244073 4746 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-d0cac59c-8041-46c0-a355-d5d8d1738a5d" (OuterVolumeSpecName: "glance") pod "e3a2491c-4637-4cba-a11d-7c8dd8c703d1" (UID: "e3a2491c-4637-4cba-a11d-7c8dd8c703d1"). InnerVolumeSpecName "pvc-d0cac59c-8041-46c0-a355-d5d8d1738a5d". 
PluginName "kubernetes.io/csi", VolumeGidValue "" Jan 28 20:59:56 crc kubenswrapper[4746]: I0128 20:59:56.263243 4746 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/e3a2491c-4637-4cba-a11d-7c8dd8c703d1-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "e3a2491c-4637-4cba-a11d-7c8dd8c703d1" (UID: "e3a2491c-4637-4cba-a11d-7c8dd8c703d1"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 28 20:59:56 crc kubenswrapper[4746]: I0128 20:59:56.290690 4746 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/e3a2491c-4637-4cba-a11d-7c8dd8c703d1-config-data" (OuterVolumeSpecName: "config-data") pod "e3a2491c-4637-4cba-a11d-7c8dd8c703d1" (UID: "e3a2491c-4637-4cba-a11d-7c8dd8c703d1"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 28 20:59:56 crc kubenswrapper[4746]: I0128 20:59:56.290897 4746 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/glance-default-external-api-0" Jan 28 20:59:56 crc kubenswrapper[4746]: I0128 20:59:56.324510 4746 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/e3a2491c-4637-4cba-a11d-7c8dd8c703d1-scripts\") on node \"crc\" DevicePath \"\"" Jan 28 20:59:56 crc kubenswrapper[4746]: I0128 20:59:56.324535 4746 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-6wzsg\" (UniqueName: \"kubernetes.io/projected/e3a2491c-4637-4cba-a11d-7c8dd8c703d1-kube-api-access-6wzsg\") on node \"crc\" DevicePath \"\"" Jan 28 20:59:56 crc kubenswrapper[4746]: I0128 20:59:56.324575 4746 reconciler_common.go:286] "operationExecutor.UnmountDevice started for volume \"pvc-d0cac59c-8041-46c0-a355-d5d8d1738a5d\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-d0cac59c-8041-46c0-a355-d5d8d1738a5d\") on node \"crc\" " Jan 28 20:59:56 crc kubenswrapper[4746]: I0128 20:59:56.324589 4746 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/e3a2491c-4637-4cba-a11d-7c8dd8c703d1-config-data\") on node \"crc\" DevicePath \"\"" Jan 28 20:59:56 crc kubenswrapper[4746]: I0128 20:59:56.324602 4746 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/e3a2491c-4637-4cba-a11d-7c8dd8c703d1-logs\") on node \"crc\" DevicePath \"\"" Jan 28 20:59:56 crc kubenswrapper[4746]: I0128 20:59:56.324610 4746 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e3a2491c-4637-4cba-a11d-7c8dd8c703d1-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Jan 28 20:59:56 crc kubenswrapper[4746]: I0128 20:59:56.324618 4746 reconciler_common.go:293] "Volume detached for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/e3a2491c-4637-4cba-a11d-7c8dd8c703d1-httpd-run\") on node \"crc\" DevicePath \"\"" Jan 28 20:59:56 crc 
kubenswrapper[4746]: I0128 20:59:56.362701 4746 csi_attacher.go:630] kubernetes.io/csi: attacher.UnmountDevice STAGE_UNSTAGE_VOLUME capability not set. Skipping UnmountDevice... Jan 28 20:59:56 crc kubenswrapper[4746]: I0128 20:59:56.362893 4746 operation_generator.go:917] UnmountDevice succeeded for volume "pvc-d0cac59c-8041-46c0-a355-d5d8d1738a5d" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-d0cac59c-8041-46c0-a355-d5d8d1738a5d") on node "crc" Jan 28 20:59:56 crc kubenswrapper[4746]: I0128 20:59:56.425819 4746 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"glance\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-95469877-e687-4d8b-97fe-080814cf4383\") pod \"be7f3403-801e-45de-9517-8b3ca91d9682\" (UID: \"be7f3403-801e-45de-9517-8b3ca91d9682\") " Jan 28 20:59:56 crc kubenswrapper[4746]: I0128 20:59:56.425893 4746 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/be7f3403-801e-45de-9517-8b3ca91d9682-httpd-run\") pod \"be7f3403-801e-45de-9517-8b3ca91d9682\" (UID: \"be7f3403-801e-45de-9517-8b3ca91d9682\") " Jan 28 20:59:56 crc kubenswrapper[4746]: I0128 20:59:56.425983 4746 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-shg2k\" (UniqueName: \"kubernetes.io/projected/be7f3403-801e-45de-9517-8b3ca91d9682-kube-api-access-shg2k\") pod \"be7f3403-801e-45de-9517-8b3ca91d9682\" (UID: \"be7f3403-801e-45de-9517-8b3ca91d9682\") " Jan 28 20:59:56 crc kubenswrapper[4746]: I0128 20:59:56.426006 4746 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/be7f3403-801e-45de-9517-8b3ca91d9682-config-data\") pod \"be7f3403-801e-45de-9517-8b3ca91d9682\" (UID: \"be7f3403-801e-45de-9517-8b3ca91d9682\") " Jan 28 20:59:56 crc kubenswrapper[4746]: I0128 20:59:56.426054 4746 reconciler_common.go:159] 
"operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/be7f3403-801e-45de-9517-8b3ca91d9682-logs\") pod \"be7f3403-801e-45de-9517-8b3ca91d9682\" (UID: \"be7f3403-801e-45de-9517-8b3ca91d9682\") " Jan 28 20:59:56 crc kubenswrapper[4746]: I0128 20:59:56.426165 4746 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/be7f3403-801e-45de-9517-8b3ca91d9682-scripts\") pod \"be7f3403-801e-45de-9517-8b3ca91d9682\" (UID: \"be7f3403-801e-45de-9517-8b3ca91d9682\") " Jan 28 20:59:56 crc kubenswrapper[4746]: I0128 20:59:56.426185 4746 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/be7f3403-801e-45de-9517-8b3ca91d9682-combined-ca-bundle\") pod \"be7f3403-801e-45de-9517-8b3ca91d9682\" (UID: \"be7f3403-801e-45de-9517-8b3ca91d9682\") " Jan 28 20:59:56 crc kubenswrapper[4746]: I0128 20:59:56.426595 4746 reconciler_common.go:293] "Volume detached for volume \"pvc-d0cac59c-8041-46c0-a355-d5d8d1738a5d\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-d0cac59c-8041-46c0-a355-d5d8d1738a5d\") on node \"crc\" DevicePath \"\"" Jan 28 20:59:56 crc kubenswrapper[4746]: I0128 20:59:56.426594 4746 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/be7f3403-801e-45de-9517-8b3ca91d9682-logs" (OuterVolumeSpecName: "logs") pod "be7f3403-801e-45de-9517-8b3ca91d9682" (UID: "be7f3403-801e-45de-9517-8b3ca91d9682"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 28 20:59:56 crc kubenswrapper[4746]: I0128 20:59:56.426875 4746 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/be7f3403-801e-45de-9517-8b3ca91d9682-httpd-run" (OuterVolumeSpecName: "httpd-run") pod "be7f3403-801e-45de-9517-8b3ca91d9682" (UID: "be7f3403-801e-45de-9517-8b3ca91d9682"). 
InnerVolumeSpecName "httpd-run". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 28 20:59:56 crc kubenswrapper[4746]: I0128 20:59:56.430241 4746 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/be7f3403-801e-45de-9517-8b3ca91d9682-scripts" (OuterVolumeSpecName: "scripts") pod "be7f3403-801e-45de-9517-8b3ca91d9682" (UID: "be7f3403-801e-45de-9517-8b3ca91d9682"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 28 20:59:56 crc kubenswrapper[4746]: I0128 20:59:56.431202 4746 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/be7f3403-801e-45de-9517-8b3ca91d9682-kube-api-access-shg2k" (OuterVolumeSpecName: "kube-api-access-shg2k") pod "be7f3403-801e-45de-9517-8b3ca91d9682" (UID: "be7f3403-801e-45de-9517-8b3ca91d9682"). InnerVolumeSpecName "kube-api-access-shg2k". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 28 20:59:56 crc kubenswrapper[4746]: I0128 20:59:56.441387 4746 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-95469877-e687-4d8b-97fe-080814cf4383" (OuterVolumeSpecName: "glance") pod "be7f3403-801e-45de-9517-8b3ca91d9682" (UID: "be7f3403-801e-45de-9517-8b3ca91d9682"). InnerVolumeSpecName "pvc-95469877-e687-4d8b-97fe-080814cf4383". PluginName "kubernetes.io/csi", VolumeGidValue "" Jan 28 20:59:56 crc kubenswrapper[4746]: I0128 20:59:56.450813 4746 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/be7f3403-801e-45de-9517-8b3ca91d9682-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "be7f3403-801e-45de-9517-8b3ca91d9682" (UID: "be7f3403-801e-45de-9517-8b3ca91d9682"). InnerVolumeSpecName "combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 28 20:59:56 crc kubenswrapper[4746]: I0128 20:59:56.475939 4746 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/be7f3403-801e-45de-9517-8b3ca91d9682-config-data" (OuterVolumeSpecName: "config-data") pod "be7f3403-801e-45de-9517-8b3ca91d9682" (UID: "be7f3403-801e-45de-9517-8b3ca91d9682"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 28 20:59:56 crc kubenswrapper[4746]: I0128 20:59:56.528261 4746 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/be7f3403-801e-45de-9517-8b3ca91d9682-logs\") on node \"crc\" DevicePath \"\"" Jan 28 20:59:56 crc kubenswrapper[4746]: I0128 20:59:56.528306 4746 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/be7f3403-801e-45de-9517-8b3ca91d9682-scripts\") on node \"crc\" DevicePath \"\"" Jan 28 20:59:56 crc kubenswrapper[4746]: I0128 20:59:56.528321 4746 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/be7f3403-801e-45de-9517-8b3ca91d9682-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Jan 28 20:59:56 crc kubenswrapper[4746]: I0128 20:59:56.528364 4746 reconciler_common.go:286] "operationExecutor.UnmountDevice started for volume \"pvc-95469877-e687-4d8b-97fe-080814cf4383\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-95469877-e687-4d8b-97fe-080814cf4383\") on node \"crc\" " Jan 28 20:59:56 crc kubenswrapper[4746]: I0128 20:59:56.528376 4746 reconciler_common.go:293] "Volume detached for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/be7f3403-801e-45de-9517-8b3ca91d9682-httpd-run\") on node \"crc\" DevicePath \"\"" Jan 28 20:59:56 crc kubenswrapper[4746]: I0128 20:59:56.528387 4746 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-shg2k\" (UniqueName: 
\"kubernetes.io/projected/be7f3403-801e-45de-9517-8b3ca91d9682-kube-api-access-shg2k\") on node \"crc\" DevicePath \"\"" Jan 28 20:59:56 crc kubenswrapper[4746]: I0128 20:59:56.528397 4746 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/be7f3403-801e-45de-9517-8b3ca91d9682-config-data\") on node \"crc\" DevicePath \"\"" Jan 28 20:59:56 crc kubenswrapper[4746]: I0128 20:59:56.552553 4746 csi_attacher.go:630] kubernetes.io/csi: attacher.UnmountDevice STAGE_UNSTAGE_VOLUME capability not set. Skipping UnmountDevice... Jan 28 20:59:56 crc kubenswrapper[4746]: I0128 20:59:56.552934 4746 operation_generator.go:917] UnmountDevice succeeded for volume "pvc-95469877-e687-4d8b-97fe-080814cf4383" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-95469877-e687-4d8b-97fe-080814cf4383") on node "crc" Jan 28 20:59:56 crc kubenswrapper[4746]: I0128 20:59:56.630089 4746 reconciler_common.go:293] "Volume detached for volume \"pvc-95469877-e687-4d8b-97fe-080814cf4383\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-95469877-e687-4d8b-97fe-080814cf4383\") on node \"crc\" DevicePath \"\"" Jan 28 20:59:56 crc kubenswrapper[4746]: I0128 20:59:56.788451 4746 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/glance-default-external-api-0" Jan 28 20:59:56 crc kubenswrapper[4746]: I0128 20:59:56.788438 4746 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-external-api-0" event={"ID":"be7f3403-801e-45de-9517-8b3ca91d9682","Type":"ContainerDied","Data":"b8e4ac400720964fcbec31f5564ece3c113b3114169cec21addcb297ed3ec4e6"} Jan 28 20:59:56 crc kubenswrapper[4746]: I0128 20:59:56.790338 4746 scope.go:117] "RemoveContainer" containerID="91b72c15237fea2bc0c6ce6901566a6e8af4fa5443e2c5efbbebe6e5c8e9d81b" Jan 28 20:59:56 crc kubenswrapper[4746]: I0128 20:59:56.791183 4746 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"e9abc9de-4ca9-4ba4-afca-bb6b5bf50924","Type":"ContainerStarted","Data":"0662539b92dae318d78b622e435f24e6de96b918733159dd31099925e363ef5b"} Jan 28 20:59:56 crc kubenswrapper[4746]: I0128 20:59:56.793176 4746 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-55f844cf75-8hh8h" event={"ID":"ee1b35a7-970d-4abf-b645-eebbcadd7e8e","Type":"ContainerStarted","Data":"34cbf343b96758686d17c89cafd8ba1ef5a3222f320d6831f7f4f740567d37ab"} Jan 28 20:59:56 crc kubenswrapper[4746]: I0128 20:59:56.794077 4746 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/dnsmasq-dns-55f844cf75-8hh8h" Jan 28 20:59:56 crc kubenswrapper[4746]: I0128 20:59:56.796367 4746 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-internal-api-0" event={"ID":"e3a2491c-4637-4cba-a11d-7c8dd8c703d1","Type":"ContainerDied","Data":"db43805063895c8d2ee0a81f3d782da9c0c4a76f2cc971a8cd4481cc8c37c0d2"} Jan 28 20:59:56 crc kubenswrapper[4746]: I0128 20:59:56.796505 4746 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/glance-default-internal-api-0" Jan 28 20:59:56 crc kubenswrapper[4746]: I0128 20:59:56.803627 4746 generic.go:334] "Generic (PLEG): container finished" podID="1d79950b-c574-4952-8620-ff635db5e8de" containerID="d23c067486da07f978541c7dba8e6461c6957dc68fe5452d6fe0d4b93cf13ed7" exitCode=0 Jan 28 20:59:56 crc kubenswrapper[4746]: I0128 20:59:56.803699 4746 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-db-sync-jms86" event={"ID":"1d79950b-c574-4952-8620-ff635db5e8de","Type":"ContainerDied","Data":"d23c067486da07f978541c7dba8e6461c6957dc68fe5452d6fe0d4b93cf13ed7"} Jan 28 20:59:56 crc kubenswrapper[4746]: I0128 20:59:56.808670 4746 generic.go:334] "Generic (PLEG): container finished" podID="f81b35d0-5755-476e-a5c9-30036d654d53" containerID="b2ac020924b66e52e098563168d3a468eaa3d99aa0608464dec14ec0357795ec" exitCode=0 Jan 28 20:59:56 crc kubenswrapper[4746]: I0128 20:59:56.814348 4746 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/root-account-create-update-msxbp" Jan 28 20:59:56 crc kubenswrapper[4746]: I0128 20:59:56.809616 4746 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-bootstrap-lwqj2" event={"ID":"f81b35d0-5755-476e-a5c9-30036d654d53","Type":"ContainerDied","Data":"b2ac020924b66e52e098563168d3a468eaa3d99aa0608464dec14ec0357795ec"} Jan 28 20:59:56 crc kubenswrapper[4746]: I0128 20:59:56.821228 4746 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/dnsmasq-dns-55f844cf75-8hh8h" podStartSLOduration=8.821208303 podStartE2EDuration="8.821208303s" podCreationTimestamp="2026-01-28 20:59:48 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-28 20:59:56.809162288 +0000 UTC m=+1224.765348642" watchObservedRunningTime="2026-01-28 20:59:56.821208303 +0000 UTC m=+1224.777394657" Jan 28 20:59:56 crc kubenswrapper[4746]: I0128 20:59:56.824174 4746 scope.go:117] "RemoveContainer" containerID="835bc7c38d859b9196b288139c7a228017c05f4f15ad9d28b666a732288589e7" Jan 28 20:59:56 crc kubenswrapper[4746]: I0128 20:59:56.853661 4746 scope.go:117] "RemoveContainer" containerID="8b1171cc065d5d7915beb03ba398c2872dee6b0f54de2064335253d18a44f563" Jan 28 20:59:56 crc kubenswrapper[4746]: I0128 20:59:56.892352 4746 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/glance-default-internal-api-0"] Jan 28 20:59:56 crc kubenswrapper[4746]: I0128 20:59:56.900029 4746 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/glance-default-internal-api-0"] Jan 28 20:59:56 crc kubenswrapper[4746]: I0128 20:59:56.920197 4746 scope.go:117] "RemoveContainer" containerID="92e177f5ba767ba3a0ccc9b14089d91ff587cfd6340355ea56c80ef975741ee1" Jan 28 20:59:56 crc kubenswrapper[4746]: I0128 20:59:56.921006 4746 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/glance-default-internal-api-0"] Jan 28 20:59:56 
crc kubenswrapper[4746]: E0128 20:59:56.921422 4746 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="3c3fee24-35f9-4695-b25d-b49430708c43" containerName="dnsmasq-dns" Jan 28 20:59:56 crc kubenswrapper[4746]: I0128 20:59:56.921441 4746 state_mem.go:107] "Deleted CPUSet assignment" podUID="3c3fee24-35f9-4695-b25d-b49430708c43" containerName="dnsmasq-dns" Jan 28 20:59:56 crc kubenswrapper[4746]: E0128 20:59:56.921452 4746 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="be7f3403-801e-45de-9517-8b3ca91d9682" containerName="glance-httpd" Jan 28 20:59:56 crc kubenswrapper[4746]: I0128 20:59:56.921461 4746 state_mem.go:107] "Deleted CPUSet assignment" podUID="be7f3403-801e-45de-9517-8b3ca91d9682" containerName="glance-httpd" Jan 28 20:59:56 crc kubenswrapper[4746]: E0128 20:59:56.921474 4746 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="854c60ea-889a-449a-a74b-39f6f973f52c" containerName="mariadb-account-create-update" Jan 28 20:59:56 crc kubenswrapper[4746]: I0128 20:59:56.921480 4746 state_mem.go:107] "Deleted CPUSet assignment" podUID="854c60ea-889a-449a-a74b-39f6f973f52c" containerName="mariadb-account-create-update" Jan 28 20:59:56 crc kubenswrapper[4746]: E0128 20:59:56.921495 4746 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="3c3fee24-35f9-4695-b25d-b49430708c43" containerName="init" Jan 28 20:59:56 crc kubenswrapper[4746]: I0128 20:59:56.921500 4746 state_mem.go:107] "Deleted CPUSet assignment" podUID="3c3fee24-35f9-4695-b25d-b49430708c43" containerName="init" Jan 28 20:59:56 crc kubenswrapper[4746]: E0128 20:59:56.921519 4746 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="be7f3403-801e-45de-9517-8b3ca91d9682" containerName="glance-log" Jan 28 20:59:56 crc kubenswrapper[4746]: I0128 20:59:56.921525 4746 state_mem.go:107] "Deleted CPUSet assignment" podUID="be7f3403-801e-45de-9517-8b3ca91d9682" containerName="glance-log" Jan 28 20:59:56 crc kubenswrapper[4746]: E0128 
20:59:56.921535 4746 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="e3a2491c-4637-4cba-a11d-7c8dd8c703d1" containerName="glance-log" Jan 28 20:59:56 crc kubenswrapper[4746]: I0128 20:59:56.921541 4746 state_mem.go:107] "Deleted CPUSet assignment" podUID="e3a2491c-4637-4cba-a11d-7c8dd8c703d1" containerName="glance-log" Jan 28 20:59:56 crc kubenswrapper[4746]: E0128 20:59:56.921553 4746 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="e3a2491c-4637-4cba-a11d-7c8dd8c703d1" containerName="glance-httpd" Jan 28 20:59:56 crc kubenswrapper[4746]: I0128 20:59:56.921559 4746 state_mem.go:107] "Deleted CPUSet assignment" podUID="e3a2491c-4637-4cba-a11d-7c8dd8c703d1" containerName="glance-httpd" Jan 28 20:59:56 crc kubenswrapper[4746]: I0128 20:59:56.921733 4746 memory_manager.go:354] "RemoveStaleState removing state" podUID="be7f3403-801e-45de-9517-8b3ca91d9682" containerName="glance-httpd" Jan 28 20:59:56 crc kubenswrapper[4746]: I0128 20:59:56.921751 4746 memory_manager.go:354] "RemoveStaleState removing state" podUID="3c3fee24-35f9-4695-b25d-b49430708c43" containerName="dnsmasq-dns" Jan 28 20:59:56 crc kubenswrapper[4746]: I0128 20:59:56.921762 4746 memory_manager.go:354] "RemoveStaleState removing state" podUID="be7f3403-801e-45de-9517-8b3ca91d9682" containerName="glance-log" Jan 28 20:59:56 crc kubenswrapper[4746]: I0128 20:59:56.921773 4746 memory_manager.go:354] "RemoveStaleState removing state" podUID="e3a2491c-4637-4cba-a11d-7c8dd8c703d1" containerName="glance-httpd" Jan 28 20:59:56 crc kubenswrapper[4746]: I0128 20:59:56.921784 4746 memory_manager.go:354] "RemoveStaleState removing state" podUID="854c60ea-889a-449a-a74b-39f6f973f52c" containerName="mariadb-account-create-update" Jan 28 20:59:56 crc kubenswrapper[4746]: I0128 20:59:56.921795 4746 memory_manager.go:354] "RemoveStaleState removing state" podUID="e3a2491c-4637-4cba-a11d-7c8dd8c703d1" containerName="glance-log" Jan 28 20:59:56 crc kubenswrapper[4746]: I0128 
20:59:56.922810 4746 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/glance-default-internal-api-0" Jan 28 20:59:56 crc kubenswrapper[4746]: I0128 20:59:56.926482 4746 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"glance-scripts" Jan 28 20:59:56 crc kubenswrapper[4746]: I0128 20:59:56.926602 4746 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"glance-glance-dockercfg-sx27z" Jan 28 20:59:56 crc kubenswrapper[4746]: I0128 20:59:56.926672 4746 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-glance-default-internal-svc" Jan 28 20:59:56 crc kubenswrapper[4746]: I0128 20:59:56.926834 4746 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"glance-default-internal-config-data" Jan 28 20:59:56 crc kubenswrapper[4746]: I0128 20:59:56.943407 4746 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-default-internal-api-0"] Jan 28 20:59:56 crc kubenswrapper[4746]: I0128 20:59:56.954780 4746 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/glance-default-external-api-0"] Jan 28 20:59:56 crc kubenswrapper[4746]: I0128 20:59:56.965463 4746 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/glance-default-external-api-0"] Jan 28 20:59:56 crc kubenswrapper[4746]: I0128 20:59:56.974374 4746 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/glance-default-external-api-0"] Jan 28 20:59:56 crc kubenswrapper[4746]: I0128 20:59:56.976702 4746 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/glance-default-external-api-0" Jan 28 20:59:56 crc kubenswrapper[4746]: I0128 20:59:56.978488 4746 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-glance-default-public-svc" Jan 28 20:59:56 crc kubenswrapper[4746]: I0128 20:59:56.979485 4746 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"glance-default-external-config-data" Jan 28 20:59:56 crc kubenswrapper[4746]: I0128 20:59:56.995371 4746 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-default-external-api-0"] Jan 28 20:59:57 crc kubenswrapper[4746]: I0128 20:59:57.038127 4746 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/8355d3eb-8a59-4393-b04b-a44d9dd7824f-httpd-run\") pod \"glance-default-internal-api-0\" (UID: \"8355d3eb-8a59-4393-b04b-a44d9dd7824f\") " pod="openstack/glance-default-internal-api-0" Jan 28 20:59:57 crc kubenswrapper[4746]: I0128 20:59:57.038341 4746 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/8355d3eb-8a59-4393-b04b-a44d9dd7824f-internal-tls-certs\") pod \"glance-default-internal-api-0\" (UID: \"8355d3eb-8a59-4393-b04b-a44d9dd7824f\") " pod="openstack/glance-default-internal-api-0" Jan 28 20:59:57 crc kubenswrapper[4746]: I0128 20:59:57.038429 4746 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/8355d3eb-8a59-4393-b04b-a44d9dd7824f-logs\") pod \"glance-default-internal-api-0\" (UID: \"8355d3eb-8a59-4393-b04b-a44d9dd7824f\") " pod="openstack/glance-default-internal-api-0" Jan 28 20:59:57 crc kubenswrapper[4746]: I0128 20:59:57.038531 4746 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: 
\"kubernetes.io/secret/bae342c7-f51f-4da2-a419-61002cc82f59-scripts\") pod \"glance-default-external-api-0\" (UID: \"bae342c7-f51f-4da2-a419-61002cc82f59\") " pod="openstack/glance-default-external-api-0" Jan 28 20:59:57 crc kubenswrapper[4746]: I0128 20:59:57.038730 4746 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/bae342c7-f51f-4da2-a419-61002cc82f59-logs\") pod \"glance-default-external-api-0\" (UID: \"bae342c7-f51f-4da2-a419-61002cc82f59\") " pod="openstack/glance-default-external-api-0" Jan 28 20:59:57 crc kubenswrapper[4746]: I0128 20:59:57.038789 4746 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"pvc-95469877-e687-4d8b-97fe-080814cf4383\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-95469877-e687-4d8b-97fe-080814cf4383\") pod \"glance-default-external-api-0\" (UID: \"bae342c7-f51f-4da2-a419-61002cc82f59\") " pod="openstack/glance-default-external-api-0" Jan 28 20:59:57 crc kubenswrapper[4746]: I0128 20:59:57.038854 4746 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-87vlc\" (UniqueName: \"kubernetes.io/projected/bae342c7-f51f-4da2-a419-61002cc82f59-kube-api-access-87vlc\") pod \"glance-default-external-api-0\" (UID: \"bae342c7-f51f-4da2-a419-61002cc82f59\") " pod="openstack/glance-default-external-api-0" Jan 28 20:59:57 crc kubenswrapper[4746]: I0128 20:59:57.038969 4746 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"pvc-d0cac59c-8041-46c0-a355-d5d8d1738a5d\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-d0cac59c-8041-46c0-a355-d5d8d1738a5d\") pod \"glance-default-internal-api-0\" (UID: \"8355d3eb-8a59-4393-b04b-a44d9dd7824f\") " pod="openstack/glance-default-internal-api-0" Jan 28 20:59:57 crc kubenswrapper[4746]: I0128 20:59:57.039057 4746 
reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/8355d3eb-8a59-4393-b04b-a44d9dd7824f-scripts\") pod \"glance-default-internal-api-0\" (UID: \"8355d3eb-8a59-4393-b04b-a44d9dd7824f\") " pod="openstack/glance-default-internal-api-0" Jan 28 20:59:57 crc kubenswrapper[4746]: I0128 20:59:57.039170 4746 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/8355d3eb-8a59-4393-b04b-a44d9dd7824f-combined-ca-bundle\") pod \"glance-default-internal-api-0\" (UID: \"8355d3eb-8a59-4393-b04b-a44d9dd7824f\") " pod="openstack/glance-default-internal-api-0" Jan 28 20:59:57 crc kubenswrapper[4746]: I0128 20:59:57.039281 4746 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/bae342c7-f51f-4da2-a419-61002cc82f59-httpd-run\") pod \"glance-default-external-api-0\" (UID: \"bae342c7-f51f-4da2-a419-61002cc82f59\") " pod="openstack/glance-default-external-api-0" Jan 28 20:59:57 crc kubenswrapper[4746]: I0128 20:59:57.039314 4746 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/8355d3eb-8a59-4393-b04b-a44d9dd7824f-config-data\") pod \"glance-default-internal-api-0\" (UID: \"8355d3eb-8a59-4393-b04b-a44d9dd7824f\") " pod="openstack/glance-default-internal-api-0" Jan 28 20:59:57 crc kubenswrapper[4746]: I0128 20:59:57.039368 4746 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/bae342c7-f51f-4da2-a419-61002cc82f59-public-tls-certs\") pod \"glance-default-external-api-0\" (UID: \"bae342c7-f51f-4da2-a419-61002cc82f59\") " pod="openstack/glance-default-external-api-0" Jan 28 20:59:57 crc kubenswrapper[4746]: I0128 
20:59:57.039387 4746 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/bae342c7-f51f-4da2-a419-61002cc82f59-config-data\") pod \"glance-default-external-api-0\" (UID: \"bae342c7-f51f-4da2-a419-61002cc82f59\") " pod="openstack/glance-default-external-api-0" Jan 28 20:59:57 crc kubenswrapper[4746]: I0128 20:59:57.039405 4746 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/bae342c7-f51f-4da2-a419-61002cc82f59-combined-ca-bundle\") pod \"glance-default-external-api-0\" (UID: \"bae342c7-f51f-4da2-a419-61002cc82f59\") " pod="openstack/glance-default-external-api-0" Jan 28 20:59:57 crc kubenswrapper[4746]: I0128 20:59:57.039431 4746 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-zhffj\" (UniqueName: \"kubernetes.io/projected/8355d3eb-8a59-4393-b04b-a44d9dd7824f-kube-api-access-zhffj\") pod \"glance-default-internal-api-0\" (UID: \"8355d3eb-8a59-4393-b04b-a44d9dd7824f\") " pod="openstack/glance-default-internal-api-0" Jan 28 20:59:57 crc kubenswrapper[4746]: I0128 20:59:57.114329 4746 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/root-account-create-update-msxbp"] Jan 28 20:59:57 crc kubenswrapper[4746]: I0128 20:59:57.122643 4746 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/root-account-create-update-msxbp"] Jan 28 20:59:57 crc kubenswrapper[4746]: I0128 20:59:57.141204 4746 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/bae342c7-f51f-4da2-a419-61002cc82f59-logs\") pod \"glance-default-external-api-0\" (UID: \"bae342c7-f51f-4da2-a419-61002cc82f59\") " pod="openstack/glance-default-external-api-0" Jan 28 20:59:57 crc kubenswrapper[4746]: I0128 20:59:57.141531 4746 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"pvc-95469877-e687-4d8b-97fe-080814cf4383\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-95469877-e687-4d8b-97fe-080814cf4383\") pod \"glance-default-external-api-0\" (UID: \"bae342c7-f51f-4da2-a419-61002cc82f59\") " pod="openstack/glance-default-external-api-0" Jan 28 20:59:57 crc kubenswrapper[4746]: I0128 20:59:57.142242 4746 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/bae342c7-f51f-4da2-a419-61002cc82f59-logs\") pod \"glance-default-external-api-0\" (UID: \"bae342c7-f51f-4da2-a419-61002cc82f59\") " pod="openstack/glance-default-external-api-0" Jan 28 20:59:57 crc kubenswrapper[4746]: I0128 20:59:57.142298 4746 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-87vlc\" (UniqueName: \"kubernetes.io/projected/bae342c7-f51f-4da2-a419-61002cc82f59-kube-api-access-87vlc\") pod \"glance-default-external-api-0\" (UID: \"bae342c7-f51f-4da2-a419-61002cc82f59\") " pod="openstack/glance-default-external-api-0" Jan 28 20:59:57 crc kubenswrapper[4746]: I0128 20:59:57.142376 4746 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-d0cac59c-8041-46c0-a355-d5d8d1738a5d\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-d0cac59c-8041-46c0-a355-d5d8d1738a5d\") pod \"glance-default-internal-api-0\" (UID: \"8355d3eb-8a59-4393-b04b-a44d9dd7824f\") " pod="openstack/glance-default-internal-api-0" Jan 28 20:59:57 crc kubenswrapper[4746]: I0128 20:59:57.142415 4746 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/8355d3eb-8a59-4393-b04b-a44d9dd7824f-scripts\") pod \"glance-default-internal-api-0\" (UID: \"8355d3eb-8a59-4393-b04b-a44d9dd7824f\") " pod="openstack/glance-default-internal-api-0" Jan 28 20:59:57 crc kubenswrapper[4746]: I0128 20:59:57.142471 4746 
reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/8355d3eb-8a59-4393-b04b-a44d9dd7824f-combined-ca-bundle\") pod \"glance-default-internal-api-0\" (UID: \"8355d3eb-8a59-4393-b04b-a44d9dd7824f\") " pod="openstack/glance-default-internal-api-0" Jan 28 20:59:57 crc kubenswrapper[4746]: I0128 20:59:57.142523 4746 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/bae342c7-f51f-4da2-a419-61002cc82f59-httpd-run\") pod \"glance-default-external-api-0\" (UID: \"bae342c7-f51f-4da2-a419-61002cc82f59\") " pod="openstack/glance-default-external-api-0" Jan 28 20:59:57 crc kubenswrapper[4746]: I0128 20:59:57.142548 4746 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/8355d3eb-8a59-4393-b04b-a44d9dd7824f-config-data\") pod \"glance-default-internal-api-0\" (UID: \"8355d3eb-8a59-4393-b04b-a44d9dd7824f\") " pod="openstack/glance-default-internal-api-0" Jan 28 20:59:57 crc kubenswrapper[4746]: I0128 20:59:57.142591 4746 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/bae342c7-f51f-4da2-a419-61002cc82f59-public-tls-certs\") pod \"glance-default-external-api-0\" (UID: \"bae342c7-f51f-4da2-a419-61002cc82f59\") " pod="openstack/glance-default-external-api-0" Jan 28 20:59:57 crc kubenswrapper[4746]: I0128 20:59:57.142610 4746 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/bae342c7-f51f-4da2-a419-61002cc82f59-config-data\") pod \"glance-default-external-api-0\" (UID: \"bae342c7-f51f-4da2-a419-61002cc82f59\") " pod="openstack/glance-default-external-api-0" Jan 28 20:59:57 crc kubenswrapper[4746]: I0128 20:59:57.142628 4746 reconciler_common.go:218] "operationExecutor.MountVolume started 
for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/bae342c7-f51f-4da2-a419-61002cc82f59-combined-ca-bundle\") pod \"glance-default-external-api-0\" (UID: \"bae342c7-f51f-4da2-a419-61002cc82f59\") " pod="openstack/glance-default-external-api-0" Jan 28 20:59:57 crc kubenswrapper[4746]: I0128 20:59:57.142659 4746 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-zhffj\" (UniqueName: \"kubernetes.io/projected/8355d3eb-8a59-4393-b04b-a44d9dd7824f-kube-api-access-zhffj\") pod \"glance-default-internal-api-0\" (UID: \"8355d3eb-8a59-4393-b04b-a44d9dd7824f\") " pod="openstack/glance-default-internal-api-0" Jan 28 20:59:57 crc kubenswrapper[4746]: I0128 20:59:57.142680 4746 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/8355d3eb-8a59-4393-b04b-a44d9dd7824f-httpd-run\") pod \"glance-default-internal-api-0\" (UID: \"8355d3eb-8a59-4393-b04b-a44d9dd7824f\") " pod="openstack/glance-default-internal-api-0" Jan 28 20:59:57 crc kubenswrapper[4746]: I0128 20:59:57.142697 4746 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/8355d3eb-8a59-4393-b04b-a44d9dd7824f-internal-tls-certs\") pod \"glance-default-internal-api-0\" (UID: \"8355d3eb-8a59-4393-b04b-a44d9dd7824f\") " pod="openstack/glance-default-internal-api-0" Jan 28 20:59:57 crc kubenswrapper[4746]: I0128 20:59:57.142726 4746 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/8355d3eb-8a59-4393-b04b-a44d9dd7824f-logs\") pod \"glance-default-internal-api-0\" (UID: \"8355d3eb-8a59-4393-b04b-a44d9dd7824f\") " pod="openstack/glance-default-internal-api-0" Jan 28 20:59:57 crc kubenswrapper[4746]: I0128 20:59:57.142742 4746 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: 
\"kubernetes.io/secret/bae342c7-f51f-4da2-a419-61002cc82f59-scripts\") pod \"glance-default-external-api-0\" (UID: \"bae342c7-f51f-4da2-a419-61002cc82f59\") " pod="openstack/glance-default-external-api-0" Jan 28 20:59:57 crc kubenswrapper[4746]: I0128 20:59:57.144428 4746 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/bae342c7-f51f-4da2-a419-61002cc82f59-httpd-run\") pod \"glance-default-external-api-0\" (UID: \"bae342c7-f51f-4da2-a419-61002cc82f59\") " pod="openstack/glance-default-external-api-0" Jan 28 20:59:57 crc kubenswrapper[4746]: I0128 20:59:57.144452 4746 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/8355d3eb-8a59-4393-b04b-a44d9dd7824f-httpd-run\") pod \"glance-default-internal-api-0\" (UID: \"8355d3eb-8a59-4393-b04b-a44d9dd7824f\") " pod="openstack/glance-default-internal-api-0" Jan 28 20:59:57 crc kubenswrapper[4746]: I0128 20:59:57.144733 4746 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/8355d3eb-8a59-4393-b04b-a44d9dd7824f-logs\") pod \"glance-default-internal-api-0\" (UID: \"8355d3eb-8a59-4393-b04b-a44d9dd7824f\") " pod="openstack/glance-default-internal-api-0" Jan 28 20:59:57 crc kubenswrapper[4746]: I0128 20:59:57.147635 4746 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/bae342c7-f51f-4da2-a419-61002cc82f59-scripts\") pod \"glance-default-external-api-0\" (UID: \"bae342c7-f51f-4da2-a419-61002cc82f59\") " pod="openstack/glance-default-external-api-0" Jan 28 20:59:57 crc kubenswrapper[4746]: I0128 20:59:57.147660 4746 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/8355d3eb-8a59-4393-b04b-a44d9dd7824f-internal-tls-certs\") pod \"glance-default-internal-api-0\" (UID: 
\"8355d3eb-8a59-4393-b04b-a44d9dd7824f\") " pod="openstack/glance-default-internal-api-0" Jan 28 20:59:57 crc kubenswrapper[4746]: I0128 20:59:57.147872 4746 csi_attacher.go:380] kubernetes.io/csi: attacher.MountDevice STAGE_UNSTAGE_VOLUME capability not set. Skipping MountDevice... Jan 28 20:59:57 crc kubenswrapper[4746]: I0128 20:59:57.147906 4746 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"pvc-95469877-e687-4d8b-97fe-080814cf4383\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-95469877-e687-4d8b-97fe-080814cf4383\") pod \"glance-default-external-api-0\" (UID: \"bae342c7-f51f-4da2-a419-61002cc82f59\") device mount path \"/var/lib/kubelet/plugins/kubernetes.io/csi/kubevirt.io.hostpath-provisioner/ca8c336a9061554abfcbd88e82c904ba958cd0f903c4270744870a313861497c/globalmount\"" pod="openstack/glance-default-external-api-0" Jan 28 20:59:57 crc kubenswrapper[4746]: I0128 20:59:57.148025 4746 csi_attacher.go:380] kubernetes.io/csi: attacher.MountDevice STAGE_UNSTAGE_VOLUME capability not set. Skipping MountDevice... 
Jan 28 20:59:57 crc kubenswrapper[4746]: I0128 20:59:57.148068 4746 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"pvc-d0cac59c-8041-46c0-a355-d5d8d1738a5d\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-d0cac59c-8041-46c0-a355-d5d8d1738a5d\") pod \"glance-default-internal-api-0\" (UID: \"8355d3eb-8a59-4393-b04b-a44d9dd7824f\") device mount path \"/var/lib/kubelet/plugins/kubernetes.io/csi/kubevirt.io.hostpath-provisioner/04ac8a11379bea80fa995f6a81e21b4d865afcddeb1b26b6c4ebff6fe431a0b6/globalmount\"" pod="openstack/glance-default-internal-api-0" Jan 28 20:59:57 crc kubenswrapper[4746]: I0128 20:59:57.148296 4746 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/8355d3eb-8a59-4393-b04b-a44d9dd7824f-combined-ca-bundle\") pod \"glance-default-internal-api-0\" (UID: \"8355d3eb-8a59-4393-b04b-a44d9dd7824f\") " pod="openstack/glance-default-internal-api-0" Jan 28 20:59:57 crc kubenswrapper[4746]: I0128 20:59:57.148422 4746 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/bae342c7-f51f-4da2-a419-61002cc82f59-config-data\") pod \"glance-default-external-api-0\" (UID: \"bae342c7-f51f-4da2-a419-61002cc82f59\") " pod="openstack/glance-default-external-api-0" Jan 28 20:59:57 crc kubenswrapper[4746]: I0128 20:59:57.149477 4746 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/bae342c7-f51f-4da2-a419-61002cc82f59-combined-ca-bundle\") pod \"glance-default-external-api-0\" (UID: \"bae342c7-f51f-4da2-a419-61002cc82f59\") " pod="openstack/glance-default-external-api-0" Jan 28 20:59:57 crc kubenswrapper[4746]: I0128 20:59:57.152950 4746 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/8355d3eb-8a59-4393-b04b-a44d9dd7824f-scripts\") pod 
\"glance-default-internal-api-0\" (UID: \"8355d3eb-8a59-4393-b04b-a44d9dd7824f\") " pod="openstack/glance-default-internal-api-0" Jan 28 20:59:57 crc kubenswrapper[4746]: I0128 20:59:57.154415 4746 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/bae342c7-f51f-4da2-a419-61002cc82f59-public-tls-certs\") pod \"glance-default-external-api-0\" (UID: \"bae342c7-f51f-4da2-a419-61002cc82f59\") " pod="openstack/glance-default-external-api-0" Jan 28 20:59:57 crc kubenswrapper[4746]: I0128 20:59:57.163048 4746 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/8355d3eb-8a59-4393-b04b-a44d9dd7824f-config-data\") pod \"glance-default-internal-api-0\" (UID: \"8355d3eb-8a59-4393-b04b-a44d9dd7824f\") " pod="openstack/glance-default-internal-api-0" Jan 28 20:59:57 crc kubenswrapper[4746]: I0128 20:59:57.164035 4746 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-87vlc\" (UniqueName: \"kubernetes.io/projected/bae342c7-f51f-4da2-a419-61002cc82f59-kube-api-access-87vlc\") pod \"glance-default-external-api-0\" (UID: \"bae342c7-f51f-4da2-a419-61002cc82f59\") " pod="openstack/glance-default-external-api-0" Jan 28 20:59:57 crc kubenswrapper[4746]: I0128 20:59:57.169963 4746 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-zhffj\" (UniqueName: \"kubernetes.io/projected/8355d3eb-8a59-4393-b04b-a44d9dd7824f-kube-api-access-zhffj\") pod \"glance-default-internal-api-0\" (UID: \"8355d3eb-8a59-4393-b04b-a44d9dd7824f\") " pod="openstack/glance-default-internal-api-0" Jan 28 20:59:57 crc kubenswrapper[4746]: I0128 20:59:57.188849 4746 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"pvc-95469877-e687-4d8b-97fe-080814cf4383\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-95469877-e687-4d8b-97fe-080814cf4383\") pod 
\"glance-default-external-api-0\" (UID: \"bae342c7-f51f-4da2-a419-61002cc82f59\") " pod="openstack/glance-default-external-api-0"
Jan 28 20:59:57 crc kubenswrapper[4746]: I0128 20:59:57.201514 4746 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"pvc-d0cac59c-8041-46c0-a355-d5d8d1738a5d\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-d0cac59c-8041-46c0-a355-d5d8d1738a5d\") pod \"glance-default-internal-api-0\" (UID: \"8355d3eb-8a59-4393-b04b-a44d9dd7824f\") " pod="openstack/glance-default-internal-api-0"
Jan 28 20:59:57 crc kubenswrapper[4746]: I0128 20:59:57.240423 4746 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/glance-default-internal-api-0"
Jan 28 20:59:57 crc kubenswrapper[4746]: I0128 20:59:57.299783 4746 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/glance-default-external-api-0"
Jan 28 20:59:57 crc kubenswrapper[4746]: I0128 20:59:57.775603 4746 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-default-internal-api-0"]
Jan 28 20:59:57 crc kubenswrapper[4746]: I0128 20:59:57.833558 4746 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-internal-api-0" event={"ID":"8355d3eb-8a59-4393-b04b-a44d9dd7824f","Type":"ContainerStarted","Data":"a27ca5c58be1de3f3f8262768402c61ffabf4e7622be35e73a021d98a7eb4b0c"}
Jan 28 20:59:57 crc kubenswrapper[4746]: I0128 20:59:57.894862 4746 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-default-external-api-0"]
Jan 28 20:59:57 crc kubenswrapper[4746]: W0128 20:59:57.908722 4746 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podbae342c7_f51f_4da2_a419_61002cc82f59.slice/crio-9dfdd9c2342569e8ca259d7dfebb6487fe980a383cbf54d15bbad64325b0791b WatchSource:0}: Error finding container 9dfdd9c2342569e8ca259d7dfebb6487fe980a383cbf54d15bbad64325b0791b: Status 404 returned error can't find the container with id 9dfdd9c2342569e8ca259d7dfebb6487fe980a383cbf54d15bbad64325b0791b
Jan 28 20:59:58 crc kubenswrapper[4746]: I0128 20:59:58.264524 4746 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/placement-db-sync-jms86"
Jan 28 20:59:58 crc kubenswrapper[4746]: I0128 20:59:58.378898 4746 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/1d79950b-c574-4952-8620-ff635db5e8de-combined-ca-bundle\") pod \"1d79950b-c574-4952-8620-ff635db5e8de\" (UID: \"1d79950b-c574-4952-8620-ff635db5e8de\") "
Jan 28 20:59:58 crc kubenswrapper[4746]: I0128 20:59:58.378981 4746 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/1d79950b-c574-4952-8620-ff635db5e8de-scripts\") pod \"1d79950b-c574-4952-8620-ff635db5e8de\" (UID: \"1d79950b-c574-4952-8620-ff635db5e8de\") "
Jan 28 20:59:58 crc kubenswrapper[4746]: I0128 20:59:58.379004 4746 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/1d79950b-c574-4952-8620-ff635db5e8de-config-data\") pod \"1d79950b-c574-4952-8620-ff635db5e8de\" (UID: \"1d79950b-c574-4952-8620-ff635db5e8de\") "
Jan 28 20:59:58 crc kubenswrapper[4746]: I0128 20:59:58.379070 4746 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-cskxd\" (UniqueName: \"kubernetes.io/projected/1d79950b-c574-4952-8620-ff635db5e8de-kube-api-access-cskxd\") pod \"1d79950b-c574-4952-8620-ff635db5e8de\" (UID: \"1d79950b-c574-4952-8620-ff635db5e8de\") "
Jan 28 20:59:58 crc kubenswrapper[4746]: I0128 20:59:58.379133 4746 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/1d79950b-c574-4952-8620-ff635db5e8de-logs\") pod \"1d79950b-c574-4952-8620-ff635db5e8de\" (UID: \"1d79950b-c574-4952-8620-ff635db5e8de\") "
Jan 28 20:59:58 crc kubenswrapper[4746]: I0128 20:59:58.382311 4746 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/1d79950b-c574-4952-8620-ff635db5e8de-logs" (OuterVolumeSpecName: "logs") pod "1d79950b-c574-4952-8620-ff635db5e8de" (UID: "1d79950b-c574-4952-8620-ff635db5e8de"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Jan 28 20:59:58 crc kubenswrapper[4746]: I0128 20:59:58.389236 4746 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/1d79950b-c574-4952-8620-ff635db5e8de-scripts" (OuterVolumeSpecName: "scripts") pod "1d79950b-c574-4952-8620-ff635db5e8de" (UID: "1d79950b-c574-4952-8620-ff635db5e8de"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue ""
Jan 28 20:59:58 crc kubenswrapper[4746]: I0128 20:59:58.390689 4746 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/1d79950b-c574-4952-8620-ff635db5e8de-kube-api-access-cskxd" (OuterVolumeSpecName: "kube-api-access-cskxd") pod "1d79950b-c574-4952-8620-ff635db5e8de" (UID: "1d79950b-c574-4952-8620-ff635db5e8de"). InnerVolumeSpecName "kube-api-access-cskxd". PluginName "kubernetes.io/projected", VolumeGidValue ""
Jan 28 20:59:58 crc kubenswrapper[4746]: I0128 20:59:58.414224 4746 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/1d79950b-c574-4952-8620-ff635db5e8de-config-data" (OuterVolumeSpecName: "config-data") pod "1d79950b-c574-4952-8620-ff635db5e8de" (UID: "1d79950b-c574-4952-8620-ff635db5e8de"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue ""
Jan 28 20:59:58 crc kubenswrapper[4746]: I0128 20:59:58.424006 4746 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/1d79950b-c574-4952-8620-ff635db5e8de-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "1d79950b-c574-4952-8620-ff635db5e8de" (UID: "1d79950b-c574-4952-8620-ff635db5e8de"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue ""
Jan 28 20:59:58 crc kubenswrapper[4746]: I0128 20:59:58.441231 4746 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-bootstrap-lwqj2"
Jan 28 20:59:58 crc kubenswrapper[4746]: I0128 20:59:58.482761 4746 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-vgtdv\" (UniqueName: \"kubernetes.io/projected/f81b35d0-5755-476e-a5c9-30036d654d53-kube-api-access-vgtdv\") pod \"f81b35d0-5755-476e-a5c9-30036d654d53\" (UID: \"f81b35d0-5755-476e-a5c9-30036d654d53\") "
Jan 28 20:59:58 crc kubenswrapper[4746]: I0128 20:59:58.482891 4746 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/f81b35d0-5755-476e-a5c9-30036d654d53-credential-keys\") pod \"f81b35d0-5755-476e-a5c9-30036d654d53\" (UID: \"f81b35d0-5755-476e-a5c9-30036d654d53\") "
Jan 28 20:59:58 crc kubenswrapper[4746]: I0128 20:59:58.482956 4746 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/f81b35d0-5755-476e-a5c9-30036d654d53-scripts\") pod \"f81b35d0-5755-476e-a5c9-30036d654d53\" (UID: \"f81b35d0-5755-476e-a5c9-30036d654d53\") "
Jan 28 20:59:58 crc kubenswrapper[4746]: I0128 20:59:58.483050 4746 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/f81b35d0-5755-476e-a5c9-30036d654d53-combined-ca-bundle\") pod \"f81b35d0-5755-476e-a5c9-30036d654d53\" (UID: \"f81b35d0-5755-476e-a5c9-30036d654d53\") "
Jan 28 20:59:58 crc kubenswrapper[4746]: I0128 20:59:58.483235 4746 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/f81b35d0-5755-476e-a5c9-30036d654d53-fernet-keys\") pod \"f81b35d0-5755-476e-a5c9-30036d654d53\" (UID: \"f81b35d0-5755-476e-a5c9-30036d654d53\") "
Jan 28 20:59:58 crc kubenswrapper[4746]: I0128 20:59:58.483280 4746 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/f81b35d0-5755-476e-a5c9-30036d654d53-config-data\") pod \"f81b35d0-5755-476e-a5c9-30036d654d53\" (UID: \"f81b35d0-5755-476e-a5c9-30036d654d53\") "
Jan 28 20:59:58 crc kubenswrapper[4746]: I0128 20:59:58.483872 4746 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/1d79950b-c574-4952-8620-ff635db5e8de-scripts\") on node \"crc\" DevicePath \"\""
Jan 28 20:59:58 crc kubenswrapper[4746]: I0128 20:59:58.483893 4746 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/1d79950b-c574-4952-8620-ff635db5e8de-config-data\") on node \"crc\" DevicePath \"\""
Jan 28 20:59:58 crc kubenswrapper[4746]: I0128 20:59:58.483904 4746 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-cskxd\" (UniqueName: \"kubernetes.io/projected/1d79950b-c574-4952-8620-ff635db5e8de-kube-api-access-cskxd\") on node \"crc\" DevicePath \"\""
Jan 28 20:59:58 crc kubenswrapper[4746]: I0128 20:59:58.483915 4746 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/1d79950b-c574-4952-8620-ff635db5e8de-logs\") on node \"crc\" DevicePath \"\""
Jan 28 20:59:58 crc kubenswrapper[4746]: I0128 20:59:58.483925 4746 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/1d79950b-c574-4952-8620-ff635db5e8de-combined-ca-bundle\") on node \"crc\" DevicePath \"\""
Jan 28 20:59:58 crc kubenswrapper[4746]: I0128 20:59:58.529945 4746 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/f81b35d0-5755-476e-a5c9-30036d654d53-credential-keys" (OuterVolumeSpecName: "credential-keys") pod "f81b35d0-5755-476e-a5c9-30036d654d53" (UID: "f81b35d0-5755-476e-a5c9-30036d654d53"). InnerVolumeSpecName "credential-keys". PluginName "kubernetes.io/secret", VolumeGidValue ""
Jan 28 20:59:58 crc kubenswrapper[4746]: I0128 20:59:58.532622 4746 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/f81b35d0-5755-476e-a5c9-30036d654d53-scripts" (OuterVolumeSpecName: "scripts") pod "f81b35d0-5755-476e-a5c9-30036d654d53" (UID: "f81b35d0-5755-476e-a5c9-30036d654d53"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue ""
Jan 28 20:59:58 crc kubenswrapper[4746]: I0128 20:59:58.538687 4746 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/f81b35d0-5755-476e-a5c9-30036d654d53-kube-api-access-vgtdv" (OuterVolumeSpecName: "kube-api-access-vgtdv") pod "f81b35d0-5755-476e-a5c9-30036d654d53" (UID: "f81b35d0-5755-476e-a5c9-30036d654d53"). InnerVolumeSpecName "kube-api-access-vgtdv". PluginName "kubernetes.io/projected", VolumeGidValue ""
Jan 28 20:59:58 crc kubenswrapper[4746]: I0128 20:59:58.542252 4746 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/f81b35d0-5755-476e-a5c9-30036d654d53-fernet-keys" (OuterVolumeSpecName: "fernet-keys") pod "f81b35d0-5755-476e-a5c9-30036d654d53" (UID: "f81b35d0-5755-476e-a5c9-30036d654d53"). InnerVolumeSpecName "fernet-keys". PluginName "kubernetes.io/secret", VolumeGidValue ""
Jan 28 20:59:58 crc kubenswrapper[4746]: I0128 20:59:58.575233 4746 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/f81b35d0-5755-476e-a5c9-30036d654d53-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "f81b35d0-5755-476e-a5c9-30036d654d53" (UID: "f81b35d0-5755-476e-a5c9-30036d654d53"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue ""
Jan 28 20:59:58 crc kubenswrapper[4746]: I0128 20:59:58.582203 4746 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/f81b35d0-5755-476e-a5c9-30036d654d53-config-data" (OuterVolumeSpecName: "config-data") pod "f81b35d0-5755-476e-a5c9-30036d654d53" (UID: "f81b35d0-5755-476e-a5c9-30036d654d53"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue ""
Jan 28 20:59:58 crc kubenswrapper[4746]: I0128 20:59:58.585205 4746 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/f81b35d0-5755-476e-a5c9-30036d654d53-combined-ca-bundle\") on node \"crc\" DevicePath \"\""
Jan 28 20:59:58 crc kubenswrapper[4746]: I0128 20:59:58.585232 4746 reconciler_common.go:293] "Volume detached for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/f81b35d0-5755-476e-a5c9-30036d654d53-fernet-keys\") on node \"crc\" DevicePath \"\""
Jan 28 20:59:58 crc kubenswrapper[4746]: I0128 20:59:58.585241 4746 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/f81b35d0-5755-476e-a5c9-30036d654d53-config-data\") on node \"crc\" DevicePath \"\""
Jan 28 20:59:58 crc kubenswrapper[4746]: I0128 20:59:58.585251 4746 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-vgtdv\" (UniqueName: \"kubernetes.io/projected/f81b35d0-5755-476e-a5c9-30036d654d53-kube-api-access-vgtdv\") on node \"crc\" DevicePath \"\""
Jan 28 20:59:58 crc kubenswrapper[4746]: I0128 20:59:58.585261 4746 reconciler_common.go:293] "Volume detached for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/f81b35d0-5755-476e-a5c9-30036d654d53-credential-keys\") on node \"crc\" DevicePath \"\""
Jan 28 20:59:58 crc kubenswrapper[4746]: I0128 20:59:58.585268 4746 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/f81b35d0-5755-476e-a5c9-30036d654d53-scripts\") on node \"crc\" DevicePath \"\""
Jan 28 20:59:58 crc kubenswrapper[4746]: I0128 20:59:58.890257 4746 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="854c60ea-889a-449a-a74b-39f6f973f52c" path="/var/lib/kubelet/pods/854c60ea-889a-449a-a74b-39f6f973f52c/volumes"
Jan 28 20:59:58 crc kubenswrapper[4746]: I0128 20:59:58.891883 4746 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="be7f3403-801e-45de-9517-8b3ca91d9682" path="/var/lib/kubelet/pods/be7f3403-801e-45de-9517-8b3ca91d9682/volumes"
Jan 28 20:59:58 crc kubenswrapper[4746]: I0128 20:59:58.900656 4746 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="e3a2491c-4637-4cba-a11d-7c8dd8c703d1" path="/var/lib/kubelet/pods/e3a2491c-4637-4cba-a11d-7c8dd8c703d1/volumes"
Jan 28 20:59:58 crc kubenswrapper[4746]: I0128 20:59:58.919526 4746 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/placement-db-sync-jms86"
Jan 28 20:59:58 crc kubenswrapper[4746]: I0128 20:59:58.919678 4746 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-db-sync-jms86" event={"ID":"1d79950b-c574-4952-8620-ff635db5e8de","Type":"ContainerDied","Data":"d57f309a75fd50699f5749a419ddbb64210cff0c9de91f7336112d123a4bb307"}
Jan 28 20:59:58 crc kubenswrapper[4746]: I0128 20:59:58.919762 4746 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="d57f309a75fd50699f5749a419ddbb64210cff0c9de91f7336112d123a4bb307"
Jan 28 20:59:58 crc kubenswrapper[4746]: I0128 20:59:58.938495 4746 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-bootstrap-lwqj2" event={"ID":"f81b35d0-5755-476e-a5c9-30036d654d53","Type":"ContainerDied","Data":"cd23339f5df85b446202742504369732d4a27be983614bcb13cac53292b9e8cd"}
Jan 28 20:59:58 crc kubenswrapper[4746]: I0128 20:59:58.938535 4746 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="cd23339f5df85b446202742504369732d4a27be983614bcb13cac53292b9e8cd"
Jan 28 20:59:58 crc kubenswrapper[4746]: I0128 20:59:58.938654 4746 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-bootstrap-lwqj2"
Jan 28 20:59:58 crc kubenswrapper[4746]: I0128 20:59:58.958323 4746 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-internal-api-0" event={"ID":"8355d3eb-8a59-4393-b04b-a44d9dd7824f","Type":"ContainerStarted","Data":"0f88e834003af2207446e657d01363eeba0cbb7e9726650317015c56dd154be4"}
Jan 28 20:59:58 crc kubenswrapper[4746]: I0128 20:59:58.971666 4746 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-external-api-0" event={"ID":"bae342c7-f51f-4da2-a419-61002cc82f59","Type":"ContainerStarted","Data":"9dfdd9c2342569e8ca259d7dfebb6487fe980a383cbf54d15bbad64325b0791b"}
Jan 28 20:59:58 crc kubenswrapper[4746]: I0128 20:59:58.996851 4746 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-db-sync-wwfkc" event={"ID":"a587d3d9-972c-47ae-8e29-5bfd977ff429","Type":"ContainerStarted","Data":"bd6ee0278b68974bd3186415cd7d01885d9dea5c6ff0d0064894eda32cec0c8f"}
Jan 28 20:59:59 crc kubenswrapper[4746]: I0128 20:59:59.019970 4746 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/placement-5755bdbcc4-rbmx8"]
Jan 28 20:59:59 crc kubenswrapper[4746]: E0128 20:59:59.020452 4746 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="1d79950b-c574-4952-8620-ff635db5e8de" containerName="placement-db-sync"
Jan 28 20:59:59 crc kubenswrapper[4746]: I0128 20:59:59.020471 4746 state_mem.go:107] "Deleted CPUSet assignment" podUID="1d79950b-c574-4952-8620-ff635db5e8de" containerName="placement-db-sync"
Jan 28 20:59:59 crc kubenswrapper[4746]: E0128 20:59:59.020495 4746 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f81b35d0-5755-476e-a5c9-30036d654d53" containerName="keystone-bootstrap"
Jan 28 20:59:59 crc kubenswrapper[4746]: I0128 20:59:59.020502 4746 state_mem.go:107] "Deleted CPUSet assignment" podUID="f81b35d0-5755-476e-a5c9-30036d654d53" containerName="keystone-bootstrap"
Jan 28 20:59:59 crc kubenswrapper[4746]: I0128 20:59:59.020670 4746 memory_manager.go:354] "RemoveStaleState removing state" podUID="1d79950b-c574-4952-8620-ff635db5e8de" containerName="placement-db-sync"
Jan 28 20:59:59 crc kubenswrapper[4746]: I0128 20:59:59.020700 4746 memory_manager.go:354] "RemoveStaleState removing state" podUID="f81b35d0-5755-476e-a5c9-30036d654d53" containerName="keystone-bootstrap"
Jan 28 20:59:59 crc kubenswrapper[4746]: I0128 20:59:59.021679 4746 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/placement-5755bdbcc4-rbmx8"
Jan 28 20:59:59 crc kubenswrapper[4746]: I0128 20:59:59.025311 4746 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"placement-scripts"
Jan 28 20:59:59 crc kubenswrapper[4746]: I0128 20:59:59.025529 4746 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-placement-public-svc"
Jan 28 20:59:59 crc kubenswrapper[4746]: I0128 20:59:59.025658 4746 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"placement-config-data"
Jan 28 20:59:59 crc kubenswrapper[4746]: I0128 20:59:59.025764 4746 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-placement-internal-svc"
Jan 28 20:59:59 crc kubenswrapper[4746]: I0128 20:59:59.027617 4746 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"placement-placement-dockercfg-l5j6d"
Jan 28 20:59:59 crc kubenswrapper[4746]: I0128 20:59:59.056825 4746 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/cinder-db-sync-wwfkc" podStartSLOduration=3.9192619090000003 podStartE2EDuration="43.056805878s" podCreationTimestamp="2026-01-28 20:59:16 +0000 UTC" firstStartedPulling="2026-01-28 20:59:18.233752211 +0000 UTC m=+1186.189938565" lastFinishedPulling="2026-01-28 20:59:57.37129618 +0000 UTC m=+1225.327482534" observedRunningTime="2026-01-28 20:59:59.024977371 +0000 UTC m=+1226.981163725" watchObservedRunningTime="2026-01-28 20:59:59.056805878 +0000 UTC m=+1227.012992232"
Jan 28 20:59:59 crc kubenswrapper[4746]: I0128 20:59:59.057973 4746 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/placement-5755bdbcc4-rbmx8"]
Jan 28 20:59:59 crc kubenswrapper[4746]: I0128 20:59:59.098421 4746 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/12408645-b253-4e59-bd2f-5a4ec243cabd-config-data\") pod \"placement-5755bdbcc4-rbmx8\" (UID: \"12408645-b253-4e59-bd2f-5a4ec243cabd\") " pod="openstack/placement-5755bdbcc4-rbmx8"
Jan 28 20:59:59 crc kubenswrapper[4746]: I0128 20:59:59.098499 4746 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/12408645-b253-4e59-bd2f-5a4ec243cabd-combined-ca-bundle\") pod \"placement-5755bdbcc4-rbmx8\" (UID: \"12408645-b253-4e59-bd2f-5a4ec243cabd\") " pod="openstack/placement-5755bdbcc4-rbmx8"
Jan 28 20:59:59 crc kubenswrapper[4746]: I0128 20:59:59.098558 4746 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/12408645-b253-4e59-bd2f-5a4ec243cabd-logs\") pod \"placement-5755bdbcc4-rbmx8\" (UID: \"12408645-b253-4e59-bd2f-5a4ec243cabd\") " pod="openstack/placement-5755bdbcc4-rbmx8"
Jan 28 20:59:59 crc kubenswrapper[4746]: I0128 20:59:59.098584 4746 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/12408645-b253-4e59-bd2f-5a4ec243cabd-scripts\") pod \"placement-5755bdbcc4-rbmx8\" (UID: \"12408645-b253-4e59-bd2f-5a4ec243cabd\") " pod="openstack/placement-5755bdbcc4-rbmx8"
Jan 28 20:59:59 crc kubenswrapper[4746]: I0128 20:59:59.098702 4746 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/12408645-b253-4e59-bd2f-5a4ec243cabd-public-tls-certs\") pod \"placement-5755bdbcc4-rbmx8\" (UID: \"12408645-b253-4e59-bd2f-5a4ec243cabd\") " pod="openstack/placement-5755bdbcc4-rbmx8"
Jan 28 20:59:59 crc kubenswrapper[4746]: I0128 20:59:59.098724 4746 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/12408645-b253-4e59-bd2f-5a4ec243cabd-internal-tls-certs\") pod \"placement-5755bdbcc4-rbmx8\" (UID: \"12408645-b253-4e59-bd2f-5a4ec243cabd\") " pod="openstack/placement-5755bdbcc4-rbmx8"
Jan 28 20:59:59 crc kubenswrapper[4746]: I0128 20:59:59.098817 4746 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-m65bx\" (UniqueName: \"kubernetes.io/projected/12408645-b253-4e59-bd2f-5a4ec243cabd-kube-api-access-m65bx\") pod \"placement-5755bdbcc4-rbmx8\" (UID: \"12408645-b253-4e59-bd2f-5a4ec243cabd\") " pod="openstack/placement-5755bdbcc4-rbmx8"
Jan 28 20:59:59 crc kubenswrapper[4746]: I0128 20:59:59.131953 4746 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/keystone-6ff88f78d4-bh6qm"]
Jan 28 20:59:59 crc kubenswrapper[4746]: I0128 20:59:59.133192 4746 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-6ff88f78d4-bh6qm"
Jan 28 20:59:59 crc kubenswrapper[4746]: I0128 20:59:59.141485 4746 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone-config-data"
Jan 28 20:59:59 crc kubenswrapper[4746]: I0128 20:59:59.141949 4746 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone-scripts"
Jan 28 20:59:59 crc kubenswrapper[4746]: I0128 20:59:59.142082 4746 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone"
Jan 28 20:59:59 crc kubenswrapper[4746]: I0128 20:59:59.142250 4746 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone-keystone-dockercfg-j5js5"
Jan 28 20:59:59 crc kubenswrapper[4746]: I0128 20:59:59.142547 4746 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-keystone-internal-svc"
Jan 28 20:59:59 crc kubenswrapper[4746]: I0128 20:59:59.142702 4746 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-keystone-public-svc"
Jan 28 20:59:59 crc kubenswrapper[4746]: I0128 20:59:59.180649 4746 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/keystone-6ff88f78d4-bh6qm"]
Jan 28 20:59:59 crc kubenswrapper[4746]: I0128 20:59:59.203680 4746 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/12408645-b253-4e59-bd2f-5a4ec243cabd-logs\") pod \"placement-5755bdbcc4-rbmx8\" (UID: \"12408645-b253-4e59-bd2f-5a4ec243cabd\") " pod="openstack/placement-5755bdbcc4-rbmx8"
Jan 28 20:59:59 crc kubenswrapper[4746]: I0128 20:59:59.203782 4746 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/12408645-b253-4e59-bd2f-5a4ec243cabd-scripts\") pod \"placement-5755bdbcc4-rbmx8\" (UID: \"12408645-b253-4e59-bd2f-5a4ec243cabd\") " pod="openstack/placement-5755bdbcc4-rbmx8"
Jan 28 20:59:59 crc kubenswrapper[4746]: I0128 20:59:59.203843 4746 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/12408645-b253-4e59-bd2f-5a4ec243cabd-public-tls-certs\") pod \"placement-5755bdbcc4-rbmx8\" (UID: \"12408645-b253-4e59-bd2f-5a4ec243cabd\") " pod="openstack/placement-5755bdbcc4-rbmx8"
Jan 28 20:59:59 crc kubenswrapper[4746]: I0128 20:59:59.203888 4746 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/12408645-b253-4e59-bd2f-5a4ec243cabd-internal-tls-certs\") pod \"placement-5755bdbcc4-rbmx8\" (UID: \"12408645-b253-4e59-bd2f-5a4ec243cabd\") " pod="openstack/placement-5755bdbcc4-rbmx8"
Jan 28 20:59:59 crc kubenswrapper[4746]: I0128 20:59:59.203943 4746 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/b8f1ba06-a425-4474-94a2-80c68832caac-combined-ca-bundle\") pod \"keystone-6ff88f78d4-bh6qm\" (UID: \"b8f1ba06-a425-4474-94a2-80c68832caac\") " pod="openstack/keystone-6ff88f78d4-bh6qm"
Jan 28 20:59:59 crc kubenswrapper[4746]: I0128 20:59:59.204201 4746 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/b8f1ba06-a425-4474-94a2-80c68832caac-config-data\") pod \"keystone-6ff88f78d4-bh6qm\" (UID: \"b8f1ba06-a425-4474-94a2-80c68832caac\") " pod="openstack/keystone-6ff88f78d4-bh6qm"
Jan 28 20:59:59 crc kubenswrapper[4746]: I0128 20:59:59.204564 4746 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/b8f1ba06-a425-4474-94a2-80c68832caac-scripts\") pod \"keystone-6ff88f78d4-bh6qm\" (UID: \"b8f1ba06-a425-4474-94a2-80c68832caac\") " pod="openstack/keystone-6ff88f78d4-bh6qm"
Jan 28 20:59:59 crc kubenswrapper[4746]: I0128 20:59:59.204677 4746 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-m65bx\" (UniqueName: \"kubernetes.io/projected/12408645-b253-4e59-bd2f-5a4ec243cabd-kube-api-access-m65bx\") pod \"placement-5755bdbcc4-rbmx8\" (UID: \"12408645-b253-4e59-bd2f-5a4ec243cabd\") " pod="openstack/placement-5755bdbcc4-rbmx8"
Jan 28 20:59:59 crc kubenswrapper[4746]: I0128 20:59:59.204728 4746 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/b8f1ba06-a425-4474-94a2-80c68832caac-credential-keys\") pod \"keystone-6ff88f78d4-bh6qm\" (UID: \"b8f1ba06-a425-4474-94a2-80c68832caac\") " pod="openstack/keystone-6ff88f78d4-bh6qm"
Jan 28 20:59:59 crc kubenswrapper[4746]: I0128 20:59:59.204786 4746 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-jkk4s\" (UniqueName: \"kubernetes.io/projected/b8f1ba06-a425-4474-94a2-80c68832caac-kube-api-access-jkk4s\") pod \"keystone-6ff88f78d4-bh6qm\" (UID: \"b8f1ba06-a425-4474-94a2-80c68832caac\") " pod="openstack/keystone-6ff88f78d4-bh6qm"
Jan 28 20:59:59 crc kubenswrapper[4746]: I0128 20:59:59.204946 4746 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/b8f1ba06-a425-4474-94a2-80c68832caac-public-tls-certs\") pod \"keystone-6ff88f78d4-bh6qm\" (UID: \"b8f1ba06-a425-4474-94a2-80c68832caac\") " pod="openstack/keystone-6ff88f78d4-bh6qm"
Jan 28 20:59:59 crc kubenswrapper[4746]: I0128 20:59:59.205142 4746 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/12408645-b253-4e59-bd2f-5a4ec243cabd-config-data\") pod \"placement-5755bdbcc4-rbmx8\" (UID: \"12408645-b253-4e59-bd2f-5a4ec243cabd\") " pod="openstack/placement-5755bdbcc4-rbmx8"
Jan 28 20:59:59 crc kubenswrapper[4746]: I0128 20:59:59.205231 4746 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/b8f1ba06-a425-4474-94a2-80c68832caac-internal-tls-certs\") pod \"keystone-6ff88f78d4-bh6qm\" (UID: \"b8f1ba06-a425-4474-94a2-80c68832caac\") " pod="openstack/keystone-6ff88f78d4-bh6qm"
Jan 28 20:59:59 crc kubenswrapper[4746]: I0128 20:59:59.205263 4746 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/12408645-b253-4e59-bd2f-5a4ec243cabd-combined-ca-bundle\") pod \"placement-5755bdbcc4-rbmx8\" (UID: \"12408645-b253-4e59-bd2f-5a4ec243cabd\") " pod="openstack/placement-5755bdbcc4-rbmx8"
Jan 28 20:59:59 crc kubenswrapper[4746]: I0128 20:59:59.205333 4746 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/b8f1ba06-a425-4474-94a2-80c68832caac-fernet-keys\") pod \"keystone-6ff88f78d4-bh6qm\" (UID: \"b8f1ba06-a425-4474-94a2-80c68832caac\") " pod="openstack/keystone-6ff88f78d4-bh6qm"
Jan 28 20:59:59 crc kubenswrapper[4746]: I0128 20:59:59.206360 4746 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/12408645-b253-4e59-bd2f-5a4ec243cabd-logs\") pod \"placement-5755bdbcc4-rbmx8\" (UID: \"12408645-b253-4e59-bd2f-5a4ec243cabd\") " pod="openstack/placement-5755bdbcc4-rbmx8"
Jan 28 20:59:59 crc kubenswrapper[4746]: I0128 20:59:59.216878 4746 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/12408645-b253-4e59-bd2f-5a4ec243cabd-internal-tls-certs\") pod \"placement-5755bdbcc4-rbmx8\" (UID: \"12408645-b253-4e59-bd2f-5a4ec243cabd\") " pod="openstack/placement-5755bdbcc4-rbmx8"
Jan 28 20:59:59 crc kubenswrapper[4746]: I0128 20:59:59.238131 4746 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/12408645-b253-4e59-bd2f-5a4ec243cabd-combined-ca-bundle\") pod \"placement-5755bdbcc4-rbmx8\" (UID: \"12408645-b253-4e59-bd2f-5a4ec243cabd\") " pod="openstack/placement-5755bdbcc4-rbmx8"
Jan 28 20:59:59 crc kubenswrapper[4746]: I0128 20:59:59.239845 4746 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/12408645-b253-4e59-bd2f-5a4ec243cabd-scripts\") pod \"placement-5755bdbcc4-rbmx8\" (UID: \"12408645-b253-4e59-bd2f-5a4ec243cabd\") " pod="openstack/placement-5755bdbcc4-rbmx8"
Jan 28 20:59:59 crc kubenswrapper[4746]: I0128 20:59:59.240422 4746 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/12408645-b253-4e59-bd2f-5a4ec243cabd-public-tls-certs\") pod \"placement-5755bdbcc4-rbmx8\" (UID: \"12408645-b253-4e59-bd2f-5a4ec243cabd\") " pod="openstack/placement-5755bdbcc4-rbmx8"
Jan 28 20:59:59 crc kubenswrapper[4746]: I0128 20:59:59.240549 4746 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/12408645-b253-4e59-bd2f-5a4ec243cabd-config-data\") pod \"placement-5755bdbcc4-rbmx8\" (UID: \"12408645-b253-4e59-bd2f-5a4ec243cabd\") " pod="openstack/placement-5755bdbcc4-rbmx8"
Jan 28 20:59:59 crc kubenswrapper[4746]: I0128 20:59:59.255952 4746 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-m65bx\" (UniqueName: \"kubernetes.io/projected/12408645-b253-4e59-bd2f-5a4ec243cabd-kube-api-access-m65bx\") pod \"placement-5755bdbcc4-rbmx8\" (UID: \"12408645-b253-4e59-bd2f-5a4ec243cabd\") " pod="openstack/placement-5755bdbcc4-rbmx8"
Jan 28 20:59:59 crc kubenswrapper[4746]: I0128 20:59:59.307862 4746 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/b8f1ba06-a425-4474-94a2-80c68832caac-public-tls-certs\") pod \"keystone-6ff88f78d4-bh6qm\" (UID: \"b8f1ba06-a425-4474-94a2-80c68832caac\") " pod="openstack/keystone-6ff88f78d4-bh6qm"
Jan 28 20:59:59 crc kubenswrapper[4746]: I0128 20:59:59.308573 4746 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/b8f1ba06-a425-4474-94a2-80c68832caac-internal-tls-certs\") pod \"keystone-6ff88f78d4-bh6qm\" (UID: \"b8f1ba06-a425-4474-94a2-80c68832caac\") " pod="openstack/keystone-6ff88f78d4-bh6qm"
Jan 28 20:59:59 crc kubenswrapper[4746]: I0128 20:59:59.308657 4746 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/b8f1ba06-a425-4474-94a2-80c68832caac-fernet-keys\") pod \"keystone-6ff88f78d4-bh6qm\" (UID: \"b8f1ba06-a425-4474-94a2-80c68832caac\") " pod="openstack/keystone-6ff88f78d4-bh6qm"
Jan 28 20:59:59 crc kubenswrapper[4746]: I0128 20:59:59.308764 4746 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/b8f1ba06-a425-4474-94a2-80c68832caac-combined-ca-bundle\") pod \"keystone-6ff88f78d4-bh6qm\" (UID: \"b8f1ba06-a425-4474-94a2-80c68832caac\") " pod="openstack/keystone-6ff88f78d4-bh6qm"
Jan 28 20:59:59 crc kubenswrapper[4746]: I0128 20:59:59.308872 4746 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/b8f1ba06-a425-4474-94a2-80c68832caac-config-data\") pod \"keystone-6ff88f78d4-bh6qm\" (UID: \"b8f1ba06-a425-4474-94a2-80c68832caac\") " pod="openstack/keystone-6ff88f78d4-bh6qm"
Jan 28 20:59:59 crc kubenswrapper[4746]: I0128 20:59:59.309765 4746 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/b8f1ba06-a425-4474-94a2-80c68832caac-scripts\") pod \"keystone-6ff88f78d4-bh6qm\" (UID: \"b8f1ba06-a425-4474-94a2-80c68832caac\") " pod="openstack/keystone-6ff88f78d4-bh6qm"
Jan 28 20:59:59 crc kubenswrapper[4746]: I0128 20:59:59.309882 4746 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/b8f1ba06-a425-4474-94a2-80c68832caac-credential-keys\") pod \"keystone-6ff88f78d4-bh6qm\" (UID: \"b8f1ba06-a425-4474-94a2-80c68832caac\") " pod="openstack/keystone-6ff88f78d4-bh6qm"
Jan 28 20:59:59 crc kubenswrapper[4746]: I0128 20:59:59.309954 4746 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-jkk4s\" (UniqueName: \"kubernetes.io/projected/b8f1ba06-a425-4474-94a2-80c68832caac-kube-api-access-jkk4s\") pod \"keystone-6ff88f78d4-bh6qm\" (UID: \"b8f1ba06-a425-4474-94a2-80c68832caac\") " pod="openstack/keystone-6ff88f78d4-bh6qm"
Jan 28 20:59:59 crc kubenswrapper[4746]: I0128 20:59:59.313364 4746 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/b8f1ba06-a425-4474-94a2-80c68832caac-public-tls-certs\") pod \"keystone-6ff88f78d4-bh6qm\" (UID: \"b8f1ba06-a425-4474-94a2-80c68832caac\") " pod="openstack/keystone-6ff88f78d4-bh6qm"
Jan 28 20:59:59 crc kubenswrapper[4746]: I0128 20:59:59.316521 4746 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/b8f1ba06-a425-4474-94a2-80c68832caac-combined-ca-bundle\") pod \"keystone-6ff88f78d4-bh6qm\" (UID: \"b8f1ba06-a425-4474-94a2-80c68832caac\") " pod="openstack/keystone-6ff88f78d4-bh6qm"
Jan 28 20:59:59 crc kubenswrapper[4746]: I0128 20:59:59.317336 4746 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/b8f1ba06-a425-4474-94a2-80c68832caac-scripts\") pod \"keystone-6ff88f78d4-bh6qm\" (UID: \"b8f1ba06-a425-4474-94a2-80c68832caac\") " pod="openstack/keystone-6ff88f78d4-bh6qm"
Jan 28 20:59:59 crc kubenswrapper[4746]: I0128 20:59:59.318074 4746 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/b8f1ba06-a425-4474-94a2-80c68832caac-fernet-keys\") pod \"keystone-6ff88f78d4-bh6qm\" (UID: \"b8f1ba06-a425-4474-94a2-80c68832caac\") " pod="openstack/keystone-6ff88f78d4-bh6qm"
Jan 28 20:59:59 crc kubenswrapper[4746]: I0128 20:59:59.318260 4746 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/b8f1ba06-a425-4474-94a2-80c68832caac-credential-keys\") pod \"keystone-6ff88f78d4-bh6qm\" (UID: \"b8f1ba06-a425-4474-94a2-80c68832caac\") " pod="openstack/keystone-6ff88f78d4-bh6qm"
Jan 28 20:59:59 crc kubenswrapper[4746]: I0128 20:59:59.318545 4746 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/b8f1ba06-a425-4474-94a2-80c68832caac-internal-tls-certs\") pod \"keystone-6ff88f78d4-bh6qm\" (UID: \"b8f1ba06-a425-4474-94a2-80c68832caac\") " pod="openstack/keystone-6ff88f78d4-bh6qm"
Jan 28 20:59:59 crc kubenswrapper[4746]: I0128 20:59:59.318731 4746 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/b8f1ba06-a425-4474-94a2-80c68832caac-config-data\") pod \"keystone-6ff88f78d4-bh6qm\" (UID: \"b8f1ba06-a425-4474-94a2-80c68832caac\") " pod="openstack/keystone-6ff88f78d4-bh6qm"
Jan 28 20:59:59 crc kubenswrapper[4746]: I0128 20:59:59.342935 4746 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-jkk4s\" (UniqueName: \"kubernetes.io/projected/b8f1ba06-a425-4474-94a2-80c68832caac-kube-api-access-jkk4s\") pod \"keystone-6ff88f78d4-bh6qm\" (UID: \"b8f1ba06-a425-4474-94a2-80c68832caac\") " pod="openstack/keystone-6ff88f78d4-bh6qm"
Jan 28 20:59:59 crc kubenswrapper[4746]: I0128 20:59:59.371218 4746 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/placement-5755bdbcc4-rbmx8"
Jan 28 20:59:59 crc kubenswrapper[4746]: I0128 20:59:59.464606 4746 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-6ff88f78d4-bh6qm"
Jan 28 21:00:00 crc kubenswrapper[4746]: I0128 21:00:00.045277 4746 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-external-api-0" event={"ID":"bae342c7-f51f-4da2-a419-61002cc82f59","Type":"ContainerStarted","Data":"22db722f2f2d19a10ec9c1f4b3aab062be23dcb8406902ca7d46a922791146a2"}
Jan 28 21:00:00 crc kubenswrapper[4746]: I0128 21:00:00.079906 4746 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/placement-5755bdbcc4-rbmx8"]
Jan 28 21:00:00 crc kubenswrapper[4746]: I0128 21:00:00.198411 4746 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29493900-dxkdw"]
Jan 28 21:00:00 crc kubenswrapper[4746]: I0128 21:00:00.204803 4746 util.go:30] "No sandbox for pod can be found.
Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29493900-dxkdw" Jan 28 21:00:00 crc kubenswrapper[4746]: I0128 21:00:00.214561 4746 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"collect-profiles-dockercfg-kzf4t" Jan 28 21:00:00 crc kubenswrapper[4746]: I0128 21:00:00.214755 4746 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-operator-lifecycle-manager"/"collect-profiles-config" Jan 28 21:00:00 crc kubenswrapper[4746]: I0128 21:00:00.233748 4746 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29493900-dxkdw"] Jan 28 21:00:00 crc kubenswrapper[4746]: I0128 21:00:00.356178 4746 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/74f95799-0ea5-46f5-b2e7-7ef3370e9215-secret-volume\") pod \"collect-profiles-29493900-dxkdw\" (UID: \"74f95799-0ea5-46f5-b2e7-7ef3370e9215\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29493900-dxkdw" Jan 28 21:00:00 crc kubenswrapper[4746]: I0128 21:00:00.356249 4746 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-mjqlb\" (UniqueName: \"kubernetes.io/projected/74f95799-0ea5-46f5-b2e7-7ef3370e9215-kube-api-access-mjqlb\") pod \"collect-profiles-29493900-dxkdw\" (UID: \"74f95799-0ea5-46f5-b2e7-7ef3370e9215\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29493900-dxkdw" Jan 28 21:00:00 crc kubenswrapper[4746]: I0128 21:00:00.356305 4746 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/74f95799-0ea5-46f5-b2e7-7ef3370e9215-config-volume\") pod \"collect-profiles-29493900-dxkdw\" (UID: \"74f95799-0ea5-46f5-b2e7-7ef3370e9215\") " 
pod="openshift-operator-lifecycle-manager/collect-profiles-29493900-dxkdw" Jan 28 21:00:00 crc kubenswrapper[4746]: I0128 21:00:00.363425 4746 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/keystone-6ff88f78d4-bh6qm"] Jan 28 21:00:00 crc kubenswrapper[4746]: I0128 21:00:00.458452 4746 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/74f95799-0ea5-46f5-b2e7-7ef3370e9215-config-volume\") pod \"collect-profiles-29493900-dxkdw\" (UID: \"74f95799-0ea5-46f5-b2e7-7ef3370e9215\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29493900-dxkdw" Jan 28 21:00:00 crc kubenswrapper[4746]: I0128 21:00:00.458621 4746 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/74f95799-0ea5-46f5-b2e7-7ef3370e9215-secret-volume\") pod \"collect-profiles-29493900-dxkdw\" (UID: \"74f95799-0ea5-46f5-b2e7-7ef3370e9215\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29493900-dxkdw" Jan 28 21:00:00 crc kubenswrapper[4746]: I0128 21:00:00.458698 4746 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-mjqlb\" (UniqueName: \"kubernetes.io/projected/74f95799-0ea5-46f5-b2e7-7ef3370e9215-kube-api-access-mjqlb\") pod \"collect-profiles-29493900-dxkdw\" (UID: \"74f95799-0ea5-46f5-b2e7-7ef3370e9215\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29493900-dxkdw" Jan 28 21:00:00 crc kubenswrapper[4746]: I0128 21:00:00.459465 4746 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/74f95799-0ea5-46f5-b2e7-7ef3370e9215-config-volume\") pod \"collect-profiles-29493900-dxkdw\" (UID: \"74f95799-0ea5-46f5-b2e7-7ef3370e9215\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29493900-dxkdw" Jan 28 21:00:00 crc kubenswrapper[4746]: I0128 21:00:00.480538 4746 
operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-mjqlb\" (UniqueName: \"kubernetes.io/projected/74f95799-0ea5-46f5-b2e7-7ef3370e9215-kube-api-access-mjqlb\") pod \"collect-profiles-29493900-dxkdw\" (UID: \"74f95799-0ea5-46f5-b2e7-7ef3370e9215\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29493900-dxkdw" Jan 28 21:00:00 crc kubenswrapper[4746]: I0128 21:00:00.493836 4746 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/74f95799-0ea5-46f5-b2e7-7ef3370e9215-secret-volume\") pod \"collect-profiles-29493900-dxkdw\" (UID: \"74f95799-0ea5-46f5-b2e7-7ef3370e9215\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29493900-dxkdw" Jan 28 21:00:00 crc kubenswrapper[4746]: I0128 21:00:00.631033 4746 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29493900-dxkdw" Jan 28 21:00:00 crc kubenswrapper[4746]: I0128 21:00:00.747708 4746 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/root-account-create-update-gknbb"] Jan 28 21:00:00 crc kubenswrapper[4746]: I0128 21:00:00.748974 4746 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/root-account-create-update-gknbb" Jan 28 21:00:00 crc kubenswrapper[4746]: I0128 21:00:00.752421 4746 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-mariadb-root-db-secret" Jan 28 21:00:00 crc kubenswrapper[4746]: I0128 21:00:00.768337 4746 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/root-account-create-update-gknbb"] Jan 28 21:00:00 crc kubenswrapper[4746]: I0128 21:00:00.865860 4746 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/28c4d6e6-e370-43ea-855a-a108b80076f6-operator-scripts\") pod \"root-account-create-update-gknbb\" (UID: \"28c4d6e6-e370-43ea-855a-a108b80076f6\") " pod="openstack/root-account-create-update-gknbb" Jan 28 21:00:00 crc kubenswrapper[4746]: I0128 21:00:00.865917 4746 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-hg5fk\" (UniqueName: \"kubernetes.io/projected/28c4d6e6-e370-43ea-855a-a108b80076f6-kube-api-access-hg5fk\") pod \"root-account-create-update-gknbb\" (UID: \"28c4d6e6-e370-43ea-855a-a108b80076f6\") " pod="openstack/root-account-create-update-gknbb" Jan 28 21:00:00 crc kubenswrapper[4746]: I0128 21:00:00.968113 4746 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/28c4d6e6-e370-43ea-855a-a108b80076f6-operator-scripts\") pod \"root-account-create-update-gknbb\" (UID: \"28c4d6e6-e370-43ea-855a-a108b80076f6\") " pod="openstack/root-account-create-update-gknbb" Jan 28 21:00:00 crc kubenswrapper[4746]: I0128 21:00:00.968189 4746 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-hg5fk\" (UniqueName: \"kubernetes.io/projected/28c4d6e6-e370-43ea-855a-a108b80076f6-kube-api-access-hg5fk\") pod \"root-account-create-update-gknbb\" (UID: 
\"28c4d6e6-e370-43ea-855a-a108b80076f6\") " pod="openstack/root-account-create-update-gknbb" Jan 28 21:00:00 crc kubenswrapper[4746]: I0128 21:00:00.970651 4746 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/28c4d6e6-e370-43ea-855a-a108b80076f6-operator-scripts\") pod \"root-account-create-update-gknbb\" (UID: \"28c4d6e6-e370-43ea-855a-a108b80076f6\") " pod="openstack/root-account-create-update-gknbb" Jan 28 21:00:00 crc kubenswrapper[4746]: I0128 21:00:00.985751 4746 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-hg5fk\" (UniqueName: \"kubernetes.io/projected/28c4d6e6-e370-43ea-855a-a108b80076f6-kube-api-access-hg5fk\") pod \"root-account-create-update-gknbb\" (UID: \"28c4d6e6-e370-43ea-855a-a108b80076f6\") " pod="openstack/root-account-create-update-gknbb" Jan 28 21:00:01 crc kubenswrapper[4746]: I0128 21:00:01.057919 4746 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-5755bdbcc4-rbmx8" event={"ID":"12408645-b253-4e59-bd2f-5a4ec243cabd","Type":"ContainerStarted","Data":"65fd584a86a80578973a85d04e3a98a7c46d3ff45c124d5c894934fbbd215dde"} Jan 28 21:00:01 crc kubenswrapper[4746]: I0128 21:00:01.108328 4746 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/root-account-create-update-gknbb" Jan 28 21:00:03 crc kubenswrapper[4746]: I0128 21:00:03.987230 4746 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/dnsmasq-dns-55f844cf75-8hh8h" Jan 28 21:00:04 crc kubenswrapper[4746]: I0128 21:00:04.057806 4746 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-698758b865-pdnv6"] Jan 28 21:00:04 crc kubenswrapper[4746]: I0128 21:00:04.058118 4746 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/dnsmasq-dns-698758b865-pdnv6" podUID="b1a60d3b-bcc7-47e3-94b0-12acae8ccfda" containerName="dnsmasq-dns" containerID="cri-o://c8423b419407b795d407f9fc1c460c07a6195201c2880c2f205d0e52f6c92d2c" gracePeriod=10 Jan 28 21:00:05 crc kubenswrapper[4746]: W0128 21:00:05.438591 4746 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podb8f1ba06_a425_4474_94a2_80c68832caac.slice/crio-7b395b74d0eede6d3ac8583b7c15b333a012658e2aea3134e88c4f34749362c7 WatchSource:0}: Error finding container 7b395b74d0eede6d3ac8583b7c15b333a012658e2aea3134e88c4f34749362c7: Status 404 returned error can't find the container with id 7b395b74d0eede6d3ac8583b7c15b333a012658e2aea3134e88c4f34749362c7 Jan 28 21:00:06 crc kubenswrapper[4746]: I0128 21:00:06.112899 4746 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-6ff88f78d4-bh6qm" event={"ID":"b8f1ba06-a425-4474-94a2-80c68832caac","Type":"ContainerStarted","Data":"7b395b74d0eede6d3ac8583b7c15b333a012658e2aea3134e88c4f34749362c7"} Jan 28 21:00:06 crc kubenswrapper[4746]: I0128 21:00:06.237622 4746 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/root-account-create-update-gknbb"] Jan 28 21:00:06 crc kubenswrapper[4746]: I0128 21:00:06.305923 4746 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/dnsmasq-dns-698758b865-pdnv6" 
podUID="b1a60d3b-bcc7-47e3-94b0-12acae8ccfda" containerName="dnsmasq-dns" probeResult="failure" output="dial tcp 10.217.0.129:5353: connect: connection refused" Jan 28 21:00:06 crc kubenswrapper[4746]: I0128 21:00:06.363120 4746 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29493900-dxkdw"] Jan 28 21:00:07 crc kubenswrapper[4746]: I0128 21:00:07.151319 4746 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-internal-api-0" event={"ID":"8355d3eb-8a59-4393-b04b-a44d9dd7824f","Type":"ContainerStarted","Data":"1dba7a40666ee812b896899192e9f69d97b8af339400b44d5dcaa898d4fbb22b"} Jan 28 21:00:08 crc kubenswrapper[4746]: I0128 21:00:08.165495 4746 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-external-api-0" event={"ID":"bae342c7-f51f-4da2-a419-61002cc82f59","Type":"ContainerStarted","Data":"25d5b3df9baae67da6b02e76a3bf75adbce74ffa20972d28ed3717793041ff40"} Jan 28 21:00:08 crc kubenswrapper[4746]: I0128 21:00:08.169921 4746 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-5755bdbcc4-rbmx8" event={"ID":"12408645-b253-4e59-bd2f-5a4ec243cabd","Type":"ContainerStarted","Data":"1df5e0f26320cb77683f756b7c8db75ee2f80c3b5b8e2608999a2411ee4e58d9"} Jan 28 21:00:08 crc kubenswrapper[4746]: I0128 21:00:08.171765 4746 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-6ff88f78d4-bh6qm" event={"ID":"b8f1ba06-a425-4474-94a2-80c68832caac","Type":"ContainerStarted","Data":"57830257005e8f32808f199afc6b8c2c8baa7bd30fa1c4e6bd64cbb0e0c9afc2"} Jan 28 21:00:08 crc kubenswrapper[4746]: I0128 21:00:08.171969 4746 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/keystone-6ff88f78d4-bh6qm" Jan 28 21:00:08 crc kubenswrapper[4746]: I0128 21:00:08.173920 4746 generic.go:334] "Generic (PLEG): container finished" podID="b1a60d3b-bcc7-47e3-94b0-12acae8ccfda" 
containerID="c8423b419407b795d407f9fc1c460c07a6195201c2880c2f205d0e52f6c92d2c" exitCode=0 Jan 28 21:00:08 crc kubenswrapper[4746]: I0128 21:00:08.175069 4746 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-698758b865-pdnv6" event={"ID":"b1a60d3b-bcc7-47e3-94b0-12acae8ccfda","Type":"ContainerDied","Data":"c8423b419407b795d407f9fc1c460c07a6195201c2880c2f205d0e52f6c92d2c"} Jan 28 21:00:08 crc kubenswrapper[4746]: I0128 21:00:08.190993 4746 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/glance-default-external-api-0" podStartSLOduration=12.190973844 podStartE2EDuration="12.190973844s" podCreationTimestamp="2026-01-28 20:59:56 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-28 21:00:08.190546363 +0000 UTC m=+1236.146732717" watchObservedRunningTime="2026-01-28 21:00:08.190973844 +0000 UTC m=+1236.147160208" Jan 28 21:00:08 crc kubenswrapper[4746]: I0128 21:00:08.229650 4746 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/keystone-6ff88f78d4-bh6qm" podStartSLOduration=9.229628904 podStartE2EDuration="9.229628904s" podCreationTimestamp="2026-01-28 20:59:59 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-28 21:00:08.222557214 +0000 UTC m=+1236.178743568" watchObservedRunningTime="2026-01-28 21:00:08.229628904 +0000 UTC m=+1236.185815258" Jan 28 21:00:08 crc kubenswrapper[4746]: I0128 21:00:08.257472 4746 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/glance-default-internal-api-0" podStartSLOduration=12.257445833 podStartE2EDuration="12.257445833s" podCreationTimestamp="2026-01-28 20:59:56 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-28 
21:00:08.248931054 +0000 UTC m=+1236.205117428" watchObservedRunningTime="2026-01-28 21:00:08.257445833 +0000 UTC m=+1236.213632187" Jan 28 21:00:09 crc kubenswrapper[4746]: I0128 21:00:09.638116 4746 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-698758b865-pdnv6" Jan 28 21:00:09 crc kubenswrapper[4746]: I0128 21:00:09.677743 4746 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/b1a60d3b-bcc7-47e3-94b0-12acae8ccfda-dns-svc\") pod \"b1a60d3b-bcc7-47e3-94b0-12acae8ccfda\" (UID: \"b1a60d3b-bcc7-47e3-94b0-12acae8ccfda\") " Jan 28 21:00:09 crc kubenswrapper[4746]: I0128 21:00:09.677806 4746 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/b1a60d3b-bcc7-47e3-94b0-12acae8ccfda-ovsdbserver-sb\") pod \"b1a60d3b-bcc7-47e3-94b0-12acae8ccfda\" (UID: \"b1a60d3b-bcc7-47e3-94b0-12acae8ccfda\") " Jan 28 21:00:09 crc kubenswrapper[4746]: I0128 21:00:09.677998 4746 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/b1a60d3b-bcc7-47e3-94b0-12acae8ccfda-ovsdbserver-nb\") pod \"b1a60d3b-bcc7-47e3-94b0-12acae8ccfda\" (UID: \"b1a60d3b-bcc7-47e3-94b0-12acae8ccfda\") " Jan 28 21:00:09 crc kubenswrapper[4746]: I0128 21:00:09.678037 4746 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-frf92\" (UniqueName: \"kubernetes.io/projected/b1a60d3b-bcc7-47e3-94b0-12acae8ccfda-kube-api-access-frf92\") pod \"b1a60d3b-bcc7-47e3-94b0-12acae8ccfda\" (UID: \"b1a60d3b-bcc7-47e3-94b0-12acae8ccfda\") " Jan 28 21:00:09 crc kubenswrapper[4746]: I0128 21:00:09.678134 4746 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/b1a60d3b-bcc7-47e3-94b0-12acae8ccfda-config\") pod 
\"b1a60d3b-bcc7-47e3-94b0-12acae8ccfda\" (UID: \"b1a60d3b-bcc7-47e3-94b0-12acae8ccfda\") " Jan 28 21:00:09 crc kubenswrapper[4746]: I0128 21:00:09.702198 4746 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/b1a60d3b-bcc7-47e3-94b0-12acae8ccfda-kube-api-access-frf92" (OuterVolumeSpecName: "kube-api-access-frf92") pod "b1a60d3b-bcc7-47e3-94b0-12acae8ccfda" (UID: "b1a60d3b-bcc7-47e3-94b0-12acae8ccfda"). InnerVolumeSpecName "kube-api-access-frf92". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 28 21:00:09 crc kubenswrapper[4746]: I0128 21:00:09.762052 4746 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/b1a60d3b-bcc7-47e3-94b0-12acae8ccfda-ovsdbserver-nb" (OuterVolumeSpecName: "ovsdbserver-nb") pod "b1a60d3b-bcc7-47e3-94b0-12acae8ccfda" (UID: "b1a60d3b-bcc7-47e3-94b0-12acae8ccfda"). InnerVolumeSpecName "ovsdbserver-nb". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 28 21:00:09 crc kubenswrapper[4746]: I0128 21:00:09.763718 4746 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/b1a60d3b-bcc7-47e3-94b0-12acae8ccfda-config" (OuterVolumeSpecName: "config") pod "b1a60d3b-bcc7-47e3-94b0-12acae8ccfda" (UID: "b1a60d3b-bcc7-47e3-94b0-12acae8ccfda"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 28 21:00:09 crc kubenswrapper[4746]: I0128 21:00:09.764346 4746 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/b1a60d3b-bcc7-47e3-94b0-12acae8ccfda-ovsdbserver-sb" (OuterVolumeSpecName: "ovsdbserver-sb") pod "b1a60d3b-bcc7-47e3-94b0-12acae8ccfda" (UID: "b1a60d3b-bcc7-47e3-94b0-12acae8ccfda"). InnerVolumeSpecName "ovsdbserver-sb". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 28 21:00:09 crc kubenswrapper[4746]: I0128 21:00:09.779449 4746 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/b1a60d3b-bcc7-47e3-94b0-12acae8ccfda-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "b1a60d3b-bcc7-47e3-94b0-12acae8ccfda" (UID: "b1a60d3b-bcc7-47e3-94b0-12acae8ccfda"). InnerVolumeSpecName "dns-svc". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 28 21:00:09 crc kubenswrapper[4746]: I0128 21:00:09.782222 4746 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/b1a60d3b-bcc7-47e3-94b0-12acae8ccfda-config\") on node \"crc\" DevicePath \"\"" Jan 28 21:00:09 crc kubenswrapper[4746]: I0128 21:00:09.782254 4746 reconciler_common.go:293] "Volume detached for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/b1a60d3b-bcc7-47e3-94b0-12acae8ccfda-dns-svc\") on node \"crc\" DevicePath \"\"" Jan 28 21:00:09 crc kubenswrapper[4746]: I0128 21:00:09.782265 4746 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/b1a60d3b-bcc7-47e3-94b0-12acae8ccfda-ovsdbserver-sb\") on node \"crc\" DevicePath \"\"" Jan 28 21:00:09 crc kubenswrapper[4746]: I0128 21:00:09.782278 4746 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/b1a60d3b-bcc7-47e3-94b0-12acae8ccfda-ovsdbserver-nb\") on node \"crc\" DevicePath \"\"" Jan 28 21:00:09 crc kubenswrapper[4746]: I0128 21:00:09.782289 4746 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-frf92\" (UniqueName: \"kubernetes.io/projected/b1a60d3b-bcc7-47e3-94b0-12acae8ccfda-kube-api-access-frf92\") on node \"crc\" DevicePath \"\"" Jan 28 21:00:10 crc kubenswrapper[4746]: I0128 21:00:10.196743 4746 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" 
event={"ID":"e9abc9de-4ca9-4ba4-afca-bb6b5bf50924","Type":"ContainerStarted","Data":"1c384874e5c5af51f0459fdf2f1159ba60ac6c044fcea36d3c6e079ade91f9ad"} Jan 28 21:00:10 crc kubenswrapper[4746]: I0128 21:00:10.198005 4746 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/root-account-create-update-gknbb" event={"ID":"28c4d6e6-e370-43ea-855a-a108b80076f6","Type":"ContainerStarted","Data":"e2f9bc8207b8a6d49eb69dfd58db2119cf145886998787bfafc4df9d95d1eb66"} Jan 28 21:00:10 crc kubenswrapper[4746]: I0128 21:00:10.198078 4746 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/root-account-create-update-gknbb" event={"ID":"28c4d6e6-e370-43ea-855a-a108b80076f6","Type":"ContainerStarted","Data":"74478651cfc8d12179fd23741c3250911e80d1b83c20809f6d961fd41c2ed48a"} Jan 28 21:00:10 crc kubenswrapper[4746]: I0128 21:00:10.199551 4746 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-db-sync-29n2p" event={"ID":"70e766dc-9f84-4d0c-af5b-3b044e06c09f","Type":"ContainerStarted","Data":"c5ca0c4931baaab786455361cdeeb73786cf5eb3e4b9d0efcad67e061109a926"} Jan 28 21:00:10 crc kubenswrapper[4746]: I0128 21:00:10.200860 4746 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29493900-dxkdw" event={"ID":"74f95799-0ea5-46f5-b2e7-7ef3370e9215","Type":"ContainerStarted","Data":"59a82c4c52affbab10dbd5190aec777d3afa26f9c37a8f62e333d4a0dea63385"} Jan 28 21:00:10 crc kubenswrapper[4746]: I0128 21:00:10.200896 4746 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29493900-dxkdw" event={"ID":"74f95799-0ea5-46f5-b2e7-7ef3370e9215","Type":"ContainerStarted","Data":"06b1cb42574a2f791a92c20c02a0c1da02705efd00239ad7c0ce3ed7d257dd4a"} Jan 28 21:00:10 crc kubenswrapper[4746]: I0128 21:00:10.206918 4746 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-5755bdbcc4-rbmx8" 
event={"ID":"12408645-b253-4e59-bd2f-5a4ec243cabd","Type":"ContainerStarted","Data":"42b563b46c2a25e356e07c6bdb893f0eb0ddd15ed3453dee751643571e596f4d"} Jan 28 21:00:10 crc kubenswrapper[4746]: I0128 21:00:10.207046 4746 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/placement-5755bdbcc4-rbmx8" Jan 28 21:00:10 crc kubenswrapper[4746]: I0128 21:00:10.207092 4746 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/placement-5755bdbcc4-rbmx8" Jan 28 21:00:10 crc kubenswrapper[4746]: I0128 21:00:10.208525 4746 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cloudkitty-db-sync-p9ghn" event={"ID":"cce95230-2b72-4598-9d28-3a1465803567","Type":"ContainerStarted","Data":"3a0fc6e40b63c3d0054964bf86868df5bae172f80bc0b177d845ac3b30f9697f"} Jan 28 21:00:10 crc kubenswrapper[4746]: I0128 21:00:10.214161 4746 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-698758b865-pdnv6" event={"ID":"b1a60d3b-bcc7-47e3-94b0-12acae8ccfda","Type":"ContainerDied","Data":"d6add7168af2abe8b03cc109943ce81073fed15fb20d52da0172097ad1356ba3"} Jan 28 21:00:10 crc kubenswrapper[4746]: I0128 21:00:10.214238 4746 scope.go:117] "RemoveContainer" containerID="c8423b419407b795d407f9fc1c460c07a6195201c2880c2f205d0e52f6c92d2c" Jan 28 21:00:10 crc kubenswrapper[4746]: I0128 21:00:10.214238 4746 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-698758b865-pdnv6" Jan 28 21:00:10 crc kubenswrapper[4746]: I0128 21:00:10.225008 4746 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/root-account-create-update-gknbb" podStartSLOduration=10.224992127 podStartE2EDuration="10.224992127s" podCreationTimestamp="2026-01-28 21:00:00 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-28 21:00:10.221363689 +0000 UTC m=+1238.177550043" watchObservedRunningTime="2026-01-28 21:00:10.224992127 +0000 UTC m=+1238.181178481" Jan 28 21:00:10 crc kubenswrapper[4746]: I0128 21:00:10.247776 4746 scope.go:117] "RemoveContainer" containerID="35baa398a2ff8247262c262e8c9b0d34000d20d9745720e78ff444d5c8abb948" Jan 28 21:00:10 crc kubenswrapper[4746]: I0128 21:00:10.251559 4746 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-operator-lifecycle-manager/collect-profiles-29493900-dxkdw" podStartSLOduration=10.251540921 podStartE2EDuration="10.251540921s" podCreationTimestamp="2026-01-28 21:00:00 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-28 21:00:10.245717535 +0000 UTC m=+1238.201903889" watchObservedRunningTime="2026-01-28 21:00:10.251540921 +0000 UTC m=+1238.207727275" Jan 28 21:00:10 crc kubenswrapper[4746]: I0128 21:00:10.263649 4746 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/cloudkitty-db-sync-p9ghn" podStartSLOduration=2.788099482 podStartE2EDuration="54.263631527s" podCreationTimestamp="2026-01-28 20:59:16 +0000 UTC" firstStartedPulling="2026-01-28 20:59:18.265876915 +0000 UTC m=+1186.222063269" lastFinishedPulling="2026-01-28 21:00:09.74140895 +0000 UTC m=+1237.697595314" observedRunningTime="2026-01-28 21:00:10.261434957 +0000 UTC m=+1238.217621311" 
watchObservedRunningTime="2026-01-28 21:00:10.263631527 +0000 UTC m=+1238.219817871" Jan 28 21:00:10 crc kubenswrapper[4746]: I0128 21:00:10.291868 4746 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/barbican-db-sync-29n2p" podStartSLOduration=2.9470145580000002 podStartE2EDuration="54.291850476s" podCreationTimestamp="2026-01-28 20:59:16 +0000 UTC" firstStartedPulling="2026-01-28 20:59:18.277771806 +0000 UTC m=+1186.233958160" lastFinishedPulling="2026-01-28 21:00:09.622607724 +0000 UTC m=+1237.578794078" observedRunningTime="2026-01-28 21:00:10.286176753 +0000 UTC m=+1238.242363127" watchObservedRunningTime="2026-01-28 21:00:10.291850476 +0000 UTC m=+1238.248036830" Jan 28 21:00:10 crc kubenswrapper[4746]: I0128 21:00:10.309253 4746 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/placement-5755bdbcc4-rbmx8" podStartSLOduration=12.309232793 podStartE2EDuration="12.309232793s" podCreationTimestamp="2026-01-28 20:59:58 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-28 21:00:10.30355154 +0000 UTC m=+1238.259737894" watchObservedRunningTime="2026-01-28 21:00:10.309232793 +0000 UTC m=+1238.265419147" Jan 28 21:00:10 crc kubenswrapper[4746]: I0128 21:00:10.328314 4746 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-698758b865-pdnv6"] Jan 28 21:00:10 crc kubenswrapper[4746]: I0128 21:00:10.339496 4746 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-698758b865-pdnv6"] Jan 28 21:00:10 crc kubenswrapper[4746]: I0128 21:00:10.849846 4746 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="b1a60d3b-bcc7-47e3-94b0-12acae8ccfda" path="/var/lib/kubelet/pods/b1a60d3b-bcc7-47e3-94b0-12acae8ccfda/volumes" Jan 28 21:00:11 crc kubenswrapper[4746]: I0128 21:00:11.227115 4746 generic.go:334] "Generic (PLEG): container finished" 
podID="28c4d6e6-e370-43ea-855a-a108b80076f6" containerID="e2f9bc8207b8a6d49eb69dfd58db2119cf145886998787bfafc4df9d95d1eb66" exitCode=0 Jan 28 21:00:11 crc kubenswrapper[4746]: I0128 21:00:11.227449 4746 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/root-account-create-update-gknbb" event={"ID":"28c4d6e6-e370-43ea-855a-a108b80076f6","Type":"ContainerDied","Data":"e2f9bc8207b8a6d49eb69dfd58db2119cf145886998787bfafc4df9d95d1eb66"} Jan 28 21:00:11 crc kubenswrapper[4746]: I0128 21:00:11.230766 4746 generic.go:334] "Generic (PLEG): container finished" podID="74f95799-0ea5-46f5-b2e7-7ef3370e9215" containerID="59a82c4c52affbab10dbd5190aec777d3afa26f9c37a8f62e333d4a0dea63385" exitCode=0 Jan 28 21:00:11 crc kubenswrapper[4746]: I0128 21:00:11.230811 4746 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29493900-dxkdw" event={"ID":"74f95799-0ea5-46f5-b2e7-7ef3370e9215","Type":"ContainerDied","Data":"59a82c4c52affbab10dbd5190aec777d3afa26f9c37a8f62e333d4a0dea63385"} Jan 28 21:00:12 crc kubenswrapper[4746]: I0128 21:00:12.260249 4746 generic.go:334] "Generic (PLEG): container finished" podID="a587d3d9-972c-47ae-8e29-5bfd977ff429" containerID="bd6ee0278b68974bd3186415cd7d01885d9dea5c6ff0d0064894eda32cec0c8f" exitCode=0 Jan 28 21:00:12 crc kubenswrapper[4746]: I0128 21:00:12.260486 4746 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-db-sync-wwfkc" event={"ID":"a587d3d9-972c-47ae-8e29-5bfd977ff429","Type":"ContainerDied","Data":"bd6ee0278b68974bd3186415cd7d01885d9dea5c6ff0d0064894eda32cec0c8f"} Jan 28 21:00:12 crc kubenswrapper[4746]: I0128 21:00:12.968746 4746 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/root-account-create-update-gknbb" Jan 28 21:00:12 crc kubenswrapper[4746]: I0128 21:00:12.977347 4746 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29493900-dxkdw" Jan 28 21:00:13 crc kubenswrapper[4746]: I0128 21:00:13.070417 4746 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/74f95799-0ea5-46f5-b2e7-7ef3370e9215-config-volume\") pod \"74f95799-0ea5-46f5-b2e7-7ef3370e9215\" (UID: \"74f95799-0ea5-46f5-b2e7-7ef3370e9215\") " Jan 28 21:00:13 crc kubenswrapper[4746]: I0128 21:00:13.070923 4746 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/74f95799-0ea5-46f5-b2e7-7ef3370e9215-secret-volume\") pod \"74f95799-0ea5-46f5-b2e7-7ef3370e9215\" (UID: \"74f95799-0ea5-46f5-b2e7-7ef3370e9215\") " Jan 28 21:00:13 crc kubenswrapper[4746]: I0128 21:00:13.071010 4746 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-mjqlb\" (UniqueName: \"kubernetes.io/projected/74f95799-0ea5-46f5-b2e7-7ef3370e9215-kube-api-access-mjqlb\") pod \"74f95799-0ea5-46f5-b2e7-7ef3370e9215\" (UID: \"74f95799-0ea5-46f5-b2e7-7ef3370e9215\") " Jan 28 21:00:13 crc kubenswrapper[4746]: I0128 21:00:13.071040 4746 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/28c4d6e6-e370-43ea-855a-a108b80076f6-operator-scripts\") pod \"28c4d6e6-e370-43ea-855a-a108b80076f6\" (UID: \"28c4d6e6-e370-43ea-855a-a108b80076f6\") " Jan 28 21:00:13 crc kubenswrapper[4746]: I0128 21:00:13.071148 4746 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-hg5fk\" (UniqueName: \"kubernetes.io/projected/28c4d6e6-e370-43ea-855a-a108b80076f6-kube-api-access-hg5fk\") pod \"28c4d6e6-e370-43ea-855a-a108b80076f6\" (UID: \"28c4d6e6-e370-43ea-855a-a108b80076f6\") " Jan 28 21:00:13 crc kubenswrapper[4746]: I0128 21:00:13.071558 4746 operation_generator.go:803] 
UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/74f95799-0ea5-46f5-b2e7-7ef3370e9215-config-volume" (OuterVolumeSpecName: "config-volume") pod "74f95799-0ea5-46f5-b2e7-7ef3370e9215" (UID: "74f95799-0ea5-46f5-b2e7-7ef3370e9215"). InnerVolumeSpecName "config-volume". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 28 21:00:13 crc kubenswrapper[4746]: I0128 21:00:13.072710 4746 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/28c4d6e6-e370-43ea-855a-a108b80076f6-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "28c4d6e6-e370-43ea-855a-a108b80076f6" (UID: "28c4d6e6-e370-43ea-855a-a108b80076f6"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 28 21:00:13 crc kubenswrapper[4746]: I0128 21:00:13.077067 4746 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/74f95799-0ea5-46f5-b2e7-7ef3370e9215-kube-api-access-mjqlb" (OuterVolumeSpecName: "kube-api-access-mjqlb") pod "74f95799-0ea5-46f5-b2e7-7ef3370e9215" (UID: "74f95799-0ea5-46f5-b2e7-7ef3370e9215"). InnerVolumeSpecName "kube-api-access-mjqlb". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 28 21:00:13 crc kubenswrapper[4746]: I0128 21:00:13.077651 4746 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/74f95799-0ea5-46f5-b2e7-7ef3370e9215-secret-volume" (OuterVolumeSpecName: "secret-volume") pod "74f95799-0ea5-46f5-b2e7-7ef3370e9215" (UID: "74f95799-0ea5-46f5-b2e7-7ef3370e9215"). InnerVolumeSpecName "secret-volume". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 28 21:00:13 crc kubenswrapper[4746]: I0128 21:00:13.091401 4746 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/28c4d6e6-e370-43ea-855a-a108b80076f6-kube-api-access-hg5fk" (OuterVolumeSpecName: "kube-api-access-hg5fk") pod "28c4d6e6-e370-43ea-855a-a108b80076f6" (UID: "28c4d6e6-e370-43ea-855a-a108b80076f6"). InnerVolumeSpecName "kube-api-access-hg5fk". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 28 21:00:13 crc kubenswrapper[4746]: I0128 21:00:13.173276 4746 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-mjqlb\" (UniqueName: \"kubernetes.io/projected/74f95799-0ea5-46f5-b2e7-7ef3370e9215-kube-api-access-mjqlb\") on node \"crc\" DevicePath \"\"" Jan 28 21:00:13 crc kubenswrapper[4746]: I0128 21:00:13.173319 4746 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/28c4d6e6-e370-43ea-855a-a108b80076f6-operator-scripts\") on node \"crc\" DevicePath \"\"" Jan 28 21:00:13 crc kubenswrapper[4746]: I0128 21:00:13.173332 4746 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-hg5fk\" (UniqueName: \"kubernetes.io/projected/28c4d6e6-e370-43ea-855a-a108b80076f6-kube-api-access-hg5fk\") on node \"crc\" DevicePath \"\"" Jan 28 21:00:13 crc kubenswrapper[4746]: I0128 21:00:13.173344 4746 reconciler_common.go:293] "Volume detached for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/74f95799-0ea5-46f5-b2e7-7ef3370e9215-config-volume\") on node \"crc\" DevicePath \"\"" Jan 28 21:00:13 crc kubenswrapper[4746]: I0128 21:00:13.173356 4746 reconciler_common.go:293] "Volume detached for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/74f95799-0ea5-46f5-b2e7-7ef3370e9215-secret-volume\") on node \"crc\" DevicePath \"\"" Jan 28 21:00:13 crc kubenswrapper[4746]: I0128 21:00:13.275348 4746 util.go:48] "No ready sandbox for pod can be 
found. Need to start a new one" pod="openstack/root-account-create-update-gknbb" Jan 28 21:00:13 crc kubenswrapper[4746]: I0128 21:00:13.275349 4746 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/root-account-create-update-gknbb" event={"ID":"28c4d6e6-e370-43ea-855a-a108b80076f6","Type":"ContainerDied","Data":"74478651cfc8d12179fd23741c3250911e80d1b83c20809f6d961fd41c2ed48a"} Jan 28 21:00:13 crc kubenswrapper[4746]: I0128 21:00:13.275411 4746 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="74478651cfc8d12179fd23741c3250911e80d1b83c20809f6d961fd41c2ed48a" Jan 28 21:00:13 crc kubenswrapper[4746]: I0128 21:00:13.280835 4746 generic.go:334] "Generic (PLEG): container finished" podID="70e766dc-9f84-4d0c-af5b-3b044e06c09f" containerID="c5ca0c4931baaab786455361cdeeb73786cf5eb3e4b9d0efcad67e061109a926" exitCode=0 Jan 28 21:00:13 crc kubenswrapper[4746]: I0128 21:00:13.280933 4746 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-db-sync-29n2p" event={"ID":"70e766dc-9f84-4d0c-af5b-3b044e06c09f","Type":"ContainerDied","Data":"c5ca0c4931baaab786455361cdeeb73786cf5eb3e4b9d0efcad67e061109a926"} Jan 28 21:00:13 crc kubenswrapper[4746]: I0128 21:00:13.284145 4746 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29493900-dxkdw" Jan 28 21:00:13 crc kubenswrapper[4746]: I0128 21:00:13.284132 4746 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29493900-dxkdw" event={"ID":"74f95799-0ea5-46f5-b2e7-7ef3370e9215","Type":"ContainerDied","Data":"06b1cb42574a2f791a92c20c02a0c1da02705efd00239ad7c0ce3ed7d257dd4a"} Jan 28 21:00:13 crc kubenswrapper[4746]: I0128 21:00:13.285097 4746 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="06b1cb42574a2f791a92c20c02a0c1da02705efd00239ad7c0ce3ed7d257dd4a" Jan 28 21:00:13 crc kubenswrapper[4746]: I0128 21:00:13.670235 4746 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-db-sync-wwfkc" Jan 28 21:00:13 crc kubenswrapper[4746]: I0128 21:00:13.790872 4746 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/a587d3d9-972c-47ae-8e29-5bfd977ff429-config-data\") pod \"a587d3d9-972c-47ae-8e29-5bfd977ff429\" (UID: \"a587d3d9-972c-47ae-8e29-5bfd977ff429\") " Jan 28 21:00:13 crc kubenswrapper[4746]: I0128 21:00:13.790990 4746 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/a587d3d9-972c-47ae-8e29-5bfd977ff429-db-sync-config-data\") pod \"a587d3d9-972c-47ae-8e29-5bfd977ff429\" (UID: \"a587d3d9-972c-47ae-8e29-5bfd977ff429\") " Jan 28 21:00:13 crc kubenswrapper[4746]: I0128 21:00:13.791214 4746 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-hvpq6\" (UniqueName: \"kubernetes.io/projected/a587d3d9-972c-47ae-8e29-5bfd977ff429-kube-api-access-hvpq6\") pod \"a587d3d9-972c-47ae-8e29-5bfd977ff429\" (UID: \"a587d3d9-972c-47ae-8e29-5bfd977ff429\") " Jan 28 21:00:13 crc kubenswrapper[4746]: I0128 21:00:13.791297 4746 
reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/a587d3d9-972c-47ae-8e29-5bfd977ff429-scripts\") pod \"a587d3d9-972c-47ae-8e29-5bfd977ff429\" (UID: \"a587d3d9-972c-47ae-8e29-5bfd977ff429\") " Jan 28 21:00:13 crc kubenswrapper[4746]: I0128 21:00:13.791385 4746 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/a587d3d9-972c-47ae-8e29-5bfd977ff429-etc-machine-id\") pod \"a587d3d9-972c-47ae-8e29-5bfd977ff429\" (UID: \"a587d3d9-972c-47ae-8e29-5bfd977ff429\") " Jan 28 21:00:13 crc kubenswrapper[4746]: I0128 21:00:13.791460 4746 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/a587d3d9-972c-47ae-8e29-5bfd977ff429-combined-ca-bundle\") pod \"a587d3d9-972c-47ae-8e29-5bfd977ff429\" (UID: \"a587d3d9-972c-47ae-8e29-5bfd977ff429\") " Jan 28 21:00:13 crc kubenswrapper[4746]: I0128 21:00:13.793470 4746 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/a587d3d9-972c-47ae-8e29-5bfd977ff429-etc-machine-id" (OuterVolumeSpecName: "etc-machine-id") pod "a587d3d9-972c-47ae-8e29-5bfd977ff429" (UID: "a587d3d9-972c-47ae-8e29-5bfd977ff429"). InnerVolumeSpecName "etc-machine-id". PluginName "kubernetes.io/host-path", VolumeGidValue "" Jan 28 21:00:13 crc kubenswrapper[4746]: I0128 21:00:13.797183 4746 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/a587d3d9-972c-47ae-8e29-5bfd977ff429-db-sync-config-data" (OuterVolumeSpecName: "db-sync-config-data") pod "a587d3d9-972c-47ae-8e29-5bfd977ff429" (UID: "a587d3d9-972c-47ae-8e29-5bfd977ff429"). InnerVolumeSpecName "db-sync-config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 28 21:00:13 crc kubenswrapper[4746]: I0128 21:00:13.797718 4746 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/a587d3d9-972c-47ae-8e29-5bfd977ff429-scripts" (OuterVolumeSpecName: "scripts") pod "a587d3d9-972c-47ae-8e29-5bfd977ff429" (UID: "a587d3d9-972c-47ae-8e29-5bfd977ff429"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 28 21:00:13 crc kubenswrapper[4746]: I0128 21:00:13.798210 4746 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/a587d3d9-972c-47ae-8e29-5bfd977ff429-kube-api-access-hvpq6" (OuterVolumeSpecName: "kube-api-access-hvpq6") pod "a587d3d9-972c-47ae-8e29-5bfd977ff429" (UID: "a587d3d9-972c-47ae-8e29-5bfd977ff429"). InnerVolumeSpecName "kube-api-access-hvpq6". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 28 21:00:13 crc kubenswrapper[4746]: I0128 21:00:13.823726 4746 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/a587d3d9-972c-47ae-8e29-5bfd977ff429-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "a587d3d9-972c-47ae-8e29-5bfd977ff429" (UID: "a587d3d9-972c-47ae-8e29-5bfd977ff429"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 28 21:00:13 crc kubenswrapper[4746]: I0128 21:00:13.851996 4746 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/a587d3d9-972c-47ae-8e29-5bfd977ff429-config-data" (OuterVolumeSpecName: "config-data") pod "a587d3d9-972c-47ae-8e29-5bfd977ff429" (UID: "a587d3d9-972c-47ae-8e29-5bfd977ff429"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 28 21:00:13 crc kubenswrapper[4746]: I0128 21:00:13.894683 4746 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/a587d3d9-972c-47ae-8e29-5bfd977ff429-config-data\") on node \"crc\" DevicePath \"\"" Jan 28 21:00:13 crc kubenswrapper[4746]: I0128 21:00:13.894720 4746 reconciler_common.go:293] "Volume detached for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/a587d3d9-972c-47ae-8e29-5bfd977ff429-db-sync-config-data\") on node \"crc\" DevicePath \"\"" Jan 28 21:00:13 crc kubenswrapper[4746]: I0128 21:00:13.894733 4746 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-hvpq6\" (UniqueName: \"kubernetes.io/projected/a587d3d9-972c-47ae-8e29-5bfd977ff429-kube-api-access-hvpq6\") on node \"crc\" DevicePath \"\"" Jan 28 21:00:13 crc kubenswrapper[4746]: I0128 21:00:13.894742 4746 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/a587d3d9-972c-47ae-8e29-5bfd977ff429-scripts\") on node \"crc\" DevicePath \"\"" Jan 28 21:00:13 crc kubenswrapper[4746]: I0128 21:00:13.894752 4746 reconciler_common.go:293] "Volume detached for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/a587d3d9-972c-47ae-8e29-5bfd977ff429-etc-machine-id\") on node \"crc\" DevicePath \"\"" Jan 28 21:00:13 crc kubenswrapper[4746]: I0128 21:00:13.894760 4746 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/a587d3d9-972c-47ae-8e29-5bfd977ff429-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Jan 28 21:00:14 crc kubenswrapper[4746]: I0128 21:00:14.295197 4746 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-db-sync-wwfkc" event={"ID":"a587d3d9-972c-47ae-8e29-5bfd977ff429","Type":"ContainerDied","Data":"08ee86cb6fcc85ad85b5d6b47293b98bb17a56f48df6088960efbd11a5e808e8"} Jan 28 21:00:14 crc 
kubenswrapper[4746]: I0128 21:00:14.296771 4746 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="08ee86cb6fcc85ad85b5d6b47293b98bb17a56f48df6088960efbd11a5e808e8" Jan 28 21:00:14 crc kubenswrapper[4746]: I0128 21:00:14.296988 4746 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-db-sync-wwfkc" Jan 28 21:00:14 crc kubenswrapper[4746]: I0128 21:00:14.603372 4746 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/cinder-scheduler-0"] Jan 28 21:00:14 crc kubenswrapper[4746]: E0128 21:00:14.604053 4746 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="74f95799-0ea5-46f5-b2e7-7ef3370e9215" containerName="collect-profiles" Jan 28 21:00:14 crc kubenswrapper[4746]: I0128 21:00:14.604064 4746 state_mem.go:107] "Deleted CPUSet assignment" podUID="74f95799-0ea5-46f5-b2e7-7ef3370e9215" containerName="collect-profiles" Jan 28 21:00:14 crc kubenswrapper[4746]: E0128 21:00:14.604127 4746 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="28c4d6e6-e370-43ea-855a-a108b80076f6" containerName="mariadb-account-create-update" Jan 28 21:00:14 crc kubenswrapper[4746]: I0128 21:00:14.604134 4746 state_mem.go:107] "Deleted CPUSet assignment" podUID="28c4d6e6-e370-43ea-855a-a108b80076f6" containerName="mariadb-account-create-update" Jan 28 21:00:14 crc kubenswrapper[4746]: E0128 21:00:14.604153 4746 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="a587d3d9-972c-47ae-8e29-5bfd977ff429" containerName="cinder-db-sync" Jan 28 21:00:14 crc kubenswrapper[4746]: I0128 21:00:14.604169 4746 state_mem.go:107] "Deleted CPUSet assignment" podUID="a587d3d9-972c-47ae-8e29-5bfd977ff429" containerName="cinder-db-sync" Jan 28 21:00:14 crc kubenswrapper[4746]: E0128 21:00:14.604183 4746 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="b1a60d3b-bcc7-47e3-94b0-12acae8ccfda" containerName="dnsmasq-dns" Jan 28 21:00:14 crc kubenswrapper[4746]: I0128 
21:00:14.604188 4746 state_mem.go:107] "Deleted CPUSet assignment" podUID="b1a60d3b-bcc7-47e3-94b0-12acae8ccfda" containerName="dnsmasq-dns" Jan 28 21:00:14 crc kubenswrapper[4746]: E0128 21:00:14.604195 4746 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="b1a60d3b-bcc7-47e3-94b0-12acae8ccfda" containerName="init" Jan 28 21:00:14 crc kubenswrapper[4746]: I0128 21:00:14.604201 4746 state_mem.go:107] "Deleted CPUSet assignment" podUID="b1a60d3b-bcc7-47e3-94b0-12acae8ccfda" containerName="init" Jan 28 21:00:14 crc kubenswrapper[4746]: I0128 21:00:14.604361 4746 memory_manager.go:354] "RemoveStaleState removing state" podUID="b1a60d3b-bcc7-47e3-94b0-12acae8ccfda" containerName="dnsmasq-dns" Jan 28 21:00:14 crc kubenswrapper[4746]: I0128 21:00:14.604376 4746 memory_manager.go:354] "RemoveStaleState removing state" podUID="28c4d6e6-e370-43ea-855a-a108b80076f6" containerName="mariadb-account-create-update" Jan 28 21:00:14 crc kubenswrapper[4746]: I0128 21:00:14.604383 4746 memory_manager.go:354] "RemoveStaleState removing state" podUID="a587d3d9-972c-47ae-8e29-5bfd977ff429" containerName="cinder-db-sync" Jan 28 21:00:14 crc kubenswrapper[4746]: I0128 21:00:14.604396 4746 memory_manager.go:354] "RemoveStaleState removing state" podUID="74f95799-0ea5-46f5-b2e7-7ef3370e9215" containerName="collect-profiles" Jan 28 21:00:14 crc kubenswrapper[4746]: I0128 21:00:14.605394 4746 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/cinder-scheduler-0" Jan 28 21:00:14 crc kubenswrapper[4746]: I0128 21:00:14.614356 4746 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cinder-cinder-dockercfg-wqf28" Jan 28 21:00:14 crc kubenswrapper[4746]: I0128 21:00:14.614704 4746 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cinder-scripts" Jan 28 21:00:14 crc kubenswrapper[4746]: I0128 21:00:14.614913 4746 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cinder-config-data" Jan 28 21:00:14 crc kubenswrapper[4746]: I0128 21:00:14.615053 4746 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cinder-scheduler-config-data" Jan 28 21:00:14 crc kubenswrapper[4746]: I0128 21:00:14.628753 4746 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cinder-scheduler-0"] Jan 28 21:00:14 crc kubenswrapper[4746]: I0128 21:00:14.686242 4746 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-b895b5785-8cmsg"] Jan 28 21:00:14 crc kubenswrapper[4746]: I0128 21:00:14.687755 4746 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-b895b5785-8cmsg" Jan 28 21:00:14 crc kubenswrapper[4746]: I0128 21:00:14.707119 4746 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-b895b5785-8cmsg"] Jan 28 21:00:14 crc kubenswrapper[4746]: I0128 21:00:14.726912 4746 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/726e5b20-4725-4c19-9dac-42b68d0e181a-etc-machine-id\") pod \"cinder-scheduler-0\" (UID: \"726e5b20-4725-4c19-9dac-42b68d0e181a\") " pod="openstack/cinder-scheduler-0" Jan 28 21:00:14 crc kubenswrapper[4746]: I0128 21:00:14.726978 4746 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/726e5b20-4725-4c19-9dac-42b68d0e181a-combined-ca-bundle\") pod \"cinder-scheduler-0\" (UID: \"726e5b20-4725-4c19-9dac-42b68d0e181a\") " pod="openstack/cinder-scheduler-0" Jan 28 21:00:14 crc kubenswrapper[4746]: I0128 21:00:14.727006 4746 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/726e5b20-4725-4c19-9dac-42b68d0e181a-scripts\") pod \"cinder-scheduler-0\" (UID: \"726e5b20-4725-4c19-9dac-42b68d0e181a\") " pod="openstack/cinder-scheduler-0" Jan 28 21:00:14 crc kubenswrapper[4746]: I0128 21:00:14.727027 4746 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/726e5b20-4725-4c19-9dac-42b68d0e181a-config-data-custom\") pod \"cinder-scheduler-0\" (UID: \"726e5b20-4725-4c19-9dac-42b68d0e181a\") " pod="openstack/cinder-scheduler-0" Jan 28 21:00:14 crc kubenswrapper[4746]: I0128 21:00:14.727122 4746 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: 
\"kubernetes.io/secret/726e5b20-4725-4c19-9dac-42b68d0e181a-config-data\") pod \"cinder-scheduler-0\" (UID: \"726e5b20-4725-4c19-9dac-42b68d0e181a\") " pod="openstack/cinder-scheduler-0" Jan 28 21:00:14 crc kubenswrapper[4746]: I0128 21:00:14.727167 4746 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-7vn27\" (UniqueName: \"kubernetes.io/projected/726e5b20-4725-4c19-9dac-42b68d0e181a-kube-api-access-7vn27\") pod \"cinder-scheduler-0\" (UID: \"726e5b20-4725-4c19-9dac-42b68d0e181a\") " pod="openstack/cinder-scheduler-0" Jan 28 21:00:14 crc kubenswrapper[4746]: I0128 21:00:14.829610 4746 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-t5b2v\" (UniqueName: \"kubernetes.io/projected/24a8ff3b-2ce2-4603-a1c0-76ca3fe3a470-kube-api-access-t5b2v\") pod \"dnsmasq-dns-b895b5785-8cmsg\" (UID: \"24a8ff3b-2ce2-4603-a1c0-76ca3fe3a470\") " pod="openstack/dnsmasq-dns-b895b5785-8cmsg" Jan 28 21:00:14 crc kubenswrapper[4746]: I0128 21:00:14.829669 4746 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/726e5b20-4725-4c19-9dac-42b68d0e181a-config-data\") pod \"cinder-scheduler-0\" (UID: \"726e5b20-4725-4c19-9dac-42b68d0e181a\") " pod="openstack/cinder-scheduler-0" Jan 28 21:00:14 crc kubenswrapper[4746]: I0128 21:00:14.830633 4746 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-7vn27\" (UniqueName: \"kubernetes.io/projected/726e5b20-4725-4c19-9dac-42b68d0e181a-kube-api-access-7vn27\") pod \"cinder-scheduler-0\" (UID: \"726e5b20-4725-4c19-9dac-42b68d0e181a\") " pod="openstack/cinder-scheduler-0" Jan 28 21:00:14 crc kubenswrapper[4746]: I0128 21:00:14.830706 4746 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-machine-id\" (UniqueName: 
\"kubernetes.io/host-path/726e5b20-4725-4c19-9dac-42b68d0e181a-etc-machine-id\") pod \"cinder-scheduler-0\" (UID: \"726e5b20-4725-4c19-9dac-42b68d0e181a\") " pod="openstack/cinder-scheduler-0" Jan 28 21:00:14 crc kubenswrapper[4746]: I0128 21:00:14.830966 4746 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/726e5b20-4725-4c19-9dac-42b68d0e181a-etc-machine-id\") pod \"cinder-scheduler-0\" (UID: \"726e5b20-4725-4c19-9dac-42b68d0e181a\") " pod="openstack/cinder-scheduler-0" Jan 28 21:00:14 crc kubenswrapper[4746]: I0128 21:00:14.831108 4746 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/barbican-db-sync-29n2p" Jan 28 21:00:14 crc kubenswrapper[4746]: I0128 21:00:14.831605 4746 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/726e5b20-4725-4c19-9dac-42b68d0e181a-combined-ca-bundle\") pod \"cinder-scheduler-0\" (UID: \"726e5b20-4725-4c19-9dac-42b68d0e181a\") " pod="openstack/cinder-scheduler-0" Jan 28 21:00:14 crc kubenswrapper[4746]: I0128 21:00:14.831991 4746 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/24a8ff3b-2ce2-4603-a1c0-76ca3fe3a470-config\") pod \"dnsmasq-dns-b895b5785-8cmsg\" (UID: \"24a8ff3b-2ce2-4603-a1c0-76ca3fe3a470\") " pod="openstack/dnsmasq-dns-b895b5785-8cmsg" Jan 28 21:00:14 crc kubenswrapper[4746]: I0128 21:00:14.832025 4746 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/726e5b20-4725-4c19-9dac-42b68d0e181a-scripts\") pod \"cinder-scheduler-0\" (UID: \"726e5b20-4725-4c19-9dac-42b68d0e181a\") " pod="openstack/cinder-scheduler-0" Jan 28 21:00:14 crc kubenswrapper[4746]: I0128 21:00:14.832382 4746 reconciler_common.go:218] "operationExecutor.MountVolume started for volume 
\"config-data-custom\" (UniqueName: \"kubernetes.io/secret/726e5b20-4725-4c19-9dac-42b68d0e181a-config-data-custom\") pod \"cinder-scheduler-0\" (UID: \"726e5b20-4725-4c19-9dac-42b68d0e181a\") " pod="openstack/cinder-scheduler-0" Jan 28 21:00:14 crc kubenswrapper[4746]: I0128 21:00:14.832411 4746 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/24a8ff3b-2ce2-4603-a1c0-76ca3fe3a470-dns-swift-storage-0\") pod \"dnsmasq-dns-b895b5785-8cmsg\" (UID: \"24a8ff3b-2ce2-4603-a1c0-76ca3fe3a470\") " pod="openstack/dnsmasq-dns-b895b5785-8cmsg" Jan 28 21:00:14 crc kubenswrapper[4746]: I0128 21:00:14.832460 4746 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/24a8ff3b-2ce2-4603-a1c0-76ca3fe3a470-ovsdbserver-sb\") pod \"dnsmasq-dns-b895b5785-8cmsg\" (UID: \"24a8ff3b-2ce2-4603-a1c0-76ca3fe3a470\") " pod="openstack/dnsmasq-dns-b895b5785-8cmsg" Jan 28 21:00:14 crc kubenswrapper[4746]: I0128 21:00:14.832503 4746 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/24a8ff3b-2ce2-4603-a1c0-76ca3fe3a470-ovsdbserver-nb\") pod \"dnsmasq-dns-b895b5785-8cmsg\" (UID: \"24a8ff3b-2ce2-4603-a1c0-76ca3fe3a470\") " pod="openstack/dnsmasq-dns-b895b5785-8cmsg" Jan 28 21:00:14 crc kubenswrapper[4746]: I0128 21:00:14.832520 4746 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/24a8ff3b-2ce2-4603-a1c0-76ca3fe3a470-dns-svc\") pod \"dnsmasq-dns-b895b5785-8cmsg\" (UID: \"24a8ff3b-2ce2-4603-a1c0-76ca3fe3a470\") " pod="openstack/dnsmasq-dns-b895b5785-8cmsg" Jan 28 21:00:14 crc kubenswrapper[4746]: I0128 21:00:14.837966 4746 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" 
(UniqueName: \"kubernetes.io/secret/726e5b20-4725-4c19-9dac-42b68d0e181a-scripts\") pod \"cinder-scheduler-0\" (UID: \"726e5b20-4725-4c19-9dac-42b68d0e181a\") " pod="openstack/cinder-scheduler-0" Jan 28 21:00:14 crc kubenswrapper[4746]: I0128 21:00:14.838682 4746 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/726e5b20-4725-4c19-9dac-42b68d0e181a-config-data\") pod \"cinder-scheduler-0\" (UID: \"726e5b20-4725-4c19-9dac-42b68d0e181a\") " pod="openstack/cinder-scheduler-0" Jan 28 21:00:14 crc kubenswrapper[4746]: I0128 21:00:14.858350 4746 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/726e5b20-4725-4c19-9dac-42b68d0e181a-config-data-custom\") pod \"cinder-scheduler-0\" (UID: \"726e5b20-4725-4c19-9dac-42b68d0e181a\") " pod="openstack/cinder-scheduler-0" Jan 28 21:00:14 crc kubenswrapper[4746]: I0128 21:00:14.877328 4746 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/726e5b20-4725-4c19-9dac-42b68d0e181a-combined-ca-bundle\") pod \"cinder-scheduler-0\" (UID: \"726e5b20-4725-4c19-9dac-42b68d0e181a\") " pod="openstack/cinder-scheduler-0" Jan 28 21:00:14 crc kubenswrapper[4746]: I0128 21:00:14.879671 4746 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-7vn27\" (UniqueName: \"kubernetes.io/projected/726e5b20-4725-4c19-9dac-42b68d0e181a-kube-api-access-7vn27\") pod \"cinder-scheduler-0\" (UID: \"726e5b20-4725-4c19-9dac-42b68d0e181a\") " pod="openstack/cinder-scheduler-0" Jan 28 21:00:14 crc kubenswrapper[4746]: I0128 21:00:14.942692 4746 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/70e766dc-9f84-4d0c-af5b-3b044e06c09f-combined-ca-bundle\") pod \"70e766dc-9f84-4d0c-af5b-3b044e06c09f\" (UID: 
\"70e766dc-9f84-4d0c-af5b-3b044e06c09f\") " Jan 28 21:00:14 crc kubenswrapper[4746]: I0128 21:00:14.942985 4746 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-hz7rb\" (UniqueName: \"kubernetes.io/projected/70e766dc-9f84-4d0c-af5b-3b044e06c09f-kube-api-access-hz7rb\") pod \"70e766dc-9f84-4d0c-af5b-3b044e06c09f\" (UID: \"70e766dc-9f84-4d0c-af5b-3b044e06c09f\") " Jan 28 21:00:14 crc kubenswrapper[4746]: I0128 21:00:14.943034 4746 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/70e766dc-9f84-4d0c-af5b-3b044e06c09f-db-sync-config-data\") pod \"70e766dc-9f84-4d0c-af5b-3b044e06c09f\" (UID: \"70e766dc-9f84-4d0c-af5b-3b044e06c09f\") " Jan 28 21:00:14 crc kubenswrapper[4746]: I0128 21:00:14.943547 4746 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/24a8ff3b-2ce2-4603-a1c0-76ca3fe3a470-ovsdbserver-nb\") pod \"dnsmasq-dns-b895b5785-8cmsg\" (UID: \"24a8ff3b-2ce2-4603-a1c0-76ca3fe3a470\") " pod="openstack/dnsmasq-dns-b895b5785-8cmsg" Jan 28 21:00:14 crc kubenswrapper[4746]: I0128 21:00:14.943575 4746 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/24a8ff3b-2ce2-4603-a1c0-76ca3fe3a470-dns-svc\") pod \"dnsmasq-dns-b895b5785-8cmsg\" (UID: \"24a8ff3b-2ce2-4603-a1c0-76ca3fe3a470\") " pod="openstack/dnsmasq-dns-b895b5785-8cmsg" Jan 28 21:00:14 crc kubenswrapper[4746]: I0128 21:00:14.943714 4746 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-t5b2v\" (UniqueName: \"kubernetes.io/projected/24a8ff3b-2ce2-4603-a1c0-76ca3fe3a470-kube-api-access-t5b2v\") pod \"dnsmasq-dns-b895b5785-8cmsg\" (UID: \"24a8ff3b-2ce2-4603-a1c0-76ca3fe3a470\") " pod="openstack/dnsmasq-dns-b895b5785-8cmsg" Jan 28 21:00:14 crc kubenswrapper[4746]: I0128 21:00:14.943870 
4746 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/24a8ff3b-2ce2-4603-a1c0-76ca3fe3a470-config\") pod \"dnsmasq-dns-b895b5785-8cmsg\" (UID: \"24a8ff3b-2ce2-4603-a1c0-76ca3fe3a470\") " pod="openstack/dnsmasq-dns-b895b5785-8cmsg" Jan 28 21:00:14 crc kubenswrapper[4746]: I0128 21:00:14.943917 4746 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/24a8ff3b-2ce2-4603-a1c0-76ca3fe3a470-dns-swift-storage-0\") pod \"dnsmasq-dns-b895b5785-8cmsg\" (UID: \"24a8ff3b-2ce2-4603-a1c0-76ca3fe3a470\") " pod="openstack/dnsmasq-dns-b895b5785-8cmsg" Jan 28 21:00:14 crc kubenswrapper[4746]: I0128 21:00:14.943960 4746 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/24a8ff3b-2ce2-4603-a1c0-76ca3fe3a470-ovsdbserver-sb\") pod \"dnsmasq-dns-b895b5785-8cmsg\" (UID: \"24a8ff3b-2ce2-4603-a1c0-76ca3fe3a470\") " pod="openstack/dnsmasq-dns-b895b5785-8cmsg" Jan 28 21:00:14 crc kubenswrapper[4746]: I0128 21:00:14.947201 4746 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/24a8ff3b-2ce2-4603-a1c0-76ca3fe3a470-dns-svc\") pod \"dnsmasq-dns-b895b5785-8cmsg\" (UID: \"24a8ff3b-2ce2-4603-a1c0-76ca3fe3a470\") " pod="openstack/dnsmasq-dns-b895b5785-8cmsg" Jan 28 21:00:14 crc kubenswrapper[4746]: I0128 21:00:14.947497 4746 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/24a8ff3b-2ce2-4603-a1c0-76ca3fe3a470-ovsdbserver-nb\") pod \"dnsmasq-dns-b895b5785-8cmsg\" (UID: \"24a8ff3b-2ce2-4603-a1c0-76ca3fe3a470\") " pod="openstack/dnsmasq-dns-b895b5785-8cmsg" Jan 28 21:00:14 crc kubenswrapper[4746]: I0128 21:00:14.949836 4746 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume 
"kubernetes.io/secret/70e766dc-9f84-4d0c-af5b-3b044e06c09f-db-sync-config-data" (OuterVolumeSpecName: "db-sync-config-data") pod "70e766dc-9f84-4d0c-af5b-3b044e06c09f" (UID: "70e766dc-9f84-4d0c-af5b-3b044e06c09f"). InnerVolumeSpecName "db-sync-config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 28 21:00:14 crc kubenswrapper[4746]: I0128 21:00:14.982876 4746 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/70e766dc-9f84-4d0c-af5b-3b044e06c09f-kube-api-access-hz7rb" (OuterVolumeSpecName: "kube-api-access-hz7rb") pod "70e766dc-9f84-4d0c-af5b-3b044e06c09f" (UID: "70e766dc-9f84-4d0c-af5b-3b044e06c09f"). InnerVolumeSpecName "kube-api-access-hz7rb". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 28 21:00:14 crc kubenswrapper[4746]: I0128 21:00:14.995050 4746 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-scheduler-0" Jan 28 21:00:14 crc kubenswrapper[4746]: I0128 21:00:14.995438 4746 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/24a8ff3b-2ce2-4603-a1c0-76ca3fe3a470-ovsdbserver-sb\") pod \"dnsmasq-dns-b895b5785-8cmsg\" (UID: \"24a8ff3b-2ce2-4603-a1c0-76ca3fe3a470\") " pod="openstack/dnsmasq-dns-b895b5785-8cmsg" Jan 28 21:00:14 crc kubenswrapper[4746]: I0128 21:00:14.999824 4746 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/24a8ff3b-2ce2-4603-a1c0-76ca3fe3a470-dns-swift-storage-0\") pod \"dnsmasq-dns-b895b5785-8cmsg\" (UID: \"24a8ff3b-2ce2-4603-a1c0-76ca3fe3a470\") " pod="openstack/dnsmasq-dns-b895b5785-8cmsg" Jan 28 21:00:15 crc kubenswrapper[4746]: I0128 21:00:15.013500 4746 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/24a8ff3b-2ce2-4603-a1c0-76ca3fe3a470-config\") pod \"dnsmasq-dns-b895b5785-8cmsg\" (UID: 
\"24a8ff3b-2ce2-4603-a1c0-76ca3fe3a470\") " pod="openstack/dnsmasq-dns-b895b5785-8cmsg" Jan 28 21:00:15 crc kubenswrapper[4746]: I0128 21:00:15.017008 4746 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-t5b2v\" (UniqueName: \"kubernetes.io/projected/24a8ff3b-2ce2-4603-a1c0-76ca3fe3a470-kube-api-access-t5b2v\") pod \"dnsmasq-dns-b895b5785-8cmsg\" (UID: \"24a8ff3b-2ce2-4603-a1c0-76ca3fe3a470\") " pod="openstack/dnsmasq-dns-b895b5785-8cmsg" Jan 28 21:00:15 crc kubenswrapper[4746]: I0128 21:00:15.033465 4746 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/70e766dc-9f84-4d0c-af5b-3b044e06c09f-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "70e766dc-9f84-4d0c-af5b-3b044e06c09f" (UID: "70e766dc-9f84-4d0c-af5b-3b044e06c09f"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 28 21:00:15 crc kubenswrapper[4746]: I0128 21:00:15.045817 4746 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-hz7rb\" (UniqueName: \"kubernetes.io/projected/70e766dc-9f84-4d0c-af5b-3b044e06c09f-kube-api-access-hz7rb\") on node \"crc\" DevicePath \"\"" Jan 28 21:00:15 crc kubenswrapper[4746]: I0128 21:00:15.045852 4746 reconciler_common.go:293] "Volume detached for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/70e766dc-9f84-4d0c-af5b-3b044e06c09f-db-sync-config-data\") on node \"crc\" DevicePath \"\"" Jan 28 21:00:15 crc kubenswrapper[4746]: I0128 21:00:15.045862 4746 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/70e766dc-9f84-4d0c-af5b-3b044e06c09f-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Jan 28 21:00:15 crc kubenswrapper[4746]: I0128 21:00:15.122046 4746 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-b895b5785-8cmsg" Jan 28 21:00:15 crc kubenswrapper[4746]: I0128 21:00:15.148701 4746 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/cinder-api-0"] Jan 28 21:00:15 crc kubenswrapper[4746]: E0128 21:00:15.149127 4746 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="70e766dc-9f84-4d0c-af5b-3b044e06c09f" containerName="barbican-db-sync" Jan 28 21:00:15 crc kubenswrapper[4746]: I0128 21:00:15.149145 4746 state_mem.go:107] "Deleted CPUSet assignment" podUID="70e766dc-9f84-4d0c-af5b-3b044e06c09f" containerName="barbican-db-sync" Jan 28 21:00:15 crc kubenswrapper[4746]: I0128 21:00:15.149420 4746 memory_manager.go:354] "RemoveStaleState removing state" podUID="70e766dc-9f84-4d0c-af5b-3b044e06c09f" containerName="barbican-db-sync" Jan 28 21:00:15 crc kubenswrapper[4746]: I0128 21:00:15.150401 4746 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cinder-api-0"] Jan 28 21:00:15 crc kubenswrapper[4746]: I0128 21:00:15.150492 4746 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/cinder-api-0" Jan 28 21:00:15 crc kubenswrapper[4746]: I0128 21:00:15.155551 4746 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cinder-api-config-data" Jan 28 21:00:15 crc kubenswrapper[4746]: I0128 21:00:15.253406 4746 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/7fdd9345-765c-4290-beee-f02752b34ee8-combined-ca-bundle\") pod \"cinder-api-0\" (UID: \"7fdd9345-765c-4290-beee-f02752b34ee8\") " pod="openstack/cinder-api-0" Jan 28 21:00:15 crc kubenswrapper[4746]: I0128 21:00:15.253882 4746 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-7lw6q\" (UniqueName: \"kubernetes.io/projected/7fdd9345-765c-4290-beee-f02752b34ee8-kube-api-access-7lw6q\") pod \"cinder-api-0\" (UID: \"7fdd9345-765c-4290-beee-f02752b34ee8\") " pod="openstack/cinder-api-0" Jan 28 21:00:15 crc kubenswrapper[4746]: I0128 21:00:15.254036 4746 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/7fdd9345-765c-4290-beee-f02752b34ee8-logs\") pod \"cinder-api-0\" (UID: \"7fdd9345-765c-4290-beee-f02752b34ee8\") " pod="openstack/cinder-api-0" Jan 28 21:00:15 crc kubenswrapper[4746]: I0128 21:00:15.254071 4746 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/7fdd9345-765c-4290-beee-f02752b34ee8-config-data\") pod \"cinder-api-0\" (UID: \"7fdd9345-765c-4290-beee-f02752b34ee8\") " pod="openstack/cinder-api-0" Jan 28 21:00:15 crc kubenswrapper[4746]: I0128 21:00:15.283791 4746 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/7fdd9345-765c-4290-beee-f02752b34ee8-scripts\") pod \"cinder-api-0\" 
(UID: \"7fdd9345-765c-4290-beee-f02752b34ee8\") " pod="openstack/cinder-api-0" Jan 28 21:00:15 crc kubenswrapper[4746]: I0128 21:00:15.283982 4746 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/7fdd9345-765c-4290-beee-f02752b34ee8-config-data-custom\") pod \"cinder-api-0\" (UID: \"7fdd9345-765c-4290-beee-f02752b34ee8\") " pod="openstack/cinder-api-0" Jan 28 21:00:15 crc kubenswrapper[4746]: I0128 21:00:15.284110 4746 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/7fdd9345-765c-4290-beee-f02752b34ee8-etc-machine-id\") pod \"cinder-api-0\" (UID: \"7fdd9345-765c-4290-beee-f02752b34ee8\") " pod="openstack/cinder-api-0" Jan 28 21:00:15 crc kubenswrapper[4746]: I0128 21:00:15.375018 4746 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-db-sync-29n2p" event={"ID":"70e766dc-9f84-4d0c-af5b-3b044e06c09f","Type":"ContainerDied","Data":"f8fc35da8ff29fd471e54b600102c93bb7f1dece67edccd18a4ebde23e0bf1ce"} Jan 28 21:00:15 crc kubenswrapper[4746]: I0128 21:00:15.375064 4746 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="f8fc35da8ff29fd471e54b600102c93bb7f1dece67edccd18a4ebde23e0bf1ce" Jan 28 21:00:15 crc kubenswrapper[4746]: I0128 21:00:15.375168 4746 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/barbican-db-sync-29n2p" Jan 28 21:00:15 crc kubenswrapper[4746]: I0128 21:00:15.388316 4746 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/7fdd9345-765c-4290-beee-f02752b34ee8-combined-ca-bundle\") pod \"cinder-api-0\" (UID: \"7fdd9345-765c-4290-beee-f02752b34ee8\") " pod="openstack/cinder-api-0" Jan 28 21:00:15 crc kubenswrapper[4746]: I0128 21:00:15.388378 4746 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-7lw6q\" (UniqueName: \"kubernetes.io/projected/7fdd9345-765c-4290-beee-f02752b34ee8-kube-api-access-7lw6q\") pod \"cinder-api-0\" (UID: \"7fdd9345-765c-4290-beee-f02752b34ee8\") " pod="openstack/cinder-api-0" Jan 28 21:00:15 crc kubenswrapper[4746]: I0128 21:00:15.388485 4746 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/7fdd9345-765c-4290-beee-f02752b34ee8-logs\") pod \"cinder-api-0\" (UID: \"7fdd9345-765c-4290-beee-f02752b34ee8\") " pod="openstack/cinder-api-0" Jan 28 21:00:15 crc kubenswrapper[4746]: I0128 21:00:15.388511 4746 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/7fdd9345-765c-4290-beee-f02752b34ee8-config-data\") pod \"cinder-api-0\" (UID: \"7fdd9345-765c-4290-beee-f02752b34ee8\") " pod="openstack/cinder-api-0" Jan 28 21:00:15 crc kubenswrapper[4746]: I0128 21:00:15.388537 4746 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/7fdd9345-765c-4290-beee-f02752b34ee8-scripts\") pod \"cinder-api-0\" (UID: \"7fdd9345-765c-4290-beee-f02752b34ee8\") " pod="openstack/cinder-api-0" Jan 28 21:00:15 crc kubenswrapper[4746]: I0128 21:00:15.388578 4746 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-custom\" 
(UniqueName: \"kubernetes.io/secret/7fdd9345-765c-4290-beee-f02752b34ee8-config-data-custom\") pod \"cinder-api-0\" (UID: \"7fdd9345-765c-4290-beee-f02752b34ee8\") " pod="openstack/cinder-api-0" Jan 28 21:00:15 crc kubenswrapper[4746]: I0128 21:00:15.388612 4746 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/7fdd9345-765c-4290-beee-f02752b34ee8-etc-machine-id\") pod \"cinder-api-0\" (UID: \"7fdd9345-765c-4290-beee-f02752b34ee8\") " pod="openstack/cinder-api-0" Jan 28 21:00:15 crc kubenswrapper[4746]: I0128 21:00:15.388777 4746 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/7fdd9345-765c-4290-beee-f02752b34ee8-etc-machine-id\") pod \"cinder-api-0\" (UID: \"7fdd9345-765c-4290-beee-f02752b34ee8\") " pod="openstack/cinder-api-0" Jan 28 21:00:15 crc kubenswrapper[4746]: I0128 21:00:15.389193 4746 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/7fdd9345-765c-4290-beee-f02752b34ee8-logs\") pod \"cinder-api-0\" (UID: \"7fdd9345-765c-4290-beee-f02752b34ee8\") " pod="openstack/cinder-api-0" Jan 28 21:00:15 crc kubenswrapper[4746]: I0128 21:00:15.405348 4746 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/7fdd9345-765c-4290-beee-f02752b34ee8-combined-ca-bundle\") pod \"cinder-api-0\" (UID: \"7fdd9345-765c-4290-beee-f02752b34ee8\") " pod="openstack/cinder-api-0" Jan 28 21:00:15 crc kubenswrapper[4746]: I0128 21:00:15.407867 4746 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/7fdd9345-765c-4290-beee-f02752b34ee8-config-data\") pod \"cinder-api-0\" (UID: \"7fdd9345-765c-4290-beee-f02752b34ee8\") " pod="openstack/cinder-api-0" Jan 28 21:00:15 crc kubenswrapper[4746]: I0128 21:00:15.430907 4746 
operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/7fdd9345-765c-4290-beee-f02752b34ee8-scripts\") pod \"cinder-api-0\" (UID: \"7fdd9345-765c-4290-beee-f02752b34ee8\") " pod="openstack/cinder-api-0" Jan 28 21:00:15 crc kubenswrapper[4746]: I0128 21:00:15.435561 4746 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-7lw6q\" (UniqueName: \"kubernetes.io/projected/7fdd9345-765c-4290-beee-f02752b34ee8-kube-api-access-7lw6q\") pod \"cinder-api-0\" (UID: \"7fdd9345-765c-4290-beee-f02752b34ee8\") " pod="openstack/cinder-api-0" Jan 28 21:00:15 crc kubenswrapper[4746]: I0128 21:00:15.436514 4746 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/7fdd9345-765c-4290-beee-f02752b34ee8-config-data-custom\") pod \"cinder-api-0\" (UID: \"7fdd9345-765c-4290-beee-f02752b34ee8\") " pod="openstack/cinder-api-0" Jan 28 21:00:15 crc kubenswrapper[4746]: I0128 21:00:15.482598 4746 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-api-0" Jan 28 21:00:15 crc kubenswrapper[4746]: I0128 21:00:15.648741 4746 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/barbican-worker-54d56bfd95-zhg7t"] Jan 28 21:00:15 crc kubenswrapper[4746]: I0128 21:00:15.654440 4746 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/barbican-worker-54d56bfd95-zhg7t" Jan 28 21:00:15 crc kubenswrapper[4746]: I0128 21:00:15.660007 4746 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/barbican-worker-54d56bfd95-zhg7t"] Jan 28 21:00:15 crc kubenswrapper[4746]: I0128 21:00:15.662785 4746 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"barbican-config-data" Jan 28 21:00:15 crc kubenswrapper[4746]: I0128 21:00:15.663043 4746 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"barbican-worker-config-data" Jan 28 21:00:15 crc kubenswrapper[4746]: I0128 21:00:15.663212 4746 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"barbican-barbican-dockercfg-cqvw2" Jan 28 21:00:15 crc kubenswrapper[4746]: I0128 21:00:15.695155 4746 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/barbican-keystone-listener-74d8954788-pqmtp"] Jan 28 21:00:15 crc kubenswrapper[4746]: I0128 21:00:15.697265 4746 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/barbican-keystone-listener-74d8954788-pqmtp" Jan 28 21:00:15 crc kubenswrapper[4746]: I0128 21:00:15.708248 4746 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"barbican-keystone-listener-config-data" Jan 28 21:00:15 crc kubenswrapper[4746]: I0128 21:00:15.735333 4746 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/barbican-keystone-listener-74d8954788-pqmtp"] Jan 28 21:00:15 crc kubenswrapper[4746]: I0128 21:00:15.790124 4746 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-b895b5785-8cmsg"] Jan 28 21:00:15 crc kubenswrapper[4746]: I0128 21:00:15.806358 4746 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/26db0906-ba06-4d40-b864-c7d956379296-config-data-custom\") pod \"barbican-worker-54d56bfd95-zhg7t\" (UID: \"26db0906-ba06-4d40-b864-c7d956379296\") " pod="openstack/barbican-worker-54d56bfd95-zhg7t" Jan 28 21:00:15 crc kubenswrapper[4746]: I0128 21:00:15.806966 4746 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-n86sb\" (UniqueName: \"kubernetes.io/projected/26db0906-ba06-4d40-b864-c7d956379296-kube-api-access-n86sb\") pod \"barbican-worker-54d56bfd95-zhg7t\" (UID: \"26db0906-ba06-4d40-b864-c7d956379296\") " pod="openstack/barbican-worker-54d56bfd95-zhg7t" Jan 28 21:00:15 crc kubenswrapper[4746]: I0128 21:00:15.807010 4746 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/26db0906-ba06-4d40-b864-c7d956379296-config-data\") pod \"barbican-worker-54d56bfd95-zhg7t\" (UID: \"26db0906-ba06-4d40-b864-c7d956379296\") " pod="openstack/barbican-worker-54d56bfd95-zhg7t" Jan 28 21:00:15 crc kubenswrapper[4746]: I0128 21:00:15.807049 4746 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/9d6c401d-18ee-432b-992c-749c69887786-config-data\") pod \"barbican-keystone-listener-74d8954788-pqmtp\" (UID: \"9d6c401d-18ee-432b-992c-749c69887786\") " pod="openstack/barbican-keystone-listener-74d8954788-pqmtp" Jan 28 21:00:15 crc kubenswrapper[4746]: I0128 21:00:15.807090 4746 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/9d6c401d-18ee-432b-992c-749c69887786-combined-ca-bundle\") pod \"barbican-keystone-listener-74d8954788-pqmtp\" (UID: \"9d6c401d-18ee-432b-992c-749c69887786\") " pod="openstack/barbican-keystone-listener-74d8954788-pqmtp" Jan 28 21:00:15 crc kubenswrapper[4746]: I0128 21:00:15.807111 4746 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/9d6c401d-18ee-432b-992c-749c69887786-config-data-custom\") pod \"barbican-keystone-listener-74d8954788-pqmtp\" (UID: \"9d6c401d-18ee-432b-992c-749c69887786\") " pod="openstack/barbican-keystone-listener-74d8954788-pqmtp" Jan 28 21:00:15 crc kubenswrapper[4746]: I0128 21:00:15.807135 4746 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/26db0906-ba06-4d40-b864-c7d956379296-logs\") pod \"barbican-worker-54d56bfd95-zhg7t\" (UID: \"26db0906-ba06-4d40-b864-c7d956379296\") " pod="openstack/barbican-worker-54d56bfd95-zhg7t" Jan 28 21:00:15 crc kubenswrapper[4746]: I0128 21:00:15.807152 4746 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-7qdgw\" (UniqueName: \"kubernetes.io/projected/9d6c401d-18ee-432b-992c-749c69887786-kube-api-access-7qdgw\") pod \"barbican-keystone-listener-74d8954788-pqmtp\" (UID: 
\"9d6c401d-18ee-432b-992c-749c69887786\") " pod="openstack/barbican-keystone-listener-74d8954788-pqmtp" Jan 28 21:00:15 crc kubenswrapper[4746]: I0128 21:00:15.807215 4746 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/26db0906-ba06-4d40-b864-c7d956379296-combined-ca-bundle\") pod \"barbican-worker-54d56bfd95-zhg7t\" (UID: \"26db0906-ba06-4d40-b864-c7d956379296\") " pod="openstack/barbican-worker-54d56bfd95-zhg7t" Jan 28 21:00:15 crc kubenswrapper[4746]: I0128 21:00:15.807232 4746 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/9d6c401d-18ee-432b-992c-749c69887786-logs\") pod \"barbican-keystone-listener-74d8954788-pqmtp\" (UID: \"9d6c401d-18ee-432b-992c-749c69887786\") " pod="openstack/barbican-keystone-listener-74d8954788-pqmtp" Jan 28 21:00:15 crc kubenswrapper[4746]: I0128 21:00:15.814514 4746 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cinder-scheduler-0"] Jan 28 21:00:15 crc kubenswrapper[4746]: I0128 21:00:15.851926 4746 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-5c9776ccc5-7fncg"] Jan 28 21:00:15 crc kubenswrapper[4746]: I0128 21:00:15.853683 4746 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-5c9776ccc5-7fncg" Jan 28 21:00:15 crc kubenswrapper[4746]: I0128 21:00:15.864405 4746 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-5c9776ccc5-7fncg"] Jan 28 21:00:15 crc kubenswrapper[4746]: I0128 21:00:15.872213 4746 patch_prober.go:28] interesting pod/machine-config-daemon-6wrnw container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Jan 28 21:00:15 crc kubenswrapper[4746]: I0128 21:00:15.872258 4746 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-6wrnw" podUID="6dc8b546-9734-4082-b2b3-2bafe3f1564d" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Jan 28 21:00:15 crc kubenswrapper[4746]: I0128 21:00:15.884479 4746 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/barbican-api-6dcb95875d-qz4xq"] Jan 28 21:00:15 crc kubenswrapper[4746]: I0128 21:00:15.886241 4746 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/barbican-api-6dcb95875d-qz4xq" Jan 28 21:00:15 crc kubenswrapper[4746]: I0128 21:00:15.890411 4746 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"barbican-api-config-data" Jan 28 21:00:15 crc kubenswrapper[4746]: I0128 21:00:15.904250 4746 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/barbican-api-6dcb95875d-qz4xq"] Jan 28 21:00:15 crc kubenswrapper[4746]: I0128 21:00:15.918882 4746 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/26db0906-ba06-4d40-b864-c7d956379296-logs\") pod \"barbican-worker-54d56bfd95-zhg7t\" (UID: \"26db0906-ba06-4d40-b864-c7d956379296\") " pod="openstack/barbican-worker-54d56bfd95-zhg7t" Jan 28 21:00:15 crc kubenswrapper[4746]: I0128 21:00:15.919234 4746 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-7qdgw\" (UniqueName: \"kubernetes.io/projected/9d6c401d-18ee-432b-992c-749c69887786-kube-api-access-7qdgw\") pod \"barbican-keystone-listener-74d8954788-pqmtp\" (UID: \"9d6c401d-18ee-432b-992c-749c69887786\") " pod="openstack/barbican-keystone-listener-74d8954788-pqmtp" Jan 28 21:00:15 crc kubenswrapper[4746]: I0128 21:00:15.919406 4746 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/26db0906-ba06-4d40-b864-c7d956379296-combined-ca-bundle\") pod \"barbican-worker-54d56bfd95-zhg7t\" (UID: \"26db0906-ba06-4d40-b864-c7d956379296\") " pod="openstack/barbican-worker-54d56bfd95-zhg7t" Jan 28 21:00:15 crc kubenswrapper[4746]: I0128 21:00:15.919511 4746 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/9d6c401d-18ee-432b-992c-749c69887786-logs\") pod \"barbican-keystone-listener-74d8954788-pqmtp\" (UID: \"9d6c401d-18ee-432b-992c-749c69887786\") " 
pod="openstack/barbican-keystone-listener-74d8954788-pqmtp" Jan 28 21:00:15 crc kubenswrapper[4746]: I0128 21:00:15.920691 4746 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/26db0906-ba06-4d40-b864-c7d956379296-logs\") pod \"barbican-worker-54d56bfd95-zhg7t\" (UID: \"26db0906-ba06-4d40-b864-c7d956379296\") " pod="openstack/barbican-worker-54d56bfd95-zhg7t" Jan 28 21:00:15 crc kubenswrapper[4746]: I0128 21:00:15.920984 4746 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/9d6c401d-18ee-432b-992c-749c69887786-logs\") pod \"barbican-keystone-listener-74d8954788-pqmtp\" (UID: \"9d6c401d-18ee-432b-992c-749c69887786\") " pod="openstack/barbican-keystone-listener-74d8954788-pqmtp" Jan 28 21:00:15 crc kubenswrapper[4746]: I0128 21:00:15.923226 4746 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/26db0906-ba06-4d40-b864-c7d956379296-config-data-custom\") pod \"barbican-worker-54d56bfd95-zhg7t\" (UID: \"26db0906-ba06-4d40-b864-c7d956379296\") " pod="openstack/barbican-worker-54d56bfd95-zhg7t" Jan 28 21:00:15 crc kubenswrapper[4746]: I0128 21:00:15.923430 4746 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-n86sb\" (UniqueName: \"kubernetes.io/projected/26db0906-ba06-4d40-b864-c7d956379296-kube-api-access-n86sb\") pod \"barbican-worker-54d56bfd95-zhg7t\" (UID: \"26db0906-ba06-4d40-b864-c7d956379296\") " pod="openstack/barbican-worker-54d56bfd95-zhg7t" Jan 28 21:00:15 crc kubenswrapper[4746]: I0128 21:00:15.923627 4746 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/26db0906-ba06-4d40-b864-c7d956379296-config-data\") pod \"barbican-worker-54d56bfd95-zhg7t\" (UID: \"26db0906-ba06-4d40-b864-c7d956379296\") " 
pod="openstack/barbican-worker-54d56bfd95-zhg7t" Jan 28 21:00:15 crc kubenswrapper[4746]: I0128 21:00:15.923783 4746 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/9d6c401d-18ee-432b-992c-749c69887786-config-data\") pod \"barbican-keystone-listener-74d8954788-pqmtp\" (UID: \"9d6c401d-18ee-432b-992c-749c69887786\") " pod="openstack/barbican-keystone-listener-74d8954788-pqmtp" Jan 28 21:00:15 crc kubenswrapper[4746]: I0128 21:00:15.923894 4746 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/9d6c401d-18ee-432b-992c-749c69887786-combined-ca-bundle\") pod \"barbican-keystone-listener-74d8954788-pqmtp\" (UID: \"9d6c401d-18ee-432b-992c-749c69887786\") " pod="openstack/barbican-keystone-listener-74d8954788-pqmtp" Jan 28 21:00:15 crc kubenswrapper[4746]: I0128 21:00:15.923991 4746 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/9d6c401d-18ee-432b-992c-749c69887786-config-data-custom\") pod \"barbican-keystone-listener-74d8954788-pqmtp\" (UID: \"9d6c401d-18ee-432b-992c-749c69887786\") " pod="openstack/barbican-keystone-listener-74d8954788-pqmtp" Jan 28 21:00:15 crc kubenswrapper[4746]: I0128 21:00:15.929052 4746 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/26db0906-ba06-4d40-b864-c7d956379296-combined-ca-bundle\") pod \"barbican-worker-54d56bfd95-zhg7t\" (UID: \"26db0906-ba06-4d40-b864-c7d956379296\") " pod="openstack/barbican-worker-54d56bfd95-zhg7t" Jan 28 21:00:15 crc kubenswrapper[4746]: I0128 21:00:15.929929 4746 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/9d6c401d-18ee-432b-992c-749c69887786-config-data-custom\") pod 
\"barbican-keystone-listener-74d8954788-pqmtp\" (UID: \"9d6c401d-18ee-432b-992c-749c69887786\") " pod="openstack/barbican-keystone-listener-74d8954788-pqmtp" Jan 28 21:00:15 crc kubenswrapper[4746]: I0128 21:00:15.934170 4746 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/9d6c401d-18ee-432b-992c-749c69887786-combined-ca-bundle\") pod \"barbican-keystone-listener-74d8954788-pqmtp\" (UID: \"9d6c401d-18ee-432b-992c-749c69887786\") " pod="openstack/barbican-keystone-listener-74d8954788-pqmtp" Jan 28 21:00:15 crc kubenswrapper[4746]: I0128 21:00:15.936928 4746 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/26db0906-ba06-4d40-b864-c7d956379296-config-data-custom\") pod \"barbican-worker-54d56bfd95-zhg7t\" (UID: \"26db0906-ba06-4d40-b864-c7d956379296\") " pod="openstack/barbican-worker-54d56bfd95-zhg7t" Jan 28 21:00:15 crc kubenswrapper[4746]: I0128 21:00:15.939285 4746 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/26db0906-ba06-4d40-b864-c7d956379296-config-data\") pod \"barbican-worker-54d56bfd95-zhg7t\" (UID: \"26db0906-ba06-4d40-b864-c7d956379296\") " pod="openstack/barbican-worker-54d56bfd95-zhg7t" Jan 28 21:00:15 crc kubenswrapper[4746]: I0128 21:00:15.939308 4746 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/9d6c401d-18ee-432b-992c-749c69887786-config-data\") pod \"barbican-keystone-listener-74d8954788-pqmtp\" (UID: \"9d6c401d-18ee-432b-992c-749c69887786\") " pod="openstack/barbican-keystone-listener-74d8954788-pqmtp" Jan 28 21:00:15 crc kubenswrapper[4746]: I0128 21:00:15.940168 4746 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-7qdgw\" (UniqueName: 
\"kubernetes.io/projected/9d6c401d-18ee-432b-992c-749c69887786-kube-api-access-7qdgw\") pod \"barbican-keystone-listener-74d8954788-pqmtp\" (UID: \"9d6c401d-18ee-432b-992c-749c69887786\") " pod="openstack/barbican-keystone-listener-74d8954788-pqmtp" Jan 28 21:00:15 crc kubenswrapper[4746]: I0128 21:00:15.975445 4746 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-n86sb\" (UniqueName: \"kubernetes.io/projected/26db0906-ba06-4d40-b864-c7d956379296-kube-api-access-n86sb\") pod \"barbican-worker-54d56bfd95-zhg7t\" (UID: \"26db0906-ba06-4d40-b864-c7d956379296\") " pod="openstack/barbican-worker-54d56bfd95-zhg7t" Jan 28 21:00:15 crc kubenswrapper[4746]: I0128 21:00:15.989522 4746 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/barbican-worker-54d56bfd95-zhg7t" Jan 28 21:00:16 crc kubenswrapper[4746]: I0128 21:00:16.026029 4746 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/ea56fe21-dd12-486e-a4e9-2af7cb7b9387-config-data\") pod \"barbican-api-6dcb95875d-qz4xq\" (UID: \"ea56fe21-dd12-486e-a4e9-2af7cb7b9387\") " pod="openstack/barbican-api-6dcb95875d-qz4xq" Jan 28 21:00:16 crc kubenswrapper[4746]: I0128 21:00:16.026091 4746 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-84fk6\" (UniqueName: \"kubernetes.io/projected/ea56fe21-dd12-486e-a4e9-2af7cb7b9387-kube-api-access-84fk6\") pod \"barbican-api-6dcb95875d-qz4xq\" (UID: \"ea56fe21-dd12-486e-a4e9-2af7cb7b9387\") " pod="openstack/barbican-api-6dcb95875d-qz4xq" Jan 28 21:00:16 crc kubenswrapper[4746]: I0128 21:00:16.026119 4746 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/ea56fe21-dd12-486e-a4e9-2af7cb7b9387-config-data-custom\") pod \"barbican-api-6dcb95875d-qz4xq\" (UID: 
\"ea56fe21-dd12-486e-a4e9-2af7cb7b9387\") " pod="openstack/barbican-api-6dcb95875d-qz4xq" Jan 28 21:00:16 crc kubenswrapper[4746]: I0128 21:00:16.026185 4746 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/ea56fe21-dd12-486e-a4e9-2af7cb7b9387-combined-ca-bundle\") pod \"barbican-api-6dcb95875d-qz4xq\" (UID: \"ea56fe21-dd12-486e-a4e9-2af7cb7b9387\") " pod="openstack/barbican-api-6dcb95875d-qz4xq" Jan 28 21:00:16 crc kubenswrapper[4746]: I0128 21:00:16.026226 4746 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/d730a144-0107-409b-9abe-1e316143afc9-dns-swift-storage-0\") pod \"dnsmasq-dns-5c9776ccc5-7fncg\" (UID: \"d730a144-0107-409b-9abe-1e316143afc9\") " pod="openstack/dnsmasq-dns-5c9776ccc5-7fncg" Jan 28 21:00:16 crc kubenswrapper[4746]: I0128 21:00:16.026278 4746 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/d730a144-0107-409b-9abe-1e316143afc9-config\") pod \"dnsmasq-dns-5c9776ccc5-7fncg\" (UID: \"d730a144-0107-409b-9abe-1e316143afc9\") " pod="openstack/dnsmasq-dns-5c9776ccc5-7fncg" Jan 28 21:00:16 crc kubenswrapper[4746]: I0128 21:00:16.026321 4746 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-hqdnm\" (UniqueName: \"kubernetes.io/projected/d730a144-0107-409b-9abe-1e316143afc9-kube-api-access-hqdnm\") pod \"dnsmasq-dns-5c9776ccc5-7fncg\" (UID: \"d730a144-0107-409b-9abe-1e316143afc9\") " pod="openstack/dnsmasq-dns-5c9776ccc5-7fncg" Jan 28 21:00:16 crc kubenswrapper[4746]: I0128 21:00:16.026340 4746 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/d730a144-0107-409b-9abe-1e316143afc9-dns-svc\") 
pod \"dnsmasq-dns-5c9776ccc5-7fncg\" (UID: \"d730a144-0107-409b-9abe-1e316143afc9\") " pod="openstack/dnsmasq-dns-5c9776ccc5-7fncg" Jan 28 21:00:16 crc kubenswrapper[4746]: I0128 21:00:16.026358 4746 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/d730a144-0107-409b-9abe-1e316143afc9-ovsdbserver-nb\") pod \"dnsmasq-dns-5c9776ccc5-7fncg\" (UID: \"d730a144-0107-409b-9abe-1e316143afc9\") " pod="openstack/dnsmasq-dns-5c9776ccc5-7fncg" Jan 28 21:00:16 crc kubenswrapper[4746]: I0128 21:00:16.026398 4746 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/ea56fe21-dd12-486e-a4e9-2af7cb7b9387-logs\") pod \"barbican-api-6dcb95875d-qz4xq\" (UID: \"ea56fe21-dd12-486e-a4e9-2af7cb7b9387\") " pod="openstack/barbican-api-6dcb95875d-qz4xq" Jan 28 21:00:16 crc kubenswrapper[4746]: I0128 21:00:16.026414 4746 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/d730a144-0107-409b-9abe-1e316143afc9-ovsdbserver-sb\") pod \"dnsmasq-dns-5c9776ccc5-7fncg\" (UID: \"d730a144-0107-409b-9abe-1e316143afc9\") " pod="openstack/dnsmasq-dns-5c9776ccc5-7fncg" Jan 28 21:00:16 crc kubenswrapper[4746]: I0128 21:00:16.033494 4746 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-b895b5785-8cmsg"] Jan 28 21:00:16 crc kubenswrapper[4746]: I0128 21:00:16.044029 4746 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/barbican-keystone-listener-74d8954788-pqmtp" Jan 28 21:00:16 crc kubenswrapper[4746]: I0128 21:00:16.130963 4746 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/ea56fe21-dd12-486e-a4e9-2af7cb7b9387-logs\") pod \"barbican-api-6dcb95875d-qz4xq\" (UID: \"ea56fe21-dd12-486e-a4e9-2af7cb7b9387\") " pod="openstack/barbican-api-6dcb95875d-qz4xq" Jan 28 21:00:16 crc kubenswrapper[4746]: I0128 21:00:16.131061 4746 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/d730a144-0107-409b-9abe-1e316143afc9-ovsdbserver-sb\") pod \"dnsmasq-dns-5c9776ccc5-7fncg\" (UID: \"d730a144-0107-409b-9abe-1e316143afc9\") " pod="openstack/dnsmasq-dns-5c9776ccc5-7fncg" Jan 28 21:00:16 crc kubenswrapper[4746]: I0128 21:00:16.131158 4746 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/ea56fe21-dd12-486e-a4e9-2af7cb7b9387-config-data\") pod \"barbican-api-6dcb95875d-qz4xq\" (UID: \"ea56fe21-dd12-486e-a4e9-2af7cb7b9387\") " pod="openstack/barbican-api-6dcb95875d-qz4xq" Jan 28 21:00:16 crc kubenswrapper[4746]: I0128 21:00:16.131199 4746 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-84fk6\" (UniqueName: \"kubernetes.io/projected/ea56fe21-dd12-486e-a4e9-2af7cb7b9387-kube-api-access-84fk6\") pod \"barbican-api-6dcb95875d-qz4xq\" (UID: \"ea56fe21-dd12-486e-a4e9-2af7cb7b9387\") " pod="openstack/barbican-api-6dcb95875d-qz4xq" Jan 28 21:00:16 crc kubenswrapper[4746]: I0128 21:00:16.131240 4746 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/ea56fe21-dd12-486e-a4e9-2af7cb7b9387-config-data-custom\") pod \"barbican-api-6dcb95875d-qz4xq\" (UID: \"ea56fe21-dd12-486e-a4e9-2af7cb7b9387\") " 
pod="openstack/barbican-api-6dcb95875d-qz4xq" Jan 28 21:00:16 crc kubenswrapper[4746]: I0128 21:00:16.131310 4746 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/ea56fe21-dd12-486e-a4e9-2af7cb7b9387-combined-ca-bundle\") pod \"barbican-api-6dcb95875d-qz4xq\" (UID: \"ea56fe21-dd12-486e-a4e9-2af7cb7b9387\") " pod="openstack/barbican-api-6dcb95875d-qz4xq" Jan 28 21:00:16 crc kubenswrapper[4746]: I0128 21:00:16.131372 4746 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/d730a144-0107-409b-9abe-1e316143afc9-dns-swift-storage-0\") pod \"dnsmasq-dns-5c9776ccc5-7fncg\" (UID: \"d730a144-0107-409b-9abe-1e316143afc9\") " pod="openstack/dnsmasq-dns-5c9776ccc5-7fncg" Jan 28 21:00:16 crc kubenswrapper[4746]: I0128 21:00:16.131430 4746 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/d730a144-0107-409b-9abe-1e316143afc9-config\") pod \"dnsmasq-dns-5c9776ccc5-7fncg\" (UID: \"d730a144-0107-409b-9abe-1e316143afc9\") " pod="openstack/dnsmasq-dns-5c9776ccc5-7fncg" Jan 28 21:00:16 crc kubenswrapper[4746]: I0128 21:00:16.131470 4746 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-hqdnm\" (UniqueName: \"kubernetes.io/projected/d730a144-0107-409b-9abe-1e316143afc9-kube-api-access-hqdnm\") pod \"dnsmasq-dns-5c9776ccc5-7fncg\" (UID: \"d730a144-0107-409b-9abe-1e316143afc9\") " pod="openstack/dnsmasq-dns-5c9776ccc5-7fncg" Jan 28 21:00:16 crc kubenswrapper[4746]: I0128 21:00:16.131502 4746 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/d730a144-0107-409b-9abe-1e316143afc9-dns-svc\") pod \"dnsmasq-dns-5c9776ccc5-7fncg\" (UID: \"d730a144-0107-409b-9abe-1e316143afc9\") " pod="openstack/dnsmasq-dns-5c9776ccc5-7fncg" Jan 28 
21:00:16 crc kubenswrapper[4746]: I0128 21:00:16.131537 4746 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/d730a144-0107-409b-9abe-1e316143afc9-ovsdbserver-nb\") pod \"dnsmasq-dns-5c9776ccc5-7fncg\" (UID: \"d730a144-0107-409b-9abe-1e316143afc9\") " pod="openstack/dnsmasq-dns-5c9776ccc5-7fncg" Jan 28 21:00:16 crc kubenswrapper[4746]: I0128 21:00:16.131534 4746 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/ea56fe21-dd12-486e-a4e9-2af7cb7b9387-logs\") pod \"barbican-api-6dcb95875d-qz4xq\" (UID: \"ea56fe21-dd12-486e-a4e9-2af7cb7b9387\") " pod="openstack/barbican-api-6dcb95875d-qz4xq" Jan 28 21:00:16 crc kubenswrapper[4746]: I0128 21:00:16.132833 4746 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/d730a144-0107-409b-9abe-1e316143afc9-ovsdbserver-nb\") pod \"dnsmasq-dns-5c9776ccc5-7fncg\" (UID: \"d730a144-0107-409b-9abe-1e316143afc9\") " pod="openstack/dnsmasq-dns-5c9776ccc5-7fncg" Jan 28 21:00:16 crc kubenswrapper[4746]: I0128 21:00:16.133007 4746 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/d730a144-0107-409b-9abe-1e316143afc9-config\") pod \"dnsmasq-dns-5c9776ccc5-7fncg\" (UID: \"d730a144-0107-409b-9abe-1e316143afc9\") " pod="openstack/dnsmasq-dns-5c9776ccc5-7fncg" Jan 28 21:00:16 crc kubenswrapper[4746]: I0128 21:00:16.134015 4746 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/d730a144-0107-409b-9abe-1e316143afc9-dns-svc\") pod \"dnsmasq-dns-5c9776ccc5-7fncg\" (UID: \"d730a144-0107-409b-9abe-1e316143afc9\") " pod="openstack/dnsmasq-dns-5c9776ccc5-7fncg" Jan 28 21:00:16 crc kubenswrapper[4746]: I0128 21:00:16.134250 4746 operation_generator.go:637] "MountVolume.SetUp succeeded for volume 
\"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/d730a144-0107-409b-9abe-1e316143afc9-dns-swift-storage-0\") pod \"dnsmasq-dns-5c9776ccc5-7fncg\" (UID: \"d730a144-0107-409b-9abe-1e316143afc9\") " pod="openstack/dnsmasq-dns-5c9776ccc5-7fncg" Jan 28 21:00:16 crc kubenswrapper[4746]: I0128 21:00:16.134991 4746 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/d730a144-0107-409b-9abe-1e316143afc9-ovsdbserver-sb\") pod \"dnsmasq-dns-5c9776ccc5-7fncg\" (UID: \"d730a144-0107-409b-9abe-1e316143afc9\") " pod="openstack/dnsmasq-dns-5c9776ccc5-7fncg" Jan 28 21:00:16 crc kubenswrapper[4746]: I0128 21:00:16.138873 4746 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/ea56fe21-dd12-486e-a4e9-2af7cb7b9387-combined-ca-bundle\") pod \"barbican-api-6dcb95875d-qz4xq\" (UID: \"ea56fe21-dd12-486e-a4e9-2af7cb7b9387\") " pod="openstack/barbican-api-6dcb95875d-qz4xq" Jan 28 21:00:16 crc kubenswrapper[4746]: I0128 21:00:16.140216 4746 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/ea56fe21-dd12-486e-a4e9-2af7cb7b9387-config-data\") pod \"barbican-api-6dcb95875d-qz4xq\" (UID: \"ea56fe21-dd12-486e-a4e9-2af7cb7b9387\") " pod="openstack/barbican-api-6dcb95875d-qz4xq" Jan 28 21:00:16 crc kubenswrapper[4746]: I0128 21:00:16.140505 4746 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/ea56fe21-dd12-486e-a4e9-2af7cb7b9387-config-data-custom\") pod \"barbican-api-6dcb95875d-qz4xq\" (UID: \"ea56fe21-dd12-486e-a4e9-2af7cb7b9387\") " pod="openstack/barbican-api-6dcb95875d-qz4xq" Jan 28 21:00:16 crc kubenswrapper[4746]: I0128 21:00:16.153721 4746 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-hqdnm\" (UniqueName: 
\"kubernetes.io/projected/d730a144-0107-409b-9abe-1e316143afc9-kube-api-access-hqdnm\") pod \"dnsmasq-dns-5c9776ccc5-7fncg\" (UID: \"d730a144-0107-409b-9abe-1e316143afc9\") " pod="openstack/dnsmasq-dns-5c9776ccc5-7fncg" Jan 28 21:00:16 crc kubenswrapper[4746]: I0128 21:00:16.154715 4746 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-84fk6\" (UniqueName: \"kubernetes.io/projected/ea56fe21-dd12-486e-a4e9-2af7cb7b9387-kube-api-access-84fk6\") pod \"barbican-api-6dcb95875d-qz4xq\" (UID: \"ea56fe21-dd12-486e-a4e9-2af7cb7b9387\") " pod="openstack/barbican-api-6dcb95875d-qz4xq" Jan 28 21:00:16 crc kubenswrapper[4746]: I0128 21:00:16.176073 4746 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-5c9776ccc5-7fncg" Jan 28 21:00:16 crc kubenswrapper[4746]: I0128 21:00:16.354043 4746 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/barbican-api-6dcb95875d-qz4xq" Jan 28 21:00:16 crc kubenswrapper[4746]: I0128 21:00:16.393468 4746 generic.go:334] "Generic (PLEG): container finished" podID="cce95230-2b72-4598-9d28-3a1465803567" containerID="3a0fc6e40b63c3d0054964bf86868df5bae172f80bc0b177d845ac3b30f9697f" exitCode=0 Jan 28 21:00:16 crc kubenswrapper[4746]: I0128 21:00:16.393545 4746 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cloudkitty-db-sync-p9ghn" event={"ID":"cce95230-2b72-4598-9d28-3a1465803567","Type":"ContainerDied","Data":"3a0fc6e40b63c3d0054964bf86868df5bae172f80bc0b177d845ac3b30f9697f"} Jan 28 21:00:17 crc kubenswrapper[4746]: I0128 21:00:17.032257 4746 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/cinder-api-0"] Jan 28 21:00:17 crc kubenswrapper[4746]: I0128 21:00:17.137997 4746 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/root-account-create-update-gknbb"] Jan 28 21:00:17 crc kubenswrapper[4746]: I0128 21:00:17.146547 4746 kubelet.go:2431] "SyncLoop REMOVE" source="api" 
pods=["openstack/root-account-create-update-gknbb"] Jan 28 21:00:17 crc kubenswrapper[4746]: I0128 21:00:17.241226 4746 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/glance-default-internal-api-0" Jan 28 21:00:17 crc kubenswrapper[4746]: I0128 21:00:17.241729 4746 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/glance-default-internal-api-0" Jan 28 21:00:17 crc kubenswrapper[4746]: I0128 21:00:17.284852 4746 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/glance-default-internal-api-0" Jan 28 21:00:17 crc kubenswrapper[4746]: I0128 21:00:17.300911 4746 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/glance-default-external-api-0" Jan 28 21:00:17 crc kubenswrapper[4746]: I0128 21:00:17.300949 4746 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/glance-default-external-api-0" Jan 28 21:00:17 crc kubenswrapper[4746]: I0128 21:00:17.414036 4746 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/glance-default-internal-api-0" Jan 28 21:00:17 crc kubenswrapper[4746]: I0128 21:00:17.427482 4746 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/glance-default-internal-api-0" Jan 28 21:00:17 crc kubenswrapper[4746]: I0128 21:00:17.460344 4746 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/glance-default-external-api-0" Jan 28 21:00:17 crc kubenswrapper[4746]: I0128 21:00:17.461719 4746 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/glance-default-external-api-0" Jan 28 21:00:17 crc kubenswrapper[4746]: I0128 21:00:17.469822 4746 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/glance-default-external-api-0" Jan 28 21:00:18 crc kubenswrapper[4746]: I0128 21:00:18.421106 4746 kubelet.go:2542] "SyncLoop (probe)" 
probe="readiness" status="" pod="openstack/glance-default-internal-api-0" Jan 28 21:00:18 crc kubenswrapper[4746]: I0128 21:00:18.421363 4746 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/glance-default-external-api-0" Jan 28 21:00:18 crc kubenswrapper[4746]: I0128 21:00:18.854192 4746 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="28c4d6e6-e370-43ea-855a-a108b80076f6" path="/var/lib/kubelet/pods/28c4d6e6-e370-43ea-855a-a108b80076f6/volumes" Jan 28 21:00:18 crc kubenswrapper[4746]: I0128 21:00:18.986678 4746 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/neutron-bd578f44d-rfllv" Jan 28 21:00:19 crc kubenswrapper[4746]: I0128 21:00:19.283190 4746 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/neutron-66b8c6d9b5-z7qnr"] Jan 28 21:00:19 crc kubenswrapper[4746]: I0128 21:00:19.283814 4746 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/neutron-66b8c6d9b5-z7qnr" podUID="40f7281e-2e3c-4ce9-8b0f-876312390c0b" containerName="neutron-api" containerID="cri-o://f4d3b5e2216f8a4fe2a3ab22e9e3ac51f770c2bc3e05d57ffd1bc03c025cbfbb" gracePeriod=30 Jan 28 21:00:19 crc kubenswrapper[4746]: I0128 21:00:19.283960 4746 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/neutron-66b8c6d9b5-z7qnr" podUID="40f7281e-2e3c-4ce9-8b0f-876312390c0b" containerName="neutron-httpd" containerID="cri-o://7b0a672bdbe84c0ab3283f2f4f2a082d7f780e806e7442f059d9e6da34f9ef64" gracePeriod=30 Jan 28 21:00:19 crc kubenswrapper[4746]: I0128 21:00:19.303872 4746 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/neutron-66b8c6d9b5-z7qnr" podUID="40f7281e-2e3c-4ce9-8b0f-876312390c0b" containerName="neutron-httpd" probeResult="failure" output="Get \"https://10.217.0.173:9696/\": EOF" Jan 28 21:00:19 crc kubenswrapper[4746]: I0128 21:00:19.337599 4746 kubelet.go:2421] "SyncLoop ADD" source="api" 
pods=["openstack/neutron-5569c8497f-nhjcs"] Jan 28 21:00:19 crc kubenswrapper[4746]: I0128 21:00:19.339568 4746 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/neutron-5569c8497f-nhjcs" Jan 28 21:00:19 crc kubenswrapper[4746]: I0128 21:00:19.429879 4746 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d9c2d514-3bdc-4969-a429-0aac820c8e77-combined-ca-bundle\") pod \"neutron-5569c8497f-nhjcs\" (UID: \"d9c2d514-3bdc-4969-a429-0aac820c8e77\") " pod="openstack/neutron-5569c8497f-nhjcs" Jan 28 21:00:19 crc kubenswrapper[4746]: I0128 21:00:19.429928 4746 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovndb-tls-certs\" (UniqueName: \"kubernetes.io/secret/d9c2d514-3bdc-4969-a429-0aac820c8e77-ovndb-tls-certs\") pod \"neutron-5569c8497f-nhjcs\" (UID: \"d9c2d514-3bdc-4969-a429-0aac820c8e77\") " pod="openstack/neutron-5569c8497f-nhjcs" Jan 28 21:00:19 crc kubenswrapper[4746]: I0128 21:00:19.430024 4746 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/d9c2d514-3bdc-4969-a429-0aac820c8e77-internal-tls-certs\") pod \"neutron-5569c8497f-nhjcs\" (UID: \"d9c2d514-3bdc-4969-a429-0aac820c8e77\") " pod="openstack/neutron-5569c8497f-nhjcs" Jan 28 21:00:19 crc kubenswrapper[4746]: I0128 21:00:19.430063 4746 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"httpd-config\" (UniqueName: \"kubernetes.io/secret/d9c2d514-3bdc-4969-a429-0aac820c8e77-httpd-config\") pod \"neutron-5569c8497f-nhjcs\" (UID: \"d9c2d514-3bdc-4969-a429-0aac820c8e77\") " pod="openstack/neutron-5569c8497f-nhjcs" Jan 28 21:00:19 crc kubenswrapper[4746]: I0128 21:00:19.430165 4746 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" 
(UniqueName: \"kubernetes.io/secret/d9c2d514-3bdc-4969-a429-0aac820c8e77-config\") pod \"neutron-5569c8497f-nhjcs\" (UID: \"d9c2d514-3bdc-4969-a429-0aac820c8e77\") " pod="openstack/neutron-5569c8497f-nhjcs" Jan 28 21:00:19 crc kubenswrapper[4746]: I0128 21:00:19.430240 4746 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/d9c2d514-3bdc-4969-a429-0aac820c8e77-public-tls-certs\") pod \"neutron-5569c8497f-nhjcs\" (UID: \"d9c2d514-3bdc-4969-a429-0aac820c8e77\") " pod="openstack/neutron-5569c8497f-nhjcs" Jan 28 21:00:19 crc kubenswrapper[4746]: I0128 21:00:19.430278 4746 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-vzvpk\" (UniqueName: \"kubernetes.io/projected/d9c2d514-3bdc-4969-a429-0aac820c8e77-kube-api-access-vzvpk\") pod \"neutron-5569c8497f-nhjcs\" (UID: \"d9c2d514-3bdc-4969-a429-0aac820c8e77\") " pod="openstack/neutron-5569c8497f-nhjcs" Jan 28 21:00:19 crc kubenswrapper[4746]: I0128 21:00:19.433842 4746 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/neutron-5569c8497f-nhjcs"] Jan 28 21:00:19 crc kubenswrapper[4746]: I0128 21:00:19.434810 4746 prober_manager.go:312] "Failed to trigger a manual run" probe="Readiness" Jan 28 21:00:19 crc kubenswrapper[4746]: I0128 21:00:19.434811 4746 prober_manager.go:312] "Failed to trigger a manual run" probe="Readiness" Jan 28 21:00:19 crc kubenswrapper[4746]: I0128 21:00:19.532278 4746 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/secret/d9c2d514-3bdc-4969-a429-0aac820c8e77-config\") pod \"neutron-5569c8497f-nhjcs\" (UID: \"d9c2d514-3bdc-4969-a429-0aac820c8e77\") " pod="openstack/neutron-5569c8497f-nhjcs" Jan 28 21:00:19 crc kubenswrapper[4746]: I0128 21:00:19.532395 4746 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"public-tls-certs\" 
(UniqueName: \"kubernetes.io/secret/d9c2d514-3bdc-4969-a429-0aac820c8e77-public-tls-certs\") pod \"neutron-5569c8497f-nhjcs\" (UID: \"d9c2d514-3bdc-4969-a429-0aac820c8e77\") " pod="openstack/neutron-5569c8497f-nhjcs" Jan 28 21:00:19 crc kubenswrapper[4746]: I0128 21:00:19.532460 4746 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-vzvpk\" (UniqueName: \"kubernetes.io/projected/d9c2d514-3bdc-4969-a429-0aac820c8e77-kube-api-access-vzvpk\") pod \"neutron-5569c8497f-nhjcs\" (UID: \"d9c2d514-3bdc-4969-a429-0aac820c8e77\") " pod="openstack/neutron-5569c8497f-nhjcs" Jan 28 21:00:19 crc kubenswrapper[4746]: I0128 21:00:19.532587 4746 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d9c2d514-3bdc-4969-a429-0aac820c8e77-combined-ca-bundle\") pod \"neutron-5569c8497f-nhjcs\" (UID: \"d9c2d514-3bdc-4969-a429-0aac820c8e77\") " pod="openstack/neutron-5569c8497f-nhjcs" Jan 28 21:00:19 crc kubenswrapper[4746]: I0128 21:00:19.532610 4746 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovndb-tls-certs\" (UniqueName: \"kubernetes.io/secret/d9c2d514-3bdc-4969-a429-0aac820c8e77-ovndb-tls-certs\") pod \"neutron-5569c8497f-nhjcs\" (UID: \"d9c2d514-3bdc-4969-a429-0aac820c8e77\") " pod="openstack/neutron-5569c8497f-nhjcs" Jan 28 21:00:19 crc kubenswrapper[4746]: I0128 21:00:19.532700 4746 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/d9c2d514-3bdc-4969-a429-0aac820c8e77-internal-tls-certs\") pod \"neutron-5569c8497f-nhjcs\" (UID: \"d9c2d514-3bdc-4969-a429-0aac820c8e77\") " pod="openstack/neutron-5569c8497f-nhjcs" Jan 28 21:00:19 crc kubenswrapper[4746]: I0128 21:00:19.532735 4746 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"httpd-config\" (UniqueName: 
\"kubernetes.io/secret/d9c2d514-3bdc-4969-a429-0aac820c8e77-httpd-config\") pod \"neutron-5569c8497f-nhjcs\" (UID: \"d9c2d514-3bdc-4969-a429-0aac820c8e77\") " pod="openstack/neutron-5569c8497f-nhjcs" Jan 28 21:00:19 crc kubenswrapper[4746]: I0128 21:00:19.545909 4746 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovndb-tls-certs\" (UniqueName: \"kubernetes.io/secret/d9c2d514-3bdc-4969-a429-0aac820c8e77-ovndb-tls-certs\") pod \"neutron-5569c8497f-nhjcs\" (UID: \"d9c2d514-3bdc-4969-a429-0aac820c8e77\") " pod="openstack/neutron-5569c8497f-nhjcs" Jan 28 21:00:19 crc kubenswrapper[4746]: I0128 21:00:19.545959 4746 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d9c2d514-3bdc-4969-a429-0aac820c8e77-combined-ca-bundle\") pod \"neutron-5569c8497f-nhjcs\" (UID: \"d9c2d514-3bdc-4969-a429-0aac820c8e77\") " pod="openstack/neutron-5569c8497f-nhjcs" Jan 28 21:00:19 crc kubenswrapper[4746]: I0128 21:00:19.546297 4746 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/secret/d9c2d514-3bdc-4969-a429-0aac820c8e77-config\") pod \"neutron-5569c8497f-nhjcs\" (UID: \"d9c2d514-3bdc-4969-a429-0aac820c8e77\") " pod="openstack/neutron-5569c8497f-nhjcs" Jan 28 21:00:19 crc kubenswrapper[4746]: I0128 21:00:19.554696 4746 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"httpd-config\" (UniqueName: \"kubernetes.io/secret/d9c2d514-3bdc-4969-a429-0aac820c8e77-httpd-config\") pod \"neutron-5569c8497f-nhjcs\" (UID: \"d9c2d514-3bdc-4969-a429-0aac820c8e77\") " pod="openstack/neutron-5569c8497f-nhjcs" Jan 28 21:00:19 crc kubenswrapper[4746]: I0128 21:00:19.574737 4746 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/d9c2d514-3bdc-4969-a429-0aac820c8e77-public-tls-certs\") pod \"neutron-5569c8497f-nhjcs\" (UID: 
\"d9c2d514-3bdc-4969-a429-0aac820c8e77\") " pod="openstack/neutron-5569c8497f-nhjcs" Jan 28 21:00:19 crc kubenswrapper[4746]: I0128 21:00:19.575034 4746 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-vzvpk\" (UniqueName: \"kubernetes.io/projected/d9c2d514-3bdc-4969-a429-0aac820c8e77-kube-api-access-vzvpk\") pod \"neutron-5569c8497f-nhjcs\" (UID: \"d9c2d514-3bdc-4969-a429-0aac820c8e77\") " pod="openstack/neutron-5569c8497f-nhjcs" Jan 28 21:00:19 crc kubenswrapper[4746]: I0128 21:00:19.588001 4746 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/d9c2d514-3bdc-4969-a429-0aac820c8e77-internal-tls-certs\") pod \"neutron-5569c8497f-nhjcs\" (UID: \"d9c2d514-3bdc-4969-a429-0aac820c8e77\") " pod="openstack/neutron-5569c8497f-nhjcs" Jan 28 21:00:19 crc kubenswrapper[4746]: I0128 21:00:19.691026 4746 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/neutron-5569c8497f-nhjcs" Jan 28 21:00:20 crc kubenswrapper[4746]: I0128 21:00:20.102271 4746 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/glance-default-external-api-0" Jan 28 21:00:20 crc kubenswrapper[4746]: I0128 21:00:20.385455 4746 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/glance-default-internal-api-0" Jan 28 21:00:20 crc kubenswrapper[4746]: I0128 21:00:20.407134 4746 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/glance-default-internal-api-0" Jan 28 21:00:20 crc kubenswrapper[4746]: I0128 21:00:20.457874 4746 generic.go:334] "Generic (PLEG): container finished" podID="40f7281e-2e3c-4ce9-8b0f-876312390c0b" containerID="7b0a672bdbe84c0ab3283f2f4f2a082d7f780e806e7442f059d9e6da34f9ef64" exitCode=0 Jan 28 21:00:20 crc kubenswrapper[4746]: I0128 21:00:20.458626 4746 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-66b8c6d9b5-z7qnr" 
event={"ID":"40f7281e-2e3c-4ce9-8b0f-876312390c0b","Type":"ContainerDied","Data":"7b0a672bdbe84c0ab3283f2f4f2a082d7f780e806e7442f059d9e6da34f9ef64"} Jan 28 21:00:20 crc kubenswrapper[4746]: I0128 21:00:20.458698 4746 prober_manager.go:312] "Failed to trigger a manual run" probe="Readiness" Jan 28 21:00:20 crc kubenswrapper[4746]: I0128 21:00:20.933691 4746 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/glance-default-external-api-0" Jan 28 21:00:21 crc kubenswrapper[4746]: I0128 21:00:21.342216 4746 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/neutron-66b8c6d9b5-z7qnr" podUID="40f7281e-2e3c-4ce9-8b0f-876312390c0b" containerName="neutron-httpd" probeResult="failure" output="Get \"https://10.217.0.173:9696/\": dial tcp 10.217.0.173:9696: connect: connection refused" Jan 28 21:00:21 crc kubenswrapper[4746]: I0128 21:00:21.472311 4746 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-b895b5785-8cmsg" event={"ID":"24a8ff3b-2ce2-4603-a1c0-76ca3fe3a470","Type":"ContainerStarted","Data":"806ba0ccdd5f45bb4eaaadeea0485a2468a66ceba22b7e621b29d8b8392c04f5"} Jan 28 21:00:21 crc kubenswrapper[4746]: I0128 21:00:21.485991 4746 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/barbican-api-79599f5dcd-btgz7"] Jan 28 21:00:21 crc kubenswrapper[4746]: I0128 21:00:21.487825 4746 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/barbican-api-79599f5dcd-btgz7" Jan 28 21:00:21 crc kubenswrapper[4746]: I0128 21:00:21.493806 4746 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-barbican-public-svc" Jan 28 21:00:21 crc kubenswrapper[4746]: I0128 21:00:21.494037 4746 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-barbican-internal-svc" Jan 28 21:00:21 crc kubenswrapper[4746]: I0128 21:00:21.514243 4746 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/barbican-api-79599f5dcd-btgz7"] Jan 28 21:00:21 crc kubenswrapper[4746]: I0128 21:00:21.577164 4746 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/8fa661e3-776e-42b0-83db-374d372232ad-config-data\") pod \"barbican-api-79599f5dcd-btgz7\" (UID: \"8fa661e3-776e-42b0-83db-374d372232ad\") " pod="openstack/barbican-api-79599f5dcd-btgz7" Jan 28 21:00:21 crc kubenswrapper[4746]: I0128 21:00:21.577215 4746 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/8fa661e3-776e-42b0-83db-374d372232ad-config-data-custom\") pod \"barbican-api-79599f5dcd-btgz7\" (UID: \"8fa661e3-776e-42b0-83db-374d372232ad\") " pod="openstack/barbican-api-79599f5dcd-btgz7" Jan 28 21:00:21 crc kubenswrapper[4746]: I0128 21:00:21.577245 4746 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/8fa661e3-776e-42b0-83db-374d372232ad-internal-tls-certs\") pod \"barbican-api-79599f5dcd-btgz7\" (UID: \"8fa661e3-776e-42b0-83db-374d372232ad\") " pod="openstack/barbican-api-79599f5dcd-btgz7" Jan 28 21:00:21 crc kubenswrapper[4746]: I0128 21:00:21.577266 4746 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"public-tls-certs\" 
(UniqueName: \"kubernetes.io/secret/8fa661e3-776e-42b0-83db-374d372232ad-public-tls-certs\") pod \"barbican-api-79599f5dcd-btgz7\" (UID: \"8fa661e3-776e-42b0-83db-374d372232ad\") " pod="openstack/barbican-api-79599f5dcd-btgz7" Jan 28 21:00:21 crc kubenswrapper[4746]: I0128 21:00:21.577294 4746 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/8fa661e3-776e-42b0-83db-374d372232ad-combined-ca-bundle\") pod \"barbican-api-79599f5dcd-btgz7\" (UID: \"8fa661e3-776e-42b0-83db-374d372232ad\") " pod="openstack/barbican-api-79599f5dcd-btgz7" Jan 28 21:00:21 crc kubenswrapper[4746]: I0128 21:00:21.577376 4746 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/8fa661e3-776e-42b0-83db-374d372232ad-logs\") pod \"barbican-api-79599f5dcd-btgz7\" (UID: \"8fa661e3-776e-42b0-83db-374d372232ad\") " pod="openstack/barbican-api-79599f5dcd-btgz7" Jan 28 21:00:21 crc kubenswrapper[4746]: I0128 21:00:21.577433 4746 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-qrd29\" (UniqueName: \"kubernetes.io/projected/8fa661e3-776e-42b0-83db-374d372232ad-kube-api-access-qrd29\") pod \"barbican-api-79599f5dcd-btgz7\" (UID: \"8fa661e3-776e-42b0-83db-374d372232ad\") " pod="openstack/barbican-api-79599f5dcd-btgz7" Jan 28 21:00:21 crc kubenswrapper[4746]: I0128 21:00:21.680589 4746 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/8fa661e3-776e-42b0-83db-374d372232ad-logs\") pod \"barbican-api-79599f5dcd-btgz7\" (UID: \"8fa661e3-776e-42b0-83db-374d372232ad\") " pod="openstack/barbican-api-79599f5dcd-btgz7" Jan 28 21:00:21 crc kubenswrapper[4746]: I0128 21:00:21.680746 4746 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-qrd29\" 
(UniqueName: \"kubernetes.io/projected/8fa661e3-776e-42b0-83db-374d372232ad-kube-api-access-qrd29\") pod \"barbican-api-79599f5dcd-btgz7\" (UID: \"8fa661e3-776e-42b0-83db-374d372232ad\") " pod="openstack/barbican-api-79599f5dcd-btgz7" Jan 28 21:00:21 crc kubenswrapper[4746]: I0128 21:00:21.680823 4746 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/8fa661e3-776e-42b0-83db-374d372232ad-config-data\") pod \"barbican-api-79599f5dcd-btgz7\" (UID: \"8fa661e3-776e-42b0-83db-374d372232ad\") " pod="openstack/barbican-api-79599f5dcd-btgz7" Jan 28 21:00:21 crc kubenswrapper[4746]: I0128 21:00:21.680861 4746 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/8fa661e3-776e-42b0-83db-374d372232ad-config-data-custom\") pod \"barbican-api-79599f5dcd-btgz7\" (UID: \"8fa661e3-776e-42b0-83db-374d372232ad\") " pod="openstack/barbican-api-79599f5dcd-btgz7" Jan 28 21:00:21 crc kubenswrapper[4746]: I0128 21:00:21.680892 4746 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/8fa661e3-776e-42b0-83db-374d372232ad-internal-tls-certs\") pod \"barbican-api-79599f5dcd-btgz7\" (UID: \"8fa661e3-776e-42b0-83db-374d372232ad\") " pod="openstack/barbican-api-79599f5dcd-btgz7" Jan 28 21:00:21 crc kubenswrapper[4746]: I0128 21:00:21.680921 4746 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/8fa661e3-776e-42b0-83db-374d372232ad-public-tls-certs\") pod \"barbican-api-79599f5dcd-btgz7\" (UID: \"8fa661e3-776e-42b0-83db-374d372232ad\") " pod="openstack/barbican-api-79599f5dcd-btgz7" Jan 28 21:00:21 crc kubenswrapper[4746]: I0128 21:00:21.680953 4746 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: 
\"kubernetes.io/secret/8fa661e3-776e-42b0-83db-374d372232ad-combined-ca-bundle\") pod \"barbican-api-79599f5dcd-btgz7\" (UID: \"8fa661e3-776e-42b0-83db-374d372232ad\") " pod="openstack/barbican-api-79599f5dcd-btgz7" Jan 28 21:00:21 crc kubenswrapper[4746]: I0128 21:00:21.682276 4746 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/8fa661e3-776e-42b0-83db-374d372232ad-logs\") pod \"barbican-api-79599f5dcd-btgz7\" (UID: \"8fa661e3-776e-42b0-83db-374d372232ad\") " pod="openstack/barbican-api-79599f5dcd-btgz7" Jan 28 21:00:21 crc kubenswrapper[4746]: I0128 21:00:21.691361 4746 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/8fa661e3-776e-42b0-83db-374d372232ad-config-data\") pod \"barbican-api-79599f5dcd-btgz7\" (UID: \"8fa661e3-776e-42b0-83db-374d372232ad\") " pod="openstack/barbican-api-79599f5dcd-btgz7" Jan 28 21:00:21 crc kubenswrapper[4746]: I0128 21:00:21.694600 4746 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/8fa661e3-776e-42b0-83db-374d372232ad-public-tls-certs\") pod \"barbican-api-79599f5dcd-btgz7\" (UID: \"8fa661e3-776e-42b0-83db-374d372232ad\") " pod="openstack/barbican-api-79599f5dcd-btgz7" Jan 28 21:00:21 crc kubenswrapper[4746]: I0128 21:00:21.695025 4746 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/8fa661e3-776e-42b0-83db-374d372232ad-internal-tls-certs\") pod \"barbican-api-79599f5dcd-btgz7\" (UID: \"8fa661e3-776e-42b0-83db-374d372232ad\") " pod="openstack/barbican-api-79599f5dcd-btgz7" Jan 28 21:00:21 crc kubenswrapper[4746]: I0128 21:00:21.695453 4746 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/8fa661e3-776e-42b0-83db-374d372232ad-config-data-custom\") pod 
\"barbican-api-79599f5dcd-btgz7\" (UID: \"8fa661e3-776e-42b0-83db-374d372232ad\") " pod="openstack/barbican-api-79599f5dcd-btgz7" Jan 28 21:00:21 crc kubenswrapper[4746]: I0128 21:00:21.703609 4746 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/8fa661e3-776e-42b0-83db-374d372232ad-combined-ca-bundle\") pod \"barbican-api-79599f5dcd-btgz7\" (UID: \"8fa661e3-776e-42b0-83db-374d372232ad\") " pod="openstack/barbican-api-79599f5dcd-btgz7" Jan 28 21:00:21 crc kubenswrapper[4746]: I0128 21:00:21.720661 4746 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-qrd29\" (UniqueName: \"kubernetes.io/projected/8fa661e3-776e-42b0-83db-374d372232ad-kube-api-access-qrd29\") pod \"barbican-api-79599f5dcd-btgz7\" (UID: \"8fa661e3-776e-42b0-83db-374d372232ad\") " pod="openstack/barbican-api-79599f5dcd-btgz7" Jan 28 21:00:21 crc kubenswrapper[4746]: I0128 21:00:21.822263 4746 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/barbican-api-79599f5dcd-btgz7" Jan 28 21:00:22 crc kubenswrapper[4746]: I0128 21:00:22.176191 4746 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/root-account-create-update-55g9q"] Jan 28 21:00:22 crc kubenswrapper[4746]: I0128 21:00:22.177571 4746 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/root-account-create-update-55g9q" Jan 28 21:00:22 crc kubenswrapper[4746]: I0128 21:00:22.181402 4746 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-cell1-mariadb-root-db-secret" Jan 28 21:00:22 crc kubenswrapper[4746]: I0128 21:00:22.187932 4746 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/root-account-create-update-55g9q"] Jan 28 21:00:22 crc kubenswrapper[4746]: I0128 21:00:22.294616 4746 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-9rzkk\" (UniqueName: \"kubernetes.io/projected/766b5979-538e-4a54-a1b5-3351e3988f70-kube-api-access-9rzkk\") pod \"root-account-create-update-55g9q\" (UID: \"766b5979-538e-4a54-a1b5-3351e3988f70\") " pod="openstack/root-account-create-update-55g9q" Jan 28 21:00:22 crc kubenswrapper[4746]: I0128 21:00:22.294798 4746 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/766b5979-538e-4a54-a1b5-3351e3988f70-operator-scripts\") pod \"root-account-create-update-55g9q\" (UID: \"766b5979-538e-4a54-a1b5-3351e3988f70\") " pod="openstack/root-account-create-update-55g9q" Jan 28 21:00:22 crc kubenswrapper[4746]: I0128 21:00:22.396232 4746 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/766b5979-538e-4a54-a1b5-3351e3988f70-operator-scripts\") pod \"root-account-create-update-55g9q\" (UID: \"766b5979-538e-4a54-a1b5-3351e3988f70\") " pod="openstack/root-account-create-update-55g9q" Jan 28 21:00:22 crc kubenswrapper[4746]: I0128 21:00:22.396324 4746 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-9rzkk\" (UniqueName: \"kubernetes.io/projected/766b5979-538e-4a54-a1b5-3351e3988f70-kube-api-access-9rzkk\") pod \"root-account-create-update-55g9q\" (UID: 
\"766b5979-538e-4a54-a1b5-3351e3988f70\") " pod="openstack/root-account-create-update-55g9q" Jan 28 21:00:22 crc kubenswrapper[4746]: I0128 21:00:22.397650 4746 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/766b5979-538e-4a54-a1b5-3351e3988f70-operator-scripts\") pod \"root-account-create-update-55g9q\" (UID: \"766b5979-538e-4a54-a1b5-3351e3988f70\") " pod="openstack/root-account-create-update-55g9q" Jan 28 21:00:22 crc kubenswrapper[4746]: I0128 21:00:22.424670 4746 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-9rzkk\" (UniqueName: \"kubernetes.io/projected/766b5979-538e-4a54-a1b5-3351e3988f70-kube-api-access-9rzkk\") pod \"root-account-create-update-55g9q\" (UID: \"766b5979-538e-4a54-a1b5-3351e3988f70\") " pod="openstack/root-account-create-update-55g9q" Jan 28 21:00:22 crc kubenswrapper[4746]: I0128 21:00:22.513396 4746 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/root-account-create-update-55g9q" Jan 28 21:00:23 crc kubenswrapper[4746]: I0128 21:00:23.397892 4746 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/cloudkitty-db-sync-p9ghn" Jan 28 21:00:23 crc kubenswrapper[4746]: I0128 21:00:23.443360 4746 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/cce95230-2b72-4598-9d28-3a1465803567-scripts\") pod \"cce95230-2b72-4598-9d28-3a1465803567\" (UID: \"cce95230-2b72-4598-9d28-3a1465803567\") " Jan 28 21:00:23 crc kubenswrapper[4746]: I0128 21:00:23.443608 4746 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/cce95230-2b72-4598-9d28-3a1465803567-config-data\") pod \"cce95230-2b72-4598-9d28-3a1465803567\" (UID: \"cce95230-2b72-4598-9d28-3a1465803567\") " Jan 28 21:00:23 crc kubenswrapper[4746]: I0128 21:00:23.443644 4746 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"certs\" (UniqueName: \"kubernetes.io/projected/cce95230-2b72-4598-9d28-3a1465803567-certs\") pod \"cce95230-2b72-4598-9d28-3a1465803567\" (UID: \"cce95230-2b72-4598-9d28-3a1465803567\") " Jan 28 21:00:23 crc kubenswrapper[4746]: I0128 21:00:23.443680 4746 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/cce95230-2b72-4598-9d28-3a1465803567-combined-ca-bundle\") pod \"cce95230-2b72-4598-9d28-3a1465803567\" (UID: \"cce95230-2b72-4598-9d28-3a1465803567\") " Jan 28 21:00:23 crc kubenswrapper[4746]: I0128 21:00:23.443785 4746 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-bm2k7\" (UniqueName: \"kubernetes.io/projected/cce95230-2b72-4598-9d28-3a1465803567-kube-api-access-bm2k7\") pod \"cce95230-2b72-4598-9d28-3a1465803567\" (UID: \"cce95230-2b72-4598-9d28-3a1465803567\") " Jan 28 21:00:23 crc kubenswrapper[4746]: I0128 21:00:23.447754 4746 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume 
"kubernetes.io/projected/cce95230-2b72-4598-9d28-3a1465803567-kube-api-access-bm2k7" (OuterVolumeSpecName: "kube-api-access-bm2k7") pod "cce95230-2b72-4598-9d28-3a1465803567" (UID: "cce95230-2b72-4598-9d28-3a1465803567"). InnerVolumeSpecName "kube-api-access-bm2k7". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 28 21:00:23 crc kubenswrapper[4746]: I0128 21:00:23.451334 4746 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/cce95230-2b72-4598-9d28-3a1465803567-certs" (OuterVolumeSpecName: "certs") pod "cce95230-2b72-4598-9d28-3a1465803567" (UID: "cce95230-2b72-4598-9d28-3a1465803567"). InnerVolumeSpecName "certs". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 28 21:00:23 crc kubenswrapper[4746]: I0128 21:00:23.451615 4746 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/cce95230-2b72-4598-9d28-3a1465803567-scripts" (OuterVolumeSpecName: "scripts") pod "cce95230-2b72-4598-9d28-3a1465803567" (UID: "cce95230-2b72-4598-9d28-3a1465803567"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 28 21:00:23 crc kubenswrapper[4746]: I0128 21:00:23.482802 4746 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/cce95230-2b72-4598-9d28-3a1465803567-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "cce95230-2b72-4598-9d28-3a1465803567" (UID: "cce95230-2b72-4598-9d28-3a1465803567"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 28 21:00:23 crc kubenswrapper[4746]: I0128 21:00:23.482897 4746 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/cce95230-2b72-4598-9d28-3a1465803567-config-data" (OuterVolumeSpecName: "config-data") pod "cce95230-2b72-4598-9d28-3a1465803567" (UID: "cce95230-2b72-4598-9d28-3a1465803567"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 28 21:00:23 crc kubenswrapper[4746]: I0128 21:00:23.504662 4746 generic.go:334] "Generic (PLEG): container finished" podID="40f7281e-2e3c-4ce9-8b0f-876312390c0b" containerID="f4d3b5e2216f8a4fe2a3ab22e9e3ac51f770c2bc3e05d57ffd1bc03c025cbfbb" exitCode=0 Jan 28 21:00:23 crc kubenswrapper[4746]: I0128 21:00:23.504745 4746 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-66b8c6d9b5-z7qnr" event={"ID":"40f7281e-2e3c-4ce9-8b0f-876312390c0b","Type":"ContainerDied","Data":"f4d3b5e2216f8a4fe2a3ab22e9e3ac51f770c2bc3e05d57ffd1bc03c025cbfbb"} Jan 28 21:00:23 crc kubenswrapper[4746]: I0128 21:00:23.507128 4746 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cloudkitty-db-sync-p9ghn" event={"ID":"cce95230-2b72-4598-9d28-3a1465803567","Type":"ContainerDied","Data":"9a71935b892fd6120995b5eb42c6d3debef679a8c19bebe2074b3ddd0243700e"} Jan 28 21:00:23 crc kubenswrapper[4746]: I0128 21:00:23.507160 4746 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="9a71935b892fd6120995b5eb42c6d3debef679a8c19bebe2074b3ddd0243700e" Jan 28 21:00:23 crc kubenswrapper[4746]: I0128 21:00:23.507223 4746 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/cloudkitty-db-sync-p9ghn" Jan 28 21:00:23 crc kubenswrapper[4746]: I0128 21:00:23.511108 4746 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-scheduler-0" event={"ID":"726e5b20-4725-4c19-9dac-42b68d0e181a","Type":"ContainerStarted","Data":"80afe71102801c364cfc9116319912bdce369eaab4832c5814a331fb4bc491fe"} Jan 28 21:00:23 crc kubenswrapper[4746]: I0128 21:00:23.546247 4746 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-bm2k7\" (UniqueName: \"kubernetes.io/projected/cce95230-2b72-4598-9d28-3a1465803567-kube-api-access-bm2k7\") on node \"crc\" DevicePath \"\"" Jan 28 21:00:23 crc kubenswrapper[4746]: I0128 21:00:23.546283 4746 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/cce95230-2b72-4598-9d28-3a1465803567-scripts\") on node \"crc\" DevicePath \"\"" Jan 28 21:00:23 crc kubenswrapper[4746]: I0128 21:00:23.546292 4746 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/cce95230-2b72-4598-9d28-3a1465803567-config-data\") on node \"crc\" DevicePath \"\"" Jan 28 21:00:23 crc kubenswrapper[4746]: I0128 21:00:23.546300 4746 reconciler_common.go:293] "Volume detached for volume \"certs\" (UniqueName: \"kubernetes.io/projected/cce95230-2b72-4598-9d28-3a1465803567-certs\") on node \"crc\" DevicePath \"\"" Jan 28 21:00:23 crc kubenswrapper[4746]: I0128 21:00:23.546308 4746 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/cce95230-2b72-4598-9d28-3a1465803567-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Jan 28 21:00:24 crc kubenswrapper[4746]: I0128 21:00:24.522320 4746 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/cloudkitty-storageinit-lrs2d"] Jan 28 21:00:24 crc kubenswrapper[4746]: E0128 21:00:24.523138 4746 cpu_manager.go:410] "RemoveStaleState: removing container" 
podUID="cce95230-2b72-4598-9d28-3a1465803567" containerName="cloudkitty-db-sync" Jan 28 21:00:24 crc kubenswrapper[4746]: I0128 21:00:24.523159 4746 state_mem.go:107] "Deleted CPUSet assignment" podUID="cce95230-2b72-4598-9d28-3a1465803567" containerName="cloudkitty-db-sync" Jan 28 21:00:24 crc kubenswrapper[4746]: I0128 21:00:24.523423 4746 memory_manager.go:354] "RemoveStaleState removing state" podUID="cce95230-2b72-4598-9d28-3a1465803567" containerName="cloudkitty-db-sync" Jan 28 21:00:24 crc kubenswrapper[4746]: I0128 21:00:24.524272 4746 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/cloudkitty-storageinit-lrs2d" Jan 28 21:00:24 crc kubenswrapper[4746]: I0128 21:00:24.526356 4746 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"osp-secret" Jan 28 21:00:24 crc kubenswrapper[4746]: I0128 21:00:24.526764 4746 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cloudkitty-config-data" Jan 28 21:00:24 crc kubenswrapper[4746]: I0128 21:00:24.526912 4746 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-cloudkitty-client-internal" Jan 28 21:00:24 crc kubenswrapper[4746]: I0128 21:00:24.542678 4746 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cloudkitty-cloudkitty-dockercfg-bjgvh" Jan 28 21:00:24 crc kubenswrapper[4746]: I0128 21:00:24.543219 4746 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cloudkitty-scripts" Jan 28 21:00:24 crc kubenswrapper[4746]: I0128 21:00:24.545104 4746 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cloudkitty-storageinit-lrs2d"] Jan 28 21:00:24 crc kubenswrapper[4746]: I0128 21:00:24.598979 4746 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/3252e1be-6e47-4264-a4d2-ba4099c9f3c0-scripts\") pod \"cloudkitty-storageinit-lrs2d\" (UID: 
\"3252e1be-6e47-4264-a4d2-ba4099c9f3c0\") " pod="openstack/cloudkitty-storageinit-lrs2d" Jan 28 21:00:24 crc kubenswrapper[4746]: I0128 21:00:24.599157 4746 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-l27gx\" (UniqueName: \"kubernetes.io/projected/3252e1be-6e47-4264-a4d2-ba4099c9f3c0-kube-api-access-l27gx\") pod \"cloudkitty-storageinit-lrs2d\" (UID: \"3252e1be-6e47-4264-a4d2-ba4099c9f3c0\") " pod="openstack/cloudkitty-storageinit-lrs2d" Jan 28 21:00:24 crc kubenswrapper[4746]: I0128 21:00:24.599383 4746 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/3252e1be-6e47-4264-a4d2-ba4099c9f3c0-config-data\") pod \"cloudkitty-storageinit-lrs2d\" (UID: \"3252e1be-6e47-4264-a4d2-ba4099c9f3c0\") " pod="openstack/cloudkitty-storageinit-lrs2d" Jan 28 21:00:24 crc kubenswrapper[4746]: I0128 21:00:24.599529 4746 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"certs\" (UniqueName: \"kubernetes.io/projected/3252e1be-6e47-4264-a4d2-ba4099c9f3c0-certs\") pod \"cloudkitty-storageinit-lrs2d\" (UID: \"3252e1be-6e47-4264-a4d2-ba4099c9f3c0\") " pod="openstack/cloudkitty-storageinit-lrs2d" Jan 28 21:00:24 crc kubenswrapper[4746]: I0128 21:00:24.599570 4746 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/3252e1be-6e47-4264-a4d2-ba4099c9f3c0-combined-ca-bundle\") pod \"cloudkitty-storageinit-lrs2d\" (UID: \"3252e1be-6e47-4264-a4d2-ba4099c9f3c0\") " pod="openstack/cloudkitty-storageinit-lrs2d" Jan 28 21:00:24 crc kubenswrapper[4746]: I0128 21:00:24.709409 4746 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-l27gx\" (UniqueName: \"kubernetes.io/projected/3252e1be-6e47-4264-a4d2-ba4099c9f3c0-kube-api-access-l27gx\") pod 
\"cloudkitty-storageinit-lrs2d\" (UID: \"3252e1be-6e47-4264-a4d2-ba4099c9f3c0\") " pod="openstack/cloudkitty-storageinit-lrs2d" Jan 28 21:00:24 crc kubenswrapper[4746]: I0128 21:00:24.709539 4746 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/3252e1be-6e47-4264-a4d2-ba4099c9f3c0-config-data\") pod \"cloudkitty-storageinit-lrs2d\" (UID: \"3252e1be-6e47-4264-a4d2-ba4099c9f3c0\") " pod="openstack/cloudkitty-storageinit-lrs2d" Jan 28 21:00:24 crc kubenswrapper[4746]: I0128 21:00:24.709614 4746 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"certs\" (UniqueName: \"kubernetes.io/projected/3252e1be-6e47-4264-a4d2-ba4099c9f3c0-certs\") pod \"cloudkitty-storageinit-lrs2d\" (UID: \"3252e1be-6e47-4264-a4d2-ba4099c9f3c0\") " pod="openstack/cloudkitty-storageinit-lrs2d" Jan 28 21:00:24 crc kubenswrapper[4746]: I0128 21:00:24.709650 4746 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/3252e1be-6e47-4264-a4d2-ba4099c9f3c0-combined-ca-bundle\") pod \"cloudkitty-storageinit-lrs2d\" (UID: \"3252e1be-6e47-4264-a4d2-ba4099c9f3c0\") " pod="openstack/cloudkitty-storageinit-lrs2d" Jan 28 21:00:24 crc kubenswrapper[4746]: I0128 21:00:24.709757 4746 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/3252e1be-6e47-4264-a4d2-ba4099c9f3c0-scripts\") pod \"cloudkitty-storageinit-lrs2d\" (UID: \"3252e1be-6e47-4264-a4d2-ba4099c9f3c0\") " pod="openstack/cloudkitty-storageinit-lrs2d" Jan 28 21:00:24 crc kubenswrapper[4746]: I0128 21:00:24.715882 4746 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/3252e1be-6e47-4264-a4d2-ba4099c9f3c0-scripts\") pod \"cloudkitty-storageinit-lrs2d\" (UID: \"3252e1be-6e47-4264-a4d2-ba4099c9f3c0\") " 
pod="openstack/cloudkitty-storageinit-lrs2d" Jan 28 21:00:24 crc kubenswrapper[4746]: I0128 21:00:24.722257 4746 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/3252e1be-6e47-4264-a4d2-ba4099c9f3c0-combined-ca-bundle\") pod \"cloudkitty-storageinit-lrs2d\" (UID: \"3252e1be-6e47-4264-a4d2-ba4099c9f3c0\") " pod="openstack/cloudkitty-storageinit-lrs2d" Jan 28 21:00:24 crc kubenswrapper[4746]: I0128 21:00:24.726757 4746 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"certs\" (UniqueName: \"kubernetes.io/projected/3252e1be-6e47-4264-a4d2-ba4099c9f3c0-certs\") pod \"cloudkitty-storageinit-lrs2d\" (UID: \"3252e1be-6e47-4264-a4d2-ba4099c9f3c0\") " pod="openstack/cloudkitty-storageinit-lrs2d" Jan 28 21:00:24 crc kubenswrapper[4746]: I0128 21:00:24.730158 4746 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/3252e1be-6e47-4264-a4d2-ba4099c9f3c0-config-data\") pod \"cloudkitty-storageinit-lrs2d\" (UID: \"3252e1be-6e47-4264-a4d2-ba4099c9f3c0\") " pod="openstack/cloudkitty-storageinit-lrs2d" Jan 28 21:00:24 crc kubenswrapper[4746]: I0128 21:00:24.771470 4746 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/cinder-api-0"] Jan 28 21:00:24 crc kubenswrapper[4746]: I0128 21:00:24.781336 4746 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-l27gx\" (UniqueName: \"kubernetes.io/projected/3252e1be-6e47-4264-a4d2-ba4099c9f3c0-kube-api-access-l27gx\") pod \"cloudkitty-storageinit-lrs2d\" (UID: \"3252e1be-6e47-4264-a4d2-ba4099c9f3c0\") " pod="openstack/cloudkitty-storageinit-lrs2d" Jan 28 21:00:24 crc kubenswrapper[4746]: I0128 21:00:24.892862 4746 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/cloudkitty-storageinit-lrs2d" Jan 28 21:00:24 crc kubenswrapper[4746]: I0128 21:00:24.899494 4746 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/neutron-66b8c6d9b5-z7qnr" Jan 28 21:00:24 crc kubenswrapper[4746]: I0128 21:00:24.934616 4746 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/neutron-5569c8497f-nhjcs"] Jan 28 21:00:24 crc kubenswrapper[4746]: W0128 21:00:24.960503 4746 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podd9c2d514_3bdc_4969_a429_0aac820c8e77.slice/crio-161c908c404b0a287d8839777ba906e38cdffeeb7a6f74e212104eba2d7662a1 WatchSource:0}: Error finding container 161c908c404b0a287d8839777ba906e38cdffeeb7a6f74e212104eba2d7662a1: Status 404 returned error can't find the container with id 161c908c404b0a287d8839777ba906e38cdffeeb7a6f74e212104eba2d7662a1 Jan 28 21:00:25 crc kubenswrapper[4746]: I0128 21:00:25.017061 4746 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/40f7281e-2e3c-4ce9-8b0f-876312390c0b-combined-ca-bundle\") pod \"40f7281e-2e3c-4ce9-8b0f-876312390c0b\" (UID: \"40f7281e-2e3c-4ce9-8b0f-876312390c0b\") " Jan 28 21:00:25 crc kubenswrapper[4746]: I0128 21:00:25.017653 4746 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovndb-tls-certs\" (UniqueName: \"kubernetes.io/secret/40f7281e-2e3c-4ce9-8b0f-876312390c0b-ovndb-tls-certs\") pod \"40f7281e-2e3c-4ce9-8b0f-876312390c0b\" (UID: \"40f7281e-2e3c-4ce9-8b0f-876312390c0b\") " Jan 28 21:00:25 crc kubenswrapper[4746]: I0128 21:00:25.017726 4746 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/40f7281e-2e3c-4ce9-8b0f-876312390c0b-internal-tls-certs\") pod \"40f7281e-2e3c-4ce9-8b0f-876312390c0b\" (UID: 
\"40f7281e-2e3c-4ce9-8b0f-876312390c0b\") " Jan 28 21:00:25 crc kubenswrapper[4746]: I0128 21:00:25.017835 4746 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/40f7281e-2e3c-4ce9-8b0f-876312390c0b-public-tls-certs\") pod \"40f7281e-2e3c-4ce9-8b0f-876312390c0b\" (UID: \"40f7281e-2e3c-4ce9-8b0f-876312390c0b\") " Jan 28 21:00:25 crc kubenswrapper[4746]: I0128 21:00:25.017867 4746 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-dkznq\" (UniqueName: \"kubernetes.io/projected/40f7281e-2e3c-4ce9-8b0f-876312390c0b-kube-api-access-dkznq\") pod \"40f7281e-2e3c-4ce9-8b0f-876312390c0b\" (UID: \"40f7281e-2e3c-4ce9-8b0f-876312390c0b\") " Jan 28 21:00:25 crc kubenswrapper[4746]: I0128 21:00:25.017907 4746 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"httpd-config\" (UniqueName: \"kubernetes.io/secret/40f7281e-2e3c-4ce9-8b0f-876312390c0b-httpd-config\") pod \"40f7281e-2e3c-4ce9-8b0f-876312390c0b\" (UID: \"40f7281e-2e3c-4ce9-8b0f-876312390c0b\") " Jan 28 21:00:25 crc kubenswrapper[4746]: I0128 21:00:25.018046 4746 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/secret/40f7281e-2e3c-4ce9-8b0f-876312390c0b-config\") pod \"40f7281e-2e3c-4ce9-8b0f-876312390c0b\" (UID: \"40f7281e-2e3c-4ce9-8b0f-876312390c0b\") " Jan 28 21:00:25 crc kubenswrapper[4746]: I0128 21:00:25.037183 4746 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/40f7281e-2e3c-4ce9-8b0f-876312390c0b-kube-api-access-dkznq" (OuterVolumeSpecName: "kube-api-access-dkznq") pod "40f7281e-2e3c-4ce9-8b0f-876312390c0b" (UID: "40f7281e-2e3c-4ce9-8b0f-876312390c0b"). InnerVolumeSpecName "kube-api-access-dkznq". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 28 21:00:25 crc kubenswrapper[4746]: I0128 21:00:25.042196 4746 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/40f7281e-2e3c-4ce9-8b0f-876312390c0b-httpd-config" (OuterVolumeSpecName: "httpd-config") pod "40f7281e-2e3c-4ce9-8b0f-876312390c0b" (UID: "40f7281e-2e3c-4ce9-8b0f-876312390c0b"). InnerVolumeSpecName "httpd-config". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 28 21:00:25 crc kubenswrapper[4746]: I0128 21:00:25.114405 4746 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/40f7281e-2e3c-4ce9-8b0f-876312390c0b-config" (OuterVolumeSpecName: "config") pod "40f7281e-2e3c-4ce9-8b0f-876312390c0b" (UID: "40f7281e-2e3c-4ce9-8b0f-876312390c0b"). InnerVolumeSpecName "config". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 28 21:00:25 crc kubenswrapper[4746]: I0128 21:00:25.116330 4746 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/40f7281e-2e3c-4ce9-8b0f-876312390c0b-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "40f7281e-2e3c-4ce9-8b0f-876312390c0b" (UID: "40f7281e-2e3c-4ce9-8b0f-876312390c0b"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 28 21:00:25 crc kubenswrapper[4746]: I0128 21:00:25.118580 4746 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/40f7281e-2e3c-4ce9-8b0f-876312390c0b-public-tls-certs" (OuterVolumeSpecName: "public-tls-certs") pod "40f7281e-2e3c-4ce9-8b0f-876312390c0b" (UID: "40f7281e-2e3c-4ce9-8b0f-876312390c0b"). InnerVolumeSpecName "public-tls-certs". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 28 21:00:25 crc kubenswrapper[4746]: I0128 21:00:25.126423 4746 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/secret/40f7281e-2e3c-4ce9-8b0f-876312390c0b-config\") on node \"crc\" DevicePath \"\"" Jan 28 21:00:25 crc kubenswrapper[4746]: I0128 21:00:25.128623 4746 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-dkznq\" (UniqueName: \"kubernetes.io/projected/40f7281e-2e3c-4ce9-8b0f-876312390c0b-kube-api-access-dkznq\") on node \"crc\" DevicePath \"\"" Jan 28 21:00:25 crc kubenswrapper[4746]: I0128 21:00:25.128681 4746 reconciler_common.go:293] "Volume detached for volume \"httpd-config\" (UniqueName: \"kubernetes.io/secret/40f7281e-2e3c-4ce9-8b0f-876312390c0b-httpd-config\") on node \"crc\" DevicePath \"\"" Jan 28 21:00:25 crc kubenswrapper[4746]: I0128 21:00:25.145301 4746 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/40f7281e-2e3c-4ce9-8b0f-876312390c0b-internal-tls-certs" (OuterVolumeSpecName: "internal-tls-certs") pod "40f7281e-2e3c-4ce9-8b0f-876312390c0b" (UID: "40f7281e-2e3c-4ce9-8b0f-876312390c0b"). InnerVolumeSpecName "internal-tls-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 28 21:00:25 crc kubenswrapper[4746]: I0128 21:00:25.168300 4746 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/40f7281e-2e3c-4ce9-8b0f-876312390c0b-ovndb-tls-certs" (OuterVolumeSpecName: "ovndb-tls-certs") pod "40f7281e-2e3c-4ce9-8b0f-876312390c0b" (UID: "40f7281e-2e3c-4ce9-8b0f-876312390c0b"). InnerVolumeSpecName "ovndb-tls-certs". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 28 21:00:25 crc kubenswrapper[4746]: I0128 21:00:25.230370 4746 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/40f7281e-2e3c-4ce9-8b0f-876312390c0b-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Jan 28 21:00:25 crc kubenswrapper[4746]: I0128 21:00:25.230405 4746 reconciler_common.go:293] "Volume detached for volume \"ovndb-tls-certs\" (UniqueName: \"kubernetes.io/secret/40f7281e-2e3c-4ce9-8b0f-876312390c0b-ovndb-tls-certs\") on node \"crc\" DevicePath \"\"" Jan 28 21:00:25 crc kubenswrapper[4746]: I0128 21:00:25.230415 4746 reconciler_common.go:293] "Volume detached for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/40f7281e-2e3c-4ce9-8b0f-876312390c0b-internal-tls-certs\") on node \"crc\" DevicePath \"\"" Jan 28 21:00:25 crc kubenswrapper[4746]: I0128 21:00:25.230423 4746 reconciler_common.go:293] "Volume detached for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/40f7281e-2e3c-4ce9-8b0f-876312390c0b-public-tls-certs\") on node \"crc\" DevicePath \"\"" Jan 28 21:00:25 crc kubenswrapper[4746]: I0128 21:00:25.498175 4746 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/barbican-worker-54d56bfd95-zhg7t"] Jan 28 21:00:25 crc kubenswrapper[4746]: I0128 21:00:25.521558 4746 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/barbican-api-6dcb95875d-qz4xq"] Jan 28 21:00:25 crc kubenswrapper[4746]: I0128 21:00:25.573298 4746 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-5c9776ccc5-7fncg"] Jan 28 21:00:25 crc kubenswrapper[4746]: I0128 21:00:25.606458 4746 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-worker-54d56bfd95-zhg7t" event={"ID":"26db0906-ba06-4d40-b864-c7d956379296","Type":"ContainerStarted","Data":"043b61e5b74cb5bcfb3d6846b982ebaa484a9df9574b8cc2e671d6e883128439"} Jan 28 21:00:25 crc kubenswrapper[4746]: I0128 
21:00:25.613293 4746 generic.go:334] "Generic (PLEG): container finished" podID="24a8ff3b-2ce2-4603-a1c0-76ca3fe3a470" containerID="99e283f1a5c1df2cd476d59196170097684f7435c77ccf62772d830d0b212755" exitCode=0 Jan 28 21:00:25 crc kubenswrapper[4746]: I0128 21:00:25.613367 4746 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-b895b5785-8cmsg" event={"ID":"24a8ff3b-2ce2-4603-a1c0-76ca3fe3a470","Type":"ContainerDied","Data":"99e283f1a5c1df2cd476d59196170097684f7435c77ccf62772d830d0b212755"} Jan 28 21:00:25 crc kubenswrapper[4746]: I0128 21:00:25.628539 4746 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"e9abc9de-4ca9-4ba4-afca-bb6b5bf50924","Type":"ContainerStarted","Data":"bd42c88b4838603ef89398820eb1e912e2df2d39c4c8759aa10077f76b88b731"} Jan 28 21:00:25 crc kubenswrapper[4746]: I0128 21:00:25.628674 4746 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="e9abc9de-4ca9-4ba4-afca-bb6b5bf50924" containerName="ceilometer-central-agent" containerID="cri-o://e3997fad01bd1a3636c4f7b48290559049e0c983129537cc89988b435cd7c658" gracePeriod=30 Jan 28 21:00:25 crc kubenswrapper[4746]: I0128 21:00:25.628721 4746 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="e9abc9de-4ca9-4ba4-afca-bb6b5bf50924" containerName="sg-core" containerID="cri-o://1c384874e5c5af51f0459fdf2f1159ba60ac6c044fcea36d3c6e079ade91f9ad" gracePeriod=30 Jan 28 21:00:25 crc kubenswrapper[4746]: I0128 21:00:25.628761 4746 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="e9abc9de-4ca9-4ba4-afca-bb6b5bf50924" containerName="ceilometer-notification-agent" containerID="cri-o://0662539b92dae318d78b622e435f24e6de96b918733159dd31099925e363ef5b" gracePeriod=30 Jan 28 21:00:25 crc kubenswrapper[4746]: I0128 21:00:25.628706 4746 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" 
status="" pod="openstack/ceilometer-0" Jan 28 21:00:25 crc kubenswrapper[4746]: I0128 21:00:25.628745 4746 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="e9abc9de-4ca9-4ba4-afca-bb6b5bf50924" containerName="proxy-httpd" containerID="cri-o://bd42c88b4838603ef89398820eb1e912e2df2d39c4c8759aa10077f76b88b731" gracePeriod=30 Jan 28 21:00:25 crc kubenswrapper[4746]: I0128 21:00:25.641064 4746 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/barbican-api-79599f5dcd-btgz7"] Jan 28 21:00:25 crc kubenswrapper[4746]: I0128 21:00:25.671506 4746 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-api-6dcb95875d-qz4xq" event={"ID":"ea56fe21-dd12-486e-a4e9-2af7cb7b9387","Type":"ContainerStarted","Data":"e8f82981b2a7185afefdd8be003aac5f283992841076c71eee0af3d9fa599e88"} Jan 28 21:00:25 crc kubenswrapper[4746]: I0128 21:00:25.682925 4746 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-5569c8497f-nhjcs" event={"ID":"d9c2d514-3bdc-4969-a429-0aac820c8e77","Type":"ContainerStarted","Data":"161c908c404b0a287d8839777ba906e38cdffeeb7a6f74e212104eba2d7662a1"} Jan 28 21:00:25 crc kubenswrapper[4746]: I0128 21:00:25.723485 4746 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-5c9776ccc5-7fncg" event={"ID":"d730a144-0107-409b-9abe-1e316143afc9","Type":"ContainerStarted","Data":"2645007e619376b57c4674e24facd7aebcbc59137a1a5137801479b19218508f"} Jan 28 21:00:25 crc kubenswrapper[4746]: I0128 21:00:25.735230 4746 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-66b8c6d9b5-z7qnr" event={"ID":"40f7281e-2e3c-4ce9-8b0f-876312390c0b","Type":"ContainerDied","Data":"61c5f777b1fc0ee7cd1bb33efb0c01a97b06132e1dfad46546e508ed29e9122b"} Jan 28 21:00:25 crc kubenswrapper[4746]: I0128 21:00:25.735406 4746 scope.go:117] "RemoveContainer" containerID="7b0a672bdbe84c0ab3283f2f4f2a082d7f780e806e7442f059d9e6da34f9ef64" Jan 28 21:00:25 crc 
kubenswrapper[4746]: I0128 21:00:25.735512 4746 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/neutron-66b8c6d9b5-z7qnr" Jan 28 21:00:25 crc kubenswrapper[4746]: I0128 21:00:25.736835 4746 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/root-account-create-update-55g9q"] Jan 28 21:00:25 crc kubenswrapper[4746]: I0128 21:00:25.755834 4746 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-api-0" event={"ID":"7fdd9345-765c-4290-beee-f02752b34ee8","Type":"ContainerStarted","Data":"f2bfdb01d2ca9582b2bcf1c28302a70d2b2ffe12aba7bd2a44ebcb319537c712"} Jan 28 21:00:25 crc kubenswrapper[4746]: I0128 21:00:25.763915 4746 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/barbican-keystone-listener-74d8954788-pqmtp"] Jan 28 21:00:25 crc kubenswrapper[4746]: I0128 21:00:25.769717 4746 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/ceilometer-0" podStartSLOduration=4.772591594 podStartE2EDuration="1m10.769695709s" podCreationTimestamp="2026-01-28 20:59:15 +0000 UTC" firstStartedPulling="2026-01-28 20:59:18.193284432 +0000 UTC m=+1186.149470786" lastFinishedPulling="2026-01-28 21:00:24.190388547 +0000 UTC m=+1252.146574901" observedRunningTime="2026-01-28 21:00:25.705981275 +0000 UTC m=+1253.662167629" watchObservedRunningTime="2026-01-28 21:00:25.769695709 +0000 UTC m=+1253.725882063" Jan 28 21:00:25 crc kubenswrapper[4746]: I0128 21:00:25.822037 4746 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cloudkitty-storageinit-lrs2d"] Jan 28 21:00:25 crc kubenswrapper[4746]: I0128 21:00:25.845822 4746 scope.go:117] "RemoveContainer" containerID="f4d3b5e2216f8a4fe2a3ab22e9e3ac51f770c2bc3e05d57ffd1bc03c025cbfbb" Jan 28 21:00:25 crc kubenswrapper[4746]: I0128 21:00:25.845962 4746 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/neutron-66b8c6d9b5-z7qnr"] Jan 28 21:00:25 crc kubenswrapper[4746]: I0128 21:00:25.853799 4746 
kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/neutron-66b8c6d9b5-z7qnr"] Jan 28 21:00:25 crc kubenswrapper[4746]: W0128 21:00:25.973319 4746 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod3252e1be_6e47_4264_a4d2_ba4099c9f3c0.slice/crio-502fae4e88115565750253ab6f9988cb004981959d59d2ba640fd0ef3586df94 WatchSource:0}: Error finding container 502fae4e88115565750253ab6f9988cb004981959d59d2ba640fd0ef3586df94: Status 404 returned error can't find the container with id 502fae4e88115565750253ab6f9988cb004981959d59d2ba640fd0ef3586df94 Jan 28 21:00:26 crc kubenswrapper[4746]: I0128 21:00:26.238357 4746 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-b895b5785-8cmsg" Jan 28 21:00:26 crc kubenswrapper[4746]: I0128 21:00:26.352858 4746 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/24a8ff3b-2ce2-4603-a1c0-76ca3fe3a470-dns-swift-storage-0\") pod \"24a8ff3b-2ce2-4603-a1c0-76ca3fe3a470\" (UID: \"24a8ff3b-2ce2-4603-a1c0-76ca3fe3a470\") " Jan 28 21:00:26 crc kubenswrapper[4746]: I0128 21:00:26.352925 4746 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-t5b2v\" (UniqueName: \"kubernetes.io/projected/24a8ff3b-2ce2-4603-a1c0-76ca3fe3a470-kube-api-access-t5b2v\") pod \"24a8ff3b-2ce2-4603-a1c0-76ca3fe3a470\" (UID: \"24a8ff3b-2ce2-4603-a1c0-76ca3fe3a470\") " Jan 28 21:00:26 crc kubenswrapper[4746]: I0128 21:00:26.352997 4746 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/24a8ff3b-2ce2-4603-a1c0-76ca3fe3a470-ovsdbserver-nb\") pod \"24a8ff3b-2ce2-4603-a1c0-76ca3fe3a470\" (UID: \"24a8ff3b-2ce2-4603-a1c0-76ca3fe3a470\") " Jan 28 21:00:26 crc kubenswrapper[4746]: I0128 21:00:26.353192 4746 
reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/24a8ff3b-2ce2-4603-a1c0-76ca3fe3a470-dns-svc\") pod \"24a8ff3b-2ce2-4603-a1c0-76ca3fe3a470\" (UID: \"24a8ff3b-2ce2-4603-a1c0-76ca3fe3a470\") " Jan 28 21:00:26 crc kubenswrapper[4746]: I0128 21:00:26.353255 4746 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/24a8ff3b-2ce2-4603-a1c0-76ca3fe3a470-config\") pod \"24a8ff3b-2ce2-4603-a1c0-76ca3fe3a470\" (UID: \"24a8ff3b-2ce2-4603-a1c0-76ca3fe3a470\") " Jan 28 21:00:26 crc kubenswrapper[4746]: I0128 21:00:26.353321 4746 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/24a8ff3b-2ce2-4603-a1c0-76ca3fe3a470-ovsdbserver-sb\") pod \"24a8ff3b-2ce2-4603-a1c0-76ca3fe3a470\" (UID: \"24a8ff3b-2ce2-4603-a1c0-76ca3fe3a470\") " Jan 28 21:00:26 crc kubenswrapper[4746]: I0128 21:00:26.360994 4746 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/24a8ff3b-2ce2-4603-a1c0-76ca3fe3a470-kube-api-access-t5b2v" (OuterVolumeSpecName: "kube-api-access-t5b2v") pod "24a8ff3b-2ce2-4603-a1c0-76ca3fe3a470" (UID: "24a8ff3b-2ce2-4603-a1c0-76ca3fe3a470"). InnerVolumeSpecName "kube-api-access-t5b2v". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 28 21:00:26 crc kubenswrapper[4746]: I0128 21:00:26.454956 4746 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/24a8ff3b-2ce2-4603-a1c0-76ca3fe3a470-ovsdbserver-sb" (OuterVolumeSpecName: "ovsdbserver-sb") pod "24a8ff3b-2ce2-4603-a1c0-76ca3fe3a470" (UID: "24a8ff3b-2ce2-4603-a1c0-76ca3fe3a470"). InnerVolumeSpecName "ovsdbserver-sb". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 28 21:00:26 crc kubenswrapper[4746]: I0128 21:00:26.490221 4746 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/24a8ff3b-2ce2-4603-a1c0-76ca3fe3a470-ovsdbserver-sb\") on node \"crc\" DevicePath \"\"" Jan 28 21:00:26 crc kubenswrapper[4746]: I0128 21:00:26.490261 4746 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-t5b2v\" (UniqueName: \"kubernetes.io/projected/24a8ff3b-2ce2-4603-a1c0-76ca3fe3a470-kube-api-access-t5b2v\") on node \"crc\" DevicePath \"\"" Jan 28 21:00:26 crc kubenswrapper[4746]: I0128 21:00:26.498723 4746 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/24a8ff3b-2ce2-4603-a1c0-76ca3fe3a470-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "24a8ff3b-2ce2-4603-a1c0-76ca3fe3a470" (UID: "24a8ff3b-2ce2-4603-a1c0-76ca3fe3a470"). InnerVolumeSpecName "dns-svc". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 28 21:00:26 crc kubenswrapper[4746]: I0128 21:00:26.508456 4746 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/24a8ff3b-2ce2-4603-a1c0-76ca3fe3a470-ovsdbserver-nb" (OuterVolumeSpecName: "ovsdbserver-nb") pod "24a8ff3b-2ce2-4603-a1c0-76ca3fe3a470" (UID: "24a8ff3b-2ce2-4603-a1c0-76ca3fe3a470"). InnerVolumeSpecName "ovsdbserver-nb". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 28 21:00:26 crc kubenswrapper[4746]: I0128 21:00:26.556965 4746 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/24a8ff3b-2ce2-4603-a1c0-76ca3fe3a470-dns-swift-storage-0" (OuterVolumeSpecName: "dns-swift-storage-0") pod "24a8ff3b-2ce2-4603-a1c0-76ca3fe3a470" (UID: "24a8ff3b-2ce2-4603-a1c0-76ca3fe3a470"). InnerVolumeSpecName "dns-swift-storage-0". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 28 21:00:26 crc kubenswrapper[4746]: I0128 21:00:26.559309 4746 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/24a8ff3b-2ce2-4603-a1c0-76ca3fe3a470-config" (OuterVolumeSpecName: "config") pod "24a8ff3b-2ce2-4603-a1c0-76ca3fe3a470" (UID: "24a8ff3b-2ce2-4603-a1c0-76ca3fe3a470"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 28 21:00:26 crc kubenswrapper[4746]: I0128 21:00:26.594029 4746 reconciler_common.go:293] "Volume detached for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/24a8ff3b-2ce2-4603-a1c0-76ca3fe3a470-dns-swift-storage-0\") on node \"crc\" DevicePath \"\"" Jan 28 21:00:26 crc kubenswrapper[4746]: I0128 21:00:26.594057 4746 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/24a8ff3b-2ce2-4603-a1c0-76ca3fe3a470-ovsdbserver-nb\") on node \"crc\" DevicePath \"\"" Jan 28 21:00:26 crc kubenswrapper[4746]: I0128 21:00:26.594068 4746 reconciler_common.go:293] "Volume detached for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/24a8ff3b-2ce2-4603-a1c0-76ca3fe3a470-dns-svc\") on node \"crc\" DevicePath \"\"" Jan 28 21:00:26 crc kubenswrapper[4746]: I0128 21:00:26.594087 4746 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/24a8ff3b-2ce2-4603-a1c0-76ca3fe3a470-config\") on node \"crc\" DevicePath \"\"" Jan 28 21:00:26 crc kubenswrapper[4746]: I0128 21:00:26.801754 4746 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-api-6dcb95875d-qz4xq" event={"ID":"ea56fe21-dd12-486e-a4e9-2af7cb7b9387","Type":"ContainerStarted","Data":"c64bfb4b8a459b76abaf202a730fc58370844fe34425ab49b2fb2b2f77e9b7f7"} Jan 28 21:00:26 crc kubenswrapper[4746]: I0128 21:00:26.802145 4746 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" 
pod="openstack/barbican-api-6dcb95875d-qz4xq" Jan 28 21:00:26 crc kubenswrapper[4746]: I0128 21:00:26.802159 4746 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-api-6dcb95875d-qz4xq" event={"ID":"ea56fe21-dd12-486e-a4e9-2af7cb7b9387","Type":"ContainerStarted","Data":"15361ac8791076a3c1cf04c0bd4b99b14ed3f96b51ede463fe6767f6242bbd80"} Jan 28 21:00:26 crc kubenswrapper[4746]: I0128 21:00:26.805256 4746 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-api-79599f5dcd-btgz7" event={"ID":"8fa661e3-776e-42b0-83db-374d372232ad","Type":"ContainerStarted","Data":"81dbb1b5bcb00d01ccbef3c1a2a96d160589ae3c5327bee59d68f7834b0c2552"} Jan 28 21:00:26 crc kubenswrapper[4746]: I0128 21:00:26.805341 4746 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-api-79599f5dcd-btgz7" event={"ID":"8fa661e3-776e-42b0-83db-374d372232ad","Type":"ContainerStarted","Data":"b21da846baedd8e31036d795bd168ef77e297c1588a30f7410f1625e487319c4"} Jan 28 21:00:26 crc kubenswrapper[4746]: I0128 21:00:26.806851 4746 generic.go:334] "Generic (PLEG): container finished" podID="d730a144-0107-409b-9abe-1e316143afc9" containerID="050b9c14f0d8e90528e086d11503acb753c25c68d2fa52f5e415a11ca20961c1" exitCode=0 Jan 28 21:00:26 crc kubenswrapper[4746]: I0128 21:00:26.806924 4746 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-5c9776ccc5-7fncg" event={"ID":"d730a144-0107-409b-9abe-1e316143afc9","Type":"ContainerDied","Data":"050b9c14f0d8e90528e086d11503acb753c25c68d2fa52f5e415a11ca20961c1"} Jan 28 21:00:26 crc kubenswrapper[4746]: I0128 21:00:26.808940 4746 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-keystone-listener-74d8954788-pqmtp" event={"ID":"9d6c401d-18ee-432b-992c-749c69887786","Type":"ContainerStarted","Data":"9df2380e1e74ddc4a722c80640dc5aaf8f1810a908185cc2b4074507ba04e57e"} Jan 28 21:00:26 crc kubenswrapper[4746]: I0128 21:00:26.814209 4746 kubelet.go:2453] "SyncLoop (PLEG): 
event for pod" pod="openstack/cloudkitty-storageinit-lrs2d" event={"ID":"3252e1be-6e47-4264-a4d2-ba4099c9f3c0","Type":"ContainerStarted","Data":"40b2d2ea4815db4c2fec0bd255e8c9e9ad80dbc33d89a14cc2fc53328b9606b3"} Jan 28 21:00:26 crc kubenswrapper[4746]: I0128 21:00:26.814264 4746 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cloudkitty-storageinit-lrs2d" event={"ID":"3252e1be-6e47-4264-a4d2-ba4099c9f3c0","Type":"ContainerStarted","Data":"502fae4e88115565750253ab6f9988cb004981959d59d2ba640fd0ef3586df94"} Jan 28 21:00:26 crc kubenswrapper[4746]: I0128 21:00:26.839190 4746 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/barbican-api-6dcb95875d-qz4xq" podStartSLOduration=11.839171426 podStartE2EDuration="11.839171426s" podCreationTimestamp="2026-01-28 21:00:15 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-28 21:00:26.822331123 +0000 UTC m=+1254.778517477" watchObservedRunningTime="2026-01-28 21:00:26.839171426 +0000 UTC m=+1254.795357800" Jan 28 21:00:26 crc kubenswrapper[4746]: I0128 21:00:26.864057 4746 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="40f7281e-2e3c-4ce9-8b0f-876312390c0b" path="/var/lib/kubelet/pods/40f7281e-2e3c-4ce9-8b0f-876312390c0b/volumes" Jan 28 21:00:26 crc kubenswrapper[4746]: I0128 21:00:26.876975 4746 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/cloudkitty-storageinit-lrs2d" podStartSLOduration=2.876959313 podStartE2EDuration="2.876959313s" podCreationTimestamp="2026-01-28 21:00:24 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-28 21:00:26.842166127 +0000 UTC m=+1254.798352481" watchObservedRunningTime="2026-01-28 21:00:26.876959313 +0000 UTC m=+1254.833145667" Jan 28 21:00:26 crc kubenswrapper[4746]: I0128 21:00:26.891467 4746 
kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-5569c8497f-nhjcs" event={"ID":"d9c2d514-3bdc-4969-a429-0aac820c8e77","Type":"ContainerStarted","Data":"90b70ee05fb9447939fab24fd7f6e04bb5e473b63098f8638d6058ec78cd2722"} Jan 28 21:00:26 crc kubenswrapper[4746]: I0128 21:00:26.891512 4746 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-5569c8497f-nhjcs" event={"ID":"d9c2d514-3bdc-4969-a429-0aac820c8e77","Type":"ContainerStarted","Data":"8f0c5d71bed623448403864c52454f009a5aa52ed5873ba015e1d1818bb08bef"} Jan 28 21:00:26 crc kubenswrapper[4746]: I0128 21:00:26.892195 4746 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/neutron-5569c8497f-nhjcs" Jan 28 21:00:26 crc kubenswrapper[4746]: I0128 21:00:26.894369 4746 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-api-0" event={"ID":"7fdd9345-765c-4290-beee-f02752b34ee8","Type":"ContainerStarted","Data":"b7762c8eddb23e4e1cfa105a2d4a9325209d894f5cb4028d733b21789e3c40c1"} Jan 28 21:00:26 crc kubenswrapper[4746]: I0128 21:00:26.976791 4746 generic.go:334] "Generic (PLEG): container finished" podID="766b5979-538e-4a54-a1b5-3351e3988f70" containerID="fac7d70a5b957886a2ba853350ecac76e6383948d810c424b865d3c3f9334bfa" exitCode=0 Jan 28 21:00:26 crc kubenswrapper[4746]: I0128 21:00:26.977100 4746 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/root-account-create-update-55g9q" event={"ID":"766b5979-538e-4a54-a1b5-3351e3988f70","Type":"ContainerDied","Data":"fac7d70a5b957886a2ba853350ecac76e6383948d810c424b865d3c3f9334bfa"} Jan 28 21:00:26 crc kubenswrapper[4746]: I0128 21:00:26.977183 4746 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/root-account-create-update-55g9q" event={"ID":"766b5979-538e-4a54-a1b5-3351e3988f70","Type":"ContainerStarted","Data":"670d8d720ca847c520995bbdf2c96e915dc4b7e55aa4cd7ed4579045220a76cd"} Jan 28 21:00:26 crc kubenswrapper[4746]: I0128 21:00:26.978688 4746 
pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/neutron-5569c8497f-nhjcs" podStartSLOduration=7.978668219 podStartE2EDuration="7.978668219s" podCreationTimestamp="2026-01-28 21:00:19 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-28 21:00:26.929425565 +0000 UTC m=+1254.885611919" watchObservedRunningTime="2026-01-28 21:00:26.978668219 +0000 UTC m=+1254.934854573" Jan 28 21:00:26 crc kubenswrapper[4746]: I0128 21:00:26.988252 4746 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-scheduler-0" event={"ID":"726e5b20-4725-4c19-9dac-42b68d0e181a","Type":"ContainerStarted","Data":"c358fe175c62284f9277b975bf38d8ec6d0552e688e5245a0625ecfbf2e4b6ba"} Jan 28 21:00:27 crc kubenswrapper[4746]: I0128 21:00:27.009927 4746 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-b895b5785-8cmsg" event={"ID":"24a8ff3b-2ce2-4603-a1c0-76ca3fe3a470","Type":"ContainerDied","Data":"806ba0ccdd5f45bb4eaaadeea0485a2468a66ceba22b7e621b29d8b8392c04f5"} Jan 28 21:00:27 crc kubenswrapper[4746]: I0128 21:00:27.009984 4746 scope.go:117] "RemoveContainer" containerID="99e283f1a5c1df2cd476d59196170097684f7435c77ccf62772d830d0b212755" Jan 28 21:00:27 crc kubenswrapper[4746]: I0128 21:00:27.010163 4746 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-b895b5785-8cmsg" Jan 28 21:00:27 crc kubenswrapper[4746]: I0128 21:00:27.018526 4746 generic.go:334] "Generic (PLEG): container finished" podID="e9abc9de-4ca9-4ba4-afca-bb6b5bf50924" containerID="bd42c88b4838603ef89398820eb1e912e2df2d39c4c8759aa10077f76b88b731" exitCode=0 Jan 28 21:00:27 crc kubenswrapper[4746]: I0128 21:00:27.018555 4746 generic.go:334] "Generic (PLEG): container finished" podID="e9abc9de-4ca9-4ba4-afca-bb6b5bf50924" containerID="1c384874e5c5af51f0459fdf2f1159ba60ac6c044fcea36d3c6e079ade91f9ad" exitCode=2 Jan 28 21:00:27 crc kubenswrapper[4746]: I0128 21:00:27.018562 4746 generic.go:334] "Generic (PLEG): container finished" podID="e9abc9de-4ca9-4ba4-afca-bb6b5bf50924" containerID="e3997fad01bd1a3636c4f7b48290559049e0c983129537cc89988b435cd7c658" exitCode=0 Jan 28 21:00:27 crc kubenswrapper[4746]: I0128 21:00:27.018588 4746 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"e9abc9de-4ca9-4ba4-afca-bb6b5bf50924","Type":"ContainerDied","Data":"bd42c88b4838603ef89398820eb1e912e2df2d39c4c8759aa10077f76b88b731"} Jan 28 21:00:27 crc kubenswrapper[4746]: I0128 21:00:27.019151 4746 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"e9abc9de-4ca9-4ba4-afca-bb6b5bf50924","Type":"ContainerDied","Data":"1c384874e5c5af51f0459fdf2f1159ba60ac6c044fcea36d3c6e079ade91f9ad"} Jan 28 21:00:27 crc kubenswrapper[4746]: I0128 21:00:27.019173 4746 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"e9abc9de-4ca9-4ba4-afca-bb6b5bf50924","Type":"ContainerDied","Data":"e3997fad01bd1a3636c4f7b48290559049e0c983129537cc89988b435cd7c658"} Jan 28 21:00:27 crc kubenswrapper[4746]: I0128 21:00:27.075126 4746 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-b895b5785-8cmsg"] Jan 28 21:00:27 crc kubenswrapper[4746]: I0128 21:00:27.099969 4746 kubelet.go:2431] "SyncLoop REMOVE" 
source="api" pods=["openstack/dnsmasq-dns-b895b5785-8cmsg"] Jan 28 21:00:28 crc kubenswrapper[4746]: I0128 21:00:28.049326 4746 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-api-0" event={"ID":"7fdd9345-765c-4290-beee-f02752b34ee8","Type":"ContainerStarted","Data":"8623e64b4abbd4310fcafd6f21923040f10f1cf657f2e89f7afd7f0a2ea89bd6"} Jan 28 21:00:28 crc kubenswrapper[4746]: I0128 21:00:28.049615 4746 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/cinder-api-0" podUID="7fdd9345-765c-4290-beee-f02752b34ee8" containerName="cinder-api-log" containerID="cri-o://b7762c8eddb23e4e1cfa105a2d4a9325209d894f5cb4028d733b21789e3c40c1" gracePeriod=30 Jan 28 21:00:28 crc kubenswrapper[4746]: I0128 21:00:28.050155 4746 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/cinder-api-0" Jan 28 21:00:28 crc kubenswrapper[4746]: I0128 21:00:28.050170 4746 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/cinder-api-0" podUID="7fdd9345-765c-4290-beee-f02752b34ee8" containerName="cinder-api" containerID="cri-o://8623e64b4abbd4310fcafd6f21923040f10f1cf657f2e89f7afd7f0a2ea89bd6" gracePeriod=30 Jan 28 21:00:28 crc kubenswrapper[4746]: I0128 21:00:28.054564 4746 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-scheduler-0" event={"ID":"726e5b20-4725-4c19-9dac-42b68d0e181a","Type":"ContainerStarted","Data":"0265e6865880168295fe6b0ef61dc05766773a2c0617568a8509325abe102174"} Jan 28 21:00:28 crc kubenswrapper[4746]: I0128 21:00:28.064420 4746 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-api-79599f5dcd-btgz7" event={"ID":"8fa661e3-776e-42b0-83db-374d372232ad","Type":"ContainerStarted","Data":"77c9981301a2e0a34b831e328158202898d2e85ae417c8aa8896ed2cf91cb1cb"} Jan 28 21:00:28 crc kubenswrapper[4746]: I0128 21:00:28.065200 4746 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" 
pod="openstack/barbican-api-79599f5dcd-btgz7" Jan 28 21:00:28 crc kubenswrapper[4746]: I0128 21:00:28.065228 4746 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/barbican-api-79599f5dcd-btgz7" Jan 28 21:00:28 crc kubenswrapper[4746]: I0128 21:00:28.068555 4746 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-5c9776ccc5-7fncg" event={"ID":"d730a144-0107-409b-9abe-1e316143afc9","Type":"ContainerStarted","Data":"3ff294de60bb526f0e36528008951905cb66f06f67099f304d07a2b85c2f1ab1"} Jan 28 21:00:28 crc kubenswrapper[4746]: I0128 21:00:28.068713 4746 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/barbican-api-6dcb95875d-qz4xq" Jan 28 21:00:28 crc kubenswrapper[4746]: I0128 21:00:28.068976 4746 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/dnsmasq-dns-5c9776ccc5-7fncg" Jan 28 21:00:28 crc kubenswrapper[4746]: I0128 21:00:28.077410 4746 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/cinder-api-0" podStartSLOduration=14.077377932 podStartE2EDuration="14.077377932s" podCreationTimestamp="2026-01-28 21:00:14 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-28 21:00:28.075201054 +0000 UTC m=+1256.031387428" watchObservedRunningTime="2026-01-28 21:00:28.077377932 +0000 UTC m=+1256.033564286" Jan 28 21:00:28 crc kubenswrapper[4746]: I0128 21:00:28.094839 4746 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/cinder-scheduler-0" podStartSLOduration=12.71995141 podStartE2EDuration="14.094824002s" podCreationTimestamp="2026-01-28 21:00:14 +0000 UTC" firstStartedPulling="2026-01-28 21:00:23.184400698 +0000 UTC m=+1251.140587062" lastFinishedPulling="2026-01-28 21:00:24.5592733 +0000 UTC m=+1252.515459654" observedRunningTime="2026-01-28 21:00:28.093059455 +0000 UTC m=+1256.049245809" 
watchObservedRunningTime="2026-01-28 21:00:28.094824002 +0000 UTC m=+1256.051010356" Jan 28 21:00:28 crc kubenswrapper[4746]: I0128 21:00:28.114360 4746 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/barbican-api-79599f5dcd-btgz7" podStartSLOduration=7.114343227 podStartE2EDuration="7.114343227s" podCreationTimestamp="2026-01-28 21:00:21 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-28 21:00:28.111360737 +0000 UTC m=+1256.067547091" watchObservedRunningTime="2026-01-28 21:00:28.114343227 +0000 UTC m=+1256.070529581" Jan 28 21:00:28 crc kubenswrapper[4746]: I0128 21:00:28.136617 4746 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/dnsmasq-dns-5c9776ccc5-7fncg" podStartSLOduration=13.136597696 podStartE2EDuration="13.136597696s" podCreationTimestamp="2026-01-28 21:00:15 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-28 21:00:28.132142775 +0000 UTC m=+1256.088329129" watchObservedRunningTime="2026-01-28 21:00:28.136597696 +0000 UTC m=+1256.092784060" Jan 28 21:00:28 crc kubenswrapper[4746]: I0128 21:00:28.768437 4746 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/root-account-create-update-55g9q" Jan 28 21:00:28 crc kubenswrapper[4746]: I0128 21:00:28.869969 4746 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="24a8ff3b-2ce2-4603-a1c0-76ca3fe3a470" path="/var/lib/kubelet/pods/24a8ff3b-2ce2-4603-a1c0-76ca3fe3a470/volumes" Jan 28 21:00:28 crc kubenswrapper[4746]: I0128 21:00:28.875736 4746 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-9rzkk\" (UniqueName: \"kubernetes.io/projected/766b5979-538e-4a54-a1b5-3351e3988f70-kube-api-access-9rzkk\") pod \"766b5979-538e-4a54-a1b5-3351e3988f70\" (UID: \"766b5979-538e-4a54-a1b5-3351e3988f70\") " Jan 28 21:00:28 crc kubenswrapper[4746]: I0128 21:00:28.876663 4746 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/766b5979-538e-4a54-a1b5-3351e3988f70-operator-scripts\") pod \"766b5979-538e-4a54-a1b5-3351e3988f70\" (UID: \"766b5979-538e-4a54-a1b5-3351e3988f70\") " Jan 28 21:00:28 crc kubenswrapper[4746]: I0128 21:00:28.878445 4746 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/766b5979-538e-4a54-a1b5-3351e3988f70-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "766b5979-538e-4a54-a1b5-3351e3988f70" (UID: "766b5979-538e-4a54-a1b5-3351e3988f70"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 28 21:00:28 crc kubenswrapper[4746]: I0128 21:00:28.899637 4746 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/766b5979-538e-4a54-a1b5-3351e3988f70-kube-api-access-9rzkk" (OuterVolumeSpecName: "kube-api-access-9rzkk") pod "766b5979-538e-4a54-a1b5-3351e3988f70" (UID: "766b5979-538e-4a54-a1b5-3351e3988f70"). InnerVolumeSpecName "kube-api-access-9rzkk". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 28 21:00:28 crc kubenswrapper[4746]: I0128 21:00:28.980875 4746 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-9rzkk\" (UniqueName: \"kubernetes.io/projected/766b5979-538e-4a54-a1b5-3351e3988f70-kube-api-access-9rzkk\") on node \"crc\" DevicePath \"\"" Jan 28 21:00:28 crc kubenswrapper[4746]: I0128 21:00:28.980908 4746 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/766b5979-538e-4a54-a1b5-3351e3988f70-operator-scripts\") on node \"crc\" DevicePath \"\"" Jan 28 21:00:29 crc kubenswrapper[4746]: I0128 21:00:29.059166 4746 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-api-0" Jan 28 21:00:29 crc kubenswrapper[4746]: I0128 21:00:29.060834 4746 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0" Jan 28 21:00:29 crc kubenswrapper[4746]: I0128 21:00:29.120151 4746 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-keystone-listener-74d8954788-pqmtp" event={"ID":"9d6c401d-18ee-432b-992c-749c69887786","Type":"ContainerStarted","Data":"9e6d25c95f97df1d3980720fc8daabb58d9599a3e382cebb85680445d8b8c516"} Jan 28 21:00:29 crc kubenswrapper[4746]: I0128 21:00:29.131979 4746 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/cinder-api-0" Jan 28 21:00:29 crc kubenswrapper[4746]: I0128 21:00:29.131967 4746 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-api-0" event={"ID":"7fdd9345-765c-4290-beee-f02752b34ee8","Type":"ContainerDied","Data":"8623e64b4abbd4310fcafd6f21923040f10f1cf657f2e89f7afd7f0a2ea89bd6"} Jan 28 21:00:29 crc kubenswrapper[4746]: I0128 21:00:29.132068 4746 scope.go:117] "RemoveContainer" containerID="8623e64b4abbd4310fcafd6f21923040f10f1cf657f2e89f7afd7f0a2ea89bd6" Jan 28 21:00:29 crc kubenswrapper[4746]: I0128 21:00:29.132281 4746 generic.go:334] "Generic (PLEG): container finished" podID="7fdd9345-765c-4290-beee-f02752b34ee8" containerID="8623e64b4abbd4310fcafd6f21923040f10f1cf657f2e89f7afd7f0a2ea89bd6" exitCode=0 Jan 28 21:00:29 crc kubenswrapper[4746]: I0128 21:00:29.132303 4746 generic.go:334] "Generic (PLEG): container finished" podID="7fdd9345-765c-4290-beee-f02752b34ee8" containerID="b7762c8eddb23e4e1cfa105a2d4a9325209d894f5cb4028d733b21789e3c40c1" exitCode=143 Jan 28 21:00:29 crc kubenswrapper[4746]: I0128 21:00:29.132345 4746 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-api-0" event={"ID":"7fdd9345-765c-4290-beee-f02752b34ee8","Type":"ContainerDied","Data":"b7762c8eddb23e4e1cfa105a2d4a9325209d894f5cb4028d733b21789e3c40c1"} Jan 28 21:00:29 crc kubenswrapper[4746]: I0128 21:00:29.132359 4746 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-api-0" event={"ID":"7fdd9345-765c-4290-beee-f02752b34ee8","Type":"ContainerDied","Data":"f2bfdb01d2ca9582b2bcf1c28302a70d2b2ffe12aba7bd2a44ebcb319537c712"} Jan 28 21:00:29 crc kubenswrapper[4746]: I0128 21:00:29.139867 4746 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-worker-54d56bfd95-zhg7t" event={"ID":"26db0906-ba06-4d40-b864-c7d956379296","Type":"ContainerStarted","Data":"ffa4166808650034861d72b887f7c58c1076da85d55fd3e65ab017fa0341919c"} Jan 28 21:00:29 crc kubenswrapper[4746]: 
I0128 21:00:29.146485 4746 generic.go:334] "Generic (PLEG): container finished" podID="3252e1be-6e47-4264-a4d2-ba4099c9f3c0" containerID="40b2d2ea4815db4c2fec0bd255e8c9e9ad80dbc33d89a14cc2fc53328b9606b3" exitCode=0 Jan 28 21:00:29 crc kubenswrapper[4746]: I0128 21:00:29.146538 4746 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cloudkitty-storageinit-lrs2d" event={"ID":"3252e1be-6e47-4264-a4d2-ba4099c9f3c0","Type":"ContainerDied","Data":"40b2d2ea4815db4c2fec0bd255e8c9e9ad80dbc33d89a14cc2fc53328b9606b3"} Jan 28 21:00:29 crc kubenswrapper[4746]: I0128 21:00:29.156482 4746 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/root-account-create-update-55g9q" event={"ID":"766b5979-538e-4a54-a1b5-3351e3988f70","Type":"ContainerDied","Data":"670d8d720ca847c520995bbdf2c96e915dc4b7e55aa4cd7ed4579045220a76cd"} Jan 28 21:00:29 crc kubenswrapper[4746]: I0128 21:00:29.156526 4746 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="670d8d720ca847c520995bbdf2c96e915dc4b7e55aa4cd7ed4579045220a76cd" Jan 28 21:00:29 crc kubenswrapper[4746]: I0128 21:00:29.156603 4746 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/root-account-create-update-55g9q" Jan 28 21:00:29 crc kubenswrapper[4746]: I0128 21:00:29.159287 4746 scope.go:117] "RemoveContainer" containerID="b7762c8eddb23e4e1cfa105a2d4a9325209d894f5cb4028d733b21789e3c40c1" Jan 28 21:00:29 crc kubenswrapper[4746]: I0128 21:00:29.182179 4746 generic.go:334] "Generic (PLEG): container finished" podID="e9abc9de-4ca9-4ba4-afca-bb6b5bf50924" containerID="0662539b92dae318d78b622e435f24e6de96b918733159dd31099925e363ef5b" exitCode=0 Jan 28 21:00:29 crc kubenswrapper[4746]: I0128 21:00:29.183821 4746 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/7fdd9345-765c-4290-beee-f02752b34ee8-scripts\") pod \"7fdd9345-765c-4290-beee-f02752b34ee8\" (UID: \"7fdd9345-765c-4290-beee-f02752b34ee8\") " Jan 28 21:00:29 crc kubenswrapper[4746]: I0128 21:00:29.183860 4746 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/7fdd9345-765c-4290-beee-f02752b34ee8-etc-machine-id\") pod \"7fdd9345-765c-4290-beee-f02752b34ee8\" (UID: \"7fdd9345-765c-4290-beee-f02752b34ee8\") " Jan 28 21:00:29 crc kubenswrapper[4746]: I0128 21:00:29.183880 4746 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/e9abc9de-4ca9-4ba4-afca-bb6b5bf50924-config-data\") pod \"e9abc9de-4ca9-4ba4-afca-bb6b5bf50924\" (UID: \"e9abc9de-4ca9-4ba4-afca-bb6b5bf50924\") " Jan 28 21:00:29 crc kubenswrapper[4746]: I0128 21:00:29.183903 4746 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/e9abc9de-4ca9-4ba4-afca-bb6b5bf50924-log-httpd\") pod \"e9abc9de-4ca9-4ba4-afca-bb6b5bf50924\" (UID: \"e9abc9de-4ca9-4ba4-afca-bb6b5bf50924\") " Jan 28 21:00:29 crc kubenswrapper[4746]: I0128 21:00:29.183924 4746 reconciler_common.go:159] 
"operationExecutor.UnmountVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/e9abc9de-4ca9-4ba4-afca-bb6b5bf50924-sg-core-conf-yaml\") pod \"e9abc9de-4ca9-4ba4-afca-bb6b5bf50924\" (UID: \"e9abc9de-4ca9-4ba4-afca-bb6b5bf50924\") " Jan 28 21:00:29 crc kubenswrapper[4746]: I0128 21:00:29.183959 4746 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e9abc9de-4ca9-4ba4-afca-bb6b5bf50924-combined-ca-bundle\") pod \"e9abc9de-4ca9-4ba4-afca-bb6b5bf50924\" (UID: \"e9abc9de-4ca9-4ba4-afca-bb6b5bf50924\") " Jan 28 21:00:29 crc kubenswrapper[4746]: I0128 21:00:29.184001 4746 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/e9abc9de-4ca9-4ba4-afca-bb6b5bf50924-scripts\") pod \"e9abc9de-4ca9-4ba4-afca-bb6b5bf50924\" (UID: \"e9abc9de-4ca9-4ba4-afca-bb6b5bf50924\") " Jan 28 21:00:29 crc kubenswrapper[4746]: I0128 21:00:29.184016 4746 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-7lw6q\" (UniqueName: \"kubernetes.io/projected/7fdd9345-765c-4290-beee-f02752b34ee8-kube-api-access-7lw6q\") pod \"7fdd9345-765c-4290-beee-f02752b34ee8\" (UID: \"7fdd9345-765c-4290-beee-f02752b34ee8\") " Jan 28 21:00:29 crc kubenswrapper[4746]: I0128 21:00:29.184060 4746 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/7fdd9345-765c-4290-beee-f02752b34ee8-config-data\") pod \"7fdd9345-765c-4290-beee-f02752b34ee8\" (UID: \"7fdd9345-765c-4290-beee-f02752b34ee8\") " Jan 28 21:00:29 crc kubenswrapper[4746]: I0128 21:00:29.184149 4746 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-rjbtc\" (UniqueName: \"kubernetes.io/projected/e9abc9de-4ca9-4ba4-afca-bb6b5bf50924-kube-api-access-rjbtc\") pod \"e9abc9de-4ca9-4ba4-afca-bb6b5bf50924\" (UID: 
\"e9abc9de-4ca9-4ba4-afca-bb6b5bf50924\") " Jan 28 21:00:29 crc kubenswrapper[4746]: I0128 21:00:29.184188 4746 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/7fdd9345-765c-4290-beee-f02752b34ee8-logs\") pod \"7fdd9345-765c-4290-beee-f02752b34ee8\" (UID: \"7fdd9345-765c-4290-beee-f02752b34ee8\") " Jan 28 21:00:29 crc kubenswrapper[4746]: I0128 21:00:29.184311 4746 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/e9abc9de-4ca9-4ba4-afca-bb6b5bf50924-run-httpd\") pod \"e9abc9de-4ca9-4ba4-afca-bb6b5bf50924\" (UID: \"e9abc9de-4ca9-4ba4-afca-bb6b5bf50924\") " Jan 28 21:00:29 crc kubenswrapper[4746]: I0128 21:00:29.184336 4746 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/7fdd9345-765c-4290-beee-f02752b34ee8-config-data-custom\") pod \"7fdd9345-765c-4290-beee-f02752b34ee8\" (UID: \"7fdd9345-765c-4290-beee-f02752b34ee8\") " Jan 28 21:00:29 crc kubenswrapper[4746]: I0128 21:00:29.184359 4746 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/7fdd9345-765c-4290-beee-f02752b34ee8-combined-ca-bundle\") pod \"7fdd9345-765c-4290-beee-f02752b34ee8\" (UID: \"7fdd9345-765c-4290-beee-f02752b34ee8\") " Jan 28 21:00:29 crc kubenswrapper[4746]: I0128 21:00:29.182330 4746 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/ceilometer-0" Jan 28 21:00:29 crc kubenswrapper[4746]: I0128 21:00:29.182343 4746 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"e9abc9de-4ca9-4ba4-afca-bb6b5bf50924","Type":"ContainerDied","Data":"0662539b92dae318d78b622e435f24e6de96b918733159dd31099925e363ef5b"} Jan 28 21:00:29 crc kubenswrapper[4746]: I0128 21:00:29.186272 4746 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"e9abc9de-4ca9-4ba4-afca-bb6b5bf50924","Type":"ContainerDied","Data":"ca468a76c2b0fc437cfff1fe1d6721a46f3de9f302641d999ead09846737c3fe"} Jan 28 21:00:29 crc kubenswrapper[4746]: I0128 21:00:29.187676 4746 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/7fdd9345-765c-4290-beee-f02752b34ee8-logs" (OuterVolumeSpecName: "logs") pod "7fdd9345-765c-4290-beee-f02752b34ee8" (UID: "7fdd9345-765c-4290-beee-f02752b34ee8"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 28 21:00:29 crc kubenswrapper[4746]: I0128 21:00:29.187706 4746 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/e9abc9de-4ca9-4ba4-afca-bb6b5bf50924-run-httpd" (OuterVolumeSpecName: "run-httpd") pod "e9abc9de-4ca9-4ba4-afca-bb6b5bf50924" (UID: "e9abc9de-4ca9-4ba4-afca-bb6b5bf50924"). InnerVolumeSpecName "run-httpd". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 28 21:00:29 crc kubenswrapper[4746]: I0128 21:00:29.188001 4746 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/e9abc9de-4ca9-4ba4-afca-bb6b5bf50924-log-httpd" (OuterVolumeSpecName: "log-httpd") pod "e9abc9de-4ca9-4ba4-afca-bb6b5bf50924" (UID: "e9abc9de-4ca9-4ba4-afca-bb6b5bf50924"). InnerVolumeSpecName "log-httpd". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 28 21:00:29 crc kubenswrapper[4746]: I0128 21:00:29.188038 4746 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/7fdd9345-765c-4290-beee-f02752b34ee8-etc-machine-id" (OuterVolumeSpecName: "etc-machine-id") pod "7fdd9345-765c-4290-beee-f02752b34ee8" (UID: "7fdd9345-765c-4290-beee-f02752b34ee8"). InnerVolumeSpecName "etc-machine-id". PluginName "kubernetes.io/host-path", VolumeGidValue "" Jan 28 21:00:29 crc kubenswrapper[4746]: I0128 21:00:29.200212 4746 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/7fdd9345-765c-4290-beee-f02752b34ee8-scripts" (OuterVolumeSpecName: "scripts") pod "7fdd9345-765c-4290-beee-f02752b34ee8" (UID: "7fdd9345-765c-4290-beee-f02752b34ee8"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 28 21:00:29 crc kubenswrapper[4746]: I0128 21:00:29.201028 4746 scope.go:117] "RemoveContainer" containerID="8623e64b4abbd4310fcafd6f21923040f10f1cf657f2e89f7afd7f0a2ea89bd6" Jan 28 21:00:29 crc kubenswrapper[4746]: I0128 21:00:29.201397 4746 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/7fdd9345-765c-4290-beee-f02752b34ee8-kube-api-access-7lw6q" (OuterVolumeSpecName: "kube-api-access-7lw6q") pod "7fdd9345-765c-4290-beee-f02752b34ee8" (UID: "7fdd9345-765c-4290-beee-f02752b34ee8"). InnerVolumeSpecName "kube-api-access-7lw6q". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 28 21:00:29 crc kubenswrapper[4746]: I0128 21:00:29.201575 4746 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/e9abc9de-4ca9-4ba4-afca-bb6b5bf50924-scripts" (OuterVolumeSpecName: "scripts") pod "e9abc9de-4ca9-4ba4-afca-bb6b5bf50924" (UID: "e9abc9de-4ca9-4ba4-afca-bb6b5bf50924"). InnerVolumeSpecName "scripts". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 28 21:00:29 crc kubenswrapper[4746]: I0128 21:00:29.201710 4746 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/e9abc9de-4ca9-4ba4-afca-bb6b5bf50924-kube-api-access-rjbtc" (OuterVolumeSpecName: "kube-api-access-rjbtc") pod "e9abc9de-4ca9-4ba4-afca-bb6b5bf50924" (UID: "e9abc9de-4ca9-4ba4-afca-bb6b5bf50924"). InnerVolumeSpecName "kube-api-access-rjbtc". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 28 21:00:29 crc kubenswrapper[4746]: E0128 21:00:29.201816 4746 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"8623e64b4abbd4310fcafd6f21923040f10f1cf657f2e89f7afd7f0a2ea89bd6\": container with ID starting with 8623e64b4abbd4310fcafd6f21923040f10f1cf657f2e89f7afd7f0a2ea89bd6 not found: ID does not exist" containerID="8623e64b4abbd4310fcafd6f21923040f10f1cf657f2e89f7afd7f0a2ea89bd6" Jan 28 21:00:29 crc kubenswrapper[4746]: I0128 21:00:29.201854 4746 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"8623e64b4abbd4310fcafd6f21923040f10f1cf657f2e89f7afd7f0a2ea89bd6"} err="failed to get container status \"8623e64b4abbd4310fcafd6f21923040f10f1cf657f2e89f7afd7f0a2ea89bd6\": rpc error: code = NotFound desc = could not find container \"8623e64b4abbd4310fcafd6f21923040f10f1cf657f2e89f7afd7f0a2ea89bd6\": container with ID starting with 8623e64b4abbd4310fcafd6f21923040f10f1cf657f2e89f7afd7f0a2ea89bd6 not found: ID does not exist" Jan 28 21:00:29 crc kubenswrapper[4746]: I0128 21:00:29.201881 4746 scope.go:117] "RemoveContainer" containerID="b7762c8eddb23e4e1cfa105a2d4a9325209d894f5cb4028d733b21789e3c40c1" Jan 28 21:00:29 crc kubenswrapper[4746]: I0128 21:00:29.202227 4746 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/7fdd9345-765c-4290-beee-f02752b34ee8-config-data-custom" (OuterVolumeSpecName: 
"config-data-custom") pod "7fdd9345-765c-4290-beee-f02752b34ee8" (UID: "7fdd9345-765c-4290-beee-f02752b34ee8"). InnerVolumeSpecName "config-data-custom". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 28 21:00:29 crc kubenswrapper[4746]: E0128 21:00:29.202994 4746 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"b7762c8eddb23e4e1cfa105a2d4a9325209d894f5cb4028d733b21789e3c40c1\": container with ID starting with b7762c8eddb23e4e1cfa105a2d4a9325209d894f5cb4028d733b21789e3c40c1 not found: ID does not exist" containerID="b7762c8eddb23e4e1cfa105a2d4a9325209d894f5cb4028d733b21789e3c40c1" Jan 28 21:00:29 crc kubenswrapper[4746]: I0128 21:00:29.203024 4746 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"b7762c8eddb23e4e1cfa105a2d4a9325209d894f5cb4028d733b21789e3c40c1"} err="failed to get container status \"b7762c8eddb23e4e1cfa105a2d4a9325209d894f5cb4028d733b21789e3c40c1\": rpc error: code = NotFound desc = could not find container \"b7762c8eddb23e4e1cfa105a2d4a9325209d894f5cb4028d733b21789e3c40c1\": container with ID starting with b7762c8eddb23e4e1cfa105a2d4a9325209d894f5cb4028d733b21789e3c40c1 not found: ID does not exist" Jan 28 21:00:29 crc kubenswrapper[4746]: I0128 21:00:29.203042 4746 scope.go:117] "RemoveContainer" containerID="8623e64b4abbd4310fcafd6f21923040f10f1cf657f2e89f7afd7f0a2ea89bd6" Jan 28 21:00:29 crc kubenswrapper[4746]: I0128 21:00:29.203383 4746 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"8623e64b4abbd4310fcafd6f21923040f10f1cf657f2e89f7afd7f0a2ea89bd6"} err="failed to get container status \"8623e64b4abbd4310fcafd6f21923040f10f1cf657f2e89f7afd7f0a2ea89bd6\": rpc error: code = NotFound desc = could not find container \"8623e64b4abbd4310fcafd6f21923040f10f1cf657f2e89f7afd7f0a2ea89bd6\": container with ID starting with 
8623e64b4abbd4310fcafd6f21923040f10f1cf657f2e89f7afd7f0a2ea89bd6 not found: ID does not exist" Jan 28 21:00:29 crc kubenswrapper[4746]: I0128 21:00:29.203416 4746 scope.go:117] "RemoveContainer" containerID="b7762c8eddb23e4e1cfa105a2d4a9325209d894f5cb4028d733b21789e3c40c1" Jan 28 21:00:29 crc kubenswrapper[4746]: I0128 21:00:29.203659 4746 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"b7762c8eddb23e4e1cfa105a2d4a9325209d894f5cb4028d733b21789e3c40c1"} err="failed to get container status \"b7762c8eddb23e4e1cfa105a2d4a9325209d894f5cb4028d733b21789e3c40c1\": rpc error: code = NotFound desc = could not find container \"b7762c8eddb23e4e1cfa105a2d4a9325209d894f5cb4028d733b21789e3c40c1\": container with ID starting with b7762c8eddb23e4e1cfa105a2d4a9325209d894f5cb4028d733b21789e3c40c1 not found: ID does not exist" Jan 28 21:00:29 crc kubenswrapper[4746]: I0128 21:00:29.203685 4746 scope.go:117] "RemoveContainer" containerID="bd42c88b4838603ef89398820eb1e912e2df2d39c4c8759aa10077f76b88b731" Jan 28 21:00:29 crc kubenswrapper[4746]: I0128 21:00:29.227151 4746 scope.go:117] "RemoveContainer" containerID="1c384874e5c5af51f0459fdf2f1159ba60ac6c044fcea36d3c6e079ade91f9ad" Jan 28 21:00:29 crc kubenswrapper[4746]: I0128 21:00:29.247474 4746 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/7fdd9345-765c-4290-beee-f02752b34ee8-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "7fdd9345-765c-4290-beee-f02752b34ee8" (UID: "7fdd9345-765c-4290-beee-f02752b34ee8"). InnerVolumeSpecName "combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 28 21:00:29 crc kubenswrapper[4746]: I0128 21:00:29.250998 4746 scope.go:117] "RemoveContainer" containerID="0662539b92dae318d78b622e435f24e6de96b918733159dd31099925e363ef5b" Jan 28 21:00:29 crc kubenswrapper[4746]: I0128 21:00:29.260694 4746 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/e9abc9de-4ca9-4ba4-afca-bb6b5bf50924-sg-core-conf-yaml" (OuterVolumeSpecName: "sg-core-conf-yaml") pod "e9abc9de-4ca9-4ba4-afca-bb6b5bf50924" (UID: "e9abc9de-4ca9-4ba4-afca-bb6b5bf50924"). InnerVolumeSpecName "sg-core-conf-yaml". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 28 21:00:29 crc kubenswrapper[4746]: I0128 21:00:29.267554 4746 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/7fdd9345-765c-4290-beee-f02752b34ee8-config-data" (OuterVolumeSpecName: "config-data") pod "7fdd9345-765c-4290-beee-f02752b34ee8" (UID: "7fdd9345-765c-4290-beee-f02752b34ee8"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 28 21:00:29 crc kubenswrapper[4746]: I0128 21:00:29.273186 4746 scope.go:117] "RemoveContainer" containerID="e3997fad01bd1a3636c4f7b48290559049e0c983129537cc89988b435cd7c658" Jan 28 21:00:29 crc kubenswrapper[4746]: I0128 21:00:29.286203 4746 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/e9abc9de-4ca9-4ba4-afca-bb6b5bf50924-scripts\") on node \"crc\" DevicePath \"\"" Jan 28 21:00:29 crc kubenswrapper[4746]: I0128 21:00:29.286225 4746 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-7lw6q\" (UniqueName: \"kubernetes.io/projected/7fdd9345-765c-4290-beee-f02752b34ee8-kube-api-access-7lw6q\") on node \"crc\" DevicePath \"\"" Jan 28 21:00:29 crc kubenswrapper[4746]: I0128 21:00:29.286234 4746 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/7fdd9345-765c-4290-beee-f02752b34ee8-config-data\") on node \"crc\" DevicePath \"\"" Jan 28 21:00:29 crc kubenswrapper[4746]: I0128 21:00:29.286244 4746 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-rjbtc\" (UniqueName: \"kubernetes.io/projected/e9abc9de-4ca9-4ba4-afca-bb6b5bf50924-kube-api-access-rjbtc\") on node \"crc\" DevicePath \"\"" Jan 28 21:00:29 crc kubenswrapper[4746]: I0128 21:00:29.286252 4746 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/7fdd9345-765c-4290-beee-f02752b34ee8-logs\") on node \"crc\" DevicePath \"\"" Jan 28 21:00:29 crc kubenswrapper[4746]: I0128 21:00:29.286260 4746 reconciler_common.go:293] "Volume detached for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/e9abc9de-4ca9-4ba4-afca-bb6b5bf50924-run-httpd\") on node \"crc\" DevicePath \"\"" Jan 28 21:00:29 crc kubenswrapper[4746]: I0128 21:00:29.286267 4746 reconciler_common.go:293] "Volume detached for volume \"config-data-custom\" (UniqueName: 
\"kubernetes.io/secret/7fdd9345-765c-4290-beee-f02752b34ee8-config-data-custom\") on node \"crc\" DevicePath \"\"" Jan 28 21:00:29 crc kubenswrapper[4746]: I0128 21:00:29.286275 4746 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/7fdd9345-765c-4290-beee-f02752b34ee8-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Jan 28 21:00:29 crc kubenswrapper[4746]: I0128 21:00:29.286283 4746 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/7fdd9345-765c-4290-beee-f02752b34ee8-scripts\") on node \"crc\" DevicePath \"\"" Jan 28 21:00:29 crc kubenswrapper[4746]: I0128 21:00:29.286293 4746 reconciler_common.go:293] "Volume detached for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/7fdd9345-765c-4290-beee-f02752b34ee8-etc-machine-id\") on node \"crc\" DevicePath \"\"" Jan 28 21:00:29 crc kubenswrapper[4746]: I0128 21:00:29.286301 4746 reconciler_common.go:293] "Volume detached for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/e9abc9de-4ca9-4ba4-afca-bb6b5bf50924-log-httpd\") on node \"crc\" DevicePath \"\"" Jan 28 21:00:29 crc kubenswrapper[4746]: I0128 21:00:29.286308 4746 reconciler_common.go:293] "Volume detached for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/e9abc9de-4ca9-4ba4-afca-bb6b5bf50924-sg-core-conf-yaml\") on node \"crc\" DevicePath \"\"" Jan 28 21:00:29 crc kubenswrapper[4746]: I0128 21:00:29.293381 4746 scope.go:117] "RemoveContainer" containerID="bd42c88b4838603ef89398820eb1e912e2df2d39c4c8759aa10077f76b88b731" Jan 28 21:00:29 crc kubenswrapper[4746]: E0128 21:00:29.293758 4746 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"bd42c88b4838603ef89398820eb1e912e2df2d39c4c8759aa10077f76b88b731\": container with ID starting with bd42c88b4838603ef89398820eb1e912e2df2d39c4c8759aa10077f76b88b731 not found: ID does not 
exist" containerID="bd42c88b4838603ef89398820eb1e912e2df2d39c4c8759aa10077f76b88b731" Jan 28 21:00:29 crc kubenswrapper[4746]: I0128 21:00:29.293801 4746 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"bd42c88b4838603ef89398820eb1e912e2df2d39c4c8759aa10077f76b88b731"} err="failed to get container status \"bd42c88b4838603ef89398820eb1e912e2df2d39c4c8759aa10077f76b88b731\": rpc error: code = NotFound desc = could not find container \"bd42c88b4838603ef89398820eb1e912e2df2d39c4c8759aa10077f76b88b731\": container with ID starting with bd42c88b4838603ef89398820eb1e912e2df2d39c4c8759aa10077f76b88b731 not found: ID does not exist" Jan 28 21:00:29 crc kubenswrapper[4746]: I0128 21:00:29.293825 4746 scope.go:117] "RemoveContainer" containerID="1c384874e5c5af51f0459fdf2f1159ba60ac6c044fcea36d3c6e079ade91f9ad" Jan 28 21:00:29 crc kubenswrapper[4746]: E0128 21:00:29.294216 4746 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"1c384874e5c5af51f0459fdf2f1159ba60ac6c044fcea36d3c6e079ade91f9ad\": container with ID starting with 1c384874e5c5af51f0459fdf2f1159ba60ac6c044fcea36d3c6e079ade91f9ad not found: ID does not exist" containerID="1c384874e5c5af51f0459fdf2f1159ba60ac6c044fcea36d3c6e079ade91f9ad" Jan 28 21:00:29 crc kubenswrapper[4746]: I0128 21:00:29.294243 4746 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"1c384874e5c5af51f0459fdf2f1159ba60ac6c044fcea36d3c6e079ade91f9ad"} err="failed to get container status \"1c384874e5c5af51f0459fdf2f1159ba60ac6c044fcea36d3c6e079ade91f9ad\": rpc error: code = NotFound desc = could not find container \"1c384874e5c5af51f0459fdf2f1159ba60ac6c044fcea36d3c6e079ade91f9ad\": container with ID starting with 1c384874e5c5af51f0459fdf2f1159ba60ac6c044fcea36d3c6e079ade91f9ad not found: ID does not exist" Jan 28 21:00:29 crc kubenswrapper[4746]: I0128 21:00:29.294257 4746 scope.go:117] 
"RemoveContainer" containerID="0662539b92dae318d78b622e435f24e6de96b918733159dd31099925e363ef5b" Jan 28 21:00:29 crc kubenswrapper[4746]: E0128 21:00:29.294517 4746 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"0662539b92dae318d78b622e435f24e6de96b918733159dd31099925e363ef5b\": container with ID starting with 0662539b92dae318d78b622e435f24e6de96b918733159dd31099925e363ef5b not found: ID does not exist" containerID="0662539b92dae318d78b622e435f24e6de96b918733159dd31099925e363ef5b" Jan 28 21:00:29 crc kubenswrapper[4746]: I0128 21:00:29.294558 4746 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"0662539b92dae318d78b622e435f24e6de96b918733159dd31099925e363ef5b"} err="failed to get container status \"0662539b92dae318d78b622e435f24e6de96b918733159dd31099925e363ef5b\": rpc error: code = NotFound desc = could not find container \"0662539b92dae318d78b622e435f24e6de96b918733159dd31099925e363ef5b\": container with ID starting with 0662539b92dae318d78b622e435f24e6de96b918733159dd31099925e363ef5b not found: ID does not exist" Jan 28 21:00:29 crc kubenswrapper[4746]: I0128 21:00:29.294588 4746 scope.go:117] "RemoveContainer" containerID="e3997fad01bd1a3636c4f7b48290559049e0c983129537cc89988b435cd7c658" Jan 28 21:00:29 crc kubenswrapper[4746]: E0128 21:00:29.294815 4746 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"e3997fad01bd1a3636c4f7b48290559049e0c983129537cc89988b435cd7c658\": container with ID starting with e3997fad01bd1a3636c4f7b48290559049e0c983129537cc89988b435cd7c658 not found: ID does not exist" containerID="e3997fad01bd1a3636c4f7b48290559049e0c983129537cc89988b435cd7c658" Jan 28 21:00:29 crc kubenswrapper[4746]: I0128 21:00:29.294834 4746 pod_container_deletor.go:53] "DeleteContainer returned error" 
containerID={"Type":"cri-o","ID":"e3997fad01bd1a3636c4f7b48290559049e0c983129537cc89988b435cd7c658"} err="failed to get container status \"e3997fad01bd1a3636c4f7b48290559049e0c983129537cc89988b435cd7c658\": rpc error: code = NotFound desc = could not find container \"e3997fad01bd1a3636c4f7b48290559049e0c983129537cc89988b435cd7c658\": container with ID starting with e3997fad01bd1a3636c4f7b48290559049e0c983129537cc89988b435cd7c658 not found: ID does not exist" Jan 28 21:00:29 crc kubenswrapper[4746]: I0128 21:00:29.299566 4746 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/e9abc9de-4ca9-4ba4-afca-bb6b5bf50924-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "e9abc9de-4ca9-4ba4-afca-bb6b5bf50924" (UID: "e9abc9de-4ca9-4ba4-afca-bb6b5bf50924"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 28 21:00:29 crc kubenswrapper[4746]: I0128 21:00:29.353864 4746 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/e9abc9de-4ca9-4ba4-afca-bb6b5bf50924-config-data" (OuterVolumeSpecName: "config-data") pod "e9abc9de-4ca9-4ba4-afca-bb6b5bf50924" (UID: "e9abc9de-4ca9-4ba4-afca-bb6b5bf50924"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 28 21:00:29 crc kubenswrapper[4746]: I0128 21:00:29.387770 4746 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/e9abc9de-4ca9-4ba4-afca-bb6b5bf50924-config-data\") on node \"crc\" DevicePath \"\"" Jan 28 21:00:29 crc kubenswrapper[4746]: I0128 21:00:29.387810 4746 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e9abc9de-4ca9-4ba4-afca-bb6b5bf50924-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Jan 28 21:00:29 crc kubenswrapper[4746]: I0128 21:00:29.466134 4746 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/cinder-api-0"] Jan 28 21:00:29 crc kubenswrapper[4746]: I0128 21:00:29.488924 4746 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/cinder-api-0"] Jan 28 21:00:29 crc kubenswrapper[4746]: I0128 21:00:29.514706 4746 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/cinder-api-0"] Jan 28 21:00:29 crc kubenswrapper[4746]: E0128 21:00:29.515250 4746 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="766b5979-538e-4a54-a1b5-3351e3988f70" containerName="mariadb-account-create-update" Jan 28 21:00:29 crc kubenswrapper[4746]: I0128 21:00:29.515280 4746 state_mem.go:107] "Deleted CPUSet assignment" podUID="766b5979-538e-4a54-a1b5-3351e3988f70" containerName="mariadb-account-create-update" Jan 28 21:00:29 crc kubenswrapper[4746]: E0128 21:00:29.515301 4746 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="7fdd9345-765c-4290-beee-f02752b34ee8" containerName="cinder-api" Jan 28 21:00:29 crc kubenswrapper[4746]: I0128 21:00:29.515312 4746 state_mem.go:107] "Deleted CPUSet assignment" podUID="7fdd9345-765c-4290-beee-f02752b34ee8" containerName="cinder-api" Jan 28 21:00:29 crc kubenswrapper[4746]: E0128 21:00:29.515328 4746 cpu_manager.go:410] "RemoveStaleState: removing container" 
podUID="7fdd9345-765c-4290-beee-f02752b34ee8" containerName="cinder-api-log" Jan 28 21:00:29 crc kubenswrapper[4746]: I0128 21:00:29.515339 4746 state_mem.go:107] "Deleted CPUSet assignment" podUID="7fdd9345-765c-4290-beee-f02752b34ee8" containerName="cinder-api-log" Jan 28 21:00:29 crc kubenswrapper[4746]: E0128 21:00:29.515349 4746 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="e9abc9de-4ca9-4ba4-afca-bb6b5bf50924" containerName="ceilometer-notification-agent" Jan 28 21:00:29 crc kubenswrapper[4746]: I0128 21:00:29.515359 4746 state_mem.go:107] "Deleted CPUSet assignment" podUID="e9abc9de-4ca9-4ba4-afca-bb6b5bf50924" containerName="ceilometer-notification-agent" Jan 28 21:00:29 crc kubenswrapper[4746]: E0128 21:00:29.515384 4746 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="24a8ff3b-2ce2-4603-a1c0-76ca3fe3a470" containerName="init" Jan 28 21:00:29 crc kubenswrapper[4746]: I0128 21:00:29.515396 4746 state_mem.go:107] "Deleted CPUSet assignment" podUID="24a8ff3b-2ce2-4603-a1c0-76ca3fe3a470" containerName="init" Jan 28 21:00:29 crc kubenswrapper[4746]: E0128 21:00:29.515419 4746 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="40f7281e-2e3c-4ce9-8b0f-876312390c0b" containerName="neutron-api" Jan 28 21:00:29 crc kubenswrapper[4746]: I0128 21:00:29.515430 4746 state_mem.go:107] "Deleted CPUSet assignment" podUID="40f7281e-2e3c-4ce9-8b0f-876312390c0b" containerName="neutron-api" Jan 28 21:00:29 crc kubenswrapper[4746]: E0128 21:00:29.515451 4746 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="e9abc9de-4ca9-4ba4-afca-bb6b5bf50924" containerName="ceilometer-central-agent" Jan 28 21:00:29 crc kubenswrapper[4746]: I0128 21:00:29.515461 4746 state_mem.go:107] "Deleted CPUSet assignment" podUID="e9abc9de-4ca9-4ba4-afca-bb6b5bf50924" containerName="ceilometer-central-agent" Jan 28 21:00:29 crc kubenswrapper[4746]: E0128 21:00:29.515479 4746 cpu_manager.go:410] "RemoveStaleState: removing container" 
podUID="e9abc9de-4ca9-4ba4-afca-bb6b5bf50924" containerName="sg-core" Jan 28 21:00:29 crc kubenswrapper[4746]: I0128 21:00:29.515491 4746 state_mem.go:107] "Deleted CPUSet assignment" podUID="e9abc9de-4ca9-4ba4-afca-bb6b5bf50924" containerName="sg-core" Jan 28 21:00:29 crc kubenswrapper[4746]: E0128 21:00:29.515523 4746 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="e9abc9de-4ca9-4ba4-afca-bb6b5bf50924" containerName="proxy-httpd" Jan 28 21:00:29 crc kubenswrapper[4746]: I0128 21:00:29.515532 4746 state_mem.go:107] "Deleted CPUSet assignment" podUID="e9abc9de-4ca9-4ba4-afca-bb6b5bf50924" containerName="proxy-httpd" Jan 28 21:00:29 crc kubenswrapper[4746]: E0128 21:00:29.515561 4746 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="40f7281e-2e3c-4ce9-8b0f-876312390c0b" containerName="neutron-httpd" Jan 28 21:00:29 crc kubenswrapper[4746]: I0128 21:00:29.515570 4746 state_mem.go:107] "Deleted CPUSet assignment" podUID="40f7281e-2e3c-4ce9-8b0f-876312390c0b" containerName="neutron-httpd" Jan 28 21:00:29 crc kubenswrapper[4746]: I0128 21:00:29.515809 4746 memory_manager.go:354] "RemoveStaleState removing state" podUID="766b5979-538e-4a54-a1b5-3351e3988f70" containerName="mariadb-account-create-update" Jan 28 21:00:29 crc kubenswrapper[4746]: I0128 21:00:29.515833 4746 memory_manager.go:354] "RemoveStaleState removing state" podUID="e9abc9de-4ca9-4ba4-afca-bb6b5bf50924" containerName="ceilometer-notification-agent" Jan 28 21:00:29 crc kubenswrapper[4746]: I0128 21:00:29.515843 4746 memory_manager.go:354] "RemoveStaleState removing state" podUID="e9abc9de-4ca9-4ba4-afca-bb6b5bf50924" containerName="sg-core" Jan 28 21:00:29 crc kubenswrapper[4746]: I0128 21:00:29.515855 4746 memory_manager.go:354] "RemoveStaleState removing state" podUID="e9abc9de-4ca9-4ba4-afca-bb6b5bf50924" containerName="ceilometer-central-agent" Jan 28 21:00:29 crc kubenswrapper[4746]: I0128 21:00:29.515871 4746 memory_manager.go:354] "RemoveStaleState removing 
state" podUID="e9abc9de-4ca9-4ba4-afca-bb6b5bf50924" containerName="proxy-httpd" Jan 28 21:00:29 crc kubenswrapper[4746]: I0128 21:00:29.515889 4746 memory_manager.go:354] "RemoveStaleState removing state" podUID="7fdd9345-765c-4290-beee-f02752b34ee8" containerName="cinder-api" Jan 28 21:00:29 crc kubenswrapper[4746]: I0128 21:00:29.515907 4746 memory_manager.go:354] "RemoveStaleState removing state" podUID="7fdd9345-765c-4290-beee-f02752b34ee8" containerName="cinder-api-log" Jan 28 21:00:29 crc kubenswrapper[4746]: I0128 21:00:29.515925 4746 memory_manager.go:354] "RemoveStaleState removing state" podUID="24a8ff3b-2ce2-4603-a1c0-76ca3fe3a470" containerName="init" Jan 28 21:00:29 crc kubenswrapper[4746]: I0128 21:00:29.515939 4746 memory_manager.go:354] "RemoveStaleState removing state" podUID="40f7281e-2e3c-4ce9-8b0f-876312390c0b" containerName="neutron-api" Jan 28 21:00:29 crc kubenswrapper[4746]: I0128 21:00:29.515952 4746 memory_manager.go:354] "RemoveStaleState removing state" podUID="40f7281e-2e3c-4ce9-8b0f-876312390c0b" containerName="neutron-httpd" Jan 28 21:00:29 crc kubenswrapper[4746]: I0128 21:00:29.517288 4746 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/cinder-api-0"
Jan 28 21:00:29 crc kubenswrapper[4746]: I0128 21:00:29.525485 4746 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cinder-api-config-data"
Jan 28 21:00:29 crc kubenswrapper[4746]: I0128 21:00:29.527816 4746 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-cinder-public-svc"
Jan 28 21:00:29 crc kubenswrapper[4746]: I0128 21:00:29.528161 4746 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-cinder-internal-svc"
Jan 28 21:00:29 crc kubenswrapper[4746]: I0128 21:00:29.541233 4746 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cinder-api-0"]
Jan 28 21:00:29 crc kubenswrapper[4746]: I0128 21:00:29.580168 4746 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ceilometer-0"]
Jan 28 21:00:29 crc kubenswrapper[4746]: I0128 21:00:29.592473 4746 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/a613bc41-1308-4925-a2df-026f6622f0c2-internal-tls-certs\") pod \"cinder-api-0\" (UID: \"a613bc41-1308-4925-a2df-026f6622f0c2\") " pod="openstack/cinder-api-0"
Jan 28 21:00:29 crc kubenswrapper[4746]: I0128 21:00:29.592535 4746 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/a613bc41-1308-4925-a2df-026f6622f0c2-config-data\") pod \"cinder-api-0\" (UID: \"a613bc41-1308-4925-a2df-026f6622f0c2\") " pod="openstack/cinder-api-0"
Jan 28 21:00:29 crc kubenswrapper[4746]: I0128 21:00:29.592570 4746 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/a613bc41-1308-4925-a2df-026f6622f0c2-combined-ca-bundle\") pod \"cinder-api-0\" (UID: \"a613bc41-1308-4925-a2df-026f6622f0c2\") " pod="openstack/cinder-api-0"
Jan 28 21:00:29 crc kubenswrapper[4746]: I0128 21:00:29.592642 4746 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/a613bc41-1308-4925-a2df-026f6622f0c2-config-data-custom\") pod \"cinder-api-0\" (UID: \"a613bc41-1308-4925-a2df-026f6622f0c2\") " pod="openstack/cinder-api-0"
Jan 28 21:00:29 crc kubenswrapper[4746]: I0128 21:00:29.592676 4746 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/a613bc41-1308-4925-a2df-026f6622f0c2-public-tls-certs\") pod \"cinder-api-0\" (UID: \"a613bc41-1308-4925-a2df-026f6622f0c2\") " pod="openstack/cinder-api-0"
Jan 28 21:00:29 crc kubenswrapper[4746]: I0128 21:00:29.592790 4746 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-kpcx9\" (UniqueName: \"kubernetes.io/projected/a613bc41-1308-4925-a2df-026f6622f0c2-kube-api-access-kpcx9\") pod \"cinder-api-0\" (UID: \"a613bc41-1308-4925-a2df-026f6622f0c2\") " pod="openstack/cinder-api-0"
Jan 28 21:00:29 crc kubenswrapper[4746]: I0128 21:00:29.592832 4746 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/a613bc41-1308-4925-a2df-026f6622f0c2-etc-machine-id\") pod \"cinder-api-0\" (UID: \"a613bc41-1308-4925-a2df-026f6622f0c2\") " pod="openstack/cinder-api-0"
Jan 28 21:00:29 crc kubenswrapper[4746]: I0128 21:00:29.592866 4746 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/a613bc41-1308-4925-a2df-026f6622f0c2-logs\") pod \"cinder-api-0\" (UID: \"a613bc41-1308-4925-a2df-026f6622f0c2\") " pod="openstack/cinder-api-0"
Jan 28 21:00:29 crc kubenswrapper[4746]: I0128 21:00:29.592919 4746 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/a613bc41-1308-4925-a2df-026f6622f0c2-scripts\") pod \"cinder-api-0\" (UID: \"a613bc41-1308-4925-a2df-026f6622f0c2\") " pod="openstack/cinder-api-0"
Jan 28 21:00:29 crc kubenswrapper[4746]: I0128 21:00:29.611250 4746 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/ceilometer-0"]
Jan 28 21:00:29 crc kubenswrapper[4746]: I0128 21:00:29.628569 4746 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ceilometer-0"]
Jan 28 21:00:29 crc kubenswrapper[4746]: I0128 21:00:29.631475 4746 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0"
Jan 28 21:00:29 crc kubenswrapper[4746]: I0128 21:00:29.638798 4746 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceilometer-scripts"
Jan 28 21:00:29 crc kubenswrapper[4746]: I0128 21:00:29.639037 4746 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceilometer-config-data"
Jan 28 21:00:29 crc kubenswrapper[4746]: I0128 21:00:29.661146 4746 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ceilometer-0"]
Jan 28 21:00:29 crc kubenswrapper[4746]: I0128 21:00:29.694360 4746 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/a613bc41-1308-4925-a2df-026f6622f0c2-internal-tls-certs\") pod \"cinder-api-0\" (UID: \"a613bc41-1308-4925-a2df-026f6622f0c2\") " pod="openstack/cinder-api-0"
Jan 28 21:00:29 crc kubenswrapper[4746]: I0128 21:00:29.694406 4746 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/a613bc41-1308-4925-a2df-026f6622f0c2-config-data\") pod \"cinder-api-0\" (UID: \"a613bc41-1308-4925-a2df-026f6622f0c2\") " pod="openstack/cinder-api-0"
Jan 28 21:00:29 crc kubenswrapper[4746]: I0128 21:00:29.694432 4746
reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/a613bc41-1308-4925-a2df-026f6622f0c2-combined-ca-bundle\") pod \"cinder-api-0\" (UID: \"a613bc41-1308-4925-a2df-026f6622f0c2\") " pod="openstack/cinder-api-0"
Jan 28 21:00:29 crc kubenswrapper[4746]: I0128 21:00:29.694476 4746 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/a613bc41-1308-4925-a2df-026f6622f0c2-config-data-custom\") pod \"cinder-api-0\" (UID: \"a613bc41-1308-4925-a2df-026f6622f0c2\") " pod="openstack/cinder-api-0"
Jan 28 21:00:29 crc kubenswrapper[4746]: I0128 21:00:29.694499 4746 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/a613bc41-1308-4925-a2df-026f6622f0c2-public-tls-certs\") pod \"cinder-api-0\" (UID: \"a613bc41-1308-4925-a2df-026f6622f0c2\") " pod="openstack/cinder-api-0"
Jan 28 21:00:29 crc kubenswrapper[4746]: I0128 21:00:29.694566 4746 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-kpcx9\" (UniqueName: \"kubernetes.io/projected/a613bc41-1308-4925-a2df-026f6622f0c2-kube-api-access-kpcx9\") pod \"cinder-api-0\" (UID: \"a613bc41-1308-4925-a2df-026f6622f0c2\") " pod="openstack/cinder-api-0"
Jan 28 21:00:29 crc kubenswrapper[4746]: I0128 21:00:29.694589 4746 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/a613bc41-1308-4925-a2df-026f6622f0c2-etc-machine-id\") pod \"cinder-api-0\" (UID: \"a613bc41-1308-4925-a2df-026f6622f0c2\") " pod="openstack/cinder-api-0"
Jan 28 21:00:29 crc kubenswrapper[4746]: I0128 21:00:29.694610 4746 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/a613bc41-1308-4925-a2df-026f6622f0c2-logs\") pod \"cinder-api-0\" (UID: \"a613bc41-1308-4925-a2df-026f6622f0c2\") " pod="openstack/cinder-api-0"
Jan 28 21:00:29 crc kubenswrapper[4746]: I0128 21:00:29.694640 4746 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/a613bc41-1308-4925-a2df-026f6622f0c2-scripts\") pod \"cinder-api-0\" (UID: \"a613bc41-1308-4925-a2df-026f6622f0c2\") " pod="openstack/cinder-api-0"
Jan 28 21:00:29 crc kubenswrapper[4746]: I0128 21:00:29.699759 4746 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/a613bc41-1308-4925-a2df-026f6622f0c2-scripts\") pod \"cinder-api-0\" (UID: \"a613bc41-1308-4925-a2df-026f6622f0c2\") " pod="openstack/cinder-api-0"
Jan 28 21:00:29 crc kubenswrapper[4746]: I0128 21:00:29.699779 4746 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/a613bc41-1308-4925-a2df-026f6622f0c2-internal-tls-certs\") pod \"cinder-api-0\" (UID: \"a613bc41-1308-4925-a2df-026f6622f0c2\") " pod="openstack/cinder-api-0"
Jan 28 21:00:29 crc kubenswrapper[4746]: I0128 21:00:29.700110 4746 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/a613bc41-1308-4925-a2df-026f6622f0c2-etc-machine-id\") pod \"cinder-api-0\" (UID: \"a613bc41-1308-4925-a2df-026f6622f0c2\") " pod="openstack/cinder-api-0"
Jan 28 21:00:29 crc kubenswrapper[4746]: I0128 21:00:29.700674 4746 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/a613bc41-1308-4925-a2df-026f6622f0c2-logs\") pod \"cinder-api-0\" (UID: \"a613bc41-1308-4925-a2df-026f6622f0c2\") " pod="openstack/cinder-api-0"
Jan 28 21:00:29 crc kubenswrapper[4746]: I0128 21:00:29.702239 4746 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/a613bc41-1308-4925-a2df-026f6622f0c2-public-tls-certs\") pod \"cinder-api-0\" (UID: \"a613bc41-1308-4925-a2df-026f6622f0c2\") " pod="openstack/cinder-api-0"
Jan 28 21:00:29 crc kubenswrapper[4746]: I0128 21:00:29.708021 4746 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/a613bc41-1308-4925-a2df-026f6622f0c2-combined-ca-bundle\") pod \"cinder-api-0\" (UID: \"a613bc41-1308-4925-a2df-026f6622f0c2\") " pod="openstack/cinder-api-0"
Jan 28 21:00:29 crc kubenswrapper[4746]: I0128 21:00:29.709041 4746 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/a613bc41-1308-4925-a2df-026f6622f0c2-config-data-custom\") pod \"cinder-api-0\" (UID: \"a613bc41-1308-4925-a2df-026f6622f0c2\") " pod="openstack/cinder-api-0"
Jan 28 21:00:29 crc kubenswrapper[4746]: I0128 21:00:29.709684 4746 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/a613bc41-1308-4925-a2df-026f6622f0c2-config-data\") pod \"cinder-api-0\" (UID: \"a613bc41-1308-4925-a2df-026f6622f0c2\") " pod="openstack/cinder-api-0"
Jan 28 21:00:29 crc kubenswrapper[4746]: I0128 21:00:29.724987 4746 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-kpcx9\" (UniqueName: \"kubernetes.io/projected/a613bc41-1308-4925-a2df-026f6622f0c2-kube-api-access-kpcx9\") pod \"cinder-api-0\" (UID: \"a613bc41-1308-4925-a2df-026f6622f0c2\") " pod="openstack/cinder-api-0"
Jan 28 21:00:29 crc kubenswrapper[4746]: I0128 21:00:29.796014 4746 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/0133f6a6-0179-41e7-a5b9-bc1661ea19e2-config-data\") pod \"ceilometer-0\" (UID: \"0133f6a6-0179-41e7-a5b9-bc1661ea19e2\") " pod="openstack/ceilometer-0"
Jan 28 21:00:29 crc kubenswrapper[4746]:
I0128 21:00:29.796112 4746 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/0133f6a6-0179-41e7-a5b9-bc1661ea19e2-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"0133f6a6-0179-41e7-a5b9-bc1661ea19e2\") " pod="openstack/ceilometer-0"
Jan 28 21:00:29 crc kubenswrapper[4746]: I0128 21:00:29.796176 4746 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/0133f6a6-0179-41e7-a5b9-bc1661ea19e2-scripts\") pod \"ceilometer-0\" (UID: \"0133f6a6-0179-41e7-a5b9-bc1661ea19e2\") " pod="openstack/ceilometer-0"
Jan 28 21:00:29 crc kubenswrapper[4746]: I0128 21:00:29.796191 4746 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/0133f6a6-0179-41e7-a5b9-bc1661ea19e2-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"0133f6a6-0179-41e7-a5b9-bc1661ea19e2\") " pod="openstack/ceilometer-0"
Jan 28 21:00:29 crc kubenswrapper[4746]: I0128 21:00:29.796277 4746 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-nbzz9\" (UniqueName: \"kubernetes.io/projected/0133f6a6-0179-41e7-a5b9-bc1661ea19e2-kube-api-access-nbzz9\") pod \"ceilometer-0\" (UID: \"0133f6a6-0179-41e7-a5b9-bc1661ea19e2\") " pod="openstack/ceilometer-0"
Jan 28 21:00:29 crc kubenswrapper[4746]: I0128 21:00:29.796348 4746 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/0133f6a6-0179-41e7-a5b9-bc1661ea19e2-run-httpd\") pod \"ceilometer-0\" (UID: \"0133f6a6-0179-41e7-a5b9-bc1661ea19e2\") " pod="openstack/ceilometer-0"
Jan 28 21:00:29 crc kubenswrapper[4746]: I0128 21:00:29.796370 4746 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/0133f6a6-0179-41e7-a5b9-bc1661ea19e2-log-httpd\") pod \"ceilometer-0\" (UID: \"0133f6a6-0179-41e7-a5b9-bc1661ea19e2\") " pod="openstack/ceilometer-0"
Jan 28 21:00:29 crc kubenswrapper[4746]: I0128 21:00:29.840119 4746 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-api-0"
Jan 28 21:00:29 crc kubenswrapper[4746]: I0128 21:00:29.902964 4746 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/0133f6a6-0179-41e7-a5b9-bc1661ea19e2-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"0133f6a6-0179-41e7-a5b9-bc1661ea19e2\") " pod="openstack/ceilometer-0"
Jan 28 21:00:29 crc kubenswrapper[4746]: I0128 21:00:29.903394 4746 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/0133f6a6-0179-41e7-a5b9-bc1661ea19e2-scripts\") pod \"ceilometer-0\" (UID: \"0133f6a6-0179-41e7-a5b9-bc1661ea19e2\") " pod="openstack/ceilometer-0"
Jan 28 21:00:29 crc kubenswrapper[4746]: I0128 21:00:29.903419 4746 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/0133f6a6-0179-41e7-a5b9-bc1661ea19e2-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"0133f6a6-0179-41e7-a5b9-bc1661ea19e2\") " pod="openstack/ceilometer-0"
Jan 28 21:00:29 crc kubenswrapper[4746]: I0128 21:00:29.903983 4746 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-nbzz9\" (UniqueName: \"kubernetes.io/projected/0133f6a6-0179-41e7-a5b9-bc1661ea19e2-kube-api-access-nbzz9\") pod \"ceilometer-0\" (UID: \"0133f6a6-0179-41e7-a5b9-bc1661ea19e2\") " pod="openstack/ceilometer-0"
Jan 28 21:00:29 crc kubenswrapper[4746]: I0128 21:00:29.904051 4746 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/0133f6a6-0179-41e7-a5b9-bc1661ea19e2-run-httpd\") pod \"ceilometer-0\" (UID: \"0133f6a6-0179-41e7-a5b9-bc1661ea19e2\") " pod="openstack/ceilometer-0"
Jan 28 21:00:29 crc kubenswrapper[4746]: I0128 21:00:29.904096 4746 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/0133f6a6-0179-41e7-a5b9-bc1661ea19e2-log-httpd\") pod \"ceilometer-0\" (UID: \"0133f6a6-0179-41e7-a5b9-bc1661ea19e2\") " pod="openstack/ceilometer-0"
Jan 28 21:00:29 crc kubenswrapper[4746]: I0128 21:00:29.904118 4746 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/0133f6a6-0179-41e7-a5b9-bc1661ea19e2-config-data\") pod \"ceilometer-0\" (UID: \"0133f6a6-0179-41e7-a5b9-bc1661ea19e2\") " pod="openstack/ceilometer-0"
Jan 28 21:00:29 crc kubenswrapper[4746]: I0128 21:00:29.907604 4746 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/0133f6a6-0179-41e7-a5b9-bc1661ea19e2-run-httpd\") pod \"ceilometer-0\" (UID: \"0133f6a6-0179-41e7-a5b9-bc1661ea19e2\") " pod="openstack/ceilometer-0"
Jan 28 21:00:29 crc kubenswrapper[4746]: I0128 21:00:29.908180 4746 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/0133f6a6-0179-41e7-a5b9-bc1661ea19e2-log-httpd\") pod \"ceilometer-0\" (UID: \"0133f6a6-0179-41e7-a5b9-bc1661ea19e2\") " pod="openstack/ceilometer-0"
Jan 28 21:00:29 crc kubenswrapper[4746]: I0128 21:00:29.908184 4746 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/0133f6a6-0179-41e7-a5b9-bc1661ea19e2-scripts\") pod \"ceilometer-0\" (UID: \"0133f6a6-0179-41e7-a5b9-bc1661ea19e2\") " pod="openstack/ceilometer-0"
Jan 28 21:00:29 crc kubenswrapper[4746]: I0128 21:00:29.909259 4746 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/0133f6a6-0179-41e7-a5b9-bc1661ea19e2-config-data\") pod \"ceilometer-0\" (UID: \"0133f6a6-0179-41e7-a5b9-bc1661ea19e2\") " pod="openstack/ceilometer-0"
Jan 28 21:00:29 crc kubenswrapper[4746]: I0128 21:00:29.911863 4746 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/0133f6a6-0179-41e7-a5b9-bc1661ea19e2-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"0133f6a6-0179-41e7-a5b9-bc1661ea19e2\") " pod="openstack/ceilometer-0"
Jan 28 21:00:29 crc kubenswrapper[4746]: I0128 21:00:29.917803 4746 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/0133f6a6-0179-41e7-a5b9-bc1661ea19e2-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"0133f6a6-0179-41e7-a5b9-bc1661ea19e2\") " pod="openstack/ceilometer-0"
Jan 28 21:00:29 crc kubenswrapper[4746]: I0128 21:00:29.927502 4746 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-nbzz9\" (UniqueName: \"kubernetes.io/projected/0133f6a6-0179-41e7-a5b9-bc1661ea19e2-kube-api-access-nbzz9\") pod \"ceilometer-0\" (UID: \"0133f6a6-0179-41e7-a5b9-bc1661ea19e2\") " pod="openstack/ceilometer-0"
Jan 28 21:00:29 crc kubenswrapper[4746]: I0128 21:00:29.995937 4746 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/cinder-scheduler-0"
Jan 28 21:00:30 crc kubenswrapper[4746]: I0128 21:00:30.114900 4746 util.go:30] "No sandbox for pod can be found.
Need to start a new one" pod="openstack/ceilometer-0"
Jan 28 21:00:30 crc kubenswrapper[4746]: I0128 21:00:30.204012 4746 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-keystone-listener-74d8954788-pqmtp" event={"ID":"9d6c401d-18ee-432b-992c-749c69887786","Type":"ContainerStarted","Data":"bde154be64433c8c91c767520f664730d01fb85d94bd826296c0842b41155841"}
Jan 28 21:00:30 crc kubenswrapper[4746]: I0128 21:00:30.236793 4746 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/barbican-keystone-listener-74d8954788-pqmtp" podStartSLOduration=12.400076954 podStartE2EDuration="15.236747177s" podCreationTimestamp="2026-01-28 21:00:15 +0000 UTC" firstStartedPulling="2026-01-28 21:00:25.767006736 +0000 UTC m=+1253.723193090" lastFinishedPulling="2026-01-28 21:00:28.603676959 +0000 UTC m=+1256.559863313" observedRunningTime="2026-01-28 21:00:30.225145044 +0000 UTC m=+1258.181331408" watchObservedRunningTime="2026-01-28 21:00:30.236747177 +0000 UTC m=+1258.192933531"
Jan 28 21:00:30 crc kubenswrapper[4746]: I0128 21:00:30.240518 4746 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-worker-54d56bfd95-zhg7t" event={"ID":"26db0906-ba06-4d40-b864-c7d956379296","Type":"ContainerStarted","Data":"73918518c2781eebdea006d2ff1b6fd2c911c81b0f032bf997ddac546a61a5d4"}
Jan 28 21:00:30 crc kubenswrapper[4746]: I0128 21:00:30.266904 4746 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/barbican-worker-54d56bfd95-zhg7t" podStartSLOduration=12.202465929 podStartE2EDuration="15.266885687s" podCreationTimestamp="2026-01-28 21:00:15 +0000 UTC" firstStartedPulling="2026-01-28 21:00:25.540784712 +0000 UTC m=+1253.496971066" lastFinishedPulling="2026-01-28 21:00:28.60520447 +0000 UTC m=+1256.561390824" observedRunningTime="2026-01-28 21:00:30.259674744 +0000 UTC m=+1258.215861098" watchObservedRunningTime="2026-01-28 21:00:30.266885687 +0000 UTC m=+1258.223072041"
Jan 28 21:00:30 crc kubenswrapper[4746]: I0128 21:00:30.448235 4746 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cinder-api-0"]
Jan 28 21:00:30 crc kubenswrapper[4746]: I0128 21:00:30.754927 4746 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ceilometer-0"]
Jan 28 21:00:30 crc kubenswrapper[4746]: I0128 21:00:30.800452 4746 provider.go:102] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider
Jan 28 21:00:30 crc kubenswrapper[4746]: I0128 21:00:30.855267 4746 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="7fdd9345-765c-4290-beee-f02752b34ee8" path="/var/lib/kubelet/pods/7fdd9345-765c-4290-beee-f02752b34ee8/volumes"
Jan 28 21:00:30 crc kubenswrapper[4746]: I0128 21:00:30.856218 4746 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="e9abc9de-4ca9-4ba4-afca-bb6b5bf50924" path="/var/lib/kubelet/pods/e9abc9de-4ca9-4ba4-afca-bb6b5bf50924/volumes"
Jan 28 21:00:30 crc kubenswrapper[4746]: I0128 21:00:30.936524 4746 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/placement-5755bdbcc4-rbmx8"
Jan 28 21:00:31 crc kubenswrapper[4746]: I0128 21:00:31.076074 4746 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/placement-5755bdbcc4-rbmx8"
Jan 28 21:00:31 crc kubenswrapper[4746]: I0128 21:00:31.083294 4746 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/cloudkitty-storageinit-lrs2d"
Jan 28 21:00:31 crc kubenswrapper[4746]: I0128 21:00:31.248224 4746 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"certs\" (UniqueName: \"kubernetes.io/projected/3252e1be-6e47-4264-a4d2-ba4099c9f3c0-certs\") pod \"3252e1be-6e47-4264-a4d2-ba4099c9f3c0\" (UID: \"3252e1be-6e47-4264-a4d2-ba4099c9f3c0\") "
Jan 28 21:00:31 crc kubenswrapper[4746]: I0128 21:00:31.248276 4746 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/3252e1be-6e47-4264-a4d2-ba4099c9f3c0-scripts\") pod \"3252e1be-6e47-4264-a4d2-ba4099c9f3c0\" (UID: \"3252e1be-6e47-4264-a4d2-ba4099c9f3c0\") "
Jan 28 21:00:31 crc kubenswrapper[4746]: I0128 21:00:31.248309 4746 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/3252e1be-6e47-4264-a4d2-ba4099c9f3c0-config-data\") pod \"3252e1be-6e47-4264-a4d2-ba4099c9f3c0\" (UID: \"3252e1be-6e47-4264-a4d2-ba4099c9f3c0\") "
Jan 28 21:00:31 crc kubenswrapper[4746]: I0128 21:00:31.248363 4746 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-l27gx\" (UniqueName: \"kubernetes.io/projected/3252e1be-6e47-4264-a4d2-ba4099c9f3c0-kube-api-access-l27gx\") pod \"3252e1be-6e47-4264-a4d2-ba4099c9f3c0\" (UID: \"3252e1be-6e47-4264-a4d2-ba4099c9f3c0\") "
Jan 28 21:00:31 crc kubenswrapper[4746]: I0128 21:00:31.248534 4746 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/3252e1be-6e47-4264-a4d2-ba4099c9f3c0-combined-ca-bundle\") pod \"3252e1be-6e47-4264-a4d2-ba4099c9f3c0\" (UID: \"3252e1be-6e47-4264-a4d2-ba4099c9f3c0\") "
Jan 28 21:00:31 crc kubenswrapper[4746]: I0128 21:00:31.253633 4746 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume
"kubernetes.io/secret/3252e1be-6e47-4264-a4d2-ba4099c9f3c0-scripts" (OuterVolumeSpecName: "scripts") pod "3252e1be-6e47-4264-a4d2-ba4099c9f3c0" (UID: "3252e1be-6e47-4264-a4d2-ba4099c9f3c0"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue ""
Jan 28 21:00:31 crc kubenswrapper[4746]: I0128 21:00:31.256282 4746 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/3252e1be-6e47-4264-a4d2-ba4099c9f3c0-kube-api-access-l27gx" (OuterVolumeSpecName: "kube-api-access-l27gx") pod "3252e1be-6e47-4264-a4d2-ba4099c9f3c0" (UID: "3252e1be-6e47-4264-a4d2-ba4099c9f3c0"). InnerVolumeSpecName "kube-api-access-l27gx". PluginName "kubernetes.io/projected", VolumeGidValue ""
Jan 28 21:00:31 crc kubenswrapper[4746]: I0128 21:00:31.271624 4746 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/3252e1be-6e47-4264-a4d2-ba4099c9f3c0-certs" (OuterVolumeSpecName: "certs") pod "3252e1be-6e47-4264-a4d2-ba4099c9f3c0" (UID: "3252e1be-6e47-4264-a4d2-ba4099c9f3c0"). InnerVolumeSpecName "certs". PluginName "kubernetes.io/projected", VolumeGidValue ""
Jan 28 21:00:31 crc kubenswrapper[4746]: I0128 21:00:31.278965 4746 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cloudkitty-storageinit-lrs2d" event={"ID":"3252e1be-6e47-4264-a4d2-ba4099c9f3c0","Type":"ContainerDied","Data":"502fae4e88115565750253ab6f9988cb004981959d59d2ba640fd0ef3586df94"}
Jan 28 21:00:31 crc kubenswrapper[4746]: I0128 21:00:31.279007 4746 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="502fae4e88115565750253ab6f9988cb004981959d59d2ba640fd0ef3586df94"
Jan 28 21:00:31 crc kubenswrapper[4746]: I0128 21:00:31.279063 4746 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/cloudkitty-storageinit-lrs2d"
Jan 28 21:00:31 crc kubenswrapper[4746]: I0128 21:00:31.286680 4746 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-api-0" event={"ID":"a613bc41-1308-4925-a2df-026f6622f0c2","Type":"ContainerStarted","Data":"21646c159936405e49ff1e9dd422b6e3f858f98de97db52ebe70b6e7083ced64"}
Jan 28 21:00:31 crc kubenswrapper[4746]: I0128 21:00:31.294874 4746 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/3252e1be-6e47-4264-a4d2-ba4099c9f3c0-config-data" (OuterVolumeSpecName: "config-data") pod "3252e1be-6e47-4264-a4d2-ba4099c9f3c0" (UID: "3252e1be-6e47-4264-a4d2-ba4099c9f3c0"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue ""
Jan 28 21:00:31 crc kubenswrapper[4746]: I0128 21:00:31.302772 4746 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"0133f6a6-0179-41e7-a5b9-bc1661ea19e2","Type":"ContainerStarted","Data":"c666985feedf61a37eaf1b4d4f447d602c3d39a59b346ef5c16c3eb5e854f661"}
Jan 28 21:00:31 crc kubenswrapper[4746]: I0128 21:00:31.308516 4746 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/3252e1be-6e47-4264-a4d2-ba4099c9f3c0-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "3252e1be-6e47-4264-a4d2-ba4099c9f3c0" (UID: "3252e1be-6e47-4264-a4d2-ba4099c9f3c0"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue ""
Jan 28 21:00:31 crc kubenswrapper[4746]: I0128 21:00:31.350532 4746 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/3252e1be-6e47-4264-a4d2-ba4099c9f3c0-combined-ca-bundle\") on node \"crc\" DevicePath \"\""
Jan 28 21:00:31 crc kubenswrapper[4746]: I0128 21:00:31.350828 4746 reconciler_common.go:293] "Volume detached for volume \"certs\" (UniqueName: \"kubernetes.io/projected/3252e1be-6e47-4264-a4d2-ba4099c9f3c0-certs\") on node \"crc\" DevicePath \"\""
Jan 28 21:00:31 crc kubenswrapper[4746]: I0128 21:00:31.350838 4746 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/3252e1be-6e47-4264-a4d2-ba4099c9f3c0-scripts\") on node \"crc\" DevicePath \"\""
Jan 28 21:00:31 crc kubenswrapper[4746]: I0128 21:00:31.350845 4746 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/3252e1be-6e47-4264-a4d2-ba4099c9f3c0-config-data\") on node \"crc\" DevicePath \"\""
Jan 28 21:00:31 crc kubenswrapper[4746]: I0128 21:00:31.350854 4746 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-l27gx\" (UniqueName: \"kubernetes.io/projected/3252e1be-6e47-4264-a4d2-ba4099c9f3c0-kube-api-access-l27gx\") on node \"crc\" DevicePath \"\""
Jan 28 21:00:31 crc kubenswrapper[4746]: I0128 21:00:31.450261 4746 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-5c9776ccc5-7fncg"]
Jan 28 21:00:31 crc kubenswrapper[4746]: I0128 21:00:31.452429 4746 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/dnsmasq-dns-5c9776ccc5-7fncg" podUID="d730a144-0107-409b-9abe-1e316143afc9" containerName="dnsmasq-dns" containerID="cri-o://3ff294de60bb526f0e36528008951905cb66f06f67099f304d07a2b85c2f1ab1" gracePeriod=10
Jan 28 21:00:31 crc kubenswrapper[4746]: I0128 21:00:31.470169 4746 kubelet.go:2421] "SyncLoop ADD"
source="api" pods=["openstack/cloudkitty-proc-0"]
Jan 28 21:00:31 crc kubenswrapper[4746]: E0128 21:00:31.470857 4746 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="3252e1be-6e47-4264-a4d2-ba4099c9f3c0" containerName="cloudkitty-storageinit"
Jan 28 21:00:31 crc kubenswrapper[4746]: I0128 21:00:31.470874 4746 state_mem.go:107] "Deleted CPUSet assignment" podUID="3252e1be-6e47-4264-a4d2-ba4099c9f3c0" containerName="cloudkitty-storageinit"
Jan 28 21:00:31 crc kubenswrapper[4746]: I0128 21:00:31.471191 4746 memory_manager.go:354] "RemoveStaleState removing state" podUID="3252e1be-6e47-4264-a4d2-ba4099c9f3c0" containerName="cloudkitty-storageinit"
Jan 28 21:00:31 crc kubenswrapper[4746]: I0128 21:00:31.472267 4746 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/cloudkitty-proc-0"
Jan 28 21:00:31 crc kubenswrapper[4746]: I0128 21:00:31.483730 4746 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cloudkitty-proc-config-data"
Jan 28 21:00:31 crc kubenswrapper[4746]: I0128 21:00:31.484894 4746 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-67bdc55879-wpkx6"]
Jan 28 21:00:31 crc kubenswrapper[4746]: I0128 21:00:31.491311 4746 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-67bdc55879-wpkx6"
Jan 28 21:00:31 crc kubenswrapper[4746]: I0128 21:00:31.570142 4746 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-67bdc55879-wpkx6"]
Jan 28 21:00:31 crc kubenswrapper[4746]: I0128 21:00:31.571968 4746 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/8bc2ca65-f7aa-42c6-ac4f-accc7a46a0c7-ovsdbserver-nb\") pod \"dnsmasq-dns-67bdc55879-wpkx6\" (UID: \"8bc2ca65-f7aa-42c6-ac4f-accc7a46a0c7\") " pod="openstack/dnsmasq-dns-67bdc55879-wpkx6"
Jan 28 21:00:31 crc kubenswrapper[4746]: I0128 21:00:31.580016 4746 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/0bb735bc-29e9-4a5e-b6b4-a643776d9e44-combined-ca-bundle\") pod \"cloudkitty-proc-0\" (UID: \"0bb735bc-29e9-4a5e-b6b4-a643776d9e44\") " pod="openstack/cloudkitty-proc-0"
Jan 28 21:00:31 crc kubenswrapper[4746]: I0128 21:00:31.583416 4746 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/0bb735bc-29e9-4a5e-b6b4-a643776d9e44-config-data-custom\") pod \"cloudkitty-proc-0\" (UID: \"0bb735bc-29e9-4a5e-b6b4-a643776d9e44\") " pod="openstack/cloudkitty-proc-0"
Jan 28 21:00:31 crc kubenswrapper[4746]: I0128 21:00:31.583683 4746 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/8bc2ca65-f7aa-42c6-ac4f-accc7a46a0c7-dns-swift-storage-0\") pod \"dnsmasq-dns-67bdc55879-wpkx6\" (UID: \"8bc2ca65-f7aa-42c6-ac4f-accc7a46a0c7\") " pod="openstack/dnsmasq-dns-67bdc55879-wpkx6"
Jan 28 21:00:31 crc kubenswrapper[4746]: I0128 21:00:31.583806 4746 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/0bb735bc-29e9-4a5e-b6b4-a643776d9e44-scripts\") pod \"cloudkitty-proc-0\" (UID: \"0bb735bc-29e9-4a5e-b6b4-a643776d9e44\") " pod="openstack/cloudkitty-proc-0"
Jan 28 21:00:31 crc kubenswrapper[4746]: I0128 21:00:31.583958 4746 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-f4qfz\" (UniqueName: \"kubernetes.io/projected/8bc2ca65-f7aa-42c6-ac4f-accc7a46a0c7-kube-api-access-f4qfz\") pod \"dnsmasq-dns-67bdc55879-wpkx6\" (UID: \"8bc2ca65-f7aa-42c6-ac4f-accc7a46a0c7\") " pod="openstack/dnsmasq-dns-67bdc55879-wpkx6"
Jan 28 21:00:31 crc kubenswrapper[4746]: I0128 21:00:31.584122 4746 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/8bc2ca65-f7aa-42c6-ac4f-accc7a46a0c7-dns-svc\") pod \"dnsmasq-dns-67bdc55879-wpkx6\" (UID: \"8bc2ca65-f7aa-42c6-ac4f-accc7a46a0c7\") " pod="openstack/dnsmasq-dns-67bdc55879-wpkx6"
Jan 28 21:00:31 crc kubenswrapper[4746]: I0128 21:00:31.584203 4746 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/0bb735bc-29e9-4a5e-b6b4-a643776d9e44-config-data\") pod \"cloudkitty-proc-0\" (UID: \"0bb735bc-29e9-4a5e-b6b4-a643776d9e44\") " pod="openstack/cloudkitty-proc-0"
Jan 28 21:00:31 crc kubenswrapper[4746]: I0128 21:00:31.584286 4746 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-rjnsz\" (UniqueName: \"kubernetes.io/projected/0bb735bc-29e9-4a5e-b6b4-a643776d9e44-kube-api-access-rjnsz\") pod \"cloudkitty-proc-0\" (UID: \"0bb735bc-29e9-4a5e-b6b4-a643776d9e44\") " pod="openstack/cloudkitty-proc-0"
Jan 28 21:00:31 crc kubenswrapper[4746]: I0128 21:00:31.584389 4746 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"certs\" (UniqueName: \"kubernetes.io/projected/0bb735bc-29e9-4a5e-b6b4-a643776d9e44-certs\") pod \"cloudkitty-proc-0\" (UID: \"0bb735bc-29e9-4a5e-b6b4-a643776d9e44\") " pod="openstack/cloudkitty-proc-0"
Jan 28 21:00:31 crc kubenswrapper[4746]: I0128 21:00:31.594464 4746 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/8bc2ca65-f7aa-42c6-ac4f-accc7a46a0c7-ovsdbserver-sb\") pod \"dnsmasq-dns-67bdc55879-wpkx6\" (UID: \"8bc2ca65-f7aa-42c6-ac4f-accc7a46a0c7\") " pod="openstack/dnsmasq-dns-67bdc55879-wpkx6"
Jan 28 21:00:31 crc kubenswrapper[4746]: I0128 21:00:31.594657 4746 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/8bc2ca65-f7aa-42c6-ac4f-accc7a46a0c7-config\") pod \"dnsmasq-dns-67bdc55879-wpkx6\" (UID: \"8bc2ca65-f7aa-42c6-ac4f-accc7a46a0c7\") " pod="openstack/dnsmasq-dns-67bdc55879-wpkx6"
Jan 28 21:00:31 crc kubenswrapper[4746]: I0128 21:00:31.640168 4746 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cloudkitty-proc-0"]
Jan 28 21:00:31 crc kubenswrapper[4746]: I0128 21:00:31.697512 4746 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/8bc2ca65-f7aa-42c6-ac4f-accc7a46a0c7-config\") pod \"dnsmasq-dns-67bdc55879-wpkx6\" (UID: \"8bc2ca65-f7aa-42c6-ac4f-accc7a46a0c7\") " pod="openstack/dnsmasq-dns-67bdc55879-wpkx6"
Jan 28 21:00:31 crc kubenswrapper[4746]: I0128 21:00:31.697603 4746 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/8bc2ca65-f7aa-42c6-ac4f-accc7a46a0c7-ovsdbserver-nb\") pod \"dnsmasq-dns-67bdc55879-wpkx6\" (UID: \"8bc2ca65-f7aa-42c6-ac4f-accc7a46a0c7\") " pod="openstack/dnsmasq-dns-67bdc55879-wpkx6"
Jan 28 21:00:31 crc kubenswrapper[4746]: I0128 21:00:31.697646 4746 reconciler_common.go:218]
"operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/0bb735bc-29e9-4a5e-b6b4-a643776d9e44-combined-ca-bundle\") pod \"cloudkitty-proc-0\" (UID: \"0bb735bc-29e9-4a5e-b6b4-a643776d9e44\") " pod="openstack/cloudkitty-proc-0" Jan 28 21:00:31 crc kubenswrapper[4746]: I0128 21:00:31.697677 4746 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/0bb735bc-29e9-4a5e-b6b4-a643776d9e44-config-data-custom\") pod \"cloudkitty-proc-0\" (UID: \"0bb735bc-29e9-4a5e-b6b4-a643776d9e44\") " pod="openstack/cloudkitty-proc-0" Jan 28 21:00:31 crc kubenswrapper[4746]: I0128 21:00:31.697711 4746 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/8bc2ca65-f7aa-42c6-ac4f-accc7a46a0c7-dns-swift-storage-0\") pod \"dnsmasq-dns-67bdc55879-wpkx6\" (UID: \"8bc2ca65-f7aa-42c6-ac4f-accc7a46a0c7\") " pod="openstack/dnsmasq-dns-67bdc55879-wpkx6" Jan 28 21:00:31 crc kubenswrapper[4746]: I0128 21:00:31.697735 4746 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/0bb735bc-29e9-4a5e-b6b4-a643776d9e44-scripts\") pod \"cloudkitty-proc-0\" (UID: \"0bb735bc-29e9-4a5e-b6b4-a643776d9e44\") " pod="openstack/cloudkitty-proc-0" Jan 28 21:00:31 crc kubenswrapper[4746]: I0128 21:00:31.697762 4746 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-f4qfz\" (UniqueName: \"kubernetes.io/projected/8bc2ca65-f7aa-42c6-ac4f-accc7a46a0c7-kube-api-access-f4qfz\") pod \"dnsmasq-dns-67bdc55879-wpkx6\" (UID: \"8bc2ca65-f7aa-42c6-ac4f-accc7a46a0c7\") " pod="openstack/dnsmasq-dns-67bdc55879-wpkx6" Jan 28 21:00:31 crc kubenswrapper[4746]: I0128 21:00:31.697797 4746 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: 
\"kubernetes.io/configmap/8bc2ca65-f7aa-42c6-ac4f-accc7a46a0c7-dns-svc\") pod \"dnsmasq-dns-67bdc55879-wpkx6\" (UID: \"8bc2ca65-f7aa-42c6-ac4f-accc7a46a0c7\") " pod="openstack/dnsmasq-dns-67bdc55879-wpkx6" Jan 28 21:00:31 crc kubenswrapper[4746]: I0128 21:00:31.697813 4746 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/0bb735bc-29e9-4a5e-b6b4-a643776d9e44-config-data\") pod \"cloudkitty-proc-0\" (UID: \"0bb735bc-29e9-4a5e-b6b4-a643776d9e44\") " pod="openstack/cloudkitty-proc-0" Jan 28 21:00:31 crc kubenswrapper[4746]: I0128 21:00:31.697832 4746 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-rjnsz\" (UniqueName: \"kubernetes.io/projected/0bb735bc-29e9-4a5e-b6b4-a643776d9e44-kube-api-access-rjnsz\") pod \"cloudkitty-proc-0\" (UID: \"0bb735bc-29e9-4a5e-b6b4-a643776d9e44\") " pod="openstack/cloudkitty-proc-0" Jan 28 21:00:31 crc kubenswrapper[4746]: I0128 21:00:31.697860 4746 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"certs\" (UniqueName: \"kubernetes.io/projected/0bb735bc-29e9-4a5e-b6b4-a643776d9e44-certs\") pod \"cloudkitty-proc-0\" (UID: \"0bb735bc-29e9-4a5e-b6b4-a643776d9e44\") " pod="openstack/cloudkitty-proc-0" Jan 28 21:00:31 crc kubenswrapper[4746]: I0128 21:00:31.697891 4746 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/8bc2ca65-f7aa-42c6-ac4f-accc7a46a0c7-ovsdbserver-sb\") pod \"dnsmasq-dns-67bdc55879-wpkx6\" (UID: \"8bc2ca65-f7aa-42c6-ac4f-accc7a46a0c7\") " pod="openstack/dnsmasq-dns-67bdc55879-wpkx6" Jan 28 21:00:31 crc kubenswrapper[4746]: I0128 21:00:31.699051 4746 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/8bc2ca65-f7aa-42c6-ac4f-accc7a46a0c7-ovsdbserver-sb\") pod \"dnsmasq-dns-67bdc55879-wpkx6\" (UID: 
\"8bc2ca65-f7aa-42c6-ac4f-accc7a46a0c7\") " pod="openstack/dnsmasq-dns-67bdc55879-wpkx6" Jan 28 21:00:31 crc kubenswrapper[4746]: I0128 21:00:31.700024 4746 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/8bc2ca65-f7aa-42c6-ac4f-accc7a46a0c7-dns-swift-storage-0\") pod \"dnsmasq-dns-67bdc55879-wpkx6\" (UID: \"8bc2ca65-f7aa-42c6-ac4f-accc7a46a0c7\") " pod="openstack/dnsmasq-dns-67bdc55879-wpkx6" Jan 28 21:00:31 crc kubenswrapper[4746]: I0128 21:00:31.702223 4746 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/8bc2ca65-f7aa-42c6-ac4f-accc7a46a0c7-dns-svc\") pod \"dnsmasq-dns-67bdc55879-wpkx6\" (UID: \"8bc2ca65-f7aa-42c6-ac4f-accc7a46a0c7\") " pod="openstack/dnsmasq-dns-67bdc55879-wpkx6" Jan 28 21:00:31 crc kubenswrapper[4746]: I0128 21:00:31.718451 4746 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/8bc2ca65-f7aa-42c6-ac4f-accc7a46a0c7-ovsdbserver-nb\") pod \"dnsmasq-dns-67bdc55879-wpkx6\" (UID: \"8bc2ca65-f7aa-42c6-ac4f-accc7a46a0c7\") " pod="openstack/dnsmasq-dns-67bdc55879-wpkx6" Jan 28 21:00:31 crc kubenswrapper[4746]: I0128 21:00:31.718542 4746 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/8bc2ca65-f7aa-42c6-ac4f-accc7a46a0c7-config\") pod \"dnsmasq-dns-67bdc55879-wpkx6\" (UID: \"8bc2ca65-f7aa-42c6-ac4f-accc7a46a0c7\") " pod="openstack/dnsmasq-dns-67bdc55879-wpkx6" Jan 28 21:00:31 crc kubenswrapper[4746]: I0128 21:00:31.731294 4746 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/0bb735bc-29e9-4a5e-b6b4-a643776d9e44-scripts\") pod \"cloudkitty-proc-0\" (UID: \"0bb735bc-29e9-4a5e-b6b4-a643776d9e44\") " pod="openstack/cloudkitty-proc-0" Jan 28 21:00:31 crc kubenswrapper[4746]: I0128 21:00:31.735322 
4746 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/0bb735bc-29e9-4a5e-b6b4-a643776d9e44-combined-ca-bundle\") pod \"cloudkitty-proc-0\" (UID: \"0bb735bc-29e9-4a5e-b6b4-a643776d9e44\") " pod="openstack/cloudkitty-proc-0" Jan 28 21:00:31 crc kubenswrapper[4746]: I0128 21:00:31.753858 4746 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/0bb735bc-29e9-4a5e-b6b4-a643776d9e44-config-data-custom\") pod \"cloudkitty-proc-0\" (UID: \"0bb735bc-29e9-4a5e-b6b4-a643776d9e44\") " pod="openstack/cloudkitty-proc-0" Jan 28 21:00:31 crc kubenswrapper[4746]: I0128 21:00:31.768601 4746 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/0bb735bc-29e9-4a5e-b6b4-a643776d9e44-config-data\") pod \"cloudkitty-proc-0\" (UID: \"0bb735bc-29e9-4a5e-b6b4-a643776d9e44\") " pod="openstack/cloudkitty-proc-0" Jan 28 21:00:31 crc kubenswrapper[4746]: I0128 21:00:31.769288 4746 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-f4qfz\" (UniqueName: \"kubernetes.io/projected/8bc2ca65-f7aa-42c6-ac4f-accc7a46a0c7-kube-api-access-f4qfz\") pod \"dnsmasq-dns-67bdc55879-wpkx6\" (UID: \"8bc2ca65-f7aa-42c6-ac4f-accc7a46a0c7\") " pod="openstack/dnsmasq-dns-67bdc55879-wpkx6" Jan 28 21:00:31 crc kubenswrapper[4746]: I0128 21:00:31.772699 4746 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"certs\" (UniqueName: \"kubernetes.io/projected/0bb735bc-29e9-4a5e-b6b4-a643776d9e44-certs\") pod \"cloudkitty-proc-0\" (UID: \"0bb735bc-29e9-4a5e-b6b4-a643776d9e44\") " pod="openstack/cloudkitty-proc-0" Jan 28 21:00:31 crc kubenswrapper[4746]: I0128 21:00:31.772775 4746 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/cloudkitty-api-0"] Jan 28 21:00:31 crc kubenswrapper[4746]: I0128 21:00:31.774658 4746 util.go:30] "No sandbox for pod 
can be found. Need to start a new one" pod="openstack/cloudkitty-api-0" Jan 28 21:00:31 crc kubenswrapper[4746]: I0128 21:00:31.776281 4746 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-rjnsz\" (UniqueName: \"kubernetes.io/projected/0bb735bc-29e9-4a5e-b6b4-a643776d9e44-kube-api-access-rjnsz\") pod \"cloudkitty-proc-0\" (UID: \"0bb735bc-29e9-4a5e-b6b4-a643776d9e44\") " pod="openstack/cloudkitty-proc-0" Jan 28 21:00:31 crc kubenswrapper[4746]: I0128 21:00:31.777394 4746 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cloudkitty-api-config-data" Jan 28 21:00:31 crc kubenswrapper[4746]: I0128 21:00:31.789924 4746 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cloudkitty-api-0"] Jan 28 21:00:31 crc kubenswrapper[4746]: I0128 21:00:31.870611 4746 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/cloudkitty-proc-0" Jan 28 21:00:31 crc kubenswrapper[4746]: I0128 21:00:31.908185 4746 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/716bc97d-b62f-4d0a-9478-b522f2fd26dd-logs\") pod \"cloudkitty-api-0\" (UID: \"716bc97d-b62f-4d0a-9478-b522f2fd26dd\") " pod="openstack/cloudkitty-api-0" Jan 28 21:00:31 crc kubenswrapper[4746]: I0128 21:00:31.908246 4746 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"certs\" (UniqueName: \"kubernetes.io/projected/716bc97d-b62f-4d0a-9478-b522f2fd26dd-certs\") pod \"cloudkitty-api-0\" (UID: \"716bc97d-b62f-4d0a-9478-b522f2fd26dd\") " pod="openstack/cloudkitty-api-0" Jan 28 21:00:31 crc kubenswrapper[4746]: I0128 21:00:31.908272 4746 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/716bc97d-b62f-4d0a-9478-b522f2fd26dd-config-data-custom\") pod \"cloudkitty-api-0\" (UID: 
\"716bc97d-b62f-4d0a-9478-b522f2fd26dd\") " pod="openstack/cloudkitty-api-0" Jan 28 21:00:31 crc kubenswrapper[4746]: I0128 21:00:31.908373 4746 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-tt6g9\" (UniqueName: \"kubernetes.io/projected/716bc97d-b62f-4d0a-9478-b522f2fd26dd-kube-api-access-tt6g9\") pod \"cloudkitty-api-0\" (UID: \"716bc97d-b62f-4d0a-9478-b522f2fd26dd\") " pod="openstack/cloudkitty-api-0" Jan 28 21:00:31 crc kubenswrapper[4746]: I0128 21:00:31.908422 4746 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/716bc97d-b62f-4d0a-9478-b522f2fd26dd-combined-ca-bundle\") pod \"cloudkitty-api-0\" (UID: \"716bc97d-b62f-4d0a-9478-b522f2fd26dd\") " pod="openstack/cloudkitty-api-0" Jan 28 21:00:31 crc kubenswrapper[4746]: I0128 21:00:31.908448 4746 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/716bc97d-b62f-4d0a-9478-b522f2fd26dd-scripts\") pod \"cloudkitty-api-0\" (UID: \"716bc97d-b62f-4d0a-9478-b522f2fd26dd\") " pod="openstack/cloudkitty-api-0" Jan 28 21:00:31 crc kubenswrapper[4746]: I0128 21:00:31.908462 4746 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/716bc97d-b62f-4d0a-9478-b522f2fd26dd-config-data\") pod \"cloudkitty-api-0\" (UID: \"716bc97d-b62f-4d0a-9478-b522f2fd26dd\") " pod="openstack/cloudkitty-api-0" Jan 28 21:00:31 crc kubenswrapper[4746]: I0128 21:00:31.914739 4746 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-67bdc55879-wpkx6" Jan 28 21:00:32 crc kubenswrapper[4746]: I0128 21:00:32.012545 4746 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/716bc97d-b62f-4d0a-9478-b522f2fd26dd-combined-ca-bundle\") pod \"cloudkitty-api-0\" (UID: \"716bc97d-b62f-4d0a-9478-b522f2fd26dd\") " pod="openstack/cloudkitty-api-0" Jan 28 21:00:32 crc kubenswrapper[4746]: I0128 21:00:32.012626 4746 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/716bc97d-b62f-4d0a-9478-b522f2fd26dd-scripts\") pod \"cloudkitty-api-0\" (UID: \"716bc97d-b62f-4d0a-9478-b522f2fd26dd\") " pod="openstack/cloudkitty-api-0" Jan 28 21:00:32 crc kubenswrapper[4746]: I0128 21:00:32.012653 4746 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/716bc97d-b62f-4d0a-9478-b522f2fd26dd-config-data\") pod \"cloudkitty-api-0\" (UID: \"716bc97d-b62f-4d0a-9478-b522f2fd26dd\") " pod="openstack/cloudkitty-api-0" Jan 28 21:00:32 crc kubenswrapper[4746]: I0128 21:00:32.012734 4746 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/716bc97d-b62f-4d0a-9478-b522f2fd26dd-logs\") pod \"cloudkitty-api-0\" (UID: \"716bc97d-b62f-4d0a-9478-b522f2fd26dd\") " pod="openstack/cloudkitty-api-0" Jan 28 21:00:32 crc kubenswrapper[4746]: I0128 21:00:32.012764 4746 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"certs\" (UniqueName: \"kubernetes.io/projected/716bc97d-b62f-4d0a-9478-b522f2fd26dd-certs\") pod \"cloudkitty-api-0\" (UID: \"716bc97d-b62f-4d0a-9478-b522f2fd26dd\") " pod="openstack/cloudkitty-api-0" Jan 28 21:00:32 crc kubenswrapper[4746]: I0128 21:00:32.012798 4746 reconciler_common.go:218] "operationExecutor.MountVolume started for volume 
\"config-data-custom\" (UniqueName: \"kubernetes.io/secret/716bc97d-b62f-4d0a-9478-b522f2fd26dd-config-data-custom\") pod \"cloudkitty-api-0\" (UID: \"716bc97d-b62f-4d0a-9478-b522f2fd26dd\") " pod="openstack/cloudkitty-api-0" Jan 28 21:00:32 crc kubenswrapper[4746]: I0128 21:00:32.012925 4746 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-tt6g9\" (UniqueName: \"kubernetes.io/projected/716bc97d-b62f-4d0a-9478-b522f2fd26dd-kube-api-access-tt6g9\") pod \"cloudkitty-api-0\" (UID: \"716bc97d-b62f-4d0a-9478-b522f2fd26dd\") " pod="openstack/cloudkitty-api-0" Jan 28 21:00:32 crc kubenswrapper[4746]: I0128 21:00:32.021601 4746 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/716bc97d-b62f-4d0a-9478-b522f2fd26dd-logs\") pod \"cloudkitty-api-0\" (UID: \"716bc97d-b62f-4d0a-9478-b522f2fd26dd\") " pod="openstack/cloudkitty-api-0" Jan 28 21:00:32 crc kubenswrapper[4746]: I0128 21:00:32.023372 4746 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/716bc97d-b62f-4d0a-9478-b522f2fd26dd-combined-ca-bundle\") pod \"cloudkitty-api-0\" (UID: \"716bc97d-b62f-4d0a-9478-b522f2fd26dd\") " pod="openstack/cloudkitty-api-0" Jan 28 21:00:32 crc kubenswrapper[4746]: I0128 21:00:32.031586 4746 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/716bc97d-b62f-4d0a-9478-b522f2fd26dd-scripts\") pod \"cloudkitty-api-0\" (UID: \"716bc97d-b62f-4d0a-9478-b522f2fd26dd\") " pod="openstack/cloudkitty-api-0" Jan 28 21:00:32 crc kubenswrapper[4746]: I0128 21:00:32.036879 4746 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/716bc97d-b62f-4d0a-9478-b522f2fd26dd-config-data-custom\") pod \"cloudkitty-api-0\" (UID: \"716bc97d-b62f-4d0a-9478-b522f2fd26dd\") " 
pod="openstack/cloudkitty-api-0" Jan 28 21:00:32 crc kubenswrapper[4746]: I0128 21:00:32.037458 4746 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"certs\" (UniqueName: \"kubernetes.io/projected/716bc97d-b62f-4d0a-9478-b522f2fd26dd-certs\") pod \"cloudkitty-api-0\" (UID: \"716bc97d-b62f-4d0a-9478-b522f2fd26dd\") " pod="openstack/cloudkitty-api-0" Jan 28 21:00:32 crc kubenswrapper[4746]: I0128 21:00:32.069171 4746 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/716bc97d-b62f-4d0a-9478-b522f2fd26dd-config-data\") pod \"cloudkitty-api-0\" (UID: \"716bc97d-b62f-4d0a-9478-b522f2fd26dd\") " pod="openstack/cloudkitty-api-0" Jan 28 21:00:32 crc kubenswrapper[4746]: I0128 21:00:32.077836 4746 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-tt6g9\" (UniqueName: \"kubernetes.io/projected/716bc97d-b62f-4d0a-9478-b522f2fd26dd-kube-api-access-tt6g9\") pod \"cloudkitty-api-0\" (UID: \"716bc97d-b62f-4d0a-9478-b522f2fd26dd\") " pod="openstack/cloudkitty-api-0" Jan 28 21:00:32 crc kubenswrapper[4746]: I0128 21:00:32.260555 4746 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/cloudkitty-api-0" Jan 28 21:00:32 crc kubenswrapper[4746]: I0128 21:00:32.376817 4746 generic.go:334] "Generic (PLEG): container finished" podID="d730a144-0107-409b-9abe-1e316143afc9" containerID="3ff294de60bb526f0e36528008951905cb66f06f67099f304d07a2b85c2f1ab1" exitCode=0 Jan 28 21:00:32 crc kubenswrapper[4746]: I0128 21:00:32.376943 4746 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-5c9776ccc5-7fncg" event={"ID":"d730a144-0107-409b-9abe-1e316143afc9","Type":"ContainerDied","Data":"3ff294de60bb526f0e36528008951905cb66f06f67099f304d07a2b85c2f1ab1"} Jan 28 21:00:32 crc kubenswrapper[4746]: I0128 21:00:32.406940 4746 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-api-0" event={"ID":"a613bc41-1308-4925-a2df-026f6622f0c2","Type":"ContainerStarted","Data":"d47c1ce0f28409cd7438804aca26c6904e9e59ea5c96d2a48d7b64f77b54a73e"} Jan 28 21:00:32 crc kubenswrapper[4746]: I0128 21:00:32.407952 4746 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-5c9776ccc5-7fncg" Jan 28 21:00:32 crc kubenswrapper[4746]: I0128 21:00:32.538995 4746 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/d730a144-0107-409b-9abe-1e316143afc9-ovsdbserver-nb\") pod \"d730a144-0107-409b-9abe-1e316143afc9\" (UID: \"d730a144-0107-409b-9abe-1e316143afc9\") " Jan 28 21:00:32 crc kubenswrapper[4746]: I0128 21:00:32.539156 4746 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-hqdnm\" (UniqueName: \"kubernetes.io/projected/d730a144-0107-409b-9abe-1e316143afc9-kube-api-access-hqdnm\") pod \"d730a144-0107-409b-9abe-1e316143afc9\" (UID: \"d730a144-0107-409b-9abe-1e316143afc9\") " Jan 28 21:00:32 crc kubenswrapper[4746]: I0128 21:00:32.539257 4746 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/d730a144-0107-409b-9abe-1e316143afc9-ovsdbserver-sb\") pod \"d730a144-0107-409b-9abe-1e316143afc9\" (UID: \"d730a144-0107-409b-9abe-1e316143afc9\") " Jan 28 21:00:32 crc kubenswrapper[4746]: I0128 21:00:32.539315 4746 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/d730a144-0107-409b-9abe-1e316143afc9-dns-svc\") pod \"d730a144-0107-409b-9abe-1e316143afc9\" (UID: \"d730a144-0107-409b-9abe-1e316143afc9\") " Jan 28 21:00:32 crc kubenswrapper[4746]: I0128 21:00:32.539337 4746 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/d730a144-0107-409b-9abe-1e316143afc9-config\") pod \"d730a144-0107-409b-9abe-1e316143afc9\" (UID: \"d730a144-0107-409b-9abe-1e316143afc9\") " Jan 28 21:00:32 crc kubenswrapper[4746]: I0128 21:00:32.539355 4746 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-swift-storage-0\" 
(UniqueName: \"kubernetes.io/configmap/d730a144-0107-409b-9abe-1e316143afc9-dns-swift-storage-0\") pod \"d730a144-0107-409b-9abe-1e316143afc9\" (UID: \"d730a144-0107-409b-9abe-1e316143afc9\") " Jan 28 21:00:32 crc kubenswrapper[4746]: I0128 21:00:32.560356 4746 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/d730a144-0107-409b-9abe-1e316143afc9-kube-api-access-hqdnm" (OuterVolumeSpecName: "kube-api-access-hqdnm") pod "d730a144-0107-409b-9abe-1e316143afc9" (UID: "d730a144-0107-409b-9abe-1e316143afc9"). InnerVolumeSpecName "kube-api-access-hqdnm". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 28 21:00:32 crc kubenswrapper[4746]: I0128 21:00:32.644359 4746 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-hqdnm\" (UniqueName: \"kubernetes.io/projected/d730a144-0107-409b-9abe-1e316143afc9-kube-api-access-hqdnm\") on node \"crc\" DevicePath \"\"" Jan 28 21:00:32 crc kubenswrapper[4746]: I0128 21:00:32.661716 4746 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/d730a144-0107-409b-9abe-1e316143afc9-ovsdbserver-sb" (OuterVolumeSpecName: "ovsdbserver-sb") pod "d730a144-0107-409b-9abe-1e316143afc9" (UID: "d730a144-0107-409b-9abe-1e316143afc9"). InnerVolumeSpecName "ovsdbserver-sb". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 28 21:00:32 crc kubenswrapper[4746]: I0128 21:00:32.699629 4746 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/d730a144-0107-409b-9abe-1e316143afc9-dns-swift-storage-0" (OuterVolumeSpecName: "dns-swift-storage-0") pod "d730a144-0107-409b-9abe-1e316143afc9" (UID: "d730a144-0107-409b-9abe-1e316143afc9"). InnerVolumeSpecName "dns-swift-storage-0". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 28 21:00:32 crc kubenswrapper[4746]: I0128 21:00:32.750955 4746 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/d730a144-0107-409b-9abe-1e316143afc9-ovsdbserver-sb\") on node \"crc\" DevicePath \"\"" Jan 28 21:00:32 crc kubenswrapper[4746]: I0128 21:00:32.750983 4746 reconciler_common.go:293] "Volume detached for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/d730a144-0107-409b-9abe-1e316143afc9-dns-swift-storage-0\") on node \"crc\" DevicePath \"\"" Jan 28 21:00:32 crc kubenswrapper[4746]: I0128 21:00:32.755589 4746 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/d730a144-0107-409b-9abe-1e316143afc9-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "d730a144-0107-409b-9abe-1e316143afc9" (UID: "d730a144-0107-409b-9abe-1e316143afc9"). InnerVolumeSpecName "dns-svc". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 28 21:00:32 crc kubenswrapper[4746]: I0128 21:00:32.765602 4746 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/d730a144-0107-409b-9abe-1e316143afc9-ovsdbserver-nb" (OuterVolumeSpecName: "ovsdbserver-nb") pod "d730a144-0107-409b-9abe-1e316143afc9" (UID: "d730a144-0107-409b-9abe-1e316143afc9"). InnerVolumeSpecName "ovsdbserver-nb". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 28 21:00:32 crc kubenswrapper[4746]: I0128 21:00:32.765872 4746 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/d730a144-0107-409b-9abe-1e316143afc9-config" (OuterVolumeSpecName: "config") pod "d730a144-0107-409b-9abe-1e316143afc9" (UID: "d730a144-0107-409b-9abe-1e316143afc9"). InnerVolumeSpecName "config". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 28 21:00:32 crc kubenswrapper[4746]: I0128 21:00:32.860242 4746 reconciler_common.go:293] "Volume detached for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/d730a144-0107-409b-9abe-1e316143afc9-dns-svc\") on node \"crc\" DevicePath \"\"" Jan 28 21:00:32 crc kubenswrapper[4746]: I0128 21:00:32.860288 4746 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/d730a144-0107-409b-9abe-1e316143afc9-config\") on node \"crc\" DevicePath \"\"" Jan 28 21:00:32 crc kubenswrapper[4746]: I0128 21:00:32.860455 4746 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/d730a144-0107-409b-9abe-1e316143afc9-ovsdbserver-nb\") on node \"crc\" DevicePath \"\"" Jan 28 21:00:33 crc kubenswrapper[4746]: I0128 21:00:33.014195 4746 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cloudkitty-proc-0"] Jan 28 21:00:33 crc kubenswrapper[4746]: I0128 21:00:33.026353 4746 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/keystone-6ff88f78d4-bh6qm" Jan 28 21:00:33 crc kubenswrapper[4746]: I0128 21:00:33.031894 4746 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-67bdc55879-wpkx6"] Jan 28 21:00:33 crc kubenswrapper[4746]: I0128 21:00:33.187836 4746 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cloudkitty-api-0"] Jan 28 21:00:33 crc kubenswrapper[4746]: I0128 21:00:33.467490 4746 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-5c9776ccc5-7fncg" event={"ID":"d730a144-0107-409b-9abe-1e316143afc9","Type":"ContainerDied","Data":"2645007e619376b57c4674e24facd7aebcbc59137a1a5137801479b19218508f"} Jan 28 21:00:33 crc kubenswrapper[4746]: I0128 21:00:33.467791 4746 scope.go:117] "RemoveContainer" containerID="3ff294de60bb526f0e36528008951905cb66f06f67099f304d07a2b85c2f1ab1" Jan 28 21:00:33 crc 
kubenswrapper[4746]: I0128 21:00:33.467497 4746 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-5c9776ccc5-7fncg"
Jan 28 21:00:33 crc kubenswrapper[4746]: I0128 21:00:33.488402 4746 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-api-0" event={"ID":"a613bc41-1308-4925-a2df-026f6622f0c2","Type":"ContainerStarted","Data":"527e003279fa24b6d7cb31d536e7a3847d2c5c2b2b5b1c77bfb03de91ce90b3e"}
Jan 28 21:00:33 crc kubenswrapper[4746]: I0128 21:00:33.489405 4746 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/cinder-api-0"
Jan 28 21:00:33 crc kubenswrapper[4746]: I0128 21:00:33.529684 4746 scope.go:117] "RemoveContainer" containerID="050b9c14f0d8e90528e086d11503acb753c25c68d2fa52f5e415a11ca20961c1"
Jan 28 21:00:33 crc kubenswrapper[4746]: I0128 21:00:33.544327 4746 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-5c9776ccc5-7fncg"]
Jan 28 21:00:33 crc kubenswrapper[4746]: I0128 21:00:33.547340 4746 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"0133f6a6-0179-41e7-a5b9-bc1661ea19e2","Type":"ContainerStarted","Data":"9ec51aab4b2fbdb8b869a6f608677c59886791c38eb25e8a1e71ab247b090360"}
Jan 28 21:00:33 crc kubenswrapper[4746]: I0128 21:00:33.564265 4746 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/cinder-api-0" podStartSLOduration=4.564244852 podStartE2EDuration="4.564244852s" podCreationTimestamp="2026-01-28 21:00:29 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-28 21:00:33.538643214 +0000 UTC m=+1261.494829558" watchObservedRunningTime="2026-01-28 21:00:33.564244852 +0000 UTC m=+1261.520431196"
Jan 28 21:00:33 crc kubenswrapper[4746]: I0128 21:00:33.566857 4746 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-5c9776ccc5-7fncg"]
Jan 28 21:00:33 crc kubenswrapper[4746]: I0128 21:00:33.611325 4746 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cloudkitty-api-0" event={"ID":"716bc97d-b62f-4d0a-9478-b522f2fd26dd","Type":"ContainerStarted","Data":"d39599ed6aaa93ac4eeb10f994c74adbfd695eea4d17fa05b3d5e7d65219aeea"}
Jan 28 21:00:33 crc kubenswrapper[4746]: I0128 21:00:33.626643 4746 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cloudkitty-proc-0" event={"ID":"0bb735bc-29e9-4a5e-b6b4-a643776d9e44","Type":"ContainerStarted","Data":"608f778a2eb545def2a95955d51b09c7e0e15c1d1b0e332b36bb1d368328b0cf"}
Jan 28 21:00:33 crc kubenswrapper[4746]: I0128 21:00:33.649403 4746 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-67bdc55879-wpkx6" event={"ID":"8bc2ca65-f7aa-42c6-ac4f-accc7a46a0c7","Type":"ContainerStarted","Data":"3629995d4fda40f9859d8bcb57b0d7c16909c9d6e84bf3d481632363b64381c7"}
Jan 28 21:00:34 crc kubenswrapper[4746]: I0128 21:00:34.590895 4746 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/openstackclient"]
Jan 28 21:00:34 crc kubenswrapper[4746]: E0128 21:00:34.592447 4746 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="d730a144-0107-409b-9abe-1e316143afc9" containerName="init"
Jan 28 21:00:34 crc kubenswrapper[4746]: I0128 21:00:34.592464 4746 state_mem.go:107] "Deleted CPUSet assignment" podUID="d730a144-0107-409b-9abe-1e316143afc9" containerName="init"
Jan 28 21:00:34 crc kubenswrapper[4746]: E0128 21:00:34.592490 4746 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="d730a144-0107-409b-9abe-1e316143afc9" containerName="dnsmasq-dns"
Jan 28 21:00:34 crc kubenswrapper[4746]: I0128 21:00:34.592496 4746 state_mem.go:107] "Deleted CPUSet assignment" podUID="d730a144-0107-409b-9abe-1e316143afc9" containerName="dnsmasq-dns"
Jan 28 21:00:34 crc kubenswrapper[4746]: I0128 21:00:34.592673 4746 memory_manager.go:354] "RemoveStaleState removing state" podUID="d730a144-0107-409b-9abe-1e316143afc9" containerName="dnsmasq-dns"
Jan 28 21:00:34 crc kubenswrapper[4746]: I0128 21:00:34.593325 4746 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/openstackclient"
Jan 28 21:00:34 crc kubenswrapper[4746]: I0128 21:00:34.596790 4746 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstackclient-openstackclient-dockercfg-qr8wq"
Jan 28 21:00:34 crc kubenswrapper[4746]: I0128 21:00:34.600346 4746 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-config"
Jan 28 21:00:34 crc kubenswrapper[4746]: I0128 21:00:34.608474 4746 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-config-secret"
Jan 28 21:00:34 crc kubenswrapper[4746]: I0128 21:00:34.628147 4746 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/openstackclient"]
Jan 28 21:00:34 crc kubenswrapper[4746]: I0128 21:00:34.691512 4746 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"openstack-config-secret\" (UniqueName: \"kubernetes.io/secret/b9e46853-37d6-49c8-ada6-344f49a39e5f-openstack-config-secret\") pod \"openstackclient\" (UID: \"b9e46853-37d6-49c8-ada6-344f49a39e5f\") " pod="openstack/openstackclient"
Jan 28 21:00:34 crc kubenswrapper[4746]: I0128 21:00:34.691585 4746 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/b9e46853-37d6-49c8-ada6-344f49a39e5f-combined-ca-bundle\") pod \"openstackclient\" (UID: \"b9e46853-37d6-49c8-ada6-344f49a39e5f\") " pod="openstack/openstackclient"
Jan 28 21:00:34 crc kubenswrapper[4746]: I0128 21:00:34.691643 4746 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"openstack-config\" (UniqueName: \"kubernetes.io/configmap/b9e46853-37d6-49c8-ada6-344f49a39e5f-openstack-config\") pod \"openstackclient\" (UID: \"b9e46853-37d6-49c8-ada6-344f49a39e5f\") " pod="openstack/openstackclient"
Jan 28 21:00:34 crc kubenswrapper[4746]: I0128 21:00:34.691687 4746 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-rhg55\" (UniqueName: \"kubernetes.io/projected/b9e46853-37d6-49c8-ada6-344f49a39e5f-kube-api-access-rhg55\") pod \"openstackclient\" (UID: \"b9e46853-37d6-49c8-ada6-344f49a39e5f\") " pod="openstack/openstackclient"
Jan 28 21:00:34 crc kubenswrapper[4746]: I0128 21:00:34.693809 4746 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/cloudkitty-api-0"]
Jan 28 21:00:34 crc kubenswrapper[4746]: I0128 21:00:34.724072 4746 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"0133f6a6-0179-41e7-a5b9-bc1661ea19e2","Type":"ContainerStarted","Data":"885bcee0a0ea10f485d7731a6f1d59ecbc34909cb8d86deae1255c7363fbb21d"}
Jan 28 21:00:34 crc kubenswrapper[4746]: I0128 21:00:34.742281 4746 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cloudkitty-api-0" event={"ID":"716bc97d-b62f-4d0a-9478-b522f2fd26dd","Type":"ContainerStarted","Data":"13cb5953a29ee1afaa362950ffd13c7f3b101c242304b93cd9f92bc40a5e2696"}
Jan 28 21:00:34 crc kubenswrapper[4746]: I0128 21:00:34.742330 4746 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cloudkitty-api-0" event={"ID":"716bc97d-b62f-4d0a-9478-b522f2fd26dd","Type":"ContainerStarted","Data":"1ac446194fb9394f0a16a5c38838dd96a80955ceebe7310d07a40937c787ed28"}
Jan 28 21:00:34 crc kubenswrapper[4746]: I0128 21:00:34.743724 4746 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/cloudkitty-api-0"
Jan 28 21:00:34 crc kubenswrapper[4746]: I0128 21:00:34.759945 4746 generic.go:334] "Generic (PLEG): container finished" podID="8bc2ca65-f7aa-42c6-ac4f-accc7a46a0c7" containerID="8e5056677d97b78160110caaa2719f7a0de3c0c2b784a82d52054567943aaa26" exitCode=0
Jan 28 21:00:34 crc kubenswrapper[4746]: I0128 21:00:34.760011 4746 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-67bdc55879-wpkx6" event={"ID":"8bc2ca65-f7aa-42c6-ac4f-accc7a46a0c7","Type":"ContainerDied","Data":"8e5056677d97b78160110caaa2719f7a0de3c0c2b784a82d52054567943aaa26"}
Jan 28 21:00:34 crc kubenswrapper[4746]: I0128 21:00:34.816807 4746 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"openstack-config-secret\" (UniqueName: \"kubernetes.io/secret/b9e46853-37d6-49c8-ada6-344f49a39e5f-openstack-config-secret\") pod \"openstackclient\" (UID: \"b9e46853-37d6-49c8-ada6-344f49a39e5f\") " pod="openstack/openstackclient"
Jan 28 21:00:34 crc kubenswrapper[4746]: I0128 21:00:34.817483 4746 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/b9e46853-37d6-49c8-ada6-344f49a39e5f-combined-ca-bundle\") pod \"openstackclient\" (UID: \"b9e46853-37d6-49c8-ada6-344f49a39e5f\") " pod="openstack/openstackclient"
Jan 28 21:00:34 crc kubenswrapper[4746]: I0128 21:00:34.817743 4746 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"openstack-config\" (UniqueName: \"kubernetes.io/configmap/b9e46853-37d6-49c8-ada6-344f49a39e5f-openstack-config\") pod \"openstackclient\" (UID: \"b9e46853-37d6-49c8-ada6-344f49a39e5f\") " pod="openstack/openstackclient"
Jan 28 21:00:34 crc kubenswrapper[4746]: I0128 21:00:34.817933 4746 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-rhg55\" (UniqueName: \"kubernetes.io/projected/b9e46853-37d6-49c8-ada6-344f49a39e5f-kube-api-access-rhg55\") pod \"openstackclient\" (UID: \"b9e46853-37d6-49c8-ada6-344f49a39e5f\") " pod="openstack/openstackclient"
Jan 28 21:00:34 crc kubenswrapper[4746]: I0128 21:00:34.825400 4746 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"openstack-config\" (UniqueName: \"kubernetes.io/configmap/b9e46853-37d6-49c8-ada6-344f49a39e5f-openstack-config\") pod \"openstackclient\" (UID: \"b9e46853-37d6-49c8-ada6-344f49a39e5f\") " pod="openstack/openstackclient"
Jan 28 21:00:34 crc kubenswrapper[4746]: I0128 21:00:34.826457 4746 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"openstack-config-secret\" (UniqueName: \"kubernetes.io/secret/b9e46853-37d6-49c8-ada6-344f49a39e5f-openstack-config-secret\") pod \"openstackclient\" (UID: \"b9e46853-37d6-49c8-ada6-344f49a39e5f\") " pod="openstack/openstackclient"
Jan 28 21:00:34 crc kubenswrapper[4746]: I0128 21:00:34.827505 4746 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/b9e46853-37d6-49c8-ada6-344f49a39e5f-combined-ca-bundle\") pod \"openstackclient\" (UID: \"b9e46853-37d6-49c8-ada6-344f49a39e5f\") " pod="openstack/openstackclient"
Jan 28 21:00:34 crc kubenswrapper[4746]: I0128 21:00:34.834411 4746 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/cloudkitty-api-0" podStartSLOduration=3.834291355 podStartE2EDuration="3.834291355s" podCreationTimestamp="2026-01-28 21:00:31 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-28 21:00:34.761760344 +0000 UTC m=+1262.717946698" watchObservedRunningTime="2026-01-28 21:00:34.834291355 +0000 UTC m=+1262.790477709"
Jan 28 21:00:34 crc kubenswrapper[4746]: I0128 21:00:34.843477 4746 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-rhg55\" (UniqueName: \"kubernetes.io/projected/b9e46853-37d6-49c8-ada6-344f49a39e5f-kube-api-access-rhg55\") pod \"openstackclient\" (UID: \"b9e46853-37d6-49c8-ada6-344f49a39e5f\") " pod="openstack/openstackclient"
Jan 28 21:00:34 crc kubenswrapper[4746]: I0128 21:00:34.886966 4746 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="d730a144-0107-409b-9abe-1e316143afc9" path="/var/lib/kubelet/pods/d730a144-0107-409b-9abe-1e316143afc9/volumes"
Jan 28 21:00:35 crc kubenswrapper[4746]: I0128 21:00:35.000545 4746 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/openstackclient"
Jan 28 21:00:35 crc kubenswrapper[4746]: I0128 21:00:35.255734 4746 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/cinder-scheduler-0"
Jan 28 21:00:35 crc kubenswrapper[4746]: I0128 21:00:35.344426 4746 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/cinder-scheduler-0"]
Jan 28 21:00:35 crc kubenswrapper[4746]: I0128 21:00:35.550174 4746 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/barbican-api-6dcb95875d-qz4xq"
Jan 28 21:00:35 crc kubenswrapper[4746]: I0128 21:00:35.636124 4746 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/barbican-api-6dcb95875d-qz4xq"
Jan 28 21:00:35 crc kubenswrapper[4746]: I0128 21:00:35.742403 4746 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/barbican-api-79599f5dcd-btgz7"
Jan 28 21:00:35 crc kubenswrapper[4746]: I0128 21:00:35.761984 4746 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/openstackclient"]
Jan 28 21:00:35 crc kubenswrapper[4746]: I0128 21:00:35.791839 4746 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"0133f6a6-0179-41e7-a5b9-bc1661ea19e2","Type":"ContainerStarted","Data":"0a24e65d3bb4e9c5c16c333080a33a5a7daeb8d88b95a1f51129b36d126f8e47"}
Jan 28 21:00:35 crc kubenswrapper[4746]: I0128 21:00:35.801890 4746 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-67bdc55879-wpkx6" event={"ID":"8bc2ca65-f7aa-42c6-ac4f-accc7a46a0c7","Type":"ContainerStarted","Data":"ed92c85d85d05f882b4652d6fab0ad079fda2326af5a70d2e8cc1a622430551f"}
Jan 28 21:00:35 crc kubenswrapper[4746]: I0128 21:00:35.802455 4746 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/cloudkitty-api-0" podUID="716bc97d-b62f-4d0a-9478-b522f2fd26dd" containerName="cloudkitty-api-log" containerID="cri-o://1ac446194fb9394f0a16a5c38838dd96a80955ceebe7310d07a40937c787ed28" gracePeriod=30
Jan 28 21:00:35 crc kubenswrapper[4746]: I0128 21:00:35.802735 4746 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/cinder-scheduler-0" podUID="726e5b20-4725-4c19-9dac-42b68d0e181a" containerName="cinder-scheduler" containerID="cri-o://c358fe175c62284f9277b975bf38d8ec6d0552e688e5245a0625ecfbf2e4b6ba" gracePeriod=30
Jan 28 21:00:35 crc kubenswrapper[4746]: I0128 21:00:35.802789 4746 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/dnsmasq-dns-67bdc55879-wpkx6"
Jan 28 21:00:35 crc kubenswrapper[4746]: I0128 21:00:35.802822 4746 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/cloudkitty-api-0" podUID="716bc97d-b62f-4d0a-9478-b522f2fd26dd" containerName="cloudkitty-api" containerID="cri-o://13cb5953a29ee1afaa362950ffd13c7f3b101c242304b93cd9f92bc40a5e2696" gracePeriod=30
Jan 28 21:00:35 crc kubenswrapper[4746]: I0128 21:00:35.802871 4746 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/cinder-scheduler-0" podUID="726e5b20-4725-4c19-9dac-42b68d0e181a" containerName="probe" containerID="cri-o://0265e6865880168295fe6b0ef61dc05766773a2c0617568a8509325abe102174" gracePeriod=30
Jan 28 21:00:35 crc kubenswrapper[4746]: I0128 21:00:35.870058 4746 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/dnsmasq-dns-67bdc55879-wpkx6" podStartSLOduration=4.870034345 podStartE2EDuration="4.870034345s" podCreationTimestamp="2026-01-28 21:00:31 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-28 21:00:35.82264955 +0000 UTC m=+1263.778835904" watchObservedRunningTime="2026-01-28 21:00:35.870034345 +0000 UTC m=+1263.826220709"
Jan 28 21:00:36 crc kubenswrapper[4746]: I0128 21:00:36.133537 4746 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/barbican-api-79599f5dcd-btgz7"
Jan 28 21:00:36 crc kubenswrapper[4746]: I0128 21:00:36.195929 4746 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/barbican-api-6dcb95875d-qz4xq"]
Jan 28 21:00:36 crc kubenswrapper[4746]: I0128 21:00:36.396283 4746 prober.go:107] "Probe failed" probeType="Liveness" pod="openstack/barbican-api-6dcb95875d-qz4xq" podUID="ea56fe21-dd12-486e-a4e9-2af7cb7b9387" containerName="barbican-api" probeResult="failure" output="Get \"http://10.217.0.186:9311/healthcheck\": context deadline exceeded (Client.Timeout exceeded while awaiting headers)"
Jan 28 21:00:36 crc kubenswrapper[4746]: W0128 21:00:36.838752 4746 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podb9e46853_37d6_49c8_ada6_344f49a39e5f.slice/crio-6c9d3db51395a4d40a8a4d715b32681d6d2a500271ee5c33731617dab09c96d6 WatchSource:0}: Error finding container 6c9d3db51395a4d40a8a4d715b32681d6d2a500271ee5c33731617dab09c96d6: Status 404 returned error can't find the container with id 6c9d3db51395a4d40a8a4d715b32681d6d2a500271ee5c33731617dab09c96d6
Jan 28 21:00:36 crc kubenswrapper[4746]: I0128 21:00:36.884878 4746 generic.go:334] "Generic (PLEG): container finished" podID="716bc97d-b62f-4d0a-9478-b522f2fd26dd" containerID="13cb5953a29ee1afaa362950ffd13c7f3b101c242304b93cd9f92bc40a5e2696" exitCode=0
Jan 28 21:00:36 crc kubenswrapper[4746]: I0128 21:00:36.884967 4746 generic.go:334] "Generic (PLEG): container finished" podID="716bc97d-b62f-4d0a-9478-b522f2fd26dd" containerID="1ac446194fb9394f0a16a5c38838dd96a80955ceebe7310d07a40937c787ed28" exitCode=143
Jan 28 21:00:36 crc kubenswrapper[4746]: I0128 21:00:36.885211 4746 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/barbican-api-6dcb95875d-qz4xq" podUID="ea56fe21-dd12-486e-a4e9-2af7cb7b9387" containerName="barbican-api-log" containerID="cri-o://15361ac8791076a3c1cf04c0bd4b99b14ed3f96b51ede463fe6767f6242bbd80" gracePeriod=30
Jan 28 21:00:36 crc kubenswrapper[4746]: I0128 21:00:36.885358 4746 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cloudkitty-api-0" event={"ID":"716bc97d-b62f-4d0a-9478-b522f2fd26dd","Type":"ContainerDied","Data":"13cb5953a29ee1afaa362950ffd13c7f3b101c242304b93cd9f92bc40a5e2696"}
Jan 28 21:00:36 crc kubenswrapper[4746]: I0128 21:00:36.885439 4746 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cloudkitty-api-0" event={"ID":"716bc97d-b62f-4d0a-9478-b522f2fd26dd","Type":"ContainerDied","Data":"1ac446194fb9394f0a16a5c38838dd96a80955ceebe7310d07a40937c787ed28"}
Jan 28 21:00:36 crc kubenswrapper[4746]: I0128 21:00:36.886617 4746 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/barbican-api-6dcb95875d-qz4xq" podUID="ea56fe21-dd12-486e-a4e9-2af7cb7b9387" containerName="barbican-api" containerID="cri-o://c64bfb4b8a459b76abaf202a730fc58370844fe34425ab49b2fb2b2f77e9b7f7" gracePeriod=30
Jan 28 21:00:36 crc kubenswrapper[4746]: I0128 21:00:36.925001 4746 prober.go:107] "Probe failed" probeType="Liveness" pod="openstack/barbican-api-6dcb95875d-qz4xq" podUID="ea56fe21-dd12-486e-a4e9-2af7cb7b9387" containerName="barbican-api-log" probeResult="failure" output="Get \"http://10.217.0.186:9311/healthcheck\": EOF"
Jan 28 21:00:36 crc kubenswrapper[4746]: I0128 21:00:36.925109 4746 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/barbican-api-6dcb95875d-qz4xq" podUID="ea56fe21-dd12-486e-a4e9-2af7cb7b9387" containerName="barbican-api-log" probeResult="failure" output="Get \"http://10.217.0.186:9311/healthcheck\": EOF"
Jan 28 21:00:36 crc kubenswrapper[4746]: I0128 21:00:36.925215 4746 prober.go:107] "Probe failed" probeType="Liveness" pod="openstack/barbican-api-6dcb95875d-qz4xq" podUID="ea56fe21-dd12-486e-a4e9-2af7cb7b9387" containerName="barbican-api" probeResult="failure" output="Get \"http://10.217.0.186:9311/healthcheck\": EOF"
Jan 28 21:00:36 crc kubenswrapper[4746]: I0128 21:00:36.925274 4746 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/barbican-api-6dcb95875d-qz4xq" podUID="ea56fe21-dd12-486e-a4e9-2af7cb7b9387" containerName="barbican-api" probeResult="failure" output="Get \"http://10.217.0.186:9311/healthcheck\": EOF"
Jan 28 21:00:37 crc kubenswrapper[4746]: I0128 21:00:37.543377 4746 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/cloudkitty-api-0"
Jan 28 21:00:37 crc kubenswrapper[4746]: I0128 21:00:37.715671 4746 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"certs\" (UniqueName: \"kubernetes.io/projected/716bc97d-b62f-4d0a-9478-b522f2fd26dd-certs\") pod \"716bc97d-b62f-4d0a-9478-b522f2fd26dd\" (UID: \"716bc97d-b62f-4d0a-9478-b522f2fd26dd\") "
Jan 28 21:00:37 crc kubenswrapper[4746]: I0128 21:00:37.715728 4746 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/716bc97d-b62f-4d0a-9478-b522f2fd26dd-config-data-custom\") pod \"716bc97d-b62f-4d0a-9478-b522f2fd26dd\" (UID: \"716bc97d-b62f-4d0a-9478-b522f2fd26dd\") "
Jan 28 21:00:37 crc kubenswrapper[4746]: I0128 21:00:37.715855 4746 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-tt6g9\" (UniqueName: \"kubernetes.io/projected/716bc97d-b62f-4d0a-9478-b522f2fd26dd-kube-api-access-tt6g9\") pod \"716bc97d-b62f-4d0a-9478-b522f2fd26dd\" (UID: \"716bc97d-b62f-4d0a-9478-b522f2fd26dd\") "
Jan 28 21:00:37 crc kubenswrapper[4746]: I0128 21:00:37.715907 4746 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/716bc97d-b62f-4d0a-9478-b522f2fd26dd-scripts\") pod \"716bc97d-b62f-4d0a-9478-b522f2fd26dd\" (UID: \"716bc97d-b62f-4d0a-9478-b522f2fd26dd\") "
Jan 28 21:00:37 crc kubenswrapper[4746]: I0128 21:00:37.715942 4746 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/716bc97d-b62f-4d0a-9478-b522f2fd26dd-logs\") pod \"716bc97d-b62f-4d0a-9478-b522f2fd26dd\" (UID: \"716bc97d-b62f-4d0a-9478-b522f2fd26dd\") "
Jan 28 21:00:37 crc kubenswrapper[4746]: I0128 21:00:37.716025 4746 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/716bc97d-b62f-4d0a-9478-b522f2fd26dd-config-data\") pod \"716bc97d-b62f-4d0a-9478-b522f2fd26dd\" (UID: \"716bc97d-b62f-4d0a-9478-b522f2fd26dd\") "
Jan 28 21:00:37 crc kubenswrapper[4746]: I0128 21:00:37.716061 4746 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/716bc97d-b62f-4d0a-9478-b522f2fd26dd-combined-ca-bundle\") pod \"716bc97d-b62f-4d0a-9478-b522f2fd26dd\" (UID: \"716bc97d-b62f-4d0a-9478-b522f2fd26dd\") "
Jan 28 21:00:37 crc kubenswrapper[4746]: I0128 21:00:37.721002 4746 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/716bc97d-b62f-4d0a-9478-b522f2fd26dd-config-data-custom" (OuterVolumeSpecName: "config-data-custom") pod "716bc97d-b62f-4d0a-9478-b522f2fd26dd" (UID: "716bc97d-b62f-4d0a-9478-b522f2fd26dd"). InnerVolumeSpecName "config-data-custom". PluginName "kubernetes.io/secret", VolumeGidValue ""
Jan 28 21:00:37 crc kubenswrapper[4746]: I0128 21:00:37.724426 4746 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/716bc97d-b62f-4d0a-9478-b522f2fd26dd-logs" (OuterVolumeSpecName: "logs") pod "716bc97d-b62f-4d0a-9478-b522f2fd26dd" (UID: "716bc97d-b62f-4d0a-9478-b522f2fd26dd"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Jan 28 21:00:37 crc kubenswrapper[4746]: I0128 21:00:37.728441 4746 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/716bc97d-b62f-4d0a-9478-b522f2fd26dd-certs" (OuterVolumeSpecName: "certs") pod "716bc97d-b62f-4d0a-9478-b522f2fd26dd" (UID: "716bc97d-b62f-4d0a-9478-b522f2fd26dd"). InnerVolumeSpecName "certs". PluginName "kubernetes.io/projected", VolumeGidValue ""
Jan 28 21:00:37 crc kubenswrapper[4746]: I0128 21:00:37.728499 4746 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/716bc97d-b62f-4d0a-9478-b522f2fd26dd-kube-api-access-tt6g9" (OuterVolumeSpecName: "kube-api-access-tt6g9") pod "716bc97d-b62f-4d0a-9478-b522f2fd26dd" (UID: "716bc97d-b62f-4d0a-9478-b522f2fd26dd"). InnerVolumeSpecName "kube-api-access-tt6g9". PluginName "kubernetes.io/projected", VolumeGidValue ""
Jan 28 21:00:37 crc kubenswrapper[4746]: I0128 21:00:37.728500 4746 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/716bc97d-b62f-4d0a-9478-b522f2fd26dd-scripts" (OuterVolumeSpecName: "scripts") pod "716bc97d-b62f-4d0a-9478-b522f2fd26dd" (UID: "716bc97d-b62f-4d0a-9478-b522f2fd26dd"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue ""
Jan 28 21:00:37 crc kubenswrapper[4746]: I0128 21:00:37.752245 4746 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/716bc97d-b62f-4d0a-9478-b522f2fd26dd-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "716bc97d-b62f-4d0a-9478-b522f2fd26dd" (UID: "716bc97d-b62f-4d0a-9478-b522f2fd26dd"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue ""
Jan 28 21:00:37 crc kubenswrapper[4746]: I0128 21:00:37.762394 4746 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/716bc97d-b62f-4d0a-9478-b522f2fd26dd-config-data" (OuterVolumeSpecName: "config-data") pod "716bc97d-b62f-4d0a-9478-b522f2fd26dd" (UID: "716bc97d-b62f-4d0a-9478-b522f2fd26dd"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue ""
Jan 28 21:00:37 crc kubenswrapper[4746]: I0128 21:00:37.818487 4746 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-tt6g9\" (UniqueName: \"kubernetes.io/projected/716bc97d-b62f-4d0a-9478-b522f2fd26dd-kube-api-access-tt6g9\") on node \"crc\" DevicePath \"\""
Jan 28 21:00:37 crc kubenswrapper[4746]: I0128 21:00:37.818526 4746 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/716bc97d-b62f-4d0a-9478-b522f2fd26dd-scripts\") on node \"crc\" DevicePath \"\""
Jan 28 21:00:37 crc kubenswrapper[4746]: I0128 21:00:37.818539 4746 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/716bc97d-b62f-4d0a-9478-b522f2fd26dd-logs\") on node \"crc\" DevicePath \"\""
Jan 28 21:00:37 crc kubenswrapper[4746]: I0128 21:00:37.818550 4746 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/716bc97d-b62f-4d0a-9478-b522f2fd26dd-config-data\") on node \"crc\" DevicePath \"\""
Jan 28 21:00:37 crc kubenswrapper[4746]: I0128 21:00:37.818565 4746 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/716bc97d-b62f-4d0a-9478-b522f2fd26dd-combined-ca-bundle\") on node \"crc\" DevicePath \"\""
Jan 28 21:00:37 crc kubenswrapper[4746]: I0128 21:00:37.818576 4746 reconciler_common.go:293] "Volume detached for volume \"certs\" (UniqueName: \"kubernetes.io/projected/716bc97d-b62f-4d0a-9478-b522f2fd26dd-certs\") on node \"crc\" DevicePath \"\""
Jan 28 21:00:37 crc kubenswrapper[4746]: I0128 21:00:37.818585 4746 reconciler_common.go:293] "Volume detached for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/716bc97d-b62f-4d0a-9478-b522f2fd26dd-config-data-custom\") on node \"crc\" DevicePath \"\""
Jan 28 21:00:37 crc kubenswrapper[4746]: I0128 21:00:37.901749 4746 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cloudkitty-proc-0" event={"ID":"0bb735bc-29e9-4a5e-b6b4-a643776d9e44","Type":"ContainerStarted","Data":"3e8f0d0271a494ec0a7737528b6d58303a35565a92add21bf5b49f48ff90e0bf"}
Jan 28 21:00:37 crc kubenswrapper[4746]: I0128 21:00:37.937253 4746 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/cloudkitty-proc-0" podStartSLOduration=2.736952987 podStartE2EDuration="6.93723236s" podCreationTimestamp="2026-01-28 21:00:31 +0000 UTC" firstStartedPulling="2026-01-28 21:00:32.955879538 +0000 UTC m=+1260.912065892" lastFinishedPulling="2026-01-28 21:00:37.156158911 +0000 UTC m=+1265.112345265" observedRunningTime="2026-01-28 21:00:37.926606464 +0000 UTC m=+1265.882792838" watchObservedRunningTime="2026-01-28 21:00:37.93723236 +0000 UTC m=+1265.893418704"
Jan 28 21:00:37 crc kubenswrapper[4746]: I0128 21:00:37.940461 4746 generic.go:334] "Generic (PLEG): container finished" podID="ea56fe21-dd12-486e-a4e9-2af7cb7b9387" containerID="15361ac8791076a3c1cf04c0bd4b99b14ed3f96b51ede463fe6767f6242bbd80" exitCode=143
Jan 28 21:00:37 crc kubenswrapper[4746]: I0128 21:00:37.940566 4746 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-api-6dcb95875d-qz4xq" event={"ID":"ea56fe21-dd12-486e-a4e9-2af7cb7b9387","Type":"ContainerDied","Data":"15361ac8791076a3c1cf04c0bd4b99b14ed3f96b51ede463fe6767f6242bbd80"}
Jan 28 21:00:37 crc kubenswrapper[4746]: I0128 21:00:37.956904 4746 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/openstackclient" event={"ID":"b9e46853-37d6-49c8-ada6-344f49a39e5f","Type":"ContainerStarted","Data":"6c9d3db51395a4d40a8a4d715b32681d6d2a500271ee5c33731617dab09c96d6"}
Jan 28 21:00:37 crc kubenswrapper[4746]: I0128 21:00:37.994419 4746 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/cloudkitty-proc-0"]
Jan 28 21:00:38 crc kubenswrapper[4746]: I0128 21:00:38.029739 4746 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"0133f6a6-0179-41e7-a5b9-bc1661ea19e2","Type":"ContainerStarted","Data":"264addfba649d18292bed5e3b3f9e5ac9c7afadb889210023e26b5a932c66687"}
Jan 28 21:00:38 crc kubenswrapper[4746]: I0128 21:00:38.029830 4746 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/ceilometer-0"
Jan 28 21:00:38 crc kubenswrapper[4746]: I0128 21:00:38.041251 4746 generic.go:334] "Generic (PLEG): container finished" podID="726e5b20-4725-4c19-9dac-42b68d0e181a" containerID="0265e6865880168295fe6b0ef61dc05766773a2c0617568a8509325abe102174" exitCode=0
Jan 28 21:00:38 crc kubenswrapper[4746]: I0128 21:00:38.041294 4746 generic.go:334] "Generic (PLEG): container finished" podID="726e5b20-4725-4c19-9dac-42b68d0e181a" containerID="c358fe175c62284f9277b975bf38d8ec6d0552e688e5245a0625ecfbf2e4b6ba" exitCode=0
Jan 28 21:00:38 crc kubenswrapper[4746]: I0128 21:00:38.041349 4746 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-scheduler-0" event={"ID":"726e5b20-4725-4c19-9dac-42b68d0e181a","Type":"ContainerDied","Data":"0265e6865880168295fe6b0ef61dc05766773a2c0617568a8509325abe102174"}
Jan 28 21:00:38 crc kubenswrapper[4746]: I0128 21:00:38.041384 4746 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-scheduler-0" event={"ID":"726e5b20-4725-4c19-9dac-42b68d0e181a","Type":"ContainerDied","Data":"c358fe175c62284f9277b975bf38d8ec6d0552e688e5245a0625ecfbf2e4b6ba"}
Jan 28 21:00:38 crc kubenswrapper[4746]: I0128 21:00:38.069909 4746 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/ceilometer-0" podStartSLOduration=2.8779306030000003 podStartE2EDuration="9.069894318s" podCreationTimestamp="2026-01-28 21:00:29 +0000 UTC" firstStartedPulling="2026-01-28 21:00:30.800235874 +0000 UTC m=+1258.756422228" lastFinishedPulling="2026-01-28 21:00:36.992199589 +0000 UTC m=+1264.948385943" observedRunningTime="2026-01-28 21:00:38.051577896 +0000 UTC m=+1266.007764250" watchObservedRunningTime="2026-01-28 21:00:38.069894318 +0000 UTC m=+1266.026080672"
Jan 28 21:00:38 crc kubenswrapper[4746]: I0128 21:00:38.089625 4746 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/cloudkitty-api-0"
Jan 28 21:00:38 crc kubenswrapper[4746]: I0128 21:00:38.090184 4746 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cloudkitty-api-0" event={"ID":"716bc97d-b62f-4d0a-9478-b522f2fd26dd","Type":"ContainerDied","Data":"d39599ed6aaa93ac4eeb10f994c74adbfd695eea4d17fa05b3d5e7d65219aeea"}
Jan 28 21:00:38 crc kubenswrapper[4746]: I0128 21:00:38.090244 4746 scope.go:117] "RemoveContainer" containerID="13cb5953a29ee1afaa362950ffd13c7f3b101c242304b93cd9f92bc40a5e2696"
Jan 28 21:00:38 crc kubenswrapper[4746]: I0128 21:00:38.135355 4746 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/cloudkitty-api-0"]
Jan 28 21:00:38 crc kubenswrapper[4746]: I0128 21:00:38.147018 4746 scope.go:117] "RemoveContainer" containerID="1ac446194fb9394f0a16a5c38838dd96a80955ceebe7310d07a40937c787ed28"
Jan 28 21:00:38 crc kubenswrapper[4746]: I0128 21:00:38.158468 4746 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/cloudkitty-api-0"]
Jan 28 21:00:38 crc kubenswrapper[4746]: I0128 21:00:38.184164 4746 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/cloudkitty-api-0"]
Jan 28 21:00:38 crc kubenswrapper[4746]: E0128 21:00:38.184727 4746 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="716bc97d-b62f-4d0a-9478-b522f2fd26dd" containerName="cloudkitty-api"
Jan 28 21:00:38 crc kubenswrapper[4746]: I0128 21:00:38.184750 4746 state_mem.go:107] "Deleted CPUSet assignment" podUID="716bc97d-b62f-4d0a-9478-b522f2fd26dd" containerName="cloudkitty-api"
Jan 28 21:00:38 crc kubenswrapper[4746]: E0128 21:00:38.184779 4746 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="716bc97d-b62f-4d0a-9478-b522f2fd26dd" containerName="cloudkitty-api-log"
Jan 28 21:00:38 crc kubenswrapper[4746]: I0128 21:00:38.184787 4746 state_mem.go:107] "Deleted CPUSet assignment" podUID="716bc97d-b62f-4d0a-9478-b522f2fd26dd" containerName="cloudkitty-api-log"
Jan 28 21:00:38 crc kubenswrapper[4746]: I0128 21:00:38.185044 4746 memory_manager.go:354] "RemoveStaleState removing state" podUID="716bc97d-b62f-4d0a-9478-b522f2fd26dd" containerName="cloudkitty-api-log"
Jan 28 21:00:38 crc kubenswrapper[4746]: I0128 21:00:38.185071 4746 memory_manager.go:354] "RemoveStaleState removing state" podUID="716bc97d-b62f-4d0a-9478-b522f2fd26dd" containerName="cloudkitty-api"
Jan 28 21:00:38 crc kubenswrapper[4746]: I0128 21:00:38.186459 4746 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/cloudkitty-api-0"
Jan 28 21:00:38 crc kubenswrapper[4746]: I0128 21:00:38.189866 4746 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-cloudkitty-internal-svc"
Jan 28 21:00:38 crc kubenswrapper[4746]: I0128 21:00:38.190161 4746 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cloudkitty-api-config-data"
Jan 28 21:00:38 crc kubenswrapper[4746]: I0128 21:00:38.191449 4746 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-cloudkitty-public-svc"
Jan 28 21:00:38 crc kubenswrapper[4746]: I0128 21:00:38.195753 4746 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cloudkitty-api-0"]
Jan 28 21:00:38 crc kubenswrapper[4746]: I0128 21:00:38.284796 4746 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-scheduler-0"
Jan 28 21:00:38 crc kubenswrapper[4746]: I0128 21:00:38.343272 4746 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/6e308833-0f26-4f16-9f4d-cd9a6e583ee9-scripts\") pod \"cloudkitty-api-0\" (UID: \"6e308833-0f26-4f16-9f4d-cd9a6e583ee9\") " pod="openstack/cloudkitty-api-0"
Jan 28 21:00:38 crc kubenswrapper[4746]: I0128 21:00:38.343322 4746 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"certs\" (UniqueName: \"kubernetes.io/projected/6e308833-0f26-4f16-9f4d-cd9a6e583ee9-certs\") pod \"cloudkitty-api-0\" (UID: \"6e308833-0f26-4f16-9f4d-cd9a6e583ee9\") " pod="openstack/cloudkitty-api-0"
Jan 28 21:00:38 crc kubenswrapper[4746]: I0128 21:00:38.343364 4746 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/6e308833-0f26-4f16-9f4d-cd9a6e583ee9-config-data\") pod \"cloudkitty-api-0\" (UID: \"6e308833-0f26-4f16-9f4d-cd9a6e583ee9\") " pod="openstack/cloudkitty-api-0"
Jan 28 21:00:38 crc kubenswrapper[4746]: I0128 21:00:38.343653 4746 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/6e308833-0f26-4f16-9f4d-cd9a6e583ee9-combined-ca-bundle\") pod \"cloudkitty-api-0\" (UID: \"6e308833-0f26-4f16-9f4d-cd9a6e583ee9\") " pod="openstack/cloudkitty-api-0"
Jan 28 21:00:38 crc kubenswrapper[4746]: I0128 21:00:38.343772 4746 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/6e308833-0f26-4f16-9f4d-cd9a6e583ee9-logs\") pod \"cloudkitty-api-0\" (UID: \"6e308833-0f26-4f16-9f4d-cd9a6e583ee9\") " pod="openstack/cloudkitty-api-0"
Jan 28 21:00:38 crc kubenswrapper[4746]: I0128 21:00:38.343835 4746 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-z72c4\" (UniqueName: \"kubernetes.io/projected/6e308833-0f26-4f16-9f4d-cd9a6e583ee9-kube-api-access-z72c4\") pod \"cloudkitty-api-0\" (UID: \"6e308833-0f26-4f16-9f4d-cd9a6e583ee9\") " pod="openstack/cloudkitty-api-0"
Jan 28 21:00:38 crc kubenswrapper[4746]: I0128 21:00:38.343889 4746 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/6e308833-0f26-4f16-9f4d-cd9a6e583ee9-config-data-custom\") pod \"cloudkitty-api-0\" (UID: \"6e308833-0f26-4f16-9f4d-cd9a6e583ee9\") " pod="openstack/cloudkitty-api-0"
Jan 28 21:00:38 crc kubenswrapper[4746]: I0128 21:00:38.343997 4746 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/6e308833-0f26-4f16-9f4d-cd9a6e583ee9-public-tls-certs\") pod \"cloudkitty-api-0\" (UID: \"6e308833-0f26-4f16-9f4d-cd9a6e583ee9\") " pod="openstack/cloudkitty-api-0"
Jan 28 21:00:38 crc kubenswrapper[4746]:
I0128 21:00:38.344038 4746 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/6e308833-0f26-4f16-9f4d-cd9a6e583ee9-internal-tls-certs\") pod \"cloudkitty-api-0\" (UID: \"6e308833-0f26-4f16-9f4d-cd9a6e583ee9\") " pod="openstack/cloudkitty-api-0" Jan 28 21:00:38 crc kubenswrapper[4746]: I0128 21:00:38.445984 4746 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/726e5b20-4725-4c19-9dac-42b68d0e181a-config-data\") pod \"726e5b20-4725-4c19-9dac-42b68d0e181a\" (UID: \"726e5b20-4725-4c19-9dac-42b68d0e181a\") " Jan 28 21:00:38 crc kubenswrapper[4746]: I0128 21:00:38.446045 4746 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/726e5b20-4725-4c19-9dac-42b68d0e181a-etc-machine-id\") pod \"726e5b20-4725-4c19-9dac-42b68d0e181a\" (UID: \"726e5b20-4725-4c19-9dac-42b68d0e181a\") " Jan 28 21:00:38 crc kubenswrapper[4746]: I0128 21:00:38.446064 4746 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/726e5b20-4725-4c19-9dac-42b68d0e181a-combined-ca-bundle\") pod \"726e5b20-4725-4c19-9dac-42b68d0e181a\" (UID: \"726e5b20-4725-4c19-9dac-42b68d0e181a\") " Jan 28 21:00:38 crc kubenswrapper[4746]: I0128 21:00:38.446128 4746 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-7vn27\" (UniqueName: \"kubernetes.io/projected/726e5b20-4725-4c19-9dac-42b68d0e181a-kube-api-access-7vn27\") pod \"726e5b20-4725-4c19-9dac-42b68d0e181a\" (UID: \"726e5b20-4725-4c19-9dac-42b68d0e181a\") " Jan 28 21:00:38 crc kubenswrapper[4746]: I0128 21:00:38.446208 4746 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: 
\"kubernetes.io/secret/726e5b20-4725-4c19-9dac-42b68d0e181a-scripts\") pod \"726e5b20-4725-4c19-9dac-42b68d0e181a\" (UID: \"726e5b20-4725-4c19-9dac-42b68d0e181a\") " Jan 28 21:00:38 crc kubenswrapper[4746]: I0128 21:00:38.446218 4746 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/726e5b20-4725-4c19-9dac-42b68d0e181a-etc-machine-id" (OuterVolumeSpecName: "etc-machine-id") pod "726e5b20-4725-4c19-9dac-42b68d0e181a" (UID: "726e5b20-4725-4c19-9dac-42b68d0e181a"). InnerVolumeSpecName "etc-machine-id". PluginName "kubernetes.io/host-path", VolumeGidValue "" Jan 28 21:00:38 crc kubenswrapper[4746]: I0128 21:00:38.446378 4746 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/726e5b20-4725-4c19-9dac-42b68d0e181a-config-data-custom\") pod \"726e5b20-4725-4c19-9dac-42b68d0e181a\" (UID: \"726e5b20-4725-4c19-9dac-42b68d0e181a\") " Jan 28 21:00:38 crc kubenswrapper[4746]: I0128 21:00:38.446695 4746 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/6e308833-0f26-4f16-9f4d-cd9a6e583ee9-public-tls-certs\") pod \"cloudkitty-api-0\" (UID: \"6e308833-0f26-4f16-9f4d-cd9a6e583ee9\") " pod="openstack/cloudkitty-api-0" Jan 28 21:00:38 crc kubenswrapper[4746]: I0128 21:00:38.446718 4746 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/6e308833-0f26-4f16-9f4d-cd9a6e583ee9-internal-tls-certs\") pod \"cloudkitty-api-0\" (UID: \"6e308833-0f26-4f16-9f4d-cd9a6e583ee9\") " pod="openstack/cloudkitty-api-0" Jan 28 21:00:38 crc kubenswrapper[4746]: I0128 21:00:38.446772 4746 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/6e308833-0f26-4f16-9f4d-cd9a6e583ee9-scripts\") pod \"cloudkitty-api-0\" (UID: 
\"6e308833-0f26-4f16-9f4d-cd9a6e583ee9\") " pod="openstack/cloudkitty-api-0" Jan 28 21:00:38 crc kubenswrapper[4746]: I0128 21:00:38.446786 4746 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"certs\" (UniqueName: \"kubernetes.io/projected/6e308833-0f26-4f16-9f4d-cd9a6e583ee9-certs\") pod \"cloudkitty-api-0\" (UID: \"6e308833-0f26-4f16-9f4d-cd9a6e583ee9\") " pod="openstack/cloudkitty-api-0" Jan 28 21:00:38 crc kubenswrapper[4746]: I0128 21:00:38.446807 4746 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/6e308833-0f26-4f16-9f4d-cd9a6e583ee9-config-data\") pod \"cloudkitty-api-0\" (UID: \"6e308833-0f26-4f16-9f4d-cd9a6e583ee9\") " pod="openstack/cloudkitty-api-0" Jan 28 21:00:38 crc kubenswrapper[4746]: I0128 21:00:38.446830 4746 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/6e308833-0f26-4f16-9f4d-cd9a6e583ee9-combined-ca-bundle\") pod \"cloudkitty-api-0\" (UID: \"6e308833-0f26-4f16-9f4d-cd9a6e583ee9\") " pod="openstack/cloudkitty-api-0" Jan 28 21:00:38 crc kubenswrapper[4746]: I0128 21:00:38.446885 4746 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/6e308833-0f26-4f16-9f4d-cd9a6e583ee9-logs\") pod \"cloudkitty-api-0\" (UID: \"6e308833-0f26-4f16-9f4d-cd9a6e583ee9\") " pod="openstack/cloudkitty-api-0" Jan 28 21:00:38 crc kubenswrapper[4746]: I0128 21:00:38.446915 4746 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-z72c4\" (UniqueName: \"kubernetes.io/projected/6e308833-0f26-4f16-9f4d-cd9a6e583ee9-kube-api-access-z72c4\") pod \"cloudkitty-api-0\" (UID: \"6e308833-0f26-4f16-9f4d-cd9a6e583ee9\") " pod="openstack/cloudkitty-api-0" Jan 28 21:00:38 crc kubenswrapper[4746]: I0128 21:00:38.446941 4746 reconciler_common.go:218] "operationExecutor.MountVolume 
started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/6e308833-0f26-4f16-9f4d-cd9a6e583ee9-config-data-custom\") pod \"cloudkitty-api-0\" (UID: \"6e308833-0f26-4f16-9f4d-cd9a6e583ee9\") " pod="openstack/cloudkitty-api-0" Jan 28 21:00:38 crc kubenswrapper[4746]: I0128 21:00:38.447054 4746 reconciler_common.go:293] "Volume detached for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/726e5b20-4725-4c19-9dac-42b68d0e181a-etc-machine-id\") on node \"crc\" DevicePath \"\"" Jan 28 21:00:38 crc kubenswrapper[4746]: I0128 21:00:38.451871 4746 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/726e5b20-4725-4c19-9dac-42b68d0e181a-kube-api-access-7vn27" (OuterVolumeSpecName: "kube-api-access-7vn27") pod "726e5b20-4725-4c19-9dac-42b68d0e181a" (UID: "726e5b20-4725-4c19-9dac-42b68d0e181a"). InnerVolumeSpecName "kube-api-access-7vn27". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 28 21:00:38 crc kubenswrapper[4746]: I0128 21:00:38.454447 4746 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/6e308833-0f26-4f16-9f4d-cd9a6e583ee9-logs\") pod \"cloudkitty-api-0\" (UID: \"6e308833-0f26-4f16-9f4d-cd9a6e583ee9\") " pod="openstack/cloudkitty-api-0" Jan 28 21:00:38 crc kubenswrapper[4746]: I0128 21:00:38.456044 4746 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/6e308833-0f26-4f16-9f4d-cd9a6e583ee9-config-data-custom\") pod \"cloudkitty-api-0\" (UID: \"6e308833-0f26-4f16-9f4d-cd9a6e583ee9\") " pod="openstack/cloudkitty-api-0" Jan 28 21:00:38 crc kubenswrapper[4746]: I0128 21:00:38.457769 4746 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"certs\" (UniqueName: \"kubernetes.io/projected/6e308833-0f26-4f16-9f4d-cd9a6e583ee9-certs\") pod \"cloudkitty-api-0\" (UID: \"6e308833-0f26-4f16-9f4d-cd9a6e583ee9\") " 
pod="openstack/cloudkitty-api-0" Jan 28 21:00:38 crc kubenswrapper[4746]: I0128 21:00:38.459104 4746 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/6e308833-0f26-4f16-9f4d-cd9a6e583ee9-combined-ca-bundle\") pod \"cloudkitty-api-0\" (UID: \"6e308833-0f26-4f16-9f4d-cd9a6e583ee9\") " pod="openstack/cloudkitty-api-0" Jan 28 21:00:38 crc kubenswrapper[4746]: I0128 21:00:38.473387 4746 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/726e5b20-4725-4c19-9dac-42b68d0e181a-config-data-custom" (OuterVolumeSpecName: "config-data-custom") pod "726e5b20-4725-4c19-9dac-42b68d0e181a" (UID: "726e5b20-4725-4c19-9dac-42b68d0e181a"). InnerVolumeSpecName "config-data-custom". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 28 21:00:38 crc kubenswrapper[4746]: I0128 21:00:38.473641 4746 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/726e5b20-4725-4c19-9dac-42b68d0e181a-scripts" (OuterVolumeSpecName: "scripts") pod "726e5b20-4725-4c19-9dac-42b68d0e181a" (UID: "726e5b20-4725-4c19-9dac-42b68d0e181a"). InnerVolumeSpecName "scripts". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 28 21:00:38 crc kubenswrapper[4746]: I0128 21:00:38.475583 4746 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-z72c4\" (UniqueName: \"kubernetes.io/projected/6e308833-0f26-4f16-9f4d-cd9a6e583ee9-kube-api-access-z72c4\") pod \"cloudkitty-api-0\" (UID: \"6e308833-0f26-4f16-9f4d-cd9a6e583ee9\") " pod="openstack/cloudkitty-api-0" Jan 28 21:00:38 crc kubenswrapper[4746]: I0128 21:00:38.477324 4746 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/6e308833-0f26-4f16-9f4d-cd9a6e583ee9-public-tls-certs\") pod \"cloudkitty-api-0\" (UID: \"6e308833-0f26-4f16-9f4d-cd9a6e583ee9\") " pod="openstack/cloudkitty-api-0" Jan 28 21:00:38 crc kubenswrapper[4746]: I0128 21:00:38.482100 4746 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/6e308833-0f26-4f16-9f4d-cd9a6e583ee9-config-data\") pod \"cloudkitty-api-0\" (UID: \"6e308833-0f26-4f16-9f4d-cd9a6e583ee9\") " pod="openstack/cloudkitty-api-0" Jan 28 21:00:38 crc kubenswrapper[4746]: I0128 21:00:38.483485 4746 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/6e308833-0f26-4f16-9f4d-cd9a6e583ee9-internal-tls-certs\") pod \"cloudkitty-api-0\" (UID: \"6e308833-0f26-4f16-9f4d-cd9a6e583ee9\") " pod="openstack/cloudkitty-api-0" Jan 28 21:00:38 crc kubenswrapper[4746]: I0128 21:00:38.490409 4746 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/6e308833-0f26-4f16-9f4d-cd9a6e583ee9-scripts\") pod \"cloudkitty-api-0\" (UID: \"6e308833-0f26-4f16-9f4d-cd9a6e583ee9\") " pod="openstack/cloudkitty-api-0" Jan 28 21:00:38 crc kubenswrapper[4746]: I0128 21:00:38.548892 4746 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume 
"kubernetes.io/secret/726e5b20-4725-4c19-9dac-42b68d0e181a-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "726e5b20-4725-4c19-9dac-42b68d0e181a" (UID: "726e5b20-4725-4c19-9dac-42b68d0e181a"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 28 21:00:38 crc kubenswrapper[4746]: I0128 21:00:38.549468 4746 reconciler_common.go:293] "Volume detached for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/726e5b20-4725-4c19-9dac-42b68d0e181a-config-data-custom\") on node \"crc\" DevicePath \"\"" Jan 28 21:00:38 crc kubenswrapper[4746]: I0128 21:00:38.549490 4746 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/726e5b20-4725-4c19-9dac-42b68d0e181a-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Jan 28 21:00:38 crc kubenswrapper[4746]: I0128 21:00:38.549501 4746 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-7vn27\" (UniqueName: \"kubernetes.io/projected/726e5b20-4725-4c19-9dac-42b68d0e181a-kube-api-access-7vn27\") on node \"crc\" DevicePath \"\"" Jan 28 21:00:38 crc kubenswrapper[4746]: I0128 21:00:38.549510 4746 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/726e5b20-4725-4c19-9dac-42b68d0e181a-scripts\") on node \"crc\" DevicePath \"\"" Jan 28 21:00:38 crc kubenswrapper[4746]: I0128 21:00:38.593524 4746 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/cloudkitty-api-0" Jan 28 21:00:38 crc kubenswrapper[4746]: I0128 21:00:38.605115 4746 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/726e5b20-4725-4c19-9dac-42b68d0e181a-config-data" (OuterVolumeSpecName: "config-data") pod "726e5b20-4725-4c19-9dac-42b68d0e181a" (UID: "726e5b20-4725-4c19-9dac-42b68d0e181a"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 28 21:00:38 crc kubenswrapper[4746]: I0128 21:00:38.654064 4746 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/726e5b20-4725-4c19-9dac-42b68d0e181a-config-data\") on node \"crc\" DevicePath \"\"" Jan 28 21:00:38 crc kubenswrapper[4746]: I0128 21:00:38.864587 4746 prober.go:107] "Probe failed" probeType="Liveness" pod="openstack/barbican-api-79599f5dcd-btgz7" podUID="8fa661e3-776e-42b0-83db-374d372232ad" containerName="barbican-api-log" probeResult="failure" output="Get \"https://10.217.0.188:9311/healthcheck\": net/http: request canceled (Client.Timeout exceeded while awaiting headers)" Jan 28 21:00:38 crc kubenswrapper[4746]: I0128 21:00:38.872980 4746 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="716bc97d-b62f-4d0a-9478-b522f2fd26dd" path="/var/lib/kubelet/pods/716bc97d-b62f-4d0a-9478-b522f2fd26dd/volumes" Jan 28 21:00:39 crc kubenswrapper[4746]: I0128 21:00:39.113745 4746 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-scheduler-0" event={"ID":"726e5b20-4725-4c19-9dac-42b68d0e181a","Type":"ContainerDied","Data":"80afe71102801c364cfc9116319912bdce369eaab4832c5814a331fb4bc491fe"} Jan 28 21:00:39 crc kubenswrapper[4746]: I0128 21:00:39.114074 4746 scope.go:117] "RemoveContainer" containerID="0265e6865880168295fe6b0ef61dc05766773a2c0617568a8509325abe102174" Jan 28 21:00:39 crc kubenswrapper[4746]: I0128 21:00:39.114221 4746 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/cinder-scheduler-0" Jan 28 21:00:39 crc kubenswrapper[4746]: I0128 21:00:39.192266 4746 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/cinder-scheduler-0"] Jan 28 21:00:39 crc kubenswrapper[4746]: I0128 21:00:39.205294 4746 scope.go:117] "RemoveContainer" containerID="c358fe175c62284f9277b975bf38d8ec6d0552e688e5245a0625ecfbf2e4b6ba" Jan 28 21:00:39 crc kubenswrapper[4746]: I0128 21:00:39.207318 4746 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/cinder-scheduler-0"] Jan 28 21:00:39 crc kubenswrapper[4746]: I0128 21:00:39.222956 4746 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/cinder-scheduler-0"] Jan 28 21:00:39 crc kubenswrapper[4746]: E0128 21:00:39.223581 4746 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="726e5b20-4725-4c19-9dac-42b68d0e181a" containerName="cinder-scheduler" Jan 28 21:00:39 crc kubenswrapper[4746]: I0128 21:00:39.223601 4746 state_mem.go:107] "Deleted CPUSet assignment" podUID="726e5b20-4725-4c19-9dac-42b68d0e181a" containerName="cinder-scheduler" Jan 28 21:00:39 crc kubenswrapper[4746]: E0128 21:00:39.223699 4746 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="726e5b20-4725-4c19-9dac-42b68d0e181a" containerName="probe" Jan 28 21:00:39 crc kubenswrapper[4746]: I0128 21:00:39.223708 4746 state_mem.go:107] "Deleted CPUSet assignment" podUID="726e5b20-4725-4c19-9dac-42b68d0e181a" containerName="probe" Jan 28 21:00:39 crc kubenswrapper[4746]: I0128 21:00:39.223907 4746 memory_manager.go:354] "RemoveStaleState removing state" podUID="726e5b20-4725-4c19-9dac-42b68d0e181a" containerName="probe" Jan 28 21:00:39 crc kubenswrapper[4746]: I0128 21:00:39.223926 4746 memory_manager.go:354] "RemoveStaleState removing state" podUID="726e5b20-4725-4c19-9dac-42b68d0e181a" containerName="cinder-scheduler" Jan 28 21:00:39 crc kubenswrapper[4746]: I0128 21:00:39.224950 4746 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/cinder-scheduler-0" Jan 28 21:00:39 crc kubenswrapper[4746]: I0128 21:00:39.229369 4746 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cinder-scheduler-config-data" Jan 28 21:00:39 crc kubenswrapper[4746]: I0128 21:00:39.231932 4746 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cinder-scheduler-0"] Jan 28 21:00:39 crc kubenswrapper[4746]: I0128 21:00:39.307534 4746 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cloudkitty-api-0"] Jan 28 21:00:39 crc kubenswrapper[4746]: I0128 21:00:39.372988 4746 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-qtbnr\" (UniqueName: \"kubernetes.io/projected/9305786c-240c-4e6a-a110-599c0067ce78-kube-api-access-qtbnr\") pod \"cinder-scheduler-0\" (UID: \"9305786c-240c-4e6a-a110-599c0067ce78\") " pod="openstack/cinder-scheduler-0" Jan 28 21:00:39 crc kubenswrapper[4746]: I0128 21:00:39.373137 4746 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/9305786c-240c-4e6a-a110-599c0067ce78-combined-ca-bundle\") pod \"cinder-scheduler-0\" (UID: \"9305786c-240c-4e6a-a110-599c0067ce78\") " pod="openstack/cinder-scheduler-0" Jan 28 21:00:39 crc kubenswrapper[4746]: I0128 21:00:39.373171 4746 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/9305786c-240c-4e6a-a110-599c0067ce78-config-data\") pod \"cinder-scheduler-0\" (UID: \"9305786c-240c-4e6a-a110-599c0067ce78\") " pod="openstack/cinder-scheduler-0" Jan 28 21:00:39 crc kubenswrapper[4746]: I0128 21:00:39.373222 4746 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/9305786c-240c-4e6a-a110-599c0067ce78-config-data-custom\") pod 
\"cinder-scheduler-0\" (UID: \"9305786c-240c-4e6a-a110-599c0067ce78\") " pod="openstack/cinder-scheduler-0" Jan 28 21:00:39 crc kubenswrapper[4746]: I0128 21:00:39.373290 4746 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/9305786c-240c-4e6a-a110-599c0067ce78-etc-machine-id\") pod \"cinder-scheduler-0\" (UID: \"9305786c-240c-4e6a-a110-599c0067ce78\") " pod="openstack/cinder-scheduler-0" Jan 28 21:00:39 crc kubenswrapper[4746]: I0128 21:00:39.373353 4746 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/9305786c-240c-4e6a-a110-599c0067ce78-scripts\") pod \"cinder-scheduler-0\" (UID: \"9305786c-240c-4e6a-a110-599c0067ce78\") " pod="openstack/cinder-scheduler-0" Jan 28 21:00:39 crc kubenswrapper[4746]: I0128 21:00:39.475237 4746 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/9305786c-240c-4e6a-a110-599c0067ce78-config-data\") pod \"cinder-scheduler-0\" (UID: \"9305786c-240c-4e6a-a110-599c0067ce78\") " pod="openstack/cinder-scheduler-0" Jan 28 21:00:39 crc kubenswrapper[4746]: I0128 21:00:39.475283 4746 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/9305786c-240c-4e6a-a110-599c0067ce78-combined-ca-bundle\") pod \"cinder-scheduler-0\" (UID: \"9305786c-240c-4e6a-a110-599c0067ce78\") " pod="openstack/cinder-scheduler-0" Jan 28 21:00:39 crc kubenswrapper[4746]: I0128 21:00:39.475313 4746 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/9305786c-240c-4e6a-a110-599c0067ce78-config-data-custom\") pod \"cinder-scheduler-0\" (UID: \"9305786c-240c-4e6a-a110-599c0067ce78\") " pod="openstack/cinder-scheduler-0" Jan 28 21:00:39 crc 
kubenswrapper[4746]: I0128 21:00:39.475349 4746 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/9305786c-240c-4e6a-a110-599c0067ce78-etc-machine-id\") pod \"cinder-scheduler-0\" (UID: \"9305786c-240c-4e6a-a110-599c0067ce78\") " pod="openstack/cinder-scheduler-0" Jan 28 21:00:39 crc kubenswrapper[4746]: I0128 21:00:39.475378 4746 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/9305786c-240c-4e6a-a110-599c0067ce78-scripts\") pod \"cinder-scheduler-0\" (UID: \"9305786c-240c-4e6a-a110-599c0067ce78\") " pod="openstack/cinder-scheduler-0" Jan 28 21:00:39 crc kubenswrapper[4746]: I0128 21:00:39.475481 4746 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-qtbnr\" (UniqueName: \"kubernetes.io/projected/9305786c-240c-4e6a-a110-599c0067ce78-kube-api-access-qtbnr\") pod \"cinder-scheduler-0\" (UID: \"9305786c-240c-4e6a-a110-599c0067ce78\") " pod="openstack/cinder-scheduler-0" Jan 28 21:00:39 crc kubenswrapper[4746]: I0128 21:00:39.485290 4746 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/9305786c-240c-4e6a-a110-599c0067ce78-config-data\") pod \"cinder-scheduler-0\" (UID: \"9305786c-240c-4e6a-a110-599c0067ce78\") " pod="openstack/cinder-scheduler-0" Jan 28 21:00:39 crc kubenswrapper[4746]: I0128 21:00:39.485590 4746 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/9305786c-240c-4e6a-a110-599c0067ce78-etc-machine-id\") pod \"cinder-scheduler-0\" (UID: \"9305786c-240c-4e6a-a110-599c0067ce78\") " pod="openstack/cinder-scheduler-0" Jan 28 21:00:39 crc kubenswrapper[4746]: I0128 21:00:39.486694 4746 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-custom\" (UniqueName: 
\"kubernetes.io/secret/9305786c-240c-4e6a-a110-599c0067ce78-config-data-custom\") pod \"cinder-scheduler-0\" (UID: \"9305786c-240c-4e6a-a110-599c0067ce78\") " pod="openstack/cinder-scheduler-0" Jan 28 21:00:39 crc kubenswrapper[4746]: I0128 21:00:39.492687 4746 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/9305786c-240c-4e6a-a110-599c0067ce78-combined-ca-bundle\") pod \"cinder-scheduler-0\" (UID: \"9305786c-240c-4e6a-a110-599c0067ce78\") " pod="openstack/cinder-scheduler-0" Jan 28 21:00:39 crc kubenswrapper[4746]: I0128 21:00:39.501910 4746 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/9305786c-240c-4e6a-a110-599c0067ce78-scripts\") pod \"cinder-scheduler-0\" (UID: \"9305786c-240c-4e6a-a110-599c0067ce78\") " pod="openstack/cinder-scheduler-0" Jan 28 21:00:39 crc kubenswrapper[4746]: I0128 21:00:39.506849 4746 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-qtbnr\" (UniqueName: \"kubernetes.io/projected/9305786c-240c-4e6a-a110-599c0067ce78-kube-api-access-qtbnr\") pod \"cinder-scheduler-0\" (UID: \"9305786c-240c-4e6a-a110-599c0067ce78\") " pod="openstack/cinder-scheduler-0" Jan 28 21:00:39 crc kubenswrapper[4746]: I0128 21:00:39.564032 4746 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/cinder-scheduler-0" Jan 28 21:00:40 crc kubenswrapper[4746]: I0128 21:00:40.294825 4746 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cloudkitty-api-0" event={"ID":"6e308833-0f26-4f16-9f4d-cd9a6e583ee9","Type":"ContainerStarted","Data":"832297f2799904b89c93ad5ebc2e91e71de6e0b478f0dfd3fc3a9f2f3fa1f944"} Jan 28 21:00:40 crc kubenswrapper[4746]: I0128 21:00:40.294876 4746 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cloudkitty-api-0" event={"ID":"6e308833-0f26-4f16-9f4d-cd9a6e583ee9","Type":"ContainerStarted","Data":"b8cfc746885ee651b2f65ddaa465c26371726b571b267ce7f809c540dc9d4d35"} Jan 28 21:00:40 crc kubenswrapper[4746]: I0128 21:00:40.295061 4746 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/cloudkitty-proc-0" podUID="0bb735bc-29e9-4a5e-b6b4-a643776d9e44" containerName="cloudkitty-proc" containerID="cri-o://3e8f0d0271a494ec0a7737528b6d58303a35565a92add21bf5b49f48ff90e0bf" gracePeriod=30 Jan 28 21:00:40 crc kubenswrapper[4746]: I0128 21:00:40.373844 4746 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cinder-scheduler-0"] Jan 28 21:00:40 crc kubenswrapper[4746]: I0128 21:00:40.864255 4746 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="726e5b20-4725-4c19-9dac-42b68d0e181a" path="/var/lib/kubelet/pods/726e5b20-4725-4c19-9dac-42b68d0e181a/volumes" Jan 28 21:00:41 crc kubenswrapper[4746]: I0128 21:00:41.139322 4746 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/barbican-api-79599f5dcd-btgz7" podUID="8fa661e3-776e-42b0-83db-374d372232ad" containerName="barbican-api-log" probeResult="failure" output="Get \"https://10.217.0.188:9311/healthcheck\": net/http: request canceled (Client.Timeout exceeded while awaiting headers)" Jan 28 21:00:41 crc kubenswrapper[4746]: I0128 21:00:41.336941 4746 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cloudkitty-api-0" 
event={"ID":"6e308833-0f26-4f16-9f4d-cd9a6e583ee9","Type":"ContainerStarted","Data":"c82b268270307d0d209b8578cb1b031e6c4e7d82ac12518698a356f5d64ba8aa"} Jan 28 21:00:41 crc kubenswrapper[4746]: I0128 21:00:41.338237 4746 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/cloudkitty-api-0" Jan 28 21:00:41 crc kubenswrapper[4746]: I0128 21:00:41.363132 4746 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-scheduler-0" event={"ID":"9305786c-240c-4e6a-a110-599c0067ce78","Type":"ContainerStarted","Data":"de81f4d32a08ef2506ec3d515d4de09a92b13647faa00ef302dd30daf0cfcede"} Jan 28 21:00:41 crc kubenswrapper[4746]: I0128 21:00:41.371170 4746 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/cloudkitty-api-0" podStartSLOduration=3.371148578 podStartE2EDuration="3.371148578s" podCreationTimestamp="2026-01-28 21:00:38 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-28 21:00:41.354770317 +0000 UTC m=+1269.310956671" watchObservedRunningTime="2026-01-28 21:00:41.371148578 +0000 UTC m=+1269.327334932" Jan 28 21:00:41 crc kubenswrapper[4746]: I0128 21:00:41.917234 4746 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/dnsmasq-dns-67bdc55879-wpkx6" Jan 28 21:00:41 crc kubenswrapper[4746]: I0128 21:00:41.967287 4746 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/barbican-api-6dcb95875d-qz4xq" podUID="ea56fe21-dd12-486e-a4e9-2af7cb7b9387" containerName="barbican-api-log" probeResult="failure" output="Get \"http://10.217.0.186:9311/healthcheck\": context deadline exceeded (Client.Timeout exceeded while awaiting headers)" Jan 28 21:00:41 crc kubenswrapper[4746]: I0128 21:00:41.987320 4746 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-55f844cf75-8hh8h"] Jan 28 21:00:41 crc kubenswrapper[4746]: I0128 21:00:41.987740 4746 
kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/dnsmasq-dns-55f844cf75-8hh8h" podUID="ee1b35a7-970d-4abf-b645-eebbcadd7e8e" containerName="dnsmasq-dns" containerID="cri-o://34cbf343b96758686d17c89cafd8ba1ef5a3222f320d6831f7f4f740567d37ab" gracePeriod=10 Jan 28 21:00:42 crc kubenswrapper[4746]: I0128 21:00:42.400045 4746 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-scheduler-0" event={"ID":"9305786c-240c-4e6a-a110-599c0067ce78","Type":"ContainerStarted","Data":"2bae19e1188b4a5c4ebe64cdcb8a951bab3608f0d525bc3665e20c523a993a7d"} Jan 28 21:00:42 crc kubenswrapper[4746]: I0128 21:00:42.400253 4746 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-scheduler-0" event={"ID":"9305786c-240c-4e6a-a110-599c0067ce78","Type":"ContainerStarted","Data":"0d1808c221bc0cceb54ae8159d4e8fe55046bb988aa6fd5dca8524480074e4a7"} Jan 28 21:00:42 crc kubenswrapper[4746]: I0128 21:00:42.402599 4746 generic.go:334] "Generic (PLEG): container finished" podID="ee1b35a7-970d-4abf-b645-eebbcadd7e8e" containerID="34cbf343b96758686d17c89cafd8ba1ef5a3222f320d6831f7f4f740567d37ab" exitCode=0 Jan 28 21:00:42 crc kubenswrapper[4746]: I0128 21:00:42.403646 4746 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-55f844cf75-8hh8h" event={"ID":"ee1b35a7-970d-4abf-b645-eebbcadd7e8e","Type":"ContainerDied","Data":"34cbf343b96758686d17c89cafd8ba1ef5a3222f320d6831f7f4f740567d37ab"} Jan 28 21:00:42 crc kubenswrapper[4746]: I0128 21:00:42.421550 4746 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/cinder-scheduler-0" podStartSLOduration=3.421538172 podStartE2EDuration="3.421538172s" podCreationTimestamp="2026-01-28 21:00:39 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-28 21:00:42.417585405 +0000 UTC m=+1270.373771759" watchObservedRunningTime="2026-01-28 21:00:42.421538172 
+0000 UTC m=+1270.377724526"
Jan 28 21:00:42 crc kubenswrapper[4746]: I0128 21:00:42.709879 4746 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-55f844cf75-8hh8h"
Jan 28 21:00:42 crc kubenswrapper[4746]: I0128 21:00:42.792946 4746 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/ee1b35a7-970d-4abf-b645-eebbcadd7e8e-dns-swift-storage-0\") pod \"ee1b35a7-970d-4abf-b645-eebbcadd7e8e\" (UID: \"ee1b35a7-970d-4abf-b645-eebbcadd7e8e\") "
Jan 28 21:00:42 crc kubenswrapper[4746]: I0128 21:00:42.793307 4746 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/ee1b35a7-970d-4abf-b645-eebbcadd7e8e-ovsdbserver-sb\") pod \"ee1b35a7-970d-4abf-b645-eebbcadd7e8e\" (UID: \"ee1b35a7-970d-4abf-b645-eebbcadd7e8e\") "
Jan 28 21:00:42 crc kubenswrapper[4746]: I0128 21:00:42.793483 4746 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/ee1b35a7-970d-4abf-b645-eebbcadd7e8e-dns-svc\") pod \"ee1b35a7-970d-4abf-b645-eebbcadd7e8e\" (UID: \"ee1b35a7-970d-4abf-b645-eebbcadd7e8e\") "
Jan 28 21:00:42 crc kubenswrapper[4746]: I0128 21:00:42.793529 4746 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-4g669\" (UniqueName: \"kubernetes.io/projected/ee1b35a7-970d-4abf-b645-eebbcadd7e8e-kube-api-access-4g669\") pod \"ee1b35a7-970d-4abf-b645-eebbcadd7e8e\" (UID: \"ee1b35a7-970d-4abf-b645-eebbcadd7e8e\") "
Jan 28 21:00:42 crc kubenswrapper[4746]: I0128 21:00:42.793600 4746 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/ee1b35a7-970d-4abf-b645-eebbcadd7e8e-config\") pod \"ee1b35a7-970d-4abf-b645-eebbcadd7e8e\" (UID: \"ee1b35a7-970d-4abf-b645-eebbcadd7e8e\") "
Jan 28 21:00:42 crc kubenswrapper[4746]: I0128 21:00:42.793626 4746 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/ee1b35a7-970d-4abf-b645-eebbcadd7e8e-ovsdbserver-nb\") pod \"ee1b35a7-970d-4abf-b645-eebbcadd7e8e\" (UID: \"ee1b35a7-970d-4abf-b645-eebbcadd7e8e\") "
Jan 28 21:00:42 crc kubenswrapper[4746]: I0128 21:00:42.823308 4746 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/ee1b35a7-970d-4abf-b645-eebbcadd7e8e-kube-api-access-4g669" (OuterVolumeSpecName: "kube-api-access-4g669") pod "ee1b35a7-970d-4abf-b645-eebbcadd7e8e" (UID: "ee1b35a7-970d-4abf-b645-eebbcadd7e8e"). InnerVolumeSpecName "kube-api-access-4g669". PluginName "kubernetes.io/projected", VolumeGidValue ""
Jan 28 21:00:42 crc kubenswrapper[4746]: I0128 21:00:42.897683 4746 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/ee1b35a7-970d-4abf-b645-eebbcadd7e8e-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "ee1b35a7-970d-4abf-b645-eebbcadd7e8e" (UID: "ee1b35a7-970d-4abf-b645-eebbcadd7e8e"). InnerVolumeSpecName "dns-svc". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Jan 28 21:00:42 crc kubenswrapper[4746]: I0128 21:00:42.903413 4746 reconciler_common.go:293] "Volume detached for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/ee1b35a7-970d-4abf-b645-eebbcadd7e8e-dns-svc\") on node \"crc\" DevicePath \"\""
Jan 28 21:00:42 crc kubenswrapper[4746]: I0128 21:00:42.903516 4746 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-4g669\" (UniqueName: \"kubernetes.io/projected/ee1b35a7-970d-4abf-b645-eebbcadd7e8e-kube-api-access-4g669\") on node \"crc\" DevicePath \"\""
Jan 28 21:00:42 crc kubenswrapper[4746]: I0128 21:00:42.930931 4746 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/ee1b35a7-970d-4abf-b645-eebbcadd7e8e-ovsdbserver-nb" (OuterVolumeSpecName: "ovsdbserver-nb") pod "ee1b35a7-970d-4abf-b645-eebbcadd7e8e" (UID: "ee1b35a7-970d-4abf-b645-eebbcadd7e8e"). InnerVolumeSpecName "ovsdbserver-nb". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Jan 28 21:00:42 crc kubenswrapper[4746]: I0128 21:00:42.945675 4746 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/ee1b35a7-970d-4abf-b645-eebbcadd7e8e-ovsdbserver-sb" (OuterVolumeSpecName: "ovsdbserver-sb") pod "ee1b35a7-970d-4abf-b645-eebbcadd7e8e" (UID: "ee1b35a7-970d-4abf-b645-eebbcadd7e8e"). InnerVolumeSpecName "ovsdbserver-sb". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Jan 28 21:00:42 crc kubenswrapper[4746]: I0128 21:00:42.958692 4746 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/ee1b35a7-970d-4abf-b645-eebbcadd7e8e-config" (OuterVolumeSpecName: "config") pod "ee1b35a7-970d-4abf-b645-eebbcadd7e8e" (UID: "ee1b35a7-970d-4abf-b645-eebbcadd7e8e"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Jan 28 21:00:42 crc kubenswrapper[4746]: I0128 21:00:42.986788 4746 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/ee1b35a7-970d-4abf-b645-eebbcadd7e8e-dns-swift-storage-0" (OuterVolumeSpecName: "dns-swift-storage-0") pod "ee1b35a7-970d-4abf-b645-eebbcadd7e8e" (UID: "ee1b35a7-970d-4abf-b645-eebbcadd7e8e"). InnerVolumeSpecName "dns-swift-storage-0". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Jan 28 21:00:43 crc kubenswrapper[4746]: I0128 21:00:43.007153 4746 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/ee1b35a7-970d-4abf-b645-eebbcadd7e8e-config\") on node \"crc\" DevicePath \"\""
Jan 28 21:00:43 crc kubenswrapper[4746]: I0128 21:00:43.007512 4746 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/ee1b35a7-970d-4abf-b645-eebbcadd7e8e-ovsdbserver-nb\") on node \"crc\" DevicePath \"\""
Jan 28 21:00:43 crc kubenswrapper[4746]: I0128 21:00:43.007532 4746 reconciler_common.go:293] "Volume detached for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/ee1b35a7-970d-4abf-b645-eebbcadd7e8e-dns-swift-storage-0\") on node \"crc\" DevicePath \"\""
Jan 28 21:00:43 crc kubenswrapper[4746]: I0128 21:00:43.007541 4746 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/ee1b35a7-970d-4abf-b645-eebbcadd7e8e-ovsdbserver-sb\") on node \"crc\" DevicePath \"\""
Jan 28 21:00:43 crc kubenswrapper[4746]: I0128 21:00:43.439340 4746 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-55f844cf75-8hh8h"
Jan 28 21:00:43 crc kubenswrapper[4746]: I0128 21:00:43.439649 4746 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-55f844cf75-8hh8h" event={"ID":"ee1b35a7-970d-4abf-b645-eebbcadd7e8e","Type":"ContainerDied","Data":"ef953ba0e39534e4f0d56311f6dc84a03fe95bc7ca5f0e6fb8a1dda880f72e34"}
Jan 28 21:00:43 crc kubenswrapper[4746]: I0128 21:00:43.439706 4746 scope.go:117] "RemoveContainer" containerID="34cbf343b96758686d17c89cafd8ba1ef5a3222f320d6831f7f4f740567d37ab"
Jan 28 21:00:43 crc kubenswrapper[4746]: I0128 21:00:43.506265 4746 scope.go:117] "RemoveContainer" containerID="348c53e2e4cc51ad0b2a0899e9d685d1ae75a36ea387f74a447d30fc80071eee"
Jan 28 21:00:43 crc kubenswrapper[4746]: I0128 21:00:43.539909 4746 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-55f844cf75-8hh8h"]
Jan 28 21:00:43 crc kubenswrapper[4746]: I0128 21:00:43.550301 4746 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-55f844cf75-8hh8h"]
Jan 28 21:00:44 crc kubenswrapper[4746]: I0128 21:00:44.468994 4746 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/barbican-api-6dcb95875d-qz4xq" podUID="ea56fe21-dd12-486e-a4e9-2af7cb7b9387" containerName="barbican-api-log" probeResult="failure" output="Get \"http://10.217.0.186:9311/healthcheck\": read tcp 10.217.0.2:44256->10.217.0.186:9311: read: connection reset by peer"
Jan 28 21:00:44 crc kubenswrapper[4746]: I0128 21:00:44.469281 4746 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/barbican-api-6dcb95875d-qz4xq" podUID="ea56fe21-dd12-486e-a4e9-2af7cb7b9387" containerName="barbican-api" probeResult="failure" output="Get \"http://10.217.0.186:9311/healthcheck\": read tcp 10.217.0.2:44244->10.217.0.186:9311: read: connection reset by peer"
Jan 28 21:00:44 crc kubenswrapper[4746]: I0128 21:00:44.469478 4746 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/barbican-api-6dcb95875d-qz4xq"
Jan 28 21:00:44 crc kubenswrapper[4746]: I0128 21:00:44.564220 4746 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/cinder-scheduler-0"
Jan 28 21:00:44 crc kubenswrapper[4746]: I0128 21:00:44.852293 4746 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/cinder-api-0" podUID="a613bc41-1308-4925-a2df-026f6622f0c2" containerName="cinder-api" probeResult="failure" output="Get \"https://10.217.0.191:8776/healthcheck\": net/http: request canceled (Client.Timeout exceeded while awaiting headers)"
Jan 28 21:00:44 crc kubenswrapper[4746]: I0128 21:00:44.856989 4746 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="ee1b35a7-970d-4abf-b645-eebbcadd7e8e" path="/var/lib/kubelet/pods/ee1b35a7-970d-4abf-b645-eebbcadd7e8e/volumes"
Jan 28 21:00:45 crc kubenswrapper[4746]: I0128 21:00:45.211030 4746 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/barbican-api-6dcb95875d-qz4xq"
Jan 28 21:00:45 crc kubenswrapper[4746]: I0128 21:00:45.291562 4746 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/ea56fe21-dd12-486e-a4e9-2af7cb7b9387-config-data\") pod \"ea56fe21-dd12-486e-a4e9-2af7cb7b9387\" (UID: \"ea56fe21-dd12-486e-a4e9-2af7cb7b9387\") "
Jan 28 21:00:45 crc kubenswrapper[4746]: I0128 21:00:45.291722 4746 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/ea56fe21-dd12-486e-a4e9-2af7cb7b9387-logs\") pod \"ea56fe21-dd12-486e-a4e9-2af7cb7b9387\" (UID: \"ea56fe21-dd12-486e-a4e9-2af7cb7b9387\") "
Jan 28 21:00:45 crc kubenswrapper[4746]: I0128 21:00:45.291780 4746 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/ea56fe21-dd12-486e-a4e9-2af7cb7b9387-config-data-custom\") pod \"ea56fe21-dd12-486e-a4e9-2af7cb7b9387\" (UID: \"ea56fe21-dd12-486e-a4e9-2af7cb7b9387\") "
Jan 28 21:00:45 crc kubenswrapper[4746]: I0128 21:00:45.291862 4746 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/ea56fe21-dd12-486e-a4e9-2af7cb7b9387-combined-ca-bundle\") pod \"ea56fe21-dd12-486e-a4e9-2af7cb7b9387\" (UID: \"ea56fe21-dd12-486e-a4e9-2af7cb7b9387\") "
Jan 28 21:00:45 crc kubenswrapper[4746]: I0128 21:00:45.292021 4746 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-84fk6\" (UniqueName: \"kubernetes.io/projected/ea56fe21-dd12-486e-a4e9-2af7cb7b9387-kube-api-access-84fk6\") pod \"ea56fe21-dd12-486e-a4e9-2af7cb7b9387\" (UID: \"ea56fe21-dd12-486e-a4e9-2af7cb7b9387\") "
Jan 28 21:00:45 crc kubenswrapper[4746]: I0128 21:00:45.293435 4746 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/ea56fe21-dd12-486e-a4e9-2af7cb7b9387-logs" (OuterVolumeSpecName: "logs") pod "ea56fe21-dd12-486e-a4e9-2af7cb7b9387" (UID: "ea56fe21-dd12-486e-a4e9-2af7cb7b9387"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Jan 28 21:00:45 crc kubenswrapper[4746]: I0128 21:00:45.293732 4746 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/ea56fe21-dd12-486e-a4e9-2af7cb7b9387-logs\") on node \"crc\" DevicePath \"\""
Jan 28 21:00:45 crc kubenswrapper[4746]: I0128 21:00:45.302480 4746 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/ea56fe21-dd12-486e-a4e9-2af7cb7b9387-kube-api-access-84fk6" (OuterVolumeSpecName: "kube-api-access-84fk6") pod "ea56fe21-dd12-486e-a4e9-2af7cb7b9387" (UID: "ea56fe21-dd12-486e-a4e9-2af7cb7b9387"). InnerVolumeSpecName "kube-api-access-84fk6". PluginName "kubernetes.io/projected", VolumeGidValue ""
Jan 28 21:00:45 crc kubenswrapper[4746]: I0128 21:00:45.308605 4746 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/ea56fe21-dd12-486e-a4e9-2af7cb7b9387-config-data-custom" (OuterVolumeSpecName: "config-data-custom") pod "ea56fe21-dd12-486e-a4e9-2af7cb7b9387" (UID: "ea56fe21-dd12-486e-a4e9-2af7cb7b9387"). InnerVolumeSpecName "config-data-custom". PluginName "kubernetes.io/secret", VolumeGidValue ""
Jan 28 21:00:45 crc kubenswrapper[4746]: I0128 21:00:45.367757 4746 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/cloudkitty-proc-0"
Jan 28 21:00:45 crc kubenswrapper[4746]: I0128 21:00:45.375273 4746 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/ea56fe21-dd12-486e-a4e9-2af7cb7b9387-config-data" (OuterVolumeSpecName: "config-data") pod "ea56fe21-dd12-486e-a4e9-2af7cb7b9387" (UID: "ea56fe21-dd12-486e-a4e9-2af7cb7b9387"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue ""
Jan 28 21:00:45 crc kubenswrapper[4746]: I0128 21:00:45.397317 4746 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/ea56fe21-dd12-486e-a4e9-2af7cb7b9387-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "ea56fe21-dd12-486e-a4e9-2af7cb7b9387" (UID: "ea56fe21-dd12-486e-a4e9-2af7cb7b9387"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue ""
Jan 28 21:00:45 crc kubenswrapper[4746]: I0128 21:00:45.411761 4746 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/0bb735bc-29e9-4a5e-b6b4-a643776d9e44-scripts\") pod \"0bb735bc-29e9-4a5e-b6b4-a643776d9e44\" (UID: \"0bb735bc-29e9-4a5e-b6b4-a643776d9e44\") "
Jan 28 21:00:45 crc kubenswrapper[4746]: I0128 21:00:45.411843 4746 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/0bb735bc-29e9-4a5e-b6b4-a643776d9e44-config-data\") pod \"0bb735bc-29e9-4a5e-b6b4-a643776d9e44\" (UID: \"0bb735bc-29e9-4a5e-b6b4-a643776d9e44\") "
Jan 28 21:00:45 crc kubenswrapper[4746]: I0128 21:00:45.412052 4746 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"certs\" (UniqueName: \"kubernetes.io/projected/0bb735bc-29e9-4a5e-b6b4-a643776d9e44-certs\") pod \"0bb735bc-29e9-4a5e-b6b4-a643776d9e44\" (UID: \"0bb735bc-29e9-4a5e-b6b4-a643776d9e44\") "
Jan 28 21:00:45 crc kubenswrapper[4746]: I0128 21:00:45.412125 4746 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-rjnsz\" (UniqueName: \"kubernetes.io/projected/0bb735bc-29e9-4a5e-b6b4-a643776d9e44-kube-api-access-rjnsz\") pod \"0bb735bc-29e9-4a5e-b6b4-a643776d9e44\" (UID: \"0bb735bc-29e9-4a5e-b6b4-a643776d9e44\") "
Jan 28 21:00:45 crc kubenswrapper[4746]: I0128 21:00:45.412159 4746 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/ea56fe21-dd12-486e-a4e9-2af7cb7b9387-combined-ca-bundle\") pod \"ea56fe21-dd12-486e-a4e9-2af7cb7b9387\" (UID: \"ea56fe21-dd12-486e-a4e9-2af7cb7b9387\") "
Jan 28 21:00:45 crc kubenswrapper[4746]: I0128 21:00:45.412188 4746 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/0bb735bc-29e9-4a5e-b6b4-a643776d9e44-config-data-custom\") pod \"0bb735bc-29e9-4a5e-b6b4-a643776d9e44\" (UID: \"0bb735bc-29e9-4a5e-b6b4-a643776d9e44\") "
Jan 28 21:00:45 crc kubenswrapper[4746]: I0128 21:00:45.412209 4746 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/0bb735bc-29e9-4a5e-b6b4-a643776d9e44-combined-ca-bundle\") pod \"0bb735bc-29e9-4a5e-b6b4-a643776d9e44\" (UID: \"0bb735bc-29e9-4a5e-b6b4-a643776d9e44\") "
Jan 28 21:00:45 crc kubenswrapper[4746]: I0128 21:00:45.413029 4746 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-84fk6\" (UniqueName: \"kubernetes.io/projected/ea56fe21-dd12-486e-a4e9-2af7cb7b9387-kube-api-access-84fk6\") on node \"crc\" DevicePath \"\""
Jan 28 21:00:45 crc kubenswrapper[4746]: I0128 21:00:45.413056 4746 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/ea56fe21-dd12-486e-a4e9-2af7cb7b9387-config-data\") on node \"crc\" DevicePath \"\""
Jan 28 21:00:45 crc kubenswrapper[4746]: I0128 21:00:45.413069 4746 reconciler_common.go:293] "Volume detached for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/ea56fe21-dd12-486e-a4e9-2af7cb7b9387-config-data-custom\") on node \"crc\" DevicePath \"\""
Jan 28 21:00:45 crc kubenswrapper[4746]: W0128 21:00:45.415351 4746 empty_dir.go:500] Warning: Unmount skipped because path does not exist: /var/lib/kubelet/pods/ea56fe21-dd12-486e-a4e9-2af7cb7b9387/volumes/kubernetes.io~secret/combined-ca-bundle
Jan 28 21:00:45 crc kubenswrapper[4746]: I0128 21:00:45.415377 4746 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/ea56fe21-dd12-486e-a4e9-2af7cb7b9387-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "ea56fe21-dd12-486e-a4e9-2af7cb7b9387" (UID: "ea56fe21-dd12-486e-a4e9-2af7cb7b9387"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue ""
Jan 28 21:00:45 crc kubenswrapper[4746]: I0128 21:00:45.423915 4746 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/0bb735bc-29e9-4a5e-b6b4-a643776d9e44-scripts" (OuterVolumeSpecName: "scripts") pod "0bb735bc-29e9-4a5e-b6b4-a643776d9e44" (UID: "0bb735bc-29e9-4a5e-b6b4-a643776d9e44"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue ""
Jan 28 21:00:45 crc kubenswrapper[4746]: I0128 21:00:45.424242 4746 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/0bb735bc-29e9-4a5e-b6b4-a643776d9e44-kube-api-access-rjnsz" (OuterVolumeSpecName: "kube-api-access-rjnsz") pod "0bb735bc-29e9-4a5e-b6b4-a643776d9e44" (UID: "0bb735bc-29e9-4a5e-b6b4-a643776d9e44"). InnerVolumeSpecName "kube-api-access-rjnsz". PluginName "kubernetes.io/projected", VolumeGidValue ""
Jan 28 21:00:45 crc kubenswrapper[4746]: I0128 21:00:45.430374 4746 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/0bb735bc-29e9-4a5e-b6b4-a643776d9e44-certs" (OuterVolumeSpecName: "certs") pod "0bb735bc-29e9-4a5e-b6b4-a643776d9e44" (UID: "0bb735bc-29e9-4a5e-b6b4-a643776d9e44"). InnerVolumeSpecName "certs". PluginName "kubernetes.io/projected", VolumeGidValue ""
Jan 28 21:00:45 crc kubenswrapper[4746]: I0128 21:00:45.443222 4746 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/0bb735bc-29e9-4a5e-b6b4-a643776d9e44-config-data-custom" (OuterVolumeSpecName: "config-data-custom") pod "0bb735bc-29e9-4a5e-b6b4-a643776d9e44" (UID: "0bb735bc-29e9-4a5e-b6b4-a643776d9e44"). InnerVolumeSpecName "config-data-custom". PluginName "kubernetes.io/secret", VolumeGidValue ""
Jan 28 21:00:45 crc kubenswrapper[4746]: I0128 21:00:45.465305 4746 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/0bb735bc-29e9-4a5e-b6b4-a643776d9e44-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "0bb735bc-29e9-4a5e-b6b4-a643776d9e44" (UID: "0bb735bc-29e9-4a5e-b6b4-a643776d9e44"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue ""
Jan 28 21:00:45 crc kubenswrapper[4746]: I0128 21:00:45.502956 4746 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/0bb735bc-29e9-4a5e-b6b4-a643776d9e44-config-data" (OuterVolumeSpecName: "config-data") pod "0bb735bc-29e9-4a5e-b6b4-a643776d9e44" (UID: "0bb735bc-29e9-4a5e-b6b4-a643776d9e44"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue ""
Jan 28 21:00:45 crc kubenswrapper[4746]: I0128 21:00:45.507879 4746 generic.go:334] "Generic (PLEG): container finished" podID="ea56fe21-dd12-486e-a4e9-2af7cb7b9387" containerID="c64bfb4b8a459b76abaf202a730fc58370844fe34425ab49b2fb2b2f77e9b7f7" exitCode=0
Jan 28 21:00:45 crc kubenswrapper[4746]: I0128 21:00:45.507942 4746 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-api-6dcb95875d-qz4xq" event={"ID":"ea56fe21-dd12-486e-a4e9-2af7cb7b9387","Type":"ContainerDied","Data":"c64bfb4b8a459b76abaf202a730fc58370844fe34425ab49b2fb2b2f77e9b7f7"}
Jan 28 21:00:45 crc kubenswrapper[4746]: I0128 21:00:45.507973 4746 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-api-6dcb95875d-qz4xq" event={"ID":"ea56fe21-dd12-486e-a4e9-2af7cb7b9387","Type":"ContainerDied","Data":"e8f82981b2a7185afefdd8be003aac5f283992841076c71eee0af3d9fa599e88"}
Jan 28 21:00:45 crc kubenswrapper[4746]: I0128 21:00:45.507990 4746 scope.go:117] "RemoveContainer" containerID="c64bfb4b8a459b76abaf202a730fc58370844fe34425ab49b2fb2b2f77e9b7f7"
Jan 28 21:00:45 crc kubenswrapper[4746]: I0128 21:00:45.508145 4746 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/barbican-api-6dcb95875d-qz4xq"
Jan 28 21:00:45 crc kubenswrapper[4746]: I0128 21:00:45.509571 4746 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/swift-proxy-5d6f7ddd75-47x9g"]
Jan 28 21:00:45 crc kubenswrapper[4746]: E0128 21:00:45.510036 4746 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="ee1b35a7-970d-4abf-b645-eebbcadd7e8e" containerName="dnsmasq-dns"
Jan 28 21:00:45 crc kubenswrapper[4746]: I0128 21:00:45.510055 4746 state_mem.go:107] "Deleted CPUSet assignment" podUID="ee1b35a7-970d-4abf-b645-eebbcadd7e8e" containerName="dnsmasq-dns"
Jan 28 21:00:45 crc kubenswrapper[4746]: E0128 21:00:45.510069 4746 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="ee1b35a7-970d-4abf-b645-eebbcadd7e8e" containerName="init"
Jan 28 21:00:45 crc kubenswrapper[4746]: I0128 21:00:45.510075 4746 state_mem.go:107] "Deleted CPUSet assignment" podUID="ee1b35a7-970d-4abf-b645-eebbcadd7e8e" containerName="init"
Jan 28 21:00:45 crc kubenswrapper[4746]: E0128 21:00:45.510100 4746 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="ea56fe21-dd12-486e-a4e9-2af7cb7b9387" containerName="barbican-api"
Jan 28 21:00:45 crc kubenswrapper[4746]: I0128 21:00:45.510107 4746 state_mem.go:107] "Deleted CPUSet assignment" podUID="ea56fe21-dd12-486e-a4e9-2af7cb7b9387" containerName="barbican-api"
Jan 28 21:00:45 crc kubenswrapper[4746]: E0128 21:00:45.510122 4746 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="0bb735bc-29e9-4a5e-b6b4-a643776d9e44" containerName="cloudkitty-proc"
Jan 28 21:00:45 crc kubenswrapper[4746]: I0128 21:00:45.510128 4746 state_mem.go:107] "Deleted CPUSet assignment" podUID="0bb735bc-29e9-4a5e-b6b4-a643776d9e44" containerName="cloudkitty-proc"
Jan 28 21:00:45 crc kubenswrapper[4746]: E0128 21:00:45.510146 4746 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="ea56fe21-dd12-486e-a4e9-2af7cb7b9387" containerName="barbican-api-log"
Jan 28 21:00:45 crc kubenswrapper[4746]: I0128 21:00:45.510151 4746 state_mem.go:107] "Deleted CPUSet assignment" podUID="ea56fe21-dd12-486e-a4e9-2af7cb7b9387" containerName="barbican-api-log"
Jan 28 21:00:45 crc kubenswrapper[4746]: I0128 21:00:45.510358 4746 memory_manager.go:354] "RemoveStaleState removing state" podUID="ea56fe21-dd12-486e-a4e9-2af7cb7b9387" containerName="barbican-api-log"
Jan 28 21:00:45 crc kubenswrapper[4746]: I0128 21:00:45.510368 4746 memory_manager.go:354] "RemoveStaleState removing state" podUID="ee1b35a7-970d-4abf-b645-eebbcadd7e8e" containerName="dnsmasq-dns"
Jan 28 21:00:45 crc kubenswrapper[4746]: I0128 21:00:45.510379 4746 memory_manager.go:354] "RemoveStaleState removing state" podUID="0bb735bc-29e9-4a5e-b6b4-a643776d9e44" containerName="cloudkitty-proc"
Jan 28 21:00:45 crc kubenswrapper[4746]: I0128 21:00:45.510394 4746 memory_manager.go:354] "RemoveStaleState removing state" podUID="ea56fe21-dd12-486e-a4e9-2af7cb7b9387" containerName="barbican-api"
Jan 28 21:00:45 crc kubenswrapper[4746]: I0128 21:00:45.511463 4746 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/swift-proxy-5d6f7ddd75-47x9g"
Jan 28 21:00:45 crc kubenswrapper[4746]: I0128 21:00:45.517207 4746 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-swift-public-svc"
Jan 28 21:00:45 crc kubenswrapper[4746]: I0128 21:00:45.517475 4746 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"swift-proxy-config-data"
Jan 28 21:00:45 crc kubenswrapper[4746]: I0128 21:00:45.517609 4746 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-swift-internal-svc"
Jan 28 21:00:45 crc kubenswrapper[4746]: I0128 21:00:45.519239 4746 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/0bb735bc-29e9-4a5e-b6b4-a643776d9e44-scripts\") on node \"crc\" DevicePath \"\""
Jan 28 21:00:45 crc kubenswrapper[4746]: I0128 21:00:45.519367 4746 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/0bb735bc-29e9-4a5e-b6b4-a643776d9e44-config-data\") on node \"crc\" DevicePath \"\""
Jan 28 21:00:45 crc kubenswrapper[4746]: I0128 21:00:45.519446 4746 reconciler_common.go:293] "Volume detached for volume \"certs\" (UniqueName: \"kubernetes.io/projected/0bb735bc-29e9-4a5e-b6b4-a643776d9e44-certs\") on node \"crc\" DevicePath \"\""
Jan 28 21:00:45 crc kubenswrapper[4746]: I0128 21:00:45.519516 4746 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-rjnsz\" (UniqueName: \"kubernetes.io/projected/0bb735bc-29e9-4a5e-b6b4-a643776d9e44-kube-api-access-rjnsz\") on node \"crc\" DevicePath \"\""
Jan 28 21:00:45 crc kubenswrapper[4746]: I0128 21:00:45.519592 4746 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/ea56fe21-dd12-486e-a4e9-2af7cb7b9387-combined-ca-bundle\") on node \"crc\" DevicePath \"\""
Jan 28 21:00:45 crc kubenswrapper[4746]: I0128 21:00:45.519662 4746 reconciler_common.go:293] "Volume detached for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/0bb735bc-29e9-4a5e-b6b4-a643776d9e44-config-data-custom\") on node \"crc\" DevicePath \"\""
Jan 28 21:00:45 crc kubenswrapper[4746]: I0128 21:00:45.519730 4746 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/0bb735bc-29e9-4a5e-b6b4-a643776d9e44-combined-ca-bundle\") on node \"crc\" DevicePath \"\""
Jan 28 21:00:45 crc kubenswrapper[4746]: I0128 21:00:45.548178 4746 generic.go:334] "Generic (PLEG): container finished" podID="0bb735bc-29e9-4a5e-b6b4-a643776d9e44" containerID="3e8f0d0271a494ec0a7737528b6d58303a35565a92add21bf5b49f48ff90e0bf" exitCode=0
Jan 28 21:00:45 crc kubenswrapper[4746]: I0128 21:00:45.548426 4746 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/cloudkitty-proc-0"
Jan 28 21:00:45 crc kubenswrapper[4746]: I0128 21:00:45.548467 4746 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cloudkitty-proc-0" event={"ID":"0bb735bc-29e9-4a5e-b6b4-a643776d9e44","Type":"ContainerDied","Data":"3e8f0d0271a494ec0a7737528b6d58303a35565a92add21bf5b49f48ff90e0bf"}
Jan 28 21:00:45 crc kubenswrapper[4746]: I0128 21:00:45.571898 4746 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/swift-proxy-5d6f7ddd75-47x9g"]
Jan 28 21:00:45 crc kubenswrapper[4746]: I0128 21:00:45.571956 4746 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cloudkitty-proc-0" event={"ID":"0bb735bc-29e9-4a5e-b6b4-a643776d9e44","Type":"ContainerDied","Data":"608f778a2eb545def2a95955d51b09c7e0e15c1d1b0e332b36bb1d368328b0cf"}
Jan 28 21:00:45 crc kubenswrapper[4746]: I0128 21:00:45.622124 4746 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/4524d3f6-9b61-4b9c-b778-0078a31efc3e-internal-tls-certs\") pod \"swift-proxy-5d6f7ddd75-47x9g\" (UID: \"4524d3f6-9b61-4b9c-b778-0078a31efc3e\") " pod="openstack/swift-proxy-5d6f7ddd75-47x9g"
Jan 28 21:00:45 crc kubenswrapper[4746]: I0128 21:00:45.622454 4746 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/projected/4524d3f6-9b61-4b9c-b778-0078a31efc3e-etc-swift\") pod \"swift-proxy-5d6f7ddd75-47x9g\" (UID: \"4524d3f6-9b61-4b9c-b778-0078a31efc3e\") " pod="openstack/swift-proxy-5d6f7ddd75-47x9g"
Jan 28 21:00:45 crc kubenswrapper[4746]: I0128 21:00:45.622519 4746 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/4524d3f6-9b61-4b9c-b778-0078a31efc3e-combined-ca-bundle\") pod \"swift-proxy-5d6f7ddd75-47x9g\" (UID: \"4524d3f6-9b61-4b9c-b778-0078a31efc3e\") " pod="openstack/swift-proxy-5d6f7ddd75-47x9g"
Jan 28 21:00:45 crc kubenswrapper[4746]: I0128 21:00:45.622720 4746 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/4524d3f6-9b61-4b9c-b778-0078a31efc3e-run-httpd\") pod \"swift-proxy-5d6f7ddd75-47x9g\" (UID: \"4524d3f6-9b61-4b9c-b778-0078a31efc3e\") " pod="openstack/swift-proxy-5d6f7ddd75-47x9g"
Jan 28 21:00:45 crc kubenswrapper[4746]: I0128 21:00:45.622785 4746 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/4524d3f6-9b61-4b9c-b778-0078a31efc3e-log-httpd\") pod \"swift-proxy-5d6f7ddd75-47x9g\" (UID: \"4524d3f6-9b61-4b9c-b778-0078a31efc3e\") " pod="openstack/swift-proxy-5d6f7ddd75-47x9g"
Jan 28 21:00:45 crc kubenswrapper[4746]: I0128 21:00:45.622857 4746 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/4524d3f6-9b61-4b9c-b778-0078a31efc3e-config-data\") pod \"swift-proxy-5d6f7ddd75-47x9g\" (UID: \"4524d3f6-9b61-4b9c-b778-0078a31efc3e\") " pod="openstack/swift-proxy-5d6f7ddd75-47x9g"
Jan 28 21:00:45 crc kubenswrapper[4746]: I0128 21:00:45.622884 4746 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-l56z9\" (UniqueName: \"kubernetes.io/projected/4524d3f6-9b61-4b9c-b778-0078a31efc3e-kube-api-access-l56z9\") pod \"swift-proxy-5d6f7ddd75-47x9g\" (UID: \"4524d3f6-9b61-4b9c-b778-0078a31efc3e\") " pod="openstack/swift-proxy-5d6f7ddd75-47x9g"
Jan 28 21:00:45 crc kubenswrapper[4746]: I0128 21:00:45.623360 4746 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/4524d3f6-9b61-4b9c-b778-0078a31efc3e-public-tls-certs\") pod \"swift-proxy-5d6f7ddd75-47x9g\" (UID: \"4524d3f6-9b61-4b9c-b778-0078a31efc3e\") " pod="openstack/swift-proxy-5d6f7ddd75-47x9g"
Jan 28 21:00:45 crc kubenswrapper[4746]: I0128 21:00:45.661253 4746 scope.go:117] "RemoveContainer" containerID="15361ac8791076a3c1cf04c0bd4b99b14ed3f96b51ede463fe6767f6242bbd80"
Jan 28 21:00:45 crc kubenswrapper[4746]: I0128 21:00:45.666888 4746 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/barbican-api-6dcb95875d-qz4xq"]
Jan 28 21:00:45 crc kubenswrapper[4746]: I0128 21:00:45.676266 4746 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/barbican-api-6dcb95875d-qz4xq"]
Jan 28 21:00:45 crc kubenswrapper[4746]: I0128 21:00:45.724239 4746 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/cloudkitty-proc-0"]
Jan 28 21:00:45 crc kubenswrapper[4746]: I0128 21:00:45.725316 4746 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/4524d3f6-9b61-4b9c-b778-0078a31efc3e-log-httpd\") pod \"swift-proxy-5d6f7ddd75-47x9g\" (UID: \"4524d3f6-9b61-4b9c-b778-0078a31efc3e\") " pod="openstack/swift-proxy-5d6f7ddd75-47x9g"
Jan 28 21:00:45 crc kubenswrapper[4746]: I0128 21:00:45.725369 4746 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-l56z9\" (UniqueName: \"kubernetes.io/projected/4524d3f6-9b61-4b9c-b778-0078a31efc3e-kube-api-access-l56z9\") pod \"swift-proxy-5d6f7ddd75-47x9g\" (UID: \"4524d3f6-9b61-4b9c-b778-0078a31efc3e\") " pod="openstack/swift-proxy-5d6f7ddd75-47x9g"
Jan 28 21:00:45 crc kubenswrapper[4746]: I0128 21:00:45.725386 4746 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/4524d3f6-9b61-4b9c-b778-0078a31efc3e-config-data\") pod \"swift-proxy-5d6f7ddd75-47x9g\" (UID: \"4524d3f6-9b61-4b9c-b778-0078a31efc3e\") " pod="openstack/swift-proxy-5d6f7ddd75-47x9g"
Jan 28 21:00:45 crc kubenswrapper[4746]: I0128 21:00:45.725464 4746 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/4524d3f6-9b61-4b9c-b778-0078a31efc3e-public-tls-certs\") pod \"swift-proxy-5d6f7ddd75-47x9g\" (UID: \"4524d3f6-9b61-4b9c-b778-0078a31efc3e\") " pod="openstack/swift-proxy-5d6f7ddd75-47x9g"
Jan 28 21:00:45 crc kubenswrapper[4746]: I0128 21:00:45.725495 4746 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/4524d3f6-9b61-4b9c-b778-0078a31efc3e-internal-tls-certs\") pod \"swift-proxy-5d6f7ddd75-47x9g\" (UID: \"4524d3f6-9b61-4b9c-b778-0078a31efc3e\") " pod="openstack/swift-proxy-5d6f7ddd75-47x9g"
Jan 28 21:00:45 crc kubenswrapper[4746]: I0128 21:00:45.725530 4746 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/projected/4524d3f6-9b61-4b9c-b778-0078a31efc3e-etc-swift\") pod \"swift-proxy-5d6f7ddd75-47x9g\" (UID: \"4524d3f6-9b61-4b9c-b778-0078a31efc3e\") " pod="openstack/swift-proxy-5d6f7ddd75-47x9g"
Jan 28 21:00:45 crc kubenswrapper[4746]: I0128 21:00:45.725567 4746 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/4524d3f6-9b61-4b9c-b778-0078a31efc3e-combined-ca-bundle\") pod \"swift-proxy-5d6f7ddd75-47x9g\" (UID: \"4524d3f6-9b61-4b9c-b778-0078a31efc3e\") " pod="openstack/swift-proxy-5d6f7ddd75-47x9g"
Jan 28 21:00:45 crc kubenswrapper[4746]: I0128 21:00:45.725585 4746 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/4524d3f6-9b61-4b9c-b778-0078a31efc3e-run-httpd\") pod \"swift-proxy-5d6f7ddd75-47x9g\" (UID: \"4524d3f6-9b61-4b9c-b778-0078a31efc3e\") " pod="openstack/swift-proxy-5d6f7ddd75-47x9g"
Jan 28 21:00:45 crc kubenswrapper[4746]: I0128 21:00:45.725990 4746 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/4524d3f6-9b61-4b9c-b778-0078a31efc3e-run-httpd\") pod \"swift-proxy-5d6f7ddd75-47x9g\" (UID: \"4524d3f6-9b61-4b9c-b778-0078a31efc3e\") " pod="openstack/swift-proxy-5d6f7ddd75-47x9g"
Jan 28 21:00:45 crc kubenswrapper[4746]: I0128 21:00:45.731507 4746 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/4524d3f6-9b61-4b9c-b778-0078a31efc3e-internal-tls-certs\") pod \"swift-proxy-5d6f7ddd75-47x9g\" (UID: \"4524d3f6-9b61-4b9c-b778-0078a31efc3e\") " pod="openstack/swift-proxy-5d6f7ddd75-47x9g"
Jan 28 21:00:45 crc kubenswrapper[4746]: I0128 21:00:45.732069 4746 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/4524d3f6-9b61-4b9c-b778-0078a31efc3e-public-tls-certs\") pod \"swift-proxy-5d6f7ddd75-47x9g\" (UID: \"4524d3f6-9b61-4b9c-b778-0078a31efc3e\") " pod="openstack/swift-proxy-5d6f7ddd75-47x9g"
Jan 28 21:00:45 crc kubenswrapper[4746]: I0128 21:00:45.732117 4746 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/4524d3f6-9b61-4b9c-b778-0078a31efc3e-log-httpd\") pod \"swift-proxy-5d6f7ddd75-47x9g\" (UID: \"4524d3f6-9b61-4b9c-b778-0078a31efc3e\") " pod="openstack/swift-proxy-5d6f7ddd75-47x9g"
Jan 28 21:00:45 crc kubenswrapper[4746]: I0128 21:00:45.733736 4746 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/4524d3f6-9b61-4b9c-b778-0078a31efc3e-combined-ca-bundle\") pod \"swift-proxy-5d6f7ddd75-47x9g\" (UID: \"4524d3f6-9b61-4b9c-b778-0078a31efc3e\") " pod="openstack/swift-proxy-5d6f7ddd75-47x9g"
Jan 28 21:00:45 crc kubenswrapper[4746]: I0128 21:00:45.738031 4746 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/cloudkitty-proc-0"]
Jan 28 21:00:45 crc kubenswrapper[4746]: I0128 21:00:45.738611 4746 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/4524d3f6-9b61-4b9c-b778-0078a31efc3e-config-data\") pod \"swift-proxy-5d6f7ddd75-47x9g\" (UID: \"4524d3f6-9b61-4b9c-b778-0078a31efc3e\") " pod="openstack/swift-proxy-5d6f7ddd75-47x9g"
Jan 28 21:00:45 crc kubenswrapper[4746]: I0128 21:00:45.742234 4746 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-swift\" (UniqueName: \"kubernetes.io/projected/4524d3f6-9b61-4b9c-b778-0078a31efc3e-etc-swift\") pod \"swift-proxy-5d6f7ddd75-47x9g\" (UID: \"4524d3f6-9b61-4b9c-b778-0078a31efc3e\") " pod="openstack/swift-proxy-5d6f7ddd75-47x9g"
Jan 28 21:00:45 crc kubenswrapper[4746]: I0128 21:00:45.758208 4746 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/cloudkitty-proc-0"]
Jan 28 21:00:45 crc kubenswrapper[4746]: I0128 21:00:45.760388 4746 util.go:30] "No sandbox for pod can be found.
Need to start a new one" pod="openstack/cloudkitty-proc-0" Jan 28 21:00:45 crc kubenswrapper[4746]: I0128 21:00:45.762308 4746 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-l56z9\" (UniqueName: \"kubernetes.io/projected/4524d3f6-9b61-4b9c-b778-0078a31efc3e-kube-api-access-l56z9\") pod \"swift-proxy-5d6f7ddd75-47x9g\" (UID: \"4524d3f6-9b61-4b9c-b778-0078a31efc3e\") " pod="openstack/swift-proxy-5d6f7ddd75-47x9g" Jan 28 21:00:45 crc kubenswrapper[4746]: I0128 21:00:45.762575 4746 scope.go:117] "RemoveContainer" containerID="c64bfb4b8a459b76abaf202a730fc58370844fe34425ab49b2fb2b2f77e9b7f7" Jan 28 21:00:45 crc kubenswrapper[4746]: I0128 21:00:45.765571 4746 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cloudkitty-proc-config-data" Jan 28 21:00:45 crc kubenswrapper[4746]: E0128 21:00:45.766540 4746 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"c64bfb4b8a459b76abaf202a730fc58370844fe34425ab49b2fb2b2f77e9b7f7\": container with ID starting with c64bfb4b8a459b76abaf202a730fc58370844fe34425ab49b2fb2b2f77e9b7f7 not found: ID does not exist" containerID="c64bfb4b8a459b76abaf202a730fc58370844fe34425ab49b2fb2b2f77e9b7f7" Jan 28 21:00:45 crc kubenswrapper[4746]: I0128 21:00:45.766578 4746 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"c64bfb4b8a459b76abaf202a730fc58370844fe34425ab49b2fb2b2f77e9b7f7"} err="failed to get container status \"c64bfb4b8a459b76abaf202a730fc58370844fe34425ab49b2fb2b2f77e9b7f7\": rpc error: code = NotFound desc = could not find container \"c64bfb4b8a459b76abaf202a730fc58370844fe34425ab49b2fb2b2f77e9b7f7\": container with ID starting with c64bfb4b8a459b76abaf202a730fc58370844fe34425ab49b2fb2b2f77e9b7f7 not found: ID does not exist" Jan 28 21:00:45 crc kubenswrapper[4746]: I0128 21:00:45.766613 4746 scope.go:117] "RemoveContainer" 
containerID="15361ac8791076a3c1cf04c0bd4b99b14ed3f96b51ede463fe6767f6242bbd80" Jan 28 21:00:45 crc kubenswrapper[4746]: E0128 21:00:45.767368 4746 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"15361ac8791076a3c1cf04c0bd4b99b14ed3f96b51ede463fe6767f6242bbd80\": container with ID starting with 15361ac8791076a3c1cf04c0bd4b99b14ed3f96b51ede463fe6767f6242bbd80 not found: ID does not exist" containerID="15361ac8791076a3c1cf04c0bd4b99b14ed3f96b51ede463fe6767f6242bbd80" Jan 28 21:00:45 crc kubenswrapper[4746]: I0128 21:00:45.767399 4746 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"15361ac8791076a3c1cf04c0bd4b99b14ed3f96b51ede463fe6767f6242bbd80"} err="failed to get container status \"15361ac8791076a3c1cf04c0bd4b99b14ed3f96b51ede463fe6767f6242bbd80\": rpc error: code = NotFound desc = could not find container \"15361ac8791076a3c1cf04c0bd4b99b14ed3f96b51ede463fe6767f6242bbd80\": container with ID starting with 15361ac8791076a3c1cf04c0bd4b99b14ed3f96b51ede463fe6767f6242bbd80 not found: ID does not exist" Jan 28 21:00:45 crc kubenswrapper[4746]: I0128 21:00:45.767418 4746 scope.go:117] "RemoveContainer" containerID="3e8f0d0271a494ec0a7737528b6d58303a35565a92add21bf5b49f48ff90e0bf" Jan 28 21:00:45 crc kubenswrapper[4746]: I0128 21:00:45.769425 4746 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cloudkitty-proc-0"] Jan 28 21:00:45 crc kubenswrapper[4746]: I0128 21:00:45.830739 4746 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/8eab90e7-58c5-4bdf-bca6-12c78bdabea9-config-data-custom\") pod \"cloudkitty-proc-0\" (UID: \"8eab90e7-58c5-4bdf-bca6-12c78bdabea9\") " pod="openstack/cloudkitty-proc-0" Jan 28 21:00:45 crc kubenswrapper[4746]: I0128 21:00:45.830819 4746 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"certs\" (UniqueName: \"kubernetes.io/projected/8eab90e7-58c5-4bdf-bca6-12c78bdabea9-certs\") pod \"cloudkitty-proc-0\" (UID: \"8eab90e7-58c5-4bdf-bca6-12c78bdabea9\") " pod="openstack/cloudkitty-proc-0" Jan 28 21:00:45 crc kubenswrapper[4746]: I0128 21:00:45.830884 4746 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-6k576\" (UniqueName: \"kubernetes.io/projected/8eab90e7-58c5-4bdf-bca6-12c78bdabea9-kube-api-access-6k576\") pod \"cloudkitty-proc-0\" (UID: \"8eab90e7-58c5-4bdf-bca6-12c78bdabea9\") " pod="openstack/cloudkitty-proc-0" Jan 28 21:00:45 crc kubenswrapper[4746]: I0128 21:00:45.830946 4746 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/8eab90e7-58c5-4bdf-bca6-12c78bdabea9-config-data\") pod \"cloudkitty-proc-0\" (UID: \"8eab90e7-58c5-4bdf-bca6-12c78bdabea9\") " pod="openstack/cloudkitty-proc-0" Jan 28 21:00:45 crc kubenswrapper[4746]: I0128 21:00:45.830988 4746 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/8eab90e7-58c5-4bdf-bca6-12c78bdabea9-scripts\") pod \"cloudkitty-proc-0\" (UID: \"8eab90e7-58c5-4bdf-bca6-12c78bdabea9\") " pod="openstack/cloudkitty-proc-0" Jan 28 21:00:45 crc kubenswrapper[4746]: I0128 21:00:45.831020 4746 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/8eab90e7-58c5-4bdf-bca6-12c78bdabea9-combined-ca-bundle\") pod \"cloudkitty-proc-0\" (UID: \"8eab90e7-58c5-4bdf-bca6-12c78bdabea9\") " pod="openstack/cloudkitty-proc-0" Jan 28 21:00:45 crc kubenswrapper[4746]: I0128 21:00:45.862859 4746 scope.go:117] "RemoveContainer" containerID="3e8f0d0271a494ec0a7737528b6d58303a35565a92add21bf5b49f48ff90e0bf" Jan 28 
21:00:45 crc kubenswrapper[4746]: E0128 21:00:45.863417 4746 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"3e8f0d0271a494ec0a7737528b6d58303a35565a92add21bf5b49f48ff90e0bf\": container with ID starting with 3e8f0d0271a494ec0a7737528b6d58303a35565a92add21bf5b49f48ff90e0bf not found: ID does not exist" containerID="3e8f0d0271a494ec0a7737528b6d58303a35565a92add21bf5b49f48ff90e0bf" Jan 28 21:00:45 crc kubenswrapper[4746]: I0128 21:00:45.863468 4746 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"3e8f0d0271a494ec0a7737528b6d58303a35565a92add21bf5b49f48ff90e0bf"} err="failed to get container status \"3e8f0d0271a494ec0a7737528b6d58303a35565a92add21bf5b49f48ff90e0bf\": rpc error: code = NotFound desc = could not find container \"3e8f0d0271a494ec0a7737528b6d58303a35565a92add21bf5b49f48ff90e0bf\": container with ID starting with 3e8f0d0271a494ec0a7737528b6d58303a35565a92add21bf5b49f48ff90e0bf not found: ID does not exist" Jan 28 21:00:45 crc kubenswrapper[4746]: I0128 21:00:45.881917 4746 patch_prober.go:28] interesting pod/machine-config-daemon-6wrnw container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Jan 28 21:00:45 crc kubenswrapper[4746]: I0128 21:00:45.881977 4746 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-6wrnw" podUID="6dc8b546-9734-4082-b2b3-2bafe3f1564d" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Jan 28 21:00:45 crc kubenswrapper[4746]: I0128 21:00:45.882018 4746 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" 
pod="openshift-machine-config-operator/machine-config-daemon-6wrnw" Jan 28 21:00:45 crc kubenswrapper[4746]: I0128 21:00:45.883063 4746 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"6862c0afb8f6ee7e41759258bd8f935df2c29be354b170c8fd2a76edbba23242"} pod="openshift-machine-config-operator/machine-config-daemon-6wrnw" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted" Jan 28 21:00:45 crc kubenswrapper[4746]: I0128 21:00:45.883129 4746 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-6wrnw" podUID="6dc8b546-9734-4082-b2b3-2bafe3f1564d" containerName="machine-config-daemon" containerID="cri-o://6862c0afb8f6ee7e41759258bd8f935df2c29be354b170c8fd2a76edbba23242" gracePeriod=600 Jan 28 21:00:45 crc kubenswrapper[4746]: I0128 21:00:45.932252 4746 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/8eab90e7-58c5-4bdf-bca6-12c78bdabea9-scripts\") pod \"cloudkitty-proc-0\" (UID: \"8eab90e7-58c5-4bdf-bca6-12c78bdabea9\") " pod="openstack/cloudkitty-proc-0" Jan 28 21:00:45 crc kubenswrapper[4746]: I0128 21:00:45.932307 4746 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/8eab90e7-58c5-4bdf-bca6-12c78bdabea9-combined-ca-bundle\") pod \"cloudkitty-proc-0\" (UID: \"8eab90e7-58c5-4bdf-bca6-12c78bdabea9\") " pod="openstack/cloudkitty-proc-0" Jan 28 21:00:45 crc kubenswrapper[4746]: I0128 21:00:45.932450 4746 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/8eab90e7-58c5-4bdf-bca6-12c78bdabea9-config-data-custom\") pod \"cloudkitty-proc-0\" (UID: \"8eab90e7-58c5-4bdf-bca6-12c78bdabea9\") " pod="openstack/cloudkitty-proc-0" Jan 28 
21:00:45 crc kubenswrapper[4746]: I0128 21:00:45.932513 4746 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"certs\" (UniqueName: \"kubernetes.io/projected/8eab90e7-58c5-4bdf-bca6-12c78bdabea9-certs\") pod \"cloudkitty-proc-0\" (UID: \"8eab90e7-58c5-4bdf-bca6-12c78bdabea9\") " pod="openstack/cloudkitty-proc-0" Jan 28 21:00:45 crc kubenswrapper[4746]: I0128 21:00:45.932569 4746 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-6k576\" (UniqueName: \"kubernetes.io/projected/8eab90e7-58c5-4bdf-bca6-12c78bdabea9-kube-api-access-6k576\") pod \"cloudkitty-proc-0\" (UID: \"8eab90e7-58c5-4bdf-bca6-12c78bdabea9\") " pod="openstack/cloudkitty-proc-0" Jan 28 21:00:45 crc kubenswrapper[4746]: I0128 21:00:45.932649 4746 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/8eab90e7-58c5-4bdf-bca6-12c78bdabea9-config-data\") pod \"cloudkitty-proc-0\" (UID: \"8eab90e7-58c5-4bdf-bca6-12c78bdabea9\") " pod="openstack/cloudkitty-proc-0" Jan 28 21:00:45 crc kubenswrapper[4746]: I0128 21:00:45.936281 4746 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/8eab90e7-58c5-4bdf-bca6-12c78bdabea9-scripts\") pod \"cloudkitty-proc-0\" (UID: \"8eab90e7-58c5-4bdf-bca6-12c78bdabea9\") " pod="openstack/cloudkitty-proc-0" Jan 28 21:00:45 crc kubenswrapper[4746]: I0128 21:00:45.936761 4746 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/8eab90e7-58c5-4bdf-bca6-12c78bdabea9-combined-ca-bundle\") pod \"cloudkitty-proc-0\" (UID: \"8eab90e7-58c5-4bdf-bca6-12c78bdabea9\") " pod="openstack/cloudkitty-proc-0" Jan 28 21:00:45 crc kubenswrapper[4746]: I0128 21:00:45.939336 4746 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-custom\" (UniqueName: 
\"kubernetes.io/secret/8eab90e7-58c5-4bdf-bca6-12c78bdabea9-config-data-custom\") pod \"cloudkitty-proc-0\" (UID: \"8eab90e7-58c5-4bdf-bca6-12c78bdabea9\") " pod="openstack/cloudkitty-proc-0" Jan 28 21:00:45 crc kubenswrapper[4746]: I0128 21:00:45.942566 4746 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/8eab90e7-58c5-4bdf-bca6-12c78bdabea9-config-data\") pod \"cloudkitty-proc-0\" (UID: \"8eab90e7-58c5-4bdf-bca6-12c78bdabea9\") " pod="openstack/cloudkitty-proc-0" Jan 28 21:00:45 crc kubenswrapper[4746]: I0128 21:00:45.943218 4746 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"certs\" (UniqueName: \"kubernetes.io/projected/8eab90e7-58c5-4bdf-bca6-12c78bdabea9-certs\") pod \"cloudkitty-proc-0\" (UID: \"8eab90e7-58c5-4bdf-bca6-12c78bdabea9\") " pod="openstack/cloudkitty-proc-0" Jan 28 21:00:45 crc kubenswrapper[4746]: I0128 21:00:45.963159 4746 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/swift-proxy-5d6f7ddd75-47x9g" Jan 28 21:00:45 crc kubenswrapper[4746]: I0128 21:00:45.965669 4746 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-6k576\" (UniqueName: \"kubernetes.io/projected/8eab90e7-58c5-4bdf-bca6-12c78bdabea9-kube-api-access-6k576\") pod \"cloudkitty-proc-0\" (UID: \"8eab90e7-58c5-4bdf-bca6-12c78bdabea9\") " pod="openstack/cloudkitty-proc-0" Jan 28 21:00:46 crc kubenswrapper[4746]: I0128 21:00:46.174146 4746 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/cloudkitty-proc-0" Jan 28 21:00:46 crc kubenswrapper[4746]: I0128 21:00:46.584270 4746 generic.go:334] "Generic (PLEG): container finished" podID="6dc8b546-9734-4082-b2b3-2bafe3f1564d" containerID="6862c0afb8f6ee7e41759258bd8f935df2c29be354b170c8fd2a76edbba23242" exitCode=0 Jan 28 21:00:46 crc kubenswrapper[4746]: I0128 21:00:46.584324 4746 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-6wrnw" event={"ID":"6dc8b546-9734-4082-b2b3-2bafe3f1564d","Type":"ContainerDied","Data":"6862c0afb8f6ee7e41759258bd8f935df2c29be354b170c8fd2a76edbba23242"} Jan 28 21:00:46 crc kubenswrapper[4746]: I0128 21:00:46.584353 4746 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-6wrnw" event={"ID":"6dc8b546-9734-4082-b2b3-2bafe3f1564d","Type":"ContainerStarted","Data":"551b5dbcacfba813c1158522c098223ffafd54f7aa789c2d4402da75877d8079"} Jan 28 21:00:46 crc kubenswrapper[4746]: I0128 21:00:46.584368 4746 scope.go:117] "RemoveContainer" containerID="635dfdb81316e9a80fdcd2f942f907e439906f4018e69db1be59f1c63b3c993e" Jan 28 21:00:46 crc kubenswrapper[4746]: I0128 21:00:46.769106 4746 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/swift-proxy-5d6f7ddd75-47x9g"] Jan 28 21:00:46 crc kubenswrapper[4746]: I0128 21:00:46.899065 4746 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="0bb735bc-29e9-4a5e-b6b4-a643776d9e44" path="/var/lib/kubelet/pods/0bb735bc-29e9-4a5e-b6b4-a643776d9e44/volumes" Jan 28 21:00:46 crc kubenswrapper[4746]: I0128 21:00:46.899880 4746 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="ea56fe21-dd12-486e-a4e9-2af7cb7b9387" path="/var/lib/kubelet/pods/ea56fe21-dd12-486e-a4e9-2af7cb7b9387/volumes" Jan 28 21:00:46 crc kubenswrapper[4746]: I0128 21:00:46.900447 4746 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cloudkitty-proc-0"] Jan 28 21:00:47 
crc kubenswrapper[4746]: I0128 21:00:47.643296 4746 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cloudkitty-proc-0" event={"ID":"8eab90e7-58c5-4bdf-bca6-12c78bdabea9","Type":"ContainerStarted","Data":"4ab8038e3ebef6e4484d598e7d3ed09d8e833167d3e19159b52ee369687e748d"} Jan 28 21:00:47 crc kubenswrapper[4746]: I0128 21:00:47.643708 4746 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cloudkitty-proc-0" event={"ID":"8eab90e7-58c5-4bdf-bca6-12c78bdabea9","Type":"ContainerStarted","Data":"e1a1ea0ab8207674637ab36e5430dbf91210b0a889652187579a0efa44a5ebd8"} Jan 28 21:00:47 crc kubenswrapper[4746]: I0128 21:00:47.648191 4746 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-proxy-5d6f7ddd75-47x9g" event={"ID":"4524d3f6-9b61-4b9c-b778-0078a31efc3e","Type":"ContainerStarted","Data":"a6f6d05a68a642283babb4475297141d2ea2003d0cb71b782f2dde1d312beab0"} Jan 28 21:00:47 crc kubenswrapper[4746]: I0128 21:00:47.648226 4746 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-proxy-5d6f7ddd75-47x9g" event={"ID":"4524d3f6-9b61-4b9c-b778-0078a31efc3e","Type":"ContainerStarted","Data":"6b8bcda69944c0c25ea3d3757777eca91ba89875fa4345bd6ea1b142108bf634"} Jan 28 21:00:47 crc kubenswrapper[4746]: I0128 21:00:47.648236 4746 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-proxy-5d6f7ddd75-47x9g" event={"ID":"4524d3f6-9b61-4b9c-b778-0078a31efc3e","Type":"ContainerStarted","Data":"ea9d13b7424c4668f4691e14172e72b2c0012d2716d96b8c112f300f0b63c2b3"} Jan 28 21:00:47 crc kubenswrapper[4746]: I0128 21:00:47.649098 4746 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/swift-proxy-5d6f7ddd75-47x9g" Jan 28 21:00:47 crc kubenswrapper[4746]: I0128 21:00:47.649126 4746 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/swift-proxy-5d6f7ddd75-47x9g" Jan 28 21:00:47 crc kubenswrapper[4746]: I0128 21:00:47.671418 4746 pod_startup_latency_tracker.go:104] 
"Observed pod startup duration" pod="openstack/cloudkitty-proc-0" podStartSLOduration=2.671390056 podStartE2EDuration="2.671390056s" podCreationTimestamp="2026-01-28 21:00:45 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-28 21:00:47.657570975 +0000 UTC m=+1275.613757329" watchObservedRunningTime="2026-01-28 21:00:47.671390056 +0000 UTC m=+1275.627576410" Jan 28 21:00:47 crc kubenswrapper[4746]: I0128 21:00:47.688147 4746 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/swift-proxy-5d6f7ddd75-47x9g" podStartSLOduration=2.688125747 podStartE2EDuration="2.688125747s" podCreationTimestamp="2026-01-28 21:00:45 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-28 21:00:47.684062657 +0000 UTC m=+1275.640249011" watchObservedRunningTime="2026-01-28 21:00:47.688125747 +0000 UTC m=+1275.644312101" Jan 28 21:00:48 crc kubenswrapper[4746]: I0128 21:00:48.130646 4746 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/cinder-api-0" Jan 28 21:00:49 crc kubenswrapper[4746]: I0128 21:00:49.715711 4746 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/neutron-5569c8497f-nhjcs" Jan 28 21:00:49 crc kubenswrapper[4746]: I0128 21:00:49.778670 4746 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/neutron-bd578f44d-rfllv"] Jan 28 21:00:49 crc kubenswrapper[4746]: I0128 21:00:49.778892 4746 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/neutron-bd578f44d-rfllv" podUID="875165a4-1092-4e9d-ae24-5044a726e174" containerName="neutron-api" containerID="cri-o://6fc58bbe85e7f3c3353de5258e71e22d05d995e6b763b3ca70e96349116d341e" gracePeriod=30 Jan 28 21:00:49 crc kubenswrapper[4746]: I0128 21:00:49.779292 4746 kuberuntime_container.go:808] "Killing container 
with a grace period" pod="openstack/neutron-bd578f44d-rfllv" podUID="875165a4-1092-4e9d-ae24-5044a726e174" containerName="neutron-httpd" containerID="cri-o://b5b98064c5cf0b8cf0bae5f8b9c12ae68524d3be0f4240ff9272bdb1f0ba8e6e" gracePeriod=30 Jan 28 21:00:50 crc kubenswrapper[4746]: I0128 21:00:50.062476 4746 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/cinder-scheduler-0" Jan 28 21:00:50 crc kubenswrapper[4746]: I0128 21:00:50.689655 4746 generic.go:334] "Generic (PLEG): container finished" podID="875165a4-1092-4e9d-ae24-5044a726e174" containerID="b5b98064c5cf0b8cf0bae5f8b9c12ae68524d3be0f4240ff9272bdb1f0ba8e6e" exitCode=0 Jan 28 21:00:50 crc kubenswrapper[4746]: I0128 21:00:50.689726 4746 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-bd578f44d-rfllv" event={"ID":"875165a4-1092-4e9d-ae24-5044a726e174","Type":"ContainerDied","Data":"b5b98064c5cf0b8cf0bae5f8b9c12ae68524d3be0f4240ff9272bdb1f0ba8e6e"} Jan 28 21:00:50 crc kubenswrapper[4746]: I0128 21:00:50.751524 4746 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ceilometer-0"] Jan 28 21:00:50 crc kubenswrapper[4746]: I0128 21:00:50.751820 4746 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="0133f6a6-0179-41e7-a5b9-bc1661ea19e2" containerName="ceilometer-central-agent" containerID="cri-o://9ec51aab4b2fbdb8b869a6f608677c59886791c38eb25e8a1e71ab247b090360" gracePeriod=30 Jan 28 21:00:50 crc kubenswrapper[4746]: I0128 21:00:50.751900 4746 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="0133f6a6-0179-41e7-a5b9-bc1661ea19e2" containerName="sg-core" containerID="cri-o://0a24e65d3bb4e9c5c16c333080a33a5a7daeb8d88b95a1f51129b36d126f8e47" gracePeriod=30 Jan 28 21:00:50 crc kubenswrapper[4746]: I0128 21:00:50.751923 4746 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" 
podUID="0133f6a6-0179-41e7-a5b9-bc1661ea19e2" containerName="ceilometer-notification-agent" containerID="cri-o://885bcee0a0ea10f485d7731a6f1d59ecbc34909cb8d86deae1255c7363fbb21d" gracePeriod=30 Jan 28 21:00:50 crc kubenswrapper[4746]: I0128 21:00:50.751934 4746 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="0133f6a6-0179-41e7-a5b9-bc1661ea19e2" containerName="proxy-httpd" containerID="cri-o://264addfba649d18292bed5e3b3f9e5ac9c7afadb889210023e26b5a932c66687" gracePeriod=30 Jan 28 21:00:50 crc kubenswrapper[4746]: I0128 21:00:50.780606 4746 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/ceilometer-0" podUID="0133f6a6-0179-41e7-a5b9-bc1661ea19e2" containerName="proxy-httpd" probeResult="failure" output="Get \"http://10.217.0.192:3000/\": EOF" Jan 28 21:00:51 crc kubenswrapper[4746]: I0128 21:00:51.765283 4746 generic.go:334] "Generic (PLEG): container finished" podID="0133f6a6-0179-41e7-a5b9-bc1661ea19e2" containerID="264addfba649d18292bed5e3b3f9e5ac9c7afadb889210023e26b5a932c66687" exitCode=0 Jan 28 21:00:51 crc kubenswrapper[4746]: I0128 21:00:51.765847 4746 generic.go:334] "Generic (PLEG): container finished" podID="0133f6a6-0179-41e7-a5b9-bc1661ea19e2" containerID="0a24e65d3bb4e9c5c16c333080a33a5a7daeb8d88b95a1f51129b36d126f8e47" exitCode=2 Jan 28 21:00:51 crc kubenswrapper[4746]: I0128 21:00:51.765855 4746 generic.go:334] "Generic (PLEG): container finished" podID="0133f6a6-0179-41e7-a5b9-bc1661ea19e2" containerID="885bcee0a0ea10f485d7731a6f1d59ecbc34909cb8d86deae1255c7363fbb21d" exitCode=0 Jan 28 21:00:51 crc kubenswrapper[4746]: I0128 21:00:51.765862 4746 generic.go:334] "Generic (PLEG): container finished" podID="0133f6a6-0179-41e7-a5b9-bc1661ea19e2" containerID="9ec51aab4b2fbdb8b869a6f608677c59886791c38eb25e8a1e71ab247b090360" exitCode=0 Jan 28 21:00:51 crc kubenswrapper[4746]: I0128 21:00:51.765893 4746 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openstack/ceilometer-0" event={"ID":"0133f6a6-0179-41e7-a5b9-bc1661ea19e2","Type":"ContainerDied","Data":"264addfba649d18292bed5e3b3f9e5ac9c7afadb889210023e26b5a932c66687"} Jan 28 21:00:51 crc kubenswrapper[4746]: I0128 21:00:51.765916 4746 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"0133f6a6-0179-41e7-a5b9-bc1661ea19e2","Type":"ContainerDied","Data":"0a24e65d3bb4e9c5c16c333080a33a5a7daeb8d88b95a1f51129b36d126f8e47"} Jan 28 21:00:51 crc kubenswrapper[4746]: I0128 21:00:51.765927 4746 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"0133f6a6-0179-41e7-a5b9-bc1661ea19e2","Type":"ContainerDied","Data":"885bcee0a0ea10f485d7731a6f1d59ecbc34909cb8d86deae1255c7363fbb21d"} Jan 28 21:00:51 crc kubenswrapper[4746]: I0128 21:00:51.765936 4746 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"0133f6a6-0179-41e7-a5b9-bc1661ea19e2","Type":"ContainerDied","Data":"9ec51aab4b2fbdb8b869a6f608677c59886791c38eb25e8a1e71ab247b090360"} Jan 28 21:00:51 crc kubenswrapper[4746]: I0128 21:00:51.919852 4746 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/ceilometer-0" Jan 28 21:00:52 crc kubenswrapper[4746]: I0128 21:00:52.010723 4746 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/0133f6a6-0179-41e7-a5b9-bc1661ea19e2-sg-core-conf-yaml\") pod \"0133f6a6-0179-41e7-a5b9-bc1661ea19e2\" (UID: \"0133f6a6-0179-41e7-a5b9-bc1661ea19e2\") " Jan 28 21:00:52 crc kubenswrapper[4746]: I0128 21:00:52.010809 4746 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/0133f6a6-0179-41e7-a5b9-bc1661ea19e2-config-data\") pod \"0133f6a6-0179-41e7-a5b9-bc1661ea19e2\" (UID: \"0133f6a6-0179-41e7-a5b9-bc1661ea19e2\") " Jan 28 21:00:52 crc kubenswrapper[4746]: I0128 21:00:52.010913 4746 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/0133f6a6-0179-41e7-a5b9-bc1661ea19e2-log-httpd\") pod \"0133f6a6-0179-41e7-a5b9-bc1661ea19e2\" (UID: \"0133f6a6-0179-41e7-a5b9-bc1661ea19e2\") " Jan 28 21:00:52 crc kubenswrapper[4746]: I0128 21:00:52.010981 4746 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/0133f6a6-0179-41e7-a5b9-bc1661ea19e2-run-httpd\") pod \"0133f6a6-0179-41e7-a5b9-bc1661ea19e2\" (UID: \"0133f6a6-0179-41e7-a5b9-bc1661ea19e2\") " Jan 28 21:00:52 crc kubenswrapper[4746]: I0128 21:00:52.011124 4746 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/0133f6a6-0179-41e7-a5b9-bc1661ea19e2-scripts\") pod \"0133f6a6-0179-41e7-a5b9-bc1661ea19e2\" (UID: \"0133f6a6-0179-41e7-a5b9-bc1661ea19e2\") " Jan 28 21:00:52 crc kubenswrapper[4746]: I0128 21:00:52.011160 4746 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-nbzz9\" (UniqueName: 
\"kubernetes.io/projected/0133f6a6-0179-41e7-a5b9-bc1661ea19e2-kube-api-access-nbzz9\") pod \"0133f6a6-0179-41e7-a5b9-bc1661ea19e2\" (UID: \"0133f6a6-0179-41e7-a5b9-bc1661ea19e2\") " Jan 28 21:00:52 crc kubenswrapper[4746]: I0128 21:00:52.011239 4746 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/0133f6a6-0179-41e7-a5b9-bc1661ea19e2-combined-ca-bundle\") pod \"0133f6a6-0179-41e7-a5b9-bc1661ea19e2\" (UID: \"0133f6a6-0179-41e7-a5b9-bc1661ea19e2\") " Jan 28 21:00:52 crc kubenswrapper[4746]: I0128 21:00:52.013336 4746 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/0133f6a6-0179-41e7-a5b9-bc1661ea19e2-log-httpd" (OuterVolumeSpecName: "log-httpd") pod "0133f6a6-0179-41e7-a5b9-bc1661ea19e2" (UID: "0133f6a6-0179-41e7-a5b9-bc1661ea19e2"). InnerVolumeSpecName "log-httpd". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 28 21:00:52 crc kubenswrapper[4746]: I0128 21:00:52.014830 4746 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/0133f6a6-0179-41e7-a5b9-bc1661ea19e2-run-httpd" (OuterVolumeSpecName: "run-httpd") pod "0133f6a6-0179-41e7-a5b9-bc1661ea19e2" (UID: "0133f6a6-0179-41e7-a5b9-bc1661ea19e2"). InnerVolumeSpecName "run-httpd". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 28 21:00:52 crc kubenswrapper[4746]: I0128 21:00:52.024285 4746 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/0133f6a6-0179-41e7-a5b9-bc1661ea19e2-kube-api-access-nbzz9" (OuterVolumeSpecName: "kube-api-access-nbzz9") pod "0133f6a6-0179-41e7-a5b9-bc1661ea19e2" (UID: "0133f6a6-0179-41e7-a5b9-bc1661ea19e2"). InnerVolumeSpecName "kube-api-access-nbzz9". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 28 21:00:52 crc kubenswrapper[4746]: I0128 21:00:52.041497 4746 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/0133f6a6-0179-41e7-a5b9-bc1661ea19e2-scripts" (OuterVolumeSpecName: "scripts") pod "0133f6a6-0179-41e7-a5b9-bc1661ea19e2" (UID: "0133f6a6-0179-41e7-a5b9-bc1661ea19e2"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 28 21:00:52 crc kubenswrapper[4746]: I0128 21:00:52.044410 4746 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/0133f6a6-0179-41e7-a5b9-bc1661ea19e2-sg-core-conf-yaml" (OuterVolumeSpecName: "sg-core-conf-yaml") pod "0133f6a6-0179-41e7-a5b9-bc1661ea19e2" (UID: "0133f6a6-0179-41e7-a5b9-bc1661ea19e2"). InnerVolumeSpecName "sg-core-conf-yaml". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 28 21:00:52 crc kubenswrapper[4746]: I0128 21:00:52.110824 4746 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/0133f6a6-0179-41e7-a5b9-bc1661ea19e2-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "0133f6a6-0179-41e7-a5b9-bc1661ea19e2" (UID: "0133f6a6-0179-41e7-a5b9-bc1661ea19e2"). InnerVolumeSpecName "combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 28 21:00:52 crc kubenswrapper[4746]: I0128 21:00:52.114089 4746 reconciler_common.go:293] "Volume detached for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/0133f6a6-0179-41e7-a5b9-bc1661ea19e2-run-httpd\") on node \"crc\" DevicePath \"\"" Jan 28 21:00:52 crc kubenswrapper[4746]: I0128 21:00:52.114118 4746 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/0133f6a6-0179-41e7-a5b9-bc1661ea19e2-scripts\") on node \"crc\" DevicePath \"\"" Jan 28 21:00:52 crc kubenswrapper[4746]: I0128 21:00:52.114132 4746 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-nbzz9\" (UniqueName: \"kubernetes.io/projected/0133f6a6-0179-41e7-a5b9-bc1661ea19e2-kube-api-access-nbzz9\") on node \"crc\" DevicePath \"\"" Jan 28 21:00:52 crc kubenswrapper[4746]: I0128 21:00:52.114150 4746 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/0133f6a6-0179-41e7-a5b9-bc1661ea19e2-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Jan 28 21:00:52 crc kubenswrapper[4746]: I0128 21:00:52.114161 4746 reconciler_common.go:293] "Volume detached for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/0133f6a6-0179-41e7-a5b9-bc1661ea19e2-sg-core-conf-yaml\") on node \"crc\" DevicePath \"\"" Jan 28 21:00:52 crc kubenswrapper[4746]: I0128 21:00:52.114175 4746 reconciler_common.go:293] "Volume detached for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/0133f6a6-0179-41e7-a5b9-bc1661ea19e2-log-httpd\") on node \"crc\" DevicePath \"\"" Jan 28 21:00:52 crc kubenswrapper[4746]: I0128 21:00:52.168747 4746 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/0133f6a6-0179-41e7-a5b9-bc1661ea19e2-config-data" (OuterVolumeSpecName: "config-data") pod "0133f6a6-0179-41e7-a5b9-bc1661ea19e2" (UID: "0133f6a6-0179-41e7-a5b9-bc1661ea19e2"). 
InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 28 21:00:52 crc kubenswrapper[4746]: I0128 21:00:52.217254 4746 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/0133f6a6-0179-41e7-a5b9-bc1661ea19e2-config-data\") on node \"crc\" DevicePath \"\"" Jan 28 21:00:52 crc kubenswrapper[4746]: I0128 21:00:52.781682 4746 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"0133f6a6-0179-41e7-a5b9-bc1661ea19e2","Type":"ContainerDied","Data":"c666985feedf61a37eaf1b4d4f447d602c3d39a59b346ef5c16c3eb5e854f661"} Jan 28 21:00:52 crc kubenswrapper[4746]: I0128 21:00:52.781740 4746 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0" Jan 28 21:00:52 crc kubenswrapper[4746]: I0128 21:00:52.782017 4746 scope.go:117] "RemoveContainer" containerID="264addfba649d18292bed5e3b3f9e5ac9c7afadb889210023e26b5a932c66687" Jan 28 21:00:52 crc kubenswrapper[4746]: I0128 21:00:52.830956 4746 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ceilometer-0"] Jan 28 21:00:52 crc kubenswrapper[4746]: I0128 21:00:52.875619 4746 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/ceilometer-0"] Jan 28 21:00:52 crc kubenswrapper[4746]: I0128 21:00:52.893125 4746 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ceilometer-0"] Jan 28 21:00:52 crc kubenswrapper[4746]: E0128 21:00:52.893709 4746 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="0133f6a6-0179-41e7-a5b9-bc1661ea19e2" containerName="sg-core" Jan 28 21:00:52 crc kubenswrapper[4746]: I0128 21:00:52.893735 4746 state_mem.go:107] "Deleted CPUSet assignment" podUID="0133f6a6-0179-41e7-a5b9-bc1661ea19e2" containerName="sg-core" Jan 28 21:00:52 crc kubenswrapper[4746]: E0128 21:00:52.893748 4746 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="0133f6a6-0179-41e7-a5b9-bc1661ea19e2" 
containerName="ceilometer-central-agent" Jan 28 21:00:52 crc kubenswrapper[4746]: I0128 21:00:52.893758 4746 state_mem.go:107] "Deleted CPUSet assignment" podUID="0133f6a6-0179-41e7-a5b9-bc1661ea19e2" containerName="ceilometer-central-agent" Jan 28 21:00:52 crc kubenswrapper[4746]: E0128 21:00:52.893770 4746 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="0133f6a6-0179-41e7-a5b9-bc1661ea19e2" containerName="proxy-httpd" Jan 28 21:00:52 crc kubenswrapper[4746]: I0128 21:00:52.893779 4746 state_mem.go:107] "Deleted CPUSet assignment" podUID="0133f6a6-0179-41e7-a5b9-bc1661ea19e2" containerName="proxy-httpd" Jan 28 21:00:52 crc kubenswrapper[4746]: E0128 21:00:52.893808 4746 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="0133f6a6-0179-41e7-a5b9-bc1661ea19e2" containerName="ceilometer-notification-agent" Jan 28 21:00:52 crc kubenswrapper[4746]: I0128 21:00:52.893818 4746 state_mem.go:107] "Deleted CPUSet assignment" podUID="0133f6a6-0179-41e7-a5b9-bc1661ea19e2" containerName="ceilometer-notification-agent" Jan 28 21:00:52 crc kubenswrapper[4746]: I0128 21:00:52.894048 4746 memory_manager.go:354] "RemoveStaleState removing state" podUID="0133f6a6-0179-41e7-a5b9-bc1661ea19e2" containerName="proxy-httpd" Jan 28 21:00:52 crc kubenswrapper[4746]: I0128 21:00:52.894070 4746 memory_manager.go:354] "RemoveStaleState removing state" podUID="0133f6a6-0179-41e7-a5b9-bc1661ea19e2" containerName="ceilometer-notification-agent" Jan 28 21:00:52 crc kubenswrapper[4746]: I0128 21:00:52.894102 4746 memory_manager.go:354] "RemoveStaleState removing state" podUID="0133f6a6-0179-41e7-a5b9-bc1661ea19e2" containerName="sg-core" Jan 28 21:00:52 crc kubenswrapper[4746]: I0128 21:00:52.894113 4746 memory_manager.go:354] "RemoveStaleState removing state" podUID="0133f6a6-0179-41e7-a5b9-bc1661ea19e2" containerName="ceilometer-central-agent" Jan 28 21:00:52 crc kubenswrapper[4746]: I0128 21:00:52.896295 4746 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/ceilometer-0" Jan 28 21:00:52 crc kubenswrapper[4746]: I0128 21:00:52.905533 4746 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ceilometer-0"] Jan 28 21:00:52 crc kubenswrapper[4746]: I0128 21:00:52.906206 4746 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceilometer-scripts" Jan 28 21:00:52 crc kubenswrapper[4746]: I0128 21:00:52.906470 4746 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceilometer-config-data" Jan 28 21:00:53 crc kubenswrapper[4746]: I0128 21:00:53.037222 4746 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/a8705e73-fddb-48b4-b1e2-fd3e40a001cb-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"a8705e73-fddb-48b4-b1e2-fd3e40a001cb\") " pod="openstack/ceilometer-0" Jan 28 21:00:53 crc kubenswrapper[4746]: I0128 21:00:53.037400 4746 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/a8705e73-fddb-48b4-b1e2-fd3e40a001cb-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"a8705e73-fddb-48b4-b1e2-fd3e40a001cb\") " pod="openstack/ceilometer-0" Jan 28 21:00:53 crc kubenswrapper[4746]: I0128 21:00:53.037450 4746 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/a8705e73-fddb-48b4-b1e2-fd3e40a001cb-scripts\") pod \"ceilometer-0\" (UID: \"a8705e73-fddb-48b4-b1e2-fd3e40a001cb\") " pod="openstack/ceilometer-0" Jan 28 21:00:53 crc kubenswrapper[4746]: I0128 21:00:53.037488 4746 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/a8705e73-fddb-48b4-b1e2-fd3e40a001cb-run-httpd\") pod \"ceilometer-0\" (UID: \"a8705e73-fddb-48b4-b1e2-fd3e40a001cb\") " 
pod="openstack/ceilometer-0" Jan 28 21:00:53 crc kubenswrapper[4746]: I0128 21:00:53.037657 4746 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/a8705e73-fddb-48b4-b1e2-fd3e40a001cb-config-data\") pod \"ceilometer-0\" (UID: \"a8705e73-fddb-48b4-b1e2-fd3e40a001cb\") " pod="openstack/ceilometer-0" Jan 28 21:00:53 crc kubenswrapper[4746]: I0128 21:00:53.037754 4746 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/a8705e73-fddb-48b4-b1e2-fd3e40a001cb-log-httpd\") pod \"ceilometer-0\" (UID: \"a8705e73-fddb-48b4-b1e2-fd3e40a001cb\") " pod="openstack/ceilometer-0" Jan 28 21:00:53 crc kubenswrapper[4746]: I0128 21:00:53.037805 4746 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-nlg52\" (UniqueName: \"kubernetes.io/projected/a8705e73-fddb-48b4-b1e2-fd3e40a001cb-kube-api-access-nlg52\") pod \"ceilometer-0\" (UID: \"a8705e73-fddb-48b4-b1e2-fd3e40a001cb\") " pod="openstack/ceilometer-0" Jan 28 21:00:53 crc kubenswrapper[4746]: I0128 21:00:53.139322 4746 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/a8705e73-fddb-48b4-b1e2-fd3e40a001cb-log-httpd\") pod \"ceilometer-0\" (UID: \"a8705e73-fddb-48b4-b1e2-fd3e40a001cb\") " pod="openstack/ceilometer-0" Jan 28 21:00:53 crc kubenswrapper[4746]: I0128 21:00:53.139392 4746 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-nlg52\" (UniqueName: \"kubernetes.io/projected/a8705e73-fddb-48b4-b1e2-fd3e40a001cb-kube-api-access-nlg52\") pod \"ceilometer-0\" (UID: \"a8705e73-fddb-48b4-b1e2-fd3e40a001cb\") " pod="openstack/ceilometer-0" Jan 28 21:00:53 crc kubenswrapper[4746]: I0128 21:00:53.139545 4746 reconciler_common.go:218] "operationExecutor.MountVolume 
started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/a8705e73-fddb-48b4-b1e2-fd3e40a001cb-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"a8705e73-fddb-48b4-b1e2-fd3e40a001cb\") " pod="openstack/ceilometer-0" Jan 28 21:00:53 crc kubenswrapper[4746]: I0128 21:00:53.139612 4746 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/a8705e73-fddb-48b4-b1e2-fd3e40a001cb-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"a8705e73-fddb-48b4-b1e2-fd3e40a001cb\") " pod="openstack/ceilometer-0" Jan 28 21:00:53 crc kubenswrapper[4746]: I0128 21:00:53.139639 4746 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/a8705e73-fddb-48b4-b1e2-fd3e40a001cb-scripts\") pod \"ceilometer-0\" (UID: \"a8705e73-fddb-48b4-b1e2-fd3e40a001cb\") " pod="openstack/ceilometer-0" Jan 28 21:00:53 crc kubenswrapper[4746]: I0128 21:00:53.139664 4746 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/a8705e73-fddb-48b4-b1e2-fd3e40a001cb-run-httpd\") pod \"ceilometer-0\" (UID: \"a8705e73-fddb-48b4-b1e2-fd3e40a001cb\") " pod="openstack/ceilometer-0" Jan 28 21:00:53 crc kubenswrapper[4746]: I0128 21:00:53.139697 4746 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/a8705e73-fddb-48b4-b1e2-fd3e40a001cb-config-data\") pod \"ceilometer-0\" (UID: \"a8705e73-fddb-48b4-b1e2-fd3e40a001cb\") " pod="openstack/ceilometer-0" Jan 28 21:00:53 crc kubenswrapper[4746]: I0128 21:00:53.140310 4746 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/a8705e73-fddb-48b4-b1e2-fd3e40a001cb-run-httpd\") pod \"ceilometer-0\" (UID: \"a8705e73-fddb-48b4-b1e2-fd3e40a001cb\") " pod="openstack/ceilometer-0" Jan 28 21:00:53 crc 
kubenswrapper[4746]: I0128 21:00:53.140638 4746 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/a8705e73-fddb-48b4-b1e2-fd3e40a001cb-log-httpd\") pod \"ceilometer-0\" (UID: \"a8705e73-fddb-48b4-b1e2-fd3e40a001cb\") " pod="openstack/ceilometer-0" Jan 28 21:00:53 crc kubenswrapper[4746]: I0128 21:00:53.150398 4746 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/a8705e73-fddb-48b4-b1e2-fd3e40a001cb-scripts\") pod \"ceilometer-0\" (UID: \"a8705e73-fddb-48b4-b1e2-fd3e40a001cb\") " pod="openstack/ceilometer-0" Jan 28 21:00:53 crc kubenswrapper[4746]: I0128 21:00:53.158447 4746 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/a8705e73-fddb-48b4-b1e2-fd3e40a001cb-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"a8705e73-fddb-48b4-b1e2-fd3e40a001cb\") " pod="openstack/ceilometer-0" Jan 28 21:00:53 crc kubenswrapper[4746]: I0128 21:00:53.165877 4746 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/a8705e73-fddb-48b4-b1e2-fd3e40a001cb-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"a8705e73-fddb-48b4-b1e2-fd3e40a001cb\") " pod="openstack/ceilometer-0" Jan 28 21:00:53 crc kubenswrapper[4746]: I0128 21:00:53.169808 4746 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-nlg52\" (UniqueName: \"kubernetes.io/projected/a8705e73-fddb-48b4-b1e2-fd3e40a001cb-kube-api-access-nlg52\") pod \"ceilometer-0\" (UID: \"a8705e73-fddb-48b4-b1e2-fd3e40a001cb\") " pod="openstack/ceilometer-0" Jan 28 21:00:53 crc kubenswrapper[4746]: I0128 21:00:53.173069 4746 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/a8705e73-fddb-48b4-b1e2-fd3e40a001cb-config-data\") pod \"ceilometer-0\" (UID: 
\"a8705e73-fddb-48b4-b1e2-fd3e40a001cb\") " pod="openstack/ceilometer-0" Jan 28 21:00:53 crc kubenswrapper[4746]: I0128 21:00:53.235853 4746 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0" Jan 28 21:00:54 crc kubenswrapper[4746]: I0128 21:00:54.845923 4746 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="0133f6a6-0179-41e7-a5b9-bc1661ea19e2" path="/var/lib/kubelet/pods/0133f6a6-0179-41e7-a5b9-bc1661ea19e2/volumes" Jan 28 21:00:54 crc kubenswrapper[4746]: I0128 21:00:54.943118 4746 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/glance-default-external-api-0"] Jan 28 21:00:54 crc kubenswrapper[4746]: I0128 21:00:54.943402 4746 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/glance-default-external-api-0" podUID="bae342c7-f51f-4da2-a419-61002cc82f59" containerName="glance-log" containerID="cri-o://22db722f2f2d19a10ec9c1f4b3aab062be23dcb8406902ca7d46a922791146a2" gracePeriod=30 Jan 28 21:00:54 crc kubenswrapper[4746]: I0128 21:00:54.943935 4746 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/glance-default-external-api-0" podUID="bae342c7-f51f-4da2-a419-61002cc82f59" containerName="glance-httpd" containerID="cri-o://25d5b3df9baae67da6b02e76a3bf75adbce74ffa20972d28ed3717793041ff40" gracePeriod=30 Jan 28 21:00:55 crc kubenswrapper[4746]: I0128 21:00:55.341142 4746 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ceilometer-0"] Jan 28 21:00:55 crc kubenswrapper[4746]: I0128 21:00:55.914346 4746 generic.go:334] "Generic (PLEG): container finished" podID="bae342c7-f51f-4da2-a419-61002cc82f59" containerID="22db722f2f2d19a10ec9c1f4b3aab062be23dcb8406902ca7d46a922791146a2" exitCode=143 Jan 28 21:00:55 crc kubenswrapper[4746]: I0128 21:00:55.914391 4746 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-external-api-0" 
event={"ID":"bae342c7-f51f-4da2-a419-61002cc82f59","Type":"ContainerDied","Data":"22db722f2f2d19a10ec9c1f4b3aab062be23dcb8406902ca7d46a922791146a2"} Jan 28 21:00:56 crc kubenswrapper[4746]: I0128 21:00:56.069960 4746 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/swift-proxy-5d6f7ddd75-47x9g" Jan 28 21:00:56 crc kubenswrapper[4746]: I0128 21:00:56.087479 4746 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/swift-proxy-5d6f7ddd75-47x9g" Jan 28 21:00:57 crc kubenswrapper[4746]: I0128 21:00:57.939978 4746 generic.go:334] "Generic (PLEG): container finished" podID="875165a4-1092-4e9d-ae24-5044a726e174" containerID="6fc58bbe85e7f3c3353de5258e71e22d05d995e6b763b3ca70e96349116d341e" exitCode=0 Jan 28 21:00:57 crc kubenswrapper[4746]: I0128 21:00:57.940268 4746 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-bd578f44d-rfllv" event={"ID":"875165a4-1092-4e9d-ae24-5044a726e174","Type":"ContainerDied","Data":"6fc58bbe85e7f3c3353de5258e71e22d05d995e6b763b3ca70e96349116d341e"} Jan 28 21:00:58 crc kubenswrapper[4746]: I0128 21:00:58.137242 4746 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/glance-default-external-api-0" podUID="bae342c7-f51f-4da2-a419-61002cc82f59" containerName="glance-httpd" probeResult="failure" output="Get \"https://10.217.0.175:9292/healthcheck\": read tcp 10.217.0.2:36830->10.217.0.175:9292: read: connection reset by peer" Jan 28 21:00:58 crc kubenswrapper[4746]: I0128 21:00:58.137356 4746 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/glance-default-external-api-0" podUID="bae342c7-f51f-4da2-a419-61002cc82f59" containerName="glance-log" probeResult="failure" output="Get \"https://10.217.0.175:9292/healthcheck\": read tcp 10.217.0.2:36838->10.217.0.175:9292: read: connection reset by peer" Jan 28 21:00:58 crc kubenswrapper[4746]: I0128 21:00:58.562439 4746 kubelet.go:2421] "SyncLoop ADD" source="api" 
pods=["openstack/nova-api-db-create-j4mld"] Jan 28 21:00:58 crc kubenswrapper[4746]: I0128 21:00:58.564653 4746 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-api-db-create-j4mld" Jan 28 21:00:58 crc kubenswrapper[4746]: I0128 21:00:58.577043 4746 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-api-db-create-j4mld"] Jan 28 21:00:58 crc kubenswrapper[4746]: I0128 21:00:58.603199 4746 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/eae3e347-2cb4-49b7-999d-abdfe9c3d2ac-operator-scripts\") pod \"nova-api-db-create-j4mld\" (UID: \"eae3e347-2cb4-49b7-999d-abdfe9c3d2ac\") " pod="openstack/nova-api-db-create-j4mld" Jan 28 21:00:58 crc kubenswrapper[4746]: I0128 21:00:58.603254 4746 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-2zzwv\" (UniqueName: \"kubernetes.io/projected/eae3e347-2cb4-49b7-999d-abdfe9c3d2ac-kube-api-access-2zzwv\") pod \"nova-api-db-create-j4mld\" (UID: \"eae3e347-2cb4-49b7-999d-abdfe9c3d2ac\") " pod="openstack/nova-api-db-create-j4mld" Jan 28 21:00:58 crc kubenswrapper[4746]: I0128 21:00:58.705372 4746 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/eae3e347-2cb4-49b7-999d-abdfe9c3d2ac-operator-scripts\") pod \"nova-api-db-create-j4mld\" (UID: \"eae3e347-2cb4-49b7-999d-abdfe9c3d2ac\") " pod="openstack/nova-api-db-create-j4mld" Jan 28 21:00:58 crc kubenswrapper[4746]: I0128 21:00:58.705434 4746 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-2zzwv\" (UniqueName: \"kubernetes.io/projected/eae3e347-2cb4-49b7-999d-abdfe9c3d2ac-kube-api-access-2zzwv\") pod \"nova-api-db-create-j4mld\" (UID: \"eae3e347-2cb4-49b7-999d-abdfe9c3d2ac\") " pod="openstack/nova-api-db-create-j4mld" Jan 28 21:00:58 
crc kubenswrapper[4746]: I0128 21:00:58.706385 4746 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/eae3e347-2cb4-49b7-999d-abdfe9c3d2ac-operator-scripts\") pod \"nova-api-db-create-j4mld\" (UID: \"eae3e347-2cb4-49b7-999d-abdfe9c3d2ac\") " pod="openstack/nova-api-db-create-j4mld" Jan 28 21:00:58 crc kubenswrapper[4746]: I0128 21:00:58.744063 4746 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-2zzwv\" (UniqueName: \"kubernetes.io/projected/eae3e347-2cb4-49b7-999d-abdfe9c3d2ac-kube-api-access-2zzwv\") pod \"nova-api-db-create-j4mld\" (UID: \"eae3e347-2cb4-49b7-999d-abdfe9c3d2ac\") " pod="openstack/nova-api-db-create-j4mld" Jan 28 21:00:58 crc kubenswrapper[4746]: I0128 21:00:58.768145 4746 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-cell0-db-create-gmx6k"] Jan 28 21:00:58 crc kubenswrapper[4746]: I0128 21:00:58.769686 4746 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell0-db-create-gmx6k" Jan 28 21:00:58 crc kubenswrapper[4746]: I0128 21:00:58.792104 4746 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-api-c1ea-account-create-update-kph5m"] Jan 28 21:00:58 crc kubenswrapper[4746]: I0128 21:00:58.793380 4746 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-api-c1ea-account-create-update-kph5m" Jan 28 21:00:58 crc kubenswrapper[4746]: I0128 21:00:58.797363 4746 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-api-db-secret" Jan 28 21:00:58 crc kubenswrapper[4746]: I0128 21:00:58.811354 4746 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/4718d022-41c2-4684-9093-1f83e23dc367-operator-scripts\") pod \"nova-cell0-db-create-gmx6k\" (UID: \"4718d022-41c2-4684-9093-1f83e23dc367\") " pod="openstack/nova-cell0-db-create-gmx6k" Jan 28 21:00:58 crc kubenswrapper[4746]: I0128 21:00:58.811435 4746 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-56rwh\" (UniqueName: \"kubernetes.io/projected/4718d022-41c2-4684-9093-1f83e23dc367-kube-api-access-56rwh\") pod \"nova-cell0-db-create-gmx6k\" (UID: \"4718d022-41c2-4684-9093-1f83e23dc367\") " pod="openstack/nova-cell0-db-create-gmx6k" Jan 28 21:00:58 crc kubenswrapper[4746]: I0128 21:00:58.831159 4746 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell0-db-create-gmx6k"] Jan 28 21:00:58 crc kubenswrapper[4746]: I0128 21:00:58.895796 4746 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-api-db-create-j4mld" Jan 28 21:00:58 crc kubenswrapper[4746]: I0128 21:00:58.898130 4746 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-api-c1ea-account-create-update-kph5m"] Jan 28 21:00:58 crc kubenswrapper[4746]: I0128 21:00:58.915619 4746 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-56rwh\" (UniqueName: \"kubernetes.io/projected/4718d022-41c2-4684-9093-1f83e23dc367-kube-api-access-56rwh\") pod \"nova-cell0-db-create-gmx6k\" (UID: \"4718d022-41c2-4684-9093-1f83e23dc367\") " pod="openstack/nova-cell0-db-create-gmx6k" Jan 28 21:00:58 crc kubenswrapper[4746]: I0128 21:00:58.915713 4746 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-nq5zs\" (UniqueName: \"kubernetes.io/projected/6b0f1a53-15ed-455c-80a3-7f92a3851538-kube-api-access-nq5zs\") pod \"nova-api-c1ea-account-create-update-kph5m\" (UID: \"6b0f1a53-15ed-455c-80a3-7f92a3851538\") " pod="openstack/nova-api-c1ea-account-create-update-kph5m" Jan 28 21:00:58 crc kubenswrapper[4746]: I0128 21:00:58.915860 4746 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/6b0f1a53-15ed-455c-80a3-7f92a3851538-operator-scripts\") pod \"nova-api-c1ea-account-create-update-kph5m\" (UID: \"6b0f1a53-15ed-455c-80a3-7f92a3851538\") " pod="openstack/nova-api-c1ea-account-create-update-kph5m" Jan 28 21:00:58 crc kubenswrapper[4746]: I0128 21:00:58.915927 4746 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/4718d022-41c2-4684-9093-1f83e23dc367-operator-scripts\") pod \"nova-cell0-db-create-gmx6k\" (UID: \"4718d022-41c2-4684-9093-1f83e23dc367\") " pod="openstack/nova-cell0-db-create-gmx6k" Jan 28 21:00:58 crc kubenswrapper[4746]: I0128 21:00:58.916822 4746 
operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/4718d022-41c2-4684-9093-1f83e23dc367-operator-scripts\") pod \"nova-cell0-db-create-gmx6k\" (UID: \"4718d022-41c2-4684-9093-1f83e23dc367\") " pod="openstack/nova-cell0-db-create-gmx6k" Jan 28 21:00:58 crc kubenswrapper[4746]: I0128 21:00:58.972132 4746 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-56rwh\" (UniqueName: \"kubernetes.io/projected/4718d022-41c2-4684-9093-1f83e23dc367-kube-api-access-56rwh\") pod \"nova-cell0-db-create-gmx6k\" (UID: \"4718d022-41c2-4684-9093-1f83e23dc367\") " pod="openstack/nova-cell0-db-create-gmx6k" Jan 28 21:00:58 crc kubenswrapper[4746]: I0128 21:00:58.982011 4746 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-cell1-db-create-kp99z"] Jan 28 21:00:58 crc kubenswrapper[4746]: I0128 21:00:58.983599 4746 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell1-db-create-kp99z" Jan 28 21:00:58 crc kubenswrapper[4746]: I0128 21:00:58.996415 4746 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell1-db-create-kp99z"] Jan 28 21:00:59 crc kubenswrapper[4746]: I0128 21:00:59.000188 4746 generic.go:334] "Generic (PLEG): container finished" podID="bae342c7-f51f-4da2-a419-61002cc82f59" containerID="25d5b3df9baae67da6b02e76a3bf75adbce74ffa20972d28ed3717793041ff40" exitCode=0 Jan 28 21:00:59 crc kubenswrapper[4746]: I0128 21:00:59.000241 4746 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-external-api-0" event={"ID":"bae342c7-f51f-4da2-a419-61002cc82f59","Type":"ContainerDied","Data":"25d5b3df9baae67da6b02e76a3bf75adbce74ffa20972d28ed3717793041ff40"} Jan 28 21:00:59 crc kubenswrapper[4746]: I0128 21:00:59.019048 4746 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-jx5kj\" (UniqueName: 
\"kubernetes.io/projected/add7217e-e0fc-4079-a1c4-b3a328588a9a-kube-api-access-jx5kj\") pod \"nova-cell1-db-create-kp99z\" (UID: \"add7217e-e0fc-4079-a1c4-b3a328588a9a\") " pod="openstack/nova-cell1-db-create-kp99z" Jan 28 21:00:59 crc kubenswrapper[4746]: I0128 21:00:59.024291 4746 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/6b0f1a53-15ed-455c-80a3-7f92a3851538-operator-scripts\") pod \"nova-api-c1ea-account-create-update-kph5m\" (UID: \"6b0f1a53-15ed-455c-80a3-7f92a3851538\") " pod="openstack/nova-api-c1ea-account-create-update-kph5m" Jan 28 21:00:59 crc kubenswrapper[4746]: I0128 21:00:59.025471 4746 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/add7217e-e0fc-4079-a1c4-b3a328588a9a-operator-scripts\") pod \"nova-cell1-db-create-kp99z\" (UID: \"add7217e-e0fc-4079-a1c4-b3a328588a9a\") " pod="openstack/nova-cell1-db-create-kp99z" Jan 28 21:00:59 crc kubenswrapper[4746]: I0128 21:00:59.025692 4746 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-nq5zs\" (UniqueName: \"kubernetes.io/projected/6b0f1a53-15ed-455c-80a3-7f92a3851538-kube-api-access-nq5zs\") pod \"nova-api-c1ea-account-create-update-kph5m\" (UID: \"6b0f1a53-15ed-455c-80a3-7f92a3851538\") " pod="openstack/nova-api-c1ea-account-create-update-kph5m" Jan 28 21:00:59 crc kubenswrapper[4746]: I0128 21:00:59.025267 4746 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/6b0f1a53-15ed-455c-80a3-7f92a3851538-operator-scripts\") pod \"nova-api-c1ea-account-create-update-kph5m\" (UID: \"6b0f1a53-15ed-455c-80a3-7f92a3851538\") " pod="openstack/nova-api-c1ea-account-create-update-kph5m" Jan 28 21:00:59 crc kubenswrapper[4746]: I0128 21:00:59.044215 4746 kubelet.go:2421] "SyncLoop ADD" source="api" 
pods=["openstack/nova-cell0-e199-account-create-update-v6fd6"] Jan 28 21:00:59 crc kubenswrapper[4746]: I0128 21:00:59.044860 4746 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-nq5zs\" (UniqueName: \"kubernetes.io/projected/6b0f1a53-15ed-455c-80a3-7f92a3851538-kube-api-access-nq5zs\") pod \"nova-api-c1ea-account-create-update-kph5m\" (UID: \"6b0f1a53-15ed-455c-80a3-7f92a3851538\") " pod="openstack/nova-api-c1ea-account-create-update-kph5m" Jan 28 21:00:59 crc kubenswrapper[4746]: I0128 21:00:59.045945 4746 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell0-e199-account-create-update-v6fd6" Jan 28 21:00:59 crc kubenswrapper[4746]: I0128 21:00:59.048480 4746 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-cell0-db-secret" Jan 28 21:00:59 crc kubenswrapper[4746]: I0128 21:00:59.056256 4746 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell0-e199-account-create-update-v6fd6"] Jan 28 21:00:59 crc kubenswrapper[4746]: I0128 21:00:59.129143 4746 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-wd45s\" (UniqueName: \"kubernetes.io/projected/0a9afff1-311d-4881-a211-e405af09d4a7-kube-api-access-wd45s\") pod \"nova-cell0-e199-account-create-update-v6fd6\" (UID: \"0a9afff1-311d-4881-a211-e405af09d4a7\") " pod="openstack/nova-cell0-e199-account-create-update-v6fd6" Jan 28 21:00:59 crc kubenswrapper[4746]: I0128 21:00:59.129260 4746 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/0a9afff1-311d-4881-a211-e405af09d4a7-operator-scripts\") pod \"nova-cell0-e199-account-create-update-v6fd6\" (UID: \"0a9afff1-311d-4881-a211-e405af09d4a7\") " pod="openstack/nova-cell0-e199-account-create-update-v6fd6" Jan 28 21:00:59 crc kubenswrapper[4746]: I0128 21:00:59.129331 4746 
reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-jx5kj\" (UniqueName: \"kubernetes.io/projected/add7217e-e0fc-4079-a1c4-b3a328588a9a-kube-api-access-jx5kj\") pod \"nova-cell1-db-create-kp99z\" (UID: \"add7217e-e0fc-4079-a1c4-b3a328588a9a\") " pod="openstack/nova-cell1-db-create-kp99z" Jan 28 21:00:59 crc kubenswrapper[4746]: I0128 21:00:59.129419 4746 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/add7217e-e0fc-4079-a1c4-b3a328588a9a-operator-scripts\") pod \"nova-cell1-db-create-kp99z\" (UID: \"add7217e-e0fc-4079-a1c4-b3a328588a9a\") " pod="openstack/nova-cell1-db-create-kp99z" Jan 28 21:00:59 crc kubenswrapper[4746]: I0128 21:00:59.130139 4746 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/add7217e-e0fc-4079-a1c4-b3a328588a9a-operator-scripts\") pod \"nova-cell1-db-create-kp99z\" (UID: \"add7217e-e0fc-4079-a1c4-b3a328588a9a\") " pod="openstack/nova-cell1-db-create-kp99z" Jan 28 21:00:59 crc kubenswrapper[4746]: I0128 21:00:59.153525 4746 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell0-db-create-gmx6k" Jan 28 21:00:59 crc kubenswrapper[4746]: I0128 21:00:59.192816 4746 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-api-c1ea-account-create-update-kph5m" Jan 28 21:00:59 crc kubenswrapper[4746]: I0128 21:00:59.205306 4746 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-jx5kj\" (UniqueName: \"kubernetes.io/projected/add7217e-e0fc-4079-a1c4-b3a328588a9a-kube-api-access-jx5kj\") pod \"nova-cell1-db-create-kp99z\" (UID: \"add7217e-e0fc-4079-a1c4-b3a328588a9a\") " pod="openstack/nova-cell1-db-create-kp99z" Jan 28 21:00:59 crc kubenswrapper[4746]: I0128 21:00:59.208019 4746 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-cell1-42e2-account-create-update-k45vw"] Jan 28 21:00:59 crc kubenswrapper[4746]: I0128 21:00:59.215770 4746 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell1-42e2-account-create-update-k45vw" Jan 28 21:00:59 crc kubenswrapper[4746]: I0128 21:00:59.220536 4746 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-cell1-db-secret" Jan 28 21:00:59 crc kubenswrapper[4746]: I0128 21:00:59.223389 4746 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell1-42e2-account-create-update-k45vw"] Jan 28 21:00:59 crc kubenswrapper[4746]: I0128 21:00:59.233275 4746 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-qxqrn\" (UniqueName: \"kubernetes.io/projected/9f60a726-8cd2-4eca-b252-014d68fded35-kube-api-access-qxqrn\") pod \"nova-cell1-42e2-account-create-update-k45vw\" (UID: \"9f60a726-8cd2-4eca-b252-014d68fded35\") " pod="openstack/nova-cell1-42e2-account-create-update-k45vw" Jan 28 21:00:59 crc kubenswrapper[4746]: I0128 21:00:59.233438 4746 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-wd45s\" (UniqueName: \"kubernetes.io/projected/0a9afff1-311d-4881-a211-e405af09d4a7-kube-api-access-wd45s\") pod \"nova-cell0-e199-account-create-update-v6fd6\" (UID: 
\"0a9afff1-311d-4881-a211-e405af09d4a7\") " pod="openstack/nova-cell0-e199-account-create-update-v6fd6" Jan 28 21:00:59 crc kubenswrapper[4746]: I0128 21:00:59.233518 4746 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/9f60a726-8cd2-4eca-b252-014d68fded35-operator-scripts\") pod \"nova-cell1-42e2-account-create-update-k45vw\" (UID: \"9f60a726-8cd2-4eca-b252-014d68fded35\") " pod="openstack/nova-cell1-42e2-account-create-update-k45vw" Jan 28 21:00:59 crc kubenswrapper[4746]: I0128 21:00:59.233597 4746 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/0a9afff1-311d-4881-a211-e405af09d4a7-operator-scripts\") pod \"nova-cell0-e199-account-create-update-v6fd6\" (UID: \"0a9afff1-311d-4881-a211-e405af09d4a7\") " pod="openstack/nova-cell0-e199-account-create-update-v6fd6" Jan 28 21:00:59 crc kubenswrapper[4746]: I0128 21:00:59.234532 4746 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/0a9afff1-311d-4881-a211-e405af09d4a7-operator-scripts\") pod \"nova-cell0-e199-account-create-update-v6fd6\" (UID: \"0a9afff1-311d-4881-a211-e405af09d4a7\") " pod="openstack/nova-cell0-e199-account-create-update-v6fd6" Jan 28 21:00:59 crc kubenswrapper[4746]: I0128 21:00:59.257661 4746 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-wd45s\" (UniqueName: \"kubernetes.io/projected/0a9afff1-311d-4881-a211-e405af09d4a7-kube-api-access-wd45s\") pod \"nova-cell0-e199-account-create-update-v6fd6\" (UID: \"0a9afff1-311d-4881-a211-e405af09d4a7\") " pod="openstack/nova-cell0-e199-account-create-update-v6fd6" Jan 28 21:00:59 crc kubenswrapper[4746]: I0128 21:00:59.335310 4746 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-qxqrn\" (UniqueName: 
\"kubernetes.io/projected/9f60a726-8cd2-4eca-b252-014d68fded35-kube-api-access-qxqrn\") pod \"nova-cell1-42e2-account-create-update-k45vw\" (UID: \"9f60a726-8cd2-4eca-b252-014d68fded35\") " pod="openstack/nova-cell1-42e2-account-create-update-k45vw" Jan 28 21:00:59 crc kubenswrapper[4746]: I0128 21:00:59.335457 4746 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/9f60a726-8cd2-4eca-b252-014d68fded35-operator-scripts\") pod \"nova-cell1-42e2-account-create-update-k45vw\" (UID: \"9f60a726-8cd2-4eca-b252-014d68fded35\") " pod="openstack/nova-cell1-42e2-account-create-update-k45vw" Jan 28 21:00:59 crc kubenswrapper[4746]: I0128 21:00:59.336306 4746 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/9f60a726-8cd2-4eca-b252-014d68fded35-operator-scripts\") pod \"nova-cell1-42e2-account-create-update-k45vw\" (UID: \"9f60a726-8cd2-4eca-b252-014d68fded35\") " pod="openstack/nova-cell1-42e2-account-create-update-k45vw" Jan 28 21:00:59 crc kubenswrapper[4746]: I0128 21:00:59.363854 4746 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-qxqrn\" (UniqueName: \"kubernetes.io/projected/9f60a726-8cd2-4eca-b252-014d68fded35-kube-api-access-qxqrn\") pod \"nova-cell1-42e2-account-create-update-k45vw\" (UID: \"9f60a726-8cd2-4eca-b252-014d68fded35\") " pod="openstack/nova-cell1-42e2-account-create-update-k45vw" Jan 28 21:00:59 crc kubenswrapper[4746]: I0128 21:00:59.437988 4746 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell0-e199-account-create-update-v6fd6" Jan 28 21:00:59 crc kubenswrapper[4746]: I0128 21:00:59.439154 4746 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell1-db-create-kp99z" Jan 28 21:00:59 crc kubenswrapper[4746]: I0128 21:00:59.550941 4746 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell1-42e2-account-create-update-k45vw" Jan 28 21:01:00 crc kubenswrapper[4746]: I0128 21:01:00.162889 4746 scope.go:117] "RemoveContainer" containerID="0a24e65d3bb4e9c5c16c333080a33a5a7daeb8d88b95a1f51129b36d126f8e47" Jan 28 21:01:00 crc kubenswrapper[4746]: I0128 21:01:00.176019 4746 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/keystone-cron-29493901-2dcqp"] Jan 28 21:01:00 crc kubenswrapper[4746]: I0128 21:01:00.177463 4746 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-cron-29493901-2dcqp" Jan 28 21:01:00 crc kubenswrapper[4746]: I0128 21:01:00.204508 4746 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/keystone-cron-29493901-2dcqp"] Jan 28 21:01:00 crc kubenswrapper[4746]: I0128 21:01:00.369448 4746 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/018e2b8e-63bb-41fd-8153-f0c8fc106af7-fernet-keys\") pod \"keystone-cron-29493901-2dcqp\" (UID: \"018e2b8e-63bb-41fd-8153-f0c8fc106af7\") " pod="openstack/keystone-cron-29493901-2dcqp" Jan 28 21:01:00 crc kubenswrapper[4746]: I0128 21:01:00.369976 4746 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-c7w8k\" (UniqueName: \"kubernetes.io/projected/018e2b8e-63bb-41fd-8153-f0c8fc106af7-kube-api-access-c7w8k\") pod \"keystone-cron-29493901-2dcqp\" (UID: \"018e2b8e-63bb-41fd-8153-f0c8fc106af7\") " pod="openstack/keystone-cron-29493901-2dcqp" Jan 28 21:01:00 crc kubenswrapper[4746]: I0128 21:01:00.370019 4746 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: 
\"kubernetes.io/secret/018e2b8e-63bb-41fd-8153-f0c8fc106af7-combined-ca-bundle\") pod \"keystone-cron-29493901-2dcqp\" (UID: \"018e2b8e-63bb-41fd-8153-f0c8fc106af7\") " pod="openstack/keystone-cron-29493901-2dcqp" Jan 28 21:01:00 crc kubenswrapper[4746]: I0128 21:01:00.370042 4746 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/018e2b8e-63bb-41fd-8153-f0c8fc106af7-config-data\") pod \"keystone-cron-29493901-2dcqp\" (UID: \"018e2b8e-63bb-41fd-8153-f0c8fc106af7\") " pod="openstack/keystone-cron-29493901-2dcqp" Jan 28 21:01:00 crc kubenswrapper[4746]: I0128 21:01:00.401512 4746 scope.go:117] "RemoveContainer" containerID="885bcee0a0ea10f485d7731a6f1d59ecbc34909cb8d86deae1255c7363fbb21d" Jan 28 21:01:00 crc kubenswrapper[4746]: I0128 21:01:00.471725 4746 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/018e2b8e-63bb-41fd-8153-f0c8fc106af7-fernet-keys\") pod \"keystone-cron-29493901-2dcqp\" (UID: \"018e2b8e-63bb-41fd-8153-f0c8fc106af7\") " pod="openstack/keystone-cron-29493901-2dcqp" Jan 28 21:01:00 crc kubenswrapper[4746]: I0128 21:01:00.471892 4746 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-c7w8k\" (UniqueName: \"kubernetes.io/projected/018e2b8e-63bb-41fd-8153-f0c8fc106af7-kube-api-access-c7w8k\") pod \"keystone-cron-29493901-2dcqp\" (UID: \"018e2b8e-63bb-41fd-8153-f0c8fc106af7\") " pod="openstack/keystone-cron-29493901-2dcqp" Jan 28 21:01:00 crc kubenswrapper[4746]: I0128 21:01:00.471924 4746 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/018e2b8e-63bb-41fd-8153-f0c8fc106af7-combined-ca-bundle\") pod \"keystone-cron-29493901-2dcqp\" (UID: \"018e2b8e-63bb-41fd-8153-f0c8fc106af7\") " pod="openstack/keystone-cron-29493901-2dcqp" Jan 28 21:01:00 crc 
kubenswrapper[4746]: I0128 21:01:00.471942 4746 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/018e2b8e-63bb-41fd-8153-f0c8fc106af7-config-data\") pod \"keystone-cron-29493901-2dcqp\" (UID: \"018e2b8e-63bb-41fd-8153-f0c8fc106af7\") " pod="openstack/keystone-cron-29493901-2dcqp" Jan 28 21:01:00 crc kubenswrapper[4746]: I0128 21:01:00.480930 4746 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/018e2b8e-63bb-41fd-8153-f0c8fc106af7-combined-ca-bundle\") pod \"keystone-cron-29493901-2dcqp\" (UID: \"018e2b8e-63bb-41fd-8153-f0c8fc106af7\") " pod="openstack/keystone-cron-29493901-2dcqp" Jan 28 21:01:00 crc kubenswrapper[4746]: I0128 21:01:00.481875 4746 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/018e2b8e-63bb-41fd-8153-f0c8fc106af7-config-data\") pod \"keystone-cron-29493901-2dcqp\" (UID: \"018e2b8e-63bb-41fd-8153-f0c8fc106af7\") " pod="openstack/keystone-cron-29493901-2dcqp" Jan 28 21:01:00 crc kubenswrapper[4746]: I0128 21:01:00.483582 4746 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/018e2b8e-63bb-41fd-8153-f0c8fc106af7-fernet-keys\") pod \"keystone-cron-29493901-2dcqp\" (UID: \"018e2b8e-63bb-41fd-8153-f0c8fc106af7\") " pod="openstack/keystone-cron-29493901-2dcqp" Jan 28 21:01:00 crc kubenswrapper[4746]: I0128 21:01:00.503213 4746 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-c7w8k\" (UniqueName: \"kubernetes.io/projected/018e2b8e-63bb-41fd-8153-f0c8fc106af7-kube-api-access-c7w8k\") pod \"keystone-cron-29493901-2dcqp\" (UID: \"018e2b8e-63bb-41fd-8153-f0c8fc106af7\") " pod="openstack/keystone-cron-29493901-2dcqp" Jan 28 21:01:00 crc kubenswrapper[4746]: I0128 21:01:00.516592 4746 scope.go:117] "RemoveContainer" 
containerID="9ec51aab4b2fbdb8b869a6f608677c59886791c38eb25e8a1e71ab247b090360" Jan 28 21:01:00 crc kubenswrapper[4746]: I0128 21:01:00.520279 4746 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-cron-29493901-2dcqp" Jan 28 21:01:00 crc kubenswrapper[4746]: I0128 21:01:00.541430 4746 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/neutron-bd578f44d-rfllv" Jan 28 21:01:00 crc kubenswrapper[4746]: I0128 21:01:00.579631 4746 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovndb-tls-certs\" (UniqueName: \"kubernetes.io/secret/875165a4-1092-4e9d-ae24-5044a726e174-ovndb-tls-certs\") pod \"875165a4-1092-4e9d-ae24-5044a726e174\" (UID: \"875165a4-1092-4e9d-ae24-5044a726e174\") " Jan 28 21:01:00 crc kubenswrapper[4746]: I0128 21:01:00.580802 4746 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"httpd-config\" (UniqueName: \"kubernetes.io/secret/875165a4-1092-4e9d-ae24-5044a726e174-httpd-config\") pod \"875165a4-1092-4e9d-ae24-5044a726e174\" (UID: \"875165a4-1092-4e9d-ae24-5044a726e174\") " Jan 28 21:01:00 crc kubenswrapper[4746]: I0128 21:01:00.580906 4746 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/secret/875165a4-1092-4e9d-ae24-5044a726e174-config\") pod \"875165a4-1092-4e9d-ae24-5044a726e174\" (UID: \"875165a4-1092-4e9d-ae24-5044a726e174\") " Jan 28 21:01:00 crc kubenswrapper[4746]: I0128 21:01:00.580943 4746 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-kvdq7\" (UniqueName: \"kubernetes.io/projected/875165a4-1092-4e9d-ae24-5044a726e174-kube-api-access-kvdq7\") pod \"875165a4-1092-4e9d-ae24-5044a726e174\" (UID: \"875165a4-1092-4e9d-ae24-5044a726e174\") " Jan 28 21:01:00 crc kubenswrapper[4746]: I0128 21:01:00.580973 4746 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume 
\"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/875165a4-1092-4e9d-ae24-5044a726e174-combined-ca-bundle\") pod \"875165a4-1092-4e9d-ae24-5044a726e174\" (UID: \"875165a4-1092-4e9d-ae24-5044a726e174\") " Jan 28 21:01:00 crc kubenswrapper[4746]: I0128 21:01:00.588573 4746 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/875165a4-1092-4e9d-ae24-5044a726e174-kube-api-access-kvdq7" (OuterVolumeSpecName: "kube-api-access-kvdq7") pod "875165a4-1092-4e9d-ae24-5044a726e174" (UID: "875165a4-1092-4e9d-ae24-5044a726e174"). InnerVolumeSpecName "kube-api-access-kvdq7". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 28 21:01:00 crc kubenswrapper[4746]: I0128 21:01:00.600681 4746 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/875165a4-1092-4e9d-ae24-5044a726e174-httpd-config" (OuterVolumeSpecName: "httpd-config") pod "875165a4-1092-4e9d-ae24-5044a726e174" (UID: "875165a4-1092-4e9d-ae24-5044a726e174"). InnerVolumeSpecName "httpd-config". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 28 21:01:00 crc kubenswrapper[4746]: I0128 21:01:00.646544 4746 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/875165a4-1092-4e9d-ae24-5044a726e174-config" (OuterVolumeSpecName: "config") pod "875165a4-1092-4e9d-ae24-5044a726e174" (UID: "875165a4-1092-4e9d-ae24-5044a726e174"). InnerVolumeSpecName "config". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 28 21:01:00 crc kubenswrapper[4746]: I0128 21:01:00.692985 4746 reconciler_common.go:293] "Volume detached for volume \"httpd-config\" (UniqueName: \"kubernetes.io/secret/875165a4-1092-4e9d-ae24-5044a726e174-httpd-config\") on node \"crc\" DevicePath \"\"" Jan 28 21:01:00 crc kubenswrapper[4746]: I0128 21:01:00.693021 4746 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/secret/875165a4-1092-4e9d-ae24-5044a726e174-config\") on node \"crc\" DevicePath \"\"" Jan 28 21:01:00 crc kubenswrapper[4746]: I0128 21:01:00.693033 4746 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-kvdq7\" (UniqueName: \"kubernetes.io/projected/875165a4-1092-4e9d-ae24-5044a726e174-kube-api-access-kvdq7\") on node \"crc\" DevicePath \"\"" Jan 28 21:01:00 crc kubenswrapper[4746]: I0128 21:01:00.718351 4746 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/875165a4-1092-4e9d-ae24-5044a726e174-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "875165a4-1092-4e9d-ae24-5044a726e174" (UID: "875165a4-1092-4e9d-ae24-5044a726e174"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 28 21:01:00 crc kubenswrapper[4746]: I0128 21:01:00.738779 4746 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/875165a4-1092-4e9d-ae24-5044a726e174-ovndb-tls-certs" (OuterVolumeSpecName: "ovndb-tls-certs") pod "875165a4-1092-4e9d-ae24-5044a726e174" (UID: "875165a4-1092-4e9d-ae24-5044a726e174"). InnerVolumeSpecName "ovndb-tls-certs". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 28 21:01:00 crc kubenswrapper[4746]: I0128 21:01:00.797728 4746 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/875165a4-1092-4e9d-ae24-5044a726e174-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Jan 28 21:01:00 crc kubenswrapper[4746]: I0128 21:01:00.797946 4746 reconciler_common.go:293] "Volume detached for volume \"ovndb-tls-certs\" (UniqueName: \"kubernetes.io/secret/875165a4-1092-4e9d-ae24-5044a726e174-ovndb-tls-certs\") on node \"crc\" DevicePath \"\"" Jan 28 21:01:00 crc kubenswrapper[4746]: I0128 21:01:00.904988 4746 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/glance-default-external-api-0" Jan 28 21:01:01 crc kubenswrapper[4746]: I0128 21:01:01.004471 4746 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"glance\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-95469877-e687-4d8b-97fe-080814cf4383\") pod \"bae342c7-f51f-4da2-a419-61002cc82f59\" (UID: \"bae342c7-f51f-4da2-a419-61002cc82f59\") " Jan 28 21:01:01 crc kubenswrapper[4746]: I0128 21:01:01.004791 4746 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/bae342c7-f51f-4da2-a419-61002cc82f59-scripts\") pod \"bae342c7-f51f-4da2-a419-61002cc82f59\" (UID: \"bae342c7-f51f-4da2-a419-61002cc82f59\") " Jan 28 21:01:01 crc kubenswrapper[4746]: I0128 21:01:01.004955 4746 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-87vlc\" (UniqueName: \"kubernetes.io/projected/bae342c7-f51f-4da2-a419-61002cc82f59-kube-api-access-87vlc\") pod \"bae342c7-f51f-4da2-a419-61002cc82f59\" (UID: \"bae342c7-f51f-4da2-a419-61002cc82f59\") " Jan 28 21:01:01 crc kubenswrapper[4746]: I0128 21:01:01.005123 4746 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume 
\"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/bae342c7-f51f-4da2-a419-61002cc82f59-combined-ca-bundle\") pod \"bae342c7-f51f-4da2-a419-61002cc82f59\" (UID: \"bae342c7-f51f-4da2-a419-61002cc82f59\") " Jan 28 21:01:01 crc kubenswrapper[4746]: I0128 21:01:01.005248 4746 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/bae342c7-f51f-4da2-a419-61002cc82f59-config-data\") pod \"bae342c7-f51f-4da2-a419-61002cc82f59\" (UID: \"bae342c7-f51f-4da2-a419-61002cc82f59\") " Jan 28 21:01:01 crc kubenswrapper[4746]: I0128 21:01:01.005371 4746 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/bae342c7-f51f-4da2-a419-61002cc82f59-logs\") pod \"bae342c7-f51f-4da2-a419-61002cc82f59\" (UID: \"bae342c7-f51f-4da2-a419-61002cc82f59\") " Jan 28 21:01:01 crc kubenswrapper[4746]: I0128 21:01:01.005534 4746 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/bae342c7-f51f-4da2-a419-61002cc82f59-httpd-run\") pod \"bae342c7-f51f-4da2-a419-61002cc82f59\" (UID: \"bae342c7-f51f-4da2-a419-61002cc82f59\") " Jan 28 21:01:01 crc kubenswrapper[4746]: I0128 21:01:01.005675 4746 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/bae342c7-f51f-4da2-a419-61002cc82f59-public-tls-certs\") pod \"bae342c7-f51f-4da2-a419-61002cc82f59\" (UID: \"bae342c7-f51f-4da2-a419-61002cc82f59\") " Jan 28 21:01:01 crc kubenswrapper[4746]: I0128 21:01:01.006836 4746 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/bae342c7-f51f-4da2-a419-61002cc82f59-logs" (OuterVolumeSpecName: "logs") pod "bae342c7-f51f-4da2-a419-61002cc82f59" (UID: "bae342c7-f51f-4da2-a419-61002cc82f59"). InnerVolumeSpecName "logs". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 28 21:01:01 crc kubenswrapper[4746]: I0128 21:01:01.007118 4746 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/bae342c7-f51f-4da2-a419-61002cc82f59-httpd-run" (OuterVolumeSpecName: "httpd-run") pod "bae342c7-f51f-4da2-a419-61002cc82f59" (UID: "bae342c7-f51f-4da2-a419-61002cc82f59"). InnerVolumeSpecName "httpd-run". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 28 21:01:01 crc kubenswrapper[4746]: I0128 21:01:01.011010 4746 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/bae342c7-f51f-4da2-a419-61002cc82f59-kube-api-access-87vlc" (OuterVolumeSpecName: "kube-api-access-87vlc") pod "bae342c7-f51f-4da2-a419-61002cc82f59" (UID: "bae342c7-f51f-4da2-a419-61002cc82f59"). InnerVolumeSpecName "kube-api-access-87vlc". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 28 21:01:01 crc kubenswrapper[4746]: I0128 21:01:01.014068 4746 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/bae342c7-f51f-4da2-a419-61002cc82f59-scripts" (OuterVolumeSpecName: "scripts") pod "bae342c7-f51f-4da2-a419-61002cc82f59" (UID: "bae342c7-f51f-4da2-a419-61002cc82f59"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 28 21:01:01 crc kubenswrapper[4746]: I0128 21:01:01.031377 4746 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-bd578f44d-rfllv" event={"ID":"875165a4-1092-4e9d-ae24-5044a726e174","Type":"ContainerDied","Data":"eaa4be35ac05930b0aaf65c487338144a247098d838392db5fbfbed3ea6cd2e6"} Jan 28 21:01:01 crc kubenswrapper[4746]: I0128 21:01:01.031436 4746 scope.go:117] "RemoveContainer" containerID="b5b98064c5cf0b8cf0bae5f8b9c12ae68524d3be0f4240ff9272bdb1f0ba8e6e" Jan 28 21:01:01 crc kubenswrapper[4746]: I0128 21:01:01.031537 4746 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/neutron-bd578f44d-rfllv" Jan 28 21:01:01 crc kubenswrapper[4746]: I0128 21:01:01.043282 4746 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/openstackclient" event={"ID":"b9e46853-37d6-49c8-ada6-344f49a39e5f","Type":"ContainerStarted","Data":"0034341847b3d046a065d62176fb34b2c908a3b9ebf7ad3e212bf578979b1ac6"} Jan 28 21:01:01 crc kubenswrapper[4746]: I0128 21:01:01.047571 4746 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-external-api-0" event={"ID":"bae342c7-f51f-4da2-a419-61002cc82f59","Type":"ContainerDied","Data":"9dfdd9c2342569e8ca259d7dfebb6487fe980a383cbf54d15bbad64325b0791b"} Jan 28 21:01:01 crc kubenswrapper[4746]: I0128 21:01:01.047707 4746 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/glance-default-external-api-0" Jan 28 21:01:01 crc kubenswrapper[4746]: I0128 21:01:01.054097 4746 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-95469877-e687-4d8b-97fe-080814cf4383" (OuterVolumeSpecName: "glance") pod "bae342c7-f51f-4da2-a419-61002cc82f59" (UID: "bae342c7-f51f-4da2-a419-61002cc82f59"). InnerVolumeSpecName "pvc-95469877-e687-4d8b-97fe-080814cf4383". PluginName "kubernetes.io/csi", VolumeGidValue "" Jan 28 21:01:01 crc kubenswrapper[4746]: I0128 21:01:01.057163 4746 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/bae342c7-f51f-4da2-a419-61002cc82f59-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "bae342c7-f51f-4da2-a419-61002cc82f59" (UID: "bae342c7-f51f-4da2-a419-61002cc82f59"). InnerVolumeSpecName "combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 28 21:01:01 crc kubenswrapper[4746]: I0128 21:01:01.069759 4746 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/openstackclient" podStartSLOduration=3.802807404 podStartE2EDuration="27.069736754s" podCreationTimestamp="2026-01-28 21:00:34 +0000 UTC" firstStartedPulling="2026-01-28 21:00:36.881706608 +0000 UTC m=+1264.837892962" lastFinishedPulling="2026-01-28 21:01:00.148635958 +0000 UTC m=+1288.104822312" observedRunningTime="2026-01-28 21:01:01.059345225 +0000 UTC m=+1289.015531579" watchObservedRunningTime="2026-01-28 21:01:01.069736754 +0000 UTC m=+1289.025923108" Jan 28 21:01:01 crc kubenswrapper[4746]: I0128 21:01:01.104435 4746 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/bae342c7-f51f-4da2-a419-61002cc82f59-public-tls-certs" (OuterVolumeSpecName: "public-tls-certs") pod "bae342c7-f51f-4da2-a419-61002cc82f59" (UID: "bae342c7-f51f-4da2-a419-61002cc82f59"). InnerVolumeSpecName "public-tls-certs". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 28 21:01:01 crc kubenswrapper[4746]: I0128 21:01:01.108585 4746 reconciler_common.go:293] "Volume detached for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/bae342c7-f51f-4da2-a419-61002cc82f59-httpd-run\") on node \"crc\" DevicePath \"\"" Jan 28 21:01:01 crc kubenswrapper[4746]: I0128 21:01:01.108611 4746 reconciler_common.go:293] "Volume detached for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/bae342c7-f51f-4da2-a419-61002cc82f59-public-tls-certs\") on node \"crc\" DevicePath \"\"" Jan 28 21:01:01 crc kubenswrapper[4746]: I0128 21:01:01.108634 4746 reconciler_common.go:286] "operationExecutor.UnmountDevice started for volume \"pvc-95469877-e687-4d8b-97fe-080814cf4383\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-95469877-e687-4d8b-97fe-080814cf4383\") on node \"crc\" " Jan 28 21:01:01 crc kubenswrapper[4746]: I0128 21:01:01.108645 4746 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/bae342c7-f51f-4da2-a419-61002cc82f59-scripts\") on node \"crc\" DevicePath \"\"" Jan 28 21:01:01 crc kubenswrapper[4746]: I0128 21:01:01.108656 4746 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-87vlc\" (UniqueName: \"kubernetes.io/projected/bae342c7-f51f-4da2-a419-61002cc82f59-kube-api-access-87vlc\") on node \"crc\" DevicePath \"\"" Jan 28 21:01:01 crc kubenswrapper[4746]: I0128 21:01:01.108666 4746 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/bae342c7-f51f-4da2-a419-61002cc82f59-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Jan 28 21:01:01 crc kubenswrapper[4746]: I0128 21:01:01.108674 4746 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/bae342c7-f51f-4da2-a419-61002cc82f59-logs\") on node \"crc\" DevicePath \"\"" Jan 28 21:01:01 crc kubenswrapper[4746]: I0128 
21:01:01.135848 4746 csi_attacher.go:630] kubernetes.io/csi: attacher.UnmountDevice STAGE_UNSTAGE_VOLUME capability not set. Skipping UnmountDevice... Jan 28 21:01:01 crc kubenswrapper[4746]: I0128 21:01:01.136008 4746 operation_generator.go:917] UnmountDevice succeeded for volume "pvc-95469877-e687-4d8b-97fe-080814cf4383" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-95469877-e687-4d8b-97fe-080814cf4383") on node "crc" Jan 28 21:01:01 crc kubenswrapper[4746]: I0128 21:01:01.137584 4746 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/bae342c7-f51f-4da2-a419-61002cc82f59-config-data" (OuterVolumeSpecName: "config-data") pod "bae342c7-f51f-4da2-a419-61002cc82f59" (UID: "bae342c7-f51f-4da2-a419-61002cc82f59"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 28 21:01:01 crc kubenswrapper[4746]: I0128 21:01:01.210441 4746 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/bae342c7-f51f-4da2-a419-61002cc82f59-config-data\") on node \"crc\" DevicePath \"\"" Jan 28 21:01:01 crc kubenswrapper[4746]: I0128 21:01:01.210480 4746 reconciler_common.go:293] "Volume detached for volume \"pvc-95469877-e687-4d8b-97fe-080814cf4383\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-95469877-e687-4d8b-97fe-080814cf4383\") on node \"crc\" DevicePath \"\"" Jan 28 21:01:01 crc kubenswrapper[4746]: I0128 21:01:01.217099 4746 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/neutron-bd578f44d-rfllv"] Jan 28 21:01:01 crc kubenswrapper[4746]: I0128 21:01:01.225873 4746 scope.go:117] "RemoveContainer" containerID="6fc58bbe85e7f3c3353de5258e71e22d05d995e6b763b3ca70e96349116d341e" Jan 28 21:01:01 crc kubenswrapper[4746]: I0128 21:01:01.228591 4746 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/neutron-bd578f44d-rfllv"] Jan 28 21:01:01 crc kubenswrapper[4746]: I0128 
21:01:01.278881 4746 scope.go:117] "RemoveContainer" containerID="25d5b3df9baae67da6b02e76a3bf75adbce74ffa20972d28ed3717793041ff40"
Jan 28 21:01:01 crc kubenswrapper[4746]: I0128 21:01:01.333096 4746 scope.go:117] "RemoveContainer" containerID="22db722f2f2d19a10ec9c1f4b3aab062be23dcb8406902ca7d46a922791146a2"
Jan 28 21:01:01 crc kubenswrapper[4746]: I0128 21:01:01.392938 4746 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/glance-default-external-api-0"]
Jan 28 21:01:01 crc kubenswrapper[4746]: I0128 21:01:01.404745 4746 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/glance-default-external-api-0"]
Jan 28 21:01:01 crc kubenswrapper[4746]: I0128 21:01:01.420000 4746 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/glance-default-external-api-0"]
Jan 28 21:01:01 crc kubenswrapper[4746]: E0128 21:01:01.420507 4746 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="875165a4-1092-4e9d-ae24-5044a726e174" containerName="neutron-httpd"
Jan 28 21:01:01 crc kubenswrapper[4746]: I0128 21:01:01.420524 4746 state_mem.go:107] "Deleted CPUSet assignment" podUID="875165a4-1092-4e9d-ae24-5044a726e174" containerName="neutron-httpd"
Jan 28 21:01:01 crc kubenswrapper[4746]: E0128 21:01:01.420539 4746 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="875165a4-1092-4e9d-ae24-5044a726e174" containerName="neutron-api"
Jan 28 21:01:01 crc kubenswrapper[4746]: I0128 21:01:01.420545 4746 state_mem.go:107] "Deleted CPUSet assignment" podUID="875165a4-1092-4e9d-ae24-5044a726e174" containerName="neutron-api"
Jan 28 21:01:01 crc kubenswrapper[4746]: E0128 21:01:01.420555 4746 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="bae342c7-f51f-4da2-a419-61002cc82f59" containerName="glance-httpd"
Jan 28 21:01:01 crc kubenswrapper[4746]: I0128 21:01:01.420561 4746 state_mem.go:107] "Deleted CPUSet assignment" podUID="bae342c7-f51f-4da2-a419-61002cc82f59" containerName="glance-httpd"
Jan 28 21:01:01 crc kubenswrapper[4746]: E0128 21:01:01.420574 4746 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="bae342c7-f51f-4da2-a419-61002cc82f59" containerName="glance-log"
Jan 28 21:01:01 crc kubenswrapper[4746]: I0128 21:01:01.420580 4746 state_mem.go:107] "Deleted CPUSet assignment" podUID="bae342c7-f51f-4da2-a419-61002cc82f59" containerName="glance-log"
Jan 28 21:01:01 crc kubenswrapper[4746]: I0128 21:01:01.420765 4746 memory_manager.go:354] "RemoveStaleState removing state" podUID="875165a4-1092-4e9d-ae24-5044a726e174" containerName="neutron-httpd"
Jan 28 21:01:01 crc kubenswrapper[4746]: I0128 21:01:01.420777 4746 memory_manager.go:354] "RemoveStaleState removing state" podUID="bae342c7-f51f-4da2-a419-61002cc82f59" containerName="glance-httpd"
Jan 28 21:01:01 crc kubenswrapper[4746]: I0128 21:01:01.420794 4746 memory_manager.go:354] "RemoveStaleState removing state" podUID="bae342c7-f51f-4da2-a419-61002cc82f59" containerName="glance-log"
Jan 28 21:01:01 crc kubenswrapper[4746]: I0128 21:01:01.420813 4746 memory_manager.go:354] "RemoveStaleState removing state" podUID="875165a4-1092-4e9d-ae24-5044a726e174" containerName="neutron-api"
Jan 28 21:01:01 crc kubenswrapper[4746]: I0128 21:01:01.421779 4746 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/glance-default-external-api-0"
Jan 28 21:01:01 crc kubenswrapper[4746]: I0128 21:01:01.427869 4746 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-glance-default-public-svc"
Jan 28 21:01:01 crc kubenswrapper[4746]: I0128 21:01:01.428708 4746 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"glance-default-external-config-data"
Jan 28 21:01:01 crc kubenswrapper[4746]: I0128 21:01:01.523707 4746 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"pvc-95469877-e687-4d8b-97fe-080814cf4383\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-95469877-e687-4d8b-97fe-080814cf4383\") pod \"glance-default-external-api-0\" (UID: \"ed20e05e-643c-407e-bd2f-ce931e1e2bd1\") " pod="openstack/glance-default-external-api-0"
Jan 28 21:01:01 crc kubenswrapper[4746]: I0128 21:01:01.524132 4746 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/ed20e05e-643c-407e-bd2f-ce931e1e2bd1-scripts\") pod \"glance-default-external-api-0\" (UID: \"ed20e05e-643c-407e-bd2f-ce931e1e2bd1\") " pod="openstack/glance-default-external-api-0"
Jan 28 21:01:01 crc kubenswrapper[4746]: I0128 21:01:01.524296 4746 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/ed20e05e-643c-407e-bd2f-ce931e1e2bd1-config-data\") pod \"glance-default-external-api-0\" (UID: \"ed20e05e-643c-407e-bd2f-ce931e1e2bd1\") " pod="openstack/glance-default-external-api-0"
Jan 28 21:01:01 crc kubenswrapper[4746]: I0128 21:01:01.524435 4746 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/ed20e05e-643c-407e-bd2f-ce931e1e2bd1-logs\") pod \"glance-default-external-api-0\" (UID: \"ed20e05e-643c-407e-bd2f-ce931e1e2bd1\") " pod="openstack/glance-default-external-api-0"
Jan 28 21:01:01 crc kubenswrapper[4746]: I0128 21:01:01.524746 4746 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/ed20e05e-643c-407e-bd2f-ce931e1e2bd1-public-tls-certs\") pod \"glance-default-external-api-0\" (UID: \"ed20e05e-643c-407e-bd2f-ce931e1e2bd1\") " pod="openstack/glance-default-external-api-0"
Jan 28 21:01:01 crc kubenswrapper[4746]: I0128 21:01:01.524846 4746 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/ed20e05e-643c-407e-bd2f-ce931e1e2bd1-combined-ca-bundle\") pod \"glance-default-external-api-0\" (UID: \"ed20e05e-643c-407e-bd2f-ce931e1e2bd1\") " pod="openstack/glance-default-external-api-0"
Jan 28 21:01:01 crc kubenswrapper[4746]: I0128 21:01:01.524927 4746 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-v8lk4\" (UniqueName: \"kubernetes.io/projected/ed20e05e-643c-407e-bd2f-ce931e1e2bd1-kube-api-access-v8lk4\") pod \"glance-default-external-api-0\" (UID: \"ed20e05e-643c-407e-bd2f-ce931e1e2bd1\") " pod="openstack/glance-default-external-api-0"
Jan 28 21:01:01 crc kubenswrapper[4746]: I0128 21:01:01.525062 4746 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/ed20e05e-643c-407e-bd2f-ce931e1e2bd1-httpd-run\") pod \"glance-default-external-api-0\" (UID: \"ed20e05e-643c-407e-bd2f-ce931e1e2bd1\") " pod="openstack/glance-default-external-api-0"
Jan 28 21:01:01 crc kubenswrapper[4746]: I0128 21:01:01.535624 4746 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-default-external-api-0"]
Jan 28 21:01:01 crc kubenswrapper[4746]: I0128 21:01:01.630145 4746 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/ed20e05e-643c-407e-bd2f-ce931e1e2bd1-httpd-run\") pod \"glance-default-external-api-0\" (UID: \"ed20e05e-643c-407e-bd2f-ce931e1e2bd1\") " pod="openstack/glance-default-external-api-0"
Jan 28 21:01:01 crc kubenswrapper[4746]: I0128 21:01:01.630228 4746 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-95469877-e687-4d8b-97fe-080814cf4383\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-95469877-e687-4d8b-97fe-080814cf4383\") pod \"glance-default-external-api-0\" (UID: \"ed20e05e-643c-407e-bd2f-ce931e1e2bd1\") " pod="openstack/glance-default-external-api-0"
Jan 28 21:01:01 crc kubenswrapper[4746]: I0128 21:01:01.630257 4746 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/ed20e05e-643c-407e-bd2f-ce931e1e2bd1-scripts\") pod \"glance-default-external-api-0\" (UID: \"ed20e05e-643c-407e-bd2f-ce931e1e2bd1\") " pod="openstack/glance-default-external-api-0"
Jan 28 21:01:01 crc kubenswrapper[4746]: I0128 21:01:01.630294 4746 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/ed20e05e-643c-407e-bd2f-ce931e1e2bd1-config-data\") pod \"glance-default-external-api-0\" (UID: \"ed20e05e-643c-407e-bd2f-ce931e1e2bd1\") " pod="openstack/glance-default-external-api-0"
Jan 28 21:01:01 crc kubenswrapper[4746]: I0128 21:01:01.630340 4746 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/ed20e05e-643c-407e-bd2f-ce931e1e2bd1-logs\") pod \"glance-default-external-api-0\" (UID: \"ed20e05e-643c-407e-bd2f-ce931e1e2bd1\") " pod="openstack/glance-default-external-api-0"
Jan 28 21:01:01 crc kubenswrapper[4746]: I0128 21:01:01.630372 4746 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/ed20e05e-643c-407e-bd2f-ce931e1e2bd1-public-tls-certs\") pod \"glance-default-external-api-0\" (UID: \"ed20e05e-643c-407e-bd2f-ce931e1e2bd1\") " pod="openstack/glance-default-external-api-0"
Jan 28 21:01:01 crc kubenswrapper[4746]: I0128 21:01:01.630401 4746 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/ed20e05e-643c-407e-bd2f-ce931e1e2bd1-combined-ca-bundle\") pod \"glance-default-external-api-0\" (UID: \"ed20e05e-643c-407e-bd2f-ce931e1e2bd1\") " pod="openstack/glance-default-external-api-0"
Jan 28 21:01:01 crc kubenswrapper[4746]: I0128 21:01:01.630422 4746 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-v8lk4\" (UniqueName: \"kubernetes.io/projected/ed20e05e-643c-407e-bd2f-ce931e1e2bd1-kube-api-access-v8lk4\") pod \"glance-default-external-api-0\" (UID: \"ed20e05e-643c-407e-bd2f-ce931e1e2bd1\") " pod="openstack/glance-default-external-api-0"
Jan 28 21:01:01 crc kubenswrapper[4746]: I0128 21:01:01.631136 4746 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/ed20e05e-643c-407e-bd2f-ce931e1e2bd1-httpd-run\") pod \"glance-default-external-api-0\" (UID: \"ed20e05e-643c-407e-bd2f-ce931e1e2bd1\") " pod="openstack/glance-default-external-api-0"
Jan 28 21:01:01 crc kubenswrapper[4746]: I0128 21:01:01.643845 4746 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/ed20e05e-643c-407e-bd2f-ce931e1e2bd1-public-tls-certs\") pod \"glance-default-external-api-0\" (UID: \"ed20e05e-643c-407e-bd2f-ce931e1e2bd1\") " pod="openstack/glance-default-external-api-0"
Jan 28 21:01:01 crc kubenswrapper[4746]: I0128 21:01:01.649385 4746 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/ed20e05e-643c-407e-bd2f-ce931e1e2bd1-logs\") pod \"glance-default-external-api-0\" (UID: \"ed20e05e-643c-407e-bd2f-ce931e1e2bd1\") " pod="openstack/glance-default-external-api-0"
Jan 28 21:01:01 crc kubenswrapper[4746]: I0128 21:01:01.649906 4746 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/ed20e05e-643c-407e-bd2f-ce931e1e2bd1-scripts\") pod \"glance-default-external-api-0\" (UID: \"ed20e05e-643c-407e-bd2f-ce931e1e2bd1\") " pod="openstack/glance-default-external-api-0"
Jan 28 21:01:01 crc kubenswrapper[4746]: I0128 21:01:01.661354 4746 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/ed20e05e-643c-407e-bd2f-ce931e1e2bd1-config-data\") pod \"glance-default-external-api-0\" (UID: \"ed20e05e-643c-407e-bd2f-ce931e1e2bd1\") " pod="openstack/glance-default-external-api-0"
Jan 28 21:01:01 crc kubenswrapper[4746]: I0128 21:01:01.662908 4746 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/ed20e05e-643c-407e-bd2f-ce931e1e2bd1-combined-ca-bundle\") pod \"glance-default-external-api-0\" (UID: \"ed20e05e-643c-407e-bd2f-ce931e1e2bd1\") " pod="openstack/glance-default-external-api-0"
Jan 28 21:01:01 crc kubenswrapper[4746]: I0128 21:01:01.667068 4746 csi_attacher.go:380] kubernetes.io/csi: attacher.MountDevice STAGE_UNSTAGE_VOLUME capability not set. Skipping MountDevice...
Jan 28 21:01:01 crc kubenswrapper[4746]: I0128 21:01:01.667136 4746 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"pvc-95469877-e687-4d8b-97fe-080814cf4383\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-95469877-e687-4d8b-97fe-080814cf4383\") pod \"glance-default-external-api-0\" (UID: \"ed20e05e-643c-407e-bd2f-ce931e1e2bd1\") device mount path \"/var/lib/kubelet/plugins/kubernetes.io/csi/kubevirt.io.hostpath-provisioner/ca8c336a9061554abfcbd88e82c904ba958cd0f903c4270744870a313861497c/globalmount\"" pod="openstack/glance-default-external-api-0"
Jan 28 21:01:01 crc kubenswrapper[4746]: I0128 21:01:01.676875 4746 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell0-db-create-gmx6k"]
Jan 28 21:01:01 crc kubenswrapper[4746]: I0128 21:01:01.696206 4746 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-v8lk4\" (UniqueName: \"kubernetes.io/projected/ed20e05e-643c-407e-bd2f-ce931e1e2bd1-kube-api-access-v8lk4\") pod \"glance-default-external-api-0\" (UID: \"ed20e05e-643c-407e-bd2f-ce931e1e2bd1\") " pod="openstack/glance-default-external-api-0"
Jan 28 21:01:01 crc kubenswrapper[4746]: I0128 21:01:01.741187 4746 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-api-c1ea-account-create-update-kph5m"]
Jan 28 21:01:01 crc kubenswrapper[4746]: I0128 21:01:01.803165 4746 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-api-db-create-j4mld"]
Jan 28 21:01:01 crc kubenswrapper[4746]: I0128 21:01:01.863173 4746 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell1-db-create-kp99z"]
Jan 28 21:01:01 crc kubenswrapper[4746]: W0128 21:01:01.891783 4746 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-poda8705e73_fddb_48b4_b1e2_fd3e40a001cb.slice/crio-5ae55f14c86d0c04fb99527d531028fef55485fb1774a975b85c99d33c46a39a WatchSource:0}: Error finding container 5ae55f14c86d0c04fb99527d531028fef55485fb1774a975b85c99d33c46a39a: Status 404 returned error can't find the container with id 5ae55f14c86d0c04fb99527d531028fef55485fb1774a975b85c99d33c46a39a
Jan 28 21:01:01 crc kubenswrapper[4746]: I0128 21:01:01.892896 4746 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell1-42e2-account-create-update-k45vw"]
Jan 28 21:01:01 crc kubenswrapper[4746]: I0128 21:01:01.904567 4746 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"pvc-95469877-e687-4d8b-97fe-080814cf4383\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-95469877-e687-4d8b-97fe-080814cf4383\") pod \"glance-default-external-api-0\" (UID: \"ed20e05e-643c-407e-bd2f-ce931e1e2bd1\") " pod="openstack/glance-default-external-api-0"
Jan 28 21:01:01 crc kubenswrapper[4746]: I0128 21:01:01.919579 4746 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell0-e199-account-create-update-v6fd6"]
Jan 28 21:01:01 crc kubenswrapper[4746]: I0128 21:01:01.948094 4746 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ceilometer-0"]
Jan 28 21:01:01 crc kubenswrapper[4746]: I0128 21:01:01.973903 4746 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/keystone-cron-29493901-2dcqp"]
Jan 28 21:01:02 crc kubenswrapper[4746]: I0128 21:01:02.058683 4746 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/glance-default-external-api-0"
Jan 28 21:01:02 crc kubenswrapper[4746]: I0128 21:01:02.070646 4746 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-cron-29493901-2dcqp" event={"ID":"018e2b8e-63bb-41fd-8153-f0c8fc106af7","Type":"ContainerStarted","Data":"26bffe4a0a61b171f2dde2b3eefcaddd8044f5071527405ed3183a277fe0f4a9"}
Jan 28 21:01:02 crc kubenswrapper[4746]: I0128 21:01:02.084000 4746 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"a8705e73-fddb-48b4-b1e2-fd3e40a001cb","Type":"ContainerStarted","Data":"5ae55f14c86d0c04fb99527d531028fef55485fb1774a975b85c99d33c46a39a"}
Jan 28 21:01:02 crc kubenswrapper[4746]: I0128 21:01:02.095424 4746 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-42e2-account-create-update-k45vw" event={"ID":"9f60a726-8cd2-4eca-b252-014d68fded35","Type":"ContainerStarted","Data":"96ee45264dc090712dcbe95753c8438761721480d6e1e01590b81b445fd92ee6"}
Jan 28 21:01:02 crc kubenswrapper[4746]: I0128 21:01:02.100546 4746 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-db-create-j4mld" event={"ID":"eae3e347-2cb4-49b7-999d-abdfe9c3d2ac","Type":"ContainerStarted","Data":"e1836f2d9580fd63f78b197902bbbdfe2f40ea58da7e340e2afdec63f0635c1d"}
Jan 28 21:01:02 crc kubenswrapper[4746]: I0128 21:01:02.102268 4746 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-db-create-kp99z" event={"ID":"add7217e-e0fc-4079-a1c4-b3a328588a9a","Type":"ContainerStarted","Data":"b1093fff1b6f88e928b700b612695484f0207cf4324dfbc72ea5264f89612737"}
Jan 28 21:01:02 crc kubenswrapper[4746]: I0128 21:01:02.106101 4746 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-c1ea-account-create-update-kph5m" event={"ID":"6b0f1a53-15ed-455c-80a3-7f92a3851538","Type":"ContainerStarted","Data":"1657c8240441670a6391562aeb809ad4d334ed795afe2fe8e086b9ae0d960bbb"}
Jan 28 21:01:02 crc kubenswrapper[4746]: I0128 21:01:02.109150 4746 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-db-create-gmx6k" event={"ID":"4718d022-41c2-4684-9093-1f83e23dc367","Type":"ContainerStarted","Data":"7c6ce801fbeac6da5d727de9268114951d4a0ba84173337985735593c8d19242"}
Jan 28 21:01:02 crc kubenswrapper[4746]: I0128 21:01:02.116591 4746 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-e199-account-create-update-v6fd6" event={"ID":"0a9afff1-311d-4881-a211-e405af09d4a7","Type":"ContainerStarted","Data":"1bcaba3f60d5017ef0291a34da5f7b7d49e513b4d73d392b994c07f97607dab8"}
Jan 28 21:01:02 crc kubenswrapper[4746]: I0128 21:01:02.643407 4746 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-default-external-api-0"]
Jan 28 21:01:02 crc kubenswrapper[4746]: I0128 21:01:02.864024 4746 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="875165a4-1092-4e9d-ae24-5044a726e174" path="/var/lib/kubelet/pods/875165a4-1092-4e9d-ae24-5044a726e174/volumes"
Jan 28 21:01:02 crc kubenswrapper[4746]: I0128 21:01:02.865399 4746 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="bae342c7-f51f-4da2-a419-61002cc82f59" path="/var/lib/kubelet/pods/bae342c7-f51f-4da2-a419-61002cc82f59/volumes"
Jan 28 21:01:03 crc kubenswrapper[4746]: I0128 21:01:03.004997 4746 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/glance-default-internal-api-0"]
Jan 28 21:01:03 crc kubenswrapper[4746]: I0128 21:01:03.005267 4746 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/glance-default-internal-api-0" podUID="8355d3eb-8a59-4393-b04b-a44d9dd7824f" containerName="glance-log" containerID="cri-o://0f88e834003af2207446e657d01363eeba0cbb7e9726650317015c56dd154be4" gracePeriod=30
Jan 28 21:01:03 crc kubenswrapper[4746]: I0128 21:01:03.005419 4746 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/glance-default-internal-api-0" podUID="8355d3eb-8a59-4393-b04b-a44d9dd7824f" containerName="glance-httpd" containerID="cri-o://1dba7a40666ee812b896899192e9f69d97b8af339400b44d5dcaa898d4fbb22b" gracePeriod=30
Jan 28 21:01:03 crc kubenswrapper[4746]: I0128 21:01:03.156592 4746 generic.go:334] "Generic (PLEG): container finished" podID="0a9afff1-311d-4881-a211-e405af09d4a7" containerID="efe875ab3e58b67563bf8cd1bf69fe6c5daa702a9ea730e11ff8260a0247bba6" exitCode=0
Jan 28 21:01:03 crc kubenswrapper[4746]: I0128 21:01:03.156715 4746 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-e199-account-create-update-v6fd6" event={"ID":"0a9afff1-311d-4881-a211-e405af09d4a7","Type":"ContainerDied","Data":"efe875ab3e58b67563bf8cd1bf69fe6c5daa702a9ea730e11ff8260a0247bba6"}
Jan 28 21:01:03 crc kubenswrapper[4746]: I0128 21:01:03.162273 4746 generic.go:334] "Generic (PLEG): container finished" podID="add7217e-e0fc-4079-a1c4-b3a328588a9a" containerID="78345b845e597cbbe0267b558d090ddfd4e9395c0d78a932f6d1dde4fb653c96" exitCode=0
Jan 28 21:01:03 crc kubenswrapper[4746]: I0128 21:01:03.162397 4746 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-db-create-kp99z" event={"ID":"add7217e-e0fc-4079-a1c4-b3a328588a9a","Type":"ContainerDied","Data":"78345b845e597cbbe0267b558d090ddfd4e9395c0d78a932f6d1dde4fb653c96"}
Jan 28 21:01:03 crc kubenswrapper[4746]: I0128 21:01:03.164450 4746 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-cron-29493901-2dcqp" event={"ID":"018e2b8e-63bb-41fd-8153-f0c8fc106af7","Type":"ContainerStarted","Data":"3b0c07353d37225723bec96ef37d37026bd127a606e71c4a51e8864aae5df371"}
Jan 28 21:01:03 crc kubenswrapper[4746]: I0128 21:01:03.201330 4746 generic.go:334] "Generic (PLEG): container finished" podID="9f60a726-8cd2-4eca-b252-014d68fded35" containerID="d169eec66f0269f166e9d76665f5a0824e488ca1c076ea9e824d196e4427738a" exitCode=0
Jan 28 21:01:03 crc kubenswrapper[4746]: I0128 21:01:03.201398 4746 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-42e2-account-create-update-k45vw" event={"ID":"9f60a726-8cd2-4eca-b252-014d68fded35","Type":"ContainerDied","Data":"d169eec66f0269f166e9d76665f5a0824e488ca1c076ea9e824d196e4427738a"}
Jan 28 21:01:03 crc kubenswrapper[4746]: I0128 21:01:03.204425 4746 generic.go:334] "Generic (PLEG): container finished" podID="6b0f1a53-15ed-455c-80a3-7f92a3851538" containerID="ca7de28f90af63964789fb78ce766e51fb5b77343d12a3dbcfc483f001c17385" exitCode=0
Jan 28 21:01:03 crc kubenswrapper[4746]: I0128 21:01:03.204469 4746 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-c1ea-account-create-update-kph5m" event={"ID":"6b0f1a53-15ed-455c-80a3-7f92a3851538","Type":"ContainerDied","Data":"ca7de28f90af63964789fb78ce766e51fb5b77343d12a3dbcfc483f001c17385"}
Jan 28 21:01:03 crc kubenswrapper[4746]: I0128 21:01:03.205691 4746 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-external-api-0" event={"ID":"ed20e05e-643c-407e-bd2f-ce931e1e2bd1","Type":"ContainerStarted","Data":"757cd86efb459e86b7ed57976b7b3b8d42beb1df12ef13c4e150abe6c8c234cd"}
Jan 28 21:01:03 crc kubenswrapper[4746]: I0128 21:01:03.207238 4746 generic.go:334] "Generic (PLEG): container finished" podID="4718d022-41c2-4684-9093-1f83e23dc367" containerID="d252a81e0c138932fe8347bd1c6f1c0acde2062679fb8d94914bf45b47e500d8" exitCode=0
Jan 28 21:01:03 crc kubenswrapper[4746]: I0128 21:01:03.207275 4746 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-db-create-gmx6k" event={"ID":"4718d022-41c2-4684-9093-1f83e23dc367","Type":"ContainerDied","Data":"d252a81e0c138932fe8347bd1c6f1c0acde2062679fb8d94914bf45b47e500d8"}
Jan 28 21:01:03 crc kubenswrapper[4746]: I0128 21:01:03.228292 4746 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/keystone-cron-29493901-2dcqp" podStartSLOduration=3.228272607 podStartE2EDuration="3.228272607s" podCreationTimestamp="2026-01-28 21:01:00 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-28 21:01:03.227766132 +0000 UTC m=+1291.183952486" watchObservedRunningTime="2026-01-28 21:01:03.228272607 +0000 UTC m=+1291.184458961"
Jan 28 21:01:03 crc kubenswrapper[4746]: I0128 21:01:03.231375 4746 generic.go:334] "Generic (PLEG): container finished" podID="eae3e347-2cb4-49b7-999d-abdfe9c3d2ac" containerID="8b9d8ff48bd5a1b24dbd3629a43ac6498bf514a55e69aea7af79ec775b405a69" exitCode=0
Jan 28 21:01:03 crc kubenswrapper[4746]: I0128 21:01:03.231433 4746 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-db-create-j4mld" event={"ID":"eae3e347-2cb4-49b7-999d-abdfe9c3d2ac","Type":"ContainerDied","Data":"8b9d8ff48bd5a1b24dbd3629a43ac6498bf514a55e69aea7af79ec775b405a69"}
Jan 28 21:01:04 crc kubenswrapper[4746]: I0128 21:01:04.242459 4746 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-external-api-0" event={"ID":"ed20e05e-643c-407e-bd2f-ce931e1e2bd1","Type":"ContainerStarted","Data":"2aed0dfdb70f588604a73307933efc2e9c7e7d424ba34108210d3d86cf6cfc4b"}
Jan 28 21:01:04 crc kubenswrapper[4746]: I0128 21:01:04.242990 4746 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-external-api-0" event={"ID":"ed20e05e-643c-407e-bd2f-ce931e1e2bd1","Type":"ContainerStarted","Data":"e408619915e7e3a1123bc2e4738fa8188e6ff2b50eefa18d21bc27f1f4489a97"}
Jan 28 21:01:04 crc kubenswrapper[4746]: I0128 21:01:04.244648 4746 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"a8705e73-fddb-48b4-b1e2-fd3e40a001cb","Type":"ContainerStarted","Data":"a8e5786bdd005c4e878b83dd2881485458e62698a1aa4bac9b1a63762480c392"}
Jan 28 21:01:04 crc kubenswrapper[4746]: I0128 21:01:04.246718 4746 generic.go:334] "Generic (PLEG): container finished" podID="8355d3eb-8a59-4393-b04b-a44d9dd7824f" containerID="0f88e834003af2207446e657d01363eeba0cbb7e9726650317015c56dd154be4" exitCode=143
Jan 28 21:01:04 crc kubenswrapper[4746]: I0128 21:01:04.246889 4746 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-internal-api-0" event={"ID":"8355d3eb-8a59-4393-b04b-a44d9dd7824f","Type":"ContainerDied","Data":"0f88e834003af2207446e657d01363eeba0cbb7e9726650317015c56dd154be4"}
Jan 28 21:01:04 crc kubenswrapper[4746]: I0128 21:01:04.296385 4746 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/glance-default-external-api-0" podStartSLOduration=3.296369096 podStartE2EDuration="3.296369096s" podCreationTimestamp="2026-01-28 21:01:01 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-28 21:01:04.290005416 +0000 UTC m=+1292.246191770" watchObservedRunningTime="2026-01-28 21:01:04.296369096 +0000 UTC m=+1292.252555450"
Jan 28 21:01:04 crc kubenswrapper[4746]: I0128 21:01:04.979381 4746 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell1-42e2-account-create-update-k45vw"
Jan 28 21:01:05 crc kubenswrapper[4746]: I0128 21:01:05.134297 4746 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-qxqrn\" (UniqueName: \"kubernetes.io/projected/9f60a726-8cd2-4eca-b252-014d68fded35-kube-api-access-qxqrn\") pod \"9f60a726-8cd2-4eca-b252-014d68fded35\" (UID: \"9f60a726-8cd2-4eca-b252-014d68fded35\") "
Jan 28 21:01:05 crc kubenswrapper[4746]: I0128 21:01:05.134566 4746 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/9f60a726-8cd2-4eca-b252-014d68fded35-operator-scripts\") pod \"9f60a726-8cd2-4eca-b252-014d68fded35\" (UID: \"9f60a726-8cd2-4eca-b252-014d68fded35\") "
Jan 28 21:01:05 crc kubenswrapper[4746]: I0128 21:01:05.135602 4746 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/9f60a726-8cd2-4eca-b252-014d68fded35-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "9f60a726-8cd2-4eca-b252-014d68fded35" (UID: "9f60a726-8cd2-4eca-b252-014d68fded35"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Jan 28 21:01:05 crc kubenswrapper[4746]: I0128 21:01:05.142810 4746 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/9f60a726-8cd2-4eca-b252-014d68fded35-kube-api-access-qxqrn" (OuterVolumeSpecName: "kube-api-access-qxqrn") pod "9f60a726-8cd2-4eca-b252-014d68fded35" (UID: "9f60a726-8cd2-4eca-b252-014d68fded35"). InnerVolumeSpecName "kube-api-access-qxqrn". PluginName "kubernetes.io/projected", VolumeGidValue ""
Jan 28 21:01:05 crc kubenswrapper[4746]: I0128 21:01:05.238598 4746 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/9f60a726-8cd2-4eca-b252-014d68fded35-operator-scripts\") on node \"crc\" DevicePath \"\""
Jan 28 21:01:05 crc kubenswrapper[4746]: I0128 21:01:05.238635 4746 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-qxqrn\" (UniqueName: \"kubernetes.io/projected/9f60a726-8cd2-4eca-b252-014d68fded35-kube-api-access-qxqrn\") on node \"crc\" DevicePath \"\""
Jan 28 21:01:05 crc kubenswrapper[4746]: I0128 21:01:05.284627 4746 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"a8705e73-fddb-48b4-b1e2-fd3e40a001cb","Type":"ContainerStarted","Data":"67ead006986d589f8b439f4e84d5167c4c0324d7d33954b044f217d7c3804083"}
Jan 28 21:01:05 crc kubenswrapper[4746]: I0128 21:01:05.298607 4746 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-42e2-account-create-update-k45vw" event={"ID":"9f60a726-8cd2-4eca-b252-014d68fded35","Type":"ContainerDied","Data":"96ee45264dc090712dcbe95753c8438761721480d6e1e01590b81b445fd92ee6"}
Jan 28 21:01:05 crc kubenswrapper[4746]: I0128 21:01:05.298684 4746 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="96ee45264dc090712dcbe95753c8438761721480d6e1e01590b81b445fd92ee6"
Jan 28 21:01:05 crc kubenswrapper[4746]: I0128 21:01:05.298855 4746 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell1-42e2-account-create-update-k45vw"
Jan 28 21:01:05 crc kubenswrapper[4746]: I0128 21:01:05.317151 4746 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-db-create-gmx6k" event={"ID":"4718d022-41c2-4684-9093-1f83e23dc367","Type":"ContainerDied","Data":"7c6ce801fbeac6da5d727de9268114951d4a0ba84173337985735593c8d19242"}
Jan 28 21:01:05 crc kubenswrapper[4746]: I0128 21:01:05.317212 4746 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="7c6ce801fbeac6da5d727de9268114951d4a0ba84173337985735593c8d19242"
Jan 28 21:01:05 crc kubenswrapper[4746]: I0128 21:01:05.405415 4746 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell0-db-create-gmx6k"
Jan 28 21:01:05 crc kubenswrapper[4746]: I0128 21:01:05.435092 4746 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-api-db-create-j4mld"
Jan 28 21:01:05 crc kubenswrapper[4746]: I0128 21:01:05.461099 4746 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell1-db-create-kp99z"
Jan 28 21:01:05 crc kubenswrapper[4746]: I0128 21:01:05.465417 4746 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-api-c1ea-account-create-update-kph5m"
Jan 28 21:01:05 crc kubenswrapper[4746]: I0128 21:01:05.479876 4746 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell0-e199-account-create-update-v6fd6"
Jan 28 21:01:05 crc kubenswrapper[4746]: I0128 21:01:05.546448 4746 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/add7217e-e0fc-4079-a1c4-b3a328588a9a-operator-scripts\") pod \"add7217e-e0fc-4079-a1c4-b3a328588a9a\" (UID: \"add7217e-e0fc-4079-a1c4-b3a328588a9a\") "
Jan 28 21:01:05 crc kubenswrapper[4746]: I0128 21:01:05.546819 4746 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-jx5kj\" (UniqueName: \"kubernetes.io/projected/add7217e-e0fc-4079-a1c4-b3a328588a9a-kube-api-access-jx5kj\") pod \"add7217e-e0fc-4079-a1c4-b3a328588a9a\" (UID: \"add7217e-e0fc-4079-a1c4-b3a328588a9a\") "
Jan 28 21:01:05 crc kubenswrapper[4746]: I0128 21:01:05.547015 4746 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/4718d022-41c2-4684-9093-1f83e23dc367-operator-scripts\") pod \"4718d022-41c2-4684-9093-1f83e23dc367\" (UID: \"4718d022-41c2-4684-9093-1f83e23dc367\") "
Jan 28 21:01:05 crc kubenswrapper[4746]: I0128 21:01:05.547052 4746 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/0a9afff1-311d-4881-a211-e405af09d4a7-operator-scripts\") pod \"0a9afff1-311d-4881-a211-e405af09d4a7\" (UID: \"0a9afff1-311d-4881-a211-e405af09d4a7\") "
Jan 28 21:01:05 crc kubenswrapper[4746]: I0128 21:01:05.547127 4746 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-wd45s\" (UniqueName: \"kubernetes.io/projected/0a9afff1-311d-4881-a211-e405af09d4a7-kube-api-access-wd45s\") pod \"0a9afff1-311d-4881-a211-e405af09d4a7\" (UID: \"0a9afff1-311d-4881-a211-e405af09d4a7\") "
Jan 28 21:01:05 crc kubenswrapper[4746]: I0128 21:01:05.547378 4746 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-56rwh\" (UniqueName: \"kubernetes.io/projected/4718d022-41c2-4684-9093-1f83e23dc367-kube-api-access-56rwh\") pod \"4718d022-41c2-4684-9093-1f83e23dc367\" (UID: \"4718d022-41c2-4684-9093-1f83e23dc367\") "
Jan 28 21:01:05 crc kubenswrapper[4746]: I0128 21:01:05.547480 4746 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/eae3e347-2cb4-49b7-999d-abdfe9c3d2ac-operator-scripts\") pod \"eae3e347-2cb4-49b7-999d-abdfe9c3d2ac\" (UID: \"eae3e347-2cb4-49b7-999d-abdfe9c3d2ac\") "
Jan 28 21:01:05 crc kubenswrapper[4746]: I0128 21:01:05.547563 4746 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-2zzwv\" (UniqueName: \"kubernetes.io/projected/eae3e347-2cb4-49b7-999d-abdfe9c3d2ac-kube-api-access-2zzwv\") pod \"eae3e347-2cb4-49b7-999d-abdfe9c3d2ac\" (UID: \"eae3e347-2cb4-49b7-999d-abdfe9c3d2ac\") "
Jan 28 21:01:05 crc kubenswrapper[4746]: I0128 21:01:05.547619 4746 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-nq5zs\" (UniqueName: \"kubernetes.io/projected/6b0f1a53-15ed-455c-80a3-7f92a3851538-kube-api-access-nq5zs\") pod \"6b0f1a53-15ed-455c-80a3-7f92a3851538\" (UID: \"6b0f1a53-15ed-455c-80a3-7f92a3851538\") "
Jan 28 21:01:05 crc kubenswrapper[4746]: I0128 21:01:05.547635 4746 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/6b0f1a53-15ed-455c-80a3-7f92a3851538-operator-scripts\") pod \"6b0f1a53-15ed-455c-80a3-7f92a3851538\" (UID: \"6b0f1a53-15ed-455c-80a3-7f92a3851538\") "
Jan 28 21:01:05 crc kubenswrapper[4746]: I0128 21:01:05.548791 4746 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/6b0f1a53-15ed-455c-80a3-7f92a3851538-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "6b0f1a53-15ed-455c-80a3-7f92a3851538" (UID: "6b0f1a53-15ed-455c-80a3-7f92a3851538"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Jan 28 21:01:05 crc kubenswrapper[4746]: I0128 21:01:05.556682 4746 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/eae3e347-2cb4-49b7-999d-abdfe9c3d2ac-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "eae3e347-2cb4-49b7-999d-abdfe9c3d2ac" (UID: "eae3e347-2cb4-49b7-999d-abdfe9c3d2ac"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Jan 28 21:01:05 crc kubenswrapper[4746]: I0128 21:01:05.557055 4746 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/4718d022-41c2-4684-9093-1f83e23dc367-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "4718d022-41c2-4684-9093-1f83e23dc367" (UID: "4718d022-41c2-4684-9093-1f83e23dc367"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Jan 28 21:01:05 crc kubenswrapper[4746]: I0128 21:01:05.558215 4746 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/add7217e-e0fc-4079-a1c4-b3a328588a9a-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "add7217e-e0fc-4079-a1c4-b3a328588a9a" (UID: "add7217e-e0fc-4079-a1c4-b3a328588a9a"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Jan 28 21:01:05 crc kubenswrapper[4746]: I0128 21:01:05.558404 4746 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/0a9afff1-311d-4881-a211-e405af09d4a7-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "0a9afff1-311d-4881-a211-e405af09d4a7" (UID: "0a9afff1-311d-4881-a211-e405af09d4a7"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Jan 28 21:01:05 crc kubenswrapper[4746]: I0128 21:01:05.572191 4746 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/6b0f1a53-15ed-455c-80a3-7f92a3851538-kube-api-access-nq5zs" (OuterVolumeSpecName: "kube-api-access-nq5zs") pod "6b0f1a53-15ed-455c-80a3-7f92a3851538" (UID: "6b0f1a53-15ed-455c-80a3-7f92a3851538"). InnerVolumeSpecName "kube-api-access-nq5zs". PluginName "kubernetes.io/projected", VolumeGidValue ""
Jan 28 21:01:05 crc kubenswrapper[4746]: I0128 21:01:05.572324 4746 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/eae3e347-2cb4-49b7-999d-abdfe9c3d2ac-kube-api-access-2zzwv" (OuterVolumeSpecName: "kube-api-access-2zzwv") pod "eae3e347-2cb4-49b7-999d-abdfe9c3d2ac" (UID: "eae3e347-2cb4-49b7-999d-abdfe9c3d2ac"). InnerVolumeSpecName "kube-api-access-2zzwv". PluginName "kubernetes.io/projected", VolumeGidValue ""
Jan 28 21:01:05 crc kubenswrapper[4746]: I0128 21:01:05.575652 4746 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/add7217e-e0fc-4079-a1c4-b3a328588a9a-kube-api-access-jx5kj" (OuterVolumeSpecName: "kube-api-access-jx5kj") pod "add7217e-e0fc-4079-a1c4-b3a328588a9a" (UID: "add7217e-e0fc-4079-a1c4-b3a328588a9a"). InnerVolumeSpecName "kube-api-access-jx5kj". PluginName "kubernetes.io/projected", VolumeGidValue ""
Jan 28 21:01:05 crc kubenswrapper[4746]: I0128 21:01:05.588986 4746 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/4718d022-41c2-4684-9093-1f83e23dc367-kube-api-access-56rwh" (OuterVolumeSpecName: "kube-api-access-56rwh") pod "4718d022-41c2-4684-9093-1f83e23dc367" (UID: "4718d022-41c2-4684-9093-1f83e23dc367"). InnerVolumeSpecName "kube-api-access-56rwh".
PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 28 21:01:05 crc kubenswrapper[4746]: I0128 21:01:05.604178 4746 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/0a9afff1-311d-4881-a211-e405af09d4a7-kube-api-access-wd45s" (OuterVolumeSpecName: "kube-api-access-wd45s") pod "0a9afff1-311d-4881-a211-e405af09d4a7" (UID: "0a9afff1-311d-4881-a211-e405af09d4a7"). InnerVolumeSpecName "kube-api-access-wd45s". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 28 21:01:05 crc kubenswrapper[4746]: I0128 21:01:05.650062 4746 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/4718d022-41c2-4684-9093-1f83e23dc367-operator-scripts\") on node \"crc\" DevicePath \"\"" Jan 28 21:01:05 crc kubenswrapper[4746]: I0128 21:01:05.650117 4746 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/0a9afff1-311d-4881-a211-e405af09d4a7-operator-scripts\") on node \"crc\" DevicePath \"\"" Jan 28 21:01:05 crc kubenswrapper[4746]: I0128 21:01:05.650128 4746 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-wd45s\" (UniqueName: \"kubernetes.io/projected/0a9afff1-311d-4881-a211-e405af09d4a7-kube-api-access-wd45s\") on node \"crc\" DevicePath \"\"" Jan 28 21:01:05 crc kubenswrapper[4746]: I0128 21:01:05.650140 4746 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-56rwh\" (UniqueName: \"kubernetes.io/projected/4718d022-41c2-4684-9093-1f83e23dc367-kube-api-access-56rwh\") on node \"crc\" DevicePath \"\"" Jan 28 21:01:05 crc kubenswrapper[4746]: I0128 21:01:05.650148 4746 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/eae3e347-2cb4-49b7-999d-abdfe9c3d2ac-operator-scripts\") on node \"crc\" DevicePath \"\"" Jan 28 21:01:05 crc kubenswrapper[4746]: I0128 21:01:05.650156 4746 reconciler_common.go:293] 
"Volume detached for volume \"kube-api-access-2zzwv\" (UniqueName: \"kubernetes.io/projected/eae3e347-2cb4-49b7-999d-abdfe9c3d2ac-kube-api-access-2zzwv\") on node \"crc\" DevicePath \"\"" Jan 28 21:01:05 crc kubenswrapper[4746]: I0128 21:01:05.650165 4746 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-nq5zs\" (UniqueName: \"kubernetes.io/projected/6b0f1a53-15ed-455c-80a3-7f92a3851538-kube-api-access-nq5zs\") on node \"crc\" DevicePath \"\"" Jan 28 21:01:05 crc kubenswrapper[4746]: I0128 21:01:05.650173 4746 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/6b0f1a53-15ed-455c-80a3-7f92a3851538-operator-scripts\") on node \"crc\" DevicePath \"\"" Jan 28 21:01:05 crc kubenswrapper[4746]: I0128 21:01:05.650182 4746 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/add7217e-e0fc-4079-a1c4-b3a328588a9a-operator-scripts\") on node \"crc\" DevicePath \"\"" Jan 28 21:01:05 crc kubenswrapper[4746]: I0128 21:01:05.650190 4746 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-jx5kj\" (UniqueName: \"kubernetes.io/projected/add7217e-e0fc-4079-a1c4-b3a328588a9a-kube-api-access-jx5kj\") on node \"crc\" DevicePath \"\"" Jan 28 21:01:05 crc kubenswrapper[4746]: E0128 21:01:05.866920 4746 cadvisor_stats_provider.go:516] "Partial failure issuing cadvisor.ContainerInfoV2" err="partial failures: [\"/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod018e2b8e_63bb_41fd_8153_f0c8fc106af7.slice/crio-conmon-3b0c07353d37225723bec96ef37d37026bd127a606e71c4a51e8864aae5df371.scope\": RecentStats: unable to find data in memory cache]" Jan 28 21:01:06 crc kubenswrapper[4746]: I0128 21:01:06.325915 4746 generic.go:334] "Generic (PLEG): container finished" podID="8355d3eb-8a59-4393-b04b-a44d9dd7824f" containerID="1dba7a40666ee812b896899192e9f69d97b8af339400b44d5dcaa898d4fbb22b" exitCode=0 Jan 28 21:01:06 crc 
kubenswrapper[4746]: I0128 21:01:06.325995 4746 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-internal-api-0" event={"ID":"8355d3eb-8a59-4393-b04b-a44d9dd7824f","Type":"ContainerDied","Data":"1dba7a40666ee812b896899192e9f69d97b8af339400b44d5dcaa898d4fbb22b"} Jan 28 21:01:06 crc kubenswrapper[4746]: I0128 21:01:06.329540 4746 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-c1ea-account-create-update-kph5m" event={"ID":"6b0f1a53-15ed-455c-80a3-7f92a3851538","Type":"ContainerDied","Data":"1657c8240441670a6391562aeb809ad4d334ed795afe2fe8e086b9ae0d960bbb"} Jan 28 21:01:06 crc kubenswrapper[4746]: I0128 21:01:06.329578 4746 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-api-c1ea-account-create-update-kph5m" Jan 28 21:01:06 crc kubenswrapper[4746]: I0128 21:01:06.329580 4746 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="1657c8240441670a6391562aeb809ad4d334ed795afe2fe8e086b9ae0d960bbb" Jan 28 21:01:06 crc kubenswrapper[4746]: I0128 21:01:06.331218 4746 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-db-create-j4mld" event={"ID":"eae3e347-2cb4-49b7-999d-abdfe9c3d2ac","Type":"ContainerDied","Data":"e1836f2d9580fd63f78b197902bbbdfe2f40ea58da7e340e2afdec63f0635c1d"} Jan 28 21:01:06 crc kubenswrapper[4746]: I0128 21:01:06.331250 4746 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="e1836f2d9580fd63f78b197902bbbdfe2f40ea58da7e340e2afdec63f0635c1d" Jan 28 21:01:06 crc kubenswrapper[4746]: I0128 21:01:06.331333 4746 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-api-db-create-j4mld" Jan 28 21:01:06 crc kubenswrapper[4746]: I0128 21:01:06.335300 4746 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell0-e199-account-create-update-v6fd6" Jan 28 21:01:06 crc kubenswrapper[4746]: I0128 21:01:06.335319 4746 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-e199-account-create-update-v6fd6" event={"ID":"0a9afff1-311d-4881-a211-e405af09d4a7","Type":"ContainerDied","Data":"1bcaba3f60d5017ef0291a34da5f7b7d49e513b4d73d392b994c07f97607dab8"} Jan 28 21:01:06 crc kubenswrapper[4746]: I0128 21:01:06.335360 4746 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="1bcaba3f60d5017ef0291a34da5f7b7d49e513b4d73d392b994c07f97607dab8" Jan 28 21:01:06 crc kubenswrapper[4746]: I0128 21:01:06.337229 4746 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-db-create-kp99z" event={"ID":"add7217e-e0fc-4079-a1c4-b3a328588a9a","Type":"ContainerDied","Data":"b1093fff1b6f88e928b700b612695484f0207cf4324dfbc72ea5264f89612737"} Jan 28 21:01:06 crc kubenswrapper[4746]: I0128 21:01:06.337266 4746 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="b1093fff1b6f88e928b700b612695484f0207cf4324dfbc72ea5264f89612737" Jan 28 21:01:06 crc kubenswrapper[4746]: I0128 21:01:06.337328 4746 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell1-db-create-kp99z" Jan 28 21:01:06 crc kubenswrapper[4746]: I0128 21:01:06.340592 4746 generic.go:334] "Generic (PLEG): container finished" podID="018e2b8e-63bb-41fd-8153-f0c8fc106af7" containerID="3b0c07353d37225723bec96ef37d37026bd127a606e71c4a51e8864aae5df371" exitCode=0 Jan 28 21:01:06 crc kubenswrapper[4746]: I0128 21:01:06.340655 4746 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-cron-29493901-2dcqp" event={"ID":"018e2b8e-63bb-41fd-8153-f0c8fc106af7","Type":"ContainerDied","Data":"3b0c07353d37225723bec96ef37d37026bd127a606e71c4a51e8864aae5df371"} Jan 28 21:01:06 crc kubenswrapper[4746]: I0128 21:01:06.346345 4746 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell0-db-create-gmx6k" Jan 28 21:01:06 crc kubenswrapper[4746]: I0128 21:01:06.346340 4746 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"a8705e73-fddb-48b4-b1e2-fd3e40a001cb","Type":"ContainerStarted","Data":"44628f9cf49f9de7f54565f66b9289e264cbc17ae8325e0426f5dd73a3cacf38"} Jan 28 21:01:06 crc kubenswrapper[4746]: I0128 21:01:06.923469 4746 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/glance-default-internal-api-0" Jan 28 21:01:06 crc kubenswrapper[4746]: I0128 21:01:06.979029 4746 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/8355d3eb-8a59-4393-b04b-a44d9dd7824f-internal-tls-certs\") pod \"8355d3eb-8a59-4393-b04b-a44d9dd7824f\" (UID: \"8355d3eb-8a59-4393-b04b-a44d9dd7824f\") " Jan 28 21:01:06 crc kubenswrapper[4746]: I0128 21:01:06.979073 4746 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/8355d3eb-8a59-4393-b04b-a44d9dd7824f-combined-ca-bundle\") pod \"8355d3eb-8a59-4393-b04b-a44d9dd7824f\" (UID: \"8355d3eb-8a59-4393-b04b-a44d9dd7824f\") " Jan 28 21:01:06 crc kubenswrapper[4746]: I0128 21:01:06.979169 4746 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/8355d3eb-8a59-4393-b04b-a44d9dd7824f-scripts\") pod \"8355d3eb-8a59-4393-b04b-a44d9dd7824f\" (UID: \"8355d3eb-8a59-4393-b04b-a44d9dd7824f\") " Jan 28 21:01:06 crc kubenswrapper[4746]: I0128 21:01:06.979244 4746 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/8355d3eb-8a59-4393-b04b-a44d9dd7824f-logs\") pod \"8355d3eb-8a59-4393-b04b-a44d9dd7824f\" (UID: \"8355d3eb-8a59-4393-b04b-a44d9dd7824f\") " Jan 28 21:01:06 crc kubenswrapper[4746]: I0128 21:01:06.979331 4746 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-zhffj\" (UniqueName: \"kubernetes.io/projected/8355d3eb-8a59-4393-b04b-a44d9dd7824f-kube-api-access-zhffj\") pod \"8355d3eb-8a59-4393-b04b-a44d9dd7824f\" (UID: \"8355d3eb-8a59-4393-b04b-a44d9dd7824f\") " Jan 28 21:01:06 crc kubenswrapper[4746]: I0128 21:01:06.979489 4746 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"glance\" (UniqueName: 
\"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-d0cac59c-8041-46c0-a355-d5d8d1738a5d\") pod \"8355d3eb-8a59-4393-b04b-a44d9dd7824f\" (UID: \"8355d3eb-8a59-4393-b04b-a44d9dd7824f\") " Jan 28 21:01:06 crc kubenswrapper[4746]: I0128 21:01:06.979561 4746 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/8355d3eb-8a59-4393-b04b-a44d9dd7824f-config-data\") pod \"8355d3eb-8a59-4393-b04b-a44d9dd7824f\" (UID: \"8355d3eb-8a59-4393-b04b-a44d9dd7824f\") " Jan 28 21:01:06 crc kubenswrapper[4746]: I0128 21:01:06.979711 4746 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/8355d3eb-8a59-4393-b04b-a44d9dd7824f-httpd-run\") pod \"8355d3eb-8a59-4393-b04b-a44d9dd7824f\" (UID: \"8355d3eb-8a59-4393-b04b-a44d9dd7824f\") " Jan 28 21:01:06 crc kubenswrapper[4746]: I0128 21:01:06.980736 4746 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/8355d3eb-8a59-4393-b04b-a44d9dd7824f-httpd-run" (OuterVolumeSpecName: "httpd-run") pod "8355d3eb-8a59-4393-b04b-a44d9dd7824f" (UID: "8355d3eb-8a59-4393-b04b-a44d9dd7824f"). InnerVolumeSpecName "httpd-run". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 28 21:01:06 crc kubenswrapper[4746]: I0128 21:01:06.981468 4746 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/8355d3eb-8a59-4393-b04b-a44d9dd7824f-logs" (OuterVolumeSpecName: "logs") pod "8355d3eb-8a59-4393-b04b-a44d9dd7824f" (UID: "8355d3eb-8a59-4393-b04b-a44d9dd7824f"). InnerVolumeSpecName "logs". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 28 21:01:07 crc kubenswrapper[4746]: I0128 21:01:07.013486 4746 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/8355d3eb-8a59-4393-b04b-a44d9dd7824f-scripts" (OuterVolumeSpecName: "scripts") pod "8355d3eb-8a59-4393-b04b-a44d9dd7824f" (UID: "8355d3eb-8a59-4393-b04b-a44d9dd7824f"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 28 21:01:07 crc kubenswrapper[4746]: I0128 21:01:07.019693 4746 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/8355d3eb-8a59-4393-b04b-a44d9dd7824f-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "8355d3eb-8a59-4393-b04b-a44d9dd7824f" (UID: "8355d3eb-8a59-4393-b04b-a44d9dd7824f"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 28 21:01:07 crc kubenswrapper[4746]: I0128 21:01:07.028254 4746 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/8355d3eb-8a59-4393-b04b-a44d9dd7824f-kube-api-access-zhffj" (OuterVolumeSpecName: "kube-api-access-zhffj") pod "8355d3eb-8a59-4393-b04b-a44d9dd7824f" (UID: "8355d3eb-8a59-4393-b04b-a44d9dd7824f"). InnerVolumeSpecName "kube-api-access-zhffj". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 28 21:01:07 crc kubenswrapper[4746]: E0128 21:01:07.029417 4746 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-d0cac59c-8041-46c0-a355-d5d8d1738a5d podName:8355d3eb-8a59-4393-b04b-a44d9dd7824f nodeName:}" failed. No retries permitted until 2026-01-28 21:01:07.529390181 +0000 UTC m=+1295.485576735 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "glance" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-d0cac59c-8041-46c0-a355-d5d8d1738a5d") pod "8355d3eb-8a59-4393-b04b-a44d9dd7824f" (UID: "8355d3eb-8a59-4393-b04b-a44d9dd7824f") : kubernetes.io/csi: Unmounter.TearDownAt failed: rpc error: code = Unknown desc = check target path: could not get consistent content of /proc/mounts after 3 attempts Jan 28 21:01:07 crc kubenswrapper[4746]: I0128 21:01:07.089259 4746 reconciler_common.go:293] "Volume detached for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/8355d3eb-8a59-4393-b04b-a44d9dd7824f-httpd-run\") on node \"crc\" DevicePath \"\"" Jan 28 21:01:07 crc kubenswrapper[4746]: I0128 21:01:07.089601 4746 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/8355d3eb-8a59-4393-b04b-a44d9dd7824f-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Jan 28 21:01:07 crc kubenswrapper[4746]: I0128 21:01:07.089613 4746 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/8355d3eb-8a59-4393-b04b-a44d9dd7824f-scripts\") on node \"crc\" DevicePath \"\"" Jan 28 21:01:07 crc kubenswrapper[4746]: I0128 21:01:07.089621 4746 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/8355d3eb-8a59-4393-b04b-a44d9dd7824f-logs\") on node \"crc\" DevicePath \"\"" Jan 28 21:01:07 crc kubenswrapper[4746]: I0128 21:01:07.089629 4746 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-zhffj\" (UniqueName: \"kubernetes.io/projected/8355d3eb-8a59-4393-b04b-a44d9dd7824f-kube-api-access-zhffj\") on node \"crc\" DevicePath \"\"" Jan 28 21:01:07 crc kubenswrapper[4746]: I0128 21:01:07.090288 4746 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/8355d3eb-8a59-4393-b04b-a44d9dd7824f-internal-tls-certs" (OuterVolumeSpecName: "internal-tls-certs") 
pod "8355d3eb-8a59-4393-b04b-a44d9dd7824f" (UID: "8355d3eb-8a59-4393-b04b-a44d9dd7824f"). InnerVolumeSpecName "internal-tls-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 28 21:01:07 crc kubenswrapper[4746]: I0128 21:01:07.101460 4746 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/8355d3eb-8a59-4393-b04b-a44d9dd7824f-config-data" (OuterVolumeSpecName: "config-data") pod "8355d3eb-8a59-4393-b04b-a44d9dd7824f" (UID: "8355d3eb-8a59-4393-b04b-a44d9dd7824f"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 28 21:01:07 crc kubenswrapper[4746]: I0128 21:01:07.193241 4746 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/8355d3eb-8a59-4393-b04b-a44d9dd7824f-config-data\") on node \"crc\" DevicePath \"\"" Jan 28 21:01:07 crc kubenswrapper[4746]: I0128 21:01:07.193269 4746 reconciler_common.go:293] "Volume detached for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/8355d3eb-8a59-4393-b04b-a44d9dd7824f-internal-tls-certs\") on node \"crc\" DevicePath \"\"" Jan 28 21:01:07 crc kubenswrapper[4746]: I0128 21:01:07.362611 4746 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-internal-api-0" event={"ID":"8355d3eb-8a59-4393-b04b-a44d9dd7824f","Type":"ContainerDied","Data":"a27ca5c58be1de3f3f8262768402c61ffabf4e7622be35e73a021d98a7eb4b0c"} Jan 28 21:01:07 crc kubenswrapper[4746]: I0128 21:01:07.362682 4746 scope.go:117] "RemoveContainer" containerID="1dba7a40666ee812b896899192e9f69d97b8af339400b44d5dcaa898d4fbb22b" Jan 28 21:01:07 crc kubenswrapper[4746]: I0128 21:01:07.362637 4746 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/glance-default-internal-api-0" Jan 28 21:01:07 crc kubenswrapper[4746]: I0128 21:01:07.413066 4746 scope.go:117] "RemoveContainer" containerID="0f88e834003af2207446e657d01363eeba0cbb7e9726650317015c56dd154be4" Jan 28 21:01:07 crc kubenswrapper[4746]: I0128 21:01:07.601774 4746 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"glance\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-d0cac59c-8041-46c0-a355-d5d8d1738a5d\") pod \"8355d3eb-8a59-4393-b04b-a44d9dd7824f\" (UID: \"8355d3eb-8a59-4393-b04b-a44d9dd7824f\") " Jan 28 21:01:07 crc kubenswrapper[4746]: I0128 21:01:07.618232 4746 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-d0cac59c-8041-46c0-a355-d5d8d1738a5d" (OuterVolumeSpecName: "glance") pod "8355d3eb-8a59-4393-b04b-a44d9dd7824f" (UID: "8355d3eb-8a59-4393-b04b-a44d9dd7824f"). InnerVolumeSpecName "pvc-d0cac59c-8041-46c0-a355-d5d8d1738a5d". PluginName "kubernetes.io/csi", VolumeGidValue "" Jan 28 21:01:07 crc kubenswrapper[4746]: I0128 21:01:07.623508 4746 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/keystone-cron-29493901-2dcqp" Jan 28 21:01:07 crc kubenswrapper[4746]: I0128 21:01:07.704770 4746 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/018e2b8e-63bb-41fd-8153-f0c8fc106af7-fernet-keys\") pod \"018e2b8e-63bb-41fd-8153-f0c8fc106af7\" (UID: \"018e2b8e-63bb-41fd-8153-f0c8fc106af7\") " Jan 28 21:01:07 crc kubenswrapper[4746]: I0128 21:01:07.705009 4746 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/018e2b8e-63bb-41fd-8153-f0c8fc106af7-config-data\") pod \"018e2b8e-63bb-41fd-8153-f0c8fc106af7\" (UID: \"018e2b8e-63bb-41fd-8153-f0c8fc106af7\") " Jan 28 21:01:07 crc kubenswrapper[4746]: I0128 21:01:07.705115 4746 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/018e2b8e-63bb-41fd-8153-f0c8fc106af7-combined-ca-bundle\") pod \"018e2b8e-63bb-41fd-8153-f0c8fc106af7\" (UID: \"018e2b8e-63bb-41fd-8153-f0c8fc106af7\") " Jan 28 21:01:07 crc kubenswrapper[4746]: I0128 21:01:07.705194 4746 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-c7w8k\" (UniqueName: \"kubernetes.io/projected/018e2b8e-63bb-41fd-8153-f0c8fc106af7-kube-api-access-c7w8k\") pod \"018e2b8e-63bb-41fd-8153-f0c8fc106af7\" (UID: \"018e2b8e-63bb-41fd-8153-f0c8fc106af7\") " Jan 28 21:01:07 crc kubenswrapper[4746]: I0128 21:01:07.706971 4746 reconciler_common.go:286] "operationExecutor.UnmountDevice started for volume \"pvc-d0cac59c-8041-46c0-a355-d5d8d1738a5d\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-d0cac59c-8041-46c0-a355-d5d8d1738a5d\") on node \"crc\" " Jan 28 21:01:07 crc kubenswrapper[4746]: I0128 21:01:07.722005 4746 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume 
"kubernetes.io/projected/018e2b8e-63bb-41fd-8153-f0c8fc106af7-kube-api-access-c7w8k" (OuterVolumeSpecName: "kube-api-access-c7w8k") pod "018e2b8e-63bb-41fd-8153-f0c8fc106af7" (UID: "018e2b8e-63bb-41fd-8153-f0c8fc106af7"). InnerVolumeSpecName "kube-api-access-c7w8k". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 28 21:01:07 crc kubenswrapper[4746]: I0128 21:01:07.724025 4746 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/018e2b8e-63bb-41fd-8153-f0c8fc106af7-fernet-keys" (OuterVolumeSpecName: "fernet-keys") pod "018e2b8e-63bb-41fd-8153-f0c8fc106af7" (UID: "018e2b8e-63bb-41fd-8153-f0c8fc106af7"). InnerVolumeSpecName "fernet-keys". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 28 21:01:07 crc kubenswrapper[4746]: I0128 21:01:07.757094 4746 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/glance-default-internal-api-0"] Jan 28 21:01:07 crc kubenswrapper[4746]: I0128 21:01:07.779644 4746 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/glance-default-internal-api-0"] Jan 28 21:01:07 crc kubenswrapper[4746]: I0128 21:01:07.790061 4746 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/glance-default-internal-api-0"] Jan 28 21:01:07 crc kubenswrapper[4746]: E0128 21:01:07.790577 4746 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="8355d3eb-8a59-4393-b04b-a44d9dd7824f" containerName="glance-log" Jan 28 21:01:07 crc kubenswrapper[4746]: I0128 21:01:07.790597 4746 state_mem.go:107] "Deleted CPUSet assignment" podUID="8355d3eb-8a59-4393-b04b-a44d9dd7824f" containerName="glance-log" Jan 28 21:01:07 crc kubenswrapper[4746]: E0128 21:01:07.790613 4746 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="0a9afff1-311d-4881-a211-e405af09d4a7" containerName="mariadb-account-create-update" Jan 28 21:01:07 crc kubenswrapper[4746]: I0128 21:01:07.790621 4746 state_mem.go:107] "Deleted CPUSet assignment" 
podUID="0a9afff1-311d-4881-a211-e405af09d4a7" containerName="mariadb-account-create-update" Jan 28 21:01:07 crc kubenswrapper[4746]: E0128 21:01:07.790637 4746 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="018e2b8e-63bb-41fd-8153-f0c8fc106af7" containerName="keystone-cron" Jan 28 21:01:07 crc kubenswrapper[4746]: I0128 21:01:07.790647 4746 state_mem.go:107] "Deleted CPUSet assignment" podUID="018e2b8e-63bb-41fd-8153-f0c8fc106af7" containerName="keystone-cron" Jan 28 21:01:07 crc kubenswrapper[4746]: E0128 21:01:07.790669 4746 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="4718d022-41c2-4684-9093-1f83e23dc367" containerName="mariadb-database-create" Jan 28 21:01:07 crc kubenswrapper[4746]: I0128 21:01:07.790678 4746 state_mem.go:107] "Deleted CPUSet assignment" podUID="4718d022-41c2-4684-9093-1f83e23dc367" containerName="mariadb-database-create" Jan 28 21:01:07 crc kubenswrapper[4746]: E0128 21:01:07.790702 4746 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="6b0f1a53-15ed-455c-80a3-7f92a3851538" containerName="mariadb-account-create-update" Jan 28 21:01:07 crc kubenswrapper[4746]: I0128 21:01:07.790711 4746 state_mem.go:107] "Deleted CPUSet assignment" podUID="6b0f1a53-15ed-455c-80a3-7f92a3851538" containerName="mariadb-account-create-update" Jan 28 21:01:07 crc kubenswrapper[4746]: E0128 21:01:07.790722 4746 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="8355d3eb-8a59-4393-b04b-a44d9dd7824f" containerName="glance-httpd" Jan 28 21:01:07 crc kubenswrapper[4746]: I0128 21:01:07.790730 4746 state_mem.go:107] "Deleted CPUSet assignment" podUID="8355d3eb-8a59-4393-b04b-a44d9dd7824f" containerName="glance-httpd" Jan 28 21:01:07 crc kubenswrapper[4746]: E0128 21:01:07.790742 4746 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="9f60a726-8cd2-4eca-b252-014d68fded35" containerName="mariadb-account-create-update" Jan 28 21:01:07 crc kubenswrapper[4746]: I0128 21:01:07.790750 
4746 state_mem.go:107] "Deleted CPUSet assignment" podUID="9f60a726-8cd2-4eca-b252-014d68fded35" containerName="mariadb-account-create-update" Jan 28 21:01:07 crc kubenswrapper[4746]: E0128 21:01:07.790779 4746 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="add7217e-e0fc-4079-a1c4-b3a328588a9a" containerName="mariadb-database-create" Jan 28 21:01:07 crc kubenswrapper[4746]: I0128 21:01:07.790789 4746 state_mem.go:107] "Deleted CPUSet assignment" podUID="add7217e-e0fc-4079-a1c4-b3a328588a9a" containerName="mariadb-database-create" Jan 28 21:01:07 crc kubenswrapper[4746]: E0128 21:01:07.790802 4746 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="eae3e347-2cb4-49b7-999d-abdfe9c3d2ac" containerName="mariadb-database-create" Jan 28 21:01:07 crc kubenswrapper[4746]: I0128 21:01:07.790809 4746 state_mem.go:107] "Deleted CPUSet assignment" podUID="eae3e347-2cb4-49b7-999d-abdfe9c3d2ac" containerName="mariadb-database-create" Jan 28 21:01:07 crc kubenswrapper[4746]: I0128 21:01:07.791024 4746 memory_manager.go:354] "RemoveStaleState removing state" podUID="add7217e-e0fc-4079-a1c4-b3a328588a9a" containerName="mariadb-database-create" Jan 28 21:01:07 crc kubenswrapper[4746]: I0128 21:01:07.791039 4746 memory_manager.go:354] "RemoveStaleState removing state" podUID="8355d3eb-8a59-4393-b04b-a44d9dd7824f" containerName="glance-httpd" Jan 28 21:01:07 crc kubenswrapper[4746]: I0128 21:01:07.791050 4746 memory_manager.go:354] "RemoveStaleState removing state" podUID="0a9afff1-311d-4881-a211-e405af09d4a7" containerName="mariadb-account-create-update" Jan 28 21:01:07 crc kubenswrapper[4746]: I0128 21:01:07.791061 4746 memory_manager.go:354] "RemoveStaleState removing state" podUID="6b0f1a53-15ed-455c-80a3-7f92a3851538" containerName="mariadb-account-create-update" Jan 28 21:01:07 crc kubenswrapper[4746]: I0128 21:01:07.791070 4746 memory_manager.go:354] "RemoveStaleState removing state" podUID="018e2b8e-63bb-41fd-8153-f0c8fc106af7" 
containerName="keystone-cron" Jan 28 21:01:07 crc kubenswrapper[4746]: I0128 21:01:07.791102 4746 memory_manager.go:354] "RemoveStaleState removing state" podUID="eae3e347-2cb4-49b7-999d-abdfe9c3d2ac" containerName="mariadb-database-create" Jan 28 21:01:07 crc kubenswrapper[4746]: I0128 21:01:07.791117 4746 memory_manager.go:354] "RemoveStaleState removing state" podUID="4718d022-41c2-4684-9093-1f83e23dc367" containerName="mariadb-database-create" Jan 28 21:01:07 crc kubenswrapper[4746]: I0128 21:01:07.791136 4746 memory_manager.go:354] "RemoveStaleState removing state" podUID="9f60a726-8cd2-4eca-b252-014d68fded35" containerName="mariadb-account-create-update" Jan 28 21:01:07 crc kubenswrapper[4746]: I0128 21:01:07.791154 4746 memory_manager.go:354] "RemoveStaleState removing state" podUID="8355d3eb-8a59-4393-b04b-a44d9dd7824f" containerName="glance-log" Jan 28 21:01:07 crc kubenswrapper[4746]: I0128 21:01:07.792526 4746 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/glance-default-internal-api-0" Jan 28 21:01:07 crc kubenswrapper[4746]: I0128 21:01:07.798366 4746 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"glance-default-internal-config-data" Jan 28 21:01:07 crc kubenswrapper[4746]: I0128 21:01:07.798597 4746 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-glance-default-internal-svc" Jan 28 21:01:07 crc kubenswrapper[4746]: I0128 21:01:07.807297 4746 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/018e2b8e-63bb-41fd-8153-f0c8fc106af7-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "018e2b8e-63bb-41fd-8153-f0c8fc106af7" (UID: "018e2b8e-63bb-41fd-8153-f0c8fc106af7"). InnerVolumeSpecName "combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 28 21:01:07 crc kubenswrapper[4746]: I0128 21:01:07.808439 4746 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-c7w8k\" (UniqueName: \"kubernetes.io/projected/018e2b8e-63bb-41fd-8153-f0c8fc106af7-kube-api-access-c7w8k\") on node \"crc\" DevicePath \"\"" Jan 28 21:01:07 crc kubenswrapper[4746]: I0128 21:01:07.808468 4746 reconciler_common.go:293] "Volume detached for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/018e2b8e-63bb-41fd-8153-f0c8fc106af7-fernet-keys\") on node \"crc\" DevicePath \"\"" Jan 28 21:01:07 crc kubenswrapper[4746]: I0128 21:01:07.808480 4746 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/018e2b8e-63bb-41fd-8153-f0c8fc106af7-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Jan 28 21:01:07 crc kubenswrapper[4746]: I0128 21:01:07.863218 4746 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-default-internal-api-0"] Jan 28 21:01:07 crc kubenswrapper[4746]: I0128 21:01:07.877627 4746 csi_attacher.go:630] kubernetes.io/csi: attacher.UnmountDevice STAGE_UNSTAGE_VOLUME capability not set. Skipping UnmountDevice... Jan 28 21:01:07 crc kubenswrapper[4746]: I0128 21:01:07.878265 4746 operation_generator.go:917] UnmountDevice succeeded for volume "pvc-d0cac59c-8041-46c0-a355-d5d8d1738a5d" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-d0cac59c-8041-46c0-a355-d5d8d1738a5d") on node "crc" Jan 28 21:01:07 crc kubenswrapper[4746]: I0128 21:01:07.881284 4746 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/018e2b8e-63bb-41fd-8153-f0c8fc106af7-config-data" (OuterVolumeSpecName: "config-data") pod "018e2b8e-63bb-41fd-8153-f0c8fc106af7" (UID: "018e2b8e-63bb-41fd-8153-f0c8fc106af7"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 28 21:01:07 crc kubenswrapper[4746]: I0128 21:01:07.915463 4746 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/ffdc41d1-2cd3-446e-8d3f-6e374a19f56a-combined-ca-bundle\") pod \"glance-default-internal-api-0\" (UID: \"ffdc41d1-2cd3-446e-8d3f-6e374a19f56a\") " pod="openstack/glance-default-internal-api-0" Jan 28 21:01:07 crc kubenswrapper[4746]: I0128 21:01:07.915571 4746 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/ffdc41d1-2cd3-446e-8d3f-6e374a19f56a-httpd-run\") pod \"glance-default-internal-api-0\" (UID: \"ffdc41d1-2cd3-446e-8d3f-6e374a19f56a\") " pod="openstack/glance-default-internal-api-0" Jan 28 21:01:07 crc kubenswrapper[4746]: I0128 21:01:07.915645 4746 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/ffdc41d1-2cd3-446e-8d3f-6e374a19f56a-logs\") pod \"glance-default-internal-api-0\" (UID: \"ffdc41d1-2cd3-446e-8d3f-6e374a19f56a\") " pod="openstack/glance-default-internal-api-0" Jan 28 21:01:07 crc kubenswrapper[4746]: I0128 21:01:07.915676 4746 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/ffdc41d1-2cd3-446e-8d3f-6e374a19f56a-internal-tls-certs\") pod \"glance-default-internal-api-0\" (UID: \"ffdc41d1-2cd3-446e-8d3f-6e374a19f56a\") " pod="openstack/glance-default-internal-api-0" Jan 28 21:01:07 crc kubenswrapper[4746]: I0128 21:01:07.915742 4746 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/ffdc41d1-2cd3-446e-8d3f-6e374a19f56a-config-data\") pod \"glance-default-internal-api-0\" (UID: 
\"ffdc41d1-2cd3-446e-8d3f-6e374a19f56a\") " pod="openstack/glance-default-internal-api-0" Jan 28 21:01:07 crc kubenswrapper[4746]: I0128 21:01:07.915800 4746 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-d0cac59c-8041-46c0-a355-d5d8d1738a5d\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-d0cac59c-8041-46c0-a355-d5d8d1738a5d\") pod \"glance-default-internal-api-0\" (UID: \"ffdc41d1-2cd3-446e-8d3f-6e374a19f56a\") " pod="openstack/glance-default-internal-api-0" Jan 28 21:01:07 crc kubenswrapper[4746]: I0128 21:01:07.915828 4746 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-5q5mm\" (UniqueName: \"kubernetes.io/projected/ffdc41d1-2cd3-446e-8d3f-6e374a19f56a-kube-api-access-5q5mm\") pod \"glance-default-internal-api-0\" (UID: \"ffdc41d1-2cd3-446e-8d3f-6e374a19f56a\") " pod="openstack/glance-default-internal-api-0" Jan 28 21:01:07 crc kubenswrapper[4746]: I0128 21:01:07.915868 4746 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/ffdc41d1-2cd3-446e-8d3f-6e374a19f56a-scripts\") pod \"glance-default-internal-api-0\" (UID: \"ffdc41d1-2cd3-446e-8d3f-6e374a19f56a\") " pod="openstack/glance-default-internal-api-0" Jan 28 21:01:07 crc kubenswrapper[4746]: I0128 21:01:07.915997 4746 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/018e2b8e-63bb-41fd-8153-f0c8fc106af7-config-data\") on node \"crc\" DevicePath \"\"" Jan 28 21:01:07 crc kubenswrapper[4746]: I0128 21:01:07.992005 4746 csi_attacher.go:380] kubernetes.io/csi: attacher.MountDevice STAGE_UNSTAGE_VOLUME capability not set. Skipping MountDevice... 
Jan 28 21:01:07 crc kubenswrapper[4746]: I0128 21:01:07.992048 4746 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"pvc-d0cac59c-8041-46c0-a355-d5d8d1738a5d\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-d0cac59c-8041-46c0-a355-d5d8d1738a5d\") pod \"glance-default-internal-api-0\" (UID: \"ffdc41d1-2cd3-446e-8d3f-6e374a19f56a\") device mount path \"/var/lib/kubelet/plugins/kubernetes.io/csi/kubevirt.io.hostpath-provisioner/04ac8a11379bea80fa995f6a81e21b4d865afcddeb1b26b6c4ebff6fe431a0b6/globalmount\"" pod="openstack/glance-default-internal-api-0" Jan 28 21:01:08 crc kubenswrapper[4746]: I0128 21:01:08.031168 4746 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/ffdc41d1-2cd3-446e-8d3f-6e374a19f56a-combined-ca-bundle\") pod \"glance-default-internal-api-0\" (UID: \"ffdc41d1-2cd3-446e-8d3f-6e374a19f56a\") " pod="openstack/glance-default-internal-api-0" Jan 28 21:01:08 crc kubenswrapper[4746]: I0128 21:01:08.031598 4746 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/ffdc41d1-2cd3-446e-8d3f-6e374a19f56a-httpd-run\") pod \"glance-default-internal-api-0\" (UID: \"ffdc41d1-2cd3-446e-8d3f-6e374a19f56a\") " pod="openstack/glance-default-internal-api-0" Jan 28 21:01:08 crc kubenswrapper[4746]: I0128 21:01:08.031709 4746 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/ffdc41d1-2cd3-446e-8d3f-6e374a19f56a-logs\") pod \"glance-default-internal-api-0\" (UID: \"ffdc41d1-2cd3-446e-8d3f-6e374a19f56a\") " pod="openstack/glance-default-internal-api-0" Jan 28 21:01:08 crc kubenswrapper[4746]: I0128 21:01:08.031850 4746 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"internal-tls-certs\" (UniqueName: 
\"kubernetes.io/secret/ffdc41d1-2cd3-446e-8d3f-6e374a19f56a-internal-tls-certs\") pod \"glance-default-internal-api-0\" (UID: \"ffdc41d1-2cd3-446e-8d3f-6e374a19f56a\") " pod="openstack/glance-default-internal-api-0" Jan 28 21:01:08 crc kubenswrapper[4746]: I0128 21:01:08.031955 4746 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/ffdc41d1-2cd3-446e-8d3f-6e374a19f56a-config-data\") pod \"glance-default-internal-api-0\" (UID: \"ffdc41d1-2cd3-446e-8d3f-6e374a19f56a\") " pod="openstack/glance-default-internal-api-0" Jan 28 21:01:08 crc kubenswrapper[4746]: I0128 21:01:08.032101 4746 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-5q5mm\" (UniqueName: \"kubernetes.io/projected/ffdc41d1-2cd3-446e-8d3f-6e374a19f56a-kube-api-access-5q5mm\") pod \"glance-default-internal-api-0\" (UID: \"ffdc41d1-2cd3-446e-8d3f-6e374a19f56a\") " pod="openstack/glance-default-internal-api-0" Jan 28 21:01:08 crc kubenswrapper[4746]: I0128 21:01:08.032213 4746 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/ffdc41d1-2cd3-446e-8d3f-6e374a19f56a-scripts\") pod \"glance-default-internal-api-0\" (UID: \"ffdc41d1-2cd3-446e-8d3f-6e374a19f56a\") " pod="openstack/glance-default-internal-api-0" Jan 28 21:01:08 crc kubenswrapper[4746]: I0128 21:01:08.033937 4746 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/ffdc41d1-2cd3-446e-8d3f-6e374a19f56a-logs\") pod \"glance-default-internal-api-0\" (UID: \"ffdc41d1-2cd3-446e-8d3f-6e374a19f56a\") " pod="openstack/glance-default-internal-api-0" Jan 28 21:01:08 crc kubenswrapper[4746]: I0128 21:01:08.050310 4746 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/ffdc41d1-2cd3-446e-8d3f-6e374a19f56a-config-data\") pod 
\"glance-default-internal-api-0\" (UID: \"ffdc41d1-2cd3-446e-8d3f-6e374a19f56a\") " pod="openstack/glance-default-internal-api-0" Jan 28 21:01:08 crc kubenswrapper[4746]: I0128 21:01:08.050682 4746 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/ffdc41d1-2cd3-446e-8d3f-6e374a19f56a-httpd-run\") pod \"glance-default-internal-api-0\" (UID: \"ffdc41d1-2cd3-446e-8d3f-6e374a19f56a\") " pod="openstack/glance-default-internal-api-0" Jan 28 21:01:08 crc kubenswrapper[4746]: I0128 21:01:08.051454 4746 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/ffdc41d1-2cd3-446e-8d3f-6e374a19f56a-scripts\") pod \"glance-default-internal-api-0\" (UID: \"ffdc41d1-2cd3-446e-8d3f-6e374a19f56a\") " pod="openstack/glance-default-internal-api-0" Jan 28 21:01:08 crc kubenswrapper[4746]: I0128 21:01:08.051633 4746 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/ffdc41d1-2cd3-446e-8d3f-6e374a19f56a-combined-ca-bundle\") pod \"glance-default-internal-api-0\" (UID: \"ffdc41d1-2cd3-446e-8d3f-6e374a19f56a\") " pod="openstack/glance-default-internal-api-0" Jan 28 21:01:08 crc kubenswrapper[4746]: I0128 21:01:08.068479 4746 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/ffdc41d1-2cd3-446e-8d3f-6e374a19f56a-internal-tls-certs\") pod \"glance-default-internal-api-0\" (UID: \"ffdc41d1-2cd3-446e-8d3f-6e374a19f56a\") " pod="openstack/glance-default-internal-api-0" Jan 28 21:01:08 crc kubenswrapper[4746]: I0128 21:01:08.090863 4746 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-5q5mm\" (UniqueName: \"kubernetes.io/projected/ffdc41d1-2cd3-446e-8d3f-6e374a19f56a-kube-api-access-5q5mm\") pod \"glance-default-internal-api-0\" (UID: \"ffdc41d1-2cd3-446e-8d3f-6e374a19f56a\") " 
pod="openstack/glance-default-internal-api-0" Jan 28 21:01:08 crc kubenswrapper[4746]: I0128 21:01:08.192026 4746 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"pvc-d0cac59c-8041-46c0-a355-d5d8d1738a5d\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-d0cac59c-8041-46c0-a355-d5d8d1738a5d\") pod \"glance-default-internal-api-0\" (UID: \"ffdc41d1-2cd3-446e-8d3f-6e374a19f56a\") " pod="openstack/glance-default-internal-api-0" Jan 28 21:01:08 crc kubenswrapper[4746]: I0128 21:01:08.266508 4746 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/glance-default-internal-api-0" Jan 28 21:01:08 crc kubenswrapper[4746]: I0128 21:01:08.391455 4746 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-cron-29493901-2dcqp" event={"ID":"018e2b8e-63bb-41fd-8153-f0c8fc106af7","Type":"ContainerDied","Data":"26bffe4a0a61b171f2dde2b3eefcaddd8044f5071527405ed3183a277fe0f4a9"} Jan 28 21:01:08 crc kubenswrapper[4746]: I0128 21:01:08.391497 4746 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="26bffe4a0a61b171f2dde2b3eefcaddd8044f5071527405ed3183a277fe0f4a9" Jan 28 21:01:08 crc kubenswrapper[4746]: I0128 21:01:08.391586 4746 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/keystone-cron-29493901-2dcqp" Jan 28 21:01:08 crc kubenswrapper[4746]: I0128 21:01:08.414617 4746 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"a8705e73-fddb-48b4-b1e2-fd3e40a001cb","Type":"ContainerStarted","Data":"e3daf895d75c3e9171bac119b3f24a81d3e51ec1f50f69e024a97553df93e640"} Jan 28 21:01:08 crc kubenswrapper[4746]: I0128 21:01:08.414897 4746 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/ceilometer-0" Jan 28 21:01:08 crc kubenswrapper[4746]: I0128 21:01:08.414953 4746 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="a8705e73-fddb-48b4-b1e2-fd3e40a001cb" containerName="sg-core" containerID="cri-o://44628f9cf49f9de7f54565f66b9289e264cbc17ae8325e0426f5dd73a3cacf38" gracePeriod=30 Jan 28 21:01:08 crc kubenswrapper[4746]: I0128 21:01:08.415035 4746 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="a8705e73-fddb-48b4-b1e2-fd3e40a001cb" containerName="proxy-httpd" containerID="cri-o://e3daf895d75c3e9171bac119b3f24a81d3e51ec1f50f69e024a97553df93e640" gracePeriod=30 Jan 28 21:01:08 crc kubenswrapper[4746]: I0128 21:01:08.415273 4746 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="a8705e73-fddb-48b4-b1e2-fd3e40a001cb" containerName="ceilometer-notification-agent" containerID="cri-o://67ead006986d589f8b439f4e84d5167c4c0324d7d33954b044f217d7c3804083" gracePeriod=30 Jan 28 21:01:08 crc kubenswrapper[4746]: I0128 21:01:08.415335 4746 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="a8705e73-fddb-48b4-b1e2-fd3e40a001cb" containerName="ceilometer-central-agent" containerID="cri-o://a8e5786bdd005c4e878b83dd2881485458e62698a1aa4bac9b1a63762480c392" gracePeriod=30 Jan 28 21:01:08 crc kubenswrapper[4746]: I0128 21:01:08.469315 4746 
pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/ceilometer-0" podStartSLOduration=11.12765832 podStartE2EDuration="16.469299453s" podCreationTimestamp="2026-01-28 21:00:52 +0000 UTC" firstStartedPulling="2026-01-28 21:01:01.980913504 +0000 UTC m=+1289.937099858" lastFinishedPulling="2026-01-28 21:01:07.322554637 +0000 UTC m=+1295.278740991" observedRunningTime="2026-01-28 21:01:08.455618005 +0000 UTC m=+1296.411804359" watchObservedRunningTime="2026-01-28 21:01:08.469299453 +0000 UTC m=+1296.425485807" Jan 28 21:01:08 crc kubenswrapper[4746]: I0128 21:01:08.882211 4746 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="8355d3eb-8a59-4393-b04b-a44d9dd7824f" path="/var/lib/kubelet/pods/8355d3eb-8a59-4393-b04b-a44d9dd7824f/volumes" Jan 28 21:01:08 crc kubenswrapper[4746]: I0128 21:01:08.883227 4746 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-default-internal-api-0"] Jan 28 21:01:09 crc kubenswrapper[4746]: I0128 21:01:09.333725 4746 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-cell0-conductor-db-sync-z7sfr"] Jan 28 21:01:09 crc kubenswrapper[4746]: I0128 21:01:09.335405 4746 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell0-conductor-db-sync-z7sfr" Jan 28 21:01:09 crc kubenswrapper[4746]: I0128 21:01:09.341572 4746 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-cell0-conductor-config-data" Jan 28 21:01:09 crc kubenswrapper[4746]: I0128 21:01:09.342204 4746 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-nova-dockercfg-9m4cq" Jan 28 21:01:09 crc kubenswrapper[4746]: I0128 21:01:09.346518 4746 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-cell0-conductor-scripts" Jan 28 21:01:09 crc kubenswrapper[4746]: I0128 21:01:09.354608 4746 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell0-conductor-db-sync-z7sfr"] Jan 28 21:01:09 crc kubenswrapper[4746]: I0128 21:01:09.432324 4746 generic.go:334] "Generic (PLEG): container finished" podID="a8705e73-fddb-48b4-b1e2-fd3e40a001cb" containerID="e3daf895d75c3e9171bac119b3f24a81d3e51ec1f50f69e024a97553df93e640" exitCode=0 Jan 28 21:01:09 crc kubenswrapper[4746]: I0128 21:01:09.432354 4746 generic.go:334] "Generic (PLEG): container finished" podID="a8705e73-fddb-48b4-b1e2-fd3e40a001cb" containerID="44628f9cf49f9de7f54565f66b9289e264cbc17ae8325e0426f5dd73a3cacf38" exitCode=2 Jan 28 21:01:09 crc kubenswrapper[4746]: I0128 21:01:09.432364 4746 generic.go:334] "Generic (PLEG): container finished" podID="a8705e73-fddb-48b4-b1e2-fd3e40a001cb" containerID="67ead006986d589f8b439f4e84d5167c4c0324d7d33954b044f217d7c3804083" exitCode=0 Jan 28 21:01:09 crc kubenswrapper[4746]: I0128 21:01:09.432410 4746 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"a8705e73-fddb-48b4-b1e2-fd3e40a001cb","Type":"ContainerDied","Data":"e3daf895d75c3e9171bac119b3f24a81d3e51ec1f50f69e024a97553df93e640"} Jan 28 21:01:09 crc kubenswrapper[4746]: I0128 21:01:09.432441 4746 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" 
event={"ID":"a8705e73-fddb-48b4-b1e2-fd3e40a001cb","Type":"ContainerDied","Data":"44628f9cf49f9de7f54565f66b9289e264cbc17ae8325e0426f5dd73a3cacf38"} Jan 28 21:01:09 crc kubenswrapper[4746]: I0128 21:01:09.432455 4746 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"a8705e73-fddb-48b4-b1e2-fd3e40a001cb","Type":"ContainerDied","Data":"67ead006986d589f8b439f4e84d5167c4c0324d7d33954b044f217d7c3804083"} Jan 28 21:01:09 crc kubenswrapper[4746]: I0128 21:01:09.434121 4746 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-internal-api-0" event={"ID":"ffdc41d1-2cd3-446e-8d3f-6e374a19f56a","Type":"ContainerStarted","Data":"0c843f7b1f1bef605719f70896912c18d9fed7ae6f8bff0f1c99b5243bfdca52"} Jan 28 21:01:09 crc kubenswrapper[4746]: I0128 21:01:09.464878 4746 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-7c44n\" (UniqueName: \"kubernetes.io/projected/28e5e616-1a83-4053-8231-e3763118ca8e-kube-api-access-7c44n\") pod \"nova-cell0-conductor-db-sync-z7sfr\" (UID: \"28e5e616-1a83-4053-8231-e3763118ca8e\") " pod="openstack/nova-cell0-conductor-db-sync-z7sfr" Jan 28 21:01:09 crc kubenswrapper[4746]: I0128 21:01:09.465051 4746 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/28e5e616-1a83-4053-8231-e3763118ca8e-scripts\") pod \"nova-cell0-conductor-db-sync-z7sfr\" (UID: \"28e5e616-1a83-4053-8231-e3763118ca8e\") " pod="openstack/nova-cell0-conductor-db-sync-z7sfr" Jan 28 21:01:09 crc kubenswrapper[4746]: I0128 21:01:09.465503 4746 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/28e5e616-1a83-4053-8231-e3763118ca8e-combined-ca-bundle\") pod \"nova-cell0-conductor-db-sync-z7sfr\" (UID: \"28e5e616-1a83-4053-8231-e3763118ca8e\") " 
pod="openstack/nova-cell0-conductor-db-sync-z7sfr" Jan 28 21:01:09 crc kubenswrapper[4746]: I0128 21:01:09.465589 4746 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/28e5e616-1a83-4053-8231-e3763118ca8e-config-data\") pod \"nova-cell0-conductor-db-sync-z7sfr\" (UID: \"28e5e616-1a83-4053-8231-e3763118ca8e\") " pod="openstack/nova-cell0-conductor-db-sync-z7sfr" Jan 28 21:01:09 crc kubenswrapper[4746]: I0128 21:01:09.567614 4746 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-7c44n\" (UniqueName: \"kubernetes.io/projected/28e5e616-1a83-4053-8231-e3763118ca8e-kube-api-access-7c44n\") pod \"nova-cell0-conductor-db-sync-z7sfr\" (UID: \"28e5e616-1a83-4053-8231-e3763118ca8e\") " pod="openstack/nova-cell0-conductor-db-sync-z7sfr" Jan 28 21:01:09 crc kubenswrapper[4746]: I0128 21:01:09.567707 4746 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/28e5e616-1a83-4053-8231-e3763118ca8e-scripts\") pod \"nova-cell0-conductor-db-sync-z7sfr\" (UID: \"28e5e616-1a83-4053-8231-e3763118ca8e\") " pod="openstack/nova-cell0-conductor-db-sync-z7sfr" Jan 28 21:01:09 crc kubenswrapper[4746]: I0128 21:01:09.567793 4746 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/28e5e616-1a83-4053-8231-e3763118ca8e-combined-ca-bundle\") pod \"nova-cell0-conductor-db-sync-z7sfr\" (UID: \"28e5e616-1a83-4053-8231-e3763118ca8e\") " pod="openstack/nova-cell0-conductor-db-sync-z7sfr" Jan 28 21:01:09 crc kubenswrapper[4746]: I0128 21:01:09.567835 4746 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/28e5e616-1a83-4053-8231-e3763118ca8e-config-data\") pod \"nova-cell0-conductor-db-sync-z7sfr\" (UID: 
\"28e5e616-1a83-4053-8231-e3763118ca8e\") " pod="openstack/nova-cell0-conductor-db-sync-z7sfr" Jan 28 21:01:09 crc kubenswrapper[4746]: I0128 21:01:09.571164 4746 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/28e5e616-1a83-4053-8231-e3763118ca8e-config-data\") pod \"nova-cell0-conductor-db-sync-z7sfr\" (UID: \"28e5e616-1a83-4053-8231-e3763118ca8e\") " pod="openstack/nova-cell0-conductor-db-sync-z7sfr" Jan 28 21:01:09 crc kubenswrapper[4746]: I0128 21:01:09.572651 4746 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/28e5e616-1a83-4053-8231-e3763118ca8e-combined-ca-bundle\") pod \"nova-cell0-conductor-db-sync-z7sfr\" (UID: \"28e5e616-1a83-4053-8231-e3763118ca8e\") " pod="openstack/nova-cell0-conductor-db-sync-z7sfr" Jan 28 21:01:09 crc kubenswrapper[4746]: I0128 21:01:09.578254 4746 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/28e5e616-1a83-4053-8231-e3763118ca8e-scripts\") pod \"nova-cell0-conductor-db-sync-z7sfr\" (UID: \"28e5e616-1a83-4053-8231-e3763118ca8e\") " pod="openstack/nova-cell0-conductor-db-sync-z7sfr" Jan 28 21:01:09 crc kubenswrapper[4746]: I0128 21:01:09.586714 4746 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-7c44n\" (UniqueName: \"kubernetes.io/projected/28e5e616-1a83-4053-8231-e3763118ca8e-kube-api-access-7c44n\") pod \"nova-cell0-conductor-db-sync-z7sfr\" (UID: \"28e5e616-1a83-4053-8231-e3763118ca8e\") " pod="openstack/nova-cell0-conductor-db-sync-z7sfr" Jan 28 21:01:09 crc kubenswrapper[4746]: I0128 21:01:09.665774 4746 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell0-conductor-db-sync-z7sfr" Jan 28 21:01:10 crc kubenswrapper[4746]: I0128 21:01:10.179391 4746 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell0-conductor-db-sync-z7sfr"] Jan 28 21:01:10 crc kubenswrapper[4746]: I0128 21:01:10.477706 4746 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-internal-api-0" event={"ID":"ffdc41d1-2cd3-446e-8d3f-6e374a19f56a","Type":"ContainerStarted","Data":"89a3e42742eeb4a58a852ff77805409f61ceb0c835684877a21e5d4058d7d2ef"} Jan 28 21:01:10 crc kubenswrapper[4746]: I0128 21:01:10.478114 4746 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-internal-api-0" event={"ID":"ffdc41d1-2cd3-446e-8d3f-6e374a19f56a","Type":"ContainerStarted","Data":"e92bb1f8aeebd5b2d22755680c02ea9573b93d0c948619417cbb64d6e13d2a66"} Jan 28 21:01:10 crc kubenswrapper[4746]: I0128 21:01:10.485654 4746 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-conductor-db-sync-z7sfr" event={"ID":"28e5e616-1a83-4053-8231-e3763118ca8e","Type":"ContainerStarted","Data":"663482718d34b1c50485ff975b8d342a767b3370fe90f6b001d58cfd12b29e89"} Jan 28 21:01:10 crc kubenswrapper[4746]: I0128 21:01:10.514036 4746 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/glance-default-internal-api-0" podStartSLOduration=3.514011423 podStartE2EDuration="3.514011423s" podCreationTimestamp="2026-01-28 21:01:07 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-28 21:01:10.501516367 +0000 UTC m=+1298.457702731" watchObservedRunningTime="2026-01-28 21:01:10.514011423 +0000 UTC m=+1298.470197777" Jan 28 21:01:12 crc kubenswrapper[4746]: I0128 21:01:12.060401 4746 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/glance-default-external-api-0" Jan 28 21:01:12 crc kubenswrapper[4746]: 
I0128 21:01:12.060659 4746 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/glance-default-external-api-0" Jan 28 21:01:12 crc kubenswrapper[4746]: I0128 21:01:12.102864 4746 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/glance-default-external-api-0" Jan 28 21:01:12 crc kubenswrapper[4746]: I0128 21:01:12.149679 4746 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/glance-default-external-api-0" Jan 28 21:01:12 crc kubenswrapper[4746]: I0128 21:01:12.309925 4746 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0" Jan 28 21:01:12 crc kubenswrapper[4746]: I0128 21:01:12.338367 4746 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/a8705e73-fddb-48b4-b1e2-fd3e40a001cb-scripts\") pod \"a8705e73-fddb-48b4-b1e2-fd3e40a001cb\" (UID: \"a8705e73-fddb-48b4-b1e2-fd3e40a001cb\") " Jan 28 21:01:12 crc kubenswrapper[4746]: I0128 21:01:12.338421 4746 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/a8705e73-fddb-48b4-b1e2-fd3e40a001cb-config-data\") pod \"a8705e73-fddb-48b4-b1e2-fd3e40a001cb\" (UID: \"a8705e73-fddb-48b4-b1e2-fd3e40a001cb\") " Jan 28 21:01:12 crc kubenswrapper[4746]: I0128 21:01:12.338495 4746 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-nlg52\" (UniqueName: \"kubernetes.io/projected/a8705e73-fddb-48b4-b1e2-fd3e40a001cb-kube-api-access-nlg52\") pod \"a8705e73-fddb-48b4-b1e2-fd3e40a001cb\" (UID: \"a8705e73-fddb-48b4-b1e2-fd3e40a001cb\") " Jan 28 21:01:12 crc kubenswrapper[4746]: I0128 21:01:12.338528 4746 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/a8705e73-fddb-48b4-b1e2-fd3e40a001cb-combined-ca-bundle\") 
pod \"a8705e73-fddb-48b4-b1e2-fd3e40a001cb\" (UID: \"a8705e73-fddb-48b4-b1e2-fd3e40a001cb\") " Jan 28 21:01:12 crc kubenswrapper[4746]: I0128 21:01:12.338595 4746 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/a8705e73-fddb-48b4-b1e2-fd3e40a001cb-run-httpd\") pod \"a8705e73-fddb-48b4-b1e2-fd3e40a001cb\" (UID: \"a8705e73-fddb-48b4-b1e2-fd3e40a001cb\") " Jan 28 21:01:12 crc kubenswrapper[4746]: I0128 21:01:12.338632 4746 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/a8705e73-fddb-48b4-b1e2-fd3e40a001cb-log-httpd\") pod \"a8705e73-fddb-48b4-b1e2-fd3e40a001cb\" (UID: \"a8705e73-fddb-48b4-b1e2-fd3e40a001cb\") " Jan 28 21:01:12 crc kubenswrapper[4746]: I0128 21:01:12.338666 4746 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/a8705e73-fddb-48b4-b1e2-fd3e40a001cb-sg-core-conf-yaml\") pod \"a8705e73-fddb-48b4-b1e2-fd3e40a001cb\" (UID: \"a8705e73-fddb-48b4-b1e2-fd3e40a001cb\") " Jan 28 21:01:12 crc kubenswrapper[4746]: I0128 21:01:12.339171 4746 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/a8705e73-fddb-48b4-b1e2-fd3e40a001cb-run-httpd" (OuterVolumeSpecName: "run-httpd") pod "a8705e73-fddb-48b4-b1e2-fd3e40a001cb" (UID: "a8705e73-fddb-48b4-b1e2-fd3e40a001cb"). InnerVolumeSpecName "run-httpd". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 28 21:01:12 crc kubenswrapper[4746]: I0128 21:01:12.339613 4746 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/a8705e73-fddb-48b4-b1e2-fd3e40a001cb-log-httpd" (OuterVolumeSpecName: "log-httpd") pod "a8705e73-fddb-48b4-b1e2-fd3e40a001cb" (UID: "a8705e73-fddb-48b4-b1e2-fd3e40a001cb"). InnerVolumeSpecName "log-httpd". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Jan 28 21:01:12 crc kubenswrapper[4746]: I0128 21:01:12.340899 4746 reconciler_common.go:293] "Volume detached for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/a8705e73-fddb-48b4-b1e2-fd3e40a001cb-run-httpd\") on node \"crc\" DevicePath \"\""
Jan 28 21:01:12 crc kubenswrapper[4746]: I0128 21:01:12.340931 4746 reconciler_common.go:293] "Volume detached for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/a8705e73-fddb-48b4-b1e2-fd3e40a001cb-log-httpd\") on node \"crc\" DevicePath \"\""
Jan 28 21:01:12 crc kubenswrapper[4746]: I0128 21:01:12.354625 4746 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/a8705e73-fddb-48b4-b1e2-fd3e40a001cb-scripts" (OuterVolumeSpecName: "scripts") pod "a8705e73-fddb-48b4-b1e2-fd3e40a001cb" (UID: "a8705e73-fddb-48b4-b1e2-fd3e40a001cb"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue ""
Jan 28 21:01:12 crc kubenswrapper[4746]: I0128 21:01:12.358372 4746 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/a8705e73-fddb-48b4-b1e2-fd3e40a001cb-kube-api-access-nlg52" (OuterVolumeSpecName: "kube-api-access-nlg52") pod "a8705e73-fddb-48b4-b1e2-fd3e40a001cb" (UID: "a8705e73-fddb-48b4-b1e2-fd3e40a001cb"). InnerVolumeSpecName "kube-api-access-nlg52". PluginName "kubernetes.io/projected", VolumeGidValue ""
Jan 28 21:01:12 crc kubenswrapper[4746]: I0128 21:01:12.411706 4746 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/a8705e73-fddb-48b4-b1e2-fd3e40a001cb-sg-core-conf-yaml" (OuterVolumeSpecName: "sg-core-conf-yaml") pod "a8705e73-fddb-48b4-b1e2-fd3e40a001cb" (UID: "a8705e73-fddb-48b4-b1e2-fd3e40a001cb"). InnerVolumeSpecName "sg-core-conf-yaml". PluginName "kubernetes.io/secret", VolumeGidValue ""
Jan 28 21:01:12 crc kubenswrapper[4746]: I0128 21:01:12.437782 4746 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/a8705e73-fddb-48b4-b1e2-fd3e40a001cb-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "a8705e73-fddb-48b4-b1e2-fd3e40a001cb" (UID: "a8705e73-fddb-48b4-b1e2-fd3e40a001cb"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue ""
Jan 28 21:01:12 crc kubenswrapper[4746]: I0128 21:01:12.444722 4746 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/a8705e73-fddb-48b4-b1e2-fd3e40a001cb-combined-ca-bundle\") on node \"crc\" DevicePath \"\""
Jan 28 21:01:12 crc kubenswrapper[4746]: I0128 21:01:12.444763 4746 reconciler_common.go:293] "Volume detached for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/a8705e73-fddb-48b4-b1e2-fd3e40a001cb-sg-core-conf-yaml\") on node \"crc\" DevicePath \"\""
Jan 28 21:01:12 crc kubenswrapper[4746]: I0128 21:01:12.444779 4746 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/a8705e73-fddb-48b4-b1e2-fd3e40a001cb-scripts\") on node \"crc\" DevicePath \"\""
Jan 28 21:01:12 crc kubenswrapper[4746]: I0128 21:01:12.444792 4746 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-nlg52\" (UniqueName: \"kubernetes.io/projected/a8705e73-fddb-48b4-b1e2-fd3e40a001cb-kube-api-access-nlg52\") on node \"crc\" DevicePath \"\""
Jan 28 21:01:12 crc kubenswrapper[4746]: I0128 21:01:12.499961 4746 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/a8705e73-fddb-48b4-b1e2-fd3e40a001cb-config-data" (OuterVolumeSpecName: "config-data") pod "a8705e73-fddb-48b4-b1e2-fd3e40a001cb" (UID: "a8705e73-fddb-48b4-b1e2-fd3e40a001cb"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue ""
Jan 28 21:01:12 crc kubenswrapper[4746]: I0128 21:01:12.511607 4746 generic.go:334] "Generic (PLEG): container finished" podID="a8705e73-fddb-48b4-b1e2-fd3e40a001cb" containerID="a8e5786bdd005c4e878b83dd2881485458e62698a1aa4bac9b1a63762480c392" exitCode=0
Jan 28 21:01:12 crc kubenswrapper[4746]: I0128 21:01:12.511882 4746 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"a8705e73-fddb-48b4-b1e2-fd3e40a001cb","Type":"ContainerDied","Data":"a8e5786bdd005c4e878b83dd2881485458e62698a1aa4bac9b1a63762480c392"}
Jan 28 21:01:12 crc kubenswrapper[4746]: I0128 21:01:12.511946 4746 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"a8705e73-fddb-48b4-b1e2-fd3e40a001cb","Type":"ContainerDied","Data":"5ae55f14c86d0c04fb99527d531028fef55485fb1774a975b85c99d33c46a39a"}
Jan 28 21:01:12 crc kubenswrapper[4746]: I0128 21:01:12.511965 4746 scope.go:117] "RemoveContainer" containerID="e3daf895d75c3e9171bac119b3f24a81d3e51ec1f50f69e024a97553df93e640"
Jan 28 21:01:12 crc kubenswrapper[4746]: I0128 21:01:12.512138 4746 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0"
Jan 28 21:01:12 crc kubenswrapper[4746]: I0128 21:01:12.513486 4746 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/glance-default-external-api-0"
Jan 28 21:01:12 crc kubenswrapper[4746]: I0128 21:01:12.513522 4746 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/glance-default-external-api-0"
Jan 28 21:01:12 crc kubenswrapper[4746]: I0128 21:01:12.547113 4746 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/a8705e73-fddb-48b4-b1e2-fd3e40a001cb-config-data\") on node \"crc\" DevicePath \"\""
Jan 28 21:01:12 crc kubenswrapper[4746]: I0128 21:01:12.552988 4746 scope.go:117] "RemoveContainer" containerID="44628f9cf49f9de7f54565f66b9289e264cbc17ae8325e0426f5dd73a3cacf38"
Jan 28 21:01:12 crc kubenswrapper[4746]: I0128 21:01:12.553624 4746 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ceilometer-0"]
Jan 28 21:01:12 crc kubenswrapper[4746]: I0128 21:01:12.572378 4746 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/ceilometer-0"]
Jan 28 21:01:12 crc kubenswrapper[4746]: I0128 21:01:12.585192 4746 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ceilometer-0"]
Jan 28 21:01:12 crc kubenswrapper[4746]: E0128 21:01:12.585712 4746 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="a8705e73-fddb-48b4-b1e2-fd3e40a001cb" containerName="sg-core"
Jan 28 21:01:12 crc kubenswrapper[4746]: I0128 21:01:12.585736 4746 state_mem.go:107] "Deleted CPUSet assignment" podUID="a8705e73-fddb-48b4-b1e2-fd3e40a001cb" containerName="sg-core"
Jan 28 21:01:12 crc kubenswrapper[4746]: E0128 21:01:12.585756 4746 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="a8705e73-fddb-48b4-b1e2-fd3e40a001cb" containerName="ceilometer-central-agent"
Jan 28 21:01:12 crc kubenswrapper[4746]: I0128 21:01:12.585765 4746 state_mem.go:107] "Deleted CPUSet assignment" podUID="a8705e73-fddb-48b4-b1e2-fd3e40a001cb" containerName="ceilometer-central-agent"
Jan 28 21:01:12 crc kubenswrapper[4746]: E0128 21:01:12.585787 4746 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="a8705e73-fddb-48b4-b1e2-fd3e40a001cb" containerName="proxy-httpd"
Jan 28 21:01:12 crc kubenswrapper[4746]: I0128 21:01:12.585794 4746 state_mem.go:107] "Deleted CPUSet assignment" podUID="a8705e73-fddb-48b4-b1e2-fd3e40a001cb" containerName="proxy-httpd"
Jan 28 21:01:12 crc kubenswrapper[4746]: E0128 21:01:12.585828 4746 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="a8705e73-fddb-48b4-b1e2-fd3e40a001cb" containerName="ceilometer-notification-agent"
Jan 28 21:01:12 crc kubenswrapper[4746]: I0128 21:01:12.585835 4746 state_mem.go:107] "Deleted CPUSet assignment" podUID="a8705e73-fddb-48b4-b1e2-fd3e40a001cb" containerName="ceilometer-notification-agent"
Jan 28 21:01:12 crc kubenswrapper[4746]: I0128 21:01:12.586035 4746 memory_manager.go:354] "RemoveStaleState removing state" podUID="a8705e73-fddb-48b4-b1e2-fd3e40a001cb" containerName="ceilometer-notification-agent"
Jan 28 21:01:12 crc kubenswrapper[4746]: I0128 21:01:12.586053 4746 memory_manager.go:354] "RemoveStaleState removing state" podUID="a8705e73-fddb-48b4-b1e2-fd3e40a001cb" containerName="ceilometer-central-agent"
Jan 28 21:01:12 crc kubenswrapper[4746]: I0128 21:01:12.586092 4746 memory_manager.go:354] "RemoveStaleState removing state" podUID="a8705e73-fddb-48b4-b1e2-fd3e40a001cb" containerName="sg-core"
Jan 28 21:01:12 crc kubenswrapper[4746]: I0128 21:01:12.586103 4746 memory_manager.go:354] "RemoveStaleState removing state" podUID="a8705e73-fddb-48b4-b1e2-fd3e40a001cb" containerName="proxy-httpd"
Jan 28 21:01:12 crc kubenswrapper[4746]: I0128 21:01:12.588352 4746 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0"
Jan 28 21:01:12 crc kubenswrapper[4746]: I0128 21:01:12.591293 4746 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceilometer-config-data"
Jan 28 21:01:12 crc kubenswrapper[4746]: I0128 21:01:12.591465 4746 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceilometer-scripts"
Jan 28 21:01:12 crc kubenswrapper[4746]: I0128 21:01:12.597647 4746 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ceilometer-0"]
Jan 28 21:01:12 crc kubenswrapper[4746]: I0128 21:01:12.609603 4746 scope.go:117] "RemoveContainer" containerID="67ead006986d589f8b439f4e84d5167c4c0324d7d33954b044f217d7c3804083"
Jan 28 21:01:12 crc kubenswrapper[4746]: I0128 21:01:12.647967 4746 scope.go:117] "RemoveContainer" containerID="a8e5786bdd005c4e878b83dd2881485458e62698a1aa4bac9b1a63762480c392"
Jan 28 21:01:12 crc kubenswrapper[4746]: I0128 21:01:12.648607 4746 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/58306125-d35d-4800-aa80-fa0f20754887-run-httpd\") pod \"ceilometer-0\" (UID: \"58306125-d35d-4800-aa80-fa0f20754887\") " pod="openstack/ceilometer-0"
Jan 28 21:01:12 crc kubenswrapper[4746]: I0128 21:01:12.648680 4746 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/58306125-d35d-4800-aa80-fa0f20754887-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"58306125-d35d-4800-aa80-fa0f20754887\") " pod="openstack/ceilometer-0"
Jan 28 21:01:12 crc kubenswrapper[4746]: I0128 21:01:12.648703 4746 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/58306125-d35d-4800-aa80-fa0f20754887-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"58306125-d35d-4800-aa80-fa0f20754887\") " pod="openstack/ceilometer-0"
Jan 28 21:01:12 crc kubenswrapper[4746]: I0128 21:01:12.648794 4746 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/58306125-d35d-4800-aa80-fa0f20754887-log-httpd\") pod \"ceilometer-0\" (UID: \"58306125-d35d-4800-aa80-fa0f20754887\") " pod="openstack/ceilometer-0"
Jan 28 21:01:12 crc kubenswrapper[4746]: I0128 21:01:12.648895 4746 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/58306125-d35d-4800-aa80-fa0f20754887-config-data\") pod \"ceilometer-0\" (UID: \"58306125-d35d-4800-aa80-fa0f20754887\") " pod="openstack/ceilometer-0"
Jan 28 21:01:12 crc kubenswrapper[4746]: I0128 21:01:12.648922 4746 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/58306125-d35d-4800-aa80-fa0f20754887-scripts\") pod \"ceilometer-0\" (UID: \"58306125-d35d-4800-aa80-fa0f20754887\") " pod="openstack/ceilometer-0"
Jan 28 21:01:12 crc kubenswrapper[4746]: I0128 21:01:12.648998 4746 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-nxb4h\" (UniqueName: \"kubernetes.io/projected/58306125-d35d-4800-aa80-fa0f20754887-kube-api-access-nxb4h\") pod \"ceilometer-0\" (UID: \"58306125-d35d-4800-aa80-fa0f20754887\") " pod="openstack/ceilometer-0"
Jan 28 21:01:12 crc kubenswrapper[4746]: I0128 21:01:12.692841 4746 scope.go:117] "RemoveContainer" containerID="e3daf895d75c3e9171bac119b3f24a81d3e51ec1f50f69e024a97553df93e640"
Jan 28 21:01:12 crc kubenswrapper[4746]: E0128 21:01:12.693254 4746 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"e3daf895d75c3e9171bac119b3f24a81d3e51ec1f50f69e024a97553df93e640\": container with ID starting with e3daf895d75c3e9171bac119b3f24a81d3e51ec1f50f69e024a97553df93e640 not found: ID does not exist" containerID="e3daf895d75c3e9171bac119b3f24a81d3e51ec1f50f69e024a97553df93e640"
Jan 28 21:01:12 crc kubenswrapper[4746]: I0128 21:01:12.693289 4746 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"e3daf895d75c3e9171bac119b3f24a81d3e51ec1f50f69e024a97553df93e640"} err="failed to get container status \"e3daf895d75c3e9171bac119b3f24a81d3e51ec1f50f69e024a97553df93e640\": rpc error: code = NotFound desc = could not find container \"e3daf895d75c3e9171bac119b3f24a81d3e51ec1f50f69e024a97553df93e640\": container with ID starting with e3daf895d75c3e9171bac119b3f24a81d3e51ec1f50f69e024a97553df93e640 not found: ID does not exist"
Jan 28 21:01:12 crc kubenswrapper[4746]: I0128 21:01:12.693308 4746 scope.go:117] "RemoveContainer" containerID="44628f9cf49f9de7f54565f66b9289e264cbc17ae8325e0426f5dd73a3cacf38"
Jan 28 21:01:12 crc kubenswrapper[4746]: E0128 21:01:12.693772 4746 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"44628f9cf49f9de7f54565f66b9289e264cbc17ae8325e0426f5dd73a3cacf38\": container with ID starting with 44628f9cf49f9de7f54565f66b9289e264cbc17ae8325e0426f5dd73a3cacf38 not found: ID does not exist" containerID="44628f9cf49f9de7f54565f66b9289e264cbc17ae8325e0426f5dd73a3cacf38"
Jan 28 21:01:12 crc kubenswrapper[4746]: I0128 21:01:12.693819 4746 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"44628f9cf49f9de7f54565f66b9289e264cbc17ae8325e0426f5dd73a3cacf38"} err="failed to get container status \"44628f9cf49f9de7f54565f66b9289e264cbc17ae8325e0426f5dd73a3cacf38\": rpc error: code = NotFound desc = could not find container \"44628f9cf49f9de7f54565f66b9289e264cbc17ae8325e0426f5dd73a3cacf38\": container with ID starting with 44628f9cf49f9de7f54565f66b9289e264cbc17ae8325e0426f5dd73a3cacf38 not found: ID does not exist"
Jan 28 21:01:12 crc kubenswrapper[4746]: I0128 21:01:12.693853 4746 scope.go:117] "RemoveContainer" containerID="67ead006986d589f8b439f4e84d5167c4c0324d7d33954b044f217d7c3804083"
Jan 28 21:01:12 crc kubenswrapper[4746]: E0128 21:01:12.694337 4746 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"67ead006986d589f8b439f4e84d5167c4c0324d7d33954b044f217d7c3804083\": container with ID starting with 67ead006986d589f8b439f4e84d5167c4c0324d7d33954b044f217d7c3804083 not found: ID does not exist" containerID="67ead006986d589f8b439f4e84d5167c4c0324d7d33954b044f217d7c3804083"
Jan 28 21:01:12 crc kubenswrapper[4746]: I0128 21:01:12.694371 4746 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"67ead006986d589f8b439f4e84d5167c4c0324d7d33954b044f217d7c3804083"} err="failed to get container status \"67ead006986d589f8b439f4e84d5167c4c0324d7d33954b044f217d7c3804083\": rpc error: code = NotFound desc = could not find container \"67ead006986d589f8b439f4e84d5167c4c0324d7d33954b044f217d7c3804083\": container with ID starting with 67ead006986d589f8b439f4e84d5167c4c0324d7d33954b044f217d7c3804083 not found: ID does not exist"
Jan 28 21:01:12 crc kubenswrapper[4746]: I0128 21:01:12.694386 4746 scope.go:117] "RemoveContainer" containerID="a8e5786bdd005c4e878b83dd2881485458e62698a1aa4bac9b1a63762480c392"
Jan 28 21:01:12 crc kubenswrapper[4746]: E0128 21:01:12.694646 4746 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"a8e5786bdd005c4e878b83dd2881485458e62698a1aa4bac9b1a63762480c392\": container with ID starting with a8e5786bdd005c4e878b83dd2881485458e62698a1aa4bac9b1a63762480c392 not found: ID does not exist" containerID="a8e5786bdd005c4e878b83dd2881485458e62698a1aa4bac9b1a63762480c392"
Jan 28 21:01:12 crc kubenswrapper[4746]: I0128 21:01:12.694675 4746 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"a8e5786bdd005c4e878b83dd2881485458e62698a1aa4bac9b1a63762480c392"} err="failed to get container status \"a8e5786bdd005c4e878b83dd2881485458e62698a1aa4bac9b1a63762480c392\": rpc error: code = NotFound desc = could not find container \"a8e5786bdd005c4e878b83dd2881485458e62698a1aa4bac9b1a63762480c392\": container with ID starting with a8e5786bdd005c4e878b83dd2881485458e62698a1aa4bac9b1a63762480c392 not found: ID does not exist"
Jan 28 21:01:12 crc kubenswrapper[4746]: I0128 21:01:12.750517 4746 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/58306125-d35d-4800-aa80-fa0f20754887-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"58306125-d35d-4800-aa80-fa0f20754887\") " pod="openstack/ceilometer-0"
Jan 28 21:01:12 crc kubenswrapper[4746]: I0128 21:01:12.750562 4746 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/58306125-d35d-4800-aa80-fa0f20754887-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"58306125-d35d-4800-aa80-fa0f20754887\") " pod="openstack/ceilometer-0"
Jan 28 21:01:12 crc kubenswrapper[4746]: I0128 21:01:12.750693 4746 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/58306125-d35d-4800-aa80-fa0f20754887-log-httpd\") pod \"ceilometer-0\" (UID: \"58306125-d35d-4800-aa80-fa0f20754887\") " pod="openstack/ceilometer-0"
Jan 28 21:01:12 crc kubenswrapper[4746]: I0128 21:01:12.751324 4746 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/58306125-d35d-4800-aa80-fa0f20754887-log-httpd\") pod \"ceilometer-0\" (UID: \"58306125-d35d-4800-aa80-fa0f20754887\") " pod="openstack/ceilometer-0"
Jan 28 21:01:12 crc kubenswrapper[4746]: I0128 21:01:12.752597 4746 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/58306125-d35d-4800-aa80-fa0f20754887-config-data\") pod \"ceilometer-0\" (UID: \"58306125-d35d-4800-aa80-fa0f20754887\") " pod="openstack/ceilometer-0"
Jan 28 21:01:12 crc kubenswrapper[4746]: I0128 21:01:12.752721 4746 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/58306125-d35d-4800-aa80-fa0f20754887-scripts\") pod \"ceilometer-0\" (UID: \"58306125-d35d-4800-aa80-fa0f20754887\") " pod="openstack/ceilometer-0"
Jan 28 21:01:12 crc kubenswrapper[4746]: I0128 21:01:12.752909 4746 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-nxb4h\" (UniqueName: \"kubernetes.io/projected/58306125-d35d-4800-aa80-fa0f20754887-kube-api-access-nxb4h\") pod \"ceilometer-0\" (UID: \"58306125-d35d-4800-aa80-fa0f20754887\") " pod="openstack/ceilometer-0"
Jan 28 21:01:12 crc kubenswrapper[4746]: I0128 21:01:12.752981 4746 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/58306125-d35d-4800-aa80-fa0f20754887-run-httpd\") pod \"ceilometer-0\" (UID: \"58306125-d35d-4800-aa80-fa0f20754887\") " pod="openstack/ceilometer-0"
Jan 28 21:01:12 crc kubenswrapper[4746]: I0128 21:01:12.753742 4746 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/58306125-d35d-4800-aa80-fa0f20754887-run-httpd\") pod \"ceilometer-0\" (UID: \"58306125-d35d-4800-aa80-fa0f20754887\") " pod="openstack/ceilometer-0"
Jan 28 21:01:12 crc kubenswrapper[4746]: I0128 21:01:12.754657 4746 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/58306125-d35d-4800-aa80-fa0f20754887-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"58306125-d35d-4800-aa80-fa0f20754887\") " pod="openstack/ceilometer-0"
Jan 28 21:01:12 crc kubenswrapper[4746]: I0128 21:01:12.756341 4746 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/58306125-d35d-4800-aa80-fa0f20754887-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"58306125-d35d-4800-aa80-fa0f20754887\") " pod="openstack/ceilometer-0"
Jan 28 21:01:12 crc kubenswrapper[4746]: I0128 21:01:12.756999 4746 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/58306125-d35d-4800-aa80-fa0f20754887-scripts\") pod \"ceilometer-0\" (UID: \"58306125-d35d-4800-aa80-fa0f20754887\") " pod="openstack/ceilometer-0"
Jan 28 21:01:12 crc kubenswrapper[4746]: I0128 21:01:12.757501 4746 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/58306125-d35d-4800-aa80-fa0f20754887-config-data\") pod \"ceilometer-0\" (UID: \"58306125-d35d-4800-aa80-fa0f20754887\") " pod="openstack/ceilometer-0"
Jan 28 21:01:12 crc kubenswrapper[4746]: I0128 21:01:12.772710 4746 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-nxb4h\" (UniqueName: \"kubernetes.io/projected/58306125-d35d-4800-aa80-fa0f20754887-kube-api-access-nxb4h\") pod \"ceilometer-0\" (UID: \"58306125-d35d-4800-aa80-fa0f20754887\") " pod="openstack/ceilometer-0"
Jan 28 21:01:12 crc kubenswrapper[4746]: I0128 21:01:12.865165 4746 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="a8705e73-fddb-48b4-b1e2-fd3e40a001cb" path="/var/lib/kubelet/pods/a8705e73-fddb-48b4-b1e2-fd3e40a001cb/volumes"
Jan 28 21:01:12 crc kubenswrapper[4746]: I0128 21:01:12.913528 4746 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0"
Jan 28 21:01:13 crc kubenswrapper[4746]: I0128 21:01:13.454558 4746 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ceilometer-0"]
Jan 28 21:01:13 crc kubenswrapper[4746]: I0128 21:01:13.527219 4746 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"58306125-d35d-4800-aa80-fa0f20754887","Type":"ContainerStarted","Data":"66f1820c4b87b456d680d7e01921cb10a10487733ab03559fde55266c3c3a8f6"}
Jan 28 21:01:14 crc kubenswrapper[4746]: I0128 21:01:14.539707 4746 prober_manager.go:312] "Failed to trigger a manual run" probe="Readiness"
Jan 28 21:01:14 crc kubenswrapper[4746]: I0128 21:01:14.540023 4746 prober_manager.go:312] "Failed to trigger a manual run" probe="Readiness"
Jan 28 21:01:14 crc kubenswrapper[4746]: I0128 21:01:14.539898 4746 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"58306125-d35d-4800-aa80-fa0f20754887","Type":"ContainerStarted","Data":"0368031e3e6abd2e055372cfeb5d560deed7a8ef73692e8e3f6afeaa3541420f"}
Jan 28 21:01:14 crc kubenswrapper[4746]: I0128 21:01:14.802192 4746 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/glance-default-external-api-0"
Jan 28 21:01:14 crc kubenswrapper[4746]: I0128 21:01:14.807779 4746 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/glance-default-external-api-0"
Jan 28 21:01:15 crc kubenswrapper[4746]: I0128 21:01:15.207213 4746 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ceilometer-0"]
Jan 28 21:01:15 crc kubenswrapper[4746]: I0128 21:01:15.551882 4746 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"58306125-d35d-4800-aa80-fa0f20754887","Type":"ContainerStarted","Data":"22c7f34b71b3c48cbe751aa27d53a4e7fac2b98cf699cca33c52432d9aa04b64"}
Jan 28 21:01:16 crc kubenswrapper[4746]: I0128 21:01:16.591404 4746 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"58306125-d35d-4800-aa80-fa0f20754887","Type":"ContainerStarted","Data":"e04ee1d66d02dfec0b195b5a4b4d22bec52240529a83b51219d42c90ca5052df"}
Jan 28 21:01:18 crc kubenswrapper[4746]: I0128 21:01:18.267644 4746 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/glance-default-internal-api-0"
Jan 28 21:01:18 crc kubenswrapper[4746]: I0128 21:01:18.267895 4746 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/glance-default-internal-api-0"
Jan 28 21:01:18 crc kubenswrapper[4746]: I0128 21:01:18.301063 4746 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/glance-default-internal-api-0"
Jan 28 21:01:18 crc kubenswrapper[4746]: I0128 21:01:18.319433 4746 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/glance-default-internal-api-0"
Jan 28 21:01:18 crc kubenswrapper[4746]: I0128 21:01:18.410369 4746 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/cloudkitty-api-0"
Jan 28 21:01:18 crc kubenswrapper[4746]: I0128 21:01:18.611787 4746 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/glance-default-internal-api-0"
Jan 28 21:01:18 crc kubenswrapper[4746]: I0128 21:01:18.611825 4746 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/glance-default-internal-api-0"
Jan 28 21:01:20 crc kubenswrapper[4746]: I0128 21:01:20.706104 4746 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/glance-default-internal-api-0"
Jan 28 21:01:20 crc kubenswrapper[4746]: I0128 21:01:20.706619 4746 prober_manager.go:312] "Failed to trigger a manual run" probe="Readiness"
Jan 28 21:01:20 crc kubenswrapper[4746]: I0128 21:01:20.973009 4746 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/glance-default-internal-api-0"
Jan 28 21:01:24 crc kubenswrapper[4746]: I0128 21:01:24.723162 4746 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"58306125-d35d-4800-aa80-fa0f20754887","Type":"ContainerStarted","Data":"3130afbb1164bccfc5c79514e830e0facdb6bcc5da41297cd1ec1b3b3e82fc98"}
Jan 28 21:01:24 crc kubenswrapper[4746]: I0128 21:01:24.723710 4746 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/ceilometer-0"
Jan 28 21:01:24 crc kubenswrapper[4746]: I0128 21:01:24.723302 4746 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="58306125-d35d-4800-aa80-fa0f20754887" containerName="proxy-httpd" containerID="cri-o://3130afbb1164bccfc5c79514e830e0facdb6bcc5da41297cd1ec1b3b3e82fc98" gracePeriod=30
Jan 28 21:01:24 crc kubenswrapper[4746]: I0128 21:01:24.723242 4746 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="58306125-d35d-4800-aa80-fa0f20754887" containerName="ceilometer-central-agent" containerID="cri-o://0368031e3e6abd2e055372cfeb5d560deed7a8ef73692e8e3f6afeaa3541420f" gracePeriod=30
Jan 28 21:01:24 crc kubenswrapper[4746]: I0128 21:01:24.723348 4746 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="58306125-d35d-4800-aa80-fa0f20754887" containerName="sg-core" containerID="cri-o://e04ee1d66d02dfec0b195b5a4b4d22bec52240529a83b51219d42c90ca5052df" gracePeriod=30
Jan 28 21:01:24 crc kubenswrapper[4746]: I0128 21:01:24.723340 4746 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="58306125-d35d-4800-aa80-fa0f20754887" containerName="ceilometer-notification-agent" containerID="cri-o://22c7f34b71b3c48cbe751aa27d53a4e7fac2b98cf699cca33c52432d9aa04b64" gracePeriod=30
Jan 28 21:01:24 crc kubenswrapper[4746]: I0128 21:01:24.726576 4746 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-conductor-db-sync-z7sfr" event={"ID":"28e5e616-1a83-4053-8231-e3763118ca8e","Type":"ContainerStarted","Data":"74048ad3db6523e71f273f6b209d7e2631b278f3167e76f7d0ff1af092ac027f"}
Jan 28 21:01:24 crc kubenswrapper[4746]: I0128 21:01:24.786231 4746 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/ceilometer-0" podStartSLOduration=2.0795272320000002 podStartE2EDuration="12.786209338s" podCreationTimestamp="2026-01-28 21:01:12 +0000 UTC" firstStartedPulling="2026-01-28 21:01:13.449102553 +0000 UTC m=+1301.405288907" lastFinishedPulling="2026-01-28 21:01:24.155784659 +0000 UTC m=+1312.111971013" observedRunningTime="2026-01-28 21:01:24.75137381 +0000 UTC m=+1312.707560164" watchObservedRunningTime="2026-01-28 21:01:24.786209338 +0000 UTC m=+1312.742395692"
Jan 28 21:01:24 crc kubenswrapper[4746]: I0128 21:01:24.801202 4746 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-cell0-conductor-db-sync-z7sfr" podStartSLOduration=1.8399136600000001 podStartE2EDuration="15.80117995s" podCreationTimestamp="2026-01-28 21:01:09 +0000 UTC" firstStartedPulling="2026-01-28 21:01:10.198706802 +0000 UTC m=+1298.154893156" lastFinishedPulling="2026-01-28 21:01:24.159973102 +0000 UTC m=+1312.116159446" observedRunningTime="2026-01-28 21:01:24.784667515 +0000 UTC m=+1312.740853879" watchObservedRunningTime="2026-01-28 21:01:24.80117995 +0000 UTC m=+1312.757366304"
Jan 28 21:01:25 crc kubenswrapper[4746]: I0128 21:01:25.657570 4746 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0"
Jan 28 21:01:25 crc kubenswrapper[4746]: I0128 21:01:25.738269 4746 generic.go:334] "Generic (PLEG): container finished" podID="58306125-d35d-4800-aa80-fa0f20754887" containerID="3130afbb1164bccfc5c79514e830e0facdb6bcc5da41297cd1ec1b3b3e82fc98" exitCode=0
Jan 28 21:01:25 crc kubenswrapper[4746]: I0128 21:01:25.738300 4746 generic.go:334] "Generic (PLEG): container finished" podID="58306125-d35d-4800-aa80-fa0f20754887" containerID="e04ee1d66d02dfec0b195b5a4b4d22bec52240529a83b51219d42c90ca5052df" exitCode=2
Jan 28 21:01:25 crc kubenswrapper[4746]: I0128 21:01:25.738308 4746 generic.go:334] "Generic (PLEG): container finished" podID="58306125-d35d-4800-aa80-fa0f20754887" containerID="22c7f34b71b3c48cbe751aa27d53a4e7fac2b98cf699cca33c52432d9aa04b64" exitCode=0
Jan 28 21:01:25 crc kubenswrapper[4746]: I0128 21:01:25.738316 4746 generic.go:334] "Generic (PLEG): container finished" podID="58306125-d35d-4800-aa80-fa0f20754887" containerID="0368031e3e6abd2e055372cfeb5d560deed7a8ef73692e8e3f6afeaa3541420f" exitCode=0
Jan 28 21:01:25 crc kubenswrapper[4746]: I0128 21:01:25.738345 4746 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0"
Jan 28 21:01:25 crc kubenswrapper[4746]: I0128 21:01:25.738344 4746 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"58306125-d35d-4800-aa80-fa0f20754887","Type":"ContainerDied","Data":"3130afbb1164bccfc5c79514e830e0facdb6bcc5da41297cd1ec1b3b3e82fc98"}
Jan 28 21:01:25 crc kubenswrapper[4746]: I0128 21:01:25.738453 4746 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"58306125-d35d-4800-aa80-fa0f20754887","Type":"ContainerDied","Data":"e04ee1d66d02dfec0b195b5a4b4d22bec52240529a83b51219d42c90ca5052df"}
Jan 28 21:01:25 crc kubenswrapper[4746]: I0128 21:01:25.738464 4746 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"58306125-d35d-4800-aa80-fa0f20754887","Type":"ContainerDied","Data":"22c7f34b71b3c48cbe751aa27d53a4e7fac2b98cf699cca33c52432d9aa04b64"}
Jan 28 21:01:25 crc kubenswrapper[4746]: I0128 21:01:25.738473 4746 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"58306125-d35d-4800-aa80-fa0f20754887","Type":"ContainerDied","Data":"0368031e3e6abd2e055372cfeb5d560deed7a8ef73692e8e3f6afeaa3541420f"}
Jan 28 21:01:25 crc kubenswrapper[4746]: I0128 21:01:25.738481 4746 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"58306125-d35d-4800-aa80-fa0f20754887","Type":"ContainerDied","Data":"66f1820c4b87b456d680d7e01921cb10a10487733ab03559fde55266c3c3a8f6"}
Jan 28 21:01:25 crc kubenswrapper[4746]: I0128 21:01:25.738494 4746 scope.go:117] "RemoveContainer" containerID="3130afbb1164bccfc5c79514e830e0facdb6bcc5da41297cd1ec1b3b3e82fc98"
Jan 28 21:01:25 crc kubenswrapper[4746]: I0128 21:01:25.761719 4746 scope.go:117] "RemoveContainer" containerID="e04ee1d66d02dfec0b195b5a4b4d22bec52240529a83b51219d42c90ca5052df"
Jan 28 21:01:25 crc kubenswrapper[4746]: I0128 21:01:25.762379 4746 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/58306125-d35d-4800-aa80-fa0f20754887-run-httpd\") pod \"58306125-d35d-4800-aa80-fa0f20754887\" (UID: \"58306125-d35d-4800-aa80-fa0f20754887\") "
Jan 28 21:01:25 crc kubenswrapper[4746]: I0128 21:01:25.762480 4746 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/58306125-d35d-4800-aa80-fa0f20754887-sg-core-conf-yaml\") pod \"58306125-d35d-4800-aa80-fa0f20754887\" (UID: \"58306125-d35d-4800-aa80-fa0f20754887\") "
Jan 28 21:01:25 crc kubenswrapper[4746]: I0128 21:01:25.762536 4746 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/58306125-d35d-4800-aa80-fa0f20754887-config-data\") pod \"58306125-d35d-4800-aa80-fa0f20754887\" (UID: \"58306125-d35d-4800-aa80-fa0f20754887\") "
Jan 28 21:01:25 crc kubenswrapper[4746]: I0128 21:01:25.762631 4746 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/58306125-d35d-4800-aa80-fa0f20754887-scripts\") pod \"58306125-d35d-4800-aa80-fa0f20754887\" (UID: \"58306125-d35d-4800-aa80-fa0f20754887\") "
Jan 28 21:01:25 crc kubenswrapper[4746]: I0128 21:01:25.762689 4746 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/58306125-d35d-4800-aa80-fa0f20754887-combined-ca-bundle\") pod \"58306125-d35d-4800-aa80-fa0f20754887\" (UID: \"58306125-d35d-4800-aa80-fa0f20754887\") "
Jan 28 21:01:25 crc kubenswrapper[4746]: I0128 21:01:25.762718 4746 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-nxb4h\" (UniqueName: \"kubernetes.io/projected/58306125-d35d-4800-aa80-fa0f20754887-kube-api-access-nxb4h\") pod \"58306125-d35d-4800-aa80-fa0f20754887\" (UID: \"58306125-d35d-4800-aa80-fa0f20754887\") "
Jan 28 21:01:25 crc kubenswrapper[4746]: I0128 21:01:25.762746 4746 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/58306125-d35d-4800-aa80-fa0f20754887-log-httpd\") pod \"58306125-d35d-4800-aa80-fa0f20754887\" (UID: \"58306125-d35d-4800-aa80-fa0f20754887\") "
Jan 28 21:01:25 crc kubenswrapper[4746]: I0128 21:01:25.762747 4746 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/58306125-d35d-4800-aa80-fa0f20754887-run-httpd" (OuterVolumeSpecName: "run-httpd") pod "58306125-d35d-4800-aa80-fa0f20754887" (UID: "58306125-d35d-4800-aa80-fa0f20754887"). InnerVolumeSpecName "run-httpd". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Jan 28 21:01:25 crc kubenswrapper[4746]: I0128 21:01:25.763181 4746 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/58306125-d35d-4800-aa80-fa0f20754887-log-httpd" (OuterVolumeSpecName: "log-httpd") pod "58306125-d35d-4800-aa80-fa0f20754887" (UID: "58306125-d35d-4800-aa80-fa0f20754887"). InnerVolumeSpecName "log-httpd". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Jan 28 21:01:25 crc kubenswrapper[4746]: I0128 21:01:25.763283 4746 reconciler_common.go:293] "Volume detached for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/58306125-d35d-4800-aa80-fa0f20754887-log-httpd\") on node \"crc\" DevicePath \"\""
Jan 28 21:01:25 crc kubenswrapper[4746]: I0128 21:01:25.763295 4746 reconciler_common.go:293] "Volume detached for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/58306125-d35d-4800-aa80-fa0f20754887-run-httpd\") on node \"crc\" DevicePath \"\""
Jan 28 21:01:25 crc kubenswrapper[4746]: I0128 21:01:25.777206 4746 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/58306125-d35d-4800-aa80-fa0f20754887-scripts" (OuterVolumeSpecName: "scripts") pod "58306125-d35d-4800-aa80-fa0f20754887" (UID: "58306125-d35d-4800-aa80-fa0f20754887"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue ""
Jan 28 21:01:25 crc kubenswrapper[4746]: I0128 21:01:25.777217 4746 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/58306125-d35d-4800-aa80-fa0f20754887-kube-api-access-nxb4h" (OuterVolumeSpecName: "kube-api-access-nxb4h") pod "58306125-d35d-4800-aa80-fa0f20754887" (UID: "58306125-d35d-4800-aa80-fa0f20754887"). InnerVolumeSpecName "kube-api-access-nxb4h". PluginName "kubernetes.io/projected", VolumeGidValue ""
Jan 28 21:01:25 crc kubenswrapper[4746]: I0128 21:01:25.807596 4746 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/58306125-d35d-4800-aa80-fa0f20754887-sg-core-conf-yaml" (OuterVolumeSpecName: "sg-core-conf-yaml") pod "58306125-d35d-4800-aa80-fa0f20754887" (UID: "58306125-d35d-4800-aa80-fa0f20754887"). InnerVolumeSpecName "sg-core-conf-yaml". PluginName "kubernetes.io/secret", VolumeGidValue ""
Jan 28 21:01:25 crc kubenswrapper[4746]: I0128 21:01:25.825598 4746 scope.go:117] "RemoveContainer" containerID="22c7f34b71b3c48cbe751aa27d53a4e7fac2b98cf699cca33c52432d9aa04b64"
Jan 28 21:01:25 crc kubenswrapper[4746]: I0128 21:01:25.846956 4746 scope.go:117] "RemoveContainer" containerID="0368031e3e6abd2e055372cfeb5d560deed7a8ef73692e8e3f6afeaa3541420f"
Jan 28 21:01:25 crc kubenswrapper[4746]: I0128 21:01:25.848072 4746 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/58306125-d35d-4800-aa80-fa0f20754887-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "58306125-d35d-4800-aa80-fa0f20754887" (UID: "58306125-d35d-4800-aa80-fa0f20754887"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue ""
Jan 28 21:01:25 crc kubenswrapper[4746]: I0128 21:01:25.864752 4746 reconciler_common.go:293] "Volume detached for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/58306125-d35d-4800-aa80-fa0f20754887-sg-core-conf-yaml\") on node \"crc\" DevicePath \"\""
Jan 28 21:01:25 crc kubenswrapper[4746]: I0128 21:01:25.864789 4746 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/58306125-d35d-4800-aa80-fa0f20754887-scripts\") on node \"crc\" DevicePath \"\""
Jan 28 21:01:25 crc kubenswrapper[4746]: I0128 21:01:25.864802 4746 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/58306125-d35d-4800-aa80-fa0f20754887-combined-ca-bundle\") on node \"crc\" DevicePath \"\""
Jan 28 21:01:25 crc kubenswrapper[4746]: I0128 21:01:25.864816 4746 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-nxb4h\" (UniqueName: \"kubernetes.io/projected/58306125-d35d-4800-aa80-fa0f20754887-kube-api-access-nxb4h\") on node \"crc\" DevicePath \"\""
Jan 28 21:01:25 crc kubenswrapper[4746]: I0128 21:01:25.868579
4746 scope.go:117] "RemoveContainer" containerID="3130afbb1164bccfc5c79514e830e0facdb6bcc5da41297cd1ec1b3b3e82fc98" Jan 28 21:01:25 crc kubenswrapper[4746]: E0128 21:01:25.869053 4746 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"3130afbb1164bccfc5c79514e830e0facdb6bcc5da41297cd1ec1b3b3e82fc98\": container with ID starting with 3130afbb1164bccfc5c79514e830e0facdb6bcc5da41297cd1ec1b3b3e82fc98 not found: ID does not exist" containerID="3130afbb1164bccfc5c79514e830e0facdb6bcc5da41297cd1ec1b3b3e82fc98" Jan 28 21:01:25 crc kubenswrapper[4746]: I0128 21:01:25.869109 4746 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"3130afbb1164bccfc5c79514e830e0facdb6bcc5da41297cd1ec1b3b3e82fc98"} err="failed to get container status \"3130afbb1164bccfc5c79514e830e0facdb6bcc5da41297cd1ec1b3b3e82fc98\": rpc error: code = NotFound desc = could not find container \"3130afbb1164bccfc5c79514e830e0facdb6bcc5da41297cd1ec1b3b3e82fc98\": container with ID starting with 3130afbb1164bccfc5c79514e830e0facdb6bcc5da41297cd1ec1b3b3e82fc98 not found: ID does not exist" Jan 28 21:01:25 crc kubenswrapper[4746]: I0128 21:01:25.869136 4746 scope.go:117] "RemoveContainer" containerID="e04ee1d66d02dfec0b195b5a4b4d22bec52240529a83b51219d42c90ca5052df" Jan 28 21:01:25 crc kubenswrapper[4746]: E0128 21:01:25.869496 4746 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"e04ee1d66d02dfec0b195b5a4b4d22bec52240529a83b51219d42c90ca5052df\": container with ID starting with e04ee1d66d02dfec0b195b5a4b4d22bec52240529a83b51219d42c90ca5052df not found: ID does not exist" containerID="e04ee1d66d02dfec0b195b5a4b4d22bec52240529a83b51219d42c90ca5052df" Jan 28 21:01:25 crc kubenswrapper[4746]: I0128 21:01:25.869525 4746 pod_container_deletor.go:53] "DeleteContainer returned error" 
containerID={"Type":"cri-o","ID":"e04ee1d66d02dfec0b195b5a4b4d22bec52240529a83b51219d42c90ca5052df"} err="failed to get container status \"e04ee1d66d02dfec0b195b5a4b4d22bec52240529a83b51219d42c90ca5052df\": rpc error: code = NotFound desc = could not find container \"e04ee1d66d02dfec0b195b5a4b4d22bec52240529a83b51219d42c90ca5052df\": container with ID starting with e04ee1d66d02dfec0b195b5a4b4d22bec52240529a83b51219d42c90ca5052df not found: ID does not exist" Jan 28 21:01:25 crc kubenswrapper[4746]: I0128 21:01:25.869543 4746 scope.go:117] "RemoveContainer" containerID="22c7f34b71b3c48cbe751aa27d53a4e7fac2b98cf699cca33c52432d9aa04b64" Jan 28 21:01:25 crc kubenswrapper[4746]: E0128 21:01:25.869803 4746 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"22c7f34b71b3c48cbe751aa27d53a4e7fac2b98cf699cca33c52432d9aa04b64\": container with ID starting with 22c7f34b71b3c48cbe751aa27d53a4e7fac2b98cf699cca33c52432d9aa04b64 not found: ID does not exist" containerID="22c7f34b71b3c48cbe751aa27d53a4e7fac2b98cf699cca33c52432d9aa04b64" Jan 28 21:01:25 crc kubenswrapper[4746]: I0128 21:01:25.869830 4746 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"22c7f34b71b3c48cbe751aa27d53a4e7fac2b98cf699cca33c52432d9aa04b64"} err="failed to get container status \"22c7f34b71b3c48cbe751aa27d53a4e7fac2b98cf699cca33c52432d9aa04b64\": rpc error: code = NotFound desc = could not find container \"22c7f34b71b3c48cbe751aa27d53a4e7fac2b98cf699cca33c52432d9aa04b64\": container with ID starting with 22c7f34b71b3c48cbe751aa27d53a4e7fac2b98cf699cca33c52432d9aa04b64 not found: ID does not exist" Jan 28 21:01:25 crc kubenswrapper[4746]: I0128 21:01:25.869849 4746 scope.go:117] "RemoveContainer" containerID="0368031e3e6abd2e055372cfeb5d560deed7a8ef73692e8e3f6afeaa3541420f" Jan 28 21:01:25 crc kubenswrapper[4746]: E0128 21:01:25.870048 4746 log.go:32] "ContainerStatus from runtime service 
failed" err="rpc error: code = NotFound desc = could not find container \"0368031e3e6abd2e055372cfeb5d560deed7a8ef73692e8e3f6afeaa3541420f\": container with ID starting with 0368031e3e6abd2e055372cfeb5d560deed7a8ef73692e8e3f6afeaa3541420f not found: ID does not exist" containerID="0368031e3e6abd2e055372cfeb5d560deed7a8ef73692e8e3f6afeaa3541420f" Jan 28 21:01:25 crc kubenswrapper[4746]: I0128 21:01:25.870073 4746 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"0368031e3e6abd2e055372cfeb5d560deed7a8ef73692e8e3f6afeaa3541420f"} err="failed to get container status \"0368031e3e6abd2e055372cfeb5d560deed7a8ef73692e8e3f6afeaa3541420f\": rpc error: code = NotFound desc = could not find container \"0368031e3e6abd2e055372cfeb5d560deed7a8ef73692e8e3f6afeaa3541420f\": container with ID starting with 0368031e3e6abd2e055372cfeb5d560deed7a8ef73692e8e3f6afeaa3541420f not found: ID does not exist" Jan 28 21:01:25 crc kubenswrapper[4746]: I0128 21:01:25.870108 4746 scope.go:117] "RemoveContainer" containerID="3130afbb1164bccfc5c79514e830e0facdb6bcc5da41297cd1ec1b3b3e82fc98" Jan 28 21:01:25 crc kubenswrapper[4746]: I0128 21:01:25.870378 4746 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"3130afbb1164bccfc5c79514e830e0facdb6bcc5da41297cd1ec1b3b3e82fc98"} err="failed to get container status \"3130afbb1164bccfc5c79514e830e0facdb6bcc5da41297cd1ec1b3b3e82fc98\": rpc error: code = NotFound desc = could not find container \"3130afbb1164bccfc5c79514e830e0facdb6bcc5da41297cd1ec1b3b3e82fc98\": container with ID starting with 3130afbb1164bccfc5c79514e830e0facdb6bcc5da41297cd1ec1b3b3e82fc98 not found: ID does not exist" Jan 28 21:01:25 crc kubenswrapper[4746]: I0128 21:01:25.870404 4746 scope.go:117] "RemoveContainer" containerID="e04ee1d66d02dfec0b195b5a4b4d22bec52240529a83b51219d42c90ca5052df" Jan 28 21:01:25 crc kubenswrapper[4746]: I0128 21:01:25.870655 4746 pod_container_deletor.go:53] 
"DeleteContainer returned error" containerID={"Type":"cri-o","ID":"e04ee1d66d02dfec0b195b5a4b4d22bec52240529a83b51219d42c90ca5052df"} err="failed to get container status \"e04ee1d66d02dfec0b195b5a4b4d22bec52240529a83b51219d42c90ca5052df\": rpc error: code = NotFound desc = could not find container \"e04ee1d66d02dfec0b195b5a4b4d22bec52240529a83b51219d42c90ca5052df\": container with ID starting with e04ee1d66d02dfec0b195b5a4b4d22bec52240529a83b51219d42c90ca5052df not found: ID does not exist" Jan 28 21:01:25 crc kubenswrapper[4746]: I0128 21:01:25.870690 4746 scope.go:117] "RemoveContainer" containerID="22c7f34b71b3c48cbe751aa27d53a4e7fac2b98cf699cca33c52432d9aa04b64" Jan 28 21:01:25 crc kubenswrapper[4746]: I0128 21:01:25.871142 4746 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"22c7f34b71b3c48cbe751aa27d53a4e7fac2b98cf699cca33c52432d9aa04b64"} err="failed to get container status \"22c7f34b71b3c48cbe751aa27d53a4e7fac2b98cf699cca33c52432d9aa04b64\": rpc error: code = NotFound desc = could not find container \"22c7f34b71b3c48cbe751aa27d53a4e7fac2b98cf699cca33c52432d9aa04b64\": container with ID starting with 22c7f34b71b3c48cbe751aa27d53a4e7fac2b98cf699cca33c52432d9aa04b64 not found: ID does not exist" Jan 28 21:01:25 crc kubenswrapper[4746]: I0128 21:01:25.871160 4746 scope.go:117] "RemoveContainer" containerID="0368031e3e6abd2e055372cfeb5d560deed7a8ef73692e8e3f6afeaa3541420f" Jan 28 21:01:25 crc kubenswrapper[4746]: I0128 21:01:25.872298 4746 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"0368031e3e6abd2e055372cfeb5d560deed7a8ef73692e8e3f6afeaa3541420f"} err="failed to get container status \"0368031e3e6abd2e055372cfeb5d560deed7a8ef73692e8e3f6afeaa3541420f\": rpc error: code = NotFound desc = could not find container \"0368031e3e6abd2e055372cfeb5d560deed7a8ef73692e8e3f6afeaa3541420f\": container with ID starting with 
0368031e3e6abd2e055372cfeb5d560deed7a8ef73692e8e3f6afeaa3541420f not found: ID does not exist" Jan 28 21:01:25 crc kubenswrapper[4746]: I0128 21:01:25.872348 4746 scope.go:117] "RemoveContainer" containerID="3130afbb1164bccfc5c79514e830e0facdb6bcc5da41297cd1ec1b3b3e82fc98" Jan 28 21:01:25 crc kubenswrapper[4746]: I0128 21:01:25.872833 4746 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"3130afbb1164bccfc5c79514e830e0facdb6bcc5da41297cd1ec1b3b3e82fc98"} err="failed to get container status \"3130afbb1164bccfc5c79514e830e0facdb6bcc5da41297cd1ec1b3b3e82fc98\": rpc error: code = NotFound desc = could not find container \"3130afbb1164bccfc5c79514e830e0facdb6bcc5da41297cd1ec1b3b3e82fc98\": container with ID starting with 3130afbb1164bccfc5c79514e830e0facdb6bcc5da41297cd1ec1b3b3e82fc98 not found: ID does not exist" Jan 28 21:01:25 crc kubenswrapper[4746]: I0128 21:01:25.872857 4746 scope.go:117] "RemoveContainer" containerID="e04ee1d66d02dfec0b195b5a4b4d22bec52240529a83b51219d42c90ca5052df" Jan 28 21:01:25 crc kubenswrapper[4746]: I0128 21:01:25.873100 4746 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"e04ee1d66d02dfec0b195b5a4b4d22bec52240529a83b51219d42c90ca5052df"} err="failed to get container status \"e04ee1d66d02dfec0b195b5a4b4d22bec52240529a83b51219d42c90ca5052df\": rpc error: code = NotFound desc = could not find container \"e04ee1d66d02dfec0b195b5a4b4d22bec52240529a83b51219d42c90ca5052df\": container with ID starting with e04ee1d66d02dfec0b195b5a4b4d22bec52240529a83b51219d42c90ca5052df not found: ID does not exist" Jan 28 21:01:25 crc kubenswrapper[4746]: I0128 21:01:25.873142 4746 scope.go:117] "RemoveContainer" containerID="22c7f34b71b3c48cbe751aa27d53a4e7fac2b98cf699cca33c52432d9aa04b64" Jan 28 21:01:25 crc kubenswrapper[4746]: I0128 21:01:25.873457 4746 pod_container_deletor.go:53] "DeleteContainer returned error" 
containerID={"Type":"cri-o","ID":"22c7f34b71b3c48cbe751aa27d53a4e7fac2b98cf699cca33c52432d9aa04b64"} err="failed to get container status \"22c7f34b71b3c48cbe751aa27d53a4e7fac2b98cf699cca33c52432d9aa04b64\": rpc error: code = NotFound desc = could not find container \"22c7f34b71b3c48cbe751aa27d53a4e7fac2b98cf699cca33c52432d9aa04b64\": container with ID starting with 22c7f34b71b3c48cbe751aa27d53a4e7fac2b98cf699cca33c52432d9aa04b64 not found: ID does not exist" Jan 28 21:01:25 crc kubenswrapper[4746]: I0128 21:01:25.873478 4746 scope.go:117] "RemoveContainer" containerID="0368031e3e6abd2e055372cfeb5d560deed7a8ef73692e8e3f6afeaa3541420f" Jan 28 21:01:25 crc kubenswrapper[4746]: I0128 21:01:25.873732 4746 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"0368031e3e6abd2e055372cfeb5d560deed7a8ef73692e8e3f6afeaa3541420f"} err="failed to get container status \"0368031e3e6abd2e055372cfeb5d560deed7a8ef73692e8e3f6afeaa3541420f\": rpc error: code = NotFound desc = could not find container \"0368031e3e6abd2e055372cfeb5d560deed7a8ef73692e8e3f6afeaa3541420f\": container with ID starting with 0368031e3e6abd2e055372cfeb5d560deed7a8ef73692e8e3f6afeaa3541420f not found: ID does not exist" Jan 28 21:01:25 crc kubenswrapper[4746]: I0128 21:01:25.873778 4746 scope.go:117] "RemoveContainer" containerID="3130afbb1164bccfc5c79514e830e0facdb6bcc5da41297cd1ec1b3b3e82fc98" Jan 28 21:01:25 crc kubenswrapper[4746]: I0128 21:01:25.874148 4746 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"3130afbb1164bccfc5c79514e830e0facdb6bcc5da41297cd1ec1b3b3e82fc98"} err="failed to get container status \"3130afbb1164bccfc5c79514e830e0facdb6bcc5da41297cd1ec1b3b3e82fc98\": rpc error: code = NotFound desc = could not find container \"3130afbb1164bccfc5c79514e830e0facdb6bcc5da41297cd1ec1b3b3e82fc98\": container with ID starting with 3130afbb1164bccfc5c79514e830e0facdb6bcc5da41297cd1ec1b3b3e82fc98 not found: ID does not 
exist" Jan 28 21:01:25 crc kubenswrapper[4746]: I0128 21:01:25.874174 4746 scope.go:117] "RemoveContainer" containerID="e04ee1d66d02dfec0b195b5a4b4d22bec52240529a83b51219d42c90ca5052df" Jan 28 21:01:25 crc kubenswrapper[4746]: I0128 21:01:25.874506 4746 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"e04ee1d66d02dfec0b195b5a4b4d22bec52240529a83b51219d42c90ca5052df"} err="failed to get container status \"e04ee1d66d02dfec0b195b5a4b4d22bec52240529a83b51219d42c90ca5052df\": rpc error: code = NotFound desc = could not find container \"e04ee1d66d02dfec0b195b5a4b4d22bec52240529a83b51219d42c90ca5052df\": container with ID starting with e04ee1d66d02dfec0b195b5a4b4d22bec52240529a83b51219d42c90ca5052df not found: ID does not exist" Jan 28 21:01:25 crc kubenswrapper[4746]: I0128 21:01:25.874535 4746 scope.go:117] "RemoveContainer" containerID="22c7f34b71b3c48cbe751aa27d53a4e7fac2b98cf699cca33c52432d9aa04b64" Jan 28 21:01:25 crc kubenswrapper[4746]: I0128 21:01:25.874839 4746 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"22c7f34b71b3c48cbe751aa27d53a4e7fac2b98cf699cca33c52432d9aa04b64"} err="failed to get container status \"22c7f34b71b3c48cbe751aa27d53a4e7fac2b98cf699cca33c52432d9aa04b64\": rpc error: code = NotFound desc = could not find container \"22c7f34b71b3c48cbe751aa27d53a4e7fac2b98cf699cca33c52432d9aa04b64\": container with ID starting with 22c7f34b71b3c48cbe751aa27d53a4e7fac2b98cf699cca33c52432d9aa04b64 not found: ID does not exist" Jan 28 21:01:25 crc kubenswrapper[4746]: I0128 21:01:25.874878 4746 scope.go:117] "RemoveContainer" containerID="0368031e3e6abd2e055372cfeb5d560deed7a8ef73692e8e3f6afeaa3541420f" Jan 28 21:01:25 crc kubenswrapper[4746]: I0128 21:01:25.875178 4746 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"0368031e3e6abd2e055372cfeb5d560deed7a8ef73692e8e3f6afeaa3541420f"} err="failed to get container status 
\"0368031e3e6abd2e055372cfeb5d560deed7a8ef73692e8e3f6afeaa3541420f\": rpc error: code = NotFound desc = could not find container \"0368031e3e6abd2e055372cfeb5d560deed7a8ef73692e8e3f6afeaa3541420f\": container with ID starting with 0368031e3e6abd2e055372cfeb5d560deed7a8ef73692e8e3f6afeaa3541420f not found: ID does not exist" Jan 28 21:01:25 crc kubenswrapper[4746]: I0128 21:01:25.882213 4746 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/58306125-d35d-4800-aa80-fa0f20754887-config-data" (OuterVolumeSpecName: "config-data") pod "58306125-d35d-4800-aa80-fa0f20754887" (UID: "58306125-d35d-4800-aa80-fa0f20754887"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 28 21:01:25 crc kubenswrapper[4746]: I0128 21:01:25.966457 4746 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/58306125-d35d-4800-aa80-fa0f20754887-config-data\") on node \"crc\" DevicePath \"\"" Jan 28 21:01:26 crc kubenswrapper[4746]: I0128 21:01:26.081276 4746 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ceilometer-0"] Jan 28 21:01:26 crc kubenswrapper[4746]: I0128 21:01:26.092697 4746 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/ceilometer-0"] Jan 28 21:01:26 crc kubenswrapper[4746]: I0128 21:01:26.116357 4746 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ceilometer-0"] Jan 28 21:01:26 crc kubenswrapper[4746]: E0128 21:01:26.116727 4746 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="58306125-d35d-4800-aa80-fa0f20754887" containerName="sg-core" Jan 28 21:01:26 crc kubenswrapper[4746]: I0128 21:01:26.116743 4746 state_mem.go:107] "Deleted CPUSet assignment" podUID="58306125-d35d-4800-aa80-fa0f20754887" containerName="sg-core" Jan 28 21:01:26 crc kubenswrapper[4746]: E0128 21:01:26.116759 4746 cpu_manager.go:410] "RemoveStaleState: removing container" 
podUID="58306125-d35d-4800-aa80-fa0f20754887" containerName="ceilometer-central-agent" Jan 28 21:01:26 crc kubenswrapper[4746]: I0128 21:01:26.116766 4746 state_mem.go:107] "Deleted CPUSet assignment" podUID="58306125-d35d-4800-aa80-fa0f20754887" containerName="ceilometer-central-agent" Jan 28 21:01:26 crc kubenswrapper[4746]: E0128 21:01:26.116785 4746 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="58306125-d35d-4800-aa80-fa0f20754887" containerName="ceilometer-notification-agent" Jan 28 21:01:26 crc kubenswrapper[4746]: I0128 21:01:26.116792 4746 state_mem.go:107] "Deleted CPUSet assignment" podUID="58306125-d35d-4800-aa80-fa0f20754887" containerName="ceilometer-notification-agent" Jan 28 21:01:26 crc kubenswrapper[4746]: E0128 21:01:26.116810 4746 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="58306125-d35d-4800-aa80-fa0f20754887" containerName="proxy-httpd" Jan 28 21:01:26 crc kubenswrapper[4746]: I0128 21:01:26.116816 4746 state_mem.go:107] "Deleted CPUSet assignment" podUID="58306125-d35d-4800-aa80-fa0f20754887" containerName="proxy-httpd" Jan 28 21:01:26 crc kubenswrapper[4746]: I0128 21:01:26.116978 4746 memory_manager.go:354] "RemoveStaleState removing state" podUID="58306125-d35d-4800-aa80-fa0f20754887" containerName="sg-core" Jan 28 21:01:26 crc kubenswrapper[4746]: I0128 21:01:26.116997 4746 memory_manager.go:354] "RemoveStaleState removing state" podUID="58306125-d35d-4800-aa80-fa0f20754887" containerName="ceilometer-central-agent" Jan 28 21:01:26 crc kubenswrapper[4746]: I0128 21:01:26.117013 4746 memory_manager.go:354] "RemoveStaleState removing state" podUID="58306125-d35d-4800-aa80-fa0f20754887" containerName="ceilometer-notification-agent" Jan 28 21:01:26 crc kubenswrapper[4746]: I0128 21:01:26.117027 4746 memory_manager.go:354] "RemoveStaleState removing state" podUID="58306125-d35d-4800-aa80-fa0f20754887" containerName="proxy-httpd" Jan 28 21:01:26 crc kubenswrapper[4746]: I0128 21:01:26.119430 4746 
util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0" Jan 28 21:01:26 crc kubenswrapper[4746]: I0128 21:01:26.123180 4746 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceilometer-scripts" Jan 28 21:01:26 crc kubenswrapper[4746]: I0128 21:01:26.124053 4746 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceilometer-config-data" Jan 28 21:01:26 crc kubenswrapper[4746]: I0128 21:01:26.127143 4746 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ceilometer-0"] Jan 28 21:01:26 crc kubenswrapper[4746]: I0128 21:01:26.169987 4746 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/91435394-50f0-4f04-b4c5-59d4e4ef2b97-scripts\") pod \"ceilometer-0\" (UID: \"91435394-50f0-4f04-b4c5-59d4e4ef2b97\") " pod="openstack/ceilometer-0" Jan 28 21:01:26 crc kubenswrapper[4746]: I0128 21:01:26.170238 4746 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/91435394-50f0-4f04-b4c5-59d4e4ef2b97-run-httpd\") pod \"ceilometer-0\" (UID: \"91435394-50f0-4f04-b4c5-59d4e4ef2b97\") " pod="openstack/ceilometer-0" Jan 28 21:01:26 crc kubenswrapper[4746]: I0128 21:01:26.170437 4746 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/91435394-50f0-4f04-b4c5-59d4e4ef2b97-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"91435394-50f0-4f04-b4c5-59d4e4ef2b97\") " pod="openstack/ceilometer-0" Jan 28 21:01:26 crc kubenswrapper[4746]: I0128 21:01:26.170473 4746 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/91435394-50f0-4f04-b4c5-59d4e4ef2b97-config-data\") pod \"ceilometer-0\" (UID: 
\"91435394-50f0-4f04-b4c5-59d4e4ef2b97\") " pod="openstack/ceilometer-0" Jan 28 21:01:26 crc kubenswrapper[4746]: I0128 21:01:26.170566 4746 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-j48jg\" (UniqueName: \"kubernetes.io/projected/91435394-50f0-4f04-b4c5-59d4e4ef2b97-kube-api-access-j48jg\") pod \"ceilometer-0\" (UID: \"91435394-50f0-4f04-b4c5-59d4e4ef2b97\") " pod="openstack/ceilometer-0" Jan 28 21:01:26 crc kubenswrapper[4746]: I0128 21:01:26.170686 4746 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/91435394-50f0-4f04-b4c5-59d4e4ef2b97-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"91435394-50f0-4f04-b4c5-59d4e4ef2b97\") " pod="openstack/ceilometer-0" Jan 28 21:01:26 crc kubenswrapper[4746]: I0128 21:01:26.170724 4746 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/91435394-50f0-4f04-b4c5-59d4e4ef2b97-log-httpd\") pod \"ceilometer-0\" (UID: \"91435394-50f0-4f04-b4c5-59d4e4ef2b97\") " pod="openstack/ceilometer-0" Jan 28 21:01:26 crc kubenswrapper[4746]: I0128 21:01:26.272070 4746 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/91435394-50f0-4f04-b4c5-59d4e4ef2b97-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"91435394-50f0-4f04-b4c5-59d4e4ef2b97\") " pod="openstack/ceilometer-0" Jan 28 21:01:26 crc kubenswrapper[4746]: I0128 21:01:26.272120 4746 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/91435394-50f0-4f04-b4c5-59d4e4ef2b97-config-data\") pod \"ceilometer-0\" (UID: \"91435394-50f0-4f04-b4c5-59d4e4ef2b97\") " pod="openstack/ceilometer-0" Jan 28 21:01:26 crc kubenswrapper[4746]: I0128 21:01:26.272160 4746 
reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-j48jg\" (UniqueName: \"kubernetes.io/projected/91435394-50f0-4f04-b4c5-59d4e4ef2b97-kube-api-access-j48jg\") pod \"ceilometer-0\" (UID: \"91435394-50f0-4f04-b4c5-59d4e4ef2b97\") " pod="openstack/ceilometer-0" Jan 28 21:01:26 crc kubenswrapper[4746]: I0128 21:01:26.272191 4746 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/91435394-50f0-4f04-b4c5-59d4e4ef2b97-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"91435394-50f0-4f04-b4c5-59d4e4ef2b97\") " pod="openstack/ceilometer-0" Jan 28 21:01:26 crc kubenswrapper[4746]: I0128 21:01:26.272210 4746 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/91435394-50f0-4f04-b4c5-59d4e4ef2b97-log-httpd\") pod \"ceilometer-0\" (UID: \"91435394-50f0-4f04-b4c5-59d4e4ef2b97\") " pod="openstack/ceilometer-0" Jan 28 21:01:26 crc kubenswrapper[4746]: I0128 21:01:26.272235 4746 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/91435394-50f0-4f04-b4c5-59d4e4ef2b97-scripts\") pod \"ceilometer-0\" (UID: \"91435394-50f0-4f04-b4c5-59d4e4ef2b97\") " pod="openstack/ceilometer-0" Jan 28 21:01:26 crc kubenswrapper[4746]: I0128 21:01:26.272335 4746 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/91435394-50f0-4f04-b4c5-59d4e4ef2b97-run-httpd\") pod \"ceilometer-0\" (UID: \"91435394-50f0-4f04-b4c5-59d4e4ef2b97\") " pod="openstack/ceilometer-0" Jan 28 21:01:26 crc kubenswrapper[4746]: I0128 21:01:26.272996 4746 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/91435394-50f0-4f04-b4c5-59d4e4ef2b97-run-httpd\") pod \"ceilometer-0\" (UID: 
\"91435394-50f0-4f04-b4c5-59d4e4ef2b97\") " pod="openstack/ceilometer-0" Jan 28 21:01:26 crc kubenswrapper[4746]: I0128 21:01:26.273146 4746 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/91435394-50f0-4f04-b4c5-59d4e4ef2b97-log-httpd\") pod \"ceilometer-0\" (UID: \"91435394-50f0-4f04-b4c5-59d4e4ef2b97\") " pod="openstack/ceilometer-0" Jan 28 21:01:26 crc kubenswrapper[4746]: I0128 21:01:26.279577 4746 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/91435394-50f0-4f04-b4c5-59d4e4ef2b97-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"91435394-50f0-4f04-b4c5-59d4e4ef2b97\") " pod="openstack/ceilometer-0" Jan 28 21:01:26 crc kubenswrapper[4746]: I0128 21:01:26.279712 4746 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/91435394-50f0-4f04-b4c5-59d4e4ef2b97-scripts\") pod \"ceilometer-0\" (UID: \"91435394-50f0-4f04-b4c5-59d4e4ef2b97\") " pod="openstack/ceilometer-0" Jan 28 21:01:26 crc kubenswrapper[4746]: I0128 21:01:26.280177 4746 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/91435394-50f0-4f04-b4c5-59d4e4ef2b97-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"91435394-50f0-4f04-b4c5-59d4e4ef2b97\") " pod="openstack/ceilometer-0" Jan 28 21:01:26 crc kubenswrapper[4746]: I0128 21:01:26.286185 4746 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/91435394-50f0-4f04-b4c5-59d4e4ef2b97-config-data\") pod \"ceilometer-0\" (UID: \"91435394-50f0-4f04-b4c5-59d4e4ef2b97\") " pod="openstack/ceilometer-0" Jan 28 21:01:26 crc kubenswrapper[4746]: I0128 21:01:26.290711 4746 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-j48jg\" (UniqueName: 
\"kubernetes.io/projected/91435394-50f0-4f04-b4c5-59d4e4ef2b97-kube-api-access-j48jg\") pod \"ceilometer-0\" (UID: \"91435394-50f0-4f04-b4c5-59d4e4ef2b97\") " pod="openstack/ceilometer-0"
Jan 28 21:01:26 crc kubenswrapper[4746]: I0128 21:01:26.434734 4746 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0"
Jan 28 21:01:26 crc kubenswrapper[4746]: I0128 21:01:26.847431 4746 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="58306125-d35d-4800-aa80-fa0f20754887" path="/var/lib/kubelet/pods/58306125-d35d-4800-aa80-fa0f20754887/volumes"
Jan 28 21:01:26 crc kubenswrapper[4746]: I0128 21:01:26.939408 4746 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ceilometer-0"]
Jan 28 21:01:26 crc kubenswrapper[4746]: W0128 21:01:26.950458 4746 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod91435394_50f0_4f04_b4c5_59d4e4ef2b97.slice/crio-426f6e3f66e8188f436dce89a89fb90083929bf153d29433e5d9a7dd4d71e2c1 WatchSource:0}: Error finding container 426f6e3f66e8188f436dce89a89fb90083929bf153d29433e5d9a7dd4d71e2c1: Status 404 returned error can't find the container with id 426f6e3f66e8188f436dce89a89fb90083929bf153d29433e5d9a7dd4d71e2c1
Jan 28 21:01:27 crc kubenswrapper[4746]: I0128 21:01:27.761225 4746 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"91435394-50f0-4f04-b4c5-59d4e4ef2b97","Type":"ContainerStarted","Data":"984e2c3366ea860a2f7141209ad26d923f140dc570ad85a009043383dbd4880c"}
Jan 28 21:01:27 crc kubenswrapper[4746]: I0128 21:01:27.761738 4746 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"91435394-50f0-4f04-b4c5-59d4e4ef2b97","Type":"ContainerStarted","Data":"426f6e3f66e8188f436dce89a89fb90083929bf153d29433e5d9a7dd4d71e2c1"}
Jan 28 21:01:27 crc kubenswrapper[4746]: I0128 21:01:27.957907 4746 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ceilometer-0"]
Jan 28 21:01:28 crc kubenswrapper[4746]: I0128 21:01:28.773895 4746 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"91435394-50f0-4f04-b4c5-59d4e4ef2b97","Type":"ContainerStarted","Data":"dca8ed5b42701176f4abc868016576f4814f8f4b548975b492eb4c7d79aec7ca"}
Jan 28 21:01:29 crc kubenswrapper[4746]: I0128 21:01:29.788572 4746 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"91435394-50f0-4f04-b4c5-59d4e4ef2b97","Type":"ContainerStarted","Data":"520cbe1e7dd00c248603510bc63540699320cc5552b9b3b63e24fa6930f29364"}
Jan 28 21:01:31 crc kubenswrapper[4746]: I0128 21:01:31.811670 4746 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"91435394-50f0-4f04-b4c5-59d4e4ef2b97","Type":"ContainerStarted","Data":"9cd85b3c2ebff6a9001c45f7aa4c2c6ef360b6070ec339255c54f0de6f56bf11"}
Jan 28 21:01:31 crc kubenswrapper[4746]: I0128 21:01:31.811902 4746 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="91435394-50f0-4f04-b4c5-59d4e4ef2b97" containerName="ceilometer-central-agent" containerID="cri-o://984e2c3366ea860a2f7141209ad26d923f140dc570ad85a009043383dbd4880c" gracePeriod=30
Jan 28 21:01:31 crc kubenswrapper[4746]: I0128 21:01:31.812133 4746 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/ceilometer-0"
Jan 28 21:01:31 crc kubenswrapper[4746]: I0128 21:01:31.812218 4746 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="91435394-50f0-4f04-b4c5-59d4e4ef2b97" containerName="proxy-httpd" containerID="cri-o://9cd85b3c2ebff6a9001c45f7aa4c2c6ef360b6070ec339255c54f0de6f56bf11" gracePeriod=30
Jan 28 21:01:31 crc kubenswrapper[4746]: I0128 21:01:31.812331 4746 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="91435394-50f0-4f04-b4c5-59d4e4ef2b97" containerName="ceilometer-notification-agent" containerID="cri-o://dca8ed5b42701176f4abc868016576f4814f8f4b548975b492eb4c7d79aec7ca" gracePeriod=30
Jan 28 21:01:31 crc kubenswrapper[4746]: I0128 21:01:31.812376 4746 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="91435394-50f0-4f04-b4c5-59d4e4ef2b97" containerName="sg-core" containerID="cri-o://520cbe1e7dd00c248603510bc63540699320cc5552b9b3b63e24fa6930f29364" gracePeriod=30
Jan 28 21:01:31 crc kubenswrapper[4746]: I0128 21:01:31.843747 4746 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/ceilometer-0" podStartSLOduration=1.4595214859999999 podStartE2EDuration="5.843729055s" podCreationTimestamp="2026-01-28 21:01:26 +0000 UTC" firstStartedPulling="2026-01-28 21:01:26.953614388 +0000 UTC m=+1314.909800752" lastFinishedPulling="2026-01-28 21:01:31.337821967 +0000 UTC m=+1319.294008321" observedRunningTime="2026-01-28 21:01:31.83757051 +0000 UTC m=+1319.793756864" watchObservedRunningTime="2026-01-28 21:01:31.843729055 +0000 UTC m=+1319.799915399"
Jan 28 21:01:32 crc kubenswrapper[4746]: I0128 21:01:32.825348 4746 generic.go:334] "Generic (PLEG): container finished" podID="91435394-50f0-4f04-b4c5-59d4e4ef2b97" containerID="9cd85b3c2ebff6a9001c45f7aa4c2c6ef360b6070ec339255c54f0de6f56bf11" exitCode=0
Jan 28 21:01:32 crc kubenswrapper[4746]: I0128 21:01:32.825604 4746 generic.go:334] "Generic (PLEG): container finished" podID="91435394-50f0-4f04-b4c5-59d4e4ef2b97" containerID="520cbe1e7dd00c248603510bc63540699320cc5552b9b3b63e24fa6930f29364" exitCode=2
Jan 28 21:01:32 crc kubenswrapper[4746]: I0128 21:01:32.825613 4746 generic.go:334] "Generic (PLEG): container finished" podID="91435394-50f0-4f04-b4c5-59d4e4ef2b97" containerID="dca8ed5b42701176f4abc868016576f4814f8f4b548975b492eb4c7d79aec7ca" exitCode=0
Jan 28 21:01:32 crc kubenswrapper[4746]: I0128 21:01:32.825407 4746 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"91435394-50f0-4f04-b4c5-59d4e4ef2b97","Type":"ContainerDied","Data":"9cd85b3c2ebff6a9001c45f7aa4c2c6ef360b6070ec339255c54f0de6f56bf11"}
Jan 28 21:01:32 crc kubenswrapper[4746]: I0128 21:01:32.825645 4746 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"91435394-50f0-4f04-b4c5-59d4e4ef2b97","Type":"ContainerDied","Data":"520cbe1e7dd00c248603510bc63540699320cc5552b9b3b63e24fa6930f29364"}
Jan 28 21:01:32 crc kubenswrapper[4746]: I0128 21:01:32.825659 4746 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"91435394-50f0-4f04-b4c5-59d4e4ef2b97","Type":"ContainerDied","Data":"dca8ed5b42701176f4abc868016576f4814f8f4b548975b492eb4c7d79aec7ca"}
Jan 28 21:01:34 crc kubenswrapper[4746]: I0128 21:01:34.562992 4746 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0"
Jan 28 21:01:34 crc kubenswrapper[4746]: I0128 21:01:34.765393 4746 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/91435394-50f0-4f04-b4c5-59d4e4ef2b97-log-httpd\") pod \"91435394-50f0-4f04-b4c5-59d4e4ef2b97\" (UID: \"91435394-50f0-4f04-b4c5-59d4e4ef2b97\") "
Jan 28 21:01:34 crc kubenswrapper[4746]: I0128 21:01:34.765448 4746 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/91435394-50f0-4f04-b4c5-59d4e4ef2b97-run-httpd\") pod \"91435394-50f0-4f04-b4c5-59d4e4ef2b97\" (UID: \"91435394-50f0-4f04-b4c5-59d4e4ef2b97\") "
Jan 28 21:01:34 crc kubenswrapper[4746]: I0128 21:01:34.765507 4746 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/91435394-50f0-4f04-b4c5-59d4e4ef2b97-scripts\") pod \"91435394-50f0-4f04-b4c5-59d4e4ef2b97\" (UID: \"91435394-50f0-4f04-b4c5-59d4e4ef2b97\") "
Jan 28 21:01:34 crc kubenswrapper[4746]: I0128 21:01:34.765600 4746 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/91435394-50f0-4f04-b4c5-59d4e4ef2b97-combined-ca-bundle\") pod \"91435394-50f0-4f04-b4c5-59d4e4ef2b97\" (UID: \"91435394-50f0-4f04-b4c5-59d4e4ef2b97\") "
Jan 28 21:01:34 crc kubenswrapper[4746]: I0128 21:01:34.765673 4746 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/91435394-50f0-4f04-b4c5-59d4e4ef2b97-config-data\") pod \"91435394-50f0-4f04-b4c5-59d4e4ef2b97\" (UID: \"91435394-50f0-4f04-b4c5-59d4e4ef2b97\") "
Jan 28 21:01:34 crc kubenswrapper[4746]: I0128 21:01:34.765719 4746 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/91435394-50f0-4f04-b4c5-59d4e4ef2b97-sg-core-conf-yaml\") pod \"91435394-50f0-4f04-b4c5-59d4e4ef2b97\" (UID: \"91435394-50f0-4f04-b4c5-59d4e4ef2b97\") "
Jan 28 21:01:34 crc kubenswrapper[4746]: I0128 21:01:34.765873 4746 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-j48jg\" (UniqueName: \"kubernetes.io/projected/91435394-50f0-4f04-b4c5-59d4e4ef2b97-kube-api-access-j48jg\") pod \"91435394-50f0-4f04-b4c5-59d4e4ef2b97\" (UID: \"91435394-50f0-4f04-b4c5-59d4e4ef2b97\") "
Jan 28 21:01:34 crc kubenswrapper[4746]: I0128 21:01:34.766613 4746 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/91435394-50f0-4f04-b4c5-59d4e4ef2b97-log-httpd" (OuterVolumeSpecName: "log-httpd") pod "91435394-50f0-4f04-b4c5-59d4e4ef2b97" (UID: "91435394-50f0-4f04-b4c5-59d4e4ef2b97"). InnerVolumeSpecName "log-httpd". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Jan 28 21:01:34 crc kubenswrapper[4746]: I0128 21:01:34.766855 4746 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/91435394-50f0-4f04-b4c5-59d4e4ef2b97-run-httpd" (OuterVolumeSpecName: "run-httpd") pod "91435394-50f0-4f04-b4c5-59d4e4ef2b97" (UID: "91435394-50f0-4f04-b4c5-59d4e4ef2b97"). InnerVolumeSpecName "run-httpd". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Jan 28 21:01:34 crc kubenswrapper[4746]: I0128 21:01:34.772148 4746 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/91435394-50f0-4f04-b4c5-59d4e4ef2b97-kube-api-access-j48jg" (OuterVolumeSpecName: "kube-api-access-j48jg") pod "91435394-50f0-4f04-b4c5-59d4e4ef2b97" (UID: "91435394-50f0-4f04-b4c5-59d4e4ef2b97"). InnerVolumeSpecName "kube-api-access-j48jg". PluginName "kubernetes.io/projected", VolumeGidValue ""
Jan 28 21:01:34 crc kubenswrapper[4746]: I0128 21:01:34.772910 4746 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/91435394-50f0-4f04-b4c5-59d4e4ef2b97-scripts" (OuterVolumeSpecName: "scripts") pod "91435394-50f0-4f04-b4c5-59d4e4ef2b97" (UID: "91435394-50f0-4f04-b4c5-59d4e4ef2b97"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue ""
Jan 28 21:01:34 crc kubenswrapper[4746]: I0128 21:01:34.802884 4746 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/91435394-50f0-4f04-b4c5-59d4e4ef2b97-sg-core-conf-yaml" (OuterVolumeSpecName: "sg-core-conf-yaml") pod "91435394-50f0-4f04-b4c5-59d4e4ef2b97" (UID: "91435394-50f0-4f04-b4c5-59d4e4ef2b97"). InnerVolumeSpecName "sg-core-conf-yaml". PluginName "kubernetes.io/secret", VolumeGidValue ""
Jan 28 21:01:34 crc kubenswrapper[4746]: I0128 21:01:34.901804 4746 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-j48jg\" (UniqueName: \"kubernetes.io/projected/91435394-50f0-4f04-b4c5-59d4e4ef2b97-kube-api-access-j48jg\") on node \"crc\" DevicePath \"\""
Jan 28 21:01:34 crc kubenswrapper[4746]: I0128 21:01:34.901841 4746 reconciler_common.go:293] "Volume detached for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/91435394-50f0-4f04-b4c5-59d4e4ef2b97-log-httpd\") on node \"crc\" DevicePath \"\""
Jan 28 21:01:34 crc kubenswrapper[4746]: I0128 21:01:34.902007 4746 reconciler_common.go:293] "Volume detached for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/91435394-50f0-4f04-b4c5-59d4e4ef2b97-run-httpd\") on node \"crc\" DevicePath \"\""
Jan 28 21:01:34 crc kubenswrapper[4746]: I0128 21:01:34.902025 4746 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/91435394-50f0-4f04-b4c5-59d4e4ef2b97-scripts\") on node \"crc\" DevicePath \"\""
Jan 28 21:01:34 crc kubenswrapper[4746]: I0128 21:01:34.902037 4746 reconciler_common.go:293] "Volume detached for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/91435394-50f0-4f04-b4c5-59d4e4ef2b97-sg-core-conf-yaml\") on node \"crc\" DevicePath \"\""
Jan 28 21:01:34 crc kubenswrapper[4746]: I0128 21:01:34.950297 4746 generic.go:334] "Generic (PLEG): container finished" podID="91435394-50f0-4f04-b4c5-59d4e4ef2b97" containerID="984e2c3366ea860a2f7141209ad26d923f140dc570ad85a009043383dbd4880c" exitCode=0
Jan 28 21:01:34 crc kubenswrapper[4746]: I0128 21:01:34.950413 4746 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"91435394-50f0-4f04-b4c5-59d4e4ef2b97","Type":"ContainerDied","Data":"984e2c3366ea860a2f7141209ad26d923f140dc570ad85a009043383dbd4880c"}
Jan 28 21:01:34 crc kubenswrapper[4746]: I0128 21:01:34.950449 4746 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"91435394-50f0-4f04-b4c5-59d4e4ef2b97","Type":"ContainerDied","Data":"426f6e3f66e8188f436dce89a89fb90083929bf153d29433e5d9a7dd4d71e2c1"}
Jan 28 21:01:34 crc kubenswrapper[4746]: I0128 21:01:34.950490 4746 scope.go:117] "RemoveContainer" containerID="9cd85b3c2ebff6a9001c45f7aa4c2c6ef360b6070ec339255c54f0de6f56bf11"
Jan 28 21:01:34 crc kubenswrapper[4746]: I0128 21:01:34.950719 4746 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0"
Jan 28 21:01:34 crc kubenswrapper[4746]: I0128 21:01:34.951261 4746 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/91435394-50f0-4f04-b4c5-59d4e4ef2b97-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "91435394-50f0-4f04-b4c5-59d4e4ef2b97" (UID: "91435394-50f0-4f04-b4c5-59d4e4ef2b97"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue ""
Jan 28 21:01:34 crc kubenswrapper[4746]: I0128 21:01:34.963508 4746 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/91435394-50f0-4f04-b4c5-59d4e4ef2b97-config-data" (OuterVolumeSpecName: "config-data") pod "91435394-50f0-4f04-b4c5-59d4e4ef2b97" (UID: "91435394-50f0-4f04-b4c5-59d4e4ef2b97"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue ""
Jan 28 21:01:35 crc kubenswrapper[4746]: I0128 21:01:35.003144 4746 scope.go:117] "RemoveContainer" containerID="520cbe1e7dd00c248603510bc63540699320cc5552b9b3b63e24fa6930f29364"
Jan 28 21:01:35 crc kubenswrapper[4746]: I0128 21:01:35.039502 4746 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/91435394-50f0-4f04-b4c5-59d4e4ef2b97-combined-ca-bundle\") on node \"crc\" DevicePath \"\""
Jan 28 21:01:35 crc kubenswrapper[4746]: I0128 21:01:35.039541 4746 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/91435394-50f0-4f04-b4c5-59d4e4ef2b97-config-data\") on node \"crc\" DevicePath \"\""
Jan 28 21:01:35 crc kubenswrapper[4746]: I0128 21:01:35.062245 4746 scope.go:117] "RemoveContainer" containerID="dca8ed5b42701176f4abc868016576f4814f8f4b548975b492eb4c7d79aec7ca"
Jan 28 21:01:35 crc kubenswrapper[4746]: I0128 21:01:35.107018 4746 scope.go:117] "RemoveContainer" containerID="984e2c3366ea860a2f7141209ad26d923f140dc570ad85a009043383dbd4880c"
Jan 28 21:01:35 crc kubenswrapper[4746]: I0128 21:01:35.129512 4746 scope.go:117] "RemoveContainer" containerID="9cd85b3c2ebff6a9001c45f7aa4c2c6ef360b6070ec339255c54f0de6f56bf11"
Jan 28 21:01:35 crc kubenswrapper[4746]: E0128 21:01:35.131430 4746 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"9cd85b3c2ebff6a9001c45f7aa4c2c6ef360b6070ec339255c54f0de6f56bf11\": container with ID starting with 9cd85b3c2ebff6a9001c45f7aa4c2c6ef360b6070ec339255c54f0de6f56bf11 not found: ID does not exist" containerID="9cd85b3c2ebff6a9001c45f7aa4c2c6ef360b6070ec339255c54f0de6f56bf11"
Jan 28 21:01:35 crc kubenswrapper[4746]: I0128 21:01:35.131830 4746 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"9cd85b3c2ebff6a9001c45f7aa4c2c6ef360b6070ec339255c54f0de6f56bf11"} err="failed to get container status \"9cd85b3c2ebff6a9001c45f7aa4c2c6ef360b6070ec339255c54f0de6f56bf11\": rpc error: code = NotFound desc = could not find container \"9cd85b3c2ebff6a9001c45f7aa4c2c6ef360b6070ec339255c54f0de6f56bf11\": container with ID starting with 9cd85b3c2ebff6a9001c45f7aa4c2c6ef360b6070ec339255c54f0de6f56bf11 not found: ID does not exist"
Jan 28 21:01:35 crc kubenswrapper[4746]: I0128 21:01:35.131863 4746 scope.go:117] "RemoveContainer" containerID="520cbe1e7dd00c248603510bc63540699320cc5552b9b3b63e24fa6930f29364"
Jan 28 21:01:35 crc kubenswrapper[4746]: E0128 21:01:35.132368 4746 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"520cbe1e7dd00c248603510bc63540699320cc5552b9b3b63e24fa6930f29364\": container with ID starting with 520cbe1e7dd00c248603510bc63540699320cc5552b9b3b63e24fa6930f29364 not found: ID does not exist" containerID="520cbe1e7dd00c248603510bc63540699320cc5552b9b3b63e24fa6930f29364"
Jan 28 21:01:35 crc kubenswrapper[4746]: I0128 21:01:35.132405 4746 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"520cbe1e7dd00c248603510bc63540699320cc5552b9b3b63e24fa6930f29364"} err="failed to get container status \"520cbe1e7dd00c248603510bc63540699320cc5552b9b3b63e24fa6930f29364\": rpc error: code = NotFound desc = could not find container \"520cbe1e7dd00c248603510bc63540699320cc5552b9b3b63e24fa6930f29364\": container with ID starting with 520cbe1e7dd00c248603510bc63540699320cc5552b9b3b63e24fa6930f29364 not found: ID does not exist"
Jan 28 21:01:35 crc kubenswrapper[4746]: I0128 21:01:35.132432 4746 scope.go:117] "RemoveContainer" containerID="dca8ed5b42701176f4abc868016576f4814f8f4b548975b492eb4c7d79aec7ca"
Jan 28 21:01:35 crc kubenswrapper[4746]: E0128 21:01:35.132765 4746 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"dca8ed5b42701176f4abc868016576f4814f8f4b548975b492eb4c7d79aec7ca\": container with ID starting with dca8ed5b42701176f4abc868016576f4814f8f4b548975b492eb4c7d79aec7ca not found: ID does not exist" containerID="dca8ed5b42701176f4abc868016576f4814f8f4b548975b492eb4c7d79aec7ca"
Jan 28 21:01:35 crc kubenswrapper[4746]: I0128 21:01:35.132790 4746 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"dca8ed5b42701176f4abc868016576f4814f8f4b548975b492eb4c7d79aec7ca"} err="failed to get container status \"dca8ed5b42701176f4abc868016576f4814f8f4b548975b492eb4c7d79aec7ca\": rpc error: code = NotFound desc = could not find container \"dca8ed5b42701176f4abc868016576f4814f8f4b548975b492eb4c7d79aec7ca\": container with ID starting with dca8ed5b42701176f4abc868016576f4814f8f4b548975b492eb4c7d79aec7ca not found: ID does not exist"
Jan 28 21:01:35 crc kubenswrapper[4746]: I0128 21:01:35.132805 4746 scope.go:117] "RemoveContainer" containerID="984e2c3366ea860a2f7141209ad26d923f140dc570ad85a009043383dbd4880c"
Jan 28 21:01:35 crc kubenswrapper[4746]: E0128 21:01:35.133166 4746 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"984e2c3366ea860a2f7141209ad26d923f140dc570ad85a009043383dbd4880c\": container with ID starting with 984e2c3366ea860a2f7141209ad26d923f140dc570ad85a009043383dbd4880c not found: ID does not exist" containerID="984e2c3366ea860a2f7141209ad26d923f140dc570ad85a009043383dbd4880c"
Jan 28 21:01:35 crc kubenswrapper[4746]: I0128 21:01:35.133217 4746 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"984e2c3366ea860a2f7141209ad26d923f140dc570ad85a009043383dbd4880c"} err="failed to get container status \"984e2c3366ea860a2f7141209ad26d923f140dc570ad85a009043383dbd4880c\": rpc error: code = NotFound desc = could not find container \"984e2c3366ea860a2f7141209ad26d923f140dc570ad85a009043383dbd4880c\": container with ID starting with 984e2c3366ea860a2f7141209ad26d923f140dc570ad85a009043383dbd4880c not found: ID does not exist"
Jan 28 21:01:35 crc kubenswrapper[4746]: I0128 21:01:35.283387 4746 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ceilometer-0"]
Jan 28 21:01:35 crc kubenswrapper[4746]: I0128 21:01:35.300539 4746 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/ceilometer-0"]
Jan 28 21:01:35 crc kubenswrapper[4746]: I0128 21:01:35.310986 4746 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ceilometer-0"]
Jan 28 21:01:35 crc kubenswrapper[4746]: E0128 21:01:35.311433 4746 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="91435394-50f0-4f04-b4c5-59d4e4ef2b97" containerName="sg-core"
Jan 28 21:01:35 crc kubenswrapper[4746]: I0128 21:01:35.311448 4746 state_mem.go:107] "Deleted CPUSet assignment" podUID="91435394-50f0-4f04-b4c5-59d4e4ef2b97" containerName="sg-core"
Jan 28 21:01:35 crc kubenswrapper[4746]: E0128 21:01:35.311467 4746 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="91435394-50f0-4f04-b4c5-59d4e4ef2b97" containerName="proxy-httpd"
Jan 28 21:01:35 crc kubenswrapper[4746]: I0128 21:01:35.311474 4746 state_mem.go:107] "Deleted CPUSet assignment" podUID="91435394-50f0-4f04-b4c5-59d4e4ef2b97" containerName="proxy-httpd"
Jan 28 21:01:35 crc kubenswrapper[4746]: E0128 21:01:35.311502 4746 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="91435394-50f0-4f04-b4c5-59d4e4ef2b97" containerName="ceilometer-central-agent"
Jan 28 21:01:35 crc kubenswrapper[4746]: I0128 21:01:35.311508 4746 state_mem.go:107] "Deleted CPUSet assignment" podUID="91435394-50f0-4f04-b4c5-59d4e4ef2b97" containerName="ceilometer-central-agent"
Jan 28 21:01:35 crc kubenswrapper[4746]: E0128 21:01:35.311521 4746 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="91435394-50f0-4f04-b4c5-59d4e4ef2b97" containerName="ceilometer-notification-agent"
Jan 28 21:01:35 crc kubenswrapper[4746]: I0128 21:01:35.311527 4746 state_mem.go:107] "Deleted CPUSet assignment" podUID="91435394-50f0-4f04-b4c5-59d4e4ef2b97" containerName="ceilometer-notification-agent"
Jan 28 21:01:35 crc kubenswrapper[4746]: I0128 21:01:35.311766 4746 memory_manager.go:354] "RemoveStaleState removing state" podUID="91435394-50f0-4f04-b4c5-59d4e4ef2b97" containerName="sg-core"
Jan 28 21:01:35 crc kubenswrapper[4746]: I0128 21:01:35.311782 4746 memory_manager.go:354] "RemoveStaleState removing state" podUID="91435394-50f0-4f04-b4c5-59d4e4ef2b97" containerName="proxy-httpd"
Jan 28 21:01:35 crc kubenswrapper[4746]: I0128 21:01:35.311815 4746 memory_manager.go:354] "RemoveStaleState removing state" podUID="91435394-50f0-4f04-b4c5-59d4e4ef2b97" containerName="ceilometer-notification-agent"
Jan 28 21:01:35 crc kubenswrapper[4746]: I0128 21:01:35.311832 4746 memory_manager.go:354] "RemoveStaleState removing state" podUID="91435394-50f0-4f04-b4c5-59d4e4ef2b97" containerName="ceilometer-central-agent"
Jan 28 21:01:35 crc kubenswrapper[4746]: I0128 21:01:35.313788 4746 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0"
Jan 28 21:01:35 crc kubenswrapper[4746]: I0128 21:01:35.317097 4746 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceilometer-scripts"
Jan 28 21:01:35 crc kubenswrapper[4746]: I0128 21:01:35.321735 4746 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceilometer-config-data"
Jan 28 21:01:35 crc kubenswrapper[4746]: I0128 21:01:35.332444 4746 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ceilometer-0"]
Jan 28 21:01:35 crc kubenswrapper[4746]: I0128 21:01:35.346461 4746 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/ac700cb5-2363-4223-b64c-ce8b49fede0b-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"ac700cb5-2363-4223-b64c-ce8b49fede0b\") " pod="openstack/ceilometer-0"
Jan 28 21:01:35 crc kubenswrapper[4746]: I0128 21:01:35.346547 4746 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/ac700cb5-2363-4223-b64c-ce8b49fede0b-config-data\") pod \"ceilometer-0\" (UID: \"ac700cb5-2363-4223-b64c-ce8b49fede0b\") " pod="openstack/ceilometer-0"
Jan 28 21:01:35 crc kubenswrapper[4746]: I0128 21:01:35.346584 4746 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-whbw7\" (UniqueName: \"kubernetes.io/projected/ac700cb5-2363-4223-b64c-ce8b49fede0b-kube-api-access-whbw7\") pod \"ceilometer-0\" (UID: \"ac700cb5-2363-4223-b64c-ce8b49fede0b\") " pod="openstack/ceilometer-0"
Jan 28 21:01:35 crc kubenswrapper[4746]: I0128 21:01:35.346616 4746 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/ac700cb5-2363-4223-b64c-ce8b49fede0b-run-httpd\") pod \"ceilometer-0\" (UID: \"ac700cb5-2363-4223-b64c-ce8b49fede0b\") " pod="openstack/ceilometer-0"
Jan 28 21:01:35 crc kubenswrapper[4746]: I0128 21:01:35.346651 4746 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/ac700cb5-2363-4223-b64c-ce8b49fede0b-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"ac700cb5-2363-4223-b64c-ce8b49fede0b\") " pod="openstack/ceilometer-0"
Jan 28 21:01:35 crc kubenswrapper[4746]: I0128 21:01:35.346705 4746 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/ac700cb5-2363-4223-b64c-ce8b49fede0b-log-httpd\") pod \"ceilometer-0\" (UID: \"ac700cb5-2363-4223-b64c-ce8b49fede0b\") " pod="openstack/ceilometer-0"
Jan 28 21:01:35 crc kubenswrapper[4746]: I0128 21:01:35.346739 4746 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/ac700cb5-2363-4223-b64c-ce8b49fede0b-scripts\") pod \"ceilometer-0\" (UID: \"ac700cb5-2363-4223-b64c-ce8b49fede0b\") " pod="openstack/ceilometer-0"
Jan 28 21:01:35 crc kubenswrapper[4746]: I0128 21:01:35.448509 4746 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/ac700cb5-2363-4223-b64c-ce8b49fede0b-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"ac700cb5-2363-4223-b64c-ce8b49fede0b\") " pod="openstack/ceilometer-0"
Jan 28 21:01:35 crc kubenswrapper[4746]: I0128 21:01:35.448577 4746 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/ac700cb5-2363-4223-b64c-ce8b49fede0b-config-data\") pod \"ceilometer-0\" (UID: \"ac700cb5-2363-4223-b64c-ce8b49fede0b\") " pod="openstack/ceilometer-0"
Jan 28 21:01:35 crc kubenswrapper[4746]: I0128 21:01:35.448606 4746 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-whbw7\" (UniqueName: \"kubernetes.io/projected/ac700cb5-2363-4223-b64c-ce8b49fede0b-kube-api-access-whbw7\") pod \"ceilometer-0\" (UID: \"ac700cb5-2363-4223-b64c-ce8b49fede0b\") " pod="openstack/ceilometer-0"
Jan 28 21:01:35 crc kubenswrapper[4746]: I0128 21:01:35.448633 4746 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/ac700cb5-2363-4223-b64c-ce8b49fede0b-run-httpd\") pod \"ceilometer-0\" (UID: \"ac700cb5-2363-4223-b64c-ce8b49fede0b\") " pod="openstack/ceilometer-0"
Jan 28 21:01:35 crc kubenswrapper[4746]: I0128 21:01:35.448661 4746 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/ac700cb5-2363-4223-b64c-ce8b49fede0b-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"ac700cb5-2363-4223-b64c-ce8b49fede0b\") " pod="openstack/ceilometer-0"
Jan 28 21:01:35 crc kubenswrapper[4746]: I0128 21:01:35.448704 4746 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/ac700cb5-2363-4223-b64c-ce8b49fede0b-log-httpd\") pod \"ceilometer-0\" (UID: \"ac700cb5-2363-4223-b64c-ce8b49fede0b\") " pod="openstack/ceilometer-0"
Jan 28 21:01:35 crc kubenswrapper[4746]: I0128 21:01:35.448729 4746 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/ac700cb5-2363-4223-b64c-ce8b49fede0b-scripts\") pod \"ceilometer-0\" (UID: \"ac700cb5-2363-4223-b64c-ce8b49fede0b\") " pod="openstack/ceilometer-0"
Jan 28 21:01:35 crc kubenswrapper[4746]: I0128 21:01:35.449064 4746 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/ac700cb5-2363-4223-b64c-ce8b49fede0b-run-httpd\") pod \"ceilometer-0\" (UID: \"ac700cb5-2363-4223-b64c-ce8b49fede0b\") " pod="openstack/ceilometer-0"
Jan 28 21:01:35 crc kubenswrapper[4746]: I0128 21:01:35.449434 4746 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/ac700cb5-2363-4223-b64c-ce8b49fede0b-log-httpd\") pod \"ceilometer-0\" (UID: \"ac700cb5-2363-4223-b64c-ce8b49fede0b\") " pod="openstack/ceilometer-0"
Jan 28 21:01:35 crc kubenswrapper[4746]: I0128 21:01:35.453882 4746 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/ac700cb5-2363-4223-b64c-ce8b49fede0b-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"ac700cb5-2363-4223-b64c-ce8b49fede0b\") " pod="openstack/ceilometer-0"
Jan 28 21:01:35 crc kubenswrapper[4746]: I0128 21:01:35.454504 4746 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/ac700cb5-2363-4223-b64c-ce8b49fede0b-scripts\") pod \"ceilometer-0\" (UID: \"ac700cb5-2363-4223-b64c-ce8b49fede0b\") " pod="openstack/ceilometer-0"
Jan 28 21:01:35 crc kubenswrapper[4746]: I0128 21:01:35.461653 4746 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/ac700cb5-2363-4223-b64c-ce8b49fede0b-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"ac700cb5-2363-4223-b64c-ce8b49fede0b\") " pod="openstack/ceilometer-0"
Jan 28 21:01:35 crc kubenswrapper[4746]: I0128 21:01:35.462147 4746 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/ac700cb5-2363-4223-b64c-ce8b49fede0b-config-data\") pod \"ceilometer-0\" (UID: \"ac700cb5-2363-4223-b64c-ce8b49fede0b\") " pod="openstack/ceilometer-0"
Jan 28 21:01:35 crc kubenswrapper[4746]: I0128 21:01:35.470760 4746 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-whbw7\" (UniqueName: \"kubernetes.io/projected/ac700cb5-2363-4223-b64c-ce8b49fede0b-kube-api-access-whbw7\") pod \"ceilometer-0\" (UID: \"ac700cb5-2363-4223-b64c-ce8b49fede0b\") " pod="openstack/ceilometer-0"
Jan 28 21:01:35 crc kubenswrapper[4746]: I0128 21:01:35.633233 4746 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0"
Jan 28 21:01:36 crc kubenswrapper[4746]: I0128 21:01:36.155477 4746 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ceilometer-0"]
Jan 28 21:01:36 crc kubenswrapper[4746]: W0128 21:01:36.166286 4746 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podac700cb5_2363_4223_b64c_ce8b49fede0b.slice/crio-8f6a16304a1ca2382cc2d1481ec8f61659e597e6d46a76531d94662a5e48e810 WatchSource:0}: Error finding container 8f6a16304a1ca2382cc2d1481ec8f61659e597e6d46a76531d94662a5e48e810: Status 404 returned error can't find the container with id 8f6a16304a1ca2382cc2d1481ec8f61659e597e6d46a76531d94662a5e48e810
Jan 28 21:01:36 crc kubenswrapper[4746]: I0128 21:01:36.851824 4746 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="91435394-50f0-4f04-b4c5-59d4e4ef2b97" path="/var/lib/kubelet/pods/91435394-50f0-4f04-b4c5-59d4e4ef2b97/volumes"
Jan 28 21:01:36 crc kubenswrapper[4746]: I0128 21:01:36.977283 4746 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"ac700cb5-2363-4223-b64c-ce8b49fede0b","Type":"ContainerStarted","Data":"8f6a16304a1ca2382cc2d1481ec8f61659e597e6d46a76531d94662a5e48e810"}
Jan 28 21:01:38 crc kubenswrapper[4746]: I0128 21:01:38.004523 4746 generic.go:334] "Generic (PLEG): container finished" podID="28e5e616-1a83-4053-8231-e3763118ca8e" containerID="74048ad3db6523e71f273f6b209d7e2631b278f3167e76f7d0ff1af092ac027f" exitCode=0
Jan 28 21:01:38 crc kubenswrapper[4746]: I0128 21:01:38.004775 4746 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-conductor-db-sync-z7sfr" event={"ID":"28e5e616-1a83-4053-8231-e3763118ca8e","Type":"ContainerDied","Data":"74048ad3db6523e71f273f6b209d7e2631b278f3167e76f7d0ff1af092ac027f"}
Jan 28 21:01:38 crc kubenswrapper[4746]: I0128 21:01:38.007872 4746 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"ac700cb5-2363-4223-b64c-ce8b49fede0b","Type":"ContainerStarted","Data":"35ba04f239fe059c515dd7957ca078df99c49e377b8595092ddd274dc8857306"}
Jan 28 21:01:38 crc kubenswrapper[4746]: I0128 21:01:38.007933 4746 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"ac700cb5-2363-4223-b64c-ce8b49fede0b","Type":"ContainerStarted","Data":"d1d3cc6d86c3875422bd9d16643f59c264f59e5f36baa5e59642407acc0d04a8"}
Jan 28 21:01:39 crc kubenswrapper[4746]: I0128 21:01:39.020305 4746 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"ac700cb5-2363-4223-b64c-ce8b49fede0b","Type":"ContainerStarted","Data":"425b35e63efc0aa88654bf7621c444e520b92e6d39c47e8f77238502bd74fb9a"}
Jan 28 21:01:39 crc kubenswrapper[4746]: I0128 21:01:39.484342 4746 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell0-conductor-db-sync-z7sfr"
Jan 28 21:01:39 crc kubenswrapper[4746]: I0128 21:01:39.535131 4746 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/28e5e616-1a83-4053-8231-e3763118ca8e-combined-ca-bundle\") pod \"28e5e616-1a83-4053-8231-e3763118ca8e\" (UID: \"28e5e616-1a83-4053-8231-e3763118ca8e\") "
Jan 28 21:01:39 crc kubenswrapper[4746]: I0128 21:01:39.535199 4746 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/28e5e616-1a83-4053-8231-e3763118ca8e-scripts\") pod \"28e5e616-1a83-4053-8231-e3763118ca8e\" (UID: \"28e5e616-1a83-4053-8231-e3763118ca8e\") "
Jan 28 21:01:39 crc kubenswrapper[4746]: I0128 21:01:39.535296 4746 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-7c44n\" (UniqueName: \"kubernetes.io/projected/28e5e616-1a83-4053-8231-e3763118ca8e-kube-api-access-7c44n\") pod \"28e5e616-1a83-4053-8231-e3763118ca8e\" (UID: \"28e5e616-1a83-4053-8231-e3763118ca8e\") "
Jan 28 21:01:39 crc kubenswrapper[4746]: I0128 21:01:39.535422 4746 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/28e5e616-1a83-4053-8231-e3763118ca8e-config-data\") pod \"28e5e616-1a83-4053-8231-e3763118ca8e\" (UID: \"28e5e616-1a83-4053-8231-e3763118ca8e\") "
Jan 28 21:01:39 crc kubenswrapper[4746]: I0128 21:01:39.542265 4746 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/28e5e616-1a83-4053-8231-e3763118ca8e-scripts" (OuterVolumeSpecName: "scripts") pod "28e5e616-1a83-4053-8231-e3763118ca8e" (UID: "28e5e616-1a83-4053-8231-e3763118ca8e"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue ""
Jan 28 21:01:39 crc kubenswrapper[4746]: I0128 21:01:39.548427 4746 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/28e5e616-1a83-4053-8231-e3763118ca8e-kube-api-access-7c44n" (OuterVolumeSpecName: "kube-api-access-7c44n") pod "28e5e616-1a83-4053-8231-e3763118ca8e" (UID: "28e5e616-1a83-4053-8231-e3763118ca8e"). InnerVolumeSpecName "kube-api-access-7c44n". PluginName "kubernetes.io/projected", VolumeGidValue ""
Jan 28 21:01:39 crc kubenswrapper[4746]: I0128 21:01:39.578322 4746 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/28e5e616-1a83-4053-8231-e3763118ca8e-config-data" (OuterVolumeSpecName: "config-data") pod "28e5e616-1a83-4053-8231-e3763118ca8e" (UID: "28e5e616-1a83-4053-8231-e3763118ca8e"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue ""
Jan 28 21:01:39 crc kubenswrapper[4746]: I0128 21:01:39.598689 4746 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/28e5e616-1a83-4053-8231-e3763118ca8e-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "28e5e616-1a83-4053-8231-e3763118ca8e" (UID: "28e5e616-1a83-4053-8231-e3763118ca8e"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue ""
Jan 28 21:01:39 crc kubenswrapper[4746]: I0128 21:01:39.637545 4746 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/28e5e616-1a83-4053-8231-e3763118ca8e-combined-ca-bundle\") on node \"crc\" DevicePath \"\""
Jan 28 21:01:39 crc kubenswrapper[4746]: I0128 21:01:39.637578 4746 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/28e5e616-1a83-4053-8231-e3763118ca8e-scripts\") on node \"crc\" DevicePath \"\""
Jan 28 21:01:39 crc kubenswrapper[4746]: I0128 21:01:39.637588 4746 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-7c44n\" (UniqueName: \"kubernetes.io/projected/28e5e616-1a83-4053-8231-e3763118ca8e-kube-api-access-7c44n\") on node \"crc\" DevicePath \"\""
Jan 28 21:01:39 crc kubenswrapper[4746]: I0128 21:01:39.637597 4746 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/28e5e616-1a83-4053-8231-e3763118ca8e-config-data\") on node \"crc\" DevicePath \"\""
Jan 28 21:01:40 crc kubenswrapper[4746]: I0128 21:01:40.033745 4746 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-conductor-db-sync-z7sfr" event={"ID":"28e5e616-1a83-4053-8231-e3763118ca8e","Type":"ContainerDied","Data":"663482718d34b1c50485ff975b8d342a767b3370fe90f6b001d58cfd12b29e89"}
Jan 28 21:01:40 crc kubenswrapper[4746]: I0128 21:01:40.033806 4746 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="663482718d34b1c50485ff975b8d342a767b3370fe90f6b001d58cfd12b29e89"
Jan 28 21:01:40 crc kubenswrapper[4746]: I0128 21:01:40.033878 4746 util.go:48] "No ready sandbox for pod can be found.
Need to start a new one" pod="openstack/nova-cell0-conductor-db-sync-z7sfr" Jan 28 21:01:40 crc kubenswrapper[4746]: I0128 21:01:40.123295 4746 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-cell0-conductor-0"] Jan 28 21:01:40 crc kubenswrapper[4746]: E0128 21:01:40.123855 4746 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="28e5e616-1a83-4053-8231-e3763118ca8e" containerName="nova-cell0-conductor-db-sync" Jan 28 21:01:40 crc kubenswrapper[4746]: I0128 21:01:40.123886 4746 state_mem.go:107] "Deleted CPUSet assignment" podUID="28e5e616-1a83-4053-8231-e3763118ca8e" containerName="nova-cell0-conductor-db-sync" Jan 28 21:01:40 crc kubenswrapper[4746]: I0128 21:01:40.124064 4746 memory_manager.go:354] "RemoveStaleState removing state" podUID="28e5e616-1a83-4053-8231-e3763118ca8e" containerName="nova-cell0-conductor-db-sync" Jan 28 21:01:40 crc kubenswrapper[4746]: I0128 21:01:40.124821 4746 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell0-conductor-0" Jan 28 21:01:40 crc kubenswrapper[4746]: I0128 21:01:40.126599 4746 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-nova-dockercfg-9m4cq" Jan 28 21:01:40 crc kubenswrapper[4746]: I0128 21:01:40.126990 4746 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-cell0-conductor-config-data" Jan 28 21:01:40 crc kubenswrapper[4746]: I0128 21:01:40.137916 4746 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell0-conductor-0"] Jan 28 21:01:40 crc kubenswrapper[4746]: I0128 21:01:40.249425 4746 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/7933917c-bf24-4c2b-b37e-dae93337cfa3-combined-ca-bundle\") pod \"nova-cell0-conductor-0\" (UID: \"7933917c-bf24-4c2b-b37e-dae93337cfa3\") " pod="openstack/nova-cell0-conductor-0" Jan 28 21:01:40 crc kubenswrapper[4746]: 
I0128 21:01:40.249815 4746 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-pjr6c\" (UniqueName: \"kubernetes.io/projected/7933917c-bf24-4c2b-b37e-dae93337cfa3-kube-api-access-pjr6c\") pod \"nova-cell0-conductor-0\" (UID: \"7933917c-bf24-4c2b-b37e-dae93337cfa3\") " pod="openstack/nova-cell0-conductor-0" Jan 28 21:01:40 crc kubenswrapper[4746]: I0128 21:01:40.249977 4746 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/7933917c-bf24-4c2b-b37e-dae93337cfa3-config-data\") pod \"nova-cell0-conductor-0\" (UID: \"7933917c-bf24-4c2b-b37e-dae93337cfa3\") " pod="openstack/nova-cell0-conductor-0" Jan 28 21:01:40 crc kubenswrapper[4746]: I0128 21:01:40.351946 4746 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/7933917c-bf24-4c2b-b37e-dae93337cfa3-config-data\") pod \"nova-cell0-conductor-0\" (UID: \"7933917c-bf24-4c2b-b37e-dae93337cfa3\") " pod="openstack/nova-cell0-conductor-0" Jan 28 21:01:40 crc kubenswrapper[4746]: I0128 21:01:40.352026 4746 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/7933917c-bf24-4c2b-b37e-dae93337cfa3-combined-ca-bundle\") pod \"nova-cell0-conductor-0\" (UID: \"7933917c-bf24-4c2b-b37e-dae93337cfa3\") " pod="openstack/nova-cell0-conductor-0" Jan 28 21:01:40 crc kubenswrapper[4746]: I0128 21:01:40.352150 4746 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-pjr6c\" (UniqueName: \"kubernetes.io/projected/7933917c-bf24-4c2b-b37e-dae93337cfa3-kube-api-access-pjr6c\") pod \"nova-cell0-conductor-0\" (UID: \"7933917c-bf24-4c2b-b37e-dae93337cfa3\") " pod="openstack/nova-cell0-conductor-0" Jan 28 21:01:40 crc kubenswrapper[4746]: I0128 21:01:40.357339 4746 operation_generator.go:637] 
"MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/7933917c-bf24-4c2b-b37e-dae93337cfa3-config-data\") pod \"nova-cell0-conductor-0\" (UID: \"7933917c-bf24-4c2b-b37e-dae93337cfa3\") " pod="openstack/nova-cell0-conductor-0" Jan 28 21:01:40 crc kubenswrapper[4746]: I0128 21:01:40.357800 4746 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/7933917c-bf24-4c2b-b37e-dae93337cfa3-combined-ca-bundle\") pod \"nova-cell0-conductor-0\" (UID: \"7933917c-bf24-4c2b-b37e-dae93337cfa3\") " pod="openstack/nova-cell0-conductor-0" Jan 28 21:01:40 crc kubenswrapper[4746]: I0128 21:01:40.369561 4746 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-pjr6c\" (UniqueName: \"kubernetes.io/projected/7933917c-bf24-4c2b-b37e-dae93337cfa3-kube-api-access-pjr6c\") pod \"nova-cell0-conductor-0\" (UID: \"7933917c-bf24-4c2b-b37e-dae93337cfa3\") " pod="openstack/nova-cell0-conductor-0" Jan 28 21:01:40 crc kubenswrapper[4746]: I0128 21:01:40.444833 4746 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell0-conductor-0" Jan 28 21:01:40 crc kubenswrapper[4746]: I0128 21:01:40.925538 4746 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell0-conductor-0"] Jan 28 21:01:41 crc kubenswrapper[4746]: I0128 21:01:41.085427 4746 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-conductor-0" event={"ID":"7933917c-bf24-4c2b-b37e-dae93337cfa3","Type":"ContainerStarted","Data":"73a7a43ea9e0f7d69e4bf1ae2d23d3b178a286cffddbf3329ddec1f6e8030d55"} Jan 28 21:01:42 crc kubenswrapper[4746]: I0128 21:01:42.118886 4746 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-conductor-0" event={"ID":"7933917c-bf24-4c2b-b37e-dae93337cfa3","Type":"ContainerStarted","Data":"0955cbe58489207a2e79b170dbe9a441f0c9db3e67f89f2a307ceac5ef88540a"} Jan 28 21:01:42 crc kubenswrapper[4746]: I0128 21:01:42.119222 4746 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-cell0-conductor-0" Jan 28 21:01:42 crc kubenswrapper[4746]: I0128 21:01:42.125061 4746 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"ac700cb5-2363-4223-b64c-ce8b49fede0b","Type":"ContainerStarted","Data":"fa5dd243bdbdaf9c1a3e8d89b06890ad683eba38bc2d27074ffd06017c2a78d8"} Jan 28 21:01:42 crc kubenswrapper[4746]: I0128 21:01:42.125971 4746 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/ceilometer-0" Jan 28 21:01:42 crc kubenswrapper[4746]: I0128 21:01:42.137055 4746 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-cell0-conductor-0" podStartSLOduration=2.137026062 podStartE2EDuration="2.137026062s" podCreationTimestamp="2026-01-28 21:01:40 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-28 21:01:42.135890291 +0000 UTC m=+1330.092076635" watchObservedRunningTime="2026-01-28 
21:01:42.137026062 +0000 UTC m=+1330.093212416" Jan 28 21:01:42 crc kubenswrapper[4746]: I0128 21:01:42.172822 4746 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/ceilometer-0" podStartSLOduration=1.609750676 podStartE2EDuration="7.172801044s" podCreationTimestamp="2026-01-28 21:01:35 +0000 UTC" firstStartedPulling="2026-01-28 21:01:36.17123052 +0000 UTC m=+1324.127416864" lastFinishedPulling="2026-01-28 21:01:41.734280878 +0000 UTC m=+1329.690467232" observedRunningTime="2026-01-28 21:01:42.154669266 +0000 UTC m=+1330.110855620" watchObservedRunningTime="2026-01-28 21:01:42.172801044 +0000 UTC m=+1330.128987388" Jan 28 21:01:46 crc kubenswrapper[4746]: I0128 21:01:46.094657 4746 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-cell0-conductor-0"] Jan 28 21:01:46 crc kubenswrapper[4746]: I0128 21:01:46.095391 4746 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-cell0-conductor-0" podUID="7933917c-bf24-4c2b-b37e-dae93337cfa3" containerName="nova-cell0-conductor-conductor" containerID="cri-o://0955cbe58489207a2e79b170dbe9a441f0c9db3e67f89f2a307ceac5ef88540a" gracePeriod=30 Jan 28 21:01:46 crc kubenswrapper[4746]: I0128 21:01:46.127163 4746 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/nova-cell0-conductor-0" Jan 28 21:01:46 crc kubenswrapper[4746]: E0128 21:01:46.993538 4746 cadvisor_stats_provider.go:516] "Partial failure issuing cadvisor.ContainerInfoV2" err="partial failures: [\"/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod7933917c_bf24_4c2b_b37e_dae93337cfa3.slice/crio-conmon-0955cbe58489207a2e79b170dbe9a441f0c9db3e67f89f2a307ceac5ef88540a.scope\": RecentStats: unable to find data in memory cache]" Jan 28 21:01:47 crc kubenswrapper[4746]: I0128 21:01:47.207319 4746 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-cell0-cell-mapping-28g8p"] Jan 28 21:01:47 crc kubenswrapper[4746]: I0128 
21:01:47.209421 4746 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell0-cell-mapping-28g8p" Jan 28 21:01:47 crc kubenswrapper[4746]: I0128 21:01:47.218318 4746 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-cell0-manage-scripts" Jan 28 21:01:47 crc kubenswrapper[4746]: I0128 21:01:47.231808 4746 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-cell0-manage-config-data" Jan 28 21:01:47 crc kubenswrapper[4746]: I0128 21:01:47.240733 4746 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-c9krz\" (UniqueName: \"kubernetes.io/projected/8b92f820-9bba-4102-b4ee-1c541c3a05d7-kube-api-access-c9krz\") pod \"nova-cell0-cell-mapping-28g8p\" (UID: \"8b92f820-9bba-4102-b4ee-1c541c3a05d7\") " pod="openstack/nova-cell0-cell-mapping-28g8p" Jan 28 21:01:47 crc kubenswrapper[4746]: I0128 21:01:47.240829 4746 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/8b92f820-9bba-4102-b4ee-1c541c3a05d7-config-data\") pod \"nova-cell0-cell-mapping-28g8p\" (UID: \"8b92f820-9bba-4102-b4ee-1c541c3a05d7\") " pod="openstack/nova-cell0-cell-mapping-28g8p" Jan 28 21:01:47 crc kubenswrapper[4746]: I0128 21:01:47.240895 4746 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/8b92f820-9bba-4102-b4ee-1c541c3a05d7-combined-ca-bundle\") pod \"nova-cell0-cell-mapping-28g8p\" (UID: \"8b92f820-9bba-4102-b4ee-1c541c3a05d7\") " pod="openstack/nova-cell0-cell-mapping-28g8p" Jan 28 21:01:47 crc kubenswrapper[4746]: I0128 21:01:47.240913 4746 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/8b92f820-9bba-4102-b4ee-1c541c3a05d7-scripts\") pod 
\"nova-cell0-cell-mapping-28g8p\" (UID: \"8b92f820-9bba-4102-b4ee-1c541c3a05d7\") " pod="openstack/nova-cell0-cell-mapping-28g8p" Jan 28 21:01:47 crc kubenswrapper[4746]: I0128 21:01:47.267687 4746 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell0-cell-mapping-28g8p"] Jan 28 21:01:47 crc kubenswrapper[4746]: I0128 21:01:47.285273 4746 generic.go:334] "Generic (PLEG): container finished" podID="7933917c-bf24-4c2b-b37e-dae93337cfa3" containerID="0955cbe58489207a2e79b170dbe9a441f0c9db3e67f89f2a307ceac5ef88540a" exitCode=0 Jan 28 21:01:47 crc kubenswrapper[4746]: I0128 21:01:47.285346 4746 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-conductor-0" event={"ID":"7933917c-bf24-4c2b-b37e-dae93337cfa3","Type":"ContainerDied","Data":"0955cbe58489207a2e79b170dbe9a441f0c9db3e67f89f2a307ceac5ef88540a"} Jan 28 21:01:47 crc kubenswrapper[4746]: I0128 21:01:47.343255 4746 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/8b92f820-9bba-4102-b4ee-1c541c3a05d7-config-data\") pod \"nova-cell0-cell-mapping-28g8p\" (UID: \"8b92f820-9bba-4102-b4ee-1c541c3a05d7\") " pod="openstack/nova-cell0-cell-mapping-28g8p" Jan 28 21:01:47 crc kubenswrapper[4746]: I0128 21:01:47.343351 4746 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/8b92f820-9bba-4102-b4ee-1c541c3a05d7-combined-ca-bundle\") pod \"nova-cell0-cell-mapping-28g8p\" (UID: \"8b92f820-9bba-4102-b4ee-1c541c3a05d7\") " pod="openstack/nova-cell0-cell-mapping-28g8p" Jan 28 21:01:47 crc kubenswrapper[4746]: I0128 21:01:47.343369 4746 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/8b92f820-9bba-4102-b4ee-1c541c3a05d7-scripts\") pod \"nova-cell0-cell-mapping-28g8p\" (UID: \"8b92f820-9bba-4102-b4ee-1c541c3a05d7\") " 
pod="openstack/nova-cell0-cell-mapping-28g8p" Jan 28 21:01:47 crc kubenswrapper[4746]: I0128 21:01:47.343450 4746 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-c9krz\" (UniqueName: \"kubernetes.io/projected/8b92f820-9bba-4102-b4ee-1c541c3a05d7-kube-api-access-c9krz\") pod \"nova-cell0-cell-mapping-28g8p\" (UID: \"8b92f820-9bba-4102-b4ee-1c541c3a05d7\") " pod="openstack/nova-cell0-cell-mapping-28g8p" Jan 28 21:01:47 crc kubenswrapper[4746]: I0128 21:01:47.357150 4746 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/8b92f820-9bba-4102-b4ee-1c541c3a05d7-config-data\") pod \"nova-cell0-cell-mapping-28g8p\" (UID: \"8b92f820-9bba-4102-b4ee-1c541c3a05d7\") " pod="openstack/nova-cell0-cell-mapping-28g8p" Jan 28 21:01:47 crc kubenswrapper[4746]: I0128 21:01:47.358728 4746 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/8b92f820-9bba-4102-b4ee-1c541c3a05d7-combined-ca-bundle\") pod \"nova-cell0-cell-mapping-28g8p\" (UID: \"8b92f820-9bba-4102-b4ee-1c541c3a05d7\") " pod="openstack/nova-cell0-cell-mapping-28g8p" Jan 28 21:01:47 crc kubenswrapper[4746]: I0128 21:01:47.366595 4746 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/8b92f820-9bba-4102-b4ee-1c541c3a05d7-scripts\") pod \"nova-cell0-cell-mapping-28g8p\" (UID: \"8b92f820-9bba-4102-b4ee-1c541c3a05d7\") " pod="openstack/nova-cell0-cell-mapping-28g8p" Jan 28 21:01:47 crc kubenswrapper[4746]: I0128 21:01:47.425142 4746 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-c9krz\" (UniqueName: \"kubernetes.io/projected/8b92f820-9bba-4102-b4ee-1c541c3a05d7-kube-api-access-c9krz\") pod \"nova-cell0-cell-mapping-28g8p\" (UID: \"8b92f820-9bba-4102-b4ee-1c541c3a05d7\") " pod="openstack/nova-cell0-cell-mapping-28g8p" Jan 28 21:01:47 crc 
kubenswrapper[4746]: I0128 21:01:47.490339 4746 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-api-0"] Jan 28 21:01:47 crc kubenswrapper[4746]: I0128 21:01:47.492725 4746 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-api-0" Jan 28 21:01:47 crc kubenswrapper[4746]: I0128 21:01:47.526014 4746 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-api-config-data" Jan 28 21:01:47 crc kubenswrapper[4746]: I0128 21:01:47.548742 4746 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-api-0"] Jan 28 21:01:47 crc kubenswrapper[4746]: I0128 21:01:47.552705 4746 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/6509be40-b5da-4c87-bf0d-8b6a75084e60-logs\") pod \"nova-api-0\" (UID: \"6509be40-b5da-4c87-bf0d-8b6a75084e60\") " pod="openstack/nova-api-0" Jan 28 21:01:47 crc kubenswrapper[4746]: I0128 21:01:47.552783 4746 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-jjzsd\" (UniqueName: \"kubernetes.io/projected/6509be40-b5da-4c87-bf0d-8b6a75084e60-kube-api-access-jjzsd\") pod \"nova-api-0\" (UID: \"6509be40-b5da-4c87-bf0d-8b6a75084e60\") " pod="openstack/nova-api-0" Jan 28 21:01:47 crc kubenswrapper[4746]: I0128 21:01:47.552809 4746 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/6509be40-b5da-4c87-bf0d-8b6a75084e60-config-data\") pod \"nova-api-0\" (UID: \"6509be40-b5da-4c87-bf0d-8b6a75084e60\") " pod="openstack/nova-api-0" Jan 28 21:01:47 crc kubenswrapper[4746]: I0128 21:01:47.552824 4746 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/6509be40-b5da-4c87-bf0d-8b6a75084e60-combined-ca-bundle\") pod 
\"nova-api-0\" (UID: \"6509be40-b5da-4c87-bf0d-8b6a75084e60\") " pod="openstack/nova-api-0" Jan 28 21:01:47 crc kubenswrapper[4746]: I0128 21:01:47.560112 4746 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell0-cell-mapping-28g8p" Jan 28 21:01:47 crc kubenswrapper[4746]: I0128 21:01:47.560673 4746 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell0-conductor-0" Jan 28 21:01:47 crc kubenswrapper[4746]: I0128 21:01:47.653892 4746 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-pjr6c\" (UniqueName: \"kubernetes.io/projected/7933917c-bf24-4c2b-b37e-dae93337cfa3-kube-api-access-pjr6c\") pod \"7933917c-bf24-4c2b-b37e-dae93337cfa3\" (UID: \"7933917c-bf24-4c2b-b37e-dae93337cfa3\") " Jan 28 21:01:47 crc kubenswrapper[4746]: I0128 21:01:47.653971 4746 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/7933917c-bf24-4c2b-b37e-dae93337cfa3-config-data\") pod \"7933917c-bf24-4c2b-b37e-dae93337cfa3\" (UID: \"7933917c-bf24-4c2b-b37e-dae93337cfa3\") " Jan 28 21:01:47 crc kubenswrapper[4746]: I0128 21:01:47.654062 4746 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/7933917c-bf24-4c2b-b37e-dae93337cfa3-combined-ca-bundle\") pod \"7933917c-bf24-4c2b-b37e-dae93337cfa3\" (UID: \"7933917c-bf24-4c2b-b37e-dae93337cfa3\") " Jan 28 21:01:47 crc kubenswrapper[4746]: I0128 21:01:47.654471 4746 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/6509be40-b5da-4c87-bf0d-8b6a75084e60-logs\") pod \"nova-api-0\" (UID: \"6509be40-b5da-4c87-bf0d-8b6a75084e60\") " pod="openstack/nova-api-0" Jan 28 21:01:47 crc kubenswrapper[4746]: I0128 21:01:47.654563 4746 reconciler_common.go:218] "operationExecutor.MountVolume started 
for volume \"kube-api-access-jjzsd\" (UniqueName: \"kubernetes.io/projected/6509be40-b5da-4c87-bf0d-8b6a75084e60-kube-api-access-jjzsd\") pod \"nova-api-0\" (UID: \"6509be40-b5da-4c87-bf0d-8b6a75084e60\") " pod="openstack/nova-api-0" Jan 28 21:01:47 crc kubenswrapper[4746]: I0128 21:01:47.654598 4746 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/6509be40-b5da-4c87-bf0d-8b6a75084e60-config-data\") pod \"nova-api-0\" (UID: \"6509be40-b5da-4c87-bf0d-8b6a75084e60\") " pod="openstack/nova-api-0" Jan 28 21:01:47 crc kubenswrapper[4746]: I0128 21:01:47.654620 4746 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/6509be40-b5da-4c87-bf0d-8b6a75084e60-combined-ca-bundle\") pod \"nova-api-0\" (UID: \"6509be40-b5da-4c87-bf0d-8b6a75084e60\") " pod="openstack/nova-api-0" Jan 28 21:01:47 crc kubenswrapper[4746]: I0128 21:01:47.660490 4746 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/6509be40-b5da-4c87-bf0d-8b6a75084e60-logs\") pod \"nova-api-0\" (UID: \"6509be40-b5da-4c87-bf0d-8b6a75084e60\") " pod="openstack/nova-api-0" Jan 28 21:01:47 crc kubenswrapper[4746]: I0128 21:01:47.699601 4746 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/7933917c-bf24-4c2b-b37e-dae93337cfa3-kube-api-access-pjr6c" (OuterVolumeSpecName: "kube-api-access-pjr6c") pod "7933917c-bf24-4c2b-b37e-dae93337cfa3" (UID: "7933917c-bf24-4c2b-b37e-dae93337cfa3"). InnerVolumeSpecName "kube-api-access-pjr6c". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 28 21:01:47 crc kubenswrapper[4746]: I0128 21:01:47.699698 4746 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/6509be40-b5da-4c87-bf0d-8b6a75084e60-combined-ca-bundle\") pod \"nova-api-0\" (UID: \"6509be40-b5da-4c87-bf0d-8b6a75084e60\") " pod="openstack/nova-api-0" Jan 28 21:01:47 crc kubenswrapper[4746]: I0128 21:01:47.712151 4746 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/6509be40-b5da-4c87-bf0d-8b6a75084e60-config-data\") pod \"nova-api-0\" (UID: \"6509be40-b5da-4c87-bf0d-8b6a75084e60\") " pod="openstack/nova-api-0" Jan 28 21:01:47 crc kubenswrapper[4746]: I0128 21:01:47.712632 4746 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-jjzsd\" (UniqueName: \"kubernetes.io/projected/6509be40-b5da-4c87-bf0d-8b6a75084e60-kube-api-access-jjzsd\") pod \"nova-api-0\" (UID: \"6509be40-b5da-4c87-bf0d-8b6a75084e60\") " pod="openstack/nova-api-0" Jan 28 21:01:47 crc kubenswrapper[4746]: I0128 21:01:47.732800 4746 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-scheduler-0"] Jan 28 21:01:47 crc kubenswrapper[4746]: E0128 21:01:47.733260 4746 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="7933917c-bf24-4c2b-b37e-dae93337cfa3" containerName="nova-cell0-conductor-conductor" Jan 28 21:01:47 crc kubenswrapper[4746]: I0128 21:01:47.733277 4746 state_mem.go:107] "Deleted CPUSet assignment" podUID="7933917c-bf24-4c2b-b37e-dae93337cfa3" containerName="nova-cell0-conductor-conductor" Jan 28 21:01:47 crc kubenswrapper[4746]: I0128 21:01:47.733465 4746 memory_manager.go:354] "RemoveStaleState removing state" podUID="7933917c-bf24-4c2b-b37e-dae93337cfa3" containerName="nova-cell0-conductor-conductor" Jan 28 21:01:47 crc kubenswrapper[4746]: I0128 21:01:47.739373 4746 util.go:30] "No sandbox for pod can 
be found. Need to start a new one" pod="openstack/nova-scheduler-0" Jan 28 21:01:47 crc kubenswrapper[4746]: I0128 21:01:47.770938 4746 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-scheduler-config-data" Jan 28 21:01:47 crc kubenswrapper[4746]: I0128 21:01:47.778666 4746 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-z78c7\" (UniqueName: \"kubernetes.io/projected/a812f7c0-8a46-4727-9944-7266f70244bd-kube-api-access-z78c7\") pod \"nova-scheduler-0\" (UID: \"a812f7c0-8a46-4727-9944-7266f70244bd\") " pod="openstack/nova-scheduler-0" Jan 28 21:01:47 crc kubenswrapper[4746]: I0128 21:01:47.778709 4746 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/a812f7c0-8a46-4727-9944-7266f70244bd-config-data\") pod \"nova-scheduler-0\" (UID: \"a812f7c0-8a46-4727-9944-7266f70244bd\") " pod="openstack/nova-scheduler-0" Jan 28 21:01:47 crc kubenswrapper[4746]: I0128 21:01:47.778781 4746 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/a812f7c0-8a46-4727-9944-7266f70244bd-combined-ca-bundle\") pod \"nova-scheduler-0\" (UID: \"a812f7c0-8a46-4727-9944-7266f70244bd\") " pod="openstack/nova-scheduler-0" Jan 28 21:01:47 crc kubenswrapper[4746]: I0128 21:01:47.778881 4746 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-pjr6c\" (UniqueName: \"kubernetes.io/projected/7933917c-bf24-4c2b-b37e-dae93337cfa3-kube-api-access-pjr6c\") on node \"crc\" DevicePath \"\"" Jan 28 21:01:47 crc kubenswrapper[4746]: I0128 21:01:47.780785 4746 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/7933917c-bf24-4c2b-b37e-dae93337cfa3-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "7933917c-bf24-4c2b-b37e-dae93337cfa3" (UID: 
"7933917c-bf24-4c2b-b37e-dae93337cfa3"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 28 21:01:47 crc kubenswrapper[4746]: I0128 21:01:47.831917 4746 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/7933917c-bf24-4c2b-b37e-dae93337cfa3-config-data" (OuterVolumeSpecName: "config-data") pod "7933917c-bf24-4c2b-b37e-dae93337cfa3" (UID: "7933917c-bf24-4c2b-b37e-dae93337cfa3"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 28 21:01:47 crc kubenswrapper[4746]: I0128 21:01:47.878136 4746 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-scheduler-0"] Jan 28 21:01:47 crc kubenswrapper[4746]: I0128 21:01:47.878609 4746 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-api-0" Jan 28 21:01:47 crc kubenswrapper[4746]: I0128 21:01:47.880102 4746 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-z78c7\" (UniqueName: \"kubernetes.io/projected/a812f7c0-8a46-4727-9944-7266f70244bd-kube-api-access-z78c7\") pod \"nova-scheduler-0\" (UID: \"a812f7c0-8a46-4727-9944-7266f70244bd\") " pod="openstack/nova-scheduler-0" Jan 28 21:01:47 crc kubenswrapper[4746]: I0128 21:01:47.880142 4746 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/a812f7c0-8a46-4727-9944-7266f70244bd-config-data\") pod \"nova-scheduler-0\" (UID: \"a812f7c0-8a46-4727-9944-7266f70244bd\") " pod="openstack/nova-scheduler-0" Jan 28 21:01:47 crc kubenswrapper[4746]: I0128 21:01:47.880220 4746 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/a812f7c0-8a46-4727-9944-7266f70244bd-combined-ca-bundle\") pod \"nova-scheduler-0\" (UID: \"a812f7c0-8a46-4727-9944-7266f70244bd\") " pod="openstack/nova-scheduler-0" Jan 28 
21:01:47 crc kubenswrapper[4746]: I0128 21:01:47.880281 4746 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/7933917c-bf24-4c2b-b37e-dae93337cfa3-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Jan 28 21:01:47 crc kubenswrapper[4746]: I0128 21:01:47.880299 4746 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/7933917c-bf24-4c2b-b37e-dae93337cfa3-config-data\") on node \"crc\" DevicePath \"\"" Jan 28 21:01:47 crc kubenswrapper[4746]: I0128 21:01:47.894157 4746 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/a812f7c0-8a46-4727-9944-7266f70244bd-combined-ca-bundle\") pod \"nova-scheduler-0\" (UID: \"a812f7c0-8a46-4727-9944-7266f70244bd\") " pod="openstack/nova-scheduler-0" Jan 28 21:01:47 crc kubenswrapper[4746]: I0128 21:01:47.899562 4746 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/a812f7c0-8a46-4727-9944-7266f70244bd-config-data\") pod \"nova-scheduler-0\" (UID: \"a812f7c0-8a46-4727-9944-7266f70244bd\") " pod="openstack/nova-scheduler-0" Jan 28 21:01:47 crc kubenswrapper[4746]: I0128 21:01:47.952194 4746 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-cell1-novncproxy-0"] Jan 28 21:01:47 crc kubenswrapper[4746]: I0128 21:01:47.953979 4746 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell1-novncproxy-0" Jan 28 21:01:47 crc kubenswrapper[4746]: I0128 21:01:47.988464 4746 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-cell1-novncproxy-config-data" Jan 28 21:01:48 crc kubenswrapper[4746]: I0128 21:01:48.012118 4746 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-z78c7\" (UniqueName: \"kubernetes.io/projected/a812f7c0-8a46-4727-9944-7266f70244bd-kube-api-access-z78c7\") pod \"nova-scheduler-0\" (UID: \"a812f7c0-8a46-4727-9944-7266f70244bd\") " pod="openstack/nova-scheduler-0" Jan 28 21:01:48 crc kubenswrapper[4746]: I0128 21:01:48.021153 4746 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell1-novncproxy-0"] Jan 28 21:01:48 crc kubenswrapper[4746]: I0128 21:01:48.055620 4746 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-metadata-0"] Jan 28 21:01:48 crc kubenswrapper[4746]: I0128 21:01:48.071495 4746 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-metadata-0" Jan 28 21:01:48 crc kubenswrapper[4746]: I0128 21:01:48.076002 4746 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-metadata-config-data" Jan 28 21:01:48 crc kubenswrapper[4746]: I0128 21:01:48.095677 4746 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d76b1f13-0ce0-4706-a7d8-d70670e82b8f-combined-ca-bundle\") pod \"nova-metadata-0\" (UID: \"d76b1f13-0ce0-4706-a7d8-d70670e82b8f\") " pod="openstack/nova-metadata-0" Jan 28 21:01:48 crc kubenswrapper[4746]: I0128 21:01:48.095765 4746 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-gjrgt\" (UniqueName: \"kubernetes.io/projected/d76b1f13-0ce0-4706-a7d8-d70670e82b8f-kube-api-access-gjrgt\") pod \"nova-metadata-0\" (UID: \"d76b1f13-0ce0-4706-a7d8-d70670e82b8f\") " pod="openstack/nova-metadata-0" Jan 28 21:01:48 crc kubenswrapper[4746]: I0128 21:01:48.095793 4746 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-xkqck\" (UniqueName: \"kubernetes.io/projected/66366de9-a831-472a-a11f-f749b76d2007-kube-api-access-xkqck\") pod \"nova-cell1-novncproxy-0\" (UID: \"66366de9-a831-472a-a11f-f749b76d2007\") " pod="openstack/nova-cell1-novncproxy-0" Jan 28 21:01:48 crc kubenswrapper[4746]: I0128 21:01:48.095814 4746 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/66366de9-a831-472a-a11f-f749b76d2007-combined-ca-bundle\") pod \"nova-cell1-novncproxy-0\" (UID: \"66366de9-a831-472a-a11f-f749b76d2007\") " pod="openstack/nova-cell1-novncproxy-0" Jan 28 21:01:48 crc kubenswrapper[4746]: I0128 21:01:48.095864 4746 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" 
(UniqueName: \"kubernetes.io/empty-dir/d76b1f13-0ce0-4706-a7d8-d70670e82b8f-logs\") pod \"nova-metadata-0\" (UID: \"d76b1f13-0ce0-4706-a7d8-d70670e82b8f\") " pod="openstack/nova-metadata-0" Jan 28 21:01:48 crc kubenswrapper[4746]: I0128 21:01:48.095887 4746 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/d76b1f13-0ce0-4706-a7d8-d70670e82b8f-config-data\") pod \"nova-metadata-0\" (UID: \"d76b1f13-0ce0-4706-a7d8-d70670e82b8f\") " pod="openstack/nova-metadata-0" Jan 28 21:01:48 crc kubenswrapper[4746]: I0128 21:01:48.095913 4746 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/66366de9-a831-472a-a11f-f749b76d2007-config-data\") pod \"nova-cell1-novncproxy-0\" (UID: \"66366de9-a831-472a-a11f-f749b76d2007\") " pod="openstack/nova-cell1-novncproxy-0" Jan 28 21:01:48 crc kubenswrapper[4746]: I0128 21:01:48.116921 4746 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-scheduler-0" Jan 28 21:01:48 crc kubenswrapper[4746]: I0128 21:01:48.155173 4746 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-metadata-0"] Jan 28 21:01:48 crc kubenswrapper[4746]: I0128 21:01:48.198388 4746 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d76b1f13-0ce0-4706-a7d8-d70670e82b8f-combined-ca-bundle\") pod \"nova-metadata-0\" (UID: \"d76b1f13-0ce0-4706-a7d8-d70670e82b8f\") " pod="openstack/nova-metadata-0" Jan 28 21:01:48 crc kubenswrapper[4746]: I0128 21:01:48.198475 4746 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-gjrgt\" (UniqueName: \"kubernetes.io/projected/d76b1f13-0ce0-4706-a7d8-d70670e82b8f-kube-api-access-gjrgt\") pod \"nova-metadata-0\" (UID: \"d76b1f13-0ce0-4706-a7d8-d70670e82b8f\") " pod="openstack/nova-metadata-0" Jan 28 21:01:48 crc kubenswrapper[4746]: I0128 21:01:48.198506 4746 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-xkqck\" (UniqueName: \"kubernetes.io/projected/66366de9-a831-472a-a11f-f749b76d2007-kube-api-access-xkqck\") pod \"nova-cell1-novncproxy-0\" (UID: \"66366de9-a831-472a-a11f-f749b76d2007\") " pod="openstack/nova-cell1-novncproxy-0" Jan 28 21:01:48 crc kubenswrapper[4746]: I0128 21:01:48.198532 4746 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/66366de9-a831-472a-a11f-f749b76d2007-combined-ca-bundle\") pod \"nova-cell1-novncproxy-0\" (UID: \"66366de9-a831-472a-a11f-f749b76d2007\") " pod="openstack/nova-cell1-novncproxy-0" Jan 28 21:01:48 crc kubenswrapper[4746]: I0128 21:01:48.198580 4746 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/d76b1f13-0ce0-4706-a7d8-d70670e82b8f-logs\") pod \"nova-metadata-0\" (UID: 
\"d76b1f13-0ce0-4706-a7d8-d70670e82b8f\") " pod="openstack/nova-metadata-0" Jan 28 21:01:48 crc kubenswrapper[4746]: I0128 21:01:48.198606 4746 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/d76b1f13-0ce0-4706-a7d8-d70670e82b8f-config-data\") pod \"nova-metadata-0\" (UID: \"d76b1f13-0ce0-4706-a7d8-d70670e82b8f\") " pod="openstack/nova-metadata-0" Jan 28 21:01:48 crc kubenswrapper[4746]: I0128 21:01:48.198640 4746 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/66366de9-a831-472a-a11f-f749b76d2007-config-data\") pod \"nova-cell1-novncproxy-0\" (UID: \"66366de9-a831-472a-a11f-f749b76d2007\") " pod="openstack/nova-cell1-novncproxy-0" Jan 28 21:01:48 crc kubenswrapper[4746]: I0128 21:01:48.210017 4746 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/d76b1f13-0ce0-4706-a7d8-d70670e82b8f-logs\") pod \"nova-metadata-0\" (UID: \"d76b1f13-0ce0-4706-a7d8-d70670e82b8f\") " pod="openstack/nova-metadata-0" Jan 28 21:01:48 crc kubenswrapper[4746]: I0128 21:01:48.213496 4746 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d76b1f13-0ce0-4706-a7d8-d70670e82b8f-combined-ca-bundle\") pod \"nova-metadata-0\" (UID: \"d76b1f13-0ce0-4706-a7d8-d70670e82b8f\") " pod="openstack/nova-metadata-0" Jan 28 21:01:48 crc kubenswrapper[4746]: I0128 21:01:48.216619 4746 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/d76b1f13-0ce0-4706-a7d8-d70670e82b8f-config-data\") pod \"nova-metadata-0\" (UID: \"d76b1f13-0ce0-4706-a7d8-d70670e82b8f\") " pod="openstack/nova-metadata-0" Jan 28 21:01:48 crc kubenswrapper[4746]: I0128 21:01:48.220280 4746 operation_generator.go:637] "MountVolume.SetUp succeeded for volume 
\"kube-api-access-xkqck\" (UniqueName: \"kubernetes.io/projected/66366de9-a831-472a-a11f-f749b76d2007-kube-api-access-xkqck\") pod \"nova-cell1-novncproxy-0\" (UID: \"66366de9-a831-472a-a11f-f749b76d2007\") " pod="openstack/nova-cell1-novncproxy-0" Jan 28 21:01:48 crc kubenswrapper[4746]: I0128 21:01:48.221940 4746 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/66366de9-a831-472a-a11f-f749b76d2007-config-data\") pod \"nova-cell1-novncproxy-0\" (UID: \"66366de9-a831-472a-a11f-f749b76d2007\") " pod="openstack/nova-cell1-novncproxy-0" Jan 28 21:01:48 crc kubenswrapper[4746]: I0128 21:01:48.224625 4746 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-gjrgt\" (UniqueName: \"kubernetes.io/projected/d76b1f13-0ce0-4706-a7d8-d70670e82b8f-kube-api-access-gjrgt\") pod \"nova-metadata-0\" (UID: \"d76b1f13-0ce0-4706-a7d8-d70670e82b8f\") " pod="openstack/nova-metadata-0" Jan 28 21:01:48 crc kubenswrapper[4746]: I0128 21:01:48.245150 4746 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-78cd565959-hcnml"] Jan 28 21:01:48 crc kubenswrapper[4746]: I0128 21:01:48.245804 4746 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/66366de9-a831-472a-a11f-f749b76d2007-combined-ca-bundle\") pod \"nova-cell1-novncproxy-0\" (UID: \"66366de9-a831-472a-a11f-f749b76d2007\") " pod="openstack/nova-cell1-novncproxy-0" Jan 28 21:01:48 crc kubenswrapper[4746]: I0128 21:01:48.246999 4746 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-78cd565959-hcnml" Jan 28 21:01:48 crc kubenswrapper[4746]: I0128 21:01:48.253650 4746 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-78cd565959-hcnml"] Jan 28 21:01:48 crc kubenswrapper[4746]: I0128 21:01:48.312172 4746 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/ac1b9c96-c2d1-43fb-96bb-e79328b627a6-ovsdbserver-sb\") pod \"dnsmasq-dns-78cd565959-hcnml\" (UID: \"ac1b9c96-c2d1-43fb-96bb-e79328b627a6\") " pod="openstack/dnsmasq-dns-78cd565959-hcnml" Jan 28 21:01:48 crc kubenswrapper[4746]: I0128 21:01:48.312260 4746 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/ac1b9c96-c2d1-43fb-96bb-e79328b627a6-dns-svc\") pod \"dnsmasq-dns-78cd565959-hcnml\" (UID: \"ac1b9c96-c2d1-43fb-96bb-e79328b627a6\") " pod="openstack/dnsmasq-dns-78cd565959-hcnml" Jan 28 21:01:48 crc kubenswrapper[4746]: I0128 21:01:48.312298 4746 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/ac1b9c96-c2d1-43fb-96bb-e79328b627a6-dns-swift-storage-0\") pod \"dnsmasq-dns-78cd565959-hcnml\" (UID: \"ac1b9c96-c2d1-43fb-96bb-e79328b627a6\") " pod="openstack/dnsmasq-dns-78cd565959-hcnml" Jan 28 21:01:48 crc kubenswrapper[4746]: I0128 21:01:48.312340 4746 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/ac1b9c96-c2d1-43fb-96bb-e79328b627a6-ovsdbserver-nb\") pod \"dnsmasq-dns-78cd565959-hcnml\" (UID: \"ac1b9c96-c2d1-43fb-96bb-e79328b627a6\") " pod="openstack/dnsmasq-dns-78cd565959-hcnml" Jan 28 21:01:48 crc kubenswrapper[4746]: I0128 21:01:48.312379 4746 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/ac1b9c96-c2d1-43fb-96bb-e79328b627a6-config\") pod \"dnsmasq-dns-78cd565959-hcnml\" (UID: \"ac1b9c96-c2d1-43fb-96bb-e79328b627a6\") " pod="openstack/dnsmasq-dns-78cd565959-hcnml" Jan 28 21:01:48 crc kubenswrapper[4746]: I0128 21:01:48.312410 4746 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-58mwn\" (UniqueName: \"kubernetes.io/projected/ac1b9c96-c2d1-43fb-96bb-e79328b627a6-kube-api-access-58mwn\") pod \"dnsmasq-dns-78cd565959-hcnml\" (UID: \"ac1b9c96-c2d1-43fb-96bb-e79328b627a6\") " pod="openstack/dnsmasq-dns-78cd565959-hcnml" Jan 28 21:01:48 crc kubenswrapper[4746]: I0128 21:01:48.314930 4746 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-conductor-0" event={"ID":"7933917c-bf24-4c2b-b37e-dae93337cfa3","Type":"ContainerDied","Data":"73a7a43ea9e0f7d69e4bf1ae2d23d3b178a286cffddbf3329ddec1f6e8030d55"} Jan 28 21:01:48 crc kubenswrapper[4746]: I0128 21:01:48.316971 4746 scope.go:117] "RemoveContainer" containerID="0955cbe58489207a2e79b170dbe9a441f0c9db3e67f89f2a307ceac5ef88540a" Jan 28 21:01:48 crc kubenswrapper[4746]: I0128 21:01:48.317230 4746 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell0-conductor-0" Jan 28 21:01:48 crc kubenswrapper[4746]: I0128 21:01:48.360351 4746 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-cell1-conductor-db-sync-8xq2h"] Jan 28 21:01:48 crc kubenswrapper[4746]: I0128 21:01:48.362318 4746 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell1-conductor-db-sync-8xq2h" Jan 28 21:01:48 crc kubenswrapper[4746]: I0128 21:01:48.372546 4746 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-cell1-conductor-config-data" Jan 28 21:01:48 crc kubenswrapper[4746]: I0128 21:01:48.372796 4746 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-cell1-conductor-scripts" Jan 28 21:01:48 crc kubenswrapper[4746]: I0128 21:01:48.388141 4746 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell1-conductor-db-sync-8xq2h"] Jan 28 21:01:48 crc kubenswrapper[4746]: I0128 21:01:48.409501 4746 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-cell0-conductor-0"] Jan 28 21:01:48 crc kubenswrapper[4746]: I0128 21:01:48.415640 4746 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/ac1b9c96-c2d1-43fb-96bb-e79328b627a6-ovsdbserver-sb\") pod \"dnsmasq-dns-78cd565959-hcnml\" (UID: \"ac1b9c96-c2d1-43fb-96bb-e79328b627a6\") " pod="openstack/dnsmasq-dns-78cd565959-hcnml" Jan 28 21:01:48 crc kubenswrapper[4746]: I0128 21:01:48.417102 4746 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/487dd560-95f2-45e5-b09b-f8d3aae5548a-combined-ca-bundle\") pod \"nova-cell1-conductor-db-sync-8xq2h\" (UID: \"487dd560-95f2-45e5-b09b-f8d3aae5548a\") " pod="openstack/nova-cell1-conductor-db-sync-8xq2h" Jan 28 21:01:48 crc kubenswrapper[4746]: I0128 21:01:48.417143 4746 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/487dd560-95f2-45e5-b09b-f8d3aae5548a-scripts\") pod \"nova-cell1-conductor-db-sync-8xq2h\" (UID: \"487dd560-95f2-45e5-b09b-f8d3aae5548a\") " pod="openstack/nova-cell1-conductor-db-sync-8xq2h" Jan 28 21:01:48 crc 
kubenswrapper[4746]: I0128 21:01:48.417186 4746 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/ac1b9c96-c2d1-43fb-96bb-e79328b627a6-dns-svc\") pod \"dnsmasq-dns-78cd565959-hcnml\" (UID: \"ac1b9c96-c2d1-43fb-96bb-e79328b627a6\") " pod="openstack/dnsmasq-dns-78cd565959-hcnml" Jan 28 21:01:48 crc kubenswrapper[4746]: I0128 21:01:48.417210 4746 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-mrg7b\" (UniqueName: \"kubernetes.io/projected/487dd560-95f2-45e5-b09b-f8d3aae5548a-kube-api-access-mrg7b\") pod \"nova-cell1-conductor-db-sync-8xq2h\" (UID: \"487dd560-95f2-45e5-b09b-f8d3aae5548a\") " pod="openstack/nova-cell1-conductor-db-sync-8xq2h" Jan 28 21:01:48 crc kubenswrapper[4746]: I0128 21:01:48.417234 4746 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/487dd560-95f2-45e5-b09b-f8d3aae5548a-config-data\") pod \"nova-cell1-conductor-db-sync-8xq2h\" (UID: \"487dd560-95f2-45e5-b09b-f8d3aae5548a\") " pod="openstack/nova-cell1-conductor-db-sync-8xq2h" Jan 28 21:01:48 crc kubenswrapper[4746]: I0128 21:01:48.417273 4746 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/ac1b9c96-c2d1-43fb-96bb-e79328b627a6-dns-swift-storage-0\") pod \"dnsmasq-dns-78cd565959-hcnml\" (UID: \"ac1b9c96-c2d1-43fb-96bb-e79328b627a6\") " pod="openstack/dnsmasq-dns-78cd565959-hcnml" Jan 28 21:01:48 crc kubenswrapper[4746]: I0128 21:01:48.417352 4746 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/ac1b9c96-c2d1-43fb-96bb-e79328b627a6-ovsdbserver-nb\") pod \"dnsmasq-dns-78cd565959-hcnml\" (UID: \"ac1b9c96-c2d1-43fb-96bb-e79328b627a6\") " pod="openstack/dnsmasq-dns-78cd565959-hcnml" Jan 28 
21:01:48 crc kubenswrapper[4746]: I0128 21:01:48.417422 4746 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/ac1b9c96-c2d1-43fb-96bb-e79328b627a6-config\") pod \"dnsmasq-dns-78cd565959-hcnml\" (UID: \"ac1b9c96-c2d1-43fb-96bb-e79328b627a6\") " pod="openstack/dnsmasq-dns-78cd565959-hcnml" Jan 28 21:01:48 crc kubenswrapper[4746]: I0128 21:01:48.417456 4746 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-58mwn\" (UniqueName: \"kubernetes.io/projected/ac1b9c96-c2d1-43fb-96bb-e79328b627a6-kube-api-access-58mwn\") pod \"dnsmasq-dns-78cd565959-hcnml\" (UID: \"ac1b9c96-c2d1-43fb-96bb-e79328b627a6\") " pod="openstack/dnsmasq-dns-78cd565959-hcnml" Jan 28 21:01:48 crc kubenswrapper[4746]: I0128 21:01:48.420527 4746 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/ac1b9c96-c2d1-43fb-96bb-e79328b627a6-dns-svc\") pod \"dnsmasq-dns-78cd565959-hcnml\" (UID: \"ac1b9c96-c2d1-43fb-96bb-e79328b627a6\") " pod="openstack/dnsmasq-dns-78cd565959-hcnml" Jan 28 21:01:48 crc kubenswrapper[4746]: I0128 21:01:48.421471 4746 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/ac1b9c96-c2d1-43fb-96bb-e79328b627a6-ovsdbserver-sb\") pod \"dnsmasq-dns-78cd565959-hcnml\" (UID: \"ac1b9c96-c2d1-43fb-96bb-e79328b627a6\") " pod="openstack/dnsmasq-dns-78cd565959-hcnml" Jan 28 21:01:48 crc kubenswrapper[4746]: I0128 21:01:48.421856 4746 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/ac1b9c96-c2d1-43fb-96bb-e79328b627a6-dns-swift-storage-0\") pod \"dnsmasq-dns-78cd565959-hcnml\" (UID: \"ac1b9c96-c2d1-43fb-96bb-e79328b627a6\") " pod="openstack/dnsmasq-dns-78cd565959-hcnml" Jan 28 21:01:48 crc kubenswrapper[4746]: I0128 21:01:48.422525 4746 
operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/ac1b9c96-c2d1-43fb-96bb-e79328b627a6-ovsdbserver-nb\") pod \"dnsmasq-dns-78cd565959-hcnml\" (UID: \"ac1b9c96-c2d1-43fb-96bb-e79328b627a6\") " pod="openstack/dnsmasq-dns-78cd565959-hcnml" Jan 28 21:01:48 crc kubenswrapper[4746]: I0128 21:01:48.422845 4746 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/ac1b9c96-c2d1-43fb-96bb-e79328b627a6-config\") pod \"dnsmasq-dns-78cd565959-hcnml\" (UID: \"ac1b9c96-c2d1-43fb-96bb-e79328b627a6\") " pod="openstack/dnsmasq-dns-78cd565959-hcnml" Jan 28 21:01:48 crc kubenswrapper[4746]: I0128 21:01:48.424640 4746 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell1-novncproxy-0" Jan 28 21:01:48 crc kubenswrapper[4746]: I0128 21:01:48.432143 4746 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-cell0-conductor-0"] Jan 28 21:01:48 crc kubenswrapper[4746]: I0128 21:01:48.441885 4746 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-cell0-conductor-0"] Jan 28 21:01:48 crc kubenswrapper[4746]: I0128 21:01:48.443052 4746 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-58mwn\" (UniqueName: \"kubernetes.io/projected/ac1b9c96-c2d1-43fb-96bb-e79328b627a6-kube-api-access-58mwn\") pod \"dnsmasq-dns-78cd565959-hcnml\" (UID: \"ac1b9c96-c2d1-43fb-96bb-e79328b627a6\") " pod="openstack/dnsmasq-dns-78cd565959-hcnml" Jan 28 21:01:48 crc kubenswrapper[4746]: I0128 21:01:48.443823 4746 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell0-conductor-0" Jan 28 21:01:48 crc kubenswrapper[4746]: I0128 21:01:48.460184 4746 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-cell0-conductor-config-data" Jan 28 21:01:48 crc kubenswrapper[4746]: I0128 21:01:48.461578 4746 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-metadata-0" Jan 28 21:01:48 crc kubenswrapper[4746]: I0128 21:01:48.472130 4746 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell0-conductor-0"] Jan 28 21:01:48 crc kubenswrapper[4746]: I0128 21:01:48.488118 4746 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell0-cell-mapping-28g8p"] Jan 28 21:01:48 crc kubenswrapper[4746]: I0128 21:01:48.527236 4746 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/487dd560-95f2-45e5-b09b-f8d3aae5548a-combined-ca-bundle\") pod \"nova-cell1-conductor-db-sync-8xq2h\" (UID: \"487dd560-95f2-45e5-b09b-f8d3aae5548a\") " pod="openstack/nova-cell1-conductor-db-sync-8xq2h" Jan 28 21:01:48 crc kubenswrapper[4746]: I0128 21:01:48.527346 4746 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/487dd560-95f2-45e5-b09b-f8d3aae5548a-scripts\") pod \"nova-cell1-conductor-db-sync-8xq2h\" (UID: \"487dd560-95f2-45e5-b09b-f8d3aae5548a\") " pod="openstack/nova-cell1-conductor-db-sync-8xq2h" Jan 28 21:01:48 crc kubenswrapper[4746]: I0128 21:01:48.527406 4746 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-mrg7b\" (UniqueName: \"kubernetes.io/projected/487dd560-95f2-45e5-b09b-f8d3aae5548a-kube-api-access-mrg7b\") pod \"nova-cell1-conductor-db-sync-8xq2h\" (UID: \"487dd560-95f2-45e5-b09b-f8d3aae5548a\") " pod="openstack/nova-cell1-conductor-db-sync-8xq2h" Jan 28 21:01:48 crc kubenswrapper[4746]: I0128 
21:01:48.527447 4746 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/487dd560-95f2-45e5-b09b-f8d3aae5548a-config-data\") pod \"nova-cell1-conductor-db-sync-8xq2h\" (UID: \"487dd560-95f2-45e5-b09b-f8d3aae5548a\") " pod="openstack/nova-cell1-conductor-db-sync-8xq2h" Jan 28 21:01:48 crc kubenswrapper[4746]: I0128 21:01:48.531213 4746 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/487dd560-95f2-45e5-b09b-f8d3aae5548a-scripts\") pod \"nova-cell1-conductor-db-sync-8xq2h\" (UID: \"487dd560-95f2-45e5-b09b-f8d3aae5548a\") " pod="openstack/nova-cell1-conductor-db-sync-8xq2h" Jan 28 21:01:48 crc kubenswrapper[4746]: I0128 21:01:48.531730 4746 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/487dd560-95f2-45e5-b09b-f8d3aae5548a-combined-ca-bundle\") pod \"nova-cell1-conductor-db-sync-8xq2h\" (UID: \"487dd560-95f2-45e5-b09b-f8d3aae5548a\") " pod="openstack/nova-cell1-conductor-db-sync-8xq2h" Jan 28 21:01:48 crc kubenswrapper[4746]: I0128 21:01:48.531882 4746 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/487dd560-95f2-45e5-b09b-f8d3aae5548a-config-data\") pod \"nova-cell1-conductor-db-sync-8xq2h\" (UID: \"487dd560-95f2-45e5-b09b-f8d3aae5548a\") " pod="openstack/nova-cell1-conductor-db-sync-8xq2h" Jan 28 21:01:48 crc kubenswrapper[4746]: I0128 21:01:48.560404 4746 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-mrg7b\" (UniqueName: \"kubernetes.io/projected/487dd560-95f2-45e5-b09b-f8d3aae5548a-kube-api-access-mrg7b\") pod \"nova-cell1-conductor-db-sync-8xq2h\" (UID: \"487dd560-95f2-45e5-b09b-f8d3aae5548a\") " pod="openstack/nova-cell1-conductor-db-sync-8xq2h" Jan 28 21:01:48 crc kubenswrapper[4746]: I0128 21:01:48.602643 4746 util.go:30] "No 
sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-78cd565959-hcnml" Jan 28 21:01:48 crc kubenswrapper[4746]: I0128 21:01:48.633452 4746 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/a34b42b7-85e9-4934-bbe2-487072111391-config-data\") pod \"nova-cell0-conductor-0\" (UID: \"a34b42b7-85e9-4934-bbe2-487072111391\") " pod="openstack/nova-cell0-conductor-0" Jan 28 21:01:48 crc kubenswrapper[4746]: I0128 21:01:48.633559 4746 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-lds4x\" (UniqueName: \"kubernetes.io/projected/a34b42b7-85e9-4934-bbe2-487072111391-kube-api-access-lds4x\") pod \"nova-cell0-conductor-0\" (UID: \"a34b42b7-85e9-4934-bbe2-487072111391\") " pod="openstack/nova-cell0-conductor-0" Jan 28 21:01:48 crc kubenswrapper[4746]: I0128 21:01:48.633582 4746 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/a34b42b7-85e9-4934-bbe2-487072111391-combined-ca-bundle\") pod \"nova-cell0-conductor-0\" (UID: \"a34b42b7-85e9-4934-bbe2-487072111391\") " pod="openstack/nova-cell0-conductor-0" Jan 28 21:01:48 crc kubenswrapper[4746]: I0128 21:01:48.707222 4746 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell1-conductor-db-sync-8xq2h" Jan 28 21:01:48 crc kubenswrapper[4746]: I0128 21:01:48.709605 4746 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-api-0"] Jan 28 21:01:48 crc kubenswrapper[4746]: I0128 21:01:48.735199 4746 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/a34b42b7-85e9-4934-bbe2-487072111391-config-data\") pod \"nova-cell0-conductor-0\" (UID: \"a34b42b7-85e9-4934-bbe2-487072111391\") " pod="openstack/nova-cell0-conductor-0" Jan 28 21:01:48 crc kubenswrapper[4746]: I0128 21:01:48.736009 4746 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-lds4x\" (UniqueName: \"kubernetes.io/projected/a34b42b7-85e9-4934-bbe2-487072111391-kube-api-access-lds4x\") pod \"nova-cell0-conductor-0\" (UID: \"a34b42b7-85e9-4934-bbe2-487072111391\") " pod="openstack/nova-cell0-conductor-0" Jan 28 21:01:48 crc kubenswrapper[4746]: I0128 21:01:48.736169 4746 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/a34b42b7-85e9-4934-bbe2-487072111391-combined-ca-bundle\") pod \"nova-cell0-conductor-0\" (UID: \"a34b42b7-85e9-4934-bbe2-487072111391\") " pod="openstack/nova-cell0-conductor-0" Jan 28 21:01:48 crc kubenswrapper[4746]: I0128 21:01:48.742323 4746 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/a34b42b7-85e9-4934-bbe2-487072111391-combined-ca-bundle\") pod \"nova-cell0-conductor-0\" (UID: \"a34b42b7-85e9-4934-bbe2-487072111391\") " pod="openstack/nova-cell0-conductor-0" Jan 28 21:01:48 crc kubenswrapper[4746]: I0128 21:01:48.752355 4746 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/a34b42b7-85e9-4934-bbe2-487072111391-config-data\") pod 
\"nova-cell0-conductor-0\" (UID: \"a34b42b7-85e9-4934-bbe2-487072111391\") " pod="openstack/nova-cell0-conductor-0" Jan 28 21:01:48 crc kubenswrapper[4746]: I0128 21:01:48.754706 4746 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-lds4x\" (UniqueName: \"kubernetes.io/projected/a34b42b7-85e9-4934-bbe2-487072111391-kube-api-access-lds4x\") pod \"nova-cell0-conductor-0\" (UID: \"a34b42b7-85e9-4934-bbe2-487072111391\") " pod="openstack/nova-cell0-conductor-0" Jan 28 21:01:48 crc kubenswrapper[4746]: I0128 21:01:48.786742 4746 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell0-conductor-0" Jan 28 21:01:48 crc kubenswrapper[4746]: I0128 21:01:48.865913 4746 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="7933917c-bf24-4c2b-b37e-dae93337cfa3" path="/var/lib/kubelet/pods/7933917c-bf24-4c2b-b37e-dae93337cfa3/volumes" Jan 28 21:01:49 crc kubenswrapper[4746]: I0128 21:01:49.025968 4746 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-scheduler-0"] Jan 28 21:01:49 crc kubenswrapper[4746]: I0128 21:01:49.157458 4746 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell1-novncproxy-0"] Jan 28 21:01:49 crc kubenswrapper[4746]: I0128 21:01:49.180359 4746 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-metadata-0"] Jan 28 21:01:49 crc kubenswrapper[4746]: I0128 21:01:49.304671 4746 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-78cd565959-hcnml"] Jan 28 21:01:49 crc kubenswrapper[4746]: I0128 21:01:49.333587 4746 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-cell-mapping-28g8p" event={"ID":"8b92f820-9bba-4102-b4ee-1c541c3a05d7","Type":"ContainerStarted","Data":"7c214f6d26474e5fb2d9b8ea32566cfc23faa619a749415b5082e72af23ad209"} Jan 28 21:01:49 crc kubenswrapper[4746]: I0128 21:01:49.333647 4746 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openstack/nova-cell0-cell-mapping-28g8p" event={"ID":"8b92f820-9bba-4102-b4ee-1c541c3a05d7","Type":"ContainerStarted","Data":"5ef99f1b1768da50031514281db28d141f4edb814d5d9eca44686c733576fe8d"} Jan 28 21:01:49 crc kubenswrapper[4746]: I0128 21:01:49.335276 4746 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-78cd565959-hcnml" event={"ID":"ac1b9c96-c2d1-43fb-96bb-e79328b627a6","Type":"ContainerStarted","Data":"0f3278d9541726ab41caffdcd78b1335643d68733959c1eb48df08d0b136330d"} Jan 28 21:01:49 crc kubenswrapper[4746]: I0128 21:01:49.344860 4746 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"d76b1f13-0ce0-4706-a7d8-d70670e82b8f","Type":"ContainerStarted","Data":"0d2b6ab1f43894d7dc92dd5353c1cea8c017ead965222c013bd63d386f2a2399"} Jan 28 21:01:49 crc kubenswrapper[4746]: I0128 21:01:49.356132 4746 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-scheduler-0" event={"ID":"a812f7c0-8a46-4727-9944-7266f70244bd","Type":"ContainerStarted","Data":"cc7be7ff3f15cfc9a933e1d3f743ef4c92439ebbaa8d9c8491500a3d000cc0f1"} Jan 28 21:01:49 crc kubenswrapper[4746]: I0128 21:01:49.362636 4746 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-cell0-cell-mapping-28g8p" podStartSLOduration=2.362615651 podStartE2EDuration="2.362615651s" podCreationTimestamp="2026-01-28 21:01:47 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-28 21:01:49.356531678 +0000 UTC m=+1337.312718032" watchObservedRunningTime="2026-01-28 21:01:49.362615651 +0000 UTC m=+1337.318802015" Jan 28 21:01:49 crc kubenswrapper[4746]: I0128 21:01:49.372309 4746 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-novncproxy-0" event={"ID":"66366de9-a831-472a-a11f-f749b76d2007","Type":"ContainerStarted","Data":"b9d3b224f8157eda177a5959e9aa575569ccbe5b5039314e32f5adef0cd73dca"} 
Jan 28 21:01:49 crc kubenswrapper[4746]: I0128 21:01:49.381054 4746 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"6509be40-b5da-4c87-bf0d-8b6a75084e60","Type":"ContainerStarted","Data":"4096bf769d0dab06efa6153492ae287b346d786915ee35454cdbc4c4014731b3"}
Jan 28 21:01:49 crc kubenswrapper[4746]: I0128 21:01:49.546933 4746 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell1-conductor-db-sync-8xq2h"]
Jan 28 21:01:49 crc kubenswrapper[4746]: I0128 21:01:49.694333 4746 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell0-conductor-0"]
Jan 28 21:01:49 crc kubenswrapper[4746]: W0128 21:01:49.709709 4746 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-poda34b42b7_85e9_4934_bbe2_487072111391.slice/crio-2cffadff56db27146a15a59e769c12d6a5ad62a05893e607aa82c0560408ff3b WatchSource:0}: Error finding container 2cffadff56db27146a15a59e769c12d6a5ad62a05893e607aa82c0560408ff3b: Status 404 returned error can't find the container with id 2cffadff56db27146a15a59e769c12d6a5ad62a05893e607aa82c0560408ff3b
Jan 28 21:01:50 crc kubenswrapper[4746]: I0128 21:01:50.420907 4746 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-conductor-db-sync-8xq2h" event={"ID":"487dd560-95f2-45e5-b09b-f8d3aae5548a","Type":"ContainerStarted","Data":"02357381c64547dca72c19b1c8d373c82811515aadec52b3bfa42aa59d1bc3c3"}
Jan 28 21:01:50 crc kubenswrapper[4746]: I0128 21:01:50.421381 4746 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-conductor-db-sync-8xq2h" event={"ID":"487dd560-95f2-45e5-b09b-f8d3aae5548a","Type":"ContainerStarted","Data":"02ee997cba73463e991795ba89bbba276487cadfcd5ab2b765f77045cb2ca9c8"}
Jan 28 21:01:50 crc kubenswrapper[4746]: I0128 21:01:50.447109 4746 generic.go:334] "Generic (PLEG): container finished" podID="ac1b9c96-c2d1-43fb-96bb-e79328b627a6" containerID="2757f52655aadfb3734a5117d6fbd18efe3eae7d26e3e6abf772479f8fc34f8b" exitCode=0
Jan 28 21:01:50 crc kubenswrapper[4746]: I0128 21:01:50.447177 4746 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-78cd565959-hcnml" event={"ID":"ac1b9c96-c2d1-43fb-96bb-e79328b627a6","Type":"ContainerDied","Data":"2757f52655aadfb3734a5117d6fbd18efe3eae7d26e3e6abf772479f8fc34f8b"}
Jan 28 21:01:50 crc kubenswrapper[4746]: I0128 21:01:50.458821 4746 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-conductor-0" event={"ID":"a34b42b7-85e9-4934-bbe2-487072111391","Type":"ContainerStarted","Data":"60f8f8a8fa839524a7627858d5ff9494dceadfc2367f94ac697bb6da6ca64d4d"}
Jan 28 21:01:50 crc kubenswrapper[4746]: I0128 21:01:50.458861 4746 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-conductor-0" event={"ID":"a34b42b7-85e9-4934-bbe2-487072111391","Type":"ContainerStarted","Data":"2cffadff56db27146a15a59e769c12d6a5ad62a05893e607aa82c0560408ff3b"}
Jan 28 21:01:50 crc kubenswrapper[4746]: I0128 21:01:50.458874 4746 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-cell0-conductor-0"
Jan 28 21:01:50 crc kubenswrapper[4746]: I0128 21:01:50.479815 4746 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-cell1-conductor-db-sync-8xq2h" podStartSLOduration=2.479793392 podStartE2EDuration="2.479793392s" podCreationTimestamp="2026-01-28 21:01:48 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-28 21:01:50.454324267 +0000 UTC m=+1338.410510631" watchObservedRunningTime="2026-01-28 21:01:50.479793392 +0000 UTC m=+1338.435979746"
Jan 28 21:01:50 crc kubenswrapper[4746]: I0128 21:01:50.527860 4746 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-cell0-conductor-0" podStartSLOduration=2.527845205 podStartE2EDuration="2.527845205s" podCreationTimestamp="2026-01-28 21:01:48 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-28 21:01:50.527344041 +0000 UTC m=+1338.483530385" watchObservedRunningTime="2026-01-28 21:01:50.527845205 +0000 UTC m=+1338.484031559"
Jan 28 21:01:50 crc kubenswrapper[4746]: I0128 21:01:50.626142 4746 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ceilometer-0"]
Jan 28 21:01:50 crc kubenswrapper[4746]: I0128 21:01:50.626446 4746 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="ac700cb5-2363-4223-b64c-ce8b49fede0b" containerName="ceilometer-central-agent" containerID="cri-o://35ba04f239fe059c515dd7957ca078df99c49e377b8595092ddd274dc8857306" gracePeriod=30
Jan 28 21:01:50 crc kubenswrapper[4746]: I0128 21:01:50.627199 4746 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="ac700cb5-2363-4223-b64c-ce8b49fede0b" containerName="proxy-httpd" containerID="cri-o://fa5dd243bdbdaf9c1a3e8d89b06890ad683eba38bc2d27074ffd06017c2a78d8" gracePeriod=30
Jan 28 21:01:50 crc kubenswrapper[4746]: I0128 21:01:50.627389 4746 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="ac700cb5-2363-4223-b64c-ce8b49fede0b" containerName="ceilometer-notification-agent" containerID="cri-o://d1d3cc6d86c3875422bd9d16643f59c264f59e5f36baa5e59642407acc0d04a8" gracePeriod=30
Jan 28 21:01:50 crc kubenswrapper[4746]: I0128 21:01:50.627434 4746 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="ac700cb5-2363-4223-b64c-ce8b49fede0b" containerName="sg-core" containerID="cri-o://425b35e63efc0aa88654bf7621c444e520b92e6d39c47e8f77238502bd74fb9a" gracePeriod=30
Jan 28 21:01:51 crc kubenswrapper[4746]: I0128 21:01:51.478357 4746 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-78cd565959-hcnml" event={"ID":"ac1b9c96-c2d1-43fb-96bb-e79328b627a6","Type":"ContainerStarted","Data":"c5b4b1463b13700962c4ecfa0f03dc72ed0b20a3635db0d9f5dc5f100f396cee"}
Jan 28 21:01:51 crc kubenswrapper[4746]: I0128 21:01:51.479281 4746 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/dnsmasq-dns-78cd565959-hcnml"
Jan 28 21:01:51 crc kubenswrapper[4746]: I0128 21:01:51.484873 4746 generic.go:334] "Generic (PLEG): container finished" podID="ac700cb5-2363-4223-b64c-ce8b49fede0b" containerID="fa5dd243bdbdaf9c1a3e8d89b06890ad683eba38bc2d27074ffd06017c2a78d8" exitCode=0
Jan 28 21:01:51 crc kubenswrapper[4746]: I0128 21:01:51.484906 4746 generic.go:334] "Generic (PLEG): container finished" podID="ac700cb5-2363-4223-b64c-ce8b49fede0b" containerID="425b35e63efc0aa88654bf7621c444e520b92e6d39c47e8f77238502bd74fb9a" exitCode=2
Jan 28 21:01:51 crc kubenswrapper[4746]: I0128 21:01:51.484916 4746 generic.go:334] "Generic (PLEG): container finished" podID="ac700cb5-2363-4223-b64c-ce8b49fede0b" containerID="35ba04f239fe059c515dd7957ca078df99c49e377b8595092ddd274dc8857306" exitCode=0
Jan 28 21:01:51 crc kubenswrapper[4746]: I0128 21:01:51.485526 4746 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"ac700cb5-2363-4223-b64c-ce8b49fede0b","Type":"ContainerDied","Data":"fa5dd243bdbdaf9c1a3e8d89b06890ad683eba38bc2d27074ffd06017c2a78d8"}
Jan 28 21:01:51 crc kubenswrapper[4746]: I0128 21:01:51.485557 4746 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"ac700cb5-2363-4223-b64c-ce8b49fede0b","Type":"ContainerDied","Data":"425b35e63efc0aa88654bf7621c444e520b92e6d39c47e8f77238502bd74fb9a"}
Jan 28 21:01:51 crc kubenswrapper[4746]: I0128 21:01:51.485567 4746 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"ac700cb5-2363-4223-b64c-ce8b49fede0b","Type":"ContainerDied","Data":"35ba04f239fe059c515dd7957ca078df99c49e377b8595092ddd274dc8857306"}
Jan 28 21:01:51 crc kubenswrapper[4746]: I0128 21:01:51.511687 4746 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/dnsmasq-dns-78cd565959-hcnml" podStartSLOduration=3.511667018 podStartE2EDuration="3.511667018s" podCreationTimestamp="2026-01-28 21:01:48 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-28 21:01:51.502239554 +0000 UTC m=+1339.458425928" watchObservedRunningTime="2026-01-28 21:01:51.511667018 +0000 UTC m=+1339.467853372"
Jan 28 21:01:52 crc kubenswrapper[4746]: I0128 21:01:52.499350 4746 generic.go:334] "Generic (PLEG): container finished" podID="ac700cb5-2363-4223-b64c-ce8b49fede0b" containerID="d1d3cc6d86c3875422bd9d16643f59c264f59e5f36baa5e59642407acc0d04a8" exitCode=0
Jan 28 21:01:52 crc kubenswrapper[4746]: I0128 21:01:52.499496 4746 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"ac700cb5-2363-4223-b64c-ce8b49fede0b","Type":"ContainerDied","Data":"d1d3cc6d86c3875422bd9d16643f59c264f59e5f36baa5e59642407acc0d04a8"}
Jan 28 21:01:54 crc kubenswrapper[4746]: I0128 21:01:54.325316 4746 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0"
Jan 28 21:01:54 crc kubenswrapper[4746]: I0128 21:01:54.434030 4746 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/ac700cb5-2363-4223-b64c-ce8b49fede0b-log-httpd\") pod \"ac700cb5-2363-4223-b64c-ce8b49fede0b\" (UID: \"ac700cb5-2363-4223-b64c-ce8b49fede0b\") "
Jan 28 21:01:54 crc kubenswrapper[4746]: I0128 21:01:54.434145 4746 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/ac700cb5-2363-4223-b64c-ce8b49fede0b-combined-ca-bundle\") pod \"ac700cb5-2363-4223-b64c-ce8b49fede0b\" (UID: \"ac700cb5-2363-4223-b64c-ce8b49fede0b\") "
Jan 28 21:01:54 crc kubenswrapper[4746]: I0128 21:01:54.434312 4746 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-whbw7\" (UniqueName: \"kubernetes.io/projected/ac700cb5-2363-4223-b64c-ce8b49fede0b-kube-api-access-whbw7\") pod \"ac700cb5-2363-4223-b64c-ce8b49fede0b\" (UID: \"ac700cb5-2363-4223-b64c-ce8b49fede0b\") "
Jan 28 21:01:54 crc kubenswrapper[4746]: I0128 21:01:54.434341 4746 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/ac700cb5-2363-4223-b64c-ce8b49fede0b-scripts\") pod \"ac700cb5-2363-4223-b64c-ce8b49fede0b\" (UID: \"ac700cb5-2363-4223-b64c-ce8b49fede0b\") "
Jan 28 21:01:54 crc kubenswrapper[4746]: I0128 21:01:54.434369 4746 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/ac700cb5-2363-4223-b64c-ce8b49fede0b-config-data\") pod \"ac700cb5-2363-4223-b64c-ce8b49fede0b\" (UID: \"ac700cb5-2363-4223-b64c-ce8b49fede0b\") "
Jan 28 21:01:54 crc kubenswrapper[4746]: I0128 21:01:54.434395 4746 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/ac700cb5-2363-4223-b64c-ce8b49fede0b-sg-core-conf-yaml\") pod \"ac700cb5-2363-4223-b64c-ce8b49fede0b\" (UID: \"ac700cb5-2363-4223-b64c-ce8b49fede0b\") "
Jan 28 21:01:54 crc kubenswrapper[4746]: I0128 21:01:54.434481 4746 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/ac700cb5-2363-4223-b64c-ce8b49fede0b-run-httpd\") pod \"ac700cb5-2363-4223-b64c-ce8b49fede0b\" (UID: \"ac700cb5-2363-4223-b64c-ce8b49fede0b\") "
Jan 28 21:01:54 crc kubenswrapper[4746]: I0128 21:01:54.435487 4746 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/ac700cb5-2363-4223-b64c-ce8b49fede0b-run-httpd" (OuterVolumeSpecName: "run-httpd") pod "ac700cb5-2363-4223-b64c-ce8b49fede0b" (UID: "ac700cb5-2363-4223-b64c-ce8b49fede0b"). InnerVolumeSpecName "run-httpd". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Jan 28 21:01:54 crc kubenswrapper[4746]: I0128 21:01:54.435745 4746 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/ac700cb5-2363-4223-b64c-ce8b49fede0b-log-httpd" (OuterVolumeSpecName: "log-httpd") pod "ac700cb5-2363-4223-b64c-ce8b49fede0b" (UID: "ac700cb5-2363-4223-b64c-ce8b49fede0b"). InnerVolumeSpecName "log-httpd". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Jan 28 21:01:54 crc kubenswrapper[4746]: I0128 21:01:54.445652 4746 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/ac700cb5-2363-4223-b64c-ce8b49fede0b-kube-api-access-whbw7" (OuterVolumeSpecName: "kube-api-access-whbw7") pod "ac700cb5-2363-4223-b64c-ce8b49fede0b" (UID: "ac700cb5-2363-4223-b64c-ce8b49fede0b"). InnerVolumeSpecName "kube-api-access-whbw7". PluginName "kubernetes.io/projected", VolumeGidValue ""
Jan 28 21:01:54 crc kubenswrapper[4746]: I0128 21:01:54.452844 4746 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/ac700cb5-2363-4223-b64c-ce8b49fede0b-scripts" (OuterVolumeSpecName: "scripts") pod "ac700cb5-2363-4223-b64c-ce8b49fede0b" (UID: "ac700cb5-2363-4223-b64c-ce8b49fede0b"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue ""
Jan 28 21:01:54 crc kubenswrapper[4746]: I0128 21:01:54.478217 4746 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/ac700cb5-2363-4223-b64c-ce8b49fede0b-sg-core-conf-yaml" (OuterVolumeSpecName: "sg-core-conf-yaml") pod "ac700cb5-2363-4223-b64c-ce8b49fede0b" (UID: "ac700cb5-2363-4223-b64c-ce8b49fede0b"). InnerVolumeSpecName "sg-core-conf-yaml". PluginName "kubernetes.io/secret", VolumeGidValue ""
Jan 28 21:01:54 crc kubenswrapper[4746]: I0128 21:01:54.536664 4746 reconciler_common.go:293] "Volume detached for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/ac700cb5-2363-4223-b64c-ce8b49fede0b-log-httpd\") on node \"crc\" DevicePath \"\""
Jan 28 21:01:54 crc kubenswrapper[4746]: I0128 21:01:54.536948 4746 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-whbw7\" (UniqueName: \"kubernetes.io/projected/ac700cb5-2363-4223-b64c-ce8b49fede0b-kube-api-access-whbw7\") on node \"crc\" DevicePath \"\""
Jan 28 21:01:54 crc kubenswrapper[4746]: I0128 21:01:54.536960 4746 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/ac700cb5-2363-4223-b64c-ce8b49fede0b-scripts\") on node \"crc\" DevicePath \"\""
Jan 28 21:01:54 crc kubenswrapper[4746]: I0128 21:01:54.536970 4746 reconciler_common.go:293] "Volume detached for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/ac700cb5-2363-4223-b64c-ce8b49fede0b-sg-core-conf-yaml\") on node \"crc\" DevicePath \"\""
Jan 28 21:01:54 crc kubenswrapper[4746]: I0128 21:01:54.536977 4746 reconciler_common.go:293] "Volume detached for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/ac700cb5-2363-4223-b64c-ce8b49fede0b-run-httpd\") on node \"crc\" DevicePath \"\""
Jan 28 21:01:54 crc kubenswrapper[4746]: I0128 21:01:54.574379 4746 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"ac700cb5-2363-4223-b64c-ce8b49fede0b","Type":"ContainerDied","Data":"8f6a16304a1ca2382cc2d1481ec8f61659e597e6d46a76531d94662a5e48e810"}
Jan 28 21:01:54 crc kubenswrapper[4746]: I0128 21:01:54.574552 4746 scope.go:117] "RemoveContainer" containerID="fa5dd243bdbdaf9c1a3e8d89b06890ad683eba38bc2d27074ffd06017c2a78d8"
Jan 28 21:01:54 crc kubenswrapper[4746]: I0128 21:01:54.574937 4746 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0"
Jan 28 21:01:54 crc kubenswrapper[4746]: I0128 21:01:54.600817 4746 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/ac700cb5-2363-4223-b64c-ce8b49fede0b-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "ac700cb5-2363-4223-b64c-ce8b49fede0b" (UID: "ac700cb5-2363-4223-b64c-ce8b49fede0b"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue ""
Jan 28 21:01:54 crc kubenswrapper[4746]: I0128 21:01:54.621184 4746 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/ac700cb5-2363-4223-b64c-ce8b49fede0b-config-data" (OuterVolumeSpecName: "config-data") pod "ac700cb5-2363-4223-b64c-ce8b49fede0b" (UID: "ac700cb5-2363-4223-b64c-ce8b49fede0b"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue ""
Jan 28 21:01:54 crc kubenswrapper[4746]: I0128 21:01:54.638905 4746 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/ac700cb5-2363-4223-b64c-ce8b49fede0b-combined-ca-bundle\") on node \"crc\" DevicePath \"\""
Jan 28 21:01:54 crc kubenswrapper[4746]: I0128 21:01:54.638939 4746 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/ac700cb5-2363-4223-b64c-ce8b49fede0b-config-data\") on node \"crc\" DevicePath \"\""
Jan 28 21:01:54 crc kubenswrapper[4746]: I0128 21:01:54.760656 4746 scope.go:117] "RemoveContainer" containerID="425b35e63efc0aa88654bf7621c444e520b92e6d39c47e8f77238502bd74fb9a"
Jan 28 21:01:54 crc kubenswrapper[4746]: I0128 21:01:54.796970 4746 scope.go:117] "RemoveContainer" containerID="d1d3cc6d86c3875422bd9d16643f59c264f59e5f36baa5e59642407acc0d04a8"
Jan 28 21:01:54 crc kubenswrapper[4746]: I0128 21:01:54.819962 4746 scope.go:117] "RemoveContainer" containerID="35ba04f239fe059c515dd7957ca078df99c49e377b8595092ddd274dc8857306"
Jan 28 21:01:54 crc kubenswrapper[4746]: I0128 21:01:54.932363 4746 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ceilometer-0"]
Jan 28 21:01:54 crc kubenswrapper[4746]: I0128 21:01:54.941793 4746 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/ceilometer-0"]
Jan 28 21:01:55 crc kubenswrapper[4746]: I0128 21:01:55.008196 4746 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ceilometer-0"]
Jan 28 21:01:55 crc kubenswrapper[4746]: E0128 21:01:55.008932 4746 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="ac700cb5-2363-4223-b64c-ce8b49fede0b" containerName="proxy-httpd"
Jan 28 21:01:55 crc kubenswrapper[4746]: I0128 21:01:55.008949 4746 state_mem.go:107] "Deleted CPUSet assignment" podUID="ac700cb5-2363-4223-b64c-ce8b49fede0b" containerName="proxy-httpd"
Jan 28 21:01:55 crc kubenswrapper[4746]: E0128 21:01:55.008962 4746 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="ac700cb5-2363-4223-b64c-ce8b49fede0b" containerName="sg-core"
Jan 28 21:01:55 crc kubenswrapper[4746]: I0128 21:01:55.008968 4746 state_mem.go:107] "Deleted CPUSet assignment" podUID="ac700cb5-2363-4223-b64c-ce8b49fede0b" containerName="sg-core"
Jan 28 21:01:55 crc kubenswrapper[4746]: E0128 21:01:55.008982 4746 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="ac700cb5-2363-4223-b64c-ce8b49fede0b" containerName="ceilometer-central-agent"
Jan 28 21:01:55 crc kubenswrapper[4746]: I0128 21:01:55.008990 4746 state_mem.go:107] "Deleted CPUSet assignment" podUID="ac700cb5-2363-4223-b64c-ce8b49fede0b" containerName="ceilometer-central-agent"
Jan 28 21:01:55 crc kubenswrapper[4746]: E0128 21:01:55.009020 4746 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="ac700cb5-2363-4223-b64c-ce8b49fede0b" containerName="ceilometer-notification-agent"
Jan 28 21:01:55 crc kubenswrapper[4746]: I0128 21:01:55.009026 4746 state_mem.go:107] "Deleted CPUSet assignment" podUID="ac700cb5-2363-4223-b64c-ce8b49fede0b" containerName="ceilometer-notification-agent"
Jan 28 21:01:55 crc kubenswrapper[4746]: I0128 21:01:55.009283 4746 memory_manager.go:354] "RemoveStaleState removing state" podUID="ac700cb5-2363-4223-b64c-ce8b49fede0b" containerName="ceilometer-central-agent"
Jan 28 21:01:55 crc kubenswrapper[4746]: I0128 21:01:55.009310 4746 memory_manager.go:354] "RemoveStaleState removing state" podUID="ac700cb5-2363-4223-b64c-ce8b49fede0b" containerName="proxy-httpd"
Jan 28 21:01:55 crc kubenswrapper[4746]: I0128 21:01:55.009320 4746 memory_manager.go:354] "RemoveStaleState removing state" podUID="ac700cb5-2363-4223-b64c-ce8b49fede0b" containerName="ceilometer-notification-agent"
Jan 28 21:01:55 crc kubenswrapper[4746]: I0128 21:01:55.009329 4746 memory_manager.go:354] "RemoveStaleState removing state" podUID="ac700cb5-2363-4223-b64c-ce8b49fede0b" containerName="sg-core"
Jan 28 21:01:55 crc kubenswrapper[4746]: I0128 21:01:55.011180 4746 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0"
Jan 28 21:01:55 crc kubenswrapper[4746]: I0128 21:01:55.016696 4746 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceilometer-config-data"
Jan 28 21:01:55 crc kubenswrapper[4746]: I0128 21:01:55.016748 4746 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceilometer-scripts"
Jan 28 21:01:55 crc kubenswrapper[4746]: I0128 21:01:55.025214 4746 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ceilometer-0"]
Jan 28 21:01:55 crc kubenswrapper[4746]: I0128 21:01:55.151566 4746 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/4cbb8deb-abbe-4971-aef9-e1a801eb55eb-scripts\") pod \"ceilometer-0\" (UID: \"4cbb8deb-abbe-4971-aef9-e1a801eb55eb\") " pod="openstack/ceilometer-0"
Jan 28 21:01:55 crc kubenswrapper[4746]: I0128 21:01:55.151631 4746 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/4cbb8deb-abbe-4971-aef9-e1a801eb55eb-config-data\") pod \"ceilometer-0\" (UID: \"4cbb8deb-abbe-4971-aef9-e1a801eb55eb\") " pod="openstack/ceilometer-0"
Jan 28 21:01:55 crc kubenswrapper[4746]: I0128 21:01:55.151836 4746 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/4cbb8deb-abbe-4971-aef9-e1a801eb55eb-run-httpd\") pod \"ceilometer-0\" (UID: \"4cbb8deb-abbe-4971-aef9-e1a801eb55eb\") " pod="openstack/ceilometer-0"
Jan 28 21:01:55 crc kubenswrapper[4746]: I0128 21:01:55.151889 4746 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-7m522\" (UniqueName: \"kubernetes.io/projected/4cbb8deb-abbe-4971-aef9-e1a801eb55eb-kube-api-access-7m522\") pod \"ceilometer-0\" (UID: \"4cbb8deb-abbe-4971-aef9-e1a801eb55eb\") " pod="openstack/ceilometer-0"
Jan 28 21:01:55 crc kubenswrapper[4746]: I0128 21:01:55.152009 4746 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/4cbb8deb-abbe-4971-aef9-e1a801eb55eb-log-httpd\") pod \"ceilometer-0\" (UID: \"4cbb8deb-abbe-4971-aef9-e1a801eb55eb\") " pod="openstack/ceilometer-0"
Jan 28 21:01:55 crc kubenswrapper[4746]: I0128 21:01:55.152180 4746 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/4cbb8deb-abbe-4971-aef9-e1a801eb55eb-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"4cbb8deb-abbe-4971-aef9-e1a801eb55eb\") " pod="openstack/ceilometer-0"
Jan 28 21:01:55 crc kubenswrapper[4746]: I0128 21:01:55.152375 4746 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/4cbb8deb-abbe-4971-aef9-e1a801eb55eb-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"4cbb8deb-abbe-4971-aef9-e1a801eb55eb\") " pod="openstack/ceilometer-0"
Jan 28 21:01:55 crc kubenswrapper[4746]: I0128 21:01:55.254486 4746 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/4cbb8deb-abbe-4971-aef9-e1a801eb55eb-run-httpd\") pod \"ceilometer-0\" (UID: \"4cbb8deb-abbe-4971-aef9-e1a801eb55eb\") " pod="openstack/ceilometer-0"
Jan 28 21:01:55 crc kubenswrapper[4746]: I0128 21:01:55.254547 4746 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-7m522\" (UniqueName: \"kubernetes.io/projected/4cbb8deb-abbe-4971-aef9-e1a801eb55eb-kube-api-access-7m522\") pod \"ceilometer-0\" (UID: \"4cbb8deb-abbe-4971-aef9-e1a801eb55eb\") " pod="openstack/ceilometer-0"
Jan 28 21:01:55 crc kubenswrapper[4746]: I0128 21:01:55.254616 4746 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/4cbb8deb-abbe-4971-aef9-e1a801eb55eb-log-httpd\") pod \"ceilometer-0\" (UID: \"4cbb8deb-abbe-4971-aef9-e1a801eb55eb\") " pod="openstack/ceilometer-0"
Jan 28 21:01:55 crc kubenswrapper[4746]: I0128 21:01:55.254694 4746 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/4cbb8deb-abbe-4971-aef9-e1a801eb55eb-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"4cbb8deb-abbe-4971-aef9-e1a801eb55eb\") " pod="openstack/ceilometer-0"
Jan 28 21:01:55 crc kubenswrapper[4746]: I0128 21:01:55.254776 4746 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/4cbb8deb-abbe-4971-aef9-e1a801eb55eb-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"4cbb8deb-abbe-4971-aef9-e1a801eb55eb\") " pod="openstack/ceilometer-0"
Jan 28 21:01:55 crc kubenswrapper[4746]: I0128 21:01:55.254809 4746 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/4cbb8deb-abbe-4971-aef9-e1a801eb55eb-scripts\") pod \"ceilometer-0\" (UID: \"4cbb8deb-abbe-4971-aef9-e1a801eb55eb\") " pod="openstack/ceilometer-0"
Jan 28 21:01:55 crc kubenswrapper[4746]: I0128 21:01:55.254862 4746 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/4cbb8deb-abbe-4971-aef9-e1a801eb55eb-config-data\") pod \"ceilometer-0\" (UID: \"4cbb8deb-abbe-4971-aef9-e1a801eb55eb\") " pod="openstack/ceilometer-0"
Jan 28 21:01:55 crc kubenswrapper[4746]: I0128 21:01:55.255047 4746 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/4cbb8deb-abbe-4971-aef9-e1a801eb55eb-run-httpd\") pod \"ceilometer-0\" (UID: \"4cbb8deb-abbe-4971-aef9-e1a801eb55eb\") " pod="openstack/ceilometer-0"
Jan 28 21:01:55 crc kubenswrapper[4746]: I0128 21:01:55.255204 4746 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/4cbb8deb-abbe-4971-aef9-e1a801eb55eb-log-httpd\") pod \"ceilometer-0\" (UID: \"4cbb8deb-abbe-4971-aef9-e1a801eb55eb\") " pod="openstack/ceilometer-0"
Jan 28 21:01:55 crc kubenswrapper[4746]: I0128 21:01:55.259400 4746 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/4cbb8deb-abbe-4971-aef9-e1a801eb55eb-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"4cbb8deb-abbe-4971-aef9-e1a801eb55eb\") " pod="openstack/ceilometer-0"
Jan 28 21:01:55 crc kubenswrapper[4746]: I0128 21:01:55.259579 4746 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/4cbb8deb-abbe-4971-aef9-e1a801eb55eb-config-data\") pod \"ceilometer-0\" (UID: \"4cbb8deb-abbe-4971-aef9-e1a801eb55eb\") " pod="openstack/ceilometer-0"
Jan 28 21:01:55 crc kubenswrapper[4746]: I0128 21:01:55.259692 4746 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/4cbb8deb-abbe-4971-aef9-e1a801eb55eb-scripts\") pod \"ceilometer-0\" (UID: \"4cbb8deb-abbe-4971-aef9-e1a801eb55eb\") " pod="openstack/ceilometer-0"
Jan 28 21:01:55 crc kubenswrapper[4746]: I0128 21:01:55.260017 4746 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/4cbb8deb-abbe-4971-aef9-e1a801eb55eb-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"4cbb8deb-abbe-4971-aef9-e1a801eb55eb\") " pod="openstack/ceilometer-0"
Jan 28 21:01:55 crc kubenswrapper[4746]: I0128 21:01:55.279788 4746 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-7m522\" (UniqueName: \"kubernetes.io/projected/4cbb8deb-abbe-4971-aef9-e1a801eb55eb-kube-api-access-7m522\") pod \"ceilometer-0\" (UID: \"4cbb8deb-abbe-4971-aef9-e1a801eb55eb\") " pod="openstack/ceilometer-0"
Jan 28 21:01:55 crc kubenswrapper[4746]: I0128 21:01:55.387813 4746 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0"
Jan 28 21:01:55 crc kubenswrapper[4746]: I0128 21:01:55.591886 4746 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"6509be40-b5da-4c87-bf0d-8b6a75084e60","Type":"ContainerStarted","Data":"c5392fe62e250472df5cf8e4e0ffaed45046c8c1e507a59c0064fefb6d3c2229"}
Jan 28 21:01:55 crc kubenswrapper[4746]: I0128 21:01:55.592231 4746 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"6509be40-b5da-4c87-bf0d-8b6a75084e60","Type":"ContainerStarted","Data":"2d5c6aeb41f3c79d4826fc65a5bf26beed2e4f16a9fe3ac7fb01fd14427dec9b"}
Jan 28 21:01:55 crc kubenswrapper[4746]: I0128 21:01:55.610556 4746 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-scheduler-0" event={"ID":"a812f7c0-8a46-4727-9944-7266f70244bd","Type":"ContainerStarted","Data":"446c2979b187ee17480edf3f2ff55a150d3fb08121eca5c0761921ce2fe79b92"}
Jan 28 21:01:55 crc kubenswrapper[4746]: I0128 21:01:55.623038 4746 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"d76b1f13-0ce0-4706-a7d8-d70670e82b8f","Type":"ContainerStarted","Data":"bd57e2df9d9e364ec7582d491ba8a47746adfa26a358a8c959ff8ec34ca6095c"}
Jan 28 21:01:55 crc kubenswrapper[4746]: I0128 21:01:55.623308 4746 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"d76b1f13-0ce0-4706-a7d8-d70670e82b8f","Type":"ContainerStarted","Data":"27471481cbaf8842ff3ee5bbb4fc59b4b77b4df3e987a2f6bb7dd3cc7327c88a"}
Jan 28 21:01:55 crc kubenswrapper[4746]: I0128 21:01:55.624772 4746 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-novncproxy-0" event={"ID":"66366de9-a831-472a-a11f-f749b76d2007","Type":"ContainerStarted","Data":"3a8783764d91750ca85f5feeb2514d839d6de31cf5bf63a7a028debec1f34e57"}
Jan 28 21:01:55 crc kubenswrapper[4746]: I0128 21:01:55.626549 4746 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-api-0" podStartSLOduration=2.870942864 podStartE2EDuration="8.626525002s" podCreationTimestamp="2026-01-28 21:01:47 +0000 UTC" firstStartedPulling="2026-01-28 21:01:48.743248851 +0000 UTC m=+1336.699435205" lastFinishedPulling="2026-01-28 21:01:54.498830989 +0000 UTC m=+1342.455017343" observedRunningTime="2026-01-28 21:01:55.622615047 +0000 UTC m=+1343.578801401" watchObservedRunningTime="2026-01-28 21:01:55.626525002 +0000 UTC m=+1343.582711356"
Jan 28 21:01:55 crc kubenswrapper[4746]: I0128 21:01:55.682418 4746 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-scheduler-0" podStartSLOduration=3.258714745 podStartE2EDuration="8.682392895s" podCreationTimestamp="2026-01-28 21:01:47 +0000 UTC" firstStartedPulling="2026-01-28 21:01:49.071414458 +0000 UTC m=+1337.027600812" lastFinishedPulling="2026-01-28 21:01:54.495092608 +0000 UTC m=+1342.451278962" observedRunningTime="2026-01-28 21:01:55.665361507 +0000 UTC m=+1343.621547861" watchObservedRunningTime="2026-01-28 21:01:55.682392895 +0000 UTC m=+1343.638579249"
Jan 28 21:01:55 crc kubenswrapper[4746]: I0128 21:01:55.703425 4746 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-metadata-0" podStartSLOduration=3.367616123 podStartE2EDuration="8.70340361s" podCreationTimestamp="2026-01-28 21:01:47 +0000 UTC" firstStartedPulling="2026-01-28 21:01:49.152131859 +0000 UTC m=+1337.108318213" lastFinishedPulling="2026-01-28 21:01:54.487919346 +0000 UTC m=+1342.444105700" observedRunningTime="2026-01-28 21:01:55.689593449 +0000 UTC m=+1343.645779803" watchObservedRunningTime="2026-01-28 21:01:55.70340361 +0000 UTC m=+1343.659589954"
Jan 28 21:01:55 crc kubenswrapper[4746]: I0128 21:01:55.716404 4746 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-cell1-novncproxy-0" podStartSLOduration=3.379165015 podStartE2EDuration="8.716382689s" podCreationTimestamp="2026-01-28 21:01:47 +0000 UTC" firstStartedPulling="2026-01-28 21:01:49.152182881 +0000 UTC m=+1337.108369235" lastFinishedPulling="2026-01-28 21:01:54.489400555 +0000 UTC m=+1342.445586909" observedRunningTime="2026-01-28 21:01:55.711359805 +0000 UTC m=+1343.667546159" watchObservedRunningTime="2026-01-28 21:01:55.716382689 +0000 UTC m=+1343.672569043"
Jan 28 21:01:55 crc kubenswrapper[4746]: I0128 21:01:55.976574 4746 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ceilometer-0"]
Jan 28 21:01:56 crc kubenswrapper[4746]: I0128 21:01:56.640245 4746 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"4cbb8deb-abbe-4971-aef9-e1a801eb55eb","Type":"ContainerStarted","Data":"bd92a726f169e6882618a8bda0a39e84badd73a38af13ac6c385c650235928e5"}
Jan 28 21:01:56 crc kubenswrapper[4746]: I0128 21:01:56.847432 4746 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="ac700cb5-2363-4223-b64c-ce8b49fede0b" path="/var/lib/kubelet/pods/ac700cb5-2363-4223-b64c-ce8b49fede0b/volumes"
Jan 28 21:01:57 crc kubenswrapper[4746]: I0128 21:01:57.652360 4746 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"4cbb8deb-abbe-4971-aef9-e1a801eb55eb","Type":"ContainerStarted","Data":"629e76dea151bd0f814f1082d64186f4b063dca70fa7f1a8f1f73aeec446f671"}
Jan 28 21:01:57 crc kubenswrapper[4746]: I0128 21:01:57.652724 4746 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"4cbb8deb-abbe-4971-aef9-e1a801eb55eb","Type":"ContainerStarted","Data":"b2ad84d6551f116608ef3dc3a9bbb9e32265de65399db25e132d0a1a69fd7ed5"}
Jan 28 21:01:57 crc kubenswrapper[4746]: I0128 21:01:57.879820 4746 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/nova-api-0"
Jan 28 21:01:57 crc kubenswrapper[4746]: I0128 21:01:57.880258 4746 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/nova-api-0"
Jan 28 21:01:58 crc kubenswrapper[4746]: I0128 21:01:58.121748 4746 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/nova-scheduler-0"
Jan 28 21:01:58 crc kubenswrapper[4746]: I0128 21:01:58.122094 4746 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-scheduler-0"
Jan 28 21:01:58 crc kubenswrapper[4746]: I0128 21:01:58.160722 4746 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/nova-scheduler-0"
Jan 28 21:01:58 crc kubenswrapper[4746]: I0128 21:01:58.425206 4746 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/nova-cell1-novncproxy-0"
Jan 28 21:01:58 crc kubenswrapper[4746]: I0128 21:01:58.425323 4746 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-cell1-novncproxy-0"
Jan 28 21:01:58 crc kubenswrapper[4746]: I0128 21:01:58.438349 4746 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/nova-cell1-novncproxy-0"
Jan 28 21:01:58 crc kubenswrapper[4746]: I0128 21:01:58.461808 4746 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-metadata-0"
Jan 28 21:01:58 crc kubenswrapper[4746]: I0128 21:01:58.461920 4746 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/nova-metadata-0"
Jan 28 21:01:58 crc kubenswrapper[4746]: I0128 21:01:58.461935 4746 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/nova-metadata-0"
Jan 28 21:01:58 crc kubenswrapper[4746]: I0128 21:01:58.462153 4746 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-metadata-0"
Jan 28 21:01:58 crc kubenswrapper[4746]: I0128 21:01:58.605222 4746 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/dnsmasq-dns-78cd565959-hcnml"
Jan 28 21:01:58 crc kubenswrapper[4746]: I0128 21:01:58.674582 4746 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-67bdc55879-wpkx6"]
Jan 28 21:01:58 crc kubenswrapper[4746]: I0128 21:01:58.675053 4746 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/dnsmasq-dns-67bdc55879-wpkx6" podUID="8bc2ca65-f7aa-42c6-ac4f-accc7a46a0c7" containerName="dnsmasq-dns" containerID="cri-o://ed92c85d85d05f882b4652d6fab0ad079fda2326af5a70d2e8cc1a622430551f" gracePeriod=10
Jan 28 21:01:58 crc kubenswrapper[4746]: I0128 21:01:58.688169 4746 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"4cbb8deb-abbe-4971-aef9-e1a801eb55eb","Type":"ContainerStarted","Data":"ee473725abd27899722f37d0953e40f88fcf03f78dea819d20fd5c74d953a493"}
Jan 28 21:01:58 crc kubenswrapper[4746]: I0128 21:01:58.722375 4746 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/nova-cell1-novncproxy-0"
Jan 28 21:01:58 crc kubenswrapper[4746]: I0128 21:01:58.771449 4746 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/nova-scheduler-0"
Jan 28 21:01:58 crc kubenswrapper[4746]: I0128 21:01:58.937882 4746 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/nova-cell0-conductor-0"
Jan 28 21:01:58 crc kubenswrapper[4746]: I0128 21:01:58.973397 4746 prober.go:107] "Probe failed" probeType="Startup" pod="openstack/nova-api-0" podUID="6509be40-b5da-4c87-bf0d-8b6a75084e60" containerName="nova-api-api" probeResult="failure" output="Get \"http://10.217.0.217:8774/\": context deadline exceeded (Client.Timeout exceeded while awaiting headers)"
Jan 28 21:01:58 crc kubenswrapper[4746]: I0128 21:01:58.973670 4746 prober.go:107] "Probe failed" probeType="Startup" pod="openstack/nova-api-0" podUID="6509be40-b5da-4c87-bf0d-8b6a75084e60" containerName="nova-api-log" probeResult="failure" output="Get \"http://10.217.0.217:8774/\": context deadline exceeded (Client.Timeout exceeded while awaiting headers)"
Jan 28 21:01:59 crc kubenswrapper[4746]: I0128 21:01:59.504571 4746 prober.go:107] "Probe failed" probeType="Startup" pod="openstack/nova-metadata-0" podUID="d76b1f13-0ce0-4706-a7d8-d70670e82b8f" containerName="nova-metadata-log" probeResult="failure" output="Get \"http://10.217.0.220:8775/\": context deadline exceeded (Client.Timeout exceeded while awaiting headers)"
Jan 28 21:01:59 crc kubenswrapper[4746]: I0128 21:01:59.546269 4746 prober.go:107] "Probe failed" probeType="Startup" pod="openstack/nova-metadata-0" podUID="d76b1f13-0ce0-4706-a7d8-d70670e82b8f" containerName="nova-metadata-metadata" probeResult="failure" output="Get \"http://10.217.0.220:8775/\": context deadline exceeded (Client.Timeout exceeded while awaiting headers)"
Jan 28 21:01:59 crc kubenswrapper[4746]: I0128 21:01:59.631655 4746 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-67bdc55879-wpkx6"
Jan 28 21:01:59 crc kubenswrapper[4746]: I0128 21:01:59.708764 4746 generic.go:334] "Generic (PLEG): container finished" podID="8bc2ca65-f7aa-42c6-ac4f-accc7a46a0c7" containerID="ed92c85d85d05f882b4652d6fab0ad079fda2326af5a70d2e8cc1a622430551f" exitCode=0
Jan 28 21:01:59 crc kubenswrapper[4746]: I0128 21:01:59.709162 4746 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-67bdc55879-wpkx6" event={"ID":"8bc2ca65-f7aa-42c6-ac4f-accc7a46a0c7","Type":"ContainerDied","Data":"ed92c85d85d05f882b4652d6fab0ad079fda2326af5a70d2e8cc1a622430551f"}
Jan 28 21:01:59 crc kubenswrapper[4746]: I0128 21:01:59.709194 4746 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-67bdc55879-wpkx6" event={"ID":"8bc2ca65-f7aa-42c6-ac4f-accc7a46a0c7","Type":"ContainerDied","Data":"3629995d4fda40f9859d8bcb57b0d7c16909c9d6e84bf3d481632363b64381c7"}
Jan 28 21:01:59 crc kubenswrapper[4746]: I0128 21:01:59.709216 4746 scope.go:117] "RemoveContainer" containerID="ed92c85d85d05f882b4652d6fab0ad079fda2326af5a70d2e8cc1a622430551f"
Jan 28 21:01:59 crc kubenswrapper[4746]: I0128 21:01:59.709398 4746 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-67bdc55879-wpkx6"
Jan 28 21:01:59 crc kubenswrapper[4746]: I0128 21:01:59.721854 4746 generic.go:334] "Generic (PLEG): container finished" podID="8b92f820-9bba-4102-b4ee-1c541c3a05d7" containerID="7c214f6d26474e5fb2d9b8ea32566cfc23faa619a749415b5082e72af23ad209" exitCode=0
Jan 28 21:01:59 crc kubenswrapper[4746]: I0128 21:01:59.722864 4746 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-cell-mapping-28g8p" event={"ID":"8b92f820-9bba-4102-b4ee-1c541c3a05d7","Type":"ContainerDied","Data":"7c214f6d26474e5fb2d9b8ea32566cfc23faa619a749415b5082e72af23ad209"}
Jan 28 21:01:59 crc kubenswrapper[4746]: I0128 21:01:59.783991 4746 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-f4qfz\" (UniqueName: \"kubernetes.io/projected/8bc2ca65-f7aa-42c6-ac4f-accc7a46a0c7-kube-api-access-f4qfz\") pod \"8bc2ca65-f7aa-42c6-ac4f-accc7a46a0c7\" (UID: \"8bc2ca65-f7aa-42c6-ac4f-accc7a46a0c7\") "
Jan 28 21:01:59 crc kubenswrapper[4746]: I0128 21:01:59.784063 4746 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/8bc2ca65-f7aa-42c6-ac4f-accc7a46a0c7-dns-swift-storage-0\") pod \"8bc2ca65-f7aa-42c6-ac4f-accc7a46a0c7\" (UID: \"8bc2ca65-f7aa-42c6-ac4f-accc7a46a0c7\") "
Jan 28 21:01:59 crc kubenswrapper[4746]: I0128 21:01:59.784191 4746 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/8bc2ca65-f7aa-42c6-ac4f-accc7a46a0c7-config\") pod \"8bc2ca65-f7aa-42c6-ac4f-accc7a46a0c7\" (UID: \"8bc2ca65-f7aa-42c6-ac4f-accc7a46a0c7\") "
Jan 28 21:01:59 crc kubenswrapper[4746]: I0128 21:01:59.784224 4746 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/8bc2ca65-f7aa-42c6-ac4f-accc7a46a0c7-ovsdbserver-sb\") pod \"8bc2ca65-f7aa-42c6-ac4f-accc7a46a0c7\" (UID: \"8bc2ca65-f7aa-42c6-ac4f-accc7a46a0c7\") "
Jan 28 21:01:59 crc kubenswrapper[4746]: I0128 21:01:59.784269 4746 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/8bc2ca65-f7aa-42c6-ac4f-accc7a46a0c7-dns-svc\") pod \"8bc2ca65-f7aa-42c6-ac4f-accc7a46a0c7\" (UID: \"8bc2ca65-f7aa-42c6-ac4f-accc7a46a0c7\") "
Jan 28 21:01:59 crc kubenswrapper[4746]: I0128 21:01:59.784384 4746 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/8bc2ca65-f7aa-42c6-ac4f-accc7a46a0c7-ovsdbserver-nb\") pod \"8bc2ca65-f7aa-42c6-ac4f-accc7a46a0c7\" (UID: \"8bc2ca65-f7aa-42c6-ac4f-accc7a46a0c7\") "
Jan 28 21:01:59 crc kubenswrapper[4746]: I0128 21:01:59.793237 4746 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/8bc2ca65-f7aa-42c6-ac4f-accc7a46a0c7-kube-api-access-f4qfz" (OuterVolumeSpecName: "kube-api-access-f4qfz") pod "8bc2ca65-f7aa-42c6-ac4f-accc7a46a0c7" (UID: "8bc2ca65-f7aa-42c6-ac4f-accc7a46a0c7"). InnerVolumeSpecName "kube-api-access-f4qfz". PluginName "kubernetes.io/projected", VolumeGidValue ""
Jan 28 21:01:59 crc kubenswrapper[4746]: I0128 21:01:59.821148 4746 scope.go:117] "RemoveContainer" containerID="8e5056677d97b78160110caaa2719f7a0de3c0c2b784a82d52054567943aaa26"
Jan 28 21:01:59 crc kubenswrapper[4746]: I0128 21:01:59.893718 4746 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/8bc2ca65-f7aa-42c6-ac4f-accc7a46a0c7-ovsdbserver-sb" (OuterVolumeSpecName: "ovsdbserver-sb") pod "8bc2ca65-f7aa-42c6-ac4f-accc7a46a0c7" (UID: "8bc2ca65-f7aa-42c6-ac4f-accc7a46a0c7"). InnerVolumeSpecName "ovsdbserver-sb". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Jan 28 21:01:59 crc kubenswrapper[4746]: I0128 21:01:59.909627 4746 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-f4qfz\" (UniqueName: \"kubernetes.io/projected/8bc2ca65-f7aa-42c6-ac4f-accc7a46a0c7-kube-api-access-f4qfz\") on node \"crc\" DevicePath \"\""
Jan 28 21:01:59 crc kubenswrapper[4746]: I0128 21:01:59.909654 4746 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/8bc2ca65-f7aa-42c6-ac4f-accc7a46a0c7-ovsdbserver-sb\") on node \"crc\" DevicePath \"\""
Jan 28 21:01:59 crc kubenswrapper[4746]: I0128 21:01:59.910773 4746 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-cell1-novncproxy-0"]
Jan 28 21:01:59 crc kubenswrapper[4746]: I0128 21:01:59.938631 4746 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/8bc2ca65-f7aa-42c6-ac4f-accc7a46a0c7-dns-swift-storage-0" (OuterVolumeSpecName: "dns-swift-storage-0") pod "8bc2ca65-f7aa-42c6-ac4f-accc7a46a0c7" (UID: "8bc2ca65-f7aa-42c6-ac4f-accc7a46a0c7"). InnerVolumeSpecName "dns-swift-storage-0". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Jan 28 21:01:59 crc kubenswrapper[4746]: I0128 21:01:59.955736 4746 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-metadata-0"]
Jan 28 21:01:59 crc kubenswrapper[4746]: I0128 21:01:59.955991 4746 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-metadata-0" podUID="d76b1f13-0ce0-4706-a7d8-d70670e82b8f" containerName="nova-metadata-log" containerID="cri-o://27471481cbaf8842ff3ee5bbb4fc59b4b77b4df3e987a2f6bb7dd3cc7327c88a" gracePeriod=30
Jan 28 21:01:59 crc kubenswrapper[4746]: I0128 21:01:59.956498 4746 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-metadata-0" podUID="d76b1f13-0ce0-4706-a7d8-d70670e82b8f" containerName="nova-metadata-metadata" containerID="cri-o://bd57e2df9d9e364ec7582d491ba8a47746adfa26a358a8c959ff8ec34ca6095c" gracePeriod=30
Jan 28 21:01:59 crc kubenswrapper[4746]: I0128 21:01:59.965712 4746 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/8bc2ca65-f7aa-42c6-ac4f-accc7a46a0c7-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "8bc2ca65-f7aa-42c6-ac4f-accc7a46a0c7" (UID: "8bc2ca65-f7aa-42c6-ac4f-accc7a46a0c7"). InnerVolumeSpecName "dns-svc". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Jan 28 21:02:00 crc kubenswrapper[4746]: I0128 21:02:00.012392 4746 reconciler_common.go:293] "Volume detached for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/8bc2ca65-f7aa-42c6-ac4f-accc7a46a0c7-dns-swift-storage-0\") on node \"crc\" DevicePath \"\""
Jan 28 21:02:00 crc kubenswrapper[4746]: I0128 21:02:00.012422 4746 reconciler_common.go:293] "Volume detached for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/8bc2ca65-f7aa-42c6-ac4f-accc7a46a0c7-dns-svc\") on node \"crc\" DevicePath \"\""
Jan 28 21:02:00 crc kubenswrapper[4746]: I0128 21:02:00.023324 4746 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/8bc2ca65-f7aa-42c6-ac4f-accc7a46a0c7-ovsdbserver-nb" (OuterVolumeSpecName: "ovsdbserver-nb") pod "8bc2ca65-f7aa-42c6-ac4f-accc7a46a0c7" (UID: "8bc2ca65-f7aa-42c6-ac4f-accc7a46a0c7"). InnerVolumeSpecName "ovsdbserver-nb". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Jan 28 21:02:00 crc kubenswrapper[4746]: I0128 21:02:00.026521 4746 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/8bc2ca65-f7aa-42c6-ac4f-accc7a46a0c7-config" (OuterVolumeSpecName: "config") pod "8bc2ca65-f7aa-42c6-ac4f-accc7a46a0c7" (UID: "8bc2ca65-f7aa-42c6-ac4f-accc7a46a0c7"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Jan 28 21:02:00 crc kubenswrapper[4746]: I0128 21:02:00.051299 4746 scope.go:117] "RemoveContainer" containerID="ed92c85d85d05f882b4652d6fab0ad079fda2326af5a70d2e8cc1a622430551f"
Jan 28 21:02:00 crc kubenswrapper[4746]: E0128 21:02:00.051769 4746 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"ed92c85d85d05f882b4652d6fab0ad079fda2326af5a70d2e8cc1a622430551f\": container with ID starting with ed92c85d85d05f882b4652d6fab0ad079fda2326af5a70d2e8cc1a622430551f not found: ID does not exist" containerID="ed92c85d85d05f882b4652d6fab0ad079fda2326af5a70d2e8cc1a622430551f"
Jan 28 21:02:00 crc kubenswrapper[4746]: I0128 21:02:00.051802 4746 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"ed92c85d85d05f882b4652d6fab0ad079fda2326af5a70d2e8cc1a622430551f"} err="failed to get container status \"ed92c85d85d05f882b4652d6fab0ad079fda2326af5a70d2e8cc1a622430551f\": rpc error: code = NotFound desc = could not find container \"ed92c85d85d05f882b4652d6fab0ad079fda2326af5a70d2e8cc1a622430551f\": container with ID starting with ed92c85d85d05f882b4652d6fab0ad079fda2326af5a70d2e8cc1a622430551f not found: ID does not exist"
Jan 28 21:02:00 crc kubenswrapper[4746]: I0128 21:02:00.051824 4746 scope.go:117] "RemoveContainer" containerID="8e5056677d97b78160110caaa2719f7a0de3c0c2b784a82d52054567943aaa26"
Jan 28 21:02:00 crc kubenswrapper[4746]: E0128 21:02:00.057611 4746 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"8e5056677d97b78160110caaa2719f7a0de3c0c2b784a82d52054567943aaa26\": container with ID starting with 8e5056677d97b78160110caaa2719f7a0de3c0c2b784a82d52054567943aaa26 not found: ID does not exist" containerID="8e5056677d97b78160110caaa2719f7a0de3c0c2b784a82d52054567943aaa26"
Jan 28 21:02:00 crc kubenswrapper[4746]: I0128 21:02:00.057670 4746 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"8e5056677d97b78160110caaa2719f7a0de3c0c2b784a82d52054567943aaa26"} err="failed to get container status \"8e5056677d97b78160110caaa2719f7a0de3c0c2b784a82d52054567943aaa26\": rpc error: code = NotFound desc = could not find container \"8e5056677d97b78160110caaa2719f7a0de3c0c2b784a82d52054567943aaa26\": container with ID starting with 8e5056677d97b78160110caaa2719f7a0de3c0c2b784a82d52054567943aaa26 not found: ID does not exist"
Jan 28 21:02:00 crc kubenswrapper[4746]: I0128 21:02:00.113961 4746 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/8bc2ca65-f7aa-42c6-ac4f-accc7a46a0c7-ovsdbserver-nb\") on node \"crc\" DevicePath \"\""
Jan 28 21:02:00 crc kubenswrapper[4746]: I0128 21:02:00.113999 4746 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/8bc2ca65-f7aa-42c6-ac4f-accc7a46a0c7-config\") on node \"crc\" DevicePath \"\""
Jan 28 21:02:00 crc kubenswrapper[4746]: I0128 21:02:00.350203 4746 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-67bdc55879-wpkx6"]
Jan 28 21:02:00 crc kubenswrapper[4746]: I0128 21:02:00.363126 4746 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-67bdc55879-wpkx6"]
Jan 28 21:02:00 crc kubenswrapper[4746]: I0128 21:02:00.744839 4746 generic.go:334] "Generic (PLEG): container finished" podID="d76b1f13-0ce0-4706-a7d8-d70670e82b8f" containerID="27471481cbaf8842ff3ee5bbb4fc59b4b77b4df3e987a2f6bb7dd3cc7327c88a" exitCode=143
Jan 28 21:02:00 crc kubenswrapper[4746]: I0128 21:02:00.744925 4746 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"d76b1f13-0ce0-4706-a7d8-d70670e82b8f","Type":"ContainerDied","Data":"27471481cbaf8842ff3ee5bbb4fc59b4b77b4df3e987a2f6bb7dd3cc7327c88a"}
Jan 28 21:02:00 crc kubenswrapper[4746]: I0128 21:02:00.848794 4746 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="8bc2ca65-f7aa-42c6-ac4f-accc7a46a0c7" path="/var/lib/kubelet/pods/8bc2ca65-f7aa-42c6-ac4f-accc7a46a0c7/volumes"
Jan 28 21:02:01 crc kubenswrapper[4746]: I0128 21:02:01.262613 4746 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell0-cell-mapping-28g8p"
Jan 28 21:02:01 crc kubenswrapper[4746]: I0128 21:02:01.352777 4746 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/8b92f820-9bba-4102-b4ee-1c541c3a05d7-config-data\") pod \"8b92f820-9bba-4102-b4ee-1c541c3a05d7\" (UID: \"8b92f820-9bba-4102-b4ee-1c541c3a05d7\") "
Jan 28 21:02:01 crc kubenswrapper[4746]: I0128 21:02:01.353164 4746 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-c9krz\" (UniqueName: \"kubernetes.io/projected/8b92f820-9bba-4102-b4ee-1c541c3a05d7-kube-api-access-c9krz\") pod \"8b92f820-9bba-4102-b4ee-1c541c3a05d7\" (UID: \"8b92f820-9bba-4102-b4ee-1c541c3a05d7\") "
Jan 28 21:02:01 crc kubenswrapper[4746]: I0128 21:02:01.353311 4746 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/8b92f820-9bba-4102-b4ee-1c541c3a05d7-combined-ca-bundle\") pod \"8b92f820-9bba-4102-b4ee-1c541c3a05d7\" (UID: \"8b92f820-9bba-4102-b4ee-1c541c3a05d7\") "
Jan 28 21:02:01 crc kubenswrapper[4746]: I0128 21:02:01.353356 4746 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/8b92f820-9bba-4102-b4ee-1c541c3a05d7-scripts\") pod \"8b92f820-9bba-4102-b4ee-1c541c3a05d7\" (UID: \"8b92f820-9bba-4102-b4ee-1c541c3a05d7\") "
Jan 28 21:02:01 crc kubenswrapper[4746]: I0128 21:02:01.358748 4746 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/8b92f820-9bba-4102-b4ee-1c541c3a05d7-kube-api-access-c9krz" (OuterVolumeSpecName: "kube-api-access-c9krz") pod "8b92f820-9bba-4102-b4ee-1c541c3a05d7" (UID: "8b92f820-9bba-4102-b4ee-1c541c3a05d7"). InnerVolumeSpecName "kube-api-access-c9krz". PluginName "kubernetes.io/projected", VolumeGidValue ""
Jan 28 21:02:01 crc kubenswrapper[4746]: I0128 21:02:01.364049 4746 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/8b92f820-9bba-4102-b4ee-1c541c3a05d7-scripts" (OuterVolumeSpecName: "scripts") pod "8b92f820-9bba-4102-b4ee-1c541c3a05d7" (UID: "8b92f820-9bba-4102-b4ee-1c541c3a05d7"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue ""
Jan 28 21:02:01 crc kubenswrapper[4746]: I0128 21:02:01.408991 4746 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/8b92f820-9bba-4102-b4ee-1c541c3a05d7-config-data" (OuterVolumeSpecName: "config-data") pod "8b92f820-9bba-4102-b4ee-1c541c3a05d7" (UID: "8b92f820-9bba-4102-b4ee-1c541c3a05d7"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue ""
Jan 28 21:02:01 crc kubenswrapper[4746]: I0128 21:02:01.417116 4746 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/8b92f820-9bba-4102-b4ee-1c541c3a05d7-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "8b92f820-9bba-4102-b4ee-1c541c3a05d7" (UID: "8b92f820-9bba-4102-b4ee-1c541c3a05d7"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue ""
Jan 28 21:02:01 crc kubenswrapper[4746]: I0128 21:02:01.456116 4746 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/8b92f820-9bba-4102-b4ee-1c541c3a05d7-combined-ca-bundle\") on node \"crc\" DevicePath \"\""
Jan 28 21:02:01 crc kubenswrapper[4746]: I0128 21:02:01.456152 4746 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/8b92f820-9bba-4102-b4ee-1c541c3a05d7-scripts\") on node \"crc\" DevicePath \"\""
Jan 28 21:02:01 crc kubenswrapper[4746]: I0128 21:02:01.456162 4746 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/8b92f820-9bba-4102-b4ee-1c541c3a05d7-config-data\") on node \"crc\" DevicePath \"\""
Jan 28 21:02:01 crc kubenswrapper[4746]: I0128 21:02:01.456171 4746 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-c9krz\" (UniqueName: \"kubernetes.io/projected/8b92f820-9bba-4102-b4ee-1c541c3a05d7-kube-api-access-c9krz\") on node \"crc\" DevicePath \"\""
Jan 28 21:02:01 crc kubenswrapper[4746]: I0128 21:02:01.756985 4746 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"4cbb8deb-abbe-4971-aef9-e1a801eb55eb","Type":"ContainerStarted","Data":"a588d0fde33b7e202fa8ce27e993cd9efb4be5bf280c684d972489d45edd2e24"}
Jan 28 21:02:01 crc kubenswrapper[4746]: I0128 21:02:01.757131 4746 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/ceilometer-0"
Jan 28 21:02:01 crc kubenswrapper[4746]: I0128 21:02:01.758607 4746 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-cell-mapping-28g8p" event={"ID":"8b92f820-9bba-4102-b4ee-1c541c3a05d7","Type":"ContainerDied","Data":"5ef99f1b1768da50031514281db28d141f4edb814d5d9eca44686c733576fe8d"}
Jan 28 21:02:01 crc kubenswrapper[4746]: I0128 21:02:01.758637 4746 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-cell1-novncproxy-0" podUID="66366de9-a831-472a-a11f-f749b76d2007" containerName="nova-cell1-novncproxy-novncproxy" containerID="cri-o://3a8783764d91750ca85f5feeb2514d839d6de31cf5bf63a7a028debec1f34e57" gracePeriod=30
Jan 28 21:02:01 crc kubenswrapper[4746]: I0128 21:02:01.758655 4746 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell0-cell-mapping-28g8p"
Jan 28 21:02:01 crc kubenswrapper[4746]: I0128 21:02:01.758646 4746 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="5ef99f1b1768da50031514281db28d141f4edb814d5d9eca44686c733576fe8d"
Jan 28 21:02:01 crc kubenswrapper[4746]: I0128 21:02:01.795407 4746 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/ceilometer-0" podStartSLOduration=3.183704629 podStartE2EDuration="7.795384637s" podCreationTimestamp="2026-01-28 21:01:54 +0000 UTC" firstStartedPulling="2026-01-28 21:01:55.960027533 +0000 UTC m=+1343.916213887" lastFinishedPulling="2026-01-28 21:02:00.571707541 +0000 UTC m=+1348.527893895" observedRunningTime="2026-01-28 21:02:01.784851293 +0000 UTC m=+1349.741037647" watchObservedRunningTime="2026-01-28 21:02:01.795384637 +0000 UTC m=+1349.751570991"
Jan 28 21:02:01 crc kubenswrapper[4746]: I0128 21:02:01.901371 4746 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-api-0"]
Jan 28 21:02:01 crc kubenswrapper[4746]: I0128 21:02:01.901673 4746 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-api-0" podUID="6509be40-b5da-4c87-bf0d-8b6a75084e60" containerName="nova-api-log" containerID="cri-o://2d5c6aeb41f3c79d4826fc65a5bf26beed2e4f16a9fe3ac7fb01fd14427dec9b" gracePeriod=30
Jan 28 21:02:01 crc kubenswrapper[4746]: I0128 21:02:01.901976 4746 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-api-0" podUID="6509be40-b5da-4c87-bf0d-8b6a75084e60" containerName="nova-api-api" containerID="cri-o://c5392fe62e250472df5cf8e4e0ffaed45046c8c1e507a59c0064fefb6d3c2229" gracePeriod=30
Jan 28 21:02:01 crc kubenswrapper[4746]: I0128 21:02:01.914113 4746 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-scheduler-0"]
Jan 28 21:02:01 crc kubenswrapper[4746]: I0128 21:02:01.914306 4746 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-scheduler-0" podUID="a812f7c0-8a46-4727-9944-7266f70244bd" containerName="nova-scheduler-scheduler" containerID="cri-o://446c2979b187ee17480edf3f2ff55a150d3fb08121eca5c0761921ce2fe79b92" gracePeriod=30
Jan 28 21:02:02 crc kubenswrapper[4746]: I0128 21:02:02.772806 4746 generic.go:334] "Generic (PLEG): container finished" podID="66366de9-a831-472a-a11f-f749b76d2007" containerID="3a8783764d91750ca85f5feeb2514d839d6de31cf5bf63a7a028debec1f34e57" exitCode=0
Jan 28 21:02:02 crc kubenswrapper[4746]: I0128 21:02:02.772888 4746 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-novncproxy-0" event={"ID":"66366de9-a831-472a-a11f-f749b76d2007","Type":"ContainerDied","Data":"3a8783764d91750ca85f5feeb2514d839d6de31cf5bf63a7a028debec1f34e57"}
Jan 28 21:02:02 crc kubenswrapper[4746]: I0128 21:02:02.780462 4746 generic.go:334] "Generic (PLEG): container finished" podID="6509be40-b5da-4c87-bf0d-8b6a75084e60" containerID="2d5c6aeb41f3c79d4826fc65a5bf26beed2e4f16a9fe3ac7fb01fd14427dec9b" exitCode=143
Jan 28 21:02:02 crc kubenswrapper[4746]: I0128 21:02:02.780535 4746 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"6509be40-b5da-4c87-bf0d-8b6a75084e60","Type":"ContainerDied","Data":"2d5c6aeb41f3c79d4826fc65a5bf26beed2e4f16a9fe3ac7fb01fd14427dec9b"}
Jan 28 21:02:02 crc kubenswrapper[4746]: I0128 21:02:02.785515 4746 generic.go:334] "Generic (PLEG): container finished" podID="a812f7c0-8a46-4727-9944-7266f70244bd" containerID="446c2979b187ee17480edf3f2ff55a150d3fb08121eca5c0761921ce2fe79b92" exitCode=0
Jan 28 21:02:02 crc kubenswrapper[4746]: I0128 21:02:02.785628 4746 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-scheduler-0" event={"ID":"a812f7c0-8a46-4727-9944-7266f70244bd","Type":"ContainerDied","Data":"446c2979b187ee17480edf3f2ff55a150d3fb08121eca5c0761921ce2fe79b92"}
Jan 28 21:02:03 crc kubenswrapper[4746]: E0128 21:02:03.123412 4746 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = NotFound desc = container is not created or running: checking if PID of 446c2979b187ee17480edf3f2ff55a150d3fb08121eca5c0761921ce2fe79b92 is running failed: container process not found" containerID="446c2979b187ee17480edf3f2ff55a150d3fb08121eca5c0761921ce2fe79b92" cmd=["/usr/bin/pgrep","-r","DRST","nova-scheduler"]
Jan 28 21:02:03 crc kubenswrapper[4746]: E0128 21:02:03.124115 4746 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = NotFound desc = container is not created or running: checking if PID of 446c2979b187ee17480edf3f2ff55a150d3fb08121eca5c0761921ce2fe79b92 is running failed: container process not found" containerID="446c2979b187ee17480edf3f2ff55a150d3fb08121eca5c0761921ce2fe79b92" cmd=["/usr/bin/pgrep","-r","DRST","nova-scheduler"]
Jan 28 21:02:03 crc kubenswrapper[4746]: E0128 21:02:03.125386 4746 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = NotFound desc = container is not created or running: checking if PID of 446c2979b187ee17480edf3f2ff55a150d3fb08121eca5c0761921ce2fe79b92 is running failed: container process not found" containerID="446c2979b187ee17480edf3f2ff55a150d3fb08121eca5c0761921ce2fe79b92" cmd=["/usr/bin/pgrep","-r","DRST","nova-scheduler"]
Jan 28 21:02:03 crc kubenswrapper[4746]: E0128 21:02:03.125441 4746 prober.go:104] "Probe errored" err="rpc error: code = NotFound desc = container is not created or running: checking if PID of 446c2979b187ee17480edf3f2ff55a150d3fb08121eca5c0761921ce2fe79b92 is running failed: container process not found" probeType="Readiness" pod="openstack/nova-scheduler-0" podUID="a812f7c0-8a46-4727-9944-7266f70244bd" containerName="nova-scheduler-scheduler"
Jan 28 21:02:03 crc kubenswrapper[4746]: I0128 21:02:03.454944 4746 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-scheduler-0"
Jan 28 21:02:03 crc kubenswrapper[4746]: I0128 21:02:03.462518 4746 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell1-novncproxy-0"
Jan 28 21:02:03 crc kubenswrapper[4746]: I0128 21:02:03.499640 4746 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/a812f7c0-8a46-4727-9944-7266f70244bd-config-data\") pod \"a812f7c0-8a46-4727-9944-7266f70244bd\" (UID: \"a812f7c0-8a46-4727-9944-7266f70244bd\") "
Jan 28 21:02:03 crc kubenswrapper[4746]: I0128 21:02:03.499824 4746 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-z78c7\" (UniqueName: \"kubernetes.io/projected/a812f7c0-8a46-4727-9944-7266f70244bd-kube-api-access-z78c7\") pod \"a812f7c0-8a46-4727-9944-7266f70244bd\" (UID: \"a812f7c0-8a46-4727-9944-7266f70244bd\") "
Jan 28 21:02:03 crc kubenswrapper[4746]: I0128 21:02:03.499910 4746 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/a812f7c0-8a46-4727-9944-7266f70244bd-combined-ca-bundle\") pod \"a812f7c0-8a46-4727-9944-7266f70244bd\" (UID: \"a812f7c0-8a46-4727-9944-7266f70244bd\") "
Jan 28 21:02:03 crc kubenswrapper[4746]: I0128 21:02:03.515295 4746 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/a812f7c0-8a46-4727-9944-7266f70244bd-kube-api-access-z78c7" (OuterVolumeSpecName: "kube-api-access-z78c7") pod "a812f7c0-8a46-4727-9944-7266f70244bd" (UID: "a812f7c0-8a46-4727-9944-7266f70244bd"). InnerVolumeSpecName "kube-api-access-z78c7". PluginName "kubernetes.io/projected", VolumeGidValue ""
Jan 28 21:02:03 crc kubenswrapper[4746]: I0128 21:02:03.542174 4746 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/a812f7c0-8a46-4727-9944-7266f70244bd-config-data" (OuterVolumeSpecName: "config-data") pod "a812f7c0-8a46-4727-9944-7266f70244bd" (UID: "a812f7c0-8a46-4727-9944-7266f70244bd"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue ""
Jan 28 21:02:03 crc kubenswrapper[4746]: I0128 21:02:03.565789 4746 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/a812f7c0-8a46-4727-9944-7266f70244bd-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "a812f7c0-8a46-4727-9944-7266f70244bd" (UID: "a812f7c0-8a46-4727-9944-7266f70244bd"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue ""
Jan 28 21:02:03 crc kubenswrapper[4746]: I0128 21:02:03.601388 4746 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/66366de9-a831-472a-a11f-f749b76d2007-config-data\") pod \"66366de9-a831-472a-a11f-f749b76d2007\" (UID: \"66366de9-a831-472a-a11f-f749b76d2007\") "
Jan 28 21:02:03 crc kubenswrapper[4746]: I0128 21:02:03.601477 4746 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-xkqck\" (UniqueName: \"kubernetes.io/projected/66366de9-a831-472a-a11f-f749b76d2007-kube-api-access-xkqck\") pod \"66366de9-a831-472a-a11f-f749b76d2007\" (UID: \"66366de9-a831-472a-a11f-f749b76d2007\") "
Jan 28 21:02:03 crc kubenswrapper[4746]: I0128 21:02:03.601679 4746 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/66366de9-a831-472a-a11f-f749b76d2007-combined-ca-bundle\") pod \"66366de9-a831-472a-a11f-f749b76d2007\" (UID: \"66366de9-a831-472a-a11f-f749b76d2007\") "
Jan 28 21:02:03 crc kubenswrapper[4746]: I0128 21:02:03.602287 4746 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/a812f7c0-8a46-4727-9944-7266f70244bd-config-data\") on node \"crc\" DevicePath \"\""
Jan 28 21:02:03 crc kubenswrapper[4746]: I0128 21:02:03.602302 4746 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-z78c7\" (UniqueName: \"kubernetes.io/projected/a812f7c0-8a46-4727-9944-7266f70244bd-kube-api-access-z78c7\") on node \"crc\" DevicePath \"\""
Jan 28 21:02:03 crc kubenswrapper[4746]: I0128 21:02:03.602313 4746 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/a812f7c0-8a46-4727-9944-7266f70244bd-combined-ca-bundle\") on node \"crc\" DevicePath \"\""
Jan 28 21:02:03 crc kubenswrapper[4746]: I0128 21:02:03.609879 4746 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/66366de9-a831-472a-a11f-f749b76d2007-kube-api-access-xkqck" (OuterVolumeSpecName: "kube-api-access-xkqck") pod "66366de9-a831-472a-a11f-f749b76d2007" (UID: "66366de9-a831-472a-a11f-f749b76d2007"). InnerVolumeSpecName "kube-api-access-xkqck". PluginName "kubernetes.io/projected", VolumeGidValue ""
Jan 28 21:02:03 crc kubenswrapper[4746]: I0128 21:02:03.651352 4746 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/66366de9-a831-472a-a11f-f749b76d2007-config-data" (OuterVolumeSpecName: "config-data") pod "66366de9-a831-472a-a11f-f749b76d2007" (UID: "66366de9-a831-472a-a11f-f749b76d2007"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue ""
Jan 28 21:02:03 crc kubenswrapper[4746]: I0128 21:02:03.652212 4746 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/66366de9-a831-472a-a11f-f749b76d2007-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "66366de9-a831-472a-a11f-f749b76d2007" (UID: "66366de9-a831-472a-a11f-f749b76d2007"). InnerVolumeSpecName "combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 28 21:02:03 crc kubenswrapper[4746]: I0128 21:02:03.704051 4746 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/66366de9-a831-472a-a11f-f749b76d2007-config-data\") on node \"crc\" DevicePath \"\"" Jan 28 21:02:03 crc kubenswrapper[4746]: I0128 21:02:03.704117 4746 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-xkqck\" (UniqueName: \"kubernetes.io/projected/66366de9-a831-472a-a11f-f749b76d2007-kube-api-access-xkqck\") on node \"crc\" DevicePath \"\"" Jan 28 21:02:03 crc kubenswrapper[4746]: I0128 21:02:03.704134 4746 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/66366de9-a831-472a-a11f-f749b76d2007-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Jan 28 21:02:03 crc kubenswrapper[4746]: I0128 21:02:03.801932 4746 generic.go:334] "Generic (PLEG): container finished" podID="487dd560-95f2-45e5-b09b-f8d3aae5548a" containerID="02357381c64547dca72c19b1c8d373c82811515aadec52b3bfa42aa59d1bc3c3" exitCode=0 Jan 28 21:02:03 crc kubenswrapper[4746]: I0128 21:02:03.803165 4746 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-conductor-db-sync-8xq2h" event={"ID":"487dd560-95f2-45e5-b09b-f8d3aae5548a","Type":"ContainerDied","Data":"02357381c64547dca72c19b1c8d373c82811515aadec52b3bfa42aa59d1bc3c3"} Jan 28 21:02:03 crc kubenswrapper[4746]: I0128 21:02:03.806103 4746 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-scheduler-0" event={"ID":"a812f7c0-8a46-4727-9944-7266f70244bd","Type":"ContainerDied","Data":"cc7be7ff3f15cfc9a933e1d3f743ef4c92439ebbaa8d9c8491500a3d000cc0f1"} Jan 28 21:02:03 crc kubenswrapper[4746]: I0128 21:02:03.806236 4746 scope.go:117] "RemoveContainer" containerID="446c2979b187ee17480edf3f2ff55a150d3fb08121eca5c0761921ce2fe79b92" Jan 28 21:02:03 crc kubenswrapper[4746]: I0128 21:02:03.806446 4746 util.go:48] 
"No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-scheduler-0" Jan 28 21:02:03 crc kubenswrapper[4746]: I0128 21:02:03.818389 4746 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-novncproxy-0" event={"ID":"66366de9-a831-472a-a11f-f749b76d2007","Type":"ContainerDied","Data":"b9d3b224f8157eda177a5959e9aa575569ccbe5b5039314e32f5adef0cd73dca"} Jan 28 21:02:03 crc kubenswrapper[4746]: I0128 21:02:03.818463 4746 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell1-novncproxy-0" Jan 28 21:02:03 crc kubenswrapper[4746]: I0128 21:02:03.851519 4746 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-scheduler-0"] Jan 28 21:02:03 crc kubenswrapper[4746]: I0128 21:02:03.863265 4746 scope.go:117] "RemoveContainer" containerID="3a8783764d91750ca85f5feeb2514d839d6de31cf5bf63a7a028debec1f34e57" Jan 28 21:02:03 crc kubenswrapper[4746]: I0128 21:02:03.875958 4746 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-scheduler-0"] Jan 28 21:02:03 crc kubenswrapper[4746]: I0128 21:02:03.906271 4746 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-cell1-novncproxy-0"] Jan 28 21:02:03 crc kubenswrapper[4746]: I0128 21:02:03.925417 4746 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-cell1-novncproxy-0"] Jan 28 21:02:03 crc kubenswrapper[4746]: I0128 21:02:03.930749 4746 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-scheduler-0"] Jan 28 21:02:03 crc kubenswrapper[4746]: E0128 21:02:03.931239 4746 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="8bc2ca65-f7aa-42c6-ac4f-accc7a46a0c7" containerName="init" Jan 28 21:02:03 crc kubenswrapper[4746]: I0128 21:02:03.931257 4746 state_mem.go:107] "Deleted CPUSet assignment" podUID="8bc2ca65-f7aa-42c6-ac4f-accc7a46a0c7" containerName="init" Jan 28 21:02:03 crc kubenswrapper[4746]: E0128 21:02:03.931268 4746 
cpu_manager.go:410] "RemoveStaleState: removing container" podUID="a812f7c0-8a46-4727-9944-7266f70244bd" containerName="nova-scheduler-scheduler" Jan 28 21:02:03 crc kubenswrapper[4746]: I0128 21:02:03.931274 4746 state_mem.go:107] "Deleted CPUSet assignment" podUID="a812f7c0-8a46-4727-9944-7266f70244bd" containerName="nova-scheduler-scheduler" Jan 28 21:02:03 crc kubenswrapper[4746]: E0128 21:02:03.931291 4746 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="66366de9-a831-472a-a11f-f749b76d2007" containerName="nova-cell1-novncproxy-novncproxy" Jan 28 21:02:03 crc kubenswrapper[4746]: I0128 21:02:03.931297 4746 state_mem.go:107] "Deleted CPUSet assignment" podUID="66366de9-a831-472a-a11f-f749b76d2007" containerName="nova-cell1-novncproxy-novncproxy" Jan 28 21:02:03 crc kubenswrapper[4746]: E0128 21:02:03.931325 4746 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="8b92f820-9bba-4102-b4ee-1c541c3a05d7" containerName="nova-manage" Jan 28 21:02:03 crc kubenswrapper[4746]: I0128 21:02:03.931331 4746 state_mem.go:107] "Deleted CPUSet assignment" podUID="8b92f820-9bba-4102-b4ee-1c541c3a05d7" containerName="nova-manage" Jan 28 21:02:03 crc kubenswrapper[4746]: E0128 21:02:03.931339 4746 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="8bc2ca65-f7aa-42c6-ac4f-accc7a46a0c7" containerName="dnsmasq-dns" Jan 28 21:02:03 crc kubenswrapper[4746]: I0128 21:02:03.931344 4746 state_mem.go:107] "Deleted CPUSet assignment" podUID="8bc2ca65-f7aa-42c6-ac4f-accc7a46a0c7" containerName="dnsmasq-dns" Jan 28 21:02:03 crc kubenswrapper[4746]: I0128 21:02:03.931525 4746 memory_manager.go:354] "RemoveStaleState removing state" podUID="8bc2ca65-f7aa-42c6-ac4f-accc7a46a0c7" containerName="dnsmasq-dns" Jan 28 21:02:03 crc kubenswrapper[4746]: I0128 21:02:03.931546 4746 memory_manager.go:354] "RemoveStaleState removing state" podUID="8b92f820-9bba-4102-b4ee-1c541c3a05d7" containerName="nova-manage" Jan 28 21:02:03 crc kubenswrapper[4746]: 
I0128 21:02:03.931559 4746 memory_manager.go:354] "RemoveStaleState removing state" podUID="a812f7c0-8a46-4727-9944-7266f70244bd" containerName="nova-scheduler-scheduler" Jan 28 21:02:03 crc kubenswrapper[4746]: I0128 21:02:03.931568 4746 memory_manager.go:354] "RemoveStaleState removing state" podUID="66366de9-a831-472a-a11f-f749b76d2007" containerName="nova-cell1-novncproxy-novncproxy" Jan 28 21:02:03 crc kubenswrapper[4746]: I0128 21:02:03.932279 4746 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-scheduler-0" Jan 28 21:02:03 crc kubenswrapper[4746]: I0128 21:02:03.934225 4746 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-scheduler-config-data" Jan 28 21:02:03 crc kubenswrapper[4746]: I0128 21:02:03.939225 4746 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-scheduler-0"] Jan 28 21:02:03 crc kubenswrapper[4746]: I0128 21:02:03.950210 4746 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-cell1-novncproxy-0"] Jan 28 21:02:03 crc kubenswrapper[4746]: I0128 21:02:03.960503 4746 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell1-novncproxy-0"] Jan 28 21:02:03 crc kubenswrapper[4746]: I0128 21:02:03.960611 4746 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell1-novncproxy-0" Jan 28 21:02:03 crc kubenswrapper[4746]: I0128 21:02:03.963011 4746 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-nova-novncproxy-cell1-public-svc" Jan 28 21:02:03 crc kubenswrapper[4746]: I0128 21:02:03.963089 4746 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-cell1-novncproxy-config-data" Jan 28 21:02:03 crc kubenswrapper[4746]: I0128 21:02:03.963390 4746 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-nova-novncproxy-cell1-vencrypt" Jan 28 21:02:04 crc kubenswrapper[4746]: I0128 21:02:04.020032 4746 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-t8p69\" (UniqueName: \"kubernetes.io/projected/44b99e27-4d9c-4167-a68d-ceb2e627bb95-kube-api-access-t8p69\") pod \"nova-scheduler-0\" (UID: \"44b99e27-4d9c-4167-a68d-ceb2e627bb95\") " pod="openstack/nova-scheduler-0" Jan 28 21:02:04 crc kubenswrapper[4746]: I0128 21:02:04.020119 4746 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"nova-novncproxy-tls-certs\" (UniqueName: \"kubernetes.io/secret/3466ae6e-8f00-4a2c-896e-cf1268924542-nova-novncproxy-tls-certs\") pod \"nova-cell1-novncproxy-0\" (UID: \"3466ae6e-8f00-4a2c-896e-cf1268924542\") " pod="openstack/nova-cell1-novncproxy-0" Jan 28 21:02:04 crc kubenswrapper[4746]: I0128 21:02:04.020351 4746 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"vencrypt-tls-certs\" (UniqueName: \"kubernetes.io/secret/3466ae6e-8f00-4a2c-896e-cf1268924542-vencrypt-tls-certs\") pod \"nova-cell1-novncproxy-0\" (UID: \"3466ae6e-8f00-4a2c-896e-cf1268924542\") " pod="openstack/nova-cell1-novncproxy-0" Jan 28 21:02:04 crc kubenswrapper[4746]: I0128 21:02:04.020797 4746 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume 
\"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/3466ae6e-8f00-4a2c-896e-cf1268924542-combined-ca-bundle\") pod \"nova-cell1-novncproxy-0\" (UID: \"3466ae6e-8f00-4a2c-896e-cf1268924542\") " pod="openstack/nova-cell1-novncproxy-0" Jan 28 21:02:04 crc kubenswrapper[4746]: I0128 21:02:04.020910 4746 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/3466ae6e-8f00-4a2c-896e-cf1268924542-config-data\") pod \"nova-cell1-novncproxy-0\" (UID: \"3466ae6e-8f00-4a2c-896e-cf1268924542\") " pod="openstack/nova-cell1-novncproxy-0" Jan 28 21:02:04 crc kubenswrapper[4746]: I0128 21:02:04.021614 4746 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/44b99e27-4d9c-4167-a68d-ceb2e627bb95-config-data\") pod \"nova-scheduler-0\" (UID: \"44b99e27-4d9c-4167-a68d-ceb2e627bb95\") " pod="openstack/nova-scheduler-0" Jan 28 21:02:04 crc kubenswrapper[4746]: I0128 21:02:04.021681 4746 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-m8d4z\" (UniqueName: \"kubernetes.io/projected/3466ae6e-8f00-4a2c-896e-cf1268924542-kube-api-access-m8d4z\") pod \"nova-cell1-novncproxy-0\" (UID: \"3466ae6e-8f00-4a2c-896e-cf1268924542\") " pod="openstack/nova-cell1-novncproxy-0" Jan 28 21:02:04 crc kubenswrapper[4746]: I0128 21:02:04.022003 4746 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/44b99e27-4d9c-4167-a68d-ceb2e627bb95-combined-ca-bundle\") pod \"nova-scheduler-0\" (UID: \"44b99e27-4d9c-4167-a68d-ceb2e627bb95\") " pod="openstack/nova-scheduler-0" Jan 28 21:02:04 crc kubenswrapper[4746]: I0128 21:02:04.124051 4746 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: 
\"kubernetes.io/secret/44b99e27-4d9c-4167-a68d-ceb2e627bb95-combined-ca-bundle\") pod \"nova-scheduler-0\" (UID: \"44b99e27-4d9c-4167-a68d-ceb2e627bb95\") " pod="openstack/nova-scheduler-0" Jan 28 21:02:04 crc kubenswrapper[4746]: I0128 21:02:04.124188 4746 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-t8p69\" (UniqueName: \"kubernetes.io/projected/44b99e27-4d9c-4167-a68d-ceb2e627bb95-kube-api-access-t8p69\") pod \"nova-scheduler-0\" (UID: \"44b99e27-4d9c-4167-a68d-ceb2e627bb95\") " pod="openstack/nova-scheduler-0" Jan 28 21:02:04 crc kubenswrapper[4746]: I0128 21:02:04.124210 4746 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nova-novncproxy-tls-certs\" (UniqueName: \"kubernetes.io/secret/3466ae6e-8f00-4a2c-896e-cf1268924542-nova-novncproxy-tls-certs\") pod \"nova-cell1-novncproxy-0\" (UID: \"3466ae6e-8f00-4a2c-896e-cf1268924542\") " pod="openstack/nova-cell1-novncproxy-0" Jan 28 21:02:04 crc kubenswrapper[4746]: I0128 21:02:04.124233 4746 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"vencrypt-tls-certs\" (UniqueName: \"kubernetes.io/secret/3466ae6e-8f00-4a2c-896e-cf1268924542-vencrypt-tls-certs\") pod \"nova-cell1-novncproxy-0\" (UID: \"3466ae6e-8f00-4a2c-896e-cf1268924542\") " pod="openstack/nova-cell1-novncproxy-0" Jan 28 21:02:04 crc kubenswrapper[4746]: I0128 21:02:04.124306 4746 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/3466ae6e-8f00-4a2c-896e-cf1268924542-combined-ca-bundle\") pod \"nova-cell1-novncproxy-0\" (UID: \"3466ae6e-8f00-4a2c-896e-cf1268924542\") " pod="openstack/nova-cell1-novncproxy-0" Jan 28 21:02:04 crc kubenswrapper[4746]: I0128 21:02:04.124325 4746 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/3466ae6e-8f00-4a2c-896e-cf1268924542-config-data\") pod 
\"nova-cell1-novncproxy-0\" (UID: \"3466ae6e-8f00-4a2c-896e-cf1268924542\") " pod="openstack/nova-cell1-novncproxy-0" Jan 28 21:02:04 crc kubenswrapper[4746]: I0128 21:02:04.124359 4746 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/44b99e27-4d9c-4167-a68d-ceb2e627bb95-config-data\") pod \"nova-scheduler-0\" (UID: \"44b99e27-4d9c-4167-a68d-ceb2e627bb95\") " pod="openstack/nova-scheduler-0" Jan 28 21:02:04 crc kubenswrapper[4746]: I0128 21:02:04.124381 4746 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-m8d4z\" (UniqueName: \"kubernetes.io/projected/3466ae6e-8f00-4a2c-896e-cf1268924542-kube-api-access-m8d4z\") pod \"nova-cell1-novncproxy-0\" (UID: \"3466ae6e-8f00-4a2c-896e-cf1268924542\") " pod="openstack/nova-cell1-novncproxy-0" Jan 28 21:02:04 crc kubenswrapper[4746]: I0128 21:02:04.131243 4746 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/3466ae6e-8f00-4a2c-896e-cf1268924542-config-data\") pod \"nova-cell1-novncproxy-0\" (UID: \"3466ae6e-8f00-4a2c-896e-cf1268924542\") " pod="openstack/nova-cell1-novncproxy-0" Jan 28 21:02:04 crc kubenswrapper[4746]: I0128 21:02:04.131513 4746 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/3466ae6e-8f00-4a2c-896e-cf1268924542-combined-ca-bundle\") pod \"nova-cell1-novncproxy-0\" (UID: \"3466ae6e-8f00-4a2c-896e-cf1268924542\") " pod="openstack/nova-cell1-novncproxy-0" Jan 28 21:02:04 crc kubenswrapper[4746]: I0128 21:02:04.132342 4746 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"vencrypt-tls-certs\" (UniqueName: \"kubernetes.io/secret/3466ae6e-8f00-4a2c-896e-cf1268924542-vencrypt-tls-certs\") pod \"nova-cell1-novncproxy-0\" (UID: \"3466ae6e-8f00-4a2c-896e-cf1268924542\") " pod="openstack/nova-cell1-novncproxy-0" Jan 28 21:02:04 
crc kubenswrapper[4746]: I0128 21:02:04.132551 4746 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"nova-novncproxy-tls-certs\" (UniqueName: \"kubernetes.io/secret/3466ae6e-8f00-4a2c-896e-cf1268924542-nova-novncproxy-tls-certs\") pod \"nova-cell1-novncproxy-0\" (UID: \"3466ae6e-8f00-4a2c-896e-cf1268924542\") " pod="openstack/nova-cell1-novncproxy-0" Jan 28 21:02:04 crc kubenswrapper[4746]: I0128 21:02:04.132964 4746 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/44b99e27-4d9c-4167-a68d-ceb2e627bb95-config-data\") pod \"nova-scheduler-0\" (UID: \"44b99e27-4d9c-4167-a68d-ceb2e627bb95\") " pod="openstack/nova-scheduler-0" Jan 28 21:02:04 crc kubenswrapper[4746]: I0128 21:02:04.134487 4746 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/44b99e27-4d9c-4167-a68d-ceb2e627bb95-combined-ca-bundle\") pod \"nova-scheduler-0\" (UID: \"44b99e27-4d9c-4167-a68d-ceb2e627bb95\") " pod="openstack/nova-scheduler-0" Jan 28 21:02:04 crc kubenswrapper[4746]: I0128 21:02:04.146671 4746 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-m8d4z\" (UniqueName: \"kubernetes.io/projected/3466ae6e-8f00-4a2c-896e-cf1268924542-kube-api-access-m8d4z\") pod \"nova-cell1-novncproxy-0\" (UID: \"3466ae6e-8f00-4a2c-896e-cf1268924542\") " pod="openstack/nova-cell1-novncproxy-0" Jan 28 21:02:04 crc kubenswrapper[4746]: I0128 21:02:04.150489 4746 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-t8p69\" (UniqueName: \"kubernetes.io/projected/44b99e27-4d9c-4167-a68d-ceb2e627bb95-kube-api-access-t8p69\") pod \"nova-scheduler-0\" (UID: \"44b99e27-4d9c-4167-a68d-ceb2e627bb95\") " pod="openstack/nova-scheduler-0" Jan 28 21:02:04 crc kubenswrapper[4746]: I0128 21:02:04.255751 4746 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-scheduler-0" Jan 28 21:02:04 crc kubenswrapper[4746]: I0128 21:02:04.295679 4746 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell1-novncproxy-0" Jan 28 21:02:04 crc kubenswrapper[4746]: I0128 21:02:04.802972 4746 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-scheduler-0"] Jan 28 21:02:04 crc kubenswrapper[4746]: I0128 21:02:04.834451 4746 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-scheduler-0" event={"ID":"44b99e27-4d9c-4167-a68d-ceb2e627bb95","Type":"ContainerStarted","Data":"61a6255ff980906e8ae735560db7d128e432f9d7cbcccf8e00d35c1b016fe812"} Jan 28 21:02:04 crc kubenswrapper[4746]: I0128 21:02:04.854283 4746 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="66366de9-a831-472a-a11f-f749b76d2007" path="/var/lib/kubelet/pods/66366de9-a831-472a-a11f-f749b76d2007/volumes" Jan 28 21:02:04 crc kubenswrapper[4746]: I0128 21:02:04.855139 4746 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="a812f7c0-8a46-4727-9944-7266f70244bd" path="/var/lib/kubelet/pods/a812f7c0-8a46-4727-9944-7266f70244bd/volumes" Jan 28 21:02:04 crc kubenswrapper[4746]: W0128 21:02:04.911799 4746 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod3466ae6e_8f00_4a2c_896e_cf1268924542.slice/crio-42db9c222d38a562e137c1944bb48754dd199fe4778dca143548d825949fa566 WatchSource:0}: Error finding container 42db9c222d38a562e137c1944bb48754dd199fe4778dca143548d825949fa566: Status 404 returned error can't find the container with id 42db9c222d38a562e137c1944bb48754dd199fe4778dca143548d825949fa566 Jan 28 21:02:04 crc kubenswrapper[4746]: I0128 21:02:04.912850 4746 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell1-novncproxy-0"] Jan 28 21:02:05 crc kubenswrapper[4746]: I0128 21:02:05.291251 4746 util.go:48] "No ready sandbox for pod 
can be found. Need to start a new one" pod="openstack/nova-cell1-conductor-db-sync-8xq2h" Jan 28 21:02:05 crc kubenswrapper[4746]: I0128 21:02:05.360871 4746 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/487dd560-95f2-45e5-b09b-f8d3aae5548a-scripts\") pod \"487dd560-95f2-45e5-b09b-f8d3aae5548a\" (UID: \"487dd560-95f2-45e5-b09b-f8d3aae5548a\") " Jan 28 21:02:05 crc kubenswrapper[4746]: I0128 21:02:05.360945 4746 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/487dd560-95f2-45e5-b09b-f8d3aae5548a-config-data\") pod \"487dd560-95f2-45e5-b09b-f8d3aae5548a\" (UID: \"487dd560-95f2-45e5-b09b-f8d3aae5548a\") " Jan 28 21:02:05 crc kubenswrapper[4746]: I0128 21:02:05.361142 4746 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/487dd560-95f2-45e5-b09b-f8d3aae5548a-combined-ca-bundle\") pod \"487dd560-95f2-45e5-b09b-f8d3aae5548a\" (UID: \"487dd560-95f2-45e5-b09b-f8d3aae5548a\") " Jan 28 21:02:05 crc kubenswrapper[4746]: I0128 21:02:05.361213 4746 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-mrg7b\" (UniqueName: \"kubernetes.io/projected/487dd560-95f2-45e5-b09b-f8d3aae5548a-kube-api-access-mrg7b\") pod \"487dd560-95f2-45e5-b09b-f8d3aae5548a\" (UID: \"487dd560-95f2-45e5-b09b-f8d3aae5548a\") " Jan 28 21:02:05 crc kubenswrapper[4746]: I0128 21:02:05.372729 4746 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/487dd560-95f2-45e5-b09b-f8d3aae5548a-kube-api-access-mrg7b" (OuterVolumeSpecName: "kube-api-access-mrg7b") pod "487dd560-95f2-45e5-b09b-f8d3aae5548a" (UID: "487dd560-95f2-45e5-b09b-f8d3aae5548a"). InnerVolumeSpecName "kube-api-access-mrg7b". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 28 21:02:05 crc kubenswrapper[4746]: I0128 21:02:05.374236 4746 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/487dd560-95f2-45e5-b09b-f8d3aae5548a-scripts" (OuterVolumeSpecName: "scripts") pod "487dd560-95f2-45e5-b09b-f8d3aae5548a" (UID: "487dd560-95f2-45e5-b09b-f8d3aae5548a"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 28 21:02:05 crc kubenswrapper[4746]: I0128 21:02:05.409196 4746 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/487dd560-95f2-45e5-b09b-f8d3aae5548a-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "487dd560-95f2-45e5-b09b-f8d3aae5548a" (UID: "487dd560-95f2-45e5-b09b-f8d3aae5548a"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 28 21:02:05 crc kubenswrapper[4746]: I0128 21:02:05.425234 4746 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/487dd560-95f2-45e5-b09b-f8d3aae5548a-config-data" (OuterVolumeSpecName: "config-data") pod "487dd560-95f2-45e5-b09b-f8d3aae5548a" (UID: "487dd560-95f2-45e5-b09b-f8d3aae5548a"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 28 21:02:05 crc kubenswrapper[4746]: I0128 21:02:05.463303 4746 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/487dd560-95f2-45e5-b09b-f8d3aae5548a-scripts\") on node \"crc\" DevicePath \"\"" Jan 28 21:02:05 crc kubenswrapper[4746]: I0128 21:02:05.463337 4746 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/487dd560-95f2-45e5-b09b-f8d3aae5548a-config-data\") on node \"crc\" DevicePath \"\"" Jan 28 21:02:05 crc kubenswrapper[4746]: I0128 21:02:05.463348 4746 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/487dd560-95f2-45e5-b09b-f8d3aae5548a-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Jan 28 21:02:05 crc kubenswrapper[4746]: I0128 21:02:05.463359 4746 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-mrg7b\" (UniqueName: \"kubernetes.io/projected/487dd560-95f2-45e5-b09b-f8d3aae5548a-kube-api-access-mrg7b\") on node \"crc\" DevicePath \"\"" Jan 28 21:02:05 crc kubenswrapper[4746]: I0128 21:02:05.845942 4746 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-novncproxy-0" event={"ID":"3466ae6e-8f00-4a2c-896e-cf1268924542","Type":"ContainerStarted","Data":"4d4c19996b95376795d511949a92efcf8adfe127bb244b65f867ee1c5c0e515c"} Jan 28 21:02:05 crc kubenswrapper[4746]: I0128 21:02:05.846287 4746 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-novncproxy-0" event={"ID":"3466ae6e-8f00-4a2c-896e-cf1268924542","Type":"ContainerStarted","Data":"42db9c222d38a562e137c1944bb48754dd199fe4778dca143548d825949fa566"} Jan 28 21:02:05 crc kubenswrapper[4746]: I0128 21:02:05.850110 4746 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell1-conductor-db-sync-8xq2h" Jan 28 21:02:05 crc kubenswrapper[4746]: I0128 21:02:05.850098 4746 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-conductor-db-sync-8xq2h" event={"ID":"487dd560-95f2-45e5-b09b-f8d3aae5548a","Type":"ContainerDied","Data":"02ee997cba73463e991795ba89bbba276487cadfcd5ab2b765f77045cb2ca9c8"} Jan 28 21:02:05 crc kubenswrapper[4746]: I0128 21:02:05.850219 4746 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="02ee997cba73463e991795ba89bbba276487cadfcd5ab2b765f77045cb2ca9c8" Jan 28 21:02:05 crc kubenswrapper[4746]: I0128 21:02:05.853637 4746 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-scheduler-0" event={"ID":"44b99e27-4d9c-4167-a68d-ceb2e627bb95","Type":"ContainerStarted","Data":"c75bf35b9356d72388cfe4007ab257f8d183a8bcfb3197200c57896767a7dbdf"} Jan 28 21:02:05 crc kubenswrapper[4746]: I0128 21:02:05.888990 4746 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-cell1-novncproxy-0" podStartSLOduration=2.888968719 podStartE2EDuration="2.888968719s" podCreationTimestamp="2026-01-28 21:02:03 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-28 21:02:05.86670141 +0000 UTC m=+1353.822887764" watchObservedRunningTime="2026-01-28 21:02:05.888968719 +0000 UTC m=+1353.845155073" Jan 28 21:02:05 crc kubenswrapper[4746]: I0128 21:02:05.903649 4746 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-cell1-conductor-0"] Jan 28 21:02:05 crc kubenswrapper[4746]: E0128 21:02:05.904142 4746 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="487dd560-95f2-45e5-b09b-f8d3aae5548a" containerName="nova-cell1-conductor-db-sync" Jan 28 21:02:05 crc kubenswrapper[4746]: I0128 21:02:05.904162 4746 state_mem.go:107] "Deleted CPUSet assignment" 
podUID="487dd560-95f2-45e5-b09b-f8d3aae5548a" containerName="nova-cell1-conductor-db-sync" Jan 28 21:02:05 crc kubenswrapper[4746]: I0128 21:02:05.904432 4746 memory_manager.go:354] "RemoveStaleState removing state" podUID="487dd560-95f2-45e5-b09b-f8d3aae5548a" containerName="nova-cell1-conductor-db-sync" Jan 28 21:02:05 crc kubenswrapper[4746]: I0128 21:02:05.905286 4746 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell1-conductor-0" Jan 28 21:02:05 crc kubenswrapper[4746]: I0128 21:02:05.909384 4746 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-cell1-conductor-config-data" Jan 28 21:02:05 crc kubenswrapper[4746]: I0128 21:02:05.920677 4746 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-scheduler-0" podStartSLOduration=2.9206506709999998 podStartE2EDuration="2.920650671s" podCreationTimestamp="2026-01-28 21:02:03 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-28 21:02:05.897680123 +0000 UTC m=+1353.853866477" watchObservedRunningTime="2026-01-28 21:02:05.920650671 +0000 UTC m=+1353.876837035" Jan 28 21:02:05 crc kubenswrapper[4746]: I0128 21:02:05.947126 4746 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell1-conductor-0"] Jan 28 21:02:05 crc kubenswrapper[4746]: I0128 21:02:05.972029 4746 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/b1d56ad0-5928-4211-a272-59aaab5e538b-combined-ca-bundle\") pod \"nova-cell1-conductor-0\" (UID: \"b1d56ad0-5928-4211-a272-59aaab5e538b\") " pod="openstack/nova-cell1-conductor-0" Jan 28 21:02:05 crc kubenswrapper[4746]: I0128 21:02:05.972285 4746 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-dzdwv\" (UniqueName: 
\"kubernetes.io/projected/b1d56ad0-5928-4211-a272-59aaab5e538b-kube-api-access-dzdwv\") pod \"nova-cell1-conductor-0\" (UID: \"b1d56ad0-5928-4211-a272-59aaab5e538b\") " pod="openstack/nova-cell1-conductor-0" Jan 28 21:02:05 crc kubenswrapper[4746]: I0128 21:02:05.972326 4746 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/b1d56ad0-5928-4211-a272-59aaab5e538b-config-data\") pod \"nova-cell1-conductor-0\" (UID: \"b1d56ad0-5928-4211-a272-59aaab5e538b\") " pod="openstack/nova-cell1-conductor-0" Jan 28 21:02:06 crc kubenswrapper[4746]: I0128 21:02:06.073566 4746 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-dzdwv\" (UniqueName: \"kubernetes.io/projected/b1d56ad0-5928-4211-a272-59aaab5e538b-kube-api-access-dzdwv\") pod \"nova-cell1-conductor-0\" (UID: \"b1d56ad0-5928-4211-a272-59aaab5e538b\") " pod="openstack/nova-cell1-conductor-0" Jan 28 21:02:06 crc kubenswrapper[4746]: I0128 21:02:06.073618 4746 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/b1d56ad0-5928-4211-a272-59aaab5e538b-config-data\") pod \"nova-cell1-conductor-0\" (UID: \"b1d56ad0-5928-4211-a272-59aaab5e538b\") " pod="openstack/nova-cell1-conductor-0" Jan 28 21:02:06 crc kubenswrapper[4746]: I0128 21:02:06.073673 4746 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/b1d56ad0-5928-4211-a272-59aaab5e538b-combined-ca-bundle\") pod \"nova-cell1-conductor-0\" (UID: \"b1d56ad0-5928-4211-a272-59aaab5e538b\") " pod="openstack/nova-cell1-conductor-0" Jan 28 21:02:06 crc kubenswrapper[4746]: I0128 21:02:06.077638 4746 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/b1d56ad0-5928-4211-a272-59aaab5e538b-config-data\") pod 
\"nova-cell1-conductor-0\" (UID: \"b1d56ad0-5928-4211-a272-59aaab5e538b\") " pod="openstack/nova-cell1-conductor-0" Jan 28 21:02:06 crc kubenswrapper[4746]: I0128 21:02:06.088810 4746 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/b1d56ad0-5928-4211-a272-59aaab5e538b-combined-ca-bundle\") pod \"nova-cell1-conductor-0\" (UID: \"b1d56ad0-5928-4211-a272-59aaab5e538b\") " pod="openstack/nova-cell1-conductor-0" Jan 28 21:02:06 crc kubenswrapper[4746]: I0128 21:02:06.134733 4746 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-dzdwv\" (UniqueName: \"kubernetes.io/projected/b1d56ad0-5928-4211-a272-59aaab5e538b-kube-api-access-dzdwv\") pod \"nova-cell1-conductor-0\" (UID: \"b1d56ad0-5928-4211-a272-59aaab5e538b\") " pod="openstack/nova-cell1-conductor-0" Jan 28 21:02:06 crc kubenswrapper[4746]: I0128 21:02:06.235918 4746 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell1-conductor-0" Jan 28 21:02:06 crc kubenswrapper[4746]: I0128 21:02:06.831856 4746 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell1-conductor-0"] Jan 28 21:02:06 crc kubenswrapper[4746]: I0128 21:02:06.896100 4746 generic.go:334] "Generic (PLEG): container finished" podID="6509be40-b5da-4c87-bf0d-8b6a75084e60" containerID="c5392fe62e250472df5cf8e4e0ffaed45046c8c1e507a59c0064fefb6d3c2229" exitCode=0 Jan 28 21:02:06 crc kubenswrapper[4746]: I0128 21:02:06.896204 4746 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"6509be40-b5da-4c87-bf0d-8b6a75084e60","Type":"ContainerDied","Data":"c5392fe62e250472df5cf8e4e0ffaed45046c8c1e507a59c0064fefb6d3c2229"} Jan 28 21:02:06 crc kubenswrapper[4746]: I0128 21:02:06.917397 4746 generic.go:334] "Generic (PLEG): container finished" podID="d76b1f13-0ce0-4706-a7d8-d70670e82b8f" 
containerID="bd57e2df9d9e364ec7582d491ba8a47746adfa26a358a8c959ff8ec34ca6095c" exitCode=0 Jan 28 21:02:06 crc kubenswrapper[4746]: I0128 21:02:06.918286 4746 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"d76b1f13-0ce0-4706-a7d8-d70670e82b8f","Type":"ContainerDied","Data":"bd57e2df9d9e364ec7582d491ba8a47746adfa26a358a8c959ff8ec34ca6095c"} Jan 28 21:02:07 crc kubenswrapper[4746]: I0128 21:02:07.144335 4746 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-api-0" Jan 28 21:02:07 crc kubenswrapper[4746]: I0128 21:02:07.236135 4746 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/6509be40-b5da-4c87-bf0d-8b6a75084e60-combined-ca-bundle\") pod \"6509be40-b5da-4c87-bf0d-8b6a75084e60\" (UID: \"6509be40-b5da-4c87-bf0d-8b6a75084e60\") " Jan 28 21:02:07 crc kubenswrapper[4746]: I0128 21:02:07.236337 4746 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-jjzsd\" (UniqueName: \"kubernetes.io/projected/6509be40-b5da-4c87-bf0d-8b6a75084e60-kube-api-access-jjzsd\") pod \"6509be40-b5da-4c87-bf0d-8b6a75084e60\" (UID: \"6509be40-b5da-4c87-bf0d-8b6a75084e60\") " Jan 28 21:02:07 crc kubenswrapper[4746]: I0128 21:02:07.236388 4746 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/6509be40-b5da-4c87-bf0d-8b6a75084e60-logs\") pod \"6509be40-b5da-4c87-bf0d-8b6a75084e60\" (UID: \"6509be40-b5da-4c87-bf0d-8b6a75084e60\") " Jan 28 21:02:07 crc kubenswrapper[4746]: I0128 21:02:07.236416 4746 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/6509be40-b5da-4c87-bf0d-8b6a75084e60-config-data\") pod \"6509be40-b5da-4c87-bf0d-8b6a75084e60\" (UID: \"6509be40-b5da-4c87-bf0d-8b6a75084e60\") " Jan 28 21:02:07 crc 
kubenswrapper[4746]: I0128 21:02:07.238412 4746 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/6509be40-b5da-4c87-bf0d-8b6a75084e60-logs" (OuterVolumeSpecName: "logs") pod "6509be40-b5da-4c87-bf0d-8b6a75084e60" (UID: "6509be40-b5da-4c87-bf0d-8b6a75084e60"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 28 21:02:07 crc kubenswrapper[4746]: I0128 21:02:07.242213 4746 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/6509be40-b5da-4c87-bf0d-8b6a75084e60-kube-api-access-jjzsd" (OuterVolumeSpecName: "kube-api-access-jjzsd") pod "6509be40-b5da-4c87-bf0d-8b6a75084e60" (UID: "6509be40-b5da-4c87-bf0d-8b6a75084e60"). InnerVolumeSpecName "kube-api-access-jjzsd". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 28 21:02:07 crc kubenswrapper[4746]: I0128 21:02:07.280958 4746 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/6509be40-b5da-4c87-bf0d-8b6a75084e60-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "6509be40-b5da-4c87-bf0d-8b6a75084e60" (UID: "6509be40-b5da-4c87-bf0d-8b6a75084e60"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 28 21:02:07 crc kubenswrapper[4746]: I0128 21:02:07.289984 4746 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/6509be40-b5da-4c87-bf0d-8b6a75084e60-config-data" (OuterVolumeSpecName: "config-data") pod "6509be40-b5da-4c87-bf0d-8b6a75084e60" (UID: "6509be40-b5da-4c87-bf0d-8b6a75084e60"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 28 21:02:07 crc kubenswrapper[4746]: I0128 21:02:07.330705 4746 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-metadata-0" Jan 28 21:02:07 crc kubenswrapper[4746]: I0128 21:02:07.338433 4746 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-jjzsd\" (UniqueName: \"kubernetes.io/projected/6509be40-b5da-4c87-bf0d-8b6a75084e60-kube-api-access-jjzsd\") on node \"crc\" DevicePath \"\"" Jan 28 21:02:07 crc kubenswrapper[4746]: I0128 21:02:07.338464 4746 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/6509be40-b5da-4c87-bf0d-8b6a75084e60-logs\") on node \"crc\" DevicePath \"\"" Jan 28 21:02:07 crc kubenswrapper[4746]: I0128 21:02:07.338473 4746 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/6509be40-b5da-4c87-bf0d-8b6a75084e60-config-data\") on node \"crc\" DevicePath \"\"" Jan 28 21:02:07 crc kubenswrapper[4746]: I0128 21:02:07.338482 4746 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/6509be40-b5da-4c87-bf0d-8b6a75084e60-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Jan 28 21:02:07 crc kubenswrapper[4746]: I0128 21:02:07.439250 4746 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d76b1f13-0ce0-4706-a7d8-d70670e82b8f-combined-ca-bundle\") pod \"d76b1f13-0ce0-4706-a7d8-d70670e82b8f\" (UID: \"d76b1f13-0ce0-4706-a7d8-d70670e82b8f\") " Jan 28 21:02:07 crc kubenswrapper[4746]: I0128 21:02:07.439413 4746 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/d76b1f13-0ce0-4706-a7d8-d70670e82b8f-logs\") pod \"d76b1f13-0ce0-4706-a7d8-d70670e82b8f\" (UID: \"d76b1f13-0ce0-4706-a7d8-d70670e82b8f\") " Jan 28 21:02:07 crc kubenswrapper[4746]: I0128 21:02:07.439514 4746 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: 
\"kubernetes.io/secret/d76b1f13-0ce0-4706-a7d8-d70670e82b8f-config-data\") pod \"d76b1f13-0ce0-4706-a7d8-d70670e82b8f\" (UID: \"d76b1f13-0ce0-4706-a7d8-d70670e82b8f\") " Jan 28 21:02:07 crc kubenswrapper[4746]: I0128 21:02:07.439596 4746 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-gjrgt\" (UniqueName: \"kubernetes.io/projected/d76b1f13-0ce0-4706-a7d8-d70670e82b8f-kube-api-access-gjrgt\") pod \"d76b1f13-0ce0-4706-a7d8-d70670e82b8f\" (UID: \"d76b1f13-0ce0-4706-a7d8-d70670e82b8f\") " Jan 28 21:02:07 crc kubenswrapper[4746]: I0128 21:02:07.440207 4746 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/d76b1f13-0ce0-4706-a7d8-d70670e82b8f-logs" (OuterVolumeSpecName: "logs") pod "d76b1f13-0ce0-4706-a7d8-d70670e82b8f" (UID: "d76b1f13-0ce0-4706-a7d8-d70670e82b8f"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 28 21:02:07 crc kubenswrapper[4746]: I0128 21:02:07.449381 4746 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/d76b1f13-0ce0-4706-a7d8-d70670e82b8f-kube-api-access-gjrgt" (OuterVolumeSpecName: "kube-api-access-gjrgt") pod "d76b1f13-0ce0-4706-a7d8-d70670e82b8f" (UID: "d76b1f13-0ce0-4706-a7d8-d70670e82b8f"). InnerVolumeSpecName "kube-api-access-gjrgt". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 28 21:02:07 crc kubenswrapper[4746]: I0128 21:02:07.473702 4746 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/d76b1f13-0ce0-4706-a7d8-d70670e82b8f-config-data" (OuterVolumeSpecName: "config-data") pod "d76b1f13-0ce0-4706-a7d8-d70670e82b8f" (UID: "d76b1f13-0ce0-4706-a7d8-d70670e82b8f"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 28 21:02:07 crc kubenswrapper[4746]: I0128 21:02:07.479811 4746 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/d76b1f13-0ce0-4706-a7d8-d70670e82b8f-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "d76b1f13-0ce0-4706-a7d8-d70670e82b8f" (UID: "d76b1f13-0ce0-4706-a7d8-d70670e82b8f"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 28 21:02:07 crc kubenswrapper[4746]: I0128 21:02:07.541481 4746 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-gjrgt\" (UniqueName: \"kubernetes.io/projected/d76b1f13-0ce0-4706-a7d8-d70670e82b8f-kube-api-access-gjrgt\") on node \"crc\" DevicePath \"\"" Jan 28 21:02:07 crc kubenswrapper[4746]: I0128 21:02:07.541717 4746 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d76b1f13-0ce0-4706-a7d8-d70670e82b8f-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Jan 28 21:02:07 crc kubenswrapper[4746]: I0128 21:02:07.541792 4746 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/d76b1f13-0ce0-4706-a7d8-d70670e82b8f-logs\") on node \"crc\" DevicePath \"\"" Jan 28 21:02:07 crc kubenswrapper[4746]: I0128 21:02:07.541887 4746 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/d76b1f13-0ce0-4706-a7d8-d70670e82b8f-config-data\") on node \"crc\" DevicePath \"\"" Jan 28 21:02:07 crc kubenswrapper[4746]: I0128 21:02:07.931392 4746 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"6509be40-b5da-4c87-bf0d-8b6a75084e60","Type":"ContainerDied","Data":"4096bf769d0dab06efa6153492ae287b346d786915ee35454cdbc4c4014731b3"} Jan 28 21:02:07 crc kubenswrapper[4746]: I0128 21:02:07.931678 4746 scope.go:117] "RemoveContainer" 
containerID="c5392fe62e250472df5cf8e4e0ffaed45046c8c1e507a59c0064fefb6d3c2229" Jan 28 21:02:07 crc kubenswrapper[4746]: I0128 21:02:07.931811 4746 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-api-0" Jan 28 21:02:07 crc kubenswrapper[4746]: I0128 21:02:07.940479 4746 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"d76b1f13-0ce0-4706-a7d8-d70670e82b8f","Type":"ContainerDied","Data":"0d2b6ab1f43894d7dc92dd5353c1cea8c017ead965222c013bd63d386f2a2399"} Jan 28 21:02:07 crc kubenswrapper[4746]: I0128 21:02:07.940561 4746 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-metadata-0" Jan 28 21:02:07 crc kubenswrapper[4746]: I0128 21:02:07.945782 4746 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-conductor-0" event={"ID":"b1d56ad0-5928-4211-a272-59aaab5e538b","Type":"ContainerStarted","Data":"540d4cd7a87903048a9a6e53b6e7b16f7d79e8b7d6ce8c901e9a11dd418bf943"} Jan 28 21:02:07 crc kubenswrapper[4746]: I0128 21:02:07.945835 4746 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-conductor-0" event={"ID":"b1d56ad0-5928-4211-a272-59aaab5e538b","Type":"ContainerStarted","Data":"bcf687c7a097f578387a759d6e981f93bb259015e43b230fa6a1e35815095ec2"} Jan 28 21:02:07 crc kubenswrapper[4746]: I0128 21:02:07.946296 4746 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-cell1-conductor-0" Jan 28 21:02:07 crc kubenswrapper[4746]: I0128 21:02:07.983378 4746 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-cell1-conductor-0" podStartSLOduration=2.983355054 podStartE2EDuration="2.983355054s" podCreationTimestamp="2026-01-28 21:02:05 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-28 21:02:07.967854637 +0000 UTC m=+1355.924041011" 
watchObservedRunningTime="2026-01-28 21:02:07.983355054 +0000 UTC m=+1355.939541408" Jan 28 21:02:07 crc kubenswrapper[4746]: I0128 21:02:07.993885 4746 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-api-0"] Jan 28 21:02:07 crc kubenswrapper[4746]: I0128 21:02:07.996614 4746 scope.go:117] "RemoveContainer" containerID="2d5c6aeb41f3c79d4826fc65a5bf26beed2e4f16a9fe3ac7fb01fd14427dec9b" Jan 28 21:02:08 crc kubenswrapper[4746]: I0128 21:02:08.004832 4746 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-api-0"] Jan 28 21:02:08 crc kubenswrapper[4746]: I0128 21:02:08.024315 4746 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-metadata-0"] Jan 28 21:02:08 crc kubenswrapper[4746]: I0128 21:02:08.036829 4746 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-api-0"] Jan 28 21:02:08 crc kubenswrapper[4746]: E0128 21:02:08.037328 4746 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="d76b1f13-0ce0-4706-a7d8-d70670e82b8f" containerName="nova-metadata-log" Jan 28 21:02:08 crc kubenswrapper[4746]: I0128 21:02:08.037347 4746 state_mem.go:107] "Deleted CPUSet assignment" podUID="d76b1f13-0ce0-4706-a7d8-d70670e82b8f" containerName="nova-metadata-log" Jan 28 21:02:08 crc kubenswrapper[4746]: E0128 21:02:08.037368 4746 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="6509be40-b5da-4c87-bf0d-8b6a75084e60" containerName="nova-api-api" Jan 28 21:02:08 crc kubenswrapper[4746]: I0128 21:02:08.037373 4746 state_mem.go:107] "Deleted CPUSet assignment" podUID="6509be40-b5da-4c87-bf0d-8b6a75084e60" containerName="nova-api-api" Jan 28 21:02:08 crc kubenswrapper[4746]: E0128 21:02:08.037400 4746 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="6509be40-b5da-4c87-bf0d-8b6a75084e60" containerName="nova-api-log" Jan 28 21:02:08 crc kubenswrapper[4746]: I0128 21:02:08.037406 4746 state_mem.go:107] "Deleted CPUSet assignment" 
podUID="6509be40-b5da-4c87-bf0d-8b6a75084e60" containerName="nova-api-log" Jan 28 21:02:08 crc kubenswrapper[4746]: E0128 21:02:08.037418 4746 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="d76b1f13-0ce0-4706-a7d8-d70670e82b8f" containerName="nova-metadata-metadata" Jan 28 21:02:08 crc kubenswrapper[4746]: I0128 21:02:08.037423 4746 state_mem.go:107] "Deleted CPUSet assignment" podUID="d76b1f13-0ce0-4706-a7d8-d70670e82b8f" containerName="nova-metadata-metadata" Jan 28 21:02:08 crc kubenswrapper[4746]: I0128 21:02:08.037601 4746 memory_manager.go:354] "RemoveStaleState removing state" podUID="6509be40-b5da-4c87-bf0d-8b6a75084e60" containerName="nova-api-log" Jan 28 21:02:08 crc kubenswrapper[4746]: I0128 21:02:08.037610 4746 memory_manager.go:354] "RemoveStaleState removing state" podUID="d76b1f13-0ce0-4706-a7d8-d70670e82b8f" containerName="nova-metadata-metadata" Jan 28 21:02:08 crc kubenswrapper[4746]: I0128 21:02:08.037632 4746 memory_manager.go:354] "RemoveStaleState removing state" podUID="6509be40-b5da-4c87-bf0d-8b6a75084e60" containerName="nova-api-api" Jan 28 21:02:08 crc kubenswrapper[4746]: I0128 21:02:08.037642 4746 memory_manager.go:354] "RemoveStaleState removing state" podUID="d76b1f13-0ce0-4706-a7d8-d70670e82b8f" containerName="nova-metadata-log" Jan 28 21:02:08 crc kubenswrapper[4746]: I0128 21:02:08.038722 4746 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-api-0" Jan 28 21:02:08 crc kubenswrapper[4746]: I0128 21:02:08.044417 4746 scope.go:117] "RemoveContainer" containerID="bd57e2df9d9e364ec7582d491ba8a47746adfa26a358a8c959ff8ec34ca6095c" Jan 28 21:02:08 crc kubenswrapper[4746]: I0128 21:02:08.045902 4746 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-api-config-data" Jan 28 21:02:08 crc kubenswrapper[4746]: I0128 21:02:08.055146 4746 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-metadata-0"] Jan 28 21:02:08 crc kubenswrapper[4746]: I0128 21:02:08.072037 4746 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-api-0"] Jan 28 21:02:08 crc kubenswrapper[4746]: I0128 21:02:08.084328 4746 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-metadata-0"] Jan 28 21:02:08 crc kubenswrapper[4746]: I0128 21:02:08.086722 4746 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-metadata-0" Jan 28 21:02:08 crc kubenswrapper[4746]: I0128 21:02:08.092509 4746 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-metadata-config-data" Jan 28 21:02:08 crc kubenswrapper[4746]: I0128 21:02:08.092667 4746 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-nova-metadata-internal-svc" Jan 28 21:02:08 crc kubenswrapper[4746]: I0128 21:02:08.094197 4746 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-metadata-0"] Jan 28 21:02:08 crc kubenswrapper[4746]: I0128 21:02:08.121366 4746 scope.go:117] "RemoveContainer" containerID="27471481cbaf8842ff3ee5bbb4fc59b4b77b4df3e987a2f6bb7dd3cc7327c88a" Jan 28 21:02:08 crc kubenswrapper[4746]: I0128 21:02:08.158240 4746 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"nova-metadata-tls-certs\" (UniqueName: \"kubernetes.io/secret/9426df49-b969-4890-8ea2-d37f2312e5e3-nova-metadata-tls-certs\") pod 
\"nova-metadata-0\" (UID: \"9426df49-b969-4890-8ea2-d37f2312e5e3\") " pod="openstack/nova-metadata-0" Jan 28 21:02:08 crc kubenswrapper[4746]: I0128 21:02:08.158308 4746 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/9426df49-b969-4890-8ea2-d37f2312e5e3-config-data\") pod \"nova-metadata-0\" (UID: \"9426df49-b969-4890-8ea2-d37f2312e5e3\") " pod="openstack/nova-metadata-0" Jan 28 21:02:08 crc kubenswrapper[4746]: I0128 21:02:08.158327 4746 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/9426df49-b969-4890-8ea2-d37f2312e5e3-logs\") pod \"nova-metadata-0\" (UID: \"9426df49-b969-4890-8ea2-d37f2312e5e3\") " pod="openstack/nova-metadata-0" Jan 28 21:02:08 crc kubenswrapper[4746]: I0128 21:02:08.158349 4746 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-67g5z\" (UniqueName: \"kubernetes.io/projected/d3ee29df-412f-4e87-9ec2-c0431746e3a0-kube-api-access-67g5z\") pod \"nova-api-0\" (UID: \"d3ee29df-412f-4e87-9ec2-c0431746e3a0\") " pod="openstack/nova-api-0" Jan 28 21:02:08 crc kubenswrapper[4746]: I0128 21:02:08.158417 4746 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/9426df49-b969-4890-8ea2-d37f2312e5e3-combined-ca-bundle\") pod \"nova-metadata-0\" (UID: \"9426df49-b969-4890-8ea2-d37f2312e5e3\") " pod="openstack/nova-metadata-0" Jan 28 21:02:08 crc kubenswrapper[4746]: I0128 21:02:08.158447 4746 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/d3ee29df-412f-4e87-9ec2-c0431746e3a0-config-data\") pod \"nova-api-0\" (UID: \"d3ee29df-412f-4e87-9ec2-c0431746e3a0\") " pod="openstack/nova-api-0" Jan 28 21:02:08 crc 
kubenswrapper[4746]: I0128 21:02:08.158467 4746 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/d3ee29df-412f-4e87-9ec2-c0431746e3a0-logs\") pod \"nova-api-0\" (UID: \"d3ee29df-412f-4e87-9ec2-c0431746e3a0\") " pod="openstack/nova-api-0" Jan 28 21:02:08 crc kubenswrapper[4746]: I0128 21:02:08.158487 4746 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d3ee29df-412f-4e87-9ec2-c0431746e3a0-combined-ca-bundle\") pod \"nova-api-0\" (UID: \"d3ee29df-412f-4e87-9ec2-c0431746e3a0\") " pod="openstack/nova-api-0" Jan 28 21:02:08 crc kubenswrapper[4746]: I0128 21:02:08.158535 4746 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-tdxkd\" (UniqueName: \"kubernetes.io/projected/9426df49-b969-4890-8ea2-d37f2312e5e3-kube-api-access-tdxkd\") pod \"nova-metadata-0\" (UID: \"9426df49-b969-4890-8ea2-d37f2312e5e3\") " pod="openstack/nova-metadata-0" Jan 28 21:02:08 crc kubenswrapper[4746]: I0128 21:02:08.260678 4746 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-tdxkd\" (UniqueName: \"kubernetes.io/projected/9426df49-b969-4890-8ea2-d37f2312e5e3-kube-api-access-tdxkd\") pod \"nova-metadata-0\" (UID: \"9426df49-b969-4890-8ea2-d37f2312e5e3\") " pod="openstack/nova-metadata-0" Jan 28 21:02:08 crc kubenswrapper[4746]: I0128 21:02:08.261216 4746 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nova-metadata-tls-certs\" (UniqueName: \"kubernetes.io/secret/9426df49-b969-4890-8ea2-d37f2312e5e3-nova-metadata-tls-certs\") pod \"nova-metadata-0\" (UID: \"9426df49-b969-4890-8ea2-d37f2312e5e3\") " pod="openstack/nova-metadata-0" Jan 28 21:02:08 crc kubenswrapper[4746]: I0128 21:02:08.262053 4746 reconciler_common.go:218] "operationExecutor.MountVolume started for 
volume \"config-data\" (UniqueName: \"kubernetes.io/secret/9426df49-b969-4890-8ea2-d37f2312e5e3-config-data\") pod \"nova-metadata-0\" (UID: \"9426df49-b969-4890-8ea2-d37f2312e5e3\") " pod="openstack/nova-metadata-0" Jan 28 21:02:08 crc kubenswrapper[4746]: I0128 21:02:08.262094 4746 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/9426df49-b969-4890-8ea2-d37f2312e5e3-logs\") pod \"nova-metadata-0\" (UID: \"9426df49-b969-4890-8ea2-d37f2312e5e3\") " pod="openstack/nova-metadata-0" Jan 28 21:02:08 crc kubenswrapper[4746]: I0128 21:02:08.262118 4746 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-67g5z\" (UniqueName: \"kubernetes.io/projected/d3ee29df-412f-4e87-9ec2-c0431746e3a0-kube-api-access-67g5z\") pod \"nova-api-0\" (UID: \"d3ee29df-412f-4e87-9ec2-c0431746e3a0\") " pod="openstack/nova-api-0" Jan 28 21:02:08 crc kubenswrapper[4746]: I0128 21:02:08.262207 4746 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/9426df49-b969-4890-8ea2-d37f2312e5e3-combined-ca-bundle\") pod \"nova-metadata-0\" (UID: \"9426df49-b969-4890-8ea2-d37f2312e5e3\") " pod="openstack/nova-metadata-0" Jan 28 21:02:08 crc kubenswrapper[4746]: I0128 21:02:08.262244 4746 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/d3ee29df-412f-4e87-9ec2-c0431746e3a0-config-data\") pod \"nova-api-0\" (UID: \"d3ee29df-412f-4e87-9ec2-c0431746e3a0\") " pod="openstack/nova-api-0" Jan 28 21:02:08 crc kubenswrapper[4746]: I0128 21:02:08.262265 4746 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/d3ee29df-412f-4e87-9ec2-c0431746e3a0-logs\") pod \"nova-api-0\" (UID: \"d3ee29df-412f-4e87-9ec2-c0431746e3a0\") " pod="openstack/nova-api-0" Jan 28 21:02:08 crc 
kubenswrapper[4746]: I0128 21:02:08.262285 4746 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d3ee29df-412f-4e87-9ec2-c0431746e3a0-combined-ca-bundle\") pod \"nova-api-0\" (UID: \"d3ee29df-412f-4e87-9ec2-c0431746e3a0\") " pod="openstack/nova-api-0" Jan 28 21:02:08 crc kubenswrapper[4746]: I0128 21:02:08.262638 4746 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/9426df49-b969-4890-8ea2-d37f2312e5e3-logs\") pod \"nova-metadata-0\" (UID: \"9426df49-b969-4890-8ea2-d37f2312e5e3\") " pod="openstack/nova-metadata-0" Jan 28 21:02:08 crc kubenswrapper[4746]: I0128 21:02:08.263373 4746 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/d3ee29df-412f-4e87-9ec2-c0431746e3a0-logs\") pod \"nova-api-0\" (UID: \"d3ee29df-412f-4e87-9ec2-c0431746e3a0\") " pod="openstack/nova-api-0" Jan 28 21:02:08 crc kubenswrapper[4746]: I0128 21:02:08.269109 4746 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d3ee29df-412f-4e87-9ec2-c0431746e3a0-combined-ca-bundle\") pod \"nova-api-0\" (UID: \"d3ee29df-412f-4e87-9ec2-c0431746e3a0\") " pod="openstack/nova-api-0" Jan 28 21:02:08 crc kubenswrapper[4746]: I0128 21:02:08.269798 4746 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/9426df49-b969-4890-8ea2-d37f2312e5e3-combined-ca-bundle\") pod \"nova-metadata-0\" (UID: \"9426df49-b969-4890-8ea2-d37f2312e5e3\") " pod="openstack/nova-metadata-0" Jan 28 21:02:08 crc kubenswrapper[4746]: I0128 21:02:08.270818 4746 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/d3ee29df-412f-4e87-9ec2-c0431746e3a0-config-data\") pod \"nova-api-0\" (UID: 
\"d3ee29df-412f-4e87-9ec2-c0431746e3a0\") " pod="openstack/nova-api-0" Jan 28 21:02:08 crc kubenswrapper[4746]: I0128 21:02:08.271490 4746 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"nova-metadata-tls-certs\" (UniqueName: \"kubernetes.io/secret/9426df49-b969-4890-8ea2-d37f2312e5e3-nova-metadata-tls-certs\") pod \"nova-metadata-0\" (UID: \"9426df49-b969-4890-8ea2-d37f2312e5e3\") " pod="openstack/nova-metadata-0" Jan 28 21:02:08 crc kubenswrapper[4746]: I0128 21:02:08.282771 4746 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-tdxkd\" (UniqueName: \"kubernetes.io/projected/9426df49-b969-4890-8ea2-d37f2312e5e3-kube-api-access-tdxkd\") pod \"nova-metadata-0\" (UID: \"9426df49-b969-4890-8ea2-d37f2312e5e3\") " pod="openstack/nova-metadata-0" Jan 28 21:02:08 crc kubenswrapper[4746]: I0128 21:02:08.283736 4746 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/9426df49-b969-4890-8ea2-d37f2312e5e3-config-data\") pod \"nova-metadata-0\" (UID: \"9426df49-b969-4890-8ea2-d37f2312e5e3\") " pod="openstack/nova-metadata-0" Jan 28 21:02:08 crc kubenswrapper[4746]: I0128 21:02:08.286338 4746 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-67g5z\" (UniqueName: \"kubernetes.io/projected/d3ee29df-412f-4e87-9ec2-c0431746e3a0-kube-api-access-67g5z\") pod \"nova-api-0\" (UID: \"d3ee29df-412f-4e87-9ec2-c0431746e3a0\") " pod="openstack/nova-api-0" Jan 28 21:02:08 crc kubenswrapper[4746]: I0128 21:02:08.366152 4746 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-api-0" Jan 28 21:02:08 crc kubenswrapper[4746]: I0128 21:02:08.414780 4746 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-metadata-0" Jan 28 21:02:08 crc kubenswrapper[4746]: I0128 21:02:08.427208 4746 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/nova-cell1-novncproxy-0" podUID="66366de9-a831-472a-a11f-f749b76d2007" containerName="nova-cell1-novncproxy-novncproxy" probeResult="failure" output="Get \"http://10.217.0.219:6080/vnc_lite.html\": context deadline exceeded (Client.Timeout exceeded while awaiting headers)" Jan 28 21:02:08 crc kubenswrapper[4746]: I0128 21:02:08.879834 4746 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="6509be40-b5da-4c87-bf0d-8b6a75084e60" path="/var/lib/kubelet/pods/6509be40-b5da-4c87-bf0d-8b6a75084e60/volumes" Jan 28 21:02:08 crc kubenswrapper[4746]: I0128 21:02:08.881954 4746 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="d76b1f13-0ce0-4706-a7d8-d70670e82b8f" path="/var/lib/kubelet/pods/d76b1f13-0ce0-4706-a7d8-d70670e82b8f/volumes" Jan 28 21:02:08 crc kubenswrapper[4746]: I0128 21:02:08.958397 4746 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-api-0"] Jan 28 21:02:09 crc kubenswrapper[4746]: I0128 21:02:09.164606 4746 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-metadata-0"] Jan 28 21:02:09 crc kubenswrapper[4746]: W0128 21:02:09.174715 4746 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod9426df49_b969_4890_8ea2_d37f2312e5e3.slice/crio-8ee002287db0e8c07208dd73a2a54e545acbbc3037d6d025ec41d5ffd58339d7 WatchSource:0}: Error finding container 8ee002287db0e8c07208dd73a2a54e545acbbc3037d6d025ec41d5ffd58339d7: Status 404 returned error can't find the container with id 8ee002287db0e8c07208dd73a2a54e545acbbc3037d6d025ec41d5ffd58339d7 Jan 28 21:02:09 crc kubenswrapper[4746]: I0128 21:02:09.257530 4746 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-scheduler-0" Jan 28 21:02:09 crc 
kubenswrapper[4746]: I0128 21:02:09.296345 4746 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-cell1-novncproxy-0" Jan 28 21:02:09 crc kubenswrapper[4746]: I0128 21:02:09.400767 4746 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-operators-7826h"] Jan 28 21:02:09 crc kubenswrapper[4746]: I0128 21:02:09.402996 4746 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-7826h" Jan 28 21:02:09 crc kubenswrapper[4746]: I0128 21:02:09.433465 4746 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-7826h"] Jan 28 21:02:09 crc kubenswrapper[4746]: I0128 21:02:09.500703 4746 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/04eb3dcd-859e-4fad-829e-c24bf2c954b4-catalog-content\") pod \"redhat-operators-7826h\" (UID: \"04eb3dcd-859e-4fad-829e-c24bf2c954b4\") " pod="openshift-marketplace/redhat-operators-7826h" Jan 28 21:02:09 crc kubenswrapper[4746]: I0128 21:02:09.500762 4746 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-szzg4\" (UniqueName: \"kubernetes.io/projected/04eb3dcd-859e-4fad-829e-c24bf2c954b4-kube-api-access-szzg4\") pod \"redhat-operators-7826h\" (UID: \"04eb3dcd-859e-4fad-829e-c24bf2c954b4\") " pod="openshift-marketplace/redhat-operators-7826h" Jan 28 21:02:09 crc kubenswrapper[4746]: I0128 21:02:09.500788 4746 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/04eb3dcd-859e-4fad-829e-c24bf2c954b4-utilities\") pod \"redhat-operators-7826h\" (UID: \"04eb3dcd-859e-4fad-829e-c24bf2c954b4\") " pod="openshift-marketplace/redhat-operators-7826h" Jan 28 21:02:09 crc kubenswrapper[4746]: I0128 21:02:09.603027 4746 
reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/04eb3dcd-859e-4fad-829e-c24bf2c954b4-catalog-content\") pod \"redhat-operators-7826h\" (UID: \"04eb3dcd-859e-4fad-829e-c24bf2c954b4\") " pod="openshift-marketplace/redhat-operators-7826h" Jan 28 21:02:09 crc kubenswrapper[4746]: I0128 21:02:09.603100 4746 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-szzg4\" (UniqueName: \"kubernetes.io/projected/04eb3dcd-859e-4fad-829e-c24bf2c954b4-kube-api-access-szzg4\") pod \"redhat-operators-7826h\" (UID: \"04eb3dcd-859e-4fad-829e-c24bf2c954b4\") " pod="openshift-marketplace/redhat-operators-7826h" Jan 28 21:02:09 crc kubenswrapper[4746]: I0128 21:02:09.603123 4746 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/04eb3dcd-859e-4fad-829e-c24bf2c954b4-utilities\") pod \"redhat-operators-7826h\" (UID: \"04eb3dcd-859e-4fad-829e-c24bf2c954b4\") " pod="openshift-marketplace/redhat-operators-7826h" Jan 28 21:02:09 crc kubenswrapper[4746]: I0128 21:02:09.603715 4746 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/04eb3dcd-859e-4fad-829e-c24bf2c954b4-utilities\") pod \"redhat-operators-7826h\" (UID: \"04eb3dcd-859e-4fad-829e-c24bf2c954b4\") " pod="openshift-marketplace/redhat-operators-7826h" Jan 28 21:02:09 crc kubenswrapper[4746]: I0128 21:02:09.603979 4746 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/04eb3dcd-859e-4fad-829e-c24bf2c954b4-catalog-content\") pod \"redhat-operators-7826h\" (UID: \"04eb3dcd-859e-4fad-829e-c24bf2c954b4\") " pod="openshift-marketplace/redhat-operators-7826h" Jan 28 21:02:09 crc kubenswrapper[4746]: I0128 21:02:09.620730 4746 operation_generator.go:637] "MountVolume.SetUp succeeded for volume 
\"kube-api-access-szzg4\" (UniqueName: \"kubernetes.io/projected/04eb3dcd-859e-4fad-829e-c24bf2c954b4-kube-api-access-szzg4\") pod \"redhat-operators-7826h\" (UID: \"04eb3dcd-859e-4fad-829e-c24bf2c954b4\") " pod="openshift-marketplace/redhat-operators-7826h" Jan 28 21:02:09 crc kubenswrapper[4746]: I0128 21:02:09.744195 4746 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-7826h" Jan 28 21:02:10 crc kubenswrapper[4746]: I0128 21:02:10.005258 4746 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"d3ee29df-412f-4e87-9ec2-c0431746e3a0","Type":"ContainerStarted","Data":"65c7f70deddcaf9800c9f9004a3e7977ee14a93748ec078ec67bac98d78e0dac"} Jan 28 21:02:10 crc kubenswrapper[4746]: I0128 21:02:10.005498 4746 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"d3ee29df-412f-4e87-9ec2-c0431746e3a0","Type":"ContainerStarted","Data":"2ecc9e969b4f94bb854b25b0bb46edb565ee8c8e19506d10e5b48e34cb27f53f"} Jan 28 21:02:10 crc kubenswrapper[4746]: I0128 21:02:10.005508 4746 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"d3ee29df-412f-4e87-9ec2-c0431746e3a0","Type":"ContainerStarted","Data":"f03dd96af88bd834bf7a56df89f95eb22b540b72ba526f46d4e5e4aac10baa06"} Jan 28 21:02:10 crc kubenswrapper[4746]: I0128 21:02:10.029181 4746 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"9426df49-b969-4890-8ea2-d37f2312e5e3","Type":"ContainerStarted","Data":"525b216bf9c8a170b6745fa7879f9e6f31010e3a295beb89016f0788129848e2"} Jan 28 21:02:10 crc kubenswrapper[4746]: I0128 21:02:10.029233 4746 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"9426df49-b969-4890-8ea2-d37f2312e5e3","Type":"ContainerStarted","Data":"2d297a0d0f186621ca9c2e154b2d393b199f82be155c05fd32d8e2c3e95bddbf"} Jan 28 21:02:10 crc kubenswrapper[4746]: I0128 
21:02:10.029244 4746 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"9426df49-b969-4890-8ea2-d37f2312e5e3","Type":"ContainerStarted","Data":"8ee002287db0e8c07208dd73a2a54e545acbbc3037d6d025ec41d5ffd58339d7"} Jan 28 21:02:10 crc kubenswrapper[4746]: I0128 21:02:10.036628 4746 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-api-0" podStartSLOduration=3.036613003 podStartE2EDuration="3.036613003s" podCreationTimestamp="2026-01-28 21:02:07 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-28 21:02:10.025121934 +0000 UTC m=+1357.981308288" watchObservedRunningTime="2026-01-28 21:02:10.036613003 +0000 UTC m=+1357.992799357" Jan 28 21:02:10 crc kubenswrapper[4746]: I0128 21:02:10.061834 4746 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-metadata-0" podStartSLOduration=2.061813101 podStartE2EDuration="2.061813101s" podCreationTimestamp="2026-01-28 21:02:08 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-28 21:02:10.056054787 +0000 UTC m=+1358.012241141" watchObservedRunningTime="2026-01-28 21:02:10.061813101 +0000 UTC m=+1358.017999455" Jan 28 21:02:10 crc kubenswrapper[4746]: W0128 21:02:10.210527 4746 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod04eb3dcd_859e_4fad_829e_c24bf2c954b4.slice/crio-cbef2dd49c2b4030e0f1172ae75c291591c9d79b1aca6fb5002054307addec15 WatchSource:0}: Error finding container cbef2dd49c2b4030e0f1172ae75c291591c9d79b1aca6fb5002054307addec15: Status 404 returned error can't find the container with id cbef2dd49c2b4030e0f1172ae75c291591c9d79b1aca6fb5002054307addec15 Jan 28 21:02:10 crc kubenswrapper[4746]: I0128 21:02:10.210901 4746 kubelet.go:2428] "SyncLoop 
UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-7826h"] Jan 28 21:02:11 crc kubenswrapper[4746]: I0128 21:02:11.039871 4746 generic.go:334] "Generic (PLEG): container finished" podID="04eb3dcd-859e-4fad-829e-c24bf2c954b4" containerID="8489fad5f60aef344c5116823427367409db1e4a669ebd8ad9708f77b31931f5" exitCode=0 Jan 28 21:02:11 crc kubenswrapper[4746]: I0128 21:02:11.039997 4746 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-7826h" event={"ID":"04eb3dcd-859e-4fad-829e-c24bf2c954b4","Type":"ContainerDied","Data":"8489fad5f60aef344c5116823427367409db1e4a669ebd8ad9708f77b31931f5"} Jan 28 21:02:11 crc kubenswrapper[4746]: I0128 21:02:11.040286 4746 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-7826h" event={"ID":"04eb3dcd-859e-4fad-829e-c24bf2c954b4","Type":"ContainerStarted","Data":"cbef2dd49c2b4030e0f1172ae75c291591c9d79b1aca6fb5002054307addec15"} Jan 28 21:02:12 crc kubenswrapper[4746]: I0128 21:02:12.052561 4746 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-7826h" event={"ID":"04eb3dcd-859e-4fad-829e-c24bf2c954b4","Type":"ContainerStarted","Data":"31e21fca11d41102e20b62cb8706f889109b63df5f24a367aa7d62c27d7cad7b"} Jan 28 21:02:13 crc kubenswrapper[4746]: I0128 21:02:13.071768 4746 generic.go:334] "Generic (PLEG): container finished" podID="04eb3dcd-859e-4fad-829e-c24bf2c954b4" containerID="31e21fca11d41102e20b62cb8706f889109b63df5f24a367aa7d62c27d7cad7b" exitCode=0 Jan 28 21:02:13 crc kubenswrapper[4746]: I0128 21:02:13.071854 4746 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-7826h" event={"ID":"04eb3dcd-859e-4fad-829e-c24bf2c954b4","Type":"ContainerDied","Data":"31e21fca11d41102e20b62cb8706f889109b63df5f24a367aa7d62c27d7cad7b"} Jan 28 21:02:13 crc kubenswrapper[4746]: I0128 21:02:13.416403 4746 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" 
pod="openstack/nova-metadata-0" Jan 28 21:02:13 crc kubenswrapper[4746]: I0128 21:02:13.416506 4746 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-metadata-0" Jan 28 21:02:14 crc kubenswrapper[4746]: I0128 21:02:14.102351 4746 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-7826h" event={"ID":"04eb3dcd-859e-4fad-829e-c24bf2c954b4","Type":"ContainerStarted","Data":"97a21ed0977d08d0eaf60fcb02cf46d34a01a47812b0539f645b301cf8b1f157"} Jan 28 21:02:14 crc kubenswrapper[4746]: I0128 21:02:14.148018 4746 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-operators-7826h" podStartSLOduration=2.509633625 podStartE2EDuration="5.148000274s" podCreationTimestamp="2026-01-28 21:02:09 +0000 UTC" firstStartedPulling="2026-01-28 21:02:11.042293965 +0000 UTC m=+1358.998480319" lastFinishedPulling="2026-01-28 21:02:13.680660614 +0000 UTC m=+1361.636846968" observedRunningTime="2026-01-28 21:02:14.142984049 +0000 UTC m=+1362.099170423" watchObservedRunningTime="2026-01-28 21:02:14.148000274 +0000 UTC m=+1362.104186628" Jan 28 21:02:14 crc kubenswrapper[4746]: I0128 21:02:14.256465 4746 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/nova-scheduler-0" Jan 28 21:02:14 crc kubenswrapper[4746]: I0128 21:02:14.297277 4746 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/nova-cell1-novncproxy-0" Jan 28 21:02:14 crc kubenswrapper[4746]: I0128 21:02:14.300741 4746 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/nova-scheduler-0" Jan 28 21:02:14 crc kubenswrapper[4746]: I0128 21:02:14.338378 4746 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/nova-cell1-novncproxy-0" Jan 28 21:02:15 crc kubenswrapper[4746]: I0128 21:02:15.128573 4746 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" 
pod="openstack/nova-cell1-novncproxy-0" Jan 28 21:02:15 crc kubenswrapper[4746]: I0128 21:02:15.136482 4746 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/nova-scheduler-0" Jan 28 21:02:16 crc kubenswrapper[4746]: I0128 21:02:16.267410 4746 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/nova-cell1-conductor-0" Jan 28 21:02:16 crc kubenswrapper[4746]: I0128 21:02:16.785295 4746 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-cell1-cell-mapping-jng7h"] Jan 28 21:02:16 crc kubenswrapper[4746]: I0128 21:02:16.788374 4746 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell1-cell-mapping-jng7h" Jan 28 21:02:16 crc kubenswrapper[4746]: I0128 21:02:16.793884 4746 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-cell1-manage-config-data" Jan 28 21:02:16 crc kubenswrapper[4746]: I0128 21:02:16.793916 4746 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-cell1-manage-scripts" Jan 28 21:02:16 crc kubenswrapper[4746]: I0128 21:02:16.810366 4746 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell1-cell-mapping-jng7h"] Jan 28 21:02:16 crc kubenswrapper[4746]: I0128 21:02:16.850264 4746 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-khh9m\" (UniqueName: \"kubernetes.io/projected/c3913d1a-3943-41bf-a670-cf63f257f3a4-kube-api-access-khh9m\") pod \"nova-cell1-cell-mapping-jng7h\" (UID: \"c3913d1a-3943-41bf-a670-cf63f257f3a4\") " pod="openstack/nova-cell1-cell-mapping-jng7h" Jan 28 21:02:16 crc kubenswrapper[4746]: I0128 21:02:16.850455 4746 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/c3913d1a-3943-41bf-a670-cf63f257f3a4-config-data\") pod \"nova-cell1-cell-mapping-jng7h\" (UID: 
\"c3913d1a-3943-41bf-a670-cf63f257f3a4\") " pod="openstack/nova-cell1-cell-mapping-jng7h" Jan 28 21:02:16 crc kubenswrapper[4746]: I0128 21:02:16.850533 4746 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/c3913d1a-3943-41bf-a670-cf63f257f3a4-combined-ca-bundle\") pod \"nova-cell1-cell-mapping-jng7h\" (UID: \"c3913d1a-3943-41bf-a670-cf63f257f3a4\") " pod="openstack/nova-cell1-cell-mapping-jng7h" Jan 28 21:02:16 crc kubenswrapper[4746]: I0128 21:02:16.850557 4746 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/c3913d1a-3943-41bf-a670-cf63f257f3a4-scripts\") pod \"nova-cell1-cell-mapping-jng7h\" (UID: \"c3913d1a-3943-41bf-a670-cf63f257f3a4\") " pod="openstack/nova-cell1-cell-mapping-jng7h" Jan 28 21:02:16 crc kubenswrapper[4746]: I0128 21:02:16.952963 4746 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/c3913d1a-3943-41bf-a670-cf63f257f3a4-combined-ca-bundle\") pod \"nova-cell1-cell-mapping-jng7h\" (UID: \"c3913d1a-3943-41bf-a670-cf63f257f3a4\") " pod="openstack/nova-cell1-cell-mapping-jng7h" Jan 28 21:02:16 crc kubenswrapper[4746]: I0128 21:02:16.953013 4746 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/c3913d1a-3943-41bf-a670-cf63f257f3a4-scripts\") pod \"nova-cell1-cell-mapping-jng7h\" (UID: \"c3913d1a-3943-41bf-a670-cf63f257f3a4\") " pod="openstack/nova-cell1-cell-mapping-jng7h" Jan 28 21:02:16 crc kubenswrapper[4746]: I0128 21:02:16.953232 4746 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-khh9m\" (UniqueName: \"kubernetes.io/projected/c3913d1a-3943-41bf-a670-cf63f257f3a4-kube-api-access-khh9m\") pod \"nova-cell1-cell-mapping-jng7h\" (UID: 
\"c3913d1a-3943-41bf-a670-cf63f257f3a4\") " pod="openstack/nova-cell1-cell-mapping-jng7h" Jan 28 21:02:16 crc kubenswrapper[4746]: I0128 21:02:16.953263 4746 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/c3913d1a-3943-41bf-a670-cf63f257f3a4-config-data\") pod \"nova-cell1-cell-mapping-jng7h\" (UID: \"c3913d1a-3943-41bf-a670-cf63f257f3a4\") " pod="openstack/nova-cell1-cell-mapping-jng7h" Jan 28 21:02:16 crc kubenswrapper[4746]: I0128 21:02:16.962429 4746 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/c3913d1a-3943-41bf-a670-cf63f257f3a4-scripts\") pod \"nova-cell1-cell-mapping-jng7h\" (UID: \"c3913d1a-3943-41bf-a670-cf63f257f3a4\") " pod="openstack/nova-cell1-cell-mapping-jng7h" Jan 28 21:02:16 crc kubenswrapper[4746]: I0128 21:02:16.962517 4746 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/c3913d1a-3943-41bf-a670-cf63f257f3a4-combined-ca-bundle\") pod \"nova-cell1-cell-mapping-jng7h\" (UID: \"c3913d1a-3943-41bf-a670-cf63f257f3a4\") " pod="openstack/nova-cell1-cell-mapping-jng7h" Jan 28 21:02:16 crc kubenswrapper[4746]: I0128 21:02:16.962595 4746 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/c3913d1a-3943-41bf-a670-cf63f257f3a4-config-data\") pod \"nova-cell1-cell-mapping-jng7h\" (UID: \"c3913d1a-3943-41bf-a670-cf63f257f3a4\") " pod="openstack/nova-cell1-cell-mapping-jng7h" Jan 28 21:02:16 crc kubenswrapper[4746]: I0128 21:02:16.972854 4746 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-khh9m\" (UniqueName: \"kubernetes.io/projected/c3913d1a-3943-41bf-a670-cf63f257f3a4-kube-api-access-khh9m\") pod \"nova-cell1-cell-mapping-jng7h\" (UID: \"c3913d1a-3943-41bf-a670-cf63f257f3a4\") " pod="openstack/nova-cell1-cell-mapping-jng7h" 
Jan 28 21:02:17 crc kubenswrapper[4746]: I0128 21:02:17.105169 4746 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell1-cell-mapping-jng7h" Jan 28 21:02:17 crc kubenswrapper[4746]: I0128 21:02:17.642999 4746 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell1-cell-mapping-jng7h"] Jan 28 21:02:18 crc kubenswrapper[4746]: I0128 21:02:18.138758 4746 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-cell-mapping-jng7h" event={"ID":"c3913d1a-3943-41bf-a670-cf63f257f3a4","Type":"ContainerStarted","Data":"fa89a1688b3552ca7dec90db7ed13ec1efc1a868ef0d9c578ac3752d210eb6da"} Jan 28 21:02:18 crc kubenswrapper[4746]: I0128 21:02:18.139316 4746 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-cell-mapping-jng7h" event={"ID":"c3913d1a-3943-41bf-a670-cf63f257f3a4","Type":"ContainerStarted","Data":"d75135f7a7f7de0666cbe7339f07228825c328721e2e61da2ee3cdb6e2687b33"} Jan 28 21:02:18 crc kubenswrapper[4746]: I0128 21:02:18.160045 4746 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-cell1-cell-mapping-jng7h" podStartSLOduration=2.160027063 podStartE2EDuration="2.160027063s" podCreationTimestamp="2026-01-28 21:02:16 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-28 21:02:18.157815663 +0000 UTC m=+1366.114002017" watchObservedRunningTime="2026-01-28 21:02:18.160027063 +0000 UTC m=+1366.116213417" Jan 28 21:02:18 crc kubenswrapper[4746]: I0128 21:02:18.366347 4746 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/nova-api-0" Jan 28 21:02:18 crc kubenswrapper[4746]: I0128 21:02:18.366405 4746 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/nova-api-0" Jan 28 21:02:18 crc kubenswrapper[4746]: I0128 21:02:18.416766 4746 kubelet.go:2542] "SyncLoop (probe)" 
probe="startup" status="unhealthy" pod="openstack/nova-metadata-0" Jan 28 21:02:18 crc kubenswrapper[4746]: I0128 21:02:18.416812 4746 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/nova-metadata-0" Jan 28 21:02:19 crc kubenswrapper[4746]: I0128 21:02:19.449309 4746 prober.go:107] "Probe failed" probeType="Startup" pod="openstack/nova-api-0" podUID="d3ee29df-412f-4e87-9ec2-c0431746e3a0" containerName="nova-api-log" probeResult="failure" output="Get \"http://10.217.0.228:8774/\": context deadline exceeded (Client.Timeout exceeded while awaiting headers)" Jan 28 21:02:19 crc kubenswrapper[4746]: I0128 21:02:19.449413 4746 prober.go:107] "Probe failed" probeType="Startup" pod="openstack/nova-api-0" podUID="d3ee29df-412f-4e87-9ec2-c0431746e3a0" containerName="nova-api-api" probeResult="failure" output="Get \"http://10.217.0.228:8774/\": context deadline exceeded (Client.Timeout exceeded while awaiting headers)" Jan 28 21:02:19 crc kubenswrapper[4746]: I0128 21:02:19.464406 4746 prober.go:107] "Probe failed" probeType="Startup" pod="openstack/nova-metadata-0" podUID="9426df49-b969-4890-8ea2-d37f2312e5e3" containerName="nova-metadata-log" probeResult="failure" output="Get \"https://10.217.0.229:8775/\": net/http: request canceled (Client.Timeout exceeded while awaiting headers)" Jan 28 21:02:19 crc kubenswrapper[4746]: I0128 21:02:19.464579 4746 prober.go:107] "Probe failed" probeType="Startup" pod="openstack/nova-metadata-0" podUID="9426df49-b969-4890-8ea2-d37f2312e5e3" containerName="nova-metadata-metadata" probeResult="failure" output="Get \"https://10.217.0.229:8775/\": net/http: request canceled (Client.Timeout exceeded while awaiting headers)" Jan 28 21:02:19 crc kubenswrapper[4746]: I0128 21:02:19.745778 4746 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-operators-7826h" Jan 28 21:02:19 crc kubenswrapper[4746]: I0128 21:02:19.747169 4746 kubelet.go:2542] "SyncLoop 
(probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-operators-7826h" Jan 28 21:02:20 crc kubenswrapper[4746]: I0128 21:02:20.813427 4746 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-marketplace/redhat-operators-7826h" podUID="04eb3dcd-859e-4fad-829e-c24bf2c954b4" containerName="registry-server" probeResult="failure" output=< Jan 28 21:02:20 crc kubenswrapper[4746]: timeout: failed to connect service ":50051" within 1s Jan 28 21:02:20 crc kubenswrapper[4746]: > Jan 28 21:02:24 crc kubenswrapper[4746]: I0128 21:02:24.201037 4746 generic.go:334] "Generic (PLEG): container finished" podID="c3913d1a-3943-41bf-a670-cf63f257f3a4" containerID="fa89a1688b3552ca7dec90db7ed13ec1efc1a868ef0d9c578ac3752d210eb6da" exitCode=0 Jan 28 21:02:24 crc kubenswrapper[4746]: I0128 21:02:24.201123 4746 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-cell-mapping-jng7h" event={"ID":"c3913d1a-3943-41bf-a670-cf63f257f3a4","Type":"ContainerDied","Data":"fa89a1688b3552ca7dec90db7ed13ec1efc1a868ef0d9c578ac3752d210eb6da"} Jan 28 21:02:25 crc kubenswrapper[4746]: I0128 21:02:25.397235 4746 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/ceilometer-0" Jan 28 21:02:25 crc kubenswrapper[4746]: I0128 21:02:25.768137 4746 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell1-cell-mapping-jng7h" Jan 28 21:02:25 crc kubenswrapper[4746]: I0128 21:02:25.854870 4746 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/c3913d1a-3943-41bf-a670-cf63f257f3a4-combined-ca-bundle\") pod \"c3913d1a-3943-41bf-a670-cf63f257f3a4\" (UID: \"c3913d1a-3943-41bf-a670-cf63f257f3a4\") " Jan 28 21:02:25 crc kubenswrapper[4746]: I0128 21:02:25.855049 4746 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-khh9m\" (UniqueName: \"kubernetes.io/projected/c3913d1a-3943-41bf-a670-cf63f257f3a4-kube-api-access-khh9m\") pod \"c3913d1a-3943-41bf-a670-cf63f257f3a4\" (UID: \"c3913d1a-3943-41bf-a670-cf63f257f3a4\") " Jan 28 21:02:25 crc kubenswrapper[4746]: I0128 21:02:25.855872 4746 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/c3913d1a-3943-41bf-a670-cf63f257f3a4-scripts\") pod \"c3913d1a-3943-41bf-a670-cf63f257f3a4\" (UID: \"c3913d1a-3943-41bf-a670-cf63f257f3a4\") " Jan 28 21:02:25 crc kubenswrapper[4746]: I0128 21:02:25.856013 4746 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/c3913d1a-3943-41bf-a670-cf63f257f3a4-config-data\") pod \"c3913d1a-3943-41bf-a670-cf63f257f3a4\" (UID: \"c3913d1a-3943-41bf-a670-cf63f257f3a4\") " Jan 28 21:02:25 crc kubenswrapper[4746]: I0128 21:02:25.860802 4746 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/c3913d1a-3943-41bf-a670-cf63f257f3a4-scripts" (OuterVolumeSpecName: "scripts") pod "c3913d1a-3943-41bf-a670-cf63f257f3a4" (UID: "c3913d1a-3943-41bf-a670-cf63f257f3a4"). InnerVolumeSpecName "scripts". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 28 21:02:25 crc kubenswrapper[4746]: I0128 21:02:25.861227 4746 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/c3913d1a-3943-41bf-a670-cf63f257f3a4-kube-api-access-khh9m" (OuterVolumeSpecName: "kube-api-access-khh9m") pod "c3913d1a-3943-41bf-a670-cf63f257f3a4" (UID: "c3913d1a-3943-41bf-a670-cf63f257f3a4"). InnerVolumeSpecName "kube-api-access-khh9m". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 28 21:02:25 crc kubenswrapper[4746]: I0128 21:02:25.895739 4746 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/c3913d1a-3943-41bf-a670-cf63f257f3a4-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "c3913d1a-3943-41bf-a670-cf63f257f3a4" (UID: "c3913d1a-3943-41bf-a670-cf63f257f3a4"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 28 21:02:25 crc kubenswrapper[4746]: I0128 21:02:25.898932 4746 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/c3913d1a-3943-41bf-a670-cf63f257f3a4-config-data" (OuterVolumeSpecName: "config-data") pod "c3913d1a-3943-41bf-a670-cf63f257f3a4" (UID: "c3913d1a-3943-41bf-a670-cf63f257f3a4"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 28 21:02:25 crc kubenswrapper[4746]: I0128 21:02:25.958999 4746 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/c3913d1a-3943-41bf-a670-cf63f257f3a4-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Jan 28 21:02:25 crc kubenswrapper[4746]: I0128 21:02:25.959030 4746 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-khh9m\" (UniqueName: \"kubernetes.io/projected/c3913d1a-3943-41bf-a670-cf63f257f3a4-kube-api-access-khh9m\") on node \"crc\" DevicePath \"\"" Jan 28 21:02:25 crc kubenswrapper[4746]: I0128 21:02:25.959040 4746 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/c3913d1a-3943-41bf-a670-cf63f257f3a4-scripts\") on node \"crc\" DevicePath \"\"" Jan 28 21:02:25 crc kubenswrapper[4746]: I0128 21:02:25.959048 4746 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/c3913d1a-3943-41bf-a670-cf63f257f3a4-config-data\") on node \"crc\" DevicePath \"\"" Jan 28 21:02:26 crc kubenswrapper[4746]: I0128 21:02:26.219857 4746 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-cell-mapping-jng7h" event={"ID":"c3913d1a-3943-41bf-a670-cf63f257f3a4","Type":"ContainerDied","Data":"d75135f7a7f7de0666cbe7339f07228825c328721e2e61da2ee3cdb6e2687b33"} Jan 28 21:02:26 crc kubenswrapper[4746]: I0128 21:02:26.220167 4746 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="d75135f7a7f7de0666cbe7339f07228825c328721e2e61da2ee3cdb6e2687b33" Jan 28 21:02:26 crc kubenswrapper[4746]: I0128 21:02:26.219905 4746 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell1-cell-mapping-jng7h" Jan 28 21:02:26 crc kubenswrapper[4746]: I0128 21:02:26.401929 4746 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-api-0"] Jan 28 21:02:26 crc kubenswrapper[4746]: I0128 21:02:26.402211 4746 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-api-0" podUID="d3ee29df-412f-4e87-9ec2-c0431746e3a0" containerName="nova-api-log" containerID="cri-o://2ecc9e969b4f94bb854b25b0bb46edb565ee8c8e19506d10e5b48e34cb27f53f" gracePeriod=30 Jan 28 21:02:26 crc kubenswrapper[4746]: I0128 21:02:26.402773 4746 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-api-0" podUID="d3ee29df-412f-4e87-9ec2-c0431746e3a0" containerName="nova-api-api" containerID="cri-o://65c7f70deddcaf9800c9f9004a3e7977ee14a93748ec078ec67bac98d78e0dac" gracePeriod=30 Jan 28 21:02:26 crc kubenswrapper[4746]: I0128 21:02:26.422275 4746 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-scheduler-0"] Jan 28 21:02:26 crc kubenswrapper[4746]: I0128 21:02:26.422891 4746 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-scheduler-0" podUID="44b99e27-4d9c-4167-a68d-ceb2e627bb95" containerName="nova-scheduler-scheduler" containerID="cri-o://c75bf35b9356d72388cfe4007ab257f8d183a8bcfb3197200c57896767a7dbdf" gracePeriod=30 Jan 28 21:02:26 crc kubenswrapper[4746]: I0128 21:02:26.464711 4746 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-metadata-0"] Jan 28 21:02:26 crc kubenswrapper[4746]: I0128 21:02:26.464971 4746 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-metadata-0" podUID="9426df49-b969-4890-8ea2-d37f2312e5e3" containerName="nova-metadata-log" containerID="cri-o://2d297a0d0f186621ca9c2e154b2d393b199f82be155c05fd32d8e2c3e95bddbf" gracePeriod=30 Jan 28 21:02:26 crc kubenswrapper[4746]: I0128 21:02:26.466013 4746 
kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-metadata-0" podUID="9426df49-b969-4890-8ea2-d37f2312e5e3" containerName="nova-metadata-metadata" containerID="cri-o://525b216bf9c8a170b6745fa7879f9e6f31010e3a295beb89016f0788129848e2" gracePeriod=30 Jan 28 21:02:27 crc kubenswrapper[4746]: I0128 21:02:27.235944 4746 generic.go:334] "Generic (PLEG): container finished" podID="d3ee29df-412f-4e87-9ec2-c0431746e3a0" containerID="2ecc9e969b4f94bb854b25b0bb46edb565ee8c8e19506d10e5b48e34cb27f53f" exitCode=143 Jan 28 21:02:27 crc kubenswrapper[4746]: I0128 21:02:27.236052 4746 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"d3ee29df-412f-4e87-9ec2-c0431746e3a0","Type":"ContainerDied","Data":"2ecc9e969b4f94bb854b25b0bb46edb565ee8c8e19506d10e5b48e34cb27f53f"} Jan 28 21:02:27 crc kubenswrapper[4746]: I0128 21:02:27.238323 4746 generic.go:334] "Generic (PLEG): container finished" podID="9426df49-b969-4890-8ea2-d37f2312e5e3" containerID="2d297a0d0f186621ca9c2e154b2d393b199f82be155c05fd32d8e2c3e95bddbf" exitCode=143 Jan 28 21:02:27 crc kubenswrapper[4746]: I0128 21:02:27.238359 4746 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"9426df49-b969-4890-8ea2-d37f2312e5e3","Type":"ContainerDied","Data":"2d297a0d0f186621ca9c2e154b2d393b199f82be155c05fd32d8e2c3e95bddbf"} Jan 28 21:02:28 crc kubenswrapper[4746]: I0128 21:02:28.250518 4746 generic.go:334] "Generic (PLEG): container finished" podID="44b99e27-4d9c-4167-a68d-ceb2e627bb95" containerID="c75bf35b9356d72388cfe4007ab257f8d183a8bcfb3197200c57896767a7dbdf" exitCode=0 Jan 28 21:02:28 crc kubenswrapper[4746]: I0128 21:02:28.250791 4746 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-scheduler-0" event={"ID":"44b99e27-4d9c-4167-a68d-ceb2e627bb95","Type":"ContainerDied","Data":"c75bf35b9356d72388cfe4007ab257f8d183a8bcfb3197200c57896767a7dbdf"} Jan 28 21:02:28 crc kubenswrapper[4746]: 
I0128 21:02:28.516616 4746 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-scheduler-0"
Jan 28 21:02:28 crc kubenswrapper[4746]: I0128 21:02:28.615730 4746 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/44b99e27-4d9c-4167-a68d-ceb2e627bb95-config-data\") pod \"44b99e27-4d9c-4167-a68d-ceb2e627bb95\" (UID: \"44b99e27-4d9c-4167-a68d-ceb2e627bb95\") "
Jan 28 21:02:28 crc kubenswrapper[4746]: I0128 21:02:28.615816 4746 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/44b99e27-4d9c-4167-a68d-ceb2e627bb95-combined-ca-bundle\") pod \"44b99e27-4d9c-4167-a68d-ceb2e627bb95\" (UID: \"44b99e27-4d9c-4167-a68d-ceb2e627bb95\") "
Jan 28 21:02:28 crc kubenswrapper[4746]: I0128 21:02:28.615911 4746 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-t8p69\" (UniqueName: \"kubernetes.io/projected/44b99e27-4d9c-4167-a68d-ceb2e627bb95-kube-api-access-t8p69\") pod \"44b99e27-4d9c-4167-a68d-ceb2e627bb95\" (UID: \"44b99e27-4d9c-4167-a68d-ceb2e627bb95\") "
Jan 28 21:02:28 crc kubenswrapper[4746]: I0128 21:02:28.621852 4746 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/44b99e27-4d9c-4167-a68d-ceb2e627bb95-kube-api-access-t8p69" (OuterVolumeSpecName: "kube-api-access-t8p69") pod "44b99e27-4d9c-4167-a68d-ceb2e627bb95" (UID: "44b99e27-4d9c-4167-a68d-ceb2e627bb95"). InnerVolumeSpecName "kube-api-access-t8p69". PluginName "kubernetes.io/projected", VolumeGidValue ""
Jan 28 21:02:28 crc kubenswrapper[4746]: I0128 21:02:28.655172 4746 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/44b99e27-4d9c-4167-a68d-ceb2e627bb95-config-data" (OuterVolumeSpecName: "config-data") pod "44b99e27-4d9c-4167-a68d-ceb2e627bb95" (UID: "44b99e27-4d9c-4167-a68d-ceb2e627bb95"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue ""
Jan 28 21:02:28 crc kubenswrapper[4746]: I0128 21:02:28.661460 4746 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/44b99e27-4d9c-4167-a68d-ceb2e627bb95-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "44b99e27-4d9c-4167-a68d-ceb2e627bb95" (UID: "44b99e27-4d9c-4167-a68d-ceb2e627bb95"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue ""
Jan 28 21:02:28 crc kubenswrapper[4746]: I0128 21:02:28.718303 4746 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/44b99e27-4d9c-4167-a68d-ceb2e627bb95-config-data\") on node \"crc\" DevicePath \"\""
Jan 28 21:02:28 crc kubenswrapper[4746]: I0128 21:02:28.718345 4746 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/44b99e27-4d9c-4167-a68d-ceb2e627bb95-combined-ca-bundle\") on node \"crc\" DevicePath \"\""
Jan 28 21:02:28 crc kubenswrapper[4746]: I0128 21:02:28.718361 4746 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-t8p69\" (UniqueName: \"kubernetes.io/projected/44b99e27-4d9c-4167-a68d-ceb2e627bb95-kube-api-access-t8p69\") on node \"crc\" DevicePath \"\""
Jan 28 21:02:29 crc kubenswrapper[4746]: I0128 21:02:29.263128 4746 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-scheduler-0" event={"ID":"44b99e27-4d9c-4167-a68d-ceb2e627bb95","Type":"ContainerDied","Data":"61a6255ff980906e8ae735560db7d128e432f9d7cbcccf8e00d35c1b016fe812"}
Jan 28 21:02:29 crc kubenswrapper[4746]: I0128 21:02:29.263388 4746 scope.go:117] "RemoveContainer" containerID="c75bf35b9356d72388cfe4007ab257f8d183a8bcfb3197200c57896767a7dbdf"
Jan 28 21:02:29 crc kubenswrapper[4746]: I0128 21:02:29.263442 4746 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-scheduler-0"
Jan 28 21:02:29 crc kubenswrapper[4746]: I0128 21:02:29.300715 4746 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-scheduler-0"]
Jan 28 21:02:29 crc kubenswrapper[4746]: I0128 21:02:29.309839 4746 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-scheduler-0"]
Jan 28 21:02:29 crc kubenswrapper[4746]: I0128 21:02:29.328115 4746 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-scheduler-0"]
Jan 28 21:02:29 crc kubenswrapper[4746]: E0128 21:02:29.328722 4746 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="c3913d1a-3943-41bf-a670-cf63f257f3a4" containerName="nova-manage"
Jan 28 21:02:29 crc kubenswrapper[4746]: I0128 21:02:29.328746 4746 state_mem.go:107] "Deleted CPUSet assignment" podUID="c3913d1a-3943-41bf-a670-cf63f257f3a4" containerName="nova-manage"
Jan 28 21:02:29 crc kubenswrapper[4746]: E0128 21:02:29.328777 4746 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="44b99e27-4d9c-4167-a68d-ceb2e627bb95" containerName="nova-scheduler-scheduler"
Jan 28 21:02:29 crc kubenswrapper[4746]: I0128 21:02:29.328786 4746 state_mem.go:107] "Deleted CPUSet assignment" podUID="44b99e27-4d9c-4167-a68d-ceb2e627bb95" containerName="nova-scheduler-scheduler"
Jan 28 21:02:29 crc kubenswrapper[4746]: I0128 21:02:29.329074 4746 memory_manager.go:354] "RemoveStaleState removing state" podUID="44b99e27-4d9c-4167-a68d-ceb2e627bb95" containerName="nova-scheduler-scheduler"
Jan 28 21:02:29 crc kubenswrapper[4746]: I0128 21:02:29.329117 4746 memory_manager.go:354] "RemoveStaleState removing state" podUID="c3913d1a-3943-41bf-a670-cf63f257f3a4" containerName="nova-manage"
Jan 28 21:02:29 crc kubenswrapper[4746]: I0128 21:02:29.330013 4746 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-scheduler-0"
Jan 28 21:02:29 crc kubenswrapper[4746]: I0128 21:02:29.336229 4746 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-scheduler-config-data"
Jan 28 21:02:29 crc kubenswrapper[4746]: I0128 21:02:29.341366 4746 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-scheduler-0"]
Jan 28 21:02:29 crc kubenswrapper[4746]: I0128 21:02:29.432929 4746 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-blkmw\" (UniqueName: \"kubernetes.io/projected/d36f2955-7688-4b25-9097-becffcb1f3ad-kube-api-access-blkmw\") pod \"nova-scheduler-0\" (UID: \"d36f2955-7688-4b25-9097-becffcb1f3ad\") " pod="openstack/nova-scheduler-0"
Jan 28 21:02:29 crc kubenswrapper[4746]: I0128 21:02:29.432973 4746 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/d36f2955-7688-4b25-9097-becffcb1f3ad-config-data\") pod \"nova-scheduler-0\" (UID: \"d36f2955-7688-4b25-9097-becffcb1f3ad\") " pod="openstack/nova-scheduler-0"
Jan 28 21:02:29 crc kubenswrapper[4746]: I0128 21:02:29.438244 4746 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d36f2955-7688-4b25-9097-becffcb1f3ad-combined-ca-bundle\") pod \"nova-scheduler-0\" (UID: \"d36f2955-7688-4b25-9097-becffcb1f3ad\") " pod="openstack/nova-scheduler-0"
Jan 28 21:02:29 crc kubenswrapper[4746]: I0128 21:02:29.540551 4746 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d36f2955-7688-4b25-9097-becffcb1f3ad-combined-ca-bundle\") pod \"nova-scheduler-0\" (UID: \"d36f2955-7688-4b25-9097-becffcb1f3ad\") " pod="openstack/nova-scheduler-0"
Jan 28 21:02:29 crc kubenswrapper[4746]: I0128 21:02:29.540734 4746 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-blkmw\" (UniqueName: \"kubernetes.io/projected/d36f2955-7688-4b25-9097-becffcb1f3ad-kube-api-access-blkmw\") pod \"nova-scheduler-0\" (UID: \"d36f2955-7688-4b25-9097-becffcb1f3ad\") " pod="openstack/nova-scheduler-0"
Jan 28 21:02:29 crc kubenswrapper[4746]: I0128 21:02:29.540762 4746 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/d36f2955-7688-4b25-9097-becffcb1f3ad-config-data\") pod \"nova-scheduler-0\" (UID: \"d36f2955-7688-4b25-9097-becffcb1f3ad\") " pod="openstack/nova-scheduler-0"
Jan 28 21:02:29 crc kubenswrapper[4746]: I0128 21:02:29.545665 4746 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d36f2955-7688-4b25-9097-becffcb1f3ad-combined-ca-bundle\") pod \"nova-scheduler-0\" (UID: \"d36f2955-7688-4b25-9097-becffcb1f3ad\") " pod="openstack/nova-scheduler-0"
Jan 28 21:02:29 crc kubenswrapper[4746]: I0128 21:02:29.546972 4746 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/kube-state-metrics-0"]
Jan 28 21:02:29 crc kubenswrapper[4746]: I0128 21:02:29.547209 4746 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/kube-state-metrics-0" podUID="4aa3c3d3-f7a7-4e26-bf26-630de3cc7a17" containerName="kube-state-metrics" containerID="cri-o://cf6d9052c57bbb311927bde6702b43fb502f545e155aa24d1396694d9c822738" gracePeriod=30
Jan 28 21:02:29 crc kubenswrapper[4746]: I0128 21:02:29.549330 4746 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/d36f2955-7688-4b25-9097-becffcb1f3ad-config-data\") pod \"nova-scheduler-0\" (UID: \"d36f2955-7688-4b25-9097-becffcb1f3ad\") " pod="openstack/nova-scheduler-0"
Jan 28 21:02:29 crc kubenswrapper[4746]: I0128 21:02:29.558489 4746 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-blkmw\" (UniqueName: \"kubernetes.io/projected/d36f2955-7688-4b25-9097-becffcb1f3ad-kube-api-access-blkmw\") pod \"nova-scheduler-0\" (UID: \"d36f2955-7688-4b25-9097-becffcb1f3ad\") " pod="openstack/nova-scheduler-0"
Jan 28 21:02:29 crc kubenswrapper[4746]: I0128 21:02:29.654602 4746 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-scheduler-0"
Jan 28 21:02:30 crc kubenswrapper[4746]: I0128 21:02:30.134752 4746 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-api-0"
Jan 28 21:02:30 crc kubenswrapper[4746]: I0128 21:02:30.157596 4746 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-67g5z\" (UniqueName: \"kubernetes.io/projected/d3ee29df-412f-4e87-9ec2-c0431746e3a0-kube-api-access-67g5z\") pod \"d3ee29df-412f-4e87-9ec2-c0431746e3a0\" (UID: \"d3ee29df-412f-4e87-9ec2-c0431746e3a0\") "
Jan 28 21:02:30 crc kubenswrapper[4746]: I0128 21:02:30.157748 4746 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/d3ee29df-412f-4e87-9ec2-c0431746e3a0-config-data\") pod \"d3ee29df-412f-4e87-9ec2-c0431746e3a0\" (UID: \"d3ee29df-412f-4e87-9ec2-c0431746e3a0\") "
Jan 28 21:02:30 crc kubenswrapper[4746]: I0128 21:02:30.157858 4746 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d3ee29df-412f-4e87-9ec2-c0431746e3a0-combined-ca-bundle\") pod \"d3ee29df-412f-4e87-9ec2-c0431746e3a0\" (UID: \"d3ee29df-412f-4e87-9ec2-c0431746e3a0\") "
Jan 28 21:02:30 crc kubenswrapper[4746]: I0128 21:02:30.157928 4746 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/d3ee29df-412f-4e87-9ec2-c0431746e3a0-logs\") pod \"d3ee29df-412f-4e87-9ec2-c0431746e3a0\" (UID: \"d3ee29df-412f-4e87-9ec2-c0431746e3a0\") "
Jan 28 21:02:30 crc kubenswrapper[4746]: I0128 21:02:30.159386 4746 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/d3ee29df-412f-4e87-9ec2-c0431746e3a0-logs" (OuterVolumeSpecName: "logs") pod "d3ee29df-412f-4e87-9ec2-c0431746e3a0" (UID: "d3ee29df-412f-4e87-9ec2-c0431746e3a0"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Jan 28 21:02:30 crc kubenswrapper[4746]: I0128 21:02:30.171410 4746 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/d3ee29df-412f-4e87-9ec2-c0431746e3a0-kube-api-access-67g5z" (OuterVolumeSpecName: "kube-api-access-67g5z") pod "d3ee29df-412f-4e87-9ec2-c0431746e3a0" (UID: "d3ee29df-412f-4e87-9ec2-c0431746e3a0"). InnerVolumeSpecName "kube-api-access-67g5z". PluginName "kubernetes.io/projected", VolumeGidValue ""
Jan 28 21:02:30 crc kubenswrapper[4746]: I0128 21:02:30.262599 4746 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/d3ee29df-412f-4e87-9ec2-c0431746e3a0-logs\") on node \"crc\" DevicePath \"\""
Jan 28 21:02:30 crc kubenswrapper[4746]: I0128 21:02:30.262665 4746 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-67g5z\" (UniqueName: \"kubernetes.io/projected/d3ee29df-412f-4e87-9ec2-c0431746e3a0-kube-api-access-67g5z\") on node \"crc\" DevicePath \"\""
Jan 28 21:02:30 crc kubenswrapper[4746]: I0128 21:02:30.280284 4746 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/d3ee29df-412f-4e87-9ec2-c0431746e3a0-config-data" (OuterVolumeSpecName: "config-data") pod "d3ee29df-412f-4e87-9ec2-c0431746e3a0" (UID: "d3ee29df-412f-4e87-9ec2-c0431746e3a0"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue ""
Jan 28 21:02:30 crc kubenswrapper[4746]: I0128 21:02:30.317337 4746 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/d3ee29df-412f-4e87-9ec2-c0431746e3a0-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "d3ee29df-412f-4e87-9ec2-c0431746e3a0" (UID: "d3ee29df-412f-4e87-9ec2-c0431746e3a0"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue ""
Jan 28 21:02:30 crc kubenswrapper[4746]: I0128 21:02:30.325422 4746 generic.go:334] "Generic (PLEG): container finished" podID="d3ee29df-412f-4e87-9ec2-c0431746e3a0" containerID="65c7f70deddcaf9800c9f9004a3e7977ee14a93748ec078ec67bac98d78e0dac" exitCode=0
Jan 28 21:02:30 crc kubenswrapper[4746]: I0128 21:02:30.325616 4746 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-api-0"
Jan 28 21:02:30 crc kubenswrapper[4746]: I0128 21:02:30.326771 4746 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"d3ee29df-412f-4e87-9ec2-c0431746e3a0","Type":"ContainerDied","Data":"65c7f70deddcaf9800c9f9004a3e7977ee14a93748ec078ec67bac98d78e0dac"}
Jan 28 21:02:30 crc kubenswrapper[4746]: I0128 21:02:30.326804 4746 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"d3ee29df-412f-4e87-9ec2-c0431746e3a0","Type":"ContainerDied","Data":"f03dd96af88bd834bf7a56df89f95eb22b540b72ba526f46d4e5e4aac10baa06"}
Jan 28 21:02:30 crc kubenswrapper[4746]: I0128 21:02:30.326821 4746 scope.go:117] "RemoveContainer" containerID="65c7f70deddcaf9800c9f9004a3e7977ee14a93748ec078ec67bac98d78e0dac"
Jan 28 21:02:30 crc kubenswrapper[4746]: I0128 21:02:30.344288 4746 generic.go:334] "Generic (PLEG): container finished" podID="9426df49-b969-4890-8ea2-d37f2312e5e3" containerID="525b216bf9c8a170b6745fa7879f9e6f31010e3a295beb89016f0788129848e2" exitCode=0
Jan 28 21:02:30 crc kubenswrapper[4746]: I0128 21:02:30.344376 4746 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"9426df49-b969-4890-8ea2-d37f2312e5e3","Type":"ContainerDied","Data":"525b216bf9c8a170b6745fa7879f9e6f31010e3a295beb89016f0788129848e2"}
Jan 28 21:02:30 crc kubenswrapper[4746]: I0128 21:02:30.346129 4746 generic.go:334] "Generic (PLEG): container finished" podID="4aa3c3d3-f7a7-4e26-bf26-630de3cc7a17" containerID="cf6d9052c57bbb311927bde6702b43fb502f545e155aa24d1396694d9c822738" exitCode=2
Jan 28 21:02:30 crc kubenswrapper[4746]: I0128 21:02:30.346163 4746 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/kube-state-metrics-0" event={"ID":"4aa3c3d3-f7a7-4e26-bf26-630de3cc7a17","Type":"ContainerDied","Data":"cf6d9052c57bbb311927bde6702b43fb502f545e155aa24d1396694d9c822738"}
Jan 28 21:02:30 crc kubenswrapper[4746]: I0128 21:02:30.366071 4746 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/d3ee29df-412f-4e87-9ec2-c0431746e3a0-config-data\") on node \"crc\" DevicePath \"\""
Jan 28 21:02:30 crc kubenswrapper[4746]: I0128 21:02:30.366136 4746 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d3ee29df-412f-4e87-9ec2-c0431746e3a0-combined-ca-bundle\") on node \"crc\" DevicePath \"\""
Jan 28 21:02:30 crc kubenswrapper[4746]: I0128 21:02:30.370936 4746 scope.go:117] "RemoveContainer" containerID="2ecc9e969b4f94bb854b25b0bb46edb565ee8c8e19506d10e5b48e34cb27f53f"
Jan 28 21:02:30 crc kubenswrapper[4746]: I0128 21:02:30.397145 4746 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-api-0"]
Jan 28 21:02:30 crc kubenswrapper[4746]: I0128 21:02:30.401317 4746 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-api-0"]
Jan 28 21:02:30 crc kubenswrapper[4746]: I0128 21:02:30.419347 4746 scope.go:117] "RemoveContainer" containerID="65c7f70deddcaf9800c9f9004a3e7977ee14a93748ec078ec67bac98d78e0dac"
Jan 28 21:02:30 crc kubenswrapper[4746]: E0128 21:02:30.420561 4746 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"65c7f70deddcaf9800c9f9004a3e7977ee14a93748ec078ec67bac98d78e0dac\": container with ID starting with 65c7f70deddcaf9800c9f9004a3e7977ee14a93748ec078ec67bac98d78e0dac not found: ID does not exist" containerID="65c7f70deddcaf9800c9f9004a3e7977ee14a93748ec078ec67bac98d78e0dac"
Jan 28 21:02:30 crc kubenswrapper[4746]: I0128 21:02:30.420598 4746 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"65c7f70deddcaf9800c9f9004a3e7977ee14a93748ec078ec67bac98d78e0dac"} err="failed to get container status \"65c7f70deddcaf9800c9f9004a3e7977ee14a93748ec078ec67bac98d78e0dac\": rpc error: code = NotFound desc = could not find container \"65c7f70deddcaf9800c9f9004a3e7977ee14a93748ec078ec67bac98d78e0dac\": container with ID starting with 65c7f70deddcaf9800c9f9004a3e7977ee14a93748ec078ec67bac98d78e0dac not found: ID does not exist"
Jan 28 21:02:30 crc kubenswrapper[4746]: I0128 21:02:30.420623 4746 scope.go:117] "RemoveContainer" containerID="2ecc9e969b4f94bb854b25b0bb46edb565ee8c8e19506d10e5b48e34cb27f53f"
Jan 28 21:02:30 crc kubenswrapper[4746]: E0128 21:02:30.421655 4746 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"2ecc9e969b4f94bb854b25b0bb46edb565ee8c8e19506d10e5b48e34cb27f53f\": container with ID starting with 2ecc9e969b4f94bb854b25b0bb46edb565ee8c8e19506d10e5b48e34cb27f53f not found: ID does not exist" containerID="2ecc9e969b4f94bb854b25b0bb46edb565ee8c8e19506d10e5b48e34cb27f53f"
Jan 28 21:02:30 crc kubenswrapper[4746]: I0128 21:02:30.421693 4746 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"2ecc9e969b4f94bb854b25b0bb46edb565ee8c8e19506d10e5b48e34cb27f53f"} err="failed to get container status \"2ecc9e969b4f94bb854b25b0bb46edb565ee8c8e19506d10e5b48e34cb27f53f\": rpc error: code = NotFound desc = could not find container \"2ecc9e969b4f94bb854b25b0bb46edb565ee8c8e19506d10e5b48e34cb27f53f\": container with ID starting with 2ecc9e969b4f94bb854b25b0bb46edb565ee8c8e19506d10e5b48e34cb27f53f not found: ID does not exist"
Jan 28 21:02:30 crc kubenswrapper[4746]: I0128 21:02:30.459562 4746 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/kube-state-metrics-0"
Jan 28 21:02:30 crc kubenswrapper[4746]: I0128 21:02:30.483749 4746 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-api-0"]
Jan 28 21:02:30 crc kubenswrapper[4746]: E0128 21:02:30.484266 4746 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="d3ee29df-412f-4e87-9ec2-c0431746e3a0" containerName="nova-api-log"
Jan 28 21:02:30 crc kubenswrapper[4746]: I0128 21:02:30.484278 4746 state_mem.go:107] "Deleted CPUSet assignment" podUID="d3ee29df-412f-4e87-9ec2-c0431746e3a0" containerName="nova-api-log"
Jan 28 21:02:30 crc kubenswrapper[4746]: E0128 21:02:30.484293 4746 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="4aa3c3d3-f7a7-4e26-bf26-630de3cc7a17" containerName="kube-state-metrics"
Jan 28 21:02:30 crc kubenswrapper[4746]: I0128 21:02:30.484299 4746 state_mem.go:107] "Deleted CPUSet assignment" podUID="4aa3c3d3-f7a7-4e26-bf26-630de3cc7a17" containerName="kube-state-metrics"
Jan 28 21:02:30 crc kubenswrapper[4746]: E0128 21:02:30.484308 4746 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="d3ee29df-412f-4e87-9ec2-c0431746e3a0" containerName="nova-api-api"
Jan 28 21:02:30 crc kubenswrapper[4746]: I0128 21:02:30.484315 4746 state_mem.go:107] "Deleted CPUSet assignment" podUID="d3ee29df-412f-4e87-9ec2-c0431746e3a0" containerName="nova-api-api"
Jan 28 21:02:30 crc kubenswrapper[4746]: I0128 21:02:30.484532 4746 memory_manager.go:354] "RemoveStaleState removing state" podUID="d3ee29df-412f-4e87-9ec2-c0431746e3a0" containerName="nova-api-log"
Jan 28 21:02:30 crc kubenswrapper[4746]: I0128 21:02:30.484548 4746 memory_manager.go:354] "RemoveStaleState removing state" podUID="4aa3c3d3-f7a7-4e26-bf26-630de3cc7a17" containerName="kube-state-metrics"
Jan 28 21:02:30 crc kubenswrapper[4746]: I0128 21:02:30.484566 4746 memory_manager.go:354] "RemoveStaleState removing state" podUID="d3ee29df-412f-4e87-9ec2-c0431746e3a0" containerName="nova-api-api"
Jan 28 21:02:30 crc kubenswrapper[4746]: I0128 21:02:30.485744 4746 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-api-0"
Jan 28 21:02:30 crc kubenswrapper[4746]: I0128 21:02:30.490112 4746 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-api-config-data"
Jan 28 21:02:30 crc kubenswrapper[4746]: I0128 21:02:30.509913 4746 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-api-0"]
Jan 28 21:02:30 crc kubenswrapper[4746]: I0128 21:02:30.593035 4746 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-ksj4w\" (UniqueName: \"kubernetes.io/projected/4aa3c3d3-f7a7-4e26-bf26-630de3cc7a17-kube-api-access-ksj4w\") pod \"4aa3c3d3-f7a7-4e26-bf26-630de3cc7a17\" (UID: \"4aa3c3d3-f7a7-4e26-bf26-630de3cc7a17\") "
Jan 28 21:02:30 crc kubenswrapper[4746]: I0128 21:02:30.593443 4746 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/e3ee42d3-e89f-4e39-b37a-4f01000355a4-logs\") pod \"nova-api-0\" (UID: \"e3ee42d3-e89f-4e39-b37a-4f01000355a4\") " pod="openstack/nova-api-0"
Jan 28 21:02:30 crc kubenswrapper[4746]: I0128 21:02:30.593554 4746 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-rlxcg\" (UniqueName: \"kubernetes.io/projected/e3ee42d3-e89f-4e39-b37a-4f01000355a4-kube-api-access-rlxcg\") pod \"nova-api-0\" (UID: \"e3ee42d3-e89f-4e39-b37a-4f01000355a4\") " pod="openstack/nova-api-0"
Jan 28 21:02:30 crc kubenswrapper[4746]: I0128 21:02:30.593637 4746 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/e3ee42d3-e89f-4e39-b37a-4f01000355a4-config-data\") pod \"nova-api-0\" (UID: \"e3ee42d3-e89f-4e39-b37a-4f01000355a4\") " pod="openstack/nova-api-0"
Jan 28 21:02:30 crc kubenswrapper[4746]: I0128 21:02:30.593670 4746 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e3ee42d3-e89f-4e39-b37a-4f01000355a4-combined-ca-bundle\") pod \"nova-api-0\" (UID: \"e3ee42d3-e89f-4e39-b37a-4f01000355a4\") " pod="openstack/nova-api-0"
Jan 28 21:02:30 crc kubenswrapper[4746]: I0128 21:02:30.604460 4746 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/4aa3c3d3-f7a7-4e26-bf26-630de3cc7a17-kube-api-access-ksj4w" (OuterVolumeSpecName: "kube-api-access-ksj4w") pod "4aa3c3d3-f7a7-4e26-bf26-630de3cc7a17" (UID: "4aa3c3d3-f7a7-4e26-bf26-630de3cc7a17"). InnerVolumeSpecName "kube-api-access-ksj4w". PluginName "kubernetes.io/projected", VolumeGidValue ""
Jan 28 21:02:30 crc kubenswrapper[4746]: I0128 21:02:30.699070 4746 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/e3ee42d3-e89f-4e39-b37a-4f01000355a4-config-data\") pod \"nova-api-0\" (UID: \"e3ee42d3-e89f-4e39-b37a-4f01000355a4\") " pod="openstack/nova-api-0"
Jan 28 21:02:30 crc kubenswrapper[4746]: I0128 21:02:30.699148 4746 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e3ee42d3-e89f-4e39-b37a-4f01000355a4-combined-ca-bundle\") pod \"nova-api-0\" (UID: \"e3ee42d3-e89f-4e39-b37a-4f01000355a4\") " pod="openstack/nova-api-0"
Jan 28 21:02:30 crc kubenswrapper[4746]: I0128 21:02:30.699174 4746 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/e3ee42d3-e89f-4e39-b37a-4f01000355a4-logs\") pod \"nova-api-0\" (UID: \"e3ee42d3-e89f-4e39-b37a-4f01000355a4\") " pod="openstack/nova-api-0"
Jan 28 21:02:30 crc kubenswrapper[4746]: I0128 21:02:30.699260 4746 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-rlxcg\" (UniqueName: \"kubernetes.io/projected/e3ee42d3-e89f-4e39-b37a-4f01000355a4-kube-api-access-rlxcg\") pod \"nova-api-0\" (UID: \"e3ee42d3-e89f-4e39-b37a-4f01000355a4\") " pod="openstack/nova-api-0"
Jan 28 21:02:30 crc kubenswrapper[4746]: I0128 21:02:30.699337 4746 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-ksj4w\" (UniqueName: \"kubernetes.io/projected/4aa3c3d3-f7a7-4e26-bf26-630de3cc7a17-kube-api-access-ksj4w\") on node \"crc\" DevicePath \"\""
Jan 28 21:02:30 crc kubenswrapper[4746]: I0128 21:02:30.704868 4746 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/e3ee42d3-e89f-4e39-b37a-4f01000355a4-logs\") pod \"nova-api-0\" (UID: \"e3ee42d3-e89f-4e39-b37a-4f01000355a4\") " pod="openstack/nova-api-0"
Jan 28 21:02:30 crc kubenswrapper[4746]: I0128 21:02:30.705208 4746 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/e3ee42d3-e89f-4e39-b37a-4f01000355a4-config-data\") pod \"nova-api-0\" (UID: \"e3ee42d3-e89f-4e39-b37a-4f01000355a4\") " pod="openstack/nova-api-0"
Jan 28 21:02:30 crc kubenswrapper[4746]: I0128 21:02:30.707796 4746 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e3ee42d3-e89f-4e39-b37a-4f01000355a4-combined-ca-bundle\") pod \"nova-api-0\" (UID: \"e3ee42d3-e89f-4e39-b37a-4f01000355a4\") " pod="openstack/nova-api-0"
Jan 28 21:02:30 crc kubenswrapper[4746]: I0128 21:02:30.725723 4746 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-rlxcg\" (UniqueName: \"kubernetes.io/projected/e3ee42d3-e89f-4e39-b37a-4f01000355a4-kube-api-access-rlxcg\") pod \"nova-api-0\" (UID: \"e3ee42d3-e89f-4e39-b37a-4f01000355a4\") " pod="openstack/nova-api-0"
Jan 28 21:02:30 crc kubenswrapper[4746]: I0128 21:02:30.730324 4746 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-scheduler-0"]
Jan 28 21:02:30 crc kubenswrapper[4746]: I0128 21:02:30.814173 4746 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-marketplace/redhat-operators-7826h" podUID="04eb3dcd-859e-4fad-829e-c24bf2c954b4" containerName="registry-server" probeResult="failure" output=<
Jan 28 21:02:30 crc kubenswrapper[4746]: timeout: failed to connect service ":50051" within 1s
Jan 28 21:02:30 crc kubenswrapper[4746]: >
Jan 28 21:02:30 crc kubenswrapper[4746]: I0128 21:02:30.822411 4746 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-api-0"
Jan 28 21:02:30 crc kubenswrapper[4746]: I0128 21:02:30.849208 4746 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="44b99e27-4d9c-4167-a68d-ceb2e627bb95" path="/var/lib/kubelet/pods/44b99e27-4d9c-4167-a68d-ceb2e627bb95/volumes"
Jan 28 21:02:30 crc kubenswrapper[4746]: I0128 21:02:30.849824 4746 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="d3ee29df-412f-4e87-9ec2-c0431746e3a0" path="/var/lib/kubelet/pods/d3ee29df-412f-4e87-9ec2-c0431746e3a0/volumes"
Jan 28 21:02:30 crc kubenswrapper[4746]: I0128 21:02:30.928849 4746 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-metadata-0"
Jan 28 21:02:31 crc kubenswrapper[4746]: I0128 21:02:31.006833 4746 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/9426df49-b969-4890-8ea2-d37f2312e5e3-config-data\") pod \"9426df49-b969-4890-8ea2-d37f2312e5e3\" (UID: \"9426df49-b969-4890-8ea2-d37f2312e5e3\") "
Jan 28 21:02:31 crc kubenswrapper[4746]: I0128 21:02:31.007287 4746 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/9426df49-b969-4890-8ea2-d37f2312e5e3-combined-ca-bundle\") pod \"9426df49-b969-4890-8ea2-d37f2312e5e3\" (UID: \"9426df49-b969-4890-8ea2-d37f2312e5e3\") "
Jan 28 21:02:31 crc kubenswrapper[4746]: I0128 21:02:31.007409 4746 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-tdxkd\" (UniqueName: \"kubernetes.io/projected/9426df49-b969-4890-8ea2-d37f2312e5e3-kube-api-access-tdxkd\") pod \"9426df49-b969-4890-8ea2-d37f2312e5e3\" (UID: \"9426df49-b969-4890-8ea2-d37f2312e5e3\") "
Jan 28 21:02:31 crc kubenswrapper[4746]: I0128 21:02:31.007526 4746 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/9426df49-b969-4890-8ea2-d37f2312e5e3-logs\") pod \"9426df49-b969-4890-8ea2-d37f2312e5e3\" (UID: \"9426df49-b969-4890-8ea2-d37f2312e5e3\") "
Jan 28 21:02:31 crc kubenswrapper[4746]: I0128 21:02:31.007591 4746 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"nova-metadata-tls-certs\" (UniqueName: \"kubernetes.io/secret/9426df49-b969-4890-8ea2-d37f2312e5e3-nova-metadata-tls-certs\") pod \"9426df49-b969-4890-8ea2-d37f2312e5e3\" (UID: \"9426df49-b969-4890-8ea2-d37f2312e5e3\") "
Jan 28 21:02:31 crc kubenswrapper[4746]: I0128 21:02:31.008435 4746 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/9426df49-b969-4890-8ea2-d37f2312e5e3-logs" (OuterVolumeSpecName: "logs") pod "9426df49-b969-4890-8ea2-d37f2312e5e3" (UID: "9426df49-b969-4890-8ea2-d37f2312e5e3"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Jan 28 21:02:31 crc kubenswrapper[4746]: I0128 21:02:31.014602 4746 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/9426df49-b969-4890-8ea2-d37f2312e5e3-kube-api-access-tdxkd" (OuterVolumeSpecName: "kube-api-access-tdxkd") pod "9426df49-b969-4890-8ea2-d37f2312e5e3" (UID: "9426df49-b969-4890-8ea2-d37f2312e5e3"). InnerVolumeSpecName "kube-api-access-tdxkd". PluginName "kubernetes.io/projected", VolumeGidValue ""
Jan 28 21:02:31 crc kubenswrapper[4746]: I0128 21:02:31.061933 4746 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/9426df49-b969-4890-8ea2-d37f2312e5e3-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "9426df49-b969-4890-8ea2-d37f2312e5e3" (UID: "9426df49-b969-4890-8ea2-d37f2312e5e3"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue ""
Jan 28 21:02:31 crc kubenswrapper[4746]: I0128 21:02:31.072073 4746 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/9426df49-b969-4890-8ea2-d37f2312e5e3-config-data" (OuterVolumeSpecName: "config-data") pod "9426df49-b969-4890-8ea2-d37f2312e5e3" (UID: "9426df49-b969-4890-8ea2-d37f2312e5e3"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue ""
Jan 28 21:02:31 crc kubenswrapper[4746]: I0128 21:02:31.082176 4746 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/9426df49-b969-4890-8ea2-d37f2312e5e3-nova-metadata-tls-certs" (OuterVolumeSpecName: "nova-metadata-tls-certs") pod "9426df49-b969-4890-8ea2-d37f2312e5e3" (UID: "9426df49-b969-4890-8ea2-d37f2312e5e3"). InnerVolumeSpecName "nova-metadata-tls-certs". PluginName "kubernetes.io/secret", VolumeGidValue ""
Jan 28 21:02:31 crc kubenswrapper[4746]: I0128 21:02:31.109985 4746 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-tdxkd\" (UniqueName: \"kubernetes.io/projected/9426df49-b969-4890-8ea2-d37f2312e5e3-kube-api-access-tdxkd\") on node \"crc\" DevicePath \"\""
Jan 28 21:02:31 crc kubenswrapper[4746]: I0128 21:02:31.110021 4746 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/9426df49-b969-4890-8ea2-d37f2312e5e3-logs\") on node \"crc\" DevicePath \"\""
Jan 28 21:02:31 crc kubenswrapper[4746]: I0128 21:02:31.110030 4746 reconciler_common.go:293] "Volume detached for volume \"nova-metadata-tls-certs\" (UniqueName: \"kubernetes.io/secret/9426df49-b969-4890-8ea2-d37f2312e5e3-nova-metadata-tls-certs\") on node \"crc\" DevicePath \"\""
Jan 28 21:02:31 crc kubenswrapper[4746]: I0128 21:02:31.110038 4746 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/9426df49-b969-4890-8ea2-d37f2312e5e3-config-data\") on node \"crc\" DevicePath \"\""
Jan 28 21:02:31 crc kubenswrapper[4746]: I0128 21:02:31.110048 4746 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/9426df49-b969-4890-8ea2-d37f2312e5e3-combined-ca-bundle\") on node \"crc\" DevicePath \"\""
Jan 28 21:02:31 crc kubenswrapper[4746]: I0128 21:02:31.334421 4746 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-api-0"]
Jan 28 21:02:31 crc kubenswrapper[4746]: W0128 21:02:31.335242 4746 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pode3ee42d3_e89f_4e39_b37a_4f01000355a4.slice/crio-a051b19a53544dc58685c3699fab406c84df579e296324343a88a2c38f1b366c WatchSource:0}: Error finding container a051b19a53544dc58685c3699fab406c84df579e296324343a88a2c38f1b366c: Status 404 returned error can't find the container with id a051b19a53544dc58685c3699fab406c84df579e296324343a88a2c38f1b366c
Jan 28 21:02:31 crc kubenswrapper[4746]: I0128 21:02:31.383437 4746 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/kube-state-metrics-0" event={"ID":"4aa3c3d3-f7a7-4e26-bf26-630de3cc7a17","Type":"ContainerDied","Data":"bf44be3c0234f66aaa68f27d0dbf90d6605b576dcc3ac644a36e0b8b3263f898"}
Jan 28 21:02:31 crc kubenswrapper[4746]: I0128 21:02:31.383544 4746 scope.go:117] "RemoveContainer" containerID="cf6d9052c57bbb311927bde6702b43fb502f545e155aa24d1396694d9c822738"
Jan 28 21:02:31 crc kubenswrapper[4746]: I0128 21:02:31.383463 4746 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/kube-state-metrics-0"
Jan 28 21:02:31 crc kubenswrapper[4746]: I0128 21:02:31.388956 4746 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"e3ee42d3-e89f-4e39-b37a-4f01000355a4","Type":"ContainerStarted","Data":"a051b19a53544dc58685c3699fab406c84df579e296324343a88a2c38f1b366c"}
Jan 28 21:02:31 crc kubenswrapper[4746]: I0128 21:02:31.392305 4746 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-scheduler-0" event={"ID":"d36f2955-7688-4b25-9097-becffcb1f3ad","Type":"ContainerStarted","Data":"3a1a145b03eb61ffe86d3384a4986b535b6f506650ef5dcd8a761df1cd17fb01"}
Jan 28 21:02:31 crc kubenswrapper[4746]: I0128 21:02:31.392355 4746 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-scheduler-0" event={"ID":"d36f2955-7688-4b25-9097-becffcb1f3ad","Type":"ContainerStarted","Data":"1da180cac8ca0ddc76de55491caeffbf942662c4bf7aadeb28566654af29361f"}
Jan 28 21:02:31 crc kubenswrapper[4746]: I0128 21:02:31.396440 4746 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"9426df49-b969-4890-8ea2-d37f2312e5e3","Type":"ContainerDied","Data":"8ee002287db0e8c07208dd73a2a54e545acbbc3037d6d025ec41d5ffd58339d7"}
Jan 28 21:02:31 crc kubenswrapper[4746]: I0128 21:02:31.396487 4746 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-metadata-0"
Jan 28 21:02:31 crc kubenswrapper[4746]: I0128 21:02:31.419163 4746 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/kube-state-metrics-0"]
Jan 28 21:02:31 crc kubenswrapper[4746]: I0128 21:02:31.438009 4746 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/kube-state-metrics-0"]
Jan 28 21:02:31 crc kubenswrapper[4746]: I0128 21:02:31.445848 4746 scope.go:117] "RemoveContainer" containerID="525b216bf9c8a170b6745fa7879f9e6f31010e3a295beb89016f0788129848e2"
Jan 28 21:02:31 crc kubenswrapper[4746]: I0128 21:02:31.450005 4746 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/kube-state-metrics-0"]
Jan 28 21:02:31 crc kubenswrapper[4746]: E0128 21:02:31.450527 4746 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="9426df49-b969-4890-8ea2-d37f2312e5e3" containerName="nova-metadata-log"
Jan 28 21:02:31 crc kubenswrapper[4746]: I0128 21:02:31.450561 4746 state_mem.go:107] "Deleted CPUSet assignment" podUID="9426df49-b969-4890-8ea2-d37f2312e5e3" containerName="nova-metadata-log"
Jan 28 21:02:31 crc kubenswrapper[4746]: E0128 21:02:31.450596 4746 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="9426df49-b969-4890-8ea2-d37f2312e5e3" containerName="nova-metadata-metadata"
Jan 28 21:02:31 crc kubenswrapper[4746]: I0128 21:02:31.450604 4746 state_mem.go:107] "Deleted CPUSet assignment" podUID="9426df49-b969-4890-8ea2-d37f2312e5e3" containerName="nova-metadata-metadata"
Jan 28 21:02:31 crc kubenswrapper[4746]: I0128 21:02:31.450849 4746 memory_manager.go:354] "RemoveStaleState removing state" podUID="9426df49-b969-4890-8ea2-d37f2312e5e3" containerName="nova-metadata-metadata"
Jan 28
21:02:31 crc kubenswrapper[4746]: I0128 21:02:31.450874 4746 memory_manager.go:354] "RemoveStaleState removing state" podUID="9426df49-b969-4890-8ea2-d37f2312e5e3" containerName="nova-metadata-log" Jan 28 21:02:31 crc kubenswrapper[4746]: I0128 21:02:31.451613 4746 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-scheduler-0" podStartSLOduration=2.451601879 podStartE2EDuration="2.451601879s" podCreationTimestamp="2026-01-28 21:02:29 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-28 21:02:31.431543929 +0000 UTC m=+1379.387730283" watchObservedRunningTime="2026-01-28 21:02:31.451601879 +0000 UTC m=+1379.407788233" Jan 28 21:02:31 crc kubenswrapper[4746]: I0128 21:02:31.451784 4746 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/kube-state-metrics-0" Jan 28 21:02:31 crc kubenswrapper[4746]: I0128 21:02:31.455004 4746 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-kube-state-metrics-svc" Jan 28 21:02:31 crc kubenswrapper[4746]: I0128 21:02:31.455116 4746 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"kube-state-metrics-tls-config" Jan 28 21:02:31 crc kubenswrapper[4746]: I0128 21:02:31.479907 4746 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/kube-state-metrics-0"] Jan 28 21:02:31 crc kubenswrapper[4746]: I0128 21:02:31.494199 4746 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-metadata-0"] Jan 28 21:02:31 crc kubenswrapper[4746]: I0128 21:02:31.508830 4746 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-metadata-0"] Jan 28 21:02:31 crc kubenswrapper[4746]: I0128 21:02:31.524969 4746 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-fwptk\" (UniqueName: 
\"kubernetes.io/projected/62270f68-89c1-462f-8aac-c4944f92cc3f-kube-api-access-fwptk\") pod \"kube-state-metrics-0\" (UID: \"62270f68-89c1-462f-8aac-c4944f92cc3f\") " pod="openstack/kube-state-metrics-0" Jan 28 21:02:31 crc kubenswrapper[4746]: I0128 21:02:31.525098 4746 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/62270f68-89c1-462f-8aac-c4944f92cc3f-combined-ca-bundle\") pod \"kube-state-metrics-0\" (UID: \"62270f68-89c1-462f-8aac-c4944f92cc3f\") " pod="openstack/kube-state-metrics-0" Jan 28 21:02:31 crc kubenswrapper[4746]: I0128 21:02:31.525236 4746 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-state-metrics-tls-config\" (UniqueName: \"kubernetes.io/secret/62270f68-89c1-462f-8aac-c4944f92cc3f-kube-state-metrics-tls-config\") pod \"kube-state-metrics-0\" (UID: \"62270f68-89c1-462f-8aac-c4944f92cc3f\") " pod="openstack/kube-state-metrics-0" Jan 28 21:02:31 crc kubenswrapper[4746]: I0128 21:02:31.525923 4746 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-state-metrics-tls-certs\" (UniqueName: \"kubernetes.io/secret/62270f68-89c1-462f-8aac-c4944f92cc3f-kube-state-metrics-tls-certs\") pod \"kube-state-metrics-0\" (UID: \"62270f68-89c1-462f-8aac-c4944f92cc3f\") " pod="openstack/kube-state-metrics-0" Jan 28 21:02:31 crc kubenswrapper[4746]: I0128 21:02:31.526533 4746 scope.go:117] "RemoveContainer" containerID="2d297a0d0f186621ca9c2e154b2d393b199f82be155c05fd32d8e2c3e95bddbf" Jan 28 21:02:31 crc kubenswrapper[4746]: I0128 21:02:31.540146 4746 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-metadata-0"] Jan 28 21:02:31 crc kubenswrapper[4746]: I0128 21:02:31.542014 4746 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-metadata-0" Jan 28 21:02:31 crc kubenswrapper[4746]: I0128 21:02:31.544593 4746 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-nova-metadata-internal-svc" Jan 28 21:02:31 crc kubenswrapper[4746]: I0128 21:02:31.545008 4746 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-metadata-config-data" Jan 28 21:02:31 crc kubenswrapper[4746]: I0128 21:02:31.581961 4746 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-metadata-0"] Jan 28 21:02:31 crc kubenswrapper[4746]: I0128 21:02:31.627708 4746 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-state-metrics-tls-config\" (UniqueName: \"kubernetes.io/secret/62270f68-89c1-462f-8aac-c4944f92cc3f-kube-state-metrics-tls-config\") pod \"kube-state-metrics-0\" (UID: \"62270f68-89c1-462f-8aac-c4944f92cc3f\") " pod="openstack/kube-state-metrics-0" Jan 28 21:02:31 crc kubenswrapper[4746]: I0128 21:02:31.627815 4746 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-ll6tq\" (UniqueName: \"kubernetes.io/projected/305b9f70-9fd4-4f7c-a4c5-dc46a63ebbc0-kube-api-access-ll6tq\") pod \"nova-metadata-0\" (UID: \"305b9f70-9fd4-4f7c-a4c5-dc46a63ebbc0\") " pod="openstack/nova-metadata-0" Jan 28 21:02:31 crc kubenswrapper[4746]: I0128 21:02:31.627841 4746 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"nova-metadata-tls-certs\" (UniqueName: \"kubernetes.io/secret/305b9f70-9fd4-4f7c-a4c5-dc46a63ebbc0-nova-metadata-tls-certs\") pod \"nova-metadata-0\" (UID: \"305b9f70-9fd4-4f7c-a4c5-dc46a63ebbc0\") " pod="openstack/nova-metadata-0" Jan 28 21:02:31 crc kubenswrapper[4746]: I0128 21:02:31.627868 4746 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-state-metrics-tls-certs\" (UniqueName: 
\"kubernetes.io/secret/62270f68-89c1-462f-8aac-c4944f92cc3f-kube-state-metrics-tls-certs\") pod \"kube-state-metrics-0\" (UID: \"62270f68-89c1-462f-8aac-c4944f92cc3f\") " pod="openstack/kube-state-metrics-0" Jan 28 21:02:31 crc kubenswrapper[4746]: I0128 21:02:31.627917 4746 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/305b9f70-9fd4-4f7c-a4c5-dc46a63ebbc0-logs\") pod \"nova-metadata-0\" (UID: \"305b9f70-9fd4-4f7c-a4c5-dc46a63ebbc0\") " pod="openstack/nova-metadata-0" Jan 28 21:02:31 crc kubenswrapper[4746]: I0128 21:02:31.627963 4746 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-fwptk\" (UniqueName: \"kubernetes.io/projected/62270f68-89c1-462f-8aac-c4944f92cc3f-kube-api-access-fwptk\") pod \"kube-state-metrics-0\" (UID: \"62270f68-89c1-462f-8aac-c4944f92cc3f\") " pod="openstack/kube-state-metrics-0" Jan 28 21:02:31 crc kubenswrapper[4746]: I0128 21:02:31.628013 4746 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/62270f68-89c1-462f-8aac-c4944f92cc3f-combined-ca-bundle\") pod \"kube-state-metrics-0\" (UID: \"62270f68-89c1-462f-8aac-c4944f92cc3f\") " pod="openstack/kube-state-metrics-0" Jan 28 21:02:31 crc kubenswrapper[4746]: I0128 21:02:31.628048 4746 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/305b9f70-9fd4-4f7c-a4c5-dc46a63ebbc0-combined-ca-bundle\") pod \"nova-metadata-0\" (UID: \"305b9f70-9fd4-4f7c-a4c5-dc46a63ebbc0\") " pod="openstack/nova-metadata-0" Jan 28 21:02:31 crc kubenswrapper[4746]: I0128 21:02:31.628101 4746 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/305b9f70-9fd4-4f7c-a4c5-dc46a63ebbc0-config-data\") pod 
\"nova-metadata-0\" (UID: \"305b9f70-9fd4-4f7c-a4c5-dc46a63ebbc0\") " pod="openstack/nova-metadata-0" Jan 28 21:02:31 crc kubenswrapper[4746]: I0128 21:02:31.633741 4746 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-state-metrics-tls-config\" (UniqueName: \"kubernetes.io/secret/62270f68-89c1-462f-8aac-c4944f92cc3f-kube-state-metrics-tls-config\") pod \"kube-state-metrics-0\" (UID: \"62270f68-89c1-462f-8aac-c4944f92cc3f\") " pod="openstack/kube-state-metrics-0" Jan 28 21:02:31 crc kubenswrapper[4746]: I0128 21:02:31.633931 4746 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-state-metrics-tls-certs\" (UniqueName: \"kubernetes.io/secret/62270f68-89c1-462f-8aac-c4944f92cc3f-kube-state-metrics-tls-certs\") pod \"kube-state-metrics-0\" (UID: \"62270f68-89c1-462f-8aac-c4944f92cc3f\") " pod="openstack/kube-state-metrics-0" Jan 28 21:02:31 crc kubenswrapper[4746]: I0128 21:02:31.637522 4746 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/62270f68-89c1-462f-8aac-c4944f92cc3f-combined-ca-bundle\") pod \"kube-state-metrics-0\" (UID: \"62270f68-89c1-462f-8aac-c4944f92cc3f\") " pod="openstack/kube-state-metrics-0" Jan 28 21:02:31 crc kubenswrapper[4746]: I0128 21:02:31.659889 4746 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-fwptk\" (UniqueName: \"kubernetes.io/projected/62270f68-89c1-462f-8aac-c4944f92cc3f-kube-api-access-fwptk\") pod \"kube-state-metrics-0\" (UID: \"62270f68-89c1-462f-8aac-c4944f92cc3f\") " pod="openstack/kube-state-metrics-0" Jan 28 21:02:31 crc kubenswrapper[4746]: I0128 21:02:31.733383 4746 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-ll6tq\" (UniqueName: \"kubernetes.io/projected/305b9f70-9fd4-4f7c-a4c5-dc46a63ebbc0-kube-api-access-ll6tq\") pod \"nova-metadata-0\" (UID: \"305b9f70-9fd4-4f7c-a4c5-dc46a63ebbc0\") " 
pod="openstack/nova-metadata-0" Jan 28 21:02:31 crc kubenswrapper[4746]: I0128 21:02:31.733443 4746 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nova-metadata-tls-certs\" (UniqueName: \"kubernetes.io/secret/305b9f70-9fd4-4f7c-a4c5-dc46a63ebbc0-nova-metadata-tls-certs\") pod \"nova-metadata-0\" (UID: \"305b9f70-9fd4-4f7c-a4c5-dc46a63ebbc0\") " pod="openstack/nova-metadata-0" Jan 28 21:02:31 crc kubenswrapper[4746]: I0128 21:02:31.733529 4746 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/305b9f70-9fd4-4f7c-a4c5-dc46a63ebbc0-logs\") pod \"nova-metadata-0\" (UID: \"305b9f70-9fd4-4f7c-a4c5-dc46a63ebbc0\") " pod="openstack/nova-metadata-0" Jan 28 21:02:31 crc kubenswrapper[4746]: I0128 21:02:31.733618 4746 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/305b9f70-9fd4-4f7c-a4c5-dc46a63ebbc0-combined-ca-bundle\") pod \"nova-metadata-0\" (UID: \"305b9f70-9fd4-4f7c-a4c5-dc46a63ebbc0\") " pod="openstack/nova-metadata-0" Jan 28 21:02:31 crc kubenswrapper[4746]: I0128 21:02:31.733653 4746 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/305b9f70-9fd4-4f7c-a4c5-dc46a63ebbc0-config-data\") pod \"nova-metadata-0\" (UID: \"305b9f70-9fd4-4f7c-a4c5-dc46a63ebbc0\") " pod="openstack/nova-metadata-0" Jan 28 21:02:31 crc kubenswrapper[4746]: I0128 21:02:31.734831 4746 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/305b9f70-9fd4-4f7c-a4c5-dc46a63ebbc0-logs\") pod \"nova-metadata-0\" (UID: \"305b9f70-9fd4-4f7c-a4c5-dc46a63ebbc0\") " pod="openstack/nova-metadata-0" Jan 28 21:02:31 crc kubenswrapper[4746]: I0128 21:02:31.737966 4746 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"nova-metadata-tls-certs\" (UniqueName: 
\"kubernetes.io/secret/305b9f70-9fd4-4f7c-a4c5-dc46a63ebbc0-nova-metadata-tls-certs\") pod \"nova-metadata-0\" (UID: \"305b9f70-9fd4-4f7c-a4c5-dc46a63ebbc0\") " pod="openstack/nova-metadata-0" Jan 28 21:02:31 crc kubenswrapper[4746]: I0128 21:02:31.745845 4746 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/305b9f70-9fd4-4f7c-a4c5-dc46a63ebbc0-combined-ca-bundle\") pod \"nova-metadata-0\" (UID: \"305b9f70-9fd4-4f7c-a4c5-dc46a63ebbc0\") " pod="openstack/nova-metadata-0" Jan 28 21:02:31 crc kubenswrapper[4746]: I0128 21:02:31.751984 4746 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/305b9f70-9fd4-4f7c-a4c5-dc46a63ebbc0-config-data\") pod \"nova-metadata-0\" (UID: \"305b9f70-9fd4-4f7c-a4c5-dc46a63ebbc0\") " pod="openstack/nova-metadata-0" Jan 28 21:02:31 crc kubenswrapper[4746]: I0128 21:02:31.758808 4746 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-ll6tq\" (UniqueName: \"kubernetes.io/projected/305b9f70-9fd4-4f7c-a4c5-dc46a63ebbc0-kube-api-access-ll6tq\") pod \"nova-metadata-0\" (UID: \"305b9f70-9fd4-4f7c-a4c5-dc46a63ebbc0\") " pod="openstack/nova-metadata-0" Jan 28 21:02:31 crc kubenswrapper[4746]: I0128 21:02:31.833582 4746 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/kube-state-metrics-0" Jan 28 21:02:31 crc kubenswrapper[4746]: I0128 21:02:31.886656 4746 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-metadata-0" Jan 28 21:02:32 crc kubenswrapper[4746]: I0128 21:02:32.354055 4746 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/kube-state-metrics-0"] Jan 28 21:02:32 crc kubenswrapper[4746]: I0128 21:02:32.409313 4746 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"e3ee42d3-e89f-4e39-b37a-4f01000355a4","Type":"ContainerStarted","Data":"8f2f925c10bbd7275f49f2b27d72aa4d876f1f0d5efa8ac05c79fa1fcae80c1c"} Jan 28 21:02:32 crc kubenswrapper[4746]: I0128 21:02:32.409647 4746 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"e3ee42d3-e89f-4e39-b37a-4f01000355a4","Type":"ContainerStarted","Data":"d01d05176baf7ab85d89a66c810887542fc5aa90c182e9b02313ae59f6e754f4"} Jan 28 21:02:32 crc kubenswrapper[4746]: I0128 21:02:32.416533 4746 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/kube-state-metrics-0" event={"ID":"62270f68-89c1-462f-8aac-c4944f92cc3f","Type":"ContainerStarted","Data":"adfccfe48a8eb0068c7c38d7372eb3056b52c3fdb660d07b87fc2993ebd86eea"} Jan 28 21:02:32 crc kubenswrapper[4746]: I0128 21:02:32.431392 4746 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-api-0" podStartSLOduration=2.431374834 podStartE2EDuration="2.431374834s" podCreationTimestamp="2026-01-28 21:02:30 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-28 21:02:32.426993225 +0000 UTC m=+1380.383179579" watchObservedRunningTime="2026-01-28 21:02:32.431374834 +0000 UTC m=+1380.387561198" Jan 28 21:02:32 crc kubenswrapper[4746]: W0128 21:02:32.482066 4746 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod305b9f70_9fd4_4f7c_a4c5_dc46a63ebbc0.slice/crio-7a22a0f0473728c3f31277f0969639e59aa10af630f3468ea44b53de434cf852 WatchSource:0}: Error 
finding container 7a22a0f0473728c3f31277f0969639e59aa10af630f3468ea44b53de434cf852: Status 404 returned error can't find the container with id 7a22a0f0473728c3f31277f0969639e59aa10af630f3468ea44b53de434cf852 Jan 28 21:02:32 crc kubenswrapper[4746]: I0128 21:02:32.488147 4746 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-metadata-0"] Jan 28 21:02:32 crc kubenswrapper[4746]: I0128 21:02:32.566874 4746 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ceilometer-0"] Jan 28 21:02:32 crc kubenswrapper[4746]: I0128 21:02:32.567164 4746 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="4cbb8deb-abbe-4971-aef9-e1a801eb55eb" containerName="ceilometer-central-agent" containerID="cri-o://b2ad84d6551f116608ef3dc3a9bbb9e32265de65399db25e132d0a1a69fd7ed5" gracePeriod=30 Jan 28 21:02:32 crc kubenswrapper[4746]: I0128 21:02:32.567298 4746 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="4cbb8deb-abbe-4971-aef9-e1a801eb55eb" containerName="proxy-httpd" containerID="cri-o://a588d0fde33b7e202fa8ce27e993cd9efb4be5bf280c684d972489d45edd2e24" gracePeriod=30 Jan 28 21:02:32 crc kubenswrapper[4746]: I0128 21:02:32.567352 4746 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="4cbb8deb-abbe-4971-aef9-e1a801eb55eb" containerName="ceilometer-notification-agent" containerID="cri-o://629e76dea151bd0f814f1082d64186f4b063dca70fa7f1a8f1f73aeec446f671" gracePeriod=30 Jan 28 21:02:32 crc kubenswrapper[4746]: I0128 21:02:32.567383 4746 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="4cbb8deb-abbe-4971-aef9-e1a801eb55eb" containerName="sg-core" containerID="cri-o://ee473725abd27899722f37d0953e40f88fcf03f78dea819d20fd5c74d953a493" gracePeriod=30 Jan 28 21:02:32 crc kubenswrapper[4746]: I0128 21:02:32.892377 4746 kubelet_volumes.go:163] 
"Cleaned up orphaned pod volumes dir" podUID="4aa3c3d3-f7a7-4e26-bf26-630de3cc7a17" path="/var/lib/kubelet/pods/4aa3c3d3-f7a7-4e26-bf26-630de3cc7a17/volumes" Jan 28 21:02:32 crc kubenswrapper[4746]: I0128 21:02:32.893189 4746 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="9426df49-b969-4890-8ea2-d37f2312e5e3" path="/var/lib/kubelet/pods/9426df49-b969-4890-8ea2-d37f2312e5e3/volumes" Jan 28 21:02:33 crc kubenswrapper[4746]: I0128 21:02:33.427619 4746 generic.go:334] "Generic (PLEG): container finished" podID="4cbb8deb-abbe-4971-aef9-e1a801eb55eb" containerID="a588d0fde33b7e202fa8ce27e993cd9efb4be5bf280c684d972489d45edd2e24" exitCode=0 Jan 28 21:02:33 crc kubenswrapper[4746]: I0128 21:02:33.427842 4746 generic.go:334] "Generic (PLEG): container finished" podID="4cbb8deb-abbe-4971-aef9-e1a801eb55eb" containerID="ee473725abd27899722f37d0953e40f88fcf03f78dea819d20fd5c74d953a493" exitCode=2 Jan 28 21:02:33 crc kubenswrapper[4746]: I0128 21:02:33.427849 4746 generic.go:334] "Generic (PLEG): container finished" podID="4cbb8deb-abbe-4971-aef9-e1a801eb55eb" containerID="b2ad84d6551f116608ef3dc3a9bbb9e32265de65399db25e132d0a1a69fd7ed5" exitCode=0 Jan 28 21:02:33 crc kubenswrapper[4746]: I0128 21:02:33.427698 4746 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"4cbb8deb-abbe-4971-aef9-e1a801eb55eb","Type":"ContainerDied","Data":"a588d0fde33b7e202fa8ce27e993cd9efb4be5bf280c684d972489d45edd2e24"} Jan 28 21:02:33 crc kubenswrapper[4746]: I0128 21:02:33.427914 4746 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"4cbb8deb-abbe-4971-aef9-e1a801eb55eb","Type":"ContainerDied","Data":"ee473725abd27899722f37d0953e40f88fcf03f78dea819d20fd5c74d953a493"} Jan 28 21:02:33 crc kubenswrapper[4746]: I0128 21:02:33.427928 4746 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" 
event={"ID":"4cbb8deb-abbe-4971-aef9-e1a801eb55eb","Type":"ContainerDied","Data":"b2ad84d6551f116608ef3dc3a9bbb9e32265de65399db25e132d0a1a69fd7ed5"} Jan 28 21:02:33 crc kubenswrapper[4746]: I0128 21:02:33.429562 4746 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/kube-state-metrics-0" event={"ID":"62270f68-89c1-462f-8aac-c4944f92cc3f","Type":"ContainerStarted","Data":"af482a3119917334dbf6f66ac094cdda38befe94a09b8b6c4b731f2b3507084c"} Jan 28 21:02:33 crc kubenswrapper[4746]: I0128 21:02:33.429786 4746 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/kube-state-metrics-0" Jan 28 21:02:33 crc kubenswrapper[4746]: I0128 21:02:33.431612 4746 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"305b9f70-9fd4-4f7c-a4c5-dc46a63ebbc0","Type":"ContainerStarted","Data":"9c168e06cce37310a3c2c13fbf3b9f953f56cca37de2940567c29158a5646938"} Jan 28 21:02:33 crc kubenswrapper[4746]: I0128 21:02:33.431637 4746 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"305b9f70-9fd4-4f7c-a4c5-dc46a63ebbc0","Type":"ContainerStarted","Data":"b6362abb85aeb2eb5af0e45dc5e1f8bad100f73f30953cc3ddbaf8f3ede7a372"} Jan 28 21:02:33 crc kubenswrapper[4746]: I0128 21:02:33.431648 4746 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"305b9f70-9fd4-4f7c-a4c5-dc46a63ebbc0","Type":"ContainerStarted","Data":"7a22a0f0473728c3f31277f0969639e59aa10af630f3468ea44b53de434cf852"} Jan 28 21:02:33 crc kubenswrapper[4746]: I0128 21:02:33.450814 4746 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/kube-state-metrics-0" podStartSLOduration=2.088289324 podStartE2EDuration="2.450800125s" podCreationTimestamp="2026-01-28 21:02:31 +0000 UTC" firstStartedPulling="2026-01-28 21:02:32.361694089 +0000 UTC m=+1380.317880443" lastFinishedPulling="2026-01-28 21:02:32.72420489 +0000 UTC m=+1380.680391244" 
observedRunningTime="2026-01-28 21:02:33.445540543 +0000 UTC m=+1381.401726897" watchObservedRunningTime="2026-01-28 21:02:33.450800125 +0000 UTC m=+1381.406986479" Jan 28 21:02:33 crc kubenswrapper[4746]: I0128 21:02:33.468790 4746 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-metadata-0" podStartSLOduration=2.468767177 podStartE2EDuration="2.468767177s" podCreationTimestamp="2026-01-28 21:02:31 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-28 21:02:33.464829202 +0000 UTC m=+1381.421015566" watchObservedRunningTime="2026-01-28 21:02:33.468767177 +0000 UTC m=+1381.424953541" Jan 28 21:02:34 crc kubenswrapper[4746]: I0128 21:02:34.359793 4746 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0" Jan 28 21:02:34 crc kubenswrapper[4746]: I0128 21:02:34.445616 4746 generic.go:334] "Generic (PLEG): container finished" podID="4cbb8deb-abbe-4971-aef9-e1a801eb55eb" containerID="629e76dea151bd0f814f1082d64186f4b063dca70fa7f1a8f1f73aeec446f671" exitCode=0 Jan 28 21:02:34 crc kubenswrapper[4746]: I0128 21:02:34.445950 4746 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"4cbb8deb-abbe-4971-aef9-e1a801eb55eb","Type":"ContainerDied","Data":"629e76dea151bd0f814f1082d64186f4b063dca70fa7f1a8f1f73aeec446f671"} Jan 28 21:02:34 crc kubenswrapper[4746]: I0128 21:02:34.446024 4746 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"4cbb8deb-abbe-4971-aef9-e1a801eb55eb","Type":"ContainerDied","Data":"bd92a726f169e6882618a8bda0a39e84badd73a38af13ac6c385c650235928e5"} Jan 28 21:02:34 crc kubenswrapper[4746]: I0128 21:02:34.446047 4746 scope.go:117] "RemoveContainer" containerID="a588d0fde33b7e202fa8ce27e993cd9efb4be5bf280c684d972489d45edd2e24" Jan 28 21:02:34 crc kubenswrapper[4746]: I0128 21:02:34.446731 4746 
util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0" Jan 28 21:02:34 crc kubenswrapper[4746]: I0128 21:02:34.480983 4746 scope.go:117] "RemoveContainer" containerID="ee473725abd27899722f37d0953e40f88fcf03f78dea819d20fd5c74d953a493" Jan 28 21:02:34 crc kubenswrapper[4746]: I0128 21:02:34.496990 4746 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/4cbb8deb-abbe-4971-aef9-e1a801eb55eb-config-data\") pod \"4cbb8deb-abbe-4971-aef9-e1a801eb55eb\" (UID: \"4cbb8deb-abbe-4971-aef9-e1a801eb55eb\") " Jan 28 21:02:34 crc kubenswrapper[4746]: I0128 21:02:34.497059 4746 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/4cbb8deb-abbe-4971-aef9-e1a801eb55eb-sg-core-conf-yaml\") pod \"4cbb8deb-abbe-4971-aef9-e1a801eb55eb\" (UID: \"4cbb8deb-abbe-4971-aef9-e1a801eb55eb\") " Jan 28 21:02:34 crc kubenswrapper[4746]: I0128 21:02:34.497438 4746 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/4cbb8deb-abbe-4971-aef9-e1a801eb55eb-log-httpd\") pod \"4cbb8deb-abbe-4971-aef9-e1a801eb55eb\" (UID: \"4cbb8deb-abbe-4971-aef9-e1a801eb55eb\") " Jan 28 21:02:34 crc kubenswrapper[4746]: I0128 21:02:34.497548 4746 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/4cbb8deb-abbe-4971-aef9-e1a801eb55eb-combined-ca-bundle\") pod \"4cbb8deb-abbe-4971-aef9-e1a801eb55eb\" (UID: \"4cbb8deb-abbe-4971-aef9-e1a801eb55eb\") " Jan 28 21:02:34 crc kubenswrapper[4746]: I0128 21:02:34.497687 4746 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-7m522\" (UniqueName: \"kubernetes.io/projected/4cbb8deb-abbe-4971-aef9-e1a801eb55eb-kube-api-access-7m522\") pod 
\"4cbb8deb-abbe-4971-aef9-e1a801eb55eb\" (UID: \"4cbb8deb-abbe-4971-aef9-e1a801eb55eb\") " Jan 28 21:02:34 crc kubenswrapper[4746]: I0128 21:02:34.497734 4746 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/4cbb8deb-abbe-4971-aef9-e1a801eb55eb-run-httpd\") pod \"4cbb8deb-abbe-4971-aef9-e1a801eb55eb\" (UID: \"4cbb8deb-abbe-4971-aef9-e1a801eb55eb\") " Jan 28 21:02:34 crc kubenswrapper[4746]: I0128 21:02:34.497763 4746 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/4cbb8deb-abbe-4971-aef9-e1a801eb55eb-scripts\") pod \"4cbb8deb-abbe-4971-aef9-e1a801eb55eb\" (UID: \"4cbb8deb-abbe-4971-aef9-e1a801eb55eb\") " Jan 28 21:02:34 crc kubenswrapper[4746]: I0128 21:02:34.498986 4746 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/4cbb8deb-abbe-4971-aef9-e1a801eb55eb-log-httpd" (OuterVolumeSpecName: "log-httpd") pod "4cbb8deb-abbe-4971-aef9-e1a801eb55eb" (UID: "4cbb8deb-abbe-4971-aef9-e1a801eb55eb"). InnerVolumeSpecName "log-httpd". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 28 21:02:34 crc kubenswrapper[4746]: I0128 21:02:34.499000 4746 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/4cbb8deb-abbe-4971-aef9-e1a801eb55eb-run-httpd" (OuterVolumeSpecName: "run-httpd") pod "4cbb8deb-abbe-4971-aef9-e1a801eb55eb" (UID: "4cbb8deb-abbe-4971-aef9-e1a801eb55eb"). InnerVolumeSpecName "run-httpd". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 28 21:02:34 crc kubenswrapper[4746]: I0128 21:02:34.505107 4746 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/4cbb8deb-abbe-4971-aef9-e1a801eb55eb-kube-api-access-7m522" (OuterVolumeSpecName: "kube-api-access-7m522") pod "4cbb8deb-abbe-4971-aef9-e1a801eb55eb" (UID: "4cbb8deb-abbe-4971-aef9-e1a801eb55eb"). 
InnerVolumeSpecName "kube-api-access-7m522". PluginName "kubernetes.io/projected", VolumeGidValue ""
Jan 28 21:02:34 crc kubenswrapper[4746]: I0128 21:02:34.518299 4746 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/4cbb8deb-abbe-4971-aef9-e1a801eb55eb-scripts" (OuterVolumeSpecName: "scripts") pod "4cbb8deb-abbe-4971-aef9-e1a801eb55eb" (UID: "4cbb8deb-abbe-4971-aef9-e1a801eb55eb"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue ""
Jan 28 21:02:34 crc kubenswrapper[4746]: I0128 21:02:34.518336 4746 scope.go:117] "RemoveContainer" containerID="629e76dea151bd0f814f1082d64186f4b063dca70fa7f1a8f1f73aeec446f671"
Jan 28 21:02:34 crc kubenswrapper[4746]: I0128 21:02:34.564544 4746 scope.go:117] "RemoveContainer" containerID="b2ad84d6551f116608ef3dc3a9bbb9e32265de65399db25e132d0a1a69fd7ed5"
Jan 28 21:02:34 crc kubenswrapper[4746]: I0128 21:02:34.576335 4746 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/4cbb8deb-abbe-4971-aef9-e1a801eb55eb-sg-core-conf-yaml" (OuterVolumeSpecName: "sg-core-conf-yaml") pod "4cbb8deb-abbe-4971-aef9-e1a801eb55eb" (UID: "4cbb8deb-abbe-4971-aef9-e1a801eb55eb"). InnerVolumeSpecName "sg-core-conf-yaml". PluginName "kubernetes.io/secret", VolumeGidValue ""
Jan 28 21:02:34 crc kubenswrapper[4746]: I0128 21:02:34.601413 4746 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-7m522\" (UniqueName: \"kubernetes.io/projected/4cbb8deb-abbe-4971-aef9-e1a801eb55eb-kube-api-access-7m522\") on node \"crc\" DevicePath \"\""
Jan 28 21:02:34 crc kubenswrapper[4746]: I0128 21:02:34.601585 4746 reconciler_common.go:293] "Volume detached for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/4cbb8deb-abbe-4971-aef9-e1a801eb55eb-run-httpd\") on node \"crc\" DevicePath \"\""
Jan 28 21:02:34 crc kubenswrapper[4746]: I0128 21:02:34.601602 4746 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/4cbb8deb-abbe-4971-aef9-e1a801eb55eb-scripts\") on node \"crc\" DevicePath \"\""
Jan 28 21:02:34 crc kubenswrapper[4746]: I0128 21:02:34.601613 4746 reconciler_common.go:293] "Volume detached for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/4cbb8deb-abbe-4971-aef9-e1a801eb55eb-sg-core-conf-yaml\") on node \"crc\" DevicePath \"\""
Jan 28 21:02:34 crc kubenswrapper[4746]: I0128 21:02:34.601625 4746 reconciler_common.go:293] "Volume detached for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/4cbb8deb-abbe-4971-aef9-e1a801eb55eb-log-httpd\") on node \"crc\" DevicePath \"\""
Jan 28 21:02:34 crc kubenswrapper[4746]: I0128 21:02:34.634209 4746 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/4cbb8deb-abbe-4971-aef9-e1a801eb55eb-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "4cbb8deb-abbe-4971-aef9-e1a801eb55eb" (UID: "4cbb8deb-abbe-4971-aef9-e1a801eb55eb"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue ""
Jan 28 21:02:34 crc kubenswrapper[4746]: I0128 21:02:34.653205 4746 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/4cbb8deb-abbe-4971-aef9-e1a801eb55eb-config-data" (OuterVolumeSpecName: "config-data") pod "4cbb8deb-abbe-4971-aef9-e1a801eb55eb" (UID: "4cbb8deb-abbe-4971-aef9-e1a801eb55eb"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue ""
Jan 28 21:02:34 crc kubenswrapper[4746]: I0128 21:02:34.655370 4746 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-scheduler-0"
Jan 28 21:02:34 crc kubenswrapper[4746]: I0128 21:02:34.703843 4746 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/4cbb8deb-abbe-4971-aef9-e1a801eb55eb-config-data\") on node \"crc\" DevicePath \"\""
Jan 28 21:02:34 crc kubenswrapper[4746]: I0128 21:02:34.704122 4746 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/4cbb8deb-abbe-4971-aef9-e1a801eb55eb-combined-ca-bundle\") on node \"crc\" DevicePath \"\""
Jan 28 21:02:34 crc kubenswrapper[4746]: I0128 21:02:34.753067 4746 scope.go:117] "RemoveContainer" containerID="a588d0fde33b7e202fa8ce27e993cd9efb4be5bf280c684d972489d45edd2e24"
Jan 28 21:02:34 crc kubenswrapper[4746]: E0128 21:02:34.753545 4746 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"a588d0fde33b7e202fa8ce27e993cd9efb4be5bf280c684d972489d45edd2e24\": container with ID starting with a588d0fde33b7e202fa8ce27e993cd9efb4be5bf280c684d972489d45edd2e24 not found: ID does not exist" containerID="a588d0fde33b7e202fa8ce27e993cd9efb4be5bf280c684d972489d45edd2e24"
Jan 28 21:02:34 crc kubenswrapper[4746]: I0128 21:02:34.753577 4746 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"a588d0fde33b7e202fa8ce27e993cd9efb4be5bf280c684d972489d45edd2e24"} err="failed to get container status \"a588d0fde33b7e202fa8ce27e993cd9efb4be5bf280c684d972489d45edd2e24\": rpc error: code = NotFound desc = could not find container \"a588d0fde33b7e202fa8ce27e993cd9efb4be5bf280c684d972489d45edd2e24\": container with ID starting with a588d0fde33b7e202fa8ce27e993cd9efb4be5bf280c684d972489d45edd2e24 not found: ID does not exist"
Jan 28 21:02:34 crc kubenswrapper[4746]: I0128 21:02:34.753601 4746 scope.go:117] "RemoveContainer" containerID="ee473725abd27899722f37d0953e40f88fcf03f78dea819d20fd5c74d953a493"
Jan 28 21:02:34 crc kubenswrapper[4746]: E0128 21:02:34.753913 4746 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"ee473725abd27899722f37d0953e40f88fcf03f78dea819d20fd5c74d953a493\": container with ID starting with ee473725abd27899722f37d0953e40f88fcf03f78dea819d20fd5c74d953a493 not found: ID does not exist" containerID="ee473725abd27899722f37d0953e40f88fcf03f78dea819d20fd5c74d953a493"
Jan 28 21:02:34 crc kubenswrapper[4746]: I0128 21:02:34.754013 4746 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"ee473725abd27899722f37d0953e40f88fcf03f78dea819d20fd5c74d953a493"} err="failed to get container status \"ee473725abd27899722f37d0953e40f88fcf03f78dea819d20fd5c74d953a493\": rpc error: code = NotFound desc = could not find container \"ee473725abd27899722f37d0953e40f88fcf03f78dea819d20fd5c74d953a493\": container with ID starting with ee473725abd27899722f37d0953e40f88fcf03f78dea819d20fd5c74d953a493 not found: ID does not exist"
Jan 28 21:02:34 crc kubenswrapper[4746]: I0128 21:02:34.754117 4746 scope.go:117] "RemoveContainer" containerID="629e76dea151bd0f814f1082d64186f4b063dca70fa7f1a8f1f73aeec446f671"
Jan 28 21:02:34 crc kubenswrapper[4746]: E0128 21:02:34.754487 4746 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"629e76dea151bd0f814f1082d64186f4b063dca70fa7f1a8f1f73aeec446f671\": container with ID starting with 629e76dea151bd0f814f1082d64186f4b063dca70fa7f1a8f1f73aeec446f671 not found: ID does not exist" containerID="629e76dea151bd0f814f1082d64186f4b063dca70fa7f1a8f1f73aeec446f671"
Jan 28 21:02:34 crc kubenswrapper[4746]: I0128 21:02:34.754584 4746 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"629e76dea151bd0f814f1082d64186f4b063dca70fa7f1a8f1f73aeec446f671"} err="failed to get container status \"629e76dea151bd0f814f1082d64186f4b063dca70fa7f1a8f1f73aeec446f671\": rpc error: code = NotFound desc = could not find container \"629e76dea151bd0f814f1082d64186f4b063dca70fa7f1a8f1f73aeec446f671\": container with ID starting with 629e76dea151bd0f814f1082d64186f4b063dca70fa7f1a8f1f73aeec446f671 not found: ID does not exist"
Jan 28 21:02:34 crc kubenswrapper[4746]: I0128 21:02:34.754662 4746 scope.go:117] "RemoveContainer" containerID="b2ad84d6551f116608ef3dc3a9bbb9e32265de65399db25e132d0a1a69fd7ed5"
Jan 28 21:02:34 crc kubenswrapper[4746]: E0128 21:02:34.755031 4746 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"b2ad84d6551f116608ef3dc3a9bbb9e32265de65399db25e132d0a1a69fd7ed5\": container with ID starting with b2ad84d6551f116608ef3dc3a9bbb9e32265de65399db25e132d0a1a69fd7ed5 not found: ID does not exist" containerID="b2ad84d6551f116608ef3dc3a9bbb9e32265de65399db25e132d0a1a69fd7ed5"
Jan 28 21:02:34 crc kubenswrapper[4746]: I0128 21:02:34.755064 4746 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"b2ad84d6551f116608ef3dc3a9bbb9e32265de65399db25e132d0a1a69fd7ed5"} err="failed to get container status \"b2ad84d6551f116608ef3dc3a9bbb9e32265de65399db25e132d0a1a69fd7ed5\": rpc error: code = NotFound desc = could not find container \"b2ad84d6551f116608ef3dc3a9bbb9e32265de65399db25e132d0a1a69fd7ed5\": container with ID starting with b2ad84d6551f116608ef3dc3a9bbb9e32265de65399db25e132d0a1a69fd7ed5 not found: ID does not exist"
Jan 28 21:02:34 crc kubenswrapper[4746]: I0128 21:02:34.790733 4746 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ceilometer-0"]
Jan 28 21:02:34 crc kubenswrapper[4746]: I0128 21:02:34.805515 4746 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/ceilometer-0"]
Jan 28 21:02:34 crc kubenswrapper[4746]: I0128 21:02:34.819481 4746 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ceilometer-0"]
Jan 28 21:02:34 crc kubenswrapper[4746]: E0128 21:02:34.820143 4746 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="4cbb8deb-abbe-4971-aef9-e1a801eb55eb" containerName="ceilometer-notification-agent"
Jan 28 21:02:34 crc kubenswrapper[4746]: I0128 21:02:34.820171 4746 state_mem.go:107] "Deleted CPUSet assignment" podUID="4cbb8deb-abbe-4971-aef9-e1a801eb55eb" containerName="ceilometer-notification-agent"
Jan 28 21:02:34 crc kubenswrapper[4746]: E0128 21:02:34.820213 4746 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="4cbb8deb-abbe-4971-aef9-e1a801eb55eb" containerName="ceilometer-central-agent"
Jan 28 21:02:34 crc kubenswrapper[4746]: I0128 21:02:34.820228 4746 state_mem.go:107] "Deleted CPUSet assignment" podUID="4cbb8deb-abbe-4971-aef9-e1a801eb55eb" containerName="ceilometer-central-agent"
Jan 28 21:02:34 crc kubenswrapper[4746]: E0128 21:02:34.820276 4746 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="4cbb8deb-abbe-4971-aef9-e1a801eb55eb" containerName="proxy-httpd"
Jan 28 21:02:34 crc kubenswrapper[4746]: I0128 21:02:34.820292 4746 state_mem.go:107] "Deleted CPUSet assignment" podUID="4cbb8deb-abbe-4971-aef9-e1a801eb55eb" containerName="proxy-httpd"
Jan 28 21:02:34 crc kubenswrapper[4746]: E0128 21:02:34.820339 4746 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="4cbb8deb-abbe-4971-aef9-e1a801eb55eb" containerName="sg-core"
Jan 28 21:02:34 crc kubenswrapper[4746]: I0128 21:02:34.820352 4746 state_mem.go:107] "Deleted CPUSet assignment" podUID="4cbb8deb-abbe-4971-aef9-e1a801eb55eb" containerName="sg-core"
Jan 28 21:02:34 crc kubenswrapper[4746]: I0128 21:02:34.820735 4746 memory_manager.go:354] "RemoveStaleState removing state" podUID="4cbb8deb-abbe-4971-aef9-e1a801eb55eb" containerName="ceilometer-central-agent"
Jan 28 21:02:34 crc kubenswrapper[4746]: I0128 21:02:34.820777 4746 memory_manager.go:354] "RemoveStaleState removing state" podUID="4cbb8deb-abbe-4971-aef9-e1a801eb55eb" containerName="sg-core"
Jan 28 21:02:34 crc kubenswrapper[4746]: I0128 21:02:34.820829 4746 memory_manager.go:354] "RemoveStaleState removing state" podUID="4cbb8deb-abbe-4971-aef9-e1a801eb55eb" containerName="ceilometer-notification-agent"
Jan 28 21:02:34 crc kubenswrapper[4746]: I0128 21:02:34.820851 4746 memory_manager.go:354] "RemoveStaleState removing state" podUID="4cbb8deb-abbe-4971-aef9-e1a801eb55eb" containerName="proxy-httpd"
Jan 28 21:02:34 crc kubenswrapper[4746]: I0128 21:02:34.824158 4746 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0"
Jan 28 21:02:34 crc kubenswrapper[4746]: I0128 21:02:34.827241 4746 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-ceilometer-internal-svc"
Jan 28 21:02:34 crc kubenswrapper[4746]: I0128 21:02:34.827513 4746 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceilometer-scripts"
Jan 28 21:02:34 crc kubenswrapper[4746]: I0128 21:02:34.830420 4746 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceilometer-config-data"
Jan 28 21:02:34 crc kubenswrapper[4746]: I0128 21:02:34.835503 4746 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ceilometer-0"]
Jan 28 21:02:34 crc kubenswrapper[4746]: I0128 21:02:34.868809 4746 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="4cbb8deb-abbe-4971-aef9-e1a801eb55eb" path="/var/lib/kubelet/pods/4cbb8deb-abbe-4971-aef9-e1a801eb55eb/volumes"
Jan 28 21:02:34 crc kubenswrapper[4746]: I0128 21:02:34.909476 4746 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/4f1d0c8d-975f-4547-ad62-7e836f6db0d5-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"4f1d0c8d-975f-4547-ad62-7e836f6db0d5\") " pod="openstack/ceilometer-0"
Jan 28 21:02:34 crc kubenswrapper[4746]: I0128 21:02:34.909527 4746 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/4f1d0c8d-975f-4547-ad62-7e836f6db0d5-config-data\") pod \"ceilometer-0\" (UID: \"4f1d0c8d-975f-4547-ad62-7e836f6db0d5\") " pod="openstack/ceilometer-0"
Jan 28 21:02:34 crc kubenswrapper[4746]: I0128 21:02:34.909553 4746 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/4f1d0c8d-975f-4547-ad62-7e836f6db0d5-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"4f1d0c8d-975f-4547-ad62-7e836f6db0d5\") " pod="openstack/ceilometer-0"
Jan 28 21:02:34 crc kubenswrapper[4746]: I0128 21:02:34.909925 4746 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/4f1d0c8d-975f-4547-ad62-7e836f6db0d5-run-httpd\") pod \"ceilometer-0\" (UID: \"4f1d0c8d-975f-4547-ad62-7e836f6db0d5\") " pod="openstack/ceilometer-0"
Jan 28 21:02:34 crc kubenswrapper[4746]: I0128 21:02:34.910049 4746 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-rlbz2\" (UniqueName: \"kubernetes.io/projected/4f1d0c8d-975f-4547-ad62-7e836f6db0d5-kube-api-access-rlbz2\") pod \"ceilometer-0\" (UID: \"4f1d0c8d-975f-4547-ad62-7e836f6db0d5\") " pod="openstack/ceilometer-0"
Jan 28 21:02:34 crc kubenswrapper[4746]: I0128 21:02:34.910233 4746 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ceilometer-tls-certs\" (UniqueName: \"kubernetes.io/secret/4f1d0c8d-975f-4547-ad62-7e836f6db0d5-ceilometer-tls-certs\") pod \"ceilometer-0\" (UID: \"4f1d0c8d-975f-4547-ad62-7e836f6db0d5\") " pod="openstack/ceilometer-0"
Jan 28 21:02:34 crc kubenswrapper[4746]: I0128 21:02:34.910292 4746 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/4f1d0c8d-975f-4547-ad62-7e836f6db0d5-scripts\") pod \"ceilometer-0\" (UID: \"4f1d0c8d-975f-4547-ad62-7e836f6db0d5\") " pod="openstack/ceilometer-0"
Jan 28 21:02:34 crc kubenswrapper[4746]: I0128 21:02:34.910340 4746 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/4f1d0c8d-975f-4547-ad62-7e836f6db0d5-log-httpd\") pod \"ceilometer-0\" (UID: \"4f1d0c8d-975f-4547-ad62-7e836f6db0d5\") " pod="openstack/ceilometer-0"
Jan 28 21:02:35 crc kubenswrapper[4746]: I0128 21:02:35.012326 4746 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/4f1d0c8d-975f-4547-ad62-7e836f6db0d5-run-httpd\") pod \"ceilometer-0\" (UID: \"4f1d0c8d-975f-4547-ad62-7e836f6db0d5\") " pod="openstack/ceilometer-0"
Jan 28 21:02:35 crc kubenswrapper[4746]: I0128 21:02:35.012406 4746 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-rlbz2\" (UniqueName: \"kubernetes.io/projected/4f1d0c8d-975f-4547-ad62-7e836f6db0d5-kube-api-access-rlbz2\") pod \"ceilometer-0\" (UID: \"4f1d0c8d-975f-4547-ad62-7e836f6db0d5\") " pod="openstack/ceilometer-0"
Jan 28 21:02:35 crc kubenswrapper[4746]: I0128 21:02:35.012509 4746 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ceilometer-tls-certs\" (UniqueName: \"kubernetes.io/secret/4f1d0c8d-975f-4547-ad62-7e836f6db0d5-ceilometer-tls-certs\") pod \"ceilometer-0\" (UID: \"4f1d0c8d-975f-4547-ad62-7e836f6db0d5\") " pod="openstack/ceilometer-0"
Jan 28 21:02:35 crc kubenswrapper[4746]: I0128 21:02:35.012543 4746 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/4f1d0c8d-975f-4547-ad62-7e836f6db0d5-scripts\") pod \"ceilometer-0\" (UID: \"4f1d0c8d-975f-4547-ad62-7e836f6db0d5\") " pod="openstack/ceilometer-0"
Jan 28 21:02:35 crc kubenswrapper[4746]: I0128 21:02:35.012570 4746 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/4f1d0c8d-975f-4547-ad62-7e836f6db0d5-log-httpd\") pod \"ceilometer-0\" (UID: \"4f1d0c8d-975f-4547-ad62-7e836f6db0d5\") " pod="openstack/ceilometer-0"
Jan 28 21:02:35 crc kubenswrapper[4746]: I0128 21:02:35.012662 4746 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/4f1d0c8d-975f-4547-ad62-7e836f6db0d5-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"4f1d0c8d-975f-4547-ad62-7e836f6db0d5\") " pod="openstack/ceilometer-0"
Jan 28 21:02:35 crc kubenswrapper[4746]: I0128 21:02:35.012684 4746 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/4f1d0c8d-975f-4547-ad62-7e836f6db0d5-config-data\") pod \"ceilometer-0\" (UID: \"4f1d0c8d-975f-4547-ad62-7e836f6db0d5\") " pod="openstack/ceilometer-0"
Jan 28 21:02:35 crc kubenswrapper[4746]: I0128 21:02:35.012727 4746 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/4f1d0c8d-975f-4547-ad62-7e836f6db0d5-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"4f1d0c8d-975f-4547-ad62-7e836f6db0d5\") " pod="openstack/ceilometer-0"
Jan 28 21:02:35 crc kubenswrapper[4746]: I0128 21:02:35.012914 4746 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/4f1d0c8d-975f-4547-ad62-7e836f6db0d5-run-httpd\") pod \"ceilometer-0\" (UID: \"4f1d0c8d-975f-4547-ad62-7e836f6db0d5\") " pod="openstack/ceilometer-0"
Jan 28 21:02:35 crc kubenswrapper[4746]: I0128 21:02:35.012983 4746 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/4f1d0c8d-975f-4547-ad62-7e836f6db0d5-log-httpd\") pod \"ceilometer-0\" (UID: \"4f1d0c8d-975f-4547-ad62-7e836f6db0d5\") " pod="openstack/ceilometer-0"
Jan 28 21:02:35 crc kubenswrapper[4746]: I0128 21:02:35.016736 4746 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/4f1d0c8d-975f-4547-ad62-7e836f6db0d5-scripts\") pod \"ceilometer-0\" (UID: \"4f1d0c8d-975f-4547-ad62-7e836f6db0d5\") " pod="openstack/ceilometer-0"
Jan 28 21:02:35 crc kubenswrapper[4746]: I0128 21:02:35.018151 4746 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/4f1d0c8d-975f-4547-ad62-7e836f6db0d5-config-data\") pod \"ceilometer-0\" (UID: \"4f1d0c8d-975f-4547-ad62-7e836f6db0d5\") " pod="openstack/ceilometer-0"
Jan 28 21:02:35 crc kubenswrapper[4746]: I0128 21:02:35.019272 4746 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/4f1d0c8d-975f-4547-ad62-7e836f6db0d5-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"4f1d0c8d-975f-4547-ad62-7e836f6db0d5\") " pod="openstack/ceilometer-0"
Jan 28 21:02:35 crc kubenswrapper[4746]: I0128 21:02:35.021013 4746 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/4f1d0c8d-975f-4547-ad62-7e836f6db0d5-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"4f1d0c8d-975f-4547-ad62-7e836f6db0d5\") " pod="openstack/ceilometer-0"
Jan 28 21:02:35 crc kubenswrapper[4746]: I0128 21:02:35.021688 4746 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ceilometer-tls-certs\" (UniqueName: \"kubernetes.io/secret/4f1d0c8d-975f-4547-ad62-7e836f6db0d5-ceilometer-tls-certs\") pod \"ceilometer-0\" (UID: \"4f1d0c8d-975f-4547-ad62-7e836f6db0d5\") " pod="openstack/ceilometer-0"
Jan 28 21:02:35 crc kubenswrapper[4746]: I0128 21:02:35.032873 4746 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-rlbz2\" (UniqueName: \"kubernetes.io/projected/4f1d0c8d-975f-4547-ad62-7e836f6db0d5-kube-api-access-rlbz2\") pod \"ceilometer-0\" (UID: \"4f1d0c8d-975f-4547-ad62-7e836f6db0d5\") " pod="openstack/ceilometer-0"
Jan 28 21:02:35 crc kubenswrapper[4746]: I0128 21:02:35.155227 4746 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0"
Jan 28 21:02:35 crc kubenswrapper[4746]: I0128 21:02:35.623495 4746 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ceilometer-0"]
Jan 28 21:02:35 crc kubenswrapper[4746]: W0128 21:02:35.625785 4746 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod4f1d0c8d_975f_4547_ad62_7e836f6db0d5.slice/crio-a40af6cbefe4f62eb09f84e77613ac10b151f8ff99bb29ed9ef4611a870eaca5 WatchSource:0}: Error finding container a40af6cbefe4f62eb09f84e77613ac10b151f8ff99bb29ed9ef4611a870eaca5: Status 404 returned error can't find the container with id a40af6cbefe4f62eb09f84e77613ac10b151f8ff99bb29ed9ef4611a870eaca5
Jan 28 21:02:36 crc kubenswrapper[4746]: I0128 21:02:36.471403 4746 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"4f1d0c8d-975f-4547-ad62-7e836f6db0d5","Type":"ContainerStarted","Data":"d22798994d9d85a005557c5384c71e7c865d9945f49c70058a8f83c508bd9467"}
Jan 28 21:02:36 crc kubenswrapper[4746]: I0128 21:02:36.471698 4746 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"4f1d0c8d-975f-4547-ad62-7e836f6db0d5","Type":"ContainerStarted","Data":"a40af6cbefe4f62eb09f84e77613ac10b151f8ff99bb29ed9ef4611a870eaca5"}
Jan 28 21:02:36 crc kubenswrapper[4746]: I0128 21:02:36.887392 4746 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-metadata-0"
Jan 28 21:02:36 crc kubenswrapper[4746]: I0128 21:02:36.887497 4746 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-metadata-0"
Jan 28 21:02:37 crc kubenswrapper[4746]: I0128 21:02:37.485258 4746 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"4f1d0c8d-975f-4547-ad62-7e836f6db0d5","Type":"ContainerStarted","Data":"799c299015159163ac91e042ecc086de90ea53f16bef9fc8c2be68c7d25f6902"}
Jan 28 21:02:38 crc kubenswrapper[4746]: I0128 21:02:38.500673 4746 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"4f1d0c8d-975f-4547-ad62-7e836f6db0d5","Type":"ContainerStarted","Data":"40abf9f07158df6c0a04c52f9b9b32c660b9406e8f52d7c3c271923dacda9e3a"}
Jan 28 21:02:39 crc kubenswrapper[4746]: I0128 21:02:39.655354 4746 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/nova-scheduler-0"
Jan 28 21:02:39 crc kubenswrapper[4746]: I0128 21:02:39.773906 4746 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/nova-scheduler-0"
Jan 28 21:02:40 crc kubenswrapper[4746]: I0128 21:02:40.520064 4746 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"4f1d0c8d-975f-4547-ad62-7e836f6db0d5","Type":"ContainerStarted","Data":"25ae8953e5f80f13d3cbc7318b281f2aeccc12e556185c9577598aff83adc74b"}
Jan 28 21:02:40 crc kubenswrapper[4746]: I0128 21:02:40.555697 4746 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/nova-scheduler-0"
Jan 28 21:02:40 crc kubenswrapper[4746]: I0128 21:02:40.562306 4746 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/ceilometer-0" podStartSLOduration=2.605348379 podStartE2EDuration="6.562212733s" podCreationTimestamp="2026-01-28 21:02:34 +0000 UTC" firstStartedPulling="2026-01-28 21:02:35.629668904 +0000 UTC m=+1383.585855258" lastFinishedPulling="2026-01-28 21:02:39.586533258 +0000 UTC m=+1387.542719612" observedRunningTime="2026-01-28 21:02:40.548897574 +0000 UTC m=+1388.505083928" watchObservedRunningTime="2026-01-28 21:02:40.562212733 +0000 UTC m=+1388.518399097"
Jan 28 21:02:40 crc kubenswrapper[4746]: I0128 21:02:40.804599 4746 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-marketplace/redhat-operators-7826h" podUID="04eb3dcd-859e-4fad-829e-c24bf2c954b4" containerName="registry-server" probeResult="failure" output=<
Jan 28 21:02:40 crc kubenswrapper[4746]: timeout: failed to connect service ":50051" within 1s
Jan 28 21:02:40 crc kubenswrapper[4746]: >
Jan 28 21:02:40 crc kubenswrapper[4746]: I0128 21:02:40.823269 4746 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/nova-api-0"
Jan 28 21:02:40 crc kubenswrapper[4746]: I0128 21:02:40.823355 4746 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/nova-api-0"
Jan 28 21:02:41 crc kubenswrapper[4746]: I0128 21:02:41.533870 4746 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/ceilometer-0"
Jan 28 21:02:41 crc kubenswrapper[4746]: I0128 21:02:41.887735 4746 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/nova-metadata-0"
Jan 28 21:02:41 crc kubenswrapper[4746]: I0128 21:02:41.888305 4746 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/nova-metadata-0"
Jan 28 21:02:41 crc kubenswrapper[4746]: I0128 21:02:41.905396 4746 prober.go:107] "Probe failed" probeType="Startup" pod="openstack/nova-api-0" podUID="e3ee42d3-e89f-4e39-b37a-4f01000355a4" containerName="nova-api-api" probeResult="failure" output="Get \"http://10.217.0.233:8774/\": context deadline exceeded (Client.Timeout exceeded while awaiting headers)"
Jan 28 21:02:41 crc kubenswrapper[4746]: I0128 21:02:41.905786 4746 prober.go:107] "Probe failed" probeType="Startup" pod="openstack/nova-api-0" podUID="e3ee42d3-e89f-4e39-b37a-4f01000355a4" containerName="nova-api-log" probeResult="failure" output="Get \"http://10.217.0.233:8774/\": context deadline exceeded (Client.Timeout exceeded while awaiting headers)"
Jan 28 21:02:41 crc kubenswrapper[4746]: I0128 21:02:41.929487 4746 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/kube-state-metrics-0"
Jan 28 21:02:42 crc kubenswrapper[4746]: I0128 21:02:42.903242 4746 prober.go:107] "Probe failed" probeType="Startup" pod="openstack/nova-metadata-0" podUID="305b9f70-9fd4-4f7c-a4c5-dc46a63ebbc0" containerName="nova-metadata-metadata" probeResult="failure" output="Get \"https://10.217.0.235:8775/\": net/http: request canceled (Client.Timeout exceeded while awaiting headers)"
Jan 28 21:02:42 crc kubenswrapper[4746]: I0128 21:02:42.903293 4746 prober.go:107] "Probe failed" probeType="Startup" pod="openstack/nova-metadata-0" podUID="305b9f70-9fd4-4f7c-a4c5-dc46a63ebbc0" containerName="nova-metadata-log" probeResult="failure" output="Get \"https://10.217.0.235:8775/\": net/http: request canceled (Client.Timeout exceeded while awaiting headers)"
Jan 28 21:02:50 crc kubenswrapper[4746]: I0128 21:02:50.818203 4746 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-marketplace/redhat-operators-7826h" podUID="04eb3dcd-859e-4fad-829e-c24bf2c954b4" containerName="registry-server" probeResult="failure" output=<
Jan 28 21:02:50 crc kubenswrapper[4746]: timeout: failed to connect service ":50051" within 1s
Jan 28 21:02:50 crc kubenswrapper[4746]: >
Jan 28 21:02:50 crc kubenswrapper[4746]: I0128 21:02:50.827769 4746 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/nova-api-0"
Jan 28 21:02:50 crc kubenswrapper[4746]: I0128 21:02:50.828546 4746 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-api-0"
Jan 28 21:02:50 crc kubenswrapper[4746]: I0128 21:02:50.829593 4746 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/nova-api-0"
Jan 28 21:02:50 crc kubenswrapper[4746]: I0128 21:02:50.849429 4746 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/nova-api-0"
Jan 28 21:02:51 crc kubenswrapper[4746]: I0128 21:02:51.635647 4746 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-api-0"
Jan 28 21:02:51 crc kubenswrapper[4746]: I0128 21:02:51.639376 4746 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/nova-api-0"
Jan 28 21:02:51 crc kubenswrapper[4746]: I0128 21:02:51.913965 4746 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-5fd9b586ff-zstkj"]
Jan 28 21:02:51 crc kubenswrapper[4746]: I0128 21:02:51.918151 4746 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-5fd9b586ff-zstkj"
Jan 28 21:02:51 crc kubenswrapper[4746]: I0128 21:02:51.933552 4746 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/nova-metadata-0"
Jan 28 21:02:51 crc kubenswrapper[4746]: I0128 21:02:51.933609 4746 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/nova-metadata-0"
Jan 28 21:02:51 crc kubenswrapper[4746]: I0128 21:02:51.958101 4746 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-5fd9b586ff-zstkj"]
Jan 28 21:02:52 crc kubenswrapper[4746]: I0128 21:02:52.094675 4746 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-b54d4\" (UniqueName: \"kubernetes.io/projected/8dc0a819-0fb7-4d64-a2f3-4e762be61026-kube-api-access-b54d4\") pod \"dnsmasq-dns-5fd9b586ff-zstkj\" (UID: \"8dc0a819-0fb7-4d64-a2f3-4e762be61026\") " pod="openstack/dnsmasq-dns-5fd9b586ff-zstkj"
Jan 28 21:02:52 crc kubenswrapper[4746]: I0128 21:02:52.094765 4746 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/8dc0a819-0fb7-4d64-a2f3-4e762be61026-ovsdbserver-nb\") pod \"dnsmasq-dns-5fd9b586ff-zstkj\" (UID: \"8dc0a819-0fb7-4d64-a2f3-4e762be61026\") " pod="openstack/dnsmasq-dns-5fd9b586ff-zstkj"
Jan 28 21:02:52 crc kubenswrapper[4746]: I0128 21:02:52.094843 4746 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/8dc0a819-0fb7-4d64-a2f3-4e762be61026-config\") pod \"dnsmasq-dns-5fd9b586ff-zstkj\" (UID: \"8dc0a819-0fb7-4d64-a2f3-4e762be61026\") " pod="openstack/dnsmasq-dns-5fd9b586ff-zstkj"
Jan 28 21:02:52 crc kubenswrapper[4746]: I0128 21:02:52.094870 4746 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/8dc0a819-0fb7-4d64-a2f3-4e762be61026-dns-swift-storage-0\") pod \"dnsmasq-dns-5fd9b586ff-zstkj\" (UID: \"8dc0a819-0fb7-4d64-a2f3-4e762be61026\") " pod="openstack/dnsmasq-dns-5fd9b586ff-zstkj"
Jan 28 21:02:52 crc kubenswrapper[4746]: I0128 21:02:52.094928 4746 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/8dc0a819-0fb7-4d64-a2f3-4e762be61026-ovsdbserver-sb\") pod \"dnsmasq-dns-5fd9b586ff-zstkj\" (UID: \"8dc0a819-0fb7-4d64-a2f3-4e762be61026\") " pod="openstack/dnsmasq-dns-5fd9b586ff-zstkj"
Jan 28 21:02:52 crc kubenswrapper[4746]: I0128 21:02:52.094973 4746 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/8dc0a819-0fb7-4d64-a2f3-4e762be61026-dns-svc\") pod \"dnsmasq-dns-5fd9b586ff-zstkj\" (UID: \"8dc0a819-0fb7-4d64-a2f3-4e762be61026\") " pod="openstack/dnsmasq-dns-5fd9b586ff-zstkj"
Jan 28 21:02:52 crc kubenswrapper[4746]: I0128 21:02:52.141551 4746 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/nova-metadata-0"
Jan 28 21:02:52 crc kubenswrapper[4746]: I0128 21:02:52.186762 4746 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/nova-metadata-0"
Jan 28 21:02:52 crc kubenswrapper[4746]: I0128 21:02:52.196325 4746 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/8dc0a819-0fb7-4d64-a2f3-4e762be61026-ovsdbserver-sb\") pod \"dnsmasq-dns-5fd9b586ff-zstkj\" (UID: \"8dc0a819-0fb7-4d64-a2f3-4e762be61026\") " pod="openstack/dnsmasq-dns-5fd9b586ff-zstkj"
Jan 28 21:02:52 crc kubenswrapper[4746]: I0128 21:02:52.196394 4746 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/8dc0a819-0fb7-4d64-a2f3-4e762be61026-dns-svc\") pod \"dnsmasq-dns-5fd9b586ff-zstkj\" (UID: \"8dc0a819-0fb7-4d64-a2f3-4e762be61026\") " pod="openstack/dnsmasq-dns-5fd9b586ff-zstkj"
Jan 28 21:02:52 crc kubenswrapper[4746]: I0128 21:02:52.196440 4746 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-b54d4\" (UniqueName: \"kubernetes.io/projected/8dc0a819-0fb7-4d64-a2f3-4e762be61026-kube-api-access-b54d4\") pod \"dnsmasq-dns-5fd9b586ff-zstkj\" (UID: \"8dc0a819-0fb7-4d64-a2f3-4e762be61026\") " pod="openstack/dnsmasq-dns-5fd9b586ff-zstkj"
Jan 28 21:02:52 crc kubenswrapper[4746]: I0128 21:02:52.196492 4746 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/8dc0a819-0fb7-4d64-a2f3-4e762be61026-ovsdbserver-nb\") pod \"dnsmasq-dns-5fd9b586ff-zstkj\" (UID: \"8dc0a819-0fb7-4d64-a2f3-4e762be61026\") " pod="openstack/dnsmasq-dns-5fd9b586ff-zstkj"
Jan 28 21:02:52 crc kubenswrapper[4746]: I0128 21:02:52.196544 4746 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/8dc0a819-0fb7-4d64-a2f3-4e762be61026-config\") pod \"dnsmasq-dns-5fd9b586ff-zstkj\" (UID: \"8dc0a819-0fb7-4d64-a2f3-4e762be61026\") " pod="openstack/dnsmasq-dns-5fd9b586ff-zstkj"
Jan 28 21:02:52 crc kubenswrapper[4746]: I0128 21:02:52.196564 4746 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/8dc0a819-0fb7-4d64-a2f3-4e762be61026-dns-swift-storage-0\") pod \"dnsmasq-dns-5fd9b586ff-zstkj\" (UID: \"8dc0a819-0fb7-4d64-a2f3-4e762be61026\") " pod="openstack/dnsmasq-dns-5fd9b586ff-zstkj"
Jan 28 21:02:52 crc kubenswrapper[4746]: I0128 21:02:52.197486 4746 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/8dc0a819-0fb7-4d64-a2f3-4e762be61026-dns-swift-storage-0\") pod \"dnsmasq-dns-5fd9b586ff-zstkj\" (UID: \"8dc0a819-0fb7-4d64-a2f3-4e762be61026\") " pod="openstack/dnsmasq-dns-5fd9b586ff-zstkj"
Jan 28 21:02:52 crc kubenswrapper[4746]: I0128 21:02:52.198606 4746 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/8dc0a819-0fb7-4d64-a2f3-4e762be61026-ovsdbserver-sb\") pod \"dnsmasq-dns-5fd9b586ff-zstkj\" (UID: \"8dc0a819-0fb7-4d64-a2f3-4e762be61026\") " pod="openstack/dnsmasq-dns-5fd9b586ff-zstkj"
Jan 28 21:02:52 crc kubenswrapper[4746]: I0128 21:02:52.199142 4746 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/8dc0a819-0fb7-4d64-a2f3-4e762be61026-dns-svc\") pod \"dnsmasq-dns-5fd9b586ff-zstkj\" (UID: \"8dc0a819-0fb7-4d64-a2f3-4e762be61026\") " pod="openstack/dnsmasq-dns-5fd9b586ff-zstkj"
Jan 28 21:02:52 crc kubenswrapper[4746]: I0128 21:02:52.199805 4746 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/8dc0a819-0fb7-4d64-a2f3-4e762be61026-ovsdbserver-nb\") pod \"dnsmasq-dns-5fd9b586ff-zstkj\" (UID: \"8dc0a819-0fb7-4d64-a2f3-4e762be61026\") " pod="openstack/dnsmasq-dns-5fd9b586ff-zstkj"
Jan 28 21:02:52 crc kubenswrapper[4746]: I0128 21:02:52.200354 4746 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/8dc0a819-0fb7-4d64-a2f3-4e762be61026-config\") pod \"dnsmasq-dns-5fd9b586ff-zstkj\" (UID: \"8dc0a819-0fb7-4d64-a2f3-4e762be61026\") " pod="openstack/dnsmasq-dns-5fd9b586ff-zstkj"
Jan 28 21:02:52 crc kubenswrapper[4746]: I0128 21:02:52.230383 4746 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-b54d4\" (UniqueName: \"kubernetes.io/projected/8dc0a819-0fb7-4d64-a2f3-4e762be61026-kube-api-access-b54d4\") pod \"dnsmasq-dns-5fd9b586ff-zstkj\" (UID: \"8dc0a819-0fb7-4d64-a2f3-4e762be61026\") " pod="openstack/dnsmasq-dns-5fd9b586ff-zstkj"
Jan 28 21:02:52 crc kubenswrapper[4746]: I0128 21:02:52.249437 4746 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-5fd9b586ff-zstkj"
Jan 28 21:02:52 crc kubenswrapper[4746]: I0128 21:02:52.829853 4746 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-5fd9b586ff-zstkj"]
Jan 28 21:02:53 crc kubenswrapper[4746]: I0128 21:02:53.654032 4746 generic.go:334] "Generic (PLEG): container finished" podID="8dc0a819-0fb7-4d64-a2f3-4e762be61026" containerID="99f3f9bede6349668311a33fbe8d25243120e4d3f65c682327304b86efe81700" exitCode=0
Jan 28 21:02:53 crc kubenswrapper[4746]: I0128 21:02:53.654238 4746 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-5fd9b586ff-zstkj" event={"ID":"8dc0a819-0fb7-4d64-a2f3-4e762be61026","Type":"ContainerDied","Data":"99f3f9bede6349668311a33fbe8d25243120e4d3f65c682327304b86efe81700"}
Jan 28 21:02:53 crc kubenswrapper[4746]: I0128 21:02:53.654451 4746 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-5fd9b586ff-zstkj" event={"ID":"8dc0a819-0fb7-4d64-a2f3-4e762be61026","Type":"ContainerStarted","Data":"3100f2210755488d4c4fa735b19dd7709134fbc5c927b02e3e0e00c3a899650e"}
Jan 28 21:02:54 crc kubenswrapper[4746]: I0128 21:02:54.357287 4746 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ceilometer-0"]
Jan 28 21:02:54 crc kubenswrapper[4746]: I0128 21:02:54.361316 4746 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="4f1d0c8d-975f-4547-ad62-7e836f6db0d5" containerName="ceilometer-central-agent" containerID="cri-o://d22798994d9d85a005557c5384c71e7c865d9945f49c70058a8f83c508bd9467"
gracePeriod=30 Jan 28 21:02:54 crc kubenswrapper[4746]: I0128 21:02:54.361566 4746 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="4f1d0c8d-975f-4547-ad62-7e836f6db0d5" containerName="proxy-httpd" containerID="cri-o://25ae8953e5f80f13d3cbc7318b281f2aeccc12e556185c9577598aff83adc74b" gracePeriod=30 Jan 28 21:02:54 crc kubenswrapper[4746]: I0128 21:02:54.361642 4746 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="4f1d0c8d-975f-4547-ad62-7e836f6db0d5" containerName="sg-core" containerID="cri-o://40abf9f07158df6c0a04c52f9b9b32c660b9406e8f52d7c3c271923dacda9e3a" gracePeriod=30 Jan 28 21:02:54 crc kubenswrapper[4746]: I0128 21:02:54.361703 4746 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="4f1d0c8d-975f-4547-ad62-7e836f6db0d5" containerName="ceilometer-notification-agent" containerID="cri-o://799c299015159163ac91e042ecc086de90ea53f16bef9fc8c2be68c7d25f6902" gracePeriod=30 Jan 28 21:02:54 crc kubenswrapper[4746]: I0128 21:02:54.379235 4746 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/ceilometer-0" podUID="4f1d0c8d-975f-4547-ad62-7e836f6db0d5" containerName="proxy-httpd" probeResult="failure" output="Get \"https://10.217.0.236:3000/\": read tcp 10.217.0.2:37186->10.217.0.236:3000: read: connection reset by peer" Jan 28 21:02:54 crc kubenswrapper[4746]: I0128 21:02:54.669035 4746 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-5fd9b586ff-zstkj" event={"ID":"8dc0a819-0fb7-4d64-a2f3-4e762be61026","Type":"ContainerStarted","Data":"bdd5e5ce30341333f996d8fd3b7934f5a86c197b203f920715526262a91ee325"} Jan 28 21:02:54 crc kubenswrapper[4746]: I0128 21:02:54.669493 4746 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/dnsmasq-dns-5fd9b586ff-zstkj" Jan 28 21:02:54 crc kubenswrapper[4746]: I0128 21:02:54.756500 4746 
pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/dnsmasq-dns-5fd9b586ff-zstkj" podStartSLOduration=3.756468667 podStartE2EDuration="3.756468667s" podCreationTimestamp="2026-01-28 21:02:51 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-28 21:02:54.700426249 +0000 UTC m=+1402.656612603" watchObservedRunningTime="2026-01-28 21:02:54.756468667 +0000 UTC m=+1402.712655021" Jan 28 21:02:54 crc kubenswrapper[4746]: I0128 21:02:54.757873 4746 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-api-0"] Jan 28 21:02:54 crc kubenswrapper[4746]: I0128 21:02:54.758138 4746 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-api-0" podUID="e3ee42d3-e89f-4e39-b37a-4f01000355a4" containerName="nova-api-api" containerID="cri-o://8f2f925c10bbd7275f49f2b27d72aa4d876f1f0d5efa8ac05c79fa1fcae80c1c" gracePeriod=30 Jan 28 21:02:54 crc kubenswrapper[4746]: I0128 21:02:54.758294 4746 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-api-0" podUID="e3ee42d3-e89f-4e39-b37a-4f01000355a4" containerName="nova-api-log" containerID="cri-o://d01d05176baf7ab85d89a66c810887542fc5aa90c182e9b02313ae59f6e754f4" gracePeriod=30 Jan 28 21:02:55 crc kubenswrapper[4746]: I0128 21:02:55.684358 4746 generic.go:334] "Generic (PLEG): container finished" podID="e3ee42d3-e89f-4e39-b37a-4f01000355a4" containerID="d01d05176baf7ab85d89a66c810887542fc5aa90c182e9b02313ae59f6e754f4" exitCode=143 Jan 28 21:02:55 crc kubenswrapper[4746]: I0128 21:02:55.684457 4746 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"e3ee42d3-e89f-4e39-b37a-4f01000355a4","Type":"ContainerDied","Data":"d01d05176baf7ab85d89a66c810887542fc5aa90c182e9b02313ae59f6e754f4"} Jan 28 21:02:55 crc kubenswrapper[4746]: I0128 21:02:55.689625 4746 generic.go:334] "Generic (PLEG): container 
finished" podID="4f1d0c8d-975f-4547-ad62-7e836f6db0d5" containerID="25ae8953e5f80f13d3cbc7318b281f2aeccc12e556185c9577598aff83adc74b" exitCode=0 Jan 28 21:02:55 crc kubenswrapper[4746]: I0128 21:02:55.689645 4746 generic.go:334] "Generic (PLEG): container finished" podID="4f1d0c8d-975f-4547-ad62-7e836f6db0d5" containerID="40abf9f07158df6c0a04c52f9b9b32c660b9406e8f52d7c3c271923dacda9e3a" exitCode=2 Jan 28 21:02:55 crc kubenswrapper[4746]: I0128 21:02:55.689652 4746 generic.go:334] "Generic (PLEG): container finished" podID="4f1d0c8d-975f-4547-ad62-7e836f6db0d5" containerID="d22798994d9d85a005557c5384c71e7c865d9945f49c70058a8f83c508bd9467" exitCode=0 Jan 28 21:02:55 crc kubenswrapper[4746]: I0128 21:02:55.689721 4746 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"4f1d0c8d-975f-4547-ad62-7e836f6db0d5","Type":"ContainerDied","Data":"25ae8953e5f80f13d3cbc7318b281f2aeccc12e556185c9577598aff83adc74b"} Jan 28 21:02:55 crc kubenswrapper[4746]: I0128 21:02:55.689765 4746 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"4f1d0c8d-975f-4547-ad62-7e836f6db0d5","Type":"ContainerDied","Data":"40abf9f07158df6c0a04c52f9b9b32c660b9406e8f52d7c3c271923dacda9e3a"} Jan 28 21:02:55 crc kubenswrapper[4746]: I0128 21:02:55.689779 4746 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"4f1d0c8d-975f-4547-ad62-7e836f6db0d5","Type":"ContainerDied","Data":"d22798994d9d85a005557c5384c71e7c865d9945f49c70058a8f83c508bd9467"} Jan 28 21:02:59 crc kubenswrapper[4746]: I0128 21:02:58.534506 4746 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-api-0" Jan 28 21:02:59 crc kubenswrapper[4746]: I0128 21:02:58.639292 4746 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-rlxcg\" (UniqueName: \"kubernetes.io/projected/e3ee42d3-e89f-4e39-b37a-4f01000355a4-kube-api-access-rlxcg\") pod \"e3ee42d3-e89f-4e39-b37a-4f01000355a4\" (UID: \"e3ee42d3-e89f-4e39-b37a-4f01000355a4\") " Jan 28 21:02:59 crc kubenswrapper[4746]: I0128 21:02:58.639522 4746 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/e3ee42d3-e89f-4e39-b37a-4f01000355a4-logs\") pod \"e3ee42d3-e89f-4e39-b37a-4f01000355a4\" (UID: \"e3ee42d3-e89f-4e39-b37a-4f01000355a4\") " Jan 28 21:02:59 crc kubenswrapper[4746]: I0128 21:02:58.639811 4746 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e3ee42d3-e89f-4e39-b37a-4f01000355a4-combined-ca-bundle\") pod \"e3ee42d3-e89f-4e39-b37a-4f01000355a4\" (UID: \"e3ee42d3-e89f-4e39-b37a-4f01000355a4\") " Jan 28 21:02:59 crc kubenswrapper[4746]: I0128 21:02:58.639845 4746 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/e3ee42d3-e89f-4e39-b37a-4f01000355a4-config-data\") pod \"e3ee42d3-e89f-4e39-b37a-4f01000355a4\" (UID: \"e3ee42d3-e89f-4e39-b37a-4f01000355a4\") " Jan 28 21:02:59 crc kubenswrapper[4746]: I0128 21:02:58.640366 4746 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/e3ee42d3-e89f-4e39-b37a-4f01000355a4-logs" (OuterVolumeSpecName: "logs") pod "e3ee42d3-e89f-4e39-b37a-4f01000355a4" (UID: "e3ee42d3-e89f-4e39-b37a-4f01000355a4"). InnerVolumeSpecName "logs". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 28 21:02:59 crc kubenswrapper[4746]: I0128 21:02:58.641135 4746 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/e3ee42d3-e89f-4e39-b37a-4f01000355a4-logs\") on node \"crc\" DevicePath \"\"" Jan 28 21:02:59 crc kubenswrapper[4746]: I0128 21:02:58.654863 4746 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/e3ee42d3-e89f-4e39-b37a-4f01000355a4-kube-api-access-rlxcg" (OuterVolumeSpecName: "kube-api-access-rlxcg") pod "e3ee42d3-e89f-4e39-b37a-4f01000355a4" (UID: "e3ee42d3-e89f-4e39-b37a-4f01000355a4"). InnerVolumeSpecName "kube-api-access-rlxcg". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 28 21:02:59 crc kubenswrapper[4746]: I0128 21:02:58.702593 4746 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/e3ee42d3-e89f-4e39-b37a-4f01000355a4-config-data" (OuterVolumeSpecName: "config-data") pod "e3ee42d3-e89f-4e39-b37a-4f01000355a4" (UID: "e3ee42d3-e89f-4e39-b37a-4f01000355a4"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 28 21:02:59 crc kubenswrapper[4746]: I0128 21:02:58.711951 4746 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/e3ee42d3-e89f-4e39-b37a-4f01000355a4-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "e3ee42d3-e89f-4e39-b37a-4f01000355a4" (UID: "e3ee42d3-e89f-4e39-b37a-4f01000355a4"). InnerVolumeSpecName "combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 28 21:02:59 crc kubenswrapper[4746]: I0128 21:02:58.740040 4746 generic.go:334] "Generic (PLEG): container finished" podID="e3ee42d3-e89f-4e39-b37a-4f01000355a4" containerID="8f2f925c10bbd7275f49f2b27d72aa4d876f1f0d5efa8ac05c79fa1fcae80c1c" exitCode=0 Jan 28 21:02:59 crc kubenswrapper[4746]: I0128 21:02:58.740113 4746 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"e3ee42d3-e89f-4e39-b37a-4f01000355a4","Type":"ContainerDied","Data":"8f2f925c10bbd7275f49f2b27d72aa4d876f1f0d5efa8ac05c79fa1fcae80c1c"} Jan 28 21:02:59 crc kubenswrapper[4746]: I0128 21:02:58.740145 4746 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"e3ee42d3-e89f-4e39-b37a-4f01000355a4","Type":"ContainerDied","Data":"a051b19a53544dc58685c3699fab406c84df579e296324343a88a2c38f1b366c"} Jan 28 21:02:59 crc kubenswrapper[4746]: I0128 21:02:58.740162 4746 scope.go:117] "RemoveContainer" containerID="8f2f925c10bbd7275f49f2b27d72aa4d876f1f0d5efa8ac05c79fa1fcae80c1c" Jan 28 21:02:59 crc kubenswrapper[4746]: I0128 21:02:58.740315 4746 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-api-0" Jan 28 21:02:59 crc kubenswrapper[4746]: I0128 21:02:58.745152 4746 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e3ee42d3-e89f-4e39-b37a-4f01000355a4-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Jan 28 21:02:59 crc kubenswrapper[4746]: I0128 21:02:58.745182 4746 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/e3ee42d3-e89f-4e39-b37a-4f01000355a4-config-data\") on node \"crc\" DevicePath \"\"" Jan 28 21:02:59 crc kubenswrapper[4746]: I0128 21:02:58.745200 4746 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-rlxcg\" (UniqueName: \"kubernetes.io/projected/e3ee42d3-e89f-4e39-b37a-4f01000355a4-kube-api-access-rlxcg\") on node \"crc\" DevicePath \"\"" Jan 28 21:02:59 crc kubenswrapper[4746]: I0128 21:02:58.825840 4746 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-api-0"] Jan 28 21:02:59 crc kubenswrapper[4746]: I0128 21:02:58.835516 4746 scope.go:117] "RemoveContainer" containerID="d01d05176baf7ab85d89a66c810887542fc5aa90c182e9b02313ae59f6e754f4" Jan 28 21:02:59 crc kubenswrapper[4746]: I0128 21:02:58.897229 4746 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-api-0"] Jan 28 21:02:59 crc kubenswrapper[4746]: I0128 21:02:58.897299 4746 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-api-0"] Jan 28 21:02:59 crc kubenswrapper[4746]: E0128 21:02:58.897818 4746 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="e3ee42d3-e89f-4e39-b37a-4f01000355a4" containerName="nova-api-api" Jan 28 21:02:59 crc kubenswrapper[4746]: I0128 21:02:58.897835 4746 state_mem.go:107] "Deleted CPUSet assignment" podUID="e3ee42d3-e89f-4e39-b37a-4f01000355a4" containerName="nova-api-api" Jan 28 21:02:59 crc kubenswrapper[4746]: E0128 21:02:58.897847 4746 cpu_manager.go:410] "RemoveStaleState: removing container" 
podUID="e3ee42d3-e89f-4e39-b37a-4f01000355a4" containerName="nova-api-log" Jan 28 21:02:59 crc kubenswrapper[4746]: I0128 21:02:58.897853 4746 state_mem.go:107] "Deleted CPUSet assignment" podUID="e3ee42d3-e89f-4e39-b37a-4f01000355a4" containerName="nova-api-log" Jan 28 21:02:59 crc kubenswrapper[4746]: I0128 21:02:58.898127 4746 memory_manager.go:354] "RemoveStaleState removing state" podUID="e3ee42d3-e89f-4e39-b37a-4f01000355a4" containerName="nova-api-log" Jan 28 21:02:59 crc kubenswrapper[4746]: I0128 21:02:58.898150 4746 memory_manager.go:354] "RemoveStaleState removing state" podUID="e3ee42d3-e89f-4e39-b37a-4f01000355a4" containerName="nova-api-api" Jan 28 21:02:59 crc kubenswrapper[4746]: I0128 21:02:58.899662 4746 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-api-0"] Jan 28 21:02:59 crc kubenswrapper[4746]: I0128 21:02:58.899855 4746 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-api-0" Jan 28 21:02:59 crc kubenswrapper[4746]: I0128 21:02:58.902578 4746 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-nova-internal-svc" Jan 28 21:02:59 crc kubenswrapper[4746]: I0128 21:02:58.902657 4746 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-api-config-data" Jan 28 21:02:59 crc kubenswrapper[4746]: I0128 21:02:58.902927 4746 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-nova-public-svc" Jan 28 21:02:59 crc kubenswrapper[4746]: I0128 21:02:58.940283 4746 scope.go:117] "RemoveContainer" containerID="8f2f925c10bbd7275f49f2b27d72aa4d876f1f0d5efa8ac05c79fa1fcae80c1c" Jan 28 21:02:59 crc kubenswrapper[4746]: E0128 21:02:58.943015 4746 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"8f2f925c10bbd7275f49f2b27d72aa4d876f1f0d5efa8ac05c79fa1fcae80c1c\": container with ID starting with 
8f2f925c10bbd7275f49f2b27d72aa4d876f1f0d5efa8ac05c79fa1fcae80c1c not found: ID does not exist" containerID="8f2f925c10bbd7275f49f2b27d72aa4d876f1f0d5efa8ac05c79fa1fcae80c1c" Jan 28 21:02:59 crc kubenswrapper[4746]: I0128 21:02:58.943072 4746 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"8f2f925c10bbd7275f49f2b27d72aa4d876f1f0d5efa8ac05c79fa1fcae80c1c"} err="failed to get container status \"8f2f925c10bbd7275f49f2b27d72aa4d876f1f0d5efa8ac05c79fa1fcae80c1c\": rpc error: code = NotFound desc = could not find container \"8f2f925c10bbd7275f49f2b27d72aa4d876f1f0d5efa8ac05c79fa1fcae80c1c\": container with ID starting with 8f2f925c10bbd7275f49f2b27d72aa4d876f1f0d5efa8ac05c79fa1fcae80c1c not found: ID does not exist" Jan 28 21:02:59 crc kubenswrapper[4746]: I0128 21:02:58.943151 4746 scope.go:117] "RemoveContainer" containerID="d01d05176baf7ab85d89a66c810887542fc5aa90c182e9b02313ae59f6e754f4" Jan 28 21:02:59 crc kubenswrapper[4746]: E0128 21:02:58.944702 4746 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"d01d05176baf7ab85d89a66c810887542fc5aa90c182e9b02313ae59f6e754f4\": container with ID starting with d01d05176baf7ab85d89a66c810887542fc5aa90c182e9b02313ae59f6e754f4 not found: ID does not exist" containerID="d01d05176baf7ab85d89a66c810887542fc5aa90c182e9b02313ae59f6e754f4" Jan 28 21:02:59 crc kubenswrapper[4746]: I0128 21:02:58.944734 4746 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"d01d05176baf7ab85d89a66c810887542fc5aa90c182e9b02313ae59f6e754f4"} err="failed to get container status \"d01d05176baf7ab85d89a66c810887542fc5aa90c182e9b02313ae59f6e754f4\": rpc error: code = NotFound desc = could not find container \"d01d05176baf7ab85d89a66c810887542fc5aa90c182e9b02313ae59f6e754f4\": container with ID starting with d01d05176baf7ab85d89a66c810887542fc5aa90c182e9b02313ae59f6e754f4 not found: ID does not 
exist" Jan 28 21:02:59 crc kubenswrapper[4746]: I0128 21:02:59.052343 4746 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-rgz97\" (UniqueName: \"kubernetes.io/projected/aca41824-3271-42e1-93f8-76a1a9000681-kube-api-access-rgz97\") pod \"nova-api-0\" (UID: \"aca41824-3271-42e1-93f8-76a1a9000681\") " pod="openstack/nova-api-0" Jan 28 21:02:59 crc kubenswrapper[4746]: I0128 21:02:59.052444 4746 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/aca41824-3271-42e1-93f8-76a1a9000681-config-data\") pod \"nova-api-0\" (UID: \"aca41824-3271-42e1-93f8-76a1a9000681\") " pod="openstack/nova-api-0" Jan 28 21:02:59 crc kubenswrapper[4746]: I0128 21:02:59.052497 4746 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/aca41824-3271-42e1-93f8-76a1a9000681-public-tls-certs\") pod \"nova-api-0\" (UID: \"aca41824-3271-42e1-93f8-76a1a9000681\") " pod="openstack/nova-api-0" Jan 28 21:02:59 crc kubenswrapper[4746]: I0128 21:02:59.052528 4746 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/aca41824-3271-42e1-93f8-76a1a9000681-internal-tls-certs\") pod \"nova-api-0\" (UID: \"aca41824-3271-42e1-93f8-76a1a9000681\") " pod="openstack/nova-api-0" Jan 28 21:02:59 crc kubenswrapper[4746]: I0128 21:02:59.052544 4746 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/aca41824-3271-42e1-93f8-76a1a9000681-logs\") pod \"nova-api-0\" (UID: \"aca41824-3271-42e1-93f8-76a1a9000681\") " pod="openstack/nova-api-0" Jan 28 21:02:59 crc kubenswrapper[4746]: I0128 21:02:59.052750 4746 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/aca41824-3271-42e1-93f8-76a1a9000681-combined-ca-bundle\") pod \"nova-api-0\" (UID: \"aca41824-3271-42e1-93f8-76a1a9000681\") " pod="openstack/nova-api-0" Jan 28 21:02:59 crc kubenswrapper[4746]: I0128 21:02:59.154176 4746 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/aca41824-3271-42e1-93f8-76a1a9000681-combined-ca-bundle\") pod \"nova-api-0\" (UID: \"aca41824-3271-42e1-93f8-76a1a9000681\") " pod="openstack/nova-api-0" Jan 28 21:02:59 crc kubenswrapper[4746]: I0128 21:02:59.154548 4746 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-rgz97\" (UniqueName: \"kubernetes.io/projected/aca41824-3271-42e1-93f8-76a1a9000681-kube-api-access-rgz97\") pod \"nova-api-0\" (UID: \"aca41824-3271-42e1-93f8-76a1a9000681\") " pod="openstack/nova-api-0" Jan 28 21:02:59 crc kubenswrapper[4746]: I0128 21:02:59.154595 4746 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/aca41824-3271-42e1-93f8-76a1a9000681-config-data\") pod \"nova-api-0\" (UID: \"aca41824-3271-42e1-93f8-76a1a9000681\") " pod="openstack/nova-api-0" Jan 28 21:02:59 crc kubenswrapper[4746]: I0128 21:02:59.154616 4746 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/aca41824-3271-42e1-93f8-76a1a9000681-public-tls-certs\") pod \"nova-api-0\" (UID: \"aca41824-3271-42e1-93f8-76a1a9000681\") " pod="openstack/nova-api-0" Jan 28 21:02:59 crc kubenswrapper[4746]: I0128 21:02:59.154647 4746 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/aca41824-3271-42e1-93f8-76a1a9000681-internal-tls-certs\") pod \"nova-api-0\" (UID: 
\"aca41824-3271-42e1-93f8-76a1a9000681\") " pod="openstack/nova-api-0" Jan 28 21:02:59 crc kubenswrapper[4746]: I0128 21:02:59.154669 4746 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/aca41824-3271-42e1-93f8-76a1a9000681-logs\") pod \"nova-api-0\" (UID: \"aca41824-3271-42e1-93f8-76a1a9000681\") " pod="openstack/nova-api-0" Jan 28 21:02:59 crc kubenswrapper[4746]: I0128 21:02:59.155965 4746 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/aca41824-3271-42e1-93f8-76a1a9000681-logs\") pod \"nova-api-0\" (UID: \"aca41824-3271-42e1-93f8-76a1a9000681\") " pod="openstack/nova-api-0" Jan 28 21:02:59 crc kubenswrapper[4746]: I0128 21:02:59.162346 4746 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/aca41824-3271-42e1-93f8-76a1a9000681-config-data\") pod \"nova-api-0\" (UID: \"aca41824-3271-42e1-93f8-76a1a9000681\") " pod="openstack/nova-api-0" Jan 28 21:02:59 crc kubenswrapper[4746]: I0128 21:02:59.164649 4746 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/aca41824-3271-42e1-93f8-76a1a9000681-combined-ca-bundle\") pod \"nova-api-0\" (UID: \"aca41824-3271-42e1-93f8-76a1a9000681\") " pod="openstack/nova-api-0" Jan 28 21:02:59 crc kubenswrapper[4746]: I0128 21:02:59.164761 4746 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/aca41824-3271-42e1-93f8-76a1a9000681-internal-tls-certs\") pod \"nova-api-0\" (UID: \"aca41824-3271-42e1-93f8-76a1a9000681\") " pod="openstack/nova-api-0" Jan 28 21:02:59 crc kubenswrapper[4746]: I0128 21:02:59.175104 4746 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-rgz97\" (UniqueName: 
\"kubernetes.io/projected/aca41824-3271-42e1-93f8-76a1a9000681-kube-api-access-rgz97\") pod \"nova-api-0\" (UID: \"aca41824-3271-42e1-93f8-76a1a9000681\") " pod="openstack/nova-api-0" Jan 28 21:02:59 crc kubenswrapper[4746]: I0128 21:02:59.182709 4746 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/aca41824-3271-42e1-93f8-76a1a9000681-public-tls-certs\") pod \"nova-api-0\" (UID: \"aca41824-3271-42e1-93f8-76a1a9000681\") " pod="openstack/nova-api-0" Jan 28 21:02:59 crc kubenswrapper[4746]: I0128 21:02:59.229770 4746 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-api-0" Jan 28 21:02:59 crc kubenswrapper[4746]: I0128 21:02:59.463571 4746 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0" Jan 28 21:02:59 crc kubenswrapper[4746]: I0128 21:02:59.565264 4746 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/4f1d0c8d-975f-4547-ad62-7e836f6db0d5-sg-core-conf-yaml\") pod \"4f1d0c8d-975f-4547-ad62-7e836f6db0d5\" (UID: \"4f1d0c8d-975f-4547-ad62-7e836f6db0d5\") " Jan 28 21:02:59 crc kubenswrapper[4746]: I0128 21:02:59.565387 4746 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/4f1d0c8d-975f-4547-ad62-7e836f6db0d5-log-httpd\") pod \"4f1d0c8d-975f-4547-ad62-7e836f6db0d5\" (UID: \"4f1d0c8d-975f-4547-ad62-7e836f6db0d5\") " Jan 28 21:02:59 crc kubenswrapper[4746]: I0128 21:02:59.565410 4746 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ceilometer-tls-certs\" (UniqueName: \"kubernetes.io/secret/4f1d0c8d-975f-4547-ad62-7e836f6db0d5-ceilometer-tls-certs\") pod \"4f1d0c8d-975f-4547-ad62-7e836f6db0d5\" (UID: \"4f1d0c8d-975f-4547-ad62-7e836f6db0d5\") " Jan 28 21:02:59 crc kubenswrapper[4746]: I0128 21:02:59.565428 
4746 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/4f1d0c8d-975f-4547-ad62-7e836f6db0d5-scripts\") pod \"4f1d0c8d-975f-4547-ad62-7e836f6db0d5\" (UID: \"4f1d0c8d-975f-4547-ad62-7e836f6db0d5\") " Jan 28 21:02:59 crc kubenswrapper[4746]: I0128 21:02:59.565515 4746 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-rlbz2\" (UniqueName: \"kubernetes.io/projected/4f1d0c8d-975f-4547-ad62-7e836f6db0d5-kube-api-access-rlbz2\") pod \"4f1d0c8d-975f-4547-ad62-7e836f6db0d5\" (UID: \"4f1d0c8d-975f-4547-ad62-7e836f6db0d5\") " Jan 28 21:02:59 crc kubenswrapper[4746]: I0128 21:02:59.565594 4746 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/4f1d0c8d-975f-4547-ad62-7e836f6db0d5-config-data\") pod \"4f1d0c8d-975f-4547-ad62-7e836f6db0d5\" (UID: \"4f1d0c8d-975f-4547-ad62-7e836f6db0d5\") " Jan 28 21:02:59 crc kubenswrapper[4746]: I0128 21:02:59.565652 4746 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/4f1d0c8d-975f-4547-ad62-7e836f6db0d5-run-httpd\") pod \"4f1d0c8d-975f-4547-ad62-7e836f6db0d5\" (UID: \"4f1d0c8d-975f-4547-ad62-7e836f6db0d5\") " Jan 28 21:02:59 crc kubenswrapper[4746]: I0128 21:02:59.565755 4746 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/4f1d0c8d-975f-4547-ad62-7e836f6db0d5-combined-ca-bundle\") pod \"4f1d0c8d-975f-4547-ad62-7e836f6db0d5\" (UID: \"4f1d0c8d-975f-4547-ad62-7e836f6db0d5\") " Jan 28 21:02:59 crc kubenswrapper[4746]: I0128 21:02:59.569282 4746 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/4f1d0c8d-975f-4547-ad62-7e836f6db0d5-log-httpd" (OuterVolumeSpecName: "log-httpd") pod "4f1d0c8d-975f-4547-ad62-7e836f6db0d5" (UID: 
"4f1d0c8d-975f-4547-ad62-7e836f6db0d5"). InnerVolumeSpecName "log-httpd". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 28 21:02:59 crc kubenswrapper[4746]: I0128 21:02:59.571692 4746 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/4f1d0c8d-975f-4547-ad62-7e836f6db0d5-run-httpd" (OuterVolumeSpecName: "run-httpd") pod "4f1d0c8d-975f-4547-ad62-7e836f6db0d5" (UID: "4f1d0c8d-975f-4547-ad62-7e836f6db0d5"). InnerVolumeSpecName "run-httpd". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 28 21:02:59 crc kubenswrapper[4746]: I0128 21:02:59.572960 4746 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/4f1d0c8d-975f-4547-ad62-7e836f6db0d5-scripts" (OuterVolumeSpecName: "scripts") pod "4f1d0c8d-975f-4547-ad62-7e836f6db0d5" (UID: "4f1d0c8d-975f-4547-ad62-7e836f6db0d5"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 28 21:02:59 crc kubenswrapper[4746]: I0128 21:02:59.575651 4746 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/4f1d0c8d-975f-4547-ad62-7e836f6db0d5-kube-api-access-rlbz2" (OuterVolumeSpecName: "kube-api-access-rlbz2") pod "4f1d0c8d-975f-4547-ad62-7e836f6db0d5" (UID: "4f1d0c8d-975f-4547-ad62-7e836f6db0d5"). InnerVolumeSpecName "kube-api-access-rlbz2". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 28 21:02:59 crc kubenswrapper[4746]: I0128 21:02:59.611707 4746 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/4f1d0c8d-975f-4547-ad62-7e836f6db0d5-sg-core-conf-yaml" (OuterVolumeSpecName: "sg-core-conf-yaml") pod "4f1d0c8d-975f-4547-ad62-7e836f6db0d5" (UID: "4f1d0c8d-975f-4547-ad62-7e836f6db0d5"). InnerVolumeSpecName "sg-core-conf-yaml". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 28 21:02:59 crc kubenswrapper[4746]: I0128 21:02:59.668709 4746 reconciler_common.go:293] "Volume detached for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/4f1d0c8d-975f-4547-ad62-7e836f6db0d5-log-httpd\") on node \"crc\" DevicePath \"\"" Jan 28 21:02:59 crc kubenswrapper[4746]: I0128 21:02:59.668735 4746 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/4f1d0c8d-975f-4547-ad62-7e836f6db0d5-scripts\") on node \"crc\" DevicePath \"\"" Jan 28 21:02:59 crc kubenswrapper[4746]: I0128 21:02:59.668744 4746 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-rlbz2\" (UniqueName: \"kubernetes.io/projected/4f1d0c8d-975f-4547-ad62-7e836f6db0d5-kube-api-access-rlbz2\") on node \"crc\" DevicePath \"\"" Jan 28 21:02:59 crc kubenswrapper[4746]: I0128 21:02:59.668753 4746 reconciler_common.go:293] "Volume detached for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/4f1d0c8d-975f-4547-ad62-7e836f6db0d5-run-httpd\") on node \"crc\" DevicePath \"\"" Jan 28 21:02:59 crc kubenswrapper[4746]: I0128 21:02:59.668763 4746 reconciler_common.go:293] "Volume detached for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/4f1d0c8d-975f-4547-ad62-7e836f6db0d5-sg-core-conf-yaml\") on node \"crc\" DevicePath \"\"" Jan 28 21:02:59 crc kubenswrapper[4746]: I0128 21:02:59.714290 4746 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/4f1d0c8d-975f-4547-ad62-7e836f6db0d5-ceilometer-tls-certs" (OuterVolumeSpecName: "ceilometer-tls-certs") pod "4f1d0c8d-975f-4547-ad62-7e836f6db0d5" (UID: "4f1d0c8d-975f-4547-ad62-7e836f6db0d5"). InnerVolumeSpecName "ceilometer-tls-certs". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 28 21:02:59 crc kubenswrapper[4746]: I0128 21:02:59.720313 4746 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/4f1d0c8d-975f-4547-ad62-7e836f6db0d5-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "4f1d0c8d-975f-4547-ad62-7e836f6db0d5" (UID: "4f1d0c8d-975f-4547-ad62-7e836f6db0d5"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 28 21:02:59 crc kubenswrapper[4746]: I0128 21:02:59.771379 4746 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/4f1d0c8d-975f-4547-ad62-7e836f6db0d5-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Jan 28 21:02:59 crc kubenswrapper[4746]: I0128 21:02:59.771424 4746 reconciler_common.go:293] "Volume detached for volume \"ceilometer-tls-certs\" (UniqueName: \"kubernetes.io/secret/4f1d0c8d-975f-4547-ad62-7e836f6db0d5-ceilometer-tls-certs\") on node \"crc\" DevicePath \"\"" Jan 28 21:02:59 crc kubenswrapper[4746]: I0128 21:02:59.792235 4746 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/4f1d0c8d-975f-4547-ad62-7e836f6db0d5-config-data" (OuterVolumeSpecName: "config-data") pod "4f1d0c8d-975f-4547-ad62-7e836f6db0d5" (UID: "4f1d0c8d-975f-4547-ad62-7e836f6db0d5"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 28 21:02:59 crc kubenswrapper[4746]: I0128 21:02:59.805631 4746 generic.go:334] "Generic (PLEG): container finished" podID="4f1d0c8d-975f-4547-ad62-7e836f6db0d5" containerID="799c299015159163ac91e042ecc086de90ea53f16bef9fc8c2be68c7d25f6902" exitCode=0 Jan 28 21:02:59 crc kubenswrapper[4746]: I0128 21:02:59.805725 4746 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"4f1d0c8d-975f-4547-ad62-7e836f6db0d5","Type":"ContainerDied","Data":"799c299015159163ac91e042ecc086de90ea53f16bef9fc8c2be68c7d25f6902"} Jan 28 21:02:59 crc kubenswrapper[4746]: I0128 21:02:59.805771 4746 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"4f1d0c8d-975f-4547-ad62-7e836f6db0d5","Type":"ContainerDied","Data":"a40af6cbefe4f62eb09f84e77613ac10b151f8ff99bb29ed9ef4611a870eaca5"} Jan 28 21:02:59 crc kubenswrapper[4746]: I0128 21:02:59.805794 4746 scope.go:117] "RemoveContainer" containerID="25ae8953e5f80f13d3cbc7318b281f2aeccc12e556185c9577598aff83adc74b" Jan 28 21:02:59 crc kubenswrapper[4746]: I0128 21:02:59.806063 4746 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/ceilometer-0" Jan 28 21:02:59 crc kubenswrapper[4746]: I0128 21:02:59.874721 4746 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/4f1d0c8d-975f-4547-ad62-7e836f6db0d5-config-data\") on node \"crc\" DevicePath \"\"" Jan 28 21:02:59 crc kubenswrapper[4746]: I0128 21:02:59.880380 4746 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ceilometer-0"] Jan 28 21:02:59 crc kubenswrapper[4746]: I0128 21:02:59.880560 4746 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-operators-7826h" Jan 28 21:02:59 crc kubenswrapper[4746]: I0128 21:02:59.882443 4746 scope.go:117] "RemoveContainer" containerID="40abf9f07158df6c0a04c52f9b9b32c660b9406e8f52d7c3c271923dacda9e3a" Jan 28 21:02:59 crc kubenswrapper[4746]: I0128 21:02:59.910559 4746 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-api-0"] Jan 28 21:02:59 crc kubenswrapper[4746]: I0128 21:02:59.956190 4746 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/ceilometer-0"] Jan 28 21:02:59 crc kubenswrapper[4746]: I0128 21:02:59.979667 4746 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-operators-7826h" Jan 28 21:02:59 crc kubenswrapper[4746]: I0128 21:02:59.989023 4746 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ceilometer-0"] Jan 28 21:02:59 crc kubenswrapper[4746]: E0128 21:02:59.990726 4746 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="4f1d0c8d-975f-4547-ad62-7e836f6db0d5" containerName="ceilometer-notification-agent" Jan 28 21:02:59 crc kubenswrapper[4746]: I0128 21:02:59.990744 4746 state_mem.go:107] "Deleted CPUSet assignment" podUID="4f1d0c8d-975f-4547-ad62-7e836f6db0d5" containerName="ceilometer-notification-agent" Jan 28 21:02:59 crc kubenswrapper[4746]: E0128 21:02:59.990759 4746 cpu_manager.go:410] "RemoveStaleState: removing 
container" podUID="4f1d0c8d-975f-4547-ad62-7e836f6db0d5" containerName="sg-core" Jan 28 21:02:59 crc kubenswrapper[4746]: I0128 21:02:59.990766 4746 state_mem.go:107] "Deleted CPUSet assignment" podUID="4f1d0c8d-975f-4547-ad62-7e836f6db0d5" containerName="sg-core" Jan 28 21:02:59 crc kubenswrapper[4746]: E0128 21:02:59.990779 4746 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="4f1d0c8d-975f-4547-ad62-7e836f6db0d5" containerName="proxy-httpd" Jan 28 21:02:59 crc kubenswrapper[4746]: I0128 21:02:59.990805 4746 state_mem.go:107] "Deleted CPUSet assignment" podUID="4f1d0c8d-975f-4547-ad62-7e836f6db0d5" containerName="proxy-httpd" Jan 28 21:02:59 crc kubenswrapper[4746]: E0128 21:02:59.990905 4746 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="4f1d0c8d-975f-4547-ad62-7e836f6db0d5" containerName="ceilometer-central-agent" Jan 28 21:02:59 crc kubenswrapper[4746]: I0128 21:02:59.990982 4746 state_mem.go:107] "Deleted CPUSet assignment" podUID="4f1d0c8d-975f-4547-ad62-7e836f6db0d5" containerName="ceilometer-central-agent" Jan 28 21:02:59 crc kubenswrapper[4746]: I0128 21:02:59.991440 4746 memory_manager.go:354] "RemoveStaleState removing state" podUID="4f1d0c8d-975f-4547-ad62-7e836f6db0d5" containerName="sg-core" Jan 28 21:02:59 crc kubenswrapper[4746]: I0128 21:02:59.991477 4746 memory_manager.go:354] "RemoveStaleState removing state" podUID="4f1d0c8d-975f-4547-ad62-7e836f6db0d5" containerName="ceilometer-notification-agent" Jan 28 21:02:59 crc kubenswrapper[4746]: I0128 21:02:59.991525 4746 memory_manager.go:354] "RemoveStaleState removing state" podUID="4f1d0c8d-975f-4547-ad62-7e836f6db0d5" containerName="ceilometer-central-agent" Jan 28 21:02:59 crc kubenswrapper[4746]: I0128 21:02:59.991537 4746 memory_manager.go:354] "RemoveStaleState removing state" podUID="4f1d0c8d-975f-4547-ad62-7e836f6db0d5" containerName="proxy-httpd" Jan 28 21:02:59 crc kubenswrapper[4746]: I0128 21:02:59.992265 4746 scope.go:117] "RemoveContainer" 
containerID="799c299015159163ac91e042ecc086de90ea53f16bef9fc8c2be68c7d25f6902" Jan 28 21:02:59 crc kubenswrapper[4746]: I0128 21:02:59.998491 4746 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0" Jan 28 21:03:00 crc kubenswrapper[4746]: I0128 21:03:00.002611 4746 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceilometer-scripts" Jan 28 21:03:00 crc kubenswrapper[4746]: I0128 21:03:00.002833 4746 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceilometer-config-data" Jan 28 21:03:00 crc kubenswrapper[4746]: I0128 21:03:00.003892 4746 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-ceilometer-internal-svc" Jan 28 21:03:00 crc kubenswrapper[4746]: I0128 21:03:00.035219 4746 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ceilometer-0"] Jan 28 21:03:00 crc kubenswrapper[4746]: I0128 21:03:00.091684 4746 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/a4612060-a99e-4bc4-b074-2dffa9cc7050-scripts\") pod \"ceilometer-0\" (UID: \"a4612060-a99e-4bc4-b074-2dffa9cc7050\") " pod="openstack/ceilometer-0" Jan 28 21:03:00 crc kubenswrapper[4746]: I0128 21:03:00.091766 4746 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/a4612060-a99e-4bc4-b074-2dffa9cc7050-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"a4612060-a99e-4bc4-b074-2dffa9cc7050\") " pod="openstack/ceilometer-0" Jan 28 21:03:00 crc kubenswrapper[4746]: I0128 21:03:00.091817 4746 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/a4612060-a99e-4bc4-b074-2dffa9cc7050-config-data\") pod \"ceilometer-0\" (UID: \"a4612060-a99e-4bc4-b074-2dffa9cc7050\") " 
pod="openstack/ceilometer-0" Jan 28 21:03:00 crc kubenswrapper[4746]: I0128 21:03:00.091872 4746 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/a4612060-a99e-4bc4-b074-2dffa9cc7050-log-httpd\") pod \"ceilometer-0\" (UID: \"a4612060-a99e-4bc4-b074-2dffa9cc7050\") " pod="openstack/ceilometer-0" Jan 28 21:03:00 crc kubenswrapper[4746]: I0128 21:03:00.091915 4746 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/a4612060-a99e-4bc4-b074-2dffa9cc7050-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"a4612060-a99e-4bc4-b074-2dffa9cc7050\") " pod="openstack/ceilometer-0" Jan 28 21:03:00 crc kubenswrapper[4746]: I0128 21:03:00.091936 4746 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ceilometer-tls-certs\" (UniqueName: \"kubernetes.io/secret/a4612060-a99e-4bc4-b074-2dffa9cc7050-ceilometer-tls-certs\") pod \"ceilometer-0\" (UID: \"a4612060-a99e-4bc4-b074-2dffa9cc7050\") " pod="openstack/ceilometer-0" Jan 28 21:03:00 crc kubenswrapper[4746]: I0128 21:03:00.091964 4746 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/a4612060-a99e-4bc4-b074-2dffa9cc7050-run-httpd\") pod \"ceilometer-0\" (UID: \"a4612060-a99e-4bc4-b074-2dffa9cc7050\") " pod="openstack/ceilometer-0" Jan 28 21:03:00 crc kubenswrapper[4746]: I0128 21:03:00.091992 4746 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-8xxvw\" (UniqueName: \"kubernetes.io/projected/a4612060-a99e-4bc4-b074-2dffa9cc7050-kube-api-access-8xxvw\") pod \"ceilometer-0\" (UID: \"a4612060-a99e-4bc4-b074-2dffa9cc7050\") " pod="openstack/ceilometer-0" Jan 28 21:03:00 crc kubenswrapper[4746]: I0128 21:03:00.136814 4746 scope.go:117] 
"RemoveContainer" containerID="d22798994d9d85a005557c5384c71e7c865d9945f49c70058a8f83c508bd9467" Jan 28 21:03:00 crc kubenswrapper[4746]: I0128 21:03:00.155257 4746 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-7826h"] Jan 28 21:03:00 crc kubenswrapper[4746]: I0128 21:03:00.182976 4746 scope.go:117] "RemoveContainer" containerID="25ae8953e5f80f13d3cbc7318b281f2aeccc12e556185c9577598aff83adc74b" Jan 28 21:03:00 crc kubenswrapper[4746]: E0128 21:03:00.183761 4746 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"25ae8953e5f80f13d3cbc7318b281f2aeccc12e556185c9577598aff83adc74b\": container with ID starting with 25ae8953e5f80f13d3cbc7318b281f2aeccc12e556185c9577598aff83adc74b not found: ID does not exist" containerID="25ae8953e5f80f13d3cbc7318b281f2aeccc12e556185c9577598aff83adc74b" Jan 28 21:03:00 crc kubenswrapper[4746]: I0128 21:03:00.183807 4746 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"25ae8953e5f80f13d3cbc7318b281f2aeccc12e556185c9577598aff83adc74b"} err="failed to get container status \"25ae8953e5f80f13d3cbc7318b281f2aeccc12e556185c9577598aff83adc74b\": rpc error: code = NotFound desc = could not find container \"25ae8953e5f80f13d3cbc7318b281f2aeccc12e556185c9577598aff83adc74b\": container with ID starting with 25ae8953e5f80f13d3cbc7318b281f2aeccc12e556185c9577598aff83adc74b not found: ID does not exist" Jan 28 21:03:00 crc kubenswrapper[4746]: I0128 21:03:00.183832 4746 scope.go:117] "RemoveContainer" containerID="40abf9f07158df6c0a04c52f9b9b32c660b9406e8f52d7c3c271923dacda9e3a" Jan 28 21:03:00 crc kubenswrapper[4746]: E0128 21:03:00.184368 4746 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"40abf9f07158df6c0a04c52f9b9b32c660b9406e8f52d7c3c271923dacda9e3a\": container with ID starting with 
40abf9f07158df6c0a04c52f9b9b32c660b9406e8f52d7c3c271923dacda9e3a not found: ID does not exist" containerID="40abf9f07158df6c0a04c52f9b9b32c660b9406e8f52d7c3c271923dacda9e3a" Jan 28 21:03:00 crc kubenswrapper[4746]: I0128 21:03:00.184398 4746 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"40abf9f07158df6c0a04c52f9b9b32c660b9406e8f52d7c3c271923dacda9e3a"} err="failed to get container status \"40abf9f07158df6c0a04c52f9b9b32c660b9406e8f52d7c3c271923dacda9e3a\": rpc error: code = NotFound desc = could not find container \"40abf9f07158df6c0a04c52f9b9b32c660b9406e8f52d7c3c271923dacda9e3a\": container with ID starting with 40abf9f07158df6c0a04c52f9b9b32c660b9406e8f52d7c3c271923dacda9e3a not found: ID does not exist" Jan 28 21:03:00 crc kubenswrapper[4746]: I0128 21:03:00.184420 4746 scope.go:117] "RemoveContainer" containerID="799c299015159163ac91e042ecc086de90ea53f16bef9fc8c2be68c7d25f6902" Jan 28 21:03:00 crc kubenswrapper[4746]: E0128 21:03:00.184781 4746 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"799c299015159163ac91e042ecc086de90ea53f16bef9fc8c2be68c7d25f6902\": container with ID starting with 799c299015159163ac91e042ecc086de90ea53f16bef9fc8c2be68c7d25f6902 not found: ID does not exist" containerID="799c299015159163ac91e042ecc086de90ea53f16bef9fc8c2be68c7d25f6902" Jan 28 21:03:00 crc kubenswrapper[4746]: I0128 21:03:00.184797 4746 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"799c299015159163ac91e042ecc086de90ea53f16bef9fc8c2be68c7d25f6902"} err="failed to get container status \"799c299015159163ac91e042ecc086de90ea53f16bef9fc8c2be68c7d25f6902\": rpc error: code = NotFound desc = could not find container \"799c299015159163ac91e042ecc086de90ea53f16bef9fc8c2be68c7d25f6902\": container with ID starting with 799c299015159163ac91e042ecc086de90ea53f16bef9fc8c2be68c7d25f6902 not found: ID does not 
exist" Jan 28 21:03:00 crc kubenswrapper[4746]: I0128 21:03:00.184810 4746 scope.go:117] "RemoveContainer" containerID="d22798994d9d85a005557c5384c71e7c865d9945f49c70058a8f83c508bd9467" Jan 28 21:03:00 crc kubenswrapper[4746]: E0128 21:03:00.184986 4746 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"d22798994d9d85a005557c5384c71e7c865d9945f49c70058a8f83c508bd9467\": container with ID starting with d22798994d9d85a005557c5384c71e7c865d9945f49c70058a8f83c508bd9467 not found: ID does not exist" containerID="d22798994d9d85a005557c5384c71e7c865d9945f49c70058a8f83c508bd9467" Jan 28 21:03:00 crc kubenswrapper[4746]: I0128 21:03:00.185000 4746 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"d22798994d9d85a005557c5384c71e7c865d9945f49c70058a8f83c508bd9467"} err="failed to get container status \"d22798994d9d85a005557c5384c71e7c865d9945f49c70058a8f83c508bd9467\": rpc error: code = NotFound desc = could not find container \"d22798994d9d85a005557c5384c71e7c865d9945f49c70058a8f83c508bd9467\": container with ID starting with d22798994d9d85a005557c5384c71e7c865d9945f49c70058a8f83c508bd9467 not found: ID does not exist" Jan 28 21:03:00 crc kubenswrapper[4746]: I0128 21:03:00.193390 4746 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/a4612060-a99e-4bc4-b074-2dffa9cc7050-log-httpd\") pod \"ceilometer-0\" (UID: \"a4612060-a99e-4bc4-b074-2dffa9cc7050\") " pod="openstack/ceilometer-0" Jan 28 21:03:00 crc kubenswrapper[4746]: I0128 21:03:00.193453 4746 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/a4612060-a99e-4bc4-b074-2dffa9cc7050-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"a4612060-a99e-4bc4-b074-2dffa9cc7050\") " pod="openstack/ceilometer-0" Jan 28 21:03:00 crc kubenswrapper[4746]: I0128 
21:03:00.193474 4746 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ceilometer-tls-certs\" (UniqueName: \"kubernetes.io/secret/a4612060-a99e-4bc4-b074-2dffa9cc7050-ceilometer-tls-certs\") pod \"ceilometer-0\" (UID: \"a4612060-a99e-4bc4-b074-2dffa9cc7050\") " pod="openstack/ceilometer-0" Jan 28 21:03:00 crc kubenswrapper[4746]: I0128 21:03:00.193500 4746 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/a4612060-a99e-4bc4-b074-2dffa9cc7050-run-httpd\") pod \"ceilometer-0\" (UID: \"a4612060-a99e-4bc4-b074-2dffa9cc7050\") " pod="openstack/ceilometer-0" Jan 28 21:03:00 crc kubenswrapper[4746]: I0128 21:03:00.193533 4746 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-8xxvw\" (UniqueName: \"kubernetes.io/projected/a4612060-a99e-4bc4-b074-2dffa9cc7050-kube-api-access-8xxvw\") pod \"ceilometer-0\" (UID: \"a4612060-a99e-4bc4-b074-2dffa9cc7050\") " pod="openstack/ceilometer-0" Jan 28 21:03:00 crc kubenswrapper[4746]: I0128 21:03:00.193579 4746 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/a4612060-a99e-4bc4-b074-2dffa9cc7050-scripts\") pod \"ceilometer-0\" (UID: \"a4612060-a99e-4bc4-b074-2dffa9cc7050\") " pod="openstack/ceilometer-0" Jan 28 21:03:00 crc kubenswrapper[4746]: I0128 21:03:00.193619 4746 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/a4612060-a99e-4bc4-b074-2dffa9cc7050-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"a4612060-a99e-4bc4-b074-2dffa9cc7050\") " pod="openstack/ceilometer-0" Jan 28 21:03:00 crc kubenswrapper[4746]: I0128 21:03:00.193657 4746 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/a4612060-a99e-4bc4-b074-2dffa9cc7050-config-data\") pod 
\"ceilometer-0\" (UID: \"a4612060-a99e-4bc4-b074-2dffa9cc7050\") " pod="openstack/ceilometer-0" Jan 28 21:03:00 crc kubenswrapper[4746]: I0128 21:03:00.195361 4746 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/a4612060-a99e-4bc4-b074-2dffa9cc7050-run-httpd\") pod \"ceilometer-0\" (UID: \"a4612060-a99e-4bc4-b074-2dffa9cc7050\") " pod="openstack/ceilometer-0" Jan 28 21:03:00 crc kubenswrapper[4746]: I0128 21:03:00.196390 4746 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/a4612060-a99e-4bc4-b074-2dffa9cc7050-log-httpd\") pod \"ceilometer-0\" (UID: \"a4612060-a99e-4bc4-b074-2dffa9cc7050\") " pod="openstack/ceilometer-0" Jan 28 21:03:00 crc kubenswrapper[4746]: I0128 21:03:00.200239 4746 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ceilometer-tls-certs\" (UniqueName: \"kubernetes.io/secret/a4612060-a99e-4bc4-b074-2dffa9cc7050-ceilometer-tls-certs\") pod \"ceilometer-0\" (UID: \"a4612060-a99e-4bc4-b074-2dffa9cc7050\") " pod="openstack/ceilometer-0" Jan 28 21:03:00 crc kubenswrapper[4746]: I0128 21:03:00.200917 4746 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/a4612060-a99e-4bc4-b074-2dffa9cc7050-config-data\") pod \"ceilometer-0\" (UID: \"a4612060-a99e-4bc4-b074-2dffa9cc7050\") " pod="openstack/ceilometer-0" Jan 28 21:03:00 crc kubenswrapper[4746]: I0128 21:03:00.202646 4746 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/a4612060-a99e-4bc4-b074-2dffa9cc7050-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"a4612060-a99e-4bc4-b074-2dffa9cc7050\") " pod="openstack/ceilometer-0" Jan 28 21:03:00 crc kubenswrapper[4746]: I0128 21:03:00.202854 4746 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: 
\"kubernetes.io/secret/a4612060-a99e-4bc4-b074-2dffa9cc7050-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"a4612060-a99e-4bc4-b074-2dffa9cc7050\") " pod="openstack/ceilometer-0" Jan 28 21:03:00 crc kubenswrapper[4746]: I0128 21:03:00.203041 4746 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/a4612060-a99e-4bc4-b074-2dffa9cc7050-scripts\") pod \"ceilometer-0\" (UID: \"a4612060-a99e-4bc4-b074-2dffa9cc7050\") " pod="openstack/ceilometer-0" Jan 28 21:03:00 crc kubenswrapper[4746]: I0128 21:03:00.214682 4746 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-8xxvw\" (UniqueName: \"kubernetes.io/projected/a4612060-a99e-4bc4-b074-2dffa9cc7050-kube-api-access-8xxvw\") pod \"ceilometer-0\" (UID: \"a4612060-a99e-4bc4-b074-2dffa9cc7050\") " pod="openstack/ceilometer-0" Jan 28 21:03:00 crc kubenswrapper[4746]: I0128 21:03:00.449004 4746 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0" Jan 28 21:03:00 crc kubenswrapper[4746]: I0128 21:03:00.818195 4746 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"aca41824-3271-42e1-93f8-76a1a9000681","Type":"ContainerStarted","Data":"ce25b0ca56029d94cd2934f9877b7ee1798eb389808102ec3db07f197e39e4fa"} Jan 28 21:03:00 crc kubenswrapper[4746]: I0128 21:03:00.818835 4746 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"aca41824-3271-42e1-93f8-76a1a9000681","Type":"ContainerStarted","Data":"7240900c0d9db4cf740de00720b4ea442e011cd3750ca338f4c5d4744b513270"} Jan 28 21:03:00 crc kubenswrapper[4746]: I0128 21:03:00.818858 4746 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"aca41824-3271-42e1-93f8-76a1a9000681","Type":"ContainerStarted","Data":"45f749bdf075c4ed529fa0cc6794d3e229f9e29e2f5c953784da82f0860a86d0"} Jan 28 21:03:00 crc kubenswrapper[4746]: I0128 21:03:00.847899 
4746 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="4f1d0c8d-975f-4547-ad62-7e836f6db0d5" path="/var/lib/kubelet/pods/4f1d0c8d-975f-4547-ad62-7e836f6db0d5/volumes" Jan 28 21:03:00 crc kubenswrapper[4746]: I0128 21:03:00.848588 4746 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-api-0" podStartSLOduration=2.848549546 podStartE2EDuration="2.848549546s" podCreationTimestamp="2026-01-28 21:02:58 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-28 21:03:00.8375305 +0000 UTC m=+1408.793716854" watchObservedRunningTime="2026-01-28 21:03:00.848549546 +0000 UTC m=+1408.804735900" Jan 28 21:03:00 crc kubenswrapper[4746]: I0128 21:03:00.848724 4746 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="e3ee42d3-e89f-4e39-b37a-4f01000355a4" path="/var/lib/kubelet/pods/e3ee42d3-e89f-4e39-b37a-4f01000355a4/volumes" Jan 28 21:03:00 crc kubenswrapper[4746]: W0128 21:03:00.929413 4746 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-poda4612060_a99e_4bc4_b074_2dffa9cc7050.slice/crio-aacb0d11ae88afab424fadd80d351951f1408d32707132acbca75a25812d3cf7 WatchSource:0}: Error finding container aacb0d11ae88afab424fadd80d351951f1408d32707132acbca75a25812d3cf7: Status 404 returned error can't find the container with id aacb0d11ae88afab424fadd80d351951f1408d32707132acbca75a25812d3cf7 Jan 28 21:03:00 crc kubenswrapper[4746]: I0128 21:03:00.935250 4746 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ceilometer-0"] Jan 28 21:03:01 crc kubenswrapper[4746]: I0128 21:03:01.837719 4746 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"a4612060-a99e-4bc4-b074-2dffa9cc7050","Type":"ContainerStarted","Data":"870712e1d90ee565a39d018ffd7c3ff66ecc4586dfd9bea7574cbf87b0b91b88"} Jan 28 21:03:01 crc 
kubenswrapper[4746]: I0128 21:03:01.837960 4746 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"a4612060-a99e-4bc4-b074-2dffa9cc7050","Type":"ContainerStarted","Data":"aacb0d11ae88afab424fadd80d351951f1408d32707132acbca75a25812d3cf7"} Jan 28 21:03:01 crc kubenswrapper[4746]: I0128 21:03:01.838141 4746 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-operators-7826h" podUID="04eb3dcd-859e-4fad-829e-c24bf2c954b4" containerName="registry-server" containerID="cri-o://97a21ed0977d08d0eaf60fcb02cf46d34a01a47812b0539f645b301cf8b1f157" gracePeriod=2 Jan 28 21:03:02 crc kubenswrapper[4746]: I0128 21:03:02.262466 4746 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/dnsmasq-dns-5fd9b586ff-zstkj" Jan 28 21:03:02 crc kubenswrapper[4746]: I0128 21:03:02.330525 4746 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-78cd565959-hcnml"] Jan 28 21:03:02 crc kubenswrapper[4746]: I0128 21:03:02.330770 4746 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/dnsmasq-dns-78cd565959-hcnml" podUID="ac1b9c96-c2d1-43fb-96bb-e79328b627a6" containerName="dnsmasq-dns" containerID="cri-o://c5b4b1463b13700962c4ecfa0f03dc72ed0b20a3635db0d9f5dc5f100f396cee" gracePeriod=10 Jan 28 21:03:02 crc kubenswrapper[4746]: I0128 21:03:02.544679 4746 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-7826h" Jan 28 21:03:02 crc kubenswrapper[4746]: I0128 21:03:02.655358 4746 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/04eb3dcd-859e-4fad-829e-c24bf2c954b4-utilities\") pod \"04eb3dcd-859e-4fad-829e-c24bf2c954b4\" (UID: \"04eb3dcd-859e-4fad-829e-c24bf2c954b4\") " Jan 28 21:03:02 crc kubenswrapper[4746]: I0128 21:03:02.655427 4746 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/04eb3dcd-859e-4fad-829e-c24bf2c954b4-catalog-content\") pod \"04eb3dcd-859e-4fad-829e-c24bf2c954b4\" (UID: \"04eb3dcd-859e-4fad-829e-c24bf2c954b4\") " Jan 28 21:03:02 crc kubenswrapper[4746]: I0128 21:03:02.655617 4746 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-szzg4\" (UniqueName: \"kubernetes.io/projected/04eb3dcd-859e-4fad-829e-c24bf2c954b4-kube-api-access-szzg4\") pod \"04eb3dcd-859e-4fad-829e-c24bf2c954b4\" (UID: \"04eb3dcd-859e-4fad-829e-c24bf2c954b4\") " Jan 28 21:03:02 crc kubenswrapper[4746]: I0128 21:03:02.656728 4746 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/04eb3dcd-859e-4fad-829e-c24bf2c954b4-utilities" (OuterVolumeSpecName: "utilities") pod "04eb3dcd-859e-4fad-829e-c24bf2c954b4" (UID: "04eb3dcd-859e-4fad-829e-c24bf2c954b4"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 28 21:03:02 crc kubenswrapper[4746]: I0128 21:03:02.668428 4746 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/04eb3dcd-859e-4fad-829e-c24bf2c954b4-kube-api-access-szzg4" (OuterVolumeSpecName: "kube-api-access-szzg4") pod "04eb3dcd-859e-4fad-829e-c24bf2c954b4" (UID: "04eb3dcd-859e-4fad-829e-c24bf2c954b4"). InnerVolumeSpecName "kube-api-access-szzg4". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 28 21:03:02 crc kubenswrapper[4746]: I0128 21:03:02.762112 4746 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-szzg4\" (UniqueName: \"kubernetes.io/projected/04eb3dcd-859e-4fad-829e-c24bf2c954b4-kube-api-access-szzg4\") on node \"crc\" DevicePath \"\"" Jan 28 21:03:02 crc kubenswrapper[4746]: I0128 21:03:02.762153 4746 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/04eb3dcd-859e-4fad-829e-c24bf2c954b4-utilities\") on node \"crc\" DevicePath \"\"" Jan 28 21:03:02 crc kubenswrapper[4746]: I0128 21:03:02.844101 4746 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/04eb3dcd-859e-4fad-829e-c24bf2c954b4-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "04eb3dcd-859e-4fad-829e-c24bf2c954b4" (UID: "04eb3dcd-859e-4fad-829e-c24bf2c954b4"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 28 21:03:02 crc kubenswrapper[4746]: I0128 21:03:02.863270 4746 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/04eb3dcd-859e-4fad-829e-c24bf2c954b4-catalog-content\") on node \"crc\" DevicePath \"\"" Jan 28 21:03:02 crc kubenswrapper[4746]: I0128 21:03:02.869090 4746 generic.go:334] "Generic (PLEG): container finished" podID="04eb3dcd-859e-4fad-829e-c24bf2c954b4" containerID="97a21ed0977d08d0eaf60fcb02cf46d34a01a47812b0539f645b301cf8b1f157" exitCode=0 Jan 28 21:03:02 crc kubenswrapper[4746]: I0128 21:03:02.869182 4746 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-7826h"
Jan 28 21:03:02 crc kubenswrapper[4746]: I0128 21:03:02.869216 4746 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-7826h" event={"ID":"04eb3dcd-859e-4fad-829e-c24bf2c954b4","Type":"ContainerDied","Data":"97a21ed0977d08d0eaf60fcb02cf46d34a01a47812b0539f645b301cf8b1f157"}
Jan 28 21:03:02 crc kubenswrapper[4746]: I0128 21:03:02.869978 4746 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-7826h" event={"ID":"04eb3dcd-859e-4fad-829e-c24bf2c954b4","Type":"ContainerDied","Data":"cbef2dd49c2b4030e0f1172ae75c291591c9d79b1aca6fb5002054307addec15"}
Jan 28 21:03:02 crc kubenswrapper[4746]: I0128 21:03:02.870011 4746 scope.go:117] "RemoveContainer" containerID="97a21ed0977d08d0eaf60fcb02cf46d34a01a47812b0539f645b301cf8b1f157"
Jan 28 21:03:02 crc kubenswrapper[4746]: I0128 21:03:02.876373 4746 generic.go:334] "Generic (PLEG): container finished" podID="ac1b9c96-c2d1-43fb-96bb-e79328b627a6" containerID="c5b4b1463b13700962c4ecfa0f03dc72ed0b20a3635db0d9f5dc5f100f396cee" exitCode=0
Jan 28 21:03:02 crc kubenswrapper[4746]: I0128 21:03:02.876462 4746 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-78cd565959-hcnml" event={"ID":"ac1b9c96-c2d1-43fb-96bb-e79328b627a6","Type":"ContainerDied","Data":"c5b4b1463b13700962c4ecfa0f03dc72ed0b20a3635db0d9f5dc5f100f396cee"}
Jan 28 21:03:02 crc kubenswrapper[4746]: I0128 21:03:02.901373 4746 scope.go:117] "RemoveContainer" containerID="31e21fca11d41102e20b62cb8706f889109b63df5f24a367aa7d62c27d7cad7b"
Jan 28 21:03:02 crc kubenswrapper[4746]: I0128 21:03:02.906452 4746 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"a4612060-a99e-4bc4-b074-2dffa9cc7050","Type":"ContainerStarted","Data":"dc24cad982129800500dbe58a114a149545ff2de5f1fd9354649314a70c9220a"}
Jan 28 21:03:02 crc kubenswrapper[4746]: I0128 21:03:02.918916 4746 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-78cd565959-hcnml"
Jan 28 21:03:02 crc kubenswrapper[4746]: I0128 21:03:02.951058 4746 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-7826h"]
Jan 28 21:03:02 crc kubenswrapper[4746]: I0128 21:03:02.953265 4746 scope.go:117] "RemoveContainer" containerID="8489fad5f60aef344c5116823427367409db1e4a669ebd8ad9708f77b31931f5"
Jan 28 21:03:03 crc kubenswrapper[4746]: I0128 21:03:03.011366 4746 scope.go:117] "RemoveContainer" containerID="97a21ed0977d08d0eaf60fcb02cf46d34a01a47812b0539f645b301cf8b1f157"
Jan 28 21:03:03 crc kubenswrapper[4746]: E0128 21:03:03.011802 4746 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"97a21ed0977d08d0eaf60fcb02cf46d34a01a47812b0539f645b301cf8b1f157\": container with ID starting with 97a21ed0977d08d0eaf60fcb02cf46d34a01a47812b0539f645b301cf8b1f157 not found: ID does not exist" containerID="97a21ed0977d08d0eaf60fcb02cf46d34a01a47812b0539f645b301cf8b1f157"
Jan 28 21:03:03 crc kubenswrapper[4746]: I0128 21:03:03.011834 4746 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"97a21ed0977d08d0eaf60fcb02cf46d34a01a47812b0539f645b301cf8b1f157"} err="failed to get container status \"97a21ed0977d08d0eaf60fcb02cf46d34a01a47812b0539f645b301cf8b1f157\": rpc error: code = NotFound desc = could not find container \"97a21ed0977d08d0eaf60fcb02cf46d34a01a47812b0539f645b301cf8b1f157\": container with ID starting with 97a21ed0977d08d0eaf60fcb02cf46d34a01a47812b0539f645b301cf8b1f157 not found: ID does not exist"
Jan 28 21:03:03 crc kubenswrapper[4746]: I0128 21:03:03.011857 4746 scope.go:117] "RemoveContainer" containerID="31e21fca11d41102e20b62cb8706f889109b63df5f24a367aa7d62c27d7cad7b"
Jan 28 21:03:03 crc kubenswrapper[4746]: E0128 21:03:03.012041 4746 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"31e21fca11d41102e20b62cb8706f889109b63df5f24a367aa7d62c27d7cad7b\": container with ID starting with 31e21fca11d41102e20b62cb8706f889109b63df5f24a367aa7d62c27d7cad7b not found: ID does not exist" containerID="31e21fca11d41102e20b62cb8706f889109b63df5f24a367aa7d62c27d7cad7b"
Jan 28 21:03:03 crc kubenswrapper[4746]: I0128 21:03:03.012064 4746 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"31e21fca11d41102e20b62cb8706f889109b63df5f24a367aa7d62c27d7cad7b"} err="failed to get container status \"31e21fca11d41102e20b62cb8706f889109b63df5f24a367aa7d62c27d7cad7b\": rpc error: code = NotFound desc = could not find container \"31e21fca11d41102e20b62cb8706f889109b63df5f24a367aa7d62c27d7cad7b\": container with ID starting with 31e21fca11d41102e20b62cb8706f889109b63df5f24a367aa7d62c27d7cad7b not found: ID does not exist"
Jan 28 21:03:03 crc kubenswrapper[4746]: I0128 21:03:03.012094 4746 scope.go:117] "RemoveContainer" containerID="8489fad5f60aef344c5116823427367409db1e4a669ebd8ad9708f77b31931f5"
Jan 28 21:03:03 crc kubenswrapper[4746]: E0128 21:03:03.013097 4746 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"8489fad5f60aef344c5116823427367409db1e4a669ebd8ad9708f77b31931f5\": container with ID starting with 8489fad5f60aef344c5116823427367409db1e4a669ebd8ad9708f77b31931f5 not found: ID does not exist" containerID="8489fad5f60aef344c5116823427367409db1e4a669ebd8ad9708f77b31931f5"
Jan 28 21:03:03 crc kubenswrapper[4746]: I0128 21:03:03.013120 4746 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"8489fad5f60aef344c5116823427367409db1e4a669ebd8ad9708f77b31931f5"} err="failed to get container status \"8489fad5f60aef344c5116823427367409db1e4a669ebd8ad9708f77b31931f5\": rpc error: code = NotFound desc = could not find container \"8489fad5f60aef344c5116823427367409db1e4a669ebd8ad9708f77b31931f5\": container with ID starting with 8489fad5f60aef344c5116823427367409db1e4a669ebd8ad9708f77b31931f5 not found: ID does not exist"
Jan 28 21:03:03 crc kubenswrapper[4746]: I0128 21:03:03.017159 4746 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-operators-7826h"]
Jan 28 21:03:03 crc kubenswrapper[4746]: I0128 21:03:03.067752 4746 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-58mwn\" (UniqueName: \"kubernetes.io/projected/ac1b9c96-c2d1-43fb-96bb-e79328b627a6-kube-api-access-58mwn\") pod \"ac1b9c96-c2d1-43fb-96bb-e79328b627a6\" (UID: \"ac1b9c96-c2d1-43fb-96bb-e79328b627a6\") "
Jan 28 21:03:03 crc kubenswrapper[4746]: I0128 21:03:03.067986 4746 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/ac1b9c96-c2d1-43fb-96bb-e79328b627a6-dns-swift-storage-0\") pod \"ac1b9c96-c2d1-43fb-96bb-e79328b627a6\" (UID: \"ac1b9c96-c2d1-43fb-96bb-e79328b627a6\") "
Jan 28 21:03:03 crc kubenswrapper[4746]: I0128 21:03:03.068024 4746 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/ac1b9c96-c2d1-43fb-96bb-e79328b627a6-config\") pod \"ac1b9c96-c2d1-43fb-96bb-e79328b627a6\" (UID: \"ac1b9c96-c2d1-43fb-96bb-e79328b627a6\") "
Jan 28 21:03:03 crc kubenswrapper[4746]: I0128 21:03:03.068042 4746 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/ac1b9c96-c2d1-43fb-96bb-e79328b627a6-ovsdbserver-sb\") pod \"ac1b9c96-c2d1-43fb-96bb-e79328b627a6\" (UID: \"ac1b9c96-c2d1-43fb-96bb-e79328b627a6\") "
Jan 28 21:03:03 crc kubenswrapper[4746]: I0128 21:03:03.068136 4746 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/ac1b9c96-c2d1-43fb-96bb-e79328b627a6-ovsdbserver-nb\") pod \"ac1b9c96-c2d1-43fb-96bb-e79328b627a6\" (UID: \"ac1b9c96-c2d1-43fb-96bb-e79328b627a6\") "
Jan 28 21:03:03 crc kubenswrapper[4746]: I0128 21:03:03.068201 4746 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/ac1b9c96-c2d1-43fb-96bb-e79328b627a6-dns-svc\") pod \"ac1b9c96-c2d1-43fb-96bb-e79328b627a6\" (UID: \"ac1b9c96-c2d1-43fb-96bb-e79328b627a6\") "
Jan 28 21:03:03 crc kubenswrapper[4746]: I0128 21:03:03.113298 4746 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/ac1b9c96-c2d1-43fb-96bb-e79328b627a6-kube-api-access-58mwn" (OuterVolumeSpecName: "kube-api-access-58mwn") pod "ac1b9c96-c2d1-43fb-96bb-e79328b627a6" (UID: "ac1b9c96-c2d1-43fb-96bb-e79328b627a6"). InnerVolumeSpecName "kube-api-access-58mwn". PluginName "kubernetes.io/projected", VolumeGidValue ""
Jan 28 21:03:03 crc kubenswrapper[4746]: I0128 21:03:03.171528 4746 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-58mwn\" (UniqueName: \"kubernetes.io/projected/ac1b9c96-c2d1-43fb-96bb-e79328b627a6-kube-api-access-58mwn\") on node \"crc\" DevicePath \"\""
Jan 28 21:03:03 crc kubenswrapper[4746]: I0128 21:03:03.204157 4746 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/ac1b9c96-c2d1-43fb-96bb-e79328b627a6-dns-swift-storage-0" (OuterVolumeSpecName: "dns-swift-storage-0") pod "ac1b9c96-c2d1-43fb-96bb-e79328b627a6" (UID: "ac1b9c96-c2d1-43fb-96bb-e79328b627a6"). InnerVolumeSpecName "dns-swift-storage-0". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Jan 28 21:03:03 crc kubenswrapper[4746]: I0128 21:03:03.208913 4746 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/ac1b9c96-c2d1-43fb-96bb-e79328b627a6-ovsdbserver-nb" (OuterVolumeSpecName: "ovsdbserver-nb") pod "ac1b9c96-c2d1-43fb-96bb-e79328b627a6" (UID: "ac1b9c96-c2d1-43fb-96bb-e79328b627a6"). InnerVolumeSpecName "ovsdbserver-nb". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Jan 28 21:03:03 crc kubenswrapper[4746]: I0128 21:03:03.220540 4746 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/ac1b9c96-c2d1-43fb-96bb-e79328b627a6-config" (OuterVolumeSpecName: "config") pod "ac1b9c96-c2d1-43fb-96bb-e79328b627a6" (UID: "ac1b9c96-c2d1-43fb-96bb-e79328b627a6"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Jan 28 21:03:03 crc kubenswrapper[4746]: I0128 21:03:03.273730 4746 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/ac1b9c96-c2d1-43fb-96bb-e79328b627a6-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "ac1b9c96-c2d1-43fb-96bb-e79328b627a6" (UID: "ac1b9c96-c2d1-43fb-96bb-e79328b627a6"). InnerVolumeSpecName "dns-svc". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Jan 28 21:03:03 crc kubenswrapper[4746]: I0128 21:03:03.275062 4746 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/ac1b9c96-c2d1-43fb-96bb-e79328b627a6-ovsdbserver-nb\") on node \"crc\" DevicePath \"\""
Jan 28 21:03:03 crc kubenswrapper[4746]: I0128 21:03:03.275096 4746 reconciler_common.go:293] "Volume detached for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/ac1b9c96-c2d1-43fb-96bb-e79328b627a6-dns-svc\") on node \"crc\" DevicePath \"\""
Jan 28 21:03:03 crc kubenswrapper[4746]: I0128 21:03:03.275106 4746 reconciler_common.go:293] "Volume detached for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/ac1b9c96-c2d1-43fb-96bb-e79328b627a6-dns-swift-storage-0\") on node \"crc\" DevicePath \"\""
Jan 28 21:03:03 crc kubenswrapper[4746]: I0128 21:03:03.275116 4746 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/ac1b9c96-c2d1-43fb-96bb-e79328b627a6-config\") on node \"crc\" DevicePath \"\""
Jan 28 21:03:03 crc kubenswrapper[4746]: I0128 21:03:03.279511 4746 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/ac1b9c96-c2d1-43fb-96bb-e79328b627a6-ovsdbserver-sb" (OuterVolumeSpecName: "ovsdbserver-sb") pod "ac1b9c96-c2d1-43fb-96bb-e79328b627a6" (UID: "ac1b9c96-c2d1-43fb-96bb-e79328b627a6"). InnerVolumeSpecName "ovsdbserver-sb". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Jan 28 21:03:03 crc kubenswrapper[4746]: I0128 21:03:03.377124 4746 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/ac1b9c96-c2d1-43fb-96bb-e79328b627a6-ovsdbserver-sb\") on node \"crc\" DevicePath \"\""
Jan 28 21:03:03 crc kubenswrapper[4746]: I0128 21:03:03.919500 4746 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-78cd565959-hcnml" event={"ID":"ac1b9c96-c2d1-43fb-96bb-e79328b627a6","Type":"ContainerDied","Data":"0f3278d9541726ab41caffdcd78b1335643d68733959c1eb48df08d0b136330d"}
Jan 28 21:03:03 crc kubenswrapper[4746]: I0128 21:03:03.919808 4746 scope.go:117] "RemoveContainer" containerID="c5b4b1463b13700962c4ecfa0f03dc72ed0b20a3635db0d9f5dc5f100f396cee"
Jan 28 21:03:03 crc kubenswrapper[4746]: I0128 21:03:03.919552 4746 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-78cd565959-hcnml"
Jan 28 21:03:03 crc kubenswrapper[4746]: I0128 21:03:03.922505 4746 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"a4612060-a99e-4bc4-b074-2dffa9cc7050","Type":"ContainerStarted","Data":"820e061effa263cd6da98a16cd6aa47f8ccc39b8a23d57ef1f02a28d620949d6"}
Jan 28 21:03:03 crc kubenswrapper[4746]: I0128 21:03:03.953742 4746 scope.go:117] "RemoveContainer" containerID="2757f52655aadfb3734a5117d6fbd18efe3eae7d26e3e6abf772479f8fc34f8b"
Jan 28 21:03:03 crc kubenswrapper[4746]: I0128 21:03:03.963615 4746 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-78cd565959-hcnml"]
Jan 28 21:03:03 crc kubenswrapper[4746]: I0128 21:03:03.977506 4746 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-78cd565959-hcnml"]
Jan 28 21:03:04 crc kubenswrapper[4746]: I0128 21:03:04.846947 4746 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="04eb3dcd-859e-4fad-829e-c24bf2c954b4" path="/var/lib/kubelet/pods/04eb3dcd-859e-4fad-829e-c24bf2c954b4/volumes"
Jan 28 21:03:04 crc kubenswrapper[4746]: I0128 21:03:04.848049 4746 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="ac1b9c96-c2d1-43fb-96bb-e79328b627a6" path="/var/lib/kubelet/pods/ac1b9c96-c2d1-43fb-96bb-e79328b627a6/volumes"
Jan 28 21:03:05 crc kubenswrapper[4746]: I0128 21:03:05.965624 4746 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"a4612060-a99e-4bc4-b074-2dffa9cc7050","Type":"ContainerStarted","Data":"c57e5991b70e6ffe20ff52051fbfafc3d82c395d44a654190e5c9b733009f6d0"}
Jan 28 21:03:05 crc kubenswrapper[4746]: I0128 21:03:05.966765 4746 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/ceilometer-0"
Jan 28 21:03:05 crc kubenswrapper[4746]: I0128 21:03:05.999101 4746 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/ceilometer-0" podStartSLOduration=2.779381494 podStartE2EDuration="6.999074409s" podCreationTimestamp="2026-01-28 21:02:59 +0000 UTC" firstStartedPulling="2026-01-28 21:03:00.931653192 +0000 UTC m=+1408.887839546" lastFinishedPulling="2026-01-28 21:03:05.151346107 +0000 UTC m=+1413.107532461" observedRunningTime="2026-01-28 21:03:05.992008899 +0000 UTC m=+1413.948195253" watchObservedRunningTime="2026-01-28 21:03:05.999074409 +0000 UTC m=+1413.955260763"
Jan 28 21:03:09 crc kubenswrapper[4746]: I0128 21:03:09.231049 4746 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/nova-api-0"
Jan 28 21:03:09 crc kubenswrapper[4746]: I0128 21:03:09.231446 4746 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/nova-api-0"
Jan 28 21:03:10 crc kubenswrapper[4746]: I0128 21:03:10.245283 4746 prober.go:107] "Probe failed" probeType="Startup" pod="openstack/nova-api-0" podUID="aca41824-3271-42e1-93f8-76a1a9000681" containerName="nova-api-api" probeResult="failure" output="Get \"https://10.217.0.238:8774/\": net/http: request canceled (Client.Timeout exceeded while awaiting headers)"
Jan 28 21:03:10 crc kubenswrapper[4746]: I0128 21:03:10.245349 4746 prober.go:107] "Probe failed" probeType="Startup" pod="openstack/nova-api-0" podUID="aca41824-3271-42e1-93f8-76a1a9000681" containerName="nova-api-log" probeResult="failure" output="Get \"https://10.217.0.238:8774/\": net/http: request canceled (Client.Timeout exceeded while awaiting headers)"
Jan 28 21:03:15 crc kubenswrapper[4746]: I0128 21:03:15.871323 4746 patch_prober.go:28] interesting pod/machine-config-daemon-6wrnw container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body=
Jan 28 21:03:15 crc kubenswrapper[4746]: I0128 21:03:15.872064 4746 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-6wrnw" podUID="6dc8b546-9734-4082-b2b3-2bafe3f1564d" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused"
Jan 28 21:03:19 crc kubenswrapper[4746]: I0128 21:03:19.242868 4746 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/nova-api-0"
Jan 28 21:03:19 crc kubenswrapper[4746]: I0128 21:03:19.243672 4746 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-api-0"
Jan 28 21:03:19 crc kubenswrapper[4746]: I0128 21:03:19.245260 4746 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/nova-api-0"
Jan 28 21:03:19 crc kubenswrapper[4746]: I0128 21:03:19.253192 4746 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/nova-api-0"
Jan 28 21:03:20 crc kubenswrapper[4746]: I0128 21:03:20.155067 4746 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-api-0"
Jan 28 21:03:20 crc kubenswrapper[4746]: I0128 21:03:20.162366 4746 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/nova-api-0"
Jan 28 21:03:30 crc kubenswrapper[4746]: I0128 21:03:30.463577 4746 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/ceilometer-0"
Jan 28 21:03:41 crc kubenswrapper[4746]: I0128 21:03:41.201779 4746 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/cloudkitty-db-sync-p9ghn"]
Jan 28 21:03:41 crc kubenswrapper[4746]: I0128 21:03:41.213786 4746 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/cloudkitty-db-sync-p9ghn"]
Jan 28 21:03:41 crc kubenswrapper[4746]: I0128 21:03:41.285158 4746 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/cloudkitty-db-sync-hz5db"]
Jan 28 21:03:41 crc kubenswrapper[4746]: E0128 21:03:41.285580 4746 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="04eb3dcd-859e-4fad-829e-c24bf2c954b4" containerName="registry-server"
Jan 28 21:03:41 crc kubenswrapper[4746]: I0128 21:03:41.285592 4746 state_mem.go:107] "Deleted CPUSet assignment" podUID="04eb3dcd-859e-4fad-829e-c24bf2c954b4" containerName="registry-server"
Jan 28 21:03:41 crc kubenswrapper[4746]: E0128 21:03:41.285620 4746 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="04eb3dcd-859e-4fad-829e-c24bf2c954b4" containerName="extract-utilities"
Jan 28 21:03:41 crc kubenswrapper[4746]: I0128 21:03:41.285627 4746 state_mem.go:107] "Deleted CPUSet assignment" podUID="04eb3dcd-859e-4fad-829e-c24bf2c954b4" containerName="extract-utilities"
Jan 28 21:03:41 crc kubenswrapper[4746]: E0128 21:03:41.285641 4746 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="ac1b9c96-c2d1-43fb-96bb-e79328b627a6" containerName="init"
Jan 28 21:03:41 crc kubenswrapper[4746]: I0128 21:03:41.285646 4746 state_mem.go:107] "Deleted CPUSet assignment" podUID="ac1b9c96-c2d1-43fb-96bb-e79328b627a6" containerName="init"
Jan 28 21:03:41 crc kubenswrapper[4746]: E0128 21:03:41.285661 4746 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="ac1b9c96-c2d1-43fb-96bb-e79328b627a6" containerName="dnsmasq-dns"
Jan 28 21:03:41 crc kubenswrapper[4746]: I0128 21:03:41.285667 4746 state_mem.go:107] "Deleted CPUSet assignment" podUID="ac1b9c96-c2d1-43fb-96bb-e79328b627a6" containerName="dnsmasq-dns"
Jan 28 21:03:41 crc kubenswrapper[4746]: E0128 21:03:41.285675 4746 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="04eb3dcd-859e-4fad-829e-c24bf2c954b4" containerName="extract-content"
Jan 28 21:03:41 crc kubenswrapper[4746]: I0128 21:03:41.285682 4746 state_mem.go:107] "Deleted CPUSet assignment" podUID="04eb3dcd-859e-4fad-829e-c24bf2c954b4" containerName="extract-content"
Jan 28 21:03:41 crc kubenswrapper[4746]: I0128 21:03:41.285859 4746 memory_manager.go:354] "RemoveStaleState removing state" podUID="ac1b9c96-c2d1-43fb-96bb-e79328b627a6" containerName="dnsmasq-dns"
Jan 28 21:03:41 crc kubenswrapper[4746]: I0128 21:03:41.285886 4746 memory_manager.go:354] "RemoveStaleState removing state" podUID="04eb3dcd-859e-4fad-829e-c24bf2c954b4" containerName="registry-server"
Jan 28 21:03:41 crc kubenswrapper[4746]: I0128 21:03:41.286687 4746 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/cloudkitty-db-sync-hz5db"
Jan 28 21:03:41 crc kubenswrapper[4746]: I0128 21:03:41.294539 4746 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"osp-secret"
Jan 28 21:03:41 crc kubenswrapper[4746]: I0128 21:03:41.300338 4746 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cloudkitty-db-sync-hz5db"]
Jan 28 21:03:41 crc kubenswrapper[4746]: I0128 21:03:41.431552 4746 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/fb91f276-e145-42e8-a53a-72b1f8311302-combined-ca-bundle\") pod \"cloudkitty-db-sync-hz5db\" (UID: \"fb91f276-e145-42e8-a53a-72b1f8311302\") " pod="openstack/cloudkitty-db-sync-hz5db"
Jan 28 21:03:41 crc kubenswrapper[4746]: I0128 21:03:41.431613 4746 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-cg4fr\" (UniqueName: \"kubernetes.io/projected/fb91f276-e145-42e8-a53a-72b1f8311302-kube-api-access-cg4fr\") pod \"cloudkitty-db-sync-hz5db\" (UID: \"fb91f276-e145-42e8-a53a-72b1f8311302\") " pod="openstack/cloudkitty-db-sync-hz5db"
Jan 28 21:03:41 crc kubenswrapper[4746]: I0128 21:03:41.431713 4746 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/fb91f276-e145-42e8-a53a-72b1f8311302-config-data\") pod \"cloudkitty-db-sync-hz5db\" (UID: \"fb91f276-e145-42e8-a53a-72b1f8311302\") " pod="openstack/cloudkitty-db-sync-hz5db"
Jan 28 21:03:41 crc kubenswrapper[4746]: I0128 21:03:41.431772 4746 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/fb91f276-e145-42e8-a53a-72b1f8311302-scripts\") pod \"cloudkitty-db-sync-hz5db\" (UID: \"fb91f276-e145-42e8-a53a-72b1f8311302\") " pod="openstack/cloudkitty-db-sync-hz5db"
Jan 28 21:03:41 crc kubenswrapper[4746]: I0128 21:03:41.431819 4746 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"certs\" (UniqueName: \"kubernetes.io/projected/fb91f276-e145-42e8-a53a-72b1f8311302-certs\") pod \"cloudkitty-db-sync-hz5db\" (UID: \"fb91f276-e145-42e8-a53a-72b1f8311302\") " pod="openstack/cloudkitty-db-sync-hz5db"
Jan 28 21:03:41 crc kubenswrapper[4746]: I0128 21:03:41.533880 4746 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/fb91f276-e145-42e8-a53a-72b1f8311302-combined-ca-bundle\") pod \"cloudkitty-db-sync-hz5db\" (UID: \"fb91f276-e145-42e8-a53a-72b1f8311302\") " pod="openstack/cloudkitty-db-sync-hz5db"
Jan 28 21:03:41 crc kubenswrapper[4746]: I0128 21:03:41.534200 4746 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-cg4fr\" (UniqueName: \"kubernetes.io/projected/fb91f276-e145-42e8-a53a-72b1f8311302-kube-api-access-cg4fr\") pod \"cloudkitty-db-sync-hz5db\" (UID: \"fb91f276-e145-42e8-a53a-72b1f8311302\") " pod="openstack/cloudkitty-db-sync-hz5db"
Jan 28 21:03:41 crc kubenswrapper[4746]: I0128 21:03:41.534371 4746 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/fb91f276-e145-42e8-a53a-72b1f8311302-config-data\") pod \"cloudkitty-db-sync-hz5db\" (UID: \"fb91f276-e145-42e8-a53a-72b1f8311302\") " pod="openstack/cloudkitty-db-sync-hz5db"
Jan 28 21:03:41 crc kubenswrapper[4746]: I0128 21:03:41.534472 4746 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/fb91f276-e145-42e8-a53a-72b1f8311302-scripts\") pod \"cloudkitty-db-sync-hz5db\" (UID: \"fb91f276-e145-42e8-a53a-72b1f8311302\") " pod="openstack/cloudkitty-db-sync-hz5db"
Jan 28 21:03:41 crc kubenswrapper[4746]: I0128 21:03:41.534569 4746 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"certs\" (UniqueName: \"kubernetes.io/projected/fb91f276-e145-42e8-a53a-72b1f8311302-certs\") pod \"cloudkitty-db-sync-hz5db\" (UID: \"fb91f276-e145-42e8-a53a-72b1f8311302\") " pod="openstack/cloudkitty-db-sync-hz5db"
Jan 28 21:03:41 crc kubenswrapper[4746]: I0128 21:03:41.540576 4746 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"certs\" (UniqueName: \"kubernetes.io/projected/fb91f276-e145-42e8-a53a-72b1f8311302-certs\") pod \"cloudkitty-db-sync-hz5db\" (UID: \"fb91f276-e145-42e8-a53a-72b1f8311302\") " pod="openstack/cloudkitty-db-sync-hz5db"
Jan 28 21:03:41 crc kubenswrapper[4746]: I0128 21:03:41.540909 4746 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/fb91f276-e145-42e8-a53a-72b1f8311302-combined-ca-bundle\") pod \"cloudkitty-db-sync-hz5db\" (UID: \"fb91f276-e145-42e8-a53a-72b1f8311302\") " pod="openstack/cloudkitty-db-sync-hz5db"
Jan 28 21:03:41 crc kubenswrapper[4746]: I0128 21:03:41.543825 4746 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/fb91f276-e145-42e8-a53a-72b1f8311302-config-data\") pod \"cloudkitty-db-sync-hz5db\" (UID: \"fb91f276-e145-42e8-a53a-72b1f8311302\") " pod="openstack/cloudkitty-db-sync-hz5db"
Jan 28 21:03:41 crc kubenswrapper[4746]: I0128 21:03:41.550068 4746 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/fb91f276-e145-42e8-a53a-72b1f8311302-scripts\") pod \"cloudkitty-db-sync-hz5db\" (UID: \"fb91f276-e145-42e8-a53a-72b1f8311302\") " pod="openstack/cloudkitty-db-sync-hz5db"
Jan 28 21:03:41 crc kubenswrapper[4746]: I0128 21:03:41.555862 4746 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-cg4fr\" (UniqueName: \"kubernetes.io/projected/fb91f276-e145-42e8-a53a-72b1f8311302-kube-api-access-cg4fr\") pod \"cloudkitty-db-sync-hz5db\" (UID: \"fb91f276-e145-42e8-a53a-72b1f8311302\") " pod="openstack/cloudkitty-db-sync-hz5db"
Jan 28 21:03:41 crc kubenswrapper[4746]: I0128 21:03:41.608481 4746 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/cloudkitty-db-sync-hz5db"
Jan 28 21:03:42 crc kubenswrapper[4746]: I0128 21:03:42.117621 4746 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cloudkitty-db-sync-hz5db"]
Jan 28 21:03:42 crc kubenswrapper[4746]: I0128 21:03:42.453532 4746 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cloudkitty-db-sync-hz5db" event={"ID":"fb91f276-e145-42e8-a53a-72b1f8311302","Type":"ContainerStarted","Data":"5a1ce51fbc71d9c19570b53cb1c6b645cc384d4b558a64bb0f14486f2ccfc640"}
Jan 28 21:03:42 crc kubenswrapper[4746]: I0128 21:03:42.876872 4746 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="cce95230-2b72-4598-9d28-3a1465803567" path="/var/lib/kubelet/pods/cce95230-2b72-4598-9d28-3a1465803567/volumes"
Jan 28 21:03:43 crc kubenswrapper[4746]: I0128 21:03:43.294693 4746 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/rabbitmq-server-0"]
Jan 28 21:03:43 crc kubenswrapper[4746]: I0128 21:03:43.528838 4746 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ceilometer-0"]
Jan 28 21:03:43 crc kubenswrapper[4746]: I0128 21:03:43.529935 4746 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="a4612060-a99e-4bc4-b074-2dffa9cc7050" containerName="ceilometer-central-agent" containerID="cri-o://870712e1d90ee565a39d018ffd7c3ff66ecc4586dfd9bea7574cbf87b0b91b88" gracePeriod=30
Jan 28 21:03:43 crc kubenswrapper[4746]: I0128 21:03:43.530026 4746 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="a4612060-a99e-4bc4-b074-2dffa9cc7050" containerName="proxy-httpd" containerID="cri-o://c57e5991b70e6ffe20ff52051fbfafc3d82c395d44a654190e5c9b733009f6d0" gracePeriod=30
Jan 28 21:03:43 crc kubenswrapper[4746]: I0128 21:03:43.530055 4746 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="a4612060-a99e-4bc4-b074-2dffa9cc7050" containerName="sg-core" containerID="cri-o://820e061effa263cd6da98a16cd6aa47f8ccc39b8a23d57ef1f02a28d620949d6" gracePeriod=30
Jan 28 21:03:43 crc kubenswrapper[4746]: I0128 21:03:43.530231 4746 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="a4612060-a99e-4bc4-b074-2dffa9cc7050" containerName="ceilometer-notification-agent" containerID="cri-o://dc24cad982129800500dbe58a114a149545ff2de5f1fd9354649314a70c9220a" gracePeriod=30
Jan 28 21:03:44 crc kubenswrapper[4746]: I0128 21:03:44.477920 4746 generic.go:334] "Generic (PLEG): container finished" podID="a4612060-a99e-4bc4-b074-2dffa9cc7050" containerID="c57e5991b70e6ffe20ff52051fbfafc3d82c395d44a654190e5c9b733009f6d0" exitCode=0
Jan 28 21:03:44 crc kubenswrapper[4746]: I0128 21:03:44.477952 4746 generic.go:334] "Generic (PLEG): container finished" podID="a4612060-a99e-4bc4-b074-2dffa9cc7050" containerID="820e061effa263cd6da98a16cd6aa47f8ccc39b8a23d57ef1f02a28d620949d6" exitCode=2
Jan 28 21:03:44 crc kubenswrapper[4746]: I0128 21:03:44.477961 4746 generic.go:334] "Generic (PLEG): container finished" podID="a4612060-a99e-4bc4-b074-2dffa9cc7050" containerID="870712e1d90ee565a39d018ffd7c3ff66ecc4586dfd9bea7574cbf87b0b91b88" exitCode=0
Jan 28 21:03:44 crc kubenswrapper[4746]: I0128 21:03:44.477982 4746 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"a4612060-a99e-4bc4-b074-2dffa9cc7050","Type":"ContainerDied","Data":"c57e5991b70e6ffe20ff52051fbfafc3d82c395d44a654190e5c9b733009f6d0"}
Jan 28 21:03:44 crc kubenswrapper[4746]: I0128 21:03:44.478006 4746 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"a4612060-a99e-4bc4-b074-2dffa9cc7050","Type":"ContainerDied","Data":"820e061effa263cd6da98a16cd6aa47f8ccc39b8a23d57ef1f02a28d620949d6"}
Jan 28 21:03:44 crc kubenswrapper[4746]: I0128 21:03:44.478015 4746 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"a4612060-a99e-4bc4-b074-2dffa9cc7050","Type":"ContainerDied","Data":"870712e1d90ee565a39d018ffd7c3ff66ecc4586dfd9bea7574cbf87b0b91b88"}
Jan 28 21:03:44 crc kubenswrapper[4746]: I0128 21:03:44.556094 4746 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/rabbitmq-cell1-server-0"]
Jan 28 21:03:45 crc kubenswrapper[4746]: I0128 21:03:45.495670 4746 generic.go:334] "Generic (PLEG): container finished" podID="a4612060-a99e-4bc4-b074-2dffa9cc7050" containerID="dc24cad982129800500dbe58a114a149545ff2de5f1fd9354649314a70c9220a" exitCode=0
Jan 28 21:03:45 crc kubenswrapper[4746]: I0128 21:03:45.495745 4746 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"a4612060-a99e-4bc4-b074-2dffa9cc7050","Type":"ContainerDied","Data":"dc24cad982129800500dbe58a114a149545ff2de5f1fd9354649314a70c9220a"}
Jan 28 21:03:45 crc kubenswrapper[4746]: I0128 21:03:45.871726 4746 patch_prober.go:28] interesting pod/machine-config-daemon-6wrnw container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body=
Jan 28 21:03:45 crc kubenswrapper[4746]: I0128 21:03:45.871977 4746 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-6wrnw" podUID="6dc8b546-9734-4082-b2b3-2bafe3f1564d" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused"
Jan 28 21:03:45 crc kubenswrapper[4746]: I0128 21:03:45.995016 4746 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0"
Jan 28 21:03:46 crc kubenswrapper[4746]: I0128 21:03:46.167413 4746 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/a4612060-a99e-4bc4-b074-2dffa9cc7050-config-data\") pod \"a4612060-a99e-4bc4-b074-2dffa9cc7050\" (UID: \"a4612060-a99e-4bc4-b074-2dffa9cc7050\") "
Jan 28 21:03:46 crc kubenswrapper[4746]: I0128 21:03:46.167494 4746 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/a4612060-a99e-4bc4-b074-2dffa9cc7050-combined-ca-bundle\") pod \"a4612060-a99e-4bc4-b074-2dffa9cc7050\" (UID: \"a4612060-a99e-4bc4-b074-2dffa9cc7050\") "
Jan 28 21:03:46 crc kubenswrapper[4746]: I0128 21:03:46.167602 4746 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/a4612060-a99e-4bc4-b074-2dffa9cc7050-scripts\") pod \"a4612060-a99e-4bc4-b074-2dffa9cc7050\" (UID: \"a4612060-a99e-4bc4-b074-2dffa9cc7050\") "
Jan 28 21:03:46 crc kubenswrapper[4746]: I0128 21:03:46.167641 4746 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/a4612060-a99e-4bc4-b074-2dffa9cc7050-sg-core-conf-yaml\") pod \"a4612060-a99e-4bc4-b074-2dffa9cc7050\" (UID: \"a4612060-a99e-4bc4-b074-2dffa9cc7050\") "
Jan 28 21:03:46 crc kubenswrapper[4746]: I0128 21:03:46.167691 4746 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/a4612060-a99e-4bc4-b074-2dffa9cc7050-run-httpd\") pod \"a4612060-a99e-4bc4-b074-2dffa9cc7050\" (UID: \"a4612060-a99e-4bc4-b074-2dffa9cc7050\") "
Jan 28 21:03:46 crc kubenswrapper[4746]: I0128 21:03:46.167729 4746 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-8xxvw\" (UniqueName: \"kubernetes.io/projected/a4612060-a99e-4bc4-b074-2dffa9cc7050-kube-api-access-8xxvw\") pod \"a4612060-a99e-4bc4-b074-2dffa9cc7050\" (UID: \"a4612060-a99e-4bc4-b074-2dffa9cc7050\") "
Jan 28 21:03:46 crc kubenswrapper[4746]: I0128 21:03:46.167756 4746 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/a4612060-a99e-4bc4-b074-2dffa9cc7050-log-httpd\") pod \"a4612060-a99e-4bc4-b074-2dffa9cc7050\" (UID: \"a4612060-a99e-4bc4-b074-2dffa9cc7050\") "
Jan 28 21:03:46 crc kubenswrapper[4746]: I0128 21:03:46.167799 4746 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ceilometer-tls-certs\" (UniqueName: \"kubernetes.io/secret/a4612060-a99e-4bc4-b074-2dffa9cc7050-ceilometer-tls-certs\") pod \"a4612060-a99e-4bc4-b074-2dffa9cc7050\" (UID: \"a4612060-a99e-4bc4-b074-2dffa9cc7050\") "
Jan 28 21:03:46 crc kubenswrapper[4746]: I0128 21:03:46.169818 4746 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/a4612060-a99e-4bc4-b074-2dffa9cc7050-run-httpd" (OuterVolumeSpecName: "run-httpd") pod "a4612060-a99e-4bc4-b074-2dffa9cc7050" (UID: "a4612060-a99e-4bc4-b074-2dffa9cc7050"). InnerVolumeSpecName "run-httpd". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Jan 28 21:03:46 crc kubenswrapper[4746]: I0128 21:03:46.170194 4746 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/a4612060-a99e-4bc4-b074-2dffa9cc7050-log-httpd" (OuterVolumeSpecName: "log-httpd") pod "a4612060-a99e-4bc4-b074-2dffa9cc7050" (UID: "a4612060-a99e-4bc4-b074-2dffa9cc7050"). InnerVolumeSpecName "log-httpd". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Jan 28 21:03:46 crc kubenswrapper[4746]: I0128 21:03:46.188809 4746 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/a4612060-a99e-4bc4-b074-2dffa9cc7050-kube-api-access-8xxvw" (OuterVolumeSpecName: "kube-api-access-8xxvw") pod "a4612060-a99e-4bc4-b074-2dffa9cc7050" (UID: "a4612060-a99e-4bc4-b074-2dffa9cc7050"). InnerVolumeSpecName "kube-api-access-8xxvw". PluginName "kubernetes.io/projected", VolumeGidValue ""
Jan 28 21:03:46 crc kubenswrapper[4746]: I0128 21:03:46.192778 4746 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/a4612060-a99e-4bc4-b074-2dffa9cc7050-scripts" (OuterVolumeSpecName: "scripts") pod "a4612060-a99e-4bc4-b074-2dffa9cc7050" (UID: "a4612060-a99e-4bc4-b074-2dffa9cc7050"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue ""
Jan 28 21:03:46 crc kubenswrapper[4746]: I0128 21:03:46.216395 4746 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/a4612060-a99e-4bc4-b074-2dffa9cc7050-sg-core-conf-yaml" (OuterVolumeSpecName: "sg-core-conf-yaml") pod "a4612060-a99e-4bc4-b074-2dffa9cc7050" (UID: "a4612060-a99e-4bc4-b074-2dffa9cc7050"). InnerVolumeSpecName "sg-core-conf-yaml". PluginName "kubernetes.io/secret", VolumeGidValue ""
Jan 28 21:03:46 crc kubenswrapper[4746]: I0128 21:03:46.237233 4746 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/a4612060-a99e-4bc4-b074-2dffa9cc7050-ceilometer-tls-certs" (OuterVolumeSpecName: "ceilometer-tls-certs") pod "a4612060-a99e-4bc4-b074-2dffa9cc7050" (UID: "a4612060-a99e-4bc4-b074-2dffa9cc7050"). InnerVolumeSpecName "ceilometer-tls-certs". PluginName "kubernetes.io/secret", VolumeGidValue ""
Jan 28 21:03:46 crc kubenswrapper[4746]: I0128 21:03:46.271513 4746 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/a4612060-a99e-4bc4-b074-2dffa9cc7050-scripts\") on node \"crc\" DevicePath \"\""
Jan 28 21:03:46 crc kubenswrapper[4746]: I0128 21:03:46.271546 4746 reconciler_common.go:293] "Volume detached for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/a4612060-a99e-4bc4-b074-2dffa9cc7050-sg-core-conf-yaml\") on node \"crc\" DevicePath \"\""
Jan 28 21:03:46 crc kubenswrapper[4746]: I0128 21:03:46.271557 4746 reconciler_common.go:293] "Volume detached for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/a4612060-a99e-4bc4-b074-2dffa9cc7050-run-httpd\") on node \"crc\" DevicePath \"\""
Jan 28 21:03:46 crc kubenswrapper[4746]: I0128 21:03:46.271566 4746 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-8xxvw\" (UniqueName: \"kubernetes.io/projected/a4612060-a99e-4bc4-b074-2dffa9cc7050-kube-api-access-8xxvw\") on node \"crc\" DevicePath \"\""
Jan 28 21:03:46 crc kubenswrapper[4746]: I0128 21:03:46.271574 4746 reconciler_common.go:293] "Volume detached for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/a4612060-a99e-4bc4-b074-2dffa9cc7050-log-httpd\") on node \"crc\" DevicePath \"\""
Jan 28 21:03:46 crc kubenswrapper[4746]: I0128 21:03:46.271584 4746 reconciler_common.go:293] "Volume detached for volume \"ceilometer-tls-certs\" (UniqueName: \"kubernetes.io/secret/a4612060-a99e-4bc4-b074-2dffa9cc7050-ceilometer-tls-certs\") on node \"crc\" DevicePath \"\""
Jan 28 21:03:46 crc kubenswrapper[4746]: I0128 21:03:46.316517 4746 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/a4612060-a99e-4bc4-b074-2dffa9cc7050-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "a4612060-a99e-4bc4-b074-2dffa9cc7050" (UID:
"a4612060-a99e-4bc4-b074-2dffa9cc7050"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 28 21:03:46 crc kubenswrapper[4746]: I0128 21:03:46.374001 4746 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/a4612060-a99e-4bc4-b074-2dffa9cc7050-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Jan 28 21:03:46 crc kubenswrapper[4746]: I0128 21:03:46.378615 4746 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/a4612060-a99e-4bc4-b074-2dffa9cc7050-config-data" (OuterVolumeSpecName: "config-data") pod "a4612060-a99e-4bc4-b074-2dffa9cc7050" (UID: "a4612060-a99e-4bc4-b074-2dffa9cc7050"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 28 21:03:46 crc kubenswrapper[4746]: I0128 21:03:46.476529 4746 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/a4612060-a99e-4bc4-b074-2dffa9cc7050-config-data\") on node \"crc\" DevicePath \"\"" Jan 28 21:03:46 crc kubenswrapper[4746]: I0128 21:03:46.524756 4746 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"a4612060-a99e-4bc4-b074-2dffa9cc7050","Type":"ContainerDied","Data":"aacb0d11ae88afab424fadd80d351951f1408d32707132acbca75a25812d3cf7"} Jan 28 21:03:46 crc kubenswrapper[4746]: I0128 21:03:46.524812 4746 scope.go:117] "RemoveContainer" containerID="c57e5991b70e6ffe20ff52051fbfafc3d82c395d44a654190e5c9b733009f6d0" Jan 28 21:03:46 crc kubenswrapper[4746]: I0128 21:03:46.525026 4746 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/ceilometer-0" Jan 28 21:03:46 crc kubenswrapper[4746]: I0128 21:03:46.577253 4746 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ceilometer-0"] Jan 28 21:03:46 crc kubenswrapper[4746]: I0128 21:03:46.602810 4746 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/ceilometer-0"] Jan 28 21:03:46 crc kubenswrapper[4746]: I0128 21:03:46.615226 4746 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ceilometer-0"] Jan 28 21:03:46 crc kubenswrapper[4746]: E0128 21:03:46.615696 4746 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="a4612060-a99e-4bc4-b074-2dffa9cc7050" containerName="ceilometer-notification-agent" Jan 28 21:03:46 crc kubenswrapper[4746]: I0128 21:03:46.615715 4746 state_mem.go:107] "Deleted CPUSet assignment" podUID="a4612060-a99e-4bc4-b074-2dffa9cc7050" containerName="ceilometer-notification-agent" Jan 28 21:03:46 crc kubenswrapper[4746]: E0128 21:03:46.615731 4746 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="a4612060-a99e-4bc4-b074-2dffa9cc7050" containerName="proxy-httpd" Jan 28 21:03:46 crc kubenswrapper[4746]: I0128 21:03:46.615738 4746 state_mem.go:107] "Deleted CPUSet assignment" podUID="a4612060-a99e-4bc4-b074-2dffa9cc7050" containerName="proxy-httpd" Jan 28 21:03:46 crc kubenswrapper[4746]: E0128 21:03:46.615752 4746 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="a4612060-a99e-4bc4-b074-2dffa9cc7050" containerName="ceilometer-central-agent" Jan 28 21:03:46 crc kubenswrapper[4746]: I0128 21:03:46.615759 4746 state_mem.go:107] "Deleted CPUSet assignment" podUID="a4612060-a99e-4bc4-b074-2dffa9cc7050" containerName="ceilometer-central-agent" Jan 28 21:03:46 crc kubenswrapper[4746]: E0128 21:03:46.615771 4746 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="a4612060-a99e-4bc4-b074-2dffa9cc7050" containerName="sg-core" Jan 28 21:03:46 crc kubenswrapper[4746]: I0128 21:03:46.615776 4746 
state_mem.go:107] "Deleted CPUSet assignment" podUID="a4612060-a99e-4bc4-b074-2dffa9cc7050" containerName="sg-core" Jan 28 21:03:46 crc kubenswrapper[4746]: I0128 21:03:46.615957 4746 memory_manager.go:354] "RemoveStaleState removing state" podUID="a4612060-a99e-4bc4-b074-2dffa9cc7050" containerName="proxy-httpd" Jan 28 21:03:46 crc kubenswrapper[4746]: I0128 21:03:46.615983 4746 memory_manager.go:354] "RemoveStaleState removing state" podUID="a4612060-a99e-4bc4-b074-2dffa9cc7050" containerName="ceilometer-notification-agent" Jan 28 21:03:46 crc kubenswrapper[4746]: I0128 21:03:46.616004 4746 memory_manager.go:354] "RemoveStaleState removing state" podUID="a4612060-a99e-4bc4-b074-2dffa9cc7050" containerName="ceilometer-central-agent" Jan 28 21:03:46 crc kubenswrapper[4746]: I0128 21:03:46.616016 4746 memory_manager.go:354] "RemoveStaleState removing state" podUID="a4612060-a99e-4bc4-b074-2dffa9cc7050" containerName="sg-core" Jan 28 21:03:46 crc kubenswrapper[4746]: I0128 21:03:46.622660 4746 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/ceilometer-0" Jan 28 21:03:46 crc kubenswrapper[4746]: I0128 21:03:46.626708 4746 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-ceilometer-internal-svc" Jan 28 21:03:46 crc kubenswrapper[4746]: I0128 21:03:46.627668 4746 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceilometer-scripts" Jan 28 21:03:46 crc kubenswrapper[4746]: I0128 21:03:46.627877 4746 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceilometer-config-data" Jan 28 21:03:46 crc kubenswrapper[4746]: I0128 21:03:46.638474 4746 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ceilometer-0"] Jan 28 21:03:46 crc kubenswrapper[4746]: I0128 21:03:46.786030 4746 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/6f6522f9-6035-4484-ba00-2255f04cd85d-scripts\") pod \"ceilometer-0\" (UID: \"6f6522f9-6035-4484-ba00-2255f04cd85d\") " pod="openstack/ceilometer-0" Jan 28 21:03:46 crc kubenswrapper[4746]: I0128 21:03:46.786110 4746 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/6f6522f9-6035-4484-ba00-2255f04cd85d-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"6f6522f9-6035-4484-ba00-2255f04cd85d\") " pod="openstack/ceilometer-0" Jan 28 21:03:46 crc kubenswrapper[4746]: I0128 21:03:46.786143 4746 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/6f6522f9-6035-4484-ba00-2255f04cd85d-log-httpd\") pod \"ceilometer-0\" (UID: \"6f6522f9-6035-4484-ba00-2255f04cd85d\") " pod="openstack/ceilometer-0" Jan 28 21:03:46 crc kubenswrapper[4746]: I0128 21:03:46.786173 4746 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sg-core-conf-yaml\" 
(UniqueName: \"kubernetes.io/secret/6f6522f9-6035-4484-ba00-2255f04cd85d-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"6f6522f9-6035-4484-ba00-2255f04cd85d\") " pod="openstack/ceilometer-0" Jan 28 21:03:46 crc kubenswrapper[4746]: I0128 21:03:46.786190 4746 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ceilometer-tls-certs\" (UniqueName: \"kubernetes.io/secret/6f6522f9-6035-4484-ba00-2255f04cd85d-ceilometer-tls-certs\") pod \"ceilometer-0\" (UID: \"6f6522f9-6035-4484-ba00-2255f04cd85d\") " pod="openstack/ceilometer-0" Jan 28 21:03:46 crc kubenswrapper[4746]: I0128 21:03:46.786217 4746 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/6f6522f9-6035-4484-ba00-2255f04cd85d-run-httpd\") pod \"ceilometer-0\" (UID: \"6f6522f9-6035-4484-ba00-2255f04cd85d\") " pod="openstack/ceilometer-0" Jan 28 21:03:46 crc kubenswrapper[4746]: I0128 21:03:46.786254 4746 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-dqfq7\" (UniqueName: \"kubernetes.io/projected/6f6522f9-6035-4484-ba00-2255f04cd85d-kube-api-access-dqfq7\") pod \"ceilometer-0\" (UID: \"6f6522f9-6035-4484-ba00-2255f04cd85d\") " pod="openstack/ceilometer-0" Jan 28 21:03:46 crc kubenswrapper[4746]: I0128 21:03:46.786294 4746 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/6f6522f9-6035-4484-ba00-2255f04cd85d-config-data\") pod \"ceilometer-0\" (UID: \"6f6522f9-6035-4484-ba00-2255f04cd85d\") " pod="openstack/ceilometer-0" Jan 28 21:03:46 crc kubenswrapper[4746]: I0128 21:03:46.848884 4746 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="a4612060-a99e-4bc4-b074-2dffa9cc7050" path="/var/lib/kubelet/pods/a4612060-a99e-4bc4-b074-2dffa9cc7050/volumes" Jan 28 21:03:46 crc kubenswrapper[4746]: I0128 
21:03:46.888481 4746 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/6f6522f9-6035-4484-ba00-2255f04cd85d-scripts\") pod \"ceilometer-0\" (UID: \"6f6522f9-6035-4484-ba00-2255f04cd85d\") " pod="openstack/ceilometer-0" Jan 28 21:03:46 crc kubenswrapper[4746]: I0128 21:03:46.889018 4746 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/6f6522f9-6035-4484-ba00-2255f04cd85d-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"6f6522f9-6035-4484-ba00-2255f04cd85d\") " pod="openstack/ceilometer-0" Jan 28 21:03:46 crc kubenswrapper[4746]: I0128 21:03:46.889044 4746 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/6f6522f9-6035-4484-ba00-2255f04cd85d-log-httpd\") pod \"ceilometer-0\" (UID: \"6f6522f9-6035-4484-ba00-2255f04cd85d\") " pod="openstack/ceilometer-0" Jan 28 21:03:46 crc kubenswrapper[4746]: I0128 21:03:46.889343 4746 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/6f6522f9-6035-4484-ba00-2255f04cd85d-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"6f6522f9-6035-4484-ba00-2255f04cd85d\") " pod="openstack/ceilometer-0" Jan 28 21:03:46 crc kubenswrapper[4746]: I0128 21:03:46.889365 4746 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ceilometer-tls-certs\" (UniqueName: \"kubernetes.io/secret/6f6522f9-6035-4484-ba00-2255f04cd85d-ceilometer-tls-certs\") pod \"ceilometer-0\" (UID: \"6f6522f9-6035-4484-ba00-2255f04cd85d\") " pod="openstack/ceilometer-0" Jan 28 21:03:46 crc kubenswrapper[4746]: I0128 21:03:46.889408 4746 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/6f6522f9-6035-4484-ba00-2255f04cd85d-run-httpd\") pod \"ceilometer-0\" 
(UID: \"6f6522f9-6035-4484-ba00-2255f04cd85d\") " pod="openstack/ceilometer-0" Jan 28 21:03:46 crc kubenswrapper[4746]: I0128 21:03:46.889445 4746 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-dqfq7\" (UniqueName: \"kubernetes.io/projected/6f6522f9-6035-4484-ba00-2255f04cd85d-kube-api-access-dqfq7\") pod \"ceilometer-0\" (UID: \"6f6522f9-6035-4484-ba00-2255f04cd85d\") " pod="openstack/ceilometer-0" Jan 28 21:03:46 crc kubenswrapper[4746]: I0128 21:03:46.889505 4746 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/6f6522f9-6035-4484-ba00-2255f04cd85d-config-data\") pod \"ceilometer-0\" (UID: \"6f6522f9-6035-4484-ba00-2255f04cd85d\") " pod="openstack/ceilometer-0" Jan 28 21:03:46 crc kubenswrapper[4746]: I0128 21:03:46.889754 4746 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/6f6522f9-6035-4484-ba00-2255f04cd85d-log-httpd\") pod \"ceilometer-0\" (UID: \"6f6522f9-6035-4484-ba00-2255f04cd85d\") " pod="openstack/ceilometer-0" Jan 28 21:03:46 crc kubenswrapper[4746]: I0128 21:03:46.892716 4746 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ceilometer-tls-certs\" (UniqueName: \"kubernetes.io/secret/6f6522f9-6035-4484-ba00-2255f04cd85d-ceilometer-tls-certs\") pod \"ceilometer-0\" (UID: \"6f6522f9-6035-4484-ba00-2255f04cd85d\") " pod="openstack/ceilometer-0" Jan 28 21:03:46 crc kubenswrapper[4746]: I0128 21:03:46.892828 4746 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/6f6522f9-6035-4484-ba00-2255f04cd85d-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"6f6522f9-6035-4484-ba00-2255f04cd85d\") " pod="openstack/ceilometer-0" Jan 28 21:03:46 crc kubenswrapper[4746]: I0128 21:03:46.893468 4746 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run-httpd\" 
(UniqueName: \"kubernetes.io/empty-dir/6f6522f9-6035-4484-ba00-2255f04cd85d-run-httpd\") pod \"ceilometer-0\" (UID: \"6f6522f9-6035-4484-ba00-2255f04cd85d\") " pod="openstack/ceilometer-0" Jan 28 21:03:46 crc kubenswrapper[4746]: I0128 21:03:46.895294 4746 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/6f6522f9-6035-4484-ba00-2255f04cd85d-scripts\") pod \"ceilometer-0\" (UID: \"6f6522f9-6035-4484-ba00-2255f04cd85d\") " pod="openstack/ceilometer-0" Jan 28 21:03:46 crc kubenswrapper[4746]: I0128 21:03:46.895832 4746 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/6f6522f9-6035-4484-ba00-2255f04cd85d-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"6f6522f9-6035-4484-ba00-2255f04cd85d\") " pod="openstack/ceilometer-0" Jan 28 21:03:46 crc kubenswrapper[4746]: I0128 21:03:46.896481 4746 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/6f6522f9-6035-4484-ba00-2255f04cd85d-config-data\") pod \"ceilometer-0\" (UID: \"6f6522f9-6035-4484-ba00-2255f04cd85d\") " pod="openstack/ceilometer-0" Jan 28 21:03:46 crc kubenswrapper[4746]: I0128 21:03:46.928102 4746 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-dqfq7\" (UniqueName: \"kubernetes.io/projected/6f6522f9-6035-4484-ba00-2255f04cd85d-kube-api-access-dqfq7\") pod \"ceilometer-0\" (UID: \"6f6522f9-6035-4484-ba00-2255f04cd85d\") " pod="openstack/ceilometer-0" Jan 28 21:03:46 crc kubenswrapper[4746]: I0128 21:03:46.956646 4746 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/ceilometer-0" Jan 28 21:03:48 crc kubenswrapper[4746]: I0128 21:03:48.116827 4746 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/rabbitmq-server-0" podUID="88718387-09d6-4e3d-a06f-4353ba42ce91" containerName="rabbitmq" containerID="cri-o://d03a46ad73048c6c7b226dc21426667c6e3fd353d111f06a75a8bf91d50aa9fd" gracePeriod=604796 Jan 28 21:03:48 crc kubenswrapper[4746]: I0128 21:03:48.698951 4746 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/rabbitmq-cell1-server-0" podUID="701360b2-121a-4cb4-9a4f-9ce63391e740" containerName="rabbitmq" containerID="cri-o://ed9100b5aaafbed1ab7eef3015c4df8d2843b7bec57597f5b18c522d626ddc01" gracePeriod=604796 Jan 28 21:03:50 crc kubenswrapper[4746]: I0128 21:03:50.084527 4746 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/rabbitmq-server-0" podUID="88718387-09d6-4e3d-a06f-4353ba42ce91" containerName="rabbitmq" probeResult="failure" output="dial tcp 10.217.0.106:5671: connect: connection refused" Jan 28 21:03:50 crc kubenswrapper[4746]: I0128 21:03:50.467624 4746 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/rabbitmq-cell1-server-0" podUID="701360b2-121a-4cb4-9a4f-9ce63391e740" containerName="rabbitmq" probeResult="failure" output="dial tcp 10.217.0.107:5671: connect: connection refused" Jan 28 21:03:51 crc kubenswrapper[4746]: I0128 21:03:51.261397 4746 scope.go:117] "RemoveContainer" containerID="820e061effa263cd6da98a16cd6aa47f8ccc39b8a23d57ef1f02a28d620949d6" Jan 28 21:03:51 crc kubenswrapper[4746]: I0128 21:03:51.878782 4746 scope.go:117] "RemoveContainer" containerID="dc24cad982129800500dbe58a114a149545ff2de5f1fd9354649314a70c9220a" Jan 28 21:03:51 crc kubenswrapper[4746]: I0128 21:03:51.907687 4746 scope.go:117] "RemoveContainer" containerID="870712e1d90ee565a39d018ffd7c3ff66ecc4586dfd9bea7574cbf87b0b91b88" Jan 28 21:03:52 crc kubenswrapper[4746]: I0128 21:03:52.331695 4746 
kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ceilometer-0"] Jan 28 21:03:52 crc kubenswrapper[4746]: W0128 21:03:52.334800 4746 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod6f6522f9_6035_4484_ba00_2255f04cd85d.slice/crio-f245b48e584c3bf7011c805f8fafeeb154b054f25188337f716e8c1bff14654a WatchSource:0}: Error finding container f245b48e584c3bf7011c805f8fafeeb154b054f25188337f716e8c1bff14654a: Status 404 returned error can't find the container with id f245b48e584c3bf7011c805f8fafeeb154b054f25188337f716e8c1bff14654a Jan 28 21:03:52 crc kubenswrapper[4746]: I0128 21:03:52.612767 4746 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"6f6522f9-6035-4484-ba00-2255f04cd85d","Type":"ContainerStarted","Data":"f245b48e584c3bf7011c805f8fafeeb154b054f25188337f716e8c1bff14654a"} Jan 28 21:03:52 crc kubenswrapper[4746]: I0128 21:03:52.616033 4746 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cloudkitty-db-sync-hz5db" event={"ID":"fb91f276-e145-42e8-a53a-72b1f8311302","Type":"ContainerStarted","Data":"b6d0f5ff00d9ed0151d4228f2de581e21e8c62bcc21f24bacd68d86bc5d26826"} Jan 28 21:03:52 crc kubenswrapper[4746]: I0128 21:03:52.633165 4746 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/cloudkitty-db-sync-hz5db" podStartSLOduration=1.808453982 podStartE2EDuration="11.633146538s" podCreationTimestamp="2026-01-28 21:03:41 +0000 UTC" firstStartedPulling="2026-01-28 21:03:42.107166215 +0000 UTC m=+1450.063352569" lastFinishedPulling="2026-01-28 21:03:51.931858771 +0000 UTC m=+1459.888045125" observedRunningTime="2026-01-28 21:03:52.632270674 +0000 UTC m=+1460.588457048" watchObservedRunningTime="2026-01-28 21:03:52.633146538 +0000 UTC m=+1460.589332892" Jan 28 21:03:54 crc kubenswrapper[4746]: I0128 21:03:54.646235 4746 generic.go:334] "Generic (PLEG): container finished" 
podID="fb91f276-e145-42e8-a53a-72b1f8311302" containerID="b6d0f5ff00d9ed0151d4228f2de581e21e8c62bcc21f24bacd68d86bc5d26826" exitCode=0 Jan 28 21:03:54 crc kubenswrapper[4746]: I0128 21:03:54.646307 4746 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cloudkitty-db-sync-hz5db" event={"ID":"fb91f276-e145-42e8-a53a-72b1f8311302","Type":"ContainerDied","Data":"b6d0f5ff00d9ed0151d4228f2de581e21e8c62bcc21f24bacd68d86bc5d26826"} Jan 28 21:03:54 crc kubenswrapper[4746]: I0128 21:03:54.648918 4746 generic.go:334] "Generic (PLEG): container finished" podID="88718387-09d6-4e3d-a06f-4353ba42ce91" containerID="d03a46ad73048c6c7b226dc21426667c6e3fd353d111f06a75a8bf91d50aa9fd" exitCode=0 Jan 28 21:03:54 crc kubenswrapper[4746]: I0128 21:03:54.648948 4746 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-server-0" event={"ID":"88718387-09d6-4e3d-a06f-4353ba42ce91","Type":"ContainerDied","Data":"d03a46ad73048c6c7b226dc21426667c6e3fd353d111f06a75a8bf91d50aa9fd"} Jan 28 21:03:55 crc kubenswrapper[4746]: I0128 21:03:55.664228 4746 generic.go:334] "Generic (PLEG): container finished" podID="701360b2-121a-4cb4-9a4f-9ce63391e740" containerID="ed9100b5aaafbed1ab7eef3015c4df8d2843b7bec57597f5b18c522d626ddc01" exitCode=0 Jan 28 21:03:55 crc kubenswrapper[4746]: I0128 21:03:55.664386 4746 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-cell1-server-0" event={"ID":"701360b2-121a-4cb4-9a4f-9ce63391e740","Type":"ContainerDied","Data":"ed9100b5aaafbed1ab7eef3015c4df8d2843b7bec57597f5b18c522d626ddc01"} Jan 28 21:03:55 crc kubenswrapper[4746]: I0128 21:03:55.887674 4746 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/rabbitmq-server-0" Jan 28 21:03:55 crc kubenswrapper[4746]: I0128 21:03:55.987092 4746 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"persistence\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-970e444a-5d69-45cf-8cae-8be3d06f1dae\") pod \"88718387-09d6-4e3d-a06f-4353ba42ce91\" (UID: \"88718387-09d6-4e3d-a06f-4353ba42ce91\") " Jan 28 21:03:55 crc kubenswrapper[4746]: I0128 21:03:55.987162 4746 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/88718387-09d6-4e3d-a06f-4353ba42ce91-config-data\") pod \"88718387-09d6-4e3d-a06f-4353ba42ce91\" (UID: \"88718387-09d6-4e3d-a06f-4353ba42ce91\") " Jan 28 21:03:55 crc kubenswrapper[4746]: I0128 21:03:55.987210 4746 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/88718387-09d6-4e3d-a06f-4353ba42ce91-rabbitmq-plugins\") pod \"88718387-09d6-4e3d-a06f-4353ba42ce91\" (UID: \"88718387-09d6-4e3d-a06f-4353ba42ce91\") " Jan 28 21:03:55 crc kubenswrapper[4746]: I0128 21:03:55.987267 4746 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/88718387-09d6-4e3d-a06f-4353ba42ce91-rabbitmq-confd\") pod \"88718387-09d6-4e3d-a06f-4353ba42ce91\" (UID: \"88718387-09d6-4e3d-a06f-4353ba42ce91\") " Jan 28 21:03:55 crc kubenswrapper[4746]: I0128 21:03:55.987295 4746 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/88718387-09d6-4e3d-a06f-4353ba42ce91-erlang-cookie-secret\") pod \"88718387-09d6-4e3d-a06f-4353ba42ce91\" (UID: \"88718387-09d6-4e3d-a06f-4353ba42ce91\") " Jan 28 21:03:55 crc kubenswrapper[4746]: I0128 21:03:55.987326 4746 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume 
\"rabbitmq-tls\" (UniqueName: \"kubernetes.io/projected/88718387-09d6-4e3d-a06f-4353ba42ce91-rabbitmq-tls\") pod \"88718387-09d6-4e3d-a06f-4353ba42ce91\" (UID: \"88718387-09d6-4e3d-a06f-4353ba42ce91\") " Jan 28 21:03:55 crc kubenswrapper[4746]: I0128 21:03:55.987380 4746 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/88718387-09d6-4e3d-a06f-4353ba42ce91-plugins-conf\") pod \"88718387-09d6-4e3d-a06f-4353ba42ce91\" (UID: \"88718387-09d6-4e3d-a06f-4353ba42ce91\") " Jan 28 21:03:55 crc kubenswrapper[4746]: I0128 21:03:55.987444 4746 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/88718387-09d6-4e3d-a06f-4353ba42ce91-pod-info\") pod \"88718387-09d6-4e3d-a06f-4353ba42ce91\" (UID: \"88718387-09d6-4e3d-a06f-4353ba42ce91\") " Jan 28 21:03:55 crc kubenswrapper[4746]: I0128 21:03:55.987481 4746 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/88718387-09d6-4e3d-a06f-4353ba42ce91-rabbitmq-erlang-cookie\") pod \"88718387-09d6-4e3d-a06f-4353ba42ce91\" (UID: \"88718387-09d6-4e3d-a06f-4353ba42ce91\") " Jan 28 21:03:55 crc kubenswrapper[4746]: I0128 21:03:55.987584 4746 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"server-conf\" (UniqueName: \"kubernetes.io/configmap/88718387-09d6-4e3d-a06f-4353ba42ce91-server-conf\") pod \"88718387-09d6-4e3d-a06f-4353ba42ce91\" (UID: \"88718387-09d6-4e3d-a06f-4353ba42ce91\") " Jan 28 21:03:55 crc kubenswrapper[4746]: I0128 21:03:55.987662 4746 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-bkb6z\" (UniqueName: \"kubernetes.io/projected/88718387-09d6-4e3d-a06f-4353ba42ce91-kube-api-access-bkb6z\") pod \"88718387-09d6-4e3d-a06f-4353ba42ce91\" (UID: \"88718387-09d6-4e3d-a06f-4353ba42ce91\") " Jan 28 
21:03:55 crc kubenswrapper[4746]: I0128 21:03:55.994575 4746 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/88718387-09d6-4e3d-a06f-4353ba42ce91-rabbitmq-erlang-cookie" (OuterVolumeSpecName: "rabbitmq-erlang-cookie") pod "88718387-09d6-4e3d-a06f-4353ba42ce91" (UID: "88718387-09d6-4e3d-a06f-4353ba42ce91"). InnerVolumeSpecName "rabbitmq-erlang-cookie". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 28 21:03:55 crc kubenswrapper[4746]: I0128 21:03:55.994978 4746 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/88718387-09d6-4e3d-a06f-4353ba42ce91-rabbitmq-plugins" (OuterVolumeSpecName: "rabbitmq-plugins") pod "88718387-09d6-4e3d-a06f-4353ba42ce91" (UID: "88718387-09d6-4e3d-a06f-4353ba42ce91"). InnerVolumeSpecName "rabbitmq-plugins". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 28 21:03:55 crc kubenswrapper[4746]: I0128 21:03:55.995301 4746 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/88718387-09d6-4e3d-a06f-4353ba42ce91-plugins-conf" (OuterVolumeSpecName: "plugins-conf") pod "88718387-09d6-4e3d-a06f-4353ba42ce91" (UID: "88718387-09d6-4e3d-a06f-4353ba42ce91"). InnerVolumeSpecName "plugins-conf". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 28 21:03:56 crc kubenswrapper[4746]: I0128 21:03:56.043724 4746 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/downward-api/88718387-09d6-4e3d-a06f-4353ba42ce91-pod-info" (OuterVolumeSpecName: "pod-info") pod "88718387-09d6-4e3d-a06f-4353ba42ce91" (UID: "88718387-09d6-4e3d-a06f-4353ba42ce91"). InnerVolumeSpecName "pod-info". 
PluginName "kubernetes.io/downward-api", VolumeGidValue ""
Jan 28 21:03:56 crc kubenswrapper[4746]: I0128 21:03:56.043810 4746 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/88718387-09d6-4e3d-a06f-4353ba42ce91-erlang-cookie-secret" (OuterVolumeSpecName: "erlang-cookie-secret") pod "88718387-09d6-4e3d-a06f-4353ba42ce91" (UID: "88718387-09d6-4e3d-a06f-4353ba42ce91"). InnerVolumeSpecName "erlang-cookie-secret". PluginName "kubernetes.io/secret", VolumeGidValue ""
Jan 28 21:03:56 crc kubenswrapper[4746]: I0128 21:03:56.043908 4746 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/88718387-09d6-4e3d-a06f-4353ba42ce91-rabbitmq-tls" (OuterVolumeSpecName: "rabbitmq-tls") pod "88718387-09d6-4e3d-a06f-4353ba42ce91" (UID: "88718387-09d6-4e3d-a06f-4353ba42ce91"). InnerVolumeSpecName "rabbitmq-tls". PluginName "kubernetes.io/projected", VolumeGidValue ""
Jan 28 21:03:56 crc kubenswrapper[4746]: I0128 21:03:56.045374 4746 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/88718387-09d6-4e3d-a06f-4353ba42ce91-config-data" (OuterVolumeSpecName: "config-data") pod "88718387-09d6-4e3d-a06f-4353ba42ce91" (UID: "88718387-09d6-4e3d-a06f-4353ba42ce91"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Jan 28 21:03:56 crc kubenswrapper[4746]: I0128 21:03:56.047889 4746 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/88718387-09d6-4e3d-a06f-4353ba42ce91-kube-api-access-bkb6z" (OuterVolumeSpecName: "kube-api-access-bkb6z") pod "88718387-09d6-4e3d-a06f-4353ba42ce91" (UID: "88718387-09d6-4e3d-a06f-4353ba42ce91"). InnerVolumeSpecName "kube-api-access-bkb6z". PluginName "kubernetes.io/projected", VolumeGidValue ""
Jan 28 21:03:56 crc kubenswrapper[4746]: I0128 21:03:56.056951 4746 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/rabbitmq-cell1-server-0"
Jan 28 21:03:56 crc kubenswrapper[4746]: I0128 21:03:56.091421 4746 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-bkb6z\" (UniqueName: \"kubernetes.io/projected/88718387-09d6-4e3d-a06f-4353ba42ce91-kube-api-access-bkb6z\") on node \"crc\" DevicePath \"\""
Jan 28 21:03:56 crc kubenswrapper[4746]: I0128 21:03:56.091454 4746 reconciler_common.go:293] "Volume detached for volume \"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/88718387-09d6-4e3d-a06f-4353ba42ce91-rabbitmq-plugins\") on node \"crc\" DevicePath \"\""
Jan 28 21:03:56 crc kubenswrapper[4746]: I0128 21:03:56.091463 4746 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/88718387-09d6-4e3d-a06f-4353ba42ce91-config-data\") on node \"crc\" DevicePath \"\""
Jan 28 21:03:56 crc kubenswrapper[4746]: I0128 21:03:56.091471 4746 reconciler_common.go:293] "Volume detached for volume \"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/88718387-09d6-4e3d-a06f-4353ba42ce91-erlang-cookie-secret\") on node \"crc\" DevicePath \"\""
Jan 28 21:03:56 crc kubenswrapper[4746]: I0128 21:03:56.091480 4746 reconciler_common.go:293] "Volume detached for volume \"rabbitmq-tls\" (UniqueName: \"kubernetes.io/projected/88718387-09d6-4e3d-a06f-4353ba42ce91-rabbitmq-tls\") on node \"crc\" DevicePath \"\""
Jan 28 21:03:56 crc kubenswrapper[4746]: I0128 21:03:56.091488 4746 reconciler_common.go:293] "Volume detached for volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/88718387-09d6-4e3d-a06f-4353ba42ce91-plugins-conf\") on node \"crc\" DevicePath \"\""
Jan 28 21:03:56 crc kubenswrapper[4746]: I0128 21:03:56.091496 4746 reconciler_common.go:293] "Volume detached for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/88718387-09d6-4e3d-a06f-4353ba42ce91-pod-info\") on node \"crc\" DevicePath \"\""
Jan 28 21:03:56 crc kubenswrapper[4746]: I0128 21:03:56.091504 4746 reconciler_common.go:293] "Volume detached for volume \"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/88718387-09d6-4e3d-a06f-4353ba42ce91-rabbitmq-erlang-cookie\") on node \"crc\" DevicePath \"\""
Jan 28 21:03:56 crc kubenswrapper[4746]: I0128 21:03:56.107268 4746 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-970e444a-5d69-45cf-8cae-8be3d06f1dae" (OuterVolumeSpecName: "persistence") pod "88718387-09d6-4e3d-a06f-4353ba42ce91" (UID: "88718387-09d6-4e3d-a06f-4353ba42ce91"). InnerVolumeSpecName "pvc-970e444a-5d69-45cf-8cae-8be3d06f1dae". PluginName "kubernetes.io/csi", VolumeGidValue ""
Jan 28 21:03:56 crc kubenswrapper[4746]: I0128 21:03:56.156783 4746 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/88718387-09d6-4e3d-a06f-4353ba42ce91-server-conf" (OuterVolumeSpecName: "server-conf") pod "88718387-09d6-4e3d-a06f-4353ba42ce91" (UID: "88718387-09d6-4e3d-a06f-4353ba42ce91"). InnerVolumeSpecName "server-conf". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Jan 28 21:03:56 crc kubenswrapper[4746]: I0128 21:03:56.193700 4746 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/701360b2-121a-4cb4-9a4f-9ce63391e740-rabbitmq-erlang-cookie\") pod \"701360b2-121a-4cb4-9a4f-9ce63391e740\" (UID: \"701360b2-121a-4cb4-9a4f-9ce63391e740\") "
Jan 28 21:03:56 crc kubenswrapper[4746]: I0128 21:03:56.193753 4746 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/701360b2-121a-4cb4-9a4f-9ce63391e740-rabbitmq-confd\") pod \"701360b2-121a-4cb4-9a4f-9ce63391e740\" (UID: \"701360b2-121a-4cb4-9a4f-9ce63391e740\") "
Jan 28 21:03:56 crc kubenswrapper[4746]: I0128 21:03:56.193834 4746 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/701360b2-121a-4cb4-9a4f-9ce63391e740-rabbitmq-plugins\") pod \"701360b2-121a-4cb4-9a4f-9ce63391e740\" (UID: \"701360b2-121a-4cb4-9a4f-9ce63391e740\") "
Jan 28 21:03:56 crc kubenswrapper[4746]: I0128 21:03:56.193888 4746 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"server-conf\" (UniqueName: \"kubernetes.io/configmap/701360b2-121a-4cb4-9a4f-9ce63391e740-server-conf\") pod \"701360b2-121a-4cb4-9a4f-9ce63391e740\" (UID: \"701360b2-121a-4cb4-9a4f-9ce63391e740\") "
Jan 28 21:03:56 crc kubenswrapper[4746]: I0128 21:03:56.194001 4746 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"rabbitmq-tls\" (UniqueName: \"kubernetes.io/projected/701360b2-121a-4cb4-9a4f-9ce63391e740-rabbitmq-tls\") pod \"701360b2-121a-4cb4-9a4f-9ce63391e740\" (UID: \"701360b2-121a-4cb4-9a4f-9ce63391e740\") "
Jan 28 21:03:56 crc kubenswrapper[4746]: I0128 21:03:56.194053 4746 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/701360b2-121a-4cb4-9a4f-9ce63391e740-plugins-conf\") pod \"701360b2-121a-4cb4-9a4f-9ce63391e740\" (UID: \"701360b2-121a-4cb4-9a4f-9ce63391e740\") "
Jan 28 21:03:56 crc kubenswrapper[4746]: I0128 21:03:56.194073 4746 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-wr2kk\" (UniqueName: \"kubernetes.io/projected/701360b2-121a-4cb4-9a4f-9ce63391e740-kube-api-access-wr2kk\") pod \"701360b2-121a-4cb4-9a4f-9ce63391e740\" (UID: \"701360b2-121a-4cb4-9a4f-9ce63391e740\") "
Jan 28 21:03:56 crc kubenswrapper[4746]: I0128 21:03:56.194116 4746 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/701360b2-121a-4cb4-9a4f-9ce63391e740-pod-info\") pod \"701360b2-121a-4cb4-9a4f-9ce63391e740\" (UID: \"701360b2-121a-4cb4-9a4f-9ce63391e740\") "
Jan 28 21:03:56 crc kubenswrapper[4746]: I0128 21:03:56.194176 4746 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/701360b2-121a-4cb4-9a4f-9ce63391e740-config-data\") pod \"701360b2-121a-4cb4-9a4f-9ce63391e740\" (UID: \"701360b2-121a-4cb4-9a4f-9ce63391e740\") "
Jan 28 21:03:56 crc kubenswrapper[4746]: I0128 21:03:56.194201 4746 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/701360b2-121a-4cb4-9a4f-9ce63391e740-erlang-cookie-secret\") pod \"701360b2-121a-4cb4-9a4f-9ce63391e740\" (UID: \"701360b2-121a-4cb4-9a4f-9ce63391e740\") "
Jan 28 21:03:56 crc kubenswrapper[4746]: I0128 21:03:56.198191 4746 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"persistence\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-4ec30e86-c9d3-4c1e-a1f0-0bea12b5bd54\") pod \"701360b2-121a-4cb4-9a4f-9ce63391e740\" (UID: \"701360b2-121a-4cb4-9a4f-9ce63391e740\") "
Jan 28 21:03:56 crc kubenswrapper[4746]: I0128 21:03:56.198969 4746 reconciler_common.go:293] "Volume detached for volume \"server-conf\" (UniqueName: \"kubernetes.io/configmap/88718387-09d6-4e3d-a06f-4353ba42ce91-server-conf\") on node \"crc\" DevicePath \"\""
Jan 28 21:03:56 crc kubenswrapper[4746]: I0128 21:03:56.199023 4746 reconciler_common.go:286] "operationExecutor.UnmountDevice started for volume \"pvc-970e444a-5d69-45cf-8cae-8be3d06f1dae\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-970e444a-5d69-45cf-8cae-8be3d06f1dae\") on node \"crc\" "
Jan 28 21:03:56 crc kubenswrapper[4746]: I0128 21:03:56.199751 4746 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/701360b2-121a-4cb4-9a4f-9ce63391e740-plugins-conf" (OuterVolumeSpecName: "plugins-conf") pod "701360b2-121a-4cb4-9a4f-9ce63391e740" (UID: "701360b2-121a-4cb4-9a4f-9ce63391e740"). InnerVolumeSpecName "plugins-conf". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Jan 28 21:03:56 crc kubenswrapper[4746]: I0128 21:03:56.209025 4746 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/701360b2-121a-4cb4-9a4f-9ce63391e740-rabbitmq-plugins" (OuterVolumeSpecName: "rabbitmq-plugins") pod "701360b2-121a-4cb4-9a4f-9ce63391e740" (UID: "701360b2-121a-4cb4-9a4f-9ce63391e740"). InnerVolumeSpecName "rabbitmq-plugins". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Jan 28 21:03:56 crc kubenswrapper[4746]: I0128 21:03:56.219975 4746 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/701360b2-121a-4cb4-9a4f-9ce63391e740-rabbitmq-erlang-cookie" (OuterVolumeSpecName: "rabbitmq-erlang-cookie") pod "701360b2-121a-4cb4-9a4f-9ce63391e740" (UID: "701360b2-121a-4cb4-9a4f-9ce63391e740"). InnerVolumeSpecName "rabbitmq-erlang-cookie". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Jan 28 21:03:56 crc kubenswrapper[4746]: I0128 21:03:56.224691 4746 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/701360b2-121a-4cb4-9a4f-9ce63391e740-erlang-cookie-secret" (OuterVolumeSpecName: "erlang-cookie-secret") pod "701360b2-121a-4cb4-9a4f-9ce63391e740" (UID: "701360b2-121a-4cb4-9a4f-9ce63391e740"). InnerVolumeSpecName "erlang-cookie-secret". PluginName "kubernetes.io/secret", VolumeGidValue ""
Jan 28 21:03:56 crc kubenswrapper[4746]: I0128 21:03:56.224744 4746 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/701360b2-121a-4cb4-9a4f-9ce63391e740-rabbitmq-tls" (OuterVolumeSpecName: "rabbitmq-tls") pod "701360b2-121a-4cb4-9a4f-9ce63391e740" (UID: "701360b2-121a-4cb4-9a4f-9ce63391e740"). InnerVolumeSpecName "rabbitmq-tls". PluginName "kubernetes.io/projected", VolumeGidValue ""
Jan 28 21:03:56 crc kubenswrapper[4746]: I0128 21:03:56.224777 4746 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/701360b2-121a-4cb4-9a4f-9ce63391e740-kube-api-access-wr2kk" (OuterVolumeSpecName: "kube-api-access-wr2kk") pod "701360b2-121a-4cb4-9a4f-9ce63391e740" (UID: "701360b2-121a-4cb4-9a4f-9ce63391e740"). InnerVolumeSpecName "kube-api-access-wr2kk". PluginName "kubernetes.io/projected", VolumeGidValue ""
Jan 28 21:03:56 crc kubenswrapper[4746]: I0128 21:03:56.225149 4746 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/downward-api/701360b2-121a-4cb4-9a4f-9ce63391e740-pod-info" (OuterVolumeSpecName: "pod-info") pod "701360b2-121a-4cb4-9a4f-9ce63391e740" (UID: "701360b2-121a-4cb4-9a4f-9ce63391e740"). InnerVolumeSpecName "pod-info". PluginName "kubernetes.io/downward-api", VolumeGidValue ""
Jan 28 21:03:56 crc kubenswrapper[4746]: I0128 21:03:56.246742 4746 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-4ec30e86-c9d3-4c1e-a1f0-0bea12b5bd54" (OuterVolumeSpecName: "persistence") pod "701360b2-121a-4cb4-9a4f-9ce63391e740" (UID: "701360b2-121a-4cb4-9a4f-9ce63391e740"). InnerVolumeSpecName "pvc-4ec30e86-c9d3-4c1e-a1f0-0bea12b5bd54". PluginName "kubernetes.io/csi", VolumeGidValue ""
Jan 28 21:03:56 crc kubenswrapper[4746]: I0128 21:03:56.251280 4746 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/88718387-09d6-4e3d-a06f-4353ba42ce91-rabbitmq-confd" (OuterVolumeSpecName: "rabbitmq-confd") pod "88718387-09d6-4e3d-a06f-4353ba42ce91" (UID: "88718387-09d6-4e3d-a06f-4353ba42ce91"). InnerVolumeSpecName "rabbitmq-confd". PluginName "kubernetes.io/projected", VolumeGidValue ""
Jan 28 21:03:56 crc kubenswrapper[4746]: I0128 21:03:56.275407 4746 csi_attacher.go:630] kubernetes.io/csi: attacher.UnmountDevice STAGE_UNSTAGE_VOLUME capability not set. Skipping UnmountDevice...
Jan 28 21:03:56 crc kubenswrapper[4746]: I0128 21:03:56.275630 4746 operation_generator.go:917] UnmountDevice succeeded for volume "pvc-970e444a-5d69-45cf-8cae-8be3d06f1dae" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-970e444a-5d69-45cf-8cae-8be3d06f1dae") on node "crc"
Jan 28 21:03:56 crc kubenswrapper[4746]: I0128 21:03:56.286982 4746 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/701360b2-121a-4cb4-9a4f-9ce63391e740-config-data" (OuterVolumeSpecName: "config-data") pod "701360b2-121a-4cb4-9a4f-9ce63391e740" (UID: "701360b2-121a-4cb4-9a4f-9ce63391e740"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Jan 28 21:03:56 crc kubenswrapper[4746]: I0128 21:03:56.295209 4746 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/701360b2-121a-4cb4-9a4f-9ce63391e740-server-conf" (OuterVolumeSpecName: "server-conf") pod "701360b2-121a-4cb4-9a4f-9ce63391e740" (UID: "701360b2-121a-4cb4-9a4f-9ce63391e740"). InnerVolumeSpecName "server-conf". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Jan 28 21:03:56 crc kubenswrapper[4746]: I0128 21:03:56.304763 4746 reconciler_common.go:293] "Volume detached for volume \"rabbitmq-tls\" (UniqueName: \"kubernetes.io/projected/701360b2-121a-4cb4-9a4f-9ce63391e740-rabbitmq-tls\") on node \"crc\" DevicePath \"\""
Jan 28 21:03:56 crc kubenswrapper[4746]: I0128 21:03:56.305055 4746 reconciler_common.go:293] "Volume detached for volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/701360b2-121a-4cb4-9a4f-9ce63391e740-plugins-conf\") on node \"crc\" DevicePath \"\""
Jan 28 21:03:56 crc kubenswrapper[4746]: I0128 21:03:56.305064 4746 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-wr2kk\" (UniqueName: \"kubernetes.io/projected/701360b2-121a-4cb4-9a4f-9ce63391e740-kube-api-access-wr2kk\") on node \"crc\" DevicePath \"\""
Jan 28 21:03:56 crc kubenswrapper[4746]: I0128 21:03:56.305094 4746 reconciler_common.go:293] "Volume detached for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/701360b2-121a-4cb4-9a4f-9ce63391e740-pod-info\") on node \"crc\" DevicePath \"\""
Jan 28 21:03:56 crc kubenswrapper[4746]: I0128 21:03:56.305107 4746 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/701360b2-121a-4cb4-9a4f-9ce63391e740-config-data\") on node \"crc\" DevicePath \"\""
Jan 28 21:03:56 crc kubenswrapper[4746]: I0128 21:03:56.305115 4746 reconciler_common.go:293] "Volume detached for volume \"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/701360b2-121a-4cb4-9a4f-9ce63391e740-erlang-cookie-secret\") on node \"crc\" DevicePath \"\""
Jan 28 21:03:56 crc kubenswrapper[4746]: I0128 21:03:56.305139 4746 reconciler_common.go:286] "operationExecutor.UnmountDevice started for volume \"pvc-4ec30e86-c9d3-4c1e-a1f0-0bea12b5bd54\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-4ec30e86-c9d3-4c1e-a1f0-0bea12b5bd54\") on node \"crc\" "
Jan 28 21:03:56 crc kubenswrapper[4746]: I0128 21:03:56.305150 4746 reconciler_common.go:293] "Volume detached for volume \"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/701360b2-121a-4cb4-9a4f-9ce63391e740-rabbitmq-erlang-cookie\") on node \"crc\" DevicePath \"\""
Jan 28 21:03:56 crc kubenswrapper[4746]: I0128 21:03:56.305159 4746 reconciler_common.go:293] "Volume detached for volume \"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/701360b2-121a-4cb4-9a4f-9ce63391e740-rabbitmq-plugins\") on node \"crc\" DevicePath \"\""
Jan 28 21:03:56 crc kubenswrapper[4746]: I0128 21:03:56.305167 4746 reconciler_common.go:293] "Volume detached for volume \"server-conf\" (UniqueName: \"kubernetes.io/configmap/701360b2-121a-4cb4-9a4f-9ce63391e740-server-conf\") on node \"crc\" DevicePath \"\""
Jan 28 21:03:56 crc kubenswrapper[4746]: I0128 21:03:56.305175 4746 reconciler_common.go:293] "Volume detached for volume \"pvc-970e444a-5d69-45cf-8cae-8be3d06f1dae\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-970e444a-5d69-45cf-8cae-8be3d06f1dae\") on node \"crc\" DevicePath \"\""
Jan 28 21:03:56 crc kubenswrapper[4746]: I0128 21:03:56.305184 4746 reconciler_common.go:293] "Volume detached for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/88718387-09d6-4e3d-a06f-4353ba42ce91-rabbitmq-confd\") on node \"crc\" DevicePath \"\""
Jan 28 21:03:56 crc kubenswrapper[4746]: I0128 21:03:56.324727 4746 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/cloudkitty-db-sync-hz5db"
Jan 28 21:03:56 crc kubenswrapper[4746]: I0128 21:03:56.345705 4746 csi_attacher.go:630] kubernetes.io/csi: attacher.UnmountDevice STAGE_UNSTAGE_VOLUME capability not set. Skipping UnmountDevice...
Jan 28 21:03:56 crc kubenswrapper[4746]: I0128 21:03:56.345849 4746 operation_generator.go:917] UnmountDevice succeeded for volume "pvc-4ec30e86-c9d3-4c1e-a1f0-0bea12b5bd54" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-4ec30e86-c9d3-4c1e-a1f0-0bea12b5bd54") on node "crc"
Jan 28 21:03:56 crc kubenswrapper[4746]: I0128 21:03:56.372400 4746 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/701360b2-121a-4cb4-9a4f-9ce63391e740-rabbitmq-confd" (OuterVolumeSpecName: "rabbitmq-confd") pod "701360b2-121a-4cb4-9a4f-9ce63391e740" (UID: "701360b2-121a-4cb4-9a4f-9ce63391e740"). InnerVolumeSpecName "rabbitmq-confd". PluginName "kubernetes.io/projected", VolumeGidValue ""
Jan 28 21:03:56 crc kubenswrapper[4746]: I0128 21:03:56.413431 4746 reconciler_common.go:293] "Volume detached for volume \"pvc-4ec30e86-c9d3-4c1e-a1f0-0bea12b5bd54\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-4ec30e86-c9d3-4c1e-a1f0-0bea12b5bd54\") on node \"crc\" DevicePath \"\""
Jan 28 21:03:56 crc kubenswrapper[4746]: I0128 21:03:56.413463 4746 reconciler_common.go:293] "Volume detached for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/701360b2-121a-4cb4-9a4f-9ce63391e740-rabbitmq-confd\") on node \"crc\" DevicePath \"\""
Jan 28 21:03:56 crc kubenswrapper[4746]: I0128 21:03:56.514680 4746 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/fb91f276-e145-42e8-a53a-72b1f8311302-combined-ca-bundle\") pod \"fb91f276-e145-42e8-a53a-72b1f8311302\" (UID: \"fb91f276-e145-42e8-a53a-72b1f8311302\") "
Jan 28 21:03:56 crc kubenswrapper[4746]: I0128 21:03:56.514912 4746 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-cg4fr\" (UniqueName: \"kubernetes.io/projected/fb91f276-e145-42e8-a53a-72b1f8311302-kube-api-access-cg4fr\") pod \"fb91f276-e145-42e8-a53a-72b1f8311302\" (UID: \"fb91f276-e145-42e8-a53a-72b1f8311302\") "
Jan 28 21:03:56 crc kubenswrapper[4746]: I0128 21:03:56.515001 4746 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"certs\" (UniqueName: \"kubernetes.io/projected/fb91f276-e145-42e8-a53a-72b1f8311302-certs\") pod \"fb91f276-e145-42e8-a53a-72b1f8311302\" (UID: \"fb91f276-e145-42e8-a53a-72b1f8311302\") "
Jan 28 21:03:56 crc kubenswrapper[4746]: I0128 21:03:56.515071 4746 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/fb91f276-e145-42e8-a53a-72b1f8311302-config-data\") pod \"fb91f276-e145-42e8-a53a-72b1f8311302\" (UID: \"fb91f276-e145-42e8-a53a-72b1f8311302\") "
Jan 28 21:03:56 crc kubenswrapper[4746]: I0128 21:03:56.515148 4746 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/fb91f276-e145-42e8-a53a-72b1f8311302-scripts\") pod \"fb91f276-e145-42e8-a53a-72b1f8311302\" (UID: \"fb91f276-e145-42e8-a53a-72b1f8311302\") "
Jan 28 21:03:56 crc kubenswrapper[4746]: I0128 21:03:56.519682 4746 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/fb91f276-e145-42e8-a53a-72b1f8311302-scripts" (OuterVolumeSpecName: "scripts") pod "fb91f276-e145-42e8-a53a-72b1f8311302" (UID: "fb91f276-e145-42e8-a53a-72b1f8311302"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue ""
Jan 28 21:03:56 crc kubenswrapper[4746]: I0128 21:03:56.522413 4746 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/fb91f276-e145-42e8-a53a-72b1f8311302-kube-api-access-cg4fr" (OuterVolumeSpecName: "kube-api-access-cg4fr") pod "fb91f276-e145-42e8-a53a-72b1f8311302" (UID: "fb91f276-e145-42e8-a53a-72b1f8311302"). InnerVolumeSpecName "kube-api-access-cg4fr". PluginName "kubernetes.io/projected", VolumeGidValue ""
Jan 28 21:03:56 crc kubenswrapper[4746]: I0128 21:03:56.523722 4746 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/fb91f276-e145-42e8-a53a-72b1f8311302-certs" (OuterVolumeSpecName: "certs") pod "fb91f276-e145-42e8-a53a-72b1f8311302" (UID: "fb91f276-e145-42e8-a53a-72b1f8311302"). InnerVolumeSpecName "certs". PluginName "kubernetes.io/projected", VolumeGidValue ""
Jan 28 21:03:56 crc kubenswrapper[4746]: I0128 21:03:56.555181 4746 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/fb91f276-e145-42e8-a53a-72b1f8311302-config-data" (OuterVolumeSpecName: "config-data") pod "fb91f276-e145-42e8-a53a-72b1f8311302" (UID: "fb91f276-e145-42e8-a53a-72b1f8311302"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue ""
Jan 28 21:03:56 crc kubenswrapper[4746]: I0128 21:03:56.557795 4746 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/fb91f276-e145-42e8-a53a-72b1f8311302-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "fb91f276-e145-42e8-a53a-72b1f8311302" (UID: "fb91f276-e145-42e8-a53a-72b1f8311302"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue ""
Jan 28 21:03:56 crc kubenswrapper[4746]: I0128 21:03:56.617197 4746 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/fb91f276-e145-42e8-a53a-72b1f8311302-combined-ca-bundle\") on node \"crc\" DevicePath \"\""
Jan 28 21:03:56 crc kubenswrapper[4746]: I0128 21:03:56.617234 4746 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-cg4fr\" (UniqueName: \"kubernetes.io/projected/fb91f276-e145-42e8-a53a-72b1f8311302-kube-api-access-cg4fr\") on node \"crc\" DevicePath \"\""
Jan 28 21:03:56 crc kubenswrapper[4746]: I0128 21:03:56.617244 4746 reconciler_common.go:293] "Volume detached for volume \"certs\" (UniqueName: \"kubernetes.io/projected/fb91f276-e145-42e8-a53a-72b1f8311302-certs\") on node \"crc\" DevicePath \"\""
Jan 28 21:03:56 crc kubenswrapper[4746]: I0128 21:03:56.617253 4746 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/fb91f276-e145-42e8-a53a-72b1f8311302-config-data\") on node \"crc\" DevicePath \"\""
Jan 28 21:03:56 crc kubenswrapper[4746]: I0128 21:03:56.617261 4746 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/fb91f276-e145-42e8-a53a-72b1f8311302-scripts\") on node \"crc\" DevicePath \"\""
Jan 28 21:03:56 crc kubenswrapper[4746]: I0128 21:03:56.683874 4746 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cloudkitty-db-sync-hz5db" event={"ID":"fb91f276-e145-42e8-a53a-72b1f8311302","Type":"ContainerDied","Data":"5a1ce51fbc71d9c19570b53cb1c6b645cc384d4b558a64bb0f14486f2ccfc640"}
Jan 28 21:03:56 crc kubenswrapper[4746]: I0128 21:03:56.683915 4746 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="5a1ce51fbc71d9c19570b53cb1c6b645cc384d4b558a64bb0f14486f2ccfc640"
Jan 28 21:03:56 crc kubenswrapper[4746]: I0128 21:03:56.683970 4746 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/cloudkitty-db-sync-hz5db"
Jan 28 21:03:56 crc kubenswrapper[4746]: I0128 21:03:56.698826 4746 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"6f6522f9-6035-4484-ba00-2255f04cd85d","Type":"ContainerStarted","Data":"569ceed149c2730d11eb7aad10aa0f201b6bb8128ea4a28219ba86cd27fbdcfe"}
Jan 28 21:03:56 crc kubenswrapper[4746]: I0128 21:03:56.701807 4746 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/rabbitmq-server-0"
Jan 28 21:03:56 crc kubenswrapper[4746]: I0128 21:03:56.702127 4746 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-server-0" event={"ID":"88718387-09d6-4e3d-a06f-4353ba42ce91","Type":"ContainerDied","Data":"64fd52a5f6f1b19040c4a7e9bea22fab671a685a5e6979c081c503d6ab6812c7"}
Jan 28 21:03:56 crc kubenswrapper[4746]: I0128 21:03:56.702272 4746 scope.go:117] "RemoveContainer" containerID="d03a46ad73048c6c7b226dc21426667c6e3fd353d111f06a75a8bf91d50aa9fd"
Jan 28 21:03:56 crc kubenswrapper[4746]: I0128 21:03:56.710061 4746 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-cell1-server-0" event={"ID":"701360b2-121a-4cb4-9a4f-9ce63391e740","Type":"ContainerDied","Data":"288c0fefb19fd38219e1688e0284a2c2b633a493917d689a3750d7b80adf7b08"}
Jan 28 21:03:56 crc kubenswrapper[4746]: I0128 21:03:56.710844 4746 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/rabbitmq-cell1-server-0"
Jan 28 21:03:56 crc kubenswrapper[4746]: I0128 21:03:56.762259 4746 scope.go:117] "RemoveContainer" containerID="ff6dfe0b7527df02f28dd43e980cf6fdbb546706b9ce1c38c5e9ebe0e3c4c38a"
Jan 28 21:03:56 crc kubenswrapper[4746]: I0128 21:03:56.815863 4746 scope.go:117] "RemoveContainer" containerID="ed9100b5aaafbed1ab7eef3015c4df8d2843b7bec57597f5b18c522d626ddc01"
Jan 28 21:03:56 crc kubenswrapper[4746]: I0128 21:03:56.818032 4746 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/rabbitmq-server-0"]
Jan 28 21:03:56 crc kubenswrapper[4746]: I0128 21:03:56.871461 4746 scope.go:117] "RemoveContainer" containerID="f2f571246c74fa9c9e1b471c32df4a45d7e6ced3641e96b15c5ddca28302d0b2"
Jan 28 21:03:56 crc kubenswrapper[4746]: I0128 21:03:56.883979 4746 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/rabbitmq-server-0"]
Jan 28 21:03:56 crc kubenswrapper[4746]: I0128 21:03:56.884021 4746 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/rabbitmq-server-0"]
Jan 28 21:03:56 crc kubenswrapper[4746]: E0128 21:03:56.884351 4746 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="701360b2-121a-4cb4-9a4f-9ce63391e740" containerName="rabbitmq"
Jan 28 21:03:56 crc kubenswrapper[4746]: I0128 21:03:56.884363 4746 state_mem.go:107] "Deleted CPUSet assignment" podUID="701360b2-121a-4cb4-9a4f-9ce63391e740" containerName="rabbitmq"
Jan 28 21:03:56 crc kubenswrapper[4746]: E0128 21:03:56.884375 4746 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="88718387-09d6-4e3d-a06f-4353ba42ce91" containerName="setup-container"
Jan 28 21:03:56 crc kubenswrapper[4746]: I0128 21:03:56.884381 4746 state_mem.go:107] "Deleted CPUSet assignment" podUID="88718387-09d6-4e3d-a06f-4353ba42ce91" containerName="setup-container"
Jan 28 21:03:56 crc kubenswrapper[4746]: E0128 21:03:56.884397 4746 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="fb91f276-e145-42e8-a53a-72b1f8311302" containerName="cloudkitty-db-sync"
Jan 28 21:03:56 crc kubenswrapper[4746]: I0128 21:03:56.884405 4746 state_mem.go:107] "Deleted CPUSet assignment" podUID="fb91f276-e145-42e8-a53a-72b1f8311302" containerName="cloudkitty-db-sync"
Jan 28 21:03:56 crc kubenswrapper[4746]: E0128 21:03:56.884453 4746 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="701360b2-121a-4cb4-9a4f-9ce63391e740" containerName="setup-container"
Jan 28 21:03:56 crc kubenswrapper[4746]: I0128 21:03:56.884459 4746 state_mem.go:107] "Deleted CPUSet assignment" podUID="701360b2-121a-4cb4-9a4f-9ce63391e740" containerName="setup-container"
Jan 28 21:03:56 crc kubenswrapper[4746]: E0128 21:03:56.884469 4746 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="88718387-09d6-4e3d-a06f-4353ba42ce91" containerName="rabbitmq"
Jan 28 21:03:56 crc kubenswrapper[4746]: I0128 21:03:56.884475 4746 state_mem.go:107] "Deleted CPUSet assignment" podUID="88718387-09d6-4e3d-a06f-4353ba42ce91" containerName="rabbitmq"
Jan 28 21:03:56 crc kubenswrapper[4746]: I0128 21:03:56.886810 4746 memory_manager.go:354] "RemoveStaleState removing state" podUID="88718387-09d6-4e3d-a06f-4353ba42ce91" containerName="rabbitmq"
Jan 28 21:03:56 crc kubenswrapper[4746]: I0128 21:03:56.886848 4746 memory_manager.go:354] "RemoveStaleState removing state" podUID="701360b2-121a-4cb4-9a4f-9ce63391e740" containerName="rabbitmq"
Jan 28 21:03:56 crc kubenswrapper[4746]: I0128 21:03:56.886871 4746 memory_manager.go:354] "RemoveStaleState removing state" podUID="fb91f276-e145-42e8-a53a-72b1f8311302" containerName="cloudkitty-db-sync"
Jan 28 21:03:56 crc kubenswrapper[4746]: I0128 21:03:56.888064 4746 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-dbb88bf8c-c8pfb"]
Jan 28 21:03:56 crc kubenswrapper[4746]: I0128 21:03:56.888585 4746 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/rabbitmq-server-0"
Jan 28 21:03:56 crc kubenswrapper[4746]: I0128 21:03:56.897307 4746 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/rabbitmq-cell1-server-0"]
Jan 28 21:03:56 crc kubenswrapper[4746]: I0128 21:03:56.897412 4746 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-dbb88bf8c-c8pfb"
Jan 28 21:03:56 crc kubenswrapper[4746]: I0128 21:03:56.903422 4746 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-edpm-ipam"
Jan 28 21:03:56 crc kubenswrapper[4746]: I0128 21:03:56.903608 4746 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"rabbitmq-server-dockercfg-tsllx"
Jan 28 21:03:56 crc kubenswrapper[4746]: I0128 21:03:56.903632 4746 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"rabbitmq-config-data"
Jan 28 21:03:56 crc kubenswrapper[4746]: I0128 21:03:56.903741 4746 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"rabbitmq-default-user"
Jan 28 21:03:56 crc kubenswrapper[4746]: I0128 21:03:56.903819 4746 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"rabbitmq-server-conf"
Jan 28 21:03:56 crc kubenswrapper[4746]: I0128 21:03:56.903860 4746 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-rabbitmq-svc"
Jan 28 21:03:56 crc kubenswrapper[4746]: I0128 21:03:56.904040 4746 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"rabbitmq-erlang-cookie"
Jan 28 21:03:56 crc kubenswrapper[4746]: I0128 21:03:56.904292 4746 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"rabbitmq-plugins-conf"
Jan 28 21:03:56 crc kubenswrapper[4746]: I0128 21:03:56.906213 4746 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/rabbitmq-cell1-server-0"]
Jan 28 21:03:56 crc kubenswrapper[4746]: I0128 21:03:56.915815 4746 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/cloudkitty-storageinit-lrs2d"]
Jan 28 21:03:56 crc kubenswrapper[4746]: I0128 21:03:56.925348 4746 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/cloudkitty-storageinit-lrs2d"]
Jan 28 21:03:56 crc kubenswrapper[4746]: I0128 21:03:56.938023 4746 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-dbb88bf8c-c8pfb"]
Jan 28 21:03:56 crc kubenswrapper[4746]: I0128 21:03:56.950242 4746 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/rabbitmq-server-0"]
Jan 28 21:03:56 crc kubenswrapper[4746]: I0128 21:03:56.968296 4746 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/rabbitmq-cell1-server-0"]
Jan 28 21:03:56 crc kubenswrapper[4746]: I0128 21:03:56.970400 4746 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/rabbitmq-cell1-server-0"
Jan 28 21:03:56 crc kubenswrapper[4746]: I0128 21:03:56.974278 4746 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-rabbitmq-cell1-svc"
Jan 28 21:03:56 crc kubenswrapper[4746]: I0128 21:03:56.974498 4746 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"rabbitmq-cell1-plugins-conf"
Jan 28 21:03:56 crc kubenswrapper[4746]: I0128 21:03:56.974644 4746 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"rabbitmq-cell1-config-data"
Jan 28 21:03:56 crc kubenswrapper[4746]: I0128 21:03:56.974796 4746 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"rabbitmq-cell1-server-dockercfg-ktgk4"
Jan 28 21:03:56 crc kubenswrapper[4746]: I0128 21:03:56.974919 4746 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"rabbitmq-cell1-erlang-cookie"
Jan 28 21:03:56 crc kubenswrapper[4746]: I0128 21:03:56.975030 4746 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"rabbitmq-cell1-default-user"
Jan 28 21:03:56 crc kubenswrapper[4746]: I0128 21:03:56.975111 4746 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"rabbitmq-cell1-server-conf"
Jan 28 21:03:56 crc kubenswrapper[4746]: I0128 21:03:56.979484 4746 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/rabbitmq-cell1-server-0"]
Jan 28 21:03:57 crc kubenswrapper[4746]: I0128 21:03:57.031979 4746 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/368e0053-e824-487b-995b-0805d6bbc718-dns-swift-storage-0\") pod \"dnsmasq-dns-dbb88bf8c-c8pfb\" (UID: \"368e0053-e824-487b-995b-0805d6bbc718\") " pod="openstack/dnsmasq-dns-dbb88bf8c-c8pfb"
Jan 28 21:03:57 crc kubenswrapper[4746]: I0128 21:03:57.032138 4746 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/f330def9-769c-4adf-9df3-c1a7c54cd502-plugins-conf\") pod \"rabbitmq-server-0\" (UID: \"f330def9-769c-4adf-9df3-c1a7c54cd502\") " pod="openstack/rabbitmq-server-0"
Jan 28 21:03:57 crc kubenswrapper[4746]: I0128 21:03:57.032230 4746 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rabbitmq-tls\" (UniqueName: \"kubernetes.io/projected/f330def9-769c-4adf-9df3-c1a7c54cd502-rabbitmq-tls\") pod \"rabbitmq-server-0\" (UID: \"f330def9-769c-4adf-9df3-c1a7c54cd502\") " pod="openstack/rabbitmq-server-0"
Jan 28 21:03:57 crc kubenswrapper[4746]: I0128 21:03:57.032270 4746 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/configmap/368e0053-e824-487b-995b-0805d6bbc718-openstack-edpm-ipam\") pod \"dnsmasq-dns-dbb88bf8c-c8pfb\" (UID: \"368e0053-e824-487b-995b-0805d6bbc718\") " pod="openstack/dnsmasq-dns-dbb88bf8c-c8pfb"
Jan 28 21:03:57 crc kubenswrapper[4746]: I0128 21:03:57.032290 4746 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/f330def9-769c-4adf-9df3-c1a7c54cd502-rabbitmq-confd\") pod \"rabbitmq-server-0\" (UID: \"f330def9-769c-4adf-9df3-c1a7c54cd502\") " pod="openstack/rabbitmq-server-0"
Jan 28 21:03:57 crc kubenswrapper[4746]: I0128 21:03:57.032492 4746 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/f330def9-769c-4adf-9df3-c1a7c54cd502-config-data\") pod \"rabbitmq-server-0\" (UID: \"f330def9-769c-4adf-9df3-c1a7c54cd502\") " pod="openstack/rabbitmq-server-0"
Jan 28 21:03:57 crc kubenswrapper[4746]: I0128 21:03:57.032550 4746 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/368e0053-e824-487b-995b-0805d6bbc718-dns-svc\") pod \"dnsmasq-dns-dbb88bf8c-c8pfb\" (UID: \"368e0053-e824-487b-995b-0805d6bbc718\") " pod="openstack/dnsmasq-dns-dbb88bf8c-c8pfb"
Jan 28 21:03:57 crc kubenswrapper[4746]: I0128 21:03:57.032588 4746 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/f330def9-769c-4adf-9df3-c1a7c54cd502-erlang-cookie-secret\") pod \"rabbitmq-server-0\" (UID: \"f330def9-769c-4adf-9df3-c1a7c54cd502\") " pod="openstack/rabbitmq-server-0"
Jan 28 21:03:57 crc kubenswrapper[4746]: I0128 21:03:57.032818 4746 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/368e0053-e824-487b-995b-0805d6bbc718-ovsdbserver-sb\") pod \"dnsmasq-dns-dbb88bf8c-c8pfb\" (UID: \"368e0053-e824-487b-995b-0805d6bbc718\") " pod="openstack/dnsmasq-dns-dbb88bf8c-c8pfb"
Jan 28 21:03:57 crc kubenswrapper[4746]: I0128 21:03:57.032856 4746 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/368e0053-e824-487b-995b-0805d6bbc718-config\") pod \"dnsmasq-dns-dbb88bf8c-c8pfb\" (UID: \"368e0053-e824-487b-995b-0805d6bbc718\") " pod="openstack/dnsmasq-dns-dbb88bf8c-c8pfb"
Jan 28 21:03:57 crc kubenswrapper[4746]: I0128 21:03:57.032895 4746 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/f330def9-769c-4adf-9df3-c1a7c54cd502-rabbitmq-plugins\") pod \"rabbitmq-server-0\" (UID: \"f330def9-769c-4adf-9df3-c1a7c54cd502\") " pod="openstack/rabbitmq-server-0"
Jan 28 21:03:57 crc kubenswrapper[4746]: I0128 21:03:57.033004 4746 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-l8qjq\" (UniqueName: \"kubernetes.io/projected/f330def9-769c-4adf-9df3-c1a7c54cd502-kube-api-access-l8qjq\") pod \"rabbitmq-server-0\" (UID: \"f330def9-769c-4adf-9df3-c1a7c54cd502\") " pod="openstack/rabbitmq-server-0"
Jan 28 21:03:57 crc kubenswrapper[4746]: I0128 21:03:57.033047 4746 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/368e0053-e824-487b-995b-0805d6bbc718-ovsdbserver-nb\") pod \"dnsmasq-dns-dbb88bf8c-c8pfb\" (UID: \"368e0053-e824-487b-995b-0805d6bbc718\") " pod="openstack/dnsmasq-dns-dbb88bf8c-c8pfb"
Jan 28 21:03:57 crc kubenswrapper[4746]: I0128 21:03:57.033096 4746 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"server-conf\" (UniqueName: \"kubernetes.io/configmap/f330def9-769c-4adf-9df3-c1a7c54cd502-server-conf\") pod \"rabbitmq-server-0\" (UID: \"f330def9-769c-4adf-9df3-c1a7c54cd502\") " pod="openstack/rabbitmq-server-0"
Jan 28 21:03:57 crc kubenswrapper[4746]: I0128 21:03:57.033136 4746 reconciler_common.go:245]
"operationExecutor.VerifyControllerAttachedVolume started for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/f330def9-769c-4adf-9df3-c1a7c54cd502-pod-info\") pod \"rabbitmq-server-0\" (UID: \"f330def9-769c-4adf-9df3-c1a7c54cd502\") " pod="openstack/rabbitmq-server-0" Jan 28 21:03:57 crc kubenswrapper[4746]: I0128 21:03:57.033184 4746 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/f330def9-769c-4adf-9df3-c1a7c54cd502-rabbitmq-erlang-cookie\") pod \"rabbitmq-server-0\" (UID: \"f330def9-769c-4adf-9df3-c1a7c54cd502\") " pod="openstack/rabbitmq-server-0" Jan 28 21:03:57 crc kubenswrapper[4746]: I0128 21:03:57.033207 4746 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"pvc-970e444a-5d69-45cf-8cae-8be3d06f1dae\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-970e444a-5d69-45cf-8cae-8be3d06f1dae\") pod \"rabbitmq-server-0\" (UID: \"f330def9-769c-4adf-9df3-c1a7c54cd502\") " pod="openstack/rabbitmq-server-0" Jan 28 21:03:57 crc kubenswrapper[4746]: I0128 21:03:57.033232 4746 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-xv9bq\" (UniqueName: \"kubernetes.io/projected/368e0053-e824-487b-995b-0805d6bbc718-kube-api-access-xv9bq\") pod \"dnsmasq-dns-dbb88bf8c-c8pfb\" (UID: \"368e0053-e824-487b-995b-0805d6bbc718\") " pod="openstack/dnsmasq-dns-dbb88bf8c-c8pfb" Jan 28 21:03:57 crc kubenswrapper[4746]: I0128 21:03:57.040068 4746 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/cloudkitty-storageinit-9czms"] Jan 28 21:03:57 crc kubenswrapper[4746]: I0128 21:03:57.041394 4746 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/cloudkitty-storageinit-9czms" Jan 28 21:03:57 crc kubenswrapper[4746]: I0128 21:03:57.048138 4746 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"osp-secret" Jan 28 21:03:57 crc kubenswrapper[4746]: I0128 21:03:57.057702 4746 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cloudkitty-storageinit-9czms"] Jan 28 21:03:57 crc kubenswrapper[4746]: I0128 21:03:57.135367 4746 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/368e0053-e824-487b-995b-0805d6bbc718-dns-swift-storage-0\") pod \"dnsmasq-dns-dbb88bf8c-c8pfb\" (UID: \"368e0053-e824-487b-995b-0805d6bbc718\") " pod="openstack/dnsmasq-dns-dbb88bf8c-c8pfb" Jan 28 21:03:57 crc kubenswrapper[4746]: I0128 21:03:57.135414 4746 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"server-conf\" (UniqueName: \"kubernetes.io/configmap/31ed4da0-c996-4afb-aa3d-d61a7c13ccfb-server-conf\") pod \"rabbitmq-cell1-server-0\" (UID: \"31ed4da0-c996-4afb-aa3d-d61a7c13ccfb\") " pod="openstack/rabbitmq-cell1-server-0" Jan 28 21:03:57 crc kubenswrapper[4746]: I0128 21:03:57.135437 4746 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/e31300a2-ace0-435c-8eb9-383dc7a6120b-config-data\") pod \"cloudkitty-storageinit-9czms\" (UID: \"e31300a2-ace0-435c-8eb9-383dc7a6120b\") " pod="openstack/cloudkitty-storageinit-9czms" Jan 28 21:03:57 crc kubenswrapper[4746]: I0128 21:03:57.135455 4746 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/f330def9-769c-4adf-9df3-c1a7c54cd502-plugins-conf\") pod \"rabbitmq-server-0\" (UID: \"f330def9-769c-4adf-9df3-c1a7c54cd502\") " pod="openstack/rabbitmq-server-0" Jan 28 21:03:57 crc kubenswrapper[4746]: I0128 
21:03:57.135476 4746 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/31ed4da0-c996-4afb-aa3d-d61a7c13ccfb-rabbitmq-erlang-cookie\") pod \"rabbitmq-cell1-server-0\" (UID: \"31ed4da0-c996-4afb-aa3d-d61a7c13ccfb\") " pod="openstack/rabbitmq-cell1-server-0" Jan 28 21:03:57 crc kubenswrapper[4746]: I0128 21:03:57.135492 4746 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-fmf28\" (UniqueName: \"kubernetes.io/projected/31ed4da0-c996-4afb-aa3d-d61a7c13ccfb-kube-api-access-fmf28\") pod \"rabbitmq-cell1-server-0\" (UID: \"31ed4da0-c996-4afb-aa3d-d61a7c13ccfb\") " pod="openstack/rabbitmq-cell1-server-0" Jan 28 21:03:57 crc kubenswrapper[4746]: I0128 21:03:57.135518 4746 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rabbitmq-tls\" (UniqueName: \"kubernetes.io/projected/f330def9-769c-4adf-9df3-c1a7c54cd502-rabbitmq-tls\") pod \"rabbitmq-server-0\" (UID: \"f330def9-769c-4adf-9df3-c1a7c54cd502\") " pod="openstack/rabbitmq-server-0" Jan 28 21:03:57 crc kubenswrapper[4746]: I0128 21:03:57.135535 4746 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/configmap/368e0053-e824-487b-995b-0805d6bbc718-openstack-edpm-ipam\") pod \"dnsmasq-dns-dbb88bf8c-c8pfb\" (UID: \"368e0053-e824-487b-995b-0805d6bbc718\") " pod="openstack/dnsmasq-dns-dbb88bf8c-c8pfb" Jan 28 21:03:57 crc kubenswrapper[4746]: I0128 21:03:57.135550 4746 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/f330def9-769c-4adf-9df3-c1a7c54cd502-rabbitmq-confd\") pod \"rabbitmq-server-0\" (UID: \"f330def9-769c-4adf-9df3-c1a7c54cd502\") " pod="openstack/rabbitmq-server-0" Jan 28 21:03:57 crc kubenswrapper[4746]: I0128 21:03:57.135593 4746 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"certs\" (UniqueName: \"kubernetes.io/projected/e31300a2-ace0-435c-8eb9-383dc7a6120b-certs\") pod \"cloudkitty-storageinit-9czms\" (UID: \"e31300a2-ace0-435c-8eb9-383dc7a6120b\") " pod="openstack/cloudkitty-storageinit-9czms" Jan 28 21:03:57 crc kubenswrapper[4746]: I0128 21:03:57.135610 4746 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/31ed4da0-c996-4afb-aa3d-d61a7c13ccfb-config-data\") pod \"rabbitmq-cell1-server-0\" (UID: \"31ed4da0-c996-4afb-aa3d-d61a7c13ccfb\") " pod="openstack/rabbitmq-cell1-server-0" Jan 28 21:03:57 crc kubenswrapper[4746]: I0128 21:03:57.135637 4746 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/f330def9-769c-4adf-9df3-c1a7c54cd502-config-data\") pod \"rabbitmq-server-0\" (UID: \"f330def9-769c-4adf-9df3-c1a7c54cd502\") " pod="openstack/rabbitmq-server-0" Jan 28 21:03:57 crc kubenswrapper[4746]: I0128 21:03:57.135660 4746 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rabbitmq-tls\" (UniqueName: \"kubernetes.io/projected/31ed4da0-c996-4afb-aa3d-d61a7c13ccfb-rabbitmq-tls\") pod \"rabbitmq-cell1-server-0\" (UID: \"31ed4da0-c996-4afb-aa3d-d61a7c13ccfb\") " pod="openstack/rabbitmq-cell1-server-0" Jan 28 21:03:57 crc kubenswrapper[4746]: I0128 21:03:57.135681 4746 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/368e0053-e824-487b-995b-0805d6bbc718-dns-svc\") pod \"dnsmasq-dns-dbb88bf8c-c8pfb\" (UID: \"368e0053-e824-487b-995b-0805d6bbc718\") " pod="openstack/dnsmasq-dns-dbb88bf8c-c8pfb" Jan 28 21:03:57 crc kubenswrapper[4746]: I0128 21:03:57.135695 4746 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"erlang-cookie-secret\" (UniqueName: 
\"kubernetes.io/secret/f330def9-769c-4adf-9df3-c1a7c54cd502-erlang-cookie-secret\") pod \"rabbitmq-server-0\" (UID: \"f330def9-769c-4adf-9df3-c1a7c54cd502\") " pod="openstack/rabbitmq-server-0" Jan 28 21:03:57 crc kubenswrapper[4746]: I0128 21:03:57.135724 4746 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e31300a2-ace0-435c-8eb9-383dc7a6120b-combined-ca-bundle\") pod \"cloudkitty-storageinit-9czms\" (UID: \"e31300a2-ace0-435c-8eb9-383dc7a6120b\") " pod="openstack/cloudkitty-storageinit-9czms" Jan 28 21:03:57 crc kubenswrapper[4746]: I0128 21:03:57.135762 4746 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/e31300a2-ace0-435c-8eb9-383dc7a6120b-scripts\") pod \"cloudkitty-storageinit-9czms\" (UID: \"e31300a2-ace0-435c-8eb9-383dc7a6120b\") " pod="openstack/cloudkitty-storageinit-9czms" Jan 28 21:03:57 crc kubenswrapper[4746]: I0128 21:03:57.135779 4746 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/31ed4da0-c996-4afb-aa3d-d61a7c13ccfb-rabbitmq-confd\") pod \"rabbitmq-cell1-server-0\" (UID: \"31ed4da0-c996-4afb-aa3d-d61a7c13ccfb\") " pod="openstack/rabbitmq-cell1-server-0" Jan 28 21:03:57 crc kubenswrapper[4746]: I0128 21:03:57.135794 4746 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/368e0053-e824-487b-995b-0805d6bbc718-ovsdbserver-sb\") pod \"dnsmasq-dns-dbb88bf8c-c8pfb\" (UID: \"368e0053-e824-487b-995b-0805d6bbc718\") " pod="openstack/dnsmasq-dns-dbb88bf8c-c8pfb" Jan 28 21:03:57 crc kubenswrapper[4746]: I0128 21:03:57.135812 4746 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: 
\"kubernetes.io/configmap/368e0053-e824-487b-995b-0805d6bbc718-config\") pod \"dnsmasq-dns-dbb88bf8c-c8pfb\" (UID: \"368e0053-e824-487b-995b-0805d6bbc718\") " pod="openstack/dnsmasq-dns-dbb88bf8c-c8pfb" Jan 28 21:03:57 crc kubenswrapper[4746]: I0128 21:03:57.135830 4746 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/f330def9-769c-4adf-9df3-c1a7c54cd502-rabbitmq-plugins\") pod \"rabbitmq-server-0\" (UID: \"f330def9-769c-4adf-9df3-c1a7c54cd502\") " pod="openstack/rabbitmq-server-0" Jan 28 21:03:57 crc kubenswrapper[4746]: I0128 21:03:57.135848 4746 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/31ed4da0-c996-4afb-aa3d-d61a7c13ccfb-rabbitmq-plugins\") pod \"rabbitmq-cell1-server-0\" (UID: \"31ed4da0-c996-4afb-aa3d-d61a7c13ccfb\") " pod="openstack/rabbitmq-cell1-server-0" Jan 28 21:03:57 crc kubenswrapper[4746]: I0128 21:03:57.135874 4746 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-6m8nf\" (UniqueName: \"kubernetes.io/projected/e31300a2-ace0-435c-8eb9-383dc7a6120b-kube-api-access-6m8nf\") pod \"cloudkitty-storageinit-9czms\" (UID: \"e31300a2-ace0-435c-8eb9-383dc7a6120b\") " pod="openstack/cloudkitty-storageinit-9czms" Jan 28 21:03:57 crc kubenswrapper[4746]: I0128 21:03:57.135895 4746 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"pvc-4ec30e86-c9d3-4c1e-a1f0-0bea12b5bd54\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-4ec30e86-c9d3-4c1e-a1f0-0bea12b5bd54\") pod \"rabbitmq-cell1-server-0\" (UID: \"31ed4da0-c996-4afb-aa3d-d61a7c13ccfb\") " pod="openstack/rabbitmq-cell1-server-0" Jan 28 21:03:57 crc kubenswrapper[4746]: I0128 21:03:57.135912 4746 reconciler_common.go:218] "operationExecutor.MountVolume started for volume 
\"kube-api-access-l8qjq\" (UniqueName: \"kubernetes.io/projected/f330def9-769c-4adf-9df3-c1a7c54cd502-kube-api-access-l8qjq\") pod \"rabbitmq-server-0\" (UID: \"f330def9-769c-4adf-9df3-c1a7c54cd502\") " pod="openstack/rabbitmq-server-0" Jan 28 21:03:57 crc kubenswrapper[4746]: I0128 21:03:57.135930 4746 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/368e0053-e824-487b-995b-0805d6bbc718-ovsdbserver-nb\") pod \"dnsmasq-dns-dbb88bf8c-c8pfb\" (UID: \"368e0053-e824-487b-995b-0805d6bbc718\") " pod="openstack/dnsmasq-dns-dbb88bf8c-c8pfb" Jan 28 21:03:57 crc kubenswrapper[4746]: I0128 21:03:57.135947 4746 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"server-conf\" (UniqueName: \"kubernetes.io/configmap/f330def9-769c-4adf-9df3-c1a7c54cd502-server-conf\") pod \"rabbitmq-server-0\" (UID: \"f330def9-769c-4adf-9df3-c1a7c54cd502\") " pod="openstack/rabbitmq-server-0" Jan 28 21:03:57 crc kubenswrapper[4746]: I0128 21:03:57.135966 4746 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/31ed4da0-c996-4afb-aa3d-d61a7c13ccfb-plugins-conf\") pod \"rabbitmq-cell1-server-0\" (UID: \"31ed4da0-c996-4afb-aa3d-d61a7c13ccfb\") " pod="openstack/rabbitmq-cell1-server-0" Jan 28 21:03:57 crc kubenswrapper[4746]: I0128 21:03:57.135984 4746 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/f330def9-769c-4adf-9df3-c1a7c54cd502-pod-info\") pod \"rabbitmq-server-0\" (UID: \"f330def9-769c-4adf-9df3-c1a7c54cd502\") " pod="openstack/rabbitmq-server-0" Jan 28 21:03:57 crc kubenswrapper[4746]: I0128 21:03:57.136000 4746 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"erlang-cookie-secret\" (UniqueName: 
\"kubernetes.io/secret/31ed4da0-c996-4afb-aa3d-d61a7c13ccfb-erlang-cookie-secret\") pod \"rabbitmq-cell1-server-0\" (UID: \"31ed4da0-c996-4afb-aa3d-d61a7c13ccfb\") " pod="openstack/rabbitmq-cell1-server-0" Jan 28 21:03:57 crc kubenswrapper[4746]: I0128 21:03:57.136017 4746 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/f330def9-769c-4adf-9df3-c1a7c54cd502-rabbitmq-erlang-cookie\") pod \"rabbitmq-server-0\" (UID: \"f330def9-769c-4adf-9df3-c1a7c54cd502\") " pod="openstack/rabbitmq-server-0" Jan 28 21:03:57 crc kubenswrapper[4746]: I0128 21:03:57.136031 4746 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/31ed4da0-c996-4afb-aa3d-d61a7c13ccfb-pod-info\") pod \"rabbitmq-cell1-server-0\" (UID: \"31ed4da0-c996-4afb-aa3d-d61a7c13ccfb\") " pod="openstack/rabbitmq-cell1-server-0" Jan 28 21:03:57 crc kubenswrapper[4746]: I0128 21:03:57.136051 4746 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-970e444a-5d69-45cf-8cae-8be3d06f1dae\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-970e444a-5d69-45cf-8cae-8be3d06f1dae\") pod \"rabbitmq-server-0\" (UID: \"f330def9-769c-4adf-9df3-c1a7c54cd502\") " pod="openstack/rabbitmq-server-0" Jan 28 21:03:57 crc kubenswrapper[4746]: I0128 21:03:57.136074 4746 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-xv9bq\" (UniqueName: \"kubernetes.io/projected/368e0053-e824-487b-995b-0805d6bbc718-kube-api-access-xv9bq\") pod \"dnsmasq-dns-dbb88bf8c-c8pfb\" (UID: \"368e0053-e824-487b-995b-0805d6bbc718\") " pod="openstack/dnsmasq-dns-dbb88bf8c-c8pfb" Jan 28 21:03:57 crc kubenswrapper[4746]: I0128 21:03:57.137382 4746 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-swift-storage-0\" (UniqueName: 
\"kubernetes.io/configmap/368e0053-e824-487b-995b-0805d6bbc718-dns-swift-storage-0\") pod \"dnsmasq-dns-dbb88bf8c-c8pfb\" (UID: \"368e0053-e824-487b-995b-0805d6bbc718\") " pod="openstack/dnsmasq-dns-dbb88bf8c-c8pfb" Jan 28 21:03:57 crc kubenswrapper[4746]: I0128 21:03:57.137990 4746 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/f330def9-769c-4adf-9df3-c1a7c54cd502-plugins-conf\") pod \"rabbitmq-server-0\" (UID: \"f330def9-769c-4adf-9df3-c1a7c54cd502\") " pod="openstack/rabbitmq-server-0" Jan 28 21:03:57 crc kubenswrapper[4746]: I0128 21:03:57.139144 4746 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/f330def9-769c-4adf-9df3-c1a7c54cd502-rabbitmq-plugins\") pod \"rabbitmq-server-0\" (UID: \"f330def9-769c-4adf-9df3-c1a7c54cd502\") " pod="openstack/rabbitmq-server-0" Jan 28 21:03:57 crc kubenswrapper[4746]: I0128 21:03:57.139532 4746 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/368e0053-e824-487b-995b-0805d6bbc718-config\") pod \"dnsmasq-dns-dbb88bf8c-c8pfb\" (UID: \"368e0053-e824-487b-995b-0805d6bbc718\") " pod="openstack/dnsmasq-dns-dbb88bf8c-c8pfb" Jan 28 21:03:57 crc kubenswrapper[4746]: I0128 21:03:57.139658 4746 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"server-conf\" (UniqueName: \"kubernetes.io/configmap/f330def9-769c-4adf-9df3-c1a7c54cd502-server-conf\") pod \"rabbitmq-server-0\" (UID: \"f330def9-769c-4adf-9df3-c1a7c54cd502\") " pod="openstack/rabbitmq-server-0" Jan 28 21:03:57 crc kubenswrapper[4746]: I0128 21:03:57.140208 4746 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/configmap/368e0053-e824-487b-995b-0805d6bbc718-openstack-edpm-ipam\") pod \"dnsmasq-dns-dbb88bf8c-c8pfb\" (UID: \"368e0053-e824-487b-995b-0805d6bbc718\") " 
pod="openstack/dnsmasq-dns-dbb88bf8c-c8pfb" Jan 28 21:03:57 crc kubenswrapper[4746]: I0128 21:03:57.140560 4746 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/368e0053-e824-487b-995b-0805d6bbc718-ovsdbserver-nb\") pod \"dnsmasq-dns-dbb88bf8c-c8pfb\" (UID: \"368e0053-e824-487b-995b-0805d6bbc718\") " pod="openstack/dnsmasq-dns-dbb88bf8c-c8pfb" Jan 28 21:03:57 crc kubenswrapper[4746]: I0128 21:03:57.140915 4746 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/f330def9-769c-4adf-9df3-c1a7c54cd502-rabbitmq-erlang-cookie\") pod \"rabbitmq-server-0\" (UID: \"f330def9-769c-4adf-9df3-c1a7c54cd502\") " pod="openstack/rabbitmq-server-0" Jan 28 21:03:57 crc kubenswrapper[4746]: I0128 21:03:57.141284 4746 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/368e0053-e824-487b-995b-0805d6bbc718-dns-svc\") pod \"dnsmasq-dns-dbb88bf8c-c8pfb\" (UID: \"368e0053-e824-487b-995b-0805d6bbc718\") " pod="openstack/dnsmasq-dns-dbb88bf8c-c8pfb" Jan 28 21:03:57 crc kubenswrapper[4746]: I0128 21:03:57.141910 4746 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/f330def9-769c-4adf-9df3-c1a7c54cd502-config-data\") pod \"rabbitmq-server-0\" (UID: \"f330def9-769c-4adf-9df3-c1a7c54cd502\") " pod="openstack/rabbitmq-server-0" Jan 28 21:03:57 crc kubenswrapper[4746]: I0128 21:03:57.142094 4746 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/368e0053-e824-487b-995b-0805d6bbc718-ovsdbserver-sb\") pod \"dnsmasq-dns-dbb88bf8c-c8pfb\" (UID: \"368e0053-e824-487b-995b-0805d6bbc718\") " pod="openstack/dnsmasq-dns-dbb88bf8c-c8pfb" Jan 28 21:03:57 crc kubenswrapper[4746]: I0128 21:03:57.143412 4746 operation_generator.go:637] 
"MountVolume.SetUp succeeded for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/f330def9-769c-4adf-9df3-c1a7c54cd502-pod-info\") pod \"rabbitmq-server-0\" (UID: \"f330def9-769c-4adf-9df3-c1a7c54cd502\") " pod="openstack/rabbitmq-server-0" Jan 28 21:03:57 crc kubenswrapper[4746]: I0128 21:03:57.143683 4746 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rabbitmq-tls\" (UniqueName: \"kubernetes.io/projected/f330def9-769c-4adf-9df3-c1a7c54cd502-rabbitmq-tls\") pod \"rabbitmq-server-0\" (UID: \"f330def9-769c-4adf-9df3-c1a7c54cd502\") " pod="openstack/rabbitmq-server-0" Jan 28 21:03:57 crc kubenswrapper[4746]: I0128 21:03:57.144334 4746 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/f330def9-769c-4adf-9df3-c1a7c54cd502-rabbitmq-confd\") pod \"rabbitmq-server-0\" (UID: \"f330def9-769c-4adf-9df3-c1a7c54cd502\") " pod="openstack/rabbitmq-server-0" Jan 28 21:03:57 crc kubenswrapper[4746]: I0128 21:03:57.144360 4746 csi_attacher.go:380] kubernetes.io/csi: attacher.MountDevice STAGE_UNSTAGE_VOLUME capability not set. Skipping MountDevice... 
Jan 28 21:03:57 crc kubenswrapper[4746]: I0128 21:03:57.144391 4746 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"pvc-970e444a-5d69-45cf-8cae-8be3d06f1dae\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-970e444a-5d69-45cf-8cae-8be3d06f1dae\") pod \"rabbitmq-server-0\" (UID: \"f330def9-769c-4adf-9df3-c1a7c54cd502\") device mount path \"/var/lib/kubelet/plugins/kubernetes.io/csi/kubevirt.io.hostpath-provisioner/6f3ec12aa3a4e2e24baea6243845f770d59c6a449418d7f6cc0746fe9e1bbf7f/globalmount\"" pod="openstack/rabbitmq-server-0" Jan 28 21:03:57 crc kubenswrapper[4746]: I0128 21:03:57.145770 4746 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/f330def9-769c-4adf-9df3-c1a7c54cd502-erlang-cookie-secret\") pod \"rabbitmq-server-0\" (UID: \"f330def9-769c-4adf-9df3-c1a7c54cd502\") " pod="openstack/rabbitmq-server-0" Jan 28 21:03:57 crc kubenswrapper[4746]: I0128 21:03:57.159900 4746 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-l8qjq\" (UniqueName: \"kubernetes.io/projected/f330def9-769c-4adf-9df3-c1a7c54cd502-kube-api-access-l8qjq\") pod \"rabbitmq-server-0\" (UID: \"f330def9-769c-4adf-9df3-c1a7c54cd502\") " pod="openstack/rabbitmq-server-0" Jan 28 21:03:57 crc kubenswrapper[4746]: I0128 21:03:57.160187 4746 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-xv9bq\" (UniqueName: \"kubernetes.io/projected/368e0053-e824-487b-995b-0805d6bbc718-kube-api-access-xv9bq\") pod \"dnsmasq-dns-dbb88bf8c-c8pfb\" (UID: \"368e0053-e824-487b-995b-0805d6bbc718\") " pod="openstack/dnsmasq-dns-dbb88bf8c-c8pfb" Jan 28 21:03:57 crc kubenswrapper[4746]: I0128 21:03:57.223306 4746 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"pvc-970e444a-5d69-45cf-8cae-8be3d06f1dae\" (UniqueName: 
\"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-970e444a-5d69-45cf-8cae-8be3d06f1dae\") pod \"rabbitmq-server-0\" (UID: \"f330def9-769c-4adf-9df3-c1a7c54cd502\") " pod="openstack/rabbitmq-server-0" Jan 28 21:03:57 crc kubenswrapper[4746]: I0128 21:03:57.238064 4746 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/31ed4da0-c996-4afb-aa3d-d61a7c13ccfb-pod-info\") pod \"rabbitmq-cell1-server-0\" (UID: \"31ed4da0-c996-4afb-aa3d-d61a7c13ccfb\") " pod="openstack/rabbitmq-cell1-server-0" Jan 28 21:03:57 crc kubenswrapper[4746]: I0128 21:03:57.238175 4746 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"server-conf\" (UniqueName: \"kubernetes.io/configmap/31ed4da0-c996-4afb-aa3d-d61a7c13ccfb-server-conf\") pod \"rabbitmq-cell1-server-0\" (UID: \"31ed4da0-c996-4afb-aa3d-d61a7c13ccfb\") " pod="openstack/rabbitmq-cell1-server-0" Jan 28 21:03:57 crc kubenswrapper[4746]: I0128 21:03:57.238202 4746 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/e31300a2-ace0-435c-8eb9-383dc7a6120b-config-data\") pod \"cloudkitty-storageinit-9czms\" (UID: \"e31300a2-ace0-435c-8eb9-383dc7a6120b\") " pod="openstack/cloudkitty-storageinit-9czms" Jan 28 21:03:57 crc kubenswrapper[4746]: I0128 21:03:57.238226 4746 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/31ed4da0-c996-4afb-aa3d-d61a7c13ccfb-rabbitmq-erlang-cookie\") pod \"rabbitmq-cell1-server-0\" (UID: \"31ed4da0-c996-4afb-aa3d-d61a7c13ccfb\") " pod="openstack/rabbitmq-cell1-server-0" Jan 28 21:03:57 crc kubenswrapper[4746]: I0128 21:03:57.238257 4746 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-fmf28\" (UniqueName: \"kubernetes.io/projected/31ed4da0-c996-4afb-aa3d-d61a7c13ccfb-kube-api-access-fmf28\") pod 
\"rabbitmq-cell1-server-0\" (UID: \"31ed4da0-c996-4afb-aa3d-d61a7c13ccfb\") " pod="openstack/rabbitmq-cell1-server-0" Jan 28 21:03:57 crc kubenswrapper[4746]: I0128 21:03:57.238306 4746 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"certs\" (UniqueName: \"kubernetes.io/projected/e31300a2-ace0-435c-8eb9-383dc7a6120b-certs\") pod \"cloudkitty-storageinit-9czms\" (UID: \"e31300a2-ace0-435c-8eb9-383dc7a6120b\") " pod="openstack/cloudkitty-storageinit-9czms" Jan 28 21:03:57 crc kubenswrapper[4746]: I0128 21:03:57.238334 4746 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/31ed4da0-c996-4afb-aa3d-d61a7c13ccfb-config-data\") pod \"rabbitmq-cell1-server-0\" (UID: \"31ed4da0-c996-4afb-aa3d-d61a7c13ccfb\") " pod="openstack/rabbitmq-cell1-server-0" Jan 28 21:03:57 crc kubenswrapper[4746]: I0128 21:03:57.238368 4746 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rabbitmq-tls\" (UniqueName: \"kubernetes.io/projected/31ed4da0-c996-4afb-aa3d-d61a7c13ccfb-rabbitmq-tls\") pod \"rabbitmq-cell1-server-0\" (UID: \"31ed4da0-c996-4afb-aa3d-d61a7c13ccfb\") " pod="openstack/rabbitmq-cell1-server-0" Jan 28 21:03:57 crc kubenswrapper[4746]: I0128 21:03:57.238409 4746 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e31300a2-ace0-435c-8eb9-383dc7a6120b-combined-ca-bundle\") pod \"cloudkitty-storageinit-9czms\" (UID: \"e31300a2-ace0-435c-8eb9-383dc7a6120b\") " pod="openstack/cloudkitty-storageinit-9czms" Jan 28 21:03:57 crc kubenswrapper[4746]: I0128 21:03:57.238447 4746 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/e31300a2-ace0-435c-8eb9-383dc7a6120b-scripts\") pod \"cloudkitty-storageinit-9czms\" (UID: \"e31300a2-ace0-435c-8eb9-383dc7a6120b\") " 
pod="openstack/cloudkitty-storageinit-9czms" Jan 28 21:03:57 crc kubenswrapper[4746]: I0128 21:03:57.238464 4746 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/31ed4da0-c996-4afb-aa3d-d61a7c13ccfb-rabbitmq-confd\") pod \"rabbitmq-cell1-server-0\" (UID: \"31ed4da0-c996-4afb-aa3d-d61a7c13ccfb\") " pod="openstack/rabbitmq-cell1-server-0" Jan 28 21:03:57 crc kubenswrapper[4746]: I0128 21:03:57.238489 4746 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/31ed4da0-c996-4afb-aa3d-d61a7c13ccfb-rabbitmq-plugins\") pod \"rabbitmq-cell1-server-0\" (UID: \"31ed4da0-c996-4afb-aa3d-d61a7c13ccfb\") " pod="openstack/rabbitmq-cell1-server-0" Jan 28 21:03:57 crc kubenswrapper[4746]: I0128 21:03:57.238514 4746 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-6m8nf\" (UniqueName: \"kubernetes.io/projected/e31300a2-ace0-435c-8eb9-383dc7a6120b-kube-api-access-6m8nf\") pod \"cloudkitty-storageinit-9czms\" (UID: \"e31300a2-ace0-435c-8eb9-383dc7a6120b\") " pod="openstack/cloudkitty-storageinit-9czms" Jan 28 21:03:57 crc kubenswrapper[4746]: I0128 21:03:57.238536 4746 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-4ec30e86-c9d3-4c1e-a1f0-0bea12b5bd54\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-4ec30e86-c9d3-4c1e-a1f0-0bea12b5bd54\") pod \"rabbitmq-cell1-server-0\" (UID: \"31ed4da0-c996-4afb-aa3d-d61a7c13ccfb\") " pod="openstack/rabbitmq-cell1-server-0" Jan 28 21:03:57 crc kubenswrapper[4746]: I0128 21:03:57.238566 4746 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/31ed4da0-c996-4afb-aa3d-d61a7c13ccfb-plugins-conf\") pod \"rabbitmq-cell1-server-0\" (UID: \"31ed4da0-c996-4afb-aa3d-d61a7c13ccfb\") " 
pod="openstack/rabbitmq-cell1-server-0" Jan 28 21:03:57 crc kubenswrapper[4746]: I0128 21:03:57.238602 4746 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/31ed4da0-c996-4afb-aa3d-d61a7c13ccfb-erlang-cookie-secret\") pod \"rabbitmq-cell1-server-0\" (UID: \"31ed4da0-c996-4afb-aa3d-d61a7c13ccfb\") " pod="openstack/rabbitmq-cell1-server-0" Jan 28 21:03:57 crc kubenswrapper[4746]: I0128 21:03:57.238780 4746 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/31ed4da0-c996-4afb-aa3d-d61a7c13ccfb-rabbitmq-erlang-cookie\") pod \"rabbitmq-cell1-server-0\" (UID: \"31ed4da0-c996-4afb-aa3d-d61a7c13ccfb\") " pod="openstack/rabbitmq-cell1-server-0" Jan 28 21:03:57 crc kubenswrapper[4746]: I0128 21:03:57.239144 4746 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/31ed4da0-c996-4afb-aa3d-d61a7c13ccfb-rabbitmq-plugins\") pod \"rabbitmq-cell1-server-0\" (UID: \"31ed4da0-c996-4afb-aa3d-d61a7c13ccfb\") " pod="openstack/rabbitmq-cell1-server-0" Jan 28 21:03:57 crc kubenswrapper[4746]: I0128 21:03:57.239547 4746 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/31ed4da0-c996-4afb-aa3d-d61a7c13ccfb-config-data\") pod \"rabbitmq-cell1-server-0\" (UID: \"31ed4da0-c996-4afb-aa3d-d61a7c13ccfb\") " pod="openstack/rabbitmq-cell1-server-0" Jan 28 21:03:57 crc kubenswrapper[4746]: I0128 21:03:57.240010 4746 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/31ed4da0-c996-4afb-aa3d-d61a7c13ccfb-plugins-conf\") pod \"rabbitmq-cell1-server-0\" (UID: \"31ed4da0-c996-4afb-aa3d-d61a7c13ccfb\") " pod="openstack/rabbitmq-cell1-server-0" Jan 28 21:03:57 crc kubenswrapper[4746]: I0128 21:03:57.240062 4746 
operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"server-conf\" (UniqueName: \"kubernetes.io/configmap/31ed4da0-c996-4afb-aa3d-d61a7c13ccfb-server-conf\") pod \"rabbitmq-cell1-server-0\" (UID: \"31ed4da0-c996-4afb-aa3d-d61a7c13ccfb\") " pod="openstack/rabbitmq-cell1-server-0" Jan 28 21:03:57 crc kubenswrapper[4746]: I0128 21:03:57.242181 4746 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"certs\" (UniqueName: \"kubernetes.io/projected/e31300a2-ace0-435c-8eb9-383dc7a6120b-certs\") pod \"cloudkitty-storageinit-9czms\" (UID: \"e31300a2-ace0-435c-8eb9-383dc7a6120b\") " pod="openstack/cloudkitty-storageinit-9czms" Jan 28 21:03:57 crc kubenswrapper[4746]: I0128 21:03:57.244256 4746 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e31300a2-ace0-435c-8eb9-383dc7a6120b-combined-ca-bundle\") pod \"cloudkitty-storageinit-9czms\" (UID: \"e31300a2-ace0-435c-8eb9-383dc7a6120b\") " pod="openstack/cloudkitty-storageinit-9czms" Jan 28 21:03:57 crc kubenswrapper[4746]: I0128 21:03:57.244360 4746 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/e31300a2-ace0-435c-8eb9-383dc7a6120b-config-data\") pod \"cloudkitty-storageinit-9czms\" (UID: \"e31300a2-ace0-435c-8eb9-383dc7a6120b\") " pod="openstack/cloudkitty-storageinit-9czms" Jan 28 21:03:57 crc kubenswrapper[4746]: I0128 21:03:57.244986 4746 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rabbitmq-tls\" (UniqueName: \"kubernetes.io/projected/31ed4da0-c996-4afb-aa3d-d61a7c13ccfb-rabbitmq-tls\") pod \"rabbitmq-cell1-server-0\" (UID: \"31ed4da0-c996-4afb-aa3d-d61a7c13ccfb\") " pod="openstack/rabbitmq-cell1-server-0" Jan 28 21:03:57 crc kubenswrapper[4746]: I0128 21:03:57.245581 4746 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"pod-info\" (UniqueName: 
\"kubernetes.io/downward-api/31ed4da0-c996-4afb-aa3d-d61a7c13ccfb-pod-info\") pod \"rabbitmq-cell1-server-0\" (UID: \"31ed4da0-c996-4afb-aa3d-d61a7c13ccfb\") " pod="openstack/rabbitmq-cell1-server-0" Jan 28 21:03:57 crc kubenswrapper[4746]: I0128 21:03:57.247494 4746 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/31ed4da0-c996-4afb-aa3d-d61a7c13ccfb-rabbitmq-confd\") pod \"rabbitmq-cell1-server-0\" (UID: \"31ed4da0-c996-4afb-aa3d-d61a7c13ccfb\") " pod="openstack/rabbitmq-cell1-server-0" Jan 28 21:03:57 crc kubenswrapper[4746]: I0128 21:03:57.247527 4746 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/e31300a2-ace0-435c-8eb9-383dc7a6120b-scripts\") pod \"cloudkitty-storageinit-9czms\" (UID: \"e31300a2-ace0-435c-8eb9-383dc7a6120b\") " pod="openstack/cloudkitty-storageinit-9czms" Jan 28 21:03:57 crc kubenswrapper[4746]: I0128 21:03:57.247536 4746 csi_attacher.go:380] kubernetes.io/csi: attacher.MountDevice STAGE_UNSTAGE_VOLUME capability not set. Skipping MountDevice... 
Jan 28 21:03:57 crc kubenswrapper[4746]: I0128 21:03:57.247598 4746 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"pvc-4ec30e86-c9d3-4c1e-a1f0-0bea12b5bd54\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-4ec30e86-c9d3-4c1e-a1f0-0bea12b5bd54\") pod \"rabbitmq-cell1-server-0\" (UID: \"31ed4da0-c996-4afb-aa3d-d61a7c13ccfb\") device mount path \"/var/lib/kubelet/plugins/kubernetes.io/csi/kubevirt.io.hostpath-provisioner/cc06eb5d713361bb8bc23b99b651543751fb837cba7bff1aeb5a7aa259796cdb/globalmount\"" pod="openstack/rabbitmq-cell1-server-0" Jan 28 21:03:57 crc kubenswrapper[4746]: I0128 21:03:57.247714 4746 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/31ed4da0-c996-4afb-aa3d-d61a7c13ccfb-erlang-cookie-secret\") pod \"rabbitmq-cell1-server-0\" (UID: \"31ed4da0-c996-4afb-aa3d-d61a7c13ccfb\") " pod="openstack/rabbitmq-cell1-server-0" Jan 28 21:03:57 crc kubenswrapper[4746]: I0128 21:03:57.253350 4746 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-fmf28\" (UniqueName: \"kubernetes.io/projected/31ed4da0-c996-4afb-aa3d-d61a7c13ccfb-kube-api-access-fmf28\") pod \"rabbitmq-cell1-server-0\" (UID: \"31ed4da0-c996-4afb-aa3d-d61a7c13ccfb\") " pod="openstack/rabbitmq-cell1-server-0" Jan 28 21:03:57 crc kubenswrapper[4746]: I0128 21:03:57.255579 4746 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-6m8nf\" (UniqueName: \"kubernetes.io/projected/e31300a2-ace0-435c-8eb9-383dc7a6120b-kube-api-access-6m8nf\") pod \"cloudkitty-storageinit-9czms\" (UID: \"e31300a2-ace0-435c-8eb9-383dc7a6120b\") " pod="openstack/cloudkitty-storageinit-9czms" Jan 28 21:03:57 crc kubenswrapper[4746]: I0128 21:03:57.256136 4746 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/rabbitmq-server-0" Jan 28 21:03:57 crc kubenswrapper[4746]: I0128 21:03:57.268317 4746 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-dbb88bf8c-c8pfb" Jan 28 21:03:57 crc kubenswrapper[4746]: I0128 21:03:57.311613 4746 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"pvc-4ec30e86-c9d3-4c1e-a1f0-0bea12b5bd54\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-4ec30e86-c9d3-4c1e-a1f0-0bea12b5bd54\") pod \"rabbitmq-cell1-server-0\" (UID: \"31ed4da0-c996-4afb-aa3d-d61a7c13ccfb\") " pod="openstack/rabbitmq-cell1-server-0" Jan 28 21:03:57 crc kubenswrapper[4746]: I0128 21:03:57.368807 4746 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/cloudkitty-storageinit-9czms" Jan 28 21:03:57 crc kubenswrapper[4746]: I0128 21:03:57.591895 4746 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/rabbitmq-cell1-server-0" Jan 28 21:03:57 crc kubenswrapper[4746]: I0128 21:03:57.744472 4746 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"6f6522f9-6035-4484-ba00-2255f04cd85d","Type":"ContainerStarted","Data":"fab1cbb3ebe0700c728c3420e44ab0ec138da9fc47a4bff91e94ec4c47e301a1"} Jan 28 21:03:57 crc kubenswrapper[4746]: I0128 21:03:57.744514 4746 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"6f6522f9-6035-4484-ba00-2255f04cd85d","Type":"ContainerStarted","Data":"decde22812b5d43cd48e83a798e062b3f11724f0dae6124db26cd001310a601c"} Jan 28 21:03:57 crc kubenswrapper[4746]: I0128 21:03:57.845628 4746 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-dbb88bf8c-c8pfb"] Jan 28 21:03:57 crc kubenswrapper[4746]: I0128 21:03:57.937173 4746 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/rabbitmq-server-0"] Jan 28 21:03:58 crc kubenswrapper[4746]: W0128 21:03:58.009848 4746 
manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pode31300a2_ace0_435c_8eb9_383dc7a6120b.slice/crio-0ec5e87d0834574b80022da72dc5282a58c408b32a062a5d19e6027646c01cb1 WatchSource:0}: Error finding container 0ec5e87d0834574b80022da72dc5282a58c408b32a062a5d19e6027646c01cb1: Status 404 returned error can't find the container with id 0ec5e87d0834574b80022da72dc5282a58c408b32a062a5d19e6027646c01cb1 Jan 28 21:03:58 crc kubenswrapper[4746]: I0128 21:03:58.011186 4746 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cloudkitty-storageinit-9czms"] Jan 28 21:03:58 crc kubenswrapper[4746]: W0128 21:03:58.152233 4746 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod31ed4da0_c996_4afb_aa3d_d61a7c13ccfb.slice/crio-012b2683cb45dde6628c463774fb018d838c1f664320b7c735e0399ed55e84d6 WatchSource:0}: Error finding container 012b2683cb45dde6628c463774fb018d838c1f664320b7c735e0399ed55e84d6: Status 404 returned error can't find the container with id 012b2683cb45dde6628c463774fb018d838c1f664320b7c735e0399ed55e84d6 Jan 28 21:03:58 crc kubenswrapper[4746]: I0128 21:03:58.160528 4746 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/rabbitmq-cell1-server-0"] Jan 28 21:03:58 crc kubenswrapper[4746]: I0128 21:03:58.765894 4746 generic.go:334] "Generic (PLEG): container finished" podID="368e0053-e824-487b-995b-0805d6bbc718" containerID="42b99a7a24b240593ee48cc6243b657b4f84e2469c3597254a889f7b3a2fa567" exitCode=0 Jan 28 21:03:58 crc kubenswrapper[4746]: I0128 21:03:58.766014 4746 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-dbb88bf8c-c8pfb" event={"ID":"368e0053-e824-487b-995b-0805d6bbc718","Type":"ContainerDied","Data":"42b99a7a24b240593ee48cc6243b657b4f84e2469c3597254a889f7b3a2fa567"} Jan 28 21:03:58 crc kubenswrapper[4746]: I0128 21:03:58.766270 4746 kubelet.go:2453] "SyncLoop 
(PLEG): event for pod" pod="openstack/dnsmasq-dns-dbb88bf8c-c8pfb" event={"ID":"368e0053-e824-487b-995b-0805d6bbc718","Type":"ContainerStarted","Data":"2073fae9ee44ae79a884355411d83360a232ea7e11c9b0ec884055c99270a217"} Jan 28 21:03:58 crc kubenswrapper[4746]: I0128 21:03:58.769458 4746 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cloudkitty-storageinit-9czms" event={"ID":"e31300a2-ace0-435c-8eb9-383dc7a6120b","Type":"ContainerStarted","Data":"e05be61de5bbf8b7d404b06b2402502a8c3f3bb3334539bff21b437b83051af4"} Jan 28 21:03:58 crc kubenswrapper[4746]: I0128 21:03:58.769515 4746 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cloudkitty-storageinit-9czms" event={"ID":"e31300a2-ace0-435c-8eb9-383dc7a6120b","Type":"ContainerStarted","Data":"0ec5e87d0834574b80022da72dc5282a58c408b32a062a5d19e6027646c01cb1"} Jan 28 21:03:58 crc kubenswrapper[4746]: I0128 21:03:58.773323 4746 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-server-0" event={"ID":"f330def9-769c-4adf-9df3-c1a7c54cd502","Type":"ContainerStarted","Data":"1c88d8307c9322643e1e85d6f131fe472f05143dfa520ba6c2e077cdca516b8d"} Jan 28 21:03:58 crc kubenswrapper[4746]: I0128 21:03:58.778447 4746 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-cell1-server-0" event={"ID":"31ed4da0-c996-4afb-aa3d-d61a7c13ccfb","Type":"ContainerStarted","Data":"012b2683cb45dde6628c463774fb018d838c1f664320b7c735e0399ed55e84d6"} Jan 28 21:03:58 crc kubenswrapper[4746]: I0128 21:03:58.843436 4746 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/cloudkitty-storageinit-9czms" podStartSLOduration=1.843415261 podStartE2EDuration="1.843415261s" podCreationTimestamp="2026-01-28 21:03:57 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-28 21:03:58.826918204 +0000 UTC m=+1466.783104558" watchObservedRunningTime="2026-01-28 
21:03:58.843415261 +0000 UTC m=+1466.799601615" Jan 28 21:03:58 crc kubenswrapper[4746]: I0128 21:03:58.860961 4746 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="3252e1be-6e47-4264-a4d2-ba4099c9f3c0" path="/var/lib/kubelet/pods/3252e1be-6e47-4264-a4d2-ba4099c9f3c0/volumes" Jan 28 21:03:58 crc kubenswrapper[4746]: I0128 21:03:58.862208 4746 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="701360b2-121a-4cb4-9a4f-9ce63391e740" path="/var/lib/kubelet/pods/701360b2-121a-4cb4-9a4f-9ce63391e740/volumes" Jan 28 21:03:58 crc kubenswrapper[4746]: I0128 21:03:58.863193 4746 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="88718387-09d6-4e3d-a06f-4353ba42ce91" path="/var/lib/kubelet/pods/88718387-09d6-4e3d-a06f-4353ba42ce91/volumes" Jan 28 21:04:00 crc kubenswrapper[4746]: I0128 21:04:00.805211 4746 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-dbb88bf8c-c8pfb" event={"ID":"368e0053-e824-487b-995b-0805d6bbc718","Type":"ContainerStarted","Data":"a74e8853e3cb3f6ae9f3fd62e7fecd463506bc19f908f0501c233215f24f6c9f"} Jan 28 21:04:00 crc kubenswrapper[4746]: I0128 21:04:00.805913 4746 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/dnsmasq-dns-dbb88bf8c-c8pfb" Jan 28 21:04:00 crc kubenswrapper[4746]: I0128 21:04:00.806880 4746 generic.go:334] "Generic (PLEG): container finished" podID="e31300a2-ace0-435c-8eb9-383dc7a6120b" containerID="e05be61de5bbf8b7d404b06b2402502a8c3f3bb3334539bff21b437b83051af4" exitCode=0 Jan 28 21:04:00 crc kubenswrapper[4746]: I0128 21:04:00.806880 4746 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cloudkitty-storageinit-9czms" event={"ID":"e31300a2-ace0-435c-8eb9-383dc7a6120b","Type":"ContainerDied","Data":"e05be61de5bbf8b7d404b06b2402502a8c3f3bb3334539bff21b437b83051af4"} Jan 28 21:04:00 crc kubenswrapper[4746]: I0128 21:04:00.808637 4746 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openstack/rabbitmq-server-0" event={"ID":"f330def9-769c-4adf-9df3-c1a7c54cd502","Type":"ContainerStarted","Data":"3a642c8737fafc652b84b55ccbe91af81f095d9f955b946a79d237b4834de974"} Jan 28 21:04:00 crc kubenswrapper[4746]: I0128 21:04:00.810387 4746 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-cell1-server-0" event={"ID":"31ed4da0-c996-4afb-aa3d-d61a7c13ccfb","Type":"ContainerStarted","Data":"6b6b0e1be301e05489f4153d6dbadee75639f3523767e22c80b88f0c7b0b5fe1"} Jan 28 21:04:00 crc kubenswrapper[4746]: I0128 21:04:00.813401 4746 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"6f6522f9-6035-4484-ba00-2255f04cd85d","Type":"ContainerStarted","Data":"3a671efad5e3e766a851b7071417d1d80ba22c7c27e3bcd9b140978bc97983af"} Jan 28 21:04:00 crc kubenswrapper[4746]: I0128 21:04:00.813611 4746 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/ceilometer-0" Jan 28 21:04:00 crc kubenswrapper[4746]: I0128 21:04:00.871023 4746 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/dnsmasq-dns-dbb88bf8c-c8pfb" podStartSLOduration=4.870998965 podStartE2EDuration="4.870998965s" podCreationTimestamp="2026-01-28 21:03:56 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-28 21:04:00.828059853 +0000 UTC m=+1468.784246207" watchObservedRunningTime="2026-01-28 21:04:00.870998965 +0000 UTC m=+1468.827185329" Jan 28 21:04:00 crc kubenswrapper[4746]: I0128 21:04:00.936497 4746 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/ceilometer-0" podStartSLOduration=7.536942344 podStartE2EDuration="14.936474327s" podCreationTimestamp="2026-01-28 21:03:46 +0000 UTC" firstStartedPulling="2026-01-28 21:03:52.340060576 +0000 UTC m=+1460.296246931" lastFinishedPulling="2026-01-28 21:03:59.73959257 +0000 UTC m=+1467.695778914" 
observedRunningTime="2026-01-28 21:04:00.92105326 +0000 UTC m=+1468.877239624" watchObservedRunningTime="2026-01-28 21:04:00.936474327 +0000 UTC m=+1468.892660681" Jan 28 21:04:02 crc kubenswrapper[4746]: I0128 21:04:02.312573 4746 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/cloudkitty-storageinit-9czms" Jan 28 21:04:02 crc kubenswrapper[4746]: I0128 21:04:02.354138 4746 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e31300a2-ace0-435c-8eb9-383dc7a6120b-combined-ca-bundle\") pod \"e31300a2-ace0-435c-8eb9-383dc7a6120b\" (UID: \"e31300a2-ace0-435c-8eb9-383dc7a6120b\") " Jan 28 21:04:02 crc kubenswrapper[4746]: I0128 21:04:02.354185 4746 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"certs\" (UniqueName: \"kubernetes.io/projected/e31300a2-ace0-435c-8eb9-383dc7a6120b-certs\") pod \"e31300a2-ace0-435c-8eb9-383dc7a6120b\" (UID: \"e31300a2-ace0-435c-8eb9-383dc7a6120b\") " Jan 28 21:04:02 crc kubenswrapper[4746]: I0128 21:04:02.354252 4746 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/e31300a2-ace0-435c-8eb9-383dc7a6120b-config-data\") pod \"e31300a2-ace0-435c-8eb9-383dc7a6120b\" (UID: \"e31300a2-ace0-435c-8eb9-383dc7a6120b\") " Jan 28 21:04:02 crc kubenswrapper[4746]: I0128 21:04:02.354272 4746 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/e31300a2-ace0-435c-8eb9-383dc7a6120b-scripts\") pod \"e31300a2-ace0-435c-8eb9-383dc7a6120b\" (UID: \"e31300a2-ace0-435c-8eb9-383dc7a6120b\") " Jan 28 21:04:02 crc kubenswrapper[4746]: I0128 21:04:02.354303 4746 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-6m8nf\" (UniqueName: 
\"kubernetes.io/projected/e31300a2-ace0-435c-8eb9-383dc7a6120b-kube-api-access-6m8nf\") pod \"e31300a2-ace0-435c-8eb9-383dc7a6120b\" (UID: \"e31300a2-ace0-435c-8eb9-383dc7a6120b\") " Jan 28 21:04:02 crc kubenswrapper[4746]: I0128 21:04:02.370316 4746 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/e31300a2-ace0-435c-8eb9-383dc7a6120b-scripts" (OuterVolumeSpecName: "scripts") pod "e31300a2-ace0-435c-8eb9-383dc7a6120b" (UID: "e31300a2-ace0-435c-8eb9-383dc7a6120b"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 28 21:04:02 crc kubenswrapper[4746]: I0128 21:04:02.370432 4746 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/e31300a2-ace0-435c-8eb9-383dc7a6120b-certs" (OuterVolumeSpecName: "certs") pod "e31300a2-ace0-435c-8eb9-383dc7a6120b" (UID: "e31300a2-ace0-435c-8eb9-383dc7a6120b"). InnerVolumeSpecName "certs". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 28 21:04:02 crc kubenswrapper[4746]: I0128 21:04:02.371376 4746 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/e31300a2-ace0-435c-8eb9-383dc7a6120b-kube-api-access-6m8nf" (OuterVolumeSpecName: "kube-api-access-6m8nf") pod "e31300a2-ace0-435c-8eb9-383dc7a6120b" (UID: "e31300a2-ace0-435c-8eb9-383dc7a6120b"). InnerVolumeSpecName "kube-api-access-6m8nf". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 28 21:04:02 crc kubenswrapper[4746]: I0128 21:04:02.388878 4746 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/e31300a2-ace0-435c-8eb9-383dc7a6120b-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "e31300a2-ace0-435c-8eb9-383dc7a6120b" (UID: "e31300a2-ace0-435c-8eb9-383dc7a6120b"). InnerVolumeSpecName "combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 28 21:04:02 crc kubenswrapper[4746]: I0128 21:04:02.397417 4746 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/e31300a2-ace0-435c-8eb9-383dc7a6120b-config-data" (OuterVolumeSpecName: "config-data") pod "e31300a2-ace0-435c-8eb9-383dc7a6120b" (UID: "e31300a2-ace0-435c-8eb9-383dc7a6120b"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 28 21:04:02 crc kubenswrapper[4746]: I0128 21:04:02.456182 4746 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/e31300a2-ace0-435c-8eb9-383dc7a6120b-config-data\") on node \"crc\" DevicePath \"\"" Jan 28 21:04:02 crc kubenswrapper[4746]: I0128 21:04:02.456221 4746 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/e31300a2-ace0-435c-8eb9-383dc7a6120b-scripts\") on node \"crc\" DevicePath \"\"" Jan 28 21:04:02 crc kubenswrapper[4746]: I0128 21:04:02.456231 4746 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-6m8nf\" (UniqueName: \"kubernetes.io/projected/e31300a2-ace0-435c-8eb9-383dc7a6120b-kube-api-access-6m8nf\") on node \"crc\" DevicePath \"\"" Jan 28 21:04:02 crc kubenswrapper[4746]: I0128 21:04:02.456242 4746 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e31300a2-ace0-435c-8eb9-383dc7a6120b-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Jan 28 21:04:02 crc kubenswrapper[4746]: I0128 21:04:02.456306 4746 reconciler_common.go:293] "Volume detached for volume \"certs\" (UniqueName: \"kubernetes.io/projected/e31300a2-ace0-435c-8eb9-383dc7a6120b-certs\") on node \"crc\" DevicePath \"\"" Jan 28 21:04:02 crc kubenswrapper[4746]: I0128 21:04:02.851633 4746 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/cloudkitty-storageinit-9czms" Jan 28 21:04:02 crc kubenswrapper[4746]: I0128 21:04:02.853579 4746 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cloudkitty-storageinit-9czms" event={"ID":"e31300a2-ace0-435c-8eb9-383dc7a6120b","Type":"ContainerDied","Data":"0ec5e87d0834574b80022da72dc5282a58c408b32a062a5d19e6027646c01cb1"} Jan 28 21:04:02 crc kubenswrapper[4746]: I0128 21:04:02.853644 4746 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="0ec5e87d0834574b80022da72dc5282a58c408b32a062a5d19e6027646c01cb1" Jan 28 21:04:03 crc kubenswrapper[4746]: I0128 21:04:03.011372 4746 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/cloudkitty-proc-0"] Jan 28 21:04:03 crc kubenswrapper[4746]: I0128 21:04:03.011606 4746 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/cloudkitty-proc-0" podUID="8eab90e7-58c5-4bdf-bca6-12c78bdabea9" containerName="cloudkitty-proc" containerID="cri-o://4ab8038e3ebef6e4484d598e7d3ed09d8e833167d3e19159b52ee369687e748d" gracePeriod=30 Jan 28 21:04:03 crc kubenswrapper[4746]: I0128 21:04:03.036673 4746 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/cloudkitty-api-0"] Jan 28 21:04:03 crc kubenswrapper[4746]: I0128 21:04:03.036964 4746 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/cloudkitty-api-0" podUID="6e308833-0f26-4f16-9f4d-cd9a6e583ee9" containerName="cloudkitty-api-log" containerID="cri-o://832297f2799904b89c93ad5ebc2e91e71de6e0b478f0dfd3fc3a9f2f3fa1f944" gracePeriod=30 Jan 28 21:04:03 crc kubenswrapper[4746]: I0128 21:04:03.037001 4746 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/cloudkitty-api-0" podUID="6e308833-0f26-4f16-9f4d-cd9a6e583ee9" containerName="cloudkitty-api" containerID="cri-o://c82b268270307d0d209b8578cb1b031e6c4e7d82ac12518698a356f5d64ba8aa" gracePeriod=30 Jan 28 21:04:03 crc 
kubenswrapper[4746]: I0128 21:04:03.862572 4746 generic.go:334] "Generic (PLEG): container finished" podID="8eab90e7-58c5-4bdf-bca6-12c78bdabea9" containerID="4ab8038e3ebef6e4484d598e7d3ed09d8e833167d3e19159b52ee369687e748d" exitCode=0 Jan 28 21:04:03 crc kubenswrapper[4746]: I0128 21:04:03.862782 4746 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cloudkitty-proc-0" event={"ID":"8eab90e7-58c5-4bdf-bca6-12c78bdabea9","Type":"ContainerDied","Data":"4ab8038e3ebef6e4484d598e7d3ed09d8e833167d3e19159b52ee369687e748d"} Jan 28 21:04:03 crc kubenswrapper[4746]: I0128 21:04:03.865631 4746 generic.go:334] "Generic (PLEG): container finished" podID="6e308833-0f26-4f16-9f4d-cd9a6e583ee9" containerID="832297f2799904b89c93ad5ebc2e91e71de6e0b478f0dfd3fc3a9f2f3fa1f944" exitCode=143 Jan 28 21:04:03 crc kubenswrapper[4746]: I0128 21:04:03.865658 4746 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cloudkitty-api-0" event={"ID":"6e308833-0f26-4f16-9f4d-cd9a6e583ee9","Type":"ContainerDied","Data":"832297f2799904b89c93ad5ebc2e91e71de6e0b478f0dfd3fc3a9f2f3fa1f944"} Jan 28 21:04:03 crc kubenswrapper[4746]: I0128 21:04:03.915608 4746 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/cloudkitty-api-0" podUID="6e308833-0f26-4f16-9f4d-cd9a6e583ee9" containerName="cloudkitty-api" probeResult="failure" output="Get \"https://10.217.0.197:8889/healthcheck\": read tcp 10.217.0.2:50336->10.217.0.197:8889: read: connection reset by peer" Jan 28 21:04:04 crc kubenswrapper[4746]: I0128 21:04:04.160938 4746 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/cloudkitty-proc-0" Jan 28 21:04:04 crc kubenswrapper[4746]: I0128 21:04:04.227020 4746 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/8eab90e7-58c5-4bdf-bca6-12c78bdabea9-combined-ca-bundle\") pod \"8eab90e7-58c5-4bdf-bca6-12c78bdabea9\" (UID: \"8eab90e7-58c5-4bdf-bca6-12c78bdabea9\") " Jan 28 21:04:04 crc kubenswrapper[4746]: I0128 21:04:04.227166 4746 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/8eab90e7-58c5-4bdf-bca6-12c78bdabea9-scripts\") pod \"8eab90e7-58c5-4bdf-bca6-12c78bdabea9\" (UID: \"8eab90e7-58c5-4bdf-bca6-12c78bdabea9\") " Jan 28 21:04:04 crc kubenswrapper[4746]: I0128 21:04:04.227269 4746 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/8eab90e7-58c5-4bdf-bca6-12c78bdabea9-config-data-custom\") pod \"8eab90e7-58c5-4bdf-bca6-12c78bdabea9\" (UID: \"8eab90e7-58c5-4bdf-bca6-12c78bdabea9\") " Jan 28 21:04:04 crc kubenswrapper[4746]: I0128 21:04:04.227326 4746 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"certs\" (UniqueName: \"kubernetes.io/projected/8eab90e7-58c5-4bdf-bca6-12c78bdabea9-certs\") pod \"8eab90e7-58c5-4bdf-bca6-12c78bdabea9\" (UID: \"8eab90e7-58c5-4bdf-bca6-12c78bdabea9\") " Jan 28 21:04:04 crc kubenswrapper[4746]: I0128 21:04:04.227363 4746 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-6k576\" (UniqueName: \"kubernetes.io/projected/8eab90e7-58c5-4bdf-bca6-12c78bdabea9-kube-api-access-6k576\") pod \"8eab90e7-58c5-4bdf-bca6-12c78bdabea9\" (UID: \"8eab90e7-58c5-4bdf-bca6-12c78bdabea9\") " Jan 28 21:04:04 crc kubenswrapper[4746]: I0128 21:04:04.227455 4746 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: 
\"kubernetes.io/secret/8eab90e7-58c5-4bdf-bca6-12c78bdabea9-config-data\") pod \"8eab90e7-58c5-4bdf-bca6-12c78bdabea9\" (UID: \"8eab90e7-58c5-4bdf-bca6-12c78bdabea9\") " Jan 28 21:04:04 crc kubenswrapper[4746]: I0128 21:04:04.236014 4746 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/8eab90e7-58c5-4bdf-bca6-12c78bdabea9-scripts" (OuterVolumeSpecName: "scripts") pod "8eab90e7-58c5-4bdf-bca6-12c78bdabea9" (UID: "8eab90e7-58c5-4bdf-bca6-12c78bdabea9"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 28 21:04:04 crc kubenswrapper[4746]: I0128 21:04:04.245334 4746 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/8eab90e7-58c5-4bdf-bca6-12c78bdabea9-config-data-custom" (OuterVolumeSpecName: "config-data-custom") pod "8eab90e7-58c5-4bdf-bca6-12c78bdabea9" (UID: "8eab90e7-58c5-4bdf-bca6-12c78bdabea9"). InnerVolumeSpecName "config-data-custom". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 28 21:04:04 crc kubenswrapper[4746]: I0128 21:04:04.245370 4746 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/8eab90e7-58c5-4bdf-bca6-12c78bdabea9-certs" (OuterVolumeSpecName: "certs") pod "8eab90e7-58c5-4bdf-bca6-12c78bdabea9" (UID: "8eab90e7-58c5-4bdf-bca6-12c78bdabea9"). InnerVolumeSpecName "certs". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 28 21:04:04 crc kubenswrapper[4746]: I0128 21:04:04.245427 4746 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/8eab90e7-58c5-4bdf-bca6-12c78bdabea9-kube-api-access-6k576" (OuterVolumeSpecName: "kube-api-access-6k576") pod "8eab90e7-58c5-4bdf-bca6-12c78bdabea9" (UID: "8eab90e7-58c5-4bdf-bca6-12c78bdabea9"). InnerVolumeSpecName "kube-api-access-6k576". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 28 21:04:04 crc kubenswrapper[4746]: I0128 21:04:04.258291 4746 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/8eab90e7-58c5-4bdf-bca6-12c78bdabea9-config-data" (OuterVolumeSpecName: "config-data") pod "8eab90e7-58c5-4bdf-bca6-12c78bdabea9" (UID: "8eab90e7-58c5-4bdf-bca6-12c78bdabea9"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 28 21:04:04 crc kubenswrapper[4746]: I0128 21:04:04.270940 4746 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/8eab90e7-58c5-4bdf-bca6-12c78bdabea9-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "8eab90e7-58c5-4bdf-bca6-12c78bdabea9" (UID: "8eab90e7-58c5-4bdf-bca6-12c78bdabea9"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 28 21:04:04 crc kubenswrapper[4746]: I0128 21:04:04.305522 4746 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/cloudkitty-api-0" Jan 28 21:04:04 crc kubenswrapper[4746]: I0128 21:04:04.330792 4746 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/6e308833-0f26-4f16-9f4d-cd9a6e583ee9-internal-tls-certs\") pod \"6e308833-0f26-4f16-9f4d-cd9a6e583ee9\" (UID: \"6e308833-0f26-4f16-9f4d-cd9a6e583ee9\") " Jan 28 21:04:04 crc kubenswrapper[4746]: I0128 21:04:04.330848 4746 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/6e308833-0f26-4f16-9f4d-cd9a6e583ee9-config-data\") pod \"6e308833-0f26-4f16-9f4d-cd9a6e583ee9\" (UID: \"6e308833-0f26-4f16-9f4d-cd9a6e583ee9\") " Jan 28 21:04:04 crc kubenswrapper[4746]: I0128 21:04:04.330964 4746 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/6e308833-0f26-4f16-9f4d-cd9a6e583ee9-config-data-custom\") pod \"6e308833-0f26-4f16-9f4d-cd9a6e583ee9\" (UID: \"6e308833-0f26-4f16-9f4d-cd9a6e583ee9\") " Jan 28 21:04:04 crc kubenswrapper[4746]: I0128 21:04:04.330992 4746 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/6e308833-0f26-4f16-9f4d-cd9a6e583ee9-logs\") pod \"6e308833-0f26-4f16-9f4d-cd9a6e583ee9\" (UID: \"6e308833-0f26-4f16-9f4d-cd9a6e583ee9\") " Jan 28 21:04:04 crc kubenswrapper[4746]: I0128 21:04:04.331015 4746 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/6e308833-0f26-4f16-9f4d-cd9a6e583ee9-public-tls-certs\") pod \"6e308833-0f26-4f16-9f4d-cd9a6e583ee9\" (UID: \"6e308833-0f26-4f16-9f4d-cd9a6e583ee9\") " Jan 28 21:04:04 crc kubenswrapper[4746]: I0128 21:04:04.331032 4746 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: 
\"kubernetes.io/secret/6e308833-0f26-4f16-9f4d-cd9a6e583ee9-scripts\") pod \"6e308833-0f26-4f16-9f4d-cd9a6e583ee9\" (UID: \"6e308833-0f26-4f16-9f4d-cd9a6e583ee9\") " Jan 28 21:04:04 crc kubenswrapper[4746]: I0128 21:04:04.331067 4746 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"certs\" (UniqueName: \"kubernetes.io/projected/6e308833-0f26-4f16-9f4d-cd9a6e583ee9-certs\") pod \"6e308833-0f26-4f16-9f4d-cd9a6e583ee9\" (UID: \"6e308833-0f26-4f16-9f4d-cd9a6e583ee9\") " Jan 28 21:04:04 crc kubenswrapper[4746]: I0128 21:04:04.331145 4746 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/6e308833-0f26-4f16-9f4d-cd9a6e583ee9-combined-ca-bundle\") pod \"6e308833-0f26-4f16-9f4d-cd9a6e583ee9\" (UID: \"6e308833-0f26-4f16-9f4d-cd9a6e583ee9\") " Jan 28 21:04:04 crc kubenswrapper[4746]: I0128 21:04:04.331170 4746 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-z72c4\" (UniqueName: \"kubernetes.io/projected/6e308833-0f26-4f16-9f4d-cd9a6e583ee9-kube-api-access-z72c4\") pod \"6e308833-0f26-4f16-9f4d-cd9a6e583ee9\" (UID: \"6e308833-0f26-4f16-9f4d-cd9a6e583ee9\") " Jan 28 21:04:04 crc kubenswrapper[4746]: I0128 21:04:04.331555 4746 reconciler_common.go:293] "Volume detached for volume \"certs\" (UniqueName: \"kubernetes.io/projected/8eab90e7-58c5-4bdf-bca6-12c78bdabea9-certs\") on node \"crc\" DevicePath \"\"" Jan 28 21:04:04 crc kubenswrapper[4746]: I0128 21:04:04.331571 4746 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-6k576\" (UniqueName: \"kubernetes.io/projected/8eab90e7-58c5-4bdf-bca6-12c78bdabea9-kube-api-access-6k576\") on node \"crc\" DevicePath \"\"" Jan 28 21:04:04 crc kubenswrapper[4746]: I0128 21:04:04.331581 4746 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/8eab90e7-58c5-4bdf-bca6-12c78bdabea9-config-data\") on node 
\"crc\" DevicePath \"\"" Jan 28 21:04:04 crc kubenswrapper[4746]: I0128 21:04:04.331589 4746 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/8eab90e7-58c5-4bdf-bca6-12c78bdabea9-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Jan 28 21:04:04 crc kubenswrapper[4746]: I0128 21:04:04.331597 4746 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/8eab90e7-58c5-4bdf-bca6-12c78bdabea9-scripts\") on node \"crc\" DevicePath \"\"" Jan 28 21:04:04 crc kubenswrapper[4746]: I0128 21:04:04.331605 4746 reconciler_common.go:293] "Volume detached for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/8eab90e7-58c5-4bdf-bca6-12c78bdabea9-config-data-custom\") on node \"crc\" DevicePath \"\"" Jan 28 21:04:04 crc kubenswrapper[4746]: I0128 21:04:04.340355 4746 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/6e308833-0f26-4f16-9f4d-cd9a6e583ee9-kube-api-access-z72c4" (OuterVolumeSpecName: "kube-api-access-z72c4") pod "6e308833-0f26-4f16-9f4d-cd9a6e583ee9" (UID: "6e308833-0f26-4f16-9f4d-cd9a6e583ee9"). InnerVolumeSpecName "kube-api-access-z72c4". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 28 21:04:04 crc kubenswrapper[4746]: I0128 21:04:04.347594 4746 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/6e308833-0f26-4f16-9f4d-cd9a6e583ee9-logs" (OuterVolumeSpecName: "logs") pod "6e308833-0f26-4f16-9f4d-cd9a6e583ee9" (UID: "6e308833-0f26-4f16-9f4d-cd9a6e583ee9"). InnerVolumeSpecName "logs". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 28 21:04:04 crc kubenswrapper[4746]: I0128 21:04:04.357489 4746 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/6e308833-0f26-4f16-9f4d-cd9a6e583ee9-config-data-custom" (OuterVolumeSpecName: "config-data-custom") pod "6e308833-0f26-4f16-9f4d-cd9a6e583ee9" (UID: "6e308833-0f26-4f16-9f4d-cd9a6e583ee9"). InnerVolumeSpecName "config-data-custom". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 28 21:04:04 crc kubenswrapper[4746]: I0128 21:04:04.361863 4746 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/6e308833-0f26-4f16-9f4d-cd9a6e583ee9-certs" (OuterVolumeSpecName: "certs") pod "6e308833-0f26-4f16-9f4d-cd9a6e583ee9" (UID: "6e308833-0f26-4f16-9f4d-cd9a6e583ee9"). InnerVolumeSpecName "certs". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 28 21:04:04 crc kubenswrapper[4746]: I0128 21:04:04.362626 4746 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/6e308833-0f26-4f16-9f4d-cd9a6e583ee9-scripts" (OuterVolumeSpecName: "scripts") pod "6e308833-0f26-4f16-9f4d-cd9a6e583ee9" (UID: "6e308833-0f26-4f16-9f4d-cd9a6e583ee9"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 28 21:04:04 crc kubenswrapper[4746]: I0128 21:04:04.419905 4746 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/6e308833-0f26-4f16-9f4d-cd9a6e583ee9-config-data" (OuterVolumeSpecName: "config-data") pod "6e308833-0f26-4f16-9f4d-cd9a6e583ee9" (UID: "6e308833-0f26-4f16-9f4d-cd9a6e583ee9"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 28 21:04:04 crc kubenswrapper[4746]: I0128 21:04:04.433031 4746 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-z72c4\" (UniqueName: \"kubernetes.io/projected/6e308833-0f26-4f16-9f4d-cd9a6e583ee9-kube-api-access-z72c4\") on node \"crc\" DevicePath \"\"" Jan 28 21:04:04 crc kubenswrapper[4746]: I0128 21:04:04.433224 4746 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/6e308833-0f26-4f16-9f4d-cd9a6e583ee9-config-data\") on node \"crc\" DevicePath \"\"" Jan 28 21:04:04 crc kubenswrapper[4746]: I0128 21:04:04.433311 4746 reconciler_common.go:293] "Volume detached for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/6e308833-0f26-4f16-9f4d-cd9a6e583ee9-config-data-custom\") on node \"crc\" DevicePath \"\"" Jan 28 21:04:04 crc kubenswrapper[4746]: I0128 21:04:04.433366 4746 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/6e308833-0f26-4f16-9f4d-cd9a6e583ee9-logs\") on node \"crc\" DevicePath \"\"" Jan 28 21:04:04 crc kubenswrapper[4746]: I0128 21:04:04.433416 4746 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/6e308833-0f26-4f16-9f4d-cd9a6e583ee9-scripts\") on node \"crc\" DevicePath \"\"" Jan 28 21:04:04 crc kubenswrapper[4746]: I0128 21:04:04.433491 4746 reconciler_common.go:293] "Volume detached for volume \"certs\" (UniqueName: \"kubernetes.io/projected/6e308833-0f26-4f16-9f4d-cd9a6e583ee9-certs\") on node \"crc\" DevicePath \"\"" Jan 28 21:04:04 crc kubenswrapper[4746]: I0128 21:04:04.442300 4746 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/6e308833-0f26-4f16-9f4d-cd9a6e583ee9-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "6e308833-0f26-4f16-9f4d-cd9a6e583ee9" (UID: "6e308833-0f26-4f16-9f4d-cd9a6e583ee9"). 
InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 28 21:04:04 crc kubenswrapper[4746]: I0128 21:04:04.455454 4746 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/6e308833-0f26-4f16-9f4d-cd9a6e583ee9-public-tls-certs" (OuterVolumeSpecName: "public-tls-certs") pod "6e308833-0f26-4f16-9f4d-cd9a6e583ee9" (UID: "6e308833-0f26-4f16-9f4d-cd9a6e583ee9"). InnerVolumeSpecName "public-tls-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 28 21:04:04 crc kubenswrapper[4746]: I0128 21:04:04.455496 4746 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/6e308833-0f26-4f16-9f4d-cd9a6e583ee9-internal-tls-certs" (OuterVolumeSpecName: "internal-tls-certs") pod "6e308833-0f26-4f16-9f4d-cd9a6e583ee9" (UID: "6e308833-0f26-4f16-9f4d-cd9a6e583ee9"). InnerVolumeSpecName "internal-tls-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 28 21:04:04 crc kubenswrapper[4746]: I0128 21:04:04.535415 4746 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/6e308833-0f26-4f16-9f4d-cd9a6e583ee9-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Jan 28 21:04:04 crc kubenswrapper[4746]: I0128 21:04:04.535444 4746 reconciler_common.go:293] "Volume detached for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/6e308833-0f26-4f16-9f4d-cd9a6e583ee9-internal-tls-certs\") on node \"crc\" DevicePath \"\"" Jan 28 21:04:04 crc kubenswrapper[4746]: I0128 21:04:04.535453 4746 reconciler_common.go:293] "Volume detached for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/6e308833-0f26-4f16-9f4d-cd9a6e583ee9-public-tls-certs\") on node \"crc\" DevicePath \"\"" Jan 28 21:04:04 crc kubenswrapper[4746]: I0128 21:04:04.898882 4746 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/cloudkitty-proc-0" Jan 28 21:04:04 crc kubenswrapper[4746]: I0128 21:04:04.898963 4746 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cloudkitty-proc-0" event={"ID":"8eab90e7-58c5-4bdf-bca6-12c78bdabea9","Type":"ContainerDied","Data":"e1a1ea0ab8207674637ab36e5430dbf91210b0a889652187579a0efa44a5ebd8"} Jan 28 21:04:04 crc kubenswrapper[4746]: I0128 21:04:04.899019 4746 scope.go:117] "RemoveContainer" containerID="4ab8038e3ebef6e4484d598e7d3ed09d8e833167d3e19159b52ee369687e748d" Jan 28 21:04:04 crc kubenswrapper[4746]: I0128 21:04:04.907378 4746 generic.go:334] "Generic (PLEG): container finished" podID="6e308833-0f26-4f16-9f4d-cd9a6e583ee9" containerID="c82b268270307d0d209b8578cb1b031e6c4e7d82ac12518698a356f5d64ba8aa" exitCode=0 Jan 28 21:04:04 crc kubenswrapper[4746]: I0128 21:04:04.907434 4746 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cloudkitty-api-0" event={"ID":"6e308833-0f26-4f16-9f4d-cd9a6e583ee9","Type":"ContainerDied","Data":"c82b268270307d0d209b8578cb1b031e6c4e7d82ac12518698a356f5d64ba8aa"} Jan 28 21:04:04 crc kubenswrapper[4746]: I0128 21:04:04.907462 4746 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cloudkitty-api-0" event={"ID":"6e308833-0f26-4f16-9f4d-cd9a6e583ee9","Type":"ContainerDied","Data":"b8cfc746885ee651b2f65ddaa465c26371726b571b267ce7f809c540dc9d4d35"} Jan 28 21:04:04 crc kubenswrapper[4746]: I0128 21:04:04.907552 4746 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/cloudkitty-api-0" Jan 28 21:04:04 crc kubenswrapper[4746]: I0128 21:04:04.937594 4746 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/cloudkitty-proc-0"] Jan 28 21:04:04 crc kubenswrapper[4746]: I0128 21:04:04.981119 4746 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/cloudkitty-proc-0"] Jan 28 21:04:04 crc kubenswrapper[4746]: I0128 21:04:04.987395 4746 scope.go:117] "RemoveContainer" containerID="c82b268270307d0d209b8578cb1b031e6c4e7d82ac12518698a356f5d64ba8aa" Jan 28 21:04:05 crc kubenswrapper[4746]: I0128 21:04:05.012986 4746 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/cloudkitty-api-0"] Jan 28 21:04:05 crc kubenswrapper[4746]: I0128 21:04:05.025387 4746 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/cloudkitty-api-0"] Jan 28 21:04:05 crc kubenswrapper[4746]: I0128 21:04:05.028025 4746 scope.go:117] "RemoveContainer" containerID="832297f2799904b89c93ad5ebc2e91e71de6e0b478f0dfd3fc3a9f2f3fa1f944" Jan 28 21:04:05 crc kubenswrapper[4746]: I0128 21:04:05.040464 4746 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/cloudkitty-proc-0"] Jan 28 21:04:05 crc kubenswrapper[4746]: E0128 21:04:05.041001 4746 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="6e308833-0f26-4f16-9f4d-cd9a6e583ee9" containerName="cloudkitty-api" Jan 28 21:04:05 crc kubenswrapper[4746]: I0128 21:04:05.041026 4746 state_mem.go:107] "Deleted CPUSet assignment" podUID="6e308833-0f26-4f16-9f4d-cd9a6e583ee9" containerName="cloudkitty-api" Jan 28 21:04:05 crc kubenswrapper[4746]: E0128 21:04:05.041046 4746 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="e31300a2-ace0-435c-8eb9-383dc7a6120b" containerName="cloudkitty-storageinit" Jan 28 21:04:05 crc kubenswrapper[4746]: I0128 21:04:05.041056 4746 state_mem.go:107] "Deleted CPUSet assignment" podUID="e31300a2-ace0-435c-8eb9-383dc7a6120b" containerName="cloudkitty-storageinit" Jan 28 21:04:05 
crc kubenswrapper[4746]: E0128 21:04:05.041071 4746 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="8eab90e7-58c5-4bdf-bca6-12c78bdabea9" containerName="cloudkitty-proc" Jan 28 21:04:05 crc kubenswrapper[4746]: I0128 21:04:05.041185 4746 state_mem.go:107] "Deleted CPUSet assignment" podUID="8eab90e7-58c5-4bdf-bca6-12c78bdabea9" containerName="cloudkitty-proc" Jan 28 21:04:05 crc kubenswrapper[4746]: E0128 21:04:05.041208 4746 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="6e308833-0f26-4f16-9f4d-cd9a6e583ee9" containerName="cloudkitty-api-log" Jan 28 21:04:05 crc kubenswrapper[4746]: I0128 21:04:05.041219 4746 state_mem.go:107] "Deleted CPUSet assignment" podUID="6e308833-0f26-4f16-9f4d-cd9a6e583ee9" containerName="cloudkitty-api-log" Jan 28 21:04:05 crc kubenswrapper[4746]: I0128 21:04:05.041506 4746 memory_manager.go:354] "RemoveStaleState removing state" podUID="6e308833-0f26-4f16-9f4d-cd9a6e583ee9" containerName="cloudkitty-api-log" Jan 28 21:04:05 crc kubenswrapper[4746]: I0128 21:04:05.041533 4746 memory_manager.go:354] "RemoveStaleState removing state" podUID="6e308833-0f26-4f16-9f4d-cd9a6e583ee9" containerName="cloudkitty-api" Jan 28 21:04:05 crc kubenswrapper[4746]: I0128 21:04:05.041560 4746 memory_manager.go:354] "RemoveStaleState removing state" podUID="e31300a2-ace0-435c-8eb9-383dc7a6120b" containerName="cloudkitty-storageinit" Jan 28 21:04:05 crc kubenswrapper[4746]: I0128 21:04:05.041572 4746 memory_manager.go:354] "RemoveStaleState removing state" podUID="8eab90e7-58c5-4bdf-bca6-12c78bdabea9" containerName="cloudkitty-proc" Jan 28 21:04:05 crc kubenswrapper[4746]: I0128 21:04:05.043429 4746 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/cloudkitty-proc-0" Jan 28 21:04:05 crc kubenswrapper[4746]: I0128 21:04:05.046461 4746 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cloudkitty-cloudkitty-dockercfg-bjgvh" Jan 28 21:04:05 crc kubenswrapper[4746]: I0128 21:04:05.046765 4746 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-cloudkitty-client-internal" Jan 28 21:04:05 crc kubenswrapper[4746]: I0128 21:04:05.047663 4746 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cloudkitty-config-data" Jan 28 21:04:05 crc kubenswrapper[4746]: I0128 21:04:05.047910 4746 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cloudkitty-proc-config-data" Jan 28 21:04:05 crc kubenswrapper[4746]: I0128 21:04:05.048882 4746 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cloudkitty-scripts" Jan 28 21:04:05 crc kubenswrapper[4746]: I0128 21:04:05.050361 4746 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/cloudkitty-api-0"] Jan 28 21:04:05 crc kubenswrapper[4746]: I0128 21:04:05.052517 4746 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/cloudkitty-api-0" Jan 28 21:04:05 crc kubenswrapper[4746]: I0128 21:04:05.056096 4746 scope.go:117] "RemoveContainer" containerID="c82b268270307d0d209b8578cb1b031e6c4e7d82ac12518698a356f5d64ba8aa" Jan 28 21:04:05 crc kubenswrapper[4746]: I0128 21:04:05.056349 4746 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-cloudkitty-public-svc" Jan 28 21:04:05 crc kubenswrapper[4746]: I0128 21:04:05.056393 4746 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cloudkitty-api-config-data" Jan 28 21:04:05 crc kubenswrapper[4746]: I0128 21:04:05.056428 4746 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-cloudkitty-internal-svc" Jan 28 21:04:05 crc kubenswrapper[4746]: E0128 21:04:05.059012 4746 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"c82b268270307d0d209b8578cb1b031e6c4e7d82ac12518698a356f5d64ba8aa\": container with ID starting with c82b268270307d0d209b8578cb1b031e6c4e7d82ac12518698a356f5d64ba8aa not found: ID does not exist" containerID="c82b268270307d0d209b8578cb1b031e6c4e7d82ac12518698a356f5d64ba8aa" Jan 28 21:04:05 crc kubenswrapper[4746]: I0128 21:04:05.059258 4746 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"c82b268270307d0d209b8578cb1b031e6c4e7d82ac12518698a356f5d64ba8aa"} err="failed to get container status \"c82b268270307d0d209b8578cb1b031e6c4e7d82ac12518698a356f5d64ba8aa\": rpc error: code = NotFound desc = could not find container \"c82b268270307d0d209b8578cb1b031e6c4e7d82ac12518698a356f5d64ba8aa\": container with ID starting with c82b268270307d0d209b8578cb1b031e6c4e7d82ac12518698a356f5d64ba8aa not found: ID does not exist" Jan 28 21:04:05 crc kubenswrapper[4746]: I0128 21:04:05.059285 4746 scope.go:117] "RemoveContainer" containerID="832297f2799904b89c93ad5ebc2e91e71de6e0b478f0dfd3fc3a9f2f3fa1f944" Jan 28 
21:04:05 crc kubenswrapper[4746]: E0128 21:04:05.059606 4746 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"832297f2799904b89c93ad5ebc2e91e71de6e0b478f0dfd3fc3a9f2f3fa1f944\": container with ID starting with 832297f2799904b89c93ad5ebc2e91e71de6e0b478f0dfd3fc3a9f2f3fa1f944 not found: ID does not exist" containerID="832297f2799904b89c93ad5ebc2e91e71de6e0b478f0dfd3fc3a9f2f3fa1f944" Jan 28 21:04:05 crc kubenswrapper[4746]: I0128 21:04:05.059637 4746 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"832297f2799904b89c93ad5ebc2e91e71de6e0b478f0dfd3fc3a9f2f3fa1f944"} err="failed to get container status \"832297f2799904b89c93ad5ebc2e91e71de6e0b478f0dfd3fc3a9f2f3fa1f944\": rpc error: code = NotFound desc = could not find container \"832297f2799904b89c93ad5ebc2e91e71de6e0b478f0dfd3fc3a9f2f3fa1f944\": container with ID starting with 832297f2799904b89c93ad5ebc2e91e71de6e0b478f0dfd3fc3a9f2f3fa1f944 not found: ID does not exist" Jan 28 21:04:05 crc kubenswrapper[4746]: I0128 21:04:05.061407 4746 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cloudkitty-proc-0"] Jan 28 21:04:05 crc kubenswrapper[4746]: I0128 21:04:05.076419 4746 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cloudkitty-api-0"] Jan 28 21:04:05 crc kubenswrapper[4746]: I0128 21:04:05.154836 4746 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-wxgqj\" (UniqueName: \"kubernetes.io/projected/54f66341-e026-4e4e-a7d4-be4f199ff3d6-kube-api-access-wxgqj\") pod \"cloudkitty-proc-0\" (UID: \"54f66341-e026-4e4e-a7d4-be4f199ff3d6\") " pod="openstack/cloudkitty-proc-0" Jan 28 21:04:05 crc kubenswrapper[4746]: I0128 21:04:05.154882 4746 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: 
\"kubernetes.io/secret/e6b50cb8-f8a4-49e7-b464-7e42fc66e499-scripts\") pod \"cloudkitty-api-0\" (UID: \"e6b50cb8-f8a4-49e7-b464-7e42fc66e499\") " pod="openstack/cloudkitty-api-0" Jan 28 21:04:05 crc kubenswrapper[4746]: I0128 21:04:05.154910 4746 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/54f66341-e026-4e4e-a7d4-be4f199ff3d6-config-data-custom\") pod \"cloudkitty-proc-0\" (UID: \"54f66341-e026-4e4e-a7d4-be4f199ff3d6\") " pod="openstack/cloudkitty-proc-0" Jan 28 21:04:05 crc kubenswrapper[4746]: I0128 21:04:05.154930 4746 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/e6b50cb8-f8a4-49e7-b464-7e42fc66e499-logs\") pod \"cloudkitty-api-0\" (UID: \"e6b50cb8-f8a4-49e7-b464-7e42fc66e499\") " pod="openstack/cloudkitty-api-0" Jan 28 21:04:05 crc kubenswrapper[4746]: I0128 21:04:05.154961 4746 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/e6b50cb8-f8a4-49e7-b464-7e42fc66e499-internal-tls-certs\") pod \"cloudkitty-api-0\" (UID: \"e6b50cb8-f8a4-49e7-b464-7e42fc66e499\") " pod="openstack/cloudkitty-api-0" Jan 28 21:04:05 crc kubenswrapper[4746]: I0128 21:04:05.155000 4746 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"certs\" (UniqueName: \"kubernetes.io/projected/e6b50cb8-f8a4-49e7-b464-7e42fc66e499-certs\") pod \"cloudkitty-api-0\" (UID: \"e6b50cb8-f8a4-49e7-b464-7e42fc66e499\") " pod="openstack/cloudkitty-api-0" Jan 28 21:04:05 crc kubenswrapper[4746]: I0128 21:04:05.155029 4746 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/54f66341-e026-4e4e-a7d4-be4f199ff3d6-config-data\") pod \"cloudkitty-proc-0\" (UID: 
\"54f66341-e026-4e4e-a7d4-be4f199ff3d6\") " pod="openstack/cloudkitty-proc-0" Jan 28 21:04:05 crc kubenswrapper[4746]: I0128 21:04:05.155152 4746 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/54f66341-e026-4e4e-a7d4-be4f199ff3d6-scripts\") pod \"cloudkitty-proc-0\" (UID: \"54f66341-e026-4e4e-a7d4-be4f199ff3d6\") " pod="openstack/cloudkitty-proc-0" Jan 28 21:04:05 crc kubenswrapper[4746]: I0128 21:04:05.155365 4746 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/54f66341-e026-4e4e-a7d4-be4f199ff3d6-combined-ca-bundle\") pod \"cloudkitty-proc-0\" (UID: \"54f66341-e026-4e4e-a7d4-be4f199ff3d6\") " pod="openstack/cloudkitty-proc-0" Jan 28 21:04:05 crc kubenswrapper[4746]: I0128 21:04:05.155436 4746 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e6b50cb8-f8a4-49e7-b464-7e42fc66e499-combined-ca-bundle\") pod \"cloudkitty-api-0\" (UID: \"e6b50cb8-f8a4-49e7-b464-7e42fc66e499\") " pod="openstack/cloudkitty-api-0" Jan 28 21:04:05 crc kubenswrapper[4746]: I0128 21:04:05.155455 4746 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"certs\" (UniqueName: \"kubernetes.io/projected/54f66341-e026-4e4e-a7d4-be4f199ff3d6-certs\") pod \"cloudkitty-proc-0\" (UID: \"54f66341-e026-4e4e-a7d4-be4f199ff3d6\") " pod="openstack/cloudkitty-proc-0" Jan 28 21:04:05 crc kubenswrapper[4746]: I0128 21:04:05.155486 4746 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/e6b50cb8-f8a4-49e7-b464-7e42fc66e499-public-tls-certs\") pod \"cloudkitty-api-0\" (UID: \"e6b50cb8-f8a4-49e7-b464-7e42fc66e499\") " pod="openstack/cloudkitty-api-0" Jan 28 21:04:05 crc 
kubenswrapper[4746]: I0128 21:04:05.155612 4746 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/e6b50cb8-f8a4-49e7-b464-7e42fc66e499-config-data\") pod \"cloudkitty-api-0\" (UID: \"e6b50cb8-f8a4-49e7-b464-7e42fc66e499\") " pod="openstack/cloudkitty-api-0" Jan 28 21:04:05 crc kubenswrapper[4746]: I0128 21:04:05.155687 4746 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-z7lr7\" (UniqueName: \"kubernetes.io/projected/e6b50cb8-f8a4-49e7-b464-7e42fc66e499-kube-api-access-z7lr7\") pod \"cloudkitty-api-0\" (UID: \"e6b50cb8-f8a4-49e7-b464-7e42fc66e499\") " pod="openstack/cloudkitty-api-0" Jan 28 21:04:05 crc kubenswrapper[4746]: I0128 21:04:05.155741 4746 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/e6b50cb8-f8a4-49e7-b464-7e42fc66e499-config-data-custom\") pod \"cloudkitty-api-0\" (UID: \"e6b50cb8-f8a4-49e7-b464-7e42fc66e499\") " pod="openstack/cloudkitty-api-0" Jan 28 21:04:05 crc kubenswrapper[4746]: I0128 21:04:05.258687 4746 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/e6b50cb8-f8a4-49e7-b464-7e42fc66e499-public-tls-certs\") pod \"cloudkitty-api-0\" (UID: \"e6b50cb8-f8a4-49e7-b464-7e42fc66e499\") " pod="openstack/cloudkitty-api-0" Jan 28 21:04:05 crc kubenswrapper[4746]: I0128 21:04:05.258779 4746 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/e6b50cb8-f8a4-49e7-b464-7e42fc66e499-config-data\") pod \"cloudkitty-api-0\" (UID: \"e6b50cb8-f8a4-49e7-b464-7e42fc66e499\") " pod="openstack/cloudkitty-api-0" Jan 28 21:04:05 crc kubenswrapper[4746]: I0128 21:04:05.258805 4746 reconciler_common.go:218] "operationExecutor.MountVolume started 
for volume \"kube-api-access-z7lr7\" (UniqueName: \"kubernetes.io/projected/e6b50cb8-f8a4-49e7-b464-7e42fc66e499-kube-api-access-z7lr7\") pod \"cloudkitty-api-0\" (UID: \"e6b50cb8-f8a4-49e7-b464-7e42fc66e499\") " pod="openstack/cloudkitty-api-0" Jan 28 21:04:05 crc kubenswrapper[4746]: I0128 21:04:05.258839 4746 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/e6b50cb8-f8a4-49e7-b464-7e42fc66e499-config-data-custom\") pod \"cloudkitty-api-0\" (UID: \"e6b50cb8-f8a4-49e7-b464-7e42fc66e499\") " pod="openstack/cloudkitty-api-0" Jan 28 21:04:05 crc kubenswrapper[4746]: I0128 21:04:05.259127 4746 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-wxgqj\" (UniqueName: \"kubernetes.io/projected/54f66341-e026-4e4e-a7d4-be4f199ff3d6-kube-api-access-wxgqj\") pod \"cloudkitty-proc-0\" (UID: \"54f66341-e026-4e4e-a7d4-be4f199ff3d6\") " pod="openstack/cloudkitty-proc-0" Jan 28 21:04:05 crc kubenswrapper[4746]: I0128 21:04:05.259156 4746 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/e6b50cb8-f8a4-49e7-b464-7e42fc66e499-scripts\") pod \"cloudkitty-api-0\" (UID: \"e6b50cb8-f8a4-49e7-b464-7e42fc66e499\") " pod="openstack/cloudkitty-api-0" Jan 28 21:04:05 crc kubenswrapper[4746]: I0128 21:04:05.259176 4746 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/54f66341-e026-4e4e-a7d4-be4f199ff3d6-config-data-custom\") pod \"cloudkitty-proc-0\" (UID: \"54f66341-e026-4e4e-a7d4-be4f199ff3d6\") " pod="openstack/cloudkitty-proc-0" Jan 28 21:04:05 crc kubenswrapper[4746]: I0128 21:04:05.259194 4746 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/e6b50cb8-f8a4-49e7-b464-7e42fc66e499-logs\") pod \"cloudkitty-api-0\" (UID: 
\"e6b50cb8-f8a4-49e7-b464-7e42fc66e499\") " pod="openstack/cloudkitty-api-0" Jan 28 21:04:05 crc kubenswrapper[4746]: I0128 21:04:05.259213 4746 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/e6b50cb8-f8a4-49e7-b464-7e42fc66e499-internal-tls-certs\") pod \"cloudkitty-api-0\" (UID: \"e6b50cb8-f8a4-49e7-b464-7e42fc66e499\") " pod="openstack/cloudkitty-api-0" Jan 28 21:04:05 crc kubenswrapper[4746]: I0128 21:04:05.259244 4746 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"certs\" (UniqueName: \"kubernetes.io/projected/e6b50cb8-f8a4-49e7-b464-7e42fc66e499-certs\") pod \"cloudkitty-api-0\" (UID: \"e6b50cb8-f8a4-49e7-b464-7e42fc66e499\") " pod="openstack/cloudkitty-api-0" Jan 28 21:04:05 crc kubenswrapper[4746]: I0128 21:04:05.259275 4746 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/54f66341-e026-4e4e-a7d4-be4f199ff3d6-config-data\") pod \"cloudkitty-proc-0\" (UID: \"54f66341-e026-4e4e-a7d4-be4f199ff3d6\") " pod="openstack/cloudkitty-proc-0" Jan 28 21:04:05 crc kubenswrapper[4746]: I0128 21:04:05.259311 4746 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/54f66341-e026-4e4e-a7d4-be4f199ff3d6-scripts\") pod \"cloudkitty-proc-0\" (UID: \"54f66341-e026-4e4e-a7d4-be4f199ff3d6\") " pod="openstack/cloudkitty-proc-0" Jan 28 21:04:05 crc kubenswrapper[4746]: I0128 21:04:05.259384 4746 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/54f66341-e026-4e4e-a7d4-be4f199ff3d6-combined-ca-bundle\") pod \"cloudkitty-proc-0\" (UID: \"54f66341-e026-4e4e-a7d4-be4f199ff3d6\") " pod="openstack/cloudkitty-proc-0" Jan 28 21:04:05 crc kubenswrapper[4746]: I0128 21:04:05.259414 4746 reconciler_common.go:218] "operationExecutor.MountVolume 
started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e6b50cb8-f8a4-49e7-b464-7e42fc66e499-combined-ca-bundle\") pod \"cloudkitty-api-0\" (UID: \"e6b50cb8-f8a4-49e7-b464-7e42fc66e499\") " pod="openstack/cloudkitty-api-0" Jan 28 21:04:05 crc kubenswrapper[4746]: I0128 21:04:05.259427 4746 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"certs\" (UniqueName: \"kubernetes.io/projected/54f66341-e026-4e4e-a7d4-be4f199ff3d6-certs\") pod \"cloudkitty-proc-0\" (UID: \"54f66341-e026-4e4e-a7d4-be4f199ff3d6\") " pod="openstack/cloudkitty-proc-0" Jan 28 21:04:05 crc kubenswrapper[4746]: I0128 21:04:05.261233 4746 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/e6b50cb8-f8a4-49e7-b464-7e42fc66e499-logs\") pod \"cloudkitty-api-0\" (UID: \"e6b50cb8-f8a4-49e7-b464-7e42fc66e499\") " pod="openstack/cloudkitty-api-0" Jan 28 21:04:05 crc kubenswrapper[4746]: I0128 21:04:05.263615 4746 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/54f66341-e026-4e4e-a7d4-be4f199ff3d6-scripts\") pod \"cloudkitty-proc-0\" (UID: \"54f66341-e026-4e4e-a7d4-be4f199ff3d6\") " pod="openstack/cloudkitty-proc-0" Jan 28 21:04:05 crc kubenswrapper[4746]: I0128 21:04:05.263742 4746 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"certs\" (UniqueName: \"kubernetes.io/projected/e6b50cb8-f8a4-49e7-b464-7e42fc66e499-certs\") pod \"cloudkitty-api-0\" (UID: \"e6b50cb8-f8a4-49e7-b464-7e42fc66e499\") " pod="openstack/cloudkitty-api-0" Jan 28 21:04:05 crc kubenswrapper[4746]: I0128 21:04:05.265167 4746 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/e6b50cb8-f8a4-49e7-b464-7e42fc66e499-scripts\") pod \"cloudkitty-api-0\" (UID: \"e6b50cb8-f8a4-49e7-b464-7e42fc66e499\") " pod="openstack/cloudkitty-api-0" Jan 28 21:04:05 crc kubenswrapper[4746]: I0128 
21:04:05.265426 4746 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e6b50cb8-f8a4-49e7-b464-7e42fc66e499-combined-ca-bundle\") pod \"cloudkitty-api-0\" (UID: \"e6b50cb8-f8a4-49e7-b464-7e42fc66e499\") " pod="openstack/cloudkitty-api-0" Jan 28 21:04:05 crc kubenswrapper[4746]: I0128 21:04:05.265668 4746 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/54f66341-e026-4e4e-a7d4-be4f199ff3d6-config-data\") pod \"cloudkitty-proc-0\" (UID: \"54f66341-e026-4e4e-a7d4-be4f199ff3d6\") " pod="openstack/cloudkitty-proc-0" Jan 28 21:04:05 crc kubenswrapper[4746]: I0128 21:04:05.266030 4746 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/54f66341-e026-4e4e-a7d4-be4f199ff3d6-combined-ca-bundle\") pod \"cloudkitty-proc-0\" (UID: \"54f66341-e026-4e4e-a7d4-be4f199ff3d6\") " pod="openstack/cloudkitty-proc-0" Jan 28 21:04:05 crc kubenswrapper[4746]: I0128 21:04:05.266423 4746 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/e6b50cb8-f8a4-49e7-b464-7e42fc66e499-config-data\") pod \"cloudkitty-api-0\" (UID: \"e6b50cb8-f8a4-49e7-b464-7e42fc66e499\") " pod="openstack/cloudkitty-api-0" Jan 28 21:04:05 crc kubenswrapper[4746]: I0128 21:04:05.266557 4746 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/e6b50cb8-f8a4-49e7-b464-7e42fc66e499-public-tls-certs\") pod \"cloudkitty-api-0\" (UID: \"e6b50cb8-f8a4-49e7-b464-7e42fc66e499\") " pod="openstack/cloudkitty-api-0" Jan 28 21:04:05 crc kubenswrapper[4746]: I0128 21:04:05.267047 4746 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/54f66341-e026-4e4e-a7d4-be4f199ff3d6-config-data-custom\") pod 
\"cloudkitty-proc-0\" (UID: \"54f66341-e026-4e4e-a7d4-be4f199ff3d6\") " pod="openstack/cloudkitty-proc-0" Jan 28 21:04:05 crc kubenswrapper[4746]: I0128 21:04:05.267610 4746 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/e6b50cb8-f8a4-49e7-b464-7e42fc66e499-config-data-custom\") pod \"cloudkitty-api-0\" (UID: \"e6b50cb8-f8a4-49e7-b464-7e42fc66e499\") " pod="openstack/cloudkitty-api-0" Jan 28 21:04:05 crc kubenswrapper[4746]: I0128 21:04:05.268822 4746 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"certs\" (UniqueName: \"kubernetes.io/projected/54f66341-e026-4e4e-a7d4-be4f199ff3d6-certs\") pod \"cloudkitty-proc-0\" (UID: \"54f66341-e026-4e4e-a7d4-be4f199ff3d6\") " pod="openstack/cloudkitty-proc-0" Jan 28 21:04:05 crc kubenswrapper[4746]: I0128 21:04:05.273247 4746 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/e6b50cb8-f8a4-49e7-b464-7e42fc66e499-internal-tls-certs\") pod \"cloudkitty-api-0\" (UID: \"e6b50cb8-f8a4-49e7-b464-7e42fc66e499\") " pod="openstack/cloudkitty-api-0" Jan 28 21:04:05 crc kubenswrapper[4746]: I0128 21:04:05.273531 4746 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-z7lr7\" (UniqueName: \"kubernetes.io/projected/e6b50cb8-f8a4-49e7-b464-7e42fc66e499-kube-api-access-z7lr7\") pod \"cloudkitty-api-0\" (UID: \"e6b50cb8-f8a4-49e7-b464-7e42fc66e499\") " pod="openstack/cloudkitty-api-0" Jan 28 21:04:05 crc kubenswrapper[4746]: I0128 21:04:05.277390 4746 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-wxgqj\" (UniqueName: \"kubernetes.io/projected/54f66341-e026-4e4e-a7d4-be4f199ff3d6-kube-api-access-wxgqj\") pod \"cloudkitty-proc-0\" (UID: \"54f66341-e026-4e4e-a7d4-be4f199ff3d6\") " pod="openstack/cloudkitty-proc-0" Jan 28 21:04:05 crc kubenswrapper[4746]: I0128 21:04:05.381661 4746 util.go:30] "No 
sandbox for pod can be found. Need to start a new one" pod="openstack/cloudkitty-proc-0" Jan 28 21:04:05 crc kubenswrapper[4746]: I0128 21:04:05.399437 4746 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/cloudkitty-api-0" Jan 28 21:04:05 crc kubenswrapper[4746]: I0128 21:04:05.905236 4746 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cloudkitty-proc-0"] Jan 28 21:04:05 crc kubenswrapper[4746]: I0128 21:04:05.926736 4746 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cloudkitty-proc-0" event={"ID":"54f66341-e026-4e4e-a7d4-be4f199ff3d6","Type":"ContainerStarted","Data":"82ce401d71c426eee89e3bcc3c4a3831ff6c6e3bdf8bf822b6154689cd4c0945"} Jan 28 21:04:06 crc kubenswrapper[4746]: W0128 21:04:06.018761 4746 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pode6b50cb8_f8a4_49e7_b464_7e42fc66e499.slice/crio-90ca1bcc9ac89fdab1793f09d2bf0f3182c30e7b788407f722166d15092e30a1 WatchSource:0}: Error finding container 90ca1bcc9ac89fdab1793f09d2bf0f3182c30e7b788407f722166d15092e30a1: Status 404 returned error can't find the container with id 90ca1bcc9ac89fdab1793f09d2bf0f3182c30e7b788407f722166d15092e30a1 Jan 28 21:04:06 crc kubenswrapper[4746]: I0128 21:04:06.023984 4746 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cloudkitty-api-0"] Jan 28 21:04:06 crc kubenswrapper[4746]: I0128 21:04:06.864232 4746 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="6e308833-0f26-4f16-9f4d-cd9a6e583ee9" path="/var/lib/kubelet/pods/6e308833-0f26-4f16-9f4d-cd9a6e583ee9/volumes" Jan 28 21:04:06 crc kubenswrapper[4746]: I0128 21:04:06.868581 4746 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="8eab90e7-58c5-4bdf-bca6-12c78bdabea9" path="/var/lib/kubelet/pods/8eab90e7-58c5-4bdf-bca6-12c78bdabea9/volumes" Jan 28 21:04:06 crc kubenswrapper[4746]: I0128 21:04:06.940525 4746 kubelet.go:2453] 
"SyncLoop (PLEG): event for pod" pod="openstack/cloudkitty-api-0" event={"ID":"e6b50cb8-f8a4-49e7-b464-7e42fc66e499","Type":"ContainerStarted","Data":"6d99b6ea37a75f4e27e5565b6b01bebd894b2e56078490d8eb93600686766697"} Jan 28 21:04:06 crc kubenswrapper[4746]: I0128 21:04:06.940580 4746 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cloudkitty-api-0" event={"ID":"e6b50cb8-f8a4-49e7-b464-7e42fc66e499","Type":"ContainerStarted","Data":"90ca1bcc9ac89fdab1793f09d2bf0f3182c30e7b788407f722166d15092e30a1"} Jan 28 21:04:07 crc kubenswrapper[4746]: I0128 21:04:07.271307 4746 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/dnsmasq-dns-dbb88bf8c-c8pfb" Jan 28 21:04:07 crc kubenswrapper[4746]: I0128 21:04:07.351048 4746 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-5fd9b586ff-zstkj"] Jan 28 21:04:07 crc kubenswrapper[4746]: I0128 21:04:07.351353 4746 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/dnsmasq-dns-5fd9b586ff-zstkj" podUID="8dc0a819-0fb7-4d64-a2f3-4e762be61026" containerName="dnsmasq-dns" containerID="cri-o://bdd5e5ce30341333f996d8fd3b7934f5a86c197b203f920715526262a91ee325" gracePeriod=10 Jan 28 21:04:07 crc kubenswrapper[4746]: I0128 21:04:07.486324 4746 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-85f64749dc-pndxq"] Jan 28 21:04:07 crc kubenswrapper[4746]: I0128 21:04:07.488501 4746 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-85f64749dc-pndxq" Jan 28 21:04:07 crc kubenswrapper[4746]: I0128 21:04:07.556606 4746 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-85f64749dc-pndxq"] Jan 28 21:04:07 crc kubenswrapper[4746]: I0128 21:04:07.633707 4746 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/7f070414-7083-40c4-b7aa-db248c3fd681-config\") pod \"dnsmasq-dns-85f64749dc-pndxq\" (UID: \"7f070414-7083-40c4-b7aa-db248c3fd681\") " pod="openstack/dnsmasq-dns-85f64749dc-pndxq" Jan 28 21:04:07 crc kubenswrapper[4746]: I0128 21:04:07.633752 4746 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/7f070414-7083-40c4-b7aa-db248c3fd681-ovsdbserver-sb\") pod \"dnsmasq-dns-85f64749dc-pndxq\" (UID: \"7f070414-7083-40c4-b7aa-db248c3fd681\") " pod="openstack/dnsmasq-dns-85f64749dc-pndxq" Jan 28 21:04:07 crc kubenswrapper[4746]: I0128 21:04:07.633806 4746 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/7f070414-7083-40c4-b7aa-db248c3fd681-dns-swift-storage-0\") pod \"dnsmasq-dns-85f64749dc-pndxq\" (UID: \"7f070414-7083-40c4-b7aa-db248c3fd681\") " pod="openstack/dnsmasq-dns-85f64749dc-pndxq" Jan 28 21:04:07 crc kubenswrapper[4746]: I0128 21:04:07.633839 4746 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/configmap/7f070414-7083-40c4-b7aa-db248c3fd681-openstack-edpm-ipam\") pod \"dnsmasq-dns-85f64749dc-pndxq\" (UID: \"7f070414-7083-40c4-b7aa-db248c3fd681\") " pod="openstack/dnsmasq-dns-85f64749dc-pndxq" Jan 28 21:04:07 crc kubenswrapper[4746]: I0128 21:04:07.633863 4746 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-9f2tc\" (UniqueName: \"kubernetes.io/projected/7f070414-7083-40c4-b7aa-db248c3fd681-kube-api-access-9f2tc\") pod \"dnsmasq-dns-85f64749dc-pndxq\" (UID: \"7f070414-7083-40c4-b7aa-db248c3fd681\") " pod="openstack/dnsmasq-dns-85f64749dc-pndxq" Jan 28 21:04:07 crc kubenswrapper[4746]: I0128 21:04:07.633887 4746 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/7f070414-7083-40c4-b7aa-db248c3fd681-dns-svc\") pod \"dnsmasq-dns-85f64749dc-pndxq\" (UID: \"7f070414-7083-40c4-b7aa-db248c3fd681\") " pod="openstack/dnsmasq-dns-85f64749dc-pndxq" Jan 28 21:04:07 crc kubenswrapper[4746]: I0128 21:04:07.633914 4746 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/7f070414-7083-40c4-b7aa-db248c3fd681-ovsdbserver-nb\") pod \"dnsmasq-dns-85f64749dc-pndxq\" (UID: \"7f070414-7083-40c4-b7aa-db248c3fd681\") " pod="openstack/dnsmasq-dns-85f64749dc-pndxq" Jan 28 21:04:07 crc kubenswrapper[4746]: I0128 21:04:07.735879 4746 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/7f070414-7083-40c4-b7aa-db248c3fd681-config\") pod \"dnsmasq-dns-85f64749dc-pndxq\" (UID: \"7f070414-7083-40c4-b7aa-db248c3fd681\") " pod="openstack/dnsmasq-dns-85f64749dc-pndxq" Jan 28 21:04:07 crc kubenswrapper[4746]: I0128 21:04:07.735927 4746 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/7f070414-7083-40c4-b7aa-db248c3fd681-ovsdbserver-sb\") pod \"dnsmasq-dns-85f64749dc-pndxq\" (UID: \"7f070414-7083-40c4-b7aa-db248c3fd681\") " pod="openstack/dnsmasq-dns-85f64749dc-pndxq" Jan 28 21:04:07 crc kubenswrapper[4746]: I0128 21:04:07.735980 4746 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/7f070414-7083-40c4-b7aa-db248c3fd681-dns-swift-storage-0\") pod \"dnsmasq-dns-85f64749dc-pndxq\" (UID: \"7f070414-7083-40c4-b7aa-db248c3fd681\") " pod="openstack/dnsmasq-dns-85f64749dc-pndxq" Jan 28 21:04:07 crc kubenswrapper[4746]: I0128 21:04:07.736013 4746 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/configmap/7f070414-7083-40c4-b7aa-db248c3fd681-openstack-edpm-ipam\") pod \"dnsmasq-dns-85f64749dc-pndxq\" (UID: \"7f070414-7083-40c4-b7aa-db248c3fd681\") " pod="openstack/dnsmasq-dns-85f64749dc-pndxq" Jan 28 21:04:07 crc kubenswrapper[4746]: I0128 21:04:07.736034 4746 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-9f2tc\" (UniqueName: \"kubernetes.io/projected/7f070414-7083-40c4-b7aa-db248c3fd681-kube-api-access-9f2tc\") pod \"dnsmasq-dns-85f64749dc-pndxq\" (UID: \"7f070414-7083-40c4-b7aa-db248c3fd681\") " pod="openstack/dnsmasq-dns-85f64749dc-pndxq" Jan 28 21:04:07 crc kubenswrapper[4746]: I0128 21:04:07.736050 4746 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/7f070414-7083-40c4-b7aa-db248c3fd681-dns-svc\") pod \"dnsmasq-dns-85f64749dc-pndxq\" (UID: \"7f070414-7083-40c4-b7aa-db248c3fd681\") " pod="openstack/dnsmasq-dns-85f64749dc-pndxq" Jan 28 21:04:07 crc kubenswrapper[4746]: I0128 21:04:07.736066 4746 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/7f070414-7083-40c4-b7aa-db248c3fd681-ovsdbserver-nb\") pod \"dnsmasq-dns-85f64749dc-pndxq\" (UID: \"7f070414-7083-40c4-b7aa-db248c3fd681\") " pod="openstack/dnsmasq-dns-85f64749dc-pndxq" Jan 28 21:04:07 crc kubenswrapper[4746]: I0128 21:04:07.737171 4746 operation_generator.go:637] "MountVolume.SetUp succeeded for 
volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/7f070414-7083-40c4-b7aa-db248c3fd681-ovsdbserver-nb\") pod \"dnsmasq-dns-85f64749dc-pndxq\" (UID: \"7f070414-7083-40c4-b7aa-db248c3fd681\") " pod="openstack/dnsmasq-dns-85f64749dc-pndxq" Jan 28 21:04:07 crc kubenswrapper[4746]: I0128 21:04:07.737212 4746 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/7f070414-7083-40c4-b7aa-db248c3fd681-config\") pod \"dnsmasq-dns-85f64749dc-pndxq\" (UID: \"7f070414-7083-40c4-b7aa-db248c3fd681\") " pod="openstack/dnsmasq-dns-85f64749dc-pndxq" Jan 28 21:04:07 crc kubenswrapper[4746]: I0128 21:04:07.737422 4746 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/7f070414-7083-40c4-b7aa-db248c3fd681-dns-swift-storage-0\") pod \"dnsmasq-dns-85f64749dc-pndxq\" (UID: \"7f070414-7083-40c4-b7aa-db248c3fd681\") " pod="openstack/dnsmasq-dns-85f64749dc-pndxq" Jan 28 21:04:07 crc kubenswrapper[4746]: I0128 21:04:07.737726 4746 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/configmap/7f070414-7083-40c4-b7aa-db248c3fd681-openstack-edpm-ipam\") pod \"dnsmasq-dns-85f64749dc-pndxq\" (UID: \"7f070414-7083-40c4-b7aa-db248c3fd681\") " pod="openstack/dnsmasq-dns-85f64749dc-pndxq" Jan 28 21:04:07 crc kubenswrapper[4746]: I0128 21:04:07.738006 4746 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/7f070414-7083-40c4-b7aa-db248c3fd681-dns-svc\") pod \"dnsmasq-dns-85f64749dc-pndxq\" (UID: \"7f070414-7083-40c4-b7aa-db248c3fd681\") " pod="openstack/dnsmasq-dns-85f64749dc-pndxq" Jan 28 21:04:07 crc kubenswrapper[4746]: I0128 21:04:07.738226 4746 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-sb\" (UniqueName: 
\"kubernetes.io/configmap/7f070414-7083-40c4-b7aa-db248c3fd681-ovsdbserver-sb\") pod \"dnsmasq-dns-85f64749dc-pndxq\" (UID: \"7f070414-7083-40c4-b7aa-db248c3fd681\") " pod="openstack/dnsmasq-dns-85f64749dc-pndxq" Jan 28 21:04:07 crc kubenswrapper[4746]: I0128 21:04:07.770071 4746 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-9f2tc\" (UniqueName: \"kubernetes.io/projected/7f070414-7083-40c4-b7aa-db248c3fd681-kube-api-access-9f2tc\") pod \"dnsmasq-dns-85f64749dc-pndxq\" (UID: \"7f070414-7083-40c4-b7aa-db248c3fd681\") " pod="openstack/dnsmasq-dns-85f64749dc-pndxq" Jan 28 21:04:07 crc kubenswrapper[4746]: I0128 21:04:07.885647 4746 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-85f64749dc-pndxq" Jan 28 21:04:07 crc kubenswrapper[4746]: I0128 21:04:07.886325 4746 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-5fd9b586ff-zstkj" Jan 28 21:04:07 crc kubenswrapper[4746]: I0128 21:04:07.939616 4746 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/8dc0a819-0fb7-4d64-a2f3-4e762be61026-dns-swift-storage-0\") pod \"8dc0a819-0fb7-4d64-a2f3-4e762be61026\" (UID: \"8dc0a819-0fb7-4d64-a2f3-4e762be61026\") " Jan 28 21:04:07 crc kubenswrapper[4746]: I0128 21:04:07.939726 4746 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/8dc0a819-0fb7-4d64-a2f3-4e762be61026-config\") pod \"8dc0a819-0fb7-4d64-a2f3-4e762be61026\" (UID: \"8dc0a819-0fb7-4d64-a2f3-4e762be61026\") " Jan 28 21:04:07 crc kubenswrapper[4746]: I0128 21:04:07.939888 4746 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/8dc0a819-0fb7-4d64-a2f3-4e762be61026-ovsdbserver-nb\") pod \"8dc0a819-0fb7-4d64-a2f3-4e762be61026\" (UID: 
\"8dc0a819-0fb7-4d64-a2f3-4e762be61026\") " Jan 28 21:04:07 crc kubenswrapper[4746]: I0128 21:04:07.939926 4746 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-b54d4\" (UniqueName: \"kubernetes.io/projected/8dc0a819-0fb7-4d64-a2f3-4e762be61026-kube-api-access-b54d4\") pod \"8dc0a819-0fb7-4d64-a2f3-4e762be61026\" (UID: \"8dc0a819-0fb7-4d64-a2f3-4e762be61026\") " Jan 28 21:04:07 crc kubenswrapper[4746]: I0128 21:04:07.939970 4746 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/8dc0a819-0fb7-4d64-a2f3-4e762be61026-dns-svc\") pod \"8dc0a819-0fb7-4d64-a2f3-4e762be61026\" (UID: \"8dc0a819-0fb7-4d64-a2f3-4e762be61026\") " Jan 28 21:04:07 crc kubenswrapper[4746]: I0128 21:04:07.940009 4746 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/8dc0a819-0fb7-4d64-a2f3-4e762be61026-ovsdbserver-sb\") pod \"8dc0a819-0fb7-4d64-a2f3-4e762be61026\" (UID: \"8dc0a819-0fb7-4d64-a2f3-4e762be61026\") " Jan 28 21:04:07 crc kubenswrapper[4746]: I0128 21:04:07.993890 4746 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/8dc0a819-0fb7-4d64-a2f3-4e762be61026-kube-api-access-b54d4" (OuterVolumeSpecName: "kube-api-access-b54d4") pod "8dc0a819-0fb7-4d64-a2f3-4e762be61026" (UID: "8dc0a819-0fb7-4d64-a2f3-4e762be61026"). InnerVolumeSpecName "kube-api-access-b54d4". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 28 21:04:08 crc kubenswrapper[4746]: I0128 21:04:08.042737 4746 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-b54d4\" (UniqueName: \"kubernetes.io/projected/8dc0a819-0fb7-4d64-a2f3-4e762be61026-kube-api-access-b54d4\") on node \"crc\" DevicePath \"\"" Jan 28 21:04:08 crc kubenswrapper[4746]: I0128 21:04:08.067473 4746 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cloudkitty-api-0" event={"ID":"e6b50cb8-f8a4-49e7-b464-7e42fc66e499","Type":"ContainerStarted","Data":"35122e75ea46e2ea3b6530a0397270dacb6c59b85cad33101caca92f1ec7e96f"} Jan 28 21:04:08 crc kubenswrapper[4746]: I0128 21:04:08.080381 4746 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/cloudkitty-api-0" Jan 28 21:04:08 crc kubenswrapper[4746]: I0128 21:04:08.104689 4746 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/cloudkitty-api-0" podStartSLOduration=4.104672192 podStartE2EDuration="4.104672192s" podCreationTimestamp="2026-01-28 21:04:04 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-28 21:04:08.104488366 +0000 UTC m=+1476.060674710" watchObservedRunningTime="2026-01-28 21:04:08.104672192 +0000 UTC m=+1476.060858546" Jan 28 21:04:08 crc kubenswrapper[4746]: I0128 21:04:08.105196 4746 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cloudkitty-proc-0" event={"ID":"54f66341-e026-4e4e-a7d4-be4f199ff3d6","Type":"ContainerStarted","Data":"7862f5e5b80526c901907a5db4ce7a101727f5009bd6d6285e8d4341b08f3598"} Jan 28 21:04:08 crc kubenswrapper[4746]: I0128 21:04:08.116585 4746 generic.go:334] "Generic (PLEG): container finished" podID="8dc0a819-0fb7-4d64-a2f3-4e762be61026" containerID="bdd5e5ce30341333f996d8fd3b7934f5a86c197b203f920715526262a91ee325" exitCode=0 Jan 28 21:04:08 crc kubenswrapper[4746]: I0128 21:04:08.116629 4746 
kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-5fd9b586ff-zstkj" event={"ID":"8dc0a819-0fb7-4d64-a2f3-4e762be61026","Type":"ContainerDied","Data":"bdd5e5ce30341333f996d8fd3b7934f5a86c197b203f920715526262a91ee325"} Jan 28 21:04:08 crc kubenswrapper[4746]: I0128 21:04:08.116653 4746 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-5fd9b586ff-zstkj" event={"ID":"8dc0a819-0fb7-4d64-a2f3-4e762be61026","Type":"ContainerDied","Data":"3100f2210755488d4c4fa735b19dd7709134fbc5c927b02e3e0e00c3a899650e"} Jan 28 21:04:08 crc kubenswrapper[4746]: I0128 21:04:08.116669 4746 scope.go:117] "RemoveContainer" containerID="bdd5e5ce30341333f996d8fd3b7934f5a86c197b203f920715526262a91ee325" Jan 28 21:04:08 crc kubenswrapper[4746]: I0128 21:04:08.116783 4746 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-5fd9b586ff-zstkj" Jan 28 21:04:08 crc kubenswrapper[4746]: I0128 21:04:08.150647 4746 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/8dc0a819-0fb7-4d64-a2f3-4e762be61026-ovsdbserver-sb" (OuterVolumeSpecName: "ovsdbserver-sb") pod "8dc0a819-0fb7-4d64-a2f3-4e762be61026" (UID: "8dc0a819-0fb7-4d64-a2f3-4e762be61026"). InnerVolumeSpecName "ovsdbserver-sb". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 28 21:04:08 crc kubenswrapper[4746]: I0128 21:04:08.154215 4746 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/cloudkitty-proc-0" podStartSLOduration=2.621180339 podStartE2EDuration="4.154195331s" podCreationTimestamp="2026-01-28 21:04:04 +0000 UTC" firstStartedPulling="2026-01-28 21:04:05.901720441 +0000 UTC m=+1473.857906805" lastFinishedPulling="2026-01-28 21:04:07.434735433 +0000 UTC m=+1475.390921797" observedRunningTime="2026-01-28 21:04:08.153962755 +0000 UTC m=+1476.110149109" watchObservedRunningTime="2026-01-28 21:04:08.154195331 +0000 UTC m=+1476.110381685" Jan 28 21:04:08 crc kubenswrapper[4746]: I0128 21:04:08.179739 4746 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/8dc0a819-0fb7-4d64-a2f3-4e762be61026-ovsdbserver-nb" (OuterVolumeSpecName: "ovsdbserver-nb") pod "8dc0a819-0fb7-4d64-a2f3-4e762be61026" (UID: "8dc0a819-0fb7-4d64-a2f3-4e762be61026"). InnerVolumeSpecName "ovsdbserver-nb". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 28 21:04:08 crc kubenswrapper[4746]: I0128 21:04:08.187321 4746 scope.go:117] "RemoveContainer" containerID="99f3f9bede6349668311a33fbe8d25243120e4d3f65c682327304b86efe81700" Jan 28 21:04:08 crc kubenswrapper[4746]: I0128 21:04:08.190706 4746 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/8dc0a819-0fb7-4d64-a2f3-4e762be61026-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "8dc0a819-0fb7-4d64-a2f3-4e762be61026" (UID: "8dc0a819-0fb7-4d64-a2f3-4e762be61026"). InnerVolumeSpecName "dns-svc". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 28 21:04:08 crc kubenswrapper[4746]: I0128 21:04:08.217451 4746 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/8dc0a819-0fb7-4d64-a2f3-4e762be61026-config" (OuterVolumeSpecName: "config") pod "8dc0a819-0fb7-4d64-a2f3-4e762be61026" (UID: "8dc0a819-0fb7-4d64-a2f3-4e762be61026"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 28 21:04:08 crc kubenswrapper[4746]: I0128 21:04:08.226425 4746 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/8dc0a819-0fb7-4d64-a2f3-4e762be61026-dns-swift-storage-0" (OuterVolumeSpecName: "dns-swift-storage-0") pod "8dc0a819-0fb7-4d64-a2f3-4e762be61026" (UID: "8dc0a819-0fb7-4d64-a2f3-4e762be61026"). InnerVolumeSpecName "dns-swift-storage-0". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 28 21:04:08 crc kubenswrapper[4746]: I0128 21:04:08.242448 4746 scope.go:117] "RemoveContainer" containerID="bdd5e5ce30341333f996d8fd3b7934f5a86c197b203f920715526262a91ee325" Jan 28 21:04:08 crc kubenswrapper[4746]: E0128 21:04:08.243055 4746 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"bdd5e5ce30341333f996d8fd3b7934f5a86c197b203f920715526262a91ee325\": container with ID starting with bdd5e5ce30341333f996d8fd3b7934f5a86c197b203f920715526262a91ee325 not found: ID does not exist" containerID="bdd5e5ce30341333f996d8fd3b7934f5a86c197b203f920715526262a91ee325" Jan 28 21:04:08 crc kubenswrapper[4746]: I0128 21:04:08.243135 4746 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"bdd5e5ce30341333f996d8fd3b7934f5a86c197b203f920715526262a91ee325"} err="failed to get container status \"bdd5e5ce30341333f996d8fd3b7934f5a86c197b203f920715526262a91ee325\": rpc error: code = NotFound desc = could not find container 
\"bdd5e5ce30341333f996d8fd3b7934f5a86c197b203f920715526262a91ee325\": container with ID starting with bdd5e5ce30341333f996d8fd3b7934f5a86c197b203f920715526262a91ee325 not found: ID does not exist" Jan 28 21:04:08 crc kubenswrapper[4746]: I0128 21:04:08.243172 4746 scope.go:117] "RemoveContainer" containerID="99f3f9bede6349668311a33fbe8d25243120e4d3f65c682327304b86efe81700" Jan 28 21:04:08 crc kubenswrapper[4746]: I0128 21:04:08.246365 4746 reconciler_common.go:293] "Volume detached for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/8dc0a819-0fb7-4d64-a2f3-4e762be61026-dns-swift-storage-0\") on node \"crc\" DevicePath \"\"" Jan 28 21:04:08 crc kubenswrapper[4746]: I0128 21:04:08.246393 4746 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/8dc0a819-0fb7-4d64-a2f3-4e762be61026-config\") on node \"crc\" DevicePath \"\"" Jan 28 21:04:08 crc kubenswrapper[4746]: I0128 21:04:08.246402 4746 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/8dc0a819-0fb7-4d64-a2f3-4e762be61026-ovsdbserver-nb\") on node \"crc\" DevicePath \"\"" Jan 28 21:04:08 crc kubenswrapper[4746]: I0128 21:04:08.246411 4746 reconciler_common.go:293] "Volume detached for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/8dc0a819-0fb7-4d64-a2f3-4e762be61026-dns-svc\") on node \"crc\" DevicePath \"\"" Jan 28 21:04:08 crc kubenswrapper[4746]: I0128 21:04:08.246422 4746 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/8dc0a819-0fb7-4d64-a2f3-4e762be61026-ovsdbserver-sb\") on node \"crc\" DevicePath \"\"" Jan 28 21:04:08 crc kubenswrapper[4746]: E0128 21:04:08.248887 4746 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"99f3f9bede6349668311a33fbe8d25243120e4d3f65c682327304b86efe81700\": container with ID starting with 
99f3f9bede6349668311a33fbe8d25243120e4d3f65c682327304b86efe81700 not found: ID does not exist" containerID="99f3f9bede6349668311a33fbe8d25243120e4d3f65c682327304b86efe81700" Jan 28 21:04:08 crc kubenswrapper[4746]: I0128 21:04:08.248925 4746 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"99f3f9bede6349668311a33fbe8d25243120e4d3f65c682327304b86efe81700"} err="failed to get container status \"99f3f9bede6349668311a33fbe8d25243120e4d3f65c682327304b86efe81700\": rpc error: code = NotFound desc = could not find container \"99f3f9bede6349668311a33fbe8d25243120e4d3f65c682327304b86efe81700\": container with ID starting with 99f3f9bede6349668311a33fbe8d25243120e4d3f65c682327304b86efe81700 not found: ID does not exist" Jan 28 21:04:08 crc kubenswrapper[4746]: I0128 21:04:08.449961 4746 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-5fd9b586ff-zstkj"] Jan 28 21:04:08 crc kubenswrapper[4746]: I0128 21:04:08.459542 4746 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-5fd9b586ff-zstkj"] Jan 28 21:04:08 crc kubenswrapper[4746]: W0128 21:04:08.536666 4746 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod7f070414_7083_40c4_b7aa_db248c3fd681.slice/crio-fbb715681442d1fad3d9edb8ae78bdaf3c35afdc98af1a7c002224e511140898 WatchSource:0}: Error finding container fbb715681442d1fad3d9edb8ae78bdaf3c35afdc98af1a7c002224e511140898: Status 404 returned error can't find the container with id fbb715681442d1fad3d9edb8ae78bdaf3c35afdc98af1a7c002224e511140898 Jan 28 21:04:08 crc kubenswrapper[4746]: I0128 21:04:08.538426 4746 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-85f64749dc-pndxq"] Jan 28 21:04:08 crc kubenswrapper[4746]: I0128 21:04:08.852097 4746 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="8dc0a819-0fb7-4d64-a2f3-4e762be61026" 
path="/var/lib/kubelet/pods/8dc0a819-0fb7-4d64-a2f3-4e762be61026/volumes" Jan 28 21:04:09 crc kubenswrapper[4746]: I0128 21:04:09.220251 4746 generic.go:334] "Generic (PLEG): container finished" podID="7f070414-7083-40c4-b7aa-db248c3fd681" containerID="33a788c6ff31a05b339ef1778af7735d380aaee6c5c44a4f4121c726eb214401" exitCode=0 Jan 28 21:04:09 crc kubenswrapper[4746]: I0128 21:04:09.221252 4746 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-85f64749dc-pndxq" event={"ID":"7f070414-7083-40c4-b7aa-db248c3fd681","Type":"ContainerDied","Data":"33a788c6ff31a05b339ef1778af7735d380aaee6c5c44a4f4121c726eb214401"} Jan 28 21:04:09 crc kubenswrapper[4746]: I0128 21:04:09.221326 4746 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-85f64749dc-pndxq" event={"ID":"7f070414-7083-40c4-b7aa-db248c3fd681","Type":"ContainerStarted","Data":"fbb715681442d1fad3d9edb8ae78bdaf3c35afdc98af1a7c002224e511140898"} Jan 28 21:04:10 crc kubenswrapper[4746]: I0128 21:04:10.230456 4746 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-85f64749dc-pndxq" event={"ID":"7f070414-7083-40c4-b7aa-db248c3fd681","Type":"ContainerStarted","Data":"821e744fbbc8157e8678f2f5e05f2bd3cc17b430136b64950d704f9b37f5a9d7"} Jan 28 21:04:10 crc kubenswrapper[4746]: I0128 21:04:10.250284 4746 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/dnsmasq-dns-85f64749dc-pndxq" podStartSLOduration=3.250263919 podStartE2EDuration="3.250263919s" podCreationTimestamp="2026-01-28 21:04:07 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-28 21:04:10.248802389 +0000 UTC m=+1478.204988743" watchObservedRunningTime="2026-01-28 21:04:10.250263919 +0000 UTC m=+1478.206450283" Jan 28 21:04:11 crc kubenswrapper[4746]: I0128 21:04:11.240532 4746 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" 
pod="openstack/dnsmasq-dns-85f64749dc-pndxq"
Jan 28 21:04:15 crc kubenswrapper[4746]: I0128 21:04:15.871608 4746 patch_prober.go:28] interesting pod/machine-config-daemon-6wrnw container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body=
Jan 28 21:04:15 crc kubenswrapper[4746]: I0128 21:04:15.872184 4746 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-6wrnw" podUID="6dc8b546-9734-4082-b2b3-2bafe3f1564d" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused"
Jan 28 21:04:15 crc kubenswrapper[4746]: I0128 21:04:15.872226 4746 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-6wrnw"
Jan 28 21:04:15 crc kubenswrapper[4746]: I0128 21:04:15.873061 4746 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"551b5dbcacfba813c1158522c098223ffafd54f7aa789c2d4402da75877d8079"} pod="openshift-machine-config-operator/machine-config-daemon-6wrnw" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted"
Jan 28 21:04:15 crc kubenswrapper[4746]: I0128 21:04:15.873146 4746 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-6wrnw" podUID="6dc8b546-9734-4082-b2b3-2bafe3f1564d" containerName="machine-config-daemon" containerID="cri-o://551b5dbcacfba813c1158522c098223ffafd54f7aa789c2d4402da75877d8079" gracePeriod=600
Jan 28 21:04:16 crc kubenswrapper[4746]: I0128 21:04:16.309112 4746 generic.go:334] "Generic (PLEG): container finished" podID="6dc8b546-9734-4082-b2b3-2bafe3f1564d" containerID="551b5dbcacfba813c1158522c098223ffafd54f7aa789c2d4402da75877d8079" exitCode=0
Jan 28 21:04:16 crc kubenswrapper[4746]: I0128 21:04:16.309169 4746 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-6wrnw" event={"ID":"6dc8b546-9734-4082-b2b3-2bafe3f1564d","Type":"ContainerDied","Data":"551b5dbcacfba813c1158522c098223ffafd54f7aa789c2d4402da75877d8079"}
Jan 28 21:04:16 crc kubenswrapper[4746]: I0128 21:04:16.309983 4746 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-6wrnw" event={"ID":"6dc8b546-9734-4082-b2b3-2bafe3f1564d","Type":"ContainerStarted","Data":"4ff7d295e236267b904a42699c2e8affd72dcade2a9eeee3b96c8eb13de3f968"}
Jan 28 21:04:16 crc kubenswrapper[4746]: I0128 21:04:16.310061 4746 scope.go:117] "RemoveContainer" containerID="6862c0afb8f6ee7e41759258bd8f935df2c29be354b170c8fd2a76edbba23242"
Jan 28 21:04:16 crc kubenswrapper[4746]: I0128 21:04:16.967759 4746 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/ceilometer-0"
Jan 28 21:04:17 crc kubenswrapper[4746]: I0128 21:04:17.887485 4746 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/dnsmasq-dns-85f64749dc-pndxq"
Jan 28 21:04:17 crc kubenswrapper[4746]: I0128 21:04:17.978225 4746 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-dbb88bf8c-c8pfb"]
Jan 28 21:04:17 crc kubenswrapper[4746]: I0128 21:04:17.978513 4746 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/dnsmasq-dns-dbb88bf8c-c8pfb" podUID="368e0053-e824-487b-995b-0805d6bbc718" containerName="dnsmasq-dns" containerID="cri-o://a74e8853e3cb3f6ae9f3fd62e7fecd463506bc19f908f0501c233215f24f6c9f" gracePeriod=10
Jan 28 21:04:18 crc kubenswrapper[4746]: I0128 21:04:18.338684 4746 generic.go:334] "Generic (PLEG): container finished" podID="368e0053-e824-487b-995b-0805d6bbc718" containerID="a74e8853e3cb3f6ae9f3fd62e7fecd463506bc19f908f0501c233215f24f6c9f" exitCode=0
Jan 28 21:04:18 crc kubenswrapper[4746]: I0128 21:04:18.339186 4746 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-dbb88bf8c-c8pfb" event={"ID":"368e0053-e824-487b-995b-0805d6bbc718","Type":"ContainerDied","Data":"a74e8853e3cb3f6ae9f3fd62e7fecd463506bc19f908f0501c233215f24f6c9f"}
Jan 28 21:04:18 crc kubenswrapper[4746]: I0128 21:04:18.574109 4746 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-dbb88bf8c-c8pfb"
Jan 28 21:04:18 crc kubenswrapper[4746]: I0128 21:04:18.756332 4746 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/configmap/368e0053-e824-487b-995b-0805d6bbc718-openstack-edpm-ipam\") pod \"368e0053-e824-487b-995b-0805d6bbc718\" (UID: \"368e0053-e824-487b-995b-0805d6bbc718\") "
Jan 28 21:04:18 crc kubenswrapper[4746]: I0128 21:04:18.756646 4746 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/368e0053-e824-487b-995b-0805d6bbc718-ovsdbserver-nb\") pod \"368e0053-e824-487b-995b-0805d6bbc718\" (UID: \"368e0053-e824-487b-995b-0805d6bbc718\") "
Jan 28 21:04:18 crc kubenswrapper[4746]: I0128 21:04:18.756752 4746 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/368e0053-e824-487b-995b-0805d6bbc718-dns-swift-storage-0\") pod \"368e0053-e824-487b-995b-0805d6bbc718\" (UID: \"368e0053-e824-487b-995b-0805d6bbc718\") "
Jan 28 21:04:18 crc kubenswrapper[4746]: I0128 21:04:18.756778 4746 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/368e0053-e824-487b-995b-0805d6bbc718-ovsdbserver-sb\") pod \"368e0053-e824-487b-995b-0805d6bbc718\" (UID: \"368e0053-e824-487b-995b-0805d6bbc718\") "
Jan 28 21:04:18 crc kubenswrapper[4746]: I0128 21:04:18.756872 4746 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/368e0053-e824-487b-995b-0805d6bbc718-config\") pod \"368e0053-e824-487b-995b-0805d6bbc718\" (UID: \"368e0053-e824-487b-995b-0805d6bbc718\") "
Jan 28 21:04:18 crc kubenswrapper[4746]: I0128 21:04:18.756908 4746 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/368e0053-e824-487b-995b-0805d6bbc718-dns-svc\") pod \"368e0053-e824-487b-995b-0805d6bbc718\" (UID: \"368e0053-e824-487b-995b-0805d6bbc718\") "
Jan 28 21:04:18 crc kubenswrapper[4746]: I0128 21:04:18.756984 4746 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-xv9bq\" (UniqueName: \"kubernetes.io/projected/368e0053-e824-487b-995b-0805d6bbc718-kube-api-access-xv9bq\") pod \"368e0053-e824-487b-995b-0805d6bbc718\" (UID: \"368e0053-e824-487b-995b-0805d6bbc718\") "
Jan 28 21:04:18 crc kubenswrapper[4746]: I0128 21:04:18.771369 4746 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/368e0053-e824-487b-995b-0805d6bbc718-kube-api-access-xv9bq" (OuterVolumeSpecName: "kube-api-access-xv9bq") pod "368e0053-e824-487b-995b-0805d6bbc718" (UID: "368e0053-e824-487b-995b-0805d6bbc718"). InnerVolumeSpecName "kube-api-access-xv9bq". PluginName "kubernetes.io/projected", VolumeGidValue ""
Jan 28 21:04:18 crc kubenswrapper[4746]: I0128 21:04:18.811388 4746 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/368e0053-e824-487b-995b-0805d6bbc718-openstack-edpm-ipam" (OuterVolumeSpecName: "openstack-edpm-ipam") pod "368e0053-e824-487b-995b-0805d6bbc718" (UID: "368e0053-e824-487b-995b-0805d6bbc718"). InnerVolumeSpecName "openstack-edpm-ipam". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Jan 28 21:04:18 crc kubenswrapper[4746]: I0128 21:04:18.822255 4746 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/368e0053-e824-487b-995b-0805d6bbc718-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "368e0053-e824-487b-995b-0805d6bbc718" (UID: "368e0053-e824-487b-995b-0805d6bbc718"). InnerVolumeSpecName "dns-svc". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Jan 28 21:04:18 crc kubenswrapper[4746]: I0128 21:04:18.824445 4746 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/368e0053-e824-487b-995b-0805d6bbc718-config" (OuterVolumeSpecName: "config") pod "368e0053-e824-487b-995b-0805d6bbc718" (UID: "368e0053-e824-487b-995b-0805d6bbc718"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Jan 28 21:04:18 crc kubenswrapper[4746]: I0128 21:04:18.833560 4746 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/368e0053-e824-487b-995b-0805d6bbc718-ovsdbserver-nb" (OuterVolumeSpecName: "ovsdbserver-nb") pod "368e0053-e824-487b-995b-0805d6bbc718" (UID: "368e0053-e824-487b-995b-0805d6bbc718"). InnerVolumeSpecName "ovsdbserver-nb". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Jan 28 21:04:18 crc kubenswrapper[4746]: I0128 21:04:18.836750 4746 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/368e0053-e824-487b-995b-0805d6bbc718-dns-swift-storage-0" (OuterVolumeSpecName: "dns-swift-storage-0") pod "368e0053-e824-487b-995b-0805d6bbc718" (UID: "368e0053-e824-487b-995b-0805d6bbc718"). InnerVolumeSpecName "dns-swift-storage-0". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Jan 28 21:04:18 crc kubenswrapper[4746]: I0128 21:04:18.851547 4746 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/368e0053-e824-487b-995b-0805d6bbc718-ovsdbserver-sb" (OuterVolumeSpecName: "ovsdbserver-sb") pod "368e0053-e824-487b-995b-0805d6bbc718" (UID: "368e0053-e824-487b-995b-0805d6bbc718"). InnerVolumeSpecName "ovsdbserver-sb". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Jan 28 21:04:18 crc kubenswrapper[4746]: I0128 21:04:18.862674 4746 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/368e0053-e824-487b-995b-0805d6bbc718-config\") on node \"crc\" DevicePath \"\""
Jan 28 21:04:18 crc kubenswrapper[4746]: I0128 21:04:18.862706 4746 reconciler_common.go:293] "Volume detached for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/368e0053-e824-487b-995b-0805d6bbc718-dns-svc\") on node \"crc\" DevicePath \"\""
Jan 28 21:04:18 crc kubenswrapper[4746]: I0128 21:04:18.862719 4746 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-xv9bq\" (UniqueName: \"kubernetes.io/projected/368e0053-e824-487b-995b-0805d6bbc718-kube-api-access-xv9bq\") on node \"crc\" DevicePath \"\""
Jan 28 21:04:18 crc kubenswrapper[4746]: I0128 21:04:18.862734 4746 reconciler_common.go:293] "Volume detached for volume \"openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/configmap/368e0053-e824-487b-995b-0805d6bbc718-openstack-edpm-ipam\") on node \"crc\" DevicePath \"\""
Jan 28 21:04:18 crc kubenswrapper[4746]: I0128 21:04:18.862746 4746 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/368e0053-e824-487b-995b-0805d6bbc718-ovsdbserver-nb\") on node \"crc\" DevicePath \"\""
Jan 28 21:04:18 crc kubenswrapper[4746]: I0128 21:04:18.862759 4746 reconciler_common.go:293] "Volume detached for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/368e0053-e824-487b-995b-0805d6bbc718-dns-swift-storage-0\") on node \"crc\" DevicePath \"\""
Jan 28 21:04:18 crc kubenswrapper[4746]: I0128 21:04:18.862772 4746 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/368e0053-e824-487b-995b-0805d6bbc718-ovsdbserver-sb\") on node \"crc\" DevicePath \"\""
Jan 28 21:04:19 crc kubenswrapper[4746]: I0128 21:04:19.352691 4746 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-dbb88bf8c-c8pfb" event={"ID":"368e0053-e824-487b-995b-0805d6bbc718","Type":"ContainerDied","Data":"2073fae9ee44ae79a884355411d83360a232ea7e11c9b0ec884055c99270a217"}
Jan 28 21:04:19 crc kubenswrapper[4746]: I0128 21:04:19.352965 4746 scope.go:117] "RemoveContainer" containerID="a74e8853e3cb3f6ae9f3fd62e7fecd463506bc19f908f0501c233215f24f6c9f"
Jan 28 21:04:19 crc kubenswrapper[4746]: I0128 21:04:19.352749 4746 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-dbb88bf8c-c8pfb"
Jan 28 21:04:19 crc kubenswrapper[4746]: I0128 21:04:19.393348 4746 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-dbb88bf8c-c8pfb"]
Jan 28 21:04:19 crc kubenswrapper[4746]: I0128 21:04:19.406003 4746 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-dbb88bf8c-c8pfb"]
Jan 28 21:04:19 crc kubenswrapper[4746]: I0128 21:04:19.426441 4746 scope.go:117] "RemoveContainer" containerID="42b99a7a24b240593ee48cc6243b657b4f84e2469c3597254a889f7b3a2fa567"
Jan 28 21:04:20 crc kubenswrapper[4746]: I0128 21:04:20.853729 4746 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="368e0053-e824-487b-995b-0805d6bbc718" path="/var/lib/kubelet/pods/368e0053-e824-487b-995b-0805d6bbc718/volumes"
Jan 28 21:04:31 crc kubenswrapper[4746]: I0128 21:04:31.051610 4746 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-nfz6p"]
Jan 28 21:04:31 crc kubenswrapper[4746]: E0128 21:04:31.052519 4746 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="8dc0a819-0fb7-4d64-a2f3-4e762be61026" containerName="init"
Jan 28 21:04:31 crc kubenswrapper[4746]: I0128 21:04:31.052534 4746 state_mem.go:107] "Deleted CPUSet assignment" podUID="8dc0a819-0fb7-4d64-a2f3-4e762be61026" containerName="init"
Jan 28 21:04:31 crc kubenswrapper[4746]: E0128 21:04:31.052565 4746 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="368e0053-e824-487b-995b-0805d6bbc718" containerName="init"
Jan 28 21:04:31 crc kubenswrapper[4746]: I0128 21:04:31.052571 4746 state_mem.go:107] "Deleted CPUSet assignment" podUID="368e0053-e824-487b-995b-0805d6bbc718" containerName="init"
Jan 28 21:04:31 crc kubenswrapper[4746]: E0128 21:04:31.052591 4746 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="8dc0a819-0fb7-4d64-a2f3-4e762be61026" containerName="dnsmasq-dns"
Jan 28 21:04:31 crc kubenswrapper[4746]: I0128 21:04:31.052597 4746 state_mem.go:107] "Deleted CPUSet assignment" podUID="8dc0a819-0fb7-4d64-a2f3-4e762be61026" containerName="dnsmasq-dns"
Jan 28 21:04:31 crc kubenswrapper[4746]: E0128 21:04:31.052609 4746 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="368e0053-e824-487b-995b-0805d6bbc718" containerName="dnsmasq-dns"
Jan 28 21:04:31 crc kubenswrapper[4746]: I0128 21:04:31.052614 4746 state_mem.go:107] "Deleted CPUSet assignment" podUID="368e0053-e824-487b-995b-0805d6bbc718" containerName="dnsmasq-dns"
Jan 28 21:04:31 crc kubenswrapper[4746]: I0128 21:04:31.052801 4746 memory_manager.go:354] "RemoveStaleState removing state" podUID="8dc0a819-0fb7-4d64-a2f3-4e762be61026" containerName="dnsmasq-dns"
Jan 28 21:04:31 crc kubenswrapper[4746]: I0128 21:04:31.052826 4746 memory_manager.go:354] "RemoveStaleState removing state" podUID="368e0053-e824-487b-995b-0805d6bbc718" containerName="dnsmasq-dns"
Jan 28 21:04:31 crc kubenswrapper[4746]: I0128 21:04:31.053533 4746 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-nfz6p"
Jan 28 21:04:31 crc kubenswrapper[4746]: I0128 21:04:31.056673 4746 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-aee-default-env"
Jan 28 21:04:31 crc kubenswrapper[4746]: I0128 21:04:31.058034 4746 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplanenodeset-openstack-edpm-ipam"
Jan 28 21:04:31 crc kubenswrapper[4746]: I0128 21:04:31.058298 4746 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-edpm-ipam-dockercfg-xb8vv"
Jan 28 21:04:31 crc kubenswrapper[4746]: I0128 21:04:31.058328 4746 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplane-ansible-ssh-private-key-secret"
Jan 28 21:04:31 crc kubenswrapper[4746]: I0128 21:04:31.078846 4746 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-nfz6p"]
Jan 28 21:04:31 crc kubenswrapper[4746]: I0128 21:04:31.245420 4746 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"repo-setup-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/90153a28-4812-4b2e-a3a3-2443a8618c3d-repo-setup-combined-ca-bundle\") pod \"repo-setup-edpm-deployment-openstack-edpm-ipam-nfz6p\" (UID: \"90153a28-4812-4b2e-a3a3-2443a8618c3d\") " pod="openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-nfz6p"
Jan 28 21:04:31 crc kubenswrapper[4746]: I0128 21:04:31.245470 4746 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-c4dqd\" (UniqueName: \"kubernetes.io/projected/90153a28-4812-4b2e-a3a3-2443a8618c3d-kube-api-access-c4dqd\") pod \"repo-setup-edpm-deployment-openstack-edpm-ipam-nfz6p\" (UID: \"90153a28-4812-4b2e-a3a3-2443a8618c3d\") " pod="openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-nfz6p"
Jan 28 21:04:31 crc kubenswrapper[4746]: I0128 21:04:31.245508 4746 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/90153a28-4812-4b2e-a3a3-2443a8618c3d-ssh-key-openstack-edpm-ipam\") pod \"repo-setup-edpm-deployment-openstack-edpm-ipam-nfz6p\" (UID: \"90153a28-4812-4b2e-a3a3-2443a8618c3d\") " pod="openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-nfz6p"
Jan 28 21:04:31 crc kubenswrapper[4746]: I0128 21:04:31.245598 4746 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/90153a28-4812-4b2e-a3a3-2443a8618c3d-inventory\") pod \"repo-setup-edpm-deployment-openstack-edpm-ipam-nfz6p\" (UID: \"90153a28-4812-4b2e-a3a3-2443a8618c3d\") " pod="openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-nfz6p"
Jan 28 21:04:31 crc kubenswrapper[4746]: I0128 21:04:31.347759 4746 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/90153a28-4812-4b2e-a3a3-2443a8618c3d-inventory\") pod \"repo-setup-edpm-deployment-openstack-edpm-ipam-nfz6p\" (UID: \"90153a28-4812-4b2e-a3a3-2443a8618c3d\") " pod="openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-nfz6p"
Jan 28 21:04:31 crc kubenswrapper[4746]: I0128 21:04:31.347951 4746 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"repo-setup-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/90153a28-4812-4b2e-a3a3-2443a8618c3d-repo-setup-combined-ca-bundle\") pod \"repo-setup-edpm-deployment-openstack-edpm-ipam-nfz6p\" (UID: \"90153a28-4812-4b2e-a3a3-2443a8618c3d\") " pod="openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-nfz6p"
Jan 28 21:04:31 crc kubenswrapper[4746]: I0128 21:04:31.347982 4746 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-c4dqd\" (UniqueName: \"kubernetes.io/projected/90153a28-4812-4b2e-a3a3-2443a8618c3d-kube-api-access-c4dqd\") pod \"repo-setup-edpm-deployment-openstack-edpm-ipam-nfz6p\" (UID: \"90153a28-4812-4b2e-a3a3-2443a8618c3d\") " pod="openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-nfz6p"
Jan 28 21:04:31 crc kubenswrapper[4746]: I0128 21:04:31.348014 4746 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/90153a28-4812-4b2e-a3a3-2443a8618c3d-ssh-key-openstack-edpm-ipam\") pod \"repo-setup-edpm-deployment-openstack-edpm-ipam-nfz6p\" (UID: \"90153a28-4812-4b2e-a3a3-2443a8618c3d\") " pod="openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-nfz6p"
Jan 28 21:04:31 crc kubenswrapper[4746]: I0128 21:04:31.354528 4746 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/90153a28-4812-4b2e-a3a3-2443a8618c3d-inventory\") pod \"repo-setup-edpm-deployment-openstack-edpm-ipam-nfz6p\" (UID: \"90153a28-4812-4b2e-a3a3-2443a8618c3d\") " pod="openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-nfz6p"
Jan 28 21:04:31 crc kubenswrapper[4746]: I0128 21:04:31.354714 4746 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"repo-setup-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/90153a28-4812-4b2e-a3a3-2443a8618c3d-repo-setup-combined-ca-bundle\") pod \"repo-setup-edpm-deployment-openstack-edpm-ipam-nfz6p\" (UID: \"90153a28-4812-4b2e-a3a3-2443a8618c3d\") " pod="openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-nfz6p"
Jan 28 21:04:31 crc kubenswrapper[4746]: I0128 21:04:31.365215 4746 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/90153a28-4812-4b2e-a3a3-2443a8618c3d-ssh-key-openstack-edpm-ipam\") pod \"repo-setup-edpm-deployment-openstack-edpm-ipam-nfz6p\" (UID: \"90153a28-4812-4b2e-a3a3-2443a8618c3d\") " pod="openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-nfz6p"
Jan 28 21:04:31 crc kubenswrapper[4746]: I0128 21:04:31.369666 4746 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-c4dqd\" (UniqueName: \"kubernetes.io/projected/90153a28-4812-4b2e-a3a3-2443a8618c3d-kube-api-access-c4dqd\") pod \"repo-setup-edpm-deployment-openstack-edpm-ipam-nfz6p\" (UID: \"90153a28-4812-4b2e-a3a3-2443a8618c3d\") " pod="openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-nfz6p"
Jan 28 21:04:31 crc kubenswrapper[4746]: I0128 21:04:31.415690 4746 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-nfz6p"
Jan 28 21:04:31 crc kubenswrapper[4746]: I0128 21:04:31.956483 4746 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-nfz6p"]
Jan 28 21:04:32 crc kubenswrapper[4746]: I0128 21:04:32.523559 4746 generic.go:334] "Generic (PLEG): container finished" podID="f330def9-769c-4adf-9df3-c1a7c54cd502" containerID="3a642c8737fafc652b84b55ccbe91af81f095d9f955b946a79d237b4834de974" exitCode=0
Jan 28 21:04:32 crc kubenswrapper[4746]: I0128 21:04:32.523590 4746 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-server-0" event={"ID":"f330def9-769c-4adf-9df3-c1a7c54cd502","Type":"ContainerDied","Data":"3a642c8737fafc652b84b55ccbe91af81f095d9f955b946a79d237b4834de974"}
Jan 28 21:04:32 crc kubenswrapper[4746]: I0128 21:04:32.525363 4746 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-nfz6p" event={"ID":"90153a28-4812-4b2e-a3a3-2443a8618c3d","Type":"ContainerStarted","Data":"7b36f241f4050198d6a4981ab98131119fe2969152bda9ed2871c414a5ae867b"}
Jan 28 21:04:33 crc kubenswrapper[4746]: I0128 21:04:33.544321 4746 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-server-0" event={"ID":"f330def9-769c-4adf-9df3-c1a7c54cd502","Type":"ContainerStarted","Data":"a4706a8a2c8820a17af33653d5775eb296959fdd7e2ff6b1d19d622aec5bcc52"}
Jan 28 21:04:33 crc kubenswrapper[4746]: I0128 21:04:33.545089 4746 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/rabbitmq-server-0"
Jan 28 21:04:33 crc kubenswrapper[4746]: I0128 21:04:33.549111 4746 generic.go:334] "Generic (PLEG): container finished" podID="31ed4da0-c996-4afb-aa3d-d61a7c13ccfb" containerID="6b6b0e1be301e05489f4153d6dbadee75639f3523767e22c80b88f0c7b0b5fe1" exitCode=0
Jan 28 21:04:33 crc kubenswrapper[4746]: I0128 21:04:33.549435 4746 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-cell1-server-0" event={"ID":"31ed4da0-c996-4afb-aa3d-d61a7c13ccfb","Type":"ContainerDied","Data":"6b6b0e1be301e05489f4153d6dbadee75639f3523767e22c80b88f0c7b0b5fe1"}
Jan 28 21:04:33 crc kubenswrapper[4746]: I0128 21:04:33.591268 4746 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/rabbitmq-server-0" podStartSLOduration=37.591228651 podStartE2EDuration="37.591228651s" podCreationTimestamp="2026-01-28 21:03:56 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-28 21:04:33.567555561 +0000 UTC m=+1501.523741915" watchObservedRunningTime="2026-01-28 21:04:33.591228651 +0000 UTC m=+1501.547415005"
Jan 28 21:04:35 crc kubenswrapper[4746]: I0128 21:04:35.573610 4746 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-cell1-server-0" event={"ID":"31ed4da0-c996-4afb-aa3d-d61a7c13ccfb","Type":"ContainerStarted","Data":"c783a9b69026dea4b26fe1b6643a9fd1ed95d2afde8c76a96b7b4e1906ce2858"}
Jan 28 21:04:35 crc kubenswrapper[4746]: I0128 21:04:35.574628 4746 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/rabbitmq-cell1-server-0"
Jan 28 21:04:35 crc kubenswrapper[4746]: I0128 21:04:35.612509 4746 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/rabbitmq-cell1-server-0" podStartSLOduration=39.612486885 podStartE2EDuration="39.612486885s" podCreationTimestamp="2026-01-28 21:03:56 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-28 21:04:35.599990127 +0000 UTC m=+1503.556176481" watchObservedRunningTime="2026-01-28 21:04:35.612486885 +0000 UTC m=+1503.568673239"
Jan 28 21:04:41 crc kubenswrapper[4746]: I0128 21:04:41.457047 4746 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-aee-default-env"
Jan 28 21:04:42 crc kubenswrapper[4746]: I0128 21:04:42.464537 4746 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/cloudkitty-api-0"
Jan 28 21:04:42 crc kubenswrapper[4746]: I0128 21:04:42.642180 4746 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-nfz6p" event={"ID":"90153a28-4812-4b2e-a3a3-2443a8618c3d","Type":"ContainerStarted","Data":"aa60e896e6a946d042d4bdf800042d1ddbd6606e030de7e0463486e476f8bdfb"}
Jan 28 21:04:42 crc kubenswrapper[4746]: I0128 21:04:42.666447 4746 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-nfz6p" podStartSLOduration=2.170700262 podStartE2EDuration="11.666426577s" podCreationTimestamp="2026-01-28 21:04:31 +0000 UTC" firstStartedPulling="2026-01-28 21:04:31.958169823 +0000 UTC m=+1499.914356177" lastFinishedPulling="2026-01-28 21:04:41.453896118 +0000 UTC m=+1509.410082492" observedRunningTime="2026-01-28 21:04:42.658200425 +0000 UTC m=+1510.614386779" watchObservedRunningTime="2026-01-28 21:04:42.666426577 +0000 UTC m=+1510.622612931"
Jan 28 21:04:47 crc kubenswrapper[4746]: I0128 21:04:47.259211 4746 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/rabbitmq-server-0"
Jan 28 21:04:47 crc kubenswrapper[4746]: I0128 21:04:47.596253 4746 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/rabbitmq-cell1-server-0"
Jan 28 21:04:51 crc kubenswrapper[4746]: I0128 21:04:51.749704 4746 generic.go:334] "Generic (PLEG): container finished" podID="90153a28-4812-4b2e-a3a3-2443a8618c3d" containerID="aa60e896e6a946d042d4bdf800042d1ddbd6606e030de7e0463486e476f8bdfb" exitCode=0
Jan 28 21:04:51 crc kubenswrapper[4746]: I0128 21:04:51.749805 4746 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-nfz6p" event={"ID":"90153a28-4812-4b2e-a3a3-2443a8618c3d","Type":"ContainerDied","Data":"aa60e896e6a946d042d4bdf800042d1ddbd6606e030de7e0463486e476f8bdfb"}
Jan 28 21:04:53 crc kubenswrapper[4746]: I0128 21:04:53.277693 4746 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-nfz6p"
Jan 28 21:04:53 crc kubenswrapper[4746]: I0128 21:04:53.340985 4746 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-c4dqd\" (UniqueName: \"kubernetes.io/projected/90153a28-4812-4b2e-a3a3-2443a8618c3d-kube-api-access-c4dqd\") pod \"90153a28-4812-4b2e-a3a3-2443a8618c3d\" (UID: \"90153a28-4812-4b2e-a3a3-2443a8618c3d\") "
Jan 28 21:04:53 crc kubenswrapper[4746]: I0128 21:04:53.341122 4746 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/90153a28-4812-4b2e-a3a3-2443a8618c3d-inventory\") pod \"90153a28-4812-4b2e-a3a3-2443a8618c3d\" (UID: \"90153a28-4812-4b2e-a3a3-2443a8618c3d\") "
Jan 28 21:04:53 crc kubenswrapper[4746]: I0128 21:04:53.341310 4746 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"repo-setup-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/90153a28-4812-4b2e-a3a3-2443a8618c3d-repo-setup-combined-ca-bundle\") pod \"90153a28-4812-4b2e-a3a3-2443a8618c3d\" (UID: \"90153a28-4812-4b2e-a3a3-2443a8618c3d\") "
Jan 28 21:04:53 crc kubenswrapper[4746]: I0128 21:04:53.341366 4746 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/90153a28-4812-4b2e-a3a3-2443a8618c3d-ssh-key-openstack-edpm-ipam\") pod \"90153a28-4812-4b2e-a3a3-2443a8618c3d\" (UID: \"90153a28-4812-4b2e-a3a3-2443a8618c3d\") "
Jan 28 21:04:53 crc kubenswrapper[4746]: I0128 21:04:53.346872 4746 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/90153a28-4812-4b2e-a3a3-2443a8618c3d-kube-api-access-c4dqd" (OuterVolumeSpecName: "kube-api-access-c4dqd") pod "90153a28-4812-4b2e-a3a3-2443a8618c3d" (UID: "90153a28-4812-4b2e-a3a3-2443a8618c3d"). InnerVolumeSpecName "kube-api-access-c4dqd". PluginName "kubernetes.io/projected", VolumeGidValue ""
Jan 28 21:04:53 crc kubenswrapper[4746]: I0128 21:04:53.347477 4746 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/90153a28-4812-4b2e-a3a3-2443a8618c3d-repo-setup-combined-ca-bundle" (OuterVolumeSpecName: "repo-setup-combined-ca-bundle") pod "90153a28-4812-4b2e-a3a3-2443a8618c3d" (UID: "90153a28-4812-4b2e-a3a3-2443a8618c3d"). InnerVolumeSpecName "repo-setup-combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue ""
Jan 28 21:04:53 crc kubenswrapper[4746]: I0128 21:04:53.374219 4746 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/90153a28-4812-4b2e-a3a3-2443a8618c3d-inventory" (OuterVolumeSpecName: "inventory") pod "90153a28-4812-4b2e-a3a3-2443a8618c3d" (UID: "90153a28-4812-4b2e-a3a3-2443a8618c3d"). InnerVolumeSpecName "inventory". PluginName "kubernetes.io/secret", VolumeGidValue ""
Jan 28 21:04:53 crc kubenswrapper[4746]: I0128 21:04:53.386389 4746 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/90153a28-4812-4b2e-a3a3-2443a8618c3d-ssh-key-openstack-edpm-ipam" (OuterVolumeSpecName: "ssh-key-openstack-edpm-ipam") pod "90153a28-4812-4b2e-a3a3-2443a8618c3d" (UID: "90153a28-4812-4b2e-a3a3-2443a8618c3d"). InnerVolumeSpecName "ssh-key-openstack-edpm-ipam". PluginName "kubernetes.io/secret", VolumeGidValue ""
Jan 28 21:04:53 crc kubenswrapper[4746]: I0128 21:04:53.444946 4746 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-c4dqd\" (UniqueName: \"kubernetes.io/projected/90153a28-4812-4b2e-a3a3-2443a8618c3d-kube-api-access-c4dqd\") on node \"crc\" DevicePath \"\""
Jan 28 21:04:53 crc kubenswrapper[4746]: I0128 21:04:53.445184 4746 reconciler_common.go:293] "Volume detached for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/90153a28-4812-4b2e-a3a3-2443a8618c3d-inventory\") on node \"crc\" DevicePath \"\""
Jan 28 21:04:53 crc kubenswrapper[4746]: I0128 21:04:53.445267 4746 reconciler_common.go:293] "Volume detached for volume \"repo-setup-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/90153a28-4812-4b2e-a3a3-2443a8618c3d-repo-setup-combined-ca-bundle\") on node \"crc\" DevicePath \"\""
Jan 28 21:04:53 crc kubenswrapper[4746]: I0128 21:04:53.445340 4746 reconciler_common.go:293] "Volume detached for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/90153a28-4812-4b2e-a3a3-2443a8618c3d-ssh-key-openstack-edpm-ipam\") on node \"crc\" DevicePath \"\""
Jan 28 21:04:53 crc kubenswrapper[4746]: I0128 21:04:53.781877 4746 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-nfz6p" event={"ID":"90153a28-4812-4b2e-a3a3-2443a8618c3d","Type":"ContainerDied","Data":"7b36f241f4050198d6a4981ab98131119fe2969152bda9ed2871c414a5ae867b"}
Jan 28 21:04:53 crc kubenswrapper[4746]: I0128 21:04:53.781929 4746 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="7b36f241f4050198d6a4981ab98131119fe2969152bda9ed2871c414a5ae867b"
Jan 28 21:04:53 crc kubenswrapper[4746]: I0128 21:04:53.781992 4746 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-nfz6p"
Jan 28 21:04:53 crc kubenswrapper[4746]: I0128 21:04:53.875551 4746 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/redhat-edpm-deployment-openstack-edpm-ipam-p7xp2"]
Jan 28 21:04:53 crc kubenswrapper[4746]: E0128 21:04:53.878449 4746 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="90153a28-4812-4b2e-a3a3-2443a8618c3d" containerName="repo-setup-edpm-deployment-openstack-edpm-ipam"
Jan 28 21:04:53 crc kubenswrapper[4746]: I0128 21:04:53.878477 4746 state_mem.go:107] "Deleted CPUSet assignment" podUID="90153a28-4812-4b2e-a3a3-2443a8618c3d" containerName="repo-setup-edpm-deployment-openstack-edpm-ipam"
Jan 28 21:04:53 crc kubenswrapper[4746]: I0128 21:04:53.879002 4746 memory_manager.go:354] "RemoveStaleState removing state" podUID="90153a28-4812-4b2e-a3a3-2443a8618c3d" containerName="repo-setup-edpm-deployment-openstack-edpm-ipam"
Jan 28 21:04:53 crc kubenswrapper[4746]: I0128 21:04:53.881807 4746 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/redhat-edpm-deployment-openstack-edpm-ipam-p7xp2"
Jan 28 21:04:53 crc kubenswrapper[4746]: I0128 21:04:53.884294 4746 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-edpm-ipam-dockercfg-xb8vv"
Jan 28 21:04:53 crc kubenswrapper[4746]: I0128 21:04:53.886698 4746 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplanenodeset-openstack-edpm-ipam"
Jan 28 21:04:53 crc kubenswrapper[4746]: I0128 21:04:53.886930 4746 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplane-ansible-ssh-private-key-secret"
Jan 28 21:04:53 crc kubenswrapper[4746]: I0128 21:04:53.887266 4746 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-aee-default-env"
Jan 28 21:04:53 crc kubenswrapper[4746]: I0128 21:04:53.895230 4746 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/redhat-edpm-deployment-openstack-edpm-ipam-p7xp2"]
Jan 28 21:04:53 crc kubenswrapper[4746]: I0128 21:04:53.963874 4746 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/a5bda0ca-2718-41bf-84d6-6c08d35d16b1-inventory\") pod \"redhat-edpm-deployment-openstack-edpm-ipam-p7xp2\" (UID: \"a5bda0ca-2718-41bf-84d6-6c08d35d16b1\") " pod="openstack/redhat-edpm-deployment-openstack-edpm-ipam-p7xp2"
Jan 28 21:04:53 crc kubenswrapper[4746]: I0128 21:04:53.963935 4746 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-7x7fq\" (UniqueName: \"kubernetes.io/projected/a5bda0ca-2718-41bf-84d6-6c08d35d16b1-kube-api-access-7x7fq\") pod \"redhat-edpm-deployment-openstack-edpm-ipam-p7xp2\" (UID: \"a5bda0ca-2718-41bf-84d6-6c08d35d16b1\") " pod="openstack/redhat-edpm-deployment-openstack-edpm-ipam-p7xp2"
Jan 28 21:04:53 crc kubenswrapper[4746]: I0128 21:04:53.964048 4746 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/a5bda0ca-2718-41bf-84d6-6c08d35d16b1-ssh-key-openstack-edpm-ipam\") pod \"redhat-edpm-deployment-openstack-edpm-ipam-p7xp2\" (UID: \"a5bda0ca-2718-41bf-84d6-6c08d35d16b1\") " pod="openstack/redhat-edpm-deployment-openstack-edpm-ipam-p7xp2"
Jan 28 21:04:54 crc kubenswrapper[4746]: I0128 21:04:54.066250 4746 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/a5bda0ca-2718-41bf-84d6-6c08d35d16b1-ssh-key-openstack-edpm-ipam\") pod \"redhat-edpm-deployment-openstack-edpm-ipam-p7xp2\" (UID: \"a5bda0ca-2718-41bf-84d6-6c08d35d16b1\") " pod="openstack/redhat-edpm-deployment-openstack-edpm-ipam-p7xp2"
Jan 28 21:04:54 crc kubenswrapper[4746]: I0128 21:04:54.066482 4746 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/a5bda0ca-2718-41bf-84d6-6c08d35d16b1-inventory\") pod \"redhat-edpm-deployment-openstack-edpm-ipam-p7xp2\" (UID: \"a5bda0ca-2718-41bf-84d6-6c08d35d16b1\") " pod="openstack/redhat-edpm-deployment-openstack-edpm-ipam-p7xp2"
Jan 28 21:04:54 crc kubenswrapper[4746]: I0128 21:04:54.066554 4746 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-7x7fq\" (UniqueName: \"kubernetes.io/projected/a5bda0ca-2718-41bf-84d6-6c08d35d16b1-kube-api-access-7x7fq\") pod \"redhat-edpm-deployment-openstack-edpm-ipam-p7xp2\" (UID: \"a5bda0ca-2718-41bf-84d6-6c08d35d16b1\") " pod="openstack/redhat-edpm-deployment-openstack-edpm-ipam-p7xp2"
Jan 28 21:04:54 crc kubenswrapper[4746]: I0128 21:04:54.071626 4746 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/a5bda0ca-2718-41bf-84d6-6c08d35d16b1-inventory\") pod \"redhat-edpm-deployment-openstack-edpm-ipam-p7xp2\" (UID: \"a5bda0ca-2718-41bf-84d6-6c08d35d16b1\") " pod="openstack/redhat-edpm-deployment-openstack-edpm-ipam-p7xp2"
Jan 28 21:04:54 crc kubenswrapper[4746]: I0128 21:04:54.071652 4746 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/a5bda0ca-2718-41bf-84d6-6c08d35d16b1-ssh-key-openstack-edpm-ipam\") pod \"redhat-edpm-deployment-openstack-edpm-ipam-p7xp2\" (UID: \"a5bda0ca-2718-41bf-84d6-6c08d35d16b1\") " pod="openstack/redhat-edpm-deployment-openstack-edpm-ipam-p7xp2"
Jan 28 21:04:54 crc kubenswrapper[4746]: I0128 21:04:54.083565 4746 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-7x7fq\" (UniqueName: \"kubernetes.io/projected/a5bda0ca-2718-41bf-84d6-6c08d35d16b1-kube-api-access-7x7fq\") pod \"redhat-edpm-deployment-openstack-edpm-ipam-p7xp2\" (UID: \"a5bda0ca-2718-41bf-84d6-6c08d35d16b1\") " pod="openstack/redhat-edpm-deployment-openstack-edpm-ipam-p7xp2"
Jan 28 21:04:54 crc kubenswrapper[4746]: I0128 21:04:54.236984 4746 util.go:30] "No sandbox for pod can be found.
Need to start a new one" pod="openstack/redhat-edpm-deployment-openstack-edpm-ipam-p7xp2" Jan 28 21:04:54 crc kubenswrapper[4746]: W0128 21:04:54.758590 4746 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-poda5bda0ca_2718_41bf_84d6_6c08d35d16b1.slice/crio-7025e37105a9c4ddffee5e7cdaf264eb9d4fa71fcdf7d4f814f029c228b4eb5f WatchSource:0}: Error finding container 7025e37105a9c4ddffee5e7cdaf264eb9d4fa71fcdf7d4f814f029c228b4eb5f: Status 404 returned error can't find the container with id 7025e37105a9c4ddffee5e7cdaf264eb9d4fa71fcdf7d4f814f029c228b4eb5f Jan 28 21:04:54 crc kubenswrapper[4746]: I0128 21:04:54.766251 4746 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/redhat-edpm-deployment-openstack-edpm-ipam-p7xp2"] Jan 28 21:04:54 crc kubenswrapper[4746]: I0128 21:04:54.796466 4746 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/redhat-edpm-deployment-openstack-edpm-ipam-p7xp2" event={"ID":"a5bda0ca-2718-41bf-84d6-6c08d35d16b1","Type":"ContainerStarted","Data":"7025e37105a9c4ddffee5e7cdaf264eb9d4fa71fcdf7d4f814f029c228b4eb5f"} Jan 28 21:04:55 crc kubenswrapper[4746]: I0128 21:04:55.620680 4746 scope.go:117] "RemoveContainer" containerID="86285df83ebe099072ee72ae8481ed25518db8ae522dc26fa319741f5e10321d" Jan 28 21:04:55 crc kubenswrapper[4746]: I0128 21:04:55.648223 4746 scope.go:117] "RemoveContainer" containerID="5efd43f67cdb780cefb6d4e7f5b9be112a6ae66e6f4fa4a140595b1baac5638d" Jan 28 21:04:55 crc kubenswrapper[4746]: I0128 21:04:55.704646 4746 scope.go:117] "RemoveContainer" containerID="c77139041dada346e753825ccb577fd7e0206e3e918bc5faa9fb0bae913ce06a" Jan 28 21:04:55 crc kubenswrapper[4746]: I0128 21:04:55.751694 4746 scope.go:117] "RemoveContainer" containerID="906ae965fc7672b88c668edce8c19b1b658c73b843592f53f153c5d2ad471532" Jan 28 21:04:55 crc kubenswrapper[4746]: I0128 21:04:55.780289 4746 scope.go:117] "RemoveContainer" 
containerID="d4a7991e03a107fe03667d4568bdff822eccb58ad8a1254b56c2585d609c36b0" Jan 28 21:04:55 crc kubenswrapper[4746]: I0128 21:04:55.808009 4746 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/redhat-edpm-deployment-openstack-edpm-ipam-p7xp2" event={"ID":"a5bda0ca-2718-41bf-84d6-6c08d35d16b1","Type":"ContainerStarted","Data":"1136686af248c1fa3cc1b2db675cb5a0493a6c3001172dd1b596455736543557"} Jan 28 21:04:55 crc kubenswrapper[4746]: I0128 21:04:55.832490 4746 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/redhat-edpm-deployment-openstack-edpm-ipam-p7xp2" podStartSLOduration=2.416853791 podStartE2EDuration="2.832465367s" podCreationTimestamp="2026-01-28 21:04:53 +0000 UTC" firstStartedPulling="2026-01-28 21:04:54.761332883 +0000 UTC m=+1522.717519257" lastFinishedPulling="2026-01-28 21:04:55.176944469 +0000 UTC m=+1523.133130833" observedRunningTime="2026-01-28 21:04:55.822994621 +0000 UTC m=+1523.779180995" watchObservedRunningTime="2026-01-28 21:04:55.832465367 +0000 UTC m=+1523.788651751" Jan 28 21:04:57 crc kubenswrapper[4746]: I0128 21:04:57.847535 4746 generic.go:334] "Generic (PLEG): container finished" podID="a5bda0ca-2718-41bf-84d6-6c08d35d16b1" containerID="1136686af248c1fa3cc1b2db675cb5a0493a6c3001172dd1b596455736543557" exitCode=0 Jan 28 21:04:57 crc kubenswrapper[4746]: I0128 21:04:57.847666 4746 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/redhat-edpm-deployment-openstack-edpm-ipam-p7xp2" event={"ID":"a5bda0ca-2718-41bf-84d6-6c08d35d16b1","Type":"ContainerDied","Data":"1136686af248c1fa3cc1b2db675cb5a0493a6c3001172dd1b596455736543557"} Jan 28 21:04:59 crc kubenswrapper[4746]: I0128 21:04:59.415612 4746 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/redhat-edpm-deployment-openstack-edpm-ipam-p7xp2" Jan 28 21:04:59 crc kubenswrapper[4746]: I0128 21:04:59.519416 4746 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/a5bda0ca-2718-41bf-84d6-6c08d35d16b1-inventory\") pod \"a5bda0ca-2718-41bf-84d6-6c08d35d16b1\" (UID: \"a5bda0ca-2718-41bf-84d6-6c08d35d16b1\") " Jan 28 21:04:59 crc kubenswrapper[4746]: I0128 21:04:59.519508 4746 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/a5bda0ca-2718-41bf-84d6-6c08d35d16b1-ssh-key-openstack-edpm-ipam\") pod \"a5bda0ca-2718-41bf-84d6-6c08d35d16b1\" (UID: \"a5bda0ca-2718-41bf-84d6-6c08d35d16b1\") " Jan 28 21:04:59 crc kubenswrapper[4746]: I0128 21:04:59.519579 4746 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-7x7fq\" (UniqueName: \"kubernetes.io/projected/a5bda0ca-2718-41bf-84d6-6c08d35d16b1-kube-api-access-7x7fq\") pod \"a5bda0ca-2718-41bf-84d6-6c08d35d16b1\" (UID: \"a5bda0ca-2718-41bf-84d6-6c08d35d16b1\") " Jan 28 21:04:59 crc kubenswrapper[4746]: I0128 21:04:59.526216 4746 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/a5bda0ca-2718-41bf-84d6-6c08d35d16b1-kube-api-access-7x7fq" (OuterVolumeSpecName: "kube-api-access-7x7fq") pod "a5bda0ca-2718-41bf-84d6-6c08d35d16b1" (UID: "a5bda0ca-2718-41bf-84d6-6c08d35d16b1"). InnerVolumeSpecName "kube-api-access-7x7fq". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 28 21:04:59 crc kubenswrapper[4746]: I0128 21:04:59.559792 4746 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/a5bda0ca-2718-41bf-84d6-6c08d35d16b1-ssh-key-openstack-edpm-ipam" (OuterVolumeSpecName: "ssh-key-openstack-edpm-ipam") pod "a5bda0ca-2718-41bf-84d6-6c08d35d16b1" (UID: "a5bda0ca-2718-41bf-84d6-6c08d35d16b1"). InnerVolumeSpecName "ssh-key-openstack-edpm-ipam". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 28 21:04:59 crc kubenswrapper[4746]: I0128 21:04:59.560128 4746 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/a5bda0ca-2718-41bf-84d6-6c08d35d16b1-inventory" (OuterVolumeSpecName: "inventory") pod "a5bda0ca-2718-41bf-84d6-6c08d35d16b1" (UID: "a5bda0ca-2718-41bf-84d6-6c08d35d16b1"). InnerVolumeSpecName "inventory". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 28 21:04:59 crc kubenswrapper[4746]: I0128 21:04:59.622283 4746 reconciler_common.go:293] "Volume detached for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/a5bda0ca-2718-41bf-84d6-6c08d35d16b1-inventory\") on node \"crc\" DevicePath \"\"" Jan 28 21:04:59 crc kubenswrapper[4746]: I0128 21:04:59.622317 4746 reconciler_common.go:293] "Volume detached for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/a5bda0ca-2718-41bf-84d6-6c08d35d16b1-ssh-key-openstack-edpm-ipam\") on node \"crc\" DevicePath \"\"" Jan 28 21:04:59 crc kubenswrapper[4746]: I0128 21:04:59.622328 4746 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-7x7fq\" (UniqueName: \"kubernetes.io/projected/a5bda0ca-2718-41bf-84d6-6c08d35d16b1-kube-api-access-7x7fq\") on node \"crc\" DevicePath \"\"" Jan 28 21:04:59 crc kubenswrapper[4746]: I0128 21:04:59.886896 4746 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/redhat-edpm-deployment-openstack-edpm-ipam-p7xp2" 
event={"ID":"a5bda0ca-2718-41bf-84d6-6c08d35d16b1","Type":"ContainerDied","Data":"7025e37105a9c4ddffee5e7cdaf264eb9d4fa71fcdf7d4f814f029c228b4eb5f"} Jan 28 21:04:59 crc kubenswrapper[4746]: I0128 21:04:59.886943 4746 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="7025e37105a9c4ddffee5e7cdaf264eb9d4fa71fcdf7d4f814f029c228b4eb5f" Jan 28 21:04:59 crc kubenswrapper[4746]: I0128 21:04:59.886959 4746 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/redhat-edpm-deployment-openstack-edpm-ipam-p7xp2" Jan 28 21:04:59 crc kubenswrapper[4746]: I0128 21:04:59.978922 4746 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-x7ptr"] Jan 28 21:04:59 crc kubenswrapper[4746]: E0128 21:04:59.980162 4746 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="a5bda0ca-2718-41bf-84d6-6c08d35d16b1" containerName="redhat-edpm-deployment-openstack-edpm-ipam" Jan 28 21:04:59 crc kubenswrapper[4746]: I0128 21:04:59.980190 4746 state_mem.go:107] "Deleted CPUSet assignment" podUID="a5bda0ca-2718-41bf-84d6-6c08d35d16b1" containerName="redhat-edpm-deployment-openstack-edpm-ipam" Jan 28 21:04:59 crc kubenswrapper[4746]: I0128 21:04:59.980442 4746 memory_manager.go:354] "RemoveStaleState removing state" podUID="a5bda0ca-2718-41bf-84d6-6c08d35d16b1" containerName="redhat-edpm-deployment-openstack-edpm-ipam" Jan 28 21:04:59 crc kubenswrapper[4746]: I0128 21:04:59.981485 4746 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-x7ptr" Jan 28 21:04:59 crc kubenswrapper[4746]: I0128 21:04:59.984706 4746 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-edpm-ipam-dockercfg-xb8vv" Jan 28 21:04:59 crc kubenswrapper[4746]: I0128 21:04:59.985595 4746 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplane-ansible-ssh-private-key-secret" Jan 28 21:04:59 crc kubenswrapper[4746]: I0128 21:04:59.985710 4746 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-aee-default-env" Jan 28 21:04:59 crc kubenswrapper[4746]: I0128 21:04:59.985928 4746 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplanenodeset-openstack-edpm-ipam" Jan 28 21:05:00 crc kubenswrapper[4746]: I0128 21:05:00.006269 4746 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-x7ptr"] Jan 28 21:05:00 crc kubenswrapper[4746]: I0128 21:05:00.132532 4746 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-55hl6\" (UniqueName: \"kubernetes.io/projected/ed8a3948-98ae-4e2a-a9f8-435287fc9583-kube-api-access-55hl6\") pod \"bootstrap-edpm-deployment-openstack-edpm-ipam-x7ptr\" (UID: \"ed8a3948-98ae-4e2a-a9f8-435287fc9583\") " pod="openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-x7ptr" Jan 28 21:05:00 crc kubenswrapper[4746]: I0128 21:05:00.132840 4746 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/ed8a3948-98ae-4e2a-a9f8-435287fc9583-inventory\") pod \"bootstrap-edpm-deployment-openstack-edpm-ipam-x7ptr\" (UID: \"ed8a3948-98ae-4e2a-a9f8-435287fc9583\") " pod="openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-x7ptr" Jan 28 21:05:00 crc kubenswrapper[4746]: I0128 21:05:00.133246 4746 
reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bootstrap-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/ed8a3948-98ae-4e2a-a9f8-435287fc9583-bootstrap-combined-ca-bundle\") pod \"bootstrap-edpm-deployment-openstack-edpm-ipam-x7ptr\" (UID: \"ed8a3948-98ae-4e2a-a9f8-435287fc9583\") " pod="openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-x7ptr" Jan 28 21:05:00 crc kubenswrapper[4746]: I0128 21:05:00.133357 4746 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/ed8a3948-98ae-4e2a-a9f8-435287fc9583-ssh-key-openstack-edpm-ipam\") pod \"bootstrap-edpm-deployment-openstack-edpm-ipam-x7ptr\" (UID: \"ed8a3948-98ae-4e2a-a9f8-435287fc9583\") " pod="openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-x7ptr" Jan 28 21:05:00 crc kubenswrapper[4746]: I0128 21:05:00.234755 4746 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/ed8a3948-98ae-4e2a-a9f8-435287fc9583-inventory\") pod \"bootstrap-edpm-deployment-openstack-edpm-ipam-x7ptr\" (UID: \"ed8a3948-98ae-4e2a-a9f8-435287fc9583\") " pod="openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-x7ptr" Jan 28 21:05:00 crc kubenswrapper[4746]: I0128 21:05:00.234832 4746 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/ed8a3948-98ae-4e2a-a9f8-435287fc9583-ssh-key-openstack-edpm-ipam\") pod \"bootstrap-edpm-deployment-openstack-edpm-ipam-x7ptr\" (UID: \"ed8a3948-98ae-4e2a-a9f8-435287fc9583\") " pod="openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-x7ptr" Jan 28 21:05:00 crc kubenswrapper[4746]: I0128 21:05:00.234857 4746 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"bootstrap-combined-ca-bundle\" (UniqueName: 
\"kubernetes.io/secret/ed8a3948-98ae-4e2a-a9f8-435287fc9583-bootstrap-combined-ca-bundle\") pod \"bootstrap-edpm-deployment-openstack-edpm-ipam-x7ptr\" (UID: \"ed8a3948-98ae-4e2a-a9f8-435287fc9583\") " pod="openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-x7ptr" Jan 28 21:05:00 crc kubenswrapper[4746]: I0128 21:05:00.234948 4746 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-55hl6\" (UniqueName: \"kubernetes.io/projected/ed8a3948-98ae-4e2a-a9f8-435287fc9583-kube-api-access-55hl6\") pod \"bootstrap-edpm-deployment-openstack-edpm-ipam-x7ptr\" (UID: \"ed8a3948-98ae-4e2a-a9f8-435287fc9583\") " pod="openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-x7ptr" Jan 28 21:05:00 crc kubenswrapper[4746]: I0128 21:05:00.239425 4746 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/ed8a3948-98ae-4e2a-a9f8-435287fc9583-ssh-key-openstack-edpm-ipam\") pod \"bootstrap-edpm-deployment-openstack-edpm-ipam-x7ptr\" (UID: \"ed8a3948-98ae-4e2a-a9f8-435287fc9583\") " pod="openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-x7ptr" Jan 28 21:05:00 crc kubenswrapper[4746]: I0128 21:05:00.239906 4746 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"bootstrap-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/ed8a3948-98ae-4e2a-a9f8-435287fc9583-bootstrap-combined-ca-bundle\") pod \"bootstrap-edpm-deployment-openstack-edpm-ipam-x7ptr\" (UID: \"ed8a3948-98ae-4e2a-a9f8-435287fc9583\") " pod="openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-x7ptr" Jan 28 21:05:00 crc kubenswrapper[4746]: I0128 21:05:00.247037 4746 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/ed8a3948-98ae-4e2a-a9f8-435287fc9583-inventory\") pod \"bootstrap-edpm-deployment-openstack-edpm-ipam-x7ptr\" (UID: \"ed8a3948-98ae-4e2a-a9f8-435287fc9583\") " 
pod="openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-x7ptr" Jan 28 21:05:00 crc kubenswrapper[4746]: I0128 21:05:00.257844 4746 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-55hl6\" (UniqueName: \"kubernetes.io/projected/ed8a3948-98ae-4e2a-a9f8-435287fc9583-kube-api-access-55hl6\") pod \"bootstrap-edpm-deployment-openstack-edpm-ipam-x7ptr\" (UID: \"ed8a3948-98ae-4e2a-a9f8-435287fc9583\") " pod="openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-x7ptr" Jan 28 21:05:00 crc kubenswrapper[4746]: I0128 21:05:00.310692 4746 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-x7ptr" Jan 28 21:05:00 crc kubenswrapper[4746]: I0128 21:05:00.981146 4746 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-x7ptr"] Jan 28 21:05:00 crc kubenswrapper[4746]: W0128 21:05:00.986450 4746 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-poded8a3948_98ae_4e2a_a9f8_435287fc9583.slice/crio-f7b0227070af981c6647962485fdf02455245a7b5293c6700a6926d58b2e7157 WatchSource:0}: Error finding container f7b0227070af981c6647962485fdf02455245a7b5293c6700a6926d58b2e7157: Status 404 returned error can't find the container with id f7b0227070af981c6647962485fdf02455245a7b5293c6700a6926d58b2e7157 Jan 28 21:05:01 crc kubenswrapper[4746]: I0128 21:05:01.905701 4746 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-x7ptr" event={"ID":"ed8a3948-98ae-4e2a-a9f8-435287fc9583","Type":"ContainerStarted","Data":"c1be310356611b73e569e6c7b288ac4052e90ea7ff0ab4a72356cfccf84f28a2"} Jan 28 21:05:01 crc kubenswrapper[4746]: I0128 21:05:01.906035 4746 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-x7ptr" 
event={"ID":"ed8a3948-98ae-4e2a-a9f8-435287fc9583","Type":"ContainerStarted","Data":"f7b0227070af981c6647962485fdf02455245a7b5293c6700a6926d58b2e7157"} Jan 28 21:05:01 crc kubenswrapper[4746]: I0128 21:05:01.930610 4746 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-x7ptr" podStartSLOduration=2.526305077 podStartE2EDuration="2.930591097s" podCreationTimestamp="2026-01-28 21:04:59 +0000 UTC" firstStartedPulling="2026-01-28 21:05:00.989921423 +0000 UTC m=+1528.946107817" lastFinishedPulling="2026-01-28 21:05:01.394207473 +0000 UTC m=+1529.350393837" observedRunningTime="2026-01-28 21:05:01.923555936 +0000 UTC m=+1529.879742300" watchObservedRunningTime="2026-01-28 21:05:01.930591097 +0000 UTC m=+1529.886777451" Jan 28 21:05:33 crc kubenswrapper[4746]: I0128 21:05:33.181572 4746 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/community-operators-gl5gj"] Jan 28 21:05:33 crc kubenswrapper[4746]: I0128 21:05:33.184744 4746 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-gl5gj" Jan 28 21:05:33 crc kubenswrapper[4746]: I0128 21:05:33.195720 4746 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-gl5gj"] Jan 28 21:05:33 crc kubenswrapper[4746]: I0128 21:05:33.303603 4746 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-x9k9h\" (UniqueName: \"kubernetes.io/projected/7f7c8907-8450-4217-a028-fac47f4644bf-kube-api-access-x9k9h\") pod \"community-operators-gl5gj\" (UID: \"7f7c8907-8450-4217-a028-fac47f4644bf\") " pod="openshift-marketplace/community-operators-gl5gj" Jan 28 21:05:33 crc kubenswrapper[4746]: I0128 21:05:33.303734 4746 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/7f7c8907-8450-4217-a028-fac47f4644bf-catalog-content\") pod \"community-operators-gl5gj\" (UID: \"7f7c8907-8450-4217-a028-fac47f4644bf\") " pod="openshift-marketplace/community-operators-gl5gj" Jan 28 21:05:33 crc kubenswrapper[4746]: I0128 21:05:33.303971 4746 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/7f7c8907-8450-4217-a028-fac47f4644bf-utilities\") pod \"community-operators-gl5gj\" (UID: \"7f7c8907-8450-4217-a028-fac47f4644bf\") " pod="openshift-marketplace/community-operators-gl5gj" Jan 28 21:05:33 crc kubenswrapper[4746]: I0128 21:05:33.407596 4746 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-x9k9h\" (UniqueName: \"kubernetes.io/projected/7f7c8907-8450-4217-a028-fac47f4644bf-kube-api-access-x9k9h\") pod \"community-operators-gl5gj\" (UID: \"7f7c8907-8450-4217-a028-fac47f4644bf\") " pod="openshift-marketplace/community-operators-gl5gj" Jan 28 21:05:33 crc kubenswrapper[4746]: I0128 21:05:33.407732 4746 
reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/7f7c8907-8450-4217-a028-fac47f4644bf-catalog-content\") pod \"community-operators-gl5gj\" (UID: \"7f7c8907-8450-4217-a028-fac47f4644bf\") " pod="openshift-marketplace/community-operators-gl5gj" Jan 28 21:05:33 crc kubenswrapper[4746]: I0128 21:05:33.407819 4746 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/7f7c8907-8450-4217-a028-fac47f4644bf-utilities\") pod \"community-operators-gl5gj\" (UID: \"7f7c8907-8450-4217-a028-fac47f4644bf\") " pod="openshift-marketplace/community-operators-gl5gj" Jan 28 21:05:33 crc kubenswrapper[4746]: I0128 21:05:33.408429 4746 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/7f7c8907-8450-4217-a028-fac47f4644bf-utilities\") pod \"community-operators-gl5gj\" (UID: \"7f7c8907-8450-4217-a028-fac47f4644bf\") " pod="openshift-marketplace/community-operators-gl5gj" Jan 28 21:05:33 crc kubenswrapper[4746]: I0128 21:05:33.408578 4746 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/7f7c8907-8450-4217-a028-fac47f4644bf-catalog-content\") pod \"community-operators-gl5gj\" (UID: \"7f7c8907-8450-4217-a028-fac47f4644bf\") " pod="openshift-marketplace/community-operators-gl5gj" Jan 28 21:05:33 crc kubenswrapper[4746]: I0128 21:05:33.433663 4746 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-x9k9h\" (UniqueName: \"kubernetes.io/projected/7f7c8907-8450-4217-a028-fac47f4644bf-kube-api-access-x9k9h\") pod \"community-operators-gl5gj\" (UID: \"7f7c8907-8450-4217-a028-fac47f4644bf\") " pod="openshift-marketplace/community-operators-gl5gj" Jan 28 21:05:33 crc kubenswrapper[4746]: I0128 21:05:33.521190 4746 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-gl5gj" Jan 28 21:05:34 crc kubenswrapper[4746]: I0128 21:05:34.076012 4746 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-gl5gj"] Jan 28 21:05:34 crc kubenswrapper[4746]: W0128 21:05:34.091742 4746 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod7f7c8907_8450_4217_a028_fac47f4644bf.slice/crio-9571656cfc166ddb19a528f5d3062314de1b508eeac757f1f1770196b154a0f2 WatchSource:0}: Error finding container 9571656cfc166ddb19a528f5d3062314de1b508eeac757f1f1770196b154a0f2: Status 404 returned error can't find the container with id 9571656cfc166ddb19a528f5d3062314de1b508eeac757f1f1770196b154a0f2 Jan 28 21:05:34 crc kubenswrapper[4746]: I0128 21:05:34.306292 4746 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-gl5gj" event={"ID":"7f7c8907-8450-4217-a028-fac47f4644bf","Type":"ContainerStarted","Data":"8e0f9e52489344bbeccaf213b5330aa1eb09b13a274a24d7bdf5041ffea513dc"} Jan 28 21:05:34 crc kubenswrapper[4746]: I0128 21:05:34.306369 4746 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-gl5gj" event={"ID":"7f7c8907-8450-4217-a028-fac47f4644bf","Type":"ContainerStarted","Data":"9571656cfc166ddb19a528f5d3062314de1b508eeac757f1f1770196b154a0f2"} Jan 28 21:05:35 crc kubenswrapper[4746]: I0128 21:05:35.320023 4746 generic.go:334] "Generic (PLEG): container finished" podID="7f7c8907-8450-4217-a028-fac47f4644bf" containerID="8e0f9e52489344bbeccaf213b5330aa1eb09b13a274a24d7bdf5041ffea513dc" exitCode=0 Jan 28 21:05:35 crc kubenswrapper[4746]: I0128 21:05:35.320111 4746 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-gl5gj" 
event={"ID":"7f7c8907-8450-4217-a028-fac47f4644bf","Type":"ContainerDied","Data":"8e0f9e52489344bbeccaf213b5330aa1eb09b13a274a24d7bdf5041ffea513dc"} Jan 28 21:05:35 crc kubenswrapper[4746]: I0128 21:05:35.322799 4746 provider.go:102] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Jan 28 21:05:36 crc kubenswrapper[4746]: I0128 21:05:36.333039 4746 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-gl5gj" event={"ID":"7f7c8907-8450-4217-a028-fac47f4644bf","Type":"ContainerStarted","Data":"0a1fa61f189149a8b084880fd603c313f193f34cf133bb289ff3f99ae5076c8d"} Jan 28 21:05:38 crc kubenswrapper[4746]: I0128 21:05:38.364295 4746 generic.go:334] "Generic (PLEG): container finished" podID="7f7c8907-8450-4217-a028-fac47f4644bf" containerID="0a1fa61f189149a8b084880fd603c313f193f34cf133bb289ff3f99ae5076c8d" exitCode=0 Jan 28 21:05:38 crc kubenswrapper[4746]: I0128 21:05:38.364352 4746 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-gl5gj" event={"ID":"7f7c8907-8450-4217-a028-fac47f4644bf","Type":"ContainerDied","Data":"0a1fa61f189149a8b084880fd603c313f193f34cf133bb289ff3f99ae5076c8d"} Jan 28 21:05:39 crc kubenswrapper[4746]: I0128 21:05:39.385854 4746 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-gl5gj" event={"ID":"7f7c8907-8450-4217-a028-fac47f4644bf","Type":"ContainerStarted","Data":"6eb59019cc87b67715da920a19f82941539d38e75652506890f817e5460f81a8"} Jan 28 21:05:39 crc kubenswrapper[4746]: I0128 21:05:39.424821 4746 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/community-operators-gl5gj" podStartSLOduration=2.991348791 podStartE2EDuration="6.424799276s" podCreationTimestamp="2026-01-28 21:05:33 +0000 UTC" firstStartedPulling="2026-01-28 21:05:35.322433911 +0000 UTC m=+1563.278620275" lastFinishedPulling="2026-01-28 21:05:38.755884366 +0000 UTC 
m=+1566.712070760" observedRunningTime="2026-01-28 21:05:39.419136253 +0000 UTC m=+1567.375322617" watchObservedRunningTime="2026-01-28 21:05:39.424799276 +0000 UTC m=+1567.380985630" Jan 28 21:05:43 crc kubenswrapper[4746]: I0128 21:05:43.521919 4746 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/community-operators-gl5gj" Jan 28 21:05:43 crc kubenswrapper[4746]: I0128 21:05:43.522652 4746 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/community-operators-gl5gj" Jan 28 21:05:43 crc kubenswrapper[4746]: I0128 21:05:43.569889 4746 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/community-operators-gl5gj" Jan 28 21:05:44 crc kubenswrapper[4746]: I0128 21:05:44.266339 4746 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/certified-operators-nfn49"] Jan 28 21:05:44 crc kubenswrapper[4746]: I0128 21:05:44.269473 4746 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-nfn49"
Jan 28 21:05:44 crc kubenswrapper[4746]: I0128 21:05:44.307626 4746 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-nfn49"]
Jan 28 21:05:44 crc kubenswrapper[4746]: I0128 21:05:44.344688 4746 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/07d1ee4c-0b00-45b6-9fda-ee32dd6149c6-catalog-content\") pod \"certified-operators-nfn49\" (UID: \"07d1ee4c-0b00-45b6-9fda-ee32dd6149c6\") " pod="openshift-marketplace/certified-operators-nfn49"
Jan 28 21:05:44 crc kubenswrapper[4746]: I0128 21:05:44.344896 4746 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/07d1ee4c-0b00-45b6-9fda-ee32dd6149c6-utilities\") pod \"certified-operators-nfn49\" (UID: \"07d1ee4c-0b00-45b6-9fda-ee32dd6149c6\") " pod="openshift-marketplace/certified-operators-nfn49"
Jan 28 21:05:44 crc kubenswrapper[4746]: I0128 21:05:44.344987 4746 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-wdsqb\" (UniqueName: \"kubernetes.io/projected/07d1ee4c-0b00-45b6-9fda-ee32dd6149c6-kube-api-access-wdsqb\") pod \"certified-operators-nfn49\" (UID: \"07d1ee4c-0b00-45b6-9fda-ee32dd6149c6\") " pod="openshift-marketplace/certified-operators-nfn49"
Jan 28 21:05:44 crc kubenswrapper[4746]: I0128 21:05:44.446981 4746 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/07d1ee4c-0b00-45b6-9fda-ee32dd6149c6-utilities\") pod \"certified-operators-nfn49\" (UID: \"07d1ee4c-0b00-45b6-9fda-ee32dd6149c6\") " pod="openshift-marketplace/certified-operators-nfn49"
Jan 28 21:05:44 crc kubenswrapper[4746]: I0128 21:05:44.447069 4746 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-wdsqb\" (UniqueName: \"kubernetes.io/projected/07d1ee4c-0b00-45b6-9fda-ee32dd6149c6-kube-api-access-wdsqb\") pod \"certified-operators-nfn49\" (UID: \"07d1ee4c-0b00-45b6-9fda-ee32dd6149c6\") " pod="openshift-marketplace/certified-operators-nfn49"
Jan 28 21:05:44 crc kubenswrapper[4746]: I0128 21:05:44.447170 4746 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/07d1ee4c-0b00-45b6-9fda-ee32dd6149c6-catalog-content\") pod \"certified-operators-nfn49\" (UID: \"07d1ee4c-0b00-45b6-9fda-ee32dd6149c6\") " pod="openshift-marketplace/certified-operators-nfn49"
Jan 28 21:05:44 crc kubenswrapper[4746]: I0128 21:05:44.447641 4746 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/07d1ee4c-0b00-45b6-9fda-ee32dd6149c6-catalog-content\") pod \"certified-operators-nfn49\" (UID: \"07d1ee4c-0b00-45b6-9fda-ee32dd6149c6\") " pod="openshift-marketplace/certified-operators-nfn49"
Jan 28 21:05:44 crc kubenswrapper[4746]: I0128 21:05:44.447860 4746 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/07d1ee4c-0b00-45b6-9fda-ee32dd6149c6-utilities\") pod \"certified-operators-nfn49\" (UID: \"07d1ee4c-0b00-45b6-9fda-ee32dd6149c6\") " pod="openshift-marketplace/certified-operators-nfn49"
Jan 28 21:05:44 crc kubenswrapper[4746]: I0128 21:05:44.485047 4746 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-wdsqb\" (UniqueName: \"kubernetes.io/projected/07d1ee4c-0b00-45b6-9fda-ee32dd6149c6-kube-api-access-wdsqb\") pod \"certified-operators-nfn49\" (UID: \"07d1ee4c-0b00-45b6-9fda-ee32dd6149c6\") " pod="openshift-marketplace/certified-operators-nfn49"
Jan 28 21:05:44 crc kubenswrapper[4746]: I0128 21:05:44.500843 4746 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/community-operators-gl5gj"
Jan 28 21:05:44 crc kubenswrapper[4746]: I0128 21:05:44.626440 4746 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-nfn49"
Jan 28 21:05:45 crc kubenswrapper[4746]: I0128 21:05:45.112963 4746 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-nfn49"]
Jan 28 21:05:45 crc kubenswrapper[4746]: I0128 21:05:45.454833 4746 generic.go:334] "Generic (PLEG): container finished" podID="07d1ee4c-0b00-45b6-9fda-ee32dd6149c6" containerID="b821d21f91c2cf41368010de7fdc91b5b11d545421e4a60133a888232bf76e6c" exitCode=0
Jan 28 21:05:45 crc kubenswrapper[4746]: I0128 21:05:45.454922 4746 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-nfn49" event={"ID":"07d1ee4c-0b00-45b6-9fda-ee32dd6149c6","Type":"ContainerDied","Data":"b821d21f91c2cf41368010de7fdc91b5b11d545421e4a60133a888232bf76e6c"}
Jan 28 21:05:45 crc kubenswrapper[4746]: I0128 21:05:45.455164 4746 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-nfn49" event={"ID":"07d1ee4c-0b00-45b6-9fda-ee32dd6149c6","Type":"ContainerStarted","Data":"557f5299cd99808ac57104f0afd23ab5b8c59693b3676589787351fad9a4a811"}
Jan 28 21:05:46 crc kubenswrapper[4746]: I0128 21:05:46.469641 4746 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-nfn49" event={"ID":"07d1ee4c-0b00-45b6-9fda-ee32dd6149c6","Type":"ContainerStarted","Data":"8b3d7557c695890ceecd1cf263e92545a57452a004067734ca10ff47056a1e89"}
Jan 28 21:05:46 crc kubenswrapper[4746]: I0128 21:05:46.817003 4746 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-gl5gj"]
Jan 28 21:05:46 crc kubenswrapper[4746]: I0128 21:05:46.817293 4746 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/community-operators-gl5gj" podUID="7f7c8907-8450-4217-a028-fac47f4644bf" containerName="registry-server" containerID="cri-o://6eb59019cc87b67715da920a19f82941539d38e75652506890f817e5460f81a8" gracePeriod=2
Jan 28 21:05:47 crc kubenswrapper[4746]: I0128 21:05:47.471311 4746 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-gl5gj"
Jan 28 21:05:47 crc kubenswrapper[4746]: I0128 21:05:47.483972 4746 generic.go:334] "Generic (PLEG): container finished" podID="7f7c8907-8450-4217-a028-fac47f4644bf" containerID="6eb59019cc87b67715da920a19f82941539d38e75652506890f817e5460f81a8" exitCode=0
Jan 28 21:05:47 crc kubenswrapper[4746]: I0128 21:05:47.484026 4746 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-gl5gj"
Jan 28 21:05:47 crc kubenswrapper[4746]: I0128 21:05:47.484184 4746 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-gl5gj" event={"ID":"7f7c8907-8450-4217-a028-fac47f4644bf","Type":"ContainerDied","Data":"6eb59019cc87b67715da920a19f82941539d38e75652506890f817e5460f81a8"}
Jan 28 21:05:47 crc kubenswrapper[4746]: I0128 21:05:47.484218 4746 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-gl5gj" event={"ID":"7f7c8907-8450-4217-a028-fac47f4644bf","Type":"ContainerDied","Data":"9571656cfc166ddb19a528f5d3062314de1b508eeac757f1f1770196b154a0f2"}
Jan 28 21:05:47 crc kubenswrapper[4746]: I0128 21:05:47.484239 4746 scope.go:117] "RemoveContainer" containerID="6eb59019cc87b67715da920a19f82941539d38e75652506890f817e5460f81a8"
Jan 28 21:05:47 crc kubenswrapper[4746]: I0128 21:05:47.512619 4746 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/7f7c8907-8450-4217-a028-fac47f4644bf-catalog-content\") pod \"7f7c8907-8450-4217-a028-fac47f4644bf\" (UID: \"7f7c8907-8450-4217-a028-fac47f4644bf\") "
Jan 28 21:05:47 crc kubenswrapper[4746]: I0128 21:05:47.512712 4746 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-x9k9h\" (UniqueName: \"kubernetes.io/projected/7f7c8907-8450-4217-a028-fac47f4644bf-kube-api-access-x9k9h\") pod \"7f7c8907-8450-4217-a028-fac47f4644bf\" (UID: \"7f7c8907-8450-4217-a028-fac47f4644bf\") "
Jan 28 21:05:47 crc kubenswrapper[4746]: I0128 21:05:47.512766 4746 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/7f7c8907-8450-4217-a028-fac47f4644bf-utilities\") pod \"7f7c8907-8450-4217-a028-fac47f4644bf\" (UID: \"7f7c8907-8450-4217-a028-fac47f4644bf\") "
Jan 28 21:05:47 crc kubenswrapper[4746]: I0128 21:05:47.513684 4746 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/7f7c8907-8450-4217-a028-fac47f4644bf-utilities" (OuterVolumeSpecName: "utilities") pod "7f7c8907-8450-4217-a028-fac47f4644bf" (UID: "7f7c8907-8450-4217-a028-fac47f4644bf"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Jan 28 21:05:47 crc kubenswrapper[4746]: I0128 21:05:47.519414 4746 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/7f7c8907-8450-4217-a028-fac47f4644bf-kube-api-access-x9k9h" (OuterVolumeSpecName: "kube-api-access-x9k9h") pod "7f7c8907-8450-4217-a028-fac47f4644bf" (UID: "7f7c8907-8450-4217-a028-fac47f4644bf"). InnerVolumeSpecName "kube-api-access-x9k9h". PluginName "kubernetes.io/projected", VolumeGidValue ""
Jan 28 21:05:47 crc kubenswrapper[4746]: I0128 21:05:47.520025 4746 scope.go:117] "RemoveContainer" containerID="0a1fa61f189149a8b084880fd603c313f193f34cf133bb289ff3f99ae5076c8d"
Jan 28 21:05:47 crc kubenswrapper[4746]: I0128 21:05:47.559527 4746 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/7f7c8907-8450-4217-a028-fac47f4644bf-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "7f7c8907-8450-4217-a028-fac47f4644bf" (UID: "7f7c8907-8450-4217-a028-fac47f4644bf"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Jan 28 21:05:47 crc kubenswrapper[4746]: I0128 21:05:47.593752 4746 scope.go:117] "RemoveContainer" containerID="8e0f9e52489344bbeccaf213b5330aa1eb09b13a274a24d7bdf5041ffea513dc"
Jan 28 21:05:47 crc kubenswrapper[4746]: I0128 21:05:47.615420 4746 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/7f7c8907-8450-4217-a028-fac47f4644bf-catalog-content\") on node \"crc\" DevicePath \"\""
Jan 28 21:05:47 crc kubenswrapper[4746]: I0128 21:05:47.615453 4746 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-x9k9h\" (UniqueName: \"kubernetes.io/projected/7f7c8907-8450-4217-a028-fac47f4644bf-kube-api-access-x9k9h\") on node \"crc\" DevicePath \"\""
Jan 28 21:05:47 crc kubenswrapper[4746]: I0128 21:05:47.615467 4746 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/7f7c8907-8450-4217-a028-fac47f4644bf-utilities\") on node \"crc\" DevicePath \"\""
Jan 28 21:05:47 crc kubenswrapper[4746]: I0128 21:05:47.651020 4746 scope.go:117] "RemoveContainer" containerID="6eb59019cc87b67715da920a19f82941539d38e75652506890f817e5460f81a8"
Jan 28 21:05:47 crc kubenswrapper[4746]: E0128 21:05:47.651576 4746 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"6eb59019cc87b67715da920a19f82941539d38e75652506890f817e5460f81a8\": container with ID starting with 6eb59019cc87b67715da920a19f82941539d38e75652506890f817e5460f81a8 not found: ID does not exist" containerID="6eb59019cc87b67715da920a19f82941539d38e75652506890f817e5460f81a8"
Jan 28 21:05:47 crc kubenswrapper[4746]: I0128 21:05:47.651619 4746 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"6eb59019cc87b67715da920a19f82941539d38e75652506890f817e5460f81a8"} err="failed to get container status \"6eb59019cc87b67715da920a19f82941539d38e75652506890f817e5460f81a8\": rpc error: code = NotFound desc = could not find container \"6eb59019cc87b67715da920a19f82941539d38e75652506890f817e5460f81a8\": container with ID starting with 6eb59019cc87b67715da920a19f82941539d38e75652506890f817e5460f81a8 not found: ID does not exist"
Jan 28 21:05:47 crc kubenswrapper[4746]: I0128 21:05:47.651647 4746 scope.go:117] "RemoveContainer" containerID="0a1fa61f189149a8b084880fd603c313f193f34cf133bb289ff3f99ae5076c8d"
Jan 28 21:05:47 crc kubenswrapper[4746]: E0128 21:05:47.651986 4746 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"0a1fa61f189149a8b084880fd603c313f193f34cf133bb289ff3f99ae5076c8d\": container with ID starting with 0a1fa61f189149a8b084880fd603c313f193f34cf133bb289ff3f99ae5076c8d not found: ID does not exist" containerID="0a1fa61f189149a8b084880fd603c313f193f34cf133bb289ff3f99ae5076c8d"
Jan 28 21:05:47 crc kubenswrapper[4746]: I0128 21:05:47.652036 4746 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"0a1fa61f189149a8b084880fd603c313f193f34cf133bb289ff3f99ae5076c8d"} err="failed to get container status \"0a1fa61f189149a8b084880fd603c313f193f34cf133bb289ff3f99ae5076c8d\": rpc error: code = NotFound desc = could not find container \"0a1fa61f189149a8b084880fd603c313f193f34cf133bb289ff3f99ae5076c8d\": container with ID starting with 0a1fa61f189149a8b084880fd603c313f193f34cf133bb289ff3f99ae5076c8d not found: ID does not exist"
Jan 28 21:05:47 crc kubenswrapper[4746]: I0128 21:05:47.652068 4746 scope.go:117] "RemoveContainer" containerID="8e0f9e52489344bbeccaf213b5330aa1eb09b13a274a24d7bdf5041ffea513dc"
Jan 28 21:05:47 crc kubenswrapper[4746]: E0128 21:05:47.652605 4746 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"8e0f9e52489344bbeccaf213b5330aa1eb09b13a274a24d7bdf5041ffea513dc\": container with ID starting with 8e0f9e52489344bbeccaf213b5330aa1eb09b13a274a24d7bdf5041ffea513dc not found: ID does not exist" containerID="8e0f9e52489344bbeccaf213b5330aa1eb09b13a274a24d7bdf5041ffea513dc"
Jan 28 21:05:47 crc kubenswrapper[4746]: I0128 21:05:47.652631 4746 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"8e0f9e52489344bbeccaf213b5330aa1eb09b13a274a24d7bdf5041ffea513dc"} err="failed to get container status \"8e0f9e52489344bbeccaf213b5330aa1eb09b13a274a24d7bdf5041ffea513dc\": rpc error: code = NotFound desc = could not find container \"8e0f9e52489344bbeccaf213b5330aa1eb09b13a274a24d7bdf5041ffea513dc\": container with ID starting with 8e0f9e52489344bbeccaf213b5330aa1eb09b13a274a24d7bdf5041ffea513dc not found: ID does not exist"
Jan 28 21:05:47 crc kubenswrapper[4746]: I0128 21:05:47.839574 4746 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-gl5gj"]
Jan 28 21:05:47 crc kubenswrapper[4746]: I0128 21:05:47.848675 4746 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/community-operators-gl5gj"]
Jan 28 21:05:48 crc kubenswrapper[4746]: I0128 21:05:48.496955 4746 generic.go:334] "Generic (PLEG): container finished" podID="07d1ee4c-0b00-45b6-9fda-ee32dd6149c6" containerID="8b3d7557c695890ceecd1cf263e92545a57452a004067734ca10ff47056a1e89" exitCode=0
Jan 28 21:05:48 crc kubenswrapper[4746]: I0128 21:05:48.497028 4746 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-nfn49" event={"ID":"07d1ee4c-0b00-45b6-9fda-ee32dd6149c6","Type":"ContainerDied","Data":"8b3d7557c695890ceecd1cf263e92545a57452a004067734ca10ff47056a1e89"}
Jan 28 21:05:48 crc kubenswrapper[4746]: I0128 21:05:48.877980 4746 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="7f7c8907-8450-4217-a028-fac47f4644bf" path="/var/lib/kubelet/pods/7f7c8907-8450-4217-a028-fac47f4644bf/volumes"
Jan 28 21:05:49 crc kubenswrapper[4746]: I0128 21:05:49.507739 4746 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-nfn49" event={"ID":"07d1ee4c-0b00-45b6-9fda-ee32dd6149c6","Type":"ContainerStarted","Data":"8d46f0a2efa20cac68f1b93a5000076f5fc01cf17207240811d8de5e44f1e37a"}
Jan 28 21:05:54 crc kubenswrapper[4746]: I0128 21:05:54.627645 4746 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/certified-operators-nfn49"
Jan 28 21:05:54 crc kubenswrapper[4746]: I0128 21:05:54.628255 4746 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/certified-operators-nfn49"
Jan 28 21:05:54 crc kubenswrapper[4746]: I0128 21:05:54.690039 4746 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/certified-operators-nfn49"
Jan 28 21:05:54 crc kubenswrapper[4746]: I0128 21:05:54.720121 4746 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/certified-operators-nfn49" podStartSLOduration=7.251784896 podStartE2EDuration="10.720066732s" podCreationTimestamp="2026-01-28 21:05:44 +0000 UTC" firstStartedPulling="2026-01-28 21:05:45.456364837 +0000 UTC m=+1573.412551191" lastFinishedPulling="2026-01-28 21:05:48.924646673 +0000 UTC m=+1576.880833027" observedRunningTime="2026-01-28 21:05:49.531974779 +0000 UTC m=+1577.488161143" watchObservedRunningTime="2026-01-28 21:05:54.720066732 +0000 UTC m=+1582.676253106"
Jan 28 21:05:55 crc kubenswrapper[4746]: I0128 21:05:55.627200 4746 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/certified-operators-nfn49"
Jan 28 21:05:55 crc kubenswrapper[4746]: I0128 21:05:55.684304 4746 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-nfn49"]
Jan 28 21:05:56 crc kubenswrapper[4746]: I0128 21:05:56.006432 4746 scope.go:117] "RemoveContainer" containerID="c22169d83f3f2782b74332c459ce50634172feb1cacb9d57d5cbf093c2c11259"
Jan 28 21:05:56 crc kubenswrapper[4746]: I0128 21:05:56.042262 4746 scope.go:117] "RemoveContainer" containerID="3c603454762a8447520867b8a7b5bc8f97ead8fd9b37d66bcdf6542f41afe2d5"
Jan 28 21:05:56 crc kubenswrapper[4746]: I0128 21:05:56.110169 4746 scope.go:117] "RemoveContainer" containerID="5ba395585c8aa92dd55ae0dedeea33152ce0bd55c3762da0432ff57b87331f99"
Jan 28 21:05:56 crc kubenswrapper[4746]: I0128 21:05:56.157380 4746 scope.go:117] "RemoveContainer" containerID="029111cbc3349dbbb9ed934ead4096eddb944845f11de68a6c2f8df6bb8f1d66"
Jan 28 21:05:56 crc kubenswrapper[4746]: I0128 21:05:56.212520 4746 scope.go:117] "RemoveContainer" containerID="67006d1c6795615fb8ceda95b35ade7bd11941f66b5f0268526602e315e60f7e"
Jan 28 21:05:56 crc kubenswrapper[4746]: I0128 21:05:56.249121 4746 scope.go:117] "RemoveContainer" containerID="745dea18a1ae2a9ea69ab20defd44eb5a20396c2fb60159aa14e6d18d0ea07b8"
Jan 28 21:05:57 crc kubenswrapper[4746]: I0128 21:05:57.593911 4746 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/certified-operators-nfn49" podUID="07d1ee4c-0b00-45b6-9fda-ee32dd6149c6" containerName="registry-server" containerID="cri-o://8d46f0a2efa20cac68f1b93a5000076f5fc01cf17207240811d8de5e44f1e37a" gracePeriod=2
Jan 28 21:05:58 crc kubenswrapper[4746]: I0128 21:05:58.186325 4746 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-nfn49"
Jan 28 21:05:58 crc kubenswrapper[4746]: I0128 21:05:58.249148 4746 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-wdsqb\" (UniqueName: \"kubernetes.io/projected/07d1ee4c-0b00-45b6-9fda-ee32dd6149c6-kube-api-access-wdsqb\") pod \"07d1ee4c-0b00-45b6-9fda-ee32dd6149c6\" (UID: \"07d1ee4c-0b00-45b6-9fda-ee32dd6149c6\") "
Jan 28 21:05:58 crc kubenswrapper[4746]: I0128 21:05:58.249188 4746 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/07d1ee4c-0b00-45b6-9fda-ee32dd6149c6-catalog-content\") pod \"07d1ee4c-0b00-45b6-9fda-ee32dd6149c6\" (UID: \"07d1ee4c-0b00-45b6-9fda-ee32dd6149c6\") "
Jan 28 21:05:58 crc kubenswrapper[4746]: I0128 21:05:58.249269 4746 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/07d1ee4c-0b00-45b6-9fda-ee32dd6149c6-utilities\") pod \"07d1ee4c-0b00-45b6-9fda-ee32dd6149c6\" (UID: \"07d1ee4c-0b00-45b6-9fda-ee32dd6149c6\") "
Jan 28 21:05:58 crc kubenswrapper[4746]: I0128 21:05:58.250489 4746 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/07d1ee4c-0b00-45b6-9fda-ee32dd6149c6-utilities" (OuterVolumeSpecName: "utilities") pod "07d1ee4c-0b00-45b6-9fda-ee32dd6149c6" (UID: "07d1ee4c-0b00-45b6-9fda-ee32dd6149c6"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Jan 28 21:05:58 crc kubenswrapper[4746]: I0128 21:05:58.255874 4746 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/07d1ee4c-0b00-45b6-9fda-ee32dd6149c6-kube-api-access-wdsqb" (OuterVolumeSpecName: "kube-api-access-wdsqb") pod "07d1ee4c-0b00-45b6-9fda-ee32dd6149c6" (UID: "07d1ee4c-0b00-45b6-9fda-ee32dd6149c6"). InnerVolumeSpecName "kube-api-access-wdsqb". PluginName "kubernetes.io/projected", VolumeGidValue ""
Jan 28 21:05:58 crc kubenswrapper[4746]: I0128 21:05:58.300097 4746 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/07d1ee4c-0b00-45b6-9fda-ee32dd6149c6-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "07d1ee4c-0b00-45b6-9fda-ee32dd6149c6" (UID: "07d1ee4c-0b00-45b6-9fda-ee32dd6149c6"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Jan 28 21:05:58 crc kubenswrapper[4746]: I0128 21:05:58.351202 4746 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-wdsqb\" (UniqueName: \"kubernetes.io/projected/07d1ee4c-0b00-45b6-9fda-ee32dd6149c6-kube-api-access-wdsqb\") on node \"crc\" DevicePath \"\""
Jan 28 21:05:58 crc kubenswrapper[4746]: I0128 21:05:58.351248 4746 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/07d1ee4c-0b00-45b6-9fda-ee32dd6149c6-catalog-content\") on node \"crc\" DevicePath \"\""
Jan 28 21:05:58 crc kubenswrapper[4746]: I0128 21:05:58.351265 4746 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/07d1ee4c-0b00-45b6-9fda-ee32dd6149c6-utilities\") on node \"crc\" DevicePath \"\""
Jan 28 21:05:58 crc kubenswrapper[4746]: I0128 21:05:58.607357 4746 generic.go:334] "Generic (PLEG): container finished" podID="07d1ee4c-0b00-45b6-9fda-ee32dd6149c6" containerID="8d46f0a2efa20cac68f1b93a5000076f5fc01cf17207240811d8de5e44f1e37a" exitCode=0
Jan 28 21:05:58 crc kubenswrapper[4746]: I0128 21:05:58.607402 4746 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-nfn49" event={"ID":"07d1ee4c-0b00-45b6-9fda-ee32dd6149c6","Type":"ContainerDied","Data":"8d46f0a2efa20cac68f1b93a5000076f5fc01cf17207240811d8de5e44f1e37a"}
Jan 28 21:05:58 crc kubenswrapper[4746]: I0128 21:05:58.607428 4746 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-nfn49" event={"ID":"07d1ee4c-0b00-45b6-9fda-ee32dd6149c6","Type":"ContainerDied","Data":"557f5299cd99808ac57104f0afd23ab5b8c59693b3676589787351fad9a4a811"}
Jan 28 21:05:58 crc kubenswrapper[4746]: I0128 21:05:58.607447 4746 scope.go:117] "RemoveContainer" containerID="8d46f0a2efa20cac68f1b93a5000076f5fc01cf17207240811d8de5e44f1e37a"
Jan 28 21:05:58 crc kubenswrapper[4746]: I0128 21:05:58.607568 4746 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-nfn49"
Jan 28 21:05:58 crc kubenswrapper[4746]: I0128 21:05:58.641975 4746 scope.go:117] "RemoveContainer" containerID="8b3d7557c695890ceecd1cf263e92545a57452a004067734ca10ff47056a1e89"
Jan 28 21:05:58 crc kubenswrapper[4746]: I0128 21:05:58.649440 4746 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-nfn49"]
Jan 28 21:05:58 crc kubenswrapper[4746]: I0128 21:05:58.660669 4746 scope.go:117] "RemoveContainer" containerID="b821d21f91c2cf41368010de7fdc91b5b11d545421e4a60133a888232bf76e6c"
Jan 28 21:05:58 crc kubenswrapper[4746]: I0128 21:05:58.664904 4746 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/certified-operators-nfn49"]
Jan 28 21:05:58 crc kubenswrapper[4746]: I0128 21:05:58.722933 4746 scope.go:117] "RemoveContainer" containerID="8d46f0a2efa20cac68f1b93a5000076f5fc01cf17207240811d8de5e44f1e37a"
Jan 28 21:05:58 crc kubenswrapper[4746]: E0128 21:05:58.723373 4746 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"8d46f0a2efa20cac68f1b93a5000076f5fc01cf17207240811d8de5e44f1e37a\": container with ID starting with 8d46f0a2efa20cac68f1b93a5000076f5fc01cf17207240811d8de5e44f1e37a not found: ID does not exist" containerID="8d46f0a2efa20cac68f1b93a5000076f5fc01cf17207240811d8de5e44f1e37a"
Jan 28 21:05:58 crc kubenswrapper[4746]: I0128 21:05:58.723407 4746 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"8d46f0a2efa20cac68f1b93a5000076f5fc01cf17207240811d8de5e44f1e37a"} err="failed to get container status \"8d46f0a2efa20cac68f1b93a5000076f5fc01cf17207240811d8de5e44f1e37a\": rpc error: code = NotFound desc = could not find container \"8d46f0a2efa20cac68f1b93a5000076f5fc01cf17207240811d8de5e44f1e37a\": container with ID starting with 8d46f0a2efa20cac68f1b93a5000076f5fc01cf17207240811d8de5e44f1e37a not found: ID does not exist"
Jan 28 21:05:58 crc kubenswrapper[4746]: I0128 21:05:58.723432 4746 scope.go:117] "RemoveContainer" containerID="8b3d7557c695890ceecd1cf263e92545a57452a004067734ca10ff47056a1e89"
Jan 28 21:05:58 crc kubenswrapper[4746]: E0128 21:05:58.723695 4746 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"8b3d7557c695890ceecd1cf263e92545a57452a004067734ca10ff47056a1e89\": container with ID starting with 8b3d7557c695890ceecd1cf263e92545a57452a004067734ca10ff47056a1e89 not found: ID does not exist" containerID="8b3d7557c695890ceecd1cf263e92545a57452a004067734ca10ff47056a1e89"
Jan 28 21:05:58 crc kubenswrapper[4746]: I0128 21:05:58.723743 4746 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"8b3d7557c695890ceecd1cf263e92545a57452a004067734ca10ff47056a1e89"} err="failed to get container status \"8b3d7557c695890ceecd1cf263e92545a57452a004067734ca10ff47056a1e89\": rpc error: code = NotFound desc = could not find container \"8b3d7557c695890ceecd1cf263e92545a57452a004067734ca10ff47056a1e89\": container with ID starting with 8b3d7557c695890ceecd1cf263e92545a57452a004067734ca10ff47056a1e89 not found: ID does not exist"
Jan 28 21:05:58 crc kubenswrapper[4746]: I0128 21:05:58.723792 4746 scope.go:117] "RemoveContainer" containerID="b821d21f91c2cf41368010de7fdc91b5b11d545421e4a60133a888232bf76e6c"
Jan 28 21:05:58 crc kubenswrapper[4746]: E0128 21:05:58.724110 4746 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"b821d21f91c2cf41368010de7fdc91b5b11d545421e4a60133a888232bf76e6c\": container with ID starting with b821d21f91c2cf41368010de7fdc91b5b11d545421e4a60133a888232bf76e6c not found: ID does not exist" containerID="b821d21f91c2cf41368010de7fdc91b5b11d545421e4a60133a888232bf76e6c"
Jan 28 21:05:58 crc kubenswrapper[4746]: I0128 21:05:58.724143 4746 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"b821d21f91c2cf41368010de7fdc91b5b11d545421e4a60133a888232bf76e6c"} err="failed to get container status \"b821d21f91c2cf41368010de7fdc91b5b11d545421e4a60133a888232bf76e6c\": rpc error: code = NotFound desc = could not find container \"b821d21f91c2cf41368010de7fdc91b5b11d545421e4a60133a888232bf76e6c\": container with ID starting with b821d21f91c2cf41368010de7fdc91b5b11d545421e4a60133a888232bf76e6c not found: ID does not exist"
Jan 28 21:05:58 crc kubenswrapper[4746]: I0128 21:05:58.851783 4746 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="07d1ee4c-0b00-45b6-9fda-ee32dd6149c6" path="/var/lib/kubelet/pods/07d1ee4c-0b00-45b6-9fda-ee32dd6149c6/volumes"
Jan 28 21:06:45 crc kubenswrapper[4746]: I0128 21:06:45.892672 4746 patch_prober.go:28] interesting pod/machine-config-daemon-6wrnw container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body=
Jan 28 21:06:45 crc kubenswrapper[4746]: I0128 21:06:45.893249 4746 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-6wrnw" podUID="6dc8b546-9734-4082-b2b3-2bafe3f1564d" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused"
Jan 28 21:06:56 crc kubenswrapper[4746]: I0128 21:06:56.429978 4746 scope.go:117] "RemoveContainer" containerID="40b2d2ea4815db4c2fec0bd255e8c9e9ad80dbc33d89a14cc2fc53328b9606b3"
Jan 28 21:06:56 crc kubenswrapper[4746]: I0128 21:06:56.486391 4746 scope.go:117] "RemoveContainer" containerID="e2f9bc8207b8a6d49eb69dfd58db2119cf145886998787bfafc4df9d95d1eb66"
Jan 28 21:06:56 crc kubenswrapper[4746]: I0128 21:06:56.518169 4746 scope.go:117] "RemoveContainer" containerID="3a0fc6e40b63c3d0054964bf86868df5bae172f80bc0b177d845ac3b30f9697f"
Jan 28 21:07:15 crc kubenswrapper[4746]: I0128 21:07:15.871417 4746 patch_prober.go:28] interesting pod/machine-config-daemon-6wrnw container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body=
Jan 28 21:07:15 crc kubenswrapper[4746]: I0128 21:07:15.872185 4746 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-6wrnw" podUID="6dc8b546-9734-4082-b2b3-2bafe3f1564d" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused"
Jan 28 21:07:45 crc kubenswrapper[4746]: I0128 21:07:45.871607 4746 patch_prober.go:28] interesting pod/machine-config-daemon-6wrnw container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body=
Jan 28 21:07:45 crc kubenswrapper[4746]: I0128 21:07:45.873021 4746 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-6wrnw" podUID="6dc8b546-9734-4082-b2b3-2bafe3f1564d" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused"
Jan 28 21:07:45 crc kubenswrapper[4746]: I0128 21:07:45.873168 4746 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-6wrnw"
Jan 28 21:07:45 crc kubenswrapper[4746]: I0128 21:07:45.874075 4746 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"4ff7d295e236267b904a42699c2e8affd72dcade2a9eeee3b96c8eb13de3f968"} pod="openshift-machine-config-operator/machine-config-daemon-6wrnw" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted"
Jan 28 21:07:45 crc kubenswrapper[4746]: I0128 21:07:45.874295 4746 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-6wrnw" podUID="6dc8b546-9734-4082-b2b3-2bafe3f1564d" containerName="machine-config-daemon" containerID="cri-o://4ff7d295e236267b904a42699c2e8affd72dcade2a9eeee3b96c8eb13de3f968" gracePeriod=600
Jan 28 21:07:46 crc kubenswrapper[4746]: E0128 21:07:46.000884 4746 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-6wrnw_openshift-machine-config-operator(6dc8b546-9734-4082-b2b3-2bafe3f1564d)\"" pod="openshift-machine-config-operator/machine-config-daemon-6wrnw" podUID="6dc8b546-9734-4082-b2b3-2bafe3f1564d"
Jan 28 21:07:46 crc kubenswrapper[4746]: I0128 21:07:46.853205 4746 generic.go:334] "Generic (PLEG): container finished" podID="6dc8b546-9734-4082-b2b3-2bafe3f1564d" containerID="4ff7d295e236267b904a42699c2e8affd72dcade2a9eeee3b96c8eb13de3f968" exitCode=0
Jan 28 21:07:46 crc kubenswrapper[4746]: I0128 21:07:46.853456 4746 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-6wrnw" event={"ID":"6dc8b546-9734-4082-b2b3-2bafe3f1564d","Type":"ContainerDied","Data":"4ff7d295e236267b904a42699c2e8affd72dcade2a9eeee3b96c8eb13de3f968"}
Jan 28 21:07:46 crc kubenswrapper[4746]: I0128 21:07:46.853534 4746 scope.go:117] "RemoveContainer" containerID="551b5dbcacfba813c1158522c098223ffafd54f7aa789c2d4402da75877d8079"
Jan 28 21:07:46 crc kubenswrapper[4746]: I0128 21:07:46.854411 4746 scope.go:117] "RemoveContainer" containerID="4ff7d295e236267b904a42699c2e8affd72dcade2a9eeee3b96c8eb13de3f968"
Jan 28 21:07:46 crc kubenswrapper[4746]: E0128 21:07:46.854770 4746 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-6wrnw_openshift-machine-config-operator(6dc8b546-9734-4082-b2b3-2bafe3f1564d)\"" pod="openshift-machine-config-operator/machine-config-daemon-6wrnw" podUID="6dc8b546-9734-4082-b2b3-2bafe3f1564d"
Jan 28 21:07:57 crc kubenswrapper[4746]: I0128 21:07:57.836923 4746 scope.go:117] "RemoveContainer" containerID="4ff7d295e236267b904a42699c2e8affd72dcade2a9eeee3b96c8eb13de3f968"
Jan 28 21:07:57 crc kubenswrapper[4746]: E0128 21:07:57.839167 4746 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-6wrnw_openshift-machine-config-operator(6dc8b546-9734-4082-b2b3-2bafe3f1564d)\"" pod="openshift-machine-config-operator/machine-config-daemon-6wrnw" podUID="6dc8b546-9734-4082-b2b3-2bafe3f1564d"
Jan 28 21:08:00 crc kubenswrapper[4746]: I0128 21:08:00.018685 4746 generic.go:334] "Generic (PLEG): container finished" podID="ed8a3948-98ae-4e2a-a9f8-435287fc9583" containerID="c1be310356611b73e569e6c7b288ac4052e90ea7ff0ab4a72356cfccf84f28a2" exitCode=0
Jan 28 21:08:00 crc kubenswrapper[4746]: I0128 21:08:00.018825 4746 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-x7ptr" event={"ID":"ed8a3948-98ae-4e2a-a9f8-435287fc9583","Type":"ContainerDied","Data":"c1be310356611b73e569e6c7b288ac4052e90ea7ff0ab4a72356cfccf84f28a2"}
Jan 28 21:08:01 crc kubenswrapper[4746]: I0128 21:08:01.573671 4746 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-x7ptr"
Jan 28 21:08:01 crc kubenswrapper[4746]: I0128 21:08:01.728316 4746 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/ed8a3948-98ae-4e2a-a9f8-435287fc9583-inventory\") pod \"ed8a3948-98ae-4e2a-a9f8-435287fc9583\" (UID: \"ed8a3948-98ae-4e2a-a9f8-435287fc9583\") "
Jan 28 21:08:01 crc kubenswrapper[4746]: I0128 21:08:01.728417 4746 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-55hl6\" (UniqueName: \"kubernetes.io/projected/ed8a3948-98ae-4e2a-a9f8-435287fc9583-kube-api-access-55hl6\") pod \"ed8a3948-98ae-4e2a-a9f8-435287fc9583\" (UID: \"ed8a3948-98ae-4e2a-a9f8-435287fc9583\") "
Jan 28 21:08:01 crc kubenswrapper[4746]: I0128 21:08:01.728530 4746 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/ed8a3948-98ae-4e2a-a9f8-435287fc9583-ssh-key-openstack-edpm-ipam\") pod \"ed8a3948-98ae-4e2a-a9f8-435287fc9583\" (UID: \"ed8a3948-98ae-4e2a-a9f8-435287fc9583\") "
Jan 28 21:08:01 crc kubenswrapper[4746]: I0128 21:08:01.728608 4746 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"bootstrap-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/ed8a3948-98ae-4e2a-a9f8-435287fc9583-bootstrap-combined-ca-bundle\") pod \"ed8a3948-98ae-4e2a-a9f8-435287fc9583\" (UID: \"ed8a3948-98ae-4e2a-a9f8-435287fc9583\") "
Jan 28 21:08:01 crc kubenswrapper[4746]: I0128 21:08:01.736208 4746 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/ed8a3948-98ae-4e2a-a9f8-435287fc9583-kube-api-access-55hl6" (OuterVolumeSpecName: "kube-api-access-55hl6") pod "ed8a3948-98ae-4e2a-a9f8-435287fc9583" (UID: "ed8a3948-98ae-4e2a-a9f8-435287fc9583"). InnerVolumeSpecName "kube-api-access-55hl6". PluginName "kubernetes.io/projected", VolumeGidValue ""
Jan 28 21:08:01 crc kubenswrapper[4746]: I0128 21:08:01.740552 4746 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/ed8a3948-98ae-4e2a-a9f8-435287fc9583-bootstrap-combined-ca-bundle" (OuterVolumeSpecName: "bootstrap-combined-ca-bundle") pod "ed8a3948-98ae-4e2a-a9f8-435287fc9583" (UID: "ed8a3948-98ae-4e2a-a9f8-435287fc9583"). InnerVolumeSpecName "bootstrap-combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue ""
Jan 28 21:08:01 crc kubenswrapper[4746]: I0128 21:08:01.760264 4746 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/ed8a3948-98ae-4e2a-a9f8-435287fc9583-ssh-key-openstack-edpm-ipam" (OuterVolumeSpecName: "ssh-key-openstack-edpm-ipam") pod "ed8a3948-98ae-4e2a-a9f8-435287fc9583" (UID: "ed8a3948-98ae-4e2a-a9f8-435287fc9583"). InnerVolumeSpecName "ssh-key-openstack-edpm-ipam". PluginName "kubernetes.io/secret", VolumeGidValue ""
Jan 28 21:08:01 crc kubenswrapper[4746]: I0128 21:08:01.771579 4746 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/ed8a3948-98ae-4e2a-a9f8-435287fc9583-inventory" (OuterVolumeSpecName: "inventory") pod "ed8a3948-98ae-4e2a-a9f8-435287fc9583" (UID: "ed8a3948-98ae-4e2a-a9f8-435287fc9583"). InnerVolumeSpecName "inventory". PluginName "kubernetes.io/secret", VolumeGidValue ""
Jan 28 21:08:01 crc kubenswrapper[4746]: I0128 21:08:01.831037 4746 reconciler_common.go:293] "Volume detached for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/ed8a3948-98ae-4e2a-a9f8-435287fc9583-inventory\") on node \"crc\" DevicePath \"\""
Jan 28 21:08:01 crc kubenswrapper[4746]: I0128 21:08:01.831079 4746 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-55hl6\" (UniqueName: \"kubernetes.io/projected/ed8a3948-98ae-4e2a-a9f8-435287fc9583-kube-api-access-55hl6\") on node \"crc\" DevicePath \"\""
Jan 28 21:08:01 crc kubenswrapper[4746]: I0128 21:08:01.831110 4746 reconciler_common.go:293] "Volume detached for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/ed8a3948-98ae-4e2a-a9f8-435287fc9583-ssh-key-openstack-edpm-ipam\") on node \"crc\" DevicePath \"\""
Jan 28 21:08:01 crc kubenswrapper[4746]: I0128 21:08:01.831125 4746 reconciler_common.go:293] "Volume detached for volume \"bootstrap-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/ed8a3948-98ae-4e2a-a9f8-435287fc9583-bootstrap-combined-ca-bundle\") on node \"crc\" DevicePath \"\""
Jan 28 21:08:02 crc kubenswrapper[4746]: I0128 21:08:02.045835 4746 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-x7ptr" event={"ID":"ed8a3948-98ae-4e2a-a9f8-435287fc9583","Type":"ContainerDied","Data":"f7b0227070af981c6647962485fdf02455245a7b5293c6700a6926d58b2e7157"}
Jan 28 21:08:02 crc kubenswrapper[4746]: I0128 21:08:02.045886 4746 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="f7b0227070af981c6647962485fdf02455245a7b5293c6700a6926d58b2e7157"
Jan 28 21:08:02 crc kubenswrapper[4746]: I0128 21:08:02.045951 4746 util.go:48] "No ready sandbox for pod can be found.
Need to start a new one" pod="openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-x7ptr" Jan 28 21:08:02 crc kubenswrapper[4746]: I0128 21:08:02.134200 4746 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/download-cache-edpm-deployment-openstack-edpm-ipam-jgh8m"] Jan 28 21:08:02 crc kubenswrapper[4746]: E0128 21:08:02.134770 4746 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="7f7c8907-8450-4217-a028-fac47f4644bf" containerName="registry-server" Jan 28 21:08:02 crc kubenswrapper[4746]: I0128 21:08:02.134800 4746 state_mem.go:107] "Deleted CPUSet assignment" podUID="7f7c8907-8450-4217-a028-fac47f4644bf" containerName="registry-server" Jan 28 21:08:02 crc kubenswrapper[4746]: E0128 21:08:02.134824 4746 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="07d1ee4c-0b00-45b6-9fda-ee32dd6149c6" containerName="extract-content" Jan 28 21:08:02 crc kubenswrapper[4746]: I0128 21:08:02.134836 4746 state_mem.go:107] "Deleted CPUSet assignment" podUID="07d1ee4c-0b00-45b6-9fda-ee32dd6149c6" containerName="extract-content" Jan 28 21:08:02 crc kubenswrapper[4746]: E0128 21:08:02.134875 4746 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="ed8a3948-98ae-4e2a-a9f8-435287fc9583" containerName="bootstrap-edpm-deployment-openstack-edpm-ipam" Jan 28 21:08:02 crc kubenswrapper[4746]: I0128 21:08:02.134889 4746 state_mem.go:107] "Deleted CPUSet assignment" podUID="ed8a3948-98ae-4e2a-a9f8-435287fc9583" containerName="bootstrap-edpm-deployment-openstack-edpm-ipam" Jan 28 21:08:02 crc kubenswrapper[4746]: E0128 21:08:02.134913 4746 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="07d1ee4c-0b00-45b6-9fda-ee32dd6149c6" containerName="extract-utilities" Jan 28 21:08:02 crc kubenswrapper[4746]: I0128 21:08:02.134925 4746 state_mem.go:107] "Deleted CPUSet assignment" podUID="07d1ee4c-0b00-45b6-9fda-ee32dd6149c6" containerName="extract-utilities" Jan 28 21:08:02 crc kubenswrapper[4746]: E0128 21:08:02.134944 
4746 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="07d1ee4c-0b00-45b6-9fda-ee32dd6149c6" containerName="registry-server" Jan 28 21:08:02 crc kubenswrapper[4746]: I0128 21:08:02.134952 4746 state_mem.go:107] "Deleted CPUSet assignment" podUID="07d1ee4c-0b00-45b6-9fda-ee32dd6149c6" containerName="registry-server" Jan 28 21:08:02 crc kubenswrapper[4746]: E0128 21:08:02.134984 4746 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="7f7c8907-8450-4217-a028-fac47f4644bf" containerName="extract-content" Jan 28 21:08:02 crc kubenswrapper[4746]: I0128 21:08:02.134992 4746 state_mem.go:107] "Deleted CPUSet assignment" podUID="7f7c8907-8450-4217-a028-fac47f4644bf" containerName="extract-content" Jan 28 21:08:02 crc kubenswrapper[4746]: E0128 21:08:02.135016 4746 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="7f7c8907-8450-4217-a028-fac47f4644bf" containerName="extract-utilities" Jan 28 21:08:02 crc kubenswrapper[4746]: I0128 21:08:02.135040 4746 state_mem.go:107] "Deleted CPUSet assignment" podUID="7f7c8907-8450-4217-a028-fac47f4644bf" containerName="extract-utilities" Jan 28 21:08:02 crc kubenswrapper[4746]: I0128 21:08:02.135307 4746 memory_manager.go:354] "RemoveStaleState removing state" podUID="7f7c8907-8450-4217-a028-fac47f4644bf" containerName="registry-server" Jan 28 21:08:02 crc kubenswrapper[4746]: I0128 21:08:02.135323 4746 memory_manager.go:354] "RemoveStaleState removing state" podUID="07d1ee4c-0b00-45b6-9fda-ee32dd6149c6" containerName="registry-server" Jan 28 21:08:02 crc kubenswrapper[4746]: I0128 21:08:02.135338 4746 memory_manager.go:354] "RemoveStaleState removing state" podUID="ed8a3948-98ae-4e2a-a9f8-435287fc9583" containerName="bootstrap-edpm-deployment-openstack-edpm-ipam" Jan 28 21:08:02 crc kubenswrapper[4746]: I0128 21:08:02.136252 4746 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/download-cache-edpm-deployment-openstack-edpm-ipam-jgh8m" Jan 28 21:08:02 crc kubenswrapper[4746]: I0128 21:08:02.138243 4746 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-edpm-ipam-dockercfg-xb8vv" Jan 28 21:08:02 crc kubenswrapper[4746]: I0128 21:08:02.139620 4746 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-aee-default-env" Jan 28 21:08:02 crc kubenswrapper[4746]: I0128 21:08:02.139898 4746 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplanenodeset-openstack-edpm-ipam" Jan 28 21:08:02 crc kubenswrapper[4746]: I0128 21:08:02.140915 4746 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplane-ansible-ssh-private-key-secret" Jan 28 21:08:02 crc kubenswrapper[4746]: I0128 21:08:02.144837 4746 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/download-cache-edpm-deployment-openstack-edpm-ipam-jgh8m"] Jan 28 21:08:02 crc kubenswrapper[4746]: I0128 21:08:02.239598 4746 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-db82t\" (UniqueName: \"kubernetes.io/projected/bd3a62cf-5636-4a92-8cc8-8025e70ad3d0-kube-api-access-db82t\") pod \"download-cache-edpm-deployment-openstack-edpm-ipam-jgh8m\" (UID: \"bd3a62cf-5636-4a92-8cc8-8025e70ad3d0\") " pod="openstack/download-cache-edpm-deployment-openstack-edpm-ipam-jgh8m" Jan 28 21:08:02 crc kubenswrapper[4746]: I0128 21:08:02.239763 4746 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/bd3a62cf-5636-4a92-8cc8-8025e70ad3d0-ssh-key-openstack-edpm-ipam\") pod \"download-cache-edpm-deployment-openstack-edpm-ipam-jgh8m\" (UID: \"bd3a62cf-5636-4a92-8cc8-8025e70ad3d0\") " pod="openstack/download-cache-edpm-deployment-openstack-edpm-ipam-jgh8m" Jan 28 21:08:02 crc 
kubenswrapper[4746]: I0128 21:08:02.240237 4746 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/bd3a62cf-5636-4a92-8cc8-8025e70ad3d0-inventory\") pod \"download-cache-edpm-deployment-openstack-edpm-ipam-jgh8m\" (UID: \"bd3a62cf-5636-4a92-8cc8-8025e70ad3d0\") " pod="openstack/download-cache-edpm-deployment-openstack-edpm-ipam-jgh8m" Jan 28 21:08:02 crc kubenswrapper[4746]: I0128 21:08:02.343304 4746 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-db82t\" (UniqueName: \"kubernetes.io/projected/bd3a62cf-5636-4a92-8cc8-8025e70ad3d0-kube-api-access-db82t\") pod \"download-cache-edpm-deployment-openstack-edpm-ipam-jgh8m\" (UID: \"bd3a62cf-5636-4a92-8cc8-8025e70ad3d0\") " pod="openstack/download-cache-edpm-deployment-openstack-edpm-ipam-jgh8m" Jan 28 21:08:02 crc kubenswrapper[4746]: I0128 21:08:02.343828 4746 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/bd3a62cf-5636-4a92-8cc8-8025e70ad3d0-ssh-key-openstack-edpm-ipam\") pod \"download-cache-edpm-deployment-openstack-edpm-ipam-jgh8m\" (UID: \"bd3a62cf-5636-4a92-8cc8-8025e70ad3d0\") " pod="openstack/download-cache-edpm-deployment-openstack-edpm-ipam-jgh8m" Jan 28 21:08:02 crc kubenswrapper[4746]: I0128 21:08:02.344193 4746 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/bd3a62cf-5636-4a92-8cc8-8025e70ad3d0-inventory\") pod \"download-cache-edpm-deployment-openstack-edpm-ipam-jgh8m\" (UID: \"bd3a62cf-5636-4a92-8cc8-8025e70ad3d0\") " pod="openstack/download-cache-edpm-deployment-openstack-edpm-ipam-jgh8m" Jan 28 21:08:02 crc kubenswrapper[4746]: I0128 21:08:02.348997 4746 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: 
\"kubernetes.io/secret/bd3a62cf-5636-4a92-8cc8-8025e70ad3d0-ssh-key-openstack-edpm-ipam\") pod \"download-cache-edpm-deployment-openstack-edpm-ipam-jgh8m\" (UID: \"bd3a62cf-5636-4a92-8cc8-8025e70ad3d0\") " pod="openstack/download-cache-edpm-deployment-openstack-edpm-ipam-jgh8m" Jan 28 21:08:02 crc kubenswrapper[4746]: I0128 21:08:02.361451 4746 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/bd3a62cf-5636-4a92-8cc8-8025e70ad3d0-inventory\") pod \"download-cache-edpm-deployment-openstack-edpm-ipam-jgh8m\" (UID: \"bd3a62cf-5636-4a92-8cc8-8025e70ad3d0\") " pod="openstack/download-cache-edpm-deployment-openstack-edpm-ipam-jgh8m" Jan 28 21:08:02 crc kubenswrapper[4746]: I0128 21:08:02.363596 4746 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-db82t\" (UniqueName: \"kubernetes.io/projected/bd3a62cf-5636-4a92-8cc8-8025e70ad3d0-kube-api-access-db82t\") pod \"download-cache-edpm-deployment-openstack-edpm-ipam-jgh8m\" (UID: \"bd3a62cf-5636-4a92-8cc8-8025e70ad3d0\") " pod="openstack/download-cache-edpm-deployment-openstack-edpm-ipam-jgh8m" Jan 28 21:08:02 crc kubenswrapper[4746]: I0128 21:08:02.456881 4746 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/download-cache-edpm-deployment-openstack-edpm-ipam-jgh8m" Jan 28 21:08:03 crc kubenswrapper[4746]: I0128 21:08:03.011808 4746 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/download-cache-edpm-deployment-openstack-edpm-ipam-jgh8m"] Jan 28 21:08:03 crc kubenswrapper[4746]: I0128 21:08:03.055128 4746 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/download-cache-edpm-deployment-openstack-edpm-ipam-jgh8m" event={"ID":"bd3a62cf-5636-4a92-8cc8-8025e70ad3d0","Type":"ContainerStarted","Data":"df9f3e6a649ca8e82c011470431873ee8eea258ac80ce5ab383ee0870e3b41cb"} Jan 28 21:08:04 crc kubenswrapper[4746]: I0128 21:08:04.064684 4746 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/download-cache-edpm-deployment-openstack-edpm-ipam-jgh8m" event={"ID":"bd3a62cf-5636-4a92-8cc8-8025e70ad3d0","Type":"ContainerStarted","Data":"c4e178eef7781dd48685b9d18fcc3d74fb1859ff88d1fc74ca2d75f4c8b8774d"} Jan 28 21:08:04 crc kubenswrapper[4746]: I0128 21:08:04.086954 4746 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/download-cache-edpm-deployment-openstack-edpm-ipam-jgh8m" podStartSLOduration=1.664754994 podStartE2EDuration="2.086932294s" podCreationTimestamp="2026-01-28 21:08:02 +0000 UTC" firstStartedPulling="2026-01-28 21:08:03.024208863 +0000 UTC m=+1710.980395237" lastFinishedPulling="2026-01-28 21:08:03.446386163 +0000 UTC m=+1711.402572537" observedRunningTime="2026-01-28 21:08:04.078094002 +0000 UTC m=+1712.034280376" watchObservedRunningTime="2026-01-28 21:08:04.086932294 +0000 UTC m=+1712.043118658" Jan 28 21:08:08 crc kubenswrapper[4746]: I0128 21:08:08.836324 4746 scope.go:117] "RemoveContainer" containerID="4ff7d295e236267b904a42699c2e8affd72dcade2a9eeee3b96c8eb13de3f968" Jan 28 21:08:08 crc kubenswrapper[4746]: E0128 21:08:08.836984 4746 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with 
CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-6wrnw_openshift-machine-config-operator(6dc8b546-9734-4082-b2b3-2bafe3f1564d)\"" pod="openshift-machine-config-operator/machine-config-daemon-6wrnw" podUID="6dc8b546-9734-4082-b2b3-2bafe3f1564d" Jan 28 21:08:19 crc kubenswrapper[4746]: I0128 21:08:19.836425 4746 scope.go:117] "RemoveContainer" containerID="4ff7d295e236267b904a42699c2e8affd72dcade2a9eeee3b96c8eb13de3f968" Jan 28 21:08:19 crc kubenswrapper[4746]: E0128 21:08:19.837134 4746 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-6wrnw_openshift-machine-config-operator(6dc8b546-9734-4082-b2b3-2bafe3f1564d)\"" pod="openshift-machine-config-operator/machine-config-daemon-6wrnw" podUID="6dc8b546-9734-4082-b2b3-2bafe3f1564d" Jan 28 21:08:32 crc kubenswrapper[4746]: I0128 21:08:32.852635 4746 scope.go:117] "RemoveContainer" containerID="4ff7d295e236267b904a42699c2e8affd72dcade2a9eeee3b96c8eb13de3f968" Jan 28 21:08:32 crc kubenswrapper[4746]: E0128 21:08:32.853469 4746 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-6wrnw_openshift-machine-config-operator(6dc8b546-9734-4082-b2b3-2bafe3f1564d)\"" pod="openshift-machine-config-operator/machine-config-daemon-6wrnw" podUID="6dc8b546-9734-4082-b2b3-2bafe3f1564d" Jan 28 21:08:43 crc kubenswrapper[4746]: I0128 21:08:43.063712 4746 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/glance-db-create-4wrhd"] Jan 28 21:08:43 crc kubenswrapper[4746]: I0128 21:08:43.076899 4746 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/glance-2435-account-create-update-bm9tc"] Jan 28 21:08:43 
crc kubenswrapper[4746]: I0128 21:08:43.089554 4746 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/placement-db-create-zmt7g"] Jan 28 21:08:43 crc kubenswrapper[4746]: I0128 21:08:43.098595 4746 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/placement-db-create-zmt7g"] Jan 28 21:08:43 crc kubenswrapper[4746]: I0128 21:08:43.108855 4746 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/glance-2435-account-create-update-bm9tc"] Jan 28 21:08:43 crc kubenswrapper[4746]: I0128 21:08:43.122295 4746 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/glance-db-create-4wrhd"] Jan 28 21:08:44 crc kubenswrapper[4746]: I0128 21:08:44.035669 4746 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/keystone-db-create-dlfdf"] Jan 28 21:08:44 crc kubenswrapper[4746]: I0128 21:08:44.048749 4746 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/keystone-db-create-dlfdf"] Jan 28 21:08:44 crc kubenswrapper[4746]: I0128 21:08:44.059395 4746 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/keystone-350d-account-create-update-q5qhn"] Jan 28 21:08:44 crc kubenswrapper[4746]: I0128 21:08:44.068756 4746 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/placement-20b2-account-create-update-6nbn5"] Jan 28 21:08:44 crc kubenswrapper[4746]: I0128 21:08:44.077759 4746 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/placement-20b2-account-create-update-6nbn5"] Jan 28 21:08:44 crc kubenswrapper[4746]: I0128 21:08:44.086397 4746 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/keystone-350d-account-create-update-q5qhn"] Jan 28 21:08:44 crc kubenswrapper[4746]: I0128 21:08:44.857730 4746 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="53a7b226-48b5-4c3c-ba60-fe472d7c6694" path="/var/lib/kubelet/pods/53a7b226-48b5-4c3c-ba60-fe472d7c6694/volumes" Jan 28 21:08:44 crc kubenswrapper[4746]: I0128 21:08:44.858980 4746 
kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="5d335033-aade-4271-ae71-4bb277438111" path="/var/lib/kubelet/pods/5d335033-aade-4271-ae71-4bb277438111/volumes" Jan 28 21:08:44 crc kubenswrapper[4746]: I0128 21:08:44.859810 4746 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="617495c8-6e95-4b00-a9ae-8a89fdf3eb3f" path="/var/lib/kubelet/pods/617495c8-6e95-4b00-a9ae-8a89fdf3eb3f/volumes" Jan 28 21:08:44 crc kubenswrapper[4746]: I0128 21:08:44.860605 4746 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="80e2e7c1-645d-4709-b83e-c5604fcc4dfe" path="/var/lib/kubelet/pods/80e2e7c1-645d-4709-b83e-c5604fcc4dfe/volumes" Jan 28 21:08:44 crc kubenswrapper[4746]: I0128 21:08:44.862010 4746 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="95d6c762-62af-4a0e-bbb9-af154d84b913" path="/var/lib/kubelet/pods/95d6c762-62af-4a0e-bbb9-af154d84b913/volumes" Jan 28 21:08:44 crc kubenswrapper[4746]: I0128 21:08:44.862820 4746 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="d587d573-77d2-41a6-a9c9-3cf63b24512d" path="/var/lib/kubelet/pods/d587d573-77d2-41a6-a9c9-3cf63b24512d/volumes" Jan 28 21:08:46 crc kubenswrapper[4746]: I0128 21:08:46.836965 4746 scope.go:117] "RemoveContainer" containerID="4ff7d295e236267b904a42699c2e8affd72dcade2a9eeee3b96c8eb13de3f968" Jan 28 21:08:46 crc kubenswrapper[4746]: E0128 21:08:46.837577 4746 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-6wrnw_openshift-machine-config-operator(6dc8b546-9734-4082-b2b3-2bafe3f1564d)\"" pod="openshift-machine-config-operator/machine-config-daemon-6wrnw" podUID="6dc8b546-9734-4082-b2b3-2bafe3f1564d" Jan 28 21:08:56 crc kubenswrapper[4746]: I0128 21:08:56.694192 4746 scope.go:117] "RemoveContainer" 
containerID="0c3c88cb04a0226468c5a48fbfb28b8557b199093188a9fd188e97ad04b45164" Jan 28 21:08:56 crc kubenswrapper[4746]: I0128 21:08:56.737615 4746 scope.go:117] "RemoveContainer" containerID="ed4f4d5a3a8a4e636f9e1a6642cf8b94121509448f29e951cec4ffb529f71c91" Jan 28 21:08:56 crc kubenswrapper[4746]: I0128 21:08:56.786229 4746 scope.go:117] "RemoveContainer" containerID="2dc4c36f3f34e1b08ec18c6496cc292ab5dccc3a1294d7cc357e44ca14604cc7" Jan 28 21:08:56 crc kubenswrapper[4746]: I0128 21:08:56.849296 4746 scope.go:117] "RemoveContainer" containerID="e8afef6b78d19d71b1ada15e5cbe8848c9bdd70ed3fa51c3220528d8c7b7eec8" Jan 28 21:08:56 crc kubenswrapper[4746]: I0128 21:08:56.916131 4746 scope.go:117] "RemoveContainer" containerID="e004cc641b94cea2da5f068e6c073b64b753e32e8bc0a28daff4bf2d2e5875da" Jan 28 21:08:56 crc kubenswrapper[4746]: I0128 21:08:56.962679 4746 scope.go:117] "RemoveContainer" containerID="f0f29a1577f25521e0aec68786abc50afce8b05ccc2caa26cfb0a16ffb4c82ce" Jan 28 21:08:59 crc kubenswrapper[4746]: I0128 21:08:59.836208 4746 scope.go:117] "RemoveContainer" containerID="4ff7d295e236267b904a42699c2e8affd72dcade2a9eeee3b96c8eb13de3f968" Jan 28 21:08:59 crc kubenswrapper[4746]: E0128 21:08:59.838578 4746 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-6wrnw_openshift-machine-config-operator(6dc8b546-9734-4082-b2b3-2bafe3f1564d)\"" pod="openshift-machine-config-operator/machine-config-daemon-6wrnw" podUID="6dc8b546-9734-4082-b2b3-2bafe3f1564d" Jan 28 21:09:00 crc kubenswrapper[4746]: I0128 21:09:00.046756 4746 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/cloudkitty-db-create-6pvd6"] Jan 28 21:09:00 crc kubenswrapper[4746]: I0128 21:09:00.078795 4746 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/neutron-db-create-dlwzg"] Jan 28 21:09:00 crc 
kubenswrapper[4746]: I0128 21:09:00.090092 4746 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/cloudkitty-db-create-6pvd6"] Jan 28 21:09:00 crc kubenswrapper[4746]: I0128 21:09:00.101544 4746 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/cinder-2635-account-create-update-hz28g"] Jan 28 21:09:00 crc kubenswrapper[4746]: I0128 21:09:00.112697 4746 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/barbican-db-create-27prr"] Jan 28 21:09:00 crc kubenswrapper[4746]: I0128 21:09:00.123658 4746 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/cinder-2635-account-create-update-hz28g"] Jan 28 21:09:00 crc kubenswrapper[4746]: I0128 21:09:00.133853 4746 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/neutron-db-create-dlwzg"] Jan 28 21:09:00 crc kubenswrapper[4746]: I0128 21:09:00.142825 4746 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/barbican-db-create-27prr"] Jan 28 21:09:00 crc kubenswrapper[4746]: I0128 21:09:00.151628 4746 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/cloudkitty-6f70-account-create-update-zsths"] Jan 28 21:09:00 crc kubenswrapper[4746]: I0128 21:09:00.161978 4746 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/cloudkitty-6f70-account-create-update-zsths"] Jan 28 21:09:00 crc kubenswrapper[4746]: I0128 21:09:00.190786 4746 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/barbican-4eb8-account-create-update-k449n"] Jan 28 21:09:00 crc kubenswrapper[4746]: I0128 21:09:00.204282 4746 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/neutron-564b-account-create-update-m6nh8"] Jan 28 21:09:00 crc kubenswrapper[4746]: I0128 21:09:00.215151 4746 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/neutron-564b-account-create-update-m6nh8"] Jan 28 21:09:00 crc kubenswrapper[4746]: I0128 21:09:00.223731 4746 kubelet.go:2431] "SyncLoop REMOVE" source="api" 
pods=["openstack/barbican-4eb8-account-create-update-k449n"] Jan 28 21:09:00 crc kubenswrapper[4746]: I0128 21:09:00.232170 4746 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/cinder-db-create-kp8xl"] Jan 28 21:09:00 crc kubenswrapper[4746]: I0128 21:09:00.241500 4746 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/cinder-db-create-kp8xl"] Jan 28 21:09:00 crc kubenswrapper[4746]: I0128 21:09:00.859059 4746 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="2aed034c-35b5-4fd4-b0c4-cebbdfb41da2" path="/var/lib/kubelet/pods/2aed034c-35b5-4fd4-b0c4-cebbdfb41da2/volumes" Jan 28 21:09:00 crc kubenswrapper[4746]: I0128 21:09:00.860297 4746 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="a7ca0a5b-c5ce-48d9-89bc-5f693869c1d5" path="/var/lib/kubelet/pods/a7ca0a5b-c5ce-48d9-89bc-5f693869c1d5/volumes" Jan 28 21:09:00 crc kubenswrapper[4746]: I0128 21:09:00.861392 4746 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="c4a37860-0f42-4d7c-89e0-8505e8e49c59" path="/var/lib/kubelet/pods/c4a37860-0f42-4d7c-89e0-8505e8e49c59/volumes" Jan 28 21:09:00 crc kubenswrapper[4746]: I0128 21:09:00.864248 4746 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="da1723e7-7026-414e-b1e1-79911e331408" path="/var/lib/kubelet/pods/da1723e7-7026-414e-b1e1-79911e331408/volumes" Jan 28 21:09:00 crc kubenswrapper[4746]: I0128 21:09:00.865403 4746 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="db403f32-6720-4260-a66d-45e5d0e7b5c6" path="/var/lib/kubelet/pods/db403f32-6720-4260-a66d-45e5d0e7b5c6/volumes" Jan 28 21:09:00 crc kubenswrapper[4746]: I0128 21:09:00.866530 4746 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="e851c5c3-8d3c-4a3d-ac8c-95d5eb7113fc" path="/var/lib/kubelet/pods/e851c5c3-8d3c-4a3d-ac8c-95d5eb7113fc/volumes" Jan 28 21:09:00 crc kubenswrapper[4746]: I0128 21:09:00.867630 4746 kubelet_volumes.go:163] "Cleaned up orphaned pod 
volumes dir" podUID="eaf1cab3-21ae-4850-a732-d0e75f55ffc4" path="/var/lib/kubelet/pods/eaf1cab3-21ae-4850-a732-d0e75f55ffc4/volumes" Jan 28 21:09:00 crc kubenswrapper[4746]: I0128 21:09:00.869828 4746 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="fb4a489a-d80d-49a4-9624-ecfe6a4200ca" path="/var/lib/kubelet/pods/fb4a489a-d80d-49a4-9624-ecfe6a4200ca/volumes" Jan 28 21:09:14 crc kubenswrapper[4746]: I0128 21:09:14.840369 4746 scope.go:117] "RemoveContainer" containerID="4ff7d295e236267b904a42699c2e8affd72dcade2a9eeee3b96c8eb13de3f968" Jan 28 21:09:14 crc kubenswrapper[4746]: E0128 21:09:14.841308 4746 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-6wrnw_openshift-machine-config-operator(6dc8b546-9734-4082-b2b3-2bafe3f1564d)\"" pod="openshift-machine-config-operator/machine-config-daemon-6wrnw" podUID="6dc8b546-9734-4082-b2b3-2bafe3f1564d" Jan 28 21:09:15 crc kubenswrapper[4746]: I0128 21:09:15.037026 4746 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/keystone-db-sync-7mlnk"] Jan 28 21:09:15 crc kubenswrapper[4746]: I0128 21:09:15.048858 4746 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/keystone-db-sync-7mlnk"] Jan 28 21:09:16 crc kubenswrapper[4746]: I0128 21:09:16.855209 4746 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="1176d52c-0fec-4346-ad79-af25ac4c3f62" path="/var/lib/kubelet/pods/1176d52c-0fec-4346-ad79-af25ac4c3f62/volumes" Jan 28 21:09:21 crc kubenswrapper[4746]: I0128 21:09:21.045896 4746 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/glance-db-sync-dsp4s"] Jan 28 21:09:21 crc kubenswrapper[4746]: I0128 21:09:21.061445 4746 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/glance-db-sync-dsp4s"] Jan 28 21:09:22 crc kubenswrapper[4746]: I0128 21:09:22.850677 
4746 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="faabc487-475c-4f5b-b135-5a96d1ed9269" path="/var/lib/kubelet/pods/faabc487-475c-4f5b-b135-5a96d1ed9269/volumes" Jan 28 21:09:28 crc kubenswrapper[4746]: I0128 21:09:28.838118 4746 scope.go:117] "RemoveContainer" containerID="4ff7d295e236267b904a42699c2e8affd72dcade2a9eeee3b96c8eb13de3f968" Jan 28 21:09:28 crc kubenswrapper[4746]: E0128 21:09:28.839507 4746 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-6wrnw_openshift-machine-config-operator(6dc8b546-9734-4082-b2b3-2bafe3f1564d)\"" pod="openshift-machine-config-operator/machine-config-daemon-6wrnw" podUID="6dc8b546-9734-4082-b2b3-2bafe3f1564d" Jan 28 21:09:37 crc kubenswrapper[4746]: I0128 21:09:37.135493 4746 generic.go:334] "Generic (PLEG): container finished" podID="bd3a62cf-5636-4a92-8cc8-8025e70ad3d0" containerID="c4e178eef7781dd48685b9d18fcc3d74fb1859ff88d1fc74ca2d75f4c8b8774d" exitCode=0 Jan 28 21:09:37 crc kubenswrapper[4746]: I0128 21:09:37.135723 4746 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/download-cache-edpm-deployment-openstack-edpm-ipam-jgh8m" event={"ID":"bd3a62cf-5636-4a92-8cc8-8025e70ad3d0","Type":"ContainerDied","Data":"c4e178eef7781dd48685b9d18fcc3d74fb1859ff88d1fc74ca2d75f4c8b8774d"} Jan 28 21:09:38 crc kubenswrapper[4746]: I0128 21:09:38.665782 4746 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/download-cache-edpm-deployment-openstack-edpm-ipam-jgh8m"
Jan 28 21:09:38 crc kubenswrapper[4746]: I0128 21:09:38.807632 4746 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/bd3a62cf-5636-4a92-8cc8-8025e70ad3d0-ssh-key-openstack-edpm-ipam\") pod \"bd3a62cf-5636-4a92-8cc8-8025e70ad3d0\" (UID: \"bd3a62cf-5636-4a92-8cc8-8025e70ad3d0\") "
Jan 28 21:09:38 crc kubenswrapper[4746]: I0128 21:09:38.807736 4746 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/bd3a62cf-5636-4a92-8cc8-8025e70ad3d0-inventory\") pod \"bd3a62cf-5636-4a92-8cc8-8025e70ad3d0\" (UID: \"bd3a62cf-5636-4a92-8cc8-8025e70ad3d0\") "
Jan 28 21:09:38 crc kubenswrapper[4746]: I0128 21:09:38.807918 4746 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-db82t\" (UniqueName: \"kubernetes.io/projected/bd3a62cf-5636-4a92-8cc8-8025e70ad3d0-kube-api-access-db82t\") pod \"bd3a62cf-5636-4a92-8cc8-8025e70ad3d0\" (UID: \"bd3a62cf-5636-4a92-8cc8-8025e70ad3d0\") "
Jan 28 21:09:38 crc kubenswrapper[4746]: I0128 21:09:38.814906 4746 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/bd3a62cf-5636-4a92-8cc8-8025e70ad3d0-kube-api-access-db82t" (OuterVolumeSpecName: "kube-api-access-db82t") pod "bd3a62cf-5636-4a92-8cc8-8025e70ad3d0" (UID: "bd3a62cf-5636-4a92-8cc8-8025e70ad3d0"). InnerVolumeSpecName "kube-api-access-db82t". PluginName "kubernetes.io/projected", VolumeGidValue ""
Jan 28 21:09:38 crc kubenswrapper[4746]: I0128 21:09:38.846729 4746 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/bd3a62cf-5636-4a92-8cc8-8025e70ad3d0-inventory" (OuterVolumeSpecName: "inventory") pod "bd3a62cf-5636-4a92-8cc8-8025e70ad3d0" (UID: "bd3a62cf-5636-4a92-8cc8-8025e70ad3d0"). InnerVolumeSpecName "inventory". PluginName "kubernetes.io/secret", VolumeGidValue ""
Jan 28 21:09:38 crc kubenswrapper[4746]: I0128 21:09:38.863211 4746 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/bd3a62cf-5636-4a92-8cc8-8025e70ad3d0-ssh-key-openstack-edpm-ipam" (OuterVolumeSpecName: "ssh-key-openstack-edpm-ipam") pod "bd3a62cf-5636-4a92-8cc8-8025e70ad3d0" (UID: "bd3a62cf-5636-4a92-8cc8-8025e70ad3d0"). InnerVolumeSpecName "ssh-key-openstack-edpm-ipam". PluginName "kubernetes.io/secret", VolumeGidValue ""
Jan 28 21:09:38 crc kubenswrapper[4746]: I0128 21:09:38.911355 4746 reconciler_common.go:293] "Volume detached for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/bd3a62cf-5636-4a92-8cc8-8025e70ad3d0-ssh-key-openstack-edpm-ipam\") on node \"crc\" DevicePath \"\""
Jan 28 21:09:38 crc kubenswrapper[4746]: I0128 21:09:38.911559 4746 reconciler_common.go:293] "Volume detached for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/bd3a62cf-5636-4a92-8cc8-8025e70ad3d0-inventory\") on node \"crc\" DevicePath \"\""
Jan 28 21:09:38 crc kubenswrapper[4746]: I0128 21:09:38.911668 4746 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-db82t\" (UniqueName: \"kubernetes.io/projected/bd3a62cf-5636-4a92-8cc8-8025e70ad3d0-kube-api-access-db82t\") on node \"crc\" DevicePath \"\""
Jan 28 21:09:39 crc kubenswrapper[4746]: I0128 21:09:39.162018 4746 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/download-cache-edpm-deployment-openstack-edpm-ipam-jgh8m" event={"ID":"bd3a62cf-5636-4a92-8cc8-8025e70ad3d0","Type":"ContainerDied","Data":"df9f3e6a649ca8e82c011470431873ee8eea258ac80ce5ab383ee0870e3b41cb"}
Jan 28 21:09:39 crc kubenswrapper[4746]: I0128 21:09:39.162124 4746 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/download-cache-edpm-deployment-openstack-edpm-ipam-jgh8m"
Jan 28 21:09:39 crc kubenswrapper[4746]: I0128 21:09:39.162179 4746 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="df9f3e6a649ca8e82c011470431873ee8eea258ac80ce5ab383ee0870e3b41cb"
Jan 28 21:09:39 crc kubenswrapper[4746]: I0128 21:09:39.274219 4746 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/configure-network-edpm-deployment-openstack-edpm-ipam-tfrpf"]
Jan 28 21:09:39 crc kubenswrapper[4746]: E0128 21:09:39.274741 4746 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="bd3a62cf-5636-4a92-8cc8-8025e70ad3d0" containerName="download-cache-edpm-deployment-openstack-edpm-ipam"
Jan 28 21:09:39 crc kubenswrapper[4746]: I0128 21:09:39.274761 4746 state_mem.go:107] "Deleted CPUSet assignment" podUID="bd3a62cf-5636-4a92-8cc8-8025e70ad3d0" containerName="download-cache-edpm-deployment-openstack-edpm-ipam"
Jan 28 21:09:39 crc kubenswrapper[4746]: I0128 21:09:39.274983 4746 memory_manager.go:354] "RemoveStaleState removing state" podUID="bd3a62cf-5636-4a92-8cc8-8025e70ad3d0" containerName="download-cache-edpm-deployment-openstack-edpm-ipam"
Jan 28 21:09:39 crc kubenswrapper[4746]: I0128 21:09:39.275926 4746 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/configure-network-edpm-deployment-openstack-edpm-ipam-tfrpf"
Jan 28 21:09:39 crc kubenswrapper[4746]: I0128 21:09:39.281018 4746 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplane-ansible-ssh-private-key-secret"
Jan 28 21:09:39 crc kubenswrapper[4746]: I0128 21:09:39.281115 4746 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-edpm-ipam-dockercfg-xb8vv"
Jan 28 21:09:39 crc kubenswrapper[4746]: I0128 21:09:39.281130 4746 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-aee-default-env"
Jan 28 21:09:39 crc kubenswrapper[4746]: I0128 21:09:39.281116 4746 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplanenodeset-openstack-edpm-ipam"
Jan 28 21:09:39 crc kubenswrapper[4746]: I0128 21:09:39.283701 4746 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/configure-network-edpm-deployment-openstack-edpm-ipam-tfrpf"]
Jan 28 21:09:39 crc kubenswrapper[4746]: I0128 21:09:39.436089 4746 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-kczl7\" (UniqueName: \"kubernetes.io/projected/f10b4bd3-0df0-4ab1-8ec6-4163541a3bf2-kube-api-access-kczl7\") pod \"configure-network-edpm-deployment-openstack-edpm-ipam-tfrpf\" (UID: \"f10b4bd3-0df0-4ab1-8ec6-4163541a3bf2\") " pod="openstack/configure-network-edpm-deployment-openstack-edpm-ipam-tfrpf"
Jan 28 21:09:39 crc kubenswrapper[4746]: I0128 21:09:39.436134 4746 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/f10b4bd3-0df0-4ab1-8ec6-4163541a3bf2-ssh-key-openstack-edpm-ipam\") pod \"configure-network-edpm-deployment-openstack-edpm-ipam-tfrpf\" (UID: \"f10b4bd3-0df0-4ab1-8ec6-4163541a3bf2\") " pod="openstack/configure-network-edpm-deployment-openstack-edpm-ipam-tfrpf"
Jan 28 21:09:39 crc kubenswrapper[4746]: I0128 21:09:39.436930 4746 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/f10b4bd3-0df0-4ab1-8ec6-4163541a3bf2-inventory\") pod \"configure-network-edpm-deployment-openstack-edpm-ipam-tfrpf\" (UID: \"f10b4bd3-0df0-4ab1-8ec6-4163541a3bf2\") " pod="openstack/configure-network-edpm-deployment-openstack-edpm-ipam-tfrpf"
Jan 28 21:09:39 crc kubenswrapper[4746]: I0128 21:09:39.538745 4746 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-kczl7\" (UniqueName: \"kubernetes.io/projected/f10b4bd3-0df0-4ab1-8ec6-4163541a3bf2-kube-api-access-kczl7\") pod \"configure-network-edpm-deployment-openstack-edpm-ipam-tfrpf\" (UID: \"f10b4bd3-0df0-4ab1-8ec6-4163541a3bf2\") " pod="openstack/configure-network-edpm-deployment-openstack-edpm-ipam-tfrpf"
Jan 28 21:09:39 crc kubenswrapper[4746]: I0128 21:09:39.538805 4746 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/f10b4bd3-0df0-4ab1-8ec6-4163541a3bf2-ssh-key-openstack-edpm-ipam\") pod \"configure-network-edpm-deployment-openstack-edpm-ipam-tfrpf\" (UID: \"f10b4bd3-0df0-4ab1-8ec6-4163541a3bf2\") " pod="openstack/configure-network-edpm-deployment-openstack-edpm-ipam-tfrpf"
Jan 28 21:09:39 crc kubenswrapper[4746]: I0128 21:09:39.538935 4746 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/f10b4bd3-0df0-4ab1-8ec6-4163541a3bf2-inventory\") pod \"configure-network-edpm-deployment-openstack-edpm-ipam-tfrpf\" (UID: \"f10b4bd3-0df0-4ab1-8ec6-4163541a3bf2\") " pod="openstack/configure-network-edpm-deployment-openstack-edpm-ipam-tfrpf"
Jan 28 21:09:39 crc kubenswrapper[4746]: I0128 21:09:39.542235 4746 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/f10b4bd3-0df0-4ab1-8ec6-4163541a3bf2-ssh-key-openstack-edpm-ipam\") pod \"configure-network-edpm-deployment-openstack-edpm-ipam-tfrpf\" (UID: \"f10b4bd3-0df0-4ab1-8ec6-4163541a3bf2\") " pod="openstack/configure-network-edpm-deployment-openstack-edpm-ipam-tfrpf"
Jan 28 21:09:39 crc kubenswrapper[4746]: I0128 21:09:39.543628 4746 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/f10b4bd3-0df0-4ab1-8ec6-4163541a3bf2-inventory\") pod \"configure-network-edpm-deployment-openstack-edpm-ipam-tfrpf\" (UID: \"f10b4bd3-0df0-4ab1-8ec6-4163541a3bf2\") " pod="openstack/configure-network-edpm-deployment-openstack-edpm-ipam-tfrpf"
Jan 28 21:09:39 crc kubenswrapper[4746]: I0128 21:09:39.556608 4746 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-kczl7\" (UniqueName: \"kubernetes.io/projected/f10b4bd3-0df0-4ab1-8ec6-4163541a3bf2-kube-api-access-kczl7\") pod \"configure-network-edpm-deployment-openstack-edpm-ipam-tfrpf\" (UID: \"f10b4bd3-0df0-4ab1-8ec6-4163541a3bf2\") " pod="openstack/configure-network-edpm-deployment-openstack-edpm-ipam-tfrpf"
Jan 28 21:09:39 crc kubenswrapper[4746]: I0128 21:09:39.601508 4746 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/configure-network-edpm-deployment-openstack-edpm-ipam-tfrpf"
Jan 28 21:09:39 crc kubenswrapper[4746]: I0128 21:09:39.836410 4746 scope.go:117] "RemoveContainer" containerID="4ff7d295e236267b904a42699c2e8affd72dcade2a9eeee3b96c8eb13de3f968"
Jan 28 21:09:39 crc kubenswrapper[4746]: E0128 21:09:39.838743 4746 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-6wrnw_openshift-machine-config-operator(6dc8b546-9734-4082-b2b3-2bafe3f1564d)\"" pod="openshift-machine-config-operator/machine-config-daemon-6wrnw" podUID="6dc8b546-9734-4082-b2b3-2bafe3f1564d"
Jan 28 21:09:40 crc kubenswrapper[4746]: I0128 21:09:40.190364 4746 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/configure-network-edpm-deployment-openstack-edpm-ipam-tfrpf"]
Jan 28 21:09:41 crc kubenswrapper[4746]: I0128 21:09:41.183112 4746 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/configure-network-edpm-deployment-openstack-edpm-ipam-tfrpf" event={"ID":"f10b4bd3-0df0-4ab1-8ec6-4163541a3bf2","Type":"ContainerStarted","Data":"955c9e291f8c4561cd2936354d9b3aa4c0b1eae2a933e7a9e6da4e3ad8cdf50f"}
Jan 28 21:09:41 crc kubenswrapper[4746]: I0128 21:09:41.183502 4746 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/configure-network-edpm-deployment-openstack-edpm-ipam-tfrpf" event={"ID":"f10b4bd3-0df0-4ab1-8ec6-4163541a3bf2","Type":"ContainerStarted","Data":"5dfe601e827a7b5e5b38f650bb7dd206ae9900aaa8d89915095cc62a90354467"}
Jan 28 21:09:41 crc kubenswrapper[4746]: I0128 21:09:41.211499 4746 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/configure-network-edpm-deployment-openstack-edpm-ipam-tfrpf" podStartSLOduration=1.737924037 podStartE2EDuration="2.211478112s" podCreationTimestamp="2026-01-28 21:09:39 +0000 UTC" firstStartedPulling="2026-01-28 21:09:40.198302837 +0000 UTC m=+1808.154489191" lastFinishedPulling="2026-01-28 21:09:40.671856912 +0000 UTC m=+1808.628043266" observedRunningTime="2026-01-28 21:09:41.201837269 +0000 UTC m=+1809.158023643" watchObservedRunningTime="2026-01-28 21:09:41.211478112 +0000 UTC m=+1809.167664476"
Jan 28 21:09:48 crc kubenswrapper[4746]: I0128 21:09:48.064669 4746 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/neutron-db-sync-g7kpz"]
Jan 28 21:09:48 crc kubenswrapper[4746]: I0128 21:09:48.082552 4746 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/neutron-db-sync-g7kpz"]
Jan 28 21:09:48 crc kubenswrapper[4746]: I0128 21:09:48.859702 4746 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="b820b96e-5237-4984-a3e9-246b04980cbb" path="/var/lib/kubelet/pods/b820b96e-5237-4984-a3e9-246b04980cbb/volumes"
Jan 28 21:09:53 crc kubenswrapper[4746]: I0128 21:09:53.836031 4746 scope.go:117] "RemoveContainer" containerID="4ff7d295e236267b904a42699c2e8affd72dcade2a9eeee3b96c8eb13de3f968"
Jan 28 21:09:53 crc kubenswrapper[4746]: E0128 21:09:53.836999 4746 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-6wrnw_openshift-machine-config-operator(6dc8b546-9734-4082-b2b3-2bafe3f1564d)\"" pod="openshift-machine-config-operator/machine-config-daemon-6wrnw" podUID="6dc8b546-9734-4082-b2b3-2bafe3f1564d"
Jan 28 21:09:57 crc kubenswrapper[4746]: I0128 21:09:57.160779 4746 scope.go:117] "RemoveContainer" containerID="adf6407c69131aebac7a4d54d91e59942a3a262af5a8ce578e90d9e795210918"
Jan 28 21:09:57 crc kubenswrapper[4746]: I0128 21:09:57.210012 4746 scope.go:117] "RemoveContainer" containerID="389aa29b705d25b729a72555f23f81dc2c5269229ea8207977650d2dd547da96"
Jan 28 21:09:57 crc kubenswrapper[4746]: I0128 21:09:57.256665 4746 scope.go:117] "RemoveContainer" containerID="3e8bf0fad3a221f9307149e72a8c7a4c16b411a4b0558781df4f71024476360a"
Jan 28 21:09:57 crc kubenswrapper[4746]: I0128 21:09:57.320343 4746 scope.go:117] "RemoveContainer" containerID="81caf5ae6d81fbfe614995cb0c9805b2ad35632a010c63d5790e8ab11eccc724"
Jan 28 21:09:57 crc kubenswrapper[4746]: I0128 21:09:57.381207 4746 scope.go:117] "RemoveContainer" containerID="d7e34ce150b2322e65670a106c0878e3573205aebc249421f89da536b3a40251"
Jan 28 21:09:57 crc kubenswrapper[4746]: I0128 21:09:57.408452 4746 scope.go:117] "RemoveContainer" containerID="68613854b61f89e684047d7c10e6855b82ef3a9fefe04b5e735ff2578df9b978"
Jan 28 21:09:57 crc kubenswrapper[4746]: I0128 21:09:57.446598 4746 scope.go:117] "RemoveContainer" containerID="5948b5a9f847a8b980f1decf747168335524f42737ad09efc0b1fca00783847a"
Jan 28 21:09:57 crc kubenswrapper[4746]: I0128 21:09:57.474697 4746 scope.go:117] "RemoveContainer" containerID="d97d47184e7df00db5d7e48a8f7a9092af3bdb44d39a15968acfaacb8c9ef76d"
Jan 28 21:09:57 crc kubenswrapper[4746]: I0128 21:09:57.494867 4746 scope.go:117] "RemoveContainer" containerID="41aa4c87a1fca91145b590894f1cb3b229262a1f21e38f4a02bcd4065c56e8e1"
Jan 28 21:09:57 crc kubenswrapper[4746]: I0128 21:09:57.516707 4746 scope.go:117] "RemoveContainer" containerID="51190e28e35697c5ce6351b8e77e5a505b8502b14e2c1dcb41d9348a50e9aae0"
Jan 28 21:09:57 crc kubenswrapper[4746]: I0128 21:09:57.538011 4746 scope.go:117] "RemoveContainer" containerID="b6c27ab23e35d3e023d033707ffd8c5967ba793323f79bc0ea3639bf0b9ccab6"
Jan 28 21:09:58 crc kubenswrapper[4746]: I0128 21:09:58.046678 4746 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/keystone-bootstrap-lwqj2"]
Jan 28 21:09:58 crc kubenswrapper[4746]: I0128 21:09:58.061874 4746 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/placement-db-sync-jms86"]
Jan 28 21:09:58 crc kubenswrapper[4746]: I0128 21:09:58.071862 4746 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/placement-db-sync-jms86"]
Jan 28 21:09:58 crc kubenswrapper[4746]: I0128 21:09:58.080530 4746 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/keystone-bootstrap-lwqj2"]
Jan 28 21:09:58 crc kubenswrapper[4746]: I0128 21:09:58.852784 4746 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="1d79950b-c574-4952-8620-ff635db5e8de" path="/var/lib/kubelet/pods/1d79950b-c574-4952-8620-ff635db5e8de/volumes"
Jan 28 21:09:58 crc kubenswrapper[4746]: I0128 21:09:58.853638 4746 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="f81b35d0-5755-476e-a5c9-30036d654d53" path="/var/lib/kubelet/pods/f81b35d0-5755-476e-a5c9-30036d654d53/volumes"
Jan 28 21:10:06 crc kubenswrapper[4746]: I0128 21:10:06.837197 4746 scope.go:117] "RemoveContainer" containerID="4ff7d295e236267b904a42699c2e8affd72dcade2a9eeee3b96c8eb13de3f968"
Jan 28 21:10:06 crc kubenswrapper[4746]: E0128 21:10:06.838826 4746 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-6wrnw_openshift-machine-config-operator(6dc8b546-9734-4082-b2b3-2bafe3f1564d)\"" pod="openshift-machine-config-operator/machine-config-daemon-6wrnw" podUID="6dc8b546-9734-4082-b2b3-2bafe3f1564d"
Jan 28 21:10:14 crc kubenswrapper[4746]: I0128 21:10:14.050344 4746 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/cinder-db-sync-wwfkc"]
Jan 28 21:10:14 crc kubenswrapper[4746]: I0128 21:10:14.062550 4746 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/cinder-db-sync-wwfkc"]
Jan 28 21:10:14 crc kubenswrapper[4746]: I0128 21:10:14.846175 4746 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="a587d3d9-972c-47ae-8e29-5bfd977ff429" path="/var/lib/kubelet/pods/a587d3d9-972c-47ae-8e29-5bfd977ff429/volumes"
Jan 28 21:10:15 crc kubenswrapper[4746]: I0128 21:10:15.033316 4746 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/barbican-db-sync-29n2p"]
Jan 28 21:10:15 crc kubenswrapper[4746]: I0128 21:10:15.045876 4746 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/barbican-db-sync-29n2p"]
Jan 28 21:10:16 crc kubenswrapper[4746]: I0128 21:10:16.848596 4746 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="70e766dc-9f84-4d0c-af5b-3b044e06c09f" path="/var/lib/kubelet/pods/70e766dc-9f84-4d0c-af5b-3b044e06c09f/volumes"
Jan 28 21:10:17 crc kubenswrapper[4746]: I0128 21:10:17.836263 4746 scope.go:117] "RemoveContainer" containerID="4ff7d295e236267b904a42699c2e8affd72dcade2a9eeee3b96c8eb13de3f968"
Jan 28 21:10:17 crc kubenswrapper[4746]: E0128 21:10:17.837288 4746 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-6wrnw_openshift-machine-config-operator(6dc8b546-9734-4082-b2b3-2bafe3f1564d)\"" pod="openshift-machine-config-operator/machine-config-daemon-6wrnw" podUID="6dc8b546-9734-4082-b2b3-2bafe3f1564d"
Jan 28 21:10:28 crc kubenswrapper[4746]: I0128 21:10:28.851614 4746 scope.go:117] "RemoveContainer" containerID="4ff7d295e236267b904a42699c2e8affd72dcade2a9eeee3b96c8eb13de3f968"
Jan 28 21:10:28 crc kubenswrapper[4746]: E0128 21:10:28.852709 4746 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-6wrnw_openshift-machine-config-operator(6dc8b546-9734-4082-b2b3-2bafe3f1564d)\"" pod="openshift-machine-config-operator/machine-config-daemon-6wrnw" podUID="6dc8b546-9734-4082-b2b3-2bafe3f1564d"
Jan 28 21:10:29 crc kubenswrapper[4746]: I0128 21:10:29.030030 4746 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/root-account-create-update-55g9q"]
Jan 28 21:10:29 crc kubenswrapper[4746]: I0128 21:10:29.039184 4746 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/root-account-create-update-55g9q"]
Jan 28 21:10:30 crc kubenswrapper[4746]: I0128 21:10:30.851926 4746 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="766b5979-538e-4a54-a1b5-3351e3988f70" path="/var/lib/kubelet/pods/766b5979-538e-4a54-a1b5-3351e3988f70/volumes"
Jan 28 21:10:36 crc kubenswrapper[4746]: I0128 21:10:36.325664 4746 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-marketplace-w67kh"]
Jan 28 21:10:36 crc kubenswrapper[4746]: I0128 21:10:36.328034 4746 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-w67kh"
Jan 28 21:10:36 crc kubenswrapper[4746]: I0128 21:10:36.341424 4746 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-w67kh"]
Jan 28 21:10:36 crc kubenswrapper[4746]: I0128 21:10:36.451488 4746 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/1e93abc2-ecc1-4350-823c-9a8e87f5df39-catalog-content\") pod \"redhat-marketplace-w67kh\" (UID: \"1e93abc2-ecc1-4350-823c-9a8e87f5df39\") " pod="openshift-marketplace/redhat-marketplace-w67kh"
Jan 28 21:10:36 crc kubenswrapper[4746]: I0128 21:10:36.451566 4746 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/1e93abc2-ecc1-4350-823c-9a8e87f5df39-utilities\") pod \"redhat-marketplace-w67kh\" (UID: \"1e93abc2-ecc1-4350-823c-9a8e87f5df39\") " pod="openshift-marketplace/redhat-marketplace-w67kh"
Jan 28 21:10:36 crc kubenswrapper[4746]: I0128 21:10:36.451708 4746 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-xwxzt\" (UniqueName: \"kubernetes.io/projected/1e93abc2-ecc1-4350-823c-9a8e87f5df39-kube-api-access-xwxzt\") pod \"redhat-marketplace-w67kh\" (UID: \"1e93abc2-ecc1-4350-823c-9a8e87f5df39\") " pod="openshift-marketplace/redhat-marketplace-w67kh"
Jan 28 21:10:36 crc kubenswrapper[4746]: I0128 21:10:36.553283 4746 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/1e93abc2-ecc1-4350-823c-9a8e87f5df39-utilities\") pod \"redhat-marketplace-w67kh\" (UID: \"1e93abc2-ecc1-4350-823c-9a8e87f5df39\") " pod="openshift-marketplace/redhat-marketplace-w67kh"
Jan 28 21:10:36 crc kubenswrapper[4746]: I0128 21:10:36.553429 4746 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-xwxzt\" (UniqueName: \"kubernetes.io/projected/1e93abc2-ecc1-4350-823c-9a8e87f5df39-kube-api-access-xwxzt\") pod \"redhat-marketplace-w67kh\" (UID: \"1e93abc2-ecc1-4350-823c-9a8e87f5df39\") " pod="openshift-marketplace/redhat-marketplace-w67kh"
Jan 28 21:10:36 crc kubenswrapper[4746]: I0128 21:10:36.553492 4746 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/1e93abc2-ecc1-4350-823c-9a8e87f5df39-catalog-content\") pod \"redhat-marketplace-w67kh\" (UID: \"1e93abc2-ecc1-4350-823c-9a8e87f5df39\") " pod="openshift-marketplace/redhat-marketplace-w67kh"
Jan 28 21:10:36 crc kubenswrapper[4746]: I0128 21:10:36.553917 4746 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/1e93abc2-ecc1-4350-823c-9a8e87f5df39-utilities\") pod \"redhat-marketplace-w67kh\" (UID: \"1e93abc2-ecc1-4350-823c-9a8e87f5df39\") " pod="openshift-marketplace/redhat-marketplace-w67kh"
Jan 28 21:10:36 crc kubenswrapper[4746]: I0128 21:10:36.553955 4746 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/1e93abc2-ecc1-4350-823c-9a8e87f5df39-catalog-content\") pod \"redhat-marketplace-w67kh\" (UID: \"1e93abc2-ecc1-4350-823c-9a8e87f5df39\") " pod="openshift-marketplace/redhat-marketplace-w67kh"
Jan 28 21:10:36 crc kubenswrapper[4746]: I0128 21:10:36.583132 4746 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-xwxzt\" (UniqueName: \"kubernetes.io/projected/1e93abc2-ecc1-4350-823c-9a8e87f5df39-kube-api-access-xwxzt\") pod \"redhat-marketplace-w67kh\" (UID: \"1e93abc2-ecc1-4350-823c-9a8e87f5df39\") " pod="openshift-marketplace/redhat-marketplace-w67kh"
Jan 28 21:10:36 crc kubenswrapper[4746]: I0128 21:10:36.654433 4746 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-w67kh"
Jan 28 21:10:37 crc kubenswrapper[4746]: I0128 21:10:37.149719 4746 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-w67kh"]
Jan 28 21:10:37 crc kubenswrapper[4746]: I0128 21:10:37.812273 4746 generic.go:334] "Generic (PLEG): container finished" podID="1e93abc2-ecc1-4350-823c-9a8e87f5df39" containerID="8b752a2bef30df0d1961dc5c00379a434a7fe2b5c06c7ba36c396c2a529f893f" exitCode=0
Jan 28 21:10:37 crc kubenswrapper[4746]: I0128 21:10:37.812325 4746 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-w67kh" event={"ID":"1e93abc2-ecc1-4350-823c-9a8e87f5df39","Type":"ContainerDied","Data":"8b752a2bef30df0d1961dc5c00379a434a7fe2b5c06c7ba36c396c2a529f893f"}
Jan 28 21:10:37 crc kubenswrapper[4746]: I0128 21:10:37.812620 4746 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-w67kh" event={"ID":"1e93abc2-ecc1-4350-823c-9a8e87f5df39","Type":"ContainerStarted","Data":"60e400e468443490ef0034998dbd9ef5b955d7a12792525e34dcd56ccd06a78a"}
Jan 28 21:10:37 crc kubenswrapper[4746]: I0128 21:10:37.814912 4746 provider.go:102] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider
Jan 28 21:10:38 crc kubenswrapper[4746]: I0128 21:10:38.823982 4746 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-w67kh" event={"ID":"1e93abc2-ecc1-4350-823c-9a8e87f5df39","Type":"ContainerStarted","Data":"eb6a9d0c25b1d778a16ffb72a145a673294f85e6a0970df22e2959cd38c19e86"}
Jan 28 21:10:39 crc kubenswrapper[4746]: I0128 21:10:39.836776 4746 generic.go:334] "Generic (PLEG): container finished" podID="1e93abc2-ecc1-4350-823c-9a8e87f5df39" containerID="eb6a9d0c25b1d778a16ffb72a145a673294f85e6a0970df22e2959cd38c19e86" exitCode=0
Jan 28 21:10:39 crc kubenswrapper[4746]: I0128 21:10:39.836871 4746 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-w67kh" event={"ID":"1e93abc2-ecc1-4350-823c-9a8e87f5df39","Type":"ContainerDied","Data":"eb6a9d0c25b1d778a16ffb72a145a673294f85e6a0970df22e2959cd38c19e86"}
Jan 28 21:10:40 crc kubenswrapper[4746]: I0128 21:10:40.852519 4746 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-w67kh" event={"ID":"1e93abc2-ecc1-4350-823c-9a8e87f5df39","Type":"ContainerStarted","Data":"f7220d6f98c27168ee557f71c197427d6126f605ba9ef22d24a862419b0362d8"}
Jan 28 21:10:40 crc kubenswrapper[4746]: I0128 21:10:40.884959 4746 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-marketplace-w67kh" podStartSLOduration=2.430416912 podStartE2EDuration="4.884935979s" podCreationTimestamp="2026-01-28 21:10:36 +0000 UTC" firstStartedPulling="2026-01-28 21:10:37.814591161 +0000 UTC m=+1865.770777515" lastFinishedPulling="2026-01-28 21:10:40.269110228 +0000 UTC m=+1868.225296582" observedRunningTime="2026-01-28 21:10:40.873571773 +0000 UTC m=+1868.829758127" watchObservedRunningTime="2026-01-28 21:10:40.884935979 +0000 UTC m=+1868.841122333"
Jan 28 21:10:41 crc kubenswrapper[4746]: I0128 21:10:41.835310 4746 scope.go:117] "RemoveContainer" containerID="4ff7d295e236267b904a42699c2e8affd72dcade2a9eeee3b96c8eb13de3f968"
Jan 28 21:10:41 crc kubenswrapper[4746]: E0128 21:10:41.835715 4746 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-6wrnw_openshift-machine-config-operator(6dc8b546-9734-4082-b2b3-2bafe3f1564d)\"" pod="openshift-machine-config-operator/machine-config-daemon-6wrnw" podUID="6dc8b546-9734-4082-b2b3-2bafe3f1564d"
Jan 28 21:10:46 crc kubenswrapper[4746]: I0128 21:10:46.655156 4746 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-marketplace-w67kh"
Jan 28 21:10:46 crc kubenswrapper[4746]: I0128 21:10:46.655948 4746 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-marketplace-w67kh"
Jan 28 21:10:46 crc kubenswrapper[4746]: I0128 21:10:46.729544 4746 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-marketplace-w67kh"
Jan 28 21:10:46 crc kubenswrapper[4746]: I0128 21:10:46.992146 4746 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-marketplace-w67kh"
Jan 28 21:10:47 crc kubenswrapper[4746]: I0128 21:10:47.045056 4746 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-w67kh"]
Jan 28 21:10:48 crc kubenswrapper[4746]: I0128 21:10:48.939243 4746 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-marketplace-w67kh" podUID="1e93abc2-ecc1-4350-823c-9a8e87f5df39" containerName="registry-server" containerID="cri-o://f7220d6f98c27168ee557f71c197427d6126f605ba9ef22d24a862419b0362d8" gracePeriod=2
Jan 28 21:10:49 crc kubenswrapper[4746]: I0128 21:10:49.537131 4746 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-w67kh"
Jan 28 21:10:49 crc kubenswrapper[4746]: I0128 21:10:49.639611 4746 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/1e93abc2-ecc1-4350-823c-9a8e87f5df39-utilities\") pod \"1e93abc2-ecc1-4350-823c-9a8e87f5df39\" (UID: \"1e93abc2-ecc1-4350-823c-9a8e87f5df39\") "
Jan 28 21:10:49 crc kubenswrapper[4746]: I0128 21:10:49.639822 4746 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/1e93abc2-ecc1-4350-823c-9a8e87f5df39-catalog-content\") pod \"1e93abc2-ecc1-4350-823c-9a8e87f5df39\" (UID: \"1e93abc2-ecc1-4350-823c-9a8e87f5df39\") "
Jan 28 21:10:49 crc kubenswrapper[4746]: I0128 21:10:49.639894 4746 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-xwxzt\" (UniqueName: \"kubernetes.io/projected/1e93abc2-ecc1-4350-823c-9a8e87f5df39-kube-api-access-xwxzt\") pod \"1e93abc2-ecc1-4350-823c-9a8e87f5df39\" (UID: \"1e93abc2-ecc1-4350-823c-9a8e87f5df39\") "
Jan 28 21:10:49 crc kubenswrapper[4746]: I0128 21:10:49.640577 4746 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/1e93abc2-ecc1-4350-823c-9a8e87f5df39-utilities" (OuterVolumeSpecName: "utilities") pod "1e93abc2-ecc1-4350-823c-9a8e87f5df39" (UID: "1e93abc2-ecc1-4350-823c-9a8e87f5df39"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Jan 28 21:10:49 crc kubenswrapper[4746]: I0128 21:10:49.647424 4746 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/1e93abc2-ecc1-4350-823c-9a8e87f5df39-kube-api-access-xwxzt" (OuterVolumeSpecName: "kube-api-access-xwxzt") pod "1e93abc2-ecc1-4350-823c-9a8e87f5df39" (UID: "1e93abc2-ecc1-4350-823c-9a8e87f5df39"). InnerVolumeSpecName "kube-api-access-xwxzt". PluginName "kubernetes.io/projected", VolumeGidValue ""
Jan 28 21:10:49 crc kubenswrapper[4746]: I0128 21:10:49.662435 4746 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/1e93abc2-ecc1-4350-823c-9a8e87f5df39-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "1e93abc2-ecc1-4350-823c-9a8e87f5df39" (UID: "1e93abc2-ecc1-4350-823c-9a8e87f5df39"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Jan 28 21:10:49 crc kubenswrapper[4746]: I0128 21:10:49.742589 4746 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/1e93abc2-ecc1-4350-823c-9a8e87f5df39-utilities\") on node \"crc\" DevicePath \"\""
Jan 28 21:10:49 crc kubenswrapper[4746]: I0128 21:10:49.742623 4746 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/1e93abc2-ecc1-4350-823c-9a8e87f5df39-catalog-content\") on node \"crc\" DevicePath \"\""
Jan 28 21:10:49 crc kubenswrapper[4746]: I0128 21:10:49.742635 4746 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-xwxzt\" (UniqueName: \"kubernetes.io/projected/1e93abc2-ecc1-4350-823c-9a8e87f5df39-kube-api-access-xwxzt\") on node \"crc\" DevicePath \"\""
Jan 28 21:10:50 crc kubenswrapper[4746]: I0128 21:10:50.002289 4746 generic.go:334] "Generic (PLEG): container finished" podID="1e93abc2-ecc1-4350-823c-9a8e87f5df39" containerID="f7220d6f98c27168ee557f71c197427d6126f605ba9ef22d24a862419b0362d8" exitCode=0
Jan 28 21:10:50 crc kubenswrapper[4746]: I0128 21:10:50.002393 4746 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-w67kh" event={"ID":"1e93abc2-ecc1-4350-823c-9a8e87f5df39","Type":"ContainerDied","Data":"f7220d6f98c27168ee557f71c197427d6126f605ba9ef22d24a862419b0362d8"}
Jan 28 21:10:50 crc kubenswrapper[4746]: I0128 21:10:50.002450 4746 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-w67kh" event={"ID":"1e93abc2-ecc1-4350-823c-9a8e87f5df39","Type":"ContainerDied","Data":"60e400e468443490ef0034998dbd9ef5b955d7a12792525e34dcd56ccd06a78a"}
Jan 28 21:10:50 crc kubenswrapper[4746]: I0128 21:10:50.002472 4746 scope.go:117] "RemoveContainer" containerID="f7220d6f98c27168ee557f71c197427d6126f605ba9ef22d24a862419b0362d8"
Jan 28 21:10:50 crc kubenswrapper[4746]: I0128 21:10:50.002540 4746 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-w67kh"
Jan 28 21:10:50 crc kubenswrapper[4746]: I0128 21:10:50.010906 4746 generic.go:334] "Generic (PLEG): container finished" podID="f10b4bd3-0df0-4ab1-8ec6-4163541a3bf2" containerID="955c9e291f8c4561cd2936354d9b3aa4c0b1eae2a933e7a9e6da4e3ad8cdf50f" exitCode=0
Jan 28 21:10:50 crc kubenswrapper[4746]: I0128 21:10:50.010951 4746 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/configure-network-edpm-deployment-openstack-edpm-ipam-tfrpf" event={"ID":"f10b4bd3-0df0-4ab1-8ec6-4163541a3bf2","Type":"ContainerDied","Data":"955c9e291f8c4561cd2936354d9b3aa4c0b1eae2a933e7a9e6da4e3ad8cdf50f"}
Jan 28 21:10:50 crc kubenswrapper[4746]: I0128 21:10:50.047267 4746 scope.go:117] "RemoveContainer" containerID="eb6a9d0c25b1d778a16ffb72a145a673294f85e6a0970df22e2959cd38c19e86"
Jan 28 21:10:50 crc kubenswrapper[4746]: I0128 21:10:50.110409 4746 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-w67kh"]
Jan 28 21:10:50 crc kubenswrapper[4746]: I0128 21:10:50.116292 4746 scope.go:117] "RemoveContainer" containerID="8b752a2bef30df0d1961dc5c00379a434a7fe2b5c06c7ba36c396c2a529f893f"
Jan 28 21:10:50 crc kubenswrapper[4746]: I0128 21:10:50.127756 4746 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-marketplace-w67kh"]
Jan 28 21:10:50 crc kubenswrapper[4746]: I0128 21:10:50.182692 4746 scope.go:117] "RemoveContainer" containerID="f7220d6f98c27168ee557f71c197427d6126f605ba9ef22d24a862419b0362d8"
Jan 28 21:10:50 crc kubenswrapper[4746]: E0128 21:10:50.183573 4746 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"f7220d6f98c27168ee557f71c197427d6126f605ba9ef22d24a862419b0362d8\": container with ID starting with f7220d6f98c27168ee557f71c197427d6126f605ba9ef22d24a862419b0362d8 not found: ID does not exist" containerID="f7220d6f98c27168ee557f71c197427d6126f605ba9ef22d24a862419b0362d8"
Jan 28 21:10:50 crc kubenswrapper[4746]: I0128 21:10:50.183777 4746 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"f7220d6f98c27168ee557f71c197427d6126f605ba9ef22d24a862419b0362d8"} err="failed to get container status \"f7220d6f98c27168ee557f71c197427d6126f605ba9ef22d24a862419b0362d8\": rpc error: code = NotFound desc = could not find container \"f7220d6f98c27168ee557f71c197427d6126f605ba9ef22d24a862419b0362d8\": container with ID starting with f7220d6f98c27168ee557f71c197427d6126f605ba9ef22d24a862419b0362d8 not found: ID does not exist"
Jan 28 21:10:50 crc kubenswrapper[4746]: I0128 21:10:50.183801 4746 scope.go:117] "RemoveContainer" containerID="eb6a9d0c25b1d778a16ffb72a145a673294f85e6a0970df22e2959cd38c19e86"
Jan 28 21:10:50 crc kubenswrapper[4746]: E0128 21:10:50.187812 4746 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"eb6a9d0c25b1d778a16ffb72a145a673294f85e6a0970df22e2959cd38c19e86\": container with ID starting with eb6a9d0c25b1d778a16ffb72a145a673294f85e6a0970df22e2959cd38c19e86 not found: ID does not exist" containerID="eb6a9d0c25b1d778a16ffb72a145a673294f85e6a0970df22e2959cd38c19e86"
Jan 28 21:10:50 crc kubenswrapper[4746]: I0128 21:10:50.187857 4746 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"eb6a9d0c25b1d778a16ffb72a145a673294f85e6a0970df22e2959cd38c19e86"} err="failed to get container status \"eb6a9d0c25b1d778a16ffb72a145a673294f85e6a0970df22e2959cd38c19e86\": rpc error: code = NotFound desc = could not find container \"eb6a9d0c25b1d778a16ffb72a145a673294f85e6a0970df22e2959cd38c19e86\": container with ID starting with eb6a9d0c25b1d778a16ffb72a145a673294f85e6a0970df22e2959cd38c19e86 not found: ID does not exist"
Jan 28 21:10:50 crc kubenswrapper[4746]: I0128 21:10:50.187883 4746 scope.go:117] "RemoveContainer" containerID="8b752a2bef30df0d1961dc5c00379a434a7fe2b5c06c7ba36c396c2a529f893f"
Jan 28 21:10:50 crc kubenswrapper[4746]: E0128 21:10:50.188802 4746 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"8b752a2bef30df0d1961dc5c00379a434a7fe2b5c06c7ba36c396c2a529f893f\": container with ID starting with 8b752a2bef30df0d1961dc5c00379a434a7fe2b5c06c7ba36c396c2a529f893f not found: ID does not exist" containerID="8b752a2bef30df0d1961dc5c00379a434a7fe2b5c06c7ba36c396c2a529f893f"
Jan 28 21:10:50 crc kubenswrapper[4746]: I0128 21:10:50.188834 4746 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"8b752a2bef30df0d1961dc5c00379a434a7fe2b5c06c7ba36c396c2a529f893f"} err="failed to get container status \"8b752a2bef30df0d1961dc5c00379a434a7fe2b5c06c7ba36c396c2a529f893f\": rpc error: code = NotFound desc = could not find container \"8b752a2bef30df0d1961dc5c00379a434a7fe2b5c06c7ba36c396c2a529f893f\": container with ID starting with 8b752a2bef30df0d1961dc5c00379a434a7fe2b5c06c7ba36c396c2a529f893f not found: ID does not exist"
Jan 28 21:10:50 crc kubenswrapper[4746]: I0128 21:10:50.850488 4746 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="1e93abc2-ecc1-4350-823c-9a8e87f5df39" path="/var/lib/kubelet/pods/1e93abc2-ecc1-4350-823c-9a8e87f5df39/volumes"
Jan 28 21:10:51 crc kubenswrapper[4746]: I0128
21:10:51.578335 4746 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/configure-network-edpm-deployment-openstack-edpm-ipam-tfrpf" Jan 28 21:10:51 crc kubenswrapper[4746]: I0128 21:10:51.692676 4746 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/f10b4bd3-0df0-4ab1-8ec6-4163541a3bf2-ssh-key-openstack-edpm-ipam\") pod \"f10b4bd3-0df0-4ab1-8ec6-4163541a3bf2\" (UID: \"f10b4bd3-0df0-4ab1-8ec6-4163541a3bf2\") " Jan 28 21:10:51 crc kubenswrapper[4746]: I0128 21:10:51.692829 4746 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/f10b4bd3-0df0-4ab1-8ec6-4163541a3bf2-inventory\") pod \"f10b4bd3-0df0-4ab1-8ec6-4163541a3bf2\" (UID: \"f10b4bd3-0df0-4ab1-8ec6-4163541a3bf2\") " Jan 28 21:10:51 crc kubenswrapper[4746]: I0128 21:10:51.692866 4746 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-kczl7\" (UniqueName: \"kubernetes.io/projected/f10b4bd3-0df0-4ab1-8ec6-4163541a3bf2-kube-api-access-kczl7\") pod \"f10b4bd3-0df0-4ab1-8ec6-4163541a3bf2\" (UID: \"f10b4bd3-0df0-4ab1-8ec6-4163541a3bf2\") " Jan 28 21:10:51 crc kubenswrapper[4746]: I0128 21:10:51.700420 4746 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/f10b4bd3-0df0-4ab1-8ec6-4163541a3bf2-kube-api-access-kczl7" (OuterVolumeSpecName: "kube-api-access-kczl7") pod "f10b4bd3-0df0-4ab1-8ec6-4163541a3bf2" (UID: "f10b4bd3-0df0-4ab1-8ec6-4163541a3bf2"). InnerVolumeSpecName "kube-api-access-kczl7". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 28 21:10:51 crc kubenswrapper[4746]: I0128 21:10:51.735420 4746 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/f10b4bd3-0df0-4ab1-8ec6-4163541a3bf2-inventory" (OuterVolumeSpecName: "inventory") pod "f10b4bd3-0df0-4ab1-8ec6-4163541a3bf2" (UID: "f10b4bd3-0df0-4ab1-8ec6-4163541a3bf2"). InnerVolumeSpecName "inventory". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 28 21:10:51 crc kubenswrapper[4746]: I0128 21:10:51.739836 4746 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/f10b4bd3-0df0-4ab1-8ec6-4163541a3bf2-ssh-key-openstack-edpm-ipam" (OuterVolumeSpecName: "ssh-key-openstack-edpm-ipam") pod "f10b4bd3-0df0-4ab1-8ec6-4163541a3bf2" (UID: "f10b4bd3-0df0-4ab1-8ec6-4163541a3bf2"). InnerVolumeSpecName "ssh-key-openstack-edpm-ipam". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 28 21:10:51 crc kubenswrapper[4746]: I0128 21:10:51.795824 4746 reconciler_common.go:293] "Volume detached for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/f10b4bd3-0df0-4ab1-8ec6-4163541a3bf2-ssh-key-openstack-edpm-ipam\") on node \"crc\" DevicePath \"\"" Jan 28 21:10:51 crc kubenswrapper[4746]: I0128 21:10:51.795858 4746 reconciler_common.go:293] "Volume detached for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/f10b4bd3-0df0-4ab1-8ec6-4163541a3bf2-inventory\") on node \"crc\" DevicePath \"\"" Jan 28 21:10:51 crc kubenswrapper[4746]: I0128 21:10:51.795867 4746 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-kczl7\" (UniqueName: \"kubernetes.io/projected/f10b4bd3-0df0-4ab1-8ec6-4163541a3bf2-kube-api-access-kczl7\") on node \"crc\" DevicePath \"\"" Jan 28 21:10:52 crc kubenswrapper[4746]: I0128 21:10:52.039129 4746 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/configure-network-edpm-deployment-openstack-edpm-ipam-tfrpf" 
event={"ID":"f10b4bd3-0df0-4ab1-8ec6-4163541a3bf2","Type":"ContainerDied","Data":"5dfe601e827a7b5e5b38f650bb7dd206ae9900aaa8d89915095cc62a90354467"} Jan 28 21:10:52 crc kubenswrapper[4746]: I0128 21:10:52.039178 4746 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/configure-network-edpm-deployment-openstack-edpm-ipam-tfrpf" Jan 28 21:10:52 crc kubenswrapper[4746]: I0128 21:10:52.039189 4746 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="5dfe601e827a7b5e5b38f650bb7dd206ae9900aaa8d89915095cc62a90354467" Jan 28 21:10:52 crc kubenswrapper[4746]: I0128 21:10:52.156817 4746 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/validate-network-edpm-deployment-openstack-edpm-ipam-rjltv"] Jan 28 21:10:52 crc kubenswrapper[4746]: E0128 21:10:52.157359 4746 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="1e93abc2-ecc1-4350-823c-9a8e87f5df39" containerName="extract-utilities" Jan 28 21:10:52 crc kubenswrapper[4746]: I0128 21:10:52.157578 4746 state_mem.go:107] "Deleted CPUSet assignment" podUID="1e93abc2-ecc1-4350-823c-9a8e87f5df39" containerName="extract-utilities" Jan 28 21:10:52 crc kubenswrapper[4746]: E0128 21:10:52.157620 4746 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="1e93abc2-ecc1-4350-823c-9a8e87f5df39" containerName="extract-content" Jan 28 21:10:52 crc kubenswrapper[4746]: I0128 21:10:52.157631 4746 state_mem.go:107] "Deleted CPUSet assignment" podUID="1e93abc2-ecc1-4350-823c-9a8e87f5df39" containerName="extract-content" Jan 28 21:10:52 crc kubenswrapper[4746]: E0128 21:10:52.157642 4746 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f10b4bd3-0df0-4ab1-8ec6-4163541a3bf2" containerName="configure-network-edpm-deployment-openstack-edpm-ipam" Jan 28 21:10:52 crc kubenswrapper[4746]: I0128 21:10:52.157650 4746 state_mem.go:107] "Deleted CPUSet assignment" podUID="f10b4bd3-0df0-4ab1-8ec6-4163541a3bf2" 
containerName="configure-network-edpm-deployment-openstack-edpm-ipam" Jan 28 21:10:52 crc kubenswrapper[4746]: E0128 21:10:52.157674 4746 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="1e93abc2-ecc1-4350-823c-9a8e87f5df39" containerName="registry-server" Jan 28 21:10:52 crc kubenswrapper[4746]: I0128 21:10:52.157683 4746 state_mem.go:107] "Deleted CPUSet assignment" podUID="1e93abc2-ecc1-4350-823c-9a8e87f5df39" containerName="registry-server" Jan 28 21:10:52 crc kubenswrapper[4746]: I0128 21:10:52.157966 4746 memory_manager.go:354] "RemoveStaleState removing state" podUID="f10b4bd3-0df0-4ab1-8ec6-4163541a3bf2" containerName="configure-network-edpm-deployment-openstack-edpm-ipam" Jan 28 21:10:52 crc kubenswrapper[4746]: I0128 21:10:52.157998 4746 memory_manager.go:354] "RemoveStaleState removing state" podUID="1e93abc2-ecc1-4350-823c-9a8e87f5df39" containerName="registry-server" Jan 28 21:10:52 crc kubenswrapper[4746]: I0128 21:10:52.166389 4746 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/validate-network-edpm-deployment-openstack-edpm-ipam-rjltv" Jan 28 21:10:52 crc kubenswrapper[4746]: I0128 21:10:52.173071 4746 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplane-ansible-ssh-private-key-secret" Jan 28 21:10:52 crc kubenswrapper[4746]: I0128 21:10:52.173139 4746 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplanenodeset-openstack-edpm-ipam" Jan 28 21:10:52 crc kubenswrapper[4746]: I0128 21:10:52.174398 4746 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-edpm-ipam-dockercfg-xb8vv" Jan 28 21:10:52 crc kubenswrapper[4746]: I0128 21:10:52.174452 4746 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-aee-default-env" Jan 28 21:10:52 crc kubenswrapper[4746]: I0128 21:10:52.192372 4746 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/validate-network-edpm-deployment-openstack-edpm-ipam-rjltv"] Jan 28 21:10:52 crc kubenswrapper[4746]: I0128 21:10:52.305979 4746 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/8728e263-d102-4878-a40e-30e414240224-inventory\") pod \"validate-network-edpm-deployment-openstack-edpm-ipam-rjltv\" (UID: \"8728e263-d102-4878-a40e-30e414240224\") " pod="openstack/validate-network-edpm-deployment-openstack-edpm-ipam-rjltv" Jan 28 21:10:52 crc kubenswrapper[4746]: I0128 21:10:52.306028 4746 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/8728e263-d102-4878-a40e-30e414240224-ssh-key-openstack-edpm-ipam\") pod \"validate-network-edpm-deployment-openstack-edpm-ipam-rjltv\" (UID: \"8728e263-d102-4878-a40e-30e414240224\") " pod="openstack/validate-network-edpm-deployment-openstack-edpm-ipam-rjltv" Jan 28 21:10:52 crc kubenswrapper[4746]: 
I0128 21:10:52.306146 4746 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-wrf58\" (UniqueName: \"kubernetes.io/projected/8728e263-d102-4878-a40e-30e414240224-kube-api-access-wrf58\") pod \"validate-network-edpm-deployment-openstack-edpm-ipam-rjltv\" (UID: \"8728e263-d102-4878-a40e-30e414240224\") " pod="openstack/validate-network-edpm-deployment-openstack-edpm-ipam-rjltv" Jan 28 21:10:52 crc kubenswrapper[4746]: I0128 21:10:52.407677 4746 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/8728e263-d102-4878-a40e-30e414240224-inventory\") pod \"validate-network-edpm-deployment-openstack-edpm-ipam-rjltv\" (UID: \"8728e263-d102-4878-a40e-30e414240224\") " pod="openstack/validate-network-edpm-deployment-openstack-edpm-ipam-rjltv" Jan 28 21:10:52 crc kubenswrapper[4746]: I0128 21:10:52.407729 4746 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/8728e263-d102-4878-a40e-30e414240224-ssh-key-openstack-edpm-ipam\") pod \"validate-network-edpm-deployment-openstack-edpm-ipam-rjltv\" (UID: \"8728e263-d102-4878-a40e-30e414240224\") " pod="openstack/validate-network-edpm-deployment-openstack-edpm-ipam-rjltv" Jan 28 21:10:52 crc kubenswrapper[4746]: I0128 21:10:52.407794 4746 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-wrf58\" (UniqueName: \"kubernetes.io/projected/8728e263-d102-4878-a40e-30e414240224-kube-api-access-wrf58\") pod \"validate-network-edpm-deployment-openstack-edpm-ipam-rjltv\" (UID: \"8728e263-d102-4878-a40e-30e414240224\") " pod="openstack/validate-network-edpm-deployment-openstack-edpm-ipam-rjltv" Jan 28 21:10:52 crc kubenswrapper[4746]: I0128 21:10:52.411719 4746 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: 
\"kubernetes.io/secret/8728e263-d102-4878-a40e-30e414240224-ssh-key-openstack-edpm-ipam\") pod \"validate-network-edpm-deployment-openstack-edpm-ipam-rjltv\" (UID: \"8728e263-d102-4878-a40e-30e414240224\") " pod="openstack/validate-network-edpm-deployment-openstack-edpm-ipam-rjltv" Jan 28 21:10:52 crc kubenswrapper[4746]: I0128 21:10:52.412663 4746 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/8728e263-d102-4878-a40e-30e414240224-inventory\") pod \"validate-network-edpm-deployment-openstack-edpm-ipam-rjltv\" (UID: \"8728e263-d102-4878-a40e-30e414240224\") " pod="openstack/validate-network-edpm-deployment-openstack-edpm-ipam-rjltv" Jan 28 21:10:52 crc kubenswrapper[4746]: I0128 21:10:52.427714 4746 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-wrf58\" (UniqueName: \"kubernetes.io/projected/8728e263-d102-4878-a40e-30e414240224-kube-api-access-wrf58\") pod \"validate-network-edpm-deployment-openstack-edpm-ipam-rjltv\" (UID: \"8728e263-d102-4878-a40e-30e414240224\") " pod="openstack/validate-network-edpm-deployment-openstack-edpm-ipam-rjltv" Jan 28 21:10:52 crc kubenswrapper[4746]: I0128 21:10:52.504548 4746 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/validate-network-edpm-deployment-openstack-edpm-ipam-rjltv" Jan 28 21:10:52 crc kubenswrapper[4746]: I0128 21:10:52.846312 4746 scope.go:117] "RemoveContainer" containerID="4ff7d295e236267b904a42699c2e8affd72dcade2a9eeee3b96c8eb13de3f968" Jan 28 21:10:52 crc kubenswrapper[4746]: E0128 21:10:52.846897 4746 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-6wrnw_openshift-machine-config-operator(6dc8b546-9734-4082-b2b3-2bafe3f1564d)\"" pod="openshift-machine-config-operator/machine-config-daemon-6wrnw" podUID="6dc8b546-9734-4082-b2b3-2bafe3f1564d" Jan 28 21:10:53 crc kubenswrapper[4746]: I0128 21:10:53.044927 4746 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/validate-network-edpm-deployment-openstack-edpm-ipam-rjltv"] Jan 28 21:10:54 crc kubenswrapper[4746]: I0128 21:10:54.056051 4746 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/validate-network-edpm-deployment-openstack-edpm-ipam-rjltv" event={"ID":"8728e263-d102-4878-a40e-30e414240224","Type":"ContainerStarted","Data":"fdeda4fd776f169ffbbd7edd080f09db3c883c3c0c373b4958456779f61ef72f"} Jan 28 21:10:54 crc kubenswrapper[4746]: I0128 21:10:54.056455 4746 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/validate-network-edpm-deployment-openstack-edpm-ipam-rjltv" event={"ID":"8728e263-d102-4878-a40e-30e414240224","Type":"ContainerStarted","Data":"00630491c543b2a0ec7436a3a8f7545709d1652f05ce583eb195885e73c00275"} Jan 28 21:10:54 crc kubenswrapper[4746]: I0128 21:10:54.074879 4746 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/validate-network-edpm-deployment-openstack-edpm-ipam-rjltv" podStartSLOduration=1.470102308 podStartE2EDuration="2.07486147s" podCreationTimestamp="2026-01-28 21:10:52 +0000 UTC" 
firstStartedPulling="2026-01-28 21:10:53.043898683 +0000 UTC m=+1881.000085037" lastFinishedPulling="2026-01-28 21:10:53.648657845 +0000 UTC m=+1881.604844199" observedRunningTime="2026-01-28 21:10:54.070518893 +0000 UTC m=+1882.026705267" watchObservedRunningTime="2026-01-28 21:10:54.07486147 +0000 UTC m=+1882.031047824" Jan 28 21:10:57 crc kubenswrapper[4746]: I0128 21:10:57.741529 4746 scope.go:117] "RemoveContainer" containerID="bd6ee0278b68974bd3186415cd7d01885d9dea5c6ff0d0064894eda32cec0c8f" Jan 28 21:10:57 crc kubenswrapper[4746]: I0128 21:10:57.813527 4746 scope.go:117] "RemoveContainer" containerID="b2ac020924b66e52e098563168d3a468eaa3d99aa0608464dec14ec0357795ec" Jan 28 21:10:57 crc kubenswrapper[4746]: I0128 21:10:57.863316 4746 scope.go:117] "RemoveContainer" containerID="c5ca0c4931baaab786455361cdeeb73786cf5eb3e4b9d0efcad67e061109a926" Jan 28 21:10:57 crc kubenswrapper[4746]: I0128 21:10:57.937539 4746 scope.go:117] "RemoveContainer" containerID="fac7d70a5b957886a2ba853350ecac76e6383948d810c424b865d3c3f9334bfa" Jan 28 21:10:57 crc kubenswrapper[4746]: I0128 21:10:57.979296 4746 scope.go:117] "RemoveContainer" containerID="d23c067486da07f978541c7dba8e6461c6957dc68fe5452d6fe0d4b93cf13ed7" Jan 28 21:10:59 crc kubenswrapper[4746]: I0128 21:10:59.117794 4746 generic.go:334] "Generic (PLEG): container finished" podID="8728e263-d102-4878-a40e-30e414240224" containerID="fdeda4fd776f169ffbbd7edd080f09db3c883c3c0c373b4958456779f61ef72f" exitCode=0 Jan 28 21:10:59 crc kubenswrapper[4746]: I0128 21:10:59.117844 4746 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/validate-network-edpm-deployment-openstack-edpm-ipam-rjltv" event={"ID":"8728e263-d102-4878-a40e-30e414240224","Type":"ContainerDied","Data":"fdeda4fd776f169ffbbd7edd080f09db3c883c3c0c373b4958456779f61ef72f"} Jan 28 21:11:00 crc kubenswrapper[4746]: I0128 21:11:00.613960 4746 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/validate-network-edpm-deployment-openstack-edpm-ipam-rjltv" Jan 28 21:11:00 crc kubenswrapper[4746]: I0128 21:11:00.774860 4746 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-wrf58\" (UniqueName: \"kubernetes.io/projected/8728e263-d102-4878-a40e-30e414240224-kube-api-access-wrf58\") pod \"8728e263-d102-4878-a40e-30e414240224\" (UID: \"8728e263-d102-4878-a40e-30e414240224\") " Jan 28 21:11:00 crc kubenswrapper[4746]: I0128 21:11:00.775242 4746 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/8728e263-d102-4878-a40e-30e414240224-ssh-key-openstack-edpm-ipam\") pod \"8728e263-d102-4878-a40e-30e414240224\" (UID: \"8728e263-d102-4878-a40e-30e414240224\") " Jan 28 21:11:00 crc kubenswrapper[4746]: I0128 21:11:00.775427 4746 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/8728e263-d102-4878-a40e-30e414240224-inventory\") pod \"8728e263-d102-4878-a40e-30e414240224\" (UID: \"8728e263-d102-4878-a40e-30e414240224\") " Jan 28 21:11:00 crc kubenswrapper[4746]: I0128 21:11:00.780307 4746 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/8728e263-d102-4878-a40e-30e414240224-kube-api-access-wrf58" (OuterVolumeSpecName: "kube-api-access-wrf58") pod "8728e263-d102-4878-a40e-30e414240224" (UID: "8728e263-d102-4878-a40e-30e414240224"). InnerVolumeSpecName "kube-api-access-wrf58". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 28 21:11:00 crc kubenswrapper[4746]: I0128 21:11:00.803602 4746 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/8728e263-d102-4878-a40e-30e414240224-inventory" (OuterVolumeSpecName: "inventory") pod "8728e263-d102-4878-a40e-30e414240224" (UID: "8728e263-d102-4878-a40e-30e414240224"). 
InnerVolumeSpecName "inventory". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 28 21:11:00 crc kubenswrapper[4746]: I0128 21:11:00.805851 4746 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/8728e263-d102-4878-a40e-30e414240224-ssh-key-openstack-edpm-ipam" (OuterVolumeSpecName: "ssh-key-openstack-edpm-ipam") pod "8728e263-d102-4878-a40e-30e414240224" (UID: "8728e263-d102-4878-a40e-30e414240224"). InnerVolumeSpecName "ssh-key-openstack-edpm-ipam". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 28 21:11:00 crc kubenswrapper[4746]: I0128 21:11:00.878270 4746 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-wrf58\" (UniqueName: \"kubernetes.io/projected/8728e263-d102-4878-a40e-30e414240224-kube-api-access-wrf58\") on node \"crc\" DevicePath \"\"" Jan 28 21:11:00 crc kubenswrapper[4746]: I0128 21:11:00.878327 4746 reconciler_common.go:293] "Volume detached for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/8728e263-d102-4878-a40e-30e414240224-ssh-key-openstack-edpm-ipam\") on node \"crc\" DevicePath \"\"" Jan 28 21:11:00 crc kubenswrapper[4746]: I0128 21:11:00.878354 4746 reconciler_common.go:293] "Volume detached for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/8728e263-d102-4878-a40e-30e414240224-inventory\") on node \"crc\" DevicePath \"\"" Jan 28 21:11:01 crc kubenswrapper[4746]: I0128 21:11:01.145893 4746 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/validate-network-edpm-deployment-openstack-edpm-ipam-rjltv" event={"ID":"8728e263-d102-4878-a40e-30e414240224","Type":"ContainerDied","Data":"00630491c543b2a0ec7436a3a8f7545709d1652f05ce583eb195885e73c00275"} Jan 28 21:11:01 crc kubenswrapper[4746]: I0128 21:11:01.146396 4746 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="00630491c543b2a0ec7436a3a8f7545709d1652f05ce583eb195885e73c00275" Jan 28 21:11:01 crc kubenswrapper[4746]: I0128 
21:11:01.145938 4746 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/validate-network-edpm-deployment-openstack-edpm-ipam-rjltv" Jan 28 21:11:01 crc kubenswrapper[4746]: I0128 21:11:01.221752 4746 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/install-os-edpm-deployment-openstack-edpm-ipam-qdhsw"] Jan 28 21:11:01 crc kubenswrapper[4746]: E0128 21:11:01.222314 4746 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="8728e263-d102-4878-a40e-30e414240224" containerName="validate-network-edpm-deployment-openstack-edpm-ipam" Jan 28 21:11:01 crc kubenswrapper[4746]: I0128 21:11:01.222334 4746 state_mem.go:107] "Deleted CPUSet assignment" podUID="8728e263-d102-4878-a40e-30e414240224" containerName="validate-network-edpm-deployment-openstack-edpm-ipam" Jan 28 21:11:01 crc kubenswrapper[4746]: I0128 21:11:01.222572 4746 memory_manager.go:354] "RemoveStaleState removing state" podUID="8728e263-d102-4878-a40e-30e414240224" containerName="validate-network-edpm-deployment-openstack-edpm-ipam" Jan 28 21:11:01 crc kubenswrapper[4746]: I0128 21:11:01.223484 4746 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/install-os-edpm-deployment-openstack-edpm-ipam-qdhsw" Jan 28 21:11:01 crc kubenswrapper[4746]: I0128 21:11:01.225907 4746 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplanenodeset-openstack-edpm-ipam" Jan 28 21:11:01 crc kubenswrapper[4746]: I0128 21:11:01.226819 4746 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-aee-default-env" Jan 28 21:11:01 crc kubenswrapper[4746]: I0128 21:11:01.227274 4746 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplane-ansible-ssh-private-key-secret" Jan 28 21:11:01 crc kubenswrapper[4746]: I0128 21:11:01.236951 4746 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-edpm-ipam-dockercfg-xb8vv" Jan 28 21:11:01 crc kubenswrapper[4746]: I0128 21:11:01.238188 4746 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/install-os-edpm-deployment-openstack-edpm-ipam-qdhsw"] Jan 28 21:11:01 crc kubenswrapper[4746]: I0128 21:11:01.388747 4746 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/55aba866-d60c-4581-8f83-28fc14e421f8-ssh-key-openstack-edpm-ipam\") pod \"install-os-edpm-deployment-openstack-edpm-ipam-qdhsw\" (UID: \"55aba866-d60c-4581-8f83-28fc14e421f8\") " pod="openstack/install-os-edpm-deployment-openstack-edpm-ipam-qdhsw" Jan 28 21:11:01 crc kubenswrapper[4746]: I0128 21:11:01.389325 4746 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/55aba866-d60c-4581-8f83-28fc14e421f8-inventory\") pod \"install-os-edpm-deployment-openstack-edpm-ipam-qdhsw\" (UID: \"55aba866-d60c-4581-8f83-28fc14e421f8\") " pod="openstack/install-os-edpm-deployment-openstack-edpm-ipam-qdhsw" Jan 28 21:11:01 crc kubenswrapper[4746]: I0128 21:11:01.389471 4746 
reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-zc4gh\" (UniqueName: \"kubernetes.io/projected/55aba866-d60c-4581-8f83-28fc14e421f8-kube-api-access-zc4gh\") pod \"install-os-edpm-deployment-openstack-edpm-ipam-qdhsw\" (UID: \"55aba866-d60c-4581-8f83-28fc14e421f8\") " pod="openstack/install-os-edpm-deployment-openstack-edpm-ipam-qdhsw" Jan 28 21:11:01 crc kubenswrapper[4746]: I0128 21:11:01.491245 4746 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/55aba866-d60c-4581-8f83-28fc14e421f8-ssh-key-openstack-edpm-ipam\") pod \"install-os-edpm-deployment-openstack-edpm-ipam-qdhsw\" (UID: \"55aba866-d60c-4581-8f83-28fc14e421f8\") " pod="openstack/install-os-edpm-deployment-openstack-edpm-ipam-qdhsw" Jan 28 21:11:01 crc kubenswrapper[4746]: I0128 21:11:01.491406 4746 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/55aba866-d60c-4581-8f83-28fc14e421f8-inventory\") pod \"install-os-edpm-deployment-openstack-edpm-ipam-qdhsw\" (UID: \"55aba866-d60c-4581-8f83-28fc14e421f8\") " pod="openstack/install-os-edpm-deployment-openstack-edpm-ipam-qdhsw" Jan 28 21:11:01 crc kubenswrapper[4746]: I0128 21:11:01.491448 4746 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-zc4gh\" (UniqueName: \"kubernetes.io/projected/55aba866-d60c-4581-8f83-28fc14e421f8-kube-api-access-zc4gh\") pod \"install-os-edpm-deployment-openstack-edpm-ipam-qdhsw\" (UID: \"55aba866-d60c-4581-8f83-28fc14e421f8\") " pod="openstack/install-os-edpm-deployment-openstack-edpm-ipam-qdhsw" Jan 28 21:11:01 crc kubenswrapper[4746]: I0128 21:11:01.496728 4746 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/55aba866-d60c-4581-8f83-28fc14e421f8-inventory\") pod 
\"install-os-edpm-deployment-openstack-edpm-ipam-qdhsw\" (UID: \"55aba866-d60c-4581-8f83-28fc14e421f8\") " pod="openstack/install-os-edpm-deployment-openstack-edpm-ipam-qdhsw" Jan 28 21:11:01 crc kubenswrapper[4746]: I0128 21:11:01.497717 4746 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/55aba866-d60c-4581-8f83-28fc14e421f8-ssh-key-openstack-edpm-ipam\") pod \"install-os-edpm-deployment-openstack-edpm-ipam-qdhsw\" (UID: \"55aba866-d60c-4581-8f83-28fc14e421f8\") " pod="openstack/install-os-edpm-deployment-openstack-edpm-ipam-qdhsw" Jan 28 21:11:01 crc kubenswrapper[4746]: I0128 21:11:01.513879 4746 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-zc4gh\" (UniqueName: \"kubernetes.io/projected/55aba866-d60c-4581-8f83-28fc14e421f8-kube-api-access-zc4gh\") pod \"install-os-edpm-deployment-openstack-edpm-ipam-qdhsw\" (UID: \"55aba866-d60c-4581-8f83-28fc14e421f8\") " pod="openstack/install-os-edpm-deployment-openstack-edpm-ipam-qdhsw" Jan 28 21:11:01 crc kubenswrapper[4746]: I0128 21:11:01.541596 4746 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/install-os-edpm-deployment-openstack-edpm-ipam-qdhsw" Jan 28 21:11:02 crc kubenswrapper[4746]: I0128 21:11:02.108568 4746 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/install-os-edpm-deployment-openstack-edpm-ipam-qdhsw"] Jan 28 21:11:02 crc kubenswrapper[4746]: I0128 21:11:02.163515 4746 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/install-os-edpm-deployment-openstack-edpm-ipam-qdhsw" event={"ID":"55aba866-d60c-4581-8f83-28fc14e421f8","Type":"ContainerStarted","Data":"41db69260c3286c21ba2442e864ae2ef580bda8b9deecf74c0d5d6ccb9dc61e1"} Jan 28 21:11:03 crc kubenswrapper[4746]: I0128 21:11:03.173442 4746 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/install-os-edpm-deployment-openstack-edpm-ipam-qdhsw" event={"ID":"55aba866-d60c-4581-8f83-28fc14e421f8","Type":"ContainerStarted","Data":"11aef63d47ff6bdeb8b91607ae39040b8ff46d81aee0f78fc01cacf626f277c5"} Jan 28 21:11:03 crc kubenswrapper[4746]: I0128 21:11:03.191483 4746 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/install-os-edpm-deployment-openstack-edpm-ipam-qdhsw" podStartSLOduration=1.761934026 podStartE2EDuration="2.19146336s" podCreationTimestamp="2026-01-28 21:11:01 +0000 UTC" firstStartedPulling="2026-01-28 21:11:02.110762194 +0000 UTC m=+1890.066948548" lastFinishedPulling="2026-01-28 21:11:02.540291538 +0000 UTC m=+1890.496477882" observedRunningTime="2026-01-28 21:11:03.187128664 +0000 UTC m=+1891.143315018" watchObservedRunningTime="2026-01-28 21:11:03.19146336 +0000 UTC m=+1891.147649714" Jan 28 21:11:06 crc kubenswrapper[4746]: I0128 21:11:06.050238 4746 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-cell0-e199-account-create-update-v6fd6"] Jan 28 21:11:06 crc kubenswrapper[4746]: I0128 21:11:06.063256 4746 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-api-db-create-j4mld"] Jan 28 21:11:06 crc kubenswrapper[4746]: I0128 
21:11:06.075398 4746 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-cell1-42e2-account-create-update-k45vw"] Jan 28 21:11:06 crc kubenswrapper[4746]: I0128 21:11:06.089060 4746 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-cell1-db-create-kp99z"] Jan 28 21:11:06 crc kubenswrapper[4746]: I0128 21:11:06.100936 4746 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-api-c1ea-account-create-update-kph5m"] Jan 28 21:11:06 crc kubenswrapper[4746]: I0128 21:11:06.111099 4746 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-cell1-42e2-account-create-update-k45vw"] Jan 28 21:11:06 crc kubenswrapper[4746]: I0128 21:11:06.120985 4746 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-cell1-db-create-kp99z"] Jan 28 21:11:06 crc kubenswrapper[4746]: I0128 21:11:06.131118 4746 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-api-c1ea-account-create-update-kph5m"] Jan 28 21:11:06 crc kubenswrapper[4746]: I0128 21:11:06.141732 4746 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-api-db-create-j4mld"] Jan 28 21:11:06 crc kubenswrapper[4746]: I0128 21:11:06.153649 4746 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-cell0-e199-account-create-update-v6fd6"] Jan 28 21:11:06 crc kubenswrapper[4746]: I0128 21:11:06.165039 4746 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-cell0-db-create-gmx6k"] Jan 28 21:11:06 crc kubenswrapper[4746]: I0128 21:11:06.175198 4746 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-cell0-db-create-gmx6k"] Jan 28 21:11:06 crc kubenswrapper[4746]: I0128 21:11:06.836536 4746 scope.go:117] "RemoveContainer" containerID="4ff7d295e236267b904a42699c2e8affd72dcade2a9eeee3b96c8eb13de3f968" Jan 28 21:11:06 crc kubenswrapper[4746]: E0128 21:11:06.836796 4746 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for 
\"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-6wrnw_openshift-machine-config-operator(6dc8b546-9734-4082-b2b3-2bafe3f1564d)\"" pod="openshift-machine-config-operator/machine-config-daemon-6wrnw" podUID="6dc8b546-9734-4082-b2b3-2bafe3f1564d" Jan 28 21:11:06 crc kubenswrapper[4746]: I0128 21:11:06.850706 4746 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="0a9afff1-311d-4881-a211-e405af09d4a7" path="/var/lib/kubelet/pods/0a9afff1-311d-4881-a211-e405af09d4a7/volumes" Jan 28 21:11:06 crc kubenswrapper[4746]: I0128 21:11:06.851374 4746 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="4718d022-41c2-4684-9093-1f83e23dc367" path="/var/lib/kubelet/pods/4718d022-41c2-4684-9093-1f83e23dc367/volumes" Jan 28 21:11:06 crc kubenswrapper[4746]: I0128 21:11:06.851924 4746 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="6b0f1a53-15ed-455c-80a3-7f92a3851538" path="/var/lib/kubelet/pods/6b0f1a53-15ed-455c-80a3-7f92a3851538/volumes" Jan 28 21:11:06 crc kubenswrapper[4746]: I0128 21:11:06.852455 4746 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="9f60a726-8cd2-4eca-b252-014d68fded35" path="/var/lib/kubelet/pods/9f60a726-8cd2-4eca-b252-014d68fded35/volumes" Jan 28 21:11:06 crc kubenswrapper[4746]: I0128 21:11:06.853481 4746 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="add7217e-e0fc-4079-a1c4-b3a328588a9a" path="/var/lib/kubelet/pods/add7217e-e0fc-4079-a1c4-b3a328588a9a/volumes" Jan 28 21:11:06 crc kubenswrapper[4746]: I0128 21:11:06.853994 4746 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="eae3e347-2cb4-49b7-999d-abdfe9c3d2ac" path="/var/lib/kubelet/pods/eae3e347-2cb4-49b7-999d-abdfe9c3d2ac/volumes" Jan 28 21:11:18 crc kubenswrapper[4746]: I0128 21:11:18.835960 4746 scope.go:117] "RemoveContainer" 
containerID="4ff7d295e236267b904a42699c2e8affd72dcade2a9eeee3b96c8eb13de3f968" Jan 28 21:11:18 crc kubenswrapper[4746]: E0128 21:11:18.837041 4746 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-6wrnw_openshift-machine-config-operator(6dc8b546-9734-4082-b2b3-2bafe3f1564d)\"" pod="openshift-machine-config-operator/machine-config-daemon-6wrnw" podUID="6dc8b546-9734-4082-b2b3-2bafe3f1564d" Jan 28 21:11:32 crc kubenswrapper[4746]: I0128 21:11:32.844476 4746 scope.go:117] "RemoveContainer" containerID="4ff7d295e236267b904a42699c2e8affd72dcade2a9eeee3b96c8eb13de3f968" Jan 28 21:11:32 crc kubenswrapper[4746]: E0128 21:11:32.845573 4746 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-6wrnw_openshift-machine-config-operator(6dc8b546-9734-4082-b2b3-2bafe3f1564d)\"" pod="openshift-machine-config-operator/machine-config-daemon-6wrnw" podUID="6dc8b546-9734-4082-b2b3-2bafe3f1564d" Jan 28 21:11:37 crc kubenswrapper[4746]: I0128 21:11:37.505065 4746 generic.go:334] "Generic (PLEG): container finished" podID="55aba866-d60c-4581-8f83-28fc14e421f8" containerID="11aef63d47ff6bdeb8b91607ae39040b8ff46d81aee0f78fc01cacf626f277c5" exitCode=0 Jan 28 21:11:37 crc kubenswrapper[4746]: I0128 21:11:37.505141 4746 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/install-os-edpm-deployment-openstack-edpm-ipam-qdhsw" event={"ID":"55aba866-d60c-4581-8f83-28fc14e421f8","Type":"ContainerDied","Data":"11aef63d47ff6bdeb8b91607ae39040b8ff46d81aee0f78fc01cacf626f277c5"} Jan 28 21:11:39 crc kubenswrapper[4746]: I0128 21:11:39.095067 4746 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/install-os-edpm-deployment-openstack-edpm-ipam-qdhsw" Jan 28 21:11:39 crc kubenswrapper[4746]: I0128 21:11:39.228861 4746 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/55aba866-d60c-4581-8f83-28fc14e421f8-inventory\") pod \"55aba866-d60c-4581-8f83-28fc14e421f8\" (UID: \"55aba866-d60c-4581-8f83-28fc14e421f8\") " Jan 28 21:11:39 crc kubenswrapper[4746]: I0128 21:11:39.229316 4746 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-zc4gh\" (UniqueName: \"kubernetes.io/projected/55aba866-d60c-4581-8f83-28fc14e421f8-kube-api-access-zc4gh\") pod \"55aba866-d60c-4581-8f83-28fc14e421f8\" (UID: \"55aba866-d60c-4581-8f83-28fc14e421f8\") " Jan 28 21:11:39 crc kubenswrapper[4746]: I0128 21:11:39.229847 4746 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/55aba866-d60c-4581-8f83-28fc14e421f8-ssh-key-openstack-edpm-ipam\") pod \"55aba866-d60c-4581-8f83-28fc14e421f8\" (UID: \"55aba866-d60c-4581-8f83-28fc14e421f8\") " Jan 28 21:11:39 crc kubenswrapper[4746]: I0128 21:11:39.237488 4746 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/55aba866-d60c-4581-8f83-28fc14e421f8-kube-api-access-zc4gh" (OuterVolumeSpecName: "kube-api-access-zc4gh") pod "55aba866-d60c-4581-8f83-28fc14e421f8" (UID: "55aba866-d60c-4581-8f83-28fc14e421f8"). InnerVolumeSpecName "kube-api-access-zc4gh". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 28 21:11:39 crc kubenswrapper[4746]: I0128 21:11:39.269908 4746 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/55aba866-d60c-4581-8f83-28fc14e421f8-inventory" (OuterVolumeSpecName: "inventory") pod "55aba866-d60c-4581-8f83-28fc14e421f8" (UID: "55aba866-d60c-4581-8f83-28fc14e421f8"). 
InnerVolumeSpecName "inventory". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 28 21:11:39 crc kubenswrapper[4746]: I0128 21:11:39.287205 4746 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/55aba866-d60c-4581-8f83-28fc14e421f8-ssh-key-openstack-edpm-ipam" (OuterVolumeSpecName: "ssh-key-openstack-edpm-ipam") pod "55aba866-d60c-4581-8f83-28fc14e421f8" (UID: "55aba866-d60c-4581-8f83-28fc14e421f8"). InnerVolumeSpecName "ssh-key-openstack-edpm-ipam". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 28 21:11:39 crc kubenswrapper[4746]: I0128 21:11:39.332244 4746 reconciler_common.go:293] "Volume detached for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/55aba866-d60c-4581-8f83-28fc14e421f8-inventory\") on node \"crc\" DevicePath \"\"" Jan 28 21:11:39 crc kubenswrapper[4746]: I0128 21:11:39.332278 4746 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-zc4gh\" (UniqueName: \"kubernetes.io/projected/55aba866-d60c-4581-8f83-28fc14e421f8-kube-api-access-zc4gh\") on node \"crc\" DevicePath \"\"" Jan 28 21:11:39 crc kubenswrapper[4746]: I0128 21:11:39.332291 4746 reconciler_common.go:293] "Volume detached for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/55aba866-d60c-4581-8f83-28fc14e421f8-ssh-key-openstack-edpm-ipam\") on node \"crc\" DevicePath \"\"" Jan 28 21:11:39 crc kubenswrapper[4746]: I0128 21:11:39.524702 4746 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/install-os-edpm-deployment-openstack-edpm-ipam-qdhsw" event={"ID":"55aba866-d60c-4581-8f83-28fc14e421f8","Type":"ContainerDied","Data":"41db69260c3286c21ba2442e864ae2ef580bda8b9deecf74c0d5d6ccb9dc61e1"} Jan 28 21:11:39 crc kubenswrapper[4746]: I0128 21:11:39.524757 4746 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="41db69260c3286c21ba2442e864ae2ef580bda8b9deecf74c0d5d6ccb9dc61e1" Jan 28 21:11:39 crc kubenswrapper[4746]: I0128 
21:11:39.524761 4746 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/install-os-edpm-deployment-openstack-edpm-ipam-qdhsw" Jan 28 21:11:39 crc kubenswrapper[4746]: I0128 21:11:39.623346 4746 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/configure-os-edpm-deployment-openstack-edpm-ipam-42x6f"] Jan 28 21:11:39 crc kubenswrapper[4746]: E0128 21:11:39.623908 4746 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="55aba866-d60c-4581-8f83-28fc14e421f8" containerName="install-os-edpm-deployment-openstack-edpm-ipam" Jan 28 21:11:39 crc kubenswrapper[4746]: I0128 21:11:39.623932 4746 state_mem.go:107] "Deleted CPUSet assignment" podUID="55aba866-d60c-4581-8f83-28fc14e421f8" containerName="install-os-edpm-deployment-openstack-edpm-ipam" Jan 28 21:11:39 crc kubenswrapper[4746]: I0128 21:11:39.624211 4746 memory_manager.go:354] "RemoveStaleState removing state" podUID="55aba866-d60c-4581-8f83-28fc14e421f8" containerName="install-os-edpm-deployment-openstack-edpm-ipam" Jan 28 21:11:39 crc kubenswrapper[4746]: I0128 21:11:39.625159 4746 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/configure-os-edpm-deployment-openstack-edpm-ipam-42x6f" Jan 28 21:11:39 crc kubenswrapper[4746]: I0128 21:11:39.627375 4746 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplane-ansible-ssh-private-key-secret" Jan 28 21:11:39 crc kubenswrapper[4746]: I0128 21:11:39.627489 4746 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-aee-default-env" Jan 28 21:11:39 crc kubenswrapper[4746]: I0128 21:11:39.627850 4746 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplanenodeset-openstack-edpm-ipam" Jan 28 21:11:39 crc kubenswrapper[4746]: I0128 21:11:39.630558 4746 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-edpm-ipam-dockercfg-xb8vv" Jan 28 21:11:39 crc kubenswrapper[4746]: I0128 21:11:39.635246 4746 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/configure-os-edpm-deployment-openstack-edpm-ipam-42x6f"] Jan 28 21:11:39 crc kubenswrapper[4746]: I0128 21:11:39.741964 4746 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-7sxhs\" (UniqueName: \"kubernetes.io/projected/e9b6010d-cd57-4992-b441-1745330a0246-kube-api-access-7sxhs\") pod \"configure-os-edpm-deployment-openstack-edpm-ipam-42x6f\" (UID: \"e9b6010d-cd57-4992-b441-1745330a0246\") " pod="openstack/configure-os-edpm-deployment-openstack-edpm-ipam-42x6f" Jan 28 21:11:39 crc kubenswrapper[4746]: I0128 21:11:39.742024 4746 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/e9b6010d-cd57-4992-b441-1745330a0246-ssh-key-openstack-edpm-ipam\") pod \"configure-os-edpm-deployment-openstack-edpm-ipam-42x6f\" (UID: \"e9b6010d-cd57-4992-b441-1745330a0246\") " pod="openstack/configure-os-edpm-deployment-openstack-edpm-ipam-42x6f" Jan 28 21:11:39 crc 
kubenswrapper[4746]: I0128 21:11:39.742149 4746 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/e9b6010d-cd57-4992-b441-1745330a0246-inventory\") pod \"configure-os-edpm-deployment-openstack-edpm-ipam-42x6f\" (UID: \"e9b6010d-cd57-4992-b441-1745330a0246\") " pod="openstack/configure-os-edpm-deployment-openstack-edpm-ipam-42x6f" Jan 28 21:11:39 crc kubenswrapper[4746]: I0128 21:11:39.844012 4746 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/e9b6010d-cd57-4992-b441-1745330a0246-inventory\") pod \"configure-os-edpm-deployment-openstack-edpm-ipam-42x6f\" (UID: \"e9b6010d-cd57-4992-b441-1745330a0246\") " pod="openstack/configure-os-edpm-deployment-openstack-edpm-ipam-42x6f" Jan 28 21:11:39 crc kubenswrapper[4746]: I0128 21:11:39.844176 4746 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-7sxhs\" (UniqueName: \"kubernetes.io/projected/e9b6010d-cd57-4992-b441-1745330a0246-kube-api-access-7sxhs\") pod \"configure-os-edpm-deployment-openstack-edpm-ipam-42x6f\" (UID: \"e9b6010d-cd57-4992-b441-1745330a0246\") " pod="openstack/configure-os-edpm-deployment-openstack-edpm-ipam-42x6f" Jan 28 21:11:39 crc kubenswrapper[4746]: I0128 21:11:39.844557 4746 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/e9b6010d-cd57-4992-b441-1745330a0246-ssh-key-openstack-edpm-ipam\") pod \"configure-os-edpm-deployment-openstack-edpm-ipam-42x6f\" (UID: \"e9b6010d-cd57-4992-b441-1745330a0246\") " pod="openstack/configure-os-edpm-deployment-openstack-edpm-ipam-42x6f" Jan 28 21:11:39 crc kubenswrapper[4746]: I0128 21:11:39.849601 4746 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"inventory\" (UniqueName: 
\"kubernetes.io/secret/e9b6010d-cd57-4992-b441-1745330a0246-inventory\") pod \"configure-os-edpm-deployment-openstack-edpm-ipam-42x6f\" (UID: \"e9b6010d-cd57-4992-b441-1745330a0246\") " pod="openstack/configure-os-edpm-deployment-openstack-edpm-ipam-42x6f" Jan 28 21:11:39 crc kubenswrapper[4746]: I0128 21:11:39.851709 4746 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/e9b6010d-cd57-4992-b441-1745330a0246-ssh-key-openstack-edpm-ipam\") pod \"configure-os-edpm-deployment-openstack-edpm-ipam-42x6f\" (UID: \"e9b6010d-cd57-4992-b441-1745330a0246\") " pod="openstack/configure-os-edpm-deployment-openstack-edpm-ipam-42x6f" Jan 28 21:11:39 crc kubenswrapper[4746]: I0128 21:11:39.860016 4746 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-7sxhs\" (UniqueName: \"kubernetes.io/projected/e9b6010d-cd57-4992-b441-1745330a0246-kube-api-access-7sxhs\") pod \"configure-os-edpm-deployment-openstack-edpm-ipam-42x6f\" (UID: \"e9b6010d-cd57-4992-b441-1745330a0246\") " pod="openstack/configure-os-edpm-deployment-openstack-edpm-ipam-42x6f" Jan 28 21:11:39 crc kubenswrapper[4746]: I0128 21:11:39.943447 4746 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/configure-os-edpm-deployment-openstack-edpm-ipam-42x6f" Jan 28 21:11:40 crc kubenswrapper[4746]: I0128 21:11:40.073152 4746 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-cell0-conductor-db-sync-z7sfr"] Jan 28 21:11:40 crc kubenswrapper[4746]: I0128 21:11:40.093054 4746 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-cell0-conductor-db-sync-z7sfr"] Jan 28 21:11:40 crc kubenswrapper[4746]: I0128 21:11:40.616420 4746 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/configure-os-edpm-deployment-openstack-edpm-ipam-42x6f"] Jan 28 21:11:40 crc kubenswrapper[4746]: I0128 21:11:40.847774 4746 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="28e5e616-1a83-4053-8231-e3763118ca8e" path="/var/lib/kubelet/pods/28e5e616-1a83-4053-8231-e3763118ca8e/volumes" Jan 28 21:11:41 crc kubenswrapper[4746]: I0128 21:11:41.545718 4746 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/configure-os-edpm-deployment-openstack-edpm-ipam-42x6f" event={"ID":"e9b6010d-cd57-4992-b441-1745330a0246","Type":"ContainerStarted","Data":"298985ce96cb9362f23819db0ff9eadf1e5cd5265cc2b09e44713cbcde5448f0"} Jan 28 21:11:42 crc kubenswrapper[4746]: I0128 21:11:42.555717 4746 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/configure-os-edpm-deployment-openstack-edpm-ipam-42x6f" event={"ID":"e9b6010d-cd57-4992-b441-1745330a0246","Type":"ContainerStarted","Data":"fad01eaa510f4455b503993cf88294056e3a71327a854563dfae3d0eb1189663"} Jan 28 21:11:42 crc kubenswrapper[4746]: I0128 21:11:42.574892 4746 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/configure-os-edpm-deployment-openstack-edpm-ipam-42x6f" podStartSLOduration=2.943895271 podStartE2EDuration="3.57487124s" podCreationTimestamp="2026-01-28 21:11:39 +0000 UTC" firstStartedPulling="2026-01-28 21:11:40.619283876 +0000 UTC m=+1928.575470240" lastFinishedPulling="2026-01-28 
21:11:41.250259845 +0000 UTC m=+1929.206446209" observedRunningTime="2026-01-28 21:11:42.569955877 +0000 UTC m=+1930.526142231" watchObservedRunningTime="2026-01-28 21:11:42.57487124 +0000 UTC m=+1930.531057594" Jan 28 21:11:47 crc kubenswrapper[4746]: I0128 21:11:47.837017 4746 scope.go:117] "RemoveContainer" containerID="4ff7d295e236267b904a42699c2e8affd72dcade2a9eeee3b96c8eb13de3f968" Jan 28 21:11:47 crc kubenswrapper[4746]: E0128 21:11:47.838010 4746 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-6wrnw_openshift-machine-config-operator(6dc8b546-9734-4082-b2b3-2bafe3f1564d)\"" pod="openshift-machine-config-operator/machine-config-daemon-6wrnw" podUID="6dc8b546-9734-4082-b2b3-2bafe3f1564d" Jan 28 21:11:58 crc kubenswrapper[4746]: I0128 21:11:58.178419 4746 scope.go:117] "RemoveContainer" containerID="d169eec66f0269f166e9d76665f5a0824e488ca1c076ea9e824d196e4427738a" Jan 28 21:11:58 crc kubenswrapper[4746]: I0128 21:11:58.213399 4746 scope.go:117] "RemoveContainer" containerID="78345b845e597cbbe0267b558d090ddfd4e9395c0d78a932f6d1dde4fb653c96" Jan 28 21:11:58 crc kubenswrapper[4746]: I0128 21:11:58.263692 4746 scope.go:117] "RemoveContainer" containerID="efe875ab3e58b67563bf8cd1bf69fe6c5daa702a9ea730e11ff8260a0247bba6" Jan 28 21:11:58 crc kubenswrapper[4746]: I0128 21:11:58.326959 4746 scope.go:117] "RemoveContainer" containerID="ca7de28f90af63964789fb78ce766e51fb5b77343d12a3dbcfc483f001c17385" Jan 28 21:11:58 crc kubenswrapper[4746]: I0128 21:11:58.357971 4746 scope.go:117] "RemoveContainer" containerID="d252a81e0c138932fe8347bd1c6f1c0acde2062679fb8d94914bf45b47e500d8" Jan 28 21:11:58 crc kubenswrapper[4746]: I0128 21:11:58.412014 4746 scope.go:117] "RemoveContainer" containerID="8b9d8ff48bd5a1b24dbd3629a43ac6498bf514a55e69aea7af79ec775b405a69" Jan 28 21:11:58 crc kubenswrapper[4746]: 
I0128 21:11:58.460120 4746 scope.go:117] "RemoveContainer" containerID="74048ad3db6523e71f273f6b209d7e2631b278f3167e76f7d0ff1af092ac027f" Jan 28 21:11:58 crc kubenswrapper[4746]: I0128 21:11:58.838264 4746 scope.go:117] "RemoveContainer" containerID="4ff7d295e236267b904a42699c2e8affd72dcade2a9eeee3b96c8eb13de3f968" Jan 28 21:11:58 crc kubenswrapper[4746]: E0128 21:11:58.838838 4746 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-6wrnw_openshift-machine-config-operator(6dc8b546-9734-4082-b2b3-2bafe3f1564d)\"" pod="openshift-machine-config-operator/machine-config-daemon-6wrnw" podUID="6dc8b546-9734-4082-b2b3-2bafe3f1564d" Jan 28 21:12:01 crc kubenswrapper[4746]: I0128 21:12:01.030818 4746 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-cell0-cell-mapping-28g8p"] Jan 28 21:12:01 crc kubenswrapper[4746]: I0128 21:12:01.040366 4746 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-cell0-cell-mapping-28g8p"] Jan 28 21:12:02 crc kubenswrapper[4746]: I0128 21:12:02.851183 4746 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="8b92f820-9bba-4102-b4ee-1c541c3a05d7" path="/var/lib/kubelet/pods/8b92f820-9bba-4102-b4ee-1c541c3a05d7/volumes" Jan 28 21:12:05 crc kubenswrapper[4746]: I0128 21:12:05.053400 4746 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-cell1-conductor-db-sync-8xq2h"] Jan 28 21:12:05 crc kubenswrapper[4746]: I0128 21:12:05.070764 4746 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-cell1-conductor-db-sync-8xq2h"] Jan 28 21:12:06 crc kubenswrapper[4746]: I0128 21:12:06.845881 4746 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="487dd560-95f2-45e5-b09b-f8d3aae5548a" path="/var/lib/kubelet/pods/487dd560-95f2-45e5-b09b-f8d3aae5548a/volumes" Jan 28 21:12:09 crc 
kubenswrapper[4746]: I0128 21:12:09.836020 4746 scope.go:117] "RemoveContainer" containerID="4ff7d295e236267b904a42699c2e8affd72dcade2a9eeee3b96c8eb13de3f968" Jan 28 21:12:09 crc kubenswrapper[4746]: E0128 21:12:09.836714 4746 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-6wrnw_openshift-machine-config-operator(6dc8b546-9734-4082-b2b3-2bafe3f1564d)\"" pod="openshift-machine-config-operator/machine-config-daemon-6wrnw" podUID="6dc8b546-9734-4082-b2b3-2bafe3f1564d" Jan 28 21:12:24 crc kubenswrapper[4746]: I0128 21:12:24.837164 4746 scope.go:117] "RemoveContainer" containerID="4ff7d295e236267b904a42699c2e8affd72dcade2a9eeee3b96c8eb13de3f968" Jan 28 21:12:24 crc kubenswrapper[4746]: E0128 21:12:24.838369 4746 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-6wrnw_openshift-machine-config-operator(6dc8b546-9734-4082-b2b3-2bafe3f1564d)\"" pod="openshift-machine-config-operator/machine-config-daemon-6wrnw" podUID="6dc8b546-9734-4082-b2b3-2bafe3f1564d" Jan 28 21:12:26 crc kubenswrapper[4746]: I0128 21:12:26.035586 4746 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-cell1-cell-mapping-jng7h"] Jan 28 21:12:26 crc kubenswrapper[4746]: I0128 21:12:26.046745 4746 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-cell1-cell-mapping-jng7h"] Jan 28 21:12:26 crc kubenswrapper[4746]: I0128 21:12:26.859207 4746 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="c3913d1a-3943-41bf-a670-cf63f257f3a4" path="/var/lib/kubelet/pods/c3913d1a-3943-41bf-a670-cf63f257f3a4/volumes" Jan 28 21:12:26 crc kubenswrapper[4746]: I0128 21:12:26.984390 4746 generic.go:334] "Generic 
(PLEG): container finished" podID="e9b6010d-cd57-4992-b441-1745330a0246" containerID="fad01eaa510f4455b503993cf88294056e3a71327a854563dfae3d0eb1189663" exitCode=0 Jan 28 21:12:26 crc kubenswrapper[4746]: I0128 21:12:26.984453 4746 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/configure-os-edpm-deployment-openstack-edpm-ipam-42x6f" event={"ID":"e9b6010d-cd57-4992-b441-1745330a0246","Type":"ContainerDied","Data":"fad01eaa510f4455b503993cf88294056e3a71327a854563dfae3d0eb1189663"} Jan 28 21:12:28 crc kubenswrapper[4746]: I0128 21:12:28.534461 4746 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/configure-os-edpm-deployment-openstack-edpm-ipam-42x6f" Jan 28 21:12:28 crc kubenswrapper[4746]: I0128 21:12:28.705890 4746 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-7sxhs\" (UniqueName: \"kubernetes.io/projected/e9b6010d-cd57-4992-b441-1745330a0246-kube-api-access-7sxhs\") pod \"e9b6010d-cd57-4992-b441-1745330a0246\" (UID: \"e9b6010d-cd57-4992-b441-1745330a0246\") " Jan 28 21:12:28 crc kubenswrapper[4746]: I0128 21:12:28.706019 4746 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/e9b6010d-cd57-4992-b441-1745330a0246-inventory\") pod \"e9b6010d-cd57-4992-b441-1745330a0246\" (UID: \"e9b6010d-cd57-4992-b441-1745330a0246\") " Jan 28 21:12:28 crc kubenswrapper[4746]: I0128 21:12:28.706428 4746 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/e9b6010d-cd57-4992-b441-1745330a0246-ssh-key-openstack-edpm-ipam\") pod \"e9b6010d-cd57-4992-b441-1745330a0246\" (UID: \"e9b6010d-cd57-4992-b441-1745330a0246\") " Jan 28 21:12:28 crc kubenswrapper[4746]: I0128 21:12:28.711349 4746 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume 
"kubernetes.io/projected/e9b6010d-cd57-4992-b441-1745330a0246-kube-api-access-7sxhs" (OuterVolumeSpecName: "kube-api-access-7sxhs") pod "e9b6010d-cd57-4992-b441-1745330a0246" (UID: "e9b6010d-cd57-4992-b441-1745330a0246"). InnerVolumeSpecName "kube-api-access-7sxhs". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 28 21:12:28 crc kubenswrapper[4746]: I0128 21:12:28.738364 4746 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/e9b6010d-cd57-4992-b441-1745330a0246-inventory" (OuterVolumeSpecName: "inventory") pod "e9b6010d-cd57-4992-b441-1745330a0246" (UID: "e9b6010d-cd57-4992-b441-1745330a0246"). InnerVolumeSpecName "inventory". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 28 21:12:28 crc kubenswrapper[4746]: I0128 21:12:28.745237 4746 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/e9b6010d-cd57-4992-b441-1745330a0246-ssh-key-openstack-edpm-ipam" (OuterVolumeSpecName: "ssh-key-openstack-edpm-ipam") pod "e9b6010d-cd57-4992-b441-1745330a0246" (UID: "e9b6010d-cd57-4992-b441-1745330a0246"). InnerVolumeSpecName "ssh-key-openstack-edpm-ipam". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 28 21:12:28 crc kubenswrapper[4746]: I0128 21:12:28.809163 4746 reconciler_common.go:293] "Volume detached for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/e9b6010d-cd57-4992-b441-1745330a0246-inventory\") on node \"crc\" DevicePath \"\"" Jan 28 21:12:28 crc kubenswrapper[4746]: I0128 21:12:28.809204 4746 reconciler_common.go:293] "Volume detached for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/e9b6010d-cd57-4992-b441-1745330a0246-ssh-key-openstack-edpm-ipam\") on node \"crc\" DevicePath \"\"" Jan 28 21:12:28 crc kubenswrapper[4746]: I0128 21:12:28.809218 4746 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-7sxhs\" (UniqueName: \"kubernetes.io/projected/e9b6010d-cd57-4992-b441-1745330a0246-kube-api-access-7sxhs\") on node \"crc\" DevicePath \"\"" Jan 28 21:12:29 crc kubenswrapper[4746]: I0128 21:12:29.009104 4746 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/configure-os-edpm-deployment-openstack-edpm-ipam-42x6f" event={"ID":"e9b6010d-cd57-4992-b441-1745330a0246","Type":"ContainerDied","Data":"298985ce96cb9362f23819db0ff9eadf1e5cd5265cc2b09e44713cbcde5448f0"} Jan 28 21:12:29 crc kubenswrapper[4746]: I0128 21:12:29.009143 4746 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="298985ce96cb9362f23819db0ff9eadf1e5cd5265cc2b09e44713cbcde5448f0" Jan 28 21:12:29 crc kubenswrapper[4746]: I0128 21:12:29.009164 4746 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/configure-os-edpm-deployment-openstack-edpm-ipam-42x6f" Jan 28 21:12:29 crc kubenswrapper[4746]: I0128 21:12:29.109324 4746 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ssh-known-hosts-edpm-deployment-44hcw"] Jan 28 21:12:29 crc kubenswrapper[4746]: E0128 21:12:29.109737 4746 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="e9b6010d-cd57-4992-b441-1745330a0246" containerName="configure-os-edpm-deployment-openstack-edpm-ipam" Jan 28 21:12:29 crc kubenswrapper[4746]: I0128 21:12:29.109755 4746 state_mem.go:107] "Deleted CPUSet assignment" podUID="e9b6010d-cd57-4992-b441-1745330a0246" containerName="configure-os-edpm-deployment-openstack-edpm-ipam" Jan 28 21:12:29 crc kubenswrapper[4746]: I0128 21:12:29.109953 4746 memory_manager.go:354] "RemoveStaleState removing state" podUID="e9b6010d-cd57-4992-b441-1745330a0246" containerName="configure-os-edpm-deployment-openstack-edpm-ipam" Jan 28 21:12:29 crc kubenswrapper[4746]: I0128 21:12:29.110652 4746 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/ssh-known-hosts-edpm-deployment-44hcw" Jan 28 21:12:29 crc kubenswrapper[4746]: I0128 21:12:29.113519 4746 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-edpm-ipam-dockercfg-xb8vv" Jan 28 21:12:29 crc kubenswrapper[4746]: I0128 21:12:29.113871 4746 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-aee-default-env" Jan 28 21:12:29 crc kubenswrapper[4746]: I0128 21:12:29.114169 4746 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplane-ansible-ssh-private-key-secret" Jan 28 21:12:29 crc kubenswrapper[4746]: I0128 21:12:29.114293 4746 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplanenodeset-openstack-edpm-ipam" Jan 28 21:12:29 crc kubenswrapper[4746]: I0128 21:12:29.131528 4746 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ssh-known-hosts-edpm-deployment-44hcw"] Jan 28 21:12:29 crc kubenswrapper[4746]: I0128 21:12:29.217250 4746 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/cb5bd90c-ea83-463c-aed8-3291063c50bc-ssh-key-openstack-edpm-ipam\") pod \"ssh-known-hosts-edpm-deployment-44hcw\" (UID: \"cb5bd90c-ea83-463c-aed8-3291063c50bc\") " pod="openstack/ssh-known-hosts-edpm-deployment-44hcw" Jan 28 21:12:29 crc kubenswrapper[4746]: I0128 21:12:29.217359 4746 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-bmwfq\" (UniqueName: \"kubernetes.io/projected/cb5bd90c-ea83-463c-aed8-3291063c50bc-kube-api-access-bmwfq\") pod \"ssh-known-hosts-edpm-deployment-44hcw\" (UID: \"cb5bd90c-ea83-463c-aed8-3291063c50bc\") " pod="openstack/ssh-known-hosts-edpm-deployment-44hcw" Jan 28 21:12:29 crc kubenswrapper[4746]: I0128 21:12:29.217895 4746 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"inventory-0\" (UniqueName: \"kubernetes.io/secret/cb5bd90c-ea83-463c-aed8-3291063c50bc-inventory-0\") pod \"ssh-known-hosts-edpm-deployment-44hcw\" (UID: \"cb5bd90c-ea83-463c-aed8-3291063c50bc\") " pod="openstack/ssh-known-hosts-edpm-deployment-44hcw" Jan 28 21:12:29 crc kubenswrapper[4746]: I0128 21:12:29.319842 4746 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"inventory-0\" (UniqueName: \"kubernetes.io/secret/cb5bd90c-ea83-463c-aed8-3291063c50bc-inventory-0\") pod \"ssh-known-hosts-edpm-deployment-44hcw\" (UID: \"cb5bd90c-ea83-463c-aed8-3291063c50bc\") " pod="openstack/ssh-known-hosts-edpm-deployment-44hcw" Jan 28 21:12:29 crc kubenswrapper[4746]: I0128 21:12:29.319969 4746 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/cb5bd90c-ea83-463c-aed8-3291063c50bc-ssh-key-openstack-edpm-ipam\") pod \"ssh-known-hosts-edpm-deployment-44hcw\" (UID: \"cb5bd90c-ea83-463c-aed8-3291063c50bc\") " pod="openstack/ssh-known-hosts-edpm-deployment-44hcw" Jan 28 21:12:29 crc kubenswrapper[4746]: I0128 21:12:29.320039 4746 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-bmwfq\" (UniqueName: \"kubernetes.io/projected/cb5bd90c-ea83-463c-aed8-3291063c50bc-kube-api-access-bmwfq\") pod \"ssh-known-hosts-edpm-deployment-44hcw\" (UID: \"cb5bd90c-ea83-463c-aed8-3291063c50bc\") " pod="openstack/ssh-known-hosts-edpm-deployment-44hcw" Jan 28 21:12:29 crc kubenswrapper[4746]: I0128 21:12:29.323569 4746 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/cb5bd90c-ea83-463c-aed8-3291063c50bc-ssh-key-openstack-edpm-ipam\") pod \"ssh-known-hosts-edpm-deployment-44hcw\" (UID: \"cb5bd90c-ea83-463c-aed8-3291063c50bc\") " pod="openstack/ssh-known-hosts-edpm-deployment-44hcw" Jan 
28 21:12:29 crc kubenswrapper[4746]: I0128 21:12:29.325492 4746 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"inventory-0\" (UniqueName: \"kubernetes.io/secret/cb5bd90c-ea83-463c-aed8-3291063c50bc-inventory-0\") pod \"ssh-known-hosts-edpm-deployment-44hcw\" (UID: \"cb5bd90c-ea83-463c-aed8-3291063c50bc\") " pod="openstack/ssh-known-hosts-edpm-deployment-44hcw" Jan 28 21:12:29 crc kubenswrapper[4746]: I0128 21:12:29.340178 4746 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-bmwfq\" (UniqueName: \"kubernetes.io/projected/cb5bd90c-ea83-463c-aed8-3291063c50bc-kube-api-access-bmwfq\") pod \"ssh-known-hosts-edpm-deployment-44hcw\" (UID: \"cb5bd90c-ea83-463c-aed8-3291063c50bc\") " pod="openstack/ssh-known-hosts-edpm-deployment-44hcw" Jan 28 21:12:29 crc kubenswrapper[4746]: I0128 21:12:29.428707 4746 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ssh-known-hosts-edpm-deployment-44hcw" Jan 28 21:12:30 crc kubenswrapper[4746]: I0128 21:12:30.017292 4746 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ssh-known-hosts-edpm-deployment-44hcw"] Jan 28 21:12:31 crc kubenswrapper[4746]: I0128 21:12:31.034550 4746 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ssh-known-hosts-edpm-deployment-44hcw" event={"ID":"cb5bd90c-ea83-463c-aed8-3291063c50bc","Type":"ContainerStarted","Data":"7791bc7820518a057927eefeacfbf1576dfd88bec71eed5122ad79d8178c0eac"} Jan 28 21:12:31 crc kubenswrapper[4746]: I0128 21:12:31.034910 4746 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ssh-known-hosts-edpm-deployment-44hcw" event={"ID":"cb5bd90c-ea83-463c-aed8-3291063c50bc","Type":"ContainerStarted","Data":"468ebc5d2927e37b5a3e5fac850f8b317ab915f92a24ecce86dd55c1d46d91e3"} Jan 28 21:12:31 crc kubenswrapper[4746]: I0128 21:12:31.055662 4746 pod_startup_latency_tracker.go:104] "Observed pod startup duration" 
pod="openstack/ssh-known-hosts-edpm-deployment-44hcw" podStartSLOduration=1.596013846 podStartE2EDuration="2.055643451s" podCreationTimestamp="2026-01-28 21:12:29 +0000 UTC" firstStartedPulling="2026-01-28 21:12:30.021764304 +0000 UTC m=+1977.977950658" lastFinishedPulling="2026-01-28 21:12:30.481393899 +0000 UTC m=+1978.437580263" observedRunningTime="2026-01-28 21:12:31.048612551 +0000 UTC m=+1979.004798915" watchObservedRunningTime="2026-01-28 21:12:31.055643451 +0000 UTC m=+1979.011829795" Jan 28 21:12:35 crc kubenswrapper[4746]: I0128 21:12:35.836417 4746 scope.go:117] "RemoveContainer" containerID="4ff7d295e236267b904a42699c2e8affd72dcade2a9eeee3b96c8eb13de3f968" Jan 28 21:12:35 crc kubenswrapper[4746]: E0128 21:12:35.837220 4746 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-6wrnw_openshift-machine-config-operator(6dc8b546-9734-4082-b2b3-2bafe3f1564d)\"" pod="openshift-machine-config-operator/machine-config-daemon-6wrnw" podUID="6dc8b546-9734-4082-b2b3-2bafe3f1564d" Jan 28 21:12:37 crc kubenswrapper[4746]: I0128 21:12:37.124515 4746 generic.go:334] "Generic (PLEG): container finished" podID="cb5bd90c-ea83-463c-aed8-3291063c50bc" containerID="7791bc7820518a057927eefeacfbf1576dfd88bec71eed5122ad79d8178c0eac" exitCode=0 Jan 28 21:12:37 crc kubenswrapper[4746]: I0128 21:12:37.124618 4746 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ssh-known-hosts-edpm-deployment-44hcw" event={"ID":"cb5bd90c-ea83-463c-aed8-3291063c50bc","Type":"ContainerDied","Data":"7791bc7820518a057927eefeacfbf1576dfd88bec71eed5122ad79d8178c0eac"} Jan 28 21:12:38 crc kubenswrapper[4746]: I0128 21:12:38.662522 4746 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/ssh-known-hosts-edpm-deployment-44hcw" Jan 28 21:12:38 crc kubenswrapper[4746]: I0128 21:12:38.740650 4746 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-bmwfq\" (UniqueName: \"kubernetes.io/projected/cb5bd90c-ea83-463c-aed8-3291063c50bc-kube-api-access-bmwfq\") pod \"cb5bd90c-ea83-463c-aed8-3291063c50bc\" (UID: \"cb5bd90c-ea83-463c-aed8-3291063c50bc\") " Jan 28 21:12:38 crc kubenswrapper[4746]: I0128 21:12:38.740743 4746 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"inventory-0\" (UniqueName: \"kubernetes.io/secret/cb5bd90c-ea83-463c-aed8-3291063c50bc-inventory-0\") pod \"cb5bd90c-ea83-463c-aed8-3291063c50bc\" (UID: \"cb5bd90c-ea83-463c-aed8-3291063c50bc\") " Jan 28 21:12:38 crc kubenswrapper[4746]: I0128 21:12:38.740989 4746 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/cb5bd90c-ea83-463c-aed8-3291063c50bc-ssh-key-openstack-edpm-ipam\") pod \"cb5bd90c-ea83-463c-aed8-3291063c50bc\" (UID: \"cb5bd90c-ea83-463c-aed8-3291063c50bc\") " Jan 28 21:12:38 crc kubenswrapper[4746]: I0128 21:12:38.748586 4746 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/cb5bd90c-ea83-463c-aed8-3291063c50bc-kube-api-access-bmwfq" (OuterVolumeSpecName: "kube-api-access-bmwfq") pod "cb5bd90c-ea83-463c-aed8-3291063c50bc" (UID: "cb5bd90c-ea83-463c-aed8-3291063c50bc"). InnerVolumeSpecName "kube-api-access-bmwfq". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 28 21:12:38 crc kubenswrapper[4746]: I0128 21:12:38.770326 4746 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/cb5bd90c-ea83-463c-aed8-3291063c50bc-inventory-0" (OuterVolumeSpecName: "inventory-0") pod "cb5bd90c-ea83-463c-aed8-3291063c50bc" (UID: "cb5bd90c-ea83-463c-aed8-3291063c50bc"). 
InnerVolumeSpecName "inventory-0". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 28 21:12:38 crc kubenswrapper[4746]: I0128 21:12:38.776439 4746 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/cb5bd90c-ea83-463c-aed8-3291063c50bc-ssh-key-openstack-edpm-ipam" (OuterVolumeSpecName: "ssh-key-openstack-edpm-ipam") pod "cb5bd90c-ea83-463c-aed8-3291063c50bc" (UID: "cb5bd90c-ea83-463c-aed8-3291063c50bc"). InnerVolumeSpecName "ssh-key-openstack-edpm-ipam". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 28 21:12:38 crc kubenswrapper[4746]: I0128 21:12:38.842857 4746 reconciler_common.go:293] "Volume detached for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/cb5bd90c-ea83-463c-aed8-3291063c50bc-ssh-key-openstack-edpm-ipam\") on node \"crc\" DevicePath \"\"" Jan 28 21:12:38 crc kubenswrapper[4746]: I0128 21:12:38.842886 4746 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-bmwfq\" (UniqueName: \"kubernetes.io/projected/cb5bd90c-ea83-463c-aed8-3291063c50bc-kube-api-access-bmwfq\") on node \"crc\" DevicePath \"\"" Jan 28 21:12:38 crc kubenswrapper[4746]: I0128 21:12:38.842895 4746 reconciler_common.go:293] "Volume detached for volume \"inventory-0\" (UniqueName: \"kubernetes.io/secret/cb5bd90c-ea83-463c-aed8-3291063c50bc-inventory-0\") on node \"crc\" DevicePath \"\"" Jan 28 21:12:39 crc kubenswrapper[4746]: I0128 21:12:39.151625 4746 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ssh-known-hosts-edpm-deployment-44hcw" event={"ID":"cb5bd90c-ea83-463c-aed8-3291063c50bc","Type":"ContainerDied","Data":"468ebc5d2927e37b5a3e5fac850f8b317ab915f92a24ecce86dd55c1d46d91e3"} Jan 28 21:12:39 crc kubenswrapper[4746]: I0128 21:12:39.151680 4746 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/ssh-known-hosts-edpm-deployment-44hcw" Jan 28 21:12:39 crc kubenswrapper[4746]: I0128 21:12:39.151690 4746 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="468ebc5d2927e37b5a3e5fac850f8b317ab915f92a24ecce86dd55c1d46d91e3" Jan 28 21:12:39 crc kubenswrapper[4746]: I0128 21:12:39.219914 4746 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/run-os-edpm-deployment-openstack-edpm-ipam-xv2wb"] Jan 28 21:12:39 crc kubenswrapper[4746]: E0128 21:12:39.220567 4746 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="cb5bd90c-ea83-463c-aed8-3291063c50bc" containerName="ssh-known-hosts-edpm-deployment" Jan 28 21:12:39 crc kubenswrapper[4746]: I0128 21:12:39.220645 4746 state_mem.go:107] "Deleted CPUSet assignment" podUID="cb5bd90c-ea83-463c-aed8-3291063c50bc" containerName="ssh-known-hosts-edpm-deployment" Jan 28 21:12:39 crc kubenswrapper[4746]: I0128 21:12:39.220901 4746 memory_manager.go:354] "RemoveStaleState removing state" podUID="cb5bd90c-ea83-463c-aed8-3291063c50bc" containerName="ssh-known-hosts-edpm-deployment" Jan 28 21:12:39 crc kubenswrapper[4746]: I0128 21:12:39.221688 4746 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/run-os-edpm-deployment-openstack-edpm-ipam-xv2wb" Jan 28 21:12:39 crc kubenswrapper[4746]: I0128 21:12:39.231785 4746 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplanenodeset-openstack-edpm-ipam" Jan 28 21:12:39 crc kubenswrapper[4746]: I0128 21:12:39.231814 4746 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplane-ansible-ssh-private-key-secret" Jan 28 21:12:39 crc kubenswrapper[4746]: I0128 21:12:39.232678 4746 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-edpm-ipam-dockercfg-xb8vv" Jan 28 21:12:39 crc kubenswrapper[4746]: I0128 21:12:39.234700 4746 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-aee-default-env" Jan 28 21:12:39 crc kubenswrapper[4746]: I0128 21:12:39.236484 4746 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/run-os-edpm-deployment-openstack-edpm-ipam-xv2wb"] Jan 28 21:12:39 crc kubenswrapper[4746]: I0128 21:12:39.351557 4746 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/280f47f9-2f66-4991-bd8f-59b734c5a935-inventory\") pod \"run-os-edpm-deployment-openstack-edpm-ipam-xv2wb\" (UID: \"280f47f9-2f66-4991-bd8f-59b734c5a935\") " pod="openstack/run-os-edpm-deployment-openstack-edpm-ipam-xv2wb" Jan 28 21:12:39 crc kubenswrapper[4746]: I0128 21:12:39.351671 4746 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-8w95s\" (UniqueName: \"kubernetes.io/projected/280f47f9-2f66-4991-bd8f-59b734c5a935-kube-api-access-8w95s\") pod \"run-os-edpm-deployment-openstack-edpm-ipam-xv2wb\" (UID: \"280f47f9-2f66-4991-bd8f-59b734c5a935\") " pod="openstack/run-os-edpm-deployment-openstack-edpm-ipam-xv2wb" Jan 28 21:12:39 crc kubenswrapper[4746]: I0128 21:12:39.351860 4746 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/280f47f9-2f66-4991-bd8f-59b734c5a935-ssh-key-openstack-edpm-ipam\") pod \"run-os-edpm-deployment-openstack-edpm-ipam-xv2wb\" (UID: \"280f47f9-2f66-4991-bd8f-59b734c5a935\") " pod="openstack/run-os-edpm-deployment-openstack-edpm-ipam-xv2wb" Jan 28 21:12:39 crc kubenswrapper[4746]: I0128 21:12:39.454484 4746 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/280f47f9-2f66-4991-bd8f-59b734c5a935-inventory\") pod \"run-os-edpm-deployment-openstack-edpm-ipam-xv2wb\" (UID: \"280f47f9-2f66-4991-bd8f-59b734c5a935\") " pod="openstack/run-os-edpm-deployment-openstack-edpm-ipam-xv2wb" Jan 28 21:12:39 crc kubenswrapper[4746]: I0128 21:12:39.454618 4746 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-8w95s\" (UniqueName: \"kubernetes.io/projected/280f47f9-2f66-4991-bd8f-59b734c5a935-kube-api-access-8w95s\") pod \"run-os-edpm-deployment-openstack-edpm-ipam-xv2wb\" (UID: \"280f47f9-2f66-4991-bd8f-59b734c5a935\") " pod="openstack/run-os-edpm-deployment-openstack-edpm-ipam-xv2wb" Jan 28 21:12:39 crc kubenswrapper[4746]: I0128 21:12:39.454687 4746 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/280f47f9-2f66-4991-bd8f-59b734c5a935-ssh-key-openstack-edpm-ipam\") pod \"run-os-edpm-deployment-openstack-edpm-ipam-xv2wb\" (UID: \"280f47f9-2f66-4991-bd8f-59b734c5a935\") " pod="openstack/run-os-edpm-deployment-openstack-edpm-ipam-xv2wb" Jan 28 21:12:39 crc kubenswrapper[4746]: I0128 21:12:39.459960 4746 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/280f47f9-2f66-4991-bd8f-59b734c5a935-inventory\") pod \"run-os-edpm-deployment-openstack-edpm-ipam-xv2wb\" (UID: 
\"280f47f9-2f66-4991-bd8f-59b734c5a935\") " pod="openstack/run-os-edpm-deployment-openstack-edpm-ipam-xv2wb" Jan 28 21:12:39 crc kubenswrapper[4746]: I0128 21:12:39.460103 4746 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/280f47f9-2f66-4991-bd8f-59b734c5a935-ssh-key-openstack-edpm-ipam\") pod \"run-os-edpm-deployment-openstack-edpm-ipam-xv2wb\" (UID: \"280f47f9-2f66-4991-bd8f-59b734c5a935\") " pod="openstack/run-os-edpm-deployment-openstack-edpm-ipam-xv2wb" Jan 28 21:12:39 crc kubenswrapper[4746]: I0128 21:12:39.480775 4746 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-8w95s\" (UniqueName: \"kubernetes.io/projected/280f47f9-2f66-4991-bd8f-59b734c5a935-kube-api-access-8w95s\") pod \"run-os-edpm-deployment-openstack-edpm-ipam-xv2wb\" (UID: \"280f47f9-2f66-4991-bd8f-59b734c5a935\") " pod="openstack/run-os-edpm-deployment-openstack-edpm-ipam-xv2wb" Jan 28 21:12:39 crc kubenswrapper[4746]: I0128 21:12:39.546384 4746 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/run-os-edpm-deployment-openstack-edpm-ipam-xv2wb" Jan 28 21:12:40 crc kubenswrapper[4746]: I0128 21:12:40.160589 4746 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/run-os-edpm-deployment-openstack-edpm-ipam-xv2wb"] Jan 28 21:12:41 crc kubenswrapper[4746]: I0128 21:12:41.175397 4746 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/run-os-edpm-deployment-openstack-edpm-ipam-xv2wb" event={"ID":"280f47f9-2f66-4991-bd8f-59b734c5a935","Type":"ContainerStarted","Data":"27383b0903cadce3370d37dc0135f0893c4fd20c6d819b474b5c67fc9a4beac0"} Jan 28 21:12:42 crc kubenswrapper[4746]: I0128 21:12:42.206371 4746 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/run-os-edpm-deployment-openstack-edpm-ipam-xv2wb" event={"ID":"280f47f9-2f66-4991-bd8f-59b734c5a935","Type":"ContainerStarted","Data":"6c36dd5c9c3cde5499773f891ed48c3f32d8b6d5a781a898292a250d763cc61c"} Jan 28 21:12:42 crc kubenswrapper[4746]: I0128 21:12:42.246935 4746 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/run-os-edpm-deployment-openstack-edpm-ipam-xv2wb" podStartSLOduration=2.808893117 podStartE2EDuration="3.24691221s" podCreationTimestamp="2026-01-28 21:12:39 +0000 UTC" firstStartedPulling="2026-01-28 21:12:40.157602517 +0000 UTC m=+1988.113788871" lastFinishedPulling="2026-01-28 21:12:40.59562159 +0000 UTC m=+1988.551807964" observedRunningTime="2026-01-28 21:12:42.228633718 +0000 UTC m=+1990.184820092" watchObservedRunningTime="2026-01-28 21:12:42.24691221 +0000 UTC m=+1990.203098574" Jan 28 21:12:48 crc kubenswrapper[4746]: I0128 21:12:48.836262 4746 scope.go:117] "RemoveContainer" containerID="4ff7d295e236267b904a42699c2e8affd72dcade2a9eeee3b96c8eb13de3f968" Jan 28 21:12:49 crc kubenswrapper[4746]: I0128 21:12:49.280487 4746 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-6wrnw" 
event={"ID":"6dc8b546-9734-4082-b2b3-2bafe3f1564d","Type":"ContainerStarted","Data":"bce8376b1cfe954c40209943c42297ca65bedfa462c1a2c3d2942abd9bcf67ec"} Jan 28 21:12:49 crc kubenswrapper[4746]: I0128 21:12:49.282785 4746 generic.go:334] "Generic (PLEG): container finished" podID="280f47f9-2f66-4991-bd8f-59b734c5a935" containerID="6c36dd5c9c3cde5499773f891ed48c3f32d8b6d5a781a898292a250d763cc61c" exitCode=0 Jan 28 21:12:49 crc kubenswrapper[4746]: I0128 21:12:49.282849 4746 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/run-os-edpm-deployment-openstack-edpm-ipam-xv2wb" event={"ID":"280f47f9-2f66-4991-bd8f-59b734c5a935","Type":"ContainerDied","Data":"6c36dd5c9c3cde5499773f891ed48c3f32d8b6d5a781a898292a250d763cc61c"} Jan 28 21:12:50 crc kubenswrapper[4746]: I0128 21:12:50.798762 4746 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/run-os-edpm-deployment-openstack-edpm-ipam-xv2wb" Jan 28 21:12:50 crc kubenswrapper[4746]: I0128 21:12:50.911771 4746 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/280f47f9-2f66-4991-bd8f-59b734c5a935-inventory\") pod \"280f47f9-2f66-4991-bd8f-59b734c5a935\" (UID: \"280f47f9-2f66-4991-bd8f-59b734c5a935\") " Jan 28 21:12:50 crc kubenswrapper[4746]: I0128 21:12:50.911927 4746 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/280f47f9-2f66-4991-bd8f-59b734c5a935-ssh-key-openstack-edpm-ipam\") pod \"280f47f9-2f66-4991-bd8f-59b734c5a935\" (UID: \"280f47f9-2f66-4991-bd8f-59b734c5a935\") " Jan 28 21:12:50 crc kubenswrapper[4746]: I0128 21:12:50.912179 4746 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-8w95s\" (UniqueName: \"kubernetes.io/projected/280f47f9-2f66-4991-bd8f-59b734c5a935-kube-api-access-8w95s\") pod \"280f47f9-2f66-4991-bd8f-59b734c5a935\" (UID: 
\"280f47f9-2f66-4991-bd8f-59b734c5a935\") " Jan 28 21:12:50 crc kubenswrapper[4746]: I0128 21:12:50.918651 4746 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/280f47f9-2f66-4991-bd8f-59b734c5a935-kube-api-access-8w95s" (OuterVolumeSpecName: "kube-api-access-8w95s") pod "280f47f9-2f66-4991-bd8f-59b734c5a935" (UID: "280f47f9-2f66-4991-bd8f-59b734c5a935"). InnerVolumeSpecName "kube-api-access-8w95s". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 28 21:12:50 crc kubenswrapper[4746]: I0128 21:12:50.939910 4746 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/280f47f9-2f66-4991-bd8f-59b734c5a935-ssh-key-openstack-edpm-ipam" (OuterVolumeSpecName: "ssh-key-openstack-edpm-ipam") pod "280f47f9-2f66-4991-bd8f-59b734c5a935" (UID: "280f47f9-2f66-4991-bd8f-59b734c5a935"). InnerVolumeSpecName "ssh-key-openstack-edpm-ipam". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 28 21:12:50 crc kubenswrapper[4746]: I0128 21:12:50.942778 4746 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/280f47f9-2f66-4991-bd8f-59b734c5a935-inventory" (OuterVolumeSpecName: "inventory") pod "280f47f9-2f66-4991-bd8f-59b734c5a935" (UID: "280f47f9-2f66-4991-bd8f-59b734c5a935"). InnerVolumeSpecName "inventory". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 28 21:12:51 crc kubenswrapper[4746]: I0128 21:12:51.015515 4746 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-8w95s\" (UniqueName: \"kubernetes.io/projected/280f47f9-2f66-4991-bd8f-59b734c5a935-kube-api-access-8w95s\") on node \"crc\" DevicePath \"\"" Jan 28 21:12:51 crc kubenswrapper[4746]: I0128 21:12:51.015712 4746 reconciler_common.go:293] "Volume detached for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/280f47f9-2f66-4991-bd8f-59b734c5a935-inventory\") on node \"crc\" DevicePath \"\"" Jan 28 21:12:51 crc kubenswrapper[4746]: I0128 21:12:51.015778 4746 reconciler_common.go:293] "Volume detached for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/280f47f9-2f66-4991-bd8f-59b734c5a935-ssh-key-openstack-edpm-ipam\") on node \"crc\" DevicePath \"\"" Jan 28 21:12:51 crc kubenswrapper[4746]: I0128 21:12:51.302745 4746 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/run-os-edpm-deployment-openstack-edpm-ipam-xv2wb" event={"ID":"280f47f9-2f66-4991-bd8f-59b734c5a935","Type":"ContainerDied","Data":"27383b0903cadce3370d37dc0135f0893c4fd20c6d819b474b5c67fc9a4beac0"} Jan 28 21:12:51 crc kubenswrapper[4746]: I0128 21:12:51.302803 4746 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="27383b0903cadce3370d37dc0135f0893c4fd20c6d819b474b5c67fc9a4beac0" Jan 28 21:12:51 crc kubenswrapper[4746]: I0128 21:12:51.302881 4746 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/run-os-edpm-deployment-openstack-edpm-ipam-xv2wb" Jan 28 21:12:51 crc kubenswrapper[4746]: I0128 21:12:51.416111 4746 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/reboot-os-edpm-deployment-openstack-edpm-ipam-nvgcn"] Jan 28 21:12:51 crc kubenswrapper[4746]: E0128 21:12:51.416942 4746 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="280f47f9-2f66-4991-bd8f-59b734c5a935" containerName="run-os-edpm-deployment-openstack-edpm-ipam" Jan 28 21:12:51 crc kubenswrapper[4746]: I0128 21:12:51.416960 4746 state_mem.go:107] "Deleted CPUSet assignment" podUID="280f47f9-2f66-4991-bd8f-59b734c5a935" containerName="run-os-edpm-deployment-openstack-edpm-ipam" Jan 28 21:12:51 crc kubenswrapper[4746]: I0128 21:12:51.419660 4746 memory_manager.go:354] "RemoveStaleState removing state" podUID="280f47f9-2f66-4991-bd8f-59b734c5a935" containerName="run-os-edpm-deployment-openstack-edpm-ipam" Jan 28 21:12:51 crc kubenswrapper[4746]: I0128 21:12:51.421048 4746 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/reboot-os-edpm-deployment-openstack-edpm-ipam-nvgcn" Jan 28 21:12:51 crc kubenswrapper[4746]: I0128 21:12:51.425542 4746 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplanenodeset-openstack-edpm-ipam" Jan 28 21:12:51 crc kubenswrapper[4746]: I0128 21:12:51.425863 4746 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplane-ansible-ssh-private-key-secret" Jan 28 21:12:51 crc kubenswrapper[4746]: I0128 21:12:51.426042 4746 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-edpm-ipam-dockercfg-xb8vv" Jan 28 21:12:51 crc kubenswrapper[4746]: I0128 21:12:51.431234 4746 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/reboot-os-edpm-deployment-openstack-edpm-ipam-nvgcn"] Jan 28 21:12:51 crc kubenswrapper[4746]: I0128 21:12:51.432723 4746 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-aee-default-env" Jan 28 21:12:51 crc kubenswrapper[4746]: I0128 21:12:51.526951 4746 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-648pt\" (UniqueName: \"kubernetes.io/projected/f2c97be5-d93c-4a83-87ad-48abb73d603c-kube-api-access-648pt\") pod \"reboot-os-edpm-deployment-openstack-edpm-ipam-nvgcn\" (UID: \"f2c97be5-d93c-4a83-87ad-48abb73d603c\") " pod="openstack/reboot-os-edpm-deployment-openstack-edpm-ipam-nvgcn" Jan 28 21:12:51 crc kubenswrapper[4746]: I0128 21:12:51.527022 4746 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/f2c97be5-d93c-4a83-87ad-48abb73d603c-ssh-key-openstack-edpm-ipam\") pod \"reboot-os-edpm-deployment-openstack-edpm-ipam-nvgcn\" (UID: \"f2c97be5-d93c-4a83-87ad-48abb73d603c\") " pod="openstack/reboot-os-edpm-deployment-openstack-edpm-ipam-nvgcn" Jan 28 21:12:51 crc kubenswrapper[4746]: I0128 
21:12:51.527057 4746 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/f2c97be5-d93c-4a83-87ad-48abb73d603c-inventory\") pod \"reboot-os-edpm-deployment-openstack-edpm-ipam-nvgcn\" (UID: \"f2c97be5-d93c-4a83-87ad-48abb73d603c\") " pod="openstack/reboot-os-edpm-deployment-openstack-edpm-ipam-nvgcn" Jan 28 21:12:51 crc kubenswrapper[4746]: I0128 21:12:51.629500 4746 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-648pt\" (UniqueName: \"kubernetes.io/projected/f2c97be5-d93c-4a83-87ad-48abb73d603c-kube-api-access-648pt\") pod \"reboot-os-edpm-deployment-openstack-edpm-ipam-nvgcn\" (UID: \"f2c97be5-d93c-4a83-87ad-48abb73d603c\") " pod="openstack/reboot-os-edpm-deployment-openstack-edpm-ipam-nvgcn" Jan 28 21:12:51 crc kubenswrapper[4746]: I0128 21:12:51.629620 4746 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/f2c97be5-d93c-4a83-87ad-48abb73d603c-ssh-key-openstack-edpm-ipam\") pod \"reboot-os-edpm-deployment-openstack-edpm-ipam-nvgcn\" (UID: \"f2c97be5-d93c-4a83-87ad-48abb73d603c\") " pod="openstack/reboot-os-edpm-deployment-openstack-edpm-ipam-nvgcn" Jan 28 21:12:51 crc kubenswrapper[4746]: I0128 21:12:51.629732 4746 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/f2c97be5-d93c-4a83-87ad-48abb73d603c-inventory\") pod \"reboot-os-edpm-deployment-openstack-edpm-ipam-nvgcn\" (UID: \"f2c97be5-d93c-4a83-87ad-48abb73d603c\") " pod="openstack/reboot-os-edpm-deployment-openstack-edpm-ipam-nvgcn" Jan 28 21:12:51 crc kubenswrapper[4746]: I0128 21:12:51.636112 4746 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/f2c97be5-d93c-4a83-87ad-48abb73d603c-inventory\") pod 
\"reboot-os-edpm-deployment-openstack-edpm-ipam-nvgcn\" (UID: \"f2c97be5-d93c-4a83-87ad-48abb73d603c\") " pod="openstack/reboot-os-edpm-deployment-openstack-edpm-ipam-nvgcn" Jan 28 21:12:51 crc kubenswrapper[4746]: I0128 21:12:51.636806 4746 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/f2c97be5-d93c-4a83-87ad-48abb73d603c-ssh-key-openstack-edpm-ipam\") pod \"reboot-os-edpm-deployment-openstack-edpm-ipam-nvgcn\" (UID: \"f2c97be5-d93c-4a83-87ad-48abb73d603c\") " pod="openstack/reboot-os-edpm-deployment-openstack-edpm-ipam-nvgcn" Jan 28 21:12:51 crc kubenswrapper[4746]: I0128 21:12:51.665557 4746 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-648pt\" (UniqueName: \"kubernetes.io/projected/f2c97be5-d93c-4a83-87ad-48abb73d603c-kube-api-access-648pt\") pod \"reboot-os-edpm-deployment-openstack-edpm-ipam-nvgcn\" (UID: \"f2c97be5-d93c-4a83-87ad-48abb73d603c\") " pod="openstack/reboot-os-edpm-deployment-openstack-edpm-ipam-nvgcn" Jan 28 21:12:51 crc kubenswrapper[4746]: I0128 21:12:51.809124 4746 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/reboot-os-edpm-deployment-openstack-edpm-ipam-nvgcn" Jan 28 21:12:52 crc kubenswrapper[4746]: I0128 21:12:52.457372 4746 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/reboot-os-edpm-deployment-openstack-edpm-ipam-nvgcn"] Jan 28 21:12:53 crc kubenswrapper[4746]: I0128 21:12:53.322391 4746 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/reboot-os-edpm-deployment-openstack-edpm-ipam-nvgcn" event={"ID":"f2c97be5-d93c-4a83-87ad-48abb73d603c","Type":"ContainerStarted","Data":"1fb82fa16228434a88d01816e6eca9ff0b3969a74105583610909e3f87db1ed1"} Jan 28 21:12:53 crc kubenswrapper[4746]: I0128 21:12:53.322684 4746 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/reboot-os-edpm-deployment-openstack-edpm-ipam-nvgcn" event={"ID":"f2c97be5-d93c-4a83-87ad-48abb73d603c","Type":"ContainerStarted","Data":"a2c566ab9dee18b1dcebb4c742cff142a66423de294bd0b1373d373bd8cdd359"} Jan 28 21:12:53 crc kubenswrapper[4746]: I0128 21:12:53.351616 4746 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/reboot-os-edpm-deployment-openstack-edpm-ipam-nvgcn" podStartSLOduration=1.9029387789999999 podStartE2EDuration="2.351584828s" podCreationTimestamp="2026-01-28 21:12:51 +0000 UTC" firstStartedPulling="2026-01-28 21:12:52.467218967 +0000 UTC m=+2000.423405331" lastFinishedPulling="2026-01-28 21:12:52.915864986 +0000 UTC m=+2000.872051380" observedRunningTime="2026-01-28 21:12:53.341034044 +0000 UTC m=+2001.297220418" watchObservedRunningTime="2026-01-28 21:12:53.351584828 +0000 UTC m=+2001.307771222" Jan 28 21:12:58 crc kubenswrapper[4746]: I0128 21:12:58.618507 4746 scope.go:117] "RemoveContainer" containerID="fa89a1688b3552ca7dec90db7ed13ec1efc1a868ef0d9c578ac3752d210eb6da" Jan 28 21:12:58 crc kubenswrapper[4746]: I0128 21:12:58.685350 4746 scope.go:117] "RemoveContainer" containerID="02357381c64547dca72c19b1c8d373c82811515aadec52b3bfa42aa59d1bc3c3" Jan 28 21:12:58 crc 
kubenswrapper[4746]: I0128 21:12:58.735594 4746 scope.go:117] "RemoveContainer" containerID="7c214f6d26474e5fb2d9b8ea32566cfc23faa619a749415b5082e72af23ad209" Jan 28 21:13:02 crc kubenswrapper[4746]: I0128 21:13:02.451724 4746 generic.go:334] "Generic (PLEG): container finished" podID="f2c97be5-d93c-4a83-87ad-48abb73d603c" containerID="1fb82fa16228434a88d01816e6eca9ff0b3969a74105583610909e3f87db1ed1" exitCode=0 Jan 28 21:13:02 crc kubenswrapper[4746]: I0128 21:13:02.451815 4746 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/reboot-os-edpm-deployment-openstack-edpm-ipam-nvgcn" event={"ID":"f2c97be5-d93c-4a83-87ad-48abb73d603c","Type":"ContainerDied","Data":"1fb82fa16228434a88d01816e6eca9ff0b3969a74105583610909e3f87db1ed1"} Jan 28 21:13:03 crc kubenswrapper[4746]: I0128 21:13:03.959532 4746 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/reboot-os-edpm-deployment-openstack-edpm-ipam-nvgcn" Jan 28 21:13:04 crc kubenswrapper[4746]: I0128 21:13:04.058142 4746 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-648pt\" (UniqueName: \"kubernetes.io/projected/f2c97be5-d93c-4a83-87ad-48abb73d603c-kube-api-access-648pt\") pod \"f2c97be5-d93c-4a83-87ad-48abb73d603c\" (UID: \"f2c97be5-d93c-4a83-87ad-48abb73d603c\") " Jan 28 21:13:04 crc kubenswrapper[4746]: I0128 21:13:04.058482 4746 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/f2c97be5-d93c-4a83-87ad-48abb73d603c-ssh-key-openstack-edpm-ipam\") pod \"f2c97be5-d93c-4a83-87ad-48abb73d603c\" (UID: \"f2c97be5-d93c-4a83-87ad-48abb73d603c\") " Jan 28 21:13:04 crc kubenswrapper[4746]: I0128 21:13:04.058685 4746 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/f2c97be5-d93c-4a83-87ad-48abb73d603c-inventory\") pod \"f2c97be5-d93c-4a83-87ad-48abb73d603c\" 
(UID: \"f2c97be5-d93c-4a83-87ad-48abb73d603c\") " Jan 28 21:13:04 crc kubenswrapper[4746]: I0128 21:13:04.064856 4746 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/f2c97be5-d93c-4a83-87ad-48abb73d603c-kube-api-access-648pt" (OuterVolumeSpecName: "kube-api-access-648pt") pod "f2c97be5-d93c-4a83-87ad-48abb73d603c" (UID: "f2c97be5-d93c-4a83-87ad-48abb73d603c"). InnerVolumeSpecName "kube-api-access-648pt". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 28 21:13:04 crc kubenswrapper[4746]: I0128 21:13:04.088520 4746 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/f2c97be5-d93c-4a83-87ad-48abb73d603c-inventory" (OuterVolumeSpecName: "inventory") pod "f2c97be5-d93c-4a83-87ad-48abb73d603c" (UID: "f2c97be5-d93c-4a83-87ad-48abb73d603c"). InnerVolumeSpecName "inventory". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 28 21:13:04 crc kubenswrapper[4746]: I0128 21:13:04.100041 4746 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/f2c97be5-d93c-4a83-87ad-48abb73d603c-ssh-key-openstack-edpm-ipam" (OuterVolumeSpecName: "ssh-key-openstack-edpm-ipam") pod "f2c97be5-d93c-4a83-87ad-48abb73d603c" (UID: "f2c97be5-d93c-4a83-87ad-48abb73d603c"). InnerVolumeSpecName "ssh-key-openstack-edpm-ipam". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 28 21:13:04 crc kubenswrapper[4746]: I0128 21:13:04.160489 4746 reconciler_common.go:293] "Volume detached for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/f2c97be5-d93c-4a83-87ad-48abb73d603c-inventory\") on node \"crc\" DevicePath \"\"" Jan 28 21:13:04 crc kubenswrapper[4746]: I0128 21:13:04.160523 4746 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-648pt\" (UniqueName: \"kubernetes.io/projected/f2c97be5-d93c-4a83-87ad-48abb73d603c-kube-api-access-648pt\") on node \"crc\" DevicePath \"\"" Jan 28 21:13:04 crc kubenswrapper[4746]: I0128 21:13:04.160535 4746 reconciler_common.go:293] "Volume detached for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/f2c97be5-d93c-4a83-87ad-48abb73d603c-ssh-key-openstack-edpm-ipam\") on node \"crc\" DevicePath \"\"" Jan 28 21:13:04 crc kubenswrapper[4746]: I0128 21:13:04.477669 4746 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/reboot-os-edpm-deployment-openstack-edpm-ipam-nvgcn" event={"ID":"f2c97be5-d93c-4a83-87ad-48abb73d603c","Type":"ContainerDied","Data":"a2c566ab9dee18b1dcebb4c742cff142a66423de294bd0b1373d373bd8cdd359"} Jan 28 21:13:04 crc kubenswrapper[4746]: I0128 21:13:04.477709 4746 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="a2c566ab9dee18b1dcebb4c742cff142a66423de294bd0b1373d373bd8cdd359" Jan 28 21:13:04 crc kubenswrapper[4746]: I0128 21:13:04.477820 4746 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/reboot-os-edpm-deployment-openstack-edpm-ipam-nvgcn" Jan 28 21:13:04 crc kubenswrapper[4746]: I0128 21:13:04.561225 4746 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/install-certs-edpm-deployment-openstack-edpm-ipam-5d8wd"] Jan 28 21:13:04 crc kubenswrapper[4746]: E0128 21:13:04.561951 4746 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f2c97be5-d93c-4a83-87ad-48abb73d603c" containerName="reboot-os-edpm-deployment-openstack-edpm-ipam" Jan 28 21:13:04 crc kubenswrapper[4746]: I0128 21:13:04.561972 4746 state_mem.go:107] "Deleted CPUSet assignment" podUID="f2c97be5-d93c-4a83-87ad-48abb73d603c" containerName="reboot-os-edpm-deployment-openstack-edpm-ipam" Jan 28 21:13:04 crc kubenswrapper[4746]: I0128 21:13:04.562193 4746 memory_manager.go:354] "RemoveStaleState removing state" podUID="f2c97be5-d93c-4a83-87ad-48abb73d603c" containerName="reboot-os-edpm-deployment-openstack-edpm-ipam" Jan 28 21:13:04 crc kubenswrapper[4746]: I0128 21:13:04.562964 4746 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-5d8wd" Jan 28 21:13:04 crc kubenswrapper[4746]: I0128 21:13:04.566606 4746 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplane-ansible-ssh-private-key-secret" Jan 28 21:13:04 crc kubenswrapper[4746]: I0128 21:13:04.566818 4746 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"openstack-edpm-ipam-telemetry-default-certs-0\" (UniqueName: \"kubernetes.io/projected/0dbab66d-c007-4c33-b6da-1e44860668a0-openstack-edpm-ipam-telemetry-default-certs-0\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-5d8wd\" (UID: \"0dbab66d-c007-4c33-b6da-1e44860668a0\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-5d8wd" Jan 28 21:13:04 crc kubenswrapper[4746]: I0128 21:13:04.566947 4746 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bootstrap-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/0dbab66d-c007-4c33-b6da-1e44860668a0-bootstrap-combined-ca-bundle\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-5d8wd\" (UID: \"0dbab66d-c007-4c33-b6da-1e44860668a0\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-5d8wd" Jan 28 21:13:04 crc kubenswrapper[4746]: I0128 21:13:04.567053 4746 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"openstack-edpm-ipam-neutron-metadata-default-certs-0\" (UniqueName: \"kubernetes.io/projected/0dbab66d-c007-4c33-b6da-1e44860668a0-openstack-edpm-ipam-neutron-metadata-default-certs-0\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-5d8wd\" (UID: \"0dbab66d-c007-4c33-b6da-1e44860668a0\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-5d8wd" Jan 28 21:13:04 crc kubenswrapper[4746]: I0128 21:13:04.567124 4746 reflector.go:368] Caches populated for *v1.Secret from 
object-"openstack"/"openstack-edpm-ipam-dockercfg-xb8vv" Jan 28 21:13:04 crc kubenswrapper[4746]: I0128 21:13:04.567065 4746 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-edpm-ipam-ovn-default-certs-0" Jan 28 21:13:04 crc kubenswrapper[4746]: I0128 21:13:04.567161 4746 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"libvirt-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/0dbab66d-c007-4c33-b6da-1e44860668a0-libvirt-combined-ca-bundle\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-5d8wd\" (UID: \"0dbab66d-c007-4c33-b6da-1e44860668a0\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-5d8wd" Jan 28 21:13:04 crc kubenswrapper[4746]: I0128 21:13:04.567211 4746 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"repo-setup-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/0dbab66d-c007-4c33-b6da-1e44860668a0-repo-setup-combined-ca-bundle\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-5d8wd\" (UID: \"0dbab66d-c007-4c33-b6da-1e44860668a0\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-5d8wd" Jan 28 21:13:04 crc kubenswrapper[4746]: I0128 21:13:04.567239 4746 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-8dgkm\" (UniqueName: \"kubernetes.io/projected/0dbab66d-c007-4c33-b6da-1e44860668a0-kube-api-access-8dgkm\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-5d8wd\" (UID: \"0dbab66d-c007-4c33-b6da-1e44860668a0\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-5d8wd" Jan 28 21:13:04 crc kubenswrapper[4746]: I0128 21:13:04.567357 4746 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/0dbab66d-c007-4c33-b6da-1e44860668a0-ssh-key-openstack-edpm-ipam\") pod 
\"install-certs-edpm-deployment-openstack-edpm-ipam-5d8wd\" (UID: \"0dbab66d-c007-4c33-b6da-1e44860668a0\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-5d8wd" Jan 28 21:13:04 crc kubenswrapper[4746]: I0128 21:13:04.567398 4746 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"nova-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/0dbab66d-c007-4c33-b6da-1e44860668a0-nova-combined-ca-bundle\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-5d8wd\" (UID: \"0dbab66d-c007-4c33-b6da-1e44860668a0\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-5d8wd" Jan 28 21:13:04 crc kubenswrapper[4746]: I0128 21:13:04.567447 4746 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"openstack-edpm-ipam-libvirt-default-certs-0\" (UniqueName: \"kubernetes.io/projected/0dbab66d-c007-4c33-b6da-1e44860668a0-openstack-edpm-ipam-libvirt-default-certs-0\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-5d8wd\" (UID: \"0dbab66d-c007-4c33-b6da-1e44860668a0\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-5d8wd" Jan 28 21:13:04 crc kubenswrapper[4746]: I0128 21:13:04.567488 4746 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"neutron-metadata-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/0dbab66d-c007-4c33-b6da-1e44860668a0-neutron-metadata-combined-ca-bundle\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-5d8wd\" (UID: \"0dbab66d-c007-4c33-b6da-1e44860668a0\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-5d8wd" Jan 28 21:13:04 crc kubenswrapper[4746]: I0128 21:13:04.567496 4746 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-edpm-ipam-neutron-metadata-default-certs-0" Jan 28 21:13:04 crc kubenswrapper[4746]: I0128 21:13:04.567518 4746 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"telemetry-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/0dbab66d-c007-4c33-b6da-1e44860668a0-telemetry-combined-ca-bundle\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-5d8wd\" (UID: \"0dbab66d-c007-4c33-b6da-1e44860668a0\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-5d8wd" Jan 28 21:13:04 crc kubenswrapper[4746]: I0128 21:13:04.567557 4746 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/0dbab66d-c007-4c33-b6da-1e44860668a0-inventory\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-5d8wd\" (UID: \"0dbab66d-c007-4c33-b6da-1e44860668a0\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-5d8wd" Jan 28 21:13:04 crc kubenswrapper[4746]: I0128 21:13:04.567586 4746 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"openstack-edpm-ipam-ovn-default-certs-0\" (UniqueName: \"kubernetes.io/projected/0dbab66d-c007-4c33-b6da-1e44860668a0-openstack-edpm-ipam-ovn-default-certs-0\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-5d8wd\" (UID: \"0dbab66d-c007-4c33-b6da-1e44860668a0\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-5d8wd" Jan 28 21:13:04 crc kubenswrapper[4746]: I0128 21:13:04.567608 4746 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovn-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/0dbab66d-c007-4c33-b6da-1e44860668a0-ovn-combined-ca-bundle\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-5d8wd\" (UID: \"0dbab66d-c007-4c33-b6da-1e44860668a0\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-5d8wd" Jan 28 21:13:04 crc kubenswrapper[4746]: I0128 21:13:04.568155 4746 reflector.go:368] Caches populated for *v1.Secret from 
object-"openstack"/"openstack-edpm-ipam-libvirt-default-certs-0" Jan 28 21:13:04 crc kubenswrapper[4746]: I0128 21:13:04.568214 4746 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-aee-default-env" Jan 28 21:13:04 crc kubenswrapper[4746]: I0128 21:13:04.568337 4746 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplanenodeset-openstack-edpm-ipam" Jan 28 21:13:04 crc kubenswrapper[4746]: I0128 21:13:04.568511 4746 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-edpm-ipam-telemetry-default-certs-0" Jan 28 21:13:04 crc kubenswrapper[4746]: I0128 21:13:04.580110 4746 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/install-certs-edpm-deployment-openstack-edpm-ipam-5d8wd"] Jan 28 21:13:04 crc kubenswrapper[4746]: I0128 21:13:04.669601 4746 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"bootstrap-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/0dbab66d-c007-4c33-b6da-1e44860668a0-bootstrap-combined-ca-bundle\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-5d8wd\" (UID: \"0dbab66d-c007-4c33-b6da-1e44860668a0\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-5d8wd" Jan 28 21:13:04 crc kubenswrapper[4746]: I0128 21:13:04.669663 4746 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"openstack-edpm-ipam-neutron-metadata-default-certs-0\" (UniqueName: \"kubernetes.io/projected/0dbab66d-c007-4c33-b6da-1e44860668a0-openstack-edpm-ipam-neutron-metadata-default-certs-0\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-5d8wd\" (UID: \"0dbab66d-c007-4c33-b6da-1e44860668a0\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-5d8wd" Jan 28 21:13:04 crc kubenswrapper[4746]: I0128 21:13:04.669696 4746 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"libvirt-combined-ca-bundle\" (UniqueName: 
\"kubernetes.io/secret/0dbab66d-c007-4c33-b6da-1e44860668a0-libvirt-combined-ca-bundle\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-5d8wd\" (UID: \"0dbab66d-c007-4c33-b6da-1e44860668a0\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-5d8wd" Jan 28 21:13:04 crc kubenswrapper[4746]: I0128 21:13:04.669722 4746 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"repo-setup-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/0dbab66d-c007-4c33-b6da-1e44860668a0-repo-setup-combined-ca-bundle\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-5d8wd\" (UID: \"0dbab66d-c007-4c33-b6da-1e44860668a0\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-5d8wd" Jan 28 21:13:04 crc kubenswrapper[4746]: I0128 21:13:04.669738 4746 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-8dgkm\" (UniqueName: \"kubernetes.io/projected/0dbab66d-c007-4c33-b6da-1e44860668a0-kube-api-access-8dgkm\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-5d8wd\" (UID: \"0dbab66d-c007-4c33-b6da-1e44860668a0\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-5d8wd" Jan 28 21:13:04 crc kubenswrapper[4746]: I0128 21:13:04.669769 4746 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/0dbab66d-c007-4c33-b6da-1e44860668a0-ssh-key-openstack-edpm-ipam\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-5d8wd\" (UID: \"0dbab66d-c007-4c33-b6da-1e44860668a0\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-5d8wd" Jan 28 21:13:04 crc kubenswrapper[4746]: I0128 21:13:04.669790 4746 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nova-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/0dbab66d-c007-4c33-b6da-1e44860668a0-nova-combined-ca-bundle\") pod 
\"install-certs-edpm-deployment-openstack-edpm-ipam-5d8wd\" (UID: \"0dbab66d-c007-4c33-b6da-1e44860668a0\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-5d8wd" Jan 28 21:13:04 crc kubenswrapper[4746]: I0128 21:13:04.669812 4746 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"openstack-edpm-ipam-libvirt-default-certs-0\" (UniqueName: \"kubernetes.io/projected/0dbab66d-c007-4c33-b6da-1e44860668a0-openstack-edpm-ipam-libvirt-default-certs-0\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-5d8wd\" (UID: \"0dbab66d-c007-4c33-b6da-1e44860668a0\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-5d8wd" Jan 28 21:13:04 crc kubenswrapper[4746]: I0128 21:13:04.669831 4746 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"neutron-metadata-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/0dbab66d-c007-4c33-b6da-1e44860668a0-neutron-metadata-combined-ca-bundle\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-5d8wd\" (UID: \"0dbab66d-c007-4c33-b6da-1e44860668a0\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-5d8wd" Jan 28 21:13:04 crc kubenswrapper[4746]: I0128 21:13:04.669852 4746 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"telemetry-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/0dbab66d-c007-4c33-b6da-1e44860668a0-telemetry-combined-ca-bundle\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-5d8wd\" (UID: \"0dbab66d-c007-4c33-b6da-1e44860668a0\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-5d8wd" Jan 28 21:13:04 crc kubenswrapper[4746]: I0128 21:13:04.669876 4746 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/0dbab66d-c007-4c33-b6da-1e44860668a0-inventory\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-5d8wd\" (UID: \"0dbab66d-c007-4c33-b6da-1e44860668a0\") " 
pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-5d8wd" Jan 28 21:13:04 crc kubenswrapper[4746]: I0128 21:13:04.669894 4746 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"openstack-edpm-ipam-ovn-default-certs-0\" (UniqueName: \"kubernetes.io/projected/0dbab66d-c007-4c33-b6da-1e44860668a0-openstack-edpm-ipam-ovn-default-certs-0\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-5d8wd\" (UID: \"0dbab66d-c007-4c33-b6da-1e44860668a0\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-5d8wd" Jan 28 21:13:04 crc kubenswrapper[4746]: I0128 21:13:04.669909 4746 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovn-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/0dbab66d-c007-4c33-b6da-1e44860668a0-ovn-combined-ca-bundle\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-5d8wd\" (UID: \"0dbab66d-c007-4c33-b6da-1e44860668a0\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-5d8wd" Jan 28 21:13:04 crc kubenswrapper[4746]: I0128 21:13:04.669970 4746 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"openstack-edpm-ipam-telemetry-default-certs-0\" (UniqueName: \"kubernetes.io/projected/0dbab66d-c007-4c33-b6da-1e44860668a0-openstack-edpm-ipam-telemetry-default-certs-0\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-5d8wd\" (UID: \"0dbab66d-c007-4c33-b6da-1e44860668a0\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-5d8wd" Jan 28 21:13:04 crc kubenswrapper[4746]: I0128 21:13:04.675968 4746 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"openstack-edpm-ipam-telemetry-default-certs-0\" (UniqueName: \"kubernetes.io/projected/0dbab66d-c007-4c33-b6da-1e44860668a0-openstack-edpm-ipam-telemetry-default-certs-0\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-5d8wd\" (UID: \"0dbab66d-c007-4c33-b6da-1e44860668a0\") " 
pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-5d8wd" Jan 28 21:13:04 crc kubenswrapper[4746]: I0128 21:13:04.678464 4746 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"libvirt-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/0dbab66d-c007-4c33-b6da-1e44860668a0-libvirt-combined-ca-bundle\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-5d8wd\" (UID: \"0dbab66d-c007-4c33-b6da-1e44860668a0\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-5d8wd" Jan 28 21:13:04 crc kubenswrapper[4746]: I0128 21:13:04.679132 4746 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"repo-setup-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/0dbab66d-c007-4c33-b6da-1e44860668a0-repo-setup-combined-ca-bundle\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-5d8wd\" (UID: \"0dbab66d-c007-4c33-b6da-1e44860668a0\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-5d8wd" Jan 28 21:13:04 crc kubenswrapper[4746]: I0128 21:13:04.679295 4746 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"openstack-edpm-ipam-libvirt-default-certs-0\" (UniqueName: \"kubernetes.io/projected/0dbab66d-c007-4c33-b6da-1e44860668a0-openstack-edpm-ipam-libvirt-default-certs-0\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-5d8wd\" (UID: \"0dbab66d-c007-4c33-b6da-1e44860668a0\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-5d8wd" Jan 28 21:13:04 crc kubenswrapper[4746]: I0128 21:13:04.679758 4746 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/0dbab66d-c007-4c33-b6da-1e44860668a0-ssh-key-openstack-edpm-ipam\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-5d8wd\" (UID: \"0dbab66d-c007-4c33-b6da-1e44860668a0\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-5d8wd" Jan 28 21:13:04 crc kubenswrapper[4746]: I0128 
21:13:04.680677 4746 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"neutron-metadata-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/0dbab66d-c007-4c33-b6da-1e44860668a0-neutron-metadata-combined-ca-bundle\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-5d8wd\" (UID: \"0dbab66d-c007-4c33-b6da-1e44860668a0\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-5d8wd" Jan 28 21:13:04 crc kubenswrapper[4746]: I0128 21:13:04.681034 4746 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/0dbab66d-c007-4c33-b6da-1e44860668a0-inventory\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-5d8wd\" (UID: \"0dbab66d-c007-4c33-b6da-1e44860668a0\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-5d8wd" Jan 28 21:13:04 crc kubenswrapper[4746]: I0128 21:13:04.681475 4746 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"bootstrap-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/0dbab66d-c007-4c33-b6da-1e44860668a0-bootstrap-combined-ca-bundle\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-5d8wd\" (UID: \"0dbab66d-c007-4c33-b6da-1e44860668a0\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-5d8wd" Jan 28 21:13:04 crc kubenswrapper[4746]: I0128 21:13:04.682006 4746 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"nova-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/0dbab66d-c007-4c33-b6da-1e44860668a0-nova-combined-ca-bundle\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-5d8wd\" (UID: \"0dbab66d-c007-4c33-b6da-1e44860668a0\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-5d8wd" Jan 28 21:13:04 crc kubenswrapper[4746]: I0128 21:13:04.685021 4746 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"openstack-edpm-ipam-ovn-default-certs-0\" (UniqueName: 
\"kubernetes.io/projected/0dbab66d-c007-4c33-b6da-1e44860668a0-openstack-edpm-ipam-ovn-default-certs-0\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-5d8wd\" (UID: \"0dbab66d-c007-4c33-b6da-1e44860668a0\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-5d8wd" Jan 28 21:13:04 crc kubenswrapper[4746]: I0128 21:13:04.685816 4746 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"telemetry-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/0dbab66d-c007-4c33-b6da-1e44860668a0-telemetry-combined-ca-bundle\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-5d8wd\" (UID: \"0dbab66d-c007-4c33-b6da-1e44860668a0\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-5d8wd" Jan 28 21:13:04 crc kubenswrapper[4746]: I0128 21:13:04.693252 4746 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-8dgkm\" (UniqueName: \"kubernetes.io/projected/0dbab66d-c007-4c33-b6da-1e44860668a0-kube-api-access-8dgkm\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-5d8wd\" (UID: \"0dbab66d-c007-4c33-b6da-1e44860668a0\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-5d8wd" Jan 28 21:13:04 crc kubenswrapper[4746]: I0128 21:13:04.697415 4746 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"openstack-edpm-ipam-neutron-metadata-default-certs-0\" (UniqueName: \"kubernetes.io/projected/0dbab66d-c007-4c33-b6da-1e44860668a0-openstack-edpm-ipam-neutron-metadata-default-certs-0\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-5d8wd\" (UID: \"0dbab66d-c007-4c33-b6da-1e44860668a0\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-5d8wd" Jan 28 21:13:04 crc kubenswrapper[4746]: I0128 21:13:04.698558 4746 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovn-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/0dbab66d-c007-4c33-b6da-1e44860668a0-ovn-combined-ca-bundle\") pod 
\"install-certs-edpm-deployment-openstack-edpm-ipam-5d8wd\" (UID: \"0dbab66d-c007-4c33-b6da-1e44860668a0\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-5d8wd" Jan 28 21:13:04 crc kubenswrapper[4746]: I0128 21:13:04.883197 4746 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-5d8wd" Jan 28 21:13:05 crc kubenswrapper[4746]: W0128 21:13:05.544200 4746 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod0dbab66d_c007_4c33_b6da_1e44860668a0.slice/crio-998d70de2678a6089b8707bb79bef5f0e82579df739913a770d007ad5ff89cef WatchSource:0}: Error finding container 998d70de2678a6089b8707bb79bef5f0e82579df739913a770d007ad5ff89cef: Status 404 returned error can't find the container with id 998d70de2678a6089b8707bb79bef5f0e82579df739913a770d007ad5ff89cef Jan 28 21:13:05 crc kubenswrapper[4746]: I0128 21:13:05.547100 4746 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/install-certs-edpm-deployment-openstack-edpm-ipam-5d8wd"] Jan 28 21:13:06 crc kubenswrapper[4746]: I0128 21:13:06.526239 4746 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-5d8wd" event={"ID":"0dbab66d-c007-4c33-b6da-1e44860668a0","Type":"ContainerStarted","Data":"20b6eb5e222c20b922bfa6e93b4c6089bc8ee4acfbb4976da25aadaf1313a6b1"} Jan 28 21:13:06 crc kubenswrapper[4746]: I0128 21:13:06.527172 4746 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-5d8wd" event={"ID":"0dbab66d-c007-4c33-b6da-1e44860668a0","Type":"ContainerStarted","Data":"998d70de2678a6089b8707bb79bef5f0e82579df739913a770d007ad5ff89cef"} Jan 28 21:13:06 crc kubenswrapper[4746]: I0128 21:13:06.549033 4746 pod_startup_latency_tracker.go:104] "Observed pod startup duration" 
pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-5d8wd" podStartSLOduration=2.134707647 podStartE2EDuration="2.549004851s" podCreationTimestamp="2026-01-28 21:13:04 +0000 UTC" firstStartedPulling="2026-01-28 21:13:05.545966755 +0000 UTC m=+2013.502153109" lastFinishedPulling="2026-01-28 21:13:05.960263959 +0000 UTC m=+2013.916450313" observedRunningTime="2026-01-28 21:13:06.546511023 +0000 UTC m=+2014.502697377" watchObservedRunningTime="2026-01-28 21:13:06.549004851 +0000 UTC m=+2014.505191205" Jan 28 21:13:36 crc kubenswrapper[4746]: I0128 21:13:36.876401 4746 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-operators-k9wzl"] Jan 28 21:13:36 crc kubenswrapper[4746]: I0128 21:13:36.879580 4746 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-k9wzl"] Jan 28 21:13:36 crc kubenswrapper[4746]: I0128 21:13:36.879673 4746 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-k9wzl" Jan 28 21:13:36 crc kubenswrapper[4746]: I0128 21:13:36.949710 4746 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/bb21e5e7-4689-4d83-8b78-377e04af38a2-utilities\") pod \"redhat-operators-k9wzl\" (UID: \"bb21e5e7-4689-4d83-8b78-377e04af38a2\") " pod="openshift-marketplace/redhat-operators-k9wzl" Jan 28 21:13:36 crc kubenswrapper[4746]: I0128 21:13:36.949790 4746 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-5kfwf\" (UniqueName: \"kubernetes.io/projected/bb21e5e7-4689-4d83-8b78-377e04af38a2-kube-api-access-5kfwf\") pod \"redhat-operators-k9wzl\" (UID: \"bb21e5e7-4689-4d83-8b78-377e04af38a2\") " pod="openshift-marketplace/redhat-operators-k9wzl" Jan 28 21:13:36 crc kubenswrapper[4746]: I0128 21:13:36.949843 4746 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/bb21e5e7-4689-4d83-8b78-377e04af38a2-catalog-content\") pod \"redhat-operators-k9wzl\" (UID: \"bb21e5e7-4689-4d83-8b78-377e04af38a2\") " pod="openshift-marketplace/redhat-operators-k9wzl" Jan 28 21:13:37 crc kubenswrapper[4746]: I0128 21:13:37.051448 4746 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/bb21e5e7-4689-4d83-8b78-377e04af38a2-utilities\") pod \"redhat-operators-k9wzl\" (UID: \"bb21e5e7-4689-4d83-8b78-377e04af38a2\") " pod="openshift-marketplace/redhat-operators-k9wzl" Jan 28 21:13:37 crc kubenswrapper[4746]: I0128 21:13:37.051514 4746 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-5kfwf\" (UniqueName: \"kubernetes.io/projected/bb21e5e7-4689-4d83-8b78-377e04af38a2-kube-api-access-5kfwf\") pod \"redhat-operators-k9wzl\" (UID: \"bb21e5e7-4689-4d83-8b78-377e04af38a2\") " pod="openshift-marketplace/redhat-operators-k9wzl" Jan 28 21:13:37 crc kubenswrapper[4746]: I0128 21:13:37.051557 4746 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/bb21e5e7-4689-4d83-8b78-377e04af38a2-catalog-content\") pod \"redhat-operators-k9wzl\" (UID: \"bb21e5e7-4689-4d83-8b78-377e04af38a2\") " pod="openshift-marketplace/redhat-operators-k9wzl" Jan 28 21:13:37 crc kubenswrapper[4746]: I0128 21:13:37.052189 4746 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/bb21e5e7-4689-4d83-8b78-377e04af38a2-utilities\") pod \"redhat-operators-k9wzl\" (UID: \"bb21e5e7-4689-4d83-8b78-377e04af38a2\") " pod="openshift-marketplace/redhat-operators-k9wzl" Jan 28 21:13:37 crc kubenswrapper[4746]: I0128 21:13:37.052225 4746 operation_generator.go:637] "MountVolume.SetUp succeeded for volume 
\"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/bb21e5e7-4689-4d83-8b78-377e04af38a2-catalog-content\") pod \"redhat-operators-k9wzl\" (UID: \"bb21e5e7-4689-4d83-8b78-377e04af38a2\") " pod="openshift-marketplace/redhat-operators-k9wzl"
Jan 28 21:13:37 crc kubenswrapper[4746]: I0128 21:13:37.074039 4746 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-5kfwf\" (UniqueName: \"kubernetes.io/projected/bb21e5e7-4689-4d83-8b78-377e04af38a2-kube-api-access-5kfwf\") pod \"redhat-operators-k9wzl\" (UID: \"bb21e5e7-4689-4d83-8b78-377e04af38a2\") " pod="openshift-marketplace/redhat-operators-k9wzl"
Jan 28 21:13:37 crc kubenswrapper[4746]: I0128 21:13:37.206317 4746 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-k9wzl"
Jan 28 21:13:37 crc kubenswrapper[4746]: I0128 21:13:37.769102 4746 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-k9wzl"]
Jan 28 21:13:37 crc kubenswrapper[4746]: I0128 21:13:37.920727 4746 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-k9wzl" event={"ID":"bb21e5e7-4689-4d83-8b78-377e04af38a2","Type":"ContainerStarted","Data":"287b26aecfc44e909ee315440e1f15661b0abd5c280d27086443918e4fe0123a"}
Jan 28 21:13:38 crc kubenswrapper[4746]: I0128 21:13:38.934687 4746 generic.go:334] "Generic (PLEG): container finished" podID="bb21e5e7-4689-4d83-8b78-377e04af38a2" containerID="6c2d7dbdf815772e6e64d832e421bf6c2bf135f3308aeb7b8954b62d063c8773" exitCode=0
Jan 28 21:13:38 crc kubenswrapper[4746]: I0128 21:13:38.935061 4746 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-k9wzl" event={"ID":"bb21e5e7-4689-4d83-8b78-377e04af38a2","Type":"ContainerDied","Data":"6c2d7dbdf815772e6e64d832e421bf6c2bf135f3308aeb7b8954b62d063c8773"}
Jan 28 21:13:39 crc kubenswrapper[4746]: I0128 21:13:39.948534 4746 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-k9wzl" event={"ID":"bb21e5e7-4689-4d83-8b78-377e04af38a2","Type":"ContainerStarted","Data":"ced169a2215bba611f3fb09c606519b1d6c7463f52bede6e34fbe2701a0ef540"}
Jan 28 21:13:43 crc kubenswrapper[4746]: I0128 21:13:43.764893 4746 generic.go:334] "Generic (PLEG): container finished" podID="0dbab66d-c007-4c33-b6da-1e44860668a0" containerID="20b6eb5e222c20b922bfa6e93b4c6089bc8ee4acfbb4976da25aadaf1313a6b1" exitCode=0
Jan 28 21:13:43 crc kubenswrapper[4746]: I0128 21:13:43.764986 4746 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-5d8wd" event={"ID":"0dbab66d-c007-4c33-b6da-1e44860668a0","Type":"ContainerDied","Data":"20b6eb5e222c20b922bfa6e93b4c6089bc8ee4acfbb4976da25aadaf1313a6b1"}
Jan 28 21:13:45 crc kubenswrapper[4746]: I0128 21:13:45.276806 4746 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-5d8wd"
Jan 28 21:13:45 crc kubenswrapper[4746]: I0128 21:13:45.441631 4746 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovn-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/0dbab66d-c007-4c33-b6da-1e44860668a0-ovn-combined-ca-bundle\") pod \"0dbab66d-c007-4c33-b6da-1e44860668a0\" (UID: \"0dbab66d-c007-4c33-b6da-1e44860668a0\") "
Jan 28 21:13:45 crc kubenswrapper[4746]: I0128 21:13:45.441717 4746 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"repo-setup-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/0dbab66d-c007-4c33-b6da-1e44860668a0-repo-setup-combined-ca-bundle\") pod \"0dbab66d-c007-4c33-b6da-1e44860668a0\" (UID: \"0dbab66d-c007-4c33-b6da-1e44860668a0\") "
Jan 28 21:13:45 crc kubenswrapper[4746]: I0128 21:13:45.441765 4746 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"openstack-edpm-ipam-neutron-metadata-default-certs-0\" (UniqueName: \"kubernetes.io/projected/0dbab66d-c007-4c33-b6da-1e44860668a0-openstack-edpm-ipam-neutron-metadata-default-certs-0\") pod \"0dbab66d-c007-4c33-b6da-1e44860668a0\" (UID: \"0dbab66d-c007-4c33-b6da-1e44860668a0\") "
Jan 28 21:13:45 crc kubenswrapper[4746]: I0128 21:13:45.441917 4746 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"bootstrap-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/0dbab66d-c007-4c33-b6da-1e44860668a0-bootstrap-combined-ca-bundle\") pod \"0dbab66d-c007-4c33-b6da-1e44860668a0\" (UID: \"0dbab66d-c007-4c33-b6da-1e44860668a0\") "
Jan 28 21:13:45 crc kubenswrapper[4746]: I0128 21:13:45.441944 4746 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"libvirt-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/0dbab66d-c007-4c33-b6da-1e44860668a0-libvirt-combined-ca-bundle\") pod \"0dbab66d-c007-4c33-b6da-1e44860668a0\" (UID: \"0dbab66d-c007-4c33-b6da-1e44860668a0\") "
Jan 28 21:13:45 crc kubenswrapper[4746]: I0128 21:13:45.441975 4746 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"neutron-metadata-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/0dbab66d-c007-4c33-b6da-1e44860668a0-neutron-metadata-combined-ca-bundle\") pod \"0dbab66d-c007-4c33-b6da-1e44860668a0\" (UID: \"0dbab66d-c007-4c33-b6da-1e44860668a0\") "
Jan 28 21:13:45 crc kubenswrapper[4746]: I0128 21:13:45.442053 4746 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"openstack-edpm-ipam-libvirt-default-certs-0\" (UniqueName: \"kubernetes.io/projected/0dbab66d-c007-4c33-b6da-1e44860668a0-openstack-edpm-ipam-libvirt-default-certs-0\") pod \"0dbab66d-c007-4c33-b6da-1e44860668a0\" (UID: \"0dbab66d-c007-4c33-b6da-1e44860668a0\") "
Jan 28 21:13:45 crc kubenswrapper[4746]: I0128 21:13:45.442162 4746 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"openstack-edpm-ipam-ovn-default-certs-0\" (UniqueName: \"kubernetes.io/projected/0dbab66d-c007-4c33-b6da-1e44860668a0-openstack-edpm-ipam-ovn-default-certs-0\") pod \"0dbab66d-c007-4c33-b6da-1e44860668a0\" (UID: \"0dbab66d-c007-4c33-b6da-1e44860668a0\") "
Jan 28 21:13:45 crc kubenswrapper[4746]: I0128 21:13:45.442252 4746 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"openstack-edpm-ipam-telemetry-default-certs-0\" (UniqueName: \"kubernetes.io/projected/0dbab66d-c007-4c33-b6da-1e44860668a0-openstack-edpm-ipam-telemetry-default-certs-0\") pod \"0dbab66d-c007-4c33-b6da-1e44860668a0\" (UID: \"0dbab66d-c007-4c33-b6da-1e44860668a0\") "
Jan 28 21:13:45 crc kubenswrapper[4746]: I0128 21:13:45.442282 4746 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-8dgkm\" (UniqueName: \"kubernetes.io/projected/0dbab66d-c007-4c33-b6da-1e44860668a0-kube-api-access-8dgkm\") pod \"0dbab66d-c007-4c33-b6da-1e44860668a0\" (UID: \"0dbab66d-c007-4c33-b6da-1e44860668a0\") "
Jan 28 21:13:45 crc kubenswrapper[4746]: I0128 21:13:45.442314 4746 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"nova-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/0dbab66d-c007-4c33-b6da-1e44860668a0-nova-combined-ca-bundle\") pod \"0dbab66d-c007-4c33-b6da-1e44860668a0\" (UID: \"0dbab66d-c007-4c33-b6da-1e44860668a0\") "
Jan 28 21:13:45 crc kubenswrapper[4746]: I0128 21:13:45.442341 4746 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/0dbab66d-c007-4c33-b6da-1e44860668a0-ssh-key-openstack-edpm-ipam\") pod \"0dbab66d-c007-4c33-b6da-1e44860668a0\" (UID: \"0dbab66d-c007-4c33-b6da-1e44860668a0\") "
Jan 28 21:13:45 crc kubenswrapper[4746]: I0128 21:13:45.442384 4746 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"telemetry-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/0dbab66d-c007-4c33-b6da-1e44860668a0-telemetry-combined-ca-bundle\") pod \"0dbab66d-c007-4c33-b6da-1e44860668a0\" (UID: \"0dbab66d-c007-4c33-b6da-1e44860668a0\") "
Jan 28 21:13:45 crc kubenswrapper[4746]: I0128 21:13:45.442413 4746 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/0dbab66d-c007-4c33-b6da-1e44860668a0-inventory\") pod \"0dbab66d-c007-4c33-b6da-1e44860668a0\" (UID: \"0dbab66d-c007-4c33-b6da-1e44860668a0\") "
Jan 28 21:13:45 crc kubenswrapper[4746]: I0128 21:13:45.449290 4746 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/0dbab66d-c007-4c33-b6da-1e44860668a0-openstack-edpm-ipam-telemetry-default-certs-0" (OuterVolumeSpecName: "openstack-edpm-ipam-telemetry-default-certs-0") pod "0dbab66d-c007-4c33-b6da-1e44860668a0" (UID: "0dbab66d-c007-4c33-b6da-1e44860668a0"). InnerVolumeSpecName "openstack-edpm-ipam-telemetry-default-certs-0". PluginName "kubernetes.io/projected", VolumeGidValue ""
Jan 28 21:13:45 crc kubenswrapper[4746]: I0128 21:13:45.449468 4746 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/0dbab66d-c007-4c33-b6da-1e44860668a0-openstack-edpm-ipam-neutron-metadata-default-certs-0" (OuterVolumeSpecName: "openstack-edpm-ipam-neutron-metadata-default-certs-0") pod "0dbab66d-c007-4c33-b6da-1e44860668a0" (UID: "0dbab66d-c007-4c33-b6da-1e44860668a0"). InnerVolumeSpecName "openstack-edpm-ipam-neutron-metadata-default-certs-0". PluginName "kubernetes.io/projected", VolumeGidValue ""
Jan 28 21:13:45 crc kubenswrapper[4746]: I0128 21:13:45.449777 4746 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/0dbab66d-c007-4c33-b6da-1e44860668a0-neutron-metadata-combined-ca-bundle" (OuterVolumeSpecName: "neutron-metadata-combined-ca-bundle") pod "0dbab66d-c007-4c33-b6da-1e44860668a0" (UID: "0dbab66d-c007-4c33-b6da-1e44860668a0"). InnerVolumeSpecName "neutron-metadata-combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue ""
Jan 28 21:13:45 crc kubenswrapper[4746]: I0128 21:13:45.449848 4746 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/0dbab66d-c007-4c33-b6da-1e44860668a0-kube-api-access-8dgkm" (OuterVolumeSpecName: "kube-api-access-8dgkm") pod "0dbab66d-c007-4c33-b6da-1e44860668a0" (UID: "0dbab66d-c007-4c33-b6da-1e44860668a0"). InnerVolumeSpecName "kube-api-access-8dgkm". PluginName "kubernetes.io/projected", VolumeGidValue ""
Jan 28 21:13:45 crc kubenswrapper[4746]: I0128 21:13:45.450263 4746 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/0dbab66d-c007-4c33-b6da-1e44860668a0-libvirt-combined-ca-bundle" (OuterVolumeSpecName: "libvirt-combined-ca-bundle") pod "0dbab66d-c007-4c33-b6da-1e44860668a0" (UID: "0dbab66d-c007-4c33-b6da-1e44860668a0"). InnerVolumeSpecName "libvirt-combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue ""
Jan 28 21:13:45 crc kubenswrapper[4746]: I0128 21:13:45.450593 4746 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/0dbab66d-c007-4c33-b6da-1e44860668a0-nova-combined-ca-bundle" (OuterVolumeSpecName: "nova-combined-ca-bundle") pod "0dbab66d-c007-4c33-b6da-1e44860668a0" (UID: "0dbab66d-c007-4c33-b6da-1e44860668a0"). InnerVolumeSpecName "nova-combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue ""
Jan 28 21:13:45 crc kubenswrapper[4746]: I0128 21:13:45.451427 4746 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/0dbab66d-c007-4c33-b6da-1e44860668a0-bootstrap-combined-ca-bundle" (OuterVolumeSpecName: "bootstrap-combined-ca-bundle") pod "0dbab66d-c007-4c33-b6da-1e44860668a0" (UID: "0dbab66d-c007-4c33-b6da-1e44860668a0"). InnerVolumeSpecName "bootstrap-combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue ""
Jan 28 21:13:45 crc kubenswrapper[4746]: I0128 21:13:45.451807 4746 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/0dbab66d-c007-4c33-b6da-1e44860668a0-telemetry-combined-ca-bundle" (OuterVolumeSpecName: "telemetry-combined-ca-bundle") pod "0dbab66d-c007-4c33-b6da-1e44860668a0" (UID: "0dbab66d-c007-4c33-b6da-1e44860668a0"). InnerVolumeSpecName "telemetry-combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue ""
Jan 28 21:13:45 crc kubenswrapper[4746]: I0128 21:13:45.452336 4746 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/0dbab66d-c007-4c33-b6da-1e44860668a0-openstack-edpm-ipam-libvirt-default-certs-0" (OuterVolumeSpecName: "openstack-edpm-ipam-libvirt-default-certs-0") pod "0dbab66d-c007-4c33-b6da-1e44860668a0" (UID: "0dbab66d-c007-4c33-b6da-1e44860668a0"). InnerVolumeSpecName "openstack-edpm-ipam-libvirt-default-certs-0". PluginName "kubernetes.io/projected", VolumeGidValue ""
Jan 28 21:13:45 crc kubenswrapper[4746]: I0128 21:13:45.452620 4746 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/0dbab66d-c007-4c33-b6da-1e44860668a0-openstack-edpm-ipam-ovn-default-certs-0" (OuterVolumeSpecName: "openstack-edpm-ipam-ovn-default-certs-0") pod "0dbab66d-c007-4c33-b6da-1e44860668a0" (UID: "0dbab66d-c007-4c33-b6da-1e44860668a0"). InnerVolumeSpecName "openstack-edpm-ipam-ovn-default-certs-0". PluginName "kubernetes.io/projected", VolumeGidValue ""
Jan 28 21:13:45 crc kubenswrapper[4746]: I0128 21:13:45.453339 4746 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/0dbab66d-c007-4c33-b6da-1e44860668a0-ovn-combined-ca-bundle" (OuterVolumeSpecName: "ovn-combined-ca-bundle") pod "0dbab66d-c007-4c33-b6da-1e44860668a0" (UID: "0dbab66d-c007-4c33-b6da-1e44860668a0"). InnerVolumeSpecName "ovn-combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue ""
Jan 28 21:13:45 crc kubenswrapper[4746]: I0128 21:13:45.454242 4746 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/0dbab66d-c007-4c33-b6da-1e44860668a0-repo-setup-combined-ca-bundle" (OuterVolumeSpecName: "repo-setup-combined-ca-bundle") pod "0dbab66d-c007-4c33-b6da-1e44860668a0" (UID: "0dbab66d-c007-4c33-b6da-1e44860668a0"). InnerVolumeSpecName "repo-setup-combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue ""
Jan 28 21:13:45 crc kubenswrapper[4746]: I0128 21:13:45.476468 4746 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/0dbab66d-c007-4c33-b6da-1e44860668a0-inventory" (OuterVolumeSpecName: "inventory") pod "0dbab66d-c007-4c33-b6da-1e44860668a0" (UID: "0dbab66d-c007-4c33-b6da-1e44860668a0"). InnerVolumeSpecName "inventory". PluginName "kubernetes.io/secret", VolumeGidValue ""
Jan 28 21:13:45 crc kubenswrapper[4746]: I0128 21:13:45.501601 4746 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/0dbab66d-c007-4c33-b6da-1e44860668a0-ssh-key-openstack-edpm-ipam" (OuterVolumeSpecName: "ssh-key-openstack-edpm-ipam") pod "0dbab66d-c007-4c33-b6da-1e44860668a0" (UID: "0dbab66d-c007-4c33-b6da-1e44860668a0"). InnerVolumeSpecName "ssh-key-openstack-edpm-ipam". PluginName "kubernetes.io/secret", VolumeGidValue ""
Jan 28 21:13:45 crc kubenswrapper[4746]: I0128 21:13:45.544427 4746 reconciler_common.go:293] "Volume detached for volume \"bootstrap-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/0dbab66d-c007-4c33-b6da-1e44860668a0-bootstrap-combined-ca-bundle\") on node \"crc\" DevicePath \"\""
Jan 28 21:13:45 crc kubenswrapper[4746]: I0128 21:13:45.544474 4746 reconciler_common.go:293] "Volume detached for volume \"libvirt-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/0dbab66d-c007-4c33-b6da-1e44860668a0-libvirt-combined-ca-bundle\") on node \"crc\" DevicePath \"\""
Jan 28 21:13:45 crc kubenswrapper[4746]: I0128 21:13:45.544493 4746 reconciler_common.go:293] "Volume detached for volume \"neutron-metadata-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/0dbab66d-c007-4c33-b6da-1e44860668a0-neutron-metadata-combined-ca-bundle\") on node \"crc\" DevicePath \"\""
Jan 28 21:13:45 crc kubenswrapper[4746]: I0128 21:13:45.544513 4746 reconciler_common.go:293] "Volume detached for volume \"openstack-edpm-ipam-libvirt-default-certs-0\" (UniqueName: \"kubernetes.io/projected/0dbab66d-c007-4c33-b6da-1e44860668a0-openstack-edpm-ipam-libvirt-default-certs-0\") on node \"crc\" DevicePath \"\""
Jan 28 21:13:45 crc kubenswrapper[4746]: I0128 21:13:45.544532 4746 reconciler_common.go:293] "Volume detached for volume \"openstack-edpm-ipam-ovn-default-certs-0\" (UniqueName: \"kubernetes.io/projected/0dbab66d-c007-4c33-b6da-1e44860668a0-openstack-edpm-ipam-ovn-default-certs-0\") on node \"crc\" DevicePath \"\""
Jan 28 21:13:45 crc kubenswrapper[4746]: I0128 21:13:45.544547 4746 reconciler_common.go:293] "Volume detached for volume \"openstack-edpm-ipam-telemetry-default-certs-0\" (UniqueName: \"kubernetes.io/projected/0dbab66d-c007-4c33-b6da-1e44860668a0-openstack-edpm-ipam-telemetry-default-certs-0\") on node \"crc\" DevicePath \"\""
Jan 28 21:13:45 crc kubenswrapper[4746]: I0128 21:13:45.544562 4746 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-8dgkm\" (UniqueName: \"kubernetes.io/projected/0dbab66d-c007-4c33-b6da-1e44860668a0-kube-api-access-8dgkm\") on node \"crc\" DevicePath \"\""
Jan 28 21:13:45 crc kubenswrapper[4746]: I0128 21:13:45.544574 4746 reconciler_common.go:293] "Volume detached for volume \"nova-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/0dbab66d-c007-4c33-b6da-1e44860668a0-nova-combined-ca-bundle\") on node \"crc\" DevicePath \"\""
Jan 28 21:13:45 crc kubenswrapper[4746]: I0128 21:13:45.544586 4746 reconciler_common.go:293] "Volume detached for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/0dbab66d-c007-4c33-b6da-1e44860668a0-ssh-key-openstack-edpm-ipam\") on node \"crc\" DevicePath \"\""
Jan 28 21:13:45 crc kubenswrapper[4746]: I0128 21:13:45.544598 4746 reconciler_common.go:293] "Volume detached for volume \"telemetry-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/0dbab66d-c007-4c33-b6da-1e44860668a0-telemetry-combined-ca-bundle\") on node \"crc\" DevicePath \"\""
Jan 28 21:13:45 crc kubenswrapper[4746]: I0128 21:13:45.544609 4746 reconciler_common.go:293] "Volume detached for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/0dbab66d-c007-4c33-b6da-1e44860668a0-inventory\") on node \"crc\" DevicePath \"\""
Jan 28 21:13:45 crc kubenswrapper[4746]: I0128 21:13:45.544622 4746 reconciler_common.go:293] "Volume detached for volume \"ovn-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/0dbab66d-c007-4c33-b6da-1e44860668a0-ovn-combined-ca-bundle\") on node \"crc\" DevicePath \"\""
Jan 28 21:13:45 crc kubenswrapper[4746]: I0128 21:13:45.544633 4746 reconciler_common.go:293] "Volume detached for volume \"repo-setup-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/0dbab66d-c007-4c33-b6da-1e44860668a0-repo-setup-combined-ca-bundle\") on node \"crc\" DevicePath \"\""
Jan 28 21:13:45 crc kubenswrapper[4746]: I0128 21:13:45.544647 4746 reconciler_common.go:293] "Volume detached for volume \"openstack-edpm-ipam-neutron-metadata-default-certs-0\" (UniqueName: \"kubernetes.io/projected/0dbab66d-c007-4c33-b6da-1e44860668a0-openstack-edpm-ipam-neutron-metadata-default-certs-0\") on node \"crc\" DevicePath \"\""
Jan 28 21:13:45 crc kubenswrapper[4746]: I0128 21:13:45.786582 4746 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-5d8wd" event={"ID":"0dbab66d-c007-4c33-b6da-1e44860668a0","Type":"ContainerDied","Data":"998d70de2678a6089b8707bb79bef5f0e82579df739913a770d007ad5ff89cef"}
Jan 28 21:13:45 crc kubenswrapper[4746]: I0128 21:13:45.786622 4746 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="998d70de2678a6089b8707bb79bef5f0e82579df739913a770d007ad5ff89cef"
Jan 28 21:13:45 crc kubenswrapper[4746]: I0128 21:13:45.786695 4746 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-5d8wd"
Jan 28 21:13:45 crc kubenswrapper[4746]: I0128 21:13:45.904812 4746 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ovn-edpm-deployment-openstack-edpm-ipam-sfgd4"]
Jan 28 21:13:45 crc kubenswrapper[4746]: E0128 21:13:45.905210 4746 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="0dbab66d-c007-4c33-b6da-1e44860668a0" containerName="install-certs-edpm-deployment-openstack-edpm-ipam"
Jan 28 21:13:45 crc kubenswrapper[4746]: I0128 21:13:45.905223 4746 state_mem.go:107] "Deleted CPUSet assignment" podUID="0dbab66d-c007-4c33-b6da-1e44860668a0" containerName="install-certs-edpm-deployment-openstack-edpm-ipam"
Jan 28 21:13:45 crc kubenswrapper[4746]: I0128 21:13:45.905436 4746 memory_manager.go:354] "RemoveStaleState removing state" podUID="0dbab66d-c007-4c33-b6da-1e44860668a0" containerName="install-certs-edpm-deployment-openstack-edpm-ipam"
Jan 28 21:13:45 crc kubenswrapper[4746]: I0128 21:13:45.906171 4746 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ovn-edpm-deployment-openstack-edpm-ipam-sfgd4"
Jan 28 21:13:45 crc kubenswrapper[4746]: I0128 21:13:45.909200 4746 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-aee-default-env"
Jan 28 21:13:45 crc kubenswrapper[4746]: I0128 21:13:45.909363 4746 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"ovncontroller-config"
Jan 28 21:13:45 crc kubenswrapper[4746]: I0128 21:13:45.909490 4746 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplanenodeset-openstack-edpm-ipam"
Jan 28 21:13:45 crc kubenswrapper[4746]: I0128 21:13:45.909589 4746 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-edpm-ipam-dockercfg-xb8vv"
Jan 28 21:13:45 crc kubenswrapper[4746]: I0128 21:13:45.909787 4746 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplane-ansible-ssh-private-key-secret"
Jan 28 21:13:45 crc kubenswrapper[4746]: I0128 21:13:45.932892 4746 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovn-edpm-deployment-openstack-edpm-ipam-sfgd4"]
Jan 28 21:13:46 crc kubenswrapper[4746]: I0128 21:13:46.054988 4746 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovn-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/824d1a68-929d-4c25-801a-17fdf5172893-ovn-combined-ca-bundle\") pod \"ovn-edpm-deployment-openstack-edpm-ipam-sfgd4\" (UID: \"824d1a68-929d-4c25-801a-17fdf5172893\") " pod="openstack/ovn-edpm-deployment-openstack-edpm-ipam-sfgd4"
Jan 28 21:13:46 crc kubenswrapper[4746]: I0128 21:13:46.055063 4746 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-wv8l6\" (UniqueName: \"kubernetes.io/projected/824d1a68-929d-4c25-801a-17fdf5172893-kube-api-access-wv8l6\") pod \"ovn-edpm-deployment-openstack-edpm-ipam-sfgd4\" (UID: \"824d1a68-929d-4c25-801a-17fdf5172893\") " pod="openstack/ovn-edpm-deployment-openstack-edpm-ipam-sfgd4"
Jan 28 21:13:46 crc kubenswrapper[4746]: I0128 21:13:46.055560 4746 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovncontroller-config-0\" (UniqueName: \"kubernetes.io/configmap/824d1a68-929d-4c25-801a-17fdf5172893-ovncontroller-config-0\") pod \"ovn-edpm-deployment-openstack-edpm-ipam-sfgd4\" (UID: \"824d1a68-929d-4c25-801a-17fdf5172893\") " pod="openstack/ovn-edpm-deployment-openstack-edpm-ipam-sfgd4"
Jan 28 21:13:46 crc kubenswrapper[4746]: I0128 21:13:46.055620 4746 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/824d1a68-929d-4c25-801a-17fdf5172893-ssh-key-openstack-edpm-ipam\") pod \"ovn-edpm-deployment-openstack-edpm-ipam-sfgd4\" (UID: \"824d1a68-929d-4c25-801a-17fdf5172893\") " pod="openstack/ovn-edpm-deployment-openstack-edpm-ipam-sfgd4"
Jan 28 21:13:46 crc kubenswrapper[4746]: I0128 21:13:46.055708 4746 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/824d1a68-929d-4c25-801a-17fdf5172893-inventory\") pod \"ovn-edpm-deployment-openstack-edpm-ipam-sfgd4\" (UID: \"824d1a68-929d-4c25-801a-17fdf5172893\") " pod="openstack/ovn-edpm-deployment-openstack-edpm-ipam-sfgd4"
Jan 28 21:13:46 crc kubenswrapper[4746]: I0128 21:13:46.157127 4746 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovn-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/824d1a68-929d-4c25-801a-17fdf5172893-ovn-combined-ca-bundle\") pod \"ovn-edpm-deployment-openstack-edpm-ipam-sfgd4\" (UID: \"824d1a68-929d-4c25-801a-17fdf5172893\") " pod="openstack/ovn-edpm-deployment-openstack-edpm-ipam-sfgd4"
Jan 28 21:13:46 crc kubenswrapper[4746]: I0128 21:13:46.157198 4746 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-wv8l6\" (UniqueName: \"kubernetes.io/projected/824d1a68-929d-4c25-801a-17fdf5172893-kube-api-access-wv8l6\") pod \"ovn-edpm-deployment-openstack-edpm-ipam-sfgd4\" (UID: \"824d1a68-929d-4c25-801a-17fdf5172893\") " pod="openstack/ovn-edpm-deployment-openstack-edpm-ipam-sfgd4"
Jan 28 21:13:46 crc kubenswrapper[4746]: I0128 21:13:46.157261 4746 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovncontroller-config-0\" (UniqueName: \"kubernetes.io/configmap/824d1a68-929d-4c25-801a-17fdf5172893-ovncontroller-config-0\") pod \"ovn-edpm-deployment-openstack-edpm-ipam-sfgd4\" (UID: \"824d1a68-929d-4c25-801a-17fdf5172893\") " pod="openstack/ovn-edpm-deployment-openstack-edpm-ipam-sfgd4"
Jan 28 21:13:46 crc kubenswrapper[4746]: I0128 21:13:46.157295 4746 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/824d1a68-929d-4c25-801a-17fdf5172893-ssh-key-openstack-edpm-ipam\") pod \"ovn-edpm-deployment-openstack-edpm-ipam-sfgd4\" (UID: \"824d1a68-929d-4c25-801a-17fdf5172893\") " pod="openstack/ovn-edpm-deployment-openstack-edpm-ipam-sfgd4"
Jan 28 21:13:46 crc kubenswrapper[4746]: I0128 21:13:46.157332 4746 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/824d1a68-929d-4c25-801a-17fdf5172893-inventory\") pod \"ovn-edpm-deployment-openstack-edpm-ipam-sfgd4\" (UID: \"824d1a68-929d-4c25-801a-17fdf5172893\") " pod="openstack/ovn-edpm-deployment-openstack-edpm-ipam-sfgd4"
Jan 28 21:13:46 crc kubenswrapper[4746]: I0128 21:13:46.158606 4746 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovncontroller-config-0\" (UniqueName: \"kubernetes.io/configmap/824d1a68-929d-4c25-801a-17fdf5172893-ovncontroller-config-0\") pod \"ovn-edpm-deployment-openstack-edpm-ipam-sfgd4\" (UID: \"824d1a68-929d-4c25-801a-17fdf5172893\") " pod="openstack/ovn-edpm-deployment-openstack-edpm-ipam-sfgd4"
Jan 28 21:13:46 crc kubenswrapper[4746]: I0128 21:13:46.161698 4746 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/824d1a68-929d-4c25-801a-17fdf5172893-ssh-key-openstack-edpm-ipam\") pod \"ovn-edpm-deployment-openstack-edpm-ipam-sfgd4\" (UID: \"824d1a68-929d-4c25-801a-17fdf5172893\") " pod="openstack/ovn-edpm-deployment-openstack-edpm-ipam-sfgd4"
Jan 28 21:13:46 crc kubenswrapper[4746]: I0128 21:13:46.163643 4746 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/824d1a68-929d-4c25-801a-17fdf5172893-inventory\") pod \"ovn-edpm-deployment-openstack-edpm-ipam-sfgd4\" (UID: \"824d1a68-929d-4c25-801a-17fdf5172893\") " pod="openstack/ovn-edpm-deployment-openstack-edpm-ipam-sfgd4"
Jan 28 21:13:46 crc kubenswrapper[4746]: I0128 21:13:46.174570 4746 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovn-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/824d1a68-929d-4c25-801a-17fdf5172893-ovn-combined-ca-bundle\") pod \"ovn-edpm-deployment-openstack-edpm-ipam-sfgd4\" (UID: \"824d1a68-929d-4c25-801a-17fdf5172893\") " pod="openstack/ovn-edpm-deployment-openstack-edpm-ipam-sfgd4"
Jan 28 21:13:46 crc kubenswrapper[4746]: I0128 21:13:46.176340 4746 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-wv8l6\" (UniqueName: \"kubernetes.io/projected/824d1a68-929d-4c25-801a-17fdf5172893-kube-api-access-wv8l6\") pod \"ovn-edpm-deployment-openstack-edpm-ipam-sfgd4\" (UID: \"824d1a68-929d-4c25-801a-17fdf5172893\") " pod="openstack/ovn-edpm-deployment-openstack-edpm-ipam-sfgd4"
Jan 28 21:13:46 crc kubenswrapper[4746]: I0128 21:13:46.224340 4746 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ovn-edpm-deployment-openstack-edpm-ipam-sfgd4"
Jan 28 21:13:46 crc kubenswrapper[4746]: I0128 21:13:46.797701 4746 generic.go:334] "Generic (PLEG): container finished" podID="bb21e5e7-4689-4d83-8b78-377e04af38a2" containerID="ced169a2215bba611f3fb09c606519b1d6c7463f52bede6e34fbe2701a0ef540" exitCode=0
Jan 28 21:13:46 crc kubenswrapper[4746]: I0128 21:13:46.797996 4746 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-k9wzl" event={"ID":"bb21e5e7-4689-4d83-8b78-377e04af38a2","Type":"ContainerDied","Data":"ced169a2215bba611f3fb09c606519b1d6c7463f52bede6e34fbe2701a0ef540"}
Jan 28 21:13:46 crc kubenswrapper[4746]: I0128 21:13:46.878252 4746 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovn-edpm-deployment-openstack-edpm-ipam-sfgd4"]
Jan 28 21:13:47 crc kubenswrapper[4746]: I0128 21:13:47.808827 4746 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-edpm-deployment-openstack-edpm-ipam-sfgd4" event={"ID":"824d1a68-929d-4c25-801a-17fdf5172893","Type":"ContainerStarted","Data":"a67a261b4db917976ca72cb69c0812ccd86e92077004f8141545d7ca6dbc716c"}
Jan 28 21:13:47 crc kubenswrapper[4746]: I0128 21:13:47.809448 4746 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-edpm-deployment-openstack-edpm-ipam-sfgd4" event={"ID":"824d1a68-929d-4c25-801a-17fdf5172893","Type":"ContainerStarted","Data":"0b320b76ed6ed5b4d63da9667c1a8f8d9e01997302a9a8f315571fdd0b1d867f"}
Jan 28 21:13:47 crc kubenswrapper[4746]: I0128 21:13:47.811188 4746 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-k9wzl" event={"ID":"bb21e5e7-4689-4d83-8b78-377e04af38a2","Type":"ContainerStarted","Data":"3fd6c462484b1978bd6a856a3d73bb3caec6c0393490ce95c841dc617fc89859"}
Jan 28 21:13:47 crc kubenswrapper[4746]: I0128 21:13:47.838195 4746 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/ovn-edpm-deployment-openstack-edpm-ipam-sfgd4" podStartSLOduration=2.423666559 podStartE2EDuration="2.83817954s" podCreationTimestamp="2026-01-28 21:13:45 +0000 UTC" firstStartedPulling="2026-01-28 21:13:46.867914536 +0000 UTC m=+2054.824100890" lastFinishedPulling="2026-01-28 21:13:47.282427517 +0000 UTC m=+2055.238613871" observedRunningTime="2026-01-28 21:13:47.827282487 +0000 UTC m=+2055.783468851" watchObservedRunningTime="2026-01-28 21:13:47.83817954 +0000 UTC m=+2055.794365894"
Jan 28 21:13:47 crc kubenswrapper[4746]: I0128 21:13:47.848859 4746 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-operators-k9wzl" podStartSLOduration=3.367293976 podStartE2EDuration="11.848843318s" podCreationTimestamp="2026-01-28 21:13:36 +0000 UTC" firstStartedPulling="2026-01-28 21:13:38.937387941 +0000 UTC m=+2046.893574305" lastFinishedPulling="2026-01-28 21:13:47.418937293 +0000 UTC m=+2055.375123647" observedRunningTime="2026-01-28 21:13:47.8470947 +0000 UTC m=+2055.803281064" watchObservedRunningTime="2026-01-28 21:13:47.848843318 +0000 UTC m=+2055.805029672"
Jan 28 21:13:56 crc kubenswrapper[4746]: I0128 21:13:56.049043 4746 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/cloudkitty-db-sync-hz5db"]
Jan 28 21:13:56 crc kubenswrapper[4746]: I0128 21:13:56.059267 4746 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/cloudkitty-db-sync-hz5db"]
Jan 28 21:13:56 crc kubenswrapper[4746]: I0128 21:13:56.847383 4746 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="fb91f276-e145-42e8-a53a-72b1f8311302" path="/var/lib/kubelet/pods/fb91f276-e145-42e8-a53a-72b1f8311302/volumes"
Jan 28 21:13:57 crc kubenswrapper[4746]: I0128 21:13:57.211293 4746 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-operators-k9wzl"
Jan 28 21:13:57 crc kubenswrapper[4746]: I0128 21:13:57.211612 4746 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-operators-k9wzl"
Jan 28 21:13:58 crc kubenswrapper[4746]: I0128 21:13:58.268387 4746 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-marketplace/redhat-operators-k9wzl" podUID="bb21e5e7-4689-4d83-8b78-377e04af38a2" containerName="registry-server" probeResult="failure" output=<
Jan 28 21:13:58 crc kubenswrapper[4746]: timeout: failed to connect service ":50051" within 1s
Jan 28 21:13:58 crc kubenswrapper[4746]: >
Jan 28 21:13:58 crc kubenswrapper[4746]: I0128 21:13:58.859146 4746 scope.go:117] "RemoveContainer" containerID="b6d0f5ff00d9ed0151d4228f2de581e21e8c62bcc21f24bacd68d86bc5d26826"
Jan 28 21:14:02 crc kubenswrapper[4746]: I0128 21:14:02.043990 4746 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/cloudkitty-storageinit-9czms"]
Jan 28 21:14:02 crc kubenswrapper[4746]: I0128 21:14:02.060994 4746 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/cloudkitty-storageinit-9czms"]
Jan 28 21:14:02 crc kubenswrapper[4746]: I0128 21:14:02.854236 4746 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="e31300a2-ace0-435c-8eb9-383dc7a6120b" path="/var/lib/kubelet/pods/e31300a2-ace0-435c-8eb9-383dc7a6120b/volumes"
Jan 28 21:14:07 crc kubenswrapper[4746]: I0128 21:14:07.257854 4746 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-operators-k9wzl"
Jan 28 21:14:07 crc kubenswrapper[4746]: I0128 21:14:07.321044 4746 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-operators-k9wzl"
Jan 28 21:14:08 crc kubenswrapper[4746]: I0128 21:14:08.047600 4746 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-k9wzl"]
Jan 28 21:14:09 crc kubenswrapper[4746]: I0128 21:14:09.058602 4746 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-operators-k9wzl" podUID="bb21e5e7-4689-4d83-8b78-377e04af38a2" containerName="registry-server" containerID="cri-o://3fd6c462484b1978bd6a856a3d73bb3caec6c0393490ce95c841dc617fc89859" gracePeriod=2
Jan 28 21:14:09 crc kubenswrapper[4746]: I0128 21:14:09.742297 4746 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-k9wzl"
Jan 28 21:14:09 crc kubenswrapper[4746]: I0128 21:14:09.841831 4746 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/bb21e5e7-4689-4d83-8b78-377e04af38a2-catalog-content\") pod \"bb21e5e7-4689-4d83-8b78-377e04af38a2\" (UID: \"bb21e5e7-4689-4d83-8b78-377e04af38a2\") "
Jan 28 21:14:09 crc kubenswrapper[4746]: I0128 21:14:09.841892 4746 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/bb21e5e7-4689-4d83-8b78-377e04af38a2-utilities\") pod \"bb21e5e7-4689-4d83-8b78-377e04af38a2\" (UID: \"bb21e5e7-4689-4d83-8b78-377e04af38a2\") "
Jan 28 21:14:09 crc kubenswrapper[4746]: I0128 21:14:09.842130 4746 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-5kfwf\" (UniqueName: \"kubernetes.io/projected/bb21e5e7-4689-4d83-8b78-377e04af38a2-kube-api-access-5kfwf\") pod \"bb21e5e7-4689-4d83-8b78-377e04af38a2\" (UID: \"bb21e5e7-4689-4d83-8b78-377e04af38a2\") "
Jan 28 21:14:09 crc kubenswrapper[4746]: I0128 21:14:09.842831 4746 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/bb21e5e7-4689-4d83-8b78-377e04af38a2-utilities" (OuterVolumeSpecName: "utilities") pod "bb21e5e7-4689-4d83-8b78-377e04af38a2" (UID: "bb21e5e7-4689-4d83-8b78-377e04af38a2"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Jan 28 21:14:09 crc kubenswrapper[4746]: I0128 21:14:09.843514 4746 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/bb21e5e7-4689-4d83-8b78-377e04af38a2-utilities\") on node \"crc\" DevicePath \"\""
Jan 28 21:14:09 crc kubenswrapper[4746]: I0128 21:14:09.847272 4746 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/bb21e5e7-4689-4d83-8b78-377e04af38a2-kube-api-access-5kfwf" (OuterVolumeSpecName: "kube-api-access-5kfwf") pod "bb21e5e7-4689-4d83-8b78-377e04af38a2" (UID: "bb21e5e7-4689-4d83-8b78-377e04af38a2"). InnerVolumeSpecName "kube-api-access-5kfwf". PluginName "kubernetes.io/projected", VolumeGidValue ""
Jan 28 21:14:09 crc kubenswrapper[4746]: I0128 21:14:09.946157 4746 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-5kfwf\" (UniqueName: \"kubernetes.io/projected/bb21e5e7-4689-4d83-8b78-377e04af38a2-kube-api-access-5kfwf\") on node \"crc\" DevicePath \"\""
Jan 28 21:14:09 crc kubenswrapper[4746]: I0128 21:14:09.962180 4746 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/bb21e5e7-4689-4d83-8b78-377e04af38a2-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "bb21e5e7-4689-4d83-8b78-377e04af38a2" (UID: "bb21e5e7-4689-4d83-8b78-377e04af38a2"). InnerVolumeSpecName "catalog-content".
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 28 21:14:10 crc kubenswrapper[4746]: I0128 21:14:10.049362 4746 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/bb21e5e7-4689-4d83-8b78-377e04af38a2-catalog-content\") on node \"crc\" DevicePath \"\"" Jan 28 21:14:10 crc kubenswrapper[4746]: I0128 21:14:10.070693 4746 generic.go:334] "Generic (PLEG): container finished" podID="bb21e5e7-4689-4d83-8b78-377e04af38a2" containerID="3fd6c462484b1978bd6a856a3d73bb3caec6c0393490ce95c841dc617fc89859" exitCode=0 Jan 28 21:14:10 crc kubenswrapper[4746]: I0128 21:14:10.070735 4746 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-k9wzl" event={"ID":"bb21e5e7-4689-4d83-8b78-377e04af38a2","Type":"ContainerDied","Data":"3fd6c462484b1978bd6a856a3d73bb3caec6c0393490ce95c841dc617fc89859"} Jan 28 21:14:10 crc kubenswrapper[4746]: I0128 21:14:10.070760 4746 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-k9wzl" event={"ID":"bb21e5e7-4689-4d83-8b78-377e04af38a2","Type":"ContainerDied","Data":"287b26aecfc44e909ee315440e1f15661b0abd5c280d27086443918e4fe0123a"} Jan 28 21:14:10 crc kubenswrapper[4746]: I0128 21:14:10.070775 4746 scope.go:117] "RemoveContainer" containerID="3fd6c462484b1978bd6a856a3d73bb3caec6c0393490ce95c841dc617fc89859" Jan 28 21:14:10 crc kubenswrapper[4746]: I0128 21:14:10.070898 4746 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-k9wzl" Jan 28 21:14:10 crc kubenswrapper[4746]: I0128 21:14:10.105243 4746 scope.go:117] "RemoveContainer" containerID="ced169a2215bba611f3fb09c606519b1d6c7463f52bede6e34fbe2701a0ef540" Jan 28 21:14:10 crc kubenswrapper[4746]: I0128 21:14:10.117965 4746 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-k9wzl"] Jan 28 21:14:10 crc kubenswrapper[4746]: I0128 21:14:10.129369 4746 scope.go:117] "RemoveContainer" containerID="6c2d7dbdf815772e6e64d832e421bf6c2bf135f3308aeb7b8954b62d063c8773" Jan 28 21:14:10 crc kubenswrapper[4746]: I0128 21:14:10.135846 4746 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-operators-k9wzl"] Jan 28 21:14:10 crc kubenswrapper[4746]: I0128 21:14:10.196914 4746 scope.go:117] "RemoveContainer" containerID="3fd6c462484b1978bd6a856a3d73bb3caec6c0393490ce95c841dc617fc89859" Jan 28 21:14:10 crc kubenswrapper[4746]: E0128 21:14:10.197412 4746 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"3fd6c462484b1978bd6a856a3d73bb3caec6c0393490ce95c841dc617fc89859\": container with ID starting with 3fd6c462484b1978bd6a856a3d73bb3caec6c0393490ce95c841dc617fc89859 not found: ID does not exist" containerID="3fd6c462484b1978bd6a856a3d73bb3caec6c0393490ce95c841dc617fc89859" Jan 28 21:14:10 crc kubenswrapper[4746]: I0128 21:14:10.197456 4746 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"3fd6c462484b1978bd6a856a3d73bb3caec6c0393490ce95c841dc617fc89859"} err="failed to get container status \"3fd6c462484b1978bd6a856a3d73bb3caec6c0393490ce95c841dc617fc89859\": rpc error: code = NotFound desc = could not find container \"3fd6c462484b1978bd6a856a3d73bb3caec6c0393490ce95c841dc617fc89859\": container with ID starting with 3fd6c462484b1978bd6a856a3d73bb3caec6c0393490ce95c841dc617fc89859 not found: ID does 
not exist" Jan 28 21:14:10 crc kubenswrapper[4746]: I0128 21:14:10.197483 4746 scope.go:117] "RemoveContainer" containerID="ced169a2215bba611f3fb09c606519b1d6c7463f52bede6e34fbe2701a0ef540" Jan 28 21:14:10 crc kubenswrapper[4746]: E0128 21:14:10.197851 4746 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"ced169a2215bba611f3fb09c606519b1d6c7463f52bede6e34fbe2701a0ef540\": container with ID starting with ced169a2215bba611f3fb09c606519b1d6c7463f52bede6e34fbe2701a0ef540 not found: ID does not exist" containerID="ced169a2215bba611f3fb09c606519b1d6c7463f52bede6e34fbe2701a0ef540" Jan 28 21:14:10 crc kubenswrapper[4746]: I0128 21:14:10.197884 4746 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"ced169a2215bba611f3fb09c606519b1d6c7463f52bede6e34fbe2701a0ef540"} err="failed to get container status \"ced169a2215bba611f3fb09c606519b1d6c7463f52bede6e34fbe2701a0ef540\": rpc error: code = NotFound desc = could not find container \"ced169a2215bba611f3fb09c606519b1d6c7463f52bede6e34fbe2701a0ef540\": container with ID starting with ced169a2215bba611f3fb09c606519b1d6c7463f52bede6e34fbe2701a0ef540 not found: ID does not exist" Jan 28 21:14:10 crc kubenswrapper[4746]: I0128 21:14:10.197908 4746 scope.go:117] "RemoveContainer" containerID="6c2d7dbdf815772e6e64d832e421bf6c2bf135f3308aeb7b8954b62d063c8773" Jan 28 21:14:10 crc kubenswrapper[4746]: E0128 21:14:10.198176 4746 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"6c2d7dbdf815772e6e64d832e421bf6c2bf135f3308aeb7b8954b62d063c8773\": container with ID starting with 6c2d7dbdf815772e6e64d832e421bf6c2bf135f3308aeb7b8954b62d063c8773 not found: ID does not exist" containerID="6c2d7dbdf815772e6e64d832e421bf6c2bf135f3308aeb7b8954b62d063c8773" Jan 28 21:14:10 crc kubenswrapper[4746]: I0128 21:14:10.198232 4746 pod_container_deletor.go:53] 
"DeleteContainer returned error" containerID={"Type":"cri-o","ID":"6c2d7dbdf815772e6e64d832e421bf6c2bf135f3308aeb7b8954b62d063c8773"} err="failed to get container status \"6c2d7dbdf815772e6e64d832e421bf6c2bf135f3308aeb7b8954b62d063c8773\": rpc error: code = NotFound desc = could not find container \"6c2d7dbdf815772e6e64d832e421bf6c2bf135f3308aeb7b8954b62d063c8773\": container with ID starting with 6c2d7dbdf815772e6e64d832e421bf6c2bf135f3308aeb7b8954b62d063c8773 not found: ID does not exist" Jan 28 21:14:10 crc kubenswrapper[4746]: I0128 21:14:10.851022 4746 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="bb21e5e7-4689-4d83-8b78-377e04af38a2" path="/var/lib/kubelet/pods/bb21e5e7-4689-4d83-8b78-377e04af38a2/volumes" Jan 28 21:14:46 crc kubenswrapper[4746]: I0128 21:14:46.405832 4746 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-edpm-deployment-openstack-edpm-ipam-sfgd4" event={"ID":"824d1a68-929d-4c25-801a-17fdf5172893","Type":"ContainerDied","Data":"a67a261b4db917976ca72cb69c0812ccd86e92077004f8141545d7ca6dbc716c"} Jan 28 21:14:46 crc kubenswrapper[4746]: I0128 21:14:46.405827 4746 generic.go:334] "Generic (PLEG): container finished" podID="824d1a68-929d-4c25-801a-17fdf5172893" containerID="a67a261b4db917976ca72cb69c0812ccd86e92077004f8141545d7ca6dbc716c" exitCode=0 Jan 28 21:14:47 crc kubenswrapper[4746]: I0128 21:14:47.939203 4746 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/ovn-edpm-deployment-openstack-edpm-ipam-sfgd4" Jan 28 21:14:47 crc kubenswrapper[4746]: I0128 21:14:47.977409 4746 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovncontroller-config-0\" (UniqueName: \"kubernetes.io/configmap/824d1a68-929d-4c25-801a-17fdf5172893-ovncontroller-config-0\") pod \"824d1a68-929d-4c25-801a-17fdf5172893\" (UID: \"824d1a68-929d-4c25-801a-17fdf5172893\") " Jan 28 21:14:47 crc kubenswrapper[4746]: I0128 21:14:47.977473 4746 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovn-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/824d1a68-929d-4c25-801a-17fdf5172893-ovn-combined-ca-bundle\") pod \"824d1a68-929d-4c25-801a-17fdf5172893\" (UID: \"824d1a68-929d-4c25-801a-17fdf5172893\") " Jan 28 21:14:47 crc kubenswrapper[4746]: I0128 21:14:47.977518 4746 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/824d1a68-929d-4c25-801a-17fdf5172893-ssh-key-openstack-edpm-ipam\") pod \"824d1a68-929d-4c25-801a-17fdf5172893\" (UID: \"824d1a68-929d-4c25-801a-17fdf5172893\") " Jan 28 21:14:47 crc kubenswrapper[4746]: I0128 21:14:47.977561 4746 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/824d1a68-929d-4c25-801a-17fdf5172893-inventory\") pod \"824d1a68-929d-4c25-801a-17fdf5172893\" (UID: \"824d1a68-929d-4c25-801a-17fdf5172893\") " Jan 28 21:14:47 crc kubenswrapper[4746]: I0128 21:14:47.977696 4746 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-wv8l6\" (UniqueName: \"kubernetes.io/projected/824d1a68-929d-4c25-801a-17fdf5172893-kube-api-access-wv8l6\") pod \"824d1a68-929d-4c25-801a-17fdf5172893\" (UID: \"824d1a68-929d-4c25-801a-17fdf5172893\") " Jan 28 21:14:47 crc kubenswrapper[4746]: I0128 21:14:47.990428 4746 
operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/824d1a68-929d-4c25-801a-17fdf5172893-ovn-combined-ca-bundle" (OuterVolumeSpecName: "ovn-combined-ca-bundle") pod "824d1a68-929d-4c25-801a-17fdf5172893" (UID: "824d1a68-929d-4c25-801a-17fdf5172893"). InnerVolumeSpecName "ovn-combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 28 21:14:47 crc kubenswrapper[4746]: I0128 21:14:47.990476 4746 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/824d1a68-929d-4c25-801a-17fdf5172893-kube-api-access-wv8l6" (OuterVolumeSpecName: "kube-api-access-wv8l6") pod "824d1a68-929d-4c25-801a-17fdf5172893" (UID: "824d1a68-929d-4c25-801a-17fdf5172893"). InnerVolumeSpecName "kube-api-access-wv8l6". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 28 21:14:48 crc kubenswrapper[4746]: I0128 21:14:48.007064 4746 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/824d1a68-929d-4c25-801a-17fdf5172893-ovncontroller-config-0" (OuterVolumeSpecName: "ovncontroller-config-0") pod "824d1a68-929d-4c25-801a-17fdf5172893" (UID: "824d1a68-929d-4c25-801a-17fdf5172893"). InnerVolumeSpecName "ovncontroller-config-0". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 28 21:14:48 crc kubenswrapper[4746]: I0128 21:14:48.008307 4746 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/824d1a68-929d-4c25-801a-17fdf5172893-ssh-key-openstack-edpm-ipam" (OuterVolumeSpecName: "ssh-key-openstack-edpm-ipam") pod "824d1a68-929d-4c25-801a-17fdf5172893" (UID: "824d1a68-929d-4c25-801a-17fdf5172893"). InnerVolumeSpecName "ssh-key-openstack-edpm-ipam". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 28 21:14:48 crc kubenswrapper[4746]: I0128 21:14:48.008913 4746 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/824d1a68-929d-4c25-801a-17fdf5172893-inventory" (OuterVolumeSpecName: "inventory") pod "824d1a68-929d-4c25-801a-17fdf5172893" (UID: "824d1a68-929d-4c25-801a-17fdf5172893"). InnerVolumeSpecName "inventory". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 28 21:14:48 crc kubenswrapper[4746]: I0128 21:14:48.082148 4746 reconciler_common.go:293] "Volume detached for volume \"ovncontroller-config-0\" (UniqueName: \"kubernetes.io/configmap/824d1a68-929d-4c25-801a-17fdf5172893-ovncontroller-config-0\") on node \"crc\" DevicePath \"\"" Jan 28 21:14:48 crc kubenswrapper[4746]: I0128 21:14:48.082215 4746 reconciler_common.go:293] "Volume detached for volume \"ovn-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/824d1a68-929d-4c25-801a-17fdf5172893-ovn-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Jan 28 21:14:48 crc kubenswrapper[4746]: I0128 21:14:48.082228 4746 reconciler_common.go:293] "Volume detached for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/824d1a68-929d-4c25-801a-17fdf5172893-ssh-key-openstack-edpm-ipam\") on node \"crc\" DevicePath \"\"" Jan 28 21:14:48 crc kubenswrapper[4746]: I0128 21:14:48.082291 4746 reconciler_common.go:293] "Volume detached for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/824d1a68-929d-4c25-801a-17fdf5172893-inventory\") on node \"crc\" DevicePath \"\"" Jan 28 21:14:48 crc kubenswrapper[4746]: I0128 21:14:48.082306 4746 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-wv8l6\" (UniqueName: \"kubernetes.io/projected/824d1a68-929d-4c25-801a-17fdf5172893-kube-api-access-wv8l6\") on node \"crc\" DevicePath \"\"" Jan 28 21:14:48 crc kubenswrapper[4746]: I0128 21:14:48.429866 4746 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openstack/ovn-edpm-deployment-openstack-edpm-ipam-sfgd4" event={"ID":"824d1a68-929d-4c25-801a-17fdf5172893","Type":"ContainerDied","Data":"0b320b76ed6ed5b4d63da9667c1a8f8d9e01997302a9a8f315571fdd0b1d867f"} Jan 28 21:14:48 crc kubenswrapper[4746]: I0128 21:14:48.429906 4746 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="0b320b76ed6ed5b4d63da9667c1a8f8d9e01997302a9a8f315571fdd0b1d867f" Jan 28 21:14:48 crc kubenswrapper[4746]: I0128 21:14:48.429949 4746 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/ovn-edpm-deployment-openstack-edpm-ipam-sfgd4" Jan 28 21:14:48 crc kubenswrapper[4746]: I0128 21:14:48.563025 4746 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-mnncp"] Jan 28 21:14:48 crc kubenswrapper[4746]: E0128 21:14:48.563892 4746 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="bb21e5e7-4689-4d83-8b78-377e04af38a2" containerName="registry-server" Jan 28 21:14:48 crc kubenswrapper[4746]: I0128 21:14:48.563925 4746 state_mem.go:107] "Deleted CPUSet assignment" podUID="bb21e5e7-4689-4d83-8b78-377e04af38a2" containerName="registry-server" Jan 28 21:14:48 crc kubenswrapper[4746]: E0128 21:14:48.563987 4746 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="bb21e5e7-4689-4d83-8b78-377e04af38a2" containerName="extract-content" Jan 28 21:14:48 crc kubenswrapper[4746]: I0128 21:14:48.564008 4746 state_mem.go:107] "Deleted CPUSet assignment" podUID="bb21e5e7-4689-4d83-8b78-377e04af38a2" containerName="extract-content" Jan 28 21:14:48 crc kubenswrapper[4746]: E0128 21:14:48.564038 4746 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="bb21e5e7-4689-4d83-8b78-377e04af38a2" containerName="extract-utilities" Jan 28 21:14:48 crc kubenswrapper[4746]: I0128 21:14:48.564047 4746 state_mem.go:107] "Deleted CPUSet assignment" podUID="bb21e5e7-4689-4d83-8b78-377e04af38a2" 
containerName="extract-utilities" Jan 28 21:14:48 crc kubenswrapper[4746]: E0128 21:14:48.564060 4746 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="824d1a68-929d-4c25-801a-17fdf5172893" containerName="ovn-edpm-deployment-openstack-edpm-ipam" Jan 28 21:14:48 crc kubenswrapper[4746]: I0128 21:14:48.564068 4746 state_mem.go:107] "Deleted CPUSet assignment" podUID="824d1a68-929d-4c25-801a-17fdf5172893" containerName="ovn-edpm-deployment-openstack-edpm-ipam" Jan 28 21:14:48 crc kubenswrapper[4746]: I0128 21:14:48.564575 4746 memory_manager.go:354] "RemoveStaleState removing state" podUID="bb21e5e7-4689-4d83-8b78-377e04af38a2" containerName="registry-server" Jan 28 21:14:48 crc kubenswrapper[4746]: I0128 21:14:48.564648 4746 memory_manager.go:354] "RemoveStaleState removing state" podUID="824d1a68-929d-4c25-801a-17fdf5172893" containerName="ovn-edpm-deployment-openstack-edpm-ipam" Jan 28 21:14:48 crc kubenswrapper[4746]: I0128 21:14:48.566172 4746 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-mnncp" Jan 28 21:14:48 crc kubenswrapper[4746]: I0128 21:14:48.576495 4746 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-mnncp"] Jan 28 21:14:48 crc kubenswrapper[4746]: I0128 21:14:48.578637 4746 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplane-ansible-ssh-private-key-secret" Jan 28 21:14:48 crc kubenswrapper[4746]: I0128 21:14:48.578660 4746 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-edpm-ipam-dockercfg-xb8vv" Jan 28 21:14:48 crc kubenswrapper[4746]: I0128 21:14:48.578891 4746 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-aee-default-env" Jan 28 21:14:48 crc kubenswrapper[4746]: I0128 21:14:48.579009 4746 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"neutron-ovn-metadata-agent-neutron-config" Jan 28 21:14:48 crc kubenswrapper[4746]: I0128 21:14:48.579296 4746 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplanenodeset-openstack-edpm-ipam" Jan 28 21:14:48 crc kubenswrapper[4746]: I0128 21:14:48.597640 4746 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"nova-metadata-neutron-config-0\" (UniqueName: \"kubernetes.io/secret/92c386a4-a812-4e5f-938a-611be2d329ff-nova-metadata-neutron-config-0\") pod \"neutron-metadata-edpm-deployment-openstack-edpm-ipam-mnncp\" (UID: \"92c386a4-a812-4e5f-938a-611be2d329ff\") " pod="openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-mnncp" Jan 28 21:14:48 crc kubenswrapper[4746]: I0128 21:14:48.597983 4746 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/92c386a4-a812-4e5f-938a-611be2d329ff-ssh-key-openstack-edpm-ipam\") pod 
\"neutron-metadata-edpm-deployment-openstack-edpm-ipam-mnncp\" (UID: \"92c386a4-a812-4e5f-938a-611be2d329ff\") " pod="openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-mnncp" Jan 28 21:14:48 crc kubenswrapper[4746]: I0128 21:14:48.598175 4746 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"neutron-ovn-metadata-agent-neutron-config-0\" (UniqueName: \"kubernetes.io/secret/92c386a4-a812-4e5f-938a-611be2d329ff-neutron-ovn-metadata-agent-neutron-config-0\") pod \"neutron-metadata-edpm-deployment-openstack-edpm-ipam-mnncp\" (UID: \"92c386a4-a812-4e5f-938a-611be2d329ff\") " pod="openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-mnncp" Jan 28 21:14:48 crc kubenswrapper[4746]: I0128 21:14:48.598253 4746 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-g6vcf\" (UniqueName: \"kubernetes.io/projected/92c386a4-a812-4e5f-938a-611be2d329ff-kube-api-access-g6vcf\") pod \"neutron-metadata-edpm-deployment-openstack-edpm-ipam-mnncp\" (UID: \"92c386a4-a812-4e5f-938a-611be2d329ff\") " pod="openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-mnncp" Jan 28 21:14:48 crc kubenswrapper[4746]: I0128 21:14:48.598403 4746 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/92c386a4-a812-4e5f-938a-611be2d329ff-inventory\") pod \"neutron-metadata-edpm-deployment-openstack-edpm-ipam-mnncp\" (UID: \"92c386a4-a812-4e5f-938a-611be2d329ff\") " pod="openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-mnncp" Jan 28 21:14:48 crc kubenswrapper[4746]: I0128 21:14:48.598453 4746 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"neutron-metadata-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/92c386a4-a812-4e5f-938a-611be2d329ff-neutron-metadata-combined-ca-bundle\") pod 
\"neutron-metadata-edpm-deployment-openstack-edpm-ipam-mnncp\" (UID: \"92c386a4-a812-4e5f-938a-611be2d329ff\") " pod="openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-mnncp" Jan 28 21:14:48 crc kubenswrapper[4746]: I0128 21:14:48.607505 4746 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-metadata-neutron-config" Jan 28 21:14:48 crc kubenswrapper[4746]: I0128 21:14:48.702035 4746 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/92c386a4-a812-4e5f-938a-611be2d329ff-ssh-key-openstack-edpm-ipam\") pod \"neutron-metadata-edpm-deployment-openstack-edpm-ipam-mnncp\" (UID: \"92c386a4-a812-4e5f-938a-611be2d329ff\") " pod="openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-mnncp" Jan 28 21:14:48 crc kubenswrapper[4746]: I0128 21:14:48.702493 4746 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"neutron-ovn-metadata-agent-neutron-config-0\" (UniqueName: \"kubernetes.io/secret/92c386a4-a812-4e5f-938a-611be2d329ff-neutron-ovn-metadata-agent-neutron-config-0\") pod \"neutron-metadata-edpm-deployment-openstack-edpm-ipam-mnncp\" (UID: \"92c386a4-a812-4e5f-938a-611be2d329ff\") " pod="openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-mnncp" Jan 28 21:14:48 crc kubenswrapper[4746]: I0128 21:14:48.702563 4746 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-g6vcf\" (UniqueName: \"kubernetes.io/projected/92c386a4-a812-4e5f-938a-611be2d329ff-kube-api-access-g6vcf\") pod \"neutron-metadata-edpm-deployment-openstack-edpm-ipam-mnncp\" (UID: \"92c386a4-a812-4e5f-938a-611be2d329ff\") " pod="openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-mnncp" Jan 28 21:14:48 crc kubenswrapper[4746]: I0128 21:14:48.702680 4746 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"inventory\" (UniqueName: 
\"kubernetes.io/secret/92c386a4-a812-4e5f-938a-611be2d329ff-inventory\") pod \"neutron-metadata-edpm-deployment-openstack-edpm-ipam-mnncp\" (UID: \"92c386a4-a812-4e5f-938a-611be2d329ff\") " pod="openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-mnncp" Jan 28 21:14:48 crc kubenswrapper[4746]: I0128 21:14:48.702722 4746 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"neutron-metadata-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/92c386a4-a812-4e5f-938a-611be2d329ff-neutron-metadata-combined-ca-bundle\") pod \"neutron-metadata-edpm-deployment-openstack-edpm-ipam-mnncp\" (UID: \"92c386a4-a812-4e5f-938a-611be2d329ff\") " pod="openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-mnncp" Jan 28 21:14:48 crc kubenswrapper[4746]: I0128 21:14:48.702859 4746 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nova-metadata-neutron-config-0\" (UniqueName: \"kubernetes.io/secret/92c386a4-a812-4e5f-938a-611be2d329ff-nova-metadata-neutron-config-0\") pod \"neutron-metadata-edpm-deployment-openstack-edpm-ipam-mnncp\" (UID: \"92c386a4-a812-4e5f-938a-611be2d329ff\") " pod="openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-mnncp" Jan 28 21:14:48 crc kubenswrapper[4746]: I0128 21:14:48.735177 4746 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"neutron-metadata-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/92c386a4-a812-4e5f-938a-611be2d329ff-neutron-metadata-combined-ca-bundle\") pod \"neutron-metadata-edpm-deployment-openstack-edpm-ipam-mnncp\" (UID: \"92c386a4-a812-4e5f-938a-611be2d329ff\") " pod="openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-mnncp" Jan 28 21:14:48 crc kubenswrapper[4746]: I0128 21:14:48.735588 4746 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/92c386a4-a812-4e5f-938a-611be2d329ff-ssh-key-openstack-edpm-ipam\") pod 
\"neutron-metadata-edpm-deployment-openstack-edpm-ipam-mnncp\" (UID: \"92c386a4-a812-4e5f-938a-611be2d329ff\") " pod="openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-mnncp" Jan 28 21:14:48 crc kubenswrapper[4746]: I0128 21:14:48.744617 4746 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"nova-metadata-neutron-config-0\" (UniqueName: \"kubernetes.io/secret/92c386a4-a812-4e5f-938a-611be2d329ff-nova-metadata-neutron-config-0\") pod \"neutron-metadata-edpm-deployment-openstack-edpm-ipam-mnncp\" (UID: \"92c386a4-a812-4e5f-938a-611be2d329ff\") " pod="openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-mnncp" Jan 28 21:14:48 crc kubenswrapper[4746]: I0128 21:14:48.745279 4746 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/92c386a4-a812-4e5f-938a-611be2d329ff-inventory\") pod \"neutron-metadata-edpm-deployment-openstack-edpm-ipam-mnncp\" (UID: \"92c386a4-a812-4e5f-938a-611be2d329ff\") " pod="openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-mnncp" Jan 28 21:14:48 crc kubenswrapper[4746]: I0128 21:14:48.745970 4746 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-g6vcf\" (UniqueName: \"kubernetes.io/projected/92c386a4-a812-4e5f-938a-611be2d329ff-kube-api-access-g6vcf\") pod \"neutron-metadata-edpm-deployment-openstack-edpm-ipam-mnncp\" (UID: \"92c386a4-a812-4e5f-938a-611be2d329ff\") " pod="openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-mnncp" Jan 28 21:14:48 crc kubenswrapper[4746]: I0128 21:14:48.750893 4746 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"neutron-ovn-metadata-agent-neutron-config-0\" (UniqueName: \"kubernetes.io/secret/92c386a4-a812-4e5f-938a-611be2d329ff-neutron-ovn-metadata-agent-neutron-config-0\") pod \"neutron-metadata-edpm-deployment-openstack-edpm-ipam-mnncp\" (UID: \"92c386a4-a812-4e5f-938a-611be2d329ff\") " 
pod="openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-mnncp" Jan 28 21:14:48 crc kubenswrapper[4746]: E0128 21:14:48.767648 4746 cadvisor_stats_provider.go:516] "Partial failure issuing cadvisor.ContainerInfoV2" err="partial failures: [\"/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod824d1a68_929d_4c25_801a_17fdf5172893.slice/crio-0b320b76ed6ed5b4d63da9667c1a8f8d9e01997302a9a8f315571fdd0b1d867f\": RecentStats: unable to find data in memory cache], [\"/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod824d1a68_929d_4c25_801a_17fdf5172893.slice\": RecentStats: unable to find data in memory cache]" Jan 28 21:14:48 crc kubenswrapper[4746]: I0128 21:14:48.916832 4746 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-mnncp" Jan 28 21:14:49 crc kubenswrapper[4746]: I0128 21:14:49.514812 4746 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-mnncp"] Jan 28 21:14:49 crc kubenswrapper[4746]: W0128 21:14:49.522041 4746 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod92c386a4_a812_4e5f_938a_611be2d329ff.slice/crio-fa9c1ebdabfdb9b81dc5a0a7a114e03278910db912afbb42b7e1b1b68b5303d2 WatchSource:0}: Error finding container fa9c1ebdabfdb9b81dc5a0a7a114e03278910db912afbb42b7e1b1b68b5303d2: Status 404 returned error can't find the container with id fa9c1ebdabfdb9b81dc5a0a7a114e03278910db912afbb42b7e1b1b68b5303d2 Jan 28 21:14:50 crc kubenswrapper[4746]: I0128 21:14:50.448216 4746 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-mnncp" event={"ID":"92c386a4-a812-4e5f-938a-611be2d329ff","Type":"ContainerStarted","Data":"9e3be591dfabee19985030ab75d6eee28b10a7ec25df092700001386738da80d"} Jan 28 21:14:50 crc kubenswrapper[4746]: I0128 
21:14:50.448576 4746 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-mnncp" event={"ID":"92c386a4-a812-4e5f-938a-611be2d329ff","Type":"ContainerStarted","Data":"fa9c1ebdabfdb9b81dc5a0a7a114e03278910db912afbb42b7e1b1b68b5303d2"} Jan 28 21:14:50 crc kubenswrapper[4746]: I0128 21:14:50.474143 4746 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-mnncp" podStartSLOduration=1.9684569710000002 podStartE2EDuration="2.474056838s" podCreationTimestamp="2026-01-28 21:14:48 +0000 UTC" firstStartedPulling="2026-01-28 21:14:49.523561396 +0000 UTC m=+2117.479747750" lastFinishedPulling="2026-01-28 21:14:50.029161253 +0000 UTC m=+2117.985347617" observedRunningTime="2026-01-28 21:14:50.46339207 +0000 UTC m=+2118.419578424" watchObservedRunningTime="2026-01-28 21:14:50.474056838 +0000 UTC m=+2118.430243192" Jan 28 21:14:58 crc kubenswrapper[4746]: I0128 21:14:58.923761 4746 scope.go:117] "RemoveContainer" containerID="e05be61de5bbf8b7d404b06b2402502a8c3f3bb3334539bff21b437b83051af4" Jan 28 21:15:00 crc kubenswrapper[4746]: I0128 21:15:00.139742 4746 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29493915-8vd5b"] Jan 28 21:15:00 crc kubenswrapper[4746]: I0128 21:15:00.141561 4746 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29493915-8vd5b" Jan 28 21:15:00 crc kubenswrapper[4746]: I0128 21:15:00.144335 4746 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-operator-lifecycle-manager"/"collect-profiles-config" Jan 28 21:15:00 crc kubenswrapper[4746]: I0128 21:15:00.145454 4746 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"collect-profiles-dockercfg-kzf4t" Jan 28 21:15:00 crc kubenswrapper[4746]: I0128 21:15:00.160650 4746 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29493915-8vd5b"] Jan 28 21:15:00 crc kubenswrapper[4746]: I0128 21:15:00.264822 4746 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/a573ebb7-d2e3-4a23-a5f7-b8947aceaece-secret-volume\") pod \"collect-profiles-29493915-8vd5b\" (UID: \"a573ebb7-d2e3-4a23-a5f7-b8947aceaece\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29493915-8vd5b" Jan 28 21:15:00 crc kubenswrapper[4746]: I0128 21:15:00.265118 4746 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-rxx4m\" (UniqueName: \"kubernetes.io/projected/a573ebb7-d2e3-4a23-a5f7-b8947aceaece-kube-api-access-rxx4m\") pod \"collect-profiles-29493915-8vd5b\" (UID: \"a573ebb7-d2e3-4a23-a5f7-b8947aceaece\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29493915-8vd5b" Jan 28 21:15:00 crc kubenswrapper[4746]: I0128 21:15:00.265217 4746 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/a573ebb7-d2e3-4a23-a5f7-b8947aceaece-config-volume\") pod \"collect-profiles-29493915-8vd5b\" (UID: \"a573ebb7-d2e3-4a23-a5f7-b8947aceaece\") " 
pod="openshift-operator-lifecycle-manager/collect-profiles-29493915-8vd5b" Jan 28 21:15:00 crc kubenswrapper[4746]: I0128 21:15:00.367623 4746 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/a573ebb7-d2e3-4a23-a5f7-b8947aceaece-secret-volume\") pod \"collect-profiles-29493915-8vd5b\" (UID: \"a573ebb7-d2e3-4a23-a5f7-b8947aceaece\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29493915-8vd5b" Jan 28 21:15:00 crc kubenswrapper[4746]: I0128 21:15:00.367718 4746 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-rxx4m\" (UniqueName: \"kubernetes.io/projected/a573ebb7-d2e3-4a23-a5f7-b8947aceaece-kube-api-access-rxx4m\") pod \"collect-profiles-29493915-8vd5b\" (UID: \"a573ebb7-d2e3-4a23-a5f7-b8947aceaece\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29493915-8vd5b" Jan 28 21:15:00 crc kubenswrapper[4746]: I0128 21:15:00.367754 4746 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/a573ebb7-d2e3-4a23-a5f7-b8947aceaece-config-volume\") pod \"collect-profiles-29493915-8vd5b\" (UID: \"a573ebb7-d2e3-4a23-a5f7-b8947aceaece\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29493915-8vd5b" Jan 28 21:15:00 crc kubenswrapper[4746]: I0128 21:15:00.368792 4746 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/a573ebb7-d2e3-4a23-a5f7-b8947aceaece-config-volume\") pod \"collect-profiles-29493915-8vd5b\" (UID: \"a573ebb7-d2e3-4a23-a5f7-b8947aceaece\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29493915-8vd5b" Jan 28 21:15:00 crc kubenswrapper[4746]: I0128 21:15:00.373530 4746 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"secret-volume\" (UniqueName: 
\"kubernetes.io/secret/a573ebb7-d2e3-4a23-a5f7-b8947aceaece-secret-volume\") pod \"collect-profiles-29493915-8vd5b\" (UID: \"a573ebb7-d2e3-4a23-a5f7-b8947aceaece\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29493915-8vd5b" Jan 28 21:15:00 crc kubenswrapper[4746]: I0128 21:15:00.386209 4746 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-rxx4m\" (UniqueName: \"kubernetes.io/projected/a573ebb7-d2e3-4a23-a5f7-b8947aceaece-kube-api-access-rxx4m\") pod \"collect-profiles-29493915-8vd5b\" (UID: \"a573ebb7-d2e3-4a23-a5f7-b8947aceaece\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29493915-8vd5b" Jan 28 21:15:00 crc kubenswrapper[4746]: I0128 21:15:00.490464 4746 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29493915-8vd5b" Jan 28 21:15:00 crc kubenswrapper[4746]: I0128 21:15:00.967307 4746 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29493915-8vd5b"] Jan 28 21:15:01 crc kubenswrapper[4746]: I0128 21:15:01.561662 4746 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29493915-8vd5b" event={"ID":"a573ebb7-d2e3-4a23-a5f7-b8947aceaece","Type":"ContainerStarted","Data":"c09b499be05938b58dc79f517d7dc01364d0cd5345036ff2b2c2c1214aec70b3"} Jan 28 21:15:01 crc kubenswrapper[4746]: I0128 21:15:01.561705 4746 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29493915-8vd5b" event={"ID":"a573ebb7-d2e3-4a23-a5f7-b8947aceaece","Type":"ContainerStarted","Data":"cad55a303c81999c0f3c462fbc9472c2db52188f089a8972be1b2ebab9408edb"} Jan 28 21:15:01 crc kubenswrapper[4746]: I0128 21:15:01.586985 4746 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-operator-lifecycle-manager/collect-profiles-29493915-8vd5b" 
podStartSLOduration=1.5869687799999999 podStartE2EDuration="1.58696878s" podCreationTimestamp="2026-01-28 21:15:00 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-28 21:15:01.583048815 +0000 UTC m=+2129.539235169" watchObservedRunningTime="2026-01-28 21:15:01.58696878 +0000 UTC m=+2129.543155134" Jan 28 21:15:02 crc kubenswrapper[4746]: I0128 21:15:02.572101 4746 generic.go:334] "Generic (PLEG): container finished" podID="a573ebb7-d2e3-4a23-a5f7-b8947aceaece" containerID="c09b499be05938b58dc79f517d7dc01364d0cd5345036ff2b2c2c1214aec70b3" exitCode=0 Jan 28 21:15:02 crc kubenswrapper[4746]: I0128 21:15:02.572196 4746 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29493915-8vd5b" event={"ID":"a573ebb7-d2e3-4a23-a5f7-b8947aceaece","Type":"ContainerDied","Data":"c09b499be05938b58dc79f517d7dc01364d0cd5345036ff2b2c2c1214aec70b3"} Jan 28 21:15:04 crc kubenswrapper[4746]: I0128 21:15:04.108300 4746 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29493915-8vd5b" Jan 28 21:15:04 crc kubenswrapper[4746]: I0128 21:15:04.269800 4746 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/a573ebb7-d2e3-4a23-a5f7-b8947aceaece-secret-volume\") pod \"a573ebb7-d2e3-4a23-a5f7-b8947aceaece\" (UID: \"a573ebb7-d2e3-4a23-a5f7-b8947aceaece\") " Jan 28 21:15:04 crc kubenswrapper[4746]: I0128 21:15:04.269916 4746 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-rxx4m\" (UniqueName: \"kubernetes.io/projected/a573ebb7-d2e3-4a23-a5f7-b8947aceaece-kube-api-access-rxx4m\") pod \"a573ebb7-d2e3-4a23-a5f7-b8947aceaece\" (UID: \"a573ebb7-d2e3-4a23-a5f7-b8947aceaece\") " Jan 28 21:15:04 crc kubenswrapper[4746]: I0128 21:15:04.269974 4746 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/a573ebb7-d2e3-4a23-a5f7-b8947aceaece-config-volume\") pod \"a573ebb7-d2e3-4a23-a5f7-b8947aceaece\" (UID: \"a573ebb7-d2e3-4a23-a5f7-b8947aceaece\") " Jan 28 21:15:04 crc kubenswrapper[4746]: I0128 21:15:04.271062 4746 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/a573ebb7-d2e3-4a23-a5f7-b8947aceaece-config-volume" (OuterVolumeSpecName: "config-volume") pod "a573ebb7-d2e3-4a23-a5f7-b8947aceaece" (UID: "a573ebb7-d2e3-4a23-a5f7-b8947aceaece"). InnerVolumeSpecName "config-volume". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 28 21:15:04 crc kubenswrapper[4746]: I0128 21:15:04.284336 4746 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/a573ebb7-d2e3-4a23-a5f7-b8947aceaece-secret-volume" (OuterVolumeSpecName: "secret-volume") pod "a573ebb7-d2e3-4a23-a5f7-b8947aceaece" (UID: "a573ebb7-d2e3-4a23-a5f7-b8947aceaece"). InnerVolumeSpecName "secret-volume". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 28 21:15:04 crc kubenswrapper[4746]: I0128 21:15:04.284593 4746 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/a573ebb7-d2e3-4a23-a5f7-b8947aceaece-kube-api-access-rxx4m" (OuterVolumeSpecName: "kube-api-access-rxx4m") pod "a573ebb7-d2e3-4a23-a5f7-b8947aceaece" (UID: "a573ebb7-d2e3-4a23-a5f7-b8947aceaece"). InnerVolumeSpecName "kube-api-access-rxx4m". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 28 21:15:04 crc kubenswrapper[4746]: I0128 21:15:04.372179 4746 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-rxx4m\" (UniqueName: \"kubernetes.io/projected/a573ebb7-d2e3-4a23-a5f7-b8947aceaece-kube-api-access-rxx4m\") on node \"crc\" DevicePath \"\"" Jan 28 21:15:04 crc kubenswrapper[4746]: I0128 21:15:04.372210 4746 reconciler_common.go:293] "Volume detached for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/a573ebb7-d2e3-4a23-a5f7-b8947aceaece-config-volume\") on node \"crc\" DevicePath \"\"" Jan 28 21:15:04 crc kubenswrapper[4746]: I0128 21:15:04.372219 4746 reconciler_common.go:293] "Volume detached for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/a573ebb7-d2e3-4a23-a5f7-b8947aceaece-secret-volume\") on node \"crc\" DevicePath \"\"" Jan 28 21:15:04 crc kubenswrapper[4746]: I0128 21:15:04.593645 4746 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29493915-8vd5b" event={"ID":"a573ebb7-d2e3-4a23-a5f7-b8947aceaece","Type":"ContainerDied","Data":"cad55a303c81999c0f3c462fbc9472c2db52188f089a8972be1b2ebab9408edb"} Jan 28 21:15:04 crc kubenswrapper[4746]: I0128 21:15:04.593707 4746 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="cad55a303c81999c0f3c462fbc9472c2db52188f089a8972be1b2ebab9408edb" Jan 28 21:15:04 crc kubenswrapper[4746]: I0128 21:15:04.593710 4746 util.go:48] "No ready sandbox for pod can be 
found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29493915-8vd5b" Jan 28 21:15:04 crc kubenswrapper[4746]: I0128 21:15:04.668606 4746 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29493870-4jnh6"] Jan 28 21:15:04 crc kubenswrapper[4746]: I0128 21:15:04.679234 4746 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29493870-4jnh6"] Jan 28 21:15:04 crc kubenswrapper[4746]: I0128 21:15:04.848859 4746 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="c15d5c5e-19d5-43c6-8955-1bb2ad8b33b6" path="/var/lib/kubelet/pods/c15d5c5e-19d5-43c6-8955-1bb2ad8b33b6/volumes" Jan 28 21:15:15 crc kubenswrapper[4746]: I0128 21:15:15.871551 4746 patch_prober.go:28] interesting pod/machine-config-daemon-6wrnw container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Jan 28 21:15:15 crc kubenswrapper[4746]: I0128 21:15:15.872292 4746 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-6wrnw" podUID="6dc8b546-9734-4082-b2b3-2bafe3f1564d" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Jan 28 21:15:34 crc kubenswrapper[4746]: I0128 21:15:34.903920 4746 generic.go:334] "Generic (PLEG): container finished" podID="92c386a4-a812-4e5f-938a-611be2d329ff" containerID="9e3be591dfabee19985030ab75d6eee28b10a7ec25df092700001386738da80d" exitCode=0 Jan 28 21:15:34 crc kubenswrapper[4746]: I0128 21:15:34.903994 4746 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-mnncp" 
event={"ID":"92c386a4-a812-4e5f-938a-611be2d329ff","Type":"ContainerDied","Data":"9e3be591dfabee19985030ab75d6eee28b10a7ec25df092700001386738da80d"} Jan 28 21:15:36 crc kubenswrapper[4746]: I0128 21:15:36.464357 4746 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-mnncp" Jan 28 21:15:36 crc kubenswrapper[4746]: I0128 21:15:36.552217 4746 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/92c386a4-a812-4e5f-938a-611be2d329ff-ssh-key-openstack-edpm-ipam\") pod \"92c386a4-a812-4e5f-938a-611be2d329ff\" (UID: \"92c386a4-a812-4e5f-938a-611be2d329ff\") " Jan 28 21:15:36 crc kubenswrapper[4746]: I0128 21:15:36.552335 4746 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"neutron-metadata-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/92c386a4-a812-4e5f-938a-611be2d329ff-neutron-metadata-combined-ca-bundle\") pod \"92c386a4-a812-4e5f-938a-611be2d329ff\" (UID: \"92c386a4-a812-4e5f-938a-611be2d329ff\") " Jan 28 21:15:36 crc kubenswrapper[4746]: I0128 21:15:36.552389 4746 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"nova-metadata-neutron-config-0\" (UniqueName: \"kubernetes.io/secret/92c386a4-a812-4e5f-938a-611be2d329ff-nova-metadata-neutron-config-0\") pod \"92c386a4-a812-4e5f-938a-611be2d329ff\" (UID: \"92c386a4-a812-4e5f-938a-611be2d329ff\") " Jan 28 21:15:36 crc kubenswrapper[4746]: I0128 21:15:36.552422 4746 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-g6vcf\" (UniqueName: \"kubernetes.io/projected/92c386a4-a812-4e5f-938a-611be2d329ff-kube-api-access-g6vcf\") pod \"92c386a4-a812-4e5f-938a-611be2d329ff\" (UID: \"92c386a4-a812-4e5f-938a-611be2d329ff\") " Jan 28 21:15:36 crc kubenswrapper[4746]: I0128 21:15:36.552462 4746 reconciler_common.go:159] 
"operationExecutor.UnmountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/92c386a4-a812-4e5f-938a-611be2d329ff-inventory\") pod \"92c386a4-a812-4e5f-938a-611be2d329ff\" (UID: \"92c386a4-a812-4e5f-938a-611be2d329ff\") " Jan 28 21:15:36 crc kubenswrapper[4746]: I0128 21:15:36.552624 4746 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"neutron-ovn-metadata-agent-neutron-config-0\" (UniqueName: \"kubernetes.io/secret/92c386a4-a812-4e5f-938a-611be2d329ff-neutron-ovn-metadata-agent-neutron-config-0\") pod \"92c386a4-a812-4e5f-938a-611be2d329ff\" (UID: \"92c386a4-a812-4e5f-938a-611be2d329ff\") " Jan 28 21:15:36 crc kubenswrapper[4746]: I0128 21:15:36.560345 4746 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/92c386a4-a812-4e5f-938a-611be2d329ff-neutron-metadata-combined-ca-bundle" (OuterVolumeSpecName: "neutron-metadata-combined-ca-bundle") pod "92c386a4-a812-4e5f-938a-611be2d329ff" (UID: "92c386a4-a812-4e5f-938a-611be2d329ff"). InnerVolumeSpecName "neutron-metadata-combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 28 21:15:36 crc kubenswrapper[4746]: I0128 21:15:36.563684 4746 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/92c386a4-a812-4e5f-938a-611be2d329ff-kube-api-access-g6vcf" (OuterVolumeSpecName: "kube-api-access-g6vcf") pod "92c386a4-a812-4e5f-938a-611be2d329ff" (UID: "92c386a4-a812-4e5f-938a-611be2d329ff"). InnerVolumeSpecName "kube-api-access-g6vcf". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 28 21:15:36 crc kubenswrapper[4746]: I0128 21:15:36.587800 4746 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/92c386a4-a812-4e5f-938a-611be2d329ff-inventory" (OuterVolumeSpecName: "inventory") pod "92c386a4-a812-4e5f-938a-611be2d329ff" (UID: "92c386a4-a812-4e5f-938a-611be2d329ff"). InnerVolumeSpecName "inventory". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 28 21:15:36 crc kubenswrapper[4746]: I0128 21:15:36.594528 4746 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/92c386a4-a812-4e5f-938a-611be2d329ff-ssh-key-openstack-edpm-ipam" (OuterVolumeSpecName: "ssh-key-openstack-edpm-ipam") pod "92c386a4-a812-4e5f-938a-611be2d329ff" (UID: "92c386a4-a812-4e5f-938a-611be2d329ff"). InnerVolumeSpecName "ssh-key-openstack-edpm-ipam". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 28 21:15:36 crc kubenswrapper[4746]: I0128 21:15:36.596925 4746 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/92c386a4-a812-4e5f-938a-611be2d329ff-neutron-ovn-metadata-agent-neutron-config-0" (OuterVolumeSpecName: "neutron-ovn-metadata-agent-neutron-config-0") pod "92c386a4-a812-4e5f-938a-611be2d329ff" (UID: "92c386a4-a812-4e5f-938a-611be2d329ff"). InnerVolumeSpecName "neutron-ovn-metadata-agent-neutron-config-0". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 28 21:15:36 crc kubenswrapper[4746]: I0128 21:15:36.617175 4746 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/92c386a4-a812-4e5f-938a-611be2d329ff-nova-metadata-neutron-config-0" (OuterVolumeSpecName: "nova-metadata-neutron-config-0") pod "92c386a4-a812-4e5f-938a-611be2d329ff" (UID: "92c386a4-a812-4e5f-938a-611be2d329ff"). InnerVolumeSpecName "nova-metadata-neutron-config-0". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 28 21:15:36 crc kubenswrapper[4746]: I0128 21:15:36.655972 4746 reconciler_common.go:293] "Volume detached for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/92c386a4-a812-4e5f-938a-611be2d329ff-ssh-key-openstack-edpm-ipam\") on node \"crc\" DevicePath \"\"" Jan 28 21:15:36 crc kubenswrapper[4746]: I0128 21:15:36.656010 4746 reconciler_common.go:293] "Volume detached for volume \"neutron-metadata-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/92c386a4-a812-4e5f-938a-611be2d329ff-neutron-metadata-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Jan 28 21:15:36 crc kubenswrapper[4746]: I0128 21:15:36.656023 4746 reconciler_common.go:293] "Volume detached for volume \"nova-metadata-neutron-config-0\" (UniqueName: \"kubernetes.io/secret/92c386a4-a812-4e5f-938a-611be2d329ff-nova-metadata-neutron-config-0\") on node \"crc\" DevicePath \"\"" Jan 28 21:15:36 crc kubenswrapper[4746]: I0128 21:15:36.656035 4746 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-g6vcf\" (UniqueName: \"kubernetes.io/projected/92c386a4-a812-4e5f-938a-611be2d329ff-kube-api-access-g6vcf\") on node \"crc\" DevicePath \"\"" Jan 28 21:15:36 crc kubenswrapper[4746]: I0128 21:15:36.656046 4746 reconciler_common.go:293] "Volume detached for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/92c386a4-a812-4e5f-938a-611be2d329ff-inventory\") on node \"crc\" DevicePath \"\"" Jan 28 21:15:36 crc kubenswrapper[4746]: I0128 21:15:36.656058 4746 reconciler_common.go:293] "Volume detached for volume \"neutron-ovn-metadata-agent-neutron-config-0\" (UniqueName: \"kubernetes.io/secret/92c386a4-a812-4e5f-938a-611be2d329ff-neutron-ovn-metadata-agent-neutron-config-0\") on node \"crc\" DevicePath \"\"" Jan 28 21:15:36 crc kubenswrapper[4746]: I0128 21:15:36.924245 4746 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-mnncp" event={"ID":"92c386a4-a812-4e5f-938a-611be2d329ff","Type":"ContainerDied","Data":"fa9c1ebdabfdb9b81dc5a0a7a114e03278910db912afbb42b7e1b1b68b5303d2"} Jan 28 21:15:36 crc kubenswrapper[4746]: I0128 21:15:36.924502 4746 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="fa9c1ebdabfdb9b81dc5a0a7a114e03278910db912afbb42b7e1b1b68b5303d2" Jan 28 21:15:36 crc kubenswrapper[4746]: I0128 21:15:36.924650 4746 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-mnncp" Jan 28 21:15:37 crc kubenswrapper[4746]: I0128 21:15:37.048605 4746 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/libvirt-edpm-deployment-openstack-edpm-ipam-qbvvx"] Jan 28 21:15:37 crc kubenswrapper[4746]: E0128 21:15:37.049106 4746 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="92c386a4-a812-4e5f-938a-611be2d329ff" containerName="neutron-metadata-edpm-deployment-openstack-edpm-ipam" Jan 28 21:15:37 crc kubenswrapper[4746]: I0128 21:15:37.049139 4746 state_mem.go:107] "Deleted CPUSet assignment" podUID="92c386a4-a812-4e5f-938a-611be2d329ff" containerName="neutron-metadata-edpm-deployment-openstack-edpm-ipam" Jan 28 21:15:37 crc kubenswrapper[4746]: E0128 21:15:37.049165 4746 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="a573ebb7-d2e3-4a23-a5f7-b8947aceaece" containerName="collect-profiles" Jan 28 21:15:37 crc kubenswrapper[4746]: I0128 21:15:37.049174 4746 state_mem.go:107] "Deleted CPUSet assignment" podUID="a573ebb7-d2e3-4a23-a5f7-b8947aceaece" containerName="collect-profiles" Jan 28 21:15:37 crc kubenswrapper[4746]: I0128 21:15:37.049458 4746 memory_manager.go:354] "RemoveStaleState removing state" podUID="92c386a4-a812-4e5f-938a-611be2d329ff" containerName="neutron-metadata-edpm-deployment-openstack-edpm-ipam" Jan 28 21:15:37 crc kubenswrapper[4746]: 
I0128 21:15:37.049500 4746 memory_manager.go:354] "RemoveStaleState removing state" podUID="a573ebb7-d2e3-4a23-a5f7-b8947aceaece" containerName="collect-profiles" Jan 28 21:15:37 crc kubenswrapper[4746]: I0128 21:15:37.050318 4746 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/libvirt-edpm-deployment-openstack-edpm-ipam-qbvvx" Jan 28 21:15:37 crc kubenswrapper[4746]: I0128 21:15:37.052380 4746 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplane-ansible-ssh-private-key-secret" Jan 28 21:15:37 crc kubenswrapper[4746]: I0128 21:15:37.053219 4746 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-aee-default-env" Jan 28 21:15:37 crc kubenswrapper[4746]: I0128 21:15:37.053228 4746 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplanenodeset-openstack-edpm-ipam" Jan 28 21:15:37 crc kubenswrapper[4746]: I0128 21:15:37.053241 4746 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"libvirt-secret" Jan 28 21:15:37 crc kubenswrapper[4746]: I0128 21:15:37.055203 4746 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-edpm-ipam-dockercfg-xb8vv" Jan 28 21:15:37 crc kubenswrapper[4746]: I0128 21:15:37.061955 4746 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/libvirt-edpm-deployment-openstack-edpm-ipam-qbvvx"] Jan 28 21:15:37 crc kubenswrapper[4746]: I0128 21:15:37.167485 4746 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/7b6fd411-07ae-42b1-bb00-68e72fdbe6fb-inventory\") pod \"libvirt-edpm-deployment-openstack-edpm-ipam-qbvvx\" (UID: \"7b6fd411-07ae-42b1-bb00-68e72fdbe6fb\") " pod="openstack/libvirt-edpm-deployment-openstack-edpm-ipam-qbvvx" Jan 28 21:15:37 crc kubenswrapper[4746]: I0128 21:15:37.167561 4746 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/7b6fd411-07ae-42b1-bb00-68e72fdbe6fb-ssh-key-openstack-edpm-ipam\") pod \"libvirt-edpm-deployment-openstack-edpm-ipam-qbvvx\" (UID: \"7b6fd411-07ae-42b1-bb00-68e72fdbe6fb\") " pod="openstack/libvirt-edpm-deployment-openstack-edpm-ipam-qbvvx" Jan 28 21:15:37 crc kubenswrapper[4746]: I0128 21:15:37.167613 4746 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"libvirt-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/7b6fd411-07ae-42b1-bb00-68e72fdbe6fb-libvirt-combined-ca-bundle\") pod \"libvirt-edpm-deployment-openstack-edpm-ipam-qbvvx\" (UID: \"7b6fd411-07ae-42b1-bb00-68e72fdbe6fb\") " pod="openstack/libvirt-edpm-deployment-openstack-edpm-ipam-qbvvx" Jan 28 21:15:37 crc kubenswrapper[4746]: I0128 21:15:37.167638 4746 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"libvirt-secret-0\" (UniqueName: \"kubernetes.io/secret/7b6fd411-07ae-42b1-bb00-68e72fdbe6fb-libvirt-secret-0\") pod \"libvirt-edpm-deployment-openstack-edpm-ipam-qbvvx\" (UID: \"7b6fd411-07ae-42b1-bb00-68e72fdbe6fb\") " pod="openstack/libvirt-edpm-deployment-openstack-edpm-ipam-qbvvx" Jan 28 21:15:37 crc kubenswrapper[4746]: I0128 21:15:37.167705 4746 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-h9vl2\" (UniqueName: \"kubernetes.io/projected/7b6fd411-07ae-42b1-bb00-68e72fdbe6fb-kube-api-access-h9vl2\") pod \"libvirt-edpm-deployment-openstack-edpm-ipam-qbvvx\" (UID: \"7b6fd411-07ae-42b1-bb00-68e72fdbe6fb\") " pod="openstack/libvirt-edpm-deployment-openstack-edpm-ipam-qbvvx" Jan 28 21:15:37 crc kubenswrapper[4746]: I0128 21:15:37.269159 4746 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-h9vl2\" (UniqueName: 
\"kubernetes.io/projected/7b6fd411-07ae-42b1-bb00-68e72fdbe6fb-kube-api-access-h9vl2\") pod \"libvirt-edpm-deployment-openstack-edpm-ipam-qbvvx\" (UID: \"7b6fd411-07ae-42b1-bb00-68e72fdbe6fb\") " pod="openstack/libvirt-edpm-deployment-openstack-edpm-ipam-qbvvx" Jan 28 21:15:37 crc kubenswrapper[4746]: I0128 21:15:37.269296 4746 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/7b6fd411-07ae-42b1-bb00-68e72fdbe6fb-inventory\") pod \"libvirt-edpm-deployment-openstack-edpm-ipam-qbvvx\" (UID: \"7b6fd411-07ae-42b1-bb00-68e72fdbe6fb\") " pod="openstack/libvirt-edpm-deployment-openstack-edpm-ipam-qbvvx" Jan 28 21:15:37 crc kubenswrapper[4746]: I0128 21:15:37.269341 4746 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/7b6fd411-07ae-42b1-bb00-68e72fdbe6fb-ssh-key-openstack-edpm-ipam\") pod \"libvirt-edpm-deployment-openstack-edpm-ipam-qbvvx\" (UID: \"7b6fd411-07ae-42b1-bb00-68e72fdbe6fb\") " pod="openstack/libvirt-edpm-deployment-openstack-edpm-ipam-qbvvx" Jan 28 21:15:37 crc kubenswrapper[4746]: I0128 21:15:37.269368 4746 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"libvirt-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/7b6fd411-07ae-42b1-bb00-68e72fdbe6fb-libvirt-combined-ca-bundle\") pod \"libvirt-edpm-deployment-openstack-edpm-ipam-qbvvx\" (UID: \"7b6fd411-07ae-42b1-bb00-68e72fdbe6fb\") " pod="openstack/libvirt-edpm-deployment-openstack-edpm-ipam-qbvvx" Jan 28 21:15:37 crc kubenswrapper[4746]: I0128 21:15:37.269390 4746 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"libvirt-secret-0\" (UniqueName: \"kubernetes.io/secret/7b6fd411-07ae-42b1-bb00-68e72fdbe6fb-libvirt-secret-0\") pod \"libvirt-edpm-deployment-openstack-edpm-ipam-qbvvx\" (UID: \"7b6fd411-07ae-42b1-bb00-68e72fdbe6fb\") " 
pod="openstack/libvirt-edpm-deployment-openstack-edpm-ipam-qbvvx" Jan 28 21:15:37 crc kubenswrapper[4746]: I0128 21:15:37.273549 4746 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/7b6fd411-07ae-42b1-bb00-68e72fdbe6fb-ssh-key-openstack-edpm-ipam\") pod \"libvirt-edpm-deployment-openstack-edpm-ipam-qbvvx\" (UID: \"7b6fd411-07ae-42b1-bb00-68e72fdbe6fb\") " pod="openstack/libvirt-edpm-deployment-openstack-edpm-ipam-qbvvx" Jan 28 21:15:37 crc kubenswrapper[4746]: I0128 21:15:37.273559 4746 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/7b6fd411-07ae-42b1-bb00-68e72fdbe6fb-inventory\") pod \"libvirt-edpm-deployment-openstack-edpm-ipam-qbvvx\" (UID: \"7b6fd411-07ae-42b1-bb00-68e72fdbe6fb\") " pod="openstack/libvirt-edpm-deployment-openstack-edpm-ipam-qbvvx" Jan 28 21:15:37 crc kubenswrapper[4746]: I0128 21:15:37.273588 4746 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"libvirt-secret-0\" (UniqueName: \"kubernetes.io/secret/7b6fd411-07ae-42b1-bb00-68e72fdbe6fb-libvirt-secret-0\") pod \"libvirt-edpm-deployment-openstack-edpm-ipam-qbvvx\" (UID: \"7b6fd411-07ae-42b1-bb00-68e72fdbe6fb\") " pod="openstack/libvirt-edpm-deployment-openstack-edpm-ipam-qbvvx" Jan 28 21:15:37 crc kubenswrapper[4746]: I0128 21:15:37.275834 4746 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"libvirt-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/7b6fd411-07ae-42b1-bb00-68e72fdbe6fb-libvirt-combined-ca-bundle\") pod \"libvirt-edpm-deployment-openstack-edpm-ipam-qbvvx\" (UID: \"7b6fd411-07ae-42b1-bb00-68e72fdbe6fb\") " pod="openstack/libvirt-edpm-deployment-openstack-edpm-ipam-qbvvx" Jan 28 21:15:37 crc kubenswrapper[4746]: I0128 21:15:37.283725 4746 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-h9vl2\" (UniqueName: 
\"kubernetes.io/projected/7b6fd411-07ae-42b1-bb00-68e72fdbe6fb-kube-api-access-h9vl2\") pod \"libvirt-edpm-deployment-openstack-edpm-ipam-qbvvx\" (UID: \"7b6fd411-07ae-42b1-bb00-68e72fdbe6fb\") " pod="openstack/libvirt-edpm-deployment-openstack-edpm-ipam-qbvvx" Jan 28 21:15:37 crc kubenswrapper[4746]: I0128 21:15:37.377536 4746 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/libvirt-edpm-deployment-openstack-edpm-ipam-qbvvx" Jan 28 21:15:38 crc kubenswrapper[4746]: W0128 21:15:38.031390 4746 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod7b6fd411_07ae_42b1_bb00_68e72fdbe6fb.slice/crio-0c1da9486790d09590de5aaa31f0d32e9d605d2ef20073cd3b9e37c430b4abfc WatchSource:0}: Error finding container 0c1da9486790d09590de5aaa31f0d32e9d605d2ef20073cd3b9e37c430b4abfc: Status 404 returned error can't find the container with id 0c1da9486790d09590de5aaa31f0d32e9d605d2ef20073cd3b9e37c430b4abfc Jan 28 21:15:38 crc kubenswrapper[4746]: I0128 21:15:38.034295 4746 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/libvirt-edpm-deployment-openstack-edpm-ipam-qbvvx"] Jan 28 21:15:38 crc kubenswrapper[4746]: I0128 21:15:38.034799 4746 provider.go:102] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Jan 28 21:15:38 crc kubenswrapper[4746]: I0128 21:15:38.944623 4746 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/libvirt-edpm-deployment-openstack-edpm-ipam-qbvvx" event={"ID":"7b6fd411-07ae-42b1-bb00-68e72fdbe6fb","Type":"ContainerStarted","Data":"0c1da9486790d09590de5aaa31f0d32e9d605d2ef20073cd3b9e37c430b4abfc"} Jan 28 21:15:39 crc kubenswrapper[4746]: I0128 21:15:39.961692 4746 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/libvirt-edpm-deployment-openstack-edpm-ipam-qbvvx" 
event={"ID":"7b6fd411-07ae-42b1-bb00-68e72fdbe6fb","Type":"ContainerStarted","Data":"22435159f52d49d93d3e6caa9b752a0c07f3dfa6a91e89ea65f59b8e737d96f1"} Jan 28 21:15:39 crc kubenswrapper[4746]: I0128 21:15:39.995892 4746 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/libvirt-edpm-deployment-openstack-edpm-ipam-qbvvx" podStartSLOduration=1.97957556 podStartE2EDuration="2.995857152s" podCreationTimestamp="2026-01-28 21:15:37 +0000 UTC" firstStartedPulling="2026-01-28 21:15:38.034522556 +0000 UTC m=+2165.990708920" lastFinishedPulling="2026-01-28 21:15:39.050804158 +0000 UTC m=+2167.006990512" observedRunningTime="2026-01-28 21:15:39.992400659 +0000 UTC m=+2167.948587023" watchObservedRunningTime="2026-01-28 21:15:39.995857152 +0000 UTC m=+2167.952043526" Jan 28 21:15:45 crc kubenswrapper[4746]: I0128 21:15:45.871518 4746 patch_prober.go:28] interesting pod/machine-config-daemon-6wrnw container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Jan 28 21:15:45 crc kubenswrapper[4746]: I0128 21:15:45.872273 4746 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-6wrnw" podUID="6dc8b546-9734-4082-b2b3-2bafe3f1564d" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Jan 28 21:15:59 crc kubenswrapper[4746]: I0128 21:15:59.061817 4746 scope.go:117] "RemoveContainer" containerID="0285afbcb142cdf5b9d8c281aa5e2ffa73afb12d29dd61ed8bc2f60ab072fa58" Jan 28 21:16:15 crc kubenswrapper[4746]: I0128 21:16:15.871851 4746 patch_prober.go:28] interesting pod/machine-config-daemon-6wrnw container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get 
\"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Jan 28 21:16:15 crc kubenswrapper[4746]: I0128 21:16:15.872368 4746 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-6wrnw" podUID="6dc8b546-9734-4082-b2b3-2bafe3f1564d" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Jan 28 21:16:15 crc kubenswrapper[4746]: I0128 21:16:15.872435 4746 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-6wrnw" Jan 28 21:16:15 crc kubenswrapper[4746]: I0128 21:16:15.873352 4746 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"bce8376b1cfe954c40209943c42297ca65bedfa462c1a2c3d2942abd9bcf67ec"} pod="openshift-machine-config-operator/machine-config-daemon-6wrnw" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted" Jan 28 21:16:15 crc kubenswrapper[4746]: I0128 21:16:15.873420 4746 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-6wrnw" podUID="6dc8b546-9734-4082-b2b3-2bafe3f1564d" containerName="machine-config-daemon" containerID="cri-o://bce8376b1cfe954c40209943c42297ca65bedfa462c1a2c3d2942abd9bcf67ec" gracePeriod=600 Jan 28 21:16:16 crc kubenswrapper[4746]: I0128 21:16:16.319274 4746 generic.go:334] "Generic (PLEG): container finished" podID="6dc8b546-9734-4082-b2b3-2bafe3f1564d" containerID="bce8376b1cfe954c40209943c42297ca65bedfa462c1a2c3d2942abd9bcf67ec" exitCode=0 Jan 28 21:16:16 crc kubenswrapper[4746]: I0128 21:16:16.319375 4746 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-6wrnw" 
event={"ID":"6dc8b546-9734-4082-b2b3-2bafe3f1564d","Type":"ContainerDied","Data":"bce8376b1cfe954c40209943c42297ca65bedfa462c1a2c3d2942abd9bcf67ec"} Jan 28 21:16:16 crc kubenswrapper[4746]: I0128 21:16:16.319608 4746 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-6wrnw" event={"ID":"6dc8b546-9734-4082-b2b3-2bafe3f1564d","Type":"ContainerStarted","Data":"cc873ec6b8a0868869d7fd09d5b5eb7c385e0fac746adb1c9927b6d382729767"} Jan 28 21:16:16 crc kubenswrapper[4746]: I0128 21:16:16.319640 4746 scope.go:117] "RemoveContainer" containerID="4ff7d295e236267b904a42699c2e8affd72dcade2a9eeee3b96c8eb13de3f968" Jan 28 21:18:45 crc kubenswrapper[4746]: I0128 21:18:45.871760 4746 patch_prober.go:28] interesting pod/machine-config-daemon-6wrnw container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Jan 28 21:18:45 crc kubenswrapper[4746]: I0128 21:18:45.872486 4746 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-6wrnw" podUID="6dc8b546-9734-4082-b2b3-2bafe3f1564d" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Jan 28 21:19:15 crc kubenswrapper[4746]: I0128 21:19:15.871841 4746 patch_prober.go:28] interesting pod/machine-config-daemon-6wrnw container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Jan 28 21:19:15 crc kubenswrapper[4746]: I0128 21:19:15.872338 4746 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-6wrnw" podUID="6dc8b546-9734-4082-b2b3-2bafe3f1564d" 
containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Jan 28 21:19:16 crc kubenswrapper[4746]: I0128 21:19:16.189287 4746 generic.go:334] "Generic (PLEG): container finished" podID="7b6fd411-07ae-42b1-bb00-68e72fdbe6fb" containerID="22435159f52d49d93d3e6caa9b752a0c07f3dfa6a91e89ea65f59b8e737d96f1" exitCode=0 Jan 28 21:19:16 crc kubenswrapper[4746]: I0128 21:19:16.189343 4746 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/libvirt-edpm-deployment-openstack-edpm-ipam-qbvvx" event={"ID":"7b6fd411-07ae-42b1-bb00-68e72fdbe6fb","Type":"ContainerDied","Data":"22435159f52d49d93d3e6caa9b752a0c07f3dfa6a91e89ea65f59b8e737d96f1"} Jan 28 21:19:17 crc kubenswrapper[4746]: I0128 21:19:17.720229 4746 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/libvirt-edpm-deployment-openstack-edpm-ipam-qbvvx" Jan 28 21:19:17 crc kubenswrapper[4746]: I0128 21:19:17.812583 4746 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"libvirt-secret-0\" (UniqueName: \"kubernetes.io/secret/7b6fd411-07ae-42b1-bb00-68e72fdbe6fb-libvirt-secret-0\") pod \"7b6fd411-07ae-42b1-bb00-68e72fdbe6fb\" (UID: \"7b6fd411-07ae-42b1-bb00-68e72fdbe6fb\") " Jan 28 21:19:17 crc kubenswrapper[4746]: I0128 21:19:17.812727 4746 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-h9vl2\" (UniqueName: \"kubernetes.io/projected/7b6fd411-07ae-42b1-bb00-68e72fdbe6fb-kube-api-access-h9vl2\") pod \"7b6fd411-07ae-42b1-bb00-68e72fdbe6fb\" (UID: \"7b6fd411-07ae-42b1-bb00-68e72fdbe6fb\") " Jan 28 21:19:17 crc kubenswrapper[4746]: I0128 21:19:17.812854 4746 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"libvirt-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/7b6fd411-07ae-42b1-bb00-68e72fdbe6fb-libvirt-combined-ca-bundle\") pod 
\"7b6fd411-07ae-42b1-bb00-68e72fdbe6fb\" (UID: \"7b6fd411-07ae-42b1-bb00-68e72fdbe6fb\") " Jan 28 21:19:17 crc kubenswrapper[4746]: I0128 21:19:17.812887 4746 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/7b6fd411-07ae-42b1-bb00-68e72fdbe6fb-ssh-key-openstack-edpm-ipam\") pod \"7b6fd411-07ae-42b1-bb00-68e72fdbe6fb\" (UID: \"7b6fd411-07ae-42b1-bb00-68e72fdbe6fb\") " Jan 28 21:19:17 crc kubenswrapper[4746]: I0128 21:19:17.812922 4746 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/7b6fd411-07ae-42b1-bb00-68e72fdbe6fb-inventory\") pod \"7b6fd411-07ae-42b1-bb00-68e72fdbe6fb\" (UID: \"7b6fd411-07ae-42b1-bb00-68e72fdbe6fb\") " Jan 28 21:19:17 crc kubenswrapper[4746]: I0128 21:19:17.818311 4746 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/7b6fd411-07ae-42b1-bb00-68e72fdbe6fb-kube-api-access-h9vl2" (OuterVolumeSpecName: "kube-api-access-h9vl2") pod "7b6fd411-07ae-42b1-bb00-68e72fdbe6fb" (UID: "7b6fd411-07ae-42b1-bb00-68e72fdbe6fb"). InnerVolumeSpecName "kube-api-access-h9vl2". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 28 21:19:17 crc kubenswrapper[4746]: I0128 21:19:17.819401 4746 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/7b6fd411-07ae-42b1-bb00-68e72fdbe6fb-libvirt-combined-ca-bundle" (OuterVolumeSpecName: "libvirt-combined-ca-bundle") pod "7b6fd411-07ae-42b1-bb00-68e72fdbe6fb" (UID: "7b6fd411-07ae-42b1-bb00-68e72fdbe6fb"). InnerVolumeSpecName "libvirt-combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 28 21:19:17 crc kubenswrapper[4746]: I0128 21:19:17.844188 4746 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/7b6fd411-07ae-42b1-bb00-68e72fdbe6fb-libvirt-secret-0" (OuterVolumeSpecName: "libvirt-secret-0") pod "7b6fd411-07ae-42b1-bb00-68e72fdbe6fb" (UID: "7b6fd411-07ae-42b1-bb00-68e72fdbe6fb"). InnerVolumeSpecName "libvirt-secret-0". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 28 21:19:17 crc kubenswrapper[4746]: I0128 21:19:17.849325 4746 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/7b6fd411-07ae-42b1-bb00-68e72fdbe6fb-inventory" (OuterVolumeSpecName: "inventory") pod "7b6fd411-07ae-42b1-bb00-68e72fdbe6fb" (UID: "7b6fd411-07ae-42b1-bb00-68e72fdbe6fb"). InnerVolumeSpecName "inventory". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 28 21:19:17 crc kubenswrapper[4746]: I0128 21:19:17.869299 4746 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/7b6fd411-07ae-42b1-bb00-68e72fdbe6fb-ssh-key-openstack-edpm-ipam" (OuterVolumeSpecName: "ssh-key-openstack-edpm-ipam") pod "7b6fd411-07ae-42b1-bb00-68e72fdbe6fb" (UID: "7b6fd411-07ae-42b1-bb00-68e72fdbe6fb"). InnerVolumeSpecName "ssh-key-openstack-edpm-ipam". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 28 21:19:17 crc kubenswrapper[4746]: I0128 21:19:17.917329 4746 reconciler_common.go:293] "Volume detached for volume \"libvirt-secret-0\" (UniqueName: \"kubernetes.io/secret/7b6fd411-07ae-42b1-bb00-68e72fdbe6fb-libvirt-secret-0\") on node \"crc\" DevicePath \"\"" Jan 28 21:19:17 crc kubenswrapper[4746]: I0128 21:19:17.917368 4746 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-h9vl2\" (UniqueName: \"kubernetes.io/projected/7b6fd411-07ae-42b1-bb00-68e72fdbe6fb-kube-api-access-h9vl2\") on node \"crc\" DevicePath \"\"" Jan 28 21:19:17 crc kubenswrapper[4746]: I0128 21:19:17.917384 4746 reconciler_common.go:293] "Volume detached for volume \"libvirt-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/7b6fd411-07ae-42b1-bb00-68e72fdbe6fb-libvirt-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Jan 28 21:19:17 crc kubenswrapper[4746]: I0128 21:19:17.917397 4746 reconciler_common.go:293] "Volume detached for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/7b6fd411-07ae-42b1-bb00-68e72fdbe6fb-ssh-key-openstack-edpm-ipam\") on node \"crc\" DevicePath \"\"" Jan 28 21:19:17 crc kubenswrapper[4746]: I0128 21:19:17.917409 4746 reconciler_common.go:293] "Volume detached for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/7b6fd411-07ae-42b1-bb00-68e72fdbe6fb-inventory\") on node \"crc\" DevicePath \"\"" Jan 28 21:19:18 crc kubenswrapper[4746]: I0128 21:19:18.207460 4746 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/libvirt-edpm-deployment-openstack-edpm-ipam-qbvvx" event={"ID":"7b6fd411-07ae-42b1-bb00-68e72fdbe6fb","Type":"ContainerDied","Data":"0c1da9486790d09590de5aaa31f0d32e9d605d2ef20073cd3b9e37c430b4abfc"} Jan 28 21:19:18 crc kubenswrapper[4746]: I0128 21:19:18.207504 4746 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="0c1da9486790d09590de5aaa31f0d32e9d605d2ef20073cd3b9e37c430b4abfc" Jan 28 
21:19:18 crc kubenswrapper[4746]: I0128 21:19:18.207561 4746 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/libvirt-edpm-deployment-openstack-edpm-ipam-qbvvx" Jan 28 21:19:18 crc kubenswrapper[4746]: I0128 21:19:18.301465 4746 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-edpm-deployment-openstack-edpm-ipam-dpjnx"] Jan 28 21:19:18 crc kubenswrapper[4746]: E0128 21:19:18.301933 4746 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="7b6fd411-07ae-42b1-bb00-68e72fdbe6fb" containerName="libvirt-edpm-deployment-openstack-edpm-ipam" Jan 28 21:19:18 crc kubenswrapper[4746]: I0128 21:19:18.301959 4746 state_mem.go:107] "Deleted CPUSet assignment" podUID="7b6fd411-07ae-42b1-bb00-68e72fdbe6fb" containerName="libvirt-edpm-deployment-openstack-edpm-ipam" Jan 28 21:19:18 crc kubenswrapper[4746]: I0128 21:19:18.302239 4746 memory_manager.go:354] "RemoveStaleState removing state" podUID="7b6fd411-07ae-42b1-bb00-68e72fdbe6fb" containerName="libvirt-edpm-deployment-openstack-edpm-ipam" Jan 28 21:19:18 crc kubenswrapper[4746]: I0128 21:19:18.302914 4746 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-dpjnx" Jan 28 21:19:18 crc kubenswrapper[4746]: I0128 21:19:18.306248 4746 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplanenodeset-openstack-edpm-ipam" Jan 28 21:19:18 crc kubenswrapper[4746]: I0128 21:19:18.306287 4746 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-aee-default-env" Jan 28 21:19:18 crc kubenswrapper[4746]: I0128 21:19:18.306546 4746 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-cell1-compute-config" Jan 28 21:19:18 crc kubenswrapper[4746]: I0128 21:19:18.306789 4746 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplane-ansible-ssh-private-key-secret" Jan 28 21:19:18 crc kubenswrapper[4746]: I0128 21:19:18.307356 4746 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-migration-ssh-key" Jan 28 21:19:18 crc kubenswrapper[4746]: I0128 21:19:18.307425 4746 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"nova-extra-config" Jan 28 21:19:18 crc kubenswrapper[4746]: I0128 21:19:18.314541 4746 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-edpm-ipam-dockercfg-xb8vv" Jan 28 21:19:18 crc kubenswrapper[4746]: I0128 21:19:18.319825 4746 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-edpm-deployment-openstack-edpm-ipam-dpjnx"] Jan 28 21:19:18 crc kubenswrapper[4746]: I0128 21:19:18.425561 4746 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"nova-extra-config-0\" (UniqueName: \"kubernetes.io/configmap/1d1f9f12-edab-459d-b9ac-2bb03644b752-nova-extra-config-0\") pod \"nova-edpm-deployment-openstack-edpm-ipam-dpjnx\" (UID: \"1d1f9f12-edab-459d-b9ac-2bb03644b752\") " pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-dpjnx" Jan 28 21:19:18 crc kubenswrapper[4746]: I0128 
21:19:18.425792 4746 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-66djq\" (UniqueName: \"kubernetes.io/projected/1d1f9f12-edab-459d-b9ac-2bb03644b752-kube-api-access-66djq\") pod \"nova-edpm-deployment-openstack-edpm-ipam-dpjnx\" (UID: \"1d1f9f12-edab-459d-b9ac-2bb03644b752\") " pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-dpjnx" Jan 28 21:19:18 crc kubenswrapper[4746]: I0128 21:19:18.425885 4746 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"nova-cell1-compute-config-0\" (UniqueName: \"kubernetes.io/secret/1d1f9f12-edab-459d-b9ac-2bb03644b752-nova-cell1-compute-config-0\") pod \"nova-edpm-deployment-openstack-edpm-ipam-dpjnx\" (UID: \"1d1f9f12-edab-459d-b9ac-2bb03644b752\") " pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-dpjnx" Jan 28 21:19:18 crc kubenswrapper[4746]: I0128 21:19:18.426013 4746 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"nova-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/1d1f9f12-edab-459d-b9ac-2bb03644b752-nova-combined-ca-bundle\") pod \"nova-edpm-deployment-openstack-edpm-ipam-dpjnx\" (UID: \"1d1f9f12-edab-459d-b9ac-2bb03644b752\") " pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-dpjnx" Jan 28 21:19:18 crc kubenswrapper[4746]: I0128 21:19:18.426359 4746 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/1d1f9f12-edab-459d-b9ac-2bb03644b752-ssh-key-openstack-edpm-ipam\") pod \"nova-edpm-deployment-openstack-edpm-ipam-dpjnx\" (UID: \"1d1f9f12-edab-459d-b9ac-2bb03644b752\") " pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-dpjnx" Jan 28 21:19:18 crc kubenswrapper[4746]: I0128 21:19:18.426401 4746 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"nova-migration-ssh-key-1\" 
(UniqueName: \"kubernetes.io/secret/1d1f9f12-edab-459d-b9ac-2bb03644b752-nova-migration-ssh-key-1\") pod \"nova-edpm-deployment-openstack-edpm-ipam-dpjnx\" (UID: \"1d1f9f12-edab-459d-b9ac-2bb03644b752\") " pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-dpjnx" Jan 28 21:19:18 crc kubenswrapper[4746]: I0128 21:19:18.426447 4746 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"nova-migration-ssh-key-0\" (UniqueName: \"kubernetes.io/secret/1d1f9f12-edab-459d-b9ac-2bb03644b752-nova-migration-ssh-key-0\") pod \"nova-edpm-deployment-openstack-edpm-ipam-dpjnx\" (UID: \"1d1f9f12-edab-459d-b9ac-2bb03644b752\") " pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-dpjnx" Jan 28 21:19:18 crc kubenswrapper[4746]: I0128 21:19:18.426474 4746 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/1d1f9f12-edab-459d-b9ac-2bb03644b752-inventory\") pod \"nova-edpm-deployment-openstack-edpm-ipam-dpjnx\" (UID: \"1d1f9f12-edab-459d-b9ac-2bb03644b752\") " pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-dpjnx" Jan 28 21:19:18 crc kubenswrapper[4746]: I0128 21:19:18.426667 4746 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"nova-cell1-compute-config-1\" (UniqueName: \"kubernetes.io/secret/1d1f9f12-edab-459d-b9ac-2bb03644b752-nova-cell1-compute-config-1\") pod \"nova-edpm-deployment-openstack-edpm-ipam-dpjnx\" (UID: \"1d1f9f12-edab-459d-b9ac-2bb03644b752\") " pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-dpjnx" Jan 28 21:19:18 crc kubenswrapper[4746]: I0128 21:19:18.529114 4746 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-66djq\" (UniqueName: \"kubernetes.io/projected/1d1f9f12-edab-459d-b9ac-2bb03644b752-kube-api-access-66djq\") pod \"nova-edpm-deployment-openstack-edpm-ipam-dpjnx\" (UID: 
\"1d1f9f12-edab-459d-b9ac-2bb03644b752\") " pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-dpjnx" Jan 28 21:19:18 crc kubenswrapper[4746]: I0128 21:19:18.529195 4746 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nova-cell1-compute-config-0\" (UniqueName: \"kubernetes.io/secret/1d1f9f12-edab-459d-b9ac-2bb03644b752-nova-cell1-compute-config-0\") pod \"nova-edpm-deployment-openstack-edpm-ipam-dpjnx\" (UID: \"1d1f9f12-edab-459d-b9ac-2bb03644b752\") " pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-dpjnx" Jan 28 21:19:18 crc kubenswrapper[4746]: I0128 21:19:18.529243 4746 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nova-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/1d1f9f12-edab-459d-b9ac-2bb03644b752-nova-combined-ca-bundle\") pod \"nova-edpm-deployment-openstack-edpm-ipam-dpjnx\" (UID: \"1d1f9f12-edab-459d-b9ac-2bb03644b752\") " pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-dpjnx" Jan 28 21:19:18 crc kubenswrapper[4746]: I0128 21:19:18.529342 4746 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/1d1f9f12-edab-459d-b9ac-2bb03644b752-ssh-key-openstack-edpm-ipam\") pod \"nova-edpm-deployment-openstack-edpm-ipam-dpjnx\" (UID: \"1d1f9f12-edab-459d-b9ac-2bb03644b752\") " pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-dpjnx" Jan 28 21:19:18 crc kubenswrapper[4746]: I0128 21:19:18.529376 4746 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nova-migration-ssh-key-1\" (UniqueName: \"kubernetes.io/secret/1d1f9f12-edab-459d-b9ac-2bb03644b752-nova-migration-ssh-key-1\") pod \"nova-edpm-deployment-openstack-edpm-ipam-dpjnx\" (UID: \"1d1f9f12-edab-459d-b9ac-2bb03644b752\") " pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-dpjnx" Jan 28 21:19:18 crc kubenswrapper[4746]: I0128 21:19:18.529430 4746 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"nova-migration-ssh-key-0\" (UniqueName: \"kubernetes.io/secret/1d1f9f12-edab-459d-b9ac-2bb03644b752-nova-migration-ssh-key-0\") pod \"nova-edpm-deployment-openstack-edpm-ipam-dpjnx\" (UID: \"1d1f9f12-edab-459d-b9ac-2bb03644b752\") " pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-dpjnx" Jan 28 21:19:18 crc kubenswrapper[4746]: I0128 21:19:18.529460 4746 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/1d1f9f12-edab-459d-b9ac-2bb03644b752-inventory\") pod \"nova-edpm-deployment-openstack-edpm-ipam-dpjnx\" (UID: \"1d1f9f12-edab-459d-b9ac-2bb03644b752\") " pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-dpjnx" Jan 28 21:19:18 crc kubenswrapper[4746]: I0128 21:19:18.529496 4746 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nova-cell1-compute-config-1\" (UniqueName: \"kubernetes.io/secret/1d1f9f12-edab-459d-b9ac-2bb03644b752-nova-cell1-compute-config-1\") pod \"nova-edpm-deployment-openstack-edpm-ipam-dpjnx\" (UID: \"1d1f9f12-edab-459d-b9ac-2bb03644b752\") " pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-dpjnx" Jan 28 21:19:18 crc kubenswrapper[4746]: I0128 21:19:18.529571 4746 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nova-extra-config-0\" (UniqueName: \"kubernetes.io/configmap/1d1f9f12-edab-459d-b9ac-2bb03644b752-nova-extra-config-0\") pod \"nova-edpm-deployment-openstack-edpm-ipam-dpjnx\" (UID: \"1d1f9f12-edab-459d-b9ac-2bb03644b752\") " pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-dpjnx" Jan 28 21:19:18 crc kubenswrapper[4746]: I0128 21:19:18.530657 4746 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"nova-extra-config-0\" (UniqueName: \"kubernetes.io/configmap/1d1f9f12-edab-459d-b9ac-2bb03644b752-nova-extra-config-0\") pod \"nova-edpm-deployment-openstack-edpm-ipam-dpjnx\" (UID: 
\"1d1f9f12-edab-459d-b9ac-2bb03644b752\") " pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-dpjnx" Jan 28 21:19:18 crc kubenswrapper[4746]: I0128 21:19:18.534197 4746 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"nova-migration-ssh-key-1\" (UniqueName: \"kubernetes.io/secret/1d1f9f12-edab-459d-b9ac-2bb03644b752-nova-migration-ssh-key-1\") pod \"nova-edpm-deployment-openstack-edpm-ipam-dpjnx\" (UID: \"1d1f9f12-edab-459d-b9ac-2bb03644b752\") " pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-dpjnx" Jan 28 21:19:18 crc kubenswrapper[4746]: I0128 21:19:18.534913 4746 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"nova-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/1d1f9f12-edab-459d-b9ac-2bb03644b752-nova-combined-ca-bundle\") pod \"nova-edpm-deployment-openstack-edpm-ipam-dpjnx\" (UID: \"1d1f9f12-edab-459d-b9ac-2bb03644b752\") " pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-dpjnx" Jan 28 21:19:18 crc kubenswrapper[4746]: I0128 21:19:18.535008 4746 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"nova-cell1-compute-config-1\" (UniqueName: \"kubernetes.io/secret/1d1f9f12-edab-459d-b9ac-2bb03644b752-nova-cell1-compute-config-1\") pod \"nova-edpm-deployment-openstack-edpm-ipam-dpjnx\" (UID: \"1d1f9f12-edab-459d-b9ac-2bb03644b752\") " pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-dpjnx" Jan 28 21:19:18 crc kubenswrapper[4746]: I0128 21:19:18.536557 4746 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"nova-migration-ssh-key-0\" (UniqueName: \"kubernetes.io/secret/1d1f9f12-edab-459d-b9ac-2bb03644b752-nova-migration-ssh-key-0\") pod \"nova-edpm-deployment-openstack-edpm-ipam-dpjnx\" (UID: \"1d1f9f12-edab-459d-b9ac-2bb03644b752\") " pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-dpjnx" Jan 28 21:19:18 crc kubenswrapper[4746]: I0128 21:19:18.536800 4746 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"inventory\" 
(UniqueName: \"kubernetes.io/secret/1d1f9f12-edab-459d-b9ac-2bb03644b752-inventory\") pod \"nova-edpm-deployment-openstack-edpm-ipam-dpjnx\" (UID: \"1d1f9f12-edab-459d-b9ac-2bb03644b752\") " pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-dpjnx" Jan 28 21:19:18 crc kubenswrapper[4746]: I0128 21:19:18.536937 4746 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/1d1f9f12-edab-459d-b9ac-2bb03644b752-ssh-key-openstack-edpm-ipam\") pod \"nova-edpm-deployment-openstack-edpm-ipam-dpjnx\" (UID: \"1d1f9f12-edab-459d-b9ac-2bb03644b752\") " pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-dpjnx" Jan 28 21:19:18 crc kubenswrapper[4746]: I0128 21:19:18.537046 4746 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"nova-cell1-compute-config-0\" (UniqueName: \"kubernetes.io/secret/1d1f9f12-edab-459d-b9ac-2bb03644b752-nova-cell1-compute-config-0\") pod \"nova-edpm-deployment-openstack-edpm-ipam-dpjnx\" (UID: \"1d1f9f12-edab-459d-b9ac-2bb03644b752\") " pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-dpjnx" Jan 28 21:19:18 crc kubenswrapper[4746]: I0128 21:19:18.562978 4746 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-66djq\" (UniqueName: \"kubernetes.io/projected/1d1f9f12-edab-459d-b9ac-2bb03644b752-kube-api-access-66djq\") pod \"nova-edpm-deployment-openstack-edpm-ipam-dpjnx\" (UID: \"1d1f9f12-edab-459d-b9ac-2bb03644b752\") " pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-dpjnx" Jan 28 21:19:18 crc kubenswrapper[4746]: I0128 21:19:18.654879 4746 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-dpjnx" Jan 28 21:19:19 crc kubenswrapper[4746]: I0128 21:19:19.228371 4746 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-edpm-deployment-openstack-edpm-ipam-dpjnx"] Jan 28 21:19:20 crc kubenswrapper[4746]: I0128 21:19:20.225954 4746 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-dpjnx" event={"ID":"1d1f9f12-edab-459d-b9ac-2bb03644b752","Type":"ContainerStarted","Data":"477c224df459a92a798eeefb29eb1be6819ee638e5afd96608985a998ee69aee"} Jan 28 21:19:20 crc kubenswrapper[4746]: I0128 21:19:20.226482 4746 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-dpjnx" event={"ID":"1d1f9f12-edab-459d-b9ac-2bb03644b752","Type":"ContainerStarted","Data":"952b02973177849b9f0c3cf1306d21191f87d9cea07616fdf9077f7f67f10696"} Jan 28 21:19:20 crc kubenswrapper[4746]: I0128 21:19:20.248958 4746 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-dpjnx" podStartSLOduration=1.8052552830000002 podStartE2EDuration="2.248930191s" podCreationTimestamp="2026-01-28 21:19:18 +0000 UTC" firstStartedPulling="2026-01-28 21:19:19.235319734 +0000 UTC m=+2387.191506098" lastFinishedPulling="2026-01-28 21:19:19.678994652 +0000 UTC m=+2387.635181006" observedRunningTime="2026-01-28 21:19:20.24375977 +0000 UTC m=+2388.199946124" watchObservedRunningTime="2026-01-28 21:19:20.248930191 +0000 UTC m=+2388.205116545" Jan 28 21:19:45 crc kubenswrapper[4746]: I0128 21:19:45.871137 4746 patch_prober.go:28] interesting pod/machine-config-daemon-6wrnw container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Jan 28 21:19:45 crc kubenswrapper[4746]: I0128 
21:19:45.871688 4746 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-6wrnw" podUID="6dc8b546-9734-4082-b2b3-2bafe3f1564d" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Jan 28 21:19:45 crc kubenswrapper[4746]: I0128 21:19:45.871733 4746 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-6wrnw" Jan 28 21:19:45 crc kubenswrapper[4746]: I0128 21:19:45.872630 4746 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"cc873ec6b8a0868869d7fd09d5b5eb7c385e0fac746adb1c9927b6d382729767"} pod="openshift-machine-config-operator/machine-config-daemon-6wrnw" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted" Jan 28 21:19:45 crc kubenswrapper[4746]: I0128 21:19:45.872699 4746 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-6wrnw" podUID="6dc8b546-9734-4082-b2b3-2bafe3f1564d" containerName="machine-config-daemon" containerID="cri-o://cc873ec6b8a0868869d7fd09d5b5eb7c385e0fac746adb1c9927b6d382729767" gracePeriod=600 Jan 28 21:19:45 crc kubenswrapper[4746]: E0128 21:19:45.994218 4746 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-6wrnw_openshift-machine-config-operator(6dc8b546-9734-4082-b2b3-2bafe3f1564d)\"" pod="openshift-machine-config-operator/machine-config-daemon-6wrnw" podUID="6dc8b546-9734-4082-b2b3-2bafe3f1564d" Jan 28 21:19:46 crc kubenswrapper[4746]: I0128 21:19:46.514932 4746 generic.go:334] "Generic (PLEG): container finished" 
podID="6dc8b546-9734-4082-b2b3-2bafe3f1564d" containerID="cc873ec6b8a0868869d7fd09d5b5eb7c385e0fac746adb1c9927b6d382729767" exitCode=0 Jan 28 21:19:46 crc kubenswrapper[4746]: I0128 21:19:46.514980 4746 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-6wrnw" event={"ID":"6dc8b546-9734-4082-b2b3-2bafe3f1564d","Type":"ContainerDied","Data":"cc873ec6b8a0868869d7fd09d5b5eb7c385e0fac746adb1c9927b6d382729767"} Jan 28 21:19:46 crc kubenswrapper[4746]: I0128 21:19:46.515015 4746 scope.go:117] "RemoveContainer" containerID="bce8376b1cfe954c40209943c42297ca65bedfa462c1a2c3d2942abd9bcf67ec" Jan 28 21:19:46 crc kubenswrapper[4746]: I0128 21:19:46.515754 4746 scope.go:117] "RemoveContainer" containerID="cc873ec6b8a0868869d7fd09d5b5eb7c385e0fac746adb1c9927b6d382729767" Jan 28 21:19:46 crc kubenswrapper[4746]: E0128 21:19:46.516844 4746 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-6wrnw_openshift-machine-config-operator(6dc8b546-9734-4082-b2b3-2bafe3f1564d)\"" pod="openshift-machine-config-operator/machine-config-daemon-6wrnw" podUID="6dc8b546-9734-4082-b2b3-2bafe3f1564d" Jan 28 21:19:59 crc kubenswrapper[4746]: I0128 21:19:59.836101 4746 scope.go:117] "RemoveContainer" containerID="cc873ec6b8a0868869d7fd09d5b5eb7c385e0fac746adb1c9927b6d382729767" Jan 28 21:19:59 crc kubenswrapper[4746]: E0128 21:19:59.836998 4746 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-6wrnw_openshift-machine-config-operator(6dc8b546-9734-4082-b2b3-2bafe3f1564d)\"" pod="openshift-machine-config-operator/machine-config-daemon-6wrnw" podUID="6dc8b546-9734-4082-b2b3-2bafe3f1564d" Jan 28 
21:20:13 crc kubenswrapper[4746]: I0128 21:20:13.857620 4746 scope.go:117] "RemoveContainer" containerID="cc873ec6b8a0868869d7fd09d5b5eb7c385e0fac746adb1c9927b6d382729767" Jan 28 21:20:13 crc kubenswrapper[4746]: E0128 21:20:13.858696 4746 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-6wrnw_openshift-machine-config-operator(6dc8b546-9734-4082-b2b3-2bafe3f1564d)\"" pod="openshift-machine-config-operator/machine-config-daemon-6wrnw" podUID="6dc8b546-9734-4082-b2b3-2bafe3f1564d" Jan 28 21:20:25 crc kubenswrapper[4746]: I0128 21:20:25.836966 4746 scope.go:117] "RemoveContainer" containerID="cc873ec6b8a0868869d7fd09d5b5eb7c385e0fac746adb1c9927b6d382729767" Jan 28 21:20:25 crc kubenswrapper[4746]: E0128 21:20:25.837780 4746 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-6wrnw_openshift-machine-config-operator(6dc8b546-9734-4082-b2b3-2bafe3f1564d)\"" pod="openshift-machine-config-operator/machine-config-daemon-6wrnw" podUID="6dc8b546-9734-4082-b2b3-2bafe3f1564d" Jan 28 21:20:39 crc kubenswrapper[4746]: I0128 21:20:39.836768 4746 scope.go:117] "RemoveContainer" containerID="cc873ec6b8a0868869d7fd09d5b5eb7c385e0fac746adb1c9927b6d382729767" Jan 28 21:20:39 crc kubenswrapper[4746]: E0128 21:20:39.837669 4746 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-6wrnw_openshift-machine-config-operator(6dc8b546-9734-4082-b2b3-2bafe3f1564d)\"" pod="openshift-machine-config-operator/machine-config-daemon-6wrnw" 
podUID="6dc8b546-9734-4082-b2b3-2bafe3f1564d" Jan 28 21:20:50 crc kubenswrapper[4746]: I0128 21:20:50.836082 4746 scope.go:117] "RemoveContainer" containerID="cc873ec6b8a0868869d7fd09d5b5eb7c385e0fac746adb1c9927b6d382729767" Jan 28 21:20:50 crc kubenswrapper[4746]: E0128 21:20:50.837473 4746 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-6wrnw_openshift-machine-config-operator(6dc8b546-9734-4082-b2b3-2bafe3f1564d)\"" pod="openshift-machine-config-operator/machine-config-daemon-6wrnw" podUID="6dc8b546-9734-4082-b2b3-2bafe3f1564d" Jan 28 21:20:57 crc kubenswrapper[4746]: I0128 21:20:57.155248 4746 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-marketplace-fd9f5"] Jan 28 21:20:57 crc kubenswrapper[4746]: I0128 21:20:57.157943 4746 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-fd9f5" Jan 28 21:20:57 crc kubenswrapper[4746]: I0128 21:20:57.171437 4746 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-fd9f5"] Jan 28 21:20:57 crc kubenswrapper[4746]: I0128 21:20:57.319463 4746 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-h9qtn\" (UniqueName: \"kubernetes.io/projected/402c6597-ece6-42cc-a032-bfa23a44e8c9-kube-api-access-h9qtn\") pod \"redhat-marketplace-fd9f5\" (UID: \"402c6597-ece6-42cc-a032-bfa23a44e8c9\") " pod="openshift-marketplace/redhat-marketplace-fd9f5" Jan 28 21:20:57 crc kubenswrapper[4746]: I0128 21:20:57.319553 4746 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/402c6597-ece6-42cc-a032-bfa23a44e8c9-catalog-content\") pod \"redhat-marketplace-fd9f5\" (UID: 
\"402c6597-ece6-42cc-a032-bfa23a44e8c9\") " pod="openshift-marketplace/redhat-marketplace-fd9f5" Jan 28 21:20:57 crc kubenswrapper[4746]: I0128 21:20:57.319617 4746 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/402c6597-ece6-42cc-a032-bfa23a44e8c9-utilities\") pod \"redhat-marketplace-fd9f5\" (UID: \"402c6597-ece6-42cc-a032-bfa23a44e8c9\") " pod="openshift-marketplace/redhat-marketplace-fd9f5" Jan 28 21:20:57 crc kubenswrapper[4746]: I0128 21:20:57.422128 4746 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-h9qtn\" (UniqueName: \"kubernetes.io/projected/402c6597-ece6-42cc-a032-bfa23a44e8c9-kube-api-access-h9qtn\") pod \"redhat-marketplace-fd9f5\" (UID: \"402c6597-ece6-42cc-a032-bfa23a44e8c9\") " pod="openshift-marketplace/redhat-marketplace-fd9f5" Jan 28 21:20:57 crc kubenswrapper[4746]: I0128 21:20:57.422230 4746 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/402c6597-ece6-42cc-a032-bfa23a44e8c9-catalog-content\") pod \"redhat-marketplace-fd9f5\" (UID: \"402c6597-ece6-42cc-a032-bfa23a44e8c9\") " pod="openshift-marketplace/redhat-marketplace-fd9f5" Jan 28 21:20:57 crc kubenswrapper[4746]: I0128 21:20:57.422293 4746 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/402c6597-ece6-42cc-a032-bfa23a44e8c9-utilities\") pod \"redhat-marketplace-fd9f5\" (UID: \"402c6597-ece6-42cc-a032-bfa23a44e8c9\") " pod="openshift-marketplace/redhat-marketplace-fd9f5" Jan 28 21:20:57 crc kubenswrapper[4746]: I0128 21:20:57.422964 4746 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/402c6597-ece6-42cc-a032-bfa23a44e8c9-utilities\") pod \"redhat-marketplace-fd9f5\" (UID: 
\"402c6597-ece6-42cc-a032-bfa23a44e8c9\") " pod="openshift-marketplace/redhat-marketplace-fd9f5" Jan 28 21:20:57 crc kubenswrapper[4746]: I0128 21:20:57.423566 4746 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/402c6597-ece6-42cc-a032-bfa23a44e8c9-catalog-content\") pod \"redhat-marketplace-fd9f5\" (UID: \"402c6597-ece6-42cc-a032-bfa23a44e8c9\") " pod="openshift-marketplace/redhat-marketplace-fd9f5" Jan 28 21:20:57 crc kubenswrapper[4746]: I0128 21:20:57.468442 4746 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-h9qtn\" (UniqueName: \"kubernetes.io/projected/402c6597-ece6-42cc-a032-bfa23a44e8c9-kube-api-access-h9qtn\") pod \"redhat-marketplace-fd9f5\" (UID: \"402c6597-ece6-42cc-a032-bfa23a44e8c9\") " pod="openshift-marketplace/redhat-marketplace-fd9f5" Jan 28 21:20:57 crc kubenswrapper[4746]: I0128 21:20:57.482609 4746 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-fd9f5" Jan 28 21:20:57 crc kubenswrapper[4746]: I0128 21:20:57.950400 4746 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-fd9f5"] Jan 28 21:20:58 crc kubenswrapper[4746]: I0128 21:20:58.213853 4746 generic.go:334] "Generic (PLEG): container finished" podID="402c6597-ece6-42cc-a032-bfa23a44e8c9" containerID="a7f29f00db6a3b63de3541d226d926dda4a54708628cfe215a8168bbf3e8d2a9" exitCode=0 Jan 28 21:20:58 crc kubenswrapper[4746]: I0128 21:20:58.214037 4746 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-fd9f5" event={"ID":"402c6597-ece6-42cc-a032-bfa23a44e8c9","Type":"ContainerDied","Data":"a7f29f00db6a3b63de3541d226d926dda4a54708628cfe215a8168bbf3e8d2a9"} Jan 28 21:20:58 crc kubenswrapper[4746]: I0128 21:20:58.214236 4746 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-fd9f5" 
event={"ID":"402c6597-ece6-42cc-a032-bfa23a44e8c9","Type":"ContainerStarted","Data":"75a406030ef5f6b9f9957c29d074d0ce924bec648bd924c70009d0f754063878"} Jan 28 21:20:58 crc kubenswrapper[4746]: I0128 21:20:58.216230 4746 provider.go:102] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Jan 28 21:21:00 crc kubenswrapper[4746]: I0128 21:21:00.237251 4746 generic.go:334] "Generic (PLEG): container finished" podID="402c6597-ece6-42cc-a032-bfa23a44e8c9" containerID="b4ebd47646524a7ff88c121f79479dbe14d4964a22a47393891962bc6179f1e4" exitCode=0 Jan 28 21:21:00 crc kubenswrapper[4746]: I0128 21:21:00.237453 4746 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-fd9f5" event={"ID":"402c6597-ece6-42cc-a032-bfa23a44e8c9","Type":"ContainerDied","Data":"b4ebd47646524a7ff88c121f79479dbe14d4964a22a47393891962bc6179f1e4"} Jan 28 21:21:01 crc kubenswrapper[4746]: I0128 21:21:01.247934 4746 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-fd9f5" event={"ID":"402c6597-ece6-42cc-a032-bfa23a44e8c9","Type":"ContainerStarted","Data":"fbcc91e8d09c58518503f2fc0511eff3fd9d3c6784a59739ec0b33447d45baa0"} Jan 28 21:21:01 crc kubenswrapper[4746]: I0128 21:21:01.268008 4746 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-marketplace-fd9f5" podStartSLOduration=1.8603318789999999 podStartE2EDuration="4.267990997s" podCreationTimestamp="2026-01-28 21:20:57 +0000 UTC" firstStartedPulling="2026-01-28 21:20:58.215945294 +0000 UTC m=+2486.172131648" lastFinishedPulling="2026-01-28 21:21:00.623604412 +0000 UTC m=+2488.579790766" observedRunningTime="2026-01-28 21:21:01.26404967 +0000 UTC m=+2489.220236024" watchObservedRunningTime="2026-01-28 21:21:01.267990997 +0000 UTC m=+2489.224177351" Jan 28 21:21:01 crc kubenswrapper[4746]: I0128 21:21:01.835612 4746 scope.go:117] "RemoveContainer" 
containerID="cc873ec6b8a0868869d7fd09d5b5eb7c385e0fac746adb1c9927b6d382729767" Jan 28 21:21:01 crc kubenswrapper[4746]: E0128 21:21:01.835881 4746 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-6wrnw_openshift-machine-config-operator(6dc8b546-9734-4082-b2b3-2bafe3f1564d)\"" pod="openshift-machine-config-operator/machine-config-daemon-6wrnw" podUID="6dc8b546-9734-4082-b2b3-2bafe3f1564d" Jan 28 21:21:07 crc kubenswrapper[4746]: I0128 21:21:07.483422 4746 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-marketplace-fd9f5" Jan 28 21:21:07 crc kubenswrapper[4746]: I0128 21:21:07.484023 4746 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-marketplace-fd9f5" Jan 28 21:21:07 crc kubenswrapper[4746]: I0128 21:21:07.562799 4746 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-marketplace-fd9f5" Jan 28 21:21:08 crc kubenswrapper[4746]: I0128 21:21:08.382787 4746 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-marketplace-fd9f5" Jan 28 21:21:08 crc kubenswrapper[4746]: I0128 21:21:08.439884 4746 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-fd9f5"] Jan 28 21:21:10 crc kubenswrapper[4746]: I0128 21:21:10.363041 4746 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-marketplace-fd9f5" podUID="402c6597-ece6-42cc-a032-bfa23a44e8c9" containerName="registry-server" containerID="cri-o://fbcc91e8d09c58518503f2fc0511eff3fd9d3c6784a59739ec0b33447d45baa0" gracePeriod=2 Jan 28 21:21:10 crc kubenswrapper[4746]: I0128 21:21:10.909692 4746 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-fd9f5" Jan 28 21:21:11 crc kubenswrapper[4746]: I0128 21:21:11.055891 4746 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/402c6597-ece6-42cc-a032-bfa23a44e8c9-utilities\") pod \"402c6597-ece6-42cc-a032-bfa23a44e8c9\" (UID: \"402c6597-ece6-42cc-a032-bfa23a44e8c9\") " Jan 28 21:21:11 crc kubenswrapper[4746]: I0128 21:21:11.055956 4746 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/402c6597-ece6-42cc-a032-bfa23a44e8c9-catalog-content\") pod \"402c6597-ece6-42cc-a032-bfa23a44e8c9\" (UID: \"402c6597-ece6-42cc-a032-bfa23a44e8c9\") " Jan 28 21:21:11 crc kubenswrapper[4746]: I0128 21:21:11.056238 4746 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-h9qtn\" (UniqueName: \"kubernetes.io/projected/402c6597-ece6-42cc-a032-bfa23a44e8c9-kube-api-access-h9qtn\") pod \"402c6597-ece6-42cc-a032-bfa23a44e8c9\" (UID: \"402c6597-ece6-42cc-a032-bfa23a44e8c9\") " Jan 28 21:21:11 crc kubenswrapper[4746]: I0128 21:21:11.058693 4746 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/402c6597-ece6-42cc-a032-bfa23a44e8c9-utilities" (OuterVolumeSpecName: "utilities") pod "402c6597-ece6-42cc-a032-bfa23a44e8c9" (UID: "402c6597-ece6-42cc-a032-bfa23a44e8c9"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 28 21:21:11 crc kubenswrapper[4746]: I0128 21:21:11.061664 4746 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/402c6597-ece6-42cc-a032-bfa23a44e8c9-kube-api-access-h9qtn" (OuterVolumeSpecName: "kube-api-access-h9qtn") pod "402c6597-ece6-42cc-a032-bfa23a44e8c9" (UID: "402c6597-ece6-42cc-a032-bfa23a44e8c9"). InnerVolumeSpecName "kube-api-access-h9qtn". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 28 21:21:11 crc kubenswrapper[4746]: I0128 21:21:11.098106 4746 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/402c6597-ece6-42cc-a032-bfa23a44e8c9-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "402c6597-ece6-42cc-a032-bfa23a44e8c9" (UID: "402c6597-ece6-42cc-a032-bfa23a44e8c9"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 28 21:21:11 crc kubenswrapper[4746]: I0128 21:21:11.159227 4746 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-h9qtn\" (UniqueName: \"kubernetes.io/projected/402c6597-ece6-42cc-a032-bfa23a44e8c9-kube-api-access-h9qtn\") on node \"crc\" DevicePath \"\"" Jan 28 21:21:11 crc kubenswrapper[4746]: I0128 21:21:11.159265 4746 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/402c6597-ece6-42cc-a032-bfa23a44e8c9-utilities\") on node \"crc\" DevicePath \"\"" Jan 28 21:21:11 crc kubenswrapper[4746]: I0128 21:21:11.159278 4746 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/402c6597-ece6-42cc-a032-bfa23a44e8c9-catalog-content\") on node \"crc\" DevicePath \"\"" Jan 28 21:21:11 crc kubenswrapper[4746]: I0128 21:21:11.373749 4746 generic.go:334] "Generic (PLEG): container finished" podID="402c6597-ece6-42cc-a032-bfa23a44e8c9" containerID="fbcc91e8d09c58518503f2fc0511eff3fd9d3c6784a59739ec0b33447d45baa0" exitCode=0 Jan 28 21:21:11 crc kubenswrapper[4746]: I0128 21:21:11.373797 4746 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-fd9f5" event={"ID":"402c6597-ece6-42cc-a032-bfa23a44e8c9","Type":"ContainerDied","Data":"fbcc91e8d09c58518503f2fc0511eff3fd9d3c6784a59739ec0b33447d45baa0"} Jan 28 21:21:11 crc kubenswrapper[4746]: I0128 21:21:11.373827 4746 kubelet.go:2453] "SyncLoop (PLEG): event for 
pod" pod="openshift-marketplace/redhat-marketplace-fd9f5" event={"ID":"402c6597-ece6-42cc-a032-bfa23a44e8c9","Type":"ContainerDied","Data":"75a406030ef5f6b9f9957c29d074d0ce924bec648bd924c70009d0f754063878"} Jan 28 21:21:11 crc kubenswrapper[4746]: I0128 21:21:11.373835 4746 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-fd9f5" Jan 28 21:21:11 crc kubenswrapper[4746]: I0128 21:21:11.373847 4746 scope.go:117] "RemoveContainer" containerID="fbcc91e8d09c58518503f2fc0511eff3fd9d3c6784a59739ec0b33447d45baa0" Jan 28 21:21:11 crc kubenswrapper[4746]: I0128 21:21:11.398150 4746 scope.go:117] "RemoveContainer" containerID="b4ebd47646524a7ff88c121f79479dbe14d4964a22a47393891962bc6179f1e4" Jan 28 21:21:11 crc kubenswrapper[4746]: I0128 21:21:11.424243 4746 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-fd9f5"] Jan 28 21:21:11 crc kubenswrapper[4746]: I0128 21:21:11.424873 4746 scope.go:117] "RemoveContainer" containerID="a7f29f00db6a3b63de3541d226d926dda4a54708628cfe215a8168bbf3e8d2a9" Jan 28 21:21:11 crc kubenswrapper[4746]: I0128 21:21:11.433332 4746 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-marketplace-fd9f5"] Jan 28 21:21:11 crc kubenswrapper[4746]: I0128 21:21:11.496047 4746 scope.go:117] "RemoveContainer" containerID="fbcc91e8d09c58518503f2fc0511eff3fd9d3c6784a59739ec0b33447d45baa0" Jan 28 21:21:11 crc kubenswrapper[4746]: E0128 21:21:11.496848 4746 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"fbcc91e8d09c58518503f2fc0511eff3fd9d3c6784a59739ec0b33447d45baa0\": container with ID starting with fbcc91e8d09c58518503f2fc0511eff3fd9d3c6784a59739ec0b33447d45baa0 not found: ID does not exist" containerID="fbcc91e8d09c58518503f2fc0511eff3fd9d3c6784a59739ec0b33447d45baa0" Jan 28 21:21:11 crc kubenswrapper[4746]: I0128 21:21:11.496888 4746 
pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"fbcc91e8d09c58518503f2fc0511eff3fd9d3c6784a59739ec0b33447d45baa0"} err="failed to get container status \"fbcc91e8d09c58518503f2fc0511eff3fd9d3c6784a59739ec0b33447d45baa0\": rpc error: code = NotFound desc = could not find container \"fbcc91e8d09c58518503f2fc0511eff3fd9d3c6784a59739ec0b33447d45baa0\": container with ID starting with fbcc91e8d09c58518503f2fc0511eff3fd9d3c6784a59739ec0b33447d45baa0 not found: ID does not exist" Jan 28 21:21:11 crc kubenswrapper[4746]: I0128 21:21:11.496918 4746 scope.go:117] "RemoveContainer" containerID="b4ebd47646524a7ff88c121f79479dbe14d4964a22a47393891962bc6179f1e4" Jan 28 21:21:11 crc kubenswrapper[4746]: E0128 21:21:11.497503 4746 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"b4ebd47646524a7ff88c121f79479dbe14d4964a22a47393891962bc6179f1e4\": container with ID starting with b4ebd47646524a7ff88c121f79479dbe14d4964a22a47393891962bc6179f1e4 not found: ID does not exist" containerID="b4ebd47646524a7ff88c121f79479dbe14d4964a22a47393891962bc6179f1e4" Jan 28 21:21:11 crc kubenswrapper[4746]: I0128 21:21:11.497540 4746 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"b4ebd47646524a7ff88c121f79479dbe14d4964a22a47393891962bc6179f1e4"} err="failed to get container status \"b4ebd47646524a7ff88c121f79479dbe14d4964a22a47393891962bc6179f1e4\": rpc error: code = NotFound desc = could not find container \"b4ebd47646524a7ff88c121f79479dbe14d4964a22a47393891962bc6179f1e4\": container with ID starting with b4ebd47646524a7ff88c121f79479dbe14d4964a22a47393891962bc6179f1e4 not found: ID does not exist" Jan 28 21:21:11 crc kubenswrapper[4746]: I0128 21:21:11.497562 4746 scope.go:117] "RemoveContainer" containerID="a7f29f00db6a3b63de3541d226d926dda4a54708628cfe215a8168bbf3e8d2a9" Jan 28 21:21:11 crc kubenswrapper[4746]: E0128 
21:21:11.497932 4746 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"a7f29f00db6a3b63de3541d226d926dda4a54708628cfe215a8168bbf3e8d2a9\": container with ID starting with a7f29f00db6a3b63de3541d226d926dda4a54708628cfe215a8168bbf3e8d2a9 not found: ID does not exist" containerID="a7f29f00db6a3b63de3541d226d926dda4a54708628cfe215a8168bbf3e8d2a9" Jan 28 21:21:11 crc kubenswrapper[4746]: I0128 21:21:11.497966 4746 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"a7f29f00db6a3b63de3541d226d926dda4a54708628cfe215a8168bbf3e8d2a9"} err="failed to get container status \"a7f29f00db6a3b63de3541d226d926dda4a54708628cfe215a8168bbf3e8d2a9\": rpc error: code = NotFound desc = could not find container \"a7f29f00db6a3b63de3541d226d926dda4a54708628cfe215a8168bbf3e8d2a9\": container with ID starting with a7f29f00db6a3b63de3541d226d926dda4a54708628cfe215a8168bbf3e8d2a9 not found: ID does not exist" Jan 28 21:21:12 crc kubenswrapper[4746]: I0128 21:21:12.856764 4746 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="402c6597-ece6-42cc-a032-bfa23a44e8c9" path="/var/lib/kubelet/pods/402c6597-ece6-42cc-a032-bfa23a44e8c9/volumes" Jan 28 21:21:16 crc kubenswrapper[4746]: I0128 21:21:16.836332 4746 scope.go:117] "RemoveContainer" containerID="cc873ec6b8a0868869d7fd09d5b5eb7c385e0fac746adb1c9927b6d382729767" Jan 28 21:21:16 crc kubenswrapper[4746]: E0128 21:21:16.837285 4746 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-6wrnw_openshift-machine-config-operator(6dc8b546-9734-4082-b2b3-2bafe3f1564d)\"" pod="openshift-machine-config-operator/machine-config-daemon-6wrnw" podUID="6dc8b546-9734-4082-b2b3-2bafe3f1564d" Jan 28 21:21:24 crc kubenswrapper[4746]: I0128 21:21:24.522308 
4746 generic.go:334] "Generic (PLEG): container finished" podID="1d1f9f12-edab-459d-b9ac-2bb03644b752" containerID="477c224df459a92a798eeefb29eb1be6819ee638e5afd96608985a998ee69aee" exitCode=0 Jan 28 21:21:24 crc kubenswrapper[4746]: I0128 21:21:24.522426 4746 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-dpjnx" event={"ID":"1d1f9f12-edab-459d-b9ac-2bb03644b752","Type":"ContainerDied","Data":"477c224df459a92a798eeefb29eb1be6819ee638e5afd96608985a998ee69aee"} Jan 28 21:21:26 crc kubenswrapper[4746]: I0128 21:21:26.022390 4746 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-dpjnx" Jan 28 21:21:26 crc kubenswrapper[4746]: I0128 21:21:26.182725 4746 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"nova-migration-ssh-key-0\" (UniqueName: \"kubernetes.io/secret/1d1f9f12-edab-459d-b9ac-2bb03644b752-nova-migration-ssh-key-0\") pod \"1d1f9f12-edab-459d-b9ac-2bb03644b752\" (UID: \"1d1f9f12-edab-459d-b9ac-2bb03644b752\") " Jan 28 21:21:26 crc kubenswrapper[4746]: I0128 21:21:26.183170 4746 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"nova-cell1-compute-config-1\" (UniqueName: \"kubernetes.io/secret/1d1f9f12-edab-459d-b9ac-2bb03644b752-nova-cell1-compute-config-1\") pod \"1d1f9f12-edab-459d-b9ac-2bb03644b752\" (UID: \"1d1f9f12-edab-459d-b9ac-2bb03644b752\") " Jan 28 21:21:26 crc kubenswrapper[4746]: I0128 21:21:26.183283 4746 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"nova-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/1d1f9f12-edab-459d-b9ac-2bb03644b752-nova-combined-ca-bundle\") pod \"1d1f9f12-edab-459d-b9ac-2bb03644b752\" (UID: \"1d1f9f12-edab-459d-b9ac-2bb03644b752\") " Jan 28 21:21:26 crc kubenswrapper[4746]: I0128 21:21:26.183391 4746 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume 
\"nova-migration-ssh-key-1\" (UniqueName: \"kubernetes.io/secret/1d1f9f12-edab-459d-b9ac-2bb03644b752-nova-migration-ssh-key-1\") pod \"1d1f9f12-edab-459d-b9ac-2bb03644b752\" (UID: \"1d1f9f12-edab-459d-b9ac-2bb03644b752\") " Jan 28 21:21:26 crc kubenswrapper[4746]: I0128 21:21:26.183791 4746 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-66djq\" (UniqueName: \"kubernetes.io/projected/1d1f9f12-edab-459d-b9ac-2bb03644b752-kube-api-access-66djq\") pod \"1d1f9f12-edab-459d-b9ac-2bb03644b752\" (UID: \"1d1f9f12-edab-459d-b9ac-2bb03644b752\") " Jan 28 21:21:26 crc kubenswrapper[4746]: I0128 21:21:26.183886 4746 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"nova-cell1-compute-config-0\" (UniqueName: \"kubernetes.io/secret/1d1f9f12-edab-459d-b9ac-2bb03644b752-nova-cell1-compute-config-0\") pod \"1d1f9f12-edab-459d-b9ac-2bb03644b752\" (UID: \"1d1f9f12-edab-459d-b9ac-2bb03644b752\") " Jan 28 21:21:26 crc kubenswrapper[4746]: I0128 21:21:26.184019 4746 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"nova-extra-config-0\" (UniqueName: \"kubernetes.io/configmap/1d1f9f12-edab-459d-b9ac-2bb03644b752-nova-extra-config-0\") pod \"1d1f9f12-edab-459d-b9ac-2bb03644b752\" (UID: \"1d1f9f12-edab-459d-b9ac-2bb03644b752\") " Jan 28 21:21:26 crc kubenswrapper[4746]: I0128 21:21:26.184114 4746 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/1d1f9f12-edab-459d-b9ac-2bb03644b752-ssh-key-openstack-edpm-ipam\") pod \"1d1f9f12-edab-459d-b9ac-2bb03644b752\" (UID: \"1d1f9f12-edab-459d-b9ac-2bb03644b752\") " Jan 28 21:21:26 crc kubenswrapper[4746]: I0128 21:21:26.184390 4746 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/1d1f9f12-edab-459d-b9ac-2bb03644b752-inventory\") pod 
\"1d1f9f12-edab-459d-b9ac-2bb03644b752\" (UID: \"1d1f9f12-edab-459d-b9ac-2bb03644b752\") " Jan 28 21:21:26 crc kubenswrapper[4746]: I0128 21:21:26.188963 4746 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/1d1f9f12-edab-459d-b9ac-2bb03644b752-kube-api-access-66djq" (OuterVolumeSpecName: "kube-api-access-66djq") pod "1d1f9f12-edab-459d-b9ac-2bb03644b752" (UID: "1d1f9f12-edab-459d-b9ac-2bb03644b752"). InnerVolumeSpecName "kube-api-access-66djq". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 28 21:21:26 crc kubenswrapper[4746]: I0128 21:21:26.189496 4746 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/1d1f9f12-edab-459d-b9ac-2bb03644b752-nova-combined-ca-bundle" (OuterVolumeSpecName: "nova-combined-ca-bundle") pod "1d1f9f12-edab-459d-b9ac-2bb03644b752" (UID: "1d1f9f12-edab-459d-b9ac-2bb03644b752"). InnerVolumeSpecName "nova-combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 28 21:21:26 crc kubenswrapper[4746]: I0128 21:21:26.214594 4746 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/1d1f9f12-edab-459d-b9ac-2bb03644b752-nova-cell1-compute-config-0" (OuterVolumeSpecName: "nova-cell1-compute-config-0") pod "1d1f9f12-edab-459d-b9ac-2bb03644b752" (UID: "1d1f9f12-edab-459d-b9ac-2bb03644b752"). InnerVolumeSpecName "nova-cell1-compute-config-0". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 28 21:21:26 crc kubenswrapper[4746]: I0128 21:21:26.214929 4746 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/1d1f9f12-edab-459d-b9ac-2bb03644b752-ssh-key-openstack-edpm-ipam" (OuterVolumeSpecName: "ssh-key-openstack-edpm-ipam") pod "1d1f9f12-edab-459d-b9ac-2bb03644b752" (UID: "1d1f9f12-edab-459d-b9ac-2bb03644b752"). InnerVolumeSpecName "ssh-key-openstack-edpm-ipam". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 28 21:21:26 crc kubenswrapper[4746]: I0128 21:21:26.216242 4746 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/1d1f9f12-edab-459d-b9ac-2bb03644b752-nova-migration-ssh-key-1" (OuterVolumeSpecName: "nova-migration-ssh-key-1") pod "1d1f9f12-edab-459d-b9ac-2bb03644b752" (UID: "1d1f9f12-edab-459d-b9ac-2bb03644b752"). InnerVolumeSpecName "nova-migration-ssh-key-1". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 28 21:21:26 crc kubenswrapper[4746]: I0128 21:21:26.217308 4746 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/1d1f9f12-edab-459d-b9ac-2bb03644b752-nova-migration-ssh-key-0" (OuterVolumeSpecName: "nova-migration-ssh-key-0") pod "1d1f9f12-edab-459d-b9ac-2bb03644b752" (UID: "1d1f9f12-edab-459d-b9ac-2bb03644b752"). InnerVolumeSpecName "nova-migration-ssh-key-0". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 28 21:21:26 crc kubenswrapper[4746]: I0128 21:21:26.220695 4746 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/1d1f9f12-edab-459d-b9ac-2bb03644b752-inventory" (OuterVolumeSpecName: "inventory") pod "1d1f9f12-edab-459d-b9ac-2bb03644b752" (UID: "1d1f9f12-edab-459d-b9ac-2bb03644b752"). InnerVolumeSpecName "inventory". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 28 21:21:26 crc kubenswrapper[4746]: I0128 21:21:26.221563 4746 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/1d1f9f12-edab-459d-b9ac-2bb03644b752-nova-extra-config-0" (OuterVolumeSpecName: "nova-extra-config-0") pod "1d1f9f12-edab-459d-b9ac-2bb03644b752" (UID: "1d1f9f12-edab-459d-b9ac-2bb03644b752"). InnerVolumeSpecName "nova-extra-config-0". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 28 21:21:26 crc kubenswrapper[4746]: I0128 21:21:26.225716 4746 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/1d1f9f12-edab-459d-b9ac-2bb03644b752-nova-cell1-compute-config-1" (OuterVolumeSpecName: "nova-cell1-compute-config-1") pod "1d1f9f12-edab-459d-b9ac-2bb03644b752" (UID: "1d1f9f12-edab-459d-b9ac-2bb03644b752"). InnerVolumeSpecName "nova-cell1-compute-config-1". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 28 21:21:26 crc kubenswrapper[4746]: I0128 21:21:26.286692 4746 reconciler_common.go:293] "Volume detached for volume \"nova-extra-config-0\" (UniqueName: \"kubernetes.io/configmap/1d1f9f12-edab-459d-b9ac-2bb03644b752-nova-extra-config-0\") on node \"crc\" DevicePath \"\"" Jan 28 21:21:26 crc kubenswrapper[4746]: I0128 21:21:26.286722 4746 reconciler_common.go:293] "Volume detached for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/1d1f9f12-edab-459d-b9ac-2bb03644b752-ssh-key-openstack-edpm-ipam\") on node \"crc\" DevicePath \"\"" Jan 28 21:21:26 crc kubenswrapper[4746]: I0128 21:21:26.286733 4746 reconciler_common.go:293] "Volume detached for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/1d1f9f12-edab-459d-b9ac-2bb03644b752-inventory\") on node \"crc\" DevicePath \"\"" Jan 28 21:21:26 crc kubenswrapper[4746]: I0128 21:21:26.286743 4746 reconciler_common.go:293] "Volume detached for volume \"nova-migration-ssh-key-0\" (UniqueName: \"kubernetes.io/secret/1d1f9f12-edab-459d-b9ac-2bb03644b752-nova-migration-ssh-key-0\") on node \"crc\" DevicePath \"\"" Jan 28 21:21:26 crc kubenswrapper[4746]: I0128 21:21:26.286752 4746 reconciler_common.go:293] "Volume detached for volume \"nova-cell1-compute-config-1\" (UniqueName: \"kubernetes.io/secret/1d1f9f12-edab-459d-b9ac-2bb03644b752-nova-cell1-compute-config-1\") on node \"crc\" DevicePath \"\"" Jan 28 21:21:26 crc kubenswrapper[4746]: I0128 21:21:26.286761 4746 
reconciler_common.go:293] "Volume detached for volume \"nova-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/1d1f9f12-edab-459d-b9ac-2bb03644b752-nova-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Jan 28 21:21:26 crc kubenswrapper[4746]: I0128 21:21:26.286770 4746 reconciler_common.go:293] "Volume detached for volume \"nova-migration-ssh-key-1\" (UniqueName: \"kubernetes.io/secret/1d1f9f12-edab-459d-b9ac-2bb03644b752-nova-migration-ssh-key-1\") on node \"crc\" DevicePath \"\"" Jan 28 21:21:26 crc kubenswrapper[4746]: I0128 21:21:26.286779 4746 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-66djq\" (UniqueName: \"kubernetes.io/projected/1d1f9f12-edab-459d-b9ac-2bb03644b752-kube-api-access-66djq\") on node \"crc\" DevicePath \"\"" Jan 28 21:21:26 crc kubenswrapper[4746]: I0128 21:21:26.286787 4746 reconciler_common.go:293] "Volume detached for volume \"nova-cell1-compute-config-0\" (UniqueName: \"kubernetes.io/secret/1d1f9f12-edab-459d-b9ac-2bb03644b752-nova-cell1-compute-config-0\") on node \"crc\" DevicePath \"\"" Jan 28 21:21:26 crc kubenswrapper[4746]: I0128 21:21:26.542973 4746 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-dpjnx" event={"ID":"1d1f9f12-edab-459d-b9ac-2bb03644b752","Type":"ContainerDied","Data":"952b02973177849b9f0c3cf1306d21191f87d9cea07616fdf9077f7f67f10696"} Jan 28 21:21:26 crc kubenswrapper[4746]: I0128 21:21:26.543017 4746 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="952b02973177849b9f0c3cf1306d21191f87d9cea07616fdf9077f7f67f10696" Jan 28 21:21:26 crc kubenswrapper[4746]: I0128 21:21:26.543073 4746 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-dpjnx" Jan 28 21:21:26 crc kubenswrapper[4746]: I0128 21:21:26.644850 4746 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/telemetry-edpm-deployment-openstack-edpm-ipam-6jfqt"] Jan 28 21:21:26 crc kubenswrapper[4746]: E0128 21:21:26.646468 4746 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="402c6597-ece6-42cc-a032-bfa23a44e8c9" containerName="extract-content" Jan 28 21:21:26 crc kubenswrapper[4746]: I0128 21:21:26.646563 4746 state_mem.go:107] "Deleted CPUSet assignment" podUID="402c6597-ece6-42cc-a032-bfa23a44e8c9" containerName="extract-content" Jan 28 21:21:26 crc kubenswrapper[4746]: E0128 21:21:26.646650 4746 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="402c6597-ece6-42cc-a032-bfa23a44e8c9" containerName="registry-server" Jan 28 21:21:26 crc kubenswrapper[4746]: I0128 21:21:26.646710 4746 state_mem.go:107] "Deleted CPUSet assignment" podUID="402c6597-ece6-42cc-a032-bfa23a44e8c9" containerName="registry-server" Jan 28 21:21:26 crc kubenswrapper[4746]: E0128 21:21:26.646782 4746 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="402c6597-ece6-42cc-a032-bfa23a44e8c9" containerName="extract-utilities" Jan 28 21:21:26 crc kubenswrapper[4746]: I0128 21:21:26.646833 4746 state_mem.go:107] "Deleted CPUSet assignment" podUID="402c6597-ece6-42cc-a032-bfa23a44e8c9" containerName="extract-utilities" Jan 28 21:21:26 crc kubenswrapper[4746]: E0128 21:21:26.646904 4746 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="1d1f9f12-edab-459d-b9ac-2bb03644b752" containerName="nova-edpm-deployment-openstack-edpm-ipam" Jan 28 21:21:26 crc kubenswrapper[4746]: I0128 21:21:26.646961 4746 state_mem.go:107] "Deleted CPUSet assignment" podUID="1d1f9f12-edab-459d-b9ac-2bb03644b752" containerName="nova-edpm-deployment-openstack-edpm-ipam" Jan 28 21:21:26 crc kubenswrapper[4746]: I0128 21:21:26.647438 4746 
memory_manager.go:354] "RemoveStaleState removing state" podUID="402c6597-ece6-42cc-a032-bfa23a44e8c9" containerName="registry-server" Jan 28 21:21:26 crc kubenswrapper[4746]: I0128 21:21:26.647550 4746 memory_manager.go:354] "RemoveStaleState removing state" podUID="1d1f9f12-edab-459d-b9ac-2bb03644b752" containerName="nova-edpm-deployment-openstack-edpm-ipam" Jan 28 21:21:26 crc kubenswrapper[4746]: I0128 21:21:26.648707 4746 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/telemetry-edpm-deployment-openstack-edpm-ipam-6jfqt" Jan 28 21:21:26 crc kubenswrapper[4746]: I0128 21:21:26.667736 4746 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-edpm-ipam-dockercfg-xb8vv" Jan 28 21:21:26 crc kubenswrapper[4746]: I0128 21:21:26.667968 4746 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-aee-default-env" Jan 28 21:21:26 crc kubenswrapper[4746]: I0128 21:21:26.668119 4746 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplane-ansible-ssh-private-key-secret" Jan 28 21:21:26 crc kubenswrapper[4746]: I0128 21:21:26.669417 4746 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceilometer-compute-config-data" Jan 28 21:21:26 crc kubenswrapper[4746]: I0128 21:21:26.669747 4746 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplanenodeset-openstack-edpm-ipam" Jan 28 21:21:26 crc kubenswrapper[4746]: I0128 21:21:26.694868 4746 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/telemetry-edpm-deployment-openstack-edpm-ipam-6jfqt"] Jan 28 21:21:26 crc kubenswrapper[4746]: I0128 21:21:26.796201 4746 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"telemetry-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/9c46ddb7-5815-475e-b798-06a7fee944c8-telemetry-combined-ca-bundle\") pod 
\"telemetry-edpm-deployment-openstack-edpm-ipam-6jfqt\" (UID: \"9c46ddb7-5815-475e-b798-06a7fee944c8\") " pod="openstack/telemetry-edpm-deployment-openstack-edpm-ipam-6jfqt" Jan 28 21:21:26 crc kubenswrapper[4746]: I0128 21:21:26.796461 4746 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-btwk5\" (UniqueName: \"kubernetes.io/projected/9c46ddb7-5815-475e-b798-06a7fee944c8-kube-api-access-btwk5\") pod \"telemetry-edpm-deployment-openstack-edpm-ipam-6jfqt\" (UID: \"9c46ddb7-5815-475e-b798-06a7fee944c8\") " pod="openstack/telemetry-edpm-deployment-openstack-edpm-ipam-6jfqt" Jan 28 21:21:26 crc kubenswrapper[4746]: I0128 21:21:26.796630 4746 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ceilometer-compute-config-data-2\" (UniqueName: \"kubernetes.io/secret/9c46ddb7-5815-475e-b798-06a7fee944c8-ceilometer-compute-config-data-2\") pod \"telemetry-edpm-deployment-openstack-edpm-ipam-6jfqt\" (UID: \"9c46ddb7-5815-475e-b798-06a7fee944c8\") " pod="openstack/telemetry-edpm-deployment-openstack-edpm-ipam-6jfqt" Jan 28 21:21:26 crc kubenswrapper[4746]: I0128 21:21:26.796727 4746 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/9c46ddb7-5815-475e-b798-06a7fee944c8-inventory\") pod \"telemetry-edpm-deployment-openstack-edpm-ipam-6jfqt\" (UID: \"9c46ddb7-5815-475e-b798-06a7fee944c8\") " pod="openstack/telemetry-edpm-deployment-openstack-edpm-ipam-6jfqt" Jan 28 21:21:26 crc kubenswrapper[4746]: I0128 21:21:26.796827 4746 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ceilometer-compute-config-data-0\" (UniqueName: \"kubernetes.io/secret/9c46ddb7-5815-475e-b798-06a7fee944c8-ceilometer-compute-config-data-0\") pod \"telemetry-edpm-deployment-openstack-edpm-ipam-6jfqt\" (UID: \"9c46ddb7-5815-475e-b798-06a7fee944c8\") " 
pod="openstack/telemetry-edpm-deployment-openstack-edpm-ipam-6jfqt" Jan 28 21:21:26 crc kubenswrapper[4746]: I0128 21:21:26.796920 4746 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/9c46ddb7-5815-475e-b798-06a7fee944c8-ssh-key-openstack-edpm-ipam\") pod \"telemetry-edpm-deployment-openstack-edpm-ipam-6jfqt\" (UID: \"9c46ddb7-5815-475e-b798-06a7fee944c8\") " pod="openstack/telemetry-edpm-deployment-openstack-edpm-ipam-6jfqt" Jan 28 21:21:26 crc kubenswrapper[4746]: I0128 21:21:26.797022 4746 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ceilometer-compute-config-data-1\" (UniqueName: \"kubernetes.io/secret/9c46ddb7-5815-475e-b798-06a7fee944c8-ceilometer-compute-config-data-1\") pod \"telemetry-edpm-deployment-openstack-edpm-ipam-6jfqt\" (UID: \"9c46ddb7-5815-475e-b798-06a7fee944c8\") " pod="openstack/telemetry-edpm-deployment-openstack-edpm-ipam-6jfqt" Jan 28 21:21:26 crc kubenswrapper[4746]: I0128 21:21:26.898768 4746 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"telemetry-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/9c46ddb7-5815-475e-b798-06a7fee944c8-telemetry-combined-ca-bundle\") pod \"telemetry-edpm-deployment-openstack-edpm-ipam-6jfqt\" (UID: \"9c46ddb7-5815-475e-b798-06a7fee944c8\") " pod="openstack/telemetry-edpm-deployment-openstack-edpm-ipam-6jfqt" Jan 28 21:21:26 crc kubenswrapper[4746]: I0128 21:21:26.898825 4746 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-btwk5\" (UniqueName: \"kubernetes.io/projected/9c46ddb7-5815-475e-b798-06a7fee944c8-kube-api-access-btwk5\") pod \"telemetry-edpm-deployment-openstack-edpm-ipam-6jfqt\" (UID: \"9c46ddb7-5815-475e-b798-06a7fee944c8\") " pod="openstack/telemetry-edpm-deployment-openstack-edpm-ipam-6jfqt" Jan 28 21:21:26 crc kubenswrapper[4746]: I0128 
21:21:26.898939 4746 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ceilometer-compute-config-data-2\" (UniqueName: \"kubernetes.io/secret/9c46ddb7-5815-475e-b798-06a7fee944c8-ceilometer-compute-config-data-2\") pod \"telemetry-edpm-deployment-openstack-edpm-ipam-6jfqt\" (UID: \"9c46ddb7-5815-475e-b798-06a7fee944c8\") " pod="openstack/telemetry-edpm-deployment-openstack-edpm-ipam-6jfqt" Jan 28 21:21:26 crc kubenswrapper[4746]: I0128 21:21:26.898999 4746 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/9c46ddb7-5815-475e-b798-06a7fee944c8-inventory\") pod \"telemetry-edpm-deployment-openstack-edpm-ipam-6jfqt\" (UID: \"9c46ddb7-5815-475e-b798-06a7fee944c8\") " pod="openstack/telemetry-edpm-deployment-openstack-edpm-ipam-6jfqt" Jan 28 21:21:26 crc kubenswrapper[4746]: I0128 21:21:26.899040 4746 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ceilometer-compute-config-data-0\" (UniqueName: \"kubernetes.io/secret/9c46ddb7-5815-475e-b798-06a7fee944c8-ceilometer-compute-config-data-0\") pod \"telemetry-edpm-deployment-openstack-edpm-ipam-6jfqt\" (UID: \"9c46ddb7-5815-475e-b798-06a7fee944c8\") " pod="openstack/telemetry-edpm-deployment-openstack-edpm-ipam-6jfqt" Jan 28 21:21:26 crc kubenswrapper[4746]: I0128 21:21:26.899111 4746 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/9c46ddb7-5815-475e-b798-06a7fee944c8-ssh-key-openstack-edpm-ipam\") pod \"telemetry-edpm-deployment-openstack-edpm-ipam-6jfqt\" (UID: \"9c46ddb7-5815-475e-b798-06a7fee944c8\") " pod="openstack/telemetry-edpm-deployment-openstack-edpm-ipam-6jfqt" Jan 28 21:21:26 crc kubenswrapper[4746]: I0128 21:21:26.899150 4746 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ceilometer-compute-config-data-1\" (UniqueName: 
\"kubernetes.io/secret/9c46ddb7-5815-475e-b798-06a7fee944c8-ceilometer-compute-config-data-1\") pod \"telemetry-edpm-deployment-openstack-edpm-ipam-6jfqt\" (UID: \"9c46ddb7-5815-475e-b798-06a7fee944c8\") " pod="openstack/telemetry-edpm-deployment-openstack-edpm-ipam-6jfqt" Jan 28 21:21:26 crc kubenswrapper[4746]: I0128 21:21:26.903018 4746 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ceilometer-compute-config-data-1\" (UniqueName: \"kubernetes.io/secret/9c46ddb7-5815-475e-b798-06a7fee944c8-ceilometer-compute-config-data-1\") pod \"telemetry-edpm-deployment-openstack-edpm-ipam-6jfqt\" (UID: \"9c46ddb7-5815-475e-b798-06a7fee944c8\") " pod="openstack/telemetry-edpm-deployment-openstack-edpm-ipam-6jfqt" Jan 28 21:21:26 crc kubenswrapper[4746]: I0128 21:21:26.903631 4746 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"telemetry-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/9c46ddb7-5815-475e-b798-06a7fee944c8-telemetry-combined-ca-bundle\") pod \"telemetry-edpm-deployment-openstack-edpm-ipam-6jfqt\" (UID: \"9c46ddb7-5815-475e-b798-06a7fee944c8\") " pod="openstack/telemetry-edpm-deployment-openstack-edpm-ipam-6jfqt" Jan 28 21:21:26 crc kubenswrapper[4746]: I0128 21:21:26.903767 4746 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ceilometer-compute-config-data-2\" (UniqueName: \"kubernetes.io/secret/9c46ddb7-5815-475e-b798-06a7fee944c8-ceilometer-compute-config-data-2\") pod \"telemetry-edpm-deployment-openstack-edpm-ipam-6jfqt\" (UID: \"9c46ddb7-5815-475e-b798-06a7fee944c8\") " pod="openstack/telemetry-edpm-deployment-openstack-edpm-ipam-6jfqt" Jan 28 21:21:26 crc kubenswrapper[4746]: I0128 21:21:26.906476 4746 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ceilometer-compute-config-data-0\" (UniqueName: \"kubernetes.io/secret/9c46ddb7-5815-475e-b798-06a7fee944c8-ceilometer-compute-config-data-0\") pod \"telemetry-edpm-deployment-openstack-edpm-ipam-6jfqt\" (UID: 
\"9c46ddb7-5815-475e-b798-06a7fee944c8\") " pod="openstack/telemetry-edpm-deployment-openstack-edpm-ipam-6jfqt" Jan 28 21:21:26 crc kubenswrapper[4746]: I0128 21:21:26.906738 4746 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/9c46ddb7-5815-475e-b798-06a7fee944c8-inventory\") pod \"telemetry-edpm-deployment-openstack-edpm-ipam-6jfqt\" (UID: \"9c46ddb7-5815-475e-b798-06a7fee944c8\") " pod="openstack/telemetry-edpm-deployment-openstack-edpm-ipam-6jfqt" Jan 28 21:21:26 crc kubenswrapper[4746]: I0128 21:21:26.914321 4746 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/9c46ddb7-5815-475e-b798-06a7fee944c8-ssh-key-openstack-edpm-ipam\") pod \"telemetry-edpm-deployment-openstack-edpm-ipam-6jfqt\" (UID: \"9c46ddb7-5815-475e-b798-06a7fee944c8\") " pod="openstack/telemetry-edpm-deployment-openstack-edpm-ipam-6jfqt" Jan 28 21:21:26 crc kubenswrapper[4746]: I0128 21:21:26.933549 4746 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-btwk5\" (UniqueName: \"kubernetes.io/projected/9c46ddb7-5815-475e-b798-06a7fee944c8-kube-api-access-btwk5\") pod \"telemetry-edpm-deployment-openstack-edpm-ipam-6jfqt\" (UID: \"9c46ddb7-5815-475e-b798-06a7fee944c8\") " pod="openstack/telemetry-edpm-deployment-openstack-edpm-ipam-6jfqt" Jan 28 21:21:26 crc kubenswrapper[4746]: I0128 21:21:26.968857 4746 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/telemetry-edpm-deployment-openstack-edpm-ipam-6jfqt" Jan 28 21:21:27 crc kubenswrapper[4746]: I0128 21:21:27.519686 4746 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/telemetry-edpm-deployment-openstack-edpm-ipam-6jfqt"] Jan 28 21:21:27 crc kubenswrapper[4746]: I0128 21:21:27.557266 4746 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/telemetry-edpm-deployment-openstack-edpm-ipam-6jfqt" event={"ID":"9c46ddb7-5815-475e-b798-06a7fee944c8","Type":"ContainerStarted","Data":"7d3ffb83535d398bcf8919156c24ee1f30e0dd39f5a708c1561192fc35f685b1"} Jan 28 21:21:27 crc kubenswrapper[4746]: I0128 21:21:27.836845 4746 scope.go:117] "RemoveContainer" containerID="cc873ec6b8a0868869d7fd09d5b5eb7c385e0fac746adb1c9927b6d382729767" Jan 28 21:21:27 crc kubenswrapper[4746]: E0128 21:21:27.837403 4746 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-6wrnw_openshift-machine-config-operator(6dc8b546-9734-4082-b2b3-2bafe3f1564d)\"" pod="openshift-machine-config-operator/machine-config-daemon-6wrnw" podUID="6dc8b546-9734-4082-b2b3-2bafe3f1564d" Jan 28 21:21:28 crc kubenswrapper[4746]: I0128 21:21:28.576183 4746 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/telemetry-edpm-deployment-openstack-edpm-ipam-6jfqt" event={"ID":"9c46ddb7-5815-475e-b798-06a7fee944c8","Type":"ContainerStarted","Data":"4fe0d25a5a27ffda47995b6411fcca00fe410b27bf20342de4b899f1698f2c66"} Jan 28 21:21:28 crc kubenswrapper[4746]: I0128 21:21:28.603975 4746 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/telemetry-edpm-deployment-openstack-edpm-ipam-6jfqt" podStartSLOduration=2.08140177 podStartE2EDuration="2.603955684s" podCreationTimestamp="2026-01-28 21:21:26 +0000 UTC" firstStartedPulling="2026-01-28 
21:21:27.519658663 +0000 UTC m=+2515.475845027" lastFinishedPulling="2026-01-28 21:21:28.042212577 +0000 UTC m=+2515.998398941" observedRunningTime="2026-01-28 21:21:28.597872569 +0000 UTC m=+2516.554058933" watchObservedRunningTime="2026-01-28 21:21:28.603955684 +0000 UTC m=+2516.560142038" Jan 28 21:21:42 crc kubenswrapper[4746]: I0128 21:21:42.843958 4746 scope.go:117] "RemoveContainer" containerID="cc873ec6b8a0868869d7fd09d5b5eb7c385e0fac746adb1c9927b6d382729767" Jan 28 21:21:42 crc kubenswrapper[4746]: E0128 21:21:42.844967 4746 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-6wrnw_openshift-machine-config-operator(6dc8b546-9734-4082-b2b3-2bafe3f1564d)\"" pod="openshift-machine-config-operator/machine-config-daemon-6wrnw" podUID="6dc8b546-9734-4082-b2b3-2bafe3f1564d" Jan 28 21:21:55 crc kubenswrapper[4746]: I0128 21:21:55.836223 4746 scope.go:117] "RemoveContainer" containerID="cc873ec6b8a0868869d7fd09d5b5eb7c385e0fac746adb1c9927b6d382729767" Jan 28 21:21:55 crc kubenswrapper[4746]: E0128 21:21:55.837465 4746 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-6wrnw_openshift-machine-config-operator(6dc8b546-9734-4082-b2b3-2bafe3f1564d)\"" pod="openshift-machine-config-operator/machine-config-daemon-6wrnw" podUID="6dc8b546-9734-4082-b2b3-2bafe3f1564d" Jan 28 21:22:07 crc kubenswrapper[4746]: I0128 21:22:07.836364 4746 scope.go:117] "RemoveContainer" containerID="cc873ec6b8a0868869d7fd09d5b5eb7c385e0fac746adb1c9927b6d382729767" Jan 28 21:22:07 crc kubenswrapper[4746]: E0128 21:22:07.837920 4746 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" 
with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-6wrnw_openshift-machine-config-operator(6dc8b546-9734-4082-b2b3-2bafe3f1564d)\"" pod="openshift-machine-config-operator/machine-config-daemon-6wrnw" podUID="6dc8b546-9734-4082-b2b3-2bafe3f1564d" Jan 28 21:22:18 crc kubenswrapper[4746]: I0128 21:22:18.835737 4746 scope.go:117] "RemoveContainer" containerID="cc873ec6b8a0868869d7fd09d5b5eb7c385e0fac746adb1c9927b6d382729767" Jan 28 21:22:18 crc kubenswrapper[4746]: E0128 21:22:18.837941 4746 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-6wrnw_openshift-machine-config-operator(6dc8b546-9734-4082-b2b3-2bafe3f1564d)\"" pod="openshift-machine-config-operator/machine-config-daemon-6wrnw" podUID="6dc8b546-9734-4082-b2b3-2bafe3f1564d" Jan 28 21:22:32 crc kubenswrapper[4746]: I0128 21:22:32.845784 4746 scope.go:117] "RemoveContainer" containerID="cc873ec6b8a0868869d7fd09d5b5eb7c385e0fac746adb1c9927b6d382729767" Jan 28 21:22:32 crc kubenswrapper[4746]: E0128 21:22:32.846757 4746 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-6wrnw_openshift-machine-config-operator(6dc8b546-9734-4082-b2b3-2bafe3f1564d)\"" pod="openshift-machine-config-operator/machine-config-daemon-6wrnw" podUID="6dc8b546-9734-4082-b2b3-2bafe3f1564d" Jan 28 21:22:45 crc kubenswrapper[4746]: I0128 21:22:45.836734 4746 scope.go:117] "RemoveContainer" containerID="cc873ec6b8a0868869d7fd09d5b5eb7c385e0fac746adb1c9927b6d382729767" Jan 28 21:22:45 crc kubenswrapper[4746]: E0128 21:22:45.837675 4746 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for 
\"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-6wrnw_openshift-machine-config-operator(6dc8b546-9734-4082-b2b3-2bafe3f1564d)\"" pod="openshift-machine-config-operator/machine-config-daemon-6wrnw" podUID="6dc8b546-9734-4082-b2b3-2bafe3f1564d" Jan 28 21:22:57 crc kubenswrapper[4746]: I0128 21:22:57.836737 4746 scope.go:117] "RemoveContainer" containerID="cc873ec6b8a0868869d7fd09d5b5eb7c385e0fac746adb1c9927b6d382729767" Jan 28 21:22:57 crc kubenswrapper[4746]: E0128 21:22:57.837577 4746 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-6wrnw_openshift-machine-config-operator(6dc8b546-9734-4082-b2b3-2bafe3f1564d)\"" pod="openshift-machine-config-operator/machine-config-daemon-6wrnw" podUID="6dc8b546-9734-4082-b2b3-2bafe3f1564d" Jan 28 21:23:09 crc kubenswrapper[4746]: I0128 21:23:09.836566 4746 scope.go:117] "RemoveContainer" containerID="cc873ec6b8a0868869d7fd09d5b5eb7c385e0fac746adb1c9927b6d382729767" Jan 28 21:23:09 crc kubenswrapper[4746]: E0128 21:23:09.837986 4746 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-6wrnw_openshift-machine-config-operator(6dc8b546-9734-4082-b2b3-2bafe3f1564d)\"" pod="openshift-machine-config-operator/machine-config-daemon-6wrnw" podUID="6dc8b546-9734-4082-b2b3-2bafe3f1564d" Jan 28 21:23:23 crc kubenswrapper[4746]: I0128 21:23:23.837210 4746 scope.go:117] "RemoveContainer" containerID="cc873ec6b8a0868869d7fd09d5b5eb7c385e0fac746adb1c9927b6d382729767" Jan 28 21:23:23 crc kubenswrapper[4746]: E0128 21:23:23.838019 4746 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to 
\"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-6wrnw_openshift-machine-config-operator(6dc8b546-9734-4082-b2b3-2bafe3f1564d)\"" pod="openshift-machine-config-operator/machine-config-daemon-6wrnw" podUID="6dc8b546-9734-4082-b2b3-2bafe3f1564d" Jan 28 21:23:26 crc kubenswrapper[4746]: I0128 21:23:26.423967 4746 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/community-operators-qk2z6"] Jan 28 21:23:26 crc kubenswrapper[4746]: I0128 21:23:26.429955 4746 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-qk2z6" Jan 28 21:23:26 crc kubenswrapper[4746]: I0128 21:23:26.436958 4746 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-qk2z6"] Jan 28 21:23:26 crc kubenswrapper[4746]: I0128 21:23:26.571244 4746 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/af8523bf-0fc0-4e0c-b163-b2bf3f2b9a1e-utilities\") pod \"community-operators-qk2z6\" (UID: \"af8523bf-0fc0-4e0c-b163-b2bf3f2b9a1e\") " pod="openshift-marketplace/community-operators-qk2z6" Jan 28 21:23:26 crc kubenswrapper[4746]: I0128 21:23:26.571392 4746 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-wrn42\" (UniqueName: \"kubernetes.io/projected/af8523bf-0fc0-4e0c-b163-b2bf3f2b9a1e-kube-api-access-wrn42\") pod \"community-operators-qk2z6\" (UID: \"af8523bf-0fc0-4e0c-b163-b2bf3f2b9a1e\") " pod="openshift-marketplace/community-operators-qk2z6" Jan 28 21:23:26 crc kubenswrapper[4746]: I0128 21:23:26.571653 4746 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: 
\"kubernetes.io/empty-dir/af8523bf-0fc0-4e0c-b163-b2bf3f2b9a1e-catalog-content\") pod \"community-operators-qk2z6\" (UID: \"af8523bf-0fc0-4e0c-b163-b2bf3f2b9a1e\") " pod="openshift-marketplace/community-operators-qk2z6" Jan 28 21:23:26 crc kubenswrapper[4746]: I0128 21:23:26.612129 4746 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/certified-operators-nw9n4"] Jan 28 21:23:26 crc kubenswrapper[4746]: I0128 21:23:26.615134 4746 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-nw9n4" Jan 28 21:23:26 crc kubenswrapper[4746]: I0128 21:23:26.646248 4746 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-nw9n4"] Jan 28 21:23:26 crc kubenswrapper[4746]: I0128 21:23:26.675304 4746 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-wrn42\" (UniqueName: \"kubernetes.io/projected/af8523bf-0fc0-4e0c-b163-b2bf3f2b9a1e-kube-api-access-wrn42\") pod \"community-operators-qk2z6\" (UID: \"af8523bf-0fc0-4e0c-b163-b2bf3f2b9a1e\") " pod="openshift-marketplace/community-operators-qk2z6" Jan 28 21:23:26 crc kubenswrapper[4746]: I0128 21:23:26.675433 4746 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/af8523bf-0fc0-4e0c-b163-b2bf3f2b9a1e-catalog-content\") pod \"community-operators-qk2z6\" (UID: \"af8523bf-0fc0-4e0c-b163-b2bf3f2b9a1e\") " pod="openshift-marketplace/community-operators-qk2z6" Jan 28 21:23:26 crc kubenswrapper[4746]: I0128 21:23:26.675459 4746 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/af8523bf-0fc0-4e0c-b163-b2bf3f2b9a1e-utilities\") pod \"community-operators-qk2z6\" (UID: \"af8523bf-0fc0-4e0c-b163-b2bf3f2b9a1e\") " pod="openshift-marketplace/community-operators-qk2z6" Jan 28 21:23:26 crc kubenswrapper[4746]: I0128 
21:23:26.675878 4746 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/af8523bf-0fc0-4e0c-b163-b2bf3f2b9a1e-utilities\") pod \"community-operators-qk2z6\" (UID: \"af8523bf-0fc0-4e0c-b163-b2bf3f2b9a1e\") " pod="openshift-marketplace/community-operators-qk2z6" Jan 28 21:23:26 crc kubenswrapper[4746]: I0128 21:23:26.676356 4746 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/af8523bf-0fc0-4e0c-b163-b2bf3f2b9a1e-catalog-content\") pod \"community-operators-qk2z6\" (UID: \"af8523bf-0fc0-4e0c-b163-b2bf3f2b9a1e\") " pod="openshift-marketplace/community-operators-qk2z6" Jan 28 21:23:26 crc kubenswrapper[4746]: I0128 21:23:26.696732 4746 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-wrn42\" (UniqueName: \"kubernetes.io/projected/af8523bf-0fc0-4e0c-b163-b2bf3f2b9a1e-kube-api-access-wrn42\") pod \"community-operators-qk2z6\" (UID: \"af8523bf-0fc0-4e0c-b163-b2bf3f2b9a1e\") " pod="openshift-marketplace/community-operators-qk2z6" Jan 28 21:23:26 crc kubenswrapper[4746]: I0128 21:23:26.777508 4746 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-x6hbv\" (UniqueName: \"kubernetes.io/projected/c964aa0f-5f83-4b77-9584-91f94b0e34e9-kube-api-access-x6hbv\") pod \"certified-operators-nw9n4\" (UID: \"c964aa0f-5f83-4b77-9584-91f94b0e34e9\") " pod="openshift-marketplace/certified-operators-nw9n4" Jan 28 21:23:26 crc kubenswrapper[4746]: I0128 21:23:26.777586 4746 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/c964aa0f-5f83-4b77-9584-91f94b0e34e9-utilities\") pod \"certified-operators-nw9n4\" (UID: \"c964aa0f-5f83-4b77-9584-91f94b0e34e9\") " pod="openshift-marketplace/certified-operators-nw9n4" Jan 28 21:23:26 crc kubenswrapper[4746]: I0128 
21:23:26.777660 4746 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/c964aa0f-5f83-4b77-9584-91f94b0e34e9-catalog-content\") pod \"certified-operators-nw9n4\" (UID: \"c964aa0f-5f83-4b77-9584-91f94b0e34e9\") " pod="openshift-marketplace/certified-operators-nw9n4" Jan 28 21:23:26 crc kubenswrapper[4746]: I0128 21:23:26.785910 4746 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-qk2z6" Jan 28 21:23:26 crc kubenswrapper[4746]: I0128 21:23:26.899027 4746 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-x6hbv\" (UniqueName: \"kubernetes.io/projected/c964aa0f-5f83-4b77-9584-91f94b0e34e9-kube-api-access-x6hbv\") pod \"certified-operators-nw9n4\" (UID: \"c964aa0f-5f83-4b77-9584-91f94b0e34e9\") " pod="openshift-marketplace/certified-operators-nw9n4" Jan 28 21:23:26 crc kubenswrapper[4746]: I0128 21:23:26.902389 4746 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/c964aa0f-5f83-4b77-9584-91f94b0e34e9-utilities\") pod \"certified-operators-nw9n4\" (UID: \"c964aa0f-5f83-4b77-9584-91f94b0e34e9\") " pod="openshift-marketplace/certified-operators-nw9n4" Jan 28 21:23:26 crc kubenswrapper[4746]: I0128 21:23:26.902645 4746 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/c964aa0f-5f83-4b77-9584-91f94b0e34e9-catalog-content\") pod \"certified-operators-nw9n4\" (UID: \"c964aa0f-5f83-4b77-9584-91f94b0e34e9\") " pod="openshift-marketplace/certified-operators-nw9n4" Jan 28 21:23:26 crc kubenswrapper[4746]: I0128 21:23:26.904066 4746 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: 
\"kubernetes.io/empty-dir/c964aa0f-5f83-4b77-9584-91f94b0e34e9-catalog-content\") pod \"certified-operators-nw9n4\" (UID: \"c964aa0f-5f83-4b77-9584-91f94b0e34e9\") " pod="openshift-marketplace/certified-operators-nw9n4" Jan 28 21:23:26 crc kubenswrapper[4746]: I0128 21:23:26.904555 4746 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/c964aa0f-5f83-4b77-9584-91f94b0e34e9-utilities\") pod \"certified-operators-nw9n4\" (UID: \"c964aa0f-5f83-4b77-9584-91f94b0e34e9\") " pod="openshift-marketplace/certified-operators-nw9n4" Jan 28 21:23:26 crc kubenswrapper[4746]: I0128 21:23:26.923553 4746 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-x6hbv\" (UniqueName: \"kubernetes.io/projected/c964aa0f-5f83-4b77-9584-91f94b0e34e9-kube-api-access-x6hbv\") pod \"certified-operators-nw9n4\" (UID: \"c964aa0f-5f83-4b77-9584-91f94b0e34e9\") " pod="openshift-marketplace/certified-operators-nw9n4" Jan 28 21:23:26 crc kubenswrapper[4746]: I0128 21:23:26.937601 4746 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-nw9n4" Jan 28 21:23:27 crc kubenswrapper[4746]: I0128 21:23:27.394015 4746 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-qk2z6"] Jan 28 21:23:27 crc kubenswrapper[4746]: W0128 21:23:27.617442 4746 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podc964aa0f_5f83_4b77_9584_91f94b0e34e9.slice/crio-efbae4a736b7fee6cba98fa959b645907456c21a97eec4fa6a0a0509b86fe1e2 WatchSource:0}: Error finding container efbae4a736b7fee6cba98fa959b645907456c21a97eec4fa6a0a0509b86fe1e2: Status 404 returned error can't find the container with id efbae4a736b7fee6cba98fa959b645907456c21a97eec4fa6a0a0509b86fe1e2 Jan 28 21:23:27 crc kubenswrapper[4746]: I0128 21:23:27.630678 4746 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-nw9n4"] Jan 28 21:23:27 crc kubenswrapper[4746]: I0128 21:23:27.989242 4746 generic.go:334] "Generic (PLEG): container finished" podID="c964aa0f-5f83-4b77-9584-91f94b0e34e9" containerID="3f6736541b438c993d423ffb53913045b394d73aec1a836e9b6f2238eecf83dd" exitCode=0 Jan 28 21:23:27 crc kubenswrapper[4746]: I0128 21:23:27.989341 4746 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-nw9n4" event={"ID":"c964aa0f-5f83-4b77-9584-91f94b0e34e9","Type":"ContainerDied","Data":"3f6736541b438c993d423ffb53913045b394d73aec1a836e9b6f2238eecf83dd"} Jan 28 21:23:27 crc kubenswrapper[4746]: I0128 21:23:27.989709 4746 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-nw9n4" event={"ID":"c964aa0f-5f83-4b77-9584-91f94b0e34e9","Type":"ContainerStarted","Data":"efbae4a736b7fee6cba98fa959b645907456c21a97eec4fa6a0a0509b86fe1e2"} Jan 28 21:23:27 crc kubenswrapper[4746]: I0128 21:23:27.993952 4746 generic.go:334] "Generic (PLEG): container finished" 
podID="af8523bf-0fc0-4e0c-b163-b2bf3f2b9a1e" containerID="67b7aa30fd5c83df7908c10b6649bb9cfffc9c188d9673767f5b4a5172dfe277" exitCode=0 Jan 28 21:23:27 crc kubenswrapper[4746]: I0128 21:23:27.993989 4746 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-qk2z6" event={"ID":"af8523bf-0fc0-4e0c-b163-b2bf3f2b9a1e","Type":"ContainerDied","Data":"67b7aa30fd5c83df7908c10b6649bb9cfffc9c188d9673767f5b4a5172dfe277"} Jan 28 21:23:27 crc kubenswrapper[4746]: I0128 21:23:27.994018 4746 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-qk2z6" event={"ID":"af8523bf-0fc0-4e0c-b163-b2bf3f2b9a1e","Type":"ContainerStarted","Data":"f2c3d554fcd80f2edfb9b8c93f8cc7a1f137665af62328fc3886cc0bce60576b"} Jan 28 21:23:29 crc kubenswrapper[4746]: I0128 21:23:29.004647 4746 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-nw9n4" event={"ID":"c964aa0f-5f83-4b77-9584-91f94b0e34e9","Type":"ContainerStarted","Data":"4e2ffba00302d81d620bafa30babff0170f004d58f104a713b95141ce182e41f"} Jan 28 21:23:29 crc kubenswrapper[4746]: I0128 21:23:29.006532 4746 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-qk2z6" event={"ID":"af8523bf-0fc0-4e0c-b163-b2bf3f2b9a1e","Type":"ContainerStarted","Data":"d1be1344e3864c37c1663614ec4317238b17363135a26188437faf28cf7e05a3"} Jan 28 21:23:32 crc kubenswrapper[4746]: I0128 21:23:32.040805 4746 generic.go:334] "Generic (PLEG): container finished" podID="c964aa0f-5f83-4b77-9584-91f94b0e34e9" containerID="4e2ffba00302d81d620bafa30babff0170f004d58f104a713b95141ce182e41f" exitCode=0 Jan 28 21:23:32 crc kubenswrapper[4746]: I0128 21:23:32.041364 4746 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-nw9n4" 
event={"ID":"c964aa0f-5f83-4b77-9584-91f94b0e34e9","Type":"ContainerDied","Data":"4e2ffba00302d81d620bafa30babff0170f004d58f104a713b95141ce182e41f"} Jan 28 21:23:32 crc kubenswrapper[4746]: I0128 21:23:32.045451 4746 generic.go:334] "Generic (PLEG): container finished" podID="af8523bf-0fc0-4e0c-b163-b2bf3f2b9a1e" containerID="d1be1344e3864c37c1663614ec4317238b17363135a26188437faf28cf7e05a3" exitCode=0 Jan 28 21:23:32 crc kubenswrapper[4746]: I0128 21:23:32.045486 4746 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-qk2z6" event={"ID":"af8523bf-0fc0-4e0c-b163-b2bf3f2b9a1e","Type":"ContainerDied","Data":"d1be1344e3864c37c1663614ec4317238b17363135a26188437faf28cf7e05a3"} Jan 28 21:23:33 crc kubenswrapper[4746]: I0128 21:23:33.056583 4746 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-nw9n4" event={"ID":"c964aa0f-5f83-4b77-9584-91f94b0e34e9","Type":"ContainerStarted","Data":"0904cddcf0dcd287b2dfbdb857d9bfb381c60ef3d980ae5047d8ea882a9e5c2c"} Jan 28 21:23:33 crc kubenswrapper[4746]: I0128 21:23:33.062491 4746 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-qk2z6" event={"ID":"af8523bf-0fc0-4e0c-b163-b2bf3f2b9a1e","Type":"ContainerStarted","Data":"25f37ea19f62314c6a2407d671683ac7e056077078ad2933d3d2b6750e08a389"} Jan 28 21:23:33 crc kubenswrapper[4746]: I0128 21:23:33.083866 4746 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/certified-operators-nw9n4" podStartSLOduration=2.6235982829999998 podStartE2EDuration="7.083845641s" podCreationTimestamp="2026-01-28 21:23:26 +0000 UTC" firstStartedPulling="2026-01-28 21:23:27.990875644 +0000 UTC m=+2635.947062008" lastFinishedPulling="2026-01-28 21:23:32.451122992 +0000 UTC m=+2640.407309366" observedRunningTime="2026-01-28 21:23:33.073823118 +0000 UTC m=+2641.030009462" watchObservedRunningTime="2026-01-28 21:23:33.083845641 +0000 UTC 
m=+2641.040031995" Jan 28 21:23:33 crc kubenswrapper[4746]: I0128 21:23:33.102834 4746 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/community-operators-qk2z6" podStartSLOduration=2.654068107 podStartE2EDuration="7.102814574s" podCreationTimestamp="2026-01-28 21:23:26 +0000 UTC" firstStartedPulling="2026-01-28 21:23:27.996970088 +0000 UTC m=+2635.953156432" lastFinishedPulling="2026-01-28 21:23:32.445716535 +0000 UTC m=+2640.401902899" observedRunningTime="2026-01-28 21:23:33.092478844 +0000 UTC m=+2641.048665188" watchObservedRunningTime="2026-01-28 21:23:33.102814574 +0000 UTC m=+2641.059000918" Jan 28 21:23:36 crc kubenswrapper[4746]: I0128 21:23:36.786402 4746 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/community-operators-qk2z6" Jan 28 21:23:36 crc kubenswrapper[4746]: I0128 21:23:36.787791 4746 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/community-operators-qk2z6" Jan 28 21:23:36 crc kubenswrapper[4746]: I0128 21:23:36.857129 4746 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/community-operators-qk2z6" Jan 28 21:23:36 crc kubenswrapper[4746]: I0128 21:23:36.938266 4746 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/certified-operators-nw9n4" Jan 28 21:23:36 crc kubenswrapper[4746]: I0128 21:23:36.938527 4746 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/certified-operators-nw9n4" Jan 28 21:23:37 crc kubenswrapper[4746]: I0128 21:23:37.166493 4746 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/community-operators-qk2z6" Jan 28 21:23:37 crc kubenswrapper[4746]: I0128 21:23:37.804108 4746 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-qk2z6"] Jan 28 21:23:37 crc 
kubenswrapper[4746]: I0128 21:23:37.990790 4746 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-marketplace/certified-operators-nw9n4" podUID="c964aa0f-5f83-4b77-9584-91f94b0e34e9" containerName="registry-server" probeResult="failure" output=< Jan 28 21:23:37 crc kubenswrapper[4746]: timeout: failed to connect service ":50051" within 1s Jan 28 21:23:37 crc kubenswrapper[4746]: > Jan 28 21:23:38 crc kubenswrapper[4746]: I0128 21:23:38.836431 4746 scope.go:117] "RemoveContainer" containerID="cc873ec6b8a0868869d7fd09d5b5eb7c385e0fac746adb1c9927b6d382729767" Jan 28 21:23:38 crc kubenswrapper[4746]: E0128 21:23:38.837013 4746 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-6wrnw_openshift-machine-config-operator(6dc8b546-9734-4082-b2b3-2bafe3f1564d)\"" pod="openshift-machine-config-operator/machine-config-daemon-6wrnw" podUID="6dc8b546-9734-4082-b2b3-2bafe3f1564d" Jan 28 21:23:39 crc kubenswrapper[4746]: I0128 21:23:39.118514 4746 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/community-operators-qk2z6" podUID="af8523bf-0fc0-4e0c-b163-b2bf3f2b9a1e" containerName="registry-server" containerID="cri-o://25f37ea19f62314c6a2407d671683ac7e056077078ad2933d3d2b6750e08a389" gracePeriod=2 Jan 28 21:23:39 crc kubenswrapper[4746]: I0128 21:23:39.661207 4746 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-qk2z6" Jan 28 21:23:39 crc kubenswrapper[4746]: I0128 21:23:39.776822 4746 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/af8523bf-0fc0-4e0c-b163-b2bf3f2b9a1e-utilities\") pod \"af8523bf-0fc0-4e0c-b163-b2bf3f2b9a1e\" (UID: \"af8523bf-0fc0-4e0c-b163-b2bf3f2b9a1e\") " Jan 28 21:23:39 crc kubenswrapper[4746]: I0128 21:23:39.776925 4746 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-wrn42\" (UniqueName: \"kubernetes.io/projected/af8523bf-0fc0-4e0c-b163-b2bf3f2b9a1e-kube-api-access-wrn42\") pod \"af8523bf-0fc0-4e0c-b163-b2bf3f2b9a1e\" (UID: \"af8523bf-0fc0-4e0c-b163-b2bf3f2b9a1e\") " Jan 28 21:23:39 crc kubenswrapper[4746]: I0128 21:23:39.777037 4746 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/af8523bf-0fc0-4e0c-b163-b2bf3f2b9a1e-catalog-content\") pod \"af8523bf-0fc0-4e0c-b163-b2bf3f2b9a1e\" (UID: \"af8523bf-0fc0-4e0c-b163-b2bf3f2b9a1e\") " Jan 28 21:23:39 crc kubenswrapper[4746]: I0128 21:23:39.778763 4746 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/af8523bf-0fc0-4e0c-b163-b2bf3f2b9a1e-utilities" (OuterVolumeSpecName: "utilities") pod "af8523bf-0fc0-4e0c-b163-b2bf3f2b9a1e" (UID: "af8523bf-0fc0-4e0c-b163-b2bf3f2b9a1e"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 28 21:23:39 crc kubenswrapper[4746]: I0128 21:23:39.785281 4746 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/af8523bf-0fc0-4e0c-b163-b2bf3f2b9a1e-kube-api-access-wrn42" (OuterVolumeSpecName: "kube-api-access-wrn42") pod "af8523bf-0fc0-4e0c-b163-b2bf3f2b9a1e" (UID: "af8523bf-0fc0-4e0c-b163-b2bf3f2b9a1e"). InnerVolumeSpecName "kube-api-access-wrn42". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 28 21:23:39 crc kubenswrapper[4746]: I0128 21:23:39.832580 4746 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/af8523bf-0fc0-4e0c-b163-b2bf3f2b9a1e-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "af8523bf-0fc0-4e0c-b163-b2bf3f2b9a1e" (UID: "af8523bf-0fc0-4e0c-b163-b2bf3f2b9a1e"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 28 21:23:39 crc kubenswrapper[4746]: I0128 21:23:39.879427 4746 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/af8523bf-0fc0-4e0c-b163-b2bf3f2b9a1e-utilities\") on node \"crc\" DevicePath \"\"" Jan 28 21:23:39 crc kubenswrapper[4746]: I0128 21:23:39.879464 4746 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-wrn42\" (UniqueName: \"kubernetes.io/projected/af8523bf-0fc0-4e0c-b163-b2bf3f2b9a1e-kube-api-access-wrn42\") on node \"crc\" DevicePath \"\"" Jan 28 21:23:39 crc kubenswrapper[4746]: I0128 21:23:39.879505 4746 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/af8523bf-0fc0-4e0c-b163-b2bf3f2b9a1e-catalog-content\") on node \"crc\" DevicePath \"\"" Jan 28 21:23:40 crc kubenswrapper[4746]: I0128 21:23:40.130317 4746 generic.go:334] "Generic (PLEG): container finished" podID="af8523bf-0fc0-4e0c-b163-b2bf3f2b9a1e" containerID="25f37ea19f62314c6a2407d671683ac7e056077078ad2933d3d2b6750e08a389" exitCode=0 Jan 28 21:23:40 crc kubenswrapper[4746]: I0128 21:23:40.130523 4746 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-qk2z6" event={"ID":"af8523bf-0fc0-4e0c-b163-b2bf3f2b9a1e","Type":"ContainerDied","Data":"25f37ea19f62314c6a2407d671683ac7e056077078ad2933d3d2b6750e08a389"} Jan 28 21:23:40 crc kubenswrapper[4746]: I0128 21:23:40.130622 4746 kubelet.go:2453] "SyncLoop (PLEG): event 
for pod" pod="openshift-marketplace/community-operators-qk2z6" event={"ID":"af8523bf-0fc0-4e0c-b163-b2bf3f2b9a1e","Type":"ContainerDied","Data":"f2c3d554fcd80f2edfb9b8c93f8cc7a1f137665af62328fc3886cc0bce60576b"} Jan 28 21:23:40 crc kubenswrapper[4746]: I0128 21:23:40.130641 4746 scope.go:117] "RemoveContainer" containerID="25f37ea19f62314c6a2407d671683ac7e056077078ad2933d3d2b6750e08a389" Jan 28 21:23:40 crc kubenswrapper[4746]: I0128 21:23:40.130582 4746 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-qk2z6" Jan 28 21:23:40 crc kubenswrapper[4746]: I0128 21:23:40.164219 4746 scope.go:117] "RemoveContainer" containerID="d1be1344e3864c37c1663614ec4317238b17363135a26188437faf28cf7e05a3" Jan 28 21:23:40 crc kubenswrapper[4746]: I0128 21:23:40.174537 4746 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-qk2z6"] Jan 28 21:23:40 crc kubenswrapper[4746]: I0128 21:23:40.183654 4746 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/community-operators-qk2z6"] Jan 28 21:23:40 crc kubenswrapper[4746]: I0128 21:23:40.194686 4746 scope.go:117] "RemoveContainer" containerID="67b7aa30fd5c83df7908c10b6649bb9cfffc9c188d9673767f5b4a5172dfe277" Jan 28 21:23:40 crc kubenswrapper[4746]: I0128 21:23:40.234806 4746 scope.go:117] "RemoveContainer" containerID="25f37ea19f62314c6a2407d671683ac7e056077078ad2933d3d2b6750e08a389" Jan 28 21:23:40 crc kubenswrapper[4746]: E0128 21:23:40.235604 4746 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"25f37ea19f62314c6a2407d671683ac7e056077078ad2933d3d2b6750e08a389\": container with ID starting with 25f37ea19f62314c6a2407d671683ac7e056077078ad2933d3d2b6750e08a389 not found: ID does not exist" containerID="25f37ea19f62314c6a2407d671683ac7e056077078ad2933d3d2b6750e08a389" Jan 28 21:23:40 crc kubenswrapper[4746]: I0128 
21:23:40.235639 4746 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"25f37ea19f62314c6a2407d671683ac7e056077078ad2933d3d2b6750e08a389"} err="failed to get container status \"25f37ea19f62314c6a2407d671683ac7e056077078ad2933d3d2b6750e08a389\": rpc error: code = NotFound desc = could not find container \"25f37ea19f62314c6a2407d671683ac7e056077078ad2933d3d2b6750e08a389\": container with ID starting with 25f37ea19f62314c6a2407d671683ac7e056077078ad2933d3d2b6750e08a389 not found: ID does not exist" Jan 28 21:23:40 crc kubenswrapper[4746]: I0128 21:23:40.235660 4746 scope.go:117] "RemoveContainer" containerID="d1be1344e3864c37c1663614ec4317238b17363135a26188437faf28cf7e05a3" Jan 28 21:23:40 crc kubenswrapper[4746]: E0128 21:23:40.236360 4746 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"d1be1344e3864c37c1663614ec4317238b17363135a26188437faf28cf7e05a3\": container with ID starting with d1be1344e3864c37c1663614ec4317238b17363135a26188437faf28cf7e05a3 not found: ID does not exist" containerID="d1be1344e3864c37c1663614ec4317238b17363135a26188437faf28cf7e05a3" Jan 28 21:23:40 crc kubenswrapper[4746]: I0128 21:23:40.236397 4746 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"d1be1344e3864c37c1663614ec4317238b17363135a26188437faf28cf7e05a3"} err="failed to get container status \"d1be1344e3864c37c1663614ec4317238b17363135a26188437faf28cf7e05a3\": rpc error: code = NotFound desc = could not find container \"d1be1344e3864c37c1663614ec4317238b17363135a26188437faf28cf7e05a3\": container with ID starting with d1be1344e3864c37c1663614ec4317238b17363135a26188437faf28cf7e05a3 not found: ID does not exist" Jan 28 21:23:40 crc kubenswrapper[4746]: I0128 21:23:40.236417 4746 scope.go:117] "RemoveContainer" containerID="67b7aa30fd5c83df7908c10b6649bb9cfffc9c188d9673767f5b4a5172dfe277" Jan 28 21:23:40 crc 
kubenswrapper[4746]: E0128 21:23:40.236795 4746 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"67b7aa30fd5c83df7908c10b6649bb9cfffc9c188d9673767f5b4a5172dfe277\": container with ID starting with 67b7aa30fd5c83df7908c10b6649bb9cfffc9c188d9673767f5b4a5172dfe277 not found: ID does not exist" containerID="67b7aa30fd5c83df7908c10b6649bb9cfffc9c188d9673767f5b4a5172dfe277" Jan 28 21:23:40 crc kubenswrapper[4746]: I0128 21:23:40.236820 4746 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"67b7aa30fd5c83df7908c10b6649bb9cfffc9c188d9673767f5b4a5172dfe277"} err="failed to get container status \"67b7aa30fd5c83df7908c10b6649bb9cfffc9c188d9673767f5b4a5172dfe277\": rpc error: code = NotFound desc = could not find container \"67b7aa30fd5c83df7908c10b6649bb9cfffc9c188d9673767f5b4a5172dfe277\": container with ID starting with 67b7aa30fd5c83df7908c10b6649bb9cfffc9c188d9673767f5b4a5172dfe277 not found: ID does not exist" Jan 28 21:23:40 crc kubenswrapper[4746]: I0128 21:23:40.846229 4746 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="af8523bf-0fc0-4e0c-b163-b2bf3f2b9a1e" path="/var/lib/kubelet/pods/af8523bf-0fc0-4e0c-b163-b2bf3f2b9a1e/volumes" Jan 28 21:23:47 crc kubenswrapper[4746]: I0128 21:23:47.014363 4746 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/certified-operators-nw9n4" Jan 28 21:23:47 crc kubenswrapper[4746]: I0128 21:23:47.077980 4746 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/certified-operators-nw9n4" Jan 28 21:23:47 crc kubenswrapper[4746]: I0128 21:23:47.285221 4746 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-nw9n4"] Jan 28 21:23:48 crc kubenswrapper[4746]: I0128 21:23:48.225298 4746 generic.go:334] "Generic (PLEG): container finished" 
podID="9c46ddb7-5815-475e-b798-06a7fee944c8" containerID="4fe0d25a5a27ffda47995b6411fcca00fe410b27bf20342de4b899f1698f2c66" exitCode=0 Jan 28 21:23:48 crc kubenswrapper[4746]: I0128 21:23:48.225431 4746 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/telemetry-edpm-deployment-openstack-edpm-ipam-6jfqt" event={"ID":"9c46ddb7-5815-475e-b798-06a7fee944c8","Type":"ContainerDied","Data":"4fe0d25a5a27ffda47995b6411fcca00fe410b27bf20342de4b899f1698f2c66"} Jan 28 21:23:48 crc kubenswrapper[4746]: I0128 21:23:48.225627 4746 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/certified-operators-nw9n4" podUID="c964aa0f-5f83-4b77-9584-91f94b0e34e9" containerName="registry-server" containerID="cri-o://0904cddcf0dcd287b2dfbdb857d9bfb381c60ef3d980ae5047d8ea882a9e5c2c" gracePeriod=2 Jan 28 21:23:48 crc kubenswrapper[4746]: I0128 21:23:48.760584 4746 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-nw9n4" Jan 28 21:23:48 crc kubenswrapper[4746]: I0128 21:23:48.874031 4746 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/c964aa0f-5f83-4b77-9584-91f94b0e34e9-utilities\") pod \"c964aa0f-5f83-4b77-9584-91f94b0e34e9\" (UID: \"c964aa0f-5f83-4b77-9584-91f94b0e34e9\") " Jan 28 21:23:48 crc kubenswrapper[4746]: I0128 21:23:48.874229 4746 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-x6hbv\" (UniqueName: \"kubernetes.io/projected/c964aa0f-5f83-4b77-9584-91f94b0e34e9-kube-api-access-x6hbv\") pod \"c964aa0f-5f83-4b77-9584-91f94b0e34e9\" (UID: \"c964aa0f-5f83-4b77-9584-91f94b0e34e9\") " Jan 28 21:23:48 crc kubenswrapper[4746]: I0128 21:23:48.874269 4746 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: 
\"kubernetes.io/empty-dir/c964aa0f-5f83-4b77-9584-91f94b0e34e9-catalog-content\") pod \"c964aa0f-5f83-4b77-9584-91f94b0e34e9\" (UID: \"c964aa0f-5f83-4b77-9584-91f94b0e34e9\") " Jan 28 21:23:48 crc kubenswrapper[4746]: I0128 21:23:48.875350 4746 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/c964aa0f-5f83-4b77-9584-91f94b0e34e9-utilities" (OuterVolumeSpecName: "utilities") pod "c964aa0f-5f83-4b77-9584-91f94b0e34e9" (UID: "c964aa0f-5f83-4b77-9584-91f94b0e34e9"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 28 21:23:48 crc kubenswrapper[4746]: I0128 21:23:48.879428 4746 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/c964aa0f-5f83-4b77-9584-91f94b0e34e9-kube-api-access-x6hbv" (OuterVolumeSpecName: "kube-api-access-x6hbv") pod "c964aa0f-5f83-4b77-9584-91f94b0e34e9" (UID: "c964aa0f-5f83-4b77-9584-91f94b0e34e9"). InnerVolumeSpecName "kube-api-access-x6hbv". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 28 21:23:48 crc kubenswrapper[4746]: I0128 21:23:48.944337 4746 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/c964aa0f-5f83-4b77-9584-91f94b0e34e9-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "c964aa0f-5f83-4b77-9584-91f94b0e34e9" (UID: "c964aa0f-5f83-4b77-9584-91f94b0e34e9"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 28 21:23:48 crc kubenswrapper[4746]: I0128 21:23:48.978850 4746 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/c964aa0f-5f83-4b77-9584-91f94b0e34e9-utilities\") on node \"crc\" DevicePath \"\"" Jan 28 21:23:48 crc kubenswrapper[4746]: I0128 21:23:48.978934 4746 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-x6hbv\" (UniqueName: \"kubernetes.io/projected/c964aa0f-5f83-4b77-9584-91f94b0e34e9-kube-api-access-x6hbv\") on node \"crc\" DevicePath \"\"" Jan 28 21:23:48 crc kubenswrapper[4746]: I0128 21:23:48.978955 4746 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/c964aa0f-5f83-4b77-9584-91f94b0e34e9-catalog-content\") on node \"crc\" DevicePath \"\"" Jan 28 21:23:49 crc kubenswrapper[4746]: I0128 21:23:49.245422 4746 generic.go:334] "Generic (PLEG): container finished" podID="c964aa0f-5f83-4b77-9584-91f94b0e34e9" containerID="0904cddcf0dcd287b2dfbdb857d9bfb381c60ef3d980ae5047d8ea882a9e5c2c" exitCode=0 Jan 28 21:23:49 crc kubenswrapper[4746]: I0128 21:23:49.245490 4746 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-nw9n4" Jan 28 21:23:49 crc kubenswrapper[4746]: I0128 21:23:49.245503 4746 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-nw9n4" event={"ID":"c964aa0f-5f83-4b77-9584-91f94b0e34e9","Type":"ContainerDied","Data":"0904cddcf0dcd287b2dfbdb857d9bfb381c60ef3d980ae5047d8ea882a9e5c2c"} Jan 28 21:23:49 crc kubenswrapper[4746]: I0128 21:23:49.246024 4746 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-nw9n4" event={"ID":"c964aa0f-5f83-4b77-9584-91f94b0e34e9","Type":"ContainerDied","Data":"efbae4a736b7fee6cba98fa959b645907456c21a97eec4fa6a0a0509b86fe1e2"} Jan 28 21:23:49 crc kubenswrapper[4746]: I0128 21:23:49.246065 4746 scope.go:117] "RemoveContainer" containerID="0904cddcf0dcd287b2dfbdb857d9bfb381c60ef3d980ae5047d8ea882a9e5c2c" Jan 28 21:23:49 crc kubenswrapper[4746]: I0128 21:23:49.290680 4746 scope.go:117] "RemoveContainer" containerID="4e2ffba00302d81d620bafa30babff0170f004d58f104a713b95141ce182e41f" Jan 28 21:23:49 crc kubenswrapper[4746]: I0128 21:23:49.339018 4746 scope.go:117] "RemoveContainer" containerID="3f6736541b438c993d423ffb53913045b394d73aec1a836e9b6f2238eecf83dd" Jan 28 21:23:49 crc kubenswrapper[4746]: I0128 21:23:49.341165 4746 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-nw9n4"] Jan 28 21:23:49 crc kubenswrapper[4746]: I0128 21:23:49.355682 4746 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/certified-operators-nw9n4"] Jan 28 21:23:49 crc kubenswrapper[4746]: I0128 21:23:49.387115 4746 scope.go:117] "RemoveContainer" containerID="0904cddcf0dcd287b2dfbdb857d9bfb381c60ef3d980ae5047d8ea882a9e5c2c" Jan 28 21:23:49 crc kubenswrapper[4746]: E0128 21:23:49.403225 4746 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container 
\"0904cddcf0dcd287b2dfbdb857d9bfb381c60ef3d980ae5047d8ea882a9e5c2c\": container with ID starting with 0904cddcf0dcd287b2dfbdb857d9bfb381c60ef3d980ae5047d8ea882a9e5c2c not found: ID does not exist" containerID="0904cddcf0dcd287b2dfbdb857d9bfb381c60ef3d980ae5047d8ea882a9e5c2c" Jan 28 21:23:49 crc kubenswrapper[4746]: I0128 21:23:49.403275 4746 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"0904cddcf0dcd287b2dfbdb857d9bfb381c60ef3d980ae5047d8ea882a9e5c2c"} err="failed to get container status \"0904cddcf0dcd287b2dfbdb857d9bfb381c60ef3d980ae5047d8ea882a9e5c2c\": rpc error: code = NotFound desc = could not find container \"0904cddcf0dcd287b2dfbdb857d9bfb381c60ef3d980ae5047d8ea882a9e5c2c\": container with ID starting with 0904cddcf0dcd287b2dfbdb857d9bfb381c60ef3d980ae5047d8ea882a9e5c2c not found: ID does not exist" Jan 28 21:23:49 crc kubenswrapper[4746]: I0128 21:23:49.403303 4746 scope.go:117] "RemoveContainer" containerID="4e2ffba00302d81d620bafa30babff0170f004d58f104a713b95141ce182e41f" Jan 28 21:23:49 crc kubenswrapper[4746]: E0128 21:23:49.403567 4746 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"4e2ffba00302d81d620bafa30babff0170f004d58f104a713b95141ce182e41f\": container with ID starting with 4e2ffba00302d81d620bafa30babff0170f004d58f104a713b95141ce182e41f not found: ID does not exist" containerID="4e2ffba00302d81d620bafa30babff0170f004d58f104a713b95141ce182e41f" Jan 28 21:23:49 crc kubenswrapper[4746]: I0128 21:23:49.403601 4746 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"4e2ffba00302d81d620bafa30babff0170f004d58f104a713b95141ce182e41f"} err="failed to get container status \"4e2ffba00302d81d620bafa30babff0170f004d58f104a713b95141ce182e41f\": rpc error: code = NotFound desc = could not find container \"4e2ffba00302d81d620bafa30babff0170f004d58f104a713b95141ce182e41f\": container with ID 
starting with 4e2ffba00302d81d620bafa30babff0170f004d58f104a713b95141ce182e41f not found: ID does not exist" Jan 28 21:23:49 crc kubenswrapper[4746]: I0128 21:23:49.403618 4746 scope.go:117] "RemoveContainer" containerID="3f6736541b438c993d423ffb53913045b394d73aec1a836e9b6f2238eecf83dd" Jan 28 21:23:49 crc kubenswrapper[4746]: E0128 21:23:49.404560 4746 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"3f6736541b438c993d423ffb53913045b394d73aec1a836e9b6f2238eecf83dd\": container with ID starting with 3f6736541b438c993d423ffb53913045b394d73aec1a836e9b6f2238eecf83dd not found: ID does not exist" containerID="3f6736541b438c993d423ffb53913045b394d73aec1a836e9b6f2238eecf83dd" Jan 28 21:23:49 crc kubenswrapper[4746]: I0128 21:23:49.404584 4746 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"3f6736541b438c993d423ffb53913045b394d73aec1a836e9b6f2238eecf83dd"} err="failed to get container status \"3f6736541b438c993d423ffb53913045b394d73aec1a836e9b6f2238eecf83dd\": rpc error: code = NotFound desc = could not find container \"3f6736541b438c993d423ffb53913045b394d73aec1a836e9b6f2238eecf83dd\": container with ID starting with 3f6736541b438c993d423ffb53913045b394d73aec1a836e9b6f2238eecf83dd not found: ID does not exist" Jan 28 21:23:49 crc kubenswrapper[4746]: I0128 21:23:49.769851 4746 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/telemetry-edpm-deployment-openstack-edpm-ipam-6jfqt" Jan 28 21:23:49 crc kubenswrapper[4746]: I0128 21:23:49.899640 4746 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/9c46ddb7-5815-475e-b798-06a7fee944c8-ssh-key-openstack-edpm-ipam\") pod \"9c46ddb7-5815-475e-b798-06a7fee944c8\" (UID: \"9c46ddb7-5815-475e-b798-06a7fee944c8\") " Jan 28 21:23:49 crc kubenswrapper[4746]: I0128 21:23:49.899981 4746 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ceilometer-compute-config-data-0\" (UniqueName: \"kubernetes.io/secret/9c46ddb7-5815-475e-b798-06a7fee944c8-ceilometer-compute-config-data-0\") pod \"9c46ddb7-5815-475e-b798-06a7fee944c8\" (UID: \"9c46ddb7-5815-475e-b798-06a7fee944c8\") " Jan 28 21:23:49 crc kubenswrapper[4746]: I0128 21:23:49.900176 4746 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"telemetry-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/9c46ddb7-5815-475e-b798-06a7fee944c8-telemetry-combined-ca-bundle\") pod \"9c46ddb7-5815-475e-b798-06a7fee944c8\" (UID: \"9c46ddb7-5815-475e-b798-06a7fee944c8\") " Jan 28 21:23:49 crc kubenswrapper[4746]: I0128 21:23:49.900236 4746 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ceilometer-compute-config-data-1\" (UniqueName: \"kubernetes.io/secret/9c46ddb7-5815-475e-b798-06a7fee944c8-ceilometer-compute-config-data-1\") pod \"9c46ddb7-5815-475e-b798-06a7fee944c8\" (UID: \"9c46ddb7-5815-475e-b798-06a7fee944c8\") " Jan 28 21:23:49 crc kubenswrapper[4746]: I0128 21:23:49.900278 4746 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/9c46ddb7-5815-475e-b798-06a7fee944c8-inventory\") pod \"9c46ddb7-5815-475e-b798-06a7fee944c8\" (UID: \"9c46ddb7-5815-475e-b798-06a7fee944c8\") " Jan 28 21:23:49 crc 
kubenswrapper[4746]: I0128 21:23:49.900384 4746 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ceilometer-compute-config-data-2\" (UniqueName: \"kubernetes.io/secret/9c46ddb7-5815-475e-b798-06a7fee944c8-ceilometer-compute-config-data-2\") pod \"9c46ddb7-5815-475e-b798-06a7fee944c8\" (UID: \"9c46ddb7-5815-475e-b798-06a7fee944c8\") " Jan 28 21:23:49 crc kubenswrapper[4746]: I0128 21:23:49.900439 4746 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-btwk5\" (UniqueName: \"kubernetes.io/projected/9c46ddb7-5815-475e-b798-06a7fee944c8-kube-api-access-btwk5\") pod \"9c46ddb7-5815-475e-b798-06a7fee944c8\" (UID: \"9c46ddb7-5815-475e-b798-06a7fee944c8\") " Jan 28 21:23:49 crc kubenswrapper[4746]: I0128 21:23:49.904957 4746 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/9c46ddb7-5815-475e-b798-06a7fee944c8-kube-api-access-btwk5" (OuterVolumeSpecName: "kube-api-access-btwk5") pod "9c46ddb7-5815-475e-b798-06a7fee944c8" (UID: "9c46ddb7-5815-475e-b798-06a7fee944c8"). InnerVolumeSpecName "kube-api-access-btwk5". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 28 21:23:49 crc kubenswrapper[4746]: I0128 21:23:49.905076 4746 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/9c46ddb7-5815-475e-b798-06a7fee944c8-telemetry-combined-ca-bundle" (OuterVolumeSpecName: "telemetry-combined-ca-bundle") pod "9c46ddb7-5815-475e-b798-06a7fee944c8" (UID: "9c46ddb7-5815-475e-b798-06a7fee944c8"). InnerVolumeSpecName "telemetry-combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 28 21:23:49 crc kubenswrapper[4746]: I0128 21:23:49.928710 4746 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/9c46ddb7-5815-475e-b798-06a7fee944c8-ceilometer-compute-config-data-2" (OuterVolumeSpecName: "ceilometer-compute-config-data-2") pod "9c46ddb7-5815-475e-b798-06a7fee944c8" (UID: "9c46ddb7-5815-475e-b798-06a7fee944c8"). InnerVolumeSpecName "ceilometer-compute-config-data-2". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 28 21:23:49 crc kubenswrapper[4746]: I0128 21:23:49.929403 4746 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/9c46ddb7-5815-475e-b798-06a7fee944c8-inventory" (OuterVolumeSpecName: "inventory") pod "9c46ddb7-5815-475e-b798-06a7fee944c8" (UID: "9c46ddb7-5815-475e-b798-06a7fee944c8"). InnerVolumeSpecName "inventory". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 28 21:23:49 crc kubenswrapper[4746]: I0128 21:23:49.935210 4746 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/9c46ddb7-5815-475e-b798-06a7fee944c8-ssh-key-openstack-edpm-ipam" (OuterVolumeSpecName: "ssh-key-openstack-edpm-ipam") pod "9c46ddb7-5815-475e-b798-06a7fee944c8" (UID: "9c46ddb7-5815-475e-b798-06a7fee944c8"). InnerVolumeSpecName "ssh-key-openstack-edpm-ipam". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 28 21:23:49 crc kubenswrapper[4746]: I0128 21:23:49.947654 4746 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/9c46ddb7-5815-475e-b798-06a7fee944c8-ceilometer-compute-config-data-1" (OuterVolumeSpecName: "ceilometer-compute-config-data-1") pod "9c46ddb7-5815-475e-b798-06a7fee944c8" (UID: "9c46ddb7-5815-475e-b798-06a7fee944c8"). InnerVolumeSpecName "ceilometer-compute-config-data-1". 
PluginName "kubernetes.io/secret", VolumeGidValue ""
Jan 28 21:23:49 crc kubenswrapper[4746]: I0128 21:23:49.957696 4746 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/9c46ddb7-5815-475e-b798-06a7fee944c8-ceilometer-compute-config-data-0" (OuterVolumeSpecName: "ceilometer-compute-config-data-0") pod "9c46ddb7-5815-475e-b798-06a7fee944c8" (UID: "9c46ddb7-5815-475e-b798-06a7fee944c8"). InnerVolumeSpecName "ceilometer-compute-config-data-0". PluginName "kubernetes.io/secret", VolumeGidValue ""
Jan 28 21:23:50 crc kubenswrapper[4746]: I0128 21:23:50.003825 4746 reconciler_common.go:293] "Volume detached for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/9c46ddb7-5815-475e-b798-06a7fee944c8-ssh-key-openstack-edpm-ipam\") on node \"crc\" DevicePath \"\""
Jan 28 21:23:50 crc kubenswrapper[4746]: I0128 21:23:50.004155 4746 reconciler_common.go:293] "Volume detached for volume \"ceilometer-compute-config-data-0\" (UniqueName: \"kubernetes.io/secret/9c46ddb7-5815-475e-b798-06a7fee944c8-ceilometer-compute-config-data-0\") on node \"crc\" DevicePath \"\""
Jan 28 21:23:50 crc kubenswrapper[4746]: I0128 21:23:50.004247 4746 reconciler_common.go:293] "Volume detached for volume \"telemetry-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/9c46ddb7-5815-475e-b798-06a7fee944c8-telemetry-combined-ca-bundle\") on node \"crc\" DevicePath \"\""
Jan 28 21:23:50 crc kubenswrapper[4746]: I0128 21:23:50.004325 4746 reconciler_common.go:293] "Volume detached for volume \"ceilometer-compute-config-data-1\" (UniqueName: \"kubernetes.io/secret/9c46ddb7-5815-475e-b798-06a7fee944c8-ceilometer-compute-config-data-1\") on node \"crc\" DevicePath \"\""
Jan 28 21:23:50 crc kubenswrapper[4746]: I0128 21:23:50.004387 4746 reconciler_common.go:293] "Volume detached for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/9c46ddb7-5815-475e-b798-06a7fee944c8-inventory\") on node \"crc\" DevicePath \"\""
Jan 28 21:23:50 crc kubenswrapper[4746]: I0128 21:23:50.004453 4746 reconciler_common.go:293] "Volume detached for volume \"ceilometer-compute-config-data-2\" (UniqueName: \"kubernetes.io/secret/9c46ddb7-5815-475e-b798-06a7fee944c8-ceilometer-compute-config-data-2\") on node \"crc\" DevicePath \"\""
Jan 28 21:23:50 crc kubenswrapper[4746]: I0128 21:23:50.004511 4746 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-btwk5\" (UniqueName: \"kubernetes.io/projected/9c46ddb7-5815-475e-b798-06a7fee944c8-kube-api-access-btwk5\") on node \"crc\" DevicePath \"\""
Jan 28 21:23:50 crc kubenswrapper[4746]: I0128 21:23:50.257429 4746 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/telemetry-edpm-deployment-openstack-edpm-ipam-6jfqt" event={"ID":"9c46ddb7-5815-475e-b798-06a7fee944c8","Type":"ContainerDied","Data":"7d3ffb83535d398bcf8919156c24ee1f30e0dd39f5a708c1561192fc35f685b1"}
Jan 28 21:23:50 crc kubenswrapper[4746]: I0128 21:23:50.257468 4746 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="7d3ffb83535d398bcf8919156c24ee1f30e0dd39f5a708c1561192fc35f685b1"
Jan 28 21:23:50 crc kubenswrapper[4746]: I0128 21:23:50.257584 4746 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/telemetry-edpm-deployment-openstack-edpm-ipam-6jfqt"
Jan 28 21:23:50 crc kubenswrapper[4746]: I0128 21:23:50.848835 4746 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="c964aa0f-5f83-4b77-9584-91f94b0e34e9" path="/var/lib/kubelet/pods/c964aa0f-5f83-4b77-9584-91f94b0e34e9/volumes"
Jan 28 21:23:52 crc kubenswrapper[4746]: I0128 21:23:52.841878 4746 scope.go:117] "RemoveContainer" containerID="cc873ec6b8a0868869d7fd09d5b5eb7c385e0fac746adb1c9927b6d382729767"
Jan 28 21:23:52 crc kubenswrapper[4746]: E0128 21:23:52.842500 4746 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-6wrnw_openshift-machine-config-operator(6dc8b546-9734-4082-b2b3-2bafe3f1564d)\"" pod="openshift-machine-config-operator/machine-config-daemon-6wrnw" podUID="6dc8b546-9734-4082-b2b3-2bafe3f1564d"
Jan 28 21:24:03 crc kubenswrapper[4746]: I0128 21:24:03.835328 4746 scope.go:117] "RemoveContainer" containerID="cc873ec6b8a0868869d7fd09d5b5eb7c385e0fac746adb1c9927b6d382729767"
Jan 28 21:24:03 crc kubenswrapper[4746]: E0128 21:24:03.836106 4746 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-6wrnw_openshift-machine-config-operator(6dc8b546-9734-4082-b2b3-2bafe3f1564d)\"" pod="openshift-machine-config-operator/machine-config-daemon-6wrnw" podUID="6dc8b546-9734-4082-b2b3-2bafe3f1564d"
Jan 28 21:24:15 crc kubenswrapper[4746]: I0128 21:24:15.836095 4746 scope.go:117] "RemoveContainer" containerID="cc873ec6b8a0868869d7fd09d5b5eb7c385e0fac746adb1c9927b6d382729767"
Jan 28 21:24:15 crc kubenswrapper[4746]: E0128 21:24:15.837066 4746 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-6wrnw_openshift-machine-config-operator(6dc8b546-9734-4082-b2b3-2bafe3f1564d)\"" pod="openshift-machine-config-operator/machine-config-daemon-6wrnw" podUID="6dc8b546-9734-4082-b2b3-2bafe3f1564d"
Jan 28 21:24:28 crc kubenswrapper[4746]: I0128 21:24:28.836840 4746 scope.go:117] "RemoveContainer" containerID="cc873ec6b8a0868869d7fd09d5b5eb7c385e0fac746adb1c9927b6d382729767"
Jan 28 21:24:28 crc kubenswrapper[4746]: E0128 21:24:28.837858 4746 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-6wrnw_openshift-machine-config-operator(6dc8b546-9734-4082-b2b3-2bafe3f1564d)\"" pod="openshift-machine-config-operator/machine-config-daemon-6wrnw" podUID="6dc8b546-9734-4082-b2b3-2bafe3f1564d"
Jan 28 21:24:31 crc kubenswrapper[4746]: I0128 21:24:31.444358 4746 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-operators-mnt79"]
Jan 28 21:24:31 crc kubenswrapper[4746]: E0128 21:24:31.445252 4746 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="c964aa0f-5f83-4b77-9584-91f94b0e34e9" containerName="registry-server"
Jan 28 21:24:31 crc kubenswrapper[4746]: I0128 21:24:31.445268 4746 state_mem.go:107] "Deleted CPUSet assignment" podUID="c964aa0f-5f83-4b77-9584-91f94b0e34e9" containerName="registry-server"
Jan 28 21:24:31 crc kubenswrapper[4746]: E0128 21:24:31.445282 4746 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="c964aa0f-5f83-4b77-9584-91f94b0e34e9" containerName="extract-content"
Jan 28 21:24:31 crc kubenswrapper[4746]: I0128 21:24:31.445290 4746 state_mem.go:107] "Deleted CPUSet assignment" podUID="c964aa0f-5f83-4b77-9584-91f94b0e34e9" containerName="extract-content"
Jan 28 21:24:31 crc kubenswrapper[4746]: E0128 21:24:31.445305 4746 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="af8523bf-0fc0-4e0c-b163-b2bf3f2b9a1e" containerName="registry-server"
Jan 28 21:24:31 crc kubenswrapper[4746]: I0128 21:24:31.445312 4746 state_mem.go:107] "Deleted CPUSet assignment" podUID="af8523bf-0fc0-4e0c-b163-b2bf3f2b9a1e" containerName="registry-server"
Jan 28 21:24:31 crc kubenswrapper[4746]: E0128 21:24:31.445324 4746 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="9c46ddb7-5815-475e-b798-06a7fee944c8" containerName="telemetry-edpm-deployment-openstack-edpm-ipam"
Jan 28 21:24:31 crc kubenswrapper[4746]: I0128 21:24:31.445334 4746 state_mem.go:107] "Deleted CPUSet assignment" podUID="9c46ddb7-5815-475e-b798-06a7fee944c8" containerName="telemetry-edpm-deployment-openstack-edpm-ipam"
Jan 28 21:24:31 crc kubenswrapper[4746]: E0128 21:24:31.445352 4746 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="af8523bf-0fc0-4e0c-b163-b2bf3f2b9a1e" containerName="extract-utilities"
Jan 28 21:24:31 crc kubenswrapper[4746]: I0128 21:24:31.445360 4746 state_mem.go:107] "Deleted CPUSet assignment" podUID="af8523bf-0fc0-4e0c-b163-b2bf3f2b9a1e" containerName="extract-utilities"
Jan 28 21:24:31 crc kubenswrapper[4746]: E0128 21:24:31.445389 4746 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="af8523bf-0fc0-4e0c-b163-b2bf3f2b9a1e" containerName="extract-content"
Jan 28 21:24:31 crc kubenswrapper[4746]: I0128 21:24:31.445397 4746 state_mem.go:107] "Deleted CPUSet assignment" podUID="af8523bf-0fc0-4e0c-b163-b2bf3f2b9a1e" containerName="extract-content"
Jan 28 21:24:31 crc kubenswrapper[4746]: E0128 21:24:31.445423 4746 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="c964aa0f-5f83-4b77-9584-91f94b0e34e9" containerName="extract-utilities"
Jan 28 21:24:31 crc kubenswrapper[4746]: I0128 21:24:31.445431 4746 state_mem.go:107] "Deleted CPUSet assignment" podUID="c964aa0f-5f83-4b77-9584-91f94b0e34e9" containerName="extract-utilities"
Jan 28 21:24:31 crc kubenswrapper[4746]: I0128 21:24:31.445715 4746 memory_manager.go:354] "RemoveStaleState removing state" podUID="9c46ddb7-5815-475e-b798-06a7fee944c8" containerName="telemetry-edpm-deployment-openstack-edpm-ipam"
Jan 28 21:24:31 crc kubenswrapper[4746]: I0128 21:24:31.445737 4746 memory_manager.go:354] "RemoveStaleState removing state" podUID="af8523bf-0fc0-4e0c-b163-b2bf3f2b9a1e" containerName="registry-server"
Jan 28 21:24:31 crc kubenswrapper[4746]: I0128 21:24:31.445771 4746 memory_manager.go:354] "RemoveStaleState removing state" podUID="c964aa0f-5f83-4b77-9584-91f94b0e34e9" containerName="registry-server"
Jan 28 21:24:31 crc kubenswrapper[4746]: I0128 21:24:31.448489 4746 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-mnt79"]
Jan 28 21:24:31 crc kubenswrapper[4746]: I0128 21:24:31.448594 4746 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-mnt79"
Jan 28 21:24:31 crc kubenswrapper[4746]: I0128 21:24:31.612257 4746 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/3851eb27-360e-4c5f-b855-90383742cb54-utilities\") pod \"redhat-operators-mnt79\" (UID: \"3851eb27-360e-4c5f-b855-90383742cb54\") " pod="openshift-marketplace/redhat-operators-mnt79"
Jan 28 21:24:31 crc kubenswrapper[4746]: I0128 21:24:31.612477 4746 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/3851eb27-360e-4c5f-b855-90383742cb54-catalog-content\") pod \"redhat-operators-mnt79\" (UID: \"3851eb27-360e-4c5f-b855-90383742cb54\") " pod="openshift-marketplace/redhat-operators-mnt79"
Jan 28 21:24:31 crc kubenswrapper[4746]: I0128 21:24:31.613123 4746 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-kzjrv\" (UniqueName: \"kubernetes.io/projected/3851eb27-360e-4c5f-b855-90383742cb54-kube-api-access-kzjrv\") pod \"redhat-operators-mnt79\" (UID: \"3851eb27-360e-4c5f-b855-90383742cb54\") " pod="openshift-marketplace/redhat-operators-mnt79"
Jan 28 21:24:31 crc kubenswrapper[4746]: I0128 21:24:31.714849 4746 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-kzjrv\" (UniqueName: \"kubernetes.io/projected/3851eb27-360e-4c5f-b855-90383742cb54-kube-api-access-kzjrv\") pod \"redhat-operators-mnt79\" (UID: \"3851eb27-360e-4c5f-b855-90383742cb54\") " pod="openshift-marketplace/redhat-operators-mnt79"
Jan 28 21:24:31 crc kubenswrapper[4746]: I0128 21:24:31.714899 4746 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/3851eb27-360e-4c5f-b855-90383742cb54-utilities\") pod \"redhat-operators-mnt79\" (UID: \"3851eb27-360e-4c5f-b855-90383742cb54\") " pod="openshift-marketplace/redhat-operators-mnt79"
Jan 28 21:24:31 crc kubenswrapper[4746]: I0128 21:24:31.714960 4746 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/3851eb27-360e-4c5f-b855-90383742cb54-catalog-content\") pod \"redhat-operators-mnt79\" (UID: \"3851eb27-360e-4c5f-b855-90383742cb54\") " pod="openshift-marketplace/redhat-operators-mnt79"
Jan 28 21:24:31 crc kubenswrapper[4746]: I0128 21:24:31.715596 4746 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/3851eb27-360e-4c5f-b855-90383742cb54-utilities\") pod \"redhat-operators-mnt79\" (UID: \"3851eb27-360e-4c5f-b855-90383742cb54\") " pod="openshift-marketplace/redhat-operators-mnt79"
Jan 28 21:24:31 crc kubenswrapper[4746]: I0128 21:24:31.715814 4746 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/3851eb27-360e-4c5f-b855-90383742cb54-catalog-content\") pod \"redhat-operators-mnt79\" (UID: \"3851eb27-360e-4c5f-b855-90383742cb54\") " pod="openshift-marketplace/redhat-operators-mnt79"
Jan 28 21:24:31 crc kubenswrapper[4746]: I0128 21:24:31.749937 4746 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-kzjrv\" (UniqueName: \"kubernetes.io/projected/3851eb27-360e-4c5f-b855-90383742cb54-kube-api-access-kzjrv\") pod \"redhat-operators-mnt79\" (UID: \"3851eb27-360e-4c5f-b855-90383742cb54\") " pod="openshift-marketplace/redhat-operators-mnt79"
Jan 28 21:24:31 crc kubenswrapper[4746]: I0128 21:24:31.773617 4746 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-mnt79"
Jan 28 21:24:32 crc kubenswrapper[4746]: I0128 21:24:32.247302 4746 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-mnt79"]
Jan 28 21:24:32 crc kubenswrapper[4746]: I0128 21:24:32.729126 4746 generic.go:334] "Generic (PLEG): container finished" podID="3851eb27-360e-4c5f-b855-90383742cb54" containerID="08e8cb87d9df6b98014958e374ae4e7f7d2e197963129bd905aa8ccf036762a5" exitCode=0
Jan 28 21:24:32 crc kubenswrapper[4746]: I0128 21:24:32.729245 4746 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-mnt79" event={"ID":"3851eb27-360e-4c5f-b855-90383742cb54","Type":"ContainerDied","Data":"08e8cb87d9df6b98014958e374ae4e7f7d2e197963129bd905aa8ccf036762a5"}
Jan 28 21:24:32 crc kubenswrapper[4746]: I0128 21:24:32.730127 4746 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-mnt79" event={"ID":"3851eb27-360e-4c5f-b855-90383742cb54","Type":"ContainerStarted","Data":"052d4427870ca0c222be9aa2eaa3f86cfaffe9ea35268539e503469be5dbb37b"}
Jan 28 21:24:33 crc kubenswrapper[4746]: I0128 21:24:33.743038 4746 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-mnt79" event={"ID":"3851eb27-360e-4c5f-b855-90383742cb54","Type":"ContainerStarted","Data":"c08ecd91194066f6320b3eac8d73ac737f62c843430d0733b8784c4d1d6b801c"}
Jan 28 21:24:38 crc kubenswrapper[4746]: I0128 21:24:38.802635 4746 generic.go:334] "Generic (PLEG): container finished" podID="3851eb27-360e-4c5f-b855-90383742cb54" containerID="c08ecd91194066f6320b3eac8d73ac737f62c843430d0733b8784c4d1d6b801c" exitCode=0
Jan 28 21:24:38 crc kubenswrapper[4746]: I0128 21:24:38.802998 4746 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-mnt79" event={"ID":"3851eb27-360e-4c5f-b855-90383742cb54","Type":"ContainerDied","Data":"c08ecd91194066f6320b3eac8d73ac737f62c843430d0733b8784c4d1d6b801c"}
Jan 28 21:24:39 crc kubenswrapper[4746]: I0128 21:24:39.815765 4746 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-mnt79" event={"ID":"3851eb27-360e-4c5f-b855-90383742cb54","Type":"ContainerStarted","Data":"8784b03850d1753374fa4de8989acd170cccf8c87fdace725f4b4cf05c4b7161"}
Jan 28 21:24:39 crc kubenswrapper[4746]: I0128 21:24:39.835886 4746 scope.go:117] "RemoveContainer" containerID="cc873ec6b8a0868869d7fd09d5b5eb7c385e0fac746adb1c9927b6d382729767"
Jan 28 21:24:39 crc kubenswrapper[4746]: E0128 21:24:39.836203 4746 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-6wrnw_openshift-machine-config-operator(6dc8b546-9734-4082-b2b3-2bafe3f1564d)\"" pod="openshift-machine-config-operator/machine-config-daemon-6wrnw" podUID="6dc8b546-9734-4082-b2b3-2bafe3f1564d"
Jan 28 21:24:39 crc kubenswrapper[4746]: I0128 21:24:39.842813 4746 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-operators-mnt79" podStartSLOduration=2.3497970280000002 podStartE2EDuration="8.842790901s" podCreationTimestamp="2026-01-28 21:24:31 +0000 UTC" firstStartedPulling="2026-01-28 21:24:32.730808072 +0000 UTC m=+2700.686994426" lastFinishedPulling="2026-01-28 21:24:39.223801935 +0000 UTC m=+2707.179988299" observedRunningTime="2026-01-28 21:24:39.83462426 +0000 UTC m=+2707.790810614" watchObservedRunningTime="2026-01-28 21:24:39.842790901 +0000 UTC m=+2707.798977255"
Jan 28 21:24:41 crc kubenswrapper[4746]: I0128 21:24:41.774035 4746 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-operators-mnt79"
Jan 28 21:24:41 crc kubenswrapper[4746]: I0128 21:24:41.775810 4746 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-operators-mnt79"
Jan 28 21:24:42 crc kubenswrapper[4746]: I0128 21:24:42.848756 4746 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-marketplace/redhat-operators-mnt79" podUID="3851eb27-360e-4c5f-b855-90383742cb54" containerName="registry-server" probeResult="failure" output=<
Jan 28 21:24:42 crc kubenswrapper[4746]: 	timeout: failed to connect service ":50051" within 1s
Jan 28 21:24:42 crc kubenswrapper[4746]: >
Jan 28 21:24:51 crc kubenswrapper[4746]: I0128 21:24:51.820548 4746 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-operators-mnt79"
Jan 28 21:24:51 crc kubenswrapper[4746]: I0128 21:24:51.878401 4746 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-operators-mnt79"
Jan 28 21:24:52 crc kubenswrapper[4746]: I0128 21:24:52.064514 4746 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-mnt79"]
Jan 28 21:24:53 crc kubenswrapper[4746]: I0128 21:24:53.016166 4746 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-operators-mnt79" podUID="3851eb27-360e-4c5f-b855-90383742cb54" containerName="registry-server" containerID="cri-o://8784b03850d1753374fa4de8989acd170cccf8c87fdace725f4b4cf05c4b7161" gracePeriod=2
Jan 28 21:24:53 crc kubenswrapper[4746]: I0128 21:24:53.667729 4746 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-mnt79"
Jan 28 21:24:53 crc kubenswrapper[4746]: I0128 21:24:53.815251 4746 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/3851eb27-360e-4c5f-b855-90383742cb54-utilities\") pod \"3851eb27-360e-4c5f-b855-90383742cb54\" (UID: \"3851eb27-360e-4c5f-b855-90383742cb54\") "
Jan 28 21:24:53 crc kubenswrapper[4746]: I0128 21:24:53.815874 4746 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-kzjrv\" (UniqueName: \"kubernetes.io/projected/3851eb27-360e-4c5f-b855-90383742cb54-kube-api-access-kzjrv\") pod \"3851eb27-360e-4c5f-b855-90383742cb54\" (UID: \"3851eb27-360e-4c5f-b855-90383742cb54\") "
Jan 28 21:24:53 crc kubenswrapper[4746]: I0128 21:24:53.816167 4746 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/3851eb27-360e-4c5f-b855-90383742cb54-catalog-content\") pod \"3851eb27-360e-4c5f-b855-90383742cb54\" (UID: \"3851eb27-360e-4c5f-b855-90383742cb54\") "
Jan 28 21:24:53 crc kubenswrapper[4746]: I0128 21:24:53.816727 4746 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/3851eb27-360e-4c5f-b855-90383742cb54-utilities" (OuterVolumeSpecName: "utilities") pod "3851eb27-360e-4c5f-b855-90383742cb54" (UID: "3851eb27-360e-4c5f-b855-90383742cb54"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Jan 28 21:24:53 crc kubenswrapper[4746]: I0128 21:24:53.836549 4746 scope.go:117] "RemoveContainer" containerID="cc873ec6b8a0868869d7fd09d5b5eb7c385e0fac746adb1c9927b6d382729767"
Jan 28 21:24:53 crc kubenswrapper[4746]: I0128 21:24:53.841313 4746 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/3851eb27-360e-4c5f-b855-90383742cb54-kube-api-access-kzjrv" (OuterVolumeSpecName: "kube-api-access-kzjrv") pod "3851eb27-360e-4c5f-b855-90383742cb54" (UID: "3851eb27-360e-4c5f-b855-90383742cb54"). InnerVolumeSpecName "kube-api-access-kzjrv". PluginName "kubernetes.io/projected", VolumeGidValue ""
Jan 28 21:24:53 crc kubenswrapper[4746]: I0128 21:24:53.919119 4746 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/3851eb27-360e-4c5f-b855-90383742cb54-utilities\") on node \"crc\" DevicePath \"\""
Jan 28 21:24:53 crc kubenswrapper[4746]: I0128 21:24:53.919154 4746 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-kzjrv\" (UniqueName: \"kubernetes.io/projected/3851eb27-360e-4c5f-b855-90383742cb54-kube-api-access-kzjrv\") on node \"crc\" DevicePath \"\""
Jan 28 21:24:53 crc kubenswrapper[4746]: I0128 21:24:53.952333 4746 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/3851eb27-360e-4c5f-b855-90383742cb54-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "3851eb27-360e-4c5f-b855-90383742cb54" (UID: "3851eb27-360e-4c5f-b855-90383742cb54"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Jan 28 21:24:54 crc kubenswrapper[4746]: I0128 21:24:54.020871 4746 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/3851eb27-360e-4c5f-b855-90383742cb54-catalog-content\") on node \"crc\" DevicePath \"\""
Jan 28 21:24:54 crc kubenswrapper[4746]: I0128 21:24:54.026945 4746 generic.go:334] "Generic (PLEG): container finished" podID="3851eb27-360e-4c5f-b855-90383742cb54" containerID="8784b03850d1753374fa4de8989acd170cccf8c87fdace725f4b4cf05c4b7161" exitCode=0
Jan 28 21:24:54 crc kubenswrapper[4746]: I0128 21:24:54.026989 4746 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-mnt79" event={"ID":"3851eb27-360e-4c5f-b855-90383742cb54","Type":"ContainerDied","Data":"8784b03850d1753374fa4de8989acd170cccf8c87fdace725f4b4cf05c4b7161"}
Jan 28 21:24:54 crc kubenswrapper[4746]: I0128 21:24:54.027015 4746 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-mnt79"
Jan 28 21:24:54 crc kubenswrapper[4746]: I0128 21:24:54.027030 4746 scope.go:117] "RemoveContainer" containerID="8784b03850d1753374fa4de8989acd170cccf8c87fdace725f4b4cf05c4b7161"
Jan 28 21:24:54 crc kubenswrapper[4746]: I0128 21:24:54.027018 4746 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-mnt79" event={"ID":"3851eb27-360e-4c5f-b855-90383742cb54","Type":"ContainerDied","Data":"052d4427870ca0c222be9aa2eaa3f86cfaffe9ea35268539e503469be5dbb37b"}
Jan 28 21:24:54 crc kubenswrapper[4746]: I0128 21:24:54.058293 4746 scope.go:117] "RemoveContainer" containerID="c08ecd91194066f6320b3eac8d73ac737f62c843430d0733b8784c4d1d6b801c"
Jan 28 21:24:54 crc kubenswrapper[4746]: I0128 21:24:54.076643 4746 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-mnt79"]
Jan 28 21:24:54 crc kubenswrapper[4746]: I0128 21:24:54.085966 4746 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-operators-mnt79"]
Jan 28 21:24:54 crc kubenswrapper[4746]: I0128 21:24:54.087361 4746 scope.go:117] "RemoveContainer" containerID="08e8cb87d9df6b98014958e374ae4e7f7d2e197963129bd905aa8ccf036762a5"
Jan 28 21:24:54 crc kubenswrapper[4746]: I0128 21:24:54.106758 4746 scope.go:117] "RemoveContainer" containerID="8784b03850d1753374fa4de8989acd170cccf8c87fdace725f4b4cf05c4b7161"
Jan 28 21:24:54 crc kubenswrapper[4746]: E0128 21:24:54.107229 4746 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"8784b03850d1753374fa4de8989acd170cccf8c87fdace725f4b4cf05c4b7161\": container with ID starting with 8784b03850d1753374fa4de8989acd170cccf8c87fdace725f4b4cf05c4b7161 not found: ID does not exist" containerID="8784b03850d1753374fa4de8989acd170cccf8c87fdace725f4b4cf05c4b7161"
Jan 28 21:24:54 crc kubenswrapper[4746]: I0128 21:24:54.107260 4746 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"8784b03850d1753374fa4de8989acd170cccf8c87fdace725f4b4cf05c4b7161"} err="failed to get container status \"8784b03850d1753374fa4de8989acd170cccf8c87fdace725f4b4cf05c4b7161\": rpc error: code = NotFound desc = could not find container \"8784b03850d1753374fa4de8989acd170cccf8c87fdace725f4b4cf05c4b7161\": container with ID starting with 8784b03850d1753374fa4de8989acd170cccf8c87fdace725f4b4cf05c4b7161 not found: ID does not exist"
Jan 28 21:24:54 crc kubenswrapper[4746]: I0128 21:24:54.107285 4746 scope.go:117] "RemoveContainer" containerID="c08ecd91194066f6320b3eac8d73ac737f62c843430d0733b8784c4d1d6b801c"
Jan 28 21:24:54 crc kubenswrapper[4746]: E0128 21:24:54.107622 4746 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"c08ecd91194066f6320b3eac8d73ac737f62c843430d0733b8784c4d1d6b801c\": container with ID starting with c08ecd91194066f6320b3eac8d73ac737f62c843430d0733b8784c4d1d6b801c not found: ID does not exist" containerID="c08ecd91194066f6320b3eac8d73ac737f62c843430d0733b8784c4d1d6b801c"
Jan 28 21:24:54 crc kubenswrapper[4746]: I0128 21:24:54.107644 4746 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"c08ecd91194066f6320b3eac8d73ac737f62c843430d0733b8784c4d1d6b801c"} err="failed to get container status \"c08ecd91194066f6320b3eac8d73ac737f62c843430d0733b8784c4d1d6b801c\": rpc error: code = NotFound desc = could not find container \"c08ecd91194066f6320b3eac8d73ac737f62c843430d0733b8784c4d1d6b801c\": container with ID starting with c08ecd91194066f6320b3eac8d73ac737f62c843430d0733b8784c4d1d6b801c not found: ID does not exist"
Jan 28 21:24:54 crc kubenswrapper[4746]: I0128 21:24:54.107658 4746 scope.go:117] "RemoveContainer" containerID="08e8cb87d9df6b98014958e374ae4e7f7d2e197963129bd905aa8ccf036762a5"
Jan 28 21:24:54 crc kubenswrapper[4746]: E0128 21:24:54.107836 4746 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"08e8cb87d9df6b98014958e374ae4e7f7d2e197963129bd905aa8ccf036762a5\": container with ID starting with 08e8cb87d9df6b98014958e374ae4e7f7d2e197963129bd905aa8ccf036762a5 not found: ID does not exist" containerID="08e8cb87d9df6b98014958e374ae4e7f7d2e197963129bd905aa8ccf036762a5"
Jan 28 21:24:54 crc kubenswrapper[4746]: I0128 21:24:54.107856 4746 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"08e8cb87d9df6b98014958e374ae4e7f7d2e197963129bd905aa8ccf036762a5"} err="failed to get container status \"08e8cb87d9df6b98014958e374ae4e7f7d2e197963129bd905aa8ccf036762a5\": rpc error: code = NotFound desc = could not find container \"08e8cb87d9df6b98014958e374ae4e7f7d2e197963129bd905aa8ccf036762a5\": container with ID starting with 08e8cb87d9df6b98014958e374ae4e7f7d2e197963129bd905aa8ccf036762a5 not found: ID does not exist"
Jan 28 21:24:54 crc kubenswrapper[4746]: I0128 21:24:54.859212 4746 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="3851eb27-360e-4c5f-b855-90383742cb54" path="/var/lib/kubelet/pods/3851eb27-360e-4c5f-b855-90383742cb54/volumes"
Jan 28 21:24:55 crc kubenswrapper[4746]: I0128 21:24:55.043417 4746 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-6wrnw" event={"ID":"6dc8b546-9734-4082-b2b3-2bafe3f1564d","Type":"ContainerStarted","Data":"3739da62fe099fce1fae53951ff544988a4a187ac878f9a23423cf57b93b5305"}
Jan 28 21:25:15 crc kubenswrapper[4746]: I0128 21:25:15.256148 4746 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/tempest-tests-tempest"]
Jan 28 21:25:15 crc kubenswrapper[4746]: E0128 21:25:15.257115 4746 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="3851eb27-360e-4c5f-b855-90383742cb54" containerName="extract-utilities"
Jan 28 21:25:15 crc kubenswrapper[4746]: I0128 21:25:15.257132 4746 state_mem.go:107] "Deleted CPUSet assignment" podUID="3851eb27-360e-4c5f-b855-90383742cb54" containerName="extract-utilities"
Jan 28 21:25:15 crc kubenswrapper[4746]: E0128 21:25:15.257149 4746 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="3851eb27-360e-4c5f-b855-90383742cb54" containerName="registry-server"
Jan 28 21:25:15 crc kubenswrapper[4746]: I0128 21:25:15.257157 4746 state_mem.go:107] "Deleted CPUSet assignment" podUID="3851eb27-360e-4c5f-b855-90383742cb54" containerName="registry-server"
Jan 28 21:25:15 crc kubenswrapper[4746]: E0128 21:25:15.257184 4746 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="3851eb27-360e-4c5f-b855-90383742cb54" containerName="extract-content"
Jan 28 21:25:15 crc kubenswrapper[4746]: I0128 21:25:15.257192 4746 state_mem.go:107] "Deleted CPUSet assignment" podUID="3851eb27-360e-4c5f-b855-90383742cb54" containerName="extract-content"
Jan 28 21:25:15 crc kubenswrapper[4746]: I0128 21:25:15.257437 4746 memory_manager.go:354] "RemoveStaleState removing state" podUID="3851eb27-360e-4c5f-b855-90383742cb54" containerName="registry-server"
Jan 28 21:25:15 crc kubenswrapper[4746]: I0128 21:25:15.258408 4746 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/tempest-tests-tempest"
Jan 28 21:25:15 crc kubenswrapper[4746]: I0128 21:25:15.265278 4746 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/tempest-tests-tempest"]
Jan 28 21:25:15 crc kubenswrapper[4746]: I0128 21:25:15.271526 4746 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"default-dockercfg-wc9sh"
Jan 28 21:25:15 crc kubenswrapper[4746]: I0128 21:25:15.271735 4746 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"tempest-tests-tempest-custom-data-s0"
Jan 28 21:25:15 crc kubenswrapper[4746]: I0128 21:25:15.271947 4746 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"tempest-tests-tempest-env-vars-s0"
Jan 28 21:25:15 crc kubenswrapper[4746]: I0128 21:25:15.272114 4746 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"test-operator-controller-priv-key"
Jan 28 21:25:15 crc kubenswrapper[4746]: I0128 21:25:15.361337 4746 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-mcwqv\" (UniqueName: \"kubernetes.io/projected/4bc6c6e0-6feb-4270-ad0e-2e8cbbb3430a-kube-api-access-mcwqv\") pod \"tempest-tests-tempest\" (UID: \"4bc6c6e0-6feb-4270-ad0e-2e8cbbb3430a\") " pod="openstack/tempest-tests-tempest"
Jan 28 21:25:15 crc kubenswrapper[4746]: I0128 21:25:15.361393 4746 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/4bc6c6e0-6feb-4270-ad0e-2e8cbbb3430a-ssh-key\") pod \"tempest-tests-tempest\" (UID: \"4bc6c6e0-6feb-4270-ad0e-2e8cbbb3430a\") " pod="openstack/tempest-tests-tempest"
Jan 28 21:25:15 crc kubenswrapper[4746]: I0128 21:25:15.361433 4746 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"openstack-config-secret\" (UniqueName: \"kubernetes.io/secret/4bc6c6e0-6feb-4270-ad0e-2e8cbbb3430a-openstack-config-secret\") pod \"tempest-tests-tempest\" (UID: \"4bc6c6e0-6feb-4270-ad0e-2e8cbbb3430a\") " pod="openstack/tempest-tests-tempest"
Jan 28 21:25:15 crc kubenswrapper[4746]: I0128 21:25:15.361459 4746 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ca-certs\" (UniqueName: \"kubernetes.io/secret/4bc6c6e0-6feb-4270-ad0e-2e8cbbb3430a-ca-certs\") pod \"tempest-tests-tempest\" (UID: \"4bc6c6e0-6feb-4270-ad0e-2e8cbbb3430a\") " pod="openstack/tempest-tests-tempest"
Jan 28 21:25:15 crc kubenswrapper[4746]: I0128 21:25:15.361487 4746 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"test-operator-ephemeral-temporary\" (UniqueName: \"kubernetes.io/empty-dir/4bc6c6e0-6feb-4270-ad0e-2e8cbbb3430a-test-operator-ephemeral-temporary\") pod \"tempest-tests-tempest\" (UID: \"4bc6c6e0-6feb-4270-ad0e-2e8cbbb3430a\") " pod="openstack/tempest-tests-tempest"
Jan 28 21:25:15 crc kubenswrapper[4746]: I0128 21:25:15.361576 4746 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"local-storage05-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage05-crc\") pod \"tempest-tests-tempest\" (UID: \"4bc6c6e0-6feb-4270-ad0e-2e8cbbb3430a\") " pod="openstack/tempest-tests-tempest"
Jan 28 21:25:15 crc kubenswrapper[4746]: I0128 21:25:15.361602 4746 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"test-operator-ephemeral-workdir\" (UniqueName: \"kubernetes.io/empty-dir/4bc6c6e0-6feb-4270-ad0e-2e8cbbb3430a-test-operator-ephemeral-workdir\") pod \"tempest-tests-tempest\" (UID: \"4bc6c6e0-6feb-4270-ad0e-2e8cbbb3430a\") " pod="openstack/tempest-tests-tempest"
Jan 28 21:25:15 crc kubenswrapper[4746]: I0128 21:25:15.361636 4746 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"openstack-config\" (UniqueName: \"kubernetes.io/configmap/4bc6c6e0-6feb-4270-ad0e-2e8cbbb3430a-openstack-config\") pod \"tempest-tests-tempest\" (UID: \"4bc6c6e0-6feb-4270-ad0e-2e8cbbb3430a\") " pod="openstack/tempest-tests-tempest"
Jan 28 21:25:15 crc kubenswrapper[4746]: I0128 21:25:15.361663 4746 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/4bc6c6e0-6feb-4270-ad0e-2e8cbbb3430a-config-data\") pod \"tempest-tests-tempest\" (UID: \"4bc6c6e0-6feb-4270-ad0e-2e8cbbb3430a\") " pod="openstack/tempest-tests-tempest"
Jan 28 21:25:15 crc kubenswrapper[4746]: I0128 21:25:15.463441 4746 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"openstack-config\" (UniqueName: \"kubernetes.io/configmap/4bc6c6e0-6feb-4270-ad0e-2e8cbbb3430a-openstack-config\") pod \"tempest-tests-tempest\" (UID: \"4bc6c6e0-6feb-4270-ad0e-2e8cbbb3430a\") " pod="openstack/tempest-tests-tempest"
Jan 28 21:25:15 crc kubenswrapper[4746]: I0128 21:25:15.463501 4746 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/4bc6c6e0-6feb-4270-ad0e-2e8cbbb3430a-config-data\") pod \"tempest-tests-tempest\" (UID: \"4bc6c6e0-6feb-4270-ad0e-2e8cbbb3430a\") " pod="openstack/tempest-tests-tempest"
Jan 28 21:25:15 crc kubenswrapper[4746]: I0128 21:25:15.463566 4746 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-mcwqv\" (UniqueName: \"kubernetes.io/projected/4bc6c6e0-6feb-4270-ad0e-2e8cbbb3430a-kube-api-access-mcwqv\") pod \"tempest-tests-tempest\" (UID: \"4bc6c6e0-6feb-4270-ad0e-2e8cbbb3430a\") " pod="openstack/tempest-tests-tempest"
Jan 28 21:25:15 crc kubenswrapper[4746]: I0128 21:25:15.463588 4746 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/4bc6c6e0-6feb-4270-ad0e-2e8cbbb3430a-ssh-key\") pod \"tempest-tests-tempest\" (UID: \"4bc6c6e0-6feb-4270-ad0e-2e8cbbb3430a\") " pod="openstack/tempest-tests-tempest"
Jan 28 21:25:15 crc kubenswrapper[4746]: I0128 21:25:15.463637 4746 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"openstack-config-secret\" (UniqueName: \"kubernetes.io/secret/4bc6c6e0-6feb-4270-ad0e-2e8cbbb3430a-openstack-config-secret\") pod \"tempest-tests-tempest\" (UID: \"4bc6c6e0-6feb-4270-ad0e-2e8cbbb3430a\") " pod="openstack/tempest-tests-tempest"
Jan 28 21:25:15 crc kubenswrapper[4746]: I0128 21:25:15.464457 4746 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ca-certs\" (UniqueName: \"kubernetes.io/secret/4bc6c6e0-6feb-4270-ad0e-2e8cbbb3430a-ca-certs\") pod \"tempest-tests-tempest\" (UID: \"4bc6c6e0-6feb-4270-ad0e-2e8cbbb3430a\") " pod="openstack/tempest-tests-tempest"
Jan 28 21:25:15 crc kubenswrapper[4746]: I0128 21:25:15.464490 4746 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"test-operator-ephemeral-temporary\" (UniqueName: \"kubernetes.io/empty-dir/4bc6c6e0-6feb-4270-ad0e-2e8cbbb3430a-test-operator-ephemeral-temporary\") pod \"tempest-tests-tempest\" (UID: \"4bc6c6e0-6feb-4270-ad0e-2e8cbbb3430a\") " pod="openstack/tempest-tests-tempest"
Jan 28 21:25:15 crc kubenswrapper[4746]: I0128 21:25:15.464556 4746 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"local-storage05-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage05-crc\") pod \"tempest-tests-tempest\" (UID: \"4bc6c6e0-6feb-4270-ad0e-2e8cbbb3430a\") " pod="openstack/tempest-tests-tempest"
Jan 28 21:25:15 crc kubenswrapper[4746]: I0128 21:25:15.464588 4746 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"test-operator-ephemeral-workdir\" (UniqueName: \"kubernetes.io/empty-dir/4bc6c6e0-6feb-4270-ad0e-2e8cbbb3430a-test-operator-ephemeral-workdir\") pod \"tempest-tests-tempest\" (UID: \"4bc6c6e0-6feb-4270-ad0e-2e8cbbb3430a\") "
pod="openstack/tempest-tests-tempest" Jan 28 21:25:15 crc kubenswrapper[4746]: I0128 21:25:15.464903 4746 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"local-storage05-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage05-crc\") pod \"tempest-tests-tempest\" (UID: \"4bc6c6e0-6feb-4270-ad0e-2e8cbbb3430a\") device mount path \"/mnt/openstack/pv05\"" pod="openstack/tempest-tests-tempest" Jan 28 21:25:15 crc kubenswrapper[4746]: I0128 21:25:15.465112 4746 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"test-operator-ephemeral-temporary\" (UniqueName: \"kubernetes.io/empty-dir/4bc6c6e0-6feb-4270-ad0e-2e8cbbb3430a-test-operator-ephemeral-temporary\") pod \"tempest-tests-tempest\" (UID: \"4bc6c6e0-6feb-4270-ad0e-2e8cbbb3430a\") " pod="openstack/tempest-tests-tempest" Jan 28 21:25:15 crc kubenswrapper[4746]: I0128 21:25:15.465351 4746 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"test-operator-ephemeral-workdir\" (UniqueName: \"kubernetes.io/empty-dir/4bc6c6e0-6feb-4270-ad0e-2e8cbbb3430a-test-operator-ephemeral-workdir\") pod \"tempest-tests-tempest\" (UID: \"4bc6c6e0-6feb-4270-ad0e-2e8cbbb3430a\") " pod="openstack/tempest-tests-tempest" Jan 28 21:25:15 crc kubenswrapper[4746]: I0128 21:25:15.465595 4746 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"openstack-config\" (UniqueName: \"kubernetes.io/configmap/4bc6c6e0-6feb-4270-ad0e-2e8cbbb3430a-openstack-config\") pod \"tempest-tests-tempest\" (UID: \"4bc6c6e0-6feb-4270-ad0e-2e8cbbb3430a\") " pod="openstack/tempest-tests-tempest" Jan 28 21:25:15 crc kubenswrapper[4746]: I0128 21:25:15.469608 4746 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/4bc6c6e0-6feb-4270-ad0e-2e8cbbb3430a-ssh-key\") pod \"tempest-tests-tempest\" (UID: \"4bc6c6e0-6feb-4270-ad0e-2e8cbbb3430a\") " pod="openstack/tempest-tests-tempest" Jan 28 21:25:15 crc kubenswrapper[4746]: 
I0128 21:25:15.469788 4746 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"openstack-config-secret\" (UniqueName: \"kubernetes.io/secret/4bc6c6e0-6feb-4270-ad0e-2e8cbbb3430a-openstack-config-secret\") pod \"tempest-tests-tempest\" (UID: \"4bc6c6e0-6feb-4270-ad0e-2e8cbbb3430a\") " pod="openstack/tempest-tests-tempest" Jan 28 21:25:15 crc kubenswrapper[4746]: I0128 21:25:15.469977 4746 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ca-certs\" (UniqueName: \"kubernetes.io/secret/4bc6c6e0-6feb-4270-ad0e-2e8cbbb3430a-ca-certs\") pod \"tempest-tests-tempest\" (UID: \"4bc6c6e0-6feb-4270-ad0e-2e8cbbb3430a\") " pod="openstack/tempest-tests-tempest" Jan 28 21:25:15 crc kubenswrapper[4746]: I0128 21:25:15.471360 4746 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/4bc6c6e0-6feb-4270-ad0e-2e8cbbb3430a-config-data\") pod \"tempest-tests-tempest\" (UID: \"4bc6c6e0-6feb-4270-ad0e-2e8cbbb3430a\") " pod="openstack/tempest-tests-tempest" Jan 28 21:25:15 crc kubenswrapper[4746]: I0128 21:25:15.480847 4746 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-mcwqv\" (UniqueName: \"kubernetes.io/projected/4bc6c6e0-6feb-4270-ad0e-2e8cbbb3430a-kube-api-access-mcwqv\") pod \"tempest-tests-tempest\" (UID: \"4bc6c6e0-6feb-4270-ad0e-2e8cbbb3430a\") " pod="openstack/tempest-tests-tempest" Jan 28 21:25:15 crc kubenswrapper[4746]: I0128 21:25:15.492094 4746 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"local-storage05-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage05-crc\") pod \"tempest-tests-tempest\" (UID: \"4bc6c6e0-6feb-4270-ad0e-2e8cbbb3430a\") " pod="openstack/tempest-tests-tempest" Jan 28 21:25:15 crc kubenswrapper[4746]: I0128 21:25:15.614007 4746 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/tempest-tests-tempest" Jan 28 21:25:16 crc kubenswrapper[4746]: I0128 21:25:16.075379 4746 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/tempest-tests-tempest"] Jan 28 21:25:16 crc kubenswrapper[4746]: I0128 21:25:16.265893 4746 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/tempest-tests-tempest" event={"ID":"4bc6c6e0-6feb-4270-ad0e-2e8cbbb3430a","Type":"ContainerStarted","Data":"67de9d1923b74d98c93d0904acd3bab3423904ed4594c2c1117a4367a9cce647"} Jan 28 21:25:52 crc kubenswrapper[4746]: E0128 21:25:52.150589 4746 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying config: context canceled" image="quay.io/podified-antelope-centos9/openstack-tempest-all:current-podified" Jan 28 21:25:52 crc kubenswrapper[4746]: E0128 21:25:52.151009 4746 kuberuntime_manager.go:1274] "Unhandled Error" err="container &Container{Name:tempest-tests-tempest-tests-runner,Image:quay.io/podified-antelope-centos9/openstack-tempest-all:current-podified,Command:[],Args:[],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:config-data,ReadOnly:false,MountPath:/etc/test_operator,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:test-operator-ephemeral-workdir,ReadOnly:false,MountPath:/var/lib/tempest,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:test-operator-ephemeral-temporary,ReadOnly:false,MountPath:/tmp,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:test-operator-logs,ReadOnly:false,MountPath:/var/lib/tempest/external_files,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:openstack-config,ReadOnly:true,MountPath:/etc/openstack/clouds.yaml,SubPath:clouds.yaml,MountPropagation:nil,SubPathE
xpr:,RecursiveReadOnly:nil,},VolumeMount{Name:openstack-config,ReadOnly:true,MountPath:/var/lib/tempest/.config/openstack/clouds.yaml,SubPath:clouds.yaml,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:openstack-config-secret,ReadOnly:false,MountPath:/etc/openstack/secure.yaml,SubPath:secure.yaml,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:ca-certs,ReadOnly:true,MountPath:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem,SubPath:tls-ca-bundle.pem,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:ssh-key,ReadOnly:false,MountPath:/var/lib/tempest/id_ecdsa,SubPath:ssh_key,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-mcwqv,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*42480,RunAsNonRoot:*false,ReadOnlyRootFilesystem:*false,AllowPrivilegeEscalation:*true,RunAsGroup:*42480,ProcMount:nil,WindowsOptions:nil,SeccompProfile:&SeccompProfile{Type:RuntimeDefault,LocalhostProfile:nil,},AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{EnvFromSource{Prefix:,ConfigMapRef:&ConfigMapEnvSource{LocalObjectReference:LocalObjectReference{Name:tempest-tests-tempest-custom-data-s0,},Optional:nil,},SecretRef:nil,},EnvFromSource{Prefix:,ConfigMapRef:&ConfigMapEnvSource{LocalObjectReference:LocalObjectReference{Name:tempest-tests-tempest-env-vars-s0,},Optional:nil,},SecretRef:nil,},},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod 
tempest-tests-tempest_openstack(4bc6c6e0-6feb-4270-ad0e-2e8cbbb3430a): ErrImagePull: rpc error: code = Canceled desc = copying config: context canceled" logger="UnhandledError" Jan 28 21:25:52 crc kubenswrapper[4746]: E0128 21:25:52.152136 4746 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"tempest-tests-tempest-tests-runner\" with ErrImagePull: \"rpc error: code = Canceled desc = copying config: context canceled\"" pod="openstack/tempest-tests-tempest" podUID="4bc6c6e0-6feb-4270-ad0e-2e8cbbb3430a" Jan 28 21:25:52 crc kubenswrapper[4746]: E0128 21:25:52.680416 4746 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"tempest-tests-tempest-tests-runner\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/podified-antelope-centos9/openstack-tempest-all:current-podified\\\"\"" pod="openstack/tempest-tests-tempest" podUID="4bc6c6e0-6feb-4270-ad0e-2e8cbbb3430a" Jan 28 21:26:04 crc kubenswrapper[4746]: I0128 21:26:04.838952 4746 provider.go:102] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Jan 28 21:26:05 crc kubenswrapper[4746]: I0128 21:26:05.307814 4746 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"tempest-tests-tempest-env-vars-s0" Jan 28 21:26:06 crc kubenswrapper[4746]: I0128 21:26:06.861192 4746 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/tempest-tests-tempest" event={"ID":"4bc6c6e0-6feb-4270-ad0e-2e8cbbb3430a","Type":"ContainerStarted","Data":"3561c8056f1ae1686b6aa97f163e57ba5cc62f618990f8abb7270e4ed44ac421"} Jan 28 21:26:06 crc kubenswrapper[4746]: I0128 21:26:06.886650 4746 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/tempest-tests-tempest" podStartSLOduration=3.657647388 podStartE2EDuration="52.886634989s" podCreationTimestamp="2026-01-28 21:25:14 +0000 UTC" firstStartedPulling="2026-01-28 21:25:16.073808711 +0000 UTC m=+2744.029995065" 
lastFinishedPulling="2026-01-28 21:26:05.302796302 +0000 UTC m=+2793.258982666" observedRunningTime="2026-01-28 21:26:06.884133202 +0000 UTC m=+2794.840319556" watchObservedRunningTime="2026-01-28 21:26:06.886634989 +0000 UTC m=+2794.842821343" Jan 28 21:27:15 crc kubenswrapper[4746]: I0128 21:27:15.871721 4746 patch_prober.go:28] interesting pod/machine-config-daemon-6wrnw container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Jan 28 21:27:15 crc kubenswrapper[4746]: I0128 21:27:15.872299 4746 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-6wrnw" podUID="6dc8b546-9734-4082-b2b3-2bafe3f1564d" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Jan 28 21:27:45 crc kubenswrapper[4746]: I0128 21:27:45.873868 4746 patch_prober.go:28] interesting pod/machine-config-daemon-6wrnw container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Jan 28 21:27:45 crc kubenswrapper[4746]: I0128 21:27:45.874392 4746 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-6wrnw" podUID="6dc8b546-9734-4082-b2b3-2bafe3f1564d" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Jan 28 21:28:15 crc kubenswrapper[4746]: I0128 21:28:15.871434 4746 patch_prober.go:28] interesting pod/machine-config-daemon-6wrnw container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get 
\"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Jan 28 21:28:15 crc kubenswrapper[4746]: I0128 21:28:15.872051 4746 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-6wrnw" podUID="6dc8b546-9734-4082-b2b3-2bafe3f1564d" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Jan 28 21:28:15 crc kubenswrapper[4746]: I0128 21:28:15.872132 4746 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-6wrnw" Jan 28 21:28:15 crc kubenswrapper[4746]: I0128 21:28:15.872983 4746 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"3739da62fe099fce1fae53951ff544988a4a187ac878f9a23423cf57b93b5305"} pod="openshift-machine-config-operator/machine-config-daemon-6wrnw" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted" Jan 28 21:28:15 crc kubenswrapper[4746]: I0128 21:28:15.873049 4746 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-6wrnw" podUID="6dc8b546-9734-4082-b2b3-2bafe3f1564d" containerName="machine-config-daemon" containerID="cri-o://3739da62fe099fce1fae53951ff544988a4a187ac878f9a23423cf57b93b5305" gracePeriod=600 Jan 28 21:28:16 crc kubenswrapper[4746]: I0128 21:28:16.189482 4746 generic.go:334] "Generic (PLEG): container finished" podID="6dc8b546-9734-4082-b2b3-2bafe3f1564d" containerID="3739da62fe099fce1fae53951ff544988a4a187ac878f9a23423cf57b93b5305" exitCode=0 Jan 28 21:28:16 crc kubenswrapper[4746]: I0128 21:28:16.189845 4746 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-6wrnw" 
event={"ID":"6dc8b546-9734-4082-b2b3-2bafe3f1564d","Type":"ContainerDied","Data":"3739da62fe099fce1fae53951ff544988a4a187ac878f9a23423cf57b93b5305"} Jan 28 21:28:16 crc kubenswrapper[4746]: I0128 21:28:16.189937 4746 scope.go:117] "RemoveContainer" containerID="cc873ec6b8a0868869d7fd09d5b5eb7c385e0fac746adb1c9927b6d382729767" Jan 28 21:28:17 crc kubenswrapper[4746]: I0128 21:28:17.199729 4746 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-6wrnw" event={"ID":"6dc8b546-9734-4082-b2b3-2bafe3f1564d","Type":"ContainerStarted","Data":"3efd284713e29e28f4aa6dc8e613ca58f00d02f5b7a245a28f734915518bde66"} Jan 28 21:30:00 crc kubenswrapper[4746]: I0128 21:30:00.170656 4746 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29493930-g8fld"] Jan 28 21:30:00 crc kubenswrapper[4746]: I0128 21:30:00.172724 4746 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29493930-g8fld" Jan 28 21:30:00 crc kubenswrapper[4746]: I0128 21:30:00.176796 4746 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-operator-lifecycle-manager"/"collect-profiles-config" Jan 28 21:30:00 crc kubenswrapper[4746]: I0128 21:30:00.177253 4746 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"collect-profiles-dockercfg-kzf4t" Jan 28 21:30:00 crc kubenswrapper[4746]: I0128 21:30:00.239723 4746 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/d8751a4a-bf68-4cca-aba3-a34077107958-secret-volume\") pod \"collect-profiles-29493930-g8fld\" (UID: \"d8751a4a-bf68-4cca-aba3-a34077107958\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29493930-g8fld" Jan 28 21:30:00 crc kubenswrapper[4746]: I0128 21:30:00.239983 4746 
reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/d8751a4a-bf68-4cca-aba3-a34077107958-config-volume\") pod \"collect-profiles-29493930-g8fld\" (UID: \"d8751a4a-bf68-4cca-aba3-a34077107958\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29493930-g8fld" Jan 28 21:30:00 crc kubenswrapper[4746]: I0128 21:30:00.240197 4746 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-lg2rb\" (UniqueName: \"kubernetes.io/projected/d8751a4a-bf68-4cca-aba3-a34077107958-kube-api-access-lg2rb\") pod \"collect-profiles-29493930-g8fld\" (UID: \"d8751a4a-bf68-4cca-aba3-a34077107958\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29493930-g8fld" Jan 28 21:30:00 crc kubenswrapper[4746]: I0128 21:30:00.281692 4746 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29493930-g8fld"] Jan 28 21:30:00 crc kubenswrapper[4746]: I0128 21:30:00.343406 4746 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/d8751a4a-bf68-4cca-aba3-a34077107958-config-volume\") pod \"collect-profiles-29493930-g8fld\" (UID: \"d8751a4a-bf68-4cca-aba3-a34077107958\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29493930-g8fld" Jan 28 21:30:00 crc kubenswrapper[4746]: I0128 21:30:00.343548 4746 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-lg2rb\" (UniqueName: \"kubernetes.io/projected/d8751a4a-bf68-4cca-aba3-a34077107958-kube-api-access-lg2rb\") pod \"collect-profiles-29493930-g8fld\" (UID: \"d8751a4a-bf68-4cca-aba3-a34077107958\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29493930-g8fld" Jan 28 21:30:00 crc kubenswrapper[4746]: I0128 21:30:00.343611 4746 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/d8751a4a-bf68-4cca-aba3-a34077107958-secret-volume\") pod \"collect-profiles-29493930-g8fld\" (UID: \"d8751a4a-bf68-4cca-aba3-a34077107958\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29493930-g8fld" Jan 28 21:30:00 crc kubenswrapper[4746]: I0128 21:30:00.344490 4746 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/d8751a4a-bf68-4cca-aba3-a34077107958-config-volume\") pod \"collect-profiles-29493930-g8fld\" (UID: \"d8751a4a-bf68-4cca-aba3-a34077107958\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29493930-g8fld" Jan 28 21:30:00 crc kubenswrapper[4746]: I0128 21:30:00.356865 4746 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/d8751a4a-bf68-4cca-aba3-a34077107958-secret-volume\") pod \"collect-profiles-29493930-g8fld\" (UID: \"d8751a4a-bf68-4cca-aba3-a34077107958\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29493930-g8fld" Jan 28 21:30:00 crc kubenswrapper[4746]: I0128 21:30:00.372744 4746 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-lg2rb\" (UniqueName: \"kubernetes.io/projected/d8751a4a-bf68-4cca-aba3-a34077107958-kube-api-access-lg2rb\") pod \"collect-profiles-29493930-g8fld\" (UID: \"d8751a4a-bf68-4cca-aba3-a34077107958\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29493930-g8fld" Jan 28 21:30:00 crc kubenswrapper[4746]: I0128 21:30:00.493310 4746 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29493930-g8fld" Jan 28 21:30:00 crc kubenswrapper[4746]: I0128 21:30:00.936899 4746 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29493930-g8fld"] Jan 28 21:30:01 crc kubenswrapper[4746]: I0128 21:30:01.341580 4746 generic.go:334] "Generic (PLEG): container finished" podID="d8751a4a-bf68-4cca-aba3-a34077107958" containerID="ba05a81245eaab803a66814cabcadfbc91fcd763e232e8cc52f9e4746d5ae8bb" exitCode=0 Jan 28 21:30:01 crc kubenswrapper[4746]: I0128 21:30:01.341621 4746 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29493930-g8fld" event={"ID":"d8751a4a-bf68-4cca-aba3-a34077107958","Type":"ContainerDied","Data":"ba05a81245eaab803a66814cabcadfbc91fcd763e232e8cc52f9e4746d5ae8bb"} Jan 28 21:30:01 crc kubenswrapper[4746]: I0128 21:30:01.341645 4746 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29493930-g8fld" event={"ID":"d8751a4a-bf68-4cca-aba3-a34077107958","Type":"ContainerStarted","Data":"293fd2cf8a1f3defdfa64a2adbaf7146674741f0f7d37e8f5e91a5f19b26a5ab"} Jan 28 21:30:03 crc kubenswrapper[4746]: I0128 21:30:03.096734 4746 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29493930-g8fld" Jan 28 21:30:03 crc kubenswrapper[4746]: I0128 21:30:03.206756 4746 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/d8751a4a-bf68-4cca-aba3-a34077107958-config-volume\") pod \"d8751a4a-bf68-4cca-aba3-a34077107958\" (UID: \"d8751a4a-bf68-4cca-aba3-a34077107958\") " Jan 28 21:30:03 crc kubenswrapper[4746]: I0128 21:30:03.206820 4746 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/d8751a4a-bf68-4cca-aba3-a34077107958-secret-volume\") pod \"d8751a4a-bf68-4cca-aba3-a34077107958\" (UID: \"d8751a4a-bf68-4cca-aba3-a34077107958\") " Jan 28 21:30:03 crc kubenswrapper[4746]: I0128 21:30:03.206922 4746 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-lg2rb\" (UniqueName: \"kubernetes.io/projected/d8751a4a-bf68-4cca-aba3-a34077107958-kube-api-access-lg2rb\") pod \"d8751a4a-bf68-4cca-aba3-a34077107958\" (UID: \"d8751a4a-bf68-4cca-aba3-a34077107958\") " Jan 28 21:30:03 crc kubenswrapper[4746]: I0128 21:30:03.207766 4746 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/d8751a4a-bf68-4cca-aba3-a34077107958-config-volume" (OuterVolumeSpecName: "config-volume") pod "d8751a4a-bf68-4cca-aba3-a34077107958" (UID: "d8751a4a-bf68-4cca-aba3-a34077107958"). InnerVolumeSpecName "config-volume". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 28 21:30:03 crc kubenswrapper[4746]: I0128 21:30:03.214653 4746 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/d8751a4a-bf68-4cca-aba3-a34077107958-secret-volume" (OuterVolumeSpecName: "secret-volume") pod "d8751a4a-bf68-4cca-aba3-a34077107958" (UID: "d8751a4a-bf68-4cca-aba3-a34077107958"). InnerVolumeSpecName "secret-volume". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 28 21:30:03 crc kubenswrapper[4746]: I0128 21:30:03.214810 4746 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/d8751a4a-bf68-4cca-aba3-a34077107958-kube-api-access-lg2rb" (OuterVolumeSpecName: "kube-api-access-lg2rb") pod "d8751a4a-bf68-4cca-aba3-a34077107958" (UID: "d8751a4a-bf68-4cca-aba3-a34077107958"). InnerVolumeSpecName "kube-api-access-lg2rb". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 28 21:30:03 crc kubenswrapper[4746]: I0128 21:30:03.309967 4746 reconciler_common.go:293] "Volume detached for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/d8751a4a-bf68-4cca-aba3-a34077107958-config-volume\") on node \"crc\" DevicePath \"\"" Jan 28 21:30:03 crc kubenswrapper[4746]: I0128 21:30:03.310026 4746 reconciler_common.go:293] "Volume detached for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/d8751a4a-bf68-4cca-aba3-a34077107958-secret-volume\") on node \"crc\" DevicePath \"\"" Jan 28 21:30:03 crc kubenswrapper[4746]: I0128 21:30:03.310047 4746 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-lg2rb\" (UniqueName: \"kubernetes.io/projected/d8751a4a-bf68-4cca-aba3-a34077107958-kube-api-access-lg2rb\") on node \"crc\" DevicePath \"\"" Jan 28 21:30:03 crc kubenswrapper[4746]: I0128 21:30:03.359644 4746 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29493930-g8fld" event={"ID":"d8751a4a-bf68-4cca-aba3-a34077107958","Type":"ContainerDied","Data":"293fd2cf8a1f3defdfa64a2adbaf7146674741f0f7d37e8f5e91a5f19b26a5ab"} Jan 28 21:30:03 crc kubenswrapper[4746]: I0128 21:30:03.359691 4746 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="293fd2cf8a1f3defdfa64a2adbaf7146674741f0f7d37e8f5e91a5f19b26a5ab" Jan 28 21:30:03 crc kubenswrapper[4746]: I0128 21:30:03.359950 4746 util.go:48] "No ready sandbox for pod can be 
found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29493930-g8fld"
Jan 28 21:30:03 crc kubenswrapper[4746]: E0128 21:30:03.570646 4746 cadvisor_stats_provider.go:516] "Partial failure issuing cadvisor.ContainerInfoV2" err="partial failures: [\"/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podd8751a4a_bf68_4cca_aba3_a34077107958.slice\": RecentStats: unable to find data in memory cache], [\"/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podd8751a4a_bf68_4cca_aba3_a34077107958.slice/crio-293fd2cf8a1f3defdfa64a2adbaf7146674741f0f7d37e8f5e91a5f19b26a5ab\": RecentStats: unable to find data in memory cache]"
Jan 28 21:30:04 crc kubenswrapper[4746]: I0128 21:30:04.184495 4746 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29493885-pgrr9"]
Jan 28 21:30:04 crc kubenswrapper[4746]: I0128 21:30:04.196110 4746 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29493885-pgrr9"]
Jan 28 21:30:04 crc kubenswrapper[4746]: I0128 21:30:04.850493 4746 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="189cf38e-34c5-4cd9-ad46-db8bc26b458e" path="/var/lib/kubelet/pods/189cf38e-34c5-4cd9-ad46-db8bc26b458e/volumes"
Jan 28 21:30:45 crc kubenswrapper[4746]: I0128 21:30:45.871296 4746 patch_prober.go:28] interesting pod/machine-config-daemon-6wrnw container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body=
Jan 28 21:30:45 crc kubenswrapper[4746]: I0128 21:30:45.871882 4746 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-6wrnw" podUID="6dc8b546-9734-4082-b2b3-2bafe3f1564d" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused"
Jan 28 21:30:59 crc kubenswrapper[4746]: I0128 21:30:59.497565 4746 scope.go:117] "RemoveContainer" containerID="53d24f4b0daeba9c3cedfb4178cddf963faa1db0f701a2bdaf0499613c7e320c"
Jan 28 21:31:15 crc kubenswrapper[4746]: I0128 21:31:15.871707 4746 patch_prober.go:28] interesting pod/machine-config-daemon-6wrnw container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body=
Jan 28 21:31:15 crc kubenswrapper[4746]: I0128 21:31:15.872215 4746 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-6wrnw" podUID="6dc8b546-9734-4082-b2b3-2bafe3f1564d" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused"
Jan 28 21:31:21 crc kubenswrapper[4746]: I0128 21:31:21.093042 4746 generic.go:334] "Generic (PLEG): container finished" podID="4bc6c6e0-6feb-4270-ad0e-2e8cbbb3430a" containerID="3561c8056f1ae1686b6aa97f163e57ba5cc62f618990f8abb7270e4ed44ac421" exitCode=0
Jan 28 21:31:21 crc kubenswrapper[4746]: I0128 21:31:21.093129 4746 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/tempest-tests-tempest" event={"ID":"4bc6c6e0-6feb-4270-ad0e-2e8cbbb3430a","Type":"ContainerDied","Data":"3561c8056f1ae1686b6aa97f163e57ba5cc62f618990f8abb7270e4ed44ac421"}
Jan 28 21:31:22 crc kubenswrapper[4746]: I0128 21:31:22.760470 4746 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/tempest-tests-tempest"
Jan 28 21:31:22 crc kubenswrapper[4746]: I0128 21:31:22.957494 4746 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"test-operator-ephemeral-workdir\" (UniqueName: \"kubernetes.io/empty-dir/4bc6c6e0-6feb-4270-ad0e-2e8cbbb3430a-test-operator-ephemeral-workdir\") pod \"4bc6c6e0-6feb-4270-ad0e-2e8cbbb3430a\" (UID: \"4bc6c6e0-6feb-4270-ad0e-2e8cbbb3430a\") "
Jan 28 21:31:22 crc kubenswrapper[4746]: I0128 21:31:22.957609 4746 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ca-certs\" (UniqueName: \"kubernetes.io/secret/4bc6c6e0-6feb-4270-ad0e-2e8cbbb3430a-ca-certs\") pod \"4bc6c6e0-6feb-4270-ad0e-2e8cbbb3430a\" (UID: \"4bc6c6e0-6feb-4270-ad0e-2e8cbbb3430a\") "
Jan 28 21:31:22 crc kubenswrapper[4746]: I0128 21:31:22.957693 4746 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"test-operator-ephemeral-temporary\" (UniqueName: \"kubernetes.io/empty-dir/4bc6c6e0-6feb-4270-ad0e-2e8cbbb3430a-test-operator-ephemeral-temporary\") pod \"4bc6c6e0-6feb-4270-ad0e-2e8cbbb3430a\" (UID: \"4bc6c6e0-6feb-4270-ad0e-2e8cbbb3430a\") "
Jan 28 21:31:22 crc kubenswrapper[4746]: I0128 21:31:22.957744 4746 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/4bc6c6e0-6feb-4270-ad0e-2e8cbbb3430a-ssh-key\") pod \"4bc6c6e0-6feb-4270-ad0e-2e8cbbb3430a\" (UID: \"4bc6c6e0-6feb-4270-ad0e-2e8cbbb3430a\") "
Jan 28 21:31:22 crc kubenswrapper[4746]: I0128 21:31:22.957869 4746 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/4bc6c6e0-6feb-4270-ad0e-2e8cbbb3430a-config-data\") pod \"4bc6c6e0-6feb-4270-ad0e-2e8cbbb3430a\" (UID: \"4bc6c6e0-6feb-4270-ad0e-2e8cbbb3430a\") "
Jan 28 21:31:22 crc kubenswrapper[4746]: I0128 21:31:22.957916 4746 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"openstack-config-secret\" (UniqueName: \"kubernetes.io/secret/4bc6c6e0-6feb-4270-ad0e-2e8cbbb3430a-openstack-config-secret\") pod \"4bc6c6e0-6feb-4270-ad0e-2e8cbbb3430a\" (UID: \"4bc6c6e0-6feb-4270-ad0e-2e8cbbb3430a\") "
Jan 28 21:31:22 crc kubenswrapper[4746]: I0128 21:31:22.957954 4746 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-mcwqv\" (UniqueName: \"kubernetes.io/projected/4bc6c6e0-6feb-4270-ad0e-2e8cbbb3430a-kube-api-access-mcwqv\") pod \"4bc6c6e0-6feb-4270-ad0e-2e8cbbb3430a\" (UID: \"4bc6c6e0-6feb-4270-ad0e-2e8cbbb3430a\") "
Jan 28 21:31:22 crc kubenswrapper[4746]: I0128 21:31:22.957978 4746 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"test-operator-logs\" (UniqueName: \"kubernetes.io/local-volume/local-storage05-crc\") pod \"4bc6c6e0-6feb-4270-ad0e-2e8cbbb3430a\" (UID: \"4bc6c6e0-6feb-4270-ad0e-2e8cbbb3430a\") "
Jan 28 21:31:22 crc kubenswrapper[4746]: I0128 21:31:22.958019 4746 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"openstack-config\" (UniqueName: \"kubernetes.io/configmap/4bc6c6e0-6feb-4270-ad0e-2e8cbbb3430a-openstack-config\") pod \"4bc6c6e0-6feb-4270-ad0e-2e8cbbb3430a\" (UID: \"4bc6c6e0-6feb-4270-ad0e-2e8cbbb3430a\") "
Jan 28 21:31:22 crc kubenswrapper[4746]: I0128 21:31:22.958162 4746 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/4bc6c6e0-6feb-4270-ad0e-2e8cbbb3430a-test-operator-ephemeral-temporary" (OuterVolumeSpecName: "test-operator-ephemeral-temporary") pod "4bc6c6e0-6feb-4270-ad0e-2e8cbbb3430a" (UID: "4bc6c6e0-6feb-4270-ad0e-2e8cbbb3430a"). InnerVolumeSpecName "test-operator-ephemeral-temporary". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Jan 28 21:31:22 crc kubenswrapper[4746]: I0128 21:31:22.958688 4746 reconciler_common.go:293] "Volume detached for volume \"test-operator-ephemeral-temporary\" (UniqueName: \"kubernetes.io/empty-dir/4bc6c6e0-6feb-4270-ad0e-2e8cbbb3430a-test-operator-ephemeral-temporary\") on node \"crc\" DevicePath \"\""
Jan 28 21:31:22 crc kubenswrapper[4746]: I0128 21:31:22.962842 4746 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/4bc6c6e0-6feb-4270-ad0e-2e8cbbb3430a-config-data" (OuterVolumeSpecName: "config-data") pod "4bc6c6e0-6feb-4270-ad0e-2e8cbbb3430a" (UID: "4bc6c6e0-6feb-4270-ad0e-2e8cbbb3430a"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Jan 28 21:31:22 crc kubenswrapper[4746]: I0128 21:31:22.964446 4746 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/4bc6c6e0-6feb-4270-ad0e-2e8cbbb3430a-kube-api-access-mcwqv" (OuterVolumeSpecName: "kube-api-access-mcwqv") pod "4bc6c6e0-6feb-4270-ad0e-2e8cbbb3430a" (UID: "4bc6c6e0-6feb-4270-ad0e-2e8cbbb3430a"). InnerVolumeSpecName "kube-api-access-mcwqv". PluginName "kubernetes.io/projected", VolumeGidValue ""
Jan 28 21:31:22 crc kubenswrapper[4746]: I0128 21:31:22.970316 4746 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/local-volume/local-storage05-crc" (OuterVolumeSpecName: "test-operator-logs") pod "4bc6c6e0-6feb-4270-ad0e-2e8cbbb3430a" (UID: "4bc6c6e0-6feb-4270-ad0e-2e8cbbb3430a"). InnerVolumeSpecName "local-storage05-crc". PluginName "kubernetes.io/local-volume", VolumeGidValue ""
Jan 28 21:31:22 crc kubenswrapper[4746]: I0128 21:31:22.992241 4746 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/4bc6c6e0-6feb-4270-ad0e-2e8cbbb3430a-openstack-config-secret" (OuterVolumeSpecName: "openstack-config-secret") pod "4bc6c6e0-6feb-4270-ad0e-2e8cbbb3430a" (UID: "4bc6c6e0-6feb-4270-ad0e-2e8cbbb3430a"). InnerVolumeSpecName "openstack-config-secret". PluginName "kubernetes.io/secret", VolumeGidValue ""
Jan 28 21:31:23 crc kubenswrapper[4746]: I0128 21:31:23.015321 4746 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/4bc6c6e0-6feb-4270-ad0e-2e8cbbb3430a-ssh-key" (OuterVolumeSpecName: "ssh-key") pod "4bc6c6e0-6feb-4270-ad0e-2e8cbbb3430a" (UID: "4bc6c6e0-6feb-4270-ad0e-2e8cbbb3430a"). InnerVolumeSpecName "ssh-key". PluginName "kubernetes.io/secret", VolumeGidValue ""
Jan 28 21:31:23 crc kubenswrapper[4746]: I0128 21:31:23.016875 4746 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/4bc6c6e0-6feb-4270-ad0e-2e8cbbb3430a-ca-certs" (OuterVolumeSpecName: "ca-certs") pod "4bc6c6e0-6feb-4270-ad0e-2e8cbbb3430a" (UID: "4bc6c6e0-6feb-4270-ad0e-2e8cbbb3430a"). InnerVolumeSpecName "ca-certs". PluginName "kubernetes.io/secret", VolumeGidValue ""
Jan 28 21:31:23 crc kubenswrapper[4746]: I0128 21:31:23.037523 4746 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/4bc6c6e0-6feb-4270-ad0e-2e8cbbb3430a-openstack-config" (OuterVolumeSpecName: "openstack-config") pod "4bc6c6e0-6feb-4270-ad0e-2e8cbbb3430a" (UID: "4bc6c6e0-6feb-4270-ad0e-2e8cbbb3430a"). InnerVolumeSpecName "openstack-config". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Jan 28 21:31:23 crc kubenswrapper[4746]: I0128 21:31:23.060557 4746 reconciler_common.go:293] "Volume detached for volume \"ca-certs\" (UniqueName: \"kubernetes.io/secret/4bc6c6e0-6feb-4270-ad0e-2e8cbbb3430a-ca-certs\") on node \"crc\" DevicePath \"\""
Jan 28 21:31:23 crc kubenswrapper[4746]: I0128 21:31:23.060679 4746 reconciler_common.go:293] "Volume detached for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/4bc6c6e0-6feb-4270-ad0e-2e8cbbb3430a-ssh-key\") on node \"crc\" DevicePath \"\""
Jan 28 21:31:23 crc kubenswrapper[4746]: I0128 21:31:23.060736 4746 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/4bc6c6e0-6feb-4270-ad0e-2e8cbbb3430a-config-data\") on node \"crc\" DevicePath \"\""
Jan 28 21:31:23 crc kubenswrapper[4746]: I0128 21:31:23.060797 4746 reconciler_common.go:293] "Volume detached for volume \"openstack-config-secret\" (UniqueName: \"kubernetes.io/secret/4bc6c6e0-6feb-4270-ad0e-2e8cbbb3430a-openstack-config-secret\") on node \"crc\" DevicePath \"\""
Jan 28 21:31:23 crc kubenswrapper[4746]: I0128 21:31:23.060852 4746 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-mcwqv\" (UniqueName: \"kubernetes.io/projected/4bc6c6e0-6feb-4270-ad0e-2e8cbbb3430a-kube-api-access-mcwqv\") on node \"crc\" DevicePath \"\""
Jan 28 21:31:23 crc kubenswrapper[4746]: I0128 21:31:23.060920 4746 reconciler_common.go:286] "operationExecutor.UnmountDevice started for volume \"local-storage05-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage05-crc\") on node \"crc\" "
Jan 28 21:31:23 crc kubenswrapper[4746]: I0128 21:31:23.060978 4746 reconciler_common.go:293] "Volume detached for volume \"openstack-config\" (UniqueName: \"kubernetes.io/configmap/4bc6c6e0-6feb-4270-ad0e-2e8cbbb3430a-openstack-config\") on node \"crc\" DevicePath \"\""
Jan 28 21:31:23 crc kubenswrapper[4746]: I0128 21:31:23.081033 4746 operation_generator.go:917] UnmountDevice succeeded for volume "local-storage05-crc" (UniqueName: "kubernetes.io/local-volume/local-storage05-crc") on node "crc"
Jan 28 21:31:23 crc kubenswrapper[4746]: I0128 21:31:23.116912 4746 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/tempest-tests-tempest" event={"ID":"4bc6c6e0-6feb-4270-ad0e-2e8cbbb3430a","Type":"ContainerDied","Data":"67de9d1923b74d98c93d0904acd3bab3423904ed4594c2c1117a4367a9cce647"}
Jan 28 21:31:23 crc kubenswrapper[4746]: I0128 21:31:23.116949 4746 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="67de9d1923b74d98c93d0904acd3bab3423904ed4594c2c1117a4367a9cce647"
Jan 28 21:31:23 crc kubenswrapper[4746]: I0128 21:31:23.117333 4746 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/tempest-tests-tempest"
Jan 28 21:31:23 crc kubenswrapper[4746]: I0128 21:31:23.163427 4746 reconciler_common.go:293] "Volume detached for volume \"local-storage05-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage05-crc\") on node \"crc\" DevicePath \"\""
Jan 28 21:31:23 crc kubenswrapper[4746]: I0128 21:31:23.405353 4746 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/4bc6c6e0-6feb-4270-ad0e-2e8cbbb3430a-test-operator-ephemeral-workdir" (OuterVolumeSpecName: "test-operator-ephemeral-workdir") pod "4bc6c6e0-6feb-4270-ad0e-2e8cbbb3430a" (UID: "4bc6c6e0-6feb-4270-ad0e-2e8cbbb3430a"). InnerVolumeSpecName "test-operator-ephemeral-workdir". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Jan 28 21:31:23 crc kubenswrapper[4746]: I0128 21:31:23.469914 4746 reconciler_common.go:293] "Volume detached for volume \"test-operator-ephemeral-workdir\" (UniqueName: \"kubernetes.io/empty-dir/4bc6c6e0-6feb-4270-ad0e-2e8cbbb3430a-test-operator-ephemeral-workdir\") on node \"crc\" DevicePath \"\""
Jan 28 21:31:31 crc kubenswrapper[4746]: I0128 21:31:31.535334 4746 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-marketplace-v8xb2"]
Jan 28 21:31:31 crc kubenswrapper[4746]: E0128 21:31:31.537183 4746 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="d8751a4a-bf68-4cca-aba3-a34077107958" containerName="collect-profiles"
Jan 28 21:31:31 crc kubenswrapper[4746]: I0128 21:31:31.537269 4746 state_mem.go:107] "Deleted CPUSet assignment" podUID="d8751a4a-bf68-4cca-aba3-a34077107958" containerName="collect-profiles"
Jan 28 21:31:31 crc kubenswrapper[4746]: E0128 21:31:31.537374 4746 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="4bc6c6e0-6feb-4270-ad0e-2e8cbbb3430a" containerName="tempest-tests-tempest-tests-runner"
Jan 28 21:31:31 crc kubenswrapper[4746]: I0128 21:31:31.537450 4746 state_mem.go:107] "Deleted CPUSet assignment" podUID="4bc6c6e0-6feb-4270-ad0e-2e8cbbb3430a" containerName="tempest-tests-tempest-tests-runner"
Jan 28 21:31:31 crc kubenswrapper[4746]: I0128 21:31:31.537726 4746 memory_manager.go:354] "RemoveStaleState removing state" podUID="d8751a4a-bf68-4cca-aba3-a34077107958" containerName="collect-profiles"
Jan 28 21:31:31 crc kubenswrapper[4746]: I0128 21:31:31.537809 4746 memory_manager.go:354] "RemoveStaleState removing state" podUID="4bc6c6e0-6feb-4270-ad0e-2e8cbbb3430a" containerName="tempest-tests-tempest-tests-runner"
Jan 28 21:31:31 crc kubenswrapper[4746]: I0128 21:31:31.539389 4746 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-v8xb2"
Jan 28 21:31:31 crc kubenswrapper[4746]: I0128 21:31:31.557991 4746 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-v8xb2"]
Jan 28 21:31:31 crc kubenswrapper[4746]: I0128 21:31:31.637940 4746 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-9frkz\" (UniqueName: \"kubernetes.io/projected/19674eb4-9d31-4ea3-b715-63a965481ba4-kube-api-access-9frkz\") pod \"redhat-marketplace-v8xb2\" (UID: \"19674eb4-9d31-4ea3-b715-63a965481ba4\") " pod="openshift-marketplace/redhat-marketplace-v8xb2"
Jan 28 21:31:31 crc kubenswrapper[4746]: I0128 21:31:31.638197 4746 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/19674eb4-9d31-4ea3-b715-63a965481ba4-catalog-content\") pod \"redhat-marketplace-v8xb2\" (UID: \"19674eb4-9d31-4ea3-b715-63a965481ba4\") " pod="openshift-marketplace/redhat-marketplace-v8xb2"
Jan 28 21:31:31 crc kubenswrapper[4746]: I0128 21:31:31.638226 4746 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/19674eb4-9d31-4ea3-b715-63a965481ba4-utilities\") pod \"redhat-marketplace-v8xb2\" (UID: \"19674eb4-9d31-4ea3-b715-63a965481ba4\") " pod="openshift-marketplace/redhat-marketplace-v8xb2"
Jan 28 21:31:31 crc kubenswrapper[4746]: I0128 21:31:31.740229 4746 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/19674eb4-9d31-4ea3-b715-63a965481ba4-catalog-content\") pod \"redhat-marketplace-v8xb2\" (UID: \"19674eb4-9d31-4ea3-b715-63a965481ba4\") " pod="openshift-marketplace/redhat-marketplace-v8xb2"
Jan 28 21:31:31 crc kubenswrapper[4746]: I0128 21:31:31.740280 4746 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/19674eb4-9d31-4ea3-b715-63a965481ba4-utilities\") pod \"redhat-marketplace-v8xb2\" (UID: \"19674eb4-9d31-4ea3-b715-63a965481ba4\") " pod="openshift-marketplace/redhat-marketplace-v8xb2"
Jan 28 21:31:31 crc kubenswrapper[4746]: I0128 21:31:31.740348 4746 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-9frkz\" (UniqueName: \"kubernetes.io/projected/19674eb4-9d31-4ea3-b715-63a965481ba4-kube-api-access-9frkz\") pod \"redhat-marketplace-v8xb2\" (UID: \"19674eb4-9d31-4ea3-b715-63a965481ba4\") " pod="openshift-marketplace/redhat-marketplace-v8xb2"
Jan 28 21:31:31 crc kubenswrapper[4746]: I0128 21:31:31.740794 4746 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/19674eb4-9d31-4ea3-b715-63a965481ba4-catalog-content\") pod \"redhat-marketplace-v8xb2\" (UID: \"19674eb4-9d31-4ea3-b715-63a965481ba4\") " pod="openshift-marketplace/redhat-marketplace-v8xb2"
Jan 28 21:31:31 crc kubenswrapper[4746]: I0128 21:31:31.740915 4746 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/19674eb4-9d31-4ea3-b715-63a965481ba4-utilities\") pod \"redhat-marketplace-v8xb2\" (UID: \"19674eb4-9d31-4ea3-b715-63a965481ba4\") " pod="openshift-marketplace/redhat-marketplace-v8xb2"
Jan 28 21:31:31 crc kubenswrapper[4746]: I0128 21:31:31.763755 4746 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-9frkz\" (UniqueName: \"kubernetes.io/projected/19674eb4-9d31-4ea3-b715-63a965481ba4-kube-api-access-9frkz\") pod \"redhat-marketplace-v8xb2\" (UID: \"19674eb4-9d31-4ea3-b715-63a965481ba4\") " pod="openshift-marketplace/redhat-marketplace-v8xb2"
Jan 28 21:31:31 crc kubenswrapper[4746]: I0128 21:31:31.823223 4746 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/test-operator-logs-pod-tempest-tempest-tests-tempest"]
Jan 28 21:31:31 crc kubenswrapper[4746]: I0128 21:31:31.826219 4746 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/test-operator-logs-pod-tempest-tempest-tests-tempest"
Jan 28 21:31:31 crc kubenswrapper[4746]: I0128 21:31:31.854270 4746 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"default-dockercfg-wc9sh"
Jan 28 21:31:31 crc kubenswrapper[4746]: I0128 21:31:31.859349 4746 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-v8xb2"
Jan 28 21:31:31 crc kubenswrapper[4746]: I0128 21:31:31.867024 4746 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/test-operator-logs-pod-tempest-tempest-tests-tempest"]
Jan 28 21:31:31 crc kubenswrapper[4746]: I0128 21:31:31.958929 4746 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"local-storage05-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage05-crc\") pod \"test-operator-logs-pod-tempest-tempest-tests-tempest\" (UID: \"e4a0e4c1-a20f-4efd-aae8-3345c0c9c0f7\") " pod="openstack/test-operator-logs-pod-tempest-tempest-tests-tempest"
Jan 28 21:31:31 crc kubenswrapper[4746]: I0128 21:31:31.959111 4746 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-m4ccn\" (UniqueName: \"kubernetes.io/projected/e4a0e4c1-a20f-4efd-aae8-3345c0c9c0f7-kube-api-access-m4ccn\") pod \"test-operator-logs-pod-tempest-tempest-tests-tempest\" (UID: \"e4a0e4c1-a20f-4efd-aae8-3345c0c9c0f7\") " pod="openstack/test-operator-logs-pod-tempest-tempest-tests-tempest"
Jan 28 21:31:32 crc kubenswrapper[4746]: I0128 21:31:32.065952 4746 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"local-storage05-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage05-crc\") pod \"test-operator-logs-pod-tempest-tempest-tests-tempest\" (UID: \"e4a0e4c1-a20f-4efd-aae8-3345c0c9c0f7\") " pod="openstack/test-operator-logs-pod-tempest-tempest-tests-tempest"
Jan 28 21:31:32 crc kubenswrapper[4746]: I0128 21:31:32.066060 4746 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-m4ccn\" (UniqueName: \"kubernetes.io/projected/e4a0e4c1-a20f-4efd-aae8-3345c0c9c0f7-kube-api-access-m4ccn\") pod \"test-operator-logs-pod-tempest-tempest-tests-tempest\" (UID: \"e4a0e4c1-a20f-4efd-aae8-3345c0c9c0f7\") " pod="openstack/test-operator-logs-pod-tempest-tempest-tests-tempest"
Jan 28 21:31:32 crc kubenswrapper[4746]: I0128 21:31:32.066377 4746 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"local-storage05-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage05-crc\") pod \"test-operator-logs-pod-tempest-tempest-tests-tempest\" (UID: \"e4a0e4c1-a20f-4efd-aae8-3345c0c9c0f7\") device mount path \"/mnt/openstack/pv05\"" pod="openstack/test-operator-logs-pod-tempest-tempest-tests-tempest"
Jan 28 21:31:32 crc kubenswrapper[4746]: I0128 21:31:32.105507 4746 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-m4ccn\" (UniqueName: \"kubernetes.io/projected/e4a0e4c1-a20f-4efd-aae8-3345c0c9c0f7-kube-api-access-m4ccn\") pod \"test-operator-logs-pod-tempest-tempest-tests-tempest\" (UID: \"e4a0e4c1-a20f-4efd-aae8-3345c0c9c0f7\") " pod="openstack/test-operator-logs-pod-tempest-tempest-tests-tempest"
Jan 28 21:31:32 crc kubenswrapper[4746]: I0128 21:31:32.128459 4746 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"local-storage05-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage05-crc\") pod \"test-operator-logs-pod-tempest-tempest-tests-tempest\" (UID: \"e4a0e4c1-a20f-4efd-aae8-3345c0c9c0f7\") " pod="openstack/test-operator-logs-pod-tempest-tempest-tests-tempest"
Jan 28 21:31:32 crc kubenswrapper[4746]: I0128 21:31:32.182650 4746 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/test-operator-logs-pod-tempest-tempest-tests-tempest"
Jan 28 21:31:32 crc kubenswrapper[4746]: I0128 21:31:32.425053 4746 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-v8xb2"]
Jan 28 21:31:32 crc kubenswrapper[4746]: I0128 21:31:32.714258 4746 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/test-operator-logs-pod-tempest-tempest-tests-tempest"]
Jan 28 21:31:32 crc kubenswrapper[4746]: I0128 21:31:32.741488 4746 provider.go:102] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider
Jan 28 21:31:33 crc kubenswrapper[4746]: I0128 21:31:33.215728 4746 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/test-operator-logs-pod-tempest-tempest-tests-tempest" event={"ID":"e4a0e4c1-a20f-4efd-aae8-3345c0c9c0f7","Type":"ContainerStarted","Data":"655d4c7723031bdac4c51014cae17ae829bf7b2d9766cac56af3b8fd1026381f"}
Jan 28 21:31:33 crc kubenswrapper[4746]: I0128 21:31:33.217680 4746 generic.go:334] "Generic (PLEG): container finished" podID="19674eb4-9d31-4ea3-b715-63a965481ba4" containerID="1b99181394d7c43463c16b4ab8823c3aedb7258a516591b53e0a2d55dc6509e2" exitCode=0
Jan 28 21:31:33 crc kubenswrapper[4746]: I0128 21:31:33.217720 4746 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-v8xb2" event={"ID":"19674eb4-9d31-4ea3-b715-63a965481ba4","Type":"ContainerDied","Data":"1b99181394d7c43463c16b4ab8823c3aedb7258a516591b53e0a2d55dc6509e2"}
Jan 28 21:31:33 crc kubenswrapper[4746]: I0128 21:31:33.217741 4746 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-v8xb2" event={"ID":"19674eb4-9d31-4ea3-b715-63a965481ba4","Type":"ContainerStarted","Data":"db7da7a3aa7e713020e6dabbbea2eb34d0e7eded09916e2776b549adae30b65f"}
Jan 28 21:31:34 crc kubenswrapper[4746]: I0128 21:31:34.229200 4746 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-v8xb2" event={"ID":"19674eb4-9d31-4ea3-b715-63a965481ba4","Type":"ContainerStarted","Data":"1a2247eef8f11169391e8430aa02439356601dcaca81b22257ed8948afebfeff"}
Jan 28 21:31:34 crc kubenswrapper[4746]: I0128 21:31:34.230963 4746 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/test-operator-logs-pod-tempest-tempest-tests-tempest" event={"ID":"e4a0e4c1-a20f-4efd-aae8-3345c0c9c0f7","Type":"ContainerStarted","Data":"2c544b90148cb71e7b9bd91ce3eabfdbd0652e9bd066d79b21ef767f2e8cb59b"}
Jan 28 21:31:34 crc kubenswrapper[4746]: I0128 21:31:34.269874 4746 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/test-operator-logs-pod-tempest-tempest-tests-tempest" podStartSLOduration=2.393050115 podStartE2EDuration="3.269853341s" podCreationTimestamp="2026-01-28 21:31:31 +0000 UTC" firstStartedPulling="2026-01-28 21:31:32.741207809 +0000 UTC m=+3120.697394163" lastFinishedPulling="2026-01-28 21:31:33.618011035 +0000 UTC m=+3121.574197389" observedRunningTime="2026-01-28 21:31:34.267553209 +0000 UTC m=+3122.223739563" watchObservedRunningTime="2026-01-28 21:31:34.269853341 +0000 UTC m=+3122.226039705"
Jan 28 21:31:35 crc kubenswrapper[4746]: I0128 21:31:35.244660 4746 generic.go:334] "Generic (PLEG): container finished" podID="19674eb4-9d31-4ea3-b715-63a965481ba4" containerID="1a2247eef8f11169391e8430aa02439356601dcaca81b22257ed8948afebfeff" exitCode=0
Jan 28 21:31:35 crc kubenswrapper[4746]: I0128 21:31:35.246319 4746 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-v8xb2" event={"ID":"19674eb4-9d31-4ea3-b715-63a965481ba4","Type":"ContainerDied","Data":"1a2247eef8f11169391e8430aa02439356601dcaca81b22257ed8948afebfeff"}
Jan 28 21:31:36 crc kubenswrapper[4746]: I0128 21:31:36.258465 4746 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-v8xb2" event={"ID":"19674eb4-9d31-4ea3-b715-63a965481ba4","Type":"ContainerStarted","Data":"a509768283d484be1b68391a1bdb8d5922e38f4db17b10ad51a17af863f51518"}
Jan 28 21:31:36 crc kubenswrapper[4746]: I0128 21:31:36.289827 4746 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-marketplace-v8xb2" podStartSLOduration=2.815366216 podStartE2EDuration="5.289805936s" podCreationTimestamp="2026-01-28 21:31:31 +0000 UTC" firstStartedPulling="2026-01-28 21:31:33.219739605 +0000 UTC m=+3121.175925959" lastFinishedPulling="2026-01-28 21:31:35.694179325 +0000 UTC m=+3123.650365679" observedRunningTime="2026-01-28 21:31:36.283851084 +0000 UTC m=+3124.240037448" watchObservedRunningTime="2026-01-28 21:31:36.289805936 +0000 UTC m=+3124.245992290"
Jan 28 21:31:41 crc kubenswrapper[4746]: I0128 21:31:41.859595 4746 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-marketplace-v8xb2"
Jan 28 21:31:41 crc kubenswrapper[4746]: I0128 21:31:41.860248 4746 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-marketplace-v8xb2"
Jan 28 21:31:41 crc kubenswrapper[4746]: I0128 21:31:41.924741 4746 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-marketplace-v8xb2"
Jan 28 21:31:42 crc kubenswrapper[4746]: I0128 21:31:42.360756 4746 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-marketplace-v8xb2"
Jan 28 21:31:42 crc kubenswrapper[4746]: I0128 21:31:42.405219 4746 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-v8xb2"]
Jan 28 21:31:44 crc kubenswrapper[4746]: I0128 21:31:44.335698 4746 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-marketplace-v8xb2" podUID="19674eb4-9d31-4ea3-b715-63a965481ba4" containerName="registry-server" containerID="cri-o://a509768283d484be1b68391a1bdb8d5922e38f4db17b10ad51a17af863f51518" gracePeriod=2
Jan 28 21:31:45 crc kubenswrapper[4746]: I0128 21:31:45.082910 4746 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-v8xb2"
Jan 28 21:31:45 crc kubenswrapper[4746]: I0128 21:31:45.268868 4746 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-9frkz\" (UniqueName: \"kubernetes.io/projected/19674eb4-9d31-4ea3-b715-63a965481ba4-kube-api-access-9frkz\") pod \"19674eb4-9d31-4ea3-b715-63a965481ba4\" (UID: \"19674eb4-9d31-4ea3-b715-63a965481ba4\") "
Jan 28 21:31:45 crc kubenswrapper[4746]: I0128 21:31:45.269071 4746 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/19674eb4-9d31-4ea3-b715-63a965481ba4-utilities\") pod \"19674eb4-9d31-4ea3-b715-63a965481ba4\" (UID: \"19674eb4-9d31-4ea3-b715-63a965481ba4\") "
Jan 28 21:31:45 crc kubenswrapper[4746]: I0128 21:31:45.269130 4746 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/19674eb4-9d31-4ea3-b715-63a965481ba4-catalog-content\") pod \"19674eb4-9d31-4ea3-b715-63a965481ba4\" (UID: \"19674eb4-9d31-4ea3-b715-63a965481ba4\") "
Jan 28 21:31:45 crc kubenswrapper[4746]: I0128 21:31:45.281172 4746 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/19674eb4-9d31-4ea3-b715-63a965481ba4-utilities" (OuterVolumeSpecName: "utilities") pod "19674eb4-9d31-4ea3-b715-63a965481ba4" (UID: "19674eb4-9d31-4ea3-b715-63a965481ba4"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Jan 28 21:31:45 crc kubenswrapper[4746]: I0128 21:31:45.285520 4746 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/19674eb4-9d31-4ea3-b715-63a965481ba4-kube-api-access-9frkz" (OuterVolumeSpecName: "kube-api-access-9frkz") pod "19674eb4-9d31-4ea3-b715-63a965481ba4" (UID: "19674eb4-9d31-4ea3-b715-63a965481ba4"). InnerVolumeSpecName "kube-api-access-9frkz". PluginName "kubernetes.io/projected", VolumeGidValue ""
Jan 28 21:31:45 crc kubenswrapper[4746]: I0128 21:31:45.305030 4746 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/19674eb4-9d31-4ea3-b715-63a965481ba4-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "19674eb4-9d31-4ea3-b715-63a965481ba4" (UID: "19674eb4-9d31-4ea3-b715-63a965481ba4"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Jan 28 21:31:45 crc kubenswrapper[4746]: I0128 21:31:45.347649 4746 generic.go:334] "Generic (PLEG): container finished" podID="19674eb4-9d31-4ea3-b715-63a965481ba4" containerID="a509768283d484be1b68391a1bdb8d5922e38f4db17b10ad51a17af863f51518" exitCode=0
Jan 28 21:31:45 crc kubenswrapper[4746]: I0128 21:31:45.347700 4746 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-v8xb2" event={"ID":"19674eb4-9d31-4ea3-b715-63a965481ba4","Type":"ContainerDied","Data":"a509768283d484be1b68391a1bdb8d5922e38f4db17b10ad51a17af863f51518"}
Jan 28 21:31:45 crc kubenswrapper[4746]: I0128 21:31:45.347733 4746 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-v8xb2" event={"ID":"19674eb4-9d31-4ea3-b715-63a965481ba4","Type":"ContainerDied","Data":"db7da7a3aa7e713020e6dabbbea2eb34d0e7eded09916e2776b549adae30b65f"}
Jan 28 21:31:45 crc kubenswrapper[4746]: I0128 21:31:45.347752 4746 scope.go:117] "RemoveContainer" containerID="a509768283d484be1b68391a1bdb8d5922e38f4db17b10ad51a17af863f51518"
Jan 28 21:31:45 crc kubenswrapper[4746]: I0128 21:31:45.347877 4746 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-v8xb2"
Jan 28 21:31:45 crc kubenswrapper[4746]: I0128 21:31:45.371745 4746 scope.go:117] "RemoveContainer" containerID="1a2247eef8f11169391e8430aa02439356601dcaca81b22257ed8948afebfeff"
Jan 28 21:31:45 crc kubenswrapper[4746]: I0128 21:31:45.372816 4746 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-9frkz\" (UniqueName: \"kubernetes.io/projected/19674eb4-9d31-4ea3-b715-63a965481ba4-kube-api-access-9frkz\") on node \"crc\" DevicePath \"\""
Jan 28 21:31:45 crc kubenswrapper[4746]: I0128 21:31:45.372863 4746 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/19674eb4-9d31-4ea3-b715-63a965481ba4-utilities\") on node \"crc\" DevicePath \"\""
Jan 28 21:31:45 crc kubenswrapper[4746]: I0128 21:31:45.372875 4746 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/19674eb4-9d31-4ea3-b715-63a965481ba4-catalog-content\") on node \"crc\" DevicePath \"\""
Jan 28 21:31:45 crc kubenswrapper[4746]: I0128 21:31:45.385438 4746 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-v8xb2"]
Jan 28 21:31:45 crc kubenswrapper[4746]: I0128 21:31:45.395661 4746 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-marketplace-v8xb2"]
Jan 28 21:31:45 crc kubenswrapper[4746]: I0128 21:31:45.407256 4746 scope.go:117] "RemoveContainer" containerID="1b99181394d7c43463c16b4ab8823c3aedb7258a516591b53e0a2d55dc6509e2"
Jan 28 21:31:45 crc kubenswrapper[4746]: I0128 21:31:45.444219 4746 scope.go:117] "RemoveContainer" containerID="a509768283d484be1b68391a1bdb8d5922e38f4db17b10ad51a17af863f51518"
Jan 28 21:31:45 crc kubenswrapper[4746]: E0128 21:31:45.445029 4746 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"a509768283d484be1b68391a1bdb8d5922e38f4db17b10ad51a17af863f51518\": container with ID starting with a509768283d484be1b68391a1bdb8d5922e38f4db17b10ad51a17af863f51518 not found: ID does not exist" containerID="a509768283d484be1b68391a1bdb8d5922e38f4db17b10ad51a17af863f51518"
Jan 28 21:31:45 crc kubenswrapper[4746]: I0128 21:31:45.445071 4746 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"a509768283d484be1b68391a1bdb8d5922e38f4db17b10ad51a17af863f51518"} err="failed to get container status \"a509768283d484be1b68391a1bdb8d5922e38f4db17b10ad51a17af863f51518\": rpc error: code = NotFound desc = could not find container \"a509768283d484be1b68391a1bdb8d5922e38f4db17b10ad51a17af863f51518\": container with ID starting with a509768283d484be1b68391a1bdb8d5922e38f4db17b10ad51a17af863f51518 not found: ID does not exist"
Jan 28 21:31:45 crc kubenswrapper[4746]: I0128 21:31:45.445115 4746 scope.go:117] "RemoveContainer" containerID="1a2247eef8f11169391e8430aa02439356601dcaca81b22257ed8948afebfeff"
Jan 28 21:31:45 crc kubenswrapper[4746]: E0128 21:31:45.445615 4746 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"1a2247eef8f11169391e8430aa02439356601dcaca81b22257ed8948afebfeff\": container with ID starting with 1a2247eef8f11169391e8430aa02439356601dcaca81b22257ed8948afebfeff not found: ID does not exist" containerID="1a2247eef8f11169391e8430aa02439356601dcaca81b22257ed8948afebfeff"
Jan 28 21:31:45 crc kubenswrapper[4746]: I0128 21:31:45.445659 4746 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"1a2247eef8f11169391e8430aa02439356601dcaca81b22257ed8948afebfeff"} err="failed to get container status \"1a2247eef8f11169391e8430aa02439356601dcaca81b22257ed8948afebfeff\": rpc error: code = NotFound desc = could not find container \"1a2247eef8f11169391e8430aa02439356601dcaca81b22257ed8948afebfeff\": container with ID starting with 1a2247eef8f11169391e8430aa02439356601dcaca81b22257ed8948afebfeff not found: ID does not exist"
Jan 28 21:31:45 crc kubenswrapper[4746]: I0128 21:31:45.445686 4746 scope.go:117] "RemoveContainer" containerID="1b99181394d7c43463c16b4ab8823c3aedb7258a516591b53e0a2d55dc6509e2"
Jan 28 21:31:45 crc kubenswrapper[4746]: E0128 21:31:45.446023 4746 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"1b99181394d7c43463c16b4ab8823c3aedb7258a516591b53e0a2d55dc6509e2\": container with ID starting with 1b99181394d7c43463c16b4ab8823c3aedb7258a516591b53e0a2d55dc6509e2 not found: ID does not exist" containerID="1b99181394d7c43463c16b4ab8823c3aedb7258a516591b53e0a2d55dc6509e2"
Jan 28 21:31:45 crc kubenswrapper[4746]: I0128 21:31:45.446063 4746 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"1b99181394d7c43463c16b4ab8823c3aedb7258a516591b53e0a2d55dc6509e2"} err="failed to get container status \"1b99181394d7c43463c16b4ab8823c3aedb7258a516591b53e0a2d55dc6509e2\": rpc error: code = NotFound desc = could not find container \"1b99181394d7c43463c16b4ab8823c3aedb7258a516591b53e0a2d55dc6509e2\": container with ID starting with 1b99181394d7c43463c16b4ab8823c3aedb7258a516591b53e0a2d55dc6509e2 not found: ID does not exist"
Jan 28 21:31:45 crc kubenswrapper[4746]: I0128 21:31:45.872013 4746 patch_prober.go:28] interesting pod/machine-config-daemon-6wrnw container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body=
Jan 28 21:31:45 crc kubenswrapper[4746]: I0128 21:31:45.872099 4746 prober.go:107]
"Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-6wrnw" podUID="6dc8b546-9734-4082-b2b3-2bafe3f1564d" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused"
Jan 28 21:31:45 crc kubenswrapper[4746]: I0128 21:31:45.872160 4746 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-6wrnw"
Jan 28 21:31:45 crc kubenswrapper[4746]: I0128 21:31:45.873136 4746 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"3efd284713e29e28f4aa6dc8e613ca58f00d02f5b7a245a28f734915518bde66"} pod="openshift-machine-config-operator/machine-config-daemon-6wrnw" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted"
Jan 28 21:31:45 crc kubenswrapper[4746]: I0128 21:31:45.873218 4746 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-6wrnw" podUID="6dc8b546-9734-4082-b2b3-2bafe3f1564d" containerName="machine-config-daemon" containerID="cri-o://3efd284713e29e28f4aa6dc8e613ca58f00d02f5b7a245a28f734915518bde66" gracePeriod=600
Jan 28 21:31:45 crc kubenswrapper[4746]: E0128 21:31:45.996697 4746 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-6wrnw_openshift-machine-config-operator(6dc8b546-9734-4082-b2b3-2bafe3f1564d)\"" pod="openshift-machine-config-operator/machine-config-daemon-6wrnw" podUID="6dc8b546-9734-4082-b2b3-2bafe3f1564d"
Jan 28 21:31:46 crc kubenswrapper[4746]: I0128 21:31:46.361380 4746 generic.go:334] "Generic (PLEG): container finished" podID="6dc8b546-9734-4082-b2b3-2bafe3f1564d" containerID="3efd284713e29e28f4aa6dc8e613ca58f00d02f5b7a245a28f734915518bde66" exitCode=0
Jan 28 21:31:46 crc kubenswrapper[4746]: I0128 21:31:46.361432 4746 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-6wrnw" event={"ID":"6dc8b546-9734-4082-b2b3-2bafe3f1564d","Type":"ContainerDied","Data":"3efd284713e29e28f4aa6dc8e613ca58f00d02f5b7a245a28f734915518bde66"}
Jan 28 21:31:46 crc kubenswrapper[4746]: I0128 21:31:46.361514 4746 scope.go:117] "RemoveContainer" containerID="3739da62fe099fce1fae53951ff544988a4a187ac878f9a23423cf57b93b5305"
Jan 28 21:31:46 crc kubenswrapper[4746]: I0128 21:31:46.410989 4746 scope.go:117] "RemoveContainer" containerID="3efd284713e29e28f4aa6dc8e613ca58f00d02f5b7a245a28f734915518bde66"
Jan 28 21:31:46 crc kubenswrapper[4746]: E0128 21:31:46.412826 4746 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-6wrnw_openshift-machine-config-operator(6dc8b546-9734-4082-b2b3-2bafe3f1564d)\"" pod="openshift-machine-config-operator/machine-config-daemon-6wrnw" podUID="6dc8b546-9734-4082-b2b3-2bafe3f1564d"
Jan 28 21:31:46 crc kubenswrapper[4746]: I0128 21:31:46.845051 4746 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="19674eb4-9d31-4ea3-b715-63a965481ba4" path="/var/lib/kubelet/pods/19674eb4-9d31-4ea3-b715-63a965481ba4/volumes"
Jan 28 21:31:59 crc kubenswrapper[4746]: I0128 21:31:59.836319 4746 scope.go:117] "RemoveContainer" containerID="3efd284713e29e28f4aa6dc8e613ca58f00d02f5b7a245a28f734915518bde66"
Jan 28 21:31:59 crc kubenswrapper[4746]: E0128 21:31:59.838351 4746 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-6wrnw_openshift-machine-config-operator(6dc8b546-9734-4082-b2b3-2bafe3f1564d)\"" pod="openshift-machine-config-operator/machine-config-daemon-6wrnw" podUID="6dc8b546-9734-4082-b2b3-2bafe3f1564d"
Jan 28 21:31:59 crc kubenswrapper[4746]: I0128 21:31:59.873188 4746 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-must-gather-fjncs/must-gather-5v5j6"]
Jan 28 21:31:59 crc kubenswrapper[4746]: E0128 21:31:59.873725 4746 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="19674eb4-9d31-4ea3-b715-63a965481ba4" containerName="extract-utilities"
Jan 28 21:31:59 crc kubenswrapper[4746]: I0128 21:31:59.873746 4746 state_mem.go:107] "Deleted CPUSet assignment" podUID="19674eb4-9d31-4ea3-b715-63a965481ba4" containerName="extract-utilities"
Jan 28 21:31:59 crc kubenswrapper[4746]: E0128 21:31:59.873762 4746 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="19674eb4-9d31-4ea3-b715-63a965481ba4" containerName="registry-server"
Jan 28 21:31:59 crc kubenswrapper[4746]: I0128 21:31:59.873770 4746 state_mem.go:107] "Deleted CPUSet assignment" podUID="19674eb4-9d31-4ea3-b715-63a965481ba4" containerName="registry-server"
Jan 28 21:31:59 crc kubenswrapper[4746]: E0128 21:31:59.873781 4746 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="19674eb4-9d31-4ea3-b715-63a965481ba4" containerName="extract-content"
Jan 28 21:31:59 crc kubenswrapper[4746]: I0128 21:31:59.873789 4746 state_mem.go:107] "Deleted CPUSet assignment" podUID="19674eb4-9d31-4ea3-b715-63a965481ba4" containerName="extract-content"
Jan 28 21:31:59 crc kubenswrapper[4746]: I0128 21:31:59.874038 4746 memory_manager.go:354] "RemoveStaleState removing state" podUID="19674eb4-9d31-4ea3-b715-63a965481ba4" containerName="registry-server"
Jan 28 21:31:59 crc kubenswrapper[4746]: I0128 21:31:59.875445 4746 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-must-gather-fjncs/must-gather-5v5j6"
Jan 28 21:31:59 crc kubenswrapper[4746]: I0128 21:31:59.877843 4746 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-must-gather-fjncs"/"kube-root-ca.crt"
Jan 28 21:31:59 crc kubenswrapper[4746]: I0128 21:31:59.878010 4746 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-must-gather-fjncs"/"openshift-service-ca.crt"
Jan 28 21:31:59 crc kubenswrapper[4746]: I0128 21:31:59.914610 4746 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-must-gather-fjncs/must-gather-5v5j6"]
Jan 28 21:31:59 crc kubenswrapper[4746]: I0128 21:31:59.986695 4746 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"must-gather-output\" (UniqueName: \"kubernetes.io/empty-dir/c0a60132-458a-46f2-a61f-444424b4c7cb-must-gather-output\") pod \"must-gather-5v5j6\" (UID: \"c0a60132-458a-46f2-a61f-444424b4c7cb\") " pod="openshift-must-gather-fjncs/must-gather-5v5j6"
Jan 28 21:31:59 crc kubenswrapper[4746]: I0128 21:31:59.987022 4746 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-w8wfn\" (UniqueName: \"kubernetes.io/projected/c0a60132-458a-46f2-a61f-444424b4c7cb-kube-api-access-w8wfn\") pod \"must-gather-5v5j6\" (UID: \"c0a60132-458a-46f2-a61f-444424b4c7cb\") " pod="openshift-must-gather-fjncs/must-gather-5v5j6"
Jan 28 21:32:00 crc kubenswrapper[4746]: I0128 21:32:00.097947 4746 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-w8wfn\" (UniqueName: \"kubernetes.io/projected/c0a60132-458a-46f2-a61f-444424b4c7cb-kube-api-access-w8wfn\") pod \"must-gather-5v5j6\" (UID: \"c0a60132-458a-46f2-a61f-444424b4c7cb\") " pod="openshift-must-gather-fjncs/must-gather-5v5j6"
Jan 28 21:32:00 crc kubenswrapper[4746]: I0128 21:32:00.098263 4746 reconciler_common.go:218] "operationExecutor.MountVolume started for volume
\"must-gather-output\" (UniqueName: \"kubernetes.io/empty-dir/c0a60132-458a-46f2-a61f-444424b4c7cb-must-gather-output\") pod \"must-gather-5v5j6\" (UID: \"c0a60132-458a-46f2-a61f-444424b4c7cb\") " pod="openshift-must-gather-fjncs/must-gather-5v5j6"
Jan 28 21:32:00 crc kubenswrapper[4746]: I0128 21:32:00.098985 4746 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"must-gather-output\" (UniqueName: \"kubernetes.io/empty-dir/c0a60132-458a-46f2-a61f-444424b4c7cb-must-gather-output\") pod \"must-gather-5v5j6\" (UID: \"c0a60132-458a-46f2-a61f-444424b4c7cb\") " pod="openshift-must-gather-fjncs/must-gather-5v5j6"
Jan 28 21:32:00 crc kubenswrapper[4746]: I0128 21:32:00.122668 4746 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-w8wfn\" (UniqueName: \"kubernetes.io/projected/c0a60132-458a-46f2-a61f-444424b4c7cb-kube-api-access-w8wfn\") pod \"must-gather-5v5j6\" (UID: \"c0a60132-458a-46f2-a61f-444424b4c7cb\") " pod="openshift-must-gather-fjncs/must-gather-5v5j6"
Jan 28 21:32:00 crc kubenswrapper[4746]: I0128 21:32:00.201150 4746 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-must-gather-fjncs/must-gather-5v5j6"
Jan 28 21:32:00 crc kubenswrapper[4746]: I0128 21:32:00.711642 4746 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-must-gather-fjncs/must-gather-5v5j6"]
Jan 28 21:32:01 crc kubenswrapper[4746]: I0128 21:32:01.530288 4746 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-fjncs/must-gather-5v5j6" event={"ID":"c0a60132-458a-46f2-a61f-444424b4c7cb","Type":"ContainerStarted","Data":"bbc060ffa40d022fea79857c498cf64d9b209200d451e20a37989628820128b1"}
Jan 28 21:32:09 crc kubenswrapper[4746]: I0128 21:32:09.612533 4746 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-fjncs/must-gather-5v5j6" event={"ID":"c0a60132-458a-46f2-a61f-444424b4c7cb","Type":"ContainerStarted","Data":"f784a010d13db326f2d27e8d7112b46dbad58c5d897846252677e1fb89b8b5b6"}
Jan 28 21:32:09 crc kubenswrapper[4746]: I0128 21:32:09.613028 4746 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-fjncs/must-gather-5v5j6" event={"ID":"c0a60132-458a-46f2-a61f-444424b4c7cb","Type":"ContainerStarted","Data":"ca03b7997e8a5546e955fc8d2980d71dbf2f003ef00ac1b7464afe07799f1d18"}
Jan 28 21:32:09 crc kubenswrapper[4746]: I0128 21:32:09.642659 4746 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-must-gather-fjncs/must-gather-5v5j6" podStartSLOduration=2.648072023 podStartE2EDuration="10.642639569s" podCreationTimestamp="2026-01-28 21:31:59 +0000 UTC" firstStartedPulling="2026-01-28 21:32:00.717617061 +0000 UTC m=+3148.673803415" lastFinishedPulling="2026-01-28 21:32:08.712184607 +0000 UTC m=+3156.668370961" observedRunningTime="2026-01-28 21:32:09.631712072 +0000 UTC m=+3157.587898416" watchObservedRunningTime="2026-01-28 21:32:09.642639569 +0000 UTC m=+3157.598825923"
Jan 28 21:32:13 crc kubenswrapper[4746]: I0128 21:32:13.127218 4746 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-must-gather-fjncs/crc-debug-nkpdk"]
Jan 28 21:32:13 crc kubenswrapper[4746]: I0128 21:32:13.128865 4746 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-must-gather-fjncs/crc-debug-nkpdk"
Jan 28 21:32:13 crc kubenswrapper[4746]: I0128 21:32:13.152610 4746 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-must-gather-fjncs"/"default-dockercfg-v6h55"
Jan 28 21:32:13 crc kubenswrapper[4746]: I0128 21:32:13.168940 4746 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-gmn5d\" (UniqueName: \"kubernetes.io/projected/b7344261-061c-432d-8ea5-79eb497f8205-kube-api-access-gmn5d\") pod \"crc-debug-nkpdk\" (UID: \"b7344261-061c-432d-8ea5-79eb497f8205\") " pod="openshift-must-gather-fjncs/crc-debug-nkpdk"
Jan 28 21:32:13 crc kubenswrapper[4746]: I0128 21:32:13.168995 4746 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/b7344261-061c-432d-8ea5-79eb497f8205-host\") pod \"crc-debug-nkpdk\" (UID: \"b7344261-061c-432d-8ea5-79eb497f8205\") " pod="openshift-must-gather-fjncs/crc-debug-nkpdk"
Jan 28 21:32:13 crc kubenswrapper[4746]: I0128 21:32:13.271794 4746 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-gmn5d\" (UniqueName: \"kubernetes.io/projected/b7344261-061c-432d-8ea5-79eb497f8205-kube-api-access-gmn5d\") pod \"crc-debug-nkpdk\" (UID: \"b7344261-061c-432d-8ea5-79eb497f8205\") " pod="openshift-must-gather-fjncs/crc-debug-nkpdk"
Jan 28 21:32:13 crc kubenswrapper[4746]: I0128 21:32:13.271845 4746 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/b7344261-061c-432d-8ea5-79eb497f8205-host\") pod \"crc-debug-nkpdk\" (UID: \"b7344261-061c-432d-8ea5-79eb497f8205\") " pod="openshift-must-gather-fjncs/crc-debug-nkpdk"
Jan 28 21:32:13 crc kubenswrapper[4746]: I0128 21:32:13.271974 4746 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host\" (UniqueName: \"kubernetes.io/host-path/b7344261-061c-432d-8ea5-79eb497f8205-host\") pod \"crc-debug-nkpdk\" (UID: \"b7344261-061c-432d-8ea5-79eb497f8205\") " pod="openshift-must-gather-fjncs/crc-debug-nkpdk"
Jan 28 21:32:13 crc kubenswrapper[4746]: I0128 21:32:13.305955 4746 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-gmn5d\" (UniqueName: \"kubernetes.io/projected/b7344261-061c-432d-8ea5-79eb497f8205-kube-api-access-gmn5d\") pod \"crc-debug-nkpdk\" (UID: \"b7344261-061c-432d-8ea5-79eb497f8205\") " pod="openshift-must-gather-fjncs/crc-debug-nkpdk"
Jan 28 21:32:13 crc kubenswrapper[4746]: I0128 21:32:13.448436 4746 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-must-gather-fjncs/crc-debug-nkpdk"
Jan 28 21:32:13 crc kubenswrapper[4746]: I0128 21:32:13.645600 4746 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-fjncs/crc-debug-nkpdk" event={"ID":"b7344261-061c-432d-8ea5-79eb497f8205","Type":"ContainerStarted","Data":"92325b9257fc477de9efacf5fbdf5af8839f959a29018bde6f620039120adae1"}
Jan 28 21:32:14 crc kubenswrapper[4746]: I0128 21:32:14.836348 4746 scope.go:117] "RemoveContainer" containerID="3efd284713e29e28f4aa6dc8e613ca58f00d02f5b7a245a28f734915518bde66"
Jan 28 21:32:14 crc kubenswrapper[4746]: E0128 21:32:14.836625 4746 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-6wrnw_openshift-machine-config-operator(6dc8b546-9734-4082-b2b3-2bafe3f1564d)\"" pod="openshift-machine-config-operator/machine-config-daemon-6wrnw" podUID="6dc8b546-9734-4082-b2b3-2bafe3f1564d"
Jan 28 21:32:25 crc kubenswrapper[4746]: I0128 21:32:25.841276 4746 scope.go:117]
"RemoveContainer" containerID="3efd284713e29e28f4aa6dc8e613ca58f00d02f5b7a245a28f734915518bde66"
Jan 28 21:32:25 crc kubenswrapper[4746]: E0128 21:32:25.846548 4746 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-6wrnw_openshift-machine-config-operator(6dc8b546-9734-4082-b2b3-2bafe3f1564d)\"" pod="openshift-machine-config-operator/machine-config-daemon-6wrnw" podUID="6dc8b546-9734-4082-b2b3-2bafe3f1564d"
Jan 28 21:32:26 crc kubenswrapper[4746]: I0128 21:32:26.793272 4746 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-fjncs/crc-debug-nkpdk" event={"ID":"b7344261-061c-432d-8ea5-79eb497f8205","Type":"ContainerStarted","Data":"0ed507540f26e3a9479ea99c801ff8d11e890a05483be6c285463189688a3ce4"}
Jan 28 21:32:37 crc kubenswrapper[4746]: I0128 21:32:37.836344 4746 scope.go:117] "RemoveContainer" containerID="3efd284713e29e28f4aa6dc8e613ca58f00d02f5b7a245a28f734915518bde66"
Jan 28 21:32:37 crc kubenswrapper[4746]: E0128 21:32:37.837152 4746 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-6wrnw_openshift-machine-config-operator(6dc8b546-9734-4082-b2b3-2bafe3f1564d)\"" pod="openshift-machine-config-operator/machine-config-daemon-6wrnw" podUID="6dc8b546-9734-4082-b2b3-2bafe3f1564d"
Jan 28 21:32:52 crc kubenswrapper[4746]: I0128 21:32:52.843183 4746 scope.go:117] "RemoveContainer" containerID="3efd284713e29e28f4aa6dc8e613ca58f00d02f5b7a245a28f734915518bde66"
Jan 28 21:32:52 crc kubenswrapper[4746]: E0128 21:32:52.843993 4746 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-6wrnw_openshift-machine-config-operator(6dc8b546-9734-4082-b2b3-2bafe3f1564d)\"" pod="openshift-machine-config-operator/machine-config-daemon-6wrnw" podUID="6dc8b546-9734-4082-b2b3-2bafe3f1564d"
Jan 28 21:33:05 crc kubenswrapper[4746]: I0128 21:33:05.835786 4746 scope.go:117] "RemoveContainer" containerID="3efd284713e29e28f4aa6dc8e613ca58f00d02f5b7a245a28f734915518bde66"
Jan 28 21:33:05 crc kubenswrapper[4746]: E0128 21:33:05.836863 4746 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-6wrnw_openshift-machine-config-operator(6dc8b546-9734-4082-b2b3-2bafe3f1564d)\"" pod="openshift-machine-config-operator/machine-config-daemon-6wrnw" podUID="6dc8b546-9734-4082-b2b3-2bafe3f1564d"
Jan 28 21:33:18 crc kubenswrapper[4746]: I0128 21:33:18.845678 4746 scope.go:117] "RemoveContainer" containerID="3efd284713e29e28f4aa6dc8e613ca58f00d02f5b7a245a28f734915518bde66"
Jan 28 21:33:18 crc kubenswrapper[4746]: E0128 21:33:18.846488 4746 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-6wrnw_openshift-machine-config-operator(6dc8b546-9734-4082-b2b3-2bafe3f1564d)\"" pod="openshift-machine-config-operator/machine-config-daemon-6wrnw" podUID="6dc8b546-9734-4082-b2b3-2bafe3f1564d"
Jan 28 21:33:27 crc kubenswrapper[4746]: I0128 21:33:27.396818 4746 generic.go:334] "Generic (PLEG): container finished" podID="b7344261-061c-432d-8ea5-79eb497f8205" containerID="0ed507540f26e3a9479ea99c801ff8d11e890a05483be6c285463189688a3ce4" exitCode=0
Jan 28 21:33:27 crc kubenswrapper[4746]: I0128 21:33:27.396895 4746 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-fjncs/crc-debug-nkpdk" event={"ID":"b7344261-061c-432d-8ea5-79eb497f8205","Type":"ContainerDied","Data":"0ed507540f26e3a9479ea99c801ff8d11e890a05483be6c285463189688a3ce4"}
Jan 28 21:33:28 crc kubenswrapper[4746]: I0128 21:33:28.533788 4746 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-must-gather-fjncs/crc-debug-nkpdk"
Jan 28 21:33:28 crc kubenswrapper[4746]: I0128 21:33:28.570814 4746 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-must-gather-fjncs/crc-debug-nkpdk"]
Jan 28 21:33:28 crc kubenswrapper[4746]: I0128 21:33:28.579566 4746 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-must-gather-fjncs/crc-debug-nkpdk"]
Jan 28 21:33:28 crc kubenswrapper[4746]: I0128 21:33:28.593101 4746 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-gmn5d\" (UniqueName: \"kubernetes.io/projected/b7344261-061c-432d-8ea5-79eb497f8205-kube-api-access-gmn5d\") pod \"b7344261-061c-432d-8ea5-79eb497f8205\" (UID: \"b7344261-061c-432d-8ea5-79eb497f8205\") "
Jan 28 21:33:28 crc kubenswrapper[4746]: I0128 21:33:28.593140 4746 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/b7344261-061c-432d-8ea5-79eb497f8205-host\") pod \"b7344261-061c-432d-8ea5-79eb497f8205\" (UID: \"b7344261-061c-432d-8ea5-79eb497f8205\") "
Jan 28 21:33:28 crc kubenswrapper[4746]: I0128 21:33:28.593367 4746 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/b7344261-061c-432d-8ea5-79eb497f8205-host" (OuterVolumeSpecName: "host") pod "b7344261-061c-432d-8ea5-79eb497f8205" (UID: "b7344261-061c-432d-8ea5-79eb497f8205"). InnerVolumeSpecName "host". PluginName "kubernetes.io/host-path", VolumeGidValue ""
Jan 28 21:33:28 crc kubenswrapper[4746]: I0128 21:33:28.593685 4746 reconciler_common.go:293] "Volume detached for volume \"host\" (UniqueName: \"kubernetes.io/host-path/b7344261-061c-432d-8ea5-79eb497f8205-host\") on node \"crc\" DevicePath \"\""
Jan 28 21:33:28 crc kubenswrapper[4746]: I0128 21:33:28.598260 4746 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/b7344261-061c-432d-8ea5-79eb497f8205-kube-api-access-gmn5d" (OuterVolumeSpecName: "kube-api-access-gmn5d") pod "b7344261-061c-432d-8ea5-79eb497f8205" (UID: "b7344261-061c-432d-8ea5-79eb497f8205"). InnerVolumeSpecName "kube-api-access-gmn5d". PluginName "kubernetes.io/projected", VolumeGidValue ""
Jan 28 21:33:28 crc kubenswrapper[4746]: I0128 21:33:28.696022 4746 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-gmn5d\" (UniqueName: \"kubernetes.io/projected/b7344261-061c-432d-8ea5-79eb497f8205-kube-api-access-gmn5d\") on node \"crc\" DevicePath \"\""
Jan 28 21:33:28 crc kubenswrapper[4746]: I0128 21:33:28.851712 4746 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="b7344261-061c-432d-8ea5-79eb497f8205" path="/var/lib/kubelet/pods/b7344261-061c-432d-8ea5-79eb497f8205/volumes"
Jan 28 21:33:29 crc kubenswrapper[4746]: I0128 21:33:29.427348 4746 scope.go:117] "RemoveContainer" containerID="0ed507540f26e3a9479ea99c801ff8d11e890a05483be6c285463189688a3ce4"
Jan 28 21:33:29 crc kubenswrapper[4746]: I0128 21:33:29.427391 4746 util.go:48] "No ready sandbox for pod can be found.
Need to start a new one" pod="openshift-must-gather-fjncs/crc-debug-nkpdk"
Jan 28 21:33:29 crc kubenswrapper[4746]: I0128 21:33:29.730146 4746 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-must-gather-fjncs/crc-debug-gjvhd"]
Jan 28 21:33:29 crc kubenswrapper[4746]: E0128 21:33:29.730593 4746 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="b7344261-061c-432d-8ea5-79eb497f8205" containerName="container-00"
Jan 28 21:33:29 crc kubenswrapper[4746]: I0128 21:33:29.730605 4746 state_mem.go:107] "Deleted CPUSet assignment" podUID="b7344261-061c-432d-8ea5-79eb497f8205" containerName="container-00"
Jan 28 21:33:29 crc kubenswrapper[4746]: I0128 21:33:29.730814 4746 memory_manager.go:354] "RemoveStaleState removing state" podUID="b7344261-061c-432d-8ea5-79eb497f8205" containerName="container-00"
Jan 28 21:33:29 crc kubenswrapper[4746]: I0128 21:33:29.731528 4746 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-must-gather-fjncs/crc-debug-gjvhd"
Jan 28 21:33:29 crc kubenswrapper[4746]: I0128 21:33:29.734460 4746 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-must-gather-fjncs"/"default-dockercfg-v6h55"
Jan 28 21:33:29 crc kubenswrapper[4746]: I0128 21:33:29.820274 4746 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/d1ee023c-b20d-4723-963b-4ef6982ba39d-host\") pod \"crc-debug-gjvhd\" (UID: \"d1ee023c-b20d-4723-963b-4ef6982ba39d\") " pod="openshift-must-gather-fjncs/crc-debug-gjvhd"
Jan 28 21:33:29 crc kubenswrapper[4746]: I0128 21:33:29.820317 4746 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-8w5hj\" (UniqueName: \"kubernetes.io/projected/d1ee023c-b20d-4723-963b-4ef6982ba39d-kube-api-access-8w5hj\") pod \"crc-debug-gjvhd\" (UID: \"d1ee023c-b20d-4723-963b-4ef6982ba39d\") " pod="openshift-must-gather-fjncs/crc-debug-gjvhd"
Jan 28 21:33:29 crc kubenswrapper[4746]: I0128 21:33:29.836104 4746 scope.go:117] "RemoveContainer" containerID="3efd284713e29e28f4aa6dc8e613ca58f00d02f5b7a245a28f734915518bde66"
Jan 28 21:33:29 crc kubenswrapper[4746]: E0128 21:33:29.836388 4746 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-6wrnw_openshift-machine-config-operator(6dc8b546-9734-4082-b2b3-2bafe3f1564d)\"" pod="openshift-machine-config-operator/machine-config-daemon-6wrnw" podUID="6dc8b546-9734-4082-b2b3-2bafe3f1564d"
Jan 28 21:33:29 crc kubenswrapper[4746]: I0128 21:33:29.922607 4746 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/d1ee023c-b20d-4723-963b-4ef6982ba39d-host\") pod \"crc-debug-gjvhd\" (UID: \"d1ee023c-b20d-4723-963b-4ef6982ba39d\") " pod="openshift-must-gather-fjncs/crc-debug-gjvhd"
Jan 28 21:33:29 crc kubenswrapper[4746]: I0128 21:33:29.922659 4746 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-8w5hj\" (UniqueName: \"kubernetes.io/projected/d1ee023c-b20d-4723-963b-4ef6982ba39d-kube-api-access-8w5hj\") pod \"crc-debug-gjvhd\" (UID: \"d1ee023c-b20d-4723-963b-4ef6982ba39d\") " pod="openshift-must-gather-fjncs/crc-debug-gjvhd"
Jan 28 21:33:29 crc kubenswrapper[4746]: I0128 21:33:29.923259 4746 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host\" (UniqueName: \"kubernetes.io/host-path/d1ee023c-b20d-4723-963b-4ef6982ba39d-host\") pod \"crc-debug-gjvhd\" (UID: \"d1ee023c-b20d-4723-963b-4ef6982ba39d\") " pod="openshift-must-gather-fjncs/crc-debug-gjvhd"
Jan 28 21:33:29 crc kubenswrapper[4746]: I0128 21:33:29.942656 4746 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-8w5hj\" (UniqueName: \"kubernetes.io/projected/d1ee023c-b20d-4723-963b-4ef6982ba39d-kube-api-access-8w5hj\") pod \"crc-debug-gjvhd\" (UID: \"d1ee023c-b20d-4723-963b-4ef6982ba39d\") " pod="openshift-must-gather-fjncs/crc-debug-gjvhd"
Jan 28 21:33:30 crc kubenswrapper[4746]: I0128 21:33:30.058290 4746 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-must-gather-fjncs/crc-debug-gjvhd"
Jan 28 21:33:30 crc kubenswrapper[4746]: I0128 21:33:30.437411 4746 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-fjncs/crc-debug-gjvhd" event={"ID":"d1ee023c-b20d-4723-963b-4ef6982ba39d","Type":"ContainerStarted","Data":"a7180455cc870f5987a1bbfe020d75edd9439e7fe60ec3cedcb842e1692b79f6"}
Jan 28 21:33:30 crc kubenswrapper[4746]: I0128 21:33:30.437734 4746 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-fjncs/crc-debug-gjvhd" event={"ID":"d1ee023c-b20d-4723-963b-4ef6982ba39d","Type":"ContainerStarted","Data":"586e09793ff886e6f7bfa6a2cbad20c1649277c55165a603e2d6b0f8f71b26f3"}
Jan 28 21:33:30 crc kubenswrapper[4746]: I0128 21:33:30.454275 4746 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-must-gather-fjncs/crc-debug-gjvhd" podStartSLOduration=1.4542545150000001 podStartE2EDuration="1.454254515s" podCreationTimestamp="2026-01-28 21:33:29 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-28 21:33:30.451055108 +0000 UTC m=+3238.407241472" watchObservedRunningTime="2026-01-28 21:33:30.454254515 +0000 UTC m=+3238.410440879"
Jan 28 21:33:31 crc kubenswrapper[4746]: I0128 21:33:31.449823 4746 generic.go:334] "Generic (PLEG): container finished" podID="d1ee023c-b20d-4723-963b-4ef6982ba39d" containerID="a7180455cc870f5987a1bbfe020d75edd9439e7fe60ec3cedcb842e1692b79f6" exitCode=0
Jan 28 21:33:31 crc kubenswrapper[4746]: I0128 21:33:31.449894 4746 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-fjncs/crc-debug-gjvhd" event={"ID":"d1ee023c-b20d-4723-963b-4ef6982ba39d","Type":"ContainerDied","Data":"a7180455cc870f5987a1bbfe020d75edd9439e7fe60ec3cedcb842e1692b79f6"}
Jan 28 21:33:32 crc kubenswrapper[4746]: I0128 21:33:32.577577 4746 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-must-gather-fjncs/crc-debug-gjvhd"
Jan 28 21:33:32 crc kubenswrapper[4746]: I0128 21:33:32.612111 4746 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-must-gather-fjncs/crc-debug-gjvhd"]
Jan 28 21:33:32 crc kubenswrapper[4746]: I0128 21:33:32.620396 4746 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-must-gather-fjncs/crc-debug-gjvhd"]
Jan 28 21:33:32 crc kubenswrapper[4746]: I0128 21:33:32.679437 4746 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/d1ee023c-b20d-4723-963b-4ef6982ba39d-host\") pod \"d1ee023c-b20d-4723-963b-4ef6982ba39d\" (UID: \"d1ee023c-b20d-4723-963b-4ef6982ba39d\") "
Jan 28 21:33:32 crc kubenswrapper[4746]: I0128 21:33:32.679715 4746 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-8w5hj\" (UniqueName: \"kubernetes.io/projected/d1ee023c-b20d-4723-963b-4ef6982ba39d-kube-api-access-8w5hj\") pod \"d1ee023c-b20d-4723-963b-4ef6982ba39d\" (UID: \"d1ee023c-b20d-4723-963b-4ef6982ba39d\") "
Jan 28 21:33:32 crc kubenswrapper[4746]: I0128 21:33:32.679532 4746 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/d1ee023c-b20d-4723-963b-4ef6982ba39d-host" (OuterVolumeSpecName: "host") pod "d1ee023c-b20d-4723-963b-4ef6982ba39d" (UID: "d1ee023c-b20d-4723-963b-4ef6982ba39d"). InnerVolumeSpecName "host". PluginName "kubernetes.io/host-path", VolumeGidValue ""
Jan 28 21:33:32 crc kubenswrapper[4746]: I0128 21:33:32.680545 4746 reconciler_common.go:293] "Volume detached for volume \"host\" (UniqueName: \"kubernetes.io/host-path/d1ee023c-b20d-4723-963b-4ef6982ba39d-host\") on node \"crc\" DevicePath \"\""
Jan 28 21:33:32 crc kubenswrapper[4746]: I0128 21:33:32.685325 4746 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/d1ee023c-b20d-4723-963b-4ef6982ba39d-kube-api-access-8w5hj" (OuterVolumeSpecName: "kube-api-access-8w5hj") pod "d1ee023c-b20d-4723-963b-4ef6982ba39d" (UID: "d1ee023c-b20d-4723-963b-4ef6982ba39d"). InnerVolumeSpecName "kube-api-access-8w5hj". PluginName "kubernetes.io/projected", VolumeGidValue ""
Jan 28 21:33:32 crc kubenswrapper[4746]: I0128 21:33:32.782492 4746 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-8w5hj\" (UniqueName: \"kubernetes.io/projected/d1ee023c-b20d-4723-963b-4ef6982ba39d-kube-api-access-8w5hj\") on node \"crc\" DevicePath \"\""
Jan 28 21:33:32 crc kubenswrapper[4746]: I0128 21:33:32.855053 4746 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="d1ee023c-b20d-4723-963b-4ef6982ba39d" path="/var/lib/kubelet/pods/d1ee023c-b20d-4723-963b-4ef6982ba39d/volumes"
Jan 28 21:33:33 crc kubenswrapper[4746]: I0128 21:33:33.474159 4746 scope.go:117] "RemoveContainer" containerID="a7180455cc870f5987a1bbfe020d75edd9439e7fe60ec3cedcb842e1692b79f6"
Jan 28 21:33:33 crc kubenswrapper[4746]: I0128 21:33:33.474419 4746 util.go:48] "No ready sandbox for pod can be found.
Need to start a new one" pod="openshift-must-gather-fjncs/crc-debug-gjvhd" Jan 28 21:33:33 crc kubenswrapper[4746]: I0128 21:33:33.790898 4746 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-must-gather-fjncs/crc-debug-tnlht"] Jan 28 21:33:33 crc kubenswrapper[4746]: E0128 21:33:33.791486 4746 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="d1ee023c-b20d-4723-963b-4ef6982ba39d" containerName="container-00" Jan 28 21:33:33 crc kubenswrapper[4746]: I0128 21:33:33.791501 4746 state_mem.go:107] "Deleted CPUSet assignment" podUID="d1ee023c-b20d-4723-963b-4ef6982ba39d" containerName="container-00" Jan 28 21:33:33 crc kubenswrapper[4746]: I0128 21:33:33.791768 4746 memory_manager.go:354] "RemoveStaleState removing state" podUID="d1ee023c-b20d-4723-963b-4ef6982ba39d" containerName="container-00" Jan 28 21:33:33 crc kubenswrapper[4746]: I0128 21:33:33.792637 4746 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-must-gather-fjncs/crc-debug-tnlht" Jan 28 21:33:33 crc kubenswrapper[4746]: I0128 21:33:33.794934 4746 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-must-gather-fjncs"/"default-dockercfg-v6h55" Jan 28 21:33:33 crc kubenswrapper[4746]: I0128 21:33:33.907893 4746 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-grjw4\" (UniqueName: \"kubernetes.io/projected/f9406ea8-74b7-45da-b841-e7106d813b9b-kube-api-access-grjw4\") pod \"crc-debug-tnlht\" (UID: \"f9406ea8-74b7-45da-b841-e7106d813b9b\") " pod="openshift-must-gather-fjncs/crc-debug-tnlht" Jan 28 21:33:33 crc kubenswrapper[4746]: I0128 21:33:33.908346 4746 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/f9406ea8-74b7-45da-b841-e7106d813b9b-host\") pod \"crc-debug-tnlht\" (UID: \"f9406ea8-74b7-45da-b841-e7106d813b9b\") " 
pod="openshift-must-gather-fjncs/crc-debug-tnlht" Jan 28 21:33:34 crc kubenswrapper[4746]: I0128 21:33:34.010240 4746 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-grjw4\" (UniqueName: \"kubernetes.io/projected/f9406ea8-74b7-45da-b841-e7106d813b9b-kube-api-access-grjw4\") pod \"crc-debug-tnlht\" (UID: \"f9406ea8-74b7-45da-b841-e7106d813b9b\") " pod="openshift-must-gather-fjncs/crc-debug-tnlht" Jan 28 21:33:34 crc kubenswrapper[4746]: I0128 21:33:34.010315 4746 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/f9406ea8-74b7-45da-b841-e7106d813b9b-host\") pod \"crc-debug-tnlht\" (UID: \"f9406ea8-74b7-45da-b841-e7106d813b9b\") " pod="openshift-must-gather-fjncs/crc-debug-tnlht" Jan 28 21:33:34 crc kubenswrapper[4746]: I0128 21:33:34.010380 4746 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host\" (UniqueName: \"kubernetes.io/host-path/f9406ea8-74b7-45da-b841-e7106d813b9b-host\") pod \"crc-debug-tnlht\" (UID: \"f9406ea8-74b7-45da-b841-e7106d813b9b\") " pod="openshift-must-gather-fjncs/crc-debug-tnlht" Jan 28 21:33:34 crc kubenswrapper[4746]: I0128 21:33:34.030371 4746 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-grjw4\" (UniqueName: \"kubernetes.io/projected/f9406ea8-74b7-45da-b841-e7106d813b9b-kube-api-access-grjw4\") pod \"crc-debug-tnlht\" (UID: \"f9406ea8-74b7-45da-b841-e7106d813b9b\") " pod="openshift-must-gather-fjncs/crc-debug-tnlht" Jan 28 21:33:34 crc kubenswrapper[4746]: I0128 21:33:34.118918 4746 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-must-gather-fjncs/crc-debug-tnlht" Jan 28 21:33:34 crc kubenswrapper[4746]: I0128 21:33:34.487720 4746 generic.go:334] "Generic (PLEG): container finished" podID="f9406ea8-74b7-45da-b841-e7106d813b9b" containerID="d3cab073ca84547ceb2234b4e8c1175bb3a0b24df40364aa0bc2fdd53ff0bab3" exitCode=0 Jan 28 21:33:34 crc kubenswrapper[4746]: I0128 21:33:34.487810 4746 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-fjncs/crc-debug-tnlht" event={"ID":"f9406ea8-74b7-45da-b841-e7106d813b9b","Type":"ContainerDied","Data":"d3cab073ca84547ceb2234b4e8c1175bb3a0b24df40364aa0bc2fdd53ff0bab3"} Jan 28 21:33:34 crc kubenswrapper[4746]: I0128 21:33:34.488039 4746 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-fjncs/crc-debug-tnlht" event={"ID":"f9406ea8-74b7-45da-b841-e7106d813b9b","Type":"ContainerStarted","Data":"be1884903ef09561116133531b0870871f3349adc4b2818e65c8c9f26bd1436f"} Jan 28 21:33:34 crc kubenswrapper[4746]: I0128 21:33:34.526029 4746 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-must-gather-fjncs/crc-debug-tnlht"] Jan 28 21:33:34 crc kubenswrapper[4746]: I0128 21:33:34.536744 4746 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-must-gather-fjncs/crc-debug-tnlht"] Jan 28 21:33:35 crc kubenswrapper[4746]: I0128 21:33:35.615569 4746 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-must-gather-fjncs/crc-debug-tnlht" Jan 28 21:33:35 crc kubenswrapper[4746]: I0128 21:33:35.758210 4746 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-grjw4\" (UniqueName: \"kubernetes.io/projected/f9406ea8-74b7-45da-b841-e7106d813b9b-kube-api-access-grjw4\") pod \"f9406ea8-74b7-45da-b841-e7106d813b9b\" (UID: \"f9406ea8-74b7-45da-b841-e7106d813b9b\") " Jan 28 21:33:35 crc kubenswrapper[4746]: I0128 21:33:35.758439 4746 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/f9406ea8-74b7-45da-b841-e7106d813b9b-host\") pod \"f9406ea8-74b7-45da-b841-e7106d813b9b\" (UID: \"f9406ea8-74b7-45da-b841-e7106d813b9b\") " Jan 28 21:33:35 crc kubenswrapper[4746]: I0128 21:33:35.758547 4746 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/f9406ea8-74b7-45da-b841-e7106d813b9b-host" (OuterVolumeSpecName: "host") pod "f9406ea8-74b7-45da-b841-e7106d813b9b" (UID: "f9406ea8-74b7-45da-b841-e7106d813b9b"). InnerVolumeSpecName "host". PluginName "kubernetes.io/host-path", VolumeGidValue "" Jan 28 21:33:35 crc kubenswrapper[4746]: I0128 21:33:35.759038 4746 reconciler_common.go:293] "Volume detached for volume \"host\" (UniqueName: \"kubernetes.io/host-path/f9406ea8-74b7-45da-b841-e7106d813b9b-host\") on node \"crc\" DevicePath \"\"" Jan 28 21:33:35 crc kubenswrapper[4746]: I0128 21:33:35.764687 4746 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/f9406ea8-74b7-45da-b841-e7106d813b9b-kube-api-access-grjw4" (OuterVolumeSpecName: "kube-api-access-grjw4") pod "f9406ea8-74b7-45da-b841-e7106d813b9b" (UID: "f9406ea8-74b7-45da-b841-e7106d813b9b"). InnerVolumeSpecName "kube-api-access-grjw4". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 28 21:33:35 crc kubenswrapper[4746]: I0128 21:33:35.860968 4746 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-grjw4\" (UniqueName: \"kubernetes.io/projected/f9406ea8-74b7-45da-b841-e7106d813b9b-kube-api-access-grjw4\") on node \"crc\" DevicePath \"\"" Jan 28 21:33:36 crc kubenswrapper[4746]: I0128 21:33:36.503704 4746 scope.go:117] "RemoveContainer" containerID="d3cab073ca84547ceb2234b4e8c1175bb3a0b24df40364aa0bc2fdd53ff0bab3" Jan 28 21:33:36 crc kubenswrapper[4746]: I0128 21:33:36.504160 4746 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-must-gather-fjncs/crc-debug-tnlht" Jan 28 21:33:36 crc kubenswrapper[4746]: I0128 21:33:36.847319 4746 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="f9406ea8-74b7-45da-b841-e7106d813b9b" path="/var/lib/kubelet/pods/f9406ea8-74b7-45da-b841-e7106d813b9b/volumes" Jan 28 21:33:40 crc kubenswrapper[4746]: I0128 21:33:40.836792 4746 scope.go:117] "RemoveContainer" containerID="3efd284713e29e28f4aa6dc8e613ca58f00d02f5b7a245a28f734915518bde66" Jan 28 21:33:40 crc kubenswrapper[4746]: E0128 21:33:40.837714 4746 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-6wrnw_openshift-machine-config-operator(6dc8b546-9734-4082-b2b3-2bafe3f1564d)\"" pod="openshift-machine-config-operator/machine-config-daemon-6wrnw" podUID="6dc8b546-9734-4082-b2b3-2bafe3f1564d" Jan 28 21:33:51 crc kubenswrapper[4746]: I0128 21:33:51.835960 4746 scope.go:117] "RemoveContainer" containerID="3efd284713e29e28f4aa6dc8e613ca58f00d02f5b7a245a28f734915518bde66" Jan 28 21:33:51 crc kubenswrapper[4746]: E0128 21:33:51.836711 4746 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" 
with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-6wrnw_openshift-machine-config-operator(6dc8b546-9734-4082-b2b3-2bafe3f1564d)\"" pod="openshift-machine-config-operator/machine-config-daemon-6wrnw" podUID="6dc8b546-9734-4082-b2b3-2bafe3f1564d" Jan 28 21:33:55 crc kubenswrapper[4746]: I0128 21:33:55.934519 4746 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/community-operators-frkpn"] Jan 28 21:33:55 crc kubenswrapper[4746]: E0128 21:33:55.935629 4746 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f9406ea8-74b7-45da-b841-e7106d813b9b" containerName="container-00" Jan 28 21:33:55 crc kubenswrapper[4746]: I0128 21:33:55.935651 4746 state_mem.go:107] "Deleted CPUSet assignment" podUID="f9406ea8-74b7-45da-b841-e7106d813b9b" containerName="container-00" Jan 28 21:33:55 crc kubenswrapper[4746]: I0128 21:33:55.936013 4746 memory_manager.go:354] "RemoveStaleState removing state" podUID="f9406ea8-74b7-45da-b841-e7106d813b9b" containerName="container-00" Jan 28 21:33:55 crc kubenswrapper[4746]: I0128 21:33:55.938465 4746 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-frkpn" Jan 28 21:33:55 crc kubenswrapper[4746]: I0128 21:33:55.976329 4746 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-frkpn"] Jan 28 21:33:56 crc kubenswrapper[4746]: I0128 21:33:56.019877 4746 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/41b73a06-0bf3-435a-90f4-71dc6706ffc7-catalog-content\") pod \"community-operators-frkpn\" (UID: \"41b73a06-0bf3-435a-90f4-71dc6706ffc7\") " pod="openshift-marketplace/community-operators-frkpn" Jan 28 21:33:56 crc kubenswrapper[4746]: I0128 21:33:56.019984 4746 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-d7tt2\" (UniqueName: \"kubernetes.io/projected/41b73a06-0bf3-435a-90f4-71dc6706ffc7-kube-api-access-d7tt2\") pod \"community-operators-frkpn\" (UID: \"41b73a06-0bf3-435a-90f4-71dc6706ffc7\") " pod="openshift-marketplace/community-operators-frkpn" Jan 28 21:33:56 crc kubenswrapper[4746]: I0128 21:33:56.020053 4746 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/41b73a06-0bf3-435a-90f4-71dc6706ffc7-utilities\") pod \"community-operators-frkpn\" (UID: \"41b73a06-0bf3-435a-90f4-71dc6706ffc7\") " pod="openshift-marketplace/community-operators-frkpn" Jan 28 21:33:56 crc kubenswrapper[4746]: I0128 21:33:56.122184 4746 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/41b73a06-0bf3-435a-90f4-71dc6706ffc7-utilities\") pod \"community-operators-frkpn\" (UID: \"41b73a06-0bf3-435a-90f4-71dc6706ffc7\") " pod="openshift-marketplace/community-operators-frkpn" Jan 28 21:33:56 crc kubenswrapper[4746]: I0128 21:33:56.122285 4746 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/41b73a06-0bf3-435a-90f4-71dc6706ffc7-catalog-content\") pod \"community-operators-frkpn\" (UID: \"41b73a06-0bf3-435a-90f4-71dc6706ffc7\") " pod="openshift-marketplace/community-operators-frkpn" Jan 28 21:33:56 crc kubenswrapper[4746]: I0128 21:33:56.122362 4746 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-d7tt2\" (UniqueName: \"kubernetes.io/projected/41b73a06-0bf3-435a-90f4-71dc6706ffc7-kube-api-access-d7tt2\") pod \"community-operators-frkpn\" (UID: \"41b73a06-0bf3-435a-90f4-71dc6706ffc7\") " pod="openshift-marketplace/community-operators-frkpn" Jan 28 21:33:56 crc kubenswrapper[4746]: I0128 21:33:56.123422 4746 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/41b73a06-0bf3-435a-90f4-71dc6706ffc7-catalog-content\") pod \"community-operators-frkpn\" (UID: \"41b73a06-0bf3-435a-90f4-71dc6706ffc7\") " pod="openshift-marketplace/community-operators-frkpn" Jan 28 21:33:56 crc kubenswrapper[4746]: I0128 21:33:56.124041 4746 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/41b73a06-0bf3-435a-90f4-71dc6706ffc7-utilities\") pod \"community-operators-frkpn\" (UID: \"41b73a06-0bf3-435a-90f4-71dc6706ffc7\") " pod="openshift-marketplace/community-operators-frkpn" Jan 28 21:33:56 crc kubenswrapper[4746]: I0128 21:33:56.154782 4746 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-d7tt2\" (UniqueName: \"kubernetes.io/projected/41b73a06-0bf3-435a-90f4-71dc6706ffc7-kube-api-access-d7tt2\") pod \"community-operators-frkpn\" (UID: \"41b73a06-0bf3-435a-90f4-71dc6706ffc7\") " pod="openshift-marketplace/community-operators-frkpn" Jan 28 21:33:56 crc kubenswrapper[4746]: I0128 21:33:56.271066 4746 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-frkpn" Jan 28 21:33:56 crc kubenswrapper[4746]: I0128 21:33:56.916271 4746 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-frkpn"] Jan 28 21:33:56 crc kubenswrapper[4746]: W0128 21:33:56.923519 4746 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod41b73a06_0bf3_435a_90f4_71dc6706ffc7.slice/crio-90a1e1f7da74e2dee0e47a3a040ac989eb1fae8d2d214342b5a4007df8e4d761 WatchSource:0}: Error finding container 90a1e1f7da74e2dee0e47a3a040ac989eb1fae8d2d214342b5a4007df8e4d761: Status 404 returned error can't find the container with id 90a1e1f7da74e2dee0e47a3a040ac989eb1fae8d2d214342b5a4007df8e4d761 Jan 28 21:33:57 crc kubenswrapper[4746]: I0128 21:33:57.817361 4746 generic.go:334] "Generic (PLEG): container finished" podID="41b73a06-0bf3-435a-90f4-71dc6706ffc7" containerID="a6bb88388ed7ae8965a6c1273c83e5bd65a1fc674d0e099a9684007ef4480bdd" exitCode=0 Jan 28 21:33:57 crc kubenswrapper[4746]: I0128 21:33:57.817539 4746 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-frkpn" event={"ID":"41b73a06-0bf3-435a-90f4-71dc6706ffc7","Type":"ContainerDied","Data":"a6bb88388ed7ae8965a6c1273c83e5bd65a1fc674d0e099a9684007ef4480bdd"} Jan 28 21:33:57 crc kubenswrapper[4746]: I0128 21:33:57.817771 4746 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-frkpn" event={"ID":"41b73a06-0bf3-435a-90f4-71dc6706ffc7","Type":"ContainerStarted","Data":"90a1e1f7da74e2dee0e47a3a040ac989eb1fae8d2d214342b5a4007df8e4d761"} Jan 28 21:33:59 crc kubenswrapper[4746]: I0128 21:33:59.836293 4746 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-frkpn" 
event={"ID":"41b73a06-0bf3-435a-90f4-71dc6706ffc7","Type":"ContainerStarted","Data":"999f488dedb4037b5018b157054eafb044ddaf3aa381131189c0e42cf041517f"} Jan 28 21:34:01 crc kubenswrapper[4746]: I0128 21:34:01.860773 4746 generic.go:334] "Generic (PLEG): container finished" podID="41b73a06-0bf3-435a-90f4-71dc6706ffc7" containerID="999f488dedb4037b5018b157054eafb044ddaf3aa381131189c0e42cf041517f" exitCode=0 Jan 28 21:34:01 crc kubenswrapper[4746]: I0128 21:34:01.860849 4746 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-frkpn" event={"ID":"41b73a06-0bf3-435a-90f4-71dc6706ffc7","Type":"ContainerDied","Data":"999f488dedb4037b5018b157054eafb044ddaf3aa381131189c0e42cf041517f"} Jan 28 21:34:02 crc kubenswrapper[4746]: I0128 21:34:02.887384 4746 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-frkpn" event={"ID":"41b73a06-0bf3-435a-90f4-71dc6706ffc7","Type":"ContainerStarted","Data":"558481c7abccdc6331dbc3289bee87fdbcc343d20dcf5c3e03528e040d67d1c5"} Jan 28 21:34:02 crc kubenswrapper[4746]: I0128 21:34:02.919791 4746 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/community-operators-frkpn" podStartSLOduration=3.476995283 podStartE2EDuration="7.919755627s" podCreationTimestamp="2026-01-28 21:33:55 +0000 UTC" firstStartedPulling="2026-01-28 21:33:57.823942974 +0000 UTC m=+3265.780129328" lastFinishedPulling="2026-01-28 21:34:02.266703318 +0000 UTC m=+3270.222889672" observedRunningTime="2026-01-28 21:34:02.907644668 +0000 UTC m=+3270.863831022" watchObservedRunningTime="2026-01-28 21:34:02.919755627 +0000 UTC m=+3270.875941991" Jan 28 21:34:03 crc kubenswrapper[4746]: I0128 21:34:03.835842 4746 scope.go:117] "RemoveContainer" containerID="3efd284713e29e28f4aa6dc8e613ca58f00d02f5b7a245a28f734915518bde66" Jan 28 21:34:03 crc kubenswrapper[4746]: E0128 21:34:03.836490 4746 pod_workers.go:1301] "Error syncing pod, skipping" err="failed 
to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-6wrnw_openshift-machine-config-operator(6dc8b546-9734-4082-b2b3-2bafe3f1564d)\"" pod="openshift-machine-config-operator/machine-config-daemon-6wrnw" podUID="6dc8b546-9734-4082-b2b3-2bafe3f1564d" Jan 28 21:34:03 crc kubenswrapper[4746]: I0128 21:34:03.974514 4746 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_alertmanager-metric-storage-0_0701e4bf-44d6-462c-a55b-140c2efceb6b/init-config-reloader/0.log" Jan 28 21:34:04 crc kubenswrapper[4746]: I0128 21:34:04.199684 4746 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_alertmanager-metric-storage-0_0701e4bf-44d6-462c-a55b-140c2efceb6b/init-config-reloader/0.log" Jan 28 21:34:04 crc kubenswrapper[4746]: I0128 21:34:04.275220 4746 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_alertmanager-metric-storage-0_0701e4bf-44d6-462c-a55b-140c2efceb6b/alertmanager/0.log" Jan 28 21:34:04 crc kubenswrapper[4746]: I0128 21:34:04.303429 4746 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_alertmanager-metric-storage-0_0701e4bf-44d6-462c-a55b-140c2efceb6b/config-reloader/0.log" Jan 28 21:34:04 crc kubenswrapper[4746]: I0128 21:34:04.416120 4746 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_barbican-api-79599f5dcd-btgz7_8fa661e3-776e-42b0-83db-374d372232ad/barbican-api/0.log" Jan 28 21:34:04 crc kubenswrapper[4746]: I0128 21:34:04.531781 4746 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_barbican-api-79599f5dcd-btgz7_8fa661e3-776e-42b0-83db-374d372232ad/barbican-api-log/0.log" Jan 28 21:34:04 crc kubenswrapper[4746]: I0128 21:34:04.598177 4746 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_barbican-keystone-listener-74d8954788-pqmtp_9d6c401d-18ee-432b-992c-749c69887786/barbican-keystone-listener/0.log" Jan 28 21:34:04 
crc kubenswrapper[4746]: I0128 21:34:04.718854 4746 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_barbican-keystone-listener-74d8954788-pqmtp_9d6c401d-18ee-432b-992c-749c69887786/barbican-keystone-listener-log/0.log" Jan 28 21:34:05 crc kubenswrapper[4746]: I0128 21:34:05.062564 4746 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_barbican-worker-54d56bfd95-zhg7t_26db0906-ba06-4d40-b864-c7d956379296/barbican-worker/0.log" Jan 28 21:34:05 crc kubenswrapper[4746]: I0128 21:34:05.089001 4746 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_barbican-worker-54d56bfd95-zhg7t_26db0906-ba06-4d40-b864-c7d956379296/barbican-worker-log/0.log" Jan 28 21:34:05 crc kubenswrapper[4746]: I0128 21:34:05.326002 4746 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_bootstrap-edpm-deployment-openstack-edpm-ipam-x7ptr_ed8a3948-98ae-4e2a-a9f8-435287fc9583/bootstrap-edpm-deployment-openstack-edpm-ipam/0.log" Jan 28 21:34:05 crc kubenswrapper[4746]: I0128 21:34:05.532296 4746 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ceilometer-0_6f6522f9-6035-4484-ba00-2255f04cd85d/ceilometer-central-agent/0.log" Jan 28 21:34:05 crc kubenswrapper[4746]: I0128 21:34:05.550960 4746 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ceilometer-0_6f6522f9-6035-4484-ba00-2255f04cd85d/proxy-httpd/0.log" Jan 28 21:34:05 crc kubenswrapper[4746]: I0128 21:34:05.599459 4746 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ceilometer-0_6f6522f9-6035-4484-ba00-2255f04cd85d/ceilometer-notification-agent/0.log" Jan 28 21:34:05 crc kubenswrapper[4746]: I0128 21:34:05.721475 4746 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ceilometer-0_6f6522f9-6035-4484-ba00-2255f04cd85d/sg-core/0.log" Jan 28 21:34:05 crc kubenswrapper[4746]: I0128 21:34:05.781037 4746 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openstack_cinder-api-0_a613bc41-1308-4925-a2df-026f6622f0c2/cinder-api-log/0.log" Jan 28 21:34:05 crc kubenswrapper[4746]: I0128 21:34:05.831667 4746 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_cinder-api-0_a613bc41-1308-4925-a2df-026f6622f0c2/cinder-api/0.log" Jan 28 21:34:06 crc kubenswrapper[4746]: I0128 21:34:06.085349 4746 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_cinder-scheduler-0_9305786c-240c-4e6a-a110-599c0067ce78/cinder-scheduler/0.log" Jan 28 21:34:06 crc kubenswrapper[4746]: I0128 21:34:06.196639 4746 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_cinder-scheduler-0_9305786c-240c-4e6a-a110-599c0067ce78/probe/0.log" Jan 28 21:34:06 crc kubenswrapper[4746]: I0128 21:34:06.271969 4746 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/community-operators-frkpn" Jan 28 21:34:06 crc kubenswrapper[4746]: I0128 21:34:06.272017 4746 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/community-operators-frkpn" Jan 28 21:34:06 crc kubenswrapper[4746]: I0128 21:34:06.331351 4746 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_cloudkitty-api-0_e6b50cb8-f8a4-49e7-b464-7e42fc66e499/cloudkitty-api-log/0.log" Jan 28 21:34:06 crc kubenswrapper[4746]: I0128 21:34:06.338759 4746 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/community-operators-frkpn" Jan 28 21:34:06 crc kubenswrapper[4746]: I0128 21:34:06.376707 4746 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_cloudkitty-api-0_e6b50cb8-f8a4-49e7-b464-7e42fc66e499/cloudkitty-api/0.log" Jan 28 21:34:06 crc kubenswrapper[4746]: I0128 21:34:06.453786 4746 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_cloudkitty-lokistack-compactor-0_6edc718f-ce48-415e-ae81-574ef1f48cb6/loki-compactor/0.log" Jan 28 21:34:06 crc kubenswrapper[4746]: 
I0128 21:34:06.634937 4746 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_cloudkitty-lokistack-distributor-66dfd9bb-55rmf_7b3d4385-f154-424c-b7b6-280c36a88967/loki-distributor/0.log" Jan 28 21:34:06 crc kubenswrapper[4746]: I0128 21:34:06.675713 4746 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_cloudkitty-lokistack-gateway-7db4f4db8c-jptw9_247c16c1-2e4e-48dd-b836-0792f7231417/gateway/0.log" Jan 28 21:34:06 crc kubenswrapper[4746]: I0128 21:34:06.825340 4746 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_cloudkitty-lokistack-gateway-7db4f4db8c-s5zzt_f6b72417-5723-4d82-928b-f4be94e4bbfd/gateway/0.log" Jan 28 21:34:07 crc kubenswrapper[4746]: I0128 21:34:07.020973 4746 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_cloudkitty-lokistack-index-gateway-0_e6e1dec5-d0eb-4a49-b8c7-c89f3defbcef/loki-index-gateway/0.log" Jan 28 21:34:07 crc kubenswrapper[4746]: I0128 21:34:07.343648 4746 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_cloudkitty-lokistack-ingester-0_d3cad0b0-7b53-4280-9dec-05e01692820c/loki-ingester/0.log" Jan 28 21:34:07 crc kubenswrapper[4746]: I0128 21:34:07.622033 4746 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_cloudkitty-lokistack-querier-795fd8f8cc-gb5z2_9f570ea4-b303-46ab-8a65-cf64391aeb3b/loki-querier/0.log" Jan 28 21:34:07 crc kubenswrapper[4746]: I0128 21:34:07.629873 4746 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_cloudkitty-lokistack-query-frontend-5cd44666df-gnlkh_add39f1a-2338-41e9-9a61-d32fe5a28097/loki-query-frontend/0.log" Jan 28 21:34:07 crc kubenswrapper[4746]: I0128 21:34:07.878672 4746 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_configure-os-edpm-deployment-openstack-edpm-ipam-42x6f_e9b6010d-cd57-4992-b441-1745330a0246/configure-os-edpm-deployment-openstack-edpm-ipam/0.log" Jan 28 21:34:07 crc kubenswrapper[4746]: I0128 21:34:07.879572 4746 log.go:25] 
"Finished parsing log file" path="/var/log/pods/openstack_configure-network-edpm-deployment-openstack-edpm-ipam-tfrpf_f10b4bd3-0df0-4ab1-8ec6-4163541a3bf2/configure-network-edpm-deployment-openstack-edpm-ipam/0.log" Jan 28 21:34:08 crc kubenswrapper[4746]: I0128 21:34:08.086390 4746 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_dnsmasq-dns-85f64749dc-pndxq_7f070414-7083-40c4-b7aa-db248c3fd681/init/0.log" Jan 28 21:34:08 crc kubenswrapper[4746]: I0128 21:34:08.732871 4746 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_dnsmasq-dns-85f64749dc-pndxq_7f070414-7083-40c4-b7aa-db248c3fd681/init/0.log" Jan 28 21:34:08 crc kubenswrapper[4746]: I0128 21:34:08.837626 4746 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_download-cache-edpm-deployment-openstack-edpm-ipam-jgh8m_bd3a62cf-5636-4a92-8cc8-8025e70ad3d0/download-cache-edpm-deployment-openstack-edpm-ipam/0.log" Jan 28 21:34:08 crc kubenswrapper[4746]: I0128 21:34:08.867422 4746 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_dnsmasq-dns-85f64749dc-pndxq_7f070414-7083-40c4-b7aa-db248c3fd681/dnsmasq-dns/0.log" Jan 28 21:34:09 crc kubenswrapper[4746]: I0128 21:34:09.067471 4746 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_glance-default-external-api-0_ed20e05e-643c-407e-bd2f-ce931e1e2bd1/glance-httpd/0.log" Jan 28 21:34:09 crc kubenswrapper[4746]: I0128 21:34:09.124467 4746 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_glance-default-external-api-0_ed20e05e-643c-407e-bd2f-ce931e1e2bd1/glance-log/0.log" Jan 28 21:34:09 crc kubenswrapper[4746]: I0128 21:34:09.232594 4746 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_glance-default-internal-api-0_ffdc41d1-2cd3-446e-8d3f-6e374a19f56a/glance-httpd/0.log" Jan 28 21:34:09 crc kubenswrapper[4746]: I0128 21:34:09.331620 4746 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openstack_glance-default-internal-api-0_ffdc41d1-2cd3-446e-8d3f-6e374a19f56a/glance-log/0.log" Jan 28 21:34:09 crc kubenswrapper[4746]: I0128 21:34:09.477110 4746 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_install-certs-edpm-deployment-openstack-edpm-ipam-5d8wd_0dbab66d-c007-4c33-b6da-1e44860668a0/install-certs-edpm-deployment-openstack-edpm-ipam/0.log" Jan 28 21:34:09 crc kubenswrapper[4746]: I0128 21:34:09.819185 4746 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_install-os-edpm-deployment-openstack-edpm-ipam-qdhsw_55aba866-d60c-4581-8f83-28fc14e421f8/install-os-edpm-deployment-openstack-edpm-ipam/0.log" Jan 28 21:34:09 crc kubenswrapper[4746]: I0128 21:34:09.970011 4746 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_keystone-cron-29493901-2dcqp_018e2b8e-63bb-41fd-8153-f0c8fc106af7/keystone-cron/0.log" Jan 28 21:34:10 crc kubenswrapper[4746]: I0128 21:34:10.154457 4746 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_kube-state-metrics-0_62270f68-89c1-462f-8aac-c4944f92cc3f/kube-state-metrics/0.log" Jan 28 21:34:10 crc kubenswrapper[4746]: I0128 21:34:10.253674 4746 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_keystone-6ff88f78d4-bh6qm_b8f1ba06-a425-4474-94a2-80c68832caac/keystone-api/0.log" Jan 28 21:34:10 crc kubenswrapper[4746]: I0128 21:34:10.364271 4746 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_libvirt-edpm-deployment-openstack-edpm-ipam-qbvvx_7b6fd411-07ae-42b1-bb00-68e72fdbe6fb/libvirt-edpm-deployment-openstack-edpm-ipam/0.log" Jan 28 21:34:10 crc kubenswrapper[4746]: I0128 21:34:10.784949 4746 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_neutron-5569c8497f-nhjcs_d9c2d514-3bdc-4969-a429-0aac820c8e77/neutron-httpd/0.log" Jan 28 21:34:10 crc kubenswrapper[4746]: I0128 21:34:10.895959 4746 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openstack_neutron-5569c8497f-nhjcs_d9c2d514-3bdc-4969-a429-0aac820c8e77/neutron-api/0.log" Jan 28 21:34:11 crc kubenswrapper[4746]: I0128 21:34:11.116331 4746 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_neutron-metadata-edpm-deployment-openstack-edpm-ipam-mnncp_92c386a4-a812-4e5f-938a-611be2d329ff/neutron-metadata-edpm-deployment-openstack-edpm-ipam/0.log" Jan 28 21:34:11 crc kubenswrapper[4746]: I0128 21:34:11.655198 4746 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_nova-api-0_aca41824-3271-42e1-93f8-76a1a9000681/nova-api-log/0.log" Jan 28 21:34:11 crc kubenswrapper[4746]: I0128 21:34:11.775601 4746 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_nova-api-0_aca41824-3271-42e1-93f8-76a1a9000681/nova-api-api/0.log" Jan 28 21:34:11 crc kubenswrapper[4746]: I0128 21:34:11.907202 4746 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_nova-cell0-conductor-0_a34b42b7-85e9-4934-bbe2-487072111391/nova-cell0-conductor-conductor/0.log" Jan 28 21:34:12 crc kubenswrapper[4746]: I0128 21:34:12.410204 4746 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_nova-cell1-conductor-0_b1d56ad0-5928-4211-a272-59aaab5e538b/nova-cell1-conductor-conductor/0.log" Jan 28 21:34:12 crc kubenswrapper[4746]: I0128 21:34:12.462341 4746 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_nova-cell1-novncproxy-0_3466ae6e-8f00-4a2c-896e-cf1268924542/nova-cell1-novncproxy-novncproxy/0.log" Jan 28 21:34:12 crc kubenswrapper[4746]: I0128 21:34:12.745345 4746 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_nova-edpm-deployment-openstack-edpm-ipam-dpjnx_1d1f9f12-edab-459d-b9ac-2bb03644b752/nova-edpm-deployment-openstack-edpm-ipam/0.log" Jan 28 21:34:12 crc kubenswrapper[4746]: I0128 21:34:12.986916 4746 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openstack_nova-metadata-0_305b9f70-9fd4-4f7c-a4c5-dc46a63ebbc0/nova-metadata-log/0.log" Jan 28 21:34:13 crc kubenswrapper[4746]: I0128 21:34:13.461492 4746 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_nova-scheduler-0_d36f2955-7688-4b25-9097-becffcb1f3ad/nova-scheduler-scheduler/0.log" Jan 28 21:34:13 crc kubenswrapper[4746]: I0128 21:34:13.540277 4746 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_openstack-cell1-galera-0_7257206d-db68-4f31-84d1-ceb4175ea394/mysql-bootstrap/0.log" Jan 28 21:34:13 crc kubenswrapper[4746]: I0128 21:34:13.719099 4746 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_openstack-cell1-galera-0_7257206d-db68-4f31-84d1-ceb4175ea394/galera/0.log" Jan 28 21:34:13 crc kubenswrapper[4746]: I0128 21:34:13.806702 4746 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_openstack-cell1-galera-0_7257206d-db68-4f31-84d1-ceb4175ea394/mysql-bootstrap/0.log" Jan 28 21:34:14 crc kubenswrapper[4746]: I0128 21:34:14.037790 4746 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_openstack-galera-0_e98da54b-efd0-4811-a433-9ce8134feb13/mysql-bootstrap/0.log" Jan 28 21:34:14 crc kubenswrapper[4746]: I0128 21:34:14.214647 4746 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_openstack-galera-0_e98da54b-efd0-4811-a433-9ce8134feb13/mysql-bootstrap/0.log" Jan 28 21:34:14 crc kubenswrapper[4746]: I0128 21:34:14.226025 4746 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_nova-metadata-0_305b9f70-9fd4-4f7c-a4c5-dc46a63ebbc0/nova-metadata-metadata/0.log" Jan 28 21:34:14 crc kubenswrapper[4746]: I0128 21:34:14.253892 4746 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_openstack-galera-0_e98da54b-efd0-4811-a433-9ce8134feb13/galera/0.log" Jan 28 21:34:14 crc kubenswrapper[4746]: I0128 21:34:14.518896 4746 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openstack_openstackclient_b9e46853-37d6-49c8-ada6-344f49a39e5f/openstackclient/0.log" Jan 28 21:34:14 crc kubenswrapper[4746]: I0128 21:34:14.800693 4746 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovn-controller-metrics-5vb9w_7349005b-b4d2-40b0-bc5c-d83acafaf9e3/openstack-network-exporter/0.log" Jan 28 21:34:14 crc kubenswrapper[4746]: I0128 21:34:14.896902 4746 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovn-controller-ms9wc_754a9c43-4753-41cd-945d-93f7fa2b715e/ovn-controller/0.log" Jan 28 21:34:15 crc kubenswrapper[4746]: I0128 21:34:15.077922 4746 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovn-controller-ovs-fcvh6_2b1288d6-9c28-48e5-a97f-bdd75de9b8a2/ovsdb-server-init/0.log" Jan 28 21:34:15 crc kubenswrapper[4746]: I0128 21:34:15.282829 4746 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovn-controller-ovs-fcvh6_2b1288d6-9c28-48e5-a97f-bdd75de9b8a2/ovsdb-server-init/0.log" Jan 28 21:34:15 crc kubenswrapper[4746]: I0128 21:34:15.331790 4746 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovn-controller-ovs-fcvh6_2b1288d6-9c28-48e5-a97f-bdd75de9b8a2/ovsdb-server/0.log" Jan 28 21:34:15 crc kubenswrapper[4746]: I0128 21:34:15.342672 4746 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovn-controller-ovs-fcvh6_2b1288d6-9c28-48e5-a97f-bdd75de9b8a2/ovs-vswitchd/0.log" Jan 28 21:34:15 crc kubenswrapper[4746]: I0128 21:34:15.614214 4746 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovn-edpm-deployment-openstack-edpm-ipam-sfgd4_824d1a68-929d-4c25-801a-17fdf5172893/ovn-edpm-deployment-openstack-edpm-ipam/0.log" Jan 28 21:34:15 crc kubenswrapper[4746]: I0128 21:34:15.835871 4746 scope.go:117] "RemoveContainer" containerID="3efd284713e29e28f4aa6dc8e613ca58f00d02f5b7a245a28f734915518bde66" Jan 28 21:34:15 crc kubenswrapper[4746]: E0128 21:34:15.836285 4746 pod_workers.go:1301] "Error syncing 
pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-6wrnw_openshift-machine-config-operator(6dc8b546-9734-4082-b2b3-2bafe3f1564d)\"" pod="openshift-machine-config-operator/machine-config-daemon-6wrnw" podUID="6dc8b546-9734-4082-b2b3-2bafe3f1564d" Jan 28 21:34:15 crc kubenswrapper[4746]: I0128 21:34:15.855430 4746 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovn-northd-0_af4de16a-caed-4c86-9cf8-da6f9214ca5f/openstack-network-exporter/0.log" Jan 28 21:34:16 crc kubenswrapper[4746]: I0128 21:34:16.124588 4746 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovn-northd-0_af4de16a-caed-4c86-9cf8-da6f9214ca5f/ovn-northd/0.log" Jan 28 21:34:16 crc kubenswrapper[4746]: I0128 21:34:16.267306 4746 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovsdbserver-nb-0_d1e0be80-baed-4c8f-affd-33a252b527ad/openstack-network-exporter/0.log" Jan 28 21:34:16 crc kubenswrapper[4746]: I0128 21:34:16.268567 4746 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovsdbserver-nb-0_d1e0be80-baed-4c8f-affd-33a252b527ad/ovsdbserver-nb/0.log" Jan 28 21:34:16 crc kubenswrapper[4746]: I0128 21:34:16.334275 4746 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/community-operators-frkpn" Jan 28 21:34:16 crc kubenswrapper[4746]: I0128 21:34:16.397710 4746 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-frkpn"] Jan 28 21:34:16 crc kubenswrapper[4746]: I0128 21:34:16.549654 4746 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovsdbserver-sb-0_692d10ed-801f-47d2-b069-b3a0cb8dc4b7/ovsdbserver-sb/0.log" Jan 28 21:34:16 crc kubenswrapper[4746]: I0128 21:34:16.559175 4746 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openstack_ovsdbserver-sb-0_692d10ed-801f-47d2-b069-b3a0cb8dc4b7/openstack-network-exporter/0.log" Jan 28 21:34:16 crc kubenswrapper[4746]: I0128 21:34:16.927496 4746 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_placement-5755bdbcc4-rbmx8_12408645-b253-4e59-bd2f-5a4ec243cabd/placement-log/0.log" Jan 28 21:34:16 crc kubenswrapper[4746]: I0128 21:34:16.956340 4746 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_placement-5755bdbcc4-rbmx8_12408645-b253-4e59-bd2f-5a4ec243cabd/placement-api/0.log" Jan 28 21:34:17 crc kubenswrapper[4746]: I0128 21:34:17.039407 4746 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/community-operators-frkpn" podUID="41b73a06-0bf3-435a-90f4-71dc6706ffc7" containerName="registry-server" containerID="cri-o://558481c7abccdc6331dbc3289bee87fdbcc343d20dcf5c3e03528e040d67d1c5" gracePeriod=2 Jan 28 21:34:17 crc kubenswrapper[4746]: I0128 21:34:17.129444 4746 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_prometheus-metric-storage-0_914308b3-0f5e-4716-bc87-948f8a8acfb3/init-config-reloader/0.log" Jan 28 21:34:17 crc kubenswrapper[4746]: I0128 21:34:17.476663 4746 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_prometheus-metric-storage-0_914308b3-0f5e-4716-bc87-948f8a8acfb3/config-reloader/0.log" Jan 28 21:34:17 crc kubenswrapper[4746]: I0128 21:34:17.480249 4746 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_prometheus-metric-storage-0_914308b3-0f5e-4716-bc87-948f8a8acfb3/init-config-reloader/0.log" Jan 28 21:34:17 crc kubenswrapper[4746]: I0128 21:34:17.489380 4746 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_prometheus-metric-storage-0_914308b3-0f5e-4716-bc87-948f8a8acfb3/prometheus/0.log" Jan 28 21:34:17 crc kubenswrapper[4746]: I0128 21:34:17.698680 4746 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openstack_prometheus-metric-storage-0_914308b3-0f5e-4716-bc87-948f8a8acfb3/thanos-sidecar/0.log" Jan 28 21:34:17 crc kubenswrapper[4746]: I0128 21:34:17.721762 4746 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-frkpn" Jan 28 21:34:17 crc kubenswrapper[4746]: I0128 21:34:17.736145 4746 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_rabbitmq-cell1-server-0_31ed4da0-c996-4afb-aa3d-d61a7c13ccfb/setup-container/0.log" Jan 28 21:34:17 crc kubenswrapper[4746]: I0128 21:34:17.791864 4746 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-d7tt2\" (UniqueName: \"kubernetes.io/projected/41b73a06-0bf3-435a-90f4-71dc6706ffc7-kube-api-access-d7tt2\") pod \"41b73a06-0bf3-435a-90f4-71dc6706ffc7\" (UID: \"41b73a06-0bf3-435a-90f4-71dc6706ffc7\") " Jan 28 21:34:17 crc kubenswrapper[4746]: I0128 21:34:17.791925 4746 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/41b73a06-0bf3-435a-90f4-71dc6706ffc7-utilities\") pod \"41b73a06-0bf3-435a-90f4-71dc6706ffc7\" (UID: \"41b73a06-0bf3-435a-90f4-71dc6706ffc7\") " Jan 28 21:34:17 crc kubenswrapper[4746]: I0128 21:34:17.792021 4746 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/41b73a06-0bf3-435a-90f4-71dc6706ffc7-catalog-content\") pod \"41b73a06-0bf3-435a-90f4-71dc6706ffc7\" (UID: \"41b73a06-0bf3-435a-90f4-71dc6706ffc7\") " Jan 28 21:34:17 crc kubenswrapper[4746]: I0128 21:34:17.792774 4746 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/41b73a06-0bf3-435a-90f4-71dc6706ffc7-utilities" (OuterVolumeSpecName: "utilities") pod "41b73a06-0bf3-435a-90f4-71dc6706ffc7" (UID: "41b73a06-0bf3-435a-90f4-71dc6706ffc7"). InnerVolumeSpecName "utilities". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 28 21:34:17 crc kubenswrapper[4746]: I0128 21:34:17.813007 4746 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/41b73a06-0bf3-435a-90f4-71dc6706ffc7-kube-api-access-d7tt2" (OuterVolumeSpecName: "kube-api-access-d7tt2") pod "41b73a06-0bf3-435a-90f4-71dc6706ffc7" (UID: "41b73a06-0bf3-435a-90f4-71dc6706ffc7"). InnerVolumeSpecName "kube-api-access-d7tt2". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 28 21:34:17 crc kubenswrapper[4746]: I0128 21:34:17.860239 4746 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/41b73a06-0bf3-435a-90f4-71dc6706ffc7-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "41b73a06-0bf3-435a-90f4-71dc6706ffc7" (UID: "41b73a06-0bf3-435a-90f4-71dc6706ffc7"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 28 21:34:17 crc kubenswrapper[4746]: I0128 21:34:17.894470 4746 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/41b73a06-0bf3-435a-90f4-71dc6706ffc7-catalog-content\") on node \"crc\" DevicePath \"\"" Jan 28 21:34:17 crc kubenswrapper[4746]: I0128 21:34:17.894504 4746 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-d7tt2\" (UniqueName: \"kubernetes.io/projected/41b73a06-0bf3-435a-90f4-71dc6706ffc7-kube-api-access-d7tt2\") on node \"crc\" DevicePath \"\"" Jan 28 21:34:17 crc kubenswrapper[4746]: I0128 21:34:17.894516 4746 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/41b73a06-0bf3-435a-90f4-71dc6706ffc7-utilities\") on node \"crc\" DevicePath \"\"" Jan 28 21:34:18 crc kubenswrapper[4746]: I0128 21:34:18.050415 4746 generic.go:334] "Generic (PLEG): container finished" podID="41b73a06-0bf3-435a-90f4-71dc6706ffc7" 
containerID="558481c7abccdc6331dbc3289bee87fdbcc343d20dcf5c3e03528e040d67d1c5" exitCode=0 Jan 28 21:34:18 crc kubenswrapper[4746]: I0128 21:34:18.050456 4746 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-frkpn" event={"ID":"41b73a06-0bf3-435a-90f4-71dc6706ffc7","Type":"ContainerDied","Data":"558481c7abccdc6331dbc3289bee87fdbcc343d20dcf5c3e03528e040d67d1c5"} Jan 28 21:34:18 crc kubenswrapper[4746]: I0128 21:34:18.050480 4746 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-frkpn" event={"ID":"41b73a06-0bf3-435a-90f4-71dc6706ffc7","Type":"ContainerDied","Data":"90a1e1f7da74e2dee0e47a3a040ac989eb1fae8d2d214342b5a4007df8e4d761"} Jan 28 21:34:18 crc kubenswrapper[4746]: I0128 21:34:18.050495 4746 scope.go:117] "RemoveContainer" containerID="558481c7abccdc6331dbc3289bee87fdbcc343d20dcf5c3e03528e040d67d1c5" Jan 28 21:34:18 crc kubenswrapper[4746]: I0128 21:34:18.050614 4746 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-frkpn" Jan 28 21:34:18 crc kubenswrapper[4746]: I0128 21:34:18.077144 4746 scope.go:117] "RemoveContainer" containerID="999f488dedb4037b5018b157054eafb044ddaf3aa381131189c0e42cf041517f" Jan 28 21:34:18 crc kubenswrapper[4746]: I0128 21:34:18.096925 4746 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-frkpn"] Jan 28 21:34:18 crc kubenswrapper[4746]: I0128 21:34:18.107706 4746 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/community-operators-frkpn"] Jan 28 21:34:18 crc kubenswrapper[4746]: I0128 21:34:18.108419 4746 scope.go:117] "RemoveContainer" containerID="a6bb88388ed7ae8965a6c1273c83e5bd65a1fc674d0e099a9684007ef4480bdd" Jan 28 21:34:18 crc kubenswrapper[4746]: I0128 21:34:18.115889 4746 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_rabbitmq-cell1-server-0_31ed4da0-c996-4afb-aa3d-d61a7c13ccfb/setup-container/0.log" Jan 28 21:34:18 crc kubenswrapper[4746]: I0128 21:34:18.148121 4746 scope.go:117] "RemoveContainer" containerID="558481c7abccdc6331dbc3289bee87fdbcc343d20dcf5c3e03528e040d67d1c5" Jan 28 21:34:18 crc kubenswrapper[4746]: E0128 21:34:18.148673 4746 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"558481c7abccdc6331dbc3289bee87fdbcc343d20dcf5c3e03528e040d67d1c5\": container with ID starting with 558481c7abccdc6331dbc3289bee87fdbcc343d20dcf5c3e03528e040d67d1c5 not found: ID does not exist" containerID="558481c7abccdc6331dbc3289bee87fdbcc343d20dcf5c3e03528e040d67d1c5" Jan 28 21:34:18 crc kubenswrapper[4746]: I0128 21:34:18.148710 4746 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"558481c7abccdc6331dbc3289bee87fdbcc343d20dcf5c3e03528e040d67d1c5"} err="failed to get container status \"558481c7abccdc6331dbc3289bee87fdbcc343d20dcf5c3e03528e040d67d1c5\": rpc error: 
code = NotFound desc = could not find container \"558481c7abccdc6331dbc3289bee87fdbcc343d20dcf5c3e03528e040d67d1c5\": container with ID starting with 558481c7abccdc6331dbc3289bee87fdbcc343d20dcf5c3e03528e040d67d1c5 not found: ID does not exist" Jan 28 21:34:18 crc kubenswrapper[4746]: I0128 21:34:18.148744 4746 scope.go:117] "RemoveContainer" containerID="999f488dedb4037b5018b157054eafb044ddaf3aa381131189c0e42cf041517f" Jan 28 21:34:18 crc kubenswrapper[4746]: E0128 21:34:18.149042 4746 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"999f488dedb4037b5018b157054eafb044ddaf3aa381131189c0e42cf041517f\": container with ID starting with 999f488dedb4037b5018b157054eafb044ddaf3aa381131189c0e42cf041517f not found: ID does not exist" containerID="999f488dedb4037b5018b157054eafb044ddaf3aa381131189c0e42cf041517f" Jan 28 21:34:18 crc kubenswrapper[4746]: I0128 21:34:18.149085 4746 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"999f488dedb4037b5018b157054eafb044ddaf3aa381131189c0e42cf041517f"} err="failed to get container status \"999f488dedb4037b5018b157054eafb044ddaf3aa381131189c0e42cf041517f\": rpc error: code = NotFound desc = could not find container \"999f488dedb4037b5018b157054eafb044ddaf3aa381131189c0e42cf041517f\": container with ID starting with 999f488dedb4037b5018b157054eafb044ddaf3aa381131189c0e42cf041517f not found: ID does not exist" Jan 28 21:34:18 crc kubenswrapper[4746]: I0128 21:34:18.149154 4746 scope.go:117] "RemoveContainer" containerID="a6bb88388ed7ae8965a6c1273c83e5bd65a1fc674d0e099a9684007ef4480bdd" Jan 28 21:34:18 crc kubenswrapper[4746]: E0128 21:34:18.149522 4746 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"a6bb88388ed7ae8965a6c1273c83e5bd65a1fc674d0e099a9684007ef4480bdd\": container with ID starting with 
a6bb88388ed7ae8965a6c1273c83e5bd65a1fc674d0e099a9684007ef4480bdd not found: ID does not exist" containerID="a6bb88388ed7ae8965a6c1273c83e5bd65a1fc674d0e099a9684007ef4480bdd" Jan 28 21:34:18 crc kubenswrapper[4746]: I0128 21:34:18.149555 4746 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"a6bb88388ed7ae8965a6c1273c83e5bd65a1fc674d0e099a9684007ef4480bdd"} err="failed to get container status \"a6bb88388ed7ae8965a6c1273c83e5bd65a1fc674d0e099a9684007ef4480bdd\": rpc error: code = NotFound desc = could not find container \"a6bb88388ed7ae8965a6c1273c83e5bd65a1fc674d0e099a9684007ef4480bdd\": container with ID starting with a6bb88388ed7ae8965a6c1273c83e5bd65a1fc674d0e099a9684007ef4480bdd not found: ID does not exist" Jan 28 21:34:18 crc kubenswrapper[4746]: I0128 21:34:18.212830 4746 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_rabbitmq-cell1-server-0_31ed4da0-c996-4afb-aa3d-d61a7c13ccfb/rabbitmq/0.log" Jan 28 21:34:18 crc kubenswrapper[4746]: I0128 21:34:18.374752 4746 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_rabbitmq-server-0_f330def9-769c-4adf-9df3-c1a7c54cd502/setup-container/0.log" Jan 28 21:34:18 crc kubenswrapper[4746]: I0128 21:34:18.594039 4746 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_rabbitmq-server-0_f330def9-769c-4adf-9df3-c1a7c54cd502/rabbitmq/0.log" Jan 28 21:34:18 crc kubenswrapper[4746]: I0128 21:34:18.634709 4746 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_rabbitmq-server-0_f330def9-769c-4adf-9df3-c1a7c54cd502/setup-container/0.log" Jan 28 21:34:18 crc kubenswrapper[4746]: I0128 21:34:18.805054 4746 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_reboot-os-edpm-deployment-openstack-edpm-ipam-nvgcn_f2c97be5-d93c-4a83-87ad-48abb73d603c/reboot-os-edpm-deployment-openstack-edpm-ipam/0.log" Jan 28 21:34:18 crc kubenswrapper[4746]: I0128 21:34:18.857169 4746 kubelet_volumes.go:163] "Cleaned 
up orphaned pod volumes dir" podUID="41b73a06-0bf3-435a-90f4-71dc6706ffc7" path="/var/lib/kubelet/pods/41b73a06-0bf3-435a-90f4-71dc6706ffc7/volumes" Jan 28 21:34:18 crc kubenswrapper[4746]: I0128 21:34:18.870899 4746 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_redhat-edpm-deployment-openstack-edpm-ipam-p7xp2_a5bda0ca-2718-41bf-84d6-6c08d35d16b1/redhat-edpm-deployment-openstack-edpm-ipam/0.log" Jan 28 21:34:19 crc kubenswrapper[4746]: I0128 21:34:19.177289 4746 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_repo-setup-edpm-deployment-openstack-edpm-ipam-nfz6p_90153a28-4812-4b2e-a3a3-2443a8618c3d/repo-setup-edpm-deployment-openstack-edpm-ipam/0.log" Jan 28 21:34:19 crc kubenswrapper[4746]: I0128 21:34:19.316954 4746 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_run-os-edpm-deployment-openstack-edpm-ipam-xv2wb_280f47f9-2f66-4991-bd8f-59b734c5a935/run-os-edpm-deployment-openstack-edpm-ipam/0.log" Jan 28 21:34:19 crc kubenswrapper[4746]: I0128 21:34:19.457357 4746 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ssh-known-hosts-edpm-deployment-44hcw_cb5bd90c-ea83-463c-aed8-3291063c50bc/ssh-known-hosts-edpm-deployment/0.log" Jan 28 21:34:19 crc kubenswrapper[4746]: I0128 21:34:19.507657 4746 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_cloudkitty-proc-0_54f66341-e026-4e4e-a7d4-be4f199ff3d6/cloudkitty-proc/0.log" Jan 28 21:34:19 crc kubenswrapper[4746]: I0128 21:34:19.759379 4746 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_swift-proxy-5d6f7ddd75-47x9g_4524d3f6-9b61-4b9c-b778-0078a31efc3e/proxy-server/0.log" Jan 28 21:34:19 crc kubenswrapper[4746]: I0128 21:34:19.805134 4746 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_swift-proxy-5d6f7ddd75-47x9g_4524d3f6-9b61-4b9c-b778-0078a31efc3e/proxy-httpd/0.log" Jan 28 21:34:19 crc kubenswrapper[4746]: I0128 21:34:19.826007 4746 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openstack_swift-ring-rebalance-5sxj6_61a4ff02-ae06-438a-a39c-8264c8e61b38/swift-ring-rebalance/0.log" Jan 28 21:34:20 crc kubenswrapper[4746]: I0128 21:34:20.181255 4746 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_swift-storage-0_39e8de66-78c6-45cf-b026-7783ef89922d/account-auditor/0.log" Jan 28 21:34:20 crc kubenswrapper[4746]: I0128 21:34:20.438847 4746 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_swift-storage-0_39e8de66-78c6-45cf-b026-7783ef89922d/account-reaper/0.log" Jan 28 21:34:20 crc kubenswrapper[4746]: I0128 21:34:20.541060 4746 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_swift-storage-0_39e8de66-78c6-45cf-b026-7783ef89922d/account-replicator/0.log" Jan 28 21:34:20 crc kubenswrapper[4746]: I0128 21:34:20.558230 4746 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_swift-storage-0_39e8de66-78c6-45cf-b026-7783ef89922d/account-server/0.log" Jan 28 21:34:20 crc kubenswrapper[4746]: I0128 21:34:20.561475 4746 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_swift-storage-0_39e8de66-78c6-45cf-b026-7783ef89922d/container-auditor/0.log" Jan 28 21:34:20 crc kubenswrapper[4746]: I0128 21:34:20.696820 4746 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_swift-storage-0_39e8de66-78c6-45cf-b026-7783ef89922d/container-replicator/0.log" Jan 28 21:34:20 crc kubenswrapper[4746]: I0128 21:34:20.773822 4746 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_swift-storage-0_39e8de66-78c6-45cf-b026-7783ef89922d/container-updater/0.log" Jan 28 21:34:20 crc kubenswrapper[4746]: I0128 21:34:20.776395 4746 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_swift-storage-0_39e8de66-78c6-45cf-b026-7783ef89922d/container-server/0.log" Jan 28 21:34:20 crc kubenswrapper[4746]: I0128 21:34:20.838366 4746 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openstack_swift-storage-0_39e8de66-78c6-45cf-b026-7783ef89922d/object-auditor/0.log" Jan 28 21:34:20 crc kubenswrapper[4746]: I0128 21:34:20.980165 4746 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_swift-storage-0_39e8de66-78c6-45cf-b026-7783ef89922d/object-server/0.log" Jan 28 21:34:21 crc kubenswrapper[4746]: I0128 21:34:21.010213 4746 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_swift-storage-0_39e8de66-78c6-45cf-b026-7783ef89922d/object-replicator/0.log" Jan 28 21:34:21 crc kubenswrapper[4746]: I0128 21:34:21.026375 4746 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_swift-storage-0_39e8de66-78c6-45cf-b026-7783ef89922d/object-expirer/0.log" Jan 28 21:34:21 crc kubenswrapper[4746]: I0128 21:34:21.123535 4746 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_swift-storage-0_39e8de66-78c6-45cf-b026-7783ef89922d/object-updater/0.log" Jan 28 21:34:21 crc kubenswrapper[4746]: I0128 21:34:21.200189 4746 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_swift-storage-0_39e8de66-78c6-45cf-b026-7783ef89922d/rsync/0.log" Jan 28 21:34:21 crc kubenswrapper[4746]: I0128 21:34:21.310405 4746 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_swift-storage-0_39e8de66-78c6-45cf-b026-7783ef89922d/swift-recon-cron/0.log" Jan 28 21:34:21 crc kubenswrapper[4746]: I0128 21:34:21.488650 4746 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_telemetry-edpm-deployment-openstack-edpm-ipam-6jfqt_9c46ddb7-5815-475e-b798-06a7fee944c8/telemetry-edpm-deployment-openstack-edpm-ipam/0.log" Jan 28 21:34:21 crc kubenswrapper[4746]: I0128 21:34:21.693360 4746 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_tempest-tests-tempest_4bc6c6e0-6feb-4270-ad0e-2e8cbbb3430a/tempest-tests-tempest-tests-runner/0.log" Jan 28 21:34:21 crc kubenswrapper[4746]: I0128 21:34:21.818964 4746 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openstack_test-operator-logs-pod-tempest-tempest-tests-tempest_e4a0e4c1-a20f-4efd-aae8-3345c0c9c0f7/test-operator-logs-container/0.log" Jan 28 21:34:21 crc kubenswrapper[4746]: I0128 21:34:21.928124 4746 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_validate-network-edpm-deployment-openstack-edpm-ipam-rjltv_8728e263-d102-4878-a40e-30e414240224/validate-network-edpm-deployment-openstack-edpm-ipam/0.log" Jan 28 21:34:25 crc kubenswrapper[4746]: I0128 21:34:25.323044 4746 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_memcached-0_8c01b9a6-3e78-4a0c-9825-e39856c2df93/memcached/0.log" Jan 28 21:34:27 crc kubenswrapper[4746]: I0128 21:34:27.836054 4746 scope.go:117] "RemoveContainer" containerID="3efd284713e29e28f4aa6dc8e613ca58f00d02f5b7a245a28f734915518bde66" Jan 28 21:34:27 crc kubenswrapper[4746]: E0128 21:34:27.836677 4746 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-6wrnw_openshift-machine-config-operator(6dc8b546-9734-4082-b2b3-2bafe3f1564d)\"" pod="openshift-machine-config-operator/machine-config-daemon-6wrnw" podUID="6dc8b546-9734-4082-b2b3-2bafe3f1564d" Jan 28 21:34:40 crc kubenswrapper[4746]: I0128 21:34:40.836687 4746 scope.go:117] "RemoveContainer" containerID="3efd284713e29e28f4aa6dc8e613ca58f00d02f5b7a245a28f734915518bde66" Jan 28 21:34:40 crc kubenswrapper[4746]: E0128 21:34:40.837504 4746 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-6wrnw_openshift-machine-config-operator(6dc8b546-9734-4082-b2b3-2bafe3f1564d)\"" pod="openshift-machine-config-operator/machine-config-daemon-6wrnw" podUID="6dc8b546-9734-4082-b2b3-2bafe3f1564d" Jan 
28 21:34:49 crc kubenswrapper[4746]: I0128 21:34:49.364051 4746 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_1dcfae1c51044912e4345cf7baf7051d6a45f7abdb8f477454f0bd2ab0g68tp_c1bfc71e-7105-4567-b92a-37c08b17a97c/util/0.log" Jan 28 21:34:49 crc kubenswrapper[4746]: I0128 21:34:49.598510 4746 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_1dcfae1c51044912e4345cf7baf7051d6a45f7abdb8f477454f0bd2ab0g68tp_c1bfc71e-7105-4567-b92a-37c08b17a97c/pull/0.log" Jan 28 21:34:49 crc kubenswrapper[4746]: I0128 21:34:49.600017 4746 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_1dcfae1c51044912e4345cf7baf7051d6a45f7abdb8f477454f0bd2ab0g68tp_c1bfc71e-7105-4567-b92a-37c08b17a97c/util/0.log" Jan 28 21:34:49 crc kubenswrapper[4746]: I0128 21:34:49.628498 4746 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_1dcfae1c51044912e4345cf7baf7051d6a45f7abdb8f477454f0bd2ab0g68tp_c1bfc71e-7105-4567-b92a-37c08b17a97c/pull/0.log" Jan 28 21:34:49 crc kubenswrapper[4746]: I0128 21:34:49.893759 4746 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_1dcfae1c51044912e4345cf7baf7051d6a45f7abdb8f477454f0bd2ab0g68tp_c1bfc71e-7105-4567-b92a-37c08b17a97c/pull/0.log" Jan 28 21:34:49 crc kubenswrapper[4746]: I0128 21:34:49.927048 4746 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_1dcfae1c51044912e4345cf7baf7051d6a45f7abdb8f477454f0bd2ab0g68tp_c1bfc71e-7105-4567-b92a-37c08b17a97c/util/0.log" Jan 28 21:34:50 crc kubenswrapper[4746]: I0128 21:34:50.084480 4746 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_1dcfae1c51044912e4345cf7baf7051d6a45f7abdb8f477454f0bd2ab0g68tp_c1bfc71e-7105-4567-b92a-37c08b17a97c/extract/0.log" Jan 28 21:34:50 crc kubenswrapper[4746]: I0128 21:34:50.330705 4746 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openstack-operators_barbican-operator-controller-manager-7f86f8796f-kll6j_3c81bd6e-961b-42ae-8840-2607a13046df/manager/0.log" Jan 28 21:34:50 crc kubenswrapper[4746]: I0128 21:34:50.409049 4746 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_cinder-operator-controller-manager-7478f7dbf9-n6qr7_f86e66ed-9f28-4514-8ff8-97b8353026d1/manager/0.log" Jan 28 21:34:50 crc kubenswrapper[4746]: I0128 21:34:50.553319 4746 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_designate-operator-controller-manager-b45d7bf98-cm85d_63794c40-0128-457d-b223-84e87943cca9/manager/0.log" Jan 28 21:34:50 crc kubenswrapper[4746]: I0128 21:34:50.628530 4746 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_glance-operator-controller-manager-78fdd796fd-bxtxd_fe660f4f-8806-4674-ab58-ea3303f51683/manager/0.log" Jan 28 21:34:50 crc kubenswrapper[4746]: I0128 21:34:50.746786 4746 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_heat-operator-controller-manager-594c8c9d5d-p6qjg_677d2ab0-897d-4fd5-8ca5-b75f310e38da/manager/0.log" Jan 28 21:34:50 crc kubenswrapper[4746]: I0128 21:34:50.905992 4746 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_horizon-operator-controller-manager-77d5c5b54f-ws7k7_760877c4-6e86-4445-a4cf-002b48e93841/manager/0.log" Jan 28 21:34:51 crc kubenswrapper[4746]: I0128 21:34:51.378546 4746 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_infra-operator-controller-manager-694cf4f878-th2hg_28de2427-e250-44f5-add2-1b738cf6ce3b/manager/0.log" Jan 28 21:34:51 crc kubenswrapper[4746]: I0128 21:34:51.686563 4746 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_ironic-operator-controller-manager-598f7747c9-5lc6j_b44b1510-0a60-4b4e-9541-cc6d18e10a7f/manager/0.log" Jan 28 21:34:51 crc kubenswrapper[4746]: I0128 21:34:51.808908 4746 log.go:25] "Finished parsing 
log file" path="/var/log/pods/openstack-operators_keystone-operator-controller-manager-b8b6d4659-65qb5_fc220202-4669-4c2e-94b0-583048b56c83/manager/0.log" Jan 28 21:34:51 crc kubenswrapper[4746]: I0128 21:34:51.882559 4746 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_manila-operator-controller-manager-78c6999f6f-m5qbs_f682c47e-2151-466d-8cc5-9ef0fca79785/manager/0.log" Jan 28 21:34:52 crc kubenswrapper[4746]: I0128 21:34:52.071267 4746 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_mariadb-operator-controller-manager-6b9fb5fdcb-zcvgn_5521c5f5-d2f6-461b-a2fc-ee97a5b2df11/manager/0.log" Jan 28 21:34:52 crc kubenswrapper[4746]: I0128 21:34:52.193961 4746 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_neutron-operator-controller-manager-78d58447c5-pcprz_b182a0df-d0f9-46d6-9a0c-a3e332c84cff/manager/0.log" Jan 28 21:34:52 crc kubenswrapper[4746]: I0128 21:34:52.475932 4746 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_nova-operator-controller-manager-7bdb645866-pg4s4_ced3eeee-ed33-4c50-8531-a7e4df1849f6/manager/0.log" Jan 28 21:34:52 crc kubenswrapper[4746]: I0128 21:34:52.518072 4746 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_octavia-operator-controller-manager-5f4cd88d46-hb6t9_3b28dc9c-6dcf-4fd1-8cbd-f13d0da9e954/manager/0.log" Jan 28 21:34:52 crc kubenswrapper[4746]: I0128 21:34:52.724800 4746 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_openstack-baremetal-operator-controller-manager-6b68b8b85449cmp_3ff4c44c-0290-4ab0-abb8-316375200dc0/manager/0.log" Jan 28 21:34:52 crc kubenswrapper[4746]: I0128 21:34:52.911339 4746 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_openstack-operator-controller-init-58bd5c8549-ggt7x_0e81bc43-baa9-4cbd-a255-233e12e2b84b/operator/0.log" Jan 28 21:34:53 crc kubenswrapper[4746]: I0128 21:34:53.098901 4746 
log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_openstack-operator-index-hk7sk_161bd1ce-304a-4bcd-9188-568b362f4739/registry-server/0.log" Jan 28 21:34:53 crc kubenswrapper[4746]: I0128 21:34:53.483097 4746 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_ovn-operator-controller-manager-6f75f45d54-kpcqr_e3360f0f-1430-4b7e-9ee0-0a126a9b657d/manager/0.log" Jan 28 21:34:53 crc kubenswrapper[4746]: I0128 21:34:53.577490 4746 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_placement-operator-controller-manager-79d5ccc684-6klzp_370a5739-7af0-4065-986c-af68a265423c/manager/0.log" Jan 28 21:34:53 crc kubenswrapper[4746]: I0128 21:34:53.836986 4746 scope.go:117] "RemoveContainer" containerID="3efd284713e29e28f4aa6dc8e613ca58f00d02f5b7a245a28f734915518bde66" Jan 28 21:34:53 crc kubenswrapper[4746]: E0128 21:34:53.838337 4746 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-6wrnw_openshift-machine-config-operator(6dc8b546-9734-4082-b2b3-2bafe3f1564d)\"" pod="openshift-machine-config-operator/machine-config-daemon-6wrnw" podUID="6dc8b546-9734-4082-b2b3-2bafe3f1564d" Jan 28 21:34:53 crc kubenswrapper[4746]: I0128 21:34:53.858213 4746 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_rabbitmq-cluster-operator-manager-668c99d594-8kpht_6b7a0005-11ec-4c8a-87e9-872855585d4d/operator/0.log" Jan 28 21:34:54 crc kubenswrapper[4746]: I0128 21:34:54.098954 4746 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_swift-operator-controller-manager-547cbdb99f-fjs9l_1f4e2d58-bbd0-45d2-81ba-2b4b47ab5af6/manager/0.log" Jan 28 21:34:54 crc kubenswrapper[4746]: I0128 21:34:54.121391 4746 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openstack-operators_openstack-operator-controller-manager-65d466cb7d-vf8n9_a7c2547a-3282-4748-a823-c3a0cc41ad46/manager/0.log" Jan 28 21:34:54 crc kubenswrapper[4746]: I0128 21:34:54.311462 4746 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_test-operator-controller-manager-69797bbcbd-m4x6j_beba987e-69be-47aa-a84c-7ea511c4d151/manager/0.log" Jan 28 21:34:54 crc kubenswrapper[4746]: I0128 21:34:54.367325 4746 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_watcher-operator-controller-manager-564965969-hd4k9_90c190b4-36db-406b-bca5-6c45ac745ed6/manager/0.log" Jan 28 21:34:54 crc kubenswrapper[4746]: I0128 21:34:54.491193 4746 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_telemetry-operator-controller-manager-9477bbd48-z984g_e42669f3-6865-4ab6-9a9a-241c7b07509d/manager/0.log" Jan 28 21:35:04 crc kubenswrapper[4746]: I0128 21:35:04.679093 4746 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-operators-57lqt"] Jan 28 21:35:04 crc kubenswrapper[4746]: E0128 21:35:04.680245 4746 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="41b73a06-0bf3-435a-90f4-71dc6706ffc7" containerName="extract-utilities" Jan 28 21:35:04 crc kubenswrapper[4746]: I0128 21:35:04.680269 4746 state_mem.go:107] "Deleted CPUSet assignment" podUID="41b73a06-0bf3-435a-90f4-71dc6706ffc7" containerName="extract-utilities" Jan 28 21:35:04 crc kubenswrapper[4746]: E0128 21:35:04.680291 4746 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="41b73a06-0bf3-435a-90f4-71dc6706ffc7" containerName="extract-content" Jan 28 21:35:04 crc kubenswrapper[4746]: I0128 21:35:04.680300 4746 state_mem.go:107] "Deleted CPUSet assignment" podUID="41b73a06-0bf3-435a-90f4-71dc6706ffc7" containerName="extract-content" Jan 28 21:35:04 crc kubenswrapper[4746]: E0128 21:35:04.680319 4746 cpu_manager.go:410] "RemoveStaleState: removing container" 
podUID="41b73a06-0bf3-435a-90f4-71dc6706ffc7" containerName="registry-server" Jan 28 21:35:04 crc kubenswrapper[4746]: I0128 21:35:04.680329 4746 state_mem.go:107] "Deleted CPUSet assignment" podUID="41b73a06-0bf3-435a-90f4-71dc6706ffc7" containerName="registry-server" Jan 28 21:35:04 crc kubenswrapper[4746]: I0128 21:35:04.680602 4746 memory_manager.go:354] "RemoveStaleState removing state" podUID="41b73a06-0bf3-435a-90f4-71dc6706ffc7" containerName="registry-server" Jan 28 21:35:04 crc kubenswrapper[4746]: I0128 21:35:04.682698 4746 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-57lqt" Jan 28 21:35:04 crc kubenswrapper[4746]: I0128 21:35:04.696746 4746 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-57lqt"] Jan 28 21:35:04 crc kubenswrapper[4746]: I0128 21:35:04.755966 4746 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/f324dd81-f1d9-4f0e-8614-cee3ebf2edce-utilities\") pod \"redhat-operators-57lqt\" (UID: \"f324dd81-f1d9-4f0e-8614-cee3ebf2edce\") " pod="openshift-marketplace/redhat-operators-57lqt" Jan 28 21:35:04 crc kubenswrapper[4746]: I0128 21:35:04.756009 4746 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/f324dd81-f1d9-4f0e-8614-cee3ebf2edce-catalog-content\") pod \"redhat-operators-57lqt\" (UID: \"f324dd81-f1d9-4f0e-8614-cee3ebf2edce\") " pod="openshift-marketplace/redhat-operators-57lqt" Jan 28 21:35:04 crc kubenswrapper[4746]: I0128 21:35:04.756039 4746 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-9rddv\" (UniqueName: \"kubernetes.io/projected/f324dd81-f1d9-4f0e-8614-cee3ebf2edce-kube-api-access-9rddv\") pod \"redhat-operators-57lqt\" (UID: 
\"f324dd81-f1d9-4f0e-8614-cee3ebf2edce\") " pod="openshift-marketplace/redhat-operators-57lqt" Jan 28 21:35:04 crc kubenswrapper[4746]: I0128 21:35:04.836626 4746 scope.go:117] "RemoveContainer" containerID="3efd284713e29e28f4aa6dc8e613ca58f00d02f5b7a245a28f734915518bde66" Jan 28 21:35:04 crc kubenswrapper[4746]: E0128 21:35:04.836919 4746 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-6wrnw_openshift-machine-config-operator(6dc8b546-9734-4082-b2b3-2bafe3f1564d)\"" pod="openshift-machine-config-operator/machine-config-daemon-6wrnw" podUID="6dc8b546-9734-4082-b2b3-2bafe3f1564d" Jan 28 21:35:04 crc kubenswrapper[4746]: I0128 21:35:04.860407 4746 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/f324dd81-f1d9-4f0e-8614-cee3ebf2edce-utilities\") pod \"redhat-operators-57lqt\" (UID: \"f324dd81-f1d9-4f0e-8614-cee3ebf2edce\") " pod="openshift-marketplace/redhat-operators-57lqt" Jan 28 21:35:04 crc kubenswrapper[4746]: I0128 21:35:04.860464 4746 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/f324dd81-f1d9-4f0e-8614-cee3ebf2edce-catalog-content\") pod \"redhat-operators-57lqt\" (UID: \"f324dd81-f1d9-4f0e-8614-cee3ebf2edce\") " pod="openshift-marketplace/redhat-operators-57lqt" Jan 28 21:35:04 crc kubenswrapper[4746]: I0128 21:35:04.860511 4746 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-9rddv\" (UniqueName: \"kubernetes.io/projected/f324dd81-f1d9-4f0e-8614-cee3ebf2edce-kube-api-access-9rddv\") pod \"redhat-operators-57lqt\" (UID: \"f324dd81-f1d9-4f0e-8614-cee3ebf2edce\") " pod="openshift-marketplace/redhat-operators-57lqt" Jan 28 21:35:04 crc kubenswrapper[4746]: I0128 
21:35:04.860928 4746 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/f324dd81-f1d9-4f0e-8614-cee3ebf2edce-catalog-content\") pod \"redhat-operators-57lqt\" (UID: \"f324dd81-f1d9-4f0e-8614-cee3ebf2edce\") " pod="openshift-marketplace/redhat-operators-57lqt" Jan 28 21:35:04 crc kubenswrapper[4746]: I0128 21:35:04.861173 4746 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/f324dd81-f1d9-4f0e-8614-cee3ebf2edce-utilities\") pod \"redhat-operators-57lqt\" (UID: \"f324dd81-f1d9-4f0e-8614-cee3ebf2edce\") " pod="openshift-marketplace/redhat-operators-57lqt" Jan 28 21:35:04 crc kubenswrapper[4746]: I0128 21:35:04.883001 4746 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-9rddv\" (UniqueName: \"kubernetes.io/projected/f324dd81-f1d9-4f0e-8614-cee3ebf2edce-kube-api-access-9rddv\") pod \"redhat-operators-57lqt\" (UID: \"f324dd81-f1d9-4f0e-8614-cee3ebf2edce\") " pod="openshift-marketplace/redhat-operators-57lqt" Jan 28 21:35:05 crc kubenswrapper[4746]: I0128 21:35:05.011163 4746 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-57lqt" Jan 28 21:35:05 crc kubenswrapper[4746]: I0128 21:35:05.496422 4746 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-57lqt"] Jan 28 21:35:06 crc kubenswrapper[4746]: I0128 21:35:06.477631 4746 generic.go:334] "Generic (PLEG): container finished" podID="f324dd81-f1d9-4f0e-8614-cee3ebf2edce" containerID="82526b15b96e63cae817230534a02b69a343a8f04ba4060ec1aa73787f23421a" exitCode=0 Jan 28 21:35:06 crc kubenswrapper[4746]: I0128 21:35:06.477696 4746 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-57lqt" event={"ID":"f324dd81-f1d9-4f0e-8614-cee3ebf2edce","Type":"ContainerDied","Data":"82526b15b96e63cae817230534a02b69a343a8f04ba4060ec1aa73787f23421a"} Jan 28 21:35:06 crc kubenswrapper[4746]: I0128 21:35:06.479110 4746 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-57lqt" event={"ID":"f324dd81-f1d9-4f0e-8614-cee3ebf2edce","Type":"ContainerStarted","Data":"87cc3b73d6bd47a986b1fc6a54419be8eb38ecf5a24479c0be6ed0bab6f37a6b"} Jan 28 21:35:07 crc kubenswrapper[4746]: I0128 21:35:07.491718 4746 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-57lqt" event={"ID":"f324dd81-f1d9-4f0e-8614-cee3ebf2edce","Type":"ContainerStarted","Data":"36f71c07c8a546f0fd82fa2ab7a3f6f6baa195b73024d429b2a718dab527d85e"} Jan 28 21:35:12 crc kubenswrapper[4746]: I0128 21:35:12.534270 4746 generic.go:334] "Generic (PLEG): container finished" podID="f324dd81-f1d9-4f0e-8614-cee3ebf2edce" containerID="36f71c07c8a546f0fd82fa2ab7a3f6f6baa195b73024d429b2a718dab527d85e" exitCode=0 Jan 28 21:35:12 crc kubenswrapper[4746]: I0128 21:35:12.534327 4746 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-57lqt" 
event={"ID":"f324dd81-f1d9-4f0e-8614-cee3ebf2edce","Type":"ContainerDied","Data":"36f71c07c8a546f0fd82fa2ab7a3f6f6baa195b73024d429b2a718dab527d85e"} Jan 28 21:35:13 crc kubenswrapper[4746]: I0128 21:35:13.545337 4746 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-57lqt" event={"ID":"f324dd81-f1d9-4f0e-8614-cee3ebf2edce","Type":"ContainerStarted","Data":"0d471a874c83efd500a3666b46d8303ebd19cd95ad378ed04a4a61c301220d24"} Jan 28 21:35:13 crc kubenswrapper[4746]: I0128 21:35:13.571638 4746 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-operators-57lqt" podStartSLOduration=3.112854772 podStartE2EDuration="9.571620243s" podCreationTimestamp="2026-01-28 21:35:04 +0000 UTC" firstStartedPulling="2026-01-28 21:35:06.479941899 +0000 UTC m=+3334.436128273" lastFinishedPulling="2026-01-28 21:35:12.93870735 +0000 UTC m=+3340.894893744" observedRunningTime="2026-01-28 21:35:13.563904244 +0000 UTC m=+3341.520090598" watchObservedRunningTime="2026-01-28 21:35:13.571620243 +0000 UTC m=+3341.527806587" Jan 28 21:35:15 crc kubenswrapper[4746]: I0128 21:35:15.011326 4746 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-operators-57lqt" Jan 28 21:35:15 crc kubenswrapper[4746]: I0128 21:35:15.019261 4746 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-operators-57lqt" Jan 28 21:35:16 crc kubenswrapper[4746]: I0128 21:35:16.061720 4746 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-marketplace/redhat-operators-57lqt" podUID="f324dd81-f1d9-4f0e-8614-cee3ebf2edce" containerName="registry-server" probeResult="failure" output=< Jan 28 21:35:16 crc kubenswrapper[4746]: timeout: failed to connect service ":50051" within 1s Jan 28 21:35:16 crc kubenswrapper[4746]: > Jan 28 21:35:18 crc kubenswrapper[4746]: I0128 21:35:18.836705 4746 scope.go:117] "RemoveContainer" 
containerID="3efd284713e29e28f4aa6dc8e613ca58f00d02f5b7a245a28f734915518bde66" Jan 28 21:35:18 crc kubenswrapper[4746]: E0128 21:35:18.837169 4746 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-6wrnw_openshift-machine-config-operator(6dc8b546-9734-4082-b2b3-2bafe3f1564d)\"" pod="openshift-machine-config-operator/machine-config-daemon-6wrnw" podUID="6dc8b546-9734-4082-b2b3-2bafe3f1564d" Jan 28 21:35:19 crc kubenswrapper[4746]: I0128 21:35:19.113772 4746 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-machine-api_control-plane-machine-set-operator-78cbb6b69f-mznpb_72e0847f-0a87-4710-9765-a10282cc0529/control-plane-machine-set-operator/0.log" Jan 28 21:35:19 crc kubenswrapper[4746]: I0128 21:35:19.347239 4746 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-machine-api_machine-api-operator-5694c8668f-lzj8l_3e64bb6e-1131-431b-b87c-71e25d294fe1/kube-rbac-proxy/0.log" Jan 28 21:35:19 crc kubenswrapper[4746]: I0128 21:35:19.390046 4746 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-machine-api_machine-api-operator-5694c8668f-lzj8l_3e64bb6e-1131-431b-b87c-71e25d294fe1/machine-api-operator/0.log" Jan 28 21:35:26 crc kubenswrapper[4746]: I0128 21:35:26.071320 4746 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-marketplace/redhat-operators-57lqt" podUID="f324dd81-f1d9-4f0e-8614-cee3ebf2edce" containerName="registry-server" probeResult="failure" output=< Jan 28 21:35:26 crc kubenswrapper[4746]: timeout: failed to connect service ":50051" within 1s Jan 28 21:35:26 crc kubenswrapper[4746]: > Jan 28 21:35:30 crc kubenswrapper[4746]: I0128 21:35:30.836242 4746 scope.go:117] "RemoveContainer" containerID="3efd284713e29e28f4aa6dc8e613ca58f00d02f5b7a245a28f734915518bde66" Jan 28 21:35:30 crc kubenswrapper[4746]: E0128 
21:35:30.836857 4746 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-6wrnw_openshift-machine-config-operator(6dc8b546-9734-4082-b2b3-2bafe3f1564d)\"" pod="openshift-machine-config-operator/machine-config-daemon-6wrnw" podUID="6dc8b546-9734-4082-b2b3-2bafe3f1564d" Jan 28 21:35:35 crc kubenswrapper[4746]: I0128 21:35:35.067964 4746 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-operators-57lqt" Jan 28 21:35:35 crc kubenswrapper[4746]: I0128 21:35:35.129335 4746 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-operators-57lqt" Jan 28 21:35:35 crc kubenswrapper[4746]: I0128 21:35:35.133813 4746 log.go:25] "Finished parsing log file" path="/var/log/pods/cert-manager_cert-manager-858654f9db-5dcxq_ca33d567-a88a-4cad-b323-ffbb4ac0e02e/cert-manager-controller/0.log" Jan 28 21:35:35 crc kubenswrapper[4746]: I0128 21:35:35.617157 4746 log.go:25] "Finished parsing log file" path="/var/log/pods/cert-manager_cert-manager-cainjector-cf98fcc89-m56r8_e669e571-cde2-4753-a233-bd4ff6c76f02/cert-manager-cainjector/0.log" Jan 28 21:35:35 crc kubenswrapper[4746]: I0128 21:35:35.724042 4746 log.go:25] "Finished parsing log file" path="/var/log/pods/cert-manager_cert-manager-webhook-687f57d79b-bzzv6_6ff603d5-0f8d-415a-8616-55be576956bf/cert-manager-webhook/0.log" Jan 28 21:35:35 crc kubenswrapper[4746]: I0128 21:35:35.880444 4746 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-57lqt"] Jan 28 21:35:36 crc kubenswrapper[4746]: I0128 21:35:36.771863 4746 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-operators-57lqt" podUID="f324dd81-f1d9-4f0e-8614-cee3ebf2edce" containerName="registry-server" 
containerID="cri-o://0d471a874c83efd500a3666b46d8303ebd19cd95ad378ed04a4a61c301220d24" gracePeriod=2 Jan 28 21:35:37 crc kubenswrapper[4746]: I0128 21:35:37.596631 4746 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-57lqt" Jan 28 21:35:37 crc kubenswrapper[4746]: I0128 21:35:37.769118 4746 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/f324dd81-f1d9-4f0e-8614-cee3ebf2edce-catalog-content\") pod \"f324dd81-f1d9-4f0e-8614-cee3ebf2edce\" (UID: \"f324dd81-f1d9-4f0e-8614-cee3ebf2edce\") " Jan 28 21:35:37 crc kubenswrapper[4746]: I0128 21:35:37.769704 4746 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/f324dd81-f1d9-4f0e-8614-cee3ebf2edce-utilities\") pod \"f324dd81-f1d9-4f0e-8614-cee3ebf2edce\" (UID: \"f324dd81-f1d9-4f0e-8614-cee3ebf2edce\") " Jan 28 21:35:37 crc kubenswrapper[4746]: I0128 21:35:37.769769 4746 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-9rddv\" (UniqueName: \"kubernetes.io/projected/f324dd81-f1d9-4f0e-8614-cee3ebf2edce-kube-api-access-9rddv\") pod \"f324dd81-f1d9-4f0e-8614-cee3ebf2edce\" (UID: \"f324dd81-f1d9-4f0e-8614-cee3ebf2edce\") " Jan 28 21:35:37 crc kubenswrapper[4746]: I0128 21:35:37.772323 4746 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/f324dd81-f1d9-4f0e-8614-cee3ebf2edce-utilities" (OuterVolumeSpecName: "utilities") pod "f324dd81-f1d9-4f0e-8614-cee3ebf2edce" (UID: "f324dd81-f1d9-4f0e-8614-cee3ebf2edce"). InnerVolumeSpecName "utilities". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 28 21:35:37 crc kubenswrapper[4746]: I0128 21:35:37.777864 4746 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/f324dd81-f1d9-4f0e-8614-cee3ebf2edce-kube-api-access-9rddv" (OuterVolumeSpecName: "kube-api-access-9rddv") pod "f324dd81-f1d9-4f0e-8614-cee3ebf2edce" (UID: "f324dd81-f1d9-4f0e-8614-cee3ebf2edce"). InnerVolumeSpecName "kube-api-access-9rddv". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 28 21:35:37 crc kubenswrapper[4746]: I0128 21:35:37.791575 4746 generic.go:334] "Generic (PLEG): container finished" podID="f324dd81-f1d9-4f0e-8614-cee3ebf2edce" containerID="0d471a874c83efd500a3666b46d8303ebd19cd95ad378ed04a4a61c301220d24" exitCode=0 Jan 28 21:35:37 crc kubenswrapper[4746]: I0128 21:35:37.791613 4746 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-57lqt" event={"ID":"f324dd81-f1d9-4f0e-8614-cee3ebf2edce","Type":"ContainerDied","Data":"0d471a874c83efd500a3666b46d8303ebd19cd95ad378ed04a4a61c301220d24"} Jan 28 21:35:37 crc kubenswrapper[4746]: I0128 21:35:37.791638 4746 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-57lqt" event={"ID":"f324dd81-f1d9-4f0e-8614-cee3ebf2edce","Type":"ContainerDied","Data":"87cc3b73d6bd47a986b1fc6a54419be8eb38ecf5a24479c0be6ed0bab6f37a6b"} Jan 28 21:35:37 crc kubenswrapper[4746]: I0128 21:35:37.791650 4746 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-57lqt" Jan 28 21:35:37 crc kubenswrapper[4746]: I0128 21:35:37.791676 4746 scope.go:117] "RemoveContainer" containerID="0d471a874c83efd500a3666b46d8303ebd19cd95ad378ed04a4a61c301220d24" Jan 28 21:35:37 crc kubenswrapper[4746]: I0128 21:35:37.848464 4746 scope.go:117] "RemoveContainer" containerID="36f71c07c8a546f0fd82fa2ab7a3f6f6baa195b73024d429b2a718dab527d85e" Jan 28 21:35:37 crc kubenswrapper[4746]: I0128 21:35:37.872842 4746 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/f324dd81-f1d9-4f0e-8614-cee3ebf2edce-utilities\") on node \"crc\" DevicePath \"\"" Jan 28 21:35:37 crc kubenswrapper[4746]: I0128 21:35:37.872886 4746 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-9rddv\" (UniqueName: \"kubernetes.io/projected/f324dd81-f1d9-4f0e-8614-cee3ebf2edce-kube-api-access-9rddv\") on node \"crc\" DevicePath \"\"" Jan 28 21:35:37 crc kubenswrapper[4746]: I0128 21:35:37.883360 4746 scope.go:117] "RemoveContainer" containerID="82526b15b96e63cae817230534a02b69a343a8f04ba4060ec1aa73787f23421a" Jan 28 21:35:37 crc kubenswrapper[4746]: I0128 21:35:37.931292 4746 scope.go:117] "RemoveContainer" containerID="0d471a874c83efd500a3666b46d8303ebd19cd95ad378ed04a4a61c301220d24" Jan 28 21:35:37 crc kubenswrapper[4746]: E0128 21:35:37.933221 4746 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"0d471a874c83efd500a3666b46d8303ebd19cd95ad378ed04a4a61c301220d24\": container with ID starting with 0d471a874c83efd500a3666b46d8303ebd19cd95ad378ed04a4a61c301220d24 not found: ID does not exist" containerID="0d471a874c83efd500a3666b46d8303ebd19cd95ad378ed04a4a61c301220d24" Jan 28 21:35:37 crc kubenswrapper[4746]: I0128 21:35:37.933264 4746 pod_container_deletor.go:53] "DeleteContainer returned error" 
containerID={"Type":"cri-o","ID":"0d471a874c83efd500a3666b46d8303ebd19cd95ad378ed04a4a61c301220d24"} err="failed to get container status \"0d471a874c83efd500a3666b46d8303ebd19cd95ad378ed04a4a61c301220d24\": rpc error: code = NotFound desc = could not find container \"0d471a874c83efd500a3666b46d8303ebd19cd95ad378ed04a4a61c301220d24\": container with ID starting with 0d471a874c83efd500a3666b46d8303ebd19cd95ad378ed04a4a61c301220d24 not found: ID does not exist" Jan 28 21:35:37 crc kubenswrapper[4746]: I0128 21:35:37.933296 4746 scope.go:117] "RemoveContainer" containerID="36f71c07c8a546f0fd82fa2ab7a3f6f6baa195b73024d429b2a718dab527d85e" Jan 28 21:35:37 crc kubenswrapper[4746]: E0128 21:35:37.933601 4746 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"36f71c07c8a546f0fd82fa2ab7a3f6f6baa195b73024d429b2a718dab527d85e\": container with ID starting with 36f71c07c8a546f0fd82fa2ab7a3f6f6baa195b73024d429b2a718dab527d85e not found: ID does not exist" containerID="36f71c07c8a546f0fd82fa2ab7a3f6f6baa195b73024d429b2a718dab527d85e" Jan 28 21:35:37 crc kubenswrapper[4746]: I0128 21:35:37.933649 4746 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"36f71c07c8a546f0fd82fa2ab7a3f6f6baa195b73024d429b2a718dab527d85e"} err="failed to get container status \"36f71c07c8a546f0fd82fa2ab7a3f6f6baa195b73024d429b2a718dab527d85e\": rpc error: code = NotFound desc = could not find container \"36f71c07c8a546f0fd82fa2ab7a3f6f6baa195b73024d429b2a718dab527d85e\": container with ID starting with 36f71c07c8a546f0fd82fa2ab7a3f6f6baa195b73024d429b2a718dab527d85e not found: ID does not exist" Jan 28 21:35:37 crc kubenswrapper[4746]: I0128 21:35:37.933677 4746 scope.go:117] "RemoveContainer" containerID="82526b15b96e63cae817230534a02b69a343a8f04ba4060ec1aa73787f23421a" Jan 28 21:35:37 crc kubenswrapper[4746]: E0128 21:35:37.934667 4746 log.go:32] "ContainerStatus from runtime service 
failed" err="rpc error: code = NotFound desc = could not find container \"82526b15b96e63cae817230534a02b69a343a8f04ba4060ec1aa73787f23421a\": container with ID starting with 82526b15b96e63cae817230534a02b69a343a8f04ba4060ec1aa73787f23421a not found: ID does not exist" containerID="82526b15b96e63cae817230534a02b69a343a8f04ba4060ec1aa73787f23421a" Jan 28 21:35:37 crc kubenswrapper[4746]: I0128 21:35:37.934695 4746 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"82526b15b96e63cae817230534a02b69a343a8f04ba4060ec1aa73787f23421a"} err="failed to get container status \"82526b15b96e63cae817230534a02b69a343a8f04ba4060ec1aa73787f23421a\": rpc error: code = NotFound desc = could not find container \"82526b15b96e63cae817230534a02b69a343a8f04ba4060ec1aa73787f23421a\": container with ID starting with 82526b15b96e63cae817230534a02b69a343a8f04ba4060ec1aa73787f23421a not found: ID does not exist" Jan 28 21:35:37 crc kubenswrapper[4746]: I0128 21:35:37.949300 4746 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/f324dd81-f1d9-4f0e-8614-cee3ebf2edce-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "f324dd81-f1d9-4f0e-8614-cee3ebf2edce" (UID: "f324dd81-f1d9-4f0e-8614-cee3ebf2edce"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 28 21:35:37 crc kubenswrapper[4746]: I0128 21:35:37.975010 4746 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/f324dd81-f1d9-4f0e-8614-cee3ebf2edce-catalog-content\") on node \"crc\" DevicePath \"\"" Jan 28 21:35:38 crc kubenswrapper[4746]: I0128 21:35:38.125145 4746 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-57lqt"] Jan 28 21:35:38 crc kubenswrapper[4746]: I0128 21:35:38.134422 4746 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-operators-57lqt"] Jan 28 21:35:38 crc kubenswrapper[4746]: I0128 21:35:38.868817 4746 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="f324dd81-f1d9-4f0e-8614-cee3ebf2edce" path="/var/lib/kubelet/pods/f324dd81-f1d9-4f0e-8614-cee3ebf2edce/volumes" Jan 28 21:35:43 crc kubenswrapper[4746]: I0128 21:35:43.836528 4746 scope.go:117] "RemoveContainer" containerID="3efd284713e29e28f4aa6dc8e613ca58f00d02f5b7a245a28f734915518bde66" Jan 28 21:35:43 crc kubenswrapper[4746]: E0128 21:35:43.837330 4746 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-6wrnw_openshift-machine-config-operator(6dc8b546-9734-4082-b2b3-2bafe3f1564d)\"" pod="openshift-machine-config-operator/machine-config-daemon-6wrnw" podUID="6dc8b546-9734-4082-b2b3-2bafe3f1564d" Jan 28 21:35:51 crc kubenswrapper[4746]: I0128 21:35:51.510611 4746 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-nmstate_nmstate-console-plugin-7754f76f8b-4s5cl_6078a6ee-9b98-476e-89f3-5430a34e7ec9/nmstate-console-plugin/0.log" Jan 28 21:35:51 crc kubenswrapper[4746]: I0128 21:35:51.805944 4746 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openshift-nmstate_nmstate-handler-sbwgb_30f828e0-bffb-4b84-be14-53eac55a3ca3/nmstate-handler/0.log" Jan 28 21:35:51 crc kubenswrapper[4746]: I0128 21:35:51.912287 4746 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-nmstate_nmstate-metrics-54757c584b-v6k5d_48781ec4-e4a7-402c-a111-22310cfe0305/kube-rbac-proxy/0.log" Jan 28 21:35:52 crc kubenswrapper[4746]: I0128 21:35:52.041418 4746 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-nmstate_nmstate-metrics-54757c584b-v6k5d_48781ec4-e4a7-402c-a111-22310cfe0305/nmstate-metrics/0.log" Jan 28 21:35:52 crc kubenswrapper[4746]: I0128 21:35:52.112026 4746 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-nmstate_nmstate-operator-646758c888-x2kbm_e8f9251b-82e6-4fde-8a14-4430af400661/nmstate-operator/0.log" Jan 28 21:35:52 crc kubenswrapper[4746]: I0128 21:35:52.199456 4746 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-nmstate_nmstate-webhook-8474b5b9d8-vffsx_4108ee2d-3096-4956-95b4-7c2b8327175c/nmstate-webhook/0.log" Jan 28 21:35:56 crc kubenswrapper[4746]: I0128 21:35:56.835903 4746 scope.go:117] "RemoveContainer" containerID="3efd284713e29e28f4aa6dc8e613ca58f00d02f5b7a245a28f734915518bde66" Jan 28 21:35:56 crc kubenswrapper[4746]: E0128 21:35:56.836981 4746 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-6wrnw_openshift-machine-config-operator(6dc8b546-9734-4082-b2b3-2bafe3f1564d)\"" pod="openshift-machine-config-operator/machine-config-daemon-6wrnw" podUID="6dc8b546-9734-4082-b2b3-2bafe3f1564d" Jan 28 21:36:07 crc kubenswrapper[4746]: I0128 21:36:07.204997 4746 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openshift-operators-redhat_loki-operator-controller-manager-6866b6794-24l8g_cfda6c5a-4e09-4579-9149-ba5c87aaf387/manager/0.log" Jan 28 21:36:07 crc kubenswrapper[4746]: I0128 21:36:07.208949 4746 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-operators-redhat_loki-operator-controller-manager-6866b6794-24l8g_cfda6c5a-4e09-4579-9149-ba5c87aaf387/kube-rbac-proxy/0.log" Jan 28 21:36:09 crc kubenswrapper[4746]: I0128 21:36:09.836356 4746 scope.go:117] "RemoveContainer" containerID="3efd284713e29e28f4aa6dc8e613ca58f00d02f5b7a245a28f734915518bde66" Jan 28 21:36:09 crc kubenswrapper[4746]: E0128 21:36:09.837122 4746 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-6wrnw_openshift-machine-config-operator(6dc8b546-9734-4082-b2b3-2bafe3f1564d)\"" pod="openshift-machine-config-operator/machine-config-daemon-6wrnw" podUID="6dc8b546-9734-4082-b2b3-2bafe3f1564d" Jan 28 21:36:21 crc kubenswrapper[4746]: I0128 21:36:21.835793 4746 scope.go:117] "RemoveContainer" containerID="3efd284713e29e28f4aa6dc8e613ca58f00d02f5b7a245a28f734915518bde66" Jan 28 21:36:21 crc kubenswrapper[4746]: E0128 21:36:21.836628 4746 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-6wrnw_openshift-machine-config-operator(6dc8b546-9734-4082-b2b3-2bafe3f1564d)\"" pod="openshift-machine-config-operator/machine-config-daemon-6wrnw" podUID="6dc8b546-9734-4082-b2b3-2bafe3f1564d" Jan 28 21:36:22 crc kubenswrapper[4746]: I0128 21:36:22.462626 4746 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openshift-operators_obo-prometheus-operator-68bc856cb9-sx9pg_0e1b10c8-2491-403a-9ea3-9805d8167d7a/prometheus-operator/0.log" Jan 28 21:36:22 crc kubenswrapper[4746]: I0128 21:36:22.520355 4746 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/certified-operators-zvjft"] Jan 28 21:36:22 crc kubenswrapper[4746]: E0128 21:36:22.520924 4746 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f324dd81-f1d9-4f0e-8614-cee3ebf2edce" containerName="registry-server" Jan 28 21:36:22 crc kubenswrapper[4746]: I0128 21:36:22.520945 4746 state_mem.go:107] "Deleted CPUSet assignment" podUID="f324dd81-f1d9-4f0e-8614-cee3ebf2edce" containerName="registry-server" Jan 28 21:36:22 crc kubenswrapper[4746]: E0128 21:36:22.520980 4746 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f324dd81-f1d9-4f0e-8614-cee3ebf2edce" containerName="extract-content" Jan 28 21:36:22 crc kubenswrapper[4746]: I0128 21:36:22.520989 4746 state_mem.go:107] "Deleted CPUSet assignment" podUID="f324dd81-f1d9-4f0e-8614-cee3ebf2edce" containerName="extract-content" Jan 28 21:36:22 crc kubenswrapper[4746]: E0128 21:36:22.521007 4746 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f324dd81-f1d9-4f0e-8614-cee3ebf2edce" containerName="extract-utilities" Jan 28 21:36:22 crc kubenswrapper[4746]: I0128 21:36:22.521017 4746 state_mem.go:107] "Deleted CPUSet assignment" podUID="f324dd81-f1d9-4f0e-8614-cee3ebf2edce" containerName="extract-utilities" Jan 28 21:36:22 crc kubenswrapper[4746]: I0128 21:36:22.521337 4746 memory_manager.go:354] "RemoveStaleState removing state" podUID="f324dd81-f1d9-4f0e-8614-cee3ebf2edce" containerName="registry-server" Jan 28 21:36:22 crc kubenswrapper[4746]: I0128 21:36:22.522910 4746 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-zvjft" Jan 28 21:36:22 crc kubenswrapper[4746]: I0128 21:36:22.533681 4746 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-zvjft"] Jan 28 21:36:22 crc kubenswrapper[4746]: I0128 21:36:22.653768 4746 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/7abd7c3e-51a3-46ca-b471-44926f68f9b8-utilities\") pod \"certified-operators-zvjft\" (UID: \"7abd7c3e-51a3-46ca-b471-44926f68f9b8\") " pod="openshift-marketplace/certified-operators-zvjft" Jan 28 21:36:22 crc kubenswrapper[4746]: I0128 21:36:22.653832 4746 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/7abd7c3e-51a3-46ca-b471-44926f68f9b8-catalog-content\") pod \"certified-operators-zvjft\" (UID: \"7abd7c3e-51a3-46ca-b471-44926f68f9b8\") " pod="openshift-marketplace/certified-operators-zvjft" Jan 28 21:36:22 crc kubenswrapper[4746]: I0128 21:36:22.653886 4746 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-x2c9c\" (UniqueName: \"kubernetes.io/projected/7abd7c3e-51a3-46ca-b471-44926f68f9b8-kube-api-access-x2c9c\") pod \"certified-operators-zvjft\" (UID: \"7abd7c3e-51a3-46ca-b471-44926f68f9b8\") " pod="openshift-marketplace/certified-operators-zvjft" Jan 28 21:36:22 crc kubenswrapper[4746]: I0128 21:36:22.756104 4746 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/7abd7c3e-51a3-46ca-b471-44926f68f9b8-utilities\") pod \"certified-operators-zvjft\" (UID: \"7abd7c3e-51a3-46ca-b471-44926f68f9b8\") " pod="openshift-marketplace/certified-operators-zvjft" Jan 28 21:36:22 crc kubenswrapper[4746]: I0128 21:36:22.756154 4746 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/7abd7c3e-51a3-46ca-b471-44926f68f9b8-catalog-content\") pod \"certified-operators-zvjft\" (UID: \"7abd7c3e-51a3-46ca-b471-44926f68f9b8\") " pod="openshift-marketplace/certified-operators-zvjft" Jan 28 21:36:22 crc kubenswrapper[4746]: I0128 21:36:22.756180 4746 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-x2c9c\" (UniqueName: \"kubernetes.io/projected/7abd7c3e-51a3-46ca-b471-44926f68f9b8-kube-api-access-x2c9c\") pod \"certified-operators-zvjft\" (UID: \"7abd7c3e-51a3-46ca-b471-44926f68f9b8\") " pod="openshift-marketplace/certified-operators-zvjft" Jan 28 21:36:22 crc kubenswrapper[4746]: I0128 21:36:22.756607 4746 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/7abd7c3e-51a3-46ca-b471-44926f68f9b8-utilities\") pod \"certified-operators-zvjft\" (UID: \"7abd7c3e-51a3-46ca-b471-44926f68f9b8\") " pod="openshift-marketplace/certified-operators-zvjft" Jan 28 21:36:22 crc kubenswrapper[4746]: I0128 21:36:22.756679 4746 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/7abd7c3e-51a3-46ca-b471-44926f68f9b8-catalog-content\") pod \"certified-operators-zvjft\" (UID: \"7abd7c3e-51a3-46ca-b471-44926f68f9b8\") " pod="openshift-marketplace/certified-operators-zvjft" Jan 28 21:36:22 crc kubenswrapper[4746]: I0128 21:36:22.775721 4746 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-x2c9c\" (UniqueName: \"kubernetes.io/projected/7abd7c3e-51a3-46ca-b471-44926f68f9b8-kube-api-access-x2c9c\") pod \"certified-operators-zvjft\" (UID: \"7abd7c3e-51a3-46ca-b471-44926f68f9b8\") " pod="openshift-marketplace/certified-operators-zvjft" Jan 28 21:36:22 crc kubenswrapper[4746]: I0128 21:36:22.839148 4746 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-zvjft" Jan 28 21:36:22 crc kubenswrapper[4746]: I0128 21:36:22.906735 4746 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-operators_obo-prometheus-operator-admission-webhook-565bff74c4-9nt8k_10acdec7-69f6-42e1-b065-c84b8d82fd03/prometheus-operator-admission-webhook/0.log" Jan 28 21:36:22 crc kubenswrapper[4746]: I0128 21:36:22.907523 4746 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-operators_obo-prometheus-operator-admission-webhook-565bff74c4-stlnx_09345bfc-4171-49c5-85e3-32616db6ce17/prometheus-operator-admission-webhook/0.log" Jan 28 21:36:23 crc kubenswrapper[4746]: I0128 21:36:23.234618 4746 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-operators_observability-operator-59bdc8b94-m2mx9_2788b8ac-4eb0-46cb-8861-c55d6b302dd7/operator/0.log" Jan 28 21:36:23 crc kubenswrapper[4746]: I0128 21:36:23.364344 4746 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-operators_perses-operator-5bf474d74f-jnzwc_f13f3a63-44b1-4644-8bea-99e25a6764c3/perses-operator/0.log" Jan 28 21:36:23 crc kubenswrapper[4746]: I0128 21:36:23.387555 4746 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-zvjft"] Jan 28 21:36:24 crc kubenswrapper[4746]: I0128 21:36:24.233255 4746 generic.go:334] "Generic (PLEG): container finished" podID="7abd7c3e-51a3-46ca-b471-44926f68f9b8" containerID="44f15d4465e2d9186438189c32a761bc9f9bf7511ee4ef719f0cb9bbc72bd490" exitCode=0 Jan 28 21:36:24 crc kubenswrapper[4746]: I0128 21:36:24.233351 4746 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-zvjft" event={"ID":"7abd7c3e-51a3-46ca-b471-44926f68f9b8","Type":"ContainerDied","Data":"44f15d4465e2d9186438189c32a761bc9f9bf7511ee4ef719f0cb9bbc72bd490"} Jan 28 21:36:24 crc kubenswrapper[4746]: I0128 21:36:24.233582 4746 kubelet.go:2453] "SyncLoop (PLEG): event for 
pod" pod="openshift-marketplace/certified-operators-zvjft" event={"ID":"7abd7c3e-51a3-46ca-b471-44926f68f9b8","Type":"ContainerStarted","Data":"0684a6e4d127e1d1c222dfc2fa012f4162afef79cc49a272e7459c29c44f28bd"} Jan 28 21:36:25 crc kubenswrapper[4746]: I0128 21:36:25.244613 4746 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-zvjft" event={"ID":"7abd7c3e-51a3-46ca-b471-44926f68f9b8","Type":"ContainerStarted","Data":"8455d61e00567ce580277461d809d883e247937b4dbc05d770bc708e67255dff"} Jan 28 21:36:27 crc kubenswrapper[4746]: I0128 21:36:27.264120 4746 generic.go:334] "Generic (PLEG): container finished" podID="7abd7c3e-51a3-46ca-b471-44926f68f9b8" containerID="8455d61e00567ce580277461d809d883e247937b4dbc05d770bc708e67255dff" exitCode=0 Jan 28 21:36:27 crc kubenswrapper[4746]: I0128 21:36:27.264200 4746 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-zvjft" event={"ID":"7abd7c3e-51a3-46ca-b471-44926f68f9b8","Type":"ContainerDied","Data":"8455d61e00567ce580277461d809d883e247937b4dbc05d770bc708e67255dff"} Jan 28 21:36:28 crc kubenswrapper[4746]: I0128 21:36:28.274708 4746 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-zvjft" event={"ID":"7abd7c3e-51a3-46ca-b471-44926f68f9b8","Type":"ContainerStarted","Data":"c84da9e8fdcda6348f6cb65ac5dad9998e49ac6c7d2262c037692a4429f4f3d4"} Jan 28 21:36:28 crc kubenswrapper[4746]: I0128 21:36:28.291984 4746 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/certified-operators-zvjft" podStartSLOduration=2.821359799 podStartE2EDuration="6.291967431s" podCreationTimestamp="2026-01-28 21:36:22 +0000 UTC" firstStartedPulling="2026-01-28 21:36:24.234966378 +0000 UTC m=+3412.191152732" lastFinishedPulling="2026-01-28 21:36:27.70557401 +0000 UTC m=+3415.661760364" observedRunningTime="2026-01-28 21:36:28.289094943 +0000 UTC m=+3416.245281297" 
watchObservedRunningTime="2026-01-28 21:36:28.291967431 +0000 UTC m=+3416.248153795" Jan 28 21:36:32 crc kubenswrapper[4746]: I0128 21:36:32.882146 4746 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/certified-operators-zvjft" Jan 28 21:36:32 crc kubenswrapper[4746]: I0128 21:36:32.882907 4746 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/certified-operators-zvjft" Jan 28 21:36:32 crc kubenswrapper[4746]: I0128 21:36:32.940580 4746 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/certified-operators-zvjft" Jan 28 21:36:33 crc kubenswrapper[4746]: I0128 21:36:33.378717 4746 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/certified-operators-zvjft" Jan 28 21:36:33 crc kubenswrapper[4746]: I0128 21:36:33.422939 4746 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-zvjft"] Jan 28 21:36:33 crc kubenswrapper[4746]: I0128 21:36:33.836403 4746 scope.go:117] "RemoveContainer" containerID="3efd284713e29e28f4aa6dc8e613ca58f00d02f5b7a245a28f734915518bde66" Jan 28 21:36:33 crc kubenswrapper[4746]: E0128 21:36:33.837380 4746 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-6wrnw_openshift-machine-config-operator(6dc8b546-9734-4082-b2b3-2bafe3f1564d)\"" pod="openshift-machine-config-operator/machine-config-daemon-6wrnw" podUID="6dc8b546-9734-4082-b2b3-2bafe3f1564d" Jan 28 21:36:35 crc kubenswrapper[4746]: I0128 21:36:35.327840 4746 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/certified-operators-zvjft" podUID="7abd7c3e-51a3-46ca-b471-44926f68f9b8" containerName="registry-server" 
containerID="cri-o://c84da9e8fdcda6348f6cb65ac5dad9998e49ac6c7d2262c037692a4429f4f3d4" gracePeriod=2 Jan 28 21:36:36 crc kubenswrapper[4746]: I0128 21:36:36.054543 4746 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-zvjft" Jan 28 21:36:36 crc kubenswrapper[4746]: I0128 21:36:36.160916 4746 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/7abd7c3e-51a3-46ca-b471-44926f68f9b8-utilities\") pod \"7abd7c3e-51a3-46ca-b471-44926f68f9b8\" (UID: \"7abd7c3e-51a3-46ca-b471-44926f68f9b8\") " Jan 28 21:36:36 crc kubenswrapper[4746]: I0128 21:36:36.161173 4746 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-x2c9c\" (UniqueName: \"kubernetes.io/projected/7abd7c3e-51a3-46ca-b471-44926f68f9b8-kube-api-access-x2c9c\") pod \"7abd7c3e-51a3-46ca-b471-44926f68f9b8\" (UID: \"7abd7c3e-51a3-46ca-b471-44926f68f9b8\") " Jan 28 21:36:36 crc kubenswrapper[4746]: I0128 21:36:36.161270 4746 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/7abd7c3e-51a3-46ca-b471-44926f68f9b8-catalog-content\") pod \"7abd7c3e-51a3-46ca-b471-44926f68f9b8\" (UID: \"7abd7c3e-51a3-46ca-b471-44926f68f9b8\") " Jan 28 21:36:36 crc kubenswrapper[4746]: I0128 21:36:36.161917 4746 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/7abd7c3e-51a3-46ca-b471-44926f68f9b8-utilities" (OuterVolumeSpecName: "utilities") pod "7abd7c3e-51a3-46ca-b471-44926f68f9b8" (UID: "7abd7c3e-51a3-46ca-b471-44926f68f9b8"). InnerVolumeSpecName "utilities". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 28 21:36:36 crc kubenswrapper[4746]: I0128 21:36:36.166262 4746 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/7abd7c3e-51a3-46ca-b471-44926f68f9b8-kube-api-access-x2c9c" (OuterVolumeSpecName: "kube-api-access-x2c9c") pod "7abd7c3e-51a3-46ca-b471-44926f68f9b8" (UID: "7abd7c3e-51a3-46ca-b471-44926f68f9b8"). InnerVolumeSpecName "kube-api-access-x2c9c". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 28 21:36:36 crc kubenswrapper[4746]: I0128 21:36:36.208236 4746 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/7abd7c3e-51a3-46ca-b471-44926f68f9b8-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "7abd7c3e-51a3-46ca-b471-44926f68f9b8" (UID: "7abd7c3e-51a3-46ca-b471-44926f68f9b8"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 28 21:36:36 crc kubenswrapper[4746]: I0128 21:36:36.263699 4746 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-x2c9c\" (UniqueName: \"kubernetes.io/projected/7abd7c3e-51a3-46ca-b471-44926f68f9b8-kube-api-access-x2c9c\") on node \"crc\" DevicePath \"\"" Jan 28 21:36:36 crc kubenswrapper[4746]: I0128 21:36:36.263721 4746 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/7abd7c3e-51a3-46ca-b471-44926f68f9b8-catalog-content\") on node \"crc\" DevicePath \"\"" Jan 28 21:36:36 crc kubenswrapper[4746]: I0128 21:36:36.263731 4746 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/7abd7c3e-51a3-46ca-b471-44926f68f9b8-utilities\") on node \"crc\" DevicePath \"\"" Jan 28 21:36:36 crc kubenswrapper[4746]: I0128 21:36:36.340895 4746 generic.go:334] "Generic (PLEG): container finished" podID="7abd7c3e-51a3-46ca-b471-44926f68f9b8" 
containerID="c84da9e8fdcda6348f6cb65ac5dad9998e49ac6c7d2262c037692a4429f4f3d4" exitCode=0 Jan 28 21:36:36 crc kubenswrapper[4746]: I0128 21:36:36.340931 4746 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-zvjft" event={"ID":"7abd7c3e-51a3-46ca-b471-44926f68f9b8","Type":"ContainerDied","Data":"c84da9e8fdcda6348f6cb65ac5dad9998e49ac6c7d2262c037692a4429f4f3d4"} Jan 28 21:36:36 crc kubenswrapper[4746]: I0128 21:36:36.340954 4746 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-zvjft" event={"ID":"7abd7c3e-51a3-46ca-b471-44926f68f9b8","Type":"ContainerDied","Data":"0684a6e4d127e1d1c222dfc2fa012f4162afef79cc49a272e7459c29c44f28bd"} Jan 28 21:36:36 crc kubenswrapper[4746]: I0128 21:36:36.340969 4746 scope.go:117] "RemoveContainer" containerID="c84da9e8fdcda6348f6cb65ac5dad9998e49ac6c7d2262c037692a4429f4f3d4" Jan 28 21:36:36 crc kubenswrapper[4746]: I0128 21:36:36.341096 4746 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-zvjft" Jan 28 21:36:36 crc kubenswrapper[4746]: I0128 21:36:36.379638 4746 scope.go:117] "RemoveContainer" containerID="8455d61e00567ce580277461d809d883e247937b4dbc05d770bc708e67255dff" Jan 28 21:36:36 crc kubenswrapper[4746]: I0128 21:36:36.405232 4746 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-zvjft"] Jan 28 21:36:36 crc kubenswrapper[4746]: I0128 21:36:36.407843 4746 scope.go:117] "RemoveContainer" containerID="44f15d4465e2d9186438189c32a761bc9f9bf7511ee4ef719f0cb9bbc72bd490" Jan 28 21:36:36 crc kubenswrapper[4746]: I0128 21:36:36.415164 4746 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/certified-operators-zvjft"] Jan 28 21:36:36 crc kubenswrapper[4746]: I0128 21:36:36.463715 4746 scope.go:117] "RemoveContainer" containerID="c84da9e8fdcda6348f6cb65ac5dad9998e49ac6c7d2262c037692a4429f4f3d4" Jan 28 21:36:36 crc kubenswrapper[4746]: E0128 21:36:36.464890 4746 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"c84da9e8fdcda6348f6cb65ac5dad9998e49ac6c7d2262c037692a4429f4f3d4\": container with ID starting with c84da9e8fdcda6348f6cb65ac5dad9998e49ac6c7d2262c037692a4429f4f3d4 not found: ID does not exist" containerID="c84da9e8fdcda6348f6cb65ac5dad9998e49ac6c7d2262c037692a4429f4f3d4" Jan 28 21:36:36 crc kubenswrapper[4746]: I0128 21:36:36.464936 4746 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"c84da9e8fdcda6348f6cb65ac5dad9998e49ac6c7d2262c037692a4429f4f3d4"} err="failed to get container status \"c84da9e8fdcda6348f6cb65ac5dad9998e49ac6c7d2262c037692a4429f4f3d4\": rpc error: code = NotFound desc = could not find container \"c84da9e8fdcda6348f6cb65ac5dad9998e49ac6c7d2262c037692a4429f4f3d4\": container with ID starting with c84da9e8fdcda6348f6cb65ac5dad9998e49ac6c7d2262c037692a4429f4f3d4 not 
found: ID does not exist" Jan 28 21:36:36 crc kubenswrapper[4746]: I0128 21:36:36.464977 4746 scope.go:117] "RemoveContainer" containerID="8455d61e00567ce580277461d809d883e247937b4dbc05d770bc708e67255dff" Jan 28 21:36:36 crc kubenswrapper[4746]: E0128 21:36:36.465330 4746 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"8455d61e00567ce580277461d809d883e247937b4dbc05d770bc708e67255dff\": container with ID starting with 8455d61e00567ce580277461d809d883e247937b4dbc05d770bc708e67255dff not found: ID does not exist" containerID="8455d61e00567ce580277461d809d883e247937b4dbc05d770bc708e67255dff" Jan 28 21:36:36 crc kubenswrapper[4746]: I0128 21:36:36.465353 4746 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"8455d61e00567ce580277461d809d883e247937b4dbc05d770bc708e67255dff"} err="failed to get container status \"8455d61e00567ce580277461d809d883e247937b4dbc05d770bc708e67255dff\": rpc error: code = NotFound desc = could not find container \"8455d61e00567ce580277461d809d883e247937b4dbc05d770bc708e67255dff\": container with ID starting with 8455d61e00567ce580277461d809d883e247937b4dbc05d770bc708e67255dff not found: ID does not exist" Jan 28 21:36:36 crc kubenswrapper[4746]: I0128 21:36:36.465365 4746 scope.go:117] "RemoveContainer" containerID="44f15d4465e2d9186438189c32a761bc9f9bf7511ee4ef719f0cb9bbc72bd490" Jan 28 21:36:36 crc kubenswrapper[4746]: E0128 21:36:36.465608 4746 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"44f15d4465e2d9186438189c32a761bc9f9bf7511ee4ef719f0cb9bbc72bd490\": container with ID starting with 44f15d4465e2d9186438189c32a761bc9f9bf7511ee4ef719f0cb9bbc72bd490 not found: ID does not exist" containerID="44f15d4465e2d9186438189c32a761bc9f9bf7511ee4ef719f0cb9bbc72bd490" Jan 28 21:36:36 crc kubenswrapper[4746]: I0128 21:36:36.465628 4746 
pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"44f15d4465e2d9186438189c32a761bc9f9bf7511ee4ef719f0cb9bbc72bd490"} err="failed to get container status \"44f15d4465e2d9186438189c32a761bc9f9bf7511ee4ef719f0cb9bbc72bd490\": rpc error: code = NotFound desc = could not find container \"44f15d4465e2d9186438189c32a761bc9f9bf7511ee4ef719f0cb9bbc72bd490\": container with ID starting with 44f15d4465e2d9186438189c32a761bc9f9bf7511ee4ef719f0cb9bbc72bd490 not found: ID does not exist" Jan 28 21:36:36 crc kubenswrapper[4746]: I0128 21:36:36.848100 4746 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="7abd7c3e-51a3-46ca-b471-44926f68f9b8" path="/var/lib/kubelet/pods/7abd7c3e-51a3-46ca-b471-44926f68f9b8/volumes" Jan 28 21:36:38 crc kubenswrapper[4746]: I0128 21:36:38.353210 4746 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_controller-6968d8fdc4-r2vlm_b5150ca9-e86d-4087-bc5d-c2dd26234ecd/kube-rbac-proxy/0.log" Jan 28 21:36:38 crc kubenswrapper[4746]: I0128 21:36:38.453014 4746 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_controller-6968d8fdc4-r2vlm_b5150ca9-e86d-4087-bc5d-c2dd26234ecd/controller/0.log" Jan 28 21:36:38 crc kubenswrapper[4746]: I0128 21:36:38.626182 4746 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-hws9w_be75902a-e591-4378-89b8-9cab1f53dc5f/cp-frr-files/0.log" Jan 28 21:36:38 crc kubenswrapper[4746]: I0128 21:36:38.810643 4746 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-hws9w_be75902a-e591-4378-89b8-9cab1f53dc5f/cp-frr-files/0.log" Jan 28 21:36:38 crc kubenswrapper[4746]: I0128 21:36:38.881693 4746 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-hws9w_be75902a-e591-4378-89b8-9cab1f53dc5f/cp-reloader/0.log" Jan 28 21:36:38 crc kubenswrapper[4746]: I0128 21:36:38.885734 4746 log.go:25] "Finished parsing log file" 
path="/var/log/pods/metallb-system_frr-k8s-hws9w_be75902a-e591-4378-89b8-9cab1f53dc5f/cp-reloader/0.log" Jan 28 21:36:38 crc kubenswrapper[4746]: I0128 21:36:38.916649 4746 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-hws9w_be75902a-e591-4378-89b8-9cab1f53dc5f/cp-metrics/0.log" Jan 28 21:36:39 crc kubenswrapper[4746]: I0128 21:36:39.074621 4746 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-hws9w_be75902a-e591-4378-89b8-9cab1f53dc5f/cp-frr-files/0.log" Jan 28 21:36:39 crc kubenswrapper[4746]: I0128 21:36:39.134999 4746 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-hws9w_be75902a-e591-4378-89b8-9cab1f53dc5f/cp-metrics/0.log" Jan 28 21:36:39 crc kubenswrapper[4746]: I0128 21:36:39.143974 4746 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-hws9w_be75902a-e591-4378-89b8-9cab1f53dc5f/cp-reloader/0.log" Jan 28 21:36:39 crc kubenswrapper[4746]: I0128 21:36:39.162320 4746 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-hws9w_be75902a-e591-4378-89b8-9cab1f53dc5f/cp-metrics/0.log" Jan 28 21:36:39 crc kubenswrapper[4746]: I0128 21:36:39.294804 4746 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-hws9w_be75902a-e591-4378-89b8-9cab1f53dc5f/cp-reloader/0.log" Jan 28 21:36:39 crc kubenswrapper[4746]: I0128 21:36:39.361053 4746 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-hws9w_be75902a-e591-4378-89b8-9cab1f53dc5f/cp-metrics/0.log" Jan 28 21:36:39 crc kubenswrapper[4746]: I0128 21:36:39.375848 4746 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-hws9w_be75902a-e591-4378-89b8-9cab1f53dc5f/controller/0.log" Jan 28 21:36:39 crc kubenswrapper[4746]: I0128 21:36:39.432518 4746 log.go:25] "Finished parsing log file" 
path="/var/log/pods/metallb-system_frr-k8s-hws9w_be75902a-e591-4378-89b8-9cab1f53dc5f/cp-frr-files/0.log" Jan 28 21:36:39 crc kubenswrapper[4746]: I0128 21:36:39.597144 4746 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-hws9w_be75902a-e591-4378-89b8-9cab1f53dc5f/frr-metrics/0.log" Jan 28 21:36:39 crc kubenswrapper[4746]: I0128 21:36:39.597271 4746 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-hws9w_be75902a-e591-4378-89b8-9cab1f53dc5f/kube-rbac-proxy/0.log" Jan 28 21:36:39 crc kubenswrapper[4746]: I0128 21:36:39.737943 4746 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-hws9w_be75902a-e591-4378-89b8-9cab1f53dc5f/kube-rbac-proxy-frr/0.log" Jan 28 21:36:39 crc kubenswrapper[4746]: I0128 21:36:39.911849 4746 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-hws9w_be75902a-e591-4378-89b8-9cab1f53dc5f/reloader/0.log" Jan 28 21:36:39 crc kubenswrapper[4746]: I0128 21:36:39.942359 4746 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-webhook-server-7df86c4f6c-5crvf_64704f76-28dc-42cf-a696-9473b337eee9/frr-k8s-webhook-server/0.log" Jan 28 21:36:40 crc kubenswrapper[4746]: I0128 21:36:40.123996 4746 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_metallb-operator-controller-manager-5999cb5f6c-ndf7t_eb7a3d58-a895-43a6-8f29-240cfb61ed98/manager/0.log" Jan 28 21:36:40 crc kubenswrapper[4746]: I0128 21:36:40.314334 4746 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_metallb-operator-webhook-server-79f4bb6c4-wm9hh_1d703849-bf20-4333-9213-23b52999ae43/webhook-server/0.log" Jan 28 21:36:40 crc kubenswrapper[4746]: I0128 21:36:40.428433 4746 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_speaker-m55jn_c3d285c6-0abf-4c0b-92f5-1c91659d1de1/kube-rbac-proxy/0.log" Jan 28 21:36:41 crc kubenswrapper[4746]: I0128 21:36:41.068182 4746 
log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_speaker-m55jn_c3d285c6-0abf-4c0b-92f5-1c91659d1de1/speaker/0.log" Jan 28 21:36:41 crc kubenswrapper[4746]: I0128 21:36:41.181328 4746 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-hws9w_be75902a-e591-4378-89b8-9cab1f53dc5f/frr/0.log" Jan 28 21:36:48 crc kubenswrapper[4746]: I0128 21:36:48.837163 4746 scope.go:117] "RemoveContainer" containerID="3efd284713e29e28f4aa6dc8e613ca58f00d02f5b7a245a28f734915518bde66" Jan 28 21:36:49 crc kubenswrapper[4746]: I0128 21:36:49.492003 4746 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-6wrnw" event={"ID":"6dc8b546-9734-4082-b2b3-2bafe3f1564d","Type":"ContainerStarted","Data":"51c1ec6fe023b6b43a0fdc61b858972d2af817d4cc2d4c7a6797132edb658ffa"} Jan 28 21:36:54 crc kubenswrapper[4746]: I0128 21:36:54.750822 4746 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_270996307cd21d144be796860235064b5127c2fcf62ccccd6689c259dcjnv2x_b238ee1e-c43f-4eb7-8f69-de9f58747168/util/0.log" Jan 28 21:36:54 crc kubenswrapper[4746]: I0128 21:36:54.973015 4746 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_270996307cd21d144be796860235064b5127c2fcf62ccccd6689c259dcjnv2x_b238ee1e-c43f-4eb7-8f69-de9f58747168/util/0.log" Jan 28 21:36:55 crc kubenswrapper[4746]: I0128 21:36:55.010468 4746 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_270996307cd21d144be796860235064b5127c2fcf62ccccd6689c259dcjnv2x_b238ee1e-c43f-4eb7-8f69-de9f58747168/pull/0.log" Jan 28 21:36:55 crc kubenswrapper[4746]: I0128 21:36:55.063613 4746 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_270996307cd21d144be796860235064b5127c2fcf62ccccd6689c259dcjnv2x_b238ee1e-c43f-4eb7-8f69-de9f58747168/pull/0.log" Jan 28 21:36:55 crc kubenswrapper[4746]: I0128 21:36:55.212757 4746 log.go:25] "Finished 
parsing log file" path="/var/log/pods/openshift-marketplace_270996307cd21d144be796860235064b5127c2fcf62ccccd6689c259dcjnv2x_b238ee1e-c43f-4eb7-8f69-de9f58747168/util/0.log" Jan 28 21:36:55 crc kubenswrapper[4746]: I0128 21:36:55.256654 4746 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_270996307cd21d144be796860235064b5127c2fcf62ccccd6689c259dcjnv2x_b238ee1e-c43f-4eb7-8f69-de9f58747168/pull/0.log" Jan 28 21:36:55 crc kubenswrapper[4746]: I0128 21:36:55.312969 4746 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_270996307cd21d144be796860235064b5127c2fcf62ccccd6689c259dcjnv2x_b238ee1e-c43f-4eb7-8f69-de9f58747168/extract/0.log" Jan 28 21:36:55 crc kubenswrapper[4746]: I0128 21:36:55.454978 4746 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_3e572a74f8b8ca2bcfe04329d4f26bd9689911be5d166a7403bd6ae773vp7fg_2e9f38fa-c869-458e-8a7c-fb15ec9acccd/util/0.log" Jan 28 21:36:55 crc kubenswrapper[4746]: I0128 21:36:55.629904 4746 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_3e572a74f8b8ca2bcfe04329d4f26bd9689911be5d166a7403bd6ae773vp7fg_2e9f38fa-c869-458e-8a7c-fb15ec9acccd/pull/0.log" Jan 28 21:36:55 crc kubenswrapper[4746]: I0128 21:36:55.645052 4746 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_3e572a74f8b8ca2bcfe04329d4f26bd9689911be5d166a7403bd6ae773vp7fg_2e9f38fa-c869-458e-8a7c-fb15ec9acccd/pull/0.log" Jan 28 21:36:55 crc kubenswrapper[4746]: I0128 21:36:55.648447 4746 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_3e572a74f8b8ca2bcfe04329d4f26bd9689911be5d166a7403bd6ae773vp7fg_2e9f38fa-c869-458e-8a7c-fb15ec9acccd/util/0.log" Jan 28 21:36:55 crc kubenswrapper[4746]: I0128 21:36:55.843171 4746 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openshift-marketplace_3e572a74f8b8ca2bcfe04329d4f26bd9689911be5d166a7403bd6ae773vp7fg_2e9f38fa-c869-458e-8a7c-fb15ec9acccd/extract/0.log" Jan 28 21:36:55 crc kubenswrapper[4746]: I0128 21:36:55.853162 4746 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_3e572a74f8b8ca2bcfe04329d4f26bd9689911be5d166a7403bd6ae773vp7fg_2e9f38fa-c869-458e-8a7c-fb15ec9acccd/pull/0.log" Jan 28 21:36:55 crc kubenswrapper[4746]: I0128 21:36:55.860426 4746 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_3e572a74f8b8ca2bcfe04329d4f26bd9689911be5d166a7403bd6ae773vp7fg_2e9f38fa-c869-458e-8a7c-fb15ec9acccd/util/0.log" Jan 28 21:36:56 crc kubenswrapper[4746]: I0128 21:36:56.051228 4746 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_53efe8611d43ac2275911d954e05efbbba7920a530aff9253ed1cec713xz8cr_743dd52c-2031-4ffc-a4f2-57dfa9438e4e/util/0.log" Jan 28 21:36:56 crc kubenswrapper[4746]: I0128 21:36:56.270324 4746 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_53efe8611d43ac2275911d954e05efbbba7920a530aff9253ed1cec713xz8cr_743dd52c-2031-4ffc-a4f2-57dfa9438e4e/util/0.log" Jan 28 21:36:56 crc kubenswrapper[4746]: I0128 21:36:56.297440 4746 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_53efe8611d43ac2275911d954e05efbbba7920a530aff9253ed1cec713xz8cr_743dd52c-2031-4ffc-a4f2-57dfa9438e4e/pull/0.log" Jan 28 21:36:56 crc kubenswrapper[4746]: I0128 21:36:56.325693 4746 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_53efe8611d43ac2275911d954e05efbbba7920a530aff9253ed1cec713xz8cr_743dd52c-2031-4ffc-a4f2-57dfa9438e4e/pull/0.log" Jan 28 21:36:56 crc kubenswrapper[4746]: I0128 21:36:56.466723 4746 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_53efe8611d43ac2275911d954e05efbbba7920a530aff9253ed1cec713xz8cr_743dd52c-2031-4ffc-a4f2-57dfa9438e4e/util/0.log" Jan 28 
21:36:56 crc kubenswrapper[4746]: I0128 21:36:56.485711 4746 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_53efe8611d43ac2275911d954e05efbbba7920a530aff9253ed1cec713xz8cr_743dd52c-2031-4ffc-a4f2-57dfa9438e4e/extract/0.log" Jan 28 21:36:56 crc kubenswrapper[4746]: I0128 21:36:56.536399 4746 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_53efe8611d43ac2275911d954e05efbbba7920a530aff9253ed1cec713xz8cr_743dd52c-2031-4ffc-a4f2-57dfa9438e4e/pull/0.log" Jan 28 21:36:56 crc kubenswrapper[4746]: I0128 21:36:56.668623 4746 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_98629960b44b381d1a86cff1d1439a8df43509c9ad24579158c59d0f08m6kc8_3e93805e-6f5f-4618-b962-d9fca6cfe272/util/0.log" Jan 28 21:36:56 crc kubenswrapper[4746]: I0128 21:36:56.820684 4746 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_98629960b44b381d1a86cff1d1439a8df43509c9ad24579158c59d0f08m6kc8_3e93805e-6f5f-4618-b962-d9fca6cfe272/util/0.log" Jan 28 21:36:56 crc kubenswrapper[4746]: I0128 21:36:56.832158 4746 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_98629960b44b381d1a86cff1d1439a8df43509c9ad24579158c59d0f08m6kc8_3e93805e-6f5f-4618-b962-d9fca6cfe272/pull/0.log" Jan 28 21:36:56 crc kubenswrapper[4746]: I0128 21:36:56.841716 4746 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_98629960b44b381d1a86cff1d1439a8df43509c9ad24579158c59d0f08m6kc8_3e93805e-6f5f-4618-b962-d9fca6cfe272/pull/0.log" Jan 28 21:36:57 crc kubenswrapper[4746]: I0128 21:36:57.037897 4746 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_98629960b44b381d1a86cff1d1439a8df43509c9ad24579158c59d0f08m6kc8_3e93805e-6f5f-4618-b962-d9fca6cfe272/pull/0.log" Jan 28 21:36:57 crc kubenswrapper[4746]: I0128 21:36:57.047364 4746 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openshift-marketplace_98629960b44b381d1a86cff1d1439a8df43509c9ad24579158c59d0f08m6kc8_3e93805e-6f5f-4618-b962-d9fca6cfe272/util/0.log" Jan 28 21:36:57 crc kubenswrapper[4746]: I0128 21:36:57.069189 4746 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_98629960b44b381d1a86cff1d1439a8df43509c9ad24579158c59d0f08m6kc8_3e93805e-6f5f-4618-b962-d9fca6cfe272/extract/0.log" Jan 28 21:36:57 crc kubenswrapper[4746]: I0128 21:36:57.325436 4746 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_certified-operators-s6mqb_f854deb7-4783-4ba8-8357-ffe4d1124a12/extract-utilities/0.log" Jan 28 21:36:57 crc kubenswrapper[4746]: I0128 21:36:57.489242 4746 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_certified-operators-s6mqb_f854deb7-4783-4ba8-8357-ffe4d1124a12/extract-content/0.log" Jan 28 21:36:57 crc kubenswrapper[4746]: I0128 21:36:57.518983 4746 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_certified-operators-s6mqb_f854deb7-4783-4ba8-8357-ffe4d1124a12/extract-utilities/0.log" Jan 28 21:36:57 crc kubenswrapper[4746]: I0128 21:36:57.550025 4746 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_certified-operators-s6mqb_f854deb7-4783-4ba8-8357-ffe4d1124a12/extract-content/0.log" Jan 28 21:36:57 crc kubenswrapper[4746]: I0128 21:36:57.685681 4746 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_certified-operators-s6mqb_f854deb7-4783-4ba8-8357-ffe4d1124a12/extract-utilities/0.log" Jan 28 21:36:57 crc kubenswrapper[4746]: I0128 21:36:57.775977 4746 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_certified-operators-s6mqb_f854deb7-4783-4ba8-8357-ffe4d1124a12/extract-content/0.log" Jan 28 21:36:57 crc kubenswrapper[4746]: I0128 21:36:57.982537 4746 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openshift-marketplace_community-operators-88s2s_2f7998d2-06e5-4567-8502-79dadd37daec/extract-utilities/0.log" Jan 28 21:36:58 crc kubenswrapper[4746]: I0128 21:36:58.192321 4746 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_community-operators-88s2s_2f7998d2-06e5-4567-8502-79dadd37daec/extract-content/0.log" Jan 28 21:36:58 crc kubenswrapper[4746]: I0128 21:36:58.210936 4746 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_community-operators-88s2s_2f7998d2-06e5-4567-8502-79dadd37daec/extract-content/0.log" Jan 28 21:36:58 crc kubenswrapper[4746]: I0128 21:36:58.263511 4746 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_certified-operators-s6mqb_f854deb7-4783-4ba8-8357-ffe4d1124a12/registry-server/0.log" Jan 28 21:36:58 crc kubenswrapper[4746]: I0128 21:36:58.303791 4746 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_community-operators-88s2s_2f7998d2-06e5-4567-8502-79dadd37daec/extract-utilities/0.log" Jan 28 21:36:58 crc kubenswrapper[4746]: I0128 21:36:58.526450 4746 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_community-operators-88s2s_2f7998d2-06e5-4567-8502-79dadd37daec/extract-content/0.log" Jan 28 21:36:58 crc kubenswrapper[4746]: I0128 21:36:58.547480 4746 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_community-operators-88s2s_2f7998d2-06e5-4567-8502-79dadd37daec/extract-utilities/0.log" Jan 28 21:36:58 crc kubenswrapper[4746]: I0128 21:36:58.778168 4746 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_marketplace-operator-79b997595-bgtlc_6663df81-0144-46d7-90a2-a1ff5edb9474/marketplace-operator/0.log" Jan 28 21:36:58 crc kubenswrapper[4746]: I0128 21:36:58.826629 4746 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openshift-marketplace_redhat-marketplace-525kj_13f17d99-49bb-4710-8d6b-e01933b5d396/extract-utilities/0.log" Jan 28 21:36:58 crc kubenswrapper[4746]: I0128 21:36:58.957051 4746 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_community-operators-88s2s_2f7998d2-06e5-4567-8502-79dadd37daec/registry-server/0.log" Jan 28 21:36:59 crc kubenswrapper[4746]: I0128 21:36:59.026633 4746 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-marketplace-525kj_13f17d99-49bb-4710-8d6b-e01933b5d396/extract-content/0.log" Jan 28 21:36:59 crc kubenswrapper[4746]: I0128 21:36:59.037027 4746 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-marketplace-525kj_13f17d99-49bb-4710-8d6b-e01933b5d396/extract-content/0.log" Jan 28 21:36:59 crc kubenswrapper[4746]: I0128 21:36:59.042307 4746 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-marketplace-525kj_13f17d99-49bb-4710-8d6b-e01933b5d396/extract-utilities/0.log" Jan 28 21:36:59 crc kubenswrapper[4746]: I0128 21:36:59.221177 4746 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-marketplace-525kj_13f17d99-49bb-4710-8d6b-e01933b5d396/extract-utilities/0.log" Jan 28 21:36:59 crc kubenswrapper[4746]: I0128 21:36:59.230436 4746 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-marketplace-525kj_13f17d99-49bb-4710-8d6b-e01933b5d396/extract-content/0.log" Jan 28 21:36:59 crc kubenswrapper[4746]: I0128 21:36:59.317397 4746 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-operators-hp8gs_b5a70df4-251e-4d72-b220-9772f0b70727/extract-utilities/0.log" Jan 28 21:36:59 crc kubenswrapper[4746]: I0128 21:36:59.397514 4746 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openshift-marketplace_redhat-marketplace-525kj_13f17d99-49bb-4710-8d6b-e01933b5d396/registry-server/0.log" Jan 28 21:36:59 crc kubenswrapper[4746]: I0128 21:36:59.489940 4746 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-operators-hp8gs_b5a70df4-251e-4d72-b220-9772f0b70727/extract-utilities/0.log" Jan 28 21:36:59 crc kubenswrapper[4746]: I0128 21:36:59.532800 4746 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-operators-hp8gs_b5a70df4-251e-4d72-b220-9772f0b70727/extract-content/0.log" Jan 28 21:36:59 crc kubenswrapper[4746]: I0128 21:36:59.587252 4746 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-operators-hp8gs_b5a70df4-251e-4d72-b220-9772f0b70727/extract-content/0.log" Jan 28 21:36:59 crc kubenswrapper[4746]: I0128 21:36:59.776431 4746 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-operators-hp8gs_b5a70df4-251e-4d72-b220-9772f0b70727/extract-utilities/0.log" Jan 28 21:36:59 crc kubenswrapper[4746]: I0128 21:36:59.795210 4746 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-operators-hp8gs_b5a70df4-251e-4d72-b220-9772f0b70727/extract-content/0.log" Jan 28 21:37:00 crc kubenswrapper[4746]: I0128 21:37:00.214146 4746 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-operators-hp8gs_b5a70df4-251e-4d72-b220-9772f0b70727/registry-server/0.log" Jan 28 21:37:13 crc kubenswrapper[4746]: I0128 21:37:13.395387 4746 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-operators_obo-prometheus-operator-68bc856cb9-sx9pg_0e1b10c8-2491-403a-9ea3-9805d8167d7a/prometheus-operator/0.log" Jan 28 21:37:13 crc kubenswrapper[4746]: I0128 21:37:13.418536 4746 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openshift-operators_obo-prometheus-operator-admission-webhook-565bff74c4-9nt8k_10acdec7-69f6-42e1-b065-c84b8d82fd03/prometheus-operator-admission-webhook/0.log" Jan 28 21:37:13 crc kubenswrapper[4746]: I0128 21:37:13.442150 4746 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-operators_obo-prometheus-operator-admission-webhook-565bff74c4-stlnx_09345bfc-4171-49c5-85e3-32616db6ce17/prometheus-operator-admission-webhook/0.log" Jan 28 21:37:13 crc kubenswrapper[4746]: I0128 21:37:13.616145 4746 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-operators_observability-operator-59bdc8b94-m2mx9_2788b8ac-4eb0-46cb-8861-c55d6b302dd7/operator/0.log" Jan 28 21:37:13 crc kubenswrapper[4746]: I0128 21:37:13.648973 4746 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-operators_perses-operator-5bf474d74f-jnzwc_f13f3a63-44b1-4644-8bea-99e25a6764c3/perses-operator/0.log" Jan 28 21:37:27 crc kubenswrapper[4746]: I0128 21:37:27.378823 4746 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-operators-redhat_loki-operator-controller-manager-6866b6794-24l8g_cfda6c5a-4e09-4579-9149-ba5c87aaf387/kube-rbac-proxy/0.log" Jan 28 21:37:27 crc kubenswrapper[4746]: I0128 21:37:27.439103 4746 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-operators-redhat_loki-operator-controller-manager-6866b6794-24l8g_cfda6c5a-4e09-4579-9149-ba5c87aaf387/manager/0.log" Jan 28 21:37:39 crc kubenswrapper[4746]: E0128 21:37:39.751250 4746 upgradeaware.go:427] Error proxying data from client to backend: readfrom tcp 38.102.83.201:55476->38.102.83.201:46663: write tcp 38.102.83.201:55476->38.102.83.201:46663: write: broken pipe Jan 28 21:39:15 crc kubenswrapper[4746]: I0128 21:39:15.871832 4746 patch_prober.go:28] interesting pod/machine-config-daemon-6wrnw container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get 
\"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Jan 28 21:39:15 crc kubenswrapper[4746]: I0128 21:39:15.872367 4746 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-6wrnw" podUID="6dc8b546-9734-4082-b2b3-2bafe3f1564d" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Jan 28 21:39:17 crc kubenswrapper[4746]: I0128 21:39:17.998443 4746 generic.go:334] "Generic (PLEG): container finished" podID="c0a60132-458a-46f2-a61f-444424b4c7cb" containerID="ca03b7997e8a5546e955fc8d2980d71dbf2f003ef00ac1b7464afe07799f1d18" exitCode=0 Jan 28 21:39:17 crc kubenswrapper[4746]: I0128 21:39:17.998768 4746 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-fjncs/must-gather-5v5j6" event={"ID":"c0a60132-458a-46f2-a61f-444424b4c7cb","Type":"ContainerDied","Data":"ca03b7997e8a5546e955fc8d2980d71dbf2f003ef00ac1b7464afe07799f1d18"} Jan 28 21:39:18 crc kubenswrapper[4746]: I0128 21:39:17.999961 4746 scope.go:117] "RemoveContainer" containerID="ca03b7997e8a5546e955fc8d2980d71dbf2f003ef00ac1b7464afe07799f1d18" Jan 28 21:39:18 crc kubenswrapper[4746]: I0128 21:39:18.968886 4746 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-must-gather-fjncs_must-gather-5v5j6_c0a60132-458a-46f2-a61f-444424b4c7cb/gather/0.log" Jan 28 21:39:26 crc kubenswrapper[4746]: I0128 21:39:26.761522 4746 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-must-gather-fjncs/must-gather-5v5j6"] Jan 28 21:39:26 crc kubenswrapper[4746]: I0128 21:39:26.762258 4746 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-must-gather-fjncs/must-gather-5v5j6" podUID="c0a60132-458a-46f2-a61f-444424b4c7cb" containerName="copy" containerID="cri-o://f784a010d13db326f2d27e8d7112b46dbad58c5d897846252677e1fb89b8b5b6" 
gracePeriod=2 Jan 28 21:39:26 crc kubenswrapper[4746]: I0128 21:39:26.770590 4746 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-must-gather-fjncs/must-gather-5v5j6"] Jan 28 21:39:27 crc kubenswrapper[4746]: I0128 21:39:27.092058 4746 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-must-gather-fjncs_must-gather-5v5j6_c0a60132-458a-46f2-a61f-444424b4c7cb/copy/0.log" Jan 28 21:39:27 crc kubenswrapper[4746]: I0128 21:39:27.093015 4746 generic.go:334] "Generic (PLEG): container finished" podID="c0a60132-458a-46f2-a61f-444424b4c7cb" containerID="f784a010d13db326f2d27e8d7112b46dbad58c5d897846252677e1fb89b8b5b6" exitCode=143 Jan 28 21:39:27 crc kubenswrapper[4746]: I0128 21:39:27.420276 4746 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-must-gather-fjncs_must-gather-5v5j6_c0a60132-458a-46f2-a61f-444424b4c7cb/copy/0.log" Jan 28 21:39:27 crc kubenswrapper[4746]: I0128 21:39:27.420630 4746 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-must-gather-fjncs/must-gather-5v5j6" Jan 28 21:39:27 crc kubenswrapper[4746]: I0128 21:39:27.531055 4746 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"must-gather-output\" (UniqueName: \"kubernetes.io/empty-dir/c0a60132-458a-46f2-a61f-444424b4c7cb-must-gather-output\") pod \"c0a60132-458a-46f2-a61f-444424b4c7cb\" (UID: \"c0a60132-458a-46f2-a61f-444424b4c7cb\") " Jan 28 21:39:27 crc kubenswrapper[4746]: I0128 21:39:27.531220 4746 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-w8wfn\" (UniqueName: \"kubernetes.io/projected/c0a60132-458a-46f2-a61f-444424b4c7cb-kube-api-access-w8wfn\") pod \"c0a60132-458a-46f2-a61f-444424b4c7cb\" (UID: \"c0a60132-458a-46f2-a61f-444424b4c7cb\") " Jan 28 21:39:27 crc kubenswrapper[4746]: I0128 21:39:27.548304 4746 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/c0a60132-458a-46f2-a61f-444424b4c7cb-kube-api-access-w8wfn" (OuterVolumeSpecName: "kube-api-access-w8wfn") pod "c0a60132-458a-46f2-a61f-444424b4c7cb" (UID: "c0a60132-458a-46f2-a61f-444424b4c7cb"). InnerVolumeSpecName "kube-api-access-w8wfn". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 28 21:39:27 crc kubenswrapper[4746]: I0128 21:39:27.633590 4746 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-w8wfn\" (UniqueName: \"kubernetes.io/projected/c0a60132-458a-46f2-a61f-444424b4c7cb-kube-api-access-w8wfn\") on node \"crc\" DevicePath \"\"" Jan 28 21:39:27 crc kubenswrapper[4746]: I0128 21:39:27.708566 4746 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/c0a60132-458a-46f2-a61f-444424b4c7cb-must-gather-output" (OuterVolumeSpecName: "must-gather-output") pod "c0a60132-458a-46f2-a61f-444424b4c7cb" (UID: "c0a60132-458a-46f2-a61f-444424b4c7cb"). InnerVolumeSpecName "must-gather-output". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 28 21:39:27 crc kubenswrapper[4746]: I0128 21:39:27.735900 4746 reconciler_common.go:293] "Volume detached for volume \"must-gather-output\" (UniqueName: \"kubernetes.io/empty-dir/c0a60132-458a-46f2-a61f-444424b4c7cb-must-gather-output\") on node \"crc\" DevicePath \"\"" Jan 28 21:39:28 crc kubenswrapper[4746]: I0128 21:39:28.104017 4746 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-must-gather-fjncs_must-gather-5v5j6_c0a60132-458a-46f2-a61f-444424b4c7cb/copy/0.log" Jan 28 21:39:28 crc kubenswrapper[4746]: I0128 21:39:28.104743 4746 scope.go:117] "RemoveContainer" containerID="f784a010d13db326f2d27e8d7112b46dbad58c5d897846252677e1fb89b8b5b6" Jan 28 21:39:28 crc kubenswrapper[4746]: I0128 21:39:28.104887 4746 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-must-gather-fjncs/must-gather-5v5j6" Jan 28 21:39:28 crc kubenswrapper[4746]: I0128 21:39:28.123518 4746 scope.go:117] "RemoveContainer" containerID="ca03b7997e8a5546e955fc8d2980d71dbf2f003ef00ac1b7464afe07799f1d18" Jan 28 21:39:28 crc kubenswrapper[4746]: I0128 21:39:28.852416 4746 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="c0a60132-458a-46f2-a61f-444424b4c7cb" path="/var/lib/kubelet/pods/c0a60132-458a-46f2-a61f-444424b4c7cb/volumes" Jan 28 21:39:45 crc kubenswrapper[4746]: I0128 21:39:45.871385 4746 patch_prober.go:28] interesting pod/machine-config-daemon-6wrnw container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Jan 28 21:39:45 crc kubenswrapper[4746]: I0128 21:39:45.871886 4746 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-6wrnw" podUID="6dc8b546-9734-4082-b2b3-2bafe3f1564d" containerName="machine-config-daemon" 
probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Jan 28 21:40:15 crc kubenswrapper[4746]: I0128 21:40:15.871441 4746 patch_prober.go:28] interesting pod/machine-config-daemon-6wrnw container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Jan 28 21:40:15 crc kubenswrapper[4746]: I0128 21:40:15.872126 4746 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-6wrnw" podUID="6dc8b546-9734-4082-b2b3-2bafe3f1564d" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Jan 28 21:40:15 crc kubenswrapper[4746]: I0128 21:40:15.872173 4746 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-6wrnw" Jan 28 21:40:15 crc kubenswrapper[4746]: I0128 21:40:15.872987 4746 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"51c1ec6fe023b6b43a0fdc61b858972d2af817d4cc2d4c7a6797132edb658ffa"} pod="openshift-machine-config-operator/machine-config-daemon-6wrnw" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted" Jan 28 21:40:15 crc kubenswrapper[4746]: I0128 21:40:15.873043 4746 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-6wrnw" podUID="6dc8b546-9734-4082-b2b3-2bafe3f1564d" containerName="machine-config-daemon" containerID="cri-o://51c1ec6fe023b6b43a0fdc61b858972d2af817d4cc2d4c7a6797132edb658ffa" gracePeriod=600 Jan 28 21:40:16 crc kubenswrapper[4746]: I0128 21:40:16.581706 4746 generic.go:334] "Generic 
(PLEG): container finished" podID="6dc8b546-9734-4082-b2b3-2bafe3f1564d" containerID="51c1ec6fe023b6b43a0fdc61b858972d2af817d4cc2d4c7a6797132edb658ffa" exitCode=0 Jan 28 21:40:16 crc kubenswrapper[4746]: I0128 21:40:16.581783 4746 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-6wrnw" event={"ID":"6dc8b546-9734-4082-b2b3-2bafe3f1564d","Type":"ContainerDied","Data":"51c1ec6fe023b6b43a0fdc61b858972d2af817d4cc2d4c7a6797132edb658ffa"} Jan 28 21:40:16 crc kubenswrapper[4746]: I0128 21:40:16.582025 4746 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-6wrnw" event={"ID":"6dc8b546-9734-4082-b2b3-2bafe3f1564d","Type":"ContainerStarted","Data":"2c9e3325547a2a9e78b84305cb2707ed2def619d2d50bb736bb2f414ca2be430"} Jan 28 21:40:16 crc kubenswrapper[4746]: I0128 21:40:16.582046 4746 scope.go:117] "RemoveContainer" containerID="3efd284713e29e28f4aa6dc8e613ca58f00d02f5b7a245a28f734915518bde66" Jan 28 21:42:38 crc kubenswrapper[4746]: I0128 21:42:38.393612 4746 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-must-gather-pjp7s/must-gather-htgzp"] Jan 28 21:42:38 crc kubenswrapper[4746]: E0128 21:42:38.396107 4746 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="c0a60132-458a-46f2-a61f-444424b4c7cb" containerName="gather" Jan 28 21:42:38 crc kubenswrapper[4746]: I0128 21:42:38.396125 4746 state_mem.go:107] "Deleted CPUSet assignment" podUID="c0a60132-458a-46f2-a61f-444424b4c7cb" containerName="gather" Jan 28 21:42:38 crc kubenswrapper[4746]: E0128 21:42:38.396142 4746 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="7abd7c3e-51a3-46ca-b471-44926f68f9b8" containerName="extract-content" Jan 28 21:42:38 crc kubenswrapper[4746]: I0128 21:42:38.396148 4746 state_mem.go:107] "Deleted CPUSet assignment" podUID="7abd7c3e-51a3-46ca-b471-44926f68f9b8" containerName="extract-content" Jan 28 21:42:38 crc kubenswrapper[4746]: 
E0128 21:42:38.396160 4746 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="7abd7c3e-51a3-46ca-b471-44926f68f9b8" containerName="registry-server" Jan 28 21:42:38 crc kubenswrapper[4746]: I0128 21:42:38.396166 4746 state_mem.go:107] "Deleted CPUSet assignment" podUID="7abd7c3e-51a3-46ca-b471-44926f68f9b8" containerName="registry-server" Jan 28 21:42:38 crc kubenswrapper[4746]: E0128 21:42:38.396182 4746 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="c0a60132-458a-46f2-a61f-444424b4c7cb" containerName="copy" Jan 28 21:42:38 crc kubenswrapper[4746]: I0128 21:42:38.396188 4746 state_mem.go:107] "Deleted CPUSet assignment" podUID="c0a60132-458a-46f2-a61f-444424b4c7cb" containerName="copy" Jan 28 21:42:38 crc kubenswrapper[4746]: E0128 21:42:38.396200 4746 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="7abd7c3e-51a3-46ca-b471-44926f68f9b8" containerName="extract-utilities" Jan 28 21:42:38 crc kubenswrapper[4746]: I0128 21:42:38.396207 4746 state_mem.go:107] "Deleted CPUSet assignment" podUID="7abd7c3e-51a3-46ca-b471-44926f68f9b8" containerName="extract-utilities" Jan 28 21:42:38 crc kubenswrapper[4746]: I0128 21:42:38.396438 4746 memory_manager.go:354] "RemoveStaleState removing state" podUID="c0a60132-458a-46f2-a61f-444424b4c7cb" containerName="copy" Jan 28 21:42:38 crc kubenswrapper[4746]: I0128 21:42:38.396451 4746 memory_manager.go:354] "RemoveStaleState removing state" podUID="c0a60132-458a-46f2-a61f-444424b4c7cb" containerName="gather" Jan 28 21:42:38 crc kubenswrapper[4746]: I0128 21:42:38.396460 4746 memory_manager.go:354] "RemoveStaleState removing state" podUID="7abd7c3e-51a3-46ca-b471-44926f68f9b8" containerName="registry-server" Jan 28 21:42:38 crc kubenswrapper[4746]: I0128 21:42:38.397655 4746 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-must-gather-pjp7s/must-gather-htgzp" Jan 28 21:42:38 crc kubenswrapper[4746]: I0128 21:42:38.400196 4746 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-must-gather-pjp7s"/"kube-root-ca.crt" Jan 28 21:42:38 crc kubenswrapper[4746]: I0128 21:42:38.400260 4746 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-must-gather-pjp7s"/"openshift-service-ca.crt" Jan 28 21:42:38 crc kubenswrapper[4746]: I0128 21:42:38.400373 4746 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-must-gather-pjp7s"/"default-dockercfg-pdm85" Jan 28 21:42:38 crc kubenswrapper[4746]: I0128 21:42:38.455750 4746 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-must-gather-pjp7s/must-gather-htgzp"] Jan 28 21:42:38 crc kubenswrapper[4746]: I0128 21:42:38.485164 4746 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"must-gather-output\" (UniqueName: \"kubernetes.io/empty-dir/e439b8d8-d2b4-4169-8b41-497ec17f2018-must-gather-output\") pod \"must-gather-htgzp\" (UID: \"e439b8d8-d2b4-4169-8b41-497ec17f2018\") " pod="openshift-must-gather-pjp7s/must-gather-htgzp" Jan 28 21:42:38 crc kubenswrapper[4746]: I0128 21:42:38.485237 4746 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-9nsqt\" (UniqueName: \"kubernetes.io/projected/e439b8d8-d2b4-4169-8b41-497ec17f2018-kube-api-access-9nsqt\") pod \"must-gather-htgzp\" (UID: \"e439b8d8-d2b4-4169-8b41-497ec17f2018\") " pod="openshift-must-gather-pjp7s/must-gather-htgzp" Jan 28 21:42:38 crc kubenswrapper[4746]: I0128 21:42:38.586874 4746 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"must-gather-output\" (UniqueName: \"kubernetes.io/empty-dir/e439b8d8-d2b4-4169-8b41-497ec17f2018-must-gather-output\") pod \"must-gather-htgzp\" (UID: \"e439b8d8-d2b4-4169-8b41-497ec17f2018\") " 
pod="openshift-must-gather-pjp7s/must-gather-htgzp" Jan 28 21:42:38 crc kubenswrapper[4746]: I0128 21:42:38.586950 4746 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-9nsqt\" (UniqueName: \"kubernetes.io/projected/e439b8d8-d2b4-4169-8b41-497ec17f2018-kube-api-access-9nsqt\") pod \"must-gather-htgzp\" (UID: \"e439b8d8-d2b4-4169-8b41-497ec17f2018\") " pod="openshift-must-gather-pjp7s/must-gather-htgzp" Jan 28 21:42:38 crc kubenswrapper[4746]: I0128 21:42:38.587508 4746 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"must-gather-output\" (UniqueName: \"kubernetes.io/empty-dir/e439b8d8-d2b4-4169-8b41-497ec17f2018-must-gather-output\") pod \"must-gather-htgzp\" (UID: \"e439b8d8-d2b4-4169-8b41-497ec17f2018\") " pod="openshift-must-gather-pjp7s/must-gather-htgzp" Jan 28 21:42:38 crc kubenswrapper[4746]: I0128 21:42:38.622488 4746 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-9nsqt\" (UniqueName: \"kubernetes.io/projected/e439b8d8-d2b4-4169-8b41-497ec17f2018-kube-api-access-9nsqt\") pod \"must-gather-htgzp\" (UID: \"e439b8d8-d2b4-4169-8b41-497ec17f2018\") " pod="openshift-must-gather-pjp7s/must-gather-htgzp" Jan 28 21:42:38 crc kubenswrapper[4746]: I0128 21:42:38.757143 4746 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-must-gather-pjp7s/must-gather-htgzp" Jan 28 21:42:39 crc kubenswrapper[4746]: I0128 21:42:39.465850 4746 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-must-gather-pjp7s/must-gather-htgzp"] Jan 28 21:42:40 crc kubenswrapper[4746]: I0128 21:42:40.320325 4746 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-pjp7s/must-gather-htgzp" event={"ID":"e439b8d8-d2b4-4169-8b41-497ec17f2018","Type":"ContainerStarted","Data":"0967d64c50b8fdb55fc3b93abc6ee7557195580632f2de7a93b54fa37c9b70c3"} Jan 28 21:42:40 crc kubenswrapper[4746]: I0128 21:42:40.320684 4746 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-pjp7s/must-gather-htgzp" event={"ID":"e439b8d8-d2b4-4169-8b41-497ec17f2018","Type":"ContainerStarted","Data":"c89060e6e5016b145262368556fe6a9e542ada9b3353928fadf424b0700285b3"} Jan 28 21:42:41 crc kubenswrapper[4746]: I0128 21:42:41.330877 4746 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-pjp7s/must-gather-htgzp" event={"ID":"e439b8d8-d2b4-4169-8b41-497ec17f2018","Type":"ContainerStarted","Data":"b07190ee5891a0c20f8eac80e364aa3a84e9446c7ac87eabf4a409ac5c5821f4"} Jan 28 21:42:41 crc kubenswrapper[4746]: I0128 21:42:41.355610 4746 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-must-gather-pjp7s/must-gather-htgzp" podStartSLOduration=3.35558315 podStartE2EDuration="3.35558315s" podCreationTimestamp="2026-01-28 21:42:38 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-28 21:42:41.34714994 +0000 UTC m=+3789.303336314" watchObservedRunningTime="2026-01-28 21:42:41.35558315 +0000 UTC m=+3789.311769534" Jan 28 21:42:42 crc kubenswrapper[4746]: I0128 21:42:42.108532 4746 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-marketplace-4nfpw"] Jan 28 21:42:42 crc kubenswrapper[4746]: 
I0128 21:42:42.111507 4746 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-4nfpw" Jan 28 21:42:42 crc kubenswrapper[4746]: I0128 21:42:42.117148 4746 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-4nfpw"] Jan 28 21:42:42 crc kubenswrapper[4746]: I0128 21:42:42.276962 4746 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/65c4644c-f5f2-4c20-8153-4a5df916881a-utilities\") pod \"redhat-marketplace-4nfpw\" (UID: \"65c4644c-f5f2-4c20-8153-4a5df916881a\") " pod="openshift-marketplace/redhat-marketplace-4nfpw" Jan 28 21:42:42 crc kubenswrapper[4746]: I0128 21:42:42.277282 4746 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-7tflv\" (UniqueName: \"kubernetes.io/projected/65c4644c-f5f2-4c20-8153-4a5df916881a-kube-api-access-7tflv\") pod \"redhat-marketplace-4nfpw\" (UID: \"65c4644c-f5f2-4c20-8153-4a5df916881a\") " pod="openshift-marketplace/redhat-marketplace-4nfpw" Jan 28 21:42:42 crc kubenswrapper[4746]: I0128 21:42:42.277513 4746 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/65c4644c-f5f2-4c20-8153-4a5df916881a-catalog-content\") pod \"redhat-marketplace-4nfpw\" (UID: \"65c4644c-f5f2-4c20-8153-4a5df916881a\") " pod="openshift-marketplace/redhat-marketplace-4nfpw" Jan 28 21:42:42 crc kubenswrapper[4746]: I0128 21:42:42.379681 4746 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/65c4644c-f5f2-4c20-8153-4a5df916881a-catalog-content\") pod \"redhat-marketplace-4nfpw\" (UID: \"65c4644c-f5f2-4c20-8153-4a5df916881a\") " pod="openshift-marketplace/redhat-marketplace-4nfpw" Jan 28 21:42:42 crc 
kubenswrapper[4746]: I0128 21:42:42.380105 4746 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/65c4644c-f5f2-4c20-8153-4a5df916881a-utilities\") pod \"redhat-marketplace-4nfpw\" (UID: \"65c4644c-f5f2-4c20-8153-4a5df916881a\") " pod="openshift-marketplace/redhat-marketplace-4nfpw"
Jan 28 21:42:42 crc kubenswrapper[4746]: I0128 21:42:42.380131 4746 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-7tflv\" (UniqueName: \"kubernetes.io/projected/65c4644c-f5f2-4c20-8153-4a5df916881a-kube-api-access-7tflv\") pod \"redhat-marketplace-4nfpw\" (UID: \"65c4644c-f5f2-4c20-8153-4a5df916881a\") " pod="openshift-marketplace/redhat-marketplace-4nfpw"
Jan 28 21:42:42 crc kubenswrapper[4746]: I0128 21:42:42.381043 4746 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/65c4644c-f5f2-4c20-8153-4a5df916881a-catalog-content\") pod \"redhat-marketplace-4nfpw\" (UID: \"65c4644c-f5f2-4c20-8153-4a5df916881a\") " pod="openshift-marketplace/redhat-marketplace-4nfpw"
Jan 28 21:42:42 crc kubenswrapper[4746]: I0128 21:42:42.381340 4746 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/65c4644c-f5f2-4c20-8153-4a5df916881a-utilities\") pod \"redhat-marketplace-4nfpw\" (UID: \"65c4644c-f5f2-4c20-8153-4a5df916881a\") " pod="openshift-marketplace/redhat-marketplace-4nfpw"
Jan 28 21:42:42 crc kubenswrapper[4746]: I0128 21:42:42.402027 4746 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-7tflv\" (UniqueName: \"kubernetes.io/projected/65c4644c-f5f2-4c20-8153-4a5df916881a-kube-api-access-7tflv\") pod \"redhat-marketplace-4nfpw\" (UID: \"65c4644c-f5f2-4c20-8153-4a5df916881a\") " pod="openshift-marketplace/redhat-marketplace-4nfpw"
Jan 28 21:42:42 crc kubenswrapper[4746]: I0128 21:42:42.448487 4746 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-4nfpw"
Jan 28 21:42:42 crc kubenswrapper[4746]: I0128 21:42:42.971058 4746 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-4nfpw"]
Jan 28 21:42:43 crc kubenswrapper[4746]: I0128 21:42:43.365980 4746 generic.go:334] "Generic (PLEG): container finished" podID="65c4644c-f5f2-4c20-8153-4a5df916881a" containerID="b482ab3b2156be3617fa7a2b8d670b9fc3f8c70d047cac0c8d37c37182c20e39" exitCode=0
Jan 28 21:42:43 crc kubenswrapper[4746]: I0128 21:42:43.366559 4746 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-4nfpw" event={"ID":"65c4644c-f5f2-4c20-8153-4a5df916881a","Type":"ContainerDied","Data":"b482ab3b2156be3617fa7a2b8d670b9fc3f8c70d047cac0c8d37c37182c20e39"}
Jan 28 21:42:43 crc kubenswrapper[4746]: I0128 21:42:43.366596 4746 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-4nfpw" event={"ID":"65c4644c-f5f2-4c20-8153-4a5df916881a","Type":"ContainerStarted","Data":"b954d5e69b493dddd52ad82402dac0ebd7b23006b5caeccfd75020569ba5b7e2"}
Jan 28 21:42:43 crc kubenswrapper[4746]: I0128 21:42:43.372004 4746 provider.go:102] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider
Jan 28 21:42:44 crc kubenswrapper[4746]: I0128 21:42:44.290292 4746 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-must-gather-pjp7s/crc-debug-zhtlk"]
Jan 28 21:42:44 crc kubenswrapper[4746]: I0128 21:42:44.291972 4746 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-must-gather-pjp7s/crc-debug-zhtlk"
Jan 28 21:42:44 crc kubenswrapper[4746]: I0128 21:42:44.409010 4746 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-4nfpw" event={"ID":"65c4644c-f5f2-4c20-8153-4a5df916881a","Type":"ContainerStarted","Data":"663d568246baa3720cb98dcc014a177ad4e91d972bce807870581a6534991d3c"}
Jan 28 21:42:44 crc kubenswrapper[4746]: I0128 21:42:44.425843 4746 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/d286f61d-5084-40ad-bad5-f62f9d66d7b1-host\") pod \"crc-debug-zhtlk\" (UID: \"d286f61d-5084-40ad-bad5-f62f9d66d7b1\") " pod="openshift-must-gather-pjp7s/crc-debug-zhtlk"
Jan 28 21:42:44 crc kubenswrapper[4746]: I0128 21:42:44.425921 4746 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-ks6tm\" (UniqueName: \"kubernetes.io/projected/d286f61d-5084-40ad-bad5-f62f9d66d7b1-kube-api-access-ks6tm\") pod \"crc-debug-zhtlk\" (UID: \"d286f61d-5084-40ad-bad5-f62f9d66d7b1\") " pod="openshift-must-gather-pjp7s/crc-debug-zhtlk"
Jan 28 21:42:44 crc kubenswrapper[4746]: I0128 21:42:44.527296 4746 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-ks6tm\" (UniqueName: \"kubernetes.io/projected/d286f61d-5084-40ad-bad5-f62f9d66d7b1-kube-api-access-ks6tm\") pod \"crc-debug-zhtlk\" (UID: \"d286f61d-5084-40ad-bad5-f62f9d66d7b1\") " pod="openshift-must-gather-pjp7s/crc-debug-zhtlk"
Jan 28 21:42:44 crc kubenswrapper[4746]: I0128 21:42:44.527570 4746 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/d286f61d-5084-40ad-bad5-f62f9d66d7b1-host\") pod \"crc-debug-zhtlk\" (UID: \"d286f61d-5084-40ad-bad5-f62f9d66d7b1\") " pod="openshift-must-gather-pjp7s/crc-debug-zhtlk"
Jan 28 21:42:44 crc kubenswrapper[4746]: I0128 21:42:44.527722 4746 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host\" (UniqueName: \"kubernetes.io/host-path/d286f61d-5084-40ad-bad5-f62f9d66d7b1-host\") pod \"crc-debug-zhtlk\" (UID: \"d286f61d-5084-40ad-bad5-f62f9d66d7b1\") " pod="openshift-must-gather-pjp7s/crc-debug-zhtlk"
Jan 28 21:42:44 crc kubenswrapper[4746]: I0128 21:42:44.555687 4746 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-ks6tm\" (UniqueName: \"kubernetes.io/projected/d286f61d-5084-40ad-bad5-f62f9d66d7b1-kube-api-access-ks6tm\") pod \"crc-debug-zhtlk\" (UID: \"d286f61d-5084-40ad-bad5-f62f9d66d7b1\") " pod="openshift-must-gather-pjp7s/crc-debug-zhtlk"
Jan 28 21:42:44 crc kubenswrapper[4746]: I0128 21:42:44.614621 4746 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-must-gather-pjp7s/crc-debug-zhtlk"
Jan 28 21:42:45 crc kubenswrapper[4746]: I0128 21:42:45.418689 4746 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-pjp7s/crc-debug-zhtlk" event={"ID":"d286f61d-5084-40ad-bad5-f62f9d66d7b1","Type":"ContainerStarted","Data":"e94f89637d4cf11d99b951b4498a6b179d665cf3e7df3b49b37df37d82854e94"}
Jan 28 21:42:45 crc kubenswrapper[4746]: I0128 21:42:45.419146 4746 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-pjp7s/crc-debug-zhtlk" event={"ID":"d286f61d-5084-40ad-bad5-f62f9d66d7b1","Type":"ContainerStarted","Data":"ecab8574cb239da70bb3672c21b356be0cb59929e6619623b6d16d77af8f642f"}
Jan 28 21:42:45 crc kubenswrapper[4746]: I0128 21:42:45.444218 4746 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-must-gather-pjp7s/crc-debug-zhtlk" podStartSLOduration=1.444196059 podStartE2EDuration="1.444196059s" podCreationTimestamp="2026-01-28 21:42:44 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-28 21:42:45.432472351 +0000 UTC m=+3793.388658705" watchObservedRunningTime="2026-01-28 21:42:45.444196059 +0000 UTC m=+3793.400382413"
Jan 28 21:42:45 crc kubenswrapper[4746]: I0128 21:42:45.871774 4746 patch_prober.go:28] interesting pod/machine-config-daemon-6wrnw container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body=
Jan 28 21:42:45 crc kubenswrapper[4746]: I0128 21:42:45.872167 4746 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-6wrnw" podUID="6dc8b546-9734-4082-b2b3-2bafe3f1564d" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused"
Jan 28 21:42:46 crc kubenswrapper[4746]: I0128 21:42:46.431103 4746 generic.go:334] "Generic (PLEG): container finished" podID="65c4644c-f5f2-4c20-8153-4a5df916881a" containerID="663d568246baa3720cb98dcc014a177ad4e91d972bce807870581a6534991d3c" exitCode=0
Jan 28 21:42:46 crc kubenswrapper[4746]: I0128 21:42:46.431148 4746 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-4nfpw" event={"ID":"65c4644c-f5f2-4c20-8153-4a5df916881a","Type":"ContainerDied","Data":"663d568246baa3720cb98dcc014a177ad4e91d972bce807870581a6534991d3c"}
Jan 28 21:42:47 crc kubenswrapper[4746]: I0128 21:42:47.442404 4746 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-4nfpw" event={"ID":"65c4644c-f5f2-4c20-8153-4a5df916881a","Type":"ContainerStarted","Data":"cafacea8efc896b86c6de26c83abf99585ed0583b3209d0e639173922e5e6f50"}
Jan 28 21:42:47 crc kubenswrapper[4746]: I0128 21:42:47.462163 4746 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-marketplace-4nfpw" podStartSLOduration=1.905233258 podStartE2EDuration="5.462144532s" podCreationTimestamp="2026-01-28 21:42:42 +0000 UTC" firstStartedPulling="2026-01-28 21:42:43.369750999 +0000 UTC m=+3791.325937353" lastFinishedPulling="2026-01-28 21:42:46.926662273 +0000 UTC m=+3794.882848627" observedRunningTime="2026-01-28 21:42:47.460367704 +0000 UTC m=+3795.416554058" watchObservedRunningTime="2026-01-28 21:42:47.462144532 +0000 UTC m=+3795.418330896"
Jan 28 21:42:52 crc kubenswrapper[4746]: I0128 21:42:52.450401 4746 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-marketplace-4nfpw"
Jan 28 21:42:52 crc kubenswrapper[4746]: I0128 21:42:52.450986 4746 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-marketplace-4nfpw"
Jan 28 21:42:52 crc kubenswrapper[4746]: I0128 21:42:52.514906 4746 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-marketplace-4nfpw"
Jan 28 21:42:52 crc kubenswrapper[4746]: I0128 21:42:52.569866 4746 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-marketplace-4nfpw"
Jan 28 21:42:52 crc kubenswrapper[4746]: I0128 21:42:52.757013 4746 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-4nfpw"]
Jan 28 21:42:54 crc kubenswrapper[4746]: I0128 21:42:54.501004 4746 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-marketplace-4nfpw" podUID="65c4644c-f5f2-4c20-8153-4a5df916881a" containerName="registry-server" containerID="cri-o://cafacea8efc896b86c6de26c83abf99585ed0583b3209d0e639173922e5e6f50" gracePeriod=2
Jan 28 21:42:55 crc kubenswrapper[4746]: I0128 21:42:55.304122 4746 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-4nfpw"
Jan 28 21:42:55 crc kubenswrapper[4746]: I0128 21:42:55.360989 4746 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-7tflv\" (UniqueName: \"kubernetes.io/projected/65c4644c-f5f2-4c20-8153-4a5df916881a-kube-api-access-7tflv\") pod \"65c4644c-f5f2-4c20-8153-4a5df916881a\" (UID: \"65c4644c-f5f2-4c20-8153-4a5df916881a\") "
Jan 28 21:42:55 crc kubenswrapper[4746]: I0128 21:42:55.361100 4746 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/65c4644c-f5f2-4c20-8153-4a5df916881a-utilities\") pod \"65c4644c-f5f2-4c20-8153-4a5df916881a\" (UID: \"65c4644c-f5f2-4c20-8153-4a5df916881a\") "
Jan 28 21:42:55 crc kubenswrapper[4746]: I0128 21:42:55.361188 4746 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/65c4644c-f5f2-4c20-8153-4a5df916881a-catalog-content\") pod \"65c4644c-f5f2-4c20-8153-4a5df916881a\" (UID: \"65c4644c-f5f2-4c20-8153-4a5df916881a\") "
Jan 28 21:42:55 crc kubenswrapper[4746]: I0128 21:42:55.362066 4746 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/65c4644c-f5f2-4c20-8153-4a5df916881a-utilities" (OuterVolumeSpecName: "utilities") pod "65c4644c-f5f2-4c20-8153-4a5df916881a" (UID: "65c4644c-f5f2-4c20-8153-4a5df916881a"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Jan 28 21:42:55 crc kubenswrapper[4746]: I0128 21:42:55.366763 4746 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/65c4644c-f5f2-4c20-8153-4a5df916881a-kube-api-access-7tflv" (OuterVolumeSpecName: "kube-api-access-7tflv") pod "65c4644c-f5f2-4c20-8153-4a5df916881a" (UID: "65c4644c-f5f2-4c20-8153-4a5df916881a"). InnerVolumeSpecName "kube-api-access-7tflv". PluginName "kubernetes.io/projected", VolumeGidValue ""
Jan 28 21:42:55 crc kubenswrapper[4746]: I0128 21:42:55.391624 4746 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/65c4644c-f5f2-4c20-8153-4a5df916881a-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "65c4644c-f5f2-4c20-8153-4a5df916881a" (UID: "65c4644c-f5f2-4c20-8153-4a5df916881a"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Jan 28 21:42:55 crc kubenswrapper[4746]: I0128 21:42:55.463070 4746 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-7tflv\" (UniqueName: \"kubernetes.io/projected/65c4644c-f5f2-4c20-8153-4a5df916881a-kube-api-access-7tflv\") on node \"crc\" DevicePath \"\""
Jan 28 21:42:55 crc kubenswrapper[4746]: I0128 21:42:55.463116 4746 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/65c4644c-f5f2-4c20-8153-4a5df916881a-utilities\") on node \"crc\" DevicePath \"\""
Jan 28 21:42:55 crc kubenswrapper[4746]: I0128 21:42:55.463126 4746 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/65c4644c-f5f2-4c20-8153-4a5df916881a-catalog-content\") on node \"crc\" DevicePath \"\""
Jan 28 21:42:55 crc kubenswrapper[4746]: I0128 21:42:55.509381 4746 generic.go:334] "Generic (PLEG): container finished" podID="65c4644c-f5f2-4c20-8153-4a5df916881a" containerID="cafacea8efc896b86c6de26c83abf99585ed0583b3209d0e639173922e5e6f50" exitCode=0
Jan 28 21:42:55 crc kubenswrapper[4746]: I0128 21:42:55.509423 4746 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-4nfpw" event={"ID":"65c4644c-f5f2-4c20-8153-4a5df916881a","Type":"ContainerDied","Data":"cafacea8efc896b86c6de26c83abf99585ed0583b3209d0e639173922e5e6f50"}
Jan 28 21:42:55 crc kubenswrapper[4746]: I0128 21:42:55.509451 4746 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-4nfpw" event={"ID":"65c4644c-f5f2-4c20-8153-4a5df916881a","Type":"ContainerDied","Data":"b954d5e69b493dddd52ad82402dac0ebd7b23006b5caeccfd75020569ba5b7e2"}
Jan 28 21:42:55 crc kubenswrapper[4746]: I0128 21:42:55.509468 4746 scope.go:117] "RemoveContainer" containerID="cafacea8efc896b86c6de26c83abf99585ed0583b3209d0e639173922e5e6f50"
Jan 28 21:42:55 crc kubenswrapper[4746]: I0128 21:42:55.509589 4746 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-4nfpw"
Jan 28 21:42:55 crc kubenswrapper[4746]: I0128 21:42:55.552147 4746 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-4nfpw"]
Jan 28 21:42:55 crc kubenswrapper[4746]: I0128 21:42:55.569179 4746 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-marketplace-4nfpw"]
Jan 28 21:42:55 crc kubenswrapper[4746]: I0128 21:42:55.578213 4746 scope.go:117] "RemoveContainer" containerID="663d568246baa3720cb98dcc014a177ad4e91d972bce807870581a6534991d3c"
Jan 28 21:42:55 crc kubenswrapper[4746]: I0128 21:42:55.610251 4746 scope.go:117] "RemoveContainer" containerID="b482ab3b2156be3617fa7a2b8d670b9fc3f8c70d047cac0c8d37c37182c20e39"
Jan 28 21:42:55 crc kubenswrapper[4746]: I0128 21:42:55.668695 4746 scope.go:117] "RemoveContainer" containerID="cafacea8efc896b86c6de26c83abf99585ed0583b3209d0e639173922e5e6f50"
Jan 28 21:42:55 crc kubenswrapper[4746]: E0128 21:42:55.669184 4746 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"cafacea8efc896b86c6de26c83abf99585ed0583b3209d0e639173922e5e6f50\": container with ID starting with cafacea8efc896b86c6de26c83abf99585ed0583b3209d0e639173922e5e6f50 not found: ID does not exist" containerID="cafacea8efc896b86c6de26c83abf99585ed0583b3209d0e639173922e5e6f50"
Jan 28 21:42:55 crc kubenswrapper[4746]: I0128 21:42:55.669252 4746 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"cafacea8efc896b86c6de26c83abf99585ed0583b3209d0e639173922e5e6f50"} err="failed to get container status \"cafacea8efc896b86c6de26c83abf99585ed0583b3209d0e639173922e5e6f50\": rpc error: code = NotFound desc = could not find container \"cafacea8efc896b86c6de26c83abf99585ed0583b3209d0e639173922e5e6f50\": container with ID starting with cafacea8efc896b86c6de26c83abf99585ed0583b3209d0e639173922e5e6f50 not found: ID does not exist"
Jan 28 21:42:55 crc kubenswrapper[4746]: I0128 21:42:55.669285 4746 scope.go:117] "RemoveContainer" containerID="663d568246baa3720cb98dcc014a177ad4e91d972bce807870581a6534991d3c"
Jan 28 21:42:55 crc kubenswrapper[4746]: E0128 21:42:55.669664 4746 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"663d568246baa3720cb98dcc014a177ad4e91d972bce807870581a6534991d3c\": container with ID starting with 663d568246baa3720cb98dcc014a177ad4e91d972bce807870581a6534991d3c not found: ID does not exist" containerID="663d568246baa3720cb98dcc014a177ad4e91d972bce807870581a6534991d3c"
Jan 28 21:42:55 crc kubenswrapper[4746]: I0128 21:42:55.669693 4746 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"663d568246baa3720cb98dcc014a177ad4e91d972bce807870581a6534991d3c"} err="failed to get container status \"663d568246baa3720cb98dcc014a177ad4e91d972bce807870581a6534991d3c\": rpc error: code = NotFound desc = could not find container \"663d568246baa3720cb98dcc014a177ad4e91d972bce807870581a6534991d3c\": container with ID starting with 663d568246baa3720cb98dcc014a177ad4e91d972bce807870581a6534991d3c not found: ID does not exist"
Jan 28 21:42:55 crc kubenswrapper[4746]: I0128 21:42:55.669713 4746 scope.go:117] "RemoveContainer" containerID="b482ab3b2156be3617fa7a2b8d670b9fc3f8c70d047cac0c8d37c37182c20e39"
Jan 28 21:42:55 crc kubenswrapper[4746]: E0128 21:42:55.669921 4746 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"b482ab3b2156be3617fa7a2b8d670b9fc3f8c70d047cac0c8d37c37182c20e39\": container with ID starting with b482ab3b2156be3617fa7a2b8d670b9fc3f8c70d047cac0c8d37c37182c20e39 not found: ID does not exist" containerID="b482ab3b2156be3617fa7a2b8d670b9fc3f8c70d047cac0c8d37c37182c20e39"
Jan 28 21:42:55 crc kubenswrapper[4746]: I0128 21:42:55.669947 4746 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"b482ab3b2156be3617fa7a2b8d670b9fc3f8c70d047cac0c8d37c37182c20e39"} err="failed to get container status \"b482ab3b2156be3617fa7a2b8d670b9fc3f8c70d047cac0c8d37c37182c20e39\": rpc error: code = NotFound desc = could not find container \"b482ab3b2156be3617fa7a2b8d670b9fc3f8c70d047cac0c8d37c37182c20e39\": container with ID starting with b482ab3b2156be3617fa7a2b8d670b9fc3f8c70d047cac0c8d37c37182c20e39 not found: ID does not exist"
Jan 28 21:42:56 crc kubenswrapper[4746]: I0128 21:42:56.847290 4746 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="65c4644c-f5f2-4c20-8153-4a5df916881a" path="/var/lib/kubelet/pods/65c4644c-f5f2-4c20-8153-4a5df916881a/volumes"
Jan 28 21:43:15 crc kubenswrapper[4746]: I0128 21:43:15.871611 4746 patch_prober.go:28] interesting pod/machine-config-daemon-6wrnw container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body=
Jan 28 21:43:15 crc kubenswrapper[4746]: I0128 21:43:15.872174 4746 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-6wrnw" podUID="6dc8b546-9734-4082-b2b3-2bafe3f1564d" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused"
Jan 28 21:43:37 crc kubenswrapper[4746]: I0128 21:43:37.897362 4746 generic.go:334] "Generic (PLEG): container finished" podID="d286f61d-5084-40ad-bad5-f62f9d66d7b1" containerID="e94f89637d4cf11d99b951b4498a6b179d665cf3e7df3b49b37df37d82854e94" exitCode=0
Jan 28 21:43:37 crc kubenswrapper[4746]: I0128 21:43:37.897427 4746 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-pjp7s/crc-debug-zhtlk" event={"ID":"d286f61d-5084-40ad-bad5-f62f9d66d7b1","Type":"ContainerDied","Data":"e94f89637d4cf11d99b951b4498a6b179d665cf3e7df3b49b37df37d82854e94"}
Jan 28 21:43:39 crc kubenswrapper[4746]: I0128 21:43:39.146924 4746 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-must-gather-pjp7s/crc-debug-zhtlk"
Jan 28 21:43:39 crc kubenswrapper[4746]: I0128 21:43:39.190844 4746 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-must-gather-pjp7s/crc-debug-zhtlk"]
Jan 28 21:43:39 crc kubenswrapper[4746]: I0128 21:43:39.201070 4746 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-must-gather-pjp7s/crc-debug-zhtlk"]
Jan 28 21:43:39 crc kubenswrapper[4746]: I0128 21:43:39.272483 4746 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/d286f61d-5084-40ad-bad5-f62f9d66d7b1-host\") pod \"d286f61d-5084-40ad-bad5-f62f9d66d7b1\" (UID: \"d286f61d-5084-40ad-bad5-f62f9d66d7b1\") "
Jan 28 21:43:39 crc kubenswrapper[4746]: I0128 21:43:39.272530 4746 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-ks6tm\" (UniqueName: \"kubernetes.io/projected/d286f61d-5084-40ad-bad5-f62f9d66d7b1-kube-api-access-ks6tm\") pod \"d286f61d-5084-40ad-bad5-f62f9d66d7b1\" (UID: \"d286f61d-5084-40ad-bad5-f62f9d66d7b1\") "
Jan 28 21:43:39 crc kubenswrapper[4746]: I0128 21:43:39.272942 4746 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/d286f61d-5084-40ad-bad5-f62f9d66d7b1-host" (OuterVolumeSpecName: "host") pod "d286f61d-5084-40ad-bad5-f62f9d66d7b1" (UID: "d286f61d-5084-40ad-bad5-f62f9d66d7b1"). InnerVolumeSpecName "host". PluginName "kubernetes.io/host-path", VolumeGidValue ""
Jan 28 21:43:39 crc kubenswrapper[4746]: I0128 21:43:39.273590 4746 reconciler_common.go:293] "Volume detached for volume \"host\" (UniqueName: \"kubernetes.io/host-path/d286f61d-5084-40ad-bad5-f62f9d66d7b1-host\") on node \"crc\" DevicePath \"\""
Jan 28 21:43:39 crc kubenswrapper[4746]: I0128 21:43:39.278323 4746 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/d286f61d-5084-40ad-bad5-f62f9d66d7b1-kube-api-access-ks6tm" (OuterVolumeSpecName: "kube-api-access-ks6tm") pod "d286f61d-5084-40ad-bad5-f62f9d66d7b1" (UID: "d286f61d-5084-40ad-bad5-f62f9d66d7b1"). InnerVolumeSpecName "kube-api-access-ks6tm". PluginName "kubernetes.io/projected", VolumeGidValue ""
Jan 28 21:43:39 crc kubenswrapper[4746]: I0128 21:43:39.375524 4746 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-ks6tm\" (UniqueName: \"kubernetes.io/projected/d286f61d-5084-40ad-bad5-f62f9d66d7b1-kube-api-access-ks6tm\") on node \"crc\" DevicePath \"\""
Jan 28 21:43:40 crc kubenswrapper[4746]: I0128 21:43:40.020345 4746 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="ecab8574cb239da70bb3672c21b356be0cb59929e6619623b6d16d77af8f642f"
Jan 28 21:43:40 crc kubenswrapper[4746]: I0128 21:43:40.020448 4746 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-must-gather-pjp7s/crc-debug-zhtlk"
Jan 28 21:43:40 crc kubenswrapper[4746]: I0128 21:43:40.601522 4746 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-must-gather-pjp7s/crc-debug-r7sfl"]
Jan 28 21:43:40 crc kubenswrapper[4746]: E0128 21:43:40.601924 4746 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="65c4644c-f5f2-4c20-8153-4a5df916881a" containerName="extract-content"
Jan 28 21:43:40 crc kubenswrapper[4746]: I0128 21:43:40.601938 4746 state_mem.go:107] "Deleted CPUSet assignment" podUID="65c4644c-f5f2-4c20-8153-4a5df916881a" containerName="extract-content"
Jan 28 21:43:40 crc kubenswrapper[4746]: E0128 21:43:40.601952 4746 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="65c4644c-f5f2-4c20-8153-4a5df916881a" containerName="extract-utilities"
Jan 28 21:43:40 crc kubenswrapper[4746]: I0128 21:43:40.601960 4746 state_mem.go:107] "Deleted CPUSet assignment" podUID="65c4644c-f5f2-4c20-8153-4a5df916881a" containerName="extract-utilities"
Jan 28 21:43:40 crc kubenswrapper[4746]: E0128 21:43:40.601978 4746 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="65c4644c-f5f2-4c20-8153-4a5df916881a" containerName="registry-server"
Jan 28 21:43:40 crc kubenswrapper[4746]: I0128 21:43:40.601986 4746 state_mem.go:107] "Deleted CPUSet assignment" podUID="65c4644c-f5f2-4c20-8153-4a5df916881a" containerName="registry-server"
Jan 28 21:43:40 crc kubenswrapper[4746]: E0128 21:43:40.602003 4746 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="d286f61d-5084-40ad-bad5-f62f9d66d7b1" containerName="container-00"
Jan 28 21:43:40 crc kubenswrapper[4746]: I0128 21:43:40.602010 4746 state_mem.go:107] "Deleted CPUSet assignment" podUID="d286f61d-5084-40ad-bad5-f62f9d66d7b1" containerName="container-00"
Jan 28 21:43:40 crc kubenswrapper[4746]: I0128 21:43:40.602267 4746 memory_manager.go:354] "RemoveStaleState removing state" podUID="65c4644c-f5f2-4c20-8153-4a5df916881a" containerName="registry-server"
Jan 28 21:43:40 crc kubenswrapper[4746]: I0128 21:43:40.602285 4746 memory_manager.go:354] "RemoveStaleState removing state" podUID="d286f61d-5084-40ad-bad5-f62f9d66d7b1" containerName="container-00"
Jan 28 21:43:40 crc kubenswrapper[4746]: I0128 21:43:40.602989 4746 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-must-gather-pjp7s/crc-debug-r7sfl"
Jan 28 21:43:40 crc kubenswrapper[4746]: I0128 21:43:40.702775 4746 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-4jlbd\" (UniqueName: \"kubernetes.io/projected/f693142a-e152-4541-bc50-2ae235e28844-kube-api-access-4jlbd\") pod \"crc-debug-r7sfl\" (UID: \"f693142a-e152-4541-bc50-2ae235e28844\") " pod="openshift-must-gather-pjp7s/crc-debug-r7sfl"
Jan 28 21:43:40 crc kubenswrapper[4746]: I0128 21:43:40.703178 4746 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/f693142a-e152-4541-bc50-2ae235e28844-host\") pod \"crc-debug-r7sfl\" (UID: \"f693142a-e152-4541-bc50-2ae235e28844\") " pod="openshift-must-gather-pjp7s/crc-debug-r7sfl"
Jan 28 21:43:40 crc kubenswrapper[4746]: I0128 21:43:40.804945 4746 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/f693142a-e152-4541-bc50-2ae235e28844-host\") pod \"crc-debug-r7sfl\" (UID: \"f693142a-e152-4541-bc50-2ae235e28844\") " pod="openshift-must-gather-pjp7s/crc-debug-r7sfl"
Jan 28 21:43:40 crc kubenswrapper[4746]: I0128 21:43:40.805160 4746 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host\" (UniqueName: \"kubernetes.io/host-path/f693142a-e152-4541-bc50-2ae235e28844-host\") pod \"crc-debug-r7sfl\" (UID: \"f693142a-e152-4541-bc50-2ae235e28844\") " pod="openshift-must-gather-pjp7s/crc-debug-r7sfl"
Jan 28 21:43:40 crc kubenswrapper[4746]: I0128 21:43:40.805438 4746 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-4jlbd\" (UniqueName: \"kubernetes.io/projected/f693142a-e152-4541-bc50-2ae235e28844-kube-api-access-4jlbd\") pod \"crc-debug-r7sfl\" (UID: \"f693142a-e152-4541-bc50-2ae235e28844\") " pod="openshift-must-gather-pjp7s/crc-debug-r7sfl"
Jan 28 21:43:40 crc kubenswrapper[4746]: I0128 21:43:40.828286 4746 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-4jlbd\" (UniqueName: \"kubernetes.io/projected/f693142a-e152-4541-bc50-2ae235e28844-kube-api-access-4jlbd\") pod \"crc-debug-r7sfl\" (UID: \"f693142a-e152-4541-bc50-2ae235e28844\") " pod="openshift-must-gather-pjp7s/crc-debug-r7sfl"
Jan 28 21:43:40 crc kubenswrapper[4746]: I0128 21:43:40.845333 4746 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="d286f61d-5084-40ad-bad5-f62f9d66d7b1" path="/var/lib/kubelet/pods/d286f61d-5084-40ad-bad5-f62f9d66d7b1/volumes"
Jan 28 21:43:40 crc kubenswrapper[4746]: I0128 21:43:40.923384 4746 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-must-gather-pjp7s/crc-debug-r7sfl"
Jan 28 21:43:41 crc kubenswrapper[4746]: I0128 21:43:41.034453 4746 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-pjp7s/crc-debug-r7sfl" event={"ID":"f693142a-e152-4541-bc50-2ae235e28844","Type":"ContainerStarted","Data":"dfc43147453c40bcd6c35c9c0c9004b5ed167b925f48d5fed10d37b1eff7adef"}
Jan 28 21:43:42 crc kubenswrapper[4746]: I0128 21:43:42.044047 4746 generic.go:334] "Generic (PLEG): container finished" podID="f693142a-e152-4541-bc50-2ae235e28844" containerID="fa48f78d8a459990ab27d5e0d3a7362141af9dd1ef6dd9ac093376c1e06c240f" exitCode=0
Jan 28 21:43:42 crc kubenswrapper[4746]: I0128 21:43:42.044128 4746 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-pjp7s/crc-debug-r7sfl" event={"ID":"f693142a-e152-4541-bc50-2ae235e28844","Type":"ContainerDied","Data":"fa48f78d8a459990ab27d5e0d3a7362141af9dd1ef6dd9ac093376c1e06c240f"}
Jan 28 21:43:43 crc kubenswrapper[4746]: I0128 21:43:43.202481 4746 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-must-gather-pjp7s/crc-debug-r7sfl"
Jan 28 21:43:43 crc kubenswrapper[4746]: I0128 21:43:43.353370 4746 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-4jlbd\" (UniqueName: \"kubernetes.io/projected/f693142a-e152-4541-bc50-2ae235e28844-kube-api-access-4jlbd\") pod \"f693142a-e152-4541-bc50-2ae235e28844\" (UID: \"f693142a-e152-4541-bc50-2ae235e28844\") "
Jan 28 21:43:43 crc kubenswrapper[4746]: I0128 21:43:43.353819 4746 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/f693142a-e152-4541-bc50-2ae235e28844-host\") pod \"f693142a-e152-4541-bc50-2ae235e28844\" (UID: \"f693142a-e152-4541-bc50-2ae235e28844\") "
Jan 28 21:43:43 crc kubenswrapper[4746]: I0128 21:43:43.354697 4746 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/f693142a-e152-4541-bc50-2ae235e28844-host" (OuterVolumeSpecName: "host") pod "f693142a-e152-4541-bc50-2ae235e28844" (UID: "f693142a-e152-4541-bc50-2ae235e28844"). InnerVolumeSpecName "host". PluginName "kubernetes.io/host-path", VolumeGidValue ""
Jan 28 21:43:43 crc kubenswrapper[4746]: I0128 21:43:43.378606 4746 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/f693142a-e152-4541-bc50-2ae235e28844-kube-api-access-4jlbd" (OuterVolumeSpecName: "kube-api-access-4jlbd") pod "f693142a-e152-4541-bc50-2ae235e28844" (UID: "f693142a-e152-4541-bc50-2ae235e28844"). InnerVolumeSpecName "kube-api-access-4jlbd". PluginName "kubernetes.io/projected", VolumeGidValue ""
Jan 28 21:43:43 crc kubenswrapper[4746]: I0128 21:43:43.455964 4746 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-4jlbd\" (UniqueName: \"kubernetes.io/projected/f693142a-e152-4541-bc50-2ae235e28844-kube-api-access-4jlbd\") on node \"crc\" DevicePath \"\""
Jan 28 21:43:43 crc kubenswrapper[4746]: I0128 21:43:43.455997 4746 reconciler_common.go:293] "Volume detached for volume \"host\" (UniqueName: \"kubernetes.io/host-path/f693142a-e152-4541-bc50-2ae235e28844-host\") on node \"crc\" DevicePath \"\""
Jan 28 21:43:43 crc kubenswrapper[4746]: I0128 21:43:43.581938 4746 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-must-gather-pjp7s/crc-debug-r7sfl"]
Jan 28 21:43:43 crc kubenswrapper[4746]: I0128 21:43:43.592095 4746 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-must-gather-pjp7s/crc-debug-r7sfl"]
Jan 28 21:43:44 crc kubenswrapper[4746]: I0128 21:43:44.070661 4746 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="dfc43147453c40bcd6c35c9c0c9004b5ed167b925f48d5fed10d37b1eff7adef"
Jan 28 21:43:44 crc kubenswrapper[4746]: I0128 21:43:44.070711 4746 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-must-gather-pjp7s/crc-debug-r7sfl"
Jan 28 21:43:44 crc kubenswrapper[4746]: I0128 21:43:44.846444 4746 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="f693142a-e152-4541-bc50-2ae235e28844" path="/var/lib/kubelet/pods/f693142a-e152-4541-bc50-2ae235e28844/volumes"
Jan 28 21:43:45 crc kubenswrapper[4746]: I0128 21:43:45.099074 4746 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-must-gather-pjp7s/crc-debug-7vmlh"]
Jan 28 21:43:45 crc kubenswrapper[4746]: E0128 21:43:45.099517 4746 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f693142a-e152-4541-bc50-2ae235e28844" containerName="container-00"
Jan 28 21:43:45 crc kubenswrapper[4746]: I0128 21:43:45.099534 4746 state_mem.go:107] "Deleted CPUSet assignment" podUID="f693142a-e152-4541-bc50-2ae235e28844" containerName="container-00"
Jan 28 21:43:45 crc kubenswrapper[4746]: I0128 21:43:45.099742 4746 memory_manager.go:354] "RemoveStaleState removing state" podUID="f693142a-e152-4541-bc50-2ae235e28844" containerName="container-00"
Jan 28 21:43:45 crc kubenswrapper[4746]: I0128 21:43:45.100445 4746 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-must-gather-pjp7s/crc-debug-7vmlh"
Jan 28 21:43:45 crc kubenswrapper[4746]: I0128 21:43:45.189231 4746 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-xhrt9\" (UniqueName: \"kubernetes.io/projected/3130cdfc-d68d-47d7-8ee9-aefb6cdff777-kube-api-access-xhrt9\") pod \"crc-debug-7vmlh\" (UID: \"3130cdfc-d68d-47d7-8ee9-aefb6cdff777\") " pod="openshift-must-gather-pjp7s/crc-debug-7vmlh"
Jan 28 21:43:45 crc kubenswrapper[4746]: I0128 21:43:45.189775 4746 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/3130cdfc-d68d-47d7-8ee9-aefb6cdff777-host\") pod \"crc-debug-7vmlh\" (UID: \"3130cdfc-d68d-47d7-8ee9-aefb6cdff777\") " pod="openshift-must-gather-pjp7s/crc-debug-7vmlh"
Jan 28 21:43:45 crc kubenswrapper[4746]: I0128 21:43:45.292067 4746 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/3130cdfc-d68d-47d7-8ee9-aefb6cdff777-host\") pod \"crc-debug-7vmlh\" (UID: \"3130cdfc-d68d-47d7-8ee9-aefb6cdff777\") " pod="openshift-must-gather-pjp7s/crc-debug-7vmlh"
Jan 28 21:43:45 crc kubenswrapper[4746]: I0128 21:43:45.292427 4746 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host\" (UniqueName: \"kubernetes.io/host-path/3130cdfc-d68d-47d7-8ee9-aefb6cdff777-host\") pod \"crc-debug-7vmlh\" (UID: \"3130cdfc-d68d-47d7-8ee9-aefb6cdff777\") " pod="openshift-must-gather-pjp7s/crc-debug-7vmlh"
Jan 28 21:43:45 crc kubenswrapper[4746]: I0128 21:43:45.292453 4746 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-xhrt9\" (UniqueName: \"kubernetes.io/projected/3130cdfc-d68d-47d7-8ee9-aefb6cdff777-kube-api-access-xhrt9\") pod \"crc-debug-7vmlh\" (UID: \"3130cdfc-d68d-47d7-8ee9-aefb6cdff777\") " pod="openshift-must-gather-pjp7s/crc-debug-7vmlh"
Jan 28 21:43:45 crc kubenswrapper[4746]: I0128 21:43:45.317863 4746 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-xhrt9\" (UniqueName: \"kubernetes.io/projected/3130cdfc-d68d-47d7-8ee9-aefb6cdff777-kube-api-access-xhrt9\") pod \"crc-debug-7vmlh\" (UID: \"3130cdfc-d68d-47d7-8ee9-aefb6cdff777\") " pod="openshift-must-gather-pjp7s/crc-debug-7vmlh"
Jan 28 21:43:45 crc kubenswrapper[4746]: I0128 21:43:45.419403 4746 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-must-gather-pjp7s/crc-debug-7vmlh"
Jan 28 21:43:45 crc kubenswrapper[4746]: W0128 21:43:45.447438 4746 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod3130cdfc_d68d_47d7_8ee9_aefb6cdff777.slice/crio-9fd62de208084f7dddc49e30776cf5f8232a42f806bf15da48f35ec06fdce79f WatchSource:0}: Error finding container 9fd62de208084f7dddc49e30776cf5f8232a42f806bf15da48f35ec06fdce79f: Status 404 returned error can't find the container with id 9fd62de208084f7dddc49e30776cf5f8232a42f806bf15da48f35ec06fdce79f
Jan 28 21:43:45 crc kubenswrapper[4746]: I0128 21:43:45.871420 4746 patch_prober.go:28] interesting pod/machine-config-daemon-6wrnw container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body=
Jan 28 21:43:45 crc kubenswrapper[4746]: I0128 21:43:45.871754 4746 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-6wrnw" podUID="6dc8b546-9734-4082-b2b3-2bafe3f1564d" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused"
Jan 28 21:43:45 crc kubenswrapper[4746]: I0128 21:43:45.871814 4746 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy"
pod="openshift-machine-config-operator/machine-config-daemon-6wrnw" Jan 28 21:43:45 crc kubenswrapper[4746]: I0128 21:43:45.872960 4746 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"2c9e3325547a2a9e78b84305cb2707ed2def619d2d50bb736bb2f414ca2be430"} pod="openshift-machine-config-operator/machine-config-daemon-6wrnw" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted" Jan 28 21:43:45 crc kubenswrapper[4746]: I0128 21:43:45.873068 4746 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-6wrnw" podUID="6dc8b546-9734-4082-b2b3-2bafe3f1564d" containerName="machine-config-daemon" containerID="cri-o://2c9e3325547a2a9e78b84305cb2707ed2def619d2d50bb736bb2f414ca2be430" gracePeriod=600 Jan 28 21:43:46 crc kubenswrapper[4746]: E0128 21:43:46.003503 4746 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-6wrnw_openshift-machine-config-operator(6dc8b546-9734-4082-b2b3-2bafe3f1564d)\"" pod="openshift-machine-config-operator/machine-config-daemon-6wrnw" podUID="6dc8b546-9734-4082-b2b3-2bafe3f1564d" Jan 28 21:43:46 crc kubenswrapper[4746]: I0128 21:43:46.089101 4746 generic.go:334] "Generic (PLEG): container finished" podID="3130cdfc-d68d-47d7-8ee9-aefb6cdff777" containerID="190d8c84ddd2d763d994a92263da842819c1f953d09cabaf61b4bc3d0d8d59d3" exitCode=0 Jan 28 21:43:46 crc kubenswrapper[4746]: I0128 21:43:46.089194 4746 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-pjp7s/crc-debug-7vmlh" event={"ID":"3130cdfc-d68d-47d7-8ee9-aefb6cdff777","Type":"ContainerDied","Data":"190d8c84ddd2d763d994a92263da842819c1f953d09cabaf61b4bc3d0d8d59d3"} Jan 28 21:43:46 crc 
kubenswrapper[4746]: I0128 21:43:46.089226 4746 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-pjp7s/crc-debug-7vmlh" event={"ID":"3130cdfc-d68d-47d7-8ee9-aefb6cdff777","Type":"ContainerStarted","Data":"9fd62de208084f7dddc49e30776cf5f8232a42f806bf15da48f35ec06fdce79f"} Jan 28 21:43:46 crc kubenswrapper[4746]: I0128 21:43:46.092384 4746 generic.go:334] "Generic (PLEG): container finished" podID="6dc8b546-9734-4082-b2b3-2bafe3f1564d" containerID="2c9e3325547a2a9e78b84305cb2707ed2def619d2d50bb736bb2f414ca2be430" exitCode=0 Jan 28 21:43:46 crc kubenswrapper[4746]: I0128 21:43:46.092443 4746 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-6wrnw" event={"ID":"6dc8b546-9734-4082-b2b3-2bafe3f1564d","Type":"ContainerDied","Data":"2c9e3325547a2a9e78b84305cb2707ed2def619d2d50bb736bb2f414ca2be430"} Jan 28 21:43:46 crc kubenswrapper[4746]: I0128 21:43:46.092599 4746 scope.go:117] "RemoveContainer" containerID="51c1ec6fe023b6b43a0fdc61b858972d2af817d4cc2d4c7a6797132edb658ffa" Jan 28 21:43:46 crc kubenswrapper[4746]: I0128 21:43:46.094118 4746 scope.go:117] "RemoveContainer" containerID="2c9e3325547a2a9e78b84305cb2707ed2def619d2d50bb736bb2f414ca2be430" Jan 28 21:43:46 crc kubenswrapper[4746]: E0128 21:43:46.094534 4746 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-6wrnw_openshift-machine-config-operator(6dc8b546-9734-4082-b2b3-2bafe3f1564d)\"" pod="openshift-machine-config-operator/machine-config-daemon-6wrnw" podUID="6dc8b546-9734-4082-b2b3-2bafe3f1564d" Jan 28 21:43:46 crc kubenswrapper[4746]: I0128 21:43:46.130045 4746 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-must-gather-pjp7s/crc-debug-7vmlh"] Jan 28 21:43:46 crc kubenswrapper[4746]: I0128 21:43:46.154474 4746 kubelet.go:2431] 
"SyncLoop REMOVE" source="api" pods=["openshift-must-gather-pjp7s/crc-debug-7vmlh"] Jan 28 21:43:47 crc kubenswrapper[4746]: I0128 21:43:47.232932 4746 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-must-gather-pjp7s/crc-debug-7vmlh" Jan 28 21:43:47 crc kubenswrapper[4746]: I0128 21:43:47.333377 4746 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/3130cdfc-d68d-47d7-8ee9-aefb6cdff777-host\") pod \"3130cdfc-d68d-47d7-8ee9-aefb6cdff777\" (UID: \"3130cdfc-d68d-47d7-8ee9-aefb6cdff777\") " Jan 28 21:43:47 crc kubenswrapper[4746]: I0128 21:43:47.333540 4746 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/3130cdfc-d68d-47d7-8ee9-aefb6cdff777-host" (OuterVolumeSpecName: "host") pod "3130cdfc-d68d-47d7-8ee9-aefb6cdff777" (UID: "3130cdfc-d68d-47d7-8ee9-aefb6cdff777"). InnerVolumeSpecName "host". PluginName "kubernetes.io/host-path", VolumeGidValue "" Jan 28 21:43:47 crc kubenswrapper[4746]: I0128 21:43:47.333568 4746 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-xhrt9\" (UniqueName: \"kubernetes.io/projected/3130cdfc-d68d-47d7-8ee9-aefb6cdff777-kube-api-access-xhrt9\") pod \"3130cdfc-d68d-47d7-8ee9-aefb6cdff777\" (UID: \"3130cdfc-d68d-47d7-8ee9-aefb6cdff777\") " Jan 28 21:43:47 crc kubenswrapper[4746]: I0128 21:43:47.334181 4746 reconciler_common.go:293] "Volume detached for volume \"host\" (UniqueName: \"kubernetes.io/host-path/3130cdfc-d68d-47d7-8ee9-aefb6cdff777-host\") on node \"crc\" DevicePath \"\"" Jan 28 21:43:47 crc kubenswrapper[4746]: I0128 21:43:47.343700 4746 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/3130cdfc-d68d-47d7-8ee9-aefb6cdff777-kube-api-access-xhrt9" (OuterVolumeSpecName: "kube-api-access-xhrt9") pod "3130cdfc-d68d-47d7-8ee9-aefb6cdff777" (UID: 
"3130cdfc-d68d-47d7-8ee9-aefb6cdff777"). InnerVolumeSpecName "kube-api-access-xhrt9". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 28 21:43:47 crc kubenswrapper[4746]: I0128 21:43:47.436251 4746 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-xhrt9\" (UniqueName: \"kubernetes.io/projected/3130cdfc-d68d-47d7-8ee9-aefb6cdff777-kube-api-access-xhrt9\") on node \"crc\" DevicePath \"\"" Jan 28 21:43:48 crc kubenswrapper[4746]: I0128 21:43:48.115742 4746 scope.go:117] "RemoveContainer" containerID="190d8c84ddd2d763d994a92263da842819c1f953d09cabaf61b4bc3d0d8d59d3" Jan 28 21:43:48 crc kubenswrapper[4746]: I0128 21:43:48.115884 4746 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-must-gather-pjp7s/crc-debug-7vmlh" Jan 28 21:43:48 crc kubenswrapper[4746]: I0128 21:43:48.850915 4746 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="3130cdfc-d68d-47d7-8ee9-aefb6cdff777" path="/var/lib/kubelet/pods/3130cdfc-d68d-47d7-8ee9-aefb6cdff777/volumes" Jan 28 21:43:57 crc kubenswrapper[4746]: I0128 21:43:57.836010 4746 scope.go:117] "RemoveContainer" containerID="2c9e3325547a2a9e78b84305cb2707ed2def619d2d50bb736bb2f414ca2be430" Jan 28 21:43:57 crc kubenswrapper[4746]: E0128 21:43:57.836793 4746 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-6wrnw_openshift-machine-config-operator(6dc8b546-9734-4082-b2b3-2bafe3f1564d)\"" pod="openshift-machine-config-operator/machine-config-daemon-6wrnw" podUID="6dc8b546-9734-4082-b2b3-2bafe3f1564d" Jan 28 21:44:10 crc kubenswrapper[4746]: I0128 21:44:10.836253 4746 scope.go:117] "RemoveContainer" containerID="2c9e3325547a2a9e78b84305cb2707ed2def619d2d50bb736bb2f414ca2be430" Jan 28 21:44:10 crc kubenswrapper[4746]: E0128 21:44:10.837788 4746 pod_workers.go:1301] "Error 
syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-6wrnw_openshift-machine-config-operator(6dc8b546-9734-4082-b2b3-2bafe3f1564d)\"" pod="openshift-machine-config-operator/machine-config-daemon-6wrnw" podUID="6dc8b546-9734-4082-b2b3-2bafe3f1564d" Jan 28 21:44:24 crc kubenswrapper[4746]: I0128 21:44:24.836588 4746 scope.go:117] "RemoveContainer" containerID="2c9e3325547a2a9e78b84305cb2707ed2def619d2d50bb736bb2f414ca2be430" Jan 28 21:44:24 crc kubenswrapper[4746]: E0128 21:44:24.837848 4746 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-6wrnw_openshift-machine-config-operator(6dc8b546-9734-4082-b2b3-2bafe3f1564d)\"" pod="openshift-machine-config-operator/machine-config-daemon-6wrnw" podUID="6dc8b546-9734-4082-b2b3-2bafe3f1564d" Jan 28 21:44:27 crc kubenswrapper[4746]: I0128 21:44:27.936538 4746 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_alertmanager-metric-storage-0_0701e4bf-44d6-462c-a55b-140c2efceb6b/init-config-reloader/0.log" Jan 28 21:44:28 crc kubenswrapper[4746]: I0128 21:44:28.224497 4746 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_alertmanager-metric-storage-0_0701e4bf-44d6-462c-a55b-140c2efceb6b/init-config-reloader/0.log" Jan 28 21:44:28 crc kubenswrapper[4746]: I0128 21:44:28.260881 4746 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_alertmanager-metric-storage-0_0701e4bf-44d6-462c-a55b-140c2efceb6b/config-reloader/0.log" Jan 28 21:44:28 crc kubenswrapper[4746]: I0128 21:44:28.270588 4746 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_alertmanager-metric-storage-0_0701e4bf-44d6-462c-a55b-140c2efceb6b/alertmanager/0.log" Jan 28 21:44:28 
crc kubenswrapper[4746]: I0128 21:44:28.470067 4746 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_barbican-api-79599f5dcd-btgz7_8fa661e3-776e-42b0-83db-374d372232ad/barbican-api/0.log" Jan 28 21:44:28 crc kubenswrapper[4746]: I0128 21:44:28.515633 4746 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_barbican-api-79599f5dcd-btgz7_8fa661e3-776e-42b0-83db-374d372232ad/barbican-api-log/0.log" Jan 28 21:44:28 crc kubenswrapper[4746]: I0128 21:44:28.573907 4746 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_barbican-keystone-listener-74d8954788-pqmtp_9d6c401d-18ee-432b-992c-749c69887786/barbican-keystone-listener/0.log" Jan 28 21:44:28 crc kubenswrapper[4746]: I0128 21:44:28.806044 4746 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_barbican-worker-54d56bfd95-zhg7t_26db0906-ba06-4d40-b864-c7d956379296/barbican-worker-log/0.log" Jan 28 21:44:28 crc kubenswrapper[4746]: I0128 21:44:28.818532 4746 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_barbican-keystone-listener-74d8954788-pqmtp_9d6c401d-18ee-432b-992c-749c69887786/barbican-keystone-listener-log/0.log" Jan 28 21:44:28 crc kubenswrapper[4746]: I0128 21:44:28.820967 4746 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_barbican-worker-54d56bfd95-zhg7t_26db0906-ba06-4d40-b864-c7d956379296/barbican-worker/0.log" Jan 28 21:44:29 crc kubenswrapper[4746]: I0128 21:44:29.048157 4746 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_bootstrap-edpm-deployment-openstack-edpm-ipam-x7ptr_ed8a3948-98ae-4e2a-a9f8-435287fc9583/bootstrap-edpm-deployment-openstack-edpm-ipam/0.log" Jan 28 21:44:29 crc kubenswrapper[4746]: I0128 21:44:29.269474 4746 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ceilometer-0_6f6522f9-6035-4484-ba00-2255f04cd85d/ceilometer-central-agent/0.log" Jan 28 21:44:29 crc kubenswrapper[4746]: I0128 21:44:29.475300 4746 log.go:25] "Finished 
parsing log file" path="/var/log/pods/openstack_ceilometer-0_6f6522f9-6035-4484-ba00-2255f04cd85d/ceilometer-notification-agent/0.log" Jan 28 21:44:29 crc kubenswrapper[4746]: I0128 21:44:29.477054 4746 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ceilometer-0_6f6522f9-6035-4484-ba00-2255f04cd85d/sg-core/0.log" Jan 28 21:44:29 crc kubenswrapper[4746]: I0128 21:44:29.503440 4746 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ceilometer-0_6f6522f9-6035-4484-ba00-2255f04cd85d/proxy-httpd/0.log" Jan 28 21:44:29 crc kubenswrapper[4746]: I0128 21:44:29.712037 4746 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_cinder-api-0_a613bc41-1308-4925-a2df-026f6622f0c2/cinder-api-log/0.log" Jan 28 21:44:29 crc kubenswrapper[4746]: I0128 21:44:29.831174 4746 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_cinder-api-0_a613bc41-1308-4925-a2df-026f6622f0c2/cinder-api/0.log" Jan 28 21:44:29 crc kubenswrapper[4746]: I0128 21:44:29.929220 4746 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_cinder-scheduler-0_9305786c-240c-4e6a-a110-599c0067ce78/cinder-scheduler/0.log" Jan 28 21:44:30 crc kubenswrapper[4746]: I0128 21:44:30.019152 4746 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_cinder-scheduler-0_9305786c-240c-4e6a-a110-599c0067ce78/probe/0.log" Jan 28 21:44:30 crc kubenswrapper[4746]: I0128 21:44:30.119253 4746 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_cloudkitty-api-0_e6b50cb8-f8a4-49e7-b464-7e42fc66e499/cloudkitty-api/0.log" Jan 28 21:44:30 crc kubenswrapper[4746]: I0128 21:44:30.140580 4746 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_cloudkitty-api-0_e6b50cb8-f8a4-49e7-b464-7e42fc66e499/cloudkitty-api-log/0.log" Jan 28 21:44:30 crc kubenswrapper[4746]: I0128 21:44:30.294588 4746 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openstack_cloudkitty-lokistack-compactor-0_6edc718f-ce48-415e-ae81-574ef1f48cb6/loki-compactor/0.log" Jan 28 21:44:30 crc kubenswrapper[4746]: I0128 21:44:30.425238 4746 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_cloudkitty-lokistack-distributor-66dfd9bb-55rmf_7b3d4385-f154-424c-b7b6-280c36a88967/loki-distributor/0.log" Jan 28 21:44:30 crc kubenswrapper[4746]: I0128 21:44:30.660365 4746 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_cloudkitty-lokistack-gateway-7db4f4db8c-jptw9_247c16c1-2e4e-48dd-b836-0792f7231417/gateway/0.log" Jan 28 21:44:30 crc kubenswrapper[4746]: I0128 21:44:30.667406 4746 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_cloudkitty-lokistack-gateway-7db4f4db8c-s5zzt_f6b72417-5723-4d82-928b-f4be94e4bbfd/gateway/0.log" Jan 28 21:44:30 crc kubenswrapper[4746]: I0128 21:44:30.980048 4746 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_cloudkitty-lokistack-index-gateway-0_e6e1dec5-d0eb-4a49-b8c7-c89f3defbcef/loki-index-gateway/0.log" Jan 28 21:44:31 crc kubenswrapper[4746]: I0128 21:44:31.062441 4746 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_cloudkitty-lokistack-ingester-0_d3cad0b0-7b53-4280-9dec-05e01692820c/loki-ingester/0.log" Jan 28 21:44:31 crc kubenswrapper[4746]: I0128 21:44:31.302996 4746 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_cloudkitty-lokistack-query-frontend-5cd44666df-gnlkh_add39f1a-2338-41e9-9a61-d32fe5a28097/loki-query-frontend/0.log" Jan 28 21:44:31 crc kubenswrapper[4746]: I0128 21:44:31.690153 4746 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_cloudkitty-lokistack-querier-795fd8f8cc-gb5z2_9f570ea4-b303-46ab-8a65-cf64391aeb3b/loki-querier/0.log" Jan 28 21:44:31 crc kubenswrapper[4746]: I0128 21:44:31.876668 4746 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openstack_configure-network-edpm-deployment-openstack-edpm-ipam-tfrpf_f10b4bd3-0df0-4ab1-8ec6-4163541a3bf2/configure-network-edpm-deployment-openstack-edpm-ipam/0.log" Jan 28 21:44:32 crc kubenswrapper[4746]: I0128 21:44:32.068687 4746 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_configure-os-edpm-deployment-openstack-edpm-ipam-42x6f_e9b6010d-cd57-4992-b441-1745330a0246/configure-os-edpm-deployment-openstack-edpm-ipam/0.log" Jan 28 21:44:32 crc kubenswrapper[4746]: I0128 21:44:32.191184 4746 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_dnsmasq-dns-85f64749dc-pndxq_7f070414-7083-40c4-b7aa-db248c3fd681/init/0.log" Jan 28 21:44:32 crc kubenswrapper[4746]: I0128 21:44:32.382457 4746 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_dnsmasq-dns-85f64749dc-pndxq_7f070414-7083-40c4-b7aa-db248c3fd681/dnsmasq-dns/0.log" Jan 28 21:44:32 crc kubenswrapper[4746]: I0128 21:44:32.391731 4746 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_dnsmasq-dns-85f64749dc-pndxq_7f070414-7083-40c4-b7aa-db248c3fd681/init/0.log" Jan 28 21:44:32 crc kubenswrapper[4746]: I0128 21:44:32.613145 4746 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_download-cache-edpm-deployment-openstack-edpm-ipam-jgh8m_bd3a62cf-5636-4a92-8cc8-8025e70ad3d0/download-cache-edpm-deployment-openstack-edpm-ipam/0.log" Jan 28 21:44:32 crc kubenswrapper[4746]: I0128 21:44:32.970119 4746 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_glance-default-external-api-0_ed20e05e-643c-407e-bd2f-ce931e1e2bd1/glance-log/0.log" Jan 28 21:44:33 crc kubenswrapper[4746]: I0128 21:44:33.036649 4746 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_glance-default-external-api-0_ed20e05e-643c-407e-bd2f-ce931e1e2bd1/glance-httpd/0.log" Jan 28 21:44:33 crc kubenswrapper[4746]: I0128 21:44:33.268934 4746 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openstack_glance-default-internal-api-0_ffdc41d1-2cd3-446e-8d3f-6e374a19f56a/glance-httpd/0.log" Jan 28 21:44:33 crc kubenswrapper[4746]: I0128 21:44:33.281707 4746 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_glance-default-internal-api-0_ffdc41d1-2cd3-446e-8d3f-6e374a19f56a/glance-log/0.log" Jan 28 21:44:33 crc kubenswrapper[4746]: I0128 21:44:33.422904 4746 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_install-certs-edpm-deployment-openstack-edpm-ipam-5d8wd_0dbab66d-c007-4c33-b6da-1e44860668a0/install-certs-edpm-deployment-openstack-edpm-ipam/0.log" Jan 28 21:44:33 crc kubenswrapper[4746]: I0128 21:44:33.513477 4746 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_install-os-edpm-deployment-openstack-edpm-ipam-qdhsw_55aba866-d60c-4581-8f83-28fc14e421f8/install-os-edpm-deployment-openstack-edpm-ipam/0.log" Jan 28 21:44:33 crc kubenswrapper[4746]: I0128 21:44:33.702760 4746 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_keystone-cron-29493901-2dcqp_018e2b8e-63bb-41fd-8153-f0c8fc106af7/keystone-cron/0.log" Jan 28 21:44:33 crc kubenswrapper[4746]: I0128 21:44:33.954122 4746 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_kube-state-metrics-0_62270f68-89c1-462f-8aac-c4944f92cc3f/kube-state-metrics/0.log" Jan 28 21:44:33 crc kubenswrapper[4746]: I0128 21:44:33.988996 4746 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_cloudkitty-proc-0_54f66341-e026-4e4e-a7d4-be4f199ff3d6/cloudkitty-proc/0.log" Jan 28 21:44:34 crc kubenswrapper[4746]: I0128 21:44:34.081310 4746 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_keystone-6ff88f78d4-bh6qm_b8f1ba06-a425-4474-94a2-80c68832caac/keystone-api/0.log" Jan 28 21:44:34 crc kubenswrapper[4746]: I0128 21:44:34.300990 4746 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openstack_libvirt-edpm-deployment-openstack-edpm-ipam-qbvvx_7b6fd411-07ae-42b1-bb00-68e72fdbe6fb/libvirt-edpm-deployment-openstack-edpm-ipam/0.log" Jan 28 21:44:34 crc kubenswrapper[4746]: I0128 21:44:34.604886 4746 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_neutron-5569c8497f-nhjcs_d9c2d514-3bdc-4969-a429-0aac820c8e77/neutron-httpd/0.log" Jan 28 21:44:34 crc kubenswrapper[4746]: I0128 21:44:34.639715 4746 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_neutron-5569c8497f-nhjcs_d9c2d514-3bdc-4969-a429-0aac820c8e77/neutron-api/0.log" Jan 28 21:44:34 crc kubenswrapper[4746]: I0128 21:44:34.689365 4746 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_neutron-metadata-edpm-deployment-openstack-edpm-ipam-mnncp_92c386a4-a812-4e5f-938a-611be2d329ff/neutron-metadata-edpm-deployment-openstack-edpm-ipam/0.log" Jan 28 21:44:35 crc kubenswrapper[4746]: I0128 21:44:35.276256 4746 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_nova-api-0_aca41824-3271-42e1-93f8-76a1a9000681/nova-api-log/0.log" Jan 28 21:44:35 crc kubenswrapper[4746]: I0128 21:44:35.456831 4746 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_nova-cell0-conductor-0_a34b42b7-85e9-4934-bbe2-487072111391/nova-cell0-conductor-conductor/0.log" Jan 28 21:44:35 crc kubenswrapper[4746]: I0128 21:44:35.677649 4746 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_nova-cell1-conductor-0_b1d56ad0-5928-4211-a272-59aaab5e538b/nova-cell1-conductor-conductor/0.log" Jan 28 21:44:35 crc kubenswrapper[4746]: I0128 21:44:35.681485 4746 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_nova-api-0_aca41824-3271-42e1-93f8-76a1a9000681/nova-api-api/0.log" Jan 28 21:44:36 crc kubenswrapper[4746]: I0128 21:44:36.015433 4746 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openstack_nova-cell1-novncproxy-0_3466ae6e-8f00-4a2c-896e-cf1268924542/nova-cell1-novncproxy-novncproxy/0.log" Jan 28 21:44:36 crc kubenswrapper[4746]: I0128 21:44:36.046600 4746 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_nova-edpm-deployment-openstack-edpm-ipam-dpjnx_1d1f9f12-edab-459d-b9ac-2bb03644b752/nova-edpm-deployment-openstack-edpm-ipam/0.log" Jan 28 21:44:36 crc kubenswrapper[4746]: I0128 21:44:36.373855 4746 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_nova-metadata-0_305b9f70-9fd4-4f7c-a4c5-dc46a63ebbc0/nova-metadata-log/0.log" Jan 28 21:44:36 crc kubenswrapper[4746]: I0128 21:44:36.592431 4746 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_nova-scheduler-0_d36f2955-7688-4b25-9097-becffcb1f3ad/nova-scheduler-scheduler/0.log" Jan 28 21:44:36 crc kubenswrapper[4746]: I0128 21:44:36.746342 4746 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_openstack-cell1-galera-0_7257206d-db68-4f31-84d1-ceb4175ea394/mysql-bootstrap/0.log" Jan 28 21:44:36 crc kubenswrapper[4746]: I0128 21:44:36.919128 4746 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_openstack-cell1-galera-0_7257206d-db68-4f31-84d1-ceb4175ea394/mysql-bootstrap/0.log" Jan 28 21:44:37 crc kubenswrapper[4746]: I0128 21:44:37.024616 4746 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_openstack-cell1-galera-0_7257206d-db68-4f31-84d1-ceb4175ea394/galera/0.log" Jan 28 21:44:37 crc kubenswrapper[4746]: I0128 21:44:37.188397 4746 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_openstack-galera-0_e98da54b-efd0-4811-a433-9ce8134feb13/mysql-bootstrap/0.log" Jan 28 21:44:37 crc kubenswrapper[4746]: I0128 21:44:37.417636 4746 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_openstack-galera-0_e98da54b-efd0-4811-a433-9ce8134feb13/galera/0.log" Jan 28 21:44:37 crc kubenswrapper[4746]: I0128 21:44:37.526581 4746 log.go:25] 
"Finished parsing log file" path="/var/log/pods/openstack_openstack-galera-0_e98da54b-efd0-4811-a433-9ce8134feb13/mysql-bootstrap/0.log" Jan 28 21:44:37 crc kubenswrapper[4746]: I0128 21:44:37.605971 4746 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_nova-metadata-0_305b9f70-9fd4-4f7c-a4c5-dc46a63ebbc0/nova-metadata-metadata/0.log" Jan 28 21:44:37 crc kubenswrapper[4746]: I0128 21:44:37.741530 4746 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_openstackclient_b9e46853-37d6-49c8-ada6-344f49a39e5f/openstackclient/0.log" Jan 28 21:44:37 crc kubenswrapper[4746]: I0128 21:44:37.742369 4746 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovn-controller-metrics-5vb9w_7349005b-b4d2-40b0-bc5c-d83acafaf9e3/openstack-network-exporter/0.log" Jan 28 21:44:37 crc kubenswrapper[4746]: I0128 21:44:37.835388 4746 scope.go:117] "RemoveContainer" containerID="2c9e3325547a2a9e78b84305cb2707ed2def619d2d50bb736bb2f414ca2be430" Jan 28 21:44:37 crc kubenswrapper[4746]: E0128 21:44:37.835624 4746 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-6wrnw_openshift-machine-config-operator(6dc8b546-9734-4082-b2b3-2bafe3f1564d)\"" pod="openshift-machine-config-operator/machine-config-daemon-6wrnw" podUID="6dc8b546-9734-4082-b2b3-2bafe3f1564d" Jan 28 21:44:38 crc kubenswrapper[4746]: I0128 21:44:38.068029 4746 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovn-controller-ms9wc_754a9c43-4753-41cd-945d-93f7fa2b715e/ovn-controller/0.log" Jan 28 21:44:38 crc kubenswrapper[4746]: I0128 21:44:38.074933 4746 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovn-controller-ovs-fcvh6_2b1288d6-9c28-48e5-a97f-bdd75de9b8a2/ovsdb-server-init/0.log" Jan 28 21:44:38 crc kubenswrapper[4746]: I0128 21:44:38.441787 4746 log.go:25] 
"Finished parsing log file" path="/var/log/pods/openstack_ovn-controller-ovs-fcvh6_2b1288d6-9c28-48e5-a97f-bdd75de9b8a2/ovsdb-server/0.log" Jan 28 21:44:38 crc kubenswrapper[4746]: I0128 21:44:38.462417 4746 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovn-controller-ovs-fcvh6_2b1288d6-9c28-48e5-a97f-bdd75de9b8a2/ovs-vswitchd/0.log" Jan 28 21:44:38 crc kubenswrapper[4746]: I0128 21:44:38.490870 4746 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovn-controller-ovs-fcvh6_2b1288d6-9c28-48e5-a97f-bdd75de9b8a2/ovsdb-server-init/0.log" Jan 28 21:44:38 crc kubenswrapper[4746]: I0128 21:44:38.755369 4746 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovn-northd-0_af4de16a-caed-4c86-9cf8-da6f9214ca5f/openstack-network-exporter/0.log" Jan 28 21:44:38 crc kubenswrapper[4746]: I0128 21:44:38.767754 4746 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovn-edpm-deployment-openstack-edpm-ipam-sfgd4_824d1a68-929d-4c25-801a-17fdf5172893/ovn-edpm-deployment-openstack-edpm-ipam/0.log" Jan 28 21:44:38 crc kubenswrapper[4746]: I0128 21:44:38.819231 4746 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovn-northd-0_af4de16a-caed-4c86-9cf8-da6f9214ca5f/ovn-northd/0.log" Jan 28 21:44:39 crc kubenswrapper[4746]: I0128 21:44:39.057691 4746 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovsdbserver-nb-0_d1e0be80-baed-4c8f-affd-33a252b527ad/ovsdbserver-nb/0.log" Jan 28 21:44:39 crc kubenswrapper[4746]: I0128 21:44:39.281629 4746 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovsdbserver-nb-0_d1e0be80-baed-4c8f-affd-33a252b527ad/openstack-network-exporter/0.log" Jan 28 21:44:39 crc kubenswrapper[4746]: I0128 21:44:39.459624 4746 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovsdbserver-sb-0_692d10ed-801f-47d2-b069-b3a0cb8dc4b7/openstack-network-exporter/0.log" Jan 28 21:44:39 crc kubenswrapper[4746]: I0128 
21:44:39.517766 4746 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovsdbserver-sb-0_692d10ed-801f-47d2-b069-b3a0cb8dc4b7/ovsdbserver-sb/0.log" Jan 28 21:44:39 crc kubenswrapper[4746]: I0128 21:44:39.547358 4746 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_placement-5755bdbcc4-rbmx8_12408645-b253-4e59-bd2f-5a4ec243cabd/placement-api/0.log" Jan 28 21:44:39 crc kubenswrapper[4746]: I0128 21:44:39.809831 4746 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_prometheus-metric-storage-0_914308b3-0f5e-4716-bc87-948f8a8acfb3/init-config-reloader/0.log" Jan 28 21:44:39 crc kubenswrapper[4746]: I0128 21:44:39.829244 4746 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_placement-5755bdbcc4-rbmx8_12408645-b253-4e59-bd2f-5a4ec243cabd/placement-log/0.log" Jan 28 21:44:40 crc kubenswrapper[4746]: I0128 21:44:40.054724 4746 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_prometheus-metric-storage-0_914308b3-0f5e-4716-bc87-948f8a8acfb3/init-config-reloader/0.log" Jan 28 21:44:40 crc kubenswrapper[4746]: I0128 21:44:40.067133 4746 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_prometheus-metric-storage-0_914308b3-0f5e-4716-bc87-948f8a8acfb3/prometheus/0.log" Jan 28 21:44:40 crc kubenswrapper[4746]: I0128 21:44:40.110327 4746 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_prometheus-metric-storage-0_914308b3-0f5e-4716-bc87-948f8a8acfb3/config-reloader/0.log" Jan 28 21:44:40 crc kubenswrapper[4746]: I0128 21:44:40.202053 4746 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_prometheus-metric-storage-0_914308b3-0f5e-4716-bc87-948f8a8acfb3/thanos-sidecar/0.log" Jan 28 21:44:40 crc kubenswrapper[4746]: I0128 21:44:40.286551 4746 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_rabbitmq-cell1-server-0_31ed4da0-c996-4afb-aa3d-d61a7c13ccfb/setup-container/0.log" Jan 28 21:44:40 crc 
kubenswrapper[4746]: I0128 21:44:40.600414 4746 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_rabbitmq-cell1-server-0_31ed4da0-c996-4afb-aa3d-d61a7c13ccfb/rabbitmq/0.log" Jan 28 21:44:40 crc kubenswrapper[4746]: I0128 21:44:40.632398 4746 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_rabbitmq-cell1-server-0_31ed4da0-c996-4afb-aa3d-d61a7c13ccfb/setup-container/0.log" Jan 28 21:44:40 crc kubenswrapper[4746]: I0128 21:44:40.708510 4746 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_rabbitmq-server-0_f330def9-769c-4adf-9df3-c1a7c54cd502/setup-container/0.log" Jan 28 21:44:40 crc kubenswrapper[4746]: I0128 21:44:40.945873 4746 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_rabbitmq-server-0_f330def9-769c-4adf-9df3-c1a7c54cd502/setup-container/0.log" Jan 28 21:44:41 crc kubenswrapper[4746]: I0128 21:44:41.190412 4746 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_redhat-edpm-deployment-openstack-edpm-ipam-p7xp2_a5bda0ca-2718-41bf-84d6-6c08d35d16b1/redhat-edpm-deployment-openstack-edpm-ipam/0.log" Jan 28 21:44:41 crc kubenswrapper[4746]: I0128 21:44:41.405027 4746 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_rabbitmq-server-0_f330def9-769c-4adf-9df3-c1a7c54cd502/rabbitmq/0.log" Jan 28 21:44:41 crc kubenswrapper[4746]: I0128 21:44:41.619881 4746 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_reboot-os-edpm-deployment-openstack-edpm-ipam-nvgcn_f2c97be5-d93c-4a83-87ad-48abb73d603c/reboot-os-edpm-deployment-openstack-edpm-ipam/0.log" Jan 28 21:44:41 crc kubenswrapper[4746]: I0128 21:44:41.621609 4746 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_repo-setup-edpm-deployment-openstack-edpm-ipam-nfz6p_90153a28-4812-4b2e-a3a3-2443a8618c3d/repo-setup-edpm-deployment-openstack-edpm-ipam/0.log" Jan 28 21:44:41 crc kubenswrapper[4746]: I0128 21:44:41.872973 4746 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openstack_run-os-edpm-deployment-openstack-edpm-ipam-xv2wb_280f47f9-2f66-4991-bd8f-59b734c5a935/run-os-edpm-deployment-openstack-edpm-ipam/0.log" Jan 28 21:44:42 crc kubenswrapper[4746]: I0128 21:44:42.112438 4746 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ssh-known-hosts-edpm-deployment-44hcw_cb5bd90c-ea83-463c-aed8-3291063c50bc/ssh-known-hosts-edpm-deployment/0.log" Jan 28 21:44:42 crc kubenswrapper[4746]: I0128 21:44:42.372603 4746 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_swift-proxy-5d6f7ddd75-47x9g_4524d3f6-9b61-4b9c-b778-0078a31efc3e/proxy-server/0.log" Jan 28 21:44:42 crc kubenswrapper[4746]: I0128 21:44:42.450765 4746 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_swift-proxy-5d6f7ddd75-47x9g_4524d3f6-9b61-4b9c-b778-0078a31efc3e/proxy-httpd/0.log" Jan 28 21:44:42 crc kubenswrapper[4746]: I0128 21:44:42.510285 4746 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_swift-ring-rebalance-5sxj6_61a4ff02-ae06-438a-a39c-8264c8e61b38/swift-ring-rebalance/0.log" Jan 28 21:44:42 crc kubenswrapper[4746]: I0128 21:44:42.824485 4746 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_swift-storage-0_39e8de66-78c6-45cf-b026-7783ef89922d/account-replicator/0.log" Jan 28 21:44:42 crc kubenswrapper[4746]: I0128 21:44:42.826768 4746 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_swift-storage-0_39e8de66-78c6-45cf-b026-7783ef89922d/account-reaper/0.log" Jan 28 21:44:42 crc kubenswrapper[4746]: I0128 21:44:42.881824 4746 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_swift-storage-0_39e8de66-78c6-45cf-b026-7783ef89922d/account-auditor/0.log" Jan 28 21:44:43 crc kubenswrapper[4746]: I0128 21:44:43.080259 4746 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_swift-storage-0_39e8de66-78c6-45cf-b026-7783ef89922d/account-server/0.log" Jan 28 21:44:43 crc kubenswrapper[4746]: I0128 
21:44:43.106895 4746 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_swift-storage-0_39e8de66-78c6-45cf-b026-7783ef89922d/container-replicator/0.log" Jan 28 21:44:43 crc kubenswrapper[4746]: I0128 21:44:43.108653 4746 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_swift-storage-0_39e8de66-78c6-45cf-b026-7783ef89922d/container-auditor/0.log" Jan 28 21:44:43 crc kubenswrapper[4746]: I0128 21:44:43.173880 4746 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_swift-storage-0_39e8de66-78c6-45cf-b026-7783ef89922d/container-server/0.log" Jan 28 21:44:43 crc kubenswrapper[4746]: I0128 21:44:43.274675 4746 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_swift-storage-0_39e8de66-78c6-45cf-b026-7783ef89922d/container-updater/0.log" Jan 28 21:44:43 crc kubenswrapper[4746]: I0128 21:44:43.359364 4746 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_swift-storage-0_39e8de66-78c6-45cf-b026-7783ef89922d/object-auditor/0.log" Jan 28 21:44:43 crc kubenswrapper[4746]: I0128 21:44:43.449956 4746 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_swift-storage-0_39e8de66-78c6-45cf-b026-7783ef89922d/object-expirer/0.log" Jan 28 21:44:43 crc kubenswrapper[4746]: I0128 21:44:43.486330 4746 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_swift-storage-0_39e8de66-78c6-45cf-b026-7783ef89922d/object-replicator/0.log" Jan 28 21:44:43 crc kubenswrapper[4746]: I0128 21:44:43.501028 4746 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_swift-storage-0_39e8de66-78c6-45cf-b026-7783ef89922d/object-server/0.log" Jan 28 21:44:43 crc kubenswrapper[4746]: I0128 21:44:43.609196 4746 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_swift-storage-0_39e8de66-78c6-45cf-b026-7783ef89922d/object-updater/0.log" Jan 28 21:44:43 crc kubenswrapper[4746]: I0128 21:44:43.633392 4746 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openstack_swift-storage-0_39e8de66-78c6-45cf-b026-7783ef89922d/rsync/0.log" Jan 28 21:44:43 crc kubenswrapper[4746]: I0128 21:44:43.744834 4746 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_swift-storage-0_39e8de66-78c6-45cf-b026-7783ef89922d/swift-recon-cron/0.log" Jan 28 21:44:44 crc kubenswrapper[4746]: I0128 21:44:44.078744 4746 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_telemetry-edpm-deployment-openstack-edpm-ipam-6jfqt_9c46ddb7-5815-475e-b798-06a7fee944c8/telemetry-edpm-deployment-openstack-edpm-ipam/0.log" Jan 28 21:44:44 crc kubenswrapper[4746]: I0128 21:44:44.131364 4746 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_tempest-tests-tempest_4bc6c6e0-6feb-4270-ad0e-2e8cbbb3430a/tempest-tests-tempest-tests-runner/0.log" Jan 28 21:44:44 crc kubenswrapper[4746]: I0128 21:44:44.216883 4746 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_test-operator-logs-pod-tempest-tempest-tests-tempest_e4a0e4c1-a20f-4efd-aae8-3345c0c9c0f7/test-operator-logs-container/0.log" Jan 28 21:44:44 crc kubenswrapper[4746]: I0128 21:44:44.623069 4746 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_validate-network-edpm-deployment-openstack-edpm-ipam-rjltv_8728e263-d102-4878-a40e-30e414240224/validate-network-edpm-deployment-openstack-edpm-ipam/0.log" Jan 28 21:44:45 crc kubenswrapper[4746]: I0128 21:44:45.781956 4746 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/community-operators-xmrwm"] Jan 28 21:44:45 crc kubenswrapper[4746]: E0128 21:44:45.782636 4746 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="3130cdfc-d68d-47d7-8ee9-aefb6cdff777" containerName="container-00" Jan 28 21:44:45 crc kubenswrapper[4746]: I0128 21:44:45.782648 4746 state_mem.go:107] "Deleted CPUSet assignment" podUID="3130cdfc-d68d-47d7-8ee9-aefb6cdff777" containerName="container-00" Jan 28 21:44:45 crc kubenswrapper[4746]: I0128 
21:44:45.782901 4746 memory_manager.go:354] "RemoveStaleState removing state" podUID="3130cdfc-d68d-47d7-8ee9-aefb6cdff777" containerName="container-00" Jan 28 21:44:45 crc kubenswrapper[4746]: I0128 21:44:45.784312 4746 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-xmrwm" Jan 28 21:44:45 crc kubenswrapper[4746]: I0128 21:44:45.801696 4746 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-xmrwm"] Jan 28 21:44:45 crc kubenswrapper[4746]: I0128 21:44:45.868658 4746 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/2639171e-9ece-47e7-a06e-d363dda17700-utilities\") pod \"community-operators-xmrwm\" (UID: \"2639171e-9ece-47e7-a06e-d363dda17700\") " pod="openshift-marketplace/community-operators-xmrwm" Jan 28 21:44:45 crc kubenswrapper[4746]: I0128 21:44:45.868803 4746 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/2639171e-9ece-47e7-a06e-d363dda17700-catalog-content\") pod \"community-operators-xmrwm\" (UID: \"2639171e-9ece-47e7-a06e-d363dda17700\") " pod="openshift-marketplace/community-operators-xmrwm" Jan 28 21:44:45 crc kubenswrapper[4746]: I0128 21:44:45.868824 4746 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-xwrt4\" (UniqueName: \"kubernetes.io/projected/2639171e-9ece-47e7-a06e-d363dda17700-kube-api-access-xwrt4\") pod \"community-operators-xmrwm\" (UID: \"2639171e-9ece-47e7-a06e-d363dda17700\") " pod="openshift-marketplace/community-operators-xmrwm" Jan 28 21:44:45 crc kubenswrapper[4746]: I0128 21:44:45.972649 4746 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: 
\"kubernetes.io/empty-dir/2639171e-9ece-47e7-a06e-d363dda17700-catalog-content\") pod \"community-operators-xmrwm\" (UID: \"2639171e-9ece-47e7-a06e-d363dda17700\") " pod="openshift-marketplace/community-operators-xmrwm" Jan 28 21:44:45 crc kubenswrapper[4746]: I0128 21:44:45.972686 4746 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-xwrt4\" (UniqueName: \"kubernetes.io/projected/2639171e-9ece-47e7-a06e-d363dda17700-kube-api-access-xwrt4\") pod \"community-operators-xmrwm\" (UID: \"2639171e-9ece-47e7-a06e-d363dda17700\") " pod="openshift-marketplace/community-operators-xmrwm" Jan 28 21:44:45 crc kubenswrapper[4746]: I0128 21:44:45.972768 4746 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/2639171e-9ece-47e7-a06e-d363dda17700-utilities\") pod \"community-operators-xmrwm\" (UID: \"2639171e-9ece-47e7-a06e-d363dda17700\") " pod="openshift-marketplace/community-operators-xmrwm" Jan 28 21:44:45 crc kubenswrapper[4746]: I0128 21:44:45.973227 4746 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/2639171e-9ece-47e7-a06e-d363dda17700-catalog-content\") pod \"community-operators-xmrwm\" (UID: \"2639171e-9ece-47e7-a06e-d363dda17700\") " pod="openshift-marketplace/community-operators-xmrwm" Jan 28 21:44:45 crc kubenswrapper[4746]: I0128 21:44:45.973794 4746 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/2639171e-9ece-47e7-a06e-d363dda17700-utilities\") pod \"community-operators-xmrwm\" (UID: \"2639171e-9ece-47e7-a06e-d363dda17700\") " pod="openshift-marketplace/community-operators-xmrwm" Jan 28 21:44:46 crc kubenswrapper[4746]: I0128 21:44:46.014021 4746 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-xwrt4\" (UniqueName: 
\"kubernetes.io/projected/2639171e-9ece-47e7-a06e-d363dda17700-kube-api-access-xwrt4\") pod \"community-operators-xmrwm\" (UID: \"2639171e-9ece-47e7-a06e-d363dda17700\") " pod="openshift-marketplace/community-operators-xmrwm" Jan 28 21:44:46 crc kubenswrapper[4746]: I0128 21:44:46.104181 4746 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-xmrwm" Jan 28 21:44:46 crc kubenswrapper[4746]: I0128 21:44:46.758713 4746 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-xmrwm"] Jan 28 21:44:47 crc kubenswrapper[4746]: I0128 21:44:47.645524 4746 generic.go:334] "Generic (PLEG): container finished" podID="2639171e-9ece-47e7-a06e-d363dda17700" containerID="55667b129bad8521fe17460f1876d79c1f6fd904ce96e11cdda875497b71c4ec" exitCode=0 Jan 28 21:44:47 crc kubenswrapper[4746]: I0128 21:44:47.645567 4746 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-xmrwm" event={"ID":"2639171e-9ece-47e7-a06e-d363dda17700","Type":"ContainerDied","Data":"55667b129bad8521fe17460f1876d79c1f6fd904ce96e11cdda875497b71c4ec"} Jan 28 21:44:47 crc kubenswrapper[4746]: I0128 21:44:47.645594 4746 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-xmrwm" event={"ID":"2639171e-9ece-47e7-a06e-d363dda17700","Type":"ContainerStarted","Data":"b88629c36fca8f8a514c749958060f0b17b64ebdb53006b37ebd3a282c62a0b7"} Jan 28 21:44:47 crc kubenswrapper[4746]: I0128 21:44:47.719070 4746 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_memcached-0_8c01b9a6-3e78-4a0c-9825-e39856c2df93/memcached/0.log" Jan 28 21:44:48 crc kubenswrapper[4746]: I0128 21:44:48.655089 4746 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-xmrwm" 
event={"ID":"2639171e-9ece-47e7-a06e-d363dda17700","Type":"ContainerStarted","Data":"066ec1e16d88b3e653e1344862ed9aab37960af6b85b7cdef213f0d89dec4910"} Jan 28 21:44:50 crc kubenswrapper[4746]: I0128 21:44:50.673538 4746 generic.go:334] "Generic (PLEG): container finished" podID="2639171e-9ece-47e7-a06e-d363dda17700" containerID="066ec1e16d88b3e653e1344862ed9aab37960af6b85b7cdef213f0d89dec4910" exitCode=0 Jan 28 21:44:50 crc kubenswrapper[4746]: I0128 21:44:50.673599 4746 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-xmrwm" event={"ID":"2639171e-9ece-47e7-a06e-d363dda17700","Type":"ContainerDied","Data":"066ec1e16d88b3e653e1344862ed9aab37960af6b85b7cdef213f0d89dec4910"} Jan 28 21:44:51 crc kubenswrapper[4746]: I0128 21:44:51.687206 4746 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-xmrwm" event={"ID":"2639171e-9ece-47e7-a06e-d363dda17700","Type":"ContainerStarted","Data":"ba5715a99765e3ce5296bfd6ea83dfa93c84fbf17246700162b57954eafca347"} Jan 28 21:44:51 crc kubenswrapper[4746]: I0128 21:44:51.722931 4746 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/community-operators-xmrwm" podStartSLOduration=3.335658694 podStartE2EDuration="6.722914292s" podCreationTimestamp="2026-01-28 21:44:45 +0000 UTC" firstStartedPulling="2026-01-28 21:44:47.650302697 +0000 UTC m=+3915.606489051" lastFinishedPulling="2026-01-28 21:44:51.037558285 +0000 UTC m=+3918.993744649" observedRunningTime="2026-01-28 21:44:51.719937781 +0000 UTC m=+3919.676124135" watchObservedRunningTime="2026-01-28 21:44:51.722914292 +0000 UTC m=+3919.679100646" Jan 28 21:44:52 crc kubenswrapper[4746]: I0128 21:44:52.841654 4746 scope.go:117] "RemoveContainer" containerID="2c9e3325547a2a9e78b84305cb2707ed2def619d2d50bb736bb2f414ca2be430" Jan 28 21:44:52 crc kubenswrapper[4746]: E0128 21:44:52.842118 4746 pod_workers.go:1301] "Error syncing pod, skipping" err="failed 
to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-6wrnw_openshift-machine-config-operator(6dc8b546-9734-4082-b2b3-2bafe3f1564d)\"" pod="openshift-machine-config-operator/machine-config-daemon-6wrnw" podUID="6dc8b546-9734-4082-b2b3-2bafe3f1564d" Jan 28 21:44:56 crc kubenswrapper[4746]: I0128 21:44:56.105202 4746 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/community-operators-xmrwm" Jan 28 21:44:56 crc kubenswrapper[4746]: I0128 21:44:56.105700 4746 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/community-operators-xmrwm" Jan 28 21:44:56 crc kubenswrapper[4746]: I0128 21:44:56.156105 4746 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/community-operators-xmrwm" Jan 28 21:44:56 crc kubenswrapper[4746]: I0128 21:44:56.771581 4746 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/community-operators-xmrwm" Jan 28 21:44:57 crc kubenswrapper[4746]: I0128 21:44:57.960179 4746 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-xmrwm"] Jan 28 21:44:58 crc kubenswrapper[4746]: I0128 21:44:58.746989 4746 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/community-operators-xmrwm" podUID="2639171e-9ece-47e7-a06e-d363dda17700" containerName="registry-server" containerID="cri-o://ba5715a99765e3ce5296bfd6ea83dfa93c84fbf17246700162b57954eafca347" gracePeriod=2 Jan 28 21:44:59 crc kubenswrapper[4746]: I0128 21:44:59.649683 4746 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-xmrwm" Jan 28 21:44:59 crc kubenswrapper[4746]: I0128 21:44:59.767323 4746 generic.go:334] "Generic (PLEG): container finished" podID="2639171e-9ece-47e7-a06e-d363dda17700" containerID="ba5715a99765e3ce5296bfd6ea83dfa93c84fbf17246700162b57954eafca347" exitCode=0 Jan 28 21:44:59 crc kubenswrapper[4746]: I0128 21:44:59.767371 4746 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-xmrwm" event={"ID":"2639171e-9ece-47e7-a06e-d363dda17700","Type":"ContainerDied","Data":"ba5715a99765e3ce5296bfd6ea83dfa93c84fbf17246700162b57954eafca347"} Jan 28 21:44:59 crc kubenswrapper[4746]: I0128 21:44:59.767399 4746 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-xmrwm" Jan 28 21:44:59 crc kubenswrapper[4746]: I0128 21:44:59.767420 4746 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-xmrwm" event={"ID":"2639171e-9ece-47e7-a06e-d363dda17700","Type":"ContainerDied","Data":"b88629c36fca8f8a514c749958060f0b17b64ebdb53006b37ebd3a282c62a0b7"} Jan 28 21:44:59 crc kubenswrapper[4746]: I0128 21:44:59.767438 4746 scope.go:117] "RemoveContainer" containerID="ba5715a99765e3ce5296bfd6ea83dfa93c84fbf17246700162b57954eafca347" Jan 28 21:44:59 crc kubenswrapper[4746]: I0128 21:44:59.776673 4746 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-xwrt4\" (UniqueName: \"kubernetes.io/projected/2639171e-9ece-47e7-a06e-d363dda17700-kube-api-access-xwrt4\") pod \"2639171e-9ece-47e7-a06e-d363dda17700\" (UID: \"2639171e-9ece-47e7-a06e-d363dda17700\") " Jan 28 21:44:59 crc kubenswrapper[4746]: I0128 21:44:59.776794 4746 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/2639171e-9ece-47e7-a06e-d363dda17700-utilities\") pod 
\"2639171e-9ece-47e7-a06e-d363dda17700\" (UID: \"2639171e-9ece-47e7-a06e-d363dda17700\") " Jan 28 21:44:59 crc kubenswrapper[4746]: I0128 21:44:59.777706 4746 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/2639171e-9ece-47e7-a06e-d363dda17700-utilities" (OuterVolumeSpecName: "utilities") pod "2639171e-9ece-47e7-a06e-d363dda17700" (UID: "2639171e-9ece-47e7-a06e-d363dda17700"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 28 21:44:59 crc kubenswrapper[4746]: I0128 21:44:59.777852 4746 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/2639171e-9ece-47e7-a06e-d363dda17700-catalog-content\") pod \"2639171e-9ece-47e7-a06e-d363dda17700\" (UID: \"2639171e-9ece-47e7-a06e-d363dda17700\") " Jan 28 21:44:59 crc kubenswrapper[4746]: I0128 21:44:59.784097 4746 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/2639171e-9ece-47e7-a06e-d363dda17700-utilities\") on node \"crc\" DevicePath \"\"" Jan 28 21:44:59 crc kubenswrapper[4746]: I0128 21:44:59.798506 4746 scope.go:117] "RemoveContainer" containerID="066ec1e16d88b3e653e1344862ed9aab37960af6b85b7cdef213f0d89dec4910" Jan 28 21:44:59 crc kubenswrapper[4746]: I0128 21:44:59.798541 4746 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/2639171e-9ece-47e7-a06e-d363dda17700-kube-api-access-xwrt4" (OuterVolumeSpecName: "kube-api-access-xwrt4") pod "2639171e-9ece-47e7-a06e-d363dda17700" (UID: "2639171e-9ece-47e7-a06e-d363dda17700"). InnerVolumeSpecName "kube-api-access-xwrt4". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 28 21:44:59 crc kubenswrapper[4746]: I0128 21:44:59.845662 4746 scope.go:117] "RemoveContainer" containerID="55667b129bad8521fe17460f1876d79c1f6fd904ce96e11cdda875497b71c4ec" Jan 28 21:44:59 crc kubenswrapper[4746]: I0128 21:44:59.857985 4746 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/2639171e-9ece-47e7-a06e-d363dda17700-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "2639171e-9ece-47e7-a06e-d363dda17700" (UID: "2639171e-9ece-47e7-a06e-d363dda17700"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 28 21:44:59 crc kubenswrapper[4746]: I0128 21:44:59.885986 4746 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-xwrt4\" (UniqueName: \"kubernetes.io/projected/2639171e-9ece-47e7-a06e-d363dda17700-kube-api-access-xwrt4\") on node \"crc\" DevicePath \"\"" Jan 28 21:44:59 crc kubenswrapper[4746]: I0128 21:44:59.886216 4746 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/2639171e-9ece-47e7-a06e-d363dda17700-catalog-content\") on node \"crc\" DevicePath \"\"" Jan 28 21:44:59 crc kubenswrapper[4746]: I0128 21:44:59.888888 4746 scope.go:117] "RemoveContainer" containerID="ba5715a99765e3ce5296bfd6ea83dfa93c84fbf17246700162b57954eafca347" Jan 28 21:44:59 crc kubenswrapper[4746]: E0128 21:44:59.891439 4746 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"ba5715a99765e3ce5296bfd6ea83dfa93c84fbf17246700162b57954eafca347\": container with ID starting with ba5715a99765e3ce5296bfd6ea83dfa93c84fbf17246700162b57954eafca347 not found: ID does not exist" containerID="ba5715a99765e3ce5296bfd6ea83dfa93c84fbf17246700162b57954eafca347" Jan 28 21:44:59 crc kubenswrapper[4746]: I0128 21:44:59.891503 4746 pod_container_deletor.go:53] "DeleteContainer returned 
error" containerID={"Type":"cri-o","ID":"ba5715a99765e3ce5296bfd6ea83dfa93c84fbf17246700162b57954eafca347"} err="failed to get container status \"ba5715a99765e3ce5296bfd6ea83dfa93c84fbf17246700162b57954eafca347\": rpc error: code = NotFound desc = could not find container \"ba5715a99765e3ce5296bfd6ea83dfa93c84fbf17246700162b57954eafca347\": container with ID starting with ba5715a99765e3ce5296bfd6ea83dfa93c84fbf17246700162b57954eafca347 not found: ID does not exist" Jan 28 21:44:59 crc kubenswrapper[4746]: I0128 21:44:59.891529 4746 scope.go:117] "RemoveContainer" containerID="066ec1e16d88b3e653e1344862ed9aab37960af6b85b7cdef213f0d89dec4910" Jan 28 21:44:59 crc kubenswrapper[4746]: E0128 21:44:59.891785 4746 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"066ec1e16d88b3e653e1344862ed9aab37960af6b85b7cdef213f0d89dec4910\": container with ID starting with 066ec1e16d88b3e653e1344862ed9aab37960af6b85b7cdef213f0d89dec4910 not found: ID does not exist" containerID="066ec1e16d88b3e653e1344862ed9aab37960af6b85b7cdef213f0d89dec4910" Jan 28 21:44:59 crc kubenswrapper[4746]: I0128 21:44:59.891816 4746 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"066ec1e16d88b3e653e1344862ed9aab37960af6b85b7cdef213f0d89dec4910"} err="failed to get container status \"066ec1e16d88b3e653e1344862ed9aab37960af6b85b7cdef213f0d89dec4910\": rpc error: code = NotFound desc = could not find container \"066ec1e16d88b3e653e1344862ed9aab37960af6b85b7cdef213f0d89dec4910\": container with ID starting with 066ec1e16d88b3e653e1344862ed9aab37960af6b85b7cdef213f0d89dec4910 not found: ID does not exist" Jan 28 21:44:59 crc kubenswrapper[4746]: I0128 21:44:59.891831 4746 scope.go:117] "RemoveContainer" containerID="55667b129bad8521fe17460f1876d79c1f6fd904ce96e11cdda875497b71c4ec" Jan 28 21:44:59 crc kubenswrapper[4746]: E0128 21:44:59.891995 4746 log.go:32] "ContainerStatus from runtime 
service failed" err="rpc error: code = NotFound desc = could not find container \"55667b129bad8521fe17460f1876d79c1f6fd904ce96e11cdda875497b71c4ec\": container with ID starting with 55667b129bad8521fe17460f1876d79c1f6fd904ce96e11cdda875497b71c4ec not found: ID does not exist" containerID="55667b129bad8521fe17460f1876d79c1f6fd904ce96e11cdda875497b71c4ec" Jan 28 21:44:59 crc kubenswrapper[4746]: I0128 21:44:59.892014 4746 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"55667b129bad8521fe17460f1876d79c1f6fd904ce96e11cdda875497b71c4ec"} err="failed to get container status \"55667b129bad8521fe17460f1876d79c1f6fd904ce96e11cdda875497b71c4ec\": rpc error: code = NotFound desc = could not find container \"55667b129bad8521fe17460f1876d79c1f6fd904ce96e11cdda875497b71c4ec\": container with ID starting with 55667b129bad8521fe17460f1876d79c1f6fd904ce96e11cdda875497b71c4ec not found: ID does not exist" Jan 28 21:45:00 crc kubenswrapper[4746]: I0128 21:45:00.097416 4746 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-xmrwm"] Jan 28 21:45:00 crc kubenswrapper[4746]: I0128 21:45:00.104848 4746 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/community-operators-xmrwm"] Jan 28 21:45:00 crc kubenswrapper[4746]: I0128 21:45:00.181023 4746 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29493945-ftcjc"] Jan 28 21:45:00 crc kubenswrapper[4746]: E0128 21:45:00.181432 4746 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="2639171e-9ece-47e7-a06e-d363dda17700" containerName="extract-utilities" Jan 28 21:45:00 crc kubenswrapper[4746]: I0128 21:45:00.181449 4746 state_mem.go:107] "Deleted CPUSet assignment" podUID="2639171e-9ece-47e7-a06e-d363dda17700" containerName="extract-utilities" Jan 28 21:45:00 crc kubenswrapper[4746]: E0128 21:45:00.181476 4746 cpu_manager.go:410] "RemoveStaleState: 
removing container" podUID="2639171e-9ece-47e7-a06e-d363dda17700" containerName="extract-content" Jan 28 21:45:00 crc kubenswrapper[4746]: I0128 21:45:00.181483 4746 state_mem.go:107] "Deleted CPUSet assignment" podUID="2639171e-9ece-47e7-a06e-d363dda17700" containerName="extract-content" Jan 28 21:45:00 crc kubenswrapper[4746]: E0128 21:45:00.181498 4746 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="2639171e-9ece-47e7-a06e-d363dda17700" containerName="registry-server" Jan 28 21:45:00 crc kubenswrapper[4746]: I0128 21:45:00.181504 4746 state_mem.go:107] "Deleted CPUSet assignment" podUID="2639171e-9ece-47e7-a06e-d363dda17700" containerName="registry-server" Jan 28 21:45:00 crc kubenswrapper[4746]: I0128 21:45:00.181710 4746 memory_manager.go:354] "RemoveStaleState removing state" podUID="2639171e-9ece-47e7-a06e-d363dda17700" containerName="registry-server" Jan 28 21:45:00 crc kubenswrapper[4746]: I0128 21:45:00.182592 4746 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29493945-ftcjc" Jan 28 21:45:00 crc kubenswrapper[4746]: I0128 21:45:00.185479 4746 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"collect-profiles-dockercfg-kzf4t" Jan 28 21:45:00 crc kubenswrapper[4746]: I0128 21:45:00.185591 4746 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-operator-lifecycle-manager"/"collect-profiles-config" Jan 28 21:45:00 crc kubenswrapper[4746]: I0128 21:45:00.198435 4746 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29493945-ftcjc"] Jan 28 21:45:00 crc kubenswrapper[4746]: I0128 21:45:00.293057 4746 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/6e81e481-7c5b-44bf-8f7d-ae0a582942a5-config-volume\") pod \"collect-profiles-29493945-ftcjc\" (UID: \"6e81e481-7c5b-44bf-8f7d-ae0a582942a5\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29493945-ftcjc" Jan 28 21:45:00 crc kubenswrapper[4746]: I0128 21:45:00.293199 4746 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/6e81e481-7c5b-44bf-8f7d-ae0a582942a5-secret-volume\") pod \"collect-profiles-29493945-ftcjc\" (UID: \"6e81e481-7c5b-44bf-8f7d-ae0a582942a5\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29493945-ftcjc" Jan 28 21:45:00 crc kubenswrapper[4746]: I0128 21:45:00.293302 4746 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-d5gqq\" (UniqueName: \"kubernetes.io/projected/6e81e481-7c5b-44bf-8f7d-ae0a582942a5-kube-api-access-d5gqq\") pod \"collect-profiles-29493945-ftcjc\" (UID: \"6e81e481-7c5b-44bf-8f7d-ae0a582942a5\") " 
pod="openshift-operator-lifecycle-manager/collect-profiles-29493945-ftcjc" Jan 28 21:45:00 crc kubenswrapper[4746]: I0128 21:45:00.395089 4746 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/6e81e481-7c5b-44bf-8f7d-ae0a582942a5-config-volume\") pod \"collect-profiles-29493945-ftcjc\" (UID: \"6e81e481-7c5b-44bf-8f7d-ae0a582942a5\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29493945-ftcjc" Jan 28 21:45:00 crc kubenswrapper[4746]: I0128 21:45:00.395203 4746 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/6e81e481-7c5b-44bf-8f7d-ae0a582942a5-secret-volume\") pod \"collect-profiles-29493945-ftcjc\" (UID: \"6e81e481-7c5b-44bf-8f7d-ae0a582942a5\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29493945-ftcjc" Jan 28 21:45:00 crc kubenswrapper[4746]: I0128 21:45:00.395293 4746 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-d5gqq\" (UniqueName: \"kubernetes.io/projected/6e81e481-7c5b-44bf-8f7d-ae0a582942a5-kube-api-access-d5gqq\") pod \"collect-profiles-29493945-ftcjc\" (UID: \"6e81e481-7c5b-44bf-8f7d-ae0a582942a5\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29493945-ftcjc" Jan 28 21:45:00 crc kubenswrapper[4746]: I0128 21:45:00.396413 4746 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/6e81e481-7c5b-44bf-8f7d-ae0a582942a5-config-volume\") pod \"collect-profiles-29493945-ftcjc\" (UID: \"6e81e481-7c5b-44bf-8f7d-ae0a582942a5\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29493945-ftcjc" Jan 28 21:45:00 crc kubenswrapper[4746]: I0128 21:45:00.400220 4746 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"secret-volume\" (UniqueName: 
\"kubernetes.io/secret/6e81e481-7c5b-44bf-8f7d-ae0a582942a5-secret-volume\") pod \"collect-profiles-29493945-ftcjc\" (UID: \"6e81e481-7c5b-44bf-8f7d-ae0a582942a5\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29493945-ftcjc" Jan 28 21:45:00 crc kubenswrapper[4746]: I0128 21:45:00.410817 4746 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-d5gqq\" (UniqueName: \"kubernetes.io/projected/6e81e481-7c5b-44bf-8f7d-ae0a582942a5-kube-api-access-d5gqq\") pod \"collect-profiles-29493945-ftcjc\" (UID: \"6e81e481-7c5b-44bf-8f7d-ae0a582942a5\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29493945-ftcjc" Jan 28 21:45:00 crc kubenswrapper[4746]: I0128 21:45:00.502137 4746 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29493945-ftcjc" Jan 28 21:45:00 crc kubenswrapper[4746]: I0128 21:45:00.848625 4746 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="2639171e-9ece-47e7-a06e-d363dda17700" path="/var/lib/kubelet/pods/2639171e-9ece-47e7-a06e-d363dda17700/volumes" Jan 28 21:45:01 crc kubenswrapper[4746]: I0128 21:45:01.002070 4746 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29493945-ftcjc"] Jan 28 21:45:01 crc kubenswrapper[4746]: I0128 21:45:01.787057 4746 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29493945-ftcjc" event={"ID":"6e81e481-7c5b-44bf-8f7d-ae0a582942a5","Type":"ContainerStarted","Data":"f035390f3b7c197dd898fbdbc5b613e2ca0094c2ca90d8b8d629c18a150c34cf"} Jan 28 21:45:01 crc kubenswrapper[4746]: I0128 21:45:01.787372 4746 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29493945-ftcjc" 
event={"ID":"6e81e481-7c5b-44bf-8f7d-ae0a582942a5","Type":"ContainerStarted","Data":"7f1724e1e93066b5a570e8c219b4ede2d076dd3e81eca95c8d8d528d29270c4b"} Jan 28 21:45:01 crc kubenswrapper[4746]: I0128 21:45:01.808795 4746 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-operator-lifecycle-manager/collect-profiles-29493945-ftcjc" podStartSLOduration=1.808754949 podStartE2EDuration="1.808754949s" podCreationTimestamp="2026-01-28 21:45:00 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-28 21:45:01.801531773 +0000 UTC m=+3929.757718137" watchObservedRunningTime="2026-01-28 21:45:01.808754949 +0000 UTC m=+3929.764941313" Jan 28 21:45:02 crc kubenswrapper[4746]: I0128 21:45:02.797720 4746 generic.go:334] "Generic (PLEG): container finished" podID="6e81e481-7c5b-44bf-8f7d-ae0a582942a5" containerID="f035390f3b7c197dd898fbdbc5b613e2ca0094c2ca90d8b8d629c18a150c34cf" exitCode=0 Jan 28 21:45:02 crc kubenswrapper[4746]: I0128 21:45:02.797825 4746 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29493945-ftcjc" event={"ID":"6e81e481-7c5b-44bf-8f7d-ae0a582942a5","Type":"ContainerDied","Data":"f035390f3b7c197dd898fbdbc5b613e2ca0094c2ca90d8b8d629c18a150c34cf"} Jan 28 21:45:04 crc kubenswrapper[4746]: I0128 21:45:04.836583 4746 scope.go:117] "RemoveContainer" containerID="2c9e3325547a2a9e78b84305cb2707ed2def619d2d50bb736bb2f414ca2be430" Jan 28 21:45:04 crc kubenswrapper[4746]: E0128 21:45:04.837206 4746 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-6wrnw_openshift-machine-config-operator(6dc8b546-9734-4082-b2b3-2bafe3f1564d)\"" pod="openshift-machine-config-operator/machine-config-daemon-6wrnw" 
podUID="6dc8b546-9734-4082-b2b3-2bafe3f1564d" Jan 28 21:45:05 crc kubenswrapper[4746]: I0128 21:45:05.319654 4746 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29493945-ftcjc" Jan 28 21:45:05 crc kubenswrapper[4746]: I0128 21:45:05.408642 4746 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/6e81e481-7c5b-44bf-8f7d-ae0a582942a5-secret-volume\") pod \"6e81e481-7c5b-44bf-8f7d-ae0a582942a5\" (UID: \"6e81e481-7c5b-44bf-8f7d-ae0a582942a5\") " Jan 28 21:45:05 crc kubenswrapper[4746]: I0128 21:45:05.408951 4746 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-d5gqq\" (UniqueName: \"kubernetes.io/projected/6e81e481-7c5b-44bf-8f7d-ae0a582942a5-kube-api-access-d5gqq\") pod \"6e81e481-7c5b-44bf-8f7d-ae0a582942a5\" (UID: \"6e81e481-7c5b-44bf-8f7d-ae0a582942a5\") " Jan 28 21:45:05 crc kubenswrapper[4746]: I0128 21:45:05.409225 4746 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/6e81e481-7c5b-44bf-8f7d-ae0a582942a5-config-volume\") pod \"6e81e481-7c5b-44bf-8f7d-ae0a582942a5\" (UID: \"6e81e481-7c5b-44bf-8f7d-ae0a582942a5\") " Jan 28 21:45:05 crc kubenswrapper[4746]: I0128 21:45:05.410682 4746 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/6e81e481-7c5b-44bf-8f7d-ae0a582942a5-config-volume" (OuterVolumeSpecName: "config-volume") pod "6e81e481-7c5b-44bf-8f7d-ae0a582942a5" (UID: "6e81e481-7c5b-44bf-8f7d-ae0a582942a5"). InnerVolumeSpecName "config-volume". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 28 21:45:05 crc kubenswrapper[4746]: I0128 21:45:05.496158 4746 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/6e81e481-7c5b-44bf-8f7d-ae0a582942a5-kube-api-access-d5gqq" (OuterVolumeSpecName: "kube-api-access-d5gqq") pod "6e81e481-7c5b-44bf-8f7d-ae0a582942a5" (UID: "6e81e481-7c5b-44bf-8f7d-ae0a582942a5"). InnerVolumeSpecName "kube-api-access-d5gqq". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 28 21:45:05 crc kubenswrapper[4746]: I0128 21:45:05.502846 4746 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/6e81e481-7c5b-44bf-8f7d-ae0a582942a5-secret-volume" (OuterVolumeSpecName: "secret-volume") pod "6e81e481-7c5b-44bf-8f7d-ae0a582942a5" (UID: "6e81e481-7c5b-44bf-8f7d-ae0a582942a5"). InnerVolumeSpecName "secret-volume". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 28 21:45:05 crc kubenswrapper[4746]: I0128 21:45:05.512269 4746 reconciler_common.go:293] "Volume detached for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/6e81e481-7c5b-44bf-8f7d-ae0a582942a5-config-volume\") on node \"crc\" DevicePath \"\"" Jan 28 21:45:05 crc kubenswrapper[4746]: I0128 21:45:05.512303 4746 reconciler_common.go:293] "Volume detached for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/6e81e481-7c5b-44bf-8f7d-ae0a582942a5-secret-volume\") on node \"crc\" DevicePath \"\"" Jan 28 21:45:05 crc kubenswrapper[4746]: I0128 21:45:05.512315 4746 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-d5gqq\" (UniqueName: \"kubernetes.io/projected/6e81e481-7c5b-44bf-8f7d-ae0a582942a5-kube-api-access-d5gqq\") on node \"crc\" DevicePath \"\"" Jan 28 21:45:05 crc kubenswrapper[4746]: I0128 21:45:05.826433 4746 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29493945-ftcjc" 
event={"ID":"6e81e481-7c5b-44bf-8f7d-ae0a582942a5","Type":"ContainerDied","Data":"7f1724e1e93066b5a570e8c219b4ede2d076dd3e81eca95c8d8d528d29270c4b"} Jan 28 21:45:05 crc kubenswrapper[4746]: I0128 21:45:05.826674 4746 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="7f1724e1e93066b5a570e8c219b4ede2d076dd3e81eca95c8d8d528d29270c4b" Jan 28 21:45:05 crc kubenswrapper[4746]: I0128 21:45:05.826646 4746 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29493945-ftcjc" Jan 28 21:45:06 crc kubenswrapper[4746]: I0128 21:45:06.487695 4746 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29493900-dxkdw"] Jan 28 21:45:06 crc kubenswrapper[4746]: I0128 21:45:06.504467 4746 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29493900-dxkdw"] Jan 28 21:45:06 crc kubenswrapper[4746]: I0128 21:45:06.848286 4746 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="74f95799-0ea5-46f5-b2e7-7ef3370e9215" path="/var/lib/kubelet/pods/74f95799-0ea5-46f5-b2e7-7ef3370e9215/volumes" Jan 28 21:45:09 crc kubenswrapper[4746]: I0128 21:45:09.053058 4746 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-operators-fbjdp"] Jan 28 21:45:09 crc kubenswrapper[4746]: E0128 21:45:09.053738 4746 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="6e81e481-7c5b-44bf-8f7d-ae0a582942a5" containerName="collect-profiles" Jan 28 21:45:09 crc kubenswrapper[4746]: I0128 21:45:09.053752 4746 state_mem.go:107] "Deleted CPUSet assignment" podUID="6e81e481-7c5b-44bf-8f7d-ae0a582942a5" containerName="collect-profiles" Jan 28 21:45:09 crc kubenswrapper[4746]: I0128 21:45:09.053931 4746 memory_manager.go:354] "RemoveStaleState removing state" podUID="6e81e481-7c5b-44bf-8f7d-ae0a582942a5" containerName="collect-profiles" Jan 
28 21:45:09 crc kubenswrapper[4746]: I0128 21:45:09.056291 4746 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-fbjdp" Jan 28 21:45:09 crc kubenswrapper[4746]: I0128 21:45:09.078641 4746 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-fbjdp"] Jan 28 21:45:09 crc kubenswrapper[4746]: I0128 21:45:09.209987 4746 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/54897ff3-c2b3-450b-97c9-59f657902640-utilities\") pod \"redhat-operators-fbjdp\" (UID: \"54897ff3-c2b3-450b-97c9-59f657902640\") " pod="openshift-marketplace/redhat-operators-fbjdp" Jan 28 21:45:09 crc kubenswrapper[4746]: I0128 21:45:09.210321 4746 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-5vksk\" (UniqueName: \"kubernetes.io/projected/54897ff3-c2b3-450b-97c9-59f657902640-kube-api-access-5vksk\") pod \"redhat-operators-fbjdp\" (UID: \"54897ff3-c2b3-450b-97c9-59f657902640\") " pod="openshift-marketplace/redhat-operators-fbjdp" Jan 28 21:45:09 crc kubenswrapper[4746]: I0128 21:45:09.210472 4746 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/54897ff3-c2b3-450b-97c9-59f657902640-catalog-content\") pod \"redhat-operators-fbjdp\" (UID: \"54897ff3-c2b3-450b-97c9-59f657902640\") " pod="openshift-marketplace/redhat-operators-fbjdp" Jan 28 21:45:09 crc kubenswrapper[4746]: I0128 21:45:09.312236 4746 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-5vksk\" (UniqueName: \"kubernetes.io/projected/54897ff3-c2b3-450b-97c9-59f657902640-kube-api-access-5vksk\") pod \"redhat-operators-fbjdp\" (UID: \"54897ff3-c2b3-450b-97c9-59f657902640\") " pod="openshift-marketplace/redhat-operators-fbjdp" Jan 28 
21:45:09 crc kubenswrapper[4746]: I0128 21:45:09.312337 4746 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/54897ff3-c2b3-450b-97c9-59f657902640-catalog-content\") pod \"redhat-operators-fbjdp\" (UID: \"54897ff3-c2b3-450b-97c9-59f657902640\") " pod="openshift-marketplace/redhat-operators-fbjdp" Jan 28 21:45:09 crc kubenswrapper[4746]: I0128 21:45:09.312444 4746 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/54897ff3-c2b3-450b-97c9-59f657902640-utilities\") pod \"redhat-operators-fbjdp\" (UID: \"54897ff3-c2b3-450b-97c9-59f657902640\") " pod="openshift-marketplace/redhat-operators-fbjdp" Jan 28 21:45:09 crc kubenswrapper[4746]: I0128 21:45:09.312872 4746 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/54897ff3-c2b3-450b-97c9-59f657902640-utilities\") pod \"redhat-operators-fbjdp\" (UID: \"54897ff3-c2b3-450b-97c9-59f657902640\") " pod="openshift-marketplace/redhat-operators-fbjdp" Jan 28 21:45:09 crc kubenswrapper[4746]: I0128 21:45:09.312988 4746 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/54897ff3-c2b3-450b-97c9-59f657902640-catalog-content\") pod \"redhat-operators-fbjdp\" (UID: \"54897ff3-c2b3-450b-97c9-59f657902640\") " pod="openshift-marketplace/redhat-operators-fbjdp" Jan 28 21:45:09 crc kubenswrapper[4746]: I0128 21:45:09.332878 4746 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-5vksk\" (UniqueName: \"kubernetes.io/projected/54897ff3-c2b3-450b-97c9-59f657902640-kube-api-access-5vksk\") pod \"redhat-operators-fbjdp\" (UID: \"54897ff3-c2b3-450b-97c9-59f657902640\") " pod="openshift-marketplace/redhat-operators-fbjdp" Jan 28 21:45:09 crc kubenswrapper[4746]: I0128 21:45:09.380502 4746 util.go:30] "No 
sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-fbjdp" Jan 28 21:45:09 crc kubenswrapper[4746]: I0128 21:45:09.958442 4746 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-fbjdp"] Jan 28 21:45:09 crc kubenswrapper[4746]: W0128 21:45:09.967283 4746 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod54897ff3_c2b3_450b_97c9_59f657902640.slice/crio-9f2404cf89ea73f1c2b47ebfe63f95369aed71885c2fd4a41975199a5a4eea0e WatchSource:0}: Error finding container 9f2404cf89ea73f1c2b47ebfe63f95369aed71885c2fd4a41975199a5a4eea0e: Status 404 returned error can't find the container with id 9f2404cf89ea73f1c2b47ebfe63f95369aed71885c2fd4a41975199a5a4eea0e Jan 28 21:45:10 crc kubenswrapper[4746]: I0128 21:45:10.924868 4746 generic.go:334] "Generic (PLEG): container finished" podID="54897ff3-c2b3-450b-97c9-59f657902640" containerID="ffafd5f22fa77a5ce7fe4d3c526983d51dd9cfd8dba3d60dfd7053a17ed6cd5f" exitCode=0 Jan 28 21:45:10 crc kubenswrapper[4746]: I0128 21:45:10.924941 4746 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-fbjdp" event={"ID":"54897ff3-c2b3-450b-97c9-59f657902640","Type":"ContainerDied","Data":"ffafd5f22fa77a5ce7fe4d3c526983d51dd9cfd8dba3d60dfd7053a17ed6cd5f"} Jan 28 21:45:10 crc kubenswrapper[4746]: I0128 21:45:10.925137 4746 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-fbjdp" event={"ID":"54897ff3-c2b3-450b-97c9-59f657902640","Type":"ContainerStarted","Data":"9f2404cf89ea73f1c2b47ebfe63f95369aed71885c2fd4a41975199a5a4eea0e"} Jan 28 21:45:11 crc kubenswrapper[4746]: I0128 21:45:11.937028 4746 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-fbjdp" 
event={"ID":"54897ff3-c2b3-450b-97c9-59f657902640","Type":"ContainerStarted","Data":"bdee1db41786dcb6e8fd8722ae8a3297045aa7a103c7305124a78d616b2b56cd"} Jan 28 21:45:16 crc kubenswrapper[4746]: I0128 21:45:16.836220 4746 scope.go:117] "RemoveContainer" containerID="2c9e3325547a2a9e78b84305cb2707ed2def619d2d50bb736bb2f414ca2be430" Jan 28 21:45:16 crc kubenswrapper[4746]: E0128 21:45:16.836906 4746 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-6wrnw_openshift-machine-config-operator(6dc8b546-9734-4082-b2b3-2bafe3f1564d)\"" pod="openshift-machine-config-operator/machine-config-daemon-6wrnw" podUID="6dc8b546-9734-4082-b2b3-2bafe3f1564d" Jan 28 21:45:17 crc kubenswrapper[4746]: I0128 21:45:17.995926 4746 generic.go:334] "Generic (PLEG): container finished" podID="54897ff3-c2b3-450b-97c9-59f657902640" containerID="bdee1db41786dcb6e8fd8722ae8a3297045aa7a103c7305124a78d616b2b56cd" exitCode=0 Jan 28 21:45:17 crc kubenswrapper[4746]: I0128 21:45:17.995973 4746 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-fbjdp" event={"ID":"54897ff3-c2b3-450b-97c9-59f657902640","Type":"ContainerDied","Data":"bdee1db41786dcb6e8fd8722ae8a3297045aa7a103c7305124a78d616b2b56cd"} Jan 28 21:45:19 crc kubenswrapper[4746]: I0128 21:45:19.009358 4746 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-fbjdp" event={"ID":"54897ff3-c2b3-450b-97c9-59f657902640","Type":"ContainerStarted","Data":"f5dfbaa8387b5d26520cbae94a0d2f23671d06d7c2fc1eb39d4666f5e2103502"} Jan 28 21:45:19 crc kubenswrapper[4746]: I0128 21:45:19.028544 4746 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-operators-fbjdp" podStartSLOduration=2.550808964 podStartE2EDuration="10.028529192s" 
podCreationTimestamp="2026-01-28 21:45:09 +0000 UTC" firstStartedPulling="2026-01-28 21:45:10.926748335 +0000 UTC m=+3938.882934689" lastFinishedPulling="2026-01-28 21:45:18.404468563 +0000 UTC m=+3946.360654917" observedRunningTime="2026-01-28 21:45:19.02589703 +0000 UTC m=+3946.982083384" watchObservedRunningTime="2026-01-28 21:45:19.028529192 +0000 UTC m=+3946.984715546" Jan 28 21:45:19 crc kubenswrapper[4746]: I0128 21:45:19.380968 4746 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-operators-fbjdp" Jan 28 21:45:19 crc kubenswrapper[4746]: I0128 21:45:19.381029 4746 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-operators-fbjdp" Jan 28 21:45:20 crc kubenswrapper[4746]: I0128 21:45:20.437286 4746 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-marketplace/redhat-operators-fbjdp" podUID="54897ff3-c2b3-450b-97c9-59f657902640" containerName="registry-server" probeResult="failure" output=< Jan 28 21:45:20 crc kubenswrapper[4746]: timeout: failed to connect service ":50051" within 1s Jan 28 21:45:20 crc kubenswrapper[4746]: > Jan 28 21:45:23 crc kubenswrapper[4746]: I0128 21:45:23.711284 4746 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_1dcfae1c51044912e4345cf7baf7051d6a45f7abdb8f477454f0bd2ab0g68tp_c1bfc71e-7105-4567-b92a-37c08b17a97c/util/0.log" Jan 28 21:45:23 crc kubenswrapper[4746]: I0128 21:45:23.842358 4746 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_1dcfae1c51044912e4345cf7baf7051d6a45f7abdb8f477454f0bd2ab0g68tp_c1bfc71e-7105-4567-b92a-37c08b17a97c/util/0.log" Jan 28 21:45:23 crc kubenswrapper[4746]: I0128 21:45:23.939263 4746 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_1dcfae1c51044912e4345cf7baf7051d6a45f7abdb8f477454f0bd2ab0g68tp_c1bfc71e-7105-4567-b92a-37c08b17a97c/pull/0.log" Jan 28 21:45:23 crc kubenswrapper[4746]: I0128 
21:45:23.971061 4746 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_1dcfae1c51044912e4345cf7baf7051d6a45f7abdb8f477454f0bd2ab0g68tp_c1bfc71e-7105-4567-b92a-37c08b17a97c/pull/0.log" Jan 28 21:45:24 crc kubenswrapper[4746]: I0128 21:45:24.592492 4746 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_1dcfae1c51044912e4345cf7baf7051d6a45f7abdb8f477454f0bd2ab0g68tp_c1bfc71e-7105-4567-b92a-37c08b17a97c/util/0.log" Jan 28 21:45:24 crc kubenswrapper[4746]: I0128 21:45:24.747611 4746 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_1dcfae1c51044912e4345cf7baf7051d6a45f7abdb8f477454f0bd2ab0g68tp_c1bfc71e-7105-4567-b92a-37c08b17a97c/extract/0.log" Jan 28 21:45:24 crc kubenswrapper[4746]: I0128 21:45:24.763799 4746 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_1dcfae1c51044912e4345cf7baf7051d6a45f7abdb8f477454f0bd2ab0g68tp_c1bfc71e-7105-4567-b92a-37c08b17a97c/pull/0.log" Jan 28 21:45:24 crc kubenswrapper[4746]: I0128 21:45:24.923842 4746 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_barbican-operator-controller-manager-7f86f8796f-kll6j_3c81bd6e-961b-42ae-8840-2607a13046df/manager/0.log" Jan 28 21:45:25 crc kubenswrapper[4746]: I0128 21:45:25.053474 4746 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_cinder-operator-controller-manager-7478f7dbf9-n6qr7_f86e66ed-9f28-4514-8ff8-97b8353026d1/manager/0.log" Jan 28 21:45:25 crc kubenswrapper[4746]: I0128 21:45:25.246596 4746 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_designate-operator-controller-manager-b45d7bf98-cm85d_63794c40-0128-457d-b223-84e87943cca9/manager/0.log" Jan 28 21:45:25 crc kubenswrapper[4746]: I0128 21:45:25.385366 4746 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openstack-operators_glance-operator-controller-manager-78fdd796fd-bxtxd_fe660f4f-8806-4674-ab58-ea3303f51683/manager/0.log" Jan 28 21:45:25 crc kubenswrapper[4746]: I0128 21:45:25.745857 4746 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_heat-operator-controller-manager-594c8c9d5d-p6qjg_677d2ab0-897d-4fd5-8ca5-b75f310e38da/manager/0.log" Jan 28 21:45:25 crc kubenswrapper[4746]: I0128 21:45:25.755866 4746 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_horizon-operator-controller-manager-77d5c5b54f-ws7k7_760877c4-6e86-4445-a4cf-002b48e93841/manager/0.log" Jan 28 21:45:26 crc kubenswrapper[4746]: I0128 21:45:26.087938 4746 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_ironic-operator-controller-manager-598f7747c9-5lc6j_b44b1510-0a60-4b4e-9541-cc6d18e10a7f/manager/0.log" Jan 28 21:45:26 crc kubenswrapper[4746]: I0128 21:45:26.217857 4746 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_infra-operator-controller-manager-694cf4f878-th2hg_28de2427-e250-44f5-add2-1b738cf6ce3b/manager/0.log" Jan 28 21:45:26 crc kubenswrapper[4746]: I0128 21:45:26.429572 4746 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_keystone-operator-controller-manager-b8b6d4659-65qb5_fc220202-4669-4c2e-94b0-583048b56c83/manager/0.log" Jan 28 21:45:26 crc kubenswrapper[4746]: I0128 21:45:26.462881 4746 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_manila-operator-controller-manager-78c6999f6f-m5qbs_f682c47e-2151-466d-8cc5-9ef0fca79785/manager/0.log" Jan 28 21:45:26 crc kubenswrapper[4746]: I0128 21:45:26.749736 4746 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_neutron-operator-controller-manager-78d58447c5-pcprz_b182a0df-d0f9-46d6-9a0c-a3e332c84cff/manager/0.log" Jan 28 21:45:26 crc kubenswrapper[4746]: I0128 21:45:26.823997 4746 log.go:25] "Finished parsing 
log file" path="/var/log/pods/openstack-operators_mariadb-operator-controller-manager-6b9fb5fdcb-zcvgn_5521c5f5-d2f6-461b-a2fc-ee97a5b2df11/manager/0.log" Jan 28 21:45:27 crc kubenswrapper[4746]: I0128 21:45:27.074801 4746 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_octavia-operator-controller-manager-5f4cd88d46-hb6t9_3b28dc9c-6dcf-4fd1-8cbd-f13d0da9e954/manager/0.log" Jan 28 21:45:27 crc kubenswrapper[4746]: I0128 21:45:27.159627 4746 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_nova-operator-controller-manager-7bdb645866-pg4s4_ced3eeee-ed33-4c50-8531-a7e4df1849f6/manager/0.log" Jan 28 21:45:27 crc kubenswrapper[4746]: I0128 21:45:27.334489 4746 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_openstack-baremetal-operator-controller-manager-6b68b8b85449cmp_3ff4c44c-0290-4ab0-abb8-316375200dc0/manager/0.log" Jan 28 21:45:27 crc kubenswrapper[4746]: I0128 21:45:27.542115 4746 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_openstack-operator-controller-init-58bd5c8549-ggt7x_0e81bc43-baa9-4cbd-a255-233e12e2b84b/operator/0.log" Jan 28 21:45:27 crc kubenswrapper[4746]: I0128 21:45:27.656485 4746 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_openstack-operator-index-hk7sk_161bd1ce-304a-4bcd-9188-568b362f4739/registry-server/0.log" Jan 28 21:45:27 crc kubenswrapper[4746]: I0128 21:45:27.894946 4746 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_ovn-operator-controller-manager-6f75f45d54-kpcqr_e3360f0f-1430-4b7e-9ee0-0a126a9b657d/manager/0.log" Jan 28 21:45:27 crc kubenswrapper[4746]: I0128 21:45:27.994177 4746 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_placement-operator-controller-manager-79d5ccc684-6klzp_370a5739-7af0-4065-986c-af68a265423c/manager/0.log" Jan 28 21:45:28 crc kubenswrapper[4746]: I0128 21:45:28.172390 4746 log.go:25] "Finished 
parsing log file" path="/var/log/pods/openstack-operators_rabbitmq-cluster-operator-manager-668c99d594-8kpht_6b7a0005-11ec-4c8a-87e9-872855585d4d/operator/0.log" Jan 28 21:45:28 crc kubenswrapper[4746]: I0128 21:45:28.322685 4746 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_swift-operator-controller-manager-547cbdb99f-fjs9l_1f4e2d58-bbd0-45d2-81ba-2b4b47ab5af6/manager/0.log" Jan 28 21:45:28 crc kubenswrapper[4746]: I0128 21:45:28.590613 4746 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_test-operator-controller-manager-69797bbcbd-m4x6j_beba987e-69be-47aa-a84c-7ea511c4d151/manager/0.log" Jan 28 21:45:28 crc kubenswrapper[4746]: I0128 21:45:28.757937 4746 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_watcher-operator-controller-manager-564965969-hd4k9_90c190b4-36db-406b-bca5-6c45ac745ed6/manager/0.log" Jan 28 21:45:28 crc kubenswrapper[4746]: I0128 21:45:28.792480 4746 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_openstack-operator-controller-manager-65d466cb7d-vf8n9_a7c2547a-3282-4748-a823-c3a0cc41ad46/manager/0.log" Jan 28 21:45:28 crc kubenswrapper[4746]: I0128 21:45:28.848360 4746 scope.go:117] "RemoveContainer" containerID="2c9e3325547a2a9e78b84305cb2707ed2def619d2d50bb736bb2f414ca2be430" Jan 28 21:45:28 crc kubenswrapper[4746]: E0128 21:45:28.848936 4746 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-6wrnw_openshift-machine-config-operator(6dc8b546-9734-4082-b2b3-2bafe3f1564d)\"" pod="openshift-machine-config-operator/machine-config-daemon-6wrnw" podUID="6dc8b546-9734-4082-b2b3-2bafe3f1564d" Jan 28 21:45:28 crc kubenswrapper[4746]: I0128 21:45:28.922497 4746 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openstack-operators_telemetry-operator-controller-manager-9477bbd48-z984g_e42669f3-6865-4ab6-9a9a-241c7b07509d/manager/0.log" Jan 28 21:45:30 crc kubenswrapper[4746]: I0128 21:45:30.436269 4746 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-marketplace/redhat-operators-fbjdp" podUID="54897ff3-c2b3-450b-97c9-59f657902640" containerName="registry-server" probeResult="failure" output=< Jan 28 21:45:30 crc kubenswrapper[4746]: timeout: failed to connect service ":50051" within 1s Jan 28 21:45:30 crc kubenswrapper[4746]: > Jan 28 21:45:40 crc kubenswrapper[4746]: I0128 21:45:40.435740 4746 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-marketplace/redhat-operators-fbjdp" podUID="54897ff3-c2b3-450b-97c9-59f657902640" containerName="registry-server" probeResult="failure" output=< Jan 28 21:45:40 crc kubenswrapper[4746]: timeout: failed to connect service ":50051" within 1s Jan 28 21:45:40 crc kubenswrapper[4746]: > Jan 28 21:45:42 crc kubenswrapper[4746]: I0128 21:45:42.845210 4746 scope.go:117] "RemoveContainer" containerID="2c9e3325547a2a9e78b84305cb2707ed2def619d2d50bb736bb2f414ca2be430" Jan 28 21:45:42 crc kubenswrapper[4746]: E0128 21:45:42.845721 4746 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-6wrnw_openshift-machine-config-operator(6dc8b546-9734-4082-b2b3-2bafe3f1564d)\"" pod="openshift-machine-config-operator/machine-config-daemon-6wrnw" podUID="6dc8b546-9734-4082-b2b3-2bafe3f1564d" Jan 28 21:45:50 crc kubenswrapper[4746]: I0128 21:45:50.441681 4746 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-marketplace/redhat-operators-fbjdp" podUID="54897ff3-c2b3-450b-97c9-59f657902640" containerName="registry-server" probeResult="failure" output=< Jan 28 21:45:50 crc kubenswrapper[4746]: timeout: failed to connect 
service ":50051" within 1s Jan 28 21:45:50 crc kubenswrapper[4746]: > Jan 28 21:45:52 crc kubenswrapper[4746]: I0128 21:45:52.753452 4746 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-machine-api_control-plane-machine-set-operator-78cbb6b69f-mznpb_72e0847f-0a87-4710-9765-a10282cc0529/control-plane-machine-set-operator/0.log" Jan 28 21:45:52 crc kubenswrapper[4746]: I0128 21:45:52.982849 4746 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-machine-api_machine-api-operator-5694c8668f-lzj8l_3e64bb6e-1131-431b-b87c-71e25d294fe1/kube-rbac-proxy/0.log" Jan 28 21:45:52 crc kubenswrapper[4746]: I0128 21:45:52.995416 4746 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-machine-api_machine-api-operator-5694c8668f-lzj8l_3e64bb6e-1131-431b-b87c-71e25d294fe1/machine-api-operator/0.log" Jan 28 21:45:55 crc kubenswrapper[4746]: I0128 21:45:55.836958 4746 scope.go:117] "RemoveContainer" containerID="2c9e3325547a2a9e78b84305cb2707ed2def619d2d50bb736bb2f414ca2be430" Jan 28 21:45:55 crc kubenswrapper[4746]: E0128 21:45:55.837534 4746 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-6wrnw_openshift-machine-config-operator(6dc8b546-9734-4082-b2b3-2bafe3f1564d)\"" pod="openshift-machine-config-operator/machine-config-daemon-6wrnw" podUID="6dc8b546-9734-4082-b2b3-2bafe3f1564d" Jan 28 21:45:59 crc kubenswrapper[4746]: I0128 21:45:59.442741 4746 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-operators-fbjdp" Jan 28 21:45:59 crc kubenswrapper[4746]: I0128 21:45:59.498059 4746 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-operators-fbjdp" Jan 28 21:45:59 crc kubenswrapper[4746]: I0128 21:45:59.690019 4746 kubelet.go:2437] "SyncLoop DELETE" 
source="api" pods=["openshift-marketplace/redhat-operators-fbjdp"] Jan 28 21:46:00 crc kubenswrapper[4746]: I0128 21:46:00.053591 4746 scope.go:117] "RemoveContainer" containerID="59a82c4c52affbab10dbd5190aec777d3afa26f9c37a8f62e333d4a0dea63385" Jan 28 21:46:01 crc kubenswrapper[4746]: I0128 21:46:01.388479 4746 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-operators-fbjdp" podUID="54897ff3-c2b3-450b-97c9-59f657902640" containerName="registry-server" containerID="cri-o://f5dfbaa8387b5d26520cbae94a0d2f23671d06d7c2fc1eb39d4666f5e2103502" gracePeriod=2 Jan 28 21:46:02 crc kubenswrapper[4746]: I0128 21:46:02.209808 4746 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-fbjdp" Jan 28 21:46:02 crc kubenswrapper[4746]: I0128 21:46:02.294976 4746 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-5vksk\" (UniqueName: \"kubernetes.io/projected/54897ff3-c2b3-450b-97c9-59f657902640-kube-api-access-5vksk\") pod \"54897ff3-c2b3-450b-97c9-59f657902640\" (UID: \"54897ff3-c2b3-450b-97c9-59f657902640\") " Jan 28 21:46:02 crc kubenswrapper[4746]: I0128 21:46:02.295091 4746 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/54897ff3-c2b3-450b-97c9-59f657902640-catalog-content\") pod \"54897ff3-c2b3-450b-97c9-59f657902640\" (UID: \"54897ff3-c2b3-450b-97c9-59f657902640\") " Jan 28 21:46:02 crc kubenswrapper[4746]: I0128 21:46:02.295353 4746 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/54897ff3-c2b3-450b-97c9-59f657902640-utilities\") pod \"54897ff3-c2b3-450b-97c9-59f657902640\" (UID: \"54897ff3-c2b3-450b-97c9-59f657902640\") " Jan 28 21:46:02 crc kubenswrapper[4746]: I0128 21:46:02.296231 4746 operation_generator.go:803] UnmountVolume.TearDown 
succeeded for volume "kubernetes.io/empty-dir/54897ff3-c2b3-450b-97c9-59f657902640-utilities" (OuterVolumeSpecName: "utilities") pod "54897ff3-c2b3-450b-97c9-59f657902640" (UID: "54897ff3-c2b3-450b-97c9-59f657902640"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 28 21:46:02 crc kubenswrapper[4746]: I0128 21:46:02.300905 4746 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/54897ff3-c2b3-450b-97c9-59f657902640-kube-api-access-5vksk" (OuterVolumeSpecName: "kube-api-access-5vksk") pod "54897ff3-c2b3-450b-97c9-59f657902640" (UID: "54897ff3-c2b3-450b-97c9-59f657902640"). InnerVolumeSpecName "kube-api-access-5vksk". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 28 21:46:02 crc kubenswrapper[4746]: I0128 21:46:02.399833 4746 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/54897ff3-c2b3-450b-97c9-59f657902640-utilities\") on node \"crc\" DevicePath \"\"" Jan 28 21:46:02 crc kubenswrapper[4746]: I0128 21:46:02.399873 4746 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-5vksk\" (UniqueName: \"kubernetes.io/projected/54897ff3-c2b3-450b-97c9-59f657902640-kube-api-access-5vksk\") on node \"crc\" DevicePath \"\"" Jan 28 21:46:02 crc kubenswrapper[4746]: I0128 21:46:02.406092 4746 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/54897ff3-c2b3-450b-97c9-59f657902640-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "54897ff3-c2b3-450b-97c9-59f657902640" (UID: "54897ff3-c2b3-450b-97c9-59f657902640"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 28 21:46:02 crc kubenswrapper[4746]: I0128 21:46:02.406174 4746 generic.go:334] "Generic (PLEG): container finished" podID="54897ff3-c2b3-450b-97c9-59f657902640" containerID="f5dfbaa8387b5d26520cbae94a0d2f23671d06d7c2fc1eb39d4666f5e2103502" exitCode=0 Jan 28 21:46:02 crc kubenswrapper[4746]: I0128 21:46:02.406206 4746 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-fbjdp" Jan 28 21:46:02 crc kubenswrapper[4746]: I0128 21:46:02.406208 4746 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-fbjdp" event={"ID":"54897ff3-c2b3-450b-97c9-59f657902640","Type":"ContainerDied","Data":"f5dfbaa8387b5d26520cbae94a0d2f23671d06d7c2fc1eb39d4666f5e2103502"} Jan 28 21:46:02 crc kubenswrapper[4746]: I0128 21:46:02.406258 4746 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-fbjdp" event={"ID":"54897ff3-c2b3-450b-97c9-59f657902640","Type":"ContainerDied","Data":"9f2404cf89ea73f1c2b47ebfe63f95369aed71885c2fd4a41975199a5a4eea0e"} Jan 28 21:46:02 crc kubenswrapper[4746]: I0128 21:46:02.406273 4746 scope.go:117] "RemoveContainer" containerID="f5dfbaa8387b5d26520cbae94a0d2f23671d06d7c2fc1eb39d4666f5e2103502" Jan 28 21:46:02 crc kubenswrapper[4746]: I0128 21:46:02.438867 4746 scope.go:117] "RemoveContainer" containerID="bdee1db41786dcb6e8fd8722ae8a3297045aa7a103c7305124a78d616b2b56cd" Jan 28 21:46:02 crc kubenswrapper[4746]: I0128 21:46:02.444506 4746 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-fbjdp"] Jan 28 21:46:02 crc kubenswrapper[4746]: I0128 21:46:02.456798 4746 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-operators-fbjdp"] Jan 28 21:46:02 crc kubenswrapper[4746]: I0128 21:46:02.464725 4746 scope.go:117] "RemoveContainer" 
containerID="ffafd5f22fa77a5ce7fe4d3c526983d51dd9cfd8dba3d60dfd7053a17ed6cd5f" Jan 28 21:46:02 crc kubenswrapper[4746]: I0128 21:46:02.501832 4746 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/54897ff3-c2b3-450b-97c9-59f657902640-catalog-content\") on node \"crc\" DevicePath \"\"" Jan 28 21:46:02 crc kubenswrapper[4746]: I0128 21:46:02.524430 4746 scope.go:117] "RemoveContainer" containerID="f5dfbaa8387b5d26520cbae94a0d2f23671d06d7c2fc1eb39d4666f5e2103502" Jan 28 21:46:02 crc kubenswrapper[4746]: E0128 21:46:02.525035 4746 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"f5dfbaa8387b5d26520cbae94a0d2f23671d06d7c2fc1eb39d4666f5e2103502\": container with ID starting with f5dfbaa8387b5d26520cbae94a0d2f23671d06d7c2fc1eb39d4666f5e2103502 not found: ID does not exist" containerID="f5dfbaa8387b5d26520cbae94a0d2f23671d06d7c2fc1eb39d4666f5e2103502" Jan 28 21:46:02 crc kubenswrapper[4746]: I0128 21:46:02.525097 4746 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"f5dfbaa8387b5d26520cbae94a0d2f23671d06d7c2fc1eb39d4666f5e2103502"} err="failed to get container status \"f5dfbaa8387b5d26520cbae94a0d2f23671d06d7c2fc1eb39d4666f5e2103502\": rpc error: code = NotFound desc = could not find container \"f5dfbaa8387b5d26520cbae94a0d2f23671d06d7c2fc1eb39d4666f5e2103502\": container with ID starting with f5dfbaa8387b5d26520cbae94a0d2f23671d06d7c2fc1eb39d4666f5e2103502 not found: ID does not exist" Jan 28 21:46:02 crc kubenswrapper[4746]: I0128 21:46:02.525132 4746 scope.go:117] "RemoveContainer" containerID="bdee1db41786dcb6e8fd8722ae8a3297045aa7a103c7305124a78d616b2b56cd" Jan 28 21:46:02 crc kubenswrapper[4746]: E0128 21:46:02.525523 4746 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container 
\"bdee1db41786dcb6e8fd8722ae8a3297045aa7a103c7305124a78d616b2b56cd\": container with ID starting with bdee1db41786dcb6e8fd8722ae8a3297045aa7a103c7305124a78d616b2b56cd not found: ID does not exist" containerID="bdee1db41786dcb6e8fd8722ae8a3297045aa7a103c7305124a78d616b2b56cd" Jan 28 21:46:02 crc kubenswrapper[4746]: I0128 21:46:02.525552 4746 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"bdee1db41786dcb6e8fd8722ae8a3297045aa7a103c7305124a78d616b2b56cd"} err="failed to get container status \"bdee1db41786dcb6e8fd8722ae8a3297045aa7a103c7305124a78d616b2b56cd\": rpc error: code = NotFound desc = could not find container \"bdee1db41786dcb6e8fd8722ae8a3297045aa7a103c7305124a78d616b2b56cd\": container with ID starting with bdee1db41786dcb6e8fd8722ae8a3297045aa7a103c7305124a78d616b2b56cd not found: ID does not exist" Jan 28 21:46:02 crc kubenswrapper[4746]: I0128 21:46:02.525566 4746 scope.go:117] "RemoveContainer" containerID="ffafd5f22fa77a5ce7fe4d3c526983d51dd9cfd8dba3d60dfd7053a17ed6cd5f" Jan 28 21:46:02 crc kubenswrapper[4746]: E0128 21:46:02.525767 4746 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"ffafd5f22fa77a5ce7fe4d3c526983d51dd9cfd8dba3d60dfd7053a17ed6cd5f\": container with ID starting with ffafd5f22fa77a5ce7fe4d3c526983d51dd9cfd8dba3d60dfd7053a17ed6cd5f not found: ID does not exist" containerID="ffafd5f22fa77a5ce7fe4d3c526983d51dd9cfd8dba3d60dfd7053a17ed6cd5f" Jan 28 21:46:02 crc kubenswrapper[4746]: I0128 21:46:02.525791 4746 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"ffafd5f22fa77a5ce7fe4d3c526983d51dd9cfd8dba3d60dfd7053a17ed6cd5f"} err="failed to get container status \"ffafd5f22fa77a5ce7fe4d3c526983d51dd9cfd8dba3d60dfd7053a17ed6cd5f\": rpc error: code = NotFound desc = could not find container \"ffafd5f22fa77a5ce7fe4d3c526983d51dd9cfd8dba3d60dfd7053a17ed6cd5f\": container with ID 
starting with ffafd5f22fa77a5ce7fe4d3c526983d51dd9cfd8dba3d60dfd7053a17ed6cd5f not found: ID does not exist" Jan 28 21:46:02 crc kubenswrapper[4746]: I0128 21:46:02.850218 4746 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="54897ff3-c2b3-450b-97c9-59f657902640" path="/var/lib/kubelet/pods/54897ff3-c2b3-450b-97c9-59f657902640/volumes" Jan 28 21:46:06 crc kubenswrapper[4746]: I0128 21:46:06.836319 4746 scope.go:117] "RemoveContainer" containerID="2c9e3325547a2a9e78b84305cb2707ed2def619d2d50bb736bb2f414ca2be430" Jan 28 21:46:06 crc kubenswrapper[4746]: E0128 21:46:06.837112 4746 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-6wrnw_openshift-machine-config-operator(6dc8b546-9734-4082-b2b3-2bafe3f1564d)\"" pod="openshift-machine-config-operator/machine-config-daemon-6wrnw" podUID="6dc8b546-9734-4082-b2b3-2bafe3f1564d" Jan 28 21:46:08 crc kubenswrapper[4746]: I0128 21:46:08.304345 4746 log.go:25] "Finished parsing log file" path="/var/log/pods/cert-manager_cert-manager-858654f9db-5dcxq_ca33d567-a88a-4cad-b323-ffbb4ac0e02e/cert-manager-controller/0.log" Jan 28 21:46:08 crc kubenswrapper[4746]: I0128 21:46:08.483936 4746 log.go:25] "Finished parsing log file" path="/var/log/pods/cert-manager_cert-manager-cainjector-cf98fcc89-m56r8_e669e571-cde2-4753-a233-bd4ff6c76f02/cert-manager-cainjector/0.log" Jan 28 21:46:08 crc kubenswrapper[4746]: I0128 21:46:08.605745 4746 log.go:25] "Finished parsing log file" path="/var/log/pods/cert-manager_cert-manager-webhook-687f57d79b-bzzv6_6ff603d5-0f8d-415a-8616-55be576956bf/cert-manager-webhook/0.log" Jan 28 21:46:18 crc kubenswrapper[4746]: I0128 21:46:18.837393 4746 scope.go:117] "RemoveContainer" containerID="2c9e3325547a2a9e78b84305cb2707ed2def619d2d50bb736bb2f414ca2be430" Jan 28 21:46:18 crc kubenswrapper[4746]: E0128 
21:46:18.838180 4746 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-6wrnw_openshift-machine-config-operator(6dc8b546-9734-4082-b2b3-2bafe3f1564d)\"" pod="openshift-machine-config-operator/machine-config-daemon-6wrnw" podUID="6dc8b546-9734-4082-b2b3-2bafe3f1564d" Jan 28 21:46:24 crc kubenswrapper[4746]: I0128 21:46:24.298506 4746 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-nmstate_nmstate-handler-sbwgb_30f828e0-bffb-4b84-be14-53eac55a3ca3/nmstate-handler/0.log" Jan 28 21:46:24 crc kubenswrapper[4746]: I0128 21:46:24.303726 4746 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-nmstate_nmstate-console-plugin-7754f76f8b-4s5cl_6078a6ee-9b98-476e-89f3-5430a34e7ec9/nmstate-console-plugin/0.log" Jan 28 21:46:24 crc kubenswrapper[4746]: I0128 21:46:24.462294 4746 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-nmstate_nmstate-metrics-54757c584b-v6k5d_48781ec4-e4a7-402c-a111-22310cfe0305/kube-rbac-proxy/0.log" Jan 28 21:46:24 crc kubenswrapper[4746]: I0128 21:46:24.511142 4746 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-nmstate_nmstate-metrics-54757c584b-v6k5d_48781ec4-e4a7-402c-a111-22310cfe0305/nmstate-metrics/0.log" Jan 28 21:46:24 crc kubenswrapper[4746]: I0128 21:46:24.689687 4746 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-nmstate_nmstate-operator-646758c888-x2kbm_e8f9251b-82e6-4fde-8a14-4430af400661/nmstate-operator/0.log" Jan 28 21:46:24 crc kubenswrapper[4746]: I0128 21:46:24.728728 4746 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-nmstate_nmstate-webhook-8474b5b9d8-vffsx_4108ee2d-3096-4956-95b4-7c2b8327175c/nmstate-webhook/0.log" Jan 28 21:46:30 crc kubenswrapper[4746]: I0128 21:46:30.836364 4746 scope.go:117] "RemoveContainer" 
containerID="2c9e3325547a2a9e78b84305cb2707ed2def619d2d50bb736bb2f414ca2be430" Jan 28 21:46:30 crc kubenswrapper[4746]: E0128 21:46:30.837026 4746 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-6wrnw_openshift-machine-config-operator(6dc8b546-9734-4082-b2b3-2bafe3f1564d)\"" pod="openshift-machine-config-operator/machine-config-daemon-6wrnw" podUID="6dc8b546-9734-4082-b2b3-2bafe3f1564d" Jan 28 21:46:38 crc kubenswrapper[4746]: I0128 21:46:38.922385 4746 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-operators-redhat_loki-operator-controller-manager-6866b6794-24l8g_cfda6c5a-4e09-4579-9149-ba5c87aaf387/kube-rbac-proxy/0.log" Jan 28 21:46:38 crc kubenswrapper[4746]: I0128 21:46:38.994543 4746 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-operators-redhat_loki-operator-controller-manager-6866b6794-24l8g_cfda6c5a-4e09-4579-9149-ba5c87aaf387/manager/0.log" Jan 28 21:46:42 crc kubenswrapper[4746]: I0128 21:46:42.843494 4746 scope.go:117] "RemoveContainer" containerID="2c9e3325547a2a9e78b84305cb2707ed2def619d2d50bb736bb2f414ca2be430" Jan 28 21:46:42 crc kubenswrapper[4746]: E0128 21:46:42.844356 4746 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-6wrnw_openshift-machine-config-operator(6dc8b546-9734-4082-b2b3-2bafe3f1564d)\"" pod="openshift-machine-config-operator/machine-config-daemon-6wrnw" podUID="6dc8b546-9734-4082-b2b3-2bafe3f1564d" Jan 28 21:46:54 crc kubenswrapper[4746]: I0128 21:46:54.928326 4746 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openshift-operators_obo-prometheus-operator-68bc856cb9-sx9pg_0e1b10c8-2491-403a-9ea3-9805d8167d7a/prometheus-operator/0.log" Jan 28 21:46:55 crc kubenswrapper[4746]: I0128 21:46:55.735959 4746 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-operators_obo-prometheus-operator-admission-webhook-565bff74c4-stlnx_09345bfc-4171-49c5-85e3-32616db6ce17/prometheus-operator-admission-webhook/0.log" Jan 28 21:46:55 crc kubenswrapper[4746]: I0128 21:46:55.796100 4746 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-operators_obo-prometheus-operator-admission-webhook-565bff74c4-9nt8k_10acdec7-69f6-42e1-b065-c84b8d82fd03/prometheus-operator-admission-webhook/0.log" Jan 28 21:46:55 crc kubenswrapper[4746]: I0128 21:46:55.836343 4746 scope.go:117] "RemoveContainer" containerID="2c9e3325547a2a9e78b84305cb2707ed2def619d2d50bb736bb2f414ca2be430" Jan 28 21:46:55 crc kubenswrapper[4746]: E0128 21:46:55.836613 4746 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-6wrnw_openshift-machine-config-operator(6dc8b546-9734-4082-b2b3-2bafe3f1564d)\"" pod="openshift-machine-config-operator/machine-config-daemon-6wrnw" podUID="6dc8b546-9734-4082-b2b3-2bafe3f1564d" Jan 28 21:46:55 crc kubenswrapper[4746]: I0128 21:46:55.957905 4746 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-operators_observability-operator-59bdc8b94-m2mx9_2788b8ac-4eb0-46cb-8861-c55d6b302dd7/operator/0.log" Jan 28 21:46:55 crc kubenswrapper[4746]: I0128 21:46:55.986234 4746 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-operators_perses-operator-5bf474d74f-jnzwc_f13f3a63-44b1-4644-8bea-99e25a6764c3/perses-operator/0.log" Jan 28 21:47:09 crc kubenswrapper[4746]: I0128 21:47:09.836467 4746 scope.go:117] "RemoveContainer" 
containerID="2c9e3325547a2a9e78b84305cb2707ed2def619d2d50bb736bb2f414ca2be430" Jan 28 21:47:09 crc kubenswrapper[4746]: E0128 21:47:09.837055 4746 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-6wrnw_openshift-machine-config-operator(6dc8b546-9734-4082-b2b3-2bafe3f1564d)\"" pod="openshift-machine-config-operator/machine-config-daemon-6wrnw" podUID="6dc8b546-9734-4082-b2b3-2bafe3f1564d" Jan 28 21:47:10 crc kubenswrapper[4746]: I0128 21:47:10.776523 4746 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_controller-6968d8fdc4-r2vlm_b5150ca9-e86d-4087-bc5d-c2dd26234ecd/kube-rbac-proxy/0.log" Jan 28 21:47:10 crc kubenswrapper[4746]: I0128 21:47:10.843390 4746 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_controller-6968d8fdc4-r2vlm_b5150ca9-e86d-4087-bc5d-c2dd26234ecd/controller/0.log" Jan 28 21:47:11 crc kubenswrapper[4746]: I0128 21:47:11.021633 4746 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-hws9w_be75902a-e591-4378-89b8-9cab1f53dc5f/cp-frr-files/0.log" Jan 28 21:47:11 crc kubenswrapper[4746]: I0128 21:47:11.156445 4746 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-hws9w_be75902a-e591-4378-89b8-9cab1f53dc5f/cp-frr-files/0.log" Jan 28 21:47:11 crc kubenswrapper[4746]: I0128 21:47:11.182645 4746 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-hws9w_be75902a-e591-4378-89b8-9cab1f53dc5f/cp-reloader/0.log" Jan 28 21:47:11 crc kubenswrapper[4746]: I0128 21:47:11.210043 4746 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-hws9w_be75902a-e591-4378-89b8-9cab1f53dc5f/cp-metrics/0.log" Jan 28 21:47:11 crc kubenswrapper[4746]: I0128 21:47:11.242048 4746 log.go:25] "Finished parsing log file" 
path="/var/log/pods/metallb-system_frr-k8s-hws9w_be75902a-e591-4378-89b8-9cab1f53dc5f/cp-reloader/0.log" Jan 28 21:47:11 crc kubenswrapper[4746]: I0128 21:47:11.465656 4746 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-hws9w_be75902a-e591-4378-89b8-9cab1f53dc5f/cp-reloader/0.log" Jan 28 21:47:11 crc kubenswrapper[4746]: I0128 21:47:11.500621 4746 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-hws9w_be75902a-e591-4378-89b8-9cab1f53dc5f/cp-metrics/0.log" Jan 28 21:47:11 crc kubenswrapper[4746]: I0128 21:47:11.513677 4746 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-hws9w_be75902a-e591-4378-89b8-9cab1f53dc5f/cp-frr-files/0.log" Jan 28 21:47:11 crc kubenswrapper[4746]: I0128 21:47:11.518401 4746 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-hws9w_be75902a-e591-4378-89b8-9cab1f53dc5f/cp-metrics/0.log" Jan 28 21:47:11 crc kubenswrapper[4746]: I0128 21:47:11.774734 4746 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-hws9w_be75902a-e591-4378-89b8-9cab1f53dc5f/cp-metrics/0.log" Jan 28 21:47:11 crc kubenswrapper[4746]: I0128 21:47:11.774770 4746 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-hws9w_be75902a-e591-4378-89b8-9cab1f53dc5f/cp-frr-files/0.log" Jan 28 21:47:11 crc kubenswrapper[4746]: I0128 21:47:11.791465 4746 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-hws9w_be75902a-e591-4378-89b8-9cab1f53dc5f/cp-reloader/0.log" Jan 28 21:47:11 crc kubenswrapper[4746]: I0128 21:47:11.803378 4746 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-hws9w_be75902a-e591-4378-89b8-9cab1f53dc5f/controller/0.log" Jan 28 21:47:11 crc kubenswrapper[4746]: I0128 21:47:11.958169 4746 log.go:25] "Finished parsing log file" 
path="/var/log/pods/metallb-system_frr-k8s-hws9w_be75902a-e591-4378-89b8-9cab1f53dc5f/kube-rbac-proxy/0.log" Jan 28 21:47:12 crc kubenswrapper[4746]: I0128 21:47:12.023632 4746 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-hws9w_be75902a-e591-4378-89b8-9cab1f53dc5f/kube-rbac-proxy-frr/0.log" Jan 28 21:47:12 crc kubenswrapper[4746]: I0128 21:47:12.049048 4746 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-hws9w_be75902a-e591-4378-89b8-9cab1f53dc5f/frr-metrics/0.log" Jan 28 21:47:12 crc kubenswrapper[4746]: I0128 21:47:12.215345 4746 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-hws9w_be75902a-e591-4378-89b8-9cab1f53dc5f/reloader/0.log" Jan 28 21:47:12 crc kubenswrapper[4746]: I0128 21:47:12.274124 4746 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-webhook-server-7df86c4f6c-5crvf_64704f76-28dc-42cf-a696-9473b337eee9/frr-k8s-webhook-server/0.log" Jan 28 21:47:12 crc kubenswrapper[4746]: I0128 21:47:12.515524 4746 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_metallb-operator-controller-manager-5999cb5f6c-ndf7t_eb7a3d58-a895-43a6-8f29-240cfb61ed98/manager/0.log" Jan 28 21:47:12 crc kubenswrapper[4746]: I0128 21:47:12.716307 4746 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_metallb-operator-webhook-server-79f4bb6c4-wm9hh_1d703849-bf20-4333-9213-23b52999ae43/webhook-server/0.log" Jan 28 21:47:12 crc kubenswrapper[4746]: I0128 21:47:12.840618 4746 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_speaker-m55jn_c3d285c6-0abf-4c0b-92f5-1c91659d1de1/kube-rbac-proxy/0.log" Jan 28 21:47:13 crc kubenswrapper[4746]: I0128 21:47:13.477393 4746 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_speaker-m55jn_c3d285c6-0abf-4c0b-92f5-1c91659d1de1/speaker/0.log" Jan 28 21:47:13 crc kubenswrapper[4746]: I0128 21:47:13.570333 4746 log.go:25] 
"Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-hws9w_be75902a-e591-4378-89b8-9cab1f53dc5f/frr/0.log" Jan 28 21:47:23 crc kubenswrapper[4746]: I0128 21:47:23.836735 4746 scope.go:117] "RemoveContainer" containerID="2c9e3325547a2a9e78b84305cb2707ed2def619d2d50bb736bb2f414ca2be430" Jan 28 21:47:23 crc kubenswrapper[4746]: E0128 21:47:23.840586 4746 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-6wrnw_openshift-machine-config-operator(6dc8b546-9734-4082-b2b3-2bafe3f1564d)\"" pod="openshift-machine-config-operator/machine-config-daemon-6wrnw" podUID="6dc8b546-9734-4082-b2b3-2bafe3f1564d" Jan 28 21:47:29 crc kubenswrapper[4746]: I0128 21:47:29.307861 4746 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_270996307cd21d144be796860235064b5127c2fcf62ccccd6689c259dcjnv2x_b238ee1e-c43f-4eb7-8f69-de9f58747168/util/0.log" Jan 28 21:47:29 crc kubenswrapper[4746]: I0128 21:47:29.553042 4746 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_270996307cd21d144be796860235064b5127c2fcf62ccccd6689c259dcjnv2x_b238ee1e-c43f-4eb7-8f69-de9f58747168/pull/0.log" Jan 28 21:47:29 crc kubenswrapper[4746]: I0128 21:47:29.584367 4746 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_270996307cd21d144be796860235064b5127c2fcf62ccccd6689c259dcjnv2x_b238ee1e-c43f-4eb7-8f69-de9f58747168/pull/0.log" Jan 28 21:47:29 crc kubenswrapper[4746]: I0128 21:47:29.598433 4746 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_270996307cd21d144be796860235064b5127c2fcf62ccccd6689c259dcjnv2x_b238ee1e-c43f-4eb7-8f69-de9f58747168/util/0.log" Jan 28 21:47:29 crc kubenswrapper[4746]: I0128 21:47:29.839760 4746 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openshift-marketplace_270996307cd21d144be796860235064b5127c2fcf62ccccd6689c259dcjnv2x_b238ee1e-c43f-4eb7-8f69-de9f58747168/util/0.log" Jan 28 21:47:29 crc kubenswrapper[4746]: I0128 21:47:29.898905 4746 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_270996307cd21d144be796860235064b5127c2fcf62ccccd6689c259dcjnv2x_b238ee1e-c43f-4eb7-8f69-de9f58747168/extract/0.log" Jan 28 21:47:29 crc kubenswrapper[4746]: I0128 21:47:29.899037 4746 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_270996307cd21d144be796860235064b5127c2fcf62ccccd6689c259dcjnv2x_b238ee1e-c43f-4eb7-8f69-de9f58747168/pull/0.log" Jan 28 21:47:30 crc kubenswrapper[4746]: I0128 21:47:30.018006 4746 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_3e572a74f8b8ca2bcfe04329d4f26bd9689911be5d166a7403bd6ae773vp7fg_2e9f38fa-c869-458e-8a7c-fb15ec9acccd/util/0.log" Jan 28 21:47:30 crc kubenswrapper[4746]: I0128 21:47:30.238069 4746 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_3e572a74f8b8ca2bcfe04329d4f26bd9689911be5d166a7403bd6ae773vp7fg_2e9f38fa-c869-458e-8a7c-fb15ec9acccd/util/0.log" Jan 28 21:47:30 crc kubenswrapper[4746]: I0128 21:47:30.300338 4746 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_3e572a74f8b8ca2bcfe04329d4f26bd9689911be5d166a7403bd6ae773vp7fg_2e9f38fa-c869-458e-8a7c-fb15ec9acccd/pull/0.log" Jan 28 21:47:30 crc kubenswrapper[4746]: I0128 21:47:30.307645 4746 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_3e572a74f8b8ca2bcfe04329d4f26bd9689911be5d166a7403bd6ae773vp7fg_2e9f38fa-c869-458e-8a7c-fb15ec9acccd/pull/0.log" Jan 28 21:47:30 crc kubenswrapper[4746]: I0128 21:47:30.477725 4746 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_3e572a74f8b8ca2bcfe04329d4f26bd9689911be5d166a7403bd6ae773vp7fg_2e9f38fa-c869-458e-8a7c-fb15ec9acccd/util/0.log" Jan 28 
21:47:30 crc kubenswrapper[4746]: I0128 21:47:30.478503 4746 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_3e572a74f8b8ca2bcfe04329d4f26bd9689911be5d166a7403bd6ae773vp7fg_2e9f38fa-c869-458e-8a7c-fb15ec9acccd/extract/0.log" Jan 28 21:47:30 crc kubenswrapper[4746]: I0128 21:47:30.528270 4746 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_3e572a74f8b8ca2bcfe04329d4f26bd9689911be5d166a7403bd6ae773vp7fg_2e9f38fa-c869-458e-8a7c-fb15ec9acccd/pull/0.log" Jan 28 21:47:30 crc kubenswrapper[4746]: I0128 21:47:30.688773 4746 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_53efe8611d43ac2275911d954e05efbbba7920a530aff9253ed1cec713xz8cr_743dd52c-2031-4ffc-a4f2-57dfa9438e4e/util/0.log" Jan 28 21:47:31 crc kubenswrapper[4746]: I0128 21:47:31.404302 4746 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_53efe8611d43ac2275911d954e05efbbba7920a530aff9253ed1cec713xz8cr_743dd52c-2031-4ffc-a4f2-57dfa9438e4e/pull/0.log" Jan 28 21:47:31 crc kubenswrapper[4746]: I0128 21:47:31.417596 4746 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_53efe8611d43ac2275911d954e05efbbba7920a530aff9253ed1cec713xz8cr_743dd52c-2031-4ffc-a4f2-57dfa9438e4e/pull/0.log" Jan 28 21:47:31 crc kubenswrapper[4746]: I0128 21:47:31.469885 4746 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_53efe8611d43ac2275911d954e05efbbba7920a530aff9253ed1cec713xz8cr_743dd52c-2031-4ffc-a4f2-57dfa9438e4e/util/0.log" Jan 28 21:47:31 crc kubenswrapper[4746]: I0128 21:47:31.632999 4746 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_53efe8611d43ac2275911d954e05efbbba7920a530aff9253ed1cec713xz8cr_743dd52c-2031-4ffc-a4f2-57dfa9438e4e/util/0.log" Jan 28 21:47:31 crc kubenswrapper[4746]: I0128 21:47:31.675822 4746 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openshift-marketplace_53efe8611d43ac2275911d954e05efbbba7920a530aff9253ed1cec713xz8cr_743dd52c-2031-4ffc-a4f2-57dfa9438e4e/extract/0.log" Jan 28 21:47:31 crc kubenswrapper[4746]: I0128 21:47:31.681749 4746 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_53efe8611d43ac2275911d954e05efbbba7920a530aff9253ed1cec713xz8cr_743dd52c-2031-4ffc-a4f2-57dfa9438e4e/pull/0.log" Jan 28 21:47:31 crc kubenswrapper[4746]: I0128 21:47:31.916332 4746 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_98629960b44b381d1a86cff1d1439a8df43509c9ad24579158c59d0f08m6kc8_3e93805e-6f5f-4618-b962-d9fca6cfe272/util/0.log" Jan 28 21:47:32 crc kubenswrapper[4746]: I0128 21:47:32.204508 4746 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_98629960b44b381d1a86cff1d1439a8df43509c9ad24579158c59d0f08m6kc8_3e93805e-6f5f-4618-b962-d9fca6cfe272/pull/0.log" Jan 28 21:47:32 crc kubenswrapper[4746]: I0128 21:47:32.204705 4746 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_98629960b44b381d1a86cff1d1439a8df43509c9ad24579158c59d0f08m6kc8_3e93805e-6f5f-4618-b962-d9fca6cfe272/util/0.log" Jan 28 21:47:32 crc kubenswrapper[4746]: I0128 21:47:32.209269 4746 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_98629960b44b381d1a86cff1d1439a8df43509c9ad24579158c59d0f08m6kc8_3e93805e-6f5f-4618-b962-d9fca6cfe272/pull/0.log" Jan 28 21:47:32 crc kubenswrapper[4746]: I0128 21:47:32.892414 4746 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_98629960b44b381d1a86cff1d1439a8df43509c9ad24579158c59d0f08m6kc8_3e93805e-6f5f-4618-b962-d9fca6cfe272/util/0.log" Jan 28 21:47:32 crc kubenswrapper[4746]: I0128 21:47:32.921943 4746 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_98629960b44b381d1a86cff1d1439a8df43509c9ad24579158c59d0f08m6kc8_3e93805e-6f5f-4618-b962-d9fca6cfe272/extract/0.log" Jan 
28 21:47:32 crc kubenswrapper[4746]: I0128 21:47:32.969094 4746 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_98629960b44b381d1a86cff1d1439a8df43509c9ad24579158c59d0f08m6kc8_3e93805e-6f5f-4618-b962-d9fca6cfe272/pull/0.log"
Jan 28 21:47:33 crc kubenswrapper[4746]: I0128 21:47:33.181922 4746 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_certified-operators-s6mqb_f854deb7-4783-4ba8-8357-ffe4d1124a12/extract-utilities/0.log"
Jan 28 21:47:33 crc kubenswrapper[4746]: I0128 21:47:33.287251 4746 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_certified-operators-s6mqb_f854deb7-4783-4ba8-8357-ffe4d1124a12/extract-utilities/0.log"
Jan 28 21:47:33 crc kubenswrapper[4746]: I0128 21:47:33.294726 4746 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_certified-operators-s6mqb_f854deb7-4783-4ba8-8357-ffe4d1124a12/extract-content/0.log"
Jan 28 21:47:33 crc kubenswrapper[4746]: I0128 21:47:33.331808 4746 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_certified-operators-s6mqb_f854deb7-4783-4ba8-8357-ffe4d1124a12/extract-content/0.log"
Jan 28 21:47:33 crc kubenswrapper[4746]: I0128 21:47:33.507992 4746 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_certified-operators-s6mqb_f854deb7-4783-4ba8-8357-ffe4d1124a12/extract-content/0.log"
Jan 28 21:47:33 crc kubenswrapper[4746]: I0128 21:47:33.514167 4746 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_certified-operators-s6mqb_f854deb7-4783-4ba8-8357-ffe4d1124a12/extract-utilities/0.log"
Jan 28 21:47:33 crc kubenswrapper[4746]: I0128 21:47:33.532690 4746 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_community-operators-88s2s_2f7998d2-06e5-4567-8502-79dadd37daec/extract-utilities/0.log"
Jan 28 21:47:33 crc kubenswrapper[4746]: I0128 21:47:33.779534 4746 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_community-operators-88s2s_2f7998d2-06e5-4567-8502-79dadd37daec/extract-content/0.log"
Jan 28 21:47:33 crc kubenswrapper[4746]: I0128 21:47:33.788731 4746 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_community-operators-88s2s_2f7998d2-06e5-4567-8502-79dadd37daec/extract-content/0.log"
Jan 28 21:47:33 crc kubenswrapper[4746]: I0128 21:47:33.842975 4746 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_community-operators-88s2s_2f7998d2-06e5-4567-8502-79dadd37daec/extract-utilities/0.log"
Jan 28 21:47:34 crc kubenswrapper[4746]: I0128 21:47:34.012199 4746 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_community-operators-88s2s_2f7998d2-06e5-4567-8502-79dadd37daec/extract-content/0.log"
Jan 28 21:47:34 crc kubenswrapper[4746]: I0128 21:47:34.048787 4746 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_community-operators-88s2s_2f7998d2-06e5-4567-8502-79dadd37daec/extract-utilities/0.log"
Jan 28 21:47:34 crc kubenswrapper[4746]: I0128 21:47:34.057346 4746 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_certified-operators-s6mqb_f854deb7-4783-4ba8-8357-ffe4d1124a12/registry-server/0.log"
Jan 28 21:47:34 crc kubenswrapper[4746]: I0128 21:47:34.345801 4746 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-marketplace-525kj_13f17d99-49bb-4710-8d6b-e01933b5d396/extract-utilities/0.log"
Jan 28 21:47:34 crc kubenswrapper[4746]: I0128 21:47:34.372038 4746 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_marketplace-operator-79b997595-bgtlc_6663df81-0144-46d7-90a2-a1ff5edb9474/marketplace-operator/0.log"
Jan 28 21:47:34 crc kubenswrapper[4746]: I0128 21:47:34.538046 4746 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_community-operators-88s2s_2f7998d2-06e5-4567-8502-79dadd37daec/registry-server/0.log"
Jan 28 21:47:34 crc kubenswrapper[4746]: I0128 21:47:34.647317 4746 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-marketplace-525kj_13f17d99-49bb-4710-8d6b-e01933b5d396/extract-utilities/0.log"
Jan 28 21:47:34 crc kubenswrapper[4746]: I0128 21:47:34.670649 4746 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-marketplace-525kj_13f17d99-49bb-4710-8d6b-e01933b5d396/extract-content/0.log"
Jan 28 21:47:34 crc kubenswrapper[4746]: I0128 21:47:34.700712 4746 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-marketplace-525kj_13f17d99-49bb-4710-8d6b-e01933b5d396/extract-content/0.log"
Jan 28 21:47:34 crc kubenswrapper[4746]: I0128 21:47:34.886883 4746 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-marketplace-525kj_13f17d99-49bb-4710-8d6b-e01933b5d396/extract-utilities/0.log"
Jan 28 21:47:34 crc kubenswrapper[4746]: I0128 21:47:34.968580 4746 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-marketplace-525kj_13f17d99-49bb-4710-8d6b-e01933b5d396/extract-content/0.log"
Jan 28 21:47:34 crc kubenswrapper[4746]: I0128 21:47:34.974716 4746 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-operators-hp8gs_b5a70df4-251e-4d72-b220-9772f0b70727/extract-utilities/0.log"
Jan 28 21:47:35 crc kubenswrapper[4746]: I0128 21:47:35.027283 4746 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-marketplace-525kj_13f17d99-49bb-4710-8d6b-e01933b5d396/registry-server/0.log"
Jan 28 21:47:35 crc kubenswrapper[4746]: I0128 21:47:35.180127 4746 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-operators-hp8gs_b5a70df4-251e-4d72-b220-9772f0b70727/extract-utilities/0.log"
Jan 28 21:47:35 crc kubenswrapper[4746]: I0128 21:47:35.186051 4746 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-operators-hp8gs_b5a70df4-251e-4d72-b220-9772f0b70727/extract-content/0.log"
Jan 28 21:47:35 crc kubenswrapper[4746]: I0128 21:47:35.233306 4746 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-operators-hp8gs_b5a70df4-251e-4d72-b220-9772f0b70727/extract-content/0.log"
Jan 28 21:47:35 crc kubenswrapper[4746]: I0128 21:47:35.507621 4746 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-operators-hp8gs_b5a70df4-251e-4d72-b220-9772f0b70727/extract-content/0.log"
Jan 28 21:47:35 crc kubenswrapper[4746]: I0128 21:47:35.524260 4746 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-operators-hp8gs_b5a70df4-251e-4d72-b220-9772f0b70727/extract-utilities/0.log"
Jan 28 21:47:35 crc kubenswrapper[4746]: I0128 21:47:35.835603 4746 scope.go:117] "RemoveContainer" containerID="2c9e3325547a2a9e78b84305cb2707ed2def619d2d50bb736bb2f414ca2be430"
Jan 28 21:47:35 crc kubenswrapper[4746]: E0128 21:47:35.835869 4746 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-6wrnw_openshift-machine-config-operator(6dc8b546-9734-4082-b2b3-2bafe3f1564d)\"" pod="openshift-machine-config-operator/machine-config-daemon-6wrnw" podUID="6dc8b546-9734-4082-b2b3-2bafe3f1564d"
Jan 28 21:47:36 crc kubenswrapper[4746]: I0128 21:47:36.011569 4746 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-operators-hp8gs_b5a70df4-251e-4d72-b220-9772f0b70727/registry-server/0.log"
Jan 28 21:47:48 crc kubenswrapper[4746]: I0128 21:47:48.844845 4746 scope.go:117] "RemoveContainer" containerID="2c9e3325547a2a9e78b84305cb2707ed2def619d2d50bb736bb2f414ca2be430"
Jan 28 21:47:48 crc kubenswrapper[4746]: E0128 21:47:48.845719 4746 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-6wrnw_openshift-machine-config-operator(6dc8b546-9734-4082-b2b3-2bafe3f1564d)\"" pod="openshift-machine-config-operator/machine-config-daemon-6wrnw" podUID="6dc8b546-9734-4082-b2b3-2bafe3f1564d"
Jan 28 21:47:51 crc kubenswrapper[4746]: I0128 21:47:51.827501 4746 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-operators_obo-prometheus-operator-admission-webhook-565bff74c4-9nt8k_10acdec7-69f6-42e1-b065-c84b8d82fd03/prometheus-operator-admission-webhook/0.log"
Jan 28 21:47:51 crc kubenswrapper[4746]: I0128 21:47:51.881897 4746 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-operators_obo-prometheus-operator-68bc856cb9-sx9pg_0e1b10c8-2491-403a-9ea3-9805d8167d7a/prometheus-operator/0.log"
Jan 28 21:47:51 crc kubenswrapper[4746]: I0128 21:47:51.928965 4746 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-operators_obo-prometheus-operator-admission-webhook-565bff74c4-stlnx_09345bfc-4171-49c5-85e3-32616db6ce17/prometheus-operator-admission-webhook/0.log"
Jan 28 21:47:52 crc kubenswrapper[4746]: I0128 21:47:52.101874 4746 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-operators_observability-operator-59bdc8b94-m2mx9_2788b8ac-4eb0-46cb-8861-c55d6b302dd7/operator/0.log"
Jan 28 21:47:52 crc kubenswrapper[4746]: I0128 21:47:52.159447 4746 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-operators_perses-operator-5bf474d74f-jnzwc_f13f3a63-44b1-4644-8bea-99e25a6764c3/perses-operator/0.log"
Jan 28 21:47:59 crc kubenswrapper[4746]: I0128 21:47:59.836283 4746 scope.go:117] "RemoveContainer" containerID="2c9e3325547a2a9e78b84305cb2707ed2def619d2d50bb736bb2f414ca2be430"
Jan 28 21:47:59 crc kubenswrapper[4746]: E0128 21:47:59.837203 4746 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-6wrnw_openshift-machine-config-operator(6dc8b546-9734-4082-b2b3-2bafe3f1564d)\"" pod="openshift-machine-config-operator/machine-config-daemon-6wrnw" podUID="6dc8b546-9734-4082-b2b3-2bafe3f1564d"
Jan 28 21:48:09 crc kubenswrapper[4746]: I0128 21:48:09.331272 4746 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-operators-redhat_loki-operator-controller-manager-6866b6794-24l8g_cfda6c5a-4e09-4579-9149-ba5c87aaf387/kube-rbac-proxy/0.log"
Jan 28 21:48:09 crc kubenswrapper[4746]: I0128 21:48:09.426194 4746 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-operators-redhat_loki-operator-controller-manager-6866b6794-24l8g_cfda6c5a-4e09-4579-9149-ba5c87aaf387/manager/0.log"
Jan 28 21:48:13 crc kubenswrapper[4746]: I0128 21:48:13.836490 4746 scope.go:117] "RemoveContainer" containerID="2c9e3325547a2a9e78b84305cb2707ed2def619d2d50bb736bb2f414ca2be430"
Jan 28 21:48:13 crc kubenswrapper[4746]: E0128 21:48:13.837201 4746 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-6wrnw_openshift-machine-config-operator(6dc8b546-9734-4082-b2b3-2bafe3f1564d)\"" pod="openshift-machine-config-operator/machine-config-daemon-6wrnw" podUID="6dc8b546-9734-4082-b2b3-2bafe3f1564d"
Jan 28 21:48:13 crc kubenswrapper[4746]: E0128 21:48:13.918683 4746 upgradeaware.go:427] Error proxying data from client to backend: readfrom tcp 38.102.83.201:57430->38.102.83.201:46663: write tcp 38.102.83.201:57430->38.102.83.201:46663: write: broken pipe
Jan 28 21:48:28 crc kubenswrapper[4746]: I0128 21:48:28.843095 4746 scope.go:117] "RemoveContainer" containerID="2c9e3325547a2a9e78b84305cb2707ed2def619d2d50bb736bb2f414ca2be430"
Jan 28 21:48:28 crc kubenswrapper[4746]: E0128 21:48:28.843703 4746 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-6wrnw_openshift-machine-config-operator(6dc8b546-9734-4082-b2b3-2bafe3f1564d)\"" pod="openshift-machine-config-operator/machine-config-daemon-6wrnw" podUID="6dc8b546-9734-4082-b2b3-2bafe3f1564d"
Jan 28 21:48:43 crc kubenswrapper[4746]: I0128 21:48:43.835895 4746 scope.go:117] "RemoveContainer" containerID="2c9e3325547a2a9e78b84305cb2707ed2def619d2d50bb736bb2f414ca2be430"
Jan 28 21:48:43 crc kubenswrapper[4746]: E0128 21:48:43.836639 4746 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-6wrnw_openshift-machine-config-operator(6dc8b546-9734-4082-b2b3-2bafe3f1564d)\"" pod="openshift-machine-config-operator/machine-config-daemon-6wrnw" podUID="6dc8b546-9734-4082-b2b3-2bafe3f1564d"
Jan 28 21:48:55 crc kubenswrapper[4746]: I0128 21:48:55.835857 4746 scope.go:117] "RemoveContainer" containerID="2c9e3325547a2a9e78b84305cb2707ed2def619d2d50bb736bb2f414ca2be430"
Jan 28 21:48:56 crc kubenswrapper[4746]: I0128 21:48:56.720049 4746 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-6wrnw" event={"ID":"6dc8b546-9734-4082-b2b3-2bafe3f1564d","Type":"ContainerStarted","Data":"e44f7a99cb35a4c17a34fd0bb8ec4a9a97ec79a22d9a40b49d2e2b3b03759773"}
Jan 28 21:49:00 crc kubenswrapper[4746]: I0128 21:49:00.196588 4746 scope.go:117] "RemoveContainer" containerID="e94f89637d4cf11d99b951b4498a6b179d665cf3e7df3b49b37df37d82854e94"
Jan 28 21:49:12 crc kubenswrapper[4746]: I0128 21:49:12.629717 4746 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/certified-operators-pnwhm"]
Jan 28 21:49:12 crc kubenswrapper[4746]: E0128 21:49:12.631799 4746 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="54897ff3-c2b3-450b-97c9-59f657902640" containerName="extract-content"
Jan 28 21:49:12 crc kubenswrapper[4746]: I0128 21:49:12.631910 4746 state_mem.go:107] "Deleted CPUSet assignment" podUID="54897ff3-c2b3-450b-97c9-59f657902640" containerName="extract-content"
Jan 28 21:49:12 crc kubenswrapper[4746]: E0128 21:49:12.632012 4746 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="54897ff3-c2b3-450b-97c9-59f657902640" containerName="extract-utilities"
Jan 28 21:49:12 crc kubenswrapper[4746]: I0128 21:49:12.632119 4746 state_mem.go:107] "Deleted CPUSet assignment" podUID="54897ff3-c2b3-450b-97c9-59f657902640" containerName="extract-utilities"
Jan 28 21:49:12 crc kubenswrapper[4746]: E0128 21:49:12.632201 4746 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="54897ff3-c2b3-450b-97c9-59f657902640" containerName="registry-server"
Jan 28 21:49:12 crc kubenswrapper[4746]: I0128 21:49:12.632273 4746 state_mem.go:107] "Deleted CPUSet assignment" podUID="54897ff3-c2b3-450b-97c9-59f657902640" containerName="registry-server"
Jan 28 21:49:12 crc kubenswrapper[4746]: I0128 21:49:12.632641 4746 memory_manager.go:354] "RemoveStaleState removing state" podUID="54897ff3-c2b3-450b-97c9-59f657902640" containerName="registry-server"
Jan 28 21:49:12 crc kubenswrapper[4746]: I0128 21:49:12.634597 4746 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-pnwhm"
Jan 28 21:49:12 crc kubenswrapper[4746]: I0128 21:49:12.654688 4746 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-pnwhm"]
Jan 28 21:49:12 crc kubenswrapper[4746]: I0128 21:49:12.710975 4746 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-g294t\" (UniqueName: \"kubernetes.io/projected/af9530cb-0330-4a1d-82b4-1513755271b8-kube-api-access-g294t\") pod \"certified-operators-pnwhm\" (UID: \"af9530cb-0330-4a1d-82b4-1513755271b8\") " pod="openshift-marketplace/certified-operators-pnwhm"
Jan 28 21:49:12 crc kubenswrapper[4746]: I0128 21:49:12.711061 4746 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/af9530cb-0330-4a1d-82b4-1513755271b8-catalog-content\") pod \"certified-operators-pnwhm\" (UID: \"af9530cb-0330-4a1d-82b4-1513755271b8\") " pod="openshift-marketplace/certified-operators-pnwhm"
Jan 28 21:49:12 crc kubenswrapper[4746]: I0128 21:49:12.711147 4746 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/af9530cb-0330-4a1d-82b4-1513755271b8-utilities\") pod \"certified-operators-pnwhm\" (UID: \"af9530cb-0330-4a1d-82b4-1513755271b8\") " pod="openshift-marketplace/certified-operators-pnwhm"
Jan 28 21:49:12 crc kubenswrapper[4746]: I0128 21:49:12.813013 4746 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-g294t\" (UniqueName: \"kubernetes.io/projected/af9530cb-0330-4a1d-82b4-1513755271b8-kube-api-access-g294t\") pod \"certified-operators-pnwhm\" (UID: \"af9530cb-0330-4a1d-82b4-1513755271b8\") " pod="openshift-marketplace/certified-operators-pnwhm"
Jan 28 21:49:12 crc kubenswrapper[4746]: I0128 21:49:12.813071 4746 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/af9530cb-0330-4a1d-82b4-1513755271b8-catalog-content\") pod \"certified-operators-pnwhm\" (UID: \"af9530cb-0330-4a1d-82b4-1513755271b8\") " pod="openshift-marketplace/certified-operators-pnwhm"
Jan 28 21:49:12 crc kubenswrapper[4746]: I0128 21:49:12.813169 4746 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/af9530cb-0330-4a1d-82b4-1513755271b8-utilities\") pod \"certified-operators-pnwhm\" (UID: \"af9530cb-0330-4a1d-82b4-1513755271b8\") " pod="openshift-marketplace/certified-operators-pnwhm"
Jan 28 21:49:12 crc kubenswrapper[4746]: I0128 21:49:12.813949 4746 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/af9530cb-0330-4a1d-82b4-1513755271b8-catalog-content\") pod \"certified-operators-pnwhm\" (UID: \"af9530cb-0330-4a1d-82b4-1513755271b8\") " pod="openshift-marketplace/certified-operators-pnwhm"
Jan 28 21:49:12 crc kubenswrapper[4746]: I0128 21:49:12.814340 4746 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/af9530cb-0330-4a1d-82b4-1513755271b8-utilities\") pod \"certified-operators-pnwhm\" (UID: \"af9530cb-0330-4a1d-82b4-1513755271b8\") " pod="openshift-marketplace/certified-operators-pnwhm"
Jan 28 21:49:12 crc kubenswrapper[4746]: I0128 21:49:12.831267 4746 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-g294t\" (UniqueName: \"kubernetes.io/projected/af9530cb-0330-4a1d-82b4-1513755271b8-kube-api-access-g294t\") pod \"certified-operators-pnwhm\" (UID: \"af9530cb-0330-4a1d-82b4-1513755271b8\") " pod="openshift-marketplace/certified-operators-pnwhm"
Jan 28 21:49:12 crc kubenswrapper[4746]: I0128 21:49:12.984986 4746 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-pnwhm"
Jan 28 21:49:13 crc kubenswrapper[4746]: I0128 21:49:13.561393 4746 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-pnwhm"]
Jan 28 21:49:13 crc kubenswrapper[4746]: I0128 21:49:13.914423 4746 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-pnwhm" event={"ID":"af9530cb-0330-4a1d-82b4-1513755271b8","Type":"ContainerStarted","Data":"5b75b5177e8419b1876d3b78c988d6339c20b4f27735a40531c7480c47b6f0ce"}
Jan 28 21:49:14 crc kubenswrapper[4746]: I0128 21:49:14.927547 4746 generic.go:334] "Generic (PLEG): container finished" podID="af9530cb-0330-4a1d-82b4-1513755271b8" containerID="36b6732cf9e995d56ab3219e6ab3995f5baa7d5c756dc23f80aa1c95c612028b" exitCode=0
Jan 28 21:49:14 crc kubenswrapper[4746]: I0128 21:49:14.927601 4746 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-pnwhm" event={"ID":"af9530cb-0330-4a1d-82b4-1513755271b8","Type":"ContainerDied","Data":"36b6732cf9e995d56ab3219e6ab3995f5baa7d5c756dc23f80aa1c95c612028b"}
Jan 28 21:49:14 crc kubenswrapper[4746]: I0128 21:49:14.930335 4746 provider.go:102] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider
Jan 28 21:49:20 crc kubenswrapper[4746]: I0128 21:49:20.993970 4746 generic.go:334] "Generic (PLEG): container finished" podID="af9530cb-0330-4a1d-82b4-1513755271b8" containerID="ac6c8e0605d4ef018069dade33dd94c2f9ab95d8f9f090d4f1c33c4691ffa7dd" exitCode=0
Jan 28 21:49:20 crc kubenswrapper[4746]: I0128 21:49:20.994072 4746 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-pnwhm" event={"ID":"af9530cb-0330-4a1d-82b4-1513755271b8","Type":"ContainerDied","Data":"ac6c8e0605d4ef018069dade33dd94c2f9ab95d8f9f090d4f1c33c4691ffa7dd"}
Jan 28 21:49:22 crc kubenswrapper[4746]: I0128 21:49:22.009035 4746 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-pnwhm" event={"ID":"af9530cb-0330-4a1d-82b4-1513755271b8","Type":"ContainerStarted","Data":"eae9bc1963bcf4b5536fc327e048842809041e9347d8b3bf5aac3ea6a324d410"}
Jan 28 21:49:22 crc kubenswrapper[4746]: I0128 21:49:22.985659 4746 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/certified-operators-pnwhm"
Jan 28 21:49:22 crc kubenswrapper[4746]: I0128 21:49:22.985714 4746 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/certified-operators-pnwhm"
Jan 28 21:49:24 crc kubenswrapper[4746]: I0128 21:49:24.041812 4746 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-marketplace/certified-operators-pnwhm" podUID="af9530cb-0330-4a1d-82b4-1513755271b8" containerName="registry-server" probeResult="failure" output=<
Jan 28 21:49:24 crc kubenswrapper[4746]: 	timeout: failed to connect service ":50051" within 1s
Jan 28 21:49:24 crc kubenswrapper[4746]: >
Jan 28 21:49:33 crc kubenswrapper[4746]: I0128 21:49:33.036605 4746 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/certified-operators-pnwhm"
Jan 28 21:49:33 crc kubenswrapper[4746]: I0128 21:49:33.054961 4746 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/certified-operators-pnwhm" podStartSLOduration=14.589329842 podStartE2EDuration="21.054943526s" podCreationTimestamp="2026-01-28 21:49:12 +0000 UTC" firstStartedPulling="2026-01-28 21:49:14.930045575 +0000 UTC m=+4182.886231929" lastFinishedPulling="2026-01-28 21:49:21.395659259 +0000 UTC m=+4189.351845613" observedRunningTime="2026-01-28 21:49:22.036391746 +0000 UTC m=+4189.992578100" watchObservedRunningTime="2026-01-28 21:49:33.054943526 +0000 UTC m=+4201.011129880"
Jan 28 21:49:33 crc kubenswrapper[4746]: I0128 21:49:33.089716 4746 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/certified-operators-pnwhm"
Jan 28 21:49:33 crc kubenswrapper[4746]: I0128 21:49:33.201621 4746 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-pnwhm"]
Jan 28 21:49:33 crc kubenswrapper[4746]: I0128 21:49:33.273470 4746 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-s6mqb"]
Jan 28 21:49:33 crc kubenswrapper[4746]: I0128 21:49:33.273713 4746 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/certified-operators-s6mqb" podUID="f854deb7-4783-4ba8-8357-ffe4d1124a12" containerName="registry-server" containerID="cri-o://dcefab010bdb0c8724a3b3e438a70f00b63153f577ea95b2a1a2f6aaef94bd83" gracePeriod=2
Jan 28 21:49:33 crc kubenswrapper[4746]: I0128 21:49:33.999177 4746 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-s6mqb"
Jan 28 21:49:34 crc kubenswrapper[4746]: I0128 21:49:34.067127 4746 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-zn9jz\" (UniqueName: \"kubernetes.io/projected/f854deb7-4783-4ba8-8357-ffe4d1124a12-kube-api-access-zn9jz\") pod \"f854deb7-4783-4ba8-8357-ffe4d1124a12\" (UID: \"f854deb7-4783-4ba8-8357-ffe4d1124a12\") "
Jan 28 21:49:34 crc kubenswrapper[4746]: I0128 21:49:34.067248 4746 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/f854deb7-4783-4ba8-8357-ffe4d1124a12-utilities\") pod \"f854deb7-4783-4ba8-8357-ffe4d1124a12\" (UID: \"f854deb7-4783-4ba8-8357-ffe4d1124a12\") "
Jan 28 21:49:34 crc kubenswrapper[4746]: I0128 21:49:34.067343 4746 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/f854deb7-4783-4ba8-8357-ffe4d1124a12-catalog-content\") pod \"f854deb7-4783-4ba8-8357-ffe4d1124a12\" (UID: \"f854deb7-4783-4ba8-8357-ffe4d1124a12\") "
Jan 28 21:49:34 crc kubenswrapper[4746]: I0128 21:49:34.070720 4746 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/f854deb7-4783-4ba8-8357-ffe4d1124a12-utilities" (OuterVolumeSpecName: "utilities") pod "f854deb7-4783-4ba8-8357-ffe4d1124a12" (UID: "f854deb7-4783-4ba8-8357-ffe4d1124a12"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Jan 28 21:49:34 crc kubenswrapper[4746]: I0128 21:49:34.079391 4746 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/f854deb7-4783-4ba8-8357-ffe4d1124a12-kube-api-access-zn9jz" (OuterVolumeSpecName: "kube-api-access-zn9jz") pod "f854deb7-4783-4ba8-8357-ffe4d1124a12" (UID: "f854deb7-4783-4ba8-8357-ffe4d1124a12"). InnerVolumeSpecName "kube-api-access-zn9jz". PluginName "kubernetes.io/projected", VolumeGidValue ""
Jan 28 21:49:34 crc kubenswrapper[4746]: I0128 21:49:34.137175 4746 generic.go:334] "Generic (PLEG): container finished" podID="f854deb7-4783-4ba8-8357-ffe4d1124a12" containerID="dcefab010bdb0c8724a3b3e438a70f00b63153f577ea95b2a1a2f6aaef94bd83" exitCode=0
Jan 28 21:49:34 crc kubenswrapper[4746]: I0128 21:49:34.138153 4746 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-s6mqb"
Jan 28 21:49:34 crc kubenswrapper[4746]: I0128 21:49:34.138245 4746 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-s6mqb" event={"ID":"f854deb7-4783-4ba8-8357-ffe4d1124a12","Type":"ContainerDied","Data":"dcefab010bdb0c8724a3b3e438a70f00b63153f577ea95b2a1a2f6aaef94bd83"}
Jan 28 21:49:34 crc kubenswrapper[4746]: I0128 21:49:34.138396 4746 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-s6mqb" event={"ID":"f854deb7-4783-4ba8-8357-ffe4d1124a12","Type":"ContainerDied","Data":"413fa9a2c20067530673b39d23f12794682407c91bc2aeec0469025bf0dddada"}
Jan 28 21:49:34 crc kubenswrapper[4746]: I0128 21:49:34.138417 4746 scope.go:117] "RemoveContainer" containerID="dcefab010bdb0c8724a3b3e438a70f00b63153f577ea95b2a1a2f6aaef94bd83"
Jan 28 21:49:34 crc kubenswrapper[4746]: I0128 21:49:34.148183 4746 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/f854deb7-4783-4ba8-8357-ffe4d1124a12-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "f854deb7-4783-4ba8-8357-ffe4d1124a12" (UID: "f854deb7-4783-4ba8-8357-ffe4d1124a12"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Jan 28 21:49:34 crc kubenswrapper[4746]: I0128 21:49:34.159290 4746 scope.go:117] "RemoveContainer" containerID="d5b2313d4b6fc875d1d2879c981862d46af0d11e2a6849219d9ed8db75b88804"
Jan 28 21:49:34 crc kubenswrapper[4746]: I0128 21:49:34.169796 4746 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/f854deb7-4783-4ba8-8357-ffe4d1124a12-utilities\") on node \"crc\" DevicePath \"\""
Jan 28 21:49:34 crc kubenswrapper[4746]: I0128 21:49:34.169823 4746 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/f854deb7-4783-4ba8-8357-ffe4d1124a12-catalog-content\") on node \"crc\" DevicePath \"\""
Jan 28 21:49:34 crc kubenswrapper[4746]: I0128 21:49:34.169838 4746 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-zn9jz\" (UniqueName: \"kubernetes.io/projected/f854deb7-4783-4ba8-8357-ffe4d1124a12-kube-api-access-zn9jz\") on node \"crc\" DevicePath \"\""
Jan 28 21:49:34 crc kubenswrapper[4746]: I0128 21:49:34.180022 4746 scope.go:117] "RemoveContainer" containerID="fc089671a7b7efa345349612efbe95f8f41036144c13d84a03aca4e0aa1da969"
Jan 28 21:49:34 crc kubenswrapper[4746]: I0128 21:49:34.227502 4746 scope.go:117] "RemoveContainer" containerID="dcefab010bdb0c8724a3b3e438a70f00b63153f577ea95b2a1a2f6aaef94bd83"
Jan 28 21:49:34 crc kubenswrapper[4746]: E0128 21:49:34.227944 4746 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"dcefab010bdb0c8724a3b3e438a70f00b63153f577ea95b2a1a2f6aaef94bd83\": container with ID starting with dcefab010bdb0c8724a3b3e438a70f00b63153f577ea95b2a1a2f6aaef94bd83 not found: ID does not exist" containerID="dcefab010bdb0c8724a3b3e438a70f00b63153f577ea95b2a1a2f6aaef94bd83"
Jan 28 21:49:34 crc kubenswrapper[4746]: I0128 21:49:34.227983 4746 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"dcefab010bdb0c8724a3b3e438a70f00b63153f577ea95b2a1a2f6aaef94bd83"} err="failed to get container status \"dcefab010bdb0c8724a3b3e438a70f00b63153f577ea95b2a1a2f6aaef94bd83\": rpc error: code = NotFound desc = could not find container \"dcefab010bdb0c8724a3b3e438a70f00b63153f577ea95b2a1a2f6aaef94bd83\": container with ID starting with dcefab010bdb0c8724a3b3e438a70f00b63153f577ea95b2a1a2f6aaef94bd83 not found: ID does not exist"
Jan 28 21:49:34 crc kubenswrapper[4746]: I0128 21:49:34.228011 4746 scope.go:117] "RemoveContainer" containerID="d5b2313d4b6fc875d1d2879c981862d46af0d11e2a6849219d9ed8db75b88804"
Jan 28 21:49:34 crc kubenswrapper[4746]: E0128 21:49:34.228315 4746 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"d5b2313d4b6fc875d1d2879c981862d46af0d11e2a6849219d9ed8db75b88804\": container with ID starting with d5b2313d4b6fc875d1d2879c981862d46af0d11e2a6849219d9ed8db75b88804 not found: ID does not exist" containerID="d5b2313d4b6fc875d1d2879c981862d46af0d11e2a6849219d9ed8db75b88804"
Jan 28 21:49:34 crc kubenswrapper[4746]: I0128 21:49:34.228341 4746 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"d5b2313d4b6fc875d1d2879c981862d46af0d11e2a6849219d9ed8db75b88804"} err="failed to get container status \"d5b2313d4b6fc875d1d2879c981862d46af0d11e2a6849219d9ed8db75b88804\": rpc error: code = NotFound desc = could not find container \"d5b2313d4b6fc875d1d2879c981862d46af0d11e2a6849219d9ed8db75b88804\": container with ID starting with d5b2313d4b6fc875d1d2879c981862d46af0d11e2a6849219d9ed8db75b88804 not found: ID does not exist"
Jan 28 21:49:34 crc kubenswrapper[4746]: I0128 21:49:34.228359 4746 scope.go:117] "RemoveContainer" containerID="fc089671a7b7efa345349612efbe95f8f41036144c13d84a03aca4e0aa1da969"
Jan 28 21:49:34 crc kubenswrapper[4746]: E0128 21:49:34.228603 4746 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"fc089671a7b7efa345349612efbe95f8f41036144c13d84a03aca4e0aa1da969\": container with ID starting with fc089671a7b7efa345349612efbe95f8f41036144c13d84a03aca4e0aa1da969 not found: ID does not exist" containerID="fc089671a7b7efa345349612efbe95f8f41036144c13d84a03aca4e0aa1da969"
Jan 28 21:49:34 crc kubenswrapper[4746]: I0128 21:49:34.228628 4746 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"fc089671a7b7efa345349612efbe95f8f41036144c13d84a03aca4e0aa1da969"} err="failed to get container status \"fc089671a7b7efa345349612efbe95f8f41036144c13d84a03aca4e0aa1da969\": rpc error: code = NotFound desc = could not find container \"fc089671a7b7efa345349612efbe95f8f41036144c13d84a03aca4e0aa1da969\": container with ID starting with fc089671a7b7efa345349612efbe95f8f41036144c13d84a03aca4e0aa1da969 not found: ID does not exist"
Jan 28 21:49:34 crc kubenswrapper[4746]: I0128 21:49:34.536547 4746 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-s6mqb"]
Jan 28 21:49:34 crc kubenswrapper[4746]: I0128 21:49:34.562911 4746 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/certified-operators-s6mqb"]
Jan 28 21:49:34 crc kubenswrapper[4746]: I0128 21:49:34.847531 4746 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="f854deb7-4783-4ba8-8357-ffe4d1124a12" path="/var/lib/kubelet/pods/f854deb7-4783-4ba8-8357-ffe4d1124a12/volumes"
Jan 28 21:50:00 crc kubenswrapper[4746]: I0128 21:50:00.253378 4746 scope.go:117] "RemoveContainer" containerID="fa48f78d8a459990ab27d5e0d3a7362141af9dd1ef6dd9ac093376c1e06c240f"
Jan 28 21:50:04 crc kubenswrapper[4746]: I0128 21:50:04.480498 4746 generic.go:334] "Generic (PLEG): container finished" podID="e439b8d8-d2b4-4169-8b41-497ec17f2018" containerID="0967d64c50b8fdb55fc3b93abc6ee7557195580632f2de7a93b54fa37c9b70c3" exitCode=0
Jan 28 21:50:04 crc kubenswrapper[4746]: I0128 21:50:04.480585 4746 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-pjp7s/must-gather-htgzp" event={"ID":"e439b8d8-d2b4-4169-8b41-497ec17f2018","Type":"ContainerDied","Data":"0967d64c50b8fdb55fc3b93abc6ee7557195580632f2de7a93b54fa37c9b70c3"}
Jan 28 21:50:04 crc kubenswrapper[4746]: I0128 21:50:04.481711 4746 scope.go:117] "RemoveContainer" containerID="0967d64c50b8fdb55fc3b93abc6ee7557195580632f2de7a93b54fa37c9b70c3"
Jan 28 21:50:05 crc kubenswrapper[4746]: I0128 21:50:05.199194 4746 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-must-gather-pjp7s_must-gather-htgzp_e439b8d8-d2b4-4169-8b41-497ec17f2018/gather/0.log"
Jan 28 21:50:17 crc kubenswrapper[4746]: I0128 21:50:17.644817 4746 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-must-gather-pjp7s/must-gather-htgzp"]
Jan 28 21:50:17 crc kubenswrapper[4746]: I0128 21:50:17.645807 4746 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-must-gather-pjp7s/must-gather-htgzp" podUID="e439b8d8-d2b4-4169-8b41-497ec17f2018" containerName="copy" containerID="cri-o://b07190ee5891a0c20f8eac80e364aa3a84e9446c7ac87eabf4a409ac5c5821f4" gracePeriod=2
Jan 28 21:50:17 crc kubenswrapper[4746]: I0128 21:50:17.658093 4746 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-must-gather-pjp7s/must-gather-htgzp"]
Jan 28 21:50:18 crc kubenswrapper[4746]: I0128 21:50:18.307316 4746 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-must-gather-pjp7s_must-gather-htgzp_e439b8d8-d2b4-4169-8b41-497ec17f2018/copy/0.log"
Jan 28 21:50:18 crc kubenswrapper[4746]: I0128 21:50:18.307874 4746 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-must-gather-pjp7s/must-gather-htgzp"
Jan 28 21:50:18 crc kubenswrapper[4746]: I0128 21:50:18.473792 4746 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-9nsqt\" (UniqueName: \"kubernetes.io/projected/e439b8d8-d2b4-4169-8b41-497ec17f2018-kube-api-access-9nsqt\") pod \"e439b8d8-d2b4-4169-8b41-497ec17f2018\" (UID: \"e439b8d8-d2b4-4169-8b41-497ec17f2018\") "
Jan 28 21:50:18 crc kubenswrapper[4746]: I0128 21:50:18.473959 4746 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"must-gather-output\" (UniqueName: \"kubernetes.io/empty-dir/e439b8d8-d2b4-4169-8b41-497ec17f2018-must-gather-output\") pod \"e439b8d8-d2b4-4169-8b41-497ec17f2018\" (UID: \"e439b8d8-d2b4-4169-8b41-497ec17f2018\") "
Jan 28 21:50:18 crc kubenswrapper[4746]: I0128 21:50:18.503867 4746 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/e439b8d8-d2b4-4169-8b41-497ec17f2018-kube-api-access-9nsqt" (OuterVolumeSpecName: "kube-api-access-9nsqt") pod "e439b8d8-d2b4-4169-8b41-497ec17f2018" (UID: "e439b8d8-d2b4-4169-8b41-497ec17f2018"). InnerVolumeSpecName "kube-api-access-9nsqt". PluginName "kubernetes.io/projected", VolumeGidValue ""
Jan 28 21:50:18 crc kubenswrapper[4746]: I0128 21:50:18.576693 4746 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-9nsqt\" (UniqueName: \"kubernetes.io/projected/e439b8d8-d2b4-4169-8b41-497ec17f2018-kube-api-access-9nsqt\") on node \"crc\" DevicePath \"\""
Jan 28 21:50:18 crc kubenswrapper[4746]: I0128 21:50:18.617534 4746 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-must-gather-pjp7s_must-gather-htgzp_e439b8d8-d2b4-4169-8b41-497ec17f2018/copy/0.log"
Jan 28 21:50:18 crc kubenswrapper[4746]: I0128 21:50:18.618374 4746 generic.go:334] "Generic (PLEG): container finished" podID="e439b8d8-d2b4-4169-8b41-497ec17f2018" containerID="b07190ee5891a0c20f8eac80e364aa3a84e9446c7ac87eabf4a409ac5c5821f4" exitCode=143
Jan 28 21:50:18 crc kubenswrapper[4746]: I0128 21:50:18.618507 4746 scope.go:117] "RemoveContainer" containerID="b07190ee5891a0c20f8eac80e364aa3a84e9446c7ac87eabf4a409ac5c5821f4"
Jan 28 21:50:18 crc kubenswrapper[4746]: I0128 21:50:18.618779 4746 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-must-gather-pjp7s/must-gather-htgzp"
Jan 28 21:50:18 crc kubenswrapper[4746]: I0128 21:50:18.644139 4746 scope.go:117] "RemoveContainer" containerID="0967d64c50b8fdb55fc3b93abc6ee7557195580632f2de7a93b54fa37c9b70c3"
Jan 28 21:50:18 crc kubenswrapper[4746]: I0128 21:50:18.705287 4746 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/e439b8d8-d2b4-4169-8b41-497ec17f2018-must-gather-output" (OuterVolumeSpecName: "must-gather-output") pod "e439b8d8-d2b4-4169-8b41-497ec17f2018" (UID: "e439b8d8-d2b4-4169-8b41-497ec17f2018"). InnerVolumeSpecName "must-gather-output". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Jan 28 21:50:18 crc kubenswrapper[4746]: I0128 21:50:18.717235 4746 scope.go:117] "RemoveContainer" containerID="b07190ee5891a0c20f8eac80e364aa3a84e9446c7ac87eabf4a409ac5c5821f4"
Jan 28 21:50:18 crc kubenswrapper[4746]: E0128 21:50:18.727255 4746 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"b07190ee5891a0c20f8eac80e364aa3a84e9446c7ac87eabf4a409ac5c5821f4\": container with ID starting with b07190ee5891a0c20f8eac80e364aa3a84e9446c7ac87eabf4a409ac5c5821f4 not found: ID does not exist" containerID="b07190ee5891a0c20f8eac80e364aa3a84e9446c7ac87eabf4a409ac5c5821f4"
Jan 28 21:50:18 crc kubenswrapper[4746]: I0128 21:50:18.727301 4746 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"b07190ee5891a0c20f8eac80e364aa3a84e9446c7ac87eabf4a409ac5c5821f4"} err="failed to get container status \"b07190ee5891a0c20f8eac80e364aa3a84e9446c7ac87eabf4a409ac5c5821f4\": rpc error: code = NotFound desc = could not find container \"b07190ee5891a0c20f8eac80e364aa3a84e9446c7ac87eabf4a409ac5c5821f4\": container with ID starting with b07190ee5891a0c20f8eac80e364aa3a84e9446c7ac87eabf4a409ac5c5821f4 not found: ID does not exist"
Jan 28 21:50:18 crc kubenswrapper[4746]: I0128 21:50:18.727327 4746 scope.go:117] "RemoveContainer" containerID="0967d64c50b8fdb55fc3b93abc6ee7557195580632f2de7a93b54fa37c9b70c3"
Jan 28 21:50:18 crc kubenswrapper[4746]: E0128 21:50:18.728286 4746 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"0967d64c50b8fdb55fc3b93abc6ee7557195580632f2de7a93b54fa37c9b70c3\": container with ID starting with 0967d64c50b8fdb55fc3b93abc6ee7557195580632f2de7a93b54fa37c9b70c3 not found: ID does not exist" containerID="0967d64c50b8fdb55fc3b93abc6ee7557195580632f2de7a93b54fa37c9b70c3"
Jan 28 21:50:18 crc kubenswrapper[4746]: I0128 21:50:18.728332
4746 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"0967d64c50b8fdb55fc3b93abc6ee7557195580632f2de7a93b54fa37c9b70c3"} err="failed to get container status \"0967d64c50b8fdb55fc3b93abc6ee7557195580632f2de7a93b54fa37c9b70c3\": rpc error: code = NotFound desc = could not find container \"0967d64c50b8fdb55fc3b93abc6ee7557195580632f2de7a93b54fa37c9b70c3\": container with ID starting with 0967d64c50b8fdb55fc3b93abc6ee7557195580632f2de7a93b54fa37c9b70c3 not found: ID does not exist" Jan 28 21:50:18 crc kubenswrapper[4746]: I0128 21:50:18.780249 4746 reconciler_common.go:293] "Volume detached for volume \"must-gather-output\" (UniqueName: \"kubernetes.io/empty-dir/e439b8d8-d2b4-4169-8b41-497ec17f2018-must-gather-output\") on node \"crc\" DevicePath \"\"" Jan 28 21:50:18 crc kubenswrapper[4746]: I0128 21:50:18.851206 4746 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="e439b8d8-d2b4-4169-8b41-497ec17f2018" path="/var/lib/kubelet/pods/e439b8d8-d2b4-4169-8b41-497ec17f2018/volumes" Jan 28 21:51:15 crc kubenswrapper[4746]: I0128 21:51:15.871162 4746 patch_prober.go:28] interesting pod/machine-config-daemon-6wrnw container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Jan 28 21:51:15 crc kubenswrapper[4746]: I0128 21:51:15.872021 4746 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-6wrnw" podUID="6dc8b546-9734-4082-b2b3-2bafe3f1564d" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Jan 28 21:51:45 crc kubenswrapper[4746]: I0128 21:51:45.871314 4746 patch_prober.go:28] interesting pod/machine-config-daemon-6wrnw container/machine-config-daemon 
namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Jan 28 21:51:45 crc kubenswrapper[4746]: I0128 21:51:45.871953 4746 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-6wrnw" podUID="6dc8b546-9734-4082-b2b3-2bafe3f1564d" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Jan 28 21:52:15 crc kubenswrapper[4746]: I0128 21:52:15.871575 4746 patch_prober.go:28] interesting pod/machine-config-daemon-6wrnw container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Jan 28 21:52:15 crc kubenswrapper[4746]: I0128 21:52:15.872288 4746 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-6wrnw" podUID="6dc8b546-9734-4082-b2b3-2bafe3f1564d" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Jan 28 21:52:15 crc kubenswrapper[4746]: I0128 21:52:15.872357 4746 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-6wrnw" Jan 28 21:52:15 crc kubenswrapper[4746]: I0128 21:52:15.873522 4746 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"e44f7a99cb35a4c17a34fd0bb8ec4a9a97ec79a22d9a40b49d2e2b3b03759773"} pod="openshift-machine-config-operator/machine-config-daemon-6wrnw" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted" Jan 28 21:52:15 crc 
kubenswrapper[4746]: I0128 21:52:15.873632 4746 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-6wrnw" podUID="6dc8b546-9734-4082-b2b3-2bafe3f1564d" containerName="machine-config-daemon" containerID="cri-o://e44f7a99cb35a4c17a34fd0bb8ec4a9a97ec79a22d9a40b49d2e2b3b03759773" gracePeriod=600 Jan 28 21:52:16 crc kubenswrapper[4746]: I0128 21:52:16.810339 4746 generic.go:334] "Generic (PLEG): container finished" podID="6dc8b546-9734-4082-b2b3-2bafe3f1564d" containerID="e44f7a99cb35a4c17a34fd0bb8ec4a9a97ec79a22d9a40b49d2e2b3b03759773" exitCode=0 Jan 28 21:52:16 crc kubenswrapper[4746]: I0128 21:52:16.810424 4746 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-6wrnw" event={"ID":"6dc8b546-9734-4082-b2b3-2bafe3f1564d","Type":"ContainerDied","Data":"e44f7a99cb35a4c17a34fd0bb8ec4a9a97ec79a22d9a40b49d2e2b3b03759773"} Jan 28 21:52:16 crc kubenswrapper[4746]: I0128 21:52:16.811055 4746 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-6wrnw" event={"ID":"6dc8b546-9734-4082-b2b3-2bafe3f1564d","Type":"ContainerStarted","Data":"71838e8075629056d51d2a7543d04eb288e8352e57fec0e07bb9884c3709be53"} Jan 28 21:52:16 crc kubenswrapper[4746]: I0128 21:52:16.811132 4746 scope.go:117] "RemoveContainer" containerID="2c9e3325547a2a9e78b84305cb2707ed2def619d2d50bb736bb2f414ca2be430" Jan 28 21:53:33 crc kubenswrapper[4746]: I0128 21:53:33.090965 4746 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-marketplace-bghv8"] Jan 28 21:53:33 crc kubenswrapper[4746]: E0128 21:53:33.092109 4746 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="e439b8d8-d2b4-4169-8b41-497ec17f2018" containerName="copy" Jan 28 21:53:33 crc kubenswrapper[4746]: I0128 21:53:33.092123 4746 state_mem.go:107] "Deleted CPUSet assignment" 
podUID="e439b8d8-d2b4-4169-8b41-497ec17f2018" containerName="copy" Jan 28 21:53:33 crc kubenswrapper[4746]: E0128 21:53:33.092149 4746 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f854deb7-4783-4ba8-8357-ffe4d1124a12" containerName="extract-content" Jan 28 21:53:33 crc kubenswrapper[4746]: I0128 21:53:33.092157 4746 state_mem.go:107] "Deleted CPUSet assignment" podUID="f854deb7-4783-4ba8-8357-ffe4d1124a12" containerName="extract-content" Jan 28 21:53:33 crc kubenswrapper[4746]: E0128 21:53:33.092180 4746 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="e439b8d8-d2b4-4169-8b41-497ec17f2018" containerName="gather" Jan 28 21:53:33 crc kubenswrapper[4746]: I0128 21:53:33.092189 4746 state_mem.go:107] "Deleted CPUSet assignment" podUID="e439b8d8-d2b4-4169-8b41-497ec17f2018" containerName="gather" Jan 28 21:53:33 crc kubenswrapper[4746]: E0128 21:53:33.092200 4746 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f854deb7-4783-4ba8-8357-ffe4d1124a12" containerName="registry-server" Jan 28 21:53:33 crc kubenswrapper[4746]: I0128 21:53:33.092210 4746 state_mem.go:107] "Deleted CPUSet assignment" podUID="f854deb7-4783-4ba8-8357-ffe4d1124a12" containerName="registry-server" Jan 28 21:53:33 crc kubenswrapper[4746]: E0128 21:53:33.092243 4746 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f854deb7-4783-4ba8-8357-ffe4d1124a12" containerName="extract-utilities" Jan 28 21:53:33 crc kubenswrapper[4746]: I0128 21:53:33.092252 4746 state_mem.go:107] "Deleted CPUSet assignment" podUID="f854deb7-4783-4ba8-8357-ffe4d1124a12" containerName="extract-utilities" Jan 28 21:53:33 crc kubenswrapper[4746]: I0128 21:53:33.092479 4746 memory_manager.go:354] "RemoveStaleState removing state" podUID="e439b8d8-d2b4-4169-8b41-497ec17f2018" containerName="gather" Jan 28 21:53:33 crc kubenswrapper[4746]: I0128 21:53:33.092499 4746 memory_manager.go:354] "RemoveStaleState removing state" 
podUID="f854deb7-4783-4ba8-8357-ffe4d1124a12" containerName="registry-server" Jan 28 21:53:33 crc kubenswrapper[4746]: I0128 21:53:33.092520 4746 memory_manager.go:354] "RemoveStaleState removing state" podUID="e439b8d8-d2b4-4169-8b41-497ec17f2018" containerName="copy" Jan 28 21:53:33 crc kubenswrapper[4746]: I0128 21:53:33.094514 4746 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-bghv8" Jan 28 21:53:33 crc kubenswrapper[4746]: I0128 21:53:33.112210 4746 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-bghv8"] Jan 28 21:53:33 crc kubenswrapper[4746]: I0128 21:53:33.271209 4746 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/e39f4906-ed94-4140-9456-79b5ddb1642f-catalog-content\") pod \"redhat-marketplace-bghv8\" (UID: \"e39f4906-ed94-4140-9456-79b5ddb1642f\") " pod="openshift-marketplace/redhat-marketplace-bghv8" Jan 28 21:53:33 crc kubenswrapper[4746]: I0128 21:53:33.271464 4746 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/e39f4906-ed94-4140-9456-79b5ddb1642f-utilities\") pod \"redhat-marketplace-bghv8\" (UID: \"e39f4906-ed94-4140-9456-79b5ddb1642f\") " pod="openshift-marketplace/redhat-marketplace-bghv8" Jan 28 21:53:33 crc kubenswrapper[4746]: I0128 21:53:33.271561 4746 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-h5vz2\" (UniqueName: \"kubernetes.io/projected/e39f4906-ed94-4140-9456-79b5ddb1642f-kube-api-access-h5vz2\") pod \"redhat-marketplace-bghv8\" (UID: \"e39f4906-ed94-4140-9456-79b5ddb1642f\") " pod="openshift-marketplace/redhat-marketplace-bghv8" Jan 28 21:53:33 crc kubenswrapper[4746]: I0128 21:53:33.373045 4746 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"kube-api-access-h5vz2\" (UniqueName: \"kubernetes.io/projected/e39f4906-ed94-4140-9456-79b5ddb1642f-kube-api-access-h5vz2\") pod \"redhat-marketplace-bghv8\" (UID: \"e39f4906-ed94-4140-9456-79b5ddb1642f\") " pod="openshift-marketplace/redhat-marketplace-bghv8" Jan 28 21:53:33 crc kubenswrapper[4746]: I0128 21:53:33.373182 4746 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/e39f4906-ed94-4140-9456-79b5ddb1642f-catalog-content\") pod \"redhat-marketplace-bghv8\" (UID: \"e39f4906-ed94-4140-9456-79b5ddb1642f\") " pod="openshift-marketplace/redhat-marketplace-bghv8" Jan 28 21:53:33 crc kubenswrapper[4746]: I0128 21:53:33.373211 4746 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/e39f4906-ed94-4140-9456-79b5ddb1642f-utilities\") pod \"redhat-marketplace-bghv8\" (UID: \"e39f4906-ed94-4140-9456-79b5ddb1642f\") " pod="openshift-marketplace/redhat-marketplace-bghv8" Jan 28 21:53:33 crc kubenswrapper[4746]: I0128 21:53:33.373667 4746 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/e39f4906-ed94-4140-9456-79b5ddb1642f-catalog-content\") pod \"redhat-marketplace-bghv8\" (UID: \"e39f4906-ed94-4140-9456-79b5ddb1642f\") " pod="openshift-marketplace/redhat-marketplace-bghv8" Jan 28 21:53:33 crc kubenswrapper[4746]: I0128 21:53:33.373680 4746 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/e39f4906-ed94-4140-9456-79b5ddb1642f-utilities\") pod \"redhat-marketplace-bghv8\" (UID: \"e39f4906-ed94-4140-9456-79b5ddb1642f\") " pod="openshift-marketplace/redhat-marketplace-bghv8" Jan 28 21:53:33 crc kubenswrapper[4746]: I0128 21:53:33.395207 4746 operation_generator.go:637] "MountVolume.SetUp succeeded for volume 
\"kube-api-access-h5vz2\" (UniqueName: \"kubernetes.io/projected/e39f4906-ed94-4140-9456-79b5ddb1642f-kube-api-access-h5vz2\") pod \"redhat-marketplace-bghv8\" (UID: \"e39f4906-ed94-4140-9456-79b5ddb1642f\") " pod="openshift-marketplace/redhat-marketplace-bghv8" Jan 28 21:53:33 crc kubenswrapper[4746]: I0128 21:53:33.419940 4746 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-bghv8" Jan 28 21:53:33 crc kubenswrapper[4746]: I0128 21:53:33.929868 4746 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-bghv8"] Jan 28 21:53:34 crc kubenswrapper[4746]: I0128 21:53:34.636963 4746 generic.go:334] "Generic (PLEG): container finished" podID="e39f4906-ed94-4140-9456-79b5ddb1642f" containerID="69ceda22c3e945142432d33d92f527796f3ac3737ea267275dc7d7e4a27714b1" exitCode=0 Jan 28 21:53:34 crc kubenswrapper[4746]: I0128 21:53:34.637019 4746 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-bghv8" event={"ID":"e39f4906-ed94-4140-9456-79b5ddb1642f","Type":"ContainerDied","Data":"69ceda22c3e945142432d33d92f527796f3ac3737ea267275dc7d7e4a27714b1"} Jan 28 21:53:34 crc kubenswrapper[4746]: I0128 21:53:34.637271 4746 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-bghv8" event={"ID":"e39f4906-ed94-4140-9456-79b5ddb1642f","Type":"ContainerStarted","Data":"58375422b9407a49d4ac8a821cdf6cda420587893fd33dba75230dfc287181ca"} Jan 28 21:53:35 crc kubenswrapper[4746]: I0128 21:53:35.646840 4746 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-bghv8" event={"ID":"e39f4906-ed94-4140-9456-79b5ddb1642f","Type":"ContainerStarted","Data":"760545a04bb02702e10666510bd96eb600f70db4d917400e99d129b6e376a80b"} Jan 28 21:53:36 crc kubenswrapper[4746]: I0128 21:53:36.675208 4746 generic.go:334] "Generic (PLEG): container finished" 
podID="e39f4906-ed94-4140-9456-79b5ddb1642f" containerID="760545a04bb02702e10666510bd96eb600f70db4d917400e99d129b6e376a80b" exitCode=0 Jan 28 21:53:36 crc kubenswrapper[4746]: I0128 21:53:36.676214 4746 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-bghv8" event={"ID":"e39f4906-ed94-4140-9456-79b5ddb1642f","Type":"ContainerDied","Data":"760545a04bb02702e10666510bd96eb600f70db4d917400e99d129b6e376a80b"} Jan 28 21:53:37 crc kubenswrapper[4746]: I0128 21:53:37.686131 4746 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-bghv8" event={"ID":"e39f4906-ed94-4140-9456-79b5ddb1642f","Type":"ContainerStarted","Data":"dc56738fab53a7132a6fdcd8b05f5fa32fa030ed9fcec636530a730023b2a32f"} Jan 28 21:53:37 crc kubenswrapper[4746]: I0128 21:53:37.703782 4746 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-marketplace-bghv8" podStartSLOduration=2.118188357 podStartE2EDuration="4.70376341s" podCreationTimestamp="2026-01-28 21:53:33 +0000 UTC" firstStartedPulling="2026-01-28 21:53:34.638917245 +0000 UTC m=+4442.595103599" lastFinishedPulling="2026-01-28 21:53:37.224492288 +0000 UTC m=+4445.180678652" observedRunningTime="2026-01-28 21:53:37.700421038 +0000 UTC m=+4445.656607402" watchObservedRunningTime="2026-01-28 21:53:37.70376341 +0000 UTC m=+4445.659949784" Jan 28 21:53:43 crc kubenswrapper[4746]: I0128 21:53:43.420184 4746 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-marketplace-bghv8" Jan 28 21:53:43 crc kubenswrapper[4746]: I0128 21:53:43.420689 4746 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-marketplace-bghv8" Jan 28 21:53:43 crc kubenswrapper[4746]: I0128 21:53:43.470449 4746 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-marketplace-bghv8" Jan 28 21:53:43 crc 
kubenswrapper[4746]: I0128 21:53:43.808918 4746 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-marketplace-bghv8" Jan 28 21:53:43 crc kubenswrapper[4746]: I0128 21:53:43.855209 4746 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-bghv8"] Jan 28 21:53:45 crc kubenswrapper[4746]: I0128 21:53:45.782173 4746 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-marketplace-bghv8" podUID="e39f4906-ed94-4140-9456-79b5ddb1642f" containerName="registry-server" containerID="cri-o://dc56738fab53a7132a6fdcd8b05f5fa32fa030ed9fcec636530a730023b2a32f" gracePeriod=2 Jan 28 21:53:46 crc kubenswrapper[4746]: I0128 21:53:46.525525 4746 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-bghv8" Jan 28 21:53:46 crc kubenswrapper[4746]: I0128 21:53:46.612836 4746 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/e39f4906-ed94-4140-9456-79b5ddb1642f-utilities\") pod \"e39f4906-ed94-4140-9456-79b5ddb1642f\" (UID: \"e39f4906-ed94-4140-9456-79b5ddb1642f\") " Jan 28 21:53:46 crc kubenswrapper[4746]: I0128 21:53:46.612872 4746 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/e39f4906-ed94-4140-9456-79b5ddb1642f-catalog-content\") pod \"e39f4906-ed94-4140-9456-79b5ddb1642f\" (UID: \"e39f4906-ed94-4140-9456-79b5ddb1642f\") " Jan 28 21:53:46 crc kubenswrapper[4746]: I0128 21:53:46.613063 4746 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-h5vz2\" (UniqueName: \"kubernetes.io/projected/e39f4906-ed94-4140-9456-79b5ddb1642f-kube-api-access-h5vz2\") pod \"e39f4906-ed94-4140-9456-79b5ddb1642f\" (UID: \"e39f4906-ed94-4140-9456-79b5ddb1642f\") " Jan 28 21:53:46 crc 
kubenswrapper[4746]: I0128 21:53:46.613837 4746 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/e39f4906-ed94-4140-9456-79b5ddb1642f-utilities" (OuterVolumeSpecName: "utilities") pod "e39f4906-ed94-4140-9456-79b5ddb1642f" (UID: "e39f4906-ed94-4140-9456-79b5ddb1642f"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 28 21:53:46 crc kubenswrapper[4746]: I0128 21:53:46.619311 4746 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/e39f4906-ed94-4140-9456-79b5ddb1642f-kube-api-access-h5vz2" (OuterVolumeSpecName: "kube-api-access-h5vz2") pod "e39f4906-ed94-4140-9456-79b5ddb1642f" (UID: "e39f4906-ed94-4140-9456-79b5ddb1642f"). InnerVolumeSpecName "kube-api-access-h5vz2". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 28 21:53:46 crc kubenswrapper[4746]: I0128 21:53:46.635244 4746 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/e39f4906-ed94-4140-9456-79b5ddb1642f-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "e39f4906-ed94-4140-9456-79b5ddb1642f" (UID: "e39f4906-ed94-4140-9456-79b5ddb1642f"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 28 21:53:46 crc kubenswrapper[4746]: I0128 21:53:46.714982 4746 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/e39f4906-ed94-4140-9456-79b5ddb1642f-utilities\") on node \"crc\" DevicePath \"\"" Jan 28 21:53:46 crc kubenswrapper[4746]: I0128 21:53:46.715021 4746 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/e39f4906-ed94-4140-9456-79b5ddb1642f-catalog-content\") on node \"crc\" DevicePath \"\"" Jan 28 21:53:46 crc kubenswrapper[4746]: I0128 21:53:46.715033 4746 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-h5vz2\" (UniqueName: \"kubernetes.io/projected/e39f4906-ed94-4140-9456-79b5ddb1642f-kube-api-access-h5vz2\") on node \"crc\" DevicePath \"\"" Jan 28 21:53:46 crc kubenswrapper[4746]: I0128 21:53:46.791978 4746 generic.go:334] "Generic (PLEG): container finished" podID="e39f4906-ed94-4140-9456-79b5ddb1642f" containerID="dc56738fab53a7132a6fdcd8b05f5fa32fa030ed9fcec636530a730023b2a32f" exitCode=0 Jan 28 21:53:46 crc kubenswrapper[4746]: I0128 21:53:46.792032 4746 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-bghv8" event={"ID":"e39f4906-ed94-4140-9456-79b5ddb1642f","Type":"ContainerDied","Data":"dc56738fab53a7132a6fdcd8b05f5fa32fa030ed9fcec636530a730023b2a32f"} Jan 28 21:53:46 crc kubenswrapper[4746]: I0128 21:53:46.792269 4746 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-bghv8" event={"ID":"e39f4906-ed94-4140-9456-79b5ddb1642f","Type":"ContainerDied","Data":"58375422b9407a49d4ac8a821cdf6cda420587893fd33dba75230dfc287181ca"} Jan 28 21:53:46 crc kubenswrapper[4746]: I0128 21:53:46.792298 4746 scope.go:117] "RemoveContainer" containerID="dc56738fab53a7132a6fdcd8b05f5fa32fa030ed9fcec636530a730023b2a32f" Jan 28 21:53:46 crc kubenswrapper[4746]: I0128 
21:53:46.792095 4746 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-bghv8" Jan 28 21:53:46 crc kubenswrapper[4746]: I0128 21:53:46.818558 4746 scope.go:117] "RemoveContainer" containerID="760545a04bb02702e10666510bd96eb600f70db4d917400e99d129b6e376a80b" Jan 28 21:53:46 crc kubenswrapper[4746]: I0128 21:53:46.832661 4746 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-bghv8"] Jan 28 21:53:46 crc kubenswrapper[4746]: I0128 21:53:46.853531 4746 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-marketplace-bghv8"] Jan 28 21:53:46 crc kubenswrapper[4746]: I0128 21:53:46.861457 4746 scope.go:117] "RemoveContainer" containerID="69ceda22c3e945142432d33d92f527796f3ac3737ea267275dc7d7e4a27714b1" Jan 28 21:53:46 crc kubenswrapper[4746]: I0128 21:53:46.895791 4746 scope.go:117] "RemoveContainer" containerID="dc56738fab53a7132a6fdcd8b05f5fa32fa030ed9fcec636530a730023b2a32f" Jan 28 21:53:46 crc kubenswrapper[4746]: E0128 21:53:46.896241 4746 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"dc56738fab53a7132a6fdcd8b05f5fa32fa030ed9fcec636530a730023b2a32f\": container with ID starting with dc56738fab53a7132a6fdcd8b05f5fa32fa030ed9fcec636530a730023b2a32f not found: ID does not exist" containerID="dc56738fab53a7132a6fdcd8b05f5fa32fa030ed9fcec636530a730023b2a32f" Jan 28 21:53:46 crc kubenswrapper[4746]: I0128 21:53:46.896290 4746 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"dc56738fab53a7132a6fdcd8b05f5fa32fa030ed9fcec636530a730023b2a32f"} err="failed to get container status \"dc56738fab53a7132a6fdcd8b05f5fa32fa030ed9fcec636530a730023b2a32f\": rpc error: code = NotFound desc = could not find container \"dc56738fab53a7132a6fdcd8b05f5fa32fa030ed9fcec636530a730023b2a32f\": container with ID starting with 
dc56738fab53a7132a6fdcd8b05f5fa32fa030ed9fcec636530a730023b2a32f not found: ID does not exist" Jan 28 21:53:46 crc kubenswrapper[4746]: I0128 21:53:46.896311 4746 scope.go:117] "RemoveContainer" containerID="760545a04bb02702e10666510bd96eb600f70db4d917400e99d129b6e376a80b" Jan 28 21:53:46 crc kubenswrapper[4746]: E0128 21:53:46.896690 4746 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"760545a04bb02702e10666510bd96eb600f70db4d917400e99d129b6e376a80b\": container with ID starting with 760545a04bb02702e10666510bd96eb600f70db4d917400e99d129b6e376a80b not found: ID does not exist" containerID="760545a04bb02702e10666510bd96eb600f70db4d917400e99d129b6e376a80b" Jan 28 21:53:46 crc kubenswrapper[4746]: I0128 21:53:46.896718 4746 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"760545a04bb02702e10666510bd96eb600f70db4d917400e99d129b6e376a80b"} err="failed to get container status \"760545a04bb02702e10666510bd96eb600f70db4d917400e99d129b6e376a80b\": rpc error: code = NotFound desc = could not find container \"760545a04bb02702e10666510bd96eb600f70db4d917400e99d129b6e376a80b\": container with ID starting with 760545a04bb02702e10666510bd96eb600f70db4d917400e99d129b6e376a80b not found: ID does not exist" Jan 28 21:53:46 crc kubenswrapper[4746]: I0128 21:53:46.896739 4746 scope.go:117] "RemoveContainer" containerID="69ceda22c3e945142432d33d92f527796f3ac3737ea267275dc7d7e4a27714b1" Jan 28 21:53:46 crc kubenswrapper[4746]: E0128 21:53:46.897191 4746 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"69ceda22c3e945142432d33d92f527796f3ac3737ea267275dc7d7e4a27714b1\": container with ID starting with 69ceda22c3e945142432d33d92f527796f3ac3737ea267275dc7d7e4a27714b1 not found: ID does not exist" containerID="69ceda22c3e945142432d33d92f527796f3ac3737ea267275dc7d7e4a27714b1" Jan 28 21:53:46 crc 
kubenswrapper[4746]: I0128 21:53:46.897212 4746 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"69ceda22c3e945142432d33d92f527796f3ac3737ea267275dc7d7e4a27714b1"} err="failed to get container status \"69ceda22c3e945142432d33d92f527796f3ac3737ea267275dc7d7e4a27714b1\": rpc error: code = NotFound desc = could not find container \"69ceda22c3e945142432d33d92f527796f3ac3737ea267275dc7d7e4a27714b1\": container with ID starting with 69ceda22c3e945142432d33d92f527796f3ac3737ea267275dc7d7e4a27714b1 not found: ID does not exist" Jan 28 21:53:48 crc kubenswrapper[4746]: I0128 21:53:48.858466 4746 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="e39f4906-ed94-4140-9456-79b5ddb1642f" path="/var/lib/kubelet/pods/e39f4906-ed94-4140-9456-79b5ddb1642f/volumes"